Enhancing Instruction through Technology.
ERIC Educational Resources Information Center
Greenleaf, Connie; Gee, Mary Kay
Following an introductory section that provides a rationale for using computers in workplace literacy classes, this guide reviews six computer programs and provides activities that teachers can use with the programs in teaching workplace literacy classes. The six computer programs reviewed are as follows: "Grammar Games," "Spell It 3," "The Way…
Collectively loading programs in a multiple program multiple data environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aho, Michael E.; Attinella, John E.; Gooding, Thomas M.
Techniques are disclosed for loading programs efficiently in a parallel computing system. In one embodiment, nodes of the parallel computing system receive a load description file which indicates, for each program of a multiple program multiple data (MPMD) job, nodes which are to load the program. The nodes determine, using collective operations, a total number of programs to load and a number of programs to load in parallel. The nodes further generate a class route for each program to be loaded in parallel, where the class route generated for a particular program includes only those nodes on which the program needs to be loaded. For each class route, a node is selected using a collective operation to be a load leader which accesses a file system to load the program associated with a class route and broadcasts the program via the class route to other nodes which require the program.
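The grouping-and-leader step the abstract describes can be sketched in Python. This is only an illustration of the idea, not the patented mechanism: the `load_description` mapping and minimum-id leader election are assumptions for the example.

```python
# Sketch: each program maps to the set of node ids that must load it.
# One node per group is picked as "load leader"; in the real system it
# would read the binary from the file system and broadcast it along the
# class route to the remaining nodes.

def plan_loads(load_description):
    """load_description: dict mapping program name -> list of node ids."""
    plan = []
    for program, nodes in load_description.items():
        leader = min(nodes)  # deterministic, collective-friendly choice
        followers = [n for n in nodes if n != leader]
        plan.append({"program": program,
                     "leader": leader,
                     "broadcast_to": followers})
    return plan

plan = plan_loads({"solver": [0, 1, 2, 3], "viz": [4, 5]})
```

Each entry of `plan` corresponds to one class route: the leader loads from disk once, and the file system sees one reader per program instead of one per node.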
ERIC Educational Resources Information Center
Hawi, Nazir
2010-01-01
The author has undergone a major shift in the way of teaching his undergraduate computer programming courses. In the classroom, the teacher's computer is connected to a splitter and a video projector that display the computer's screen to the entire class. Using this technology, the programming language itself is used live in class to help the…
Teaching Computational Geophysics Classes using Active Learning Techniques
NASA Astrophysics Data System (ADS)
Keers, H.; Rondenay, S.; Harlap, Y.; Nordmo, I.
2016-12-01
We give an overview of our experience in teaching two computational geophysics classes at the undergraduate level. The first class is, for most students, their first programming class, and assumes that the students have had an introductory course in geophysics. In this class the students are introduced to basic Matlab skills: use of variables, basic array and matrix definition and manipulation, basic statistics, 1D integration, plotting of lines and surfaces, making of .m files and basic debugging techniques. All of these concepts are applied to elementary but important concepts in earthquake and exploration geophysics (including epicentre location, computation of travel time curves for simple layered media, plotting of 1D and 2D velocity models, etc.). It is important to integrate the geophysics with the programming concepts: we found that this enhances students' understanding. Moreover, as this is a 3-year Bachelor program and this class is taught in the 2nd semester, there is little time for a class that focusses only on programming. The second class, which is optional and can be taken in the 4th or 6th semester but is often also taken by Master students, extends the Matlab programming to include signal processing and ordinary and partial differential equations, again with emphasis on geophysics (such as ray tracing and solving the acoustic wave equation). This class also contains a project in which the students write a brief paper on a topic in computational geophysics, preferably with programming examples. When teaching these classes we found that active learning techniques, in which the students actively participate in the class, individually, in pairs or in groups, are indispensable. We give a brief overview of the various activities that we developed when teaching these classes.
Microsoft C#.NET program and electromagnetic depth sounding for large loop source
NASA Astrophysics Data System (ADS)
Prabhakar Rao, K.; Ashok Babu, G.
2009-07-01
A program, in the C# (C Sharp) language with the Microsoft .NET Framework, is developed to compute the normalized vertical magnetic field of a horizontal rectangular loop source placed on the surface of an n-layered earth. The field can be calculated either inside or outside the loop. Five C# classes, with member functions in each class, are designed to compute the kernel, the Hankel transform integral, coefficients for cubic spline interpolation between computed values, and the normalized vertical magnetic field. The program computes the vertical magnetic field in the frequency domain using the integral expressions evaluated by a combination of straightforward numerical integration and the digital filter technique. The code utilizes different object-oriented programming (OOP) features. It finally computes the amplitude and phase of the normalized vertical magnetic field. The computed results are presented for geometric and parametric soundings. The code is developed in Microsoft Visual Studio .NET 2003 and uses various system class libraries.
ERIC Educational Resources Information Center
Emurian, Henry H.
2007-01-01
At the beginning of a Java computer programming course, nine students in an undergraduate class and nine students in a graduate class completed a web-based programmed instruction tutoring system that taught a simple computer program. All students exited the tutor with an identical level of skill, at least as determined by the tutor's required…
Data Processing: Fifteen Suggestions for Computer Training in Your Business Education Classes.
ERIC Educational Resources Information Center
Barr, Lowell L.
1980-01-01
Presents 15 suggestions for training business education students in the use of computers. Suggestions involve computer language, method of presentation, laboratory time, programing assignments, instructions and handouts, problem solving, deadlines, reviews, programming concepts, programming logic, documentation, and defensive programming. (CT)
Gender and Socioeconomic Differences in Enrollment in Computer Camps and Classes.
ERIC Educational Resources Information Center
Hess, Robert D.; Miura, Irene T.
Informal reports suggest that computer literacy (programming) is sought more often by boys than by girls and by students from middle SES backgrounds. In order to gather more systematic data on this perceived trend, questionnaires were sent to directors of summer camps and classes that offered training in programming for microcomputers.…
NASA Astrophysics Data System (ADS)
Castro, María Eugenia; Díaz, Javier; Muñoz-Caro, Camelia; Niño, Alfonso
2011-09-01
We present a system of classes, SHMatrix, to deal in a unified way with the computation of eigenvalues and eigenvectors of real symmetric and Hermitian matrices. Two descendant classes, one for the real symmetric and the other for the Hermitian case, override the abstract methods defined in a base class. The use of the inheritance relationship and polymorphism allows handling objects of any descendant class using a single reference of the base class. The system of classes is intended to be the core element of more sophisticated methods to deal with large eigenvalue problems, such as those arising in the variational treatment of realistic quantum mechanical problems. The present system of classes allows computing a subset of all the possible eigenvalues and, optionally, the corresponding eigenvectors. Comparison with well-established solutions for analogous eigenvalue problems, such as those included in LAPACK, shows that the present solution is competitive with them.
Program summary
Program title: SHMatrix
Catalogue identifier: AEHZ_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHZ_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 2616
No. of bytes in distributed program, including test data, etc.: 127 312
Distribution format: tar.gz
Programming language: Standard ANSI C++
Computer: PCs and workstations
Operating system: Linux, Windows
Classification: 4.8
Nature of problem: The treatment of problems involving eigensystems is a central topic in quantum mechanics. Here, the use of the variational approach leads to the computation of eigenvalues and eigenvectors of real symmetric and Hermitian Hamiltonian matrices. Realistic models with several degrees of freedom lead to large (sometimes very large) matrices.
Different techniques, such as divide and conquer, can be used to factorize the matrices in order to apply a parallel computing approach. However, it is still useful to have a core procedure able to tackle the computation of eigenvalues and eigenvectors once the matrix has been factorized into pieces of sufficiently small size. Several available software packages, such as LAPACK, tackle this problem under the traditional imperative programming paradigm. In order to ease the modelling of complex quantum mechanical systems, it is attractive to apply an object-oriented approach to the treatment of the eigenproblem. This approach offers the advantage of a single, uniform treatment for the real symmetric and Hermitian cases. Solution method: To reach the above goals, we have developed a system of classes, SHMatrix. SHMatrix is composed of an abstract base class and two descendant classes, one for real symmetric matrices and the other for the Hermitian case. The object-oriented characteristics of inheritance and polymorphism allow handling both cases using a single reference of the base class. The basic computing strategy applied in SHMatrix allows computing subsets of eigenvalues and, optionally, eigenvectors. The tests performed show that SHMatrix is competitive with, and for large matrices more efficient than, the equivalent routines of the LAPACK package. Running time: The examples included in the distribution take only a couple of seconds to run.
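The design described above (an abstract base class whose symmetric and Hermitian descendants are driven through one polymorphic interface) can be illustrated with a minimal Python sketch. NumPy's `eigh` stands in for SHMatrix's C++ internals, and the class and method names are invented for the example, not taken from the package:

```python
import numpy as np

class SHMatrixBase:
    """Abstract base: one interface for symmetric and Hermitian matrices."""
    def __init__(self, m):
        self.m = np.asarray(m)
        self.check()

    def check(self):
        # Overridden by each descendant class.
        raise NotImplementedError

    def eigensystem(self, k=None):
        """Return the k smallest eigenvalues and matching eigenvectors."""
        w, v = np.linalg.eigh(self.m)  # handles both cases; ascending order
        k = len(w) if k is None else k
        return w[:k], v[:, :k]

class RealSymmetric(SHMatrixBase):
    def check(self):
        assert np.allclose(self.m, self.m.T)

class Hermitian(SHMatrixBase):
    def check(self):
        assert np.allclose(self.m, self.m.conj().T)

# Polymorphic use through the base-class interface:
w, _ = RealSymmetric([[2.0, 1.0], [1.0, 2.0]]).eigensystem(k=1)
```

A `Hermitian([[2, 1j], [-1j, 2]])` instance is driven through exactly the same `eigensystem` call, which is the point of the inheritance design.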
Beyond Introductory Programming: Success Factors for Advanced Programming
ERIC Educational Resources Information Center
Hoskey, Arthur; Maurino, Paula San Millan
2011-01-01
Numerous studies document high drop-out and failure rates for students in computer programming classes. Studies show that even when some students pass programming classes, they still do not know how to program. Many factors have been considered to explain this problem including gender, age, prior programming experience, major, math background,…
ERIC Educational Resources Information Center
Fridge, Evorell; Bagui, Sikha
2016-01-01
The goal of this research was to investigate the effects of automated testing software on levels of student reflection and student performance. This was a self-selecting, between subjects design that examined the performance of students in introductory computer programming classes. Participants were given the option of using the Web-CAT…
ERIC Educational Resources Information Center
Bati, Tesfaye Bayu; Gelderblom, Helene; van Biljon, Judy
2014-01-01
The challenge of teaching programming in higher education is complicated by problems associated with large class teaching, a prevalent situation in many developing countries. This paper reports on an investigation into the use of a blended learning approach to teaching and learning of programming in a class of more than 200 students. A course and…
A distributed program composition system
NASA Technical Reports Server (NTRS)
Brown, Robert L.
1989-01-01
A graphical technique for creating distributed computer programs is investigated, and a prototype implementation is described which serves as a testbed for the concepts. The type of programs under examination is restricted to those comprising relatively heavyweight parts that intercommunicate by passing messages of typed objects. Such programs are often presented visually as a directed graph with computer program parts as the nodes and communication channels as the edges. This class of programs, called parts-based programs, is not well supported by existing computer systems; much manual work is required to describe the program to the system, establish the communication paths, accommodate the heterogeneity of data types, and locate the parts of the program on the various systems involved. The work described solves most of these problems by providing an interface for describing parts-based programs in a way that closely models the way programmers think about them: using sketches of digraphs. Program parts, the computational nodes of the larger program system, are categorized in libraries and are accessed with browsers. The process of programming has the programmer draw the program graph interactively. Heterogeneity is automatically accommodated by the insertion of type translators where necessary between the parts. Many decisions are necessary in the creation of a comprehensive tool for interactive creation of programs in this class. Possibilities are explored and the issues behind such decisions are presented. What is described is an approach to program composition, not a carefully implemented programming environment; however, a prototype implementation is described that can demonstrate the ideas presented.
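The core data model (parts as nodes, typed channels as edges, translators inserted where endpoint types differ) can be sketched in a few lines of Python. Everything here is a hypothetical illustration of that model, not code from the prototype:

```python
# Sketch of a parts-based program graph: a channel connects two parts,
# and a type translator is flagged whenever the producing part's output
# type differs from the type the consuming part expects.

class Channel:
    def __init__(self, src_type, dst_type):
        self.translate = (src_type != dst_type)  # translator needed?

def build_graph(parts, edges):
    """parts: dict part name -> output type.
    edges: (src, dst, type expected by dst) triples."""
    graph = {}
    for src, dst, dst_type in edges:
        graph[(src, dst)] = Channel(parts[src], dst_type)
    return graph

g = build_graph({"reader": "str", "parser": "dict"},
                [("reader", "parser", "bytes")])
```

Here the `reader -> parser` edge carries `str` into a part expecting `bytes`, so the channel is marked for automatic translator insertion, mirroring how the described system accommodates heterogeneity without manual glue code.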
Generalized fish life-cycle population model and computer program
DOE Office of Scientific and Technical Information (OSTI.GOV)
DeAngelis, D. L.; Van Winkle, W.; Christensen, S. W.
1978-03-01
A generalized fish life-cycle population model and computer program have been prepared to evaluate the long-term effect of changes in mortality in age class 0. The general question concerns what happens to a fishery when density-independent sources of mortality are introduced that act on age class 0, particularly entrainment and impingement at power plants. This paper discusses the model formulation and computer program, including sample results. The population model consists of a system of difference equations involving age-dependent fecundity and survival. The fecundity for each age class is assumed to be a function of both the fraction of females sexually mature and the weight of females as they enter each age class. Natural mortality for age classes 1 and older is assumed to be independent of population size. Fishing mortality is assumed to vary with the number and weight of fish available to the fishery. Age class 0 is divided into six life stages. The probability of survival for age class 0 is estimated considering both density-independent mortality (natural and power plant) and density-dependent mortality for each life stage. Two types of density-dependent mortality are included. These are cannibalism of each life stage by older age classes and intra-life-stage competition.
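The age-structured difference equations described above can be sketched in Python. The coefficients below are illustrative placeholders, and the density-dependent and life-stage machinery of the actual model is omitted:

```python
# Minimal age-structured step: n[a] is the number of fish in age class a,
# survival[a] is the fraction of class a surviving into class a+1, and
# fecundity[a] is the per-capita contribution of class a to age class 0.

def step(n, survival, fecundity):
    recruits = sum(f * x for f, x in zip(fecundity, n))
    aged = [s * x for s, x in zip(survival, n[:-1])]  # oldest class dies
    return [recruits] + aged

n = [1000.0, 100.0, 10.0]  # three age classes, illustrative numbers
for _ in range(5):
    n = step(n, survival=[0.1, 0.5], fecundity=[0.0, 2.0, 20.0])
```

The full model replaces the constant class-0 survival with six life stages whose survival depends on density (cannibalism and intra-stage competition), which is where the power-plant mortality question enters.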
ERIC Educational Resources Information Center
Thomas, Michael K.; Ge, Xun; Greene, Barbara A.
2011-01-01
This study used technology-rich ethnography (TRE) to examine the use of game development in a high school computer programming class for the development of 21st century skills. High school students created games for elementary school students while obtaining formative feedback from their younger clients. Our experience suggests that in the…
LFSPMC: Linear feature selection program using the probability of misclassification
NASA Technical Reports Server (NTRS)
Guseman, L. F., Jr.; Marion, B. P.
1975-01-01
The computational procedure and associated computer program for a linear feature selection technique are presented. The technique assumes that: a finite number, m, of classes exists; each class is described by an n-dimensional multivariate normal density function of its measurement vectors; the mean vector and covariance matrix for each density function are known (or can be estimated); and the a priori probability for each class is known. The technique produces a single linear combination of the original measurements which minimizes the one-dimensional probability of misclassification defined by the transformed densities.
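A closely related two-class construction can be sketched numerically. Note this is Fisher's linear discriminant, a standard relative of the technique, not the paper's exact minimization of the one-dimensional probability of misclassification over m classes:

```python
import numpy as np

# For two normal classes with means mu1, mu2 and covariances cov1, cov2,
# a classic separating linear combination of the measurements is
# b = (cov1 + cov2)^-1 (mu1 - mu2); projecting onto b reduces the
# n-dimensional problem to one dimension, as in the abstract.

def fisher_direction(mu1, cov1, mu2, cov2):
    return np.linalg.solve(cov1 + cov2, mu1 - mu2)

mu1, mu2 = np.array([0.0, 0.0]), np.array([2.0, 2.0])
cov = np.eye(2)
b = fisher_direction(mu1, cov, mu2, cov)
sep = b @ (mu1 - mu2)  # projected class means are separated when sep > 0
```

The paper's technique instead chooses the linear combination that directly minimizes the misclassification probability of the transformed one-dimensional densities, but the reduce-to-one-dimension structure is the same.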
PLANS; a finite element program for nonlinear analysis of structures. Volume 2: User's manual
NASA Technical Reports Server (NTRS)
Pifko, A.; Armen, H., Jr.; Levy, A.; Levine, H.
1977-01-01
The PLANS system, rather than being one comprehensive computer program, is a collection of finite element programs used for the nonlinear analysis of structures. This collection of programs evolved and is based on the organizational philosophy in which classes of analyses are treated individually based on the physical problem class to be analyzed. Each of the independent finite element computer programs of PLANS, with an associated element library, can be individually loaded and used to solve the problem class of interest. A number of programs have been developed for material nonlinear behavior alone and for combined geometric and material nonlinear behavior. The usage, capabilities, and element libraries of the current programs include: (1) plastic analysis of built-up structures where bending and membrane effects are significant, (2) three dimensional elastic-plastic analysis, (3) plastic analysis of bodies of revolution, and (4) material and geometric nonlinear analysis of built-up structures.
Gaming via Computer Simulation Techniques for Junior College Economics Education. Final Report.
ERIC Educational Resources Information Center
Thompson, Fred A.
A study designed to answer the need for more attractive and effective economics education involved the teaching of one junior college economics class by the conventional (lecture) method and an experimental class by computer simulation techniques. Econometric models approximating the "real world" were computer programed to enable the experimental…
Cane Toad or Computer Mouse? Real and Computer-Simulated Laboratory Exercises in Physiology Classes
ERIC Educational Resources Information Center
West, Jan; Veenstra, Anneke
2012-01-01
Traditional practical classes in many countries are being rationalised to reduce costs. The challenge for university educators is to provide students with the opportunity to reinforce theoretical concepts by running something other than a traditional practical program. One alternative is to replace wet labs with comparable computer simulations.…
The BASIC Instructional Program: Conversion into MAINSAIL Language.
ERIC Educational Resources Information Center
Dageforde, Mary L.
This report summarizes the rewriting of the BASIC Instructional Program (BIP) (a "hands-on laboratory" that teaches elementary programming in the BASIC language) from SAIL (a programming language available only on PDP-10 computers) into MAINSAIL (a language designed for portability on a broad class of computers). Four sections contain…
Apple IIe Computers and Appleworks Training Mini Course Materials.
ERIC Educational Resources Information Center
Schlenker, Richard M.
The instructional materials included in this document are designed to introduce students to the Apple IIe computer and to the word processing and database portions of the AppleWorks program. The materials are intended for small groups of students, each of whom has use of a computer during class and for short periods between classes. The course…
A Computer-Based Subduction-Zone-Earthquake Exercise for Introductory-Geology Classes.
ERIC Educational Resources Information Center
Shea, James Herbert
1991-01-01
Describes the author's computer-based program for a subduction-zone-earthquake exercise. Instructions for conducting the activity and obtaining the program from the author are provided. Written in IBM QuickBasic. (PR)
Computer program analyzes Buckling Of Shells Of Revolution with various wall construction, BOSOR
NASA Technical Reports Server (NTRS)
Almroth, B. O.; Bushnell, D.; Sobel, L. H.
1968-01-01
Computer program performs stability analyses for a wide class of shells without unduly restrictive approximations. The program uses numerical integration and finite difference or finite element techniques to solve with reasonable accuracy almost any buckling problem for shells exhibiting orthotropic behavior.
ERIC Educational Resources Information Center
Classroom Computer Learning, 1984
1984-01-01
Offers suggestions for five computer-oriented classroom activities. They include uniting a writing class by having them collectively write a book using a word processor, examining FOR/NEXT loops, using a compound interest computer program, and developing a list of facts about computers. Includes four short programs which erase monitor screens. (JN)
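One of the suggested activities is a compound interest program; a minimal version of the underlying formula A = P(1 + r/n)^(nt) might look like this (function and variable names are our own):

```python
# Compound interest: principal P, annual rate r, n compounding periods
# per year, t years.

def compound(principal, rate, periods_per_year, years):
    return principal * (1.0 + rate / periods_per_year) ** (periods_per_year * years)

# $100 at 5% compounded monthly for 10 years
balance = compound(100.0, 0.05, 12, 10)
```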
Undergraduate optics program for the 21st Century
NASA Astrophysics Data System (ADS)
Palmer, James M.
2002-05-01
We have been offering a successful BS degree in optical engineering for the past ten years. We have produced more than 100 graduates, highly trained in basic optics and electronics. Our Industrial Affiliates, while very pleased with our graduates, requested that we produce some with greater mechanical engineering skills and knowledge. Our response was the creation of a new degree program, retaining the virtues of the previous one but allowing a high degree of flexibility through the inclusion of minors within the program. The new program allows sufficient room for a variety of minors. Engineering minors identified include aerospace, computer, electrical, materials and mechanical. Science minors include astronomy, computer science, math and physics. Non-science minors accommodated include business, pre-health and pre-law. The new BSO program features: (1) better structure and flow, more tightly coupling related classes; (2) new laboratory classes for juniors, linked to lecture classes; (3) expanded optical design, fabrication and testing classes; (4) a new class in electronics for optics; (5) new classes in fiber optics and optical communications; (6) a new capstone/senior project class for ABET compliance. This new BSO program will produce better entry-level optical scientists and engineers, and better candidates for graduate school. Our interactions with the external community will provide input concerning industrial needs, leading toward improved student counseling and program development. We will better serve national needs for skilled personnel in optics, and contribute even more to the optics workforce pipeline.
Girls in computer science: A female only introduction class in high school
NASA Astrophysics Data System (ADS)
Drobnis, Ann W.
This study examined the impact of an all girls' classroom environment in a high school introductory computer science class on the student's attitudes towards computer science and their thoughts on future involvement with computer science. It was determined that an all girls' introductory class could impact the declining female enrollment and female students' efficacy towards computer science. This research was conducted in a summer school program through a regional magnet school for science and technology which these students attend during the school year. Three different groupings of students were examined for the research: female students in an all girls' class, female students in mixed-gender classes and male students in mixed-gender classes. A survey, Attitudes about Computers and Computer Science (ACCS), was designed to obtain an understanding of the students' thoughts, preconceptions, attitude, knowledge of computer science, and future intentions around computer science, both in education and career. Students in all three groups were administered the ACCS prior to taking the class and upon completion of the class. In addition, students in the all girls' class wrote in a journal throughout the course, and some of those students were also interviewed upon completion of the course. The data was analyzed using quantitative and qualitative techniques. While there were no major differences found in the quantitative data, it was determined that girls in the all girls' class were truly excited by what they had learned and were more open to the idea of computer science being a part of their future.
Communications oriented programming of parallel iterative solutions of sparse linear systems
NASA Technical Reports Server (NTRS)
Patrick, M. L.; Pratt, T. W.
1986-01-01
Parallel algorithms are developed for a class of scientific computational problems by partitioning the problems into smaller problems which may be solved concurrently. The effectiveness of the resulting parallel solutions is determined by the amount and frequency of communication and synchronization and the extent to which communication can be overlapped with computation. Three different parallel algorithms for solving the same class of problems are presented, and their effectiveness is analyzed from this point of view. The algorithms are programmed using a new programming environment. Run-time statistics and experience obtained from the execution of these programs assist in measuring the effectiveness of these algorithms.
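The partitioning idea (split a problem so pieces can be solved concurrently, with communication only at partition boundaries) can be illustrated with a Jacobi iteration on a tridiagonal system. This is a generic example of the problem class, not one of the paper's three algorithms:

```python
# One Jacobi sweep on a constant tridiagonal system (a_diag on the
# diagonal, a_off on both off-diagonals). In a partitioned parallel
# version, each worker would own a block of x and exchange only the
# block-boundary values with its neighbors each sweep, which is the
# communication cost the paper analyzes.

def jacobi_sweep(a_diag, a_off, b, x):
    n = len(x)
    new = list(x)
    for i in range(n):
        s = b[i]
        if i > 0:
            s -= a_off * x[i - 1]
        if i < n - 1:
            s -= a_off * x[i + 1]
        new[i] = s / a_diag
    return new

# Solve the (2, -1) tridiagonal system with right-hand side [1, 1, 1];
# the exact solution is [1.5, 2.0, 1.5].
x = [0.0, 0.0, 0.0]
for _ in range(200):
    x = jacobi_sweep(2.0, -1.0, [1.0, 1.0, 1.0], x)
```

Because each new entry depends only on immediate neighbors, all entries of a sweep can be computed concurrently, and the overlap of that computation with boundary communication is exactly the effectiveness question the abstract raises.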
In-class Simulations of the Iterated Prisoner's Dilemma Game.
ERIC Educational Resources Information Center
Bodo, Peter
2002-01-01
Developed a simple computer program for the in-class simulation of the repeated prisoner's dilemma game with student-designed strategies. Describes the basic features of the software. Presents two examples using the program to teach the problems of cooperation among profit-maximizing agents. (JEH)
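A classroom simulator of the kind described can be sketched compactly. This is a generic reconstruction using the standard T=5, R=3, P=1, S=0 payoffs, not the article's actual software:

```python
# Iterated prisoner's dilemma: strategies are functions of the
# opponent's move history; PAYOFF maps a pair of moves to the pair of
# scores (row player, column player).

PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(opp_history):
    return opp_history[-1] if opp_history else "C"

def always_defect(opp_history):
    return "D"

def play(s1, s2, rounds):
    h1, h2, score1, score2 = [], [], 0, 0
    for _ in range(rounds):
        m1, m2 = s1(h2), s2(h1)  # each sees only the opponent's history
        p1, p2 = PAYOFF[(m1, m2)]
        h1.append(m1); h2.append(m2)
        score1 += p1; score2 += p2
    return score1, score2

scores = play(tit_for_tat, always_defect, 10)
```

Students supply their own strategy functions, and a round-robin over all pairs makes the cooperation-vs-profit-maximization tension concrete: the defector wins each pairwise match, yet mutual cooperators outscore mutual defectors.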
MPL-A program for computations with iterated integrals on moduli spaces of curves of genus zero
NASA Astrophysics Data System (ADS)
Bogner, Christian
2016-06-01
We introduce the Maple program MPL for computations with multiple polylogarithms. The program is based on homotopy invariant iterated integrals on moduli spaces M0,n of curves of genus 0 with n ordered marked points. It includes the symbol map and procedures for the analytic computation of period integrals on M0,n. It supports the automated computation of a certain class of Feynman integrals.
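MPL works with multiple polylogarithms; the simplest member of that family is the classical polylogarithm, which for |z| < 1 can be evaluated directly from its series. The small sketch below is only this one-variable special case, nothing like MPL's iterated-integral machinery:

```python
import math

# Classical polylogarithm Li_s(z) = sum_{k>=1} z^k / k^s, truncated.
# The identity Li_1(z) = -ln(1 - z) gives a convenient check.

def polylog(s, z, terms=200):
    return sum(z ** k / k ** s for k in range(1, terms + 1))

val = polylog(1, 0.5)  # should approximate -ln(0.5) = ln 2
```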
ERIC Educational Resources Information Center
Heldman, Bill
2010-01-01
With few exceptions, students interact with technology in one way or another every day. And yet, in most U.S. schools, the term "computer science" (CS) refers only to generic skills classes, such as keyboarding and computer applications. Even most Web programming classes usually teach students only how to use conventional graphical user…
Embedding global and collective in a torus network with message class map based tree path selection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Dong; Coteus, Paul W.; Eisley, Noel A.
Embodiments of the invention provide a method, system and computer program product for embedding a global barrier and global interrupt network in a parallel computer system organized as a torus network. The computer system includes a multitude of nodes. In one embodiment, the method comprises taking inputs from a set of receivers of the nodes, dividing the inputs from the receivers into a plurality of classes, combining the inputs of each of the classes to obtain a result, and sending said result to a set of senders of the nodes. Embodiments of the invention provide a method, system and computer program product for embedding a collective network in a parallel computer system organized as a torus network. In one embodiment, the method comprises adding to a torus network a central collective logic to route messages among at least a group of nodes in a tree structure.
Chen, Dong; Coteus, Paul W; Eisley, Noel A; Gara, Alan; Heidelberger, Philip; Senger, Robert M; Salapura, Valentina; Steinmacher-Burow, Burkhard; Sugawara, Yutaka; Takken, Todd E
2013-08-27
Computer Programs for Technical Communicators: The Compelling Curriculum. Working Draft.
ERIC Educational Resources Information Center
Selfe, Cynthia L.; Wahlstrom, Billie J.
A series of computer programs have been developed at Michigan Technological University for use with technical writing and technical communications classes. The first type of program in the series, CURIE II, includes process-based modules, each of which corresponds to one of the following assignments: memoranda, resumes, feasibility reports,…
Impact of Classroom Computer Use on Computer Anxiety.
ERIC Educational Resources Information Center
Lambert, Matthew E.; And Others
Increasing use of computer programs for undergraduate psychology education has raised concern over the impact of computer anxiety on educational performance. Additionally, some researchers have indicated that classroom computer use can exacerbate pre-existing computer anxiety. To evaluate the relationship between in-class computer use and computer…
Multi-dimensional Rankings, Program Termination, and Complexity Bounds of Flowchart Programs
NASA Astrophysics Data System (ADS)
Alias, Christophe; Darte, Alain; Feautrier, Paul; Gonnord, Laure
Proving the termination of a flowchart program can be done by exhibiting a ranking function, i.e., a function from the program states to a well-founded set, which strictly decreases at each program step. A standard method to automatically generate such a function is to compute invariants for each program point and to search for a ranking in a restricted class of functions that can be handled with linear programming techniques. Previous algorithms based on affine rankings either are applicable only to simple loops (i.e., single-node flowcharts) and rely on enumeration, or are not complete in the sense that they are not guaranteed to find a ranking in the class of functions they consider, if one exists. Our first contribution is to propose an efficient algorithm to compute ranking functions: It can handle flowcharts of arbitrary structure, the class of candidate rankings it explores is larger, and our method, although greedy, is provably complete. Our second contribution is to show how to use the ranking functions we generate to get upper bounds for the computational complexity (number of transitions) of the source program. This estimate is a polynomial, which means that we can handle programs with more than linear complexity. We applied the method on a collection of test cases from the literature. We also show the links and differences with previous techniques based on the insertion of counters.
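The two ideas above (a ranking function that strictly decreases into a well-founded set proves termination, and its initial value bounds the number of transitions) can be checked on a toy loop. This is a hand-picked illustration, not the paper's LP-based synthesis algorithm:

```python
# For the loop "while x > 0: x = x - 2", the affine ranking r(x) = x is
# nonnegative on all reachable states and strictly decreases each step,
# so the loop terminates; moreover r(x0) bounds the transition count,
# which is the complexity-bound use of rankings.

def trace(x0):
    steps, x = 0, x0
    while x > 0:
        assert x >= 0            # ranking stays in the well-founded set
        x_next = x - 2
        assert x_next < x        # ranking strictly decreases
        x, steps = x_next, steps + 1
    return steps

n = trace(9)  # terminates; step count is bounded by r(9) = 9
```

For multi-node flowcharts a single affine function rarely suffices, which is why the paper searches for multi-dimensional (lexicographic) rankings, one affine component per dimension.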
ERIC Educational Resources Information Center
Wielard, Valerie Michelle
2013-01-01
The primary objective of this project was to learn what effect a computer program would have on academic achievement and attitude toward science of college students enrolled in a biology class for non-science majors. It became apparent that the instructor also had an effect on attitudes toward science. The researcher designed a computer program,…
Creative Computer Detective: The Basics of Teaching Desktop Publishing.
ERIC Educational Resources Information Center
Slothower, Jodie
Teaching desktop publishing (dtp) in college journalism classes is most effective when the instructor integrates into specific courses four types of software--a word processor, a draw program, a paint program and a layout program. In a course on design and layout, the instructor can demonstrate with the computer how good design can be created and…
Blindness and Computer Networking at iTEC [Information Technology Education Center].
ERIC Educational Resources Information Center
Goins, Shannon
A new program to train blind and visually impaired individuals to design and run a computer network has been developed. The program offers the Microsoft Certified Systems Engineer (MCSE) training. The program, which began in February 2001, recently graduated its first class of students, who are currently completing 1-month internships to complete…
A Web-Based Tutor for Java™: Evidence of Meaningful Learning
ERIC Educational Resources Information Center
Emurian, Henry H.
2006-01-01
Students in a graduate class and an undergraduate class in Information Systems completed a Web-based programmed instruction tutor that taught a simple Java applet as the first technical training exercise in a computer programming course. The tutor is a competency-based instructional system for individualized distance learning. When a student…
Give Your Technology Program a Little "Class"!
ERIC Educational Resources Information Center
Vengersammy, Ormilla
2009-01-01
The Orange County Library System (OCLS) began to offer basic technology classes in July 2000. The computers were funded through a grant awarded by the Bill & Melinda Gates Foundation. Over time, the library staff noticed that the demand for the classes increased, so the offering of classes also increased. When the author arrived at OCLS, her…
Computer-Aided Construction at Designing Reinforced Concrete Columns as Per Ec
NASA Astrophysics Data System (ADS)
Zielińska, M.; Grębowski, K.
2015-02-01
The article presents the authors' computer program for designing and dimensioning columns in reinforced concrete structures, taking into account phenomena affecting their behaviour and information referring to design as per EC. The computer program was developed in the C++ programming language. The program guides the user through the particular dimensioning stages: from introducing basic data such as dimensions, concrete class, reinforcing steel class and forces acting on the column, through calculating the creep coefficient (taking into account the impact of imperfection depending on the support scheme and the number of members cooperating at load shift) and the buckling length, to generating the interaction curve graph. The final result of the calculations provides two dependence points computed by the methods of nominal stiffness and nominal curvature. The location of those points relative to the limit curve determines whether the column load capacity is assured or has been exceeded. The study describes in detail the operation of the computer program and the methodology and phenomena that are indispensable when designing axially and eccentrically compressed members of reinforced concrete structures as per the European standards.
NASA Technical Reports Server (NTRS)
Barry, Matthew R.
2006-01-01
The X-Windows Socket Widget Class ("Class" is used here in the object-oriented-programming sense of the word) was devised to simplify the task of implementing network connections for graphical-user-interface (GUI) computer programs. UNIX Transmission Control Protocol/Internet Protocol (TCP/IP) socket programming libraries require many method calls to configure, operate, and destroy sockets. Most X Windows GUI programs use widget sets or toolkits to facilitate management of complex objects. The widget standards facilitate construction of toolkits and application programs. The X-Windows Socket Widget Class encapsulates UNIX TCP/IP socket-management tasks within the framework of an X Windows widget. Using the widget framework, X Windows GUI programs can treat one or more network socket instances in the same manner as other graphical widgets, making it easier to program sockets. Wrapping TCP/IP socket programming libraries inside a widget framework enables a programmer to treat a network interface as though it were a GUI.
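The underlying idea is independent of X Windows. A minimal hypothetical sketch (not the actual widget code) in Python: wrap a TCP socket in a small "widget-like" object so event-loop code drives the connection through the same create/callback/destroy life cycle it uses for graphical widgets.

```python
# Hypothetical sketch of the idea (not the actual X-Windows widget code):
# wrap a TCP socket in a "widget-like" object with a callback interface,
# so a GUI event loop can manage it like any other widget.
import socket

class SocketWidget:
    def __init__(self, host, port, on_receive):
        self.on_receive = on_receive  # callback, as widget toolkits use
        self.sock = socket.create_connection((host, port))

    def poll(self):
        """Called from the event loop; fires the callback when data arrives."""
        data = self.sock.recv(4096)
        if data:
            self.on_receive(data)

    def destroy(self):
        """Widget-style teardown releasing the underlying socket."""
        self.sock.close()
```

An event loop would call `poll()` on this object alongside redraw handling for ordinary widgets, which is the abstraction the abstract describes.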
A Computer Program for Preliminary Data Analysis
Dennis L. Schweitzer
1967-01-01
ABSTRACT. -- A computer program written in FORTRAN has been designed to summarize data. Class frequencies, means, and standard deviations are printed for as many as 100 independent variables. Cross-classifications of an observed dependent variable and of a dependent variable predicted by a multiple regression equation can also be generated.
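The FORTRAN source is not reproduced in the record; a hypothetical modern analogue of its summary step (class frequencies, means, and standard deviations per variable) might look like:

```python
# Hypothetical modern analogue of the abstract's summary step: class
# frequencies over equal-width intervals, the mean, and the sample
# standard deviation of one variable.
import math
from collections import Counter

def summarize(values, n_classes=5):
    lo, hi = min(values), max(values)
    width = (hi - lo) / n_classes or 1  # guard against all-equal data
    # Class frequencies: count observations falling in each interval.
    freq = Counter(min(int((v - lo) / width), n_classes - 1) for v in values)
    mean = sum(values) / len(values)
    # Sample standard deviation (n - 1 denominator).
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / (len(values) - 1))
    return dict(sorted(freq.items())), mean, sd

freq, mean, sd = summarize([1.0, 2.0, 2.0, 3.0, 4.0, 10.0])
print(round(mean, 3), round(sd, 3))  # 3.667 3.266
```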
Distriblets: Java-Based Distributed Computing on the Web.
ERIC Educational Resources Information Center
Finkel, David; Wills, Craig E.; Brennan, Brian; Brennan, Chris
1999-01-01
Describes a system for using the World Wide Web to distribute computational tasks to multiple hosts on the Web that is written in Java programming language. Describes the programs written to carry out the load distribution, the structure of a "distriblet" class, and experiences in using this system. (Author/LRW)
ERIC Educational Resources Information Center
Fischman, Josh
2007-01-01
In this article, the author talks about Classroom Presenter, a computer program that aids in student participation during class discussions and makes boring lectures more interactive. The program was created by Richard J. Anderson, a professor of computer science at the University of Washington, in Seattle. Classroom Presenter is now in use in…
Inheritance on processes, exemplified on distributed termination detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thomsen, K.S.
1987-02-01
A multiple inheritance mechanism on processes is designed and presented within the framework of a small object oriented language. Processes are described in classes, and the different action parts of a process inherited from different classes are executed in a coroutine-like style called alternation. The inheritance mechanism is a useful tool for factorizing the description of common aspects of processes. This is demonstrated within the domain of distributed programming by using the inheritance mechanism to factorize the description of distributed termination detection algorithms from the description of the distributed main computations for which termination is to be detected. A clear separation of concerns is obtained, and arbitrary combinations of termination detection algorithms and main computations can be formed. The same termination detection classes can also be used for more general purposes within distributed programming, such as detecting termination of each phase in a multi-phase main computation.
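The paper's language is not reproduced in the record; a rough hypothetical Python analogue of the idea uses generators as action parts, with a subclass interleaving the part inherited from the main computation and the part inherited from the termination detector, coroutine-style:

```python
# Hypothetical Python analogue (not the paper's language) of alternation:
# action parts inherited from different classes run as generators,
# interleaved one step at a time.

class MainComputation:
    def __init__(self):
        self.work = list(range(3))

    def main_part(self):
        while self.work:
            item = self.work.pop()
            yield f"processed {item}"

class TerminationDetection:
    def detector_part(self):
        # Factored-out aspect: watches the main computation's state.
        while getattr(self, "work", None):
            yield "still active"
        yield "termination detected"

class Worker(MainComputation, TerminationDetection):
    """Combines a main computation with a termination detector."""
    def run(self):
        parts = [self.main_part(), self.detector_part()]
        log = []
        while parts:
            for part in list(parts):  # alternation: one step of each part
                try:
                    log.append(next(part))
                except StopIteration:
                    parts.remove(part)
        return log

print(Worker().run()[-1])  # termination detected
```

As in the paper's design, the detector class knows nothing about what the main computation does, so other main computations could be combined with the same detector.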
SURE reliability analysis: Program and mathematics
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; White, Allan L.
1988-01-01
The SURE program is a new reliability analysis tool for ultrareliable computer system architectures. The computational methods on which the program is based provide an efficient means for computing accurate upper and lower bounds for the death state probabilities of a large class of semi-Markov models. Once a semi-Markov model is described using a simple input language, the SURE program automatically computes the upper and lower bounds on the probability of system failure. A parameter of the model can be specified as a variable over a range of values directing the SURE program to perform a sensitivity analysis automatically. This feature, along with the speed of the program, makes it especially useful as a design tool.
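SURE's bounding theorems for semi-Markov models are not reproduced in the record; a minimal hypothetical illustration of the quantity it bounds, the probability of reaching an absorbing failure ("death") state, can be computed exactly for a small discrete-time chain:

```python
# Hypothetical illustration (not SURE's algorithm): probability of
# absorption in a failure ("death") state of a small discrete-time
# Markov chain, by iterating the transition matrix to convergence.

# States: 0 = healthy, 1 = degraded, 2 = recovered (absorbing),
# 3 = failed (absorbing death state). Rows sum to 1.
P = [
    [0.90, 0.10, 0.00, 0.00],
    [0.00, 0.50, 0.45, 0.05],
    [0.00, 0.00, 1.00, 0.00],
    [0.00, 0.00, 0.00, 1.00],
]

def death_probability(P, start=0, death=3, steps=10_000):
    dist = [0.0] * len(P)
    dist[start] = 1.0
    for _ in range(steps):  # dist <- dist * P until mass is absorbed
        dist = [sum(dist[i] * P[i][j] for i in range(len(P)))
                for j in range(len(P))]
    return dist[death]

print(round(death_probability(P), 4))  # 0.1
```

Here the answer is analytic as well: once degraded, the chain eventually exits to "recovered" or "failed" in the ratio 0.45 : 0.05, giving a death probability of 0.1. SURE's contribution is producing tight upper and lower bounds for such probabilities in far larger semi-Markov models where exact computation is impractical.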
Bayesian Latent Class Analysis Tutorial.
Li, Yuelin; Lord-Bessen, Jennifer; Shiyko, Mariya; Loeb, Rebecca
2018-01-01
This article is a how-to guide on Bayesian computation using Gibbs sampling, demonstrated in the context of Latent Class Analysis (LCA). It is written for students in quantitative psychology or related fields who have a working knowledge of Bayes' theorem and conditional probability and have experience in writing computer programs in the statistical language R. The overall goals are to provide an accessible and self-contained tutorial, along with a practical computation tool. We begin with how Bayesian computation is typically described in academic articles. Technical difficulties are addressed by a hypothetical, worked-out example. We show how Bayesian computation can be broken down into a series of simpler calculations, which can then be assembled to complete a computationally more complex model. The details are described much more explicitly than is typically available in elementary introductions to Bayesian modeling, so that readers are not overwhelmed by the mathematics. Moreover, the provided computer program shows how Bayesian LCA can be implemented with relative ease. The computer program is then applied to a large, real-world data set and explained line-by-line. We outline the general steps in how to extend these considerations to other methodological applications. We conclude with suggestions for further readings.
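The article's R program is not reproduced in the record; a hypothetical Python sketch of the same scheme for a two-class model with binary items alternates between sampling each respondent's class membership and drawing the parameters from their conjugate Beta posteriors:

```python
# Hypothetical Python sketch (the article uses R) of Gibbs sampling for a
# two-class latent class model with binary items: alternate between
# (1) sampling each respondent's class given the parameters, and
# (2) sampling the class weight and item probabilities from their
#     conjugate Beta posteriors under Beta(1, 1) priors.
import random

random.seed(0)

def gibbs_lca(data, n_iter=500):
    n, k = len(data), len(data[0])
    pi = 0.5                         # P(class 1)
    theta = [[0.3] * k, [0.7] * k]   # item-endorsement probs per class
    z = [random.randrange(2) for _ in range(n)]
    for _ in range(n_iter):
        # (1) sample class memberships given parameters
        for i, row in enumerate(data):
            w = []
            for c in (0, 1):
                p = pi if c == 1 else 1 - pi
                for j, x in enumerate(row):
                    p *= theta[c][j] if x else 1 - theta[c][j]
                w.append(p)
            z[i] = 1 if random.random() < w[1] / (w[0] + w[1]) else 0
        # (2) sample parameters given memberships
        n1 = sum(z)
        pi = random.betavariate(1 + n1, 1 + n - n1)
        for c in (0, 1):
            rows = [r for r, zc in zip(data, z) if zc == c]
            for j in range(k):
                s = sum(r[j] for r in rows)
                theta[c][j] = random.betavariate(1 + s, 1 + len(rows) - s)
    return pi, theta

data = [[1, 1, 1], [1, 1, 0], [0, 0, 0], [0, 0, 1]] * 5
pi, theta = gibbs_lca(data)
print(0.0 < pi < 1.0)  # True
```

This returns only the final draw; a real analysis, as the article describes, would discard a burn-in period and summarize the parameters over many retained draws.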
1983-09-01
be illustrated by example. If 'z' is the name of an individual and 'C' is the name of a class (set), then 'z ∈ C' means that the individual denoted by 'z'…will abbreviate this un z. Conversely, if C is a single-element class, then un⁻¹C selects the unique member of that class: un⁻¹C = ιz(z ∈ C). It is…Professor Peter Henderson, Department of Computer Science, SUNY at Stony Brook, Long Island, NY 11794; Dr. Olle Olsson, Department of Computer Science
CPU SIM: A Computer Simulator for Use in an Introductory Computer Organization-Architecture Class.
ERIC Educational Resources Information Center
Skrein, Dale
1994-01-01
CPU SIM, an interactive low-level computer simulation package that runs on the Macintosh computer, is described. The program is designed for instructional use in the first or second year of undergraduate computer science, to teach various features of typical computer organization through hands-on exercises. (MSE)
Online Secondary Research in the Advertising Research Class: A Friendly Introduction to Computing.
ERIC Educational Resources Information Center
Adler, Keith
In an effort to promote computer literacy among advertising students, an assignment was devised that required the use of online database search techniques to find secondary research materials. The search program, chosen for economical reasons, was "Classroom Instruction Program" offered by Dialog Information Services. Available for a…
Computer System Resource Requirements of Novice Programming Students.
ERIC Educational Resources Information Center
Nutt, Gary J.
The characteristics of jobs that constitute the mix for lower division FORTRAN classes in a university were investigated. Samples of these programs were also benchmarked on a larger central site computer and two minicomputer systems. It was concluded that a carefully chosen minicomputer system could offer service at least the equivalent of the…
Enabling Wide-Scale Computer Science Education through Improved Automated Assessment Tools
NASA Astrophysics Data System (ADS)
Boe, Bryce A.
There is a proliferating demand for newly trained computer scientists as the number of computer science related jobs continues to increase. University programs will only be able to train enough new computer scientists to meet this demand when two things happen: when there are more primary and secondary school students interested in computer science, and when university departments have the resources to handle the resulting increase in enrollment. To meet these goals, significant effort is being made to both incorporate computational thinking into existing primary school education, and to support larger university computer science class sizes. We contribute to this effort through the creation and use of improved automated assessment tools. To enable wide-scale computer science education we do two things. First, we create a framework called Hairball to support the static analysis of Scratch programs targeted for fourth, fifth, and sixth grade students. Scratch is a popular building-block language utilized to pique interest in and teach the basics of computer science. We observe that Hairball allows for rapid curriculum alterations and thus contributes to wide-scale deployment of computer science curriculum. Second, we create a real-time feedback and assessment system utilized in university computer science classes to provide better feedback to students while reducing assessment time. Insights from our analysis of student submission data show that modifications to the system configuration support the way students learn and progress through course material, making it possible for instructors to tailor assignments to optimize learning in growing computer science classes.
The SURE reliability analysis program
NASA Technical Reports Server (NTRS)
Butler, R. W.
1986-01-01
The SURE program is a new reliability tool for ultrareliable computer system architectures. The program is based on computational methods recently developed for the NASA Langley Research Center. These methods provide an efficient means for computing accurate upper and lower bounds for the death state probabilities of a large class of semi-Markov models. Once a semi-Markov model is described using a simple input language, the SURE program automatically computes the upper and lower bounds on the probability of system failure. A parameter of the model can be specified as a variable over a range of values directing the SURE program to perform a sensitivity analysis automatically. This feature, along with the speed of the program, makes it especially useful as a design tool.
The SURE Reliability Analysis Program
NASA Technical Reports Server (NTRS)
Butler, R. W.
1986-01-01
The SURE program is a new reliability analysis tool for ultrareliable computer system architectures. The program is based on computational methods recently developed for the NASA Langley Research Center. These methods provide an efficient means for computing accurate upper and lower bounds for the death state probabilities of a large class of semi-Markov models. Once a semi-Markov model is described using a simple input language, the SURE program automatically computes the upper and lower bounds on the probability of system failure. A parameter of the model can be specified as a variable over a range of values directing the SURE program to perform a sensitivity analysis automatically. This feature, along with the speed of the program, makes it especially useful as a design tool.
Instructional Design: Its Relevance for CALL.
ERIC Educational Resources Information Center
England, Elaine
1989-01-01
Describes an interdisciplinary (language and educational technology departments) instructional design program that is intended to develop back-up computer programs for students taking supplementary English as a second language classes. The program encompasses training programs, the psychology of screen reading, task analysis, and color cueing.…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, B.
The Energy Research program may be on the verge of abdicating an important role it has traditionally played in the development and use of state-of-the-art computer systems. The lack of easy access to Class VI systems coupled to the easy availability of local, user-friendly systems is conspiring to drive many investigators away from forefront research in computational science and in the use of state-of-the-art computers for more discipline-oriented problem solving. The survey conducted under the auspices of this contract clearly demonstrates a significant suppressed demand for actual Class VI hours totaling the full capacity of one such system. The current usage is about a factor of 15 below this level. There is also a need for about 50% more capacity in the current mini/midi availability. Meeting the needs of the ER community for this level of computing power and capacity is most probably best achieved through the establishment of a central Class VI capability at some site linked through a nationwide network to the various ER laboratories and universities and interfaced with the local user-friendly systems at those remote sites.
ERIC Educational Resources Information Center
Tosun, Nilgün; Suçsuz, Nursen; Yigit, Birol
2006-01-01
The purpose of this research was to investigate the effects of the computer-assisted and computer-based instructional methods on students achievement at computer classes and on their attitudes towards using computers. The study, which was completed in 6 weeks, were carried out with 94 sophomores studying in formal education program of Primary…
Attitudes toward Advanced and Multivariate Statistics When Using Computers.
ERIC Educational Resources Information Center
Kennedy, Robert L.; McCallister, Corliss Jean
This study investigated the attitudes toward statistics of graduate students who studied advanced statistics in a course in which the focus of instruction was the use of a computer program in class. The use of the program made it possible to provide an individualized, self-paced, student-centered, and activity-based course. The three sections…
ERIC Educational Resources Information Center
Esit, Omer
2011-01-01
This study investigated the effectiveness of an intelligent computer-assisted language learning (ICALL) program on Turkish learners' vocabulary learning. Within the scope of this research, an ICALL application with a morphological analyser (Your Verbal Zone, YVZ) was developed and used in an English language preparatory class to measure its…
Computer Literacy for Student Teachers and Elementary Children through Programming.
ERIC Educational Resources Information Center
Elliott, John D.
This paper argues that teachers and students should be creators as well as consumers of computer software, not only so that they might better appreciate the skills involved, but also to improve teacher effectiveness in the classroom. As part of their own class work, 13 second-year student teachers developed a branching program using simple basic…
ERIC Educational Resources Information Center
Smith, Ruth Baynard
1994-01-01
Intermediate level academically talented students learn essential elements of computer programming by working with robots at enrichment workshops at Dwight-Englewood School in Englewood, New Jersey. The children combine creative thinking and problem-solving skills to program the robots' microcomputers to perform a variety of movements. (JDD)
A Set of Computer Projects for an Electromagnetic Fields Class.
ERIC Educational Resources Information Center
Gleeson, Ronald F.
1989-01-01
Presented are three computer projects: vector analysis, electric field intensities at various distances, and the Biot-Savart law. Programing suggestions and project results are provided. One month is suggested for each project. (MVL)
NASA Astrophysics Data System (ADS)
Gurov, V. V.
2017-01-01
Software tools for educational purposes, such as e-lessons and computer-based testing systems, have a number of distinctive features from the point of view of reliability. Chief among them are the need to ensure a sufficiently high probability of fault-free operation for a specified time, and the impossibility of rapid recovery during class by replacing the software with a similar running program. The article considers the peculiarities of evaluating the reliability of programs in contrast to assessments of hardware reliability. The essential requirements on the reliability of software used for conducting practical and laboratory classes in the form of computer-based training programs are given. A mathematical tool based on Markov chains is presented; applied to a graph of the interactions among the software modules, it determines the degree of debugging a training program must reach before use in the educational process.
Computer Access and Flowcharting as Variables in Learning Computer Programming.
ERIC Educational Resources Information Center
Ross, Steven M.; McCormick, Deborah
Manipulation of flowcharting was crossed with in-class computer access to examine flowcharting effects in the traditional lecture/laboratory setting and in a classroom setting where online time was replaced with manual simulation. Seventy-two high school students (24 male and 48 female) enrolled in a computer literacy course served as subjects.…
Students' learning of clinical sonography: use of computer-assisted instruction and practical class.
Wood, A K; Dadd, M J; Lublin, J R
1996-08-01
The application of information technology to teaching radiology will profoundly change the way learning is mediated to students. In this project, the integration of veterinary medical students' knowledge of sonography was promoted by a computer-assisted instruction program and a subsequent practical class. The computer-assisted instruction program emphasized the physical principles of clinical sonography and contained simulations and user-active experiments. In the practical class, the students used an actual sonographic machine for the first time and made images of a tissue-equivalent phantom. Students' responses to questionnaires were analyzed. On completing the overall project, 96% of the students said that they now understood sonographic concepts very or reasonably well, and 98% had become very or moderately interested in clinical sonography. The teaching and learning initiatives enhanced an integrated approach to learning, stimulated student interest and curiosity, improved understanding of sonographic principles, and contributed to an increased confidence and skill in using sonographic equipment.
Computer Applications in Assessment and Counseling.
ERIC Educational Resources Information Center
Veldman, Donald J.; Menaker, Shirley L.
Public school counselors and psychologists can expect valuable assistance from computer-based assessment and counseling techniques within a few years, as programs now under development become generally available for the typical computers now used by schools for grade-reporting and class-scheduling. Although routine information-giving and gathering…
Single-Sex Computer Classes: An Effective Alternative.
ERIC Educational Resources Information Center
Swain, Sandra L.; Harvey, Douglas M.
2002-01-01
Advocates single-sex computer instruction as a temporary alternative educational program to provide middle school and secondary school girls with access to computers, to present girls with opportunities to develop positive attitudes towards technology, and to make available a learning environment conducive to girls gaining technological skills.…
Smolarski, D C; Whitehead, T
2000-04-01
In this paper, we describe our recent approaches to introducing students in a beginning computer science class to the study of ethical issues related to computer science and technology. This consists of three components: lectures on ethics and technology, in-class discussion of ethical scenarios, and a reflective paper on a topic related to ethics or the impact of technology on society. We give both student reactions to these aspects, and instructor perspective on the difficulties and benefits in exposing students to these ideas.
Analysis of reference transactions using packaged computer programs.
Calabretta, N; Ross, R
1984-01-01
Motivated by a continuing education class attended by the authors on the measurement of reference desk activities, the reference department at Scott Memorial Library initiated a project to gather data on reference desk transactions and to analyze the data by using packaged computer programs. The programs utilized for the project were SPSS (Statistical Package for the Social Sciences) and SAS (Statistical Analysis System). The planning, implementation and development of the project are described.
Classification of wetlands vegetation using small scale color infrared imagery
NASA Technical Reports Server (NTRS)
Williamson, F. S. L.
1975-01-01
A classification system for Chesapeake Bay wetlands was derived from the correlation of film density classes and actual vegetation classes. The data processing programs used were developed by the Laboratory for the Applications of Remote Sensing. These programs were tested for their value in classifying natural vegetation, using digitized data from small scale aerial photography. Existing imagery and the vegetation map of Farm Creek Marsh were used to determine the optimal number of classes, and to aid in determining if the computer maps were a believable product.
Practical Micro-Computer Uses in Physical Education at George Mason University.
ERIC Educational Resources Information Center
Stein, Julian U.
Both the Apple and TRS-80 microcomputer units are being used in the department of physical education at George Mason University (Virginia). As a first step, a computer program was developed and used in conjunction with an aerobic and personal conditioning class. (The capabilities of this specific program are discussed, and the ways in which it was…
ERIC Educational Resources Information Center
Impelluso, Thomas J.
2009-01-01
Cognitive Load Theory (CLT) was used as a foundation to redesign a computer programming class for mechanical engineers, in which content was delivered with hybrid/distance technology. The effort confirmed the utility of CLT in course design. And it demonstrates that hybrid/distance learning is not merely a tool of convenience, but one, which, when…
One-to-One Computing in Public Schools: Lessons from "Laptops for All" Programs
ERIC Educational Resources Information Center
Abell Foundation, 2008
2008-01-01
The basic tenet of one-to-one computing is that the student and teacher have Internet-connected, wireless computing devices in the classroom and optimally at home as well. Also known as "ubiquitous computing," this strategy assumes that every teacher and student has her own computing device and obviates the need for moving classes to…
ERIC Educational Resources Information Center
Moore, John W., Ed.
1983-01-01
Describes use of Warnier-Orr program design method for preparing general chemistry tutorial on ideal gas calculations. This program (BASIC-PLUS) is available from the author. Also describes a multipurpose computerized class record system at the University of Toledo. (JN)
CAI at CSDF: Organizational Strategies.
ERIC Educational Resources Information Center
Irwin, Margaret G.
1982-01-01
The computer assisted instruction (CAI) program at the California School for the Deaf, at Fremont, features individual Apple computers in classrooms as well as in CAI labs. When the whole class uses computers simultaneously, the teacher can help individuals, identify group weaknesses, note needs of the materials, and help develop additional CAI…
Poems by Computer: Introducing Poetry in a High-Tech Society.
ERIC Educational Resources Information Center
Styne, Marlys M.
Poetry was used in a college English class to teach figurative language, connotation, denotation, and the need for close attention to vocabulary. However, students were often bored by traditional poetry. Using computer programs like "Compupoem,""Poetrywriter,""Lifesongs," and "Haikuku," students were introduced to computer poetry and created their…
NASA Astrophysics Data System (ADS)
Henderson, Jean Foster
The purpose of this study was to assess the effect of classroom restructuring involving computer laboratories on student achievement and student attitudes toward computers and computer courses. The effects of the targeted student attributes of gender, previous programming experience, math background, and learning style were also examined. The open lab-based class structure consisted of a traditional lecture class with a separate, unscheduled lab component in which lab assignments were completed outside of class; the closed lab-based class structure integrated a lab component within the lecture class so that half the class was reserved for lecture and half the class was reserved for students to complete lab assignments by working cooperatively with each other and under the supervision and guidance of the instructor. The sample consisted of 71 students enrolled in four intact classes of Computer Science I during the fall and spring semesters of the 2006--2007 school year at two southern universities: two classes were held in the fall (one at each university) and two classes were held in the spring (one at each university). A counterbalanced repeated measures design was used in which all students experienced both class structures for half of each semester. The order of control and treatment was rotated among the four classes. All students received the same amount of class and instructor time. A multivariate analysis of variance (MANOVA) via a multiple regression strategy was used to test the study's hypotheses. Although the overall MANOVA model was statistically significant, independent follow-up univariate analyses relative to each dependent measure found that the only significant research factor was math background: Students whose mathematics background was at the level of Calculus I or higher had significantly higher student achievement than students whose mathematics background was less than Calculus I. 
The results suggest that classroom structures that incorporate an open laboratory setting are just as effective on student achievement and attitudes as classroom structures that incorporate a closed laboratory setting. The results also suggest that math background is a strong predictor of student achievement in CS 1.
Use of a Technology-Enhanced Version of the Good Behavior Game in an Elementary School Setting
ERIC Educational Resources Information Center
Lynne, Shauna; Radley, Keith C.; Dart, Evan H.; Tingstrom, Daniel H.; Barry, Christopher T.; Lum, John D. K.
2017-01-01
The purpose of this study was to investigate the effectiveness of a variation of the Good Behavior Game (GBG) in which teachers used ClassDojo to manage each team's progress. ClassDojo is a computer-based program that enables teachers to award students with points for demonstrating target behaviors. Dependent variables included class-wide…
Cognitive and Affective Variables and Their Relationships to Performance in a Lotus 1-2-3 Class.
ERIC Educational Resources Information Center
Guster, Dennis; Batt, Richard
1989-01-01
Describes study of two-year college students that was conducted to determine whether variables that were predictors of success in a programing class were also predictors of success in a package-oriented computer class using Lotus 1-2-3. Diagraming skill, critical thinking ability, spatial discrimination, and test anxiety level were examined. (11…
X-Windows Information Sharing Protocol Widget Class
NASA Technical Reports Server (NTRS)
Barry, Matthew R.
2006-01-01
The X-Windows Information Sharing Protocol (ISP) Widget Class ("Class" is used here in the object-oriented-programming sense of the word) was devised to simplify the task of implementing ISP graphical-user-interface (GUI) computer programs. ISP programming tasks require many method calls to identify, query, and interpret the connections and messages exchanged between a client and an ISP server. Most X-Windows GUI programs use widget sets or toolkits to facilitate management of complex objects. The widget standards facilitate construction of toolkits and application programs. The X-Windows ISP Widget Class encapsulates the client side of the ISP programming libraries within the framework of an X-Windows widget. Using the widget framework, X-Windows GUI programs can interact with ISP services in an abstract way and in the same manner as that of other graphical widgets, making it easier to write ISP GUI client programs. Wrapping ISP client services inside a widget framework enables a programmer to treat an ISP server interface as though it were a GUI. Moreover, an alternate subclass could implement another communication protocol in the same sort of widget.
Dyadic Instruction for Middle School Students: Liking Promotes Learning
Hartl, Amy C.; DeLay, Dawn; Laursen, Brett; Denner, Jill; Werner, Linda; Campe, Shannon; Ortiz, Eloy
2015-01-01
This study examines whether friendship facilitates or hinders learning in a dyadic instructional setting. Working in 80 same-sex pairs, 160 (60 girls, 100 boys) middle school students (M = 12.13 years old) were taught a new computer programming language and programmed a game. Students spent 14 to 30 (M = 22.7) hours in a programming class. At the beginning and the end of the project, each participant separately completed (a) computer programming knowledge assessments and (b) questionnaires rating their affinity for their partner. Results support the proposition that liking promotes learning: Greater partner affinity predicted greater subsequent increases in computer programming knowledge for both partners. One partner’s initial programming knowledge also positively predicted the other partner’s subsequent partner affinity. PMID:26688658
Computers in the English Classroom.
ERIC Educational Resources Information Center
Scioli, Frances; And Others
Intended to provide help for teachers and supervisors in using computers in English classes as an enhancement of the instructional program, this guide is organized in three parts. Part 1 focuses on the many management issues related to computer use. This section of the guide presents ideas for helping students with limited keyboarding skills as…
ERIC Educational Resources Information Center
Hubbard, Aleata Kimberly
2017-01-01
In this dissertation, I explored the pedagogical content knowledge of in-service high school educators recently assigned to teach computer science for the first time. Teachers were participating in a professional development program where they co-taught introductory computing classes with tech industry professionals. The study was motivated by…
Computer Gaming at Every Age: A Comparative Evaluation of Alice
ERIC Educational Resources Information Center
Seals, Cheryl D.; McMillian, Yolanda; Rouse, Kenneth; Agarwal, Ravikant; Johnson, Andrea Williams; Gilbert, Juan E.; Chapman, Richard
2008-01-01
This research has two thrusts of teaching object oriented programming to very young audiences and of increasing student excitement about computing applications with the long-term goal of increasing involvement in technology classes, in the use of computer applications and interest in technology careers. The goal of this work was to provide…
Using E-mail in a Math/Computer Core Course.
ERIC Educational Resources Information Center
Gurwitz, Chaya
This paper notes the advantages of using e-mail in computer literacy classes, and discusses the results of incorporating an e-mail assignment in the "Introduction to Mathematical Reasoning and Computer Programming" core course at Brooklyn College (New York). The assignment consisted of several steps. The students first read and responded…
Academic computer science and gender: A naturalistic study investigating the causes of attrition
NASA Astrophysics Data System (ADS)
Declue, Timothy Hall
Far fewer women than men take computer science classes in high school, enroll in computer science programs in college, or complete advanced degrees in computer science. The computer science pipeline begins to shrink for women even before entering college, but it is at the college level that the "brain drain" is the most evident numerically, especially in the first class taken by most computer science majors, called "Computer Science 1" or CS-I. The result, for both academia and industry, is a pronounced technological gender disparity in academic and industrial computer science. The study revealed the existence of several factors influencing success in CS-I. First, and most clearly, the effect of attribution processes seemed to be quite strong. These processes tend to work against success for females and in favor of success for males. Likewise, evidence was discovered which strengthens theories related to prior experience and the perception that computer science has a culture which is hostile to females. Two unanticipated themes emerged, relating to the motivation and persistence of successful computer science majors. The findings did not support the belief that females have greater logistical problems in computer science than males, or that females tend to have a different programming style than males which adversely affects the females' ability to succeed in CS-I.
Evaluating Technology in Schools: Implications of a Research Study.
ERIC Educational Resources Information Center
Kell, Diane; And Others
This report provides an overview of the issues related to a recently completed study of the use of computers in primary classrooms in six school districts and some special program classes, i.e., K-2 English-as-a-Second-Language and bilingual classes at one study site and Chapter 1 pullout classes for grades K-5 at another site. The sites used were…
Computer-aided decision making.
Keith M. Reynolds; Daniel L. Schmoldt
2006-01-01
Several major classes of software technologies have been used in decisionmaking for forest management applications over the past few decades. These computer-based technologies include mathematical programming, expert systems, network models, multi-criteria decisionmaking, and integrated systems. Each technology possesses unique advantages and disadvantages, and has...
On teaching computer ethics within a computer science department.
Quinn, Michael J
2006-04-01
The author has surveyed a quarter of the accredited undergraduate computer science programs in the United States. More than half of these programs offer a 'social and ethical implications of computing' course taught by a computer science faculty member, and there appears to be a trend toward teaching ethics classes within computer science departments. Although the decision to create an 'in house' computer ethics course may sometimes be a pragmatic response to pressure from the accreditation agency, this paper argues that teaching ethics within a computer science department can provide students and faculty members with numerous benefits. The paper lists topics that can be covered in a computer ethics course and offers some practical suggestions for making the course successful.
Al-Sahaf, Harith; Zhang, Mengjie; Johnston, Mark
2016-01-01
In the computer vision and pattern recognition fields, image classification represents an important yet difficult task. It is a challenge to build effective computer models to replicate the remarkable ability of the human visual system, which relies on only one or a few instances to learn a completely new class or an object of a class. Recently we proposed two genetic programming (GP) methods, one-shot GP and compound-GP, that aim to evolve a program for the task of binary classification in images. The two methods are designed to use only one or a few instances per class to evolve the model. In this study, we investigate these two methods in terms of performance, robustness, and complexity of the evolved programs. We use ten data sets that vary in difficulty to evaluate these two methods. We also compare them with two other GP and six non-GP methods. The results show that one-shot GP and compound-GP outperform or achieve results comparable to competitor methods. Moreover, the features extracted by these two methods improve the performance of other classifiers with handcrafted features and those extracted by a recently developed GP-based method in most cases.
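The one-shot GP and compound-GP methods above evolve full programs; as a toy illustration only (not the authors' methods), the sketch below collapses "evolution" to selecting, from a fixed population of candidate expressions over two image statistics, the one that best separates a single training instance per class. All names and the tiny 2x2 "images" are invented for illustration.

```python
# Toy illustration of learning a classifier program from ONE instance
# per class (not the one-shot GP / compound-GP methods themselves):
# candidate "programs" are arithmetic expressions over image statistics.

def stats(img):
    flat = [p for row in img for p in row]
    mean = sum(flat) / len(flat)
    var = sum((p - mean) ** 2 for p in flat) / len(flat)
    return mean, var

# Fixed population of candidate programs over (mean, variance).
OPS = [lambda m, v: m - v, lambda m, v: m + v,
       lambda m, v: m * v, lambda m, v: m - 2 * v]

def fitness(op, example_a, example_b):
    # Margin between the two single training instances.
    return op(*stats(example_a)) - op(*stats(example_b))

def one_shot_select(example_a, example_b, population=OPS):
    # "Evolution" collapsed to selection over a fixed population.
    return max(population, key=lambda op: fitness(op, example_a, example_b))

def classify(op, img, threshold):
    return "A" if op(*stats(img)) > threshold else "B"

class_a = [[9, 8], [9, 9]]   # single bright training instance, class A
class_b = [[0, 1], [0, 0]]   # single dark training instance, class B
best = one_shot_select(class_a, class_b)
thr = (best(*stats(class_a)) + best(*stats(class_b))) / 2
print(classify(best, [[8, 9], [9, 8]], thr))  # prints: A
```

A real GP system would instead mutate and recombine expression trees over many generations; the selection-by-fitness step shown here is the part the two methods share.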
ERIC Educational Resources Information Center
Collier, Herbert I.
1978-01-01
Energy conservation programs at Louisiana State University reduced energy use 23 percent. The programs involved computer controlled power management systems, adjustment of building temperatures and lighting levels to prescribed standards, consolidation of night classes, centralization of chilled water systems, and manual monitoring of heating and…
Introduction to computers: Reference guide
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ligon, F.V.
1995-04-01
The "Introduction to Computers" program establishes formal partnerships with local school districts and community-based organizations, introduces computer literacy to precollege students and their parents, and encourages students to pursue Scientific, Mathematical, Engineering, and Technical careers (SET). Hands-on assignments are given in each class, reinforcing the lesson taught. In addition, the program is designed to broaden the knowledge base of teachers in scientific/technical concepts, and Brookhaven National Laboratory continues to act as a liaison, offering educational outreach to diverse community organizations and groups. This manual contains the teacher's lesson plans and the student documentation for this introduction to computer course.
ERIC Educational Resources Information Center
Swan, Karen; Kratcoski, Annette; Mazzer, Pat; Schenker, Jason
2005-01-01
This article describes an ongoing situated professional development program in which teachers bring their intact classes for an extended stay in a ubiquitous computing environment equipped with a variety of state-of-the-art computing devices. The experience is unique in that it not only situates teacher learning about technology integration in…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-17
... any person wishing to bring a laptop computer into the Forrestal Building will be required to obtain a...; VRF water-source heat pumps at or greater than 135,000 Btu/h; and computer room air conditioners. DOE...-created classes of variable refrigerant flow air conditioners and heat pumps, ASHRAE 127 for computer room...
Simulating electron energy loss spectroscopy with the MNPBEM toolbox
NASA Astrophysics Data System (ADS)
Hohenester, Ulrich
2014-03-01
Within the MNPBEM toolbox, we show how to simulate electron energy loss spectroscopy (EELS) of plasmonic nanoparticles using a boundary element method approach. The methodology underlying our approach closely follows the concepts developed by García de Abajo and coworkers (García de Abajo, 2010). We introduce two classes, eelsret and eelsstat, that allow in combination with our recently developed MNPBEM toolbox for a simple, robust, and efficient computation of EEL spectra and maps. The classes are accompanied by a number of demo programs for EELS simulation of metallic nanospheres, nanodisks, and nanotriangles, and for electron trajectories passing by or penetrating through the metallic nanoparticles. We also discuss how to compute electric fields induced by the electron beam and cathodoluminescence.
Catalogue identifier: AEKJ_v2_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEKJ_v2_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 38886
No. of bytes in distributed program, including test data, etc.: 1222650
Distribution format: tar.gz
Programming language: Matlab 7.11.0 (R2010b)
Computer: Any which supports Matlab 7.11.0 (R2010b)
Operating system: Any which supports Matlab 7.11.0 (R2010b)
RAM: ≥1 GB
Classification: 18
Catalogue identifier of previous version: AEKJ_v1_0
Journal reference of previous version: Comput. Phys. Comm. 183 (2012) 370
External routines: MESH2D, available at www.mathworks.com
Does the new version supersede the previous version?: Yes
Nature of problem: Simulation of electron energy loss spectroscopy (EELS) for plasmonic nanoparticles.
Solution method: Boundary element method using electromagnetic potentials.
Reasons for new version: The new version of the toolbox includes two additional classes for the simulation of electron energy loss spectroscopy (EELS) of plasmonic nanoparticles, and corrects a few minor bugs and inconsistencies. Summary of revisions: New classes “eelsstat” and “eelsret” for the simulation of electron energy loss spectroscopy (EELS) of plasmonic nanoparticles have been added. A few minor errors in the implementation of dipole excitation have been corrected. Running time: Depending on surface discretization between seconds and hours.
Health and Wellness After School.
ERIC Educational Resources Information Center
Kolbe, Grace C.; Berkin, Beverly
2000-01-01
Although after-school programs offer many activities--from cooking classes to computer technology, homework assistance, and sports--they also provide an effective environment for health education and wellness instruction, especially pregnancy prevention. Exemplary programs for middle- and high-schoolers in Palm Beach County, Florida, are…
Adding Interactivity to a Non-Interactive Class
ERIC Educational Resources Information Center
Rogers, Gary; Krichen, Jack
2004-01-01
The IT 3050 course at Capella University is an introduction to fundamental computer networking. This course is one of the required courses in the Bachelor of Science in Information Technology program. In order to provide a more enriched learning environment for learners, Capella has significantly modified this class (and others) by infusing it…
Object-Oriented Programming in High Schools the Turing Way.
ERIC Educational Resources Information Center
Holt, Richard C.
This paper proposes an approach to introducing object-oriented concepts to high school computer science students using the Object-Oriented Turing (OOT) language. Students can learn about basic object-oriented (OO) principles such as classes and inheritance by using and expanding a collection of classes that draw pictures like circles and happy…
Review of the Program. Report No. R-60.
ERIC Educational Resources Information Center
Pennsylvania State Univ., University Park. Computer-Assisted Instruction Lab.
The nine-year history of the Computer Assisted Instruction Laboratory, College of Education, Pennsylvania State University, is traced. Some 30 projects in curriculum development in teacher education, public school classes, and adult vocational education are described, along with several advances in computer-assisted instruction (CAI).…
Hypercluster Parallel Processor
NASA Technical Reports Server (NTRS)
Blech, Richard A.; Cole, Gary L.; Milner, Edward J.; Quealy, Angela
1992-01-01
Hypercluster computer system includes multiple digital processors, operation of which coordinated through specialized software. Configurable according to various parallel-computing architectures of shared-memory or distributed-memory class, including scalar computer, vector computer, reduced-instruction-set computer, and complex-instruction-set computer. Designed as flexible, relatively inexpensive system that provides single programming and operating environment within which one can investigate effects of various parallel-computing architectures and combinations on performance in solution of complicated problems like those of three-dimensional flows in turbomachines. Hypercluster software and architectural concepts are in public domain.
NASA Astrophysics Data System (ADS)
Bayu Bati, Tesfaye; Gelderblom, Helene; van Biljon, Judy
2014-01-01
The challenge of teaching programming in higher education is complicated by problems associated with large class teaching, a prevalent situation in many developing countries. This paper reports on an investigation into the use of a blended learning approach to teaching and learning of programming in a class of more than 200 students. A course and learning environment was designed by integrating constructivist learning models of Constructive Alignment, Conversational Framework and the Three-Stage Learning Model. Design science research is used for the course redesign and development of the learning environment, and action research is integrated to undertake participatory evaluation of the intervention. The action research involved the Students' Approach to Learning survey, a comparative analysis of students' performance, and qualitative data analysis of data gathered from various sources. The paper makes a theoretical contribution in presenting a design of a blended learning solution for large class teaching of programming grounded in constructivist learning theory and use of free and open source technologies.
Perl-speaks-NONMEM (PsN)--a Perl module for NONMEM related programming.
Lindbom, Lars; Ribbing, Jakob; Jonsson, E Niclas
2004-08-01
The NONMEM program is the most widely used nonlinear regression software in population pharmacokinetic/pharmacodynamic (PK/PD) analyses. In this article we describe a programming library, Perl-speaks-NONMEM (PsN), intended for programmers who aim to use the computational capability of NONMEM in external applications. The library is object oriented and written in the programming language Perl. The classes of the library are built around NONMEM's data, model and output files. The specification of the NONMEM model is easily set or changed through the model and data file classes, while the output from a model fit is accessed through the output file class. The classes have methods that help the programmer perform common repetitive tasks, e.g. summarising the output from a NONMEM run, setting the initial estimates of a model based on a previous run, or truncating values over a certain threshold in the data file. PsN creates a basis for the development of high-level software using NONMEM as the regression tool.
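The design the PsN abstract describes — one class per file kind, with methods for the repetitive tasks — can be sketched as follows (in Python rather than Perl, and with hypothetical class and column names; this is the pattern, not the PsN API):

```python
# Sketch of the file-class pattern described in the PsN abstract:
# the data, model and output files each get a class, and repetitive
# tasks (truncating values, seeding initial estimates) become methods.

class DataFile:
    def __init__(self, rows):
        self.rows = rows  # list of dicts, one per data record

    def truncate(self, column, threshold):
        # "Truncating values over a certain threshold in the data file."
        for row in self.rows:
            if row[column] > threshold:
                row[column] = threshold

class OutputFile:
    def __init__(self, final_estimates):
        self.final_estimates = dict(final_estimates)

class ModelFile:
    def __init__(self, initial_estimates):
        self.initial_estimates = dict(initial_estimates)

    def set_initial_estimates_from(self, output):
        # "Setting the initial estimates of a model based on a previous run."
        self.initial_estimates.update(output.final_estimates)

data = DataFile([{"CONC": 4.2}, {"CONC": 11.8}])
data.truncate("CONC", 10.0)
model = ModelFile({"CL": 1.0, "V": 10.0})
model.set_initial_estimates_from(OutputFile({"CL": 1.7, "V": 12.3}))
print(data.rows[1]["CONC"], model.initial_estimates["CL"])  # 10.0 1.7
```

The benefit, as in PsN, is that a calling script chains such methods instead of re-parsing the text files in every new application.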
Feasibility of a home-delivered Internet obesity prevention program for fourth-grade students.
Owens, Scott; Lambert, Laurel; McDonough, Suzanne; Green, Kenneth; Loftin, Mark
2009-08-01
This pilot study examined the feasibility of an interactive obesity prevention program delivered to a class of fourth-grade students utilizing daily e-mail messages sent to the students' home computers. The study involved a single intact class of 22 students, 17 (77%) of whom submitted parental permission documentation and received e-mail messages each school day over the course of one month. Concerns regarding Internet safety and children's use of e-mail were addressed fairly easily. Cost/benefit issues for the school did not seem prohibitive. Providing e-mail access to students without a home computer was accomplished by loaning them personal digital assistant (PDA) devices. In larger interventions, loaning PDAs is probably not feasible economically, although cell phones may be an acceptable alternative. It was concluded that this type of interactive obesity prevention program is feasible from most perspectives. Data from a larger scale effectiveness study is still needed.
Performance assessment of small-package-class nonintrusive inspection systems
NASA Astrophysics Data System (ADS)
Spradling, Michael L.; Hyatt, Roger
1997-02-01
The DoD Counterdrug Technology Development Program has addressed the development and demonstration of technology to enhance nonintrusive inspection of small packages such as passenger baggage, commercially delivered parcels, and breakbulk cargo items. Within the past year they have supported several small package-class nonintrusive inspection system performance assessment activities. All performance assessment programs involved the use of a red/blue team concept and were conducted in accordance with approved assessment protocols. This paper presents a discussion related to the systematic performance assessment of small package-class nonintrusive inspection technologies, including transmission, backscatter and computed tomography x-ray imaging, and protocol-related considerations for the assessment of these systems.
1984-09-01
"Verification Technique for a Class of Security Kernels," International Symposium on Programming, Lecture Notes in Computer Science 137, Springer-Verlag, New York. September 1984. MTR9S31. J. K. Millen, C. M. Cerniglia: Computer Security Models. Contract sponsor: OUSDRE/C3I & ESD/ALEE. ABSTRACT: The purpose of this report is to provide a basis for evaluating security models in the context of secure computer system development.
A Qualitative Study of Students' Computational Thinking Skills in a Data-Driven Computing Class
ERIC Educational Resources Information Center
Yuen, Timothy T.; Robbins, Kay A.
2014-01-01
Critical thinking, problem solving, the use of tools, and the ability to consume and analyze information are important skills for the 21st century workforce. This article presents a qualitative case study that follows five undergraduate biology majors in a computer science course (CS0). This CS0 course teaches programming within a data-driven…
Computer Assisted Testing at the Education Resource Center.
ERIC Educational Resources Information Center
Uffelman, Robert L.
The development of the Computer Assisted Testing (CAT) System at the University of Delaware is described. The introduction presents the background leading up to interactive terminal testing in 1973. Documentation for the system includes CAT System programs, format of questions for constructing test item pools, format for entering class lists,…
Improving Transfer of Learning in a Computer Based Classroom.
ERIC Educational Resources Information Center
Davis, Jay Bee
This report describes a program for improving the transfer of the learning of different techniques used in computer applications. The targeted population consisted of sophomores and juniors in a suburban high school in a middle class community. The problem was documented through teacher surveys, student surveys, anecdotal records and behavioral…
Computer-Simulated Psychotherapy as an Aid in Teaching Clinical Psychology.
ERIC Educational Resources Information Center
Suler, John R.
1987-01-01
Describes how Eliza, a widely known computer program which simulates the responses of a psychotherapist, can be used as a teaching aid in undergraduate clinical psychology classes. Provides information on conducting the exercise, integrating it into the course syllabus, and evaluating its impact on students. (JDH)
Teaching Pascal's Triangle from a Computer Science Perspective
ERIC Educational Resources Information Center
Skurnick, Ronald
2004-01-01
Pascal's Triangle is named for the seventeenth-century French philosopher and mathematician Blaise Pascal (the same person for whom the computer programming language is named). Students are generally introduced to Pascal's Triangle in an algebra or precalculus class in which the Binomial Theorem is presented. This article, presents a new method…
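The "computer science perspective" the record alludes to is that each row of Pascal's Triangle is computable from the previous one, and row n lists the binomial coefficients C(n, 0) … C(n, n) of the Binomial Theorem. A short sketch:

```python
# Pascal's Triangle built row by row: each interior entry is the sum
# of the two entries above it, and row n holds the binomial
# coefficients C(n, 0) ... C(n, n) of the Binomial Theorem.

from math import comb  # Python 3.8+

def pascal_row(prev):
    return [1] + [a + b for a, b in zip(prev, prev[1:])] + [1]

def pascal_triangle(n_rows):
    rows = [[1]]
    while len(rows) < n_rows:
        rows.append(pascal_row(rows[-1]))
    return rows

triangle = pascal_triangle(5)
print(triangle[4])  # [1, 4, 6, 4, 1]
# Cross-check against the closed-form binomial coefficients:
assert triangle[4] == [comb(4, k) for k in range(5)]
```

So, for example, row 4 gives the coefficients of (x + y)^4 = x^4 + 4x^3y + 6x^2y^2 + 4xy^3 + y^4.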
Girls and Computer Technology: Barrier or Key?
ERIC Educational Resources Information Center
Gipson, Joella
1997-01-01
Discusses the disparity in numbers of girls and boys taking math, science, and computer classes in elementary and secondary schools, and examines steps being taken to better prepare girls, especially minority girls, for an increasingly technical society. A program in Michigan is described that involved a school and business partnership. (LRW)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Swindeman, M. J.; Jetter, R. I.; Sham, T. -L.
One of the objectives of the high temperature design methodology activities is to develop and validate both improvements and the basic features of ASME Boiler and Pressure Vessel Code, Section III, Rules for Construction of Nuclear Facility Components, Division 5, High Temperature Reactors, Subsection HB, Subpart B (HBB). The overall scope of this task is to develop a computer program to aid assessment procedures of components under specified loading conditions in accordance with the elevated temperature design requirements for Division 5 Class A components. There are many features and alternative paths of varying complexity in HBB. The initial focus of this computer program is a basic path through the various options for a single reference material, 316H stainless steel. However, the computer program is being structured for eventual incorporation of all of the features and permitted materials of HBB. This report will first provide a description of the overall computer program, particular challenges in developing numerical procedures for the assessment, and an overall approach to computer program development. This is followed by a more comprehensive appendix, which is the draft computer program manual for the program development. The strain limits rules have been implemented in the computer program. The evaluation of creep-fatigue damage will be implemented in future work scope.
Digital Simulation in Education.
ERIC Educational Resources Information Center
Braun, Ludwig
Simulation as a mode of computer use in instruction has been neglected by educators. This paper briefly explores the circumstances in which simulations are useful and presents several examples of simulation programs currently being used in high-school biology, chemistry, physics, and social studies classes. One program, STERIL, which simulates…
"Extreme Programming" in a Bioinformatics Class
ERIC Educational Resources Information Center
Kelley, Scott; Alger, Christianna; Deutschman, Douglas
2009-01-01
The importance of Bioinformatics tools and methodology in modern biological research underscores the need for robust and effective courses at the college level. This paper describes such a course designed on the principles of cooperative learning based on a computer software industry production model called "Extreme Programming" (EP).…
ERIC Educational Resources Information Center
PytlikZillig, Lisa M.; Horn, Christy A.; Bruning, Roger; Bell, Stephanie; Liu, Xiongyi; Siwatu, Kamau O.; Bodvarsson, Mary C.; Kim, Doyoung; Carlson, Deborah
2011-01-01
Two frequently-used discussion protocols were investigated as part of a program to implement teaching cases in undergraduate educational psychology classes designed for preservice teachers. One protocol involved synchronous face-to-face (FTF) discussion of teaching cases, which occurred in class after students had individually completed written…
NASA Technical Reports Server (NTRS)
Chang, I. C.
1984-01-01
A new computer program is presented for calculating the quasi-steady transonic flow past a helicopter rotor blade in hover as well as in forward flight. The program is based on the full potential equations in a blade attached frame of reference and is capable of treating a very general class of rotor blade geometries. Computed results show good agreement with available experimental data for both straight and swept tip blade geometries.
User's guide to STIPPAN: A panel method program for slotted tunnel interference prediction
NASA Technical Reports Server (NTRS)
Kemp, W. B., Jr.
1985-01-01
Guidelines are presented for use of the computer program STIPPAN to simulate the subsonic flow in a slotted wind tunnel test section with a known model disturbance. Input data requirements are defined in detail and other aspects of the program usage are discussed in more general terms. The program is written for use in a CDC CYBER 200 class vector processing system.
Benko, Matúš; Gfrerer, Helmut
2018-01-01
In this paper, we consider a sufficiently broad class of non-linear mathematical programs with disjunctive constraints, which, e.g., include mathematical programs with complementarity/vanishing constraints. We present an extension of the concept of [Formula: see text]-stationarity which can be easily combined with the well-known notion of M-stationarity to obtain the stronger property of so-called [Formula: see text]-stationarity. We show how the property of [Formula: see text]-stationarity (and thus also of M-stationarity) can be efficiently verified for the considered problem class by computing [Formula: see text]-stationary solutions of a certain quadratic program. We further consider the situation in which the point to be tested for [Formula: see text]-stationarity is not known exactly but is approximated by some convergent sequence, as is usually the case when applying a numerical method.
ERIC Educational Resources Information Center
Schaumburg, Heike
The goal of this study was to find out if the difference between boys and girls in computer literacy can be leveled out in a laptop program where each student has his/her own mobile computer to work with at home and at school. Ninth grade students (n=113) from laptop and non-laptop classes in a German high school were tested for their computer…
Development of an ADP Training Program to Serve the EPA Data Processing Community.
1976-07-29
divide, compute, perform and alter statements; data representation and conversion; table processing; and indexed sequential and random access file... processing. The course workshop will include the testing of coded exercises and problems on a computer system. CLASS SIZE: Individualized. METHODS/CONDUCT... familiarization with computer concepts will be helpful. OBJECTIVES OF CURRICULUM: After completing this course, the student should have developed a working...
Problem Solving Software for Math Classes.
ERIC Educational Resources Information Center
Troutner, Joanne
1987-01-01
Described are 10 computer software programs for problem solving related to mathematics. Programs described are: (1) Box Solves Story Problems; (2) Safari Search; (3) Puzzle Tanks; (4) The King's Rule; (5) The Factory; (6) The Royal Rules; (7) The Enchanted Forest; (8) Gears; (9) The Super Factory; and (10) Creativity Unlimited. (RH)
ERIC Educational Resources Information Center
Heiner, Cecily
2009-01-01
Students in introductory programming classes often articulate their questions and information needs incompletely. Consequently, the automatic classification of student questions to provide automated tutorial responses is a challenging problem. This dissertation analyzes 411 questions from an introductory Java programming course by reducing the…
Exchanging Education and Culture.
ERIC Educational Resources Information Center
Gustafson, Christine; Knowlton, Leslie
1993-01-01
An eight-week residential program at the University of California at Irvine aims to increase representation of Native American students in high-tech fields and to encourage transfer of Native American students to four-year programs. Students spend four weeks in intensive computer science classes and four weeks serving as interns at sponsoring…
Programming and Problem Solving.
ERIC Educational Resources Information Center
Elias, Barbara P.
A study was conducted to examine computer programming as a problem solving activity. Thirteen fifth grade children were selected by their teacher from an above average class to use Apple IIe microcomputers. The investigator conducted sessions of 40-50 minutes with the children in groups of two or three. Four problems, incorporating the programming…
NASA Technical Reports Server (NTRS)
Mitchell, Paul H.
1991-01-01
F77NNS (FORTRAN 77 Neural Network Simulator) computer program simulates popular back-error-propagation neural network. Designed to take advantage of vectorization when used on computers having this capability, also used on any computer equipped with ANSI-77 FORTRAN Compiler. Problems involving matching of patterns or mathematical modeling of systems fit class of problems F77NNS designed to solve. Program has restart capability so neural network solved in stages suitable to user's resources and desires. Enables user to customize patterns of connections between layers of network. Size of neural network F77NNS applied to limited only by amount of random-access memory available to user.
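F77NNS itself is a FORTRAN 77 code; as an illustration of the algorithm class it implements (back-error-propagation with one hidden layer), here is a minimal Python sketch on a toy pattern-matching task. The task, layer sizes, and learning rate are illustrative choices, not F77NNS's defaults.

```python
# Minimal back-error-propagation sketch (illustrating the algorithm
# class F77NNS implements, not the F77NNS program itself): a single
# hidden layer of sigmoid units trained by online gradient descent.
import math
import random

random.seed(1)
sigmoid = lambda x: 1.0 / (1.0 + math.exp(-x))

def forward(w1, w2, x):
    # Hidden activations, then a single sigmoid output unit.
    h = [sigmoid(sum(wi * xi for wi, xi in zip(row, x))) for row in w1]
    y = sigmoid(sum(wi * hi for wi, hi in zip(w2, h)))
    return h, y

def total_error(w1, w2, samples):
    return sum((forward(w1, w2, x)[1] - t) ** 2 for x, t in samples)

def train(w1, w2, samples, lr=1.0, epochs=3000):
    for _ in range(epochs):
        for x, t in samples:
            h, y = forward(w1, w2, x)
            dy = (y - t) * y * (1 - y)              # output-layer delta
            for j, hj in enumerate(h):
                dh = dy * w2[j] * hj * (1 - hj)     # hidden-layer delta
                w2[j] -= lr * dy * hj               # uses pre-update w2[j] above
                for i, xi in enumerate(x):
                    w1[j][i] -= lr * dh * xi

# Toy pattern-matching task: logical OR, with a constant bias input of 1.
samples = [([0, 0, 1], 0), ([0, 1, 1], 1), ([1, 0, 1], 1), ([1, 1, 1], 1)]
w1 = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(3)]
w2 = [random.uniform(-1, 1) for _ in range(3)]
err_before = total_error(w1, w2, samples)
train(w1, w2, samples)
err_after = total_error(w1, w2, samples)
print(err_before, err_after)
```

As the abstract notes for F77NNS, this class of code vectorizes naturally: the per-unit loops above become matrix-vector products on hardware that supports them.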
An Evaluation of Computer-Managed Education Technology at New York City Community College.
ERIC Educational Resources Information Center
Chitayat, Linda
The Computer-Managed Education Technology (COMET) program was designed to improve group instruction through the use of technological aids in the classroom. Specific objectives included: (1) improving feedback on student comprehension during a class period; (2) facilitating the administration and grading of homework and quizzes; (3) providing for…
Computer Aided Learning of Mathematics: Software Evaluation
ERIC Educational Resources Information Center
Yushau, B.; Bokhari, M. A.; Wessels, D. C. J.
2004-01-01
Computer Aided Learning of Mathematics (CALM) has been in use for some time in the Prep-Year Mathematics Program at King Fahd University of Petroleum & Minerals. Different kinds of software (both locally designed and imported) have been used in the quest of optimizing the recitation/problem session hour of the mathematics classes. This paper…
An Analysis of a Computer Assisted Learning System: Student Perception and Reactions.
ERIC Educational Resources Information Center
Gibbs, W. J.; And Others
Within the Mathematics of Finance classes at the Smeal College of Business Administration at Penn State University, lectures are developed using Asymetrix's ToolBook program and are presented through a computer system. This approach was implemented because it has the potential to effectively convey concepts that are ordinarily difficult to…
The Computer: An Art Tool for the Visually Gifted. A Curriculum Guide.
ERIC Educational Resources Information Center
Suter, Thomas E.; Bibbey, Melissa R.
This curriculum guide, developed and used in Wheelersburg (Ohio) with visually talented students, shows how such students can be taught to utilize computers as an art medium and tool. An initial section covers program implementation including setup, class structure and scheduling, teaching strategies, and housecleaning and maintenance. Seventeen…
NASA Technical Reports Server (NTRS)
Kemp, William B., Jr.
1990-01-01
Guidelines are presented for use of the computer program PANCOR to assess the interference due to tunnel walls and model support in a slotted wind tunnel test section at subsonic speeds. Input data requirements are described in detail and program output and general program usage are described. The program is written for effective automatic vectorization on a CDC CYBER 200 class vector processing system.
Big Data: Next-Generation Machines for Big Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hack, James J.; Papka, Michael E.
Addressing the scientific grand challenges identified by the US Department of Energy’s (DOE’s) Office of Science’s programs alone demands a total leadership-class computing capability of 150 to 400 Pflops by the end of this decade. The successors to three of the DOE’s most powerful leadership-class machines are set to arrive in 2017 and 2018: the products of the Collaboration Oak Ridge Argonne Livermore (CORAL) initiative, a national laboratory–industry design/build approach to engineering next-generation petascale computers for grand challenge science. These mission-critical machines will enable discoveries in key scientific fields such as energy, biotechnology, nanotechnology, materials science, and high-performance computing, and serve as a milestone on the path to deploying exascale computing capabilities.
Computer program for supersonic Kernel-function flutter analysis of thin lifting surfaces
NASA Technical Reports Server (NTRS)
Cunningham, H. J.
1974-01-01
This report describes a computer program (program D2180) that has been prepared to implement the analysis described in (N71-10866) for calculating the aerodynamic forces on a class of harmonically oscillating planar lifting surfaces in supersonic potential flow. The planforms treated are the delta and modified-delta (arrowhead) planforms with subsonic leading and supersonic trailing edges, and (essentially) pointed tips. The resulting aerodynamic forces are applied in a Galerkin modal flutter analysis. The required input data are the flow and planform parameters including deflection-mode data, modal frequencies, and generalized masses.
Supercomputer optimizations for stochastic optimal control applications
NASA Technical Reports Server (NTRS)
Chung, Siu-Leung; Hanson, Floyd B.; Xu, Huihuang
1991-01-01
Supercomputer optimizations for a computational method of solving stochastic, multibody, dynamic programming problems are presented. The computational method is valid for a general class of optimal control problems that are nonlinear, multibody dynamical systems, perturbed by general Markov noise in continuous time, i.e., nonsmooth Gaussian as well as jump Poisson random white noise. Optimization techniques for vector multiprocessors or vectorizing supercomputers include advanced data structures, loop restructuring, loop collapsing, blocking, and compiler directives. These advanced computing techniques and supercomputing hardware help alleviate Bellman's curse of dimensionality in dynamic programming computations by permitting the solution of large multibody problems. Possible applications include lumped flight dynamics models for uncertain environments, such as large scale and background random aerospace fluctuations.
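A toy illustration of the dynamic programming computations at issue, and of why Bellman's curse of dimensionality bites (a d-dimensional state grid has N**d entries): value iteration on a small one-dimensional chain (hypothetical rewards; Python for illustration, not the authors' supercomputer code):

```python
def value_iteration(n_states, reward, gamma=0.9, iters=100):
    """Each state may stay (reward 0) or move right (collecting the reward
    at the next state). Iterate the Bellman update to a fixed point."""
    V = [0.0] * n_states
    for _ in range(iters):
        V = [
            max(gamma * V[s],                                  # stay put
                (reward[s + 1] + gamma * V[s + 1])             # move right
                if s + 1 < n_states else 0.0)
            for s in range(n_states)
        ]
    return V

# A 5-state chain with a single reward of 10 at the rightmost state:
V = value_iteration(5, reward=[0, 0, 0, 0, 10])
```

Adding even one more state dimension squares the table size, which is what the vectorization and blocking techniques in the paper are meant to tame.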
RighTime: A real time clock correcting program for MS-DOS-based computer systems
NASA Technical Reports Server (NTRS)
Becker, G. Thomas
1993-01-01
A computer program is described which effectively eliminates the shortcomings of the DOS system clock in PC/AT-class computers. RighTime is a small, sophisticated memory-resident program that automatically corrects both the DOS system clock and the hardware 'CMOS' real time clock (RTC) in real time. RighTime learns what corrections are required without operator interaction beyond the occasional accurate time set. Both warm (power on) and cool (power off) errors are corrected, usually yielding better than one part per million accuracy in the typical desktop computer with no additional hardware, and RighTime increases the system clock resolution from approximately 0.0549 second to 0.01 second. Program tools are also available which allow visualization of RighTime's actions, verification of its performance, and display of its history log, and which provide data for graphing of the system clock behavior. The program has found application in a wide variety of industries, including astronomy, satellite tracking, communications, broadcasting, transportation, public utilities, manufacturing, medicine, and the military.
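The abstract does not give RighTime's actual algorithm. The underlying idea of learning a correction from occasional accurate time sets can be sketched as a linear drift-rate estimate (an assumed simplification, in Python; the times and errors below are made up):

```python
def drift_rate(t0, err0, t1, err1):
    """Clock error accumulated per second of real time, estimated from
    two accurate time sets at real times t0 and t1 (seconds)."""
    return (err1 - err0) / (t1 - t0)

def corrected(clock_reading, last_set, rate):
    """Subtract the predicted error accumulated since the last time set."""
    return clock_reading - rate * (clock_reading - last_set)

# Example: the clock was found 0.6 s fast at t=100000 s and 1.2 s fast
# at t=200000 s, i.e. it gains about 6 microseconds per second (6 ppm).
r = drift_rate(100_000, 0.6, 200_000, 1.2)
now = corrected(300_000, last_set=200_000, rate=r)
```

Between time sets, such a program can keep applying the learned rate, which is how sub-part-per-million accuracy is achievable without extra hardware.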
Cisco Networking Academy Program for high school students: Formative & summative evaluation
NASA Astrophysics Data System (ADS)
Cranford-Wesley, Deanne
This study examined the effectiveness of the Cisco Network Technology Program in enhancing students' technology skills as measured by classroom strategies, student motivation, student attitude, and student learning. Qualitative and quantitative methods were utilized to determine the effectiveness of this program. The study focused on two 11th grade classrooms at Hamtramck High School. Hamtramck, an inner-city community located in Detroit, is racially and ethnically diverse. The majority of students speak English as a second language; more than 20 languages are represented in the school district. More than 70% of the students are considered to be economically at risk. Few students have computers at home, and their access to the few computers at school is limited. Purposive sampling was conducted for this study. The sample consisted of 40 students, all of whom were trained in Cisco Networking Technologies. The researcher examined viable learning strategies in teaching a Cisco Networking class that focused on a web-based approach. Findings revealed that the Cisco Networking Academy Program was an excellent vehicle for teaching networking skills and, therefore, helping to enhance computer skills for the participating students. However, only a limited number of students were able to participate in the program, due to limited computer labs and lack of qualified teaching personnel. In addition, the cumbersome technical language posed an obstacle to students' success in networking. Laboratory assignments were preferred by 90% of the students over lecture and PowerPoint presentations. Practical applications, lab projects, interactive assignments, PowerPoint presentations, lectures, discussions, readings, research, and assessment all helped to increase student learning and proficiency and to enrich the classroom experience. Classroom strategies are crucial to student success in the networking program. 
Equipment must be updated and utilized to ensure that students are applying practical skills to networking concepts. The results also suggested a high level of motivation and retention in student participants. Students in both classes scored 80% proficiency on the Achievement Motivation Profile Assessment. The identified standard proficiency score was 70%, and both classes exceeded the standard.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morris, Karla
Although the high-performance computing (HPC) community increasingly embraces object-oriented programming (OOP), most HPC OOP projects employ the C++ programming language. Until recently, Fortran programmers interested in mining the benefits of OOP had to emulate OOP in Fortran 90/95. The advent of widespread compiler support for Fortran 2003 now facilitates explicitly constructing object-oriented class hierarchies via inheritance and leveraging related class behaviors such as dynamic polymorphism. Although C++ allows a class to inherit from multiple parent classes, Fortran and several other OOP languages restrict or prohibit explicit multiple inheritance relationships in order to circumvent several pitfalls associated with them. Nonetheless, what appears as an intrinsic feature in one language can be modeled as a user-constructed design pattern in another language. The present paper demonstrates how to apply the facade structural design pattern to support a multiple inheritance class relationship in Fortran 2003. As a result, the design unleashes the power of the associated class relationships for modeling complicated data structures yet avoids the ambiguities that plague some multiple inheritance scenarios.
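The facade idea can be illustrated outside Fortran. In this sketch (Python, which has true multiple inheritance, is used here only to keep the example short), a class extends one parent directly and composes an instance of the second, delegating to it so that both interfaces are exposed, as a single-inheritance language must do:

```python
class Reader:
    def read(self):
        return "read"

class Writer:
    def write(self):
        return "write"

class ReadWriteFacade(Reader):
    """Facade: inherits Reader's interface directly and re-exposes
    Writer's interface by delegating to a composed Writer instance."""
    def __init__(self):
        self._writer = Writer()          # composition, not inheritance

    def write(self):
        return self._writer.write()      # delegate to the second "parent"

rw = ReadWriteFacade()
```

In Fortran 2003, the same shape would use `extends` for the first parent and a component plus type-bound procedures for the delegation; the explicit delegation also sidesteps diamond-inheritance ambiguities.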
A performance comparison of the Cray-2 and the Cray X-MP
NASA Technical Reports Server (NTRS)
Schmickley, Ronald; Bailey, David H.
1986-01-01
A suite of thirteen large Fortran benchmark codes was run on Cray-2 and Cray X-MP supercomputers. These codes were a mix of compute-intensive scientific application programs (mostly Computational Fluid Dynamics) and some special vectorized computation exercise programs. For the general class of programs tested on the Cray-2, most of which were not specially tuned for speed, the floating point operation rates varied under a variety of system load configurations from 40 percent up to 125 percent of X-MP performance rates. It is concluded that the Cray-2, in the original system configuration studied (without memory pseudo-banking), will run untuned Fortran code, on average, at about 70 percent of X-MP speeds.
Using Computerized Outlines in Teaching American Government.
ERIC Educational Resources Information Center
Janda, Kenneth
Because writing an outline on the chalkboard wastes time and space, a computer program, a video projector, and a standard motion picture screen were used to outline a lecture to a large history class. PC-Outline is an outline program for IBM-compatible microcomputers which aids in processing outlines by inserting Roman numerals, letters, and…
ERIC Educational Resources Information Center
Milet, Lynn K.; Harvey, Francis A.
Hypermedia and object oriented programming systems (OOPs) represent examples of "open" computer environments that allow the user access to parts of the code or operating system. Both systems share fundamental intellectual concepts (objects, messages, methods, classes, and inheritance), so that an understanding of hypermedia can help in…
Care 3 phase 2 report, maintenance manual
NASA Technical Reports Server (NTRS)
Bryant, L. A.; Stiffler, J. J.
1982-01-01
CARE 3 (Computer-Aided Reliability Estimation, version three) is a computer program designed to help estimate the reliability of complex, redundant systems. Although the program can model a wide variety of redundant structures, it was developed specifically for fault-tolerant avionics systems--systems distinguished by the need for extremely reliable performance since a system failure could well result in the loss of human life. It substantially generalizes the class of redundant configurations that could be accommodated, and includes a coverage model to determine the various coverage probabilities as a function of the applicable fault recovery mechanisms (detection delay, diagnostic scheduling interval, isolation and recovery delay, etc.). CARE 3 further generalizes the class of system structures that can be modeled and greatly expands the coverage model to take into account such effects as intermittent and transient faults, latent faults, error propagation, etc.
A Methodological Study of a Computer-Managed Instructional Program in High School Physics.
ERIC Educational Resources Information Center
Denton, Jon James
The purpose of this study was to develop and evaluate an instructional model which utilized the computer to produce individually prescribed instructional guides in physics at the secondary school level. The sample consisted of three classes. Of these, two were randomly selected to serve as the treatment groups, e.g., individualized instruction and…
ERIC Educational Resources Information Center
Barak, Miri; Harward, Judson; Kocur, George; Lerman, Steven
2007-01-01
Within the framework of MIT's course 1.00: Introduction to Computers and Engineering Problem Solving, this paper describes an innovative project entitled: "Studio 1.00" that integrates lectures with in-class demonstrations, active learning sessions, and on-task feedback, through the use of wireless laptop computers. This paper also describes a…
ERIC Educational Resources Information Center
Geigel, Joan; And Others
A self-paced program designed to integrate the use of computers and physics courseware into the regular classroom environment is offered for physics high school teachers in this module on projectile and circular motion. A diversity of instructional strategies including lectures, demonstrations, videotapes, computer simulations, laboratories, and…
ERIC Educational Resources Information Center
Klieger, Aviva; Ben-Hur, Yehuda; Bar-Yossef, Nurit
2010-01-01
The study examines the professional development of junior-high-school teachers participating in the Israeli "Katom" (Computer for Every Class, Student and Teacher) Program, begun in 2004. A three-circle support and training model was developed for teachers' professional development. The first circle applies to all teachers in the…
Resequencing Skills and Concepts in Applied Calculus Using the Computer as a Tool.
ERIC Educational Resources Information Center
Heid, M. Kathleen
1988-01-01
During the first 12 weeks of an applied calculus course, two classes of college students studied calculus concepts using graphical and symbol-manipulation computer programs to perform routine manipulations. Three weeks were spent on skill development. Students showed better understanding of concepts and performed almost as well on routine skills.…
SouthPro : a computer program for managing uneven-aged loblolly pine stands
Benedict Schulte; Joseph Buongiorno; Ching-Rong Lin; Kenneth E. Skog
1998-01-01
SouthPro is a Microsoft Excel add-in program that simulates the management, growth, and yield of uneven-aged loblolly pine stands in the Southern United States. The built-in growth model of this program was calibrated from 991 uneven-aged plots in seven states, covering most growing conditions and sites. Stands are described by the number of trees in 13 size classes...
Free oscilloscope web app using a computer mic, built-in sound library, or your own files
NASA Astrophysics Data System (ADS)
Ball, Edward; Ruiz, Frances; Ruiz, Michael J.
2017-07-01
We have developed an online oscilloscope program which allows users to see waveforms by utilizing their computer microphones, selecting from our library of over 30 audio files, and opening any *.mp3 or *.wav file on their computers. The oscilloscope displays real-time signals against time. The oscilloscope has been calibrated so one can make accurate frequency measurements of periodic waves to within 1%. The web app is ideal for computer projection in class.
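One simple way a calibrated oscilloscope app can measure the frequency of a periodic wave is to count rising zero crossings over a known duration (a sketch of the idea, not the web app's actual code; the 440 Hz test tone is an assumption for illustration):

```python
import math

def frequency(samples, sample_rate):
    """Estimate frequency by counting rising zero crossings."""
    crossings = sum(
        1 for a, b in zip(samples, samples[1:]) if a < 0 <= b
    )
    duration = (len(samples) - 1) / sample_rate   # seconds spanned
    return crossings / duration

rate = 48_000
# One second of a 440 Hz sine wave sampled at 48 kHz:
sine = [math.sin(2 * math.pi * 440 * t / rate) for t in range(rate)]
f = frequency(sine, rate)
```

Counting over a full second of samples keeps the estimate well within the 1% accuracy the abstract cites for periodic waves.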
Parallel language constructs for tensor product computations on loosely coupled architectures
NASA Technical Reports Server (NTRS)
Mehrotra, Piyush; Vanrosendale, John
1989-01-01
Distributed memory architectures offer high levels of performance and flexibility, but have proven awkward to program. Current languages for nonshared memory architectures provide a relatively low-level programming environment and are poorly suited to modular programming and to the construction of libraries. A set of language primitives designed to allow the specification of parallel numerical algorithms at a higher level is described. The focus is on tensor product array computations, a simple but important class of numerical algorithms. The problem of programming 1-D kernel routines, such as parallel tridiagonal solvers, is addressed first; then it is examined how such parallel kernels can be combined to form parallel tensor product algorithms.
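The structure being described, building a tensor product algorithm from a 1-D kernel applied along each axis, can be sketched serially (Python/NumPy; the 3-point smoothing kernel below is a stand-in for, say, a tridiagonal solve, and is not from the paper):

```python
import numpy as np

def smooth_1d(v):
    """A simple 1-D kernel: 3-point moving average on interior points,
    endpoints left unchanged."""
    out = v.copy()
    out[1:-1] = (v[:-2] + v[1:-1] + v[2:]) / 3.0
    return out

def smooth_2d(a):
    """Tensor product algorithm: apply the 1-D kernel along every column,
    then along every row."""
    a = np.apply_along_axis(smooth_1d, 0, a)
    a = np.apply_along_axis(smooth_1d, 1, a)
    return a

grid = np.arange(16.0).reshape(4, 4)
result = smooth_2d(grid)
```

The parallelization opportunity the paper targets is that each 1-D kernel application along an axis is independent of the others, so the rows (or columns) can be distributed across processors.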
COMOC 2: Two-dimensional aerodynamics sequence, computer program user's guide
NASA Technical Reports Server (NTRS)
Manhardt, P. D.; Orzechowski, J. A.; Baker, A. J.
1977-01-01
The COMOC finite element fluid mechanics computer program system is applicable to diverse problem classes. The two dimensional aerodynamics sequence was established for solution of the potential and/or viscous and turbulent flowfields associated with subsonic flight of elementary two dimensional isolated airfoils. The sequence is constituted of three specific flowfield options in COMOC for two dimensional flows. These include the potential flow option, the boundary layer option, and the parabolic Navier-Stokes option. By sequencing through these options, it is possible to computationally construct a weak-interaction model of the aerodynamic flowfield. This report is the user's guide to operation of COMOC for the aerodynamics sequence.
Gradinger, Petra; Yanagida, Takuya; Strohmeier, Dagmar; Spiel, Christiane
2016-01-01
We investigated whether the general anti-bullying program ViSC sustainably prevents cyberbullying and cyber-victimization. A longitudinal randomized control group design was used to examine (i) program effectiveness immediately after a 1-year implementation phase and (ii) sustainable program effects 6 months later, taking several moderators on the class level (class climate and ethnic diversity) and on the individual level (gender, age, internet usage, traditional bullying/victimization) into account. Effectiveness (i.e., the change between waves 2 and 1) was examined in 2,042 students (47.6% girls), aged 11.7 years (SD = 0.88), enrolled in 18 schools and 103 classes. Sustainability (i.e., the change between waves 3 and 2) was examined in a sub-sample of 6 schools and 35 classes comprising 659 students. The self-assessment multiple-item scales showed longitudinal and multiple group invariance. Factor scores were extracted to compute difference scores for effectiveness (Posttest minus Pretest) and sustainability (Follow-up test minus Posttest) for cyberbullying and cyber-victimization. Multilevel Modeling was applied to examine (i) the effectiveness and (ii) the sustainability of the ViSC intervention, controlling for several individual and class level variables. Controlling for covariates, it was demonstrated that the ViSC program is effective in preventing cyberbullying and cyber-victimization and that the effects are sustainable after 6 months. The consequences for cyberbullying prevention are discussed. © 2016 Wiley Periodicals, Inc.
NASA Technical Reports Server (NTRS)
Raju, I. S.
1986-01-01
Q3DG is a computer program developed to perform a quasi-three-dimensional stress analysis of composite laminates that may contain delaminations. The laminates may be subjected to mechanical, thermal, and hygroscopic loads. The program uses the finite element method and models the laminates with eight-noded parabolic isoparametric elements. The program computes the strain-energy-release components and the total strain-energy release in all three modes for delamination growth. A rectangular mesh and data file generator, DATGEN, is included. The DATGEN program can be executed interactively and is user friendly. The documentation includes sections dealing with the Q3D analysis theory and the derivation of element stiffness matrices and consistent load vectors for the parabolic element. Several sample problems, with the input for Q3DG and output from the program, are included. The capabilities of the DATGEN program are illustrated with examples of interactive sessions. A microfiche of all the examples is included. The Q3DG and DATGEN programs have been implemented on CYBER 170 class computers. Q3DG and DATGEN were developed at the Langley Research Center during the early eighties and documented in 1984-1985.
Programming and Operations Lab 1--Intermediate, Data Processing Technology: 8025.23.
ERIC Educational Resources Information Center
Dade County Public Schools, Miami, FL.
The following course outline has been prepared as a guide toward helping the student develop an understanding of operating principles and procedures necessary in processing data electronically. Students who have met the objectives of Designing the Computer Program should be admitted to this course. The class meets 2 hours per day for 90 clock…
ERIC Educational Resources Information Center
Rotunda, Rob J.; West, Laura; Epstein, Joel
2003-01-01
Alcohol and drug use education and prevention continue to be core educational issues. In seeking to inform students at all levels about drug use, the present exploratory study highlights the potential educational use of interactive computer programs for this purpose. Seventy-three college students from two substance abuse classes interacted for at…
Processing data from soil assessment surveys with the computer program SOILS.
John W. Hazard; Jeralyn Snellgrove; J. Michael Geist
1985-01-01
Program SOILS processes data from soil assessment surveys following a design adopted by the Pacific Northwest Region of the USDA Forest Service. It accepts measurements from line transects and associated soil subsamples and generates estimates of the percentages of the sampled area falling in each soil condition class. Total disturbance is calculated by combining...
Evaluation of the Child Care Class for Older Adults.
ERIC Educational Resources Information Center
Gallegos, Sandra
In 1986, the Ability Based on Older Dependable Experience (ABODE) Program was developed at De Anza College to train older adults to serve as a temporary source of child care on an emergency basis. The program was sponsored by Tandem Computers, Incorporated, out of a desire to provide better employee benefits with respect to child care. The program…
Programming in a Robotics Context in the Kindergarten Classroom: The Impact on Sequencing Skills
ERIC Educational Resources Information Center
Kazakoff, Elizabeth; Bers, Marina
2012-01-01
This paper examines the impact of computer programming of robots on sequencing ability in early childhood and the relationship between sequencing skills, class size, and teacher's comfort level and experience with technology. Fifty-eight children participated in the study, 54 of whom were included in data analysis. This study was conducted in two…
An Empirical Consideration of the Use of R in Actively Constructing Sampling Distributions
ERIC Educational Resources Information Center
Vaughn, Brandon K.
2009-01-01
In this paper, an interactive teaching approach to introduce the concept of sampling distributions using the statistical software program, R, is shown. One advantage of this approach is that the program R is freely available via the internet. Instructors can easily demonstrate concepts in class, outfit entire computer labs, and/or assign the…
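The classroom demonstration described uses R; the same construction of a sampling distribution, repeatedly drawing samples and recording each sample mean, looks like this in Python (an analogous sketch, not the paper's R code):

```python
import random
import statistics

random.seed(1)
# A hypothetical population: 10,000 values uniform on [0, 10].
population = [random.uniform(0, 10) for _ in range(10_000)]

# Draw many samples of size 30 and record each sample mean.
sample_means = [
    statistics.mean(random.sample(population, 30))
    for _ in range(2_000)
]

center = statistics.mean(sample_means)    # near the population mean
spread = statistics.stdev(sample_means)   # near sigma / sqrt(30)
```

Students can then plot `sample_means` as a histogram to see the distribution of the mean narrow around the population mean, the point the in-class demonstration is after.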
NASA Technical Reports Server (NTRS)
Nosenchuck, D. M.; Littman, M. G.
1986-01-01
The Navier-Stokes computer (NSC) has been developed for solving problems in fluid mechanics involving complex flow simulations that require more speed and capacity than provided by current and proposed Class VI supercomputers. The machine is a parallel processing supercomputer with several new architectural elements which can be programmed to address a wide range of problems meeting the following criteria: (1) the problem is numerically intensive, and (2) the code makes use of long vectors. A simulation of two-dimensional nonsteady viscous flows is presented to illustrate the architecture, programming, and some of the capabilities of the NSC.
ERIC Educational Resources Information Center
Summers, Edward G.; Hagen, Debra
1987-01-01
A computer software graphics/text program was used with a group of behavior-disordered and learning-disabled students (ages 10-13) to design and produce Christmas cards and related print products as a class project. Sample lesson plans and products, as well as sample order forms and price lists, are given. (JW)
ERIC Educational Resources Information Center
New Orleans Public Schools, LA.
Secondary school teachers incorporating the use of a computer in algebra, trigonometry, advanced mathematics, chemistry, or physics classes are the individuals for whom this book is intended. The content included in it is designed to aid the learning of programming techniques and basic scientific or mathematical principles, and to offer some…
ERIC Educational Resources Information Center
Mallios, Nikolaos; Vassilakopoulos, Michael Gr.
2015-01-01
One of the most intriguing objectives when teaching computer science in mid-adolescence high school students is attracting and mainly maintaining their concentration within the limits of the class. A number of theories have been proposed and numerous methodologies have been applied, aiming to assist in the implementation of a personalized learning…
Using Testbanking To Implement Classroom Management/Extension through the Use of Computers.
ERIC Educational Resources Information Center
Thommen, John D.
Testbanking provides teachers with an effective, low-cost, time-saving opportunity to improve the testing aspect of their classes. Testbanking, which involves the use of a testbank program and a computer, allows teachers to develop and generate tests and test-forms with a minimum of effort. Teachers who test using true and false, multiple choice,…
ERIC Educational Resources Information Center
Gratz, Zandra S.; And Others
A study was conducted at a large, state-supported college in the Northeast to establish a mechanism by which a popular software package, Statistical Package for the Social Sciences (SPSS), could be used in psychology program statistics courses in such a way that no prior computer expertise would be needed on the part of the faculty or the…
NASA Technical Reports Server (NTRS)
Conel, J. E.
1975-01-01
A computer program (Program SPHERE) solving the inhomogeneous equation of heat conduction with radiation boundary condition on a thermally homogeneous sphere is described. The source terms are taken to be exponential functions of the time. Thermal properties are independent of temperature. The solutions are appropriate to studying certain classes of planetary thermal history. Special application to the moon is discussed.
ERIC Educational Resources Information Center
Hubin, W. N.
1982-01-01
Various microcomputer-generated astronomy graphs are presented, including those of constellations and planetary motions. Graphs were produced on a computer-driver plotter and then reproduced for class use. Copies of the programs that produced the graphs are available from the author. (Author/JN)
NASA Technical Reports Server (NTRS)
Sim, A. G.
1973-01-01
A brief study was made to assess the applicability of the Newton-Raphson digital computer program as a routine technique for extracting aerodynamic derivatives from flight tests of lifting body types of vehicles. Lateral-directional flight data from flight tests of the HL-10 lifting body research vehicle were utilized. The results, in general, show the computer program to be a reliable and expedient means for extracting derivatives for this class of vehicles as a standard procedure. This result was true even when stability augmentation was used. As a result of the study, a credible set of HL-10 lateral-directional derivatives was obtained from flight data. These derivatives are compared with results from wind-tunnel tests.
NASA Technical Reports Server (NTRS)
Pearson, Don; Hamm, Dustin; Kubena, Brian; Weaver, Jonathan K.
2010-01-01
An updated version of the Platform Independent Software Components for the Exploration of Space (PISCES) software library is available. A previous version was reported in Library for Developing Spacecraft-Mission-Planning Software (MSC-22983), NASA Tech Briefs, Vol. 25, No. 7 (July 2001), page 52. To recapitulate: This software provides for Web-based, collaborative development of computer programs for planning trajectories and trajectory-related aspects of spacecraft-mission design. The library was built using state-of-the-art object-oriented concepts and software-development methodologies. The components of PISCES include Java-language application programs arranged in a hierarchy of classes that facilitates the reuse of the components. As its full name suggests, the PISCES library affords platform-independence: The Java language makes it possible to use the classes and application programs with a Java virtual machine, which is available in most Web-browser programs. Another advantage is expandability: Object orientation facilitates expansion of the library through creation of a new class. Improvements in the library since the previous version include development of orbital-maneuver-planning and rendezvous-launch-window application programs, enhancement of capabilities for propagation of orbits, and development of a desktop user interface.
NASA Technical Reports Server (NTRS)
Barry, Matthew R.
2006-01-01
The X-Windows Process Validation Table (PVT) Widget Class ("Class" is used here in the object-oriented-programming sense of the word) was devised to simplify the task of implementing network registration services for Information Sharing Protocol (ISP) graphical-user-interface (GUI) computer programs. Heretofore, ISP PVT programming tasks have required many method calls to identify, query, and interpret the connections and messages exchanged between a client and a PVT server. Normally, programmers have utilized direct access to UNIX socket libraries to implement the PVT protocol queries, necessitating the use of many lines of source code to perform frequent tasks. Now, the X-Windows PVT Widget Class encapsulates ISP client-server network registration management tasks within the framework of an X Windows widget. Use of the widget framework enables an X Windows GUI program to interact with PVT services in an abstract way and in the same manner as that of other graphical widgets, making it easier to program PVT clients. Wrapping the PVT services inside the widget framework enables a programmer to treat a PVT server interface as though it were a GUI. Moreover, an alternate subclass could implement another service in a widget of the same type. This program was written by Matthew R. Barry of United Space Alliance for Johnson Space Center. For further information, contact the Johnson Technology Transfer Office at (281) 483-3809. MSC-23582
Shuttle Data Center File-Processing Tool in Java
A Java-language computer program has been written to facilitate mining of data in files in the Shuttle Data Center (SDC) archives. This program can be executed on a variety of workstations or via Web-browser programs. This program is partly similar to prior C-language programs used for the same purpose, while differing from those programs in that it exploits the platform-neutrality of Java in implementing several features that are important for analysis of large sets of time-series data.
The program supports regular-expression queries of SDC archive files, reads the files, interleaves the time-stamped samples according to a chosen output format, and transforms the results into that format. A user can choose among a variety of output file formats that are useful for diverse purposes, including plotting, Markov modeling, multivariate density estimation, and wavelet multiresolution analysis, as well as for playback of data in support of simulation and testing.
NASA Technical Reports Server (NTRS)
Sawyer, W. C.; Allen, J. M.; Hernandez, G.; Dillenius, M. F. E.; Hemsch, M. J.
1982-01-01
This paper presents a survey of engineering computational methods and experimental programs used for estimating the aerodynamic characteristics of missile configurations. Emphasis is placed on those methods which are suitable for preliminary design of conventional and advanced concepts. An analysis of the technical approaches of the various methods is made in order to assess their suitability to estimate longitudinal and/or lateral-directional characteristics for different classes of missile configurations. Some comparisons between the predicted characteristics and experimental data are presented. These comparisons are made for a large variation in flow conditions and model attitude parameters. The paper also presents known experimental research programs developed for the specific purpose of validating analytical methods and extending the capability of data-base programs.
ERIC Educational Resources Information Center
Duda, Richard O.; Shortliffe, Edward H.
1983-01-01
Discusses a class of artificial intelligence computer programs (often called "expert systems" because they address problems normally thought to require human specialists for their solution) intended to serve as consultants for decision making. Also discusses accomplishments (including information systematization in medical diagnosis and…
Emulating multiple inheritance in Fortran 2003/2008
Morris, Karla
2015-01-24
Although the high-performance computing (HPC) community increasingly embraces object-oriented programming (OOP), most HPC OOP projects employ the C++ programming language. Until recently, Fortran programmers interested in mining the benefits of OOP had to emulate OOP in Fortran 90/95. The advent of widespread compiler support for Fortran 2003 now facilitates explicitly constructing object-oriented class hierarchies via inheritance and leveraging related class behaviors such as dynamic polymorphism. Although C++ allows a class to inherit from multiple parent classes, Fortran and several other OOP languages restrict or prohibit explicit multiple inheritance relationships in order to circumvent several pitfalls associated with them. Nonetheless, what appears as an intrinsic feature in one language can be modeled as a user-constructed design pattern in another language. The present paper demonstrates how to apply the facade structural design pattern to support a multiple inheritance class relationship in Fortran 2003. As a result, the design unleashes the power of the associated class relationships for modeling complicated data structures yet avoids the ambiguities that plague some multiple inheritance scenarios.
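The paper's construction is specific to Fortran 2003, but the core facade idea (inherit one parent directly, compose the other, and forward calls to the component) can be sketched in a few lines of Python. Everything below is a hypothetical illustration; none of the class or method names come from the paper.

```python
# Facade-style emulation of multiple inheritance: Particle gains the
# behavior of two "parents" while inheriting from only one of them.

class Positionable:
    """First 'parent': carries a spatial position."""
    def __init__(self):
        self.x = 0.0

    def move(self, dx):
        self.x += dx

class Serializable:
    """Second 'parent': knows how to describe its owner."""
    def __init__(self, owner):
        self.owner = owner

    def describe(self):
        return f"{type(self.owner).__name__} at x={self.owner.x:.1f}"

class Particle(Positionable):
    """Facade: one parent by inheritance, the other by composition."""
    def __init__(self):
        super().__init__()
        self._serializer = Serializable(self)  # composed second "parent"

    def describe(self):                        # delegate to the component
        return self._serializer.describe()

p = Particle()
p.move(2.5)
print(p.describe())  # -> Particle at x=2.5
```

Callers of `Particle` see the union of both interfaces, which is the effect the facade pattern recovers in languages that prohibit explicit multiple inheritance.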
Computer problem-solving coaches for introductory physics: Design and usability studies
NASA Astrophysics Data System (ADS)
Ryan, Qing X.; Frodermann, Evan; Heller, Kenneth; Hsu, Leonardo; Mason, Andrew
2016-06-01
The combination of modern computing power, the interactivity of web applications, and the flexibility of object-oriented programming may finally be sufficient to create computer coaches that can help students develop metacognitive problem-solving skills, an important competence in our rapidly changing technological society. However, no matter how effective such coaches might be, they will only be useful if they are attractive to students. We describe the design and testing of a set of web-based computer programs that act as personal coaches to students while they practice solving problems from introductory physics. The coaches are designed to supplement regular human instruction, giving students access to effective forms of practice outside class. We present results from large-scale usability tests of the computer coaches and discuss their implications for future versions of the coaches.
An Object-Oriented Network-Centric Software Architecture for Physical Computing
NASA Astrophysics Data System (ADS)
Palmer, Richard
1997-08-01
Recent developments in object-oriented computer languages and infrastructure such as the Internet, Web browsers, and the like provide an opportunity to define a more productive computational environment for scientific programming that is based more closely on the underlying mathematics describing physics than traditional programming languages such as FORTRAN or C++. In this talk I describe an object-oriented software architecture for representing physical problems that includes classes for such common mathematical objects as geometry, boundary conditions, partial differential and integral equations, discretization and numerical solution methods, etc. In practice, a scientific program written using this architecture looks remarkably like the mathematics used to understand the problem, is typically an order of magnitude smaller than traditional FORTRAN or C++ codes, and hence is easier to understand, debug, describe, etc. All objects in this architecture are ``network-enabled,'' which means that components of a software solution to a physical problem can be transparently loaded from anywhere on the Internet or other global network. The architecture is expressed as an ``API,'' or application programmer's interface specification, with reference embeddings in Java, Python, and C++. A C++ class library for an early version of this API has been implemented for machines ranging from PCs to the IBM SP2, meaning that identical codes run on all architectures.
ERIC Educational Resources Information Center
Bey, Anis; Jermann, Patrick; Dillenbourg, Pierre
2018-01-01
Computer-graders have been in regular use in the context of MOOCs (Massive Open Online Courses). The automatic grading of programs presents an opportunity to assess and provide tailored feedback to large classes, while featuring at the same time a number of benefits: immediate feedback, unlimited submissions, and low cost of feedback.…
Dynamic programming algorithms for biological sequence comparison.
Pearson, W R; Miller, W
1992-01-01
Efficient dynamic programming algorithms are available for a broad class of protein and DNA sequence comparison problems. These algorithms require computer time proportional to the product of the lengths of the two sequences being compared [O(N2)] but require memory space proportional only to the sum of these lengths [O(N)]. Although the requirement for O(N2) time limits use of the algorithms to the largest computers when searching protein and DNA sequence databases, many other applications of these algorithms, such as calculation of distances for evolutionary trees and comparison of a new sequence to a library of sequence profiles, are well within the capabilities of desktop computers. In particular, the results of library searches with rapid searching programs, such as FASTA or BLAST, should be confirmed by performing a rigorous optimal alignment. Whereas rapid methods do not overlook significant sequence similarities, FASTA limits the number of gaps that can be inserted into an alignment, so that a rigorous alignment may extend the alignment substantially in some cases. BLAST does not allow gaps in the local regions that it reports; a calculation that allows gaps is very likely to extend the alignment substantially. Although a Monte Carlo evaluation of the statistical significance of a similarity score with a rigorous algorithm is much slower than the heuristic approach used by the RDF2 program, the dynamic programming approach should take less than 1 hr on a 386-based PC or desktop Unix workstation. For descriptive purposes, we have limited our discussion to methods for calculating similarity scores and distances that use gap penalties of the form g = rk. Nevertheless, programs for the more general case (g = q+rk) are readily available. Versions of these programs that run either on Unix workstations, IBM-PC class computers, or the Macintosh can be obtained from either of the authors.
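As a minimal sketch of the space-saving dynamic programming the review describes, the following computes an optimal global-alignment similarity score with a linear gap penalty of the form g = rk, keeping only two DP rows so that memory is O(N) while time remains O(N^2). The scoring parameters are illustrative, not taken from the article.

```python
def similarity_score(a, b, match=1, mismatch=-1, gap=-2):
    """Optimal global-alignment score with linear gap penalty g = r*k.

    Time is proportional to len(a)*len(b); memory only to len(b),
    because each DP row depends solely on the row above it.
    Parameters (match/mismatch/gap) are illustrative placeholders.
    """
    # Row 0: aligning a prefix of b against an empty string costs j gaps.
    prev = [j * gap for j in range(len(b) + 1)]
    for i, ca in enumerate(a, 1):
        curr = [i * gap]  # column 0: i gaps against empty b
        for j, cb in enumerate(b, 1):
            diag = prev[j - 1] + (match if ca == cb else mismatch)
            curr.append(max(diag,            # substitute/match
                            prev[j] + gap,   # gap in b
                            curr[j - 1] + gap))  # gap in a
        prev = curr
    return prev[-1]

# GATTACA vs GATCA: 5 matches and a 2-residue gap -> 5*1 + 2*(-2) = 1
print(similarity_score("GATTACA", "GATCA"))  # -> 1
```

Extending the same recurrence to affine gap penalties g = q + rk requires tracking two extra rows (gap-open state per sequence), which is the "more general case" the authors mention.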
Equations of motion for coupled n-body systems
NASA Technical Reports Server (NTRS)
Frisch, H. P.
1980-01-01
Computer program, developed to analyze spacecraft attitude dynamics, can be applied to large class of problems involving objects that can be simplified into component parts. Systems of coupled rigid bodies, point masses, symmetric wheels, and elastically flexible bodies can be analyzed. Program derives complete set of non-linear equations of motion in vectordyadic format. Numerical solutions may be printed out. Program is in FORTRAN IV for batch execution and has been implemented on IBM 360.
ERIC Educational Resources Information Center
Leonard, B. Charles; Denton, Jon J.
A study sought to develop and evaluate an instructional model which utilized the computer to produce individually prescribed instructional guides to account for the idiosyncratic variations among students in physics classes at the secondary school level. The students in the treatment groups were oriented toward the practices of selecting…
Kon, Haruka; Kobayashi, Hiroshi; Sakurai, Naoki; Watanabe, Kiyoshi; Yamaga, Yoshiro; Ono, Takahiro
2017-11-01
The aim of the present study was to clarify differences between personal computer (PC)/mobile device combination and PC-only user patterns. We analyzed access frequency and time spent on a complete denture preclinical website in order to maximize website effectiveness. Fourth-year undergraduate students (N=41) in the preclinical complete denture laboratory course were invited to participate in this survey during the final week of the course, and their login data were tracked. Students accessed video demonstrations and quizzes via our e-learning site/course program, and were instructed to view online demonstrations before classes. When the course concluded, participating students filled out a questionnaire about the program, their opinions, and the devices they had used to access the site. Combination user access was significantly more frequent than PC-only access during supplementary learning time, indicating that students with mobile devices studied during lunch breaks and before morning classes. Most students had favorable opinions of the e-learning site, but a few combination users commented that some videos were too long and that descriptive answers were difficult on smartphones. These results imply that mobile devices' increased accessibility encouraged learning by enabling more efficient time use between classes. They also suggest that e-learning system improvements should cater to mobile device users by reducing video length and including more short-answer questions. © 2016 John Wiley & Sons Australia, Ltd.
Design of object-oriented distributed simulation classes
NASA Technical Reports Server (NTRS)
Schoeffler, James D. (Principal Investigator)
1995-01-01
Distributed simulation of aircraft engines as part of a computer-aided design package is being developed by NASA Lewis Research Center for the aircraft industry. The project is called NPSS, an acronym for 'Numerical Propulsion Simulation System'. NPSS is a flexible object-oriented simulation of aircraft engines requiring high computing speed. It is desirable to run the simulation on a distributed computer system with multiple processors executing portions of the simulation in parallel. The purpose of this research was to investigate object-oriented structures such that individual objects could be distributed. The set of classes used in the simulation must be designed to facilitate parallel computation. Since the portions of the simulation carried out in parallel are not independent of one another, there is a need for communication among the parallel executing processors, which in turn implies a need for their synchronization. Communication and synchronization can lead to decreased throughput as parallel processors wait for data or synchronization signals from other processors. As a result of this research, the following have been accomplished. The design and implementation of a set of simulation classes which result in a distributed simulation control program have been completed. The design is based upon the MIT 'Actor' model of a concurrent object and uses 'connectors' to structure dynamic connections between simulation components. Connectors may be dynamically created according to the distribution of objects among machines at execution time without any programming changes. Measurements of the basic performance have been carried out, with the result that communication overhead of the distributed design is swamped by the computation time of modules unless modules have very short execution times per iteration or time step. An analytical performance model based upon queuing network theory has been designed and implemented.
Its application to realistic configurations has not been carried out.
ERIC Educational Resources Information Center
Bukofser, J. Holly
2013-01-01
This research involved a 14-month qualitative case study of a fourth-grade bilingual class embarking on a one-to-one netbook computer initiative. Students in this program had unlimited use of the netbooks during the school day. The main foci of the data collected were student achievement, teacher instruction, and student and teacher perceptions of…
Parallel computation with the force
NASA Technical Reports Server (NTRS)
Jordan, H. F.
1985-01-01
A methodology, called the force, supports the construction of programs to be executed in parallel by a force of processes. The number of processes in the force is unspecified, but potentially very large. The force idea is embodied in a set of macros which produce multiprocessor FORTRAN code and has been studied on two shared-memory multiprocessors of fairly different character. The method has simplified the writing of highly parallel programs within a limited class of parallel algorithms and is being extended to cover a broader class. The individual parallel constructs which comprise the force methodology are discussed. Of central concern are their semantics, implementation on different architectures, and performance implications.
Automatic Computer Mapping of Terrain
NASA Technical Reports Server (NTRS)
Smedes, H. W.
1971-01-01
Computer processing of 17 wavelength bands of visible, reflective infrared, and thermal infrared scanner spectrometer data, and of three wavelength bands derived from color aerial film, has resulted in successful automatic computer mapping of eight or more terrain classes in a Yellowstone National Park test site. The tests involved: (1) supervised and non-supervised computer programs; (2) special preprocessing of the scanner data to reduce computer processing time and cost, and improve the accuracy; and (3) studies of the effectiveness of the proposed Earth Resources Technology Satellite (ERTS) data channels in the automatic mapping of the same terrain, based on simulations, using the same set of scanner data. The following terrain classes have been mapped with greater than 80 percent accuracy in a 12-square-mile area with 1,800 feet of relief: (1) bedrock exposures, (2) vegetated rock rubble, (3) talus, (4) glacial kame meadow, (5) glacial till meadow, (6) forest, (7) bog, and (8) water. In addition, shadows of clouds and cliffs are depicted, but were greatly reduced by using preprocessing techniques.
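The "supervised" programs mentioned above assign each multiband pixel to the training class it most resembles. A minimal stand-in for that idea is a nearest-centroid classifier; the band values and class signatures below are invented for illustration and bear no relation to the Yellowstone data.

```python
import math

def train_centroids(samples):
    """samples: {class_name: [band-value vectors]} -> per-class mean vector.
    This is the 'supervised' training step in its simplest possible form."""
    cents = {}
    for name, vecs in samples.items():
        n = len(vecs)
        cents[name] = [sum(v[i] for v in vecs) / n for i in range(len(vecs[0]))]
    return cents

def classify(pixel, cents):
    """Assign a pixel (vector of band values) to the nearest class centroid."""
    return min(cents, key=lambda name: math.dist(pixel, cents[name]))

training = {                       # hypothetical 3-band spectral signatures
    "water":  [[5, 4, 2], [6, 5, 3]],
    "forest": [[20, 35, 15], [22, 33, 17]],
}
cents = train_centroids(training)
print(classify([6, 5, 2], cents))  # -> water
```

Real terrain mappers of this era used richer statistics (e.g. per-class covariance in a maximum-likelihood rule), but the pipeline shape (train signatures, then label every pixel) is the same.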
Fault detection and initial state verification by linear programming for a class of Petri nets
NASA Technical Reports Server (NTRS)
Rachell, Traxon; Meyer, David G.
1992-01-01
The authors present an algorithmic approach to determining when the marking of a LSMG (live safe marked graph) or a LSFC (live safe free choice) net is in the set of live safe markings M. Hence, once the marking of a net is determined to be in M, then if at some time thereafter the marking of this net is determined not to be in M, this indicates a fault. It is shown how linear programming can be used to determine if m is an element of M. The worst-case computational complexity of each algorithm is bounded by the number of linear programs that must be solved.
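The paper's check is formulated as linear programs; for the marked-graph case there is also a classical combinatorial characterization (a marking of a marked graph is live iff every directed circuit carries at least one token, i.e. the subgraph of token-free arcs is acyclic), which the sketch below uses instead of LP. This is a stand-in for the article's method, not a reproduction of it; the tiny nets are invented examples.

```python
def live_marking(arcs):
    """arcs: {(u, v): tokens} for a marked graph (transitions as nodes,
    places as arcs).  The marking is live iff every directed circuit
    holds at least one token, i.e. the token-free subgraph is acyclic."""
    empty = {}
    for (u, v), k in arcs.items():
        if k == 0:
            empty.setdefault(u, []).append(v)

    WHITE, GRAY, BLACK = 0, 1, 2     # DFS cycle detection
    color = {}

    def has_cycle(u):
        color[u] = GRAY
        for v in empty.get(u, []):
            if color.get(v, WHITE) == GRAY:
                return True
            if color.get(v, WHITE) == WHITE and has_cycle(v):
                return True
        color[u] = BLACK
        return False

    return not any(color.get(u, WHITE) == WHITE and has_cycle(u)
                   for u in list(empty))

# Two-transition loop: one token on the circuit -> live; none -> dead.
print(live_marking({("t1", "t2"): 1, ("t2", "t1"): 0}))  # True
print(live_marking({("t1", "t2"): 0, ("t2", "t1"): 0}))  # False
```

Monitoring then amounts to re-running the check on each observed marking and flagging a fault the first time it fails, mirroring the fault-detection scheme described above.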
Using CASE Software to Teach Undergraduates Systems Analysis and Design.
ERIC Educational Resources Information Center
Wilcox, Russell E.
1988-01-01
Describes the design and delivery of a college course for information system students utilizing a Computer-Aided Software Engineering program. Discusses class assignments, cooperative learning, student attitudes, and the advantages of using this software in the course. (CW)
ERIC Educational Resources Information Center
Burke, Edmund B.
1994-01-01
Outlines basic patent law information that pertains to computer software programs. Topics addressed include protection in other countries; how to obtain patents; kinds of patents; duration; classes of patentable subject matter, including machines and processes; patentability searches; experimental use prior to obtaining a patent; and patent…
Message Passing vs. Shared Address Space on a Cluster of SMPs
NASA Technical Reports Server (NTRS)
Shan, Hongzhang; Singh, Jaswinder Pal; Oliker, Leonid; Biswas, Rupak
2000-01-01
Scalable computer architectures built from clusters of PCs (or PC-SMPs) with commodity networking have become an attractive platform for high-end scientific computing. Currently, message passing and shared address space (SAS) are the two leading programming paradigms for these systems. Message passing has been standardized with MPI, and is the most common and mature programming approach. However, message-passing code development can be extremely difficult, especially for irregularly structured computations. SAS offers substantial ease of programming, but may suffer from performance limitations due to poor spatial locality and high protocol overhead. In this paper, we compare the performance of, and programming effort required for, six applications under both programming models on a 32-CPU PC-SMP cluster. Our application suite consists of codes that typically do not exhibit high efficiency under shared-memory programming, due to their high communication-to-computation ratios and complex communication patterns. Results indicate that SAS can achieve about half the parallel efficiency of MPI for most of our applications; however, on certain classes of problems, SAS performance is competitive with MPI. We also present new algorithms for improving the PC-cluster performance of MPI collective operations.
The Argonne Leadership Computing Facility 2010 annual report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Drugan, C.
Researchers found more ways than ever to conduct transformative science at the Argonne Leadership Computing Facility (ALCF) in 2010. Both familiar initiatives and innovative new programs at the ALCF are now serving a growing, global user community with a wide range of computing needs. The Department of Energy's (DOE) INCITE Program remained vital in providing scientists with major allocations of leadership-class computing resources at the ALCF. For calendar year 2011, 35 projects were awarded 732 million supercomputer processor-hours for computationally intensive, large-scale research projects with the potential to significantly advance key areas in science and engineering. Argonne also continued to provide Director's Discretionary allocations - 'start up' awards - for potential future INCITE projects. And DOE's new ASCR Leadership Computing (ALCC) Program allocated resources to 10 ALCF projects, with an emphasis on high-risk, high-payoff simulations directly related to the Department's energy mission, national emergencies, or for broadening the research community capable of using leadership computing resources. While delivering more science today, we've also been laying a solid foundation for high performance computing in the future. After a successful DOE Lehman review, a contract was signed to deliver Mira, the next-generation Blue Gene/Q system, to the ALCF in 2012. The ALCF is working with the 16 projects that were selected for the Early Science Program (ESP) to enable them to be productive as soon as Mira is operational. Preproduction access to Mira will enable ESP projects to adapt their codes to its architecture and collaborate with ALCF staff in shaking down the new system. We expect the 10-petaflops system to stoke economic growth and improve U.S. competitiveness in key areas such as advancing clean energy and addressing global climate change.
Ultimately, we envision Mira as a stepping-stone to exascale-class computers that will be faster than petascale-class computers by a factor of a thousand. Pete Beckman, who served as the ALCF's Director for the past few years, has been named director of the newly created Exascale Technology and Computing Institute (ETCi). The institute will focus on developing exascale computing to extend scientific discovery and solve critical science and engineering problems. Just as Pete's leadership propelled the ALCF to great success, we know that the ETCi will benefit immensely from his expertise and experience. Without question, the future of supercomputing is in good hands. I would like to thank Pete for all his effort over the past two years, during which he oversaw the establishment of ALCF2, the deployment of the Magellan project, and increases in utilization, availability, and the number of projects using ALCF1. He managed the rapid growth of ALCF staff and made the facility what it is today. All the staff and users are better for Pete's efforts.
Programming Probabilistic Structural Analysis for Parallel Processing Computer
NASA Technical Reports Server (NTRS)
Sues, Robert H.; Chen, Heh-Chyun; Twisdale, Lawrence A.; Chamis, Christos C.; Murthy, Pappu L. N.
1991-01-01
The ultimate goal of this research program is to make Probabilistic Structural Analysis (PSA) computationally efficient and hence practical for the design environment by achieving large scale parallelism. The paper identifies the multiple levels of parallelism in PSA, identifies methodologies for exploiting this parallelism, describes the development of a parallel stochastic finite element code, and presents results of two example applications. It is demonstrated that speeds within five percent of those theoretically possible can be achieved. A special-purpose numerical technique, the stochastic preconditioned conjugate gradient method, is also presented and demonstrated to be extremely efficient for certain classes of PSA problems.
NASA Astrophysics Data System (ADS)
Zhao, Mi; Li, ZhiWu; Hu, HeSuan
2010-09-01
This article develops a deadlock prevention policy for a class of generalised Petri nets that can model a large class of flexible manufacturing systems. The analysis of such a system leads us to characterise the deadlock situations in terms of the insufficiently marked siphons in its generalised Petri-net model. The proposed policy is carried out in an iterative way. At each step, a minimal siphon is derived from a maximal deadly marked siphon that is found by solving a mixed integer programming (MIP) problem. An algorithm is formalised that can efficiently compute such a minimal siphon from a maximal one. A monitor is added for a derived minimal siphon such that it is max-controlled if it is elementary with respect to the siphons that have been derived. The liveness of the controlled system is decided by the fact that no further siphon can be derived via the MIP solution. After a liveness-enforcing net supervisor is computed without complete siphon enumeration, the output arcs of the additional monitors are rearranged so that the monitors act while restricting the system less. Examples are presented to demonstrate the proposed method.
Lesko, Mehdi M; Woodford, Maralyn; White, Laura; O'Brien, Sarah J; Childs, Charmaine; Lecky, Fiona E
2010-08-06
The purpose of the Abbreviated Injury Scale (AIS) is to code various types of Traumatic Brain Injuries (TBI) based on their anatomical location and severity. The Marshall CT Classification is used to identify those subgroups of brain-injured patients at higher risk of deterioration or mortality. The purpose of this study is to determine whether and how AIS coding can be translated to the Marshall Classification. Initially, a Marshall Class was allocated to each AIS code through cross-tabulation. This was agreed upon through several discussion meetings with experts from both fields (clinicians and AIS coders). Furthermore, in order to make this translation possible, some necessary assumptions with regard to coding and classification of mass lesions and brain swelling were essential; these were all approved and made explicit. The proposed method involves two stages: first, to determine all possible Marshall Classes which a given patient can attract based on allocated AIS codes, via cross-tabulation; and second, to assign one Marshall Class to each patient through an algorithm. This method can be easily programmed in computer software and would enable future important TBI research programs using trauma registry data.
2010-01-01
Background: The purpose of the Abbreviated Injury Scale (AIS) is to code various types of Traumatic Brain Injuries (TBI) based on their anatomical location and severity. The Marshall CT Classification is used to identify those subgroups of brain-injured patients at higher risk of deterioration or mortality. The purpose of this study is to determine whether and how AIS coding can be translated to the Marshall Classification. Methods: Initially, a Marshall Class was allocated to each AIS code through cross-tabulation. This was agreed upon through several discussion meetings with experts from both fields (clinicians and AIS coders). Furthermore, in order to make this translation possible, some necessary assumptions with regard to coding and classification of mass lesions and brain swelling were essential; these were all approved and made explicit. Results: The proposed method involves two stages: first, to determine all possible Marshall Classes which a given patient can attract based on allocated AIS codes, via cross-tabulation; and second, to assign one Marshall Class to each patient through an algorithm. Conclusion: This method can be easily programmed in computer software and would enable future important TBI research programs using trauma registry data. PMID:20691038
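The two-stage translation the study describes (collect every candidate Marshall Class via the cross-tabulation, then resolve to a single class by a rule) can be sketched as below. The table entries and AIS codes are invented placeholders, NOT the published mapping, and the "take the largest class" resolution rule is a deliberate simplification of the article's actual algorithm.

```python
# Hypothetical cross-tabulation: AIS code -> candidate Marshall classes.
# Every entry here is fabricated for illustration only.
CROSS_TAB = {
    "140629.4": {2, 3},
    "140650.5": {5},
    "140684.3": {2},
}

def marshall_class(ais_codes):
    """Stage 1: union all candidate classes attracted by the patient's
    AIS codes.  Stage 2: resolve to one class (here, simply the largest
    class number; the article's real algorithm is more involved)."""
    candidates = set()
    for code in ais_codes:
        candidates |= CROSS_TAB.get(code, set())
    return max(candidates) if candidates else None

print(marshall_class(["140629.4", "140650.5"]))  # -> 5
```

Because both stages are plain table lookups and a resolution rule, the scheme maps directly onto trauma-registry batch processing, which is the practical point the authors make.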
Calculative techniques for transonic flows about certain classes of wing-body combinations, phase 2
NASA Technical Reports Server (NTRS)
Stahara, S. S.; Spreiter, J. R.
1972-01-01
Theoretical analysis and associated computer programs were developed for predicting properties of transonic flows about certain classes of wing-body combinations. The procedures used are based on the transonic equivalence rule and employ either an arbitrarily-specified solution or the local linearization method for determining the nonlifting transonic flow about the equivalent body. The class of wing planform shapes includes wings having sweptback trailing edges and finite tip chord. Theoretical results are presented for surface and flow-field pressure distributions for both nonlifting and lifting situations at Mach number one.
ERIC Educational Resources Information Center
Amodeo, Luiza B.; Emslie, Julia Rosa
Mathematics anxiety and competence of 57 Anglo and Hispanic pre-service teachers were measured before and after a 30-hour workshop using the training program EQUALS. Students were divided into three groups: elementary, secondary, and library media. Students in the library media class served as the control group; the other two groups were the…
Chris B. LeDoux; John E. Baumgras; R. Bryan Selbe
1989-01-01
PROFIT-PC is a menu-driven, interactive PC (personal computer) program that estimates optimum product mix and maximum net harvesting revenue based on projected product yields and stump-to-mill timber harvesting costs. Required inputs include the number of trees/acre by species and 2-inch diameter-at-breast-height class, delivered product prices by species and product...
Users guide to ACORn: a comprehensive Ozark regeneration simulator.
Daniel C. Dey; Michael Ter-Mikaelian; Paul S. Johnson; Stephen R. Shifley
1996-01-01
Describes how to use the ACORn computer program for predicting number of trees per acre and stocking percent by species and diameter classes 21 years after complete overstory removal of oak stands in the Ozark Highlands of Missouri and adjacent States.
Parallel and Portable Monte Carlo Particle Transport
NASA Astrophysics Data System (ADS)
Lee, S. R.; Cummings, J. C.; Nolen, S. D.; Keen, N. D.
1997-08-01
We have developed a multi-group, Monte Carlo neutron transport code in C++ using object-oriented methods and the Parallel Object-Oriented Methods and Applications (POOMA) class library. This transport code, called MC++, currently computes k and α eigenvalues of the neutron transport equation on a rectilinear computational mesh. It is portable to and runs in parallel on a wide variety of platforms, including MPPs, clustered SMPs, and individual workstations. It contains appropriate classes and abstractions for particle transport and, through the use of POOMA, for portable parallelism. Current capabilities are discussed, along with physics and performance results for several test problems on a variety of hardware, including all three Accelerated Strategic Computing Initiative (ASCI) platforms. Current parallel performance indicates the ability to compute α-eigenvalues in seconds or minutes rather than days or weeks. Current and future work on the implementation of a general transport physics framework (TPF) is also described. This TPF employs modern C++ programming techniques to provide simplified user interfaces, generic STL-style programming, and compile-time performance optimization. Physics capabilities of the TPF will be extended to include continuous energy treatments, implicit Monte Carlo algorithms, and a variety of convergence acceleration techniques such as importance combing.
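MC++ computes k-eigenvalues by full multi-group Monte Carlo transport on a mesh; as a vastly simplified illustration of what a Monte Carlo k estimate even means, the toy below samples a one-group, infinite-medium problem where the analytic answer is k = nu * sigma_f / sigma_a. All cross sections and the absence of any geometry are illustrative assumptions, not features of MC++.

```python
import random

def k_infinity_mc(nu, sigma_f, sigma_a, histories=100_000, seed=1):
    """Toy one-group, infinite-medium Monte Carlo k estimate.

    Each source neutron is eventually absorbed; with probability
    sigma_f/sigma_a the absorption is a fission yielding nu neutrons.
    k is the mean number of fission offspring per source neutron.
    (No mesh, no scattering, no transport -- unlike MC++.)
    """
    rng = random.Random(seed)               # seeded for reproducibility
    p_fission = sigma_f / sigma_a
    offspring = sum(nu if rng.random() < p_fission else 0.0
                    for _ in range(histories))
    return offspring / histories

k = k_infinity_mc(nu=2.5, sigma_f=0.05, sigma_a=0.10)
print(f"k ~ {k:.3f} (analytic value: 1.250)")
```

Production codes like MC++ add spatial transport, multiple energy groups, and parallel tallying on top of exactly this kind of per-history sampling loop.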
Price schedules coordination for electricity pool markets
NASA Astrophysics Data System (ADS)
Legbedji, Alexis Motto
2002-04-01
We consider the optimal coordination of a class of mathematical programs with equilibrium constraints, which is formally interpreted as a resource-allocation problem. Many decomposition techniques have been proposed to circumvent the difficulty of solving large systems with limited computer resources. The considerable improvement in computer architecture has since allowed the solution of large-scale problems with increasing speed, and interest in decomposition techniques has consequently waned. Nonetheless, there is an important class of applications for which decomposition techniques remain relevant, among them distributed systems (the Internet, perhaps, being the most conspicuous example) and competitive economic systems. Conceptually, a competitive economic system is a collection of agents that have similar or different objectives while sharing the same system resources. In theory, such a system of agents can be optimized by constructing a large-scale mathematical program and solving it centrally using currently available computing power. In practice, however, because agents are self-interested and unwilling to reveal sensitive corporate data, one cannot solve these kinds of coordination problems by simply maximizing the sum of the agents' objective functions subject to their constraints. An iterative price decomposition, or Lagrangian dual, method is considered best suited because it can operate with limited information. A price-directed strategy, however, can only work when coordinating or equilibrium prices exist, which is not generally the case when a duality gap is unavoidable. Showing when such prices exist and how to compute them is the main subject of this thesis. Among our results, we show that, if the Lagrangian function of a primal program is additively separable, price schedules coordination may be attained. The prices are Lagrange multipliers, and are also the decision variables of a dual program.
In addition, we propose a new form of augmented or nonlinear pricing, which is an example of the use of penalty functions in mathematical programming. Applications are drawn from mathematical programming problems of the form arising in electric power system scheduling under competition.
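The price-directed coordination studied here can be illustrated on a toy resource-allocation problem: a coordinator posts a price (the Lagrange multiplier), each self-interested agent best-responds using only its private objective, and the price is adjusted by subgradient ascent on the dual. The utilities and numbers below are invented for illustration.

```python
# Lagrangian dual (price) coordination on a toy shared-capacity problem.
# Two agents share capacity C; agent i privately maximizes
# u_i(x) = a_i*x - x**2/2 - p*x given the posted price p.
def best_response(a, p):
    # concave quadratic utility => closed-form best response
    return max(0.0, a - p)

def coordinate(a1=4.0, a2=2.0, C=4.0, step=0.1, iters=2000):
    p = 0.0
    for _ in range(iters):
        demand = best_response(a1, p) + best_response(a2, p)
        p = max(0.0, p + step * (demand - C))  # subgradient ascent on the dual
    return p, best_response(a1, p), best_response(a2, p)

p, x1, x2 = coordinate()
# analytic equilibrium: x1 + x2 = C gives p = (a1 + a2 - C)/2 = 1, x1 = 3, x2 = 1
```

Because each utility is strictly concave, the Lagrangian is additively separable and a coordinating price exists, exactly the favorable case the thesis characterizes; with a duality gap, this iteration would oscillate instead of settling.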
Physics education through computational tools: the case of geometrical and physical optics
NASA Astrophysics Data System (ADS)
Rodríguez, Y.; Santana, A.; Mendoza, L. M.
2013-09-01
Recently, with the development of more powerful and accurate computational tools, the inclusion of new didactic materials in the classroom has increased. However, the form in which these materials can best be used to enhance the learning process is still under debate. Many different methodologies have been suggested for constructing relevant curricular material and, among them, just-in-time teaching (JiTT) has arisen as an effective and successful way to improve the content of classes. In this paper, we describe the pedagogic strategies implemented in the geometrical and physical optics courses for students of optometry: the use of the GeoGebra software in the geometrical optics class and of new in-house software, written in the high-level programming language Python, in the physical optics class, together with the corresponding activities developed for each of these applets.
DNA-programmed dynamic assembly of quantum dots for molecular computation.
He, Xuewen; Li, Zhi; Chen, Muzi; Ma, Nan
2014-12-22
Despite the widespread use of quantum dots (QDs) for biosensing and bioimaging, QD-based bio-interfaceable and reconfigurable molecular computing systems have not yet been realized. DNA-programmed dynamic assembly of multi-color QDs is presented for the construction of a new class of fluorescence resonance energy transfer (FRET)-based QD computing systems. A complete set of seven elementary logic gates (OR, AND, NOR, NAND, INH, XOR, XNOR) are realized using a series of binary and ternary QD complexes operated by strand displacement reactions. The integration of different logic gates into a half-adder circuit for molecular computation is also demonstrated. This strategy is quite versatile and straightforward for logical operations and would pave the way for QD-biocomputing-based intelligent molecular diagnostics. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
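Setting the chemistry aside, the logic realized by the QD complexes is ordinary Boolean gating, and the half-adder circuit can be checked abstractly: the SUM output behaves as XOR and the CARRY output as AND. A minimal sketch:

```python
# Abstract check of the half-adder logic the FRET readouts implement.
def half_adder(a, b):
    # SUM channel behaves as XOR, CARRY channel as AND
    return a ^ b, a & b

truth_table = {(a, b): half_adder(a, b) for a in (0, 1) for b in (0, 1)}
```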
21 CFR 892.5650 - Manual radionuclide applicator system.
Code of Federal Regulations, 2013 CFR
2013-04-01
21 Food and Drugs 8, 2013-04-01, § 892.5650. (a) Identification. A manual radionuclide applicator system is a manually operated device... planning computer programs, and accessories. (b) Classification. Class I (general controls). The device is...
21 CFR 892.5650 - Manual radionuclide applicator system.
Code of Federal Regulations, 2011 CFR
2011-04-01
21 Food and Drugs 8, 2011-04-01, § 892.5650. (a) Identification. A manual radionuclide applicator system is a manually operated device... planning computer programs, and accessories. (b) Classification. Class I (general controls). The device is...
21 CFR 892.5650 - Manual radionuclide applicator system.
Code of Federal Regulations, 2010 CFR
2010-04-01
21 Food and Drugs 8, 2010-04-01, § 892.5650. (a) Identification. A manual radionuclide applicator system is a manually operated device... planning computer programs, and accessories. (b) Classification. Class I (general controls). The device is...
21 CFR 892.5650 - Manual radionuclide applicator system.
Code of Federal Regulations, 2014 CFR
2014-04-01
21 Food and Drugs 8, 2014-04-01, § 892.5650. (a) Identification. A manual radionuclide applicator system is a manually operated device... planning computer programs, and accessories. (b) Classification. Class I (general controls). The device is...
21 CFR 892.5650 - Manual radionuclide applicator system.
Code of Federal Regulations, 2012 CFR
2012-04-01
21 Food and Drugs 8, 2012-04-01, § 892.5650. (a) Identification. A manual radionuclide applicator system is a manually operated device... planning computer programs, and accessories. (b) Classification. Class I (general controls). The device is...
One of My Favorite Assignments: Automated Teller Machine Simulation.
ERIC Educational Resources Information Center
Oberman, Paul S.
2001-01-01
Describes an assignment for an introductory computer science class that requires the student to write a software program that simulates an automated teller machine. Highlights include an algorithm for the assignment; sample file contents; language features used; assignment variations; and discussion points. (LRW)
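A minimal version of such an assignment might look like the sketch below (a hypothetical interface; the article's own algorithm, file contents, and language features differ):

```python
# Minimal ATM simulator skeleton for an introductory CS assignment.
class ATM:
    def __init__(self, accounts):
        self.accounts = dict(accounts)  # account id -> balance

    def balance(self, acct):
        return self.accounts[acct]

    def deposit(self, acct, amount):
        if amount <= 0:
            raise ValueError("deposit must be positive")
        self.accounts[acct] += amount
        return self.accounts[acct]

    def withdraw(self, acct, amount):
        if amount <= 0 or amount > self.accounts[acct]:
            raise ValueError("invalid withdrawal")
        self.accounts[acct] -= amount
        return self.accounts[acct]

atm = ATM({"1001": 50.0})
atm.deposit("1001", 25.0)
new_balance = atm.withdraw("1001", 30.0)
```

Natural assignment variations (as the article suggests) include reading accounts from a file, adding a PIN check, or logging transactions.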
On Undecidability Aspects of Resilient Computations and Implications to Exascale
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rao, Nageswara S
2014-01-01
Future Exascale computing systems with a large number of processors, memory elements and interconnection links are expected to experience multiple, complex faults, which affect both applications and operating-runtime systems. A variety of algorithms, frameworks and tools are being proposed to realize and/or verify the resilience properties of computations that guarantee correct results on failure-prone computing systems. We analytically show that certain resilient computation problems in the presence of general classes of faults are undecidable, that is, no algorithms exist for solving them. We first show that membership verification in a generic set of resilient computations is undecidable. We describe classes of faults that can create infinite loops or non-halting computations, whose detection in general is undecidable. We then show certain resilient computation problems to be undecidable by using reductions from the loop detection and halting problems under two formulations, namely, an abstract programming language and Turing machines, respectively. These two reductions highlight different failure effects: the former represents program and data corruption, and the latter illustrates incorrect program execution. These results call for broad-based, well-characterized resilience approaches that complement purely computational solutions using methods such as hardware monitors, co-designs, and system- and application-specific diagnosis codes.
Shen, Peiping; Zhang, Tongli; Wang, Chunfeng
2017-01-01
This article presents a new approximation algorithm for globally solving a class of generalized fractional programming problems (P) whose objective functions are defined as an appropriate composition of ratios of affine functions. To solve this problem, the algorithm solves an equivalent optimization problem (Q) via an exploration of a suitably defined nonuniform grid. The main work of the algorithm involves checking the feasibility of linear programs associated with the grid points of interest. Based on a computational complexity analysis, it is proved that the proposed algorithm is a fully polynomial time approximation scheme when the number of ratio terms in the objective function of problem (P) is fixed. In contrast to existing results in the literature, the algorithm does not require assumptions of quasi-concavity or low rank on the objective function of problem (P). Numerical results are given to illustrate the feasibility and effectiveness of the proposed algorithm.
NASA Technical Reports Server (NTRS)
Medan, R. T.; Ray, K. S.
1974-01-01
A description of and users manual are presented for a U.S.A. FORTRAN 4 computer program which evaluates spanwise and chordwise loading distributions, lift coefficient, pitching moment coefficient, and other stability derivatives for thin wings in linearized, steady, subsonic flow. The program is based on a kernel function method lifting surface theory and is applicable to a large class of planforms including asymmetrical ones and ones with mixed straight and curved edges.
Zhang, Huaguang; Song, Ruizhuo; Wei, Qinglai; Zhang, Tieyan
2011-12-01
In this paper, a novel heuristic dynamic programming (HDP) iteration algorithm is proposed to solve the optimal tracking control problem for a class of nonlinear discrete-time systems with time delays. The algorithm comprises state updating, control policy iteration, and performance index iteration; to obtain the optimal states, the states are updated by a "backward iteration". Two neural networks are used to approximate the performance index function and compute the optimal control policy, facilitating the implementation of the HDP iteration algorithm. Finally, we present two examples to demonstrate the effectiveness of the proposed algorithm.
A general numerical analysis program for the superconducting quasiparticle mixer
NASA Technical Reports Server (NTRS)
Hicks, R. G.; Feldman, M. J.; Kerr, A. R.
1986-01-01
A user-oriented computer program SISCAP (SIS Computer Analysis Program) for analyzing SIS mixers is described. The program allows arbitrary impedance terminations to be specified at all LO harmonics and sideband frequencies. It is therefore able to treat a much more general class of SIS mixers than the widely used three-frequency analysis, for which the harmonics are assumed to be short-circuited. An additional program, GETCHI, provides the necessary input data to program SISCAP. The SISCAP program performs a nonlinear analysis to determine the SIS junction voltage waveform produced by the local oscillator. The quantum theory of mixing is used in its most general form, treating the large signal properties of the mixer in the time domain. A small signal linear analysis is then used to find the conversion loss and port impedances. The noise analysis includes thermal noise from the termination resistances and shot noise from the periodic LO current. Quantum noise is not considered. Many aspects of the program have been adequately verified and found accurate.
Parallel language constructs for tensor product computations on loosely coupled architectures
NASA Technical Reports Server (NTRS)
Mehrotra, Piyush; Van Rosendale, John
1989-01-01
A set of language primitives designed to allow the specification of parallel numerical algorithms at a higher level is described. The authors focus on tensor product array computations, a simple but important class of numerical algorithms. They consider first the problem of programming one-dimensional kernel routines, such as parallel tridiagonal solvers, and then look at how such parallel kernels can be combined to form parallel tensor product algorithms.
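The identity that makes tensor product kernels composable is (A ⊗ B) vec(X) = vec(B X Aᵀ), with column-major vec: a tensor-product operator can be applied with two small matrix products instead of ever forming the Kronecker product. A pure-Python check on tiny matrices:

```python
# Verify (A kron B) @ vec(X) == vec(B @ X @ A^T) with column-major vec.
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(col) for col in zip(*A)]

def kron(A, B):
    return [[A[i][j] * B[p][q] for j in range(len(A[0])) for q in range(len(B[0]))]
            for i in range(len(A)) for p in range(len(B))]

def vec(X):  # column-major stacking
    return [X[i][j] for j in range(len(X[0])) for i in range(len(X))]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]
X = [[1, 2], [3, 4]]

vx = vec(X)
lhs = [sum(row[k] * vx[k] for k in range(len(vx))) for row in kron(A, B)]
rhs = vec(matmul(matmul(B, X), transpose(A)))
```

On a parallel machine, the right-hand side is what a tensor-product kernel actually computes: two dense products over small one-dimensional operators rather than one product with a large Kronecker matrix.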
Rebecca Ralston; Joseph Buongiorno; Benedict Schulte; Jeremy Fried
2003-01-01
WestPro is an add-in program designed to work with Microsoft Excel to simulate the growth of uneven-aged Douglas-fir (Pseudotsuga menziesii (Mirb.) Franco) stands in the Pacific Northwest region of the United States. Given the initial stand state, defined as the number of softwood and hardwood trees per acre by diameter class, WestPro predicts the...
NASA Technical Reports Server (NTRS)
Buntine, Wray
1994-01-01
IND computer program introduces Bayesian and Markov/maximum-likelihood (MML) methods and more-sophisticated methods of searching in growing trees. Produces more-accurate class-probability estimates important in applications like diagnosis. Provides range of features and styles with convenience for casual user, fine-tuning for advanced user or for those interested in research. Consists of four basic kinds of routines: data-manipulation, tree-generation, tree-testing, and tree-display. Written in C language.
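The class-probability estimates IND emphasizes can be illustrated, in their simplest form, by Laplace-smoothed frequencies at a decision-tree leaf; IND's Bayesian and MML machinery is far more elaborate, so treat this only as a sketch of the idea:

```python
# Laplace-corrected class probabilities at a tree leaf: with n_c examples of
# class c out of n at the leaf and K classes, P(c) = (n_c + 1) / (n + K).
def leaf_probability(counts):
    n, K = sum(counts.values()), len(counts)
    return {c: (nc + 1) / (n + K) for c, nc in counts.items()}

probs = leaf_probability({"disease": 3, "healthy": 1})
# raw frequencies would give 0.75/0.25; smoothing pulls both toward 0.5
```

Smoothed estimates like these matter in diagnosis-style applications because a leaf with few training examples should not report overconfident probabilities of 0 or 1.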
A Survey of Object Oriented Languages in Programming Environments.
1987-06-01
subset of natural languages might be more effective, and make the human-computer interface more friendly. ... and complexity of Ada. He meant that the language contained too many features that made it complicated to use effectively. Much of the complexity comes ... by sending messages to a class instance. A small subset of the methods in Smalltalk-80 are not expressed in the Smalltalk-80 programming language
A general methodology for maximum likelihood inference from band-recovery data
Conroy, M.J.; Williams, B.K.
1984-01-01
A numerical procedure is described for obtaining maximum likelihood estimates and associated maximum likelihood inference from band-recovery data. The method is used to illustrate previously developed one-age-class band-recovery models, and is extended to new models, including the analysis with a covariate for survival rates and variable-time-period recovery models. Extensions to R-age-class band-recovery, mark-recapture models, and twice-yearly marking are discussed. A FORTRAN program provides computations for these models.
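The flavor of the likelihood involved can be sketched for a one-age-class band-recovery model: a bird banded in year i is recovered in year j ≥ i with probability f·s^(j−i), where s is annual survival and f the recovery rate, and the parameters maximize the multinomial likelihood over recovery cells. The crude grid search below (the actual program uses a proper numerical maximization) recovers the parameters from idealized expected-value data:

```python
# One-age-class band-recovery likelihood with a toy grid-search maximizer.
import math

def cell_probs(s, f, years=4):
    p = [f * s ** j for j in range(years)]   # recovered j years after banding
    return p + [1.0 - sum(p)]                # final cell: never recovered

def log_lik(counts, s, f):
    probs = cell_probs(s, f)
    if min(probs) <= 0:
        return -math.inf                     # invalid parameter combination
    return sum(c * math.log(p) for c, p in zip(counts, probs))

N = 1000
counts = [N * p for p in cell_probs(0.7, 0.1)]  # idealized "observed" data

best = max(((s / 100, f / 100) for s in range(1, 100) for f in range(1, 100)),
           key=lambda sf: log_lik(counts, *sf))
```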
NASA Astrophysics Data System (ADS)
Baytak, Ahmet
Among educational researchers and practitioners, there is a growing interest in employing computer games for pedagogical purposes. The present research integrated a technology education class and a science class in which 5th graders learned about environmental issues by designing games that involved environmental concepts. The purposes of this study were to investigate how designing computer games affected the development of students' environmental knowledge, programming knowledge, environmental awareness, and interest in computers. It also explored the nature of the artifacts developed and the types of knowledge represented therein. A case study (Yin, 2003) was employed within the context of a 5th grade elementary science classroom. Fifth graders designed computer games about environmental issues to present to 2nd graders using Scratch software. The analysis of this study was based on multiple data sources: students' pre- and post-test scores on environmental awareness, their environmental knowledge, their interest in computer science, and their game designs. Also included in the analyses were data from students' computer games, participant observations, and structured interviews. The results of the study showed that students were able to successfully design functional games that represented their understanding of the environment, even though the gains between the pre- and post-environmental knowledge test and the environmental awareness survey were minimal. The findings indicate that all students were able to use various game characteristics and programming concepts, but their prior experience with the design software affected their representations. The analyses of the interview transcriptions and games show that students improved their programming skills and that they wanted to do similar projects for other subject areas in the future. Observations showed that game design appeared to lead to knowledge-building, interaction, and collaboration among students.
This, in turn, encouraged students to test and improve their designs. Sharing the games, it was found, has both positive and negative effects on the students' game design process and the representation of students' understandings of the domain subject.
A new order-theoretic characterisation of the polytime computable functions☆
Avanzini, Martin; Eguchi, Naohi; Moser, Georg
2015-01-01
We propose a new order-theoretic characterisation of the class of polytime computable functions. To this end, we define the small polynomial path order (sPOP⁎ for short). This termination order entails a new syntactic method to analyse the innermost runtime complexity of term rewrite systems fully automatically: for any rewrite system compatible with sPOP⁎ that employs recursion up to depth d, the (innermost) runtime complexity is polynomially bounded of degree d. This bound is tight. Thus we obtain a direct correspondence between a syntactic (and easily verifiable) condition of a program and the asymptotic worst-case complexity of the program. PMID:26412933
Using Technology to Create Safer Schools.
ERIC Educational Resources Information Center
Townley, Arthur J.; Martinez, Kenneth
1995-01-01
Although classes to create student self-esteem and antigang programs are gaining in popularity, most school districts have not used available technology to help create safer campuses. Increased availability of telephones and two-way radios would enhance school security, along with incorporation of newer technologies such as computers, digitized…
Computer-Assisted Instruction Case Study: The Introductory Marketing Course.
ERIC Educational Resources Information Center
Skinner, Steven J.; Grimm, Jim L.
1979-01-01
Briefly reviews research on the effectiveness of CAI in instruction, and describes a study comparing the performance of students using one program for basic marketing--TRMP (Tutorial Review of Marketing Principles)--with or without a study guide, the study guide alone, and a traditional class. (BBM)
Department-Generated Microcomputer Software.
ERIC Educational Resources Information Center
Mantei, Erwin J.
1986-01-01
Explains how self-produced software can be used to perform rapid number analysis or number-crunching duties in geology classes. Reviews programs in mineralogy and petrology and identifies areas in geology where computers can be used effectively. Discusses the advantages and benefits of integrating department-generated software into a geology…
Mars Science Laboratory Workstation Test Set
NASA Technical Reports Server (NTRS)
Henriquez, David A.; Canham, Timothy K.; Chang, Johnny T.; Villaume, Nathaniel
2009-01-01
The Workstation Test Set (WSTS), developed for the Mars Science Laboratory (MSL), is a computer program that enables flight software development on virtual MSL avionics. The WSTS is a non-real-time flight avionics simulator designed to be completely software-based and to run on a workstation-class Linux PC.
HI-TIE: The University, the High School, and Engineering
ERIC Educational Resources Information Center
Ward, Robert C.; Maxwell, Lee M.
1975-01-01
Describes four years experience at Colorado State University with courses introducing high school students to engineering, including a Fortran IV computer programming course in which tapings of actual campus classroom sessions, supplemented with homework assignments, class roles, quizzes, and examinations were used. Benefits of the transitional…
HackaMol: An Object-Oriented Modern Perl Library for Molecular Hacking on Multiple Scales
Riccardi, Demian M.; Parks, Jerry M.; Johs, Alexander; ...
2015-03-20
HackaMol is an open source, object-oriented toolkit written in Modern Perl that organizes atoms within molecules and provides chemically intuitive attributes and methods. The library consists of two components: HackaMol, the core that contains classes for storing and manipulating molecular information, and HackaMol::X, the extensions that use the core. The core is well-tested, well-documented, and easy to install across computational platforms. The goal of the extensions is to provide a more flexible space for researchers to develop and share new methods. In this application note, we provide a description of the core classes and two extensions: HackaMol::X::Calculator, an abstract calculator that uses code references to generalize interfaces with external programs, and HackaMol::X::Vina, a structured class that provides an interface with the AutoDock Vina docking program.
HackaMol: An Object-Oriented Modern Perl Library for Molecular Hacking on Multiple Scales.
Riccardi, Demian; Parks, Jerry M; Johs, Alexander; Smith, Jeremy C
2015-04-27
HackaMol is an open source, object-oriented toolkit written in Modern Perl that organizes atoms within molecules and provides chemically intuitive attributes and methods. The library consists of two components: HackaMol, the core that contains classes for storing and manipulating molecular information, and HackaMol::X, the extensions that use the core. The core is well-tested, well-documented, and easy to install across computational platforms. The goal of the extensions is to provide a more flexible space for researchers to develop and share new methods. In this application note, we provide a description of the core classes and two extensions: HackaMol::X::Calculator, an abstract calculator that uses code references to generalize interfaces with external programs, and HackaMol::X::Vina, a structured class that provides an interface with the AutoDock Vina docking program.
NASA Technical Reports Server (NTRS)
Kolb, Mark A.
1990-01-01
Originally, computer programs for engineering design focused on detailed geometric design. Later, computer programs for algorithmically performing the preliminary design of specific well-defined classes of objects became commonplace. However, due to the need for extreme flexibility, it appears unlikely that conventional programming techniques will prove fruitful in developing computer aids for engineering conceptual design. The use of symbolic processing techniques, such as object-oriented programming and constraint propagation, facilitates such flexibility. Object-oriented programming allows programs to be organized around the objects and behavior to be simulated, rather than around fixed sequences of function and subroutine calls. Constraint propagation allows declarative statements to be understood as designating multi-directional mathematical relationships among all the variables of an equation, rather than as unidirectional assignments to the variable on the left-hand side of the equation, as in conventional computer programs. The research has concentrated on applying these two techniques to the development of a general-purpose computer aid for engineering conceptual design. Object-oriented programming techniques are utilized to implement a user-extensible database of design components. The mathematical relationships which model both the geometry and physics of these components are managed via constraint propagation. In addition to this component-based hierarchy, special-purpose data structures are provided for describing component interactions and supporting state-dependent parameters. In order to investigate the utility of this approach, a number of sample design problems from the field of aerospace engineering were implemented using the prototype design tool, Rubber Airplane.
The additional level of organizational structure obtained by representing design knowledge in terms of components is observed to provide greater convenience to the program user, and to result in a database of engineering information which is easier both to maintain and to extend.
Best bang for your buck: GPU nodes for GROMACS biomolecular simulations
Kutzner, Carsten; Páll, Szilárd; Fechner, Martin; Esztermann, Ansgar; de Groot, Bert L.; Grubmüller, Helmut
2015-01-01
The molecular dynamics simulation package GROMACS runs efficiently on a wide variety of hardware from commodity workstations to high performance computing clusters. Hardware features are well-exploited with a combination of single instruction multiple data, multithreading, and message passing interface (MPI)-based single program multiple data/multiple program multiple data parallelism while graphics processing units (GPUs) can be used as accelerators to compute interactions off-loaded from the CPU. Here, we evaluate which hardware produces trajectories with GROMACS 4.6 or 5.0 in the most economical way. We have assembled and benchmarked compute nodes with various CPU/GPU combinations to identify optimal compositions in terms of raw trajectory production rate, performance-to-price ratio, energy efficiency, and several other criteria. Although hardware prices are naturally subject to trends and fluctuations, general tendencies are clearly visible. Adding any type of GPU significantly boosts a node's simulation performance. For inexpensive consumer-class GPUs this improvement equally reflects in the performance-to-price ratio. Although memory issues in consumer-class GPUs could pass unnoticed as these cards do not support error checking and correction memory, unreliable GPUs can be sorted out with memory checking tools. Apart from the obvious determinants for cost-efficiency like hardware expenses and raw performance, the energy consumption of a node is a major cost factor. Over the typical hardware lifetime until replacement of a few years, the costs for electrical power and cooling can become larger than the costs of the hardware itself. Taking that into account, nodes with a well-balanced ratio of CPU and consumer-class GPU resources produce the maximum amount of GROMACS trajectory over their lifetime. © 2015 The Authors. Journal of Computational Chemistry Published by Wiley Periodicals, Inc. PMID:26238484
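The cost model behind "maximum trajectory over lifetime" is simple arithmetic: total cost is hardware price plus energy for power and cooling over the service life, and the figure of merit is simulation throughput per currency unit. The numbers below are illustrative, not the paper's measurements:

```python
# Back-of-the-envelope trajectory-per-euro comparison (illustrative numbers).
def trajectory_per_euro(ns_per_day, hw_price_eur, node_watts,
                        years=5, eur_per_kwh=0.20, cooling_overhead=0.5):
    hours = years * 365 * 24
    # energy cost for power plus a cooling surcharge over the node's lifetime
    energy_eur = node_watts / 1000 * hours * eur_per_kwh * (1 + cooling_overhead)
    total_ns = ns_per_day * 365 * years
    return total_ns / (hw_price_eur + energy_eur)

cpu_only = trajectory_per_euro(20.0, 2000.0, 250.0)   # hypothetical CPU node
with_gpu = trajectory_per_euro(60.0, 2800.0, 400.0)   # hypothetical CPU+GPU node
```

Even though the GPU node draws more power and costs more up front, its higher throughput dominates over the lifetime, which is the paper's central observation.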
NASA Astrophysics Data System (ADS)
Kurkovsky, Stan
2013-06-01
Computer games have been accepted as an engaging and motivating tool in the computer science (CS) curriculum. However, designing and implementing a playable game is challenging, and is best done in advanced courses. Games for mobile devices, on the other hand, offer the advantage of being simpler and, thus, easier for lower-level students to program. The learning context of mobile game development can be used to reinforce many core programming topics, such as loops, classes, and arrays. Furthermore, it can also be used to expose students in introductory computing courses to a wide range of advanced topics in order to illustrate that CS can be much more than coding. This paper describes the author's experience with using mobile game development projects in CS I and II, how these projects were integrated into existing courses at several universities, and the lessons learned from this experience.
AUTOCLASS III - AUTOMATIC CLASS DISCOVERY FROM DATA
NASA Technical Reports Server (NTRS)
Cheeseman, P. C.
1994-01-01
The program AUTOCLASS III, Automatic Class Discovery from Data, uses Bayesian probability theory to provide a simple and extensible approach to problems such as classification and general mixture separation. Its theoretical basis is free from ad hoc quantities, and in particular free of any measures which alter the data to suit the needs of the program. As a result, the elementary classification model used lends itself easily to extensions. The standard approach to classification in much of artificial intelligence and statistical pattern recognition research involves partitioning of the data into separate subsets, known as classes. AUTOCLASS III uses the Bayesian approach in which classes are described by probability distributions over the attributes of the objects, specified by a model function and its parameters. The calculation of the probability of each object's membership in each class provides a more intuitive classification than absolute partitioning techniques. AUTOCLASS III is applicable to most data sets consisting of independent instances, each described by a fixed length vector of attribute values. An attribute value may be a number, one of a set of attribute specific symbols, or omitted. The user specifies a class probability distribution function by associating attribute sets with supplied likelihood function terms. AUTOCLASS then searches in the space of class numbers and parameters for the maximally probable combination. It returns the set of class probability function parameters, and the class membership probabilities for each data instance. AUTOCLASS III is written in Common Lisp, and is designed to be platform independent. This program has been successfully run on Symbolics and Explorer Lisp machines. 
It has been successfully used with the following implementations of Common Lisp on Sun and similar UNIX platforms: Franz Allegro CL, Lucid Common Lisp, and Austin Kyoto Common Lisp; under the Lucid Common Lisp implementations on VAX/VMS v5.4, VAX/Ultrix v4.1, and MIPS/Ultrix v4, rev. 179; and on the Macintosh personal computer. The minimum Macintosh required is the IIci. This program will not run under CMU Common Lisp or VAX/VMS DEC Common Lisp. A minimum of 8Mb of RAM is required for Macintosh platforms and 16Mb for workstations. The standard distribution medium for this program is a .25 inch streaming magnetic tape cartridge in UNIX tar format. It is also available on a 3.5 inch diskette in UNIX tar format and a 3.5 inch diskette in Macintosh format. An electronic copy of the documentation is included on the distribution medium. AUTOCLASS was developed between March 1988 and March 1992. It was initially released in May 1991. Sun is a trademark of Sun Microsystems, Inc. UNIX is a registered trademark of AT&T Bell Laboratories. DEC, VAX, VMS, and ULTRIX are trademarks of Digital Equipment Corporation. Macintosh is a trademark of Apple Computer, Inc. Allegro CL is a registered trademark of Franz, Inc.
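The probabilistic class-membership idea in the AUTOCLASS abstract can be illustrated, in highly simplified form, by expectation-maximization on a one-dimensional two-class Gaussian mixture. The sketch below is not AUTOCLASS itself (which searches over class counts and richer model functions and is written in Common Lisp); it is a minimal Python analogue showing how each object receives a membership probability in each class rather than an absolute partition. All names are illustrative.

```python
import math

def em_gaussian_mixture(data, n_iter=50):
    """Fit a two-class 1-D Gaussian mixture by EM; return the class means and
    the soft membership probability of every point in each class."""
    # Deterministic initialization: one mean at each extreme of the data.
    means = [min(data), max(data)]
    var = [1.0, 1.0]
    weights = [0.5, 0.5]

    def gauss(x, m, v):
        return math.exp(-(x - m) ** 2 / (2 * v)) / math.sqrt(2 * math.pi * v)

    resp = []
    for _ in range(n_iter):
        # E-step: posterior probability of each point's membership in each class.
        resp = []
        for x in data:
            p = [w * gauss(x, m, v) for w, m, v in zip(weights, means, var)]
            total = sum(p)
            resp.append([pi / total for pi in p])
        # M-step: re-estimate mixture weights, means, and variances.
        for k in range(2):
            nk = sum(r[k] for r in resp)
            weights[k] = nk / len(data)
            means[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = max(sum(r[k] * (x - means[k]) ** 2
                             for r, x in zip(resp, data)) / nk, 1e-6)
    return means, resp

# Two obvious clusters near 0 and 5; memberships come out as probabilities,
# not hard partitions.
data = [-0.3, -0.1, 0.0, 0.2, 0.4, 4.6, 4.8, 5.0, 5.2, 5.4]
means, resp = em_gaussian_mixture(data)
```

Each row of `resp` sums to one, so ambiguous points can carry weight in both classes, which is the intuition behind AUTOCLASS's "more intuitive classification than absolute partitioning techniques."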
Genetics Reasoning with Multiple External Representations.
ERIC Educational Resources Information Center
Tsui, Chi-Yan; Treagust, David F.
2003-01-01
Explores a case study of a class of 10th grade students whose learning of genetics involved activities using BioLogica, a computer program that features multiple external representations (MERs). Findings indicate that the MERs in BioLogica contributed to students' development of genetics reasoning by engendering their motivation and interest but…
Robotics and Children: Science Achievement and Problem Solving.
ERIC Educational Resources Information Center
Wagner, Susan Preston
1999-01-01
Compared the impact of robotics (computer-powered manipulative) to a battery-powered manipulative (novelty control) and traditionally taught science class on science achievement and problem solving of fourth through sixth graders. Found that the robotics group had higher scores on programming logic-problem solving than did the novelty control…
2010-03-01
allows the programmer to use the English language in an expressive manner while still maintaining the logical structure of a programming language (Pressman)…and Choudhury, Tanzeem. 2000. Face Recognition for Smart Environments. IEEE Computer, pp. 50–55. Pressman, Roger. 2010. Software Engineering: A…
Teaching Quality Object-Oriented Programming
ERIC Educational Resources Information Center
Feldman, Yishai A.
2005-01-01
Computer science students need to learn how to write high-quality software. An important methodology for achieving quality is design-by-contract, in which code is developed together with its specification, which is given as class invariants and method pre- and postconditions. This paper describes practical experience in teaching design-by-contract…
ERIC Educational Resources Information Center
Hollenbeck, Michelle D.
1997-01-01
For the past five years, Andover, Kansas middle-schoolers in an amateur radio club and class have sent and received Morse code messages, assembled and soldered circuit boards, designed and built antenna systems, and used computer programs to analyze radio communications problems. A successful bond issue financed a ham shack enabling students to…
CAI in Advanced Literature Class.
ERIC Educational Resources Information Center
Hinton, Norman
1981-01-01
Ways that computer assisted instruction (CAI) can be useful in teaching English at upperclass and graduate levels are considered, with illustrations from PLATO lessons that have been composed and programmed. One lesson takes advantage of PLATO's graphic design capabilities, which enabled the teacher to design the runic figures and to show them in…
Locating Hate Speech in the Networked Writing Classroom.
ERIC Educational Resources Information Center
Catalano, Tim
Many instructors are planning to teach their writing classes in the networked computer classroom. Through the use of electronic mail, course listservs, and chat programs, the instructor is offered the opportunity to facilitate a more egalitarian classroom discourse that creates a strong sense of community, not only between students, but also…
Survey of Biochemical Education in Japanese Universities.
ERIC Educational Resources Information Center
Kagawa, Yasuo
1995-01-01
Reports findings of questionnaires sent to faculty in charge of biochemical education in medical schools and other programs from dentistry to agriculture. Total class hours have declined since 1984. New trends include bioethics and computer-assisted learning. Tables show trends in lecture hours, lecture content, laboratory hours, core subject…
The Implementation of Web Conferencing Technologies in Online Graduate Classes
ERIC Educational Resources Information Center
Zotti, Robert
2017-01-01
This dissertation examines the implementation of web conferencing technology in online graduate courses within management, engineering, and computer science programs. Though the spread of learning management systems over the past two decades has been dramatic, the use of web conferencing technologies has curiously lagged. The real-time…
Project Bank: Word Processing on Campus.
ERIC Educational Resources Information Center
Hlavin, Robert F.
Project Bank was initiated at Triton College (Illinois) to increase student awareness of the merits of word processing as it affects their class work and related assignments; to make faculty aware of advances in word processing programs; and to increase the utilization of the college's computer laboratory. All fall 1985 incoming freshmen were…
Padhi, Radhakant; Unnikrishnan, Nishant; Wang, Xiaohua; Balakrishnan, S N
2006-12-01
Even though dynamic programming offers an optimal control solution in a state feedback form, the method is overwhelmed by computational and storage requirements. Approximate dynamic programming implemented with an Adaptive Critic (AC) neural network structure has evolved as a powerful alternative technique that obviates the need for excessive computations and storage requirements in solving optimal control problems. In this paper, an improvement to the AC architecture, called the "Single Network Adaptive Critic" (SNAC), is presented. This approach is applicable to a wide class of nonlinear systems where the optimal control (stationary) equation can be explicitly expressed in terms of the state and costate variables. The selection of this terminology is guided by the fact that it eliminates the use of one neural network (namely the action network) that is part of a typical dual-network AC setup. As a consequence, the SNAC architecture offers three potential advantages: a simpler architecture, a lower computational load, and elimination of the approximation error associated with the eliminated network. In order to demonstrate these benefits and the control synthesis technique using SNAC, two problems have been solved with the AC and SNAC approaches and their computational performances are compared. One of these problems is a real-life micro-electro-mechanical systems (MEMS) problem, which demonstrates that the SNAC technique is applicable to complex engineering systems.
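As an illustrative special case (not drawn from the paper itself), the "optimal control expressible in terms of the costate" property can be seen in the discrete-time linear-quadratic problem, where stationarity of the Hamiltonian gives the control explicitly from the costate; this is the structure a single-network critic can exploit by mapping states to costates:

```latex
H_k = \tfrac{1}{2}\left(x_k^{T} Q x_k + u_k^{T} R u_k\right)
      + \lambda_{k+1}^{T}\left(A x_k + B u_k\right),
\qquad
\frac{\partial H_k}{\partial u_k} = R u_k + B^{T}\lambda_{k+1} = 0
\;\Rightarrow\;
u_k = -R^{-1} B^{T} \lambda_{k+1}.
```

Once a network supplies \(\lambda_{k+1}\) as a function of \(x_k\), no separate action network is needed to produce \(u_k\).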
Marshal Wrubel and the Electronic Computer as an Astronomical Instrument
NASA Astrophysics Data System (ADS)
Mutschlecner, J. P.; Olsen, K. H.
1998-05-01
In 1960, Marshal H. Wrubel, professor of astrophysics at Indiana University, published an influential review paper under the title "The Electronic Computer as an Astronomical Instrument." This essay pointed out the enormous potential of the electronic computer as an instrument of observational and theoretical research in astronomy, illustrated programming concepts, and made specific recommendations for the increased use of computers in astronomy. He noted that, with a few scattered exceptions, computer use by the astronomical community had heretofore been "timid and sporadic." This situation was to improve dramatically in the next few years. By the late 1950s, general-purpose, high-speed "mainframe" computers were just emerging from the experimental, developmental stage, but few were affordable by or available to academic and research institutions not closely associated with large industrial or national defense programs. Yet by 1960 Wrubel had spent a decade actively pioneering and promoting the imaginative application of electronic computation within the astronomical community. Upper-level undergraduate and graduate astronomy students at Indiana were introduced to computing, and Ph.D. candidates whom he supervised applied computer techniques to problems in theoretical astrophysics. He wrote an early textbook on programming, taught programming classes, and helped establish and direct the Research Computing Center at Indiana, later named the Wrubel Computing Center in his honor. He and his students created a variety of algorithms and subroutines and exchanged these throughout the astronomical community by distributing the Astronomical Computation News Letter. Nationally as well as internationally, Wrubel actively cooperated with other groups interested in computing applications for theoretical astrophysics, often through his position as secretary of the IAU commission on Stellar Constitution.
NASA Astrophysics Data System (ADS)
Halkos, George E.; Tsilika, Kyriaki D.
2011-09-01
In this paper we examine the property of asymptotic stability in several dynamic economic systems, modeled as ordinary differential equations in the time parameter t. Asymptotic stability ensures intertemporal equilibrium for the economic quantity the solution stands for, regardless of what the initial conditions happen to be. Existence of economic equilibrium in continuous-time models is checked via a symbolic language, the Xcas program editor. Using stability theorems of differential equations as background, a brief overview of the symbolic capabilities of the free software Xcas is given. We present computational experience with a programming style for stability results of ordinary linear and nonlinear differential equations. Numerical experiments on traditional applications of economic dynamics exhibit the simplicity, clarity, and brevity of the input and output of our computer codes.
NASA Technical Reports Server (NTRS)
Giddings, L.; Boston, S.
1976-01-01
A method for digitizing zone maps is presented, starting with colored images and producing a final one-channel digitized tape. This method automates the work previously done interactively on the Image-100 and Data Analysis System computers of the Johnson Space Center (JSC) Earth Observations Division (EOD). A color-coded map was digitized through color filters on a scanner to form a digital tape in LARSYS-2 or JSC Universal format. The taped image was classified by the EOD LARSYS program on the basis of training fields included in the image. Numerical values were assigned to all pixels in a given class, and the resulting coded zone map was written on a LARSYS or Universal tape. A unique spatial filter option permitted zones to be made homogeneous and edges of zones to be abrupt transitions from one zone to the next. A zoom option allowed the output image to have arbitrary dimensions in terms of number of lines and number of samples on a line. Printouts of the computer program are given and the images that were digitized are shown.
How to get students to love (or not hate) MATLAB and programming
NASA Astrophysics Data System (ADS)
Reckinger, Shanon; Reckinger, Scott
2014-11-01
An effective programming course geared toward engineering students requires the utilization of modern teaching philosophies. A newly designed course that focuses on programming in MATLAB involves flipping the classroom and integrating various active teaching techniques. Vital aspects of the new course design include: lengthening in-class contact hours, Process-Oriented Guided Inquiry Learning (POGIL) method worksheets (self-guided instruction), student-created video content posted on YouTube, clicker questions (used in class to practice reading and debugging code), programming exams that don't require computers, integrating oral exams into the classroom, fostering an environment for formal and informal peer learning, and designing in a broader theme to tie together assignments. Possibly the most important piece of this programming course puzzle, however, is that the instructor must be able to find programming mistakes very quickly and then lead individuals and groups through the steps of finding their mistakes themselves. The effectiveness of the new course design is demonstrated through pre- and post-concept exam results and student evaluation feedback. Students reported that the course was challenging and required a lot of effort, but left largely positive feedback.
A reduced successive quadratic programming strategy for errors-in-variables estimation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tjoa, I.-B.; Biegler, L. T.; Carnegie-Mellon Univ.
Parameter estimation problems in process engineering represent a special class of nonlinear optimization problems, because the maximum likelihood structure of the objective function can be exploited. Within this class, the errors-in-variables method (EVM) is particularly interesting. Here we seek a weighted least-squares fit to the measurements with an underdetermined process model. Thus, both the number of variables and degrees of freedom available for optimization increase linearly with the number of data sets. Large optimization problems of this type can be particularly challenging and expensive to solve because, for general-purpose nonlinear programming (NLP) algorithms, the computational effort increases at least quadratically with problem size. In this study we develop a tailored NLP strategy for EVM problems. The method is based on a reduced Hessian approach to successive quadratic programming (SQP), but with the decomposition performed separately for each data set. This leads to the elimination of all variables but the model parameters, which are determined by a QP coordination step. In this way the computational effort remains linear in the number of data sets. Moreover, unlike previous approaches to the EVM problem, global and superlinear properties of the SQP algorithm apply naturally. Also, the method directly incorporates inequality constraints on the model parameters (although not on the fitted variables). This approach is demonstrated on five example problems with up to 102 degrees of freedom. Compared to general-purpose NLP algorithms, large improvements in computational performance are observed.
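In standard EVM notation (assumed here, not quoted from the report), the structure the abstract describes can be written as a weighted least-squares problem that couples every data set to the shared parameters \(\theta\):

```latex
\min_{\theta,\;\hat{x}_1,\dots,\hat{x}_N}\;
\sum_{i=1}^{N} \left(\hat{x}_i - \tilde{x}_i\right)^{T} V_i^{-1}
\left(\hat{x}_i - \tilde{x}_i\right)
\quad \text{s.t.} \quad f\!\left(\hat{x}_i, \theta\right) = 0,
\qquad i = 1,\dots,N,
```

where \(\tilde{x}_i\) are the measurements for data set \(i\), \(\hat{x}_i\) the fitted variables, and \(V_i\) a covariance weighting. Because each \(\hat{x}_i\) appears only in its own data set's constraints, it can be eliminated separately, leaving a coordination QP in \(\theta\) alone; this is why the tailored SQP's cost stays linear in the number of data sets.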
Satellite Power Systems (SPS) concept definition study. Volume 6: In-depth element investigation
NASA Technical Reports Server (NTRS)
Hanley, G. M.
1980-01-01
The fabrication parameters of GaAs MESFET solid-state amplifiers were determined, considering a power-added conversion efficiency of at least 80% and power gains of at least 10 dB. The operating frequency was 2.45 GHz, although 914 MHz was also considered. The basic circuit considered was either Class C or Class E amplification. Two modeling programs were utilized. The results of several computer calculations considering differing loads, temperatures, and efficiencies are presented. Parametric data in both tabular and plotted form are presented.
Nonlinear Structured Growth Mixture Models in Mplus and OpenMx
Grimm, Kevin J.; Ram, Nilam; Estabrook, Ryne
2014-01-01
Growth mixture models (GMMs; Muthén & Muthén, 2000; Muthén & Shedden, 1999) are a combination of latent curve models (LCMs) and finite mixture models to examine the existence of latent classes that follow distinct developmental patterns. GMMs are often fit with linear, latent basis, multiphase, or polynomial change models because of their common use, flexibility in modeling many types of change patterns, the availability of statistical programs to fit such models, and the ease of programming. In this paper, we present additional ways of modeling nonlinear change patterns with GMMs. Specifically, we show how LCMs that follow specific nonlinear functions can be extended to examine the presence of multiple latent classes using the Mplus and OpenMx computer programs. These models are fit to longitudinal reading data from the Early Childhood Longitudinal Study-Kindergarten Cohort to illustrate their use. PMID:25419006
NASA Astrophysics Data System (ADS)
Barak, Miri; Harward, Judson; Kocur, George; Lerman, Steven
2007-08-01
Within the framework of MIT's course 1.00: Introduction to Computers and Engineering Problem Solving, this paper describes an innovative project entitled Studio 1.00 that integrates lectures with in-class demonstrations, active learning sessions, and on-task feedback, through the use of wireless laptop computers. This paper also describes a related evaluation study that investigated the effectiveness of different instructional strategies, comparing traditional teaching with two models of the studio format. Students' learning outcomes, specifically, their final grades and conceptual understanding of computational methods and programming, were examined. Findings indicated that Studio 1.00, in both its extensive- and partial-active-learning modes, enhanced students' learning outcomes in Java programming. Compared to the traditional courses, more students in the studio courses received "A" as their final grade and fewer failed. Moreover, students who regularly attended the active learning sessions were able to conceptualize programming principles better than their peers. We also found two weaknesses in the teaching format of Studio 1.00 that can guide future versions of the course.
The Center of Excellence for Hypersonics Training and Research at the University of Texas at Austin
NASA Technical Reports Server (NTRS)
Dolling, David S.
1993-01-01
Over the period of this grant (1986-92), 23 graduate students were supported by the Center and received education and training in hypersonics through MS and Ph.D. programs. An additional 8 Ph.D. candidates and 2 MS candidates, with their own fellowship support, were attracted to The University of Texas and were recruited into the hypersonics program because of the Center. Their research, supervised by the 10 faculty involved in the Center, resulted in approximately 50 publications and presentations in journals and at national and international technical conferences. To provide broad-based training, a new hypersonics curriculum was created, enabling students to take 8 core classes in theoretical, computational, and experimental hypersonics, and other option classes over a two to four semester period. The Center also developed an active continuing education program. The Hypersonics Short Course was taught 3 times, twice in the USA and once in Europe. Approximately 300 persons were attracted to hear lectures by more than 25 of the leading experts in the field. In addition, a hypersonic aerodynamics short course was offered through AIAA, as well as short courses on computational fluid dynamics (CFD) and advanced CFD. The existence of the Center also enabled faculty to leverage a substantial volume of additional funds from other agencies, for research and graduate student training. Overall, this was a highly successful and highly visible program.
Nutrition Education Needs of Early Childhood Teachers.
ERIC Educational Resources Information Center
Forsythe, Hazel; Wesley, Myrna
This study sought to determine the needs of early childhood teachers in Kentucky for education to help them manage children's nutrition in early childhood programs. The study also sought to determine whether formal classes, self-study via computer, or site-based inservice workshops is the most desirable format for teacher nutrition education. A…
Students' Attitudes in a Graduate Statistics Class.
ERIC Educational Resources Information Center
Kennedy, Robert L.; Broadston, Pamela M.
This study investigated the attitudes toward statistics of graduate students who used a computer program as part of the instructional effort, which allowed for an individualized, self-paced, student-centered activity-based course. The 9 sections involved in the study were offered in 2001 through 2003, and there were 75 participants for whom there…
ERIC Educational Resources Information Center
Shlechter, Theodore M.; And Others
1992-01-01
Examines the effectiveness of SIMNET (Simulation Networking), a virtual reality training simulation system, combined with a program of role-playing activities for helping Army classes to master the conditional knowledge needed for successful field performance. The value of active forms of learning for promoting higher order cognitive thinking is…
Assessing Cognitive Load Theory to Improve Student Learning for Mechanical Engineers
ERIC Educational Resources Information Center
Impelluso, Thomas J.
2009-01-01
A computer programming class for students of mechanical engineering was redesigned and assessed: Cognitive Load Theory was used to redesign the content; online technologies were used to redesign the delivery. Student learning improved and the dropout rate was reduced. This article reports on both attitudinal and objective assessment: comparing…
38 CFR 21.4270 - Measurement of courses.
Code of Federal Regulations, 2010 CFR
2010-07-01
... section, if theory and class instruction constitute more than 50 percent of the required hours in a trade... shops and the time involved in field trips and group instruction may be included in computing the clock... programs and the time involved in field trips and individual and group instruction may be included in...
Backtrack Programming: A Computer-Based Approach to Group Problem Solving.
ERIC Educational Resources Information Center
Scott, Michael D.; Bodaken, Edward M.
Backtrack problem-solving appears to be a viable alternative to current problem-solving methodologies. It appears to have considerable heuristic potential as a conceptual and operational framework for small group communication research, as well as functional utility for the student group in the small group class or the management team in the…
Science Teachers' Response to the Digital Education Revolution
ERIC Educational Resources Information Center
Nielsen, Wendy; Miller, K. Alex; Hoban, Garry
2015-01-01
We report a case study of two highly qualified science teachers as they implemented laptop computers in their Years 9 and 10 science classes at the beginning of the "Digital Education Revolution," Australia's national one-to-one laptop program initiated in 2009. When a large-scale investment is made in a significant educational change,…
NASA Technical Reports Server (NTRS)
Sadovsky, Alexander V.; Davis, Damek; Isaacson, Douglas R.
2012-01-01
A class of problems in air traffic management asks for a scheduling algorithm that supplies the air traffic services authority not only with a schedule of arrivals and departures, but also with speed advisories. Since advisories must be finite, a scheduling algorithm must ultimately produce a finite data set, hence must either start with a purely discrete model or involve a discretization of a continuous one. The former choice, often preferred for intuitive clarity, naturally leads to mixed-integer programs, hindering proofs of correctness and computational cost bounds (crucial for real-time operations). In this paper, a hybrid control system is used to model air traffic scheduling, capturing both the discrete and continuous aspects. This framework is applied to a class of problems, called the Fully Routed Nominal Problem. We prove a number of geometric results on feasible schedules and use these results to formulate an algorithm that attempts to compute a collective speed advisory, is effectively finite, and has computational cost polynomial in the number of aircraft. This work is a first step toward optimization and models refined with more realistic detail.
Lee, Jung-Ah; Nguyen, Hannah; Park, Joan; Tran, Linh; Nguyen, Trang; Huynh, Yen
2017-10-01
Families of ethnic minority persons with dementia often seek help at later stages of the disease. Little is known about the effectiveness of various methods in supporting ethnic minority dementia patients' caregivers. The objective of the study was to identify smartphone and computer usage among family caregivers of dementia patients (i.e., Korean and Vietnamese Americans) to develop dementia-care education programs for them. Participants were asked various questions related to their computer or smartphone usage in conjunction with needs-assessment interviews. Flyers were distributed at two ethnic minority community centers in Southern California. Snowball recruitment was also utilized to reach out to the families of dementia patients dwelling in the community. Thirty-five family caregivers, including 20 Vietnamese and 15 Korean individuals, participated in this survey. Thirty participants (30 of 35, 85.7%) were computer users. Among those, 76.7% (23 of 30) reported daily usage and 53% (16 of 30) claimed to use social media. A majority of the participants (31 of 35, 88.6%) reported that they owned smartphones. More than half of smartphone users (18 of 29, 62%) claimed to use social media applications. Many participants claimed that they could not attend in-class education due to caregiving and/or transportation issues. Most family caregivers of dementia patients use smartphones more often than computers, and more than half of those caregivers communicate with others through social media apps. A smartphone-app-based caregiver intervention may serve as a more effective approach compared to the conventional in-class method. Multiple modalities for the development of caregiver interventions should be considered.
Advanced Technology System Scheduling Governance Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ang, Jim; Carnes, Brian; Hoang, Thuc
In the fall of 2005, the Advanced Simulation and Computing (ASC) Program appointed a team to formulate a governance model for allocating resources and scheduling the stockpile stewardship workload on ASC capability systems. This update to the original document takes into account the new technical challenges and roles for advanced technology (AT) systems and the new ASC Program workload categories that must be supported. The goal of this updated model is to effectively allocate and schedule AT computing resources among all three National Nuclear Security Administration (NNSA) laboratories for weapons deliverables that merit priority on this class of resource. The process outlined below describes how proposed work can be evaluated and approved for resource allocations while preserving high effective utilization of the systems. This approach will provide the broadest possible benefit to the Stockpile Stewardship Program (SSP).
Stochastic Feedforward Control Technique
NASA Technical Reports Server (NTRS)
Halyo, Nesim
1990-01-01
A class of commanded trajectories is modeled as a stochastic process. The Advanced Transport Operating Systems (ATOPS) research and development program conducted by NASA Langley Research Center is aimed at developing capabilities for increases in the capacities of airports; safe and accurate flight in adverse weather conditions, including wind shear; avoidance of wake vortices; and reduced consumption of fuel. Advances in techniques for the design of modern controls and the increased capabilities of digital flight computers are coupled with accurate guidance information from the Microwave Landing System (MLS). The stochastic feedforward control technique was developed within the context of the ATOPS program.
More About Software for No-Loss Computing
NASA Technical Reports Server (NTRS)
Edmonds, Iarina
2007-01-01
A document presents some additional information on the subject matter of "Integrated Hardware and Software for No- Loss Computing" (NPO-42554), which appears elsewhere in this issue of NASA Tech Briefs. To recapitulate: The hardware and software designs of a developmental parallel computing system are integrated to effectuate a concept of no-loss computing (NLC). The system is designed to reconfigure an application program such that it can be monitored in real time and further reconfigured to continue a computation in the event of failure of one of the computers. The design provides for (1) a distributed class of NLC computation agents, denoted introspection agents, that effects hierarchical detection of anomalies; (2) enhancement of the compiler of the parallel computing system to cause generation of state vectors that can be used to continue a computation in the event of a failure; and (3) activation of a recovery component when an anomaly is detected.
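The state-vector idea behind no-loss computing (snapshot enough state that a computation can be continued after a failure) can be sketched generically. The code below is a loose, single-process analogue of that recovery pattern, not NASA's design; all names are illustrative.

```python
import copy

def run_with_recovery(steps, state):
    """Execute a pipeline of step functions over `state`, keeping a snapshot
    (a 'state vector') after each step so that a failing step can be retried
    from the last good snapshot instead of restarting the whole computation."""
    snapshot = copy.deepcopy(state)
    for step in steps:
        try:
            state = step(state)
        except Exception:
            # Anomaly detected: resume from the last checkpointed state.
            state = step(copy.deepcopy(snapshot))
        snapshot = copy.deepcopy(state)
    return state

# Demo: the second step fails on its first invocation, then succeeds on retry.
calls = {"n": 0}

def flaky_double(v):
    calls["n"] += 1
    if calls["n"] == 1:
        raise RuntimeError("simulated node failure")
    return [x * 2 for x in v]

result = run_with_recovery([lambda v: [x + 1 for x in v], flaky_double], [1, 2, 3])
```

In the real system the snapshots are compiler-generated state vectors and the retry happens on a different node; here a single retry from a deep copy stands in for both.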
Graduate Students' Attitudes in an Activity-Based Statistics Course.
ERIC Educational Resources Information Center
Kennedy, Robert L.; McCallister, Corliss Jean
This study investigated graduate students' attitudes toward statistics in a class in which the focus of instruction was the use of a computer program that made possible an individualized, self-paced student-centered, activity-based course. The six sections involved in the study were offered in 2001 and 2002. There were 43 participants for whom…
A Comparison Study between a Traditional and Experimental Program.
ERIC Educational Resources Information Center
Dogan, Hamide
This paper is part of a dissertation defended in January 2001 as part of the author's Ph.D. requirement. The study investigated the effects of use of Mathematica, a computer algebra system, in learning basic linear algebra concepts, It was done by means of comparing two first year linear algebra classes, one traditional and one Mathematica…
ERIC Educational Resources Information Center
Hagerty, Gary; Smith, Stanley; Goodwin, Danielle
2010-01-01
In 2001, Black Hills State University (BHSU) redesigned college algebra to use the computer-based mastery learning program, Assessment and Learning in Knowledge Spaces [1], historical development of concepts modules, whole class discussions, cooperative activities, relevant applications problems, and many fewer lectures. This resulted in a 21%…
Religious Genres, Entextualization, and Literacy in Gitano Children.
ERIC Educational Resources Information Center
Palomares-Valera, Manuel; Cano, Ana; Poveda, David
This paper analyzes the connections between the oral genres displayed by Gitano (also known as Gypsies or Romani) children and adults during religious instruction classes of an Evangelist Church and the writings produced by Gitano children in a computer after-school program of the same community. Subjects were Gitano children (n=30), ages 5-13…
User's guide for the northern hardwood stand models: SIMSAP and SIMTIM
Dale S. Solomon; Richard A. Hosmer
1987-01-01
SIMSAP and SIMTIM are computer programs that have been developed to simulate the stand growth and development of natural and treated even-aged northern hardwood stands. SIMSAP begins with species distributions by quality classes in sapling stands after regeneration. SIMTIM, the poletimber-sawtimber-harvest phase, uses stocking guides based on quadratic mean stand...
ERIC Educational Resources Information Center
Fotaris, Panagiotis; Mastoras, Theodoros; Leinfellner, Richard; Rosunally, Yasmine
2016-01-01
Conventional taught learning practices often experience difficulties in keeping students motivated and engaged. Video games, however, are very successful at sustaining high levels of motivation and engagement through a set of tasks for hours without apparent loss of focus. In addition, gamers solve complex problems within a gaming environment…
ERIC Educational Resources Information Center
Ocak, Mehmet
2008-01-01
This correlational study examined the relationship between gender and the students' attitude and prior knowledge of using one of the mathematical software programs (MATLAB). Participants were selected from one community college, one state university and one private college. Students were volunteers from three Calculus I classrooms (one class from…
Factors that Influence the Success of Male and Female Computer Programming Students in College
NASA Astrophysics Data System (ADS)
Clinkenbeard, Drew A.
As the demand for a technologically skilled work force grows, experience and skill in computer science have become increasingly valuable for college students. However, the number of students graduating with computer science degrees is not growing in proportion to this need. Several groups have traditionally been underrepresented in this field, notably women and students of color. This study investigated elements of computer science education that influence academic achievement in beginning computer programming courses. The goal of the study was to identify elements that increase success in computer programming courses. A 38-item questionnaire was developed and administered during the Spring 2016 semester at California State University Fullerton (CSUF). CSUF is an urban public university of about 40,000 students. Data were collected from three beginning programming classes offered at CSUF. In total, 411 questionnaires were collected, resulting in a response rate of 58.63%. Data for the study were grouped into three broad categories of variables: academic and background variables; affective variables; and peer, mentor, and role-model variables. A conceptual model was developed to investigate how these variables might predict final course grade. Data were analyzed using statistical techniques such as linear regression, factor analysis, and path analysis. Ultimately this study found that peer interactions, comfort with computers, computer self-efficacy, self-concept, and perception of achievement were the best predictors of final course grade. In addition, the analyses showed that male students exhibited higher levels of computer self-efficacy and self-concept than female students, even when they achieved comparable course grades. Implications and explanations of these findings are explored, and potential policy changes are offered.
New NAS Parallel Benchmarks Results
NASA Technical Reports Server (NTRS)
Yarrow, Maurice; Saphir, William; VanderWijngaart, Rob; Woo, Alex; Kutler, Paul (Technical Monitor)
1997-01-01
NPB2 (NAS (NASA Advanced Supercomputing) Parallel Benchmarks 2) is an implementation, based on Fortran and the MPI (message passing interface) message passing standard, of the original NAS Parallel Benchmark specifications. NPB2 programs are run with little or no tuning, in contrast to NPB vendor implementations, which are highly optimized for specific architectures. NPB2 results complement, rather than replace, NPB results. Because they have not been optimized by vendors, NPB2 implementations approximate the performance a typical user can expect for a portable parallel program on distributed memory parallel computers. Together these results provide an insightful comparison of the real-world performance of high-performance computers. New NPB2 features: New implementation (CG), new workstation class problem sizes, new serial sample versions, more performance statistics.
The effect of an enriched learning community on success and retention in chemistry courses
NASA Astrophysics Data System (ADS)
Willoughby, Lois Jane
Since the mid-1990s, the United States has experienced a shortage of scientists and engineers, declining numbers of students choosing these fields as majors, and low student success and retention rates in these disciplines. Learning theorists, educational researchers, and practitioners believe that learning environments can be created so that an improvement in the numbers of students who complete courses successfully could be attained (Astin, 1993; Magolda & Terenzini, n.d.; O'Banion, 1997). Learning communities do this by providing high expectations, academic and social support, feedback during the entire educational process, and involvement with faculty, other students, and the institution (Ketcheson & Levine, 1999). A program evaluation of an existing learning community of science, mathematics, and engineering majors was conducted to determine the extent to which the program met its goals and was effective from faculty and student perspectives. The program provided laptop computers, peer tutors, supplemental instruction with and without computer software, small class size, opportunities for contact with specialists in selected career fields, a resource library, and Peer-Led Team Learning. During the two years the project has existed, success, retention, and next-course continuation rates were higher than in traditional courses. Faculty and student interviews indicated there were many affective accomplishments as well. Success and retention rates for one learning community class (n = 27) and one traditional class (n = 61) in chemistry were collected and compared using Pearson chi-square procedures (p = .05). No statistically significant difference was found between the two groups. Data from an open-ended student survey about how specific elements of their course experiences contributed to success and persistence were analyzed by coding the responses and comparing the learning community and traditional classes.
Substantial differences were found in their perceptions about the lecture, the lab, other supports used for the course, contact with other students, helping them reach their potential, and their recommendation about the course to others. Because of the limitation of small sample size, these differences are reported in descriptive terms.
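The Pearson chi-square comparison described above operates on a 2x2 success/failure table. A minimal pure-Python sketch follows; the counts are invented for illustration and are not the study's data:

```python
# Pearson chi-square statistic for a 2x2 table (group x success/failure).
# A sketch of the comparison described above; the counts used below are
# hypothetical, not the study's data.
def chi_square_2x2(a, b, c, d):
    """a, b = successes/failures in group 1; c, d = successes/failures in group 2."""
    n = a + b + c + d
    cells = [
        (a, (a + b) * (a + c) / n),  # (observed, expected) per cell
        (b, (a + b) * (b + d) / n),
        (c, (c + d) * (a + c) / n),
        (d, (c + d) * (b + d) / n),
    ]
    return sum((obs - exp) ** 2 / exp for obs, exp in cells)

stat = chi_square_2x2(20, 7, 40, 21)  # hypothetical 27- and 61-student classes
print(round(stat, 3))
```

With 1 degree of freedom the statistic must exceed about 3.84 to reach significance at p = .05, so a small value like the one above would, as in the study, indicate no significant group difference.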
Alonso-Silverio, Gustavo A; Pérez-Escamirosa, Fernando; Bruno-Sanchez, Raúl; Ortiz-Simon, José L; Muñoz-Guerrero, Roberto; Minor-Martinez, Arturo; Alarcón-Paredes, Antonio
2018-05-01
A trainer for online laparoscopic surgical skills assessment based on the performance of experts and nonexperts is presented. The system uses computer vision, augmented reality, and artificial intelligence algorithms, implemented on a Raspberry Pi board with the Python programming language. Two training tasks were evaluated by the laparoscopic system: transferring and pattern cutting. Computer vision libraries were used to obtain the number of transferred points and the simulated pattern-cutting trace by tracking the laparoscopic instrument. An artificial neural network (ANN) was trained to learn from experts' and nonexperts' behavior for the pattern cutting task, whereas the assessment of the transferring task was performed using a preestablished threshold. Four expert surgeons in laparoscopic surgery, from hospital "Raymundo Abarca Alarcón," constituted the experienced class for the ANN. Sixteen trainees (10 medical students and 6 residents) without laparoscopic surgical skills and with limited experience in minimally invasive techniques, from the School of Medicine at Universidad Autónoma de Guerrero, constituted the nonexperienced class. Data from participants performing 5 daily repetitions of each task over 5 days were used to build the ANN. The participants tended to improve their learning curve and dexterity with this laparoscopic training system. The classifier shows a mean accuracy and receiver operating characteristic curve of 90.98% and 0.93, respectively. Moreover, the ANN was able to classify the psychomotor skills of users into 2 classes: experienced or nonexperienced. We constructed and evaluated an affordable laparoscopic trainer system using computer vision, augmented reality, and an artificial intelligence algorithm. The proposed trainer has the potential to increase the self-confidence of trainees and to be applied in programs with limited resources.
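The preestablished-threshold assessment described for the transferring task can be sketched in a few lines. The metric names and cutoff values below are illustrative assumptions, not the trainer's actual parameters:

```python
# Hypothetical sketch of a preestablished-threshold skill assessment
# like the one described for the transferring task. Metric names and
# cutoffs are illustrative assumptions, not the trainer's real values.
def assess_transfer(points_transferred, elapsed_s,
                    min_points=6, max_time_s=300):
    """Label an attempt 'experienced' only if enough points were
    transferred within the allotted time."""
    if points_transferred >= min_points and elapsed_s <= max_time_s:
        return "experienced"
    return "nonexperienced"

print(assess_transfer(8, 240))   # fast, accurate attempt
print(assess_transfer(4, 240))   # too few points transferred
```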
Stefan, Melanie I.; Gutlerner, Johanna L.; Born, Richard T.; Springer, Michael
2015-01-01
The past decade has seen a rapid increase in the ability of biologists to collect large amounts of data. It is therefore vital that research biologists acquire the necessary skills during their training to visualize, analyze, and interpret such data. To begin to meet this need, we have developed a “boot camp” in quantitative methods for biology graduate students at Harvard Medical School. The goal of this short, intensive course is to enable students to use computational tools to visualize and analyze data, to strengthen their computational thinking skills, and to simulate and thus extend their intuition about the behavior of complex biological systems. The boot camp teaches basic programming using biological examples from statistics, image processing, and data analysis. This integrative approach to teaching programming and quantitative reasoning motivates students’ engagement by demonstrating the relevance of these skills to their work in life science laboratories. Students also have the opportunity to analyze their own data or explore a topic of interest in more detail. The class is taught with a mixture of short lectures, Socratic discussion, and in-class exercises. Students spend approximately 40% of their class time working through both short and long problems. A high instructor-to-student ratio allows students to get assistance or additional challenges when needed, thus enhancing the experience for students at all levels of mastery. Data collected from end-of-course surveys from the last five offerings of the course (between 2012 and 2014) show that students report high learning gains and feel that the course prepares them for solving quantitative and computational problems they will encounter in their research. We outline our course here which, together with the course materials freely available online under a Creative Commons License, should help to facilitate similar efforts by others. PMID:25880064
NASA Astrophysics Data System (ADS)
Klopfer, Eric; Scheintaub, Hal; Huang, Wendy; Wendel, Daniel
Computational approaches to science are radically altering the nature of scientific investigation. Yet these computer programs and simulations are sparsely used in science education, and when they are used, they are typically “canned” simulations which are black boxes to students. StarLogo The Next Generation (TNG) was developed to make programming of simulations more accessible for students and teachers. StarLogo TNG builds on the StarLogo tradition of agent-based modeling for students and teachers, with the added features of a graphical programming environment and a three-dimensional (3D) world. The graphical programming environment reduces the learning curve of programming, especially its syntax. The 3D graphics make for a more immersive and engaging experience for students, including making it easy to design and program their own video games. Another change in StarLogo TNG is a fundamental restructuring of the virtual machine to make it more transparent. As a result of these changes, classroom use of TNG is expanding to new areas. The chapter concludes with a description of field tests conducted in middle and high school science classes.
Cognitive training in Parkinson disease: cognition-specific vs nonspecific computer training.
Zimmermann, Ronan; Gschwandtner, Ute; Benz, Nina; Hatz, Florian; Schindler, Christian; Taub, Ethan; Fuhr, Peter
2014-04-08
In this study, we compared a cognition-specific computer-based cognitive training program with a motion-controlled computer sports game that is not cognition-specific for their ability to enhance cognitive performance in various cognitive domains in patients with Parkinson disease (PD). Patients with PD were trained with either a computer program designed to enhance cognition (CogniPlus, 19 patients) or a computer sports game with motion-capturing controllers (Nintendo Wii, 20 patients). The effect of training in 5 cognitive domains was measured by neuropsychological testing at baseline and after training. Group differences over all variables were assessed with multivariate analysis of variance, and group differences in single variables were assessed with 95% confidence intervals of mean difference. The groups were similar regarding age, sex, and educational level. Patients with PD who were trained with Wii for 4 weeks performed better in attention (95% confidence interval: -1.49 to -0.11) than patients trained with CogniPlus. In our study, patients with PD derived at least the same degree of cognitive benefit from non-cognition-specific training involving movement as from cognition-specific computerized training. For patients with PD, game consoles may be a less expensive and more entertaining alternative to computer programs specifically designed for cognitive training. This study provides Class III evidence that, in patients with PD, cognition-specific computer-based training is not superior to a motion-controlled computer game in improving cognitive performance.
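The decision criterion used above (a confidence interval for a mean difference that excludes zero) can be sketched with the standard library. This uses a normal (z) approximation and invented data, not the study's measurements:

```python
import statistics as st

# Approximate 95% CI for mean(a) - mean(b), using a normal (z)
# approximation to keep the sketch stdlib-only. A sketch of the
# criterion, not the study's actual analysis.
def ci_mean_difference(a, b, z=1.96):
    diff = st.mean(a) - st.mean(b)
    se = (st.variance(a) / len(a) + st.variance(b) / len(b)) ** 0.5
    return diff - z * se, diff + z * se

# Invented attention-score changes for two training groups:
group_a = [0.1, -0.2, 0.0, 0.3, -0.1]
group_b = [0.6, 0.4, 0.8, 0.5, 0.7]
lo, hi = ci_mean_difference(group_a, group_b)
print(round(lo, 2), round(hi, 2))
```

An interval lying entirely below zero, like the study's (-1.49, -0.11) for attention, indicates a reliable advantage for the second group.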
Using Computers in Introductory Astronomy Courses
NASA Astrophysics Data System (ADS)
Deming, Grace L.
1995-12-01
Computer literacy is fast becoming a focal point in undergraduate education. Scientific literacy has been a continuing goal of undergraduate programs across the nation and a course in introductory astronomy is often used to satisfy such science requirements. At U. MD an introduction to computer skills is being integrated into our astronomy curriculum for non-science majors. The campus is adequately equipped with computer labs, yet many students enter college without basic computer skills. In Astronomy 101 (General Astronomy) students are introduced to electronic mail, a Listserver, and the world wide web. Students in this course are required to register for a free campus computer account. Their first assignment is to use e-mail to subscribe to the class Listserver, Milkyway. Through Milkyway, students have access to weekly lecture summaries, questions to review for exams, and copies of previous exams. Using e-mail students may pose questions, provide comments, or exchange opinions using Milkyway, or they may e-mail the instructor directly. Studies indicate that using e-mail is less intimidating to a student than asking a question in a class of 200 students. Monitoring e-mail for student questions has not been a problem. Student reaction has been favorable to using e-mail, since instructor office hours are not always convenient, especially to commuting or working students. Through required assignments, students receive an introduction to accessing information on the world wide web using Netscape. Astronomy has great resources available on the Internet which can be used to supplement and reinforce introductory material. Assignments are structured so that students will gain the techniques necessary to access available information. It is hoped that students will successfully apply the computer skills they learn in astronomy class to their own fields and as life-long learners. 
We have found that students comfortable with computers are willing to share their knowledge with others. The computer activities have been structured to promote cooperation between students. These skills are also necessary for success.
Samiei, Vida; Wan Puteh, Sharifa Ezat; Abdul Manaf, Mohd Rizal; Abdul Latip, Khalib; Ismail, Aniza
2016-03-01
The idea of launching an internet-based self-management program for patients with diabetes led us to conduct a cross-sectional study to find out about the willingness, interest, equipment, and level of usage of computer and internet in a medium- to low-social-class area and to assess the feasibility of using e-telemonitoring systems for these patients. A total of 180 patients with type 2 diabetes participated in this study and completed the self-administered questionnaire in the Diabetes Clinic of the Primary Medical Center of Universiti Kebangsaan Malaysia Medical Centre; the response rate was 84%. We used the universal sampling method and assessed three groups of factors, including sociodemographic, information and communication technology (ICT), willingness and interest, and disease factors. Our results showed that 56% of the patients with diabetes were interested in using such programs; the majority of the patients were Malay, and patients in the age group of 51-60 years formed the largest group. The majority of these patients had studied up to the secondary level of education. Age, education, income, and money spent on checkups were significantly associated with the interest of patients with diabetes in internet-based programs. ICT-related factors such as computer ownership, computer knowledge, access to the internet, frequency of using the internet, and reasons for internet usage had a positive effect on patients' interest. Our results show that among the low- to intermediate-social-class Malaysian patients with type 2 diabetes, more than 50% could and wanted to use internet-based self-management programs. Furthermore, we also show that patients equipped with more ICT-related factors had more interest in these programs. Therefore, we propose making ICT more affordable and integrating it into the health care system at the primary care level and then extending it nationwide.
Conquering technophobia: preparing faculty for today.
Richard, P L
1997-01-01
The constantly changing world of technology creates excitement and an obligation for faculty of schools of nursing to address computer literacy in the curricula at all levels. The initial step in the process of meeting the goals was to assist the faculty in becoming computer literate so that they could foster and encourage the same in the students. The implementation of The Cure for Technophobia included basic and advanced computer skills designed to assist the faculty in becoming comfortable and competent computer users. The applications addressed included: introduction to windows, electronic mail, word processing, presentation and database applications, library on-line searches of literature databases, introduction to internet browsers and a computerized testing program. Efforts were made to overcome barriers to computer literacy and promote the learning process. Familiar, competent, computer literate individuals were used to conduct the classes to accomplish this goal.
Botta, Cinzia B; Cabri, Walter; Cini, Elena; De Cesare, Lucia; Fattorusso, Caterina; Giannini, Giuseppe; Persico, Marco; Petrella, Antonello; Rondinelli, Francesca; Rodriquez, Manuela; Russo, Adele; Taddei, Maurizio
2011-04-14
Several oxime-containing molecules, characterized by a SAHA-like structure, were explored to select a potentially new biasing binding element for the zinc in the HDAC catalytic site. All compounds were evaluated for their in vitro inhibitory activity against the 11 human HDAC isoforms. After identification of a "hit" molecule, a programmed variation at the cap group and at the linker was carried out in order to increase HDAC inhibition and/or paralogue selectivity. Some of the new derivatives showed increased activity against a number of HDAC isoforms, even if their overall activity range is still far from the inhibition values reported for SAHA. Moreover, different from what was reported for their hydroxamic acid analogues, the new α-oxime amide derivatives do not select between class I and class II HDACs; rather, they target specific isoforms in each class. These somewhat contradictory results were finally rationalized by a computationally assisted SAR, which gave us the chance to understand how the oxime derivatives interact with the catalytic site and to justify the observed activity profile.
Parallel computing in enterprise modeling.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldsby, Michael E.; Armstrong, Robert C.; Shneider, Max S.
2008-08-01
This report presents the results of our efforts to apply high-performance computing to entity-based simulations with a multi-use plugin for parallel computing. We use the term 'entity-based simulation' to describe a class of simulation which includes both discrete event simulation and agent-based simulation. What simulations of this class share, and what differs from more traditional models, is that the result sought is emergent from a large number of contributing entities. Logistic, economic, and social simulations are members of this class, where things or people are organized or self-organize to produce a solution. Entity-based problems never have an a priori ergodic principle that will greatly simplify calculations. Because the results of entity-based simulations can only be realized at scale, scalable computing is de rigueur for large problems. Having said that, the absence of a spatial organizing principle makes the decomposition of the problem onto processors problematic. In addition, practitioners in this domain commonly use the Java programming language, which presents its own problems in a high-performance setting. The plugin we have developed, called the Parallel Particle Data Model, overcomes both of these obstacles and is now being used by two Sandia frameworks: the Decision Analysis Center and the Seldon social simulation facility. While the ability to engage U.S.-sized problems is now available to the Decision Analysis Center, this plugin is central to the success of Seldon. Because Seldon relies on computationally intensive cognitive sub-models, this work is necessary to achieve the scale required for realistic results. With the recent upheavals in the financial markets and the inscrutability of terrorist activity, this simulation domain will likely need a capability with ever greater fidelity. High-performance computing will play an important part in enabling that greater fidelity.
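When no spatial organizing principle is available, a decomposition must fall back on something content-independent. A minimal sketch of that idea (assumed for illustration; not the Parallel Particle Data Model's actual scheme) hashes entity ids across processor ranks:

```python
# Sketch of a content-independent decomposition of simulation entities
# onto processor ranks. Illustrative only; not the actual Parallel
# Particle Data Model implementation.
def decompose(entity_ids, n_ranks):
    """Map each entity id to a rank; returns rank -> list of ids."""
    parts = {rank: [] for rank in range(n_ranks)}
    for eid in entity_ids:
        parts[hash(eid) % n_ranks].append(eid)
    return parts

parts = decompose(range(12), 4)
# Every entity lands on exactly one rank:
assert sum(len(ids) for ids in parts.values()) == 12
```

A hash-based placement balances load statistically but, unlike a spatial decomposition, gives no locality guarantee, which is exactly why communication becomes the hard part in this problem class.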
VALIDATION OF ANSYS FINITE ELEMENT ANALYSIS SOFTWARE
DOE Office of Scientific and Technical Information (OSTI.GOV)
HAMM, E.R.
2003-06-27
This document provides a record of the verification and validation of the ANSYS Version 7.0 software that is installed on selected CH2M HILL computers. The issues addressed include software verification, installation, validation, configuration management, and error reporting. The ANSYS® computer program is a large-scale, multi-purpose finite element program which may be used for solving several classes of engineering analyses. The analysis capabilities of ANSYS Full Mechanical Version 7.0, installed on selected CH2M Hill Hanford Group (CH2M HILL) Intel processor based computers, include the ability to solve static and dynamic structural analyses, steady-state and transient heat transfer problems, mode-frequency and buckling eigenvalue problems, static or time-varying magnetic analyses, and various types of field and coupled-field applications. The program contains many special features which allow nonlinearities or secondary effects to be included in the solution, such as plasticity, large strain, hyperelasticity, creep, swelling, large deflections, contact, stress stiffening, temperature dependency, material anisotropy, and thermal radiation. The ANSYS program has been in commercial use since 1970, and has been used extensively in the aerospace, automotive, construction, electronic, energy services, manufacturing, nuclear, plastics, oil, and steel industries.
HepML, an XML-based format for describing simulated data in high energy physics
NASA Astrophysics Data System (ADS)
Belov, S.; Dudko, L.; Kekelidze, D.; Sherstnev, A.
2010-10-01
In this paper we describe the HepML format and a corresponding C++ library developed for keeping a complete description of parton-level events in a unified and flexible form. HepML tags contain enough information to understand what kind of physics the simulated events describe and how the events have been prepared. A HepML block can be included in event files in the LHEF format. The structure of the HepML block is described by means of several XML Schemas. The Schemas define the necessary information for the HepML block and how this information should be located within the block. The library libhepml is a C++ library intended for parsing and serialization of HepML tags, and for representing the HepML block in computer memory. The library is an API for external software. For example, Matrix Element Monte Carlo event generators can use the library for preparing and writing the header of an LHEF file in the form of HepML tags. In turn, Showering and Hadronization event generators can parse the HepML header and get the information in the form of C++ classes. libhepml can be used in C++, C, and Fortran programs. All necessary parts of HepML have been prepared and we present the project to the HEP community.
Program summary
Program title: libhepml
Catalogue identifier: AEGL_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGL_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: GNU GPLv3
No. of lines in distributed program, including test data, etc.: 138 866
No. of bytes in distributed program, including test data, etc.: 613 122
Distribution format: tar.gz
Programming language: C++, C
Computer: PCs and workstations
Operating system: Scientific Linux CERN 4/5, Ubuntu 9.10
RAM: 1 073 741 824 bytes (1 GB)
Classification: 6.2, 11.1, 11.2
External routines: Xerces XML library (http://xerces.apache.org/xerces-c/), Expat XML Parser (http://expat.sourceforge.net/)
Nature of problem: Monte Carlo simulation in high energy physics is divided into several stages, and various programs exist for these stages. In this article we are interested in interfacing different Monte Carlo event generators via data files, in particular Matrix Element (ME) generators and Showering and Hadronization (SH) generators. There is a widely accepted format for data files for such interfaces: the Les Houches Event Format (LHEF). Although the information kept in an LHEF file is enough for the proper working of SH generators, it is insufficient for understanding how the events in the LHEF file have been prepared and which physical model has been applied. In this paper we propose an extension of the format for keeping additional information available in generators. We propose to add a new information block, marked up with XML tags, to the LHEF file. This block describes the events in the file in more detail. In particular, it stores information about the physical model, kinematical cuts, generator, etc. This helps make LHEF files self-documented. Certainly, HepML can be applied in a more general context, not only in LHEF files.
Solution method: In order to overcome the drawbacks of the original LHEF accord we propose to add a new information block of HepML tags. HepML is an XML-based markup language. We designed several XML Schemas for all tags in the language. Any HepML document should follow the rules of the Schemas. The language is equipped with a library for operating on HepML tags and documents. This C++ library, called libhepml, consists of classes for HepML objects, which represent a HepML document in computer memory, parsing classes, serializing classes, and some auxiliary classes.
Restrictions: The software is adapted for solving the problems described in the article. There are no additional restrictions.
Running time: Tests have been done on a computer with an Intel(R) Core(TM)2 Solo, 1.4 GHz. Parsing of a HepML file: 6 ms (the size of the HepML file is 12.5 Kb). Writing of a HepML block to file: 14 ms (file size 12.5 Kb). Merging of two HepML blocks and writing to file: 18 ms (file size 25.0 Kb).
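The idea of a self-documenting XML header block can be illustrated with a standard XML parser. The tag and attribute names below are invented for the sketch; they are not the actual HepML Schema:

```python
import xml.etree.ElementTree as ET

# Illustrative only: the tag and attribute names are invented, not the
# real HepML Schema, to show how an XML header makes an event file
# self-documenting (which generator ran, which cuts were applied).
header = """
<hepml>
  <generator name="ExampleGen" version="1.0"/>
  <cut variable="pT" min="20.0"/>
</hepml>
"""

root = ET.fromstring(header)
gen = root.find("generator")
cut = root.find("cut")
print(gen.get("name"), gen.get("version"))
print(cut.get("variable"), float(cut.get("min")))
```

A downstream generator parsing such a header recovers the provenance of the events as structured data rather than free text, which is the benefit the abstract describes.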
Increasing Diversity in Computer Science: Acknowledging, yet Moving Beyond, Gender
NASA Astrophysics Data System (ADS)
Larsen, Elizabeth A.; Stubbs, Margaret L.
Lack of diversity within the computer science field has, thus far, been examined most fully through the lens of gender. This article is based on a follow-on to Margolis and Fisher's (2002) study and includes interviews with 33 Carnegie Mellon University students from the undergraduate senior class of 2002 in the School of Computer Science. We found evidence of similarities among the perceptions of these women and men on definitions of computer science, explanations for the notoriously low proportion of women in the field, characterizations of a typical computer science student, impressions of recent curricular changes, a sense of the atmosphere/culture in the program, views of the Women@SCS campus organization, and suggestions for attracting and retaining well-rounded students in computer science. We conclude that efforts to increase diversity in the computer science field will benefit from a more broad-based approach that considers, but is not limited to, notions of gender difference.
The UCLA MEDLARS Computer System *
Garvis, Francis J.
1966-01-01
Under a subcontract with UCLA, the Planning Research Corporation has changed the MEDLARS system to make it possible to use the IBM 7094/7040 direct-couple computer instead of the Honeywell 800 for demand searches. The major tasks were the rewriting of the programs in COBOL and the copying of the stored information onto the narrower tapes that IBM computers require. (In the future NLM will copy the tapes for IBM computer users.) The differences in the software required by the two computers are noted. Major and costly revisions would be needed to adapt the large MEDLARS system to the smaller IBM 1401 and 1410 computers. In general, MEDLARS is transferable to other computers of the IBM 7000 class, the new IBM 360, and those of like size, such as the CDC 1604 or UNIVAC 1108, although additional changes are necessary. Potential future improvements are suggested. PMID:5901355
ERIC Educational Resources Information Center
Reubsaet, A.; Brug, J.; Kitslaar, J.; Van Hooff, J. P.; van den Borne, H. W.
2004-01-01
The present paper describes the impact and evaluation of two intervention components--a video with group discussion and an interactive computer-tailored program--in order to encourage adolescents to register their organ donation preference. Studies were conducted in school during regular school hours. The video with group discussion in class had a…
ERIC Educational Resources Information Center
Pittsburg Unified School District, CA.
The card games in this publication are an alternative activity to help students master computational skills. Games for operations with whole numbers, fractions, decimals, percents, integers, and square roots are included. They can be used to introduce math topics and for practice and review, with either the whole class or in small groups with 2 to…
Don C. Bragg
2002-01-01
This article is an introduction to the computer software used by the Potential Relative Increment (PRI) approach to optimal tree diameter growth modeling. These DOS programs extract qualified tree and plot data from the Eastwide Forest Inventory Data Base (EFIDB), calculate relative tree increment, sort for the highest relative increments by diameter class, and...
NASA Technical Reports Server (NTRS)
Phillips, D. T.; Manseur, B.; Foster, J. W.
1982-01-01
Alternative definitions of system failure lead to complex analyses for which analytic solutions are available only for simple, special cases. The GRASP methodology is a computer simulation approach for solving all classes of problems in which both failure and repair events are modeled according to the probability laws of the individual components of the system.
The Efficacy of the Internet-Based Blackboard Platform in Developmental Writing Classes
ERIC Educational Resources Information Center
Shudooh, Yusuf M.
2016-01-01
The application of computer-assisted platforms in writing classes is a relatively new paradigm in education. The adoption of computer-assisted writing classes is gaining ground in many Western and non-Western universities. Numerous issues can be addressed when conducting computer-assisted classes (CAC). However, a few studies conducted to assess…
A portable MPI-based parallel vector template library
NASA Technical Reports Server (NTRS)
Sheffler, Thomas J.
1995-01-01
This paper discusses the design and implementation of a polymorphic collection library for distributed address-space parallel computers. The library provides a data-parallel programming model for C++ by providing three main components: a single generic collection class, generic algorithms over collections, and generic algebraic combining functions. Collection elements are the fourth component of a program written using the library and may be either of the built-in types of C or of user-defined types. Many ideas are borrowed from the Standard Template Library (STL) of C++, although a restricted programming model is proposed because of the distributed address-space memory model assumed. Whereas the STL provides standard collections and implementations of algorithms for uniprocessors, this paper advocates standardizing interfaces that may be customized for different parallel computers. Just as the STL attempts to increase programmer productivity through code reuse, a similar standard for parallel computers could provide programmers with a standard set of algorithms portable across many different architectures. The efficacy of this approach is verified by examining performance data collected from an initial implementation of the library running on an IBM SP-2 and an Intel Paragon.
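The three components the abstract names (a generic collection class, generic algorithms over collections, and algebraic combining functions) can be illustrated with a toy sketch. This is not the library's actual C++ interface; the Python class below, with chunks standing in for per-processor partitions, only mimics the map/reduce pattern such a library exposes.

```python
from functools import reduce

class DistributedVector:
    """Toy stand-in for the generic collection class: elements are
    partitioned into per-processor chunks (here, plain Python lists)."""
    def __init__(self, elements, num_procs):
        elements = list(elements)
        self.chunks = [elements[i::num_procs] for i in range(num_procs)]

    def map(self, fn):
        # generic algorithm over a collection: apply fn chunk by chunk
        out = DistributedVector([], len(self.chunks))
        out.chunks = [[fn(x) for x in chunk] for chunk in self.chunks]
        return out

    def reduce(self, combine, identity):
        # generic algebraic combining function: reduce within each chunk,
        # then across chunks (the role an MPI reduction would play)
        partials = [reduce(combine, chunk, identity) for chunk in self.chunks]
        return reduce(combine, partials, identity)

v = DistributedVector(range(8), num_procs=4)
print(v.map(lambda x: x * x).reduce(lambda a, b: a + b, 0))  # 140
```

Standardizing on such an interface, as the paper advocates, lets each parallel machine supply its own tuned implementation behind the same generic signatures.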
Numerical methods on some structured matrix algebra problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jessup, E.R.
1996-06-01
This proposal concerned the design, analysis, and implementation of serial and parallel algorithms for certain structured matrix algebra problems. It emphasized large order problems and so focused on methods that can be implemented efficiently on distributed-memory MIMD multiprocessors. Such machines supply the computing power and extensive memory demanded by the large order problems. We proposed to examine three classes of matrix algebra problems: the symmetric and nonsymmetric eigenvalue problems (especially the tridiagonal cases) and the solution of linear systems with specially structured coefficient matrices. As all of these are of practical interest, a major goal of this work was to translate our research in linear algebra into useful tools for use by the computational scientists interested in these and related applications. Thus, in addition to software specific to the linear algebra problems, we proposed to produce a programming paradigm and library to aid in the design and implementation of programs for distributed-memory MIMD computers. We now report on our progress on each of the problems and on the programming tools.
[Hepatox: database on hepatotoxic drugs].
Quinton, A; Latry, P; Biour, M
1993-01-01
Hepatox is a database of the hepatotoxic drugs file published every year in Gastroentérologie Clinique et Biologique. The program was developed under Omnis 7 for Apple computers, and under Visual Basic Professional Toolkit and Code Base for IBM PC and compatible computers. The database includes cards for 866 drugs identified by their approved names and for their 1,300 corresponding proprietary names in France; the drugs are distributed among 104 pharmacological classes. The card of a drug identified by its approved name can be accessed instantaneously. Looking up a drug by its proprietary name gives the list of the approved names of its components; moving from a name on this list to the corresponding hepatotoxicity card is immediate. It is easy to extract lists of the drugs responsible for a given type of hepatic injury, and a table of the types of hepatic injury induced by the drugs of a pharmacological class.
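The access paths the abstract describes (approved name → card, proprietary name → component names → cards, injury type → drug list) amount to a few indexed lookups. The sketch below is illustrative only: the field names and sample records are invented for the example and are not taken from Hepatox itself.

```python
# Minimal in-memory model of the database's three access paths.
drugs = {
    "isoniazid":  {"class": "antitubercular",    "injuries": {"hepatocellular"}},
    "methyldopa": {"class": "antihypertensives", "injuries": {"hepatocellular", "chronic hepatitis"}},
    "amoxicillin": {"class": "penicillins",      "injuries": {"cholestatic"}},
}
proprietary_to_components = {"Clamoxyl": ["amoxicillin"]}  # sample French proprietary name

def cards_for_proprietary(name):
    # proprietary name -> approved names of components -> hepatotoxicity cards
    return {c: drugs[c]["injuries"] for c in proprietary_to_components[name]}

def drugs_causing(injury):
    # extract the list of drugs responsible for a given type of hepatic injury
    return sorted(d for d, rec in drugs.items() if injury in rec["injuries"])

print(drugs_causing("hepatocellular"))  # ['isoniazid', 'methyldopa']
```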
Dickinson, Kathleen; Place, Maurice
2016-06-01
Problems with social functioning are a major area of difficulty for children with autism. Such problems have the potential to exert a negative influence on several aspects of the children's functioning, including their ability to access education. This study examined whether a computer-based activity program could improve the social functioning of these children. Using a pooled subject design, 100 children with autistic spectrum disorder were randomly allocated, controlling where possible for age and gender, to either an intervention or a control group. The children in the intervention group were encouraged to use the Nintendo (Kyoto, Japan) Wii™ and the software package "Mario & Sonic at the Olympics" in addition to their routine school physical education classes over a 9-month period. The control group attended only the routine physical education classes. After 1 year, analysis of the changes in the scores of teacher-completed measures of social functioning showed that boys in the intervention group had made statistically significant improvement in their functioning when compared with controls. The number of girls in the study was too small for any change to reach statistical significance. This type of intervention appears to have potential as a mechanism to produce improvement in the social functioning, at least of boys, as part of a physical education program.
NASA Technical Reports Server (NTRS)
Dayton, J. A., Jr.; Kosmahl, H. G.; Ramins, P.; Stankiewicz, N.
1979-01-01
Experimental and analytical results are compared for two high performance, octave bandwidth TWT's that use depressed collectors (MDC's) to improve the efficiency. The computations were carried out with advanced, multidimensional computer programs that are described here in detail. These programs model the electron beam as a series of either disks or rings of charge and follow their multidimensional trajectories from the RF input of the ideal TWT, through the slow wave structure, through the magnetic refocusing system, to their points of impact in the depressed collector. Traveling wave tube performance, collector efficiency, and collector current distribution were computed and the results compared with measurements for a number of TWT-MDC systems. Power conservation and correct accounting of TWT and collector losses were observed. For the TWT's operating at saturation, very good agreement was obtained between the computed and measured collector efficiencies. For a TWT operating 3 and 6 dB below saturation, excellent agreement between computed and measured collector efficiencies was obtained in some cases but only fair agreement in others. However, deviations can largely be explained by small differences in the computed and actual spent beam energy distributions. The analytical tools used here appear to be sufficiently refined to design efficient collectors for this class of TWT. However, for maximum efficiency, some experimental optimization (e.g., collector voltages and aperture sizes) will most likely be required.
Bhanot, Gyan [Princeton, NJ; Blumrich, Matthias A [Ridgefield, CT; Chen, Dong [Croton On Hudson, NY; Coteus, Paul W [Yorktown Heights, NY; Gara, Alan G [Mount Kisco, NY; Giampapa, Mark E [Irvington, NY; Heidelberger, Philip [Cortlandt Manor, NY; Steinmacher-Burow, Burkhard D [Mount Kisco, NY; Takken, Todd E [Mount Kisco, NY; Vranas, Pavlos M [Bedford Hills, NY
2009-09-08
Class network routing is implemented in a network such as a computer network comprising a plurality of parallel compute processors at nodes thereof. Class network routing allows a compute processor to broadcast a message to a range (one or more) of other compute processors in the computer network, such as processors in a column or a row. Normally this type of operation requires a separate message to be sent to each processor. With class network routing pursuant to the invention, a single message is sufficient, which generally reduces the total number of messages in the network as well as the latency to do a broadcast. Class network routing is also applied to dense matrix inversion algorithms on distributed memory parallel supercomputers with hardware class function (multicast) capability. This is achieved by exploiting the fact that the communication patterns of dense matrix inversion can be served by hardware class functions, which results in faster execution times.
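The message-count saving the patent describes can be illustrated with a toy model. Everything here is hypothetical (the function names and the inbox dictionary are ours): a class route is modeled simply as the set of destination nodes, e.g. one row of a 2D node grid, and a single injected message reaches every node on the route.

```python
# Hedged sketch: a "class route" as the set of nodes that should receive
# a broadcast -- here, one row of a grid of compute nodes.
def class_route_row(grid_rows, grid_cols, row):
    return [(row, c) for c in range(grid_cols)]

def broadcast(route, message, inboxes):
    # With hardware class routing, ONE injected message is delivered to
    # every node on the route; without it, the sender would have to
    # inject len(route) separate point-to-point messages.
    for node in route:
        inboxes.setdefault(node, []).append(message)
    return 1  # messages injected by the sender

inboxes = {}
route = class_route_row(4, 4, row=2)
sent = broadcast(route, "m", inboxes)
print(sent, len(route))  # 1 message delivered to 4 nodes
```

The same pattern is what makes the dense-matrix-inversion application fast: the row and column broadcasts of the algorithm map directly onto hardware class functions.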
A survey on the design of multiprocessing systems for artificial intelligence applications
NASA Technical Reports Server (NTRS)
Wah, Benjamin W.; Li, Guo Jie
1989-01-01
Some issues in designing computers for artificial intelligence (AI) processing are discussed. These issues are divided into three levels: the representation level, the control level, and the processor level. The representation level deals with the knowledge and methods used to solve the problem and the means to represent it. The control level is concerned with the detection of dependencies and parallelism in the algorithmic and program representations of the problem, and with the synchronization and scheduling of concurrent tasks. The processor level addresses the hardware and architectural components needed to evaluate the algorithmic and program representations. Solutions for the problems of each level are illustrated by a number of representative systems. Design decisions in existing projects on AI computers are classified into top-down, bottom-up, and middle-out approaches.
Integrated modeling of advanced optical systems
NASA Astrophysics Data System (ADS)
Briggs, Hugh C.; Needels, Laura; Levine, B. Martin
1993-02-01
This poster session paper describes an integrated modeling and analysis capability being developed at JPL under funding provided by the JPL Director's Discretionary Fund and the JPL Control/Structure Interaction Program (CSI). The posters briefly summarize the program capabilities and illustrate them with an example problem. The computer programs developed under this effort will provide an unprecedented capability for integrated modeling and design of high performance optical spacecraft. The engineering disciplines supported include structural dynamics, controls, optics and thermodynamics. Such tools are needed in order to evaluate the end-to-end system performance of spacecraft such as OSI, POINTS, and SMMM. This paper illustrates the proof-of-concept tools that have been developed to establish the technology requirements and demonstrate the new features of integrated modeling and design. The current program also includes implementation of a prototype tool based upon the CAESY environment being developed under the NASA Guidance and Control Research and Technology Computational Controls Program. This prototype will be available late in FY-92. The development plan proposes a major software production effort to fabricate, deliver, support and maintain a national-class tool from FY-93 through FY-95.
In-Class Robot Flyby of an Endoplanet
NASA Astrophysics Data System (ADS)
Chadwick, A. J.; Capaldi, T.; Aurnou, J. M.
2013-12-01
For our Introduction to Computing class, we have developed a miniature robotic spacecraft mission that performs a flyby of an in-class 'endoplanet.' Our constructed endoplanet contains an internal dipole magnet, tilted with a dip angle that is unknown a priori. The spacecraft analog is a remotely controlled LEGO MINDSTORMS robot programmed using LabVIEW. Students acquire magnetic field data via a first spacecraft flyby past the endoplanet. This dataset is then imported into MATLAB, and is inverted to create a model of the magnet's orientation and dipole moment. Students use their models to predict the magnetic field profile along a different flyby path. They then test the accuracy of their models, comparing their predictions against the data acquired from this secondary flyby. We will be demonstrating this device at our poster in the Moscone Center.
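Because the field of a point dipole is linear in its moment, the inversion step the students perform can be sketched as a linear least-squares fit. The code below is a generic illustration with unit physical constants and invented flyby geometry, not the course's LabVIEW/MATLAB implementation.

```python
import numpy as np

# Point-dipole field model (prefactor mu0/4pi set to 1 for illustration).
def dipole_field(m, r):
    """Field of dipole moment m at position r."""
    rn = np.linalg.norm(r)
    rhat = r / rn
    return (3 * np.dot(m, rhat) * rhat - m) / rn**3

# Simulated "flyby" samples along a straight path past the magnet
m_true = np.array([0.0, 0.3, 1.0])        # tilted dipole, unknown a priori
path = [np.array([2.0, y, 0.5]) for y in np.linspace(-3, 3, 7)]
B = np.array([dipole_field(m_true, r) for r in path])

# B is linear in m, so recovering m is a linear least-squares problem:
# the design matrix stacks the unit-moment responses at each sample point.
basis = np.eye(3)
A = np.vstack([np.column_stack([dipole_field(e, r) for e in basis]) for r in path])
m_est, *_ = np.linalg.lstsq(A, B.ravel(), rcond=None)
print(np.allclose(m_est, m_true))  # True
```

With the model in hand, predicting the field profile along a second flyby path (as the students do) is just a forward evaluation of `dipole_field` at the new sample points.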
Execution models for mapping programs onto distributed memory parallel computers
NASA Technical Reports Server (NTRS)
Sussman, Alan
1992-01-01
The problem of exploiting the parallelism available in a program to efficiently employ the resources of the target machine is addressed. The problem is discussed in the context of building a mapping compiler for a distributed memory parallel machine. The paper describes using execution models to drive the process of mapping a program in the most efficient way onto a particular machine. Through analysis of the execution models for several mapping techniques for one class of programs, we show that the selection of the best technique for a particular program instance can make a significant difference in performance. On the other hand, the results of benchmarks from an implementation of a mapping compiler show that our execution models are accurate enough to select the best mapping technique for a given program.
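The selection step can be illustrated with invented cost models: each execution model predicts a runtime from a few program and machine parameters, and the mapping compiler picks the technique with the lowest prediction. The formulas below are toy models for illustration, not those of the paper.

```python
# Hypothetical execution models for two mapping techniques.
def block_model(n, p, comm_cost):
    # contiguous blocks: compute share plus a few boundary messages
    return n / p + comm_cost * 2

def cyclic_model(n, p, comm_cost):
    # round-robin distribution: roughly one message per local element
    return n / p + comm_cost * (n / p)

def choose_mapping(n, p, comm_cost):
    models = {"block": block_model, "cyclic": cyclic_model}
    return min(models, key=lambda name: models[name](n, p, comm_cost))

print(choose_mapping(n=10_000, p=16, comm_cost=5.0))  # block
```

As the paper's benchmarks suggest, even rough models like these can rank techniques correctly for a given program instance, which is all the compiler needs.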
From Turing machines to computer viruses.
Marion, Jean-Yves
2012-07-28
Self-replication is one of the fundamental aspects of computing, where a program or a system may duplicate, evolve and mutate. Our point of view is that Kleene's (second) recursion theorem is essential to understand self-replication mechanisms. An interesting example of self-replicating code is given by computer viruses. This was initially explained in the seminal works of Cohen and of Adleman in the 1980s. In fact, the different variants of recursion theorems provide and explain constructions of self-replicating codes and, as a result, of various classes of malware. None of the results are new from the point of view of computability theory. We now propose a self-modifying register machine as a model of computation in which we can effectively deal with self-reproduction and in which new offspring can be activated as independent organisms.
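The recursion-theorem construction the paper builds on is, in its benign form, a quine: a program that obtains and reuses its own source text. A minimal Python sketch (our example, not from the paper):

```python
# A format string that, applied to itself, yields its own complete source:
src = 'src = %r\nprogram = src %% src'
program = src % src   # `program` now contains its own full source text

# Executing the reproduced source regenerates an identical copy,
# the benign core of the self-replication mechanism.
scope = {}
exec(program, scope)
print(scope["program"] == program)  # True
```

Viral constructions differ only in what the program does with its recovered source (copy it into other files, mutate it) rather than in how it obtains it.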
Center for Technology for Advanced Scientific Component Software (TASCS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kostadin, Damevski
A resounding success of the Scientific Discovery through Advanced Computing (SciDAC) program is that high-performance computational science is now universally recognized as a critical aspect of scientific discovery [71], complementing both theoretical and experimental research. As scientific communities prepare to exploit unprecedented computing capabilities of emerging leadership-class machines for multi-model simulations at the extreme scale [72], it is more important than ever to address the technical and social challenges of geographically distributed teams that combine expertise in domain science, applied mathematics, and computer science to build robust and flexible codes that can incorporate changes over time. The Center for Technology for Advanced Scientific Component Software (TASCS) tackles these issues by exploiting component-based software development to facilitate collaborative high-performance scientific computing.
A new computer approach to mixed feature classification for forestry application
NASA Technical Reports Server (NTRS)
Kan, E. P.
1976-01-01
A computer approach for mapping mixed forest features (i.e., types, classes) from computer classification maps is discussed. Mixed features such as mixed softwood/hardwood stands are treated as admixtures of softwood and hardwood areas. Large-area mixed features are identified and small-area features neglected when the nominal size of a mixed feature can be specified. The computer program merges small isolated areas into surrounding areas by the iterative manipulation of the postprocessing algorithm that eliminates small connected sets. For a forestry application, computer-classified LANDSAT multispectral scanner data of the Sam Houston National Forest were used to demonstrate the proposed approach. The technique was successful in cleaning the salt-and-pepper appearance of multiclass classification maps and in mapping admixtures of softwood areas and hardwood areas. However, the computer-mapped mixed areas matched very poorly with the ground truth because of inadequate resolution and inappropriate definition of mixed features.
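The postprocessing idea (eliminating small connected sets by merging them into the surrounding class) can be sketched with a simple flood fill. This is an illustrative reimplementation of the general technique, not the original program:

```python
# Find the 4-connected set of same-class cells containing `start`.
def connected_component(grid, start):
    cls, stack, seen = grid[start[0]][start[1]], [start], set()
    while stack:
        r, c = stack.pop()
        if (r, c) in seen or not (0 <= r < len(grid) and 0 <= c < len(grid[0])):
            continue
        if grid[r][c] != cls:
            continue
        seen.add((r, c))
        stack += [(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)]
    return seen

# Merge any connected set smaller than min_size into an adjacent class.
def merge_small_sets(grid, min_size):
    for r in range(len(grid)):
        for c in range(len(grid[0])):
            comp = connected_component(grid, (r, c))
            if len(comp) < min_size:
                rr, cc = next(iter(comp))
                for nr, nc in [(rr+1, cc), (rr-1, cc), (rr, cc+1), (rr, cc-1)]:
                    if 0 <= nr < len(grid) and 0 <= nc < len(grid[0]):
                        new_cls = grid[nr][nc]
                        break
                for pr, pc in comp:
                    grid[pr][pc] = new_cls
    return grid

noisy = [[1, 1, 1],
         [1, 2, 1],
         [1, 1, 1]]
print(merge_small_sets(noisy, min_size=2))  # isolated 2 merged into 1
```

Iterating this until no small set remains is what cleans the "salt-and-pepper" appearance of per-pixel classification maps.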
Computational Workbench for Multibody Dynamics
NASA Technical Reports Server (NTRS)
Edmonds, Karina
2007-01-01
PyCraft is a computer program that provides an interactive, workbench-like computing environment for developing and testing algorithms for multibody dynamics. Examples of multibody dynamic systems amenable to analysis with the help of PyCraft include land vehicles, spacecraft, robots, and molecular models. PyCraft is based on the Spatial-Operator-Algebra (SOA) formulation for multibody dynamics. The SOA operators enable construction of simple and compact representations of complex multibody dynamical equations. Within the PyCraft computational workbench, users can, essentially, use the high-level SOA operator notation to represent the variety of dynamical quantities and algorithms and to perform computations interactively. PyCraft provides a Python-language interface to underlying C++ code. Working with SOA concepts, a user can create and manipulate Python-level operator classes in order to implement and evaluate new dynamical quantities and algorithms. During use of PyCraft, virtually all SOA-based algorithms are available for computational experiments.
NASA Technical Reports Server (NTRS)
Fijany, Amir; Toomarian, Benny N.
2000-01-01
There has been significant improvement in the performance of VLSI devices, in terms of size, power consumption, and speed, in recent years, and this trend may continue for the near future. However, it is a well-known fact that there are major obstacles, i.e., the physical limitation of feature-size reduction and the ever-increasing cost of foundries, that would prevent the long-term continuation of this trend. This has motivated the exploration of some fundamentally new technologies that are not dependent on the conventional feature-size approach. Such technologies are expected to enable scaling to continue to the ultimate level, i.e., molecular and atomistic size. Quantum computing, quantum-dot-based computing, DNA-based computing, biologically inspired computing, etc., are examples of such new technologies. In particular, quantum-dot-based computing using Quantum-dot Cellular Automata (QCA) has recently been intensely investigated as a promising new technology capable of offering significant improvement over conventional VLSI in terms of reduction of feature size (and hence increase in integration level), reduction of power consumption, and increase of switching speed. Quantum-dot-based computing and memory in general, and QCA specifically, are intriguing to NASA due to their high packing density (10(exp 11) - 10(exp 12) per square cm), low power consumption (no transfer of current), and potentially higher radiation tolerance. Under the Revolutionary Computing Technology (RCT) Program at the NASA/JPL Center for Integrated Space Microelectronics (CISM), we have been investigating the potential applications of QCA for the space program. To this end, exploiting the intrinsic features of QCA, we have designed novel QCA-based circuits for co-planar (i.e., single-layer) and compact implementation of a class of data permutation matrices, a class of interconnection networks, and a bit-serial processor.
Building upon these circuits, we have developed novel algorithms and QCA-based architectures for highly parallel and systolic computation of signal/image processing applications, such as the FFT and the Wavelet and Walsh-Hadamard Transforms.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sadayappan, Ponnuswamy
Exascale computing systems will provide a thousand-fold increase in parallelism and a proportional increase in failure rate relative to today's machines. Systems software for exascale machines must provide the infrastructure to support existing applications while simultaneously enabling efficient execution of new programming models that naturally express dynamic, adaptive, irregular computation; coupled simulations; and massive data analysis in a highly unreliable hardware environment with billions of threads of execution. We propose a new approach to the data and work distribution model provided by system software based on the unifying formalism of an abstract file system. The proposed hierarchical data model provides simple, familiar visibility and access to data structures through the file system hierarchy, while providing fault tolerance through selective redundancy. The hierarchical task model features work queues whose form and organization are represented as file system objects. Data and work are both first class entities. By exposing the relationships between data and work to the runtime system, information is available to optimize execution time and provide fault tolerance. The data distribution scheme provides replication (where desirable and possible) for fault tolerance and efficiency, and it is hierarchical to make it possible to take advantage of locality. The user, tools, and applications, including legacy applications, can interface with the data, work queues, and one another through the abstract file model. This runtime environment will provide multiple interfaces to support traditional Message Passing Interface applications, languages developed under DARPA's High Productivity Computing Systems program, as well as other, experimental programming models. We will validate our runtime system with pilot codes on existing platforms and will use simulation to validate for exascale-class platforms.
In this final report, we summarize research results from the work done at the Ohio State University towards the larger goals of the project listed above.
NASA Astrophysics Data System (ADS)
Daniluk, Andrzej
2011-06-01
A computational model is a computer program which attempts to simulate an abstract model of a particular system. Computational models involve enormous numbers of calculations and often require supercomputer speed. As personal computers become more and more powerful, more laboratory experiments can be converted into computer models that can be interactively examined by scientists and students without the risk and cost of the actual experiments. The future of programming is concurrent programming. The threaded programming model provides application programmers with a useful abstraction of concurrent execution of multiple tasks. The objective of this release is to address the design of an architecture for a scientific application which may execute as multiple threads, as well as implementations of the related shared data structures.

New version program summary

Program title: GrowthCP
Catalogue identifier: ADVL_v4_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADVL_v4_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 32 269
No. of bytes in distributed program, including test data, etc.: 8 234 229
Distribution format: tar.gz
Programming language: Free Object Pascal
Computer: multi-core x64-based PC
Operating system: Windows XP, Vista, 7
Has the code been vectorised or parallelized?: No
RAM: More than 1 GB. The program requires a 32-bit or 64-bit processor to run the generated code. Memory is addressed using 32-bit (on 32-bit processors) or 64-bit (on 64-bit processors with 64-bit addressing) pointers. The amount of addressed memory is limited only by the available amount of virtual memory.
Supplementary material: The figures mentioned in the "Summary of revisions" section can be obtained here.
Classification: 4.3, 7.2, 6.2, 8, 14
External routines: Lazarus [1]
Catalogue identifier of previous version: ADVL_v3_0
Journal reference of previous version: Comput. Phys. Comm. 181 (2010) 709
Does the new version supersede the previous version?: Yes
Nature of problem: Reflection high-energy electron diffraction (RHEED) is an important in-situ analysis technique, which is capable of giving quantitative information about the growth process of thin layers and its control. It can be used to calibrate the growth rate, analyze surface morphology, calibrate surface temperature, monitor the arrangement of the surface atoms, and provide information about growth kinetics. Such control allows the development of structures where the electrons can be confined in space, giving quantum wells or even quantum dots. In order to determine the positions of the atoms in the first few layers, the RHEED intensity must be measured as a function of the scattering angles and then compared with dynamic calculations. The objective of this release is to address the design of an architecture for an application that simulates the rocking-curve RHEED intensities during the hetero-epitaxial growth of thin films.
Solution method: GrowthCP is a complex numerical model that uses multiple threads for the simulation of epitaxial growth of thin layers. This model consists of two transactional parts. The first part is a mathematical model based on the Runge-Kutta method with adaptive step-size control. The second part implements a first-principles one-dimensional RHEED computational model, based on solving a one-dimensional Schrödinger equation. Several problems can arise when applications contain a mixture of data-access code, numerical code, and presentation code. Such applications are difficult to maintain, because interdependencies between all the components cause strong ripple effects whenever a change is made anywhere.
Adding new data views often requires reimplementing the numerical code, which then requires maintenance in multiple places. In order to solve problems of this type, the computational and threading layers of the project have been implemented in the form of one design pattern, as part of a Model-View-Controller architecture.
Reasons for new version: Responding to users' feedback, the Growth09 project has been upgraded to a standard that allows the carrying out of sample computations of the RHEED intensities for a disordered surface for a wide range of single- and epitaxial hetero-structures. The design pattern on which the project is based has also been improved. It is shown that this model can be effectively used for multithreaded growth simulations of thin epitaxial layers and the corresponding RHEED intensities for a wide range of single- and hetero-structures. Responding to users' feedback, the present release has been implemented using a well-documented free compiler [1], not requiring special configuration or the installation of additional libraries.
Summary of revisions: The logical structure of the Growth09 program has been modified according to the scheme shown in Fig. 1. The class diagram in Fig. 1 is a static view of the main platform-specific elements of the GrowthCP architecture. Fig. 2 provides a dynamic view by showing a simplified creation-and-destruction sequence diagram for the process. The program requires the user to provide the appropriate parameters in the form of a knowledge base for the crystal structures under investigation. These parameters are loaded from the parameters.ini files at run-time. Instructions to prepare the .ini files can be found in the new distribution.
The program enables carrying out different growth models and one-dimensional dynamical RHEED calculations for the fcc lattice with a three-atom basis, the fcc lattice with a two-atom basis, the fcc lattice with a single-atom basis, and the Zinc-Blende, Sodium Chloride, and Wurtzite crystalline structures and hetero-structures; however, the Fourier component of the scattering potential in the TRHEEDCalculations.crystPotUgXXX() procedure can be modified and implemented according to users' specific application requirements. The Fourier component of the scattering potential of the whole crystalline hetero-structure can be determined as a sum of contributions coming from all thin slices of the individual atomic layers. To carry out one-dimensional calculations of the scattering potentials, the program uses properly constructed self-consistent procedures. Each component of the system shown in Figs. 1 and 2 is fully extendable and can easily be adapted to new requirements. Two essential logical elements of the system, the TGrowthTransaction and TRHEEDCalculations classes, were designed and implemented so that they can exchange information directly, without the need for intermediate data-exchange files. In consequence, each of them can be independently modified and/or extended. Implementing other types of differential equations, or a different algorithm for solving them, in the TGrowthTransaction class does not require another implementation of the TRHEEDCalculations class. Similarly, implementing other forms of the scattering potential, or a different algorithm for the RHEED calculation, does not affect the construction of the TGrowthTransaction class.
Unusual features: The program is distributed in the form of the main project GrowthCP.lpr, with associated files, and should be compiled using the Lazarus IDE. The program should be compiled with English/USA regional and language options.
Running time: The typical running time is machine- and user-parameter dependent.
References: http://sourceforge.net/projects/lazarus/files/.
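The first transactional part named in the solution method, a Runge-Kutta integrator with adaptive step-size control, can be sketched generically. The step-doubling error control below is a standard textbook variant, not the GrowthCP source (which is Object Pascal; Python is used here for brevity):

```python
# Classic fourth-order Runge-Kutta step for y' = f(t, y).
def rk4_step(f, t, y, h):
    k1 = f(t, y)
    k2 = f(t + h/2, y + h/2 * k1)
    k3 = f(t + h/2, y + h/2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h/6 * (k1 + 2*k2 + 2*k3 + k4)

# Adaptive control by step doubling: compare one full step against two
# half steps; halve h until the discrepancy is within tolerance.
def adaptive_step(f, t, y, h, tol=1e-8):
    while True:
        full = rk4_step(f, t, y, h)
        half = rk4_step(f, t + h/2, rk4_step(f, t, y, h/2), h/2)
        if abs(half - full) <= tol:
            return t + h, half, h
        h /= 2  # estimated error too large: retry with a smaller step

f = lambda t, y: -y           # dy/dt = -y, exact solution exp(-t)
t, y, h = 0.0, 1.0, 0.5
while t < 1.0:
    t, y, h = adaptive_step(f, t, y, min(h, 1.0 - t))
print(abs(y - 0.36787944117144233) < 1e-6)  # True: y ~ exp(-1)
```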
ERIC Educational Resources Information Center
Riskin, Steve R.
This paper discusses the results of an experimental, non-traditional university class in sociology in which students produced an interactive multimedia module in a social science subject area using a computer system that allowed instant access to film, sound, television, images, and text. There were no constraints on the selection of media, or the…
ERIC Educational Resources Information Center
Berney, Tomi D.; Keyes, Jose
The Computer Writing Skills for Limited English Proficient Students Project (COMPUGRAFIA.LEP) was partially implemented in 1987-88, during the first year of a 3-year cycle. It is a staff development program serving 35 bilingual special education classes with 414 limited-English-proficient Hispanic students in 10 elementary schools in the Bronx.…
Psychological Sciences Division 1979 Programs.
1979-11-01
the potential for substantially improving the effectiveness of Navy undersea manipulator systems. Computer-aided controls can be used to … OPERATOR VIEWING AND CONTROL OF UNDERSEA VEHICLE AND WORK SYSTEMS … (the operator enters control orders) to determine the structure and mode of command inputs … efforts focus upon a class of control-display elements common to general-purpose undersea work systems. Current Navy submersible work systems, such as CURV …
Jordan, Denis; Steiner, Marcel; Kochs, Eberhard F; Schneider, Gerhard
2010-12-01
Prediction probability (P(K)) and the area under the receiver operating characteristic curve (AUC) are statistical measures used to assess the performance of anesthetic depth indicators, i.e., to quantify the correlation between observed anesthetic depth and the corresponding values of a monitor or indicator. In contrast to many other statistical tests, they offer several advantages. First, P(K) and AUC are independent of scale units and of assumptions about underlying distributions. Second, the calculation can be performed without any knowledge of particular indicator threshold values, which makes the test more independent of the specific test data. Third, recent approaches using resampling methods allow a reliable comparison of the P(K) or AUC of different indicators of anesthetic depth. Furthermore, both tests allow simple interpretation: results between 0 and 1 reflect the probability that an indicator correctly separates the observed levels of anesthesia. For these reasons, P(K) and AUC have become popular in medical decision making. P(K) is intended for polytomous patient states (i.e., >2 anesthetic levels) and can be considered a generalization of the AUC, which was originally introduced to assess a predictor of dichotomous classes (e.g., consciousness and unconsciousness in anesthesia). Dichotomous paradigms yield equal values of the P(K) and AUC test statistics. In the present investigation, we introduce a user-friendly computer program for computing P(K) and estimating reliable bootstrap confidence intervals. It is designed for multiple comparisons of the performance of depth-of-anesthesia indicators. Additionally, for dichotomous classes, the program plots the receiver operating characteristic graph, complementing the information obtained from P(K) or AUC. In clinical investigations, both measures are applied for indicator assessment, where ambiguous usage and interpretation can result. 
Therefore, a summary of the concepts of P(K) and AUC, including a brief and easily understandable proof of their equality, is presented in the text. The exposition introduces readers to the algorithms of the provided computer program and is intended to make standardized performance tests of depth-of-anesthesia indicators available to medical researchers.
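For the dichotomous case the abstract describes, where P(K) reduces to the AUC, the core computation can be sketched as below. This is a minimal illustration of the concepts, not the authors' program: the AUC is estimated from the Mann-Whitney statistic, and a stratified bootstrap (resampling each class separately) yields a percentile confidence interval.

```python
import random

def auc(neg, pos):
    """AUC = P(pos > neg) + 0.5 * P(tie): a Mann-Whitney estimate over all pairs."""
    wins = ties = 0
    for n in neg:
        for p in pos:
            if p > n:
                wins += 1
            elif p == n:
                ties += 1
    return (wins + 0.5 * ties) / (len(neg) * len(pos))

def bootstrap_ci(neg, pos, n_boot=1000, alpha=0.05, seed=0):
    """Percentile bootstrap CI for the AUC, resampling within each class."""
    rng = random.Random(seed)
    stats = sorted(
        auc([rng.choice(neg) for _ in neg], [rng.choice(pos) for _ in pos])
        for _ in range(n_boot)
    )
    return stats[int(n_boot * alpha / 2)], stats[int(n_boot * (1 - alpha / 2)) - 1]
```

A perfectly separating indicator gives `auc(...) == 1.0`, while fully overlapping classes give 0.5, matching the stated interpretation of values between 0 and 1 as a separation probability.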
A High School Level Course On Robot Design And Construction
NASA Astrophysics Data System (ADS)
Sadler, Paul M.; Crandall, Jack L.
1984-02-01
The Robotics Design and Construction Class at Sehome High School was developed to offer gifted and/or highly motivated students an in-depth introduction to a modern engineering topic. The course includes instruction in basic electronics, digital and radio electronics, construction skills, robotics literacy, construction of the HERO 1 Heathkit Robot, computer/robot programming, and voice synthesis. A key element in the success of the course is the involvement of various community assets, including manpower and financial assistance. The instructors included a physics/electronics teacher, a computer science teacher, two retired engineers, and an electronics technician.
StrateGene: object-oriented programming in molecular biology.
Carhart, R E; Cash, H D; Moore, J F
1988-03-01
This paper describes some of the ways that object-oriented programming methodologies have been used to represent and manipulate biological information in a working application. When running on a Xerox 1100 series computer, StrateGene functions as a genetic engineering workstation for the management of information about cloning experiments. It represents biological molecules, enzymes, fragments, and methods as classes, subclasses, and members in a hierarchy of objects. These objects may have various attributes, which themselves can be defined and classified. The attributes and their values can be passed from the classes of objects down to the subclasses and members. The user can modify the objects and their attributes while using them. New knowledge and changes to the system can be incorporated relatively easily. The operations on the biological objects are associated with the objects themselves. This makes it easier to invoke them correctly and allows generic operations to be customized for the particular object.
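The pattern the StrateGene abstract describes, where attributes defined on a class are passed down to subclasses and members, and operations live on the objects themselves so generic behavior can be specialized, maps naturally onto modern object systems. The sketch below is a hypothetical illustration of that pattern (the class names and data are not StrateGene's, which ran in a Xerox Lisp environment).

```python
class Molecule:
    """A biological molecule; attributes defined here pass down to subclasses."""
    alphabet = set("ACGT")  # class attribute inherited by subclasses and members

    def __init__(self, sequence):
        # validation uses whichever alphabet the object's class provides
        assert set(sequence) <= self.alphabet, "invalid residue for this class"
        self.sequence = sequence

    def length(self):
        return len(self.sequence)

class RNA(Molecule):
    alphabet = set("ACGU")  # subclass overrides the inherited attribute

class Enzyme:
    """Operations are associated with the objects themselves."""
    def __init__(self, name, site):
        self.name, self.site = name, site

    def cut(self, molecule):
        # a generic operation, customized by the data carried on this enzyme
        i = molecule.sequence.find(self.site)
        if i < 0:
            return [molecule.sequence]
        return [molecule.sequence[:i], molecule.sequence[i:]]
```

Invoking `cut` on an enzyme object keeps the operation next to the data that parameterizes it, so it is "easier to invoke correctly" in the sense the abstract gives.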
Using a Virtual Class to Demonstrate Computer-Mediated Group Dynamics Concepts
ERIC Educational Resources Information Center
Franz, Timothy M.; Vicker, Lauren A.
2010-01-01
We report on an active learning demonstration designed to use a virtual class to present computer-mediated group communication course concepts, showing that students can learn about these concepts in a virtual class. We designated 1 class period as a virtual rather than face-to-face class, when class members "attended" virtually using…
NASA Astrophysics Data System (ADS)
Runge, Alan Paul
1997-10-01
A traditional undergraduate physics course on mathematical methods has been redesigned to incorporate the use of Maple™, a computer algebra program, during all aspects of the course. Topics covered were: complex number theory; series approximations; matrix theory; partial differentiation; vector algebra; and vector calculus. Five undergraduate students were enrolled, from sophomore to senior in academic class standing. A qualitative case study methodology was used to describe the changes in the course design resulting from the incorporation of Maple™ and their impact on the instruction of the course, and to determine the effects on the students' learning and development of problem solving skills in physics using Maple™ as a problem solving tool. The impact of using Maple™ on the number and types of interactions is presented. The entire semester-long course was included in this study. Each class session is described in detail. Examples of the Maple™ materials used are given. The use of the Maple™ program was allowed on all homework and exams, with each student having their own computer during class. Constraints were imposed so that the assessment emphasis remained on the mathematics and the conceptual understanding of the problem solving methods. All of the students demonstrated some level of proficiency in using Maple™ to solve the assigned problems. Strategies for effectively using Maple™ were presented and were individualized by the students. The students reported positive and negative impacts of using Maple™. All of the students satisfactorily completed the course requirements, receiving final course grades from B to A+. All of them continued to voluntarily use Maple™ during the following semester. 
Instructional methods used included various lecture techniques without Maple™ assistance, lectures and demonstrations using only Maple™, and student tasks assigned in class worked with the aid of Maple™. Maple™ was used in at least one of these ways in all but 3 of the 45 class periods. The use of Maple™ constituted about half of the overall class time.
Sociocultural Influences On Undergraduate Women's Entry into a Computer Science Major
NASA Astrophysics Data System (ADS)
Lyon, Louise Ann
Computer science not only displays the pattern of underrepresentation of many other science, technology, engineering, and math (STEM) fields, but has actually experienced a decline in the number of women choosing the field over the past two decades. Broken out by gender and race, the picture becomes more nuanced, with the ratio of females to males receiving bachelor's degrees in computer science higher for non-White ethnic groups than for Whites. This dissertation explores the experiences of university women differing along the axes of race, class, and culture who are considering majoring in computer science in order to highlight how well-prepared women are persuaded that they belong (or not) in the field and how the confluence of social categories plays out in their decision. This study focuses on a university seminar entitled "Women in Computer Science and Engineering" open to women concurrently enrolled in introductory programming and uses an ethnographic approach including classroom participant observation, interviews with seminar students and instructors, observations of students in other classes, and interviews with parents of students. Three stand-alone but related articles explore various aspects of the experiences of women who participated in the study, using Rom Harre's positioning theory as a theoretical framework. The first article uses data from twenty-two interviews to uncover how interactions with others and patterns in society position women in relation to a computer science major, and how these women have arrived at the point of considering the major despite messages that they do not belong. The second article more deeply explores the cases of three women who vary greatly along the axes of race, class, and culture in order to uncover pattern and interaction differences for women based on their ethnic background. 
The final article focuses on the attitudes and expectations of the mothers of three students of contrasting ethnicities and how reported interactions between mothers and daughters either constrain or afford opportunities for the daughters to choose a computer science major.
NASA Astrophysics Data System (ADS)
Andersson, David L.
The field of Computer Information Systems (CIS) or Information Technology (IT) is experiencing rapid change. A 2003 study analyzing the IT degree programs and those of competing disciplines at 10 post-secondary institutions concluded that information technology programs are perceived differently from information systems and computer science programs and are significantly less focused on both math and pure science subjects. In Information Technology programs, voluntary professional certifications, generally known in the Information Technology field as "IT" certifications, are used as indicators of professional skill. A descriptive study noting one subject group's responses to items that were nearly identical except for IT certification information was done to investigate undergraduate CIS/IT student perceptions of IT industry certified instructors. The subject group comprised undergraduate CIS/IT students from a regionally accredited private institution and a public institution. The methodology was descriptive, based on a previous model by Dr. McKillip, Professor of Psychology, Southern Illinois University at Carbondale, utilizing a web-based survey instrument with a Likert scale and providing for voluntary anonymous responses outside the classroom over a ten-day window. The results indicated that IT certification affected student perceptions of instructor effectiveness, teaching methodology, and student engagement in the class, and to a lesser degree, instructor technical qualifications. The implications suggest that additional research on this topic is merited. Although the study was not designed to examine the precise cause and effect, an important implication is that students may be motivated to attend classes taught by instructors they view as more confident and effective and that teachers with IT industry certification can better engage their students.
RIP-REMOTE INTERACTIVE PARTICLE-TRACER
NASA Technical Reports Server (NTRS)
Rogers, S. E.
1994-01-01
Remote Interactive Particle-tracing (RIP) is a distributed-graphics program which computes particle traces for computational fluid dynamics (CFD) solution data sets. A particle trace is a line which shows the path a massless particle in a fluid will take; it is a visual image of where the fluid is going. The program is able to compute and display particle traces at a speed of about one trace per second because it runs on two machines concurrently. The data used by the program is contained in two files. The solution file contains data on density, momentum and energy quantities of a flow field at discrete points in three-dimensional space, while the grid file contains the physical coordinates of each of the discrete points. RIP requires two computers. A local graphics workstation interfaces with the user for program control and graphics manipulation, and a remote machine interfaces with the solution data set and performs time-intensive computations. The program utilizes two machines in a distributed mode for two reasons. First, the data to be used by the program is usually generated on the supercomputer. RIP avoids having to convert and transfer the data, eliminating any memory limitations of the local machine. Second, as computing the particle traces can be computationally expensive, RIP utilizes the power of the supercomputer for this task. Although the remote site code was developed on a CRAY, it is possible to port this to any supercomputer class machine with a UNIX-like operating system. Integration of a velocity field from a starting physical location produces the particle trace. The remote machine computes the particle traces using the particle-tracing subroutines from PLOT3D/AMES, a CFD post-processing graphics program available from COSMIC (ARC-12779). These routines use a second-order predictor-corrector method to integrate the velocity field. Then the remote program sends graphics tokens to the local machine via a remote-graphics library. 
The local machine interprets the graphics tokens and draws the particle traces. The program is menu driven. RIP is implemented on the Silicon Graphics IRIS 3000 (local workstation) with the IRIX operating system and on the CRAY-2 (remote station) with the UNICOS 1.0 or 2.0 operating system. The IRIS 4D can be used in place of the IRIS 3000. The program is written in C (67%) and FORTRAN 77 (43%) and has an IRIS memory requirement of 4 MB. The remote and local stations must use the same user ID. PLOT3D/AMES unformatted data sets are required for the remote machine. The program was developed in 1988.
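The trace computation described above, integrating a velocity field with a second-order predictor-corrector method, can be sketched in a few lines. This is an illustrative stand-in, not the PLOT3D/AMES routines: a simple analytic rotation field replaces the gridded CFD solution, and a Heun (predictor-corrector) step advances the particle.

```python
def velocity(x, y):
    # stand-in for interpolating the CFD velocity field at (x, y):
    # solid-body rotation about the origin, so traces should stay on circles
    return -y, x

def trace(x, y, dt, n_steps):
    """Integrate one particle trace with a second-order predictor-corrector step."""
    path = [(x, y)]
    for _ in range(n_steps):
        # predictor: take a trial Euler step with the slope at the current point
        u0, v0 = velocity(x, y)
        xp, yp = x + dt * u0, y + dt * v0
        # corrector: re-evaluate the slope at the predicted point and average
        u1, v1 = velocity(xp, yp)
        x += 0.5 * dt * (u0 + u1)
        y += 0.5 * dt * (v0 + v1)
        path.append((x, y))
    return path
```

On the rotation field, a trace started at (1, 0) stays on the unit circle to second order in the step size, which is the accuracy property that makes this integrator attractive for particle tracing.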
Computer Synthesis Approaches of Hyperboloid Gear Drives with Linear Contact
NASA Astrophysics Data System (ADS)
Abadjiev, Valentin; Kawasaki, Haruhisa
2014-09-01
Computer-aided design has produced different types of software for scientific research in the field of gearing theory, as well as adequate scientific support for gear drive manufacture. The computer programs presented here are based on mathematical models resulting from that research. Modern gear transmissions require the construction of new mathematical approaches to their geometric, technological, and strength analysis. The process of optimization, synthesis, and design is based on adequate iteration procedures to find an optimal solution by varying definite parameters. The study is dedicated to the accepted methodology in the creation of software for the synthesis of a class of high-reduction hyperboloid gears: Spiroid and Helicon gears (Spiroid and Helicon are trademarks registered by the Illinois Tool Works, Chicago, Ill.). The developed basic computer products are based on original mathematical models: two mathematical models for the synthesis, "upon a pitch contact point" and "upon a mesh region". Computer programs are worked out on the basis of the described mathematical models, and the relations between them are shown. The application of the described approaches to the synthesis of the gear drives under consideration is illustrated.
The GI Project: a prototype electronic textbook for high school biology.
Calhoun, P S; Fishman, E K
1997-01-01
A prototype electronic science textbook for secondary education was developed to help bridge the gap between state-of-the-art medical technology and the basic science classroom. The prototype combines the latest in radiologic imaging techniques with a user-friendly multimedia computer program to teach the anatomy, physiology, and diseases of the gastrointestinal (GI) tract. The program includes original text, illustrations, photographs, animations, images from upper GI studies, plain radiographs, computed tomographic images, and three-dimensional reconstructions. These features are intended to create a stimulus-rich environment in which the high school science student can enjoy a variety of interactive experiences that will facilitate the learning process. The computer-based book is a new educational tool that promises to play a prominent role in the coming years. Although it is not yet clear what form textbooks will take in the future, computer-based books are already proving valuable as an alternative educational medium. For beginning students, they reinforce the material found in traditional textbooks and class presentations; for advanced students, they provide motivation to learn outside the traditional classroom.
Flutter analysis of swept-wing subsonic aircraft with parameter studies of composite wings
NASA Technical Reports Server (NTRS)
Housner, J. M.; Stein, M.
1974-01-01
A computer program is presented for the flutter analysis, including the effects of rigid-body roll, pitch, and plunge of swept-wing subsonic aircraft with a flexible fuselage and engines mounted on flexible pylons. The program utilizes a direct flutter solution in which the flutter determinant is derived by using finite differences, and the root locus branches of the determinant are searched for the lowest flutter speed. In addition, a preprocessing subroutine is included which evaluates the variable bending and twisting stiffness properties of the wing by using a laminated, balanced ply, filamentary composite plate theory. The program has been substantiated by comparisons with existing flutter solutions. The program has been applied to parameter studies which examine the effect of filament orientation upon the flutter behavior of wings belonging to the following three classes: wings having different angles of sweep, wings having different mass ratios, and wings having variable skin thicknesses. These studies demonstrated that the program can perform a complete parameter study in one computer run. The program is designed to detect abrupt changes in the lowest flutter speed and mode shape as the parameters are varied.
An expert computer program for classifying stars on the MK spectral classification system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gray, R. O.; Corbally, C. J.
2014-04-01
This paper describes an expert computer program (MKCLASS) designed to classify stellar spectra on the MK Spectral Classification system in a way similar to humans—by direct comparison with the MK classification standards. Like an expert human classifier, the program first comes up with a rough spectral type, and then refines that spectral type by direct comparison with MK standards drawn from a standards library. A number of spectral peculiarities, including barium stars, Ap and Am stars, λ Bootis stars, carbon-rich giants, etc., can be detected and classified by the program. The program also evaluates the quality of the delivered spectral type. The program currently is capable of classifying spectra in the violet-green region in either the rectified or flux-calibrated format, although the accuracy of the flux calibration is not important. We report on tests of MKCLASS on spectra classified by human classifiers; those tests suggest that over the entire HR diagram, MKCLASS will classify in the temperature dimension with a precision of 0.6 spectral subclass, and in the luminosity dimension with a precision of about one half of a luminosity class. These results compare well with human classifiers.
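The two-pass strategy attributed to MKCLASS, a rough type first, then refinement by direct comparison with library standards, can be sketched schematically. This is not the MKCLASS code: real spectra become short stand-in feature vectors here, and the standards library, thresholds, and distance measure are all hypothetical.

```python
# Hypothetical standards library: spectral type -> stand-in feature vector
STANDARDS = {
    "A0": [1.00, 0.20, 0.05],
    "F0": [0.80, 0.45, 0.15],
    "G0": [0.55, 0.60, 0.35],
    "K0": [0.30, 0.55, 0.70],
}

def classify(spectrum):
    # pass 1: a rough type from a single temperature-sensitive feature
    f = spectrum[0]
    rough = "A0" if f > 0.9 else "F0" if f > 0.65 else "G0" if f > 0.4 else "K0"
    # pass 2: refine by direct comparison with the neighbouring standards
    types = list(STANDARDS)
    i = types.index(rough)
    candidates = types[max(0, i - 1): i + 2]
    def dist(t):
        # sum-of-squares mismatch against a library standard
        return sum((s - x) ** 2 for s, x in zip(STANDARDS[t], spectrum))
    return min(candidates, key=dist)
```

The rough pass narrows the library search so the expensive comparison runs only against plausible standards, which is the efficiency argument for the human-like two-stage design.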
Educational NASA Computational and Scientific Studies (enCOMPASS)
NASA Technical Reports Server (NTRS)
Memarsadeghi, Nargess
2013-01-01
Educational NASA Computational and Scientific Studies (enCOMPASS) is an educational project of NASA Goddard Space Flight Center aimed at bridging the gap between the computational objectives and needs of NASA's scientific research, missions, and projects, and academia's latest advances in applied mathematics and computer science. enCOMPASS achieves this goal via bidirectional collaboration and communication between NASA and academia. Using the developed NASA Computational Case Studies in university computer science/engineering and applied mathematics classes is a way of addressing NASA's goal of contributing to the Science, Technology, Engineering, and Math (STEM) National Objective. The enCOMPASS Web site at http://encompass.gsfc.nasa.gov provides additional information. There are currently nine enCOMPASS case studies developed in the areas of earth sciences, planetary sciences, and astrophysics. Some of these case studies have been published in AIP and IEEE's Computing in Science and Engineering magazines. A few university professors have used enCOMPASS case studies in their computational classes and contributed their findings to NASA scientists. In these case studies, after an introduction to the science area, the specific problem, and related NASA missions, students are first asked to solve a known problem using NASA data and previously used approaches, often published in a scientific/research paper. Then, after learning about the NASA application and the related computational tools and approaches, students are given a harder problem as a challenge to research and develop solutions for. This project provides a model for NASA scientists and engineers on one side, and university students, faculty, and researchers in computer science and applied mathematics on the other, to learn from each other's areas of work, computational needs and solutions, and the latest advances in research and development. 
This innovation takes NASA science and engineering applications to computer science and applied mathematics university classes, and makes NASA objectives part of the university curricula. There is great potential for growth and return on investment of this program to the point where every major university in the U.S. would use at least one of these case studies in one of their computational courses, and where every NASA scientist and engineer facing a computational challenge (without having resources or expertise to solve it) would use enCOMPASS to formulate the problem as a case study, provide it to a university, and get back their solutions and ideas.
NASA Astrophysics Data System (ADS)
Noreen, Amna; Olaussen, Kåre
2012-10-01
A subroutine for a very-high-precision numerical solution of a class of ordinary differential equations is provided. For a given evaluation point and equation parameters the memory requirement scales linearly with precision P, and the number of algebraic operations scales roughly linearly with P when P becomes sufficiently large. We discuss results from extensive tests of the code, and how one, for a given evaluation point and equation parameters, may estimate precision loss and computing time in advance. Program summary Program title: seriesSolveOde1 Catalogue identifier: AEMW_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEMW_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 991 No. of bytes in distributed program, including test data, etc.: 488116 Distribution format: tar.gz Programming language: C++ Computer: PCs or higher performance computers. Operating system: Linux and MacOS RAM: Few to many megabytes (problem dependent). Classification: 2.7, 4.3 External routines: CLN — Class Library for Numbers [1] built with the GNU MP library [2], and GSL — GNU Scientific Library [3] (only for time measurements). Nature of problem: The differential equation -s^2 (d^2/dz^2 + ((1 - ν₊ - ν₋)/z) d/dz + ν₊ν₋/z^2) ψ(z) + (1/z) Σ_{n=0}^{N} v_n z^n ψ(z) = 0 is solved numerically to very high precision. The evaluation point z and some or all of the equation parameters may be complex numbers; some or all of them may be represented exactly in terms of rational numbers. Solution method: The solution ψ(z), and optionally ψ'(z), is evaluated at the point z by executing the recursion A_m(z) = (s^{-2} / ((m+1+ν-ν₊)(m+1+ν-ν₋))) Σ_{n=0}^{N} V_n(z) A_{m-n}(z), ψ(z) ← ψ(z) + A_m(z), to sufficiently large m. Here ν is either ν₊ or ν₋, and V_n(z) = v_n z^n. The recursion is initialized by A_{-n}(z) = δ_{n,0} z^ν for n = 0, 1, …, N, and ψ(z) = A_0(z). 
Restrictions: No solution is computed if z = 0, or s = 0, or if ν = ν₋ (assuming Re ν₊ ≥ Re ν₋) with ν₊ - ν₋ an integer, except when ν₊ - ν₋ = 1 and v₀ = 0 (i.e. when z = 0 is an ordinary point of z ψ(z)). Additional comments: The code of the main algorithm is in the file seriesSolveOde1.cc, which "#include"s the file checkForBreakOde1.cc. Programs using these routines must "#include" the file seriesSolveOde1.cc. Running time: On a Linux PC that is a few years old, evaluating the ground state wavefunction of the anharmonic oscillator (with the eigenvalue known in advance; cf. Eq. (6)) at y = √10 takes about 2 ms to an accuracy of P = 200 decimal digits, and about 40 min to an accuracy of P = 100000 decimal digits. References: [1] B. Haible and R.B. Kreckel, CLN — Class Library for Numbers, http://www.ginac.de/CLN/ [2] T. Granlund and collaborators, GMP — The GNU Multiple Precision Arithmetic Library, http://gmplib.org/ [3] M. Galassi et al., GNU Scientific Library Reference Manual (3rd Ed.), ISBN 0954612078, http://www.gnu.org/software/gsl/
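The series-recursion idea behind seriesSolveOde1 can be illustrated on a toy case. This is not the published C++ code, which handles the general equation above in arbitrary precision with CLN; the sketch below solves ψ'' = -ψ with ψ(0) = 0, ψ'(0) = 1 (i.e. ψ = sin z), accumulating exact rational series terms via the coefficient recursion the ODE implies.

```python
from fractions import Fraction

def series_sin(z, terms=30):
    """Evaluate sin(z) by power-series recursion with exact rational arithmetic."""
    z = Fraction(z)
    a = z          # first term a_1 * z, from psi(0) = 0, psi'(0) = 1
    total = a
    for m in range(1, terms):
        # from psi'' = -psi: a_{k+2} = -a_k / ((k+1)(k+2)), here k = 2m - 1,
        # so each new term is the previous one times -z^2 / ((k+1)(k+2))
        k = 2 * m - 1
        a = -a * z * z / ((k + 1) * (k + 2))
        total += a
    return total
```

Because each term is obtained from the previous one by a fixed number of exact operations, memory grows only with the size of one coefficient, the same linear-in-precision behavior the abstract claims for the general solver.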
Designing for deeper learning in a blended computer science course for middle school students
NASA Astrophysics Data System (ADS)
Grover, Shuchi; Pea, Roy; Cooper, Stephen
2015-04-01
The focus of this research was to create and test an introductory computer science course for middle school. Titled "Foundations for Advancing Computational Thinking" (FACT), the course aims to prepare and motivate middle school learners for future engagement with algorithmic problem solving. FACT was also piloted as a seven-week course on Stanford's OpenEdX MOOC platform for blended in-class learning. Unique aspects of FACT include balanced pedagogical designs that address the cognitive, interpersonal, and intrapersonal aspects of "deeper learning"; a focus on pedagogical strategies for mediating and assessing for transfer from block-based to text-based programming; curricular materials for remedying misperceptions of computing; and "systems of assessments" (including formative and summative quizzes and tests, directed as well as open-ended programming assignments, and a transfer test) to get a comprehensive picture of students' deeper computational learning. Empirical investigations, accomplished over two iterations of a design-based research effort with students (aged 11-14 years) in a public school, sought to examine student understanding of algorithmic constructs, and how well students transferred this learning from Scratch to text-based languages. Changes in student perceptions of computing as a discipline were measured. Results and mixed-method analyses revealed that students in both studies (1) achieved substantial learning gains in algorithmic thinking skills, (2) were able to transfer their learning from Scratch to a text-based programming context, and (3) achieved significant growth toward a more mature understanding of computing as a discipline. Factor analyses of prior computing experience, multivariate regression analyses, and qualitative analyses of student projects and artifact-based interviews were conducted to better understand the factors affecting learning outcomes. 
Prior computing experiences (as measured by a pretest) and math ability were found to be strong predictors of learning outcomes.
Learning sorting algorithms through visualization construction
NASA Astrophysics Data System (ADS)
Cetin, Ibrahim; Andrews-Larson, Christine
2016-01-01
Recent increased interest in computational thinking poses an important question to researchers: What are the best ways to teach fundamental computing concepts to students? Visualization is suggested as one way of supporting student learning. This mixed-method study aimed to (i) examine the effect of instruction in which students constructed visualizations on students' programming achievement and students' attitudes toward computer programming, and (ii) explore how this kind of instruction supports students' learning according to their self-reported experiences in the course. The study was conducted with 58 pre-service teachers who were enrolled in their second programming class. They expect to teach information technology and computing-related courses at the primary and secondary levels. An embedded experimental model was utilized as a research design. Students in the experimental group were given instruction that required students to construct visualizations related to sorting, whereas students in the control group viewed pre-made visualizations. After the instructional intervention, eight students from each group were selected for semi-structured interviews. The results showed that the intervention based on visualization construction resulted in significantly better acquisition of sorting concepts. However, there was no significant difference between the groups with respect to students' attitudes toward computer programming. Qualitative data analysis indicated that students in the experimental group constructed necessary abstractions through their engagement in visualization construction activities. The authors of this study argue that the students' active engagement in the visualization construction activities explains only one side of students' success. The other side can be explained through the instructional approach, constructionism in this case, used to design instruction. 
The conclusions and implications of this study can be used by researchers and instructors dealing with computational thinking.
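The kind of artifact students might construct in the visualization-construction condition described above can be sketched as follows. This example is hypothetical, not the study's materials: a sorting algorithm is instrumented to emit a snapshot after every swap, so the algorithm produces its own step-by-step visualization.

```python
def bubble_sort_frames(values):
    """Bubble sort that records a snapshot ("frame") after every swap."""
    a = values[:]
    frames = [a[:]]                 # frame 0: the unsorted input
    for i in range(len(a) - 1, 0, -1):
        for j in range(i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                frames.append(a[:]) # one frame per swap
    return frames

def render(frame):
    """Draw one frame as text bars, one bar per value."""
    return "\n".join("#" * v for v in frame)
```

Printing `render(f)` for each frame in order animates the sort as growing and shrinking bars, the sort of construction activity through which, the study argues, students build the necessary abstractions.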
Project Integration Architecture
NASA Technical Reports Server (NTRS)
Jones, William Henry
2008-01-01
The Project Integration Architecture (PIA) is a distributed, object-oriented, conceptual, software framework for the generation, organization, publication, integration, and consumption of all information involved in any complex technological process in a manner that is intelligible to both computers and humans. In the development of PIA, it was recognized that in order to provide a single computational environment in which all information associated with any given complex technological process could be viewed, reviewed, manipulated, and shared, it is necessary to formulate all the elements of such a process on the most fundamental level. In this formulation, any such element is regarded as being composed of any or all of three parts: input information, some transformation of that input information, and some useful output information. Another fundamental principle of PIA is the assumption that no consumer of information, whether human or computer, can be assumed to have any useful foreknowledge of an element presented to it. Consequently, a PIA-compliant computing system is required to be ready to respond to any questions, posed by the consumer, concerning the nature of the proffered element. In colloquial terms, a PIA-compliant system must be prepared to provide all the information needed to place the element in context. To satisfy this requirement, PIA extends the previously established object-oriented-programming concept of self-revelation and applies it on a grand scale. To enable pervasive use of self-revelation, PIA exploits another previously established object-oriented-programming concept, that of semantic infusion through class derivation. By means of self-revelation and semantic infusion through class derivation, a consumer of information can inquire about the contents of all information entities (e.g., databases and software) and can interact appropriately with those entities. Other key features of PIA are listed.
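The two concepts named here, self-revelation and semantic infusion through class derivation, can be illustrated schematically. The sketch below is hypothetical and does not use PIA's actual API: every element can be asked to describe itself, and deriving a subclass "infuses" additional meaning that the self-description automatically reports, so a consumer needs no foreknowledge of the element it receives.

```python
class Element:
    """Base element: named contents plus a self-description (self-revelation)."""
    description = "generic element"

    def __init__(self, **contents):
        self.contents = contents

    def reveal(self):
        # a consumer with no foreknowledge can always ask what this thing is:
        # its concrete kind, the chain of meanings infused by class derivation,
        # and the names of the contents it carries
        return {
            "kind": type(self).__name__,
            "semantics": [c.description for c in type(self).__mro__
                          if c is not object],
            "contents": sorted(self.contents),
        }

class PressureField(Element):
    # semantic infusion through class derivation: subclassing adds meaning
    description = "pressure field over a grid"
```

A consumer handed an unknown object can call `reveal()` and learn that it is a `PressureField`, that a pressure field is a kind of generic element, and which contents it offers, which is the "place the element in context" requirement stated above.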
FY16 ASME High Temperature Code Activities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Swindeman, M. J.; Jetter, R. I.; Sham, T. -L.
2016-09-01
One of the objectives of the ASME high temperature Code activities is to develop and validate both improvements and the basic features of Section III, Division 5, Subsection HB, Subpart B (HBB). The overall scope of this task is to develop a computer program to be used to assess whether or not a specific component under specified loading conditions will satisfy the elevated temperature design requirements for Class A components in Section III, Division 5, Subsection HB, Subpart B (HBB). There are many features and alternative paths of varying complexity in HBB. The initial focus of this task is a basic path through the various options for a single reference material, 316H stainless steel. However, the program will be structured for eventual incorporation of all the features and permitted materials of HBB. Since this task has recently been initiated, this report focuses on the description of the initial path forward and an overall description of the approach to computer program development.
Spreter Von Kreudenstein, Thomas; Lario, Paula I; Dixit, Surjit B
2014-01-01
Computational and structure guided methods can make significant contributions to the development of solutions for difficult protein engineering problems, including the optimization of next generation of engineered antibodies. In this paper, we describe a contemporary industrial antibody engineering program, based on hypothesis-driven in silico protein optimization method. The foundational concepts and methods of computational protein engineering are discussed, and an example of a computational modeling and structure-guided protein engineering workflow is provided for the design of best-in-class heterodimeric Fc with high purity and favorable biophysical properties. We present the engineering rationale as well as structural and functional characterization data on these engineered designs. Copyright © 2013 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wolverton, Christopher; Ozolins, Vidvuds; Kung, Harold H.
The objective of the proposed program is to discover novel mixed hydrides for hydrogen storage, which enable the DOE 2010 system-level goals. Our goal is to find a material that desorbs 8.5 wt.% H2 or more at temperatures below 85°C. The research program will combine first-principles calculations of reaction thermodynamics and kinetics with material and catalyst synthesis, testing, and characterization. We will combine materials from distinct categories (e.g., chemical and complex hydrides) to form novel multicomponent reactions. Systems to be studied include mixtures of complex hydrides and chemical hydrides [e.g., LiNH2 + NH3BH3] and nitrogen-hydrogen based borohydrides [e.g., Al(BH4)3(NH3)3]. The 2010 and 2015 FreedomCAR/DOE targets for hydrogen storage systems are very challenging, and cannot be met with existing materials. The vast majority of the work to date has delineated materials into various classes, e.g., complex and metal hydrides, chemical hydrides, and sorbents. However, very recent studies indicate that mixtures of storage materials, particularly mixtures between various classes, hold promise to achieve technological attributes that materials within an individual class cannot reach. Our project involves a systematic, rational approach to designing novel multicomponent mixtures of materials with fast hydrogenation/dehydrogenation kinetics and favorable thermodynamics using a combination of state-of-the-art scientific computing and experimentation. We will use the accurate predictive power of first-principles modeling to understand the thermodynamic and microscopic kinetic processes involved in hydrogen release and uptake and to design new material/catalyst systems with improved properties. Detailed characterization and atomic-scale catalysis experiments will elucidate the effect of dopants and nanoscale catalysts in achieving fast kinetics and reversibility. And, state-of-the-art storage experiments will give key storage attributes of the investigated reactions, validate computational predictions, and help guide and improve computational methods. In sum, our approach involves a powerful blend of: 1) H2 storage measurements and characterization, 2) state-of-the-art computational modeling, 3) detailed catalysis experiments, and 4) an in-depth automotive perspective.
AE9/AP9/SPM Model Application Programming Interface, Version 1.00.000
2014-02-18
[Fragment of the extracted API reference; only partially recoverable.] The orbit-propagator class provides a SatEph implementation and a Kepler+J2-only propagator, and clients of the class can choose which to use. Documented methods include void useSGP4ImprovedMode (parameters: none; return values: none) and void setOrbitType ( const string& strOrbit ), which sets the type of orbit to compute for the Kepler/J2 propagator; the list of valid values is truncated in the extract.
Distributing an executable job load file to compute nodes in a parallel computer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gooding, Thomas M.
Distributing an executable job load file to compute nodes in a parallel computer, the parallel computer comprising a plurality of compute nodes, including: determining, by a compute node in the parallel computer, whether the compute node is participating in a job; determining, by the compute node in the parallel computer, whether a descendant compute node is participating in the job; responsive to determining that the compute node is participating in the job or that the descendant compute node is participating in the job, communicating, by the compute node to a parent compute node, an identification of a data communications link over which the compute node receives data from the parent compute node; constructing a class route for the job, wherein the class route identifies all compute nodes participating in the job; and broadcasting the executable load file for the job along the class route for the job.
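The tree-based scheme in this abstract can be sketched in a few lines: each node joins the class route if it or any descendant participates in the job, and the load file is then pushed down only along route links. The sketch below (plain Python, with illustrative names such as `ComputeNode` that are not from the patent) captures the idea only, not the actual parallel-computer implementation.

```python
# Sketch (not the patented implementation) of building a "class route" --
# the subtree of nodes participating in a job -- and broadcasting a load
# file along it. Names and the tree shape are illustrative assumptions.

class ComputeNode:
    def __init__(self, rank, participating=False):
        self.rank = rank
        self.participating = participating
        self.children = []
        self.load_file = None

    def build_class_route(self):
        """A node joins the route if it participates or any descendant
        does (so it can forward data toward that descendant)."""
        route = set()
        for child in self.children:
            route |= child.build_class_route()
        if self.participating or route:
            route.add(self)
        return route

    def broadcast(self, load_file, route):
        """Push the load file down only along links on the route."""
        if self in route:
            if self.participating:
                self.load_file = load_file
            for child in self.children:
                child.broadcast(load_file, route)

# Usage: a 4-node tree in which ranks 2 and 3 run the job.
root = ComputeNode(0)
a, b, c = ComputeNode(1), ComputeNode(2, True), ComputeNode(3, True)
root.children = [a, b]
a.children = [c]
route = root.build_class_route()
root.broadcast("job.elf", route)
```

Note that rank 1 ends up on the route without loading the file itself: it only relays data to its participating descendant, which is the point of the class-route construction.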
Projects Using a Computer Algebra System in First-Year Undergraduate Mathematics
ERIC Educational Resources Information Center
Rosenzweig, Martin
2007-01-01
This paper illustrates the use of computer-based projects in two one-semester first-year undergraduate mathematics classes. Developed over a period of years, the approach is one in which the classes are organised into work-groups, with computer-based projects being undertaken periodically to illustrate the class material. These projects are…
A comprehensive analytical model of rotorcraft aerodynamics and dynamics. Part 3: Program manual
NASA Technical Reports Server (NTRS)
Johnson, W.
1980-01-01
The computer program for a comprehensive analytical model of rotorcraft aerodynamics and dynamics is described. This analysis is designed to calculate rotor performance, loads, and noise; the helicopter vibration and gust response; the flight dynamics and handling qualities; and the system aeroelastic stability. The analysis is a combination of structural, inertial, and aerodynamic models that is applicable to a wide range of problems and a wide class of vehicles. The analysis is intended for use in the design, testing, and evaluation of rotors and rotorcraft and to be a basis for further development of rotary wing theories.
NASA Technical Reports Server (NTRS)
Barry, Matthew R.; Osborne, Richard N.
2005-01-01
The RoseDoclet computer program extends the capability of Java doclet software to automatically synthesize Unified Modeling Language (UML) content from Java language source code. [Doclets are Java-language programs that use the doclet application programming interface (API) to specify the content and format of the output of Javadoc. Javadoc is a program, originally designed to generate API documentation from Java source code, now also useful as an extensible engine for processing Java source code.] RoseDoclet takes advantage of Javadoc comments and tags already in the source code to produce a UML model of that code. RoseDoclet applies the doclet API to create a doclet passed to Javadoc. The Javadoc engine applies the doclet to the source code, emitting the output format specified by the doclet. RoseDoclet emits a Rose model file and populates it with fully documented packages, classes, methods, variables, and class diagrams identified in the source code. The way in which UML models are generated can be controlled by use of new Javadoc comment tags that RoseDoclet provides. The advantage of using RoseDoclet is that Javadoc documentation becomes leveraged for two purposes: documenting the as-built API and keeping the design documentation up to date.
Reduze - Feynman integral reduction in C++
NASA Astrophysics Data System (ADS)
Studerus, C.
2010-07-01
Reduze is a computer program for reducing Feynman integrals to master integrals employing a Laporta algorithm. The program is written in C++ and uses classes provided by the GiNaC library to perform the simplifications of the algebraic prefactors in the system of equations. Reduze offers the possibility to run reductions in parallel.
Program summary
Program title: Reduze
Catalogue identifier: AEGE_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGE_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: yes
No. of lines in distributed program, including test data, etc.: 55 433
No. of bytes in distributed program, including test data, etc.: 554 866
Distribution format: tar.gz
Programming language: C++
Computer: All
Operating system: Unix/Linux
Number of processors used: problem dependent; more than one is possible, but not arbitrarily many
RAM: depends on the complexity of the system
Classification: 4.4, 5
External routines: CLN (http://www.ginac.de/CLN/), GiNaC (http://www.ginac.de/)
Nature of problem: solving large systems of linear equations with Feynman integrals as unknowns and rational polynomials as prefactors
Solution method: a Gauss/Laporta algorithm to solve the system of equations
Restrictions: limitations depend on the complexity of the system (number of equations, number of kinematic invariants)
Running time: depends on the complexity of the system
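The Gauss/Laporta reduction the summary refers to can be illustrated with a toy system. The sketch below is an assumption-laden miniature, not Reduze itself: integrals are just names, the rational prefactors are exact `Fraction`s standing in for the polynomial prefactors GiNaC handles, and the "complexity" ordering is supplied by hand.

```python
# Toy sketch of the Gauss/Laporta idea: solve a linear system whose
# unknowns are (named) Feynman integrals, eliminating the most complex
# integral from each usable equation. Integrals that never acquire a
# reduction rule are the master integrals.
from fractions import Fraction

def substitute(eq, rules):
    """Apply known reduction rules to an equation {integral: coeff}
    repeatedly, until no reducible integral remains."""
    eq = dict(eq)
    while any(k in rules for k in eq):
        out = {}
        for integral, coeff in eq.items():
            if integral in rules:
                for k, v in rules[integral].items():
                    out[k] = out.get(k, Fraction(0)) + coeff * v
            else:
                out[integral] = out.get(integral, Fraction(0)) + coeff
        eq = {k: v for k, v in out.items() if v != 0}
    return eq

def reduce_system(equations, complexity_order):
    """equations: dicts {integral: coeff}, each meaning sum == 0.
    complexity_order: integrals listed from most to least complex."""
    rules = {}  # integral -> expression in strictly simpler integrals
    for target in complexity_order:
        for eq in equations:
            eq = substitute(eq, rules)
            c = eq.get(target, Fraction(0))
            if c != 0:
                # solve for `target`: target = -(rest of equation)/c
                rules[target] = {k: -v / c for k, v in eq.items() if k != target}
                break
    return rules

# Two identities relating I3 and I2 to the single master integral I1:
eqs = [
    {"I3": Fraction(1), "I2": Fraction(-2), "I1": Fraction(1)},  # I3 - 2*I2 + I1 = 0
    {"I2": Fraction(3), "I1": Fraction(-1)},                     # 3*I2 - I1 = 0
]
rules = reduce_system(eqs, ["I3", "I2", "I1"])
# yields I2 = I1/3 and, after substitution, I3 = -I1/3; I1 stays a master
```

The real program does the same elimination over enormous systems generated from integration-by-parts identities, with polynomial coefficients in the kinematic invariants rather than plain rationals.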
[Computer simulation of a clinical magnetic resonance tomography scanner for training purposes].
Hackländer, T; Mertens, H; Cramer, B M
2004-08-01
The idea for this project was born of the necessity to offer medical students an easy approach to the theoretical basics of magnetic resonance imaging. The aim was to simulate the features and functions of such a scanner on a commercially available computer by means of a computer program. The simulation was programmed in pure Java under the GNU General Public License and is freely available for any commercially available computer with a Windows, Macintosh or Linux operating system. The graphical user interface is modeled on that of a real scanner. In an external program, parameter images for the proton density and the relaxation times T1 and T2 are calculated on the basis of clinical examinations. From these, the image calculation is carried out in the simulation program pixel by pixel on the basis of a pulse sequence chosen and modified by the user. The images can be stored and printed. In addition, it is possible to display and modify k-space images. Seven classes of pulse sequences are implemented, and up to 14 relevant sequence parameters, such as repetition time and echo time, can be altered. Aliasing and motion artifacts can be simulated. As the image calculation only takes a few seconds, interactive working is possible. The simulation has been used in university education for more than 1 year, successfully illustrating the dependence of the MR images on the measuring parameters. This should facilitate students' approach to understanding MR imaging in the future.
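The pixel-by-pixel image calculation described here can be illustrated with the standard spin-echo signal model; the abstract does not state the simulator's actual equations, so the model and the tissue values below are assumptions for illustration only.

```python
# Sketch of a pixel-wise MR image calculation using the textbook
# spin-echo signal equation (an assumption -- the paper does not give
# its equations): S = PD * (1 - exp(-TR/T1)) * exp(-TE/T2).
import math

def spin_echo_signal(pd, t1, t2, tr, te):
    """Signal for one pixel from its tissue parameters (T1, T2 in ms)
    and the user-chosen sequence parameters TR and TE (ms)."""
    return pd * (1.0 - math.exp(-tr / t1)) * math.exp(-te / t2)

def simulate_image(pd_map, t1_map, t2_map, tr, te):
    """Apply the signal model to every pixel of the parameter maps."""
    return [
        [spin_echo_signal(pd, t1, t2, tr, te)
         for pd, t1, t2 in zip(pd_row, t1_row, t2_row)]
        for pd_row, t1_row, t2_row in zip(pd_map, t1_map, t2_map)
    ]

# T1-weighted contrast (short TR, short TE): white matter recovers its
# longitudinal magnetization faster than CSF, so it appears brighter.
# Tissue values are rough literature-style numbers, not from the paper.
wm = spin_echo_signal(pd=0.7, t1=600, t2=80, tr=500, te=15)
csf = spin_echo_signal(pd=1.0, t1=4000, t2=2000, tr=500, te=15)
```

Varying TR and TE in this toy model reproduces the qualitative dependence of image contrast on the measuring parameters that the simulator is designed to teach.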
A meta-analysis of pedagogical tools used in introductory programming courses
NASA Astrophysics Data System (ADS)
Trees, Frances P.
Programming is recognized as being challenging for teachers to teach and difficult for students to learn. For decades, computer science educators have looked at innovative approaches by creating pedagogical software tools that attempt to facilitate both the teaching of and the learning of programming. This dissertation investigates the motivations for the integration of pedagogical tools in introductory programming courses and the characteristics that are perceived to contribute to the effectiveness of these tools. The study employs three research stages that examine the tool characteristics and their use. The first stage surveys teachers who use pedagogical tools in an introductory programming course. The second interviews teachers to explore the survey results in more detail and to add greater depth into the choice and use of pedagogical tools in the introductory programming class. The third interviews tool developers to provide an explanatory insight of the tool and the motivation for its creation. The results indicate that the pedagogical tools perceived to be effective share common characteristics: They provide an environment that is manageable, flexible and visual; they provide for active engagement in learning activities and support programming in small pieces; they allow for an easy transition to subsequent courses and more robust environments; they provide technical support and resource materials. The results of this study also indicate that recommendations from other computer science educators have a strong impact on a teacher's initial tool choice for an introductory programming course. This study informs present and future tool developers of the characteristics that the teachers perceive to contribute to the effectiveness of a pedagogical tool and how to present their tools to encourage a more efficient and more effective widespread adoption of the tool into the teacher's curriculum. 
The teachers involved in this study are actively involved in the computer science education community. The results of this study, based on the perceptions of these computer science educators, provide guidance to those educators choosing to introduce a new pedagogical tool into their programming course.
Multilevel algorithms for nonlinear optimization
NASA Technical Reports Server (NTRS)
Alexandrov, Natalia; Dennis, J. E., Jr.
1994-01-01
Multidisciplinary design optimization (MDO) gives rise to nonlinear optimization problems characterized by a large number of constraints that naturally occur in blocks. We propose a class of multilevel optimization methods motivated by the structure and number of constraints and by the expense of the derivative computations for MDO. The algorithms are an extension to the nonlinear programming problem of the successful class of local Brown-Brent algorithms for nonlinear equations. Our extensions allow the user to partition constraints into arbitrary blocks to fit the application, and they separately process each block and the objective function, restricted to certain subspaces. The methods use trust regions as a globalization strategy, and they have been shown to be globally convergent under reasonable assumptions. The multilevel algorithms can be applied to all classes of MDO formulations. Multilevel algorithms for solving nonlinear systems of equations are a special case of the multilevel optimization methods. In this case, they can be viewed as a trust-region globalization of the Brown-Brent class.
GPUs in a computational physics course
NASA Astrophysics Data System (ADS)
Adler, Joan; Nissim, Gal; Kiswani, Ahmad
2017-10-01
In an introductory computational physics class of the type that many of us give, time constraints lead to hard choices on topics. Everyone likes to include their own research in such a class, but an overview of many areas is paramount. Parallel programming using MPI is one important topic. Both the principle and the need to break the “fear barrier” of using a large machine with a queuing system via ssh must be successfully passed on. Due to the plateau in chip development and to power considerations, future HPC hardware choices will include heavy use of GPUs. Thus the need to introduce these at the level of an introductory course has arisen. Just as for parallel coding, explanations of the benefits and simple examples to guide the hesitant first-time user should be selected. Several student projects using GPUs that include how-to pages were proposed at the Technion. Two of the more successful ones were lattice Boltzmann and a finite element code, and we present these in detail.
NASA Technical Reports Server (NTRS)
White, Allan L.; Palumbo, Daniel L.
1991-01-01
Semi-Markov processes have proved to be an effective and convenient tool to construct models of systems that achieve reliability by redundancy and reconfiguration. These models are able to depict complex system architectures and to capture the dynamics of fault arrival and system recovery. A disadvantage of this approach is that the models can be extremely large, which poses both a model and a computational problem. Techniques are needed to reduce the model size. Because these systems are used in critical applications where failure can be expensive, there must be an analytically derived bound for the error produced by the model reduction technique. A model reduction technique called trimming is presented that can be applied to a popular class of systems. Automatic model generation programs were written to help the reliability analyst produce models of complex systems. This method, trimming, is easy to implement and the error bound easy to compute. Hence, the method lends itself to inclusion in an automatic model generator.
Outreach to Scientists and Engineers at the Hanford Technical Library
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buxton, Karen A.
Staff at the Hanford Technical Library have developed a suite of programs designed to help busy researchers at the Pacific Northwest National Laboratory (PNNL) make better use of library products and services. Programs include formal training classes, one-on-one consultations, and targeted email messages announcing new materials to researchers in specific fields. A staple of outreach has been teaching classes to library clients covering research tools in their fields. These classes started out in the library classroom and then expanded to other venues around PNNL. Class surveys indicated that many researchers desired a practical approach to learning rather than the traditional lecture format. The library instituted “Library Learning Day” and hosted classes in the PNNL computer training room to provide lab employees with a hands-on learning experience. Classes are generally offered at noon, and lab staff attend classes on their lunch hour; many just do not have time to spend a full hour in training. Library staff added some experimental half-hour mini classes in campus buildings, geared to the projects and interests of researchers there, to see if this format was more appealing. As other programs have developed, librarians are teaching fewer classes, but average attendance figures have remained fairly stable from 2005-2007. In the summer of 2004 the library began the Traveling Librarian program. Librarians call on groups and individuals in 24 buildings on the Richland, Washington campus. Five full-time and two part-time librarians are involved in the program. Librarians usually send out email announcements prior to visits and encourage scientists and engineers to make appointments for a brief 15-minute consultation in the researcher's own office. During the meeting, lab staff learn about products or product features that can help them work more productively. Librarians also make cold calls to staff who do not request a consultation and may not be making full use of the library. Scientists and engineers who require longer sessions can arrange half-hour training appointments in the researcher's own office or at the library. Since the program was implemented, staff have made 165 visits to 1249 laboratory staff, including some repeat consultation requests. New acquisitions lists are sent to individuals and groups that would be interested in recent journal, database, and book purchases. These lists are topic-specific and targeted to groups and individuals with an interest in the field. For example, newly acquired engineering resources are targeted at engineering groups. The new acquisitions list for engineering began mid-year in 2005. An analysis of circulation statistics for engineering books in fiscal years 2005, 2006, and 2007 shows that circulation increased each year, with 2007 circulation nearly double that of 2005. This took place when overall circulation rose in FY06 but fell slightly in FY07. Tailored, individualized outreach strategies can be effective, and offering multiple outreach options gives researchers different ways to interact with library staff and services.
Improving Search Properties in Genetic Programming
NASA Technical Reports Server (NTRS)
Janikow, Cezary Z.; DeWeese, Scott
1997-01-01
With advancing computer processing capabilities, practical computer applications are mostly limited by the amount of human programming required to accomplish a specific task. This necessary human participation creates many problems, such as dramatically increased cost. To alleviate the problem, computers must become more autonomous. In other words, computers must be capable of programming/reprogramming themselves to adapt to changing environments/tasks/demands/domains. Evolutionary computation offers potential means, but it must be advanced beyond its current practical limitations. Evolutionary algorithms model nature. They maintain a population of structures representing potential solutions to the problem at hand. These structures undergo a simulated evolution by means of mutation, crossover, and a Darwinian selective pressure. Genetic programming (GP) is the most promising example of an evolutionary algorithm. In GP, the structures that evolve are trees, which is a dramatic departure from previously used representations such as strings in genetic algorithms. The space of potential trees is defined by means of their elements: functions, which label internal nodes, and terminals, which label leaves. By attaching semantic interpretation to those elements, trees can be interpreted as computer programs (given an interpreter), evolved architectures, etc. JSC has begun exploring GP as a potential tool for its long-term project on evolving dextrous robotic capabilities. Last year we identified representation redundancies as the primary source of inefficiency in GP. Subsequently, we proposed a method to use problem constraints to reduce those redundancies, effectively reducing GP complexity. This method was implemented afterwards at the University of Missouri. This summer, we have evaluated the payoff from using problem constraints to reduce search complexity on two classes of problems: learning boolean functions and solving the forward kinematics problem.
We have also developed and implemented methods to use additional problem heuristics to fine-tune the searchable space, and to use typing information to further reduce the search space. Additional improvements have been proposed, but they are yet to be explored and implemented.
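The tree representation described in this abstract is easy to make concrete. The sketch below is a generic GP illustration for the boolean-function class of problems, not JSC's system: functions label internal nodes, terminals label leaves, and an interpreter turns a tree into a program.

```python
# Minimal illustration (not the system in the abstract) of GP trees:
# internal nodes are functions, leaves are terminals, and a tree is
# interpreted as a program -- here a boolean expression over x and y.
import operator
import random

FUNCTIONS = {"and": operator.and_, "or": operator.or_, "not": lambda a: not a}
ARITY = {"and": 2, "or": 2, "not": 1}
TERMINALS = ["x", "y"]

def random_tree(depth):
    """Grow a random expression tree over the function/terminal sets,
    as GP does when initializing its population."""
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMINALS)
    f = random.choice(list(FUNCTIONS))
    return [f] + [random_tree(depth - 1) for _ in range(ARITY[f])]

def evaluate(tree, env):
    """Interpret a tree as a boolean program, given terminal bindings."""
    if isinstance(tree, str):
        return env[tree]
    f, *args = tree
    return FUNCTIONS[f](*(evaluate(a, env) for a in args))

# XOR written as a fixed tree: (x or y) and not (x and y)
xor_tree = ["and", ["or", "x", "y"], ["not", ["and", "x", "y"]]]
```

Constraining which functions may appear beneath which others, as the abstract proposes, shrinks the space `random_tree` samples from and so reduces the redundancy of the search.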
Computational Physics in a Nutshell
NASA Astrophysics Data System (ADS)
Schillaci, Michael
2001-11-01
Too often students of science are expected to ``pick up'' what they need to know about the Art of Science. A description of the two-semester Computational Physics course being taught by the author offers a remedy to this situation. The course teaches students the three pillars of modern scientific research: Problem Solving, Programming, and Presentation. Using FORTRAN, LaTeX2e, MAPLE V, HTML, and JAVA, students learn the fundamentals of algorithm development, how to implement classes and packages written by others, how to produce publication-quality graphics and documents, and how to publish them on the World Wide Web. The course content is outlined and project examples are offered.
Computational Labs Using VPython Complement Conventional Labs in Online and Regular Physics Classes
NASA Astrophysics Data System (ADS)
Bachlechner, Martina E.
2009-03-01
Fairmont State University has developed online physics classes for the high-school teaching certificate based on the textbook Matter and Interactions by Chabay and Sherwood. This led to using computational VPython labs in the traditional classroom setting as well, to complement conventional labs. The computational modeling process has proven to provide an excellent basis for the subsequent conventional lab and allows for a concrete experience of the difference between behavior according to a model and realistic behavior. Observations in the regular classroom setting feed back into the development of the online classes.
NASA Technical Reports Server (NTRS)
Sen, Syamal K.; Shaykhian, Gholam Ali
2011-01-01
MatLab(TradeMark) (MATrix LABoratory) is a numerical computation and simulation tool that is used by thousands of scientists and engineers in many countries. MatLab can do purely numerical calculations, usable as a glorified calculator through its interpreted programming language; its real strength is in matrix manipulations. Computer algebra functionalities are achieved within the MatLab environment using the "symbolic" toolbox. This feature is similar to computer algebra programs, such as Maple or Mathematica, that calculate with mathematical equations using symbolic operations. MatLab's interpreted programming language (command interface) is similar to well-known programming languages such as C/C++, supports data structures and cell arrays, and can define classes for object-oriented programming. As such, MatLab is equipped with most of the essential constructs of a higher programming language. MatLab is packaged with an editor and debugging functionality useful for analyzing large MatLab programs and finding errors. We believe there are many ways to approach real-world problems; prescribed methods that ensure such solutions are incorporated in the design and analysis of data processing and visualization can benefit engineers and scientists in gaining wider insight into the actual implementation of their respective experiments. This presentation will focus on the data processing and visualization aspects of engineering and scientific applications. Specifically, it will discuss methods and techniques to perform intermediate-level data processing covering engineering and scientific problems. MatLab programming techniques will be discussed, including reading various data file formats to produce customized publication-quality graphics, importing engineering and/or scientific data, organizing data in tabular format, exporting data to be used by other software programs such as Microsoft Excel, and data presentation and visualization.
Promoting Active Learning: The Use of Computational Software Programs
NASA Astrophysics Data System (ADS)
Dickinson, Tom
The increased emphasis on active learning in essentially all disciplines is proving beneficial in terms of students' depth of learning, retention, and completion of challenging courses. Formats labeled flipped, hybrid, and blended facilitate face-to-face active learning. To be effective, students need to absorb a significant fraction of the course material prior to class, e.g., using online lectures and reading assignments. Getting students to assimilate and at least partially understand this material prior to class can be extremely difficult. As an aid to achieving this preparation as well as enhancing depth of understanding, we find the use of software programs such as Mathematica® or MatLab® very helpful. We have written several Mathematica® applications and student exercises for use in a blended-format, two-semester E&M course. Formats include tutorials, simulations, graded and non-graded quizzes, walk-through problems, exploration and interpretation exercises, and numerical solutions of complex problems. A good portion of this activity involves student-written code. We will discuss the efficacy of these applications, their role in promoting active learning, and the range of possible uses of this basic scheme in other classes.
NASA Technical Reports Server (NTRS)
Dash, S.; Delguidice, P. D.
1975-01-01
A parametric numerical procedure permitting the rapid determination of the performance of a class of scramjet nozzle configurations is presented. The geometric complexity of these configurations ruled out attempts to employ conventional nozzle design procedures. The numerical program developed permitted the parametric variation of cowl length, turning angles on the cowl and vehicle undersurface and lateral expansion, and was subject to fixed constraints such as the vehicle length and nozzle exit height. The program required uniform initial conditions at the burner exit station and yielded the location of all predominant wave zones, accounting for lateral expansion effects. In addition, the program yielded the detailed pressure distribution on the cowl, vehicle undersurface and fences, if any, and calculated the nozzle thrust, lift and pitching moments.
NASA Astrophysics Data System (ADS)
Klieger, Aviva; Ben-Hur, Yehuda; Bar-Yossef, Nurit
2010-04-01
The study examines the professional development of junior-high-school teachers participating in the Israeli "Katom" (Computer for Every Class, Student and Teacher) Program, begun in 2004. A three-circle support and training model was developed for teachers' professional development. The first circle applies to all teachers in the program; the second, to all teachers at individual schools; the third, to teachers of specific disciplines. The study reveals and describes the attitudes of science teachers to the integration of laptop computers and to the accompanying professional development model. Semi-structured interviews were conducted with eight science teachers from the four schools participating in the program. The interviews were analyzed using a relational framework derived from the interview data. Two factors influenced science teachers' professional development: (1) the introduction of laptops to the teachers and students, and (2) the support and training system. Interview analysis shows that the disciplinary training is most relevant to teachers and that they are very interested in belonging to the professional science teachers' community. They also prefer face-to-face meetings in their school. Among the difficulties they noted were the new learning environment, including control of student computers, computer integration in laboratory work, and technical problems. Laptop computers contributed significantly to teachers' professional and personal development and to a shift from teacher-centered to student-centered teaching. One-to-one laptops also changed the schools' digital culture. The findings are important for designing concepts and models for professional development when introducing technological innovation into the educational system.
NASA Astrophysics Data System (ADS)
Choi, Heekyung
2009-12-01
The purpose of this study is to learn about students' perspectives of an undergraduate level information technology (IT) education program. The IT program is a recent effort to create a new educational opportunity for computing in college, with recognition that recent IT developments have had a greater influence on various aspects of people's lives than ever. Students' perspectives are a necessary piece of information to develop this innovative IT education program into a sound educational opportunity. Data were gathered through qualitative in-depth interviews conducted with 28 undergraduate students, most of whom had taken one or more IT classes before. The interview data were analyzed using the grounded theory approach. The analysis found that college students perceived that they were very competent in dealing with IT, primarily due to their continued exposure to computers since youth. However, this perceived competency was not very stable. Students felt that they did not have sufficient IT competency when attention turned to the technical skills of working with IT. They felt similarly when comparing their IT competency with that of their peers, examining it in a class context, and confronting the transition from education to the real world. In spite of their preference for and confidence in self-guided learning, students wanted to receive formal instruction in IT when they needed to learn something difficult, something that they were not very interested in, or something important for their future lives. They also expressed a desire to gain a comprehensive understanding of computers without needing to learn fundamental computing principles. Students' various interests in IT education were dispersed around learning practical technical skills and understanding social implications of IT. Many participants' focus was a mix of the two factors, which was often expressed as an area that dealt with "how humans and computers interact." 
This blended interest suggested a potential defining characteristic for IT education. Students' motivations for pursuing IT education ranged from pure passion to practical considerations. The majority of students expressed mixed motivations, often more strongly inclined to practicality. This finding implied that students' practical considerations, as well as their pure interests, were an important factor to consider in administering an IT program. Participants found that the primary value of the IT program was that it incorporated technological and social topics which had not been well connected previously. Yet balancing the technical and non-technical components in the curriculum also proved to be the most controversial aspect. Students perceived that the weaknesses of the IT program were also associated with its interdisciplinary nature. Students also viewed the topics in the IT program as more closely related to many real world problems than the curricula of typical college education programs. Finally, the analysis revealed that students determined the value of the IT minor program in relation to their majors and career interests. Students took the IT minor to supplement their majors, in terms of their interests in developing their careers beyond formal education. Overall, this investigation showed that students perceived this broad-based education program for IT as an intermediate field that filled a significant niche in college education caused by the recent technological innovations: between technical and social, between school and everyday life, and between formal education and the "real world." The results have practical implications for the development of IT programs in college and for future research directions.
Object-oriented sequence analysis: SCL--a C++ class library.
Vahrson, W; Hermann, K; Kleffe, J; Wittig, B
1996-04-01
SCL (Sequence Class Library) is a class library written in the C++ programming language. Designed using object-oriented programming principles, SCL consists of classes of objects performing tasks typically needed for analyzing DNA or protein sequences. Among them are very flexible sequence classes, classes accessing databases in various formats, classes managing collections of sequences, as well as classes performing higher-level tasks like calculating a pairwise sequence alignment. SCL also includes classes that provide general programming support, like a dynamically growing array, sets, matrices, strings, classes performing file input/output, and utilities for error handling. By providing these components, SCL fosters an explorative programming style: experimenting with algorithms and alternative implementations is encouraged rather than punished. A description of SCL's overall structure as well as an overview of its classes is given. Important aspects of the work with SCL are discussed in the context of a sample program.
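The kinds of components SCL provides, sequence objects plus a pairwise alignment routine, can be illustrated with a minimal sketch. The sketch below is in Python rather than C++, and its class and method names are invented for illustration; they are not SCL's actual API.

```python
# Minimal object-oriented sketch in the spirit of SCL (illustrative names,
# not SCL's C++ interface): a sequence class and a global alignment score.

class Sequence:
    def __init__(self, name, residues):
        self.name = name
        self.residues = residues.upper()

    def __len__(self):
        return len(self.residues)

    def reverse_complement(self):
        comp = {"A": "T", "T": "A", "G": "C", "C": "G"}
        return Sequence(self.name + "_rc",
                        "".join(comp[b] for b in reversed(self.residues)))

def global_alignment_score(a, b, match=1, mismatch=-1, gap=-2):
    """Needleman-Wunsch global alignment score by dynamic programming."""
    rows, cols = len(a) + 1, len(b) + 1
    score = [[0] * cols for _ in range(rows)]
    for i in range(1, rows):
        score[i][0] = i * gap          # leading gaps in b
    for j in range(1, cols):
        score[0][j] = j * gap          # leading gaps in a
    for i in range(1, rows):
        for j in range(1, cols):
            same = a.residues[i - 1] == b.residues[j - 1]
            diag = score[i - 1][j - 1] + (match if same else mismatch)
            score[i][j] = max(diag, score[i - 1][j] + gap, score[i][j - 1] + gap)
    return score[-1][-1]
```

The point of the design, as in SCL, is that higher-level tasks (here the alignment) are written against the sequence abstraction rather than raw strings.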
Future perspectives - proposal for Oxford Physiome Project.
Oku, Yoshitaka
2010-01-01
The Physiome Project is an effort to understand living creatures using an "analysis by synthesis" strategy, i.e., by reproducing their behaviors. In order to achieve its goal, sharing developed models between different computer languages and application programs, to incorporate them into integrated models, is critical. To date, several XML-based markup languages have been developed for this purpose. However, source code written in XML-based languages is very difficult to read and edit using text editors. An alternative is to use an object-oriented meta-language, which can be translated into different computer languages and transplanted into different application programs. Object-oriented languages are suitable for describing structural organization with hierarchical classes and for taking advantage of statistical properties to reduce the number of parameters while preserving the complexity of behaviors. Using object-oriented languages to describe each element, and posting the descriptions to the public domain, should be the next step in building up integrated models of the respiratory control system.
NASA Technical Reports Server (NTRS)
Savely, Robert T.; Loftin, R. Bowen
1990-01-01
Training is a major endeavor in all modern societies. Common training methods include training manuals, formal classes, procedural computer programs, simulations, and on-the-job training. NASA's training approach has focused primarily on on-the-job training in a simulation environment for both crew and ground-based personnel. NASA must explore new approaches to training for the 1990's and beyond. Specific autonomous training systems are described, based on artificial intelligence technology, for use by NASA astronauts, flight controllers, and ground-based support personnel; they offer an alternative to current training systems. In addition to these specific systems, the evolution of a general architecture for autonomous intelligent training systems that integrates many of the features of traditional training programs with artificial intelligence techniques is presented. These Intelligent Computer Aided Training (ICAT) systems would provide much of the same experience that could be gained from the best on-the-job training.
Quantitative Modeling of Earth Surface Processes
NASA Astrophysics Data System (ADS)
Pelletier, Jon D.
This textbook describes some of the most effective and straightforward quantitative techniques for modeling Earth surface processes. By emphasizing a core set of equations and solution techniques, the book presents state-of-the-art models currently employed in Earth surface process research, as well as a set of simple but practical research tools. Detailed case studies demonstrate application of the methods to a wide variety of processes including hillslope, fluvial, aeolian, glacial, tectonic, and climatic systems. Exercises at the end of each chapter begin with simple calculations and then progress to more sophisticated problems that require computer programming. All the necessary computer codes are available online at www.cambridge.org/9780521855976. Assuming some knowledge of calculus and basic programming experience, this quantitative textbook is designed for advanced geomorphology courses and as a reference book for professional researchers in Earth and planetary science looking for a quantitative approach to Earth surface processes.
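One of the simplest quantitative models in this area is linear hillslope diffusion, dz/dt = D d2z/dx2. The sketch below, written independently of the book's online codes, solves it with an explicit finite-difference scheme of the kind the end-of-chapter exercises call for.

```python
# Illustrative sketch (not code from the book): explicit finite-difference
# solution of the linear hillslope diffusion model dz/dt = D * d2z/dx2,
# a classic Earth-surface-process model.

def diffuse_hillslope(z, d, dx, dt, steps):
    """Evolve elevation profile z (list of floats) with fixed boundaries."""
    # The explicit scheme is stable only if d*dt/dx**2 <= 0.5.
    assert d * dt / dx**2 <= 0.5, "time step too large for stability"
    z = list(z)
    for _ in range(steps):
        nxt = z[:]
        for i in range(1, len(z) - 1):
            # Centered second difference approximates d2z/dx2.
            nxt[i] = z[i] + d * dt / dx**2 * (z[i + 1] - 2 * z[i] + z[i - 1])
        z = nxt
    return z
```

Running it on a spike of elevation shows the expected behavior: the peak decays and material spreads to the neighboring nodes.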
Human and Robotic Space Mission Use Cases for High-Performance Spaceflight Computing
NASA Technical Reports Server (NTRS)
Some, Raphael; Doyle, Richard; Bergman, Larry; Whitaker, William; Powell, Wesley; Johnson, Michael; Goforth, Montgomery; Lowry, Michael
2013-01-01
Spaceflight computing is a key resource in NASA space missions and a core determining factor of spacecraft capability, with ripple effects throughout the spacecraft, end-to-end system, and mission. Onboard computing can be aptly viewed as a "technology multiplier" in that advances provide direct dramatic improvements in flight functions and capabilities across the NASA mission classes, and enable new flight capabilities and mission scenarios, increasing science and exploration return. Space-qualified computing technology, however, has not advanced significantly in well over ten years and the current state of the practice fails to meet the near- to mid-term needs of NASA missions. Recognizing this gap, the NASA Game Changing Development Program (GCDP), under the auspices of the NASA Space Technology Mission Directorate, commissioned a study on space-based computing needs, looking out 15-20 years. The study resulted in a recommendation to pursue high-performance spaceflight computing (HPSC) for next-generation missions, and a decision to partner with the Air Force Research Lab (AFRL) in this development.
Standoff reconnaissance imagery - Applications and interpreter training
NASA Astrophysics Data System (ADS)
Gustafson, G. C.
1980-01-01
The capabilities, advantages and applications of Long Range Oblique Photography (LOROP) standoff air reconnaissance cameras are reviewed, with emphasis on the problems likely to be encountered in photo interpreter training. Results of student exercises in descriptive image analysis and mensuration are presented and discussed, and current work on the computer programming of oblique and panoramic mensuration tasks is summarized. Numerous examples of this class of photographs and their interpretation at various magnifications are also presented.
Zhao, Weizhao; Li, Xiping; Chen, Hairong; Manns, Fabrice
2012-01-01
Medical Imaging is a key training component in Biomedical Engineering programs. Medical imaging education is interdisciplinary training, involving physics, mathematics, chemistry, electrical engineering, computer engineering, and applications in biology and medicine. Seeking an efficient teaching method for instructors and an effective learning environment for students has long been a goal for medical imaging education. With the support of NSF grants, we developed the medical imaging teaching software (MITS) and associated dynamic assessment tracking system (DATS). The MITS/DATS system has been applied to junior and senior medical imaging classes through a hybrid teaching model. The results show that students' learning gains improved, particularly in concept understanding and simulation project completion. The results also indicate disparities in subjective perception between junior and senior classes. Three institutions are collaborating to expand the courseware system and plan to apply it to different class settings.
Multithreaded transactions in scientific computing. The Growth06_v2 program
NASA Astrophysics Data System (ADS)
Daniluk, Andrzej
2009-07-01
Writing a concurrent program can be more difficult than writing a sequential program. A programmer needs to think about synchronization, race conditions and shared variables. Transactions help reduce the inconvenience of using threads. A transaction is an abstraction which allows programmers to group a sequence of actions on the program into a logical, higher-level computation unit. This paper presents a new version of the GROWTHGr and GROWTH06 programs.
New version program summary
Program title: GROWTH06_v2
Catalogue identifier: ADVL_v2_1
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADVL_v2_1.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 65 255
No. of bytes in distributed program, including test data, etc.: 865 985
Distribution format: tar.gz
Programming language: Object Pascal
Computer: Pentium-based PC
Operating system: Windows 9x, XP, NT, Vista
RAM: more than 1 MB
Classification: 4.3, 7.2, 6.2, 8, 14
Catalogue identifier of previous version: ADVL_v2_0
Journal reference of previous version: Comput. Phys. Comm. 175 (2006) 678
Does the new version supersede the previous version?: Yes
Nature of problem: The programs compute the RHEED intensities during the growth of thin epitaxial structures prepared using molecular beam epitaxy (MBE). The computations are based on kinematical diffraction theory.
Solution method: Epitaxial growth of thin films is modelled by a set of non-linear differential equations [1]. The Runge-Kutta method with adaptive stepsize control is used to solve the initial value problem for these equations [2].
Reasons for new version: Following users' suggestions, the functionality of the program has been improved. 
Moreover, new use cases have been added which make handling of the program easier and more efficient than before [3].
Summary of revisions: The design pattern (see Fig. 2 of Ref. [3]) has been modified according to the scheme shown in Fig. 1. The graphical user interface (GUI) has been reconstructed; Fig. 2 presents a hybrid diagram of the GUI that shows how onscreen objects connect to use cases. The program has been compiled with English/USA regional and language options. Note: the figures mentioned above are contained in the program distribution file.
Unusual features: The program is distributed as the source project GROWTH06_v2.dpr with associated files, and should be compiled using Borland Delphi compilers version 6 or later (including Borland Developer Studio 2006 and CodeGear compilers for Delphi).
Additional comments: Two figures are included in the program distribution file, captioned "Static classes model for Transaction design pattern" and "A model of a window that shows how onscreen objects connect to use cases".
Running time: The typical running time is machine and user-parameter dependent.
References:
[1] A. Daniluk, Comput. Phys. Comm. 170 (2005) 265.
[2] W.H. Press, B.P. Flannery, S.A. Teukolsky, W.T. Vetterling, Numerical Recipes in Pascal: The Art of Scientific Computing, first ed., Cambridge University Press, 1989.
[3] M. Brzuszek, A. Daniluk, Comput. Phys. Comm. 175 (2006) 678.
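The solution method named above, Runge-Kutta with adaptive stepsize control, can be sketched compactly. The sketch below uses classic RK4 with step doubling, in Python (the program itself is written in Object Pascal); it is illustrative rather than the program's actual integrator.

```python
# Sketch of Runge-Kutta integration with adaptive stepsize control via
# step doubling (illustrative; not the GROWTH06_v2 integrator).

def rk4_step(f, t, y, h):
    """One classical fourth-order Runge-Kutta step for dy/dt = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

def integrate(f, t0, y0, t_end, h=0.1, tol=1e-8):
    """Integrate a scalar ODE, shrinking h when the step-doubling error
    estimate exceeds tol and growing it when the error is comfortably small."""
    t, y = t0, y0
    while t < t_end - 1e-12:
        h = min(h, t_end - t)
        full = rk4_step(f, t, y, h)
        half = rk4_step(f, t + h / 2, rk4_step(f, t, y, h / 2), h / 2)
        err = abs(half - full)       # error estimate from step doubling
        if err > tol:
            h /= 2                   # reject the step, retry smaller
            continue
        t, y = t + h, half           # accept the more accurate estimate
        if err < tol / 32:
            h *= 2                   # error well under tolerance: grow step
    return y
```

For dy/dt = y starting from y(0) = 1, the controller settles on a step size small enough to track the exponential to within the requested tolerance.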
Indirect addressing and load balancing for faster solution to Mandelbrot Set on SIMD architectures
NASA Technical Reports Server (NTRS)
Tomboulian, Sherryl
1989-01-01
SIMD computers with local indirect addressing allow programs to have queues and buffers, making certain kinds of problems much more efficient. Examined here is a class of problems characterized by computations on data points where the computation is identical but the convergence rate is data dependent. Normally, in this situation, the algorithm time is governed by the maximum number of iterations required by any point. Using indirect addressing allows a processor to proceed to the next data point when it is done, reducing the overall iteration count toward the mean convergence rate when a sufficiently large problem set is solved. Load balancing techniques can be applied for additional performance improvement. Simulations of this technique applied to computing Mandelbrot sets indicate significant performance gains.
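The workload property being exploited, identical computation with data-dependent convergence, is easy to demonstrate with the Mandelbrot example itself. In the plain-Python sketch below, a lockstep (SIMD-style) batch pays the maximum escape-time count while a work queue fed by indirect addressing pays roughly the mean; the grid of sample points is arbitrary.

```python
# Mandelbrot escape-time counts are data dependent: lockstep SIMD execution
# costs the *maximum* count per batch, while letting finished processors
# fetch new points costs roughly the *mean*. Plain Python, for illustration.

def escape_iterations(c, max_iter=100):
    """Number of iterations of z <- z*z + c before |z| exceeds 2."""
    z = 0j
    for n in range(max_iter):
        if abs(z) > 2.0:
            return n
        z = z * z + c
    return max_iter

# Arbitrary grid of sample points covering part of the complex plane.
points = [complex(re / 10, im / 10)
          for re in range(-20, 10) for im in range(-12, 13)]
counts = [escape_iterations(c) for c in points]

lockstep_cost = max(counts)              # every lane waits for the slowest
queued_cost = sum(counts) / len(counts)  # finished lanes take new points
print(lockstep_cost, round(queued_cost, 1))
```

Interior points never escape and hit the iteration cap, so the lockstep cost is pinned at the maximum while the queued cost sits near the mean, which is the gap the paper's indirect-addressing scheme recovers.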
GLAD: a system for developing and deploying large-scale bioinformatics grid.
Teo, Yong-Meng; Wang, Xianbing; Ng, Yew-Kwong
2005-03-01
Grid computing is used to solve large-scale bioinformatics problems with gigabyte-scale databases by distributing the computation across multiple platforms. Until now, in developing bioinformatics grid applications it has been extremely tedious to design and implement the component algorithms and parallelization techniques for different classes of problems, and to access remotely located sequence database files of varying formats across the grid. In this study, we propose a grid programming toolkit, GLAD (Grid Life sciences Applications Developer), which facilitates the development and deployment of bioinformatics applications on a grid. GLAD has been developed using ALiCE (Adaptive scaLable Internet-based Computing Engine), a Java-based grid middleware that exploits task-based parallelism. Two benchmark bioinformatics applications, distributed sequence comparison and distributed progressive multiple sequence alignment, have been developed using GLAD.
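The task-based parallelism GLAD exploits can be sketched generically: each pairwise sequence comparison is an independent task farmed out to a pool of workers. The sketch below uses Python's standard concurrent.futures and a toy similarity kernel, not GLAD's actual Java/ALiCE API; the sequence data are invented.

```python
# Hedged sketch of task-based parallel sequence comparison (in the spirit
# of GLAD's distributed sequence comparison, but using Python's standard
# library rather than the Java/ALiCE middleware; data are invented).

from concurrent.futures import ThreadPoolExecutor
from itertools import combinations

def hamming_similarity(a, b):
    """Toy comparison kernel: fraction of matching positions."""
    n = min(len(a), len(b))
    return sum(x == y for x, y in zip(a, b)) / n

database = {"s1": "ACGTACGT", "s2": "ACGTTCGT", "s3": "TTTTACGT"}

# Each pair is an independent task; the executor plays the role of the grid.
with ThreadPoolExecutor(max_workers=4) as pool:
    pairs = list(combinations(sorted(database), 2))
    scores = dict(zip(pairs, pool.map(
        lambda p: hamming_similarity(database[p[0]], database[p[1]]), pairs)))

for pair, score in sorted(scores.items()):
    print(pair, round(score, 3))
```

Because the tasks share nothing, the same decomposition scales from a thread pool on one machine to the cross-platform grid setting the paper targets.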
Zalkind, D; Malec, B
1988-01-01
A national survey of alumni of AUPHA programs from the classes of 1983, 1984, and 1985 was undertaken to assess their experiences in management information systems education, both formally and on the job. The survey covered 38 AUPHA graduate member programs and resulted in 1,181 responses. Over 40 percent of the alumni indicated that they had had an introductory management information systems (MIS) course in a health administration program. Since graduation, almost 90 percent have had some significant on-the-job involvement with computers, computer-generated information, or MIS. More than one-third of the respondents felt that their MIS course work did not adequately prepare them for what was expected on the job. Alumni stressed that microcomputer software applications, such as spreadsheets and data bases, are important areas for student hands-on experiences. When asked the importance of certain areas to be included in a required introductory MIS course, the alumni also recommended spreadsheet analysis and design, report writing and data presentation, and other management areas. Additional comments suggested more access to personal computers (PCs), more relevance in the curriculum to the "real world," and the importance of MIS to the career paths of alumni. Faculty suggestions from a 1984-85 survey are compared with alumni responses in order to identify curricular changes needed. Recommendations are outlined for consideration.
PGOPHER in the Classroom and the Laboratory
NASA Astrophysics Data System (ADS)
Western, Colin
2015-06-01
PGOPHER is a general purpose program for simulating and fitting rotational, vibrational and electronic spectra. As it uses a graphical user interface, the basic operation is sufficiently straightforward to make it suitable for use in undergraduate practicals and computer based classes. This talk will present two experiments that have been in regular use by Bristol undergraduates for some years, based on the analysis of infra-red spectra of cigarette smoke and, for more advanced students, visible and near ultra-violet spectra of a nitrogen discharge and a hydrocarbon flame. For all of these the rotational structure is analysed and used to explore ideas of bonding. The talk will discuss the requirements for the apparatus and the support required. Ideas for other possible experiments and computer based exercises will also be presented, including a group exercise. The PGOPHER program is open source, and is available for Microsoft Windows, Apple Mac and Linux. It can be freely downloaded from the supporting website http://pgopher.chm.bris.ac.uk. The program does not require any installation process, so can be run on students' own machines or easily set up on classroom or laboratory computers. PGOPHER, a Program for Simulating Rotational, Vibrational and Electronic Structure, C. M. Western, University of Bristol, http://pgopher.chm.bris.ac.uk PGOPHER version 8.0, C M Western, 2014, University of Bristol Research Data Repository, doi:10.5523/bris.huflggvpcuc1zvliqed497r2
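The rotational analysis in practicals like these rests on the rigid-rotor energy expression E(J) = B*J*(J+1), whose R-branch lines are spaced by roughly 2B. A minimal sketch of the arithmetic (the B value is illustrative, roughly that of CO, and not taken from the talk):

```python
# Rigid-rotor R-branch line positions from E(J) = B*J*(J+1).
# The rotational constant here (~1.93 cm^-1, roughly CO) is illustrative.

def r_branch_lines(b, j_max):
    """R branch: J -> J+1, so line position = E(J+1) - E(J) = 2B(J+1)."""
    energy = lambda j: b * j * (j + 1)
    return [energy(j + 1) - energy(j) for j in range(j_max)]

lines = r_branch_lines(b=1.93, j_max=5)
print([round(x, 2) for x in lines])
```

The near-constant ~2B spacing of successive lines is what lets students extract B, and hence the bond length, from a measured spectrum.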
The contribution of dance to daily physical activity among adolescent girls
2011-01-01
Background Structured physical activity (PA) programs are well positioned to promote PA among youth; however, little is known about these programs, particularly dance classes. The aims of this study were to: 1) describe PA levels of girls enrolled in dance classes, 2) determine the contribution of dance classes to total moderate-to-vigorous physical activity (MVPA), and 3) compare PA between days with a dance class (program days) and days without a dance class (non-program days). Methods Participants were 149 girls (11-18 years) enrolled in dance classes in 11 dance studios. Overall PA was assessed with accelerometry for 8 consecutive days, and girls reported when they attended dance classes during those days. The percent contribution of dance classes to total MVPA was calculated, and data were reduced to compare PA on program days to non-program days. Data were analyzed using mixed models, adjusting for total monitoring time. Results Girls engaged in 25.0 ± 0.9 minutes/day of MVPA. Dance classes contributed 28.7% (95% CI: 25.9%-31.6%) to girls' total MVPA. Girls accumulated more MVPA on program (28.7 ± 1.4 minutes/day) than non-program days (16.4 ± 1.5 minutes/day) (p < 0.001). Girls had less sedentary behavior on program (554.0 ± 8.1 minutes/day) than non-program days (600.2 ± 8.7 minutes/day) (p < 0.001). Conclusions Dance classes contributed a substantial proportion (29%) to girls' total MVPA, and girls accumulated 70% more MVPA and 8% less sedentary behavior on program days than on non-program days. Dance classes can make an important contribution to girls' total physical activity. PMID:21816074
A Class for Teachers Featuring a NASA Satellite Mission
NASA Astrophysics Data System (ADS)
Battle, R.; Hawkins, I.
1996-05-01
As part of the NASA IDEA (Initiative to Develop Education through Astronomy) program, the UC Berkeley Center for EUV Astrophysics (CEA) received a grant to develop a self-contained teacher professional development class featuring NASA's Extreme Ultraviolet Explorer (EUVE) satellite mission. This class was offered in collaboration with the Physics/Astronomy Department and the Education Department of San Francisco State University during 1994, and in collaboration with the UCB Graduate School of Education in 1995 as an extension course. The class served as the foundation for the Science Education Program at CEA, providing valuable lessons and experience through a full year of intense collaboration with 50 teachers from the diverse school districts of the San Francisco Bay Area teaching in the 3rd--12th grade range. The underlying theme of the class focused on how scientists carry out research using a NASA satellite mission. Emphasis was given to problem-solving techniques, with specific examples taken from the pre- and post-launch stages of the EUVE mission. The two semester-long classes were hosted by the CEA, so the teachers spent an average of 4 hours/week during 17 weeks immersed in astrophysics, collaborating with astronomers, and working with colleagues from the Lawrence Hall of Science and the Graduate School of Education. The teachers were taught the computer skills and space astrophysics concepts needed to perform hands-on analysis and interpretation of the EUVE satellite data and the optical identification program. As a final project, groups of teachers developed lesson plans based on NASA and other resources that they posted on the World Wide Web using html. This project's model treats teachers as professionals, and allows them to collaborate with scientists and to hone their curriculum development skills, an important aspect of their professional growth. We will summarize class highlights and showcase teacher-developed lesson plans. 
A detailed evaluation report will be made available. We acknowledge NASA contracts NAS5-30180 and NAS5-29298 to CEA/UCB and NASA grant ED-90033.01-94A to SSL/UCB.
Logic integer programming models for signaling networks.
Haus, Utz-Uwe; Niermann, Kathrin; Truemper, Klaus; Weismantel, Robert
2009-05-01
We propose a static and a dynamic approach to model biological signaling networks, and show how each can be used to answer relevant biological questions. For this, we use the two different mathematical tools of Propositional Logic and Integer Programming. The power of discrete mathematics for handling qualitative as well as quantitative data has so far not been exploited in molecular biology, which is mostly driven by experimental research relying on first-order or statistical models. The arising logic statements and integer programs are analyzed and can be solved with standard software. For a restricted class of problems, the logic models reduce to a satisfiability problem that can be solved in polynomial time. Additionally, a more dynamic model enables enumeration of possible time resolutions in poly-logarithmic time. Computational experiments are included.
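The core device, translating a propositional rule into a linear constraint over 0/1 variables, can be sketched without a solver. In the toy example below, "A AND B implies C" becomes x_C >= x_A + x_B - 1; the two-rule network and the brute-force enumeration are invented for illustration and are not the authors' model.

```python
# Illustrative encoding of propositional signaling rules as 0/1 linear
# constraints, checked by brute force (not the authors' actual model).

from itertools import product

def and_implies(a, b, c):
    # x_c >= x_a + x_b - 1  encodes  (a AND b) -> c  over 0/1 variables
    return c >= a + b - 1

def is_not(a, b):
    # x_b = 1 - x_a  encodes  b = NOT a
    return b == 1 - a

# Toy network: (A AND B) -> C, and D = NOT C.
names = ["A", "B", "C", "D"]
constraints = [
    lambda s: and_implies(s["A"], s["B"], s["C"]),
    lambda s: is_not(s["C"], s["D"]),
]

# Enumerate all 0/1 assignments and keep those satisfying every constraint.
states = [dict(zip(names, bits)) for bits in product((0, 1), repeat=4)
          if all(c(dict(zip(names, bits))) for c in constraints)]
print(len(states), "consistent states")
```

For realistic networks the enumeration is replaced by an integer-programming or satisfiability solver, which is exactly where the paper's standard-software remark applies.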
Particle-In-Cell simulations of high pressure plasmas using graphics processing units
NASA Astrophysics Data System (ADS)
Gebhardt, Markus; Atteln, Frank; Brinkmann, Ralf Peter; Mussenbrock, Thomas; Mertmann, Philipp; Awakowicz, Peter
2009-10-01
Particle-In-Cell (PIC) simulations are widely used to understand the fundamental phenomena in low-temperature plasmas; plasmas at very low gas pressures in particular are studied using PIC methods. The inherent drawback of these methods is that they are very time consuming, because certain stability conditions have to be satisfied. This holds even more for the PIC simulation of high pressure plasmas, due to the very high collision rates. The simulations take far too long on standard computers and require computer clusters or supercomputers. Recent advances in the field of graphics processing units (GPUs) provide every personal computer with a highly parallel multiprocessor architecture for very little money. This architecture is freely programmable and can be used to implement a wide class of problems. In this paper we present the concepts of a fully parallel PIC simulation of high pressure plasmas exploiting the benefits of GPU programming.
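The structure being parallelized is the standard PIC cycle: gather fields to particles, push the particles, then scatter charge back onto the grid. A plain-Python sketch of one such cycle (1D, periodic boundaries, nearest-grid-point weighting) is shown below; it stands in for the GPU kernels described and is not the authors' code.

```python
# Structural sketch of one Particle-In-Cell gather-push-scatter cycle
# (1D, periodic, nearest-grid-point weighting). Illustrative only.

def pic_step(x, v, q_over_m, grid_e, lx, dt):
    """Advance particles at positions x with velocities v through one
    time step dt in the fixed grid field grid_e on a domain of length lx."""
    ng = len(grid_e)
    dx = lx / ng
    new_x, new_v = [], []
    rho = [0.0] * ng
    for xi, vi in zip(x, v):
        cell = int(xi / dx) % ng
        vi = vi + q_over_m * grid_e[cell] * dt   # gather + velocity push
        xi = (xi + vi * dt) % lx                 # position push, periodic wrap
        rho[int(xi / dx) % ng] += 1.0 / dx       # scatter: NGP charge deposit
        new_x.append(xi)
        new_v.append(vi)
    return new_x, new_v, rho
```

Each stage is an independent loop over particles or cells, which is why the cycle maps well onto the massively parallel GPU architecture the paper targets (with the scatter step needing atomic updates in a real GPU kernel).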
DOE Office of Scientific and Technical Information (OSTI.GOV)
Langer, S; Rotman, D; Schwegler, E
The Institutional Computing Executive Group (ICEG) review of FY05-06 Multiprogrammatic and Institutional Computing (M and IC) activities is presented in the attached report. In summary, we find that the M and IC staff does an outstanding job of acquiring and supporting a wide range of institutional computing resources to meet the programmatic and scientific goals of LLNL. The responsiveness and high quality of support given to users and the programs investing in M and IC reflects the dedication and skill of the M and IC staff. M and IC has successfully managed serial capacity, parallel capacity, and capability computing resources. Serial capacity computing supports a wide range of scientific projects which require access to a few high performance processors within a shared memory computer. Parallel capacity computing supports scientific projects that require a moderate number of processors (up to roughly 1000) on a parallel computer. Capability computing supports parallel jobs that push the limits of simulation science. M and IC has worked closely with Stockpile Stewardship, and together they have made LLNL a premier institution for computational and simulation science. Such a standing is vital to the continued success of laboratory science programs and to the recruitment and retention of top scientists. This report provides recommendations to build on M and IC's accomplishments and improve simulation capabilities at LLNL. 
We recommend that the institution fully fund (1) operation of the atlas cluster purchased in FY06 to support a few large projects; (2) operation of the thunder and zeus clusters to enable 'mid-range' parallel capacity simulations during normal operation and a limited number of large simulations during dedicated application time; (3) operation of the new yana cluster to support a wide range of serial capacity simulations; (4) improvements to the reliability and performance of the Lustre parallel file system; (5) support for the new GDO petabyte-class storage facility on the green network for use in data intensive external collaborations; and (6) continued support for visualization and other methods for analyzing large simulations. We also recommend that M and IC begin planning in FY07 for the next upgrade of its parallel clusters. LLNL investments in M and IC have resulted in a world-class simulation capability leading to innovative science. We thank the LLNL management for its continued support and thank the M and IC staff for its vision and dedicated efforts to make it all happen.
The identification of trends in outgassing technology
NASA Technical Reports Server (NTRS)
Colony, J. A.
1980-01-01
A large amount of chemical analysis data involving the identification of outgassing products from spacecraft, experiment modules, and support equipment, accumulated at the Goddard Space Flight Center over the past ten years, was reduced to a computer-compatible format and subjected to a variety of relevant program operations. From these data a list of the most troublesome outgassing species was compiled, and several useful and interesting materials correlations were developed. The frequency-of-occurrence totals show that in aerospace programs, di(2-ethyl hexyl) phthalate (DEHP) is the individual species most often found in outgassing samples and that esters are the leading generic class of compounds. The effectiveness of this data bank was demonstrated by the good correlations between materials and their outgassing products for solar panel bakeouts and cable bakeouts. However, trends in frequency of occurrence were demonstrated for many compounds where no correlation could be established. In the case of the class of compounds called aliphatic hydrocarbons, it is shown that the number of instances of significant outgassing due to these materials is increasing.
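The frequency-of-occurrence tallies described amount to counting identified species across many samples, which a few lines of Python make concrete. The sample lists below are invented; only DEHP and the compound classes are named in the study.

```python
# Sketch of a frequency-of-occurrence tally over outgassing samples.
# The sample data are invented for illustration.

from collections import Counter

samples = [
    ["DEHP", "aliphatic hydrocarbons"],
    ["DEHP", "esters"],
    ["silicones", "DEHP", "esters"],
]

# Count how many samples each identified species appears in.
species_counts = Counter(s for sample in samples for s in sample)
most_common = species_counts.most_common(1)[0]
print(most_common)
```

The same tally, run over a decade of real analyses, is what surfaced DEHP as the most frequently found individual species.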
47 CFR 15.101 - Equipment authorization of unintentional radiators.
Code of Federal Regulations, 2014 CFR
2014-10-01
.... TV interface device Declaration of Conformity or Certification. Cable system terminal device Declaration of Conformity. Stand-alone cable input selector switch Verification. Class B personal computers... used with Class B personal computers Declaration of Conformity or Certification. 1 Class B personal...
HTMT-class Latency Tolerant Parallel Architecture for Petaflops Scale Computation
NASA Technical Reports Server (NTRS)
Sterling, Thomas; Bergman, Larry
2000-01-01
Computational Aero Sciences and other numerically intensive computation disciplines demand computing throughputs substantially greater than the Teraflops-scale systems only now becoming available. The related fields of fluids, structures, thermal, combustion, and dynamic controls are among the interdisciplinary areas that, in combination with sufficient resolution and advanced adaptive techniques, may force performance requirements towards Petaflops. This will be especially true for compute-intensive models such as Navier-Stokes, or when such system models are only part of a larger design optimization computation involving many design points. Yet recent experience with conventional MPP configurations comprising commodity processing and memory components has shown that larger scale frequently results in higher programming difficulty and lower system efficiency. While important advances in system software and algorithm techniques have had some impact on efficiency and programmability for certain classes of problems, in general it is unlikely that software alone will resolve the challenges to higher scalability. As in the past, future generations of high-end computers may require a combination of hardware architecture and system software advances to enable efficient operation at a Petaflops level. The NASA-led HTMT project has engaged the talents of a broad interdisciplinary team to develop a new strategy in high-end system architecture to deliver petaflops-scale computing in the 2004/5 timeframe. The Hybrid-Technology MultiThreaded parallel computer architecture incorporates several advanced technologies in combination with an innovative dynamic adaptive scheduling mechanism to provide unprecedented performance and efficiency within practical constraints of cost, complexity, and power consumption.
The emerging superconductor Rapid Single Flux Quantum electronics can operate at 100 GHz (the record is 770 GHz) and at one percent of the power required by conventional semiconductor logic. Wave Division Multiplexing optical communications can approach a peak per-fiber bandwidth of 1 Tbps, and the new Data Vortex network topology employing this technology can connect tens of thousands of ports, providing a bi-section bandwidth on the order of a Petabyte per second with latencies well below 100 nanoseconds, even under heavy loads. Processor-in-Memory (PIM) technology combines logic and memory on the same chip, exposing the internal bandwidth of the memory row buffers at low latency. And holographic photorefractive storage technologies provide high-density memory with access a thousand times faster than conventional disk technologies. Together these technologies enable a new class of shared memory system architecture with a peak performance in the range of a Petaflops but size and power requirements comparable to today's largest Teraflops-scale systems. To achieve high sustained performance, HTMT combines an advanced multithreading processor architecture with a memory-driven coarse-grained latency management strategy called "percolation", yielding high efficiency while reducing much of the parallel programming burden. This paper will present the basic system architecture characteristics made possible through this series of advanced technologies and then give a detailed description of the new percolation approach to runtime latency management.
NASA Astrophysics Data System (ADS)
Orr, C. H.; Mcfadden, R. R.; Manduca, C. A.; Kempler, L. A.
2016-12-01
Teaching with data, simulations, and models in the geosciences can increase many facets of student success in the classroom and in the workforce. Teaching undergraduates about programming and improving students' quantitative and computational skills expands their perception of Geoscience beyond field-based studies. Processing data and developing quantitative models are critically important for Geoscience students. Students need to be able to perform calculations, analyze data, create numerical models and visualizations, and more deeply understand complex systems: all essential aspects of modern science. These skills require students to have comfort and skill with languages and tools such as MATLAB. To achieve comfort and skill, computational and quantitative thinking must build over a 4-year degree program, across courses and disciplines. However, in courses focused on Geoscience content, it can be challenging to get students comfortable with using computational methods to answer Geoscience questions. To help bridge this gap, we have partnered with MathWorks to develop two workshops focused on collecting and developing strategies and resources to help faculty teach students to incorporate data, simulations, and models into the curriculum at the course and program levels. We brought together faculty members from the sciences, including Geoscience and allied fields, who teach computation and quantitative thinking skills using MATLAB to build a resource collection for teaching. These materials and the outcomes of the workshops are freely available on our website. The workshop outcomes include a collection of teaching activities, essays, and course descriptions that can help faculty incorporate computational skills at the course or program level. The teaching activities include in-class assignments, problem sets, labs, projects, and toolboxes. These activities range from programming assignments to creating and using models.
The outcomes also include workshop syntheses that highlight best practices, a set of webpages to support teaching with software such as MATLAB, and an interest group actively discussing aspects of these issues in Geoscience and allied fields. Learn more and view the resources at http://serc.carleton.edu/matlab_computation2016/index.html
NASA Technical Reports Server (NTRS)
Denning, Peter J.
1989-01-01
In 1983 and 1984, the Infrared Astronomical Satellite (IRAS) detected 5,425 stellar objects and measured their infrared spectra. In 1987 a program called AUTOCLASS used Bayesian inference methods to discover the classes present in these data and determine the most probable class of each object, revealing unknown phenomena in astronomy. AUTOCLASS has rekindled the old debate on the suitability of Bayesian methods, which are computationally intensive, interpret probabilities as plausibility measures rather than frequencies, and appear to depend on a subjective assessment of the probability of a hypothesis before the data were collected. Modern statistical methods have, however, recently been shown to also depend on subjective elements. These debates bring into question the whole tradition of scientific objectivity and offer scientists a new way to take responsibility for their findings and conclusions.
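The core Bayesian step, assigning each object the most probable class given a prior and a likelihood, can be sketched in miniature (toy numbers only; AUTOCLASS itself fits full mixture models to the IRAS spectra):

```python
# Minimal Bayesian class assignment: posterior is proportional to
# prior x likelihood, normalized over all classes. The class labels
# and probabilities below are illustrative, not IRAS-derived.
def most_probable_class(priors, likelihoods):
    # priors: {class: P(class)}, likelihoods: {class: P(data | class)}
    unnorm = {c: priors[c] * likelihoods[c] for c in priors}
    z = sum(unnorm.values())
    posterior = {c: p / z for c, p in unnorm.items()}
    return max(posterior, key=posterior.get), posterior

# A strong likelihood for class "B" overrides its smaller prior
cls, post = most_probable_class({"A": 0.7, "B": 0.3}, {"A": 0.1, "B": 0.9})
```

Note how the prior encodes the "subjective assessment before the data were collected" that the debate turns on: with a flat prior, the likelihood alone would decide.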
MNPBEM - A Matlab toolbox for the simulation of plasmonic nanoparticles
NASA Astrophysics Data System (ADS)
Hohenester, Ulrich; Trügler, Andreas
2012-02-01
MNPBEM is a Matlab toolbox for the simulation of metallic nanoparticles (MNP), using a boundary element method (BEM) approach. The main purpose of the toolbox is to solve Maxwell's equations for a dielectric environment where bodies with homogeneous and isotropic dielectric functions are separated by abrupt interfaces. Although the approach is in principle suited for arbitrary body sizes and photon energies, it is tested (and probably works best) for metallic nanoparticles with sizes ranging from a few to a few hundreds of nanometers, and for frequencies in the optical and near-infrared regime. The toolbox has been implemented with Matlab classes. These classes can be easily combined, which has the advantage that one can adapt the simulation programs flexibly for various applications.
Program summary
Program title: MNPBEM
Catalogue identifier: AEKJ_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEKJ_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: GNU General Public License v2
No. of lines in distributed program, including test data, etc.: 15 700
No. of bytes in distributed program, including test data, etc.: 891 417
Distribution format: tar.gz
Programming language: Matlab 7.11.0 (R2010b)
Computer: Any which supports Matlab 7.11.0 (R2010b)
Operating system: Any which supports Matlab 7.11.0 (R2010b)
RAM: ⩾1 GByte
Classification: 18
Nature of problem: Solve Maxwell's equations for dielectric particles with homogeneous dielectric functions separated by abrupt interfaces.
Solution method: Boundary element method using electromagnetic potentials.
Running time: Depending on surface discretization, between seconds and hours.
Video-Game-Like Engine for Depicting Spacecraft Trajectories
NASA Technical Reports Server (NTRS)
Upchurch, Paul R.
2009-01-01
GoView is a video-game-like software engine, written in the C and C++ computing languages, that enables real-time, three-dimensional (3D)-appearing visual representation of spacecraft and trajectories (1) from any perspective; (2) at any spatial scale from spacecraft to Solar-system dimensions; (3) in user-selectable time scales; (4) in the past, present, and/or future; (5) with varying speeds; and (6) forward or backward in time. GoView constructs an interactive 3D world by use of spacecraft-mission data from pre-existing engineering software tools. GoView can also be used to produce distributable application programs for depicting NASA orbital missions on personal computers running the Windows XP, Mac OS X, and Linux operating systems. GoView enables seamless rendering of Cartesian coordinate spaces with programmable graphics hardware, whereas prior programs for depicting spacecraft trajectories variously require non-Cartesian coordinates and/or are not compatible with programmable hardware. GoView incorporates an algorithm for nonlinear interpolation between arbitrary reference frames, whereas the prior programs are restricted to special classes of inertial and non-inertial reference frames. Finally, whereas the prior programs present complex user interfaces requiring hours of training, the GoView interface provides guidance, enabling use without any training.
Archer, Charles J.; Faraj, Ahmad A.; Inglett, Todd A.; Ratterman, Joseph D.
2012-10-23
Methods, apparatus, and products are disclosed for providing nearest neighbor point-to-point communications among compute nodes of an operational group in a global combining network of a parallel computer, each compute node connected to each adjacent compute node in the global combining network through a link, that include: identifying each link in the global combining network for each compute node of the operational group; designating one of a plurality of point-to-point class routing identifiers for each link such that no compute node in the operational group is connected to two adjacent compute nodes in the operational group with links designated for the same class routing identifiers; and configuring each compute node of the operational group for point-to-point communications with each adjacent compute node in the global combining network through the link between that compute node and that adjacent compute node using that link's designated class routing identifier.
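The constraint that no compute node has two adjacent links designated with the same class routing identifier is exactly a proper edge coloring of the network graph. A greedy sketch illustrates the idea (a hypothetical helper, not the patented designation procedure, which is specific to the global combining network):

```python
# Greedily designate a class routing identifier for each link so that no
# node ends up with two incident links sharing an identifier. Nodes and
# links here form a toy 4-node ring, not an actual combining network.
def assign_class_routes(edges):
    used = {}    # node -> identifiers already designated on its links
    routes = {}  # link (u, v) -> designated class routing identifier
    for u, v in edges:
        taken = used.setdefault(u, set()) | used.setdefault(v, set())
        ident = next(i for i in range(len(edges) + 1) if i not in taken)
        routes[(u, v)] = ident
        used[u].add(ident)
        used[v].add(ident)
    return routes

# 4-node ring: every node has two links, so two identifiers suffice
routes = assign_class_routes([(0, 1), (1, 2), (2, 3), (3, 0)])
```
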
Strauch, Konstantin; Baur, Max P; Wienker, Thomas F
2004-01-01
We present a recoding scheme that allows for a parametric multipoint X-chromosomal linkage analysis of dichotomous traits in the context of a computer program for autosomes that can use trait models with imprinting. Furthermore, with this scheme, it is possible to perform a joint multipoint analysis of X-linked and pseudoautosomal loci. It is required that (1) the marker genotypes of all female nonfounders are available and that (2) there are no male nonfounders who have daughters in the pedigree. The second requirement does not apply if the trait locus is pseudoautosomal. The X-linked marker loci are recoded by adding a dummy allele to the males' hemizygous genotypes. For modelling an X-linked trait locus, five different liability classes are defined, in conjunction with a paternal imprinting model for male nonfounders. The formulation aims at the mapping of a diallelic trait locus relative to an arbitrary number of codominant markers with known genetic distances, in cases where a program for a genuine X-chromosomal analysis is not available. 2004 S. Karger AG, Basel.
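The dummy-allele step can be sketched directly: males carry one X allele at a marker, so a placeholder second allele makes the genotype look diploid to an autosomal program (a minimal sketch; the allele labels and the "0" dummy symbol are illustrative, not the paper's exact encoding):

```python
# Recode an X-linked marker genotype for an autosomal linkage program:
# males are hemizygous on X, so a dummy allele is appended to make
# their genotypes formally diploid; female genotypes pass through.
DUMMY = "0"  # illustrative dummy-allele symbol

def recode_x_genotype(sex, alleles):
    if sex == "male":
        assert len(alleles) == 1, "males are hemizygous on X"
        return (alleles[0], DUMMY)
    return tuple(alleles)  # females already carry two X alleles
```
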
NASA Astrophysics Data System (ADS)
Zielinski, Theresa Julia; Brooks, David W.; Crippen, Kent J.; March, Joe L.
2001-06-01
Time management is an important issue for teachers and students. This article discusses teachers' use of time from the perspective of curriculum and instruction. Average high school students spend fewer than 5 hours per week in outside-of-class study; average college students spend about 20 hours. Procrastination, often viewed in a negative light by teachers, usually pays off so well for college students that seniors become better at it than freshmen. Three suggestions for designing instruction are: test early and often; do not waste the best students' time in an effort to improve overall performance; and use engaging activities that motivate students to give of their time. The impact of computers on curricula is a double-edged sword. Time must be devoted to teaching the use of applications, but the programs reduce busywork. Will this turn out to be a simple tradeoff, or will the programs make us much more efficient so that less time is required? Will computer programs ultimately lead to an expanded criterion for expertise, thus demanding even more time to become an expert? These issues are described and suggestions for controlling time during instruction are provided.
PERFORMANCE OF A COMPUTER-BASED ASSESSMENT OF COGNITIVE FUNCTION MEASURES IN TWO COHORTS OF SENIORS
Espeland, Mark A.; Katula, Jeffrey A.; Rushing, Julia; Kramer, Arthur F.; Jennings, Janine M.; Sink, Kaycee M.; Nadkarni, Neelesh K.; Reid, Kieran F.; Castro, Cynthia M.; Church, Timothy; Kerwin, Diana R.; Williamson, Jeff D.; Marottoli, Richard A.; Rushing, Scott; Marsiske, Michael; Rapp, Stephen R.
2013-01-01
Background Computer-administered assessment of cognitive function is being increasingly incorporated in clinical trials; however, its performance in these settings has not been systematically evaluated. Design The Seniors Health and Activity Research Program (SHARP) pilot trial (N=73) developed a computer-based tool for assessing memory performance and executive functioning. The Lifestyle Interventions and Independence for Seniors (LIFE) investigators incorporated this battery in a full-scale multicenter clinical trial (N=1635). We describe relationships that test scores have with those from interviewer-administered cognitive function tests and risk factors for cognitive deficits, and describe performance measures (completeness, intra-class correlations). Results Computer-based assessments of cognitive function had consistent relationships across the pilot and full-scale trial cohorts with interviewer-administered assessments of cognitive function, age, and a measure of physical function. In the LIFE cohort, their external validity was further demonstrated by associations with other risk factors for cognitive dysfunction: education, hypertension, diabetes, and physical function. Acceptable levels of data completeness (>83%) were achieved on all computer-based measures; however, rates of missing data were higher among older participants (odds ratio=1.06 for each additional year; p<0.001) and those who reported no current computer use (odds ratio=2.71; p<0.001). Intra-class correlations among clinics were at least as low (ICC≤0.013) as for interviewer measures (ICC≤0.023), reflecting good standardization. All cognitive measures loaded onto the first principal component (global cognitive function), which accounted for 40% of the overall variance. Conclusion Our results support the use of computer-based tools for assessing cognitive function in multicenter clinical trials of older individuals. PMID:23589390
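The intra-class correlation used above to gauge standardization across clinics can be estimated with a one-way ANOVA formula; a minimal sketch, assuming equal group sizes and using illustrative scores rather than trial data:

```python
# One-way ANOVA intra-class correlation: ICC = (MSB - MSW) / (MSB + (n0-1)*MSW).
# Near-zero (or negative) values mean between-clinic variance is negligible,
# i.e. good standardization. Assumes equal group sizes for simplicity.
def icc_oneway(groups):
    k = len(groups)                        # number of clinics
    n = sum(len(g) for g in groups)        # total participants
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    ms_between = ss_between / (k - 1)
    ms_within = ss_within / (n - k)
    n0 = n / k                             # common group size
    return (ms_between - ms_within) / (ms_between + (n0 - 1) * ms_within)
```
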
An evaluation of four single element airfoil analytic methods
NASA Technical Reports Server (NTRS)
Freuler, R. J.; Gregorek, G. M.
1979-01-01
A comparison of four computer codes for the analysis of two-dimensional single element airfoil sections is presented for three classes of section geometries. Two of the computer codes utilize vortex singularities methods to obtain the potential flow solution. The other two codes solve the full inviscid potential flow equation using finite differencing techniques, allowing results to be obtained for transonic flow about an airfoil including weak shocks. Each program incorporates boundary layer routines for computing the boundary layer displacement thickness and boundary layer effects on aerodynamic coefficients. Computational results are given for a symmetrical section represented by an NACA 0012 profile, a conventional section illustrated by an NACA 65A413 profile, and a supercritical type section for general aviation applications typified by a NASA LS(1)-0413 section. The four codes are compared and contrasted in the areas of method of approach, range of applicability, agreement among each other and with experiment, individual advantages and disadvantages, computer run times and memory requirements, and operational idiosyncrasies.
Shape-programmable magnetic soft matter
Lum, Guo Zhan; Ye, Zhou; Dong, Xiaoguang; Marvi, Hamid; Erin, Onder; Hu, Wenqi; Sitti, Metin
2016-01-01
Shape-programmable matter is a class of active materials whose geometry can be controlled to potentially achieve mechanical functionalities beyond those of traditional machines. Among these materials, magnetically actuated matter is particularly promising for achieving complex time-varying shapes at small scale (overall dimensions smaller than 1 cm). However, previous work can only program these materials for limited applications, as they rely solely on human intuition to approximate the required magnetization profile and actuating magnetic fields for their materials. Here, we propose a universal programming methodology that can automatically generate the required magnetization profile and actuating fields for soft matter to achieve new time-varying shapes. The universality of the proposed method can therefore inspire a vast number of miniature soft devices that are critical in robotics, smart engineering surfaces and materials, and biomedical devices. Our proposed method includes theoretical formulations, computational strategies, and fabrication procedures for programming magnetic soft matter. The presented theory and computational method are universal for programming 2D or 3D time-varying shapes, whereas the fabrication technique is generic only for creating planar beams. Based on the proposed programming method, we created a jellyfish-like robot, a spermatozoid-like undulating swimmer, and an artificial cilium that could mimic the complex beating patterns of its biological counterpart. PMID:27671658
Byers, J A
1992-09-01
A compiled program, JCE-REFS.EXE (coded in the QuickBASIC language), for use on IBM-compatible personal computers is described. The program converts a DOS text file of current B-I-T-S (BIOSIS Information Transfer System) or BIOSIS Previews references into a DOS file of citations, including abstracts, in a general style used by scientific journals. The latter file can be imported directly into a word processor, or the program can convert the file into a random-access data base of the references. The program can search the data base for up to 40 text strings with Boolean logic. Selected references in the data base can be exported as a DOS text file of citations. Using the search facility, articles in the Journal of Chemical Ecology from 1975 to 1991 were searched for certain key words in regard to semiochemicals, taxa, methods, chemical classes, and biological terms to determine trends in usage over the period. Positive trends were statistically significant in the use of the words: semiochemical, allomone, allelochemic, deterrent, repellent, plants, angiosperms, dicots, wind tunnel, olfactometer, electrophysiology, mass spectrometry, ketone, evolution, physiology, herbivore, defense, and receptor. Significant negative trends were found for: pheromone, vertebrates, mammals, Coleoptera, Scolytidae, Dendroctonus, lactone, isomer, and calling.
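A Boolean text-string search over a citation database, in the spirit of the search facility described above, can be sketched in a few lines (a minimal sketch; the records and search terms are illustrative, and this is Python rather than the original QuickBASIC):

```python
# AND requires every term to appear in a record; OR accepts any term.
# Matching is case-insensitive substring search, as in simple text-string
# retrieval. The citation records below are invented examples.
def search(records, terms, mode="AND"):
    op = all if mode == "AND" else any
    return [r for r in records
            if op(t.lower() in r.lower() for t in terms)]

refs = [
    "Pheromone ecology of Scolytidae",
    "Wind tunnel assays of semiochemical repellents",
    "Mass spectrometry of plant lactones",
]
hits = search(refs, ["semiochemical", "repellent"], mode="AND")
```
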
Zhang, Jie; Wu, Xiaohong; Yu, Yanmei; Luo, Daisheng
2013-01-01
In optical printed Chinese character recognition (OPCCR), many classifiers have been proposed for the recognition task. Among these classifiers, the support vector machine (SVM) may be the best. However, SVM is a two-class classifier, and when it is extended to multiple classes in OPCCR, its computation is time-consuming. Thus, we propose a neighbor-classes-based SVM (NC-SVM) to reduce the computational cost of SVM. Experiments on NC-SVM classification for OPCCR have been conducted. The results show that the proposed NC-SVM can effectively reduce the computation time in OPCCR.
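The neighbor-classes idea can be sketched as a two-stage classifier: a cheap coarse stage prunes the candidate classes, so the expensive stage (pairwise SVMs in the paper; a nearest-centroid stand-in here) only runs over a few neighbor classes instead of all of them. This is a hedged sketch of the general technique, not the paper's exact algorithm; the data and class names are illustrative:

```python
# Coarse stage: rank classes by squared distance to their centroids and
# keep the k nearest as "neighbor classes". Fine stage: run the expensive
# classifier restricted to those k classes only.
def dist2(a, b):
    return sum((p - q) ** 2 for p, q in zip(a, b))

def classify(x, centroids, fine, k=2):
    near = sorted(centroids, key=lambda c: dist2(x, centroids[c]))[:k]
    return fine(x, near)

centroids = {"a": (0, 0), "b": (5, 5), "c": (10, 10), "d": (20, 20)}
# Stand-in for the pairwise-SVM stage: nearest centroid among candidates
fine = lambda x, cand: min(cand, key=lambda c: dist2(x, centroids[c]))
label = classify((4, 4), centroids, fine, k=2)
```

With C classes, the all-pairs SVM stage costs on the order of C(C-1)/2 comparisons; restricting it to k neighbor classes cuts that to k(k-1)/2.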
Performance of the Heavy Flavor Tracker (HFT) detector in star experiment at RHIC
NASA Astrophysics Data System (ADS)
Alruwaili, Manal
With the growing technology, the number of processors is becoming massive. Current supercomputer processing will be available on desktops in the next decade. For mass-scale application software development on the massively parallel computing available on desktops, existing popular languages with large libraries have to be augmented with new constructs and paradigms that exploit massively parallel computing and distributed memory models while retaining user-friendliness. Currently available object-oriented languages for massively parallel computing, such as Chapel, X10, and UPC++, exploit distributed computing, data-parallel computing, and thread-level parallelism at the process level in the PGAS (Partitioned Global Address Space) memory model. However, they do not incorporate: 1) any extension for object distribution to exploit the PGAS model; 2) the flexibility of migrating or cloning an object between places to exploit load balancing; and 3) the programming paradigms that result from integrating data- and thread-level parallelism with object distribution. In the proposed thesis, I compare different languages in the PGAS model; propose new constructs that extend C++ with object distribution, object cloning, and object migration; and integrate PGAS-based process constructs with these extensions on distributed objects. A new paradigm, MIDD (Multiple Invocation Distributed Data), is also presented, in which different copies of the same class can be invoked and work on different elements of distributed data concurrently using remote method invocations. I present the new constructs, their grammar, and their behavior. The new constructs are explained using simple programs that utilize them.
Program for Editing Spacecraft Command Sequences
NASA Technical Reports Server (NTRS)
Gladden, Roy; Waggoner, Bruce; Kordon, Mark; Hashemi, Mahnaz; Hanks, David; Salcedo, Jose
2006-01-01
Sequence Translator, Editor, and Expander Resource (STEER) is a computer program that facilitates construction of sequences and blocks of sequences (hereafter denoted generally as sequence products) for commanding a spacecraft. STEER also provides mechanisms for translating among various sequence product types and quickly expanding the activities of a given sequence in chronological order for review and analysis of the sequence. To date, construction of sequence products has generally been done by use of such clumsy mechanisms as text-editor programs, translating among sequence product types has been challenging, and expanding sequences to time-ordered lists has involved arduous processes of converting sequence products to "real" sequences and running them through Class-A software (defined, loosely, as flight and ground software critical to a spacecraft mission). Also, heretofore, generating sequence products in standard formats has been troublesome because precise formatting and syntax are required. STEER alleviates these issues by providing a graphical user interface containing intuitive fields in which the user can enter the necessary information. The STEER expansion function provides a "quick and dirty" means of seeing how a sequence or sequence block would expand into a chronological list, without the need to use Class-A software.
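The chronological-expansion step can be sketched as flattening blocks of time-offset commands and sorting by absolute time (a minimal sketch in the spirit of the expansion function described above; the block structure, field layout, and command names are all illustrative, not STEER's actual formats):

```python
# Each block is (block start time, [(offset from start, command), ...]).
# Expansion flattens every block to absolute times and sorts the result
# into a single chronological list for review.
def expand(blocks):
    activities = [(start + offset, cmd)
                  for start, cmds in blocks
                  for offset, cmd in cmds]
    return sorted(activities)

timeline = expand([
    (100, [(0, "PWR_ON"), (30, "CAL")]),
    (90,  [(0, "SLEW"), (25, "IMG")]),
])
# Commands from both blocks interleave in time order
```
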
40 CFR 147.1600 - State-administered program-Class II wells.
Code of Federal Regulations, 2010 CFR
2010-07-01
...) WATER PROGRAMS (CONTINUED) STATE, TRIBAL, AND EPA-ADMINISTERED UNDERGROUND INJECTION CONTROL PROGRAMS New Mexico § 147.1600 State-administered program—Class II wells. The UIC program for Class II wells in the State of New Mexico, except for those on Indian lands, is the program administered by the New...
Attitudes on Using Pair-Programming
ERIC Educational Resources Information Center
Howard, Elizabeth V.
2007-01-01
During a research study conducted over four semesters, students enrolled in an introductory programming class at a commuter campus used the pair-programming approach for both in-class labs and out-of-class programming assignments. This study was a comprehensive assessment of pair-programming using multiple measures of both quantitative and…
COMPUTATIONAL INVESTIGATION OF CHEMICAL REACTIVITY IN RELATION TO BIOACTIVATION AND TOXICITY ACROSS CLASSES OF HALOORGANICS: BROMINATION VS. CHLORINATION.
Halogenation is a common feature of many classes of environmental contaminants, and often plays a crucial role in po...
JAVA CLASSES FOR NONPROCEDURAL VARIOGRAM MONITORING. JOURNAL OF COMPUTERS AND GEOSCIENCE
NRMRL-ADA-00229 Faulkner*, B.P. Java Classes for Nonprocedural Variogram Monitoring. Journal of Computers and Geosciences (Elsevier Science, Ltd.) 28:387-397 (2002). EPA/600/J-02/235. A set of Java classes was written for variogram modeling to support research for US EPA's Reg...
Sreenivas, K; Sekhar, N Seshadri; Saxena, Manoj; Paliwal, R; Pathak, S; Porwal, M C; Fyzee, M A; Rao, S V C Kameswara; Wadodkar, M; Anasuya, T; Murthy, M S R; Ravisankar, T; Dadhwal, V K
2015-09-15
The present study aims to analyze spatial and temporal variability in agricultural land cover during 2005-06 and 2011-12 from an ongoing program of annual land use mapping using multidate Advanced Wide Field Sensor (AWiFS) data aboard Resourcesat-1 and 2. About 640-690 multi-temporal AWiFS quadrant data products per year (depending on cloud cover) were co-registered and radiometrically normalized to prepare state (administrative unit) mosaics. An 18-fold classification was adopted in this project. Rule-based techniques along with a maximum-likelihood algorithm were employed to derive land cover information as well as changes within agricultural land cover classes. The agricultural land cover classes include kharif (June-October), rabi (November-April), zaid (April-June), area sown more than once, fallow lands, and plantation crops. Mean kappa accuracy of these estimates varied from 0.87 to 0.96 for various classes. The standard error of estimate was computed for each class annually, and the area estimates were corrected using it. The corrected estimates range between 99 and 116 Mha for kharif and 77-91 Mha for rabi. The kharif, rabi, and net sown areas were aggregated at a 10 km × 10 km grid on an annual basis for the whole of India, and the CV was computed at each grid cell using the temporal spatially-aggregated area as input. This spatial variability of agricultural land cover classes was analyzed across meteorological zones, irrigated command areas, and administrative boundaries. The results indicate that, of the various states/meteorological zones, Punjab was consistently cropped during both the kharif and rabi seasons. Of all irrigated commands, the Tawa irrigated command was consistently cropped during the rabi season. Copyright © 2014 Elsevier Ltd. All rights reserved.
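The per-cell coefficient of variation over the annual series can be sketched directly (a minimal sketch of the CV computation described above; the grid-cell identifiers and area values are toy numbers, not the study's data):

```python
import statistics

# CV of annual cropped area per grid cell: population standard deviation
# of the temporal series divided by its mean. A near-zero CV marks a
# consistently cropped cell, as reported for Punjab during kharif/rabi.
def cv(series):
    mean = statistics.mean(series)
    return statistics.pstdev(series) / mean if mean else float("nan")

# Illustrative annual kharif areas (arbitrary units) for two cells
kharif_area = {"cell_01": [900, 910, 905], "cell_02": [400, 800, 600]}
variability = {cell: cv(vals) for cell, vals in kharif_area.items()}
# cell_01 varies little year to year; cell_02 varies a lot
```
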
Internet messenger based smart virtual class learning using ubiquitous computing
NASA Astrophysics Data System (ADS)
Umam, K.; Mardi, S. N. S.; Hariadi, M.
2017-06-01
Internet messenger (IM) has become an important educational technology component in college education. IM makes it possible for students to engage in learning and collaboration in smart virtual class learning (SVCL) using ubiquitous computing. However, models of IM-based SVCL using ubiquitous computing, and empirical evidence that would favor broad application to improve engagement and behavior, are still limited. In addition, the expectation that IM-based SVCL using ubiquitous computing could improve engagement and behavior in a smart class cannot be confirmed, because the majority of the reviewed studies followed instructional paradigms. This article presents a model of IM-based SVCL using ubiquitous computing and reports learners' experiences of improved engagement and behavior in learner-learner and learner-lecturer interactions. The method applied in this paper includes a design process and quantitative analysis techniques, with the purpose of identifying scenarios of ubiquitous computing and capturing the impressions of learners and lecturers about engagement and behavior and their contribution to learning.
2016-10-25
program, a program to design and build a new class of 12 ballistic missile submarines (SSBNs) to replace the Navy’s current force of 14 Ohio-class SSBNs...billion in detailed design and nonrecurring engineering (DD/NRE) costs for the entire class, and $8.8 billion in construction costs for the ship...
ERIC Educational Resources Information Center
Frein, Scott T.
2011-01-01
This article describes three experiments comparing paper-and-pencil tests (PPTs) to computer-based tests (CBTs) in terms of test method preferences and student performance. In Experiment 1, students took tests using three methods: PPT in class, CBT in class, and CBT at the time and place of their choosing. Results indicate that test method did not…
Programming an offline-analyzer of motor imagery signals via python language.
Alonso-Valerdi, Luz María; Sepulveda, Francisco
2011-01-01
Brain Computer Interface (BCI) systems control the user's environment via his/her brain signals. Brain signals related to motor imagery (MI) have become a widespread method employed by the BCI community. Despite the large number of references describing MI signal treatment, there is not enough information about programming languages suitable for developing a specific-purpose MI-based BCI. The present paper describes the development of an offline-analysis system for MI-EEG signals using open-source programming languages, and the assessment of the system using electrical activity recorded from three subjects. The analyzer recognized at least 63% of the MI signals corresponding to three classes. The results of the offline analysis showed a promising performance, considering that the subjects had never undergone MI training.
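As a hedged illustration of such an offline analysis (not the authors' pipeline; the features, classifier, and synthetic data below are invented for the sketch), a minimal three-class recognition test might look like:

```python
# Illustrative sketch only: classify synthetic 3-class "MI" trials using
# crude band-power features and a nearest-centroid rule, then report accuracy
# as an offline analyzer would. Data and method are stand-ins.
import numpy as np

rng = np.random.default_rng(0)

def band_power(trial):
    # mean squared amplitude per channel, a crude band-power surrogate
    return np.mean(trial ** 2, axis=-1)

# Synthetic trials: 3 classes, 30 trials each, 2 channels x 128 samples,
# with class-dependent amplitude per channel.
scales = [(1.0, 3.0), (3.0, 1.0), (2.0, 2.0)]
X, y = [], []
for label, (s1, s2) in enumerate(scales):
    for _ in range(30):
        trial = rng.standard_normal((2, 128)) * np.array([[s1], [s2]])
        X.append(band_power(trial))
        y.append(label)
X, y = np.array(X), np.array(y)

# Train a nearest-centroid classifier on half the trials, test on the rest
train = np.arange(len(y)) % 2 == 0
centroids = np.array([X[train & (y == c)].mean(axis=0) for c in range(3)])
pred = np.argmin(((X[~train, None, :] - centroids) ** 2).sum(-1), axis=1)
accuracy = (pred == y[~train]).mean()
```

A real MI pipeline would add band-pass filtering, spatial filtering, and a stronger classifier, but the train/test structure of the offline evaluation is the same.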
StrBioLib: a Java library for development of custom computational structural biology applications.
Chandonia, John-Marc
2007-08-01
StrBioLib is a library of Java classes useful for developing software for computational structural biology research. StrBioLib contains classes to represent and manipulate protein structures, biopolymer sequences, sets of biopolymer sequences, and alignments between biopolymers based on either sequence or structure. Interfaces are provided to interact with commonly used bioinformatics applications, including PSI-BLAST, MODELLER, MUSCLE, and Primer3, and tools are provided to read and write many file formats used to represent bioinformatic data. The library includes a general-purpose neural network object with multiple training algorithms, the Hooke and Jeeves non-linear optimization algorithm, and tools for efficient C-style string parsing and formatting. StrBioLib is the basis for the Pred2ary secondary structure prediction program, is used to build the ASTRAL compendium for sequence and structure analysis, and has been extensively tested through use in many smaller projects. Examples and documentation are available at the site below. StrBioLib may be obtained under the terms of the GNU LGPL license from http://strbio.sourceforge.net/
Modeling Subsurface Reactive Flows Using Leadership-Class Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mills, Richard T; Hammond, Glenn; Lichtner, Peter
2009-01-01
We describe our experiences running PFLOTRAN - a code for simulation of coupled hydro-thermal-chemical processes in variably saturated, non-isothermal, porous media - on leadership-class supercomputers, including initial experiences running on the petaflop incarnation of Jaguar, the Cray XT5 at the National Center for Computational Sciences at Oak Ridge National Laboratory. PFLOTRAN utilizes fully implicit time-stepping and is built on top of the Portable, Extensible Toolkit for Scientific Computation (PETSc). We discuss some of the hurdles to 'at scale' performance with PFLOTRAN and the progress we have made in overcoming them on leadership-class computer architectures.
RATIO_TOOL - SOFTWARE FOR COMPUTING IMAGE RATIOS
NASA Technical Reports Server (NTRS)
Yates, G. L.
1994-01-01
Geological studies analyze spectral data in order to gain information on surface materials. RATIO_TOOL is an interactive program for viewing and analyzing large multispectral image data sets that have been created by an imaging spectrometer. While the standard approach to classification of multispectral data is to match the spectrum for each input pixel against a library of known mineral spectra, RATIO_TOOL uses ratios of spectral bands in order to spot significant areas of interest within a multispectral image. Each image band can be viewed iteratively, or a selected image band of the data set can be requested and displayed. When the image ratios are computed, the result is displayed as a gray scale image. At this point a histogram option helps in viewing the distribution of values. A thresholding option can then be used to segment the ratio image result into two to four classes. The segmented image is then color coded to indicate threshold classes and displayed alongside the gray scale image. RATIO_TOOL is written in C language for Sun series computers running SunOS 4.0 and later. It requires the XView toolkit and the OpenWindows window manager (version 2.0 or 3.0). The XView toolkit is distributed with Open Windows. A color monitor is also required. The standard distribution medium for RATIO_TOOL is a .25 inch streaming magnetic tape cartridge in UNIX tar format. An electronic copy of the documentation is included on the program media. RATIO_TOOL was developed in 1992 and is a copyrighted work with all copyright vested in NASA. Sun, SunOS, and OpenWindows are trademarks of Sun Microsystems, Inc. UNIX is a registered trademark of AT&T Bell Laboratories.
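The ratio-and-threshold workflow that RATIO_TOOL implements can be sketched in a few lines; the band values and thresholds here are illustrative assumptions, and this is not the program's C source.

```python
# Hedged sketch of the band-ratio idea: compute a ratio of two spectral
# bands, then threshold the ratio image into a small number of classes
# (RATIO_TOOL supports two to four). All values are illustrative.
import numpy as np

def band_ratio(band_a, band_b, eps=1e-6):
    """Element-wise ratio of two image bands, guarded against divide-by-zero."""
    return band_a / (band_b + eps)

def segment(ratio_img, thresholds):
    """Assign each pixel a class index 0..len(thresholds)."""
    return np.digitize(ratio_img, thresholds)

band1 = np.array([[2.0, 4.0], [6.0, 8.0]])
band2 = np.array([[2.0, 2.0], [2.0, 2.0]])
ratio = band_ratio(band1, band2)                      # ~[[1, 2], [3, 4]]
classes = segment(ratio, thresholds=[1.5, 2.5, 3.5])  # four classes, 0..3
```

The segmented class image would then be color coded and displayed alongside the gray-scale ratio image, as the description above explains.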
An object oriented code for simulating supersymmetric Yang-Mills theories
NASA Astrophysics Data System (ADS)
Catterall, Simon; Joseph, Anosh
2012-06-01
We present SUSY_LATTICE - a C++ program that can be used to simulate certain classes of supersymmetric Yang-Mills (SYM) theories, including the well known N=4 SYM in four dimensions, on a flat Euclidean space-time lattice. Discretization of SYM theories is an old problem in lattice field theory. It has resisted solution until recently when new ideas drawn from orbifold constructions and topological field theories have been brought to bear on the question. The result has been the creation of a new class of lattice gauge theories in which the lattice action is invariant under one or more supersymmetries. The resultant theories are local, free of doublers and also possess exact gauge-invariance. In principle they form the basis for a truly non-perturbative definition of the continuum SYM theories. In the continuum limit they reproduce versions of the SYM theories formulated in terms of twisted fields, which on a flat space-time is just a change of the field variables. In this paper, we briefly review these ideas and then go on to provide the details of the C++ code. We sketch the design of the code, with particular emphasis being placed on SYM theories with N=(2,2) in two dimensions and N=4 in three and four dimensions, making one-to-one comparisons between the essential components of the SYM theories and their corresponding counterparts appearing in the simulation code. The code may be used to compute several quantities associated with the SYM theories such as the Polyakov loop, mean energy, and the width of the scalar eigenvalue distributions.
Program summary
Program title: SUSY_LATTICE Catalogue identifier: AELS_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AELS_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC license, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 9315 No. of bytes in distributed program, including test data, etc.: 95 371 Distribution format: tar.gz Programming language: C++ Computer: PCs and Workstations Operating system: Any, tested on Linux machines Classification: 11.6 Nature of problem: To compute some of the observables of supersymmetric Yang-Mills theories such as supersymmetric action, Polyakov/Wilson loops, scalar eigenvalues and Pfaffian phases. Solution method: We use the Rational Hybrid Monte Carlo algorithm followed by a Leapfrog evolution and a Metropolis test. The input parameters of the model are read in from a parameter file. Restrictions: This code applies only to supersymmetric gauge theories with extended supersymmetry, which undergo the process of maximal twisting. (See Section 2 of the manuscript for details.) Running time: From a few minutes to several hours depending on the amount of statistics needed.
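The "Leapfrog evolution and a Metropolis test" named in the solution method is the standard Hybrid Monte Carlo pattern. A toy sketch for a one-dimensional Gaussian action (a stand-in, not the SUSY_LATTICE dynamics) looks like:

```python
# Hedged HMC sketch: leapfrog integration of fictitious dynamics followed by
# a Metropolis accept/reject step, for the toy action S(phi) = phi^2 / 2.
import numpy as np

rng = np.random.default_rng(1)

def action(phi):
    return 0.5 * phi * phi

def grad(phi):
    return phi

def hmc_step(phi0, n_steps=10, dt=0.1):
    phi = phi0
    p = rng.standard_normal()              # refresh conjugate momentum
    h_old = action(phi) + 0.5 * p * p
    p -= 0.5 * dt * grad(phi)              # leapfrog: initial half kick
    for _ in range(n_steps - 1):
        phi += dt * p                      # drift
        p -= dt * grad(phi)                # kick
    phi += dt * p
    p -= 0.5 * dt * grad(phi)              # final half kick
    h_new = action(phi) + 0.5 * p * p
    if rng.random() < np.exp(min(0.0, h_old - h_new)):
        return phi                         # Metropolis accept
    return phi0                            # reject: keep old configuration

phi, samples = 0.0, []
for _ in range(2000):
    phi = hmc_step(phi)
    samples.append(phi)
# samples should be distributed ~ exp(-S), i.e. standard normal
```

The production code replaces the scalar field with lattice gauge and matter fields and uses a rational approximation (RHMC) for the fermion determinant, but the accept/reject structure is the same.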
Inquiry-Based Early Undergraduate Research Using High-Altitude Ballooning
NASA Astrophysics Data System (ADS)
Sibbernsen, K.; Sibbernsen, M.
2012-12-01
One common objective for undergraduate science classes is to have students learn how to do scientific inquiry. However, in science laboratory classes students often learn to take data, analyze the data, and come to conclusions, but they are told what to study and do not have the opportunity to ask their own research questions, a crucial part of scientific inquiry. A special topics class in high-altitude ballooning (HAB) was offered at Metropolitan Community College, a large metropolitan two-year college in Omaha, Nebraska, to focus on scientific inquiry for the participants, with the support of NASA Nebraska Space Grant. A weather balloon with payloads attached (balloonSAT) was launched to near space, where the balloon burst and fell back to the ground with a parachute. Students worked in small groups to pose their research questions, design their payloads, participate in the launch and retrieval of equipment, analyze data, and present the results of their research. This type of experience has potential uses in physics, physical science, engineering, electronics, computer programming, meteorology, astronomy, and chemistry classes. The balloonSAT experience can act as a stepping-stone to designing sounding rocket payloads, and it can allow students the opportunity to participate in regional competitions and present at HAB conferences. Results from the workshop are shared, as well as student responses to the experience and suggestions for administering a high-altitude ballooning program for undergraduates or extending inquiry-based ballooning experiences into high schools or middle schools.
Solving a Class of Stochastic Mixed-Integer Programs With Branch and Price
2006-01-01
a two-dimensional knapsack problem, but for a given m, the objective value gi does not depend on the variance index v. This will be used in a final...optimization...for solution by a branch-and-price algorithm (B&P). We then survey a number of examples, and use a stochastic facility-location problem (SFLP) for a
Appearance-Based Vision and the Automatic Generation of Object Recognition Programs
1992-07-01
...are grouped into equivalence classes with respect to visible features; the equivalence classes are called aspects. A recognition strategy is generated from...illustrates the concept. [Table 1: Summary of Sensors] ...an example of the detectability computation for a light-stripe range finder is shown in Figure 2. [Figure 2: Detectability of a face for a light-stripe range finder]
Reinforced Concrete Wall Form Design Program
1992-08-01
criteria is an absolute limit. You have the choice of 1/8 or 1/16 of an inch total deflection in a span. Once these limits are set here, then they are...Calls GET-INFO-TEXT - Calls ZERO-PLY - If the response to GET-INFO-TEXT is "Values retrieved by computer", then the following procedures are executed...like to enter their own values. ZERO-PLY - Re-initializes all PLY-VEC values to "?". GET-PLY-CLASS - Retrieves from the user the grade of plyform to be
Jeffries, P R
2001-10-01
The purpose of this study was to compare the effectiveness of both an interactive, multimedia CD-ROM and a traditional lecture for teaching oral medication administration to nursing students. A randomized pretest/posttest experimental design was used. Forty-two junior baccalaureate nursing students beginning their fundamentals nursing course were recruited for this study at a large university in the midwestern United States. The students ranged in age from 19 to 45. Seventy-three percent reported having average computer skills and experience, while 15% reported poor to below average skills. Two methods were compared for teaching oral medication administration--a scripted lecture with black and white overhead transparencies, in addition to an 18-minute videotape on medication administration, and an interactive, multimedia CD-ROM program, covering the same content. There were no significant (p < .05) baseline differences between the computer and lecture groups by education or computer skills. Results showed significant differences between the two groups in cognitive gains and student satisfaction (p = .01), with the computer group demonstrating higher student satisfaction and more cognitive gains than the lecture group. The groups were similar in their ability to demonstrate the skill correctly. Importantly, time on task using the CD-ROM was less, with 96% of the learners completing the program in 2 hours or less, compared to 3 hours of class time for the lecture group.
Computational methods for a three-dimensional model of the petroleum-discovery process
Schuenemeyer, J.H.; Bawiec, W.J.; Drew, L.J.
1980-01-01
A discovery-process model devised by Drew, Schuenemeyer, and Root can be used to predict the amount of petroleum to be discovered in a basin from some future level of exploratory effort: the predictions are based on historical drilling and discovery data. Because marginal costs of discovery and production are a function of field size, the model can be used to make estimates of future discoveries within deposit size classes. The modeling approach is a geometric one in which the area searched is a function of the size and shape of the targets being sought. A high correlation is assumed between the surface-projection area of the fields and the volume of petroleum. To predict how much oil remains to be found, the area searched must be computed, and the basin size and discovery efficiency must be estimated. The basin is assumed to be explored randomly rather than by pattern drilling. The model may be used to compute independent estimates of future oil at different depth intervals for a play involving multiple producing horizons. We have written FORTRAN computer programs that are used with Drew, Schuenemeyer, and Root's model to merge the discovery and drilling information and perform the necessary computations to estimate undiscovered petroleum. These programs may be modified easily for the estimation of remaining quantities of commodities other than petroleum. © 1980.
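The geometric search idea can be illustrated with a toy simulation, under the assumption (made here for illustration; this is not the authors' FORTRAN code) that the chance the next wildcat finds a given undiscovered field is proportional to the field's surface-projection area:

```python
# Hedged sketch of size-biased discovery: fields are "found" with probability
# proportional to their area, so large fields tend to be discovered early,
# which is why remaining undiscovered oil skews toward small fields.
import numpy as np

rng = np.random.default_rng(42)

def simulate_discovery_order(areas):
    """Return field indices in discovery order under size-biased sampling."""
    remaining = list(range(len(areas)))
    order = []
    while remaining:
        w = np.array([areas[i] for i in remaining], dtype=float)
        pick = rng.choice(len(remaining), p=w / w.sum())
        order.append(remaining.pop(pick))
    return order

# 200 hypothetical fields with lognormal (heavy-tailed) areas
areas = rng.lognormal(mean=0.0, sigma=1.5, size=200)
order = simulate_discovery_order(areas)
first_half = areas[order[:100]].mean()    # early discoveries: larger fields
second_half = areas[order[100:]].mean()   # late discoveries: smaller fields
```

Fitting such a model to the historical discovery sequence is what lets the approach estimate how much (mostly small-field) petroleum remains.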
Parallel hyperbolic PDE simulation on clusters: Cell versus GPU
NASA Astrophysics Data System (ADS)
Rostrup, Scott; De Sterck, Hans
2010-12-01
Increasingly, high-performance computing is looking towards data-parallel computational devices to enhance computational performance. Two technologies that have received significant attention are IBM's Cell Processor and NVIDIA's CUDA programming model for graphics processing unit (GPU) computing. In this paper we investigate the acceleration of parallel hyperbolic partial differential equation simulation on structured grids with explicit time integration on clusters with Cell and GPU backends. The message passing interface (MPI) is used for communication between nodes at the coarsest level of parallelism. Optimizations of the simulation code at the several finer levels of parallelism that the data-parallel devices provide are described in terms of data layout, data flow and data-parallel instructions. Optimized Cell and GPU performance are compared with reference code performance on a single x86 central processing unit (CPU) core in single and double precision. We further compare the CPU, Cell and GPU platforms on a chip-to-chip basis, and compare performance on single cluster nodes with two CPUs, two Cell processors or two GPUs in a shared memory configuration (without MPI). We finally compare performance on clusters with 32 CPUs, 32 Cell processors, and 32 GPUs using MPI. Our GPU cluster results use NVIDIA Tesla GPUs with GT200 architecture, but some preliminary results on recently introduced NVIDIA GPUs with the next-generation Fermi architecture are also included. This paper provides computational scientists and engineers who are considering porting their codes to accelerator environments with insight into how structured grid based explicit algorithms can be optimized for clusters with Cell and GPU accelerators. It also provides insight into the speed-up that may be gained on current and future accelerator architectures for this class of applications. 
Program summary
Program title: SWsolver Catalogue identifier: AEGY_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGY_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GPL v3 No. of lines in distributed program, including test data, etc.: 59 168 No. of bytes in distributed program, including test data, etc.: 453 409 Distribution format: tar.gz Programming language: C, CUDA Computer: Parallel Computing Clusters. Individual compute nodes may consist of x86 CPU, Cell processor, or x86 CPU with attached NVIDIA GPU accelerator. Operating system: Linux Has the code been vectorised or parallelized?: Yes. Tested on 1-128 x86 CPU cores, 1-32 Cell Processors, and 1-32 NVIDIA GPUs. RAM: Tested on problems requiring up to 4 GB per compute node. Classification: 12 External routines: MPI, CUDA, IBM Cell SDK Nature of problem: MPI-parallel simulation of Shallow Water equations using high-resolution 2D hyperbolic equation solver on regular Cartesian grids for x86 CPU, Cell Processor, and NVIDIA GPU using CUDA. Solution method: SWsolver provides 3 implementations of a high-resolution 2D Shallow Water equation solver on regular Cartesian grids, for CPU, Cell Processor, and NVIDIA GPU. Each implementation uses MPI to divide work across a parallel computing cluster. Additional comments: Sub-program numdiff is used for the test run.
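For readers unfamiliar with this class of solvers, the explicit structured-grid update being accelerated can be sketched in serial form (a hedged one-dimensional Lax-Friedrichs toy, not the SWsolver code, which is 2D, high-resolution, and MPI-parallel):

```python
# Hedged sketch: one explicit Lax-Friedrichs time step for the 1-d shallow
# water equations on a periodic grid, showing the stencil-update pattern that
# maps well onto data-parallel devices like the Cell processor and GPUs.
import numpy as np

g = 9.81  # gravitational acceleration

def flux(h, hu):
    """Physical flux of the shallow water system (mass, momentum)."""
    u = hu / h
    return np.array([hu, hu * u + 0.5 * g * h * h])

def lax_friedrichs_step(h, hu, dx, dt):
    q = np.array([h, hu])
    f = flux(h, hu)
    qL, qR = np.roll(q, 1, axis=1), np.roll(q, -1, axis=1)
    fL, fR = np.roll(f, 1, axis=1), np.roll(f, -1, axis=1)
    q_new = 0.5 * (qL + qR) - dt / (2.0 * dx) * (fR - fL)
    return q_new[0], q_new[1]

# Dam-break-like initial condition on a periodic domain of 100 cells
n = 100
h = np.where(np.arange(n) < n // 2, 2.0, 1.0)
hu = np.zeros(n)
for _ in range(20):
    h, hu = lax_friedrichs_step(h, hu, dx=1.0, dt=0.05)  # CFL ~ 0.22
```

Because every cell's update depends only on its neighbors, the loop parallelizes naturally: MPI splits the domain across nodes, and each node's device updates its cells in parallel.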
Performance Analysis, Modeling and Scaling of HPC Applications and Tools
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhatele, Abhinav
2016-01-13
Efficient use of supercomputers at DOE centers is vital for maximizing system throughput, minimizing energy costs and enabling science breakthroughs faster. This requires complementary efforts along several directions to optimize the performance of scientific simulation codes and the underlying runtimes and software stacks. This in turn requires providing scalable performance analysis tools and modeling techniques that can provide feedback to physicists and computer scientists developing the simulation codes and runtimes respectively. The PAMS project is using time allocations on supercomputers at ALCF, NERSC and OLCF to further the goals described above by performing research along the following fronts: 1. Scaling Study of HPC applications; 2. Evaluation of Programming Models; 3. Hardening of Performance Tools; 4. Performance Modeling of Irregular Codes; and 5. Statistical Analysis of Historical Performance Data. We are a team of computer and computational scientists funded by both DOE/NNSA and DOE/ASCR programs such as ECRP, XStack (Traleika Glacier, PIPER), ExaOSR (ARGO), SDMAV II (MONA) and PSAAP II (XPACC). This allocation will enable us to study big data issues when analyzing performance on leadership computing class systems and to assist the HPC community in making the most effective use of these resources.
Design and optimization of a portable LQCD Monte Carlo code using OpenACC
NASA Astrophysics Data System (ADS)
Bonati, Claudio; Coscetti, Simone; D'Elia, Massimo; Mesiti, Michele; Negro, Francesco; Calore, Enrico; Schifano, Sebastiano Fabio; Silvi, Giorgio; Tripiccione, Raffaele
The present panorama of HPC architectures is extremely heterogeneous, ranging from traditional multi-core CPU processors, supporting a wide class of applications but delivering moderate computing performance, to many-core Graphics Processing Units (GPUs), exploiting aggressive data-parallelism and delivering higher performance for streaming computing applications. In this scenario, code portability (and performance portability) becomes necessary for easy maintainability of applications; this is very relevant in scientific computing, where code changes are very frequent, making it tedious and error-prone to keep different code versions aligned. In this work, we present the design and optimization of a state-of-the-art production-level LQCD Monte Carlo application, using the directive-based OpenACC programming model. OpenACC abstracts parallel programming to a descriptive level, relieving programmers from specifying how codes should be mapped onto the target architecture. We describe the implementation of a code fully written in OpenACC, and show that we are able to target several different architectures, including state-of-the-art traditional CPUs and GPUs, with the same code. We also measure performance, evaluating the computing efficiency of our OpenACC code on several architectures, comparing with GPU-specific implementations and showing that a good level of performance portability can be reached.
ERIC Educational Resources Information Center
Piele, Philip K.
This document shows how computer technology can aid educators in meeting demands for improved class scheduling and more efficient use of transportation resources. The first section surveys literature on operational systems that provide individualized scheduling for students, varied class structures, and maximum use of space and staff skills.…
Inquiry-Based Whole-Class Teaching with Computer Simulations in Physics
ERIC Educational Resources Information Center
Rutten, Nico; van der Veen, Jan T.; van Joolingen, Wouter R.
2015-01-01
In this study we investigated the pedagogical context of whole-class teaching with computer simulations. We examined relations between the attitudes and learning goals of teachers and their students regarding the use of simulations in whole-class teaching, and how teachers implement these simulations in their teaching practices. We observed…
Zhang, Jie; Wu, Xiaohong; Yu, Yanmei; Luo, Daisheng
2013-01-01
In optical printed Chinese character recognition (OPCCR), many classifiers have been proposed for the recognition task. Among these, the support vector machine (SVM) may be the best, but SVM is inherently a two-class classifier, and when it is applied to the many classes of OPCCR its computation is time-consuming. We therefore propose a neighbor-classes-based SVM (NC-SVM) to reduce the computational cost of SVM. Experiments with NC-SVM classification for OPCCR have shown that the proposed NC-SVM effectively reduces the computation time in OPCCR. PMID:23536777
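The motivation for pruning class pairs is easy to quantify (a back-of-the-envelope sketch; the neighbor count m is an illustrative assumption, not a figure from the paper):

```python
# One-vs-one SVM over k classes trains k*(k-1)/2 binary classifiers; if each
# class is instead compared only against m neighbor classes (the NC-SVM idea,
# with m chosen here purely for illustration), the pair count drops sharply.

def one_vs_one_pairs(k):
    return k * (k - 1) // 2

def neighbor_pairs(k, m):
    # each class paired with its m neighbor classes; each pair counted once
    return k * m // 2

k = 3755                          # GB2312 level-1 Chinese character classes
full = one_vs_one_pairs(k)        # 7,048,135 binary sub-problems
reduced = neighbor_pairs(k, 10)   # 18,775 with 10 neighbors per class
```

With thousands of character classes the quadratic pair count dominates both training and testing time, which is the cost NC-SVM attacks.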
Computers and Classroom Culture.
ERIC Educational Resources Information Center
Schofield, Janet Ward
This book explores the meaning of computer technology in schools. The book is based on data gathered from a two-year observation of more than 30 different classrooms in an urban high school: geometry classes in which students used artificially intelligent tutors; business classes in which students learned word processing; and computer science…
2014-01-01
Background: Psychostimulants and cannabis are two of the three most commonly used illicit drugs by young Australians. As such, it is important to deliver prevention for these substances to prevent their misuse and to reduce associated harms. The present study aims to evaluate the feasibility and effectiveness of the universal computer-based Climate Schools: Psychostimulant and Cannabis Module. Methods: A cluster randomised controlled trial was conducted with 1734 Year 10 students (mean age = 15.44 years; SD = 0.41) from 21 secondary schools in Australia. Schools were randomised to receive either the six lesson computer-based Climate Schools program or their usual health classes, including drug education, over the year. Results: The Climate Schools program was shown to increase knowledge of cannabis and psychostimulants and decrease pro-drug attitudes. In the short-term the program was effective in subduing the uptake and plateauing the frequency of ecstasy use, however there were no changes in meth/amphetamine use. In addition, females who received the program used cannabis significantly less frequently than students who received drug education as usual. Finally, the Climate Schools program was related to decreasing students’ intentions to use meth/amphetamine and ecstasy in the future, however these effects did not last over time. Conclusions: These findings provide support for the use of a harm-minimisation approach and computer technology as an innovative platform for the delivery of prevention education for illicit drugs in schools. The current study indicated that teachers and students enjoyed the program and that it is feasible to extend the successful Climate Schools model to the prevention of other drugs, namely cannabis and psychostimulants. Trial registration: Australian and New Zealand Clinical Trials Registry ACTRN12613000492752. PMID:24943829
Vogl, Laura Elise; Newton, Nicola Clare; Champion, Katrina Elizabeth; Teesson, Maree
2014-06-18
A pilot biomedical engineering course in rapid prototyping for mobile health.
Stokes, Todd H; Venugopalan, Janani; Hubbard, Elena N; Wang, May D
2013-01-01
Rapid prototyping of medically assistive mobile devices promises to fuel innovation and provides an opportunity for hands-on engineering training in biomedical engineering curricula. This paper presents the design and outcomes of a course offered during a 16-week semester in Fall 2011 with 11 students enrolled. The syllabus covered a mobile health design process from end to end, including storyboarding, non-functional prototypes, integrated circuit programming, 3D modeling, 3D printing, cloud computing database programming, and developing patient engagement through animated videos describing the benefits of a new device. Most technologies presented in this class are open source and thus provide unlimited "hackability". They are also cost-effective and easily transferable to other departments.
A variational dynamic programming approach to robot-path planning with a distance-safety criterion
NASA Technical Reports Server (NTRS)
Suh, Suk-Hwan; Shin, Kang G.
1988-01-01
An approach to robot-path planning is developed that considers both the traveling distance and the safety of the robot. A computationally efficient algorithm is developed to find a near-optimal path under a weighted distance-safety criterion using a variational calculus and dynamic programming (VCDP) method. The algorithm is readily applicable to any factory environment by representing the free workspace as channels; a method for deriving these channels is also proposed. Although it is developed mainly for two-dimensional problems, the method can be easily extended to a class of three-dimensional problems. Numerical examples are presented to demonstrate the utility and power of this method.
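A weighted distance-safety cost can be illustrated with a small dynamic-programming sketch on a grid (Dijkstra over cost-to-come values); the cost structure below is an assumption made for illustration, not the paper's VCDP formulation:

```python
# Hedged sketch: min-cost path where each step costs
#   w * step_length + (1 - w) * proximity_penalty(entered cell),
# so w trades off travel distance against safety (obstacle clearance).
import heapq

def plan(grid_cost, w=0.5):
    """grid_cost[r][c]: safety penalty (higher = closer to obstacles).
    Returns the min total cost from top-left to bottom-right."""
    rows, cols = len(grid_cost), len(grid_cost[0])
    dist = {(0, 0): 0.0}
    heap = [(0.0, (0, 0))]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == (rows - 1, cols - 1):
            return d
        if d > dist.get((r, c), float("inf")):
            continue
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + w * 1.0 + (1 - w) * grid_cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    heapq.heappush(heap, (nd, (nr, nc)))
    return float("inf")

# A 3x3 workspace whose center cell is near an obstacle (high penalty)
grid = [[0.0, 0.0, 0.0],
        [0.0, 9.0, 0.0],
        [0.0, 0.0, 0.0]]
cost = plan(grid, w=0.5)  # optimal path detours around the center cell
```

Raising w favors shorter but riskier paths; lowering it pushes the path away from obstacles, which is exactly the trade-off the weighted criterion encodes.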
An evolutionary algorithm that constructs recurrent neural networks.
Angeline, P J; Saunders, G M; Pollack, J B
1994-01-01
Standard methods for simultaneously inducing the structure and weights of recurrent neural networks limit every task to an assumed class of architectures. Such a simplification is necessary since the interactions between network structure and function are not well understood. Evolutionary computations, which include genetic algorithms and evolutionary programming, are population-based search methods that have shown promise in many similarly complex tasks. This paper argues that genetic algorithms are inappropriate for network acquisition and describes an evolutionary program, called GNARL, that simultaneously acquires both the structure and weights for recurrent networks. GNARL's empirical acquisition method allows for the emergence of complex behaviors and topologies that are potentially excluded by the artificial architectural constraints imposed in standard network induction methods.
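The flavor of such mutation-only evolutionary search can be conveyed with a toy sketch (GNARL additionally mutates network topology and operates on recurrent networks; here, only a fixed-length real-valued "weight" vector is mutated against a toy fitness function):

```python
# Hedged sketch of evolutionary programming: truncation selection plus
# Gaussian mutation, no crossover. Minimizes a toy fitness (lower = better);
# this is not the GNARL implementation.
import random

random.seed(3)

def fitness(w):
    return sum(x * x for x in w)          # sphere function, optimum at 0

def mutate(w, sigma):
    return [x + random.gauss(0.0, sigma) for x in w]

pop = [[random.uniform(-2.0, 2.0) for _ in range(5)] for _ in range(20)]
for gen in range(200):
    pop.sort(key=fitness)
    parents = pop[:10]                    # keep the better half (elitist)
    sigma = 0.3 * 0.99 ** gen             # slowly annealed mutation strength
    pop = parents + [mutate(p, sigma) for p in parents]

best = min(fitness(w) for w in pop)       # should approach 0
```

GNARL's key departure from this sketch is that its mutation operators also add and delete nodes and connections, letting network topology emerge from the search rather than being fixed in advance.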
Programming methodology for a general purpose automation controller
NASA Technical Reports Server (NTRS)
Sturzenbecker, M. C.; Korein, J. U.; Taylor, R. H.
1987-01-01
The General Purpose Automation Controller is a multi-processor architecture for automation programming. A methodology has been developed whose aim is to simplify the task of programming distributed real-time systems for users in research or manufacturing. Programs are built by configuring function blocks (low-level computations) into processes using data flow principles. These processes are activated through the verb mechanism. Verbs are divided into two classes: those which support devices, such as robot joint servos, and those which perform actions on devices, such as motion control. This programming methodology was developed in order to achieve the following goals: (1) specifications for real-time programs which are to a high degree independent of hardware considerations such as processor, bus, and interconnect technology; (2) a component approach to software, so that software required to support new devices and technologies can be integrated by reconfiguring existing building blocks; (3) resistance to error and ease of debugging; and (4) a powerful command language interface.
PRACE - The European HPC Infrastructure
NASA Astrophysics Data System (ADS)
Stadelmeyer, Peter
2014-05-01
The mission of PRACE (Partnership for Advanced Computing in Europe) is to enable high-impact scientific discovery and engineering research and development across all disciplines to enhance European competitiveness for the benefit of society. PRACE seeks to realize this mission by offering world-class computing and data management resources and services through a peer-review process. This talk gives a general overview of PRACE and the PRACE research infrastructure (RI). PRACE is established as an international not-for-profit association, and the PRACE RI is a pan-European supercomputing infrastructure that offers access to computing and data management resources at partner sites distributed throughout Europe. Besides a short summary of the organization, history, and activities of PRACE, it is explained how scientists and researchers from academia and industry around the world can access PRACE systems and which education and training activities are offered by PRACE. The overview also contains a selection of PRACE contributions to societal challenges and ongoing activities. Examples of the latter include, among others, petascaling, an application benchmark suite, best-practice guides for the efficient use of key architectures, application enabling and scaling, new programming models, and industrial applications. The Partnership for Advanced Computing in Europe (PRACE) is an international non-profit association with its seat in Brussels. The PRACE Research Infrastructure provides a persistent world-class high performance computing service for scientists and researchers from academia and industry in Europe. The computer systems and their operations accessible through PRACE are provided by 4 PRACE members (BSC representing Spain, CINECA representing Italy, GCS representing Germany and GENCI representing France).
The Implementation Phase of PRACE receives funding from the EU's Seventh Framework Programme (FP7/2007-2013) under grant agreements RI-261557, RI-283493 and RI-312763. For more information, see www.prace-ri.eu
High-performance parallel computing in the classroom using the public goods game as an example
NASA Astrophysics Data System (ADS)
Perc, Matjaž
2017-07-01
The use of computers in statistical physics is common because the sheer number of equations that describe the behaviour of an entire system particle by particle often makes it impossible to solve them exactly. Monte Carlo methods form a particularly important class of numerical methods for solving problems in statistical physics. Although these methods are simple in principle, their proper use requires a good command of statistical mechanics, as well as considerable computational resources. The aim of this paper is to demonstrate how the usage of widely accessible graphics cards on personal computers can elevate the computing power in Monte Carlo simulations by orders of magnitude, thus allowing live classroom demonstration of phenomena that would otherwise be out of reach. As an example, we use the public goods game on a square lattice where two strategies compete for common resources in a social dilemma situation. We show that the second-order phase transition to an absorbing phase in the system belongs to the directed percolation universality class, and we compare the time needed to arrive at this result by means of the main processor and by means of a suitable graphics card. Parallel computing on graphics processing units has been developed actively during the last decade, to the point where today the learning curve for entry is anything but steep for those familiar with programming. The subject is thus ripe for inclusion in graduate and advanced undergraduate curricula, and we hope that this paper will facilitate this process in the realm of physics education. To that end, we provide a documented source code for an easy reproduction of presented results and for further development of Monte Carlo simulations of similar systems.
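For readers without the paper's GPU source code at hand, a serial Monte Carlo sketch of the spatial public goods game conveys the core update loop. The parameter names and the Fermi imitation rule below are generic textbook choices, not the authors' exact setup, and the loop runs on the CPU for clarity rather than on a graphics card:

```python
import math
import random

def run_pgg(L=20, r=3.5, K=0.5, steps=2000, seed=0):
    """Serial Monte Carlo of the spatial public goods game on an L x L
    lattice with periodic boundaries. Returns the final cooperator density."""
    rng = random.Random(seed)
    s = [[rng.randint(0, 1) for _ in range(L)] for _ in range(L)]  # 1 = cooperator

    def neighbors(x, y):
        return [((x + 1) % L, y), ((x - 1) % L, y),
                (x, (y + 1) % L), (x, (y - 1) % L)]

    def payoff(x, y):
        # a player takes part in 5 overlapping groups: its own and its 4 neighbors'
        total = 0.0
        for gx, gy in [(x, y)] + neighbors(x, y):
            members = [(gx, gy)] + neighbors(gx, gy)
            nc = sum(s[j][i] for i, j in members)       # cooperators in group
            total += r * nc / len(members) - s[y][x]    # share minus own cost
        return total

    for _ in range(steps):
        x, y = rng.randrange(L), rng.randrange(L)
        nx, ny = rng.choice(neighbors(x, y))
        if s[y][x] != s[ny][nx]:
            # Fermi rule: imitate the neighbor with a payoff-dependent probability
            p = 1.0 / (1.0 + math.exp((payoff(x, y) - payoff(nx, ny)) / K))
            if rng.random() < p:
                s[y][x] = s[ny][nx]
    return sum(sum(row) for row in s) / (L * L)
```

The GPU version in the paper parallelizes exactly this inner loop across lattice sites, which is what yields the orders-of-magnitude speedup for classroom-scale live demonstrations.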
Walther, Birte; Hanewinkel, Reiner; Morgenstern, Matthis
2014-09-01
The aim of this study was to evaluate the effects of a four-session school-based media literacy curriculum on adolescent computer gaming and Internet use behavior. The study comprised a cluster randomized controlled trial with three assessments (baseline, posttest, and 12-month follow-up). At baseline, a total of 2,303 sixth and seventh grade adolescents from 27 secondary schools were assessed. Of these, 1,843 (80%) could be reached at all three assessments (mean age = 12.0 years; SD = 0.83). Students of the intervention group received the media literacy program Vernetzte www.Welten ("Connected www.Worlds") implemented by trained teachers during class time. The control group attended regular class. Main outcome measures were adolescents' computer gaming and Internet use: days per month, hours per day, and addictive use patterns. Parental media monitoring and rules at home were assessed as secondary outcomes. Results of multilevel growth-curve models revealed a significant intervention effect in terms of a lower increase in self-reported gaming frequency (β = -1.10 [95% CI -2.06, -0.13]), gaming time (β = -0.27 [95% CI -0.40, -0.14]), and proportion of excessive gamers (AOR = 0.21 [95% CI 0.08, 0.57]) in the intervention group. There were also significant group-time interactions for the addictive gaming scale (β = -0.08 [95% CI -0.12, -0.04]) and the Internet Addiction Scale (β = -0.06 [95% CI -0.10, -0.01]). No effect was found for days and hours of Internet use or parental media behavior. The study shows that the program Vernetzte www.Welten can influence adolescents' media use behavior. Future research should address mediating and moderating variables of program effects.
ERIC Educational Resources Information Center
Shen, Ruimin; Wang, Minjuan; Gao, Wanping; Novak, D.; Tang, Lin
2009-01-01
The computer science classes in China's institutions of higher education often have large numbers of students. In addition, many institutions offer "blended" classes that include both on-campus and online students. These large blended classrooms have long suffered from a lack of interactivity. Many online classes simply provide recorded…
ERIC Educational Resources Information Center
Kaufman, Maurice
The Distar I Reading, Language and Arithmetic programs were used with two first grade classes. The Distar II programs were used with two second grade classes. One first grade Distar class appeared to make some progress in oral language. Comparison of the first grade Distar classes with a first grade control class that used a Scott-Foresman basal…
Albert R. Stage; Thomas Ledermann
2008-01-01
We illustrate effects of competitor spacing for a new class of individual-tree indices of competition that we call semi-distance-independent. This new class is similar to the class of distance-independent indices except that the index is computed independently at each subsampling plot surrounding a subject tree for which growth is to be modelled. We derive the effects...
NASA Astrophysics Data System (ADS)
Sugiharti, Gulmah
2018-03-01
This study examines the improvement of student learning outcomes achieved through independent learning with computer-based learning media in the STBM (Teaching and Learning Strategy) Chemistry course. The population comprised all students of the class of 2014 taking STBM Chemistry, a total of four classes. A purposive sample of two classes of 32 students each was drawn, serving as the control class and the experimental class. The instrument was a learning-outcomes test of 20 multiple-choice questions that had been established as valid and reliable. Data were analyzed with a one-sided t test, and improvement in learning outcomes was measured with a normalized gain test. The average normalized gain was 0.530 for the experimental class and 0.224 for the control class, corresponding to learning gains of 53.0% and 22.4%, respectively. Hypothesis testing yielded t_count > t_table (9.02 > 1.6723) at significance level α = 0.05 with db = 58. Ha is therefore accepted: the use of computer-based learning media (CAI) can improve student learning outcomes in the Teaching and Learning Strategy (STBM) Chemistry course in the 2017/2018 academic year.
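The normalized gain statistic reported above (0.530 versus 0.224) is conventionally Hake's gain, the fraction of the attainable score improvement that was actually achieved. A minimal sketch, assuming a 100-point test (function names are illustrative):

```python
def normalized_gain(pre, post, max_score=100.0):
    """Hake's normalized gain: achieved fraction of the possible improvement."""
    return (post - pre) / (max_score - pre)

def mean_gain(pre_scores, post_scores, max_score=100.0):
    """Class-average normalized gain, as reported in the abstract."""
    gains = [normalized_gain(a, b, max_score)
             for a, b in zip(pre_scores, post_scores)]
    return sum(gains) / len(gains)
```

For example, a student improving from 40 to 70 on a 100-point test realizes half of the possible 60-point improvement, a normalized gain of 0.5.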
Towards automatic Markov reliability modeling of computer architectures
NASA Technical Reports Server (NTRS)
Liceaga, C. A.; Siewiorek, D. P.
1986-01-01
The analysis and evaluation of reliability measures using time-varying Markov models is required for Processor-Memory-Switch (PMS) structures that have competing processes such as standby redundancy and repair, or renewal processes such as transient or intermittent faults. The task of generating these models is tedious and prone to human error due to the large number of states and transitions involved in any reasonable system. Therefore model formulation is a major analysis bottleneck, and model verification is a major validation problem. The general unfamiliarity of computer architects with Markov modeling techniques further increases the necessity of automating the model formulation. This paper presents an overview of the Automated Reliability Modeling (ARM) program, under development at NASA Langley Research Center. ARM will accept as input a description of the PMS interconnection graph, the behavior of the PMS components, the fault-tolerant strategies, and the operational requirements. The output of ARM will be the reliability or availability Markov model formulated for direct use by evaluation programs. The advantages of such an approach are (a) utility to a large class of users, not necessarily expert in reliability analysis, and (b) a lower probability of human error in the computation.
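The kind of Markov reliability model ARM is meant to formulate can be illustrated with a hand-built example. The sketch below is not ARM output; the states and rates are invented for illustration. It integrates the Chapman-Kolmogorov equations for a two-unit standby system with repair, where state 0 has both units up, state 1 has one unit failed but repairable, and state 2 is system failure:

```python
def reliability(t_max, dt=0.01, lam=0.001, mu=0.1):
    """Euler integration of the Chapman-Kolmogorov equations for a
    3-state repairable standby system. lam = per-unit failure rate,
    mu = repair rate. Returns the state-probability vector at t_max."""
    p = [1.0, 0.0, 0.0]  # start with both units up
    for _ in range(int(t_max / dt)):
        dp0 = -2 * lam * p[0] + mu * p[1]          # leave 0 on failure, return on repair
        dp1 = 2 * lam * p[0] - (lam + mu) * p[1]   # enter 1 on failure, leave on repair/failure
        dp2 = lam * p[1]                            # absorbing system-failure state
        p = [p[0] + dt * dp0, p[1] + dt * dp1, p[2] + dt * dp2]
    return p
```

Even this tiny model shows why automation helps: every added component or fault class multiplies the states and transitions that must be enumerated by hand.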
An Intervention Study on Mental Computation for Second Graders in Taiwan
ERIC Educational Resources Information Center
Yang, Der-Ching; Huang, Ke-Lun
2014-01-01
The authors compared the mental computation performance and mental strategies used by an experimental Grade 2 class and a control Grade 2 class before and after instructional intervention. Results indicate that students in the experimental group had better performance on mental computation. The use of mental strategies (counting, separation,…
The Computer, the Discipline and the Classroom: Two Perspectives.
ERIC Educational Resources Information Center
Thurber, Bart; Pope, Jack
The authors present two case studies in the use of computers in the classroom, one involving an introductory computer science class, the other an upper division literature class. After describing each case, the differences are discussed, showing that pedagogical models developed for one discipline may not transfer to another, and that the…
The document describes the current Class I UIC program, the history of Class I injection, and studies of human health risks associated with Class I injection wells, which were conducted for past regulatory efforts and policy documentation.
40 CFR 146.81 - Applicability.
Code of Federal Regulations, 2012 CFR
2012-07-01
... INJECTION CONTROL PROGRAM: CRITERIA AND STANDARDS Criteria and Standards Applicable to Class VI Wells § 146... control programs to regulate any Class VI carbon dioxide geologic sequestration injection wells. (b) This...-authorized Class I, Class II, or Class V experimental carbon dioxide injection projects who seek to apply for...
40 CFR 146.81 - Applicability.
Code of Federal Regulations, 2011 CFR
2011-07-01
... INJECTION CONTROL PROGRAM: CRITERIA AND STANDARDS Criteria and Standards Applicable to Class VI Wells § 146... control programs to regulate any Class VI carbon dioxide geologic sequestration injection wells. (b) This...-authorized Class I, Class II, or Class V experimental carbon dioxide injection projects who seek to apply for...
40 CFR 146.81 - Applicability.
Code of Federal Regulations, 2014 CFR
2014-07-01
... INJECTION CONTROL PROGRAM: CRITERIA AND STANDARDS Criteria and Standards Applicable to Class VI Wells § 146... control programs to regulate any Class VI carbon dioxide geologic sequestration injection wells. (b) This...-authorized Class I, Class II, or Class V experimental carbon dioxide injection projects who seek to apply for...
40 CFR 146.81 - Applicability.
Code of Federal Regulations, 2013 CFR
2013-07-01
... INJECTION CONTROL PROGRAM: CRITERIA AND STANDARDS Criteria and Standards Applicable to Class VI Wells § 146... control programs to regulate any Class VI carbon dioxide geologic sequestration injection wells. (b) This...-authorized Class I, Class II, or Class V experimental carbon dioxide injection projects who seek to apply for...
Streaming support for data intensive cloud-based sequence analysis.
Issa, Shadi A; Kienzler, Romeo; El-Kalioby, Mohamed; Tonellato, Peter J; Wall, Dennis; Bruggmann, Rémy; Abouelhoda, Mohamed
2013-01-01
Cloud computing provides a promising solution to the genomics data deluge problem resulting from the advent of next-generation sequencing (NGS) technology. Based on the concepts of "resources-on-demand" and "pay-as-you-go", scientists with no or limited infrastructure can have access to scalable and cost-effective computational resources. However, the large size of NGS data causes a significant data transfer latency from the client's site to the cloud, which presents a bottleneck for using cloud computing services. In this paper, we provide a streaming-based scheme to overcome this problem, where the NGS data is processed while being transferred to the cloud. Our scheme targets the wide class of NGS data analysis tasks, where the NGS sequences can be processed independently from one another. We also provide the elastream package that supports the use of this scheme with individual analysis programs or with workflow systems. Experiments presented in this paper show that our solution mitigates the effect of data transfer latency and saves both time and cost of computation.
Users Guide to the JPL Doppler Gravity Database
NASA Technical Reports Server (NTRS)
Muller, P. M.; Sjogren, W. L.
1986-01-01
Local gravity accelerations and gravimetry have been determined directly from spacecraft Doppler tracking data near the Moon and various planets by the Jet Propulsion Laboratory. Researchers in many fields have an interest in planet-wide global gravimetric mapping and its applications. Many of them use their own computers in support of their studies and would benefit from being able to directly manipulate these gravity data for inclusion in their own modeling computations. Publication of some 150 Apollo 15 subsatellite low-altitude, high-resolution, single-orbit data sets is covered. The Doppler residuals, with a determination of the derivative function providing line-of-sight gravity, are both listed and plotted (on microfilm), and can be ordered in computer-readable forms (tape and floppy disk). The form and format of this database as well as the methods of data reduction are explained and referenced. A skeleton computer program is provided which can be modified to support re-reductions and re-formatted presentations suitable to a wide variety of research needs undertaken on mainframe or PC-class microcomputers.
Approximate labeling via graph cuts based on linear programming.
Komodakis, Nikos; Tziritas, Georgios
2007-08-01
A new framework is presented for both understanding and developing graph-cut-based combinatorial algorithms suitable for the approximate optimization of a very wide class of Markov Random Fields (MRFs) that are frequently encountered in computer vision. The proposed framework utilizes tools from the duality theory of linear programming in order to provide an alternative and more general view of state-of-the-art techniques like the α-expansion algorithm, which is included merely as a special case. Moreover, contrary to α-expansion, the derived algorithms generate solutions with guaranteed optimality properties for a much wider class of problems, for example, even for MRFs with nonmetric potentials. In addition, they are capable of providing per-instance suboptimality bounds on all occasions, including discrete MRFs with an arbitrary potential function. These bounds prove to be very tight in practice (that is, very close to 1), which means that the resulting solutions are almost optimal. Our algorithms' effectiveness is demonstrated by presenting experimental results on a variety of low-level vision tasks, such as stereo matching, image restoration, image completion, and optical flow estimation, as well as on synthetic problems.
NASA Technical Reports Server (NTRS)
Cibula, W. G.
1981-01-01
Four LANDSAT frames, each corresponding to one of the four seasons, were spectrally classified and processed using NASA-developed computer programs. One data set was selected, or two or more data sets were merged, to improve surface cover classifications. Selected areas representing each spectral class were chosen and transferred to USGS 1:62,500 topographic maps for field use. Ground truth data were gathered to verify the accuracy of the classifications. Acreages were computed for each of the land cover types. The application of elevational data to seasonal LANDSAT frames resulted in the separation of high-elevation meadows (both with and without recently emergent perennial vegetation) as well as areas in oak forests which have an evergreen understory as opposed to other areas which do not.
Giegerich, Robert; Voss, Björn; Rehmsmeier, Marc
2004-01-01
The function of a non-protein-coding RNA is often determined by its structure. Since experimental determination of RNA structure is time-consuming and expensive, its computational prediction is of great interest, and efficient solutions based on thermodynamic parameters are known. Frequently, however, the predicted minimum free energy structures are not the native ones, leading to the necessity of generating suboptimal solutions. While this can be accomplished by a number of programs, the user is often confronted with large outputs of similar structures, although he or she is interested in structures with more fundamental differences, or, in other words, with different abstract shapes. Here, we formalize the concept of abstract shapes and introduce their efficient computation. Each shape of an RNA molecule comprises a class of similar structures and has a representative structure of minimal free energy within the class. Shape analysis is implemented in the program RNAshapes. We applied RNAshapes to the prediction of optimal and suboptimal abstract shapes of several RNAs. For a given energy range, the number of shapes is considerably smaller than the number of structures, and in all cases, the native structures were among the top shape representatives. This demonstrates that the researcher can quickly focus on the structures of interest, without processing up to thousands of near-optimal solutions. We complement this study with a large-scale analysis of the growth behaviour of structure and shape spaces. RNAshapes is available for download and as an online version on the Bielefeld Bioinformatics Server.
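The notion of an abstract shape can be made concrete for dot-bracket secondary structures: unpaired bases are discarded and each helix collapses to a single bracket pair. The sketch below is a simplified reading of the paper's most abstract shape level, not the RNAshapes implementation, and assumes well-balanced input:

```python
def abstract_shape(db):
    """Collapse a dot-bracket secondary structure to an abstract shape:
    unpaired bases removed, each helix reduced to one [] pair."""
    # build a nesting tree of base pairs; each node is a list of children
    root = []
    stack = [root]
    for c in db:
        if c == "(":
            node = []
            stack[-1].append(node)
            stack.append(node)
        elif c == ")":
            stack.pop()
        # '.' (unpaired base) is simply ignored

    def render(children):
        out = ""
        for ch in children:
            # collapse a helix: descend through chains of single nested pairs
            while len(ch) == 1:
                ch = ch[0]
            out += "[" + render(ch) + "]"
        return out

    return render(root)
```

For example, a hairpin of any helix length maps to `[]`, and a structure with one outer helix enclosing two inner helices maps to `[[][]]`, so thousands of near-optimal foldings can share a handful of shapes.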
ERIC Educational Resources Information Center
Kausar, Tayyaba; Choudhry, Bushra Naoreen; Gujjar, Aijaz Ahmed
2008-01-01
This study was aimed to evaluate the effectiveness of CAI vs. classroom lecture for computer science at ICS level. The objectives were to compare the learning effects of two groups with class room lecture and computer assisted instruction studying the same curriculum and the effects of CAI and CRL in terms of cognitive development. Hypothesis of…
Iwasaki, T; Sato, H; Suga, H; Takemoto, Y; Inada, E; Saitoh, I; Kakuno, K; Kanomi, R; Yamasaki, Y
2017-05-01
To examine the influence of negative pressure of the pharyngeal airway on mandibular retraction during inspiration in children with nasal obstruction using the computational fluid dynamics (CFD) method. Sixty-two children were divided into Classes I, II (mandibular retrusion) and III (mandibular protrusion) malocclusion groups. Cone-beam computed tomography data were used to reconstruct three-dimensional shapes of the nasal and pharyngeal airways. Airflow pressure was simulated using CFD to calculate nasal resistance and pharyngeal airway pressure during inspiration and expiration. Nasal resistance of the Class II group was significantly higher than that of the other two groups, and oropharyngeal airway inspiration pressure in the Class II (-247.64 Pa) group was larger than that in the Class I (-43.51 Pa) and Class III (-31.81 Pa) groups (P<.001). The oropharyngeal airway inspiration-expiration pressure difference in the Class II (-27.38 Pa) group was larger than that in the Class I (-5.17 Pa) and Class III (0.68 Pa) groups (P=.006). Large negative inspiratory pharyngeal airway pressure due to nasal obstruction in children with Class II malocclusion may be related to their retrognathia. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Risser, Dennis W.
2008-01-01
This report presents the results of a study by the U.S. Geological Survey, in cooperation with the Pennsylvania Geological Survey, to illustrate a water-budget method for mapping the spatial distribution of ground-water recharge for a 76-square-mile part of the Jordan Creek watershed, northwest of Allentown, in Lehigh County, Pennsylvania. Recharge was estimated by using the Hydrological Evaluation of Landfill Performance (HELP) water-budget model for 577 landscape units in Jordan Creek watershed, delineated on the basis of their soils, land use/land cover, and mean annual precipitation during 1951-2000. The water-budget model routes precipitation falling on each landscape unit to components of evapotranspiration, surface runoff, storage, and vertical percolation (recharge) for a five-layer soil column on a daily basis. The spatial distribution of mean annual recharge during 1951-2000 for each landscape unit was mapped by the use of a geographic information system. Recharge simulated by the water-budget model in Jordan Creek watershed during 1951-2000 averaged 12.3 inches per year and ranged by landscape unit from 0.11 to 17.05 inches per year. Mean annual recharge during 1951-2000 simulated by the water-budget model was most sensitive to changes to input values for precipitation and runoff-curve number. Mean annual recharge values for the crop, forest, pasture, and low-density urban land-use/land-cover classes were similar (11.2 to 12.2 inches per year) but were substantially less for high-density urban (6.8 inches per year), herbaceous wetlands (2.5 inches per year), and forested wetlands (1.3 inches per year). Recharge rates simulated for the crop, forest, pasture, and low-density urban land-cover classes were similar because those land-use/land-cover classes are represented in the model with parameter values that either did not significantly affect simulated recharge or tended to have offsetting effects on recharge. 
For example, for landscapes with forest land cover, values of runoff-curve number assigned to the model were smaller than for other land-use/land-cover classes (causing more recharge and less runoff), but the maximum depth of evapotranspiration was larger than for other land-use/land-cover classes because of deeper root penetration in forests (causing more evapotranspiration and less recharge). The smaller simulated recharge for high-density urban and wetland land-use/land-cover classes was caused by the large values of runoff-curve number (greater than 90) assigned to those classes. The large runoff-curve number, however, certainly is not realistic for all wetlands; some wetlands act as areas of ground-water discharge and some as areas of recharge. Simulated mean annual recharge computed by the water-budget model for the 53-square-mile part of the watershed upstream from the streamflow-gaging station near Schnecksville was compared to estimates of recharge and base flow determined by analysis of streamflow records from 1967 to 2000. The mean annual recharge of 12.4 inches per year simulated by the water-budget method for 1967-2000 was less than estimates of mean annual recharge of 19.3 inches per year computed from the RORA computer program and base flow computed by the PART computer program (15.1 inches per year). In theory, the water-budget method provides a practical tool for estimating differences in recharge at local scales of interest, and the watershed-average recharge rate of 12.4 inches per year computed by the method is reasonable. However, the mean annual surface runoff of 4.5 inches per year simulated by the model is unrealistically small.
The sum of surface runoff and recharge simulated by the water-budget model (16.9 inches per year) is 7 inches per year less than the streamflow measured at the gaging station near Schnecksville (23.9 inches per year) during 1967-2000, indicating that evapotranspiration is overestimated by the water-budget model by that amount. This discrepancy ca
DUKSUP: A Computer Program for High Thrust Launch Vehicle Trajectory Design and Optimization
NASA Technical Reports Server (NTRS)
Williams, C. H.; Spurlock, O. F.
2014-01-01
From the late 1960's through 1997, the leadership of NASA's Intermediate and Large class unmanned expendable launch vehicle projects resided at the NASA Lewis (now Glenn) Research Center (LeRC). One of LeRC's primary responsibilities --- trajectory design and performance analysis --- was accomplished by an internally-developed analytic three dimensional computer program called DUKSUP. Because of its Calculus of Variations-based optimization routine, this code was generally more capable of finding optimal solutions than its contemporaries. A derivation of optimal control using the Calculus of Variations is summarized including transversality, intermediate, and final conditions. The two point boundary value problem is explained. A brief summary of the code's operation is provided, including iteration via the Newton-Raphson scheme and integration of variational and motion equations via a 4th order Runge-Kutta scheme. Main subroutines are discussed. The history of the LeRC trajectory design efforts in the early 1960's is explained within the context of supporting the Centaur upper stage program. How the code was constructed based on the operation of the Atlas/Centaur launch vehicle, the limits of the computers of that era, the limits of the computer programming languages, and the missions it supported are discussed. The vehicles DUKSUP supported (Atlas/Centaur, Titan/Centaur, and Shuttle/Centaur) are briefly described. The types of missions, including Earth orbital and interplanetary, are described. The roles of flight constraints and their impact on launch operations are detailed (such as jettisoning hardware on heating, Range Safety, ground station tracking, and elliptical parking orbits). The computer main frames on which the code was hosted are described. The applications of the code are detailed, including independent check of contractor analysis, benchmarking, leading edge analysis, and vehicle performance improvement assessments. 
Several of DUKSUP's many major impacts on launches are discussed including Intelsat, Voyager, Pioneer Venus, HEAO, Galileo, and Cassini.
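The classical 4th-order Runge-Kutta scheme mentioned above for integrating the motion and variational equations takes the standard textbook form below; this is a generic sketch in Python, not DUKSUP's actual implementation:

```python
def rk4_step(f, t, y, h):
    """One classical 4th-order Runge-Kutta step for y' = f(t, y),
    with y a list of state variables (e.g. position and velocity
    components in a trajectory integration)."""
    def add(a, b, s):
        # a + s*b, elementwise
        return [ai + s * bi for ai, bi in zip(a, b)]
    k1 = f(t, y)
    k2 = f(t + h / 2, add(y, k1, h / 2))
    k3 = f(t + h / 2, add(y, k2, h / 2))
    k4 = f(t + h, add(y, k3, h))
    # weighted average of the four slope estimates
    return [yi + h / 6 * (a + 2 * b + 2 * c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]
```

Its fourth-order accuracy (global error shrinking like the step size to the fourth power) made it a workhorse for trajectory codes on the limited machines of that era.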
DUKSUP: A Computer Program for High Thrust Launch Vehicle Trajectory Design and Optimization
NASA Technical Reports Server (NTRS)
Spurlock, O. Frank; Williams, Craig H.
2015-01-01
From the late 1960s through 1997, the leadership of NASA's Intermediate and Large class unmanned expendable launch vehicle projects resided at the NASA Lewis (now Glenn) Research Center (LeRC). One of LeRC's primary responsibilities --- trajectory design and performance analysis --- was accomplished by an internally-developed analytic three dimensional computer program called DUKSUP. Because of its Calculus of Variations-based optimization routine, this code was generally more capable of finding optimal solutions than its contemporaries. A derivation of optimal control using the Calculus of Variations is summarized including transversality, intermediate, and final conditions. The two point boundary value problem is explained. A brief summary of the code's operation is provided, including iteration via the Newton-Raphson scheme and integration of variational and motion equations via a 4th order Runge-Kutta scheme. Main subroutines are discussed. The history of the LeRC trajectory design efforts in the early 1960s is explained within the context of supporting the Centaur upper stage program. How the code was constructed based on the operation of the Atlas/Centaur launch vehicle, the limits of the computers of that era, the limits of the computer programming languages, and the missions it supported are discussed. The vehicles DUKSUP supported (Atlas/Centaur, Titan/Centaur, and Shuttle/Centaur) are briefly described. The types of missions, including Earth orbital and interplanetary, are described. The roles of flight constraints and their impact on launch operations are detailed (such as jettisoning hardware on heating, Range Safety, ground station tracking, and elliptical parking orbits). The computer main frames on which the code was hosted are described. The applications of the code are detailed, including independent check of contractor analysis, benchmarking, leading edge analysis, and vehicle performance improvement assessments.
Several of DUKSUP's many major impacts on launches are discussed including Intelsat, Voyager, Pioneer Venus, HEAO, Galileo, and Cassini.
40 CFR 147.2650 - State-administered program-Class I, II, III, IV, and V wells.
Code of Federal Regulations, 2010 CFR
2010-07-01
... CONTROL PROGRAMS Puerto Rico § 147.2650 State-administered program—Class I, II, III, IV, and V wells. The Underground Injection Control Program for all classes of wells in the Commonwealth of Puerto Rico, other than those on Indian lands, is the program administered by Puerto Rico's Environmental Quality Board (EQB...
40 CFR 147.2650 - State-administered program-Class I, II, III, IV, and V wells.
Code of Federal Regulations, 2011 CFR
2011-07-01
... CONTROL PROGRAMS Puerto Rico § 147.2650 State-administered program—Class I, II, III, IV, and V wells. The Underground Injection Control Program for all classes of wells in the Commonwealth of Puerto Rico, other than those on Indian lands, is the program administered by Puerto Rico's Environmental Quality Board (EQB...
40 CFR 147.2650 - State-administered program-Class I, II, III, IV, and V wells.
Code of Federal Regulations, 2013 CFR
2013-07-01
... CONTROL PROGRAMS Puerto Rico § 147.2650 State-administered program—Class I, II, III, IV, and V wells. The Underground Injection Control Program for all classes of wells in the Commonwealth of Puerto Rico, other than those on Indian lands, is the program administered by Puerto Rico's Environmental Quality Board (EQB...
ERIC Educational Resources Information Center
O'Brien, George M.
This paper reports on an on-going experiment in computer-aided language instruction. In 1972, a class of beginning German students at the Duluth campus of the University of Minnesota volunteered to test two pedagogic theories: (1) Could a computer-aided course be used by a class and an instructor who knew nothing of computers and who had to rely…
NASA Astrophysics Data System (ADS)
Cai, Ailong; Li, Lei; Zheng, Zhizhong; Zhang, Hanming; Wang, Linyuan; Hu, Guoen; Yan, Bin
2018-02-01
In medical imaging many conventional regularization methods, such as total variation or total generalized variation, impose strong prior assumptions which can only account for very limited classes of images. A more reasonable sparse representation frame for images is still badly needed. Visually understandable images contain meaningful patterns, and combinations or collections of these patterns can be utilized to form some sparse and redundant representations which promise to facilitate image reconstructions. In this work, we propose and study block matching sparsity regularization (BMSR) and devise an optimization program using BMSR for computed tomography (CT) image reconstruction from an incomplete projection set. The program is built as a constrained optimization, minimizing the L1-norm of the coefficients of the image in the transformed domain subject to data observation and positivity of the image itself. To solve the program efficiently, a practical method based on the proximal point algorithm is developed and analyzed. In order to accelerate the convergence rate, a practical strategy for tuning the BMSR parameter is proposed and applied. The experimental results for various settings, including real CT scanning, have verified that the proposed reconstruction method offers promising capabilities over conventional regularization methods.
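The elementwise core of such proximal methods for L1-norm minimization is the well-known soft-thresholding operator. The sketch below is a generic illustration of that operator, not the authors' BMSR implementation; the coefficient values are invented:

```python
# Soft-thresholding: the proximal operator of lam * ||x||_1, applied
# elementwise. Proximal-point schemes for L1-regularized reconstruction
# repeatedly shrink transform-domain coefficients in this way.

def soft_threshold(x, lam):
    """Return the prox of lam*||.||_1 at x: shrink each entry toward zero."""
    out = []
    for v in x:
        mag = max(abs(v) - lam, 0.0)  # magnitudes below lam are zeroed
        out.append(mag if v >= 0 else -mag)
    return out

coeffs = [3.0, -0.5, 1.2, 0.1]
shrunk = soft_threshold(coeffs, 1.0)
# Entries with |v| <= 1 vanish; the rest move 1.0 toward zero.
```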
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grulke, Eric; Stencel, John
2011-09-13
The KY DOE EPSCoR Program supports two research clusters. The Materials Cluster uses unique equipment and computational methods that involve research expertise at the University of Kentucky and University of Louisville. This team determines the physical, chemical and mechanical properties of nanostructured materials and examines the dominant mechanisms involved in the formation of new self-assembled nanostructures. State-of-the-art parallel computational methods and algorithms are used to overcome current limitations of processing that otherwise are restricted to small system sizes and short times. The team also focuses on developing and applying advanced microtechnology fabrication techniques and the application of microelectromechanical systems (MEMS) for creating new materials, novel microdevices, and integrated microsensors. The second research cluster concentrates on High Energy and Nuclear Physics. It connects research and educational activities at the University of Kentucky, Eastern Kentucky University and national DOE research laboratories. Its vision is to establish world-class research status dedicated to experimental and theoretical investigations in strong interaction physics. The research provides a forum, facilities, and support for scientists to interact and collaborate in subatomic physics research. The program enables increased student involvement in fundamental physics research through the establishment of graduate fellowships and collaborative work.
Intrasystem Analysis Program (IAP) code summaries
NASA Astrophysics Data System (ADS)
Dobmeier, J. J.; Drozd, A. L. S.; Surace, J. A.
1983-05-01
This report contains detailed descriptions and capabilities of the codes that comprise the Intrasystem Analysis Program. The four codes are: Intrasystem Electromagnetic Compatibility Analysis Program (IEMCAP), General Electromagnetic Model for the Analysis of Complex Systems (GEMACS), Nonlinear Circuit Analysis Program (NCAP), and Wire Coupling Prediction Models (WIRE). IEMCAP is used for computer-aided evaluation of electromagnetic compatibility (EMC) at all stages of an Air Force system's life cycle, applicable to aircraft, space/missile, and ground-based systems. GEMACS utilizes a Method of Moments (MOM) formalism with the Electric Field Integral Equation (EFIE) for the solution of electromagnetic radiation and scattering problems. The code employs both full matrix decomposition and Banded Matrix Iteration solution techniques and is expressly designed for large problems. NCAP is a circuit analysis code which uses the Volterra approach to solve for the transfer functions and node voltages of weakly nonlinear circuits. The WIRE programs apply multiconductor transmission line theory to the prediction of cable coupling for specific classes of problems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lu, Qingda; Gao, Xiaoyang; Krishnamoorthy, Sriram
Empirical optimizers like ATLAS have been very effective in optimizing computational kernels in libraries. The best choice of parameters such as tile size and degree of loop unrolling is determined by executing different versions of the computation. In contrast, optimizing compilers use a model-driven approach to program transformation. While the model-driven approach of optimizing compilers is generally orders of magnitude faster than ATLAS-like library generators, its effectiveness can be limited by the accuracy of the performance models used. In this paper, we describe an approach where a class of computations is modeled in terms of constituent operations that are empirically measured, thereby allowing modeling of the overall execution time. The performance model with empirically determined cost components is used to perform data layout optimization together with the selection of library calls and layout transformations in the context of the Tensor Contraction Engine, a compiler for a high-level domain-specific language for expressing computational models in quantum chemistry. The effectiveness of the approach is demonstrated through experimental measurements on representative computations from quantum chemistry.
Teaching scientific thinking skills: Students and computers coaching each other
NASA Astrophysics Data System (ADS)
Reif, Frederick; Scott, Lisa A.
1999-09-01
Our attempts to improve physics instruction have led us to analyze thought processes needed to apply scientific principles to problems—and to recognize that reliable performance requires the basic cognitive functions of deciding, implementing, and assessing. Using a reciprocal-teaching strategy to teach such thought processes explicitly, we have developed computer programs called PALs (Personal Assistants for Learning) in which computers and students alternately coach each other. These computer-implemented tutorials make it practically feasible to provide students with individual guidance and feedback ordinarily unavailable in most courses. We constructed PALs specifically designed to teach the application of Newton's laws. In a comparative experimental study these computer tutorials were found to be nearly as effective as individual tutoring by expert teachers—and considerably more effective than the instruction provided in a well-taught physics class. Furthermore, almost all of the students using the PALs perceived them as very helpful to their learning. These results suggest that the proposed instructional approach could fruitfully be extended to improve instruction in various practically realistic contexts.
NASA Astrophysics Data System (ADS)
Tucker, Laura Jane
Under the harsh conditions of limited nutrients and a hard growth surface, Paenibacillus dendritiformis on agar plates forms two classes of patterns (morphotypes). The first class, called the dendritic morphotype, has radially directed branches. The second class, called the chiral morphotype, exhibits uniform handedness. The dendritic morphotype has been modeled successfully using a continuum model on a regular lattice; however, a suitable computational approach was not known for solving a continuum chiral model. This work details a new computational approach to solving the chiral continuum model of pattern formation in P. dendritiformis. The approach utilizes a random computational lattice and new methods for calculating certain derivative terms found in the model.
Nonproliferation Graduate Fellowship Program, Annual Report, Class of 2012
DOE Office of Scientific and Technical Information (OSTI.GOV)
McMakin, Andrea H.
2013-09-23
This 32-page annual report/brochure describes the accomplishments of the Class of 2012 of the Nonproliferation Graduate Fellowship Program (the last class of this program), which PNNL administers for the National Nuclear Security Administration. The time period covers September 2011 through June 2013.
NASA Astrophysics Data System (ADS)
Lee, Kevin M.; French, R. S.; Hands, D. R.; Loranz, D. R.; Martino, D.; Rudolph, A. L.; Wysong, J.; Young, T. S.; Prather, E. E.; CATS
2010-01-01
ClassAction is a computer database of materials designed to enhance the conceptual understanding and reasoning abilities of Astro 101 students by promoting interactive engagement and providing rapid feedback. The main focus is dynamic conceptual questions largely based upon graphics that can be projected in the classroom. Instructors have the capability to select, order, and recast these questions into alternate permutations based on their own preferences and student responses. Instructors may also provide feedback through extensive resources including outlines, graphics, and simulations. The Light and Spectroscopy Concept Inventory (LSCI) is a multiple-choice assessment instrument which focuses on the electromagnetic spectrum, Doppler shift, Wien's Law, Stefan-Boltzmann Law, and Kirchhoff's Laws. Illustrative examples of how these concepts are targeted by the questions and resources of the ClassAction module are shown. ClassAction materials covering light and spectra concepts were utilized in multiple classrooms at 6 different institutions and the LSCI was delivered as a pretest and posttest to measure the gains in student understanding. A comparison of the gains achieved in these classes will be made against the national LSCI data. We will report on our investigation into correlations between gain and the extent of ClassAction usage. ClassAction materials are publicly available at http://astro.unl.edu. We would like to thank the NSF for funding under Grant Nos. 0404988 and 0715517, a CCLI Phase III Grant for the Collaboration of Astronomy Teaching Scholars (CATS) Program.
Computer models of social processes: the case of migration.
Beshers, J M
1967-06-01
The demographic model is a program for representing births, deaths, migration, and social mobility as social processes in a non-stationary stochastic process (Markovian). Transition probabilities for each age group are stored and then retrieved at the next appearance of that age cohort. In this way new transition probabilities can be calculated as a function of the old transition probabilities and of two successive distribution vectors. Transition probabilities can be calculated to represent effects of the whole age-by-state distribution at any given time period, too. Such effects as saturation or queuing may be represented by a market mechanism; for example, migration between metropolitan areas can be represented as depending upon job supplies and labor markets. Within metropolitan areas, migration can be represented as invasion and succession processes with tipping points (acceleration curves), and the market device has been extended to represent this phenomenon. Thus, the demographic model makes possible the representation of alternative classes of models of demographic processes. With each class of model one can deduce implied time series (varying parameters within the class) and the output of the several classes can be compared to each other and to outside criteria, such as empirical time series.
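The Markovian step the abstract describes, advancing a distribution vector by stored transition probabilities, can be sketched in a few lines; the two-region matrix and numbers are invented for illustration:

```python
# One period of a discrete Markov process: the new distribution is the
# old one multiplied by the transition matrix. In the demographic model
# these probabilities would be recalculated between periods (making the
# process non-stationary); here they are held fixed for simplicity.

def advance(dist, P):
    """new_dist[j] = sum_i dist[i] * P[i][j]"""
    n = len(P[0])
    return [sum(dist[i] * P[i][j] for i in range(len(dist)))
            for j in range(n)]

# Hypothetical two-region migration: row i holds the probabilities that a
# resident of region i stays in place or moves during one period.
P = [[0.9, 0.1],
     [0.2, 0.8]]
pop = [0.5, 0.5]
pop = advance(pop, P)  # population shifts toward the "stickier" region
```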
Buying into the Computer Age: A Look at the Hispanic Middle Class.
ERIC Educational Resources Information Center
Wilhelm, Anthony G.
The Tomas Rivera Policy Institute conducted focus groups in the summer of 1997 to gain insight into why there is a gap in computer ownership between Hispanic middle-class families and non-Hispanic families of the same middle class income bracket (between 25 and 50 thousand dollars). Results from 6 focus groups of 15 to 20 heads of household each…
40 CFR 147.650 - State-administrative program-Class I, II, III, IV, and V wells.
Code of Federal Regulations, 2010 CFR
2010-07-01
... CONTROL PROGRAMS Idaho § 147.650 State-administrative program—Class I, II, III, IV, and V wells. The UIC program for Class I, II, III, IV, and V wells in the State of Idaho, other than those on Indian lands, is the program administered by the Idaho Department of Water Resources, approved by EPA pursuant to...
Virtual Doors to Brick and Mortar Learning
NASA Astrophysics Data System (ADS)
Shaw, M. S.; Gay, P. L.; Meyer, D. T.; Zamfirescu, J. D.; Smith, J. E.; MIT Educational Studies Program Team
2005-12-01
The MIT Educational Studies Program (ESP) has spent the past year developing an online gateway for outreach programs. The website has a five-fold purpose: to introduce the organization to potential students, teachers, volunteers and collaborators; to allow teachers to create, design and interact with classes; to allow students to register for and dialogue with these classes; to provide an online forum for continuing dialogue; and to provide organizers a wiki for documenting program administration. What makes our site unique is the free and flexible nature of our easily edited and expanded code. In its standard installation, teachers set up classes, and administrators can approve/edit classes and make classes visible in an online catalogue. Student registration is completely customizable - students can register for self-selected classes, or they can register for a program and later get placed into teacher-selected classes. Free wiki software allows users to interactively create and edit documentation and knowledgebases. This allows administrators to track online what has been done while at the same time creating instant documentation for future programs. The online forum is a place where students can go after our programs end to learn more, interact with their classmates, and continue dialogues started in our classrooms. We also use the forum to get feedback on past and future programs. The ease with which the software handles program creation, registration, communications and more allows programs for roughly 3000 students per year to be handled by about 20 volunteering undergraduates. By combining all these elements - promotion, class creation, program registration, an organizational wiki, and student forums - we create a one-stop virtual entryway into face-to-face learning that allows students to continue their experience after they leave the classroom. The code for this site is available for free upon request to other organizations.
BONSAI Garden: Parallel knowledge discovery system for amino acid sequences
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shoudai, T.; Miyano, S.; Shinohara, A.
1995-12-31
We have developed a machine discovery system BONSAI which receives positive and negative examples as inputs and produces as a hypothesis a pair of a decision tree over regular patterns and an alphabet indexing. This system has succeeded in discovering reasonable knowledge on transmembrane domain sequences and signal peptide sequences by computer experiments. However, when several kinds of sequences are mixed in the data, it does not seem reasonable for a single BONSAI system to find a hypothesis of a reasonably small size with high accuracy. For this purpose, we have designed a system BONSAI Garden, in which several BONSAIs and a program called Gardener run over a network in parallel, to partition the data into some number of classes together with hypotheses explaining these classes accurately.
NASA Astrophysics Data System (ADS)
Babaali, Parisa; Gonzalez, Lidia
2015-07-01
Supporting student success in entry-level mathematics courses at the undergraduate level has been and continues to be a challenge. Recently we have seen an increased reliance on technological supports, including software to supplement more traditional in-class instruction. In this paper, we explore the effects on student performance of the use of a computer software program to supplement instruction in an entry-level mathematics course at the undergraduate level, specifically, a pre-calculus course. Relying on data from multiple sections of the course over various semesters, we compare student performance in those classes utilizing the software against those in which it was not used. Quantitative analysis of the data then leads us to conclusions about the effectiveness of the software as well as recommendations for future iterations of the course and others like it.
Computer applications in remote sensing education
NASA Technical Reports Server (NTRS)
Danielson, R. L.
1980-01-01
Computer applications to instruction in any field may be divided into two broad generic classes: computer-managed instruction and computer-assisted instruction. The division is based on how frequently the computer affects the instructional process and how active a role the computer takes in actually providing instruction. There are no inherent characteristics of remote sensing education to preclude the use of one or both of these techniques, depending on the computer facilities available to the instructor. The characteristics of the two classes are summarized, potential applications to remote sensing education are discussed, and the advantages and disadvantages of computer applications to the instructional process are considered.
Teaching and learning experiences in a collaborative distance-education environment.
Martin, Peter; Scheetz, Laura Temple
2011-01-01
The Great Plains Distance Education Alliance (Great Plains IDEA) emphasizes the importance of a collaborative environment for instructors and students in distance education. The authors highlight a number of important principles for distance-education programs and point out similarities and differences when compared to traditional face-to-face classes, such as communication, classroom management, connectivity, and technical challenges. They summarize general topics concerning the faculty, the syllabus, office hours, the calendar, and announcements. Three essential lesson components are noted: an overview, the lesson itself, and supplementary material. The authors also take the student perspective, emphasizing the diversity of students, the importance of computer proficiency, and student interactions. Finally, they summarize a first round of course evaluations in the Great Plains IDEA gerontology master's program.
NASA Astrophysics Data System (ADS)
Bourgeois, E.; Bokanowski, O.; Zidani, H.; Désilles, A.
2018-06-01
The resolution of the launcher ascent trajectory problem by the so-called Hamilton-Jacobi-Bellman (HJB) approach, relying on the Dynamic Programming Principle, has been investigated. The method gives a global optimum and does not need any initialization procedure. Despite these advantages, this approach is seldom used because of the difficulties of computing the solution of the HJB equation for high-dimension problems. The present study shows that an efficient resolution can be found. An illustration of the method is proposed on a heavy-class launcher, for a typical GEO (Geostationary Earth Orbit) mission. This study has been performed in the frame of the Centre National d'Etudes Spatiales (CNES) Launchers Research & Technology Program.
Automatic data partitioning on distributed memory multicomputers. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Gupta, Manish
1992-01-01
Distributed-memory parallel computers are increasingly being used to provide high levels of performance for scientific applications. Unfortunately, such machines are not very easy to program. A number of research efforts seek to alleviate this problem by developing compilers that take over the task of generating communication. The communication overheads and the extent of parallelism exploited in the resulting target program are determined largely by the manner in which data is partitioned across different processors of the machine. Most of the compilers provide no assistance to the programmer in the crucial task of determining a good data partitioning scheme. A novel approach is presented, the constraints-based approach, to the problem of automatic data partitioning for numeric programs. In this approach, the compiler identifies some desirable requirements on the distribution of various arrays being referenced in each statement, based on performance considerations. These desirable requirements are referred to as constraints. For each constraint, the compiler determines a quality measure that captures its importance with respect to the performance of the program. The quality measure is obtained through static performance estimation, without actually generating the target data-parallel program with explicit communication. Each data distribution decision is taken by combining all the relevant constraints. The compiler attempts to resolve any conflicts between constraints such that the overall execution time of the parallel program is minimized. This approach has been implemented as part of a compiler called Paradigm, that accepts Fortran 77 programs, and specifies the partitioning scheme to be used for each array in the program. We have obtained results on some programs taken from the Linpack and Eispack libraries, and the Perfect Benchmarks. 
These results are quite promising, and demonstrate the feasibility of automatic data partitioning for a significant class of scientific application programs with regular computations.
Implementing Machine Learning in Radiology Practice and Research.
Kohli, Marc; Prevedello, Luciano M; Filice, Ross W; Geis, J Raymond
2017-04-01
The purposes of this article are to describe concepts that radiologists should understand to evaluate machine learning projects, including common algorithms, supervised as opposed to unsupervised techniques, statistical pitfalls, and data considerations for training and evaluation, and to briefly describe ethical dilemmas and legal risk. Machine learning includes a broad class of computer programs that improve with experience. The complexity of creating, training, and monitoring machine learning indicates that the success of the algorithms will require radiologist involvement for years to come, leading to engagement rather than replacement.
1983-06-01
Naval Postgraduate School, Monterey, CA 93940. R. T. Schwab, June 1983. Thesis Advisor: Donald M. Layton. Unclassified; approved for public release, distribution unlimited. 61 pages.
On Tree-Based Phylogenetic Networks.
Zhang, Louxin
2016-07-01
A large class of phylogenetic networks can be obtained from trees by the addition of horizontal edges between the tree edges. These networks are called tree-based networks. We present a simple necessary and sufficient condition for tree-based networks and prove that a universal tree-based network exists for any number of taxa that contains as its base every phylogenetic tree on the same set of taxa. This answers two problems posted by Francis and Steel recently. A byproduct is a computer program for generating random binary phylogenetic networks under the uniform distribution model.
Future remote-sensing programs
NASA Technical Reports Server (NTRS)
Schweickart, R. L.
1975-01-01
User requirements and methods developed to fulfill them are discussed. Quick-look data, data storage on computer-compatible tape, and an integrated capability for production of images from the whole class of earth-viewing satellites are among the new developments briefly described. The increased capability of LANDSAT-C and Nimbus G and the needs of specialized applications such as urban land use planning, cartography, accurate measurement of small agricultural fields, thermal mapping, and coastal zone management are examined. The effect of the space shuttle on remote sensing technology through increased capability is considered.
HyperCard and Other Macintosh Applications in Astronomy Education
NASA Astrophysics Data System (ADS)
Meisel, D.
1992-12-01
For the past six years, Macintosh computers have been used in introductory astronomy classes and laboratories with HyperCard and other commercial Macintosh software. I will review some of the available software that has been found particularly useful in undergraduate situations. The review will start with HyperCard (a programmable "index card" system) since it is a mature multimedia platform for the Macintosh. Experiences with the Voyager, the TS-24, MathCad, NIH Image, and other programs as used by the author and George Mumford (Tufts University) in courses and workshops will be described.
Literal algebra for satellite dynamics. [perturbation analysis
NASA Technical Reports Server (NTRS)
Gaposchkin, E. M.
1975-01-01
A description of the rather general class of operations available is given and the operations are related to problems in satellite dynamics. The implementation of an algebra processor is discussed. The four main categories of symbol processors are related to list processing, string manipulation, symbol manipulation, and formula manipulation. Fundamental required operations for an algebra processor are considered. It is pointed out that algebra programs have been used for a number of problems in celestial mechanics with great success. The advantage of computer algebra is its accuracy and speed.
NNSA Nonproliferation Graduate Fellowship Program Annual Report June 2008 - May 2009
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berkman, Clarissa O.; Fankhauser, Jana G.
2010-03-01
In 2009, the Nonproliferation Graduate Fellowship Program (NGFP) completed its 16th successful year in support of the NNSA's mission by developing future leaders in nonproliferation and promoting awareness of career opportunities. We provide this annual report to review program activities from June 2008 through May 2009 - the fellowship term for the Class of 2008. Contents include: Welcome Letter; Introduction; The NGFP Team; Program Management Highlights; Class of 2008; Incoming Fellows; Orientation; Travel; Career Development; Management of the Fellows; Performance Highlights; Closing Ceremony; Encore Performance; Where They Are Now; Alumnus Career Highlights: Christine Buzzard; Class of 2009; Applicant Database Upgrades; Fall Recruitment Activities; Interviews; Hiring and Clearances; Introducing the Class of 2009; Class of 2010; Recruitment Strategy; On the Horizon; Appendix A: Class of 2009 Fellows.
ERIC Educational Resources Information Center
Center, Yola; Freeman, Louella
This research review examined the use of a whole class early literacy program in classes which included disadvantaged and at-risk children in Australia. The program, Schoolwide Early Language and Literacy (SWELL), is based on an interactive compensatory theory of literacy acquisition adapted from Success for All, a U.S. early literacy program. The…
An urban area minority outreach program for K-6 children in space science
NASA Astrophysics Data System (ADS)
Morris, P.; Garza, O.; Lindstrom, M.; Allen, J.; Wooten, J.; Sumners, C.; Obot, V.
The Houston area has minority populations with significant school dropout rates. This is similar to other major cities in the United States and elsewhere in the world where there are significant minority populations from rural areas. The student dropout rates are associated in many instances with the absence of educational support opportunities either from the school and/or from the family. This is exacerbated if the student has poor English language skills. To address this issue, a NASA minority university initiative enabled us to develop a broad-based outreach program that includes younger children and their parents at a primarily Hispanic inner city charter school. The program at the charter school was initiated by teaching computer skills to the older children, who in turn taught parents. The older children were subsequently asked to help teach a computer literacy class for mothers with 4-5 year old children. The computers initially intimidated the mothers as most had limited educational backgrounds and English language skills. To practice their newly acquired computer skills and learn about space science, the mothers and their children were asked to pick a space project and investigate it using their computer skills. The mothers and their children decided to learn about black holes. The project included designing space suits for their children so that they could travel through space and observe black holes from a closer proximity. The children and their mothers learned about computers and how to use them for educational purposes. In addition, they learned about black holes and the importance of space suits in protecting astronauts as they investigated space. The parents are proud of their children and their achievements. By including the parents in the program, they have a greater understanding of the importance of their children staying in school and the opportunities for careers in space science and technology.
For more information on our overall program, the charter school and their other space science related activities, visit their web site, http://www.tccc-ryss.org/solarsys/solarmingrant.htm
Computation Directorate Annual Report 2003
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crawford, D L; McGraw, J R; Ashby, S F
Big computers are icons: symbols of the culture, and of the larger computing infrastructure that exists at Lawrence Livermore. Through the collective effort of Laboratory personnel, they enable scientific discovery and engineering development on an unprecedented scale. For more than three decades, the Computation Directorate has supplied the big computers that enable the science necessary for Laboratory missions and programs. Livermore supercomputing is uniquely mission driven. The high-fidelity weapon simulation capabilities essential to the Stockpile Stewardship Program compel major advances in weapons codes and science, compute power, and computational infrastructure. Computation's activities align with this vital mission of the Department of Energy. Increasingly, non-weapons Laboratory programs also rely on computer simulation. World-class achievements have been accomplished by LLNL specialists working in multi-disciplinary research and development teams. In these teams, Computation personnel employ a wide array of skills, from desktop support expertise, to complex applications development, to advanced research. Computation's skilled professionals make the Directorate the success that it has become. These individuals know the importance of the work they do and the many ways it contributes to Laboratory missions. They make appropriate and timely decisions that move the entire organization forward. They make Computation a leader in helping LLNL achieve its programmatic milestones. I dedicate this inaugural Annual Report to the people of Computation in recognition of their continuing contributions. I am proud that we perform our work securely and safely. Despite increased cyber attacks on our computing infrastructure from the Internet, advanced cyber security practices ensure that our computing environment remains secure. Through Integrated Safety Management (ISM) and diligent oversight, we address safety issues promptly and aggressively.
The safety of our employees, whether at work or at home, is a paramount concern. Even as the Directorate meets today's supercomputing requirements, we are preparing for the future. We are investigating open-source cluster technology, the basis of our highly successful Multiprogrammatic Capability Resource (MCR). Several breakthrough discoveries have resulted from MCR calculations coupled with theory and experiment, prompting Laboratory scientists to demand ever-greater capacity and capability. This demand is being met by a new 23-TF system, Thunder, with architecture modeled on MCR. In preparation for the 'after-next' computer, we are researching technology even farther out on the horizon--cell-based computers. Assuming that the funding and the technology hold, we will acquire the cell-based machine BlueGene/L within the next 12 months.
ERIC Educational Resources Information Center
Poling, Kirsten; Smit, Julie; Higgs, Dennis
2013-01-01
Laptop computers were provided for use in three biology classes with differing formats (a second year lecture course of 100 students, a third/fourth year lecture course of 50 students, and a second year course with greater than 250 students, in groups of 25 during the laboratory portion of the class) to assess their impact on student learning and…
Data-Acquisition Software for PSP/TSP Wind-Tunnel Cameras
NASA Technical Reports Server (NTRS)
Amer, Tahani R.; Goad, William K.
2005-01-01
Wing-Viewer is a computer program for acquisition and reduction of image data acquired by any of five different scientific-grade commercial electronic cameras used at Langley Research Center to observe wind-tunnel models coated with pressure- or temperature-sensitive paints (PSP/TSP). Wing-Viewer provides full automation of camera operation and acquisition of image data, and has limited data-preprocessing capability for quick viewing of the results of PSP/TSP test images. Wing-Viewer satisfies a requirement for a standard interface between all the cameras and a single personal computer: written using Microsoft Visual C++ and the Microsoft Foundation Class Library as a framework, Wing-Viewer has the ability to communicate with the C/C++ software libraries that run on the controller circuit cards of all five cameras.
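The standard-interface requirement described above is essentially an abstraction layer: one uniform set of operations presented over several vendor-specific camera libraries. Wing-Viewer itself is written in C++/MFC; the Python sketch below, with entirely hypothetical class and method names, only illustrates the pattern.

```python
from abc import ABC, abstractmethod

class Camera(ABC):
    """Uniform interface an acquisition program could present over
    several vendor camera libraries (names are hypothetical)."""

    @abstractmethod
    def open(self) -> None: ...

    @abstractmethod
    def acquire_frame(self) -> bytes: ...

    @abstractmethod
    def close(self) -> None: ...

class FakeCamera(Camera):
    """Stand-in driver used here only to show the pattern."""
    def open(self):
        self.ready = True
    def acquire_frame(self):
        return b"\x00" * 16  # dummy image data
    def close(self):
        self.ready = False

def acquire_series(cam: Camera, n: int):
    """Acquire n frames through the uniform interface, so the caller
    never depends on which vendor library sits underneath."""
    cam.open()
    frames = [cam.acquire_frame() for _ in range(n)]
    cam.close()
    return frames
```

A real driver per camera would wrap that vendor's controller-card library behind the same three methods.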
Optimization of knowledge-based systems and expert system building tools
NASA Technical Reports Server (NTRS)
Yasuda, Phyllis; Mckellar, Donald
1993-01-01
The objectives of the NASA-AMES Cooperative Agreement were to investigate, develop, and evaluate, via test cases, the system parameters and processing algorithms that constrain the overall performance of the Information Sciences Division's Artificial Intelligence Research Facility. Written reports covering various aspects of the grant were submitted to the co-investigators for the grant. Research studies concentrated on the field of artificial intelligence knowledge-based systems technology. Activities included the following areas: (1) AI training classes; (2) merging optical and digital processing; (3) science experiment remote coaching; (4) SSF data management system tests; (5) computer integrated documentation project; (6) conservation of design knowledge project; (7) project management calendar and reporting system; (8) automation and robotics technology assessment; (9) advanced computer architectures and operating systems; and (10) honors program.
Fayn, J; Rubel, P
1988-01-01
The authors present a new computer program for serial ECG analysis that allows a direct comparison of any pair of three-dimensional ECGs and quantitatively assesses the degree of evolution of the spatial loops as well as of their initial, central, or terminal sectors. Loops and sectors are superposed as closely as possible, with the aim of overcoming tracing variability of nonpathological origin. As a result, optimal measures of evolution are computed, and a tabular summary of measurements is dynamically configured with respect to the patient's history and then printed. A multivariate classifier assigns each pair of tracings to one of four classes of evolution. Color graphic displays corresponding to several modes of representation may also be plotted.
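The abstract does not specify the superposition algorithm; one common way to superpose two 3-D loops "as closely as possible" is rigid least-squares (Kabsch) alignment, sketched below under that assumption. The residual RMS after alignment then serves as one possible measure of evolution; the function name is illustrative, not the authors'.

```python
import numpy as np

def superpose_loops(ref, mov):
    """Rigidly superpose loop `mov` onto `ref` (both N x 3 arrays of
    spatial ECG samples) and return the residual RMS distance as a
    simple measure of evolution between the two tracings."""
    ref_c = ref - ref.mean(axis=0)      # remove baseline offset
    mov_c = mov - mov.mean(axis=0)
    # Kabsch algorithm: optimal rotation from the SVD of the covariance
    u, _, vt = np.linalg.svd(mov_c.T @ ref_c)
    d = np.sign(np.linalg.det(u @ vt))  # guard against reflections
    aligned = mov_c @ (u @ np.diag([1.0, 1.0, d]) @ vt)
    return np.sqrt(np.mean(np.sum((aligned - ref_c) ** 2, axis=1)))
```

Two tracings that differ only by position and orientation yield a residual near zero; genuine loop evolution leaves a nonzero residual.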
Student Engagement in a Computer Rich Science Classroom
NASA Astrophysics Data System (ADS)
Hunter, Jeffrey C.
The purpose of this study was to examine the student lived experience when using computers in a rural science classroom. The overarching question the project sought to examine was: How do rural students relate to computers as a learning tool in comparison to a traditional science classroom? Participant data were collected using a pre-study survey, Experience Sampling during class, and post-study interviews. Students want to use computers in their classrooms. Students shared that they overwhelmingly (75%) preferred a computer-rich classroom to a traditional classroom (25%). Students reported a higher level of engagement in classes that use technology/computers (83%) versus those that do not use computers (17%). A computer-rich classroom increased student control and motivation, as reflected by a participant who shared: "by using computers I was more motivated to get the work done" (Maggie, April 25, 2014, survey). The researcher explored a rural school environment. Rural populations represent a large number of students and appear to be underrepresented in current research. The participants, tenth grade Biology students, were sampled in a traditional teacher-led class without computers for one week, followed by a week using computers daily. Data supported that there is a new gap that separates students, a device divide. This divide separates those who have access to devices that are robust enough to do high-level class work from those who do not. Although cellular phones have reduced the number of students who cannot access the Internet, they may have created a false feeling that access to a computer is no longer necessary at home. As this study shows, although most students have Internet access, fewer have access to a device that enables them to complete rigorous class work at home. Participants received little or no training at school in proper, safe use of a computer and the Internet.
It is clear that the majority of students are self-taught or receive guidance from peers, resulting in lower self-confidence or the development of misconceptions about their skills and abilities.
Factors influencing exemplary science teachers' levels of computer use
NASA Astrophysics Data System (ADS)
Hakverdi, Meral
This study examines exemplary science teachers' use of technology in science instruction, factors influencing their level of computer use, their level of knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, and their students' use of computer applications/tools in or for their science class. After a relevant review of the literature, certain variables were selected for analysis. These variables included personal self-efficacy in teaching with computers, outcome expectancy, pupil-control ideology, level of computer use, age, gender, teaching experience, personal computer use, professional computer use, and science teachers' level of knowledge/skills in using specific computer applications for science instruction. The sample for this study includes middle and high school science teachers who received the Presidential Award for Excellence in Science Teaching (sponsored by the White House and the National Science Foundation) between the years 1997 and 2003 from all 50 states and U.S. territories. Award-winning science teachers were contacted about the survey via e-mail or letter with an enclosed return envelope. Of the 334 award-winning science teachers, usable responses were received from 92 science teachers, a response rate of 27.5%. Analysis of the survey responses indicated that exemplary science teachers have a variety of knowledge/skills in using computer-related applications/tools. The most commonly used computer applications/tools are information retrieval via the Internet, presentation tools, online communication, digital cameras, and data collection probes. Results of the study revealed that students' use of technology in their science classroom is highly correlated with the frequency of their science teachers' use of computer applications/tools.
The results of the multiple regression analysis revealed that personal self-efficacy was related to the exemplary science teachers' level of computer use, suggesting that computer use depends on perceived ability with computers. The teachers' use of computer-related applications/tools during class, together with their personal self-efficacy, age, and gender, was highly related to their level of knowledge/skills in using specific computer applications for science instruction. That level of knowledge/skills, along with gender, was in turn related to their use of computer-related applications/tools during class and to their students' use of such applications/tools in or for their science class. In conclusion, exemplary science teachers need assistance in learning and using computer-related applications/tools in their science classes.
Computer Graphics Instruction in VizClass
ERIC Educational Resources Information Center
Grimes, Douglas; Warschauer, Mark; Hutchinson, Tara; Kuester, Falko
2005-01-01
"VizClass" is a university classroom environment designed to offer students in computer graphics and engineering courses up-to-date visualization technologies. Three digital whiteboards and a three-dimensional stereoscopic display provide complementary display surfaces. Input devices include touchscreens on the digital whiteboards, remote…
Tri-Laboratory Linux Capacity Cluster 2007 SOW
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seager, M
2007-03-22
The Advanced Simulation and Computing (ASC) Program (formerly known as the Accelerated Strategic Computing Initiative, ASCI) has led the world in capability computing for the last ten years. Capability computing is defined as a world-class platform (in the Top10 of the Top500.org list) with scientific simulations running at scale on the platform. Example systems are ASCI Red, Blue-Pacific, Blue-Mountain, White, Q, RedStorm, and Purple. ASC applications have scaled to multiple thousands of CPUs and accomplished a long list of mission milestones on these ASC capability platforms. However, the computing demands of the ASC and Stockpile Stewardship programs also include a vast number of smaller scale runs for day-to-day simulations. Indeed, every 'hero' capability run requires many hundreds to thousands of much smaller runs in preparation and post processing activities. In addition, there are many aspects of the Stockpile Stewardship Program (SSP) that can be directly accomplished with these so-called 'capacity' calculations. The need for capacity is now so great within the program that it is increasingly difficult to allocate the computer resources required by the larger capability runs. To rectify the current 'capacity' computing resource shortfall, the ASC program has allocated a large portion of the overall ASC platforms budget to 'capacity' systems. In addition, within the next five to ten years the Life Extension Programs (LEPs) for major nuclear weapons systems must be accomplished. These LEPs and other SSP programmatic elements will further drive the need for capacity calculations and hence 'capacity' systems, as well as future ASC capability calculations on 'capability' systems. To respond to this new workload analysis, the ASC program will be making a large sustained strategic investment in these capacity systems over the next ten years, starting with the United States Government Fiscal Year 2007 (GFY07).
However, given the growing need for 'capability' systems as well, the budget demands are extreme, and new, more cost-effective ways of fielding these systems must be developed. This Tri-Laboratory Linux Capacity Cluster (TLCC) procurement represents the ASC Program's first investment vehicle in these capacity systems. It also represents a new strategy for quickly building, fielding, and integrating many Linux clusters of various sizes into classified and unclassified production service through a concept of Scalable Units (SU). The programmatic objective is to dramatically reduce the overall Total Cost of Ownership (TCO) of these 'capacity' systems relative to the best practices in Linux cluster deployments today. This objective only makes sense in the context of these systems quickly becoming very robust and useful production clusters under the crushing load that will be inflicted on them by the ASC and SSP scientific simulation capacity workload.
ERIC Educational Resources Information Center
Owston, Ronald D.; And Others
A study assessed the impact of word processing on the writing of junior high school students, experienced in working with computers, for a number of tasks, including writing. Subjects, 111 eighth grade students in four communications arts classes at a Canadian middle-class suburban school, who had been using computers for writing for a year and a…
A CS1 pedagogical approach to parallel thinking
NASA Astrophysics Data System (ADS)
Rague, Brian William
Almost all collegiate programs in Computer Science offer an introductory course in programming primarily devoted to communicating the foundational principles of software design and development. The ACM designates this introduction to computer programming course for first-year students as CS1, during which methodologies for solving problems within a discrete computational context are presented. Logical thinking is highlighted, guided primarily by a sequential approach to algorithm development and made manifest by typically using the latest, commercially successful programming language. In response to the most recent developments in accessible multicore computers, instructors of these introductory classes may wish to include training on how to design workable parallel code. Novel issues arise when programming concurrent applications which can make teaching these concepts to beginning programmers a seemingly formidable task. Student comprehension of design strategies related to parallel systems should be monitored to ensure an effective classroom experience. This research investigated the feasibility of integrating parallel computing concepts into the first-year CS classroom. To quantitatively assess student comprehension of parallel computing, an experimental educational study using a two-factor mixed group design was conducted to evaluate two instructional interventions in addition to a control group: (1) topic lecture only, and (2) topic lecture with laboratory work using a software visualization Parallel Analysis Tool (PAT) specifically designed for this project. A new evaluation instrument developed for this study, the Perceptions of Parallelism Survey (PoPS), was used to measure student learning regarding parallel systems. 
The results from this educational study show a statistically significant main effect among the repeated measures, implying that student comprehension levels of parallel concepts as measured by the PoPS improve immediately after the delivery of any initial three-week CS1 level module when compared with student comprehension levels just prior to starting the course. Survey results measured during the ninth week of the course reveal that performance levels remained high compared to pre-course performance scores. A second result produced by this study reveals no statistically significant interaction effect between the intervention method and student performance as measured by the evaluation instrument over three separate testing periods. However, visual inspection of survey score trends and the low p-value generated by the interaction analysis (0.062) indicate that further studies may verify improved concept retention levels for the lecture w/PAT group.
CHARMM: The Biomolecular Simulation Program
Brooks, B.R.; Brooks, C.L.; MacKerell, A.D.; Nilsson, L.; Petrella, R.J.; Roux, B.; Won, Y.; Archontis, G.; Bartels, C.; Boresch, S.; Caflisch, A.; Caves, L.; Cui, Q.; Dinner, A.R.; Feig, M.; Fischer, S.; Gao, J.; Hodoscek, M.; Im, W.; Kuczera, K.; Lazaridis, T.; Ma, J.; Ovchinnikov, V.; Paci, E.; Pastor, R.W.; Post, C.B.; Pu, J.Z.; Schaefer, M.; Tidor, B.; Venable, R. M.; Woodcock, H. L.; Wu, X.; Yang, W.; York, D.M.; Karplus, M.
2009-01-01
CHARMM (Chemistry at HARvard Molecular Mechanics) is a highly versatile and widely used molecular simulation program. It has been developed over the last three decades with a primary focus on molecules of biological interest, including proteins, peptides, lipids, nucleic acids, carbohydrates and small molecule ligands, as they occur in solution, crystals, and membrane environments. For the study of such systems, the program provides a large suite of computational tools that include numerous conformational and path sampling methods, free energy estimators, molecular minimization, dynamics, and analysis techniques, and model-building capabilities. In addition, the CHARMM program is applicable to problems involving a much broader class of many-particle systems. Calculations with CHARMM can be performed using a number of different energy functions and models, from mixed quantum mechanical-molecular mechanical force fields, to all-atom classical potential energy functions with explicit solvent and various boundary conditions, to implicit solvent and membrane models. The program has been ported to numerous platforms in both serial and parallel architectures. This paper provides an overview of the program as it exists today with an emphasis on developments since the publication of the original CHARMM paper in 1983. PMID:19444816
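As one concrete instance of the "all-atom classical potential energy functions" mentioned above, the nonbonded part of a CHARMM-style force field combines a Lennard-Jones term (in the Rmin convention, where the well of depth epsilon sits at separation Rmin) with a Coulomb term. The sketch below shows only this single-pair piece; a full force field adds bond, angle, dihedral, and other terms, and the parameter values are left to the caller.

```python
import math

# Conventional electrostatic constant in kcal*A/(mol*e^2),
# the unit system commonly used by CHARMM-style force fields.
COULOMB_K = 332.0636

def nonbonded_pair_energy(r, q1, q2, eps, rmin):
    """Nonbonded energy (kcal/mol) of one atom pair at separation r
    (Angstroms): Lennard-Jones in the Rmin convention plus Coulomb.
    eps is the well depth; rmin is the separation at the LJ minimum."""
    lj = eps * ((rmin / r) ** 12 - 2.0 * (rmin / r) ** 6)
    coulomb = COULOMB_K * q1 * q2 / r
    return lj + coulomb
```

At r = rmin with neutral charges the function returns exactly -eps, the depth of the Lennard-Jones well, which is a quick sanity check on the Rmin-convention form.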
The feasibility of universal DLP-to-risk conversion coefficients for body CT protocols
NASA Astrophysics Data System (ADS)
Li, Xiang; Samei, Ehsan; Segars, W. Paul; Paulson, Erik K.; Frush, Donald P.
2011-03-01
The effective dose associated with computed tomography (CT) examinations is often estimated from dose-length product (DLP) using scanner-independent conversion coefficients. Such conversion coefficients are available for a small number of examinations, each covering an entire region of the body (e.g., head, neck, chest, abdomen and/or pelvis). Similar conversion coefficients, however, do not exist for examinations that cover a single organ or a sub-region of the body, as in the case of a multi-phase liver examination. In this study, we extended the DLP-to-effective dose conversion coefficient (k factor) to a wide range of body CT protocols and derived the corresponding DLP-to-cancer risk conversion coefficient (q factor). An extended cardiac-torso (XCAT) computational model was used, which represented a reference adult male patient. A range of body CT protocols used in clinical practice was categorized, based on the anatomical regions examined, into 10 protocol classes. A validated Monte Carlo program was used to estimate the organ dose associated with each protocol class. Assuming the reference model to be 20 years old, effective dose and risk index (an index of the total risk for cancer incidence) were then calculated and normalized by DLP to obtain the k and q factors. The k and q factors varied across protocol classes; the coefficients of variation were 28% and 9%, respectively. The small variation exhibited by the q factor suggested the feasibility of universal q factors for a wide range of body CT protocols.
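Applying a conversion coefficient of this kind is a single multiplication: effective dose = k x DLP, and risk index = q x DLP. A minimal sketch follows; the factor values used are illustrative assumptions, not the coefficients derived in the study.

```python
def effective_dose_mSv(dlp_mGy_cm, k):
    """Effective dose (mSv) estimated from dose-length product
    (mGy*cm) via a protocol-class k factor (mSv per mGy*cm)."""
    return k * dlp_mGy_cm

def risk_index(dlp_mGy_cm, q):
    """Risk index (total cancer-incidence risk) estimated from DLP
    via a protocol-class q factor (risk per mGy*cm)."""
    return q * dlp_mGy_cm

# Hypothetical factors for one protocol class (assumed, not from
# the study):
k = 0.015   # mSv per mGy*cm
dose = effective_dose_mSv(500.0, k)   # ~7.5 mSv for a 500 mGy*cm exam
```

The paper's point is that q varies little across protocol classes (9% coefficient of variation), so a single q could plausibly serve many body protocols, whereas k varies more (28%).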
Quantitative estimation of pesticide-likeness for agrochemical discovery.
Avram, Sorin; Funar-Timofei, Simona; Borota, Ana; Chennamaneni, Sridhar Rao; Manchala, Anil Kumar; Muresan, Sorel
2014-12-01
The design of chemical libraries, an early step in agrochemical discovery programs, is frequently addressed by means of qualitative physicochemical and/or topological rule-based methods. The aim of this study is to develop quantitative estimates of herbicide- (QEH), insecticide- (QEI), fungicide- (QEF), and, finally, pesticide-likeness (QEP). In the assessment of these definitions, we relied on the concept of desirability functions. We found a simple function, shared by the three classes of pesticides, parameterized individually for six easy-to-compute, independent, and interpretable molecular properties: molecular weight, logP, number of hydrogen bond acceptors, number of hydrogen bond donors, number of rotatable bonds, and number of aromatic rings. Subsequently, we describe the scoring of each pesticide class by the corresponding quantitative estimate. In a comparative study, we assessed the performance of the scoring functions using extensive datasets of patented pesticides. The quantitative assessment established here can rank compounds whether or not they fail well-established pesticide-likeness rules, and offers an efficient way to prioritize (class-specific) pesticides. These findings are valuable for the efficient estimation of pesticide-likeness of vast chemical libraries in the field of agrochemical discovery. Graphical Abstract: Quantitative models for pesticide-likeness were derived using the concept of desirability functions parameterized for six easy-to-compute, independent, and interpretable molecular properties: molecular weight, logP, number of hydrogen bond acceptors, number of hydrogen bond donors, number of rotatable bonds, and number of aromatic rings.
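Desirability-based likeness scores of this kind (as in QED for drug-likeness) typically map each property through a desirability function in (0, 1] and combine the six values by a geometric mean. The sketch below follows that pattern with a simple Gaussian desirability; the target and tolerance parameters are illustrative placeholders, not the fitted values from the study.

```python
import math

# Hypothetical (target, tolerance) pairs for the six properties named
# in the abstract -- illustrative only, not the study's fitted values.
PARAMS = {
    "mw":   (330.0, 120.0),  # molecular weight
    "logp": (3.0, 1.5),
    "hba":  (3.0, 2.0),      # hydrogen bond acceptors
    "hbd":  (1.0, 1.0),      # hydrogen bond donors
    "rotb": (4.0, 3.0),      # rotatable bonds
    "arom": (2.0, 1.0),      # aromatic rings
}

def desirability(value, target, tol):
    """Gaussian desirability in (0, 1]: 1 at the target, decaying
    with distance scaled by the tolerance."""
    return math.exp(-((value - target) / tol) ** 2)

def qep(props):
    """Quantitative estimate of pesticide-likeness: geometric mean of
    the six per-property desirabilities."""
    ds = [desirability(props[k], *PARAMS[k]) for k in PARAMS]
    return math.exp(sum(math.log(d) for d in ds) / len(ds))
```

A compound sitting exactly on every target scores 1.0, and any deviation lowers the score smoothly, which is what lets such functions rank compounds that a hard rule set would simply pass or fail.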
TMS for Instantiating a Knowledge Base With Incomplete Data
NASA Technical Reports Server (NTRS)
James, Mark
2007-01-01
A computer program that belongs to the class known among software experts as output truth-maintenance systems (output TMSs) has been devised as one of a number of software tools for reducing the size of the knowledge base that must be searched during execution of artificial-intelligence software of the rule-based inference-engine type in a case in which data are missing. This program determines whether the consequences of activation of two or more rules can be combined without causing a logical inconsistency. For example, in a case involving hypothetical scenarios that could lead to turning a given device on or off, the program determines whether a scenario involving a given combination of rules could lead to turning the device both on and off at the same time, in which case that combination of rules would not be included in the scenario.
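The consistency check described above can be sketched in miniature: represent each rule's consequences as (variable, value) assignments and reject any combination that assigns two different values to the same variable, such as the device both on and off. This is an illustrative simplification of the idea, not the NASA program's actual representation.

```python
def consistent(rules):
    """Return True unless two activated rules assign conflicting
    values to the same variable (a logical inconsistency)."""
    assigned = {}
    for rule in rules:
        for var, value in rule["then"]:
            if var in assigned and assigned[var] != value:
                return False    # e.g. device both on and off
            assigned[var] = value
    return True

# Hypothetical rules for the on/off example in the abstract:
low_power = {"if": "battery_low",   "then": {("device", "off")}}
heat_on   = {"if": "temp_below_18", "then": {("device", "on")}}
fan_on    = {"if": "temp_above_25", "then": {("fan", "on")}}

print(consistent([low_power, fan_on]))   # True: no conflict
print(consistent([low_power, heat_on]))  # False: device on and off
```

Combinations failing this check would be excluded from the scenario, shrinking the space the inference engine must search when data are missing.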
Xu, Yiling; Oh, Heesoo; Lagravère, Manuel O
2017-09-01
The purpose of this study was to locate traditionally-used landmarks in two-dimensional (2D) images and newly-suggested ones in three-dimensional (3D) images (cone-beam computed tomographies [CBCTs]) and determine possible relationships between them to categorize patients with Class II-1 malocclusion. CBCTs from 30 patients diagnosed with Class II-1 malocclusion were obtained from the University of Alberta Graduate Orthodontic Program database. The reconstructed images were downloaded and visualized using the software platform AVIZO®. Forty-two landmarks were chosen, and the coordinates were then obtained and analyzed using linear and angular measurements. Ten images were analyzed three times to determine the reliability and measurement error of each landmark using the intraclass correlation coefficient (ICC). Descriptive statistics were done using the SPSS statistical package to determine any relationships. ICC values were excellent for all landmarks in all axes, with the highest measurement error of 2 mm in the y-axis for the Gonion Left landmark. Linear and angular measurements were calculated using the coordinates of each landmark. Descriptive statistics showed that the linear and angular measurements used in the 2D images did not correlate well with the 3D images. The lowest standard deviation obtained was 0.6709 for S-GoR/N-Me, with a mean of 0.8016. The highest standard deviation was 20.20704 for ANS-InfraL, with a mean of 41.006. The traditional landmarks used for 2D malocclusion analysis show good reliability when transferred to 3D images. However, they did not reveal specific skeletal or dental patterns when trying to analyze 3D images for malocclusion. Thus, another technique should be considered when classifying 3D CBCT images for Class II-1 malocclusion.