Sample records for sophisticated computer programs

  1. Logic via Computer Programming.

    ERIC Educational Resources Information Center

    Wieschenberg, Agnes A.

    This paper posed the question "How do we teach logical thinking and sophisticated mathematics to unsophisticated college students?" One answer among many is through the writing of computer programs. The writing of computer algorithms is mathematical problem solving and logic in disguise, and it may attract students who would otherwise stop…

  2. A HyperCard Program for Business German.

    ERIC Educational Resources Information Center

    Paulsell, Patricia R.

    Although the use of computer-assisted language instruction software has been mainly limited to grammatical/syntactical drills, the increasing number of language professionals with programming skills is leading to the development of more sophisticated language education programs. This report describes the generation of such a program using the…

  3. Using Visual Basic to Teach Programming for Geographers.

    ERIC Educational Resources Information Center

    Slocum, Terry A.; Yoder, Stephen C.

    1996-01-01

    Outlines reasons why computer programming should be taught to geographers. These include experience using macro (scripting) languages and sophisticated visualization software, and developing a deeper understanding of general hardware and software capabilities. Discusses the distinct advantages and few disadvantages of the programming language…

  4. Railroads and the Environment: Estimation of Fuel Consumption in Rail Transportation: Volume 3. Comparison of Computer Simulations with Field Measurements

    DOT National Transportation Integrated Search

    1978-09-01

    This report documents comparisons between extensive rail freight service measurements (previously presented in Volume II) and simulations of the same operations using a sophisticated train performance calculator computer program. The comparisons cove...

  5. Choosing a Computer Language for Institutional Research. The AIR Professional File No. 6.

    ERIC Educational Resources Information Center

    Strenglein, Denise

    1980-01-01

    It is suggested that much thought should be given to choosing an appropriate computer language for an institutional research office, considering the sophistication of the staff, types of planned application, size and type of computer, and availability of central programming support in the institution. For offices that prepare straight reports and…

  6. Design and performance analysis of solid-propellant rocket motors using a simplified computer program

    NASA Technical Reports Server (NTRS)

    Sforzini, R. H.

    1972-01-01

    An analysis and a computer program are presented which represent a compromise between the more sophisticated programs using precise burning geometric relations and the textbook type of solutions. The program requires approximately 900 computer cards including a set of 20 input data cards required for a typical problem. The computer operating time for a single configuration is approximately 1 minute and 30 seconds on the IBM 360 computer. About 1 minute and 15 seconds of the time is compilation time so that additional configurations input at the same time require approximately 15 seconds each. The program uses approximately 11,000 words on the IBM 360. The program is written in FORTRAN 4 and is readily adaptable for use on a number of different computers: IBM 7044, IBM 7094, and Univac 1108.

  7. Computer programs: Information retrieval and data analysis, a compilation

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The items presented in this compilation are divided into two sections. Section one treats of computer usage devoted to the retrieval of information that affords the user rapid entry into voluminous collections of data on a selective basis. Section two is a more generalized collection of computer options for the user who needs to take such data and reduce it to an analytical study within a specific discipline. These programs, routines, and subroutines should prove useful to users who do not have access to more sophisticated and expensive computer software.

  8. High-Performance Computing Act of 1991. Report of the Senate Committee on Commerce, Science, and Transportation on S. 272. Senate, 102d Congress, 1st Session.

    ERIC Educational Resources Information Center

    Congress of the U.S., Washington, DC. Senate Committee on Commerce, Science, and Transportation.

    This report discusses Senate Bill no. 272, which provides for a coordinated federal research and development program to ensure continued U.S. leadership in high-performance computing. High performance computing is defined as representing the leading edge of technological advancement in computing, i.e., the most sophisticated computer chips, the…

  9. Performance of hybrid programming models for multiscale cardiac simulations: preparing for petascale computation.

    PubMed

    Pope, Bernard J; Fitch, Blake G; Pitman, Michael C; Rice, John J; Reumann, Matthias

    2011-10-01

    Future multiscale and multiphysics models that support research into human disease, translational medical science, and treatment can utilize the power of high-performance computing (HPC) systems. We anticipate that computationally efficient multiscale models will require the use of sophisticated hybrid programming models, mixing distributed message-passing processes [e.g., the message-passing interface (MPI)] with multithreading (e.g., OpenMP, Pthreads). The objective of this study is to compare the performance of such hybrid programming models when applied to the simulation of a realistic physiological multiscale model of the heart. Our results show that the hybrid models perform favorably when compared to an implementation using only the MPI and, furthermore, that OpenMP in combination with the MPI provides a satisfactory compromise between performance and code complexity. Having the ability to use threads within MPI processes enables the sophisticated use of all processor cores for both computation and communication phases. Considering that HPC systems in 2012 will have two orders of magnitude more cores than what was used in this study, we believe that faster than real-time multiscale cardiac simulations can be achieved on these systems.
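
    To make the hybrid programming model concrete, the sketch below shows the structure the abstract describes: distributed MPI ranks that each use a pool of threads for their local share of the work, followed by a collective communication step. It is only an illustration under assumed names and sizes; the study itself combined MPI with OpenMP/Pthreads in compiled code, and mpi4py with Python threads is used here purely as a stand-in.

        # Minimal sketch of a hybrid distributed + shared-memory step (assumes mpi4py).
        # The cell counts, thread count, and "cell model" are hypothetical placeholders.
        from concurrent.futures import ThreadPoolExecutor

        import numpy as np
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        N_CELLS = 1_000_000          # hypothetical global number of cardiac cells
        THREADS_PER_RANK = 4         # shared-memory workers inside each MPI process

        # Each rank owns a contiguous slice of the global state vector.
        local = np.ones(N_CELLS // size)

        def integrate(chunk):
            """Advance one chunk of local cells by a toy explicit Euler step."""
            return chunk + 0.01 * (-chunk)   # placeholder for the real cell-model ODEs

        # Computation phase: threads work on disjoint chunks of this rank's cells.
        with ThreadPoolExecutor(max_workers=THREADS_PER_RANK) as pool:
            chunks = np.array_split(local, THREADS_PER_RANK)
            local = np.concatenate(list(pool.map(integrate, chunks)))

        # Communication phase: combine a global quantity across all ranks.
        total_activity = comm.allreduce(float(local.sum()), op=MPI.SUM)
        if rank == 0:
            print(f"global activity after one step: {total_activity:.3f}")

    Such a script would be launched with something like mpirun -np 4 python hybrid_step.py (the file name is illustrative).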

  10. STAF: A Powerful and Sophisticated CAI System.

    ERIC Educational Resources Information Center

    Loach, Ken

    1982-01-01

    Describes the STAF (Science Teacher's Authoring Facility) computer-assisted instruction system developed at Leeds University (England), focusing on STAF language and major program features. Although programs for the system emphasize physical chemistry and organic spectroscopy, the system and language are general purpose and can be used in any…

  11. Available for the Apple II: FIRM: Florida InteRactive Modeler.

    ERIC Educational Resources Information Center

    Levy, C. Michael; And Others

    1983-01-01

    The Apple II microcomputer program described allows instructors with minimal programing experience to construct computer models of psychological phenomena for students to investigate. Use of these models eliminates need to maintain/house/breed animals or purchase sophisticated laboratory equipment. Several content models are also described,…

  12. Teaching Conversations with the XDS Sigma 7. Systems Description.

    ERIC Educational Resources Information Center

    Bork, Alfred M.; Mosmann, Charles

    Some computers permit conventional programing languages to be extended by the use of macro-instructions, a sophisticated programing tool which is especially useful in writing instructional dialogs. Macro-instructions (or "macro's") are complex commands defined in terms of the machine language or other macro-instructions. Like terms in…

  13. A computer assisted tutorial for applications of computer spreadsheets in nursing financial management.

    PubMed

    Edwardson, S R; Pejsa, J

    1993-01-01

    A computer-based tutorial for teaching nursing financial management concepts was developed using the macro function of a commercially available spreadsheet program. The goals of the tutorial were to provide students with an experience with spreadsheets as a computer tool and to teach selected financial management concepts. Preliminary results show the tutorial was well received by students. Suggestions are made for overcoming the general lack of computer sophistication among students.

  14. V/STOLAND digital avionics system for UH-1H

    NASA Technical Reports Server (NTRS)

    Liden, S.

    1978-01-01

    A hardware and software system for the Bell UH-1H helicopter was developed that provides sophisticated navigation, guidance, control, display, and data acquisition capabilities for performing terminal area navigation, guidance and control research. Two Sperry 1819B general purpose digital computers were used. One contains the development software that performs all the specified system flight computations. The second computer is available to NASA for experimental programs that run simultaneously with the other computer programs and which may, at the push of a button, replace selected computer computations. Other features that provide research flexibility include keyboard selectable gains and parameters and software generated alphanumeric and CRT displays.

  15. Evaluation of Imagine Learning English, a Computer-Assisted Instruction of Language and Literacy for Kindergarten Students

    ERIC Educational Resources Information Center

    Longberg, Pauline Oliphant

    2012-01-01

    As computer assisted instruction (CAI) becomes increasingly sophisticated, its appeal as a viable method of literacy intervention with young children continues despite limited evidence of effectiveness. The present study sought to assess the impact of one such CAI program, "Imagine Learning English" (ILE), on both the receptive…

  16. Contemporary issues in HIM. The application layer--III.

    PubMed

    Wear, L L; Pinkert, J R

    1993-07-01

    We have seen document preparation systems evolve from basic line editors through powerful, sophisticated desktop publishing programs. This component of the application layer is probably one of the most used, and most readily identifiable. Ask grade school children nowadays, and many will tell you that they have written a paper on a computer. Next month will be a "fun" tour through a number of other application programs we find useful. They will range from a simple notebook reminder to a sophisticated photograph processor. Application layer: Software targeted for the end user, focusing on a specific application area, and typically residing in the computer system as distinct components on top of the OS. Desktop publishing: A document preparation program that begins with the text features of a word processor, then adds the ability for a user to incorporate outputs from a variety of graphic programs, spreadsheets, and other applications. Line editor: A document preparation program that manipulates text in a file on the basis of numbered lines. Word processor: A document preparation program that can, among other things, reformat sections of documents, move and replace blocks of text, use multiple character fonts, automatically create a table of contents and index, create complex tables, and combine text and graphics.

  17. The RCM: A Resource Management and Program Budgeting Approach for State and Local Educational Agencies.

    ERIC Educational Resources Information Center

    Chambers, Jay G.; Parrish, Thomas B.

    The Resource Cost Model (RCM) is a resource management system that combines the technical advantages of sophisticated computer simulation software with the practical benefits of group decision making to provide detailed information about educational program costs. The first section of this document introduces the conceptual framework underlying…

  18. 32 CFR Appendix E to Part 323 - OMB Guidelines for Matching Programs

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... concern expressed by the Congress in the Privacy Act of 1974 that “the increasing use of computers and sophisticated information technology, while essential to the efficient operation of the Government, has greatly... which a computer is used to compare two or more automated systems of records or a system of records with...

  19. 32 CFR Appendix E to Part 323 - OMB Guidelines for Matching Programs

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... concern expressed by the Congress in the Privacy Act of 1974 that “the increasing use of computers and sophisticated information technology, while essential to the efficient operation of the Government, has greatly... which a computer is used to compare two or more automated systems of records or a system of records with...

  20. 32 CFR Appendix E to Part 323 - OMB Guidelines for Matching Programs

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... concern expressed by the Congress in the Privacy Act of 1974 that “the increasing use of computers and sophisticated information technology, while essential to the efficient operation of the Government, has greatly... which a computer is used to compare two or more automated systems of records or a system of records with...

  1. Experience with a sophisticated computer based authoring system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gardner, P.R.

    1984-04-01

    In the November 1982 issue of ADCIS SIG CBT Newsletter the editor arrives at two conclusions regarding Computer Based Authoring Systems (CBAS): (1) CBAS drastically reduces programming time and the need for expert programmers, and (2) CBAS appears to have minimal impact on initial lesson design. Both of these comments have significant impact on any Cost-Benefit analysis for Computer-Based Training. The first tends to improve cost-effectiveness but only toward the limits imposed by the second. Westinghouse Hanford Company (WHC) recently purchased a sophisticated CBAS, the WISE/SMART system from Wicat (Orem, UT), for use in the Nuclear Power Industry. This report details our experience with this system relative to Items (1) and (2) above; lesson design time will be compared with lesson input time. Also provided will be the WHC experience in the use of subject matter experts (though computer neophytes) for the design and inputting of CBT materials.

  2. Unified, Cross-Platform, Open-Source Library Package for High-Performance Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kozacik, Stephen

    Compute power is continually increasing, but this increased performance is largely found in sophisticated computing devices and supercomputer resources that are difficult to use, resulting in under-utilization. We developed a unified set of programming tools that will allow users to take full advantage of the new technology by allowing them to work at a level abstracted away from the platform specifics, encouraging the use of modern computing systems, including government-funded supercomputer facilities.

  3. An introduction to UGRS: the ultimate grading and remanufacturing system

    Treesearch

    John Moody; Charles J. Gatchell; Elizabeth S. Walker; Powsiri Klinkhachorn

    1998-01-01

    The Ultimate Grading and Remanufacturing System (UGRS) is an advanced computer program for grading and remanufacturing lumber. It is an interactive program that will both grade lumber according to NHLA rules and remanufacture it for maximum value. UGRS is written to run under Microsoft Windows 3.0 or later updates and provides a sophisticated graphical user interface....

  4. The Operation of a Specialized Scientific Information and Data Analysis Center With Computer Base and Associated Communications Network.

    ERIC Educational Resources Information Center

    Cottrell, William B.; And Others

    The Nuclear Safety Information Center (NSIC) is a highly sophisticated scientific information center operated at Oak Ridge National Laboratory (ORNL) for the U.S. Atomic Energy Commission. Its information file, which consists of both data and bibliographic information, is computer stored and numerous programs have been developed to facilitate the…

  5. Cutter Resource Effectiveness Evaluation (CREE) Program : A Guide for Users and Analysts

    DOT National Transportation Integrated Search

    1978-03-01

    The Cutter Resource Effectiveness Evaluation (CREE) project has developed a sophisticated, user-oriented computer model which can evaluate the effectiveness of any existing Coast Guard craft, or the effectiveness of any of a number of proposed altern...

  6. Chandelier: Picturing Potential

    ERIC Educational Resources Information Center

    Tebbs, Trevor J.

    2014-01-01

    The author--artist, scientist, educator, and visual-spatial thinker--describes the genesis of, and provides insight into, an innovative, strength-based, visually dynamic computer-aided communication system called Chandelier©. This system is the powerful combination of a sophisticated, user-friendly software program and an organizational…

  7. Writing and Computing across the USM Chemistry Curriculum

    NASA Astrophysics Data System (ADS)

    Gordon, Nancy R.; Newton, Thomas A.; Rhodes, Gale; Ricci, John S.; Stebbins, Richard G.; Tracy, Henry J.

    2001-01-01

    The faculty of the University of Southern Maine believes the ability to communicate effectively is one of the most important skills required of successful chemists. To help students achieve that goal, the faculty has developed a Writing and Computer Program consisting of writing and computer assignments of gradually increasing sophistication for all our laboratory courses. The assignments build in complexity until, at the junior level, students are writing full journal-quality laboratory reports. Computer assignments also increase in difficulty as students attack more complicated subjects. We have found the program easy to initiate and our part-time faculty concurs as well. The Writing and Computing across the Curriculum Program also serves to unite the entire chemistry curriculum. We believe the program is helping to reverse what the USM chemistry faculty and other educators have found to be a steady deterioration in the writing skills of many of today's students.

  8. Growth and yield models for central hardwoods

    Treesearch

    Martin E. Dale; Donald E. Hilt

    1989-01-01

    Over the last 20 years computers have become an efficient tool to estimate growth and yield. Computerized yield estimates vary from simple approximation or interpolation of traditional normal yield tables to highly sophisticated programs that simulate the growth and yield of each individual tree.

  9. NECAP: NASA's Energy-Cost Analysis Program. Part 1: User's manual

    NASA Technical Reports Server (NTRS)

    Henninger, R. H. (Editor)

    1975-01-01

    The NECAP is a sophisticated building design and energy analysis tool which has embodied within it all of the latest ASHRAE state-of-the-art techniques for performing thermal load calculation and energy usage predictions. It is a set of six individual computer programs which include: response factor program, data verification program, thermal load analysis program, variable temperature program, system and equipment simulation program, and owning and operating cost program. Each segment of NECAP is described, and instructions are set forth for preparing the required input data and for interpreting the resulting reports.

  10. InteractiveROSETTA: a graphical user interface for the PyRosetta protein modeling suite.

    PubMed

    Schenkelberg, Christian D; Bystroff, Christopher

    2015-12-15

    Modern biotechnical research is becoming increasingly reliant on computational structural modeling programs to develop novel solutions to scientific questions. Rosetta is one such protein modeling suite that has already demonstrated wide applicability to a number of diverse research projects. Unfortunately, Rosetta is largely a command-line-driven software package which restricts its use among non-computational researchers. Some graphical interfaces for Rosetta exist, but typically are not as sophisticated as commercial software. Here, we present InteractiveROSETTA, a graphical interface for the PyRosetta framework that presents easy-to-use controls for several of the most widely used Rosetta protocols alongside a sophisticated selection system utilizing PyMOL as a visualizer. InteractiveROSETTA is also capable of interacting with remote Rosetta servers, facilitating sophisticated protocols that are not accessible in PyRosetta or which require greater computational resources. InteractiveROSETTA is freely available at https://github.com/schenc3/InteractiveROSETTA/releases and relies upon a separate download of PyRosetta which is available at http://www.pyrosetta.org after obtaining a license (free for academic use).

  11. MoCog1: A computer simulation of recognition-primed human decision making

    NASA Technical Reports Server (NTRS)

    Gevarter, William B.

    1991-01-01

    The results of the first stage of a research effort to develop a 'sophisticated' computer model of human cognitive behavior are described. Most human decision making is an experience-based, relatively straight-forward, largely automatic response to internal goals and drives, utilizing cues and opportunities perceived from the current environment. The development of the architecture and computer program (MoCog1) associated with such 'recognition-primed' decision making is discussed. The resultant computer program was successfully utilized as a vehicle to simulate earlier findings that relate how an individual's implicit theories orient the individual toward particular goals, with resultant cognitions, affects, and behavior in response to their environment.

  12. Suicide Prevention in a Treatment Setting.

    ERIC Educational Resources Information Center

    Litman, Robert E.

    1995-01-01

    The author anticipates that sophisticated interactive computer programs will be effective in improving screening and case finding of the suicidal and that they will become invaluable in improving training for primary care providers and outpatient mental health workers. Additionally, improved communication networks will help maintain continuity of…

  13. A survey of computational aerodynamics in the United States

    NASA Technical Reports Server (NTRS)

    Gessow, A.; Morris, D. J.

    1977-01-01

    Programs in theoretical and computational aerodynamics in the United States are described. Those aspects of programs that relate to aeronautics are detailed. The role of analysis at various levels of sophistication is discussed as well as the inverse solution techniques that are of primary importance in design methodology. The research is divided into the broad categories of application for boundary layer flow, Navier-Stokes turbulence modeling, internal flows, two-dimensional configurations, subsonic and supersonic aircraft, transonic aircraft, and the space shuttle. A survey of representative work in each area is presented.

  14. Inventors in the Making

    ERIC Educational Resources Information Center

    Murray, Jenny; Bartelmay, Kathy

    2005-01-01

    Can second-grade students construct an understanding of sophisticated science processes and explore physics concepts while creating their own inventions? Yes! Students accomplished this and much more through a month-long project in which they used Legos and Robolab, the Lego computer programing software, to create their own inventions. One…

  15. Research and Development in Natural Language Understanding as Part of the Strategic Computing Program.

    DTIC Science & Technology

    1987-04-01

    BBN is developing a series of increasingly sophisticated natural language understanding systems which will serve as an integrated interface...

  16. Transmission loss optimization in acoustic sandwich panels

    NASA Astrophysics Data System (ADS)

    Makris, S. E.; Dym, C. L.; MacGregor Smith, J.

    1986-06-01

    Considering the sound transmission loss (TL) of a sandwich panel as the single objective, different optimization techniques are examined and a sophisticated computer program is used to find the optimum TL. Also, for one of the possible case studies such as core optimization, closed-form expressions are given between TL and the core-design variables for different sets of skins. The significance of these functional relationships lies in the fact that the panel designer can bypass the necessity of using a sophisticated software package in order to assess explicitly the dependence of the TL on core thickness and density.

  17. CBES--An Efficient Implementation of the Coursewriter Language.

    ERIC Educational Resources Information Center

    Franks, Edward W.

    An extensive computer based education system (CBES) built around the IBM Coursewriter III program product at Ohio State University is described. In this system, numerous extensions have been added to the Coursewriter III language to provide capabilities needed to implement sophisticated instructional strategies. CBES design goals include lower CPU…

  18. A Performance Support Tool for Cisco Training Program Managers

    ERIC Educational Resources Information Center

    Benson, Angela D.; Bothra, Jashoda; Sharma, Priya

    2004-01-01

    Performance support systems can play an important role in corporations by managing and allowing distribution of information more easily. These systems run the gamut from simple paper job aids to sophisticated computer- and web-based software applications that support the entire corporate supply chain. According to Gery (1991), a performance…

  19. Robotics for Computer Scientists: What's the Big Idea?

    ERIC Educational Resources Information Center

    Touretzky, David S.

    2013-01-01

    Modern robots, like today's smartphones, are complex devices with intricate software systems. Introductory robot programming courses must evolve to reflect this reality, by teaching students to make use of the sophisticated tools their robots provide rather than reimplementing basic algorithms. This paper focuses on teaching with Tekkotsu, an open…

  20. Decision-Tree Program

    NASA Technical Reports Server (NTRS)

    Buntine, Wray

    1994-01-01

    IND computer program introduces Bayesian and Markov/maximum-likelihood (MML) methods and more-sophisticated methods of searching in growing trees. Produces more-accurate class-probability estimates important in applications like diagnosis. Provides range of features and styles with convenience for casual user, fine-tuning for advanced user or for those interested in research. Consists of four basic kinds of routines: data-manipulation, tree-generation, tree-testing, and tree-display. Written in C language.
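
    As a small illustration of the class-probability estimates mentioned above, the hypothetical Python snippet below contrasts raw leaf frequencies with a Laplace-smoothed (Bayesian-style) estimate; IND itself is written in C, and nothing here is taken from its routines.

        # Hypothetical sketch: smoothed class probabilities at a decision-tree leaf.
        from collections import Counter

        def leaf_probabilities(labels, classes, alpha=1.0):
            """Laplace-smoothed class probabilities for the examples in one leaf."""
            counts = Counter(labels)
            n, k = len(labels), len(classes)
            return {c: (counts.get(c, 0) + alpha) / (n + alpha * k) for c in classes}

        # A leaf holding 3 positive and 0 negative training examples:
        print(leaf_probabilities(["pos", "pos", "pos"], classes=["pos", "neg"]))
        # Raw frequencies would give P(neg) = 0; the smoothed estimate avoids that.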

  1. RighTime: A real time clock correcting program for MS-DOS-based computer systems

    NASA Technical Reports Server (NTRS)

    Becker, G. Thomas

    1993-01-01

    A computer program is described which effectively eliminates the shortcomings of the DOS system clock in PC/AT-class computers. RighTime is a small, sophisticated memory-resident program that automatically corrects both the DOS system clock and the hardware 'CMOS' real time clock (RTC) in real time. RighTime learns what corrections are required without operator interaction beyond the occasional accurate time set. Both warm (power on) and cool (power off) errors are corrected, usually yielding better than one part per million accuracy in the typical desktop computer with no additional hardware, and RighTime increases the system clock resolution from approximately 0.0549 second to 0.01 second. Program tools are also available which allow visualization of RighTime's actions, verification of its performance, display of its history log, and which provide data for graphing of the system clock behavior. The program has found application in a wide variety of industries, including astronomy, satellite tracking, communications, broadcasting, transportation, public utilities, manufacturing, medicine, and the military.

  2. Let's Dance the "Robot Hokey-Pokey!": Children's Programming Approaches and Achievement throughout Early Cognitive Development

    ERIC Educational Resources Information Center

    Flannery, Louise P.; Bers, Marina Umaschi

    2013-01-01

    Young learners today generate, express, and interact with sophisticated ideas using a range of digital tools to explore interactive stories, animations, computer games, and robotics. In recent years, new developmentally appropriate robotics kits have been entering early childhood classrooms. This paper presents a retrospective analysis of one…

  3. A description of the thruster attitude control simulation and its application to the HEAO-C study

    NASA Technical Reports Server (NTRS)

    Brandon, L. B.

    1971-01-01

    During the design and evaluation of a reaction control system (RCS), it is desirable to have a digital computer program simulating vehicle dynamics, disturbance torques, control torques, and RCS logic. The thruster attitude control simulation (TACS) is just such a computer program. The TACS is a relatively sophisticated digital computer program that includes all the major parameters involved in the attitude control of a vehicle using an RCS for control. It includes the effects of gravity gradient torques and HEAO-C aerodynamic torques so that realistic runs can be made in the areas of fuel consumption and engine actuation rates. Also, the program is general enough that any engine configuration and logic scheme can be implemented in a reasonable amount of time. The results of the application of the TACS in the HEAO-C study are included.

  4. Analysis on laser plasma emission for characterization of colloids by video-based computer program

    NASA Astrophysics Data System (ADS)

    Putri, Kirana Yuniati; Lumbantoruan, Hendra Damos; Isnaeni

    2016-02-01

    Laser-induced breakdown detection (LIBD) is a sensitive technique for characterization of colloids with small size and low concentration. There are two types of detection: optical and acoustic. Optical LIBD employs a CCD camera to capture the plasma emission and uses the information to quantify the colloids. This technique requires sophisticated technology which is often expensive. In order to build a simple, home-made LIBD system, a dedicated computer program based on MATLAB™ for analyzing laser plasma emission was developed. The analysis was conducted by counting the number of plasma emissions (breakdowns) during a certain period of time. The breakdown probability provided information on colloid size and concentration. A validation experiment showed that the computer program performed well on analyzing the plasma emissions. A graphical user interface (GUI) was also developed to make the program more user-friendly.
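
    The counting step described above can be illustrated with a short sketch: classify each captured frame as containing a plasma flash or not, and report the fraction of shots that broke down. The original analysis program is MATLAB- and video-based; the threshold values and frame source below are assumed for illustration only.

        # Sketch of breakdown counting for LIBD: one grayscale frame per laser shot.
        import numpy as np

        def breakdown_probability(frames, intensity_threshold=200, min_bright_pixels=20):
            """Fraction of shots whose frame shows a bright plasma flash."""
            breakdowns = shots = 0
            for frame in frames:
                shots += 1
                if np.count_nonzero(frame > intensity_threshold) >= min_bright_pixels:
                    breakdowns += 1
            return breakdowns / shots if shots else 0.0

        # Synthetic example: one of four shots shows a bright plasma spot.
        rng = np.random.default_rng(0)
        frames = [rng.integers(0, 50, (64, 64)) for _ in range(3)]
        flash = rng.integers(0, 50, (64, 64))
        flash[30:36, 30:36] = 255
        frames.append(flash)
        print(breakdown_probability(frames))   # -> 0.25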

  5. TAIR: A transonic airfoil analysis computer code

    NASA Technical Reports Server (NTRS)

    Dougherty, F. C.; Holst, T. L.; Grundy, K. L.; Thomas, S. D.

    1981-01-01

    The operation of the TAIR (Transonic AIRfoil) computer code, which uses a fast, fully implicit algorithm to solve the conservative full-potential equation for transonic flow fields about arbitrary airfoils, is described on two levels of sophistication: simplified operation and detailed operation. The program organization and theory are elaborated to simplify modification of TAIR for new applications. Examples with input and output are given for a wide range of cases, including incompressible, subcritical compressible, and transonic calculations.

  6. MoCog1: A computer simulation of recognition-primed human decision making

    NASA Technical Reports Server (NTRS)

    Gevarter, William B.

    1991-01-01

    This report describes the successful results of the first stage of a research effort to develop a 'sophisticated' computer model of human cognitive behavior. Most human decision-making is of the experience-based, relatively straight-forward, largely automatic, type of response to internal goals and drives, utilizing cues and opportunities perceived from the current environment. This report describes the development of the architecture and computer program associated with such 'recognition-primed' decision-making. The resultant computer program was successfully utilized as a vehicle to simulate findings that relate how an individual's implicit theories orient them toward particular goals, with resultant cognitions, affects, and behavior in response to their environment. The present work is an expanded version and is based on research reported while the author was an employee of NASA ARC.

  7. Program Integrity, Controlled Growth Spell Success for Roots of Empathy

    ERIC Educational Resources Information Center

    Gordon, Mary; Letchford, Donna

    2009-01-01

    Childhood is a universal aspect of the human condition. Yet the landscape of childhood is changing rapidly. On playgrounds young children carry cell phones, and in classrooms children are more sophisticated in their use of computers and digital media than the adults in their lives. Most young adolescents are prolific communicators via text and…

  8. Conceptual Memory: A Theory and Computer Program for Processing the Meaning Content of Natural Language Utterances

    DTIC Science & Technology

    1974-07-01

    This document was generated by the Stanford Artificial Intelligence Laboratory’s document compiler, "PUB", and reproduced on a...for more sophisticated artificial (programming) languages. The new issues became those of how to represent a grammar as precise syntactic structures...challenge lies in discovering - either by synthesis of an artificial system, or by analysis of a natural one - the underlying logical (as opposed to

  9. [Application of virtual instrumentation technique in toxicological studies].

    PubMed

    Moczko, Jerzy A

    2005-01-01

    Research investigations frequently require direct connection of measuring equipment to the computer. Virtual instrumentation techniques considerably facilitate the programming of sophisticated acquisition-and-analysis procedures. In the standard approach these two steps are performed sequentially with separate software tools. The acquired data are transferred with the export/import procedures of one program to another, which executes the next step of the analysis. This procedure is cumbersome, time consuming, and a potential source of errors. In 1987 National Instruments Corporation introduced the LabVIEW language based on the concept of graphical programming. In contrast to conventional textual languages, it allows the researcher to concentrate on the problem being solved and to omit syntactical rules. Programs developed in LabVIEW are called virtual instruments (VIs) and are portable among different computer platforms such as PCs, Macintoshes, Sun SPARCstations, Concurrent PowerMAX stations, and HP PA-RISC workstations. This flexibility ensures that programs prepared for one particular platform are also appropriate for another. In the present paper the basic principles of connecting research equipment to computer systems are described.

  10. Computer Program for Analysis, Design and Optimization of Propulsion, Dynamics, and Kinematics of Multistage Rockets

    NASA Astrophysics Data System (ADS)

    Lali, Mehdi

    2009-03-01

    A comprehensive computer program is designed in MATLAB to analyze, design and optimize the propulsion, dynamics, thermodynamics, and kinematics of any serial multi-stage rocket for a set of given data. The program is quite user-friendly. It comprises two main sections: "analysis and design" and "optimization." Each section has a GUI (Graphical User Interface) in which the rocket's data are entered by the user and by which the program is run. The first section analyzes the performance of a rocket previously devised by the user. Numerous plots and subplots are provided to display the performance of the rocket. The second section of the program finds the "optimum trajectory" via billions of iterations and computations, which are done through sophisticated algorithms using numerical methods and incremental integrations. Innovative techniques are applied to calculate the optimal parameters for the engine and to design the "optimal pitch program." The computer program is stand-alone in the sense that it calculates almost every design parameter related to rocket propulsion and dynamics. It is meant to be used for actual launch operations as well as educational and research purposes.
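
    To show the kind of staging arithmetic such a program automates, the sketch below sums the ideal (Tsiolkovsky) delta-v over the stages of a serial multi-stage rocket. It is not taken from the MATLAB program itself, and the stage masses and specific impulses are invented for illustration.

        # Ideal delta-v of a serial multi-stage rocket (Tsiolkovsky rocket equation).
        import math

        G0 = 9.80665  # standard gravity, m/s^2

        def total_delta_v(stages, payload):
            """stages: list of (propellant_mass, dry_mass, Isp_seconds), bottom stage first."""
            dv = 0.0
            for i, (mp, md, isp) in enumerate(stages):
                upper = sum(s[0] + s[1] for s in stages[i + 1:])   # stages still attached above
                m0 = payload + upper + mp + md                     # ignition mass of this stage
                mf = payload + upper + md                          # burnout mass of this stage
                dv += isp * G0 * math.log(m0 / mf)
            return dv

        stages = [(120_000, 10_000, 280),   # first stage  (kg, kg, s), hypothetical values
                  (30_000, 3_000, 320),     # second stage
                  (8_000, 1_000, 450)]      # third stage
        print(f"ideal delta-v: {total_delta_v(stages, payload=1_000):.0f} m/s")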

  11. Program For A Pushbutton Display

    NASA Technical Reports Server (NTRS)

    Busquets, Anthony M.; Luck, William S., Jr.

    1989-01-01

    Programmable Display Pushbutton (PDP) is pushbutton device available from Micro Switch having programmable 16X35 matrix of light-emitting diodes on pushbutton surface. Any desired legends display on PDP's, producing user-friendly applications reducing need for dedicated manual controls. Interacts with operator, calls for correct response before transmitting next message. Both simple manual control and sophisticated programmable link between operator and host system. Programmable Display Pushbutton Legend Editor (PDPE) computer program used to create light-emitting-diode (LED) displays for pushbuttons. Written in FORTRAN.

  12. Revisiting Mathematical Problem Solving and Posing in the Digital Era: Toward Pedagogically Sound Uses of Modern Technology

    ERIC Educational Resources Information Center

    Abramovich, S.

    2014-01-01

    The availability of sophisticated computer programs such as "Wolfram Alpha" has made many problems found in the secondary mathematics curriculum somewhat obsolete for they can be easily solved by the software. Against this background, an interplay between the power of a modern tool of technology and educational constraints it presents is…

  13. Interactive Software For Astrodynamical Calculations

    NASA Technical Reports Server (NTRS)

    Schlaifer, Ronald S.; Skinner, David L.; Roberts, Phillip H.

    1995-01-01

    QUICK computer program provides user with facilities of sophisticated desk calculator performing scalar, vector, and matrix arithmetic; propagate conic-section orbits; determines planetary and satellite coordinates; and performs other related astrodynamic calculations within FORTRAN-like software environment. QUICK is interpreter, and no need to use compiler or linker to run QUICK code. Outputs plotted in variety of formats on variety of terminals. Written in RATFOR.

  14. Proposal for hierarchical description of software systems

    NASA Technical Reports Server (NTRS)

    Thauboth, H.

    1973-01-01

    The programming of digital computers has developed into a new dimension full of difficulties, because the hardware of computers has become so powerful that more complex applications are entrusted to computers. The costs of software development, verification, and maintenance are outpacing those of the hardware, and the trend is toward further increase of sophistication of application of computers and consequently of sophistication of software. To obtain better visibility into software systems and to improve the structure of software systems for better tests, verification, and maintenance, a clear but rigorous description and documentation of software is needed. The purpose of the report is to extend the present methods in order to obtain a documentation that better reflects the interplay between the various components and functions of a software system at different levels of detail without losing precision of expression. This is done by the use of block diagrams, sequence diagrams, and cross-reference charts. In the appendices, examples from an actual large software system, i.e. the Marshall System for Aerospace Systems Simulation (MARSYAS), are presented. The proposed documentation structure is compatible with automation of updating significant portions of the documentation for better software change control.

  15. MorphoHawk: Geometric-based Software for Manufacturing and More

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keith Arterburn

    2001-04-01

    Hollywood movies portray facial recognition as a perfected technology, but the reality is that sophisticated computers and algorithmic calculations are far from perfect. In fact, the most sophisticated and successful computer for recognizing faces and other imagery already is the human brain, with more than 10 billion nerve cells. Beginning at birth, humans process data and connect optical and sensory experiences that create an unparalleled accumulation of data for people to associate images with life experiences, emotions and knowledge. Computers are powerful, rapid and tireless, but still cannot compare to the highly sophisticated relational calculations and associations that the human computer can produce in connecting ‘what we see with what we know.’

  16. Viewing CAD Drawings on the Internet

    ERIC Educational Resources Information Center

    Schwendau, Mark

    2004-01-01

    Computer aided design (CAD) has been producing 3-D models for years. AutoCAD software is frequently used to create sophisticated 3-D models. These CAD files can be exported as 3DS files for import into Autodesk's 3-D Studio Viz. In this program, the user can render and modify the 3-D model before exporting it out as a WRL (world file hyperlinked)…

  17. Thermal Analysis

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The University of Georgia used NASTRAN, a COSMIC program that predicts how a design will stand up under stress, to develop a model for monitoring the transient cooling of vegetables. The winter use of passive solar heating for poultry houses is also under investigation by the Agricultural Engineering Dept. Another study involved thermal analysis of black and green nursery containers. The use of NASTRAN has encouraged student appreciation of sophisticated computer analysis.

  18. Sydney Observatory and astronomy teaching in the 90s

    NASA Astrophysics Data System (ADS)

    Lomb, N.

    1996-05-01

    Computers and the Internet have created a revolution in the way astronomy can be communicated to the public. At Sydney Observatory we make full use of these recent developments. In our lecture room a variety of sophisticated computer programs can show, with the help of a projection TV system, the appearance and motion of the sky at any place, date or time. The latest HST images obtained from the Internet can be shown, as can images taken through our own Meade 16 inch telescope. This recently installed computer-controlled telescope with its accurate pointing is an ideal instrument for a light-polluted site such as ours.

  19. Simulating motivated cognition

    NASA Technical Reports Server (NTRS)

    Gevarter, William B.

    1991-01-01

    A research effort to develop a sophisticated computer model of human behavior is described. A computer framework of motivated cognition was developed. Motivated cognition focuses on the motivations or affects that provide the context and drive in human cognition and decision making. A conceptual architecture of the human decision-making approach from the perspective of information processing in the human brain is developed in diagrammatic form. A preliminary version of such a diagram is presented. This architecture is then used as a vehicle for successfully constructing a computer program simulating Dweck and Leggett's findings that relate how an individual's implicit theories orient them toward particular goals, with resultant cognitions, affects, and behavior.

  20. Flight test validation of a design procedure for digital autopilots

    NASA Technical Reports Server (NTRS)

    Bryant, W. H.

    1983-01-01

    Commercially available general aviation autopilots are currently in transition from an analogue circuit system to a computer implemented digital flight control system. Well known advantages of the digital autopilot include enhanced modes, self-test capacity, fault detection, and greater computational capacity. A digital autopilot's computational capacity can be used to full advantage by increasing the sophistication of the digital autopilot's chief function, stability and control. NASA's Langley Research Center has been pursuing the development of direct digital design tools for aircraft stabilization systems for several years. This effort has most recently been directed towards the development and realization of multi-mode digital autopilots for GA aircraft, conducted under a SPIFR-related program called the General Aviation Terminal Operations Research (GATOR) Program. This presentation focuses on the implementation and testing of a candidate multi-mode autopilot designed using these newly developed tools.

  1. Defense Small Business Innovation Research Program (SBIR), Volume 4, Defense Agencies Abstracts of Phase 1 Awards 1991

    DTIC Science & Technology

    1991-01-01

    Experience in developing integrated optical devices, nonlinear magnetic-optic materials, high frequency modulators, computer-aided modeling and sophisticated... high-level presentation and distributed control models for integrating heterogeneous mechanical engineering applications and tools. The design is focused...statistically accurate worst case device models for circuit simulation. Present methods of worst case device design are ad hoc and do not allow the

  2. Petascale computation performance of lightweight multiscale cardiac models using hybrid programming models.

    PubMed

    Pope, Bernard J; Fitch, Blake G; Pitman, Michael C; Rice, John J; Reumann, Matthias

    2011-01-01

    Future multiscale and multiphysics models must use the power of high performance computing (HPC) systems to enable research into human disease, translational medical science, and treatment. Previously we showed that computationally efficient multiscale models will require the use of sophisticated hybrid programming models, mixing distributed message passing processes (e.g. the message passing interface (MPI)) with multithreading (e.g. OpenMP, POSIX pthreads). The objective of this work is to compare the performance of such hybrid programming models when applied to the simulation of a lightweight multiscale cardiac model. Our results show that the hybrid models do not perform favourably when compared to an implementation using only MPI, which is in contrast to our results using complex physiological models. Thus, with regard to lightweight multiscale cardiac models, the user may not need to increase programming complexity by using a hybrid programming approach. However, considering that model complexity will increase as well as the HPC system size in both node count and number of cores per node, it is still foreseeable that we will achieve faster-than-real-time multiscale cardiac simulations on these systems using hybrid programming models.

  3. An integrated decision support system for TRAC: A proposal

    NASA Technical Reports Server (NTRS)

    Mukkamala, Ravi

    1991-01-01

    Optimal allocation and usage of resources is a key to effective management. Resources of concern to TRAC are: Manpower (PSY), Money (Travel, contracts), Computing, Data, Models, etc. Management activities of TRAC include: Planning, Programming, Tasking, Monitoring, Updating, and Coordinating. Existing systems are insufficient, not completely automated, and manpower intensive, and the potential for data inconsistency exists. A system is proposed which suggests a means to integrate all project management activities of TRAC through the development of sophisticated software and by utilizing the existing computing systems and network resources.

  4. The design and implementation of CRT displays in the TCV real-time simulation

    NASA Technical Reports Server (NTRS)

    Leavitt, J. B.; Tariq, S. I.; Steinmetz, G. G.

    1975-01-01

    The design and application of computer graphics to the Terminal Configured Vehicle (TCV) program were described. A Boeing 737-100 series aircraft was modified with a second flight deck and several computers installed in the passenger cabin. One of the elements in support of the TCV program is a sophisticated simulation system developed to duplicate the operation of the aft flight deck. This facility consists of an aft flight deck simulator, equipped with realistic flight instrumentation, a CDC 6600 computer, and an Adage graphics terminal; this terminal presents to the simulator pilot displays similar to those used on the aircraft with equivalent man-machine interactions. These two displays form the primary flight instrumentation for the pilot and are dynamic images depicting critical flight information. The graphics terminal is a high speed interactive refresh-type graphics system. To support the cockpit display, two remote CRT's were wired in parallel with two of the Adage scopes.

  5. Programming of a flexible computer simulation to visualize pharmacokinetic-pharmacodynamic models.

    PubMed

    Lötsch, J; Kobal, G; Geisslinger, G

    2004-01-01

    Teaching pharmacokinetic-pharmacodynamic (PK/PD) models can be made more effective using computer simulations. We propose the programming of educational PK or PK/PD computer simulations as an alternative to the use of pre-built simulation software. This approach has the advantage of adaptability to non-standard or complicated PK or PK/PD models. Simplicity of the programming procedure was achieved by selecting the LabVIEW programming environment. An intuitive user interface to visualize the time courses of drug concentrations or effects can be obtained with pre-built elements. The environment uses a wiring analogy that resembles electrical circuit diagrams rather than abstract programming code. The goal of high interactivity of the simulation was attained by allowing the program to run in continuously repeating loops. This makes the program respond flexibly to user input. The programming is described with the aid of a 2-compartment PK simulation. Examples of more sophisticated simulation programs are also given, where the PK/PD simulation shows drug input, concentrations in plasma and at the effect site, and the effects themselves as a function of time. A multi-compartmental model of morphine, including metabolite kinetics and effects, is also included. The programs are available for download from the World Wide Web at http://www.klinik.uni-frankfurt.de/zpharm/klin/PKPDsimulation/content.html. For pharmacokineticists who program only occasionally, this offers the possibility of building the computer simulation, together with the flexible interactive simulation algorithm, for clinical pharmacological teaching in the field of PK/PD models.
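
    The two-compartment model mentioned above is standard enough to sketch outside LabVIEW; the Python fragment below integrates an IV bolus distributing between a central and a peripheral compartment, which is the kind of time course such a teaching simulation displays. The rate constants, dose, and volume are hypothetical, and this is not the authors' virtual instrument.

        # Two-compartment PK model after an IV bolus (first-order transfer/elimination).
        import numpy as np
        from scipy.integrate import odeint

        def two_compartment(y, t, k10, k12, k21):
            """y = (amount in central compartment, amount in peripheral compartment)."""
            a_c, a_p = y
            da_c = -(k10 + k12) * a_c + k21 * a_p
            da_p = k12 * a_c - k21 * a_p
            return [da_c, da_p]

        dose, v_central = 100.0, 20.0              # mg, litres (hypothetical)
        k10, k12, k21 = 0.20, 0.15, 0.10           # 1/h (hypothetical)
        t = np.linspace(0, 24, 97)                 # hours
        amounts = odeint(two_compartment, [dose, 0.0], t, args=(k10, k12, k21))
        concentration = amounts[:, 0] / v_central  # plasma concentration, mg/L

        print(f"plasma concentration at 1 h: {np.interp(1.0, t, concentration):.2f} mg/L")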

  6. Inferring population history with DIY ABC: a user-friendly approach to approximate Bayesian computation.

    PubMed

    Cornuet, Jean-Marie; Santos, Filipe; Beaumont, Mark A; Robert, Christian P; Marin, Jean-Michel; Balding, David J; Guillemaud, Thomas; Estoup, Arnaud

    2008-12-01

    Genetic data obtained on population samples convey information about their evolutionary history. Inference methods can extract part of this information but they require sophisticated statistical techniques that have been made available to the biologist community (through computer programs) only for simple and standard situations typically involving a small number of samples. We propose here a computer program (DIY ABC) for inference based on approximate Bayesian computation (ABC), in which scenarios can be customized by the user to fit many complex situations involving any number of populations and samples. Such scenarios involve any combination of population divergences, admixtures and population size changes. DIY ABC can be used to compare competing scenarios, estimate parameters for one or more scenarios and compute bias and precision measures for a given scenario and known values of parameters (the current version applies to unlinked microsatellite data). This article describes key methods used in the program and provides its main features. The analysis of one simulated and one real dataset, both with complex evolutionary scenarios, illustrates the main possibilities of DIY ABC. The software DIY ABC is freely available at http://www.montpellier.inra.fr/CBGP/diyabc.
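
    The core ABC idea that DIY ABC automates for complex scenarios can be shown in a few lines: draw a parameter from its prior, simulate data, and keep the draw if a summary statistic lands close to the observed one. The toy example below estimates a single Poisson rate and is in no way the DIY ABC implementation; the prior, tolerance, and summary statistic are all assumptions.

        # Toy rejection-ABC: posterior draws for a Poisson rate from a mean summary statistic.
        import numpy as np

        rng = np.random.default_rng(42)

        observed = rng.poisson(lam=4.0, size=50)     # stand-in for observed data
        obs_summary = observed.mean()

        def simulate_summary(lam, n=50):
            return rng.poisson(lam=lam, size=n).mean()

        accepted = []
        for _ in range(20_000):
            lam = rng.uniform(0.0, 10.0)                          # draw from a flat prior
            if abs(simulate_summary(lam) - obs_summary) < 0.2:    # tolerance on the summary
                accepted.append(lam)

        accepted = np.array(accepted)
        print(f"posterior mean ~ {accepted.mean():.2f} from {accepted.size} accepted draws")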

  7. Producing picture-perfect posters.

    PubMed

    Bach, D B; Vellet, A D; Karlik, S J; Downey, D B; Levin, M F; Munk, P L

    1993-06-01

    Scientific posters form an integral part of many radiology meetings. They provide the opportunity for interested parties to read the material at an individualized pace, to study the images in detail, and to return to the exhibit numerous times. Although the content of the poster is undoubtedly its most important component, the visual presentation of the material can enhance or detract from the clarity of the message. With the wide availability of sophisticated computer programs for desktop publishing (DTP), one can now create the poster on a computer monitor with full control of the form as well as the content. This process will result in a professional-appearing poster, yet still allow the author the opportunity to make innumerable revisions, as the poster is visualized in detail on the computer monitor before printing. Furthermore, this process is less expensive than the traditional method of typesetting individual sections separately and mounting them on cardboard for display. The purpose of this article is to present our approach to poster production using commercially available DTP computer programs.

  8. IPAD: Integrated Programs for Aerospace-vehicle Design

    NASA Technical Reports Server (NTRS)

    Miller, R. E., Jr.

    1985-01-01

    Early work was performed to apply data base technology in support of the management of engineering data in the design and manufacturing environments. The principal objective of the IPAD project is to develop a computer software system for use in the design of aerospace vehicles. Two prototype systems are created for this purpose. Relational Information Manager (RIM) is a successful commercial product. The IPAD Information Processor (IPIP), a much more sophisticated system, is still under development.

  9. Teaching biomedical applications to secondary students.

    PubMed

    Openshaw, S; Fleisher, A; Ljunggren, C

    1999-01-01

    Certain aspects of biomedical engineering applications lend themselves well to experimentation that can be done by high school students. This paper describes two experiments done during a six-week summer internship program in which two high school students used electrodes, circuit boards, and computers to mimic a sophisticated heart monitor and also to control a robotic car. Our experience suggests that simple illustrations of complex instrumentation can be effective in introducing adolescents to the biomedical engineering field.

  10. High resolution image processing on low-cost microcomputers

    NASA Technical Reports Server (NTRS)

    Miller, R. L.

    1993-01-01

    Recent advances in microcomputer technology have resulted in systems that rival the speed, storage, and display capabilities of traditionally larger machines. Low-cost microcomputers can provide a powerful environment for image processing. A new software program which offers sophisticated image display and analysis on IBM-based systems is presented. Designed specifically for a microcomputer, this program provides a wide range of functions normally found only on dedicated graphics systems, and therefore can provide most students, universities and research groups with an affordable computer platform for processing digital images. The processing of AVHRR images within this environment is presented as an example.

  11. Scaling of data communications for an advanced supercomputer network

    NASA Technical Reports Server (NTRS)

    Levin, E.; Eaton, C. K.; Young, Bruce

    1986-01-01

    The goal of NASA's Numerical Aerodynamic Simulation (NAS) Program is to provide a powerful computational environment for advanced research and development in aeronautics and related disciplines. The present NAS system consists of a Cray 2 supercomputer connected by a data network to a large mass storage system, to sophisticated local graphics workstations and by remote communication to researchers throughout the United States. The program plan is to continue acquiring the most powerful supercomputers as they become available. The implications of a projected 20-fold increase in processing power on the data communications requirements are described.

  12. VPython: Writing Real-time 3D Physics Programs

    NASA Astrophysics Data System (ADS)

    Chabay, Ruth

    2001-06-01

    VPython (http://cil.andrew.cmu.edu/projects/visual) combines the Python programming language with an innovative 3D graphics module called Visual, developed by David Scherer. Designed to make 3D physics simulations accessible to novice programmers, VPython allows the programmer to write a purely computational program without any graphics code, and produces an interactive real-time 3D graphical display. In a program, 3D objects are created and their positions modified by computational algorithms. Running in a separate thread, the Visual module monitors the positions of these objects and renders them many times per second. Using the mouse, one can zoom and rotate to navigate through the scene. After one hour of instruction, students in an introductory physics course at Carnegie Mellon University, including those who have never programmed before, write programs in VPython to model the behavior of physical systems and to visualize fields in 3D. The Numeric array processing module allows the construction of more sophisticated simulations and models as well. VPython is free and open source. The Visual module is based on OpenGL, and runs on Windows, Linux, and Macintosh.
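
    As a concrete illustration of this programming style, the sketch below animates a bouncing ball: the loop is purely computational, and the graphics module renders the updated positions. It uses the current vpython package rather than the 2001-era Visual module named above, and the physics values are assumed for illustration.

      # Bouncing ball in VPython: the loop only updates state; rendering is automatic.
      from vpython import sphere, box, vector, color, rate

      floor = box(pos=vector(0, -4, 0), size=vector(8, 0.2, 8))
      ball = sphere(pos=vector(0, 4, 0), radius=0.5, color=color.red)

      velocity = vector(0, 0, 0)   # m/s
      g = vector(0, -9.8, 0)       # gravitational acceleration, m/s^2
      dt = 0.005                   # time step, s

      while True:
          rate(200)                        # at most 200 loop iterations per second
          velocity = velocity + g * dt     # computational part: advance the state
          ball.pos = ball.pos + velocity * dt
          if ball.pos.y < floor.pos.y + ball.radius:
              velocity.y = -velocity.y     # elastic bounce off the floor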

  13. Automatic Compilation from High-Level Biologically-Oriented Programming Language to Genetic Regulatory Networks

    PubMed Central

    Beal, Jacob; Lu, Ting; Weiss, Ron

    2011-01-01

    Background The field of synthetic biology promises to revolutionize our ability to engineer biological systems, providing important benefits for a variety of applications. Recent advances in DNA synthesis and automated DNA assembly technologies suggest that it is now possible to construct synthetic systems of significant complexity. However, while a variety of novel genetic devices and small engineered gene networks have been successfully demonstrated, the regulatory complexity of synthetic systems that have been reported recently has somewhat plateaued due to a variety of factors, including the complexity of biology itself and the lag in our ability to design and optimize sophisticated biological circuitry. Methodology/Principal Findings To address the gap between DNA synthesis and circuit design capabilities, we present a platform that enables synthetic biologists to express desired behavior using a convenient high-level biologically-oriented programming language, Proto. The high level specification is compiled, using a regulatory motif based mechanism, to a gene network, optimized, and then converted to a computational simulation for numerical verification. Through several example programs we illustrate the automated process of biological system design with our platform, and show that our compiler optimizations can yield significant reductions in the number of genes (~50%) and latency of the optimized engineered gene networks. Conclusions/Significance Our platform provides a convenient and accessible tool for the automated design of sophisticated synthetic biological systems, bridging an important gap between DNA synthesis and circuit design capabilities. Our platform is user-friendly and features biologically relevant compiler optimizations, providing an important foundation for the development of sophisticated biological systems. PMID:21850228

  14. Automatic compilation from high-level biologically-oriented programming language to genetic regulatory networks.

    PubMed

    Beal, Jacob; Lu, Ting; Weiss, Ron

    2011-01-01

    The field of synthetic biology promises to revolutionize our ability to engineer biological systems, providing important benefits for a variety of applications. Recent advances in DNA synthesis and automated DNA assembly technologies suggest that it is now possible to construct synthetic systems of significant complexity. However, while a variety of novel genetic devices and small engineered gene networks have been successfully demonstrated, the regulatory complexity of synthetic systems that have been reported recently has somewhat plateaued due to a variety of factors, including the complexity of biology itself and the lag in our ability to design and optimize sophisticated biological circuitry. To address the gap between DNA synthesis and circuit design capabilities, we present a platform that enables synthetic biologists to express desired behavior using a convenient high-level biologically-oriented programming language, Proto. The high level specification is compiled, using a regulatory motif based mechanism, to a gene network, optimized, and then converted to a computational simulation for numerical verification. Through several example programs we illustrate the automated process of biological system design with our platform, and show that our compiler optimizations can yield significant reductions in the number of genes (~ 50%) and latency of the optimized engineered gene networks. Our platform provides a convenient and accessible tool for the automated design of sophisticated synthetic biological systems, bridging an important gap between DNA synthesis and circuit design capabilities. Our platform is user-friendly and features biologically relevant compiler optimizations, providing an important foundation for the development of sophisticated biological systems.

  15. A Computer Based Data Management System for Automating the Air Force Vehicle Master Plan

    DTIC Science & Technology

    1989-09-01

    and columns -- but a much more sophisticated approach defining the relations among the data. Dr. Edgar F. Codd, who first proposed the relational ... Vehicle Types Screen ... NSN Selection Screen ... NSN Retry Screen ... Five-Year Outlook ... had a single source of information on the vehicle fleet, to be used in developing ... and prioritizing the vehicle program required to meet the ...

  16. The Matter Simulation (R)evolution

    PubMed Central

    2018-01-01

    To date, the program for the development of methods and models for atomistic and continuum simulation directed toward chemicals and materials has reached an incredible degree of sophistication and maturity. Currently, one can witness an increasingly rapid emergence of advances in computing, artificial intelligence, and robotics. This drives us to consider the future of computer simulation of matter from the molecular to the human length and time scales in a radical way that deliberately dares to go beyond the foreseeable next steps in any given discipline. This perspective article presents a view on this future development that we believe is likely to become a reality during our lifetime. PMID:29532014

  17. Can An Evolutionary Process Create English Text?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bailey, David H.

    Critics of the conventional theory of biological evolution have asserted that while natural processes might result in some limited diversity, nothing fundamentally new can arise from 'random' evolution. In response, biologists such as Richard Dawkins have demonstrated that a computer program can generate a specific short phrase via evolution-like iterations starting with random gibberish. While such demonstrations are intriguing, they are flawed in that they have a fixed, pre-specified future target, whereas in real biological evolution there is no fixed future target, but only a complicated 'fitness landscape'. In this study, a significantly more sophisticated evolutionary scheme is employed to produce text segments reminiscent of a Charles Dickens novel. The aggregate size of these segments is larger than the computer program and the input Dickens text, even when comparing compressed data (as a measure of information content).
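
    The fixed-target demonstration attributed to Dawkins can be sketched in a few lines of Python. This is the simple pre-specified-target scheme that the record criticizes, not the study's more sophisticated target-free scheme; the phrase and parameters are assumptions.

      # "Weasel"-style cumulative selection toward a fixed target phrase.
      import random

      ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "
      TARGET = "METHINKS IT IS LIKE A WEASEL"   # pre-specified target (toy example)

      def fitness(s: str) -> int:
          return sum(a == b for a, b in zip(s, TARGET))

      def mutate(s: str, prob: float = 0.05) -> str:
          return "".join(random.choice(ALPHABET) if random.random() < prob else c for c in s)

      parent = "".join(random.choice(ALPHABET) for _ in TARGET)   # random gibberish
      generation = 0
      while parent != TARGET:
          generation += 1
          offspring = [mutate(parent) for _ in range(100)]
          parent = max(offspring + [parent], key=fitness)         # keep the fittest
      print(f"reached the target in {generation} generations")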

  18. Inferring population history with DIY ABC: a user-friendly approach to approximate Bayesian computation

    PubMed Central

    Cornuet, Jean-Marie; Santos, Filipe; Beaumont, Mark A.; Robert, Christian P.; Marin, Jean-Michel; Balding, David J.; Guillemaud, Thomas; Estoup, Arnaud

    2008-01-01

    Summary: Genetic data obtained on population samples convey information about their evolutionary history. Inference methods can extract part of this information but they require sophisticated statistical techniques that have been made available to the biologist community (through computer programs) only for simple and standard situations typically involving a small number of samples. We propose here a computer program (DIY ABC) for inference based on approximate Bayesian computation (ABC), in which scenarios can be customized by the user to fit many complex situations involving any number of populations and samples. Such scenarios involve any combination of population divergences, admixtures and population size changes. DIY ABC can be used to compare competing scenarios, estimate parameters for one or more scenarios and compute bias and precision measures for a given scenario and known values of parameters (the current version applies to unlinked microsatellite data). This article describes key methods used in the program and provides its main features. The analysis of one simulated and one real dataset, both with complex evolutionary scenarios, illustrates the main possibilities of DIY ABC. Availability: The software DIY ABC is freely available at http://www.montpellier.inra.fr/CBGP/diyabc. Contact: j.cornuet@imperial.ac.uk Supplementary information: Supplementary data are also available at http://www.montpellier.inra.fr/CBGP/diyabc PMID:18842597
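
    DIY ABC itself is not reproduced here, but the core idea of approximate Bayesian computation can be illustrated by generic rejection sampling on a toy model: draw parameters from the prior, simulate data, and keep the draws whose summary statistic falls close to the observed one. All model choices, priors, and tolerances below are assumptions for illustration.

      # Rejection-sampling ABC on a toy Gaussian model (illustrative sketch).
      import numpy as np

      rng = np.random.default_rng(0)

      observed = rng.normal(loc=3.0, scale=1.0, size=200)   # pretend field data
      s_obs = observed.mean()                                # observed summary statistic

      def simulate(mu: float, n: int = 200) -> np.ndarray:
          """Synthetic dataset under candidate parameter mu."""
          return rng.normal(loc=mu, scale=1.0, size=n)

      accepted = []
      eps = 0.05                                             # tolerance
      for _ in range(100_000):
          mu = rng.uniform(-10, 10)                          # draw from the prior
          if abs(simulate(mu).mean() - s_obs) < eps:         # rejection step
              accepted.append(mu)

      print(f"posterior mean ~ {np.mean(accepted):.2f} from {len(accepted)} accepted draws")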

  19. DYNA3D: A computer code for crashworthiness engineering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hallquist, J.O.; Benson, D.J.

    1986-09-01

    A finite element program with crashworthiness applications has been developed at LLNL. DYNA3D, an explicit, fully vectorized, finite deformation structural dynamics program, has four capabilities that are critical for the efficient and realistic modeling of crash phenomena: (1) fully optimized nonlinear solid, shell, and beam elements for representing a structure; (2) a broad range of constitutive models for simulating material behavior; (3) sophisticated contact algorithms for impact interactions; (4) a rigid body capability to represent the bodies away from the impact region at a greatly reduced cost without sacrificing accuracy in the momentum calculations. Basic methodologies of the program are briefly presented along with several crashworthiness calculations. Efficiencies of the Hughes-Liu and Belytschko-Tsay shell formulations are considered.

  20. wft4galaxy: a workflow testing tool for galaxy.

    PubMed

    Piras, Marco Enrico; Pireddu, Luca; Zanetti, Gianluigi

    2017-12-01

    Workflow managers for scientific analysis provide a high-level programming platform facilitating standardization, automation, collaboration and access to sophisticated computing resources. The Galaxy workflow manager provides a prime example of this type of platform. As compositions of simpler tools, workflows effectively comprise specialized computer programs implementing often very complex analysis procedures. To date, no simple way to automatically test Galaxy workflows and ensure their correctness has appeared in the literature. With wft4galaxy we offer a tool to bring automated testing to Galaxy workflows, making it feasible to bring continuous integration to their development and ensuring that defects are detected promptly. wft4galaxy can be easily installed as a regular Python program or launched directly as a Docker container-the latter reducing installation effort to a minimum. Available at https://github.com/phnmnl/wft4galaxy under the Academic Free License v3.0. marcoenrico.piras@crs4.it. © The Author 2017. Published by Oxford University Press.

  1. Toward a molecular programming language for algorithmic self-assembly

    NASA Astrophysics Data System (ADS)

    Patitz, Matthew John

    Self-assembly is the process whereby relatively simple components autonomously combine to form more complex objects. Nature exhibits self-assembly to form everything from microscopic crystals to living cells to galaxies. With a desire to both form increasingly sophisticated products and to understand the basic components of living systems, scientists have developed and studied artificial self-assembling systems. One such framework is the Tile Assembly Model introduced by Erik Winfree in 1998. In this model, simple two-dimensional square 'tiles' are designed so that they self-assemble into desired shapes. The work in this thesis consists of a series of results which build toward the future goal of designing an abstracted, high-level programming language for designing the molecular components of self-assembling systems which can perform powerful computations and form into intricate structures. The first two sets of results demonstrate self-assembling systems which perform infinite series of computations that characterize computably enumerable and decidable languages, and exhibit tools for algorithmically generating the necessary sets of tiles. In the next chapter, methods for generating tile sets which self-assemble into complicated shapes, namely a class of discrete self-similar fractal structures, are presented. Next, a software package for graphically designing tile sets, simulating their self-assembly, and debugging designed systems is discussed. Finally, a high-level programming language which abstracts much of the complexity and tedium of designing such systems, while preventing many of the common errors, is presented. The summation of this body of work presents a broad coverage of the spectrum of desired outputs from artificial self-assembling systems and a progression in the sophistication of tools used to design them. By creating a broader and deeper set of modular tools for designing self-assembling systems, we hope to increase the complexity which is attainable. These tools provide a solid foundation for future work in both the Tile Assembly Model and explorations into more advanced models.

  2. Pharmacophore modeling, docking, and principal component analysis based clustering: combined computer-assisted approaches to identify new inhibitors of the human rhinovirus coat protein.

    PubMed

    Steindl, Theodora M; Crump, Carolyn E; Hayden, Frederick G; Langer, Thierry

    2005-10-06

    The development and application of a sophisticated virtual screening and selection protocol to identify potential, novel inhibitors of the human rhinovirus coat protein employing various computer-assisted strategies are described. A large commercially available database of compounds was screened using a highly selective, structure-based pharmacophore model generated with the program Catalyst. A docking study and a principal component analysis were carried out within the software package Cerius and served to validate and further refine the obtained results. These combined efforts led to the selection of six candidate structures, for which in vitro anti-rhinoviral activity could be shown in a biological assay.

  3. Depicting 3D shape using lines

    NASA Astrophysics Data System (ADS)

    DeCarlo, Doug

    2012-03-01

    Over the last few years, researchers in computer graphics have developed sophisticated mathematical descriptions of lines on 3D shapes that can be rendered convincingly as strokes in drawings. These innovations highlight fundamental questions about how human perception takes strokes in drawings as evidence of 3D structure. Answering these questions will lead to a greater scientific understanding of the flexibility and richness of human perception, as well as to practical techniques for synthesizing clearer and more compelling drawings. This paper reviews what is known about the mathematics and perception of computer-generated line drawings of shape and motivates an ongoing program of research to better characterize the shapes people see when they look at such drawings.

  4. V/STOLAND digital avionics system for XV-15 tilt rotor

    NASA Technical Reports Server (NTRS)

    Liden, S.

    1980-01-01

    A digital flight control system for the tilt rotor research aircraft provides sophisticated navigation, guidance, control, display and data acquisition capabilities for performing terminal area navigation, guidance and control research. All functions of the XV-15 V/STOLAND system were demonstrated on the NASA-ARC S-19 simulation facility under a comprehensive dynamic acceptance test. The most noteworthy accomplishments of the system are: (1) automatic configuration control of a tilt-rotor aircraft over the total operating range; (2) total hands-off landing to touchdown on various selectable straight-in glide slopes and on a flight path that includes a two-revolution helix; (3) automatic guidance along a programmed three-dimensional reference flight path; (4) navigation data for the automatic guidance computed on board, based on VOR/DME, TACAN, or MLS navaid data; and (5) integration of a large set of functions in a single computer, utilizing 16k words of storage for programs and data.

  5. Quantitative Modeling of Earth Surface Processes

    NASA Astrophysics Data System (ADS)

    Pelletier, Jon D.

    This textbook describes some of the most effective and straightforward quantitative techniques for modeling Earth surface processes. By emphasizing a core set of equations and solution techniques, the book presents state-of-the-art models currently employed in Earth surface process research, as well as a set of simple but practical research tools. Detailed case studies demonstrate application of the methods to a wide variety of processes including hillslope, fluvial, aeolian, glacial, tectonic, and climatic systems. Exercises at the end of each chapter begin with simple calculations and then progress to more sophisticated problems that require computer programming. All the necessary computer codes are available online at www.cambridge.org/9780521855976. Assuming some knowledge of calculus and basic programming experience, this quantitative textbook is designed for advanced geomorphology courses and as a reference book for professional researchers in Earth and planetary science looking for a quantitative approach to Earth surface processes.
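
    As an illustration of the kind of exercise described, the following Python sketch (not one of the book's own codes) integrates the classic 1-D hillslope diffusion equation, dz/dt = D d2z/dx2, with explicit finite differences; the diffusivity, grid, and initial profile are assumed values.

      # 1-D hillslope diffusion by explicit finite differences (illustrative sketch).
      import numpy as np

      D = 0.01                    # diffusivity, m^2/yr (assumed)
      dx = 1.0                    # grid spacing, m
      dt = 0.2 * dx**2 / D        # time step chosen for stability (dt <= 0.5*dx^2/D)
      nsteps = 5000

      x = np.arange(0.0, 100.0 + dx, dx)
      z = np.where(np.abs(x - 50.0) < 10.0, 10.0, 0.0)   # scarp-like initial profile

      for _ in range(nsteps):
          curvature = (z[:-2] - 2.0 * z[1:-1] + z[2:]) / dx**2
          z[1:-1] += D * curvature * dt                  # interior nodes evolve
          z[0], z[-1] = 0.0, 0.0                         # fixed-elevation boundaries

      print(f"maximum elevation after {nsteps * dt:.0f} years: {z.max():.2f} m")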

  7. Earth Science Informatics Comes of Age

    NASA Technical Reports Server (NTRS)

    Jodha, Siri; Khalsa, S.; Ramachandran, Rahul

    2014-01-01

    The volume and complexity of Earth science data have steadily increased, placing ever-greater demands on researchers, software developers and data managers tasked with handling such data. Additional demands arise from requirements being levied by funding agencies and governments to better manage, preserve and provide open access to data. Fortunately, over the past 10-15 years significant advances in information technology, such as increased processing power, advanced programming languages, more sophisticated and practical standards, and near-ubiquitous internet access have made the jobs of those acquiring, processing, distributing and archiving data easier. These advances have also led to an increasing number of individuals entering the field of informatics as it applies to Geoscience and Remote Sensing. Informatics is the science and technology of applying computers and computational methods to the systematic analysis, management, interchange, and representation of data, information, and knowledge. Informatics also encompasses the use of computers and computational methods to support decision making and other applications for societal benefits.

  8. Evolution of a minimal parallel programming model

    DOE PAGES

    Lusk, Ewing; Butler, Ralph; Pieper, Steven C.

    2017-04-30

    Here, we take a historical approach to our presentation of self-scheduled task parallelism, a programming model with its origins in early irregular and nondeterministic computations encountered in automated theorem proving and logic programming. We show how an extremely simple task model has evolved into a system, asynchronous dynamic load balancing (ADLB), and a scalable implementation capable of supporting sophisticated applications on today’s (and tomorrow’s) largest supercomputers; and we illustrate the use of ADLB with a Green’s function Monte Carlo application, a modern, mature nuclear physics code in production use. Our lesson is that by surrendering a certain amount of generality and thus applicability, a minimal programming model (in terms of its basic concepts and the size of its application programmer interface) can achieve extreme scalability without introducing complexity.
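
    ADLB itself is an MPI-based library, but the underlying idea of self-scheduled task parallelism can be illustrated with Python's standard library: idle workers pull the next task as soon as they finish, so uneven work balances itself (a minimal sketch, not ADLB's API).

      # Self-scheduled (dynamically balanced) task pool with the standard library.
      from multiprocessing import Pool
      import math

      def work_unit(seed: int) -> float:
          """A deliberately uneven task: its cost depends on the input."""
          n = 10_000 * (seed % 7 + 1)
          return sum(math.sin(i) for i in range(n))

      if __name__ == "__main__":
          tasks = range(200)
          with Pool(processes=4) as pool:
              # imap_unordered hands out one task at a time, so the load balances dynamically.
              results = list(pool.imap_unordered(work_unit, tasks))
          print(f"completed {len(results)} tasks")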

  9. Speeding Up Ecological and Evolutionary Computations in R; Essentials of High Performance Computing for Biologists

    PubMed Central

    Visser, Marco D.; McMahon, Sean M.; Merow, Cory; Dixon, Philip M.; Record, Sydne; Jongejans, Eelke

    2015-01-01

    Computation has become a critical component of research in biology. A risk has emerged that computational and programming challenges may limit research scope, depth, and quality. We review various solutions to common computational efficiency problems in ecological and evolutionary research. Our review pulls together material that is currently scattered across many sources and emphasizes those techniques that are especially effective for typical ecological and environmental problems. We demonstrate how straightforward it can be to write efficient code and implement techniques such as profiling or parallel computing. We supply a newly developed R package (aprof) that helps to identify computational bottlenecks in R code and determine whether optimization can be effective. Our review is complemented by a practical set of examples and detailed Supporting Information material (S1–S3 Texts) that demonstrate large improvements in computational speed (ranging from 10.5 times to 14,000 times faster). By improving computational efficiency, biologists can feasibly solve more complex tasks, ask more ambitious questions, and include more sophisticated analyses in their research. PMID:25811842

  10. Architecture independent environment for developing engineering software on MIMD computers

    NASA Technical Reports Server (NTRS)

    Valimohamed, Karim A.; Lopez, L. A.

    1990-01-01

    Engineers are constantly faced with solving problems of increasing complexity and detail. Multiple Instruction stream Multiple Data stream (MIMD) computers have been developed to overcome the performance limitations of serial computers. The hardware architectures of MIMD computers vary considerably and are much more sophisticated than serial computers. Developing large scale software for a variety of MIMD computers is difficult and expensive. There is a need to provide tools that facilitate programming these machines. First, the issues that must be considered to develop those tools are examined. The two main areas of concern were architecture independence and data management. Architecture independent software facilitates software portability and improves the longevity and utility of the software product. It provides some form of insurance for the investment of time and effort that goes into developing the software. The management of data is a crucial aspect of solving large engineering problems. It must be considered in light of the new hardware organizations that are available. Second, the functional design and implementation of a software environment that facilitates developing architecture independent software for large engineering applications are described. The topics of discussion include: a description of the model that supports the development of architecture independent software; identifying and exploiting concurrency within the application program; data coherence; engineering data base and memory management.

  11. Evolution of a minimal parallel programming model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lusk, Ewing; Butler, Ralph; Pieper, Steven C.

    Here, we take a historical approach to our presentation of self-scheduled task parallelism, a programming model with its origins in early irregular and nondeterministic computations encountered in automated theorem proving and logic programming. We show how an extremely simple task model has evolved into a system, asynchronous dynamic load balancing (ADLB), and a scalable implementation capable of supporting sophisticated applications on today’s (and tomorrow’s) largest supercomputers; and we illustrate the use of ADLB with a Green’s function Monte Carlo application, a modern, mature nuclear physics code in production use. Our lesson is that by surrendering a certain amount of generality and thus applicability, a minimal programming model (in terms of its basic concepts and the size of its application programmer interface) can achieve extreme scalability without introducing complexity.

  12. [Activities of Research Institute for Advanced Computer Science]

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2001-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center, Moffett Field, California. RIACS research focuses on the three cornerstones of IT research necessary to meet the future challenges of NASA missions: 1. Automated Reasoning for Autonomous Systems: Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. 2. Human-Centered Computing: Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. 3. High Performance Computing and Networking: Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to analysis of large scientific datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply IT research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, visiting scientist programs and student summer programs, designed to encourage and facilitate collaboration between the university and NASA IT research communities.

  13. M.S.L.A.P. Modular Spectral Line Analysis Program documentation

    NASA Technical Reports Server (NTRS)

    Joseph, Charles L.; Jenkins, Edward B.

    1991-01-01

    MSLAP is a software package for analyzing spectra, providing the basic structure to identify spectral features, to make quantitative measurements of these features, and to store the measurements for convenient access. MSLAP can be used to measure not only the zeroth moment (equivalent width) of a profile, but also the first and second moments. Optical depths and the corresponding column densities across the profile can be measured as well for sufficiently high resolution data. The software was developed for an interactive, graphical analysis where the computer carries most of the computational and data organizational burden and the investigator is responsible only for the judgement decisions. It employs sophisticated statistical techniques for determining the best polynomial fit to the continuum and for calculating the uncertainties.
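
    MSLAP's own code is not shown in the record, but the profile moments it measures can be sketched directly for an absorption line with a fitted continuum; the toy line parameters below are assumptions.

      # Zeroth (equivalent width), first (centroid) and second (variance) moments
      # of an absorption-line profile, given the flux and the fitted continuum.
      import numpy as np

      def line_moments(wavelength, flux, continuum):
          depth = 1.0 - flux / continuum           # normalized absorption depth
          dl = wavelength[1] - wavelength[0]       # uniform wavelength step assumed
          w_eq = depth.sum() * dl                  # zeroth moment: equivalent width
          centroid = (wavelength * depth).sum() * dl / w_eq
          variance = ((wavelength - centroid) ** 2 * depth).sum() * dl / w_eq
          return w_eq, centroid, variance

      # Toy Gaussian absorption line on a flat continuum (synthetic data)
      wl = np.linspace(1300.0, 1302.0, 400)        # angstroms
      cont = np.ones_like(wl)
      flux = cont - 0.6 * np.exp(-0.5 * ((wl - 1301.0) / 0.05) ** 2)

      w_eq, centroid, variance = line_moments(wl, flux, cont)
      print(f"W = {w_eq:.3f} A, centroid = {centroid:.3f} A, sigma = {variance**0.5:.3f} A")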

  14. Fiber Composite Sandwich Thermostructural Behavior: Computational Simulation

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Aiello, R. A.; Murthy, P. L. N.

    1986-01-01

    Several computational levels of progressive sophistication/simplification are described to computationally simulate composite sandwich hygral, thermal, and structural behavior. The computational levels of sophistication include: (1) three-dimensional detailed finite element modeling of the honeycomb, the adhesive and the composite faces; (2) three-dimensional finite element modeling of the honeycomb assumed to be an equivalent continuous, homogeneous medium, the adhesive and the composite faces; (3) laminate theory simulation where the honeycomb (metal or composite) is assumed to consist of plies with equivalent properties; and (4) derivations of approximate, simplified equations for thermal and mechanical properties by simulating the honeycomb as an equivalent homogeneous medium. The approximate equations are combined with composite hygrothermomechanical and laminate theories to provide a simple and effective computational procedure for simulating the thermomechanical/thermostructural behavior of fiber composite sandwich structures.

  15. Lessons from a doctoral thesis.

    PubMed

    Peiris, A N; Mueller, R A; Sheridan, D P

    1990-01-01

    The production of a doctoral thesis is a time-consuming affair that until recently was done in conjunction with professional publishing services. Advances in computer technology have made many sophisticated desktop publishing techniques available to the microcomputer user. We describe the computer method used, the problems encountered, and the solutions improvised in the production of a doctoral thesis by computer. The Apple Macintosh was selected for its ease of use and intrinsic graphics capabilities. A scanner was used to incorporate text from published papers into a word processing program. The body of the text was updated and supplemented with new sections. Scanned graphics from the published papers were less suitable for publication, and the original data were replotted and modified with a graphics-drawing program. Graphics were imported and incorporated in the text. Final hard copy was produced by a laser printer and bound with both conventional and rapid new binding techniques. Microcomputer-based desktop processing methods provide a rapid and cost-effective means of communicating the written word. We anticipate that this evolving technology will have increased use by physicians in both the private and academic sectors.

  16. Combined Numerical/Analytical Perturbation Solutions of the Navier-Stokes Equations for Aerodynamic Ejector/Mixer Nozzle Flows

    NASA Technical Reports Server (NTRS)

    DeChant, Lawrence Justin

    1998-01-01

    In spite of rapid advances in both scalar and parallel computational tools, the large number of variables involved in both design and inverse problems makes the use of sophisticated fluid flow models impractical. With this restriction, it is concluded that an important family of methods for mathematical/computational development is reduced or approximate fluid flow models. In this study a combined perturbation/numerical modeling methodology is developed which provides a rigorously derived family of solutions. The mathematical model is computationally more efficient than classical boundary-layer methods but provides important two-dimensional information not available using quasi-1-d approaches. An additional strength of the current methodology is its ability to locally predict static pressure fields in a manner analogous to more sophisticated parabolized Navier Stokes (PNS) formulations. To resolve singular behavior, the model utilizes classical analytical solution techniques. Hence, analytical methods have been combined with efficient numerical methods to yield an efficient hybrid fluid flow model. In particular, the main objective of this research has been to develop a system of analytical and numerical ejector/mixer nozzle models, which require minimal empirical input. A computer code, DREA (Differential Reduced Ejector/mixer Analysis), has been developed with the ability to run sufficiently fast so that it may be used either as a subroutine or called by a design optimization routine. Models are of direct use to the High Speed Civil Transport Program (a joint government/industry project seeking to develop an economically viable U.S. commercial supersonic transport vehicle) and are currently being adopted by both NASA and industry. Experimental validation of these models is provided by comparison to results obtained from open literature and Limited Exclusive Right Distribution (LERD) sources, as well as dedicated experiments performed at Texas A&M. These experiments have been performed using a hydraulic/gas flow analog. Results of comparisons of DREA computations with experimental data, which include entrainment, thrust, and local profile information, are overall good. Computational time studies indicate that DREA provides considerably more information at a lower computational cost than contemporary ejector nozzle design models. Finally, physical limitations of the method, deviations from experimental data, potential improvements and alternative formulations are described. This report represents closure to the NASA Graduate Researchers Program. Versions of the DREA code and a user's guide may be obtained from the NASA Lewis Research Center.

  17. Development of a New System for Transport Simulation and Analysis at General Atomics

    NASA Astrophysics Data System (ADS)

    St. John, H. E.; Peng, Q.; Freeman, J.; Crotinger, J.

    1997-11-01

    General Atomics has begun a long-term program to improve all aspects of experimental data analysis related to DIII-D. The objective is to make local and visiting physicists as productive as possible, with only a small investment in training, by developing intuitive, sophisticated interfaces to existing and newly created computer programs. Here we describe our initial work and results of a pilot project in this program. The pilot project is a collaborative effort between LLNL and GA which will ultimately result in the merger of Corsica and ONETWO (and selected modules from other codes) into a new advanced transport code system. The initial goal is to produce a graphical user interface to the transport code ONETWO which will couple to a programmable (steerable) front end designed for the transport system. This will be an object-oriented scheme written primarily in Python. The programmable application will integrate existing C, C++, and Fortran methods in a single computational paradigm. Its most important feature is the use of plug-in physics modules which will allow a high degree of customization.
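
    One way such a plug-in architecture might be organized is sketched below in Python; every name here (register_module, the toy heating modules, run_transport_step) is a hypothetical illustration, not the actual ONETWO or Corsica interface.

      # Minimal plug-in physics-module registry with a simple driver (illustrative).
      from typing import Callable, Dict, List

      _REGISTRY: Dict[str, Callable] = {}

      def register_module(name: str):
          """Decorator that makes a physics routine available to the driver by name."""
          def wrap(func: Callable) -> Callable:
              _REGISTRY[name] = func
              return func
          return wrap

      @register_module("ohmic_heating")
      def ohmic_heating(state: dict) -> dict:
          state["power_density"] = state.get("power_density", 0.0) + 0.3   # toy physics
          return state

      @register_module("neutral_beam_heating")
      def neutral_beam_heating(state: dict) -> dict:
          state["power_density"] = state.get("power_density", 0.0) + 1.5   # toy physics
          return state

      def run_transport_step(state: dict, modules: List[str]) -> dict:
          """Driver: apply whichever plug-in modules the user selected, in order."""
          for name in modules:
              state = _REGISTRY[name](state)
          return state

      print(run_transport_step({}, ["ohmic_heating", "neutral_beam_heating"]))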

  18. An Overview of R in Health Decision Sciences.

    PubMed

    Jalal, Hawre; Pechlivanoglou, Petros; Krijkamp, Eline; Alarid-Escudero, Fernando; Enns, Eva; Hunink, M G Myriam

    2017-10-01

    As the complexity of health decision science applications increases, high-level programming languages are increasingly adopted for statistical analyses and numerical computations. These programming languages facilitate sophisticated modeling, model documentation, and analysis reproducibility. Among the high-level programming languages, the statistical programming framework R is gaining increased recognition. R is freely available, cross-platform compatible, and open source. A large community of users who have generated an extensive collection of well-documented packages and functions supports it. These functions facilitate applications of health decision science methodology as well as the visualization and communication of results. Although R's popularity is increasing among health decision scientists, methodological extensions of R in the field of decision analysis remain isolated. The purpose of this article is to provide an overview of existing R functionality that is applicable to the various stages of decision analysis, including model design, input parameter estimation, and analysis of model outputs.

  19. Explosive Transient Camera (ETC) Program

    NASA Technical Reports Server (NTRS)

    Ricker, George

    1991-01-01

    Since the inception of the ETC program, a wide range of new technologies was developed to support this astronomical instrument. The prototype unit was installed at ETC Site 1. The first partially automated observations were made and some major renovations were later added to the ETC hardware. The ETC was outfitted with new thermoelectrically-cooled CCD cameras and a sophisticated vacuum manifold, which, together, made the ETC a much more reliable unit than the prototype. The ETC instrumentation and building were placed under full computer control, allowing the ETC to operate as an automated, autonomous instrument with virtually no human intervention necessary. The first fully-automated operation of the ETC was performed, during which the ETC monitored the error region of the repeating soft gamma-ray burster SGR 1806-21.

  20. Reading wild minds: A computational assay of Theory of Mind sophistication across seven primate species

    PubMed Central

    Devaine, Marie; San-Galli, Aurore; Trapanese, Cinzia; Bardino, Giulia; Hano, Christelle; Saint Jalme, Michel; Bouret, Sebastien

    2017-01-01

    Theory of Mind (ToM), i.e. the ability to understand others' mental states, endows humans with highly adaptive social skills such as teaching or deceiving. Candidate evolutionary explanations have been proposed for the unique sophistication of human ToM among primates. For example, the Machiavellian intelligence hypothesis states that the increasing complexity of social networks may have induced a demand for sophisticated ToM. This type of scenario ignores neurocognitive constraints that may eventually be crucial limiting factors for ToM evolution. In contradistinction, the cognitive scaffolding hypothesis asserts that a species' opportunity to develop sophisticated ToM is mostly determined by its general cognitive capacity (on which ToM is scaffolded). However, the actual relationships between ToM sophistication and either brain volume (a proxy for general cognitive capacity) or social group size (a proxy for social network complexity) are unclear. Here, we let 39 individuals sampled from seven non-human primate species (lemurs, macaques, mangabeys, orangutans, gorillas and chimpanzees) engage in simple dyadic games against artificial ToM players (via a familiar human caregiver). Using computational analyses of primates' choice sequences, we found that the probability of exhibiting a ToM-compatible learning style is mainly driven by species' brain volume (rather than by social group size). Moreover, primates' social cognitive sophistication culminates in a precursor form of ToM, which still falls short of human fully-developed ToM abilities. PMID:29112973

  21. Reading wild minds: A computational assay of Theory of Mind sophistication across seven primate species.

    PubMed

    Devaine, Marie; San-Galli, Aurore; Trapanese, Cinzia; Bardino, Giulia; Hano, Christelle; Saint Jalme, Michel; Bouret, Sebastien; Masi, Shelly; Daunizeau, Jean

    2017-11-01

    Theory of Mind (ToM), i.e. the ability to understand others' mental states, endows humans with highly adaptive social skills such as teaching or deceiving. Candidate evolutionary explanations have been proposed for the unique sophistication of human ToM among primates. For example, the Machiavellian intelligence hypothesis states that the increasing complexity of social networks may have induced a demand for sophisticated ToM. This type of scenario ignores neurocognitive constraints that may eventually be crucial limiting factors for ToM evolution. In contradistinction, the cognitive scaffolding hypothesis asserts that a species' opportunity to develop sophisticated ToM is mostly determined by its general cognitive capacity (on which ToM is scaffolded). However, the actual relationships between ToM sophistication and either brain volume (a proxy for general cognitive capacity) or social group size (a proxy for social network complexity) are unclear. Here, we let 39 individuals sampled from seven non-human primate species (lemurs, macaques, mangabeys, orangutans, gorillas and chimpanzees) engage in simple dyadic games against artificial ToM players (via a familiar human caregiver). Using computational analyses of primates' choice sequences, we found that the probability of exhibiting a ToM-compatible learning style is mainly driven by species' brain volume (rather than by social group size). Moreover, primates' social cognitive sophistication culminates in a precursor form of ToM, which still falls short of human fully-developed ToM abilities.

  1. Coupled rotor/airframe vibration analysis

    NASA Technical Reports Server (NTRS)

    Sopher, R.; Studwell, R. E.; Cassarino, S.; Kottapalli, S. B. R.

    1982-01-01

    A coupled rotor/airframe vibration analysis developed as a design tool for predicting helicopter vibrations and a research tool to quantify the effects of structural properties, aerodynamic interactions, and vibration reduction devices on vehicle vibration levels is described. The analysis consists of a base program utilizing an impedance matching technique to represent the coupled rotor/airframe dynamics of the system, supported by inputs from several external programs supplying sophisticated rotor and airframe aerodynamic and structural dynamic representation. The theoretical background, computer program capabilities and limited correlation results are presented in this report. Correlation using scale-model wind tunnel data shows that the analysis can adequately predict trends of vibration variations with airspeed and higher harmonic control effects. Predictions of absolute values of vibration levels were found to be very sensitive to modal characteristics and results were not representative of measured values.

  2. MHOST: An efficient finite element program for inelastic analysis of solids and structures

    NASA Technical Reports Server (NTRS)

    Nakazawa, S.

    1988-01-01

    An efficient finite element program for 3-D inelastic analysis of gas turbine hot section components was constructed and validated. A novel mixed iterative solution strategy is derived from the augmented Hu-Washizu variational principle in order to nodally interpolate coordinates, displacements, deformation, strains, stresses and material properties. A series of increasingly sophisticated material models incorporated in MHOST include elasticity, secant plasticity, infinitesimal and finite deformation plasticity, creep and unified viscoplastic constitutive model proposed by Walker. A library of high performance elements is built into this computer program utilizing the concepts of selective reduced integrations and independent strain interpolations. A family of efficient solution algorithms is implemented in MHOST for linear and nonlinear equation solution including the classical Newton-Raphson, modified, quasi and secant Newton methods with optional line search and the conjugate gradient method.

  3. An XML-based method for astronomy software designing

    NASA Astrophysics Data System (ADS)

    Liao, Mingxue; Aili, Yusupu; Zhang, Jin

    An XML-based method for standardizing software design is introduced, analyzed, and successfully applied to renovating the hardware and software of the digital clock at Urumqi Astronomical Station. The basic strategy for eliciting time information from the new FT206 digital clock in the antenna control program is described. With FT206 there is no need to compute, with sophisticated formulas, how many centuries have passed since a given day, and it is no longer necessary to set the correct UT time on the computer controlling the antenna, because the year, month, and day are all deduced from the Julian day kept in FT206 rather than from the computer clock. With an XML-based method and standard for software design, various existing design methods are unified and communication and collaboration between developers are facilitated, making an Internet-based mode of software development possible. The trend of development of the XML-based design method is predicted.

  4. Data communication requirements for the advanced NAS network

    NASA Technical Reports Server (NTRS)

    Levin, Eugene; Eaton, C. K.; Young, Bruce

    1986-01-01

    The goal of the Numerical Aerodynamic Simulation (NAS) Program is to provide a powerful computational environment for advanced research and development in aeronautics and related disciplines. The present NAS system consists of a Cray 2 supercomputer connected by a data network to a large mass storage system, to sophisticated local graphics workstations, and by remote communications to researchers throughout the United States. The program plan is to continue acquiring the most powerful supercomputers as they become available. In the 1987/1988 time period it is anticipated that a computer with 4 times the processing speed of a Cray 2 will be obtained and by 1990 an additional supercomputer with 16 times the speed of the Cray 2. The implications of this 20-fold increase in processing power on the data communications requirements are described. The analysis was based on models of the projected workload and system architecture. The results are presented together with the estimates of their sensitivity to assumptions inherent in the models.

  5. Teaching computer interfacing with virtual instruments in an object-oriented language.

    PubMed Central

    Gulotta, M

    1995-01-01

    LabVIEW is a graphic object-oriented computer language developed to facilitate hardware/software communication. LabVIEW is a complete computer language that can be used like Basic, FORTRAN, or C. In LabVIEW one creates virtual instruments that aesthetically look like real instruments but are controlled by sophisticated computer programs. There are several levels of data acquisition VIs that make it easy to control data flow, and many signal processing and analysis algorithms come with the software as premade VIs. In the classroom, the similarity between virtual and real instruments helps students understand how information is passed between the computer and attached instruments. The software may be used in the absence of hardware so that students can work at home as well as in the classroom. This article demonstrates how LabVIEW can be used to control data flow between computers and instruments, points out important features for signal processing and analysis, and shows how virtual instruments may be used in place of physical instrumentation. Applications of LabVIEW to the teaching laboratory are also discussed, and a plausible course outline is given. PMID:8580361

  6. Teaching computer interfacing with virtual instruments in an object-oriented language.

    PubMed

    Gulotta, M

    1995-11-01

    LabVIEW is a graphic object-oriented computer language developed to facilitate hardware/software communication. LabVIEW is a complete computer language that can be used like Basic, FORTRAN, or C. In LabVIEW one creates virtual instruments that aesthetically look like real instruments but are controlled by sophisticated computer programs. There are several levels of data acquisition VIs that make it easy to control data flow, and many signal processing and analysis algorithms come with the software as premade VIs. In the classroom, the similarity between virtual and real instruments helps students understand how information is passed between the computer and attached instruments. The software may be used in the absence of hardware so that students can work at home as well as in the classroom. This article demonstrates how LabVIEW can be used to control data flow between computers and instruments, points out important features for signal processing and analysis, and shows how virtual instruments may be used in place of physical instrumentation. Applications of LabVIEW to the teaching laboratory are also discussed, and a plausible course outline is given.

  7. The Center for Computational Biology: resources, achievements, and challenges

    PubMed Central

    Dinov, Ivo D; Thompson, Paul M; Woods, Roger P; Van Horn, John D; Shattuck, David W; Parker, D Stott

    2011-01-01

    The Center for Computational Biology (CCB) is a multidisciplinary program where biomedical scientists, engineers, and clinicians work jointly to combine modern mathematical and computational techniques, to perform phenotypic and genotypic studies of biological structure, function, and physiology in health and disease. CCB has developed a computational framework built around the Manifold Atlas, an integrated biomedical computing environment that enables statistical inference on biological manifolds. These manifolds model biological structures, features, shapes, and flows, and support sophisticated morphometric and statistical analyses. The Manifold Atlas includes tools, workflows, and services for multimodal population-based modeling and analysis of biological manifolds. The broad spectrum of biomedical topics explored by CCB investigators includes the study of normal and pathological brain development, maturation and aging, discovery of associations between neuroimaging and genetic biomarkers, and the modeling, analysis, and visualization of biological shape, form, and size. CCB supports a wide range of short-term and long-term collaborations with outside investigators, which drive the center's computational developments and focus the validation and dissemination of CCB resources to new areas and scientific domains. PMID:22081221

  8. The Center for Computational Biology: resources, achievements, and challenges.

    PubMed

    Toga, Arthur W; Dinov, Ivo D; Thompson, Paul M; Woods, Roger P; Van Horn, John D; Shattuck, David W; Parker, D Stott

    2012-01-01

    The Center for Computational Biology (CCB) is a multidisciplinary program where biomedical scientists, engineers, and clinicians work jointly to combine modern mathematical and computational techniques, to perform phenotypic and genotypic studies of biological structure, function, and physiology in health and disease. CCB has developed a computational framework built around the Manifold Atlas, an integrated biomedical computing environment that enables statistical inference on biological manifolds. These manifolds model biological structures, features, shapes, and flows, and support sophisticated morphometric and statistical analyses. The Manifold Atlas includes tools, workflows, and services for multimodal population-based modeling and analysis of biological manifolds. The broad spectrum of biomedical topics explored by CCB investigators includes the study of normal and pathological brain development, maturation and aging, discovery of associations between neuroimaging and genetic biomarkers, and the modeling, analysis, and visualization of biological shape, form, and size. CCB supports a wide range of short-term and long-term collaborations with outside investigators, which drive the center's computational developments and focus the validation and dissemination of CCB resources to new areas and scientific domains.

  9. Nursing informatics: the trend of the future.

    PubMed

    Nagelkerk, J; Ritola, P M; Vandort, P J

    1998-01-01

    Nursing informatics is a combination of computer, information, and nursing sciences. This new and expanding field addresses the efficient and effective use of information for nurses. Preparing nurses for computerization is essential to confront an explosion of sophisticated computerized technology in the workplace. It is critical in a competitive health care market for preparing nurses to use the most cost-effective methods. A model is presented that identifies six essential factors for preparing nurses for computerization. Strong leadership, effective communication, organized training sessions, established time frames, planned change, and tailored software are the essential factors to consider for development of a successful educational program.

  10. Minimum-complexity helicopter simulation math model

    NASA Technical Reports Server (NTRS)

    Heffley, Robert K.; Mnich, Marc A.

    1988-01-01

    An example of a minimal-complexity helicopter simulation math model is presented. Motivating factors are the computational delays, cost, and inflexibility of the very sophisticated math models now in common use. A helicopter model form is given which addresses each of these factors and provides better engineering understanding of the specific handling qualities features which are apparent to the simulator pilot. The technical approach begins with specification of features which are to be modeled, followed by a buildup of individual vehicle components and definition of equations. Model matching and estimation procedures are given which enable the modeling of specific helicopters from basic data sources such as flight manuals. Checkout procedures are given which provide for total model validation. A number of possible model extensions and refinements are discussed. Math model computer programs are defined and listed.

  11. The personal computer and GP-B management. [Gravity Probe experiment

    NASA Technical Reports Server (NTRS)

    Neighbors, A. K.

    1986-01-01

    The Gravity Probe-B (GP-B) experiment is one of the most sophisticated and challenging developments to be undertaken by NASA. Its objective is to measure the relativistic drift of gyroscopes in orbit about the earth. In this paper, the experiment is described, and the strategy of phased procurements for accomplishing the engineering development of the hardware is discussed. The microcomputer is a very convenient and powerful tool in the management of GP-B. It is used in creating and monitoring such project data as schedules, budgets, hardware procurements and technical and interface requirements. Commercially available software for word processing, database management, communications, spreadsheet, graphics and program management is used. Examples are described of the efficacy of the application of the computer by the management team.

  12. Emerging Uses of Computer Technology in Qualitative Research.

    ERIC Educational Resources Information Center

    Parker, D. Randall

    The application of computer technology in qualitative research and evaluation ranges from simple word processing to doing sophisticated data sorting and retrieval. How computer software can be used for qualitative research is discussed. Researchers should consider the use of computers in data analysis in light of their own familiarity and comfort…

  13. The Next Computer Revolution.

    ERIC Educational Resources Information Center

    Peled, Abraham

    1987-01-01

    Discusses some of the future trends in the use of the computer in our society, suggesting that computing is now entering a new phase in which it will grow exponentially more powerful, flexible, and sophisticated in the next decade. Describes some of the latest breakthroughs in computer hardware and software technology. (TW)

  14. Ocean Models and Proper Orthogonal Decomposition

    NASA Astrophysics Data System (ADS)

    Salas-de-Leon, D. A.

    2007-05-01

    Increasing computational power and a better understanding of mathematical and physical systems have resulted in a growing number of ocean models. Not long ago, modelers were like a secret organization and recognized each other by secret codes and languages that only a select group of people was able to understand. Access to computational systems was limited: on one hand, equipment and computer time were expensive and restricted, and on the other, they required advanced programming languages that not everybody wanted to learn. Nowadays most college freshmen own a personal computer (PC or laptop) and/or have access to more sophisticated computational systems than those available for research in the early 1980s. This availability of resources has resulted in much broader access to all kinds of models. Today computer speed, computer time, and the algorithms do not seem to be a problem, even though some models take days to run on small computational systems. Almost every oceanographic institution has its own model; what is more, within the same institution, from one office to the next, there are different models for the same phenomena, developed by different research members. The results do not differ substantially, since the equations are the same and the solution algorithms are similar. The algorithms, and the grids constructed with them, can be found in textbooks and/or on the internet. Every year more sophisticated models are constructed. Proper Orthogonal Decomposition is a technique that reduces the number of variables to be solved while preserving the model's properties, which makes it a very useful tool for reducing the computations that must be performed on "small" computational systems, thus making sophisticated models available to a wider community.
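
    The technique itself can be sketched directly: POD modes are the left singular vectors of a mean-removed snapshot matrix, and truncating to the leading modes gives a reduced-order representation. The Python example below uses a synthetic field and is a generic illustration, not any particular ocean model.

      # Proper Orthogonal Decomposition via the SVD of a snapshot matrix (illustrative).
      import numpy as np

      rng = np.random.default_rng(1)

      # Synthetic field: two travelling patterns plus noise, 500 grid points x 200 snapshots.
      x = np.linspace(0, 2 * np.pi, 500)
      t = np.linspace(0, 10, 200)
      snapshots = (np.outer(np.sin(x), np.cos(2 * t))
                   + 0.5 * np.outer(np.cos(3 * x), np.sin(5 * t))
                   + 0.01 * rng.standard_normal((x.size, t.size)))

      mean_field = snapshots.mean(axis=1, keepdims=True)
      U, s, Vt = np.linalg.svd(snapshots - mean_field, full_matrices=False)

      energy = np.cumsum(s**2) / np.sum(s**2)
      r = int(np.searchsorted(energy, 0.99)) + 1     # modes capturing 99% of the variance
      print(f"{r} POD modes capture 99% of the variance")

      # Reduced-order reconstruction using only the leading r modes
      reduced = mean_field + U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]
      err = np.linalg.norm(snapshots - reduced) / np.linalg.norm(snapshots)
      print(f"relative reconstruction error: {err:.2e}")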

  15. Allen Newell's Program of Research: The Video-Game Test.

    PubMed

    Gobet, Fernand

    2017-04-01

    Newell (1973) argued that progress in psychology was slow because research focused on experiments trying to answer binary questions, such as serial versus parallel processing. In addition, not enough attention was paid to the strategies used by participants, and there was a lack of theories implemented as computer models offering sufficient precision for being tested rigorously. He proposed a three-headed research program: to develop computational models able to carry out the task they aimed to explain; to study one complex task in detail, such as chess; and to build computational models that can account for multiple tasks. This article assesses the extent to which the papers in this issue advance Newell's program. While half of the papers devote much attention to strategies, several papers still average across them, a capital sin according to Newell. The three courses of action he proposed were not popular in these papers: Only two papers used computational models, with no model being both able to carry out the task and to account for human data; there was no systematic analysis of a specific video game; and no paper proposed a computational model accounting for human data in several tasks. It is concluded that, while they use sophisticated methods of analysis and discuss interesting results, overall these papers contribute only little to Newell's program of research. In this respect, they reflect the current state of psychology and cognitive science. This is a shame, as Newell's ideas might help address the current crisis of lack of replication and fraud in psychology. Copyright © 2017 The Author. Topics in Cognitive Science published by Wiley Periodicals, Inc. on behalf of Cognitive Science Society.

  16. Knowledge-based computer systems for radiotherapy planning.

    PubMed

    Kalet, I J; Paluszynski, W

    1990-08-01

    Radiation therapy is one of the first areas of clinical medicine to utilize computers in support of routine clinical decision making. The role of the computer has evolved from simple dose calculations to elaborate interactive graphic three-dimensional simulations. These simulations can combine external irradiation from megavoltage photons, electrons, and particle beams with interstitial and intracavitary sources. With the flexibility and power of modern radiotherapy equipment and the ability of computer programs to simulate anything the machinery can do, we now face a challenge to utilize this capability to design more effective radiation treatments. How can we manage the increased complexity of sophisticated treatment planning? A promising approach will be to use artificial intelligence techniques to systematize our present knowledge about design of treatment plans, and to provide a framework for developing new treatment strategies. Far from replacing the physician, physicist, or dosimetrist, artificial intelligence-based software tools can assist the treatment planning team in producing more powerful and effective treatment plans. Research in progress using knowledge-based (AI) programming in treatment planning has already indicated the usefulness of such concepts as rule-based reasoning, hierarchical organization of knowledge, and reasoning from prototypes. Problems to be solved include how to handle continuously varying parameters and how to evaluate plans in order to direct improvements.

  17. New method for identifying features of an image on a digital video display

    NASA Astrophysics Data System (ADS)

    Doyle, Michael D.

    1991-04-01

    The MetaMap process extends the concept of direct manipulation human-computer interfaces to new limits. Its specific capabilities include the correlation of discrete image elements to relevant text information and the correlation of these image features to other images as well as to program control mechanisms. The correlation is accomplished through reprogramming of both the color map and the image so that discrete image elements comprise unique sets of color indices. This process allows the correlation to be accomplished with very efficient data storage and program execution times. Image databases adapted to this process become object-oriented as a result. Very sophisticated interrelationships can be set up between images, text, and program control mechanisms using this process. An application of this interfacing process to the design of an interactive atlas of medical histology is described, along with other possible applications. The MetaMap process is protected by U. S. patent #4
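
    A rough sketch of the indexed-color idea the abstract describes (this is not the patented MetaMap implementation; the palette ranges, labels, and actions are invented for illustration): each image feature is painted with its own reserved palette indices, and a lookup table keys a clicked pixel's index to text and a program action.

      import numpy as np

      # Toy indexed-color image: pixel values are palette indices, not RGB triples.
      # Indices 10-19 are reserved for "nucleus", 20-29 for "cytoplasm" (illustrative choice).
      image = np.zeros((100, 100), dtype=np.uint8)
      image[20:40, 20:40] = 12        # region painted with a "nucleus" index
      image[50:90, 10:60] = 25        # region painted with a "cytoplasm" index

      # Correlation table: palette index range -> text annotation and a program action.
      feature_table = {
          range(10, 20): ("Cell nucleus: contains the chromatin.", "open_nucleus_page"),
          range(20, 30): ("Cytoplasm: surrounds the nucleus.", "open_cytoplasm_page"),
      }

      def lookup(row, col):
          """Return the annotation and action keyed to the palette index under a pixel."""
          index = int(image[row, col])
          for index_range, payload in feature_table.items():
              if index in index_range:
                  return payload
          return ("Background", None)

      print(lookup(30, 30))   # ('Cell nucleus: ...', 'open_nucleus_page')
      print(lookup(60, 30))   # ('Cytoplasm: ...', 'open_cytoplasm_page')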

  18. Goals and Objectives for Computing in the Associated Colleges of the St. Lawrence Valley.

    ERIC Educational Resources Information Center

    Grupe, Fritz H.

    A forecast of the computing requirements of the Associated Colleges of the St. Lawrence Valley, an analysis of their needs, and specifications for a joint computer system are presented. Problems encountered included the lack of resources and computer sophistication at the member schools and a dearth of experience with long-term computer consortium…

  19. The flight telerobotic servicer and technology transfer

    NASA Technical Reports Server (NTRS)

    Andary, James F.; Bradford, Kayland Z.

    1991-01-01

    The Flight Telerobotic Servicer (FTS) project at the Goddard Space Flight Center is developing an advanced telerobotic system to assist in and reduce crew extravehicular activity (EVA) for Space Station Freedom (SSF). The FTS will provide a telerobotic capability in the early phases of the SSF program and will be employed for assembly, maintenance, and inspection applications. The current state of space technology and the general nature of the FTS tasks dictate that the FTS be designed with sophisticated teleoperational capabilities for its internal primary operating mode. However, technologies such as advanced computer vision and autonomous planning techniques would greatly enhance the FTS capabilities to perform autonomously in less structured work environments. Another objective of the FTS program is to accelerate technology transfer from research to U.S. industry.

  20. Prior Consent: Not-So-Strange Bedfellows Plan Library/Computing Partnerships.

    ERIC Educational Resources Information Center

    McDonough, Kristin

    The increasing sophistication of information technologies and the nearly universal access to computing have blurred distinctions among information delivery units on college campuses, forcing institutions to rethink the separate organizational structures that evolved when computing in academe was more localized and less prevalent. Experiences in…

  1. Wing Leading Edge RCC Rapid Response Damage Prediction Tool (IMPACT2)

    NASA Technical Reports Server (NTRS)

    Clark, Robert; Cotter, Paul; Michalopoulos, Constantine

    2013-01-01

    This rapid response computer program predicts Orbiter Wing Leading Edge (WLE) damage caused by ice or foam impact during a Space Shuttle launch (Program "IMPACT2"). The program was developed after the Columbia accident in order to quickly assess WLE damage due to ice, foam, or metal impact (if any) during a Shuttle launch. IMPACT2 simulates an impact event in a few minutes for foam impactors, and in seconds for ice and metal impactors. The damage criterion is derived from results obtained from one sophisticated commercial program, which requires hours to carry out simulations of the same impact events. The program was designed to run much faster than the commercial program, with prediction of projectile threshold velocities within 10 to 15% of commercial-program values. The mathematical model involves coupling of Orbiter wing normal modes of vibration to nonlinear or linear spring-mass models. IMPACT2 solves nonlinear or linear impact problems using classical normal modes of vibration of a target, and nonlinear/linear time-domain equations for the projectile. Impact loads and stresses developed in the target are computed as functions of time. This model is novel because of its speed of execution. A typical model of foam, or other projectile characterized by material nonlinearities, impacting an RCC panel is executed in minutes instead of the hours needed by the commercial programs. Target damage due to impact can be assessed quickly, provided that target vibration modes and allowable stress are known.
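
    The coupling idea described above can be sketched as a generic one-mode, linear-contact toy model integrated in the time domain; the parameters below are made up and this is not the IMPACT2 code or Orbiter data, only an illustration of driving a target vibration mode with the contact force from a spring-mass projectile.

      import numpy as np

      # Illustrative parameters: one target mode (unit modal mass) and a linear contact spring.
      omega = 2 * np.pi * 50.0      # target modal frequency [rad/s]
      phi = 1.0                     # mode shape value at the impact point
      m_proj = 0.5                  # projectile mass [kg]
      k_contact = 2.0e5             # contact spring stiffness [N/m]
      dt, t_end = 1.0e-6, 0.02

      q = qdot = 0.0                # modal coordinate of the target
      x, v = 0.0, -30.0             # projectile position and velocity (moving into the target)
      peak_force = 0.0

      for _ in range(int(t_end / dt)):
          # Contact force acts only while the projectile presses into the target surface.
          penetration = phi * q - x
          force = k_contact * penetration if penetration > 0.0 else 0.0
          peak_force = max(peak_force, force)

          # Explicit Euler update: force pushes the projectile back (+x) and drives the mode in -x.
          qddot = -phi * force - omega**2 * q
          q, qdot = q + dt * qdot, qdot + dt * qddot
          x, v = x + dt * v, v + dt * (force / m_proj)

      print(f"peak contact force ~ {peak_force:.1f} N")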

  2. Using multimedia virtual patients to enhance the clinical curriculum for medical students.

    PubMed

    McGee, J B; Neill, J; Goldman, L; Casey, E

    1998-01-01

    Changes in the environment in which clinical medical education takes place in the United States have profoundly affected the quality of the learning experience. A shift to out-patient based care, minimization of hospitalization time, and shrinking clinical revenues have changed the teaching hospital or "classroom" to a degree that we must develop innovative approaches to medical education. One solution is the Virtual Patient Project. Utilizing state-of-the-art computer-based multimedia technology, we are building a library of simulated patient encounters that will serve to fill some of the educational gaps that the current health care system has created. This project is part of a newly formed and unique organization, the Harvard Medical School-Beth Israel Deaconess Mount Auburn Institute for Education and Research (the Institute), which supports in-house educational design, production, and faculty time to create Virtual Patients. These problem-based clinical cases allow the medical student to evaluate a patient at initial presentation, order diagnostic tests, observe the outcome, and obtain context-sensitive feedback through a computer program designed at the Institute. Multimedia technology and authoring programs have reached a level of sophistication that allows content experts (the teaching faculty) to design and create the majority of the program themselves and allows students to adapt the program to their individual learning needs.

  3. Image labeling. The need for a better look.

    PubMed

    Hunter, T

    1994-10-01

    The important message in this editorial is for radiologists to critically examine how well images are labeled in their own department. If it is not satisfactory, then institute corrective measures. These can range from sophisticated computer programs for printing flashcards to merely sending the chief technologist all those films one comes across with unreadable labels. The quality of the image labeling should also be a consideration when purchasing CT, MRI, ultrasound, computed radiography and digital angiography equipment. The fact that you consider this important should be communicated to equipment manufacturers in the hope that they will pay more attention to it and offer more flexibility for each department to design its own labels. In any event, I feel consistently bad film labeling results in sloppy radiology with possible patient harm and unpleasant legal consequences for the radiologist.

  4. Using Ada: The deeper challenges

    NASA Technical Reports Server (NTRS)

    Feinberg, David A.

    1986-01-01

    The Ada programming language and the associated Ada Programming Support Environment (APSE) and Ada Run Time Environment (ARTE) provide the potential for significant life-cycle cost reductions in computer software development and maintenance activities. The Ada programming language itself is standardized, trademarked, and controlled via formal validation procedures. Though compilers are not yet production-ready as most would desire, the technology for constructing them is sufficiently well known and understood that time and money should suffice to correct current deficiencies. The APSE and ARTE are, on the other hand, significantly newer issues within most software development and maintenance efforts. Currently, APSE and ARTE are highly dependent on differing implementer concepts, strategies, and market objectives. Complex and sophisticated mission-critical computing systems require the use of a complete Ada-based capability, not just the programming language itself; yet the range of APSE and ARTE features which must actually be utilized can vary significantly from one system to another. As a consequence, the need to understand, objectively evaluate, and select differing APSE and ARTE capabilities and features is critical to the effective use of Ada and the life-cycle efficiencies it is intended to promote. It is the selection, collection, and understanding of APSE and ARTE which provide the deeper challenges of using Ada for real-life mission-critical computing systems. Some of the current issues which must be clarified, often on a case-by-case basis, in order to successfully realize the full capabilities of Ada are discussed.

  5. Reproducible research in vadose zone sciences

    USDA-ARS?s Scientific Manuscript database

    A significant portion of present-day soil and Earth science research is computational, involving complex data analysis pipelines, advanced mathematical and statistical models, and sophisticated computer codes. Opportunities for scientific progress are greatly diminished if reproducing and building o...

  6. A review of propeller discrete frequency noise prediction technology with emphasis on two current methods for time domain calculations

    NASA Technical Reports Server (NTRS)

    Farassat, F.; Succi, G. P.

    1980-01-01

    A review of propeller noise prediction technology is presented which highlights the developments in the field from the successful attempt of Gutin to the current sophisticated techniques. Two methods for the prediction of the discrete frequency noise from conventional and advanced propellers in forward flight are described. These methods, developed at MIT and NASA Langley Research Center, are based on different time domain formulations. Brief descriptions of the computer algorithms based on these formulations are given. The output of these two programs, which is the acoustic pressure signature, is Fourier analyzed to get the acoustic pressure spectrum. The main difference between the programs as they are coded now is that the Langley program can handle propellers with supersonic tip speed while the MIT program is for subsonic tip speed propellers. Comparisons of the calculated and measured acoustic data for a conventional and an advanced propeller show good agreement in general.
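
    The Fourier-analysis step mentioned above, turning a periodic time-domain pressure signature into a discrete-frequency (harmonic) spectrum, looks roughly like this in Python; the synthetic signature and blade-passage frequency are placeholders, not output of the MIT or Langley codes.

      import numpy as np

      # Synthetic periodic pressure signature over one blade-passage period (illustrative).
      blade_passage_freq = 100.0                       # Hz
      n = 512
      t = np.arange(n) / (n * blade_passage_freq)      # one period, sampled n times
      p = (2.0 * np.sin(2 * np.pi * blade_passage_freq * t)
           + 0.5 * np.sin(2 * np.pi * 3 * blade_passage_freq * t + 0.3))

      # Fourier analysis of the signature gives the amplitude at each harmonic.
      spectrum = np.fft.rfft(p) / n
      harmonics = np.fft.rfftfreq(n, d=t[1] - t[0])
      amplitudes = 2.0 * np.abs(spectrum)              # single-sided amplitude per harmonic

      for f, a in zip(harmonics[1:6], amplitudes[1:6]):
          print(f"{f:7.1f} Hz  amplitude {a:6.3f} Pa")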

  7. Commentary: The Materials Project: A materials genome approach to accelerating materials innovation

    NASA Astrophysics Data System (ADS)

    Jain, Anubhav; Ong, Shyue Ping; Hautier, Geoffroy; Chen, Wei; Richards, William Davidson; Dacek, Stephen; Cholia, Shreyas; Gunter, Dan; Skinner, David; Ceder, Gerbrand; Persson, Kristin A.

    2013-07-01

    Accelerating the discovery of advanced materials is essential for human welfare and sustainable, clean energy. In this paper, we introduce the Materials Project (www.materialsproject.org), a core program of the Materials Genome Initiative that uses high-throughput computing to uncover the properties of all known inorganic materials. This open dataset can be accessed through multiple channels for both interactive exploration and data mining. The Materials Project also seeks to create open-source platforms for developing robust, sophisticated materials analyses. Future efforts will enable users to perform "rapid-prototyping" of new materials in silico, and provide researchers with new avenues for cost-effective, data-driven materials design.

  8. Using Interactive Computer to Communicate Scientific Information.

    ERIC Educational Resources Information Center

    Selnow, Gary W.

    1988-01-01

    Asks whether the computer is another channel of communication, if its interactive qualities make it an information source, or if it is an undefined hybrid. Concludes that computers are neither the medium nor the source but will in the future provide the possibility of a sophisticated interaction between human intelligence and artificial…

  9. How You Can Protect Public Access Computers "and" Their Users

    ERIC Educational Resources Information Center

    Huang, Phil

    2007-01-01

    By providing the public with online computing facilities, librarians make available a world of information resources beyond their traditional print materials. Internet-connected computers in libraries greatly enhance the opportunity for patrons to enjoy the benefits of the digital age. Unfortunately, as hackers become more sophisticated and…

  10. Implementing Computer Algebra Enabled Questions for the Assessment and Learning of Mathematics

    ERIC Educational Resources Information Center

    Sangwin, Christopher J.; Naismith, Laura

    2008-01-01

    We present principles for the design of an online system to support computer algebra enabled questions for use within the teaching and learning of mathematics in higher education. The introduction of a computer algebra system (CAS) into a computer aided assessment (CAA) system affords sophisticated response processing of student provided answers.…

  11. How the strengths of Lisp-family languages facilitate building complex and flexible bioinformatics applications

    PubMed Central

    Khomtchouk, Bohdan B; Weitz, Edmund; Karp, Peter D; Wahlestedt, Claes

    2018-01-01

    We present a rationale for expanding the presence of the Lisp family of programming languages in bioinformatics and computational biology research. Put simply, Lisp-family languages enable programmers to more quickly write programs that run faster than in other languages. Languages such as Common Lisp, Scheme and Clojure facilitate the creation of powerful and flexible software that is required for complex and rapidly evolving domains like biology. We will point out several important key features that distinguish languages of the Lisp family from other programming languages, and we will explain how these features can aid researchers in becoming more productive and creating better code. We will also show how these features make these languages ideal tools for artificial intelligence and machine learning applications. We will specifically stress the advantages of domain-specific languages (DSLs): languages that are specialized to a particular area, and thus not only facilitate easier research problem formulation, but also aid in the establishment of standards and best programming practices as applied to the specific research field at hand. DSLs are particularly easy to build in Common Lisp, the most comprehensive Lisp dialect, which is commonly referred to as the ‘programmable programming language’. We are convinced that Lisp grants programmers unprecedented power to build increasingly sophisticated artificial intelligence systems that may ultimately transform machine learning and artificial intelligence research in bioinformatics and computational biology. PMID:28040748

  12. How the strengths of Lisp-family languages facilitate building complex and flexible bioinformatics applications.

    PubMed

    Khomtchouk, Bohdan B; Weitz, Edmund; Karp, Peter D; Wahlestedt, Claes

    2018-05-01

    We present a rationale for expanding the presence of the Lisp family of programming languages in bioinformatics and computational biology research. Put simply, Lisp-family languages enable programmers to more quickly write programs that run faster than in other languages. Languages such as Common Lisp, Scheme and Clojure facilitate the creation of powerful and flexible software that is required for complex and rapidly evolving domains like biology. We will point out several important key features that distinguish languages of the Lisp family from other programming languages, and we will explain how these features can aid researchers in becoming more productive and creating better code. We will also show how these features make these languages ideal tools for artificial intelligence and machine learning applications. We will specifically stress the advantages of domain-specific languages (DSLs): languages that are specialized to a particular area, and thus not only facilitate easier research problem formulation, but also aid in the establishment of standards and best programming practices as applied to the specific research field at hand. DSLs are particularly easy to build in Common Lisp, the most comprehensive Lisp dialect, which is commonly referred to as the 'programmable programming language'. We are convinced that Lisp grants programmers unprecedented power to build increasingly sophisticated artificial intelligence systems that may ultimately transform machine learning and artificial intelligence research in bioinformatics and computational biology.

  13. Evolution of Computational Toxicology-from Primitive ...

    EPA Pesticide Factsheets

    Presentation at the Health Canada seminar in Ottawa, ON, Canada on Nov. 15, 2016, on the Evolution of Computational Toxicology-from Primitive Beginnings to Sophisticated Application

  14. Evaluating coastal and river valley communities evacuation network performance using macroscopic productivity.

    DOT National Transportation Integrated Search

    2017-06-30

    The ever-increasing processing speed and computational power of computers and simulation systems has led to correspondingly larger, more sophisticated representations of evacuation traffic processes. Today, micro-level analyses can be conducted for m...

  15. Computers for Interactive Learning.

    ERIC Educational Resources Information Center

    Grabowski, Barbara; Aggen, William

    1984-01-01

    Analyzes features of computer-based interactive video including sophisticated answer judging, diagnostic feedback, simulation, animation, audible tones, touch sensitive screen, function keys, and video enhancements, and matches these to the characteristics and pedagogical styles of learners. The learner characteristics discussed include internal…

  16. Interactive Forecasting with the National Weather Service River Forecast System

    NASA Technical Reports Server (NTRS)

    Smith, George F.; Page, Donna

    1993-01-01

    The National Weather Service River Forecast System (NWSRFS) consists of several major hydrometeorologic subcomponents to model the physics of the flow of water through the hydrologic cycle. The entire NWSRFS currently runs in both mainframe and minicomputer environments, using command oriented text input to control the system computations. As computationally powerful and graphically sophisticated scientific workstations became available, the National Weather Service (NWS) recognized that a graphically based, interactive environment would enhance the accuracy and timeliness of NWS river and flood forecasts. Consequently, the operational forecasting portion of the NWSRFS has been ported to run under a UNIX operating system, with X windows as the display environment on a system of networked scientific workstations. In addition, the NWSRFS Interactive Forecast Program was developed to provide a graphical user interface to allow the forecaster to control NWSRFS program flow and to make adjustments to forecasts as necessary. The potential market for water resources forecasting is immense and largely untapped. Any private company able to market the river forecasting technologies currently developed by the NWS Office of Hydrology could provide benefits to many information users and profit from providing these services.

  17. Development and preliminary validation of an index for indicating the risks of the design of working hours to health and wellbeing.

    PubMed

    Schomann, Carsten; Giebel, Ole; Nachreiner, Friedhelm

    2006-01-01

    BASS 4, a computer program for the design and evaluation of working hours, is an example of an ergonomics-based software tool that can be used by safety practitioners at the shop floor with regard to legal, ergonomic, and economic criteria. Based on experiences with this computer program, a less sophisticated Working-Hours-Risk Index for assessing the quality of work schedules (including flexible work hours) and indicating risks to health and wellbeing has been developed, to provide a quick and easily applicable tool for legally required risk assessments. The results of a validation study show that this risk index seems to be a promising indicator for predicting risks to health and wellbeing. The purpose of the Risk Index is to simplify the evaluation process at the shop floor and provide some more general information about the quality of a work schedule that can be used for triggering preventive interventions. Such a risk index complies with practitioners' expectations and requests for easy, useful, and valid instruments.

  18. BOOK REVIEW: Mathematica for Theoretical Physics: Electrodynamics, Quantum Mechanics, General Relativity and Fractals

    NASA Astrophysics Data System (ADS)

    Heusler, Stefan

    2006-12-01

    The main focus of the second, enlarged edition of the book Mathematica for Theoretical Physics is on computational examples using the computer program Mathematica in various areas in physics. It is a notebook rather than a textbook. Indeed, the book is just a printout of the Mathematica notebooks included on the CD. The second edition is divided into two volumes, the first covering classical mechanics and nonlinear dynamics, the second dealing with examples in electrodynamics, quantum mechanics, general relativity and fractal geometry. The second volume is not suited for newcomers because basic and simple physical ideas which lead to complex formulas are not explained in detail. Instead, the computer technology makes it possible to write down and manipulate formulas of practically any length. For researchers with experience in computing, the book contains a lot of interesting and non-trivial examples. Most of the examples discussed are standard textbook problems, but the power of Mathematica opens the path to more sophisticated solutions. For example, the exact solution for the perihelion shift of Mercury within general relativity is worked out in detail using elliptic functions. The virial equation of state for molecules' interaction with Lennard-Jones-like potentials is discussed, including both classical and quantum corrections to the second virial coefficient. Interestingly, closed solutions become available using sophisticated computing methods within Mathematica. In my opinion, the textbook should not show formulas in detail which cover three or more pages—these technical data should just be contained on the CD. Instead, the textbook should focus on more detailed explanation of the physical concepts behind the technicalities. The discussion of the virial equation would benefit much from replacing 15 pages of Mathematica output with 15 pages of further explanation and motivation. In this combination, the power of computing merged with physical intuition would be of benefit even for newcomers. In summary, this book shows in a convincing manner how classical problems in physics can be attacked with modern computing technology. The second volume is interesting for experienced users of Mathematica. For students, the textbook can be very useful in combination with a seminar.

  19. The Einstein Center for Epigenomics: studying the role of epigenomic dysregulation in human disease.

    PubMed

    McLellan, Andrew S; Dubin, Robert A; Jing, Qiang; Maqbool, Shahina B; Olea, Raul; Westby, Gael; Broin, Pilib Ó; Fazzari, Melissa J; Zheng, Deyou; Suzuki, Masako; Greally, John M

    2009-10-01

    There is increasing interest in the role of epigenetic and transcriptional dysregulation in the pathogenesis of a range of human diseases, not just in the best-studied example of cancer. It is, however, quite difficult for an individual investigator to perform these studies, as they involve genome-wide molecular assays combined with sophisticated computational analytical approaches of very large datasets that may be generated from various resources and technologies. In 2008, the Albert Einstein College of Medicine in New York, USA established a Center for Epigenomics to facilitate the research programs of its investigators, providing shared resources for genome-wide assays and for data analysis. As a result, several avenues of research are now expanding, with cancer epigenomics being complemented by studies of the epigenomics of infectious disease and a neuroepigenomics program.

  20. Structural variation discovery in the cancer genome using next generation sequencing: Computational solutions and perspectives

    PubMed Central

    Liu, Biao; Conroy, Jeffrey M.; Morrison, Carl D.; Odunsi, Adekunle O.; Qin, Maochun; Wei, Lei; Trump, Donald L.; Johnson, Candace S.; Liu, Song; Wang, Jianmin

    2015-01-01

    Somatic Structural Variations (SVs) are a complex collection of chromosomal mutations that could directly contribute to carcinogenesis. Next Generation Sequencing (NGS) technology has emerged as the primary means of interrogating the SVs of the cancer genome in recent investigations. Sophisticated computational methods are required to accurately identify the SV events and delineate their breakpoints from the massive amounts of reads generated by an NGS experiment. In this review, we provide an overview of current analytic tools used for SV detection in NGS-based cancer studies. We summarize the features of common SV groups and the primary types of NGS signatures that can be used in SV detection methods. We discuss the principles and key similarities and differences of existing computational programs and comment on unresolved issues related to this research field. The aim of this article is to provide a practical guide of relevant concepts, computational methods, software tools and important factors for analyzing and interpreting NGS data for the detection of SVs in the cancer genome. PMID:25849937

  1. AOPs & Biomarkers: Bridging High Throughput Screening and Regulatory Decision Making.

    EPA Science Inventory

    As high throughput screening (HTS) approaches play a larger role in toxicity testing, computational toxicology has emerged as a critical component in interpreting the large volume of data produced. Computational models for this purpose are becoming increasingly more sophisticated...

  2. A Computer Analysis of Library Postcards. (CALP)

    ERIC Educational Resources Information Center

    Stevens, Norman D.

    1974-01-01

    A description of a sophisticated application of computer techniques to the analysis of a collection of picture postcards of library buildings in an attempt to establish the minimum architectural requirements needed to distinguish one style of library building from another. (Author)

  3. Virtual Computing Laboratories: A Case Study with Comparisons to Physical Computing Laboratories

    ERIC Educational Resources Information Center

    Burd, Stephen D.; Seazzu, Alessandro F.; Conway, Christopher

    2009-01-01

    Current technology enables schools to provide remote or virtual computing labs that can be implemented in multiple ways ranging from remote access to banks of dedicated workstations to sophisticated access to large-scale servers hosting virtualized workstations. This paper reports on the implementation of a specific lab using remote access to…

  4. Factors Affecting Utilization of Information Output of Computer-Based Modeling Procedures in Local Government Organizations.

    ERIC Educational Resources Information Center

    Komsky, Susan

    Fiscal Impact Budgeting Systems (FIBS) are sophisticated computer based modeling procedures used in local government organizations, whose results, however, are often overlooked or ignored by decision makers. A study attempted to discover the reasons for this situation by focusing on four factors: potential usefulness, faith in computers,…

  5. Use of Computer-Assisted Technologies (CAT) to Enhance Social, Communicative, and Language Development in Children with Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Ploog, Bertram O.; Scharf, Alexa; Nelson, DeShawn; Brooks, Patricia J.

    2013-01-01

    Major advances in multimedia computer technology over the past decades have made sophisticated computer games readily available to the public. This, combined with the observation that most children, including those with autism spectrum disorders (ASD), show an affinity to computers, has led researchers to recognize the potential of computer…

  6. Chemical Education from Programs for Learning, Inc.

    ERIC Educational Resources Information Center

    Petrich, James A.

    1981-01-01

    This software review focuses on five concept-related packages of programs in the Apple version, which are viewed as well-written in terms of both educational sophistication and programming expertise. (MP)

  7. Career transitions for persons with severe physical disabilities: integrating technological and psychosocial skills and accommodations.

    PubMed

    Lash, M; Licenziato, V

    1995-01-01

    This article describes a vocational training program entitled, 'Careers in Automation for Persons with Severe Physical Disabilities', that was developed by the Department of Physical Medicine and Rehabilitation at Tufts University School of Medicine in collaboration with the Massachusetts Rehabilitation Commission. Its goal is to secure employment for individuals with severe physical impairments by using computers and technology as job related accommodations. Psychosocial, educational, and vocational profiles are presented for 24 clients over 4 years. Three case studies involving persons with traumatic, chronic and developmental disabilities illustrate the importance of matching technological accommodations with employer needs and personal preferences. Discussion of employment outcomes illustrates that the effective use of computers and technology by persons with disabilities is best measured not by the degree of sophistication and engineering of systems and devices, but by employer and employee satisfaction with job performance and productivity.

  8. Study on the application of NASA energy management techniques for control of a terrestrial solar water heating system

    NASA Technical Reports Server (NTRS)

    Swanson, T. D.; Ollendorf, S.

    1979-01-01

    This paper addresses the potential for enhanced solar system performance through sophisticated control of the collector loop flow rate. Computer simulations utilizing the TRNSYS solar energy program were performed to study the relative effect on system performance of eight specific control algorithms. Six of these control algorithms are of the proportional type: two are concave exponentials, two are simple linear functions, and two are convex exponentials. These six functions are typical of what might be expected from future, more advanced, controllers. The other two algorithms are of the on/off type and are thus typical of existing control devices. Results of extensive computer simulations utilizing actual weather data indicate that proportional control does not significantly improve system performance. However, it is shown that thermal stratification in the liquid storage tank may significantly improve performance.
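
    The difference between the two families of control algorithms studied can be sketched as simple flow-rate laws mapping the collector-to-storage temperature difference to a pump flow fraction; the thresholds and gains below are illustrative, not the values used in the TRNSYS simulations.

      def on_off_flow(delta_t, on_above=8.0, off_below=2.0, prev_on=False):
          """Bang-bang control with hysteresis: pump fully on or fully off (fraction of max flow)."""
          if delta_t >= on_above:
              return 1.0, True
          if delta_t <= off_below or not prev_on:
              return 0.0, False
          return 1.0, True        # stay on inside the hysteresis band

      def proportional_flow(delta_t, full_flow_at=10.0):
          """Linear proportional control: flow rises linearly with the collector-storage delta-T."""
          return min(max(delta_t / full_flow_at, 0.0), 1.0)

      for dT in (0.0, 3.0, 6.0, 12.0):
          print(dT, on_off_flow(dT, prev_on=True)[0], round(proportional_flow(dT), 2))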

  9. Human-machine interface hardware: The next decade

    NASA Technical Reports Server (NTRS)

    Marcus, Elizabeth A.

    1991-01-01

    In order to understand where human-machine interface hardware is headed, it is important to understand where we are today, how we got there, and what our goals for the future are. As computers become more capable, faster, and programs become more sophisticated, it becomes apparent that the interface hardware is the key to an exciting future in computing. How can a user interact and control a seemingly limitless array of parameters effectively? Today, the answer is most often a limitless array of controls. The link between these controls and human sensory motor capabilities does not utilize existing human capabilities to their full extent. Interface hardware for teleoperation and virtual environments is now facing a crossroad in design. Therefore, we as developers need to explore how the combination of interface hardware, human capabilities, and user experience can be blended to get the best performance today and in the future.

  10. Preliminary design methods for fiber reinforced composite structures employing a personal computer

    NASA Technical Reports Server (NTRS)

    Eastlake, C. N.

    1986-01-01

    The objective of this project was to develop a user-friendly interactive computer program to be used as an analytical tool by structural designers. Its intent was to do preliminary, approximate stress analysis to help select or verify sizing choices for composite structural members. The approach to the project was to provide a subroutine which uses classical lamination theory to predict an effective elastic modulus for a laminate of arbitrary material and ply orientation. This effective elastic modulus can then be used in a family of other subroutines which employ the familiar basic structural analysis methods for isotropic materials. This method is simple and convenient to use but only approximate, as is appropriate for a preliminary design tool which will be subsequently verified by more sophisticated analysis. Additional subroutines have been provided to calculate laminate coefficient of thermal expansion and to calculate ply-by-ply strains within a laminate.
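
    A minimal sketch of the classical-lamination-theory step described, an effective in-plane modulus from ply properties and orientations; the ply data and layup are illustrative (roughly graphite/epoxy), and this is not the original program, which was a set of structural-analysis subroutines.

      import numpy as np

      # Illustrative unidirectional ply properties, SI units (Pa), and a symmetric layup.
      E1, E2, G12, nu12 = 140e9, 10e9, 5e9, 0.3
      t_ply = 0.125e-3
      layup = [0, 45, -45, 90, 90, -45, 45, 0]          # ply angles in degrees

      nu21 = nu12 * E2 / E1
      d = 1.0 - nu12 * nu21
      Q = np.array([[E1 / d, nu12 * E2 / d, 0.0],
                    [nu12 * E2 / d, E2 / d, 0.0],
                    [0.0, 0.0, G12]])

      def q_bar(theta_deg):
          """Ply reduced stiffness rotated to the laminate axes: Qbar = T^-1 Q R T R^-1."""
          c, s = np.cos(np.radians(theta_deg)), np.sin(np.radians(theta_deg))
          T = np.array([[c*c, s*s, 2*c*s],
                        [s*s, c*c, -2*c*s],
                        [-c*s, c*s, c*c - s*s]])
          R = np.diag([1.0, 1.0, 2.0])
          return np.linalg.inv(T) @ Q @ R @ T @ np.linalg.inv(R)

      h = t_ply * len(layup)
      A = sum(q_bar(theta) * t_ply for theta in layup)   # in-plane (membrane) stiffness matrix

      # Effective axial modulus of the laminate (membrane behavior, symmetric layup).
      Ex = 1.0 / (h * np.linalg.inv(A)[0, 0])
      print(f"Effective Ex ~ {Ex / 1e9:.1f} GPa")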

  11. Verification of Software: The Textbook and Real Problems

    NASA Technical Reports Server (NTRS)

    Carlson, Jan-Renee

    2006-01-01

    The process of verification, or determining the order of accuracy of computational codes, can be problematic when working with large, legacy computational methods that have been used extensively in industry or government. Verification does not ensure that the computer program is producing a physically correct solution; it ensures merely that the observed order of accuracy of solutions is the same as the theoretical order of accuracy. The Method of Manufactured Solutions (MMS) is one of several ways of determining the order of accuracy. MMS is used to verify a series of computer codes progressing in sophistication from "textbook" to "real life" applications. The degree of numerical precision in the computations considerably influenced the range of mesh density needed to achieve the theoretical order of accuracy, even for 1-D problems. The choice of manufactured solutions and mesh form shifted the observed order in specific areas but not in general. Solution residual (iterative) convergence was not always achieved for 2-D Euler manufactured solutions. L2-norm convergence differed from variable to variable; therefore, an observed order of accuracy could not be determined conclusively in all cases. The cause is currently under investigation.
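
    The observed order of accuracy reported by MMS verification is computed from discretization errors on successively refined meshes; because the manufactured solution is known exactly, the error is directly computable. A minimal sketch with hypothetical error values (the formula, not the data, is the point):

      import math

      # Hypothetical L2 discretization errors against a manufactured solution on a
      # sequence of uniformly refined meshes (refinement ratio r = 2).
      mesh_spacings = [0.04, 0.02, 0.01, 0.005]
      l2_errors = [3.2e-3, 8.3e-4, 2.1e-4, 5.3e-5]     # illustrative, roughly 2nd order

      for (h_c, h_f), (e_c, e_f) in zip(zip(mesh_spacings, mesh_spacings[1:]),
                                        zip(l2_errors, l2_errors[1:])):
          r = h_c / h_f
          p_observed = math.log(e_c / e_f) / math.log(r)   # observed order of accuracy
          print(f"h: {h_c} -> {h_f}   observed order p = {p_observed:.2f}")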

  12. Advantages and Disadvantages in Image Processing with Free Software in Radiology.

    PubMed

    Mujika, Katrin Muradas; Méndez, Juan Antonio Juanes; de Miguel, Andrés Framiñan

    2018-01-15

    Currently, there are sophisticated applications that make it possible to visualize medical images and even to manipulate them. These software applications are of great interest, both from a teaching and a radiological perspective. In addition, some of these applications are known as Free Open Source Software because they are free and the source code is freely available, and therefore it can be easily obtained even on personal computers. Two examples of free open source software are Osirix Lite® and 3D Slicer®. However, this last group of free applications has limitations in its use. For the radiological field, manipulating and post-processing images is increasingly important. Consequently, sophisticated computing tools that combine software and hardware to process medical images are needed. In radiology, graphic workstations allow their users to process, review, analyse, communicate and exchange multidimensional digital images acquired with different image-capturing radiological devices. These radiological devices are basically CT (Computerised Tomography), MRI (Magnetic Resonance Imaging), PET (Positron Emission Tomography), etc. Nevertheless, the programs included in these workstations have a high cost which always depends on the software provider and is always subject to its norms and requirements. With this study, we aim to present the advantages and disadvantages of these radiological image visualization systems in the advanced management of radiological studies. We will compare the features of the VITREA2® and AW VolumeShare 5® radiology workstations with free open source software applications like OsiriX® and 3D Slicer®, with examples from specific studies.

  13. Progress in Computational Electron-Molecule Collisions

    NASA Astrophysics Data System (ADS)

    Rescigno, Tn

    1997-10-01

    The past few years have witnessed tremendous progress in the development of sophisticated ab initio methods for treating collisions of slow electrons with isolated small molecules. Researchers in this area have benefited greatly from advances in computer technology; indeed, the advent of parallel computers has made it possible to carry out calculations at a level of sophistication inconceivable a decade ago. But bigger and faster computers are only part of the picture. Even with today's computers, the practical need to study electron collisions with the kinds of complex molecules and fragments encountered in real-world plasma processing environments is taxing present methods beyond their current capabilities. Since extrapolation of existing methods to handle increasingly larger targets will ultimately fail as it would require computational resources beyond any imagined, continued progress must also be linked to new theoretical developments. Some of the techniques recently introduced to address these problems will be discussed and illustrated with examples of electron-molecule collision calculations we have carried out on some fairly complex target gases encountered in processing plasmas. Electron-molecule scattering continues to pose many formidable theoretical and computational challenges. I will touch on some of the outstanding open questions.

  14. New Ways of Using Computers in Language Teaching. New Ways in TESOL Series II. Innovative Classroom Techniques.

    ERIC Educational Resources Information Center

    Boswood, Tim, Ed.

    A collection of classroom approaches and activities using computers for language learning is presented. Some require sophisticated installations, but most do not, and most use software readily available on most workplace computer systems. The activities were chosen because they use sound language learning strategies. The book is divided into five…

  15. Home, Hearth and Computing.

    ERIC Educational Resources Information Center

    Seelig, Anita

    1982-01-01

    Advantages of having children use microcomputers at school and home include learning about sophisticated concepts early in life without a great deal of prodding, playing games that expand knowledge, and becoming literate in computer knowledge needed later in life. Includes comments from parents on their experiences with microcomputers and…

  16. Elliptic Length Scales in Laminar, Two-Dimensional Supersonic Flows

    DTIC Science & Technology

    2015-06-01

    sophisticated computational fluid dynamics (CFD) methods. Additionally, for 3D interactions, the length scales would require determination in spanwise as well... Manna, M., “Experimental, Analytical, and Computational Methods Applied to Hypersonic Compression Ramp Flows,” AIAA Journal, Vol. 32, No. 2, Feb. 1994

  17. 32 CFR 806b.35 - Balancing protection.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., Computer Security, 5 for procedures on safeguarding personal information in automated records. 5 http://www... automated system with a log-on protocol. Others may require more sophisticated security protection based on the sensitivity of the information. Classified computer systems or those with established audit and...

  18. 32 CFR 806b.35 - Balancing protection.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., Computer Security, 5 for procedures on safeguarding personal information in automated records. 5 http://www... automated system with a log-on protocol. Others may require more sophisticated security protection based on the sensitivity of the information. Classified computer systems or those with established audit and...

  19. 32 CFR 806b.35 - Balancing protection.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ..., Computer Security, 5 for procedures on safeguarding personal information in automated records. 5 http://www... automated system with a log-on protocol. Others may require more sophisticated security protection based on the sensitivity of the information. Classified computer systems or those with established audit and...

  20. 32 CFR 806b.35 - Balancing protection.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., Computer Security, 5 for procedures on safeguarding personal information in automated records. 5 http://www... automated system with a log-on protocol. Others may require more sophisticated security protection based on the sensitivity of the information. Classified computer systems or those with established audit and...

  1. 32 CFR 806b.35 - Balancing protection.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., Computer Security, 5 for procedures on safeguarding personal information in automated records. 5 http://www... automated system with a log-on protocol. Others may require more sophisticated security protection based on the sensitivity of the information. Classified computer systems or those with established audit and...

  2. Development of BEM for ceramic composites

    NASA Technical Reports Server (NTRS)

    Henry, D. P.; Banerjee, P. K.; Dargush, G. F.

    1990-01-01

    Details on the progress made during the first three years of a five-year program towards the development of a boundary element code are presented. This code was designed for the micromechanical studies of advanced ceramic composites. Additional effort was made in generalizing the implementation to allow the program to be applicable to real problems in the aerospace industry. The ceramic composite formulations developed were implemented in the three-dimensional boundary element computer code BEST3D. BEST3D was adopted as the base for the ceramic composite program, so that many of the enhanced features of this general purpose boundary element code could be utilized. Some of these facilities include sophisticated numerical integration, the capability of local definition of boundary conditions, and the use of quadratic shape functions for modeling geometry and field variables on the boundary. The multi-region implementation permits a body to be modeled in substructural parts, thus dramatically reducing the cost of the analysis. Furthermore, it allows a body consisting of regions of different ceramic matrices and inserts to be studied.

  3. Simulation and animation of sensor-driven robots.

    PubMed

    Chen, C; Trivedi, M M; Bidlack, C R

    1994-10-01

    Most simulation and animation systems utilized in robotics are concerned with simulation of the robot and its environment without simulation of sensors. These systems have difficulty in handling robots that utilize sensory feedback in their operation. In this paper, a new design of an environment for simulation, animation, and visualization of sensor-driven robots is presented. As sensor technology advances, increasing numbers of robots are equipped with various types of sophisticated sensors. The main goal of creating the visualization environment is to aid the automatic robot programming and off-line programming capabilities of sensor-driven robots. The software system will help the users visualize the motion and reaction of the sensor-driven robot under their control program. Therefore, the efficiency of software development is increased, the reliability of the software and the operation safety of the robot are ensured, and the cost of new software development is reduced. Conventional computer-graphics-based robot simulation and animation software packages lack capabilities for robot sensing simulation. This paper describes a system designed to overcome this deficiency.

  4. The interactive digital video interface

    NASA Technical Reports Server (NTRS)

    Doyle, Michael D.

    1989-01-01

    A frequent complaint in the computer oriented trade journals is that current hardware technology is progressing so quickly that software developers cannot keep up. An example of this phenomenon can be seen in the field of microcomputer graphics. To exploit the advantages of new mechanisms of information storage and retrieval, new approaches must be made towards incorporating existing programs as well as developing entirely new applications. A particular area of need is the correlation of discrete image elements to textual information. The interactive digital video (IDV) interface embodies a new concept in software design which addresses these needs. The IDV interface is a patented, device- and language-independent process for identifying image features on a digital video display which allows a number of different processes to be keyed to that identification. Its capabilities include the correlation of discrete image elements to relevant text information and the correlation of these image features to other images as well as to program control mechanisms. Sophisticated interrelationships can be set up between images, text, and program control mechanisms.

  5. Enhancing Web applications in radiology with Java: estimating MR imaging relaxation times.

    PubMed

    Dagher, A P; Fitzpatrick, M; Flanders, A E; Eng, J

    1998-01-01

    Java is a relatively new programming language that has been used to develop a World Wide Web-based tool for estimating magnetic resonance (MR) imaging relaxation times, thereby demonstrating how Java may be used for Web-based radiology applications beyond improving the user interface of teaching files. A standard processing algorithm coded with Java is downloaded along with the hypertext markup language (HTML) document. The user (client) selects the desired pulse sequence and inputs data obtained from a region of interest on the MR images. The algorithm is used to modify selected MR imaging parameters in an equation that models the phenomenon being evaluated. MR imaging relaxation times are estimated, and confidence intervals and a P value expressing the accuracy of the final results are calculated. Design features such as simplicity, object-oriented programming, and security restrictions allow Java to expand the capabilities of HTML by offering a more versatile user interface that includes dynamic annotations and graphics. Java also allows the client to perform more sophisticated information processing and computation than is usually associated with Web applications. Java is likely to become a standard programming option, and the development of stand-alone Java applications may become more common as Java is integrated into future versions of computer operating systems.
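
    The kind of relaxation-time estimate described can be sketched as a mono-exponential fit to region-of-interest intensities; the echo times, signal values, and use of SciPy below are illustrative assumptions, not the original Java applet, which also reports confidence intervals and a P value.

      import numpy as np
      from scipy.optimize import curve_fit

      # Region-of-interest signal intensities at several echo times (illustrative data).
      te_ms = np.array([15.0, 30.0, 60.0, 90.0, 120.0])
      signal = np.array([870.0, 705.0, 463.0, 305.0, 200.0])

      def spin_echo(te, s0, t2):
          """Mono-exponential spin-echo signal model S(TE) = S0 * exp(-TE / T2)."""
          return s0 * np.exp(-te / t2)

      popt, pcov = curve_fit(spin_echo, te_ms, signal, p0=(1000.0, 80.0))
      s0_hat, t2_hat = popt
      t2_err = np.sqrt(np.diag(pcov))[1]

      # Rough 95% confidence interval from the covariance of the fit.
      print(f"T2 ~ {t2_hat:.1f} ms (95% CI about +/- {1.96 * t2_err:.1f} ms)")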

  6. Solid oxide fuel cell simulation and design optimization with numerical adjoint techniques

    NASA Astrophysics Data System (ADS)

    Elliott, Louie C.

    This dissertation reports on the application of numerical optimization techniques as applied to fuel cell simulation and design. Due to the "multi-physics" inherent in a fuel cell, which results in a highly coupled and non-linear behavior, an experimental program to analyze and improve the performance of fuel cells is extremely difficult. This program applies new optimization techniques with computational methods from the field of aerospace engineering to the fuel cell design problem. After an overview of fuel cell history, importance, and classification, a mathematical model of solid oxide fuel cells (SOFC) is presented. The governing equations are discretized and solved with computational fluid dynamics (CFD) techniques including unstructured meshes, non-linear solution methods, numerical derivatives with complex variables, and sensitivity analysis with adjoint methods. Following the validation of the fuel cell model in 2-D and 3-D, the results of the sensitivity analysis are presented. The sensitivity derivative for a cost function with respect to a design variable is found with three increasingly sophisticated techniques: finite difference, direct differentiation, and adjoint. A design cycle is performed using a simple optimization method to improve the value of the implemented cost function. The results from this program could improve fuel cell performance and lessen the world's dependence on fossil fuels.
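
    Of the three sensitivity techniques mentioned, the complex-variable (complex-step) derivative is easy to illustrate next to a forward finite difference; the toy cost function below merely stands in for a fuel-cell performance metric and is not from the dissertation.

      import numpy as np

      def cost(x):
          """Toy smooth cost function standing in for a performance metric."""
          return np.exp(x) * np.sin(x)

      def d_cost_exact(x):
          """Analytic derivative, used only to measure the error of each estimate."""
          return np.exp(x) * (np.sin(x) + np.cos(x))

      x0 = 1.5
      for h in (1e-2, 1e-8, 1e-20):
          fd = (cost(x0 + h) - cost(x0)) / h        # forward finite difference
          cs = np.imag(cost(x0 + 1j * h)) / h       # complex-step derivative
          # The finite difference suffers subtractive cancellation as h shrinks;
          # the complex step does not, so it stays accurate even at h = 1e-20.
          print(f"h={h:.0e}  FD err={abs(fd - d_cost_exact(x0)):.2e}  "
                f"CS err={abs(cs - d_cost_exact(x0)):.2e}")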

  7. Space Spurred Computer Graphics

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Dicomed Corporation was asked by NASA in the early 1970s to develop processing capabilities for recording images sent from Mars by Viking spacecraft. The company produced a film recorder which increased the intensity levels and the capability for color recording. This development led to a strong technology base resulting in sophisticated computer graphics equipment. Dicomed systems are used to record CAD (computer aided design) and CAM (computer aided manufacturing) equipment, to update maps and produce computer generated animation.

  8. The Application of a Massively Parallel Computer to the Simulation of Electrical Wave Propagation Phenomena in the Heart Muscle Using Simplified Models

    NASA Technical Reports Server (NTRS)

    Karpoukhin, Mikhii G.; Kogan, Boris Y.; Karplus, Walter J.

    1995-01-01

    The simulation of heart arrhythmia and fibrillation is a very important and challenging task. The solution of these problems using sophisticated mathematical models is beyond the capabilities of modern super computers. To overcome these difficulties it is proposed to break the whole simulation problem into two tightly coupled stages: generation of the action potential using sophisticated models, and propagation of the action potential using simplified models. The well known simplified models are compared and modified to bring the rate of depolarization and the action potential duration restitution closer to reality. The modified method of lines is used to parallelize the computational process. The conditions for the appearance of 2D spiral waves after the application of a premature beat and the subsequent traveling of the spiral wave inside the simulated tissue are studied.
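
    A minimal method-of-lines sketch of the "simplified model" stage (generic FitzHugh-Nagumo kinetics on a 2D grid with a premature second stimulus); the parameters are textbook-style illustrative values, not the authors' modified model, and no parallelization is shown.

      import numpy as np

      n, dx, dt = 100, 0.5, 0.02
      D, a, eps, beta = 1.0, 0.1, 0.02, 0.5

      v = np.zeros((n, n))          # fast (excitation) variable
      w = np.zeros((n, n))          # slow (recovery) variable
      v[:, :5] = 1.0                # S1 stimulus: plane wave launched from the left edge

      def laplacian(u):
          """5-point Laplacian with no-flux (reflecting) boundaries."""
          padded = np.pad(u, 1, mode="edge")
          return (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                  padded[1:-1, :-2] + padded[1:-1, 2:] - 4.0 * u) / dx**2

      for step in range(4000):
          # Method of lines: space is discretized, leaving a large ODE system in time.
          dv = D * laplacian(v) + v * (1.0 - v) * (v - a) - w
          dw = eps * (beta * v - w)
          v += dt * dv
          w += dt * dw
          if step == 1500:
              v[n // 2:, :] = 0.0   # premature "beat": resetting half the domain can seed a spiral

      print("max v at end:", float(v.max()))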

  9. Fast computation of close-coupling exchange integrals using polynomials in a tree representation

    NASA Astrophysics Data System (ADS)

    Wallerberger, Markus; Igenbergs, Katharina; Schweinzer, Josef; Aumayr, Friedrich

    2011-03-01

    The semi-classical atomic-orbital close-coupling method is a well-known approach for the calculation of cross sections in ion-atom collisions. It strongly relies on the fast and stable computation of exchange integrals. We present an upgrade to earlier implementations of the Fourier-transform method. For this purpose, we implement an extensive library for symbolic storage of polynomials, relying on sophisticated tree structures to allow fast manipulation and numerically stable evaluation. Using this library, we considerably speed up creation and computation of exchange integrals. This enables us to compute cross sections for more complex collision systems. Program summary. Program title: TXINT. Catalogue identifier: AEHS_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHS_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 12 332. No. of bytes in distributed program, including test data, etc.: 157 086. Distribution format: tar.gz. Programming language: Fortran 95. Computer: All with a Fortran 95 compiler. Operating system: All with a Fortran 95 compiler. RAM: Depends heavily on input, usually less than 100 MiB. Classification: 16.10. Nature of problem: Analytical calculation of one- and two-center exchange matrix elements for the close-coupling method in the impact parameter model. Solution method: Similar to the code of Hansen and Dubois [1], we use the Fourier-transform method suggested by Shakeshaft [2] to compute the integrals. However, we heavily speed up the calculation using a library for symbolic manipulation of polynomials. Restrictions: We restrict ourselves to a defined collision system in the impact parameter model. Unusual features: A library for symbolic manipulation of polynomials, where polynomials are stored in a space-saving left-child right-sibling binary tree. This provides stable numerical evaluation and fast mutation while maintaining full compatibility with the original code. Additional comments: This program makes heavy use of the new features provided by the Fortran 90 standard, most prominently pointers, derived types and allocatable structures, and a small portion of Fortran 95. Only newer compilers support these features. The following compilers support all features needed by the program: GNU Fortran Compiler "gfortran" from version 4.3.0; GNU Fortran 95 Compiler "g95" from version 4.2.0; Intel Fortran Compiler "ifort" from version 11.0.
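
    The left-child right-sibling storage idea can be sketched in a few lines (illustrative Python, not the Fortran 95 TXINT library): each node's child points to the sub-polynomial in the next variable and its sibling to the next term in the same variable, so a multivariate polynomial is held in a single compact tree that can be evaluated recursively.

      class TermNode:
          """One term in a left-child right-sibling tree: `child` is the factor in the
          next variable, `sibling` the next term in the same variable (illustrative layout)."""
          def __init__(self, exponent, coeff=0.0):
              self.exponent = exponent
              self.coeff = coeff          # used only at the innermost variable
              self.child = None           # sub-polynomial in the next variable
              self.sibling = None         # next term in the current variable

      def evaluate(node, values, depth=0):
          """Numerically evaluate the polynomial stored in the tree at `values`."""
          total = 0.0
          while node is not None:
              factor = values[depth] ** node.exponent
              inner = node.coeff if node.child is None else evaluate(node.child, values, depth + 1)
              total += factor * inner
              node = node.sibling
          return total

      # p(x, y) = 3*x^2*y + 2*x*y^2 + 5
      x2 = TermNode(2); x2.child = TermNode(1, 3.0)
      x1 = TermNode(1); x1.child = TermNode(2, 2.0)
      x0 = TermNode(0); x0.child = TermNode(0, 5.0)
      x2.sibling, x1.sibling = x1, x0

      print(evaluate(x2, (2.0, 3.0)))     # 3*4*3 + 2*2*9 + 5 = 77.0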

  10. T.I.M.S: TaqMan Information Management System, tools to organize data flow in a genotyping laboratory

    PubMed Central

    Monnier, Stéphanie; Cox, David G; Albion, Tim; Canzian, Federico

    2005-01-01

    Background Single Nucleotide Polymorphism (SNP) genotyping is a major activity in biomedical research. The Taqman technology is one of the most commonly used approaches. It produces large amounts of data that are difficult to process by hand. Laboratories not equipped with a Laboratory Information Management System (LIMS) need tools to organize the data flow. Results We propose a package of Visual Basic programs focused on sample management and on the parsing of input and output TaqMan files. The code is written in Visual Basic, embedded in the Microsoft Office package, and it allows anyone to have access to those tools, without any programming skills and with basic computer requirements. Conclusion We have created useful tools focused on management of TaqMan genotyping data, a critical issue in genotyping laboratories without a more sophisticated and expensive system, such as a LIMS. PMID:16221298

  11. Peering into the Future of Advertising.

    ERIC Educational Resources Information Center

    Hsia, H. J.

    All areas in mass communications (i.e., newspapers, magazines, television, radio, films, photos, and books) will be transformed because of the increasing sophistication of computer users, the decreasing costs for interactive computer systems, and the global adoption of integrated services digital networks (ISDN). ISDN refer to the digitization of…

  12. Improving Undergraduate Computer Instruction: Experiments and Strategies

    ERIC Educational Resources Information Center

    Kalman, Howard K.; Ellis, Maureen L.

    2007-01-01

    Today, undergraduate students enter college with increasingly more sophisticated computer skills compared to their counterparts of 20 years ago. However, many instructors are still using traditional instructional strategies to teach this new generation. This research study discusses a number of strategies that were employed to teach a…

  13. Modifications to an interactive model of the human body during exercise: With special emphasis on thermoregulation. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Scherb, Megan Kay

    1993-01-01

    Since 1988 an interactive computer model of the human body during exercise has been under development by a number of undergraduate students in the Department of Chemical Engineering at Iowa State University. The program, written under the direction of Dr. Richard C. Seagrave, uses physical characteristics of the user, environmental conditions, and activity information to predict the onset of hypothermia, hyperthermia, dehydration, or exhaustion for various levels and durations of a specified exercise. The program, however, was severely limited in predicting the onset of dehydration due to the lack of sophistication with which the program predicts sweat rate and its relationship to sensible water loss, degree of acclimatization, and level of physical training. Additionally, it was not known whether sweat rate also depended on age and gender. For these reasons, the goal of this creative component was to modify the program in the above-mentioned areas by applying known information and empirical relationships from the literature. Furthermore, a secondary goal was to improve the consistency with which the program was written by modifying user input statements and improving the efficiency and logic of the program calculations.

  14. Student Thinking Processes. The Influence of Immediate Computer Access on Students' Thinking. First- and Second-Year Findings. ACOT Report #3.

    ERIC Educational Resources Information Center

    Tierney, Robert J.

    This 2-year longitudinal study explored whether computers promote more sophisticated thinking, and examined how students' thinking changes as they become experienced computer users. The first-year study examined the thinking process of four ninth-grade Apple Classrooms of Tomorrow (ACOT) students. The second-year study continued following these…

  15. Objective and Item Banking Computer Software and Its Use in Comprehensive Achievement Monitoring.

    ERIC Educational Resources Information Center

    Schriber, Peter E.; Gorth, William P.

    The current emphasis on objectives and test item banks for constructing more effective tests is being augmented by increasingly sophisticated computer software. Items can be catalogued in numerous ways for retrieval. The items as well as instructional objectives can be stored and test forms can be selected and printed by the computer. It is also…

  16. An Innovative Learning Model for Computation in First Year Mathematics

    ERIC Educational Resources Information Center

    Tonkes, E. J.; Loch, B. I.; Stace, A. W.

    2005-01-01

    MATLAB is a sophisticated software tool for numerical analysis and visualization. The University of Queensland has adopted Matlab as its official teaching package across large first year mathematics courses. In the past, the package has met severe resistance from students who have not appreciated their computational experience. Several main…

  17. Electronic Networking as an Avenue of Enhanced Professional Interchange.

    ERIC Educational Resources Information Center

    Ratcliff, James L.

    Electronic networking is communication between two or more people that involves one or more telecommunications media. There is electronic networking software available for most computers, including IBM, Apple, and Radio Shack personal computers. Depending upon the sophistication of the hardware and software used, individuals and groups can…

  18. Teaching for CAD Expertise

    ERIC Educational Resources Information Center

    Chester, Ivan

    2007-01-01

    CAD (Computer Aided Design) has now become an integral part of Technology Education. The recent introduction of highly sophisticated, low-cost CAD software and CAM hardware capable of running on desktop computers has accelerated this trend. There is now quite widespread introduction of solid modeling CAD software into secondary schools but how…

  19. Computer Aided Tests.

    ERIC Educational Resources Information Center

    Steinke, Elisabeth

    An approach to using the computer to assemble German tests is described. The purposes of the system would be: (1) an expansion of the bilingual lexical memory bank to list and store idioms of all degrees of difficulty, with frequency data and with complete and sophisticated retrieval possibility for assembly; (2) the creation of an…

  20. Intergenerational Projects: Idea Book.

    ERIC Educational Resources Information Center

    Clay, Rebecca; Ventura-Merkel, Cathy; Eades-Goudy, Dianne; Dubich, Teresa

    This book profiles 74 intergenerational programs in the United States. The programs range from basic tutoring projects to a sophisticated corporate-based day care center. Project selection was based on replicatable programs involving mutually beneficial exchanges. Grouped by subjects, profiles include programs targeting both young and old. Most…

  1. MADANALYSIS 5, a user-friendly framework for collider phenomenology

    NASA Astrophysics Data System (ADS)

    Conte, Eric; Fuks, Benjamin; Serret, Guillaume

    2013-01-01

    We present MADANALYSIS 5, a new framework for phenomenological investigations at particle colliders. Based on a C++ kernel, this program allows us to efficiently perform, in a straightforward and user-friendly fashion, sophisticated physics analyses of event files such as those generated by a large class of Monte Carlo event generators. MADANALYSIS 5 comes with two modes of running. The first one, easier to handle, uses the strengths of a powerful PYTHON interface in order to implement physics analyses by means of a set of intuitive commands. The second one requires one to implement the analyses in the C++ programming language, directly within the core of the analysis framework. This opens unlimited possibilities concerning the level of complexity which can be reached, being only limited by the programming skills and the originality of the user. Program summary: Program title: MadAnalysis 5 Catalogue identifier: AENO_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AENO_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Permission to use, copy, modify and distribute this program is granted under the terms of the GNU General Public License. No. of lines in distributed program, including test data, etc.: 31087 No. of bytes in distributed program, including test data, etc.: 399105 Distribution format: tar.gz Programming language: PYTHON, C++. Computer: All platforms on which Python version 2.7, Root version 5.27 and the g++ compiler are available. Compatibility with newer versions of these programs is also ensured. However, the Python version must be below version 3.0. Operating system: Unix, Linux and Mac OS operating systems on which the above-mentioned versions of Python and Root, as well as g++, are available. Classification: 11.1. External routines: ROOT (http://root.cern.ch/drupal/) Nature of problem: Implementing sophisticated phenomenological analyses in high-energy physics in a flexible, efficient and straightforward fashion, starting from event files such as those produced by Monte Carlo event generators. The event files may or may not have been matched to parton showering, and may or may not have been processed by a (fast) simulation of a detector. Depending on the sophistication level of the event files (parton level, hadron level, reconstructed level), several input formats are possible. Solution method: We implement an interface allowing the production of predefined as well as user-defined histograms for a large class of kinematical distributions after applying a set of event selection cuts specified by the user. This therefore allows us to devise robust and novel search strategies for collider experiments, such as those currently running at the Large Hadron Collider at CERN, in a very efficient way. Restrictions: Unsupported event file format. Unusual features: The code is fully based on object representations for events, particles, reconstructed objects and cuts, which facilitates the implementation of an analysis. Running time: It depends on the purposes of the user and on the number of events to process. It varies from a few seconds to the order of a minute for several million events.
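
    The PYTHON mode of MADANALYSIS 5 expresses an analysis as selection cuts followed by histogramming; the package's actual command syntax is documented with the code and is not reproduced here. As a library-agnostic illustration of that cut-then-histogram workflow, the following Python sketch runs over made-up event records.

```python
import random

# Generic cut-and-histogram workflow, not the MadAnalysis 5 API.
# Each "event" is a list of (pt, eta) tuples standing in for reconstructed jets.
random.seed(1)
events = [[(random.expovariate(1/40.0), random.uniform(-4, 4))
           for _ in range(random.randint(1, 5))] for _ in range(10000)]

def select(events, pt_min=20.0, eta_max=2.5):
    """Keep events that contain at least two jets passing the cuts."""
    kept = []
    for jets in events:
        good = [j for j in jets if j[0] > pt_min and abs(j[1]) < eta_max]
        if len(good) >= 2:
            kept.append(good)
    return kept

def histogram(values, nbins=10, lo=0.0, hi=200.0):
    """Fixed-binning histogram of a kinematic quantity."""
    counts = [0]*nbins
    width = (hi - lo)/nbins
    for v in values:
        if lo <= v < hi:
            counts[int((v - lo)//width)] += 1
    return counts

selected = select(events)
leading_pt = [max(j[0] for j in jets) for jets in selected]
print("selected events:", len(selected))
print("leading-jet pT histogram:", histogram(leading_pt))
```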

  2. Early MIMD experience on the CRAY X-MP

    NASA Astrophysics Data System (ADS)

    Rhoades, Clifford E.; Stevens, K. G.

    1985-07-01

    This paper describes some early experience with converting four physics simulation programs to the CRAY X-MP, a current Multiple Instruction, Multiple Data (MIMD) computer consisting of two processors each with an architecture similar to that of the CRAY-1. As a multi-processor, the CRAY X-MP together with the high-speed Solid-state Storage Device (SSD) is an ideal machine upon which to study MIMD algorithms for solving the equations of mathematical physics because it is fast enough to run real problems. The computer programs used in this study are all FORTRAN versions of original production codes. They range in sophistication from a one-dimensional numerical simulation of collisionless plasma to a two-dimensional hydrodynamics code with heat flow to a couple of three-dimensional fluid dynamics codes with varying degrees of viscous modeling. Early research with a dual-processor configuration has shown speed-ups ranging from 1.55 to 1.98. It has been observed that a few simple extensions to FORTRAN allow a typical programmer to achieve a remarkable level of efficiency. These extensions involve the concept of memory local to a concurrent subprogram and memory common to all concurrent subprograms.

  3. Implementation and use of direct-flow connections in a coupled ground-water and surface-water model

    USGS Publications Warehouse

    Swain, Eric D.

    1994-01-01

    The U.S. Geological Survey's MODFLOW finite-difference ground-water flow model has been coupled with three surface-water packages - the MODBRANCH, River, and Stream packages - to simulate surface water and its interaction with ground water. Prior to the development of the coupling packages, the only interaction between these modeling packages was that leakage values could be passed between MODFLOW and the three surface-water packages. To facilitate wider and more flexible uses of the models, a computer program was developed and added to MODFLOW to allow direct flows or stages to be passed between any of the packages and MODFLOW. The flows or stages calculated in one package can be set as boundary discharges or stages to be used in another package. Several modeling packages can be used in the same simulation depending upon the level of sophistication needed in the various reaches being modeled. This computer program is especially useful when any of the River, Stream, or MODBRANCH packages are used to model a river flowing directly into or out of wetlands in direct connection with the aquifer and represented in the model as an aquifer block. A field case study is shown to illustrate an application.
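
    The key idea is that a flow computed by one package can be handed to another package as a boundary condition within the same simulation. The sketch below is a concept-only Python illustration of such a hand-off between a toy stream routine and a toy wetland storage routine; none of the functions, parameter values, or units correspond to MODFLOW code.

```python
# Concept sketch of passing a computed flow from one model component to
# another as a boundary condition each time step. Numbers are arbitrary and
# this is not MODFLOW/MODBRANCH code.
def stream_outflow(stage, conductance=50.0, downstream_stage=1.0):
    """Flow leaving the stream reach (m^3/d), positive toward the wetland."""
    return conductance*(stage - downstream_stage)

def wetland_update(storage, inflow, area=1.0e4, et_rate=0.002, dt=1.0):
    """Update wetland storage (m^3) from routed inflow minus evapotranspiration."""
    return max(storage + dt*(inflow - et_rate*area), 0.0)

stream_stage, wetland_storage = 2.5, 5.0e3
for day in range(5):
    q = stream_outflow(stream_stage)                      # computed by the "stream" routine
    wetland_storage = wetland_update(wetland_storage, q)  # applied as a boundary inflow
    stream_stage -= 0.05                                  # stage recession placeholder
    print(f"day {day}: routed flow {q:.1f} m3/d, wetland storage {wetland_storage:.0f} m3")
```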

  4. BIRCH: a user-oriented, locally-customizable, bioinformatics system.

    PubMed

    Fristensky, Brian

    2007-02-09

    Molecular biologists need sophisticated analytical tools which often demand extensive computational resources. While finding, installing, and using these tools can be challenging, pipelining data from one program to the next is particularly awkward, especially when using web-based programs. At the same time, system administrators tasked with maintaining these tools do not always appreciate the needs of research biologists. BIRCH (Biological Research Computing Hierarchy) is an organizational framework for delivering bioinformatics resources to a user group, scaling from a single lab to a large institution. The BIRCH core distribution includes many popular bioinformatics programs, unified within the GDE (Genetic Data Environment) graphic interface. Of equal importance, BIRCH provides the system administrator with tools that simplify the job of managing a multiuser bioinformatics system across different platforms and operating systems. These include tools for integrating locally-installed programs and databases into BIRCH, and for customizing the local BIRCH system to meet the needs of the user base. BIRCH can also act as a front end to provide a unified view of already-existing collections of bioinformatics software. Documentation for the BIRCH and locally-added programs is merged in a hierarchical set of web pages. In addition to manual pages for individual programs, BIRCH tutorials employ step by step examples, with screen shots and sample files, to illustrate both the important theoretical and practical considerations behind complex analytical tasks. BIRCH provides a versatile organizational framework for managing software and databases, and making these accessible to a user base. Because of its network-centric design, BIRCH makes it possible for any user to do any task from anywhere.

  5. BIRCH: A user-oriented, locally-customizable, bioinformatics system

    PubMed Central

    Fristensky, Brian

    2007-01-01

    Background Molecular biologists need sophisticated analytical tools which often demand extensive computational resources. While finding, installing, and using these tools can be challenging, pipelining data from one program to the next is particularly awkward, especially when using web-based programs. At the same time, system administrators tasked with maintaining these tools do not always appreciate the needs of research biologists. Results BIRCH (Biological Research Computing Hierarchy) is an organizational framework for delivering bioinformatics resources to a user group, scaling from a single lab to a large institution. The BIRCH core distribution includes many popular bioinformatics programs, unified within the GDE (Genetic Data Environment) graphic interface. Of equal importance, BIRCH provides the system administrator with tools that simplify the job of managing a multiuser bioinformatics system across different platforms and operating systems. These include tools for integrating locally-installed programs and databases into BIRCH, and for customizing the local BIRCH system to meet the needs of the user base. BIRCH can also act as a front end to provide a unified view of already-existing collections of bioinformatics software. Documentation for the BIRCH and locally-added programs is merged in a hierarchical set of web pages. In addition to manual pages for individual programs, BIRCH tutorials employ step by step examples, with screen shots and sample files, to illustrate both the important theoretical and practical considerations behind complex analytical tasks. Conclusion BIRCH provides a versatile organizational framework for managing software and databases, and making these accessible to a user base. Because of its network-centric design, BIRCH makes it possible for any user to do any task from anywhere. PMID:17291351

  6. A user view of office automation or the integrated workstation

    NASA Technical Reports Server (NTRS)

    Schmerling, E. R.

    1984-01-01

    Central data bases are useful only if they are kept up to date and easily accessible in an interactive (query) mode rather than in monthly reports that may be out of date and must be searched by hand. The concepts of automatic data capture, data base management and query languages require good communications and readily available work stations to be useful. The minimal necessary work station is a personal computer which can be an important office tool if connected into other office machines and properly integrated into an office system. It has a great deal of flexibility and can often be tailored to suit the tastes, work habits and requirements of the user. Unlike dumb terminals, there is less tendency to saturate a central computer, since its free standing capabilities are available after down loading a selection of data. The PC also permits the sharing of many other facilities, like larger computing power, sophisticated graphics programs, laser printers and communications. It can provide rapid access to common data bases able to provide more up to date information than printed reports. Portable computers can access the same familiar office facilities from anywhere in the world where a telephone connection can be made.

  7. Real-time Java simulations of multiple interference dielectric filters

    NASA Astrophysics Data System (ADS)

    Kireev, Alexandre N.; Martin, Olivier J. F.

    2008-12-01

    An interactive Java applet for real-time simulation and visualization of the transmittance properties of multiple interference dielectric filters is presented. The most commonly used interference filters as well as the state-of-the-art ones are embedded in this platform-independent applet which can serve research and education purposes. The Transmittance applet can be freely downloaded from the site http://cpc.cs.qub.ac.uk. Program summary: Program title: Transmittance Catalogue identifier: AEBQ_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEBQ_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 5778 No. of bytes in distributed program, including test data, etc.: 90 474 Distribution format: tar.gz Programming language: Java Computer: Developed on PC-Pentium platform Operating system: Any Java-enabled OS. Applet was tested on Windows ME, XP, Sun Solaris, Mac OS RAM: Variable Classification: 18 Nature of problem: Sophisticated wavelength selective multiple interference filters can include some tens or even hundreds of dielectric layers. The spectral response of such a stack is not obvious. On the other hand, there is a strong demand from application designers and students to get a quick insight into the properties of a given filter. Solution method: A Java applet was developed for the computation and the visualization of the transmittance of multilayer interference filters. It is simple to use and the embedded filter library can serve educational purposes. Also, its ability to handle complex structures will be appreciated as a useful research and development tool. Running time: Real-time simulations
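
    The summary does not reproduce the applet's algorithm, but the standard way to compute the transmittance of a lossless multilayer stack at normal incidence is the characteristic (transfer) matrix method. The Python sketch below implements that textbook method with an arbitrary example quarter-wave stack; it is not the applet's Java source.

```python
import numpy as np

# Generic transfer (characteristic) matrix method for the transmittance of a
# stack of lossless dielectric layers at normal incidence. Not the applet's
# Java code; the layer data below are arbitrary example values.
def transmittance(wavelength, layers, n_in=1.0, n_sub=1.52):
    """layers: list of (refractive_index, thickness), thickness in the same
    length units as wavelength."""
    M = np.eye(2, dtype=complex)
    for n, d in layers:
        delta = 2.0*np.pi*n*d/wavelength
        M = M @ np.array([[np.cos(delta), 1j*np.sin(delta)/n],
                          [1j*n*np.sin(delta), np.cos(delta)]])
    B, C = M @ np.array([1.0, n_sub])
    denom = n_in*B + C
    return 4.0*n_in*n_sub/abs(denom)**2

# Example: a quarter-wave high/low stack designed for 550 nm.
design = 550.0
hi, lo = 2.35, 1.38
stack = [(hi, design/(4*hi)), (lo, design/(4*lo))]*5
for wl in (450.0, 550.0, 650.0):
    print(wl, "nm ->", round(transmittance(wl, stack), 3))
```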

  8. Marshall information retrieval and display system (MIRADS)

    NASA Technical Reports Server (NTRS)

    Groover, J. L.; Jones, S. C.; King, W. L.

    1974-01-01

    Program for data management system allows sophisticated inquiries while utilizing simplified language. Online system is composed of several programs. System is written primarily in COBOL with routines in ASSEMBLER and FORTRAN V.

  9. Better Crunching: Recommendations for Multivariate Data Analysis Approaches for Program Impact Evaluations

    ERIC Educational Resources Information Center

    Braverman, Marc T.

    2016-01-01

    Extension program evaluations often present opportunities to analyze data in multiple ways. This article suggests that program evaluations can involve more sophisticated data analysis approaches than are often used. On the basis of a hypothetical program scenario and corresponding data set, two approaches to testing for evidence of program impact…

  10. Software Analyzes Complex Systems in Real Time

    NASA Technical Reports Server (NTRS)

    2008-01-01

    Expert system software programs, also known as knowledge-based systems, are computer programs that emulate the knowledge and analytical skills of one or more human experts, related to a specific subject. SHINE (Spacecraft Health Inference Engine) is one such program, a software inference engine (expert system) designed by NASA for the purpose of monitoring, analyzing, and diagnosing both real-time and non-real-time systems. It was developed to meet many of the Agency's demanding and rigorous artificial intelligence goals for current and future needs. NASA developed the sophisticated and reusable software based on the experience and requirements of its Jet Propulsion Laboratory's (JPL) Artificial Intelligence Research Group in developing expert systems for space flight operations, specifically the diagnosis of spacecraft health. It was designed to be efficient enough to operate in demanding real time and in limited hardware environments, and to be utilized by non-expert systems applications written in conventional programming languages. The technology is currently used in several ongoing NASA applications, including the Mars Exploration Rovers and the Spacecraft Health Automatic Reasoning Pilot (SHARP) program for the diagnosis of telecommunication anomalies during the Neptune Voyager Encounter. It is also finding applications outside of the Space Agency.

  11. Quantum computing: a prime modality in neurosurgery's future.

    PubMed

    Lee, Brian; Liu, Charles Y; Apuzzo, Michael L J

    2012-11-01

    With each significant development in the field of neurosurgery, our dependence on computers, small and large, has continuously increased. From something as mundane as bipolar cautery to sophisticated intraoperative navigation with real-time magnetic resonance imaging-assisted surgical guidance, both technologies, however simple or complex, require computational processing power to function. The next frontier for neurosurgery involves developing a greater understanding of the brain and furthering our capabilities as surgeons to directly affect brain circuitry and function. This has come in the form of implantable devices that can electronically and nondestructively influence the cortex and nuclei with the purpose of restoring neuronal function and improving quality of life. We are now transitioning from devices that are turned on and left alone, such as vagus nerve stimulators and deep brain stimulators, to "smart" devices that can listen and react to the body as the situation may dictate. The development of quantum computers and their potential to be thousands, if not millions, of times faster than current "classical" computers, will significantly affect the neurosciences, especially the field of neurorehabilitation and neuromodulation. Quantum computers may advance our understanding of the neural code and, in turn, better develop and program implantable neural devices. When quantum computers reach the point where we can actually implant such devices in patients, the possibilities of what can be done to interface and restore neural function will be limitless. Copyright © 2012 Elsevier Inc. All rights reserved.

  12. Integrating a Narrative Medicine Telephone Interview with Online Life Review Education for Cancer Patients: Lessons Learned and Future Directions

    PubMed Central

    Wise, Meg; Marchand, Lucille; Cleary, James F.; Aeschlimann, Elizabeth; Causier, Daniel

    2012-01-01

    We describe an online narrative and life review education program for cancer patients and the results of a small implementation test to inform future directions for further program development and full-scale evaluation research. The intervention combined three types of psycho-oncology narrative interventions that have been shown to help patients address emotional and existential issues: 1) a physician-led dignity-enhancing telephone interview to elicit the life narrative and delivery of an edited life manuscript, 2) life review education, delivered via 3) a website self-directed instructional materials and expert consultation to help people revise and share their story. Eleven cancer patients tested the intervention and provided feedback in an in-depth exit interview. While everyone said telling and receiving the edited story manuscript was helpful and meaningful, only people with high death salience and prior computer experience used the web tools to enhance and share their story. Computer users prodded us to provide more sophisticated tools and older (>70 years) users needed more staff and family support. We conclude that combining a telephone expert-led interview with online life review education can extend access to integrative oncology services, are most feasible for computer-savvy patients with advanced cancer, and must use platforms that allow patients to upload files and invite their social network. PMID:19476731

  13. Using Web Speech Technology with Language Learning Applications

    ERIC Educational Resources Information Center

    Daniels, Paul

    2015-01-01

    In this article, the author presents the history of human-to-computer interaction based upon the design of sophisticated computerized speech recognition algorithms. Advancements such as the arrival of cloud-based computing and software like Google's Web Speech API allows anyone with an Internet connection and Chrome browser to take advantage of…

  14. Microcomputer Based Computer-Assisted Learning System: CASTLE.

    ERIC Educational Resources Information Center

    Garraway, R. W. T.

    The purpose of this study was to investigate the extent to which a sophisticated computer assisted instruction (CAI) system could be implemented on the type of microcomputer system currently found in the schools. A method was devised for comparing CAI languages and was used to rank five common CAI languages. The highest ranked language, NATAL,…

  15. Multi Agent Systems with Symbiotic Learning and Evolution using GNP

    NASA Astrophysics Data System (ADS)

    Eguchi, Toru; Hirasawa, Kotaro; Hu, Jinglu; Murata, Junichi

    Recently, various attempts relevant to Multi Agent Systems (MAS), one of the most promising systems based on Distributed Artificial Intelligence, have been studied to control large and complicated systems efficiently. Within this trend, Multi Agent Systems with Symbiotic Learning and Evolution, named Masbiole, has been proposed. In Masbiole, symbiotic phenomena among creatures are considered in the process of learning and evolution of MAS, so more flexible and sophisticated solutions than conventional MAS can be expected. In this paper, we apply Masbiole to Iterative Prisoner’s Dilemma Games (IPD Games) using Genetic Network Programming (GNP), which is a newly developed evolutionary computation method for constituting agents. Some characteristics of Masbiole using GNP in IPD Games are clarified.
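
    The abstract names the test bed (iterated prisoner's dilemma games) without giving details of the GNP-evolved agents. As a minimal illustration of that test bed only, the following Python sketch scores two fixed, hand-written strategies with the conventional payoff matrix; the strategies and payoff values are assumptions, not Masbiole or GNP code.

```python
# Iterated prisoner's dilemma bookkeeping with the conventional payoff matrix
# (T=5, R=3, P=1, S=0). The strategies below (tit-for-tat vs. always-defect)
# are simple stand-ins, not GNP-evolved agents.
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(history):
    return "C" if not history else history[-1][1]   # copy opponent's last move

def always_defect(history):
    return "D"

def play(strategy_a, strategy_b, rounds=50):
    history, score_a, score_b = [], 0, 0
    for _ in range(rounds):
        a = strategy_a(history)
        b = strategy_b([(y, x) for x, y in history])  # opponent's point of view
        pa, pb = PAYOFF[(a, b)]
        score_a, score_b = score_a + pa, score_b + pb
        history.append((a, b))
    return score_a, score_b

print(play(tit_for_tat, always_defect))   # -> (49, 54): one sucker payoff, then mutual defection
```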

  16. Evolution of the INMARSAT aeronautical system: Service, system, and business considerations

    NASA Technical Reports Server (NTRS)

    Sengupta, Jay R.

    1995-01-01

    A market-driven approach was adopted to develop enhancements to the Inmarsat-Aeronautical system, to address the requirements of potential new market segments. An evolutionary approach and a well-differentiated product/service portfolio were required, to minimize system upgrade costs and maximize market penetration, respectively. The evolved system definition serves to minimize equipment cost/size/mass for short/medium range aircraft, by reducing the antenna gain requirement and relaxing the performance requirements for non safety-related communications. A validation program involving simulation, laboratory tests, over-satellite tests and flight trials is being conducted to confirm the system definition. Extensive market research has been conducted to determine user requirements and to quantify market demand for future Inmarsat Aero-1 AES, using sophisticated computer-assisted survey techniques.

  17. Undecidability and Irreducibility Conditions for Open-Ended Evolution and Emergence.

    PubMed

    Hernández-Orozco, Santiago; Hernández-Quiroz, Francisco; Zenil, Hector

    2018-01-01

    Is undecidability a requirement for open-ended evolution (OEE)? Using methods derived from algorithmic complexity theory, we propose robust computational definitions of open-ended evolution and the adaptability of computable dynamical systems. Within this framework, we show that decidability imposes absolute limits on the stable growth of complexity in computable dynamical systems. Conversely, systems that exhibit (strong) open-ended evolution must be undecidable, establishing undecidability as a requirement for such systems. Complexity is assessed in terms of three measures: sophistication, coarse sophistication, and busy beaver logical depth. These three complexity measures assign low complexity values to random (incompressible) objects. As time grows, the stated complexity measures allow for the existence of complex states during the evolution of a computable dynamical system. We show, however, that finding these states involves undecidable computations. We conjecture that for similar complexity measures that assign low complexity values, decidability imposes comparable limits on the stable growth of complexity, and that such behavior is necessary for nontrivial evolutionary systems. We show that the undecidability of adapted states imposes novel and unpredictable behavior on the individuals or populations being modeled. Such behavior is irreducible. Finally, we offer an example of a system, first proposed by Chaitin, that exhibits strong OEE.

  18. Cloud computing can simplify HIT infrastructure management.

    PubMed

    Glaser, John

    2011-08-01

    Software as a Service (SaaS), built on cloud computing technology, is emerging as the forerunner in IT infrastructure because it helps healthcare providers reduce capital investments. Cloud computing leads to predictable, monthly, fixed operating expenses for hospital IT staff. Outsourced cloud computing facilities are state-of-the-art data centers boasting some of the most sophisticated networking equipment on the market. The SaaS model helps hospitals safeguard against technology obsolescence, minimizes maintenance requirements, and simplifies management.

  19. On the Application of a Response Surface Technique to Analyze Roll-over Stability of Capsules with Airbags Using LS-Dyna

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Reaves, Mercedes C.

    2008-01-01

    As NASA moves towards developing technologies needed to implement its new Exploration program, studies conducted for Apollo in the 1960s to understand the roll-over stability of capsules during landing are being revisited. Although rigid body kinematics analyses of the roll-over behavior of capsules on impact provided critical insight into the Apollo problem, extensive ground test programs were also used. For the new Orion spacecraft being developed to implement today's Exploration program, new air-bag designs have improved sufficiently for NASA to consider their use to mitigate landing loads to ensure crew safety and to enable re-usability of the capsule. Simple kinematics models provide only limited understanding of the behavior of these air bag systems, and more sophisticated tools must be used. In particular, NASA and its contractors are using the LS-Dyna nonlinear simulation code for impact response predictions of the full Orion vehicle with air bags by leveraging the extensive air bag prediction work previously done by the automotive industry. However, even in today's computational environment, these analyses are still high-dimensional, time consuming, and computationally intensive. To alleviate the computational burden, this paper presents an approach that uses deterministic sampling techniques and an adaptive response surface method to not only use existing LS-Dyna solutions but also to interpolate from LS-Dyna solutions to predict the stability boundaries for a capsule on airbags. Results for the stability boundary in terms of impact velocities, capsule attitude, impact plane orientation, and impact surface friction are discussed.
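
    The approach replaces repeated LS-Dyna runs with an inexpensive response surface fitted to a deterministic sample of solutions. As a generic illustration of that idea (not the NASA tool or its data), the Python sketch below fits a quadratic surface to synthetic (velocity, attitude) samples of a fake stability margin and evaluates it on a grid.

```python
import numpy as np

# Generic quadratic response-surface fit by least squares, standing in for
# interpolation between expensive LS-Dyna runs. All data here are synthetic.
rng = np.random.default_rng(0)
v = rng.uniform(5.0, 15.0, 40)          # impact velocity samples (m/s)
a = rng.uniform(-20.0, 20.0, 40)        # capsule attitude samples (deg)
margin = 1.0 - 0.004*v**2 - 0.0008*a**2 + rng.normal(0, 0.02, 40)  # fake response

def design_matrix(v, a):
    return np.column_stack([np.ones_like(v), v, a, v*a, v**2, a**2])

coeffs, *_ = np.linalg.lstsq(design_matrix(v, a), margin, rcond=None)

# Evaluate the fitted surface on a coarse grid; sign changes of the margin
# locate the stability boundary without rerunning the expensive simulation.
vv, aa = np.meshgrid(np.linspace(5, 15, 6), np.linspace(-20, 20, 6))
pred = design_matrix(vv.ravel(), aa.ravel()) @ coeffs
print("predicted margins:\n", pred.reshape(vv.shape).round(2))
```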

  20. Tools for Atmospheric Radiative Transfer: Streamer and FluxNet. Revised

    NASA Technical Reports Server (NTRS)

    Key, Jeffrey R.; Schweiger, Axel J.

    1998-01-01

    Two tools for the solution of radiative transfer problems are presented. Streamer is a highly flexible medium spectral resolution radiative transfer model based on the plane-parallel theory of radiative transfer. Capable of computing either fluxes or radiances, it is suitable for studying radiative processes at the surface or within the atmosphere and for the development of remote-sensing algorithms. FluxNet is a fast neural network-based implementation of Streamer for computing surface fluxes. It allows for a sophisticated treatment of radiative processes in the analysis of large data sets and potential integration into geophysical models where computational efficiency is an issue. Documentation and tools for the development of alternative versions of Fluxnet are available. Collectively, Streamer and FluxNet solve a wide variety of problems related to radiative transfer: Streamer provides the detail and sophistication needed to perform basic research on most aspects of complex radiative processes while the efficiency and simplicity of FluxNet make it ideal for operational use.
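
    FluxNet's network architecture and training data are not described in this summary; as a generic illustration of replacing a slow radiative-transfer calculation with a fast learned surrogate, the Python sketch below fits a crude random-feature approximation to a made-up flux formula. Every input, weight, and formula here is invented for illustration.

```python
import numpy as np

# Crude stand-in for a neural-network surrogate: a single random hidden layer
# with a least-squares output layer is fitted to synthetic "flux" data from a
# made-up formula. This is not FluxNet; it only illustrates the idea of a fast
# learned approximation to an expensive radiative-transfer calculation.
rng = np.random.default_rng(42)
X = rng.uniform([0.0, 0.0], [1.0, 60.0], size=(500, 2))    # [cloud fraction, zenith angle]
y = 800.0*np.cos(np.radians(X[:, 1]))*(1.0 - 0.6*X[:, 0])  # fake surface flux (W/m^2)

mu, sigma = X.mean(axis=0), X.std(axis=0)                  # normalize inputs
W = rng.normal(size=(2, 30))                               # random hidden weights
b = rng.normal(size=30)
H = np.tanh(((X - mu)/sigma) @ W + b)                      # hidden activations
beta, *_ = np.linalg.lstsq(H, y, rcond=None)               # output layer by least squares

def surrogate(cloud_fraction, zenith_deg):
    x = (np.array([cloud_fraction, zenith_deg]) - mu)/sigma
    return float(np.tanh(x @ W + b) @ beta)

print("surrogate:", round(surrogate(0.3, 30.0), 1),
      "direct formula:", round(800.0*np.cos(np.radians(30.0))*(1.0 - 0.6*0.3), 1))
```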

  1. SpecPad: device-independent NMR data visualization and processing based on the novel DART programming language and Html5 Web technology.

    PubMed

    Guigas, Bruno

    2017-09-01

    SpecPad is a new device-independent software program for the visualization and processing of one-dimensional and two-dimensional nuclear magnetic resonance (NMR) time domain (FID) and frequency domain (spectrum) data. It is the result of a project to investigate whether the novel programming language DART, in combination with Html5 Web technology, forms a suitable base on which to write NMR data evaluation software that runs on modern computing devices such as Android, iOS, and Windows tablets as well as on Windows, Linux, and Mac OS X desktop PCs and notebooks. Another topic of interest is whether this technique also effectively supports the required sophisticated graphical and computational algorithms. SpecPad is device-independent because DART's compiled executable code is JavaScript and can, therefore, be run by the browsers of PCs and tablets. Because of Html5 browser cache technology, SpecPad may be operated off-line. Network access is only required during data import or export, e.g. via a Cloud service, or for software updates. A professional and easy to use graphical user interface, consistent across all hardware platforms, supports touch screen features on mobile devices for zooming and panning and for NMR-related interactive operations such as phasing, integration, peak picking, or atom assignment. Copyright © 2017 John Wiley & Sons, Ltd.

  2. User participation in the development of the human/computer interface for control centers

    NASA Technical Reports Server (NTRS)

    Broome, Richard; Quick-Campbell, Marlene; Creegan, James; Dutilly, Robert

    1996-01-01

    Technological advances coupled with the requirements to reduce operations staffing costs led to the demand for efficient, technologically-sophisticated mission operations control centers. The control center under development for the earth observing system (EOS) is considered. The users are involved in the development of a control center in order to ensure that it is cost-efficient and flexible. A number of measures were implemented in the EOS program in order to encourage user involvement in the area of human-computer interface development. The following user participation exercises carried out in relation to the system analysis and design are described: the shadow participation of the programmers during a day of operations; the flight operations personnel interviews; and the analysis of the flight operations team tasks. The user participation in the interface prototype development, the prototype evaluation, and the system implementation are reported on. The involvement of the users early in the development process enables the requirements to be better understood and the cost to be reduced.

  3. Existing and Emerging Third-Party: Certification Programs

    ERIC Educational Resources Information Center

    Wagner, Dan

    2012-01-01

    When one considers the necessary elements of a green cleaning program, it is tough to know where to begin. After all, green cleaning has evolved considerably from the days when a program simply involved using a couple of "green" chemicals. Over the last several years, successful green cleaning programs have grown in sophistication and are now…

  4. SuML: A Survey Markup Language for Generalized Survey Encoding

    PubMed Central

    Barclay, MW; Lober, WB; Karras, BT

    2002-01-01

    There is a need in clinical and research settings for a sophisticated, generalized, web based survey tool that supports complex logic, separation of content and presentation, and computable guidelines. There are many commercial and open source survey packages available that provide simple logic; few provide sophistication beyond “goto” statements; none support the use of guidelines. These tools are driven by databases, static web pages, and structured documents using markup languages such as eXtensible Markup Language (XML). We propose a generalized, guideline aware language and an implementation architecture using open source standards.

  5. Simulation and animation of sensor-driven robots

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, C.; Trivedi, M.M.; Bidlack, C.R.

    1994-10-01

    Most simulation and animation systems utilized in robotics are concerned with simulation of the robot and its environment without simulation of sensors. These systems have difficulty in handling robots that utilize sensory feedback in their operation. In this paper, a new design of an environment for simulation, animation, and visualization of sensor-driven robots is presented. As sensor technology advances, increasing numbers of robots are equipped with various types of sophisticated sensors. The main goal of creating the visualization environment is to aid the automatic robot programming and off-line programming capabilities of sensor-driven robots. The software system will help the users visualize the motion and reaction of the sensor-driven robot under their control program. Therefore, the efficiency of the software development is increased, the reliability of the software and the operation safety of the robot are ensured, and the cost of new software development is reduced. Conventional computer-graphics-based robot simulation and animation software packages lack capabilities for robot sensing simulation. This paper describes a system designed to overcome this deficiency.

  6. Using hypermedia to develop an intelligent tutorial/diagnostic system for the Space Shuttle Main Engine Controller Lab

    NASA Technical Reports Server (NTRS)

    Oreilly, Daniel; Williams, Robert; Yarborough, Kevin

    1988-01-01

    This is a tutorial/diagnostic system for training personnel in the use of the Space Shuttle Main Engine Controller (SSMEC) Simulation Lab. It also provides a diagnostic capable of isolating lab failures at least to the major lab component. The system was implemented using HyperCard, which is a hypermedia program running on Apple Macintosh computers. HyperCard proved to be a viable platform for the development and use of sophisticated tutorial systems and moderately capable diagnostic systems. This tutorial/diagnostic system uses the basic HyperCard tools to provide the tutorial. The diagnostic part of the system uses a simple interpreter written in the HyperCard language (HyperTalk) to implement the backward-chaining, rule-based logic commonly found in diagnostic systems using Prolog. Some of the advantages of HyperCard in developing this type of system include sophisticated graphics, animation, sound and voice capabilities, its ability as a hypermedia tool, and its ability to include digitized pictures. The major disadvantage is the slow execution time for evaluation of rules (due to the interpretive processing of the language). Other disadvantages include the limitation on the size of the cards, that color is not supported, that it does not support grey scale graphics, and its lack of selectable fonts for text fields.
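
    Backward chaining proves a goal by finding a rule whose conclusion matches it and recursively proving that rule's premises, bottoming out in known facts. The following Python sketch shows only that control flow with invented lab-diagnosis rules; it is not the original HyperTalk interpreter.

```python
# Minimal backward chaining over propositional rules: a goal is proved if it
# is a known fact, or if some rule concludes it and all of that rule's
# premises can themselves be proved. Rules and facts are invented examples.
RULES = [
    (["no_power_led", "breaker_ok"], "power_supply_fault"),
    (["power_supply_fault"], "replace_power_supply"),
    (["console_garbled", "cable_seated"], "interface_card_fault"),
]
FACTS = {"no_power_led", "breaker_ok"}

def prove(goal, facts):
    if goal in facts:
        return True
    for premises, conclusion in RULES:
        if conclusion == goal and all(prove(p, facts) for p in premises):
            facts.add(goal)          # cache derived facts
            return True
    return False

for hypothesis in ("replace_power_supply", "interface_card_fault"):
    print(hypothesis, "->", prove(hypothesis, set(FACTS)))
```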

  7. Using a cloud to replenish parched groundwater modeling efforts.

    PubMed

    Hunt, Randall J; Luchette, Joseph; Schreuder, Willem A; Rumbaugh, James O; Doherty, John; Tonkin, Matthew J; Rumbaugh, Douglas B

    2010-01-01

    Groundwater models can be improved by introduction of additional parameter flexibility and simultaneous use of soft-knowledge. However, these sophisticated approaches have high computational requirements. Cloud computing provides unprecedented access to computing power via the Internet to facilitate the use of these techniques. A modeler can create, launch, and terminate "virtual" computers as needed, paying by the hour, and save machine images for future use. Such cost-effective and flexible computing power empowers groundwater modelers to routinely perform model calibration and uncertainty analysis in ways not previously possible.

  8. Using a cloud to replenish parched groundwater modeling efforts

    USGS Publications Warehouse

    Hunt, Randall J.; Luchette, Joseph; Schreuder, Willem A.; Rumbaugh, James O.; Doherty, John; Tonkin, Matthew J.; Rumbaugh, Douglas B.

    2010-01-01

    Groundwater models can be improved by introduction of additional parameter flexibility and simultaneous use of soft-knowledge. However, these sophisticated approaches have high computational requirements. Cloud computing provides unprecedented access to computing power via the Internet to facilitate the use of these techniques. A modeler can create, launch, and terminate “virtual” computers as needed, paying by the hour, and save machine images for future use. Such cost-effective and flexible computing power empowers groundwater modelers to routinely perform model calibration and uncertainty analysis in ways not previously possible.

  9. Research Institute for Advanced Computer Science: Annual Report October 1998 through September 1999

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.; Gross, Anthony R. (Technical Monitor)

    1999-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center (ARC). It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. ARC has been designated NASA's Center of Excellence in Information Technology. In this capacity, ARC is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA ARC and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply information technology research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, and visiting scientist programs, designed to encourage and facilitate collaboration between the university and NASA information technology research communities.

  10. Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2000-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center. It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. Ames has been designated NASA's Center of Excellence in Information Technology. In this capacity, Ames is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA Ames and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth; (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply information technology research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, and visiting scientist programs, designed to encourage and facilitate collaboration between the university and NASA information technology research communities.

  11. Ground Processing of Data From the Mars Exploration Rovers

    NASA Technical Reports Server (NTRS)

    Wright, Jesse; Sturdevant, Kathryn; Noble, David

    2006-01-01

    A computer program implements the Earth side of the protocol that governs the transfer of data files generated by the Mars Exploration Rovers. It also provides tools for viewing data in these files and integrating data-product files into automated and manual processes. It reconstitutes files from telemetry data packets. Even if only one packet is received, metadata provide enough information to enable this program to identify and use partial data products. This software can generate commands to acknowledge received files and retransmit missed parts of files, or it can feed a manual process to make decisions about retransmission. The software uses an Extensible Markup Language (XML) data dictionary to provide a generic capability for displaying files of basic types, and uses external "plug-in" application programs to provide more sophisticated displays. This program makes data products available with very low latency, and can trigger automated actions when complete or partial products are received. The software is easy to install and use. The only system requirement for installing the software is a Java J2SE 1.4 platform. Several instances of the software can be executed simultaneously on the same machine.
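
    The summary describes reconstituting files from telemetry packets and generating retransmission requests for missed parts. The exact protocol is not given, so the Python sketch below is only a generic illustration of reassembling a byte buffer from (offset, payload) packets and computing the gaps that would need to be requested again.

```python
# Generic file reassembly from out-of-order (offset, payload) packets, with
# gap detection for retransmission requests. Not the MER ground software.
def reassemble(packets, total_length):
    buffer = bytearray(total_length)
    received = [False]*total_length
    for offset, payload in packets:
        buffer[offset:offset+len(payload)] = payload
        for i in range(offset, offset+len(payload)):
            received[i] = True
    # Collect contiguous missing ranges to ask for retransmission.
    gaps, start = [], None
    for i, ok in enumerate(received + [True]):       # sentinel closes a trailing gap
        if not ok and start is None:
            start = i
        elif ok and start is not None:
            gaps.append((start, i))
            start = None
    return bytes(buffer), gaps

packets = [(0, b"HEAD"), (8, b"TAIL")]               # bytes 4-7 never arrived
data, gaps = reassemble(packets, 12)
print(data, "missing byte ranges:", gaps)
```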

  12. Extreme Scale Computing to Secure the Nation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, D L; McGraw, J R; Johnson, J R

    2009-11-10

    Since the dawn of modern electronic computing in the mid-1940s, U.S. national security programs have been dominant users of every new generation of high-performance computer. Indeed, the first general-purpose electronic computer, ENIAC (the Electronic Numerical Integrator and Computer), was used to calculate the expected explosive yield of early thermonuclear weapons designs. Even the U.S. numerical weather prediction program, another early application for high-performance computing, was initially funded jointly by sponsors that included the U.S. Air Force and Navy, agencies interested in accurate weather predictions to support U.S. military operations. For the decades of the cold war, national security requirements continued to drive the development of high performance computing (HPC), including advancement of the computing hardware and development of sophisticated simulation codes to support weapons and military aircraft design, numerical weather prediction, as well as data-intensive applications such as cryptography and cybersecurity. U.S. national security concerns continue to drive the development of high-performance computers and software in the U.S., and in fact, events following the end of the cold war have driven an increase in the growth rate of computer performance at the high end of the market. This mainly derives from our nation's observance of a moratorium on underground nuclear testing beginning in 1992, followed by our voluntary adherence to the Comprehensive Test Ban Treaty (CTBT) beginning in 1995. The CTBT prohibits further underground nuclear tests, which in the past had been a key component of the nation's science-based program for assuring the reliability, performance and safety of U.S. nuclear weapons. In response to this change, the U.S. Department of Energy (DOE) initiated the Science-Based Stockpile Stewardship (SBSS) program in response to the Fiscal Year 1994 National Defense Authorization Act, which requires, 'in the absence of nuclear testing, a program to: (1) Support a focused, multifaceted program to increase the understanding of the enduring stockpile; (2) Predict, detect, and evaluate potential problems of the aging of the stockpile; (3) Refurbish and re-manufacture weapons and components, as required; and (4) Maintain the science and engineering institutions needed to support the nation's nuclear deterrent, now and in the future'. This program continues to fulfill its national security mission by adding significant new capabilities for producing scientific results through large-scale computational simulation coupled with careful experimentation, including sub-critical nuclear experiments permitted under the CTBT. To develop the computational science and the computational horsepower needed to support its mission, SBSS initiated the Accelerated Strategic Computing Initiative, later renamed the Advanced Simulation & Computing (ASC) program (sidebar: 'History of ASC Computing Program Computing Capability'). The modern 3D computational simulation capability of the ASC program supports the assessment and certification of the current nuclear stockpile through calibration with past underground test (UGT) data. While an impressive accomplishment, continued evolution of national security mission requirements will demand computing resources at a significantly greater scale than we have today.
In particular, continued observance and potential Senate confirmation of the Comprehensive Test Ban Treaty (CTBT), together with the U.S. administration's promise for a significant reduction in the size of the stockpile and the inexorable aging and consequent refurbishment of the stockpile, all demand increasing refinement of our computational simulation capabilities. Assessment of the present and future stockpile with increased confidence in the safety and reliability, without reliance upon calibration with past or future test data, is a long-term goal of the ASC program. This will be accomplished through significant increases in the scientific bases that underlie the computational tools. Computer codes must be developed that replace phenomenology with increased levels of scientific understanding together with an accompanying quantification of uncertainty. These advanced codes will place significantly higher demands on the computing infrastructure than do the current 3D ASC codes. This article discusses not only the need for a future computing capability at the exascale for the SBSS program, but also considers high performance computing requirements for broader national security questions. For example, the increasing concern over potential nuclear terrorist threats demands a capability to assess threats and potential disablement technologies as well as a rapid forensic capability for determining a nuclear weapons design from post-detonation evidence (nuclear counterterrorism).

  13. Listening to Students: How I Came to Love My Low-Residency Program

    ERIC Educational Resources Information Center

    Atwood, Megan

    2009-01-01

    Finding an academic program that caters to children's literature is hard. Many people consider children's literature no more sophisticated than its audience--an arena for those who cannot hack it either as writers or as teachers of adult literature. This author, however, found a new program--a "low residency program"--at Hamline…

  14. Haptic augmentation of science instruction: Does touch matter?

    NASA Astrophysics Data System (ADS)

    Jones, M. Gail; Minogue, James; Tretter, Thomas R.; Negishi, Atsuko; Taylor, Russell

    2006-01-01

    This study investigated the impact of haptic augmentation of a science inquiry program on students' learning about viruses and nanoscale science. The study assessed how the addition of different types of haptic feedback (active touch and kinesthetic feedback) combined with computer visualizations influenced middle and high school students' experiences. The influences of a PHANToM (a sophisticated haptic desktop device), a Sidewinder (a haptic gaming joystick), and a mouse (no haptic feedback) interface were compared. The levels of engagement in the instruction and students' attitudes about the instructional program were assessed using a combination of constructed response and Likert scale items. Potential cognitive differences were examined through an analysis of spontaneously generated analogies that appeared during student discourse. Results showed that the addition of haptic feedback from the haptic-gaming joystick and the PHANToM provided a more immersive learning environment that not only made the instruction more engaging but may also influence the way in which the students construct their understandings about abstract science concepts.

  15. Microgravity

    NASA Image and Video Library

    2004-04-15

    Ribbons is a program developed at UAB used worldwide to graphically depict complicated protein structures in a simplified format. The program uses sophisticated computer systems to understand the implications of protein structures. The Influenza virus remains a major causative agent for a large number of deaths among the elderly and young children and huge economic losses due to illness. Finding a cure will have a general impact both on the basic research of viral pathologists studying fast-evolving infectious agents and on the clinical treatment of influenza virus infection. The reproduction process of all strains of influenza is dependent on the same enzyme, neuraminidase. Shown here is a segmented representation of the neuraminidase inhibitor compound sitting inside a cave-like contour of the neuraminidase enzyme surface. This cave-like formation, present in every neuraminidase enzyme, is the active site crucial to the flu's ability to infect. The space-grown crystals of neuraminidase have provided significant new details about the three-dimensional characteristics of this active site, thus allowing researchers to design drugs that fit tighter into the site. Principal Investigator: Dr. Larry DeLucas

  16. A reconfigurable visual-programming library for real-time closed-loop cellular electrophysiology

    PubMed Central

    Biró, István; Giugliano, Michele

    2015-01-01

    Most of the software platforms for cellular electrophysiology are limited in terms of flexibility, hardware support, ease of use, or re-configuration and adaptation for non-expert users. Moreover, advanced experimental protocols requiring real-time closed-loop operation to investigate excitability, plasticity, and dynamics are largely inaccessible to users without moderate to substantial computer proficiency. Here we present an approach based on MATLAB/Simulink, exploiting the benefits of LEGO-like visual programming and configuration, combined with a small but easily extensible library of functional software components. We provide and validate several examples, implementing conventional and more sophisticated experimental protocols such as dynamic-clamp or the combined use of intracellular and extracellular methods, involving closed-loop real-time control. The functionality of each of these examples is demonstrated with relevant experiments. These can be used as a starting point to create and support a larger variety of electrophysiological tools and methods, hopefully extending the range of default techniques and protocols currently employed in experimental labs across the world. PMID:26157385

  17. Programming Hierarchical Self-Assembly of Patchy Particles into Colloidal Crystals via Colloidal Molecules.

    PubMed

    Morphew, Daniel; Shaw, James; Avins, Christopher; Chakrabarti, Dwaipayan

    2018-03-27

    Colloidal self-assembly is a promising bottom-up route to a wide variety of three-dimensional structures, from clusters to crystals. Programming hierarchical self-assembly of colloidal building blocks, which can give rise to structures ordered at multiple levels to rival biological complexity, poses a multiscale design problem. Here we explore a generic design principle that exploits a hierarchy of interaction strengths and employ this design principle in computer simulations to demonstrate the hierarchical self-assembly of triblock patchy colloidal particles into two distinct colloidal crystals. We obtain cubic diamond and body-centered cubic crystals via distinct clusters of uniform size and shape, namely, tetrahedra and octahedra, respectively. Such a conceptual design framework has the potential to reliably encode hierarchical self-assembly of colloidal particles into a high level of sophistication. Moreover, the design framework underpins a bottom-up route to cubic diamond colloidal crystals, which have remained elusive despite being much sought after for their attractive photonic applications.

  18. HEART: an automated beat-to-beat cardiovascular analysis package using Matlab.

    PubMed

    Schroeder, M J Mark J; Perreault, Bill; Ewert, D L Daniel L; Koenig, S C Steven C

    2004-07-01

    A computer program is described for beat-to-beat analysis of cardiovascular parameters from high-fidelity pressure and flow waveforms. The Hemodynamic Estimation and Analysis Research Tool (HEART) is a post-processing analysis software package developed in Matlab that enables scientists and clinicians to document, load, view, calibrate, and analyze experimental data that have been digitally saved in ASCII or binary format. Analysis routines include traditional hemodynamic parameter estimates as well as more sophisticated analyses such as lumped arterial model parameter estimation and vascular impedance frequency spectra. Cardiovascular parameter values of all analyzed beats can be viewed and statistically analyzed. An attractive feature of the HEART program is the ability to analyze data with visual quality assurance throughout the process, thus establishing a framework within which Good Laboratory Practice (GLP) compliance can be achieved. Additionally, the development of HEART on the Matlab platform provides users with the flexibility to adapt or create study-specific analysis files according to their specific needs. Copyright 2003 Elsevier Ltd.
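
    As a rough illustration of what beat-to-beat analysis of a high-fidelity pressure waveform involves (this sketch is not part of the HEART package; the sampling rate, peak-detection settings, and synthetic signal are illustrative assumptions), per-beat parameters can be extracted by locating successive systolic peaks and summarizing each cycle:

      # Illustrative sketch only: per-beat parameters from an arterial pressure trace.
      import numpy as np
      from scipy.signal import find_peaks

      def beat_to_beat(pressure, fs=1000.0):
          """Return one dict of parameters per detected beat."""
          # Detect systolic peaks; a minimum spacing of 0.3 s limits double-counting.
          peaks, _ = find_peaks(pressure, distance=int(0.3 * fs), prominence=10.0)
          beats = []
          for p0, p1 in zip(peaks[:-1], peaks[1:]):
              cycle = pressure[p0:p1]                  # one beat, peak to peak
              beats.append({
                  "systolic": float(cycle.max()),
                  "diastolic": float(cycle.min()),
                  "mean": float(cycle.mean()),         # time-averaged pressure over the beat
                  "heart_rate": 60.0 * fs / (p1 - p0)  # beats per minute
              })
          return beats

      # Example with a synthetic 100 +/- 20 mmHg waveform sampled at 1 kHz (~72 bpm).
      t = np.arange(0, 10, 1 / 1000.0)
      bp = 100 + 20 * np.sin(2 * np.pi * 1.2 * t)
      print(beat_to_beat(bp)[0])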

  19. Overview of the NASA SETI Program

    NASA Technical Reports Server (NTRS)

    Oliver, B. M.

    1986-01-01

    The NASA Search of Extraterrestrial Intelligence (SETI) program plan is to scan the microwave window from 1 to 10 GHz with existing radio telescopes and sophisticated signal processing equipment looking for narrow band features that might represent artificial signals. A microwave spectrometer was built and is being field tested. A pattern recognition computer to search for drifting continuous wave signals and pulse trains in the output spectra is being designed. Equipment to characterize the radio frequency interference environment was also built. The plan is to complete the hardware and software by FY-88. Then, with increased funding, this equipment will be replicated in Very Large Scale Integration form. Observations, both a complete sky survey and a search for nearby solar-type stars, will begin in about 1990. The hypothesis that very powerful signals exist or that signals are being beamed at us will be tested. To detect the kinds of signals radiated at distances of 100 light years will require a collecting area kilometers in diameter.

  20. That Elusive, Eclectic Thing Called Thermal Environment: What a Board Should Know About It

    ERIC Educational Resources Information Center

    Schutte, Frederick

    1970-01-01

    Discussion of proper thermal environment for protection of sophisticated educational equipment such as computer and data-processing machines, magnetic tapes, closed-circuit television and video tape communications systems.

  1. School Architecture: New Activities Dictate New Designs.

    ERIC Educational Resources Information Center

    Hill, Robert

    1984-01-01

    Changing educational requirements have led to many school building design developments in recent years, including technologically sophisticated music and computer rooms, large school kitchens, and Title IX-mandated equal facilities available for both sexes. (MLF)

  2. Lockheed L-1011 TriStar first flight to support Adaptive Performance Optimization study

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Bearing the logos of the National Aeronautics and Space Administration and Orbital Sciences Corporation, Orbital's L-1011 Tristar lifts off the Meadows Field Runway at Bakersfield, California, on its first flight May 21, 1997, in NASA's Adaptive Performance Optimization project. Developed by engineers at NASA's Dryden Flight Research Center, Edwards, California, the experiment seeks to reduce fuel consumption of large jetliners by improving the aerodynamic efficiency of their wings at cruise conditions. A research computer employing a sophisticated software program adapts to changing flight conditions by commanding small movements of the L-1011's outboard ailerons to give the wings the most efficient - or optimal - airfoil. Up to a dozen research flights will be flown in the current and follow-on phases of the project over the next couple of years.

  3. An SPSS implementation of the nonrecursive outlier deletion procedure with shifting z score criterion (Van Selst & Jolicoeur, 1994).

    PubMed

    Thompson, Glenn L

    2006-05-01

    Sophisticated univariate outlier screening procedures are not yet available in widely used statistical packages such as SPSS. However, SPSS can accept user-supplied programs for executing these procedures. Failing this, researchers tend to rely on simplistic alternatives that can distort data because they do not adjust to cell-specific characteristics. Despite their popularity, these simple procedures may be especially ill suited for some applications (e.g., data from reaction time experiments). A user-friendly SPSS Production Facility implementation of the shifting z-score criterion procedure (Van Selst & Jolicoeur, 1994) is presented to make the procedure easier to use. In addition to outlier screening, optional syntax modules can be added that will perform tedious database management tasks (e.g., restructuring or computing means).
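
    For readers unfamiliar with the idea of a sample-size-dependent z criterion, the following Python sketch illustrates one common reading of a non-recursive moving-criterion screen in the spirit of Van Selst and Jolicoeur (1994); it is not the SPSS Production Facility implementation described above, and the cutoff table is a placeholder rather than the published criteria:

      # Illustrative sketch: non-recursive outlier screen with a cutoff that shifts with n.
      import numpy as np

      CUTOFFS = {4: 1.5, 8: 2.0, 15: 2.3, 35: 2.45, 100: 2.5}   # placeholder values only

      def criterion(n):
          # Use the cutoff for the largest tabulated sample size not exceeding n.
          keys = sorted(k for k in CUTOFFS if k <= n)
          return CUTOFFS[keys[-1]] if keys else CUTOFFS[4]

      def screen(rts):
          """Return the reaction times that survive the moving-criterion screen."""
          rts = np.asarray(rts, dtype=float)
          keep = []
          for i, x in enumerate(rts):
              others = np.delete(rts, i)               # statistics exclude the tested value
              z = abs(x - others.mean()) / others.std(ddof=1)
              if z <= criterion(len(rts)):
                  keep.append(x)
          return np.array(keep)

      print(screen([420, 455, 431, 467, 449, 1450]))   # the slow trial is flagged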

  4. Test, Control and Monitor System (TCMS) operations plan

    NASA Technical Reports Server (NTRS)

    Macfarlane, C. K.; Conroy, M. P.

    1993-01-01

    The purpose is to provide a clear understanding of the Test, Control and Monitor System (TCMS) operating environment and to describe the method of operations for TCMS. TCMS is a complex and sophisticated checkout system focused on support of the Space Station Freedom Program (SSFP) and related activities. An understanding of the TCMS operating environment is provided and operational responsibilities are defined. NASA and the Payload Ground Operations Contractor (PGOC) will use it as a guide to manage the operation of the TCMS computer systems and associated networks and workstations. All TCMS operational functions are examined. Other plans and detailed operating procedures relating to an individual operational function are referenced within this plan. This plan augments existing Technical Support Management Directives (TSMD's), Standard Practices, and other management documentation which will be followed where applicable.

  5. Computer-based diagnostic decisionmaking.

    PubMed

    Miller, R A

    1987-12-01

    The three decisionmaking aids described by the authors attack the generic problem of "see no evil, hear no evil, speak no evil"--improving the detection, diagnosis, and therapy of psychiatric disorders in the primary care setting. The three systems represent interventions at different steps in the process of providing appropriate care to psychiatric patients. The DSPW system of Robins and Marcus offers the potential of increasing the recognition of psychiatric disease in the physician's office. Politser's IDS program is representative of the sort of sophisticated microcomputer-based decisionmaking support tools that will become available to physicians in the not-too-distant future. Erdman's study of the impact of explanation capabilities on the acceptability of therapy recommending systems points out the need for careful scientific evaluations of features added to diagnostic and therapeutic systems.

  6. Integrating the iPod Touch in K-12 Education: Visions and Vices

    ERIC Educational Resources Information Center

    Banister, Savilla

    2010-01-01

    Advocates of ubiquitous computing have long been documenting classroom benefits of one-to-one ratios of students to handheld or laptop computers. The recent sophisticated capabilities of the iPod Touch, iPhone, and iPad have encouraged further speculation on exactly how K-12 teaching and learning might be energized by such devices. This paper…

  7. Use of three-dimensional computer graphic animation to illustrate cleft lip and palate surgery.

    PubMed

    Cutting, C; Oliker, A; Haring, J; Dayan, J; Smith, D

    2002-01-01

    Three-dimensional (3D) computer animation is not commonly used to illustrate surgical techniques. This article describes the surgery-specific processes that were required to produce animations to teach cleft lip and palate surgery. Three-dimensional models were created using CT scans of two Chinese children with unrepaired clefts (one unilateral and one bilateral). We programmed several custom software tools, including an incision tool, a forceps tool, and a fat tool. Three-dimensional animation was found to be particularly useful for illustrating surgical concepts. Positioning the virtual "camera" made it possible to view the anatomy from angles that are impossible to obtain with a real camera. Transparency allows the underlying anatomy to be seen during surgical repair while maintaining a view of the overlaying tissue relationships. Finally, the representation of motion allows modeling of anatomical mechanics that cannot be done with static illustrations. The animations presented in this article can be viewed on-line at http://www.smiletrain.org/programs/virtual_surgery2.htm. Sophisticated surgical procedures are clarified with the use of 3D animation software and customized software tools. The next step in the development of this technology is the creation of interactive simulators that recreate the experience of surgery in a safe, digital environment. Copyright 2003 Wiley-Liss, Inc.

  8. Simulation of cold magnetized plasmas with the 3D electromagnetic software CST Microwave Studio®

    NASA Astrophysics Data System (ADS)

    Louche, Fabrice; Křivská, Alena; Messiaen, André; Wauters, Tom

    2017-10-01

    Detailed designs of ICRF antennas were made possible by the development of sophisticated commercial 3D codes like CST Microwave Studio® (MWS). This program allows for very detailed geometries of the radiating structures, but previously handled only simple materials, such as equivalent isotropic dielectrics, to simulate the reflection and refraction of RF waves at the vacuum/plasma interface. The code was nevertheless used intensively, notably for computing the coupling properties of the ITER ICRF antenna. Until recently it was not possible to simulate gyrotropic media such as magnetized plasmas, but recent improvements have made it possible to program any material described by a general dielectric and/or diamagnetic tensor. A Visual Basic macro was developed to exploit this feature and was tested for the specific case of a monochromatic plane wave propagating longitudinally with respect to the magnetic field direction. For specific cases the exact solution can be expressed in 1D as the sum of two circularly polarized waves connected by a reflection coefficient that can be analytically computed. Solutions for stratified media can also be derived. This allows for a direct comparison with MWS results. The agreement is excellent, but accurate simulations for realistic geometries require large memory resources that could significantly restrict the possibility of simulating cold plasmas to small-scale machines.
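
    The analytic benchmark mentioned above rests on textbook cold-plasma theory: for propagation along the static magnetic field, the two circularly polarized modes have distinct refractive indices, and a normal-incidence Fresnel coefficient gives the reflection at a sharp vacuum/plasma boundary. The electron-only Python sketch below illustrates that calculation; it is not the authors' Visual Basic macro, and the density, field, and frequency values are illustrative assumptions (one common sign convention is used):

      # Hedged physics sketch: reflection of the R and L circular modes at a sharp
      # vacuum/cold-plasma interface, propagation along the static magnetic field.
      import numpy as np

      E, ME, EPS0 = 1.602e-19, 9.109e-31, 8.854e-12

      def refractive_indices(ne, b0, f):
          """Electron-only cold-plasma indices of the R and L circular modes."""
          w = 2 * np.pi * f
          wpe2 = ne * E**2 / (EPS0 * ME)          # electron plasma frequency squared
          wce = E * b0 / ME                       # electron cyclotron frequency
          n2_r = 1 - wpe2 / (w * (w - wce))
          n2_l = 1 - wpe2 / (w * (w + wce))
          return np.sqrt(n2_r + 0j), np.sqrt(n2_l + 0j)

      def reflection(n):
          """Normal-incidence amplitude reflection coefficient from vacuum onto index n."""
          return (1 - n) / (1 + n)

      n_r, n_l = refractive_indices(ne=1e19, b0=3.0, f=50e6)   # illustrative parameters
      print(abs(reflection(n_r)), abs(reflection(n_l)))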

  9. The process group approach to reliable distributed computing

    NASA Technical Reports Server (NTRS)

    Birman, Kenneth P.

    1992-01-01

    The difficulty of developing reliable distributed software is an impediment to applying distributed computing technology in many settings. Experience with the ISIS system suggests that a structured approach based on virtually synchronous process groups yields systems that are substantially easier to develop, exploit sophisticated forms of cooperative computation, and achieve high reliability. Six years of research on ISIS are reviewed, describing the model, its implementation challenges, and the types of applications to which ISIS has been applied.

  10. Computation of repetitions and regularities of biologically weighted sequences.

    PubMed

    Christodoulakis, M; Iliopoulos, C; Mouchard, L; Perdikuri, K; Tsakalidis, A; Tsichlas, K

    2006-01-01

    Biological weighted sequences are used extensively in molecular biology as profiles for protein families, in the representation of binding sites and often for the representation of sequences produced by a shotgun sequencing strategy. In this paper, we address three fundamental problems in the area of biologically weighted sequences: (i) computation of repetitions, (ii) pattern matching, and (iii) computation of regularities. Our algorithms can be used as basic building blocks for more sophisticated algorithms applied on weighted sequences.
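
    To make the notion of a weighted sequence concrete, the following minimal Python sketch shows the simplest of the building blocks mentioned above, probabilistic pattern matching; the threshold and example sequence are illustrative assumptions, not taken from the paper:

      # Minimal sketch: each position holds a probability distribution over characters,
      # and a pattern "occurs" when the product of per-position probabilities reaches a
      # chosen threshold.
      def occurrences(weighted_seq, pattern, threshold=1 / 4):
          hits = []
          for i in range(len(weighted_seq) - len(pattern) + 1):
              prob = 1.0
              for j, ch in enumerate(pattern):
                  prob *= weighted_seq[i + j].get(ch, 0.0)
                  if prob < threshold:               # prune early once below threshold
                      break
              if prob >= threshold:
                  hits.append((i, prob))
          return hits

      # A 5-position weighted DNA sequence; position 2 is ambiguous between C and G.
      seq = [{"A": 1.0}, {"C": 1.0}, {"C": 0.5, "G": 0.5}, {"T": 1.0}, {"A": 1.0}]
      print(occurrences(seq, "CCT"))                 # [(1, 0.5)]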

  11. Extensible Computational Chemistry Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2012-08-09

    ECCE provides a sophisticated graphical user interface, scientific visualization tools, and the underlying data management framework enabling scientists to efficiently set up calculations and store, retrieve, and analyze the rapidly growing volumes of data produced by computational chemistry studies. ECCE was conceived as part of the Environmental Molecular Sciences Laboratory construction to solve the problem of researchers being able to effectively utilize complex computational chemistry codes and massively parallel high-performance compute resources. Bringing the power of these codes and resources to the desktops of researchers, and thus enabling world-class research without users needing a detailed understanding of the inner workings of either the theoretical codes or the supercomputers needed to run them, was a grand challenge problem in the original version of the EMSL. ECCE allows collaboration among researchers using a web-based data repository where the inputs and results for all calculations done within ECCE are organized. ECCE is a first-of-its-kind end-to-end problem-solving environment for all phases of computational chemistry research: setting up calculations with a sophisticated GUI and direct-manipulation visualization tools, submitting and monitoring calculations on remote high-performance supercomputers without having to be familiar with the details of using these compute resources, and performing results visualization and analysis, including creating publication-quality images. ECCE is a suite of tightly integrated applications that are employed as the user moves through the modeling process.

  12. Verification for measurement-only blind quantum computing

    NASA Astrophysics Data System (ADS)

    Morimae, Tomoyuki

    2014-06-01

    Blind quantum computing is a new secure quantum computing protocol where a client who does not have any sophisticated quantum technology can delegate her quantum computing to a server without leaking any privacy. It is known that a client who has only a measurement device can perform blind quantum computing [T. Morimae and K. Fujii, Phys. Rev. A 87, 050301(R) (2013), 10.1103/PhysRevA.87.050301]. It has been an open problem whether the protocol can enjoy the verification, i.e., the ability of the client to check the correctness of the computing. In this paper, we propose a protocol of verification for the measurement-only blind quantum computing.

  13. Measurement of Satellite Impact Test Fragments for Modeling Orbital Debris

    NASA Technical Reports Server (NTRS)

    Hill, Nicole M.

    2009-01-01

    There are over 13,000 catalogued objects 10 cm and larger in orbit around Earth [ODQN, January 2009, p12]. More than 6000 of these objects are fragments from explosions and collisions. As the Earth-orbiting object count increases, debris-generating collisions in the future become a statistical inevitability. To aid in understanding this collision risk, the NASA Orbital Debris Program Office has developed computer models that calculate the quantity and orbits of debris both currently in orbit and in future epochs. In order to create a reasonable computer model of the orbital debris environment, it is important to understand the mechanics of creation of debris as a result of a collision. The measurement of the physical characteristics of debris resulting from ground-based, hypervelocity impact testing aids in understanding the sizes and shapes of debris produced from potential impacts in orbit. To advance the accuracy of fragment shape/size determination, the NASA Orbital Debris Program Office recently implemented a computerized measurement system. The goal of this system is to improve knowledge and understanding of the relation between commonly used dimensions and overall shape. The technique developed involves scanning a single fragment with a hand-held laser device, measuring its size properties using a sophisticated software tool, and creating a three-dimensional computer model to demonstrate how the object might appear in orbit. This information is used to aid optical techniques in shape determination. This more automated and repeatable method provides higher accuracy in the size and shape determination of debris.

  14. Does a better model yield a better argument? An info-gap analysis

    NASA Astrophysics Data System (ADS)

    Ben-Haim, Yakov

    2017-04-01

    Theories, models and computations underlie reasoned argumentation in many areas. The possibility of error in these arguments, though of low probability, may be highly significant when the argument is used in predicting the probability of rare high-consequence events. This implies that the choice of a theory, model or computational method for predicting rare high-consequence events must account for the probability of error in these components. However, error may result from lack of knowledge or surprises of various sorts, and predicting the probability of error is highly uncertain. We show that the putatively best, most innovative and sophisticated argument may not actually have the lowest probability of error. Innovative arguments may entail greater uncertainty than more standard but less sophisticated methods, creating an innovation dilemma in formulating the argument. We employ info-gap decision theory to characterize and support the resolution of this problem and present several examples.

  15. Web-based hydrodynamics computing

    NASA Astrophysics Data System (ADS)

    Shimoide, Alan; Lin, Luping; Hong, Tracie-Lynne; Yoon, Ilmi; Aragon, Sergio R.

    2005-01-01

    Proteins are long chains of amino acids that have a definite 3-d conformation, and the shape of each protein is vital to its function. Since proteins are normally in solution, hydrodynamics (which describes the movement of solvent around a protein as a function of the shape and size of the molecule) can be used to probe the size and shape of proteins compared to those derived from X-ray crystallography. The computation chain needed for these hydrodynamics calculations consists of several separate programs by different authors on various platforms and often requires 3D visualizations of intermediate results. Due to the complexity, tools developed by a particular research group are not readily available for use by other groups, nor even by the non-experts within the same research group. To alleviate this situation, and to foment the easy and wide distribution of computational tools worldwide, we developed a web-based interactive computational environment (WICE) including interactive 3D visualization that can be used with any web browser. Java-based technologies were used to provide a platform-neutral, user-friendly solution. Java Server Pages (JSP), Java Servlets, Java Beans, JOGL (Java bindings for OpenGL), and Java Web Start were used to create a solution that simplifies the computing chain for the user, allowing the user to focus on their scientific research. WICE hides complexity from the user and provides robust and sophisticated visualization through a web browser.

  16. Web-based hydrodynamics computing

    NASA Astrophysics Data System (ADS)

    Shimoide, Alan; Lin, Luping; Hong, Tracie-Lynne; Yoon, Ilmi; Aragon, Sergio R.

    2004-12-01

    Proteins are long chains of amino acids that have a definite 3-d conformation, and the shape of each protein is vital to its function. Since proteins are normally in solution, hydrodynamics (which describes the movement of solvent around a protein as a function of the shape and size of the molecule) can be used to probe the size and shape of proteins compared to those derived from X-ray crystallography. The computation chain needed for these hydrodynamics calculations consists of several separate programs by different authors on various platforms and often requires 3D visualizations of intermediate results. Due to the complexity, tools developed by a particular research group are not readily available for use by other groups, nor even by the non-experts within the same research group. To alleviate this situation, and to foment the easy and wide distribution of computational tools worldwide, we developed a web-based interactive computational environment (WICE) including interactive 3D visualization that can be used with any web browser. Java-based technologies were used to provide a platform-neutral, user-friendly solution. Java Server Pages (JSP), Java Servlets, Java Beans, JOGL (Java bindings for OpenGL), and Java Web Start were used to create a solution that simplifies the computing chain for the user, allowing the user to focus on their scientific research. WICE hides complexity from the user and provides robust and sophisticated visualization through a web browser.

  17. 32 CFR 250.3 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... joint sponsorship, the controlling office is determined by advance agreement and may be either a party... materials; and (4) goods accompanied by sophisticated operation, application, or maintenance know-how that... Cooperative Research and Development Program. (3) The Department of the Air Force Potential Contractor Program...

  18. Kids on the Move: Afterschool Programs Promoting Healthy Eating and Physical Activity. America After 3PM Special Report

    ERIC Educational Resources Information Center

    Afterschool Alliance, 2015

    2015-01-01

    Afterschool programs have continued to grow in sophistication, increase their offerings and improve quality. As the role of afterschool programs has evolved from primarily providing a safe and supervised environment to a resource that provides a host of supports for their students, programs have become valuable partners in helping students reach…

  19. Graphical Requirements for Force Level Planning. Volume 2

    DTIC Science & Technology

    1991-09-01

    The technology review includes graphics algorithms, computer hardware, computer software, and design methodologies. As user interfaces have become more sophisticated, they have become harder to develop.

  20. Use of Microcomputers and Personal Computers in Pacing

    PubMed Central

    Sasmor, L.; Tarjan, P.; Mumford, V.; Smith, E.

    1983-01-01

    This paper describes the evolution from the early discrete circuit pacemaker of the past to the sophisticated microprocessor based pacemakers of today. The necessary computerized supporting instrumentation is also described. Technological and economical reasons for this evolution are discussed.

  1. Adaptive Language Games with Robots

    NASA Astrophysics Data System (ADS)

    Steels, Luc

    2010-11-01

    This paper surveys recent research into language evolution using computer simulations and robotic experiments. This field has made tremendous progress in the past decade, going from simple simulations of lexicon formation with animal-like cybernetic robots to sophisticated grammatical experiments with humanoid robots.

  2. Toward a New Voice

    ERIC Educational Resources Information Center

    Murphy, Patti

    2007-01-01

    Frequently linked to sophisticated speech communication devices resembling laptop computers, augmentative and alternative communication (AAC) encompasses a spectrum of tools and strategies ranging from pointing, writing, gestures, and facial expressions to sign language, manual alphabet boards, picture symbols, and photographs used to convey…

  3. Television camera as a scientific instrument

    NASA Technical Reports Server (NTRS)

    Smokler, M. I.

    1970-01-01

    A rigorous calibration program, coupled with a sophisticated data-processing program that introduced compensation for system response to correct photometry, geometric linearity, and resolution, converted a television camera into a quantitative measuring instrument. The output data are in the form of both numeric printout records and photographs.

  4. The journey from forensic to predictive materials science using density functional theory

    DOE PAGES

    Schultz, Peter A.

    2017-09-12

    Approximate methods for electronic structure, implemented in sophisticated computer codes and married to ever-more powerful computing platforms, have become invaluable in chemistry and materials science. The maturing and consolidation of quantum chemistry codes since the 1980s, based upon explicitly correlated electronic wave functions, has made them a staple of modern molecular chemistry. Here, the impact of first principles electronic structure in physics and materials science had lagged owing to the extra formal and computational demands of bulk calculations.

  5. The journey from forensic to predictive materials science using density functional theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schultz, Peter A.

    Approximate methods for electronic structure, implemented in sophisticated computer codes and married to ever-more powerful computing platforms, have become invaluable in chemistry and materials science. The maturing and consolidation of quantum chemistry codes since the 1980s, based upon explicitly correlated electronic wave functions, has made them a staple of modern molecular chemistry. Here, the impact of first principles electronic structure in physics and materials science had lagged owing to the extra formal and computational demands of bulk calculations.

  6. Comprehension-Driven Program Analysis (CPA) for Malware Detection in Android Phones

    DTIC Science & Technology

    2015-07-01

    Iowa State University, July 2015 (final report). ...machine analysis system to detect novel, sophisticated Android malware... an innovative library summarization technique and its incorporation in...

  7. Emerging Techniques 2: Architectural Programming.

    ERIC Educational Resources Information Center

    Evans, Benjamin H.; Wheeler, C. Herbert, Jr.

    A selected collection of architectural programming techniques has been assembled to aid architects in building design. Several exciting and sophisticated techniques for determining a basis for environmental design have been developed in recent years. These extend to the logic of environmental design and lead to more appropriate and useful…

  8. Using genetic information while protecting the privacy of the soul.

    PubMed

    Moor, J H

    1999-01-01

    Computing plays an important role in genetics (and vice versa). Theoretically, computing provides a conceptual model for the function and malfunction of our genetic machinery. Practically, contemporary computers and robots equipped with advanced algorithms make the revelation of the complete human genome imminent--computers are about to reveal our genetic souls for the first time. Ethically, computers help protect privacy by restricting access in sophisticated ways to genetic information. But the inexorable fact that computers will increasingly collect, analyze, and disseminate abundant amounts of genetic information made available through the genetic revolution, not to mention that inexpensive computing devices will make genetic information gathering easier, underscores the need for strong and immediate privacy legislation.

  9. At Home in the Cell.

    ERIC Educational Resources Information Center

    Flannery, Maura C.

    1999-01-01

    Argues that biologists' understanding of the cell has become richer over the past 30 years. Describes how genetic engineering and sophisticated computer technology have provided an increased knowledge of genes, gene products, components of cells, and the structure and function of proteins. (CCM)

  10. The High-Tech Surge. Focus on Careers.

    ERIC Educational Resources Information Center

    Vo, Chuong-Dai Hong

    1996-01-01

    The computer industry is growing at a phenomenal rate as technology advances and prices fall, stimulating unprecedented demand from business, government, and individuals. Higher levels of education will be the key to securing employment as organizations increasingly rely on sophisticated technology. (Author)

  11. Ranking network of a captive rhesus macaque society: a sophisticated corporative kingdom.

    PubMed

    Fushing, Hsieh; McAssey, Michael P; Beisner, Brianne; McCowan, Brenda

    2011-03-15

    We develop a three-step computing approach to explore a hierarchical ranking network for a society of captive rhesus macaques. The computed network is sufficiently informative to address the question: Is the ranking network for a rhesus macaque society more like a kingdom or a corporation? Our computations are based on a three-step approach. These steps are devised to deal with the tremendous challenges stemming from the transitivity of dominance as a necessary constraint on the ranking relations among all individual macaques, and the very high sampling heterogeneity in the behavioral conflict data. The first step simultaneously infers the ranking potentials among all network members, which requires accommodation of heterogeneous measurement error inherent in behavioral data. Our second step estimates the social rank for all individuals by minimizing the network-wide errors in the ranking potentials. The third step provides a way to compute confidence bounds for selected empirical features in the social ranking. We apply this approach to two sets of conflict data pertaining to two captive societies of adult rhesus macaques. The resultant ranking network for each society is found to be a sophisticated mixture of both a kingdom and a corporation. Also, for validation purposes, we reanalyze conflict data from twenty longhorn sheep and demonstrate that our three-step approach is capable of correctly computing a ranking network by eliminating all ranking error.
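
    The three-step approach summarized above is the authors' own; for readers who want a concrete sense of how a ranking can be derived from dyadic conflict outcomes, the following Python sketch computes a much simpler standard dominance index (David's score) from a hypothetical win/loss matrix:

      # Simplified illustration only (not the authors' three-step network method):
      # David's score, a standard dominance index built from dyadic win proportions.
      import numpy as np

      def davids_score(wins):
          """wins[i, j] = number of conflicts individual i won against individual j."""
          totals = wins + wins.T
          with np.errstate(divide="ignore", invalid="ignore"):
              P = np.where(totals > 0, wins / totals, 0.0)   # dyadic win proportions
          w = P.sum(axis=1)            # summed win proportions
          l = P.sum(axis=0)            # summed loss proportions
          w2 = P @ w                   # win proportions weighted by opponents' scores
          l2 = P.T @ l                 # loss proportions weighted by opponents' scores
          return w + w2 - l - l2

      wins = np.array([[0, 5, 7],      # hypothetical conflict counts for 3 individuals
                       [1, 0, 4],
                       [0, 2, 0]])
      print(davids_score(wins))        # higher values = higher rank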

  12. When Machines Think: Radiology's Next Frontier.

    PubMed

    Dreyer, Keith J; Geis, J Raymond

    2017-12-01

    Artificial intelligence (AI), machine learning, and deep learning are terms now seen frequently, all of which refer to computer algorithms that change as they are exposed to more data. Many of these algorithms are surprisingly good at recognizing objects in images. The combination of large amounts of machine-consumable digital data, increased and cheaper computing power, and increasingly sophisticated statistical models combine to enable machines to find patterns in data in ways that are not only cost-effective but also potentially beyond humans' abilities. Building an AI algorithm can be surprisingly easy. Understanding the associated data structures and statistics, on the other hand, is often difficult and obscure. Converting the algorithm into a sophisticated product that works consistently in broad, general clinical use is complex and incompletely understood. To show how these AI products reduce costs and improve outcomes will require clinical translation and industrial-grade integration into routine workflow. Radiology has the chance to leverage AI to become a center of intelligently aggregated, quantitative, diagnostic information. Centaur radiologists, formed as a synergy of human plus computer, will provide interpretations using data extracted from images by humans and image-analysis computer algorithms, as well as the electronic health record, genomics, and other disparate sources. These interpretations will form the foundation of precision health care, or care customized to an individual patient. © RSNA, 2017.

  13. Template Authoring Environment for the Automatic Generation of Narrative Content

    ERIC Educational Resources Information Center

    Caropreso, Maria Fernanda; Inkpen, Diana; Keshtkar, Fazel; Khan, Shahzad

    2012-01-01

    Natural Language Generation (NLG) systems can make data accessible in an easily digestible textual form; but using such systems requires sophisticated linguistic and sometimes even programming knowledge. We have designed and implemented an environment for creating and modifying NLG templates that requires no programming knowledge, and can operate…

  14. Medical Ethics Training: A Clinical Partnership.

    ERIC Educational Resources Information Center

    Thomasma, David C.

    1979-01-01

    The ethics training program at the University of Tennessee Center for the Health Sciences involves a four-way dialogue among clinical faculty and house staff, ethics faculty and fellows, the medical students, and philosophy ethics students. The program's clinical basis allows participants to become sophisticated about ethical issues in practice.…

  15. Microgravity

    NASA Image and Video Library

    1999-05-26

    Looking for a faster computer? How about an optical computer that processes data streams simultaneously and works with the speed of light? In space, NASA researchers have formed optical thin films. By turning these thin films into very fast optical computer components, scientists could improve computer tasks, such as pattern recognition. Dr. Hossin Abdeldayem, physicist at NASA/Marshall Space Flight Center (MSFC) in Huntsville, AL, is working with lasers as part of an optical system for pattern recognition. These systems can be used for automated fingerprinting, photographic scanning, and the development of sophisticated artificial intelligence systems that can learn and evolve. Photo credit: NASA/Marshall Space Flight Center (MSFC)

  16. Promoting Convergence: The Integrated Graduate Program in Physical and Engineering Biology at Yale University, a New Model for Graduate Education

    ERIC Educational Resources Information Center

    Noble, Dorottya B.; Mochrie, Simon G. J.; O'Hern, Corey S.; Pollard, Thomas D.; Regan, Lynne

    2016-01-01

    In 2008, we established the Integrated Graduate Program in Physical and Engineering Biology (IGPPEB) at Yale University. Our goal was to create a comprehensive graduate program to train a new generation of scientists who possess a sophisticated understanding of biology and who are capable of applying physical and quantitative methodologies to…

  17. Biomolecular computing systems: principles, progress and potential.

    PubMed

    Benenson, Yaakov

    2012-06-12

    The task of information processing, or computation, can be performed by natural and man-made 'devices'. Man-made computers are made from silicon chips, whereas natural 'computers', such as the brain, use cells and molecules. Computation also occurs on a much smaller scale in regulatory and signalling pathways in individual cells and even within single biomolecules. Indeed, much of what we recognize as life results from the remarkable capacity of biological building blocks to compute in highly sophisticated ways. Rational design and engineering of biological computing systems can greatly enhance our ability to study and to control biological systems. Potential applications include tissue engineering and regeneration and medical treatments. This Review introduces key concepts and discusses recent progress that has been made in biomolecular computing.

  18. Development of a Searchable Metabolite Database and Simulator of Xenobiotic Metabolism

    EPA Science Inventory

    A computational tool (MetaPath) has been developed for storage and analysis of metabolic pathways and associated metadata. The system is capable of sophisticated text and chemical structure/substructure searching as well as rapid comparison of metabolites formed across chemicals,...

  19. High-Tech Conservation: Information-Age Tools Have Revolutionized the Work of Ecologists.

    ERIC Educational Resources Information Center

    Chiles, James R.

    1992-01-01

    Describes a new direction for conservation efforts influenced by the advance of the information age and the introduction of many technologically sophisticated information collecting devices. Devices include microscopic computer chips, miniature electronic components, and Earth-observation satellites. (MCO)

  20. Controls for Burning Solid Wastes

    ERIC Educational Resources Information Center

    Toro, Richard F.; Weinstein, Norman J.

    1975-01-01

    Modern thermal solid waste processing systems are becoming more complex, incorporating features that require instrumentation and control systems to a degree greater than that previously required just for proper combustion control. With the advent of complex, sophisticated, thermal processing systems, TV monitoring and computer control should…

  1. Resistance Is Futile

    ERIC Educational Resources Information Center

    O'Hanlon, Charlene

    2009-01-01

    How odd it should seem that even today, with interactive whiteboards, content management systems, wireless broadband, handhelds, and every sort of sophisticated computing device penetrating and improving the classroom experience for students, David Roh, general manager for Follett Digital Resources can still say, "There are hundreds of…

  2. SoftWAXS: a computational tool for modeling wide-angle X-ray solution scattering from biomolecules.

    PubMed

    Bardhan, Jaydeep; Park, Sanghyun; Makowski, Lee

    2009-10-01

    This paper describes a computational approach to estimating wide-angle X-ray solution scattering (WAXS) from proteins, which has been implemented in a computer program called SoftWAXS. The accuracy and efficiency of SoftWAXS are analyzed for analytically solvable model problems as well as for proteins. Key features of the approach include a numerical procedure for performing the required spherical averaging and explicit representation of the solute-solvent boundary and the surface of the hydration layer. These features allow the Fourier transform of the excluded volume and hydration layer to be computed directly and with high accuracy. This approach will allow future investigation of different treatments of the electron density in the hydration shell. Numerical results illustrate the differences between this approach to modeling the excluded volume and a widely used model that treats the excluded-volume function as a sum of Gaussians representing the individual atomic excluded volumes. Comparison of the results obtained here with those from explicit-solvent molecular dynamics clarifies shortcomings inherent to the representation of solvent as a time-averaged electron-density profile. In addition, an assessment is made of how the calculated scattering patterns depend on input parameters such as the solute-atom radii, the width of the hydration shell and the hydration-layer contrast. These results suggest that obtaining predictive calculations of high-resolution WAXS patterns may require sophisticated treatments of solvent.
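
    For orientation, the simplest point of comparison for a solution-scattering calculation is the textbook Debye formula for an isolated set of scatterers in vacuum, sketched below in Python; unlike SoftWAXS it performs no spherical-averaging quadrature and no explicit excluded-volume or hydration-layer treatment, and the coordinates and flat form factors are illustrative assumptions:

      # Hedged sketch: orientationally averaged I(q) = sum_ij f_i f_j sin(q r_ij)/(q r_ij).
      import numpy as np

      def debye_intensity(coords, f, q_values):
          diffs = coords[:, None, :] - coords[None, :, :]
          r = np.sqrt((diffs ** 2).sum(axis=-1))                 # pairwise distances
          out = []
          for q in q_values:
              qr = q * r
              sinc = np.where(qr > 0, np.sin(qr) / np.maximum(qr, 1e-12), 1.0)
              out.append(float((f[:, None] * f[None, :] * sinc).sum()))
          return np.array(out)

      # Three dummy "atoms" with unit form factors, evaluated over a wide-angle q range.
      coords = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0], [0.0, 2.4, 0.0]])
      q = np.linspace(0.05, 2.0, 5)                              # inverse Angstrom
      print(debye_intensity(coords, np.ones(3), q))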

  3. MCdevelop - a universal framework for Stochastic Simulations

    NASA Astrophysics Data System (ADS)

    Slawinska, M.; Jadach, S.

    2011-03-01

    We present MCdevelop, a universal computer framework for developing and exploiting the wide class of Stochastic Simulations (SS) software. This powerful universal SS software development tool has been derived from a series of scientific projects for precision calculations in high energy physics (HEP), which feature a wide range of functionality in the SS software needed for advanced precision Quantum Field Theory calculations for the past LEP experiments and for the ongoing LHC experiments at CERN, Geneva. MCdevelop is a "spin-off" product of HEP to be exploited in other areas, while it will still serve to develop new SS software for HEP experiments. Typically SS involve independent generation of large sets of random "events", often requiring considerable CPU power. Since SS jobs usually do not share memory, they are easy to parallelize. The efficient development, testing, and parallel running of SS software require a convenient framework to develop software source code, deploy and monitor batch jobs, and merge and analyse results from multiple parallel jobs, even before the production runs are terminated. Throughout the years of development of stochastic simulations for HEP, a sophisticated framework featuring all the above-mentioned functionality has been implemented. MCdevelop represents its latest version, written mostly in C++ (GNU compiler gcc). It uses Autotools to build binaries (optionally managed within the KDevelop 3.5.3 Integrated Development Environment (IDE)). It uses the open-source ROOT package for histogramming, graphics and the mechanism of persistency for the C++ objects. MCdevelop helps to run multiple parallel jobs on any computer cluster with an NQS-type batch system. Program summary: Program title: MCdevelop. Catalogue identifier: AEHW_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHW_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 48 136. No. of bytes in distributed program, including test data, etc.: 355 698. Distribution format: tar.gz. Programming language: ANSI C++. Computer: Any computer system or cluster with C++ compiler and UNIX-like operating system. Operating system: Most UNIX systems, Linux. The application programs were thoroughly tested under Ubuntu 7.04, 8.04 and CERN Scientific Linux 5. Has the code been vectorised or parallelised?: Tools (scripts) for optional parallelisation on a PC farm are included. RAM: 500 bytes. Classification: 11.3. External routines: ROOT package version 5.0 or higher (http://root.cern.ch/drupal/). Nature of problem: Developing any type of stochastic simulation program for high energy physics and other areas. Solution method: Object-oriented programming in C++ with added persistency mechanism, batch scripts for running on PC farms, and Autotools.

  4. Load Balancing Scientific Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pearce, Olga Tkachyshyn

    2014-12-01

    The largest supercomputers have millions of independent processors, and concurrency levels are rapidly increasing. For ideal efficiency, developers of the simulations that run on these machines must ensure that computational work is evenly balanced among processors. Assigning work evenly is challenging because many large modern parallel codes simulate behavior of physical systems that evolve over time, and their workloads change over time. Furthermore, the cost of imbalanced load increases with scale because most large-scale scientific simulations today use a Single Program Multiple Data (SPMD) parallel programming model, and an increasing number of processors will wait for the slowest one at the synchronization points. To address load imbalance, many large-scale parallel applications use dynamic load balance algorithms to redistribute work evenly. The research objective of this dissertation is to develop methods to decide when and how to load balance the application, and to balance it effectively and affordably. We measure and evaluate the computational load of the application, and develop strategies to decide when and how to correct the imbalance. Depending on the simulation, a fast, local load balance algorithm may be suitable, or a more sophisticated and expensive algorithm may be required. We developed a model for comparison of load balance algorithms for a specific state of the simulation that enables the selection of a balancing algorithm that will minimize overall runtime.
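
    The core trade-off described above, whether the time saved by evening out the load outweighs the one-time cost of redistributing work, can be captured in a toy decision rule; the Python sketch below is not the dissertation's model, and all numbers are illustrative assumptions:

      # Toy "when to rebalance" rule: with an SPMD step time set by the slowest processor,
      # rebalancing pays off when the time it would save over the remaining steps exceeds
      # its one-time cost.
      def should_rebalance(loads, remaining_steps, rebalance_cost):
          """loads: per-processor work per step, in seconds."""
          t_max = max(loads)                   # current step time (slowest processor)
          t_ideal = sum(loads) / len(loads)    # step time if work were perfectly even
          saving = (t_max - t_ideal) * remaining_steps
          return saving > rebalance_cost

      loads = [1.0, 1.1, 0.9, 1.8]             # one overloaded processor
      print(should_rebalance(loads, remaining_steps=200, rebalance_cost=30.0))  # True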

  5. Simulation training tools for nonlethal weapons using gaming environments

    NASA Astrophysics Data System (ADS)

    Donne, Alexsana; Eagan, Justin; Tse, Gabriel; Vanderslice, Tom; Woods, Jerry

    2006-05-01

    Modern simulation techniques have a growing role for evaluating new technologies and for developing cost-effective training programs. A mission simulator facilitates the productive exchange of ideas by demonstration of concepts through compellingly realistic computer simulation. Revolutionary advances in 3D simulation technology have made it possible for desktop computers to process strikingly realistic and complex interactions with results depicted in real-time. Computer games now allow for multiple real human players and "artificially intelligent" (AI) simulated robots to play together. Advances in computer processing power have compensated for the inherent intensive calculations required for complex simulation scenarios. The main components of the leading game-engines have been released for user modifications, enabling game enthusiasts and amateur programmers to advance the state-of-the-art in AI and computer simulation technologies. It is now possible to simulate sophisticated and realistic conflict situations in order to evaluate the impact of non-lethal devices as well as conflict resolution procedures using such devices. Simulations can reduce training costs as end users: learn what a device does and doesn't do prior to use, understand responses to the device prior to deployment, determine if the device is appropriate for their situational responses, and train with new devices and techniques before purchasing hardware. This paper will present the status of SARA's mission simulation development activities, based on the Half-Life gameengine, for the purpose of evaluating the latest non-lethal weapon devices, and for developing training tools for such devices.

  6. A qualitative analysis of bus simulator training on transit incidents : a case study in Florida. [Summary].

    DOT National Transportation Integrated Search

    2013-01-01

    The simulator was once a very expensive, large-scale mechanical device for training military pilots or astronauts. Modern computers, linking sophisticated software and large-screen displays, have yielded simulators for the desktop or configured as sm...

  7. Sharing Digital Data

    ERIC Educational Resources Information Center

    Benedis-Grab, Gregory

    2011-01-01

    Computers have changed the landscape of scientific research in profound ways. Technology has always played an important role in scientific experimentation--through the development of increasingly sophisticated tools, the measurement of elusive quantities, and the processing of large amounts of data. However, the advent of social networking and the…

  8. FRAMEWORK FOR EVALUATION OF PHYSIOLOGICALLY-BASED PHARMACOKINETIC MODELS FOR USE IN SAFETY OR RISK ASSESSMENT

    EPA Science Inventory

    Proposed applications of increasingly sophisticated biologically-based computational models, such as physiologically-based pharmacokinetic (PBPK) models, raise the issue of how to evaluate whether the models are adequate for proposed uses including safety or risk ...

  9. Analysis of a Distributed Pulse Power System Using a Circuit Analysis Code

    DTIC Science & Technology

    1979-06-01

    A sophisticated computer code (SCEPTRE), used to analyze electronic circuits, was used to evaluate the performance of a large flash X-ray machine. The dose rate was then integrated to give a number that could be compared with measurements made using thermoluminescent dosimeters (TLDs).

  10. betaFIT: A computer program to fit pointwise potentials to selected analytic functions

    NASA Astrophysics Data System (ADS)

    Le Roy, Robert J.; Pashov, Asen

    2017-01-01

    This paper describes program betaFIT, which performs least-squares fits of sets of one-dimensional (or radial) potential function values to four different types of sophisticated analytic potential energy functional forms. These families of potential energy functions are: the Expanded Morse Oscillator (EMO) potential [J Mol Spectrosc 1999;194:197], the Morse/Long-Range (MLR) potential [Mol Phys 2007;105:663], the Double Exponential/Long-Range (DELR) potential [J Chem Phys 2003;119:7398], and the "Generalized Potential Energy Function (GPEF)" form introduced by Šurkus et al. [Chem Phys Lett 1984;105:291], which includes a wide variety of polynomial potentials, such as the Dunham [Phys Rev 1932;41:713], Simons-Parr-Finlan [J Chem Phys 1973;59:3229], and Ogilvie-Tipping [Proc R Soc A 1991;378:287] polynomials, as special cases. This code will be useful for providing the realistic sets of potential function shape parameters that are required to initiate direct fits of selected analytic potential functions to experimental data, and for providing better analytical representations of sets of ab initio results.
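
    As a rough illustration of the kind of fit betaFIT performs (this is not the betaFIT code itself), the Python sketch below fits synthetic pointwise potential values to a simple Expanded Morse Oscillator form with a two-term beta expansion; the exponent p, the number of beta coefficients, and the synthetic data are illustrative assumptions:

      # Hedged sketch: least-squares fit of pointwise potential values to an EMO form,
      #   V(r) = De * (1 - exp(-beta(r) * (r - re)))**2,  beta(r) = b0 + b1 * y_p(r),
      # with y_p(r) = (r**p - re**p) / (r**p + re**p).
      import numpy as np
      from scipy.optimize import curve_fit

      P = 3

      def emo(r, De, re, b0, b1):
          y = (r**P - re**P) / (r**P + re**P)
          beta = b0 + b1 * y
          return De * (1.0 - np.exp(-beta * (r - re)))**2

      # Synthetic pointwise potential generated from known parameters, then refitted.
      r = np.linspace(1.5, 8.0, 40)
      v = emo(r, De=20000.0, re=2.3, b0=1.1, b1=0.2)
      popt, _ = curve_fit(emo, r, v, p0=[15000.0, 2.0, 1.0, 0.0])
      print(popt)    # recovers approximately [20000, 2.3, 1.1, 0.2]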

  11. Neuraminidase Ribbon Diagram

    NASA Technical Reports Server (NTRS)

    2004-01-01

    Ribbons is a program developed at UAB used worldwide to graphically depict complicated protein structures in a simplified format. The program uses sophisticated computer systems to understand the implications of protein structures. The Influenza virus remains a major causative agent for a large number of deaths among the elderly and young children and huge economic losses due to illness. Finding a cure will have a general impact both on the basic research of viral pathologists studying fast-evolving infectious agents and on the clinical treatment of influenza virus infection. The reproduction process of all strains of influenza is dependent on the same enzyme, neuraminidase. Shown here is a segmented representation of the neuraminidase inhibitor compound sitting inside a cave-like contour of the neuraminidase enzyme surface. This cave-like formation, present in every neuraminidase enzyme, is the active site crucial to the flu's ability to infect. The space-grown crystals of neuraminidase have provided significant new details about the three-dimensional characteristics of this active site, thus allowing researchers to design drugs that fit tighter into the site. Principal Investigator: Dr. Larry DeLucas

  12. Flat-plate solar array project. Volume 8: Project analysis and integration

    NASA Technical Reports Server (NTRS)

    Mcguire, P.; Henry, P.

    1986-01-01

    Project Analysis and Integration (PA&I) performed planning and integration activities to support management of the various Flat-Plate Solar Array (FSA) Project R&D activities. Technical and economic goals were established by PA&I for each R&D task within the project to coordinate the thrust toward the National Photovoltaic Program goals. A sophisticated computer modeling capability was developed to assess technical progress toward meeting the economic goals. These models included a manufacturing facility simulation, a photovoltaic power station simulation and a decision aid model incorporating uncertainty. This family of analysis tools was used to track the progress of the technology and to explore the effects of alternative technical paths. Numerous studies conducted by PA&I signaled the achievement of milestones or were the foundation of major FSA project and national program decisions. The most important PA&I activities during the project history are summarized. The PA&I planning function and how it relates to project direction are discussed, and important analytical models developed by PA&I for its analysis and assessment activities are reviewed.

  13. An Engineering Report in Civil Engineering and Management.

    DTIC Science & Technology

    1987-12-01

    programs as the Apollo program and the Canaveral program. Progress in the late 70s and the 80s has seen advancements in the application of sophisticated...other forces in military operations; subsequent combat service support ashore and defense against overt or clandestine enemy attacks directed toward...construction execution plans; assigns construction projects to NCF units; monitors progress and assures adherence to quality standards: directs

  14. Medical technology advances from space research.

    NASA Technical Reports Server (NTRS)

    Pool, S. L.

    1971-01-01

    NASA-sponsored medical R & D programs for space applications are reviewed with particular attention to the benefits of these programs to earthbound medical services and to the general public. Notable among the results of these NASA programs is an integrated medical laboratory equipped with numerous advanced systems such as digital biotelemetry and automatic visual field mapping systems, sponge electrode caps for electroencephalograms, and sophisticated respiratory analysis equipment.

  15. Teaching, Learning, and Leading with Schools and Communities: Preparing Sophisticated and Resilient Elementary STEM Educators

    ERIC Educational Resources Information Center

    Smetana, Lara K.; Coleman, Elizabeth R.; Ryan, Ann Marie; Tocci, Charles

    2013-01-01

    Loyola University Chicago's Teaching, Learning, and Leading With Schools and Communities (TLLSC) program is an ambitious break from traditional university-based teacher preparation models. This clinically based initial teacher preparation program, fully embedded in local schools and community organizations, takes an ecological perspective on the…

  16. Structural Analysis Made 'NESSUSary'

    NASA Technical Reports Server (NTRS)

    2005-01-01

    Everywhere you look, chances are something that was designed and tested by a computer will be in plain view. Computers are now utilized to design and test just about everything imaginable, from automobiles and airplanes to bridges and boats, and elevators and escalators to streets and skyscrapers. Computer-design engineering first emerged in the 1970s, in the automobile and aerospace industries. Since computers were in their infancy, however, architects and engineers at the time were limited to producing only designs similar to hand-drafted drawings. (At the end of the 1970s, a typical computer-aided design system was a 16-bit minicomputer with a price tag of $125,000.) Eventually, computers became more affordable and related software became more sophisticated, offering designers the "bells and whistles" to go beyond the limits of basic drafting and rendering, and venture into more skillful applications. One of the major advancements was the ability to test the objects being designed for the probability of failure. This advancement was especially important for the aerospace industry, where complicated and expensive structures are designed. The ability to perform reliability and risk assessment without using extensive hardware testing is critical to design and certification. In 1984, NASA initiated the Probabilistic Structural Analysis Methods (PSAM) project at Glenn Research Center to develop analysis methods and computer programs for the probabilistic structural analysis of select engine components for current Space Shuttle and future space propulsion systems. NASA envisioned that these methods and computational tools would play a critical role in establishing increased system performance and durability, and assist in structural system qualification and certification. Not only was the PSAM project beneficial to aerospace, it paved the way for a commercial risk-probability tool that evaluates risks in diverse, down-to-Earth applications.
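
    As a toy illustration of the probabilistic idea behind such tools (not the PSAM or NESSUS codes themselves), the sketch below estimates a probability of failure by Monte Carlo sampling of assumed load and strength distributions; all distribution parameters are illustrative.

```python
# Toy stress-strength reliability estimate: treat applied stress and material
# strength as random variables and count how often strength falls below stress.
import numpy as np

rng = np.random.default_rng(6)
n = 1_000_000
stress = rng.normal(loc=300.0, scale=30.0, size=n)                  # applied stress [MPa], assumed
strength = rng.lognormal(mean=np.log(420.0), sigma=0.08, size=n)    # material strength [MPa], assumed

p_fail = np.mean(strength < stress)
print("estimated probability of failure: %.2e" % p_fail)
```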

  17. A uniform object-oriented solution to the eigenvalue problem for real symmetric and Hermitian matrices

    NASA Astrophysics Data System (ADS)

    Castro, María Eugenia; Díaz, Javier; Muñoz-Caro, Camelia; Niño, Alfonso

    2011-09-01

    We present a system of classes, SHMatrix, to deal in a unified way with the computation of eigenvalues and eigenvectors in real symmetric and Hermitian matrices. Thus, two descendant classes, one for the real symmetric and the other for the Hermitian case, override the abstract methods defined in a base class. The use of the inheritance relationship and polymorphism allows handling objects of any descendant class using a single reference of the base class. The system of classes is intended to be the core element of more sophisticated methods to deal with large eigenvalue problems, such as those arising in the variational treatment of realistic quantum mechanical problems. The present system of classes allows computing a subset of all the possible eigenvalues and, optionally, the corresponding eigenvectors. Comparison with well established solutions for analogous eigenvalue problems, such as those included in LAPACK, shows that the present solution is competitive with them. Program summary: Program title: SHMatrix Catalogue identifier: AEHZ_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHZ_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 2616 No. of bytes in distributed program, including test data, etc.: 127 312 Distribution format: tar.gz Programming language: Standard ANSI C++. Computer: PCs and workstations. Operating system: Linux, Windows. Classification: 4.8. Nature of problem: The treatment of problems involving eigensystems is a central topic in the quantum mechanical field. Here, the use of the variational approach leads to the computation of eigenvalues and eigenvectors of real symmetric and Hermitian Hamiltonian matrices. Realistic models with several degrees of freedom lead to large (sometimes very large) matrices. Different techniques, such as divide and conquer, can be used to factorize the matrices in order to apply a parallel computing approach. However, it is still interesting to have a core procedure able to tackle the computation of eigenvalues and eigenvectors once the matrix has been factorized into pieces of sufficiently small size. Several available software packages, such as LAPACK, tackled this problem under the traditional imperative programming paradigm. In order to ease the modelling of complex quantum mechanical models, it could be interesting to apply an object-oriented approach to the treatment of the eigenproblem. This approach offers the advantage of a single, uniform treatment for the real symmetric and Hermitian cases. Solution method: To reach the above goals, we have developed a system of classes: SHMatrix. SHMatrix is composed of an abstract base class and two descendant classes, one for real symmetric matrices and the other for the Hermitian case. The object-oriented characteristics of inheritance and polymorphism allow handling both cases using a single reference of the base class. The basic computing strategy applied in SHMatrix allows computing subsets of eigenvalues and (optionally) eigenvectors. The tests performed show that SHMatrix is competitive with, and for large matrices more efficient than, the equivalent routines of the LAPACK package. Running time: The examples included in the distribution take only a couple of seconds to run.
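
    The published package is standard ANSI C++; the Python sketch below only mirrors the class design described above (an abstract base class with real-symmetric and Hermitian descendants, optional eigenvalue subsets, polymorphic use through the base class) using stock SciPy routines, so class and method names are illustrative.

```python
# Minimal Python analogue of the base-class / descendant design.
from abc import ABC, abstractmethod
import numpy as np
from scipy.linalg import eigh

class SHMatrixBase(ABC):
    def __init__(self, a):
        self.a = np.asarray(a)
        self._check()

    @abstractmethod
    def _check(self):
        """Verify the matrix belongs to the class (symmetric or Hermitian)."""

    def eigensystem(self, lo=None, hi=None, vectors=True):
        """Return eigenvalues (and optionally eigenvectors) with indices lo..hi."""
        subset = None if lo is None else (lo, hi)
        if vectors:
            return eigh(self.a, subset_by_index=subset)
        return eigh(self.a, subset_by_index=subset, eigvals_only=True)

class RealSymmetric(SHMatrixBase):
    def _check(self):
        assert np.allclose(self.a, self.a.T), "matrix is not symmetric"

class Hermitian(SHMatrixBase):
    def _check(self):
        assert np.allclose(self.a, self.a.conj().T), "matrix is not Hermitian"

# Polymorphic use through the base-class interface.
s = np.array([[2.0, 1.0], [1.0, 3.0]])
h = np.array([[2.0, 1j], [-1j, 3.0]])
for m in (RealSymmetric(s), Hermitian(h)):
    vals = m.eigensystem(lo=0, hi=0, vectors=False)  # lowest eigenvalue only
    print(type(m).__name__, vals)
```

    The polymorphic call through the base-class reference is the point of the design; the numerical work is delegated to LAPACK-backed routines, as in the published code.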

  18. Education in the Information Age.

    ERIC Educational Resources Information Center

    Hay, Lee

    1983-01-01

    This essay considers the revolutionized education of a projected future of cheap and sophisticated technology. Predictions include a redefinition of literacy and basic skills and a restructuring of educational delivery employing computers to dispense information in order to free teachers to work directly with students on cognitive development.…

  19. 12 Math Rules That Expire in the Middle Grades

    ERIC Educational Resources Information Center

    Karp, Karen S.; Bush, Sarah B.; Dougherty, Barbara J.

    2015-01-01

    Many rules taught in mathematics classrooms "expire" when students develop knowledge that is more sophisticated, such as using new number systems. For example, in elementary grades, students are sometimes taught that "addition makes bigger" or "subtraction makes smaller" when learning to compute with whole numbers,…

  20. Symbionic Technology and Education. Report 83-02.

    ERIC Educational Resources Information Center

    Cartwright, Glenn F.

    Research findings indicate that major breakthroughs in education will have to occur through direct cortical intervention, using either chemical or electronic means. It will eventually be possible to build sophisticated intelligence amplifiers that will be internal extensions of our brains, significantly more powerful than present day computers,…

  1. Assessing Higher Order Thinking in Video Games

    ERIC Educational Resources Information Center

    Rice, John

    2007-01-01

    Computer video games have become highly interesting to educators and researchers since their sophistication has improved considerably over the last decade. Studies indicate simple video games touting educational benefits are common in classrooms. However, a need for identifying truly useful games for educational purposes exists. This article…

  2. Interactive Video-Based Industrial Training in Basic Electronics.

    ERIC Educational Resources Information Center

    Mirkin, Barry

    The Wisconsin Foundation for Vocational, Technical, and Adult Education is currently involved in the development, implementation, and distribution of a sophisticated interactive computer and video learning system. Designed to offer trainees an open entry and open exit opportunity to pace themselves through a comprehensive competency-based,…

  3. Computational Modeling and Treatment Identification in the Myelodysplastic Syndromes.

    PubMed

    Drusbosky, Leylah M; Cogle, Christopher R

    2017-10-01

    This review discusses the need for computational modeling in myelodysplastic syndromes (MDS) and early test results. As our evolving understanding of MDS reveals a molecularly complicated disease, sophisticated computer analytics are required to keep track of the number of molecular abnormalities and their complex interplay. Computational modeling and digital drug simulations using whole exome sequencing data input have produced early results showing high accuracy in predicting treatment response to standard of care drugs. Furthermore, the computational MDS models serve as clinically relevant MDS cell lines for pre-clinical assays of investigational agents. MDS is an ideal disease for computational modeling and digital drug simulations. Current research is focused on establishing the prediction value of computational modeling. Future research will test the clinical advantage of computer-informed therapy in MDS.

  4. Climate Models

    NASA Technical Reports Server (NTRS)

    Druyan, Leonard M.

    2012-01-01

    Climate models is a very broad topic, so a single volume can only offer a small sampling of relevant research activities. This volume of 14 chapters includes descriptions of a variety of modeling studies for a variety of geographic regions by an international roster of authors. The climate research community generally uses the rubric climate models to refer to organized sets of computer instructions that produce simulations of climate evolution. The code is based on physical relationships that describe the shared variability of meteorological parameters such as temperature, humidity, precipitation rate, circulation, radiation fluxes, etc. Three-dimensional climate models are integrated over time in order to compute the temporal and spatial variations of these parameters. Model domains can be global or regional and the horizontal and vertical resolutions of the computational grid vary from model to model. Considering the entire climate system requires accounting for interactions between solar insolation, atmospheric, oceanic and continental processes, the latter including land hydrology and vegetation. Model simulations may concentrate on one or more of these components, but the most sophisticated models will estimate the mutual interactions of all of these environments. Advances in computer technology have prompted investments in more complex model configurations that consider more phenomena interactions than were possible with yesterday's computers. However, not every attempt to add to the computational layers is rewarded by better model performance. Extensive research is required to test and document any advantages gained by greater sophistication in model formulation. One purpose for publishing climate model research results is to present purported advances for evaluation by the scientific community.

  5. An Ecological Framework for Cancer Communication: Implications for Research

    PubMed Central

    Intille, Stephen S; Zabinski, Marion F

    2005-01-01

    The field of cancer communication has undergone a major revolution as a result of the Internet. As recently as the early 1990s, face-to-face, print, and the telephone were the dominant methods of communication between health professionals and individuals in support of the prevention and treatment of cancer. Computer-supported interactive media existed, but this usually required sophisticated computer and video platforms that limited availability. The introduction of point-and-click interfaces for the Internet dramatically improved the ability of non-expert computer users to obtain and publish information electronically on the Web. Demand for Web access has driven computer sales for the home setting and improved the availability, capability, and affordability of desktop computers. New advances in information and computing technologies will lead to similarly dramatic changes in the affordability and accessibility of computers. Computers will move from the desktop into the environment and onto the body. Computers are becoming smaller, faster, more sophisticated, more responsive, less expensive, and—essentially—ubiquitous. Computers are evolving into much more than desktop communication devices. New computers include sensing, monitoring, geospatial tracking, just-in-time knowledge presentation, and a host of other information processes. The challenge for cancer communication researchers is to acknowledge the expanded capability of the Web and to move beyond the approaches to health promotion, behavior change, and communication that emerged during an era when language- and image-based interpersonal and mass communication strategies predominated. Ecological theory has been advanced since the early 1900s to explain the highly complex relationships among individuals, society, organizations, the built and natural environments, and personal and population health and well-being. This paper provides background on ecological theory, advances an Ecological Model of Internet-Based Cancer Communication intended to broaden the vision of potential uses of the Internet for cancer communication, and provides some examples of how such a model might inform future research and development in cancer communication. PMID:15998614

  6. 15 CFR 752.11 - Internal Control Programs.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... order for several sophisticated lasers. (C) The product ordered is incompatible with the technical level... equipment would be of little use in a country without an electronics industry. (D) The customer has little...

  7. 15 CFR 752.11 - Internal Control Programs.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... order for several sophisticated lasers. (C) The product ordered is incompatible with the technical level... equipment would be of little use in a country without an electronics industry. (D) The customer has little...

  8. 15 CFR 752.11 - Internal Control Programs.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... order for several sophisticated lasers. (C) The product ordered is incompatible with the technical level... equipment would be of little use in a country without an electronics industry. (D) The customer has little...

  9. 15 CFR 752.11 - Internal Control Programs.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... order for several sophisticated lasers. (C) The product ordered is incompatible with the technical level... equipment would be of little use in a country without an electronics industry. (D) The customer has little...

  10. pH-programmable DNA logic arrays powered by modular DNAzyme libraries.

    PubMed

    Elbaz, Johann; Wang, Fuan; Remacle, Francoise; Willner, Itamar

    2012-12-12

    Nature performs complex information processing circuits, such as the programmed transformations of versatile stem cells into targeted functional cells. Man-made molecular circuits are, however, unable to mimic such sophisticated biomachineries. To reach these goals, it is essential to construct programmable modular components that can be triggered by environmental stimuli to perform different logic circuits. We report on the unprecedented design of artificial pH-programmable DNA logic arrays, constructed by modular libraries of Mg(2+)- and UO(2)(2+)-dependent DNAzyme subunits and their substrates. By the appropriate modular design of the DNA computation units, pH-programmable logic arrays of various complexities are realized, and the arrays can be erased, reused, and/or reprogrammed. Such systems may be implemented in the near future for nanomedical applications by pH-controlled regulation of cellular functions or may be used to control biotransformations stimulated by bacteria.

  11. Coordinated scheduling for dynamic real-time systems

    NASA Technical Reports Server (NTRS)

    Natarajan, Swaminathan; Zhao, Wei

    1994-01-01

    In this project, we addressed issues in coordinated scheduling for dynamic real-time systems. In particular, we concentrated on design and implementation of a new distributed real-time system called R-Shell. The design objective of R-Shell is to provide computing support for space programs that have large, complex, fault-tolerant distributed real-time applications. In R-Shell, the approach is based on the concept of scheduling agents, which reside in the application run-time environment, and are customized to provide just those resource management functions which are needed by the specific application. With this approach, we avoid the need for a sophisticated OS which provides a variety of generalized functionality, while still not burdening application programmers with heavy responsibility for resource management. In this report, we discuss the R-Shell approach, summarize the achievements of the project, and describe a preliminary prototype of the R-Shell system.

  12. Specificity of software cooperating with an optoelectronic sensor in the pulse oximeter system

    NASA Astrophysics Data System (ADS)

    Cysewska-Sobusiak, Anna; Wiczynski, Grzegorz; Jedwabny, Tomasz

    1995-06-01

    The specificity of a software package composed of two parts that control an optoelectronic sensor in a computer-aided system for noninvasive measurement of arterial blood oxygen saturation, as well as of selected parameters of the peripheral pulse waveform, is described. The transmission variant of pulse oximetry, the only noninvasive measurement method of its kind, is utilized. The software coordinates the cooperation of an IBM PC compatible microcomputer with the sensor and one specialized card. This novel card is the key part of the whole measuring system, whose fields of application are extended in comparison with commonly available pulse oximeters. The user-friendly MS Windows graphical environment, which makes the system multitasking and non-preemptive, was used to design the specific part of the software presented here. With this environment, sophisticated tasks of the software package can be performed without excessive complication.
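
    The sketch below is not the described package; it only illustrates the core ratio-of-ratios computation that pulse-oximetry software of this kind performs. The linear calibration used is a common textbook approximation, and the photoplethysmogram traces are synthetic.

```python
# Ratio-of-ratios SpO2 estimate from red and infrared photoplethysmograms.
import numpy as np

def spo2_from_ppg(red, infrared):
    """Estimate oxygen saturation from two photoplethysmographic traces."""
    def ac_dc(x):
        return (np.max(x) - np.min(x)), np.mean(x)   # pulsatile and baseline parts
    ac_r, dc_r = ac_dc(red)
    ac_ir, dc_ir = ac_dc(infrared)
    ratio = (ac_r / dc_r) / (ac_ir / dc_ir)
    return 110.0 - 25.0 * ratio                       # empirical, approximate calibration

# Synthetic one-pulse traces standing in for sensor samples.
t = np.linspace(0, 1, 200)
red = 2.00 + 0.020 * np.sin(2 * np.pi * t)            # small pulsatile component
infrared = 1.50 + 0.025 * np.sin(2 * np.pi * t)       # relatively larger AC/DC ratio
print("estimated SpO2: %.1f %%" % spo2_from_ppg(red, infrared))
```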

  13. A Study of Upgraded Phenolic Curing for RSRM Nozzle Rings

    NASA Technical Reports Server (NTRS)

    Smartt, Ziba

    2000-01-01

    A thermochemical cure model for predicting temperature and degree of cure profiles in curing phenolic parts was developed, validated and refined over several years. The model supports optimization of cure cycles and allows input of properties based upon the types of material and the process by which these materials are used to make nozzle components. The model has been refined to use sophisticated computer graphics to demonstrate the changes in temperature and degree of cure during the curing process. The effort discussed in the paper will be the conversion from an outdated solid modeling input program and SINDA analysis code to an integrated solid modeling and analysis package (I-DEAS solid model and TMG). Also discussed will be the incorporation of updated material properties obtained during full scale curing tests into the cure models and the results for all the Reusable Solid Rocket Motor (RSRM) nozzle rings.
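
    As a small illustration of the cure-kinetics component of such a model (not the RSRM model itself; the kinetic constants and the cure cycle below are assumed values, not measured properties), an nth-order Arrhenius rate law can be integrated through a simple ramp-and-hold temperature schedule.

```python
# Integrate d(alpha)/dt = A * exp(-Ea/RT) * (1 - alpha)**n over a cure cycle.
import numpy as np

A, Ea, n_order = 1.0e7, 80_000.0, 1.5   # pre-exponential [1/s], activation energy [J/mol], order (assumed)
R = 8.314

def cure_profile(times, temps, dt=1.0):
    """Degree-of-cure history for a piecewise-linear temperature schedule."""
    t_grid = np.arange(0, times[-1], dt)
    T_grid = np.interp(t_grid, times, temps)
    alpha, history = 0.0, []
    for T in T_grid:
        rate = A * np.exp(-Ea / (R * T)) * (1.0 - alpha) ** n_order
        alpha = min(alpha + rate * dt, 1.0)
        history.append(alpha)
    return t_grid, np.array(history)

# Ramp to 420 K over 2 h, then hold 4 h (times in seconds, temperatures in kelvin).
t, a = cure_profile(times=[0, 7200, 21600], temps=[300.0, 420.0, 420.0])
print("degree of cure after ramp: %.2f, at end of hold: %.2f" % (a[7199], a[-1]))
```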

  14. Determining open cluster membership. A Bayesian framework for quantitative member classification

    NASA Astrophysics Data System (ADS)

    Stott, Jonathan J.

    2018-01-01

    Aims: My goal is to develop a quantitative algorithm for assessing open cluster membership probabilities. The algorithm is designed to work with single-epoch observations. In its simplest form, only one set of program images and one set of reference images are required. Methods: The algorithm is based on a two-stage joint astrometric and photometric assessment of cluster membership probabilities. The probabilities were computed within a Bayesian framework using any available prior information. Where possible, the algorithm emphasizes simplicity over mathematical sophistication. Results: The algorithm was implemented and tested against three observational fields using published survey data. M 67 and NGC 654 were selected as cluster examples while a third, cluster-free, field was used for the final test data set. The algorithm shows good quantitative agreement with the existing surveys and has a false-positive rate significantly lower than the astrometric or photometric methods used individually.
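
    A minimal sketch of the Bayesian combination described above (not the paper's code; the cluster and field model parameters, and the use of a single photometric offset, are illustrative assumptions) might look like this:

```python
# Combine an astrometric (proper-motion) likelihood and a photometric
# (offset-from-isochrone) likelihood for "cluster" vs. "field" models,
# then apply Bayes' theorem with a prior cluster fraction.
import numpy as np
from scipy.stats import multivariate_normal, norm

def membership_prob(pm, dcolor, prior_cluster=0.3):
    """pm: (2,) proper motion [mas/yr]; dcolor: offset from the cluster isochrone [mag]."""
    # Astrometric models: tight cluster clump vs. broad field distribution (assumed values).
    L_ast_c = multivariate_normal(mean=[-4.0, 2.0], cov=np.diag([0.5, 0.5])).pdf(pm)
    L_ast_f = multivariate_normal(mean=[0.0, 0.0], cov=np.diag([25.0, 25.0])).pdf(pm)
    # Photometric models: members sit near the isochrone, field stars scatter widely.
    L_pho_c = norm(loc=0.0, scale=0.05).pdf(dcolor)
    L_pho_f = norm(loc=0.0, scale=0.5).pdf(dcolor)
    num = prior_cluster * L_ast_c * L_pho_c
    den = num + (1.0 - prior_cluster) * L_ast_f * L_pho_f
    return num / den

print(membership_prob(np.array([-3.8, 2.1]), 0.02))   # likely member
print(membership_prob(np.array([1.5, -0.7]), 0.30))   # likely field star
```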

  15. Challenges facing the development of the Arabic chatbot

    NASA Astrophysics Data System (ADS)

    AlHagbani, Eman Saad; Khan, Muhammad Badruddin

    2016-07-01

    Future information systems are expected to be more intelligent: they will take human queries in natural language as input and answer them promptly. Developing a chatbot, a computer program that can chat with humans realistically enough that they get the impression of talking with another human, is a challenging task. Building such chatbots requires different technologies to work together, ranging from artificial intelligence to the development of semantic resources. Sophisticated chatbots have been developed to perform conversation in a number of languages. Arabic chatbots can be helpful in automating many operations and in serving people who only know Arabic. However, the technology for Arabic is still in its infancy due to several challenges surrounding the language. This paper offers an overview of the chatbot application and the obstacles and challenges that need to be resolved to develop an effective Arabic chatbot.

  16. Root Traits and Phenotyping Strategies for Plant Improvement

    PubMed Central

    Paez-Garcia, Ana; Motes, Christy M.; Scheible, Wolf-Rüdiger; Chen, Rujin; Blancaflor, Elison B.; Monteros, Maria J.

    2015-01-01

    Roots are crucial for nutrient and water acquisition and can be targeted to enhance plant productivity under a broad range of growing conditions. A current challenge for plant breeding is the limited ability to phenotype and select for desirable root characteristics due to their underground location. Plant breeding efforts aimed at modifying root traits can result in novel, more stress-tolerant crops and increased yield by enhancing the capacity of the plant for soil exploration and, thus, water and nutrient acquisition. Available approaches for root phenotyping in laboratory, greenhouse and field encompass simple agar plates to labor-intensive root digging (i.e., shovelomics) and soil boring methods, the construction of underground root observation stations and sophisticated computer-assisted root imaging. Here, we summarize root architectural traits relevant to crop productivity, survey root phenotyping strategies and describe their advantages, limitations and practical value for crop and forage breeding programs. PMID:27135332

  17. Root Traits and Phenotyping Strategies for Plant Improvement.

    PubMed

    Paez-Garcia, Ana; Motes, Christy M; Scheible, Wolf-Rüdiger; Chen, Rujin; Blancaflor, Elison B; Monteros, Maria J

    2015-06-15

    Roots are crucial for nutrient and water acquisition and can be targeted to enhance plant productivity under a broad range of growing conditions. A current challenge for plant breeding is the limited ability to phenotype and select for desirable root characteristics due to their underground location. Plant breeding efforts aimed at modifying root traits can result in novel, more stress-tolerant crops and increased yield by enhancing the capacity of the plant for soil exploration and, thus, water and nutrient acquisition. Available approaches for root phenotyping in laboratory, greenhouse and field encompass simple agar plates to labor-intensive root digging (i.e., shovelomics) and soil boring methods, the construction of underground root observation stations and sophisticated computer-assisted root imaging. Here, we summarize root architectural traits relevant to crop productivity, survey root phenotyping strategies and describe their advantages, limitations and practical value for crop and forage breeding programs.

  18. QMCPACK: an open source ab initio quantum Monte Carlo package for the electronic structure of atoms, molecules and solids.

    PubMed

    Kim, Jeongnim; Baczewski, Andrew T; Beaudet, Todd D; Benali, Anouar; Bennett, M Chandler; Berrill, Mark A; Blunt, Nick S; Borda, Edgar Josué Landinez; Casula, Michele; Ceperley, David M; Chiesa, Simone; Clark, Bryan K; Clay, Raymond C; Delaney, Kris T; Dewing, Mark; Esler, Kenneth P; Hao, Hongxia; Heinonen, Olle; Kent, Paul R C; Krogel, Jaron T; Kylänpää, Ilkka; Li, Ying Wai; Lopez, M Graham; Luo, Ye; Malone, Fionn D; Martin, Richard M; Mathuriya, Amrita; McMinis, Jeremy; Melton, Cody A; Mitas, Lubos; Morales, Miguel A; Neuscamman, Eric; Parker, William D; Pineda Flores, Sergio D; Romero, Nichols A; Rubenstein, Brenda M; Shea, Jacqueline A R; Shin, Hyeondeok; Shulenburger, Luke; Tillack, Andreas F; Townsend, Joshua P; Tubman, Norm M; Van Der Goetz, Brett; Vincent, Jordan E; Yang, D ChangMo; Yang, Yubo; Zhang, Shuai; Zhao, Luning

    2018-05-16

    QMCPACK is an open source quantum Monte Carlo package for ab initio electronic structure calculations. It supports calculations of metallic and insulating solids, molecules, atoms, and some model Hamiltonians. Implemented real space quantum Monte Carlo algorithms include variational, diffusion, and reptation Monte Carlo. QMCPACK uses Slater-Jastrow type trial wavefunctions in conjunction with a sophisticated optimizer capable of optimizing tens of thousands of parameters. The orbital space auxiliary-field quantum Monte Carlo method is also implemented, enabling cross validation between different highly accurate methods. The code is specifically optimized for calculations with large numbers of electrons on the latest high performance computing architectures, including multicore central processing unit and graphical processing unit systems. We detail the program's capabilities, outline its structure, and give examples of its use in current research calculations. The package is available at http://qmcpack.org.
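
    For orientation, the textbook-scale sketch below shows the variational Monte Carlo idea that packages such as QMCPACK scale up to thousands of wavefunction parameters. It samples a one-parameter hydrogen-atom trial function with Metropolis moves; it is in no way part of QMCPACK, and all step sizes and sample counts are illustrative.

```python
# Variational Monte Carlo for hydrogen with trial function psi(r) = exp(-alpha*r):
# sample |psi|^2 and average the local energy E_L = -alpha^2/2 + (alpha-1)/r.
import numpy as np

def local_energy(r, alpha):
    return -0.5 * alpha**2 + (alpha - 1.0) / r

def vmc_energy(alpha, n_steps=100_000, step=0.5, seed=7):
    rng = np.random.default_rng(seed)
    pos = np.array([0.5, 0.5, 0.5])
    r_cur = np.linalg.norm(pos)
    energies = []
    for i in range(n_steps):
        trial = pos + rng.uniform(-step, step, 3)
        r_new = np.linalg.norm(trial)
        # Metropolis acceptance for |psi|^2 = exp(-2*alpha*r).
        if rng.random() < np.exp(-2.0 * alpha * (r_new - r_cur)):
            pos, r_cur = trial, r_new
        if i > 5_000:                      # discard equilibration steps
            energies.append(local_energy(r_cur, alpha))
    return np.mean(energies)

for a in (0.8, 1.0, 1.2):
    print("alpha = %.1f  ->  <E> = %.4f hartree" % (a, vmc_energy(a)))
```

    At alpha = 1 the trial function is exact and the average local energy should reproduce -0.5 hartree to within statistical error.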

  19. Improving Access to Data While Protecting Confidentiality: Prospects for the Future.

    ERIC Educational Resources Information Center

    Duncan, George T.; Pearson, Robert W.

    Providing researchers, especially those in the social sciences, with access to publicly collected microdata furthers research while advancing public policy goals in a democratic society. However, while technological improvements have eased remote access to these databases and enabled computer-using researchers to perform sophisticated statistical…

  20. MODELS-3 (CMAQ). NARSTO NEWS (VOL. 3, NO. 2, SUMMER/FALL 1999)

    EPA Science Inventory

    A revised version of the U.S. EPA's Models-3/CMAQ system was released on June 30, 1999. Models-3 consists of a sophisticated computational framework for environmental models allowing for much flexibility in the communications between component parts of the system, in updating or ...

  1. Analysis of an Anti-Phishing Lab Activity

    ERIC Educational Resources Information Center

    Werner, Laurie A.; Courte, Jill

    2010-01-01

    Despite advances in spam detection software, anti-spam laws, and increasingly sophisticated users, the number of successful phishing scams continues to grow. In addition to monetary losses attributable to phishing, there is also a loss of confidence that stifles use of online services. Using in-class activities in an introductory computer course…

  2. Artificial Intelligence Applications in Special Education: How Feasible? Final Report.

    ERIC Educational Resources Information Center

    Hofmeister, Alan M.; Ferrara, Joseph M.

    The research project investigated whether expert system tools have become sophisticated enough to be applied efficiently to problems in special education. (Expert systems are a development of artificial intelligence that combines the computer's capacity for storing specialized knowledge with a general set of rules intended to replicate the…

  3. Instructional Design Considerations in Converting Non-CBT Materials into CBT Courses.

    ERIC Educational Resources Information Center

    Ng, Raymond

    Instructional designers who are asked to convert existing training materials into computer-based training (CBT) must take special precautions to avoid making the product into a sophisticated page turner. Although conversion may save considerable time on subject research and analysis, courses to be delivered through microcomputers may require…

  4. Detecting Satisficing in Online Surveys

    ERIC Educational Resources Information Center

    Salifu, Shani

    2012-01-01

    The proliferation of computers and high-speed internet services is making online activities an integral part of people's lives as they connect with friends, shop, and exchange data. The increasing ability of the internet to handle sophisticated data exchanges is endearing it to researchers interested in gathering all kinds of data. This method has the…

  5. Introduction to Autonomous Mobile Robotics Using "Lego Mindstorms" NXT

    ERIC Educational Resources Information Center

    Akin, H. Levent; Meriçli, Çetin; Meriçli, Tekin

    2013-01-01

    Teaching the fundamentals of robotics to computer science undergraduates requires designing a well-balanced curriculum that is complemented with hands-on applications on a platform that allows rapid construction of complex robots, and implementation of sophisticated algorithms. This paper describes such an elective introductory course where the…

  6. Data management in the mission data system

    NASA Technical Reports Server (NTRS)

    Wagner, David A.

    2005-01-01

    As spacecraft evolve from simple embedded devices to become more sophisticated computing platforms with complex behaviors it is increasingly necessary to model and manage the flow of data, and to provide uniform models for managing data that promote adaptability, yet pay heed to the physical limitations of the embedded and space environments.

  7. Technology Acceptance in Social Work Education: Implications for the Field Practicum

    ERIC Educational Resources Information Center

    Colvin, Alex Don; Bullock, Angela N.

    2014-01-01

    The exponential growth and sophistication of new information and computer technology (ICT) have greatly influenced human interactions and provided new metaphors for understanding the world. The acceptance and integration of ICT into social work field education are examined here using the technological acceptance model. This article also explores…

  8. Models and Methodologies for Multimedia Courseware Production.

    ERIC Educational Resources Information Center

    Barker, Philip; Giller, Susan

    Many new technologies are now available for delivering and/or providing access to computer-based learning (CBL) materials. These technologies vary in sophistication in many important ways, depending upon the bandwidth that they provide, the interactivity that they offer and the types of end-user connectivity that they support. Invariably,…

  9. A Primer on High-Throughput Computing for Genomic Selection

    PubMed Central

    Wu, Xiao-Lin; Beissinger, Timothy M.; Bauck, Stewart; Woodward, Brent; Rosa, Guilherme J. M.; Weigel, Kent A.; Gatti, Natalia de Leon; Gianola, Daniel

    2011-01-01

    High-throughput computing (HTC) uses computer clusters to solve advanced computational problems, with the goal of accomplishing high throughput over relatively long periods of time. In genomic selection, for example, a set of markers covering the entire genome is used to train a model based on known data, and the resulting model is used to predict the genetic merit of selection candidates. Sophisticated models are very computationally demanding and, with several traits to be evaluated sequentially, computing time is long, and output is low. In this paper, we present scenarios and basic principles of how HTC can be used in genomic selection, implemented using various techniques from simple batch processing to pipelining in distributed computer clusters. Various scripting languages, such as shell scripting, Perl, and R, are also very useful to devise pipelines. By pipelining, we can reduce total computing time and consequently increase throughput. In comparison to the traditional data processing pipeline residing on the central processors, performing general-purpose computation on a graphics processing unit provides a new-generation approach to massive parallel computing in genomic selection. While the concept of HTC may still be new to many researchers in animal breeding, plant breeding, and genetics, HTC infrastructures have already been built in many institutions, such as the University of Wisconsin–Madison, which can be leveraged for genomic selection, in terms of central processing unit capacity, network connectivity, storage availability, and middleware connectivity. Exploring existing HTC infrastructures as well as general-purpose computing environments will further expand our capability to meet increasing computing demands posed by unprecedented genomic data that we have today. We anticipate that HTC will impact genomic selection via better statistical models, faster solutions, and more competitive products (e.g., from design of marker panels to realized genetic gain). Eventually, HTC may change our view of data analysis as well as decision-making in the post-genomic era of selection programs in animals and plants, or in the study of complex diseases in humans. PMID:22303303
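
    A minimal sketch of the "many independent jobs" pattern described above, run on a single machine's process pool rather than through a real batch system such as HTCondor; the ridge-regression model, marker matrix, and trait count are illustrative stand-ins for a genomic prediction workload.

```python
# Fit one ridge-regression model per trait in separate worker processes.
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def fit_trait(args):
    markers, phenotype, lam = args
    # Ridge solution beta = (X'X + lam*I)^-1 X'y, a simple GBLUP-like stand-in.
    xtx = markers.T @ markers + lam * np.eye(markers.shape[1])
    return np.linalg.solve(xtx, markers.T @ phenotype)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.choice([0, 1, 2], size=(500, 2000)).astype(float)        # 500 animals x 2000 SNPs
    traits = [X @ rng.normal(scale=0.05, size=2000) + rng.normal(size=500)
              for _ in range(4)]                                      # 4 traits evaluated in parallel
    jobs = [(X, y, 10.0) for y in traits]
    with ProcessPoolExecutor() as pool:
        betas = list(pool.map(fit_trait, jobs))
    print("fitted", len(betas), "trait models; first effect vector shape:", betas[0].shape)
```

    On a cluster, each job would instead be submitted to the scheduler and the results gathered afterwards; the decomposition into independent tasks is the same.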

  10. Retention and application of Skylab experiences to future programs. [A postflight review of technical programs]

    NASA Technical Reports Server (NTRS)

    Gillespie, V. G.; Kelly, R. O.

    1974-01-01

    The problems encountered and special techniques and procedures developed on the Skylab program are described along with the experiences and practical benefits obtained for dissemination and use on future programs. Three major topics are discussed: electrical problems, mechanical problems, and special techniques. Special techniques and procedures are identified that were either developed or refined during the Skylab program. These techniques and procedures came from all manufacturing and test phases of the Skylab program and include both flight and GSE items from component level to sophisticated spaceflight systems.

  11. A Sequence of Sorting Strategies.

    ERIC Educational Resources Information Center

    Duncan, David R.; Litwiller, Bonnie H.

    1984-01-01

    Describes eight increasingly sophisticated and efficient sorting algorithms including linear insertion, binary insertion, shellsort, bubble exchange, shakersort, quick sort, straight selection, and tree selection. Provides challenges for the reader and the student to program these efficiently. (JM)

  12. Natural three-qubit interactions in one-way quantum computing

    NASA Astrophysics Data System (ADS)

    Tame, M. S.; Paternostro, M.; Kim, M. S.; Vedral, V.

    2006-02-01

    We address the effects of natural three-qubit interactions on the computational power of one-way quantum computation. A benefit of using more sophisticated entanglement structures is the ability to construct compact and economic simulations of quantum algorithms with limited resources. We show that the features of our study are embodied by suitably prepared optical lattices, where effective three-spin interactions have been theoretically demonstrated. We use this to provide a compact construction for the Toffoli gate. Information flow and two-qubit interactions are also outlined, together with a brief analysis of relevant sources of imperfection.

  13. The Southern California Earthquake Center/Undergraduate Studies in Earthquake Information Technology (SCEC/UseIT) Internship Program

    NASA Astrophysics Data System (ADS)

    Perry, S.; Jordan, T.

    2006-12-01

    Our undergraduate research program, SCEC/UseIT, an NSF Research Experience for Undergraduates site, provides software for earthquake researchers and educators, movies for outreach, and ways to strengthen the technical career pipeline. SCEC/UseIT motivates diverse undergraduates towards science and engineering careers through team-based research in the exciting field of earthquake information technology. UseIT provides the cross-training in computer science/information technology (CS/IT) and geoscience needed to make fundamental progress in earthquake system science. Our high and increasing participation of women and minority students is crucial given the nation's precipitous enrollment declines in CS/IT undergraduate degree programs, especially among women. UseIT also casts a "wider, farther" recruitment net that targets scholars interested in creative work but not traditionally attracted to summer science internships. Since 2002, SCEC/UseIT has challenged 79 students in three dozen majors from as many schools with difficult, real-world problems that require collaborative, interdisciplinary solutions. Interns design and engineer open-source software, creating increasingly sophisticated visualization tools (see "SCEC-VDO," session IN11), which are employed by SCEC researchers, in new curricula at the University of Southern California, and by outreach specialists who make animated movies for the public and the media. SCEC-VDO would be a valuable tool for research-oriented professional development programs.

  14. Using a graphical programming language to write CAMAC/GPIB instrument drivers

    NASA Technical Reports Server (NTRS)

    Zambrana, Horacio; Johanson, William

    1991-01-01

    To reduce the complexities of conventional programming, graphical software was used in the development of instrumentation drivers. The graphical software provides a standard set of tools (graphical subroutines) which are sufficient to program the most sophisticated CAMAC/GPIB drivers. These tools were used and instrumentation drivers were successfully developed for operating CAMAC/GPIB hardware from two different manufacturers: LeCroy and DSP. The use of these tools is presented for programming a LeCroy A/D Waveform Analyzer.

  15. A Low Cost Remote Sensing System Using PC and Stereo Equipment

    NASA Technical Reports Server (NTRS)

    Campbell, Joel F.; Flood, Michael A.; Prasad, Narasimha S.; Hodson, Wade D.

    2011-01-01

    A system using a personal computer, speaker, and a microphone is used to detect objects, and make crude measurements using a carrier modulated by a pseudorandom noise (PN) code. This system can be constructed using a personal computer and audio equipment commonly found in the laboratory or at home, or more sophisticated equipment that can be purchased at reasonable cost. We demonstrate its value as an instructional tool for teaching concepts of remote sensing and digital signal processing.
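
    A sketch of the underlying correlation-ranging idea follows (not the authors' system; the sample rate, chip length, echo amplitude, delay, and noise level are assumed values): the received signal is correlated against the transmitted pseudorandom noise code, and the correlation peak gives the round-trip delay.

```python
# PN-code ranging by cross-correlation.
import numpy as np

def lfsr_mseq(taps=(7, 6), nbits=7):
    """Maximal-length +/-1 sequence from a simple linear-feedback shift register."""
    state = np.ones(nbits, dtype=int)
    out = []
    for _ in range(2**nbits - 1):
        out.append(state[-1])
        fb = np.bitwise_xor.reduce(state[[t - 1 for t in taps]])
        state = np.roll(state, 1)
        state[0] = fb
    return 2 * np.array(out) - 1

fs = 44100                                  # audio sample rate (speaker/microphone assumed)
code = np.repeat(lfsr_mseq(), 8)            # 8 samples per chip
true_delay = 230                            # round-trip delay in samples (unknown in practice)
rng = np.random.default_rng(2)
rx = np.zeros(len(code) + 1000)
rx[true_delay:true_delay + len(code)] += 0.4 * code   # weak echo
rx += rng.normal(scale=0.5, size=rx.size)             # microphone noise

corr = np.correlate(rx, code, mode="valid")
est_delay = int(np.argmax(corr))
speed_of_sound = 343.0
print("estimated delay:", est_delay, "samples;",
      "round-trip distance ~", round(est_delay / fs * speed_of_sound, 2), "m")
```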

  16. LCG MCDB—a knowledgebase of Monte-Carlo simulated events

    NASA Astrophysics Data System (ADS)

    Belov, S.; Dudko, L.; Galkin, E.; Gusev, A.; Pokorski, W.; Sherstnev, A.

    2008-02-01

    In this paper we report on the LCG Monte-Carlo Data Base (MCDB) and the software which has been developed to operate MCDB. The main purpose of the LCG MCDB project is to provide a storage and documentation system for sophisticated event samples simulated for the LHC Collaborations by experts. In many cases, the modern Monte-Carlo simulation of physical processes requires expert knowledge in Monte-Carlo generators or a significant amount of CPU time to produce the events. MCDB is a knowledgebase mainly dedicated to accumulating simulated events of this type. The main motivation behind LCG MCDB is to make the sophisticated MC event samples available for various physical groups. All the data from MCDB is accessible in several convenient ways. LCG MCDB is being developed within the CERN LCG Application Area Simulation project. Program summary: Program title: LCG Monte-Carlo Data Base Catalogue identifier: ADZX_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADZX_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public Licence No. of lines in distributed program, including test data, etc.: 30 129 No. of bytes in distributed program, including test data, etc.: 216 943 Distribution format: tar.gz Programming language: Perl Computer: CPU: Intel Pentium 4, RAM: 1 Gb, HDD: 100 Gb Operating system: Scientific Linux CERN 3/4 RAM: 1 073 741 824 bytes (1 Gb) Classification: 9 External routines: perl >= 5.8.5; Perl modules DBD-mysql >= 2.9004, File::Basename, GD::SecurityImage, GD::SecurityImage::AC, Linux::Statistics, XML::LibXML > 1.6, XML::SAX, XML::NamespaceSupport; Apache HTTP Server >= 2.0.59; mod auth external >= 2.2.9; edg-utils-system RPM package; gd >= 2.0.28; rpm package CASTOR-client >= 2.1.2-4; arc-server (optional) Nature of problem: Often, different groups of experimentalists prepare similar samples of particle collision events or turn to the same group of authors of Monte-Carlo (MC) generators to prepare the events. For example, the same MC samples of Standard Model (SM) processes can be employed for the investigations either in the SM analyses (as a signal) or in searches for new phenomena in Beyond Standard Model analyses (as a background). If the samples are made available publicly and equipped with corresponding and comprehensive documentation, it can speed up cross checks of the samples themselves and of the physical models applied. Some event samples require a lot of computing resources for preparation. A central storage of the samples therefore prevents the waste of researcher time and computing resources that would otherwise be spent preparing the same events many times. Solution method: Creation of a special knowledgebase (MCDB) designed to keep event samples for the LHC experimental and phenomenological community. The knowledgebase is realized as a separate web-server (http://mcdb.cern.ch). All event samples are kept on tapes at CERN. Documentation describing the events is the main content of MCDB. Users can browse the knowledgebase, read and comment on articles (documentation), and download event samples. Authors can upload new event samples, create new articles, and edit their own articles. Restrictions: The software is adapted to solve the problems described in the article, and there are no additional restrictions. Unusual features: The software provides a framework to store and document large files with a flexible authentication and authorization system.
Different external storages with large capacity can be used to keep the files. The WEB Content Management System provides all of the necessary interfaces for the authors of the files, end-users and administrators. Running time: Real time operations. References: [1] The main LCG MCDB server, http://mcdb.cern.ch/. [2] P. Bartalini, L. Dudko, A. Kryukov, I.V. Selyuzhenkov, A. Sherstnev, A. Vologdin, LCG Monte-Carlo data base, hep-ph/0404241. [3] J.P. Baud, B. Couturier, C. Curran, J.D. Durand, E. Knezo, S. Occhetti, O. Barring, CASTOR: status and evolution, cs.oh/0305047.

  17. Visualization of bioelectric phenomena.

    PubMed

    Palmer, T C; Simpson, E V; Kavanagh, K M; Smith, W M

    1992-01-01

    Biomedical investigators are currently able to acquire and analyze physiological and anatomical data from three-dimensional structures in the body. Often, multiple kinds of data can be recorded simultaneously. The usefulness of this information, either for exploratory viewing or for presentation to others, is limited by the lack of techniques to display it in intuitive, accessible formats. Unfortunately, the complexity of scientific visualization techniques and the inflexibility of commercial packages deter investigators from using sophisticated visualization methods that could provide them added insight into the mechanisms of the phenomena under study. Also, the sheer volume of such data is a problem. High-performance computing resources are often required for storage and processing, in addition to visualization. This chapter describes a novel, language-based interface that allows scientists with basic programming skills to classify and render multivariate volumetric data with a modest investment in software training. The interface facilitates data exploration by enabling experimentation with various algorithms to compute opacity and color from volumetric data. The value of the system is demonstrated using data from cardiac mapping studies, in which multiple electrodes are placed in and on the heart to measure the cardiac electrical activity intrinsic to the heart and its response to external stimulation.

  18. The physiology analysis system: an integrated approach for warehousing, management and analysis of time-series physiology data.

    PubMed

    McKenna, Thomas M; Bawa, Gagandeep; Kumar, Kamal; Reifman, Jaques

    2007-04-01

    The physiology analysis system (PAS) was developed as a resource to support the efficient warehousing, management, and analysis of physiology data, particularly, continuous time-series data that may be extensive, of variable quality, and distributed across many files. The PAS incorporates time-series data collected by many types of data-acquisition devices, and it is designed to free users from data management burdens. This Web-based system allows both discrete (attribute) and time-series (ordered) data to be manipulated, visualized, and analyzed via a client's Web browser. All processes occur on a server, so that the client does not have to download data or any application programs, and the PAS is independent of the client's computer operating system. The PAS contains a library of functions, written in different computer languages that the client can add to and use to perform specific data operations. Functions from the library are sequentially inserted into a function chain-based logical structure to construct sophisticated data operators from simple function building blocks, affording ad hoc query and analysis of time-series data. These features support advanced mining of physiology data.
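
    A minimal sketch of the function-chain idea, not the PAS implementation: simple building-block operators are composed, in order, into a more sophisticated ad hoc operator on a time series. The example operators and the synthetic signal below are illustrative.

```python
# Compose simple time-series operators into one chained data operator.
import numpy as np

def detrend(x):
    t = np.arange(x.size)
    slope, intercept = np.polyfit(t, x, 1)
    return x - (slope * t + intercept)

def clip_artifacts(x, limit=3.0):
    s = np.std(x)
    return np.clip(x, -limit * s, limit * s)

def moving_average(x, window=5):
    return np.convolve(x, np.ones(window) / window, mode="same")

def apply_chain(x, chain):
    """Apply each operator in the chain to the output of the previous one."""
    for func in chain:
        x = func(x)
    return x

# A noisy series with a drifting baseline and one spike artifact.
rng = np.random.default_rng(3)
t = np.linspace(0, 60, 600)
series = 70 + 0.2 * t + 2 * np.sin(2 * np.pi * t / 10) + rng.normal(scale=0.5, size=t.size)
series[300] += 40

cleaned = apply_chain(series, [detrend, clip_artifacts, moving_average])
print("raw std %.2f -> chained std %.2f" % (series.std(), cleaned.std()))
```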

  19. The Infrared Space Observatory (ISO)

    NASA Technical Reports Server (NTRS)

    Helou, George; Kessler, Martin F.

    1995-01-01

    ISO, scheduled to launch in 1995, will carry into orbit the most sophisticated infrared observatory of the decade. Overviews of the mission, instrument payload and scientific program are given, along with a comparison of the strengths of ISO and SOFIA.

  20. Analysis of Qualitative Interviews about the Impact of Information Technology on Pressure Ulcer Prevention Programs: Implications for the Wound Ostomy Continence Nurse

    PubMed Central

    Shepherd, Marilyn Murphy; Wipke-Tevis, Deidre D.; Alexander, Gregory L.

    2015-01-01

    Purpose: The purpose of this study was to compare pressure ulcer prevention programs in 2 long-term care (LTC) facilities with diverse Information Technology Sophistication (ITS), one with high sophistication and one with low sophistication, and to identify implications for the Wound Ostomy Continence Nurse (WOC Nurse). Design: Secondary analysis of narrative data obtained from a mixed-methods study. Subjects and Setting: The study setting was 2 LTC facilities in the Midwestern United States. The sample comprised 39 staff from the 2 facilities, including 26 from the high ITS facility and 13 from the low ITS facility. Respondents included Certified Nurse Assistants, Certified Medical Technicians, Restorative Medical Technicians, Social Workers, Registered Nurses, Licensed Practical Nurses, Information Technology staff, Administrators, and Directors. Methods: This study is a secondary analysis of interviews regarding communication and education strategies in two long-term care agencies. The analysis focused on focus group interviews, which included both direct and non-direct care providers. Results: Eight themes (codes) were identified in the analysis. Three themes are presented individually with exemplars of communication and education strategies. The analysis revealed specific differences between the high ITS and low ITS facility with regard to education and communication involving pressure ulcer prevention. These differences have direct implications for WOC nurses consulting in the LTC setting. Conclusions: Findings from this study suggest that effective strategies for staff education and communication regarding PU prevention differ based on the level of ITS within a given facility. Specific strategies for education and communication are suggested for agencies with high ITS and agencies with low ITS sophistication. PMID:25945822

  1. The Matrix Element Method: Past, Present, and Future

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gainer, James S.; Lykken, Joseph; Matchev, Konstantin T.

    2013-07-12

    The increasing use of multivariate methods, and in particular the Matrix Element Method (MEM), represents a revolution in experimental particle physics. With continued exponential growth in computing capabilities, the use of sophisticated multivariate methods -- already common -- will soon become ubiquitous and ultimately almost compulsory. While the existence of sophisticated algorithms for disentangling signal and background might naively suggest a diminished role for theorists, the use of the MEM, with its inherent connection to the calculation of differential cross sections, will benefit from collaboration between theorists and experimentalists. In this white paper, we will briefly describe the MEM and some of its recent uses, note some current issues and potential resolutions, and speculate about exciting future opportunities.
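
    As a toy illustration of the method's structure (not any experiment's implementation), the sketch below integrates a Breit-Wigner stand-in for |M|^2 against a Gaussian detector transfer function and compares two hypotheses event by event; all masses, widths, resolutions, and event values are assumed numbers.

```python
# Toy Matrix Element Method weight: per-event likelihood = |M(y)|^2 convolved
# with a transfer function W(x_obs | y), compared between two hypotheses.
import numpy as np

def breit_wigner(m, m0, width):
    return 1.0 / ((m**2 - m0**2)**2 + (m0 * width)**2)

def mem_weight(x_obs, m0, width, sigma_det=5.0):
    """Integrate |M(y)|^2 * W(x_obs | y) over the true mass y (uniform grid)."""
    y = np.linspace(50.0, 300.0, 2000)
    me_sq = breit_wigner(y, m0, width)
    transfer = np.exp(-0.5 * ((x_obs - y) / sigma_det) ** 2) / (sigma_det * np.sqrt(2 * np.pi))
    # The uniform grid spacing cancels in the ratio, so plain sums suffice here.
    return np.sum(me_sq * transfer) / np.sum(me_sq)

# Observed "invariant masses" for a handful of events (illustrative numbers).
events = np.array([171.0, 176.5, 168.2, 180.3, 174.9])
logL_A = np.log([mem_weight(x, m0=173.0, width=10.0) for x in events]).sum()
logL_B = np.log([mem_weight(x, m0=160.0, width=10.0) for x in events]).sum()
print("log-likelihood hypothesis A (m0=173): %.2f" % logL_A)
print("log-likelihood hypothesis B (m0=160): %.2f" % logL_B)
```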

  2. Bio-Intelligence: A Research Program Facilitating the Development of New Paradigms for Tomorrow's Patient Care

    NASA Astrophysics Data System (ADS)

    Phan, Sieu; Famili, Fazel; Liu, Ziying; Peña-Castillo, Lourdes

    The advancement of omics technologies, in concert with enabling information technology, has accelerated biological research to a new realm of blazing speed and sophistication. The shift from the limited single-gene assay to the high-throughput microarray assay, and from the laborious manual counting of base pairs to robot-assisted genome-sequencing machinery, are two examples. Even more sophisticated, recent developments in literature mining and artificial intelligence have allowed researchers to construct complex gene networks that unravel many formidable biological puzzles. To harness these emerging technologies to their full potential for medical applications, the Bio-intelligence program at the Institute for Information Technology, National Research Council Canada, aims to develop and exploit artificial intelligence and bioinformatics technologies to facilitate the development of intelligent decision-support tools and systems that improve patient care - for early detection, accurate diagnosis/prognosis of disease, and better personalized therapeutic management.

  3. Accelerating Monte Carlo simulations with an NVIDIA ® graphics processor

    NASA Astrophysics Data System (ADS)

    Martinsen, Paul; Blaschke, Johannes; Künnemeyer, Rainer; Jordan, Robert

    2009-10-01

    Modern graphics cards, commonly used in desktop computers, have evolved beyond a simple interface between processor and display to incorporate sophisticated calculation engines that can be applied to general purpose computing. The Monte Carlo algorithm for modelling photon transport in turbid media has been implemented on an NVIDIA 8800 GT graphics card using the CUDA toolkit. The Monte Carlo method relies on following the trajectory of millions of photons through the sample, often taking hours or days to complete. The graphics-processor implementation, processing roughly 110 million scattering events per second, was found to run more than 70 times faster than a similar, single-threaded implementation on a 2.67 GHz desktop computer. Program summary: Program title: Phoogle-C/Phoogle-G Catalogue identifier: AEEB_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEEB_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 51 264 No. of bytes in distributed program, including test data, etc.: 2 238 805 Distribution format: tar.gz Programming language: C++ Computer: Designed for Intel PCs. Phoogle-G requires an NVIDIA graphics card with support for CUDA 1.1 Operating system: Windows XP Has the code been vectorised or parallelized?: Phoogle-G is written for SIMD architectures RAM: 1 GB Classification: 21.1 External routines: Charles Karney random number library. Microsoft Foundation Class library. NVIDIA CUDA library [1]. Nature of problem: The Monte Carlo technique is an effective algorithm for exploring the propagation of light in turbid media. However, accurate results require tracing the path of many photons within the media. The independence of photons naturally lends the Monte Carlo technique to implementation on parallel architectures. Generally, parallel computing can be expensive, but recent advances in consumer grade graphics cards have opened the possibility of high-performance desktop parallel-computing. Solution method: In this pair of programmes we have implemented the Monte Carlo algorithm described by Prahl et al. [2] for photon transport in infinite scattering media to compare the performance of two readily accessible architectures: a standard desktop PC and a consumer grade graphics card from NVIDIA. Restrictions: The graphics card implementation uses single precision floating point numbers for all calculations. Only photon transport from an isotropic point-source is supported. The graphics-card version has no user interface. The simulation parameters must be set in the source code. The desktop version has a simple user interface; however some properties can only be accessed through an ActiveX client (such as Matlab). Additional comments: The random number library used has a LGPL (http://www.gnu.org/copyleft/lesser.html) licence. Running time: Runtime can range from minutes to months depending on the number of photons simulated and the optical properties of the medium. References: http://www.nvidia.com/object/cuda_home.html. S. Prahl, M. Keijzer, Sl. Jacques, A. Welch, SPIE Institute Series 5 (1989) 102.
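
    A CPU-side sketch of the photon-transport loop being ported follows. It is simplified to isotropic scattering of an isotropic point source in an infinite medium, broadly in the spirit of the Prahl et al. approach cited above but not the Phoogle code; the optical properties and photon count are illustrative.

```python
# Monte Carlo photon transport: exponential step lengths, new isotropic
# direction each step, photon weight reduced by the albedo at each event.
import numpy as np

mu_a, mu_s = 0.1, 10.0          # absorption / scattering coefficients [1/cm] (assumed)
mu_t = mu_a + mu_s
albedo = mu_s / mu_t
n_photons, w_min = 20_000, 1e-4

rng = np.random.default_rng(4)
pos = np.zeros((n_photons, 3))
weight = np.ones(n_photons)
events = 0

while weight.max() > w_min:
    alive = weight > w_min
    n_alive = int(alive.sum())
    # Sample a step length and a new isotropic direction for every live photon.
    step = -np.log(rng.random(n_alive)) / mu_t
    cos_t = 2 * rng.random(n_alive) - 1
    phi = 2 * np.pi * rng.random(n_alive)
    sin_t = np.sqrt(1 - cos_t**2)
    direction = np.stack([sin_t * np.cos(phi), sin_t * np.sin(phi), cos_t], axis=1)
    pos[alive] += step[:, None] * direction
    weight[alive] *= albedo          # deposit (1 - albedo) of the weight in the medium
    events += n_alive

r = np.linalg.norm(pos, axis=1)
print("scattering events simulated:", events)
print("mean distance from source at termination: %.2f cm" % r.mean())
```

    The GPU version described in the paper exploits exactly this photon independence: each thread follows its own photon, which is why the speedup over a single-threaded CPU loop is so large.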

  4. Novel opportunities for computational biology and sociology in drug discovery☆

    PubMed Central

    Yao, Lixia; Evans, James A.; Rzhetsky, Andrey

    2013-01-01

    Current drug discovery is impossible without sophisticated modeling and computation. In this review we outline previous advances in computational biology and, by tracing the steps involved in pharmaceutical development, explore a range of novel, high-value opportunities for computational innovation in modeling the biological process of disease and the social process of drug discovery. These opportunities include text mining for new drug leads, modeling molecular pathways and predicting the efficacy of drug cocktails, analyzing genetic overlap between diseases and predicting alternative drug use. Computation can also be used to model research teams and innovative regions and to estimate the value of academy–industry links for scientific and human benefit. Attention to these opportunities could promise punctuated advance and will complement the well-established computational work on which drug discovery currently relies. PMID:20349528

  5. The marketing concept applied to an education program.

    PubMed

    Parks, S C; Moody, D L; Barbrow, E P

    1984-09-01

    Dietetic education programs seeking to maintain their enrollment levels may find it necessary to adopt more sophisticated marketing strategies. This article describes the application of the marketing process to an extended degree dietetic program that serves a national audience. It also presents a strategy for initiating a marketing study and marketing orientation by analyzing its internal program data. The article discusses the specific market characteristics of the program's primary market segments, and it presents further implications for dietitians at work in health care facilities, in businesses, or in private practice.

  6. A fundamental look at fire spread in California chaparral

    Treesearch

    David R. Weise; Thomas Fletcher; Larry Baxter; Shankar Mahalingam; Xiangyang Zhou; Patrick Pagni; Rod Linn; Bret Butler

    2004-01-01

    The USDA Forest Service National Fire Plan funded a research program to study fire spread in live fuels of the southwestern United States. In the U.S. current operational fire spread models do not distinguish between live and dead fuels in a sophisticated manner because the study of live fuels has been limited. The program is experimentally examining fire spread at 3...

  7. A computer simulation of aircraft evacuation with fire

    NASA Technical Reports Server (NTRS)

    Middleton, V. E.

    1983-01-01

    A computer simulation was developed to assess passenger survival during the post-crash evacuation of a transport category aircraft when fire is a major threat. The computer code, FIREVAC, computes individual passenger exit paths and times to exit, taking into account delays and congestion caused by the interaction among the passengers and changing cabin conditions. Simple models for the physiological effects of the toxic cabin atmosphere are included with provision for including more sophisticated models as they become available. Both wide-body and standard-body aircraft may be simulated. Passenger characteristics are assigned stochastically from experimentally derived distributions. Results of simulations of evacuation trials and hypothetical evacuations under fire conditions are presented.
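
    The congestion-and-delay idea can be illustrated with a much smaller toy model than FIREVAC: passengers with stochastically drawn walking times queue at their assigned exits, and each exit serves one passenger per fixed interval. The sketch below only illustrates that queueing logic; the passenger count, distributions, and service interval are invented, and none of FIREVAC's cabin-condition or toxicity models are represented.

        # Toy stochastic evacuation model: walking times drawn from an assumed
        # distribution, one passenger per exit per fixed service interval.
        import numpy as np

        rng = np.random.default_rng(1)
        n_passengers, n_exits = 150, 4
        service_interval = 1.2                         # seconds per passenger per exit (assumed)
        walk_time = rng.normal(8.0, 2.5, n_passengers).clip(min=1.0)
        assigned_exit = rng.integers(0, n_exits, n_passengers)

        exit_free_at = np.zeros(n_exits)               # time each exit next becomes free
        exit_times = np.empty(n_passengers)
        for i in np.argsort(walk_time):                # serve arrivals in time order
            e = assigned_exit[i]
            start = max(walk_time[i], exit_free_at[e])
            exit_times[i] = start + service_interval
            exit_free_at[e] = exit_times[i]

        print(f"last passenger out after {exit_times.max():.1f} s")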

  8. High-Throughput Thermodynamic Modeling and Uncertainty Quantification for ICME

    NASA Astrophysics Data System (ADS)

    Otis, Richard A.; Liu, Zi-Kui

    2017-05-01

    One foundational component of integrated computational materials engineering (ICME) and the Materials Genome Initiative is computational thermodynamics based on the calculation of phase diagrams (CALPHAD) method. The CALPHAD method, pioneered by Kaufman, has enabled the development of thermodynamic, atomic mobility, and molar volume databases of individual phases in the full space of temperature, composition, and sometimes pressure for technologically important multicomponent engineering materials, along with sophisticated computational tools for using the databases. In this article, we present our recent efforts to develop new computational tools for high-throughput modeling and uncertainty quantification based on high-throughput first-principles calculations and the CALPHAD method, along with their potential propagation to downstream ICME modeling and simulations.
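
    At the core of the CALPHAD method is a parametric molar Gibbs energy model for each phase, whose coefficients are what the databases store. A minimal sketch for a binary substitutional solution in the Redlich-Kister form follows; the end-member energies and interaction parameters are illustrative assumptions, not values from any assessed database.

        # Molar Gibbs energy of a binary A-B substitutional solution in the
        # CALPHAD (Redlich-Kister) form: reference + ideal mixing + excess terms.
        # End-member energies and L parameters below are assumed example values.
        import numpy as np

        R = 8.314  # J/(mol K)

        def gibbs_binary(x_b, T, g_a=0.0, g_b=500.0, L=(-12000.0, 3000.0)):
            x_a = 1.0 - x_b
            g_ref = x_a * g_a + x_b * g_b
            g_ideal = R * T * (x_a * np.log(x_a) + x_b * np.log(x_b))
            # Redlich-Kister excess: x_a*x_b * sum_k L_k * (x_a - x_b)**k
            g_excess = x_a * x_b * sum(Lk * (x_a - x_b) ** k for k, Lk in enumerate(L))
            return g_ref + g_ideal + g_excess

        x = np.linspace(0.01, 0.99, 99)
        g = gibbs_binary(x, T=1000.0)
        print(f"minimum of G(x) at x_B = {x[np.argmin(g)]:.2f}")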

  9. CANFAR + Skytree: Mining Massive Datasets as an Essential Part of the Future of Astronomy

    NASA Astrophysics Data System (ADS)

    Ball, Nicholas M.

    2013-01-01

    The future study of large astronomical datasets, consisting of hundreds of millions to billions of objects, will be dominated by large computing resources, and by analysis tools of the necessary scalability and sophistication to extract useful information. Significant effort will be required to fulfil the potential of these datasets as providers of the next generation of science results. To date, computing systems have allowed either sophisticated analysis of small datasets, e.g., most astronomy software, or simple analysis of large datasets, e.g., database queries. At the Canadian Astronomy Data Centre, we have combined our cloud computing system, the Canadian Advanced Network for Astronomical Research (CANFAR), with the world's most advanced machine learning software, Skytree, to create the world's first cloud computing system for data mining in astronomy. This allows the full sophistication of the huge fields of data mining and machine learning to be applied to the hundreds of millions of objects that make up current large datasets. CANFAR works by utilizing virtual machines, which appear to the user as equivalent to a desktop. Each machine is replicated as desired to perform large-scale parallel processing. Such an arrangement carries far more flexibility than other cloud systems, because it enables the user to immediately install and run the same code that they already utilize for science on their desktop. We demonstrate the utility of the CANFAR + Skytree system by showing science results obtained, including assigning photometric redshifts with full probability density functions (PDFs) to a catalog of approximately 133 million galaxies from the MegaPipe reductions of the Canada-France-Hawaii Telescope Legacy Wide and Deep surveys. Each PDF is produced nonparametrically from 100 instances of the photometric parameters for each galaxy, generated by perturbing within the errors on the measurements. Hence, we produce, store, and assign redshifts to a catalog of over 13 billion object instances. This catalog is comparable in size to those expected from next-generation surveys, such as the Large Synoptic Survey Telescope. The CANFAR+Skytree system is open for use by any interested member of the astronomical community.
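
    The 13 billion object instances referred to above come from perturbing each galaxy's photometry within its measurement errors and re-estimating the redshift for every instance. A minimal sketch of that Monte Carlo PDF construction is given below; the magnitudes, errors, and the stand-in estimator toy_photoz are invented for illustration (the production system uses Skytree's machine-learning models).

        # Build a nonparametric photometric-redshift PDF for one galaxy by
        # perturbing its magnitudes within their errors and re-running an
        # estimator on each of 100 instances.  `toy_photoz` is a placeholder,
        # and the magnitudes and errors are invented example values.
        import numpy as np

        rng = np.random.default_rng(42)

        def toy_photoz(mags):
            # Stand-in estimator: a made-up linear function of two colours.
            u_g, g_r = mags[0] - mags[1], mags[1] - mags[2]
            return max(0.0, 0.4 * u_g + 0.9 * g_r)

        mags = np.array([22.1, 21.3, 20.8])        # u, g, r magnitudes (example)
        errs = np.array([0.15, 0.05, 0.04])        # 1-sigma photometric errors (example)

        instances = rng.normal(mags, errs, size=(100, 3))   # 100 perturbed instances
        z_samples = np.array([toy_photoz(m) for m in instances])

        pdf, edges = np.histogram(z_samples, bins=20, density=True)
        print(f"median z = {np.median(z_samples):.3f}, "
              f"68% interval = [{np.percentile(z_samples, 16):.3f}, "
              f"{np.percentile(z_samples, 84):.3f}]")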

  10. Moose: An Open-Source Framework to Enable Rapid Development of Collaborative, Multi-Scale, Multi-Physics Simulation Tools

    NASA Astrophysics Data System (ADS)

    Slaughter, A. E.; Permann, C.; Peterson, J. W.; Gaston, D.; Andrs, D.; Miller, J.

    2014-12-01

    The Idaho National Laboratory (INL)-developed Multiphysics Object Oriented Simulation Environment (MOOSE; www.mooseframework.org), is an open-source, parallel computational framework for enabling the solution of complex, fully implicit multiphysics systems. MOOSE provides a set of computational tools that scientists and engineers can use to create sophisticated multiphysics simulations. Applications built using MOOSE have computed solutions for chemical reaction and transport equations, computational fluid dynamics, solid mechanics, heat conduction, mesoscale materials modeling, geomechanics, and others. To facilitate the coupling of diverse and highly-coupled physical systems, MOOSE employs the Jacobian-free Newton-Krylov (JFNK) method when solving the coupled nonlinear systems of equations arising in multiphysics applications. The MOOSE framework is written in C++, and leverages other high-quality, open-source scientific software packages such as LibMesh, Hypre, and PETSc. MOOSE uses a "hybrid parallel" model which combines both shared memory (thread-based) and distributed memory (MPI-based) parallelism to ensure efficient resource utilization on a wide range of computational hardware. MOOSE-based applications are inherently modular, which allows for simulation expansion (via coupling of additional physics modules) and the creation of multi-scale simulations. Any application developed with MOOSE supports running (in parallel) any other MOOSE-based application. Each application can be developed independently, yet easily communicate with other applications (e.g., conductivity in a slope-scale model could be a constant input, or a complete phase-field micro-structure simulation) without additional code being written. This method of development has proven effective at INL and expedites the development of sophisticated, sustainable, and collaborative simulation tools.
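
    The JFNK method mentioned above never assembles the Jacobian: Krylov solvers such as GMRES only need Jacobian-vector products, which can be approximated by finite differences of the residual. A minimal sketch of that idea with SciPy follows, using a toy two-equation residual in place of a real multiphysics system.

        # Jacobian-free Newton-Krylov (JFNK) sketch: Newton's method where each
        # linear solve uses GMRES with J(u) v ~= (R(u + eps*v) - R(u)) / eps,
        # so the Jacobian is never assembled.  The residual is a toy coupled
        # system, not a MOOSE application.
        import numpy as np
        from scipy.sparse.linalg import LinearOperator, gmres

        def residual(u):
            x, y = u
            return np.array([x**2 + y**2 - 4.0,       # circle
                             np.exp(x) + y - 1.0])    # coupled nonlinear equation

        def solve_jfnk(u0, tol=1e-10, max_newton=20, eps=1e-7):
            u = u0.astype(float)
            for _ in range(max_newton):
                r = residual(u)
                if np.linalg.norm(r) < tol:
                    break
                jv = lambda v: (residual(u + eps * v) - r) / eps   # matrix-free J*v
                J = LinearOperator((u.size, u.size), matvec=jv)
                du, _ = gmres(J, -r)
                u = u + du
            return u

        print(solve_jfnk(np.array([1.0, 1.0])))   # converges to a root of the system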

  11. GiveMe Shelter: A People-Centred Design Process for Promoting Independent Inquiry-Led Learning in Engineering

    ERIC Educational Resources Information Center

    Dyer, Mark; Grey, Thomas; Kinnane, Oliver

    2017-01-01

    It has become increasingly common for tasks traditionally carried out by engineers to be undertaken by technicians and technologists with access to sophisticated computers and software that can often perform complex calculations that were previously the responsibility of engineers. Not surprisingly, this development raises serious questions about…

  12. C-SWAT: The Soil and Water Assessment Tool with consolidated input files in alleviating computational burden of recursive simulations

    USDA-ARS?s Scientific Manuscript database

    The temptation to include model parameters and high-resolution input data, together with the availability of powerful optimization and uncertainty analysis algorithms, has significantly increased the complexity of hydrologic and water quality modeling. However, the ability to take advantage of sophist...

  13. A Developing Market for Continuing Higher Education: The Reserve Components.

    ERIC Educational Resources Information Center

    Watt, David M.

    Due to increasingly sophisticated military equipment, the Reserve Components of the armed forces need to raise the educational standards for recruits. A number of U.S. educational institutions have responded to their needs for continuing higher education in the areas of job skill enhancement (such as computer operation), regular courses directly…

  14. Deep FIFO Surge Buffer

    NASA Technical Reports Server (NTRS)

    Temple, Gerald; Siegel, Marc; Amitai, Zwie

    1991-01-01

    A first-in/first-out (FIFO) buffer temporarily stores short surges of data generated by a data-acquisition system at an excessively high rate and releases the data at a lower rate suitable for processing by a computer. Size and complexity are reduced, and capacity enhanced, by the use of newly developed, sophisticated integrated circuits and by a "byte-folding" scheme that doubles the effective depth and data rate.

  15. Using Excel's Solver Function to Facilitate Reciprocal Service Department Cost Allocations

    ERIC Educational Resources Information Center

    Leese, Wallace R.

    2013-01-01

    The reciprocal method of service department cost allocation requires linear equations to be solved simultaneously. These computations are often so complex as to cause the abandonment of the reciprocal method in favor of the less sophisticated and theoretically incorrect direct or step-down methods. This article illustrates how Excel's Solver…
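
    The simultaneous equations underlying the reciprocal method are linear, so the same allocation can also be computed directly, whether in Excel or in code. A short sketch with two service departments and two production departments follows; the direct costs and usage percentages are invented example figures.

        # Reciprocal service-department cost allocation as a linear system.
        # Each service department's total cost = its direct cost + its share of
        # the other service departments' totals:  S = direct + A @ S,
        # i.e. (I - A) S = direct.  All figures below are invented examples.
        import numpy as np

        direct = np.array([100_000.0, 60_000.0])    # direct costs of S1, S2
        # A[i, j] = fraction of service dept j's cost consumed by service dept i
        A = np.array([[0.00, 0.20],                  # S1 uses 20% of S2
                      [0.10, 0.00]])                 # S2 uses 10% of S1

        total = np.linalg.solve(np.eye(2) - A, direct)   # reciprocal totals for S1, S2

        # Fractions of each service department's total going to production departments
        to_prod = np.array([[0.50, 0.40],            # S1 -> P1, P2
                            [0.45, 0.35]])           # S2 -> P1, P2
        allocated = to_prod.T @ total                # cost received by P1 and P2
        print(dict(S_totals=total.round(2), P_allocations=allocated.round(2)))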

  16. ALFIL: A Crowd Simulation Serious Game for Massive Evacuation Training and Awareness

    ERIC Educational Resources Information Center

    García-García, César; Fernández-Robles, José Luis; Larios-Rosillo, Victor; Luga, Hervé

    2012-01-01

    This article presents the current development of a serious game for the simulation of massive evacuations. The purpose of this project is to promote self-protection through awareness of the procedures and different possible scenarios during the evacuation of a massive event. Sophisticated behaviors require massive computational power and it has…

  17. Children's Behavior toward and Understanding of Robotic and Living Dogs

    ERIC Educational Resources Information Center

    Melson, Gail F.; Kahn, Peter H., Jr.; Beck, Alan; Friedman, Batya; Roberts, Trace; Garrett, Erik; Gill, Brian T.

    2009-01-01

    This study investigated children's reasoning about and behavioral interactions with a computationally sophisticated robotic dog (Sony's AIBO) compared to a live dog (an Australian Shepherd). Seventy-two children from three age groups (7-9 years, 10-12 years, and 13-15 years) participated in this study. Results showed that more children…

  18. Using Novel Word Context Measures to Predict Human Ratings of Lexical Proficiency

    ERIC Educational Resources Information Center

    Berger, Cynthia M.; Crossley, Scott A.; Kyle, Kristopher

    2017-01-01

    This study introduces a model of lexical proficiency based on novel computational indices related to word context. The indices come from an updated version of the Tool for the Automatic Analysis of Lexical Sophistication (TAALES) and include associative, lexical, and semantic measures of word context. Human ratings of holistic lexical proficiency…

  19. Using Excel's Matrix Operations to Facilitate Reciprocal Cost Allocations

    ERIC Educational Resources Information Center

    Leese, Wallace R.; Kizirian, Tim

    2009-01-01

    The reciprocal method of service department cost allocation requires linear equations to be solved simultaneously. These computations are often so complex as to cause the abandonment of the reciprocal method in favor of the less sophisticated direct or step-down methods. Here is a short example demonstrating how Excel's sometimes unknown matrix…

  20. Sanibel Symposium in the Petascale-Exascale Computational Era

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheng, Hai-Ping

    The 56th Sanibel Symposium was held February 14-19, 2016 at the King and Prince Hotel, St. Simons Island, GA. It successfully brought quantum chemists and chemical and condensed matter physicists together in presentations, posters, and informal discussions bridging those two communities. The Symposium has had a significant role in preparing generations of quantum theorists. As computational potency and algorithmic sophistication have grown, the Symposium has evolved to emphasize more heavily computationally oriented method development in chemistry and materials physics, including nanoscience, complex molecular phenomena, and even bio-molecular methods and problems. Given this context, the 56th Sanibel meeting systematically and deliberately had sessions focused on exascale computation. A selection of outstanding theoretical problems that need serious attention was included. Five invited sessions, two contributed sessions (hot topics), and a poster session were organized with the exascale theme. This was a historic milestone in the evolution of the Symposia. Just as years ago linear algebra, perturbation theory, density matrices, and band-structure methods dominated early Sanibel Symposia, the exascale sessions of the 56th meeting contributed a transformative influence to add structure and strength to the computational physical science community in an unprecedented way. A copy of the full program of the 56th Symposium is attached. The exascale sessions were Linear Scaling, Non-Adiabatic Dynamics, Interpretive Theory and Models, Computation, Software, and Algorithms, and Quantum Monte Carlo. The Symposium Proceedings will be published in Molecular Physics (2017). Note that the Sanibel proceedings from 2015 and 2014 were published as Molecular Physics vol. 114, issue 3-4 (2016) and vol. 113, issue 3-4 (2015), respectively.

  1. Assessing Core Competencies

    NASA Astrophysics Data System (ADS)

    Narayanan, M.

    2004-12-01

    Catherine Palomba and Trudy Banta offer the following definition of assessment, adapted from one provided by Marchese in 1987: "Assessment is the systematic collection, review, and use of information about educational programs undertaken for the purpose of improving student learning and development" (Palomba and Banta 1999). It is widely recognized that sophisticated computing technologies are becoming a key element in today's classroom instructional techniques. Regardless, the professor must be held responsible for creating an instructional environment in which the technology actually supplements the learning outcomes of the students. Almost all academic disciplines have found a niche for computer-based instruction in their respective professional domains. In many cases, it is viewed as an essential and integral part of the educational process. Educational institutions are committing substantial resources to the establishment of dedicated technology-based laboratories, so that they will be able to accommodate and fulfill students' desire to master certain of these specific skills. This type of technology-based instruction may raise some fundamental questions about the core competencies of the student learner. Some of the most important questions are:
    1. Is the utilization of these fast, high-powered computers and user-friendly software programs creating a totally non-challenging instructional environment for the student learner?
    2. Can technology itself all too easily overshadow the learning outcomes intended?
    3. Are the educational institutions simply training students how to use technology rather than educating them in the appropriate field?
    4. Are we still teaching content-driven courses and analysis-oriented subject matter?
    5. Are these sophisticated modern-era technologies contributing to a decline in the critical thinking capabilities of 21st-century technology-savvy students?
    The author tries to focus on technology as a tool and not on the technology itself. He further argues that students must demonstrate that they have the ability to think critically before they make an attempt to use technology in a chosen application-specific environment. The author further argues that training-based instruction has a very narrow focus that puts modern technology at the forefront of the learning enterprise system. The author promotes education-oriented strategies to provide the students with a broader perspective of the subject matter. The author is also of the opinion that students entering the workplace should clearly understand the context in which modern technologies are influencing the productive outcomes of the industrialized world.
    References:
    Marchese, T. J. (1987). Third Down, Ten Years to Go. AAHE Bulletin, Vol. 40, pp. 3-8.
    Marchese, T. J. (1994). Assessment, Quality and Undergraduate Improvement. Assessment Update, Vol. 6, No. 3, pp. 1-14.
    Montagu, A. S. (2001). High-technology instruction: A framework for teaching computer-based technologies. Journal on Excellence in College Teaching, 12(1), 109-128.
    Palomba, Catherine A. and Banta, Trudy W. (1999). Assessment Essentials: Planning, Implementing and Improving Assessment in Higher Education. San Francisco: Jossey-Bass Publishers.

  2. Pilot interaction with automated airborne decision making systems

    NASA Technical Reports Server (NTRS)

    Hammer, John M.; Wan, C. Yoon; Vasandani, Vijay

    1987-01-01

    The current research is focused on detection of human error and protection from its consequences. A program for monitoring pilot error by comparing pilot actions to a script was described. It dealt primarily with routine errors (slips) that occurred during checklist activity. The model to which operator actions were compared was a script. Current research is an extension along these two dimensions. The ORS fault detection aid uses a sophisticated device model rather than a script. The newer initiative, the model-based and constraint-based warning system, uses an even more sophisticated device model and is intended to prevent all types of error, not just slips or bad decisions.

  3. Perceptual organization in computer vision - A review and a proposal for a classificatory structure

    NASA Technical Reports Server (NTRS)

    Sarkar, Sudeep; Boyer, Kim L.

    1993-01-01

    The evolution of perceptual organization in biological vision, and its necessity in advanced computer vision systems, arises from the characteristic that perception, the extraction of meaning from sensory input, is an intelligent process. This is particularly so for high order organisms and, analogically, for more sophisticated computational models. The role of perceptual organization in computer vision systems is explored. This is done from four vantage points. First, a brief history of perceptual organization research in both humans and computer vision is offered. Next, a classificatory structure in which to cast perceptual organization research to clarify both the nomenclature and the relationships among the many contributions is proposed. Thirdly, the perceptual organization work in computer vision in the context of this classificatory structure is reviewed. Finally, the array of computational techniques applied to perceptual organization problems in computer vision is surveyed.

  4. An introduction to real-time graphical techniques for analyzing multivariate data

    NASA Astrophysics Data System (ADS)

    Friedman, Jerome H.; McDonald, John Alan; Stuetzle, Werner

    1987-08-01

    Orion I is a graphics system used to study applications of computer graphics - especially interactive motion graphics - in statistics. Orion I is the newest of a family of "Prim" systems, whose most striking common feature is the use of real-time motion graphics to display three dimensional scatterplots. Orion I differs from earlier Prim systems through the use of modern and relatively inexpensive raster graphics and microprocessor technology. It also delivers more computing power to its user; Orion I can perform more sophisticated real-time computations than were possible on previous such systems. We demonstrate some of Orion I's capabilities in our film: "Exploring data with Orion I".

  5. Computer Assisted Navigation in Knee Arthroplasty

    PubMed Central

    Bae, Dae Kyung

    2011-01-01

    Computer assisted surgery (CAS) was used to improve the positioning of implants during total knee arthroplasty (TKA). Most studies have reported that computer assisted navigation reduced the outliers of alignment and component malpositioning. However, additional sophisticated studies are necessary to determine if the improvement of alignment will improve long-term clinical results and increase the survival rate of the implant. Knowledge of CAS-TKA technology and understanding the advantages and limitations of navigation are crucial to the successful application of the CAS technique in TKA. In this article, we review the components of navigation, classification of the system, surgical method, potential error, clinical results, advantages, and disadvantages. PMID:22162787

  6. Creation of Anatomically Accurate Computer-Aided Design (CAD) Solid Models from Medical Images

    NASA Technical Reports Server (NTRS)

    Stewart, John E.; Graham, R. Scott; Samareh, Jamshid A.; Oberlander, Eric J.; Broaddus, William C.

    1999-01-01

    Most surgical instrumentation and implants used in the world today are designed with sophisticated Computer-Aided Design (CAD)/Computer-Aided Manufacturing (CAM) software. This software automates the mechanical development of a product from its conceptual design through manufacturing. CAD software also provides a means of manipulating solid models prior to Finite Element Modeling (FEM). Few surgical products are designed in conjunction with accurate CAD models of human anatomy because of the difficulty with which these models are created. We have developed a novel technique that creates anatomically accurate, patient specific CAD solids from medical images in a matter of minutes.

  7. An approach to quality and performance control in a computer-assisted clinical chemistry laboratory.

    PubMed Central

    Undrill, P E; Frazer, S C

    1979-01-01

    A locally developed, computer-based clinical chemistry laboratory system has been in operation since 1970. This utilises a Digital Equipment Co Ltd PDP 12 and an interconnected PDP 8/F computer. Details are presented of the performance and quality control techniques incorporated into the system. Laboratory performance is assessed through analysis of results from fixed-level control sera as well as from cumulative sum methods. At a simple level the presentation may be considered purely indicative, while at a more sophisticated level statistical concepts have been introduced to aid the laboratory controller in decision-making processes. PMID:438340
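
    The cumulative sum (cusum) methods referred to above accumulate small, persistent deviations of control-serum results from their target value until a decision threshold is crossed. A minimal tabular cusum sketch follows; the target, standard deviation, allowance, and decision interval are illustrative assumptions, not parameters of the system described.

        # Tabular CUSUM for laboratory quality control: accumulate deviations of
        # control-serum results above/below the target and flag a run when either
        # cumulative sum exceeds the decision interval h.  All values are assumed.
        target, sigma = 5.0, 0.2          # target value and expected SD of control serum
        k, h = 0.5 * sigma, 4.0 * sigma   # allowance (slack) and decision interval

        results = [5.02, 4.98, 5.05, 5.21, 5.28, 5.33, 5.30, 5.38, 5.41]  # example runs

        c_plus = c_minus = 0.0
        for i, x in enumerate(results, start=1):
            c_plus = max(0.0, c_plus + (x - target) - k)
            c_minus = max(0.0, c_minus + (target - x) - k)
            if c_plus > h or c_minus > h:
                print(f"run {i}: CUSUM signal (C+={c_plus:.2f}, C-={c_minus:.2f})")
                break
        else:
            print("no out-of-control signal")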

  8. Planetary investigation utilizing an imaging spectrometer system based upon charge injection technology

    NASA Technical Reports Server (NTRS)

    Wattson, R. B.; Harvey, P.; Swift, R.

    1975-01-01

    An intrinsic silicon charge injection device (CID) television sensor array has been used in conjunction with a CaMoO4 colinear tunable acousto-optic filter, a 61 inch reflector, a sophisticated computer system, and a digital color TV scan converter/computer to produce near IR images of Saturn and Jupiter with 10 Å spectral resolution and approximately 3 inch spatial resolution. The CID camera has successfully obtained digitized 100 x 100 array images with 5 minutes of exposure time, and slow-scanned readout to a computer. Details of the equipment setup, innovations, problems, experience, data and final equipment performance limits are given.

  9. Symmetrically private information retrieval based on blind quantum computing

    NASA Astrophysics Data System (ADS)

    Sun, Zhiwei; Yu, Jianping; Wang, Ping; Xu, Lingling

    2015-05-01

    Universal blind quantum computation (UBQC) is a new secure quantum computing protocol which allows a user Alice who does not have any sophisticated quantum technology to delegate her computing to a server Bob without leaking any privacy. Using the features of UBQC, we propose a protocol to achieve symmetrically private information retrieval, which allows a quantum limited Alice to query an item from Bob with a fully fledged quantum computer; meanwhile, the privacy of both parties is preserved. The security of our protocol is based on the assumption that malicious Alice has no quantum computer, which avoids the impossibility proof of Lo. For the honest Alice, she is almost classical and only requires minimal quantum resources to carry out the proposed protocol. Therefore, she does not need any expensive laboratory which can maintain the coherence of complicated quantum experimental setups.

  10. Novel opportunities for computational biology and sociology in drug discovery

    PubMed Central

    Yao, Lixia

    2009-01-01

    Drug discovery today is impossible without sophisticated modeling and computation. In this review we touch on previous advances in computational biology and by tracing the steps involved in pharmaceutical development, we explore a range of novel, high value opportunities for computational innovation in modeling the biological process of disease and the social process of drug discovery. These opportunities include text mining for new drug leads, modeling molecular pathways and predicting the efficacy of drug cocktails, analyzing genetic overlap between diseases and predicting alternative drug use. Computation can also be used to model research teams and innovative regions and to estimate the value of academy-industry ties for scientific and human benefit. Attention to these opportunities could promise punctuated advance, and will complement the well-established computational work on which drug discovery currently relies. PMID:19674801

  11. 3D Texture Features Mining for MRI Brain Tumor Identification

    NASA Astrophysics Data System (ADS)

    Rahim, Mohd Shafry Mohd; Saba, Tanzila; Nayer, Fatima; Syed, Afraz Zahra

    2014-03-01

    Medical image segmentation is the process of extracting regions of interest and dividing an image into meaningful, homogeneous components that correspond closely to the objects of interest. It is a mandatory initial step for computer-aided diagnosis and therapy, and a challenging task because of the complex nature of medical images; successful medical image analysis depends heavily on segmentation accuracy. Texture is one of the major features used to identify regions of interest in an image or to classify an object, and 2D texture features alone yield poor classification results. Hence, this paper presents 3D feature extraction using texture analysis, with an SVM as the segmentation technique in the testing methodology.
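
    A toy version of the described pipeline is sketched below: simple 3D texture features (local mean and variance over a cubic neighbourhood) are computed for every voxel of a synthetic volume and classified with an SVM. The volume, labels, and feature choice are invented stand-ins, not the texture features used in the paper.

        # Voxel-wise 3D texture features (local mean and variance over a 5x5x5
        # neighbourhood) classified with an SVM.  The synthetic "lesion" volume
        # and labels are invented.
        import numpy as np
        from scipy.ndimage import uniform_filter
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)
        vol = rng.normal(0.0, 1.0, (32, 32, 32))
        vol[10:20, 10:20, 10:20] += rng.normal(1.5, 0.5, (10, 10, 10))  # different texture
        labels = np.zeros(vol.shape, dtype=int)
        labels[10:20, 10:20, 10:20] = 1

        mean = uniform_filter(vol, size=5)                    # local mean
        var = uniform_filter(vol**2, size=5) - mean**2        # local variance
        X = np.stack([mean.ravel(), var.ravel()], axis=1)
        y = labels.ravel()

        train = rng.random(y.size) < 0.1                      # train on 10% of voxels
        clf = SVC(kernel="rbf").fit(X[train], y[train])
        print(f"voxel accuracy on held-out voxels: {clf.score(X[~train], y[~train]):.3f}")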

  12. Recent advances in modeling languages for pathway maps and computable biological networks.

    PubMed

    Slater, Ted

    2014-02-01

    As our theories of systems biology grow more sophisticated, the models we use to represent them become larger and more complex. Modeling languages must have the expressivity and flexibility required to represent these models in ways that support high-resolution annotation and provide for simulation and analysis sophisticated enough to allow researchers to master their data in the proper context. These languages also need to facilitate model sharing and collaboration, which is currently best done by using uniform data structures (such as graphs) and language standards. In this brief review, we discuss three of the most recent systems biology modeling languages to appear: BEL, PySB and BCML, and examine how they meet these needs. Copyright © 2014 Elsevier Ltd. All rights reserved.

  13. Data mining: sophisticated forms of managed care modeling through artificial intelligence.

    PubMed

    Borok, L S

    1997-01-01

    Data mining is a recent development in computer science that combines artificial intelligence algorithms and relational databases to discover patterns automatically, without the use of traditional statistical methods. Work with data mining tools in health care is in a developmental stage that holds great promise, given the combination of demographic and diagnostic information.

  14. Automation Problems of 1968; Papers Presented at the Meeting...October 4-5, 1968.

    ERIC Educational Resources Information Center

    Andrews, Theodora, Ed.

    Librarians and their concerned colleagues met to give, hear and discuss papers on library automation, primarily by computers. Noted at this second meeting on library automation were: (1) considerably more sophistication and casualness about the techniques involved, (2) considerably more assurance of what and where things can be applied and (3)…

  15. The Relationship of Lexical Richness to the Quality of ESL Learners' Oral Narratives

    ERIC Educational Resources Information Center

    Lu, Xiaofei

    2012-01-01

    This study was an examination of the relationship of lexical richness to the quality of English as a second language (ESL) learners' oral narratives. A computational system was designed to automate the measurement of 3 dimensions of lexical richness, that is, lexical density, sophistication, and variation, using 25 different metrics proposed in…
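
    As a rough illustration of what such measures capture, the sketch below computes three simple lexical richness figures for a text: lexical density (content words over all words), lexical variation (type-token ratio of content words), and a crude sophistication proxy based on a small basic-word list. The word lists and tokenization are deliberately simplified assumptions, not the 25 metrics used in the study.

        # Toy lexical richness metrics: density, variation, and a crude
        # sophistication proxy.  The stop-word and "basic vocabulary" lists are
        # tiny illustrative assumptions.
        import re

        STOP = {"the", "a", "an", "and", "or", "but", "is", "are", "was", "were",
                "to", "of", "in", "on", "at", "it", "he", "she", "they", "i"}
        BASIC = {"good", "big", "small", "go", "see", "make", "very", "man",
                 "woman", "day", "boy", "dog", "old", "walk", "walked", "saw"}

        def lexical_richness(text):
            tokens = re.findall(r"[a-z']+", text.lower())
            content = [t for t in tokens if t not in STOP]
            density = len(content) / len(tokens)                  # lexical density
            variation = len(set(content)) / len(content)          # type-token ratio
            sophistication = sum(t not in BASIC for t in content) / len(content)
            return density, variation, sophistication

        sample = "The small boy saw a very big dog and they walked to the old harbour"
        d, v, s = lexical_richness(sample)
        print(f"density={d:.2f} variation={v:.2f} sophistication={s:.2f}")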

  16. Requirements for SPIRES II. An External Specification for the Stanford Public Information Retrieval System.

    ERIC Educational Resources Information Center

    Parker, Edwin B.

    SPIRES (Stanford Public Information Retrieval System) is a computerized information storage and retrieval system intended for use by students and faculty members who have little knowledge of computers but who need rapid and sophisticated retrieval and analysis. The functions and capabilities of the system from the user's point of view are…

  17. Science of Security Lablet - Scalability and Usability

    DTIC Science & Technology

    2014-12-16

    mobile computing [19]. However, the high-level infrastructure design and our own implementation (both described throughout this paper) can easily...critical and infrastructural systems demands high levels of sophistication in the technical aspects of cybersecurity, software and hardware design...Forget, S. Komanduri, Alessandro Acquisti, Nicolas Christin, Lorrie Cranor, Rahul Telang. "Security Behavior Observatory: Infrastructure for Long-term

  18. 3DGRAPE - THREE DIMENSIONAL GRIDS ABOUT ANYTHING BY POISSON'S EQUATION

    NASA Technical Reports Server (NTRS)

    Sorenson, R. L.

    1994-01-01

    The ability to treat arbitrary boundary shapes is one of the most desirable characteristics of a method for generating grids. 3DGRAPE is designed to make computational grids in or about almost any shape. These grids are generated by the solution of Poisson's differential equations in three dimensions. The program automatically finds its own values for inhomogeneous terms which give near-orthogonality and controlled grid cell height at boundaries. Grids generated by 3DGRAPE have been applied to both viscous and inviscid aerodynamic problems, and to problems in other fluid-dynamic areas. 3DGRAPE uses zones to solve the problem of warping one cube into the physical domain in real-world computational fluid dynamics problems. In a zonal approach, a physical domain is divided into regions, each of which maps into its own computational cube. It is believed that even the most complicated physical region can be divided into zones, and since it is possible to warp a cube into each zone, a grid generator which is oriented to zones and allows communication across zonal boundaries (where appropriate) solves the problem of topological complexity. 3DGRAPE expects to read in already-distributed x,y,z coordinates on the bodies of interest, coordinates which will remain fixed during the entire grid-generation process. The 3DGRAPE code makes no attempt to fit given body shapes and redistribute points thereon. Body-fitting is a formidable problem in itself. The user must either be working with some simple analytical body shape, upon which a simple analytical distribution can be easily effected, or must have available some sophisticated stand-alone body-fitting software. 3DGRAPE does not require the user to supply the block-to-block boundaries nor the shapes of the distribution of points. 3DGRAPE will typically supply those block-to-block boundaries simply as surfaces in the elliptic grid. Thus at block-to-block boundaries the following conditions are obtained: (1) grids lines will match up as they approach the block-to-block boundary from either side, (2) grid lines will cross the boundary with no slope discontinuity, (3) the spacing of points along the line piercing the boundary will be continuous, (4) the shape of the boundary will be consistent with the surrounding grid, and (5) the distribution of points on the boundary will be reasonable in view of the surrounding grid. 3DGRAPE offers a powerful building-block approach to complex 3-D grid generation, but is a low-level tool. Users may build each face of each block as they wish, from a wide variety of resources. 3DGRAPE uses point-successive-over-relaxation (point-SOR) to solve the Poisson equations. This method is slow, although it does vectorize nicely. Any number of sophisticated graphics programs may be used on the stored output file of 3DGRAPE though it lacks interactive graphics. Versatility was a prominent consideration in developing the code. The block structure allows a great latitude in the problems it can treat. As the acronym implies, this program should be able to handle just about any physical region into which a computational cube or cubes can be warped. 3DGRAPE was written in FORTRAN 77 and should be machine independent. It was originally developed on a Cray under COS and tested on a MicroVAX 3200 under VMS 5.1.
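
    The numerical kernel of 3DGRAPE is point-SOR applied to elliptic equations for the grid-point coordinates. A much-simplified two-dimensional illustration follows: it solves Laplace equations for x(xi, eta) and y(xi, eta) with fixed boundary points, i.e. the homogeneous special case with no inhomogeneous control terms for orthogonality or spacing, on an invented annular example domain.

        # Simplified 2D elliptic grid generation by point-SOR: interior grid-point
        # coordinates satisfy discrete Laplace equations in computational space
        # while boundary points stay fixed.  The annulus boundary is invented.
        import numpy as np

        ni, nj = 21, 11
        xi = np.linspace(0.0, 2.0 * np.pi, ni)
        x = np.zeros((ni, nj))
        y = np.zeros((ni, nj))
        # Boundary: inner circle (j = 0) and outer ellipse (j = nj - 1)
        x[:, 0], y[:, 0] = np.cos(xi), np.sin(xi)
        x[:, -1], y[:, -1] = 2.0 * np.cos(xi), 1.5 * np.sin(xi)
        for j in range(nj):                     # straight seam edges at i = 0 and i = ni - 1
            t = j / (nj - 1)
            x[0, j] = x[-1, j] = (1 - t) * x[0, 0] + t * x[0, -1]
            y[0, j] = y[-1, j] = (1 - t) * y[0, 0] + t * y[0, -1]
        # Initial guess: linear interpolation between inner and outer boundaries
        for j in range(1, nj - 1):
            t = j / (nj - 1)
            x[1:-1, j] = (1 - t) * x[1:-1, 0] + t * x[1:-1, -1]
            y[1:-1, j] = (1 - t) * y[1:-1, 0] + t * y[1:-1, -1]

        omega = 1.7                              # SOR over-relaxation factor
        for _ in range(500):
            for i in range(1, ni - 1):
                for j in range(1, nj - 1):
                    for f in (x, y):
                        new = 0.25 * (f[i + 1, j] + f[i - 1, j] + f[i, j + 1] + f[i, j - 1])
                        f[i, j] += omega * (new - f[i, j])

        print("interior point (10, 5):", x[10, 5], y[10, 5])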

  19. Grids: The Top Ten Questions

    DOE PAGES

    Schopf, Jennifer M.; Nitzberg, Bill

    2002-01-01

    The design and implementation of a national computing system and data grid has become a reachable goal from both the computer science and computational science point of view. A distributed infrastructure capable of sophisticated computational functions can bring many benefits to scientific work, but poses many challenges, both technical and socio-political. Technical challenges include having basic software tools, higher-level services, functioning and pervasive security, and standards, while socio-political issues include building a user community, adding incentives for sites to be part of a user-centric environment, and educating funding sources about the needs of this community. This paper details the areas relating to Grid research that we feel still need to be addressed to fully leverage the advantages of the Grid.

  20. Software Surface Modeling and Grid Generation Steering Committee

    NASA Technical Reports Server (NTRS)

    Smith, Robert E. (Editor)

    1992-01-01

    It is a NASA objective to promote improvements in the capability and efficiency of computational fluid dynamics. Grid generation, the creation of a discrete representation of the solution domain, is an essential part of computational fluid dynamics. However, grid generation about complex boundaries requires sophisticated surface-model descriptions of the boundaries. The surface modeling and the associated computation of surface grids consume an extremely large percentage of the total time required for volume grid generation. Efficient and user friendly software systems for surface modeling and grid generation are critical for computational fluid dynamics to reach its potential. The papers presented here represent the state-of-the-art in software systems for surface modeling and grid generation. Several papers describe improved techniques for grid generation.

  1. Computational protein design with backbone plasticity

    PubMed Central

    MacDonald, James T.; Freemont, Paul S.

    2016-01-01

    The computational algorithms used in the design of artificial proteins have become increasingly sophisticated in recent years, producing a series of remarkable successes. The most dramatic of these is the de novo design of artificial enzymes. The majority of these designs have reused naturally occurring protein structures as ‘scaffolds’ onto which novel functionality can be grafted without having to redesign the backbone structure. The incorporation of backbone flexibility into protein design is a much more computationally challenging problem due to the greatly increased search space, but promises to remove the limitations of reusing natural protein scaffolds. In this review, we outline the principles of computational protein design methods and discuss recent efforts to consider backbone plasticity in the design process. PMID:27911735

  2. Simple geometric algorithms to aid in clearance management for robotic mechanisms

    NASA Technical Reports Server (NTRS)

    Copeland, E. L.; Ray, L. D.; Peticolas, J. D.

    1981-01-01

    Global geometric shapes such as lines, planes, circles, spheres, and cylinders, and the associated computational algorithms which provide relatively inexpensive estimates of minimum spatial clearance for safe operations, were selected. The Space Shuttle, its remote manipulator system, and the Power Extension Package are used as examples. Robotic mechanisms operate in quarters limited by external structures, and the problem of clearance is often of considerable interest. Safe clearance management is simple and suited to real-time calculation, whereas contact prediction requires more precision, sophistication, and computational overhead.
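
    A typical inexpensive clearance estimate of the kind selected here treats each link as a line segment with a bounding radius (a capsule) and evaluates the minimum distance between segments. The sketch below approximates the segment-segment distance by sampling one segment and using the exact point-to-segment distance to the other; the geometry and radii are invented example values, not Shuttle or remote manipulator system data.

        # Inexpensive clearance estimate between two links modelled as capsules
        # (line segments with bounding radii).  Geometry and radii are invented.
        import numpy as np

        def point_segment_dist(p, a, b):
            ab = b - a
            t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
            return np.linalg.norm(p - (a + t * ab))

        def capsule_clearance(a0, a1, r_a, b0, b1, r_b, n_samples=50):
            ts = np.linspace(0.0, 1.0, n_samples)
            d = min(point_segment_dist(a0 + t * (a1 - a0), b0, b1) for t in ts)
            return d - (r_a + r_b)          # negative => possible contact

        # A manipulator link vs. a structural boom (example numbers)
        link_a = (np.array([0.0, 0.0, 0.0]), np.array([2.0, 0.0, 1.0]), 0.15)
        boom_b = (np.array([1.0, 1.0, 0.0]), np.array([1.0, -1.0, 2.0]), 0.30)

        clearance = capsule_clearance(*link_a, *boom_b)
        print(f"estimated clearance: {clearance:.3f} m")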

  3. International Guide to Highway Transportation Information: Volume 1 - Highway Transportation Libraries and Information Centers

    DOT National Transportation Integrated Search

    2013-01-01

    The FHWA Road Weather Management Program partnered with Utah DOT to develop and implement advanced traveler information strategies during weather events. UDOT already has one of the most sophisticated Traffic Operations Centers (TOCs) in the country ...

  4. The Classes of Authoring Programs.

    ERIC Educational Resources Information Center

    Kozel, Kathy

    1997-01-01

    Provides an overview of developments in authoring tools and describes ways to categorize products by platform, type of end-product, sophistication of end-product, and authoring metaphor. Discusses products from AimTech, Allegiant, Allen Communication, Asymetrix, Corel, Discovery Systems International, Enigma, Harrow Media, Horizons, Innovus,…

  5. Challenges and opportunities for the commercialization of postharvest biocontrol

    USDA-ARS?s Scientific Manuscript database

    The past twenty years has seen the field of postharvest biocontrol evolve into a sophisticated science with global research programs worldwide, numerous yearly publications, patented technologies, and the development of new commercial products. The use of these products, however, still remains limi...

  6. Commercialization of postharvest biocontrol: barriers and opportunities

    USDA-ARS?s Scientific Manuscript database

    The past twenty years has seen the field of postharvest biocontrol evolve from an unknown entity with one or two novel reports in the literature to a sophisticated science with strong research programs worldwide, hundreds of publications, patented technologies, and now several commercial products. ...

  7. U1A Complex

    ScienceCinema

    None

    2018-01-16

    Some of the most sophisticated experiments in the stockpile stewardship program are conducted in an environmentally safe manner, nearly 1000 feet below the ground at the site. The U1a complex, a sprawling underground laboratory and tunnel complex, is home to a number of unique capabilities.

  8. Regional-Scale Modeling at NASA Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    Tao, W.-K.; Adler, R.; Baker, D.; Braun, S.; Chou, M.-D.; Jasinski, M. F.; Jia, Y.; Kakar, R.; Karyampudi, M.; Lang, S.

    2003-01-01

    Over the past decade, the Goddard Mesoscale Modeling and Dynamics Group has used a popular regional scale model, MM5, to study precipitation processes. Our group is making contributions to the MM5 by incorporating the following physical and numerical packages: improved Goddard cloud processes, a land processes model (Parameterization for Land-Atmosphere-Cloud Exchange - PLACE), efficient but sophisticated radiative processes, conservation of hydrometeor mass (water budget), four-dimensional data assimilation for rainfall, and better computational methods for trace gas transport. At NASA Goddard, the MM5 has been used to study: (1) the impact of initial conditions, assimilation of satellite-derived rainfall, and cumulus parameterizations on rapidly intensifying oceanic cyclones, hurricanes and typhoons, (2) the dynamic and thermodynamic processes associated with the development of narrow cold frontal rainbands, (3) regional climate and water cycles, (4) the impact of vertical transport by clouds and lightning on trace gas distribution/production associated with South and North American mesoscale convective systems, (5) the development of a westerly wind burst (WWB) that occurred during the TOGA COARE and the diurnal variation of precipitation in the tropics, (6) a Florida sea breeze convective event and a Mid-US flood event using a sophisticated land surface model, (7) the influence of soil heterogeneity on land surface energy balance in the southwest GCIP region, (8) explicit simulations (with 1.33 to 4 km horizontal resolution) of hurricanes Bob (1991) and Bonnie (1998), (9) a heavy precipitation event over Taiwan, and (10) to make real time forecasts for a major NASA field program. In this paper, the modifications and simulated cases will be described and discussed.

  9. Science Language Accommodation in Elementary School Read-Alouds

    NASA Astrophysics Data System (ADS)

    Glass, Rory; Oliveira, Alandeom W.

    2014-03-01

    This study examines the pedagogical functions of accommodation (i.e. provision of simplified science speech) in science read-aloud sessions facilitated by five elementary teachers. We conceive of read-alouds as communicative events wherein teachers, faced with the task of orally delivering a science text of relatively high linguistic complexity, open up an alternate channel of communication, namely oral discussion. By doing so, teachers grant students access to a simplified linguistic input, a strategy designed to promote student comprehension of the textual contents of children's science books. It was found that nearly half (46%) of the read-aloud time was allotted to discussions with an increased percentage of less sophisticated words and reduced use of more sophisticated vocabulary than found in the books through communicative strategies such as simplified rewording, simplified definition, and simplified questioning. Further, aloud reading of more linguistically complex books required longer periods of discussion and an increased degree of teacher oral input and accommodation. We also found evidence of reversed simplification (i.e. sophistication), leading to student uptake of scientific language. The main significance of this study is that it reveals that teacher talk serves two often competing pedagogical functions (accessible communication of scientific information to students and promotion of student acquisition of the specialized language of science). It also underscores the importance of giving analytical consideration to the simplification-sophistication dimension of science classroom discourse as well as the potential of computer-based analysis of classroom discourse to inform science teaching.

  10. Amoeba-based computing for traveling salesman problem: long-term correlations between spatially separated individual cells of Physarum polycephalum.

    PubMed

    Zhu, Liping; Aono, Masashi; Kim, Song-Ju; Hara, Masahiko

    2013-04-01

    A single-celled, multi-nucleated amoeboid organism, a plasmodium of the true slime mold Physarum polycephalum, can perform sophisticated computing by exhibiting complex spatiotemporal oscillatory dynamics while deforming its amorphous body. We previously devised an "amoeba-based computer (ABC)" to quantitatively evaluate the optimization capability of the amoeboid organism in searching for a solution to the traveling salesman problem (TSP) under optical feedback control. In ABC, the organism changes its shape to find a high quality solution (a relatively shorter TSP route) by alternately expanding and contracting its pseudopod-like branches that exhibit local photoavoidance behavior. The quality of the solution serves as a measure of the optimality of which the organism maximizes its global body area (nutrient absorption) while minimizing the risk of being illuminated (exposure to aversive stimuli). ABC found a high quality solution for the 8-city TSP with a high probability. However, it remains unclear whether intracellular communication among the branches of the organism is essential for computing. In this study, we conducted a series of control experiments using two individual cells (two single-celled organisms) to perform parallel searches in the absence of intercellular communication. We found that ABC drastically lost its ability to find a solution when it used two independent individuals. However, interestingly, when two individuals were prepared by dividing one individual, they found a solution for a few tens of minutes. That is, the two divided individuals remained correlated even though they were spatially separated. These results suggest the presence of a long-term memory in the intrinsic dynamics of this organism and its significance in performing sophisticated computing. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  11. Development and Application of a Numerical Framework for Improving Building Foundation Heat Transfer Calculations

    NASA Astrophysics Data System (ADS)

    Kruis, Nathanael J. F.

    Heat transfer from building foundations varies significantly in all three spatial dimensions and has important dynamic effects at all timescales, from one hour to several years. With the additional consideration of moisture transport, ground freezing, evapotranspiration, and other physical phenomena, the estimation of foundation heat transfer becomes increasingly sophisticated and computationally intensive to the point where accuracy must be compromised for reasonable computation time. The tools currently available to calculate foundation heat transfer are often either too limited in their capabilities to draw meaningful conclusions or too sophisticated to use in common practices. This work presents Kiva, a new foundation heat transfer computational framework. Kiva provides a flexible environment for testing different numerical schemes, initialization methods, spatial and temporal discretizations, and geometric approximations. Comparisons within this framework provide insight into the balance of computation speed and accuracy relative to highly detailed reference solutions. The accuracy and computational performance of six finite difference numerical schemes are verified against established IEA BESTEST test cases for slab-on-grade heat conduction. Of the schemes tested, the Alternating Direction Implicit (ADI) scheme demonstrates the best balance between accuracy, performance, and numerical stability. Kiva features four approaches of initializing soil temperatures for an annual simulation. A new accelerated initialization approach is shown to significantly reduce the required years of presimulation. Methods of approximating three-dimensional heat transfer within a representative two-dimensional context further improve computational performance. A new approximation called the boundary layer adjustment method is shown to improve accuracy over other established methods with a negligible increase in computation time. This method accounts for the reduced heat transfer from concave foundation shapes, which has not been adequately addressed to date. Within the Kiva framework, three-dimensional heat transfer that can require several days to simulate is approximated in two-dimensions in a matter of seconds while maintaining a mean absolute deviation within 3%.
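
    Of the schemes compared, ADI splits each two-dimensional time step into two one-dimensional implicit sweeps, so only tridiagonal systems have to be solved. A compact sketch of one ADI step for 2D heat conduction on a uniform grid follows; the grid size, diffusivity, time step, and boundary temperatures are illustrative assumptions, not Kiva's foundation domain.

        # One Alternating Direction Implicit (ADI) step for 2D heat conduction:
        # an implicit sweep in x (explicit in y), then an implicit sweep in y
        # (explicit in x), each requiring only tridiagonal solves.  All values
        # below are assumed example numbers.
        import numpy as np
        from scipy.linalg import solve_banded

        n, alpha, dx, dt = 41, 1e-6, 0.05, 3600.0       # soil-like diffusivity, 1-h step
        r = alpha * dt / (2.0 * dx**2)

        T = np.full((n, n), 10.0)                        # initial ground temperature (C)
        T[0, :] = 20.0                                   # warm slab surface along one edge

        def tridiag_solve(rhs):
            m = rhs.size
            ab = np.zeros((3, m))
            ab[0, 1:] = -r                               # upper diagonal
            ab[1, :] = 1.0 + 2.0 * r                     # main diagonal
            ab[2, :-1] = -r                              # lower diagonal
            return solve_banded((1, 1), ab, rhs)

        def adi_step(T):
            half = T.copy()
            # Sweep 1: implicit in x (first index), explicit in y
            for j in range(1, n - 1):
                rhs = T[1:-1, j] + r * (T[1:-1, j + 1] - 2 * T[1:-1, j] + T[1:-1, j - 1])
                rhs[0] += r * T[0, j]
                rhs[-1] += r * T[-1, j]
                half[1:-1, j] = tridiag_solve(rhs)
            full = half.copy()
            # Sweep 2: implicit in y (second index), explicit in x
            for i in range(1, n - 1):
                rhs = half[i, 1:-1] + r * (half[i + 1, 1:-1] - 2 * half[i, 1:-1] + half[i - 1, 1:-1])
                rhs[0] += r * half[i, 0]
                rhs[-1] += r * half[i, -1]
                full[i, 1:-1] = tridiag_solve(rhs)
            return full

        for _ in range(24):                               # one day of hourly steps
            T = adi_step(T)
        print(f"temperature at domain centre after 24 h: {T[n // 2, n // 2]:.2f} C")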

  12. Reducing the Time and Cost of Testing Engines

    NASA Technical Reports Server (NTRS)

    2004-01-01

    Producing a new aircraft engine currently costs approximately $1 billion, with 3 years of development time for a commercial engine and 10 years for a military engine. The high development time and cost make it extremely difficult to transition advanced technologies for cleaner, quieter, and more efficient new engines. To reduce this time and cost, NASA created a vision for the future where designers would use high-fidelity computer simulations early in the design process in order to resolve critical design issues before building the expensive engine hardware. To accomplish this vision, NASA's Glenn Research Center initiated a collaborative effort with the aerospace industry and academia to develop its Numerical Propulsion System Simulation (NPSS), an advanced engineering environment for the analysis and design of aerospace propulsion systems and components. Partners estimate that using NPSS has the potential to dramatically reduce the time, effort, and expense necessary to design and test jet engines by generating sophisticated computer simulations of an aerospace object or system. These simulations will permit an engineer to test various design options without having to conduct costly and time-consuming real-life tests. By accelerating and streamlining the engine system design analysis and test phases, NPSS facilitates bringing the final product to market faster. NASA's NPSS Version (V)1.X effort was a task within the Agency's Computational Aerospace Sciences project of the High Performance Computing and Communication program, which had a mission to accelerate the availability of high-performance computing hardware and software to the U.S. aerospace community for its use in design processes. The technology brings value back to NASA by improving methods of analyzing and testing space transportation components.

  13. A Computer-Controlled Laser Bore Scanner

    NASA Astrophysics Data System (ADS)

    Cheng, Charles C.

    1980-08-01

    This paper describes the design and engineering of a laser scanning system for production applications. The laser scanning techniques, the timing control, the logic design of the pattern recognition subsystem, the digital computer servo control for the loading and un-loading of parts, and the laser probe rotation and its synchronization will be discussed. The laser inspection machine is designed to automatically inspect the surface of precision-bored holes, such as those in automobile master cylinders, without contacting the machined surface. Although the controls are relatively sophisticated, operation of the laser inspection machine is simple. A laser light beam from a commercially available gas laser, directed through a probe, scans the entire surface of the bore. Reflected light, picked up through optics by photoelectric sensors, generates signals that are fed to a mini-computer for processing. A pattern recognition techniques program in the computer determines acceptance or rejection of the part being inspected. The system's acceptance specifications are adjustable and are set to the user's established tolerances. However, the computer-controlled laser system is capable of defining from 10 to 75 rms surface finish, and voids or flaws from 0.0005 to 0.020 inch. Following the successful demonstration with an engineering prototype, the described laser machine has proved its capability to consistently ensure high-quality master brake cylinders. It thus provides a safety improvement for the automotive braking system. Flawless, smooth cylinder bores eliminate premature wearing of the rubber seals, resulting in a longer-lasting master brake cylinder and a safer and more reliable automobile. The results obtained from use of this system, which has been in operation about a year for replacement of a tedious, manual operation on one of the high-volume lines at the Bendix Hydraulics Division, have been very satisfactory.

  14. cudaMap: a GPU accelerated program for gene expression connectivity mapping.

    PubMed

    McArt, Darragh G; Bankhead, Peter; Dunne, Philip D; Salto-Tellez, Manuel; Hamilton, Peter; Zhang, Shu-Dong

    2013-10-11

    Modern cancer research often involves large datasets and the use of sophisticated statistical techniques. Together these add a heavy computational load to the analysis, which is often coupled with issues surrounding data accessibility. Connectivity mapping is an advanced bioinformatic and computational technique dedicated to therapeutics discovery and drug re-purposing around differential gene expression analysis. On a normal desktop PC, it is common for the connectivity mapping task with a single gene signature to take > 2h to complete using sscMap, a popular Java application that runs on standard CPUs (Central Processing Units). Here, we describe new software, cudaMap, which has been implemented using CUDA C/C++ to harness the computational power of NVIDIA GPUs (Graphics Processing Units) to greatly reduce processing times for connectivity mapping. cudaMap can identify candidate therapeutics from the same signature in just over thirty seconds when using an NVIDIA Tesla C2050 GPU. Results from the analysis of multiple gene signatures, which would previously have taken several days, can now be obtained in as little as 10 minutes, greatly facilitating candidate therapeutics discovery with high throughput. We are able to demonstrate dramatic speed differentials between GPU assisted performance and CPU executions as the computational load increases for high accuracy evaluation of statistical significance. Emerging 'omics' technologies are constantly increasing the volume of data and information to be processed in all areas of biomedical research. Embracing the multicore functionality of GPUs represents a major avenue of local accelerated computing. cudaMap will make a strong contribution in the discovery of candidate therapeutics by enabling speedy execution of heavy duty connectivity mapping tasks, which are increasingly required in modern cancer research. cudaMap is open source and can be freely downloaded from http://purl.oclc.org/NET/cudaMap.
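
    Connectivity mapping scores how strongly a query gene signature matches each ranked reference expression profile, and the expensive part is recomputing such scores against thousands of profiles and random signatures for significance testing, which is the work cudaMap moves onto the GPU. The sketch below uses a simple signed rank-sum score and a permutation test on random stand-in data; it illustrates the idea only and is not the exact sscMap/cudaMap statistic.

        # Simplified connectivity-mapping score: rank genes in each reference
        # profile and sum the signed ranks of the signature genes; a permutation
        # test estimates significance.  Data are random stand-ins.
        import numpy as np

        rng = np.random.default_rng(7)
        n_genes, n_profiles = 5000, 200
        profiles = rng.normal(size=(n_profiles, n_genes))   # reference expression changes

        signature_genes = rng.choice(n_genes, size=30, replace=False)
        signature_sign = rng.choice([-1, 1], size=30)        # up-/down-regulated in query

        def connection_score(profile):
            ranks = profile.argsort().argsort() - (n_genes - 1) / 2.0   # centred ranks
            return float(signature_sign @ ranks[signature_genes])

        scores = np.array([connection_score(p) for p in profiles])

        # Permutation test for the best-matching profile
        best = int(np.argmax(np.abs(scores)))
        ranks_best = profiles[best].argsort().argsort() - (n_genes - 1) / 2.0
        null = np.array([signature_sign @ ranks_best[rng.choice(n_genes, 30, replace=False)]
                         for _ in range(2000)])
        p_value = float(np.mean(np.abs(null) >= abs(scores[best])))
        print(f"best profile #{best}: score={scores[best]:.1f}, permutation p~{p_value:.3f}")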

  15. A sophisticated, multi-channel data acquisition and processing system for high frequency noise research

    NASA Technical Reports Server (NTRS)

    Hall, David G.; Bridges, James

    1992-01-01

    A sophisticated, multi-channel computerized data acquisition and processing system was developed at the NASA LeRC for use in noise experiments. This technology, which is available for transfer to industry, provides a convenient, cost-effective alternative to analog tape recording for high frequency acoustic measurements. This system provides 32-channel acquisition of microphone signals with an analysis bandwidth up to 100 kHz per channel. Cost was minimized through the use of off-the-shelf components. Requirements to allow for future expansion were met by choosing equipment which adheres to established industry standards for hardware and software. Data processing capabilities include narrow band and 1/3 octave spectral analysis, compensation for microphone frequency response/directivity, and correction of acoustic data to standard day conditions. The system was used successfully in a major wind tunnel test program at NASA LeRC to acquire and analyze jet noise data in support of the High Speed Civil Transport (HSCT) program.

  16. New generation of exploration tools: interactive modeling software and microcomputers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krajewski, S.A.

    1986-08-01

    Software packages offering interactive modeling techniques are now available for use on microcomputer hardware systems. These packages are reasonably priced for both company and independent explorationists; they do not require users to have high levels of computer literacy; they are capable of rapidly completing complex ranges of sophisticated geologic and geophysical modeling tasks; and they can produce presentation-quality output for comparison with real-world data. For example, interactive packages are available for mapping, log analysis, seismic modeling, reservoir studies, and financial projects as well as for applying a variety of statistical and geostatistical techniques to analysis of exploration data. More importantly, these packages enable explorationists to directly apply their geologic expertise when developing and fine-tuning models for identifying new prospects and for extending producing fields. As a result of these features, microcomputers and interactive modeling software are becoming common tools in many exploration offices. Gravity and magnetics software programs illustrate some of the capabilities of such exploration tools.

  17. Lockheed L-1011 TriStar to support Adaptive Performance Optimization study with NASA F-18 chase plane

    NASA Technical Reports Server (NTRS)

    1995-01-01

    This Lockheed L-1011 TriStar, seen here in June 1995, is currently the subject of a new flight research experiment developed by NASA's Dryden Flight Research Center, Edwards, California, to improve the efficiency of large transport aircraft. Shown with a NASA F-18 chase plane over California's Sierra Nevada mountains during an earlier baseline flight, the jetliner, operated by Orbital Sciences Corp., recently flew its first data-gathering mission in the Adaptive Performance Optimization project. The experiment seeks to reduce fuel consumption of large jetliners by improving the aerodynamic efficiency of their wings at cruise conditions. A research computer employing a sophisticated software program adapts to changing flight conditions by commanding small movements of the L-1011's outboard ailerons to give its wings the most efficient - or optimal - airfoil. Up to a dozen research flights will be flown in the current and follow-on phases of the project over the next couple of years.

  18. The relative effects of entry parameters on thermal protection system weight. [space shuttle orbiters]

    NASA Technical Reports Server (NTRS)

    Hirasaki, P. N.

    1971-01-01

    Shielding a spacecraft from the severe thermal environment of an atmospheric entry requires a sophisticated thermal protection system (TPS). Thermal computer program models were developed for two such TPS designs proposed for the space shuttle orbiter. The two multilayer systems, a reusable surface insulation TPS and a re-radiative metallic skin TPS, were sized for a cross-section of trajectories in the entry corridor. This analysis indicates the relative influence of the entry parameters on the weight of each TPS concept. The results are summarized graphically. The trajectory variables considered were down-range, cross-range, orbit inclination, entry interface velocity and flight path angle, maximum heating rate level, angle of attack, and ballistic coefficient. Variations in cross-range and flight path angle over the ranges considered had virtually no effect on the required entry TPS weight. The TPS weight was significantly more sensitive to variations in angle of attack than to dispersions in the other trajectory variables considered.

  19. Digital Compositing Techniques for Coronal Imaging (Invited review)

    NASA Astrophysics Data System (ADS)

    Espenak, F.

    2000-04-01

    The solar corona exhibits a huge range in brightness which cannot be captured in any single photographic exposure. Short exposures show the bright inner corona and prominences, while long exposures reveal faint details in equatorial streamers and polar brushes. For many years, radial gradient filters and other analog techniques have been used to compress the corona's dynamic range in order to study its morphology. Such techniques demand perfect pointing and tracking during the eclipse, and can be difficult to calibrate. In the past decade, the speed, memory and hard disk capacity of personal computers have rapidly increased as prices continue to drop. It is now possible to perform sophisticated image processing of eclipse photographs on commercially available CPUs. Software programs such as Adobe Photoshop permit combining multiple eclipse photographs into a composite image which compresses the corona's dynamic range and can reveal subtle features and structures. Algorithms and digital techniques used for processing 1998 eclipse photographs will be discussed which are equally applicable to the recent eclipse of 1999 August 11.
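
    The compositing idea can be sketched in two steps: merge exposures with weights that discount saturated pixels, then subtract a blurred version of the merged image to compress the steep radial brightness gradient. The sketch below illustrates both steps on synthetic data; the weighting, box-blur kernel, and scaling factor are assumptions, not the author's actual Photoshop workflow.

        # Merge bracketed exposures, then apply an unsharp-mask-style compression.
        import numpy as np

        def composite(exposures, exposure_times, saturation=0.98):
            acc = np.zeros_like(exposures[0], dtype=float)
            wsum = np.zeros_like(acc)
            for img, t in zip(exposures, exposure_times):
                w = np.where(img < saturation, 1.0, 0.0)   # ignore blown-out pixels
                acc += w * img / t                         # rescale to a common radiance
                wsum += w
            return acc / np.maximum(wsum, 1e-9)

        def compress_range(img, kernel=15, strength=0.8):
            pad = np.pad(img, kernel, mode="edge")
            blur = np.zeros_like(img)
            for dy in range(-kernel, kernel + 1):          # crude box blur
                for dx in range(-kernel, kernel + 1):
                    blur += pad[kernel + dy: kernel + dy + img.shape[0],
                                kernel + dx: kernel + dx + img.shape[1]]
            blur /= (2 * kernel + 1) ** 2
            return img - strength * blur                   # keep small-scale structure

        rng = np.random.default_rng(2)
        frames = [np.clip(rng.random((64, 64)) * s, 0, 1) for s in (0.3, 1.0, 3.0)]
        hdr = composite(frames, [0.001, 0.01, 0.1])
        print(compress_range(hdr).shape)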

  20. Automated Blazar Light Curves Using Machine Learning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Spencer James

    Every night in a remote clearing called Fenton Hill, high in the Jemez Mountains of central New Mexico, a bank of robotically controlled telescopes tilts its lenses to the sky for another round of observation through digital imaging. Los Alamos National Laboratory's Thinking Telescopes project is watching for celestial transients, including high-power cosmic flashes, and like all science, it can be messy work. To keep the project clicking along, Los Alamos scientists routinely install equipment upgrades, maintain the site, and refine the sophisticated machine-learning computer programs that process those images and extract useful data from them. Each week the system amasses 100,000 digital images of the heavens, some of which are compromised by clouds, wind gusts, focus problems, and so on. For a graduate student at the Lab taking a year's break between master's and Ph.D. studies, working with state-of-the-art autonomous telescopes that can make fundamental discoveries feels light years beyond the classroom.

  1. Is Life Unique?

    PubMed Central

    Abel, David L.

    2011-01-01

    Is life physicochemically unique? No. Is life unique? Yes. Life manifests innumerable formalisms that cannot be generated or explained by physicodynamics alone. Life pursues thousands of biofunctional goals, not the least of which is staying alive. Neither physicodynamics, nor evolution, pursue goals. Life is largely directed by linear digital programming and by the Prescriptive Information (PI) instantiated particularly into physicodynamically indeterminate nucleotide sequencing. Epigenomic controls only compound the sophistication of these formalisms. Life employs representationalism through the use of symbol systems. Life manifests autonomy, homeostasis far from equilibrium in the harshest of environments, positive and negative feedback mechanisms, prevention and correction of its own errors, and organization of its components into Sustained Functional Systems (SFS). Chance and necessity—heat agitation and the cause-and-effect determinism of nature’s orderliness—cannot spawn formalisms such as mathematics, language, symbol systems, coding, decoding, logic, organization (not to be confused with mere self-ordering), integration of circuits, computational success, and the pursuit of functionality. All of these characteristics of life are formal, not physical. PMID:25382119

  2. DiscML: an R package for estimating evolutionary rates of discrete characters using maximum likelihood.

    PubMed

    Kim, Tane; Hao, Weilong

    2014-09-27

    The study of discrete characters is crucial for the understanding of evolutionary processes. Even though great advances have been made in the analysis of nucleotide sequences, computer programs for non-DNA discrete characters are often dedicated to specific analyses and lack flexibility. Discrete characters often have different transition rate matrices, variable rates among sites and sometimes contain unobservable states. To estimate a variety of discrete characters accurately, programs with sophisticated methodologies and flexible settings are desired. DiscML performs maximum likelihood estimation for evolutionary rates of discrete characters on a provided phylogeny, with options to correct for unobservable data, to model rate variation among sites, and to estimate unknown prior root probabilities from the empirical data. It gives users options to customize the instantaneous transition rate matrices, or to choose pre-determined matrices from models such as birth-and-death (BD), birth-death-and-innovation (BDI), equal rates (ER), symmetric (SYM), general time-reversible (GTR) and all rates different (ARD). Moreover, we show application examples of DiscML on gene family data and on intron presence/absence data. DiscML was developed as a unified R program for estimating evolutionary rates of discrete characters with no restriction on the number of character states, and with flexibility to use different transition models. DiscML is ideal for the analyses of binary (1s/0s) patterns, multi-gene families, and multistate discrete morphological characteristics.
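
    The core computation behind such estimates is Felsenstein's pruning algorithm evaluated over candidate rates. The sketch below computes the likelihood of a binary character on a tiny fixed tree under an equal-rates model and grid-searches the rate; the tree, branch lengths, and uniform root prior are made-up illustrations, not DiscML's R interface.

        # Likelihood of a 2-state character on a fixed tree, equal-rates (ER) model.
        import numpy as np

        def p_matrix(q, t):
            """2-state ER transition probabilities over a branch of length t."""
            same = 0.5 + 0.5 * np.exp(-2 * q * t)
            return np.array([[same, 1 - same], [1 - same, same]])

        def likelihood(q, tree, tips):
            """tree: node -> (left, right, left_branch, right_branch); tips: leaf -> state."""
            def partial(node):
                if node in tips:                           # leaf: indicator vector
                    vec = np.zeros(2)
                    vec[tips[node]] = 1.0
                    return vec
                left, right, bl, br = tree[node]
                return (p_matrix(q, bl) @ partial(left)) * (p_matrix(q, br) @ partial(right))
            return float(np.array([0.5, 0.5]) @ partial("root"))   # uniform root prior

        # ((A:0.1, B:0.1):0.2, C:0.3); character present in A and B, absent in C
        tree = {"root": ("AB", "C", 0.2, 0.3), "AB": ("A", "B", 0.1, 0.1)}
        tips = {"A": 1, "B": 1, "C": 0}
        rates = np.linspace(0.05, 5.0, 100)
        print("ML rate estimate ~", round(max(rates, key=lambda q: likelihood(q, tree, tips)), 2))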

  3. Atmospheric Radiation Measurement Program facilities newsletter, January 2000

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sisterson, D.L.

    2000-02-16

    The subject of this newsletter is the ARM unmanned aerospace vehicle program. The ARM Program's focus is on climate research, specifically research related to solar radiation and its interaction with clouds. The SGP CART site contains highly sophisticated surface instrumentation, but even these instruments cannot gather some crucial climate data from high in the atmosphere. The Department of Energy and the Department of Defense joined together to use a high-tech, high-altitude, long-endurance class of unmanned aircraft known as the unmanned aerospace vehicle (UAV). A UAV is a small, lightweight airplane that is controlled remotely from the ground. A pilot sits in a ground-based cockpit and flies the aircraft as if he were actually on board. The UAV can also fly completely on its own through the use of preprogrammed computer flight routines. The ARM UAV is fitted with payload instruments developed to make highly accurate measurements of atmospheric flux, radiance, and clouds. Using a UAV is beneficial to climate research in many ways. The UAV puts the instrumentation within the environment being studied and gives scientists direct measurements, in contrast to indirect measurements from satellites orbiting high above Earth. The data collected by UAVs can be used to verify and calibrate measurements and calculated values from satellites, therefore making satellite data more useful and valuable to researchers.

  4. Projected role of advanced computational aerodynamic methods at the Lockheed-Georgia company

    NASA Technical Reports Server (NTRS)

    Lores, M. E.

    1978-01-01

    Experience with advanced computational methods being used at the Lockheed-Georgia Company to aid in the evaluation and design of new and modified aircraft indicates that large and specialized computers will be needed to make advanced three-dimensional viscous aerodynamic computations practical. The Numerical Aerodynamic Simulation Facility should be used to provide a tool for designing better aerospace vehicles while at the same time reducing development costs by performing computations using Navier-Stokes equations solution algorithms and permitting less sophisticated but nevertheless complex calculations to be made efficiently. Configuration definition procedures and data output formats can probably best be defined in cooperation with industry, therefore, the computer should handle many remote terminals efficiently. The capability of transferring data to and from other computers needs to be provided. Because of the significant amount of input and output associated with 3-D viscous flow calculations and because of the exceedingly fast computation speed envisioned for the computer, special attention should be paid to providing rapid, diversified, and efficient input and output.

  5. Lost in Second Life: Virtual Embodiment and Language Learning via Multimodal Communication

    ERIC Educational Resources Information Center

    Pasfield-Neofitou, Sarah; Huang, Hui; Grant, Scott

    2015-01-01

    Increased recognition of the role of the body and environment in cognition has taken place in recent decades in the form of new theories of embodied and extended cognition. The growing use of ever more sophisticated computer-generated 3D virtual worlds and avatars has added a new dimension to these theories of cognition. Both developments provide…

  6. Impact of Media Richness and Flow on E-Learning Technology Acceptance

    ERIC Educational Resources Information Center

    Liu, Su-Houn; Liao, Hsiu-Li; Pratt, Jean A.

    2009-01-01

    Advances in e-learning technologies parallels a general increase in sophistication by computer users. The use of just one theory or model, such as the technology acceptance model, is no longer sufficient to study the intended use of e-learning systems. Rather, a combination of theories must be integrated in order to fully capture the complexity of…

  7. Performance Analysis of the Mobile IP Protocol (RFC 3344 and Related RFCS)

    DTIC Science & Technology

    2006-12-01

    [Excerpt from the report's list of abbreviations: HMAC, Keyed-Hash Message Authentication Code; ICMP, Internet Control Message Protocol; IEEE, Institute of Electrical and Electronics Engineers; IETF, Internet Engineering Task Force; IOS, Internetwork Operating System; IP, Internet Protocol; ITU, International Telecommunication Union; LAN, Local Area Network.] ... network computing. Most organizations today have sophisticated networks that are connected to the Internet. The major benefit reaped from such a

  8. Box compression analysis of world-wide data spanning 46 years

    Treesearch

    Thomas J. Urbanik; Benjamin Frank

    2006-01-01

    The state of the art among most industry citations of box compression estimation is the equation by McKee developed in 1963. Because of limitations in computing tools at the time the McKee equation was developed, the equation is a simplification, with many constraints, of a more general relationship. By applying the results of sophisticated finite element modeling, in...
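
    For context, the simplified McKee relationship usually cited estimates box compression strength from the edge crush test value, combined-board caliper, and box perimeter. The sketch below applies the commonly quoted form (constant of roughly 5.87 in US customary units); the constant and units should be checked against the original 1963 paper before relying on the numbers.

        # Simplified McKee estimate: BCT ~= 5.87 * ECT * sqrt(caliper * perimeter).
        import math

        def mckee_bct(ect_lb_per_in: float, caliper_in: float, perimeter_in: float) -> float:
            """Estimated box compression strength (lb) in US customary units."""
            return 5.87 * ect_lb_per_in * math.sqrt(caliper_in * perimeter_in)

        # Example: 44 lb/in ECT board, 0.16 in caliper, regular slotted container 16 x 12 x 12 in
        perimeter = 2 * (16 + 12)
        print(round(mckee_bct(44, 0.16, perimeter)), "lb estimated compression strength")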

  9. Pattern Discovery in Biomolecular Data – Tools, Techniques, and Applications | Center for Cancer Research

    Cancer.gov

    Finding patterns in biomolecular data, particularly in DNA and RNA, is at the center of modern biological research. These data are complex and growing rapidly, so the search for patterns requires increasingly sophisticated computer methods. This book provides a summary of principal techniques. Each chapter describes techniques that are drawn from many fields, including graph

  10. Interesting, Cool and Tantalising? Or Inappropriate, Complicated and Tedious? Pupil and Teacher Views on ICT in Science Teaching

    ERIC Educational Resources Information Center

    Willshire, Michael

    2013-01-01

    In a relatively short space of time, classrooms have become full of computers, gadgets and electronic devices. Technology will only continue to become more sophisticated, more efficient and more abundant in schools. But how desirable is this technological revolution and to what extent should it develop? To measure the effectiveness and popularity…

  11. The Energy Conservation Idea Handbook. A Compendium of Imaginative and Innovative Examples of Ideas and Practices at Colleges and Universities Today.

    ERIC Educational Resources Information Center

    Tickton, Sidney G.; And Others

    Summarized in this compendium are approximately 500 ideas being used by colleges and universities in the United States to deal with the problem of energy conservation. These ideas range from suggestions that cost pennies to implement to sophisticated computer controls or the construction of new buildings which incorporate alternative energy…

  12. A novel paradigm for telemedicine using the personal bio-monitor.

    PubMed

    Bhatikar, Sanjay R; Mahajan, Roop L; DeGroff, Curt

    2002-01-01

    The foray of solid-state technology in the medical field has yielded an arsenal of sophisticated healthcare tools. Personal, portable computing power coupled with the information superhighway open up the possibility of sophisticated healthcare management that will impact the medical field just as much. The full synergistic potential of three interwoven technologies: (1) compact electronics, (2) World Wide Web, and (3) Artificial Intelligence is yet to be realized. The system presented in this paper integrates these technologies synergistically, providing a new paradigm for healthcare. Our idea is to deploy internet-enabled, intelligent, handheld personal computers for medical diagnosis. The salient features of the 'Personal Bio-Monitor' we envisage are: (1) Utilization of the peripheral signals of the body which may be acquired non-invasively and with ease, for diagnosis of medical conditions; (2) An Artificial Neural Network (ANN) based approach for diagnosis; (3) Configuration of the diagnostic device as a handheld for personal use; (4) Internet connectivity, following the emerging bluetooth protocol, for prompt conveyance of information to a patient's health care provider via the World Wide Web. The proposal is substantiated with an intelligent handheld device developed by the investigators for pediatric cardiac auscultation. This device performed accurate diagnoses of cardiac abnormalities in pediatrics using an artificial neural network to process heart sounds acquired by a low-frequency microphone and transmitted its diagnosis to a desktop PC via infrared. The idea of the personal biomonitor presented here has the potential to streamline healthcare by optimizing two valuable resources: physicians' time and sophisticated equipment time. We show that the elements of such a system are in place, with our prototype. Our novel contribution is the synergistic integration of compact electronics' technology, artificial neural network methodology and the wireless web resulting in a revolutionary new paradigm for healthcare management.

  13. DOE's Computer Incident Advisory Capability (CIAC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schultz, E.

    1990-09-01

    Computer security is essential in maintaining quality in the computing environment. Computer security incidents, however, are becoming more sophisticated. The DOE Computer Incident Advisory Capability (CIAC) team was formed primarily to assist DOE sites in responding to computer security incidents. Among CIAC's other responsibilities are gathering and distributing information to DOE sites, providing training workshops, coordinating with other agencies, response teams, and vendors, creating guidelines for incident handling, and developing software tools. CIAC has already provided considerable assistance to DOE sites faced with virus infections and worm and hacker attacks, has issued over 40 information bulletins, and has developed and presented a workshop on incident handling. CIAC's experience in helping sites has produced several lessons learned, including the need to follow effective procedures to avoid virus infections in small systems and the need for sound password management and system administration in networked systems. CIAC's activity and scope will expand in the future. 4 refs.

  14. Editorial Comments, 1974-1986: The Case For and Against the Use of Computer-Assisted Decision Making

    PubMed Central

    Weaver, Robert R.

    1987-01-01

    Journal editorials are an important medium for communicating information about medical innovations. Evaluative statements contained in editorials pertain to the innovation's technical merits, as well as its probable economic, social and political, and ethical consequences. This information will either promote or impede the subsequent diffusion of innovations. This paper analyzes the evaluative information contained in thirty editorials that pertain to the topic of computer-assisted decision making (CDM). Most editorials agree that CDM technology is effective and economical in performing routine clinical tasks; controversy surrounds the use of more sophisticated CDM systems for complex problem solving. A few editorials argue that the innovation should play an integral role in transforming the established health care system. Most, however, maintain that it can or should be accommodated within the existing health care framework. Finally, while few editorials discuss the ethical ramifications of CDM technology, those that do suggest that it will contribute to more humane health care. The editorial analysis suggests that CDM technology aimed at routine clinical tasks will experience rapid diffusion. In contrast, the diffusion of more sophisticated CDM systems will, in the foreseeable future, likely be sporadic at best.

  15. Computational Challenges of Viscous Incompressible Flows

    NASA Technical Reports Server (NTRS)

    Kwak, Dochan; Kiris, Cetin; Kim, Chang Sung

    2004-01-01

    Over the past thirty years, numerical methods and simulation tools for incompressible flows have been advanced as a subset of the computational fluid dynamics (CFD) discipline. Although incompressible flows are encountered in many areas of engineering, simulation of compressible flow has been the major driver for developing computational algorithms and tools. This is probably due to the rather stringent requirements for predicting aerodynamic performance characteristics of flight vehicles, while flow devices involving low-speed or incompressible flow could be reasonably well designed without resorting to accurate numerical simulations. As flow devices are required to be more sophisticated and highly efficient, CFD tools have become increasingly important in fluid engineering for incompressible and low-speed flow. This paper reviews some of the successes made possible by advances in computational technologies during the same period, and discusses some of the current challenges faced in computing incompressible flows.

  16. Hybrid soft computing systems for electromyographic signals analysis: a review.

    PubMed

    Xie, Hong-Bo; Guo, Tianruo; Bai, Siwei; Dokos, Socrates

    2014-02-03

    The electromyographic (EMG) signal is a bio-signal collected from human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for these purposes. The hybrid soft computing system (HSCS), the integration of these different techniques, aims to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural network, support vector machine, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combination of these techniques, and their other applications in EMG analysis.
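
    A toy version of the pipeline such reviews cover, time-domain EMG features feeding a trained classifier, is sketched below. The synthetic signals, feature choices, and the simple perceptron are illustrative stand-ins for the neural, fuzzy, SVM, and evolutionary components actually discussed.

        # Time-domain EMG features (mean absolute value, zero crossings) plus a perceptron.
        import numpy as np

        def features(window: np.ndarray) -> np.ndarray:
            mav = np.mean(np.abs(window))                      # mean absolute value
            zc = np.sum(np.diff(np.sign(window)) != 0)         # zero-crossing count
            return np.array([mav, zc / len(window), 1.0])      # bias term appended

        rng = np.random.default_rng(3)
        rest = [rng.normal(0, 0.1, 200) for _ in range(50)]        # low muscle activity
        contract = [rng.normal(0, 1.0, 200) for _ in range(50)]    # high muscle activity
        X = np.array([features(w) for w in rest + contract])
        y = np.array([0] * 50 + [1] * 50)

        w = np.zeros(3)
        for _ in range(20):                                    # perceptron learning rule
            for xi, yi in zip(X, y):
                pred = 1 if xi @ w > 0 else 0
                w += 0.1 * (yi - pred) * xi

        acc = np.mean([(1 if xi @ w > 0 else 0) == yi for xi, yi in zip(X, y)])
        print("training accuracy:", acc)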

  17. Hybrid soft computing systems for electromyographic signals analysis: a review

    PubMed Central

    2014-01-01

    The electromyographic (EMG) signal is a bio-signal collected from human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for these purposes. The hybrid soft computing system (HSCS), the integration of these different techniques, aims to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural network, support vector machine, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combination of these techniques, and their other applications in EMG analysis. PMID:24490979

  18. Facial Animations: Future Research Directions & Challenges

    NASA Astrophysics Data System (ADS)

    Alkawaz, Mohammed Hazim; Mohamad, Dzulkifli; Rehman, Amjad; Basori, Ahmad Hoirul

    2014-06-01

    Computer facial animation is now used in a wide range of fields, spanning the growth of computer games, films, and interactive multimedia. Authoring facial animation with complex and subtle expressions is challenging and fraught with problems. As a result, most facial animation is currently authored with general-purpose computer animation techniques, which often limit the quality and quantity of production. Even with growing computer power, better facial understanding, more sophisticated software, and emerging face-centric methods, these new approaches remain immature. This paper therefore surveys facial animation experts in order to define and categorize the recent state of the field, its observed bottlenecks, and its developing techniques. The paper further presents a real-time simulation model of human worry and howling, with a detailed discussion of the perception of astonishment, sorrow, annoyance, and panic.

  19. Autonomous mobile robot research using the HERMIES-III robot

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pin, F.G.; Beckerman, M.; Spelt, P.F.

    1989-01-01

    This paper reports on the status and future directions in the research, development and experimental validation of intelligent control techniques for autonomous mobile robots using the HERMIES-III robot at the Center for Engineering Systems Advanced Research (CESAR) at Oak Ridge National Laboratory (ORNL). HERMIES-III is the fourth robot in a series of increasingly more sophisticated and capable experimental test beds developed at CESAR. HERMIES-III comprises a battery-powered, omni-directional wheeled platform with a seven degree-of-freedom manipulator arm, video cameras, sonar range sensors, a laser imaging scanner and a dual computer system containing up to 128 NCUBE nodes in hypercube configuration. All electronics, sensors, computers, and communication equipment required for autonomous operation of HERMIES-III are located on board along with sufficient battery power for three to four hours of operation. The paper first provides a more detailed description of the HERMIES-III characteristics, focusing on the new areas of research and demonstration now possible at CESAR with this new test-bed. The initial experimental program is then described with emphasis placed on autonomous performance of human-scale tasks (e.g., valve manipulation, use of tools), integration of a dexterous manipulator and platform motion in geometrically complex environments, and effective use of multiple cooperating robots (HERMIES-IIB and HERMIES-III). The paper concludes with a discussion of the integration problems and safety considerations necessarily arising from the set-up of an experimental program involving human-scale task performance by multiple autonomous mobile robots. 10 refs., 3 figs.

  20. GIS data models for coal geology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McColloch, G.H. Jr.; Timberlake, K.J.; Oldham, A.V.

    A variety of spatial data models can be applied to different aspects of coal geology. The simple vector data models found in various Computer Aided Drafting (CAD) programs are sometimes used for routine mapping and some simple analyses. However, more sophisticated applications that maintain the topological relationships between cartographic elements enhance analytical potential. Also, vector data models are best for producing various types of high quality, conventional maps. The raster data model is generally considered best for representing data that varies continuously over a geographic area, such as the thickness of a coal bed. Information is lost when contour lines are threaded through raster grids for display, so volumes and tonnages are more accurately determined by working directly with raster data. Raster models are especially well suited to computationally simple surface-to-surface analysis, or overlay functions. Another data model, the triangulated irregular network (TIN), is superior at portraying visible surfaces because many TIN programs support break lines. Break lines locate sharp breaks in slope such as those generated by bodies of water or ridge crests. TINs also "honor" data points so that a surface generated from a set of points will be forced to pass through those points. TINs, or grids generated from TINs, are particularly good at determining the intersections of surfaces such as coal seam outcrops and geologic unit boundaries. No single technique works best for all coal-related applications. The ability to use a variety of data models, and transform from one model to another, is essential for obtaining optimum results in a timely manner.
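
    The point about raster data can be made concrete: a seam volume or tonnage is a direct sum over grid cells, with no contour threading involved. The thickness grid, cell size, coal density, and no-data value in the sketch below are all hypothetical.

        # Tonnage straight from a thickness raster: sum cell volumes, apply a density.
        import numpy as np

        def coal_tonnage(thickness_m: np.ndarray, cell_size_m: float,
                         density_t_per_m3: float = 1.35, nodata: float = -9999.0) -> float:
            valid = thickness_m != nodata
            volume_m3 = thickness_m[valid].sum() * cell_size_m ** 2
            return volume_m3 * density_t_per_m3

        grid = np.full((100, 100), 1.8)          # synthetic 1.8 m thick seam
        grid[:10, :] = -9999.0                   # a strip with no data
        print(f"{coal_tonnage(grid, cell_size_m=30.0):,.0f} tonnes (illustrative)")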

  1. The Breeding Bird Survey, 1967 and 1968

    USGS Publications Warehouse

    Robbins, C.S.; Van Velzen, W.T.

    1969-01-01

    In the Breeding Bird Survey of North America, cooperators ran 982 survey routes in 1967 and 1,174 in 1968. All States except Hawaii and all Canadian Provinces except Newfoundland were included. Roadside routes are selected at random within 1-degree blocks of latitude and longitude. Each 24 1/2-mile route, with 3-minute stops spaced half a mile apart, is driven by automobile. All birds heard or seen at the stops are recorded on special forms, and the data are transferred to magnetic tape for analysis. The average number of birds of each species per route is tabulated by State and Province, presenting for the first time a record of the comparative abundance of each species across the continent. The sample size is given for each species recorded. A sophisticated analysis program, here employed for the first time, is used to compute weighted mean values of the survey results for selected species at the State, stratum, regional, and continental level. The statistical significance of year-to-year changes at the 80, 90, 95, and 99 percent levels of probability is part of the computer output. An index for comparing populations of each species from year to year is established, with 1968 as the base year. Maps show the breeding range and comparative abundance of selected species.
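
    The year-to-year index can be illustrated with a weighted-mean calculation expressed relative to the 1968 base year. The stratum weights, counts, and the simple area-weighting scheme below are hypothetical stand-ins for the survey's actual analysis program.

        # Species abundance index: weighted mean birds-per-route, scaled so 1968 = 100.
        import numpy as np

        def yearly_index(counts_by_year, stratum_weights, base_year=1968):
            """counts_by_year: year -> {stratum: mean birds per route in that stratum}."""
            def weighted_mean(per_stratum):
                w = np.array([stratum_weights[s] for s in per_stratum])
                x = np.array(list(per_stratum.values()))
                return (w * x).sum() / w.sum()
            base = weighted_mean(counts_by_year[base_year])
            return {yr: 100.0 * weighted_mean(v) / base for yr, v in counts_by_year.items()}

        weights = {"east": 2.0, "central": 3.0}
        counts = {1967: {"east": 11.0, "central": 8.0}, 1968: {"east": 12.5, "central": 9.0}}
        print(yearly_index(counts, weights))     # 1968 maps to 100.0 by construction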

  2. Low Cost Desktop Image Analysis Workstation With Enhanced Interactive User Interface

    NASA Astrophysics Data System (ADS)

    Ratib, Osman M.; Huang, H. K.

    1989-05-01

    A multimodality picture archiving and communication system (PACS) is in routine clinical use in the UCLA Radiology Department. Several types of workstations are currently implemented for this PACS. Among them, the Apple Macintosh II personal computer was recently chosen to serve as a desktop workstation for display and analysis of radiological images. This personal computer was selected mainly because of its extremely friendly user-interface, its popularity among the academic and medical community and its low cost. In comparison to other microcomputer-based systems the Macintosh II offers the following advantages: the extreme standardization of its user interface, file system and networking, and the availability of a very large variety of commercial software packages. In the current configuration the Macintosh II operates as a stand-alone workstation where images are imported from a centralized PACS server through an Ethernet network using a standard TCP-IP protocol, and stored locally on magnetic disk. The use of high-resolution screens (1024 x 768 pixels x 8 bits) offers sufficient performance for image display and analysis. We focused our project on the design and implementation of a variety of image analysis algorithms ranging from automated structure and edge detection to sophisticated dynamic analysis of sequential images. Specific analysis programs were developed for ultrasound images, digitized angiograms, MRI and CT tomographic images and scintigraphic images.

  3. LANDSAT-D investigations in snow hydrology

    NASA Technical Reports Server (NTRS)

    Dozier, J.

    1983-01-01

    The atmospheric radiative transfer calculation program (ATRAD) and its supporting programs (setting up the atmospheric profile, making Mie tables and an exponential-sum-fitting table) were completed. More sophisticated treatment of aerosol scattering (including angular phase function or asymmetry factor) and multichannel analysis of results from ATRAD are being developed. Some progress was made on a Monte Carlo program for examining two-dimensional effects, specifically a surface boundary condition that varies across a scene. The MONTE program combines ATRAD and the Monte Carlo method to produce an atmospheric point spread function. Currently the procedure passes monochromatic tests and the results are reasonable.

  4. Advanced imaging programs: maximizing a multislice CT investment.

    PubMed

    Falk, Robert

    2008-01-01

    Advanced image processing has moved from a luxury to a necessity in the practice of medicine. A hospital's adoption of sophisticated 3D imaging entails several important steps with many factors to consider in order to be successful. Like any new hospital program, 3D post-processing should be introduced through a strategic planning process that includes administrators, physicians, and technologists to design, implement, and market a program that is scalable, one that minimizes up-front costs while providing top-level service. This article outlines the steps for planning, implementation, and growth of an advanced imaging program.

  5. Identifying Productive Resources in Secondary School Students' Discourse about Energy

    ERIC Educational Resources Information Center

    Harrer, Benedikt

    2013-01-01

    A growing program of research in science education acknowledges the beginnings of disciplinary reasoning in students' ideas and seeks to inform instruction that responds productively to these disciplinary progenitors in the moment to foster their development into sophisticated scientific practice. This dissertation examines secondary school…

  6. Knowledge Is Power. Research Can Help Your Marketing Program Succeed.

    ERIC Educational Resources Information Center

    Smith, Robert M.

    1982-01-01

    Three major types of market research can be helpful in college marketing: exploratory (internal and external to the college); developmental, to test marketing strategies and messages; and evaluative, to complete the market planning cycle. Increasingly sophisticated and accountable marketing techniques can be developed. (MSE)

  7. Big Explosives Experimental Facility - BEEF

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    The Big Explosives Experimental Facility or BEEF is a ten acre fenced high explosive testing facility that provides data to support stockpile stewardship and other national security programs. At BEEF conventional high explosives experiments are safely conducted providing sophisticated diagnostics such as high speed optics and x-ray radiography.

  8. Mapping with Young Children.

    ERIC Educational Resources Information Center

    Sunal, Cynthia Szymanski; Warash, Bobbi Gibson

    Techniques for encouraging young children to discover the purpose and use of maps are discussed. Motor activity and topological studies form a base from which the teacher and children can build a mapping program of progressive sophistication. Concepts important to mapping include boundaries, regions, exteriors, interiors, holes, order, point of…

  9. EDUCATIONAL SPECIFICATIONS FOR SECONDARY SCHOOLS.

    ERIC Educational Resources Information Center

    FLANIGAN, VIRGINIA; AND OTHERS

    THE REPORT CAN BE USED AS A GUIDE IN THE PREPARATION OF EDUCATIONAL SPECIFICATIONS FOR SECONDARY SCHOOLS. NEW CURRICULA, METHODS OF INSTRUCTION, AND TEACHING AIDS ADD TO THE SOPHISTICATION OF EDUCATION. PROGRAMS ENCOMPASS MANY AREAS OF EDUCATION, EACH REQUIRING PROFESSIONAL DECISIONS. THESE DECISIONS MUST BE ORGANIZED INTO WRITTEN SPECIFICATIONS…

  10. Big Explosives Experimental Facility - BEEF

    ScienceCinema

    None

    2018-01-16

    The Big Explosives Experimental Facility or BEEF is a ten acre fenced high explosive testing facility that provides data to support stockpile stewardship and other national security programs. At BEEF conventional high explosives experiments are safely conducted providing sophisticated diagnostics such as high speed optics and x-ray radiography.

  11. Expert Perspectives: Future of Teacher Preparation in the Digital Age

    ERIC Educational Resources Information Center

    Alliance for Excellent Education, 2013

    2013-01-01

    As schools, classrooms, and districts move toward more sophisticated instructional technologies to successfully implement higher college- and career-ready standards, educator-preparation programs must act quickly to equip future educators with the necessary skills to use technology effectively to personalize instruction and increase student…

  12. Interactive Distance Learning in Connecticut.

    ERIC Educational Resources Information Center

    Pietras, Jesse John; Murphy, Robert J.

    This paper provides an overview of distance learning activities in Connecticut and addresses the feasibility of such activities. Distance education programs have evolved from the one dimensional electronic mail systems to the use of sophisticated digital fiber networks. The Middlesex Distance Learning Consortium has developed a long-range plan to…

  13. Integrating Environmental Education

    ERIC Educational Resources Information Center

    Paterson, Jim

    2009-01-01

    Thinking green is normal for the current generation of high school students, who have always had recycling bins in their classrooms and green themes in their assemblies, their lessons, and their television shows. It follows that sophisticated, multidisciplinary programs are now operating in schools throughout the country to educate students about…

  14. Improved perceptual-motor performance measurement system

    NASA Technical Reports Server (NTRS)

    Parker, J. F., Jr.; Reilly, R. E.

    1969-01-01

    Battery of tests determines the primary dimensions of perceptual-motor performance. Eighteen basic measures range from simple tests to sophisticated electronic devices. Improved system has one unit for the subject containing test display and response elements, and one for the experimenter where test setups, programming, and scoring are accomplished.

  15. Conservative zonal schemes for patched grids in 2 and 3 dimensions

    NASA Technical Reports Server (NTRS)

    Hessenius, Kristin A.

    1987-01-01

    The computation of flow over complex geometries, such as realistic aircraft configurations, poses difficult grid generation problems for computational aerodynamicists. The creation of a traditional, single-module grid of acceptable quality about an entire configuration may be impossible even with the most sophisticated of grid generation techniques. A zonal approach, wherein the flow field is partitioned into several regions within which grids are independently generated, is a practical alternative for treating complicated geometries. This technique not only alleviates the problems of discretizing a complex region, but also facilitates a block processing approach to computation thereby circumventing computer memory limitations. The use of such a zonal scheme, however, requires the development of an interfacing procedure that ensures a stable, accurate, and conservative calculation for the transfer of information across the zonal borders.

  16. Functional requirements of computer systems for the U.S. Geological Survey, Water Resources Division, 1988-97

    USGS Publications Warehouse

    Hathaway, R.M.; McNellis, J.M.

    1989-01-01

    Investigating the occurrence, quantity, quality, distribution, and movement of the Nation's water resources is the principal mission of the U.S. Geological Survey's Water Resources Division. Reports of these investigations are published and available to the public. To accomplish this mission, the Division requires substantial computer technology to process, store, and analyze data from more than 57,000 hydrologic sites. The Division's computer resources are organized through the Distributed Information System Program Office that manages the nationwide network of computers. The contract that provides the major computer components for the Water Resources Division's Distributed Information System expires in 1991. Five work groups were organized to collect the information needed to procure a new generation of computer systems for the U.S. Geological Survey, Water Resources Division. Each group was assigned a major Division activity and asked to describe its functional requirements of computer systems for the next decade. The work groups and major activities are: (1) hydrologic information; (2) hydrologic applications; (3) geographic information systems; (4) reports and electronic publishing; and (5) administrative. The work groups identified 42 functions and described their functional requirements for 1988, 1992, and 1997. A few new functions, such as Decision Support Systems and Executive Information Systems, were identified, but most are the same as performed today. Although the number of functions will remain about the same, steady growth in the size, complexity, and frequency of many functions is predicted for the next decade. No compensating increase in the Division's staff is anticipated during this period. To handle the increased workload and perform these functions, new approaches will be developed that use advanced computer technology. The advanced technology is required in a unified, tightly coupled system that will support all functions simultaneously. The new approaches and expanded use of computers will require substantial increases in the quantity and sophistication of the Division's computer resources. The requirements presented in this report will be used to develop technical specifications that describe the computer resources needed during the 1990's. (USGS)

  17. Cultural Factors in Managing an FMS Case Program: Saudi Arabian Army Ordnance Corps (SOCP) Program

    DTIC Science & Technology

    1977-11-01

    ... which included the purchase of large amounts of US-produced current-generation self-propelled artillery, personnel carriers, tanks, mortar carriers ... expressed when attempting to discuss complex, sophisticated technical material with senior counterparts who possessed relative fluency in ... ignored with impunity; they cannot be avoided; they can to a great extent be anticipated as critical management factors. By anticipating and preparing ...

  18. Free energies of binding from large-scale first-principles quantum mechanical calculations: application to ligand hydration energies.

    PubMed

    Fox, Stephen J; Pittock, Chris; Tautermann, Christofer S; Fox, Thomas; Christ, Clara; Malcolm, N O J; Essex, Jonathan W; Skylaris, Chris-Kriton

    2013-08-15

    Schemes of increasing sophistication for obtaining free energies of binding have been developed over the years, where configurational sampling is used to include the all-important entropic contributions to the free energies. However, the quality of the results will also depend on the accuracy with which the intermolecular interactions are computed at each molecular configuration. In this context, the energy change associated with the rearrangement of electrons (electronic polarization and charge transfer) upon binding is a very important effect. Classical molecular mechanics force fields do not take this effect into account explicitly, and polarizable force fields and semiempirical quantum or hybrid quantum-classical (QM/MM) calculations are increasingly employed (at higher computational cost) to compute intermolecular interactions in free-energy schemes. In this work, we investigate the use of large-scale quantum mechanical calculations from first-principles as a way of fully taking into account electronic effects in free-energy calculations. We employ a one-step free-energy perturbation (FEP) scheme from a molecular mechanical (MM) potential to a quantum mechanical (QM) potential as a correction to thermodynamic integration calculations within the MM potential. We use this approach to calculate relative free energies of hydration of small aromatic molecules. Our quantum calculations are performed on multiple configurations from classical molecular dynamics simulations. The quantum energy of each configuration is obtained from density functional theory calculations with a near-complete psinc basis set on over 600 atoms using the ONETEP program.
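
    The MM-to-QM correction described here is the one-step Zwanzig free energy perturbation relation, dA = -kT ln < exp(-(U_QM - U_MM)/kT) >_MM. The sketch below evaluates it with a numerically stable exponential average; the synthetic energies stand in for force-field and ONETEP evaluations of MM-sampled configurations, and the kcal/mol units are an assumption.

        # One-step FEP correction from an MM to a QM potential (Zwanzig relation).
        import numpy as np

        K_B = 0.0019872041   # Boltzmann constant, kcal/(mol K)

        def fep_mm_to_qm(u_mm: np.ndarray, u_qm: np.ndarray, temperature: float = 298.15) -> float:
            beta = 1.0 / (K_B * temperature)
            du = u_qm - u_mm
            shift = du.min()                     # log-sum-exp shift for stability
            return shift - np.log(np.mean(np.exp(-beta * (du - shift)))) / beta

        rng = np.random.default_rng(4)
        u_mm = rng.normal(-100.0, 2.0, 500)                # MM energies of sampled frames
        u_qm = u_mm + rng.normal(1.5, 0.5, 500)            # hypothetical QM re-evaluations
        print(round(fep_mm_to_qm(u_mm, u_qm), 2), "kcal/mol MM->QM correction")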

  19. Tools for Analyzing Computing Resource Management Strategies and Algorithms for SDR Clouds

    NASA Astrophysics Data System (ADS)

    Marojevic, Vuk; Gomez-Miguelez, Ismael; Gelonch, Antoni

    2012-09-01

    Software defined radio (SDR) clouds centralize the computing resources of base stations. The computing resource pool is shared between radio operators and dynamically loads and unloads digital signal processing chains for providing wireless communications services on demand. In particular, each new user session request requires the allocation of computing resources for executing the corresponding SDR transceivers. The huge amount of computing resources of SDR cloud data centers and the numerous session requests at certain hours of a day require efficient computing resource management. We propose a hierarchical approach, where the data center is divided into clusters that are managed in a distributed way. This paper presents a set of computing resource management tools for analyzing computing resource management strategies and algorithms for SDR clouds. We use the tools to evaluate different strategies and algorithms. The results show that more sophisticated algorithms can achieve higher resource occupations and that a tradeoff exists between cluster size and algorithm complexity.
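
    A minimal example of the allocation decision such tools analyze is a first-fit placement of each session's processing chain onto the first cluster with spare capacity, as sketched below; the capacity units and session loads are illustrative, not the paper's resource model.

        # First-fit allocation of SDR session loads to cluster capacities.
        def first_fit(cluster_capacity, session_loads):
            """Returns (placements, remaining capacity); -1 marks a rejected session."""
            free = list(cluster_capacity)
            placement = []
            for load in session_loads:
                for i, spare in enumerate(free):
                    if spare >= load:
                        free[i] -= load
                        placement.append(i)
                        break
                else:
                    placement.append(-1)         # no cluster can host this session
            return placement, free

        placement, free = first_fit([100, 100, 60], [40, 70, 30, 50, 45])
        print("placements:", placement, "remaining capacity:", free)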

  20. Password Cracking Using Sony Playstations

    NASA Astrophysics Data System (ADS)

    Kleinhans, Hugo; Butts, Jonathan; Shenoi, Sujeet

    Law enforcement agencies frequently encounter encrypted digital evidence for which the cryptographic keys are unknown or unavailable. Password cracking - whether it employs brute force or sophisticated cryptanalytic techniques - requires massive computational resources. This paper evaluates the benefits of using the Sony PlayStation 3 (PS3) to crack passwords. The PS3 offers massive computational power at relatively low cost. Moreover, multiple PS3 systems can be introduced easily to expand parallel processing when additional power is needed. This paper also describes a distributed framework designed to enable law enforcement agents to crack encrypted archives and applications in an efficient and cost-effective manner.
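
    The parallelization strategy amounts to partitioning the candidate key space into disjoint slices, one per processing node. The sketch below splits a brute-force search over short lowercase passwords across hypothetical workers; the SHA-256 target is purely illustrative, since real encrypted archives use other key-derivation schemes.

        # Partition a brute-force password search across workers by candidate index.
        import hashlib
        import itertools

        ALPHABET = "abcdefghijklmnopqrstuvwxyz"

        def candidates(length):
            return ("".join(t) for t in itertools.product(ALPHABET, repeat=length))

        def crack_slice(target_hash, length, worker_id, n_workers):
            for i, pw in enumerate(candidates(length)):
                if i % n_workers != worker_id:             # this worker's share only
                    continue
                if hashlib.sha256(pw.encode()).hexdigest() == target_hash:
                    return pw
            return None

        target = hashlib.sha256(b"dog").hexdigest()
        hits = [crack_slice(target, 3, w, 4) for w in range(4)]
        print(next(h for h in hits if h))                  # exactly one worker finds "dog"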

  1. Designer drugs: the evolving science of drug discovery.

    PubMed

    Wanke, L A; DuBose, R F

    1998-07-01

    Drug discovery and design are fundamental to drug development. Until recently, most drugs were discovered through random screening or developed through molecular modification. New technologies are revolutionizing this phase of drug development. Rational drug design, using powerful computers and computational chemistry and employing X-ray crystallography, nuclear magnetic resonance spectroscopy, and three-dimensional quantitative structure activity relationship analysis, is creating highly specific, biologically active molecules by virtual reality modeling. Sophisticated screening technologies are eliminating all but the most active lead compounds. These new technologies promise more efficacious, safe, and cost-effective medications, while minimizing drug development time and maximizing profits.

  2. Extension, validation and application of the NASCAP code

    NASA Technical Reports Server (NTRS)

    Katz, I.; Cassidy, J. J., III; Mandell, M. J.; Schnuelle, G. W.; Steen, P. G.; Parks, D. E.; Rotenberg, M.; Alexander, J. H.

    1979-01-01

    Numerous extensions were made in the NASCAP code. They fall into three categories: a greater range of definable objects, a more sophisticated computational model, and simplified code structure and usage. An important validation of NASCAP was performed using a new two-dimensional computer code (TWOD). An interactive code (MATCHG) was written to compare material parameter inputs with charging results. The first major application of NASCAP was performed on the SCATHA satellite. Shadowing and charging calculations were completed. NASCAP was installed at the Air Force Geophysics Laboratory, where researchers plan to use it to interpret SCATHA data.

  3. Robots Are Taking Over--Who Does What.

    ERIC Educational Resources Information Center

    Garrison, H. Don

    Robots are machines designed to replace human labor. A fear of vast unemployment due to robots seems unfounded, however, since industrialization creates many more jobs and automation requires technologists to build, program, maintain, and operate sophisticated equipment. Robots possess an intelligence unit, a manipulator, and an end effector.…

  4. Demonstrating Success: Web Analytics and Continuous Improvement

    ERIC Educational Resources Information Center

    Loftus, Wayne

    2012-01-01

    As free and low-cost Web analytics tools become more sophisticated, libraries' approach to user analysis can become more nuanced and precise. Tracking appropriate metrics with a well-formulated analytics program can inform design decisions, demonstrate the degree to which those decisions have succeeded, and thereby inform the next iteration in the…

  5. Microcomputers and Stimulus Control: From the Laboratory to the Classroom.

    ERIC Educational Resources Information Center

    LeBlanc, Judith M.; And Others

    1985-01-01

    The need for developing a technology of teaching that equals current sophistication of microcomputer technology is addressed. The importance of principles of learning and behavior analysis is emphasized. Potential roles of stimulus control and precise error analysis in educational program development and in prescription of specific learning…

  6. The Impending Revolution in School Business Management.

    ERIC Educational Resources Information Center

    James, H. Thomas

    The development of logically sophisticated analytical models in a growing number of fields has placed new emphasis on efficiency in school management. Recent systems models guiding the long-run analysis of school management in terms of efficiency--through cost-benefit studies, systems analysis, and program planning and budgeting systems--are in…

  7. Student Satisfaction and Graduate Part-Time Students

    ERIC Educational Resources Information Center

    Moore, Monica Moody

    2011-01-01

    The Advanced Academic Programs (AAP) of the Zanvyl Krieger School of Arts and Sciences at Johns Hopkins University (JHU) enrolls approximately 2,700 part-time graduate students across three physical locations. It is a complex organization whose target audience is a sophisticated consumer of higher education. With the support of Eduventures, AAP…

  8. An Application of Computerized Instructional Television in Biology.

    ERIC Educational Resources Information Center

    Kendrick, Bryce

    Computerized instructional television was used to teach undergraduate students about 100,000 or more extant fungi through an interactive, self testing, teaching program. Students did not find this sophisticated hardware an adequate substitute for the lecture experience and ultimately gave their professor a strong vote of confidence. (Author/JEG)

  9. Eavesdropping on coconut rhinoceros beetles, red palm weevils, Asian longhorned beetles, and other invasive travelers

    USDA-ARS?s Scientific Manuscript database

    As global trade increases, invasive insects inflict increasing economic damage to agriculture and urban landscapes in the United States yearly, despite a sophisticated array of interception methods and quarantine programs designed to exclude their entry. Insects that are hidden inside soil, wood, or...

  10. Improving Instructions Using a Data Analysis Collaborative Model

    ERIC Educational Resources Information Center

    Good, Rebecca B.; Jackson, Sherion H.

    2007-01-01

    As student data analysis reports become more sophisticated, these reports reveal greater details on low performance skills. Availability of models and programs depicting detailed instructions or guidance for utilizing data to impact classroom instruction, in an effort to increase student achievement, has been lacking. This study examines the…

  11. A History of Instructional Technology.

    ERIC Educational Resources Information Center

    Saettler, Paul

    Theoretical and methodological foundations of the modern audiovisual/radio/television/programed instruction complex have been provided by educational theorists from the Elder Sophists of the fifth century B.C., to the medieval scholars who taught in the monastic or cathedral schools, to the reformers of 1700-1900, to the psychologists of the 20th…

  12. The microcomputer scientific software series 2: general linear model--regression.

    Treesearch

    Harold M. Rauscher

    1983-01-01

    The general linear model regression (GLMR) program provides the microcomputer user with a sophisticated regression analysis capability. The output provides a regression ANOVA table, estimators of the regression model coefficients, their confidence intervals, confidence intervals around the predicted Y-values, residuals for plotting, a check for multicollinearity, a...
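
    For illustration, the quantities such a regression program reports can be reproduced with standard ordinary-least-squares formulas, as sketched below on synthetic data; the normal-quantile confidence intervals and the data are assumptions, not GLMR's actual output.

        # Ordinary least squares with coefficient confidence intervals and an F test.
        import numpy as np

        rng = np.random.default_rng(5)
        x = rng.uniform(0, 10, 40)
        y = 2.0 + 0.8 * x + rng.normal(0, 1.0, 40)

        X = np.column_stack([np.ones_like(x), x])          # design matrix with intercept
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        n, p = X.shape
        mse = resid @ resid / (n - p)                      # residual mean square
        se = np.sqrt(np.diag(mse * np.linalg.inv(X.T @ X)))

        ss_total = np.sum((y - y.mean()) ** 2)
        f_stat = ((ss_total - resid @ resid) / (p - 1)) / mse

        for name, b, s in zip(["intercept", "slope"], beta, se):
            print(f"{name}: {b:.3f}  95% CI [{b - 1.96*s:.3f}, {b + 1.96*s:.3f}]")
        print(f"F({p-1},{n-p}) = {f_stat:.1f}")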

  13. Program Assessment: Getting to a Practical How-To Model

    ERIC Educational Resources Information Center

    Gardiner, Lorraine R.; Corbitt, Gail; Adams, Steven J.

    2010-01-01

    The Association to Advance Collegiate Schools of Business (AACSB) International's assurance of learning (AoL) standards require that schools develop a sophisticated continuous-improvement process. The authors review various assessment models and develop a practical, 6-step AoL model based on the literature and the authors' AoL-implementation…

  14. Artificial Intelligence-Assisted Online Social Therapy for Youth Mental Health

    PubMed Central

    D'Alfonso, Simon; Santesteban-Echarri, Olga; Rice, Simon; Wadley, Greg; Lederman, Reeva; Miles, Christopher; Gleeson, John; Alvarez-Jimenez, Mario

    2017-01-01

    Introduction: Benefits from mental health early interventions may not be sustained over time, and longer-term intervention programs may be required to maintain early clinical gains. However, due to the high intensity of face-to-face early intervention treatments, this may not be feasible. Adjunctive internet-based interventions specifically designed for youth may provide a cost-effective and engaging alternative to prevent loss of intervention benefits. However, until now online interventions have relied on human moderators to deliver therapeutic content. More sophisticated models responsive to user data are critical to inform tailored online therapy. Thus, integration of user experience with a sophisticated and cutting-edge technology to deliver content is necessary to redefine online interventions in youth mental health. This paper discusses the development of the moderated online social therapy (MOST) web application, which provides an interactive social media-based platform for recovery in mental health. We provide an overview of the system's main features and discuss our current work regarding the incorporation of advanced computational and artificial intelligence methods to enhance user engagement and improve the discovery and delivery of therapy content. Methods: Our case study is the ongoing Horyzons site (5-year randomized controlled trial for youth recovering from early psychosis), which is powered by MOST. We outline the motivation underlying the project and the web application's foundational features and interface. We discuss system innovations, including the incorporation of pertinent usage patterns as well as identifying certain limitations of the system. This leads to our current motivations and focus on using computational and artificial intelligence methods to enhance user engagement, and to further improve the system with novel mechanisms for the delivery of therapy content to users. In particular, we cover our usage of natural language analysis and chatbot technologies as strategies to tailor interventions and scale up the system. Conclusions: To date, the innovative MOST system has demonstrated viability in a series of clinical research trials. Given the data-driven opportunities afforded by the software system, observed usage patterns, and the aim to deploy it on a greater scale, an important next step in its evolution is the incorporation of advanced and automated content delivery mechanisms. PMID:28626431

  15. Artificial Intelligence-Assisted Online Social Therapy for Youth Mental Health.

    PubMed

    D'Alfonso, Simon; Santesteban-Echarri, Olga; Rice, Simon; Wadley, Greg; Lederman, Reeva; Miles, Christopher; Gleeson, John; Alvarez-Jimenez, Mario

    2017-01-01

    Introduction: Benefits from mental health early interventions may not be sustained over time, and longer-term intervention programs may be required to maintain early clinical gains. However, due to the high intensity of face-to-face early intervention treatments, this may not be feasible. Adjunctive internet-based interventions specifically designed for youth may provide a cost-effective and engaging alternative to prevent loss of intervention benefits. However, until now online interventions have relied on human moderators to deliver therapeutic content. More sophisticated models responsive to user data are critical to inform tailored online therapy. Thus, integration of user experience with a sophisticated and cutting-edge technology to deliver content is necessary to redefine online interventions in youth mental health. This paper discusses the development of the moderated online social therapy (MOST) web application, which provides an interactive social media-based platform for recovery in mental health. We provide an overview of the system's main features and discuss our current work regarding the incorporation of advanced computational and artificial intelligence methods to enhance user engagement and improve the discovery and delivery of therapy content. Methods: Our case study is the ongoing Horyzons site (5-year randomized controlled trial for youth recovering from early psychosis), which is powered by MOST. We outline the motivation underlying the project and the web application's foundational features and interface. We discuss system innovations, including the incorporation of pertinent usage patterns as well as identifying certain limitations of the system. This leads to our current motivations and focus on using computational and artificial intelligence methods to enhance user engagement, and to further improve the system with novel mechanisms for the delivery of therapy content to users. In particular, we cover our usage of natural language analysis and chatbot technologies as strategies to tailor interventions and scale up the system. Conclusions: To date, the innovative MOST system has demonstrated viability in a series of clinical research trials. Given the data-driven opportunities afforded by the software system, observed usage patterns, and the aim to deploy it on a greater scale, an important next step in its evolution is the incorporation of advanced and automated content delivery mechanisms.

  16. Charity gets children well-connected.

    PubMed

    2008-06-01

    A growing number of young patients at Sheffield Children's Hospital will soon be able to keep up with schoolwork, access TV and other entertainment services, and telephone friends and family, all at no cost, following the installation of a sophisticated bedside patient entertainment/computing system supplied by Wandsworth Group. In what is believed to be a UK first, the equipment is being entirely funded by the hospital's charity. Health Estate Journal reports.

  17. Teaching Strategies for Using Projected Images to Develop Conceptual Understanding: Exploring Discussion Practices in Computer Simulation and Static Image-Based Lessons

    ERIC Educational Resources Information Center

    Price, Norman T.

    2013-01-01

    The availability and sophistication of visual display images, such as simulations, for use in science classrooms have increased exponentially; however, it can be difficult for teachers to use these images to encourage and engage active student thinking. There is a need to describe flexible discussion strategies that use visual media to engage active…

  18. The Impact of Online Teaching and Learning about Emotional Intelligence, Myers Briggs Personality Dimensions and Mindfulness on Personal and Social Awareness

    ERIC Educational Resources Information Center

    Cotler, Jami L.

    2016-01-01

    As computer-mediated communication continues to evolve and become more sophisticated and accessible, the applications for this technology continue to grow. One area that has garnered a considerable amount of attention is online teaching and learning. Research has shown increasing evidence that learning outcomes of face-to-face, in comparison to…

  19. The Galaxy platform for accessible, reproducible and collaborative biomedical analyses: 2016 update

    PubMed Central

    Afgan, Enis; Baker, Dannon; van den Beek, Marius; Blankenberg, Daniel; Bouvier, Dave; Čech, Martin; Chilton, John; Clements, Dave; Coraor, Nate; Eberhard, Carl; Grüning, Björn; Guerler, Aysam; Hillman-Jackson, Jennifer; Von Kuster, Greg; Rasche, Eric; Soranzo, Nicola; Turaga, Nitesh; Taylor, James; Nekrutenko, Anton; Goecks, Jeremy

    2016-01-01

    High-throughput data production technologies, particularly ‘next-generation’ DNA sequencing, have ushered in widespread and disruptive changes to biomedical research. Making sense of the large datasets produced by these technologies requires sophisticated statistical and computational methods, as well as substantial computational power. This has led to an acute crisis in life sciences, as researchers without informatics training attempt to perform computation-dependent analyses. Since 2005, the Galaxy project has worked to address this problem by providing a framework that makes advanced computational tools usable by non-experts. Galaxy seeks to make data-intensive research more accessible, transparent and reproducible by providing a Web-based environment in which users can perform computational analyses and have all of the details automatically tracked for later inspection, publication, or reuse. In this report we highlight recently added features enabling biomedical analyses on a large scale. PMID:27137889

  20. User-customized brain computer interfaces using Bayesian optimization

    NASA Astrophysics Data System (ADS)

    Bashashati, Hossein; Ward, Rabab K.; Bashashati, Ali

    2016-04-01

    Objective. The brain characteristics of different people are not the same. Brain computer interfaces (BCIs) should thus be customized for each individual person. In motor-imagery based synchronous BCIs, a number of parameters (referred to as hyper-parameters) including the EEG frequency bands, the channels and the time intervals from which the features are extracted should be pre-determined based on each subject’s brain characteristics. Approach. To determine the hyper-parameter values, previous work has relied on manual or semi-automatic methods that are not applicable to high-dimensional search spaces. In this paper, we propose a fully automatic, scalable and computationally inexpensive algorithm that uses Bayesian optimization to tune these hyper-parameters. We then build different classifiers trained on the sets of hyper-parameter values proposed by the Bayesian optimization. A final classifier aggregates the results of the different classifiers. Main Results. We have applied our method to 21 subjects from three BCI competition datasets. We have conducted rigorous statistical tests, and have shown the positive impact of hyper-parameter optimization in improving the accuracy of BCIs. Furthermore, we have compared our results to those reported in the literature. Significance. Unlike the best reported results in the literature, which are based on more sophisticated feature extraction and classification methods, and rely on prestudies to determine the hyper-parameter values, our method has the advantage of being fully automated, uses less sophisticated feature extraction and classification methods, and yields similar or superior results compared to the best performing designs in the literature.
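
    For readers unfamiliar with the approach, the Python sketch below illustrates generic Bayesian optimization with a Gaussian-process surrogate and expected improvement. It is a toy illustration under assumed names: the objective function is a hypothetical stand-in for cross-validated BCI accuracy over frequency-band and time-window hyper-parameters, and this is not the authors' algorithm or code.

      # Toy Bayesian-optimization loop (GP surrogate + expected improvement).
      import numpy as np
      from scipy.stats import norm
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import Matern

      def objective(x):
          # Hypothetical stand-in: in a real BCI pipeline this would train a classifier
          # on features from band [f_low, f_high] and window [t0, t1] and return
          # the negative cross-validation accuracy (to be minimized).
          f_low, f_high, t0, t1 = x
          return (f_low - 8) ** 2 + (f_high - 30) ** 2 / 100 + (t0 - 0.5) ** 2 + (t1 - 2.5) ** 2

      bounds = np.array([[4, 14], [20, 40], [0.0, 1.0], [1.5, 3.5]])
      rng = np.random.default_rng(0)

      # initial random evaluations of the expensive objective
      X = rng.uniform(bounds[:, 0], bounds[:, 1], size=(5, 4))
      y = np.array([objective(x) for x in X])

      gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
      for _ in range(20):
          gp.fit(X, y)
          # propose random candidates and pick the one maximizing expected improvement
          cand = rng.uniform(bounds[:, 0], bounds[:, 1], size=(500, 4))
          mu, sigma = gp.predict(cand, return_std=True)
          best = y.min()
          z = (best - mu) / np.maximum(sigma, 1e-9)
          ei = (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)
          x_next = cand[np.argmax(ei)]
          X = np.vstack([X, x_next])
          y = np.append(y, objective(x_next))

      print("best hyper-parameters:", X[np.argmin(y)], "objective:", y.min())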

  1. Buying in to bioinformatics: an introduction to commercial sequence analysis software

    PubMed Central

    2015-01-01

    Advancements in high-throughput nucleotide sequencing techniques have brought with them state-of-the-art bioinformatics programs and software packages. Given the importance of molecular sequence data in contemporary life science research, these software suites are becoming an essential component of many labs and classrooms, and as such are frequently designed for non-computer specialists and marketed as one-stop bioinformatics toolkits. Although beautifully designed and powerful, user-friendly bioinformatics packages can be expensive and, as more arrive on the market each year, it can be difficult for researchers, teachers and students to choose the right software for their needs, especially if they do not have a bioinformatics background. This review highlights some of the currently available and most popular commercial bioinformatics packages, discussing their prices, usability, features and suitability for teaching. Although several commercial bioinformatics programs are arguably overpriced and overhyped, many are well designed, sophisticated and, in my opinion, worth the investment. Whether you are just beginning your foray into molecular sequence analysis or are an experienced genomicist, I encourage you to explore proprietary software bundles. They have the potential to streamline your research, increase your productivity, energize your classroom and, if anything, add a bit of zest to the often dry, detached world of bioinformatics. PMID:25183247

  2. Buying in to bioinformatics: an introduction to commercial sequence analysis software.

    PubMed

    Smith, David Roy

    2015-07-01

    Advancements in high-throughput nucleotide sequencing techniques have brought with them state-of-the-art bioinformatics programs and software packages. Given the importance of molecular sequence data in contemporary life science research, these software suites are becoming an essential component of many labs and classrooms, and as such are frequently designed for non-computer specialists and marketed as one-stop bioinformatics toolkits. Although beautifully designed and powerful, user-friendly bioinformatics packages can be expensive and, as more arrive on the market each year, it can be difficult for researchers, teachers and students to choose the right software for their needs, especially if they do not have a bioinformatics background. This review highlights some of the currently available and most popular commercial bioinformatics packages, discussing their prices, usability, features and suitability for teaching. Although several commercial bioinformatics programs are arguably overpriced and overhyped, many are well designed, sophisticated and, in my opinion, worth the investment. Whether you are just beginning your foray into molecular sequence analysis or are an experienced genomicist, I encourage you to explore proprietary software bundles. They have the potential to streamline your research, increase your productivity, energize your classroom and, if anything, add a bit of zest to the often dry, detached world of bioinformatics. © The Author 2014. Published by Oxford University Press.

  3. Driving on the surface of Mars with the rover sequencing and visualization program

    NASA Technical Reports Server (NTRS)

    Wright, J.; Hartman, F.; Cooper, B.; Maxwell, S.; Yen, J.; Morrison, J.

    2005-01-01

    Operating a rover on Mars is not possible using teleoperation due to the distance involved and bandwidth limitations. Operating these rovers requires sophisticated tools that make operators aware of the terrain, hazards, features of interest, and the rover's state and limitations, and that support building command sequences and rehearsing expected operations. This paper discusses how the Rover Sequencing and Visualization program and a small set of associated tools support this requirement.

  4. InSAR Scientific Computing Environment

    NASA Astrophysics Data System (ADS)

    Gurrola, E. M.; Rosen, P. A.; Sacco, G.; Zebker, H. A.; Simons, M.; Sandwell, D. T.

    2010-12-01

    The InSAR Scientific Computing Environment (ISCE) is a software development effort in its second year within the NASA Advanced Information Systems and Technology program. The ISCE will provide a new computing environment for geodetic image processing for InSAR sensors that will enable scientists to reduce measurements directly from radar satellites and aircraft to new geophysical products without first requiring them to develop detailed expertise in radar processing methods. The environment can serve as the core of a centralized processing center to bring Level-0 raw radar data up to Level-3 data products, but is adaptable to alternative processing approaches for science users interested in new and different ways to exploit mission data. The NRC Decadal Survey-recommended DESDynI mission will deliver data of unprecedented quantity and quality, making possible global-scale studies in climate research, natural hazards, and Earth's ecosystem. The InSAR Scientific Computing Environment is planned to become a key element in processing DESDynI data into higher-level data products, and it is expected to enable a new class of analyses that take greater advantage of the long time spans and large spatial scales of these new data than current approaches do. At the core of ISCE are both legacy processing software from the JPL/Caltech ROI_PAC repeat-pass interferometry package and a new InSAR processing package, containing more efficient and more accurate processing algorithms, being developed at Stanford for this project based on experience gained in developing processors for missions such as SRTM and UAVSAR. Around the core InSAR processing programs we are building object-oriented wrappers to enable their incorporation into a more modern, flexible, extensible software package that is informed by modern programming methods, including rigorous componentization of processing codes, abstraction and generalization of data models, and a robust, intuitive user interface with graduated exposure to the levels of sophistication, allowing novices to apply it readily for common tasks and experienced users to mine data with great facility and flexibility. The environment is designed to easily allow user contributions, enabling an open source community to extend the framework into the indefinite future. In this paper we briefly describe both the legacy and the new core processing algorithms and their integration into the new computing environment. We describe the ISCE component and application architecture and the features that permit the desired flexibility, extensibility and ease-of-use. We summarize the state of progress of the environment and the plans for completion of the environment and for its future introduction into the radar processing community.

  5. Single-molecule dataset (SMD): a generalized storage format for raw and processed single-molecule data.

    PubMed

    Greenfeld, Max; van de Meent, Jan-Willem; Pavlichin, Dmitri S; Mabuchi, Hideo; Wiggins, Chris H; Gonzalez, Ruben L; Herschlag, Daniel

    2015-01-16

    Single-molecule techniques have emerged as incisive approaches for addressing a wide range of questions arising in contemporary biological research [Trends Biochem Sci 38:30-37, 2013; Nat Rev Genet 14:9-22, 2013; Curr Opin Struct Biol 2014, 28C:112-121; Annu Rev Biophys 43:19-39, 2014]. The analysis and interpretation of raw single-molecule data benefits greatly from the ongoing development of sophisticated statistical analysis tools that enable accurate inference at the low signal-to-noise ratios frequently associated with these measurements. While a number of groups have released analysis toolkits as open source software [J Phys Chem B 114:5386-5403, 2010; Biophys J 79:1915-1927, 2000; Biophys J 91:1941-1951, 2006; Biophys J 79:1928-1944, 2000; Biophys J 86:4015-4029, 2004; Biophys J 97:3196-3205, 2009; PLoS One 7:e30024, 2012; BMC Bioinformatics 288 11(8):S2, 2010; Biophys J 106:1327-1337, 2014; Proc Int Conf Mach Learn 28:361-369, 2013], it remains difficult to compare analysis for experiments performed in different labs due to a lack of standardization. Here we propose a standardized single-molecule dataset (SMD) file format. SMD is designed to accommodate a wide variety of computer programming languages, single-molecule techniques, and analysis strategies. To facilitate adoption of this format we have made two existing data analysis packages that are used for single-molecule analysis compatible with this format. Adoption of a common, standard data file format for sharing raw single-molecule data and analysis outcomes is a critical step for the emerging and powerful single-molecule field, which will benefit both sophisticated users and non-specialists by allowing standardized, transparent, and reproducible analysis practices.
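
    To make the idea of a language-agnostic single-molecule data format concrete, here is a minimal Python sketch that writes and reads a hypothetical SMD-style record as JSON. The field names (id, attr, data) follow the general spirit described in the record but are illustrative assumptions, not the published schema.

      # Illustrative only: a hypothetical SMD-style record serialized as JSON.
      # Field names are assumptions for this sketch, not the published specification.
      import json

      trace = {
          "id": "trace_0001",                    # unique identifier for one molecule
          "attr": {"instrument": "TIRF-1",       # free-form experiment metadata
                   "date": "2015-01-16"},
          "data": {                              # per-frame observations, equal-length columns
              "index": [0, 1, 2, 3],
              "fret": [0.12, 0.45, 0.47, 0.13],
              "donor": [820, 540, 530, 810],
              "acceptor": [115, 455, 470, 120],
          },
      }

      dataset = {"id": "experiment_A", "attr": {"buffer": "50 mM Tris"}, "data": [trace]}

      with open("example_smd.json", "w") as fh:
          json.dump(dataset, fh, indent=2)       # write a shareable, language-neutral file

      with open("example_smd.json") as fh:
          loaded = json.load(fh)
      print(len(loaded["data"]), "trace(s); columns:", list(loaded["data"][0]["data"]))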

  6. Diving deeper into Zebrafish development of social behavior: analyzing high resolution data.

    PubMed

    Buske, Christine; Gerlai, Robert

    2014-08-30

    Vertebrate model organisms have been utilized in high throughput screening but only with substantial cost and human capital investment. The zebrafish is a vertebrate model species that is a promising and cost effective candidate for efficient high throughput screening. Larval zebrafish have already been successfully employed in this regard (Lessman, 2011), but adult zebrafish also show great promise. High throughput screening requires the use of a large number of subjects and collection of substantial amount of data. Collection of data is only one of the demanding aspects of screening. However, in most screening approaches that involve behavioral data the main bottleneck that slows throughput is the time consuming aspect of analysis of the collected data. Some automated analytical tools do exist, but often they only work for one subject at a time, eliminating the possibility of fully utilizing zebrafish as a screening tool. This is a particularly important limitation for such complex phenotypes as social behavior. Testing multiple fish at a time can reveal complex social interactions but it may also allow the identification of outliers from a group of mutagenized or pharmacologically treated fish. Here, we describe a novel method using a custom software tool developed within our laboratory, which enables tracking multiple fish, in combination with a sophisticated analytical approach for summarizing and analyzing high resolution behavioral data. This paper focuses on the latter, the analytic tool, which we have developed using the R programming language and environment for statistical computing. We argue that combining sophisticated data collection methods with appropriate analytical tools will propel zebrafish into the future of neurobehavioral genetic research. Copyright © 2014. Published by Elsevier B.V.

  7. Integrative multicellular biological modeling: a case study of 3D epidermal development using GPU algorithms

    PubMed Central

    2010-01-01

    Background Simulation of sophisticated biological models requires considerable computational power. These models typically integrate together numerous biological phenomena such as spatially-explicit heterogeneous cells, cell-cell interactions, cell-environment interactions and intracellular gene networks. The recent advent of programming for graphical processing units (GPU) opens up the possibility of developing more integrative, detailed and predictive biological models while at the same time decreasing the computational cost to simulate those models. Results We construct a 3D model of epidermal development and provide a set of GPU algorithms that executes significantly faster than sequential central processing unit (CPU) code. We provide a parallel implementation of the subcellular element method for individual cells residing in a lattice-free spatial environment. Each cell in our epidermal model includes an internal gene network, which integrates cellular interaction of Notch signaling together with environmental interaction of basement membrane adhesion, to specify cellular state and behaviors such as growth and division. We take a pedagogical approach to describing how modeling methods are efficiently implemented on the GPU including memory layout of data structures and functional decomposition. We discuss various programmatic issues and provide a set of design guidelines for GPU programming that are instructive to avoid common pitfalls as well as to extract performance from the GPU architecture. Conclusions We demonstrate that GPU algorithms represent a significant technological advance for the simulation of complex biological models. We further demonstrate with our epidermal model that the integration of multiple complex modeling methods for heterogeneous multicellular biological processes is both feasible and computationally tractable using this new technology. We hope that the provided algorithms and source code will be a starting point for modelers to develop their own GPU implementations, and encourage others to implement their modeling methods on the GPU and to make that code available to the wider community. PMID:20696053

  8. Performance evaluation of GPU parallelization, space-time adaptive algorithms, and their combination for simulating cardiac electrophysiology.

    PubMed

    Sachetto Oliveira, Rafael; Martins Rocha, Bernardo; Burgarelli, Denise; Meira, Wagner; Constantinides, Christakis; Weber Dos Santos, Rodrigo

    2018-02-01

    The use of computer models as a tool for the study and understanding of the complex phenomena of cardiac electrophysiology has attained increased importance nowadays. At the same time, the increased complexity of the biophysical processes translates into complex computational and mathematical models. To speed up cardiac simulations and to allow more precise and realistic uses, 2 different techniques have been traditionally exploited: parallel computing and sophisticated numerical methods. In this work, we combine a modern parallel computing technique based on multicore and graphics processing units (GPUs) and a sophisticated numerical method based on a new space-time adaptive algorithm. We evaluate each technique alone and in different combinations: multicore and GPU, multicore and GPU and space adaptivity, multicore and GPU and space adaptivity and time adaptivity. All the techniques and combinations were evaluated under different scenarios: 3D simulations on slabs, 3D simulations on a ventricular mouse mesh, i.e., complex geometry, sinus-rhythm, and arrhythmic conditions. Our results suggest that multicore and GPU accelerate the simulations by an approximate factor of 33×, whereas the speedups attained by the space-time adaptive algorithms were approximately 48×. Nevertheless, by combining all the techniques, we obtained speedups that ranged between 165× and 498×. The tested methods were able to reduce the execution time of a simulation by more than 498× for a complex cellular model in a slab geometry and by 165× in a realistic heart geometry simulating spiral waves. The proposed methods will allow faster and more realistic simulations in a feasible time with no significant loss of accuracy. Copyright © 2017 John Wiley & Sons, Ltd.
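
    As a rough, CPU-only illustration of why adaptive time stepping helps for stiff excitable-cell dynamics (this is a toy sketch, not the authors' GPU or space-time adaptive code), the snippet below integrates the classic FitzHugh-Nagumo model with SciPy's adaptive solver and compares the number of accepted steps with a naive fixed-step Euler loop.

      # Minimal sketch: adaptive vs. fixed-step integration of a FitzHugh-Nagumo cell.
      import numpy as np
      from scipy.integrate import solve_ivp

      def fhn(t, y, a=0.7, b=0.8, eps=0.08, stim=0.5):
          v, w = y
          dv = v - v**3 / 3.0 - w + stim      # fast membrane-potential-like variable
          dw = eps * (v + a - b * w)          # slow recovery variable
          return [dv, dw]

      t_end = 200.0

      # Adaptive, error-controlled integration (step size chosen by the solver)
      sol = solve_ivp(fhn, (0.0, t_end), [-1.0, 1.0], method="RK45", rtol=1e-6, atol=1e-8)
      print("adaptive solver accepted", sol.t.size, "steps")

      # Naive fixed-step forward Euler for comparison
      dt = 0.01
      steps = int(t_end / dt)
      y = np.array([-1.0, 1.0])
      for _ in range(steps):
          y = y + dt * np.asarray(fhn(0.0, y))
      print("fixed-step Euler used", steps, "steps; final state", y)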

  9. Adaptive bill morphology for enhanced tool manipulation in New Caledonian crows

    PubMed Central

    Matsui, Hiroshi; Hunt, Gavin R.; Oberhofer, Katja; Ogihara, Naomichi; McGowan, Kevin J.; Mithraratne, Kumar; Yamasaki, Takeshi; Gray, Russell D.; Izawa, Ei-Ichi

    2016-01-01

    Early increased sophistication of human tools is thought to be underpinned by adaptive morphology for efficient tool manipulation. Such adaptive specialisation is unknown in nonhuman primates but may have evolved in the New Caledonian crow, which has sophisticated tool manufacture. The straightness of its bill, for example, may be adaptive for enhanced visually-directed use of tools. Here, we examine in detail the shape and internal structure of the New Caledonian crow’s bill using Principal Components Analysis and Computed Tomography within a comparative framework. We found that the bill has a combination of interrelated shape and structural features unique within Corvus, and possibly birds generally. The upper mandible is relatively deep and short with a straight cutting edge, and the lower mandible is strengthened and upturned. These novel combined attributes would be functional for (i) counteracting the unique loading patterns acting on the bill when manipulating tools, (ii) a strong precision grip to hold tools securely, and (iii) enhanced visually-guided tool use. Our findings indicate that the New Caledonian crow’s innovative bill has been adapted for tool manipulation to at least some degree. Early increased sophistication of tools may require the co-evolution of morphology that provides improved manipulatory skills. PMID:26955788

  10. Historical data recording for process computers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hale, J.C.; Sellars, H.L.

    1981-11-01

    Computers have been used to monitor and control chemical and refining processes for more than 15 years. During this time, there has been a steady growth in the variety and sophistication of the functions performed by these process computers. Early systems were limited to maintaining only current operating measurements, available through crude operator's consoles or noisy teletypes. The value of retaining a process history, that is, a collection of measurements over time, became apparent, and early efforts produced shift and daily summary reports. The need for improved process historians which record, retrieve and display process information has grown as process computers assume larger responsibilities in plant operations. This paper describes newly developed process historian functions that have been used on several in-house process monitoring and control systems in Du Pont factories. 3 refs.

  11. RAVEN: a GUI and an Artificial Intelligence Engine in a Dynamic PRA Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    C. Rabiti; D. Mandelli; A. Alfonsi

    Increases in computational power and pressure for more accurate simulations and estimations of accident scenario consequences are driving the need for Dynamic Probabilistic Risk Assessment (PRA) [1] of very complex models. While more sophisticated algorithms and computational power address the back end of this challenge, the front end is still handled by engineers who need to extract meaningful information from the large amount of data and build these complex models. Compounding this problem is the difficulty in knowledge transfer and retention, and the increasing speed of software development. The above-described issues would have negatively impacted deployment of the new high-fidelity plant simulator RELAP-7 (Reactor Excursion and Leak Analysis Program) at Idaho National Laboratory. Therefore, RAVEN, which was initially focused to be the plant controller for RELAP-7, will help mitigate future RELAP-7 software engineering risks. In order to accomplish this task, the Reactor Analysis and Virtual Control Environment (RAVEN) has been designed to provide an easy to use Graphical User Interface (GUI) for building plant models and to leverage artificial intelligence algorithms in order to reduce computational time, improve results, and help the user to identify the behavioral pattern of the Nuclear Power Plants (NPPs). In this paper we will present the GUI implementation and its current capability status. We will also introduce the support vector machine algorithms and show our evaluation of their potentiality in increasing the accuracy and reducing the computational costs of PRA analysis. In this evaluation we will refer to preliminary studies performed under the Risk Informed Safety Margins Characterization (RISMC) project of the Light Water Reactors Sustainability (LWRS) campaign [3]. RISMC simulation needs and algorithm testing are currently used as a guidance to prioritize RAVEN developments relevant to PRA.
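
    To illustrate the general idea of using support vector machines as fast surrogates for expensive simulations in a PRA-style workflow, the following Python fragment trains an SVM classifier to predict whether sampled scenario parameters lead to a limit-state exceedance. It is a sketch under assumed toy data, not RAVEN's implementation.

      # Hedged sketch: an SVM surrogate classifier for a hypothetical limit-state function.
      # The "simulation" below is a cheap stand-in for an expensive plant-simulator run.
      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.svm import SVC

      rng = np.random.default_rng(1)

      def expensive_simulation(x):
          # Hypothetical limit state: failure if a nonlinear response exceeds a threshold.
          return (x[:, 0] ** 2 + 0.5 * np.sin(3 * x[:, 1]) + 0.3 * x[:, 2]) > 1.0

      # Sample scenario parameters (e.g., normalized timing/magnitude variables)
      X = rng.uniform(-1.5, 1.5, size=(400, 3))
      y = expensive_simulation(X)

      X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
      surrogate = SVC(kernel="rbf", C=10.0, gamma="scale").fit(X_train, y_train)

      print("surrogate accuracy on held-out runs:", surrogate.score(X_test, y_test))
      # The cheap surrogate can now screen far more scenarios than the full simulator could.
      print("predicted failure fraction over 100k samples:",
            surrogate.predict(rng.uniform(-1.5, 1.5, size=(100_000, 3))).mean())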

  12. A Model for School Board Operation.

    ERIC Educational Resources Information Center

    Hickcox, Edward S.; And Others

    A school board must operate in such a way that it can cope with the increasingly larger size, complex organization, and sophisticated programs of schools. The relationships among the community, board, and school can be viewed as component parts of a system. Formal and informal lines of communication exist among these parts--between the community…

  13. Prediction of School Performance from the Minnesota Child Development Inventory: Implications for Preschool Screening.

    ERIC Educational Resources Information Center

    Colligan, Robert C.

    Almost all preschool screening programs depend entirely on information and observations obtained during a brief evaluative session with the child. However, the logistics involved in managing large numbers of parents and children, the use of volunteers having varying degrees of sophistication or competency in assessment, the reliability and…

  14. Meeting the Social and Legal Needs of Urban Indians: An Experimental Program.

    ERIC Educational Resources Information Center

    Halverson, Lowell K.; Garrow, Tom

    Approximately 40 percent of America's Indians live in urban environments; of these, about 12,000 live in Seattle, Washington. With no representation in local government, and lacking the power and cultural sophistication to make the political process work for them, many Indian emigres have developed an almost institutionalized distrust of and…

  15. Audio-Vision: Audio-Visual Interaction in Desktop Multimedia.

    ERIC Educational Resources Information Center

    Daniels, Lee

    Although sophisticated multimedia authoring applications are now available to amateur programmers, the use of audio in these programs has been inadequate. Due to the lack of research in the use of audio in instruction, there are few resources to assist the multimedia producer in using sound effectively and efficiently. This paper addresses the…

  16. Key Lessons about Induction for Policy Makers and Researchers

    ERIC Educational Resources Information Center

    Wayne, Andrew J.

    2012-01-01

    The purpose of this chapter is to digest the core chapters of this volume, which draws together some of the most sophisticated thinking on new teacher induction from the last decade. This chapter attends to five key understandings about induction programs, including their context, design, implementation, and outcomes. These understandings emerge…

  17. How Database Management Systems Can Be Used To Evaluate Program Effectiveness in Small School Districts.

    ERIC Educational Resources Information Center

    Hoffman, Tony

    Sophisticated database management systems (DBMS) for microcomputers are becoming increasingly easy to use, allowing small school districts to develop their own autonomous databases for tracking enrollment and student progress in special education. DBMS applications can be designed for maintenance by district personnel with little technical…

  18. Physical Modeling in the Geological Sciences: An Annotated Bibliography. CEGS Programs Publication No. 16.

    ERIC Educational Resources Information Center

    Charlesworth, L. J., Jr.; Passero, Richard Nicholas

    The bibliography identifies, describes, and evaluates devices and techniques discussed in the world's literature to demonstrate or simulate natural physical geologic phenomena in classroom or laboratory teaching or research situations. The apparatus involved ranges from the very simple and elementary to the highly complex, sophisticated, and…

  19. Have the Focus and Sophistication of Research in Health Education Changed?

    ERIC Educational Resources Information Center

    Merrill, Ray M.; Lindsay, Christopher A.; Shields, Eric C.; Stoddard, Julianne

    2007-01-01

    This study assessed the types of research and the statistical methods used in three representative health education journals from 1994 through 2003. Editorials, commentaries, program/practice notes, and perspectives represent 17.6% of the journals' content. The most common types of articles are cross-sectional studies (27.5%), reviews (23.2%), and…

  20. 77 FR 3559 - Energy Conservation Program for Consumer Products: Test Procedures for Refrigerators...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-25

    ..., which is typical of an approach enabled by more sophisticated electronic controls. Id. The interim final... and long-time automatic defrost or variable defrost control and adjust the default values of maximum... accurate measurement of the energy use of products with variable defrost control. DATES: The amendments are...

  1. Design and Implementation of an Interdepartmental Biotechnology Program across Engineering Technology Curricula

    ERIC Educational Resources Information Center

    Clase, Kari

    2008-01-01

    The health industry is an important and growing economic engine. Advances are being made in pharmaceutical and biotechnology discoveries and their applications (including manufacturing), as well as in health care services. As a result, there is an increasing sophistication of the products and services available and being developed, with an…

  2. Going against the Grain of Accountability Policy: Leadership Preparation for Using Data to Promote Socially Just Outcomes

    ERIC Educational Resources Information Center

    Mackey, Hollie J.

    2015-01-01

    Leadership preparation programs are in transition as scholars seek to determine more sophisticated approaches in developing leaders for the increasing demands of accountability policy. This critical conceptual analysis focuses on leadership preparation for the socialization of school leaders. It is intended to reframe current perspectives about…

  3. Aeronautics and Space Report of the President: 1977 Activities.

    ERIC Educational Resources Information Center

    National Aeronautics and Space Administration, Washington, DC.

    The national programs in aeronautics and space made steady progress in 1977 toward their long-term objectives. In aeronautics the goals were improved performance, energy efficiency, and safety in aircraft. In space the goals were: (1) better remote sensing systems to generate more sophisticated information about the Earth's environment; (2)…

  4. Recent workforce trends and their effects on the silviculture program in British Columbia

    Treesearch

    John Betts

    2008-01-01

    British Columbia's entrepreneurial silviculture sector provides a reliable just-in-time service delivery of forestry activities from planting trees to fighting wildfires. Transient, and seeming to rely often on improvisation, contractors actually run logistically sophisticated businesses that are able to match varying field conditions to the biological and...

  5. Technical Report of the NAEP 1992 Trial State Assessment Program in Mathematics.

    ERIC Educational Resources Information Center

    Johnson, Eugene G.; And Others

    The "Nation's Report Card," the National Assessment of Educational Progress (NAEP), is the only nationally representative and continuing assessment of what America's students know and can do in various subject areas. This report summarizes some of the sophisticated statistical methodology used in the 1992 Trial State Assessment of…

  6. Minimize Subjective Theory, Maximize Authentic Experience in the Teaching of French Civilization.

    ERIC Educational Resources Information Center

    Corredor, Eva L.

    A program developed to teach French civilization and modern France at the U.S. Naval Academy (Annapolis, Maryland) was designed to take advantage of readily available, relatively sophisticated technology for classroom instruction. The hardware used includes a satellite earth station that receives regular television broadcasts from France, a…

  7. Practical aspects of modeling aircraft dynamics from flight data

    NASA Technical Reports Server (NTRS)

    Iliff, K. W.; Maine, R. E.

    1984-01-01

    The purpose of parameter estimation, a subset of system identification, is to estimate the coefficients (such as stability and control derivatives) of the aircraft differential equations of motion from sampled measured dynamic responses. In the past, the primary reason for estimating stability and control derivatives from flight tests was to make comparisons with wind tunnel estimates. As aircraft became more complex, and as flight envelopes were expanded to include flight regimes that were not well understood, new requirements for the derivative estimates evolved. For many years, the flight determined derivatives were used in simulations to aid in flight planning and in pilot training. The simulations were particularly important in research flight test programs in which an envelope expansion into new flight regimes was required. Parameter estimation techniques for estimating stability and control derivatives from flight data became more sophisticated to support the flight test programs. As knowledge of these new flight regimes increased, more complex aircraft were flown. Much of this increased complexity was in sophisticated flight control systems. The design and refinement of the control system required higher fidelity simulations than were previously required.
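
    A minimal sketch of the underlying idea follows: given measured states and control inputs, stability and control derivatives can be estimated by regressing the measured state rates on the states and inputs. The numbers and the single-state model below are hypothetical illustrations (equation-error least squares on simulated data), not a flight-test method as actually flown.

      # Toy parameter-estimation sketch: recover coefficients of x_dot = a*x + b*u from noisy data.
      # Illustrative only; real flight-data methods (e.g., maximum likelihood) are more involved.
      import numpy as np

      rng = np.random.default_rng(2)
      a_true, b_true = -1.2, 0.8        # hypothetical "stability" and "control" derivatives
      dt, n = 0.01, 2000

      u = np.sin(0.5 * np.arange(n) * dt)            # control input time history
      x = np.zeros(n)
      for k in range(n - 1):                         # simulate the true dynamics
          x[k + 1] = x[k] + dt * (a_true * x[k] + b_true * u[k])

      x_meas = x + rng.normal(scale=0.002, size=n)   # add measurement noise
      x_dot = np.gradient(x_meas, dt)                # numerically differentiate the measurement

      # Least-squares regression of x_dot on [x, u] yields the derivative estimates
      A = np.column_stack([x_meas, u])
      (a_hat, b_hat), *_ = np.linalg.lstsq(A, x_dot, rcond=None)
      print(f"estimated a = {a_hat:.3f} (true {a_true}), b = {b_hat:.3f} (true {b_true})")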

  8. Programming chemical kinetics: engineering dynamic reaction networks with DNA strand displacement

    NASA Astrophysics Data System (ADS)

    Srinivas, Niranjan

    Over the last century, the silicon revolution has enabled us to build faster, smaller and more sophisticated computers. Today, these computers control phones, cars, satellites, assembly lines, and other electromechanical devices. Just as electrical wiring controls electromechanical devices, living organisms employ "chemical wiring" to make decisions about their environment and control physical processes. Currently, the big difference between these two substrates is that while we have the abstractions, design principles, verification and fabrication techniques in place for programming with silicon, we have no comparable understanding or expertise for programming chemistry. In this thesis we take a small step towards the goal of learning how to systematically engineer prescribed non-equilibrium dynamical behaviors in chemical systems. We use the formalism of chemical reaction networks (CRNs), combined with mass-action kinetics, as our programming language for specifying dynamical behaviors. Leveraging the tools of nucleic acid nanotechnology (introduced in Chapter 1), we employ synthetic DNA molecules as our molecular architecture and toehold-mediated DNA strand displacement as our reaction primitive. Abstraction, modular design and systematic fabrication can work only with well-understood and quantitatively characterized tools. Therefore, we embark on a detailed study of the "device physics" of DNA strand displacement (Chapter 2). We present a unified view of strand displacement biophysics and kinetics by studying the process at multiple levels of detail, using an intuitive model of a random walk on a 1-dimensional energy landscape, a secondary structure kinetics model with single base-pair steps, and a coarse-grained molecular model that incorporates three-dimensional geometric and steric effects. Further, we experimentally investigate the thermodynamics of three-way branch migration. Our findings are consistent with previously measured or inferred rates for hybridization, fraying, and branch migration, and provide a biophysical explanation of strand displacement kinetics. Our work paves the way for accurate modeling of strand displacement cascades, which would facilitate the simulation and construction of more complex molecular systems. In Chapters 3 and 4, we identify and overcome the crucial experimental challenges involved in using our general DNA-based technology for engineering dynamical behaviors in the test tube. In this process, we identify important design rules that inform our choice of molecular motifs and our algorithms for designing and verifying DNA sequences for our molecular implementation. We also develop flexible molecular strategies for "tuning" our reaction rates and stoichiometries in order to compensate for unavoidable non-idealities in the molecular implementation, such as imperfectly synthesized molecules and spurious "leak" pathways that compete with desired pathways. We successfully implement three distinct autocatalytic reactions, which we then combine into a de novo chemical oscillator. Unlike biological networks, which use sophisticated evolved molecules (like proteins) to realize such behavior, our test tube realization is the first to demonstrate that Watson-Crick base pairing interactions alone suffice for oscillatory dynamics. Since our design pipeline is general and applicable to any CRN, our experimental demonstration of a de novo chemical oscillator could enable the systematic construction of CRNs with other dynamic behaviors.
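
    As a small illustration of programming with CRNs and mass-action kinetics, the sketch below integrates the rate equations for a generic autocatalytic reaction A + B -> 2B. It is a textbook-style example with made-up rate constants and concentrations, not the DNA strand displacement implementation developed in the thesis.

      # Mass-action kinetics for a toy autocatalytic CRN: A + B -> 2B (rate constant k).
      import numpy as np
      from scipy.integrate import solve_ivp

      k = 5e3                  # hypothetical bimolecular rate constant, /M/s
      a0, b0 = 100e-9, 1e-9    # initial concentrations, M

      def rhs(t, y):
          a, b = y
          rate = k * a * b     # mass-action rate of A + B -> 2B
          return [-rate, +rate]   # A is consumed, B is produced

      sol = solve_ivp(rhs, (0.0, 20_000.0), [a0, b0], rtol=1e-8, atol=1e-15, dense_output=True)
      t = np.linspace(0.0, 20_000.0, 5)
      print("t (s):", t)
      print("[B] (nM):", 1e9 * sol.sol(t)[1])   # sigmoidal growth characteristic of autocatalysis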

  9. Manufacturing Technology Research Needs of the Gear Industry.

    DTIC Science & Technology

    1987-12-31

    Management shortcomings within the U.S. precision gear industry; European gear and machine tool companies; German... As manufacturing becomes more sophisticated, workers are running numerically controlled computer equipment requiring an understanding of math. ... inefficiencies of the job shop environment by managing the gear business as a backward integration of the assembly line. Develop and maintain...

  10. Introduction to the concepts of TELEDEMO and TELEDIMS

    NASA Technical Reports Server (NTRS)

    Rice, R. F.; Schlutsmeyer, A. P.

    1982-01-01

    An introduction to the system concepts: TELEDEMO and TELEDIMS is provided. TELEDEMO is derived primarily from computer graphics and, via incorporation of sophisticated image data compression, enables effective low cost teleconferencing at data rates as low as 1K bit/second using dial-up phone lines. Combining TELEDEMO's powerful capabilities for the development of presentation material with microprocessor-based Information Management Systems (IMS) yields a truly all electronic IMS called TELEDIMS.

  11. Forensic Information Warfare Requirement Study

    DTIC Science & Technology

    2002-06-01

    ...technologies that are taking place now and in the near future that will adversely impact the current technologies and require additional sophistication... WetStone Technologies, Inc. moderated a panel at the Economic Crime Investigation Institute's Ninth Annual Conference (Fraud Management in the Twenty-First... second, to ascertain the legal impact of these tools. Their report was delivered to AFRL and provides an in-depth look into these areas.

  12. Computational Modeling of Cultural Dimensions in Adversary Organizations

    DTIC Science & Technology

    2010-01-01

    ...Nodes”, In the Proceedings of the 9th Conference on Uncertainty in Artificial Intelligence, 1993. [8] Pearl, J., Probabilistic Reasoning in... the artificial life simulations; in contrast, models with only a few agents typically employ quite sophisticated cognitive agents capable of... decisions as to how to allocate scarce ISR assets (two Unmanned Air Systems, UAS) among the two Red activities while at the same...

  13. CD-ROM And Knowledge Integration

    NASA Astrophysics Data System (ADS)

    Rann, Leonard S.

    1988-06-01

    As the title of this paper suggests, it is about CD-ROM technology and the structuring of massive databases. Even more, it is about the impact CD-ROM has had on the publication of massive amounts of information, and the unique qualities of the medium that allow for the most sophisticated computer retrieval techniques that have ever been used. I am not drawing on experience as a pedant in the educational field, but rather as a software and database designer who has worked with CD-ROM since its inception. I will be giving examples from my company's current applications, as well as discussing some of the challenges that face information publishers in the future. In particular, I have a belief about what the most valuable use of CD-ROM will be: the CD-ROM is particularly suited for the mass delivery of information systems and databases that either require or utilize a large amount of computational preprocessing to allow a real-time or interactive response to be achieved. Until the advent of CD-ROM technology this level of sophistication in publication was virtually impossible. I will further explain this later in this paper. First, I will discuss the salient features of CD-ROM that make it unique in the world of data storage for electronic publishing.

  14. Modeling of the Human - Operator in a Complex System Functioning Under Extreme Conditions

    NASA Astrophysics Data System (ADS)

    Getzov, Peter; Hubenova, Zoia; Yordanov, Dimitar; Popov, Wiliam

    2013-12-01

    Problems related to the operation of sophisticated control systems for objects functioning under extreme conditions are examined, together with the impact of operator effectiveness on the system as a whole. The necessity of creating complex simulation models that reflect the operator's activity is discussed. The organizational and technical system of an unmanned aviation complex is described as a sophisticated ergatic system. A computer realization of the main subsystems of the algorithmic model of the human as a controlling system is implemented, and specialized software for data processing and analysis is developed. An original computer model of the human as a tracking system has been implemented. A model of an unmanned complex for operator training and for forming a mental model of emergency situations, implemented in the MATLAB/Simulink environment, has been synthesized. As a unit of the control loop, the pilot (operator) is viewed, in simplified form, as an automatic control system consisting of three main interconnected subsystems: the sensory organs (perception sensors), the central nervous system, and the executive organs (muscles of the arms, legs, and back). A theoretical data model for predicting the level of the operator's information load in ergatic systems is proposed; it allows the assessment and prediction of the effectiveness of a real working operator. A simulation model of operator activity during takeoff, based on Petri nets, has been synthesized.
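
    A very small sketch of the "operator as a tracking controller" idea follows: the operator is represented as a delayed proportional corrector with a sluggish neuromuscular lag acting on the perceived tracking error. The parameters and the simple integrating plant are illustrative assumptions, not values or structure taken from the cited study's MATLAB/Simulink model.

      # Toy discrete-time model of a human operator tracking a reference signal:
      # perception delay + proportional correction + first-order "muscle" lag.
      # Parameters are illustrative assumptions, not values from the cited study.
      import numpy as np

      dt, t_end = 0.01, 10.0
      n = int(t_end / dt)
      delay_steps = int(0.25 / dt)      # ~250 ms perception/reaction delay
      gain, lag = 2.0, 0.3              # proportional gain and neuromuscular lag (s)

      reference = np.sin(2 * np.pi * 0.2 * np.arange(n) * dt)   # target to be tracked
      output = np.zeros(n)              # controlled-element response
      command = np.zeros(n)             # operator's commanded correction

      for k in range(n - 1):
          k_delayed = max(0, k - delay_steps)
          error = reference[k_delayed] - output[k_delayed]      # error as perceived (delayed)
          command[k + 1] = command[k] + dt * (gain * error - command[k]) / lag
          output[k + 1] = output[k] + dt * command[k]           # simple integrating plant

      rms_error = np.sqrt(np.mean((reference - output) ** 2))
      print(f"RMS tracking error over {t_end:.0f} s: {rms_error:.3f}")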

  15. National Dam Safety Program. Watchung Lake Dam (NJ00767), Raritan River Basin, Stony Brook, Somerset County, New Jersey Phase I Inspection Report.

    DTIC Science & Technology

    1981-07-01

    Dam Safety Program; erosion; embankments; Watchung Lake Dam, N.J.; visual inspection; structural analysis; spillways. ...determined by a qualified professional consultant engaged by the owner using more sophisticated methods, procedures and studies within six months... be overtopped. (The SDF, in this instance, is one half of the Probable Maximum Flood.) The decision to consider the spillway "inadequate" instead of...

  16. The evolution of educational information systems and nurse faculty roles.

    PubMed

    Nelson, Ramona; Meyers, Linda; Rizzolo, Mary Anne; Rutar, Pamela; Proto, Marcia B; Newbold, Susan

    2006-01-01

    Institutions of higher education are purchasing and/or designing sophisticated administrative information systems to manage such functions as the application, admissions, and registration process, grants management, student records, and classroom scheduling. Although faculty also manage large amounts of data, few automated systems have been created to help faculty improve teaching and learning through the management of information related to individual students, the curriculum, educational programs, and program evaluation. This article highlights the potential benefits that comprehensive educational information systems offer nurse faculty.

  17. The EOS Aqua/Aura Experience: Lessons Learned on Design, Integration, and Test of Earth-Observing Satellites

    NASA Technical Reports Server (NTRS)

    Nosek, Thomas P.

    2004-01-01

    NASA and NOAA earth observing satellite programs are flying a number of sophisticated scientific instruments which collect data on many phenomena and parameters of the earth's environment. The NASA Earth Observing System (EOS) Program originated the EOS Common Bus approach, which featured two spacecraft (Aqua and Aura) of virtually identical design but with completely different instruments. Significant savings were obtained by the Common Bus approach, and these lessons learned are presented as information for future programs requiring multiple buses for new diversified instruments with increased capabilities for acquiring earth environmental data volume, accuracy, and type.

  18. Optimized Materials From First Principles Simulations: Are We There Yet?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Galli, G; Gygi, F

    2005-07-26

    In the past thirty years, the use of scientific computing has become pervasive in all disciplines: collection and interpretation of most experimental data is carried out using computers, and physical models in computable form, with various degrees of complexity and sophistication, are utilized in all fields of science. However, full prediction of physical and chemical phenomena based on the basic laws of Nature, using computer simulations, is a revolution still in the making, and it involves some formidable theoretical and computational challenges. We illustrate the progress and successes obtained in recent years in predicting fundamental properties of materials in condensed phases and at the nanoscale, using ab-initio, quantum simulations. We also discuss open issues related to the validation of the approximate, first principles theories used in large scale simulations, and the resulting complex interplay between computation and experiment. Finally, we describe some applications, with focus on nanostructures and liquids, both at ambient and under extreme conditions.

  19. Fast Dating Using Least-Squares Criteria and Algorithms.

    PubMed

    To, Thu-Hien; Jung, Matthieu; Lycett, Samantha; Gascuel, Olivier

    2016-01-01

    Phylogenies provide a useful way to understand the evolutionary history of genetic samples, and data sets with more than a thousand taxa are becoming increasingly common, notably with viruses (e.g., human immunodeficiency virus (HIV)). Dating ancestral events is one of the first, essential goals with such data. However, current sophisticated probabilistic approaches struggle to handle data sets of this size. Here, we present very fast dating algorithms, based on a Gaussian model closely related to the Langley-Fitch molecular-clock model. We show that this model is robust to uncorrelated violations of the molecular clock. Our algorithms apply to serial data, where the tips of the tree have been sampled through times. They estimate the substitution rate and the dates of all ancestral nodes. When the input tree is unrooted, they can provide an estimate for the root position, thus representing a new, practical alternative to the standard rooting methods (e.g., midpoint). Our algorithms exploit the tree (recursive) structure of the problem at hand, and the close relationships between least-squares and linear algebra. We distinguish between an unconstrained setting and the case where the temporal precedence constraint (i.e., an ancestral node must be older than its daughter nodes) is accounted for. With rooted trees, the former is solved using linear algebra in linear computing time (i.e., proportional to the number of taxa), while the resolution of the latter, constrained setting, is based on an active-set method that runs in nearly linear time. With unrooted trees the computing time becomes (nearly) quadratic (i.e., proportional to the square of the number of taxa). In all cases, very large input trees (>10,000 taxa) can easily be processed and transformed into time-scaled trees. We compare these algorithms to standard methods (root-to-tip, r8s version of Langley-Fitch method, and BEAST). Using simulated data, we show that their estimation accuracy is similar to that of the most sophisticated methods, while their computing time is much faster. We apply these algorithms to a large data set comprising 1194 strains of Influenza virus from the pdm09 H1N1 Human pandemic. Again the results show that these algorithms provide a very fast alternative with results similar to those of other computer programs. These algorithms are implemented in the LSD software (least-squares dating), which can be downloaded from http://www.atgc-montpellier.fr/LSD/, along with all our data sets and detailed results. An Online Appendix, providing additional algorithm descriptions, tables, and figures can be found in the Supplementary Material available on Dryad at http://dx.doi.org/10.5061/dryad.968t3. © The Author(s) 2015. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.
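
    For readers unfamiliar with the baseline approach, the sketch below performs a simple root-to-tip regression, one of the standard methods the authors compare against, shown here with made-up numbers; it is not the LSD algorithm itself. Regressing root-to-tip distances on sampling dates yields an estimate of the substitution rate, and the x-intercept estimates the date of the root.

      # Root-to-tip regression on hypothetical serially sampled tips (illustrative values only).
      # distance = rate * (sampling_date - root_date), so the x-intercept estimates the root date.
      import numpy as np

      sampling_dates = np.array([2009.2, 2009.5, 2009.9, 2010.3, 2010.8, 2011.1])
      root_to_tip = np.array([0.0031, 0.0042, 0.0055, 0.0068, 0.0086, 0.0095])  # subst./site

      rate, intercept = np.polyfit(sampling_dates, root_to_tip, 1)   # least-squares line
      root_date = -intercept / rate                                  # where distance would be zero

      print(f"estimated substitution rate: {rate:.2e} substitutions/site/year")
      print(f"estimated date of the root:  {root_date:.1f}")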

  20. Fast Dating Using Least-Squares Criteria and Algorithms

    PubMed Central

    To, Thu-Hien; Jung, Matthieu; Lycett, Samantha; Gascuel, Olivier

    2016-01-01

    Phylogenies provide a useful way to understand the evolutionary history of genetic samples, and data sets with more than a thousand taxa are becoming increasingly common, notably with viruses (e.g., human immunodeficiency virus (HIV)). Dating ancestral events is one of the first, essential goals with such data. However, current sophisticated probabilistic approaches struggle to handle data sets of this size. Here, we present very fast dating algorithms, based on a Gaussian model closely related to the Langley–Fitch molecular-clock model. We show that this model is robust to uncorrelated violations of the molecular clock. Our algorithms apply to serial data, where the tips of the tree have been sampled through times. They estimate the substitution rate and the dates of all ancestral nodes. When the input tree is unrooted, they can provide an estimate for the root position, thus representing a new, practical alternative to the standard rooting methods (e.g., midpoint). Our algorithms exploit the tree (recursive) structure of the problem at hand, and the close relationships between least-squares and linear algebra. We distinguish between an unconstrained setting and the case where the temporal precedence constraint (i.e., an ancestral node must be older than its daughter nodes) is accounted for. With rooted trees, the former is solved using linear algebra in linear computing time (i.e., proportional to the number of taxa), while the resolution of the latter, constrained setting, is based on an active-set method that runs in nearly linear time. With unrooted trees the computing time becomes (nearly) quadratic (i.e., proportional to the square of the number of taxa). In all cases, very large input trees (>10,000 taxa) can easily be processed and transformed into time-scaled trees. We compare these algorithms to standard methods (root-to-tip, r8s version of Langley–Fitch method, and BEAST). Using simulated data, we show that their estimation accuracy is similar to that of the most sophisticated methods, while their computing time is much faster. We apply these algorithms to a large data set comprising 1194 strains of Influenza virus from the pdm09 H1N1 Human pandemic. Again the results show that these algorithms provide a very fast alternative with results similar to those of other computer programs. These algorithms are implemented in the LSD software (least-squares dating), which can be downloaded from http://www.atgc-montpellier.fr/LSD/, along with all our data sets and detailed results. An Online Appendix, providing additional algorithm descriptions, tables, and figures can be found in the Supplementary Material available on Dryad at http://dx.doi.org/10.5061/dryad.968t3. PMID:26424727

  1. Development of Modern Performance Assessment Tools and Capabilities for Underground Disposal of Transuranic Waste at WIPP

    NASA Astrophysics Data System (ADS)

    Zeitler, T.; Kirchner, T. B.; Hammond, G. E.; Park, H.

    2014-12-01

    The Waste Isolation Pilot Plant (WIPP) has been developed by the U.S. Department of Energy (DOE) for the geologic (deep underground) disposal of transuranic (TRU) waste. Containment of TRU waste at the WIPP is regulated by the U.S. Environmental Protection Agency (EPA). The DOE demonstrates compliance with the containment requirements by means of performance assessment (PA) calculations. WIPP PA calculations estimate the probability and consequence of potential radionuclide releases from the repository to the accessible environment for a regulatory period of 10,000 years after facility closure. The long-term performance of the repository is assessed using a suite of sophisticated computational codes. In a broad modernization effort, the DOE has overseen the transfer of these codes to modern hardware and software platforms. Additionally, there is a current effort to establish new performance assessment capabilities through the further development of the PFLOTRAN software, a state-of-the-art massively parallel subsurface flow and reactive transport code. Improvements to the current computational environment will result in greater detail in the final models due to the parallelization afforded by the modern code. Parallelization will allow for relatively faster calculations, as well as a move from a two-dimensional calculation grid to a three-dimensional grid. The result of the modernization effort will be a state-of-the-art subsurface flow and transport capability that will serve WIPP PA into the future. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000. This research is funded by WIPP programs administered by the Office of Environmental Management (EM) of the U.S Department of Energy.

  2. Control systems for heating, ventilating, and air conditioning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haines, R.W.

    1977-01-01

    Hundreds of ideas for designing and controlling sophisticated heating, ventilating and air conditioning (HVAC) systems are presented. Information is included on enthalpy control, energy conservation in HVAC systems, on solar heating, cooling and refrigeration systems, and on a self-draining water collector and heater. Computerized control systems and the economics of supervisory systems are discussed. Information is presented on computer system components, software, relevant terminology, and computerized security and fire reporting systems. Benefits of computer systems are explained, along with optimization techniques, data management, maintenance schedules, and energy consumption. A bibliography, glossaries of HVAC terminology, abbreviations, symbols, and a subject index are provided. (LCL)

  3. Technology for the product and process data base

    NASA Technical Reports Server (NTRS)

    Barnes, R. D.

    1984-01-01

    The computerized product and process data base is increasingly recognized to be the cornerstone component of an overall system aimed at the integrated automation of the industrial processes of a given company or enterprise. The technology needed to support these more effective computer integrated design and manufacturing methods, especially the concept of 3-D computer-sensible product definitions rather than engineering drawings, is not fully available and rationalized. Progress is being made, however, in bridging this technology gap with concentration on the modeling of sophisticated information and data structures, high-performance interactive user interfaces and comprehensive tools for managing the resulting computerized product definition and process data base.

  4. Lung Cancer: Posttreatment Imaging: Radiation Therapy and Imaging Findings.

    PubMed

    Benveniste, Marcelo F; Welsh, James; Viswanathan, Chitra; Shroff, Girish S; Betancourt Cuellar, Sonia L; Carter, Brett W; Marom, Edith M

    2018-05-01

    In this review, we discuss the different radiation delivery techniques available to treat non-small cell lung cancer, typical radiologic manifestations of conventional radiotherapy, and different patterns of lung injury and temporal evolution of the newer radiotherapy techniques. More sophisticated techniques include intensity-modulated radiotherapy, stereotactic body radiotherapy, proton therapy, and respiration-correlated computed tomography or 4-dimensional computed tomography for radiotherapy planning. Knowledge of the radiation treatment plan and technique, the completion date of radiotherapy, and the temporal evolution of radiation-induced lung injury is important to identify expected manifestations of radiation-induced lung injury and differentiate them from tumor recurrence or infection. Published by Elsevier Inc.

  5. A nonperturbative approximation for the moderate Reynolds number Navier–Stokes equations

    PubMed Central

    Roper, Marcus; Brenner, Michael P.

    2009-01-01

    The nonlinearity of the Navier–Stokes equations makes predicting the flow of fluid around rapidly moving small bodies highly resistant to all approaches save careful experiments or brute force computation. Here, we show how a linearization of the Navier–Stokes equations captures the drag-determining features of the flow and allows simplified or analytical computation of the drag on bodies up to Reynolds number of order 100. We illustrate the utility of this linearization in 2 practical problems that normally can only be tackled with sophisticated numerical methods: understanding flow separation in the flow around a bluff body and finding drag-minimizing shapes. PMID:19211800
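
    As a generic point of comparison (not the authors' linearization), the classical Oseen linearization of the Navier-Stokes equations yields the textbook first correction to the Stokes drag on a sphere, C_D = (24/Re)(1 + 3Re/16) with Re based on the diameter. The short sketch below evaluates that standard formula for arbitrary example parameters, simply to show how a linearized equation extends a creeping-flow drag estimate to moderate Reynolds number.

```python
# Hedged illustration (not the record's linearization): the classical Oseen
# linearization of the Navier-Stokes equations gives the first drag correction
# to Stokes flow for a sphere, C_D = (24/Re)*(1 + 3*Re/16), Re = rho*U*d/mu.
# Parameter values below are arbitrary example numbers.
import math

def sphere_drag(radius, speed, rho, mu):
    d = 2.0 * radius
    re = rho * speed * d / mu                       # Reynolds number on diameter
    f_stokes = 6.0 * math.pi * mu * radius * speed  # Stokes drag
    f_oseen = f_stokes * (1.0 + 3.0 * re / 16.0)    # linearized (Oseen) correction
    return re, f_stokes, f_oseen

re, f_s, f_o = sphere_drag(radius=1e-3, speed=5e-4, rho=1000.0, mu=1e-3)
print(f"Re = {re:.2f}, Stokes drag = {f_s:.3e} N, Oseen drag = {f_o:.3e} N")
```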

  6. A nonperturbative approximation for the moderate Reynolds number Navier-Stokes equations.

    PubMed

    Roper, Marcus; Brenner, Michael P

    2009-03-03

    The nonlinearity of the Navier-Stokes equations makes predicting the flow of fluid around rapidly moving small bodies highly resistant to all approaches save careful experiments or brute force computation. Here, we show how a linearization of the Navier-Stokes equations captures the drag-determining features of the flow and allows simplified or analytical computation of the drag on bodies up to Reynolds number of order 100. We illustrate the utility of this linearization in 2 practical problems that normally can only be tackled with sophisticated numerical methods: understanding flow separation in the flow around a bluff body and finding drag-minimizing shapes.

  7. An evaluation of computer assisted clinical classification algorithms.

    PubMed

    Chute, C G; Yang, Y; Buntrock, J

    1994-01-01

    The Mayo Clinic has a long tradition of indexing patient records at high resolution and in high volume. Several algorithms have been developed that promise to help human coders in the classification process. We evaluate variations on code browsers and free-text indexing systems with respect to their speed and error rates in our production environment. The more sophisticated indexing systems save measurable time in the coding process, but suffer from incompleteness, which requires a back-up system or human verification. Expert Network does the best job of rank-ordering clinical text, potentially enabling the creation of thresholds for the pass-through of computer-coded data without human review.
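
    The pass-through idea mentioned above can be sketched as a simple threshold rule on the score of the top-ranked code: accept it automatically when the score is high enough, otherwise route the record to a human coder. The codes, scores, and threshold in the sketch are hypothetical illustration values; this is not the Expert Network software itself.

```python
# Hypothetical sketch of a pass-through rule for computer-assisted coding:
# accept the top-ranked code automatically only when its score clears a
# threshold, otherwise queue the record for human review.  Codes, scores,
# and the threshold are made-up illustration values.
def triage(ranked_codes, threshold=0.95):
    """ranked_codes: list of (code, score) pairs sorted best-first."""
    best_code, best_score = ranked_codes[0]
    if best_score >= threshold:
        return ("auto-coded", best_code)
    return ("needs human review", ranked_codes[:3])  # show top candidates to the coder

print(triage([("250.00", 0.98), ("401.9", 0.40)]))
print(triage([("414.01", 0.62), ("410.9", 0.55)]))
```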

  8. Instructional computing in space physics moves ahead

    NASA Astrophysics Data System (ADS)

    Russell, C. T.; Omidi, N.

    As the number of spacecraft stationed in the Earth's magnetosphere grows exponentially and society becomes more technologically sophisticated and dependent on these space-based resources, both the importance of space physics and the need to train people in this field will increase. Space physics is a very difficult subject for students to master. Both mechanical and electromagnetic forces are important. The treatment of problems can be very mathematical, and the scale sizes of phenomena are usually such that laboratory studies are impossible and experimentation, when possible at all, must be carried out in deep space. Fortunately, computers have evolved to the point that they can greatly facilitate instruction in space physics.

  9. Learning of embodied interaction dynamics with recurrent neural networks: some exploratory experiments.

    PubMed

    Oubbati, Mohamed; Kord, Bahram; Koprinkova-Hristova, Petia; Palm, Günther

    2014-04-01

    A recent trend in artificial intelligence suggests that intelligence must be seen as a result of the interaction between brains, bodies, and environments. This view implies that designing sophisticated behaviour requires a primary focus on how agents are functionally coupled to their environments. From this perspective, we present early results on the application of reservoir computing as an efficient tool for understanding how behaviour emerges from interaction. Specifically, we present reservoir computing models, inspired by imitation learning designs, that extract the essential components of the behaviour resulting from agent-environment interaction dynamics. Experimental results using a mobile robot are reported to validate the learning architectures.
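
    A minimal echo state network illustrates the generic reservoir computing setup referred to above: a fixed random recurrent reservoir is driven by the input, and only a linear readout is trained, here by ridge regression on a toy prediction task. The reservoir size, scaling constants, and target signal are illustrative assumptions, not the authors' robot-learning architecture.

```python
# Minimal echo state network (reservoir computing) sketch: a fixed random
# recurrent reservoir is driven by the input sequence and only the linear
# readout is trained, here by ridge regression on a toy one-step-ahead
# prediction task.  Sizes and scaling constants are illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 1, 200
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))     # scale spectral radius below 1

def run_reservoir(u_seq):
    x = np.zeros(n_res)
    states = []
    for u in u_seq:
        x = np.tanh(W_in @ np.atleast_1d(u) + W @ x)  # reservoir state update
        states.append(x)
    return np.array(states)

# Toy task: predict the next sample of a sine wave from the current one.
t = np.linspace(0, 20 * np.pi, 2000)
u, y = np.sin(t[:-1]), np.sin(t[1:])
X = run_reservoir(u)
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)  # ridge readout
print("train MSE:", float(np.mean((X @ W_out - y) ** 2)))
```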

  10. Learning of embodied interaction dynamics with recurrent neural networks: some exploratory experiments

    NASA Astrophysics Data System (ADS)

    Oubbati, Mohamed; Kord, Bahram; Koprinkova-Hristova, Petia; Palm, Günther

    2014-04-01

    A recent trend in artificial intelligence suggests that intelligence must be seen as a result of the interaction between brains, bodies, and environments. This view implies that designing sophisticated behaviour requires a primary focus on how agents are functionally coupled to their environments. From this perspective, we present early results on the application of reservoir computing as an efficient tool for understanding how behaviour emerges from interaction. Specifically, we present reservoir computing models, inspired by imitation learning designs, that extract the essential components of the behaviour resulting from agent-environment interaction dynamics. Experimental results using a mobile robot are reported to validate the learning architectures.

  11. Preliminary performance analysis of an interplanetary navigation system using asteroid based beacons

    NASA Technical Reports Server (NTRS)

    Jee, J. Rodney; Khatib, Ahmad R.; Muellerschoen, Ronald J.; Williams, Bobby G.; Vincent, Mark A.

    1988-01-01

    A futuristic interplanetary navigation system using transmitters placed on selected asteroids is introduced. This network of space beacons is seen as a needed alternative to the overly burdened Deep Space Network. Covariance analyses of the potential performance of these space beacons, located on a candidate constellation of eight real asteroids, are initiated. Simplified analytic calculations are performed to determine limiting accuracies attainable with the network for geometric positioning. More sophisticated computer simulations are also performed to determine potential accuracies using long arcs of range and Doppler data from the beacons. The results from these computations show promise for this navigation system.
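
    A simplified flavor of the geometric-positioning part of such an analysis can be sketched as nonlinear least squares on range measurements from beacons at known positions to an unknown spacecraft position, with a formal covariance read off from the final Jacobian. The beacon coordinates, noise level, and Gauss-Newton loop below are illustrative assumptions, not the study's actual covariance analysis.

```python
# Illustrative sketch of geometric positioning from beacon ranges: estimate an
# unknown position from noisy range measurements to beacons at known positions,
# via a few Gauss-Newton iterations, then form the usual (J^T J)^-1 * sigma^2
# formal covariance.  Beacon coordinates, the true position, and the noise
# level are made-up example values (arbitrary units).
import numpy as np

rng = np.random.default_rng(1)
beacons = np.array([[ 1.0,  0.0,  0.0],
                    [ 0.0,  2.5,  0.1],
                    [-2.0, -1.0,  0.5],
                    [ 0.5, -2.0, -1.5]])
true_pos = np.array([0.3, 0.7, -0.2])
sigma = 1e-3
ranges = np.linalg.norm(beacons - true_pos, axis=1) + rng.normal(0, sigma, len(beacons))

x = np.zeros(3)                                   # initial guess at the origin
for _ in range(10):
    diff = x - beacons
    pred = np.linalg.norm(diff, axis=1)
    J = diff / pred[:, None]                      # Jacobian of range w.r.t. position
    dx, *_ = np.linalg.lstsq(J, ranges - pred, rcond=None)
    x += dx
cov = np.linalg.inv(J.T @ J) * sigma ** 2         # formal covariance for this geometry
print("estimate:", x, " true:", true_pos)
print("1-sigma position errors:", np.sqrt(np.diag(cov)))
```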

  12. Nonadditivity of van der Waals forces on liquid surfaces

    NASA Astrophysics Data System (ADS)

    Venkataram, Prashanth S.; Whitton, Jeremy D.; Rodriguez, Alejandro W.

    2016-09-01

    We present an approach for modeling nanoscale wetting and dewetting of textured solid surfaces that exploits recently developed, sophisticated techniques for computing exact long-range dispersive van der Waals (vdW) or (more generally) Casimir forces in arbitrary geometries. We apply these techniques to solve the variational formulation of the Young-Laplace equation and predict the equilibrium shapes of liquid-vacuum interfaces near solid gratings. We show that commonly employed methods of computing vdW interactions based on additive Hamaker or Derjaguin approximations, which neglect important electromagnetic boundary effects, can result in large discrepancies in the shapes and behaviors of liquid surfaces compared to exact methods.
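
    The pairwise-additive Hamaker picture contrasted with the exact approach above reduces, for the textbook case of two parallel half-spaces, to an interaction energy per unit area of W(d) = -A/(12*pi*d^2), with A the Hamaker constant. The sketch below simply evaluates that standard formula for an assumed, typical Hamaker constant; it contains none of the boundary-aware electromagnetic computation the record describes.

```python
# Textbook pairwise-additive (Hamaker) van der Waals interaction between two
# parallel half-spaces: energy per unit area W(d) = -A / (12*pi*d^2).  The
# Hamaker constant below is an assumed, typical order-of-magnitude value; this
# is exactly the kind of additive approximation the record contrasts with
# exact, geometry-aware calculations.
import math

A_HAMAKER = 1e-19          # J, assumed typical value for illustration

def hamaker_energy_per_area(d):
    """vdW energy per unit area (J/m^2) between two half-spaces at separation d (m)."""
    return -A_HAMAKER / (12.0 * math.pi * d * d)

for d_nm in (1, 2, 5, 10):
    d = d_nm * 1e-9
    print(f"d = {d_nm:2d} nm  ->  W = {hamaker_energy_per_area(d):.3e} J/m^2")
```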

  13. chemf: A purely functional chemistry toolkit.

    PubMed

    Höck, Stefan; Riedl, Rainer

    2012-12-20

    Although programming in a type-safe and referentially transparent style offers several advantages over working with mutable data structures and side effects, this style of programming has not seen much use in chemistry-related software. Since functional programming languages were designed with referential transparency in mind, these languages offer a lot of support when writing immutable data structures and side-effect-free code. We therefore started implementing our own toolkit based on the above programming paradigms in a modern, versatile programming language. We present our initial results with functional programming in chemistry by first describing an immutable data structure for molecular graphs together with a couple of simple algorithms to calculate basic molecular properties before writing a complete SMILES parser in accordance with the OpenSMILES specification. Along the way we show how to deal with input validation, error handling, bulk operations, and parallelization in a purely functional way. At the end we also analyze and improve our algorithms and data structures in terms of performance and compare them to existing toolkits, both object-oriented and purely functional. All code was written in Scala, a modern multi-paradigm programming language with strong support for functional programming and a highly sophisticated type system. We have successfully made the first important steps towards a purely functional chemistry toolkit. The data structures and algorithms presented in this article perform well while at the same time they can be safely used in parallelized applications, such as computer-aided drug design experiments, without further adjustments. This stands in contrast to existing object-oriented toolkits, where thread safety of data structures and algorithms is a deliberate design decision that can be hard to implement. Finally, the level of type safety achieved by Scala greatly increased the reliability of our code as well as the productivity of the programmers involved in this project.
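
    chemf itself is written in Scala; as a loose, language-agnostic illustration of the immutability idea described above, the Python sketch below defines a frozen molecular-graph value whose "mutations" return new values, together with one trivially derived property. The types, atom subset, and masses are illustrative and do not reflect the toolkit's actual data structures or API.

```python
# Illustration of an immutable molecular-graph value with a derived property.
# This Python sketch only mirrors the idea of working with frozen (hashable,
# side-effect-free) data; it is not the chemf toolkit, which is written in
# Scala.  The atomic masses cover only a tiny illustrative subset.
from dataclasses import dataclass

ATOMIC_MASS = {"H": 1.008, "C": 12.011, "N": 14.007, "O": 15.999}

@dataclass(frozen=True)
class Molecule:
    atoms: tuple          # element symbols, e.g. ("C", "O", "H", "H")
    bonds: frozenset      # frozenset of (i, j) index pairs

    def molecular_weight(self):
        return sum(ATOMIC_MASS[a] for a in self.atoms)

    def add_atom(self, symbol):
        # "Mutation" returns a new value; the original molecule is untouched.
        return Molecule(self.atoms + (symbol,), self.bonds)

formaldehyde = Molecule(("C", "O", "H", "H"),
                        frozenset({(0, 1), (0, 2), (0, 3)}))
print(formaldehyde.molecular_weight())            # ~30.03
print(formaldehyde.add_atom("H").molecular_weight())
```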

  14. chemf: A purely functional chemistry toolkit

    PubMed Central

    2012-01-01

    Background Although programming in a type-safe and referentially transparent style offers several advantages over working with mutable data structures and side effects, this style of programming has not seen much use in chemistry-related software. Since functional programming languages were designed with referential transparency in mind, these languages offer a lot of support when writing immutable data structures and side-effect-free code. We therefore started implementing our own toolkit based on the above programming paradigms in a modern, versatile programming language. Results We present our initial results with functional programming in chemistry by first describing an immutable data structure for molecular graphs together with a couple of simple algorithms to calculate basic molecular properties before writing a complete SMILES parser in accordance with the OpenSMILES specification. Along the way we show how to deal with input validation, error handling, bulk operations, and parallelization in a purely functional way. At the end we also analyze and improve our algorithms and data structures in terms of performance and compare them to existing toolkits, both object-oriented and purely functional. All code was written in Scala, a modern multi-paradigm programming language with strong support for functional programming and a highly sophisticated type system. Conclusions We have successfully made the first important steps towards a purely functional chemistry toolkit. The data structures and algorithms presented in this article perform well while at the same time they can be safely used in parallelized applications, such as computer-aided drug design experiments, without further adjustments. This stands in contrast to existing object-oriented toolkits, where thread safety of data structures and algorithms is a deliberate design decision that can be hard to implement. Finally, the level of type safety achieved by Scala greatly increased the reliability of our code as well as the productivity of the programmers involved in this project. PMID:23253942

  15. Computing the binding affinity of a ligand buried deep inside a protein with the hybrid steered molecular dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Villarreal, Oscar D.; Yu, Lili; Department of Laboratory Medicine, Yancheng Vocational Institute of Health Sciences, Yancheng, Jiangsu 224006

    Computing the ligand-protein binding affinity (or the Gibbs free energy) with chemical accuracy has long been a challenge for which many methods/approaches have been developed and refined with various successful applications. False positives and, even more harmful, false negatives have been and still are a common occurrence in practical applications. Inevitable in all approaches are the errors in the force field parameters we obtain from quantum mechanical computation and/or empirical fittings for the intra- and inter-molecular interactions. These errors propagate to the final results of the computed binding affinities even if we were able to perfectly implement the statistical mechanics of all the processes relevant to a given problem. And they are actually amplified to various degrees even in the mature, sophisticated computational approaches. In particular, the free energy perturbation (alchemical) approaches amplify the errors in the force field parameters because they rely on extracting the small differences between similarly large numbers. In this paper, we develop a hybrid steered molecular dynamics (hSMD) approach to the difficult binding problem of a ligand buried deep inside a protein. Sampling the transition along a physical (not alchemical) dissociation path of opening up the binding cavity, pulling out the ligand, and closing the cavity again, we can avoid the problem of error amplification by not relying on small differences between similar numbers. We tested this new form of hSMD on retinol inside cellular retinol-binding protein 1 and three cases of a ligand (a benzylacetate, a 2-nitrothiophene, and a benzene) inside a T4 lysozyme L99A/M102Q(H) double mutant. In all cases, we obtained binding free energies in close agreement with the experimentally measured values. This indicates that the force field parameters we employed are accurate and that hSMD (a brute force, unsophisticated approach) is free from the problem of error amplification suffered by many sophisticated approaches in the literature.
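
    The record estimates binding free energies from work measured along a physical pulling path; as a generic, heavily simplified illustration of turning a set of nonequilibrium pulling works into a free-energy estimate, the sketch below applies the standard Jarzynski exponential average to made-up work values. The actual hSMD protocol described above is considerably more involved than this single average.

```python
# Generic illustration of turning nonequilibrium pulling works into a free-energy
# estimate via the Jarzynski equality, dF = -kT * ln <exp(-W/kT)>.  The work
# values below are made-up numbers; the record's hSMD protocol is more involved
# than this single exponential average.
import numpy as np

kT = 0.593   # kcal/mol at ~298 K
rng = np.random.default_rng(2)
works = 8.0 + rng.normal(0, 1.5, 50)      # hypothetical pulling works, kcal/mol

# Subtract the minimum before exponentiating, for numerical stability.
w_min = works.min()
dF = w_min - kT * np.log(np.mean(np.exp(-(works - w_min) / kT)))
print(f"Jarzynski free-energy estimate: {dF:.2f} kcal/mol "
      f"(mean work {works.mean():.2f} kcal/mol)")
```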

  16. Computer-aided light sheet flow visualization using photogrammetry

    NASA Technical Reports Server (NTRS)

    Stacy, Kathryn; Severance, Kurt; Childers, Brooks A.

    1994-01-01

    A computer-aided flow visualization process has been developed to analyze video images acquired from rotating and translating light sheet visualization systems. The computer process integrates a mathematical model for image reconstruction, advanced computer graphics concepts, and digital image processing to provide a quantitative and a visual analysis capability. The image reconstruction model, based on photogrammetry, uses knowledge of the camera and light sheet locations and orientations to project two-dimensional light sheet video images into three-dimensional space. A sophisticated computer visualization package, commonly used to analyze computational fluid dynamics (CFD) results, was chosen to interactively display the reconstructed light sheet images with the numerical surface geometry for the model or aircraft under study. The photogrammetric reconstruction technique and the image processing and computer graphics techniques and equipment are described. Results of the computer-aided process applied to both a wind tunnel translating light sheet experiment and an in-flight rotating light sheet experiment are presented. The capability to compare reconstructed experimental light sheet images with CFD solutions in the same graphics environment is also demonstrated.
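
    The geometric core of the reconstruction, projecting a two-dimensional image point back into three-dimensional space by intersecting its viewing ray with the known light sheet plane, can be sketched as below under a simple pinhole-camera assumption. The intrinsics, camera pose, and plane parameters are illustrative numbers, not the actual flow-visualization setup or software.

```python
# Sketch of the geometric core of light-sheet photogrammetry: back-project a
# pixel through a pinhole camera and intersect the resulting ray with the known
# light-sheet plane to recover a 3-D point.  The camera intrinsics, pose, and
# plane below are illustrative numbers, not the actual experimental setup.
import numpy as np

def pixel_to_sheet(pixel, K, R, cam_pos, plane_point, plane_normal):
    """Return the 3-D intersection of the pixel's viewing ray with the light sheet."""
    uv1 = np.array([pixel[0], pixel[1], 1.0])
    ray_dir = R.T @ np.linalg.inv(K) @ uv1            # ray direction in world frame
    t = plane_normal @ (plane_point - cam_pos) / (plane_normal @ ray_dir)
    return cam_pos + t * ray_dir

K = np.array([[1000.0,    0.0, 320.0],                # pinhole intrinsics (pixels)
              [   0.0, 1000.0, 240.0],
              [   0.0,    0.0,   1.0]])
R = np.eye(3)                                         # camera aligned with world axes
cam_pos = np.array([0.0, 0.0, -2.0])                  # camera 2 m behind the origin
plane_point = np.array([0.0, 0.0, 0.0])               # light sheet: the z = 0 plane
plane_normal = np.array([0.0, 0.0, 1.0])

print(pixel_to_sheet((400.0, 300.0), K, R, cam_pos, plane_point, plane_normal))
```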

  17. Computer-Aided Light Sheet Flow Visualization

    NASA Technical Reports Server (NTRS)

    Stacy, Kathryn; Severance, Kurt; Childers, Brooks A.

    1993-01-01

    A computer-aided flow visualization process has been developed to analyze video images acquired from rotating and translating light sheet visualization systems. The computer process integrates a mathematical model for image reconstruction, advanced computer graphics concepts, and digital image processing to provide a quantitative and visual analysis capability. The image reconstruction model, based on photogrammetry, uses knowledge of the camera and light sheet locations and orientations to project two-dimensional light sheet video images into three-dimensional space. A sophisticated computer visualization package, commonly used to analyze computational fluid dynamics (CFD) data sets, was chosen to interactively display the reconstructed light sheet images, along with the numerical surface geometry for the model or aircraft under study. A description is provided of the photogrammetric reconstruction technique, and the image processing and computer graphics techniques and equipment. Results of the computer-aided process applied to both a wind tunnel translating light sheet experiment and an in-flight rotating light sheet experiment are presented. The capability to compare reconstructed experimental light sheet images and CFD solutions in the same graphics environment is also demonstrated.

  18. A Computing Method for Sound Propagation Through a Nonuniform Jet Stream

    NASA Technical Reports Server (NTRS)

    Padula, S. L.; Liu, C. H.

    1974-01-01

    Understanding the principles of jet noise propagation is an essential ingredient of systematic noise reduction research. High-speed computer methods offer a unique potential for dealing with complex, real-life physical systems, whereas analytical solutions are restricted to sophisticated idealized models. The classical formulation of sound propagation through a jet flow was found to be inadequate for computer solution, and a more suitable approach was needed. Previous investigations selected the phase and amplitude of the acoustic pressure as dependent variables, requiring the solution of a system of nonlinear algebraic equations. The nonlinearities complicated both the analysis and the computation. A reformulation of the convective wave equation in terms of a new set of dependent variables is developed, with special emphasis on its suitability for numerical solution on fast computers. The technique is very attractive because the resulting equations are linear in the new variables. The computer solution of such a linear system of algebraic equations may be obtained by well-defined and direct means that conserve computer time and storage space. Typical examples are illustrated, and computational results are compared with available numerical and experimental data.

  19. Computer-aided light sheet flow visualization

    NASA Technical Reports Server (NTRS)

    Stacy, Kathryn; Severance, Kurt; Childers, Brooks A.

    1993-01-01

    A computer-aided flow visualization process has been developed to analyze video images acquired from rotating and translating light sheet visualization systems. The computer process integrates a mathematical model for image reconstruction, advanced computer graphics concepts, and digital image processing to provide a quantitative and visual analysis capability. The image reconstruction model, based on photogrammetry, uses knowledge of the camera and light sheet locations and orientations to project two-dimensional light sheet video images into three-dimensional space. A sophisticated computer visualization package, commonly used to analyze computational fluid dynamics (CFD) data sets, was chosen to interactively display the reconstructed light sheet images, along with the numerical surface geometry for the model or aircraft under study. A description is provided of the photogrammetric reconstruction technique, and the image processing and computer graphics techniques and equipment. Results of the computer-aided process applied to both a wind tunnel translating light sheet experiment and an in-flight rotating light sheet experiment are presented. The capability to compare reconstructed experimental light sheet images and CFD solutions in the same graphics environment is also demonstrated.

  20. Petascale Simulation Initiative Tech Base: FY2007 Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    May, J; Chen, R; Jefferson, D

    The Petascale Simulation Initiative began as an LDRD project in the middle of Fiscal Year 2004. The goal of the project was to develop techniques to allow large-scale scientific simulation applications to better exploit the massive parallelism that will come with computers running at petaflops per second. One of the major products of this work was the design and prototype implementation of a programming model and a runtime system that lets applications extend data-parallel applications to use task parallelism. By adopting task parallelism, applications can use processing resources more flexibly, exploit multiple forms of parallelism, and support more sophisticated multiscale and multiphysics models. Our programming model was originally called the Symponents Architecture but is now known as Cooperative Parallelism, and the runtime software that supports it is called Coop. (However, we sometimes refer to the programming model as Coop for brevity.) We have documented the programming model and runtime system in a submitted conference paper [1]. This report focuses on the specific accomplishments of the Cooperative Parallelism project (as we now call it) under Tech Base funding in FY2007. Development and implementation of the model under LDRD funding alone proceeded to the point of demonstrating a large-scale materials modeling application using Coop on more than 1300 processors by the end of FY2006. Beginning in FY2007, the project received funding from both LDRD and the Computation Directorate Tech Base program. Later in the year, after the three-year term of the LDRD funding ended, the ASC program supported the project with additional funds. The goal of the Tech Base effort was to bring Coop from a prototype to a production-ready system that a variety of LLNL users could work with. Specifically, the major tasks that we planned for the project were: (1) Port SARS [former name of the Coop runtime system] to another LLNL platform, probably Thunder or Peloton (depending on when Peloton becomes available); (2) Improve SARS's robustness and ease-of-use, and develop user documentation; and (3) Work with LLNL code teams to help them determine how Symponents could benefit their applications. The original funding request was $296,000 for the year, and we eventually received $252,000. The remainder of this report describes our efforts and accomplishments for each of the goals listed above.
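
    Coop's actual interfaces are not shown in this report; as a generic illustration of combining task parallelism with data-parallel work inside each task, the sketch below uses Python's standard concurrent.futures: several independent tasks run concurrently in a process pool, and each task maps a per-element kernel over its own chunk of data. All names and numbers are illustrative and unrelated to the Coop runtime.

```python
# Generic illustration (not Coop's API): combine task parallelism with
# data-parallel work inside each task using Python's standard library.
# Several independent tasks run concurrently in a process pool, and each
# task applies a per-element kernel across its own chunk of work.
from concurrent.futures import ProcessPoolExecutor

def kernel(x):
    return x * x                          # stand-in for a per-element computation

def physics_task(task_id, chunk):
    # Data parallelism within one task: map the kernel over the chunk.
    partial = sum(map(kernel, chunk))
    return task_id, partial

def main():
    chunks = {i: range(i * 1000, (i + 1) * 1000) for i in range(4)}
    with ProcessPoolExecutor() as pool:   # task parallelism across chunks
        futures = [pool.submit(physics_task, i, c) for i, c in chunks.items()]
        results = dict(f.result() for f in futures)
    print(results)

if __name__ == "__main__":
    main()
```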
