Method and computer program product for maintenance and modernization backlogging
Mattimore, Bernard G; Reynolds, Paul E; Farrell, Jill M
2013-02-19
According to one embodiment, a computer program product for determining future facility conditions includes a computer readable medium having computer readable program code stored therein. The computer readable program code includes computer readable program code for calculating a time period specific maintenance cost, for calculating a time period specific modernization factor, and for calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. In another embodiment, a computer-implemented method for calculating future facility conditions includes calculating a time period specific maintenance cost, calculating a time period specific modernization factor, and calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. Other embodiments are also presented.
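The additive relationship stated in the abstract is simple enough to sketch directly. The following Python fragment is illustrative only; the patent does not disclose data structures or units, so all names and figures here are hypothetical.

```python
def future_facility_conditions(maintenance_cost: float,
                               modernization_factor: float,
                               backlog_factor: float) -> float:
    """Additive model from the abstract: future facility conditions for a
    time period equal the period-specific maintenance cost plus the
    period-specific modernization factor plus the backlog factor."""
    return maintenance_cost + modernization_factor + backlog_factor

# Example with hypothetical figures for one fiscal year:
ffc = future_facility_conditions(maintenance_cost=1.2e6,
                                 modernization_factor=3.5e5,
                                 backlog_factor=4.8e5)
print(f"Projected facility conditions value: {ffc:,.0f}")
```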
Institutional computing (IC) information session
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koch, Kenneth R; Lally, Bryan R
2011-01-19
The LANL Institutional Computing Program (IC) will host an information session about the current state of unclassified Institutional Computing at Los Alamos, exciting plans for the future, and the current call for proposals for science and engineering projects requiring computing. Program representatives will give short presentations and field questions about the call for proposals and future planned machines, and discuss technical support available to existing and future projects. Los Alamos has started making a serious institutional investment in open computing available to our science projects, and that investment is expected to increase even more.
ERIC Educational Resources Information Center
Mitchell, Lynda K.; Hardy, Philippe L.
The purpose of this chapter is to envision how the era of technological revolution will affect the guidance, counseling, and student support programs of the future. Advances in computer science, telecommunications, and biotechnology are discussed. These advances have the potential to affect dramatically the services of guidance programs of the…
Computing Literacy in the University of the Future.
ERIC Educational Resources Information Center
Gantt, Vernon W.
In exploring the impact of microcomputers and the future of the university in 1985 and beyond, a distinction should be made between computing literacy--the ability to use a computer--and computer literacy, which goes beyond successful computer use to include knowing how to program in various computer languages and understanding what goes on…
Application of computational physics within Northrop
NASA Technical Reports Server (NTRS)
George, M. W.; Ling, R. T.; Mangus, J. F.; Thompkins, W. T.
1987-01-01
An overview of Northrop programs in computational physics is presented. These programs depend on access to today's supercomputers, such as the Numerical Aerodynamic Simulator (NAS), and future growth on the continuing evolution of computational engines. Descriptions here are concentrated on the following areas: computational fluid dynamics (CFD), computational electromagnetics (CEM), computer architectures, and expert systems. Current efforts and future directions in these areas are presented. The impact of advances in the CFD area is described, and parallels are drawn to analogous developments in CEM. The relationship between advances in these areas and the development of advanced (parallel) architectures and expert systems is also presented.
The Future's Future: Implications of Emerging Technology for Special Education Program Planning.
ERIC Educational Resources Information Center
Hofstetter, Fred T.
2001-01-01
This article reviews emerging technologies, imagines how they can be used to help learners with special needs, and recommends new special education program initiatives to help these students make a meaningful transition from school to work. Wearable computers, personal computing devices, DVD, HDTV, MP3, and personal digital assistants are…
Analysis of rocket engine injection combustion processes
NASA Technical Reports Server (NTRS)
Salmon, J. W.; Saltzman, D. H.
1977-01-01
Mixing methodology improvements were accomplished for the JANNAF DER and CICM injection/combustion analysis computer programs. The ZOM plane prediction model was further developed for installation into the new standardized DER computer program. An intra-element mixing model development approach was recommended for gas/liquid coaxial injection elements, for possible future incorporation into the CICM computer program.
Seventy Years of Computing in the Nuclear Weapons Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Archer, Billy Joe
Los Alamos has continuously been on the forefront of scientific computing since it helped found the field. This talk will explore the rich history of computing in the Los Alamos weapons program. The current status of computing will be discussed, as will the expectations for the near future.
Mentat: An object-oriented macro data flow system
NASA Technical Reports Server (NTRS)
Grimshaw, Andrew S.; Liu, Jane W. S.
1988-01-01
Mentat, an object-oriented macro data flow system designed to facilitate parallelism in distributed systems, is presented. The macro data flow model is a model of computation similar to the data flow model with two principal differences: the computational complexity of the actors is much greater than in traditional data flow systems, and there are persistent actors that maintain state information between executions. Mentat is a system that combines the object-oriented programming paradigm and the macro data flow model of computation. Mentat programs use a dynamic structure called a future list to represent the future of computations.
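Mentat's future list is a construct of the Mentat system itself and is not detailed in this abstract; as a rough analogy, Python's standard futures can illustrate the idea of a list of placeholders for computations still in flight.

```python
# Illustrative analogy only: a "future" is a placeholder for a computation
# whose result may not exist yet, collected here into a list loosely
# analogous to Mentat's future list.
from concurrent.futures import ThreadPoolExecutor

def actor(x: int) -> int:
    # A macro data flow "actor": coarser-grained than a classic data flow
    # node, and potentially stateful in Mentat's model.
    return x * x

with ThreadPoolExecutor() as pool:
    future_list = [pool.submit(actor, x) for x in range(4)]
    results = [f.result() for f in future_list]  # block only when values are needed

print(results)  # [0, 1, 4, 9]
```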
ERIC Educational Resources Information Center
Martin-McCormick, Lynda; And Others
An advocacy packet on educational equity in computer education consists of five separate materials. A booklet entitled "Today's Guide to the Schools of the Future" contains four sections. The first section, a computer equity assessment guide, includes interview questions about school policies and allocation of resources, student and teacher…
Identification of Program Signatures from Cloud Computing System Telemetry Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nichols, Nicole M.; Greaves, Mark T.; Smith, William P.
Malicious cloud computing activity can take many forms, including running unauthorized programs in a virtual environment. Detection of these malicious activities while preserving the privacy of the user is an important research challenge. Prior work has shown the potential viability of using cloud service billing metrics as a mechanism for proxy identification of malicious programs. Previously this novel detection method has been evaluated in a synthetic and isolated computational environment. In this paper we demonstrate the ability of billing metrics to identify programs in an active cloud computing environment, including multiple virtual machines running on the same hypervisor. The open source cloud computing platform OpenStack is used for private cloud management at Pacific Northwest National Laboratory. OpenStack provides a billing tool (Ceilometer) to collect system telemetry measurements. We identify four different programs running on four virtual machines under the same cloud user account. Programs were identified with up to 95% accuracy. This accuracy is dependent on the distinctiveness of telemetry measurements for the specific programs we tested. Future work will examine the scalability of this approach for a larger selection of programs to better understand the uniqueness needed to identify a program. Additionally, future work should address the separation of signatures when multiple programs are running on the same virtual machine.
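The abstract does not name the classifier used; as a minimal sketch of the general approach, per-program telemetry signatures can be matched by nearest centroid. All feature names and values below are hypothetical.

```python
# Hypothetical sketch: build per-program "signatures" from billing/telemetry
# feature vectors (e.g., CPU time, disk I/O, network bytes per interval) and
# label a new observation by its nearest centroid.
import numpy as np

def signatures(training: dict[str, np.ndarray]) -> dict[str, np.ndarray]:
    # Mean telemetry vector per known program.
    return {name: samples.mean(axis=0) for name, samples in training.items()}

def identify(obs: np.ndarray, sigs: dict[str, np.ndarray]) -> str:
    # Nearest-centroid match in telemetry space.
    return min(sigs, key=lambda name: np.linalg.norm(obs - sigs[name]))

training = {
    "prog_a": np.array([[0.9, 10.0, 1.0], [1.1, 12.0, 0.8]]),
    "prog_b": np.array([[0.1, 90.0, 9.0], [0.2, 85.0, 11.0]]),
}
print(identify(np.array([1.0, 11.0, 0.9]), signatures(training)))  # prog_a
```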
Training School Administrators in Computer Use.
ERIC Educational Resources Information Center
Spuck, Dennis W.; Bozeman, William C.
1988-01-01
Presents results of a survey of faculty members in doctoral-level educational administration programs that examined the use of computers in administrative training programs. The present status and future directions of technological training of school administrators are discussed, and a sample curriculum for a course in technology and computing is…
Mass Memory Storage Devices for AN/SLQ-32(V).
1985-06-01
tactical programs and libraries into the AN/UYK-19 computer, the RP-16 microprocessor, and other peripheral processors (e.g., ADLS and Band 1) will be...software must be loaded into computer memory from the 4-track magnetic tape cartridges (MTCs) on which the programs are stored. Program load begins...software. Future computer programs, which will reside in peripheral processors, include the Automated Decoy Launching System (ADLS) and Band 1. As
Computer-Aided Authoring of Programmed Instruction for Teaching Symbol Recognition. Final Report.
ERIC Educational Resources Information Center
Braby, Richard; And Others
This description of AUTHOR, a computer program for the automated authoring of programmed texts designed to teach symbol recognition, includes discussions of the learning strategies incorporated in the design of the instructional materials, hardware description and the algorithm for the software, and current and future developments. Appendices…
Evolution of a standard microprocessor-based space computer
NASA Technical Reports Server (NTRS)
Fernandez, M.
1980-01-01
An existing in-inventory computer hardware/software package (B-1 RFS/ECM) was repackaged and applied to multiple missile/space programs. Concurrent with the application efforts, low-risk modifications were made to the computer from program to program to take advantage of newer, advanced technology and to meet increasingly demanding requirements (computational and memory capabilities, longer life, and fault-tolerant autonomy). It is concluded that microprocessors hold promise in a number of critical areas for future space computer applications. However, the benefits of the DoD VHSIC Program are required, and the old proliferation problem must be revisited.
Project CAD as of July 1978: CAD support project, situation in July 1978
NASA Technical Reports Server (NTRS)
Boesch, L.; Lang-Lendorff, G.; Rothenberg, R.; Stelzer, V.
1979-01-01
The structure of Computer Aided Design (CAD) and the requirements for program developments in past and future are described. The actual standard and the future aims of CAD programs are presented. The developed programs in: (1) civil engineering; (2) mechanical engineering; (3) chemical engineering/shipbuilding; (4) electrical engineering; and (5) general programs are discussed.
ERIC Educational Resources Information Center
Cusick, Theresa; And Others
This examination of computer equity argues that current educational trends--which emphasize teaching applications of computers rather than programming--will limit the computer skills of students. Added to this difficulty is the argument that some students (often minority and female students) need not be pushed to learn programming if they don't…
NCRETURN Computer Program for Evaluating Investments Revised to Provide Additional Information
Allen L. Lundgren; Dennis L. Schweitzer
1971-01-01
Reports a modified version of NCRETURN, a computer program for evaluating forestry investments. The revised version, RETURN, provides additional information about each investment, including future net worths and benefit-cost ratios, with no added input.
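The report does not reproduce RETURN's equations; the sketch below uses the standard compounding definitions of future net worth and benefit-cost ratio that such a program would implement, with hypothetical cash flows.

```python
# Standard definitions, illustrative of what RETURN computes.
def future_net_worth(cash_flows, rate, horizon):
    # cash_flows: list of (year, revenue, cost); compound each net flow
    # forward to the horizon year.
    return sum((rev - cost) * (1 + rate) ** (horizon - yr)
               for yr, rev, cost in cash_flows)

def benefit_cost_ratio(cash_flows, rate):
    # Ratio of discounted revenues to discounted costs.
    pv_benefits = sum(rev / (1 + rate) ** yr for yr, rev, _ in cash_flows)
    pv_costs = sum(cost / (1 + rate) ** yr for yr, _, cost in cash_flows)
    return pv_benefits / pv_costs

flows = [(0, 0.0, 100.0), (10, 50.0, 5.0), (30, 900.0, 0.0)]  # hypothetical stand
print(future_net_worth(flows, rate=0.05, horizon=30))
print(benefit_cost_ratio(flows, rate=0.05))
```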
Computer Simulation in the Social Sciences/Social Studies.
ERIC Educational Resources Information Center
Klassen, Daniel L.
Computers are beginning to be used more frequently as instructional tools in secondary school social studies. This is especially true of "new social studies" programs; i.e., programs which subordinate mere mastery of factual content to the recognition of and ability to deal with the social imperatives of the future. Computer-assisted…
Payload/orbiter contamination control requirement study: Computer interface
NASA Technical Reports Server (NTRS)
Bareiss, L. E.; Hooper, V. W.; Ress, E. B.
1976-01-01
The MSFC computer facilities and future plans for them are described relative to the characteristics of the various computers as to availability and suitability for processing the contamination program. A listing of the CDC 6000 series and UNIVAC 1108 characteristics is presented so that programming requirements can be compared directly and differences noted.
Future animal improvement programs applied to global populations
USDA-ARS?s Scientific Manuscript database
Breeding programs evolved gradually from within-herd phenotypic selection to local and regional cooperatives to national evaluations and now international evaluations. In the future, breeders may adapt reproductive, computational, and genomic methods to global populations as easily as with national ...
NASA Technical Reports Server (NTRS)
Filman, Robert E.
2004-01-01
This viewgraph presentation provides samples of computer code which have characteristics of poetic verse, and addresses the theoretical underpinnings of artistic coding, as well as how computer language influences software style, and the possible style of future coding.
NASA Technical Reports Server (NTRS)
Fang, Wai-Chi; Alkalai, Leon
1996-01-01
Recent changes within NASA's space exploration program favor the design, implementation, and operation of low cost, lightweight, small and micro spacecraft with multiple launches per year. In order to meet the future needs of these missions with regard to the use of spacecraft microelectronics, NASA's advanced flight computing (AFC) program is currently considering industrial cooperation and advanced packaging architectures. In relation to this, the AFC program is reviewed, considering the design and implementation of NASA's AFC multichip module.
ERIC Educational Resources Information Center
Charleston, LaVar J.; Gilbert, Juan E.; Escobar, Barbara; Jackson, Jerlando F. L.
2014-01-01
African Americans represent 1.3% of all computing sciences faculty in PhD-granting departments, underscoring the severe underrepresentation of Black/African American tenure-track faculty in computing (CRA, 2012). The Future Faculty/Research Scientist Mentoring (FFRM) program, funded by the National Science Foundation, was found to be an effective…
A Review of Resources for Evaluating K-12 Computer Science Education Programs
ERIC Educational Resources Information Center
Randolph, Justus J.; Hartikainen, Elina
2004-01-01
Since computer science education is a key to preparing students for a technologically-oriented future, it makes sense to have high quality resources for conducting summative and formative evaluation of those programs. This paper describes the results of a critical analysis of the resources for evaluating K-12 computer science education projects.…
ERIC Educational Resources Information Center
Boardman, D.
1979-01-01
Practical experience has shown that computer aided design programs can provide an invaluable aid in the learning process when integrated into the syllabus in lecture and laboratory periods. This should be a major area of future development of computer assisted learning in engineering education. (Author/CMV)
COMPUGIRLS: Stepping Stone to Future Computer-Based Technology Pathways
ERIC Educational Resources Information Center
Lee, Jieun; Husman, Jenefer; Scott, Kimberly A.; Eggum-Wilkens, Natalie D.
2015-01-01
The COMPUGIRLS: Culturally relevant technology program for adolescent girls was developed to promote underrepresented girls' future possible selves and career pathways in computer-related technology fields. We hypothesized that the COMPUGIRLS would promote academic possible selves and self-regulation to achieve these possible selves. We compared…
eXascale PRogramming Environment and System Software (XPRESS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chapman, Barbara; Gabriel, Edgar
Exascale systems, with a thousand times the compute capacity of today's leading edge petascale computers, are expected to emerge during the next decade. Their software systems will need to facilitate the exploitation of exceptional amounts of concurrency in applications, and ensure that jobs continue to run despite the occurrence of system failures and other kinds of hard and soft errors. Adapting computations at runtime to cope with changes in the execution environment, as well as to improve power and performance characteristics, is likely to become the norm. As a result, considerable innovation is required to develop system support to meet the needs of future computing platforms. The XPRESS project aims to develop and prototype a revolutionary software system for extreme-scale computing for both exascale and strong-scaled problems. The XPRESS collaborative research project will advance the state-of-the-art in high performance computing and enable exascale computing for current and future DOE mission-critical applications and supporting systems. The goals of the XPRESS research project are to: A. enable exascale performance capability for DOE applications, both current and future, B. develop and deliver a practical computing system software X-stack, OpenX, for future practical DOE exascale computing systems, and C. provide programming methods and environments for effective means of expressing application and system software for portable exascale system execution.
Student Computer Dialogs Without Special Purpose Languages.
ERIC Educational Resources Information Center
Bork, Alfred
The phrase "student computer dialogs" refers to interactive sessions between the student and the computer. Rather than using programing languages specifically designed for computer assisted instruction (CAI), existing general purpose languages should be emphasized in the future development of student computer dialogs, as the power and…
A Computer Program to Evaluate Timber Production Investments Under Uncertainty
Dennis L. Schweitzer
1968-01-01
A computer program has been written in Fortran IV to calculate probability distributions of present worths of investments in timber production. Inputs can include both point and probabilistic estimates of future costs, prices, and yields. Distributions of rates of return can also be constructed.
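The original program was written in Fortran IV; a minimal modern sketch of the same idea, sampling probabilistic estimates of prices and yields to build a distribution of present worths, might look like this (all distributions and parameters hypothetical).

```python
# Monte Carlo over uncertain future prices and yields, as the abstract
# describes; the specific distributions and numbers here are invented.
import random

def present_worth(cost_now, price, yield_units, rate, years):
    return -cost_now + (price * yield_units) / (1 + rate) ** years

draws = []
for _ in range(10_000):
    price = random.triangular(20.0, 40.0, 28.0)   # $/unit at harvest
    yield_units = random.gauss(100.0, 15.0)       # harvest volume
    draws.append(present_worth(cost_now=500.0, price=price,
                               yield_units=yield_units, rate=0.06, years=25))

draws.sort()
print("median present worth:", draws[len(draws) // 2])
print("P(present worth < 0):", sum(d < 0 for d in draws) / len(draws))
```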
Laptop Lessons: Exploring the Promise of One-to-One Computing.
ERIC Educational Resources Information Center
Carter, Kim
2001-01-01
Describes benefits of programs where schools provide laptop computers for each student. Topics include results of studies that show positive learning outcomes; funding options; implementation; protecting the equipment; resources for learning about laptop programs; staff training and support; and future possibilities, including the implications of…
Linear programming computational experience with onyx
DOE Office of Scientific and Technical Information (OSTI.GOV)
Atrek, E.
1994-12-31
ONYX is a linear programming software package based on an efficient variation of the gradient projection method. When fully configured, it is intended for application to industrial size problems. While the computational experience is limited at the time of this abstract, the technique is found to be robust and competitive with existing methodology in terms of both accuracy and speed. An overview of the approach is presented together with a description of program capabilities, followed by a discussion of up-to-date computational experience with the program. Conclusions include advantages of the approach and envisioned future developments.
The Amber Biomolecular Simulation Programs
CASE, DAVID A.; CHEATHAM, THOMAS E.; DARDEN, TOM; GOHLKE, HOLGER; LUO, RAY; MERZ, KENNETH M.; ONUFRIEV, ALEXEY; SIMMERLING, CARLOS; WANG, BING; WOODS, ROBERT J.
2006-01-01
We describe the development, current features, and some directions for future development of the Amber package of computer programs. This package evolved from a program that was constructed in the late 1970s to do Assisted Model Building with Energy Refinement, and now contains a group of programs embodying a number of powerful tools of modern computational chemistry, focused on molecular dynamics and free energy calculations of proteins, nucleic acids, and carbohydrates. PMID:16200636
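Amber's force fields and algorithms are far richer than any toy example; the following sketch shows only the velocity Verlet step at the core of a molecular dynamics loop, applied to a single harmonic bond, with notional parameters.

```python
# Minimal velocity Verlet integrator on one harmonic "bond" -- the basic
# step an MD package repeats billions of times (all values notional).
def md_harmonic(x=1.2, v=0.0, k=1.0, m=1.0, x0=1.0, dt=0.01, steps=5):
    f = -k * (x - x0)                         # force from harmonic potential
    for _ in range(steps):
        x += v * dt + 0.5 * (f / m) * dt**2   # position update
        f_new = -k * (x - x0)
        v += 0.5 * (f + f_new) / m * dt       # combined velocity half-steps
        f = f_new
        print(f"x={x:.4f}  v={v:.4f}")

md_harmonic()
```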
The Next Wave: Humans, Computers, and Redefining Reality
NASA Technical Reports Server (NTRS)
Little, William
2018-01-01
The Augmented/Virtual Reality (AVR) Lab at KSC is dedicated to "exploration into the growing computer fields of Extended Reality and the Natural User Interface (it is) a proving ground for new technologies that can be integrated into future NASA projects and programs." The topics of Human Computer Interface, Human Computer Interaction, Augmented Reality, Virtual Reality, and Mixed Reality are defined; examples of work being done in these fields in the AVR Lab are given. Current and future work in Computer Vision, Speech Recognition, and Artificial Intelligence is also outlined.
The expanded role of computers in Space Station Freedom real-time operations
NASA Technical Reports Server (NTRS)
Crawford, R. Paul; Cannon, Kathleen V.
1990-01-01
The challenges that NASA and its international partners face in their real-time operation of the Space Station Freedom necessitate an increased role on the part of computers. In building the operational concepts concerning the role of the computer, the Space Station program is using lessons learned experience from past programs, knowledge of the needs of future space programs, and technical advances in the computer industry. The computer is expected to contribute most significantly in real-time operations by forming a versatile operating architecture, a responsive operations tool set, and an environment that promotes effective and efficient utilization of Space Station Freedom resources.
Apollo experience report: Apollo lunar surface experiments package data processing system
NASA Technical Reports Server (NTRS)
Eason, R. L.
1974-01-01
Apollo Program experience in the processing of scientific data from the Apollo lunar surface experiments package, in which computers and associated hardware and software were used, is summarized. The facility developed for the preprocessing of the lunar science data is described, as are several computer facilities and programs used by the Principal Investigators. The handling, processing, and analyzing of lunar science data and the interface with the Principal Investigators are discussed. Pertinent problems that arose in the development of the data processing schemes are discussed so that future programs may benefit from the solutions to the problems. The evolution of the data processing techniques for lunar science data is related to recommendations for future programs of this type.
Chemical calculations on Cray computers
NASA Technical Reports Server (NTRS)
Taylor, Peter R.; Bauschlicher, Charles W., Jr.; Schwenke, David W.
1989-01-01
The influence of recent developments in supercomputing on computational chemistry is discussed with particular reference to Cray computers and their pipelined vector/limited parallel architectures. After reviewing Cray hardware and software, the performance of different elementary program structures is examined, and effective methods for improving program performance are outlined. The computational strategies appropriate for obtaining optimum performance in applications to quantum chemistry and dynamics are discussed. Finally, some discussion is given of new developments and future hardware and software improvements.
Extreme Scale Computing to Secure the Nation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, D L; McGraw, J R; Johnson, J R
2009-11-10
Since the dawn of modern electronic computing in the mid 1940's, U.S. national security programs have been dominant users of every new generation of high-performance computer. Indeed, the first general-purpose electronic computer, ENIAC (the Electronic Numerical Integrator and Computer), was used to calculate the expected explosive yield of early thermonuclear weapons designs. Even the U.S. numerical weather prediction program, another early application for high-performance computing, was initially funded jointly by sponsors that included the U.S. Air Force and Navy, agencies interested in accurate weather predictions to support U.S. military operations. For the decades of the cold war, national security requirements continued to drive the development of high performance computing (HPC), including advancement of the computing hardware and development of sophisticated simulation codes to support weapons and military aircraft design, numerical weather prediction, as well as data-intensive applications such as cryptography and cybersecurity. U.S. national security concerns continue to drive the development of high-performance computers and software in the U.S. and in fact, events following the end of the cold war have driven an increase in the growth rate of computer performance at the high-end of the market. This mainly derives from our nation's observance of a moratorium on underground nuclear testing beginning in 1992, followed by our voluntary adherence to the Comprehensive Test Ban Treaty (CTBT) beginning in 1995. The CTBT prohibits further underground nuclear tests, which in the past had been a key component of the nation's science-based program for assuring the reliability, performance and safety of U.S. nuclear weapons. In response to this change, the U.S. Department of Energy (DOE) initiated the Science-Based Stockpile Stewardship (SBSS) program in response to the Fiscal Year 1994 National Defense Authorization Act, which requires, 'in the absence of nuclear testing, a program to: (1) Support a focused, multifaceted program to increase the understanding of the enduring stockpile; (2) Predict, detect, and evaluate potential problems of the aging of the stockpile; (3) Refurbish and re-manufacture weapons and components, as required; and (4) Maintain the science and engineering institutions needed to support the nation's nuclear deterrent, now and in the future'. This program continues to fulfill its national security mission by adding significant new capabilities for producing scientific results through large-scale computational simulation coupled with careful experimentation, including sub-critical nuclear experiments permitted under the CTBT. To develop the computational science and the computational horsepower needed to support its mission, SBSS initiated the Accelerated Strategic Computing Initiative, later renamed the Advanced Simulation & Computing (ASC) program (sidebar: 'History of ASC Computing Program Computing Capability'). The modern 3D computational simulation capability of the ASC program supports the assessment and certification of the current nuclear stockpile through calibration with past underground test (UGT) data. While an impressive accomplishment, continued evolution of national security mission requirements will demand computing resources at a significantly greater scale than we have today.
In particular, continued observance and potential Senate confirmation of the Comprehensive Test Ban Treaty (CTBT) together with the U.S. administration's promise for a significant reduction in the size of the stockpile and the inexorable aging and consequent refurbishment of the stockpile all demand increasing refinement of our computational simulation capabilities. Assessment of the present and future stockpile with increased confidence of the safety and reliability without reliance upon calibration with past or future test data is a long-term goal of the ASC program. This will be accomplished through significant increases in the scientific bases that underlie the computational tools. Computer codes must be developed that replace phenomenology with increased levels of scientific understanding together with an accompanying quantification of uncertainty. These advanced codes will place significantly higher demands on the computing infrastructure than do the current 3D ASC codes. This article discusses not only the need for a future computing capability at the exascale for the SBSS program, but also considers high performance computing requirements for broader national security questions. For example, the increasing concern over potential nuclear terrorist threats demands a capability to assess threats and potential disablement technologies as well as a rapid forensic capability for determining a nuclear weapons design from post-detonation evidence (nuclear counterterrorism).
Computer Programming with Early Elementary Students with Down Syndrome
ERIC Educational Resources Information Center
Taylor, Matthew S.; Vasquez, Eleazar; Donehower, Claire
2017-01-01
Students of all ages and abilities must be given the opportunity to learn academic skills that can shape future opportunities and careers. Researchers in the mid-1970s and 1980s began teaching young students the processes of computer programming using basic coding skills and limited technology. As technology became more personalized and easily…
1990-09-14
transmission of detected variations through sound lines of communication to centrally located standard Navy computers. These computers would be programmed to...have been programmed in C language. The program runs under the operating system OS9 on a VME-bus computer with a 68000 microprocessor. A number of full...present practice of "add-on" supervisory controls during ship design and construction, and "fix-it" R&D programs implemented after the ship is operational
ERIC Educational Resources Information Center
Prince, Amber T.
Computer assisted instruction, and especially computer simulations, can help to ensure that preservice and inservice teachers learn from the right experiences. In the past, colleges of education used large mainframe computer systems to store student registration, provide simulation lessons on diagnosing reading difficulties, construct informal…
NASA Technical Reports Server (NTRS)
Stricker, L. T.
1973-01-01
The DORCA Applications study has been directed at development of a data bank management computer program identified as DORMAN. Because of the size of the DORCA data files and the manipulations required on that data to support analyses with the DORCA program, automated data techniques to replace time-consuming manual input generation are required. The Dynamic Operations Requirements and Cost Analysis (DORCA) program was developed for use by NASA in planning future space programs. Both programs are designed for implementation on the UNIVAC 1108 computing system. The purpose of this Executive Summary Report is to define for the NASA management the basic functions of the DORMAN program and its capabilities.
Education: AIChE Probes Impact of Computer on Future Engineering Education.
ERIC Educational Resources Information Center
Krieger, James
1983-01-01
Evaluates influence of computer assisted instruction on engineering education, considering use of computers to remove burden of doing calculations and to provide interactive self-study programs of a tutorial/remedial nature. Cites universities requiring personal computer purchase, pointing out possibility for individualized design assignments.…
Flight program language requirements. Volume 2: Requirements and evaluations
NASA Technical Reports Server (NTRS)
1972-01-01
The efforts and results are summarized for a study to establish requirements for a flight programming language for future onboard computer applications. Several different languages were available as potential candidates for future NASA flight programming efforts. The study centered around an evaluation of the four most pertinent existing aerospace languages. Evaluation criteria were established, and selected kernels from the current Saturn 5 and Skylab flight programs were used as benchmark problems for sample coding. An independent review of the language specifications incorporated anticipated future programming requirements into the evaluation. A set of detailed language requirements was synthesized from these activities. The details of program language requirements and of the language evaluations are described.
The NASA Computational Fluid Dynamics (CFD) program - Building technology to solve future challenges
NASA Technical Reports Server (NTRS)
Richardson, Pamela F.; Dwoyer, Douglas L.; Kutler, Paul; Povinelli, Louis A.
1993-01-01
This paper presents the NASA Computational Fluid Dynamics program in terms of a strategic vision and goals as well as NASA's financial commitment and personnel levels. The paper also identifies the CFD program customers and the support to those customers. In addition, the paper discusses technical emphasis and direction of the program and some recent achievements. NASA's Ames, Langley, and Lewis Research Centers are the research hubs of the CFD program while the NASA Headquarters Office of Aeronautics represents and advocates the program.
NASA Technical Reports Server (NTRS)
Noor, Ahmed K. (Editor)
1986-01-01
The papers contained in this volume provide an overview of the advances made in a number of aspects of computational mechanics, identify some of the anticipated industry needs in this area, discuss the opportunities provided by new hardware and parallel algorithms, and outline some of the current government programs in computational mechanics. Papers are included on advances and trends in parallel algorithms, supercomputers for engineering analysis, material modeling in nonlinear finite-element analysis, the Navier-Stokes computer, and future finite-element software systems.
Software development to support sensor control of robot arc welding
NASA Technical Reports Server (NTRS)
Silas, F. R., Jr.
1986-01-01
The development of software for a Digital Equipment Corporation MINC-23 Laboratory Computer to provide functions of a workcell host computer for Space Shuttle Main Engine (SSME) robotic welding is documented. Routines were written to transfer robot programs between the MINC and an Advanced Robotic Cyro 750 welding robot. Other routines provide advanced program editing features, while additional software allows communication with a remote computer-aided design system. Access to special robot functions was provided to allow advanced control of weld seam tracking and process control for future development programs.
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.; Housner, Jerrold M.
1993-01-01
Recent advances in computer technology that are likely to impact structural analysis and design of flight vehicles are reviewed. A brief summary is given of the advances in microelectronics, networking technologies, and in the user-interface hardware and software. The major features of new and projected computing systems, including high performance computers, parallel processing machines, and small systems, are described. Advances in programming environments, numerical algorithms, and computational strategies for new computing systems are reviewed. The impact of the advances in computer technology on structural analysis and the design of flight vehicles is described. A scenario for future computing paradigms is presented, and the near-term needs in the computational structures area are outlined.
ERIC Educational Resources Information Center
Congress of the U.S., Washington, DC. House Committee on Science, Space and Technology.
This hearing explores how the High Performance Computing and Communications Program (HPCC) relates to the technology needs of industry. Testimony and prepared statements from the following witnesses on future effects of computing and networking technologies on their companies are included: (1) F. Brett Berlin, president, Brett Berlin Associates,…
Generalized dynamic engine simulation techniques for the digital computer
NASA Technical Reports Server (NTRS)
Sellers, J.; Teren, F.
1974-01-01
Recently advanced simulation techniques have been developed for the digital computer and used as the basis for development of a generalized dynamic engine simulation computer program, called DYNGEN. This computer program can analyze the steady state and dynamic performance of many kinds of aircraft gas turbine engines. Without changes to the basic program, DYNGEN can analyze one- or two-spool turbofan engines. The user must supply appropriate component performance maps and design-point information. Examples are presented to illustrate the capabilities of DYNGEN in the steady state and dynamic modes of operation. The analytical techniques used in DYNGEN are briefly discussed, and its accuracy is compared with a comparable simulation using the hybrid computer. The impact of DYNGEN and similar all-digital programs on future engine simulation philosophy is also discussed.
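DYNGEN itself works from user-supplied component performance maps; as a minimal illustration of the steady-state-versus-dynamic distinction the abstract draws, consider a hypothetical first-order spool response to a step in fuel flow.

```python
# Toy dynamic engine model, not DYNGEN's method: spool speed N lags the
# steady-state speed implied by a fuel flow step, with time constant tau.
# All values are notional, normalized to the post-step steady state N = 1.0.
def simulate_spool(n0=0.8, n_target=1.0, tau=0.5, dt=0.01, t_end=2.0):
    n, t, history = n0, 0.0, []
    while t < t_end:
        n += (n_target - n) / tau * dt   # first-order lag toward steady state
        t += dt
        history.append((t, n))
    return history

for t, n in simulate_spool()[::50]:
    print(f"t={t:.2f} s  N={n:.3f}")
```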
Generalized dynamic engine simulation techniques for the digital computers
NASA Technical Reports Server (NTRS)
Sellers, J.; Teren, F.
1975-01-01
Recently advanced simulation techniques have been developed for the digital computer and used as the basis for development of a generalized dynamic engine simulation computer program, called DYNGEN. This computer program can analyze the steady state and dynamic performance of many kinds of aircraft gas turbine engines. Without changes to the basic program, DYNGEN can analyze one- or two-spool turbofan engines. The user must supply appropriate component performance maps and design point information. Examples are presented to illustrate the capabilities of DYNGEN in the steady state and dynamic modes of operation. The analytical techniques used in DYNGEN are briefly discussed, and its accuracy is compared with a comparable simulation using the hybrid computer. The impact of DYNGEN and similar digital programs on future engine simulation philosophy is also discussed.
ERIC Educational Resources Information Center
Brubaker, Thomas, A., Ed.; And Others
These conference proceedings address the capabilities of technology in education. Papers and summaries of presentations are provided on the following topics: programs for special needs students; virtual realities; funding opportunities; videodiscs; future programs and perspectives; telecomputing; computer networks in the classroom; human…
Educational Software: A Developer's Perspective.
ERIC Educational Resources Information Center
Armstrong, Timothy C.; Loane, Russell F.
1994-01-01
Examines the current status and short-term future of computer software development in higher education. Topics discussed include educational advantages of software; current program development techniques, including object oriented programming; and market trends, including IBM versus Macintosh and multimedia programs. (LRW)
Discussion of DNS: Past, Present, and Future
NASA Technical Reports Server (NTRS)
Joslin, Ronald D.
1997-01-01
This paper covers the review, status, and projected future of direct numerical simulation (DNS) methodology relative to the state-of-the-art in computer technology, numerical methods, and the trends in fundamental research programs.
Current status and future direction of NASA's Space Life Sciences Program
NASA Technical Reports Server (NTRS)
White, Ronald J.; Lujan, Barbara F.
1989-01-01
The elements of the NASA Life Sciences Program that are related to manned space flight and biological scientific studies in space are reviewed. Projects included in the current program are outlined and the future direction of the program is discussed. Consideration is given to issues such as long-duration spaceflight, medical support in space, readaptation to the gravity field of earth, considerations for the Space Station, radiation hazards, environmental standards for space habitation, and human operator interaction with computers, robots, and telepresence systems.
Biomedical wellness challenges and opportunities
NASA Astrophysics Data System (ADS)
Tangney, John F.
2012-06-01
The mission of ONR's Human and Bioengineered Systems Division is to direct, plan, foster, and encourage Science and Technology in cognitive science, computational neuroscience, bioscience and bio-mimetic technology, social/organizational science, training, human factors, and decision making as related to future Naval needs. This paper highlights current programs that contribute to future biomedical wellness needs in context of humanitarian assistance and disaster relief. ONR supports fundamental research and related technology demonstrations in several related areas, including biometrics and human activity recognition; cognitive sciences; computational neurosciences and bio-robotics; human factors, organizational design and decision research; social, cultural and behavioral modeling; and training, education and human performance. In context of a possible future with automated casualty evacuation, elements of current science and technology programs are illustrated.
High-End Computing Challenges in Aerospace Design and Engineering
NASA Technical Reports Server (NTRS)
Bailey, F. Ronald
2004-01-01
High-End Computing (HEC) has had significant impact on aerospace design and engineering and is poised to make even more in the future. In this paper we describe four aerospace design and engineering challenges: Digital Flight, Launch Simulation, Rocket Fuel System and Digital Astronaut. The paper discusses modeling capabilities needed for each challenge and presents projections of near- and far-term HEC computing requirements. NASA's HEC Project Columbia is described, and programming strategies necessary to achieve high real performance are presented.
The Future Is Kids and Computers.
ERIC Educational Resources Information Center
Personal Computing, 1982
1982-01-01
Describes a project which produced educational computer programs for PET microcomputers and use of computers in money management, in a filter company, and in a certified public accountant firm (which cancelled a contract for a time-sharing service). Also describes a computerized eye information network for ophthalmologists. (JN)
Future applications of artificial intelligence to Mission Control Centers
NASA Technical Reports Server (NTRS)
Friedland, Peter
1991-01-01
Future applications of artificial intelligence to Mission Control Centers are presented in the form of the viewgraphs. The following subject areas are covered: basic objectives of the NASA-wide AI program; inhouse research program; constraint-based scheduling; learning and performance improvement for scheduling; GEMPLAN multi-agent planner; planning, scheduling, and control; Bayesian learning; efficient learning algorithms; ICARUS (an integrated architecture for learning); design knowledge acquisition and retention; computer-integrated documentation; and some speculation on future applications.
Applied Information Systems Research Program (AISRP). Workshop 2: Meeting Proceedings
NASA Technical Reports Server (NTRS)
1992-01-01
The Earth and space science participants were able to see where the current research can be applied in their disciplines and computer science participants could see potential areas for future application of computer and information systems research. The Earth and Space Science research proposals for the High Performance Computing and Communications (HPCC) program were under evaluation. Therefore, this effort was not discussed at the AISRP Workshop. OSSA's other high priority area in computer science is scientific visualization, with the entire second day of the workshop devoted to it.
ERIC Educational Resources Information Center
Lee, Youngsun; Wehmeyer, Michael L.; Palmer, Susan B.; Williams-Diehm, Kendra; Davies, Daniel K.; Stock, Steven E.
2011-01-01
The purpose of this study was to investigate the impact of student-directed transition planning instruction ("Whose Future Is It Anyway?" curriculum) with a computer-based reading support program ("Rocket Reader") on the self-determination, self-efficacy and outcome expectancy, and transition planning knowledge of students with disabilities. This…
Explore the Future: Will Books Have a Place in the Computer Classroom?
ERIC Educational Resources Information Center
Jobe, Ronald A.
The question of the place of books in a classroom using computers appears to be simple, yet it is one of vital concern to teachers. The availability of programs (few of which focus on literary appreciation), the mesmerizing qualities of the computer, its distortion of time, the increasing power of computers over teacher time, and the computer's…
Apollo Photograph Evaluation (APE) programming manual
NASA Technical Reports Server (NTRS)
Kim, I. J.
1974-01-01
This document describes the programming techniques used to implement the equations of the Apollo Photograph Evaluation (APE) program on the UNIVAC 1108 computer and contains detailed descriptions of the program structure, a User's Guide section to provide the necessary information for proper operation of the program, and information for the assessment of the program's adaptability to future problems.
IPAD: Integrated Programs for Aerospace-vehicle Design
NASA Technical Reports Server (NTRS)
1980-01-01
The conference was organized to promote wider awareness of the IPAD program and its coming impact on American industry. The program focuses on technology issues that are critical to computer-aided design and manufacturing. Included is a description of a representative aerospace design process and its interface with manufacturing, the design of a future IPAD integrated computer-aided design system, results to date in developing IPAD products and associated technology, and industry experiences and plans to exploit these products.
Development of Alabama Resources Information System (ARIS)
NASA Technical Reports Server (NTRS)
Herring, B. E.; Vachon, R. I.
1976-01-01
A formal, organized set of information concerning the development status of the Alabama Resources Information System (ARIS) as of September 1976 is provided. A series of computer source-language programs is presented, along with flow charts for each of the computer programs to make future changes easier to perform. Listings of the variable names used in the various source code programs, and their meanings, are given, together with copies of the various user manuals prepared through this time.
The Instrument of the Future: Computers in Education.
ERIC Educational Resources Information Center
Leonard, Rex; LeCroy, Barbara
Before computers will be able to fulfill their potential in education, two major challenges must be overcome--the lack of well-trained teachers and a lack of general knowledge about software and its capabilities. Teachers must acquire some computer literacy skills, including programming, word processing, materials generation and record keeping. In…
LORAN-C LATITUDE-LONGITUDE CONVERSION AT SEA: PROGRAMMING CONSIDERATIONS.
McCullough, James R.; Irwin, Barry J.; Bowles, Robert M.
1985-01-01
Comparisons are made of the precision of arc-length routines as computer precision is reduced. Overland propagation delays are discussed and illustrated with observations from offshore New England. Present practice of LORAN-C error budget modeling is then reviewed with the suggestion that additional terms be considered in future modeling. Finally, some detailed numeric examples are provided to help with new computer program checkout.
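A minimal sketch of the kind of precision comparison described: the same arc length computed in float32 and float64, contrasting the ill-conditioned law-of-cosines form with the haversine form. The coordinates below are hypothetical.

```python
# For short baselines, the law-of-cosines form collapses at reduced
# precision (the cosine is indistinguishable from 1.0 in float32), while
# the haversine form remains well conditioned.
import numpy as np

R = 6371.0  # mean Earth radius, km

def law_of_cosines(lat1, lon1, lat2, lon2, dtype):
    lat1, lon1, lat2, lon2 = (dtype(np.radians(v)) for v in (lat1, lon1, lat2, lon2))
    c = np.sin(lat1)*np.sin(lat2) + np.cos(lat1)*np.cos(lat2)*np.cos(lon2 - lon1)
    return R * np.arccos(np.clip(c, -1.0, 1.0))

def haversine(lat1, lon1, lat2, lon2, dtype):
    lat1, lon1, lat2, lon2 = (dtype(np.radians(v)) for v in (lat1, lon1, lat2, lon2))
    a = np.sin((lat2 - lat1)/2)**2 + np.cos(lat1)*np.cos(lat2)*np.sin((lon2 - lon1)/2)**2
    return 2 * R * np.arcsin(np.sqrt(a))

p = (41.0, -70.0, 41.001, -70.0)  # two points ~111 m apart off New England
for dtype in (np.float32, np.float64):
    print(dtype.__name__, law_of_cosines(*p, dtype), haversine(*p, dtype))
```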
Tailored program evaluation: Past, present, future.
Suggs, L Suzanne; Cowdery, Joan E; Carroll, Jennifer B
2006-11-01
This paper discusses measurement issues related to the evaluation of computer-tailored health behavior change programs. As the first generation of commercially available tailored products is utilized in health promotion programming, programmers and researchers are becoming aware of the unique challenges that the evaluation of these programs presents. A project is presented that used an online tailored health behavior assessment (HBA) in a worksite setting. Process and outcome evaluation methods are described and include the challenges faced, and strategies proposed and implemented, for meeting them. Implications for future research in tailored program development, implementation, and evaluation are also discussed.
Palamar, Borys I; Vaskivska, Halyna O; Palamar, Svitlana P
In this article the authors address the significance of computer equipment in organizing cooperation between professors and future specialists. Such subject-subject interaction can be directed toward forming the professional skills of future specialists. By using information and communication technologies in the education system, a range of didactic tasks can be solved: improving the teaching of subjects in higher education, supporting the self-directed learning of future specialists, motivating learning and self-learning, and developing reflection in the learning process. The authors consider computer equipment an instrument for developing the intellectual skills, potential, and willingness of future specialists to solve communicative and communication tasks and problems on a creative basis. Based on the results of their research, the authors draw conclusions about the effectiveness of computer technologies in teaching future specialists and in their self-directed learning. Inadequate provision of computer equipment in higher education institutions, a lack of appropriate educational programs, and professors' poor knowledge and use of computers all have a negative impact on the teaching of disciplines in higher education. Computer equipment and ICT in general are instruments for developing the intellectual skills, potential, and willingness of future specialists to solve communicative and communication tasks and problems. The formation of a psychosocial environment for the development of future specialists is thus a multifaceted, complex, and didactically important issue.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Swindeman, M. J.; Jetter, R. I.; Sham, T. -L.
One of the objectives of the high temperature design methodology activities is to develop and validate both improvements and the basic features of ASME Boiler and Pressure Vessel Code, Section III, Rules for Construction of Nuclear Facility Components, Division 5, High Temperature Reactors, Subsection HB, Subpart B (HBB). The overall scope of this task is to develop a computer program to aid assessment procedures of components under specified loading conditions in accordance with the elevated temperature design requirements for Division 5 Class A components. There are many features and alternative paths of varying complexity in HBB. The initial focus of this computer program is a basic path through the various options for a single reference material, 316H stainless steel. However, the computer program is being structured for eventual incorporation of all of the features and permitted materials of HBB. This report will first provide a description of the overall computer program, particular challenges in developing numerical procedures for the assessment, and an overall approach to computer program development. This is followed by a more comprehensive appendix, which is the draft computer program manual for the program development. The strain limits rules have been implemented in the computer program. The evaluation of creep-fatigue damage will be implemented in future work scope.
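The HBB creep-fatigue procedure involves design curves, stress relaxation profiles, and a material-specific damage envelope; the sketch below shows only the linear damage summation at its core, with hypothetical numbers, not the code-specified evaluation.

```python
# Linear damage summation underlying creep-fatigue assessment
# (Miner-style cycle fractions plus time fractions); numbers are invented.
def creep_fatigue_damage(fatigue_blocks, creep_blocks):
    # fatigue_blocks: (applied cycles n, allowable cycles Nd) pairs
    # creep_blocks:   (time at load t [h], allowable time Td [h]) pairs
    fatigue = sum(n / nd for n, nd in fatigue_blocks)
    creep = sum(t / td for t, td in creep_blocks)
    return fatigue, creep

Df, Dc = creep_fatigue_damage([(500, 20_000)], [(80_000, 200_000)])
print(f"fatigue damage {Df:.3f}, creep damage {Dc:.3f}")
# Acceptance requires the point (Df, Dc) to fall inside the code's
# material-specific damage envelope, e.g., the bilinear limit for 316H.
```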
The Impact of Software on Associate Degree Programs in Electronic Engineering Technology.
ERIC Educational Resources Information Center
Hata, David M.
1986-01-01
Assesses the range and extent of computer assisted instruction software available in electronic engineering technology education. Examines the need for software skills in four areas: (1) high-level languages; (2) assembly language; (3) computer-aided engineering; and (4) computer-aided instruction. Outlines strategies for the future in three…
ALOHA System Technical Reports 16, 19, 24, 28, and 30, 1974.
ERIC Educational Resources Information Center
Hawaii Univ., Honolulu. ALOHA System.
A series of technical reports based on the Aloha System for educational computer programs provide a background on how various countries in the Pacific region developed computer capabilities and describe their current operations, as well as prospects for future expansion. Included are studies on the Japan-Hawaii TELEX and Satellite; computers at…
Education, Information Technology and Cognitive Science.
ERIC Educational Resources Information Center
Scaife, M.
1989-01-01
Discusses information technology and its effects on developmental psychology and children's education. Topics discussed include a theory of child-computer interaction (CCI); programing; communication and computers, including electronic mail; cognitive science; artificial intelligence; modeling the user-system interaction; and the future of…
The Future of School Library Media Centers.
ERIC Educational Resources Information Center
Craver, Kathleen W.
1984-01-01
Examines impact of technology on school library media program development and role of school librarian. Technological trends (computerized record keeping, computer-assisted instruction, networking, home computers, videodiscs), employment and economic trends, education of school librarians, social and behavioral trends, and organizational and…
ERIC Educational Resources Information Center
Friedman, Stan, Sr.
2004-01-01
This article describes the results of the 19th annual Computers in Libraries Conference in Washington, DC on March 10-12, 2004. The conference peered into the future, drew lessons from the past, and ran like clockwork. Program chair Jane Dysart and her organizing committee are by now old hands, bringing together three keynote addresses, 100…
Making STEM Accessible and Effective through NASA Robotics Programs
ERIC Educational Resources Information Center
West, Jonathan; Vadiee, Nader; Sutherland, Emery; Kaye, Bradley; Baker, Kyle
2018-01-01
There is no question that Science, Math, Engineering, and Technology (STEM) education is critical to the future of our students and workforce. As technology advances, computer programming skills are becoming a necessity in almost all fields. However, teaching programming and other advanced technologies is very difficult, especially in…
An Antibiotic Resource Program for Students of the Health Professions.
ERIC Educational Resources Information Center
Tritz, Gerald J.
1986-01-01
Provides a description of a computer program developed to supplement instruction in testing of antibiotics on clinical isolates of microorganisms. The program is a simulation and database for interpretation of experimental data designed to enhance laboratory learning and prepare future physicians to use computerized diagnostic instrumentation and…
Applications of Artificial Intelligence in Education--A Personal View.
ERIC Educational Resources Information Center
Richer, Mark H.
1985-01-01
Discusses: how artificial intelligence (AI) can advance education; if the future of software lies in AI; the roots of intelligent computer-assisted instruction; protocol analysis; reactive environments; LOGO programming language; student modeling and coaching; and knowledge-based instructional programs. Numerous examples of AI programs are cited.…
Architecture-Adaptive Computing Environment: A Tool for Teaching Parallel Programming
NASA Technical Reports Server (NTRS)
Dorband, John E.; Aburdene, Maurice F.
2002-01-01
Recently, networked and cluster computation have become very popular. This paper is an introduction to a new C-based parallel language for architecture-adaptive programming, aCe C. The primary purpose of aCe (Architecture-adaptive Computing Environment) is to encourage programmers to implement applications on parallel architectures by providing them the assurance that future architectures will be able to run their applications with a minimum of modification. A secondary purpose is to encourage computer architects to develop new types of architectures by providing an easily implemented software development environment and a library of test applications. This new language should be an ideal tool for teaching parallel programming. In this paper, we focus on some fundamental features of aCe C.
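aCe C syntax is not reproduced in this abstract; as a loose Python illustration of the architecture-adaptive idea (an analogy, not aCe code), the algorithm below is written entirely against an abstract array module, so retargeting to another architecture would amount to swapping that module:

```python
# Illustrative sketch only, not aCe C. The algorithm is expressed as
# whole-array operations against an abstract array module, so the same
# source could be retargeted (e.g., to a GPU array library) by changing
# this single import. The swap point is an assumption for illustration.
import numpy as xp

def jacobi_step(u):
    """One Jacobi relaxation sweep on a 2-D grid, as whole-array ops."""
    out = u.copy()
    out[1:-1, 1:-1] = 0.25 * (u[:-2, 1:-1] + u[2:, 1:-1] +
                              u[1:-1, :-2] + u[1:-1, 2:])
    return out

grid = xp.zeros((64, 64))
grid[0, :] = 1.0            # fixed boundary condition on one edge
for _ in range(200):        # iterate toward the steady solution
    grid = jacobi_step(grid)
print(float(grid[32, 32]))
```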
Apollo experience report: Real-time auxiliary computing facility development
NASA Technical Reports Server (NTRS)
Allday, C. E.
1972-01-01
The Apollo real-time auxiliary computing function and facility were an extension of the facility used during the Gemini Program. The facility was expanded to include support of all areas of flight control, and computer programs were developed for mission and mission-simulation support. The scope of the function was expanded to include prime mission support functions in addition to engineering evaluations, and the facility became a mandatory mission support facility. The facility functioned as a full-scale mission support activity until after the first manned lunar landing mission. After the Apollo 11 mission, the function and facility gradually reverted to a nonmandatory, offline, on-call operation because the real-time program flexibility was increased and verified sufficiently to eliminate the need for redundant computations. The evaluation of the facility and function and recommendations for future programs are discussed in this report.
NASA Technical Reports Server (NTRS)
Weeks, Cindy Lou
1986-01-01
Experiments were conducted at NASA Ames Research Center to define multi-tasking software requirements for multiple-instruction, multiple-data stream (MIMD) computer architectures. The focus was on specifying solutions for algorithms in the field of computational fluid dynamics (CFD). The program objectives were to allow researchers to produce usable parallel application software as soon as possible after acquiring MIMD computer equipment, to provide researchers with an easy-to-learn and easy-to-use parallel software language which could be implemented on several different MIMD machines, and to enable researchers to list preferred design specifications for future MIMD computer architectures. Analysis of CFD algorithms indicated that extensions of an existing programming language, adaptable to new computer architectures, provided the best solution to meeting program objectives. The CoFORTRAN Language was written in response to these objectives and to provide researchers a means to experiment with parallel software solutions to CFD algorithms on machines with parallel architectures.
PCs: Key to the Future. Business Center Provides Sound Skills and Good Attitudes.
ERIC Educational Resources Information Center
Pay, Renee W.
1991-01-01
The Advanced Computing/Management Training Program at Jordan Technical Center (Sandy, Utah) simulates an automated office to teach five sets of skills: computer architecture and operating systems, word processing, data processing, communications skills, and management principles. (SK)
NASA Tech Briefs, May 1989. Volume 13, No. 5
NASA Technical Reports Server (NTRS)
1989-01-01
This issue contains a special feature on the flight station of the future, discussing future enhancements to aircraft cockpits. Topics include: Electronic Components and Circuits, Electronic Systems, Physical Sciences, Materials, Computer Programs, Mechanics, Machinery, Fabrication Technology, and Mathematics and Information Sciences.
Thermal radiation view factor: Methods, accuracy and computer-aided procedures
NASA Technical Reports Server (NTRS)
Kadaba, P. V.
1982-01-01
Computer-aided thermal analysis programs, which predict whether orbiting equipment will remain within a predetermined acceptable temperature range when stationed in various attitudes with respect to the Sun and the Earth, are examined. The complexity of the surface geometries suggests the use of numerical schemes for the determination of these view factors. Basic definitions and the standard methods that form the basis for various digital computer methods, along with the various numerical methods, are presented. The physical models and mathematical methods on which a number of available programs are built are summarized. The strengths and weaknesses of the methods employed, the accuracy of the calculations, and the time required for computations are evaluated. The situations where accuracies are important for energy calculations are identified, and methods to save computational time are proposed. A guide to the best use of the programs available at several centers and the future choices for efficient use of digital computers are included in the recommendations.
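As an illustration of the kind of numerical scheme the report surveys (a minimal sketch, not code from the report), a view factor can be estimated by Monte Carlo ray casting. The geometry below, two unit squares directly opposed at separation h, is a hypothetical test case with a known analytic answer:

```python
# Minimal Monte Carlo view factor sketch: estimate F_1->2 between two unit
# squares, parallel and directly opposed, separated by distance h. Rays
# leave surface 1 with cosine-weighted directions; the fraction striking
# surface 2 estimates the view factor.
import math, random

def view_factor_parallel_squares(h, n_rays=200_000):
    hits = 0
    for _ in range(n_rays):
        x0, y0 = random.random(), random.random()   # emission point on plate 1
        # cosine-weighted hemisphere sampling (Malley's method)
        phi = 2.0 * math.pi * random.random()
        r = math.sqrt(random.random())
        dx, dy = r * math.cos(phi), r * math.sin(phi)
        dz = math.sqrt(max(0.0, 1.0 - r * r))       # component toward plate 2
        t = h / dz if dz > 1e-12 else float("inf")  # travel to the plane z = h
        x, y = x0 + t * dx, y0 + t * dy
        if 0.0 <= x <= 1.0 and 0.0 <= y <= 1.0:
            hits += 1
    return hits / n_rays

print(view_factor_parallel_squares(h=1.0))  # analytic value is about 0.1998
```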
NASA Technical Reports Server (NTRS)
Zolotukhin, V. G.; Kolosov, B. I.; Usikov, D. A.; Borisenko, V. I.; Mosin, S. T.; Gorokhov, V. N.
1980-01-01
A batch of programs for the YeS-1040 computer, combined into an automated system for processing photographic (and video) images of the Earth's surface taken from spacecraft, is described. Individual programs are presented with a detailed discussion of the algorithmic and programmatic facilities needed by the user. The basic principles for assembling the system and the control programs are included. Also described is the exchange format under which any programs recommended for the processing system will be cataloged in the future.
Electro-Optic Computing Architectures. Volume I
1998-02-01
The objective of the Electro-Optic Computing Architecture (EOCA) program was to develop multi-function electro-optic interfaces and optical interconnect units to enhance the performance of parallel processor systems and form the building blocks for future electro-optic computing architectures. Specifically, three multi-function interface modules were targeted for development: an Electro-Optic Interface (EOI), an Optical Interconnection Unit (OIU)…
The Challenge '88 Project: Interfacing of Chemical Instruments to Computers.
ERIC Educational Resources Information Center
Lyons, Jim; Verghese, Manoj
The main part of this project involved using a computer, either an Apple or an IBM, as a chart recorder for the infrared (IR) and nuclear magnetic resonance (NMR) spectrophotometers. The computer "reads" these machines and displays spectra on its monitor. The graphs can then be stored for future reference and manipulation. The program to…
Recursive computer architecture for VLSI
DOE Office of Scientific and Technical Information (OSTI.GOV)
Treleaven, P.C.; Hopkins, R.P.
1982-01-01
A general-purpose computer architecture based on the concept of recursion and suitable for VLSI computer systems built from replicated (lego-like) computing elements is presented. The recursive computer architecture is defined by presenting a program organisation, a machine organisation and an experimental machine implementation oriented to VLSI. The experimental implementation is being restricted to simple, identical microcomputers each containing a memory, a processor and a communications capability. This future generation of lego-like computer systems is termed fifth-generation computers by the Japanese. 30 references.
Computational fluid dynamics at NASA Ames and the numerical aerodynamic simulation program
NASA Technical Reports Server (NTRS)
Peterson, V. L.
1985-01-01
Computers are playing an increasingly important role in the field of aerodynamics, such that they now serve as a major complement to wind tunnels in aerospace research and development. Factors pacing advances in computational aerodynamics are identified, including the amount of computational power required to take the next major step in the discipline. The four main areas of computational aerodynamics research at NASA Ames Research Center which are directed toward extending the state of the art are identified and discussed. Example results obtained from approximate forms of the governing equations are presented and discussed, both in the context of levels of computer power required and the degree to which they either further the frontiers of research or apply to programs of practical importance. Finally, the Numerical Aerodynamic Simulation Program, with its 1988 target of achieving a sustained computational rate of 1 billion floating-point operations per second, is discussed in terms of its goals, status, and its projected effect on the future of computational aerodynamics.
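The "approximate forms of the governing equations" are not spelled out in this abstract. For orientation only, one classic member of the computational-aerodynamics hierarchy of that era (a representative example, not necessarily the exact forms used in the paper) is the transonic small-disturbance potential equation,

\[
(1 - M_\infty^2)\,\phi_{xx} + \phi_{yy} + \phi_{zz} = \frac{(\gamma + 1)\, M_\infty^2}{U_\infty}\,\phi_x \phi_{xx},
\]

where \(\phi\) is the perturbation velocity potential, \(M_\infty\) and \(U_\infty\) are the free-stream Mach number and speed, and \(\gamma\) is the ratio of specific heats. Each step up the hierarchy (full potential, Euler, Navier-Stokes) retains more physics and demands the kind of computer power discussed above.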
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brun, B.
1997-07-01
Computer technology has improved tremendously in recent years, with larger media capacity, more memory, and more computational power. Visual computing, with high-performance graphic interfaces and desktop computational power, has changed the way engineers accomplish everyday tasks, development work, and safety analysis studies. The emergence of parallel computing will permit simulation over larger domains. In addition, new development methods, languages and tools have appeared in the last several years.
Simulation and Gaming: Directions, Issues, Ponderables.
ERIC Educational Resources Information Center
Uretsky, Michael
1995-01-01
Discusses the current use of simulation and gaming in a variety of settings. Describes advances in technology that facilitate the use of simulation and gaming, including computer power, computer networks, software, object-oriented programming, video, multimedia, virtual reality, and artificial intelligence. Considers the future use of simulation…
NASA Technical Reports Server (NTRS)
Marnock, M. J.
1971-01-01
The protection of intellectual property by a patent, a copyright, or trade secrets is reviewed. The present and future uses of computers and software are discussed, along with governmental uses of software. The popularity of contractual agreements for the sale or lease of computer programs and software services is also summarized.
DOT National Transportation Integrated Search
1976-08-01
This report contains a functional design for the simulation of a future automation concept in support of the ATC Systems Command Center. The simulation subsystem performs airport airborne arrival delay predictions and computes flow control tables for...
Integrating Computational Science Tools into a Thermodynamics Course
NASA Astrophysics Data System (ADS)
Vieira, Camilo; Magana, Alejandra J.; García, R. Edwin; Jana, Aniruddha; Krafcik, Matthew
2018-01-01
Computational tools and methods have permeated multiple science and engineering disciplines, because they enable scientists and engineers to process large amounts of data, represent abstract phenomena, and model and simulate complex concepts. In order to prepare future engineers with the ability to use computational tools in the context of their disciplines, some universities have started to integrate these tools within core courses. This paper evaluates the effect of introducing three computational modules within a thermodynamics course on student disciplinary learning and self-beliefs about computation. The results suggest that using worked examples paired with computer simulations to implement these modules has a positive effect on (1) student disciplinary learning, (2) student perceived ability to do scientific computing, and (3) student perceived ability to do computer programming. These effects were identified regardless of the students' prior experiences with computer programming.
Local Special Education Planning Model: User's Manual.
ERIC Educational Resources Information Center
Hartman, Peggy L.; Hartman, William T.
To help school districts estimate the present and future needs and costs of their special education programs, this manual presents the Local Special Education Planning Model, an interactive computer program (with worksheets) that provides a framework for using a district's own data to analyze its special education program. Part 1 of the manual…
Automatic Program Synthesis Reports.
ERIC Educational Resources Information Center
Biermann, A. W.; And Others
Some of the major results and future goals of an automatic program synthesis project are described in the two papers that comprise this document. The first paper gives a detailed algorithm for synthesizing a computer program from a trace of its behavior. Since the algorithm involves a search, the length of time required to do the synthesis of…
History of the numerical aerodynamic simulation program
NASA Technical Reports Server (NTRS)
Peterson, Victor L.; Ballhaus, William F., Jr.
1987-01-01
The Numerical Aerodynamic Simulation (NAS) program has reached a milestone with the completion of the initial operating configuration of the NAS Processing System Network. This achievement is the first major milestone in the continuing effort to provide a state-of-the-art supercomputer facility for the national aerospace community and to serve as a pathfinder for the development and use of future supercomputer systems. The underlying factors that motivated the initiation of the program are first identified and then discussed. These include the emergence and evolution of computational aerodynamics as a powerful new capability in aerodynamics research and development, the computer power required for advances in the discipline, the complementary nature of computation and wind tunnel testing, and the need for the government to play a pathfinding role in the development and use of large-scale scientific computing systems. Finally, the history of the NAS program is traced from its inception in 1975 to the present time.
Parallel processing for scientific computations
NASA Technical Reports Server (NTRS)
Alkhatib, Hasan S.
1991-01-01
The main contribution of the effort in the last two years is the introduction of the MOPPS system. After an extensive literature search, we introduced the system described next. MOPPS employs a new solution to the problem of managing programs which solve scientific and engineering applications on a distributed processing environment. Autonomous computers cooperate efficiently in solving large scientific problems with this solution. MOPPS has the advantage of not assuming the presence of any particular network topology or configuration, computer architecture, or operating system. It imposes little overhead on network and processor resources while efficiently managing programs concurrently. The core of MOPPS is an intelligent program manager that builds a knowledge base of the execution performance of the parallel programs it is managing under various conditions. The manager applies this knowledge to improve the performance of future runs. The program manager learns from experience.
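As a toy sketch of the idea behind MOPPS's intelligent program manager (invented names and numbers, not actual MOPPS code), the manager below records observed run times per program and configuration and prefers the historically fastest configuration for future runs:

```python
# Toy sketch: a manager that learns from execution history. It records how
# long each program ran under each configuration, tries untried configs
# first, and thereafter picks the one with the best mean observed time.
import time, random
from collections import defaultdict

class PerformanceManager:
    def __init__(self):
        self.history = defaultdict(list)   # (program, config) -> run times

    def choose_config(self, program, configs):
        untried = [c for c in configs if not self.history[(program, c)]]
        if untried:
            return random.choice(untried)
        return min(configs, key=lambda c: sum(self.history[(program, c)]) /
                                          len(self.history[(program, c)]))

    def run(self, program, func, configs):
        cfg = self.choose_config(program, configs)
        start = time.perf_counter()
        func(cfg)
        self.history[(program, cfg)].append(time.perf_counter() - start)
        return cfg

mgr = PerformanceManager()
work = lambda n: sum(i * i for i in range(50_000 * n))  # stand-in workload
for _ in range(6):
    print(mgr.run("solver", work, configs=(1, 2, 4)))   # converges to config 1
```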
Multiparadigm Design Environments
1992-01-01
following results: (1) new methods for programming in terms of conceptual models; (2) design of object-oriented languages; (3) compiler optimization and… experimented with object-based methods for programming directly in terms of conceptual models, object-oriented language design, computer program… We expect these results to have a strong influence on future…
Computer Intelligence: Unlimited and Untapped.
ERIC Educational Resources Information Center
Staples, Betsy
1983-01-01
Herbert Simon (Nobel prize-winning economist/professor) expresses his views on human and artificial intelligence, problem solving, inventing concepts, and the future. Includes comments on expert systems, state of the art in artificial intelligence, robotics, and "Bacon," a computer program that finds scientific laws hidden in raw data.…
ERIC Educational Resources Information Center
Oralbekova, Aliya K.; Arzymbetova, Sholpan Zh.; Begalieva, Saule B.; Ospanbekova, Meirgul N.; Mussabekova, Gulvira A.; Dauletova, Ainash S.
2016-01-01
Many children with disabilities in the Republic of Kazakhstan face physiological difficulties in moving, communicating, and learning, along with problems related to learning various computer programs. Computer technologies are of particular importance for children with disabilities. By using information and computer technologies, these children…
Eyewitness to history: Landmarks in the development of computerized electrocardiography.
Rautaharju, Pentti M
2016-01-01
The use of digital computers for ECG processing was pioneered in the early 1960s by two immigrants to the US: Hubert Pipberger, who initiated a collaborative VA project to collect an ECG-independent Frank-lead database, and Cesar Caceres at NIH, who selected standard 12-lead ECGs processed as single leads for his ECAN program. Ray Bonner in the early 1970s placed his IBM 5880 program in a cart to print ECGs with interpretation, and computer-ECG programs were developed by Telemed, Marquette, HP-Philips and Mortara. The "Common Standards for Quantitative Electrocardiography (CSE)" project directed by Jos Willems evaluated nine ECG programs and eight cardiologists in clinically defined categories. The total accuracy of a representative "average" cardiologist (75.5%) was 5.8% higher than that of the average program (69.7%, p < 0.001). Future comparisons of computer-based and expert-reader performance are likely to show evolving results with continuing improvement of computer-ECG algorithms and changing expertise of ECG interpreters.
NASA Technical Reports Server (NTRS)
Fischer, James R.; Grosch, Chester; Mcanulty, Michael; Odonnell, John; Storey, Owen
1987-01-01
NASA's Office of Space Science and Applications (OSSA) gave a select group of scientists the opportunity to test and implement their computational algorithms on the Massively Parallel Processor (MPP) located at Goddard Space Flight Center, beginning in late 1985. One year later, the Working Group presented its report, which addressed the following: algorithms, programming languages, architecture, programming environments, the relation of theory to practice, and measured performance. The findings point to a number of demonstrated computational techniques for which the MPP architecture is ideally suited. For example, besides executing much faster on the MPP than on conventional computers, systolic VLSI simulation (where distances are short), lattice simulation, neural network simulation, and image problems were found to be easier to program on the MPP's architecture than on a CYBER 205 or even a VAX. The report also makes technical recommendations covering all aspects of MPP use, and recommendations concerning the future of the MPP and machines based on similar architectures, expansion of the Working Group, and study of the role of future parallel processors for space station, EOS, and the Great Observatories era.
Computational methods for a three-dimensional model of the petroleum-discovery process
Schuenemeyer, J.H.; Bawiec, W.J.; Drew, L.J.
1980-01-01
A discovery-process model devised by Drew, Schuenemeyer, and Root can be used to predict the amount of petroleum to be discovered in a basin from some future level of exploratory effort; the predictions are based on historical drilling and discovery data. Because marginal costs of discovery and production are a function of field size, the model can be used to make estimates of future discoveries within deposit size classes. The modeling approach is a geometric one in which the area searched is a function of the size and shape of the targets being sought. A high correlation is assumed between the surface-projection area of the fields and the volume of petroleum. To predict how much oil remains to be found, the area searched must be computed, and the basin size and discovery efficiency must be estimated. The basin is assumed to be explored randomly rather than by pattern drilling. The model may be used to compute independent estimates of future oil at different depth intervals for a play involving multiple producing horizons. We have written FORTRAN computer programs that are used with Drew, Schuenemeyer, and Root's model to merge the discovery and drilling information and perform the necessary computations to estimate undiscovered petroleum. These programs may be modified easily for the estimation of remaining quantities of commodities other than petroleum.
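For illustration, here is a hedged sketch of the geometric search idea in this model family (one common formulation, not necessarily the exact Drew-Schuenemeyer-Root equations, and all numbers hypothetical): the chance that a well finds a field scales with the field's surface-projection area and a discovery efficiency, so large fields tend to be found early and small ones late.

```python
# Hedged sketch of a geometric discovery-process model. Each exploratory
# well "misses" a field of area A in a basin of area B with probability
# exp(-c * A / B), where c is a discovery efficiency; after w wells the
# expected number found in a size class of N fields follows directly.
import math

def expected_discoveries(n_fields, field_area, basin_area, efficiency, wells):
    """Expected finds in one size class after `wells` exploratory wells."""
    p_miss_per_well = math.exp(-efficiency * field_area / basin_area)
    return n_fields * (1.0 - p_miss_per_well ** wells)

basin_area = 10_000.0                            # km^2, hypothetical basin
size_classes = {100.0: 5, 10.0: 50, 1.0: 500}    # field area km^2 -> count
for wells in (100, 500, 2000):
    remaining = {a: n - expected_discoveries(n, a, basin_area, 2.0, wells)
                 for a, n in size_classes.items()}
    print(wells, {a: round(r, 1) for a, r in remaining.items()})
# Output shows the large-field class depleting first as drilling accumulates.
```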
Education through the prism of computation
NASA Astrophysics Data System (ADS)
Kaurov, Vitaliy
2014-03-01
With the rapid development of technology, computation claims its irrevocable place among the research components of modern science. Thus, to foster a successful future scientist, engineer or educator, we need to add computation to the foundations of scientific education. We discuss the paradigm shifts this brings to those foundations using the example of the Wolfram Science Summer School, one of the most advanced computational outreach programs run by the Wolfram Foundation, welcoming participants of almost all ages and backgrounds. Centered on complexity science and physics, it also covers numerous adjacent and interdisciplinary fields such as finance, biology, medicine and even music. We talk about educational and research experiences in this program during the 12 years of its existence and review statistics and outputs the program has produced. Among these are interactive electronic publications at the Wolfram Demonstrations Project and contributions to the computational knowledge engine Wolfram|Alpha.
Preliminary Computational Study for Future Tests in the NASA Ames 9 Foot x 7 Foot Wind Tunnel
NASA Technical Reports Server (NTRS)
Pearl, Jason M.; Carter, Melissa B.; Elmiligui, Alaa A.; WInski, Courtney S.; Nayani, Sudheer N.
2016-01-01
The NASA Advanced Air Vehicles Program, Commercial Supersonics Technology Project seeks to advance tools and techniques to make over-land supersonic flight feasible. In this study, preliminary computational results are presented for future tests in the NASA Ames 9 foot x 7 foot supersonic wind tunnel to be conducted in early 2016. Shock-plume interactions and their effect on pressure signature are examined for six model geometries. Near-field pressure signatures are assessed using the CFD code USM3D to model the proposed test geometries in free air. Additionally, results obtained using the commercial grid generation software Pointwise® are compared to results using VGRID, the NASA Langley Research Center in-house mesh generation program.
National meeting to review IPAD status and goals. [Integrated Programs for Aerospace-vehicle Design
NASA Technical Reports Server (NTRS)
Fulton, R. E.
1980-01-01
A joint NASA/industry project called Integrated Programs for Aerospace-vehicle Design (IPAD) is described, which has the goal of raising aerospace-industry productivity through the application of computers to integrate company-wide management of engineering data. Basically a general-purpose interactive computing system developed to support engineering design processes, the IPAD design is composed of three major software components: the executive, data management, and geometry and graphics software. Results of IPAD activities include a comprehensive description of a future representative aerospace vehicle design process and its interface to manufacturing, and requirements and preliminary design of a future IPAD software system to integrate engineering activities of an aerospace company having several products under simultaneous development.
Using Alice 2.0 to Design Games for People with Stroke.
Proffitt, Rachel; Kelleher, Caitlin; Baum, M Carolyn; Engsberg, Jack
2012-08-01
Computer and videogames are gaining in popularity as rehabilitation tools. Unfortunately, most systems still require extensive programming/engineering knowledge to create, something that therapists, as novice programmers, do not possess. There is software designed to allow novice programmers to create storyboards and games through simple drag-and-drop formats; however, its applications for therapeutic game development have not been studied. The purpose of this study was to have an occupational therapy (OT) student with no prior computer programming experience learn how to create computer games for persons with stroke using Alice 2.0, a drag-and-drop editor designed by Carnegie Mellon University (Pittsburgh, PA). The OT student learned how to use Alice 2.0 through a textbook, tutorials, and assistance from computer science students. She kept a journal of her process, detailing her successes and challenges. The OT student created three games for people with stroke using Alice 2.0. She found that although there were many supports in Alice for creating stories, it lacked critical pieces necessary for game design. Her recommendations for a future programming environment for therapists were that it (1) be efficient, (2) include basic game design pieces so therapists do not have to create them, (3) provide technical support, and (4) be simple. With the incorporation of these recommendations, a future programming environment for therapists will be an effective tool for therapeutic game development.
A Model for Intelligent Computer-Aided Education Systems.
ERIC Educational Resources Information Center
Du Plessis, Johan P.; And Others
1995-01-01
Proposes a model for intelligent computer-aided education systems that is based on cooperative learning, constructive problem-solving, object-oriented programming, interactive user interfaces, and expert system techniques. Future research is discussed, and a prototype for teaching mathematics to 10- to 12-year-old students is appended. (LRW)
Software Engineering Techniques for Computer-Aided Learning.
ERIC Educational Resources Information Center
Ibrahim, Bertrand
1989-01-01
Describes the process for developing tutorials for computer-aided learning (CAL) using a programming language rather than an authoring system. The workstation used is described, the use of graphics is discussed, the role of a local area network (LAN) is explained, and future plans are discussed. (five references) (LRW)
Preparing Urban Teachers for the Technological Future.
ERIC Educational Resources Information Center
Sheingold, Karen; And Others
This report reviews the results of a survey of teacher training programs in technology among 28 urban school systems in order to ascertain the current state of school computer use and teacher retraining. Results indicate that preparing students for the future presents particular problems for urban schools. With technology restructuring jobs and…
Research at Yale in Natural Language Processing. Research Report #84.
ERIC Educational Resources Information Center
Schank, Roger C.
This report summarizes the capabilities of five computer programs at Yale that do automatic natural language processing as of the end of 1976. For each program an introduction to its overall intent is given, followed by the input/output, a short discussion of the research underlying the program, and a prognosis for future development. The programs…
NASA Technical Reports Server (NTRS)
1975-01-01
A revised user's manual for the computer program MAPSEP is presented. Major changes from the interplanetary version of MAPSEP are summarized. The changes are intended to provide a basic capability to analyze anticipated solar electric missions, and a foundation for future, more complex modifications. For Vol. III, N75-16589.
ERIC Educational Resources Information Center
Kjällander, Susanne; Åkerfeldt, Anna; Mannila, Linda; Parnes, Peter
2018-01-01
For education to provide knowledge reflecting our current and future society, many countries are revising their curricula, including a vivid discussion on digital competence, programming and computational thinking. This article builds an understanding of the maker movement in relation to education in programming, by demonstrating challenges and…
A Survey of Techniques for Approximate Computing
Mittal, Sparsh
2016-03-18
Approximate computing trades off computation quality against the effort expended, and as rising performance demands confront plateauing resource budgets, approximate computing has become not merely attractive but imperative. Here, we present a survey of techniques for approximate computing (AC). We discuss strategies for finding approximable program portions and monitoring output quality; techniques for using AC in different processing units (e.g., CPU, GPU and FPGA), processor components, memory technologies, etc.; and programming frameworks for AC. Moreover, we classify these techniques based on several key characteristics to emphasize their similarities and differences. Finally, the aim of this paper is to provide insights to researchers into the working of AC techniques and inspire more efforts in this area to make AC the mainstream computing approach in future systems.
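One classic technique from the approximate-computing literature the survey covers is loop perforation: execute only a fraction of a loop's iterations and rescale the result, trading output quality for compute effort. A minimal sketch with synthetic data:

```python
# Sketch of loop perforation: process only every k-th element, doing roughly
# 1/k of the work, and accept a bounded error in the result. The data below
# is synthetic and illustrative only.
def mean_exact(values):
    return sum(values) / len(values)

def mean_perforated(values, skip=4):
    """Approximate mean using every `skip`-th element."""
    sampled = values[::skip]
    return sum(sampled) / len(sampled)

data = [((i * 37) % 101) / 100.0 for i in range(1_000_000)]
exact = mean_exact(data)
approx = mean_perforated(data, skip=4)
print(f"exact={exact:.4f} approx={approx:.4f} "
      f"relative error={abs(approx - exact) / exact:.2%}")
```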
An overview of the F-117A avionics flight test program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Silz, R.
1992-02-01
This paper is an overview of the history of the F-117A avionics flight test program. System design concepts and equipment selections are explored followed by a review of full scale development and full capability development testing. Flight testing the Weapon System Computational Subsystem upgrade and the Offensive Combat Improvement Program are reviewed. Current flight test programs and future system updates are highlighted.
The UCLA MEDLARS Computer System *
Garvis, Francis J.
1966-01-01
Under a subcontract with UCLA, the Planning Research Corporation has changed the MEDLARS system to make it possible to use the IBM 7094/7040 direct-couple computer instead of the Honeywell 800 for demand searches. The major tasks were the rewriting of the programs in COBOL and the copying of the stored information onto the narrower tapes that IBM computers require. (In the future NLM will copy the tapes for IBM computer users.) The differences in the software required by the two computers are noted. Major and costly revisions would be needed to adapt the large MEDLARS system to the smaller IBM 1401 and 1410 computers. In general, MEDLARS is transferable to other computers of the IBM 7000 class, the new IBM 360, and those of like size, such as the CDC 1604 or UNIVAC 1108, although additional changes are necessary. Potential future improvements are suggested. PMID:5901355
The Technological Evolution in Schools: Reflections and Projections.
ERIC Educational Resources Information Center
Higgins, James E.
1991-01-01
Presents a first-person account of one teacher's experiences with computer hardware and software. The article discusses various programs and applications, such as integrated learning systems, database searching via CD-ROM, desktop publishing, and authoring programs, and it indicates future changes in instruction with the increasing use of technology. (SM)
NASA Technical Reports Server (NTRS)
Trinh, H. P.; Gross, K. W.
1989-01-01
Computational studies have been conducted to examine the capability of a CFD code by simulating the steady-state thrust chamber internal flow. The SSME served as the sample case, and significant parameter profiles are presented and discussed. Performance predictions from TDK, the recommended JANNAF reference computer program, are compared with those from PHOENICS to establish the credibility of its results. The investigation of an overexpanded nozzle flow is particularly addressed since it plays an important role in the area-ratio selection of future rocket engines. Experience gained during this as-yet-incomplete flow separation study and future steps are outlined.
Electro-Optic Computing Architectures: Volume II. Components and System Design and Analysis
1998-02-01
The objective of the Electro-Optic Computing Architecture (EOCA) program was to develop multi-function electro-optic interfaces and optical interconnect units to enhance the performance of parallel processor systems and form the building blocks for future electro-optic computing architectures. Specifically, three multi-function interface modules were targeted for development: an Electro-Optic Interface (EOI), an Optical Interconnection Unit…
Diamond High Assurance Security Program: Trusted Computing Exemplar
2002-09-01
computing component, the Embedded MicroKernel Prototype. A third-party evaluation of the component will be initiated during development (e.g., once…). …target technologies and larger projects is a topic for future research. Trusted Computing Reference Component, the Embedded MicroKernel Prototype: the primary security function of the Embedded MicroKernel will be to enforce process and data-domain separation while providing primitive…
Adoption of computer-assisted learning in medical education: the educators' perspective.
Schifferdecker, Karen E; Berman, Norm B; Fall, Leslie H; Fischer, Martin R
2012-11-01
Computer-assisted learning (CAL) in medical education has been shown to be effective in the achievement of learning outcomes, but requires the input of significant resources and development time. This study examines the key elements and processes that led to the widespread adoption of a CAL program in undergraduate medical education, the Computer-assisted Learning in Paediatrics Program (CLIPP). It then considers the relative importance of elements drawn from existing theories and models for technology adoption and other studies on CAL in medical education to inform the future development, implementation and testing of CAL programs in medical education. The study used a mixed-methods explanatory design. All paediatric clerkship directors (CDs) using CLIPP were recruited to participate in a self-administered, online questionnaire. Semi-structured interviews were then conducted with a random sample of CDs to further explore the quantitative results. Factors that facilitated adoption included CLIPP's ability to fill gaps in exposure to core clinical problems, the use of a national curriculum, development by CDs, and the meeting of CDs' desires to improve teaching and student learning. An additional facilitating factor was that little time and effort were needed to implement CLIPP within a clerkship. The quantitative findings were mostly corroborated by the qualitative findings. This study indicates issues that are important in the consideration and future exploration of the development and implementation of CAL programs in medical education. The promise of CAL as a method of enhancing the process and outcomes of medical education, and its cost, increase the need for future CAL funders and developers to pay equal attention to the needs of potential adopters and the development process as they do to the content and tools in the CAL program. Important questions that remain on the optimal design, use and integration of CAL should be addressed in order to adequately inform future development. Support is needed for studies that address these critical areas.
IPCS implications for future supersonic transport aircraft
NASA Technical Reports Server (NTRS)
Billig, L. O.; Kniat, J.; Schmidt, R. D.
1976-01-01
The Integrated Propulsion Control System (IPCS) demonstrates control of an entire supersonic propulsion module - inlet, engine afterburner, and nozzle - with an HDC 601 digital computer. The program encompasses the design, build, qualification, and flight testing of control modes, software, and hardware. The flight test vehicle is an F-111E airplane. The L.H. inlet and engine will be operated under control of a digital computer mounted in the weapons bay. A general description and the current status of the IPCS program are given.
Mapping the Future Today: The Community College of Baltimore County Geospatial Applications Program
ERIC Educational Resources Information Center
Jeffrey, Scott; Alvarez, Jaime
2010-01-01
The Geospatial Applications Program at the Community College of Baltimore County (CCBC), located five miles west of downtown Baltimore, Maryland, provides comprehensive instruction in geographic information systems (GIS), remote sensing and global positioning systems (GPS). Geospatial techniques, which include computer-based mapping and remote…
ERIC Educational Resources Information Center
Baldwin, Fred D.
2001-01-01
With support from federal grants and area industry, the Alfred State College of Technology in New York's Southern Tier is training future workers for high-skill manufacturing jobs. The college offers certification and associate's degree programs in welding and machine-tool technology and is developing a training program in computer technology.…
NASA Astrophysics Data System (ADS)
Lehman, Donald Clifford
Today's medical laboratories are dealing with cost containment health care policies and unfilled laboratory positions. Because there may be fewer experienced clinical laboratory scientists, students graduating from clinical laboratory science (CLS) programs are expected by their employers to perform accurately in entry-level positions with minimal training. Information in the CLS field is increasing at a dramatic rate, and instructors are expected to teach more content in the same amount of time with the same resources. With this increase in teaching obligations, instructors could use a tool to facilitate grading. The research question was: "Can computer-assisted assessment evaluate students in an accurate and time-efficient way?" A computer program was developed to assess CLS students' ability to evaluate peripheral blood smears. Automated grading permits students to get results quicker and allows the laboratory instructor to devote less time to grading. This computer program could improve instruction by providing more time to students and instructors for other activities. To be valuable, the program should provide the same quality of grading as the instructor. These benefits must outweigh potential problems such as the time necessary to develop and maintain the program, monitoring of student progress by the instructor, and the financial cost of the computer software and hardware. In this study, surveys of students and an interview with the laboratory instructor were performed to provide a formative evaluation of the computer program. In addition, the grading accuracy of the computer program was examined. These results will be used to improve the program for use in future courses.
National Occupational Skill Standards. CADD: Computer Aided Drafting and Design.
ERIC Educational Resources Information Center
National Coalition for Advanced Manufacturing, Washington, DC.
This document identifies computer-aided drafting and design (CADD) skills that companies require of training programs and future employees. The information was developed by two committees of technically knowledgeable CADD users from across the United States and validated by several hundred other CADD users. The skills are aimed at a beginner CADD…
Assessing the Computational Literacy of Elementary Students on a National Level in Korea
ERIC Educational Resources Information Center
Jun, SooJin; Han, SunGwan; Kim, HyeonCheol; Lee, WonGyu
2014-01-01
Information and communication technology (ICT) literacy education has become an important issue, and the necessity of computational literacy (CL) has been increasing in our growing information society. CL is becoming an important element for future talents, and many countries, including the USA, are developing programs for CL education.…
Student Teachers' Perceptions on Educational Technologies' Past, Present and Future
ERIC Educational Resources Information Center
Orhan Goksun, Derya; Filiz, Ozan; Kurt, Adile Askim
2018-01-01
The aim of this study is to reveal the perceptions of Computer Education and Instructional Technologies student teachers, enrolled in a distance teacher education program, of the past, present and future of educational technologies via infographics. In this study, 54 infographics, which were created by student teachers who were enrolled in Special Teaching…
The new landscape of parallel computer architecture
NASA Astrophysics Data System (ADS)
Shalf, John
2007-07-01
The past few years have seen a sea change in computer architecture that will impact every facet of our society, as every electronic device from cell phone to supercomputer will need to confront parallelism of unprecedented scale. Whereas the conventional multicore approach (2, 4, and even 8 cores) adopted by the computing industry will eventually hit a performance plateau, the highest performance per watt and per chip area is achieved using manycore technology (hundreds or even thousands of cores). However, fully unleashing the potential of the manycore approach to ensure future advances in sustained computational performance will require fundamental advances in computer architecture and programming models that are nothing short of reinventing computing. In this paper we examine the reasons behind the movement to exponentially increasing parallelism, and its ramifications for system design, applications and programming models.
Web-based training: a new paradigm in computer-assisted instruction in medicine.
Haag, M; Maylein, L; Leven, F J; Tönshoff, B; Haux, R
1999-01-01
Computer-assisted instruction (CAI) programs based on internet technologies, especially on the world wide web (WWW), provide new opportunities in medical education. The aim of this paper is to examine different aspects of such programs, which we call 'web-based training (WBT) programs', and to differentiate them from conventional CAI programs. First, we distinguish five different interaction types: presentation; browsing; tutorial dialogue; drill and practice; and simulation. In contrast to conventional CAI, there are four architectural types of WBT programs: client-based; remote data and knowledge; distributed teaching; and server-based. We discuss the implications of the different architectures for developing WBT software. WBT programs must meet different requirements from conventional CAI programs. The most important tools and programming languages for developing WBT programs are listed and assigned to the architecture types. For the future, we expect a trend from conventional CAI towards WBT programs.
Training the Future - Swamp Work Activities
2017-07-19
In the Swamp Works laboratory at NASA's Kennedy Space Center in Florida, student interns, from the left, Jeremiah House, Thomas Muller and Austin Langdon are joining agency scientists, contributing in the area of Exploration Research and Technology. House is studying computer/electrical engineering at John Brown University in Siloam Springs, Arkansas. Muller is pursuing a degree in computer engineering and control systems at Florida Tech. Langdon is an electrical engineering major at the University of Kentucky. The agency attracts its future workforce through the NASA Internships, Fellowships and Scholarships, or NIFS, Program.
Merlin - Massively parallel heterogeneous computing
NASA Technical Reports Server (NTRS)
Wittie, Larry; Maples, Creve
1989-01-01
Hardware and software for Merlin, a new kind of massively parallel computing system, are described. Eight computers are linked as a 300-MIPS prototype to develop system software for a larger Merlin network with 16 to 64 nodes, totaling 600 to 3000 MIPS. These working prototypes help refine a mapped reflective memory technique that offers a new, very general way of linking many types of computer to form supercomputers. Processors share data selectively and rapidly on a word-by-word basis. Fast firmware virtual circuits are reconfigured to match topological needs of individual application programs. Merlin's low-latency memory-sharing interfaces solve many problems in the design of high-performance computing systems. The Merlin prototypes are intended to run parallel programs for scientific applications and to determine hardware and software needs for a future Teraflops Merlin network.
The Design and Implementation of NASA's Advanced Flight Computing Module
NASA Technical Reports Server (NTRS)
Alkakaj, Leon; Straedy, Richard; Jarvis, Bruce
1995-01-01
This paper describes a working flight computer Multichip Module developed jointly by JPL and TRW under their respective research programs in a collaborative fashion. The MCM is fabricated by nCHIP and is packaged within a 2 by 4 inch Al package from Coors. This flight computer module is one of three modules under development by NASA's Advanced Flight Computer (AFC) program. Further development of the Mass Memory and the programmable I/O MCM modules will follow. The three building block modules will then be stacked into a 3D MCM configuration. The mass and volume achieved for the flight computer MCM, 89 grams and 1.5 cubic inches respectively, represent a major enabling technology for future deep space as well as commercial remote sensing applications.
A shock wave capability for the improved Two-Dimensional Kinetics (TDK) computer program
NASA Technical Reports Server (NTRS)
Nickerson, G. R.; Dang, L. D.
1984-01-01
The Two Dimensional Kinetics (TDK) computer program is a primary tool in applying the JANNAF liquid rocket engine performance prediction procedures. The purpose of this contract has been to improve the TDK computer program so that it can be applied to advanced rocket engine designs. In particular, future orbit transfer vehicles (OTV) will require rocket engines that operate at high expansion ratio, i.e., in excess of 200:1. Because only a limited length is available in the space shuttle bay, it is possible that OTV nozzles will be designed with both relatively short length and high expansion ratio. In this case, a shock wave may be present in the flow. The TDK computer program was modified to include the simulation of shock waves in the supersonic nozzle flow field. The shocks induced by the wall contour can produce strong perturbations of the flow, affecting downstream conditions that need to be considered for thrust chamber performance calculations.
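The abstract does not reproduce the shock relations added to TDK. As a reference point (the standard perfect-gas normal-shock jump conditions, not necessarily TDK's implementation), the state change across a normal shock at upstream Mach number \(M_1\) is

\[
\frac{p_2}{p_1} = 1 + \frac{2\gamma}{\gamma + 1}\left(M_1^2 - 1\right),
\qquad
M_2^2 = \frac{1 + \frac{\gamma - 1}{2} M_1^2}{\gamma M_1^2 - \frac{\gamma - 1}{2}},
\]

so a shock standing in an overexpanded nozzle raises the static pressure toward ambient and drops the flow to subsonic speed, which is why capturing it matters for thrust chamber performance prediction.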
NASA Technical Reports Server (NTRS)
1973-01-01
Computer specifications for the logistics of orbital vehicle servicing were developed, and a number of alternatives to improve utilization of the space shuttle and the tug were investigated. Preliminary results indicate that space servicing offers a potential for reducing future operational and program costs relative to ground refurbishment of satellites. A computer code that could be developed to simulate space servicing is presented.
NASA Technical Reports Server (NTRS)
Rubbert, P. E.
1978-01-01
The commercial airplane builder's viewpoint on the important issues involved in the development of improved computational aerodynamics tools, such as powerful computers optimized for fluid flow problems, is presented. The primary user of computational aerodynamics in a commercial aircraft company is the design engineer, who is concerned with solving practical engineering problems. From his viewpoint, the development of program interfaces and pre- and post-processing capability for new computational methods is just as important as the algorithms and machine architecture. As more and more details of the entire flow field are computed, the visibility of the output data becomes a major problem, which is then doubled when a design capability is added. The user must be able to see, understand, and interpret the results calculated. Enormous costs are expended because of the need to work with programs having only primitive user interfaces.
An acceptable role for computers in the aircraft design process
NASA Technical Reports Server (NTRS)
Gregory, T. J.; Roberts, L.
1980-01-01
Some of the reasons why the computerization trend is not wholly accepted are explored for two typical cases: computer use in the technical specialties and computer use in aircraft synthesis. The factors that limit acceptance are traced, in part, to the large resources needed to understand the details of computer programs, the inability to include measured data as input to many of the theoretical programs, and the presentation of final results without supporting intermediate answers. Other factors are due solely to technical issues such as limited detail in aircraft synthesis and major simplifying assumptions in the technical specialties. These factors and others can be influenced by the technical specialist and aircraft designer. Some of these factors may become less significant as the computerization process evolves, but some issues, such as understanding large integrated systems, may remain issues in the future. Suggestions for improved acceptance include publishing computer programs so that they may be reviewed, edited, and read. Other mechanisms include extensive modularization of programs and ways to include measured information as part of the input to theoretical approaches.
Noar, Seth M.; Webb, Elizabeth M.; Van Stee, Stephanie K.; Redding, Colleen A.; Feist-Price, Sonja; Crosby, Richard; Troutman, Adewale
2011-01-01
New prevention options are urgently needed for African-Americans in the United States given the disproportionate impact of HIV/AIDS on this group. This combined with recent evidence supporting the efficacy of computer technology-based interventions in HIV prevention led our research group to pursue the development of a computer-delivered individually tailored intervention for heterosexually active African-Americans—the tailored information program for safer sex (TIPSS). In the current article, we discuss the development of the TIPSS program, including (i) the targeted population and behavior, (ii) theoretical basis for the intervention, (iii) design of the intervention, (iv) formative research, (v) technical development and testing and (vi) intervention delivery and ongoing randomized controlled trial. Given the many advantages of computer-based interventions, including low-cost delivery once developed, they offer much promise for the future of HIV prevention among African-Americans and other at-risk groups. PMID:21257676
Graphical User Interface Development for Representing Air Flow Patterns
NASA Technical Reports Server (NTRS)
Chaudhary, Nilika
2004-01-01
In the Turbine Branch, scientists carry out experimental and computational work to advance the efficiency and diminish the noise production of jet engine turbines. One way to do this is by decreasing the heat that the turbine blades receive. Most of the experimental work is carried out by taking a single turbine blade and analyzing the air flow patterns around it, because these data indicate the sections of the turbine blade that are getting too hot. Since the cost of doing turbine blade air flow experiments is very high, researchers try to do computational work that fits the experimental data. The goal of computational fluid dynamics is for scientists to find a numerical way to predict the complex flow patterns around different turbine blades without physically having to perform tests or costly experiments.

When visualizing flow patterns, scientists need a way to represent the flow conditions around a turbine blade. A researcher will assign specific zones that surround the turbine blade. In a two-dimensional view, the zones are usually quadrilaterals. The next step is to assign boundary conditions, which define how the flow enters or exits one side of a zone. What is needed is a way of setting up computational zones and grids, visualizing flow patterns, and storing all the flow conditions in a file on the computer for future computation. Such a program is necessary because the only current method for creating flow pattern graphs is by hand, which is tedious and time-consuming. By using a computer program to create the zones and grids, the graph would be faster to make and easier to edit. Essentially, the user would run a program that is an editable graph: the user could click and drag with the mouse to form various zones and grids, then edit the locations of these grids, add flow and boundary conditions, and finally save the graph for future use and analysis.

My goal this summer is to create a graphical user interface (GUI) that incorporates all of these elements. I am writing the program in Java, a language that is portable among platforms: it can run on different operating systems such as Windows and Unix without having to be rewritten. I had no prior experience of programming in Java at the start of my internship; I am continuously learning as I create the program. I have written the part of the program that enables a user to draw several zones, edit them, and store their locations. The next phase of my project is to allow the user to click on the side of a zone and create a boundary condition for it. A previous intern wrote a program that allows the user to input boundary conditions; I can integrate the two programs to create a larger, more usable program. After that, I will develop a way for the user to save the graph for future reference. Another eventual goal is to make the GUI capable of creating three-dimensional zones as well. Researchers such as my mentor, Dr. David Ashpis, need a quick, user-friendly
NASA Technical Reports Server (NTRS)
Plankey, B.
1981-01-01
A computer program called ECPVER (Energy Consumption Program - Verification) was developed to simulate all energy loads for any number of buildings. The program computes simulated daily, monthly, and yearly energy consumption, which can be compared with actual meter readings for the same time period. Such comparison can lead to validation of the model under a variety of conditions, which allows it to be used to predict future energy savings due to energy conservation measures. Predicted savings can then be compared with actual savings to verify the effectiveness of those energy conservation changes. This verification procedure is planned to be an important advancement in the Deep Space Network Energy Project, which seeks to reduce energy cost and consumption at all DSN Deep Space Stations.
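A minimal sketch of the verification step described above (hypothetical numbers, not the actual DSN program or data): compare simulated monthly consumption against meter readings to validate the model, then compare predicted and measured savings after a conservation measure.

```python
# ECPVER-style verification sketch with invented data: first validate the
# simulation against meter readings, then check a predicted saving against
# what the meters later show.
simulated_kwh = [410, 395, 430, 450, 470, 520]   # model output, kWh per month
metered_kwh   = [402, 401, 441, 447, 465, 531]   # actual meter readings

errors = [abs(s - m) / m for s, m in zip(simulated_kwh, metered_kwh)]
print(f"mean absolute error = {sum(errors) / len(errors):.1%}")

# Once validated, the model forecasts the effect of a conservation measure;
# the forecast is later compared with measured consumption.
predicted_saving_kwh = 0.12 * sum(metered_kwh)   # model predicts a 12% cut
measured_saving_kwh = 0.10 * sum(metered_kwh)    # meters later show a 10% cut
print(f"predicted {predicted_saving_kwh:.0f} kWh, "
      f"measured {measured_saving_kwh:.0f} kWh")
```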
Three Program Architecture for Design Optimization
NASA Technical Reports Server (NTRS)
Miura, Hirokazu; Olson, Lawrence E. (Technical Monitor)
1998-01-01
In this presentation, I review the historical perspective on the program architectures used to build design optimization capabilities based on mathematical programming and other numerical search techniques. It is straightforward to classify the program architectures into the three categories shown above. However, the relative importance of the three approaches has not been static; it changes dynamically as the capabilities of available computational resources increase. For example, we once considered that the direct coupling architecture would never be used for practical problems, but the availability of computer systems such as multi-processors has changed that view. I review the roles of the three architectures from historical as well as current and future perspectives. There may also be some possibility for the emergence of hybrid architectures. I hope to provide some seeds for active discussion of where we are heading in this very dynamic environment for high-speed computing and communication.
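The slide listing the three categories is not reproduced here. As a hedged illustration of one end of the spectrum, direct coupling places the numerical search and the analysis in one process, with the optimizer calling the analysis as a function (the objective below is a stand-in for a real analysis program, and SciPy availability is assumed):

```python
# Direct-coupling sketch: the optimizer invokes the analysis in-process,
# rather than exchanging files with a separately executed analysis program.
from scipy.optimize import minimize

def analysis(x):
    """Stand-in for an engineering analysis: returns the objective value."""
    span, chord = x
    drag_proxy = (span - 3.0) ** 2 + (chord - 0.5) ** 2  # toy quadratic bowl
    return drag_proxy

result = minimize(analysis, x0=[1.0, 1.0], method="Nelder-Mead")
print(result.x)  # converges near the optimum [3.0, 0.5]
```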
Teaching and Program Variations in International Business: Past, Present and Future.
ERIC Educational Resources Information Center
Kaynak, Erdener; Schermerhorn, John R., Jr.
1999-01-01
This introductory article in a theme issue identifies common themes in the included papers, such as the need for more "active learning" and "project-based learning," the use of computer technology to facilitate "virtual teamwork," the importance of support services for these initiatives, and reliance on need-oriented programs and courses in…
Computer-aided injection molding system
NASA Astrophysics Data System (ADS)
Wang, K. K.; Shen, S. F.; Cohen, C.; Hieber, C. A.; Isayev, A. I.
1982-10-01
Achievements are reported in cavity-filling simulation, modeling viscoelastic effects, measuring and predicting frozen-in birefringence in molded parts, measuring residual stresses and associated mechanical properties of molded parts, and developing an interactive mold-assembly design program and an automatic NC machining data generation and verification program. The Cornell Injection Molding Program (CIMP) consortium is discussed, as are computer user manuals that have been published by the consortium. Major tasks which should be addressed in future efforts are listed, including: (1) predict and experimentally determine the post-filling behavior of thermoplastics; (2) simulate and experimentally investigate the injection molding of thermosets and filled materials; and (3) further investigate residual stresses, orientation and mechanical properties.
Caesy: A software tool for computer-aided engineering
NASA Technical Reports Server (NTRS)
Wette, Matt
1993-01-01
A new software tool, Caesy, is described. This tool provides a strongly typed programming environment for research in the development of algorithms and software for computer-aided control system design. A description of the user language and its implementation as they currently stand are presented along with a description of work in progress and areas of future work.
Optical Computing Based on Neuronal Models
1988-05-01
walking, and cognition are far too complex for existing sequential digital computers. Therefore new architectures, hardware, and algorithms modeled...collective behavior, and iterative processing into optical processing and artificial neurodynamical systems. Another intriguing promise of neural nets is...with architectures, implementations, and programming; and material research is called for. Our future research in neurodynamics will continue to
Learning from the Learners: Preparing Future Teachers to Leverage the Benefits of Laptop Computers
ERIC Educational Resources Information Center
Grundmeyer, Trent; Peters, Randal
2016-01-01
Technology is changing the teaching and learning landscape. Teacher preparation programs must produce teachers who have new skills and strategies to leverage the benefits of laptop computers in their classrooms. This study used a phenomenological strategy to explain first-year college students' perceptions of the effects of a 1:1 laptop experience…
Look into the Future: Displaced Clerical Project. Final Report.
ERIC Educational Resources Information Center
Stover, Deborah A.
"Look into the Future" is a program created by a Job Training Partnership Act project and 9to5, Working Women Education Fund, to address the training and retraining needs of office workers in light of the advances in computer and communications systems. This guide describes the model project and suggests steps other organizations can…
CFD Vision 2030 Study: A Path to Revolutionary Computational Aerosciences
NASA Technical Reports Server (NTRS)
Slotnick, Jeffrey; Khodadoust, Abdollah; Alonso, Juan; Darmofal, David; Gropp, William; Lurie, Elizabeth; Mavriplis, Dimitri
2014-01-01
This report documents the results of a study to address the long range, strategic planning required by NASA's Revolutionary Computational Aerosciences (RCA) program in the area of computational fluid dynamics (CFD), including future software and hardware requirements for High Performance Computing (HPC). Specifically, the "Vision 2030" CFD study is to provide a knowledge-based forecast of the future computational capabilities required for turbulent, transitional, and reacting flow simulations across a broad Mach number regime, and to lay the foundation for the development of a future framework and/or environment where physics-based, accurate predictions of complex turbulent flows, including flow separation, can be accomplished routinely and efficiently in cooperation with other physics-based simulations to enable multi-physics analysis and design. Specific technical requirements from the aerospace industrial and scientific communities were obtained to determine critical capability gaps, anticipated technical challenges, and impediments to achieving the target CFD capability in 2030. A preliminary development plan and roadmap were created to help focus investments in technology development to help achieve the CFD vision in 2030.
ERIC Educational Resources Information Center
Congress of the U.S., Washington, DC. House Committee on Science and Technology.
This report considers the current and future impact of technology on schools, solutions to existing problems, and major policy questions concerning computer technology's role in education. Experiences of several universities in integrating computers into their programs are reviewed, as well as those of states and local school districts in…
NASA Technical Reports Server (NTRS)
Nguyen, D. T.; Rogers, J. L., Jr.
1986-01-01
A finite element based programming system for minimum weight design of a truss-type structure subjected to displacement, stress, and lower and upper bounds on design variables is presented. The programming system consists of a number of independent processors, each performing a specific task. These processors, however, are interfaced through a well-organized data base, thus making the tasks of modifying, updating, or expanding the programming system much easier in a friendly environment provided by many inexpensive personal computers. The proposed software can be viewed as an important step in achieving a 'dummy' finite element for optimization. The programming system has been implemented on both large and small computers (such as VAX, CYBER, IBM-PC, and APPLE), although the focus is on the latter. Examples are presented to demonstrate the capabilities of the code. The present programming system can be used stand-alone or as part of a multilevel decomposition procedure to obtain optimum designs for very large scale structural systems. Furthermore, other related research areas, such as developing optimization algorithms (or, at a larger level, a structural synthesis program) for future trends in using parallel computers, may also benefit from this study.
Energy Efficient Engine (E3) controls and accessories detail design report
NASA Technical Reports Server (NTRS)
Beitler, R. S.; Lavash, J. P.
1982-01-01
An Energy Efficient Engine program has been established by NASA to develop technology for improving the energy efficiency of future commercial transport aircraft engines. As part of this program, a new turbofan engine was designed. This report describes the fuel and control system for this engine. The system design is based on many of the proven concepts and component designs used on the General Electric CF6 family of engines. One significant difference is the incorporation of digital electronic computation in place of the hydromechanical computation currently used.
VAPEPS user's reference manual, version 5.0
NASA Technical Reports Server (NTRS)
Park, D. M.
1988-01-01
This is the reference manual for the VibroAcoustic Payload Environment Prediction System (VAPEPS). The system consists of a computer program and a vibroacoustic database. The purpose of the system is to collect measurements of vibroacoustic data taken from flight events and ground tests, and to retrieve this data and provide a means of using the data to predict future payload environments. This manual describes the operating language of the program. Topics covered include database commands, Statistical Energy Analysis (SEA) prediction commands, stress prediction command, and general computational commands.
Review of NASA antiskid braking research
NASA Technical Reports Server (NTRS)
Tanner, J. A.
1982-01-01
NASA antiskid braking system research programs are reviewed. These programs include experimental studies of four antiskid systems on the Langley Landing Loads Track, flight tests with a DC-9 airplane, and computer simulation studies. Results from these research efforts include identification of factors contributing to degraded antiskid performance under adverse weather conditions, tire tread temperature measurements during antiskid braking on dry runway surfaces, and an assessment of the accuracy of various brake pressure-torque computer models. This information should lead to the development of better antiskid systems in the future.
ERIC Educational Resources Information Center
School Science Review, 1983
1983-01-01
Presents background information, laboratory procedures, classroom materials/activities, and chemistry experiments. Topics include sublimation, electronegativity, electrolysis, experimental aspects of strontianite, halide test, evaluation of present and future computer programs in chemistry, formula building, care of glass/saturated calomel…
Can An Evolutionary Process Create English Text?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bailey, David H.
Critics of the conventional theory of biological evolution have asserted that while natural processes might result in some limited diversity, nothing fundamentally new can arise from 'random' evolution. In response, biologists such as Richard Dawkins have demonstrated that a computer program can generate a specific short phrase via evolution-like iterations starting with random gibberish. While such demonstrations are intriguing, they are flawed in that they have a fixed, pre-specified future target, whereas in real biological evolution there is no fixed future target, but only a complicated 'fitness landscape'. In this study, a significantly more sophisticated evolutionary scheme is employed to produce text segments reminiscent of a Charles Dickens novel. The aggregate size of these segments is larger than the computer program and the input Dickens text, even when comparing compressed data (as a measure of information content).
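The Dawkins-style demonstration the abstract contrasts with, evolving random gibberish toward a fixed target phrase, is compact enough to sketch. A minimal Python version with illustrative population size and mutation rate; Bailey's open-ended, target-free scheme is deliberately not reproduced here:

```python
import random

# "Weasel"-style demonstration: selection plus mutation turns random
# gibberish into a fixed target phrase. Parameters are illustrative.

TARGET = "METHINKS IT IS LIKE A WEASEL"
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "
POP_SIZE, MUTATION_RATE = 100, 0.05

def fitness(s):
    return sum(a == b for a, b in zip(s, TARGET))

def mutate(s):
    return "".join(random.choice(ALPHABET) if random.random() < MUTATION_RATE
                   else c for c in s)

parent = "".join(random.choice(ALPHABET) for _ in TARGET)
generation = 0
while parent != TARGET:
    generation += 1
    # Keep the fittest of the parent and its mutated offspring.
    parent = max([parent] + [mutate(parent) for _ in range(POP_SIZE)],
                 key=fitness)
print(f"Reached target in {generation} generations")
```

The fixed TARGET is exactly the flaw the abstract points out: real evolution optimizes against a fitness landscape, not a pre-specified answer.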
Computer problem-solving coaches for introductory physics: Design and usability studies
NASA Astrophysics Data System (ADS)
Ryan, Qing X.; Frodermann, Evan; Heller, Kenneth; Hsu, Leonardo; Mason, Andrew
2016-06-01
The combination of modern computing power, the interactivity of web applications, and the flexibility of object-oriented programming may finally be sufficient to create computer coaches that can help students develop metacognitive problem-solving skills, an important competence in our rapidly changing technological society. However, no matter how effective such coaches might be, they will only be useful if they are attractive to students. We describe the design and testing of a set of web-based computer programs that act as personal coaches to students while they practice solving problems from introductory physics. The coaches are designed to supplement regular human instruction, giving students access to effective forms of practice outside class. We present results from large-scale usability tests of the computer coaches and discuss their implications for future versions of the coaches.
US computer research networks: Current and future
NASA Technical Reports Server (NTRS)
Kratochvil, D.; Sood, D.; Verostko, A.
1989-01-01
During the last decade, NASA LeRC's Communication Program has conducted a series of telecommunications forecasting studies to project trends and requirements and to identify critical telecommunications technologies that must be developed to meet future requirements. The Government Networks Division of Contel Federal Systems has assisted NASA in these studies, and the current study builds upon these earlier efforts. The current major thrust of the NASA Communications Program is aimed at developing the high risk, advanced, communications satellite and terminal technologies required to significantly increase the capacity of future communications systems. Also, major new technological, economic, and social-political events and trends are now shaping the communications industry of the future. Therefore, a re-examination of future telecommunications needs and requirements is necessary to enable NASA to make management decisions in its Communications Program and to ensure the proper technologies and systems are addressed. This study, through a series of Task Orders, is helping NASA define the likely communication service needs and requirements of the future and thereby ensuring that the most appropriate technology developments are pursued.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Collins, W. E.
2004-08-16
Computational Science plays a big role in research and development in mathematics, science, engineering and biomedical disciplines. The Alliance for Computational Science Collaboration (ACSC) has the goal of training African-American and other minority scientists in the computational science field for eventual employment with the Department of Energy (DOE). The involvement of Historically Black Colleges and Universities (HBCU) in the Alliance provides avenues for producing future DOE African-American scientists. Fisk University has been participating in this program through grants from the DOE. The DOE grant supported computational science activities at Fisk University. The research areas included energy related projects, distributed computing, visualization of scientific systems and biomedical computing. Students' involvement in computational science research included undergraduate summer research at Oak Ridge National Lab, on-campus research involving the participation of undergraduates, participation of undergraduate students and faculty members in workshops, and mentoring of students. These activities enhanced research and education in computational science, thereby adding to Fisk University's spectrum of research and educational capabilities. Among the successes of the computational science activities is the acceptance of three undergraduate students to graduate schools with full scholarships beginning fall 2002 (one for a master's degree program and two for doctoral degree programs).
A survey of parallel programming tools
NASA Technical Reports Server (NTRS)
Cheng, Doreen Y.
1991-01-01
This survey examines 39 parallel programming tools. Focus is placed on those tool capabilities needed for parallel scientific programming rather than for general computer science. The tools are classified with current and future needs of the Numerical Aerodynamic Simulator (NAS) in mind: existing and anticipated NAS supercomputers and workstations; operating systems; programming languages; and applications. They are divided into four categories: suggested acquisitions; tools already brought in; tools worth tracking; and tools eliminated from further consideration at this time.
Champion, Katrina E; Newton, Nicola C; Barrett, Emma L; Teesson, Maree
2013-03-01
The use of alcohol and drugs amongst young people is a serious concern and the need for effective prevention is clear. This paper identifies and describes current school-based alcohol and other drug prevention programs facilitated by computers or the Internet. The Cochrane Library, PsycINFO and PubMed databases were searched in March 2012. Additional materials were obtained from reference lists of papers. Studies were included if they described an Internet- or computer-based prevention program for alcohol or other drugs delivered in schools. Twelve trials of 10 programs were identified. Seven trials evaluated Internet-based programs and five delivered an intervention via CD-ROM. The interventions targeted alcohol, cannabis and tobacco. Data to calculate effect size and odds ratios were unavailable for three programs. Of the seven programs with available data, six achieved reductions in alcohol, cannabis or tobacco use at post intervention and/or follow up. Two interventions were associated with decreased intentions to use tobacco, and two significantly increased alcohol and drug-related knowledge. This is the first study to review the efficacy of school-based drug and alcohol prevention programs delivered online or via computers. Findings indicate that existing computer- and Internet-based prevention programs in schools have the potential to reduce alcohol and other drug use as well as intentions to use substances in the future. These findings, together with the implementation advantages and high fidelity associated with new technology, suggest that programs facilitated by computers and the Internet offer a promising delivery method for school-based prevention.
Stanczyk, Nicola Esther; Crutzen, Rik; Bolman, Catherine; Muris, Jean; de Vries, Hein
2013-02-06
Smoking tobacco is one of the most preventable causes of illness and death. Web-based tailored smoking cessation interventions have shown to be effective. Although these interventions have the potential to reach a large number of smokers, they often face high attrition rates, especially among lower educated smokers. A possible reason for the high attrition rates in the latter group is that computer-tailored smoking cessation interventions may not be attractive enough as they are mainly text-based. Video-based messages might be more effective in attracting attention and stimulating comprehension in people with a lower educational level and could therefore reduce attrition rates. The objective of the present study was to investigate whether differences exist in message-processing mechanisms (attention, comprehension, self-reference, appreciation, processing) and future adherence (intention to visit/use the website again, recommend the website to others), according to delivery strategy (video or text based messages) and educational level, to a Dutch computer-tailored smoking cessation program. Smokers who were motivated to quit within the following 6 months and who were aged over 16 were included in the program. Participants were randomly assigned to one of two conditions (video/text CT). The sample was stratified into 2 categories: lower and higher educated participants. In total, 139 participants completed the first session of the web-based tailored intervention and were subsequently asked to fill out a questionnaire assessing message-processing mechanisms and future adherence. ANOVAs and regression analyses were conducted to investigate the differences in message-processing mechanisms and future adherence with regard to delivery strategy and education. No interaction effects were found between delivery strategy (video vs text) and educational level on message-processing mechanisms and future adherence. Delivery strategy had no effect on future adherence and processing mechanisms. However, in both groups results indicated that lower educated participants showed higher attention (F(1,138)=3.97; P=.05) and processing levels (F(1,138)=4.58; P=.04). Results revealed also that lower educated participants were more inclined to visit the computer-tailored intervention website again (F(1,138)=4.43; P=.04). Computer-tailored programs have the potential to positively influence lower educated groups as they might be more involved in the computer-tailored intervention than higher educated smokers. Longitudinal studies with a larger sample are needed to gain more insight into the role of delivery strategy in tailored information and to investigate whether the intention to visit the intervention website again results in the ultimate goal of behavior change. Netherlands Trial Register (NTR3102).
ERIC Educational Resources Information Center
California Community Colleges, Sacramento. Office of the Chancellor.
This is the sixth report on the status and progress of the Telecommunications and Technology Infrastructure Program (TTIP), submitted by the California Community Colleges. In California, familiarity with and use of computers is fundamental to economic success. California is home to many of the major companies involved in creating the future of the…
CFD in design - A government perspective
NASA Technical Reports Server (NTRS)
Kutler, Paul; Gross, Anthony R.
1989-01-01
Some of the research programs involving the use of CFD in the aerodynamic design process at government laboratories around the United States are presented. Technology transfer issues and future directions in the discipline of CFD are addressed. The major challenges in the aerosciences, as well as in other disciplines that will require high-performance computing resources such as massively parallel computers, are examined.
ERIC Educational Resources Information Center
Kurland, D. Midian, Ed.
The five papers in this symposium contribute to a dialog on the aims and methods of computer education, and indicate directions future research must take if necessary information is to be available to make informed decisions about the use of computers in schools. The first two papers address the question of what is required for a student to become…
Computer modeling and simulators as part of university training for NPP operating personnel
NASA Astrophysics Data System (ADS)
Volman, M.
2017-01-01
This paper considers aspects of a program for training future nuclear power plant personnel developed by the NPP Department of Ivanovo State Power Engineering University. Computer modeling is used for numerical experiments on the kinetics of nuclear reactors in Mathcad. Simulation modeling is carried out on computer-based and full-scale simulators of a water-cooled power reactor to simulate neutron-physical reactor measurements and the start-up and shutdown processes.
NASA HPCC Technology for Aerospace Analysis and Design
NASA Technical Reports Server (NTRS)
Schulbach, Catherine H.
1999-01-01
The Computational Aerosciences (CAS) Project is part of NASA's High Performance Computing and Communications Program. Its primary goal is to accelerate the availability of high-performance computing technology to the US aerospace community-thus providing the US aerospace community with key tools necessary to reduce design cycle times and increase fidelity in order to improve safety, efficiency and capability of future aerospace vehicles. A complementary goal is to hasten the emergence of a viable commercial market within the aerospace community for the advantage of the domestic computer hardware and software industry. The CAS Project selects representative aerospace problems (especially design) and uses them to focus efforts on advancing aerospace algorithms and applications, systems software, and computing machinery to demonstrate vast improvements in system performance and capability over the life of the program. Recent demonstrations have served to assess the benefits of possible performance improvements while reducing the risk of adopting high-performance computing technology. This talk will discuss past accomplishments in providing technology to the aerospace community, present efforts, and future goals. For example, the times to do full combustor and compressor simulations (of aircraft engines) have been reduced by factors of 320:1 and 400:1 respectively. While this has enabled new capabilities in engine simulation, the goal of an overnight, dynamic, multi-disciplinary, 3-dimensional simulation of an aircraft engine is still years away and will require new generations of high-end technology.
Wright-Berryman, Jennifer L; Salyers, Michelle P; O'Halloran, James P; Kemp, Aaron S; Mueser, Kim T; Diazoni, Amanda J
2013-12-01
To explore mental health consumer and provider responses to a computerized version of the Illness Management and Recovery (IMR) program. Semistructured interviews were conducted to gather data from 6 providers and 12 consumers who participated in a computerized prototype of the IMR program. An inductive-consensus-based approach was used to analyze the interview responses. Qualitative analysis revealed consumers perceived various personal benefits and ease of use afforded by the new technology platform. Consumers also highly valued provider assistance and offered several suggestions to improve the program. The largest perceived barriers to future implementation were lack of computer skills and access to computers. Similarly, IMR providers commented on its ease and convenience, and the reduction of time intensive material preparation. Providers also expressed that the use of technology creates more options for the consumer to access treatment. The technology was acceptable, easy to use, and well-liked by consumers and providers. Clinician assistance with technology was viewed as helpful to get clients started with the program, as lack of computer skills and access to computers was a concern. Access to materials between sessions appears to be desired; however, given perceived barriers of computer skills and computer access, additional supports may be needed for consumers to achieve full benefits of a computerized version of IMR.
Software for Testing Electroactive Structural Components
NASA Technical Reports Server (NTRS)
Moses, Robert W.; Fox, Robert L.; Dimery, Archie D.; Bryant, Robert G.; Shams, Qamar
2003-01-01
A computer program generates a graphical user interface that, in combination with its other features, facilitates the acquisition and preprocessing of experimental data on the strain response, hysteresis, and power consumption of a multilayer composite-material structural component containing one or more built-in sensor(s) and/or actuator(s) based on piezoelectric materials. This program runs in conjunction with LabVIEW software in a computer-controlled instrumentation system. For a test, a specimen is instrumented with applied-voltage and current sensors and with strain gauges. Once the computational connection to the test setup has been made via the LabVIEW software, this program causes the test instrumentation to step through specified configurations. If the user is satisfied with the test results as displayed by the software, the user activates an icon on a front-panel display, causing the raw current, voltage, and strain data to be digitized and saved. The data are also put into a spreadsheet and can be plotted on a graph. Graphical displays are saved in an image file for future reference. The program also computes and displays the power and the phase angle between voltage and current.
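The power and phase-angle computation at the end is standard signal processing on the digitized records. A hedged Python sketch, with synthetic waveforms and a made-up 1 kHz drive standing in for real test data (the program's actual LabVIEW pipeline is not reproduced):

```python
import numpy as np

# Estimate voltage-current phase angle and real power from digitized
# records. Sample rate, drive frequency, and amplitudes are synthetic.

fs, f0, n = 102_400, 1_000.0, 4096             # sample rate, drive freq, samples
t = np.arange(n) / fs
volts = 120.0 * np.sin(2 * np.pi * f0 * t)                  # applied voltage
amps = 0.8 * np.sin(2 * np.pi * f0 * t - np.deg2rad(35.0))  # lagging current

k = int(round(f0 * n / fs))                    # FFT bin of the fundamental
V, I = np.fft.rfft(volts)[k], np.fft.rfft(amps)[k]
phase_deg = np.degrees(np.angle(V) - np.angle(I))
real_power = np.mean(volts * amps)             # average instantaneous power

print(f"phase angle: {phase_deg:.1f} deg, real power: {real_power:.1f} W")
```

For a piezoelectric actuator the phase angle is the quantity of interest, since it separates reactive drive power from the real power actually dissipated.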
Applications of genetic programming in cancer research.
Worzel, William P; Yu, Jianjun; Almal, Arpit A; Chinnaiyan, Arul M
2009-02-01
The theory of Darwinian evolution is a fundamental keystone of modern biology. Late in the last century, computer scientists began adapting its principles, in particular natural selection, to complex computational challenges, leading to the emergence of evolutionary algorithms. The conceptual model of selective pressure and recombination in evolutionary algorithms allows scientists to efficiently search high dimensional space for solutions to complex problems. In the last decade, genetic programming has been developed and extensively applied for analysis of molecular data to classify cancer subtypes and characterize the mechanisms of cancer pathogenesis and development. This article reviews current successes using genetic programming and discusses its potential impact in cancer research and treatment in the near future.
On Undecidability Aspects of Resilient Computations and Implications to Exascale
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rao, Nageswara S
2014-01-01
Future Exascale computing systems with a large number of processors, memory elements and interconnection links are expected to experience multiple, complex faults, which affect both applications and operating-runtime systems. A variety of algorithms, frameworks and tools are being proposed to realize and/or verify the resilience properties of computations that guarantee correct results on failure-prone computing systems. We analytically show that certain resilient computation problems in the presence of general classes of faults are undecidable, that is, no algorithms exist for solving them. We first show that the membership verification in a generic set of resilient computations is undecidable. We describe classes of faults that can create infinite loops or non-halting computations, whose detection in general is undecidable. We then show certain resilient computation problems to be undecidable by using reductions from the loop detection and halting problems under two formulations, namely, an abstract programming language and Turing machines, respectively. These two reductions highlight different failure effects: the former represents program and data corruption, and the latter illustrates incorrect program execution. These results call for broad-based, well-characterized resilience approaches that complement purely computational solutions using methods such as hardware monitors, co-designs, and system- and application-specific diagnosis codes.
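The halting-problem reduction follows the classic pattern. A hedged sketch in LaTeX, with notation of my own choosing rather than the paper's:

```latex
% Sketch of the Turing-machine reduction; notation is illustrative.
Let $R = \{\langle P\rangle : P \text{ halts with the correct result under
every fault pattern in class } F\}$.
Given a Turing machine $M$ and input $w$, construct a program $P_{M,w}$
that first simulates $M$ on $w$ and then emits the correct result.
If $M$ halts on $w$, then $P_{M,w}$ produces its output; if $M$ loops,
then $P_{M,w}$ is a non-halting computation and hence not resilient.
Thus $\langle P_{M,w}\rangle \in R \iff M$ halts on $w$, so a decider
for $R$ would decide the halting problem, a contradiction.
```

The abstract's other reduction, from loop detection in an abstract programming language, works the same way but models faults that corrupt program and data rather than execution.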
Specialized computer architectures for computational aerodynamics
NASA Technical Reports Server (NTRS)
Stevenson, D. K.
1978-01-01
In recent years, computational fluid dynamics has made significant progress in modelling aerodynamic phenomena. Currently, one of the major barriers to future development lies in the compute-intensive nature of the numerical formulations and the relative high cost of performing these computations on commercially available general purpose computers, a cost high with respect to dollar expenditure and/or elapsed time. Today's computing technology will support a program designed to create specialized computing facilities to be dedicated to the important problems of computational aerodynamics. One of the still unresolved questions is the organization of the computing components in such a facility. The characteristics of fluid dynamic problems which will have significant impact on the choice of computer architecture for a specialized facility are reviewed.
Portable Computer Technology (PCT) Research and Development Program Phase 2
NASA Technical Reports Server (NTRS)
Castillo, Michael; McGuire, Kenyon; Sorgi, Alan
1995-01-01
This project report focused on: (1) Design and development of two Advanced Portable Workstation 2 (APW 2) units. These units incorporate advanced technology features such as a low power Pentium processor, a high resolution color display, National Television Standards Committee (NTSC) video handling capabilities, a Personal Computer Memory Card International Association (PCMCIA) interface, and Small Computer System Interface (SCSI) and ethernet interfaces. (2) Use of these units to integrate and demonstrate advanced wireless network and portable video capabilities. (3) Qualification of the APW 2 systems for use in specific experiments aboard the Mir Space Station. A major objective of the PCT Phase 2 program was to help guide future choices in computing platforms and techniques for meeting National Aeronautics and Space Administration (NASA) mission objectives. The focus was on the development of optimal configurations of computing hardware, software applications, and network technologies for use on NASA missions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnstad, H.
The purpose of this meeting is to discuss the current and future HEP computing support and environments from the perspective of new horizons in accelerator, physics, and computing technologies. Topics of interest to the Meeting include (but are not limited to): the forming of the HEPLIB world user group for High Energy Physics computing; mandate, desirables, coordination, organization, funding; user experience, international collaboration; the roles of national labs, universities, and industry; range of software, Monte Carlo, mathematics, physics, interactive analysis, text processors, editors, graphics, data base systems, code management tools; program libraries, frequency of updates, distribution; distributed and interactive computing, data base systems, user interface, UNIX operating systems, networking, compilers, Xlib, X-Graphics; documentation, updates, availability, distribution; code management in large collaborations, keeping track of program versions; and quality assurance, testing, conventions, standards.
The Argonne Leadership Computing Facility 2010 annual report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Drugan, C.
Researchers found more ways than ever to conduct transformative science at the Argonne Leadership Computing Facility (ALCF) in 2010. Both familiar initiatives and innovative new programs at the ALCF are now serving a growing, global user community with a wide range of computing needs. The Department of Energy's (DOE) INCITE Program remained vital in providing scientists with major allocations of leadership-class computing resources at the ALCF. For calendar year 2011, 35 projects were awarded 732 million supercomputer processor-hours for computationally intensive, large-scale research projects with the potential to significantly advance key areas in science and engineering. Argonne also continued to provide Director's Discretionary allocations - 'start up' awards - for potential future INCITE projects. And DOE's new ASCR Leadership Computing (ALCC) Program allocated resources to 10 ALCF projects, with an emphasis on high-risk, high-payoff simulations directly related to the Department's energy mission, national emergencies, or for broadening the research community capable of using leadership computing resources. While delivering more science today, we've also been laying a solid foundation for high performance computing in the future. After a successful DOE Lehman review, a contract was signed to deliver Mira, the next-generation Blue Gene/Q system, to the ALCF in 2012. The ALCF is working with the 16 projects that were selected for the Early Science Program (ESP) to enable them to be productive as soon as Mira is operational. Preproduction access to Mira will enable ESP projects to adapt their codes to its architecture and collaborate with ALCF staff in shaking down the new system. We expect the 10-petaflops system to stoke economic growth and improve U.S. competitiveness in key areas such as advancing clean energy and addressing global climate change. Ultimately, we envision Mira as a stepping-stone to exascale-class computers that will be faster than petascale-class computers by a factor of a thousand. Pete Beckman, who served as the ALCF's Director for the past few years, has been named director of the newly created Exascale Technology and Computing Institute (ETCi). The institute will focus on developing exascale computing to extend scientific discovery and solve critical science and engineering problems. Just as Pete's leadership propelled the ALCF to great success, we know that ETCi will benefit immensely from his expertise and experience. Without question, the future of supercomputing is certainly in good hands. I would like to thank Pete for all his effort over the past two years, during which he oversaw the establishment of ALCF2, the deployment of the Magellan project, and increases in utilization, availability, and number of projects using ALCF1. He managed the rapid growth of ALCF staff and made the facility what it is today. All the staff and users are better for Pete's efforts.
CUDAEASY - a GPU accelerated cosmological lattice program
NASA Astrophysics Data System (ADS)
Sainio, J.
2010-05-01
This paper presents, to the author's knowledge, the first graphics processing unit (GPU) accelerated program that solves the evolution of interacting scalar fields in an expanding universe. We present the implementation in NVIDIA's Compute Unified Device Architecture (CUDA) and compare the performance to other similar programs in chaotic inflation models. We report speedups between one and two orders of magnitude depending on the hardware and software used, while achieving small errors in single precision. Simulations that used to take roughly one day to compute can now be done in hours, and this difference is expected to increase in the future. The program has been written in the spirit of LATTICEEASY and users of the aforementioned program should find it relatively easy to start using CUDAEASY in lattice simulations. The program is available at http://www.physics.utu.fi/theory/particlecosmology/cudaeasy/ under the GNU General Public License.
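The core numerical kernel in such codes is a field update on a periodic lattice. A minimal CPU-side NumPy sketch of the same kind of step, for one scalar field with a quartic potential in a rigidly prescribed expanding background (grid size, time step, and couplings are illustrative, and this is not CUDAEASY's actual scheme):

```python
import numpy as np

# Evolve a scalar field phi on a periodic 3D lattice in an expanding
# background a(t): pi' = lap(phi)/a^2 - 3*H*pi - lam*phi^3, phi' = pi,
# for the potential V = lam*phi^4/4. All parameters are illustrative.

N, dx, dt, lam = 32, 1.0, 0.01, 1e-4
rng = np.random.default_rng(0)
phi = 1.0 + rng.normal(0.0, 1e-3, (N, N, N))   # homogeneous field + noise
pi = np.zeros_like(phi)                        # conjugate momentum

def laplacian(f):
    return sum(np.roll(f, s, axis=ax) for ax in range(3) for s in (-1, 1)) - 6.0 * f

a, H = 1.0, 1e-2                               # scale factor, Hubble rate
for _ in range(500):
    pi += dt * (laplacian(phi) / (dx * a) ** 2 - 3.0 * H * pi - lam * phi ** 3)
    phi += dt * pi
    a *= 1.0 + H * dt                          # prescribed expansion

print("mean field:", phi.mean())
```

The GPU win reported above comes from exactly this structure: each lattice site's update touches only nearest neighbors, which maps cleanly onto thousands of CUDA threads.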
Bellwether Social Studies Programs.
ERIC Educational Resources Information Center
Daetz, Denney
1985-01-01
Describes and reviews commercially-available computer software for social studies (SS). They are: "Jury Trial II" (utilizes artificial intelligence); "Africa" (utilizes creative graphics to teach SS facts); "Revolutions: Past, Present and Future"; "The Other Side" (examines world peace using values…
2003-04-01
interface. These results can be incorporated in military training programs where computer games are part of the curriculum. Future military game development can also utilize these results to determine which type of instructional material should be included in the games.
Airfoil Vibration Dampers program
NASA Technical Reports Server (NTRS)
Cook, Robert M.
1991-01-01
The Airfoil Vibration Damper program has consisted of an analysis phase and a testing phase. During the analysis phase, a state-of-the-art computer code was developed, which can be used to guide designers in the placement and sizing of friction dampers. The use of this computer code was demonstrated by performing representative analyses on turbine blades from the High Pressure Oxidizer Turbopump (HPOTP) and High Pressure Fuel Turbopump (HPFTP) of the Space Shuttle Main Engine (SSME). The testing phase of the program consisted of performing friction damping tests on two different cantilever beams. Data from these tests provided an empirical check on the accuracy of the computer code developed in the analysis phase. Results of the analysis and testing showed that the computer code can accurately predict the performance of friction dampers. In addition, a valuable set of friction damping data was generated, which can be used to aid in the design of friction dampers, as well as provide benchmark test cases for future code developers.
Exploring Asynchronous Many-Task Runtime Systems toward Extreme Scales
DOE Office of Scientific and Technical Information (OSTI.GOV)
Knight, Samuel; Baker, Gavin Matthew; Gamell, Marc
2015-10-01
Major exascale computing reports indicate a number of software challenges to meet the dramatic change of system architectures in the near future. While a several-orders-of-magnitude increase in parallelism is the most commonly cited of those, hurdles also include performance heterogeneity of compute nodes across the system, increased imbalance between computational capacity and I/O capabilities, frequent system interrupts, and complex hardware architectures. Asynchronous task-parallel programming models show great promise in addressing these issues, but are not yet fully understood nor developed sufficiently for computational science and engineering application codes. We address these knowledge gaps through quantitative and qualitative exploration of leading candidate solutions in the context of engineering applications at Sandia. In this poster, we evaluate the MiniAero code ported to three leading candidate programming models (Charm++, Legion and UINTAH) to examine how feasibly these models permit insertion of new programming model elements into an existing code base.
NASA Astrophysics Data System (ADS)
Smeekens, M.; Baru, C.; Keller, G. R.; Arrowsmith, R.; Crosby, C. J.
2009-12-01
The Cyberinfrastructure Summer Institute for Geoscientists (CSIG) has been conducted each year since 2004 under sponsorship of the GEON project that is funded by the NSF. The goal of the institute, which is broadly advertised to the Geoscience community, is to introduce geoscientists to Computer Science concepts and commonly-used as well as emergent information technology tools. The week-long program originally covered topics ranging from Data Modeling, Web Services, and Geographic Information Systems, to brief introductions to key concepts in Grid Computing, Parallel Programming, and Scientific Workflows. However, the program as well as the composition and expectations of the audience have evolved over time. Detailed course and instructor evaluations provide valuable feedback on course content and presentation approaches, and are used to plan future CSIG curriculum. From an initial emphasis on Geoscience graduate students and postdocs, the selection process has evolved to encourage participation by individuals with backgrounds in Geoscience as well as Computer Science from academia, government agencies, and industry. More recently, there has been an emphasis on selecting junior faculty and those interested in teaching Geoinformatics courses. While the initial objective of CSIG was to provide an overview of information technology topics via lectures and demonstrations, over time attendees have become more interested in specific instruction in how informatics and cyberinfrastructure (CI) capabilities could be utilized to address issues in Earth Science research and education. There have been requests over the years for more in-depth coverage of some topics and hands-on exercises. The program has now evolved to include a “Build Track”, focused on IT issues related to the development and implementation of Geoinformatics systems, and an “Education Track”, focused on use of Geoinformatics resources in education. With increasing awareness of CI projects, the audience is also becoming more interested in an introduction to the broader landscape of CI activities in the Geosciences and related areas. In the future, we plan a “demo” session to showcase various CI projects. Attendees will not only hear about such projects but will be able to use and experience the cyber-environments and tools in a hands-on session. The evolution of the CSIG program reflects major changes in the IT landscape since 2004. Where we once discussed Grid Computing, students are now learning about Cloud Computing and related concepts. An institute like CSIG plays an important role in providing “cross-training” such that geoscientists gain insight into IT issues and solution approaches, while computer scientists gain a better appreciation of the needs and requirements of geoscience applications. In this presentation, we will summarize and analyze the trends over the years in the program as well as audience composition; discuss lessons learnt over the years; and present our plan for future CSIG offerings.
ERIC Educational Resources Information Center
1991
Narrated by actor Kadeem Hardison, this documentary videotape presents arguments and examples for using Computer Assisted Instruction (CAI) in today's classroom. Experts in education examine how individuals currently use technology and suggest how people can use technology better in the future to augment and improve education. Many programs are…
Evaluation of an F100 multivariable control using a real-time engine simulation
NASA Technical Reports Server (NTRS)
Szuch, J. R.; Skira, C.; Soeder, J. F.
1977-01-01
A multivariable control design for the F100 turbofan engine was evaluated, as part of the F100 multivariable control synthesis (MVCS) program. The evaluation utilized a real-time, hybrid computer simulation of the engine and a digital computer implementation of the control. Significant results of the evaluation are presented and recommendations concerning future engine testing of the control are made.
Computer modeling of human decision making
NASA Technical Reports Server (NTRS)
Gevarter, William B.
1991-01-01
Models of human decision making are reviewed. Models which treat just the cognitive aspects of human behavior are included, as well as models which include motivation. Both models which have associated computer programs and those that do not are considered. Since flow diagrams that assist in constructing computer simulations of such models were not generally available, such diagrams were constructed and are presented. The result provides a rich source of information which can aid in the construction of more realistic future simulations of human decision making.
Training the Future - Interns Harvesting & Testing Plant Experim
2017-07-19
In the Space Life Sciences Laboratory at NASA's Kennedy Space Center in Florida, student interns such as Ayla Grandpre are joining agency scientists, contributing in the area of plant growth research for food production in space. Grandpre is majoring in computer science and chemistry at Rocky Mountain College in Billings, Montana. The agency attracts its future workforce through the NASA Internship, Fellowships and Scholarships, or NIFS, Program.
Girls in computer science: A female only introduction class in high school
NASA Astrophysics Data System (ADS)
Drobnis, Ann W.
This study examined the impact of an all girls' classroom environment in a high school introductory computer science class on the students' attitudes towards computer science and their thoughts on future involvement with computer science. It was determined that an all girls' introductory class could impact the declining female enrollment and female students' efficacy towards computer science. This research was conducted in a summer school program through a regional magnet school for science and technology which these students attend during the school year. Three different groupings of students were examined for the research: female students in an all girls' class, female students in mixed-gender classes and male students in mixed-gender classes. A survey, Attitudes about Computers and Computer Science (ACCS), was designed to obtain an understanding of the students' thoughts, preconceptions, attitude, knowledge of computer science, and future intentions around computer science, both in education and career. Students in all three groups were administered the ACCS prior to taking the class and upon completion of the class. In addition, students in the all girls' class wrote in a journal throughout the course, and some of those students were also interviewed upon completion of the course. The data was analyzed using quantitative and qualitative techniques. While there were no major differences found in the quantitative data, it was determined that girls in the all girls' class were truly excited by what they had learned and were more open to the idea of computer science being a part of their future.
Hukerikar, Saurabh; Teranishi, Keita; Diniz, Pedro C.; ...
2017-02-11
In the presence of accelerated fault rates, which are projected to be the norm on future exascale systems, it will become increasingly difficult for high-performance computing (HPC) applications to accomplish useful computation. Due to the fault-oblivious nature of current HPC programming paradigms and execution environments, HPC applications are insufficiently equipped to deal with errors. We believe that HPC applications should be enabled with capabilities to actively search for and correct errors in their computations. The redundant multithreading (RMT) approach offers lightweight replicated execution streams of program instructions within the context of a single application process. Furthermore, the use of complete redundancy incurs significant overhead to the application performance.
Mobilize Your instruction Program with Wireless Technology.
ERIC Educational Resources Information Center
Mathias, Molly Susan; Heser, Steven
2002-01-01
Describes the use of wireless technology for library bibliographic instruction at the Milwaukee Area Technical College. Highlights include a wireless mobile cart that holds laptop computers; faculty support; future plans; and recommendations, including investigating technology infrastructure and marketing. (LRW)
NASA Tech Briefs, February 1989. Volume 13, No. 2
NASA Technical Reports Server (NTRS)
1989-01-01
This issue contains a special feature on shaping the future with ceramics. Other topics include: Electronic Components & Circuits, Electronic Systems, Physical Sciences, Materials, Computer Programs, Mechanics, Machinery, Fabrication Technology, Mathematics and Information Sciences, and Life Sciences.
Bacteria as computers making computers
Danchin, Antoine
2009-01-01
Various efforts to integrate biological knowledge into networks of interactions have produced a lively microbial systems biology. Putting molecular biology and computer sciences in perspective, we review another trend in systems biology, in which recursivity and information replace the usual concepts of differential equations, feedback and feedforward loops and the like. Noting that the processes of gene expression separate the genome from the cell machinery, we analyse the role of the separation between machine and program in computers. However, computers do not make computers. For cells to make cells requires a specific organization of the genetic program, which we investigate using available knowledge. Microbial genomes are organized into a paleome (the name emphasizes the role of the corresponding functions from the time of the origin of life), comprising a constructor and a replicator, and a cenome (emphasizing community-relevant genes), made up of genes that permit life in a particular context. The cell duplication process supposes rejuvenation of the machine and replication of the program. The paleome also possesses genes that enable information to accumulate in a ratchet-like process down the generations. Systems biology must include the dynamics of information creation in its future developments.
A statistical view of FMRFamide neuropeptide diversity.
Espinoza, E; Carrigan, M; Thomas, S G; Shaw, G; Edison, A S
2000-01-01
FMRFamide-like peptide (FLP) amino acid sequences have been collected and statistically analyzed. FLP amino acid composition as a function of position in the peptide is graphically presented for several major phyla. Results of total amino acid composition and frequencies of pairs of FLP amino acids have been computed and compared with corresponding values from the entire GenBank protein sequence database. The data for pairwise distributions of amino acids should help in future structure-function studies of FLPs. To aid in future peptide discovery, a computer program and search protocol were developed to identify FLPs from the GenBank protein database without the use of keywords.
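A keyword-free search of the kind described can be approximated with a sequence-level motif scan over precursor proteins. A hedged Python sketch; the simplified motif (mature peptide ending F-[MLI]-R-F, followed by an amidation glycine and a dibasic cleavage site) and the toy precursor are illustrative, not the paper's actual protocol:

```python
import re

# Scan a precursor sequence for FMRFamide-like peptide (FLP) signals:
# ...F[MLI]RF followed by Gly (amide donor) and a dibasic KR/KK/RR site.

FLP_MOTIF = re.compile(r"[ACDEFGHIKLMNPQRSTVWY]{2,8}F[MLI]RF(?=G[KR][KR])")

precursor = (  # toy precursor, not a real GenBank entry
    "MSSLKSIVLLALLAVASAEEAEKRAPNFLRFGKRSDPNFLRFGKRNDPNFMRFGKKAYSDLE"
)

for match in FLP_MOTIF.finditer(precursor):
    print("candidate FLP:", match.group() + "-NH2 (predicted amide)")
```

Using a lookahead for the processing signal keeps the reported peptide itself clean while still requiring the amidation/cleavage context.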
The Matter Simulation (R)evolution
2018-01-01
To date, the program for the development of methods and models for atomistic and continuum simulation directed toward chemicals and materials has reached an incredible degree of sophistication and maturity. Currently, one can witness an increasingly rapid emergence of advances in computing, artificial intelligence, and robotics. This drives us to consider the future of computer simulation of matter from the molecular to the human length and time scales in a radical way that deliberately dares to go beyond the foreseeable next steps in any given discipline. This perspective article presents a view on this future development that we believe is likely to become a reality during our lifetime.
NASA Technical Reports Server (NTRS)
Fishbach, L. H.
1979-01-01
The paper describes the computational techniques employed to determine the optimal propulsion systems for future aircraft applications and to identify system tradeoffs and technology requirements. The computer programs used to perform calculations for all the factors that enter into the selection of optimum combinations of airplanes and engines are examined. Attention is given to the description of the computer codes, including NNEP, WATE, LIFCYC, INSTAL, and POD DRG. A process is illustrated by which turbine engines can be evaluated for fuel consumption, engine weight, cost, and installation effects. Examples show the benefits of variable geometry and the tradeoff between fuel burned and engine weight. Future plans for further improvements in the analytical modeling of engine systems are also described.
Diller, David J; Swanson, Jon; Bayden, Alexander S; Jarosinski, Mark; Audie, Joseph
2015-01-01
Peptides provide promising templates for developing drugs to occupy a middle space between small molecules and antibodies and for targeting 'undruggable' intracellular protein-protein interactions. Importantly, rational or in cerebro design, especially when coupled with validated in silico tools, can be used to efficiently explore chemical space and identify islands of 'drug-like' peptides to satisfy diverse drug discovery program objectives. Here, we consider the underlying principles of and recent advances in rational, computer-enabled peptide drug design. In particular, we consider the impact of basic physicochemical properties, potency and ADME/Tox opportunities and challenges, and recently developed computational tools for enabling rational peptide drug design. Key principles and practices are spotlighted by recent case studies. We close with a hypothetical future case study.
Operations analysis (study 2.1): Program manual and users guide for the LOVES computer code
NASA Technical Reports Server (NTRS)
Wray, S. T., Jr.
1975-01-01
The information necessary to use the LOVES Computer Program in its existing state, or to modify the program to include studies not properly handled by the basic model, is provided. The Users Guide defines the basic elements assembled together to form the model for servicing satellites in orbit. Because the program is a simulation, the method of attack is to disassemble the problem into a sequence of events, each occurring instantaneously and each creating one or more other events in the future. The main driving force of the simulation is the deterministic launch schedule of satellites and the subsequent failure of the various modules which make up the satellites. The LOVES Computer Program uses a random number generator to simulate the failure of module elements and therefore operates over a long time span, typically 10 to 15 years. The sequence of events is varied by making several runs in succession with different random numbers, resulting in a Monte Carlo technique for determining the minimum, average, and maximum values of statistical parameters.
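The event-driven Monte Carlo structure described above can be illustrated with a minimal Python sketch; this is not the LOVES code itself, and the module count, exponential lifetimes, servicing rule, and horizon are illustrative assumptions.

```python
import heapq
import random

def simulate(seed, horizon_years=10.0, n_modules=20, mean_life=2.0):
    """One event-driven run: modules fail at random times; count failures."""
    rng = random.Random(seed)
    events = []  # min-heap of (time, module_id)
    for m in range(n_modules):
        heapq.heappush(events, (rng.expovariate(1.0 / mean_life), m))
    failures = 0
    while events:
        t, m = heapq.heappop(events)
        if t > horizon_years:
            break
        failures += 1
        # A servicing visit replaces the module, scheduling its next failure.
        heapq.heappush(events, (t + rng.expovariate(1.0 / mean_life), m))
    return failures

# Monte Carlo: repeat with different random seeds, report min/average/max.
runs = [simulate(seed) for seed in range(100)]
print(min(runs), sum(runs) / len(runs), max(runs))
```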
GUI to Facilitate Research on Biological Damage from Radiation
NASA Technical Reports Server (NTRS)
Cucinotta, Frances A.; Ponomarev, Artem Lvovich
2010-01-01
A graphical-user-interface (GUI) computer program has been developed to facilitate research on the damage caused by highly energetic particles and photons impinging on living organisms. The program brings together, into one computational workspace, computer codes that have been developed over the years, plus codes that will be developed during the foreseeable future, to address diverse aspects of radiation damage. These include codes that implement radiation-track models, codes for biophysical models of breakage of deoxyribonucleic acid (DNA) by radiation, pattern-recognition programs for extracting quantitative information from biological assays, and image-processing programs that aid visualization of DNA breaks. The radiation-track models are based on transport models of interactions of radiation with matter and solution of the Boltzmann transport equation by use of both theoretical and numerical models. The biophysical models of breakage of DNA by radiation include biopolymer coarse-grained and atomistic models of DNA, stochastic-process models of deposition of energy, and Markov-based probabilistic models of placement of double-strand breaks in DNA. The program is designed for use in the NT, 95, 98, 2000, ME, and XP variants of the Windows operating system.
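As one small illustration of the probabilistic modeling mentioned above, the sketch below places double-strand breaks along a DNA segment as a homogeneous Poisson process; the actual codes use Markov-based models and track-structure physics, and the break rate here is an invented parameter.

```python
import random

def place_dsbs(genome_length_bp, breaks_per_mbp, seed=0):
    """Place double-strand breaks along DNA as a homogeneous Poisson process."""
    rng = random.Random(seed)
    rate = breaks_per_mbp / 1e6  # expected breaks per base pair
    positions, x = [], 0.0
    while True:
        x += rng.expovariate(rate)  # exponential gaps between Poisson events
        if x > genome_length_bp:
            return positions
        positions.append(int(x))

print(place_dsbs(genome_length_bp=3_000_000, breaks_per_mbp=2.5))
```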
NASA information sciences and human factors program
NASA Technical Reports Server (NTRS)
1991-01-01
The Data Systems Program consists of research and technology devoted to controlling, processing, storing, manipulating, and analyzing space-derived data. The objectives of the program are to provide the technology advancements needed to enable affordable utilization of space-derived data, to substantially increase on-board processing and recording capability for future missions, and to provide the high-speed, high-volume computational systems anticipated for missions such as the evolutionary Space Station and the Earth Observing System.
MatLab Programming for Engineers Having No Formal Programming Knowledge
NASA Technical Reports Server (NTRS)
Shaykhian, Linda H.; Shaykhian, Gholam Ali
2007-01-01
MatLab is one of the most widely used very high level programming languages for scientific and engineering computations. It is very user-friendly and requires practically no formal programming knowledge. Presented here are MatLab programming aspects, not just MatLab commands, for scientists and engineers who have no formal programming training and little time to spare for learning programming to solve their real-world problems. Specifically provided are programs for visualization. Also stated are the current limitations of MatLab, which Mathworks Inc. could possibly address in a future version to make MatLab more versatile.
Contention Bounds for Combinations of Computation Graphs and Network Topologies
2014-08-08
The U.S. "Tox21 Community" and the Future of Toxicology
In early 2008, the National Institute of Environmental Health Sciences/National Toxicology Program, the NIH Chemical Genomics Center, and the Environmental Protection Agency’s National Center for Computational Toxicology entered into a Memorandum of Understanding to collaborate o...
The importance of employing computational resources for the automation of drug discovery.
Rosales-Hernández, Martha Cecilia; Correa-Basurto, José
2015-03-01
The application of computational tools to drug discovery helps researchers to design and evaluate new drugs swiftly and with reduced economic resources. To discover new potential drugs, computational chemistry incorporates automation for obtaining biological data such as absorption, distribution, metabolism, excretion and toxicity (ADMET), as well as drug mechanisms of action. This editorial looks at examples of these computational tools, including docking, molecular dynamics simulation, virtual screening, quantum chemistry, quantitative structure-activity relationships, principal component analysis and drug screening workflow systems. The authors then provide their perspectives on the importance of these techniques for drug discovery. Computational tools help researchers to design and discover new drugs for the treatment of several human diseases without side effects, thus allowing for the evaluation of millions of compounds with a reduced cost in both time and economic resources. The problem is that operating each program is difficult; one is required to use several programs and to understand each of the properties being tested. In the future, it is possible that a single computer and software program will be capable of evaluating the complete properties (mechanisms of action and ADMET properties) of ligands. It is also possible that, after submitting one target, this software will be capable of suggesting potential compounds along with ways to synthesize them, and presenting biological models for testing.
Steele, K. S.
1994-01-01
Langston University, a Historically Black University located at Langston, Oklahoma, has a computing and information science program within the Langston University Division of Business. Since 1984, Langston University has participated in the Historically Black College and University program of the U.S. Department of Interior, which provided education, training, and funding through a combined earth-science and computer-technology cooperative program with the U.S. Geological Survey (USGS). USGS personnel have presented guest lectures at Langston University since 1984. Students have been enthusiastic about the lectures, and as a result of this program, 13 Langston University students have been hired by the USGS on a part-time basis while they continued their education at the University. The USGS expanded the offering of guest lectures in 1992 by increasing the number of visits to Langston University, and by inviting participation of speakers from throughout the country. The objectives of the guest-lecture series are to assist Langston University in offering state-of-the-art education in the computer sciences, to provide students with an opportunity to learn from and interact with skilled computer-science professionals, and to develop a pool of potential future employees for part-time and full-time employment. This report includes abstracts for guest-lecture presentations during the 1992-93 school year.
Pope, Bernard J; Fitch, Blake G; Pitman, Michael C; Rice, John J; Reumann, Matthias
2011-01-01
Future multiscale and multiphysics models must use the power of high performance computing (HPC) systems to enable research into human disease, translational medical science, and treatment. Previously, we showed that computationally efficient multiscale models will require the use of sophisticated hybrid programming models, mixing distributed message passing processes (e.g. the message passing interface (MPI)) with multithreading (e.g. OpenMP, POSIX pthreads). The objective of this work is to compare the performance of such hybrid programming models when applied to the simulation of a lightweight multiscale cardiac model. Our results show that the hybrid models do not perform favourably when compared to an implementation using only MPI, which is in contrast to our results using complex physiological models. Thus, with regards to lightweight multiscale cardiac models, the user may not need to increase programming complexity by using a hybrid programming approach. However, considering that model complexity will increase as well as the HPC system size in both node count and number of cores per node, it is still foreseeable that we will achieve faster than real time multiscale cardiac simulations on these systems using hybrid programming models.
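The hybrid pattern under comparison can be sketched as follows, assuming the mpi4py package for message passing and Python threads standing in for OpenMP; the per-cell update is a trivial stand-in for the cardiac model.

```python
# Hybrid pattern: distributed MPI ranks, each running a pool of threads.
from concurrent.futures import ThreadPoolExecutor
from mpi4py import MPI  # assumes mpi4py is installed; launch with mpiexec

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

cells = list(range(rank * 1000, (rank + 1) * 1000))  # this rank's share

def update(cell):
    return cell * 0.5  # stand-in for one cell's state update

with ThreadPoolExecutor(max_workers=4) as pool:  # "OpenMP-like" threading
    local = sum(pool.map(update, cells))

total = comm.reduce(local, op=MPI.SUM, root=0)  # message-passing step
if rank == 0:
    print("global sum:", total)
```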
Cranswick, Lachlan Michael David
2008-01-01
The history of crystallographic computing and use of crystallographic software is one which traces the escape from the drudgery of manual human calculations to a world where the user delegates most of the travail to electronic computers. In practice, this involves practising crystallographers communicating their thoughts to the crystallographic program authors, in the hope that new procedures will be implemented within their software. Against this background, the development of small-molecule single-crystal and powder diffraction software is traced. Starting with the analogue machines and the use of Hollerith tabulators of the late 1930's, it is shown that computing developments have been science led, with new technologies being harnessed to solve pressing crystallographic problems. The development of software is also traced, with a final caution that few of the computations now performed daily are really understood by the program users. Unless a sufficient body of people continues to dismantle and re-build programs, the knowledge encoded in the old programs will become as inaccessible as the knowledge of how to build the Great Pyramid at Giza.
Finn, Jerry; Atkinson, Teresa
2009-11-01
The Technology Safety Project of the Washington State Coalition Against Domestic Violence was designed to increase awareness and knowledge of technology safety issues for domestic violence victims, survivors, and advocacy staff. The project used a "train-the-trainer" model and provided computer and Internet resources to domestic violence service providers to (a) increase safe computer and Internet access for domestic violence survivors in Washington, (b) reduce the risk posed by abusers by educating survivors about technology safety and privacy, and (c) increase the ability of survivors to help themselves and their children through information technology. Evaluation of the project suggests that the program is needed, useful, and effective. Consumer satisfaction was high, and there was perceived improvement in computer confidence and knowledge of computer safety. Areas for future program development and further research are discussed.
Advances in mixed-integer programming methods for chemical production scheduling.
Velez, Sara; Maravelias, Christos T
2014-01-01
The goal of this paper is to critically review advances in the area of chemical production scheduling over the past three decades and then present two recently proposed solution methods that have led to dramatic computational enhancements. First, we present a general framework and problem classification and discuss modeling and solution methods with an emphasis on mixed-integer programming (MIP) techniques. Second, we present two solution methods: (a) a constraint propagation algorithm that allows us to compute parameters that are then used to tighten MIP scheduling models and (b) a reformulation that introduces new variables, thus leading to effective branching. We also present computational results and an example illustrating how these methods are implemented, as well as the resulting enhancements. We close with a discussion of open research challenges and future research directions.
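As a toy illustration of technique (a), the sketch below propagates earliest-start times through task precedences; such derived bounds can be added as constraints to tighten a MIP scheduling model. The task data and propagation rule are illustrative, not the authors' algorithm.

```python
# Constraint propagation: compute earliest-start bounds from precedences.
durations = {"A": 3, "B": 2, "C": 4}
predecessors = {"A": [], "B": ["A"], "C": ["A", "B"]}

earliest = {}
for task in ["A", "B", "C"]:  # tasks listed in topological order
    earliest[task] = max(
        (earliest[p] + durations[p] for p in predecessors[task]), default=0
    )

# earliest[t] is a valid lower bound on t's start variable, so constraints
# of the form  start_t >= earliest[t]  can tighten the MIP formulation.
print(earliest)  # {'A': 0, 'B': 3, 'C': 5}
```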
A thermal vacuum test optimization procedure
NASA Technical Reports Server (NTRS)
Kruger, R.; Norris, H. P.
1979-01-01
An analytical model was developed that can be used to establish certain parameters of a thermal vacuum environmental test program based on an optimization of program costs. The model takes the form of a computer program that interacts with the user for the input of certain parameters. The program provides the user with a list of pertinent information regarding an optimized test program and graphs of some of the parameters. The model is a first attempt in this area and includes numerous simplifications. It appears useful as a general guide and provides a way of extrapolating past performance to future missions.
Our Plan for a Wireless Loan Service.
ERIC Educational Resources Information Center
Allmang, Nancy
2003-01-01
Discusses the planning for wireless technology at the research library of the National Institute of Standards and Technology (NIST). Highlights include computer equipment, including laptops and PDAs; local area networks; equipment loan service; writing a business plan; infrastructure; training programs; and future considerations, including…
The possible usability of three-dimensional cone beam computed dental tomography in dental research
NASA Astrophysics Data System (ADS)
Yavuz, I.; Rizal, M. F.; Kiswanjaya, B.
2017-08-01
The innovations and advantages of three-dimensional cone beam computed dental tomography (3D CBCT) are continually growing, as is its potential use in dental research. Imaging techniques are important for planning research in dentistry. Newly improved 3D CBCT imaging systems and accessory computer programs have recently been proven effective for use in dental research. The aim of this study is to introduce 3D CBCT and open a window to future research possibilities that should be given attention in dental research.
Operation of the Institute for Computer Applications in Science and Engineering
NASA Technical Reports Server (NTRS)
1975-01-01
The ICASE research program is described in detail; it consists of four major categories: (1) efficient use of vector and parallel computers, with particular emphasis on the CDC STAR-100; (2) numerical analysis, with particular emphasis on the development and analysis of basic numerical algorithms; (3) analysis and planning of large-scale software systems; and (4) computational research in engineering and the natural sciences, with particular emphasis on fluid dynamics. The work in each of these areas is described in detail; other activities are discussed, and a prognosis of future activities is included.
NASA Astrophysics Data System (ADS)
Zacharovas, Stanislovas; Nikolskij, Andrej; Kuchin, Jevgenij
2011-02-01
We have created a programming tool which uses image data provided by a webcam connected to a personal computer and gives the user the ability to see a preview of the future digital hologram on the computer screen before sending video data to holographic printing companies. In order to print a digital hologram, one needs a sequence of images of the same scene taken from different angles, and nowadays web cameras, stand-alone or incorporated into a mobile computer, can be an acceptable source of such image sequences. In this article we describe this DIY holographic imaging process in detail.
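A minimal sketch of the image-acquisition step, assuming the OpenCV (cv2) package: the user pans a webcam across the scene while frames are saved as the multi-angle sequence. The frame count and filenames are illustrative, and the hologram-preview rendering itself is not shown.

```python
import cv2  # assumes the opencv-python package is installed

cap = cv2.VideoCapture(0)  # default webcam
frames = []
for i in range(60):  # user slowly pans the camera across the scene
    ok, frame = cap.read()
    if not ok:
        break
    frames.append(frame)
    cv2.imwrite(f"view_{i:03d}.png", frame)  # one image per viewing angle
cap.release()
print(f"captured {len(frames)} views for the hologram sequence")
```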
Logistical Consideration in Computer-Based Screening of Astronaut Applicants
NASA Technical Reports Server (NTRS)
Galarza, Laura
2000-01-01
This presentation reviews the logistical, ergonomic, and psychometric issues and data related to the development and operational use of a computer-based system for the psychological screening of astronaut applicants. The Behavioral Health and Performance Group (BHPG) at the Johnson Space Center upgraded its astronaut psychological screening and selection procedures for the 1999 astronaut applicants and subsequent astronaut selection cycles. The questionnaires, tests, and inventories were upgraded from a paper-and-pencil system to a computer-based system. Members of the BHPG and a computer programmer designed and developed needed interfaces (screens, buttons, etc.) and programs for the astronaut psychological assessment system. This intranet-based system included the user-friendly computer-based administration of tests, test scoring, generation of reports, the integration of test administration and test output to a single system, and a complete database for past, present, and future selection data. Upon completion of the system development phase, four beta and usability tests were conducted with the newly developed system. The first three tests included 1 to 3 participants each. The final system test was conducted with 23 participants tested simultaneously. Usability and ergonomic data were collected from the system (beta) test participants and from 1999 astronaut applicants who volunteered the information in exchange for anonymity. Beta and usability test data were analyzed to examine operational, ergonomic, programming, test administration and scoring issues related to computer-based testing. Results showed a preference for computer-based testing over paper-and-pencil procedures. The data also reflected specific ergonomic, usability, psychometric, and logistical concerns that should be taken into account in future selection cycles. In conclusion, psychological, psychometric, human, and logistical factors must be examined and considered carefully when developing and using a computer-based system for psychological screening and selection.
Viewing ISS Data in Real Time via the Internet
NASA Technical Reports Server (NTRS)
Myers, Gerry; Chamberlain, Jim
2004-01-01
EZStream is a computer program that enables authorized users at diverse terrestrial locations to view, in real time, data generated by scientific payloads aboard the International Space Station (ISS). The only computation/communication resource needed for use of EZStream is a computer equipped with standard Web-browser software and a connection to the Internet. EZStream runs in conjunction with the TReK software, described in a prior NASA Tech Briefs article, that coordinates multiple streams of data for the ground communication system of the ISS. EZStream includes server components that interact with TReK within the ISS ground communication system and client components that reside in the users' remote computers. Once an authorized client has logged in, a server component of EZStream pulls the requested data from a TReK application-program interface and sends the data to the client. Future EZStream enhancements will include (1) extensions that enable the server to receive and process arbitrary data streams on its own and (2) a Web-based graphical-user-interface-building subprogram that enables a client who lacks programming expertise to create customized display Web pages.
NASA Astrophysics Data System (ADS)
Nelson, E.; L'Ecuyer, T. S.; Douglas, A.; Hansen, Z.
2017-12-01
In the modern computing age, scientists must utilize a wide variety of skills to carry out scientific research. Programming, including a focus on collaborative development, has become more prevalent in both academic and professional career paths. Faculty in the Department of Atmospheric and Oceanic Sciences at the University of Wisconsin—Madison recognized this need and recently approved a new course offering for undergraduates and postgraduates in computational methods that was first held in Spring 2017. Three programming languages were covered in the inaugural course semester and development themes such as modularization, data wrangling, and conceptual code models were woven into all of the sections. In this presentation, we will share successes and challenges in developing a research project-focused computational course that leverages hands-on computer laboratory learning and open-sourced course content. Improvements and changes in future iterations of the course based on the first offering will also be discussed.
HeNCE: A Heterogeneous Network Computing Environment
Beguelin, Adam; Dongarra, Jack J.; Geist, George Al; ...
1994-01-01
Network computing seeks to utilize the aggregate resources of many networked computers to solve a single problem. In so doing it is often possible to obtain supercomputer performance from an inexpensive local area network. The drawback is that network computing is complicated and error prone when done by hand, especially if the computers have different operating systems and data formats and are thus heterogeneous. The heterogeneous network computing environment (HeNCE) is an integrated graphical environment for creating and running parallel programs over a heterogeneous collection of computers. It is built on a lower level package called parallel virtual machine (PVM). The HeNCE philosophy of parallel programming is to have the programmer graphically specify the parallelism of a computation and to automate, as much as possible, the tasks of writing, compiling, executing, debugging, and tracing the network computation. Key to HeNCE is a graphical language based on directed graphs that describe the parallelism and data dependencies of an application. Nodes in the graphs represent conventional Fortran or C subroutines and the arcs represent data and control flow. This article describes the present state of HeNCE, its capabilities, limitations, and areas of future research.
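The graph-driven execution model can be sketched in a few lines, with Python functions standing in for HeNCE's Fortran/C subroutine nodes and threads standing in for networked machines; the graph itself is an invented example.

```python
from concurrent.futures import ThreadPoolExecutor

# Directed graph: each node's function and the nodes it depends on.
graph = {
    "load":    (lambda: 42, []),
    "double":  (lambda x: 2 * x, ["load"]),
    "square":  (lambda x: x * x, ["load"]),
    "combine": (lambda a, b: a + b, ["double", "square"]),
}

results, done = {}, set()
with ThreadPoolExecutor() as pool:
    while len(done) < len(graph):
        # Launch every node whose predecessors have all finished.
        ready = [n for n, (_, deps) in graph.items()
                 if n not in done and all(d in done for d in deps)]
        futures = {n: pool.submit(graph[n][0], *(results[d] for d in graph[n][1]))
                   for n in ready}
        for n, f in futures.items():
            results[n] = f.result()
            done.add(n)

print(results["combine"])  # (2*42) + 42*42 = 1848
```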
DOE Office of Scientific and Technical Information (OSTI.GOV)
Potok, Thomas; Schuman, Catherine; Patton, Robert
The White House and Department of Energy have been instrumental in driving the development of a neuromorphic computing program to help the United States continue its lead in basic research into (1) Beyond Exascale—high performance computing beyond Moore’s Law and von Neumann architectures, (2) Scientific Discovery—new paradigms for understanding increasingly large and complex scientific data, and (3) Emerging Architectures—assessing the potential of neuromorphic and quantum architectures. Neuromorphic computing spans a broad range of scientific disciplines from materials science to devices, to computer science, to neuroscience, all of which are required to solve the neuromorphic computing grand challenge. In our workshop we focus on the computer science aspects, specifically from a neuromorphic device through an application. Neuromorphic devices present a very different paradigm to the computer science community from traditional von Neumann architectures, which raises six major questions about building a neuromorphic application from the device level. We used these fundamental questions to organize the workshop program and to direct the workshop panels and discussions. From the white papers, presentations, panels, and discussions, there emerged several recommendations on how to proceed.
The Integration of the Naval Unmanned Combat Aerial System (N-UCAS) into the Future Naval Air Wing
2009-12-01
Table 1. Aircraft Combat Radius from World War II (WWII) Through 1990s (Period / Airframe / Distance): WW2 — F6F, 400nm; TBF, 400nm; SB2C, ...
...override the computers, take control, and guide his two bombs to target by infrared video imagery. Otherwise, our auto piloted computer was programmed...
Toward using alpha and theta brain waves to quantify programmer expertise.
Crk, Igor; Kluthe, Timothy
2014-01-01
Empirical studies of programming language learnability and usability have thus far depended on indirect measures of human cognitive performance, attempting to capture what is at its essence a purely cognitive exercise through various indicators of comprehension, such as the correctness of coding tasks or the time spent working out the meaning of code and producing acceptable solutions. Understanding program comprehension is essential to understanding the inherent complexity of programming languages, and ultimately, having a measure of mental effort based on direct observation of the brain at work will illuminate the nature of the work of programming. We provide evidence of direct observation of the cognitive effort associated with programming tasks, through a carefully constructed empirical study using a cross-section of undergraduate computer science students and an inexpensive, off-the-shelf brain-computer interface device. This study presents a link between expertise and programming language comprehension, draws conclusions about the observed indicators of cognitive effort using recent cognitive theories, and proposes directions for future work that is now possible.
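A minimal sketch of the kind of measurement implied above, assuming a raw EEG trace held in a NumPy array: alpha (8-12 Hz) and theta (4-8 Hz) band power estimated from an FFT. The sampling rate, band edges, and synthetic signal are illustrative assumptions.

```python
import numpy as np

def band_power(signal, fs, lo, hi):
    """Average spectral power of `signal` between lo and hi Hz."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= lo) & (freqs < hi)
    return spectrum[mask].mean()

fs = 256  # Hz, a typical consumer EEG sampling rate
t = np.arange(0, 10, 1.0 / fs)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)  # fake trace

alpha = band_power(eeg, fs, 8, 12)
theta = band_power(eeg, fs, 4, 8)
print("alpha/theta ratio:", alpha / theta)  # one candidate workload index
```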
Training the Future - Swamp Work Activities
2017-07-19
In the Swamp Works laboratory at NASA's Kennedy Space Center in Florida, student interns such as Thomas Muller, left, and Austin Langdon are joining agency scientists, contributing in the area of Exploration Research and Technology. Muller is pursuing a degree in computer engineering and control systems at Florida Tech. Langdon is an electrical engineering major at the University of Kentucky. The agency attracts its future workforce through the NASA Internship, Fellowships and Scholarships, or NIFS, Program.
NASA Technical Reports Server (NTRS)
Patterson, G.
1973-01-01
The data processing procedures and the computer programs were developed to predict structural responses using the Impulse Transfer Function (ITF) method. There are three major steps in the process: (1) analog-to-digital (A-D) conversion of the test data to produce Phase I digital tapes; (2) processing of the Phase I digital tapes to extract ITF's and storing them in a permanent data bank; and (3) predicting structural responses to a set of applied loads. The analog-to-digital conversion is performed by a standard package, which is described later in terms of the contents of the resulting Phase I digital tape. Two separate computer programs have been developed to perform the digital processing.
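The prediction step can be illustrated with a minimal sketch: given a sampled impulse transfer function, the response to an arbitrary load history is its discrete convolution with the load. The impulse response, load, and sample interval below are invented for illustration.

```python
import numpy as np

dt = 0.01  # s, sample interval from the A-D conversion
t = np.arange(0, 2, dt)

# Illustrative impulse transfer function: a decaying oscillation.
itf = np.exp(-2.0 * t) * np.sin(2 * np.pi * 5 * t)

# Applied load history: a short rectangular pulse.
load = np.where(t < 0.1, 1.0, 0.0)

# Predicted structural response = discrete convolution of load with ITF.
response = np.convolve(load, itf)[: t.size] * dt
print("peak predicted response:", response.max())
```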
BreadNet: An On-Line Community.
ERIC Educational Resources Information Center
Walker, Susan
1987-01-01
Describes BreadNet, a computer network linking Middlebury College English teachers, their associates, and students. Network extends to rural English teachers and their K-8 students. BreadNet used for student pen pal program, teacher teleconferencing, information access. Also describes BreadNet's problems and future possibilities. (TES)
Virtual Reality in the Classroom.
ERIC Educational Resources Information Center
Pantelidis, Veronica S.
1993-01-01
Considers the concept of virtual reality; reviews its history; describes general uses of virtual reality, including entertainment, medicine, and design applications; discusses classroom uses of virtual reality, including a software program called Virtus WalkThrough for use with a computer monitor; and suggests future possibilities. (34 references)…
Buse, Kathleen; Hill, Catherine; Benson, Kathleen
2017-01-01
While there is an extensive body of research on gender equity in engineering and computing, there have been few efforts to glean insight from a dialog among experts. To encourage collaboration and to develop a shared vision of the future research agenda, a 2 day workshop of 50 scholars who work on the topic of gender in engineering and computing was held at a rural conference center. The structure of the conference and the location allowed for time to reflect, dialog, and to craft an innovative research agenda aimed at increasing the representation of women in engineering and computing. This paper has been written by the conference organizers and details the ideas and recommendations from the scholars. The result is an innovative, collaborative approach to future research that focuses on identifying effective interventions. The new approach includes the creation of partnerships with stakeholders including businesses, government agencies, non-profits and academic institutions to allow a broader voice in setting research priorities. Researchers recommend incorporating multiple disciplines and methodologies, while expanding the use of data analytics, merging and mining existing databases and creating new datasets. The future research agenda is detailed and includes studies focused on socio-cultural interventions particularly on career choice, within undergraduate and graduate programs, and for women in professional careers. The outcome is a vision for future research that can be shared with researchers, practitioners and other stakeholders that will lead to gender equity in the engineering and computing professions. PMID:28469591
NASA Technical Reports Server (NTRS)
Abbott, John M.; Anderson, Bernhard H.; Rice, Edward J.
1990-01-01
The internal fluid mechanics research program in inlets, ducts, and nozzles consists of a balanced effort between the development of computational tools (both parabolized Navier-Stokes and full Navier-Stokes) and the conduct of experimental research. The experiments are designed to better understand the fluid flow physics, to develop new or improved flow models, and to provide benchmark quality data sets for validation of the computational methods. The inlet, duct, and nozzle research program is described according to three major classifications of flow phenomena: (1) highly 3-D flow fields; (2) shock-boundary-layer interactions; and (3) shear layer control. Specific examples of current and future elements of the research program are described for each of these phenomena. In particular, the highly 3-D flow field phenomenon is highlighted by describing the computational and experimental research program in transition ducts having a round-to-rectangular area variation. In the case of shock-boundary-layer interactions, the specific details of research for normal shock-boundary-layer interactions are described. For shear layer control, research in vortex generators and the use of aerodynamic excitation for enhancement of the jet mixing process are described.
NASA Astrophysics Data System (ADS)
Barak, Miri; Harward, Judson; Kocur, George; Lerman, Steven
2007-08-01
Within the framework of MIT's course 1.00: Introduction to Computers and Engineering Problem Solving, this paper describes an innovative project entitled Studio 1.00 that integrates lectures with in-class demonstrations, active learning sessions, and on-task feedback, through the use of wireless laptop computers. This paper also describes a related evaluation study that investigated the effectiveness of different instructional strategies, comparing traditional teaching with two models of the studio format. Students' learning outcomes, specifically, their final grades and conceptual understanding of computational methods and programming, were examined. Findings indicated that Studio 1.00, in both its extensive- and partial-active learning modes, enhanced students' learning outcomes in Java programming. Compared to the traditional courses, more students in the studio courses received "A" as their final grade and fewer failed. Moreover, students who regularly attended the active learning sessions were able to conceptualize programming principles better than their peers. We have also found two weaknesses in the teaching format of Studio 1.00 that can guide future versions of the course.
Smith, Daniel G A; Burns, Lori A; Sirianni, Dominic A; Nascimento, Daniel R; Kumar, Ashutosh; James, Andrew M; Schriber, Jeffrey B; Zhang, Tianyuan; Zhang, Boyi; Abbott, Adam S; Berquist, Eric J; Lechner, Marvin H; Cunha, Leonardo A; Heide, Alexander G; Waldrop, Jonathan M; Takeshita, Tyler Y; Alenaizan, Asem; Neuhauser, Daniel; King, Rollin A; Simmonett, Andrew C; Turney, Justin M; Schaefer, Henry F; Evangelista, Francesco A; DePrince, A Eugene; Crawford, T Daniel; Patkowski, Konrad; Sherrill, C David
2018-06-11
Psi4NumPy demonstrates the use of efficient computational kernels from the open-source Psi4 program through the popular NumPy library for linear algebra in Python to facilitate the rapid development of clear, understandable Python computer code for new quantum chemical methods, while maintaining a relatively low execution time. Using these tools, reference implementations have been created for a number of methods, including self-consistent field (SCF), SCF response, many-body perturbation theory, coupled-cluster theory, configuration interaction, and symmetry-adapted perturbation theory. Furthermore, several reference codes have been integrated into Jupyter notebooks, allowing background, underlying theory, and formula information to be associated with the implementation. Psi4NumPy tools and associated reference implementations can lower the barrier for future development of quantum chemistry methods. These implementations also demonstrate the power of the hybrid C++/Python programming approach employed by the Psi4 program.
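In the Psi4NumPy spirit, a reference implementation of one Roothaan-style step might look like the sketch below, using plain NumPy only (no Psi4 calls shown); the toy Fock matrix and occupation count are illustrative assumptions.

```python
import numpy as np

n, n_occ = 4, 2  # basis size and number of doubly occupied orbitals
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))
F = (A + A.T) / 2  # toy symmetric "Fock" matrix

# Diagonalize: F C = C eps (an orthonormal basis is assumed for brevity).
eps, C = np.linalg.eigh(F)

# Density matrix assembled from the occupied orbital coefficients.
C_occ = C[:, :n_occ]
D = 2.0 * C_occ @ C_occ.T

print("orbital energies:", eps)
print("electron count (trace of D):", np.trace(D))
```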
NASA Technical Reports Server (NTRS)
Bull, William B. (Compiler); Pinoli, Pat C. (Compiler); Upton, Cindy G. (Compiler); Day, Tony (Compiler); Hill, Keith (Compiler); Stone, Frank (Compiler); Hall, William B.
1994-01-01
This report is a compendium of the presentations of the 12th biannual meeting of the Industry Advisory Committee under the Solid Propulsion Integrity Program. A complete transcript of the welcoming talks is provided. Presentation outlines and overheads are included for the other sessions: SPIP Overview, Past, Current and Future Activity; Test Methods Manual and Video Tape Library; Air Force Developed Computer Aided Cure Program and SPC/TQM Experience; Magneto-Optical mapper (MOM), Joint Army/NASA program to assess composite integrity; Permeability Testing; Moisture Effusion Testing by Karl Fischer Analysis; Statistical Analysis of Acceptance Test Data; NMR Phenolic Resin Advancement; Constituent Testing Highlights on the LDC Optimization Program; Carbon Sulfur Study, Performance Related Testing; Current Rayon Specifications and Future Availability; RSRM/SPC Implementation; SRM Test Methods, Delta/Titan/FBM/RSRM; and Open Forum on Performance Based Acceptance Testing -- Industry Experience.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liegey, Lauren Rene; Wilcox, Trevor; Mckinney, Gregg Walter
2015-08-07
My internship program was the Domestic Nuclear Detection Office Summer Internship Program. I worked at Los Alamos National Laboratory with Trevor A. Wilcox and Gregg W. McKinney in the NEN-5 group. My project title was “MCNP Physical Model Interoperability & Validation”. The goal of my project was to write a program to predict the solar modulation parameter for dates in the future and then implement it into MCNP6. This update to MCNP6 can be used to calculate the background more precisely, which is an important factor in being able to detect Special Nuclear Material. We will share our work in a published American Nuclear Society (ANS) paper, an ANS presentation, and a LANL student poster session. Through this project, I gained skills in programming, computing, and using MCNP. I also gained experience that will help me decide on a career or perhaps obtain employment in the future.
Computational Fluid Dynamics Program at NASA Ames Research Center
NASA Technical Reports Server (NTRS)
Holst, Terry L.
1989-01-01
The Computational Fluid Dynamics (CFD) Program at NASA Ames Research Center is reviewed and discussed. The technical elements of the CFD Program are listed and briefly discussed. These elements include algorithm research, research and pilot code development, scientific visualization, advanced surface representation, volume grid generation, and numerical optimization. Next, the discipline of CFD is briefly discussed and related to other areas of research at NASA Ames including experimental fluid dynamics, computer science research, computational chemistry, and numerical aerodynamic simulation. These areas combine with CFD to form a larger area of research, which might collectively be called computational technology. The ultimate goal of computational technology research at NASA Ames is to increase the physical understanding of the world in which we live, solve problems of national importance, and increase the technical capabilities of the aerospace community. Next, the major programs at NASA Ames that either use CFD technology or perform research in CFD are listed and discussed. Briefly, this list includes turbulent/transition physics and modeling, high-speed real gas flows, interdisciplinary research, turbomachinery demonstration computations, complete aircraft aerodynamics, rotorcraft applications, powered lift flows, high alpha flows, multiple body aerodynamics, and incompressible flow applications. Some of the individual problems actively being worked in each of these areas are listed to help define the breadth or extent of CFD involvement in each of these major programs. State-of-the-art examples of various CFD applications are presented to highlight most of these areas. The main emphasis of this portion of the presentation is on examples which will not otherwise be treated at this conference by the individual presentations. Finally, a list of principal current limitations and expected future directions is given.
A City Manager Looks at Trends Affecting Public Libraries.
ERIC Educational Resources Information Center
Kemp, Roger L.
1999-01-01
Highlights some important conditions, both present and future, which will have an impact on public libraries. Discusses holding down expenses, including user fees, alternative funding sources, and private cosponsorship of programs; increasing productivity; use of computers and new technologies; staff development and internal marketing; improving…
Touching the Future by Training Students as Technology Workers.
ERIC Educational Resources Information Center
Wodarz, Nan
1999-01-01
Describes a technology consultant's training of promising students as network administrators as part of a high-school work-study program. Success hinged on combining work with education, providing supervision and mentoring, using knowledgeable trainers, not substituting students for staff shortcomings, and installing adequate computer security.…
Artificial Intelligence: The Expert Way.
ERIC Educational Resources Information Center
Bitter, Gary G.
1989-01-01
Discussion of artificial intelligence (AI) and expert systems focuses on their use in education. Characteristics of good expert systems are explained; computer software programs that contain applications of AI are described, highlighting one used to help educators identify learning-disabled students; and the future of AI is discussed. (LRW)
48 CFR 1352.209-71 - Limitation of future contracting.
Code of Federal Regulations, 2013 CFR
2013-10-01
... feasibility, proof of design and test, or engineering of programs not yet approved for acquisition or... computer software; and may appear in cost and pricing data or involve classified information. (iv) “System...'s development, production, or support. (vi) “Systems Engineering” means preparing specifications...
48 CFR 1352.209-71 - Limitation of future contracting.
Code of Federal Regulations, 2012 CFR
2012-10-01
... feasibility, proof of design and test, or engineering of programs not yet approved for acquisition or... computer software; and may appear in cost and pricing data or involve classified information. (iv) “System...'s development, production, or support. (vi) “Systems Engineering” means preparing specifications...
48 CFR 1352.209-71 - Limitation of future contracting.
Code of Federal Regulations, 2014 CFR
2014-10-01
... feasibility, proof of design and test, or engineering of programs not yet approved for acquisition or... computer software; and may appear in cost and pricing data or involve classified information. (iv) “System...'s development, production, or support. (vi) “Systems Engineering” means preparing specifications...
More than Spinning Their Wheels
ERIC Educational Resources Information Center
Cassola, Joel
2007-01-01
Last fall, when Mastercam, the leading manufacturer of computer-aided manufacturing (CAM) software, announced the winners of its Innovators of the Future (IOF) contest, first, second and third prizes went to students in the advanced manufacturing program of Vincennes University's (VU's) Machine Trades Technology Department. The contest called for…
Nofre, David
2014-07-01
The spread of the modern computer is assumed to have been a smooth process of technology transfer. This view relies on an assessment of the open circulation of knowledge ensured by the US and British governments in the early post-war years. This article presents new historical evidence that questions this view. At the centre of the article lies the ill-fated establishment of the UNESCO International Computation Centre. The project was initially conceived in 1946 to provide advanced computation capabilities to scientists of all nations. It soon became a prize sought by Western European countries like The Netherlands and Italy seeking to speed up their own national research programs. Nonetheless, as the article explains, the US government's limitations on the research function of the future centre resulted in the withdrawal of European support for the project. These limitations illustrate the extent to which US foreign science policy could operate as (stealth) industrial policy to secure a competitive technological advantage and the prospects of US manufacturers in a future European market.
A structurally oriented simulation system
NASA Technical Reports Server (NTRS)
Aran, Z.
1973-01-01
The computer program SOSS (Structurally Oriented Simulation System) is designed to be used as an experimental aid in the study of reliable systems. Basically, SOSS can simulate the structure and behavior of a discrete-time, finite-state, time-invariant system at various levels of structural definition. A general description of the program is given along with its modes of operation, command language of the basic system, future features to be incorporated in SOSS, and an example of usage.
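A discrete-time, finite-state, time-invariant system of the kind SOSS simulates can be sketched with a simple transition table; the states and input symbols below are invented for illustration.

```python
# Transition table for a toy finite-state system: (state, input) -> next state.
delta = {
    ("ok", "fault"): "degraded",
    ("ok", "tick"): "ok",
    ("degraded", "repair"): "ok",
    ("degraded", "fault"): "failed",
    ("failed", "repair"): "degraded",
}

def simulate(initial_state, inputs):
    """Step the system once per discrete time tick, recording its trajectory."""
    state, trace = initial_state, [initial_state]
    for symbol in inputs:
        state = delta.get((state, symbol), state)  # unknown input: hold state
        trace.append(state)
    return trace

print(simulate("ok", ["tick", "fault", "fault", "repair", "repair"]))
# ['ok', 'ok', 'degraded', 'failed', 'degraded', 'ok']
```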
Simulator for multilevel optimization research
NASA Technical Reports Server (NTRS)
Padula, S. L.; Young, K. C.
1986-01-01
A computer program designed to simulate and improve multilevel optimization techniques is described. By using simple analytic functions to represent complex engineering analyses, the simulator can generate and test a large variety of multilevel decomposition strategies in a relatively short time. This type of research is an essential step toward routine optimization of large aerospace systems. The paper discusses the types of optimization problems handled by the simulator and gives input and output listings and plots for a sample problem. It also describes multilevel implementation techniques which have value beyond the present computer program. Thus, this document serves as a user's manual for the simulator and as a guide for building future multilevel optimization applications.
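A minimal sketch of the simulator's central idea, with cheap analytic functions standing in for engineering analyses: a system-level optimizer chooses a coupling target, and each subsystem solves its own subproblem against it. This assumes SciPy's optimizers and invented objective functions, not the program's actual decomposition strategies.

```python
from scipy.optimize import minimize, minimize_scalar

# Subsystem "analyses", replaced by cheap analytic surrogates.
def subsystem_a(target):
    # Local design x minimizing cost while tracking the shared target.
    return minimize_scalar(lambda x: (x - 2) ** 2 + (x - target) ** 2).fun

def subsystem_b(target):
    return minimize_scalar(lambda y: (y + 1) ** 2 + (y - target) ** 2).fun

# System level: choose the coupling target that minimizes total cost.
top = minimize(lambda t: subsystem_a(t[0]) + subsystem_b(t[0]), x0=[0.0])
print("best coupling target:", top.x[0], "total cost:", top.fun)
```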
NASA Technical Reports Server (NTRS)
Treon, S. L.
1979-01-01
A survey of the U.S. aerospace industry in late 1977 suggests that there will be an increasing use of computer-aided prediction-design technology (CPD Tech) in the aircraft development process but that, overall, only a modest reduction in wind-tunnel test requirements from the current level is expected in the period through 1995. Opinions were received from key spokesmen in 23 of the 26 solicited major companies or corporate divisions involved in the design and manufacture of nonrotary wing aircraft. Development programs for nine types of aircraft related to test phases and wind-tunnel size and speed range were considered.
Interactive systems design and synthesis of future spacecraft concepts
NASA Technical Reports Server (NTRS)
Wright, R. L.; Deryder, D. D.; Ferebee, M. J., Jr.
1984-01-01
An interactive systems design and synthesis is performed on future spacecraft concepts using the Interactive Design and Evaluation of Advanced spacecraft (IDEAS) computer-aided design and analysis system. The capabilities and advantages of the systems-oriented interactive computer-aided design and analysis system are described. The synthesis of both large antenna and space station concepts, and space station evolutionary growth is demonstrated. The IDEAS program provides the user with both an interactive graphics and an interactive computing capability which consists of over 40 multidisciplinary synthesis and analysis modules. Thus, the user can create, analyze and conduct parametric studies and modify Earth-orbiting spacecraft designs (space stations, large antennas or platforms, and technologically advanced spacecraft) at an interactive terminal with relative ease. The IDEAS approach is useful during the conceptual design phase of advanced space missions when a multiplicity of parameters and concepts must be analyzed and evaluated in a cost-effective and timely manner.
Using high-performance networks to enable computational aerosciences applications
NASA Technical Reports Server (NTRS)
Johnson, Marjory J.
1992-01-01
One component of the U.S. Federal High Performance Computing and Communications Program (HPCCP) is the establishment of a gigabit network to provide a communications infrastructure for researchers across the nation. This gigabit network will provide new services and capabilities, in addition to increased bandwidth, to enable future applications. An understanding of these applications is necessary to guide the development of the gigabit network and other high-performance networks of the future. In this paper we focus on computational aerosciences applications run remotely using the Numerical Aerodynamic Simulation (NAS) facility located at NASA Ames Research Center. We characterize these applications in terms of network-related parameters and relate user experiences that reveal limitations imposed by the current wide-area networking infrastructure. Then we investigate how the development of a nationwide gigabit network would enable users of the NAS facility to work in new, more productive ways.
Program optimizations: The interplay between power, performance, and energy
Leon, Edgar A.; Karlin, Ian; Grant, Ryan E.; ...
2016-05-16
Practical considerations for future supercomputer designs will impose limits on both instantaneous power consumption and total energy consumption. Working within these constraints while providing the maximum possible performance, application developers will need to optimize their code for speed alongside power and energy concerns. This paper analyzes the effectiveness of several code optimizations including loop fusion, data structure transformations, and global allocations. A per component measurement and analysis of different architectures is performed, enabling the examination of code optimizations on different compute subsystems. Using an explicit hydrodynamics proxy application from the U.S. Department of Energy, LULESH, we show how code optimizations impact different computational phases of the simulation. This provides insight for simulation developers into the best optimizations to use during particular simulation compute phases when optimizing code for future supercomputing platforms. Here, we examine and contrast both x86 and Blue Gene architectures with respect to these optimizations.
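Loop fusion, one of the optimizations studied, can be illustrated with a small NumPy sketch: fusing two elementwise passes removes the intermediate array and reduces memory traffic (a compiled fused loop would traverse the data exactly once). The array size and operations are illustrative.

```python
import numpy as np

a = np.random.rand(10_000_000)

# Unfused: two passes over memory plus a materialized temporary array b.
b = 2.0 * a
c = b + 1.0

# Fused: one expression; NumPy still allocates a result, but a compiled
# fused loop (as in C/Fortran kernels) would traverse the data one time.
c_fused = 2.0 * a + 1.0

assert np.allclose(c, c_fused)
```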
The future challenge for aeropropulsion
NASA Technical Reports Server (NTRS)
Rosen, Robert; Bowditch, David N.
1992-01-01
NASA's research in aeropropulsion is focused on improving the efficiency, capability, and environmental compatibility for all classes of future aircraft. The development of innovative concepts, and theoretical, experimental, and computational tools provide the knowledge base for continued propulsion system advances. Key enabling technologies include advances in internal fluid mechanics, structures, light-weight high-strength composite materials, and advanced sensors and controls. Recent emphasis has been on the development of advanced computational tools in internal fluid mechanics, structural mechanics, reacting flows, and computational chemistry. For subsonic transport applications, very high bypass ratio turbofans with increased engine pressure ratio are being investigated to increase fuel efficiency and reduce airport noise levels. In a joint supersonic cruise propulsion program with industry, the critical environmental concerns of emissions and community noise are being addressed. NASA is also providing key technologies for the National Aerospaceplane, and is studying propulsion systems that provide the capability for aircraft to accelerate to and cruise in the Mach 4-6 speed range. The combination of fundamental, component, and focused technology development underway at NASA will make possible dramatic advances in aeropropulsion efficiency and environmental compatibility for future aeronautical vehicles.
Human-machine interface hardware: The next decade
NASA Technical Reports Server (NTRS)
Marcus, Elizabeth A.
1991-01-01
In order to understand where human-machine interface hardware is headed, it is important to understand where we are today, how we got there, and what our goals for the future are. As computers become faster and more capable, and as programs become more sophisticated, it becomes apparent that the interface hardware is the key to an exciting future in computing. How can a user interact with and control a seemingly limitless array of parameters effectively? Today, the answer is most often a limitless array of controls. The link between these controls and human sensory motor capabilities does not utilize existing human capabilities to their full extent. Interface hardware for teleoperation and virtual environments is now facing a crossroad in design. Therefore, we as developers need to explore how the combination of interface hardware, human capabilities, and user experience can be blended to get the best performance today and in the future.
Configuring Airspace Sectors with Approximate Dynamic Programming
NASA Technical Reports Server (NTRS)
Bloem, Michael; Gupta, Pramod
2010-01-01
In response to changing traffic and staffing conditions, supervisors dynamically configure airspace sectors by assigning them to control positions. A finite horizon airspace sector configuration problem models this supervisor decision. The problem is to select an airspace configuration at each time step while considering a workload cost, a reconfiguration cost, and a constraint on the number of control positions at each time step. Three algorithms for this problem are proposed and evaluated: a myopic heuristic, an exact dynamic programming algorithm, and a rollouts approximate dynamic programming algorithm. On problem instances from current operations with only dozens of possible configurations, an exact dynamic programming solution gives the optimal cost value. The rollouts algorithm achieves costs within 2% of optimal for these instances, on average. For larger problem instances that are representative of future operations and have thousands of possible configurations, excessive computation time prohibits the use of exact dynamic programming. On such problem instances, the rollouts algorithm reduces the cost achieved by the heuristic by more than 15% on average with an acceptable computation time.
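The exact dynamic program can be sketched by backward induction on a toy instance; the configurations, workload costs, reconfiguration cost, and horizon below are invented, and the position-count constraint is omitted for brevity.

```python
# Finite-horizon DP over airspace configurations (toy instance).
configs = ["1-sector", "2-sector", "3-sector"]
T = 4
# workload[t][c]: workload cost of operating configuration c in period t.
workload = [
    {"1-sector": 5.0, "2-sector": 4.0, "3-sector": 6.0},
    {"1-sector": 7.0, "2-sector": 4.0, "3-sector": 3.0},
    {"1-sector": 6.0, "2-sector": 5.0, "3-sector": 3.0},
    {"1-sector": 2.0, "2-sector": 4.0, "3-sector": 6.0},
]
RECONF = 2.0  # cost charged whenever the configuration changes

# Backward induction: V[t][c] = optimal cost-to-go entering period t in c.
V = [{c: 0.0 for c in configs} for _ in range(T + 1)]
policy = [dict() for _ in range(T)]
for t in reversed(range(T)):
    for c in configs:
        choices = {
            n: workload[t][n] + (RECONF if n != c else 0.0) + V[t + 1][n]
            for n in configs
        }
        policy[t][c] = min(choices, key=choices.get)
        V[t][c] = choices[policy[t][c]]

print("optimal cost starting from '2-sector':", V[0]["2-sector"])
```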
NASA Technical Reports Server (NTRS)
Blackburn, C. L.; Dovi, A. R.; Kurtze, W. L.; Storaasli, O. O.
1981-01-01
A computer software system for the processing and integration of engineering data and programs, called IPAD (Integrated Programs for Aerospace-Vehicle Design), is described. The ability of the system to relieve the engineer of the mundane task of input data preparation is demonstrated by the application of a prototype system to the design, analysis, and/or machining of three simple structures. Future work to further enhance the system's automated data handling and its ability to handle larger and more varied design problems is also presented.
A New Look at NASA: Strategic Research In Information Technology
NASA Technical Reports Server (NTRS)
Alfano, David; Tu, Eugene (Technical Monitor)
2002-01-01
This viewgraph presentation provides information on research undertaken by NASA to facilitate the development of information technologies. Specific ideas covered here include: 1) Bio/nano technologies: biomolecular and nanoscale systems and tools for assembly and computing; 2) Evolvable hardware: autonomous self-improving, self-repairing hardware and software for survivable space systems in extreme environments; 3) High Confidence Software Technologies: formal methods, high-assurance software design, and program synthesis; 4) Intelligent Controls and Diagnostics: Next generation machine learning, adaptive control, and health management technologies; 5) Revolutionary computing: New computational models to increase capability and robustness to enable future NASA space missions.
NASA Technical Reports Server (NTRS)
Bailey, F. R.; Kutler, Paul
1988-01-01
Discussed are the capabilities of NASA's Numerical Aerodynamic Simulation (NAS) Program and its application as an advanced supercomputing system for computational fluid dynamics (CFD) research. First, the paper describes the NAS computational system, called the NAS Processing System Network, and the advanced computational capabilities it offers as a consequence of carrying out the NAS pathfinder objective. Second, it presents examples of pioneering CFD research accomplished during NAS's first operational year. Examples are included which illustrate CFD applications for predicting fluid phenomena, complementing and supplementing experimentation, and aiding in design. Finally, pacing elements and future directions for CFD and NAS are discussed.
Exploring the Integration of Computational Modeling in the ASU Modeling Curriculum
NASA Astrophysics Data System (ADS)
Schatz, Michael; Aiken, John; Burk, John; Caballero, Marcos; Douglas, Scott; Thoms, Brian
2012-03-01
We describe the implementation of computational modeling in a ninth grade classroom in the context of the Arizona Modeling Instruction physics curriculum. Using a high-level programming environment (VPython), students develop computational models to predict the motion of objects under a variety of physical situations (e.g., constant net force), to simulate real-world phenomena (e.g., a car crash), and to visualize abstract quantities (e.g., acceleration). We discuss how VPython allows students to utilize all four structures that describe a model as given by the ASU Modeling Instruction curriculum. Implications for future work will also be discussed.
Computational prediction of chemical reactions: current status and outlook.
Engkvist, Ola; Norrby, Per-Ola; Selmi, Nidhal; Lam, Yu-Hong; Peng, Zhengwei; Sherer, Edward C; Amberg, Willi; Erhard, Thomas; Smyth, Lynette A
2018-06-01
Over the past few decades, various computational methods have become increasingly important for discovering and developing novel drugs. Computational prediction of chemical reactions is a key part of an efficient drug discovery process. In this review, we discuss important parts of this field, with a focus on utilizing reaction data to build predictive models, the existing programs for synthesis prediction, and usage of quantum mechanics and molecular mechanics (QM/MM) to explore chemical reactions. We also outline potential future developments with an emphasis on pre-competitive collaboration opportunities.
Medical education as a science: the quality of evidence for computer-assisted instruction.
Letterie, Gerard S
2003-03-01
A marked increase in the number of computer programs for computer-assisted instruction in the medical sciences has occurred over the past 10 years. The quality of both the programs and the literature that describe these programs has varied considerably. The purposes of this study were to evaluate the published literature that described computer-assisted instruction in medical education and to assess the quality of evidence for its implementation, with particular emphasis on obstetrics and gynecology. Reports published between 1988 and 2000 on computer-assisted instruction in medical education were identified through a search of MEDLINE and the Educational Resource Identification Center and a review of the bibliographies of the articles that were identified. Studies were selected if they included a description of computer-assisted instruction in medical education, regardless of the type of computer program. Data were extracted with a content analysis of 210 reports. The reports were categorized according to study design (comparative, prospective, descriptive, review, or editorial), type of computer-assisted instruction, medical specialty, and measures of effectiveness. Computer-assisted instruction programs included online technologies, CD-ROMs, video laser disks, multimedia workstations, virtual reality, and simulation testing. Studies were identified in all medical specialties, with a preponderance in internal medicine, general surgery, radiology, obstetrics and gynecology, pediatrics, and pathology. Ninety-six percent of the articles described a favorable impact of computer-assisted instruction in medical education, regardless of the quality of the evidence. Of the 210 reports that were identified, 60% were noncomparative, descriptive reports of new techniques in computer-assisted instruction, and 15% and 14% were reviews and editorials, respectively, of existing technology. Eleven percent of studies were comparative and included some form of assessment of the effectiveness of the computer program. These assessments included pre- and posttesting and questionnaires to score program quality, perceptions of the medical students and/or residents regarding the program, and impact on learning. In one half of these comparative studies, computer-assisted instruction was compared with traditional modes of teaching, such as text and lectures. Six studies compared performance before and after the computer-assisted instruction. Improvements were shown in 5 of the studies. In the remainder of the studies, computer-assisted instruction appeared to result in similar test performance. Regardless of study design or outcome, most articles described enthusiastic endorsement of the programs by the participants, including medical students, residents, and practicing physicians. Only 1 study included cost analysis. Thirteen of the articles were in obstetrics and gynecology. Computer-assisted instruction has assumed an increasing role in medical education. In spite of enthusiastic endorsement and continued improvements in software, few studies of good design clearly demonstrate improvement in medical education over traditional modalities. There are no comparative studies in obstetrics and gynecology that demonstrate a clear-cut advantage. Future studies of computer-assisted instruction that include comparisons and cost assessments to gauge their effectiveness over traditional methods may better define their precise role.
Computing Models of M-type Host Stars and their Panchromatic Spectral Output
NASA Astrophysics Data System (ADS)
Linsky, Jeffrey; Tilipman, Dennis; France, Kevin
2018-06-01
We have begun a program of computing state-of-the-art model atmospheres from the photospheres to the coronae of M stars that are the host stars of known exoplanets. For each model we are computing the emergent radiation at all wavelengths that are critical for assessing photochemistry and mass-loss from exoplanet atmospheres. In particular, we are computing the stellar extreme ultraviolet radiation that drives hydrodynamic mass loss from exoplanet atmospheres and is essential for determining whether an exoplanet is habitable. The model atmospheres are computed with the SSRPM radiative transfer/statistical equilibrium code developed by Dr. Juan Fontenla. The code solves for the non-LTE statistical equilibrium populations of 18,538 levels of 52 atomic and ion species and computes the radiation from all species (435,986 spectral lines) and about 20,000,000 spectral lines of 20 diatomic species. The first model computed in this program was for the modestly active M1.5 V star GJ 832 by Fontenla et al. (ApJ 830, 152 (2016)). We will report on a preliminary model for the more active M5 V star GJ 876 and compare this model and its emergent spectrum with GJ 832. In the future, we will compute and intercompare semi-empirical models and spectra for all of the stars observed with the HST MUSCLES Treasury Survey, the Mega-MUSCLES Treasury Survey, and additional stars including Proxima Cen and Trappist-1. This multiyear theory program is supported by a grant from the Space Telescope Science Institute.
Educational Technology in Voc Ed. Information Series No. 268.
ERIC Educational Resources Information Center
Lipson, Joseph I.
This monograph provides a vision of the future for vocational educators in a position to improve programs, such as teachers and administrators of local educational agencies and state leaders who set priorities in educational agencies. The monograph addresses nationwide technological concerns of the computer, image storage and creation, and…
Toward Global Communication Networks: How Television is Forging New Thinking Patterns.
ERIC Educational Resources Information Center
Adams, Dennis M.; Fuchs, Mary
1986-01-01
Recent alliances between communication providers and computer manufacturers will lead to new technological combinations that will deliver visually-based ideas and information to a worldwide audience. Urges those in charge of future video programs to consider their effects on children's language skills, thinking patterns, and intellectual…
Analysis of propellant feedline dynamics
NASA Technical Reports Server (NTRS)
Astleford, W. J.; Holster, J. L.; Gerlach, C. R.
1972-01-01
An analytical model and computer program were developed for studying the disturbances of liquid propellants in engine feedline systems. It was found that the predominant effect of turbulence is to increase the spatial attenuation at low frequencies; at high frequencies the laminar and turbulent attenuations coincide. Recommendations for future work are included.
Methodologies and systems for heterogeneous concurrent computing
NASA Technical Reports Server (NTRS)
Sunderam, V. S.
1994-01-01
Heterogeneous concurrent computing is gaining increasing acceptance as an alternative or complementary paradigm to multiprocessor-based parallel processing as well as to conventional supercomputing. While algorithmic and programming aspects of heterogeneous concurrent computing are similar to their parallel processing counterparts, system issues, partitioning and scheduling, and performance aspects are significantly different. In this paper, we discuss critical design and implementation issues in heterogeneous concurrent computing, and describe techniques for enhancing its effectiveness. In particular, we highlight the system level infrastructures that are required, aspects of parallel algorithm development that most affect performance, system capabilities and limitations, and tools and methodologies for effective computing in heterogeneous networked environments. We also present recent developments and experiences in the context of the PVM system and comment on ongoing and future work.
Exploring the role of computers in sex and relationship education within British families.
Turnbull, Triece; van Schaik, Paul; van Wersch, Anna
2013-04-01
In this study, we aimed to identify the impact that computers can have in relation to sex and relationship education, as well as to provide a communication model that can be used within British families. We used a mixed-methods approach to explore the factors that influence communication of sexual matters within British families. Twenty families from the northeast of England were recruited through purposive sampling. First, semistructured interviews were conducted to identify how sexual matters were discussed within families. Interviews were recorded, transcribed verbatim, and then analyzed using the grounded theory approach. The second part of the research involved identifying the impact of using a computer program on knowledge and confidence within families to enhance communication about sexual matters. Although the majority of parents and their children were found to discuss sexual matters, the computer program was found to increase knowledge and confidence, which led to greater communication within families. The results highlighted the beneficial role that computer programs can have when educating and increasing communication within families. Future research needs to focus on improving access to information relating to sex and relationship education for parents so they can educate and talk openly about sexual matters with their children. A resource that does exactly this is www.safecoolsex.com.
Cable Connected Spinning Spacecraft, 1. the Canonical Equations, 2. Urban Mass Transportation, 3
NASA Technical Reports Server (NTRS)
Sitchin, A.
1972-01-01
Work on the dynamics of cable-connected spinning spacecraft was completed by formulating the equations of motion using both the canonical equations and Lagrange's equations and programming them for numerical solution on a digital computer. These energy-based formulations will permit future addition of the effect of cable mass. Comparative runs indicate that the canonical formulation requires less computer time. Available literature on urban mass transportation was surveyed. Areas of the private rapid transit concept of urban transportation were also studied.
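As a toy illustration of the canonical (Hamiltonian) formulation the abstract refers to, the Python sketch below integrates Hamilton's equations q' = dH/dp, p' = -dH/dq for a planar pendulum with SciPy. The actual cable-connected spacecraft model is far more elaborate; this only shows the numerical-solution pattern, with an energy check as a quick quality measure.

```python
# Hamilton's canonical equations for a planar pendulum, integrated numerically.
import numpy as np
from scipy.integrate import solve_ivp

m, L, g = 1.0, 1.0, 9.81  # illustrative mass (kg), length (m), gravity (m/s^2)

def canonical_rhs(t, y):
    q, p = y                        # generalized coordinate and momentum
    dq = p / (m * L**2)             # q' =  dH/dp
    dp = -m * g * L * np.sin(q)     # p' = -dH/dq
    return [dq, dp]

sol = solve_ivp(canonical_rhs, (0.0, 10.0), [0.5, 0.0], rtol=1e-9, atol=1e-9)

# Energy drift is a quick check on integration quality.
q, p = sol.y
H = p**2 / (2 * m * L**2) - m * g * L * np.cos(q)
print("max energy drift:", np.max(np.abs(H - H[0])))
```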
Technology advances and market forces: Their impact on high performance architectures
NASA Technical Reports Server (NTRS)
Best, D. R.
1978-01-01
Reasonable projections into future supercomputer architectures and technology require an analysis of the computer industry market environment, the current capabilities and trends within the component industry, and the research activities on computer architecture in the industrial and academic communities. Management, programmer, architect, and user must cooperate to increase the efficiency of supercomputer development efforts. Care must be taken to match the funding, compiler, architecture and application with greater attention to testability, maintainability, reliability, and usability than supercomputer development programs of the past.
Scout: high-performance heterogeneous computing made simple
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jablin, James; Mc Cormick, Patrick; Herlihy, Maurice
2011-01-26
Researchers must often write their own simulation and analysis software. During this process they simultaneously confront both computational and scientific problems. Current strategies for aiding the generation of performance-oriented programs do not abstract the software development from the science. Furthermore, the problem is becoming increasingly complex and pressing with the continued development of many-core and heterogeneous (CPU-GPU) architectures. To achieve high performance, scientists must expertly navigate both software and hardware. Co-design between computer scientists and research scientists can alleviate but not solve this problem. The science community requires better tools for developing, optimizing, and future-proofing codes, allowing scientists to focus on their research while still achieving high computational performance. Scout is a parallel programming language and extensible compiler framework targeting heterogeneous architectures. It provides the abstraction required to buffer scientists from the constantly-shifting details of hardware while still realizing high performance by encapsulating software and hardware optimization within a compiler framework.
NASA Astrophysics Data System (ADS)
Pellas, Nikolaos; Peroutseas, Efstratios
2017-01-01
Students in secondary education often struggle to understand basic programming concepts. Despite all that is known about the benefits of programming, there is little published evidence showing how high school students can learn basic programming concepts through innovative instructional formats while gaining or enhancing computational thinking skills. This gap has contributed to a lack of motivation and interest in Computer Science courses. This case study presents the opinions of twenty-eight (n = 28) high school students who participated voluntarily in a 3D game-like environment created in Second Life. This environment was combined with the 2D programming environment of Scratch4SL for the implementation of programming concepts (i.e. sequence and concurrent programming commands) in a blended instructional format. An instructional framework based on Papert's theory of Constructionism is also proposed to help students coordinate and manage the learning material in collaborative, practice-based learning activities. In a mixed-method design, students' participation in a focus group (qualitative data) and their motivation based on their experiences (quantitative data) were measured before and after they completed several learning tasks. Findings indicated that an instructional design framework based on Constructionism is meaningful for acquiring or strengthening students' social, cognitive, higher-order, and computational thinking skills. Educational implications and recommendations for future research are also discussed.
Training the Future - Interns Harvesting & Testing Plant Experim
2017-07-19
In the Space Life Sciences Laboratory at NASA's Kennedy Space Center in Florida, student interns such as Ayla Grandpre, left, and Payton Barnwell are joining agency scientists, contributing in the area of plant growth research for food production in space. Grandpre is pursuing a degree in computer science and chemistry at Rocky Mountain College in Billings, Montana. Barnwell is a mechanical engineering and nanotechnology major at Florida Polytechnic University. The agency attracts its future workforce through the NASA Internship, Fellowships and Scholarships, or NIFS, Program.
DOE Office of Scientific and Technical Information (OSTI.GOV)
March-Leuba, S.; Jansen, J.F.; Kress, R.L.
1992-08-01
A new program package, Symbolic Manipulator Laboratory (SML), for the automatic generation of both kinematic and static manipulator models in symbolic form is presented. Critical design parameters may be identified and optimized using symbolic models, as shown in the sample application presented for the Future Armor Rearm System (FARS) arm. The computer-aided development of the symbolic models yields equations with reduced numerical complexity. Important considerations have been placed on the simplification of closed-form solutions and on user-friendly operation. The main emphasis of this research is the development of a methodology, implemented in a computer program capable of generating symbolic kinematic and static force models of manipulators. The fact that the models are obtained trigonometrically reduced is among the most significant results of this work and the most difficult to implement. Mathematica, a commercial program that allows symbolic manipulation, is used to implement the program package. SML is written such that the user can change any of the subroutines or create new ones easily. To assist the user, an on-line help facility has been written to make SML a user-friendly package. Some sample applications are presented. The design and optimization of the 5-degrees-of-freedom (DOF) FARS manipulator using SML is discussed. Finally, the kinematic and static models of two different 7-DOF manipulators are calculated symbolically.
Pope, Bernard J; Fitch, Blake G; Pitman, Michael C; Rice, John J; Reumann, Matthias
2011-10-01
Future multiscale and multiphysics models that support research into human disease, translational medical science, and treatment can utilize the power of high-performance computing (HPC) systems. We anticipate that computationally efficient multiscale models will require the use of sophisticated hybrid programming models, mixing distributed message-passing processes [e.g., the message-passing interface (MPI)] with multithreading (e.g., OpenMP, Pthreads). The objective of this study is to compare the performance of such hybrid programming models when applied to the simulation of a realistic physiological multiscale model of the heart. Our results show that the hybrid models perform favorably when compared to an implementation using only the MPI and, furthermore, that OpenMP in combination with the MPI provides a satisfactory compromise between performance and code complexity. Having the ability to use threads within MPI processes enables the sophisticated use of all processor cores for both computation and communication phases. Considering that HPC systems in 2012 will have two orders of magnitude more cores than what was used in this study, we believe that faster than real-time multiscale cardiac simulations can be achieved on these systems.
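A minimal sketch of the hybrid pattern discussed above, using mpi4py for message passing between processes and Python threads within each process; the "kernel" is a placeholder, not the cardiac model, and the script assumes an MPI installation (launched, e.g., with `mpiexec -n 4 python hybrid.py`). In CPython the threads mainly illustrate structure, since only NumPy's internals release the GIL.

```python
# Hybrid sketch: MPI ranks for distributed memory, threads within each rank.
import threading
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

local = np.random.default_rng(rank).random(1_000_000)  # this rank's partition
partial = np.zeros(4)                                  # one slot per thread

def work(tid, nthreads):
    chunk = local[tid::nthreads]           # threads share the rank's data
    partial[tid] = np.sum(chunk * chunk)   # stand-in for a cell-update kernel

threads = [threading.Thread(target=work, args=(i, 4)) for i in range(4)]
for th in threads: th.start()
for th in threads: th.join()

# Communication phase: combine per-rank results across MPI processes.
total = comm.allreduce(float(partial.sum()), op=MPI.SUM)
if rank == 0:
    print("global sum of squares:", total)
```

The compute phase (threads over shared arrays) and the communication phase (a single collective per rank) mirror the MPI+OpenMP division of labor the abstract describes.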
IBM PC enhances the world's future
NASA Technical Reports Server (NTRS)
Cox, Jozelle
1988-01-01
Although the purpose of this research is to illustrate the importance of computers to the public, particularly the IBM PC, present examinations will include computers developed before the IBM PC was brought into use. IBM, as well as other computing facilities, began serving the public years ago, and is continuing to find ways to enhance the existence of man. With new developments in supercomputers like the Cray-2, and the recent advances in artificial intelligence programming, the human race is gaining knowledge at a rapid pace. All have benefited from the development of computers in the world; not only have they brought new assets to life, but have made life more and more of a challenge every day.
POSE Algorithms for Automated Docking
NASA Technical Reports Server (NTRS)
Heaton, Andrew F.; Howard, Richard T.
2011-01-01
POSE (relative position and attitude) can be computed in many different ways. Given a sensor that measures bearing to a finite number of spots corresponding to known features (such as a target) of a spacecraft, a number of different algorithms can be used to compute the POSE. NASA has sponsored the development of a flash LIDAR proximity sensor called the Vision Navigation Sensor (VNS) for use by the Orion capsule in future docking missions. This sensor generates data that can be used by a variety of algorithms to compute POSE solutions inside of 15 meters, including at the critical docking range of approximately 1-2 meters. Previously, NASA participated in a DARPA program called Orbital Express that achieved the first automated docking for the American space program. During this mission a large set of high quality mated sensor data was obtained at what is essentially the docking distance. This data set is perhaps the most accurate truth data in existence for docking proximity sensors in orbit. In this paper, the flight data from Orbital Express is used to test POSE algorithms at a range of 1.22 meters. Two different POSE algorithms are tested for two different fields-of-view (FOVs) and two different pixel noise levels. The results of the analysis are used to predict future performance of the POSE algorithms with VNS data.
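One standard way to compute POSE from measurements of known target features is a least-squares rigid-body fit via SVD (the Kabsch algorithm). The Python sketch below is a generic example of that algorithm class, not the flight VNS software; the spot pattern, noise level, and 1.22 m offset are invented to echo the docking range.

```python
# POSE by least-squares rigid fit: given feature positions in the target frame
# and their measured positions in the sensor frame (e.g., from flash-LIDAR
# range + bearing), recover rotation R and translation t.
import numpy as np

def solve_pose(model_pts, measured_pts):
    """Find R, t minimizing ||R @ model_i + t - measured_i|| over all features."""
    cm, cs = model_pts.mean(axis=0), measured_pts.mean(axis=0)
    H = (model_pts - cm).T @ (measured_pts - cs)       # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])  # guard against reflection
    R = Vt.T @ D @ U.T
    return R, cs - R @ cm

# Synthetic check at roughly docking range (~1.22 m), with small measurement noise.
rng = np.random.default_rng(0)
model = rng.random((6, 3)) * 0.3                        # target spot pattern (m)
angle = np.radians(10.0)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.0, 0.0, 1.22])
meas = model @ R_true.T + t_true + rng.normal(0.0, 1e-3, (6, 3))
R, t = solve_pose(model, meas)
print("translation error (m):", np.linalg.norm(t - t_true))
```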
An Overview: NASA LeRC Structures Programs
NASA Technical Reports Server (NTRS)
Zaretsky, Erwin V.
1998-01-01
A workshop on National Structures Programs was held, jointly sponsored by the AIAA Structures Technical Committee, the University of Virginia's Center for Advanced Computational Technology and NASA. The objectives of the workshop were to: provide a forum for discussion of current Government-sponsored programs in the structures area; identify high potential research areas for future aerospace systems; and initiate suitable interaction mechanisms with the managers of structures programs. The presentations covered structures programs at NASA, DOD (AFOSR, ONR, ARO and DARPA), and DOE. This publication is the presentation of the Structures and Acoustics Division of the NASA Lewis Research Center. The Structures and Acoustics Division has its genesis dating back to 1943. It is responsible for NASA research related to rotating structures and structural hot sections of both airbreathing and rocket engines. The work of the division encompasses but is not limited to aeroelasticity, structural life prediction and reliability, fatigue and fracture, mechanical components such as bearings, gears, and seals, and aeroacoustics. These programs are discussed and the names of responsible individuals are provided for future reference.
Simulation of n-qubit quantum systems. I. Quantum registers and quantum gates
NASA Astrophysics Data System (ADS)
Radtke, T.; Fritzsche, S.
2005-12-01
During recent years, quantum computations and the study of n-qubit quantum systems have attracted a lot of interest, both in theory and experiment. Apart from the promise of performing quantum computations, however, these investigations also revealed a great deal of difficulties which still need to be solved in practice. In quantum computing, unitary and non-unitary quantum operations act on a given set of qubits to form (entangled) states, in which the information is encoded by the overall system, often referred to as quantum registers. To facilitate the simulation of such n-qubit quantum systems, we present the FEYNMAN program to provide all necessary tools in order to define and to deal with quantum registers and quantum operations. Although the present version of the program is restricted to unitary transformations, it equally supports—whenever possible—the representation of the quantum registers both in terms of their state vectors and density matrices. In addition to the composition of two or more quantum registers, moreover, the program also supports their decomposition into various parts by applying the partial trace operation and the concept of the reduced density matrix. Using an interactive design within the framework of MAPLE, therefore, we expect the FEYNMAN program to be helpful not only for teaching the basic elements of quantum computing but also for studying their physical realization in the future.
Program summary
Title of program: FEYNMAN
Catalogue number: ADWE
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADWE
Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
Licensing provisions: None
Computers for which the program is designed: All computers with a license of the computer algebra system MAPLE [Maple is a registered trademark of Waterloo Maple Inc.]
Operating systems or monitors under which the program has been tested: Linux, MS Windows XP
Programming language used: MAPLE 9.5 (but should be compatible with 9.0 and 8.0, too)
Memory and time required to execute with typical data: Storage and time requirements critically depend on the number of qubits, n, in the quantum registers due to the exponential increase of the associated Hilbert space. In particular, complex algebraic operations may require large amounts of memory even for small qubit numbers. However, most of the standard commands (see Section 4 for simple examples) react promptly for up to five qubits on a normal single-processor machine (⩾1 GHz with 512 MB memory) and use less than 10 MB memory.
No. of lines in distributed program, including test data, etc.: 8864
No. of bytes in distributed program, including test data, etc.: 493 182
Distribution format: tar.gz
Nature of the physical problem: During the last decade, quantum computing has been found to provide a revolutionary new form of computation. The algorithms by Shor [P.W. Shor, SIAM J. Sci. Statist. Comput. 26 (1997) 1484] and Grover [L.K. Grover, Phys. Rev. Lett. 79 (1997) 325. [2
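FEYNMAN itself is a MAPLE package, but its core state-vector idea is easy to sketch in Python/NumPy: an n-qubit register is a length-2^n complex vector, and a single-qubit gate acts by contracting the target qubit's tensor axis. A minimal illustration of the idea (not the FEYNMAN API):

```python
# State-vector simulation of an n-qubit register with single-qubit gates.
import numpy as np

def apply_gate(state, gate, target, n):
    """Apply a 2x2 gate to qubit `target` (0 = most significant) of n qubits."""
    psi = state.reshape([2] * n)                         # one axis per qubit
    psi = np.tensordot(gate, psi, axes=([1], [target]))  # contract target axis
    psi = np.moveaxis(psi, 0, target)                    # restore axis order
    return psi.reshape(-1)

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
X = np.array([[0, 1], [1, 0]])                 # NOT gate

n = 3
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0                                 # |000>
state = apply_gate(state, H, 0, n)             # (|000> + |100>)/sqrt(2)
print(np.round(state, 3))
```

The exponential memory growth the program summary warns about is visible directly: the state vector doubles in length with every added qubit.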
Program on application of communications satellites to educational development
NASA Technical Reports Server (NTRS)
Morgan, R. P.; Singh, J. P.
1971-01-01
Interdisciplinary research in needs analysis, communications technology studies, and systems synthesis is reported. Existing and planned educational telecommunications services are studied and library utilization of telecommunications is described. Preliminary estimates are presented of ranges of utilization of educational telecommunications services for 1975 and 1985; instructional and public television, computer-aided instruction, computing resources, and information resource sharing for various educational levels and purposes. Communications technology studies include transmission schemes for still-picture television, use of Gunn effect devices, and TV receiver front ends for direct satellite reception at 12 GHz. Two major studies in the systems synthesis project concern (1) organizational and administrative aspects of a large-scale instructional satellite system to be used with schools and (2) an analysis of future development of instructional television, with emphasis on the use of video tape recorders and cable television. A communications satellite system synthesis program developed for NASA is now operational on the university IBM 360-50 computer.
Computer-assisted instruction: a library service for the community teaching hospital.
McCorkel, J; Cook, V
1986-04-01
This paper reports on five years of experience with computer-assisted instruction (CAI) at Winthrop-University Hospital, a major affiliate of the SUNY at Stony Brook School of Medicine. It compares CAI programs available from Ohio State University and Massachusetts General Hospital (accessed by telephone and modem), and software packages purchased from the Health Sciences Consortium (MED-CAPS) and Scientific American (DISCOTEST). The comparison documents one library's experience of the cost of these programs and the use made of them by medical students, house staff, and attending physicians. It describes the space allocated for necessary equipment, as well as the marketing of CAI. Finally, in view of the decision of the National Board of Medical Examiners to administer the Part III examination on computer (the so-called CBX) starting in 1988, the paper speculates on the future importance of CAI in the community teaching hospital.
Probabilistic Structural Analysis Theory Development
NASA Technical Reports Server (NTRS)
Burnside, O. H.
1985-01-01
The objective of the Probabilistic Structural Analysis Methods (PSAM) project is to develop analysis techniques and computer programs for predicting the probabilistic response of critical structural components for current and future space propulsion systems. This technology will play a central role in establishing system performance and durability. The first year's technical activity is concentrating on probabilistic finite element formulation strategy and code development. Work is also in progress to survey critical materials and space shuttle main engine components. The probabilistic finite element computer program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) is being developed. The final probabilistic code will have, in the general case, the capability of performing nonlinear dynamic analysis of stochastic structures. It is the goal of the approximate methods effort to increase problem-solving efficiency relative to finite element methods by using energy methods to generate trial solutions which satisfy the structural boundary conditions. These approximate methods will be less computer intensive relative to the finite element approach.
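The probabilistic idea behind such codes can be sketched independently of finite elements: sample random loads, geometry, and strength; push them through a structural response function; and estimate a failure probability. The rod model and distributions below are invented for illustration only; NESSUS couples this concept to full structural models.

```python
# Monte Carlo estimate of a structural failure probability for a toy axial rod.
import numpy as np

rng = np.random.default_rng(42)
N = 1_000_000

load = rng.normal(50e3, 5e3, N)        # axial load F (N), illustrative
area = rng.normal(6e-4, 2e-5, N)       # cross-section A (m^2), illustrative
strength = rng.normal(120e6, 10e6, N)  # yield strength (Pa), illustrative

stress = load / area                   # response function: sigma = F / A
p_fail = np.mean(stress > strength)    # P(stress exceeds strength)
print(f"estimated failure probability: {p_fail:.2e}")
```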
Small Interactive Image Processing System (SMIPS) users manual
NASA Technical Reports Server (NTRS)
Moik, J. G.
1973-01-01
The Small Interactive Image Processing System (SMIPS) is designed to facilitate the acquisition, digital processing and recording of image data as well as pattern recognition in an interactive mode. Objectives of the system are ease of communication with the computer by personnel who are not expert programmers, fast response to requests for information on pictures, complete error recovery, as well as simplification of future programming efforts for extension of the system. The SMIPS system is intended for operation under OS/MVT on an IBM 360/75 or 91 computer equipped with the IBM-2250 Model 1 display unit. This terminal is used as an interface between the user and the main computer. It has an alphanumeric keyboard, a programmed function keyboard and a light pen which are used for specification of input to the system. Output from the system is displayed on the screen as messages and pictures.
Hershberger, Patricia E; Gallo, Agatha M; Molokie, Robert; Thompson, Alexis A; Suarez, Marie L; Yao, Yingwei; Wilkie, Diana J
2016-06-01
To gain an in-depth understanding of the perceptions of young adults with sickle cell disease and sickle cell trait about parenthood and participating in the CHOICES randomized controlled trial that used computer-based, educational programmes. In the USA, there is insufficient education to assure that all young adults with sickle cell disease or sickle cell trait understand genetic inheritance risks and reproductive options to make informed reproductive decisions. To address this educational need, we developed a computer-based, multimedia program (CHOICES) and reformatted usual care into a computer-based (e-Book) program. We then conducted a two-year randomized controlled trial that included a qualitative component that would deepen understanding of young adults' perceptions of parenthood and use of computer-based, educational programmes. A qualitative descriptive approach completed after a randomized controlled trial. Sixty-eight men and women of childbearing age participated in semi-structured interviews at the completion of the randomized controlled trial from 2012-2013. Thematic content analysis guided the qualitative description. Three main themes were identified: (1) increasing knowledge and new ways of thinking and behaving; (2) rethinking parenting plans; and (3) appraising the program design and delivery. Most participants reported increased knowledge and rethinking of their parenting plans and were supportive of computer-based learning. Some participants expressed difficulty in determining individual transmission risks. Participants perceived the computer programs as beneficial to their learning. Future development of an Internet-based educational programme is warranted, with emphasis on providing tailored education or memory boosters about individual transmission risks.
Dickinson, Jesse; Hanson, R.T.; Mehl, Steffen W.; Hill, Mary C.
2011-01-01
The computer program described in this report, MODPATH-LGR, is designed to allow simulation of particle tracking in locally refined grids. The locally refined grids are simulated by using MODFLOW-LGR, which is based on MODFLOW-2005, the three-dimensional groundwater-flow model published by the U.S. Geological Survey. The documentation includes brief descriptions of the methods used and detailed descriptions of the required input files and how the output files are typically used. The code for this model is available for downloading from the World Wide Web from a U.S. Geological Survey software repository. The repository is accessible from the U.S. Geological Survey Water Resources Information Web page at http://water.usgs.gov/software/ground_water.html. The performance of the MODPATH-LGR program has been tested in a variety of applications. Future applications, however, might reveal errors that were not detected in the test simulations. Users are requested to notify the U.S. Geological Survey of any errors found in this document or the computer program by using the email address available on the Web site. Updates might occasionally be made to this document and to the MODPATH-LGR program, and users should check the Web site periodically.
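Trackers in this family advance particles semi-analytically through one grid cell at a time using Pollock's method: with velocity varying linearly between opposing cell faces, each coordinate evolves exponentially in time, and the cell exit time is the minimum over faces. A one-dimensional Python sketch with invented grid values follows; MODPATH-LGR's actual implementation handles three dimensions and the bookkeeping of local grid refinement.

```python
# One-dimensional Pollock-style step: time for a particle to exit a cell.
import numpy as np

def exit_time_1d(x, x0, dx, v_left, v_right):
    """Time for a particle at x to reach a face of the cell [x0, x0 + dx]."""
    A = (v_right - v_left) / dx           # velocity gradient within the cell
    v = v_left + A * (x - x0)             # velocity at the particle position
    if abs(A) < 1e-12:                    # uniform velocity: linear motion
        face = x0 + dx if v > 0 else x0
        return (face - x) / v if v != 0 else np.inf
    face_v = v_right if v > 0 else v_left  # velocity at the face ahead
    if v * face_v <= 0:                    # stagnation point blocks the path
        return np.inf
    return np.log(face_v / v) / A          # since v(t) = v * exp(A * t)

# Particle enters a 10 m cell at x = 2 m; inflow face 1 m/d, outflow face 3 m/d.
t = exit_time_1d(2.0, 0.0, 10.0, 1.0, 3.0)
print(f"exit after {t:.3f} days")
```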
Digital Screen Media and Cognitive Development.
Anderson, Daniel R; Subrahmanyam, Kaveri
2017-11-01
In this article, we examine the impact of digital screen devices, including television, on cognitive development. Although we know that young infants and toddlers are using touch screen devices, we know little about their comprehension of the content that they encounter on them. In contrast, research suggests that children begin to comprehend child-directed television starting at ∼2 years of age. The cognitive impact of these media depends on the age of the child, the kind of programming (educational programming versus programming produced for adults), the social context of viewing, as well as the particular kind of interactive media (e.g., computer games). For children <2 years old, television viewing has mostly negative associations, especially for language and executive function. For preschool-aged children, television viewing has been found to have both positive and negative outcomes, and a large body of research suggests that educational television has a positive impact on cognitive development. Beyond the preschool years, children mostly consume entertainment programming, and cognitive outcomes are not well explored in research. The use of computer games as well as educational computer programs can lead to gains in academically relevant content and other cognitive skills. This article concludes by identifying topics and goals for future research and provides recommendations based on current research-based knowledge.
Future directions for postdoctoral training in cancer prevention: insights from a panel of experts.
Nelson, David E; Faupel-Badger, Jessica; Phillips, Siobhan; Belcher, Britni; Chang, Shine; Abrams, David B; Kramer, Barnett S; White, Mary C; O'Malley, Michael; Varanasi, Arti P; Fabian, Carol J; Wiest, Jonathan S; Colditz, Graham A; Hall, Kara; Shields, Peter G; Weitzel, Jeffrey N
2014-04-01
Cancer prevention postdoctoral fellowships have existed since the 1970s. The National Cancer Institute facilitated a meeting by a panel of experts in April 2013 to consider four important topics for future directions for cancer prevention postdoctoral training programs: (i) future research needs; (ii) underrepresented disciplines; (iii) curriculum; and (iv) career preparation. Panelists proffered several areas needing more research or emphasis, ranging from computational science to culture. Health care providers, along with persons from nontraditional disciplines in scientific training programs such as engineers and lawyers, were among those recognized as being underrepresented in training programs. Curriculum suggestions were that fellows receive training in topics such as leadership and human relations, in addition to learning the principles of epidemiology, cancer biologic mechanisms, and behavioral science. For career preparation, there was a clear recognition of the diversity of employment options available besides academic positions, and that program leaders should do more to help fellows identify and prepare for different career paths. The major topics and strategies covered at this meeting can help form the basis for cancer prevention training program leaders to consider modifications or new directions, and keep them updated with the changing scientific and employment climate for doctoral degree recipients and postdoctoral fellows.
A DNA network as an information processing system.
Santini, Cristina Costa; Bath, Jonathan; Turberfield, Andrew J; Tyrrell, Andy M
2012-01-01
Biomolecular systems that can process information are sought for computational applications, because of their potential for parallelism and miniaturization and because their biocompatibility also makes them suitable for future biomedical applications. DNA has been used to design machines, motors, finite automata, logic gates, reaction networks and logic programs, amongst many other structures and dynamic behaviours. Here we design and program a synthetic DNA network to implement computational paradigms abstracted from cellular regulatory networks. These show information processing properties that are desirable in artificial, engineered molecular systems, including robustness of the output in relation to different sources of variation. We show the results of numerical simulations of the dynamic behaviour of the network and preliminary experimental analysis of its main components.
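The dynamic behaviour of such a network is typically simulated as mass-action kinetics. Below is a hedged Python sketch for one invented strand-displacement step (input + gate → output + waste) with output decay; the species, topology, and rate constants are illustrative, not the paper's network.

```python
# Mass-action ODE simulation of a toy DNA strand-displacement step.
import numpy as np
from scipy.integrate import solve_ivp

k_disp, k_decay = 1e4, 1e-3   # /M/s and /s; illustrative rate constants

def rhs(t, y):
    inp, gate, out = y
    r = k_disp * inp * gate           # displacement reaction flux
    return [-r, -r, r - k_decay * out]

y0 = [50e-9, 100e-9, 0.0]             # 50 nM input, 100 nM gates, no output
sol = solve_ivp(rhs, (0, 20_000), y0, max_step=10.0)
print(f"output at t = {sol.t[-1]:.0f} s: {sol.y[2, -1] * 1e9:.1f} nM")
```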
NASA Technical Reports Server (NTRS)
Ray, Charles D.; Carrasquillo, Robyn L.; Minton-Summers, Silvia
1997-01-01
This paper provides a summary of current work accomplished under technical task agreement (TTA) by the Marshall Space Flight Center (MSFC) regarding the Environmental Control and Life Support System (ECLSS) as well as future planning activities in support of the International Space Station (ISS). Current activities include ECLSS computer model development, component design and development, subsystem integrated system testing, life testing, and government furnished equipment delivered to the ISS program. A long range plan for the MSFC ECLSS test facility is described whereby the current facility would be upgraded to support integrated station ECLSS operations. ECLSS technology development efforts proposed to be performed under the Advanced Engineering Technology Development (AETD) program are also discussed.
Recent Accomplishments and Future Directions in US Fusion Safety & Environmental Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
David A. Petti; Brad J. Merrill; Phillip Sharpe
2006-07-01
The US fusion program has long recognized that the safety and environmental (S&E) potential of fusion can be attained by prudent materials selection, judicious design choices, and integration of safety requirements into the design of the facility. To achieve this goal, S&E research is focused on understanding the behavior of the largest sources of radioactive and hazardous materials in a fusion facility, understanding how energy sources in a fusion facility could mobilize those materials, developing integrated state-of-the-art S&E computer codes and risk tools for safety assessment, and evaluating S&E issues associated with current fusion designs. In this paper, recent accomplishments are reviewed and future directions outlined.
CONTACT: An Air Force technical report on military satellite control technology
NASA Astrophysics Data System (ADS)
Weakley, Christopher K.
1993-07-01
This technical report focuses on Military Satellite Control Technologies and their application to the Air Force Satellite Control Network (AFSCN). This report is a compilation of articles that provide an overview of the AFSCN and the Advanced Technology Program, and discusses relevant technical issues and developments applicable to the AFSCN. Among the topics covered are articles on Future Technology Projections; Future AFSCN Topologies; Modeling of the AFSCN; Wide Area Communications Technology Evolution; Automating AFSCN Resource Scheduling; Health & Status Monitoring at Remote Tracking Stations; Software Metrics and Tools for Measuring AFSCN Software Performance; Human-Computer Interface Working Group; Trusted Systems Workshop; and the University Technical Interaction Program. In addition, Key Technology Area points of contact are listed in the report.
Electromagnetic Scattering from Two Dielectric Spheres: Comparison between Theory and Experiment,
1982-08-30
struct, a very complex computer program to perform the calculations. It should be noted that several errors (probably typographical) were discovered in ... their equations and virtually all of them had to be rederived. In a future publication we will present a corrected set of equations along with several
Whenever You Use a Computer You Are Using a Program Called an Operating System.
ERIC Educational Resources Information Center
Cook, Rick
1984-01-01
Examines design, features, and shortcomings of eight disk-based operating systems designed for general use that are popular or most likely to affect the future of microcomputing. Included are the CP/M family, MS-DOS, Apple DOS/ProDOS, Unix, Pick, the p-System, TRSDOS, and Macintosh/Lisa. (MBR)
Career Repertoires of IT Students: A Group Counselling Case Study in Higher Education
ERIC Educational Resources Information Center
Penttinen, Leena; Vesisenaho, Mikko
2013-01-01
Uncertainty about future career prospects has increased enormously for students enrolled in higher education Information Technology (IT) programs. However, many computer science programmes pay little attention to career counselling. This article reports the results of a pilot study intended to develop group counselling for IT students to promote…
Raising a Programmer: Teaching Saudi Children How to Code
ERIC Educational Resources Information Center
Meccawy, Maram
2017-01-01
Teaching computer coding to children from a young age provides them with a competitive advantage for the future in a continually changing workplace. Programming strengthens logical and critical thinking as well as problem-solving skills, which lead to creative solutions for today's problems. The Little Programmer is an application for mobile…
Empathy in Future Teachers of the Pedagogical and Technological University of Colombia
ERIC Educational Resources Information Center
Herrera Torres, Lucía; Buitrago Bonilla, Rafael Enrique; Avila Moreno, Aida Karina
2016-01-01
This study analyzes cognitive and emotional empathy in students who started their training at the Education Science Faculty of the Pedagogical and Technological University of Colombia. The sample was formed by 317 students enrolled in the study programs of Preschool, Plastic Arts, Natural Sciences, Physical Education, Philosophy, Computer Science,…
ERIC Educational Resources Information Center
Hinton, Vanessa; Flores, Margaret; Burton, Megan; Curtis, Rebecca
2015-01-01
The purpose of this mixed method study was to investigate future special education teachers' preparation for effectively teaching mathematics. During the last semester of their program, pre-service special education teachers completed elementary level mathematics computation and problem solving assessments, a mathematics efficacy beliefs survey,…
A Computer Model of Simple Forms of Learning.
ERIC Educational Resources Information Center
Jones, Thomas L.
A basic unsolved problem in science is that of understanding learning, the process by which people and machines use their experience in a situation to guide future action in similar situations. The ideas of Piaget, Pavlov, Hull, and other learning theorists, as well as previous heuristic programming models of human intelligence, stimulated this…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chai, X; Liu, L; Xing, L
Purpose: Visualization and processing of medical images and radiation treatment plan evaluation have traditionally been constrained to local workstations with limited computation power and limited ability for data sharing and software updates. We present a web-based image processing and planning evaluation platform (WIPPEP) for radiotherapy applications with high efficiency, ubiquitous web access, and real-time data sharing. Methods: This software platform consists of three parts: web server, image server and computation server. Each independent server communicates with the others through HTTP requests. The web server is the key component: it provides visualizations and the user interface through front-end web browsers and relays information to the backend to process user requests. The image server serves as a PACS system. The computation server performs the actual image processing and dose calculation. The web server backend is developed using Java Servlets and the frontend is developed using HTML5, JavaScript, and jQuery. The image server is based on the open-source DCM4CHEE PACS system. The computation server can be written in any programming language as long as it can send/receive HTTP requests. Our computation server was implemented in Delphi, Python and PHP, which can process data directly or via a C++ program DLL. Results: This software platform is running on a 32-core CPU server that virtually hosts the web server, image server, and computation servers separately. Users can visit our internal website with the Chrome browser, select a specific patient, visualize images and RT structures belonging to this patient, and perform image segmentation on the Delphi computation server and Monte Carlo dose calculation on the Python or PHP computation server. Conclusion: We have developed a web-based image processing and plan evaluation platform prototype for radiotherapy. This system has clearly demonstrated the feasibility of performing image processing and plan evaluation through a web browser and exhibited potential for future cloud-based radiotherapy.
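The server-to-server pattern described (a front end relaying HTTP requests to independent computation workers) can be sketched with the Python standard library alone. The endpoint name and payload shape below are invented for illustration; the actual platform used Java servlets, a DCM4CHEE image server, and Delphi/Python/PHP computation servers.

```python
# Minimal computation-server sketch: accept an HTTP POST, run a job, return JSON.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class ComputeHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/segment":                 # hypothetical endpoint
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        job = json.loads(self.rfile.read(length))
        # Placeholder "computation": echo the patient ID with a fake status.
        result = {"patient": job.get("patient"), "status": "segmented"}
        body = json.dumps(result).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), ComputeHandler).serve_forever()
```

Because each server only speaks HTTP, the web, image, and computation tiers can be replaced or scaled independently, which is the design point the abstract emphasizes.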
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yates, K.R.; Schreiber, A.M.; Rudolph, A.W.
The US Nuclear Regulatory Commission has initiated the Fuel Cycle Risk Assessment Program to provide risk assessment methods for assistance in the regulatory process for nuclear fuel cycle facilities other than reactors. Both the once-through cycle and plutonium recycle are being considered. A previous report generated by this program defines and describes fuel cycle facilities, or elements, considered in the program. This report, the second from the program, describes the survey and computer compilation of fuel cycle risk-related literature. Sources of available information on the design, safety, and risk associated with the defined set of fuel cycle elements were searched, and documents obtained were catalogued and characterized with respect to fuel cycle elements and specific risk/safety information. Both US and foreign surveys were conducted. Battelle's computer-based BASIS information management system was used to facilitate the establishment of the literature compilation. A complete listing of the literature compilation and several useful indexes are included. Future updates of the literature compilation will be published periodically. 760 annotated citations are included.
Structural dynamics of shroudless, hollow fan blades with composite in-lays
NASA Technical Reports Server (NTRS)
Aiello, R. A.; Hirschbein, M. S.; Chamis, C. C.
1982-01-01
Structural and dynamic analyses are presented for a shroudless, hollow titanium fan blade proposed for future use in aircraft turbine engines. The blade was modeled and analyzed using the composite blade structural analysis computer program (COBSTRAN); an integrated program consisting of mesh generators, composite mechanics codes, NASTRAN, and pre- and post-processors. Vibration and impact analyses are presented. The vibration analysis was conducted with COBSTRAN. Results show the effect of the centrifugal force field on frequencies, twist, and blade camber. Bird impact analysis was performed with the multi-mode blade impact computer program. This program uses the geometric model and modal analysis from the COBSTRAN vibration analysis to determine the gross impact response of the fan blades to bird strikes. The structural performance of this blade is also compared to a blade of similar design but with composite in-lays on the outer surface. Results show that the composite in-lays can be selected (designed) to substantially modify the mechanical performance of the shroudless, hollow fan blade.
Biyikli, Emre; To, Albert C.
2015-01-01
A new topology optimization method called the Proportional Topology Optimization (PTO) is presented. As a non-sensitivity method, PTO is simple to understand, easy to implement, and is also efficient and accurate at the same time. It is implemented into two MATLAB programs to solve the stress constrained and minimum compliance problems. Descriptions of the algorithm and computer programs are provided in detail. The method is applied to solve three numerical examples for both types of problems. The method shows comparable efficiency and accuracy with an existing optimality criteria method which computes sensitivities. Also, the PTO stress constrained algorithm and minimum compliance algorithm are compared by feeding output from one algorithm to the other in an alternating manner, where the former yields lower maximum stress and volume fraction but higher compliance compared to the latter. Advantages and disadvantages of the proposed method and future works are discussed. The computer programs are self-contained and publicly shared in the website www.ptomethod.org.
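The proportional, non-sensitivity update at the heart of PTO can be sketched in a few lines (in Python here, though the paper's programs are MATLAB): distribute the target material volume proportionally to each element's stress or compliance contribution, cap densities at solid, and redistribute the remainder. The per-element quantity q would come from a finite element solve, which is omitted in this sketch.

```python
# Proportional material distribution, the core loop of a PTO-style update.
import numpy as np

def pto_update(q, volfrac, rho_min=1e-3, tol=1e-9):
    """Distribute volfrac * n of material proportionally to q within [rho_min, 1]."""
    n = q.size
    rho = np.full(n, rho_min)
    free = np.ones(n, dtype=bool)          # elements not yet fully solid
    remaining = volfrac * n - rho.sum()
    while remaining > tol and free.any():
        add = remaining * q[free] / q[free].sum()      # proportional shares
        rho[free] = np.minimum(rho[free] + add, 1.0)   # cap at fully solid
        free = rho < 1.0 - 1e-12
        remaining = volfrac * n - rho.sum()            # material lost to capping
    return rho

# Toy use: 6 elements with known stress contributions and a 50% volume budget.
q = np.array([5.0, 3.0, 2.0, 1.0, 0.5, 0.1])
print(np.round(pto_update(q, 0.5), 3))
```

In a full PTO loop this update alternates with the finite element solve, usually blended with the previous density field to stabilize convergence.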
Marshall Space Flight Center CFD overview
NASA Technical Reports Server (NTRS)
Schutzenhofer, Luke A.
1989-01-01
Computational Fluid Dynamics (CFD) activities at Marshall Space Flight Center (MSFC) have been focused on hardware specific and research applications with strong emphasis upon benchmark validation. The purpose here is to provide insight into the MSFC CFD related goals, objectives, current hardware related CFD activities, propulsion CFD research efforts and validation program, future near-term CFD hardware related programs, and CFD expectations. The current hardware programs where CFD has been successfully applied are the Space Shuttle Main Engines (SSME), Alternate Turbopump Development (ATD), and Aeroassist Flight Experiment (AFE). For the future near-term CFD hardware related activities, plans are being developed that address the implementation of CFD into the early design stages of the Space Transportation Main Engine (STME), Space Transportation Booster Engine (STBE), and the Environmental Control and Life Support System (ECLSS) for the Space Station. Finally, CFD expectations in the design environment will be delineated.
NASA and CFD - Making investments for the future
NASA Technical Reports Server (NTRS)
Hessenius, Kristin A.; Richardson, P. F.
1992-01-01
From a NASA perspective, CFD is a new tool for fluid flow simulation and prediction with virtually none of the inherent limitations of other ground-based simulation techniques. A primary goal of NASA's CFD research program is to develop efficient and accurate computational techniques for utilization in the design and analysis of aerospace vehicles. The program in algorithm development has systematically progressed through the hierarchy of engineering simplifications of the Navier-Stokes equations, starting with the inviscid formulations such as transonic small disturbance, full potential, and Euler.
NASA's supercomputing experience
NASA Technical Reports Server (NTRS)
Bailey, F. Ron
1990-01-01
A brief overview of NASA's recent experience in supercomputing is presented from two perspectives: early systems development and advanced supercomputing applications. NASA's role in supercomputing systems development is illustrated by discussion of activities carried out by the Numerical Aerodynamical Simulation Program. Current capabilities in advanced technology applications are illustrated with examples in turbulence physics, aerodynamics, aerothermodynamics, chemistry, and structural mechanics. Capabilities in science applications are illustrated by examples in astrophysics and atmospheric modeling. Future directions and NASA's new High Performance Computing Program are briefly discussed.
Continuation of research into language concepts for the mission support environment
NASA Technical Reports Server (NTRS)
1991-01-01
A concept for a more intuitive and graphically based Computation (Comp) Builder was developed, along with the Graphical Comp Builder Prototype, an X Window-based graphical tool that allows the user to build Comps using graphical symbols. An investigation was conducted to determine the availability and suitability of the Ada programming language for the development of future control center type software. The Space Station Freedom Project identified Ada as the desirable programming language for the development of Space Station Control Center software systems.
1979-01-01
synthesis proceeds by ignoring unacceptable syntax or other errors, protection against subsequent execution of a faulty reaction scheme can be ... resulting TAPE9. During subroutine synthesis and reaction processing, a search is made (for each secondary electron collision encountered) to ... program library, which can be catalogued and saved if any future specialized modifications (beyond the scope of the synthesis capability of LASER
Progress on Variable Cycle Engines
NASA Technical Reports Server (NTRS)
Westmoreland, J. S.; Howlett, R. A.; Lohmann, R. P.
1979-01-01
Progress in the development and future requirements of the Variable Stream Control Engine (VSCE) is presented. The two most critical components of this advanced system for future supersonic transports, the high performance duct burner for thrust augmentation and the low jet-noise coannular nozzle, were studied. Nozzle model tests substantiated the jet noise benefit associated with the unique velocity profile possible with a coannular nozzle system on a VSCE. Additional nozzle model performance tests have established high thrust efficiency levels only at takeoff and supersonic cruise for this nozzle system. An experimental program involving both isolated component and complete engine tests has been conducted for the high performance, low emissions duct burner with good results, and large scale testing of these two components is being conducted using a F100 engine as the testbed for simulating the VSCE. Future work includes application of computer programs for supersonic flow fields to coannular nozzle geometries, further experimental testing with the duct burner segment rig, and the use of the Variable Cycle Engine (VCE) Testbed Program for evaluating the VSCE duct burner and coannular nozzle technologies.
Use of a Computer Program for Advance Care Planning with African American Participants.
Markham, Sarah A; Levi, Benjamin H; Green, Michael J; Schubart, Jane R
2015-02-01
The authors wish to acknowledge the support and assistance of Dr. William Lawrence for his contribution to the MAUT model used in the decision aid, Making Your Wishes Known: Planning Your Medical Future (MYWK), Dr. Cheryl Dellasega for her leadership in focus group activities, Charles Sabatino for his review of legal aspects of MYWK, Dr. Robert Pearlman and his collaborative team for use of the advance care planning booklet "Your Life, Your Choices," Megan Whitehead for assistance in grant preparation and project organization, and the Instructional Media Development Center at the University of Wisconsin as well as JPL Integrated Communications for production and programming of MYWK. For various cultural and historical reasons, African Americans are less likely than Caucasians to engage in advance care planning (ACP) for healthcare decisions. This pilot study tested whether an interactive computer program could help overcome barriers to effective ACP among African Americans. African American adults were recruited from traditionally Black churches to complete an interactive computer program on ACP, pre-/post-questionnaires, and a follow-up phone interview. Eighteen adults (mean age = 53.2 years, 83% female) completed the program without any problems. Knowledge about ACP significantly increased following the computer intervention (44.9% → 61.3%, p = 0.0004), as did individuals' sense of self-determination. Participants were highly satisfied with the ACP process (9.4; 1 = not at all satisfied, 10 = extremely satisfied) and reported that the computer-generated advance directive accurately reflected their wishes (6.4; 1 = not at all accurate, 7 = extremely accurate). Follow-up phone interviews found that >80% of participants reported having shared their advance directives with family members and spokespeople. Preliminary evidence suggests that an interactive computer program can help African Americans engage in effective advance care planning, including creating an accurate advance directive document that will be shared with loved ones. © 2015 National Medical Association. Published by Elsevier Inc. All rights reserved.
Recent advances and future prospects for Monte Carlo
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B
2010-01-01
The history of Monte Carlo methods is closely linked to that of computers: The first known Monte Carlo program was written in 1947 for the ENIAC; a pre-release of the first Fortran compiler was used for Monte Carlo in 1957; Monte Carlo codes were adapted to vector computers in the 1980s, clusters and parallel computers in the 1990s, and teraflop systems in the 2000s. Recent advances include hierarchical parallelism, combining threaded calculations on multicore processors with message-passing among different nodes. With the advances in computing, Monte Carlo codes have evolved with new capabilities and new ways of use. Production codes such as MCNP, MVP, MONK, TRIPOLI and SCALE are now 20-30 years old (or more) and are very rich in advanced features. The former 'method of last resort' has now become the first choice for many applications. Calculations are now routinely performed on office computers, not just on supercomputers. Current research and development efforts are investigating the use of Monte Carlo methods on FPGAs, GPUs, and many-core processors. Other far-reaching research is exploring ways to adapt Monte Carlo methods to future exaflop systems that may have 1M or more concurrent computational processes.
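The hierarchical parallelism mentioned above (threaded work within a node, message passing between nodes) can be illustrated with a minimal sketch. Python's process pool stands in for the message-passing level, and the pi-estimation kernel is a toy placeholder, not the transport kernel of MCNP or any other production code.

```python
# Minimal sketch of hierarchically parallel Monte Carlo (illustrative only):
# each "node" runs an independent random stream; results are reduced at the end.
import random
from concurrent.futures import ProcessPoolExecutor  # stands in for MPI nodes

def node_tally(args):
    seed, histories = args
    rng = random.Random(seed)              # independent stream per node
    hits = 0
    for _ in range(histories):
        x, y = rng.random(), rng.random()
        if x * x + y * y < 1.0:            # toy kernel: estimate pi
            hits += 1
    return hits

if __name__ == "__main__":
    nodes, histories = 4, 250_000
    work = [(seed, histories) for seed in range(nodes)]
    with ProcessPoolExecutor(max_workers=nodes) as pool:
        total_hits = sum(pool.map(node_tally, work))
    print("pi ~", 4.0 * total_hits / (nodes * histories))
```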
A new implementation of the programming system for structural synthesis (PROSSS-2)
NASA Technical Reports Server (NTRS)
Rogers, James L., Jr.
1984-01-01
This new implementation of the PROgramming System for Structural Synthesis (PROSSS-2) combines a general-purpose finite element computer program for structural analysis, a state-of-the-art optimization program, and several user-supplied, problem-dependent computer programs. The result is flexibility in the organization of the optimization procedure and versatility in the formulation of constraints and design variables. The analysis-optimization process results in a minimized objective function, typically the mass. The analysis and optimization programs are executed repeatedly by looping through the system until the process is stopped by a user-defined termination criterion. However, some of the analysis, such as model definition, need be performed only once, and the results are saved for future use. The user must write some small, simple FORTRAN programs to interface between the analysis and optimization programs. One of these programs, the front processor, converts the design variables output from the optimizer into a format suitable for input into the analyzer. Another, the end processor, retrieves the behavior variables and, optionally, their gradients from the analysis program and evaluates the objective function and constraints and, optionally, their gradients. These quantities are output in a format suitable for input into the optimizer. These user-supplied programs are problem-dependent because they depend primarily upon which finite elements are being used in the model. PROSSS-2 differs from the original PROSSS in that the optimizer and front and end processors have been integrated into the finite element computer program. This was done to reduce the complexity and increase the portability of the system, and to take advantage of the data handling features found in the finite element program.
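The loop the abstract describes (front processor, analyzer, end processor, optimizer, termination test) can be sketched schematically as below. All function names are hypothetical, and a toy two-variable sizing problem stands in for the finite element analysis and the optimizer actually integrated in PROSSS-2.

```python
# Schematic of a PROSSS-2 style analysis-optimization loop (names hypothetical).
import numpy as np
from scipy.optimize import minimize

def analyzer(model_inputs):
    # Placeholder for the finite element analysis; returns "behavior variables".
    # Toy problem: member areas -> stress-like responses proportional to 1/area.
    a1, a2 = model_inputs
    return np.array([1.0 / a1, 1.0 / a2])

def front_processor(design_vars):
    # Convert optimizer design variables into the analyzer's input format.
    return np.asarray(design_vars)

def end_processor(design_vars):
    # Retrieve behavior variables, evaluate objective (mass) and constraints.
    responses = analyzer(front_processor(design_vars))
    mass = design_vars[0] + design_vars[1]   # objective to minimize
    constraints = 2.0 - responses            # require responses <= 2.0
    return mass, constraints

# The optimizer drives the loop until its own termination criterion is met.
result = minimize(
    lambda x: end_processor(x)[0],
    x0=[1.0, 1.0],
    constraints={"type": "ineq", "fun": lambda x: end_processor(x)[1]},
    bounds=[(0.1, 10.0)] * 2,
)
print(result.x)  # converges toward the minimum-mass feasible design (0.5, 0.5)
```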
NAS-current status and future plans
NASA Technical Reports Server (NTRS)
Bailey, F. R.
1987-01-01
The Numerical Aerodynamic Simulation (NAS) has met its first major milestone, the NAS Processing System Network (NPSN) Initial Operating Configuration (IOC). The program has met its goal of providing a national supercomputer facility capable of greatly enhancing the Nation's research and development efforts. Furthermore, the program is fulfilling its pathfinder role by defining and implementing a paradigm for supercomputing system environments. The IOC is only the beginning, and the NAS Program will aggressively continue to develop and implement emerging supercomputer, communications, storage, and software technologies to strengthen computations as a critical element in supporting the Nation's leadership role in aeronautics.
Brown, C Hendricks; Mohr, David C; Gallo, Carlos G; Mader, Christopher; Palinkas, Lawrence; Wingood, Gina; Prado, Guillermo; Kellam, Sheppard G; Pantin, Hilda; Poduska, Jeanne; Gibbons, Robert; McManus, John; Ogihara, Mitsunori; Valente, Thomas; Wulczyn, Fred; Czaja, Sara; Sutcliffe, Geoff; Villamar, Juan; Jacobs, Christopher
2013-06-01
African Americans and Hispanics in the United States have much higher rates of HIV than non-minorities. There is now strong evidence that a range of behavioral interventions are efficacious in reducing sexual risk behavior in these populations. Although a handful of these programs are just beginning to be disseminated widely, we still have not implemented effective programs to a level that would reduce the population incidence of HIV for minorities. We proposed that innovative approaches involving computational technologies be explored for their use in both developing new interventions and in supporting wide-scale implementation of effective behavioral interventions. Mobile technologies have a place in both of these activities. First, mobile technologies can be used to sense context and to intervene according to the unique preferences and needs of individuals at times when intervention to reduce risk would be most impactful. Second, mobile technologies can be used to improve the delivery of interventions by facilitators and their agencies. Systems science methods including social network analysis, agent-based models, computational linguistics, intelligent data analysis, and systems and software engineering all have strategic roles that can bring about advances in HIV prevention in minority communities. Using an existing mobile technology for depression and 3 effective HIV prevention programs, we illustrated how 8 areas in the intervention/implementation process can use innovative computational approaches to advance intervention adoption, fidelity, and sustainability.
NASA Technical Reports Server (NTRS)
Vallee, J.; Wilson, T.
1976-01-01
Results are reported of the first experiments for a computer conference management information system at the National Aeronautics and Space Administration. Between August 1975 and March 1976, two NASA projects with geographically separated participants (NASA scientists) used the PLANET computer conferencing system for portions of their work. The first project was a technology assessment of future transportation systems. The second project involved experiments with the Communication Technology Satellite. As part of this project, pre- and postlaunch operations were discussed in a computer conference. These conferences also provided the context for an analysis of the cost of computer conferencing. In particular, six cost components were identified: (1) terminal equipment, (2) communication with a network port, (3) network connection, (4) computer utilization, (5) data storage and (6) administrative overhead.
The role of dedicated data computing centers in the age of cloud computing
NASA Astrophysics Data System (ADS)
Caramarcu, Costin; Hollowell, Christopher; Strecker-Kellogg, William; Wong, Antonio; Zaytsev, Alexandr
2017-10-01
Brookhaven National Laboratory (BNL) anticipates significant growth in scientific programs with large computing and data storage needs in the near future and has recently reorganized support for scientific computing to meet these needs. A key component is the enhanced role of the RHIC-ATLAS Computing Facility (RACF) in support of high-throughput and high-performance computing (HTC and HPC) at BNL. This presentation discusses the evolving role of the RACF at BNL, in light of its growing portfolio of responsibilities and its increasing integration with cloud (academic and for-profit) computing activities. We also discuss BNL’s plan to build a new computing center to support the new responsibilities of the RACF and present a summary of the cost benefit analysis done, including the types of computing activities that benefit most from a local data center vs. cloud computing. This analysis is partly based on an updated cost comparison of Amazon EC2 computing services and the RACF, which was originally conducted in 2012.
Short, C E; James, E L; Rebar, A L; Duncan, M J; Courneya, K S; Plotnikoff, R C; Crutzen, R; Bidargaddi, N; Vandelanotte, C
2017-11-01
Participating in regular physical activity is a recommended cancer recovery strategy for breast cancer survivors. However, tailored support services are not widely available and most survivors are insufficiently active to obtain health benefits. Delivering tailored programs via the Internet offers one promising approach. However, recent evaluations of such programs suggest that major improvements are needed to ensure programs meet the needs of users and are delivered in an engaging way. Understanding participants' experiences with current programs can help to inform the next generation of systems. The purposes of this study are to explore breast cancer survivors' perspectives of and experiences using a novel computer-tailored intervention and to describe recommendations for future iterations. Qualitative data from a sub-sample of iMove More for Life study participants were analysed thematically to identify key themes. Participants' long-term goals for participating in the program were explored by analysing open-ended data extracted from action plans completed during the intervention (n = 370). Participants' negative and positive perceptions of the website and recommendations for improvement were explored using data extracted from open-ended survey items collected at the immediate intervention follow-up (n = 156). The majority of participants reported multi-faceted goals, consisting of two or more outcomes they hoped to achieve within a year. While clear themes were identified (e.g. 'being satisfied with body weight'), there was considerable variability in the scope of the goal (e.g. desired weight loss ranged from 2 to 30 kg). Participants' perceptions of the website were mixed, but clear indications were provided of how intervention content and structure could be improved. This study provides insight into how to better accommodate breast cancer survivors in the future and ultimately design more engaging computer-tailored interventions.
Application of supercomputers to computational aerodynamics
NASA Technical Reports Server (NTRS)
Peterson, V. L.
1984-01-01
Computers are playing an increasingly important role in the field of aerodynamics such that they now serve as a major complement to wind tunnels in aerospace research and development. Factors pacing advances in computational aerodynamics are identified, including the amount of computational power required to take the next major step in the discipline. Example results obtained from the successively refined forms of the governing equations are discussed, both in the context of levels of computer power required and the degree to which they either further the frontiers of research or apply to problems of practical importance. Finally, the Numerical Aerodynamic Simulation (NAS) Program - with its 1988 target of achieving a sustained computational rate of 1 billion floating point operations per second and operating with a memory of 240 million words - is discussed in terms of its goals and its projected effect on the future of computational aerodynamics.
Study of basic computer competence among public health nurses in Taiwan.
Yang, Kuei-Feng; Yu, Shu; Lin, Ming-Sheng; Hsu, Chia-Ling
2004-03-01
Rapid advances in information technology and media have made distance learning on the Internet possible. This new model of learning allows greater efficiency and flexibility in knowledge acquisition. Since basic computer competence is a prerequisite for this new learning model, this study was conducted to examine the basic computer competence of public health nurses in Taiwan and explore factors influencing computer competence. A national cross-sectional randomized study was conducted with 329 public health nurses. A questionnaire was used to collect data and was delivered by mail. Results indicate that the basic computer competence of public health nurses in Taiwan still needs to be improved (mean = 57.57 ± 2.83; total score range, 26 to 130). Among the five most frequently used software programs, nurses were most knowledgeable about Word and least knowledgeable about PowerPoint. Stepwise multiple regression analysis revealed eight variables (weekly number of hours spent online at home, weekly amount of time spent online at work, weekly frequency of computer use at work, previous computer training, computer at workplace and Internet access, job position, education level, and age) that significantly influenced computer competence, accounting for 39.0% of the variance. In conclusion, greater computer competence, broader educational programs regarding computer technology, and a greater emphasis on computers at work are necessary to increase the usefulness of distance learning via the Internet in Taiwan. Building a user-friendly environment is important in developing this new media model of learning for the future.
Vector computer memory bank contention
NASA Technical Reports Server (NTRS)
Bailey, D. H.
1985-01-01
A number of vector supercomputers feature very large memories. Unfortunately the large capacity memory chips that are used in these computers are much slower than the fast central processing unit (CPU) circuitry. As a result, memory bank reservation times (in CPU ticks) are much longer than on previous generations of computers. A consequence of these long reservation times is that memory bank contention is sharply increased, resulting in significantly lowered performance rates. The phenomenon of memory bank contention in vector computers is analyzed using both a Markov chain model and a Monte Carlo simulation program. The results of this analysis indicate that future generations of supercomputers must either employ much faster memory chips or else feature very large numbers of independent memory banks.
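The Monte Carlo side of such an analysis is easy to sketch: issue random bank references against B banks, each busy for R ticks after an access, and measure the fraction of time lost to stalls. The parameters below are illustrative, not those of any particular machine or of the paper's model.

```python
# Toy Monte Carlo model of memory bank contention in a vector computer.
import random

def stall_fraction(banks=16, reservation=8, accesses=100_000, seed=1):
    rng = random.Random(seed)
    free_at = [0] * banks                  # tick at which each bank frees up
    tick = stalls = 0
    for _ in range(accesses):
        b = rng.randrange(banks)           # random bank reference
        wait = max(0, free_at[b] - tick)   # stall until the bank is free
        stalls += wait
        tick += wait + 1                   # issue the access
        free_at[b] = tick + reservation - 1
    return stalls / tick

# More banks (or faster chips, i.e. shorter reservations) cut the stall time.
for banks in (8, 16, 64, 256):
    print(banks, "banks ->", round(stall_fraction(banks=banks), 3), "of time stalled")
```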
Multiscale Computation. Needs and Opportunities for BER Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scheibe, Timothy D.; Smith, Jeremy C.
2015-01-01
The Environmental Molecular Sciences Laboratory (EMSL), a scientific user facility managed by Pacific Northwest National Laboratory for the U.S. Department of Energy, Office of Biological and Environmental Research (BER), conducted a one-day workshop on August 26, 2014 on the topic of “Multiscale Computation: Needs and Opportunities for BER Science.” Twenty invited participants, from various computational disciplines within the BER program research areas, were charged with the following objectives: identify BER-relevant models and their potential cross-scale linkages that could be exploited to better connect molecular-scale research to BER research at larger scales; and identify critical science directions that will motivate EMSL decisions regarding future computational (hardware and software) architectures.
NASA Technical Reports Server (NTRS)
Denning, Peter J.
1989-01-01
Two new books about intrusions and computer viruses remind us that attacks against our computers on networks are the actions of human beings. Cliff Stoll's book about the hacker who spent a year, beginning in Aug. 1986, attempting to use the Lawrence Berkeley Computer as a stepping-stone for access to military secrets is a spy thriller that illustrates the weaknesses of our password systems and the difficulties in compiling evidence against a hacker engaged in espionage. Pamela Kane's book about viruses that attack IBM PC's shows that viruses are the modern version of the old problem of a Trojan horse attack. It discusses the most famous viruses and their countermeasures, and it comes with a floppy disk of utility programs that will disinfect your PC and thwart future attack.
Designing for deeper learning in a blended computer science course for middle school students
NASA Astrophysics Data System (ADS)
Grover, Shuchi; Pea, Roy; Cooper, Stephen
2015-04-01
The focus of this research was to create and test an introductory computer science course for middle school. Titled "Foundations for Advancing Computational Thinking" (FACT), the course aims to prepare and motivate middle school learners for future engagement with algorithmic problem solving. FACT was also piloted as a seven-week course on Stanford's OpenEdX MOOC platform for blended in-class learning. Unique aspects of FACT include balanced pedagogical designs that address the cognitive, interpersonal, and intrapersonal aspects of "deeper learning"; a focus on pedagogical strategies for mediating and assessing for transfer from block-based to text-based programming; curricular materials for remedying misperceptions of computing; and "systems of assessments" (including formative and summative quizzes and tests, directed as well as open-ended programming assignments, and a transfer test) to get a comprehensive picture of students' deeper computational learning. Empirical investigations, accomplished over two iterations of a design-based research effort with students (aged 11-14 years) in a public school, sought to examine student understanding of algorithmic constructs, and how well students transferred this learning from Scratch to text-based languages. Changes in student perceptions of computing as a discipline were measured. Results and mixed-method analyses revealed that students in both studies (1) achieved substantial learning gains in algorithmic thinking skills, (2) were able to transfer their learning from Scratch to a text-based programming context, and (3) achieved significant growth toward a more mature understanding of computing as a discipline. Factor analyses of prior computing experience, multivariate regression analyses, and qualitative analyses of student projects and artifact-based interviews were conducted to better understand the factors affecting learning outcomes. Prior computing experiences (as measured by a pretest) and math ability were found to be strong predictors of learning outcomes.
1979-09-01
joint orientation and joint slippage than to failure of the intact rock mass. Dixon (1971) noted the importance of including the confining influence of... dedicated computer. The area of research not covered by this investigation which holds promise for a future study is a detailed comparison of the results of... block data, type key "W". The program writes this data on Linc tapes for future retrieval. This feature can be used to store the consolidated block
Image analysis library software development
NASA Technical Reports Server (NTRS)
Guseman, L. F., Jr.; Bryant, J.
1977-01-01
The Image Analysis Library consists of a collection of general purpose mathematical/statistical routines and special purpose data analysis/pattern recognition routines basic to the development of image analysis techniques for support of current and future Earth Resources Programs. Work was done to provide a collection of computer routines and associated documentation which form a part of the Image Analysis Library.
NASA Technical Reports Server (NTRS)
1991-01-01
Analytical Design Service Corporation, Ann Arbor, MI, used NASTRAN (a NASA Structural Analysis program that analyzes a design and predicts how parts will perform) in tests of transmissions, engine cooling systems, internal engine parts, and body components. They also use it to design future automobiles. Analytical software can save millions by allowing computer simulated analysis of performance even before prototypes are built.
NASA Technical Reports Server (NTRS)
Daly, J. K.; Torian, J. G.
1979-01-01
An overview of studies conducted to establish the requirements for advanced subsystem analytical tools is presented. Modifications are defined for updating current computer programs used to analyze environmental control, life support, and electric power supply systems so that consumables for future advanced spacecraft may be managed.
ERIC Educational Resources Information Center
Henard, Ralph E.
Possible future developments in artificial intelligence (AI) as well as its limitations are considered that have implications for institutional research in higher education, and especially decision making and decision support systems. It is noted that computer software programs have been developed that store knowledge and mimic the decision-making…
ERIC Educational Resources Information Center
Chi, Angel
2013-01-01
When Bill Gates published his book "The Road Ahead" (1995), he summarized the transformative implications of the personal computing revolution and described a future profoundly changed by the arrival of a global information super highway. Almost twenty years later, the tsunami of online programs and the MOOCs (massive online open…
Staff development and secondary science teachers: Factors that affect voluntary participation
NASA Astrophysics Data System (ADS)
Corley, Theresa Roebuck
2000-10-01
A researcher-designed survey assessed the perceptions of Alabama secondary science public school teachers toward the need for staff development and toward certain staff development strategies and programs. Factors that encouraged or discouraged attendance at voluntary staff development programs and opinions regarding effective and ineffective features of programs were identified. Data were analyzed using descriptive techniques. Percentages and frequencies were noted. Average rankings were computed for the staff development techniques considered most and least effective and for the preferred designs of future staff development offerings. Chi-square tests were computed for each of the 4 research hypotheses. Narrative discussions and tables were utilized to report the data and provide clarification. This study related demographic information to the research hypotheses. Analysis of the research hypotheses revealed that experienced teachers agree more strongly about the features of staff development programs that they consider effective and about the factors that may affect participation in staff development programs. Analysis of the research questions revealed that secondary science teachers in Alabama agree that staff development is a personal responsibility but that school systems are responsible for providing staff development opportunities. Teachers believe that staff development is needed annually in both science content and teaching strategies and favor lengthening the school year for staff development. Teachers identified interest level, graduate credit, ability to implement material, scheduling factors, and the reputation of the organizer as the most important factors in determining participation in voluntary staff development programs. Hands-on workshops were identified as the most effective type of voluntary staff development, and teachers requested that future staff development experiences include hands-on workshops, networking, curriculum development, mentoring, support groups, training trainers, cooperative learning groups, coaching, implementing changes, and collecting resources.
The Continental Margins Program in Georgia
Cocker, M.D.; Shapiro, E.A.
1999-01-01
From 1984 to 1993, the Georgia Geologic Survey (GGS) participated in the Minerals Management Service-funded Continental Margins Program. Geological and geophysical data acquisition focused on offshore stratigraphic framework studies, phosphate-bearing Miocene-age strata, distribution of heavy minerals, near-surface alternative sources of groundwater, and development of a PC-based Coastal Geographic Information System (GIS). Seven GGS publications document results of those investigations. In addition to those publications, direct benefits of the GGS's participation include an impetus to the GGS's investigations of economic minerals on the Georgia coast, establishment of a GIS that includes computer hardware and software, and seeds for additional investigations through the information and training acquired as a result of the Continental Margins Program. These additional investigations are quite varied in scope, and many were made possible because of GIS expertise gained as a result of the Continental Margins Program. Future investigations will also reap the benefits of the Continental Margins Program.
Overview of the Integrated Programs for Aerospace Vehicle Design (IPAD) project
NASA Technical Reports Server (NTRS)
Venneri, S. L.
1983-01-01
To respond to national needs for improved productivity in engineering design and manufacturing, a NASA-supported joint industry/government project denoted Integrated Programs for Aerospace Vehicle Design (IPAD) is underway. The objective is to improve engineering productivity through better use of computer technology. It focuses on development of data base management technology and associated software for integrated, company-wide management of engineering and manufacturing information. Results to date on the IPAD project include in-depth documentation of a representative design process for a large engineering project, the definition and design of computer-aided design software needed to support that process, and the release of prototype software to manage engineering information. This paper provides an overview of the IPAD project and summarizes progress to date and future plans.
NASA Astrophysics Data System (ADS)
Rogers, P. J.; Fischer, R. E.
1983-01-01
Topics considered include: optical system requirements, analysis, and system engineering; optical system design using microcomputers and minicomputers; optical design theory and computer programs; optical design methods and computer programs; optical design methods and philosophy; unconventional optical design; diffractive and gradient index optical system design; optical production and system integration; and optical systems engineering. Particular attention is given to: stray light control as an integral part of optical design; current and future directions of lens design software; thin-film technology in the design and production of optical systems; aspherical lenses in optical scanning systems; the application of volume phase holograms to avionic displays; the effect of lens defects on thermal imager performance; and a wide angle zoom for the Space Shuttle.
Study of propellant dynamics in a shuttle type launch vehicle
NASA Technical Reports Server (NTRS)
Jones, C. E.; Feng, G. C.
1972-01-01
A method and an associated digital computer program for evaluating the vibrational characteristics of large liquid-filled rigid wall tanks of general shape are presented. A solution procedure was developed in which slosh modes and frequencies are computed for systems mathematically modeled as assemblages of liquid finite elements. To retain sparsity in the assembled system mass and stiffness matrices, a compressible liquid element formulation was incorporated in the program. The approach taken in the liquid finite element formulation is compatible with triangular and quadrilateral structural finite elements so that the analysis of liquid motion can be coupled with flexible tank wall motion at some future time. The liquid element repertoire developed during the course of this study consists of a two-dimensional triangular element and a three-dimensional tetrahedral element.
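For readers wanting the computational kernel made concrete: once the liquid elements are assembled, slosh modes and frequencies follow from a generalized eigenvalue problem, K phi = omega^2 M phi, on the sparse system stiffness and mass matrices. A minimal sketch with toy 3-DOF matrices (not the program's actual liquid elements) might look like:

```python
# Generalized eigenproblem K phi = w^2 M phi, as solved after element assembly.
import numpy as np
from scipy.linalg import eigh

K = np.array([[ 2.0, -1.0,  0.0],     # toy assembled stiffness matrix
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  1.0]])
M = np.diag([1.0, 1.0, 0.5])          # toy assembled (lumped) mass matrix

w2, modes = eigh(K, M)                # eigenvalues are squared frequencies
print("modal frequencies:", np.sqrt(w2))
print("first mode shape:", modes[:, 0])
```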
Fault management for data systems
NASA Technical Reports Server (NTRS)
Boyd, Mark A.; Iverson, David L.; Patterson-Hine, F. Ann
1993-01-01
Issues related to automating the process of fault management (fault diagnosis and response) for data management systems are considered. Substantial benefits are to be gained by successful automation of this process, particularly for large, complex systems. The use of graph-based models to develop a computer assisted fault management system is advocated. The general problem is described and the motivation behind choosing graph-based models over other approaches for developing fault diagnosis computer programs is outlined. Some existing work in the area of graph-based fault diagnosis is reviewed, and a new fault management method which was developed from existing methods is offered. Our method is applied to an automatic telescope system intended as a prototype for future lunar telescope programs. Finally, an application of our method to general data management systems is described.
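One simple flavor of graph-based diagnosis, sketched here with an invented propagation graph and component names (the paper's own method is more elaborate), treats candidate root causes as nodes from which every observed symptom is reachable:

```python
# Minimal graph-based fault diagnosis: find components whose failure could
# explain all observed symptoms via propagation edges (graph is hypothetical).
propagates_to = {
    "power_supply": ["bus_A", "bus_B"],
    "bus_A": ["sensor_1"],
    "bus_B": ["sensor_2", "recorder"],
    "sensor_1": [], "sensor_2": [], "recorder": [],
}

def reachable(start):
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.add(node)
            stack.extend(propagates_to.get(node, []))
    return seen

def candidate_faults(symptoms):
    # A component is a candidate if every symptom lies downstream of it.
    return [c for c in propagates_to if set(symptoms) <= reachable(c)]

print(candidate_faults({"sensor_2", "recorder"}))  # -> ['power_supply', 'bus_B']
```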
Physics through the 1990s: Scientific interfaces and technological applications
NASA Technical Reports Server (NTRS)
1986-01-01
The volume examines the scientific interfaces and technological applications of physics. Twelve areas are dealt with: biological physics-biophysics, the brain, and theoretical biology; the physics-chemistry interface-instrumentation, surfaces, neutron and synchrotron radiation, polymers, organic electronic materials; materials science; geophysics-tectonics, the atmosphere and oceans, planets, drilling and seismic exploration, and remote sensing; computational physics-complex systems and applications in basic research; mathematics-field theory and chaos; microelectronics-integrated circuits, miniaturization, future trends; optical information technologies-fiber optics and photonics; instrumentation; physics applications to energy needs and the environment; national security-devices, weapons, and arms control; medical physics-radiology, ultrasonics, NMR, and photonics. An executive summary and many chapters contain recommendations regarding funding, education, industry participation, small-group university research and large facility programs, government agency programs, and computer database needs.
Wise, Meg; Marchand, Lucille; Cleary, James F.; Aeschlimann, Elizabeth; Causier, Daniel
2012-01-01
We describe an online narrative and life review education program for cancer patients and the results of a small implementation test to inform future directions for further program development and full-scale evaluation research. The intervention combined three types of psycho-oncology narrative interventions that have been shown to help patients address emotional and existential issues: 1) a physician-led dignity-enhancing telephone interview to elicit the life narrative and delivery of an edited life manuscript, 2) life review education, delivered via 3) a website with self-directed instructional materials and expert consultation to help people revise and share their story. Eleven cancer patients tested the intervention and provided feedback in an in-depth exit interview. While everyone said telling and receiving the edited story manuscript was helpful and meaningful, only people with high death salience and prior computer experience used the web tools to enhance and share their story. Computer users prodded us to provide more sophisticated tools, and older (>70 years) users needed more staff and family support. We conclude that combining a telephone expert-led interview with online life review education can extend access to integrative oncology services, is most feasible for computer-savvy patients with advanced cancer, and must use platforms that allow patients to upload files and invite their social network. PMID:19476731
The GI Project: a prototype electronic textbook for high school biology.
Calhoun, P S; Fishman, E K
1997-01-01
A prototype electronic science textbook for secondary education was developed to help bridge the gap between state-of-the-art medical technology and the basic science classroom. The prototype combines the latest in radiologic imaging techniques with a user-friendly multimedia computer program to teach the anatomy, physiology, and diseases of the gastrointestinal (GI) tract. The program includes original text, illustrations, photographs, animations, images from upper GI studies, plain radiographs, computed tomographic images, and three-dimensional reconstructions. These features are intended to create a stimulus-rich environment in which the high school science student can enjoy a variety of interactive experiences that will facilitate the learning process. The computer-based book is a new educational tool that promises to play a prominent role in the coming years. Although it is not yet clear what form textbooks will take in the future, computer-based books are already proving valuable as an alternative educational medium. For beginning students, they reinforce the material found in traditional textbooks and class presentations; for advanced students, they provide motivation to learn outside the traditional classroom.
A programmable five qubit quantum computer using trapped atomic ions
NASA Astrophysics Data System (ADS)
Debnath, Shantanu
2017-04-01
In order to harness the power of quantum information processing, several candidate systems have been investigated and tailored to demonstrate only specific computations. In my thesis work, we construct a general-purpose multi-qubit device using a linear chain of trapped ion qubits, which in principle can be programmed to run any quantum algorithm. To achieve such flexibility, we develop a pulse shaping technique to realize a set of fully connected two-qubit rotations that entangle arbitrary pairs of qubits using multiple motional modes of the chain. Following a modular computation architecture, such highly expressive two-qubit gates along with arbitrary single-qubit rotations can be used to compile universal logic gates that are effected by targeted optical fields and hence can be reconfigured according to any algorithm circuit programmed in the software. As a demonstration, we run the Deutsch-Jozsa and Bernstein-Vazirani algorithms, and a fully coherent quantum Fourier transform, which we use to solve the 'period finding' and 'quantum phase estimation' problems. Combining these results with recent demonstrations of quantum fault-tolerance, Grover's search algorithm, and simulation of boson hopping establishes the versatility of such a computation module that can potentially be connected to other modules for future large-scale computations.
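The Bernstein-Vazirani algorithm mentioned above is small enough to simulate classically with a state vector, which makes its one-query structure concrete. The sketch below is plain numpy with an arbitrarily chosen hidden string; it has no connection to the trapped-ion implementation itself.

```python
# State-vector simulation of Bernstein-Vazirani: recover the hidden string a
# from a single oracle query. Pure numpy; unrelated to the ion-trap hardware.
import numpy as np

n, a = 4, 0b1011                      # hidden 4-bit string (arbitrary choice)
N = 2 ** n

state = np.ones(N) / np.sqrt(N)       # H^n |0...0>: uniform superposition
for x in range(N):                    # phase oracle applies (-1)^(a.x)
    if bin(a & x).count("1") % 2:
        state[x] *= -1.0

H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)
Hn = np.array([[1.0]])
for _ in range(n):                    # build the n-qubit Hadamard transform
    Hn = np.kron(Hn, H)

amplitudes = Hn @ state               # final Hadamard layer
print("measured:", bin(int(np.argmax(np.abs(amplitudes)))))  # -> 0b1011
```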
NASA's 3D Flight Computer for Space Applications
NASA Technical Reports Server (NTRS)
Alkalai, Leon
2000-01-01
The New Millennium Program (NMP) Integrated Product Development Team (IPDT) for Microelectronics Systems was planning to validate a newly developed 3D Flight Computer system on its first deep-space flight, DS1, launched in October 1998. This computer, developed in the 1995-97 time frame, contains many new computer technologies previously never used in deep-space systems. They include: an advanced 3D packaging architecture for future low-mass and low-volume avionics systems; high-density 3D packaged chip-stacks for both volatile and non-volatile mass memory: 400 Mbytes of local DRAM memory and 128 Mbytes of Flash memory; a high-bandwidth Peripheral Component Interconnect (PCI) local bus with a bridge to VME; a high-bandwidth (20 Mbps) fiber-optic serial bus; and other attributes, such as standard support for Design for Testability (DFT). Even though this computer system was not completed in time for delivery to the DS1 project, it was an important development along a technology roadmap toward highly integrated and highly miniaturized avionics systems for deep-space applications. This continued technology development is now being performed by NASA's Deep Space System Development Program (also known as X2000) and within JPL's Center for Integrated Space Microsystems (CISM).
Modeling climate change impacts on water trading.
Luo, Bin; Maqsood, Imran; Gong, Yazhen
2010-04-01
This paper presents a new method of evaluating the impacts of climate change on the long-term performance of water trading programs, through designing an indicator to measure the mean of periodic water volume that can be released by trading through a water-use system. The indicator is computed with a stochastic optimization model which can reflect the random uncertainty of water availability. The developed method was demonstrated in the Swift Current Creek watershed of Prairie Canada under two future scenarios simulated by a Canadian Regional Climate Model, in which total water availabilities under future scenarios were estimated using a monthly water balance model. Frequency analysis was performed to obtain the best probability distributions for both observed and simulated water quantity data. Results from the case study indicate that the performance of a trading system is highly scenario-dependent in future climate, with trading effectiveness highly optimistic or undesirable under different future scenarios. Trading effectiveness also largely depends on trading costs, with high costs resulting in failure of the trading program. (c) 2010 Elsevier B.V. All rights reserved.
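The indicator itself, the mean periodic water volume that trading can release, can be estimated by simple Monte Carlo once a supply distribution is assumed. The sketch below uses made-up allocation shares, demands, and a Gaussian supply; it is not the watershed data or the stochastic optimization model of the paper.

```python
# Toy Monte Carlo estimate of the mean water volume released by trading:
# each period, users whose allocation exceeds demand sell to users in deficit.
# Shares, demands, and the supply distribution are illustrative only.
import random

rng = random.Random(0)
demands = (40.0, 35.0, 25.0)
shares = (0.5, 0.3, 0.2)                     # allocation rule != demand pattern

def traded_volume(supply):
    alloc = [s * supply for s in shares]
    surplus = sum(max(0.0, a - d) for a, d in zip(alloc, demands))
    deficit = sum(max(0.0, d - a) for a, d in zip(alloc, demands))
    return min(surplus, deficit)             # volume that trading can reallocate

volumes = [traded_volume(rng.gauss(100.0, 25.0)) for _ in range(50_000)]
print("mean tradable volume per period:", round(sum(volumes) / len(volumes), 2))
```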
[Activities of Research Institute for Advanced Computer Science
NASA Technical Reports Server (NTRS)
Gross, Anthony R. (Technical Monitor); Leiner, Barry M.
2001-01-01
The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center, Moffett Field, California. RIACS research focuses on the three cornerstones of IT research necessary to meet the future challenges of NASA missions: 1. Automated Reasoning for Autonomous Systems: Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. 2. Human-Centered Computing: Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. 3. High Performance Computing and Networking: Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to analysis of large scientific datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply IT research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, visiting scientist programs and student summer programs, designed to encourage and facilitate collaboration between the university and NASA IT research communities.
Ferguson, Melanie; Henshaw, Helen
2015-09-01
The aim of this research forum article was to examine accessibility, use, and adherence to computerized and online interventions for people with hearing loss. Four intervention studies of people with hearing loss were examined: 2 auditory training studies, 1 working memory training study, and 1 study of multimedia educational support. A small proportion (approximately 15%) of participants had never used a computer, which may be a barrier to the accessibility of computer and Internet-based interventions. Computer competence was not a factor in intervention use or adherence. Computer skills and Internet access influenced participant preference for the delivery method of the multimedia educational support program. It is important to be aware of current barriers to computer and Internet-delivered interventions for people with hearing loss. However, there is a clear need to develop and future-proof hearing-related applications for online delivery.
Applied Computational Fluid Dynamics at NASA Ames Research Center
NASA Technical Reports Server (NTRS)
Holst, Terry L.; Kwak, Dochan (Technical Monitor)
1994-01-01
The field of Computational Fluid Dynamics (CFD) has advanced to the point where it can now be used for many applications in fluid mechanics research and aerospace vehicle design. A few applications being explored at NASA Ames Research Center will be presented and discussed. The examples presented will range in speed from hypersonic to low speed incompressible flow applications. Most of the results will be from numerical solutions of the Navier-Stokes or Euler equations in three space dimensions for general geometry applications. Computational results will be used to highlight the presentation as appropriate. Advances in computational facilities including those associated with NASA's CAS (Computational Aerosciences) Project of the Federal HPCC (High Performance Computing and Communications) Program will be discussed. Finally, opportunities for future research will be presented and discussed. All material will be taken from non-sensitive, previously-published and widely-disseminated work.
NASA Technical Reports Server (NTRS)
Miller, R. D.; Anderson, L. R.
1979-01-01
The LOADS program L218, a digital computer program that calculates dynamic load coefficient matrices utilizing the force summation method, is described. The load equations are derived for a flight vehicle in straight and level flight and excited by gusts and/or control motions. In addition, sensor equations are calculated for use with an active control system. The load coefficient matrices are calculated for the following types of loads: translational and rotational accelerations, velocities, and displacements; panel aerodynamic forces; net panel forces; shears and moments. Program usage and a brief description of the analysis used are presented. A description of the design and structure of the program to aid those who will maintain and/or modify the program in the future is included.
A meta-analysis of pedagogical tools used in introductory programming courses
NASA Astrophysics Data System (ADS)
Trees, Frances P.
Programming is recognized as being challenging for teachers to teach and difficult for students to learn. For decades, computer science educators have looked at innovative approaches by creating pedagogical software tools that attempt to facilitate both the teaching of and the learning of programming. This dissertation investigates the motivations for the integration of pedagogical tools in introductory programming courses and the characteristics that are perceived to contribute to the effectiveness of these tools. The study employs three research stages that examine the tool characteristics and their use. The first stage surveys teachers who use pedagogical tools in an introductory programming course. The second interviews teachers to explore the survey results in more detail and to add greater depth into the choice and use of pedagogical tools in the introductory programming class. The third interviews tool developers to provide an explanatory insight of the tool and the motivation for its creation. The results indicate that the pedagogical tools perceived to be effective share common characteristics: They provide an environment that is manageable, flexible and visual; they provide for active engagement in learning activities and support programming in small pieces; they allow for an easy transition to subsequent courses and more robust environments; they provide technical support and resource materials. The results of this study also indicate that recommendations from other computer science educators have a strong impact on a teacher's initial tool choice for an introductory programming course. This study informs present and future tool developers of the characteristics that the teachers perceive to contribute to the effectiveness of a pedagogical tool and how to present their tools to encourage a more efficient and more effective widespread adoption of the tool into the teacher's curriculum. The teachers involved in this study are actively involved in the computer science education community. The results of this study, based on the perceptions of these computer science educators, provide guidance to those educators choosing to introduce a new pedagogical tool into their programming course.
Orbach, Ron; Willner, Bilha; Willner, Itamar
2015-03-11
This feature article addresses the implementation of catalytic nucleic acids as functional units for the construction of logic gates and computing circuits, and discusses the future applications of these systems. The assembly of computational modules composed of DNAzymes has led to the operation of a universal set of logic gates, to field programmable logic gates and computing circuits, to the development of multiplexers/demultiplexers, and to full-adder systems. Also, DNAzyme cascades operating as logic gates and computing circuits were demonstrated. DNAzyme logic systems find important practical applications. These include the use of DNAzyme-based systems for sensing and multiplexed analyses, for the development of controlled release and drug delivery systems, for regulating intracellular biosynthetic pathways, and for the programmed synthesis and operation of cascades.
Quality assurance for respiratory care services: a computer-assisted program.
Elliott, C G
1993-01-01
At present, the principal advantage of computer-assisted quality assurance is the acquisition of quality assurance data without resource-consuming chart reviews. A surveillance program like the medical director's alert may reduce morbidity and mortality. Previous research suggests that inadequate oxygen therapy or failures in airway management are important causes of preventable deaths in hospitals. Furthermore, preventable deaths tend to occur among patients who have lower severity-of-illness scores and who are not in ICUs. Thus, surveillance of the entire hospital, as performed by the HIS medical director's alert, may significantly impact hospital mortality related to respiratory care. Future research should critically examine the potential of such computerized systems to favorably change the morbidity and mortality of hospitalized patients. The departments of respiratory care and medical informatics at LDS Hospital have developed a computer-assisted approach to quality assurance monitoring of respiratory care services. This system provides frequent and consistent samples of a variety of respiratory care data. The immediate needs of patients are addressed through a daily surveillance system (medical director's alert). The departmental quality assurance program utilizes a separate program that monitors clinical indicators of staff performance in terms of stated departmental policies and procedures (rate-based clinical indicators). The availability of an integrated patient database allows these functions to be performed without labor-intensive chart audits.
Future Directions for Postdoctoral Training in Cancer Prevention: Insights from a Panel of Experts
Nelson, David E.; Faupel-Badger, Jessica; Phillips, Siobhan; Belcher, Britni; Chang, Shine; Abrams, David B.; Kramer, Barnett S.; White, Mary C.; O’Malley, Michael; Varanasi, Arti P.; Fabian, Carol J.; Wiest, Jonathan S.; Colditz, Graham A.; Hall, Kara; Shields, Peter G.; Weitzel, Jeffrey N.
2014-01-01
Cancer prevention postdoctoral fellowships have existed since the 1970s. The National Cancer Institute facilitated a meeting by a panel of experts in April 2013 to consider four important topics for future directions for cancer prevention postdoctoral training programs: 1) future research needs; 2) underrepresented disciplines; 3) curriculum; and 4) career preparation. Panelists proffered several areas needing more research or emphasis, ranging from computational science to culture. Health care providers, along with persons from non-traditional disciplines such as engineers and lawyers, were among disciplines recognized as being underrepresented in training programs. Curriculum suggestions were that fellows receive training in topics such as leadership and human relations, in addition to learning the principles of epidemiology, cancer biological mechanisms, and behavioral science. For career preparation, there was a clear recognition of the diversity of employment options available besides academic positions, and that program leaders should do more to help fellows identify and prepare for different career paths. The major topics and strategies covered at this meeting can help form the basis for cancer prevention training program leaders to consider modifications or new directions, and keep them current with the changing scientific and employment climate for doctoral degree recipients and postdoctoral fellows. PMID:24604827
Optical reversible programmable Boolean logic unit.
Chattopadhyay, Tanay
2012-07-20
Computing with reversibility is the only way to avoid the dissipation of energy associated with bit erasure. A reversible microprocessor is therefore required for future computing. In this paper, a design of a simple all-optical reversible programmable processor is proposed using a polarizing beam splitter, liquid crystal-phase spatial light modulators, a half-wave plate, and plane mirrors. This circuit can perform 16 logical operations according to three programming inputs. Also, inputs can be easily recovered from the outputs. It is named the "reversible programmable Boolean logic unit (RPBLU)." The logic unit is the basic building block of many complex computational operations; hence the design is important. Two orthogonally polarized lights are defined here as the two logical states, respectively.
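The two key properties claimed, programmability (one unit realizing all 16 two-input Boolean operations) and reversibility (inputs recoverable from outputs), can be captured in a bit-level sketch. Note that the optical RPBLU selects its 16 operations with three optical programming inputs; the sketch below instead selects by a 4-bit truth-table code and gains reversibility by XOR-ing the result onto an ancilla, Feynman-gate style, so it illustrates the idea rather than the optical design.

```python
# Bit-level sketch of a programmable Boolean unit made reversible by keeping
# the inputs intact and XOR-ing the result onto an ancilla bit.
from itertools import product

def rpblu(table, x, y, ancilla=0):
    """Reversible map: (table, x, y, ancilla) -> (table, x, y, ancilla ^ f(x, y)).

    `table` is the 4-bit truth-table code of the selected operation,
    e.g. 0b1000 = AND, 0b1110 = OR, 0b0110 = XOR.
    """
    f = (table >> (2 * x + y)) & 1
    return table, x, y, ancilla ^ f

# Reversibility check: the map is its own inverse (applying it twice restores
# the ancilla), so no input information is erased.
for table, x, y, anc in product(range(16), (0, 1), (0, 1), (0, 1)):
    once = rpblu(table, x, y, anc)
    assert rpblu(*once) == (table, x, y, anc)
print("all 16 operations computed reversibly")
```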
NASA Technical Reports Server (NTRS)
Escher, William J. D.; Herr, Paul N.; Stephenson, Frank W., Jr.
1990-01-01
NASA's Civil Space Technology Initiative encompasses among its major elements the Earth-to-Orbit Propulsion Program (ETOPP) for future launch vehicles, which is budgeted to the extent of $20-30 million/year for the development of essential technologies. ETOPP technologies include, in addition to advanced materials and processes and design/analysis computational tools, the advanced systems-synthesis technologies required for definition of highly reliable LH2 and hydrocarbon fueled rocket engines to be operated at significantly reduced levels of risk and cost relative to the SSME. Attention is given to the technology-transfer services of ETOPP.
The Julia programming language: the future of scientific computing
NASA Astrophysics Data System (ADS)
Gibson, John
2017-11-01
Julia is an innovative new open-source programming language for high-level, high-performance numerical computing. Julia combines the general-purpose breadth and extensibility of Python, the ease-of-use and numeric focus of Matlab, the speed of C and Fortran, and the metaprogramming power of Lisp. Julia uses type inference and just-in-time compilation to compile high-level user code to machine code on the fly. A rich set of numeric types and extensive numerical libraries are built-in. As a result, Julia is competitive with Matlab for interactive graphical exploration and with C and Fortran for high-performance computing. This talk interactively demonstrates Julia's numerical features and benchmarks Julia against C, C++, Fortran, Matlab, and Python on a spectral time-stepping algorithm for a 1d nonlinear partial differential equation. The Julia code is nearly as compact as Matlab and nearly as fast as Fortran. This material is based upon work supported by the National Science Foundation under Grant No. 1554149.
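For a feel for the benchmark, the following is a minimal pseudo-spectral time-stepper for a representative 1d nonlinear PDE (viscous Burgers, u_t + u u_x = nu u_xx), written in numpy. The talk's actual benchmark code is in Julia and its equation and scheme may differ; everything here is illustrative.

```python
# Minimal pseudo-spectral time-stepper for 1d viscous Burgers' equation,
# using an integrating factor for the linear term and explicit Euler for
# the nonlinear term. Parameters are illustrative.
import numpy as np

N, nu, dt, steps = 256, 0.05, 1e-3, 2000
x = 2 * np.pi * np.arange(N) / N
k = 1j * np.fft.fftfreq(N, d=1.0 / N)           # spectral derivative operator
decay = np.exp(nu * (k ** 2).real * dt)         # exact integrating factor

u = np.sin(x)                                   # initial condition
for _ in range(steps):
    u_hat = np.fft.fft(u)
    u_x = np.real(np.fft.ifft(k * u_hat))       # spectral derivative u_x
    nonlinear = np.fft.fft(u * u_x)             # transform of u * u_x
    u_hat = decay * (u_hat - dt * nonlinear)    # integrating-factor Euler step
    u = np.real(np.fft.ifft(u_hat))

print("max |u| after integration:", round(np.abs(u).max(), 4))
```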
Performance of a parallel code for the Euler equations on hypercube computers
NASA Technical Reports Server (NTRS)
Barszcz, Eric; Chan, Tony F.; Jesperson, Dennis C.; Tuminaro, Raymond S.
1990-01-01
The performance of hypercubes was evaluated on a computational fluid dynamics problem, with consideration given to the parallel-environment issues that must be addressed, such as algorithm changes, implementation choices, programming effort, and programming environment. The evaluation focuses on a widely used fluid dynamics code, FLO52, which solves the two-dimensional steady Euler equations describing flow around an airfoil. The code development experience is described, including interacting with the operating system, utilizing the message-passing communication system, and code modifications necessary to increase parallel efficiency. Results from two hypercube parallel computers (a 16-node iPSC/2 and a 512-node NCUBE/ten) are discussed and compared. In addition, a mathematical model of the execution time was developed as a function of several machine and algorithm parameters. This model accurately predicts the actual run times obtained and is used to explore the performance of the code in interesting yet physically realizable regions of the parameter space. Based on this model, predictions about future hypercubes are made.
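A model of the kind described typically decomposes per-run cost into computation that shrinks with the processor count and communication that grows with it. The sketch below shows the shape of such a model only; the constants and the sqrt(N/P) boundary-exchange term are placeholder assumptions, not the fitted values from the FLO52 study.

```python
# Schematic execution-time model for a hypercube: T(P) = compute + communicate.
# Constants are placeholders, not the fitted values from the FLO52 study.
import math

def exec_time(grid_points, P, t_flop=1e-7, flops_per_pt=500,
              t_msg=1e-4, t_byte=1e-6, bytes_per_boundary_pt=8):
    compute = grid_points * flops_per_pt * t_flop / P          # scales as 1/P
    boundary = math.sqrt(grid_points / P)                      # 2d decomposition
    communicate = math.log2(P) * (t_msg + boundary * bytes_per_boundary_pt * t_byte)
    return compute + communicate

# Speedup saturates once communication overtakes the shrinking compute share.
for P in (1, 16, 64, 512):
    print(f"P={P:4d}  T={exec_time(160_000, P):.4f} s")
```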
[Computer simulation of a clinical magnetic resonance tomography scanner for training purposes].
Hackländer, T; Mertens, H; Cramer, B M
2004-08-01
The idea for this project was born of the necessity of offering medical students an easy approach to the theoretical basics of magnetic resonance imaging. The aim was to simulate the features and functions of such a scanner on a commercially available computer by means of a computer program. The simulation was programmed in pure Java under the GNU General Public License and is freely available for any commercially available computer with a Windows, Macintosh, or Linux operating system. The graphical user interface is modeled on a real scanner. In an external program, parameter images for the proton density and the relaxation times T1 and T2 are calculated on the basis of clinical examinations. From these, the image calculation is carried out in the simulation program pixel by pixel, on the basis of a pulse sequence chosen and modified by the user. The images can be stored and printed. In addition, it is possible to display and modify k-space images. Seven classes of pulse sequences are implemented, and up to 14 relevant sequence parameters, such as repetition time and echo time, can be altered. Aliasing and motion artifacts can be simulated. As the image calculation takes only a few seconds, interactive working is possible. The simulation has been used in university education for more than a year, successfully illustrating the dependence of MR images on the measurement parameters. This should facilitate students' understanding of MR imaging in the future.
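The article does not list its signal equations; assuming the standard spin-echo model S = PD(1 - e^(-TR/T1)) e^(-TE/T2), a pixel-wise calculation of the kind the simulator performs looks roughly like the sketch below. The parameter maps and sequence settings are illustrative, not taken from the article.

```python
import numpy as np

# Pixel-wise spin-echo simulation, assuming the standard signal model
# S = PD * (1 - exp(-TR/T1)) * exp(-TE/T2).  Parameter maps and sequence
# parameters are illustrative, not the article's.
def spin_echo(pd, t1, t2, tr=500.0, te=15.0):
    """pd, t1, t2: 2-d parameter maps (t1, t2 in ms); returns the image."""
    return pd * (1.0 - np.exp(-tr / t1)) * np.exp(-te / t2)

# toy 2x2 "patient": white-matter-, gray-matter-, CSF-, fat-like values
pd = np.array([[0.7, 0.8], [1.0, 0.9]])
t1 = np.array([[600.0, 950.0], [4200.0, 250.0]])   # ms
t2 = np.array([[80.0, 100.0], [2000.0, 60.0]])     # ms
print("T1-weighted (short TR/TE):\n", spin_echo(pd, t1, t2, tr=500, te=15))
print("T2-weighted (long TR/TE):\n", spin_echo(pd, t1, t2, tr=4000, te=100))
```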
METal matrix composite ANalyzer (METCAN): Theoretical manual
NASA Technical Reports Server (NTRS)
Murthy, P. L. N.; Chamis, C. C.
1993-01-01
This manuscript is intended to be a companion volume to the 'METCAN User's Manual' and the 'METCAN Demonstration Manual.' The primary purpose of the manual is to give details pertaining to the micromechanics and macromechanics equations of high-temperature metal matrix composites that are programmed in the METCAN computer code. The subroutines which contain the programmed equations are also mentioned in order to facilitate any future changes or modifications that the user may intend to incorporate in the code. Assumptions and derivations leading to the micromechanics equations are briefly mentioned.
An "Intelligent" Optical Design Program
NASA Astrophysics Data System (ADS)
Bohachevsky, I. O.; Viswanathan, V. K.; Woodfin, G.
1984-06-01
Described is a general approach to the development of computer programs capable of designing image-forming optical systems without human intervention and of improving their performance with repeated attempts. The approach utilizes two ideas: 1) interpretation of technical design as a mapping in the configuration space of technical characteristics, and 2) development of an "intelligent" routine that recognizes global optima. Examples of lens systems designed and used in the development of the general approach are presented, the current status of the project is summarized, and plans for future efforts are indicated.
Product Operations Status Summary Metrics
NASA Technical Reports Server (NTRS)
Takagi, Atsuya; Toole, Nicholas
2010-01-01
The Product Operations Status Summary Metrics (POSSUM) computer program provides a readable view into the state of the Phoenix Operations Product Generation Subsystem (OPGS) data pipeline. POSSUM provides a user interface that can search the data store, collect product metadata, and display the results in an easily readable layout. It was designed with flexibility in mind to support future missions. Flexibility over various data store hierarchies is provided through the disk-searching facilities of Marsviewer, a proven program that has been in operational use since the first day of the Phoenix mission.
Computational tools for multi-linked flexible structures
NASA Technical Reports Server (NTRS)
Lee, Gordon K. F.; Brubaker, Thomas A.; Shults, James R.
1990-01-01
A software module which designs and tests controllers and filters in Kalman estimator form, based on a polynomial state-space model, is discussed. The user-friendly program employs an interactive graphics approach to simplify the design process. A variety of input methods are provided to test the effectiveness of the estimator. Utilities are provided which address important issues in filter design such as graphical analysis, statistical analysis, and calculation time. The program also provides the user with the ability to save filter parameters, inputs, and outputs for future use.
NASA Astrophysics Data System (ADS)
McCray, Wilmon Wil L., Jr.
The research was prompted by the need for a study assessing the process improvement, quality management, and analytical techniques taught to students in undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs at U.S. colleges and universities, techniques that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, and process improvement methods, and with how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI®) models as process improvement frameworks for improving business process performance. High-maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques, and of process-performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The research study identifies and discusses in detail the gap-analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis identifying the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrate the use of statistical methods, statistical process control, sensitivity analysis, and quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
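As a rough illustration of the kind of Monte Carlo prediction described (not the dissertation's actual model), the sketch below treats a satisfaction index as a weighted sum of uncertain driver scores and derives a baseline and prediction interval. All weights, driver names, and distributions are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical ACSI-style model: the index is a weighted sum of driver
# scores, each uncertain.  Weights and distributions are illustrative
# placeholders, not the dissertation's model.
weights = {"perceived_quality": 0.5, "expectations": 0.2, "perceived_value": 0.3}
drivers = {                     # (mean, std) of each driver on a 0-100 scale
    "perceived_quality": (82, 4),
    "expectations": (75, 6),
    "perceived_value": (78, 5),
}

n = 100_000                     # Monte Carlo trials
index = sum(w * rng.normal(*drivers[name], size=n)
            for name, w in weights.items())

print(f"baseline (mean) index  : {index.mean():.1f}")
print(f"90% prediction interval: {np.percentile(index, 5):.1f} "
      f"to {np.percentile(index, 95):.1f}")
```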
Passive TCP Reconstruction and Forensic Analysis with tcpflow
2013-09-01
The original tcpflow was developed at a time when computers had relatively small memories and, as a result...
ERIC Educational Resources Information Center
Hoang, Kiem, Ed.; Tran, Van Hao, Ed.; Luu, Tien Hiep, Ed.; Phan, Viet Hoang, Ed.; Owens, Thomas, Ed.; Nguyen, Son Thanh, Ed.; Vuong, Son Thanh, Ed.; Dong Thi, Bich Thuy, Ed.; Phan Thi, Tuoi, Ed.
This proceedings volume includes the following 29 papers: Session 1--(1) "Technology for Learning: The Present and Future in the United States" (Thomas Owens, Carolyn Cohen); (2) "Computer Systems Technology Programs at the British Columbia Institute of Technology (Canada). A Technology-Based Model for Information Technology"…
Computer Forensics Education - the Open Source Approach
NASA Astrophysics Data System (ADS)
Huebner, Ewa; Bem, Derek; Cheung, Hon
In this chapter we discuss the application of open source software tools in computer forensics education at the tertiary level. We argue that open source tools are more suitable than commercial tools, as they provide the opportunity for students to gain an in-depth understanding and appreciation of the computer forensic process, as opposed to familiarity with one software product, however complex and multi-functional. With access to all source programs, the students become more than just consumers of the tools as future forensic investigators. They can also examine the code, understand the relationship between the binary images and relevant data structures, and in the process gain the necessary background to become the future creators of new and improved forensic software tools. As a case study we present an advanced subject, Computer Forensics Workshop, which we designed for the Bachelor's degree in computer science at the University of Western Sydney. We based all laboratory work and the main take-home project in this subject on open source software tools. We found that, without exception, more than one suitable tool can be found to cover each topic in the curriculum adequately. We argue that this approach prepares students better for forensic fieldwork, as they gain confidence to use a variety of tools, not just a single product they are familiar with.
Cloudgene: A graphical execution platform for MapReduce programs on private and public clouds
2012-01-01
Background: The MapReduce framework enables scalable processing and analysis of large datasets by distributing the computational load over connected computer nodes, referred to as a cluster. In bioinformatics, MapReduce has already been adopted for various scenarios such as mapping next-generation sequencing data to a reference genome, finding SNPs from short-read data, or matching strings in genotype files. Nevertheless, tasks like installing and maintaining MapReduce on a cluster system, importing data into its distributed file system, or executing MapReduce programs require advanced knowledge in computer science and could thus prevent scientists from using currently available and useful software solutions. Results: Here we present Cloudgene, a freely available platform that improves the usability of MapReduce programs in bioinformatics by providing a graphical user interface for execution, for the import and export of data, and for the reproducibility of workflows on in-house (private clouds) and rented clusters (public clouds). The aim of Cloudgene is to build a standardized graphical execution environment for currently available and future MapReduce programs, which can all be integrated by using its plug-in interface. Since Cloudgene can be executed on private clusters, sensitive datasets can be kept in house at all times, and data transfer times are therefore minimized. Conclusions: Our results show that MapReduce programs can be integrated into Cloudgene with little effort and without adding computational overhead to existing programs. The platform gives developers the opportunity to focus on the actual implementation task and provides scientists with a platform that hides the complexity of MapReduce. In addition to MapReduce programs, Cloudgene can also be used to launch predefined systems (e.g., Cloud BioLinux, RStudio) in public clouds. Currently, five different bioinformatics programs using MapReduce and two systems are integrated and have been successfully deployed. Cloudgene is freely available at http://cloudgene.uibk.ac.at. PMID:22888776
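For readers new to the paradigm Cloudgene wraps, here is a minimal in-process Python sketch of the map, shuffle, and reduce phases (counting 3-mers in sequencing reads). This is not Cloudgene's API; real jobs run distributed on Hadoop, but the data flow is the same.

```python
from collections import defaultdict

# Minimal in-process illustration of the MapReduce pattern Cloudgene wraps;
# real jobs run on a Hadoop cluster, this only shows map -> shuffle -> reduce.
def map_phase(read):
    """Emit (k-mer, 1) pairs for every 3-mer in a sequencing read."""
    for i in range(len(read) - 2):
        yield read[i:i + 3], 1

def reduce_phase(key, values):
    return key, sum(values)

reads = ["GATTACA", "ATTAC", "TACAGATT"]

shuffle = defaultdict(list)             # group intermediate pairs by key
for read in reads:
    for key, value in map_phase(read):
        shuffle[key].append(value)

counts = dict(reduce_phase(k, v) for k, v in shuffle.items())
print(counts)
```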
Final Project Report. Scalable fault tolerance runtime technology for petascale computers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krishnamoorthy, Sriram; Sadayappan, P
With the massive number of components comprising the forthcoming petascale computer systems, hardware failures will be routinely encountered during execution of large-scale applications. Due to the multidisciplinary, multiresolution, and multiscale nature of the scientific problems that drive the demand for high-end systems, applications place increasingly differing demands on the system resources: disk, network, memory, and CPU. In addition to MPI, future applications are expected to use advanced programming models such as those developed under the DARPA HPCS program, as well as existing global address space programming models such as Global Arrays, UPC, and Co-Array Fortran. While there has been a considerable amount of work in fault-tolerant MPI, with a number of strategies and extensions for fault tolerance proposed, virtually none of the advanced models proposed for emerging petascale systems is currently fault aware. To achieve fault tolerance, development of underlying runtime and OS technologies able to scale to the petascale level is needed. This project evaluated a range of runtime techniques for fault tolerance for advanced programming models.
Toward full life cycle control: Adding maintenance measurement to the SEL
NASA Technical Reports Server (NTRS)
Rombach, H. Dieter; Ulery, Bradford T.; Valett, Jon D.
1992-01-01
Organization-wide measurement of software products and processes is needed to establish full life cycle control over software products. The Software Engineering Laboratory (SEL)--a joint venture between NASA GSFC, the University of Maryland, and Computer Sciences Corporation--started measurement of software development more than 15 years ago. Recently, the measurement of maintenance was added to the scope of the SEL. In this article, the maintenance measurement program is presented as an addition to the already existing and well-established SEL development measurement program and evaluated in terms of its immediate benefits and long-term improvement potential. Immediate benefits of this program for the SEL include an increased understanding of the maintenance domain, the differences and commonalities between development and maintenance, and the cause-effect relationships between development and maintenance. Initial results from a sample maintenance study are presented to substantiate these benefits. The long-term potential of this program includes the use of maintenance baselines to better plan and manage future projects and to improve development and maintenance practices for future projects wherever warranted.
Barrett, R. F.; Crozier, P. S.; Doerfler, D. W.; ...
2014-09-28
Computational science and engineering application programs are typically large, complex, and dynamic, and are often constrained by distribution limitations. As a means of making rapid explorations of scientific and engineering application programs tractable in the context of new, emerging, and future computing architectures, a suite of miniapps has been created to serve as proxies for full-scale applications. Each miniapp is designed to represent a key performance characteristic that does, or is expected to, significantly impact the runtime performance of an application program. In this paper we introduce a methodology for assessing the ability of these miniapps to effectively represent these performance issues. We applied this methodology to four miniapps, examining the linkage between them and an application they are intended to represent. Herein we evaluate the fidelity of that linkage. This work represents the initial steps required to begin to answer the question, "Under what conditions does a miniapp represent a key performance characteristic in a full app?"
PCB tester selection for future systems. Volume 2. Final report, August 1989-June 1991
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schmitt, W.
1992-06-01
This report describes a computer program (to run on an IBM-compatible PC) designed to aid in the selection of a PCB tester, given the characteristics of the PC board to be tested. The program contains a limited database of PCB testers, and others may be added easily. This report also provides a specification for a limited family of PCB testers to fill the gap between what the U.S. Air Force is expected to need and what is expected to be available within the next four to six years. The parameters used in the computer program and the specification are based on a survey of military and commercial PCBs, both those now available and those expected to come on line within the next four to six years. The results of the survey are covered in Volume 2, available from DTIC. Keywords: Automatic Test Equipment, Technology Forecast, Air Force Avionics.
PCB tester selection for future systems. Volume 1. Final report, August 1989-June 1991
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schmitt, W.
1992-06-01
This report describes a computer program (to run on an IBM-compatible PC) designed to aid in the selection of a PCB tester, given the characteristics of the PC board to be tested. The program contains a limited database of PCB testers, and others may be added easily. This report also provides a specification for a limited family of PCB testers to fill the gap between what the U.S. Air Force is expected to need and what is expected to be available within the next four to six years. The parameters used in the computer program and the specification are based on a survey of military and commercial PCBs, both those now available and those expected to come on line within the next four to six years. The results of the survey are covered in Volume 2, available from DTIC. Keywords: Automatic Test Equipment, Technology Forecast, Air Force Avionics.
Programming Tools: Status, Evaluation, and Comparison
NASA Technical Reports Server (NTRS)
Cheng, Doreen Y.; Cooper, D. M. (Technical Monitor)
1994-01-01
In this tutorial I will first describe the characteristics of scientific applications and their developers, and the computing environment in a typical high-performance computing center. I will define the user requirements for tools that support application portability and present the difficulties in satisfying them. These form the basis of the evaluation and comparison of the tools. I will then describe the tools available in the market and the tools available in the public domain. Specifically, I will describe the tools for converting sequential programs, tools for developing portable new programs, tools for debugging and performance tuning, tools for partitioning and mapping, and tools for managing networks of resources. I will introduce the main goals and approaches of the tools and show the main features of a few tools in each category. Meanwhile, I will compare tool usability for real-world application development and compare their different technological approaches. Finally, I will indicate the future directions of the tools in each category.
Integrated Data Base Program: a status report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Notz, K.J.; Klein, J.A.
1984-06-01
The Integrated Data Base (IDB) Program provides official Department of Energy (DOE) data on spent fuel and radioactive waste inventories, projections, and characteristics. The accomplishments of FY 1983 are summarized for three broad areas: (1) upgrading and issuing the annual report on spent fuel and radioactive waste inventories, projections, and characteristics, including ORIGEN2 applications and a quality assurance plan; (2) creating a summary data file in user-friendly format for use on a personal computer and enhancing user access to program data; and (3) optimizing and documenting the data handling methodology used by the IDB Program and providing direct support to other DOE programs and sites in data handling. Plans for future work in these three areas are outlined. 23 references, 11 figures.
Lenert, Leslie; Lurie, Jon; Coleman, Robert; Klosterman, Heidrun; Blaschke, Terrence
1990-01-01
In this paper, we describe an advanced drug dosing program, the Aminoglycoside Therapy Manager, that reasons using Bayesian pharmacokinetic modeling and symbolic modeling of patient status and drug response. Our design is similar to that of the Digitalis Therapy Advisor program, but extends previous work by incorporating a Bayesian pharmacokinetic model and a "meta-level" analysis of drug concentrations to identify sampling errors and changes in pharmacokinetics, and by including the results of the "meta-level" analysis in reasoning for dosing and therapeutic monitoring recommendations. The program is user friendly and runs on low-cost general-purpose hardware. Validation studies show that the program is as accurate in predicting future drug concentrations as an expert using commercial Bayesian forecasting software.
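The program's Bayesian model is not detailed in the abstract; the sketch below assumes a standard one-compartment IV-bolus model, C(t) = (dose/V) e^(-kt), with a crude prior/observation blend standing in for the full Bayesian update. All parameter values, priors, and the blending weight are hypothetical.

```python
import math

# One-compartment IV-bolus model C(t) = (dose/V) * exp(-k*t): a minimal
# sketch of the kind of pharmacokinetic forecasting such programs perform.
# Priors, patient values, and the blending weight are illustrative only.
def concentration(t, dose_mg, v_liters, k_per_hr):
    return (dose_mg / v_liters) * math.exp(-k_per_hr * t)

k_pop = 0.25                           # population prior for k (1/hr)
v = 18.0                               # liters, from patient weight (hypothetical)
dose, t_obs, c_obs = 120.0, 6.0, 1.9   # mg, hr, measured level in mg/L

# crude "Bayesian" compromise: blend the prior k with the k implied by the
# measured level, weighting data vs. prior by an assumed factor
k_fit = math.log(dose / (v * c_obs)) / t_obs   # k implied by the observation
w = 0.6                                        # data weight (hypothetical)
k_post = w * k_fit + (1 - w) * k_pop

for t in (1.0, 8.0, 12.0):
    print(f"predicted level {t:4.1f} h post-dose: "
          f"{concentration(t, dose, v, k_post):.2f} mg/L")
```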
Evaluation of radiological dispersion/consequence codes supporting DOE nuclear facility SARs
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Kula, K.R.; Paik, I.K.; Chung, D.Y.
1996-12-31
Since the early 1990s, the authorization basis documentation of many U.S. Department of Energy (DOE) nuclear facilities has been upgraded to comply with DOE orders and standards. In this process, many safety analyses have been revised. Unfortunately, there has been nonuniform application of software, and the most appropriate computational and engineering methodologies often are not applied. A DOE Accident Phenomenology and Consequence (APAC) Methodology Evaluation Program was originated at the request of DOE Defense Programs to evaluate the safety analysis methodologies used in nuclear facility authorization basis documentation and to define future cost-effective support and development initiatives. Six areas, including source term development (fire, spill, and explosion analysis), in-facility transport, and dispersion/consequence analysis (chemical and radiological), are contained in the APAC program. The evaluation process, codes considered, key results, and recommendations for future model and software development of the Radiological Dispersion/Consequence Working Group are summarized in this paper.
Design of fast earth-return trajectories from a lunar base
NASA Astrophysics Data System (ADS)
Anhorn, Walter
1991-09-01
The Apollo Lunar Program utilized efficient transearth trajectories which employed parking orbits in order to minimize energy requirements. This thesis concentrates on a different type of transearth trajectory: direct-ascent, hyperbolic trajectories which omit the parking orbits in order to achieve short flight times to and from a future lunar base. The objective of the thesis is the development of a three-dimensional transearth trajectory model and an associated computer program for exploring trade-offs between flight time and energy, given various mission constraints. The program also targets the Moon with a hyperbolic trajectory, which can be used for targeting Earth impact points. The first-order model is based on an Earth-centered conic and a massless spherical Moon, using MathCAD version 3.0. This model is intended as the basis for future patched-conic formulations for the design of fast Earth-return trajectories. Applications include placing nuclear deterrent arsenals on the Moon, various space-support activities, and protection against Earth-threatening asteroids and comets using lunar bases.
SPARX, a new environment for Cryo-EM image processing.
Hohn, Michael; Tang, Grant; Goodyear, Grant; Baldwin, P R; Huang, Zhong; Penczek, Pawel A; Yang, Chao; Glaeser, Robert M; Adams, Paul D; Ludtke, Steven J
2007-01-01
SPARX (single particle analysis for resolution extension) is a new image processing environment with a particular emphasis on transmission electron microscopy (TEM) structure determination. It includes a graphical user interface that provides a complete graphical programming environment with a novel data/process-flow infrastructure, an extensive library of Python scripts that perform specific TEM-related computational tasks, and a core library of fundamental C++ image processing functions. In addition, SPARX relies on the EMAN2 library and cctbx, the open-source computational crystallography library from PHENIX. The design of the system is such that future inclusion of other image processing libraries is a straightforward task. The SPARX infrastructure intelligently handles retention of intermediate values, even those inside programming structures such as loops and function calls. SPARX and all dependencies are free for academic use and available with complete source.
Microgravity sciences application visiting scientist program
NASA Technical Reports Server (NTRS)
Glicksman, Martin; Vanalstine, James
1995-01-01
Marshall Space Flight Center pursues scientific research in the area of low-gravity effects on materials and processes. To support these Government-performed research responsibilities, a number of supplementary research tasks were accomplished by a group of specialized visiting scientists. They participated in work on contemporary research problems with specific objectives related to current or future space flight experiments, and they defined and established independent programs of research based on scientific peer review and on the relevance of the defined research to the NASA microgravity program, implementing a portion of the national program. The programs included research in the following areas: protein crystal growth, X-ray crystallography and computer analysis of protein crystal structure, optimization and analysis of protein crystal growth techniques, and design and testing of flight hardware.
NASA technology program for future civil air transports
NASA Technical Reports Server (NTRS)
Wright, H. T.
1983-01-01
An assessment is undertaken of the development status of technology, applicable to future civil air transport design, which is currently undergoing conceptual study or testing at NASA facilities. The NASA civil air transport effort emphasizes advanced aerodynamic computational capabilities, fuel-efficient engines, advanced turboprops, composite primary structure materials, advanced aerodynamic concepts in boundary layer laminarization and aircraft configuration, refined control, guidance and flight management systems, and the integration of all these design elements into optimal systems. Attention is given to such novel transport aircraft design concepts as forward swept wings, twin fuselages, sandwich composite structures, and swept blade propfans.
Yu, Xuefei; Lin, Liangzhuo; Shen, Jie; Chen, Zhi; Jian, Jun; Li, Bin; Xin, Sherman Xuegang
2018-01-01
The mean amplitude of glycemic excursions (MAGE) is an essential index for assessing glycemic variability and is treated as a key reference for blood glucose control in the clinic. However, the traditional "ruler and pencil" manual method for calculating MAGE is time-consuming and prone to error because of the huge data size, making the development of a robust computer-aided program an urgent requirement. Although several software products are available as alternatives to manual calculation, poor agreement among them has been reported; therefore, more studies are required in this field. In this paper, we developed a mathematical algorithm based on integer nonlinear programming. Following the proposed mathematical method, an open-code computer program named MAGECAA v1.0 was developed and validated. The results of the statistical analysis indicated that the developed program was robust compared with the manual method. The agreement between the developed program and currently available popular software products is satisfactory, indicating that concern about disagreement among different software products is unnecessary. The open-code programmable algorithm is an extra resource for peers interested in future methodological studies in this field.
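The paper's integer-nonlinear-programming formulation is not reproduced in the abstract; the sketch below implements the classic MAGE definition that such programs automate (the mean of excursion amplitudes exceeding one standard deviation of the glucose trace), ignoring the ascending/descending direction convention for brevity. The data are toy values.

```python
import numpy as np

# Minimal sketch of the classic MAGE definition (mean of glucose excursion
# amplitudes exceeding 1 SD of the whole trace).  This is NOT the paper's
# integer-nonlinear-programming formulation, and it ignores the
# ascending/descending direction convention for brevity.
def mage(glucose):
    g = np.asarray(glucose, dtype=float)
    sd = g.std(ddof=1)
    d = np.diff(g)
    # turning points: endpoints plus samples where the slope changes sign
    turn = [0]
    turn += [i for i in range(1, len(g) - 1) if d[i - 1] * d[i] < 0]
    turn += [len(g) - 1]
    amplitudes = np.abs(np.diff(g[turn]))   # peak-to-nadir swing sizes
    valid = amplitudes[amplitudes > sd]     # keep only excursions > 1 SD
    return valid.mean() if valid.size else 0.0

cgm = [5.2, 5.0, 6.1, 8.9, 7.4, 4.8, 5.5, 9.2, 8.1, 6.0, 5.1]  # mmol/L, toy
print(f"MAGE = {mage(cgm):.2f} mmol/L")
```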
GTOOLS: an Interactive Computer Program to Process Gravity Data for High-Resolution Applications
NASA Astrophysics Data System (ADS)
Battaglia, M.; Poland, M. P.; Kauahikaua, J. P.
2012-12-01
An interactive computer program, GTOOLS, has been developed to process gravity data acquired by the Scintrex CG-5 and LaCoste & Romberg EG, G, and D gravity meters. The aim of GTOOLS is to provide a validated methodology for computing relative gravity values in a consistent way, accounting for as many environmental factors as possible (e.g., tides, ocean loading, solar constraints) as well as instrument drift. The program has a modular architecture. Each processing step is implemented in a tool (function) that can be run either independently or within an automated task. The tools allow the user to (a) read the gravity data acquired during field surveys completed using different types of gravity meters; (b) compute Earth tides using an improved version of Longman's (1959) model; (c) compute ocean loading using the HARDISP code by Petit and Luzum (2010) and ocean loading harmonics from the TPXO7.2 ocean tide model; (d) estimate the instrument drift using linear functions as appropriate; and (e) compute the weighted least-squares-adjusted gravity values and their errors. The corrections are performed to microgal (μGal) precision, in accordance with the specifications of high-resolution surveys. The program has the ability to incorporate calibration factors that allow surveys done using different gravimeters to be compared. Two additional tools (functions) allow the user to (1) estimate the instrument calibration factor by processing data collected by a gravimeter on a calibration range, and (2) plot gravity time series at a chosen benchmark. The interactive procedures and the program output (jpeg plots and text files) have been designed to ease data handling and archiving, to provide useful information for future data interpretation or modeling, and to facilitate comparison of gravity surveys conducted at different times. All formulas have been checked for typographical errors against the original references. GTOOLS, developed using Matlab, is open source and machine independent. We will demonstrate program use and utility with data from multiple microgravity surveys at Kilauea volcano, Hawai'i.
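Steps (d) and (e) condense into a few lines of linear algebra; the sketch below (in Python rather than GTOOLS's Matlab, on synthetic data) fits a station-value-plus-linear-drift model by weighted least squares over a looped survey that revisits the base station.

```python
import numpy as np

# Sketch of the drift + network adjustment step: each reading is modeled as
# reading = g(station) + drift * t, solved by weighted least squares with
# the first station fixed at 0 (relative gravity).  Data are synthetic.
stations = np.array([0, 1, 2, 0, 1, 0])        # looped survey, base = 0
t = np.array([0.0, 0.5, 1.1, 1.8, 2.4, 3.0])   # hours since first reading
reading = np.array([0.000, 0.120, 0.310, 0.015, 0.141, 0.031])  # mGal
sigma = np.full(t.size, 0.005)                 # per-reading error, mGal

n_sta = 3
A = np.zeros((t.size, n_sta))                  # station indicator columns
A[np.arange(t.size), stations] = 1.0
A = np.column_stack([A[:, 1:], t])             # drop station 0 (fixed at 0)

W = 1.0 / sigma                                # whitening weights
x, *_ = np.linalg.lstsq(A * W[:, None], reading * W, rcond=None)
g1, g2, drift = x
print(f"g(sta1)-g(sta0) = {g1:.4f} mGal, g(sta2)-g(sta0) = {g2:.4f} mGal")
print(f"instrument drift = {drift:.4f} mGal/hr")
```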
Strategic Computing Computer Vision: Taking Image Understanding To The Next Plateau
NASA Astrophysics Data System (ADS)
Simpson, R. L., Jr.
1987-06-01
The overall objective of the Strategic Computing (SC) Program of the Defense Advanced Research Projects Agency (DARPA) is to develop and demonstrate a new generation of machine intelligence technology which can form the basis for more capable military systems in the future and also maintain a position of world leadership for the US in computer technology. Begun in 1983, SC represents a focused research strategy for accelerating the evolution of new technology and its rapid prototyping in realistic military contexts. Among the very ambitious demonstration prototypes being developed within the SC Program are: 1) the Pilot's Associate, which will aid the pilot in route planning, aerial target prioritization, evasion of missile threats, and aircraft emergency safety procedures during flight; 2) the AirLand Battle Management program (ALBM) for the Army, just getting started, which will use knowledge-based systems technology to assist in the generation and evaluation of tactical options and plans at the Corps level; 3) the more established battle management program for the Navy, the Fleet Command Center Battle Management Program (FCCBMP) at Pearl Harbor, which is employing knowledge-based systems and natural language technology in an evolutionary testbed situated in an operational command center to demonstrate and evaluate intelligent decision aids that can assist in the evaluation of fleet readiness and explore alternatives during contingencies; and 4) the Autonomous Land Vehicle (ALV), which integrates in a major robotic testbed the technologies for dynamic image understanding and knowledge-based route planning with replanning during execution, hosted on new advanced parallel architectures. The goal of the Strategic Computing computer vision technology base (SCVision) is to develop generic technology that will enable the construction of complete, robust, high-performance image understanding systems to support a wide range of DoD applications. Possible applications include autonomous vehicle navigation, photointerpretation, smart weapons, and robotic manipulation. This paper provides an overview of the technical and program management plans being used in evolving this critical national technology.
Ayres, Daniel L; Darling, Aaron; Zwickl, Derrick J; Beerli, Peter; Holder, Mark T; Lewis, Paul O; Huelsenbeck, John P; Ronquist, Fredrik; Swofford, David L; Cummings, Michael P; Rambaut, Andrew; Suchard, Marc A
2012-01-01
Phylogenetic inference is fundamental to our understanding of most aspects of the origin and evolution of life, and in recent years, there has been a concentration of interest in statistical approaches such as Bayesian inference and maximum likelihood estimation. Yet, for large data sets and realistic or interesting models of evolution, these approaches remain computationally demanding. High-throughput sequencing can yield data for thousands of taxa, but scaling to such problems using serial computing often necessitates the use of nonstatistical or approximate approaches. The recent emergence of graphics processing units (GPUs) provides an opportunity to leverage their excellent floating-point computational performance to accelerate statistical phylogenetic inference. A specialized library for phylogenetic calculation would allow existing software packages to make more effective use of available computer hardware, including GPUs. Adoption of a common library would also make it easier for other emerging computing architectures, such as field programmable gate arrays, to be used in the future. We present BEAGLE, an application programming interface (API) and library for high-performance statistical phylogenetic inference. The API provides a uniform interface for performing phylogenetic likelihood calculations on a variety of compute hardware platforms. The library includes a set of efficient implementations and can currently exploit hardware including GPUs using NVIDIA CUDA, central processing units (CPUs) with Streaming SIMD Extensions and related processor supplementary instruction sets, and multicore CPUs via OpenMP. To demonstrate the advantages of a common API, we have incorporated the library into several popular phylogenetic software packages. The BEAGLE library is free open source software licensed under the Lesser GPL and available from http://beagle-lib.googlecode.com. An example client program is available as public domain software.
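BEAGLE's API is not reproduced here; the numpy sketch below shows the core computation such libraries accelerate, Felsenstein's pruning algorithm for one alignment site under the Jukes-Cantor model, on a hypothetical 3-taxon tree ((A:0.1, B:0.2):0.05, C:0.3).

```python
import numpy as np

# Minimal sketch of the computation BEAGLE accelerates: Felsenstein's
# pruning algorithm under Jukes-Cantor, for one alignment site on the
# hypothetical tree ((A:0.1, B:0.2):0.05, C:0.3).  Not BEAGLE's actual API.
def jc69(t):
    """4x4 transition-probability matrix for branch length t (subs/site)."""
    e = np.exp(-4.0 * t / 3.0)
    return np.full((4, 4), 0.25 * (1 - e)) + np.eye(4) * e

BASE = {"A": 0, "C": 1, "G": 2, "T": 3}

def tip(base):
    v = np.zeros(4)
    v[BASE[base]] = 1.0           # observed state gets likelihood 1
    return v

def site_likelihood(a, b, c):
    node = (jc69(0.1) @ tip(a)) * (jc69(0.2) @ tip(b))  # partials at (A,B)
    root = (jc69(0.05) @ node) * (jc69(0.3) @ tip(c))   # partials at root
    return 0.25 * root.sum()                            # uniform base freqs

print(f"P(site A,A,A) = {site_likelihood('A', 'A', 'A'):.6f}")
print(f"P(site A,C,G) = {site_likelihood('A', 'C', 'G'):.6f}")
```

In a production library these per-site, per-node vector operations are batched across thousands of sites, which is exactly the data parallelism GPUs exploit.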
High End Computing Technologies for Earth Science Applications: Trends, Challenges, and Innovations
NASA Technical Reports Server (NTRS)
Parks, John (Technical Monitor); Biswas, Rupak; Yan, Jerry C.; Brooks, Walter F.; Sterling, Thomas L.
2003-01-01
Earth science applications of the future will stress the capabilities of even the highest performance supercomputers in the areas of raw compute power, mass storage management, and software environments. These NASA mission critical problems demand usable multi-petaflops and exabyte-scale systems to fully realize their science goals. With an exciting vision of the technologies needed, NASA has established a comprehensive program of advanced research in computer architecture, software tools, and device technology to ensure that, in partnership with US industry, it can meet these demanding requirements with reliable, cost effective, and usable ultra-scale systems. NASA will exploit, explore, and influence emerging high end computing architectures and technologies to accelerate the next generation of engineering, operations, and discovery processes for NASA Enterprises. This article captures this vision and describes the concepts, accomplishments, and the potential payoff of the key thrusts that will help meet the computational challenges in Earth science applications.
Experimental comparison of two quantum computing architectures.
Linke, Norbert M; Maslov, Dmitri; Roetteler, Martin; Debnath, Shantanu; Figgatt, Caroline; Landsman, Kevin A; Wright, Kenneth; Monroe, Christopher
2017-03-28
We run a selection of algorithms on two state-of-the-art 5-qubit quantum computers that are based on different technology platforms. One is a publicly accessible superconducting transmon device (www.ibm.com/ibm-q) with limited connectivity, and the other is a fully connected trapped-ion system. Even though the two systems have different native quantum interactions, both can be programmed in a way that is blind to the underlying hardware, thus allowing a comparison of identical quantum algorithms between different physical systems. We show that quantum algorithms and circuits that use more connectivity clearly benefit from a better-connected system of qubits. Although the quantum systems here are not yet large enough to eclipse classical computers, this experiment exposes critical factors of scaling quantum computers, such as qubit connectivity and gate expressivity. In addition, the results suggest that codesigning particular quantum applications with the hardware itself will be paramount in successfully using quantum computers in the future.
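The hardware-blind programming the experiment relies on can be illustrated with a tiny state-vector simulation: the same abstract gate sequence (here a Bell-state circuit, chosen for illustration and not necessarily one of the paper's test algorithms) could be compiled to either the superconducting or the trapped-ion machine.

```python
import numpy as np

# Hardware-agnostic gate-level description of a 2-qubit Bell-state circuit,
# simulated as a state vector; the same abstract circuit could be compiled
# to either platform's native gate set.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])   # qubit 0 (most significant) controls qubit 1

state = np.zeros(4)
state[0] = 1.0                    # start in |00>
state = np.kron(H, I) @ state     # Hadamard on qubit 0
state = CNOT @ state              # entangle: (|00> + |11>) / sqrt(2)
print("amplitudes:", state)
print("measurement probabilities:", np.abs(state) ** 2)
```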
Status of Computational Aerodynamic Modeling Tools for Aircraft Loss-of-Control
NASA Technical Reports Server (NTRS)
Frink, Neal T.; Murphy, Patrick C.; Atkins, Harold L.; Viken, Sally A.; Petrilli, Justin L.; Gopalarathnam, Ashok; Paul, Ryan C.
2016-01-01
A concerted effort has been underway over the past several years to evolve computational capabilities for modeling aircraft loss-of-control under the NASA Aviation Safety Program. A principal goal has been to develop reliable computational tools for predicting and analyzing the non-linear stability & control characteristics of aircraft near stall boundaries affecting safe flight, and for utilizing those predictions for creating augmented flight simulation models that improve pilot training. Pursuing such an ambitious task with limited resources required the forging of close collaborative relationships with a diverse body of computational aerodynamicists and flight simulation experts to leverage their respective research efforts into the creation of NASA tools to meet this goal. Considerable progress has been made and work remains to be done. This paper summarizes the status of the NASA effort to establish computational capabilities for modeling aircraft loss-of-control and offers recommendations for future work.
NASA Astrophysics Data System (ADS)
Choi, Heekyung
2009-12-01
The purpose of this study is to learn about students' perspectives on an undergraduate-level information technology (IT) education program. The IT program is a recent effort to create a new educational opportunity for computing in college, with recognition that recent IT developments have had a greater influence on various aspects of people's lives than ever. Students' perspectives are a necessary piece of information to develop this innovative IT education program into a sound educational opportunity. Data were gathered through qualitative in-depth interviews conducted with 28 undergraduate students, most of whom had taken one or more IT classes before. The interview data were analyzed using the grounded theory approach. The analysis found that college students perceived that they were very competent in dealing with IT, primarily due to their continued exposure to computers since youth. However, this perceived competency was not very stable. Students felt that they did not have sufficient IT competency when the technical skills of dealing with IT came to their attention. They also felt so when comparing their IT competency with that of their peers, examining it in a class context, and confronting a transition from education to the real world. In spite of their preference for and confidence in self-guided learning, students wanted to receive formal instruction in IT when they needed to learn something difficult, something that they were not very interested in, or something important for their future lives. They also expressed a desire to gain a comprehensive understanding of computers without needing to learn fundamental computing principles. Students' various interests in IT education were dispersed around learning practical technical skills and understanding the social implications of IT. Many participants' focus was a mix of the two factors, often expressed as an area that dealt with "how humans and computers interact." This blended interest suggested a potential defining characteristic for IT education. Students' motivations for pursuing IT education ranged from pure passion to practical considerations. The majority of students expressed mixed motivations, often more strongly inclined to practicality. This finding implied that students' practical considerations, as well as their pure interests, were an important factor to consider in administering an IT program. Participants found that the primary value of the IT program was that it incorporated technological and social topics which had not been well connected previously. Yet balancing the technical and non-technical components in the curriculum also proved to be the most controversial aspect. Students perceived that the weaknesses of the IT program were also associated with its interdisciplinary nature. Students also viewed the topics in the IT program as more closely related to many real-world problems than the curricula of typical college education programs. Finally, the analysis revealed that students determined the value of the IT minor program in relation to their majors and career interests. Students took the IT minor to supplement their majors, in terms of their interests in developing their careers beyond formal education.
Overall, this investigation showed that students perceived this broad-based education program for IT as an intermediate field that filled a significant niche in college education caused by the recent technological innovations: between technical and social, between school and everyday life, and between formal education and the "real world." The results have practical implications for the development of IT programs in college and for future research directions.
Systems Engineering and Integration (SE and I)
NASA Technical Reports Server (NTRS)
Chevers, ED; Haley, Sam
1990-01-01
The issue of technology advancement and future space transportation vehicles is addressed. The challenge is to develop systems which can be evolved and improved in small incremental steps, where each increment reduces present cost, improves reliability, or does neither but sets the stage for a second incremental upgrade that does. Future requirements are: interface standards for commercial off-the-shelf products to aid in the development of integrated facilities; an enhanced automated code generation system tightly coupled to specification and design documentation; modeling tools that support data flow analysis; and shared project databases consisting of technical characteristics, cost information, measurement parameters, and reusable software programs. Topics addressed include: advanced avionics development strategy; risk analysis and management; tool quality management; low-cost avionics; cost estimation and benefits; computer-aided software engineering; computer systems and software safety; system testability; and advanced avionics laboratories and rapid prototyping. This presentation is represented by viewgraphs only.
The IDEAS**2 computing environment
NASA Technical Reports Server (NTRS)
Racheli, Ugo
1990-01-01
This document presents block diagrams of the IDEAS**2 computing environment. IDEAS**2 is the computing environment selected for system engineering (design and analysis) by the Center for Space Construction (CSC) at the University of Colorado (UCB). It is intended to support integration and analysis of any engineering system and at any level of development, from Pre-Phase A conceptual studies to fully mature Phase C/D projects. The University of Colorado (through the Center for Space Construction) has joined the Structural Dynamics Research Corporation (SDRC) University Consortium which makes available unlimited software licenses for instructional purposes. In addition to providing the backbone for the implementation of the IDEAS**2 computing environment, I-DEAS can be used as a stand-alone product for undergraduate CAD/CAE instruction. Presently, SDRC is in the process of releasing I-DEAS level 5.0 which represents a substantial improvement in both the user interface and graphic processing capabilities. IDEAS**2 will be immediately useful for a number of current programs within CSC (such as DYCAM and the 'interruptability problem'). In the future, the following expansions of the basic IDEAS**2 program will be pursued, consistent with the overall objectives of the Center and of the College: upgrade I-DEAS and IDEAS**2 to level 5.0; create new analytical programs for applications not limited to orbital platforms; research the semantic organization of engineering databases; and create an 'interoperability' testbed.
Using computer graphics to enhance astronaut and systems safety
NASA Technical Reports Server (NTRS)
Brown, J. W.
1985-01-01
Computer graphics is being employed at the NASA Johnson Space Center as a tool to perform rapid, efficient, and economical analyses for man-machine integration, flight operations development, and systems engineering. The Operator Station Design System (OSDS), a computer-based facility featuring a highly flexible and versatile interactive software package, PLAID, is described. This unique evaluation tool, with its expanding database of Space Shuttle elements, various payloads, experiments, crew equipment, and man models, supports a multitude of technical evaluations, including spacecraft and workstation layout, definition of astronaut visual access, flight techniques development, cargo integration, and crew training. As OSDS is applied to the Space Shuttle, Orbiter payloads (including the European Space Agency's Spacelab), and future space vehicles and stations, astronaut and systems safety are being enhanced. Typical OSDS examples are presented. By performing physical and operational evaluations during early conceptual phases, supporting systems verification for flight readiness, and applying its capabilities to real-time mission support, the OSDS provides the wherewithal to satisfy a growing need of current and future space programs for efficient, economical analyses.
Medical Applications of the PHITS Code (3): User Assistance Program for Medical Physics Computation.
Furuta, Takuya; Hashimoto, Shintaro; Sato, Tatsuhiko
2016-01-01
DICOM2PHITS and PSFC4PHITS are user-assistance programs for medical physics applications of PHITS. DICOM2PHITS is a program that constructs a voxel PHITS simulation geometry from patient CT DICOM image data using a conversion table from CT number to material composition. PSFC4PHITS is a program that converts IAEA phase-space file data to the PHITS format so it can be used as a simulation source for PHITS. Both programs are useful for users who want to apply PHITS simulation to the verification of radiation therapy treatment plans. We are now developing a program to convert the dose distribution obtained by PHITS to the DICOM RT-dose format. As a future plan, we also want to develop a program that can use the treatment information included in other DICOM files (RT-plan and RT-structure).
NASA Astrophysics Data System (ADS)
Hunter, Geoffrey
2004-01-01
A computational process is classified according to the theoretical model that is capable of executing it; computational processes that require a non-predeterminable amount of intermediate storage for their execution are Turing-machine (TM) processes, while those whose storage is predeterminable are finite-automaton (FA) processes. Simple processes (such as a traffic-light controller) are executable by a finite automaton, whereas the most general kind of computation requires a Turing machine for its execution. This implies that a TM process must have a non-predeterminable amount of memory allocated to it at intermediate instants of its execution, i.e., dynamic memory allocation. Many processes encountered in practice are TM processes. The implication for computational practice is that the hardware (CPU) architecture and its operating system must facilitate dynamic memory allocation, and that the programming language used to specify TM processes must have statements with the semantic attribute of dynamic memory allocation, for in Alan Turing's thesis on computation (1936) the "standard description" of a process is invariant over the most general data that the process is designed to process; i.e., the program describing the process should never have to be modified to allow for differences in the data to be processed in different instantiations; i.e., data-invariant programming. Any non-trivial program is partitioned into sub-programs (procedures, subroutines, functions, modules, etc.). Examination of the calls and returns between the subprograms reveals that they are nodes in a tree structure; this tree structure is independent of the programming language used to encode (define) the process. Each sub-program typically needs some memory for its own use (to store values intermediate between its received data and its computed results); this locally required memory is not needed before the subprogram commences execution, and it is not needed after its execution terminates; it may be allocated as its execution commences and deallocated as its execution terminates, and if the amount of this local memory is not known until just before execution commencement, then it is essential that it be allocated dynamically as the first action of its execution. This dynamically allocated and deallocated storage of each subprogram's intermediate values conforms to the stack discipline (last allocated = first deallocated), an incidental benefit of which is automatic overlaying of variables. This stack-based dynamic memory allocation was a semantic implication of the nested block structure that originated in the ALGOL-60 programming language. ALGOL-60 was a TM language, because the amount of memory allocated on subprogram (block/procedure) entry (for arrays, etc.) was computable at execution time. A more general requirement of a Turing-machine process is code generation at run time; this mandates access to the source-language processor (compiler/interpreter) during execution of the process. This fundamental aspect of computer science is important to the future of system design, because it has been overlooked throughout the 55 years since modern computing began in 1948. The popular computer systems of this first half-century of computing were constrained by compile-time (or even operating-system boot-time) memory allocation, and were thus limited to executing FA processes.
The practical effect was that the distinction between the data-invariant program and its variable data was blurred; programmers had to make trial-and-error executions, modifying the program's compile-time constants (array dimensions) to iterate towards the values required at run time by the data being processed. This era of trial-and-error computing still persists; it pervades the culture of current (2003) computing practice.
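A small Python illustration of the stack-discipline, data-invariant allocation the passage describes (merge sort chosen here as a familiar stand-in): each recursive call allocates scratch storage sized by its own data at run time and releases it on return, so the program text never changes with the input size.

```python
# Illustration of the passage's point: a data-invariant program whose
# per-call scratch storage is sized at run time and reclaimed on return
# (stack discipline), so the source never changes with the input size.
def merge_sort(items):
    if len(items) <= 1:
        return list(items)
    mid = len(items) // 2
    left = merge_sort(items[:mid])     # each call allocates local storage
    right = merge_sort(items[mid:])    # sized by its own data...
    merged = []                        # ...and it vanishes on return
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

# The same unmodified program handles any input size: no compile-time
# array bounds to edit between runs.
print(merge_sort([5, 3, 8, 1]))
print(merge_sort(list(range(1000, 0, -1)))[:5])
```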
An Overview of Recent Developments in Computational Aeroelasticity
NASA Technical Reports Server (NTRS)
Bennett, Robert M.; Edwards, John W.
2004-01-01
The motivation for Computational Aeroelasticity (CA) and the elements of one type of the analysis or simulation process are briefly reviewed. The need for streamlining and improving the overall process to reduce elapsed time and improve overall accuracy is discussed. Further effort is needed to establish the credibility of the methodology, obtain experience, and to incorporate the experience base to simplify the method for future use. Experience with the application of a variety of Computational Aeroelasticity programs is summarized for the transonic flutter of two wings, the AGARD 445.6 wing and a typical business jet wing. There is a compelling need for a broad range of additional flutter test cases for further comparisons. Some existing data sets that may offer CA challenges are presented.
Digital video applications in radiologic education: theory, technique, and applications.
Hennessey, J G; Fishman, E K; Ney, D R
1994-05-01
Computer-assisted instruction (CAI) has great potential in medical education. The recent explosion of multimedia platforms provides an environment for the seamless integration of text, images, and sound into a single program. This article discusses the role of digital video in the current educational environment as well as its future potential. An in-depth review of the technical decisions involved in this new technology is also presented.
ERIC Educational Resources Information Center
Okita, Sandra Y.
2014-01-01
This study examined whether developing earlier forms of knowledge in specific learning environments prepares students better for future learning when they are placed in an unfamiliar learning environment. Forty-one students in the fifth and sixth grades learned to program robot movements using abstract concepts of speed, distance and direction.…
Intelligent Control for Future Autonomous Distributed Sensor Systems
2007-03-26
recognized, the use of a pre-computed reconfiguration solution that fits the recognized scenario could allow reconfiguration to take place without... This data was loaded into the program developed to visualize the seabed, and then the simulation was performed using frames to denote the target... to generate separate images for each eye. Users wear lightweight, inexpensive polarized eyeglasses and see a stereoscopic image.
Development of IS2100: An Information Systems Laboratory.
1985-03-01
systems for digital logic; hardware architecture; machine, assembly, and high-order language programming; and application packages such as database... applications and limitations. They should be able to define, demonstrate, and/or discuss how computers are used, how they do their work, how to use them, and... limitations. Hands-on operation of the hardware and software provides experience that aids in future selection of hardware systems and applications.
Measurement of Satellite Impact Test Fragments for Modeling Orbital Debris
NASA Technical Reports Server (NTRS)
Hill, Nicole M.
2009-01-01
There are more than 13,000 catalogued objects 10 cm and larger in orbit around Earth [ODQN, January 2009, p. 12]. More than 6,000 of these objects are fragments from explosions and collisions. As the Earth-orbiting object count increases, debris-generating collisions become a statistical inevitability. To aid in understanding this collision risk, the NASA Orbital Debris Program Office has developed computer models that calculate the quantity and orbits of debris both currently in orbit and in future epochs. In order to create a reasonable computer model of the orbital debris environment, it is important to understand the mechanics of debris creation in a collision. The measurement of the physical characteristics of debris resulting from ground-based hypervelocity impact testing aids in understanding the sizes and shapes of debris produced by potential impacts in orbit. To advance the accuracy of fragment shape/size determination, the NASA Orbital Debris Program Office recently implemented a computerized measurement system. The goal of this system is to improve knowledge and understanding of the relation between commonly used dimensions and overall shape. The technique developed involves scanning a single fragment with a hand-held laser device, measuring its size properties using a sophisticated software tool, and creating a three-dimensional computer model to demonstrate how the object might appear in orbit. This information is used to aid optical techniques in shape determination. This more automated and repeatable method provides higher accuracy in the size and shape determination of debris.
Parallel hyperbolic PDE simulation on clusters: Cell versus GPU
NASA Astrophysics Data System (ADS)
Rostrup, Scott; De Sterck, Hans
2010-12-01
Increasingly, high-performance computing is looking towards data-parallel computational devices to enhance computational performance. Two technologies that have received significant attention are IBM's Cell Processor and NVIDIA's CUDA programming model for graphics processing unit (GPU) computing. In this paper we investigate the acceleration of parallel hyperbolic partial differential equation simulation on structured grids with explicit time integration on clusters with Cell and GPU backends. The message passing interface (MPI) is used for communication between nodes at the coarsest level of parallelism. Optimizations of the simulation code at the several finer levels of parallelism that the data-parallel devices provide are described in terms of data layout, data flow and data-parallel instructions. Optimized Cell and GPU performance are compared with reference code performance on a single x86 central processing unit (CPU) core in single and double precision. We further compare the CPU, Cell and GPU platforms on a chip-to-chip basis, and compare performance on single cluster nodes with two CPUs, two Cell processors or two GPUs in a shared memory configuration (without MPI). We finally compare performance on clusters with 32 CPUs, 32 Cell processors, and 32 GPUs using MPI. Our GPU cluster results use NVIDIA Tesla GPUs with GT200 architecture, but some preliminary results on recently introduced NVIDIA GPUs with the next-generation Fermi architecture are also included. This paper provides computational scientists and engineers who are considering porting their codes to accelerator environments with insight into how structured grid based explicit algorithms can be optimized for clusters with Cell and GPU accelerators. It also provides insight into the speed-up that may be gained on current and future accelerator architectures for this class of applications.

Program summary
Program title: SWsolver
Catalogue identifier: AEGY_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGY_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: GPL v3
No. of lines in distributed program, including test data, etc.: 59 168
No. of bytes in distributed program, including test data, etc.: 453 409
Distribution format: tar.gz
Programming language: C, CUDA
Computer: Parallel computing clusters. Individual compute nodes may consist of x86 CPU, Cell processor, or x86 CPU with attached NVIDIA GPU accelerator.
Operating system: Linux
Has the code been vectorised or parallelized?: Yes. Tested on 1-128 x86 CPU cores, 1-32 Cell processors, and 1-32 NVIDIA GPUs.
RAM: Tested on problems requiring up to 4 GB per compute node.
Classification: 12
External routines: MPI, CUDA, IBM Cell SDK
Nature of problem: MPI-parallel simulation of shallow water equations using a high-resolution 2D hyperbolic equation solver on regular Cartesian grids for x86 CPU, Cell processor, and NVIDIA GPU using CUDA.
Solution method: SWsolver provides three implementations of a high-resolution 2D shallow water equation solver on regular Cartesian grids, for CPU, Cell processor, and NVIDIA GPU. Each implementation uses MPI to divide work across a parallel computing cluster.
Additional comments: Sub-program numdiff is used for the test run.
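As a rough illustration of the coarsest level of parallelism described above (MPI between cluster nodes), the following sketch partitions a 1-D strip of a structured grid across ranks and exchanges one ghost cell with each neighbour before every explicit update. This is a generic halo-exchange pattern, not code from SWsolver, and the three-point update is a stand-in for a real flux stencil:

    #include <mpi.h>
    #include <stdlib.h>
    #include <string.h>

    #define NLOC 1000        /* interior cells per rank (illustrative) */

    int main(int argc, char **argv)
    {
        MPI_Init(&argc, &argv);
        int rank, size;
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* local strip with one ghost cell at each end */
        double *h  = calloc(NLOC + 2, sizeof(double));
        double *hn = calloc(NLOC + 2, sizeof(double));
        for (int i = 1; i <= NLOC; i++) h[i] = 1.0 + rank;

        int left  = (rank == 0)        ? MPI_PROC_NULL : rank - 1;
        int right = (rank == size - 1) ? MPI_PROC_NULL : rank + 1;

        for (int step = 0; step < 100; step++) {
            /* fill ghost cells from neighbouring ranks */
            MPI_Sendrecv(&h[1],      1, MPI_DOUBLE, left,  0,
                         &h[NLOC+1], 1, MPI_DOUBLE, right, 0,
                         MPI_COMM_WORLD, MPI_STATUS_IGNORE);
            MPI_Sendrecv(&h[NLOC],   1, MPI_DOUBLE, right, 1,
                         &h[0],      1, MPI_DOUBLE, left,  1,
                         MPI_COMM_WORLD, MPI_STATUS_IGNORE);
            /* explicit update; a real solver applies its flux stencil here */
            for (int i = 1; i <= NLOC; i++)
                hn[i] = 0.25 * h[i-1] + 0.5 * h[i] + 0.25 * h[i+1];
            memcpy(&h[1], &hn[1], NLOC * sizeof(double));
        }

        free(h); free(hn);
        MPI_Finalize();
        return 0;
    }

Within each rank, the update loop is where the finer levels of parallelism (Cell SPEs or CUDA threads) would be applied.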
Multi-Strain Deterministic Chaos in Dengue Epidemiology, A Challenge for Computational Mathematics
NASA Astrophysics Data System (ADS)
Aguiar, Maíra; Kooi, Bob W.; Stollenwerk, Nico
2009-09-01
Recently, we have analysed epidemiological models of competing strains of pathogens, and hence differences in transmission for first versus secondary infection due to interaction of the strains with previously acquired immunities, as has been described for dengue fever and is known as antibody-dependent enhancement (ADE). These models show a rich variety of dynamics through bifurcations up to deterministic chaos. Including temporary cross-immunity even enlarges the parameter range of such chaotic attractors, and also gives rise to various coexisting attractors, which are difficult to identify with standard numerical bifurcation programs using continuation methods. A combination of techniques, including classical bifurcation plots and Lyapunov exponent spectra, must be applied to gain further insight into such dynamical structures. In particular, Lyapunov spectra, which quantify the predictability horizon of the epidemiological system, are computationally very demanding. We show ways to speed up the computation of such Lyapunov spectra by a factor of more than ten by parallelizing previously sequential C programs. Such fast computation of Lyapunov spectra will be of particular use in future investigations of seasonally forced versions of the present models, as they are needed for data analysis.
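The parallelization referred to above is naturally coarse-grained, since each parameter value yields an independent orbit. The sketch below is not the authors' code; it is a minimal C/OpenMP illustration of the pattern (compile with -fopenmp), with the logistic map standing in for the epidemiological model:

    #include <math.h>
    #include <stdio.h>

    #define NPAR   400       /* parameter values in the sweep */
    #define NTRANS 1000      /* transient iterations to discard */
    #define NITER  100000    /* iterations used in the average */

    int main(void)
    {
        static double lambda[NPAR];

        /* one independent orbit per parameter value: trivially parallel */
        #pragma omp parallel for schedule(static)
        for (int k = 0; k < NPAR; k++) {
            double r = 3.5 + 0.5 * k / (NPAR - 1);
            double x = 0.4, sum = 0.0;
            for (int i = 0; i < NTRANS; i++) x = r * x * (1.0 - x);
            for (int i = 0; i < NITER; i++) {
                x = r * x * (1.0 - x);
                sum += log(fabs(r * (1.0 - 2.0 * x)));  /* ln|f'(x)| */
            }
            lambda[k] = sum / NITER;      /* > 0 indicates chaos */
        }

        for (int k = 0; k < NPAR; k++)
            printf("%f %f\n", 3.5 + 0.5 * k / (NPAR - 1), lambda[k]);
        return 0;
    }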
High-temperature behavior of advanced spacecraft TPS
NASA Technical Reports Server (NTRS)
Pallix, Joan
1994-01-01
The objective of this work has been to develop more efficient, lighter-weight, and higher-temperature thermal protection systems (TPS) for future reentry space vehicles. The research carried out during this funding period involved the design, analysis, testing, fabrication, and characterization of thermal protection materials to be used on future hypersonic vehicles. This work is important for the prediction of material performance at high temperature and aids in the design of thermal protection systems for a number of programs, such as the National Aerospace Plane (NASP), Pegasus and Pegasus/SWERVE, the Comet Rendezvous and Flyby vehicle (CRAF), and the Mars mission entry vehicles. Research has been performed in two main areas: development and testing of thermal protection systems (TPS), and computational research. A variety of TPS materials and coatings were developed during this funding period. Ceramic coatings were developed for flexible insulations as well as for low-density ceramic insulators. Chemical vapor deposition processes were established for the fabrication of ceramic matrix composites. Experimental testing and characterization of these materials has been carried out in the NASA Ames Research Center Thermophysics Facilities and in the Ames time-of-flight mass spectrometer facility. By means of computation, we have been better able to understand the flow structure and properties of the TPS components and to estimate the aerothermal heating, stress, ablation rate, thermal response, and shape change on the surfaces of TPS. In addition, work for the computational surface thermochemistry project has included modification of existing computer codes and creation of new codes to model material response and shape change on atmospheric entry vehicles in a variety of environments (e.g., Earth and Mars atmospheres).
Dan Goldin Presentation: Pathway to the Future
NASA Technical Reports Server (NTRS)
1999-01-01
In the "Path to the Future" presentation held at NASA's Langley Center on March 31, 1999, NASA's Administrator Daniel S. Goldin outlined the future direction and strategies of NASA in relation to the general space exploration enterprise. NASA's Vision, Future System Characteristics, Evolutions of Engineering, and Revolutionary Changes are the four main topics of the presentation. In part one, the Administrator talks in detail about NASA's vision in relation to the NASA Strategic Activities that are Space Science, Earth Science, Human Exploration, and Aeronautics & Space Transportation. Topics discussed in this section include: space science for the 21st century, flying in mars atmosphere (mars plane), exploring new worlds, interplanetary internets, earth observation and measurements, distributed information-system-in-the-sky, science enabling understanding and application, space station, microgravity, science and exploration strategies, human mars mission, advance space transportation program, general aviation revitalization, and reusable launch vehicles. In part two, he briefly talks about the future system characteristics. He discusses major system characteristics like resiliencey, self-sufficiency, high distribution, ultra-efficiency, and autonomy and the necessity to overcome any distance, time, and extreme environment barriers. Part three of Mr. Goldin's talk deals with engineering evolution, mainly evolution in the Computer Aided Design (CAD)/Computer Aided Engineering (CAE) systems. These systems include computer aided drafting, computerized solid models, virtual product development (VPD) systems, networked VPD systems, and knowledge enriched networked VPD systems. In part four, the last part, the Administrator talks about the need for revolutionary changes in communication and networking areas of a system. According to the administrator, the four major areas that need cultural changes in the creativity process are human-centered computing, an infrastructure for distributed collaboration, rapid synthesis and simulation tools, and life-cycle integration and validation. Mr. Goldin concludes his presentation with the following maxim "Collaborate, Integrate, Innovate or Stagnate and Evaporate." He also answers some questions after the presentation.
Building a virtual network in a community health research training program.
Lau, F; Hayward, R
2000-01-01
To describe the experiences, lessons, and implications of building a virtual network as part of a two-year community health research training program in a Canadian province. An action research field study in which 25 health professionals from 17 health regions participated in a seven-week training course on health policy, management, economics, research methods, data analysis, and computer technology. The participants then returned to their regions to apply the knowledge in different community health research projects. Ongoing faculty consultations and support were provided as needed. Each participant was given a notebook computer with the necessary software, Internet access, and technical support for two years, to access information resources, engage in group problem solving, share ideas and knowledge, and collaborate on projects. Data collected over two years consisted of program documents, records of interviews with participants and staff, meeting notes, computer usage statistics, automated online surveys, computer conference postings, program Web site, and course feedback. The analysis consisted of detailed review and comparison of the data from different sources. NUD*IST was then used to validate earlier study findings. The ten key lessons are that role clarity, technology vision, implementation staging, protected time, just-in-time training, ongoing facilitation, work integration, participatory design, relationship building, and the demonstration of results are essential ingredients for building a successful network. This study provides a descriptive model of the processes involved in developing, in the community health setting, virtual networks that can be used as the basis for future research and as a practical guide for managers.
Energy Efficient Engine: Control system component performance report
NASA Technical Reports Server (NTRS)
Beitler, R. S.; Bennett, G. W.
1984-01-01
An Energy Efficient Engine (E3) program was established to develop technology for improving the energy efficiency of future commercial transport aircraft engines. As part of this program, General Electric designed and tested a new engine. The design, fabrication, and bench and engine testing of the Full Authority Digital Electronic Control (FADEC) system used for controlling the E3 Demonstrator Engine are described. The system design was based on many of the proven concepts and component designs used on the General Electric family of engines. One significant difference is the use of the FADEC in place of the hydromechanical computation currently used.
Telescience testbed pilot program, volume 1: Executive summary
NASA Technical Reports Server (NTRS)
Leiner, Barry M.
1989-01-01
Space Station Freedom and its associated labs, coupled with the availability of new computing and communications technologies, have the potential for significantly enhancing scientific research. A Telescience Testbed Pilot Program (TTPP) was undertaken, aimed at developing the experience base to deal with issues in the design of the future information system of the Space Station era. The testbeds represented four scientific disciplines (astronomy and astrophysics, earth sciences, life sciences, and microgravity sciences) and studied issues in payload design, operation, and data analysis. This volume, part of a three-volume set containing the results of the TTPP, is the executive summary.
Ascent guidance algorithm using lidar wind measurements
NASA Technical Reports Server (NTRS)
Cramer, Evin J.; Bradt, Jerre E.; Hardtla, John W.
1990-01-01
The formulation of a general nonlinear programming guidance algorithm that incorporates wind measurements in the computation of ascent guidance steering commands is discussed. A nonlinear programming (NLP) algorithm that is designed to solve a very general problem has the potential to address the diversity demanded by future launch systems. Using B-splines for the command functional form allows the NLP algorithm to adjust the shape of the command profile to achieve optimal performance. The algorithm flexibility is demonstrated by simulation of ascent with dynamic loading constraints through a set of random wind profiles with and without wind sensing capability.
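To make the B-spline parameterization concrete, here is a minimal sketch (not the flight code) of evaluating a scalar steering command from a few control points with the standard uniform cubic B-spline basis; in the NLP formulation the control-point values c[i] would be the optimization variables, and adjusting them reshapes the whole profile smoothly:

    #include <stdio.h>

    /* value of one uniform cubic B-spline segment, t in [0,1) */
    static double bspline3(double p0, double p1, double p2, double p3, double t)
    {
        double t2 = t * t, t3 = t2 * t;
        return ((   -p0 + 3*p1 - 3*p2 + p3) * t3
              + ( 3*p0 - 6*p1 + 3*p2     ) * t2
              + (-3*p0        + 3*p2     ) * t
              + (    p0 + 4*p1 + p2      )) / 6.0;
    }

    int main(void)
    {
        /* hypothetical pitch-command control points (the NLP variables) */
        double c[] = {0.0, 2.0, 5.0, 7.0, 8.0, 8.5};
        int nseg = 3;                     /* 6 points give 3 cubic segments */
        for (int s = 0; s < nseg; s++)
            for (double t = 0.0; t < 1.0; t += 0.25)
                printf("%f\n", bspline3(c[s], c[s+1], c[s+2], c[s+3], t));
        return 0;
    }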
NASA Technical Reports Server (NTRS)
Fishbach, L. H.
1979-01-01
The computational techniques utilized to determine the optimum propulsion systems for future aircraft applications and to identify system tradeoffs and technology requirements are described. The characteristics and use of the following computer codes are discussed: (1) NNEP - a very general cycle analysis code that can assemble an arbitrary matrix of fans, turbines, ducts, shafts, etc., into a complete gas turbine engine and compute on- and off-design thermodynamic performance; (2) WATE - a preliminary design procedure for calculating engine weight using the component characteristics determined by NNEP; (3) POD DRG - a table look-up program to calculate wave and friction drag of nacelles; (4) LIFCYC - a computer code developed to calculate life cycle costs of engines based on the output from WATE; and (5) INSTAL - a computer code developed to calculate installation effects, inlet performance, and inlet weight. Examples are given to illustrate how these computer techniques can be applied to analyze and optimize propulsion system fuel consumption, weight, and cost for representative types of aircraft and missions.
NASA Technical Reports Server (NTRS)
Kumar, A.; Rudy, D. H.; Drummond, J. P.; Harris, J. E.
1982-01-01
Several two- and three-dimensional external and internal flow problems solved on the STAR-100 and CYBER-203 vector processing computers are described. The flow fields were described by the full Navier-Stokes equations, which were solved by explicit finite-difference algorithms. Problem results and computer system requirements are presented. Program organization and data base structures for three-dimensional computer codes, which will eliminate or reduce page faulting, are discussed. Storage requirements for three-dimensional codes are reduced by recalculating the transformation metric data at each step; as a result, the number of in-core grid points was increased by 50% to 150,000, with a 10% increase in execution time. An assessment of current and future machine requirements shows that even on the CYBER-205 computer only a few problems can be solved realistically. Estimates reveal that the present situation is more storage-limited than compute-rate-limited, but advancements in both storage and speed are essential to realistically calculate three-dimensional flows.
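The storage reduction mentioned above is a classic recompute-versus-store trade. A schematic C fragment of the idea (ours, with a stand-in metric function): rather than keeping precomputed metric arrays in core, each sweep recomputes the metric term at the point where it is needed, trading a little arithmetic for a large saving in storage:

    /* Instead of loading a precomputed metric array (extra 3-D storage),
       recompute the transformation metric at each grid point in the sweep. */
    static double metric(int i, double dx)
    {
        return 1.0 / dx;                  /* stand-in for a real grid mapping */
    }

    void sweep(const double *u, double *unew, int n, double dx, double dt)
    {
        for (int i = 1; i < n - 1; i++) {
            double xi_x = metric(i, dx);  /* recomputed, never stored */
            unew[i] = u[i] - 0.5 * dt * xi_x * (u[i+1] - u[i-1]);
        }
    }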
UC Merced Center for Computational Biology Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Colvin, Michael; Watanabe, Masakatsu
Final report for the UC Merced Center for Computational Biology. The Center for Computational Biology (CCB) was established to support multidisciplinary scientific research and academic programs in computational biology at the new University of California campus in Merced. In 2003, the growing gap between biology research and education was documented in a report from the National Academy of Sciences, Bio2010: Transforming Undergraduate Education for Future Research Biologists. We believed that a new type of biological sciences undergraduate and graduate program that emphasized biological concepts and treated biology as an information science would have a dramatic impact in enabling the transformation of biology. UC Merced, as the newest UC campus and the first new U.S. research university of the 21st century, was ideally suited to adopt an alternate strategy: to create new biological sciences majors and a graduate group that incorporated the strong computational and mathematical vision articulated in the Bio2010 report. CCB aimed to leverage this strong commitment at UC Merced to develop a new educational program based on the principle of biology as a quantitative, model-driven science. We also expected that the center would enable the dissemination of computational biology course materials to other universities and feeder institutions, and foster research projects that exemplify a mathematical and computation-based approach to the life sciences. As this report describes, the CCB has been successful in achieving these goals, and multidisciplinary computational biology is now an integral part of UC Merced undergraduate, graduate, and research programs in the life sciences. The CCB began in fall 2004 with the aid of an award from the U.S. Department of Energy (DOE), under its Genomes to Life program of support for the development of research and educational infrastructure in the modern biological sciences. This report to DOE describes the research and academic programs made possible by the CCB from its inception until August 2010, the end of the final extension. Although DOE support for the center ended in August 2010, the CCB will continue to exist and support its original objectives. The research and academic programs fostered by the CCB have led to additional extramural funding from other agencies, and we anticipate that CCB will continue to provide support for the quantitative and computational biology programs at UC Merced for many years to come. Since its inception in fall 2004, CCB research projects have involved continuous multi-institutional collaboration with Lawrence Livermore National Laboratory (LLNL) and the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign, as well as individual collaborators at other sites. CCB-affiliated faculty cover a broad range of computational and mathematical research, including molecular modeling, cell biology, applied math, evolutionary biology, and bioinformatics. The CCB sponsored the first distinguished speaker series at UC Merced, which had an important role in spreading the word about the computational biology emphasis at this new campus. One of CCB's original goals was to help train a new generation of biologists who bridge the gap between the computational and life sciences. To achieve this goal, by summer 2006 a summer undergraduate internship program had been established under CCB to train biological sciences researchers in highly mathematical and computationally intensive methods.
By the end of summer 2010, 44 undergraduate students had gone through this program. Of those participants, 11 students have been admitted to graduate schools and 10 more are interested in pursuing graduate studies in the sciences. The center is also continuing to facilitate the development and dissemination of undergraduate and graduate course materials based on the latest research in computational biology.
A design methodology for portable software on parallel computers
NASA Technical Reports Server (NTRS)
Nicol, David M.; Miller, Keith W.; Chrisman, Dan A.
1993-01-01
This final report for research supported by grant number NAG-1-995 documents our progress in addressing two difficulties in parallel programming. The first difficulty is developing software that will execute quickly on a parallel computer. The second is transporting software between dissimilar parallel computers. In general, more hardware-specific information is included in software designs for parallel computers than in designs for sequential computers; this inclusion is an instance of portability being sacrificed for high performance. New parallel computers are introduced frequently, so to keep software running on the current high-performance hardware a developer almost continually faces yet another expensive software transportation. The problem addressed by this research is to create a design methodology that helps designers more precisely control both portability and hardware-specific programming details. The research emphasizes programming for scientific applications. We completed our study of the parallelizability of a subsystem of the NASA Earth Radiation Budget Experiment (ERBE) data processing system. This work is summarized in section two; a more detailed description is provided in Appendix A ('Programming Practices to Support Eventual Parallelism'). Mr. Chrisman, a graduate student, wrote and successfully defended a Ph.D. dissertation proposal which describes our research on the issues of software portability and high performance. The research tasks are specified in the proposal. The proposal, 'A Design Methodology for Portable Software on Parallel Computers', is summarized in section three and provided in its entirety in Appendix B. We are currently studying a proposed subsystem of the NASA Clouds and the Earth's Radiant Energy System (CERES) data processing system. This software is the proof of concept for the Ph.D. dissertation. We have implemented and measured the performance of a portion of this subsystem on the Intel iPSC/2 parallel computer. These results are provided in section four. Our future work is summarized in section five, our acknowledgements are stated in section six, and references for published papers associated with NAG-1-995 are provided in section seven.
System analysis in rotorcraft design: The past decade
NASA Technical Reports Server (NTRS)
Galloway, Thomas L.
1988-01-01
Rapid advances in the technology of electronic digital computers and the need for an integrated synthesis approach in developing future rotorcraft programs have led to increased emphasis on system analysis techniques in rotorcraft design. The task in systems analysis is to deal with complex, interdependent, and conflicting requirements in a structured manner so that rational and objective decisions can be made. Whether the results are wisdom or rubbish depends upon the validity and, sometimes more importantly, the consistency of the inputs, the correctness of the analysis, and a sensible choice of measures of effectiveness for drawing conclusions. In rotorcraft design this means combining design requirements, technology assessment, and sensitivity analysis with review techniques currently in use by NASA and Army organizations in developing research programs and vehicle specifications for rotorcraft. These procedures span simple graphical approaches to comprehensive analyses on large mainframe computers. Examples of recent applications to military and civil missions are highlighted.
Molecular implementation of simple logic programs.
Ran, Tom; Kaplan, Shai; Shapiro, Ehud
2009-10-01
Autonomous programmable computing devices made of biomolecules could interact with a biological environment and be used in future biological and medical applications. Biomolecular implementations of finite automata and logic gates have already been developed. Here, we report an autonomous programmable molecular system based on the manipulation of DNA strands that is capable of performing simple logical deductions. Using molecular representations of facts such as Man(Socrates) and rules such as Mortal(X) <-- Man(X) (Every Man is Mortal), the system can answer molecular queries such as Mortal(Socrates)? (Is Socrates Mortal?) and Mortal(X)? (Who is Mortal?). This biomolecular computing system compares favourably with previous approaches in terms of expressive power, performance and precision. A compiler translates facts, rules and queries into their molecular representations and subsequently operates a robotic system that assembles the logical deductions and delivers the result. This prototype is the first simple programming language with a molecular-scale implementation.
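The deduction step itself, independent of the DNA realization, is ordinary forward chaining. Below is a toy C sketch of the paper's running example, purely to show the logic the molecules implement; the string representation is ours:

    #include <stdio.h>
    #include <string.h>

    #define MAXF 32
    static char facts[MAXF][64];
    static int  nfacts = 0;

    static int known(const char *f)
    {
        for (int i = 0; i < nfacts; i++)
            if (strcmp(facts[i], f) == 0) return 1;
        return 0;
    }

    int main(void)
    {
        strcpy(facts[nfacts++], "Man(Socrates)");

        /* rule Mortal(X) <- Man(X): forward-chain over matching facts */
        for (int i = 0; i < nfacts && nfacts < MAXF; i++)
            if (strncmp(facts[i], "Man(", 4) == 0) {
                char derived[64];
                snprintf(derived, sizeof derived, "Mortal(%s", facts[i] + 4);
                if (!known(derived)) strcpy(facts[nfacts++], derived);
            }

        /* query Mortal(Socrates)? */
        printf("Mortal(Socrates)? %s\n",
               known("Mortal(Socrates)") ? "yes" : "no");
        return 0;
    }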
NASA Technical Reports Server (NTRS)
Hall, G. F.
1975-01-01
A numerical analysis was developed to determine the airloads on helicopter rotors operating under near-hovering flight conditions capable of producing impulsive noise. A computer program was written in which the solutions for the rotor tip vortex geometry, inflow, aeroelastic response, and airloads are solved in a coupled manner at sequential time steps, with or without the influence of an imposed steady ambient wind or transient gust. The program was developed for future applications in which predicted airloads would be incorporated in an acoustics analysis to predict and analyze impulsive noise (blade slap). The analysis was applied to a hovering full-scale rotor for which impulsive noise was recorded in the presence of ambient wind. The predicted tip vortex coordinates are in reasonable agreement with the test data, and the blade airload solutions converged to periodic behavior for imposed steady ambient wind conditions.
Management of CAD/CAM information: Key to improved manufacturing productivity
NASA Technical Reports Server (NTRS)
Fulton, R. E.; Brainin, J.
1984-01-01
A key element of improved industry productivity is effective management of CAD/CAM information. To stimulate advancements in this area, a joint NASA/Navy/Industry project designated Integrated Programs for Aerospace-Vehicle Design (IPAD) is underway with the goal of raising aerospace industry productivity through advancement of technology to integrate and manage information involved in the design and manufacturing process. The project complements traditional NASA/DOD research to develop aerospace design technology and the Air Force's Integrated Computer-Aided Manufacturing (ICAM) program to advance CAM technology. IPAD research is guided by an Industry Technical Advisory Board (ITAB) composed of over 100 representatives from aerospace and computer companies. The IPAD accomplishments to date in development of requirements and prototype software for various levels of company-wide CAD/CAM data management are summarized, and plans for development of technology for management of distributed CAD/CAM data and information required to control future knowledge-based CAD/CAM systems are discussed.
Design of the aerosol sampling manifold for the Southern Great Plains site
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leifer, R.; Knuth, R.H.; Guggenheim, S.F.
1995-04-01
To meet the needs of the ARM program, the Environmental Measurements Laboratory (EML) has the responsibility to establish a surface aerosol measurements program at the Southern Great Plains (SGP) site in Lamont, OK. At the present time, EML has scheduled installation of five instruments at SGP: a single-wavelength nephelometer, an optical particle counter (OPC), a condensation particle counter (CPC), an optical absorption monitor (OAM), and an ozone monitor. ARM's operating protocol requires that all the observational data be placed online and sent to the main computer facility in real time. EML currently maintains a computer file containing back trajectory (BT) analyses for the SGP site. These trajectories are used to characterize air mass types as they pass over the site. EML is continuing to calculate and store the resulting trajectory analyses for future use by the ARM science team.
Benchmarks for target tracking
NASA Astrophysics Data System (ADS)
Dunham, Darin T.; West, Philip D.
2011-09-01
The term benchmark originates from the chiseled horizontal marks that surveyors made, into which an angle iron could be placed to bracket ("bench") a leveling rod, ensuring that the leveling rod could be repositioned in exactly the same place in the future. A benchmark in computing terms is the result of running a computer program, or a set of programs, in order to assess the relative performance of an object by running a number of standard tests and trials against it. This paper discusses the history of simulation benchmarks that are being used by multiple branches of the military and agencies of the US government. These benchmarks range from missile defense applications to chemical-biological situations. Typically, a benchmark is used with Monte Carlo runs in order to tease out how algorithms deal with variability and the range of possible inputs. We also describe problems that can be solved by a benchmark.
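In code, the benchmark-plus-Monte-Carlo pattern reduces to a fixed, repeatable harness: a seeded scenario generator, many trials, and summary statistics. Below is a generic sketch (unrelated to any specific tracker), with a trivial estimator standing in for the algorithm under test:

    #include <math.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    /* stand-in for the algorithm under test: mean of noisy measurements */
    static double estimate(const double *z, int n)
    {
        double s = 0.0;
        for (int i = 0; i < n; i++) s += z[i];
        return s / n;
    }

    int main(void)
    {
        enum { TRIALS = 1000, N = 10000 };
        static double z[N];
        double truth = 1.0, sum_sq_err = 0.0;
        srand(12345);                      /* fixed seed: repeatable benchmark */

        clock_t t0 = clock();
        for (int t = 0; t < TRIALS; t++) {
            for (int i = 0; i < N; i++)    /* Monte Carlo scenario inputs */
                z[i] = truth + ((double)rand() / RAND_MAX - 0.5);
            double e = estimate(z, N) - truth;
            sum_sq_err += e * e;
        }
        double secs = (double)(clock() - t0) / CLOCKS_PER_SEC;

        printf("RMS error %.6f over %d trials in %.3f s\n",
               sqrt(sum_sq_err / TRIALS), TRIALS, secs);
        return 0;
    }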
Numerical, analytical, experimental study of fluid dynamic forces in seals
NASA Technical Reports Server (NTRS)
Shapiro, William; Artiles, Antonio; Aggarwal, Bharat; Walowit, Jed; Athavale, Mahesh M.; Preskwas, Andrzej J.
1992-01-01
NASA/Lewis Research Center is sponsoring a program to provide computer codes for analyzing and designing turbomachinery seals for future aerospace and engine systems. The program is made up of three principal components: (1) the development of advanced three-dimensional (3-D) computational fluid dynamics codes, (2) the production of simpler two-dimensional (2-D) industrial codes, and (3) the development of a knowledge-based system (KBS) that contains an expert system to assist in seal selection and design. The first task has been to concentrate on cylindrical geometries with straight, tapered, and stepped bores. Improvements have been made through adoption of a colocated grid formulation and incorporation of higher-order, time-accurate schemes for transient analysis and higher-order discretization schemes for spatial derivatives. This report describes the mathematical formulations and presents a variety of 2-D results, including labyrinth and brush seal flows. Extensions to 3-D are presently in progress.
NASA Marshall Engineering Thermosphere Model. 2.0
NASA Technical Reports Server (NTRS)
Owens, J. K.
2002-01-01
This Technical Memorandum describes the NASA Marshall Engineering Thermosphere Model-Version 2.0 (MET-V 2.0) and contains an explanation on the use of the computer program along with an example of the MET-V 2.0 model products. The MET-V 2.0 provides an update to the 1988 version of the model. It provides information on the total mass density, temperature, and individual species number densities for any altitude between 90 and 2,500 km as a function of latitude, longitude, time, and solar and geomagnetic activity. A description is given for use of estimated future 13-mo smoothed solar flux and geomagnetic index values as input to the model. Address technical questions on the MET-V 2.0 and associated computer program to Jerry K. Owens, Spaceflight Experiments Group, Marshall Space Flight Center, Huntsville, AL 35812 (256-961-7576; e-mail Jerry.Owens@msfc.nasa.gov).
Future perspectives - proposal for Oxford Physiome Project.
Oku, Yoshitaka
2010-01-01
The Physiome Project is an effort to understand living creatures using an "analysis by synthesis" strategy, i.e., by reproducing their behaviors. To achieve its goal, sharing developed models between different computer languages and application programs, so they can be incorporated into integrated models, is critical. To date, several XML-based markup languages have been developed for this purpose. However, source code written in XML-based languages is very difficult to read and edit using text editors. An alternative is to use an object-oriented meta-language, which can be translated to different computer languages and transplanted to different application programs. Object-oriented languages are suitable for describing structural organization through hierarchical classes and for taking advantage of statistical properties to reduce the number of parameters while keeping the complexity of behaviors. Using object-oriented languages to describe each element and posting the descriptions to a public domain should be the next step in building up integrated models of the respiratory control system.
New computing systems and their impact on structural analysis and design
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.
1989-01-01
A review is given of the recent advances in computer technology that are likely to impact structural analysis and design. The computational needs for future structures technology are described. The characteristics of new and projected computing systems are summarized. Advances in programming environments, numerical algorithms, and computational strategies for new computing systems are reviewed, and a novel partitioning strategy is outlined for maximizing the degree of parallelism. The strategy is designed for computers with a shared memory and a small number of powerful processors (or a small number of clusters of medium-range processors). It is based on approximating the response of the structure by a combination of symmetric and antisymmetric response vectors, each obtained using a fraction of the degrees of freedom of the original finite element model. The strategy was implemented on the CRAY X-MP/4 and the Alliant FX/8 computers. For nonlinear dynamic problems on the CRAY X-MP with four CPUs, it resulted in an order of magnitude reduction in total analysis time, compared with the direct analysis on a single-CPU CRAY X-MP machine.
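The symmetric/antisymmetric splitting at the heart of the partitioning strategy is easy to state. Below is a schematic C fragment (ours, not from the paper) for a structure with a mirror plane: each component is determined by half the degrees of freedom, so the two half-size problems can be assigned to different processors, and the full response is recovered as their sum:

    /* Decompose a response vector u on a mirror-symmetric structure into
       symmetric and antisymmetric parts about the midplane. */
    void split(const double *u, double *us, double *ua, int n)
    {
        for (int i = 0; i < n; i++) {
            double ur = u[n - 1 - i];     /* DOF reflected in the mirror plane */
            us[i] = 0.5 * (u[i] + ur);    /* symmetric component */
            ua[i] = 0.5 * (u[i] - ur);    /* antisymmetric component */
        }
    }
    /* reconstruction: u[i] == us[i] + ua[i] for every i */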
On the efficacy of a computer-based program to teach visual Braille reading.
Scheithauer, Mindy C; Tiger, Jeffrey H; Miller, Sarah J
2013-01-01
Scheithauer and Tiger (2012) created an efficient computerized program that taught 4 sighted college students to select text letters when presented with visual depictions of Braille alphabetic characters and resulted in the emergence of some braille reading. The current study extended these results to a larger sample (n = 81) and compared the efficacy and efficiency of the instructional program using 2 different response modalities. One variation of the program required a response in a multiple-choice format, and the other variation required a keyed response. Both instructional programs resulted in increased braille letter identification and braille reading. These skills were maintained at a follow-up session 7 to 14 days later. The mean time needed to complete the program was 22.8 min across participants. Implications of these results for future research, as well as practical implications for teaching the braille alphabet, are discussed.
Aerodynamics of advanced axial-flow turbomachinery
NASA Technical Reports Server (NTRS)
Serovy, G. K.; Kavanagh, P.; Kiishi, T. H.
1980-01-01
A multi-task research program on aerodynamic problems in advanced axial-flow turbomachine configurations was carried out at Iowa State University. The elements of this program were intended to contribute directly to the improvement of compressor, fan, and turbine design methods. Experimental efforts in intra-passage flow pattern measurements, unsteady blade row interaction, and control of secondary flow are included, along with computational work on inviscid-viscous interaction blade passage flow techniques. This final report summarizes the results of this program and indicates directions which might be taken in following up these results in future work. In a separate task a study was made of existing turbomachinery research programs and facilities in universities located in the United States. Some potentially significant research topics are discussed which might be successfully attacked in the university atmosphere.
NASA Astrophysics Data System (ADS)
Stevens, Rick
2008-07-01
The fourth annual Scientific Discovery through Advanced Computing (SciDAC) Conference was held June 13-18, 2008, in Seattle, Washington. The SciDAC conference series is the premier communitywide venue for presentation of results from the DOE Office of Science's interdisciplinary computational science program. Started in 2001 and renewed in 2006, the DOE SciDAC program is the country's - and arguably the world's - most significant interdisciplinary research program supporting the development of advanced scientific computing methods and their application to fundamental and applied areas of science. SciDAC supports computational science across many disciplines, including astrophysics, biology, chemistry, fusion sciences, and nuclear physics. Moreover, the program actively encourages the creation of long-term partnerships among scientists focused on challenging problems and computer scientists and applied mathematicians developing the technology and tools needed to address those problems. The SciDAC program has played an increasingly important role in scientific research by allowing scientists to create more accurate models of complex processes, simulate problems once thought to be impossible, and analyze the growing amount of data generated by experiments. To help further the research community's ability to tap into the capabilities of current and future supercomputers, Under Secretary for Science, Raymond Orbach, launched the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program in 2003. The INCITE program was conceived specifically to seek out computationally intensive, large-scale research projects with the potential to significantly advance key areas in science and engineering. The program encourages proposals from universities, other research institutions, and industry. During the first two years of the INCITE program, 10 percent of the resources at NERSC were allocated to INCITE awardees. However, demand for supercomputing resources far exceeded available systems; and in 2003, the Office of Science identified increasing computing capability by a factor of 100 as the second priority on its Facilities of the Future list. The goal was to establish leadership-class computing resources to support open science. As a result of a peer reviewed competition, the first leadership computing facility was established at Oak Ridge National Laboratory in 2004. A second leadership computing facility was established at Argonne National Laboratory in 2006. This expansion of computational resources led to a corresponding expansion of the INCITE program. In 2008, Argonne, Lawrence Berkeley, Oak Ridge, and Pacific Northwest national laboratories all provided resources for INCITE. By awarding large blocks of computer time on the DOE leadership computing facilities, the INCITE program enables the largest-scale computations to be pursued. In 2009, INCITE will award over half a billion node-hours of time. The SciDAC conference celebrates progress in advancing science through large-scale modeling and simulation. Over 350 participants attended this year's talks, poster sessions, and tutorials, spanning the disciplines supported by DOE. While the principal focus was on SciDAC accomplishments, this year's conference also included invited presentations and posters from DOE INCITE awardees. 
Another new feature in the SciDAC conference series was an electronic theater and video poster session, which provided an opportunity for the community to see over 50 scientific visualizations in a venue equipped with many high-resolution large-format displays. To highlight the growing international interest in petascale computing, this year's SciDAC conference included a keynote presentation by Herman Lederer from the Max Planck Institut, one of the leaders of the DEISA (Distributed European Infrastructure for Supercomputing Applications) project and a member of the PRACE consortium, Europe's main petascale project. We also heard excellent talks from several European groups, including Laurent Gicquel of CERFACS, who spoke on 'Large-Eddy Simulations of Turbulent Reacting Flows of Real Burners: Status and Challenges', and Jean-Francois Hamelin from EDF, who presented a talk on 'Getting Ready for Petaflop Capacities and Beyond: A Utility Perspective'. Two other compelling addresses gave attendees a glimpse into the future. Tomas Diaz de la Rubia of Lawrence Livermore National Laboratory spoke on a vision for a fusion/fission hybrid reactor known as the 'LIFE Engine' and discussed some of the materials and modeling challenges that need to be overcome to realize the vision for a 1000-year greenhouse-gas-free power source. Dan Reed from Microsoft gave a capstone talk on the convergence of technology, architecture, and infrastructure for cloud computing, data-intensive computing, and exascale computing (10^18 flops/sec). High-performance computing is making rapid strides. The SciDAC community's computational resources are expanding dramatically. In the summer of 2008 the first general-purpose petascale system (the IBM Cell-based RoadRunner at Los Alamos National Laboratory) was recognized in the Top 500 list of fastest machines, heralding the dawn of the petascale era. The DOE's leadership computing facility at Argonne reached number three on the Top 500 and is at the moment the most capable open-science machine, based on an IBM BG/P system with a peak performance of over 550 teraflops/sec. Later this year Oak Ridge is expected to deploy a 1 petaflops/sec Cray XT system. And even before the scientific community has had an opportunity to make significant use of petascale systems, the computer science research community is forging ahead with ideas and strategies for the development of systems that may, by the end of the next decade, sustain exascale performance. Several talks addressed barriers to, and strategies for, achieving exascale capabilities. The last day of the conference was devoted to tutorials hosted by Microsoft Research at a new conference facility in Redmond, Washington. Over 90 people attended the tutorials, which covered topics ranging from an introduction to BG/P programming to advanced numerical libraries. The SciDAC and INCITE programs and the DOE Office of Advanced Scientific Computing Research core program investments in applied mathematics, computer science, and computational and networking facilities provide a nearly optimum framework for advancing computational science for DOE's Office of Science. At a broader level this framework also benefits the entire American scientific enterprise. As we look forward, it is clear that computational approaches will play an increasingly significant role in addressing challenging problems in basic science, energy, and environmental research.
It takes many people to organize and support the SciDAC conference, and I would like to thank as many of them as possible. The backbone of the conference is the technical program; and the task of selecting, vetting, and recruiting speakers is the job of the organizing committee. I thank the members of this committee for all the hard work and the many tens of conference calls that enabled a wonderful program to be assembled. This year the following people served on the organizing committee: Jim Ahrens, LANL; David Bader, LLNL; Bryan Barnett, Microsoft; Peter Beckman, ANL; Vincent Chan, GA; Jackie Chen, SNL; Lori Diachin, LLNL; Dan Fay, Microsoft; Ian Foster, ANL; Mark Gordon, Ames; Mohammad Khaleel, PNNL; David Keyes, Columbia University; Bob Lucas, University of Southern California; Tony Mezzacappa, ORNL; Jeff Nichols, ORNL; David Nowak, ANL; Michael Papka, ANL; Thomas Schultess, ORNL; Horst Simon, LBNL; David Skinner, LBNL; Panagiotis Spentzouris, Fermilab; Bob Sugar, UCSB; and Kathy Yelick, LBNL. I owe a special thanks to Mike Papka and Jim Ahrens for handling the electronic theater. I also thank all those who submitted videos. It was a highly successful experiment. Behind the scenes an enormous amount of work is required to make a large conference go smoothly. First I thank Cheryl Zidel for her tireless efforts as organizing committee liaison and posters chair and, in general, handling all of my end of the program and keeping me calm. I also thank Gail Pieper for her work in editing the proceedings, Beth Cerny Patino for her work on the Organizing Committee website and electronic theater, and Ken Raffenetti for his work in keeping that website working. Jon Bashor and John Hules did an excellent job in handling conference communications. I thank Caitlin Youngquist for the striking graphic design; Dan Fay for tutorials arrangements; and Lynn Dory, Suzanne Stevenson, Sarah Pebelske and Sarah Zidel for on-site registration and conference support. We all owe Yeen Mankin an extra-special thanks for choosing the hotel, handling contracts, arranging menus, securing venues, and reassuring the chair that everything was under control. We are pleased to have obtained corporate sponsorship from Cray, IBM, Intel, HP, and SiCortex. I thank all the speakers and panel presenters. I also thank the former conference chairs Tony Mezzacappa, Bill Tang, and David Keyes, who were never far away for advice and encouragement. Finally, I offer my thanks to Michael Strayer, without whose leadership, vision, and persistence the SciDAC program would not have come into being and flourished. I am honored to be part of his program and his friend. Rick Stevens Seattle, Washington July 18, 2008
The scaling issue: scientific opportunities
NASA Astrophysics Data System (ADS)
Orbach, Raymond L.
2009-07-01
A brief history of the Leadership Computing Facility (LCF) initiative is presented, along with the importance of SciDAC to the initiative. The initiative led to the creation of the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, open to all researchers in the US and abroad, based solely on scientific merit through peer review, and awarding sizeable allocations (typically millions of processor-hours per project). The development of the nation's LCFs has enabled available INCITE processor-hours to double roughly every eight months since the program's inception in 2004. The 'top ten' LCF accomplishments in 2009 illustrate the breadth of the scientific program, while the 75 million processor-hours allocated to American business since 2006 highlight INCITE contributions to US competitiveness. The extrapolation of INCITE processor-hours into the future brings new possibilities for many 'classic' scaling problems; complex systems and atomic displacements to cracks are but two examples. However, even with increasing computational speeds, the development of theory, numerical representations, algorithms, and efficient implementations is required for substantial success, exhibiting the crucial role that SciDAC will play.
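For concreteness, the quoted doubling rate implies a simple extrapolation rule (our arithmetic, not a program commitment): with t in months,

    H(t) = H(0) * 2^(t/8)

so the multiplier is 2^(12/8), roughly 2.8 per year, and 2^(48/8) = 64 over four years.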
NASA Technical Reports Server (NTRS)
Cockrell, Charles E., Jr.
2003-01-01
The Next Generation Launch Technology (NGLT) program, Vehicle Systems Research and Technology (VSR&T) project is pursuing technology advancements in aerothermodynamics, aeropropulsion and flight mechanics to enable development of future reusable launch vehicle (RLV) systems. The current design trade space includes rocket-propelled, hypersonic airbreathing and hybrid systems in two-stage and single-stage configurations. Aerothermodynamics technologies include experimental and computational databases to evaluate stage separation of two-stage vehicles as well as computational and trajectory simulation tools for this problem. Additionally, advancements in high-fidelity computational tools and measurement techniques are being pursued along with the study of flow physics phenomena, such as boundary-layer transition. Aero-propulsion technology development includes scramjet flowpath development and integration, with a current emphasis on hypervelocity (Mach 10 and above) operation, as well as the study of aero-propulsive interactions and the impact on overall vehicle performance. Flight mechanics technology development is focused on advanced guidance, navigation and control (GN&C) algorithms and adaptive flight control systems for both rocket-propelled and airbreathing vehicles.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amadio, G.; et al.
An intensive R&D and programming effort is required to accomplish new challenges posed by future experimental high-energy particle physics (HEP) programs. The GeantV project aims to narrow the gap between the performance of the existing HEP detector simulation software and the ideal performance achievable, exploiting latest advances in computing technology. The project has developed a particle detector simulation prototype capable of transporting in parallel particles in complex geometries exploiting instruction level microparallelism (SIMD and SIMT), task-level parallelism (multithreading) and high-level parallelism (MPI), leveraging both the multi-core and the many-core opportunities. We present preliminary verification results concerning the electromagnetic (EM) physicsmore » models developed for parallel computing architectures within the GeantV project. In order to exploit the potential of vectorization and accelerators and to make the physics model effectively parallelizable, advanced sampling techniques have been implemented and tested. In this paper we introduce a set of automated statistical tests in order to verify the vectorized models by checking their consistency with the corresponding Geant4 models and to validate them against experimental data.« less
Simulation of Rotary-Wing Near-Wake Vortex Structures Using Navier-Stokes CFD Methods
NASA Technical Reports Server (NTRS)
Kenwright, David; Strawn, Roger; Ahmad, Jasim; Duque, Earl; Warmbrodt, William (Technical Monitor)
1997-01-01
This paper will use high-resolution Navier-Stokes computational fluid dynamics (CFD) simulations to model the near-wake vortex roll-up behind rotor blades. The locations and strengths of the trailing vortices will be determined from newly developed visualization and analysis software tools applied to the CFD solutions. Computational results for rotor near-wake vortices will be used to study the near-wake vortex roll-up for highly twisted tiltrotor blades. These rotor blades typically have combinations of positive and negative spanwise loading and complex vortex wake interactions. Results of the computational studies will be compared to the vortex-lattice wake models that are frequently used in rotorcraft comprehensive codes. Information from these comparisons will be used to improve the rotor wake models in the Tilt-Rotor Acoustic Code (TRAC) portion of NASA's Short Haul Civil Transport (SHCT) program. Accurate modeling of the rotor wake is an important part of this program and crucial to the successful design of future civil tiltrotor aircraft. The rotor wake system plays an important role in blade-vortex interaction noise, a major problem for all rotorcraft, including tiltrotors.
Navy Enhanced Sierra Mechanics (NESM): Toolbox for predicting Navy shock and damage
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moyer, Thomas; Stergiou, Jonathan; Reese, Garth
Here, the US Navy is developing a new suite of computational mechanics tools (Navy Enhanced Sierra Mechanics) for the prediction of ship response, damage, and shock environments transmitted to vital systems during threat weapon encounters. NESM includes fully coupled Euler-Lagrange solvers tailored to ship shock/damage predictions. NESM is optimized to support high-performance computing architectures, providing the physics-based ship response/threat weapon damage predictions needed to support the design and assessment of highly survivable ships. NESM is being employed to support current Navy ship design and acquisition programs while being further developed for future Navy fleet needs.
Hypersonic Boundary-Layer Transition for X-33 Phase 2 Vehicle
NASA Technical Reports Server (NTRS)
Thompson, Richard A.; Hamilton, Harris H., II; Berry, Scott A.; Horvath, Thomas J.; Nowak, Robert J.
1998-01-01
A status review of the experimental and computational work performed to support the X-33 program in the area of hypersonic boundary-layer transition is presented. Global transition fronts are visualized using thermographic phosphor measurements. Results are used to derive transition correlations for "smooth body" and discrete roughness data and a computational tool is developed to predict transition onset for X-33 using these results. The X-33 thermal protection system appears to be conservatively designed for transition effects based on these studies. Additional study is needed to address concerns related to surface waviness. A discussion of future test plans is included.
Adaptive Fuzzy Systems in Computational Intelligence
NASA Technical Reports Server (NTRS)
Berenji, Hamid R.
1996-01-01
In recent years, interest in computational intelligence techniques, which currently include neural networks, fuzzy systems, and evolutionary programming, has grown significantly, and a number of their applications have been developed in government and industry. In the future, an essential element in these systems will be fuzzy systems that can learn from experience by using neural networks to refine their performance. The GARIC architecture, introduced earlier, is an example of a fuzzy reinforcement learning system that has been applied in several control domains, such as cart-pole balancing, simulation of Space Shuttle orbital operations, and tether control. A number of examples from GARIC's applications in these domains will be demonstrated.
Institute for Defense Analysis. Annual Report 1993.
1993-01-01
model of computation on the user. One significant advantage of this approach is that AC is used effectively to program high performance subroutines ... capability to perform focused radar imaging through random media (tree canopies and soil, for example) and its capability to overcome heavy ... Materials: The ability to produce advanced materials at low cost is critical to the performance and affordability of future defense systems. IDA is at ...
Evaluation of Shipbuilding CAD/CAM/CIM Systems - Phase II (Requirements for Future Systems)
1997-02-01
INNOVATION, MARINE INDUSTRY STANDARDS, WELDING, INDUSTRIAL ENGINEERING, EDUCATION AND TRAINING. THE NATIONAL SHIPBUILDING RESEARCH PROGRAM, February 1997, NSRP 0479 ... an analysis of CAD/CAM/CIM in shipyards, ship-design software firms, and allied industries in Europe, Japan and the U.S. The purpose of the analysis ... possible: Black and Veatch, Hitachi Ariake Works, Industrial Technology Institute, Intergraph Corporation, Kockums Computer Systems, Mitsubishi Heavy Industries
Planetary cartography in the next decade: Digital cartography and emerging opportunities
NASA Technical Reports Server (NTRS)
1989-01-01
Planetary maps being produced today will represent views of the solar system for many decades to come. The primary objective of the planetary cartography program is to produce the most complete and accurate maps from hundreds of thousands of planetary images in support of scientific studies and future missions. Here, the utilization of digital techniques and digital bases in response to recent advances in computer technology is emphasized.
The Future: Challenges and Opportunities for Air War College.
1990-04-01
for commuters and other travel. (155:l; 2:--; 42:748; 125:--; 47:31) Magnetically levitating (maglev) trains were developed 20 years ago, and reach ... These trains began operation in Japan and France in the 1970s. Florida is considering using a maglev system for train connections among Miami ... Ballistic Missile; Maglev: Magnetic Levitation; MPC/DPMYI: Air Force Military Personnel Center, Directorate of Plans, Programs and Analysis, Computer Support
The 747 primary flight control systems reliability and maintenance study
NASA Technical Reports Server (NTRS)
1979-01-01
The major operational characteristics of the 747 Primary Flight Control Systems (PFCS) are described. Results of the reliability analysis for separate control functions are presented. The analysis makes use of a NASA computer program which calculates the reliability of redundant systems. Costs for maintaining the 747 PFCS in airline service are assessed. The reliabilities and costs will provide a baseline for use in trade studies of future flight control system designs.
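A minimal sketch of the kind of redundant-system reliability calculation such a program performs is shown below: a k-out-of-n active-redundancy channel where each lane survives the mission with probability r. The lane reliability value is illustrative, not taken from the study.

```python
# Minimal sketch: reliability of a k-out-of-n redundant system with
# independent, identical lanes. The per-lane reliability is illustrative.
from math import comb

def k_of_n_reliability(k, n, r):
    """Probability that at least k of n independent lanes survive."""
    return sum(comb(n, m) * r**m * (1 - r)**(n - m) for m in range(k, n + 1))

# Triple-redundant channel that still functions with 2 of 3 lanes working.
lane_r = 0.999   # per-lane mission reliability (illustrative)
print(f"2-of-3 system reliability: {k_of_n_reliability(2, 3, lane_r):.9f}")
```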
PGMS: A Case Study of Collecting PDA-Based Geo-Tagged Malaria-Related Survey Data
Zhou, Ying; Lobo, Neil F.; Wolkon, Adam; Gimnig, John E.; Malishee, Alpha; Stevenson, Jennifer; Sulistyawati; Collins, Frank H.; Madey, Greg
2014-01-01
Using mobile devices such as personal digital assistants (PDAs), smartphones, and tablet computers to electronically collect malaria-related field data is the way forward for field questionnaires. This case study seeks to design PGMS, a generic PDA-based framework for geo-tagged malaria-related survey data collection, usable not only for large-scale community-level geo-tagged electronic malaria surveys but also for a wide variety of electronic data collections for other infectious diseases. The framework includes two parts: the database designed for subsequent cross-sectional data analysis and the customized programs for the six study sites (two in Kenya, three in Indonesia, and one in Tanzania). In addition to the framework development, we also present the methods used when configuring and deploying the PDAs to 1) reduce data entry errors, 2) conserve battery power, 3) field-install the programs onto dozens of handheld devices, 4) translate electronic questionnaires into local languages, 5) prevent data loss, and 6) transfer data from PDAs to computers for future analysis and storage. Since 2008, PGMS has successfully accomplished surveys that recorded 10,871 compounds and households, 52,126 persons, and 17,100 bed nets from the six sites. These numbers are still growing. PMID:25048377
A Hybrid FPGA/Tilera Compute Element for Autonomous Hazard Detection and Navigation
NASA Technical Reports Server (NTRS)
Villalpando, Carlos Y.; Werner, Robert A.; Carson, John M., III; Khanoyan, Garen; Stern, Ryan A.; Trawny, Nikolas
2013-01-01
To increase safety for future missions landing on other planetary or lunar bodies, the Autonomous Landing and Hazard Avoidance Technology (ALHAT) program is developing an integrated sensor for autonomous surface analysis and hazard determination. The ALHAT Hazard Detection System (HDS) consists of a Flash LIDAR for measuring the topography of the landing site, a gimbal to scan across the terrain, and an Inertial Measurement Unit (IMU), along with terrain analysis algorithms to identify the landing site and the local hazards. An FPGA and Manycore processor system was developed to interface all the devices in the HDS, to provide high-resolution timing to accurately measure system state, and to run the surface analysis algorithms quickly and efficiently. In this paper, we will describe how we integrated COTS components such as an FPGA evaluation board, a TILExpress64, and multi-threaded/multi-core aware software to build the HDS Compute Element (HDSCE). The ALHAT program is also working with the NASA Morpheus Project and has integrated the HDS as a sensor on the Morpheus Lander. This paper will also describe how the HDS is integrated with the Morpheus lander and the results of the initial test flights with the HDS installed. We will also describe future improvements to the HDSCE.
Experimental, Theoretical and Computational Studies of Plasma-Based Concepts for Future High Energy Accelerators
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joshi, Chan; Mori, W.
2013-10-21
This is the final report on DOE grant number DE-FG02-92ER40727, titled "Experimental, Theoretical and Computational Studies of Plasma-Based Concepts for Future High Energy Accelerators." During this grant period the UCLA program on Advanced Plasma Based Accelerators, headed by Professor C. Joshi, has made many key scientific advances and trained a generation of students, many of whom have stayed in this research field and even started research programs of their own. In this final report, however, we focus on the last three years of the grant and report on the scientific progress made in each of the four tasks listed under the grant. The four tasks are: Plasma Wakefield Accelerator Research at FACET, SLAC National Accelerator Laboratory; In-House Research at UCLA's Neptune and 20 TW Laser Laboratories; Laser-Wakefield Acceleration (LWFA) in the Self-Guided Regime: Experiments at the Callisto Laser at LLNL; and Theory and Simulations. Major scientific results have been obtained in each of the four tasks described in this report. These have led to publications in prestigious scientific journals and the graduation and continued training of high-quality Ph.D.-level students, and have kept the U.S. at the forefront of the plasma-based accelerator research field.
Mohammed, Emad A; Far, Behrouz H; Naugler, Christopher
2014-01-01
The emergence of massive datasets in a clinical setting presents both challenges and opportunities in data storage and analysis. This so-called "big data" challenges traditional analytic tools and will increasingly require novel solutions adapted from other fields. Advances in information and communication technology present the most viable solutions to big data analysis in terms of efficiency and scalability. It is vital that big data solutions be multithreaded and that data access approaches be precisely tailored to large volumes of semi-structured/unstructured data. The MapReduce programming framework uses two tasks common in functional programming: Map and Reduce. MapReduce is a new parallel processing framework and Hadoop is its open-source implementation on a single computing node or on clusters. Compared with existing parallel processing paradigms (e.g. grid computing and the graphical processing unit (GPU)), MapReduce and Hadoop have two advantages: 1) fault-tolerant storage resulting in reliable data processing by replicating the computing tasks and cloning the data chunks on different computing nodes across the computing cluster; 2) high-throughput data processing via a batch processing framework and the Hadoop distributed file system (HDFS). Data are stored in the HDFS and made available to the slave nodes for computation. In this paper, we review the existing applications of the MapReduce programming framework and its implementation platform Hadoop in clinical big data and related medical health informatics fields. The usage of MapReduce and Hadoop on a distributed system represents a significant advance in clinical big data processing and utilization, and opens up new opportunities in the emerging era of big data analytics. The objective of this paper is to summarize the state-of-the-art efforts in clinical big data analytics and highlight what might be needed to enhance the outcomes of clinical big data analytics tools. The paper concludes by summarizing the potential usage of the MapReduce programming framework and the Hadoop platform to process huge volumes of clinical data in medical health informatics related fields. PMID:25383096
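As an illustration of the Map and Reduce tasks described above, the sketch below follows the Hadoop Streaming convention, in which the mapper and reducer are ordinary scripts reading stdin and writing stdout; in practice they would be launched through the hadoop-streaming jar's -mapper and -reducer options. Counting a hypothetical diagnosis-code field stands in for a clinical aggregation task; the record layout is an assumption.

```python
# Minimal sketch of the MapReduce pattern in Hadoop Streaming form. The
# tab-separated record layout (diagnosis code in column 3) is assumed.
import sys

def mapper(lines):
    # Map: emit (diagnosis_code, 1) for each input record.
    for line in lines:
        fields = line.rstrip("\n").split("\t")
        if len(fields) >= 3:
            print(f"{fields[2]}\t1")

def reducer(lines):
    # Reduce: Hadoop delivers mapper output sorted by key, so counts for a
    # given code arrive contiguously and can be summed with a running total.
    current, total = None, 0
    for line in lines:
        key, value = line.rstrip("\n").split("\t")
        if key != current:
            if current is not None:
                print(f"{current}\t{total}")
            current, total = key, 0
        total += int(value)
    if current is not None:
        print(f"{current}\t{total}")

if __name__ == "__main__":
    # Run as: script.py map   (or)   script.py reduce
    (mapper if sys.argv[1] == "map" else reducer)(sys.stdin)
```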
Design and operations technologies - Integrating the pieces. [for future space systems design
NASA Technical Reports Server (NTRS)
Eldred, C. H.
1979-01-01
Vehicle design and operations are major elements of life-cycle cost (LCC) with critical impacts on the initiation and utilization of future space programs; these areas are reviewed here to identify technology requirements. Common to both areas is the requirement for efficient integration of broad, complex systems. Operations technologies focus on the extension of space-based capabilities and on cost reduction through the combination of innovative design, low-maintenance hardware, and increased manpower productivity. Design technologies focus on computer-aided techniques which increase productivity while maintaining a high degree of flexibility that enhances creativity and permits graceful design changes.
Video-Game-Like Engine for Depicting Spacecraft Trajectories
NASA Technical Reports Server (NTRS)
Upchurch, Paul R.
2009-01-01
GoView is a video-game-like software engine, written in the C and C++ computing languages, that enables real-time, three-dimensional (3D)-appearing visual representation of spacecraft and trajectories (1) from any perspective; (2) at any spatial scale from spacecraft to Solar-system dimensions; (3) in user-selectable time scales; (4) in the past, present, and/or future; (5) with varying speeds; and (6) forward or backward in time. GoView constructs an interactive 3D world by use of spacecraft-mission data from pre-existing engineering software tools. GoView can also be used to produce distributable application programs for depicting NASA orbital missions on personal computers running the Windows XP, Mac OsX, and Linux operating systems. GoView enables seamless rendering of Cartesian coordinate spaces with programmable graphics hardware, whereas prior programs for depicting spacecraft trajectories variously require non-Cartesian coordinates and/or are not compatible with programmable hardware. GoView incorporates an algorithm for nonlinear interpolation between arbitrary reference frames, whereas the prior programs are restricted to special classes of inertial and non-inertial reference frames. Finally, whereas the prior programs present complex user interfaces requiring hours of training, the GoView interface provides guidance, enabling use without any training.
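As a sketch of the frame-interpolation idea mentioned above, the code below implements quaternion slerp, one standard form of smooth nonlinear interpolation between two orientations. It illustrates the general family of techniques only; it is not GoView's actual algorithm, and the quaternions are illustrative.

```python
# Minimal sketch: spherical linear interpolation (slerp) between two unit
# quaternions, a standard way to blend smoothly between reference frames.
import numpy as np

def slerp(q0, q1, t):
    """Interpolate between unit quaternions q0 and q1 at parameter t."""
    q0, q1 = q0 / np.linalg.norm(q0), q1 / np.linalg.norm(q1)
    dot = np.dot(q0, q1)
    if dot < 0.0:                 # take the short arc
        q1, dot = -q1, -dot
    if dot > 0.9995:              # nearly parallel: fall back to lerp
        q = q0 + t * (q1 - q0)
        return q / np.linalg.norm(q)
    theta = np.arccos(dot)
    return (np.sin((1 - t) * theta) * q0
            + np.sin(t * theta) * q1) / np.sin(theta)

# Halfway between the identity and a 90-degree rotation about z.
q_id = np.array([1.0, 0.0, 0.0, 0.0])
q_z90 = np.array([np.cos(np.pi / 4), 0.0, 0.0, np.sin(np.pi / 4)])
print(slerp(q_id, q_z90, 0.5))   # ~45-degree rotation about z
```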
Mind the Gap! A Journey towards Computational Toxicology.
Mangiatordi, Giuseppe Felice; Alberga, Domenico; Altomare, Cosimo Damiano; Carotti, Angelo; Catto, Marco; Cellamare, Saverio; Gadaleta, Domenico; Lattanzi, Gianluca; Leonetti, Francesco; Pisani, Leonardo; Stefanachi, Angela; Trisciuzzi, Daniela; Nicolotti, Orazio
2016-09-01
Computational methods have advanced toxicology towards the development of target-specific models based on a clear cause-effect rationale. However, the predictive potential of these models presents both strengths and weaknesses. On the positive side, in silico models are valuable, cheap alternatives to in vitro and in vivo experiments. On the other hand, the uncritical use of in silico methods can mislead end-users with elusive results. The focus of this review is on the basic scientific and regulatory recommendations in the derivation and application of computational models. Attention is paid to the interplay between computational toxicology and drug discovery and development. Avoiding the easy temptation of an overoptimistic future, we report our view on what can, or cannot, realistically be done. Indeed, studies of safety/toxicity represent a key element of the chemical prioritization programs carried out by chemical industries, and primarily by pharmaceutical companies. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Strategies for the promotion of computer applications in radiology in healthcare delivery.
Reiner, B; Siegel, E; Allman, R
1998-08-01
The objective of this paper is to identify current trends in the development and implementation of computer applications in today's ever-changing healthcare environment. Marketing strategies are discussed with the goal of promoting computer applications in radiology as a means to advance future healthcare acceptance of technologic developments from the medical imaging field. With the rapid evolution of imaging and information technologies along with the transition to filmless imaging, radiologists must assume a proactive role in the development and application of these advancements. This expansion can be accomplished in a number of ways, including internet-based educational programs, research partnerships, and professional membership in societies such as the Society for Computer Applications in Radiology (SCAR). Professional societies such as SCAR, in turn, should reach out to include other professionals from the healthcare community, including the financial, administrative, and information systems disciplines, to promote these technologies in a cost-conscious and value-added manner.
Using Tutte polynomials to analyze the structure of the benzodiazepines
NASA Astrophysics Data System (ADS)
Cadavid Muñoz, Juan José
2014-05-01
Graph theory in general, and Tutte polynomials in particular, are used to analyze the chemical structure of the benzodiazepines. Similarity analyses based on the Tutte polynomials are used to find other molecules that are similar to the benzodiazepines and might therefore show similar psycho-active actions for medical purposes, in order to avoid the drawbacks associated with benzodiazepine-based medicines. For each type of benzodiazepine, Tutte polynomials are computed and some numeric characteristics are obtained, such as the number of spanning trees and the number of spanning forests. Computations are done using the GraphTheory package of the Maple computer algebra system. The analytical results obtained are of great importance in pharmaceutical engineering. As a future research line, the computational chemistry program Spartan will be used to extend the analysis and to compare its results with those obtained from the Tutte polynomials of the benzodiazepines.
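One of the numeric characteristics mentioned above, the number of spanning trees, equals the Tutte polynomial evaluated at (1, 1). The sketch below computes the same count via Kirchhoff's matrix-tree theorem as a convenient cross-check; the ring graph is illustrative, not an actual benzodiazepine structure.

```python
# Minimal sketch: count spanning trees with the matrix-tree theorem. This
# count equals the Tutte polynomial T(G; 1, 1) for a connected graph G.
import numpy as np

def spanning_tree_count(edges, n):
    """Count spanning trees of a connected graph on vertices 0..n-1."""
    L = np.zeros((n, n))
    for u, v in edges:            # build the graph Laplacian D - A
        L[u, u] += 1
        L[v, v] += 1
        L[u, v] -= 1
        L[v, u] -= 1
    # Any cofactor of L equals the spanning-tree count (Kirchhoff).
    return round(np.linalg.det(L[1:, 1:]))

# A 6-cycle (benzene-like ring skeleton): removing any one edge leaves a
# spanning tree, so the count is 6.
ring = [(i, (i + 1) % 6) for i in range(6)]
print(spanning_tree_count(ring, 6))   # -> 6
```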
Direct coal liquefaction baseline design and system analysis. Quarterly report, January--March 1991
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1991-04-01
The primary objective of the study is to develop a computer model for a baseline direct coal liquefaction design based on two-stage direct-coupled catalytic reactors. This primary objective is to be accomplished by completing the following: a baseline design based on previous DOE/PETC results from the Wilsonville pilot plant and other engineering evaluations; a cost estimate and economic analysis; a computer model incorporating the above two steps over a wide range of capacities and selected process alternatives; a comprehensive training program for DOE/PETC staff to understand and use the computer model; thorough documentation of all underlying assumptions for the baseline economics; and a user manual and training material which will facilitate updating of the model in the future.
Interactive design and analysis of future large spacecraft concepts
NASA Technical Reports Server (NTRS)
Garrett, L. B.
1981-01-01
An interactive computer-aided design program used to perform systems-level design and analysis of large spacecraft concepts is presented. Emphasis is on rapid design and analysis of integrated spacecraft and on automatic spacecraft modeling for lattice structures. Capabilities and performance of the multidiscipline application modules, the executive and data management software, and the graphics display features are reviewed. A single user at an interactive terminal can create, design, analyze, and conduct parametric studies of Earth-orbiting spacecraft with relative ease. Data generated in the design, analysis, and performance evaluation of an Earth-orbiting large-diameter antenna satellite are used to illustrate current capabilities. Computer run-time statistics for the individual modules quantify the speed at which modeling, analysis, and design evaluation of integrated spacecraft concepts are accomplished in a user-interactive computing environment.
Structural Analysis and Design Software
NASA Technical Reports Server (NTRS)
1997-01-01
Collier Research and Development Corporation received a one-of-a-kind computer code for designing exotic hypersonic aircraft, called ST-SIZE, in the first-ever Langley Research Center software copyright license agreement. Collier transformed the NASA computer code into a commercial software package called HyperSizer, which integrates with private-sector finite element modeling and finite element analysis structural analysis programs. ST-SIZE was chiefly conceived as a means to improve and speed the structural design of a future aerospace plane for the Langley Hypersonic Vehicles Office. Incorporating the NASA computer code into HyperSizer has enabled the company to also apply the software to applications other than aerospace, including improved design and construction for offices, marine structures, cargo containers, commercial and military aircraft, rail cars, and a host of everyday consumer products.
Upgrade to the Cryogenic Hydrogen Gas Target Monitoring System
NASA Astrophysics Data System (ADS)
Slater, Michael; Tribble, Robert
2013-10-01
The cryogenic hydrogen gas target at Texas A&M is a vital component for creating a secondary radioactive beam that is then used in experiments in the Momentum Achromat Recoil Spectrometer (MARS). A stable beam from the K500 superconducting cyclotron enters the gas cell, and some incident particles are transmuted by a nuclear reaction into a radioactive beam, which is separated from the primary beam and used in MARS experiments. The pressure in the target chamber is monitored so that a predictable isotope production rate can be assured. Previously, a "black box" received the analog pressure data and sent RS232 serial data through an outdated serial connection to an outdated Visual Basic 6 (VB6) program, which plotted the chamber pressure continuously. The black box has been upgraded to an Arduino UNO microcontroller [Atmel Inc.], which receives the pressure data and outputs it via USB to a computer. It has also been programmed to accept temperature data for a future upgrade. A new computer program, with updated capabilities, has been written in Python. The software can send email alerts, create audible alarms through the Arduino, and plot pressure and temperature. The program has been designed to better fit the needs of the users. Funded by DOE and the NSF-REU Program.
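A minimal sketch of the monitoring loop described above: pressure samples are read from the Arduino over USB serial, and an email alert fires when the reading leaves a safe band. The port name, line format, thresholds, and addresses are all assumptions, not the actual MARS configuration.

```python
# Minimal sketch of a serial pressure monitor with email alerts, assuming
# the Arduino prints one numeric reading per line at 9600 baud.
import smtplib
from email.message import EmailMessage

import serial  # pyserial

P_MIN, P_MAX = 0.5, 2.0   # illustrative safe band

def send_alert(pressure):
    msg = EmailMessage()
    msg["Subject"] = f"Gas target pressure alarm: {pressure:.3f}"
    msg["From"] = "target-monitor@example.edu"    # hypothetical addresses
    msg["To"] = "oncall@example.edu"
    msg.set_content("Cryogenic target pressure out of range.")
    with smtplib.SMTP("localhost") as s:
        s.send_message(msg)

with serial.Serial("/dev/ttyACM0", 9600, timeout=5) as port:
    while True:
        line = port.readline().decode("ascii", errors="replace").strip()
        if not line:
            continue                  # read timed out; keep waiting
        try:
            pressure = float(line)    # assumed: one reading per line
        except ValueError:
            continue                  # skip malformed lines
        if not (P_MIN <= pressure <= P_MAX):
            send_alert(pressure)
```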
Enabling Arctic Research Through Science and Engineering Partnerships
NASA Astrophysics Data System (ADS)
Kendall, E. A.; Valentic, T. A.; Stehle, R. H.
2014-12-01
Under an Arctic Research Support and Logistics contract from NSF (GEO/PLR), SRI International, as part of the CH2M HILL Polar Services (CPS) program, forms partnerships with Arctic research teams to provide data transfer, remote operations, and safety/operations communications. This teamwork is integral to the success of real-time science and often allows for unmanned operations, which are both more cost-effective and safer. The CPS program utilizes a variety of communications networks, services, and technologies to support researchers and instruments throughout the Arctic, including Iridium, VSAT, Inmarsat BGAN, HughesNet, TeleGreenland, radios, and personal locator beacons. Program-wide IT and communications limitations fall into the broad categories of bandwidth, availability, and power. At these sites it is essential to conserve bandwidth and power through efficient software, coding, and scheduling techniques. There are interesting new products and services on the horizon that the program may be able to take advantage of in the future, such as Iridium NEXT, Inmarsat Xpress, and Omnispace mobile satellite services. Additionally, there are engineering and computer software opportunities to develop more efficient products. We will present an overview of science/engineering partnerships formed by the CPS program, discuss current limitations, and identify future technological possibilities that could further advance Arctic science goals.
NASA Technical Reports Server (NTRS)
Leiner, Barry M.; Gross, Anthony R. (Technical Monitor)
2002-01-01
The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. Operated by the Universities Space Research Association (a non-profit university consortium), RIACS is located at the NASA Ames Research Center, Moffett Field, California. It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in September 2003. Ames has been designated NASA's Center of Excellence in Information Technology. In this capacity, Ames is charged with the responsibility to build an Information Technology (IT) Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA Ames and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of IT research necessary to meet the future challenges of NASA missions: 1) Automated Reasoning for Autonomous Systems; 2) Human-Centered Computing; and 3) High Performance Computing and Networking. In addition, RIACS collaborates with NASA scientists to apply IT research to a variety of NASA application domains including aerospace technology, earth science, life sciences, and astrobiology. RIACS also engages in other activities, such as workshops, seminars, visiting scientist programs and student summer programs, designed to encourage and facilitate collaboration between the university and NASA IT research communities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clark, Haley; BC Cancer Agency, Surrey, B.C.; BC Cancer Agency, Vancouver, B.C.
2014-08-15
Many have speculated about the future of computational technology in clinical radiation oncology. It has been advocated that the next generation of computational infrastructure will improve on the current one by incorporating richer aspects of automation, more heavily and seamlessly featuring distributed and parallel computation, and providing more flexibility toward aggregate data analysis. In this report we describe how a recently created, and currently existing, analysis framework (DICOMautomaton) incorporates these aspects. DICOMautomaton supports a variety of use cases but is especially suited for dosimetric outcomes correlation analysis, investigation and comparison of radiotherapy treatment efficacy, and dose-volume computation. We describe how it overcomes computational bottlenecks by distributing workload across a network of machines; how modern, asynchronous computational techniques are used to reduce blocking and avoid unnecessary computation; and how issues of out-of-date data are addressed using reactive programming techniques and data dependency chains. We describe the internal architecture of the software and give a detailed demonstration of how DICOMautomaton could be used to search for correlations between dosimetric and outcomes data.
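The out-of-date-data handling described above can be illustrated with a small dependency-chain sketch: each derived quantity records its dependencies, edits mark dependents stale, and a value is recomputed only when it is both requested and stale. This mirrors the reactive approach in spirit only; it is not DICOMautomaton's actual implementation, and the toy chain is invented.

```python
# Minimal sketch of reactive, dependency-chained recomputation with lazy
# invalidation. Node names and the computation chain are illustrative.
class Node:
    def __init__(self, compute, deps=()):
        self.compute, self.deps = compute, list(deps)
        self.dependents, self.value, self.dirty = [], None, True
        for d in self.deps:
            d.dependents.append(self)

    def invalidate(self):
        # Mark this node and everything downstream as stale.
        if not self.dirty:
            self.dirty = True
            for d in self.dependents:
                d.invalidate()

    def get(self):
        # Lazy recompute: only stale nodes on the requested chain do work.
        if self.dirty:
            self.value = self.compute(*(d.get() for d in self.deps))
            self.dirty = False
        return self.value

# Toy chain: contour data -> dose-volume histogram -> summary metric.
contour = Node(lambda: [1.0, 2.0, 3.0])
dvh = Node(lambda c: sorted(c, reverse=True), deps=[contour])
metric = Node(lambda h: h[0], deps=[dvh])
print(metric.get())            # computes the whole chain -> 3.0
contour.compute = lambda: [5.0, 2.0]
contour.invalidate()           # upstream edit marks downstream stale
print(metric.get())            # recomputes only what is needed -> 5.0
```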
Development and flight test of a deployable precision landing system
NASA Technical Reports Server (NTRS)
Sim, Alex G.; Murray, James E.; Neufeld, David C.; Reed, R. Dale
1994-01-01
A joint NASA Dryden Flight Research Facility and Johnson Space Center program was conducted to determine the feasibility of the autonomous recovery of a spacecraft using a ram-air parafoil system for the final stages of entry from space that included a precision landing. The feasibility of this system was studied using a flight model of a spacecraft in the generic shape of a flattened biconic that weighed approximately 150 lb and was flown under a commercially available, ram-air parachute. Key elements of the vehicle included the Global Positioning System guidance for navigation, flight control computer, ultrasonic sensing for terminal altitude, electronic compass, and onboard data recording. A flight test program was used to develop and refine the vehicle. This vehicle completed an autonomous flight from an altitude of 10,000 ft and a lateral offset of 1.7 miles that resulted in a precision flare and landing into the wind at a predetermined location. At times, the autonomous flight was conducted in the presence of winds approximately equal to vehicle airspeed. Several novel techniques for computing the winds postflight were evaluated. Future program objectives are also presented.
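One straightforward member of the family of postflight wind-estimation techniques mentioned above subtracts the air-relative velocity (from airspeed and heading) from the GPS ground velocity. The sketch below shows that calculation with illustrative inputs; the flight program evaluated several richer methods than this.

```python
# Minimal sketch: horizontal wind = ground velocity - air-relative velocity,
# in east/north components. Inputs are illustrative.
import numpy as np

def wind_estimate(ground_vel_en, airspeed, heading_deg):
    """Return the (east, north) wind vector."""
    h = np.radians(heading_deg)
    air_vel_en = airspeed * np.array([np.sin(h), np.cos(h)])
    return ground_vel_en - air_vel_en

# 30 ft/s airspeed heading due north, while GPS shows drift to the east.
print(wind_estimate(np.array([8.0, 29.0]), 30.0, 0.0))  # ~[8, -1] ft/s
```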
Preliminary design of a high speed civil transport: The Opus 0-001
NASA Technical Reports Server (NTRS)
1992-01-01
Based on research into the technology and issues surrounding the design, development, and operation of a second-generation High Speed Civil Transport (HSCT), the Opus 0-001 team completed the preliminary design of a sixty-passenger, three-engine aircraft. The design of this aircraft was performed using a computer program which the team wrote. This program automatically computed the geometric, aerodynamic, and performance characteristics of an aircraft whose preliminary geometry was specified. The Opus 0-001 aircraft was designed for a cruise Mach number of 2.2 and a range of 4,700 nautical miles, and its design was based on current or very-near-term technology. Its small size was a consequence of an emphasis on a profitable, low-cost program capable of delivering tomorrow's passengers in style and comfort at prices that make it an attractive competitor to both current and future subsonic transport aircraft. Several hundred thousand cases of cruise Mach number, aircraft size, and cost breakdown were investigated to obtain the costs and revenues from which profit was calculated. The projected unit flyaway cost was $92.0 million per aircraft.
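The profit sweep described above can be sketched as a grid search over cruise Mach number and seat count, with placeholder cost and revenue models standing in for the Opus team's actual relations.

```python
# Minimal sketch of a design-space sweep for the most profitable case.
# All model functions and coefficients below are invented placeholders.
import itertools

def flyaway_cost_musd(mach, seats):
    return 40.0 + 18.0 * (mach - 1.0) ** 2 + 0.45 * seats   # placeholder

def annual_profit_musd(mach, seats):
    revenue = 0.9 * seats * (1.0 + 0.15 * mach)             # placeholder
    operating = 0.5 * seats + 6.0 * mach
    capital = 0.08 * flyaway_cost_musd(mach, seats)
    return revenue - operating - capital

machs = [1.6 + 0.05 * i for i in range(25)]     # Mach 1.60 .. 2.80
seat_counts = range(40, 301, 10)
best = max(itertools.product(machs, seat_counts),
           key=lambda case: annual_profit_musd(*case))
print(f"best case: Mach {best[0]:.2f}, {best[1]} seats, "
      f"profit {annual_profit_musd(*best):.1f} M$/yr")
```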
Configuration Aerodynamics: Past - Present - Future
NASA Technical Reports Server (NTRS)
Wood, Richard M.; Agrawal, Shreekant; Bencze, Daniel P.; Kulfan, Robert M.; Wilson, Douglas L.
1999-01-01
The Configuration Aerodynamics (CA) element of the High Speed Research (HSR) program is managed by a joint NASA and industry team, referred to as the Integrated Technology Development (ITD) team. This team is responsible for the development of a broad range of technologies for improved aerodynamic performance and stability and control characteristics at subsonic to supersonic flight conditions. These objectives are pursued through the aggressive use of advanced experimental test techniques and state-of-the-art computational methods. As the HSR program matures and transitions into the next phase, the objectives of the Configuration Aerodynamics ITD are being refined to address the drag-reduction needs and stability and control requirements of High Speed Civil Transport (HSCT) aircraft. In addition, the experimental and computational tools are being refined and improved to meet these challenges. The presentation will review the work performed within the Configuration Aerodynamics element in 1994 and 1995 and then discuss the plans for the 1996-1998 time period. The final portion of the presentation will review several observations of the HSR program and the design activity within Configuration Aerodynamics.
Avoiding and tolerating latency in large-scale next-generation shared-memory multiprocessors
NASA Technical Reports Server (NTRS)
Probst, David K.
1993-01-01
A scalable solution to the memory-latency problem is necessary to prevent the large latencies of synchronization and memory operations inherent in large-scale shared-memory multiprocessors from limiting performance. We distinguish latency avoidance and latency tolerance. Latency is avoided when data is brought to nearby locales for future reference. Latency is tolerated when references are overlapped with other computation. Latency-avoiding locales include processor registers, data caches used temporally, and nearby memory modules. Tolerating communication latency requires parallelism, allowing the overlap of communication and computation. Latency-tolerating techniques include vector pipelining, data caches used spatially, prefetching in various forms, and multithreading in various forms. Relaxing the consistency model permits increased use of avoidance and tolerance techniques. Each model is a mapping from the program text to sets of partial orders on program operations; it is a convention about which temporal precedences among program operations are necessary. Information about temporal locality and parallelism constrains the use of avoidance and tolerance techniques. Suitable architectural primitives and compiler technology are required to exploit the increased freedom to reorder and overlap operations in relaxed models.
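The overlap of communication and computation described above can be sketched as software double buffering: a prefetch thread fetches block i+1 while the main thread processes block i. The sleep calls below stand in for remote-memory latency and useful work; the timings are illustrative.

```python
# Minimal sketch of latency tolerance via prefetching (double buffering).
import threading
import time

def fetch(i):
    time.sleep(0.05)              # stand-in for remote-memory latency
    return list(range(i * 4, i * 4 + 4))

def process(block):
    time.sleep(0.05)              # stand-in for useful computation
    return sum(block)

def run(nblocks):
    total = 0
    buf = {"next": fetch(0)}      # prime the pipeline with block 0
    for i in range(nblocks):
        current = buf["next"]
        t = None
        if i + 1 < nblocks:       # tolerate latency: prefetch block i+1 ...
            t = threading.Thread(
                target=lambda j=i + 1: buf.update(next=fetch(j)))
            t.start()
        total += process(current)  # ... while computing on block i
        if t is not None:
            t.join()
    return total

start = time.perf_counter()
print(run(8), f"elapsed {time.perf_counter() - start:.2f}s")
# With overlap this takes ~0.45 s; fetch-then-process in series takes ~0.8 s.
```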
Navigating the changing learning landscape: perspective from bioinformatics.ca
Ouellette, B. F. Francis
2013-01-01
With the advent of YouTube channels in bioinformatics, open platforms for problem solving in bioinformatics, active web forums in computing analyses and online resources for learning to code or use a bioinformatics tool, the more traditional continuing education bioinformatics training programs have had to adapt. Bioinformatics training programs that solely rely on traditional didactic methods are being superseded by these newer resources. Yet such face-to-face instruction is still invaluable in the learning continuum. Bioinformatics.ca, which hosts the Canadian Bioinformatics Workshops, has blended more traditional learning styles with current online and social learning styles. Here we share our growing experiences over the past 12 years and look toward what the future holds for bioinformatics training programs. PMID:23515468
ORBIT: an integrated environment for user-customized bioinformatics tools.
Bellgard, M I; Hiew, H L; Hunter, A; Wiebrands, M
1999-10-01
There are a large number of computational programs freely available to bioinformaticians via a client/server, web-based environment. However, the client interface to these tools (typically an html form page) cannot be customized from the client side as it is created by the service provider. The form page is usually generic enough to cater for a wide range of users. However, this implies that a user cannot set as 'default' advanced program parameters on the form or even customize the interface to his/her specific requirements or preferences. Currently, there is a lack of end-user interface environments that can be modified by the user when accessing computer programs available on a remote server running on an intranet or over the Internet. We have implemented a client/server system called ORBIT (Online Researcher's Bioinformatics Interface Tools) where individual clients can have interfaces created and customized to command-line-driven, server-side programs. Thus, Internet-based interfaces can be tailored to a user's specific bioinformatic needs. As interfaces are created on the client machine independent of the server, there can be different interfaces to the same server-side program to cater for different parameter settings. The interface customization is relatively quick (between 10 and 60 min) and all client interfaces are integrated into a single modular environment which will run on any computer platform supporting Java. The system has been developed to allow for a number of future enhancements and features. ORBIT represents an important advance in the way researchers gain access to bioinformatics tools on the Internet.
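The client-side customization idea can be sketched as a small parameter specification, held and edited on the client, from which a command line for the server-side tool is generated. The tool name, flags, and defaults below are illustrative, not ORBIT's actual format.

```python
# Minimal sketch: render a client-held parameter spec into a command line
# for a server-side tool. Spec contents are hypothetical.
import shlex

SPEC = {
    "tool": "blastall",
    "params": [
        {"flag": "-p", "name": "program", "default": "blastn"},
        {"flag": "-e", "name": "evalue", "default": "1e-5"},  # user default
        {"flag": "-i", "name": "input", "default": None},     # required
    ],
}

def build_command(spec, **overrides):
    parts = [spec["tool"]]
    for p in spec["params"]:
        value = overrides.get(p["name"], p["default"])
        if value is None:
            raise ValueError(f"missing required parameter: {p['name']}")
        parts += [p["flag"], str(value)]
    return shlex.join(parts)

print(build_command(SPEC, input="query.fa"))
# -> blastall -p blastn -e 1e-5 -i query.fa
```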
Vogl, Laura Elise; Newton, Nicola Clare; Champion, Katrina Elizabeth; Teesson, Maree
2014-06-18
Background: Psychostimulants and cannabis are two of the three most commonly used illicit drugs by young Australians. As such, it is important to deliver prevention for these substances to prevent their misuse and to reduce associated harms. The present study aims to evaluate the feasibility and effectiveness of the universal computer-based Climate Schools: Psychostimulant and Cannabis Module. Methods: A cluster randomised controlled trial was conducted with 1734 Year 10 students (mean age = 15.44 years; SD = 0.41) from 21 secondary schools in Australia. Schools were randomised to receive either the six-lesson computer-based Climate Schools program or their usual health classes, including drug education, over the year. Results: The Climate Schools program was shown to increase knowledge of cannabis and psychostimulants and decrease pro-drug attitudes. In the short term the program was effective in subduing the uptake and plateauing the frequency of ecstasy use; however, there were no changes in meth/amphetamine use. In addition, females who received the program used cannabis significantly less frequently than students who received drug education as usual. Finally, the Climate Schools program was related to decreasing students' intentions to use meth/amphetamine and ecstasy in the future; however, these effects did not last over time. Conclusions: These findings provide support for the use of a harm-minimisation approach and computer technology as an innovative platform for the delivery of prevention education for illicit drugs in schools. The current study indicated that teachers and students enjoyed the program and that it is feasible to extend the successful Climate Schools model to the prevention of other drugs, namely cannabis and psychostimulants. Trial registration: Australian and New Zealand Clinical Trials Registry ACTRN12613000492752. PMID:24943829
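For readers unfamiliar with the effect measures used in trials like the one above, the sketch below computes an odds ratio and a Wald 95% confidence interval from a 2x2 table. The counts are invented; the trial's actual analysis was multilevel, which this does not reproduce.

```python
# Minimal sketch: odds ratio and Wald 95% CI from a 2x2 table.
# Counts are illustrative, not from the trial.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a, b = outcome yes/no in group 1; c, d = outcome yes/no in group 2."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

print("OR = %.2f (95%% CI %.2f-%.2f)" % odds_ratio_ci(12, 388, 18, 382))
```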
Experimental comparison of two quantum computing architectures
Linke, Norbert M.; Maslov, Dmitri; Roetteler, Martin; Debnath, Shantanu; Figgatt, Caroline; Landsman, Kevin A.; Wright, Kenneth; Monroe, Christopher
2017-01-01
We run a selection of algorithms on two state-of-the-art 5-qubit quantum computers that are based on different technology platforms. One is a publicly accessible superconducting transmon device (www.research.ibm.com/ibm-q) with limited connectivity, and the other is a fully connected trapped-ion system. Even though the two systems have different native quantum interactions, both can be programmed in a way that is blind to the underlying hardware, thus allowing a comparison of identical quantum algorithms between different physical systems. We show that quantum algorithms and circuits that use more connectivity clearly benefit from a better-connected system of qubits. Although the quantum systems here are not yet large enough to eclipse classical computers, this experiment exposes critical factors of scaling quantum computers, such as qubit connectivity and gate expressivity. In addition, the results suggest that codesigning particular quantum applications with the hardware itself will be paramount in successfully using quantum computers in the future. PMID:28325879
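The hardware-blind programming idea can be sketched with a circuit written once and then routed onto a device with restricted connectivity, which inserts SWAPs and grows the two-qubit gate count. The sketch below uses Qiskit as one concrete toolkit; exact post-routing gate counts vary by version and seed, and neither device from the study is targeted specifically.

```python
# Minimal sketch: the same abstract circuit, natural on a fully connected
# device, routed onto a linear-connectivity device.
from qiskit import QuantumCircuit, transpile
from qiskit.transpiler import CouplingMap

# A 5-qubit GHZ-style circuit with CNOTs from qubit 0 to every other qubit.
qc = QuantumCircuit(5)
qc.h(0)
for target in range(1, 5):
    qc.cx(0, target)
qc.measure_all()

print("abstract two-qubit gates:", qc.count_ops().get("cx"))

# Map onto a device that only couples nearest neighbors on a line; the
# router must insert SWAPs, growing the two-qubit gate count.
line = CouplingMap.from_line(5)
mapped = transpile(qc, coupling_map=line, optimization_level=1)
print("after routing to a line:", mapped.count_ops())
```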
A survey of GPU-based medical image computing techniques
Shi, Lin; Liu, Wen; Zhang, Heye; Xie, Yongming
2012-01-01
Medical imaging currently plays a crucial role throughout the entire spectrum of clinical applications, from medical scientific research to diagnostics and treatment planning. However, medical imaging procedures are often computationally demanding due to the large three-dimensional (3D) medical datasets processed in practical clinical applications. With the rapidly improving performance of graphics processors, better programming support, and an excellent price-to-performance ratio, the graphics processing unit (GPU) has emerged as a competitive parallel computing platform for computationally expensive and demanding tasks in a wide range of medical image applications. The major purpose of this survey is to provide a comprehensive reference source for newcomers and researchers involved in GPU-based medical image processing. Within this survey, the continuous advancement of GPU computing is reviewed and the existing traditional applications in three areas of medical image processing, namely segmentation, registration, and visualization, are surveyed. The potential advantages and associated challenges of current GPU-based medical imaging are also discussed to inspire future applications in medicine. PMID:23256080
Genetic Toxicology in the 21st Century: Reflections and Future Directions
Mahadevan, Brinda; Snyder, Ronald D.; Waters, Michael D.; Benz, R. Daniel; Kemper, Raymond A.; Tice, Raymond R.; Richard, Ann M.
2011-01-01
A symposium at the 40th anniversary of the Environmental Mutagen Society, held from October 24–28, 2009 in St. Louis, MO, surveyed the current status and future directions of genetic toxicology. This article summarizes the presentations and provides a perspective on the future. An abbreviated history is presented, highlighting the current standard battery of genotoxicity assays and persistent challenges. Application of computational toxicology to safety testing within a regulatory setting is discussed as a means for reducing the need for animal testing and human clinical trials, and current approaches and applications of in silico genotoxicity screening approaches across the pharmaceutical industry were surveyed and are reported here. The expanded use of toxicogenomics to illuminate mechanisms and bridge genotoxicity and carcinogenicity, and new public efforts to use high-throughput screening technologies to address lack of toxicity evaluation for the backlog of thousands of industrial chemicals in the environment are detailed. The Tox21 project involves coordinated efforts of four U.S. Government regulatory/research entities to use new and innovative assays to characterize key steps in toxicity pathways, including genotoxic and nongenotoxic mechanisms for carcinogenesis. Progress to date, highlighting preliminary test results from the National Toxicology Program is summarized. Finally, an overview is presented of ToxCast™, a related research program of the U.S. Environmental Protection Agency, using a broad array of high throughput and high content technologies for toxicity profiling of environmental chemicals, and computational toxicology modeling. Progress and challenges, including the pressing need to incorporate metabolic activation capability, are summarized. PMID:21538556
Cremers, Henricus-Paul; Mercken, Liesbeth; Candel, Math; de Vries, Hein; Oenema, Anke
2015-03-09
Smoking prevalence rates among Dutch children increase rapidly after they transit to secondary school, in particular among children with a low socioeconomic status (SES). Web-based, computer-tailored programs supplemented with prompt messages may be able to empower children to keep them from starting to smoke when they transit to secondary school. The main aim of this study is to evaluate whether computer-tailored feedback messages, with and without prompt messages, are effective in decreasing children's smoking intentions and smoking behavior after 12 and 25 months of follow-up. Data were gathered at baseline (T0), and after 12 months (T1) and 25 months (T2) of follow-up of a smoking prevention intervention program called Fun without Smokes. A total of 162 schools were randomly allocated to a no-intervention control group, an intervention prompt group, or an intervention no-prompt group. A total of 3213 children aged 10 to 12 years old participated in the study and completed a Web-based questionnaire assessing their smoking intention, smoking behavior, and sociocognitive factors related to smoking, such as attitude, social influence, and self-efficacy. After completion, children in the intervention groups received computer-tailored feedback messages in their own email inbox, and those messages could be accessed on the intervention website. Children in the prompt group received prompt messages, via email and short message service (SMS) text messaging, to stimulate them to reuse the intervention website with nonsmoking content. Multilevel logistic regression analyses were performed using multiple imputation to assess the program effects on smoking intention and smoking behavior at T1 and T2. A total of 3213 children participated in the Fun without Smokes study at T0. Between T0 and T1, 1067 of the original 3213 children (33.21%) dropped out of the study. Between T0 and T2, the number of children that did not participate in the final measurement was 1730 of the original 3213 (53.84%). No significant program effects were observed for any of the intervention groups compared to the control group at T1, either for the intention to engage in smoking (prompt: OR 0.67, 95% CI 0.30-1.50; no-prompt: OR 0.76, 95% CI 0.34-1.67) or for smoking behavior (prompt: OR 1.13, 95% CI 0.13-9.98; no-prompt: OR 0.50, 95% CI 0.04-5.59). Similar nonsignificant program effects were found at T2 for the intention to start smoking (prompt: OR 0.78, 95% CI 0.26-2.32; no-prompt: OR 1.31, 95% CI 0.45-3.82) and for smoking behavior (prompt: OR 0.53, 95% CI 0.12-2.47; no-prompt: OR 1.01, 95% CI 0.24-4.21). This study showed that the Web-based, computer-tailored feedback messages, with and without prompt messages, were not effective in modifying children's smoking intentions and smoking behavior as compared to no information. Future smoking prevention interventions are recommended to start closer to the age of actual smoking uptake. Furthermore, future studies on Web-based, computer-tailored smoking prevention programs should focus on assessing and controlling exposure to the educational content and the response to the prompt messages. Netherlands Trial Register NTR3116; http://www.trialregister.nl/trialreg/admin/rctview.asp?TC=3116 (Archived by WebCite at http://www.webcitation.org/6O0wQYuPI).
Simulations of Observations with the Far-Infrared Surveyor: Design Overview and Current Status
NASA Astrophysics Data System (ADS)
Jeong, W.; Pak, S.; Lee, H. M.; Kim, S.; Matsuura, M.; Nakagawa, T.; Yamamura, I.; Murakami, H.; Matsuura, S.; Kawada, M.; Kaneda, H.; Shibai, H.
2000-12-01
The Far-Infrared Surveyor (FIS) is one of the on-board instruments on the ASTRO-F satellite, which will be launched in early 2004. The first half year of its 500-day mission period is dedicated to an all-sky survey in four bands between 50 and 200 microns. On the basis of the present hardware specifications and configurations of the FIS, we have written a computer program to simulate the instrument. The program can be used to evaluate the performance of the instrument as well as to produce input for the data reduction system. In this paper, we describe the current status of the program. As an example of its usage, we present the expected observing data for three different detector sampling rates. Functions to be implemented in the program in the future are also enumerated.
Telescience testbed pilot program, volume 2: Program results
NASA Technical Reports Server (NTRS)
Leiner, Barry M.
1989-01-01
Space Station Freedom and its associated labs, coupled with the availability of new computing and communications technologies, have the potential for significantly enhancing scientific research. A Telescience Testbed Pilot Program (TTPP) was conducted, aimed at developing the experience base to deal with issues in the design of the future information system of the Space Station era. The testbeds represented four scientific disciplines (astronomy and astrophysics, earth sciences, life sciences, and microgravity sciences) and studied issues in payload design, operation, and data analysis. This volume, part of a three-volume set containing the results of the TTPP, presents the integrated results. Background on the program and highlights of the program results are provided. The various testbed experiments and the programmatic approach are summarized. The results are summarized on a discipline-by-discipline basis, highlighting the lessons learned for each discipline. The results are then integrated across disciplines, summarizing the lessons learned overall.
Modeling and Analysis of Power Processing Systems (MAPPS), initial phase 2
NASA Technical Reports Server (NTRS)
Yu, Y.; Lee, F. C.; Wangenheim, H.; Warren, D.
1977-01-01
The overall objective of the program is to provide the engineering tools to reduce the analysis, design, and development effort, and thus the cost, in achieving the required performance of switching regulators and dc-dc converter systems. The program was both tutorial and application oriented. Various analytical methods were described in detail and supplemented with examples, and those with standardization appeal were reduced to computer-based subprograms. Major program efforts included small- and large-signal control-dependent performance analysis and simulation, control circuit design, power circuit design and optimization, system configuration study, and system performance simulation. Techniques including the discrete time domain, the conventional frequency domain, Lagrange multipliers, nonlinear programming, and control design synthesis were employed in these efforts. To enhance interactive conversation between the modeling and analysis subprograms and the user, a working prototype of the Data Management Program was also developed to facilitate expansion as future subprogram capabilities increase.
Propeller aircraft interior noise model utilization study and validation
NASA Technical Reports Server (NTRS)
Pope, L. D.
1984-01-01
Utilization and validation of a computer program designed for aircraft interior noise prediction is considered. The program, entitled PAIN (an acronym for Propeller Aircraft Interior Noise), permits (in theory) predictions of sound levels inside propeller driven aircraft arising from sidewall transmission. The objective of the work reported was to determine the practicality of making predictions for various airplanes and the extent of the program's capabilities. The ultimate purpose was to discern the quality of predictions for tonal levels inside an aircraft occurring at the propeller blade passage frequency and its harmonics. The effort involved three tasks: (1) program validation through comparisons of predictions with scale-model test results; (2) development of utilization schemes for large (full scale) fuselages; and (3) validation through comparisons of predictions with measurements taken in flight tests on a turboprop aircraft. Findings should enable future users of the program to efficiently undertake and correctly interpret predictions.
Xinyinqin: a computer-based heart sound simulator.
Zhan, X X; Pei, J H; Xiao, Y H
1995-01-01
"Xinyinqin" is the Chinese phoneticized name of the Heart Sound Simulator (HSS). The "qin" in "Xinyinqin" is the Chinese name of a category of musical instruments, which means that the operation of HSS is very convenient--like playing an electric piano with the keys. HSS is connected to the GAME I/O of an Apple microcomputer. The generation of sound is controlled by a program. Xinyinqin is used as a teaching aid of Diagnostics. It has been applied in teaching for three years. In this demonstration we will introduce the following functions of HSS: 1) The main program has two modules. The first one is the heart auscultation training module. HSS can output a heart sound selected by the student. Another program module is used to test the student's learning condition. The computer can randomly simulate a certain heart sound and ask the student to name it. The computer gives the student's answer an assessment: "correct" or "incorrect." When the answer is incorrect, the computer will output that heart sound again for the student to listen to; this process is repeated until she correctly identifies it. 2) The program is convenient to use and easy to control. By pressing the S key, it is able to output a slow heart rate until the student can clearly identify the rhythm. The heart rate, like the actual rate of a patient, can then be restored by hitting any key. By pressing the SPACE BAR, the heart sound output can be stopped to allow the teacher to explain something to the student. The teacher can resume playing the heart sound again by hitting any key; she can also change the content of the training by hitting RETURN key. In the future, we plan to simulate more heart sounds and incorporate relevant graphs.
Scalable computing for evolutionary genomics.
Prins, Pjotr; Belhachemi, Dominique; Möller, Steffen; Smant, Geert
2012-01-01
Genomic data analysis in evolutionary biology is becoming so computationally intensive that analysis of multiple hypotheses and scenarios takes too long on a single desktop computer. In this chapter, we discuss techniques for scaling computations through parallelization of calculations, after giving a quick overview of advanced programming techniques. Unfortunately, parallel programming is difficult and requires special software design. The alternative, especially attractive for legacy software, is to introduce poor man's parallelization by running whole programs in parallel as separate processes, using job schedulers. Such pipelines are often deployed on bioinformatics computer clusters. Recent advances in PC virtualization have made it possible to run a full computer operating system, with all of its installed software, on top of another operating system, inside a "box," or virtual machine (VM). Such a VM can flexibly be deployed on multiple computers in a local network, e.g., on existing desktop PCs, and even in the Cloud, to create a "virtual" computer cluster. Many bioinformatics applications in evolutionary biology can be run in parallel, running processes in one or more VMs. Here, we show how a ready-made bioinformatics VM image, named BioNode, effectively creates a computing cluster and pipeline in a few steps. This allows researchers to scale up computations from their desktop, using available hardware, anytime it is required. BioNode is based on Debian Linux and can run on networked PCs and in the Cloud. Over 200 bioinformatics and statistical software packages of interest to evolutionary biology are included, such as PAML, Muscle, MAFFT, MrBayes, and BLAST. Most of these software packages are maintained through the Debian Med project. In addition, BioNode contains convenient configuration scripts for parallelizing bioinformatics software. Where Debian Med encourages packaging free and open source bioinformatics software through one central project, BioNode encourages creating free and open source VM images, for multiple targets, through one central project. BioNode can be deployed on Windows, OSX, Linux, and in the Cloud. Alongside the downloadable BioNode images, we provide tutorials online, which empower bioinformaticians to install and run BioNode in different environments, as well as information for future initiatives on creating and building such images.
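The "poor man's parallelization" described above needs no special software design: whole programs are simply launched as separate processes. A minimal sketch, assuming hypothetical per-chunk BLAST runs (the chunk file names and worker count are illustrative):

    import subprocess
    from concurrent.futures import ThreadPoolExecutor

    # One whole-program invocation per input chunk; no shared state between runs.
    commands = [
        ["blastn", "-query", f"chunk_{i}.fasta", "-db", "nt", "-out", f"chunk_{i}.out"]
        for i in range(8)
    ]

    def run(cmd):
        # Each process is independent, so failed chunks can be retried individually.
        return subprocess.run(cmd, check=True).returncode

    # Threads suffice here: the real work happens in the child processes.
    with ThreadPoolExecutor(max_workers=4) as pool:
        results = list(pool.map(run, commands))

    print("all chunks done:", results)

On a cluster, a job scheduler plays the role that ThreadPoolExecutor plays on a single machine.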
Perspectives on Emerging/Novel Computing Paradigms and Future Aerospace Workforce Environments
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.
2003-01-01
The accelerating pace of computing technology development shows no signs of abating. Computing power of 100 Tflop/s is likely to be reached by 2004 and 1 Pflop/s (10(exp 15) Flop/s) by 2007. The fundamental physical limits of computation, including information storage limits, communication limits, and computation rate limits, will likely be reached by the middle of the present millennium. To overcome these limits, novel technologies and new computing paradigms will be developed. This overview attempts to put the diverse activities related to new computing paradigms in perspective and to set the stage for the succeeding presentations. The presentation is divided into five parts. The first part gives a brief historical account of the development of computer and networking technologies. The second part provides brief overviews of the three emerging computing paradigms: grid, ubiquitous, and autonomic computing. The third part lists future computing alternatives and the characteristics of the future computing environment. The fourth part describes future aerospace workforce research, learning, and design environments. The fifth part lists the objectives of the workshop and some sources of information on future computing paradigms.
Levels and loops: the future of artificial intelligence and neuroscience.
Bell, A J
1999-01-01
In discussing artificial intelligence and neuroscience, I will focus on two themes. The first is the universality of cycles (or loops): sets of variables that affect each other in such a way that any feed-forward account of causality and control, while informative, is misleading. The second theme is based on the observation that a computer is an intrinsically dualistic entity, with its physical set-up designed so as not to interfere with its logical set-up, which executes the computation. The brain is different. When analysed empirically at several different levels (cellular, molecular), there appears to be no satisfactory way to separate a physical brain model (or algorithm, or representation) from a physical implementational substrate. When program and implementation are inseparable and thus interfere with each other, a dualistic point of view is impossible. Forced by empiricism into a monistic perspective, the brain-mind appears as neither embodied by nor embedded in physical reality, but rather as identical to physical reality. This perspective has implications for the future of science and society. I will approach these from a negative point of view, by critiquing some of our millennial culture's popular projected futures. PMID:10670021
Advances in Orion's On-Orbit Guidance and Targeting System Architecture
NASA Technical Reports Server (NTRS)
Scarritt, Sara K.; Fill, Thomas; Robinson, Shane
2015-01-01
NASA's manned spaceflight programs have a rich history of advancing onboard guidance and targeting technology. In order to support future missions, the guidance and targeting architecture for the Orion Multi-Purpose Crew Vehicle must be able to operate in complete autonomy, without any support from the ground. Orion's guidance and targeting system must be sufficiently flexible to easily adapt to a wide array of undecided future missions, yet also not cause an undue computational burden on the flight computer. This presents a unique design challenge from the perspective of both algorithm development and system architecture construction. The present work shows how Orion's guidance and targeting system addresses these challenges. On the algorithm side, the system advances the state-of-the-art by: (1) steering burns with a simple closed-loop guidance strategy based on Shuttle heritage, and (2) planning maneuvers with a cutting-edge two-level targeting routine. These algorithms are then placed into an architecture designed to leverage the advantages of each and ensure that they function in concert with one another. The resulting system is characterized by modularity and simplicity. As such, it is adaptable to the on-orbit phases of any future mission that Orion may attempt.
The Value of Biomedical Simulation Environments to Future Human Space Flight Missions
NASA Technical Reports Server (NTRS)
Mulugeta, Lealem; Myers, Jerry G.; Lewandowski, Beth; Platts, Steven H.
2011-01-01
Mars and NEO missions will expose astronauts to extended durations of reduced gravity, isolation, and higher radiation. These new operational conditions pose health risks that are not well understood and perhaps unanticipated. Advanced computational simulation environments can beneficially augment research to predict, assess, and mitigate potential hazards to astronaut health. The NASA Digital Astronaut Project (DAP), within the NASA Human Research Program, strives to achieve this goal.
2009 High Performance Computing Modernization Program Users Group Conference
2009-06-17
Only briefing fragments of this record survive. They list projected threat categories (asymmetric threats, a future peer competitor, the Global War on Terror (GWoT) and ungoverned areas, irregular warfare, low-end asymmetric conflict, the 1-4-2-1 state-to-state war construct, and disruptive technologies) together with one intact 2008 quotation: "As changes in this century's threat environment create strategic challenges – irregular warfare, weapons of mass destruction, disruptive technologies – this request places greater emphasis on basic research, which in recent years has not kept pace with other parts of the budget."
2007-03-01
Only fragments of this abstract survive. They indicate a study of intrusion detection (ID) combining Artificial Immune System (AIS) and Artificial Neural Network (ANN) techniques, observing that the problem domain is too large for any single algorithm: it ranges from network-based sniffer systems responsible for enterprise-wide coverage to host-based detectors. The stated objective is to give network administrators options in choosing detectors to employ in future ID applications; the closing fragment on hypothesis validity is truncated.
An inside look at NASA planetology
NASA Technical Reports Server (NTRS)
Dwornik, S. E.
1976-01-01
Staffing, financing and budget controls, and research grant allocations of NASA are reviewed with emphasis on NASA-supported research in planetary geological sciences: studies of the composition, structure, and history of solar system planets. Programs, techniques, and research grants for studies of Mars photographs acquired through Mariner 6-10 flights are discussed at length, and particularly the handling of computer-enhanced photographic data. Scheduled future NASA-sponsored planet exploration missions (to Mars, Jupiter, Saturn, Uranus) are mentioned.
An Investigation of Unified Memory Access Performance in CUDA
Landaverde, Raphael; Zhang, Tiansheng; Coskun, Ayse K.; Herbordt, Martin
2015-01-01
Managing memory between the CPU and GPU is a major challenge in GPU computing. Nvidia recently introduced a programming model, Unified Memory Access (UMA), to simplify the complexities of memory management while claiming good overall performance. In this paper, we investigate this programming model and evaluate, based on our experimental results, both its performance and the simplifications it brings to programming. We find that beyond on-demand data transfers to the CPU, the GPU is also able to request, on demand, subsets of the data it requires. This feature allows UMA to outperform full data transfer methods for certain parallel applications and small data sizes. We also find, however, that for the majority of applications and memory access patterns, the performance overheads associated with UMA are significant, while the simplifications to the programming model restrict flexibility for adding future optimizations. PMID:26594668
Selected methods for quantification of community exposure to aircraft noise
NASA Technical Reports Server (NTRS)
Edge, P. M., Jr.; Cawthorn, J. M.
1976-01-01
A review of the state of the art in quantifying community exposure to aircraft noise is presented. Physical aspects, considerations of people's response, and the practicalities of usefully applying scales of measure are included. Historical background up through the current technology is briefly presented. The development of both single-event and multiple-event scales is covered. Scales currently at the forefront of interest are selected, and recommended methodology is presented for use in computer programming to translate aircraft noise data into predictions of community noise exposure. Brief consideration is given to future programming developments and to supporting research needs.
NASA Technical Reports Server (NTRS)
Housner, J. M.; Anderson, M.; Belvin, W.; Horner, G.
1985-01-01
Dynamic analysis of large space antenna systems must treat the deployment as well as vibration and control of the deployed antenna. Candidate computer programs for deployment dynamics, and issues and needs for future program developments are reviewed. Some results for mast and hoop deployment are also presented. Modeling of complex antenna geometry with conventional finite element methods and with repetitive exact elements is considered. Analytical comparisons with experimental results for a 15 meter hoop/column antenna revealed the importance of accurate structural properties including nonlinear joints. Slackening of cables in this antenna is also a consideration. The technology of designing actively damped structures through analytical optimization is discussed and results are presented.
Parallel and Portable Monte Carlo Particle Transport
NASA Astrophysics Data System (ADS)
Lee, S. R.; Cummings, J. C.; Nolen, S. D.; Keen, N. D.
1997-08-01
We have developed a multi-group, Monte Carlo neutron transport code in C++ using object-oriented methods and the Parallel Object-Oriented Methods and Applications (POOMA) class library. This transport code, called MC++, currently computes k and α eigenvalues of the neutron transport equation on a rectilinear computational mesh. It is portable to and runs in parallel on a wide variety of platforms, including MPPs, clustered SMPs, and individual workstations. It contains appropriate classes and abstractions for particle transport and, through the use of POOMA, for portable parallelism. Current capabilities are discussed, along with physics and performance results for several test problems on a variety of hardware, including all three Accelerated Strategic Computing Initiative (ASCI) platforms. Current parallel performance indicates the ability to compute α-eigenvalues in seconds or minutes rather than days or weeks. Current and future work on the implementation of a general transport physics framework (TPF) is also described. This TPF employs modern C++ programming techniques to provide simplified user interfaces, generic STL-style programming, and compile-time performance optimization. Physics capabilities of the TPF will be extended to include continuous energy treatments, implicit Monte Carlo algorithms, and a variety of convergence acceleration techniques such as importance combing.
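For orientation, k-eigenvalue Monte Carlo amounts to a power iteration over fission generations. A deliberately simplified Python sketch follows (one-group, 1-D slab, every collision absorptive with a fixed fission probability, no scattering, no variance reduction; none of this reflects MC++'s actual physics or its POOMA-based parallelism, and all cross-section data are hypothetical):

    import math
    import random

    SIGMA_T, P_FISSION, NU, SLAB = 1.0, 0.4, 2.5, 4.0   # hypothetical one-group data

    def next_generation(sites):
        """Fly each source neutron one flight; return the new fission sites."""
        new_sites = []
        for x in sites:
            mu = random.uniform(-1.0, 1.0)                  # isotropic direction cosine
            x_new = x + mu * (-math.log(random.random()) / SIGMA_T)
            if 0.0 <= x_new <= SLAB and random.random() < P_FISSION:
                n = int(NU) + (random.random() < NU % 1.0)  # integerized multiplicity
                new_sites += [x_new] * n
        return new_sites

    sites = [random.uniform(0.0, SLAB) for _ in range(20000)]
    k_history = []
    for generation in range(60):
        new_sites = next_generation(sites)
        if not new_sites:
            break                                           # population died out
        k_history.append(len(new_sites) / len(sites))       # generation k estimate
        sites = random.choices(new_sites, k=20000)          # renormalize population

    print("k-effective ~", sum(k_history[10:]) / len(k_history[10:]))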
Global information infrastructure.
Lindberg, D A
1994-01-01
The High Performance Computing and Communications Program (HPCC) is a multiagency federal initiative under the leadership of the White House Office of Science and Technology Policy, established by the High Performance Computing Act of 1991. It has been assigned a critical role in supporting the international collaboration essential to science and to health care. Goals of the HPCC are to extend USA leadership in high performance computing and networking technologies; to improve technology transfer for economic competitiveness, education, and national security; and to provide a key part of the foundation for the National Information Infrastructure. The first component of the National Institutes of Health to participate in the HPCC, the National Library of Medicine (NLM), recently issued a solicitation for proposals to address a range of issues, from privacy to 'testbed' networks, 'virtual reality,' and more. These efforts will build upon the NLM's extensive outreach program and other initiatives, including the Unified Medical Language System (UMLS), MEDLARS, and Grateful Med. New Internet search tools are emerging, such as Gopher and 'Knowbots'. Medicine will succeed in developing future intelligent agents to assist in utilizing computer networks. Our ability to serve patients is so often restricted by lack of information and knowledge at the time and place of medical decision-making. The new technologies, properly employed, will also greatly enhance our ability to serve the patient.
NASA Technical Reports Server (NTRS)
Ramakrishnan, R.; Randall, D.; Hosier, R. N.
1976-01-01
The programming language used is FORTRAN IV. A description of all main programs and subprograms is provided so that any user possessing a FORTRAN compiler and random-access capability can adapt the program to his facility. Rotor blade surface-pressure spectra can be used by the program to calculate: (1) blade station loading spectra, (2) chordwise and/or spanwise integrated blade-loading spectra, and (3) far-field rotational noise spectra. Any of five standard inline functions describing the chordwise distribution of the blade loading can be chosen in order to study the acoustic predictions parametrically. The program output consists of both printed and graphic descriptions of the blade-loading coefficient spectra and the far-field acoustic spectrum. The results may also be written to a binary file for future processing. Examples of the application of the program, along with a description of the rotational noise prediction theory on which the program is based, are also provided.
Aeroelasticity Benchmark Assessment: Subsonic Fixed Wing Program
NASA Technical Reports Server (NTRS)
Florance, Jennifer P.; Chwalowski, Pawel; Wieseman, Carol D.
2010-01-01
The fundamental technical challenge in computational aeroelasticity is the accurate prediction of unsteady aerodynamic phenomena and their effect on the aeroelastic response of a vehicle. Currently, a benchmarking standard for use in validating the accuracy of computational aeroelasticity codes does not exist. Many aeroelastic data sets have been obtained in wind-tunnel and flight testing throughout the world; however, none have been globally presented or accepted as an ideal data set. There are numerous reasons for this. One reason is that such aeroelastic data sets often focus on the aeroelastic phenomena alone (flutter, for example) and do not contain associated information such as unsteady pressures and time-correlated structural dynamic deflections. Other available data sets focus solely on the unsteady pressures and do not address the aeroelastic phenomena. Other discrepancies can include omission of relevant data, such as flutter frequency, and/or the acquisition of only qualitative deflection data. In addition to these content deficiencies, all of the available data sets present both experimental and computational technical challenges. Experimental issues include facility influences, nonlinearities beyond those being modeled, and data processing. From the computational perspective, technical challenges include modeling geometric complexities, coupling between the flow and the structure, grid issues, and boundary conditions. The Aeroelasticity Benchmark Assessment task seeks to examine the existing potential experimental data sets and ultimately choose the one that is viewed as the most suitable for computational benchmarking. An initial computational evaluation of that configuration will then be performed using the Langley-developed computational fluid dynamics (CFD) software FUN3D as part of its code validation process. In addition to the benchmarking activity, this task also includes an examination of future research directions. Researchers within the Aeroelasticity Branch will examine other experimental efforts within the Subsonic Fixed Wing (SFW) program (such as testing of the NASA Common Research Model (CRM)) and other NASA programs and assess aeroelasticity issues and research topics.
NASA Astrophysics Data System (ADS)
Baytak, Ahmet
Among educational researchers and practitioners, there is a growing interest in employing computer games for pedagogical purposes. The present research integrated a technology education class and a science class in which 5th graders learned about environmental issues by designing games involving environmental concepts. The purposes of this study were to investigate how designing computer games affected the development of students' environmental knowledge, programming knowledge, environmental awareness, and interest in computers. It also explored the nature of the artifacts developed and the types of knowledge represented therein. A case study (Yin, 2003) was employed within the context of a 5th grade elementary science classroom. Fifth graders designed computer games about environmental issues, to be presented to 2nd graders, using Scratch software. The analysis of this study was based on multiple data sources: students' pre- and post-test scores on environmental awareness, their environmental knowledge, their interest in computer science, and their game designs. Also included in the analyses were data from students' computer games, participant observations, and structured interviews. The results of the study showed that students were able to successfully design functional games representing their understanding of the environment, even though gains between the pre- and post-test of environmental knowledge and the environmental awareness survey were minimal. The findings indicate that all students were able to use various game characteristics and programming concepts, but their prior experience with the design software affected their representations. Analyses of the interview transcriptions and games show that students improved their programming skills and wanted to do similar projects for other subject areas in the future. Observations showed that game design appeared to lead to knowledge-building, interaction, and collaboration among students. This, in turn, encouraged students to test and improve their designs. Sharing the games was found to have both positive and negative effects on the students' game design process and on the representation of their understanding of the domain subject.
Graphical Visualization of Human Exploration Capabilities
NASA Technical Reports Server (NTRS)
Rodgers, Erica M.; Williams-Byrd, Julie; Arney, Dale C.; Simon, Matthew A.; Williams, Phillip A.; Barsoum, Christopher; Cowan, Tyler; Larman, Kevin T.; Hay, Jason; Burg, Alex
2016-01-01
NASA's pioneering space strategy will require advanced capabilities to expand the boundaries of human exploration on the Journey to Mars (J2M). The Evolvable Mars Campaign (EMC) architecture serves as a framework to identify critical capabilities that need to be developed and tested in order to enable a range of human exploration destinations and missions. Agency-wide System Maturation Teams (SMT) are responsible for the maturation of these critical exploration capabilities and help formulate, guide and resolve performance gaps associated with the EMC-identified capabilities. Systems Capability Organization Reporting Engine boards (SCOREboards) were developed to integrate the SMT data sets into cohesive human exploration capability stories that can be used to promote dialog and communicate NASA's exploration investments. Each SCOREboard provides a graphical visualization of SMT capability development needs that enable exploration missions, and presents a comprehensive overview of data that outlines a roadmap of system maturation needs critical for the J2M. SCOREboards are generated by a computer program that extracts data from a main repository, sorts the data based on a tiered data reduction structure, and then plots the data according to specified user inputs. The ability to sort and plot varying data categories provides the flexibility to present specific SCOREboard capability roadmaps based on customer requests. This paper presents the development of the SCOREboard computer program and shows multiple complementary, yet different datasets through a unified format designed to facilitate comparison between datasets. Example SCOREboard capability roadmaps are presented followed by a discussion of how the roadmaps are used to: 1) communicate capability developments and readiness of systems for future missions, and 2) influence the definition of NASA's human exploration investment portfolio through capability-driven processes. The paper concludes with a description of planned future work to modify the computer program to include additional data and of alternate capability roadmap formats currently under consideration.
Toward a molecular programming language for algorithmic self-assembly
NASA Astrophysics Data System (ADS)
Patitz, Matthew John
Self-assembly is the process whereby relatively simple components autonomously combine to form more complex objects. Nature exhibits self-assembly to form everything from microscopic crystals to living cells to galaxies. With a desire to both form increasingly sophisticated products and to understand the basic components of living systems, scientists have developed and studied artificial self-assembling systems. One such framework is the Tile Assembly Model introduced by Erik Winfree in 1998. In this model, simple two-dimensional square 'tiles' are designed so that they self-assemble into desired shapes. The work in this thesis consists of a series of results which build toward the future goal of designing an abstracted, high-level programming language for designing the molecular components of self-assembling systems which can perform powerful computations and form into intricate structures. The first two sets of results demonstrate self-assembling systems which perform infinite series of computations that characterize computably enumerable and decidable languages, and exhibit tools for algorithmically generating the necessary sets of tiles. In the next chapter, methods for generating tile sets which self-assemble into complicated shapes, namely a class of discrete self-similar fractal structures, are presented. Next, a software package for graphically designing tile sets, simulating their self-assembly, and debugging designed systems is discussed. Finally, a high-level programming language which abstracts much of the complexity and tedium of designing such systems, while preventing many of the common errors, is presented. The summation of this body of work presents a broad coverage of the spectrum of desired outputs from artificial self-assembling systems and a progression in the sophistication of tools used to design them. By creating a broader and deeper set of modular tools for designing self-assembling systems, we hope to increase the complexity which is attainable. These tools provide a solid foundation for future work in both the Tile Assembly Model and explorations into more advanced models.
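To make the model concrete: in the abstract Tile Assembly Model, a tile may attach wherever its matching glues bind with total strength at least the temperature tau. A minimal Python sketch follows; the two-tile set that grows a row is an illustrative toy, not one of the thesis constructions.

    from collections import namedtuple

    # A glue is (label, strength); sides are north, east, south, west.
    Tile = namedtuple("Tile", "name north east south west")
    SIDES = [((0, 1), "north", "south"), ((0, -1), "south", "north"),
             ((1, 0), "east", "west"), ((-1, 0), "west", "east")]

    def binding_strength(grid, pos, tile):
        """Sum strengths of glues that match the already-placed neighbors."""
        total = 0
        for (dx, dy), side, opposite in SIDES:
            nbr = grid.get((pos[0] + dx, pos[1] + dy))
            if nbr:
                mine, theirs = getattr(tile, side), getattr(nbr, opposite)
                if mine and mine == theirs:
                    total += mine[1]
        return total

    def assemble(tiles, seed, tau=2, max_steps=10):
        grid = {(0, 0): seed}
        for _ in range(max_steps):
            frontier = {(x + dx, y + dy) for (x, y) in grid
                        for (dx, dy), _, _ in SIDES} - grid.keys()
            placement = next(((p, t) for p in sorted(frontier) for t in tiles
                              if binding_strength(grid, p, t) >= tau), None)
            if placement is None:
                break                         # terminal assembly reached
            grid[placement[0]] = placement[1]
        return grid

    g = ("a", 2)                              # one strength-2 glue label
    seed = Tile("seed", None, g, None, None)
    row = Tile("row", None, g, None, g)       # binds on the west, presents east
    print(sorted(assemble([row], seed)))      # grows a row to the right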
Karanovic, Marinko; Muffels, Christopher T.; Tonkin, Matthew J.; Hunt, Randall J.
2012-01-01
Models of environmental systems have become increasingly complex, incorporating increasingly large numbers of parameters in an effort to represent physical processes on a scale approaching that at which they occur in nature. Consequently, the inverse problem of parameter estimation (specifically, model calibration) and subsequent uncertainty analysis have become increasingly computation-intensive endeavors. Fortunately, advances in computing have made computational power equivalent to that of dozens to hundreds of desktop computers accessible through a variety of alternate means: modelers have various possibilities, ranging from traditional Local Area Networks (LANs) to cloud computing. Commonly used parameter estimation software is well suited to take advantage of the availability of such increased computing power. Unfortunately, logistical issues become increasingly important as an increasing number and variety of computers are brought to bear on the inverse problem. To facilitate efficient access to disparate computer resources, the PESTCommander program documented herein has been developed to provide a Graphical User Interface (GUI) that facilitates the management of model files ("file management") and the remote launching and termination of "slave" computers across a distributed network of computers ("run management"). In version 1.0, described here, PESTCommander can access and ascertain resources across traditional Windows LANs; however, the architecture of PESTCommander has been developed with the intent that future releases will be able to access computing resources (1) via trusted domains established in Wide Area Networks (WANs) in multiple remote locations and (2) via heterogeneous networks of Windows- and Unix-based operating systems. The design of PESTCommander also makes it suitable for extension to other computational resources, such as those available via cloud computing. Version 1.0 of PESTCommander was developed primarily to work with the parameter estimation software PEST; the discussion presented in this report focuses on the use of PESTCommander together with Parallel PEST. However, PESTCommander can be used with a wide variety of programs and models that require management, distribution, and cleanup of files before or after model execution. In addition to its use with the Parallel PEST program suite, this report also discusses the use of PESTCommander with the Global Run Manager GENIE, which was developed simultaneously with PESTCommander.
Structural Weight Estimation for Launch Vehicles
NASA Technical Reports Server (NTRS)
Cerro, Jeff; Martinovic, Zoran; Su, Philip; Eldred, Lloyd
2002-01-01
This paper describes some of the work in progress to develop automated structural weight estimation procedures within the Vehicle Analysis Branch (VAB) of the NASA Langley Research Center. One task of the VAB is to perform system studies at the conceptual and early preliminary design stages on launch vehicles and in-space transportation systems. Some examples of these studies for Earth-to-Orbit (ETO) systems are the Future Space Transportation System [1], Orbit On Demand Vehicle [2], Venture Star [3], and the Personnel Rescue Vehicle [4]. Structural weight calculation for launch vehicle studies can exist at several levels of fidelity. Typically, historically based weight equations are used in a vehicle sizing program. Many of the studies in the Vehicle Analysis Branch have been enhanced in terms of structural weight fraction prediction by utilizing some level of off-line structural analysis to incorporate material property, load intensity, and configuration effects that may not be captured by the historical weight equations. Modification of Mass Estimating Relationships (MERs) to assess design and technology impacts on vehicle performance is necessary to prioritize design and technology development decisions. Modern CAD/CAE software, ever-increasing computational power, and platform-independent programming languages such as Java provide new means to create deeper analysis tools that can be included in the conceptual design phase of launch vehicle development. Commercial framework computing environments provide easy-to-program techniques that coordinate and implement the flow of data in a distributed heterogeneous computing environment. It is the intent of this paper to present a process in development at NASA LaRC for enhanced structural weight estimation using this state-of-the-art computational power.
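As a flavor of the lowest-fidelity level, a mass estimating relationship is essentially a regressed power law. A sketch with hypothetical coefficients follows; real MERs are fit to historical vehicle data and are configuration-specific, so every number here is an illustrative assumption.

    import math

    def shell_mass(radius_m, length_m, nx_n_per_m, a=0.004, b=0.6):
        """Hypothetical MER: panel areal mass scales as a power of the
        running load Nx; a and b stand in for regression constants, and a
        structural-analysis correction factor could multiply the result,
        as described in the record above."""
        area = 2.0 * math.pi * radius_m * length_m    # shell surface area
        areal_mass = a * nx_n_per_m ** b              # kg per square meter
        return area * areal_mass

    print(f"{shell_mass(2.0, 10.0, 3.0e5):.0f} kg (illustrative only)")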
NASA Technical Reports Server (NTRS)
Kraft, R. E.
1996-01-01
A computational method for predicting modal reflection coefficients in cylindrical ducts has been developed based on the work of Homicz, Lordi, and Rehm, which uses the Wiener-Hopf method to account for the boundary conditions at the termination of a thin cylindrical pipe. The purpose of this study is to develop a computational routine to predict the reflection coefficients of higher-order acoustic modes impinging on the unflanged termination of a cylindrical duct. This effort was conducted under Task Order 5 of the NASA Lewis LET Program, Active Noise Control of Aircraft Engines: Feasibility Study, and will be used as part of the development of an integrated simulation of source noise, acoustic propagation, ANC actuator coupling, and the control system algorithm. The reflection coefficient prediction will be incorporated into an existing cylindrical duct modal analysis to account for the reflection of modes from the duct termination. This will provide a more accurate, rapid-computation design tool for evaluating the effect of reflected waves on active noise control systems mounted in the duct, as well as a tool for the design of acoustic treatment in inlet ducts. As an active noise control system design tool, the method can be used as a preliminary to more accurate but more numerically intensive acoustic propagation models such as finite element methods. The resulting computer program has been shown to give reasonable results, some examples of which are presented. Reliable data to use for comparison are scarce, so complete checkout is difficult, and further checkout is needed over a wider range of system parameters. In future efforts the method will be adapted as a subroutine to the GEAE segmented cylindrical duct modal analysis program.
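For orientation, the classical Levine-Schwinger limit for the plane-wave mode of an unflanged circular duct of radius a (free-space wavenumber k) is often quoted alongside such computations; it is given here as background only, not as the Wiener-Hopf result the routine implements:

\[ |R| \approx 1 - \tfrac{1}{2}(ka)^2, \qquad \ell/a \approx 0.6133 \qquad (ka \ll 1), \]

where \ell is the end correction of the open pipe. The Wiener-Hopf analysis generalizes this picture to a complex reflection coefficient for each higher-order mode impinging on the termination.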
Glasgow, Russell E.; Christiansen, Steve; Smith, K. Sabina; Stevens, Victor J.; Toobert, Deborah J.
2009-01-01
Computer-tailored behavior change programs offer the potential for reaching large populations at a much lower cost than individual or group-based programs. However, few of these programs to date appear to integrate behavioral theory with user choice, or combine different electronic modalities. We describe the development of an integrated CD-ROM and interactive voice response dietary change intervention that combines behavioral problem-solving theory with a high degree of user choice. The program, WISE CHOICES, is being evaluated as part of an ongoing trial. This paper describes the program development, emphasizing how user preferences are accommodated, and presents implementation and user satisfaction data. The program was successfully implemented; the linkages among the central database, the CD-ROM and the automated telephone components were robust, and participants liked the program almost as well as a counselor-delivered dietary change condition. Multi-modality programs that emphasize the strengths of each approach appear to be feasible. Future research is needed to determine the program impact and cost-effectiveness compared with counselor-delivered intervention. PMID:18711204
Clinical laboratory technician to clinical laboratory scientist articulation and distance learning.
Crowley, J R; Laurich, G A; Mobley, R C; Arnette, A H; Shaikh, A H; Martin, S M
1999-01-01
Laboratory workers and educators alike are challenged to support access to education that is current and provides opportunities for career advancement in the work place. The clinical laboratory science (CLS) program at the Medical College of Georgia in Augusta developed a clinical laboratory technician (CLT) to CLS articulation option, expanded it through distance learning, and integrated computer based learning technology into the educational process over a four year period to address technician needs for access to education. Both positive and negative outcomes were realized through these efforts. Twenty-seven students entered the pilot articulation program, graduated, and took a CLS certification examination. Measured in terms of CLS certification, promotions, pay raises, and career advancement, the program described was a success. However, major problems were encountered related to the use of unfamiliar communication technology; administration of the program at distance sites; communication between educational institutions, students, and employers; and competition with CLT programs for internship sites. These problems must be addressed in future efforts to provide a successful distance learning program. Effective methods for meeting educational needs and career ladder expectations of CLTs and their employers are important to the overall quality and appeal of the profession. Educational technology that includes computer-aided instruction, multimedia, and telecommunications can provide powerful tools for education in general and CLT articulation in particular. Careful preparation and vigilant attention to reliable delivery methods as well as students' progress and outcomes is critical for an efficient, economically feasible, and educationally sound program.
Parallel Signal Processing and System Simulation using aCe
NASA Technical Reports Server (NTRS)
Dorband, John E.; Aburdene, Maurice F.
2003-01-01
Recently, networked and cluster computation have become very popular for both signal processing and system simulation. The new aCe language is ideally suited for parallel signal processing applications and system simulation, since it allows the programmer to explicitly express the computations that can be performed concurrently. In addition, this C-based parallel language (aCe C) for architecture-adaptive programming allows programmers to implement algorithms and system simulation applications on parallel architectures with the assurance that future parallel architectures will be able to run their applications with a minimum of modification. In this paper, we focus on some fundamental features of aCe C and present a signal processing application (FFT).
Computational modeling of epidermal cell fate determination systems.
Ryu, Kook Hui; Zheng, Xiaohua; Huang, Ling; Schiefelbein, John
2013-02-01
Cell fate decisions are of primary importance for plant development. Their simple 'either-or' outcome and dynamic nature has attracted the attention of computational modelers. Recent efforts have focused on modeling the determination of several epidermal cell types in the root and shoot of Arabidopsis where many molecular components have been defined. Results of integrated modeling and molecular biology experimentation in these systems have highlighted the importance of competitive positive and negative factors and interconnected feedback loops in generating flexible yet robust mechanisms for establishing distinct gene expression programs in neighboring cells. These models have proven useful in judging hypotheses and guiding future research.
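The competitive feedback logic these models rely on can be illustrated with a generic two-cell toggle in which each cell's fate factor represses the other's. A minimal sketch follows; the equations and parameters are a textbook mutual-inhibition motif, not the Arabidopsis network itself.

    # Two neighboring cells with mutually repressive fate factors u and v:
    #   du/dt = alpha / (1 + v**n) - u,   dv/dt = alpha / (1 + u**n) - v.
    # A small initial asymmetry is amplified into two distinct stable fates.
    alpha, n, dt = 4.0, 2, 0.01

    u, v = 1.01, 1.00          # nearly identical starting states
    for _ in range(5000):      # forward-Euler integration
        du = alpha / (1 + v**n) - u
        dv = alpha / (1 + u**n) - v
        u, v = u + dt * du, v + dt * dv

    print(f"cell 1 fate factor: {u:.2f}, cell 2 fate factor: {v:.2f}")
    # -> one high, one low: the neighbors lock into different expression programs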
Computer simulation of explosion crater in dams with different buried depths of explosive
NASA Astrophysics Data System (ADS)
Zhang, Zhichao; Ye, Longzhen
2018-04-01
Based on the multi-material ALE method, this paper presents computer simulations, using the LS-DYNA program, of the explosion crater in dams for different buried depths of the explosive. The results show that the crater size initially increases with the buried depth of the explosive, but a closed explosion cavity, rather than a visible crater, forms once the buried depth increases beyond a certain point. The soil in the explosion cavity is carried away by the explosion products, and the soil under the cavity is compressed, with its density increased. The research can provide a reference for the future blast-resistant design of dams.
NASA Astrophysics Data System (ADS)
Linn, Marcia C.
1995-06-01
Designing effective curricula for complex topics and incorporating technological tools is an evolving process. One important way to foster effective design is to synthesize successful practices. This paper describes a framework called scaffolded knowledge integration and illustrates how it guided the design of two successful course enhancements in the field of computer science and engineering. One course enhancement, the LISP Knowledge Integration Environment, improved learning and resulted in more gender-equitable outcomes. The second course enhancement, the spatial reasoning environment, addressed spatial reasoning in an introductory engineering course. This enhancement minimized the importance of prior knowledge of spatial reasoning and helped students develop a more comprehensive repertoire of spatial reasoning strategies. Taken together, the instructional research programs reinforce the value of the scaffolded knowledge integration framework and suggest directions for future curriculum reformers.
Parallel programming of industrial applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heroux, M; Koniges, A; Simon, H
1998-07-21
In the introductory material, we overview the typical MPP environment for real application computing and the special tools available, such as parallel debuggers and performance analyzers. Next, we draw from a series of real application codes and discuss the specific challenges and problems encountered in parallelizing these individual applications. The application areas include the biomedical sciences, materials processing and design, plasma and fluid dynamics, and others. We show how it was possible to get a particular application to run efficiently and what steps were necessary. Finally, we end with a summary of the lessons learned from these applications and predictions for the future of industrial parallel computing. This tutorial is based on material from a forthcoming book entitled "Industrial Strength Parallel Computing," to be published by Morgan Kaufmann Publishers (ISBN 1-55860-54).
NASA Technical Reports Server (NTRS)
Bradford, D. F.; Kelejian, H. H.; Brusch, R.; Gross, J.; Fishman, H.; Feenberg, D.
1974-01-01
The value of improving information for forecasting future crop harvests was investigated. Emphasis was placed upon establishing practical evaluation procedures firmly based in economic theory. The analysis was applied to the case of U.S. domestic wheat consumption. Estimates for a cost of storage function and a demand function for wheat were calculated. A model of market determinations of wheat inventories was developed for inventory adjustment. The carry-over horizon is computed by the solution of a nonlinear programming problem, and related variables such as spot and future price at each stage are determined. The model is adaptable to other markets. Results are shown to depend critically on the accuracy of current and proposed measurement techniques. The quantitative results are presented parametrically, in terms of various possible values of current and future accuracies.
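A stripped-down version of the carry-over computation can be posed as a one-variable program: choose the carry-over s that maximizes two-period surplus net of storage cost. The linear demand curve and every number below are illustrative assumptions, not the study's estimates.

    from scipy.optimize import minimize_scalar

    a, b = 10.0, 0.02      # hypothetical inverse demand p(q) = a - b*q
    k = 0.5                # hypothetical per-unit storage cost
    h1, h2 = 300.0, 250.0  # current harvest and forecast future harvest

    def surplus(q):
        # Consumer surplus under linear demand: integral of p from 0 to q.
        return a * q - 0.5 * b * q * q

    def welfare(s):
        # Consume h1 - s now, h2 + s later; pay k per unit stored.
        return surplus(h1 - s) + surplus(h2 + s) - k * s

    res = minimize_scalar(lambda s: -welfare(s), bounds=(0.0, h1), method="bounded")
    print(f"optimal carry-over: {res.x:.1f} units")  # analytic: (h1 - h2 - k/b)/2 = 12.5

An improved harvest forecast shifts h2, and the value of that information can be measured as the welfare gained from re-optimizing s, mirroring the parametric accuracy analysis described above.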
ERIC Educational Resources Information Center
Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.
This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for two programs in the state's postsecondary-level computer information systems technology cluster: computer programming and network support. Presented in the introduction are program descriptions and suggested course…
Computation Directorate Annual Report 2003
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crawford, D L; McGraw, J R; Ashby, S F
Big computers are icons: symbols of the culture, and of the larger computing infrastructure that exists at Lawrence Livermore. Through the collective effort of Laboratory personnel, they enable scientific discovery and engineering development on an unprecedented scale. For more than three decades, the Computation Directorate has supplied the big computers that enable the science necessary for Laboratory missions and programs. Livermore supercomputing is uniquely mission driven. The high-fidelity weapon simulation capabilities essential to the Stockpile Stewardship Program compel major advances in weapons codes and science, compute power, and computational infrastructure. Computation's activities align with this vital mission of the Department of Energy. Increasingly, non-weapons Laboratory programs also rely on computer simulation. World-class achievements have been accomplished by LLNL specialists working in multi-disciplinary research and development teams. In these teams, Computation personnel employ a wide array of skills, from desktop support expertise, to complex applications development, to advanced research. Computation's skilled professionals make the Directorate the success that it has become. These individuals know the importance of the work they do and the many ways it contributes to Laboratory missions. They make appropriate and timely decisions that move the entire organization forward. They make Computation a leader in helping LLNL achieve its programmatic milestones. I dedicate this inaugural Annual Report to the people of Computation in recognition of their continuing contributions. I am proud that we perform our work securely and safely. Despite increased cyber attacks on our computing infrastructure from the Internet, advanced cyber security practices ensure that our computing environment remains secure. Through Integrated Safety Management (ISM) and diligent oversight, we address safety issues promptly and aggressively. The safety of our employees, whether at work or at home, is a paramount concern. Even as the Directorate meets today's supercomputing requirements, we are preparing for the future. We are investigating open-source cluster technology, the basis of our highly successful Multiprogrammatic Capability Resource (MCR). Several breakthrough discoveries have resulted from MCR calculations coupled with theory and experiment, prompting Laboratory scientists to demand ever-greater capacity and capability. This demand is being met by a new 23-TF system, Thunder, with an architecture modeled on MCR. In preparation for the "after-next" computer, we are researching technology even farther out on the horizon: cell-based computers. Assuming that the funding and the technology hold, we will acquire the cell-based machine BlueGene/L within the next 12 months.
Solving multistage stochastic programming models of portfolio selection with outstanding liabilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Edirisinghe, C.
1994-12-31
Models for portfolio selection in the presence of an outstanding liability have received significant attention, for example, models for pricing options. The problem may be described briefly as follows: given a set of risky securities (and a riskless security such as a bond), and given a set of cash flows, i.e., an outstanding liability, to be met at some future date, determine an initial portfolio and a dynamic trading strategy for the underlying securities such that the initial cost of the portfolio is within a prescribed wealth level and the expected cash surplus arising from trading is maximized. While the trading strategy should be self-financing, there may also be other restrictions, such as leverage and short-sale constraints. Usually the treatment is limited to binomial evolution of uncertainty (of the stock price), with possible extensions for developing computational bounds for multinomial generalizations. Posing these as stochastic programming models of decision making, we investigate alternative efficient solution procedures under continuous evolution of uncertainty for discrete-time economies. We point out an important moment problem arising in the portfolio selection problem, the solution of (or bounds on) which provides the basis for developing efficient computational algorithms. While the underlying stochastic program may be computationally tedious even for a modest number of trading opportunities (i.e., time periods), the derived algorithms may be used to solve problems whose sizes are beyond those considered within stochastic optimization.
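In the binomial one-period special case, the liability-hedging problem reduces to a small linear program: find the cheapest portfolio whose payoff dominates the liability in every scenario. A sketch with illustrative numbers (a call-like liability in a two-scenario market; nothing here comes from the paper itself):

    from scipy.optimize import linprog

    s0, s_up, s_down = 100.0, 120.0, 80.0   # stock price now and in each scenario
    r = 1.05                                # gross riskless return; bond price 1
    liab = {"up": 20.0, "down": 0.0}        # liability due at the future date

    # Variables: x = stock shares, y = bond units. Minimize initial cost s0*x + y
    # subject to the payoff covering the liability in both scenarios (shorting allowed).
    c = [s0, 1.0]
    A_ub = [[-s_up, -r], [-s_down, -r]]     # -(payoff) <= -(liability)
    b_ub = [-liab["up"], -liab["down"]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None), (None, None)])

    x, y = res.x
    print(f"hold {x:.3f} shares, {y:.2f} bonds; initial cost {res.fun:.2f}")
    # -> about 0.5 shares and -38.1 bonds, costing ~11.90: the replication price.

Multistage versions stack one such constraint set per node of the scenario tree, which is where the moment problem and bounding algorithms mentioned above become important.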
NASA Astrophysics Data System (ADS)
Nakagawa, Y.; Kawahara, S.; Araki, F.; Matsuoka, D.; Ishikawa, Y.; Fujita, M.; Sugimoto, S.; Okada, Y.; Kawazoe, S.; Watanabe, S.; Ishii, M.; Mizuta, R.; Murata, A.; Kawase, H.
2017-12-01
Analyses of large ensemble data are quite useful for producing probabilistic projections of climate change effects. Ensemble data from "+2K future climate simulations" are currently produced by the Japanese national project "Social Implementation Program on Climate Change Adaptation Technology (SI-CAT)" as part of the database for Policy Decision making for Future climate change (d4PDF; Mizuta et al. 2016) produced by the Program for Risk Information on Climate Change. The data consist of global warming simulations and regional downscaling simulations. Because the data volumes are too large (a few petabytes) to download to a user's local computer, a user-friendly system is required to search for and download the data that satisfy a user's requests. Under SI-CAT, we are developing "a database system for near-future climate change projections" that provides functions for finding the data users need. The database system mainly consists of a relational database, a data download function, and a user interface; the relational database, which uses PostgreSQL, is the key component among them. Temporally and spatially compressed data are registered in the relational database. As a first step, we have developed the relational database for precipitation, temperature, and typhoon track data, according to requests from SI-CAT members. The data download function, which uses the Open-source Project for a Network Data Access Protocol (OPeNDAP), downloads temporally and spatially extracted data based on search results obtained from the relational database. We have also developed a web-based user interface for the relational database and the data download function. A prototype of the database system is currently under operational testing on our local server. The database system will be released on the Data Integration and Analysis System Program (DIAS) in fiscal year 2017. The techniques behind the database system might also prove quite useful for simulation and observational data in other research fields. We report the current status of development and some case studies of the database system for near-future climate change projections.
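To make the search function concrete: a relational database like the one described can be queried for a compressed summary before any bulk download is requested. A minimal Python sketch follows; the connection string, table, column names, and region label are illustrative assumptions about a schema that the abstract does not publish.

    import psycopg2

    # Hypothetical schema: one row per (model, ensemble member, region, year).
    conn = psycopg2.connect("dbname=sicat_d4pdf host=localhost user=reader")
    with conn, conn.cursor() as cur:
        cur.execute(
            """
            SELECT model, member, year, total_precip_mm
            FROM precip_summary
            WHERE region = %s AND year BETWEEN %s AND %s
            ORDER BY model, member, year
            """,
            ("Kyushu", 2031, 2050),
        )
        for model, member, year, precip in cur.fetchall():
            print(model, member, year, precip)
    conn.close()

Only after such a search narrows the selection would the OPeNDAP download function be asked for the matching spatio-temporal subset of the full ensemble.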