Sample records for applications program tasks

  1. Development of a task-level robot programming and simulation system

    NASA Technical Reports Server (NTRS)

    Liu, H.; Kawamura, K.; Narayanan, S.; Zhang, G.; Franke, H.; Ozkan, M.; Arima, H.

    1987-01-01

    An ongoing project to develop a Task-Level Robot Programming and Simulation System (TARPS) is discussed. The objective is to design a generic TARPS that can be used in a variety of applications. Many robotic applications require off-line programming, and a TARPS is very useful in such cases. Task-level programming is object-centered in that the user specifies tasks to be performed instead of robot paths. Graphics simulation provides greater flexibility and avoids costly machine setup and possible damage. A TARPS has three major modules: a world model, a task planner, and a task simulator. The system architecture, design issues, and some preliminary results are given.
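
The three-module decomposition described in the abstract (world model, task planner, task simulator) can be sketched in miniature as follows. All class and method names, and the toy task format, are illustrative assumptions, not the paper's actual design.

```python
# Minimal sketch of task-level programming: the user names an object-centered
# task ("place A on B") and the planner, not the user, derives the motion
# steps; the simulator replays the plan against the world model instead of
# real hardware. All names and structures are illustrative.

class WorldModel:
    """Tracks object positions; here just a name -> (x, y) table."""
    def __init__(self, objects):
        self.objects = dict(objects)

class TaskPlanner:
    """Expands an object-level task into a motion plan (a list of steps)."""
    def plan(self, world, task):
        verb, obj, dest = task          # e.g. ("place", "A", "B")
        src = world.objects[obj]
        dst = world.objects[dest]
        return [("move_to", src), ("grasp", obj),
                ("move_to", dst), ("release", obj)]

class TaskSimulator:
    """Replays a plan, updating the world model step by step."""
    def run(self, world, plan):
        held, pos = None, None
        for step, arg in plan:
            if step == "move_to":
                pos = arg
            elif step == "grasp":
                held = arg
            elif step == "release" and held is not None:
                world.objects[held] = pos   # object lands at last move_to target
                held = None
        return world

world = WorldModel({"A": (0, 0), "B": (5, 5)})
plan = TaskPlanner().plan(world, ("place", "A", "B"))
TaskSimulator().run(world, plan)
print(world.objects["A"])  # (5, 5): A now sits at B's position
```

The point of the sketch is the division of labor: nothing in the user-facing task mentions coordinates or paths; those come from the world model and planner.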

  2. Microgravity science and applications. Program tasks and bibliography for FY 1994

    NASA Technical Reports Server (NTRS)

    1995-01-01

    This annual report includes research projects funded by the Office of Life and Microgravity Sciences and Applications, Microgravity Science and Applications Division, during FY 1994. It is a compilation of program tasks (objective, description, significance, progress, students funded under research, and bibliographic citations) for flight research and ground-based research in five major scientific disciplines: benchmark science, biotechnology, combustion science, fluid physics, and materials science. Advanced Technology Development (ATD) program task descriptions are also included. The bibliography cites the related Principal Investigator (PI) publications and presentations for these program tasks in FY 1994. Three appendices include a Table of Acronyms, a Guest Investigator Index, and a Principal Investigator Index.

  3. Microgravity science & applications. Program tasks and bibliography for FY 1995

    NASA Technical Reports Server (NTRS)

    1996-01-01

    This annual report includes research projects funded by the Office of Life and Microgravity Sciences and Applications, Microgravity Science and Applications Division, during FY 1995. It is a compilation of program tasks (objective, description, significance, progress, students funded under research, and bibliographic citations) for flight research and ground-based research in five major scientific disciplines: benchmark science, biotechnology, combustion science, fluid physics, and materials science. Advanced Technology Development (ATD) program task descriptions are also included. The bibliography cites the related principal investigator (PI) publications and presentations for these program tasks in FY 1995. Three appendices include a Table of Acronyms, a Guest Investigator Index, and a Principal Investigator Index.

  4. Characterizing and Mitigating Work Time Inflation in Task Parallel Programs

    DOE PAGES

    Olivier, Stephen L.; de Supinski, Bronis R.; Schulz, Martin; ...

    2013-01-01

    Task parallelism raises the level of abstraction in shared memory parallel programming to simplify the development of complex applications. However, task parallel applications can exhibit poor performance due to thread idleness, scheduling overheads, and work time inflation: additional time spent by threads in a multithreaded computation beyond the time required to perform the same work in a sequential computation. We identify the contributions of each factor to lost efficiency in various task parallel OpenMP applications and diagnose the causes of work time inflation in those applications. Increased data access latency can cause significant work time inflation in NUMA systems. Our locality framework for task parallel OpenMP programs mitigates this cause of work time inflation. Our extensions to the Qthreads library demonstrate that locality-aware scheduling can improve performance up to 3X compared to the Intel OpenMP task scheduler.
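
The efficiency decomposition this abstract describes can be illustrated with a small accounting sketch. The helper name and the numbers are invented for illustration; the key identity is that work time inflation is the extra busy time threads accumulate beyond the sequential work time, and it is distinct from idleness and scheduling overhead.

```python
# Decompose lost efficiency in a task-parallel run into its three sources:
# thread idleness, scheduling overhead, and work time inflation.
# All inputs are in seconds; the example numbers are invented.

def decompose(seq_time, threads, wall_time, busy, overhead):
    total = threads * wall_time      # total thread-seconds available
    idle = total - busy - overhead   # time threads spent waiting for work
    inflation = busy - seq_time      # busy time beyond the sequential work
    return idle, overhead, inflation

# 4 threads, 10 s wall clock, for work a sequential run finishes in 30 s.
idle, ovh, infl = decompose(seq_time=30.0, threads=4, wall_time=10.0,
                            busy=34.0, overhead=2.0)
print(idle, ovh, infl)  # 4.0 2.0 4.0
```

Here 40 thread-seconds were available but only 30 were strictly needed; the sketch attributes the 10 lost seconds as 4 s idle, 2 s overhead, and 4 s inflation (e.g. from remote NUMA accesses).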

  5. Microgravity science and applications program tasks, 1991 revision

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Presented here is a compilation of the active research tasks for FY 1991 sponsored by the Microgravity Science and Applications Division of the NASA Office of Space Science and Applications. The purpose is to provide an overview of the program scope for managers and scientists in industry, university, and government communities. Included are an introductory description of the program, the strategy and overall goal, identification of the organizational structures and the people involved, and a description of each task. The tasks are grouped into several categories: electronic materials; solidification of metals, alloys, and composites; fluids, interfaces, and transport; biotechnology; combustion science; glasses and ceramics; experimental technology, instrumentation, and facilities; and Physics and Chemistry Experiments (PACE). The tasks cover both the ground-based and flight programs.

  6. Synthetic Proxy Infrastructure for Task Evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Junghans, Christoph; Pavel, Robert

    The Synthetic Proxy Infrastructure for Task Evaluation is a proxy application designed to help application developers gauge the performance of various task granularities when determining how best to utilize task-based programming models. The infrastructure provides examples of common communication patterns with a synthetic workload, yielding performance data for evaluating programming model and platform overheads for the purpose of choosing a task granularity during task decomposition. It is presented as a reference implementation of a proxy application with run-time configurable input and output task dependencies, ranging from an embarrassingly parallel scenario to patterns with stencil-like dependencies on nearest neighbors. Once all inputs, if any, are satisfied, each task executes a synthetic workload (a simple DGEMM in this case) of varying size and delivers all outputs, if any, to the next tasks. The intent is for this reference implementation to be re-implemented as a proxy app in different programming models, providing the same infrastructure and allowing application developers to simulate their own communication needs to assist in task decomposition under various models on a given platform.
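
The dependency-driven execution the abstract describes (a task runs once all of its inputs, if any, are satisfied, then feeds outputs to the next tasks) can be sketched as a small dependency-graph runner. The function names, the dictionary-based dependency encoding, and the trivial stand-in workload are all illustrative assumptions; the real proxy uses a DGEMM kernel and run-time configuration.

```python
# Sketch of dependency-driven task execution: tasks become ready when all
# their input dependencies are satisfied, run a synthetic workload, and
# pass results to dependent tasks. Structure and names are illustrative.

from collections import deque

def run_tasks(deps, work):
    """deps: task -> list of prerequisite tasks; work(task, inputs) -> result."""
    remaining = {t: len(d) for t, d in deps.items()}
    ready = deque(t for t, n in remaining.items() if n == 0)
    order, results = [], {}
    while ready:
        t = ready.popleft()
        results[t] = work(t, [results[d] for d in deps[t]])
        order.append(t)
        for u, d in deps.items():        # release tasks waiting on t
            if t in d:
                remaining[u] -= 1
                if remaining[u] == 0:
                    ready.append(u)
    return order, results

# Stencil-like chain: each task depends on its left neighbor.
deps = {0: [], 1: [0], 2: [1]}
synthetic = lambda t, inputs: sum(inputs) + 1   # stand-in for a DGEMM
order, results = run_tasks(deps, synthetic)
print(order, results[2])  # [0, 1, 2] 3
```

An embarrassingly parallel configuration is just `deps = {0: [], 1: [], 2: []}`; the same runner then finds every task ready at once, which is exactly the granularity/overhead trade-off the proxy is built to measure.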

  7. Microgravity Science and Applications Program tasks, 1987 revision

    NASA Technical Reports Server (NTRS)

    1988-01-01

    A compilation is presented of the active research tasks, as of the end of FY 1987, of the Microgravity Science and Applications Program, NASA Office of Space Science and Applications, involving several NASA centers and other organizations. An overview is provided of the program scope for managers and scientists in industry, university, and government communities. An introductory description is provided of the program along with the strategy and overall goal, identification of the organizational structures and people involved, and a description of each task. A list of recent publications is also provided. The tasks are grouped into six major categories: Electronic Materials; Solidification of Metals, Alloys, and Composites; Fluid Dynamics and Transport Phenomena; Biotechnology; Glasses and Ceramics; and Combustion. Other categories include Experimental Technology, General Studies and Surveys; Foreign Government Affiliations; Industrial Affiliations; and Physics and Chemistry Experiments (PACE). The tasks are divided into ground-based and flight experiments.

  8. Microgravity Science and Applications Program tasks, 1988 revision

    NASA Technical Reports Server (NTRS)

    1989-01-01

    The active research tasks as of the end of fiscal year 1988 of the Microgravity Science and Applications Program, NASA Office of Space Science and Applications, involving several NASA centers and other organizations are compiled. The purpose is to provide an overview of the program scope for managers and scientists in industry, university, and government communities. Also included are an introductory description of the program, the strategy and overall goal, identification of the organizational structures and people involved, and a description of each task. A list of recent publications is provided. The tasks are grouped into six major categories: electronic materials; solidification of metals, alloys, and composites; fluid dynamics and transport phenomena; biotechnology; glasses and ceramics; and combustion. Other categories include experimental technology, general studies and surveys; foreign government affiliations; industrial affiliations; and Physics and Chemistry Experiments (PACE). The tasks are divided into ground-based and flight experiments.

  9. Microgravity Science and Applications Program Tasks, 1984 Revision

    NASA Technical Reports Server (NTRS)

    Pentecost, E. (Compiler)

    1985-01-01

    This report is a compilation of the active research tasks as of the end of the fiscal year 1984 of the Microgravity Science and Applications Program, NASA-Office of Space Science and Applications, involving several NASA centers and other organizations. The purpose of the document is to provide an overview of the program scope for managers and scientists in industry, university, and government communities. The report is structured to include an introductory description of the program, strategy and overall goal; identification of the organizational structures and people involved; and a description of each research task, together with a list of recent publications. The tasks are grouped into six categories: (1) electronic materials; (2) solidification of metals, alloys, and composites; (3) fluid dynamics and transports; (4) biotechnology; (5) glasses and ceramics; and (6) combustion.

  10. Microgravity Science and Application Program tasks, 1989 revision

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The active research tasks, as of fiscal year 1989, of the Microgravity Science and Applications Program, NASA Office of Space Science and Applications, involving several NASA Centers and other organizations are compiled. The purpose is to provide an overview of the program scope for managers and scientists in industry, university, and government communities. An introductory description of the program, the strategy and overall goal, identification of the organizational structures and people involved, and a description of each task are included. Also provided is a list of recent publications. The tasks are grouped into several major categories: electronic materials; solidification of metals, alloys, and composites; fluids, interfaces, and transport; biotechnology; glasses and ceramics; combustion science; Physics and Chemistry Experiments (PACE); and experimental technology, facilities, and instrumentation.

  11. Microgravity Science and Applications Program tasks, 1990 revision

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The active research tasks as of the end of fiscal year 1990 sponsored by the Microgravity Science and Applications Division of the NASA Office of Space Science and Applications are compiled. The purpose is to provide an overview of the program scope for managers and scientists in industry, university, and government communities. The report includes an introductory description of the program, the strategy and overall goal; an index of principal investigators; and a description of each task. A list of recent publications is also provided. The tasks are grouped into several major categories: electronic materials; solidification of metals, alloys, and composites; fluid dynamics and transport phenomena; biotechnology; glasses and ceramics; combustion; experimental technology; facilities; and Physics and Chemistry Experiments (PACE). The tasks are divided into ground-based and flight experiments.

  12. An Internal Data Non-hiding Type Real-time Kernel and its Application to the Mechatronics Controller

    NASA Astrophysics Data System (ADS)

    Yoshida, Toshio

    High-speed motion control processing is essential for the mechatronics equipment controllers that drive robots and machine tools. Like other embedded systems, the controller's software runs on dedicated hardware and is composed of three layers: a real-time kernel layer, a middleware layer, and an application software layer. The top application layer consists of many tasks, and the application function of the system is realized through cooperation among these tasks. In this paper we propose an internal data non-hiding type real-time kernel in which task control can be customized solely by changing the program code on the task side, without any changes to the program code of the real-time kernel. Speeding up the motion control of mechatronics equipment requires reducing the overhead caused by the real-time kernel's task control, which in turn requires customizing the task control function. We developed the internal data non-hiding type real-time kernel ZRK to evaluate this method and applied it to the control of a multi-system automatic lathe. The speed-up of task cooperation processing was confirmed by combining task control processing in the task-side program code using ZRK.

  13. Dynamic mobility applications open source application development portal : Task 3.3 : concept of operations : final report.

    DOT National Transportation Integrated Search

    2016-10-12

    The Dynamic Mobility Applications (DMA) program seeks to promote the highest level of collaboration and preservation of intellectual capital generated from application development and associated research activities funded by the program. The program ...

  14. NASA-Ames workload research program

    NASA Technical Reports Server (NTRS)

    Hart, Sandra

    1988-01-01

    Research has been underway for several years to develop valid and reliable measures and predictors of workload as a function of operator state, task requirements, and system resources. Although the initial focus of this research was on aeronautics, the underlying principles and methodologies are equally applicable to space, and provide a set of tools that NASA and its contractors can use to evaluate design alternatives from the perspective of the astronauts. Objectives and approach of the research program are described, as well as the resources used in conducting research and the conceptual framework around which the program evolved. Next, standardized tasks are described, in addition to predictive models and assessment techniques and their application to the space program. Finally, some of the operational applications of these tasks and measures are reviewed.

  15. Materials processing in space program tasks

    NASA Technical Reports Server (NTRS)

    Mckannan, E. C. (Editor)

    1978-01-01

    A list of the active research tasks, as of the end of 1978, of the Materials Processing in Space Program of the Office of Space and Terrestrial Applications, involving several NASA Centers and other organizations, is reported. An overview of the program scope is provided for managers and scientists in industry, university, and government communities. The program, its history, strategy, and overall goal; the organizational structures and people involved; and each research task are described. Tasks are categorized as ground-based research in four process areas. Cross references to the performing organizations and principal investigators are provided.

  16. Applications of artificial intelligence to mission planning

    NASA Technical Reports Server (NTRS)

    Ford, Donnie R.; Floyd, Stephen A.; Rogers, John S.

    1990-01-01

    The following subject areas are covered: object-oriented programming task; rule-based programming task; algorithms for resource allocation; connecting a Symbolics to a VAX; FORTRAN from Lisp; trees and forest task; software data structure conversion; software functionality modifications and enhancements; portability of resource allocation to a TI MicroExplorer; frontier of feasibility software system; and conclusions.

  17. Machine Learning Based Online Performance Prediction for Runtime Parallelization and Task Scheduling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, J; Ma, X; Singh, K

    2008-10-09

    With the emerging many-core paradigm, parallel programming must extend beyond its traditional realm of scientific applications. Converting existing sequential applications as well as developing next-generation software requires assistance from hardware, compilers and runtime systems to exploit parallelism transparently within applications. These systems must decompose applications into tasks that can be executed in parallel and then schedule those tasks to minimize load imbalance. However, many systems lack a priori knowledge about the execution time of all tasks to perform effective load balancing with low scheduling overhead. In this paper, we approach this fundamental problem using machine learning techniques, first to generate performance models for all tasks and then applying those models to perform automatic performance prediction across program executions. We also extend an existing scheduling algorithm to use generated task cost estimates for online task partitioning and scheduling. We implement the above techniques in the pR framework, which transparently parallelizes scripts in the popular R language, and evaluate their performance and overhead with both a real-world application and a large number of synthetic representative test scripts. Our experimental results show that our proposed approach significantly improves task partitioning and scheduling, with maximum improvements of 21.8%, 40.3% and 22.1% and average improvements of 15.9%, 16.9% and 4.2% for LMM (a real R application) and synthetic test cases with independent and dependent tasks, respectively.
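
The two-step idea in this abstract (fit a cost model for tasks, then schedule using predicted costs) can be sketched as follows. The linear cost model and the longest-processing-time-first heuristic are illustrative stand-ins for the paper's actual techniques, and every name here is an assumption.

```python
# Sketch: (1) fit a simple cost model from observed task runtimes,
# (2) partition tasks across workers using predicted costs.
# Linear model and LPT greedy heuristic are illustrative choices.

def fit_linear(sizes, times):
    """Least-squares fit of time ~ a*size + b."""
    n = len(sizes)
    mx, my = sum(sizes) / n, sum(times) / n
    a = sum((x - mx) * (y - my) for x, y in zip(sizes, times)) \
        / sum((x - mx) ** 2 for x in sizes)
    return a, my - a * mx

def schedule(sizes, model, workers):
    """Assign each task (largest predicted cost first) to the least-loaded worker."""
    a, b = model
    loads = [0.0] * workers
    assign = {w: [] for w in range(workers)}
    for s in sorted(sizes, reverse=True):
        w = loads.index(min(loads))      # least-loaded worker so far
        assign[w].append(s)
        loads[w] += a * s + b            # predicted cost of this task
    return assign, loads

model = fit_linear([1, 2, 3, 4], [10.0, 20.0, 30.0, 40.0])  # time = 10*size
assign, loads = schedule([4, 3, 2, 1], model, workers=2)
print(loads)  # [50.0, 50.0]: sizes 4+1 vs 3+2, perfectly balanced
```

The balance achieved is only as good as the model's predictions, which is why the paper measures prediction accuracy across program executions rather than within one run.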

  18. Materials processing in space programs tasks. [NASA research tasks

    NASA Technical Reports Server (NTRS)

    Pentecost, E.

    1981-01-01

    Active research tasks, as of the end of fiscal year 1981, of the Materials Processing in Space Program, NASA Office of Space and Terrestrial Applications, are summarized to provide an overview of the program scope for managers and scientists in industry, university, and government communities. The program, its history, strategy, and overall goal are described; the organizational structures and people involved are identified; and a list of recent publications is given for each research task. Four categories are used to group the tasks: Crystal Growth; Solidification of Metals, Alloys, and Composites; Fluids, Transports, and Chemical Processes; and Ultrahigh Vacuum and Containerless Processing Technologies. Some tasks are placed in more than one category to ensure complete coverage of each category.

  19. Ames Research Center SR&T program and earth observations

    NASA Technical Reports Server (NTRS)

    Poppoff, I. G.

    1972-01-01

    An overview is presented of the research activities in earth observations at Ames Research Center. Most of the tasks involve the use of research aircraft platforms. The program is also directed toward the use of the Illiac 4 computer for statistical analysis. Most tasks are weighted toward Pacific coast and Pacific basin problems with emphasis on water applications, air applications, animal migration studies, and geophysics.

  20. SOLID STATE ENERGY CONVERSION ALLIANCE DELPHI SOLID OXIDE FUEL CELL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steven Shaffer; Sean Kelly; Subhasish Mukerjee

    2004-05-07

    The objective of this project is to develop a 5 kW Solid Oxide Fuel Cell power system for a range of fuels and applications. During Phase I, the following will be accomplished: Develop and demonstrate technology transfer efforts on a 5 kW stationary distributed power generation system that incorporates steam reforming of natural gas with the option of piped-in water (Demonstration System A). Initiate development of a 5 kW system for later mass-market automotive auxiliary power unit application, which will incorporate Catalytic Partial Oxidation (CPO) reforming of gasoline, with anode exhaust gas injected into an ultra-lean burn internal combustion engine. This technical progress report covers work performed by Delphi from July 1, 2003 to December 31, 2003, under Department of Energy Cooperative Agreement DE-FC-02NT41246. This report highlights technical results of the work performed under the following tasks: Task 1 System Design and Integration; Task 2 Solid Oxide Fuel Cell Stack Developments; Task 3 Reformer Developments; Task 4 Development of Balance of Plant (BOP) Components; Task 5 Manufacturing Development (Privately Funded); Task 6 System Fabrication; Task 7 System Testing; Task 8 Program Management; Task 9 Stack Testing with Coal-Based Reformate; and Task 10 Technology Transfer from SECA CORE Technology Program. In this reporting period, unless otherwise noted, Task 6 (System Fabrication) and Task 7 (System Testing) are reported within Task 1 (System Design and Integration). Task 8 (Program Management), Task 9 (Stack Testing with Coal-Based Reformate), and Task 10 (Technology Transfer from SECA CORE Technology Program) are reported in the Executive Summary section of this report.

  1. Building Task-Oriented Applications: An Introduction to the Legion Programming Paradigm

    DTIC Science & Technology

    2015-02-01

    These domain definitions are validated prior to execution and represent logical regions that each task can access and manipulate as per the dictates of...Introducing Enzo, an AMR cosmology application, in adaptive mesh refinement - theory and applications. Chicago (IL): Springer Berlin Heidelberg; c2005. p

  2. The Partners in Prevention Program: The Evaluation and Evolution of the Task-Centered Case Management Model

    ERIC Educational Resources Information Center

    Colvin, Julanne; Lee, Mingun; Magnano, Julienne; Smith, Valerie

    2008-01-01

    This article reports on the further development of the task-centered model for difficulties in school performance. We used Bailey-Dempsey and Reid's (1996) application of Rothman and Thomas's (1994) design and development framework and annual evaluations of the Partners in Prevention (PIP) Program to refine the task-centered case management model.…

  3. Materials processing in space program tasks

    NASA Technical Reports Server (NTRS)

    Naumann, R. J. (Editor)

    1980-01-01

    The history, strategy, and overall goal of NASA's Office of Space and Terrestrial Applications program for materials processing in space are described as well as the organizational structures and personnel involved. An overview of each research task is presented and recent publications are listed.

  4. Natural Resources Management: Task Analyses. Competency-Based Education.

    ERIC Educational Resources Information Center

    James Madison Univ., Harrisonburg, VA.

    This task analysis guide is intended to help teachers and administrators develop instructional materials and implement competency-based education for natural resources management courses in the agricultural resources program. Section 1 contains a validated task inventory for natural resources management. For each task, applicable information…

  5. Evolution of a minimal parallel programming model

    DOE PAGES

    Lusk, Ewing; Butler, Ralph; Pieper, Steven C.

    2017-04-30

    Here, we take a historical approach to our presentation of self-scheduled task parallelism, a programming model with its origins in early irregular and nondeterministic computations encountered in automated theorem proving and logic programming. We show how an extremely simple task model has evolved into a system, asynchronous dynamic load balancing (ADLB), and a scalable implementation capable of supporting sophisticated applications on today’s (and tomorrow’s) largest supercomputers; and we illustrate the use of ADLB with a Green’s function Monte Carlo application, a modern, mature nuclear physics code in production use. Our lesson is that by surrendering a certain amount of generality and thus applicability, a minimal programming model (in terms of its basic concepts and the size of its application programmer interface) can achieve extreme scalability without introducing complexity.
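
The self-scheduled task model this abstract traces can be shown in miniature: idle workers pull the next available task from a shared pool, so load balances dynamically without any central assignment of tasks to workers. This is a conceptual analogue in Python threads, not the ADLB library's actual interface; the function names and the stand-in workload are invented.

```python
# Self-scheduled task parallelism in miniature: workers repeatedly pull
# tasks from a shared queue whenever they become idle. Conceptual analogue
# only; ADLB itself is an MPI-based C library with a different API.

import queue
import threading

def self_scheduled(tasks, nworkers):
    q = queue.Queue()
    for t in tasks:
        q.put(t)
    results, lock = [], threading.Lock()

    def worker():
        while True:
            try:
                t = q.get_nowait()   # pull the next available task
            except queue.Empty:
                return               # pool drained: worker retires
            r = t * t                # stand-in for real work
            with lock:
                results.append(r)

    threads = [threading.Thread(target=worker) for _ in range(nworkers)]
    for th in threads:
        th.start()
    for th in threads:
        th.join()
    return sorted(results)           # order is nondeterministic; sort to compare

print(self_scheduled(range(5), nworkers=3))  # [0, 1, 4, 9, 16]
```

The minimality the authors emphasize is visible even here: the entire programming model is "put tasks in, pull tasks out", which is what lets the real implementation scale while staying simple.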

  6. Programming distributed medical applications with XWCH2.

    PubMed

    Ben Belgacem, Mohamed; Niinimaki, Marko; Abdennadher, Nabil

    2010-01-01

    Many medical applications utilise distributed/parallel computing in order to cope with demands of large data or computing power requirements. In this paper, we present a new version of the XtremWeb-CH (XWCH) platform, and demonstrate two medical applications that run on XWCH. The platform is versatile in that it supports direct communication between tasks. When tasks cannot communicate directly, warehouses are used as intermediary nodes between "producer" and "consumer" tasks. New features have been developed to provide improved support for writing powerful distributed applications using an easy-to-use API.

  7. Biomedical applications engineering tasks

    NASA Technical Reports Server (NTRS)

    Laenger, C. J., Sr.

    1976-01-01

    The engineering tasks performed in response to needs articulated by clinicians are described. Initial contacts were made with these clinician-technology requestors by the Southwest Research Institute NASA Biomedical Applications Team. The basic purpose of the program was to effectively transfer aerospace technology into functional hardware to solve real biomedical problems.

  8. Horticulture III, IV, and V. Task Analyses. Competency-Based Education.

    ERIC Educational Resources Information Center

    Henrico County Public Schools, Glen Allen, VA. Virginia Vocational Curriculum Center.

    This task analysis guide is intended to help teachers and administrators develop instructional materials and implement competency-based education in the horticulture program. Section 1 contains a validated task inventory for horticulture III, IV, and V. For each task, applicable information pertaining to performance and enabling objectives,…

  9. Evolution of a minimal parallel programming model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lusk, Ewing; Butler, Ralph; Pieper, Steven C.

    Here, we take a historical approach to our presentation of self-scheduled task parallelism, a programming model with its origins in early irregular and nondeterministic computations encountered in automated theorem proving and logic programming. We show how an extremely simple task model has evolved into a system, asynchronous dynamic load balancing (ADLB), and a scalable implementation capable of supporting sophisticated applications on today’s (and tomorrow’s) largest supercomputers; and we illustrate the use of ADLB with a Green’s function Monte Carlo application, a modern, mature nuclear physics code in production use. Our lesson is that by surrendering a certain amount of generality and thus applicability, a minimal programming model (in terms of its basic concepts and the size of its application programmer interface) can achieve extreme scalability without introducing complexity.

  10. Checkout/demonstration application program for the SEL 840MP Multi-Processing Control System: Version 1 (MPCS/1)

    NASA Technical Reports Server (NTRS)

    Anderson, W. F.; Conway, J. R.; Keller, L. C.

    1972-01-01

    The characteristics of the application program developed to verify and demonstrate the SEL 840MP Multi-Processing Control System - Version 1 (MPCS/1) are described. The application program emphasizes the display support and task control capabilities. It is further intended to serve as an aid to familiarization with MPCS/1, and it complements the information provided in the MPCS/1 Users Guide, Volumes I and II.

  11. The Environment for Application Software Integration and Execution (EASIE) version 1.0. Volume 1: Executive overview

    NASA Technical Reports Server (NTRS)

    Rowell, Lawrence F.; Davis, John S.

    1989-01-01

    The Environment for Application Software Integration and Execution (EASIE) provides a methodology and a set of software utility programs to ease the task of coordinating engineering design and analysis codes. EASIE was designed to meet the needs of conceptual design engineers who face the task of integrating many stand-alone engineering analysis programs. Using EASIE, programs are integrated through a relational database management system. Volume 1, Executive Overview, gives an overview of the functions provided by EASIE and describes their use. Three operational design systems based upon the EASIE software are briefly described.

  12. Decision Aids Using Heterogeneous Intelligence Analysis

    DTIC Science & Technology

    2010-08-20

    developing a Geocultural service, a software framework and inferencing engine for the Transparent Urban Structures program. The scope of the effort...has evolved as the program has matured and is including multiple data sources, as well as interfaces out to the ONR architectural framework. Tasks...Interface; Application Program Interface; Application Programmer Interface CAF Common Application Framework EDA Event Driven Architecture

  13. Microgravity science and applications: Program tasks and bibliography for FY 1992

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This report is a compilation of the FY 1992 Principal Investigator program task descriptions funded by the Microgravity Science and Applications Division (MSAD), NASA Headquarters, Washington, DC. The document also provides a bibliography of FY 1992 publications and presentations cited by MSAD Principal Investigators, and an index of the Principal Investigators and their affiliations. The purpose of the document is to provide an overview and progress report of the funded tasks for scientists and researchers in industry, university, and government communities. The tasks are grouped into three categories appropriate to the type of research being done (space flight, ground-based, and advanced technology development) and by science discipline. The science disciplines are: biotechnology, combustion science, electronic materials, fluid physics, fundamental physics, glass and ceramics, metals and alloys, and protein crystal growth.

  14. Solar Energy Task Force Report: Technical Training Guidelines.

    ERIC Educational Resources Information Center

    O'Connor, Kevin

    This task force report offers guidelines and information for the development of vocational education programs oriented to the commercial application of solar energy in water and space heating. After Section I introduces the Solar Energy Task Force and its activities, Section II outlines the task force's objectives and raises several issues and…

  15. Application of a Curriculum Hierarchy Evaluation (CHE) Model to Sequentially Arranged Tasks.

    ERIC Educational Resources Information Center

    O'Malley, J. Michael

    A curriculum hierarchy evaluation (CHE) model was developed by combining a transfer paradigm with an aptitude-treatment-task interaction (ATTI) paradigm. Positive transfer was predicted between sequentially arranged tasks, and a programmed or nonprogrammed treatment was predicted to interact with aptitude and with tasks. Eighteen four and five…

  16. Student-Advising Recommendations from the Council of Residency Directors Student Advising Task Force.

    PubMed

    Hillman, Emily; Lutfy-Clayton, Lucienne; Desai, Sameer; Kellogg, Adam; Zhang, Xiao Chi; Hu, Kevin; Hess, Jamie

    2017-01-01

    Residency training in emergency medicine (EM) is highly sought after by U.S. allopathic medical school seniors; recently there has been a marked increase in the number of applications per student, raising costs for students and programs. Disseminating accurate advising information to applicants and programs could reduce excessive applying. Advising students applying to EM is a critical role for educators, clerkship directors, and program leaders (residency program director, associate and assistant program directors). A variety of advising resources is available through social media and individual organizations; however, currently there are no consensus recommendations that bridge these resources. The Council of Residency Directors (CORD) Student Advising Task Force (SATF) was initiated in 2013 to improve medical student advising. The SATF developed best-practice consensus recommendations and resources for student advising. Four documents (Medical Student Planner, EM Applicant's Frequently Asked Questions, EM Applying Guide, and EM Medical Student Advisor Resource List) were developed and are intended to support prospective applicants and their advisors. The recommendations are designed for the mid-range EM applicant and will need to be tailored to students' individual needs.

  17. Evaluation of the Trajectory Operations Applications Software Task (TOAST)

    NASA Technical Reports Server (NTRS)

    Perkins, Sharon; Martin, Andrea; Bavinger, Bill

    1990-01-01

    The Trajectory Operations Applications Software Task (TOAST) is a software development project under the auspices of the Mission Operations Directorate. Its purpose is to provide pre-mission and real-time trajectory operations support for the Space Shuttle program. As an Application Manager, TOAST provides an isolation layer between the underlying Unix operating system and the series of user programs. It provides two main services: a common interface to operating system functions with semantics appropriate for C or FORTRAN, and a structured input and output package that can be utilized by user application programs. In order to evaluate TOAST as an Application Manager, the task was to assess current and planned capabilities, compare those capabilities to functions available in commercial off-the-shelf (COTS) software, and survey Flight Analysis Design System (FADS) users regarding TOAST implementation. As a result of the investigation, it was found that the current version of TOAST is well implemented and meets the needs of the real-time users. The plans for migrating TOAST to the X Window System are essentially sound; the Executive will port with minor changes, while the Menu Handler will require a total rewrite. A series of recommendations for future TOAST directions is included.

  18. Integrated Task and Data Parallel Programming

    NASA Technical Reports Server (NTRS)

    Grimshaw, A. S.

    1998-01-01

    This research investigates the combination of task and data parallel language constructs within a single programming language. There are a number of applications that exhibit properties which would be well served by such an integrated language. Examples include global climate models, aircraft design problems, and multidisciplinary design optimization problems. Our approach incorporates data parallel language constructs into an existing, object-oriented, task parallel language. The language will support creation and manipulation of parallel classes and objects of both types (task parallel and data parallel). Ultimately, the language will allow data parallel and task parallel classes to be used either as building blocks or managers of parallel objects of either type, thus allowing the development of single and multi-paradigm parallel applications. 1995 Research Accomplishments: In February I presented a paper at Frontiers 1995 describing the design of the data parallel language subset. During the spring I wrote and defended my dissertation proposal. Since that time I have developed a runtime model for the language subset. I have begun implementing the model and hand-coding simple examples which demonstrate the language subset. I have identified an astrophysical fluid flow application which will validate the data parallel language subset. 1996 Research Agenda: Milestones for the coming year include implementing a significant portion of the data parallel language subset over the Legion system. Using simple hand-coded methods, I plan to demonstrate (1) concurrent task and data parallel objects and (2) task parallel objects managing both task and data parallel objects. My next steps will focus on constructing a compiler and implementing the fluid flow application with the language. Concurrently, I will conduct a search for a real-world application exhibiting both task and data parallelism within the same program.
Additional 1995 Activities During the fall I collaborated with Andrew Grimshaw and Adam Ferrari to write a book chapter which will be included in Parallel Processing in C++ edited by Gregory Wilson. I also finished two courses, Compilers and Advanced Compilers, in 1995. These courses complete my class requirements at the University of Virginia. I have only my dissertation research and defense to complete.
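A minimal sketch of the integration described above, using Python's standard thread pool rather than the Legion-based language itself (the two toy workloads and all names are invented for illustration): one task-parallel level launches two distinct disciplines concurrently, and one of those tasks is itself data parallel over chunks of its input.

```python
from concurrent.futures import ThreadPoolExecutor

def data_parallel_sum(values, workers=4):
    """Data parallelism: one reduction applied to chunks of a single dataset."""
    chunk = max(1, len(values) // workers)
    parts = [values[i:i + chunk] for i in range(0, len(values), chunk)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(sum, parts))

def simulate_climate(grid):
    # A task that contains a data-parallel step.
    return data_parallel_sum(grid)

def optimize_airframe(params):
    # A second, unrelated task running concurrently with the first.
    return min(params)

def run_multidisciplinary(grid, params):
    """Task parallelism: two distinct computations launched side by side."""
    with ThreadPoolExecutor(max_workers=2) as pool:
        climate = pool.submit(simulate_climate, grid)
        design = pool.submit(optimize_airframe, params)
        return climate.result(), design.result()
```

A task-parallel object here "manages" a data-parallel computation, mirroring the building-block composition the abstract proposes.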

  19. Integrated Task And Data Parallel Programming: Language Design

    NASA Technical Reports Server (NTRS)

    Grimshaw, Andrew S.; West, Emily A.

    1998-01-01

    This research investigates the combination of task and data parallel language constructs within a single programming language. There are a number of applications that exhibit properties which would be well served by such an integrated language. Examples include global climate models, aircraft design problems, and multidisciplinary design optimization problems. Our approach incorporates data parallel language constructs into an existing, object-oriented, task parallel language. The language will support creation and manipulation of parallel classes and objects of both types (task parallel and data parallel). Ultimately, the language will allow data parallel and task parallel classes to be used either as building blocks or managers of parallel objects of either type, thus allowing the development of single and multi-paradigm parallel applications. 1995 Research Accomplishments: In February I presented a paper at Frontiers '95 describing the design of the data parallel language subset. During the spring I wrote and defended my dissertation proposal. Since that time I have developed a runtime model for the language subset. I have begun implementing the model and hand-coding simple examples which demonstrate the language subset. I have identified an astrophysical fluid flow application which will validate the data parallel language subset. 1996 Research Agenda: Milestones for the coming year include implementing a significant portion of the data parallel language subset over the Legion system. Using simple hand-coded methods, I plan to demonstrate (1) concurrent task and data parallel objects and (2) task parallel objects managing both task and data parallel objects. My next steps will focus on constructing a compiler and implementing the fluid flow application with the language. Concurrently, I will conduct a search for a real-world application exhibiting both task and data parallelism within the same program.
Additional 1995 Activities During the fall I collaborated with Andrew Grimshaw and Adam Ferrari to write a book chapter which will be included in Parallel Processing in C++ edited by Gregory Wilson. I also finished two courses, Compilers and Advanced Compilers, in 1995. These courses complete my class requirements at the University of Virginia. I have only my dissertation research and defense to complete.

  20. An enhanced Ada run-time system for real-time embedded processors

    NASA Technical Reports Server (NTRS)

    Sims, J. T.

    1991-01-01

    An enhanced Ada run-time system has been developed to support real-time embedded processor applications. The primary focus of this development effort has been on the tasking system and the memory management facilities of the run-time system. The tasking system has been extended to support efficient and precise periodic task execution as required for control applications. Event-driven task execution providing a means of task-asynchronous control and communication among Ada tasks is supported in this system. Inter-task control is even provided among tasks distributed on separate physical processors. The memory management system has been enhanced to provide object allocation and protected access support for memory shared between disjoint processors, each of which is executing a distinct Ada program.
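The precise periodic task execution described above can be illustrated with a small simulation (in Python rather than Ada, and as a toy model, not the enhanced run-time system itself): each release stays anchored to start + k*period, so a job that overruns delays its successor's start but never shifts the schedule.

```python
def schedule(period, start, costs):
    """Simulate drift-free periodic execution. `costs` gives the run time of
    each successive job. Returns (release, begin, finish) per job: releases
    are fixed at start + k*period; a late-finishing job pushes back when the
    next job begins, but not when it is released."""
    timeline, finish = [], start
    for k, cost in enumerate(costs):
        release = start + k * period      # fixed release time, no drift
        begin = max(release, finish)      # wait if the previous job overran
        finish = begin + cost
        timeline.append((release, begin, finish))
    return timeline
```

With a period of 10, a 12-unit job released at t=10 delays the t=20 job's start to t=22, yet the t=20 release itself is unchanged.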

  1. Design, fabrication and acceptance testing of a zero gravity whole body shower

    NASA Technical Reports Server (NTRS)

    Schumacher, E. A.; Lenda, J. A.

    1974-01-01

    Recent research and development programs have established the ability of the zero gravity whole body shower to maintain a comfortable environment in which the crewman can safely cleanse and dry the body. The purpose of this program was to further advance the technology of whole body bathing and to demonstrate technological readiness including in-flight maintenance by component replacement for flight applications. Three task efforts of this program are discussed. Conceptual designs and system tradeoffs were accomplished in task 1. Task 2 involved the formulation of preliminary and final designs for the shower, while task 3 included the fabrication and test of the shower assembly. Particular attention is paid to the evaluation and correction of test anomalies during the final phase of the program.

  2. Robot Task Commander with Extensible Programming Environment

    NASA Technical Reports Server (NTRS)

    Hart, Stephen W (Inventor); Wightman, Brian J (Inventor); Dinh, Duy Paul (Inventor); Yamokoski, John D. (Inventor); Gooding, Dustin R (Inventor)

    2014-01-01

    A system for developing distributed robot application-level software includes a robot having an associated control module which controls motion of the robot in response to a commanded task, and a robot task commander (RTC) in networked communication with the control module over a network transport layer (NTL). The RTC includes a script engine(s) and a GUI, with a processor and a centralized library of library blocks constructed from an interpretive computer programming code and having input and output connections. The GUI provides access to a Visual Programming Language (VPL) environment and a text editor. In executing a method, the VPL is opened, a task for the robot is built from the code library blocks, and data is assigned to input and output connections identifying input and output data for each block. A task sequence(s) is sent to the control module(s) over the NTL to command execution of the task.
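A toy sketch of the block model the patent abstract describes, assuming (hypothetically) that each library block exposes named input connections and publishes one output into a shared data store; the two blocks and their data below are invented for illustration, not the RTC's actual code library.

```python
class Block:
    """A library block with a name, a function, and named input connections."""
    def __init__(self, name, fn, inputs):
        self.name, self.fn, self.inputs = name, fn, inputs

def run_sequence(blocks, initial):
    """Execute a task sequence: each block reads its inputs from a shared
    store and writes its result back under its own name."""
    store = dict(initial)
    for block in blocks:
        store[block.name] = block.fn(*[store[i] for i in block.inputs])
    return store

# Hypothetical task: move to a target pose, then grasp there.
seq = [
    Block("pose", lambda tgt: ("moved_to", tgt), ["target"]),
    Block("grip", lambda pose: f"grasp at {pose[1]}", ["pose"]),
]
result = run_sequence(seq, {"target": (0.4, 0.1, 0.2)})
```

In the patented system such a sequence would be dispatched to the robot's control module over the network transport layer rather than executed locally.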

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chu, C.I.C.; Gillespie, B.L.

    One of the most perplexing problems facing the coal industry is how to properly dispose of its waste, or even recover a small fraction of the waste's Btu value, while minimizing environmental concerns. UCC Research considers this monumental environmental problem an opportunity to recover useable organic materials and reduce the environmental problems created by coal waste. Mild gasification is the method used by UCC Research to realize these objectives. Coal feedstocks are fed into the mild gasification system, yielding liquids, char, and gases for commercial application. The program consists of seven tasks: Task 1, Characterize Management of Coal Preparation Wastes; Task 2, Review Design Specifications and Prepare Preliminary Test Plan; Task 3, Select and Characterize Test Feedstocks; Task 4, Acquire/Construct Process Elements; Task 5, Prepare Final Test Plan; Task 6, Implement Final Test Plan; Task 7, Analyze Test Results and Assess System Economics. A schedule of the program is given. The program was initiated on September 30, 1984. Tasks 1, 2, 3, 4, 5, and 6 have been completed. Work is continuing on Task 7.

  4. Materials processing in space program tasks

    NASA Technical Reports Server (NTRS)

    Pentecost, E. (Compiler)

    1982-01-01

    Active research areas as of the end of the fiscal year 1982 of the Materials Processing in Space Program, NASA-Office of Space and Terrestrial Applications, involving several NASA centers and other organizations are highlighted to provide an overview of the program scope for managers and scientists in industry, university, and government communities. The program is described as well as its history, strategy and overall goal; the organizational structures and people involved are identified and each research task is described together with a list of recent publications. The tasks are grouped into four categories: crystal growth; solidification of metals, alloys, and composites; fluids, transports, and chemical processes; and ultrahigh vacuum and containerless processing technologies.

  5. Student-Advising Recommendations from the Council of Residency Directors Student Advising Task Force

    PubMed Central

    Hillman, Emily; Lutfy-Clayton, Lucienne; Desai, Sameer; Kellogg, Adam; Zhang, Xiao Chi; Hu, Kevin; Hess, Jamie

    2017-01-01

    Residency training in emergency medicine (EM) is highly sought after by U.S. allopathic medical school seniors; recently there has been a marked increase in the number of applications per student, raising costs for students and programs. Disseminating accurate advising information to applicants and programs could reduce excessive applying. Advising students applying to EM is a critical role for educators, clerkship directors, and program leaders (residency program director, associate and assistant program directors). A variety of advising resources is available through social media and individual organizations; however, currently there are no consensus recommendations that bridge these resources. The Council of Residency Directors (CORD) Student Advising Task Force (SATF) was initiated in 2013 to improve medical student advising. The SATF developed best-practice consensus recommendations and resources for student advising. Four documents (Medical Student Planner, EM Applicant’s Frequently Asked Questions, EM Applying Guide, and EM Medical Student Advisor Resource List) were developed and are intended to support prospective applicants and their advisors. The recommendations are designed for the mid-range EM applicant and will need to be tailored to students’ individual needs. PMID:28116016

  6. The Canonical Robot Command Language (CRCL).

    PubMed

    Proctor, Frederick M; Balakirsky, Stephen B; Kootbally, Zeid; Kramer, Thomas R; Schlenoff, Craig I; Shackleford, William P

    2016-01-01

    Industrial robots can perform motion with sub-millimeter repeatability when programmed using the teach-and-playback method. While effective, this method requires significant up-front time, tying up the robot and a person during the teaching phase. Off-line programming can be used to generate robot programs, but the accuracy of this method is poor unless supplemented with good calibration to remove systematic errors, feed-forward models to anticipate robot response to loads, and sensing to compensate for unmodeled errors. These increase the complexity and up-front cost of the system, but the payback in the reduction of recurring teach programming time can be worth the effort. This payback especially benefits small-batch, short-turnaround applications typical of small-to-medium enterprises, who need the agility afforded by off-line application development to be competitive against low-cost manual labor. To fully benefit from this agile application tasking model, a common representation of tasks should be used that is understood by all of the resources required for the job: robots, tooling, sensors, and people. This paper describes an information model, the Canonical Robot Command Language (CRCL), which provides a high-level description of robot tasks and associated control and status information.
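CRCL commands are exchanged as XML; as a rough illustration of building such a message programmatically (the element and attribute names below are simplified placeholders, not the exact CRCL schema), a move command might be assembled like this:

```python
import xml.etree.ElementTree as ET

def move_to(command_id, x, y, z):
    """Build an illustrative CRCL-style move command as an XML string.
    Tag names are hypothetical stand-ins for the real schema."""
    cmd = ET.Element("MoveTo", CommandID=str(command_id))
    point = ET.SubElement(cmd, "Point")
    for tag, val in (("X", x), ("Y", y), ("Z", z)):
        ET.SubElement(point, tag).text = str(val)
    return ET.tostring(cmd, encoding="unicode")

msg = move_to(1, 0.5, 0.25, 0.1)
```

Because the message is plain structured text, the same task description can be consumed by robots, simulators, and logging tools alike, which is the interoperability point the paper makes.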

  7. The Canonical Robot Command Language (CRCL)

    PubMed Central

    Proctor, Frederick M.; Balakirsky, Stephen B.; Kootbally, Zeid; Kramer, Thomas R.; Schlenoff, Craig I.; Shackleford, William P.

    2017-01-01

    Industrial robots can perform motion with sub-millimeter repeatability when programmed using the teach-and-playback method. While effective, this method requires significant up-front time, tying up the robot and a person during the teaching phase. Off-line programming can be used to generate robot programs, but the accuracy of this method is poor unless supplemented with good calibration to remove systematic errors, feed-forward models to anticipate robot response to loads, and sensing to compensate for unmodeled errors. These increase the complexity and up-front cost of the system, but the payback in the reduction of recurring teach programming time can be worth the effort. This payback especially benefits small-batch, short-turnaround applications typical of small-to-medium enterprises, who need the agility afforded by off-line application development to be competitive against low-cost manual labor. To fully benefit from this agile application tasking model, a common representation of tasks should be used that is understood by all of the resources required for the job: robots, tooling, sensors, and people. This paper describes an information model, the Canonical Robot Command Language (CRCL), which provides a high-level description of robot tasks and associated control and status information. PMID:28529393

  8. Lapin Data Interchange Among Database, Analysis and Display Programs Using XML-Based Text Files

    NASA Technical Reports Server (NTRS)

    2005-01-01

    The purpose of grant NCC3-966 was to investigate and evaluate the interchange of application-specific data among multiple programs, each carrying out part of the analysis and design task. This has been carried out previously by creating a custom program to read data produced by one application and then write that data to a file whose format is specific to the second application that needs all or part of that data. In this investigation, data of interest is described using the XML markup language, which allows the data to be stored in a text string. Software to transform the output data of a task into an XML string, and software to read an XML string and extract all or a portion of the data needed by another application, is used to link two independent applications together as part of an overall design effort. This approach was initially used with a standard analysis program, Lapin, along with standard applications: a spreadsheet program, a relational database program, and a conventional dialog and display program, to demonstrate the successful sharing of data among independent programs. Most of the effort beyond that demonstration has been concentrated on the inclusion of more complex display programs. Specifically, a custom-written windowing program organized around dialogs to control the interactions has been combined with an independent CAD program (Open Cascade) that supports sophisticated display of CAD elements such as lines, spline curves, and surfaces, and turbine-blade data produced by an independent blade design program (UD0300).
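A minimal sketch of the XML-string interchange pattern described above (the blade fields are invented for illustration; real blade-design output is far richer): one task serializes its output into a text string, and a second application extracts only the subset of fields it needs.

```python
import ast
import xml.etree.ElementTree as ET

def to_xml(name, data):
    """Serialize a flat dict of task output into an XML text string."""
    root = ET.Element(name)
    for key, value in data.items():
        ET.SubElement(root, key).text = repr(value)
    return ET.tostring(root, encoding="unicode")

def from_xml(text, keys=None):
    """Read back every field, or only the subset another application needs."""
    root = ET.fromstring(text)
    wanted = keys if keys is not None else [child.tag for child in root]
    return {k: ast.literal_eval(root.find(k).text) for k in wanted}

blade = {"chord": 0.12, "twist": 31.5, "sections": 9}
text = to_xml("blade", blade)
```

The text string can move between independent programs through files, pipes, or a network without either program knowing the other's native file format.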

  9. Biomedical application in space, pilot program in the southern California region

    NASA Technical Reports Server (NTRS)

    Kelton, A. A.

    1979-01-01

    A pilot program is presented which was to promote utilization of the Shuttle/Spacelab for medical and biological research applied to terrestrial needs. The program was limited to the Southern California region and consisted of the following five tasks: (1) preparation of educational materials; (2) identification of principal investigators; (3) initial contact and visit; (4)development of promising applications; and (5) evaluation of regional program methodology.

  10. Baltimore applications project

    NASA Technical Reports Server (NTRS)

    Golden, T. S.; Yaffee, P.

    1978-01-01

    The Baltimore Applications Project (BAP) was originally designed as an experimental effort to assist the government of the City of Baltimore in applying technology to the solution of municipal problems. Recent modifications in the structuring and operation of the program are discussed. A tabular update on the individual tasks undertaken and their treatment is provided. Details of energy and nonenergy related tasks are presented in appendices.

  11. Toward a Systematic Approach for Selection of NASA Technology Portfolios

    NASA Technical Reports Server (NTRS)

    Weisbin, Charles R.; Rodriguez, Guillermo; Elfes, Alberto; Smith, Jeffrey H.

    2004-01-01

    There is an important need for a consistent analytical foundation supporting the selection and monitoring of R&D tasks that support new system concepts that enable future NASA missions. This capability should be applicable at various degrees of abstraction, depending upon whether one is interested in formulation, development, or operations. It should also be applicable to a single project, a program comprised of a group of projects, an enterprise typically including multiple programs, and the overall agency itself. Emphasis here is on technology selection and new initiatives, but the same approach can be generalized to other applications, dealing, for example, with new system architectures, risk reduction, and task allocation among humans and machines. The purpose of this paper is to describe one such approach, which is in its early stages of implementation within NASA programs, and to discuss several illustrative examples.

  12. Management Auditing. Evaluation of the Marine Corps Task Analysis Program. Technical Report No. 5.

    ERIC Educational Resources Information Center

    Hemphill, John M., Jr.; Yoder, Dale

    The management audit is described for possible application as an extension of the mission of the Office of Manpower Utilization (OMU) of the U.S. Marine Corps. The present mission of OMU is viewed as a manpower research program to conduct task analysis of Marine Corps occupational fields. Purpose of the analyses is to improve the functional areas…

  13. Lapin Data Interchange Among Database, Analysis and Display Programs Using XML-Based Text Files

    NASA Technical Reports Server (NTRS)

    2004-01-01

    The purpose was to investigate and evaluate the interchange of application-specific data among multiple programs, each carrying out part of the analysis and design task. This has been carried out previously by creating a custom program to read data produced by one application and then write that data to a file whose format is specific to the second application that needs all or part of that data. In this investigation, data of interest is described using the XML markup language, which allows the data to be stored in a text string. Software to transform the output data of a task into an XML string, and software to read an XML string and extract all or a portion of the data needed by another application, is used to link two independent applications together as part of an overall design effort. This approach was initially used with a standard analysis program, Lapin, along with standard applications: a spreadsheet program, a relational database program, and a conventional dialog and display program, to demonstrate the successful sharing of data among independent programs. See Engineering Analysis Using a Web-Based Protocol by J.D. Schoeffler and R.W. Claus, NASA TM-2002-211981, October 2002. Most of the effort beyond that demonstration has been concentrated on the inclusion of more complex display programs. Specifically, a custom-written windowing program organized around dialogs to control the interactions has been combined with an independent CAD program (Open Cascade) that supports sophisticated display of CAD elements such as lines, spline curves, and surfaces, and turbine-blade data produced by an independent blade design program (UD0300).

  14. Terrestrial applications of NASA space telerobotics technologies

    NASA Technical Reports Server (NTRS)

    Lavery, Dave

    1994-01-01

    In 1985 the National Aeronautics and Space Administration (NASA) instituted a research program in telerobotics to develop and provide the technology for applications of telerobotics to the United States space program. The activities of the program are intended to most effectively utilize limited astronaut time by facilitating tasks such as inspection, assembly, repair, and servicing, as well as providing extended capability for remotely conducting planetary surface operations. As the program matured, it also developed a strong heritage of working with government and industry to directly transfer the developed technology into industrial applications.

  15. Life Sciences Program Tasks and Bibliography

    NASA Technical Reports Server (NTRS)

    1996-01-01

    This document includes information on all peer reviewed projects funded by the Office of Life and Microgravity Sciences and Applications, Life Sciences Division during fiscal year 1995. Additionally, this inaugural edition of the Task Book includes information for FY 1994 programs. This document will be published annually and made available to scientists in the space life sciences field both as a hard copy and as an interactive Internet web page.

  16. Active microwave remote sensing research program plan. Recommendations of the Earth Resources Synthetic Aperture Radar Task Force. [application areas: vegetation canopies, surface water, surface morphology, rocks and soils, and man-made structures

    NASA Technical Reports Server (NTRS)

    1980-01-01

    A research program plan developed by the Office of Space and Terrestrial Applications to provide guidelines for a concentrated effort to improve the understanding of the measurement capabilities of active microwave imaging sensors, and to define the role of such sensors in future Earth observations programs is outlined. The focus of the planned activities is on renewable and non-renewable resources. Five general application areas are addressed: (1) vegetation canopies, (2) surface water, (3) surface morphology, (4) rocks and soils, and (5) man-made structures. Research tasks are described which, when accomplished, will clearly establish the measurement capabilities in each area, and provide the theoretical and empirical results needed to specify and justify satellite systems using imaging radar sensors for global observations.

  17. COMP Superscalar, an interoperable programming framework

    NASA Astrophysics Data System (ADS)

    Badia, Rosa M.; Conejero, Javier; Diaz, Carlos; Ejarque, Jorge; Lezzi, Daniele; Lordan, Francesc; Ramon-Cortes, Cristian; Sirvent, Raul

    2015-12-01

    COMPSs is a programming framework that aims to facilitate the parallelization of existing applications written in Java, C/C++ and Python scripts. For that purpose, it offers a simple programming model based on sequential development in which the user is mainly responsible for (i) identifying the functions to be executed as asynchronous parallel tasks and (ii) marking them with annotations or standard Python decorators. A runtime system is in charge of exploiting the inherent concurrency of the code, automatically detecting and enforcing the data dependencies between tasks and spawning these tasks to the available resources, which can be nodes in a cluster, clouds or grids. In cloud environments, COMPSs provides scalability and elasticity features allowing the dynamic provision of resources.
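The decorator-based programming model can be sketched with a toy stand-in (this is not the real PyCOMPSs API or runtime, only an illustration of the idea): calling a decorated function spawns it asynchronously and returns a future, and collecting the results acts as the synchronization point.

```python
from concurrent.futures import ThreadPoolExecutor

_pool = ThreadPoolExecutor(max_workers=4)

def task(fn):
    """Toy stand-in for a COMPSs-style task decorator: a call submits the
    function to the runtime's worker pool and returns a future immediately,
    so the main script keeps its sequential shape."""
    def wrapper(*args):
        return _pool.submit(fn, *args)
    return wrapper

@task
def square(x):
    return x * x

futures = [square(i) for i in range(5)]   # tasks spawned asynchronously
results = [f.result() for f in futures]   # synchronization point
```

The real runtime goes further, tracking data dependencies between tasks and scheduling them across cluster, cloud, or grid resources rather than local threads.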

  18. Determination of Tasks Required by Graduates of Manufacturing Engineering Technology Programs.

    ERIC Educational Resources Information Center

    Zirbel, Jay H.

    1993-01-01

    A Delphi panel of 14 experts identified 37 tasks performed by/qualities needed by manufacturing engineering technologists. Most important were work ethic, performance quality, communication skills, teamwork, computer applications, manufacturing basics, materials knowledge, troubleshooting, supervision, and global issues. (SK)

  19. Increasing self-regulatory energy using an Internet-based training application delivered by smartphone technology.

    PubMed

    Cranwell, Jo; Benford, Steve; Houghton, Robert J; Golembewski, Michael; Fischer, Joel E; Hagger, Martin S

    2014-03-01

    Self-control resources can be defined in terms of "energy." Repeated attempts to override desires and impulses can result in a state of reduced self-control energy termed "ego depletion" leading to a reduced capacity to regulate future self-control behaviors effectively. Regular practice or "training" on self-control tasks may improve an individual's capacity to overcome ego depletion effectively. The current research tested the effectiveness of training using a novel Internet-based smartphone application to improve self-control and reduce ego depletion. In two experiments, participants were randomly assigned to either an experimental group, which received a daily program of self-control training using a modified Stroop-task Internet-based application delivered via smartphone to participants over a 4-week period, or a no-training control group. Participants assigned to the experimental group performed significantly better on post-training laboratory self-control tasks relative to participants in the control group. Findings support the hypothesized training effect on self-control and highlight the effectiveness of a novel Internet-based application delivered by smartphone as a practical means to administer and monitor a self-control training program. The smartphone training application has considerable advantages over other means to train self-control adopted in previous studies in that it has increased ecological validity and enables effective monitoring of compliance with the training program.

  20. Increasing Self-Regulatory Energy Using an Internet-Based Training Application Delivered by Smartphone Technology

    PubMed Central

    Benford, Steve; Houghton, Robert J.; Golembewski, Michael; Fischer, Joel E.; Hagger, Martin S.

    2014-01-01

    Self-control resources can be defined in terms of “energy.” Repeated attempts to override desires and impulses can result in a state of reduced self-control energy termed “ego depletion” leading to a reduced capacity to regulate future self-control behaviors effectively. Regular practice or “training” on self-control tasks may improve an individual's capacity to overcome ego depletion effectively. The current research tested the effectiveness of training using a novel Internet-based smartphone application to improve self-control and reduce ego depletion. In two experiments, participants were randomly assigned to either an experimental group, which received a daily program of self-control training using a modified Stroop-task Internet-based application delivered via smartphone to participants over a 4-week period, or a no-training control group. Participants assigned to the experimental group performed significantly better on post-training laboratory self-control tasks relative to participants in the control group. Findings support the hypothesized training effect on self-control and highlight the effectiveness of a novel Internet-based application delivered by smartphone as a practical means to administer and monitor a self-control training program. The smartphone training application has considerable advantages over other means to train self-control adopted in previous studies in that it has increased ecological validity and enables effective monitoring of compliance with the training program. PMID:24015984

  1. NBS (National Bureau of Standards): Materials measurements

    NASA Technical Reports Server (NTRS)

    Manning, J. R.

    1985-01-01

    NBS work for NASA in support of NASA's Microgravity Science and Applications Program under NASA Government Order H-27954B (Properties of Electronic Materials), covering the period April 1, 1984 to March 31, 1985, is described. The work was carried out in three independent tasks: Task 1--Surface Tensions and Their Variations with Temperature and Impurities; Task 2--Convection during Unidirectional Solidification; Task 3--Measurement of High Temperature Thermodynamic Properties. The results for each task are given separately in the body of the report.

  2. An Empirical Determination of Tasks Essential to Successful Performance as a Chemical Applicator. Determination of a Common Core of Basic Skills in Agribusiness and Natural Resources.

    ERIC Educational Resources Information Center

    Miller, Daniel R.; And Others

    To improve vocational educational programs in agriculture, occupational information on a common core of basic skills within the occupational area of the chemical applicator is presented in the revised task inventory survey. The purpose of the occupational survey was to identify a common core of basic skills which are performed and are essential…

  3. A wirelessly programmable actuation and sensing system for structural health monitoring

    NASA Astrophysics Data System (ADS)

    Long, James; Büyüköztürk, Oral

    2016-04-01

    Wireless sensor networks promise to deliver low-cost, low-power, and massively distributed systems for structural health monitoring. A key component of these systems, particularly when sampling rates are high, is the capability to process data within the network. Although progress has been made towards this vision, it remains a difficult task to develop and program 'smart' wireless sensing applications. In this paper we present a system which allows data acquisition and computational tasks to be specified in Python, a high-level programming language, and executed within the sensor network. Key features of this system include the ability to execute custom application code without firmware updates, to run multiple users' requests concurrently, and to conserve power through adjustable sleep settings. Specific examples of sensor node tasks are given to demonstrate the features of this system in the context of structural health monitoring. The system comprises individual firmware for nodes in the wireless sensor network, and a gateway server and web application through which users can remotely submit their requests.
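    The abstract does not show the node-side API, but the kind of in-network computation it describes can be sketched in plain Python: a task that reduces a window of acceleration samples to a single dominant-frequency estimate before transmission. The sampling rate, synthetic signal, and function name below are illustrative assumptions, not taken from the paper.

    ```python
    import math

    def dominant_frequency(samples, fs):
        """Return the frequency (Hz) of the strongest DFT bin.

        A pure-Python DFT keeps the sketch dependency-free; a real
        sensor node would use an optimized FFT routine.
        """
        n = len(samples)
        best_k, best_mag = 0, 0.0
        for k in range(1, n // 2):  # skip the DC bin
            re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
            im = sum(-s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
            mag = math.hypot(re, im)
            if mag > best_mag:
                best_k, best_mag = k, mag
        return best_k * fs / n

    # Simulated acquisition: a 5 Hz structural vibration sampled at 100 Hz.
    fs = 100.0
    samples = [math.sin(2 * math.pi * 5.0 * i / fs) for i in range(200)]
    print(dominant_frequency(samples, fs))  # 5.0 for this synthetic signal
    ```

    Shipping one number instead of 200 raw samples is the point of processing "within the network" when sampling rates are high.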

  4. Efficient parallel architecture for highly coupled real-time linear system applications

    NASA Technical Reports Server (NTRS)

    Carroll, Chester C.; Homaifar, Abdollah; Barua, Soumavo

    1988-01-01

    A systematic procedure is developed for exploiting the parallel constructs of computation in a highly coupled, linear system application. An overall top-down design approach is adopted. Differential equations governing the application under consideration are partitioned into subtasks on the basis of a data flow analysis. The interconnected task units constitute a task graph which has to be computed in every update interval. Multiprocessing concepts utilizing parallel integration algorithms are then applied for efficient task graph execution. A simple scheduling routine is developed to handle task allocation while in the multiprocessor mode. Results of simulation and scheduling are compared on the basis of standard performance indices. Processor timing diagrams are developed on the basis of program output accruing to an optimal set of processors. Basic architectural attributes for implementing the system are discussed together with suggestions for processing element design. Emphasis is placed on flexible architectures capable of accommodating widely varying application specifics.
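    The "simple scheduling routine" itself is not given in the abstract; one common choice for allocating a task graph across processors each update interval is greedy list scheduling, sketched here with an invented four-task diamond graph and invented costs.

    ```python
    def schedule(tasks, deps, cost, n_proc):
        """Greedy list scheduler: walk tasks in topological order and
        assign each to the processor that can start it earliest.

        tasks -- task names in a valid topological order
        deps  -- dict task -> list of predecessor tasks
        cost  -- dict task -> execution time
        Returns {task: (processor, start, finish)} and the makespan.
        """
        proc_free = [0.0] * n_proc          # next free time per processor
        finish = {}                         # task -> finish time
        placement = {}
        for t in tasks:
            # a task is ready only after all its predecessors finish
            ready = max((finish[p] for p in deps.get(t, [])), default=0.0)
            proc = min(range(n_proc), key=lambda i: max(proc_free[i], ready))
            start = max(proc_free[proc], ready)
            proc_free[proc] = start + cost[t]
            finish[t] = proc_free[proc]
            placement[t] = (proc, start, finish[t])
        return placement, max(finish.values())

    # Diamond-shaped task graph computed once per update interval.
    deps = {"B": ["A"], "C": ["A"], "D": ["B", "C"]}
    cost = {"A": 1.0, "B": 2.0, "C": 2.0, "D": 1.0}
    placement, makespan = schedule(["A", "B", "C", "D"], deps, cost, n_proc=2)
    print(makespan)  # 4.0: B and C run in parallel after A
    ```

    The makespan gives the update interval the multiprocessor can sustain; processor timing diagrams like those mentioned in the abstract are read directly off the (processor, start, finish) triples.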

  5. Connected Vehicle Pilot Deployment Program phase 1 : application deployment plan : New York City : final application deployment plan.

    DOT National Transportation Integrated Search

    2016-08-04

    This document is the Task 7 Application Deployment Plan deliverable for the New York City Connected Vehicle Pilot Deployment. It describes the process that the deployment team will follow to acquire and test the connected vehicle safety applications....

  6. Logistics and Sampling Plan for Task 2: 1979-1980 IRS Comparison Study. Quality Control Analysis of Selected Aspects of Programs Administered by the Bureau of Student Financial Assistance.

    ERIC Educational Resources Information Center

    Walker, Gail; Kuchak, JoAnn

    The type, number, and scope of errors on Basic Educational Opportunity Grant (BEOG) program applications were estimated in a replication of a 1976-1977 Internal Revenue Service (IRS) Comparison Study. Information reported on BEOG applicants and IRS income tax returns was compared for various categories of applicants. The study provides information…

  7. Evaluation of the ion implantation process for production of solar cells from silicon sheet materials

    NASA Technical Reports Server (NTRS)

    Spitzer, M. B.

    1983-01-01

    The objective of this program is the investigation and evaluation of the capabilities of the ion implantation process for the production of photovoltaic cells from a variety of present-day, state-of-the-art, low-cost silicon sheet materials. Task 1 of the program concerns application of ion implantation and furnace annealing to fabrication of cells made from dendritic web silicon. Task 2 comprises the application of ion implantation and pulsed electron beam annealing (PEBA) to cells made from SEMIX, SILSO, heat-exchanger-method (HEM), edge-defined film-fed growth (EFG) and Czochralski (CZ) silicon. The goals of Task 1 comprise an investigation of implantation and anneal processes applied to dendritic web. A further goal is the evaluation of surface passivation and back surface reflector formation. In this way, processes yielding the very highest efficiency can be evaluated. Task 2 seeks to evaluate the use of PEBA for various sheet materials. A comparison of PEBA to thermal annealing will be made for a variety of ion implantation processes.

  8. WAG 2 remedial investigation and site investigation site-specific work plan/health and safety checklist for the soil and sediment task. Environmental Restoration Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holt, V.L.; Burgoa, B.B.

    1993-12-01

    This document is a site-specific work plan/health and safety checklist (WP/HSC) for a task of the Waste Area Grouping 2 Remedial Investigation and Site Investigation (WAG 2 RI&SI). Title 29 CFR Part 1910.120 requires that a health and safety program plan that includes site- and task-specific information be completed to ensure conformance with health- and safety-related requirements. To meet this requirement, the health and safety program plan for each WAG 2 RI&SI field task must include (1) the general health and safety program plan for all WAG 2 RI&SI field activities and (2) a WP/HSC for that particular field task. These two components, along with all applicable referenced procedures, must be kept together at the work site and distributed to field personnel as required. The general health and safety program plan is the Health and Safety Plan for the Remedial Investigation and Site Investigation of Waste Area Grouping 2 at the Oak Ridge National Laboratory, Oak Ridge, Tennessee (ORNL/ER-169). The WP/HSCs are being issued as supplements to ORNL/ER-169.

  9. Low cost program practices for future NASA space programs, volume 1

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The progress and outcomes of a NASA/HQ in-depth analysis of NASA program practices are documented. Included is a survey of NASA and industry reaction to the utility and application of a Program Effects Relationship Handbook. The results and outcomes of all study tasks are presented as engineering memoranda in the appendix.

  10. Machine learning in motion control

    NASA Technical Reports Server (NTRS)

    Su, Renjeng; Kermiche, Noureddine

    1989-01-01

    The existing methodologies for robot programming originate primarily from robotic applications to manufacturing, where uncertainties of the robots and their task environment may be minimized by repeated off-line modeling and identification. In space applications of robots, however, a higher degree of automation is required for robot programming because of the desire to minimize human intervention. We discuss a new paradigm of robot programming which is based on the concept of machine learning. The goal is to let robots practice tasks by themselves, with the operational data used to automatically improve their motion performance. The underlying mathematical problem is to solve the dynamical inverse problem by iterative methods. One of the key questions is how to ensure the convergence of the iterative process. There have been a few small steps taken toward this important approach to robot programming. We give a representative result on the convergence problem.
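    The iterative scheme described here is the core idea behind iterative learning control; a minimal sketch (with a toy static-gain plant standing in for the robot dynamics, and gain values chosen for illustration) shows how repeating the task and correcting the input with each trial's error drives tracking error to zero when the learning gain satisfies the convergence condition.

    ```python
    def plant(u):
        """Toy plant: a static gain of 0.5 stands in for the robot dynamics."""
        return [0.5 * ui for ui in u]

    def ilc(desired, gamma=1.0, iterations=8):
        """Iterative learning control: repeat the task, then correct the
        input with the last trial's tracking error (u <- u + gamma * e)."""
        u = [0.0] * len(desired)
        errors = []
        for _ in range(iterations):
            y = plant(u)
            e = [d - yi for d, yi in zip(desired, y)]
            errors.append(max(abs(ei) for ei in e))
            u = [ui + gamma * ei for ui, ei in zip(u, e)]
        return u, errors

    desired = [1.0, 0.5, -0.25]
    u, errors = ilc(desired)
    print(errors[0], errors[-1])  # error contracts by |1 - 0.5*gamma| = 0.5 per trial
    ```

    For this plant the per-trial error contracts by |1 - 0.5*gamma|, so any gamma in (0, 4) converges; the convergence question raised in the abstract is exactly the choice of such a gain for the real, uncertain dynamics.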

  11. Solar thermal storage applications program

    NASA Astrophysics Data System (ADS)

    Peila, W. C.

    1982-12-01

    The efforts of the Storage Applications Program are reviewed. The program concentrated on the investigation of storage media and evaluation of storage methods. Extensive effort was given to experimental and analytical investigations of nitrate salts. Two tasks were the preliminary design of a 1200 MW(th) system and the design, construction, operation, and evaluation of a subsystem research experiment that utilized the same design. Some preliminary conclusions drawn from the subsystem research experiment are given.

  12. Status of the Ford program to evaluate ceramics for stator applications in automotive gas turbine engines

    NASA Technical Reports Server (NTRS)

    Trela, W.

    1980-01-01

    The paper reviews the progress of the major technical tasks of the DOE/NASA/Ford program Evaluation of Ceramics for Stator Applications in Automotive Gas Turbine Engines: reliability prediction, stator fabrication, material characterization, and stator evaluation. A fast fracture reliability model was prepared for a one-piece ceramic stator. Periodic inspection results are presented.

  13. Applications of Advanced Experimental Methodologies to AWAVS Training Research. Final Report, May 1977-July 1978.

    ERIC Educational Resources Information Center

    Simon, Charles W.

    A major part of the Naval Training Equipment Center's Aviation Wide Angle Visual System (AWAVS) program involves behavioral research to provide a basis for establishing design criteria for flight trainers. As part of the task of defining the purpose and approach of this program, the applications of advanced experimental methods are explained and…

  14. Traleika Glacier X-Stack Extension Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fryman, Joshua

    The XStack Extension Project continued along the direction of the XStack program in exploring the software tools and frameworks to support a task-based community runtime towards the goal of Exascale programming. The momentum built as part of the XStack project, with the development of the task-based Open Community Runtime (OCR) and related tools, was carried through during the XStack Extension with the focus areas of easing application development, improving performance, and supporting more features. The infrastructure set up for community-driven open-source development continued to be used towards these areas, with continued co-development of runtime and applications. A variety of OCR programming environments were studied, as described in the sections Revolutionary Programming Environments and Applications, to assist with application development on OCR, and we developed OCR Translator, a ROSE-based source-to-source compiler that parses high-level annotations in an MPI program to generate equivalent OCR code. Figure 2 compares the number of OCR objects needed to generate the 2D stencil workload using the translator against manual approaches based on an SPMD library or native coding. The rate of increase with the translator, with an increase in the number of ranks, is consistent with other approaches. This is explored further in Section OCR Translator.

  15. Manipulator system man-machine interface evaluation program. [technology assessment

    NASA Technical Reports Server (NTRS)

    Malone, T. B.; Kirkpatrick, M.; Shields, N. L.

    1974-01-01

    Application and requirements for remote manipulator systems for future space missions were investigated. A manipulator evaluation program was established to study the effects of various systems parameters on operator performance of tasks necessary for remotely manned missions. The program and laboratory facilities are described. Evaluation criteria and philosophy are discussed.

  16. Military Applications of Curved Focal Plane Arrays Developed by the HARDI Program

    DTIC Science & Technology

    2011-01-01

    considered one of the main founders of geometrical optics, modern photography, and cinematography. Among his inventions are the Petzval portrait lens...still be a problem. B. HARDI Program/Institute for Defense Analyses (IDA) Task 1. HARDI Program State-of-the-art cameras could be improved by

  17. Solar Energy Task Force Report on Education and Training.

    ERIC Educational Resources Information Center

    O'Connor, J. Kevin

    The Solar Energy Task Force Report summarizes data, information, and discussions focusing on solar space and water heating applications. The report is intended to fill a need for curriculum and course development and direction for technical training programs, especially in vocational/technical schools and community colleges. It addresses…

  18. The Viability of a DTN System for Current Military Application

    DTIC Science & Technology

    2013-03-01

    Agency (DARPA) Disruption-Tolerant Networking program and the Internet Research Task Force (IRTF) DTN Research Group made significant strides toward...Disruption-Tolerant Networks A Primer,” Interplanetary Internet Special Interest Group, 2012. [4] D. T. N. R. Group, “Compiling DTN2,” Internet Research Task

  19. The Environment for Application Software Integration and Execution (EASIE), version 1.0. Volume 2: Program integration guide

    NASA Technical Reports Server (NTRS)

    Jones, Kennie H.; Randall, Donald P.; Stallcup, Scott S.; Rowell, Lawrence F.

    1988-01-01

    The Environment for Application Software Integration and Execution, EASIE, provides a methodology and a set of software utility programs to ease the task of coordinating engineering design and analysis codes. EASIE was designed to meet the needs of conceptual design engineers who face the task of integrating many stand-alone engineering analysis programs. Using EASIE, programs are integrated through a relational data base management system. In volume 2, a SYSTEM LIBRARY PROCESSOR is used to construct a DATA DICTIONARY describing all relations defined in the data base, and a TEMPLATE LIBRARY. A TEMPLATE is a description of all subsets of relations (including conditional selection criteria and sorting specifications) to be accessed as input or output for a given application. Together, these form the SYSTEM LIBRARY, which is used to automatically produce the data base schema, FORTRAN subroutines to retrieve/store data from/to the data base, and instructions to a generic REVIEWER program providing review/modification of data for a given template. Automation of these functions eliminates much of the tedious, error-prone work required by the usual approach to data base integration.
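    EASIE's actual template format and generated FORTRAN are not reproduced in this abstract, but the template idea (a named subset of a relation's attributes plus selection and sorting criteria, turned into retrieval code automatically) can be illustrated with a small relational sketch; every table and attribute name below is invented.

    ```python
    import sqlite3

    # A TEMPLATE in the EASIE sense: a subset of a relation's attributes
    # plus selection and sorting criteria. (Names are illustrative only;
    # EASIE's real template syntax is not shown in the abstract.)
    template = {
        "relation": "wing_panels",
        "attributes": ["panel_id", "area"],
        "where": "area > 2.0",
        "order_by": "area DESC",
    }

    def template_query(t):
        """Generate the retrieval query a template describes, the way
        EASIE generates data-base access subroutines from templates."""
        sql = f"SELECT {', '.join(t['attributes'])} FROM {t['relation']}"
        if t.get("where"):
            sql += f" WHERE {t['where']}"
        if t.get("order_by"):
            sql += f" ORDER BY {t['order_by']}"
        return sql

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE wing_panels (panel_id INTEGER, area REAL, thickness REAL)")
    db.executemany("INSERT INTO wing_panels VALUES (?, ?, ?)",
                   [(1, 1.5, 0.02), (2, 3.2, 0.03), (3, 2.6, 0.02)])
    rows = db.execute(template_query(template)).fetchall()
    print(rows)  # [(2, 3.2), (3, 2.6)]
    ```

    Generating the access code from one declarative description is what eliminates the hand-written, error-prone glue between analysis programs.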

  20. UF/RO applications at the Browns Ferry Nuclear Power Station

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palino, G.F.; Sailor, W.C.; Sawochka, S.G.

    1981-04-01

    In June 1979, NWT was contracted by TVA to review the applicability of reverse osmosis (RO) and ultrafiltration (UF) membrane treatment technology at the Browns Ferry Nuclear Power Station. Specific program tasks are described and results presented.

  1. Summary of synfuel characterization and combustion studies

    NASA Technical Reports Server (NTRS)

    Schultz, D. F.

    1983-01-01

    Combustion component research studies aimed at evolving environmentally acceptable approaches for burning coal-derived fuels for ground power applications were performed at the NASA Lewis Research Center under a program titled the "Critical Research and Support Technology Program" (CRT). The work was funded by the Department of Energy and was performed in four tasks. This report summarizes these tasks, which have all been previously reported. In addition, some previously unreported data from Task 4 are also presented. Task 1 consisted of a literature survey aimed at determining the properties of synthetic fuels. This was followed by a computer modeling effort, Task 2, to predict the exhaust emissions resulting from burning coal liquids by various combustion techniques such as lean and rich-lean combustion. The computer predictions were then compared to the results of a flame tube rig, Task 3, in which the fuel properties were varied to simulate coal liquids. Two actual SRC 2 coal liquids were tested in this flame tube task.

  2. Optimizing Mars Airplane Trajectory with the Application Navigation System

    NASA Technical Reports Server (NTRS)

    Frumkin, Michael; Riley, Derek

    2004-01-01

    Planning complex missions requires a number of programs to be executed in concert. The Application Navigation System (ANS), developed in the NAS Division, can execute many interdependent programs in a distributed environment. We show that the ANS simplifies user effort and reduces time in optimization of the trajectory of a Martian airplane. We use a software package, Cart3D, to evaluate trajectories and a shortest path algorithm to determine the optimal trajectory. ANS employs the GridScape to represent the dynamic state of the available computer resources. ANS then uses a scheduler to dynamically assign ready tasks to machine resources, and the GridScape to track available resources and forecast completion times of running tasks. We demonstrate system capability to schedule and run the trajectory optimization application with efficiency exceeding 60% on 64 processors.
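    The abstract names a shortest path algorithm without specifying it; Dijkstra's algorithm over a graph of candidate trajectory legs is one plausible reading, sketched here with invented waypoints and leg costs (in the paper the per-leg costs would come from the Cart3D flow evaluations).

    ```python
    import heapq

    def shortest_path(graph, start, goal):
        """Dijkstra's algorithm over a graph of candidate trajectory legs.

        graph: dict node -> list of (neighbor, cost) pairs; the costs
        here are invented stand-ins for aerodynamic leg evaluations.
        """
        dist = {start: 0.0}
        prev = {}
        heap = [(0.0, start)]
        while heap:
            d, node = heapq.heappop(heap)
            if node == goal:
                break
            if d > dist.get(node, float("inf")):
                continue  # stale heap entry
            for nbr, cost in graph.get(node, []):
                nd = d + cost
                if nd < dist.get(nbr, float("inf")):
                    dist[nbr] = nd
                    prev[nbr] = node
                    heapq.heappush(heap, (nd, nbr))
        path, node = [goal], goal
        while node != start:
            node = prev[node]
            path.append(node)
        return list(reversed(path)), dist[goal]

    # Waypoints A..D with per-leg costs from (hypothetical) evaluations.
    graph = {
        "A": [("B", 2.0), ("C", 5.0)],
        "B": [("C", 1.0), ("D", 6.0)],
        "C": [("D", 2.0)],
    }
    path, cost = shortest_path(graph, "A", "D")
    print(path, cost)  # ['A', 'B', 'C', 'D'] 5.0
    ```

    In the workflow the abstract describes, each edge cost is itself the output of an expensive program, which is why running the evaluations as interdependent distributed tasks pays off.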

  3. Comparison of the effects of mobile technology AAC apps on programming visual scene displays.

    PubMed

    Caron, Jessica; Light, Janice; Davidoff, Beth E; Drager, Kathryn D R

    2017-12-01

    Parents and professionals who work with individuals who use augmentative and alternative communication (AAC) face tremendous time pressures, especially when programming vocabulary in AAC technologies. System design (from programming functions to layout options) necessitates a range of skills related to operational competence and can impose intensive training demands for communication partners. In fact, some AAC applications impose considerable learning demands, which can lead to increased time to complete the same programming tasks. A within-subject design was used to investigate the comparative effects of three visual scene display AAC apps (GoTalk Now, AutisMate, EasyVSD) on the programming times for three off-line programming activities, by adults who were novices to programming AAC apps. The results indicated all participants were able to create scenes and add hotspots during off-line programming tasks with minimal self-guided training. The AAC app that had the least number of programming steps, EasyVSD, resulted in the fastest completion times across the three programming tasks. These results suggest that by simplifying the operational requirements of AAC apps the programming time is reduced, which may allow partners to better support individuals who use AAC.

  4. Silicon Power MOSFETs

    NASA Technical Reports Server (NTRS)

    Lauenstein, Jean-Marie; Casey, Megan; Campola, Michael; Ladbury, Raymond; Label, Kenneth; Wilcox, Ted; Phan, Anthony; Kim, Hak; Topper, Alyson

    2017-01-01

    Recent work for the NASA Electronic Parts and Packaging Program Power MOSFET task is presented. The task's technology focus, roadmap, and partners are given. Recent single-event effect test results on commercial, automotive, and radiation-hardened trench power MOSFETs are summarized, with an emphasis on the risk of using commercial and automotive trench-gate power MOSFETs in space applications.

  5. Task Force on Education, Cabinet Committee on Opportunities for Spanish Speaking People, Fiscal Year 1971.

    ERIC Educational Resources Information Center

    Cabinet Committee on Opportunities for Spanish Speaking People, Washington, DC.

    Seventeen recommendations by the Education Task Force for the improvement of education for the Spanish speaking are given. These recommendations were made to the President and to departments which provide programs and services for the Spanish speaking. The recommendations pertain to funding applications, job specifications, teacher education,…

  6. An Open-Sourced and Interactive Ebook Development Program for Minority Languages

    ERIC Educational Resources Information Center

    Sheepy, Emily; Sundberg, Ross; Laurie, Anne

    2017-01-01

    According to Long (2014), genuine task-based pedagogy is centered around the real-world activities that learners need to complete using the target language. We are developing the OurStories mobile application to support learners and instructors of minority languages in the development of personally relevant, task-based learning resources. The…

  7. Administrator Training and Development: Conceptual Model.

    ERIC Educational Resources Information Center

    Boardman, Gerald R.

    A conceptual model for an individualized training program for school administrators integrates processes, characteristics, and tasks through theory training and application. Based on an application of contingency theory, it provides a system matching up administrative candidates' needs in three areas (administrative process, administrative…

  8. A Practical Solution Using A New Approach To Robot Vision

    NASA Astrophysics Data System (ADS)

    Hudson, David L.

    1984-01-01

    Up to now, robot vision systems have been designed to serve both application development and operational needs in inspection, assembly and material handling. This universal approach to robot vision is too costly for many practical applications. A new industrial vision system separates the function of application program development from on-line operation. A Vision Development System (VDS) is equipped with facilities designed to simplify and accelerate the application program development process. A complementary but lower-cost Target Application System (TASK) runs the application program developed with the VDS. This concept is presented in the context of an actual robot vision application that improves inspection and assembly for a manufacturer of electronic terminal keyboards. Applications developed with a VDS experience lower development cost when compared with conventional vision systems. Since the TASK processor is not burdened with development tools, it can be installed at a lower cost than comparable "universal" vision systems that are intended to be used for both development and on-line operation. The VDS/TASK approach opens more industrial applications to robot vision that previously were not practical because of the high cost of vision systems. Although robot vision is a new technology, it has been applied successfully to a variety of industrial needs in inspection, manufacturing, and material handling. New developments in robot vision technology are creating practical, cost-effective solutions for a variety of industrial needs. A year or two ago, researchers and robot manufacturers interested in implementing a robot vision application could take one of two approaches. The first approach was to purchase all the necessary vision components from various sources. That meant buying an image processor from one company, a camera from another, and lenses and light sources from yet others.
    The user then had to assemble the pieces, and in most instances he had to write all of his own software to test, analyze, and process the vision application. The second and most common approach was to contract with the vision equipment vendor for the development and installation of a turnkey inspection or manufacturing system. The robot user and his company paid a premium for their vision system in an effort to assure the success of the system. Since 1981, emphasis on robotics has skyrocketed. New groups have been formed in many manufacturing companies with the charter to learn about, test, and initially apply new robot and automation technologies. Machine vision is one of the new technologies being tested and applied. This focused interest has created a need for a robot vision system that makes it easy for manufacturing engineers to learn about, test, and implement a robot vision application. A newly developed vision system addresses those needs. The Vision Development System (VDS) is a complete hardware and software product for the development and testing of robot vision applications. A complementary, low-cost Target Application System (TASK) runs the application program developed with the VDS. An actual robot vision application that demonstrates inspection and pre-assembly for keyboard manufacturing is used to illustrate the VDS/TASK approach.

  9. Real time software for a heat recovery steam generator control system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Valdes, R.; Delgadillo, M.A.; Chavez, R.

    1995-12-31

    This paper addresses the development and successful implementation of real time software for the Heat Recovery Steam Generator (HRSG) control system of a Combined Cycle Power Plant. The real time software for the HRSG control system physically resides in a Control and Acquisition System (SAC), which is a component of a distributed control system (DCS). The SAC is a programmable controller. The DCS installed at the Gomez Palacio power plant in Mexico accomplishes the functions of logic, analog, and supervisory control. The DCS is based on microprocessors, and the architecture consists of workstations operating as a Man-Machine Interface (MMI) linked to SAC controllers by means of a communication system. The HRSG real time software is composed of an operating system, drivers, dedicated computer programs, and application computer programs. The operating system used for the development of this software was the MultiTasking Operating System (MTOS). The application software developed at IIE for the HRSG control system basically consisted of a set of digital algorithms for the regulation of the main process variables at the HRSG. By using the multitasking feature of MTOS, the algorithms are executed pseudo-concurrently. In this way, the application programs continuously use the resources of the operating system to perform their functions through a uniform service interface. The application software of the HRSG consists of three tasks, each of which has dedicated responsibilities. The drivers were developed for the handling of hardware resources of the SAC controller, which in turn allows signal acquisition and data communication with the MMI. The dedicated programs were developed for hardware diagnostics, task initialization, access to the data base, and fault tolerance. The application software and the dedicated software for the HRSG control system were developed using the C programming language because of its compactness, portability, and efficiency.
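    The regulation algorithms themselves are not listed in the abstract; a discrete PI update is the kind of digital regulation loop such a periodic SAC task typically runs. The sketch below is in Python rather than the paper's C, and the gains, the first-order process model, and the drum-level analogy are all illustrative assumptions.

    ```python
    def pi_controller(kp, ki, dt):
        """Discrete PI regulator of the kind run as one periodic task.
        Returns a stateful update function (integral kept in a closure)."""
        state = {"integral": 0.0}
        def update(setpoint, measurement):
            error = setpoint - measurement
            state["integral"] += error * dt
            return kp * error + ki * state["integral"]
        return update

    def simulate(steps=200, dt=0.1):
        """Run the regulator against a first-order process dy/dt = -y + u,
        a stand-in for one HRSG process variable (illustrative only)."""
        ctrl = pi_controller(kp=2.0, ki=0.5, dt=dt)
        y = 0.0
        for _ in range(steps):
            u = ctrl(setpoint=1.0, measurement=y)
            y += dt * (-y + u)  # forward-Euler step of the process
        return y

    print(round(simulate(), 3))
    ```

    Running one such update function per process variable, each on its own timer, is the pseudo-concurrent structure the MTOS tasking described above provides.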

  10. Phase 1 Program Joint Report

    NASA Technical Reports Server (NTRS)

    Nield, George C. (Editor); Vorobiev, Pavel Mikhailovich (Editor)

    1999-01-01

    This report consists of inputs from each of the Phase I Program Joint Working Groups. The Working Groups were tasked to describe the organizational structure and work processes that they used during the program, joint accomplishments, lessons learned, and applications to the International Space Station Program. This report is a top-level joint reference document that contains information of interest to both countries.

  11. R&M (Reliability and Maintainability) Program Cost Drivers.

    DTIC Science & Technology

    1987-05-01

    Specific data points used to develop the models (i.e., labor hours and associated systems and task application characteristics) were obtained from three...study data base used to generate the CERs can be expanded by adding project data points to the input data given in Appendix 13, adjusting the CER...FRACAS, worst-case/ thermal analyses, stress screening and R-growth. However, the studies did not assign benefits to specific task areas. c. Task

  12. The effect of single-task and dual-task balance exercise programs on balance performance in adults with osteoporosis: a randomized controlled preliminary trial.

    PubMed

    Konak, H E; Kibar, S; Ergin, E S

    2016-11-01

    Osteoporosis is a serious disease characterized by muscle weakness in the lower extremities, shortened trunk length, and increased dorsal kyphosis leading to poor balance performance. Although balance impairment increases in adults with osteoporosis, falls and fall-related injuries have been shown to occur mainly during dual-task performance. Several studies have shown that dual-task performance was improved with specific repetitive dual-task exercises. The aims of this study were to compare the effect of single- and dual-task balance exercise programs on static balance, dynamic balance, and activity-specific balance confidence in adults with osteoporosis and to assess the effectiveness of dual-task balance training on gait speed under dual-task conditions. Older adults (N = 42; age range, 45-88 years) with osteoporosis were randomly assigned into two groups. The single-task balance training group was given single-task balance exercises for 4 weeks, whereas the dual-task balance training group received dual-task balance exercises. Participants received 45-min individualized training sessions, three times a week. Static balance was evaluated by one-leg stance (OLS) and a kinesthetic ability trainer (KAT) device. Dynamic balance was measured by the Berg Balance Scale (BBS), Timed Up and Go (TUG) test, and gait speed. Self-confidence was assessed with the Activities-specific Balance Confidence (ABC-6) scale. Assessments were performed at baseline and after the 4-week program. At the end of the treatment periods, KAT score, BBS score, time in OLS and TUG, gait speeds under single- and dual-task conditions, and ABC-6 scale scores improved significantly in all patients (p < 0.05). However, BBS and gait speeds under single- and dual-task conditions showed significantly greater improvement in the dual-task balance training group than in the single-task balance training group (p < 0.05).
ABC-6 scale scores improved more in the single-task balance training group than in the dual-task balance training group (p < 0.05). The 4-week single- and dual-task balance exercise programs are effective in improving static balance, dynamic balance, and balance confidence during daily activities in older adults with osteoporosis. However, single- and dual-task gait speeds showed greater improvement following the application of a specific type of dual-task exercise program. 24102014-2.

  13. Application of Multi-Frequency Modulation (MFM) for High-Speed Data Communications to a Voice Frequency Channel

    DTIC Science & Technology

    1990-06-01

    The reader is cautioned that computer programs developed in this research may not have been exercised for all cases of interest. While every effort has been ... formats. Previous applications of these encoding formats were on industry-standard computers (PC) over a 16-20 kHz channel. This report discusses the ...

  14. Medical benefits from the NASA biomedical applications program

    NASA Technical Reports Server (NTRS)

    Sigmon, J. L.

    1974-01-01

    To achieve its goals, the NASA Biomedical Applications Program performs four basic tasks: (1) identification of major medical problems which lend themselves to solution by relevant aerospace technology; (2) identification of relevant aerospace technology which can be applied to those problems; (3) application of that technology to demonstrate its feasibility as a real solution to the identified problems; and (4) motivation of the industrial community to manufacture and market the identified solutions, maximizing the utilization of aerospace technology by the biomedical community.

  15. Space Station Application of Simulator-Developed Aircrew Coordination and Performance Measures

    NASA Technical Reports Server (NTRS)

    Murphy, Miles

    1985-01-01

    This paper summarizes a study in progress at NASA/Ames Research Center to develop measures of aircrew coordination and decision-making factors and to relate them to flight task performance, that is, to crew and system performance measures. The existence of some similar interpersonal process and task performance requirements suggests a potential application of these methods in space station crew research -- particularly research conducted in ground-based mock-ups. The secondary objective of this study should also be of interest: to develop information on crew process and performance for application in developing crew training programs.

  16. Task 6 -- Advanced turbine systems program conceptual design and product development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1996-01-10

    The Allison Engine Company has completed the Task 6 Conceptual Design and Analysis of Phase 2 of the Advanced Turbine System (ATS) contract. At the heart of Allison's system is an advanced simple-cycle gas turbine engine. This engine will incorporate components that ensure the program goals are met. Allison plans to commercialize the ATS demonstrator and market a family of engines incorporating this technology. This family of engines, ranging from 4.9 MW to 12 MW, will be suitable for use in all industrial engine applications, including electric power generation, mechanical drive, and marine propulsion. In the field of electric power generation, the engines will be used for base load, standby, cogeneration, and distributed generation applications.

  17. Solar Concentrator Advanced Development Program, Task 1

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Solar dynamic power generation has been selected by NASA to provide power for the space station. Solar dynamic concentrator technology has been demonstrated for terrestrial applications but has not been developed for space applications. The object of the Solar Concentrator Advanced Development program is to develop the technology of solar concentrators which would be used on the space station. The first task of this program was to develop conceptual concentrator designs, perform trade-off studies, develop a materials data base, and perform material selection. Three unique concentrator concepts (Truss Hex, Spline Radial Panel, and Domed Fresnel) were developed and evaluated against weighted trade criteria. The Truss Hex concept was recommended for the space station. Materials data base development demonstrated that several material systems are capable of withstanding extended periods of atomic oxygen exposure without undesirable performance degradation. Descriptions of the conceptual designs and materials test data are included.

  18. Solid State Energy Conversion Alliance Delphi SOFC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steven Shaffer; Gary Blake; Sean Kelly

    2006-12-31

    The following report details the results under the DOE SECA program for the period July 2006 through December 2006. Developments pertain to a 3 to 5 kW Solid Oxide Fuel Cell power system for a range of fuels and applications. This report details technical results of the work performed under the following tasks for the SOFC power system: Task 1, SOFC System Development; Task 2, Solid Oxide Fuel Cell Stack Developments; Task 3, Reformer Developments; Task 4, Development of Balance of Plant Components; Task 5, Project Management; and Task 6, System Modeling & Cell Evaluation for High Efficiency Coal-Based Solid Oxide Fuel Cell Gas Turbine Hybrid System.

  19. The National Shipbuilding Research Program. Application of Industrial Engineering Techniques to Reduce Workers’ Compensation and Environmental Costs

    DTIC Science & Technology

    1999-10-01

    Performing organization: Naval Surface Warfare Center, Code 2230 (Design Integration Tools), Bldg 192, Room 128, 9500 MacArthur Blvd, Bethesda, MD 20817-5700. ... Implementation of Task 2.4; Task 7.0, Conduct Workshops; Task 8.0, Final Report. To ensure success with the project, the research needed to be performed at the ...

  20. HAL/S language specification

    NASA Technical Reports Server (NTRS)

    Newbold, P. M.

    1974-01-01

    A programming language for the flight software of the NASA space shuttle program was developed and identified as HAL/S. The language is intended to satisfy virtually all of the flight software requirements of the space shuttle. The language incorporates a wide range of features, including applications-oriented data types and organizations, real time control mechanisms, and constructs for systems programming tasks.

  1. System architecture for asynchronous multi-processor robotic control system

    NASA Technical Reports Server (NTRS)

    Steele, Robert D.; Long, Mark; Backes, Paul

    1993-01-01

    The architecture for the Modular Telerobot Task Execution System (MOTES), as implemented in the Supervisory Telerobotics (STELER) Laboratory, is described. MOTES is the software component of the remote site of a local-remote telerobotic system being developed for NASA for space applications, in particular Space Station Freedom applications. The system is being developed to provide supervised autonomous control, supporting both space-based operation and ground-remote control with time delay. The local-remote architecture places task planning responsibilities at the local site and task execution responsibilities at the remote site. This separation allows the remote site to be designed to optimize task execution capability within a limited computational environment, such as is expected in flight systems. The local site task planning system could be placed on the ground, where few computational limitations are expected. MOTES is written in the Ada programming language for a multiprocessor environment.

  2. Versatile and declarative dynamic programming using pair algebras.

    PubMed

    Steffen, Peter; Giegerich, Robert

    2005-09-12

    Dynamic programming is a widely used programming technique in bioinformatics. In sharp contrast to the simplicity of textbook examples, implementing a dynamic programming algorithm for a novel and non-trivial application is a tedious and error-prone task. The algebraic dynamic programming approach seeks to alleviate this situation by clearly separating the dynamic programming recurrences and scoring schemes. Based on this programming style, we introduce a generic product operation of scoring schemes. This leads to a remarkable variety of applications, allowing us to achieve optimizations under multiple objective functions, alternative solutions and backtracing, holistic search space analysis, ambiguity checking, and more, without additional programming effort. We demonstrate the method on several applications for RNA secondary structure prediction. The product operation as introduced here adds a significant amount of flexibility to dynamic programming. It provides a versatile testbed for the development of new algorithmic ideas, which can immediately be put to practice.
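    The product idea can be sketched in Python. The algebra representation below (a tuple of base, match, gap, and choice functions driving a toy edit-distance recurrence) is an illustrative invention, not the paper's actual ADP interface; it shows how the product of a scoring algebra and a counting algebra yields, with no new recurrence code, both the optimal score and the number of co-optimal solutions:

```python
# Toy edit-distance DP parameterized by a scoring "algebra":
# alg = (base, match, gap, choose). Names and shapes are hypothetical.

def edit_dp(x, y, alg):
    base, match, gap, choose = alg
    n, m = len(x), len(y)
    T = [[None] * (m + 1) for _ in range(n + 1)]
    T[0][0] = base()
    for i in range(1, n + 1):            # first column: deletions only
        T[i][0] = choose([gap(T[i - 1][0])])
    for j in range(1, m + 1):            # first row: insertions only
        T[0][j] = choose([gap(T[0][j - 1])])
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            T[i][j] = choose([match(T[i - 1][j - 1], x[i - 1], y[j - 1]),
                              gap(T[i - 1][j]),
                              gap(T[i][j - 1])])
    return T[n][m]

# Scoring algebra: minimal edit distance.
score = (lambda: 0,
         lambda v, a, b: v + (a != b),
         lambda v: v + 1,
         min)

# Counting algebra: number of alignments surviving the choice function.
count = (lambda: 1,
         lambda v, a, b: v,
         lambda v: v,
         sum)

def product(algA, algB):
    """Pair algebra: optimize by A, aggregate B over A's co-optima."""
    baseA, matA, gapA, chA = algA
    baseB, matB, gapB, chB = algB
    def choose(cands):
        best = chA([a for a, _ in cands])
        return (best, chB([b for a, b in cands if a == best]))
    return (lambda: (baseA(), baseB()),
            lambda v, a, b: (matA(v[0], a, b), matB(v[1], a, b)),
            lambda v: (gapA(v[0]), gapB(v[1])),
            choose)
```

    For example, `edit_dp("aa", "a", product(score, count))` evaluates to `(1, 2)`: edit distance 1, achieved by two co-optimal alignments (delete either `a`).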

  3. X-Windows Socket Widget Class

    NASA Technical Reports Server (NTRS)

    Barry, Matthew R.

    2006-01-01

    The X-Windows Socket Widget Class ("Class" is used here in the object-oriented-programming sense of the word) was devised to simplify the task of implementing network connections for graphical-user-interface (GUI) computer programs. UNIX Transmission Control Protocol/Internet Protocol (TCP/IP) socket programming libraries require many method calls to configure, operate, and destroy sockets. Most X Windows GUI programs use widget sets or toolkits to facilitate management of complex objects. The widget standards facilitate construction of toolkits and application programs. The X-Windows Socket Widget Class encapsulates UNIX TCP/IP socket-management tasks within the framework of an X Windows widget. Using the widget framework, X Windows GUI programs can treat one or more network socket instances in the same manner as that of other graphical widgets, making it easier to program sockets. Wrapping TCP/IP socket programming libraries inside a widget framework enables a programmer to treat a network interface as though it were a GUI.
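    The encapsulation idea can be sketched outside of X Windows as well. The Python class below is a loose, hypothetical analogue (not NASA's actual widget code): socket setup, input callbacks, and teardown hide behind a widget-like object, so the caller never touches the raw socket calls directly:

```python
import socket

class SocketWidget:
    """Widget-style socket wrapper (illustrative sketch only)."""

    def __init__(self, sock, on_input=None):
        self._sock = sock
        self._on_input = on_input   # callback resource, set like a widget attribute

    def send(self, data: bytes):
        self._sock.sendall(data)

    def dispatch(self):
        """Read pending data and fire the input callback.
        A real widget set would invoke this from its event loop when the
        descriptor becomes readable (cf. XtAppAddInput in Xt)."""
        data = self._sock.recv(4096)
        if self._on_input is not None:
            self._on_input(data)
        return data

    def destroy(self):
        self._sock.close()

# Loopback demonstration with a connected socket pair.
a, b = socket.socketpair()
received = []
server = SocketWidget(b, on_input=received.append)
client = SocketWidget(a)
client.send(b"hello widget")
server.dispatch()
client.destroy()
server.destroy()
```

    The design point mirrors the abstract: once the socket is behind a widget-like interface, the GUI code manages it with the same create/configure/callback/destroy lifecycle it uses for any other widget.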

  4. Assessment of the technology required to develop photovoltaic power system for large scale national energy applications

    NASA Technical Reports Server (NTRS)

    Lutwack, R.

    1974-01-01

    A technical assessment of a program to develop photovoltaic power system technology for large-scale national energy applications was made by analyzing and judging the alternative candidate photovoltaic systems and development tasks. A program plan was constructed based on achieving the 10 year objective of a program to establish the practicability of large-scale terrestrial power installations using photovoltaic conversion arrays costing less than $0.50/peak W. Guidelines for the tasks of a 5 year program were derived from a set of 5 year objectives deduced from the 10 year objective. This report indicates the need for an early emphasis on the development of the single-crystal Si photovoltaic system for commercial utilization; a production goal of 5 x 10^8 peak W/year of $0.50 cells was projected for the year 1985. The developments of other photovoltaic conversion systems were assigned to longer range development roles. The status of the technology developments and the applicability of solar arrays in particular power installations, ranging from houses to central power plants, was scheduled to be verified in a series of demonstration projects. The budget recommended for the first 5 year phase of the program is $268.5M.

  5. Joining of ceramics for high performance energy systems. Mid-term progress report, August 1, 1979-March 31, 1980

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smeltzer, C E; Metcalfe, A G

    The subject program is primarily an exploratory and demonstration study of the use of silicate glass-based adhesives for bonding silicon-base refractory ceramics (SiC, Si3N4). The projected application is 1250 to 2050 °F relaxing joint service in high-performance energy conversion systems. The five program tasks and their current status are as follows. Task 1 - Long-Term Joint Stability: time-temperature-transformation studies of candidate glass adhesives, out to 2000 hours simulated service exposure, are half complete. Task 2 - Environmental and Service Effects on Joint Reliability: start-up delayed due to late delivery of candidate glass fillers and ceramic specimens. Task 3 - Viscoelastic Damping of Glass-Bonded Ceramics: promising results obtained over approximately the same range of glass viscosity required for the joint relaxation function (10^7.5 to 10^9.5 poise); work is 90% complete. Task 4 - Crack Arrest and Crack Diversion by Joints: no work started due to late arrival of materials. Task 5 - Improved Joining and Fabrication Methods: significant work has been conducted in the area of refractory pre-glazing and the application and bonding of high-density candidate glass fillers (by both hand-artisan and slip-spray techniques); work is half complete.

  6. Astronaut Office Scheduling System Software

    NASA Technical Reports Server (NTRS)

    Brown, Estevancio

    2010-01-01

    AOSS is a highly efficient scheduling application that uses various tools to schedule astronauts' weekly appointment information. This program represents an integration of many technologies into a single application to facilitate schedule sharing and management. It is a Windows-based application developed in Visual Basic. Because the NASA standard office automation load environment is Microsoft-based, Visual Basic provides AOSS developers with the ability to interact with Windows collaboration components by accessing object models from applications like Outlook and Excel. This also gives developers the ability to create customizable components that perform specialized scheduling and reporting tasks inside the application. With this capability, AOSS can perform various asynchronous tasks, such as gathering, sending, and managing astronauts' schedule information directly to their Outlook calendars at any time.

  7. Application of advanced speech technology in manned penetration bombers

    NASA Astrophysics Data System (ADS)

    North, R.; Lea, W.

    1982-03-01

    This report documents research on the potential use of speech technology in a manned penetration bomber aircraft (B-52/G and H). The objectives of the project were to analyze the pilot/copilot crewstation tasks over a three-hour-and-forty-minute mission, determine the tasks that would benefit most from conversion to speech recognition/generation, determine the technological feasibility of each of the identified tasks, and prioritize the tasks based on these criteria. Secondary objectives of the program were to enunciate research strategies for the application of speech technologies in airborne environments and to develop guidelines for briefing user commands on the potential of using speech technologies in the cockpit. The results of this study indicated that, for the B-52 crewmember, speech recognition would be most beneficial for retrieving chart and procedural data contained in the flight manuals. The technological feasibility assessment indicated that the checklist and procedural retrieval tasks would be highly feasible for a speech recognition system.

  8. Design study of wind turbines 50 kW to 3000 kW for electric utility applications: Analysis and design

    NASA Technical Reports Server (NTRS)

    1976-01-01

    In the conceptual design task, several feasible wind generator system (WGS) configurations were evaluated, and the concept offering the lowest energy cost potential and minimum technical risk for utility applications was selected. In the optimization task, the selected concept was optimized utilizing a parametric computer program prepared for this purpose. In the preliminary design task, the optimized selected concept was designed and analyzed in detail. The utility requirements evaluation task examined the economic, operational, and institutional factors affecting the WGS in a utility environment, and provided additional guidance for the preliminary design effort. Results of the conceptual design task indicated that a rotor operating at constant speed, driving an AC generator through a gear transmission, is the most cost-effective WGS configuration. The optimization task results led to the selection of a 500 kW rating for the low power WGS and a 1500 kW rating for the high power WGS.

  9. IPAD applications to the design, analysis, and/or machining of aerospace structures. [Integrated Program for Aerospace-vehicle Design

    NASA Technical Reports Server (NTRS)

    Blackburn, C. L.; Dovi, A. R.; Kurtze, W. L.; Storaasli, O. O.

    1981-01-01

    A computer software system for the processing and integration of engineering data and programs, called IPAD (Integrated Programs for Aerospace-Vehicle Design), is described. The ability of the system to relieve the engineer of the mundane task of input data preparation is demonstrated by the application of a prototype system to the design, analysis, and/or machining of three simple structures. Future work to further enhance the system's automated data handling and ability to handle larger and more varied design problems are also presented.

  10. Remote sensing support for national forest inventories

    Treesearch

    Ronald E. McRoberts; Erkki O. Tomppo

    2007-01-01

    National forest inventory programs are tasked to produce timely and accurate estimates for a wide range of forest resource variables for a variety of users and applications. Time, cost, and precision constraints cause these programs to seek technological innovations that contribute to measurement and estimation efficiencies and that facilitate the production and...

  11. Applications of Combinatorial Programming to Data Analysis: The Traveling Salesman and Related Problems

    ERIC Educational Resources Information Center

    Hubert, Lawrence J.; Baker, Frank B.

    1978-01-01

    The "Traveling Salesman" and similar combinatorial programming tasks encountered in operations research are discussed as possible data analysis models in psychology, for example, in developmental scaling, Guttman scaling, profile smoothing, and data array clustering. A short overview of various computational approaches from this area of…

  12. Prototyping with Application Generators: Lessons Learned from the Naval Aviation Logistics Command Management Information System Case

    DTIC Science & Technology

    1992-10-01

    Prototyping with Application Generators: Lessons Learned from the Naval Aviation Logistics Command Management Information System Case. This study ... management information system to automate manual Naval aviation maintenance tasks (NALCOMIS). With the use of a fourth-generation programming language ...

  13. Mobile Language Study

    DTIC Science & Technology

    2003-08-18

    Authors: Professor Mads Dam, Pablo Giambiagi. (SPC 01-4025) Mobile Language Study, Final ... smart card applications. Smart cards can be programmed using general-purpose languages; but because of their limited resources, smart card programs ...

  14. The Affordance Template ROS Package for Robot Task Programming

    NASA Technical Reports Server (NTRS)

    Hart, Stephen; Dinh, Paul; Hambuchen, Kimberly

    2015-01-01

    This paper introduces the Affordance Template ROS package for quickly programming, adjusting, and executing robot applications in the ROS RViz environment. This package extends the capabilities of RViz interactive markers by allowing an operator to specify multiple end-effector waypoint locations and grasp poses in object-centric coordinate frames and to adjust these waypoints in order to meet the run-time demands of the task (specifically, object scale and location). The Affordance Template package stores task specifications in a robot-agnostic XML description format such that it is trivial to apply a template to a new robot. As such, the Affordance Template package provides a robot-generic ROS tool appropriate for building semi-autonomous, manipulation-based applications. Affordance Templates were developed by the NASA-JSC DARPA Robotics Challenge (DRC) team and have since successfully been deployed on multiple platforms, including the NASA Valkyrie and Robonaut 2 humanoids, the University of Texas Dreamer robot, and the Willow Garage PR2. In this paper, the specification and implementation of the Affordance Template package are introduced and demonstrated through examples for wheel (valve) turning, pick-and-place, and drill grasping, evincing its utility and flexibility for a wide variety of robot applications.

  15. Highlights of X-Stack ExM Deliverable Swift/T

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wozniak, Justin M.

    Swift/T is a key success from the ExM (System support for extreme-scale, many-task applications) X-Stack project, which proposed to use concurrent dataflow as an innovative programming model to exploit extreme parallelism in exascale computers. The Swift/T component of the project reimplemented the Swift language from scratch to allow applications that compose scientific modules together to be built and run on available petascale computers (Blue Gene, Cray). Swift/T does this via a new compiler and runtime that generates and executes the application as an MPI program. We assume that mission-critical emerging exascale applications will be composed as scalable applications using existing software components, connected by data dependencies. Developers wrap native code fragments using a higher-level language, then build composite applications to form a computational experiment. This exemplifies hierarchical concurrency: lower-level messaging libraries are used for fine-grained parallelism; high-level control is used for inter-task coordination. These patterns are best expressed with dataflow, but static DAGs (i.e., other workflow languages) limit the applications that can be built; they do not provide the expressiveness of Swift, such as conditional execution, iteration, and recursive functions.
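    As a hedged sketch in Python rather than Swift/T itself, the dataflow pattern looks like this: futures express data dependencies between composed tasks, and ordinary control flow (the conditional at the end) does what a static DAG cannot. `simulate` and `analyze` are made-up stand-ins for wrapped native components:

```python
from concurrent.futures import ThreadPoolExecutor

def simulate(param):            # stand-in for a wrapped native code fragment
    return param * param

def analyze(value):             # downstream stage, data-dependent on simulate()
    return value + 1

with ThreadPoolExecutor(max_workers=4) as pool:
    sims = [pool.submit(simulate, p) for p in range(4)]   # fan-out of tasks
    results = [analyze(f.result()) for f in sims]         # join on data deps
    # Conditional execution decided at run time -- beyond a static DAG:
    extra = pool.submit(simulate, 10).result() if max(results) > 5 else 0
```

    The executor plays the role of the inter-task coordination layer; inside each task, a real system would use fine-grained parallelism (e.g., threaded or MPI kernels), giving the hierarchical concurrency the abstract describes.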

  16. [Virtual surgical education: experience with medicine and surgery students].

    PubMed

    Bonavina, Luigi; Mozzi, Enrico; Peracchia, Alberto

    2003-01-01

    The use of virtual reality simulation is currently being proposed within programs of postgraduate surgical education. The simple tasks that make up an operative procedure can be repeatedly performed until satisfactory execution is achieved, and the errors can be corrected by means of objective assessment. The aim of this study was to evaluate the applicability and the results of structured practice with the LapSim laparoscopic simulator used by undergraduate medical students. A significant reduction in operative time and errors was noted in several tasks (navigation, clipping, etc.). Although the transfer of technical skills to the operating room environment remains to be demonstrated, our research shows that this type of teaching is applicable to undergraduate medical students and in the future may become a useful tool for selecting individuals for surgical residency programs.

  17. Generation IV Nuclear Energy Systems Construction Cost Reductions through the Use of Virtual Environments - Task 4 Report: Virtual Mockup Maintenance Task Evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Timothy Shaw; Anthony Baratta; Vaughn Whisker

    2005-02-28

    This Task 4 report covers part of a 3-year DOE NERI-sponsored effort evaluating immersive virtual reality (CAVE) technology for design review, construction planning, and maintenance planning and training for next-generation nuclear power plants. The program covers development of full-scale virtual mockups generated from 3D CAD data and presented in a CAVE visualization facility. This report focuses on using full-scale virtual mockups for nuclear power plant training applications.

  18. NASA/ESTO investments in remote sensing technologies (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Babu, Sachidananda R.

    2017-02-01

    For more than 18 years the NASA Earth Science Technology Office (ESTO) has been investing in remote sensing technologies. During this period ESTO has invested in more than 900 tasks. These tasks are managed under multiple programs, such as the Instrument Incubator Program (IIP), Advanced Component Technology (ACT), Advanced Information Systems Technology (AIST), In-Space Validation of Earth Science Technologies (InVEST), Sustainable Land Imaging - Technology (SLI-T), and others. This covers the whole spectrum of technologies, from components to full-up satellites in space and software. Over the years many of these technologies have been infused into space missions such as Aquarius, SMAP, CYGNSS, SWOT, TEMPO, and others. ESTO is actively investing in infrared sensor technologies for space applications; recent investments have been under the SLI-T and InVEST programs. On these tasks, technology development ranges from simple bolometers to advanced photonic-waveguide-based spectrometers. Some of the details of these missions and technologies will be presented.

  19. ESTO Investments in Innovative Sensor Technologies for Remote Sensing

    NASA Technical Reports Server (NTRS)

    Babu, Sachidananda R.

    2017-01-01

    For more than 18 years the NASA Earth Science Technology Office (ESTO) has been investing in remote sensing technologies. During this period ESTO has invested in more than 900 tasks. These tasks are managed under multiple programs, such as the Instrument Incubator Program (IIP), Advanced Component Technology (ACT), Advanced Information Systems Technology (AIST), In-Space Validation of Earth Science Technologies (InVEST), Sustainable Land Imaging - Technology (SLI-T), and others. This covers the whole spectrum of technologies, from components to full-up satellites in space and software. Over the years many of these technologies have been infused into space missions such as Aquarius, SMAP, CYGNSS, SWOT, TEMPO, and others. ESTO is actively investing in infrared sensor technologies for space applications; recent investments have been under the SLI-T and InVEST programs. On these tasks, technology development ranges from simple bolometers to advanced photonic-waveguide-based spectrometers. Some of the details of these missions and technologies will be presented.

  20. Petascale Simulation Initiative Tech Base: FY2007 Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    May, J; Chen, R; Jefferson, D

    The Petascale Simulation Initiative began as an LDRD project in the middle of Fiscal Year 2004. The goal of the project was to develop techniques to allow large-scale scientific simulation applications to better exploit the massive parallelism that will come with computers running at petaflop speeds. One of the major products of this work was the design and prototype implementation of a programming model and a runtime system that let developers extend data-parallel applications to use task parallelism. By adopting task parallelism, applications can use processing resources more flexibly, exploit multiple forms of parallelism, and support more sophisticated multiscale and multiphysics models. Our programming model was originally called the Symponents Architecture but is now known as Cooperative Parallelism, and the runtime software that supports it is called Coop. (However, we sometimes refer to the programming model as Coop for brevity.) We have documented the programming model and runtime system in a submitted conference paper [1]. This report focuses on the specific accomplishments of the Cooperative Parallelism project (as we now call it) under Tech Base funding in FY2007. Development and implementation of the model under LDRD funding alone proceeded to the point of demonstrating a large-scale materials modeling application using Coop on more than 1300 processors by the end of FY2006. Beginning in FY2007, the project received funding from both LDRD and the Computation Directorate Tech Base program. Later in the year, after the three-year term of the LDRD funding ended, the ASC program supported the project with additional funds. The goal of the Tech Base effort was to bring Coop from a prototype to a production-ready system that a variety of LLNL users could work with.
Specifically, the major tasks that we planned for the project were: (1) port SARS [the former name of the Coop runtime system] to another LLNL platform, probably Thunder or Peloton (depending on when Peloton becomes available); (2) improve SARS's robustness and ease of use, and develop user documentation; and (3) work with LLNL code teams to help them determine how Symponents could benefit their applications. The original funding request was $296,000 for the year, and we eventually received $252,000. The remainder of this report describes our efforts and accomplishments for each of the goals listed above.

  1. Research and applications: Artificial intelligence

    NASA Technical Reports Server (NTRS)

    Chaitin, L. J.; Duda, R. O.; Johanson, P. A.; Raphael, B.; Rosen, C. A.; Yates, R. A.

    1970-01-01

    The program for developing techniques in artificial intelligence and applying them to the control of mobile automatons that carry out tasks autonomously is reported. Visual scene analysis, short-term problem solving, and long-term problem solving are discussed, along with the PDP-15 simulator, the LISP-FORTRAN-MACRO interface, resolution strategies, and cost effectiveness.

  2. An iconic programming language for sensor-based robots

    NASA Technical Reports Server (NTRS)

    Gertz, Matthew; Stewart, David B.; Khosla, Pradeep K.

    1993-01-01

    In this paper we describe an iconic programming language called Onika for sensor-based robotic systems. Onika is both modular and reconfigurable and can be used with any system architecture and real-time operating system. Onika is also a multi-level programming environment wherein tasks are built by connecting a series of icons which, in turn, can be defined in terms of other icons at the lower levels. Expert users are also allowed to use control block form to define servo tasks. The icons in Onika are both shape and color coded, like the pieces of a jigsaw puzzle, thus providing a form of error control in the development of high level applications.

  3. Parallel Programming Strategies for Irregular Adaptive Applications

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak; Biegel, Bryan (Technical Monitor)

    2001-01-01

    Achieving scalable performance for dynamic irregular applications is eminently challenging. Traditional message-passing approaches have been making steady progress towards this goal; however, they suffer from complex implementation requirements. The use of a global address space greatly simplifies the programming task, but can degrade the performance for such computations. In this work, we examine two typical irregular adaptive applications, Dynamic Remeshing and N-Body, under competing programming methodologies and across various parallel architectures. The Dynamic Remeshing application simulates flow over an airfoil, and refines localized regions of the underlying unstructured mesh. The N-Body experiment models two neighboring Plummer galaxies that are about to undergo a merger. Both problems demonstrate dramatic changes in processor workloads and interprocessor communication with time; thus, dynamic load balancing is a required component.

  4. Basic and applied research program. Semiannual report, July-December 1978

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Butler, B.L.

    1979-12-01

    The status of research projects in the Basic and Applied Research Program at SERI is presented for the semiannual period ending December 31, 1978. The five tasks in this program are grouped into Materials Research and Development, Materials Processing and Development, Photoconversion Research, Exploratory Research, and Energy Resource and Assessment, and have been carried out by personnel in the Materials, Bio/Chemical Conversion, and Energy Resource and Assessment Branches. Subtask elements in the task areas include coatings and films, polymers, metallurgy and corrosion, optical materials, and surfaces and interfaces in materials research and development; photochemistry, photoelectrochemistry, and photobiology in photoconversion; thin glass mirror development, silver degradation of mirrors, hail resistance of thin glass, thin glass manufacturing, cellular glass development, and sorption by desiccants in materials processing and development; and thermoelectric energy conversion, desiccant cooling, photothermal degradation, and amorphous materials in exploratory research. For each task or subtask element, the overview, scope, goals, approach, apparatus and equipment, and supporting subcontracts are presented, as applicable, in addition to the status of the projects in each task or subtask. Listings of publications and reports authored by personnel associated with the Basic and Applied Research Program and prepared or published during 1978 are also included.

  5. Creep-fatigue life prediction for engine hot section materials (isotropic)

    NASA Technical Reports Server (NTRS)

    Moreno, V.

    1982-01-01

    The objectives of this program are the investigation of fundamental approaches to high temperature crack initiation life prediction, the identification of specific modeling strategies, and the development of specific models for component-relevant loading conditions. A survey of the hot section material/coating systems used throughout the gas turbine industry is included. Two material/coating systems will be identified for the program. The material/coating system designated as the base system shall be used throughout Tasks 1-12. The alternate material/coating system will be used only in Task 12 for further evaluation of the models developed on the base material. In Task 2, candidate life prediction approaches will be screened based on a set of criteria that includes experience with the approaches within the literature, correlation with isothermal data generated on the base material, and judgements about the applicability of each approach to the complex cycles to be considered in the option program. The two most promising approaches will be identified. Task 3 further evaluates the best approach using additional base material fatigue testing, including verification tests. Task 4 consists of technical, schedule, financial, and all other reporting requirements in accordance with the Reports of Work clause.

  6. Cross validation issues in multiobjective clustering

    PubMed Central

    Brusco, Michael J.; Steinley, Douglas

    2018-01-01

    The implementation of multiobjective programming methods in combinatorial data analysis is an emergent area of study with a variety of pragmatic applications in the behavioural sciences. Most notably, multiobjective programming provides a tool for analysts to model trade-offs among competing criteria in clustering, seriation, and unidimensional scaling tasks. Although multiobjective programming has considerable promise, the technique can produce numerically appealing results that lack empirical validity. With this issue in mind, the purpose of this paper is to briefly review viable areas of application for multiobjective programming and, more importantly, to outline the importance of cross-validation when using this method in cluster analysis. PMID:19055857
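
    The trade-off modelling mentioned above can be made concrete with a toy scalarization. The sketch below is an invented example, not the paper's method: two competing clustering criteria (within-cluster compactness and cluster-size balance) are combined into a single weighted objective, the basic device multiobjective programming uses to expose trade-offs.

```python
# Hedged sketch: weighted-sum scalarization of two clustering criteria.
# Data, weights, and criteria are invented for illustration.

def wcss(points, labels, centers):
    """Within-cluster sum of squares (compactness criterion, 1-D data)."""
    return sum((p - centers[l]) ** 2 for p, l in zip(points, labels))

def imbalance(labels, k):
    """Cluster-size imbalance (a competing balance criterion)."""
    sizes = [labels.count(c) for c in range(k)]
    return max(sizes) - min(sizes)

def scalarized(points, labels, centers, k, lam):
    """Trade-off objective: lam weights compactness against balance."""
    return lam * wcss(points, labels, centers) + (1 - lam) * imbalance(labels, k)

points = [0.0, 0.1, 0.2, 5.0, 5.1]
labels = [0, 0, 0, 1, 1]
centers = [0.1, 5.05]
obj = scalarized(points, labels, centers, 2, lam=0.5)
```

    Sweeping lam from 0 to 1 traces out candidate solutions along the trade-off curve; the paper's point is that such numerically appealing solutions still need cross-validation before they are trusted.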

  7. Personal Electronic Aid for Maintenance

    DTIC Science & Technology

    1989-03-01

    …as input to the Department of Defense Computer-Aided Acquisition and Logistics Support program and to the development of the Militarized Electronic… …Manpower and Training Technology Development Program. This summary report of the Personal Electronic Aid for Maintenance (PEAM) was prepared by the…

  8. Toward a Natural Speech Understanding System

    DTIC Science & Technology

    1989-10-01

    …error rates for distinctive words produced in isolation by a single speaker, and their simple programming requirements. Template-matching systems rank… (Contract F30602-81-C-0193, Human Resources Laboratory.)

  9. Obtaining Content Weights for Test Specifications from Job Analysis Task Surveys: An Application of the Many-Facets Rasch Model

    ERIC Educational Resources Information Center

    Wang, Ning; Stahl, John

    2012-01-01

    This article discusses the use of the Many-Facets Rasch Model, via the FACETS computer program (Linacre, 2006a), to scale job/practice analysis survey data as well as to combine multiple rating scales into single composite weights representing the tasks' relative importance. Results from the Many-Facets Rasch Model are compared with those…
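
    For readers unfamiliar with the model, the core of a many-facets Rasch analysis is a logistic form in which each facet (person ability, item difficulty, rater severity) contributes additively to the logit. A minimal dichotomous sketch, with invented parameter values:

```python
# Hedged sketch of the dichotomous many-facets Rasch model: the probability
# of success is logistic in (ability - item difficulty - rater severity).
# Parameter values below are invented for illustration.
import math

def mfrm_prob(theta, difficulty, severity):
    """P(success) for one person/item/rater combination."""
    logit = theta - difficulty - severity
    return 1.0 / (1.0 + math.exp(-logit))

# Ability exactly offsets the combined difficulty and severity -> p = 0.5.
p = mfrm_prob(theta=1.0, difficulty=0.5, severity=0.5)
```

    Programs such as FACETS estimate all of these parameters jointly from the rating data; the sketch only shows the probability model they share.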

  10. Automatic translation of MPI source into a latency-tolerant, data-driven form

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nguyen, Tan; Cicotti, Pietro; Bylaska, Eric

    Hiding communication behind useful computation is an important performance programming technique but remains an inscrutable programming exercise even for the expert. We present Bamboo, a code transformation framework that can realize communication overlap in applications written in MPI without the need to intrusively modify the source code. We reformulate MPI source into a task dependency graph representation, which partially orders the tasks, enabling the program to execute in a data-driven fashion under the control of an external runtime system. Experimental results demonstrate that Bamboo significantly reduces communication delays while requiring only modest amounts of programmer annotation for a variety of applications and platforms, including those employing co-processors and accelerators. Moreover, Bamboo's performance meets or exceeds that of labor-intensive hand coding. As a result, the translator is more than a means of hiding communication costs automatically; it demonstrates the utility of semantic level optimization against a well-known library.
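
    The data-driven execution style Bamboo targets can be sketched with a toy scheduler. The example below is not Bamboo's API; it is a minimal Kahn-style executor over an invented task dependency graph, firing each task once all of its predecessors have completed.

```python
# Hedged sketch: data-driven execution over a task dependency graph
# (Kahn's algorithm). Graph and task names are invented, not Bamboo's.
from collections import deque

def run_data_driven(deps, action):
    """deps maps task -> set of prerequisite tasks; 'action' runs a task.
    Executes every task once, respecting the partial order."""
    remaining = {t: set(d) for t, d in deps.items()}
    dependents = {t: [] for t in deps}
    for t, d in deps.items():
        for p in d:
            dependents[p].append(t)
    ready = deque(t for t, d in remaining.items() if not d)
    order = []
    while ready:
        t = ready.popleft()
        action(t)
        order.append(t)
        for succ in dependents[t]:        # retire this task from successors
            remaining[succ].discard(t)
            if not remaining[succ]:       # all inputs ready: task can fire
                ready.append(succ)
    return order

log = []
order = run_data_driven(
    {"recv": set(), "compute": {"recv"}, "send": {"compute"}, "overlap": set()},
    log.append,
)
```

    Because "overlap" has no dependence on the communication chain, a runtime is free to run it while "recv" is in flight; that freedom is what lets latency hide behind computation.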

  11. Automatic translation of MPI source into a latency-tolerant, data-driven form

    DOE PAGES

    Nguyen, Tan; Cicotti, Pietro; Bylaska, Eric; ...

    2017-03-06

    Hiding communication behind useful computation is an important performance programming technique but remains an inscrutable programming exercise even for the expert. We present Bamboo, a code transformation framework that can realize communication overlap in applications written in MPI without the need to intrusively modify the source code. We reformulate MPI source into a task dependency graph representation, which partially orders the tasks, enabling the program to execute in a data-driven fashion under the control of an external runtime system. Experimental results demonstrate that Bamboo significantly reduces communication delays while requiring only modest amounts of programmer annotation for a variety of applications and platforms, including those employing co-processors and accelerators. Moreover, Bamboo's performance meets or exceeds that of labor-intensive hand coding. As a result, the translator is more than a means of hiding communication costs automatically; it demonstrates the utility of semantic level optimization against a well-known library.

  12. Automatic translation of MPI source into a latency-tolerant, data-driven form

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nguyen, Tan; Cicotti, Pietro; Bylaska, Eric

    Hiding communication behind useful computation is an important performance programming technique but remains an inscrutable programming exercise even for the expert. We present Bamboo, a code transformation framework that can realize communication overlap in applications written in MPI without the need to intrusively modify the source code. Bamboo reformulates MPI source into the form of a task dependency graph that expresses a partial ordering among tasks, enabling the program to execute in a data-driven fashion under the control of an external runtime system. Experimental results demonstrate that Bamboo significantly reduces communication delays while requiring only modest amounts of programmer annotation for a variety of applications and platforms, including those employing co-processors and accelerators. Moreover, Bamboo's performance meets or exceeds that of labor-intensive hand coding. The translator is more than a means of hiding communication costs automatically; it demonstrates the utility of semantic level optimization against a well-known library.

  13. Exploring Asynchronous Many-Task Runtime Systems toward Extreme Scales

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Knight, Samuel; Baker, Gavin Matthew; Gamell, Marc

    2015-10-01

    Major exascale computing reports indicate a number of software challenges to meet the dramatic change of system architectures in the near future. While a several-orders-of-magnitude increase in parallelism is the most commonly cited of these, hurdles also include performance heterogeneity of compute nodes across the system, increased imbalance between computational capacity and I/O capabilities, frequent system interrupts, and complex hardware architectures. Asynchronous task-parallel programming models show great promise in addressing these issues, but are not yet fully understood nor developed sufficiently for computational science and engineering application codes. We address these knowledge gaps through quantitative and qualitative exploration of leading candidate solutions in the context of engineering applications at Sandia. In this poster, we evaluate the MiniAero code ported to three leading candidate programming models (Charm++, Legion and UINTAH) to examine whether these models permit insertion of new programming model elements into an existing code base.

  14. SOLID STATE ENERGY CONVERSION ALLIANCE DELPHI SOLID OXIDE FUEL CELL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steven Shaffer; Sean Kelly; Subhasish Mukerjee

    2003-12-08

    The objective of Phase I under this project is to develop a 5 kW Solid Oxide Fuel Cell power system for a range of fuels and applications. During Phase I, the following will be accomplished: Develop and demonstrate technology transfer efforts on a 5 kW stationary distributed power generation system that incorporates steam reforming of natural gas with the option of piped-in water (Demonstration System A). Initiate development of a 5 kW system for later mass-market automotive auxiliary power unit application, which will incorporate Catalytic Partial Oxidation (CPO) reforming of gasoline, with anode exhaust gas injected into an ultra-lean burn internal combustion engine. This technical progress report covers work performed by Delphi from January 1, 2003 to June 30, 2003, under Department of Energy Cooperative Agreement DE-FC-02NT41246. This report highlights technical results of the work performed under the following tasks: Task 1 System Design and Integration; Task 2 Solid Oxide Fuel Cell Stack Developments; Task 3 Reformer Developments; Task 4 Development of Balance of Plant (BOP) Components; Task 5 Manufacturing Development (Privately Funded); Task 6 System Fabrication; Task 7 System Testing; Task 8 Program Management; and Task 9 Stack Testing with Coal-Based Reformate.

  15. Rehabilitation robotics for the upper extremity: review with new directions for orthopaedic disorders.

    PubMed

    Hakim, Renée M; Tunis, Brandon G; Ross, Michael D

    2017-11-01

    The focus of research using technological innovations such as robotic devices has been on interventions to improve upper extremity function in neurologic populations, particularly patients with stroke. There is a growing body of evidence describing rehabilitation programs using various types of supportive/assistive and/or resistive robotic and virtual reality-enhanced devices to improve outcomes for patients with neurologic disorders. The most promising approaches are task-oriented, based on current concepts of motor control/learning and practice-induced neuroplasticity. Based on this evidence, we describe application and feasibility of virtual reality-enhanced robotics integrated with current concepts in orthopaedic rehabilitation shifting from an impairment-based focus to inclusion of more intense, task-specific training for patients with upper extremity disorders, specifically emphasizing the wrist and hand. The purpose of this paper is to describe virtual reality-enhanced rehabilitation robotic devices, review evidence of application in patients with upper extremity deficits related to neurologic disorders, and suggest how this technology and task-oriented rehabilitation approach can also benefit patients with orthopaedic disorders of the wrist and hand. We will also discuss areas for further research and development using a task-oriented approach and a commercially available haptic robotic device to focus on training of grasp and manipulation tasks. Implications for Rehabilitation There is a growing body of evidence describing rehabilitation programs using various types of supportive/assistive and/or resistive robotic and virtual reality-enhanced devices to improve outcomes for patients with neurologic disorders. The most promising approaches using rehabilitation robotics are task-oriented, based on current concepts of motor control/learning and practice-induced neuroplasticity. 
    Based on the evidence in neurologic populations, virtual reality-enhanced robotics may be integrated with current concepts in orthopaedic rehabilitation, shifting from an impairment-based focus to inclusion of more intense, task-specific training for patients with UE disorders, specifically emphasizing the wrist and hand. Clinical application of a task-oriented approach may be accomplished using a commercially available haptic robotic device to focus on training of grasp and manipulation tasks.

  16. Application of advanced control techniques to aircraft propulsion systems

    NASA Technical Reports Server (NTRS)

    Lehtinen, B.

    1984-01-01

    Two programs are described which involve the application of advanced control techniques to the design of engine control algorithms. Multivariable control theory is used in the F100 MVCS (multivariable control synthesis) program to design controls which coordinate the control inputs for improved engine performance. A systematic method for handling a complex control design task is given. Methods of analytical redundancy are aimed at increasing control system reliability. The F100 DIA (detection, isolation, and accommodation) program, which investigates the use of software to replace or augment hardware redundancy for certain critical engine sensors, is described.

  17. Behavioral Scientists (AFSC 2675), Scientific Managers (AFSC 26169), and Related Specialties.

    DTIC Science & Technology

    1984-12-01

    …K APPLICATIONS OF RESEARCH; L MANAGING RESEARCH OR APPLICATIONS PROGRAMS; M ORGANIZATIONAL IMPROVEMENT FUNCTIONS; N ACADEMIC INSTRUCTOR… …time ratings for each task. For the purpose of organizing individual jobs into similar types of work, an automated job clustering program was used. This…

  18. Application of GA package in functional packaging

    NASA Astrophysics Data System (ADS)

    Belousova, D. A.; Noskova, E. E.; Kapulin, D. V.

    2018-05-01

    An approach to an application program for the task of configuring the elements of a commutation circuit in the design of radio-electronic equipment, on the basis of a genetic algorithm, is offered. The efficiency of the approach for commutation circuits with different characteristics in computer-aided design for radio-electronic manufacturing is shown. A prototype of the computer-aided design subsystem, based on a GA package for R with a set of general functions for the optimization of multivariate models, is programmed.
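
    The genetic-algorithm idea can be illustrated with a toy placement problem. The paper's package is in R; the Python sketch below is an invented stand-in that evolves a linear ordering of circuit elements to shorten total net wirelength, using elitist selection and swap mutation.

```python
# Illustrative GA sketch (not the paper's R package): evolve a 1-D placement
# of circuit elements to minimize total net wirelength. Netlist is invented.
import random

def wirelength(order, nets):
    """Sum of distances between connected elements in a linear placement."""
    pos = {e: i for i, e in enumerate(order)}
    return sum(abs(pos[a] - pos[b]) for a, b in nets)

def evolve(elements, nets, generations=200, pop_size=20, seed=1):
    rng = random.Random(seed)
    pop = [rng.sample(elements, len(elements)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda o: wirelength(o, nets))
        survivors = pop[: pop_size // 2]             # elitist selection
        children = []
        for parent in survivors:
            child = parent[:]
            i, j = rng.randrange(len(child)), rng.randrange(len(child))
            child[i], child[j] = child[j], child[i]  # swap mutation
            children.append(child)
        pop = survivors + children
    return min(pop, key=lambda o: wirelength(o, nets))

elements = ["a", "b", "c", "d"]
nets = [("a", "b"), ("b", "c"), ("c", "d")]
best = evolve(elements, nets)
```

    A production placement GA would add crossover operators and a cost model reflecting actual routing, but the selection/mutation loop is the same shape.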

  19. Study of robotics systems applications to the space station program

    NASA Technical Reports Server (NTRS)

    Fox, J. C.

    1983-01-01

    Applications of robotics systems to potential uses of the Space Station as an assembly facility, and secondarily as a servicing facility, are considered. A typical robotics system mission is described along with the pertinent application guidelines and Space Station environmental assumptions utilized in developing the robotic task scenarios. A functional description of a supervised dual-robot space structure construction system is given, and four key areas of robotic technology are defined, described, and assessed. Alternate technologies for implementing the more routine space technology support subsystems that will be required to support the Space Station robotic systems in assembly and servicing tasks are briefly discussed. The environmental conditions impacting on the robotic configuration design and operation are reviewed.

  20. Creating an iPhone Application for Collecting Continuous ABC Data

    ERIC Educational Resources Information Center

    Whiting, Seth W.; Dixon, Mark R.

    2012-01-01

    This paper provides an overview and task analysis for creating a continuous ABC data- collection application using Xcode on a Mac computer. Behavior analysts can program an ABC data collection system, complete with a customized list of target clients, antecedents, behaviors, and consequences to be recorded, and have the data automatically sent to…

  1. NASA/MSFC FY-83 Atmospheric Processes Research Review

    NASA Technical Reports Server (NTRS)

    Turner, R. E. (Compiler)

    1983-01-01

    The atmospheric processes research program was reviewed. Research tasks sponsored by the NASA Office of Space Science and Applications, Earth Sciences and Applications Division in the areas of upper atmosphere, global weather, and mesoscale processes are discussed. Included are the research project summaries, together with the agenda and other information about the meeting.

  2. Civil Technology Applications. Teacher Edition [and] Student Edition.

    ERIC Educational Resources Information Center

    Schertz, Karen

    Teacher and student editions of Civil Technology Applications form one unit in a series of competency-based instructional materials for drafting and civil technology programs. The unit includes the technical content and tasks necessary for a student to be employed as a drafter or civil technician in a civil engineering firm. Introductory pages in the teacher…

  3. Summary results of the DOE flywheel development effort

    NASA Astrophysics Data System (ADS)

    Olszewski, M.; Martin, J. F.

    1984-11-01

    The technology and applications evaluation task focuses on defining performance and cost requirements for flywheels in the various areas of application. To date the DOE program has focused on automotive applications. The composite materials effort entails the testing of new commercial composites to determine their engineering properties. The rotor and containment development work uses data from these program elements to design and fabricate flywheels. The flywheels are then tested at the Oak Ridge Flywheel Evaluation Laboratory and their performance is evaluated to indicate possible areas for improvement. Once a rotor has been fully developed it is transferred to the private sector.

  4. NBS (National Bureau of Standards): Materials measurements

    NASA Technical Reports Server (NTRS)

    Manning, J. R.

    1984-01-01

    Work in support of NASA's Microgravity Science and Applications Program is described. The results of the following three tasks are given in detail: (1) surface tensions and their variations with temperature and impurities; (2) convection during unidirectional solidification; and (3) measurement of high temperature thermophysical properties. Tasks 1 and 2 were directed toward determining how the reduced gravity obtained in space flight can affect convection and solidification processes. Emphasis in task 3 was on development of levitation and containerless processing techniques which can be applied in space flight to provide thermodynamic measurements of reactive materials.

  5. Quality Control Analysis of Selected Aspects of Programs Administered by the Bureau of Student Financial Assistance. Task 1 and Quality Control Sample; Error-Prone Modeling Analysis Plan.

    ERIC Educational Resources Information Center

    Saavedra, Pedro; And Others

    Parameters and procedures for developing an error-prone model (EPM) to predict financial aid applicants who are likely to misreport on Basic Educational Opportunity Grant (BEOG) applications are introduced. Specifications to adapt these general parameters to secondary data analysis of the Validation, Edits, and Applications Processing Systems…

  6. Program definition and assessment overview. [for thermal energy storage project management

    NASA Technical Reports Server (NTRS)

    Gordon, L. H.

    1980-01-01

    The implementation of a program-level assessment of thermal energy storage technology thrusts for the near and far term, to assure an overall coherent energy storage program, is considered. The identification and definition of potential thermal energy storage applications, the definition of technology requirements, and appropriate market sectors are discussed, along with the necessary coordination, planning, and preparation associated with program reviews, workshops, multi-year plans, and annual operating plans for the major laboratory tasks.

  7. Facilities and the Air Force Systems Acquisition Process.

    DTIC Science & Technology

    1985-05-01

    …The Air Force is in the midst of its most extensive peacetime force modernization programs… …will answer the following questions: a. Are facility requirements anticipated and adequately scoped during the early phase of program development so…

  8. The Bittersweet Task of Running a Grant Program

    ERIC Educational Resources Information Center

    Markin, Karen M.

    2013-01-01

    Running a grant program for the first time can feel overwhelming. The work is time-consuming, requires attention to many details, and is accompanied by pressure from applicants who are desperate for money and prompt decisions. This article presents a list of all of the factors educators have to consider. From establishing a timeline and drafting…

  9. Functional Utilization of DABS Data Link Discrete Address Beacon System

    DOT National Transportation Integrated Search

    1978-10-01

    The report describes the output of a Task Force established by FAA Headquarters, SRDS, Robert Wedan, in June 1977 to study and recommend potential applications for Data Link to the DABS Experimentation Program.

  10. Argobots: A Lightweight Low-Level Threading and Tasking Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seo, Sangmin; Amer, Abdelhalim; Balaji, Pavan

    In the past few decades, a number of user-level threading and tasking models have been proposed in the literature to address the shortcomings of OS-level threads, primarily with respect to cost and flexibility. Current state-of-the-art user-level threading and tasking models, however, are either too specific to applications or architectures or are not as powerful or flexible. In this paper, we present Argobots, a lightweight, low-level threading and tasking framework that is designed as a portable and performant substrate for high-level programming models or runtime systems. Argobots offers a carefully designed execution model that balances generality of functionality with providing a rich set of controls to allow specialization by the user or high-level programming model. We describe the design, implementation, and optimization of Argobots and present integrations with three example high-level models: OpenMP, MPI, and co-located I/O service. Evaluations show that (1) Argobots outperforms existing generic threading runtimes; (2) our OpenMP runtime offers more efficient interoperability capabilities than production OpenMP runtimes do; (3) when MPI interoperates with Argobots instead of Pthreads, it enjoys reduced synchronization costs and better latency hiding capabilities; and (4) I/O service with Argobots reduces interference with co-located applications, achieving performance competitive with that of the Pthreads version.
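
    The user-level tasking idea behind Argobots (though not its actual C API) can be sketched with a cooperative scheduler: work units yield explicitly, so a context switch is an ordinary function return rather than an OS thread switch.

```python
# Hedged sketch of cooperative user-level tasking (not Argobots' API):
# generator-based work units scheduled round-robin in user space.
from collections import deque

class Scheduler:
    def __init__(self):
        self.ready = deque()

    def spawn(self, gen):
        self.ready.append(gen)

    def run(self):
        while self.ready:
            task = self.ready.popleft()
            try:
                next(task)               # run until the task yields
                self.ready.append(task)  # cooperative: requeue it
            except StopIteration:
                pass                     # task finished

trace = []

def worker(name, steps):
    for i in range(steps):
        trace.append((name, i))
        yield                            # explicit yield point

sched = Scheduler()
sched.spawn(worker("A", 2))
sched.spawn(worker("B", 2))
sched.run()
```

    Frameworks like Argobots generalize this with multiple execution streams, pools, and stackful contexts, but the cheap-switch principle is the same.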

  11. DARMA v. Beta 0.5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hollman, David; Lifflander, Jonathon; Wilke, Jeremiah

    2017-03-14

    DARMA is a portability layer for asynchronous many-task (AMT) runtime systems. AMT runtime systems show promise to mitigate challenges imposed by next generation high performance computing architectures. However, current runtime system technologies are not production-ready. DARMA is a portability layer that seeks to insulate application developers from idiosyncrasies of individual runtime systems, thereby facilitating application-developer use of these technologies. DARMA comprises a frontend application programming interface (API) for application developers, a backend API for runtime system developers, and a translation layer that translates frontend API calls into backend API calls. Application developers use C++ abstractions to annotate both data and tasks in their code. The DARMA translation layer uses C++ template metaprogramming to capture data-task dependencies, and provides this information to a potential backend runtime system via a series of backend API calls.

  12. Consolidated fuel reprocessing program: The implications of force reflection for teleoperation in space

    NASA Technical Reports Server (NTRS)

    Draper, John V.; Herndon, Joseph N.; Moore, Wendy E.

    1987-01-01

    Previous research on teleoperator force feedback is reviewed and results of a testing program which assessed the impact of force reflection on teleoperator task performance are reported. Force reflection is a type of force feedback in which the forces acting on the remote portion of the teleoperator are displayed to the operator by back-driving the master controller. The testing program compared three force reflection levels: 4 to 1 (four units of force on the slave produce one unit of force at the master controller), 1 to 1, and infinity to 1 (no force reflection). Time required to complete tasks, rate of occurrence of errors, the maximum force applied to task components, and variability in forces applied to components during completion of representative remote handling tasks were used as dependent variables. Operators exhibited lower error rates, lower peak forces, and more consistent application of forces using force reflection than they did without it. These data support the hypothesis that force reflection provides useful information for teleoperator users. The earlier literature and the results of the experiment are discussed in terms of their implications for space-based teleoperator systems. The discussion describes the impact of force reflection on task completion performance and task strategies, as suggested by the literature. It is important to understand the trade-offs involved in using telerobotic systems with and without force reflection.

  13. Selling points: What cognitive abilities are tapped by casual video games?

    PubMed Central

    Baniqued, Pauline L.; Lee, Hyunkyu; Voss, Michelle W.; Basak, Chandramallika; Cosman, Joshua D.; DeSouza, Shanna; Severson, Joan; Salthouse, Timothy A.; Kramer, Arthur F.

    2013-01-01

    The idea that video games or computer-based applications can improve cognitive function has led to a proliferation of programs claiming to “train the brain.” However, there is often little scientific basis in the development of commercial training programs, and many research-based programs yield inconsistent or weak results. In this study, we sought to better understand the nature of cognitive abilities tapped by casual video games and thus reflect on their potential as a training tool. A moderately large sample of participants (n=209) played 20 web-based casual games and performed a battery of cognitive tasks. We used cognitive task analysis and multivariate statistical techniques to characterize the relationships between performance metrics. We validated the cognitive abilities measured in the task battery, examined a task analysis-based categorization of the casual games, and then characterized the relationship between game and task performance. We found that games categorized to tap working memory and reasoning were robustly related to performance on working memory and fluid intelligence tasks, with fluid intelligence best predicting scores on working memory and reasoning games. We discuss these results in the context of overlap in cognitive processes engaged by the cognitive tasks and casual games, and within the context of assessing near and far transfer. While this is not a training study, these findings provide a methodology to assess the validity of using certain games as training and assessment devices for specific cognitive abilities, and shed light on the mixed transfer results in the computer-based training literature. Moreover, the results can inform design of a more theoretically-driven and methodologically-sound cognitive training program. PMID:23246789

  14. Selling points: What cognitive abilities are tapped by casual video games?

    PubMed

    Baniqued, Pauline L; Lee, Hyunkyu; Voss, Michelle W; Basak, Chandramallika; Cosman, Joshua D; Desouza, Shanna; Severson, Joan; Salthouse, Timothy A; Kramer, Arthur F

    2013-01-01

    The idea that video games or computer-based applications can improve cognitive function has led to a proliferation of programs claiming to "train the brain." However, there is often little scientific basis in the development of commercial training programs, and many research-based programs yield inconsistent or weak results. In this study, we sought to better understand the nature of cognitive abilities tapped by casual video games and thus reflect on their potential as a training tool. A moderately large sample of participants (n=209) played 20 web-based casual games and performed a battery of cognitive tasks. We used cognitive task analysis and multivariate statistical techniques to characterize the relationships between performance metrics. We validated the cognitive abilities measured in the task battery, examined a task analysis-based categorization of the casual games, and then characterized the relationship between game and task performance. We found that games categorized to tap working memory and reasoning were robustly related to performance on working memory and fluid intelligence tasks, with fluid intelligence best predicting scores on working memory and reasoning games. We discuss these results in the context of overlap in cognitive processes engaged by the cognitive tasks and casual games, and within the context of assessing near and far transfer. While this is not a training study, these findings provide a methodology to assess the validity of using certain games as training and assessment devices for specific cognitive abilities, and shed light on the mixed transfer results in the computer-based training literature. Moreover, the results can inform design of a more theoretically-driven and methodologically-sound cognitive training program. Copyright © 2012 Elsevier B.V. All rights reserved.

  15. Feasibility of remotely manipulated welding in space: A step in the development of novel joining technologies

    NASA Technical Reports Server (NTRS)

    Masubuchi, K.; Agapakis, J. E.; Debiccari, A.; Vonalt, C.

    1985-01-01

    A six-month research program entitled Feasibility of Remotely Manipulated Welding in Space - A Step in the Development of Novel Joining Technologies was performed at the Massachusetts Institute of Technology for the Office of Space Science and Applications, NASA, under Contract No. NASW-3740. The work was performed as a part of the Innovative Utilization of the Space Station Program. The final report from M.I.T. was issued in September 1983. This paper presents a summary of the work performed under this contract. The objective of this research program was to initiate research for the development of packaged, remotely controlled welding systems for space construction and repair. The research effort included the following tasks: (1) identification of probable joining tasks in space; (2) identification of required levels of automation in space welding tasks; (3) development of novel space welding concepts; (4) development of recommended future studies; and (5) preparation of the final report.

  16. Dimensions of complexity in learning from interactive instruction. [for robotic systems deployed in space]

    NASA Technical Reports Server (NTRS)

    Huffman, Scott B.; Laird, John E.

    1992-01-01

    Robot systems deployed in space must exhibit flexibility. In particular, an intelligent robotic agent should not have to be reprogrammed for each of the various tasks it may face during the course of its lifetime. However, pre-programming knowledge for all of the possible tasks that may be needed is extremely difficult. Therefore, a powerful notion is that of an instructible agent, one which is able to receive task-level instructions and advice from a human advisor. An agent must do more than simply memorize the instructions it is given (this would amount to programming). Rather, after mapping instructions into task constructs that it can reason with, it must determine each instruction's proper scope of applicability. In this paper, we will examine the characteristics of instruction, and the characteristics of agents, that affect learning from instruction. We find that in addition to a myriad of linguistic concerns, both the situatedness of the instructions (their placement within the ongoing execution of tasks) and the prior domain knowledge of the agent have an impact on what can be learned.

  17. PALP: A Package for Analysing Lattice Polytopes with applications to toric geometry

    NASA Astrophysics Data System (ADS)

    Kreuzer, Maximilian; Skarke, Harald

    2004-02-01

    We describe our package PALP of C programs for calculations with lattice polytopes and applications to toric geometry, which is freely available on the internet. It contains routines for vertex and facet enumeration, computation of incidences and symmetries, as well as completion of the set of lattice points in the convex hull of a given set of points. In addition, there are procedures specialized to reflexive polytopes such as the enumeration of reflexive subpolytopes, and applications to toric geometry and string theory, like the computation of Hodge data and fibration structures for toric Calabi-Yau varieties. The package is well tested and optimized in speed as it was used for time-consuming tasks such as the classification of reflexive polyhedra in 4 dimensions and the creation and manipulation of very large lists of 5-dimensional polyhedra. While originally intended for low-dimensional applications, the algorithms work in any dimension and our key routine for vertex and facet enumeration compares well with existing packages.

    Program summary
    Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
    Title of program: PALP
    Catalogue identifier: ADSQ
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADSQ
    Computer for which the program is designed: Any computer featuring C
    Computers on which it has been tested: PCs, SGI Origin 2000, IBM RS/6000, COMPAQ GS140
    Operating systems under which the program has been tested: Linux, IRIX, AIX, OSF1
    Programming language used: C
    Memory required to execute with typical data: Negligible for most applications; highly variable for the analysis of large polytopes; no minimum, but strong effects on calculation time for some tasks
    Number of bits in a word: arbitrary
    Number of processors used: 1
    Has the code been vectorised or parallelized?: No
    Number of bytes in distributed program, including test data, etc.: 138 098
    Distribution format: tar gzip file
    Keywords: Lattice polytopes, facet enumeration, reflexive polytopes, toric geometry, Calabi-Yau manifolds, string theory, conformal field theory
    Nature of problem: Certain lattice polytopes called reflexive polytopes afford a combinatorial description of a very large class of Calabi-Yau manifolds in terms of toric geometry. These manifolds play an essential role in compactifications of string theory. While originally designed to handle and classify reflexive polytopes, with particular emphasis on problems relevant to string theory applications [M. Kreuzer and H. Skarke, Rev. Math. Phys. 14 (2002) 343], the package also handles standard questions (facet enumeration and similar problems) about arbitrary lattice polytopes very efficiently.
    Method of solution: Much of the code is straightforward programming, but certain key routines are optimized with respect to calculation time and the handling of large sets of data. A double description method (see, e.g., [D. Avis et al., Comput. Geometry 7 (1997) 265]) is used for the facet enumeration problem, lattice basis reduction for extended gcd, and a binary database structure for tasks involving large numbers of polytopes, such as classification problems.
    Restrictions on the complexity of the program: The only hard limitation comes from the fact that fixed integer arithmetic (32 or 64 bit) is used, allowing for input data (polytope coordinates) of roughly up to 10^9. Other parameters (dimension, numbers of points and vertices, etc.) can be set before compilation.
    Typical running time: Most tasks (typically: analysis of a four-dimensional reflexive polytope) can be performed interactively within milliseconds. The classification of all reflexive polytopes in four dimensions takes several processor years. The facet enumeration problem for higher-dimensional (e.g., 12-20) polytopes varies strongly with the dimension and structure of the polytope; here PALP's performance is similar to that of existing packages [Avis et al., Comput. Geometry 7 (1997) 265].
    Unusual features of the program: None
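
    PALP's reflexivity test works in any dimension; the core criterion can be illustrated in two dimensions, where a lattice polygon containing the origin in its interior is reflexive exactly when every facet lies on a line a·x = 1 with a primitive integer normal a. The following minimal sketch is plain Python, not part of PALP; the counterclockwise vertex ordering and the gcd normalization are choices of this illustration:

```python
from math import gcd

def facet_normals(vertices):
    """Primitive outward facet normals of a convex lattice polygon.

    vertices: counterclockwise list of (x, y) lattice points.
    Returns a list of (a, b, c) with a*x + b*y <= c for all polygon
    points, where (a, b) is a primitive integer normal of one edge.
    """
    facets = []
    n = len(vertices)
    for i in range(n):
        (x1, y1), (x2, y2) = vertices[i], vertices[(i + 1) % n]
        # Outward normal of the edge (x1,y1)->(x2,y2) for a CCW polygon.
        a, b = y2 - y1, x1 - x2
        g = gcd(abs(a), abs(b))
        a, b = a // g, b // g
        facets.append((a, b, a * x1 + b * y1))
    return facets

def is_reflexive(vertices):
    """A lattice polygon with the origin in its interior is reflexive
    iff every facet inequality has right-hand side exactly 1."""
    return all(c == 1 for (_, _, c) in facet_normals(vertices))

# The 2D cross-polytope (a lattice "diamond") is reflexive ...
print(is_reflexive([(1, 0), (0, 1), (-1, 0), (0, -1)]))   # True
# ... while scaling it by 2 moves its facets to lattice distance 2.
print(is_reflexive([(2, 0), (0, 2), (-2, 0), (0, -2)]))   # False
```

    The same criterion, stated facet by facet, is what a higher-dimensional test must check once the facet enumeration step has been done.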

  18. HAL/S language specification. Version IR-542

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The formal HAL/S language specification is documented with particular reference to the essentials of HAL/S syntax and semantics. The language is intended to satisfy virtually all of the flight software requirements of NASA programs. To achieve this, HAL/S incorporates a wide range of features, including applications-oriented data types and organizations, real-time control mechanisms, and constructs for systems programming tasks.

  19. The Development of Fuel Cell Technology for NASA's Human Spaceflight Program

    NASA Technical Reports Server (NTRS)

    Scott, John H.

    2007-01-01

    My task this morning is to review the history and current direction of fuel cell technology development for NASA's human spaceflight program and to compare it to the directions being taken in that field for The Hydrogen Economy. The concept of "The Hydrogen Economy" involves many applications for fuel cells, but for today's discussion, I'll focus on automobiles.

  20. DRACULA: Dynamic range control for broadcasting and other applications

    NASA Astrophysics Data System (ADS)

    Gilchrist, N. H. C.

    The BBC has developed a digital processor which is capable of reducing the dynamic range of audio in an unobtrusive manner. It is ideally suited to the task of controlling the level of musical programs. Operating as a self-contained dynamic range controller, the processor is suitable for controlling levels in conventional AM or FM broadcasting, or for applications such as the compression of program material for in-flight entertainment. It can, alternatively, be used to provide a supplementary signal in DAB (digital audio broadcasting) for optional dynamic compression in the receiver.

  1. Using NetCloak to develop server-side Web-based experiments without writing CGI programs.

    PubMed

    Wolfe, Christopher R; Reyna, Valerie F

    2002-05-01

    Server-side experiments use the Web server, rather than the participant's browser, to handle tasks such as random assignment, eliminating inconsistencies with Java and other client-side applications. Heretofore, experimenters wishing to create server-side experiments have had to write programs to create common gateway interface (CGI) scripts in programming languages such as Perl and C++. NetCloak uses simple, HTML-like commands to create CGIs. We used NetCloak to implement an experiment on probability estimation. Measurements of time on task and participants' IP addresses assisted quality control. Without prior training, in less than 1 month, we were able to use NetCloak to design and create a Web-based experiment and to help graduate students create three Web-based experiments of their own.
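
    The key idea, assignment decided on the server rather than in the browser, can be sketched without NetCloak itself. The following illustration uses only the Python standard library; the condition names and the balancing scheme are assumptions of this sketch, not NetCloak syntax:

```python
import random
import time

CONDITIONS = ["condition-a", "condition-b"]  # hypothetical condition names

class Assigner:
    """Server-side random assignment with rough balancing.

    Keeps a per-condition count so that assignment stays balanced:
    each participant is assigned at random among the least-used
    conditions (a simple form of blocked randomization)."""
    def __init__(self, conditions):
        self.counts = {c: 0 for c in conditions}
        self.log = []  # (timestamp, ip, condition) for quality control

    def assign(self, ip):
        fewest = min(self.counts.values())
        candidates = [c for c, n in self.counts.items() if n == fewest]
        condition = random.choice(candidates)
        self.counts[condition] += 1
        self.log.append((time.time(), ip, condition))
        return condition

assigner = Assigner(CONDITIONS)
for ip in ["10.0.0.1", "10.0.0.2", "10.0.0.3", "10.0.0.4"]:
    assigner.assign(ip)
print(assigner.counts)  # balanced: each condition used twice
```

    Because the count, log, and random draw all live on the server, every browser sees the same behavior, which is exactly the inconsistency problem that client-side scripting introduces.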

  2. Ground Robotic Hand Applications for the Space Program study (GRASP)

    NASA Astrophysics Data System (ADS)

    Grissom, William A.; Rafla, Nader I.

    1992-04-01

    This document reports on a NASA-STDP effort to address research interests of the NASA Kennedy Space Center (KSC) through a study entitled, Ground Robotic-Hand Applications for the Space Program (GRASP). The primary objective of the GRASP study was to identify beneficial applications of specialized end-effectors and robotic hand devices for automating any ground operations which are performed at the Kennedy Space Center. Thus, operations for expendable vehicles, the Space Shuttle and its components, and all payloads were included in the study. Typical benefits of automating operations, or augmenting human operators performing physical tasks, include: reduced costs; enhanced safety and reliability; and reduced processing turnaround time.

  3. The Transportable Applications Environment - An interactive design-to-production development system

    NASA Technical Reports Server (NTRS)

    Perkins, Dorothy C.; Howell, David R.; Szczur, Martha R.

    1988-01-01

    An account is given of the design philosophy and architecture of the Transportable Applications Environment (TAE), an executive program binding a system of applications programs into a single, easily operable whole. TAE simplifies the job of a system developer by furnishing a stable framework for system-building; it also integrates system activities, and cooperates with the host operating system in order to perform such functions as task-scheduling and I/O. The initial TAE human/computer interface supported command and menu interfaces, data displays, parameter-prompting, error-reporting, and online help. Recent extensions support graphics workstations with a window-based, modeless user interface.

  4. Ground Robotic Hand Applications for the Space Program study (GRASP)

    NASA Technical Reports Server (NTRS)

    Grissom, William A.; Rafla, Nader I. (Editor)

    1992-01-01

    This document reports on a NASA-STDP effort to address research interests of the NASA Kennedy Space Center (KSC) through a study entitled, Ground Robotic-Hand Applications for the Space Program (GRASP). The primary objective of the GRASP study was to identify beneficial applications of specialized end-effectors and robotic hand devices for automating any ground operations which are performed at the Kennedy Space Center. Thus, operations for expendable vehicles, the Space Shuttle and its components, and all payloads were included in the study. Typical benefits of automating operations, or augmenting human operators performing physical tasks, include: reduced costs; enhanced safety and reliability; and reduced processing turnaround time.

  5. Summary Findings from the AVT-191 Project to Assess Sensitivity Analysis and Uncertainty Quantification Methods for Military Vehicle Design

    NASA Technical Reports Server (NTRS)

    Benek, John A.; Luckring, James M.

    2017-01-01

    A NATO symposium held in Greece in 2008 identified many promising sensitivity analysis and uncertainty quantification technologies, but the maturity and suitability of these methods for realistic applications was not clear. The NATO Science and Technology Organization, Task Group AVT-191 was established to evaluate the maturity and suitability of various sensitivity analysis and uncertainty quantification methods for application to realistic vehicle development problems. The program ran from 2011 to 2015, and the work was organized into four discipline-centric teams: external aerodynamics, internal aerodynamics, aeroelasticity, and hydrodynamics. This paper summarizes findings and lessons learned from the task group.

  6. Controller resource management : what can we learn from aircrews?

    DOT National Transportation Integrated Search

    1995-07-01

    This paper provides an overview of the scientific literature regarding Crew Resource Management (CRM). It responds to tasking from the Office of Air Traffic Program Management to conduct studies addressing the application of team training models such...

  7. 30 CFR 1229.123 - Standards for audit activities.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... professional proficiency for the tasks required, including a knowledge of accounting, auditing, agency... shall maintain an independent attitude and appearance. (iii) Due professional care. Due professional... accordance with the generally accepted program audit standards (including the applicable General Accounting...

  8. Microgravity Science and Applications. Program Tasks and Bibliography for FY 1993

    NASA Technical Reports Server (NTRS)

    1994-01-01

    An annual report published by the Microgravity Science and Applications Division (MSAD) of NASA is presented. It represents a compilation of the Division's currently-funded ground, flight and Advanced Technology Development tasks. An overview and progress report for these tasks, including progress reports by principal investigators selected from the academic, industry and government communities, are provided. The document includes a listing of new bibliographic data provided by the principal investigators to reflect the dissemination of research data during FY 1993 via publications and presentations. The document also includes division research metrics and an index of the funded investigators. The document contains three sections and three appendices: Section 1 includes an introduction and metrics data, Section 2 is a compilation of the task reports in an order representative of its ground, flight or ATD status and the science discipline it represents, and Section 3 is the bibliography. The three appendices, in the order of presentation, are: Appendix A - a microgravity science acronym list, Appendix B - a list of guest investigators associated with a biotechnology task, and Appendix C - an index of the currently funded principal investigators.

  9. X-Windows Information Sharing Protocol Widget Class

    NASA Technical Reports Server (NTRS)

    Barry, Matthew R.

    2006-01-01

    The X-Windows Information Sharing Protocol (ISP) Widget Class ("Class" is used here in the object-oriented-programming sense of the word) was devised to simplify the task of implementing ISP graphical-user-interface (GUI) computer programs. ISP programming tasks require many method calls to identify, query, and interpret the connections and messages exchanged between a client and an ISP server. Most X-Windows GUI programs use widget sets or toolkits to facilitate management of complex objects. The widget standards facilitate construction of toolkits and application programs. The X-Windows Information Sharing Protocol (ISP) Widget Class encapsulates the client side of the ISP programming libraries within the framework of an X-Windows widget. Using the widget framework, X-Windows GUI programs can interact with ISP services in an abstract way and in the same manner as that of other graphical widgets, making it easier to write ISP GUI client programs. Wrapping ISP client services inside a widget framework enables a programmer to treat an ISP server interface as though it were a GUI. Moreover, an alternate subclass could implement another communication protocol in the same sort of widget.

  10. Analysis of pilot control strategy

    NASA Technical Reports Server (NTRS)

    Heffley, R. K.; Hanson, G. D.; Jewell, W. F.; Clement, W. F.

    1983-01-01

    Methods for nonintrusive identification of pilot control strategy and task execution dynamics are presented along with examples based on flight data. The specific analysis technique is the Nonintrusive Parameter Identification Procedure (NIPIP), which is described in a companion user's guide (NASA CR-170398). Quantification of pilot control strategy and task execution dynamics is discussed in general terms, followed by a more detailed description of how NIPIP can be applied. The examples are based on flight data obtained from the NASA F-8 digital fly-by-wire airplane. These examples involve various piloting tasks and control axes as well as a demonstration of how the dynamics of the aircraft itself are identified using NIPIP. Application of NIPIP to the AFTI/F-16 flight test program is discussed. Recommendations are made for flight test applications in general and refinement of NIPIP to include interactive computer graphics.
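
    The kind of nonintrusive identification described here, fitting an assumed pilot control law to recorded input/output data, can be illustrated with ordinary least squares. The sketch below is plain Python and is not the NIPIP algorithm itself; the proportional-plus-rate control-law form and the gain values are illustrative assumptions:

```python
import random

# Simulate a pilot commanding stick deflection u = Kp*e + Kd*edot + noise,
# where e is tracking error and edot its rate (hypothetical gains).
random.seed(0)
KP_TRUE, KD_TRUE = 2.5, 0.8
samples = []
for _ in range(200):
    e, edot = random.uniform(-1, 1), random.uniform(-1, 1)
    u = KP_TRUE * e + KD_TRUE * edot + random.gauss(0, 0.01)
    samples.append((e, edot, u))

# Normal equations for the two-parameter linear model u ~ Kp*e + Kd*edot.
see = sum(e * e for e, _, _ in samples)
sdd = sum(d * d for _, d, _ in samples)
sed = sum(e * d for e, d, _ in samples)
seu = sum(e * u for e, _, u in samples)
sdu = sum(d * u for _, d, u in samples)

# Solve the 2x2 system [see sed; sed sdd] [Kp Kd]' = [seu sdu]' by Cramer's rule.
det = see * sdd - sed * sed
kp = (seu * sdd - sed * sdu) / det
kd = (see * sdu - sed * seu) / det
print(f"Kp ~ {kp:.2f}, Kd ~ {kd:.2f}")
```

    Running such a fit over a sliding window of flight data, against several candidate control-law structures, is the general flavor of identifying a pilot's strategy without intruding on the task.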

  11. Revised and extended UTILITIES for the RATIP package

    NASA Astrophysics Data System (ADS)

    Nikkinen, J.; Fritzsche, S.; Heinäsmäki, S.

    2006-09-01

    During the last years, the RATIP package has been found useful for calculating the excitation and decay properties of free atoms. Based on the (relativistic) multiconfiguration Dirac-Fock method, this program is used to obtain accurate predictions of atomic properties and to analyze many recent experiments. The daily work with this package made an extension of its UTILITIES [S. Fritzsche, Comput. Phys. Comm. 141 (2001) 163] desirable in order to facilitate the data handling and interpretation of complex spectra. For this purpose, we make available an enlarged version of the UTILITIES which mainly supports the comparison with experiment as well as large Auger computations. Altogether 13 additional tasks have been appended to the program, together with a new menu structure to improve its interactive control.

    Program summary
    Title of program: RATIP
    Catalogue identifier: ADPD_v2_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADPD_v2_0
    Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
    Licensing provisions: none
    Reference in CPC to previous version: S. Fritzsche, Comput. Phys. Comm. 141 (2001) 163
    Catalogue identifier of previous version: ADPD
    Authors of previous version: S. Fritzsche, Department of Physics, University of Kassel, Heinrich-Plett-Strasse 40, D-34132 Kassel, Germany
    Does the new version supersede the original program?: yes
    Computer for which the new version is designed and others on which it has been tested: IBM RS 6000, PC Pentium II-IV
    Installations: University of Kassel (Germany), University of Oulu (Finland)
    Operating systems: IBM AIX, Linux, Unix
    Program language used in the new version: ANSI standard Fortran 90/95
    Memory required to execute with typical data: 300 kB
    No. of bits in a word: All real variables are parameterized by a selected kind parameter and can thus be adapted to any required precision if supported by the compiler. Currently, the kind parameter is set to double precision (two 32-bit words), as used also for other components of the RATIP package [S. Fritzsche, C.F. Fischer, C.Z. Dong, Comput. Phys. Comm. 124 (2000) 341; G. Gaigalas, S. Fritzsche, Comput. Phys. Comm. 134 (2001) 86; S. Fritzsche, Comput. Phys. Comm. 141 (2001) 163; S. Fritzsche, J. Elec. Spec. Rel. Phen. 114-116 (2001) 1155]
    No. of lines in distributed program, including test data, etc.: 231 813
    No. of bytes in distributed program, including test data, etc.: 3 977 387
    Distribution format: tar.gzip file
    Nature of the physical problem: In order to describe atomic excitation and decay properties quantitatively, large-scale computations are often needed. In the framework of the RATIP package, the UTILITIES support a variety of (small) tasks. For example, these tasks facilitate the file and data handling in large-scale applications or the interpretation of complex spectra.
    Method of solution: The revised UTILITIES now support a total of 29 subtasks, which are mainly concerned with the manipulation of output data obtained from other components of the RATIP package. Each of these tasks is realized by one or several subprocedures which have access to the corresponding modules of the main components. While the main menu defines seven groups of subtasks for data manipulations and computations, a particular task is selected from one of these group menus. This layout allows the program to be enlarged later if support for further tasks becomes necessary. For each selected task, an interactive dialog about the required input and output data, together with some additional information, is printed during the execution of the program.
    Reasons for the new version: The requirement to enlarge the previous version of the UTILITIES [S. Fritzsche, Comput. Phys. Comm. 141 (2001) 163] arose from the recent application of the RATIP package to large-scale radiative and Auger computations. A number of new subtasks now refer to the handling of Auger amplitudes and their proper combination in order to facilitate the interpretation of complex spectra. A few further tasks, such as direct access to the one-electron matrix elements for a given set of orbital functions, have also been found useful in the analysis of data.
    Summary of revisions: The revised version adds another 13 tasks for the extraction and handling of atomic data within the framework of RATIP: the manipulation of data files, the generation and interpretation of Auger spectra, the computation of various one- and two-electron matrix elements, and the evaluation of momentum densities and grid parameters. Owing to the rather large number of subtasks, the main menu has been divided into seven groups from which the individual tasks can be selected, very similarly as before.
    Typical running time: The program responds promptly for most tasks. The response time for some tasks, such as the generation of a relativistic momentum density, depends strongly on the size of the corresponding data files and the number of grid points.
    Unusual features of the program: A total of 29 different tasks are supported by the program. Starting from the main menu, the user is guided interactively through the program by a dialog and a few additional explanations. For each task, a short summary of its function is displayed before the program prompts for all the required input data.

  12. Definition Of Touch-Sensitive Zones For Graphical Displays

    NASA Technical Reports Server (NTRS)

    Monroe, Burt L., III; Jones, Denise R.

    1988-01-01

    Touch zones defined simply by touching, while editing done automatically. Development of touch-screen interactive computing system, tedious task. Interactive Editor for Definition of Touch-Sensitive Zones computer program increases efficiency of human/machine communications by enabling user to define each zone interactively, minimizing redundancy in programming and eliminating need for manual computation of boundaries of touch areas. Information produced during editing process written to data file, to which access gained when needed by application program.

  13. Interactive Vulnerability Analysis Enhancement Results

    DTIC Science & Technology

    2012-12-01

    from Java EE web-based applications to other non-web-based Java programs. Technology developed in this effort should be generally applicable to other... Generating a rule is a 2-click process that requires no input from the user. • Task 3: Added support for non-Java EE applications. Aspect's... investigated a variety of Java-based technologies and how IAST can support them. We were successful in adding support for Scala, a popular new language, and

  14. Dynamics and Control of Non-Smooth Systems with Applications to Supercavitating Vehicles

    DTIC Science & Technology

    2011-01-01

    Title of dissertation: Dynamics and Control of Non-Smooth Systems with Applications to Supercavitating Vehicles, Vincent Nguyen, Doctor of... relates to the dynamics of non-smooth vehicle systems, and in particular, supercavitating vehicles. These high-speed underwater vehicles are...

  15. 40 CFR 170.260 - Emergency assistance.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 24 2014-07-01 2014-07-01 false Emergency assistance. 170.260 Section 170.260 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS... from handling tasks or from application, splash, spill, drift, or pesticide residues, the handler...

  16. 40 CFR 170.260 - Emergency assistance.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 25 2013-07-01 2013-07-01 false Emergency assistance. 170.260 Section 170.260 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS... from handling tasks or from application, splash, spill, drift, or pesticide residues, the handler...

  17. 40 CFR 170.260 - Emergency assistance.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 25 2012-07-01 2012-07-01 false Emergency assistance. 170.260 Section 170.260 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS... from handling tasks or from application, splash, spill, drift, or pesticide residues, the handler...

  18. 40 CFR 170.260 - Emergency assistance.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 23 2010-07-01 2010-07-01 false Emergency assistance. 170.260 Section 170.260 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS... from handling tasks or from application, splash, spill, drift, or pesticide residues, the handler...

  19. 40 CFR 170.260 - Emergency assistance.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 24 2011-07-01 2011-07-01 false Emergency assistance. 170.260 Section 170.260 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS... from handling tasks or from application, splash, spill, drift, or pesticide residues, the handler...

  20. Run-Off-Road Collision Avoidance Countermeasures Using IVHS Countermeasures, Task 1, Volume 2: Support Volume, Final Report

    DOT National Transportation Integrated Search

    1994-10-01

    The Run-Off-Road Collision Avoidance Using IVHS Countermeasures program addresses the single-vehicle crash problem through the application of technology to prevent and/or reduce the severity of these crashes.

  1. Argobots: A Lightweight Low-Level Threading and Tasking Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seo, Sangmin; Amer, Abdelhalim; Balaji, Pavan

    In the past few decades, a number of user-level threading and tasking models have been proposed in the literature to address the shortcomings of OS-level threads, primarily with respect to cost and flexibility. Current state-of-the-art user-level threading and tasking models, however, either are too specific to applications or architectures or are not as powerful or flexible. In this paper, we present Argobots, a lightweight, low-level threading and tasking framework that is designed as a portable and performant substrate for high-level programming models or runtime systems. Argobots offers a carefully designed execution model that balances generality of functionality with providing a rich set of controls to allow specialization by end users or high-level programming models. We describe the design, implementation, and performance characterization of Argobots and present integrations with three high-level models: OpenMP, MPI, and colocated I/O services. Evaluations show that (1) Argobots, while providing richer capabilities, is competitive with existing simpler generic threading runtimes; (2) our OpenMP runtime offers more efficient interoperability capabilities than production OpenMP runtimes do; (3) when MPI interoperates with Argobots instead of Pthreads, it enjoys reduced synchronization costs and better latency-hiding capabilities; and (4) I/O services with Argobots reduce interference with colocated applications while achieving performance competitive with that of a Pthreads approach.
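
    The cost advantage of user-level tasking can be shown in miniature with cooperative scheduling, where tasks yield explicitly instead of being preempted by the OS. The sketch below uses plain Python generators and is not the Argobots C API; the pool/task vocabulary echoes the record above but is an assumption of this illustration:

```python
from collections import deque

class Pool:
    """A minimal work pool: runnable tasks are plain generators, and a
    scheduler resumes them round-robin on one OS-level thread."""
    def __init__(self):
        self.ready = deque()

    def create_task(self, gen):
        self.ready.append(gen)

    def run(self):
        # Cooperative scheduling: each `yield` returns control here, so a
        # context switch costs a function call, not an OS trap.
        while self.ready:
            task = self.ready.popleft()
            try:
                next(task)
                self.ready.append(task)  # still runnable; requeue it
            except StopIteration:
                pass  # task finished

trace = []
def worker(name, steps):
    for i in range(steps):
        trace.append(f"{name}:{i}")
        yield  # explicit yield point, as in user-level threading

pool = Pool()
pool.create_task(worker("A", 2))
pool.create_task(worker("B", 2))
pool.run()
print(trace)  # tasks interleave: ['A:0', 'B:0', 'A:1', 'B:1']
```

    A framework like Argobots generalizes this picture with multiple execution streams, stackful threads alongside run-to-completion tasklets, and pluggable pools and schedulers.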

  2. Argobots: A Lightweight Low-Level Threading and Tasking Framework

    DOE PAGES

    Seo, Sangmin; Amer, Abdelhalim; Balaji, Pavan; ...

    2017-10-24

    In the past few decades, a number of user-level threading and tasking models have been proposed in the literature to address the shortcomings of OS-level threads, primarily with respect to cost and flexibility. Current state-of-the-art user-level threading and tasking models, however, are either too specific to applications or architectures or are not as powerful or flexible. In this article, we present Argobots, a lightweight, low-level threading and tasking framework that is designed as a portable and performant substrate for high-level programming models or runtime systems. Argobots offers a carefully designed execution model that balances generality of functionality with providing a rich set of controls to allow specialization by the user or high-level programming model. Here, we describe the design, implementation, and optimization of Argobots and present integrations with three example high-level models: OpenMP, MPI, and co-located I/O service. Evaluations show that (1) Argobots outperforms existing generic threading runtimes; (2) our OpenMP runtime offers more efficient interoperability capabilities than production OpenMP runtimes do; (3) when MPI interoperates with Argobots instead of Pthreads, it enjoys reduced synchronization costs and better latency hiding capabilities; and (4) I/O service with Argobots reduces interference with co-located applications, achieving performance competitive with that of the Pthreads version.

  3. Argobots: A Lightweight Low-Level Threading and Tasking Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seo, Sangmin; Amer, Abdelhalim; Balaji, Pavan

    In the past few decades, a number of user-level threading and tasking models have been proposed in the literature to address the shortcomings of OS-level threads, primarily with respect to cost and flexibility. Current state-of-the-art user-level threading and tasking models, however, are either too specific to applications or architectures or are not as powerful or flexible. In this article, we present Argobots, a lightweight, low-level threading and tasking framework that is designed as a portable and performant substrate for high-level programming models or runtime systems. Argobots offers a carefully designed execution model that balances generality of functionality with providing a rich set of controls to allow specialization by the user or high-level programming model. Here, we describe the design, implementation, and optimization of Argobots and present integrations with three example high-level models: OpenMP, MPI, and co-located I/O service. Evaluations show that (1) Argobots outperforms existing generic threading runtimes; (2) our OpenMP runtime offers more efficient interoperability capabilities than production OpenMP runtimes do; (3) when MPI interoperates with Argobots instead of Pthreads, it enjoys reduced synchronization costs and better latency hiding capabilities; and (4) I/O service with Argobots reduces interference with co-located applications, achieving performance competitive with that of the Pthreads version.

  4. Expert Systems in Contract Management. A Pilot Study.

    DTIC Science & Technology

    1985-10-01

    and appropriately selected... they can provide valuable assistance to managers in their tasks of planning and control. If these techniques are to... the programs that would affect their relevance to construction management applications in general. We also ascertained which programs have... further experience of SAVOIR applied in a fairly complex real-world domain. This has, in their view, confirmed the suitability of SAVOIR for the domain

  5. Clinical ladder program implementation: a project guide.

    PubMed

    Ko, Yu Kyung; Yu, Soyoung

    2014-11-01

    This article describes the development of a clinical ladder program (CLP) implementation linked to a promotion system for nurses. The CLP task force developed criteria for each level of performance and a performance evaluation tool reflecting the self-motivation of the applicant for professional development. One year after implementation, the number of nurses taking graduate courses increased, and 7 nurses were promoted to nurse manager positions.

  6. A comparative study of the Unified System for Orbit Computation and the Flight Design System. [computer programs for mission planning tasks associated with space shuttle

    NASA Technical Reports Server (NTRS)

    Maag, W.

    1977-01-01

    The Flight Design System (FDS) and the Unified System for Orbit Computation (USOC) are compared and described in relation to mission planning for the shuttle transportation system (STS). The FDS is designed to meet the requirements of a standardized production tool, whereas the USOC is designed for rapid generation of particular application programs. The main emphasis in USOC is on adaptability to new types of missions. It is concluded that a software system having a USOC-like structure, adapted to the specific needs of MPAD, would be appropriate to support planning tasks in areas unique to STS missions.

  7. Development problem analysis of correlation leak detector’s software

    NASA Astrophysics Data System (ADS)

    Faerman, V. A.; Avramchuk, V. S.; Marukyan, V. M.

    2018-05-01

    In the article, the practical application and the structure of correlation leak detector software are studied, and the task of its design is analyzed. The first part of the paper shows why developing correlation leak detectors is worthwhile for improving the operating efficiency of public utility networks. The functional structure of correlation leak detectors is analyzed, and the tasks of their software are defined. The second part examines several steps in the development of the software package, namely requirements gathering, definition of the program structure, and creation of the software concept, in the context of experience gained with a hardware-software prototype of a correlation leak detector.
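    The core signal-processing task of such a detector is estimating the time delay between the leak noise arriving at two sensors that bracket the pipe section. A minimal sketch with synthetic data (the function name and signal are illustrative, not from the paper's software):

```python
import random

def cross_correlation_lag(x, y, max_lag):
    """Return the lag (in samples) at which signal y best matches signal x."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        score = sum(x[i] * y[i + lag]
                    for i in range(len(x))
                    if 0 <= i + lag < len(y))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

# synthetic leak noise picked up by two sensors, the second delayed by 3 samples
random.seed(1)
noise = [random.gauss(0, 1) for _ in range(200)]
x = noise
y = [0.0] * 3 + noise[:-3]
tau = cross_correlation_lag(x, y, max_lag=10)   # recovers the 3-sample delay
# with sampling rate fs, propagation speed v, and sensor spacing L, the leak
# sits at (L + v * tau / fs) / 2 from one sensor
```

    Real detectors refine this with filtering and frequency-domain correlation, but the delay-to-distance conversion is the same.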

  8. Low-cost solar array project progress and plans

    NASA Technical Reports Server (NTRS)

    Callaghan, W. T.

    1981-01-01

    The considered project is part of the DOE Photovoltaic Technology and Market Development Program. This program is concerned with the development and the utilization of cost-competitive photovoltaic systems. The project has the objective to develop, by 1986, the national capability to manufacture low-cost, long-life photovoltaic arrays at production rates that will realize economies of scale, and at a price of less than $0.70/watt. The array performance objectives include an efficiency greater than 10% and an operating lifetime longer than 20 years. The objective of the silicon material task is to establish the practicality of processes for producing silicon suitable for terrestrial photovoltaic applications at a price of $14/kg. The large-area sheet task is concerned with the development of process technology for sheet formation. Low-cost encapsulation material systems are being developed in connection with the encapsulation task. Another project goal is related to the development of economical process sequences.

  9. Navy Omni-Directional Vehicle (ODV) development program

    NASA Technical Reports Server (NTRS)

    Mcgowen, Hillery

    1994-01-01

    The Omni-Directional Vehicle (ODV) development program sponsored by the Office of Naval Research at the Coastal Systems Station has investigated the application of ODV technology for use in the Navy shipboard environment. ODV technology as originally received by the Navy in the form of the Cadillac-Gage Side Mover Vehicle was applicable to the shipboard environment with the potential to overcome conditions of reduced traction, ship motion, decks heeled at high angles, obstacles, and confined spaces. Under the Navy program, ODV technology was investigated and a series of experimental vehicles were built and successfully tested under extremely demanding conditions. The ODV drive system has been found to be applicable to autonomous, remotely operated, or manually operated vehicles. Potential commercial applications include multi-directional forklift trucks, automatic guided vehicles employed in manufacturing environments, and remotely controlled platforms used in nuclear facilities or for hazardous waste cleanup tasks.

  10. Navy Omni-Directional Vehicle (ODV) development program

    NASA Astrophysics Data System (ADS)

    McGowen, Hillery

    1994-02-01

    The Omni-Directional Vehicle (ODV) development program sponsored by the Office of Naval Research at the Coastal Systems Station has investigated the application of ODV technology for use in the Navy shipboard environment. ODV technology as originally received by the Navy in the form of the Cadillac-Gage Side Mover Vehicle was applicable to the shipboard environment with the potential to overcome conditions of reduced traction, ship motion, decks heeled at high angles, obstacles, and confined spaces. Under the Navy program, ODV technology was investigated and a series of experimental vehicles were built and successfully tested under extremely demanding conditions. The ODV drive system has been found to be applicable to autonomous, remotely operated, or manually operated vehicles. Potential commercial applications include multi-directional forklift trucks, automatic guided vehicles employed in manufacturing environments, and remotely controlled platforms used in nuclear facilities or for hazardous waste cleanup tasks.

  11. HAL/S programmer's guide. [for space shuttle program

    NASA Technical Reports Server (NTRS)

    Newbold, P. M.; Hotz, R. L.

    1974-01-01

    This programming language was developed for the flight software of the NASA space shuttle program. HAL/S is intended to satisfy virtually all of the flight software requirements of the space shuttle. To achieve this, HAL/S incorporates a wide range of features, including applications-oriented data types and organizations, real time control mechanisms, and constructs for systems programming tasks. As the name indicates, HAL/S is a dialect of the original HAL language previously developed. Changes have been incorporated to simplify syntax, curb excessive generality, or facilitate flight code emission.

  12. Silicon Carbide Power Devices and Integrated Circuits

    NASA Technical Reports Server (NTRS)

    Lauenstein, Jean-Marie; Casey, Megan; Samsel, Isaak; LaBel, Ken; Chen, Yuan; Ikpe, Stanley; Wilcox, Ted; Phan, Anthony; Kim, Hak; Topper, Alyson

    2017-01-01

    An overview of the NASA NEPP Program Silicon Carbide Power Device subtask is given, including the current task roadmap, partnerships, and future plans. Included are the Agency-wide efforts to promote development of single-event effect hardened SiC power devices for space applications.

  13. Workshop on The Rio Grande Rift: Crustal Modeling and Applications of Remote Sensing

    NASA Technical Reports Server (NTRS)

    Blanchard, D. P. (Editor)

    1980-01-01

    The elements of a program that could address significant earth science problems by combining remote sensing and traditional geological, geophysical, and geochemical approaches were addressed. Specific areas and tasks related to the Rio Grande Rift are discussed.

  14. MoDOT pavement preservation research program volume V, site-specific pavement condition assessment.

    DOT National Transportation Integrated Search

    2015-11-01

    The overall objective of Task 4 was to thoroughly assess the cost-effectiveness and utility of selected non-invasive technologies as : applicable to MoDOT roadways. Non-invasive imaging technologies investigated in this project were Ultrasonic Surfac...

  15. Coordinating complex decision support activities across distributed applications

    NASA Technical Reports Server (NTRS)

    Adler, Richard M.

    1994-01-01

    Knowledge-based technologies have been applied successfully to automate planning and scheduling in many problem domains. Automation of decision support can be increased further by integrating task-specific applications with supporting database systems, and by coordinating interactions between such tools to facilitate collaborative activities. Unfortunately, the technical obstacles that must be overcome to achieve this vision of transparent, cooperative problem-solving are daunting. Intelligent decision support tools are typically developed for standalone use, rely on incompatible, task-specific representational models and application programming interfaces (APIs), and run on heterogeneous computing platforms. Getting such applications to interact freely calls for platform-independent capabilities for distributed communication, as well as tools for mapping information across disparate representations. Symbiotics is developing a layered set of software tools (called NetWorks!) for integrating and coordinating heterogeneous distributed applications. The top layer of tools consists of an extensible set of generic, programmable coordination services. Developers access these services via high-level APIs to implement the desired interactions between distributed applications.
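    The generic, programmable coordination services described can be illustrated with a minimal publish/subscribe broker; the names and topics below are hypothetical, not the NetWorks! API. Each tool registers interest in events and reacts to them without knowing about the other tools:

```python
class Broker:
    """Minimal publish/subscribe coordination service between distributed tools."""
    def __init__(self):
        self.subscribers = {}

    def subscribe(self, topic, handler):
        self.subscribers.setdefault(topic, []).append(handler)

    def publish(self, topic, message):
        for handler in self.subscribers.get(topic, []):
            handler(message)

log = []
broker = Broker()
# a database tool and a GUI tool coordinate without direct knowledge of each other
broker.subscribe("plan.updated", lambda m: log.append(f"db stores {m}"))
broker.subscribe("plan.updated", lambda m: log.append(f"gui redraws {m}"))
broker.publish("plan.updated", "rev2")
# log now records both reactions to the single event
```

    Decoupling publishers from subscribers is what lets standalone tools with incompatible APIs cooperate through a shared intermediary.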

  16. Structured Analysis/Design - LSA Task 101, Early Logistic Support Analysis Strategy, Subtask 101.2.1, Develop Early LSA Strategy

    DTIC Science & Technology

    1990-07-01

    replacing "logic diagrams" or "flow charts") to aid in coordinating the functions to be performed by a computer program and its associated inputs ... the analysis. Both the logical model and detailed procedures are used to develop the application software programs which will be provided to Government

  17. Machine Learning in Intrusion Detection

    DTIC Science & Technology

    2005-07-01

    machine learning tasks. Anomaly detection provides the core technology for a broad spectrum of security-centric applications. In this dissertation, we examine various aspects of anomaly based intrusion detection in computer security. First, we present a new approach to learn program behavior for intrusion detection. Text categorization techniques are adopted to convert each process to a vector and calculate the similarity between two program activities. Then the k-nearest neighbor classifier is employed to classify program behavior as normal or intrusive. We demonstrate
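    The approach described, treating a process's system-call trace like a document and classifying it with a k-nearest-neighbor vote, can be sketched as follows; the call names and training data are invented for illustration:

```python
from collections import Counter
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two sparse frequency vectors (Counters)."""
    dot = sum(a[k] * b.get(k, 0) for k in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def knn_label(query, labeled, k=3):
    """Classify a process trace by majority vote of its k nearest neighbours."""
    scored = sorted(labeled, key=lambda t: cosine(query, t[0]), reverse=True)
    votes = Counter(lbl for _, lbl in scored[:k])
    return votes.most_common(1)[0][0]

# each process becomes a bag of system calls, like a document's bag of words
normal1 = Counter(["open", "read", "read", "close"])
normal2 = Counter(["open", "read", "write", "close"])
attack  = Counter(["fork", "exec", "exec", "socket"])
train = [(normal1, "normal"), (normal2, "normal"), (attack, "intrusive")]

probe = Counter(["open", "read", "close"])
print(knn_label(probe, train, k=1))   # -> normal
```

    The text-categorization framing means no per-program model has to be trained: similarity to recorded normal behavior is the whole decision rule.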

  18. Navy LPD-17 Amphibious Ship Procurement: Background, Issues, and Options for Congress

    DTIC Science & Technology

    2008-11-21

    ... optimal construction sequence and application of lessons learned for follow-on vessels in these programs. In the LPD 17 program, the Navy’s reliance on an ... I am deeply concerned about Northrop Grumman Ship Systems’ (NGSS) ability to recover in the aftermath of Hurricane Katrina, particularly in regard to

  19. Navy LPD-17 Amphibious Ship Procurement: Background, Issues, and Options for Congress

    DTIC Science & Technology

    2011-04-20

    and Options for Congress ... construction sequence and application of lessons learned for follow-on vessels in these programs. In the LPD 17 program, the Navy’s reliance on an immature ... deeply concerned about Northrop Grumman Ship Systems’ (NGSS) ability to recover in the aftermath of Hurricane Katrina, particularly in regard to

  20. Controls structures interaction, an interdisciplinary challenge for large spacecraft

    NASA Technical Reports Server (NTRS)

    Hanks, Brantley R.

    1990-01-01

    Controls structures interaction (CSI), a phenomenon which occurs when control forces interact with the flexible motion of a structure, can, if improperly treated in design and development, cause reduced performance or control instabilities. Properly applied, it can improve flexible spacecraft performance. In this paper, the NASA CSI technology program for future spacecraft applications is described. The program objectives and organization are outlined, and the nature of individual program tasks is described. The interdisciplinary aspects of CSI are also addressed.

  1. Principles and Application of Magnetic Rubber Testing for Crack Detection in High-Strength Steel Components: I. Active-Field Inspection

    DTIC Science & Technology

    2014-12-01

    Historically, MRT found its most extensive application in the inspection of critical high-strength steel components of the F-111 aircraft to...Steve Burke is Group Leader Acoustic Material Systems within Maritime Division and Task Leader for AIR 07/101 Assessment and Control of Aircraft ...Maritime Division. He has previously led research programs in advanced electromagnetic and ultrasonic NDE for aircraft applications. Geoff has BSc and BE

  2. Imaging Tasks Scheduling for High-Altitude Airship in Emergency Condition Based on Energy-Aware Strategy

    PubMed Central

    Zhimeng, Li; Chuan, He; Dishan, Qiu; Jin, Liu; Manhao, Ma

    2013-01-01

    Addressing the imaging-task scheduling problem for a high-altitude airship in emergency conditions, programming models are constructed by analyzing the main constraints, taking the maximum task benefit and the minimum energy consumption as two optimization objectives. First, a hierarchical architecture is adopted to convert this scheduling problem into three subproblems: task ranking, value-task detection, and energy-conservation optimization. Algorithms are then designed for the subproblems, and their solutions correspond, respectively, to a feasible solution, an efficient solution, and an optimized solution of the original problem. The paper gives a detailed introduction to the energy-aware optimization strategy, which rationally adjusts the airship's cruising speed based on the distribution of task deadlines so as to decrease the total energy consumed by cruising. Finally, the application results and a comparative analysis show that the proposed strategy and algorithm are effective and feasible. PMID:23864822
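    One way to picture the energy-aware speed adjustment: since cruising energy grows with speed, each leg should be flown just fast enough to arrive by its task's deadline. A hypothetical sketch (the paper's actual models and constraints are richer):

```python
def leg_speeds(distances, deadlines, v_max):
    """Pick the slowest feasible cruising speed for each leg.

    Energy consumption rises with speed, so flying each leg just fast
    enough to meet its task deadline minimizes cruising energy."""
    speeds, t = [], 0.0
    for dist, due in zip(distances, deadlines):
        slack = due - t
        v = dist / slack if slack > 0 else v_max
        v = min(v, v_max)            # cap at the airship's maximum speed
        speeds.append(v)
        t += dist / v                # arrival time at this task
    return speeds

# two imaging tasks: 10 km away, due at t=2 h; then 30 km further, due at t=5 h
print(leg_speeds([10, 30], [2, 5], v_max=20))   # -> [5.0, 10.0]
```

    Slack deadlines translate directly into lower speeds, which is exactly the lever the energy-aware strategy exploits.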

  3. Programmer's manual for the Mission Analysis Evaluation and Space Trajectory Operations program (MAESTRO)

    NASA Technical Reports Server (NTRS)

    Lutzky, D.; Bjorkman, W. S.

    1973-01-01

    The Mission Analysis Evaluation and Space Trajectory Operations program known as MAESTRO is described. MAESTRO is an all-FORTRAN, block-style computer program designed to perform various mission control tasks. This manual is a guide to MAESTRO, providing individuals the capability of modifying the program to suit their needs. Descriptions are presented of each of the subroutines; each description consists of an input/output description, theory, a subroutine description, and a flow chart where applicable. The programmer's manual also contains a detailed description of the common blocks, a subroutine cross-reference map, and a general description of the program structure.

  4. Efficient Ada multitasking on a RISC register window architecture

    NASA Technical Reports Server (NTRS)

    Kearns, J. P.; Quammen, D.

    1987-01-01

    This work addresses the problem of reducing context switch overhead on a processor which supports a large register file - a register file much like that which is part of the Berkeley RISC processors and several other emerging architectures (which are not necessarily reduced instruction set machines in the purest sense). Such a reduction in overhead is particularly desirable in a real-time embedded application, in which task-to-task context switch overhead may result in failure to meet crucial deadlines. A storage management technique by which a context switch may be implemented as cheaply as a procedure call is presented. The essence of this technique is the avoidance of the save/restore of registers on the context switch. This is achieved through analysis of the static source text of an Ada tasking program. Information gained during that analysis directs the optimized storage management strategy for that program at run time. A formal verification of the technique in terms of an operational control model and an evaluation of the technique's performance via simulations driven by synthetic Ada program traces are presented.

  5. A Low-Cost Part-Task Flight Training System: An Application of a Head Mounted Display

    DTIC Science & Technology

    1990-12-01

    architecture. The task at hand was to develop a software emulation library that would emulate the function calls used within the Flight and Dog programs. This...represented in two hexadecimal digits for each color. The format of the packed long integer looks like aaggbbrr with each color value representing a...Western Digital ethernet card as the cheapest compatible card available. Good fortune arrived: as I was calling to order the card, I saw an unused card

  6. Research and technology at the Lyndon B. Johnson Space Center

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Johnson Space Center accomplishments in new and advanced concepts during 1983 are highlighted. Included are research funded by the Office of Aeronautics and Space Technology; Advanced Programs tasks funded by the Office of Space Flight; and Solar System Explorations, Life Sciences, and Earth Sciences and Applications research funded by the Office of Space Sciences and Applications. Summary sections describing the role of the Johnson Space Center in each program are followed by one-page descriptions of significant projects. Descriptions are suitable for external consumption, free of technical jargon, and illustrated to increase ease of comprehension.

  7. Deployable antenna phase A study

    NASA Technical Reports Server (NTRS)

    Schultz, J.; Bernstein, J.; Fischer, G.; Jacobson, G.; Kadar, I.; Marshall, R.; Pflugel, G.; Valentine, J.

    1979-01-01

    Applications for large deployable antennas were re-examined, flight demonstration objectives were defined, the flight article (antenna) was preliminarily designed, and the flight program and ground development program, including the support equipment, were defined for a proposed space transportation system flight experiment to demonstrate a large (50 to 200 meter) deployable antenna system. Tasks described include: (1) performance requirements analysis; (2) system design and definition; (3) orbital operations analysis; and (4) programmatic analysis.

  8. Run-Off-Road Collision Avoidance Countermeasures Using IVHS Countermeasures - Task 4, Volume 2: RORSIM Manual

    DOT National Transportation Integrated Search

    1995-09-05

    The Run-Off-Road Collision Avoidance Using IVHS Countermeasures program is to address the single vehicle crash problem through application of technology to prevent and/or reduce the severity of these crashes. : This report documents the RORSIM comput...

  9. Run-Off-Road Collision Avoidance Countermeasures Using IVHS Countermeasures, Task 3, Volume 2, Final Report

    DOT National Transportation Integrated Search

    1995-08-01

    INTELLIGENT VEHICLE INITIATIVE OR IVI : THE RUN-OFF-ROAD COLLISION AVOIDANCE USING IVHS COUNTERMEASURES PROGRAM IS TO ADDRESS THE SINGLE VEHICLE CRASH PROBLEM THROUGH APPLICATION OF TECHNOLOGY TO PREVENT AND/OR REDUCE THE SEVERITY OF THESE CRASHES. :...

  10. Application and Implications of Agent Technology for Librarians.

    ERIC Educational Resources Information Center

    Nardi, Bonnie A.; O'Day, Vicki L.

    1998-01-01

    Examines intelligent software agents, presents nine design principles aimed specifically at the technology perspective (to personalize task performance and general principles), and discusses what librarians can do that software agents (agents defined as activity-aware software programs) cannot do. Describes an information ecology that integrates…

  11. Run-Off-Road Collision Avoidance Countermeasures Using IVHS Countermeasures: Task 3, Volume 1

    DOT National Transportation Integrated Search

    1995-08-23

    The Run-Off-Road Collision Avoidance Using IVHS Countermeasures program is to address the single vehicle crash problem through application of technology to prevent and/or reduce the severity of these crashes. This report describes the findings of the...

  12. Run-Off-Road Collision Avoidance Countermeasures Using IVHS Countermeasures Task 3 - Volume 2

    DOT National Transportation Integrated Search

    1995-08-23

    The Run-Off-Road Collision Avoidance Using IVHS Countermeasures program is to address the single vehicle crash problem through application of technology to prevent and/or reduce the severity of these crashes. : This report describes the findings of t...

  13. Parallelization of NAS Benchmarks for Shared Memory Multiprocessors

    NASA Technical Reports Server (NTRS)

    Waheed, Abdul; Yan, Jerry C.; Saini, Subhash (Technical Monitor)

    1998-01-01

    This paper presents our experiences of parallelizing the sequential implementation of NAS benchmarks using compiler directives on SGI Origin2000 distributed shared memory (DSM) system. Porting existing applications to new high performance parallel and distributed computing platforms is a challenging task. Ideally, a user develops a sequential version of the application, leaving the task of porting to new generations of high performance computing systems to parallelization tools and compilers. Due to the simplicity of programming shared-memory multiprocessors, compiler developers have provided various facilities to allow the users to exploit parallelism. Native compilers on SGI Origin2000 support multiprocessing directives to allow users to exploit loop-level parallelism in their programs. Additionally, supporting tools can accomplish this process automatically and present the results of parallelization to the users. We experimented with these compiler directives and supporting tools by parallelizing sequential implementation of NAS benchmarks. Results reported in this paper indicate that with minimal effort, the performance gain is comparable with the hand-parallelized, carefully optimized, message-passing implementations of the same benchmarks.
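    The loop-level parallelism that such compiler directives expose can be imitated by hand: split the iteration space into chunks and run the chunks concurrently. A rough Python analogue (not the SGI directives themselves), shown on a SAXPY-style loop:

```python
from concurrent.futures import ThreadPoolExecutor

def saxpy_chunk(args):
    """Body of the parallelized loop: y[i] += a * x[i] over one chunk."""
    a, x, y, lo, hi = args
    for i in range(lo, hi):
        y[i] += a * x[i]

def parallel_saxpy(a, x, y, workers=4):
    """Split the iteration space into disjoint chunks, one per worker."""
    n = len(x)
    bounds = [(i * n // workers, (i + 1) * n // workers) for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # the with-block waits for every chunk to finish before returning
        list(pool.map(saxpy_chunk, [(a, x, y, lo, hi) for lo, hi in bounds]))

x = [1.0] * 8
y = [2.0] * 8
parallel_saxpy(3.0, x, y)   # every element of y becomes 2.0 + 3.0 * 1.0
```

    The chunks write disjoint index ranges, so no synchronization is needed inside the loop body, the same property a parallelizing compiler must prove before emitting a multiprocessing directive.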

  14. Work stealing for GPU-accelerated parallel programs in a global address space framework: WORK STEALING ON GPU-ACCELERATED SYSTEMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arafat, Humayun; Dinan, James; Krishnamoorthy, Sriram

    Task parallelism is an attractive approach to automatically load balance the computation in a parallel system and adapt to dynamism exhibited by parallel systems. Exploiting task parallelism through work stealing has been extensively studied in shared and distributed-memory contexts. In this paper, we study the design of a system that uses work stealing for dynamic load balancing of task-parallel programs executed on hybrid distributed-memory CPU-graphics processing unit (GPU) systems in a global-address space framework. We take into account the unique nature of the accelerator model employed by GPUs, the significant performance difference between GPU and CPU execution as a function of problem size, and the distinct CPU and GPU memory domains. We consider various alternatives in designing a distributed work stealing algorithm for CPU-GPU systems, while taking into account the impact of task distribution and data movement overheads. These strategies are evaluated using microbenchmarks that capture various execution configurations as well as the state-of-the-art CCSD(T) application module from the computational chemistry domain.
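    The classic work-stealing discipline the paper builds on, where owners pop the newest work from the bottom of their own deque while idle workers steal the oldest work from the top, can be sketched sequentially. This toy version is single-threaded and ignores the CPU/GPU data-movement issues the paper addresses:

```python
import random
from collections import deque

class Worker:
    """Each worker owns a deque: it pops its own bottom; thieves steal the top."""
    def __init__(self):
        self.tasks = deque()

def run(workers, rng):
    """Process all tasks, stealing when a worker's own deque runs dry."""
    done = 0
    while True:
        busy = False
        for w in workers:
            if w.tasks:
                w.tasks.pop()                          # owner takes newest task (LIFO)
                done += 1
                busy = True
            else:
                victims = [v for v in workers if v is not w and v.tasks]
                if victims:
                    victim = rng.choice(victims)
                    w.tasks.append(victim.tasks.popleft())  # steal oldest (FIFO)
                    busy = True
        if not busy:
            return done

rng = random.Random(0)
ws = [Worker() for _ in range(3)]
ws[0].tasks.extend(range(9))       # all work starts on one overloaded worker
completed = run(ws, rng)           # -> 9
```

    Stealing from the opposite end of the victim's deque is the standard trick: it grabs large, old subtasks and avoids contending with the owner's hot end.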

  15. Work stealing for GPU-accelerated parallel programs in a global address space framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arafat, Humayun; Dinan, James; Krishnamoorthy, Sriram

    Task parallelism is an attractive approach to automatically load balance the computation in a parallel system and adapt to dynamism exhibited by parallel systems. Exploiting task parallelism through work stealing has been extensively studied in shared and distributed-memory contexts. In this paper, we study the design of a system that uses work stealing for dynamic load balancing of task-parallel programs executed on hybrid distributed-memory CPU-graphics processing unit (GPU) systems in a global-address space framework. We take into account the unique nature of the accelerator model employed by GPUs, the significant performance difference between GPU and CPU execution as a function of problem size, and the distinct CPU and GPU memory domains. We consider various alternatives in designing a distributed work stealing algorithm for CPU-GPU systems, while taking into account the impact of task distribution and data movement overheads. These strategies are evaluated using microbenchmarks that capture various execution configurations as well as the state-of-the-art CCSD(T) application module from the computational chemistry domain.

  16. Reliability assessment of Multichip Module technologies via the Triservice/NASA RELTECH program

    NASA Astrophysics Data System (ADS)

    Fayette, Daniel F.

    1994-10-01

    Multichip Module (MCM) packaging/interconnect technologies have seen increased emphasis from both the commercial and military communities as a means of increasing capability and performance while providing a vehicle for reducing cost, power and weight of the end item electronic application. This is accomplished through three basic Multichip module technologies, MCM-L that are laminates, MCM-C that are ceramic type substrates and MCM-D that are deposited substrates (e.g., polymer dielectric with thin film metals). Three types of interconnect structures are also used with these substrates and include, wire bond, Tape Automated Bonds (TAB) and flip chip ball bonds. Application, cost, producibility and reliability are the drivers that will determine which MCM technology will best fit a respective need or requirement. With all the benefits and technologies cited, it would be expected that the use of, or the planned use of, MCM's would be more extensive in both military and commercial applications. However, two significant roadblocks exist to implementation of these new technologies: the absence of reliability data and a single national standard for the procurement of reliable/quality MCM's. To address the preceding issues, the Reliability Technology to Achieve Insertion of Advanced Packaging (RELTECH) program has been established. This program, which began in May 1992, has endeavored to evaluate a cross section of MCM technologies covering all classes of MCM's previously cited. NASA and the Tri-Services (Air Force Rome Laboratory, Naval Surface Warfare Center, Crane IN and Army Research Laboratory) have teamed together with sponsorship from ARPA to evaluate the performance, reliability and producibility of MCM's for both military and commercial usage. This is done in close cooperation with our industry partners whose support is critical to the goals of the program. 
Several tasks are being performed by the RELTECH program and data from this effort, in conjunction with information from our industry partners as well as discussions with industry organizations (IPC, EIA, ISHM, etc.) are being used to develop the qualification and screening requirements for MCM's. Specific tasks being performed by the RELTECH program include technical assessments, product evaluations, reliability modeling, environmental testing, and failure analysis. This paper will describe the various tasks associated with the RELTECH program, status, progress and a description of the national dual use specification being developed for MCM technologies.

  17. Automatic data partitioning on distributed memory multicomputers. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Gupta, Manish

    1992-01-01

    Distributed-memory parallel computers are increasingly being used to provide high levels of performance for scientific applications. Unfortunately, such machines are not very easy to program. A number of research efforts seek to alleviate this problem by developing compilers that take over the task of generating communication. The communication overheads and the extent of parallelism exploited in the resulting target program are determined largely by the manner in which data is partitioned across different processors of the machine. Most of the compilers provide no assistance to the programmer in the crucial task of determining a good data partitioning scheme. A novel approach, the constraint-based approach, to the problem of automatic data partitioning for numeric programs is presented. In this approach, the compiler identifies some desirable requirements on the distribution of various arrays being referenced in each statement, based on performance considerations. These desirable requirements are referred to as constraints. For each constraint, the compiler determines a quality measure that captures its importance with respect to the performance of the program. The quality measure is obtained through static performance estimation, without actually generating the target data-parallel program with explicit communication. Each data distribution decision is taken by combining all the relevant constraints. The compiler attempts to resolve any conflicts between constraints such that the overall execution time of the parallel program is minimized. This approach has been implemented as part of a compiler called Paradigm, which accepts Fortran 77 programs and specifies the partitioning scheme to be used for each array in the program. We have obtained results on some programs taken from the Linpack and Eispack libraries and the Perfect Benchmarks. 
These results are quite promising, and demonstrate the feasibility of automatic data partitioning for a significant class of scientific application programs with regular computations.
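    The constraint-resolution idea can be caricatured in a few lines: each constraint names its preferred distribution and carries a quality measure, and the compiler picks the scheme with the greatest total quality. The schemes and weights below are invented for illustration:

```python
def choose_distribution(constraints, candidates=("block", "cyclic")):
    """Resolve conflicting constraints by total quality measure.

    Each constraint is (preferred_scheme, quality), where quality estimates
    the execution-time penalty of violating that constraint."""
    def score(scheme):
        return sum(q for pref, q in constraints if pref == scheme)
    return max(candidates, key=score)

# a stencil loop prefers block distribution (cheap nearest-neighbour
# communication); a triangular loop prefers cyclic (load balance);
# the weights stand in for static performance estimates
constraints = [("block", 8.0), ("block", 3.0), ("cyclic", 6.0)]
print(choose_distribution(constraints))   # -> block
```

    The real compiler scores whole combinations of per-dimension decisions, but the principle is the same: conflicts are settled by the estimated execution-time cost of violating each constraint.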

  18. High Performance Fortran for Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Mehrotra, Piyush; Zima, Hans; Bushnell, Dennis M. (Technical Monitor)

    2000-01-01

    This paper focuses on the use of High Performance Fortran (HPF) for important classes of algorithms employed in aerospace applications. HPF is a set of Fortran extensions designed to provide users with a high-level interface for programming data parallel scientific applications, while delegating to the compiler/runtime system the task of generating explicitly parallel message-passing programs. We begin by providing a short overview of the HPF language. This is followed by a detailed discussion of the efficient use of HPF for applications involving multiple structured grids such as multiblock and adaptive mesh refinement (AMR) codes as well as unstructured grid codes. We focus on the data structures and computational structures used in these codes and on the high-level strategies that can be expressed in HPF to optimally exploit the parallelism in these algorithms.

  19. GANGA: A tool for computational-task management and easy access to Grid resources

    NASA Astrophysics Data System (ADS)

    Mościcki, J. T.; Brochu, F.; Ebke, J.; Egede, U.; Elmsheuser, J.; Harrison, K.; Jones, R. W. L.; Lee, H. C.; Liko, D.; Maier, A.; Muraru, A.; Patrick, G. N.; Pajchel, K.; Reece, W.; Samset, B. H.; Slater, M. W.; Soroko, A.; Tan, C. L.; van der Ster, D. C.; Williams, M.

    2009-11-01

    In this paper, we present the computational task-management tool GANGA, which allows for the specification, submission, bookkeeping and post-processing of computational tasks on a wide set of distributed resources. GANGA has been developed to solve a problem increasingly common in scientific projects, which is that researchers must regularly switch between different processing systems, each with its own command set, to complete their computational tasks. GANGA provides a homogeneous environment for processing data on heterogeneous resources. We give examples from High Energy Physics, demonstrating how an analysis can be developed on a local system and then transparently moved to a Grid system for processing of all available data. GANGA has an API that can be used via an interactive interface, in scripts, or through a GUI. Specific knowledge about types of tasks or computational resources is provided at run-time through a plugin system, making new developments easy to integrate. We give an overview of the GANGA architecture, give examples of current use, and demonstrate how GANGA can be used in many different areas of science. Catalogue identifier: AEEN_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEEN_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GPL No. of lines in distributed program, including test data, etc.: 224 590 No. of bytes in distributed program, including test data, etc.: 14 365 315 Distribution format: tar.gz Programming language: Python Computer: personal computers, laptops Operating system: Linux/Unix RAM: 1 MB Classification: 6.2, 6.5 Nature of problem: Management of computational tasks for scientific applications on heterogeneous distributed systems, including local machines, batch farms, opportunistic clusters and Grids. Solution method: High-level job management interface, including command line, scripting and GUI components.
Restrictions: Access to the distributed resources depends on the installed, 3rd party software such as batch system client or Grid user interface.
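    The run-time plugin idea behind GANGA's backend abstraction can be sketched as follows. This is an illustrative Python sketch in the spirit of the design, not GANGA's actual API; all names are hypothetical:

```python
# A registry maps backend names to classes; the same job description can
# then be submitted to any registered backend without changing user code.
BACKENDS = {}

def register_backend(name):
    def wrap(cls):
        BACKENDS[name] = cls
        return cls
    return wrap

@register_backend("local")
class LocalBackend:
    def submit(self, job):
        return f"ran {job} locally"

@register_backend("grid")
class GridBackend:
    def submit(self, job):
        return f"submitted {job} to the Grid"

def submit(job, backend="local"):
    # Backend-specific knowledge lives in the plugin, not in the job.
    return BACKENDS[backend]().submit(job)

print(submit("analysis-42"))                  # ran analysis-42 locally
print(submit("analysis-42", backend="grid"))  # submitted analysis-42 to the Grid
```

Moving an analysis from a local test run to Grid-scale processing then amounts to changing one argument, which is the transparency the paper describes.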

  20. Practical Application of Model-based Programming and State-based Architecture to Space Missions

    NASA Technical Reports Server (NTRS)

    Horvath, Gregory A.; Ingham, Michel D.; Chung, Seung; Martin, Oliver; Williams, Brian

    2006-01-01

    Innovative systems and software engineering solutions are required to meet the increasingly challenging demands of deep-space robotic missions. While recent advances in the development of an integrated systems and software engineering approach have begun to address some of these issues, they are still at the core highly manual and, therefore, error-prone. This paper describes a task aimed at infusing MIT's model-based executive, Titan, into JPL's Mission Data System (MDS), a unified state-based architecture, systems engineering process, and supporting software framework. Results of the task are presented, including a discussion of the benefits and challenges associated with integrating mature model-based programming techniques and technologies into a rigorously-defined domain specific architecture.

  1. User-centered development of a smart phone mobile application delivering personalized real-time advice on sun protection.

    PubMed

    Buller, David B; Berwick, Marianne; Shane, James; Kane, Ilima; Lantz, Kathleen; Buller, Mary Klein

    2013-09-01

    Smart phones are changing health communication for Americans. User-centered production of a mobile application for sun protection is reported. Focus groups (n = 16 adults) provided input on the mobile application concept. Four rounds of usability testing were conducted with 22 adults to develop the interface. An iterative programming procedure moved from a specification document to the final mobile application, named Solar Cell. Adults desired a variety of sun protection advice, identified few barriers to use and were willing to input personal data. The Solar Cell prototype was improved from round 1 (seven of 12 tasks completed) to round 2 (11 of 12 tasks completed) of usability testing and was interoperable across handsets and networks. The fully produced version was revised during testing. Adults rated Solar Cell as highly user friendly (mean = 5.06). The user-centered process produced a mobile application that should help many adults manage sun safety.

  2. MSIX - A general and user-friendly platform for RAM analysis

    NASA Astrophysics Data System (ADS)

    Pan, Z. J.; Blemel, Peter

    The authors present a CAD (computer-aided design) platform supporting RAM (reliability, availability, and maintainability) analysis with efficient system description and alternative evaluation. The design concepts, implementation techniques, and application results are described. This platform is user-friendly because of its graphic environment, drawing facilities, object orientation, self-tutoring, and access to the operating system. The programs' independence and portability make them generally applicable to various analysis tasks.

  3. A Measure of Effectiveness - Analysis and Application.

    DTIC Science & Technology

    1981-06-01

    Technical Note 2-81, Defense Communications Engineering Center. Prepared by Robert Podell; approved for publication June 1981.

  4. A Preliminary Study of Krypton Laser-Induced Fluorescence

    DTIC Science & Technology

    2010-07-01

    ... replacement for xenon. This study examines the potential applications of laser-induced fluorescence as a plasma diagnostic technique for Kr I and Kr II. Candidate electronic transitions are examined to determine their suitability for successful routine application of laser-induced fluorescence...

  5. 49 CFR 110.90 - Grant monitoring, reports, and records retention.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... performance of supported activities to assure compliance with applicable Federal requirements and achievement of performance goals. Monitoring must cover each program, function, activity, or task covered by the... shall submit a performance report at the completion of an activity for which reimbursement is being...

  6. Run-Off-Road Collision Avoidance Countermeasures Using IVHS Countermeasures Task 1 Vol. 1 Technical Findings

    DOT National Transportation Integrated Search

    1994-10-28

    The Run-Off-Road Collision Avoidance Using IVHS Countermeasures program is to address the single vehicle crash problem through application of technology to prevent and/or reduce the severity of these crashes. This report describes and documents the a...

  7. Run-Off-Road Collision Avoidance Countermeasures Using IVHS Countermeasures, Task 1, Volume 1: Technical Findings, Final Report

    DOT National Transportation Integrated Search

    1994-10-01

    The Run-Off-Road Collision Avoidance Using IVHS Countermeasures program is to address the single vehicle crash problem through application of technology to prevent and/or reduce the severity of these crashes. This report describes and documents ...

  8. Run-Off-Road Collision Avoidance Countermeasures Using IVHS Countermeasures, Task 2, Volume 1: Technical Findings, Final Report

    DOT National Transportation Integrated Search

    1995-06-01

    The Run-Off-Road Collision Avoidance Using IVHS Countermeasures program is to address the single vehicle crash problem through application of technology to prevent and/or reduce the severity of these crashes. This report describes and documents ...

  9. Run-Off-Road Collision Avoidance Countermeasures Using IVHS Countermeasures, Task 4, Volume 2: Rorsim Manual, Final Report

    DOT National Transportation Integrated Search

    1995-09-01

    The Run-Off-Road Collision Avoidance Using IVHS Countermeasures program is to address the single vehicle crash problem through application of technology to prevent and/or reduce the severity of these crashes. This report documents the RORSIM com...

  10. Run-Off Road Collision Avoidance Countermeasures Using IVHS Countermeasures Task 1 Vol. 2 Support Volume

    DOT National Transportation Integrated Search

    1994-10-28

    The Run-Off-Road Collision Avoidance Using IVHS Countermeasures program is to address the single vehicle crash problem through application of technology to prevent and/or reduce the severity of these crashes. This report contains a summary of data us...

  11. GEOTHERM Data Set

    DOE Data Explorer

    DeAngelo, Jacob

    1983-01-01

    GEOTHERM is a comprehensive system of public databases and software used to store, locate, and evaluate information on the geology, geochemistry, and hydrology of geothermal systems. Three main databases address the general characteristics of geothermal wells and fields, and the chemical properties of geothermal fluids; the last database is currently the most active. System tasks are divided into four areas: (1) data acquisition and entry, involving data entry via word processors and magnetic tape; (2) quality assurance, including the criteria and standards handbook and front-end data-screening programs; (3) operation, involving database backups and information extraction; and (4) user assistance, preparation of such items as application programs, and a quarterly newsletter. The principal task of GEOTHERM is to provide information and research support for the conduct of national geothermal-resource assessments. The principal users of GEOTHERM are those involved with the Geothermal Research Program of the U.S. Geological Survey.

  12. The First Development of Human Factors Engineering Requirements for Application to Ground Task Design for a NASA Flight Program

    NASA Technical Reports Server (NTRS)

    Dischinger, H. Charles, Jr.; Stambolian, Damon B.; Miller, Darcy H.

    2008-01-01

    The National Aeronautics and Space Administration has long applied standards-derived human engineering requirements to the development of hardware and software for use by astronauts while in flight. The most important source of these requirements has been NASA-STD-3000. While there have been several ground systems human engineering requirements documents, none has been applicable to the flight system as handled at NASA's launch facility at Kennedy Space Center. At the time of the development of previous human launch systems, there were other considerations that were deemed more important than developing worksites for ground crews; e.g., hardware development schedule and vehicle performance. However, experience with these systems has shown that failure to design for ground tasks has resulted in launch schedule delays, ground operations that are more costly than they might be, and threats to flight safety. As the Agency begins the development of new systems to return humans to the moon, the new Constellation Program is addressing this issue with a new set of human engineering requirements. Among these requirements is a subset that will apply to the design of the flight components and that is intended to assure ground crew success in vehicle assembly and maintenance tasks. These requirements address worksite design for usability and for ground crew safety.

  13. Using linear programming to minimize the cost of nurse personnel.

    PubMed

    Matthews, Charles H

    2005-01-01

    Nursing personnel costs make up a major portion of most hospital budgets. This report evaluates and optimizes the utility of the nurse personnel at the Internal Medicine Outpatient Clinic of Wake Forest University Baptist Medical Center. Linear programming (LP) was employed to determine the effective combination of nurses that would allow for all weekly clinic tasks to be covered while providing the lowest possible cost to the department. Linear programming is a standard feature of spreadsheet software that allows the operator to establish the variables to be optimized and to enter a series of constraints that will each have an impact on the ultimate outcome. The application is therefore able to quantify and stratify the nurses necessary to execute the tasks. With the report, a specific sensitivity analysis can be performed to assess just how sensitive the outcome is to the stress of adding or deleting a nurse to or from the payroll. The nurse employee cost structure in this study consisted of five certified nurse assistants (CNA), three licensed practical nurses (LPN), and five registered nurses (RN). The LP revealed that the outpatient clinic should staff four RNs, three LPNs, and four CNAs with 95 percent confidence of covering nurse demand on the floor. This combination of nurses would enable the clinic to: 1. Reduce annual staffing costs by 16 percent; 2. Force each level of nurse to be optimally productive by focusing on tasks specific to their expertise; 3. Assign accountability more efficiently as the nurses adhere to their specific duties; and 4. Ultimately provide a competitive advantage to the clinic as it relates to nurse employee and patient satisfaction. Linear programming can be used to solve capacity problems for just about any staffing situation, provided the model is indeed linear.
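    The staffing optimization described above can be sketched in a few lines of standard-library Python. The study used spreadsheet LP; this sketch instead does an exhaustive search over integer staff counts, and all costs, hours, and the demand figure below are illustrative assumptions, not the study's data:

```python
# Find the cheapest nurse mix meeting a weekly coverage constraint by
# brute-force search over integer staff counts (a stand-in for LP).
from itertools import product

# Hypothetical annual cost per nurse and weekly hours each provides.
COST   = {"CNA": 30000, "LPN": 40000, "RN": 60000}
HOURS  = {"CNA": 40, "LPN": 40, "RN": 40}
DEMAND = 400                                   # weekly nurse-hours required (assumed)
CAP    = {"CNA": 5, "LPN": 3, "RN": 5}         # current staff ceilings from the study

def cheapest_mix():
    """Return (cost, staff dict) for the cheapest feasible staffing mix."""
    best = None
    for cna, lpn, rn in product(range(CAP["CNA"] + 1),
                                range(CAP["LPN"] + 1),
                                range(CAP["RN"] + 1)):
        staff = {"CNA": cna, "LPN": lpn, "RN": rn}
        hours = sum(n * HOURS[kind] for kind, n in staff.items())
        if hours < DEMAND:
            continue                           # coverage constraint violated
        cost = sum(n * COST[kind] for kind, n in staff.items())
        if best is None or cost < best[0]:
            best = (cost, staff)
    return best

print(cheapest_mix())  # (390000, {'CNA': 5, 'LPN': 3, 'RN': 2})
```

With real clinic data one would add per-task constraints (only RNs may perform certain duties), which is where a genuine LP solver earns its keep.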

  14. Navy LPD-17 Amphibious Ship Procurement: Background, Issues, and Options for Congress

    DTIC Science & Technology

    2009-04-07

    ... disrupting the optimal construction sequence and application of lessons learned for follow-on vessels in these programs ... statements among others, although not necessarily in the order shown below: "I am deeply concerned about Northrop Grumman Ship Systems' (NGSS...

  15. The Landsat program: Its origins, evolution, and impacts

    USGS Publications Warehouse

    Lauer, D.T.; Morain, S.A.; Salomonson, V.V.

    1997-01-01

    Landsat 1 began an era of space-based resource data collection that changed the way science, industry, governments, and the general public view the Earth. For the last 25 years, the Landsat program - despite being hampered by institutional problems and budget uncertainties - has successfully provided a continuous supply of synoptic, repetitive, multi-spectral data of the Earth's land areas. These data have profoundly affected programs for mapping resources, monitoring environmental changes, and assessing global habitability. The societal applications this program generated are so compelling that international systems have proliferated to carry on the tasks initiated with Landsat data.

  16. A comparison of older adults' subjective experiences with virtual and real environments during dynamic balance activities.

    PubMed

    Proffitt, Rachel; Lange, Belinda; Chen, Christina; Winstein, Carolee

    2015-01-01

    The purpose of this study was to explore the subjective experience of older adults interacting with both virtual and real environments. Thirty healthy older adults engaged with real and virtual tasks of similar motor demands: reaching to a target in standing and stepping stance. Immersive tendencies and absorption scales were administered before the session. Game engagement and experience questionnaires were completed after each task, followed by a semistructured interview at the end of the testing session. Data were analyzed respectively using paired t tests and grounded theory methodology. Participants preferred the virtual task over the real task. They also reported an increase in presence and absorption with the virtual task, describing an external focus of attention. Findings will be used to inform future development of appropriate game-based balance training applications that could be embedded in the home or community settings as part of evidence-based fall prevention programs.

  17. RINGMesh: A programming library for developing mesh-based geomodeling applications

    NASA Astrophysics Data System (ADS)

    Pellerin, Jeanne; Botella, Arnaud; Bonneau, François; Mazuyer, Antoine; Chauvin, Benjamin; Lévy, Bruno; Caumon, Guillaume

    2017-07-01

    RINGMesh is a C++ open-source programming library for manipulating discretized geological models. It is designed to ease the development of applications and workflows that use discretized 3D models. It is neither a geomodeler nor a meshing software. RINGMesh implements functionalities to read discretized surface-based or volumetric structural models and to check their validity. The models can then be exported in various file formats. RINGMesh provides data structures to represent geological structural models, defined either by their discretized boundary surfaces, by discretized volumes, or both. A programming interface allows the development of new geomodeling methods and the integration of external software. The goal of RINGMesh is to help researchers focus on the implementation of their specific method rather than on tedious tasks common to many applications. The documented code is open-source and distributed under the modified BSD license. It is available at https://www.ring-team.org/index.php/software/ringmesh.
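    One of the "tedious tasks common to many applications" that such a library handles is model validity checking. As an illustration (a Python sketch, not RINGMesh's API), a basic check on a boundary-surface model is that every edge of a closed triangulated surface is shared by exactly two triangles:

```python
# Closed-surface check: count how many triangles share each edge.
from collections import Counter

def is_closed_surface(triangles):
    """triangles: list of (i, j, k) vertex-index triples."""
    edges = Counter()
    for a, b, c in triangles:
        for u, v in ((a, b), (b, c), (c, a)):
            edges[frozenset((u, v))] += 1   # edge orientation is irrelevant
    return all(count == 2 for count in edges.values())

# A tetrahedron's four faces form a closed surface; drop one and it is open.
tet = [(0, 1, 2), (0, 1, 3), (0, 2, 3), (1, 2, 3)]
print(is_closed_surface(tet))        # True
print(is_closed_surface(tet[:3]))    # False
```

A full validity suite would also check surface intersections and the consistency of geological region boundaries, but edge-manifoldness is the usual first gate.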

  18. Authorship Attribution of Source Code

    ERIC Educational Resources Information Center

    Tennyson, Matthew F.

    2013-01-01

    Authorship attribution of source code is the task of deciding who wrote a program, given its source code. Applications include software forensics, plagiarism detection, and determining software ownership. A number of methods for the authorship attribution of source code have been presented in the past. A review of those existing methods is…
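    The task described above can be sketched as a tiny stylometric classifier: represent each program by simple layout metrics and attribute an unknown program to the nearest known author. This is an illustrative sketch, not one of the surveyed methods; the features and authors are hypothetical:

```python
# Nearest-author attribution over a few crude style features.
def features(source):
    lines = source.splitlines() or [""]
    return (
        sum(len(l) for l in lines) / len(lines),              # mean line length
        sum(l.startswith("\t") for l in lines) / len(lines),  # tab-indent ratio
        source.count("{") / max(len(source), 1),              # brace density
    )

def attribute(unknown, corpus):
    """corpus: {author: known source code}; return the closest author."""
    target = features(unknown)
    def dist(author):
        f = features(corpus[author])
        return sum((a - b) ** 2 for a, b in zip(target, f))
    return min(corpus, key=dist)

corpus = {
    "alice": "\tdef f():\n\t\treturn 1",   # alice indents with tabs
    "bob":   "def g():\n    return 2",     # bob indents with spaces
}
print(attribute("\tdef h():\n\t\treturn 3", corpus))  # alice
```

Published methods use far richer features (n-grams, AST shapes, identifier habits) and proper classifiers, but the represent-then-compare pipeline is the same.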

  19. Narrative Reproduction in Preschoolers after the Application of an Inductive Reasoning Training Programme.

    ERIC Educational Resources Information Center

    Manavopoulos, Konstantin; Tzouriadou, Maria

    1998-01-01

    Investigated whether preschoolers had acquired event knowledge schemata, and the impact of an inductive reasoning training program on knowledge transfer in story recall tasks. Found that inductive reasoning training led to knowledge transfer in kindergartners but had only minor influence on prekindergartners. (Author/KB)

  20. Advanced Visualization and Interactive Display Rapid Innovation and Discovery Evaluation Research Program task 8: Survey of WEBGL Graphics Engines

    DTIC Science & Technology

    2015-01-01

    A search of the internet looking at web sites specializing in graphics, graphics engines, web browser applications, and games was conducted to...

  1. Analysis of rocket engine injection combustion processes

    NASA Technical Reports Server (NTRS)

    Salmon, J. W.

    1976-01-01

    A critique is given of the JANNAF sub-critical propellant injection/combustion process analysis computer models and application of the models to correlation of well documented hot fire engine data bases. These programs are the distributed energy release (DER) model for conventional liquid propellant injectors and the coaxial injection combustion model (CICM) for gaseous annulus/liquid core coaxial injectors. The critique identifies model inconsistencies while the computer analyses provide quantitative data on predictive accuracy. The program is comprised of three tasks: (1) computer program review and operations; (2) analysis and data correlations; and (3) documentation.

  2. Force Management Methods. Task 1 Report. Current Methods

    DTIC Science & Technology

    1978-12-01

    Information about the F/FB-111 MCR system is presented in USAF T.O. 1F-111A-2-1-2 ("F-111 Service Usage Recorder Program -- Data Collection and Reporting") and T.O. 1F-111(B)-2-1-2 ("FB-111 Service Usage Program -- Data Collection and Reporting"). The former covers application of the MCR system in the F-111A... "Control Program"; NOR 71-109, "Structural Description Report"; NOR 71-214, "Structural Fatigue Criteria"; NOR 76-70, "Structural Fatigue Criteria for Saudi...

  3. Aviation and programmatic analyses; Volume 1, Task 1: Aviation data base development and application. [for NASA OAST programs

    NASA Technical Reports Server (NTRS)

    1977-01-01

    A method was developed for using the NASA aviation data base and computer programs in conjunction with the GE management analysis and projection service to perform simple and complex economic analysis for planning, forecasting, and evaluating OAST programs. Capabilities of the system are discussed along with procedures for making basic data tabulations, updates and entries. The system is applied in an agricultural aviation study in order to assess its value for actual utility in the OAST working environment.

  4. A Case Study: Using Delmia at Kennedy Space Center to Support NASA's Constellation Program

    NASA Technical Reports Server (NTRS)

    Kickbusch, Tracey; Humeniuk, Bob

    2010-01-01

    The presentation examines the use of Delmia (Digital Enterprise Lean Manufacturing Interactive Application) for digital simulation in NASA's Constellation Program. Topics include an overview of the Kennedy Space Center (KSC) Design Visualization Group tasks, NASA's Constellation Program, the Ares 1 ground processing preliminary design review, and how Delmia is used at KSC. Challenges include dealing with large data sets, creating and maintaining KSC's infrastructure, gathering customer requirements and meeting objectives, creating life-like simulations, and providing quick turn-around on varied products.

  5. General Reevaluation and Supplement to Environmental Impact Statement for Flood Control and Related Purposes. Red and Red Lake Rivers at East Grand Forks, Minnesota.

    DTIC Science & Technology

    1984-11-01

    ... participate in the project. The city has also entered the regular phase of the National Flood Insurance Program, adopted 23 September 1977. ... releases to possible sites outside the area of city control/responsibility during periods of low flow. ... The Red Lake Watershed District has a current program...

  6. Architecture, Design and Implementation of RC64, a Many-Core High-Performance DSP for Space Applications

    NASA Astrophysics Data System (ADS)

    Ginosar, Ran; Aviely, Peleg; Liran, Tuvia; Alon, Dov; Dobkin, Reuven; Goldberg, Michael

    2013-08-01

    RC64, a novel 64-core signal processing chip, targets DSP performance of 12.8 GIPS, 100 GOPS and 12.8 single-precision GFLOPS while dissipating only 3 Watts. RC64 employs advanced DSP cores, a multi-bank shared memory and a hardware scheduler, supports DDR2 memory and communicates over five proprietary 6.4 Gbps channels. The programming model employs sequential fine-grain tasks and a separate task map to define task dependencies. RC64 is implemented as a 200 MHz ASIC on Tower 130nm CMOS technology, assembled in a hermetically sealed ceramic QFP package and qualified to the highest space standards.
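    The task-map programming model described above can be sketched as follows. This is an illustrative software stand-in for the chip's hardware scheduler, with hypothetical task names; a dependency map is walked and tasks are dispatched only once their predecessors finish:

```python
# Dispatch fine-grain tasks in dependency order from a separate task map.
from collections import deque

def schedule(task_map):
    """task_map: {task: [tasks it depends on]}; return a valid run order."""
    pending = {t: set(deps) for t, deps in task_map.items()}
    ready = deque(sorted(t for t, deps in pending.items() if not deps))
    order = []
    while ready:
        t = ready.popleft()
        order.append(t)                        # a free core would run t here
        for u, deps in pending.items():
            if t in deps:
                deps.remove(t)                 # t's completion unblocks u
                if not deps and u not in order and u not in ready:
                    ready.append(u)
    return order

# Hypothetical pipeline: load data, two independent transforms, combine.
print(schedule({"load": [], "fft_a": ["load"], "fft_b": ["load"],
                "combine": ["fft_a", "fft_b"]}))
```

On the actual chip the scheduler is hardware and dispatches ready tasks to idle cores in parallel; the dependency bookkeeping is the same idea.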

  7. STS pilot user development program

    NASA Technical Reports Server (NTRS)

    Mcdowell, J. R.

    1977-01-01

    Full exploitation of the STS capabilities will depend not only on the extensive use of the STS for known space applications and research, but also on new, innovative ideas of use originating with both current and new users. In recognition of this, NASA has been engaged in a User Development Program for the STS. The program began with four small studies. Each study addressed a separate sector of potential new users to identify techniques and methodologies for user development. The collective results established that a user development function was not only feasible, but necessary for NASA to realize the full potential of the STS. This final report begins with a description of the overall pilot program plan, which involved five specific tasks defined in the contract Statement of Work. Each task is then discussed separately; but two subjects, the development of principal investigators and space processing users, are discussed separately for improved continuity of thought. These discussions are followed by a summary of the primary results and conclusions of the Pilot User Development Program. Specific recommendations of the study are given.

  8. Overview of the AVT-191 Project to Assess Sensitivity Analysis and Uncertainty Quantification Methods for Military Vehicle Design

    NASA Technical Reports Server (NTRS)

    Benek, John A.; Luckring, James M.

    2017-01-01

    A NATO symposium held in 2008 identified many promising sensitivity analysis and uncertainty quantification technologies, but the maturity and suitability of these methods for realistic applications was not known. The STO Task Group AVT-191 was established to evaluate the maturity and suitability of various sensitivity analysis and uncertainty quantification methods for application to realistic problems of interest to NATO. The program ran from 2011 to 2015, and the work was organized into four discipline-centric teams: external aerodynamics, internal aerodynamics, aeroelasticity, and hydrodynamics. This paper presents an overview of the AVT-191 program content.

  9. Magnetically suspended reaction wheel assembly

    NASA Technical Reports Server (NTRS)

    Stocking, G.

    1984-01-01

    The magnetically suspended reaction wheel assembly (MSRWA) is the product of a development effort funded by the Air Force Materials Laboratory (AFML) at Wright Patterson AFB. The specific objective of the project was to establish the manufacturing processes for samarium cobalt magnets and demonstrate their use in a space application. The development was successful on both counts. The application portion of the program, which involves the magnetically suspended reaction wheel assembly, is emphasized. The requirements for the reaction wheel were based on the bias wheel requirements of the DSP satellite. The tasks included the design, fabrication, and test of the unit to the DSP program qualification requirements.

  10. Magnetically suspended reaction wheel assembly

    NASA Astrophysics Data System (ADS)

    Stocking, G.

    1984-11-01

    The magnetically suspended reaction wheel assembly (MSRWA) is the product of a development effort funded by the Air Force Materials Laboratory (AFML) at Wright Patterson AFB. The specific objective of the project was to establish the manufacturing processes for samarium cobalt magnets and demonstrate their use in a space application. The development was successful on both counts. The application portion of the program, which involves the magnetically suspended reaction wheel assembly, is emphasized. The requirements for the reaction wheel were based on the bias wheel requirements of the DSP satellite. The tasks included the design, fabrication, and test of the unit to the DSP program qualification requirements.

  11. A remote sensing applications update: Results of interviews with Earth Observations Commercialization Program (EOCAP) participants

    NASA Technical Reports Server (NTRS)

    Mcvey, Sally

    1991-01-01

    Earth remote sensing is a uniquely valuable tool for large-scale resource management, a task whose importance will likely increase world-wide through the foreseeable future. NASA research and engineering have virtually created the existing U.S. system, and will continue to push the frontiers, primarily through Earth Observing System (EOS) instruments, research, and data and information systems. It is the researchers' view that the near-term health of remote sensing applications also deserves attention; it seems important not to abandon the system or its clients. The researchers suggest that, like its Landsat predecessor, a successful Earth Observing System program is likely to reinforce pressure to 'manage' natural resources, and consequently, to create more pressure for Earth Observations Commercialization (EOCAP) type applications. The current applications programs, though small, are valuable because of their technical and commercial results, and also because they support a community whose contributions will increase along with our ability to observe the Earth from space.

  12. Analysis of high-order languages for use on space station application software

    NASA Technical Reports Server (NTRS)

    Knoebel, A.

    1986-01-01

    Considered in this study is the general and not easily resolved problem of how to choose the right programming language for a particular task. This is specialized to the question of which versions of what languages should be chosen for the multitude of tasks that the Marshall Space Flight Center will be responsible for in the Space Station. Four criteria are presented: theoretical considerations, quantitative matrices, qualitative benchmarks, and the monitoring of programmers. Specific recommendations for future studies are given to resolve these questions for the Space Station.

  13. A Multiple Linear Regression Model for Predicting Zone A Retention by Military Occupational Specialty.

    DTIC Science & Technology

    1986-09-01

    Approved for public release; distribution is unlimited.

  14. Expert system verification and validation study. Phase 2: Requirements Identification. Delivery 2: Current requirements applicability

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The second phase of a task is described which has the ultimate purpose of ensuring that adequate Expert Systems (ESs) Verification and Validation (V and V) tools and techniques are available for Space Station Freedom Program Knowledge Based Systems development. The purpose of this phase is to recommend modifications to current software V and V requirements which will extend the applicability of the requirements to NASA ESs.

  15. Quality Control Analysis of Selected Aspects of Programs Administered by the Bureau of Student Financial Assistance. Error-Prone Model Derived from 1978-1979 Quality Control Study. Data Report. [Task 3.

    ERIC Educational Resources Information Center

    Saavedra, Pedro; Kuchak, JoAnn

    An error-prone model (EPM) to predict financial aid applicants who are likely to misreport on Basic Educational Opportunity Grant (BEOG) applications was developed, based on interviews conducted with a quality control sample of 1,791 students during 1978-1979. The model was designed to identify corrective methods appropriate for different types of…

  16. Standoff reconnaissance imagery - Applications and interpreter training

    NASA Astrophysics Data System (ADS)

    Gustafson, G. C.

    1980-01-01

    The capabilities, advantages and applications of Long Range Oblique Photography (LOROP) standoff air reconnaissance cameras are reviewed, with emphasis on the problems likely to be encountered in photo interpreter training. Results of student exercises in descriptive image analysis and mensuration are presented and discussed, and current work on the computer programming of oblique and panoramic mensuration tasks is summarized. Numerous examples of this class of photographs and their interpretation at various magnifications are also presented.

  17. Control Improvisation with Application to Music

    DTIC Science & Technology

    2013-11-04

    Control Improvisation with Application to Music. Alexandre Donze; Sophie Libkind; Sanjit A. Seshia; David Wessel, Electrical Engineering and Computer Sciences. ...the domain of music. More specifically, we consider the scenario of generating a monophonic Jazz melody (solo) on a given song harmonization. The music is...

  18. A study of mass data storage technology for rocket engine data

    NASA Technical Reports Server (NTRS)

    Ready, John F.; Benser, Earl T.; Fritz, Bernard S.; Nelson, Scott A.; Stauffer, Donald R.; Volna, William M.

    1990-01-01

    The results of a nine month study program on mass data storage technology for rocket engine (especially the Space Shuttle Main Engine) health monitoring and control are summarized. The program had the objective of recommending a candidate mass data storage technology development for rocket engine health monitoring and control and of formulating a project plan and specification for that technology development. The work was divided into three major technical tasks: (1) development of requirements; (2) survey of mass data storage technologies; and (3) definition of a project plan and specification for technology development. The first of these tasks reviewed current data storage technology and developed a prioritized set of requirements for the health monitoring and control applications. The second task included a survey of state-of-the-art and newly developing technologies and a matrix-based ranking of the technologies. It culminated in a recommendation of optical disk technology as the best candidate for technology development. The final task defined a proof-of-concept demonstration, including tasks required to develop, test, analyze, and demonstrate the technology advancement, plus an estimate of the level of effort required. The recommended demonstration emphasizes development of an optical disk system which incorporates an order-of-magnitude increase in writing speed above the current state of the art.

  19. Emerald: an object-based language for distributed programming

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hutchinson, N.C.

    1987-01-01

    Distributed systems have become more common; however, constructing distributed applications remains a very difficult task. Numerous operating systems and programming languages have been proposed that attempt to simplify the programming of distributed applications. Here a programming language called Emerald is presented that simplifies distributed programming by extending the concepts of object-based languages to the distributed environment. Emerald supports a single model of computation: the object. Emerald objects include private entities such as integers and Booleans, as well as shared, distributed entities such as compilers, directories, and entire file systems. Emerald objects may move between machines in the system, but object invocation is location independent. The uniform semantic model used for describing all Emerald objects makes the construction of distributed applications in Emerald much simpler than in systems where the differences in implementation between local and remote entities are visible in the language semantics. Emerald incorporates a type system that deals only with the specification of objects, ignoring differences in implementation. Thus, two different implementations of the same abstraction may be freely mixed.
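    The location-independent invocation the abstract highlights can be loosely illustrated in Python (a toy sketch, not Emerald itself; the `Proxy` and `Directory` names are hypothetical): callers invoke through a proxy whose code path is identical whether the object is "local" or has migrated.

```python
class Proxy:
    """Forwards every method call to wherever the target currently lives.

    A toy stand-in for Emerald's location-independent invocation: the
    caller's code is identical whether the object is "local" or "remote".
    """
    def __init__(self, obj, location="node-a"):
        self._obj, self.location = obj, location

    def move_to(self, location):
        # Objects may migrate between machines in the system.
        self.location = location

    def __getattr__(self, name):
        # Invocation is forwarded; location never appears in the call.
        return getattr(self._obj, name)

class Directory:
    def __init__(self):
        self._entries = {}
    def bind(self, name, value):
        self._entries[name] = value
    def lookup(self, name):
        return self._entries[name]

d = Proxy(Directory())
d.bind("readme", 42)
d.move_to("node-b")          # migration is invisible to callers
assert d.lookup("readme") == 42
```

    The design point being sketched is that migration changes only bookkeeping state, never the caller's invocation syntax.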

  20. Design study of wind turbines, 50 kW to 3000 kW for electric utility applications: Executive summary

    NASA Technical Reports Server (NTRS)

    1977-01-01

    Preliminary designs of low power (50 to 500 kW) and high power (500 to 3000 kW) wind generator systems (WGS) for electric utility applications were developed. These designs provide the bases for detail design, fabrication, and experimental demonstration testing of these units at selected utility sites. Several feasible WGS configurations were evaluated, and the concept offering the lowest energy cost potential and minimum technical risk for utility applications was selected. The selected concept was optimized utilizing a parametric computer program prepared for this purpose. The utility requirements evaluation task examined the economic, operational and institutional factors affecting the WGS in a utility environment, and provided additional guidance for the preliminary design effort. Results of the conceptual design task indicated that a rotor operating at constant speed, driving an AC generator through a gear transmission is the most cost effective WGS configuration.

  1. Computer program for the automated attendance accounting system

    NASA Technical Reports Server (NTRS)

    Poulson, P.; Rasmusson, C.

    1971-01-01

    The automated attendance accounting system (AAAS) was developed under the auspices of the Space Technology Applications Program. The task is basically the adaptation of a small digital computer, coupled with specially developed pushbutton terminals located in school classrooms and offices for the purpose of taking daily attendance, maintaining complete attendance records, and producing partial and summary reports. Especially developed for high schools, the system is intended to relieve both teachers and office personnel from the time-consuming and dreary task of recording and analyzing the myriad classroom attendance data collected throughout the semester. In addition, since many school district budgets are related to student attendance, the increase in accounting accuracy is expected to augment district income. A major component of this system is the real-time AAAS software system, which is described.

  2. Bayesian Modeling of a Human MMORPG Player

    NASA Astrophysics Data System (ADS)

    Synnaeve, Gabriel; Bessière, Pierre

    2011-03-01

    This paper describes an application of Bayesian programming to the control of an autonomous avatar in a multiplayer role-playing game (the example is based on World of Warcraft). We model a particular task, which consists of choosing what to do and to select which target in a situation where allies and foes are present. We explain the model in Bayesian programming and show how we could learn the conditional probabilities from data gathered during human-played sessions.
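    A minimal sketch of the kind of model described, assuming a naive frequency estimate of P(action | situation) learned from logged sessions (the situations, actions, and data here are illustrative, not from the paper):

```python
from collections import Counter, defaultdict

# Count (situation -> action) pairs from logged human-played sessions
# (toy data; the real model conditions on which allies and foes are present).
sessions = [
    ("foes_near", "attack"), ("foes_near", "attack"), ("foes_near", "flee"),
    ("ally_hurt", "heal"),   ("ally_hurt", "heal"),   ("ally_hurt", "attack"),
]

counts = defaultdict(Counter)
for situation, action in sessions:
    counts[situation][action] += 1

def p(action, situation):
    """Conditional probability P(action | situation) from the counts."""
    total = sum(counts[situation].values())
    return counts[situation][action] / total

def choose(situation):
    """Pick the most probable action for the current situation."""
    return max(counts[situation], key=lambda a: p(a, situation))

assert abs(p("attack", "foes_near") - 2 / 3) < 1e-9
assert choose("ally_hurt") == "heal"
```

    A full Bayesian program would place priors over these tables and extend the conditioning to target selection, but the learn-from-sessions-then-decide loop is the same.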

  3. Navy LPD-17 Amphibious Ship Procurement: Background, Issues, and Options for Congress

    DTIC Science & Technology

    2009-06-04

    ...the context of the best balance of the force overall: “While we agree on requirements—and the Navy and Marine Corps are pretty aligned on that—we have to...” The report also covers the sequence and application of lessons learned for follow-on vessels in these programs.

  4. Early Program Development

    NASA Image and Video Library

    1970-01-01

    Managed by Marshall Space Flight Center, the Space Tug concept was intended to be a reusable multipurpose space vehicle designed to transport payloads to different orbital inclinations. Utilizing mission-specific combinations of its three primary modules (crew, propulsion, and cargo) and a variety of supplementary kits, the Space Tug was capable of numerous space applications. This 1970 artist's concept represents a typical configuration required to conduct operations and tasks in Earth orbit. The Space Tug program was cancelled and did not become a reality.

  5. Force Identification from Structural Response

    DTIC Science & Technology

    1999-12-01

    Report documentation page only: a student thesis at the University of New Mexico under AFIT's Civilian Institution Programs (AFIT/CIA), approved for public release IAW AFR 190-1 (Ernest A. Haygood, 1st Lt, USAF, Executive Officer, Civilian Institution Programs).

  6. Real-time implementations of image segmentation algorithms on shared memory multicore architecture: a survey (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Akil, Mohamed

    2017-05-01

    Real-time processing is becoming more and more important in many image processing applications. Image segmentation is one of the most fundamental tasks in image analysis, and many different approaches to it have been proposed. The watershed transform is a well-known segmentation tool, but it is a very data-intensive task. To accelerate watershed algorithms toward real-time processing, parallel architectures and programming models for multicore computing have been developed. This paper surveys approaches for parallel implementation of sequential watershed algorithms on multicore general-purpose CPUs: homogeneous multicore processors with shared memory. To achieve an efficient parallel implementation, it is necessary to explore different strategies (parallelization, distribution, distributed scheduling) combined with acceleration and optimization techniques that enhance parallelism. We compare various parallelizations of sequential watershed algorithms on shared-memory multicore architecture, analyze the performance measurements of each parallel implementation, and assess the impact of the different sources of overhead on performance. We also discuss the advantages and disadvantages of the parallel programming models, comparing OpenMP (an application programming interface for multiprocessing) with Pthreads (POSIX Threads) to illustrate the impact of each parallel programming model on the performance of the parallel implementations.
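    The data decomposition such shared-memory parallelizations rely on can be sketched in Python; a simple threshold segmentation stands in for the watershed transform, and a thread pool stands in for OpenMP/Pthreads workers (all names and data here are illustrative):

```python
from concurrent.futures import ThreadPoolExecutor

# Toy 4x8 grayscale "image"; the survey parallelizes the watershed
# transform, but a simple threshold segmentation shows the same
# row-strip decomposition used on shared-memory multicores.
image = [[(r * 8 + c) % 17 for c in range(8)] for r in range(4)]
labels = [[0] * 8 for _ in range(4)]        # shared output buffer

def segment_strip(rows):
    """Each worker labels its own strip of rows (disjoint, so no locks)."""
    for r in rows:
        for c in range(8):
            labels[r][c] = 1 if image[r][c] > 8 else 0

strips = [range(0, 2), range(2, 4)]          # one strip per worker
with ThreadPoolExecutor(max_workers=2) as pool:
    list(pool.map(segment_strip, strips))

serial = [[1 if v > 8 else 0 for v in row] for row in image]
assert labels == serial                      # parallel result matches serial
```

    The real watershed case is harder precisely because flooding crosses strip boundaries, which is why the survey's distribution and scheduling strategies matter.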

  7. Extraterrestrial applications of solar optics for interior illumination

    NASA Technical Reports Server (NTRS)

    Eijadi, David A.; Williams, Kyle D.

    1992-01-01

    Solar optics is a terrestrial technology that has potential extraterrestrial applications. Active solar optics (ASO) and passive solar optics (PSO) are two approaches to the transmission of sunlight to remote interior spaces. Active solar optics is most appropriate for task illumination, while PSO is most appropriate for general illumination. Research into solar optics, motivated by energy conservation, has produced lightweight and low-cost materials, products that have applications to NASA's Controlled Ecological Life Support System (CELSS) program and its lunar base studies. Specifically, prism light guides have great potential in these contexts. Several applications of solar optics to lunar base concepts are illustrated.

  8. MPI, HPF or OpenMP: A Study with the NAS Benchmarks

    NASA Technical Reports Server (NTRS)

    Jin, Hao-Qiang; Frumkin, Michael; Hribar, Michelle; Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1999-01-01

    Porting applications to new high performance parallel and distributed platforms is a challenging task. Writing parallel code by hand is time consuming and costly, but the task can be simplified by high-level languages and, better still, automated by parallelizing tools and compilers. The definition of the HPF (High Performance Fortran, based on the data parallel model) and OpenMP (based on the shared memory parallel model) standards has offered great opportunity in this respect. Both provide simple and clear interfaces to languages like Fortran and simplify many tedious tasks encountered in writing message passing programs. In our study we implemented parallel versions of the NAS Benchmarks with HPF and OpenMP directives. Comparison of their performance with the MPI implementation, and the pros and cons of the different approaches, will be discussed, along with our experience of using computer-aided tools to help parallelize these benchmarks. Based on the study, the potential of applying some of the techniques to realistic aerospace applications will be presented.

  9. MPI, HPF or OpenMP: A Study with the NAS Benchmarks

    NASA Technical Reports Server (NTRS)

    Jin, H.; Frumkin, M.; Hribar, M.; Waheed, A.; Yan, J.; Saini, Subhash (Technical Monitor)

    1999-01-01

    Porting applications to new high performance parallel and distributed platforms is a challenging task. Writing parallel code by hand is time consuming and costly, but this task can be simplified by high-level languages and, better still, automated by parallelizing tools and compilers. The definition of the HPF (High Performance Fortran, based on the data parallel model) and OpenMP (based on the shared memory parallel model) standards has offered great opportunity in this respect. Both provide simple and clear interfaces to languages like Fortran and simplify many tedious tasks encountered in writing message passing programs. In our study, we implemented parallel versions of the NAS Benchmarks with HPF and OpenMP directives. Comparison of their performance with the MPI implementation, and the pros and cons of the different approaches, will be discussed, along with our experience of using computer-aided tools to help parallelize these benchmarks. Based on the study, the potential of applying some of the techniques to realistic aerospace applications will be presented.

  10. Varied overground walking-task practice versus body-weight-supported treadmill training in ambulatory adults within one year of stroke: a randomized controlled trial protocol.

    PubMed

    DePaul, Vincent G; Wishart, Laurie R; Richardson, Julie; Lee, Timothy D; Thabane, Lehana

    2011-10-21

    Although task-oriented training has been shown to improve walking outcomes after stroke, it is not yet clear whether one task-oriented approach is superior to another. The purpose of this study is to compare the effectiveness of the Motor Learning Walking Program (MLWP), a varied overground walking task program consistent with key motor learning principles, to body-weight-supported treadmill training (BWSTT) in community-dwelling, ambulatory, adults within 1 year of stroke. A parallel, randomized controlled trial with stratification by baseline gait speed will be conducted. Allocation will be controlled by a central randomization service and participants will be allocated to the two active intervention groups (1:1) using a permuted block randomization process. Seventy participants will be assigned to one of two 15-session training programs. In MLWP, one physiotherapist will supervise practice of various overground walking tasks. Instructions, feedback, and guidance will be provided in a manner that facilitates self-evaluation and problem solving. In BWSTT, training will emphasize repetition of the normal gait cycle while supported over a treadmill, assisted by up to three physiotherapists. Outcomes will be assessed by a blinded assessor at baseline, post-intervention and at 2-month follow-up. The primary outcome will be post-intervention comfortable gait speed. Secondary outcomes include fast gait speed, walking endurance, balance self-efficacy, participation in community mobility, health-related quality of life, and goal attainment. Groups will be compared using analysis of covariance with baseline gait speed strata as the single covariate. Intention-to-treat analysis will be used. In order to direct clinicians, patients, and other health decision-makers, there is a need for a head-to-head comparison of different approaches to active, task-related walking training after stroke. 
We hypothesize that outcomes will be optimized through the application of a task-related training program that is consistent with key motor learning principles related to practice, guidance and feedback. ClinicalTrials.gov # NCT00561405.
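    The 1:1 permuted-block allocation the protocol specifies can be sketched as follows; the block size and random seed are assumptions, since the protocol does not state them:

```python
import random

def permuted_block_allocation(n, block_size=4, seed=7):
    """1:1 allocation to MLWP vs BWSTT using permuted blocks.

    Each block holds equal numbers of both arms in random order, so the
    groups can never drift far out of balance. Block size 4 and the seed
    are illustrative; the protocol does not specify them.
    """
    rng = random.Random(seed)
    arms = []
    while len(arms) < n:
        block = ["MLWP"] * (block_size // 2) + ["BWSTT"] * (block_size // 2)
        rng.shuffle(block)
        arms.extend(block)
    return arms[:n]

alloc = permuted_block_allocation(70)
assert len(alloc) == 70
# Arms stay near 1:1 (the final, possibly truncated block can differ by 2):
assert abs(alloc.count("MLWP") - alloc.count("BWSTT")) <= 2
# Within every complete block the arms are exactly balanced:
assert all(alloc[i:i + 4].count("MLWP") == 2 for i in range(0, 68, 4))
```

    In the trial itself the sequence would additionally be stratified by baseline gait speed and held by the central randomization service, not by the enrolling sites.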

  11. Varied overground walking-task practice versus body-weight-supported treadmill training in ambulatory adults within one year of stroke: a randomized controlled trial protocol

    PubMed Central

    2011-01-01

    Background Although task-oriented training has been shown to improve walking outcomes after stroke, it is not yet clear whether one task-oriented approach is superior to another. The purpose of this study is to compare the effectiveness of the Motor Learning Walking Program (MLWP), a varied overground walking task program consistent with key motor learning principles, to body-weight-supported treadmill training (BWSTT) in community-dwelling, ambulatory, adults within 1 year of stroke. Methods/Design A parallel, randomized controlled trial with stratification by baseline gait speed will be conducted. Allocation will be controlled by a central randomization service and participants will be allocated to the two active intervention groups (1:1) using a permuted block randomization process. Seventy participants will be assigned to one of two 15-session training programs. In MLWP, one physiotherapist will supervise practice of various overground walking tasks. Instructions, feedback, and guidance will be provided in a manner that facilitates self-evaluation and problem solving. In BWSTT, training will emphasize repetition of the normal gait cycle while supported over a treadmill, assisted by up to three physiotherapists. Outcomes will be assessed by a blinded assessor at baseline, post-intervention and at 2-month follow-up. The primary outcome will be post-intervention comfortable gait speed. Secondary outcomes include fast gait speed, walking endurance, balance self-efficacy, participation in community mobility, health-related quality of life, and goal attainment. Groups will be compared using analysis of covariance with baseline gait speed strata as the single covariate. Intention-to-treat analysis will be used. Discussion In order to direct clinicians, patients, and other health decision-makers, there is a need for a head-to-head comparison of different approaches to active, task-related walking training after stroke. 
We hypothesize that outcomes will be optimized through the application of a task-related training program that is consistent with key motor learning principles related to practice, guidance and feedback. Trial Registration ClinicalTrials.gov # NCT00561405 PMID:22018267

  12. The Military Application of Narrative: Solving Army Warfighting Challenge #2

    DTIC Science & Technology

    2016-06-10

    Thesis submitted to the U.S. Army Command and General Staff College in partial fulfillment of the requirements for the degree Master of Military Art and Science. Robert D. Payne III, Major.

  13. Thiokol 260-SL Nozzle Development Program

    DTIC Science & Technology

    1967-01-01

    Front-matter excerpt (lists of figures and tables): Figure 1, Candidate Throat Inserts; Figure 2, Laminate Temperature versus Coating Thickness for Selectron 5003 Specimens; Figure 32, Photo, Cross Adhesive Pattern; Figure 33, Photo, Parallel Adhesive Pattern; Figure 34, Adhesive Applicator Teeth; Table XXXIII, Task 9: Corlar Coating of Graphite Materials Throat, IS 11004-01-02, 156-ZC-1; Table XXXIV, Adapter...

  14. Development and application of a community sustainability visualization tool through integration of US EPA’s Sustainable and Health Community Research Program tasks

    EPA Science Inventory

    Maintaining a harmonious balance between economic, social, and environmental well-being is paramount to community sustainability. Communities need a practical/usable suite of measures to assess their current position on a "surface" of sustainability created from the interaction ...

  15. Criterion-Referenced Job Proficiency Testing: A Large Scale Application. Research Report 1193.

    ERIC Educational Resources Information Center

    Maier, Milton H.; Hirshfeld, Stephen F.

    The Army Skill Qualification Tests (SQT's) were designed to determine levels of competence in performance of the tasks crucial to an enlisted soldier's occupational specialty. SQT's are performance-based, criterion-referenced measures which offer two advantages over traditional proficiency and achievement testing programs: test content can be made…

  16. The design and application of a Transportable Inference Engine (TIE1)

    NASA Technical Reports Server (NTRS)

    Mclean, David R.

    1986-01-01

    A Transportable Inference Engine (TIE1) system has been developed by the author as part of the Interactive Experimenter Planning System (IEPS) task which is involved with developing expert systems in support of the Spacecraft Control Programs Branch at Goddard Space Flight Center in Greenbelt, Maryland. Unlike traditional inference engines, TIE1 is written in the C programming language. In the TIE1 system, knowledge is represented by a hierarchical network of objects which have rule frames. The TIE1 search algorithm uses a set of strategies, including backward chaining, to obtain the values of goals. The application of TIE1 to a spacecraft scheduling problem is described. This application involves the development of a strategies interpreter which uses TIE1 to do constraint checking.

  17. Automation of Shuttle Tile Inspection - Engineering methodology for Space Station

    NASA Technical Reports Server (NTRS)

    Wiskerchen, M. J.; Mollakarimi, C.

    1987-01-01

    The Space Systems Integration and Operations Research Applications (SIORA) Program was initiated in late 1986 as a cooperative applications research effort between Stanford University, NASA Kennedy Space Center, and Lockheed Space Operations Company. One of the major initial SIORA tasks was the application of automation and robotics technology to all aspects of the Shuttle tile processing and inspection system. This effort has adopted a systems engineering approach consisting of an integrated set of rapid prototyping testbeds in which a government/university/industry team of users, technologists, and engineers test and evaluate new concepts and technologies within the operational world of Shuttle. These integrated testbeds include speech recognition and synthesis, laser imaging inspection systems, distributed Ada programming environments, distributed relational database architectures, distributed computer network architectures, multimedia workbenches, and human factors considerations.

  18. High Current Density, Long Life Cathodes for High Power RF Sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ives, Robert Lawrence; Collins, George; Falce, Lou

    2014-01-22

    This program was tasked with improving the quality of, and expanding applications for, Controlled Porosity Reservoir (CPR) cathodes. Calabazas Creek Research, Inc. (CCR) initially developed CPR cathodes on a DOE-funded SBIR program to improve cathodes for magnetron injection guns. Subsequent funding was received from the Defense Advanced Research Projects Agency. The program developed design requirements for implementation of the technology in high current density cathodes for high frequency applications. During Phase I of this program, CCR was awarded the prestigious 2011 R&D100 award for this technology. Subsequently, the technology was presented at numerous technical conferences. A patent was issued for the technology in 2009. These cathodes are now marketed by Semicon Associates, Inc. of Lexington, KY, the world's largest producer of cathodes for vacuum electron devices. During this program, CCR teamed with Semicon Associates, Inc. and Ron Witherspoon, Inc. to improve the fabrication processes and expand applications for the cathodes. Specific fabrication issues included the quality of the wire winding that provides the basic structure and the sintering that bonds the wires into a robust, cohesive structure. The program also developed improved techniques for integrating the resulting material into cathodes for electron guns.

  19. PyPele Rewritten To Use MPI

    NASA Technical Reports Server (NTRS)

    Hockney, George; Lee, Seungwon

    2008-01-01

    A computer program known as PyPele, originally written as a Python-language extension module of a C++ language program, has been rewritten in pure Python. PyPele dispatches and coordinates parallel-processing tasks on cluster computers and provides a conceptual framework for spacecraft-mission-design and -analysis software tools to run in an embarrassingly parallel mode. The original version of PyPele uses SSH (Secure Shell, a set of standards and an associated network protocol for establishing a secure channel between a local and a remote computer) to coordinate parallel processing. Instead of SSH, the present Python version of PyPele uses the Message Passing Interface (MPI, an unofficial de facto standard language-independent application programming interface for message passing on a parallel computer) while keeping the same user interface. The use of MPI instead of SSH and the preservation of the original PyPele user interface make it possible for parallel application programs written for the original version of PyPele to run on MPI-based cluster computers. As a result, engineers using the previously written application programs can take advantage of embarrassing parallelism without needing to rewrite those programs.
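    The embarrassingly parallel dispatch-and-gather pattern described can be sketched in Python; a thread pool stands in for PyPele's MPI coordination so the example stays self-contained, and `evaluate_case` is a hypothetical mission-analysis task:

```python
from concurrent.futures import ThreadPoolExecutor

def evaluate_case(case_id):
    """Stand-in for one independent design/analysis case."""
    return case_id, case_id ** 2

# Fan the independent cases out to workers and gather the results;
# no case depends on any other, which is what makes the workload
# "embarrassingly parallel".
cases = range(8)
with ThreadPoolExecutor(max_workers=4) as pool:
    results = dict(pool.map(evaluate_case, cases))

# Same answers as a serial run, just computed concurrently.
assert results == {i: i ** 2 for i in range(8)}
```

    Swapping the coordination layer (SSH, MPI, or a thread pool) without touching this calling pattern is exactly the compatibility property the rewrite preserves.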

  20. Flexible Conformable Clamps for a Machining Cell with Applications to Turbine Blade Machining.

    DTIC Science & Technology

    1983-05-01

    Flexible Conformable Clamps for a Machining Cell with Applications to Turbine Blade Machining (interim report). Eiki Kurokawa, The Robotics Institute, Carnegie-Mellon University, Pittsburgh, PA 15213, May 1983.

  1. Exactly Embedded Density Functional Theory: A New Paradigm for the First-principles Modeling of Reactions in Complex Systems

    DTIC Science & Technology

    2014-10-14

    Addresses a constraint that excluded essentially all condensed-phase and reactive chemical applications by developing both inversion-based and projection-based strategies to enable...

  2. Potential Applications of Cable Television (CATV) to the FEMA (Federal Emergency Management Agency) Communications Mission.

    DTIC Science & Technology

    1983-07-01

    Potential Applications of Cable Television (CATV) to the FEMA (Federal Emergency Management Agency) Communications Mission. D. D. Gilligan et al., Control Energy Corporation, 470 Atlantic Avenue, Boston, MA 02210, July 1983.

  3. Integrated Payload Data Handling Systems Using Software Partitioning

    NASA Astrophysics Data System (ADS)

    Taylor, Alun; Hann, Mark; Wishart, Alex

    2015-09-01

    An integrated Payload Data Handling System (I-PDHS) is one in which multiple instruments share a central payload processor for their on-board data processing tasks. This offers a number of advantages over the conventional decentralised architecture. Savings in payload mass and power can be realised because the total processing resource is matched to the requirements, as opposed to the decentralised architecture, where the processing resource is in effect the sum of what all the applications individually require. Overall development cost can be reduced by using a common processor. At the individual instrument level, the potential benefits include a standardised application development environment and the opportunity to run the instrument data handling application on a fully redundant and more powerful processing platform [1]. This paper describes a joint program by SCISYS UK Limited, Airbus Defence and Space, Imperial College London and RAL Space to implement a realistic demonstration of an I-PDHS using engineering models of flight instruments (a magnetometer and camera) and a laboratory demonstrator of a central payload processor which is functionally representative of a flight design. The objective is to raise the Technology Readiness Level of the centralised data processing technique by addressing the key areas of task partitioning to prevent fault propagation and the use of a common development process for the instrument applications. The project is supported by a UK Space Agency grant awarded under the National Space Technology Program SpaceCITI scheme [1].

  4. Investigation of advanced counterrotation blade configuration concepts for high speed turboprop systems. Task 5: Unsteady counterrotation ducted propfan analysis. Computer program user's manual

    NASA Technical Reports Server (NTRS)

    Hall, Edward J.; Delaney, Robert A.; Adamczyk, John J.; Miller, Christopher J.; Arnone, Andrea; Swanson, Charles

    1993-01-01

    The primary objective of this study was the development of a time-marching three-dimensional Euler/Navier-Stokes aerodynamic analysis to predict steady and unsteady compressible transonic flows about ducted and unducted propfan propulsion systems employing multiple blade rows. The computer codes resulting from this study are referred to as ADPAC-AOACR (Advanced Ducted Propfan Analysis Codes-Angle of Attack Coupled Row). This report is intended to serve as a computer program user's manual for the ADPAC-AOACR codes developed under Task 5 of NASA Contract NAS3-25270, Unsteady Counterrotating Ducted Propfan Analysis. The ADPAC-AOACR program is based on a flexible multiple blocked grid discretization scheme permitting coupled 2-D/3-D mesh block solutions with application to a wide variety of geometries. For convenience, several standard mesh block structures are described for turbomachinery applications. Aerodynamic calculations are based on a four-stage Runge-Kutta time-marching finite volume solution technique with added numerical dissipation. Steady flow predictions are accelerated by a multigrid procedure. Numerical calculations are compared with experimental data for several test cases to demonstrate the utility of this approach for predicting the aerodynamics of modern turbomachinery configurations employing multiple blade rows.
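    The four-stage Runge-Kutta time-marching at the heart of the solver can be illustrated on a scalar model problem; classical RK4 coefficients are assumed here, since the abstract does not give ADPAC-AOACR's exact stage coefficients or its added-dissipation terms:

```python
import math

def rk4_step(f, u, dt):
    """One classical four-stage Runge-Kutta step for du/dt = f(u)."""
    k1 = f(u)
    k2 = f(u + 0.5 * dt * k1)
    k3 = f(u + 0.5 * dt * k2)
    k4 = f(u + dt * k3)
    return u + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0

# Model problem du/dt = -u, u(0) = 1: march to t = 1 and compare
# with the exact solution exp(-1).
u, dt = 1.0, 0.01
for _ in range(100):
    u = rk4_step(lambda x: -x, u, dt)

assert abs(u - math.exp(-1.0)) < 1e-9
```

    In the flow solver the same stage structure advances the cell-averaged conserved variables of a finite volume discretization rather than a scalar, and multigrid accelerates convergence to the steady state.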

  5. Genetic programming applied to RFI mitigation in radio astronomy

    NASA Astrophysics Data System (ADS)

    Staats, K.

    2016-12-01

    Genetic Programming is a type of machine learning that employs a stochastic search of a solution space, genetic operators, a fitness function, and multiple generations of evolved programs to resolve a user-defined task, such as the classification of data. At the time of this research, the application of machine learning to radio astronomy was relatively new, with a limited number of publications on the subject. Genetic Programming had never been applied, and as such was a novel approach to this challenging arena. Foundational to this body of research, the application Karoo GP was developed in the programming language Python, following the fundamentals of tree-based Genetic Programming described in "A Field Guide to Genetic Programming" by Poli et al. Karoo GP was tasked with classifying data points as signal or radio frequency interference (RFI); RFI generated by instruments and machinery hampers astronomers' ability to discern the desired targets. The training data were derived from the output of an observation run of the KAT-7 radio telescope array built by the South African Square Kilometre Array (SKA-SA). Karoo GP, kNN, and SVM were comparatively employed, the outcome of which provided noteworthy correlations between input parameters, the complexity of the evolved hypotheses, and the performance of raw data versus engineered features. This dissertation includes descriptions of novel approaches to GP, such as upper and lower limits to the size of syntax trees, an auto-scaling multiclass classifier, and a Numpy array element manager. In addition to the research conducted at the SKA-SA, it describes how Karoo GP was applied to fine-tuning parameters of a weather prediction model at the South African Astronomical Observatory (SAAO), to glitch classification at the Laser Interferometer Gravitational-wave Observatory (LIGO), and to astro-particle physics at The Ohio State University.

  6. RC64, a Rad-Hard Many-Core High-Performance DSP for Space Applications

    NASA Astrophysics Data System (ADS)

    Ginosar, Ran; Aviely, Peleg; Gellis, Hagay; Liran, Tuvia; Israeli, Tsvika; Nesher, Roy; Lange, Fredy; Dobkin, Reuven; Meirov, Henri; Reznik, Dror

    2015-09-01

    RC64, a novel rad-hard 64-core signal-processing chip, targets DSP performance of 75 GMACs (16-bit), 150 GOPS, and 38 single-precision GFLOPS while dissipating less than 10 W. RC64 integrates advanced DSP cores with a multi-bank shared memory and a hardware scheduler, and also supports DDR2/3 memory and twelve 3.125 Gbps full-duplex high-speed serial links using SpaceFibre and other protocols. The programming model employs sequential fine-grain tasks and a separate task map to define task dependencies. RC64 is implemented as a 300 MHz integrated circuit in 65 nm CMOS technology, assembled in a hermetically sealed ceramic CCGA624 package, and qualified to the highest space standards.
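
    The programming model of sequential fine-grain tasks plus a separate task map can be illustrated with a small scheduling sketch. The code below is a hypothetical software analogue (Kahn's topological ordering), not the RC64 hardware scheduler, which dispatches ready tasks to idle cores in parallel.

    ```python
    from collections import deque

    def schedule(tasks, deps):
        """Emit an execution order for fine-grain tasks that respects a
        task map of dependencies: deps[t] is the set of tasks that must
        finish before t may start. A hardware scheduler would dispatch
        all ready tasks concurrently; this sketch serializes them."""
        pending = {t: set(deps.get(t, ())) for t in tasks}
        ready = deque(t for t, d in pending.items() if not d)
        order = []
        while ready:
            t = ready.popleft()
            order.append(t)
            for u, d in pending.items():
                if t in d:                       # t just satisfied a dependency of u
                    d.remove(t)
                    if not d and u not in order and u not in ready:
                        ready.append(u)
        if len(order) != len(tasks):
            raise ValueError('cycle in task map')
        return order

    # hypothetical DSP pipeline expressed as a task map
    tasks = ['load', 'fft', 'filter', 'store']
    deps = {'fft': {'load'}, 'filter': {'fft'}, 'store': {'filter'}}
    order = schedule(tasks, deps)
    ```

    The appeal of the model is that the task bodies stay sequential; all parallelism lives in the task map, which the scheduler is free to exploit.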

  7. RC64, a Rad-Hard Many-Core High-Performance DSP for Space Applications

    NASA Astrophysics Data System (ADS)

    Ginosar, Ran; Aviely, Peleg; Liran, Tuvia; Alon, Dov; Mandler, Alberto; Lange, Fredy; Dobkin, Reuven; Goldberg, Miki

    2014-08-01

    RC64, a novel rad-hard 64-core signal-processing chip, targets DSP performance of 75 GMACs (16-bit), 150 GOPS, and 20 single-precision GFLOPS while dissipating less than 10 W. RC64 integrates advanced DSP cores with a multi-bank shared memory and a hardware scheduler, and also supports DDR2/3 memory and twelve 2.5 Gbps full-duplex high-speed serial links using SpaceFibre and other protocols. The programming model employs sequential fine-grain tasks and a separate task map to define task dependencies. RC64 is implemented as a 300 MHz integrated circuit in 65 nm CMOS technology, assembled in a hermetically sealed ceramic CCGA624 package, and qualified to the highest space standards.

  8. Beyond the Baseline: Proceedings of the Space Station Evolution Symposium. Volume 2, Part 2; Space Station Freedom Advanced Development Program

    NASA Technical Reports Server (NTRS)

    1990-01-01

    This report contains the individual presentations delivered at the Space Station Evolution Symposium in League City, Texas on February 6, 7, 8, 1990. Personnel responsible for Advanced Systems Studies and Advanced Development within the Space Station Freedom program reported on the results of their work to date. Systems Studies presentations focused on identifying the baseline design provisions (hooks and scars) necessary to enable evolution of the facility to support changing space policy and anticipated user needs. Also emphasized were evolution configuration and operations concepts including on-orbit processing of space transfer vehicles. Advanced Development task managers discussed transitioning advanced technologies to the baseline program, including those near-term technologies which will enhance the safety and productivity of the crew and the reliability of station systems. Special emphasis was placed on applying advanced automation technology to ground and flight systems. This publication consists of two volumes. Volume 1 contains the results of the advanced system studies with the emphasis on reference evolution configurations, system design requirements and accommodations, and long-range technology projections. Volume 2 reports on advanced development tasks within the Transition Definition Program. Products of these tasks include: engineering fidelity demonstrations and evaluations on Station development testbeds and Shuttle-based flight experiments; detailed requirements and performance specifications which address advanced technology implementation issues; and mature applications and the tools required for the development, implementation, and support of advanced technology within the Space Station Freedom Program.

  9. Innovative Methods for Estimating Densities and Detection Probabilities of Secretive Reptiles Including Invasive Constrictors and Rare Upland Snakes

    DTIC Science & Technology

    2018-01-30

    Department of Defense Legacy Resource Management Program, Agreement # W9132T-14-2-0010 (Project # 14-754). Authors: John D. Willson, Ph.D., and Shannon Pittman, Ph.D. Abstract (truncated): This project demonstrates the broad applicability of a novel simulation …

  10. RIPS: a UNIX-based reference information program for scientists.

    PubMed

    Klyce, S D; Rózsa, A J

    1983-09-01

    A set of programs is described which implement a personal reference management and information retrieval system on a UNIX-based minicomputer. The system operates in a multiuser configuration with a host of user-friendly utilities that assist entry of reference material, its retrieval, and formatted printing for associated tasks. A search command language was developed without restriction in keyword vocabulary, number of keywords, or level of parenthetical expression nesting. The system is readily transported, and by design is applicable to any academic specialty.

  11. Navy LPD-17 Amphibious Ship Procurement: Background, Issues, and Options for Congress

    DTIC Science & Technology

    2010-08-02

    Abstract fragments (truncated): "… has been required, disrupting the optimal construction sequence and application of lessons learned for follow-on vessels in these programs …" "… (NGSS) ability to recover in the aftermath of Hurricane Katrina, particularly in regard to construction of LPD 17 Class vessels." • "I am equally …"

  12. Navy LPD-17 Amphibious Ship Procurement: Background, Issues, and Options for Congress

    DTIC Science & Technology

    2010-07-07

    Abstract fragments (truncated): "… has been required, disrupting the optimal construction sequence and application of lessons learned for follow-on vessels in these programs …" "… (NGSS) ability to recover in the aftermath of Hurricane Katrina, particularly in regard to construction of LPD 17 Class vessels." • "I am equally …"

  13. Navy LPD-17 Amphibious Ship Procurement: Background, Issues, and Options for Congress

    DTIC Science & Technology

    2010-10-04

    Abstract fragments (truncated): "… rework has been required, disrupting the optimal construction sequence and application of lessons learned for follow-on vessels in these programs …" "… Systems' (NGSS) ability to recover in the aftermath of Hurricane Katrina, particularly in regard to construction of LPD 17 Class vessels." • "I am …"

  14. Navy LPD-17 Amphibious Ship Procurement: Background, Issues, and Options for Congress

    DTIC Science & Technology

    2010-10-15

    Abstract fragments (truncated): "… has been required, disrupting the optimal construction sequence and application of lessons learned for follow-on vessels in these programs. In …" "… although not necessarily in the order shown below: • 'I am deeply concerned about Northrop Grumman Ship Systems' (NGSS) ability to recover in the …"

  15. A Review of NASA's Radiation-Hardened Electronics for Space Environments Project

    NASA Technical Reports Server (NTRS)

    Keys, Andrew S.; Adams, James H.; Patrick, Marshall C.; Johnson, Michael A.; Cressler, John D.

    2008-01-01

    NASA's Radiation Hardened Electronics for Space Exploration (RHESE) project develops the advanced technologies required to produce radiation hardened electronics, processors, and devices in support of the requirements of NASA's Constellation program. Over the past year, multiple advancements have been made within each of the RHESE technology development tasks that will facilitate the success of the Constellation program elements. This paper provides a brief review of these advancements, discusses their application to Constellation projects, and addresses the plans for the coming year.

  16. Learning to merge: a new tool for interactive mapping

    NASA Astrophysics Data System (ADS)

    Porter, Reid B.; Lundquist, Sheng; Ruggiero, Christy

    2013-05-01

    The task of turning raw imagery into semantically meaningful maps and overlays is a key area of remote sensing activity. Image analysts, in applications ranging from environmental monitoring to intelligence, use imagery to generate and update maps of terrain, vegetation, road networks, buildings and other relevant features. Often these tasks can be cast as a pixel labeling problem, and several interactive pixel labeling tools have been developed. These tools exploit training data, which is generated by analysts using simple and intuitive paint-program annotation tools, in order to tailor the labeling algorithm for the particular dataset and task. In other cases, the task is best cast as a pixel segmentation problem. Interactive pixel segmentation tools have also been developed, but these tools typically do not learn from training data like the pixel labeling tools do. In this paper we investigate tools for interactive pixel segmentation that also learn from user input. The input has the form of segment merging (or grouping). Merging examples are 1) easily obtained from analysts using vector annotation tools, and 2) more challenging to exploit than traditional labels. We outline the key issues in developing these interactive merging tools, and describe their application to remote sensing.
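
    Exploiting merge examples can be illustrated with a minimal sketch: treat analyst-supplied pairs as hard constraints and group segments with union-find. This is an assumption-laden simplification; the paper's tools learn a merge model from such examples rather than applying them verbatim.

    ```python
    class SegmentMerger:
        """Group over-segmented image regions using analyst-supplied merge
        examples (pairs of segment ids belonging to the same object).
        A plain union-find sketch, not the learned merging in the paper."""

        def __init__(self, n_segments):
            self.parent = list(range(n_segments))

        def find(self, a):
            while self.parent[a] != a:
                self.parent[a] = self.parent[self.parent[a]]  # path halving
                a = self.parent[a]
            return a

        def merge(self, a, b):
            # record one analyst merge annotation
            self.parent[self.find(a)] = self.find(b)

        def groups(self):
            out = {}
            for s in range(len(self.parent)):
                out.setdefault(self.find(s), []).append(s)
            return sorted(out.values())

    m = SegmentMerger(6)
    for a, b in [(0, 1), (1, 2), (4, 5)]:   # hypothetical vector annotations
        m.merge(a, b)
    ```

    A learning variant would extract features from each candidate segment pair and train a classifier on these annotated merges instead of applying them directly.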

  17. New Brunswick Laboratory: Progress report, October 1987--September 1988

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    NBL has been tasked by the DOE Office of Safeguards and Security, Defense Programs (OSS/DP) to assure the application of accurate and reliable measurement technology for the safeguarding of special nuclear materials. NBL is fulfilling its mission responsibilities by identifying and addressing the measurement and measurement-related needs of the nuclear material safeguards community. These responsibilities are being addressed by activities in the following program areas: (1) reference and calibration materials, (2) measurement development, (3) measurement services, (4) measurement evaluation, (5) safeguards assessment, and (6) site-specific assistance. Highlights of each of these program areas are provided in this summary.

  18. HAL/SM language specification. [programming languages and computer programming for space shuttles

    NASA Technical Reports Server (NTRS)

    Williams, G. P. W., Jr.; Ross, C.

    1975-01-01

    A programming language is presented for the flight software of the NASA Space Shuttle program. It is intended to satisfy virtually all of the flight software requirements of the space shuttle. To achieve this, it incorporates a wide range of features, including applications-oriented data types and organizations, real time control mechanisms, and constructs for systems programming tasks. It is a higher order language designed to allow programmers, analysts, and engineers to communicate with the computer in a form approximating natural mathematical expression. Parts of the English language are combined with standard notation to provide a tool that readily encourages programming without demanding computer hardware expertise. Block diagrams and flow charts are included. The semantics of the language is discussed.

  19. HPC Programming on Intel Many-Integrated-Core Hardware with MAGMA Port to Xeon Phi

    DOE PAGES

    Dongarra, Jack; Gates, Mark; Haidar, Azzam; ...

    2015-01-01

    This paper presents the design and implementation of several fundamental dense linear algebra (DLA) algorithms for multicore with Intel Xeon Phi coprocessors. In particular, we consider algorithms for solving linear systems. Further, we give an overview of the MAGMA MIC library, an open-source, high-performance library that incorporates the developments presented here and, more broadly, provides DLA functionality equivalent to that of the popular LAPACK library while targeting heterogeneous architectures that feature a mix of multicore CPUs and coprocessors. The LAPACK compliance simplifies the use of the MAGMA MIC library in applications, while providing them with portably performant DLA. High performance is obtained through the use of the high-performance BLAS, hardware-specific tuning, and a hybridization methodology whereby we split the algorithm into computational tasks of various granularities. Execution of those tasks is properly scheduled over the heterogeneous hardware by minimizing data movements and mapping algorithmic requirements to the architectural strengths of the various heterogeneous hardware components. Our methodology and programming techniques are incorporated into the MAGMA MIC API, which abstracts the application developer from the specifics of the Xeon Phi architecture and is therefore applicable to algorithms beyond the scope of DLA.
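
    The hybridization idea of splitting an algorithm into computational tasks can be sketched very loosely: the toy code below tiles a matrix product into independent per-tile tasks and hands them to a worker pool. The function and tile sizes are hypothetical; MAGMA MIC's actual runtime, data-movement minimization, and BLAS kernels are far more sophisticated.

    ```python
    from concurrent.futures import ThreadPoolExecutor

    def matmul_tiled(A, B, tile=2):
        """Split C = A x B into per-output-tile tasks, loosely mimicking
        how a hybrid DLA runtime breaks an algorithm into computational
        tasks and schedules them over available compute resources."""
        n = len(A)
        C = [[0] * n for _ in range(n)]

        def tile_task(i0, j0):
            # each task owns one disjoint output tile: no write conflicts
            for i in range(i0, min(i0 + tile, n)):
                for j in range(j0, min(j0 + tile, n)):
                    C[i][j] = sum(A[i][k] * B[k][j] for k in range(n))

        # the context manager waits for all submitted tasks on exit
        with ThreadPoolExecutor() as pool:
            for i0 in range(0, n, tile):
                for j0 in range(0, n, tile):
                    pool.submit(tile_task, i0, j0)
        return C

    C = matmul_tiled([[1, 0, 0], [0, 1, 0], [0, 0, 1]],
                     [[1, 2, 3], [4, 5, 6], [7, 8, 9]])
    ```

    In a real hybrid library, large regular tasks would go to the coprocessor and small latency-sensitive ones to the CPU, with the scheduler placing tasks to minimize data movement.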

  20. Determining Behavioral Task Content of the Curriculum in Occupational and Professional Education Programs: The Dental Auxiliaries. Final Report.

    ERIC Educational Resources Information Center

    Terry, David R.; Evans, Rupert N.

    The document is the final report of a project to develop a suitable method for studying the task content of accredited dental auxiliary education programs and the relationship between the tasks taught in such programs and the tasks involved in a professional situation. The set of instruments developed and pilot tested in 63 programs was used to…

  1. Automatically producing tailored web materials for public administration

    NASA Astrophysics Data System (ADS)

    Colineau, Nathalie; Paris, Cécile; Vander Linden, Keith

    2013-06-01

    Public administration organizations commonly produce citizen-focused, informational materials describing public programs and the conditions under which citizens or citizen groups are eligible for these programs. The organizations write these materials for generic audiences because of the excessive human resource costs that would be required to produce personalized materials for everyone. Unfortunately, generic materials tend to be longer and harder to understand than materials tailored for particular citizens. Our work explores the feasibility and effectiveness of automatically producing tailored materials. We have developed an adaptive hypermedia application system that automatically produces tailored informational materials and have evaluated it in a series of studies. The studies demonstrate that: (1) subjects prefer tailored materials over generic materials, even if the tailoring requires answering a set of demographic questions first; (2) tailored materials are more effective at supporting subjects in their task of learning about public programs; and (3) the time required to specify the demographic information on which the tailoring is based does not significantly slow down the subjects in their information seeking task.
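
    Tailoring materials to a citizen profile can be sketched as rule-based content selection. The predicates and section texts below are invented for illustration; the actual adaptive hypermedia system models eligibility conditions and discourse structure much more richly.

    ```python
    def tailor(sections, profile):
        """Select only the program descriptions whose condition holds for
        a citizen profile: a minimal rule-based sketch of tailoring."""
        return [text for condition, text in sections if condition(profile)]

    # hypothetical informational sections, each guarded by an eligibility rule
    sections = [
        (lambda p: True, 'All residents may apply at the local office.'),
        (lambda p: p['age'] >= 65, 'Seniors qualify for the pension supplement.'),
        (lambda p: p['children'] > 0, 'Families can claim the child benefit.'),
    ]
    material = tailor(sections, {'age': 70, 'children': 0})
    ```

    The generic document is the union of every section; the tailored one is shorter because inapplicable sections are simply omitted, which is one reason the studies found tailored materials easier to use.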

  2. Geotherm: the U.S. geological survey geothermal information system

    USGS Publications Warehouse

    Bliss, J.D.; Rapport, A.

    1983-01-01

    GEOTHERM is a comprehensive system of public databases and software used to store, locate, and evaluate information on the geology, geochemistry, and hydrology of geothermal systems. Three main databases address the general characteristics of geothermal wells and fields, and the chemical properties of geothermal fluids; the last database is currently the most active. System tasks are divided into four areas: (1) data acquisition and entry, involving data entry via word processors and magnetic tape; (2) quality assurance, including the criteria and standards handbook and front-end data-screening programs; (3) operation, involving database backups and information extraction; and (4) user assistance, preparation of such items as application programs, and a quarterly newsletter. The principal task of GEOTHERM is to provide information and research support for the conduct of national geothermal-resource assessments. The principal users of GEOTHERM are those involved with the Geothermal Research Program of the U.S. Geological Survey. Information in the system is available to the public on request. © 1983.

  3. Kinematically redundant robot manipulators

    NASA Technical Reports Server (NTRS)

    Baillieul, J.; Hollerbach, J.; Brockett, R.; Martin, D.; Percy, R.; Thomas, R.

    1987-01-01

    Research on control, design and programming of kinematically redundant robot manipulators (KRRM) is discussed. These are devices in which there are more joint space degrees of freedom than are required to achieve every position and orientation of the end-effector necessary for a given task in a given workspace. The technological developments described here deal with: kinematic programming techniques for automatically generating joint-space trajectories to execute prescribed tasks; control of redundant manipulators to optimize dynamic criteria (e.g., applications of forces and moments at the end-effector that optimally distribute the loading of actuators); and design of KRRMs to optimize functionality in congested work environments or to achieve other goals unattainable with non-redundant manipulators. Kinematic programming techniques are discussed, which show that some pseudo-inverse techniques that have been proposed for redundant manipulator control fail to achieve the goals of avoiding kinematic singularities and also generating closed joint-space paths corresponding to close paths of the end effector in the workspace. The extended Jacobian is proposed as an alternative to pseudo-inverse techniques.
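
    The pseudo-inverse resolution that the abstract critiques can be written out for a redundant planar arm with three joints and a 2-DOF task. The Jacobian values below are hypothetical; the sketch computes the minimum-norm joint velocity dq = J^T (J J^T)^(-1) dx in plain Python, assuming J has full row rank.

    ```python
    def pinv_solve(J, dx):
        """Minimum-norm joint velocity for a 2 x 3 Jacobian (2 task DOF,
        3 joints): the classic pseudo-inverse resolution whose lack of
        repeatable (closed) joint-space paths motivates the extended
        Jacobian alternative discussed in the abstract."""
        # G = J J^T is 2 x 2
        G = [[sum(J[r][k] * J[s][k] for k in range(3)) for s in range(2)]
             for r in range(2)]
        det = G[0][0] * G[1][1] - G[0][1] * G[1][0]
        Ginv = [[G[1][1] / det, -G[0][1] / det],
                [-G[1][0] / det, G[0][0] / det]]
        # y = G^{-1} dx, then dq = J^T y
        y = [Ginv[r][0] * dx[0] + Ginv[r][1] * dx[1] for r in range(2)]
        return [J[0][k] * y[0] + J[1][k] * y[1] for k in range(3)]

    J = [[1.0, 0.5, 0.2],
         [0.0, 1.0, 0.4]]          # hypothetical arm Jacobian
    dq = pinv_solve(J, [0.1, -0.2])
    ```

    The returned dq exactly reproduces the commanded end-effector velocity, but iterating this rule around a closed workspace path generally does not return the joints to their starting configuration, which is the failure mode the abstract highlights.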

  4. Activity-Centric Approach to Distributed Programming

    NASA Technical Reports Server (NTRS)

    Levy, Renato; Satapathy, Goutam; Lang, Jun

    2004-01-01

    The first phase of an effort to develop a NASA version of the Cybele software system has been completed. To give meaning to even a highly abbreviated summary of the modifications to be embodied in the NASA version, it is necessary to present the following background information on Cybele: Cybele is a proprietary software infrastructure for use by programmers in developing agent-based application programs [complex application programs that contain autonomous, interacting components (agents)]. Cybele provides support for event handling from multiple sources, multithreading, concurrency control, migration, and load balancing. A Cybele agent follows a programming paradigm, called activity-centric programming, that enables an abstraction over system-level thread mechanisms. Activity centric programming relieves application programmers of the complex tasks of thread management, concurrency control, and event management. In order to provide such functionality, activity-centric programming demands support of other layers of software. This concludes the background information. In the first phase of the present development, a new architecture for Cybele was defined. In this architecture, Cybele follows a modular service-based approach to coupling of the programming and service layers of software architecture. In a service-based approach, the functionalities supported by activity-centric programming are apportioned, according to their characteristics, among several groups called services. A well-defined interface among all such services serves as a path that facilitates the maintenance and enhancement of such services without adverse effect on the whole software framework. The activity-centric application-program interface (API) is part of a kernel. The kernel API calls the services by use of their published interface. This approach makes it possible for any application code written exclusively under the API to be portable for any configuration of Cybele.

  5. The Design and Evaluation of "CAPTools"--A Computer Aided Parallelization Toolkit

    NASA Technical Reports Server (NTRS)

    Yan, Jerry; Frumkin, Michael; Hribar, Michelle; Jin, Haoqiang; Waheed, Abdul; Johnson, Steve; Cross, Jark; Evans, Emyr; Ierotheou, Constantinos; Leggett, Pete

    1998-01-01

    Writing applications for high performance computers is a challenging task. Although writing code by hand still offers the best performance, it is extremely costly and often not very portable. The Computer Aided Parallelization Tools (CAPTools) are a toolkit designed to help automate the mapping of sequential FORTRAN scientific applications onto multiprocessors. CAPTools consists of the following major components: an inter-procedural dependence analysis module that incorporates user knowledge; a 'self-propagating' data partitioning module driven via user guidance; an execution control mask generation and optimization module for the user to fine tune parallel processing of individual partitions; a program transformation/restructuring facility for source code clean up and optimization; a set of browsers through which the user interacts with CAPTools at each stage of the parallelization process; and a code generator supporting multiple programming paradigms on various multiprocessors. Besides describing the rationale behind the architecture of CAPTools, the parallelization process is illustrated via case studies involving structured and unstructured meshes. The programming process and the performance of the generated parallel programs are compared against other programming alternatives based on the NAS Parallel Benchmarks, ARC3D and other scientific applications. Based on these results, a discussion on the feasibility of constructing architectural independent parallel applications is presented.

  6. Characterizing Task-Based OpenMP Programs

    PubMed Central

    Muddukrishna, Ananya; Jonsson, Peter A.; Brorsson, Mats

    2015-01-01

    Programmers struggle to understand performance of task-based OpenMP programs since profiling tools only report thread-based performance. Performance tuning also requires task-based performance in order to balance per-task memory hierarchy utilization against exposed task parallelism. We provide a cost-effective method to extract detailed task-based performance information from OpenMP programs. We demonstrate the utility of our method by quickly diagnosing performance problems and characterizing exposed task parallelism and per-task instruction profiles of benchmarks in the widely-used Barcelona OpenMP Tasks Suite. Programmers can tune performance faster and understand performance tradeoffs more effectively than existing tools by using our method to characterize task-based performance. PMID:25860023
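
    The task-based (rather than thread-based) accounting the authors advocate can be approximated in any runtime by attributing time to tasks as they execute. The sketch below is a hypothetical Python analogue, not the authors' OpenMP tooling.

    ```python
    import time
    from collections import defaultdict

    # per-task-type statistics, regardless of which thread ran each instance
    task_stats = defaultdict(lambda: {'count': 0, 'seconds': 0.0})

    def profiled_task(name, fn, *args):
        """Run one task instance and attribute its runtime to the task
        type, giving the task-centric view that thread-based profilers
        cannot report."""
        t0 = time.perf_counter()
        result = fn(*args)
        dt = time.perf_counter() - t0
        task_stats[name]['count'] += 1
        task_stats[name]['seconds'] += dt
        return result

    for n in (100, 200, 300):
        profiled_task('sum_chunk', sum, range(n))
    ```

    Aggregating by task type exposes exactly the quantities the paper uses for tuning: how much parallelism was exposed, and which task types dominate the per-task instruction and memory profile.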

  7. Considerations and measurements of latent-heat-storage salts for secondary thermal battery applications

    NASA Astrophysics Data System (ADS)

    Koenig, A. A.; Braithwaite, J. W.; Armijo, J. R.

    1988-05-01

    Given its potential benefits, the practicality of using a latent heat-storage material as the basis for a passive thermal management system is being assessed by Chloride Silent Power Ltd. (CSPL) with technical assistance from Beta Power, Inc. and Sandia National Laboratories (SNL). Based on the experience gained in large-scale solar energy storage programs, fused salts were selected as the primary candidates for the heat-storage material. The initial phase of this assessment was directed to an EV battery being designed at CSPL for the ETX-II program. Specific tasks included the identification and characterization of potential fused salts, a determination of placement options for the salts within the battery, and an assessment of the ultimate benefit to the battery system. The results obtained to date for each of these tasks are presented in this paper.

  8. A PC-based bus monitor program for use with the transport systems research vehicle RS-232 communication interfaces

    NASA Technical Reports Server (NTRS)

    Easley, Wesley C.

    1991-01-01

    Experiment-critical use of RS-232 data buses in the Transport Systems Research Vehicle (TSRV) operated by the Advanced Transport Operating Systems Program Office at the NASA Langley Research Center has recently increased. Each application utilizes a number of nonidentical computer and peripheral configurations and requires task-specific software development. To aid these development tasks, an IBM PC-based RS-232 bus monitoring system was produced. It can simultaneously monitor two communication ports of a PC or clone, including the nonstandard bus expansion of the TSRV Grid laptop computers. Display occurs in a separate window for each port's input, with binary display being selectable. A number of other features are provided, including binary log files, screen capture to files, and a full range of communication parameters.

  9. Experiences with Cray multi-tasking

    NASA Technical Reports Server (NTRS)

    Miya, E. N.

    1985-01-01

    The issues involved in modifying an existing code for multitasking are explored. They include Cray extensions to FORTRAN, an examination of the application code under study, designing workable modifications, specific code modifications to the VAX and Cray versions, and performance and efficiency results. The finished product is a faster, fully synchronous, parallel version of the original program. A production program is partitioned by hand to run on two CPUs. Loop splitting multitasks three key subroutines. Simply dividing subroutine data and control structure down the middle of a subroutine is not safe. Simple division produces results that are inconsistent with uniprocessor runs. The safest way to partition the code is to transfer one block of loops at a time and check the results of each on a test case. Other issues include debugging and performance. Task startup and maintenance (e.g., synchronization) are potentially expensive.
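
    The loop-splitting pattern described here, with the synchronization the author found essential, can be sketched with two workers sharing one iteration space. This is a Python illustration of the idea, not the Cray FORTRAN multitasking extensions.

    ```python
    import threading

    def split_loop(data, work):
        """Split one loop's iteration space between two workers and join
        before the results are consumed. Each worker writes a disjoint
        half of `out`, so no locking is needed; naively sharing other
        state across the split is what produces results inconsistent
        with uniprocessor runs."""
        n = len(data)
        out = [None] * n

        def body(lo, hi):
            for i in range(lo, hi):
                out[i] = work(data[i])

        t = threading.Thread(target=body, args=(n // 2, n))
        t.start()
        body(0, n // 2)      # first half runs on the calling thread
        t.join()             # synchronize before using the results
        return out

    squares = split_loop(list(range(8)), lambda x: x * x)
    ```

    Transferring one block of loops at a time into this form, and checking each against a uniprocessor test case, mirrors the incremental partitioning strategy the abstract recommends.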

  10. Performance and Architecture Lab Modeling Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2014-06-19

    Analytical application performance models are critical for diagnosing performance-limiting resources, optimizing systems, and designing machines. Creating models, however, is difficult. Furthermore, models are frequently expressed in forms that are hard to distribute and validate. The Performance and Architecture Lab Modeling tool, or Palm, is a modeling tool designed to make application modeling easier. Palm provides a source code modeling annotation language. Not only does the modeling language divide the modeling task into subproblems, it formally links an application's source code with its model. This link is important because a model's purpose is to capture application behavior. Furthermore, this link makes it possible to define rules for generating models according to source code organization. Palm generates hierarchical models according to well-defined rules. Given an application, a set of annotations, and a representative execution environment, Palm will generate the same model. A generated model is an executable program whose constituent parts directly correspond to the modeled application. Palm generates models by combining top-down (human-provided) semantic insight with bottom-up static and dynamic analysis. A model's hierarchy is defined by static and dynamic source code structure. Because Palm coordinates models and source code, Palm's models are 'first-class' and reproducible. Palm automates common modeling tasks. For instance, Palm incorporates measurements to focus attention, represent constant behavior, and validate models. Palm's workflow is as follows. The workflow's input is source code annotated with Palm modeling annotations. The most important annotation models an instance of a block of code. Given annotated source code, the Palm Compiler produces executables and the Palm Monitor collects a representative performance profile. The Palm Generator synthesizes a model based on the static and dynamic mapping of annotations to program behavior. The model -- an executable program -- is a hierarchical composition of annotation functions, synthesized functions, statistics for runtime values, and performance measurements.
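
    The flavor of linking annotated code blocks to a hierarchical model can be suggested with a small sketch: a context manager that records nested, named, timed blocks into a record list. The names are hypothetical, and this is not Palm's annotation language or generator.

    ```python
    import time
    from contextlib import contextmanager

    model = []        # (nesting depth, block name, seconds) records
    _depth = [0]      # current annotation nesting depth

    @contextmanager
    def annotate(name):
        """Mark a block of code as a model component, loosely in the
        spirit of source-level modeling annotations: the record list
        captures the hierarchy and measured behavior of each block."""
        _depth[0] += 1
        t0 = time.perf_counter()
        try:
            yield
        finally:
            model.append((_depth[0], name, time.perf_counter() - t0))
            _depth[0] -= 1

    with annotate('solver'):
        with annotate('assemble'):
            sum(range(1000))
        with annotate('solve'):
            sum(range(2000))
    ```

    A generator in this style could then fold the records into a hierarchical model whose parts correspond one-to-one to the annotated source blocks, which is the property that makes such models reproducible.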

  11. Orbit transfer rocket engine technology program

    NASA Technical Reports Server (NTRS)

    Gustafson, N. B.; Harmon, T. J.

    1993-01-01

    An advanced near term (1990's) space-based Orbit Transfer Vehicle Engine (OTVE) system was designed, and the technologies applicable to its construction, maintenance, and operations were developed under Tasks A through F of the Orbit Transfer Rocket Engine Technology Program. Task A was a reporting task. In Task B, promising OTV turbomachinery technologies were explored: two stage partial admission turbines, high velocity ratio diffusing crossovers, soft wear ring seals, advanced bearing concepts, and a rotordynamic analysis. In Task C, a ribbed combustor design was developed. Possible rib and channel geometries were chosen analytically. Rib candidates were hot air tested and laser velocimeter boundary layer analyses were conducted. A channel geometry was also chosen on the basis of laser velocimeter data. To verify the predicted heat enhancement effects, a ribbed calorimeter spool was hot fire tested. Under Task D, the optimum expander cycle engine thrust, performance and envelope were established for a set of OTV missions. Optimal nozzle contours and quick disconnects for modularity were developed. Failure Modes and Effects Analyses, maintenance and reliability studies and component study results were incorporated into the engine system. Parametric trades on engine thrust, mixture ratio, and area ratio were also generated. A control system and the health monitoring and maintenance operations necessary for a space-based engine were outlined in Task E. In addition, combustor wall thickness measuring devices and a fiberoptic shaft monitor were developed. These monitoring devices were incorporated into preflight engine readiness checkout procedures. In Task F, the Integrated Component Evaluator (I.C.E.) was used to demonstrate performance and operational characteristics of an advanced expander cycle engine system and its component technologies. Sub-system checkouts and a system blowdown were performed. Short transitions were then made into main combustor ignition and main stage operation.

  12. Telerobot local-remote control architecture for space flight program applications

    NASA Technical Reports Server (NTRS)

    Zimmerman, Wayne; Backes, Paul; Steele, Robert; Long, Mark; Bon, Bruce; Beahan, John

    1993-01-01

    The JPL Supervisory Telerobotics (STELER) Laboratory has developed and demonstrated a unique local-remote robot control architecture which enables management of intermittent communication bus latencies and delays such as those expected for ground-remote operation of Space Station robotic systems via the Tracking and Data Relay Satellite System (TDRSS) communication platform. The current work at JPL in this area has focused on enhancing the technologies and transferring the control architecture to hardware and software environments which are more compatible with projected ground and space operational environments. At the local site, the operator updates the remote worksite model using stereo video and a model overlay/fitting algorithm which outputs the location and orientation of the object in free space. That information is relayed to the robot User Macro Interface (UMI) to enable programming of the robot control macros. This capability runs on a single Silicon Graphics Inc. machine. The operator can employ either manual teleoperation, shared control, or supervised autonomous control to manipulate the intended object. The remote site controller, called the Modular Telerobot Task Execution System (MOTES), runs in a multi-processor VME environment and performs the task sequencing, task execution, trajectory generation, closed loop force/torque control, task parameter monitoring, and reflex action. This paper describes the new STELER architecture implementation, and also documents the results of the recent autonomous docking task execution using the local site and MOTES.

  13. Program Management Tool

    NASA Technical Reports Server (NTRS)

    Gawadiak, Yuri; Wong, Alan; Maluf, David; Bell, David; Gurram, Mohana; Tran, Khai Peter; Hsu, Jennifer; Yagi, Kenji; Patel, Hemil

    2007-01-01

The Program Management Tool (PMT) is a comprehensive, Web-enabled business intelligence software tool for assisting program and project managers within NASA enterprises in gathering, comprehending, and disseminating information on the progress of their programs and projects. The PMT provides planning and management support for implementing NASA programmatic and project management processes and requirements. It provides an online environment for program and line management to develop, communicate, and manage their programs, projects, and tasks in a comprehensive tool suite. The information managed by use of the PMT can include data on goals, deliverables, milestones, business processes, personnel, task plans, monthly reports, and budgetary allocations. The PMT provides an intuitive and enhanced Web interface to automate the tedious process of gathering and sharing monthly progress reports, task plans, financial data, and other information on project resources based on technical, schedule, budget, and management criteria and merits. The PMT is consistent with the latest Web standards and software practices, including the use of Extensible Markup Language (XML) for exchanging data and the WebDAV (Web Distributed Authoring and Versioning) protocol for collaborative management of documents. The PMT provides graphical displays of resource allocations in the form of bar and pie charts using Microsoft Excel Visual Basic for Applications (VBA) libraries. The PMT has an extensible architecture that enables integration of PMT with other strategic-information software systems, including, for example, the Erasmus reporting system, now part of the NASA Integrated Enterprise Management Program (IEMP) tool suite, at NASA Marshall Space Flight Center (MSFC).
The PMT data architecture provides automated and extensive software interfaces and reports to various strategic information systems to eliminate duplicative human entries and minimize data integrity issues among various NASA systems that impact schedules and planning.

  14. Distributed computing feasibility in a non-dedicated homogeneous distributed system

    NASA Technical Reports Server (NTRS)

    Leutenegger, Scott T.; Sun, Xian-He

    1993-01-01

    The low cost and availability of clusters of workstations have led researchers to re-explore distributed computing using independent workstations. This approach may provide better cost/performance than tightly coupled multiprocessors. In practice, this approach often utilizes wasted cycles to run parallel jobs. The feasibility of such a non-dedicated parallel processing environment, assuming workstation processes have preemptive priority over parallel tasks, is addressed. An analytical model is developed to predict parallel job response times. Our model provides insight into how significantly workstation owner interference degrades parallel program performance. A new term, the task ratio, which relates the parallel task demand to the mean service demand of nonparallel workstation processes, is introduced. It is proposed that the task ratio is a useful metric for determining how large the demand of a parallel application must be in order to make efficient use of a non-dedicated distributed system.
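    The task-ratio idea can be made concrete with a toy calculation (a hypothetical sketch, not the authors' analytical model): owner processes have preemptive priority, so a parallel task effectively runs only in the cycles the owner leaves idle.

    ```python
    # Toy illustration (a hypothetical sketch, not the paper's model):
    # owner processes preempt parallel tasks, so a parallel task runs
    # only in the idle fraction of each workstation's cycles.

    def task_ratio(parallel_task_demand, mean_owner_demand):
        """Parallel task demand relative to the mean demand of owner processes."""
        return parallel_task_demand / mean_owner_demand

    def predicted_task_time(parallel_task_demand, owner_utilization):
        """Naive estimate: the parallel task sees only the idle CPU fraction."""
        assert 0.0 <= owner_utilization < 1.0
        return parallel_task_demand / (1.0 - owner_utilization)

    # A job of N equal tasks on N identical workstations finishes with its
    # slowest task; with identical interference, one task's time is the job's.
    ratio = task_ratio(100.0, 5.0)             # 20.0
    elapsed = predicted_task_time(100.0, 0.3)  # about 143 time units
    ```

    A high task ratio means owner interference is amortized over a long-running task; a low ratio means the parallel job is dominated by the owner's workload.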

  15. A comparison of older adults' subjective experience with virtual and real environments during dynamic balance activities

    PubMed Central

    Proffitt, Rachel; Lange, Belinda; Chen, Christina; Winstein, Carolee

    2014-01-01

    The purpose of this study was to explore the subjective experience of older adults interacting with both virtual and real environments. Thirty healthy older adults engaged with real and virtual tasks of similar motor demands: reaching to a target in standing and stepping stance. Immersive tendencies and absorption scales were administered before the session. Game engagement and experience questionnaires were completed after each task, followed by a semi-structured interview at the end of the testing session. Data were analyzed respectively using paired t-tests and grounded theory methodology. Participants preferred the virtual task over the real task. They also reported an increase in presence and absorption with the virtual task, describing an external focus of attention. Findings will be used to inform future development of appropriate game-based balance training applications that could be embedded in the home or community settings as part of evidence-based fall prevention programs. PMID:24334299

  16. A historical perspective of remote operations and robotics in nuclear facilities. Robotics and Intelligent Systems Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herndon, J.N.

    1992-12-31

    The field of remote technology is continuing to evolve to support man's efforts to perform tasks in hostile environments. The technology which we recognize today as remote technology has evolved over the last 45 years to support human operations in hostile environments such as nuclear fission and fusion, space, underwater, hazardous chemical, and hazardous manufacturing. The four major categories of approach to remote technology have been (1) protective clothing and equipment for direct human entry, (2) extended reach tools using distance for safety, (3) telemanipulators with barriers for safety, and (4) teleoperators incorporating mobility with distance and/or barriers for safety. The government and commercial nuclear industries have driven the development of the majority of the actual teleoperator hardware available today. This hardware has been developed largely due to the unsatisfactory performance of the protective-clothing approach in many hostile applications. Manipulation systems which have been developed include crane/impact wrench systems, unilateral power manipulators, mechanical master/slaves, and servomanipulators. Viewing systems have included periscopes, shield windows, and television systems. Experience over the past 45 years indicates that maintenance system flexibility is essential to typical repair tasks because they are usually not repetitive, structured, or planned. Fully remote design (manipulation, task provisions, remote tooling, and facility synergy) is essential to work task efficiency. Work for space applications has been primarily research oriented, with relatively few successful space applications, although the shuttle's remote manipulator system has been quite successful. In the last decade, underwater applications have moved forward significantly, with the offshore oil industry and military applications providing the primary impetus.

  17. Application of evolutionary computation in ECAD problems

    NASA Astrophysics Data System (ADS)

    Lee, Dae-Hyun; Hwang, Seung H.

    1998-10-01

    Design of modern electronic systems is a complicated task which demands the use of computer-aided design (CAD) tools. Since many problems in ECAD are combinatorial optimization problems, evolutionary computations such as genetic algorithms and evolutionary programming have been widely employed to solve them. We have applied evolutionary computation techniques to ECAD problems such as technology mapping, microcode-bit optimization, data path ordering, and peak power estimation, where their benefits are well observed. This paper presents experiences and discusses issues in those applications.

  18. Final Environmental Impact Statement Permit Application by United States Steel Corp. Proposed Lake Front Steel Mill, Conneaut, Ohio. Volume 1,

    DTIC Science & Technology

    1979-04-01

    Authors: Paul G. Leuchner and Gregory P. Keppel. Performing organization: U.S. Army Engineer District, Buffalo, 1776 Niagara Street. The permit application concerns certain work in Lake Erie and its tributaries; activities proposed by the applicant include the construction of a water

  19. CREATING AN IPHONE APPLICATION FOR COLLECTING CONTINUOUS ABC DATA

    PubMed Central

    Whiting, Seth W; Dixon, Mark R

    2012-01-01

    This paper provides an overview and task analysis for creating a continuous ABC data-collection application using Xcode on a Mac computer. Behavior analysts can program an ABC data-collection system, complete with a customized list of target clients, antecedents, behaviors, and consequences to be recorded, and have the data automatically sent to an e-mail account after observations have concluded. Further suggestions are provided to customize the ABC data-collection system for individual preferences and clinical needs. PMID:23060682

  20. Creating an iPhone application for collecting continuous ABC data.

    PubMed

    Whiting, Seth W; Dixon, Mark R

    2012-01-01

    This paper provides an overview and task analysis for creating a continuous ABC data-collection application using Xcode on a Mac computer. Behavior analysts can program an ABC data-collection system, complete with a customized list of target clients, antecedents, behaviors, and consequences to be recorded, and have the data automatically sent to an e-mail account after observations have concluded. Further suggestions are provided to customize the ABC data-collection system for individual preferences and clinical needs.

  1. Massive Symbolic Mathematical Computations and Their Applications

    DTIC Science & Technology

    1988-08-16

    AFOSR/DARPA R&D Status Report (quarterly), Contract No. F49620-87-C-0113: Massive Symbolic Mathematical Computations and Their Applications.

  2. STICAP: A linear circuit analysis program with stiff systems capability. Volume 1: Theory manual. [network analysis

    NASA Technical Reports Server (NTRS)

    Cooke, C. H.

    1975-01-01

    STICAP (Stiff Circuit Analysis Program) is a FORTRAN IV computer program written for the CDC-6400-6600 computer series and SCOPE 3.0 operating system. It provides the circuit analyst a tool for automatically computing the transient responses and frequency responses of large linear time-invariant networks, both stiff and nonstiff (algorithms and numerical integration techniques are described). The circuit description and user's program input language is engineer-oriented, simplifying the task of using the program. Engineering theories underlying STICAP are examined. A user's manual is included which explains user interaction with the program and gives results of typical circuit design applications. Also, the program structure from a systems programmer's viewpoint is depicted, and flow charts and other software documentation are given.

  3. Real-Time MENTAT programming language and architecture

    NASA Technical Reports Server (NTRS)

    Grimshaw, Andrew S.; Silberman, Ami; Liu, Jane W. S.

    1989-01-01

    Real-time MENTAT, a programming environment designed to simplify the task of programming real-time applications in distributed and parallel environments, is described. It is based on the same data-driven computation model and object-oriented programming paradigm as MENTAT. It provides an easy-to-use mechanism to exploit parallelism, language constructs for the expression and enforcement of timing constraints, and run-time support for scheduling and executing real-time programs. The real-time MENTAT programming language is an extended C++. The extensions are added to facilitate automatic detection of data flow and generation of data flow graphs, to express the timing constraints of individual granules of computation, and to provide scheduling directives for the runtime system. A high-level view of the real-time MENTAT system architecture and programming language constructs is provided.

  4. A Fault Oblivious Extreme-Scale Execution Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McKie, Jim

    The FOX project, funded under the ASCR X-stack I program, developed systems software and runtime libraries for a new approach to the data and work distribution for massively parallel, fault oblivious application execution. Our work was motivated by the premise that exascale computing systems will provide a thousand-fold increase in parallelism and a proportional increase in failure rate relative to today's machines. To deliver the capability of exascale hardware, the systems software must provide the infrastructure to support existing applications while simultaneously enabling efficient execution of new programming models that naturally express dynamic, adaptive, irregular computation; coupled simulations; and massive data analysis in a highly unreliable hardware environment with billions of threads of execution. Our OS research has prototyped new methods to provide efficient resource sharing, synchronization, and protection in a many-core compute node. We have experimented with alternative task/dataflow programming models and shown scalability in some cases to hundreds of thousands of cores. Much of our software is in active development through open source projects. Concepts from FOX are being pursued in next generation exascale operating systems. Our OS work focused on adaptive, application-tailored OS services optimized for multi- and many-core processors. We developed a new operating system, NIX, that supports role-based allocation of cores to processes, which was released to open source. We contributed to the IBM FusedOS project, which promoted the concept of latency-optimized and throughput-optimized cores. We built a task queue library based on a distributed, fault tolerant key-value store and identified scaling issues. A second fault tolerant task parallel library was developed, based on the Linda tuple space model, that used low level interconnect primitives for optimized communication.
We designed fault tolerance mechanisms for task parallel computations employing work stealing for load balancing that scaled to the largest existing supercomputers. Finally, we implemented the Elastic Building Blocks runtime, a library to manage object-oriented distributed software components. To support the research, we won two INCITE awards for time on Intrepid (BG/P) and Mira (BG/Q). Much of our work has had impact in the OS and runtime community through the ASCR Exascale OS/R workshop and report, leading to the research agenda of the Exascale OS/R program. Our project was, however, also affected by attrition of multiple PIs. While the PIs continued to participate and offer guidance as time permitted, losing these key individuals was unfortunate both for the project and for the DOE HPC community.

  5. EJS, JIL Server, and LabVIEW: An Architecture for Rapid Development of Remote Labs

    ERIC Educational Resources Information Center

    Chacón, Jesús; Vargas, Hector; Farias, Gonzalo; Sanchez, José; Dormido, Sebastián

    2015-01-01

    Designing and developing web-enabled remote laboratories for pedagogical purposes is not an easy task. Often, developers (generally, educators who know the subjects they teach but lack the technical and programming skills required to build Internet-based educational applications) end up discarding the idea of exploring these new teaching and…

  6. Logo Experiences with Young Children: Describing Performance, Problem-Solving and Social Contexts of Learning.

    ERIC Educational Resources Information Center

    Yelland, Nicola

    1995-01-01

    Explored the performance of primary school children in Logo programming tasks while they worked in one of three gender pairings (girl-girl, boy-boy, or girl-boy). Found no considerable differences in performance based on gender. Results suggest that what distinguished performance was the application of metastrategic processes--the most effective solutions…

  7. Life Sciences Program Tasks and Bibliography for FY 1996

    NASA Technical Reports Server (NTRS)

    Nelson, John C. (Editor)

    1997-01-01

    This document includes information on all peer reviewed projects funded by the Office of Life and Microgravity Sciences and Applications, Life Sciences Division during fiscal year 1996. This document will be published annually and made available to scientists in the space life sciences field both as a hard copy and as an interactive Internet web page.

  8. Life Sciences Program Tasks and Bibliography for FY 1997

    NASA Technical Reports Server (NTRS)

    Nelson, John C. (Editor)

    1998-01-01

    This document includes information on all peer reviewed projects funded by the Office of Life and Microgravity Sciences and Applications, Life Sciences Division during fiscal year 1997. This document will be published annually and made available to scientists in the space life sciences field both as a hard copy and as an interactive internet web page.

  9. Analytic Support of Emergency Response and Recovery for the Wide-Area Recovery & Resiliency Program (WARRP) Task 1: Medical Countermeasures Response

    DTIC Science & Technology

    2012-02-23

    No integration exists between national biosurveillance systems, so disparate signals could be received. The system is very limited in its applicability at this time, being deployable in only one city and in the process of being implemented in four more.

  10. Task-level robot programming: Integral part of evolution from teleoperation to autonomy

    NASA Technical Reports Server (NTRS)

    Reynolds, James C.

    1987-01-01

    An explanation is presented of task-level robot programming and of how it differs from the usual interpretation of task planning for robotics. Most importantly, it is argued that the physical and mathematical basis of task-level robot programming provides inherently greater reliability than efforts to apply better known concepts from artificial intelligence (AI) to autonomous robotics. Finally, an architecture is presented that allows the integration of task-level robot programming within an evolutionary, redundant, and multi-modal framework that spans teleoperation to autonomy.

  11. Microgravity Science and Applications Program tasks, 1986 revision

    NASA Technical Reports Server (NTRS)

    1987-01-01

    The Microgravity Science and Applications (MSA) program is directed toward research in the science and technology of processing materials under conditions of low gravity to provide a detailed examination of the constraints imposed by gravitational forces on Earth. The program is expected to lead to the development of new materials and processes with commercial applications, adding to the nation's technological base. The research studies emphasize selected materials and processes that will best elucidate the limitations due to gravity and demonstrate the enhanced sensitivity of process control that may be provided by the weightless environment of space. Primary effort is devoted to the specific areas of research that revealed potential value in the initial investigations of the previous decades. Examples of previous process research include crystal growth and directional solidification of metals; containerless processing of reactive materials; synthesis and separation of biological materials; etc. Additional efforts will be devoted to identifying the special requirements which drive the design of hardware to reduce risk in future developments.

  12. Startle reveals an absence of advance motor programming in a Go/No-go task.

    PubMed

    Carlsen, Anthony N; Chua, Romeo; Dakin, Chris J; Sanderson, David J; Inglis, J Timothy; Franks, Ian M

    2008-03-21

    Presenting a startling stimulus in a simple reaction time (RT) task can involuntarily trigger the pre-programmed response. However, this effect is not seen when the response is programmed following the imperative stimulus (IS), providing evidence that a startle can only trigger pre-programmed responses. In a "Go/No-go" (GNG) RT task the response may be programmed in advance of the IS because there exists only a single predetermined response. The purpose of the current investigation was to examine whether a startle could elicit a response in a GNG task. Participants completed a wrist extension task in response to a visual stimulus. A startling acoustic stimulus (124 dB) was presented in both Go and No-go trials, with Go probability manipulated between groups. The inclusion of a startle did not significantly speed RT and led to more response errors. This result is similar to that observed in a startled choice RT task, indicating that in a GNG task participants waited until the IS to complete motor programming.

  13. Tryon Trekkers: An Evaluation of a STEM Based Afterschool Program for At-Risk Youth

    NASA Astrophysics Data System (ADS)

    Eckels Anderson, Chessa

    This study contributed to the body of research that supports a holistic model of afterschool learning through the design of an afterschool intervention that benefits elementary school students of low socioeconomic status. This qualitative study evaluated a science focused afterschool curriculum that was designed using principles from Risk and Resiliency Theory, academic motivation theories, science core ideas from the Next Generation Science Standards, and used environmental education philosophy. The research question of this study is: how does an outdoor and STEM based afterschool program impact at-risk students' self-efficacy, belonging and engagement and ability to apply conceptual knowledge of environmental science topics? The study collected information about the participants' affective experiences during the intervention using structured and ethnographic observations and semi-structured interviews. Observations and interviews were coded and analyzed to find patterns in participants' responses. Three participant profiles were developed using the structured observations and ethnographic observations to provide an in depth understanding of the participant experience. The study also assessed the participants' abilities to apply conceptual understanding of the program's science topics by integrating an application of conceptual knowledge task into the curriculum. This task in the form of a participant project was assessed using an adapted version of the Portland Metro STEM Partnership's Application of Conceptual Knowledge Rubric. Results in the study showed that participants demonstrated self-efficacy, a sense of belonging and engagement during the program. Over half of the participants in the study demonstrated a proficient understanding of program concepts. Overall, this holistic afterschool program demonstrated that specific instructional practices and a multi-modal science curriculum helped to support the social and emotional needs of at-risk children.

  14. A Hybrid Task Graph Scheduler for High Performance Image Processing Workflows.

    PubMed

    Blattner, Timothy; Keyrouz, Walid; Bhattacharyya, Shuvra S; Halem, Milton; Brady, Mary

    2017-12-01

    Designing applications for scalability is key to improving their performance in hybrid and cluster computing. Scheduling code to utilize parallelism is difficult, particularly when dealing with data dependencies, memory management, data motion, and processor occupancy. The Hybrid Task Graph Scheduler (HTGS) is an abstract execution model, framework, and API that improves programmer productivity when implementing hybrid workflows for multi-core and multi-GPU systems. HTGS manages dependencies between tasks, represents CPU and GPU memories independently, overlaps computations with disk I/O and memory transfers, keeps multiple GPUs occupied, and uses all available compute resources. Through these abstractions, data motion and memory are explicit; this makes data locality decisions more accessible. To demonstrate the HTGS application program interface (API), we present implementations of two example algorithms: (1) a matrix multiplication that shows how easily task graphs can be used; and (2) a hybrid implementation of microscopy image stitching that reduces code size by ≈43% compared to a manually coded hybrid workflow implementation and showcases the minimal overhead of task graphs in HTGS. Both of the HTGS-based implementations show good performance. In image stitching, the HTGS implementation achieves performance similar to that of the hybrid workflow implementation. Matrix multiplication with HTGS achieves 1.3× and 1.8× speedup over the multi-threaded OpenBLAS library for 16k × 16k and 32k × 32k matrices, respectively.
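    The core task-graph idea, stages connected by queues so that each stage's work overlaps with the others, can be sketched generically (an illustrative Python pipeline, not the HTGS C++ API):

    ```python
    # Generic two-stage task pipeline: each stage runs in its own thread and
    # stages are connected by FIFO queues, so stage work overlaps in time.
    # A None "poison pill" flows through the graph to shut the stages down.
    import queue
    import threading

    def stage(fn, inq, outq):
        """Consume items from inq, apply fn, and forward results to outq."""
        while True:
            item = inq.get()
            if item is None:          # poison pill: forward it and exit
                outq.put(None)
                return
            outq.put(fn(item))

    q1, q2, q3 = queue.Queue(), queue.Queue(), queue.Queue()
    threads = [threading.Thread(target=stage, args=(lambda x: x * x, q1, q2)),
               threading.Thread(target=stage, args=(lambda x: x + 1, q2, q3))]
    for t in threads:
        t.start()
    for x in [1, 2, 3]:               # feed work into the front of the graph
        q1.put(x)
    q1.put(None)
    for t in threads:
        t.join()

    results = []
    while True:                       # drain the final queue
        item = q3.get()
        if item is None:
            break
        results.append(item)
    # results == [2, 5, 10]
    ```

    HTGS generalizes this pattern with explicit memory edges, GPU-resident data, and dependency management; the sketch only shows why connecting stages by queues lets computation overlap with I/O.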

  15. TTSA: An Effective Scheduling Approach for Delay Bounded Tasks in Hybrid Clouds.

    PubMed

    Yuan, Haitao; Bi, Jing; Tan, Wei; Zhou, MengChu; Li, Bo Hu; Li, Jianqiang

    2017-11-01

    The economy of scale provided by the cloud attracts a growing number of organizations and industrial companies to deploy their applications in cloud data centers (CDCs) and to provide services to users around the world. The uncertainty of arriving tasks makes it a big challenge for a private CDC to cost-effectively schedule delay-bounded tasks without exceeding their delay bounds. Unlike previous studies, this paper takes into account the cost minimization problem for a private CDC in hybrid clouds, where the energy price of the private CDC and the execution price of public clouds both show temporal diversity. This paper then proposes a temporal task scheduling algorithm (TTSA) to effectively dispatch all arriving tasks to the private CDC and public clouds. In each iteration of TTSA, the cost minimization problem is modeled as a mixed integer linear program and solved by a hybrid simulated-annealing particle-swarm optimization algorithm. The experimental results demonstrate that, compared with existing methods, the optimal or suboptimal scheduling strategy produced by TTSA can efficiently increase the throughput and reduce the cost of the private CDC while meeting the delay bounds of all the tasks.
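    The dispatch decision can be illustrated with a greatly simplified sketch (a greedy stand-in for TTSA's mixed integer linear program; the function, capacities, and prices are hypothetical): per time slot, run tasks in the private CDC when its energy price is lower, otherwise send them to the public cloud.

    ```python
    # Simplified temporal dispatch between a private CDC and a public cloud
    # (a hypothetical greedy sketch, not TTSA itself). Prices vary per time
    # slot; the private CDC can run at most private_capacity tasks per slot.
    def dispatch(tasks_per_slot, private_capacity, energy_price, public_price):
        """Return (cost, plan) where plan[t] = (tasks run privately, tasks sent public)."""
        cost, plan = 0.0, []
        for t, n in enumerate(tasks_per_slot):
            if energy_price[t] <= public_price[t]:
                private = min(n, private_capacity)  # private is cheaper: fill it first
            else:
                private = 0                         # public is cheaper this slot
            public = n - private
            cost += private * energy_price[t] + public * public_price[t]
            plan.append((private, public))
        return cost, plan

    cost, plan = dispatch([5, 8, 3], private_capacity=4,
                          energy_price=[1.0, 3.0, 1.5],
                          public_price=[2.0, 2.5, 1.0])
    ```

    The real TTSA also exploits delay bounds by deferring tasks to cheaper future slots, a temporal trade-off this per-slot greedy rule cannot capture; that is what makes the MILP formulation necessary.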

  16. Application of IPAD to missile design

    NASA Technical Reports Server (NTRS)

    Santa, J. E.; Whiting, T. R.

    1974-01-01

    The application of an integrated program for aerospace-vehicle design (IPAD) to the design of a tactical missile is examined. The feasibility of modifying a proposed IPAD system for aircraft design work for use in missile design is evaluated. The tasks, cost, and schedule for the modification are presented. The basic engineering design process is described, explaining how missile design is achieved through iteration of six logical problem solving functions throughout the system studies, preliminary design, and detailed design phases of a new product. Existing computer codes used in various engineering disciplines are evaluated for their applicability to IPAD in missile design.

  17. TDRSS system configuration study for space shuttle program

    NASA Technical Reports Server (NTRS)

    1978-01-01

    This study was set up to assure that operation of the shuttle orbiter communications systems met the program requirements when subjected to electrical conditions similar to those which will be encountered during the operational mission. The test program was intended to implement an integrated test bed consisting of applicable orbiter, EVA, payload simulator, STDN, and AF/SCF equipment, as well as the TDRSS equipment. The stated intention of the Task 501 Program was to configure the test bed with prototype hardware for a system development test and production hardware for a system verification test. In the case of TDRSS, when hardware was not available, simulators whose functional performance was certified to meet the appropriate end-item specification were used.

  18. XPI: The Xanadu Parameter Interface

    NASA Technical Reports Server (NTRS)

    White, N.; Barrett, P.; Oneel, B.; Jacobs, P.

    1992-01-01

    XPI is a table-driven parameter interface which greatly simplifies both command-driven programs, such as BROWSE and XIMAGE, and stand-alone single-task programs. It moves all of the syntax and semantic parsing of commands and parameters out of the user's code into common code and externally defined tables. This allows the programmer to concentrate on writing the code unique to the application rather than reinventing the user interface, and allows external graphical interfaces to be attached with no changes to the command-driven program. XPI also includes a compatibility library which allows programs written using the IRAF host interface (Mandel and Roll) to use XPI in place of the IRAF host interface.
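    The table-driven approach, where command syntax lives in external tables rather than in application code, can be illustrated with a minimal sketch (the table format, command names, and parameters here are invented, not XPI's actual syntax):

    ```python
    # Minimal table-driven parameter interface (illustrative only; not XPI's
    # actual table syntax). Commands and parameter types are declared in a
    # table, so application code never parses command syntax itself.
    COMMAND_TABLE = {
        "plot": {"xmin": float, "xmax": float, "log": bool},
        "read": {"file": str, "rows": int},
    }

    def parse(line):
        """Parse 'cmd key=value ...' against the table; return (cmd, params)."""
        name, *pairs = line.split()
        spec = COMMAND_TABLE[name]          # unknown command -> KeyError
        params = {}
        for pair in pairs:
            key, _, raw = pair.partition("=")
            kind = spec[key]                # unknown parameter -> KeyError
            params[key] = (raw.lower() == "true") if kind is bool else kind(raw)
        return name, params

    cmd, params = parse("plot xmin=0.5 xmax=10 log=true")
    ```

    Because the parsing lives in common code driven by the table, adding a command means adding a table entry, and a graphical front end can generate the same `cmd key=value` strings without touching the application.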

  19. Automated Planning Enables Complex Protocols on Liquid-Handling Robots.

    PubMed

    Whitehead, Ellis; Rudolf, Fabian; Kaltenbach, Hans-Michael; Stelling, Jörg

    2018-03-16

    Robotic automation in synthetic biology is especially relevant for liquid handling to facilitate complex experiments. However, research tasks that are not highly standardized are still rarely automated in practice. Two main reasons for this are the substantial investments required to translate molecular biological protocols into robot programs, and the fact that the resulting programs are often too specific to be easily reused and shared. Recent developments of standardized protocols and dedicated programming languages for liquid-handling operations addressed some aspects of ease-of-use and portability of protocols. However, either they focus on simplicity, at the expense of enabling complex protocols, or they entail detailed programming, with corresponding skills and efforts required from the users. To reconcile these trade-offs, we developed Roboliq, a software system that uses artificial intelligence (AI) methods to integrate (i) generic formal, yet intuitive, protocol descriptions, (ii) complete, but usually hidden, programming capabilities, and (iii) user-system interactions to automatically generate executable, optimized robot programs. Roboliq also enables high-level specifications of complex tasks with conditional execution. To demonstrate the system's benefits for experiments that are difficult to perform manually because of their complexity, duration, or time-critical nature, we present three proof-of-principle applications for the reproducible, quantitative characterization of GFP variants.

  20. The space station freedom flight telerobotic servicer. The design and evolution of a dexterous space robot

    NASA Astrophysics Data System (ADS)

    McCain, Harry G.; Andary, James F.; Hewitt, Dennis R.; Haley, Dennis C.

    The Flight Telerobotic Servicer (FTS) Project at the Goddard Space Flight Center is developing an advanced telerobotic system to assist in and reduce crew extravehicular activity (EVA) for Space Station Freedom (SSF). The FTS will provide a telerobotic capability to the Freedom Station in the early assembly phases of the program and will be employed for assembly, maintenance, and inspection applications throughout the lifetime of the space station. Appropriately configured elements of the FTS will also be employed for robotic manipulation in remote satellite servicing applications and possibly the Lunar/Mars Program. In mid-1989, the FTS entered the flight system design and implementation phase (Phase C/D) of development with the signing of the FTS prime contract with Martin Marietta Astronautics Group in Denver, Colorado. The basic FTS design is now established and can be reported on in some detail. This paper will describe the FTS flight system design and the rationale for the specific design approaches and component selections. The current state of space technology and the general nature of the FTS task dictate that the FTS be designed with sophisticated teleoperation capabilities for its initial primary operating mode. However, there are technologies, such as advanced computer vision and autonomous planning techniques currently in research and advanced development phases which would greatly enhance the FTS capabilities to perform autonomously in less structured work environments. Therefore, a specific requirement on the initial FTS design is that it has the capability to evolve as new technology becomes available. This paper will describe the FTS design approach for evolution to more autonomous capabilities. Some specific task applications of the FTS and partial automation approaches of these tasks will also be discussed in this paper.

  1. The Space Station Freedom Flight Telerobotic Servicer: the design and evolution of a dexterous space robot.

    PubMed

    McCain, H G; Andary, J F; Hewitt, D R; Haley, D C

    1991-01-01

    The Flight Telerobotic Servicer (FTS) Project at the Goddard Space Flight Center is developing an advanced telerobotic system to assist in and reduce crew extravehicular activity (EVA) for Space Station Freedom (SSF). The FTS will provide a telerobotic capability to the Freedom Station in the early assembly phases of the program and will be employed for assembly, maintenance, and inspection applications throughout the lifetime of the space station. Appropriately configured elements of the FTS will also be employed for robotic manipulation in remote satellite servicing applications and possibly the Lunar/Mars Program. In mid-1989, the FTS entered the flight system design and implementation phase (Phase C/D) of development with the signing of the FTS prime contract with Martin Marietta Astronautics Group in Denver, Colorado. The basic FTS design is now established and can be reported on in some detail. This paper will describe the FTS flight system design and the rationale for the specific design approaches and component selections. The current state of space technology and the nature of the FTS task dictate that the FTS be designed with sophisticated teleoperation capabilities for its initial primary operating mode. However, there are technologies, such as advanced computer vision and autonomous planning techniques currently in research and advanced development phases which would greatly enhance the FTS capabilities to perform autonomously in less structured work environments. Therefore, a specific requirement on the initial FTS design is that it has the capability to evolve as new technology becomes available. This paper will describe the FTS design approach for evolution to more autonomous capabilities. Some specific task applications of the FTS and partial automation approaches of these tasks will also be discussed in this paper.

  2. The Space Station Freedom Flight Telerobotic Servicer: the design and evolution of a dexterous space robot

    NASA Technical Reports Server (NTRS)

    McCain, H. G.; Andary, J. F.; Hewitt, D. R.; Haley, D. C.

    1991-01-01

    The Flight Telerobotic Servicer (FTS) Project at the Goddard Space Flight Center is developing an advanced telerobotic system to assist in and reduce crew extravehicular activity (EVA) for Space Station Freedom (SSF). The FTS will provide a telerobotic capability to the Freedom Station in the early assembly phases of the program and will be employed for assembly, maintenance, and inspection applications throughout the lifetime of the space station. Appropriately configured elements of the FTS will also be employed for robotic manipulation in remote satellite servicing applications and possibly the Lunar/Mars Program. In mid-1989, the FTS entered the flight system design and implementation phase (Phase C/D) of development with the signing of the FTS prime contract with Martin Marietta Astronautics Group in Denver, Colorado. The basic FTS design is now established and can be reported on in some detail. This paper will describe the FTS flight system design and the rationale for the specific design approaches and component selections. The current state of space technology and the nature of the FTS task dictate that the FTS be designed with sophisticated teleoperation capabilities for its initial primary operating mode. However, there are technologies, such as advanced computer vision and autonomous planning techniques currently in research and advanced development phases which would greatly enhance the FTS capabilities to perform autonomously in less structured work environments. Therefore, a specific requirement on the initial FTS design is that it has the capability to evolve as new technology becomes available. This paper will describe the FTS design approach for evolution to more autonomous capabilities. Some specific task applications of the FTS and partial automation approaches of these tasks will also be discussed in this paper.

  3. Microfine coal firing results from a retrofit gas/oil-designed industrial boiler

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patel, R.; Borio, R.W.; Liljedahl, G.

    1995-12-31

    The development of a High Efficiency Advanced Coal Combustor (HEACC) has been in progress since 1987 at the ABB Power Plant Laboratories. The initial work on this concept produced an advanced coal firing system that was capable of firing both water-based and dry pulverized coal in an industrial boiler environment. Economics may one day dictate that it makes sense to replace oil or natural gas with coal in boilers that were originally designed to burn these fuels. The objective of the current program is to demonstrate the technical and economic feasibility of retrofitting a gas/oil-designed boiler to burn micronized coal. In support of this overall objective, the following specific areas were targeted: a coal handling/preparation system that can meet the technical requirements for retrofitting microfine coal on a boiler designed for burning oil or natural gas; maintaining boiler thermal performance in accordance with specifications when burning oil or natural gas; maintaining NOx emissions at or below 0.6 lb/MBtu; achieving combustion efficiencies of 98% or higher; and calculating economic payback periods as a function of key variables. The overall program has consisted of five major tasks: (1) a review of current state-of-the-art coal firing system components; (2) design and experimental testing of a prototype HEACC burner; (3) installation and testing of a HEACC system in a commercial retrofit application; (4) economic evaluation of the HEACC concept for retrofit applications; and (5) long-term demonstration under commercial user demand conditions. This paper will summarize the latest key experimental results (Task 3) and the economic evaluation (Task 4) of the HEACC concept for retrofit applications. 28 figs., 6 tabs.

  4. Programming Models for Concurrency and Real-Time

    NASA Astrophysics Data System (ADS)

    Vitek, Jan

    Modern real-time applications are increasingly large, complex, and concurrent systems that must meet stringent performance and predictability requirements. Programming those systems requires fundamental advances in programming languages and runtime systems. This talk presents our work on Flexotasks, a programming model for concurrent, real-time systems inspired by stream-processing and concurrent active objects. Among the key innovations in Flexotasks is that it supports both real-time garbage collection and region-based memory with an ownership type system for static safety. Communication between tasks is performed by channels with a linear type discipline to avoid copying messages, and by a non-blocking transactional memory facility. We have evaluated our model empirically within two distinct implementations, one based on Purdue’s Ovm research virtual machine framework and the other on WebSphere, IBM’s production real-time virtual machine. We have written a number of small programs, as well as a 30 KLOC avionics collision detector application. We show that Flexotasks are capable of executing periodic threads at 10 KHz with a standard deviation of 1.2 us and have performance competitive with hand-coded C programs.
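    The 10 KHz periodic-thread figure in this abstract concerns bounding jitter. As a language-neutral illustration (a Python sketch, not Flexotasks code, which targets real-time Java virtual machines), a fixed-rate loop can schedule each release against an absolute timeline so that per-iteration jitter does not accumulate:

```python
import time

def run_periodic(task, period_s, iterations):
    """Invoke `task` at a fixed rate, computing each release time from an
    absolute timeline so that per-iteration jitter does not accumulate."""
    next_release = time.monotonic()
    for _ in range(iterations):
        task()
        next_release += period_s
        delay = next_release - time.monotonic()
        if delay > 0:
            time.sleep(delay)

releases = []
run_periodic(lambda: releases.append(time.monotonic()),
             period_s=0.01, iterations=5)
print(len(releases))  # 5 releases, nominally 10 ms apart
```

Scheduling against `next_release` rather than sleeping a fixed interval after each task is the standard way to keep long-run drift bounded; a hard real-time runtime adds a predictable scheduler and collector underneath the same idea.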

  5. Development of flight experiment work performance and workstation interface requirements, part 1. Technical report and appendices A through G

    NASA Technical Reports Server (NTRS)

    Hatterick, R. G.

    1973-01-01

    A skill requirement definition method was applied to the problem of determining, at an early stage in system/mission definition, the skills required of on-orbit crew personnel whose activities will be related to the conduct or support of earth-orbital research. The experiment data base was selected from proposed experiments in NASA's earth orbital research and application investigation program as related to space shuttle missions, specifically those being considered for Sortie Lab. Concepts for two integrated workstation consoles for Sortie Lab experiment operations were developed, one each for earth observations and materials sciences payloads, utilizing a common supporting subsystems core console. A comprehensive data base of crew functions, operating environments, task dependencies, task-skills and occupational skills applicable to a representative cross section of earth orbital research experiments is presented. All data has been coded alphanumerically to permit efficient, low cost exercise and application of the data through automatic data processing in the future.

  6. High-pressure LOX/hydrocarbon preburners and gas generators

    NASA Technical Reports Server (NTRS)

    Huebner, A. W.

    1981-01-01

    The objective of the program was to conduct a small-scale hardware test program to establish the technology base required for LOX/hydrocarbon preburners and gas generators. The program consisted of six major tasks. Task I reviewed and assessed the performance prediction models and defined a subscale test program. Task II designed and fabricated this subscale hardware. Task III tested and analyzed the data from this hardware. Task IV analyzed the hot-fire results and formulated a preliminary design for 40K preburner assemblies. Task V detailed the preliminary design and fabricated three 40K-size preburner assemblies: one fuel-rich LOX/CH4, one fuel-rich LOX/RP-1, and one oxidizer-rich LOX/CH4. Task VI delivered these preburner assemblies to MSFC for subsequent evaluation.

  7. Analysis of the impact of safeguards criteria

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mullen, M.F.; Reardon, P.T.

    As part of the US Program of Technical Assistance to IAEA Safeguards, the Pacific Northwest Laboratory (PNL) was asked to assist in developing and demonstrating a model for assessing the impact of setting criteria for the application of IAEA safeguards. This report presents the results of PNL's work on the task. The report is in three parts. The first explains the technical approach and methodology. The second contains an example application of the methodology. The third presents the conclusions of the study. PNL used the model and computer programs developed as part of Task C.5 (Estimation of Inspection Efforts) of the Program of Technical Assistance. The example application of the methodology involves low-enriched uranium conversion and fuel fabrication facilities. The effects of variations in seven parameters are considered: false alarm probability, goal probability of detection, detection goal quantity, the plant operator's measurement capability, the inspector's variables measurement capability, the inspector's attributes measurement capability, and annual plant throughput. Among the key results and conclusions of the analysis are the following: the variables with the greatest impact on the probability of detection are the inspector's measurement capability, the goal quantity, and the throughput; the variables with the greatest impact on inspection costs are the throughput, the goal quantity, and the goal probability of detection; there are important interactions between variables, that is, the effects of a given variable often depend on the level or value of some other variable, and with the methodology used in this study these interactions can be quantitatively analyzed; and reasonably good approximate prediction equations can be developed using the methodology described here.
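    The interplay of goal quantity, sample size, and detection probability described in this record is commonly modeled with hypergeometric attributes sampling. The sketch below is a generic illustration of that relationship, not a reconstruction of the Task C.5 programs, whose internals are not given here:

```python
from math import comb

def detection_probability(population, defects, sample):
    """Probability that a random sample of `sample` items drawn from
    `population` items containing `defects` falsified items catches at
    least one of them (hypergeometric attributes-sampling model)."""
    if defects == 0:
        return 0.0
    return 1.0 - comb(population - defects, sample) / comb(population, sample)

# More inspection effort (a larger sample) raises detection probability.
p_small = detection_probability(population=100, defects=5, sample=10)
p_large = detection_probability(population=100, defects=5, sample=40)
```

For 5 diverted items among 100, raising the sample from 10 to 40 lifts the detection probability from roughly 0.42 to roughly 0.93; the numbers illustrate the model only, not the report's specific facilities.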

  8. Process Versus Task in Social Planning

    ERIC Educational Resources Information Center

    Gilbert, Neil; Specht, Harry

    1977-01-01

    For several decades, the relative importance of process as opposed to task has been an issue in the literature. This study of the Model Cities program examines the relationship between program outcomes and the process and task orientations of program planners. (Author)

  9. Features and characterization needs of rubber composite structures

    NASA Technical Reports Server (NTRS)

    Tabaddor, Farhad

    1989-01-01

    Some of the major unique features of rubber composite structures are outlined. The features covered are those related to the material properties, but analytical features are also briefly discussed. It is essential to recognize these features at the planning stage of any long-range analytical, experimental, or application program. The development of a general and comprehensive program that fully accounts for all the important characteristics of tires, under all the relevant modes of operation, may remain a prohibitively expensive and impractical task for the near future. There is therefore a need to develop application methodologies that can use less general models, beyond their theoretical limitations and yet with reasonable reliability, through a proper mix of analytical, experimental, and testing activities.

  10. Earth resources data analysis program, phase 2

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The efforts and findings of the Earth Resources Data Analysis Program are summarized. Results of a detailed study of the needs of EOD with respect to an applications development system (ADS) for the analysis of remotely sensed data, including an evaluation of four existing systems with respect to these needs, are described. Recommendations as to possible courses for EOD to follow to obtain a viable ADS are presented. Algorithmic development, comprising several subtasks, is discussed. These subtasks include the following: (1) two algorithms for multivariate density estimation; (2) a data smoothing algorithm; (3) a method for optimally estimating prior probabilities of unclassified data; and (4) further applications of the modified Cholesky decomposition in various calculations. Little effort was expended on task 3; however, two reports were reviewed.

  11. THREAD: A programming environment for interactive planning-level robotics applications

    NASA Technical Reports Server (NTRS)

    Beahan, John J., Jr.

    1989-01-01

    The THREAD programming language is discussed. It was developed to meet the needs of researchers developing robotics applications that perform tasks such as grasp and trajectory design, sensor data analysis, and interfacing with external subsystems for servo-level control of manipulators and real-time sensing. The philosophy behind THREAD, the issues that entered into its design, and the features of the language are discussed from the viewpoint of researchers who want to develop algorithms in a simulation environment and of those who want to implement physical robotics systems. The detailed functions of the many complex robotics algorithms and tools that are part of the language are not explained, but an overall impression of their capability is given.

  12. Teaching and learning curriculum programs: recommendations for postgraduate pharmacy experiences in education.

    PubMed

    Wright, Eric A; Brown, Bonnie; Gettig, Jacob; Martello, Jay L; McClendon, Katie S; Smith, Kelly M; Teeters, Janet; Ulbrich, Timothy R; Wegrzyn, Nicole; Bradley-Baker, Lynette R

    2014-08-01

    Recommendations for the development and support of teaching and learning curriculum (TLC) experiences within postgraduate pharmacy training programs are discussed. Recent attention has turned toward meeting teaching- and learning-related educational outcomes through a programmatic process during the first or second year of postgraduate education. These programs are usually coordinated by schools and colleges of pharmacy and often referred to as "teaching certificate programs," though no national standards or regulation of these programs currently exists. In an effort to describe the landscape of these programs and to develop a framework for their basic design and content, the American Association of Colleges of Pharmacy Pharmacy Practice Section's Task Force on Student Engagement and Involvement, with input from the American Society of Health-System Pharmacists, reviewed evidence from the literature and conference proceedings and considered author experience and expertise over a two-year period. The members of the task force created and reached consensus on a policy statement and 12 recommendations to guide the development of best practices of TLC programs. The recommendations address topics such as the value of TLC programs, program content, teaching and learning experiences, feedback for participants, the development of a teaching portfolio, the provision of adequate resources for TLC programs, programmatic assessment and improvement, program transparency, and accreditation. TLC programs provide postgraduate participants with valuable knowledge and skills in teaching applicable to the practitioner and academician. Postgraduate programs should be transparent to candidates and seek to ensure the best experiences for participants through systematic program implementation and assessments. Copyright © 2014 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  13. Object Oriented Programming Systems (OOPS) and frame representations: An investigation of programming paradigms

    NASA Technical Reports Server (NTRS)

    Auty, David

    1988-01-01

    The project was initiated to research Object Oriented Programming Systems (OOPS) and frame representation systems, their significance and applicability, and their implementation in or relationship to Ada. "Object oriented" is currently a very popular conceptual adjective. Object oriented programming, in particular, is promoted as a particularly productive approach to programming: an approach that maximizes opportunities for code reuse and lends itself to the definition of convenient and well-developed units. Such units are thus expected to be usable in a variety of situations, beyond the typical highly specific unit development of other approaches. Frame representation systems share a common heritage and similar conceptual foundations; together they represent a quickly emerging alternative approach to programming. The approach taken was first to define the terms, starting with relevant concepts and using these to put bounds on what is meant by OOPS and frames. From this, the possibility of merging OOPS with Ada was pursued, further elucidating the significant characteristics that make up this programming approach. Finally, some of the merits and demerits of OOPS were briefly considered as a way of addressing the applicability of OOPS to various programming tasks.

  14. An intelligent CNC machine control system architecture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, D.J.; Loucks, C.S.

    1996-10-01

    Intelligent, agile manufacturing relies on automated programming of digitally controlled processes. Currently, processes such as Computer Numerically Controlled (CNC) machining are difficult to automate because of highly restrictive controllers and poor software environments. It is also difficult to utilize sensors and process models for adaptive control, or to integrate machining processes with other tasks within a factory floor setting. As part of a Laboratory Directed Research and Development (LDRD) program, a CNC machine control system architecture based on object-oriented design and graphical programming has been developed to address some of these problems and to demonstrate automated agile machining applications using platform-independent software.

  15. Channel Access in Erlang

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nicklaus, Dennis J.

    2013-10-13

    We have developed an Erlang language implementation of the Channel Access protocol. Included are low-level functions for encoding and decoding Channel Access protocol network packets as well as higher level functions for monitoring or setting EPICS process variables. This provides access to EPICS process variables for the Fermilab Acnet control system via our Erlang-based front-end architecture without having to interface to C/C++ programs and libraries. Erlang is a functional programming language originally developed for real-time telecommunications applications. Its network programming features and list management functions make it particularly well-suited for the task of managing multiple Channel Access circuits and PV monitors.
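    The low-level packet functions mentioned above work with the Channel Access wire format, whose messages begin with a fixed 16-byte header: four 16-bit fields (command, payload size, data type, data count) followed by two 32-bit parameters, all big-endian. A minimal Python analogue of that header handling (the Fermilab implementation itself is Erlang, using its binary bit syntax) might look like:

```python
import struct

# Channel Access messages start with a 16-byte header:
# four unsigned 16-bit fields followed by two unsigned 32-bit
# parameters, in network (big-endian) byte order.
CA_HEADER = struct.Struct(">HHHHII")

def encode_header(command, payload_size, data_type, data_count, p1, p2):
    """Pack the six CA header fields into 16 bytes."""
    return CA_HEADER.pack(command, payload_size, data_type, data_count, p1, p2)

def decode_header(packet):
    """Unpack the leading 16-byte CA header from a packet."""
    return CA_HEADER.unpack_from(packet, 0)

msg = encode_header(command=1, payload_size=0, data_type=5,
                    data_count=1, p1=0xDEAD, p2=0xBEEF)
assert len(msg) == 16
assert decode_header(msg) == (1, 0, 5, 1, 0xDEAD, 0xBEEF)
```

The meanings of the two 32-bit parameters vary by command (for example, channel and subscription identifiers), so a full client builds command-specific wrappers over this common header codec.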

  16. Considerations and measurements of latent-heat-storage salts for secondary thermal battery applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koenig, A.A.; Braithwaite, J.W.; Armijo, J.R.

    Given its potential benefits, the practicality of using a latent heat-storage material as the basis for a passive thermal management system is being assessed by Chloride Silent Power Ltd. (CSPL) with technical assistance from Beta Power, Inc. and Sandia National Laboratories (SNL). Based on the experience gained in large-scale solar energy storage programs, fused salts were selected as the primary candidates for the heat-storage material. The initial phase of this assessment was directed to an EV battery being designed at CSPL for the ETX-II program. Specific tasks included the identification and characterization of potential fused salts, a determination of placement options for the salts within the battery, and an assessment of the ultimate benefit to the battery system. The results obtained to date for each of these tasks are presented in this paper.

  17. The Design and Analysis of a Network Interface for the Multi-Lingual Database System.

    DTIC Science & Technology

    1985-12-01

    Only OCR-garbled fragments of this record survive: report-documentation-page fields (funding numbers: program, project, task, work unit), table-of-contents entries (an appendix, "The KMS Program Specifications"; a list of references; and a list of figures, including "Figure 1: The Multi-Lingual Database..."), and the sentence fragment: "...backend Database System (MBDS). In this section, we provide an overview of both the MLDS and the MBDS to enhance the reader's understanding of the..."

  18. Assessing Heat-to-Heat Variations Affecting Mechanism Based Modeling of Hydrogen Environment Cracking (HEAC) in High Strength Alloys for Marine Applications: Monel K-500

    DTIC Science & Technology

    2016-01-28

    Only report-documentation-page fragments of this record survive. Author: John R. Scully. Performing organization: University of Virginia, Office of Sponsored Programs, P.O. Box 400195, Charlottesville, Virginia 22904-4195 (report number 140116-101-GG11530-31340). Sponsoring agency: Office of Naval Research.

  19. Research and technology, Lyndon B. Johnson Space Center

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Johnson Space Center accomplishments in new and advanced concepts during 1984 are highlighted. Included are research funded by the Office of Aeronautics and Space Technology; Advanced Programs tasks funded by the Office of Space Flight; and Solar System Exploration and Life Sciences research funded by the Office of Space Sciences and Applications. Summary sections describing the role of the Johnson Space Center in each program are followed by one page descriptions of significant projects. Descriptions are suitable for external consumption, free of technical jargon, and illustrated to increase ease of comprehension.

  20. Design, processing and testing of LSI arrays, hybrid microelectronics task

    NASA Technical Reports Server (NTRS)

    Himmel, R. P.; Stuhlbarg, S. M.; Ravetti, R. G.; Zulueta, P. J.; Rothrock, C. W.

    1979-01-01

    Mathematical cost models previously developed for hybrid microelectronic subsystems were refined and expanded. Rework terms related to substrate fabrication, nonrecurring developmental and manufacturing operations, and prototype production are included. Sample computer programs were written to demonstrate hybrid microelectronic applications of these cost models. Computer programs were generated to calculate and analyze values for the total microelectronics costs. Large scale integrated (LSI) chips utilizing tape chip carrier technology were studied. The feasibility of interconnecting arrays of LSI chips utilizing tape chip carrier and semiautomatic wire bonding technology was demonstrated.

  1. Consolidated fuel reprocessing program

    NASA Astrophysics Data System (ADS)

    1985-04-01

    A survey of electrochemical methods applications in fuel reprocessing was completed. A dummy fuel assembly shroud was cut using the remotely operated laser disassembly equipment. Operations and engineering efforts have continued to correct equipment operating, software, and procedural problems experienced during the previous uranium campaigns. Fuel cycle options were examined for the liquid metal reactor fuel cycle. In high temperature gas cooled reactor spent fuel studies, preconceptual designs were completed for the concrete storage cask and open field drywell storage concept. These and other tasks operating under the consolidated fuel reprocessing program are examined.

  2. Advanced Turboprop Project

    NASA Technical Reports Server (NTRS)

    Hager, Roy D.; Vrabel, Deborah

    1988-01-01

    At the direction of Congress, a task force headed by NASA was organized in 1975 to identify potential fuel saving concepts for aviation. The result was the Aircraft Energy Efficiency (ACEE) Program implemented in 1976. An important part of the program was the development of advanced turboprop technology for Mach 0.65 to 0.85 applications having the potential fuel saving of 30 to 50 percent relative to existing turbofan engines. A historical perspective is presented of the development and the accomplishments that brought the turboprop to successful flight tests in 1986 and 1987.

  3. Advanced turboprop project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hager, R.D.; Vrabel, D.

    1988-01-01

    At the direction of Congress, a task force headed by NASA was organized in 1975 to identify potential fuel saving concepts for aviation. The result was the Aircraft Energy Efficiency (ACEE) Program implemented in 1976. An important part of the program was the development of advanced turboprop technology for Mach 0.65 to 0.85 applications having the potential fuel saving of 30 to 50 percent relative to existing turbofan engines. A historical perspective is presented of the development and the accomplishments that brought the turboprop to successful flight tests in 1986 and 1987.

  4. Research and technology of the Lyndon B. Johnson Space Center

    NASA Technical Reports Server (NTRS)

    1988-01-01

    Johnson Space Center accomplishments in new and advanced concepts during 1987 are highlighted. Included are research projects funded by the Office of Aeronautics and Space Technology, Solar System Exploration and Life Sciences research funded by the Office of Space Sciences and Applications, and advanced Programs tasks funded by the Office of Space Flight. Summary sections describing the role of the Johnson Space Center in each program are followed by descriptions of significant projects. Descriptions are suitable for external consumption, free of technical jargon, and illustrated to increase ease of comprehension.

  5. Nanotechnology in Aerospace Applications

    DTIC Science & Technology

    2007-03-01

    Only report-documentation-page and abstract fragments of this record survive. The recoverable text mentions aerospace applications of nanotechnology including logic and memory chips, sensors, catalyst supports, adsorption media, and actuators; early nanoelectronics work using CNTs as conductors; inspection performed more cost-effectively, quickly, and efficiently than with present procedures; and composites, wear-resistant tires, improved avionics, and satellites.

  6. A Buyer Behaviour Framework for the Development and Design of Software Agents in E-Commerce.

    ERIC Educational Resources Information Center

    Sproule, Susan; Archer, Norm

    2000-01-01

    Software agents are computer programs that run in the background and perform tasks autonomously as delegated by the user. This paper blends models from marketing research and findings from the field of decision support systems to build a framework for the design of software agents that support buying in e-commerce applications. (Contains 35…

  7. SPIRE Data-Base Management System

    NASA Technical Reports Server (NTRS)

    Fuechsel, C. F.

    1984-01-01

    The Spacelab Payload Integration and Rocket Experiment (SPIRE) data-base management system (DBMS) is based on the relational model of data bases. Its data bases are typically used for engineering and mission analysis tasks and, unlike those of most commercially available systems, store data items and data structures in forms suitable for direct analytical computation. The SPIRE DBMS is designed to support data requests from interactive users as well as applications programs.

  8. Synthesis and Functionalization of Atomic Layer Boron Nitride Nanosheets for Advanced Material Applications

    DTIC Science & Technology

    2014-06-05


  9. The Baltimore applications project: A new look at technology transfer

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The history of cooperation between Goddard Space Flight Center and Baltimore City administrators in solving urban problems is summarized. NASA provided consultation and advisory services as well as technology resources and demonstrations. Research and development programs for 69 tasks are briefly described. Technology utilization for incinerator energy, data collection, Health Department problems, and solarization experiments are presented as case histories.

  10. Scalable software architectures for decision support.

    PubMed

    Musen, M A

    1999-12-01

    Interest in decision-support programs for clinical medicine soared in the 1970s. Since that time, workers in medical informatics have been particularly attracted to rule-based systems as a means of providing clinical decision support. Although developers have built many successful applications using production rules, they also have discovered that creation and maintenance of large rule bases is quite problematic. In the 1980s, several groups of investigators began to explore alternative programming abstractions that can be used to build decision-support systems. As a result, the notions of "generic tasks" and of reusable problem-solving methods became extremely influential. By the 1990s, academic centers were experimenting with architectures for intelligent systems based on two classes of reusable components: (1) problem-solving methods--domain-independent algorithms for automating stereotypical tasks--and (2) domain ontologies that captured the essential concepts (and relationships among those concepts) in particular application areas. This paper highlights how developers can construct large, maintainable decision-support systems using these kinds of building blocks. The creation of domain ontologies and problem-solving methods is the fundamental end product of basic research in medical informatics. Consequently, these concepts need more attention by our scientific community.
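    The two reusable component classes described here can be made concrete with a toy sketch: a domain-independent classification method paired with a small domain ontology. All names below are hypothetical, invented purely for illustration; they come from no cited system:

```python
# A reusable "problem-solving method": classify observed findings against
# whatever domain ontology it is given. The method itself knows nothing
# about the domain.
def heuristic_classify(findings, ontology):
    """Return every category whose defining features are all present
    among the observed findings."""
    return [category for category, features in ontology.items()
            if features <= findings]

# A toy domain ontology: categories mapped to their defining features.
fault_ontology = {
    "overheating": {"high_temp", "fan_off"},
    "power_loss":  {"no_output", "breaker_open"},
}

print(heuristic_classify({"high_temp", "fan_off", "noise"}, fault_ontology))
# → ['overheating']
```

Swapping in a different ontology (say, of diseases and symptoms) reuses the same method unchanged, which is the maintainability argument the paper makes for building decision-support systems from these two kinds of components.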

  11. Application Characterization at Scale: Lessons learned from developing a distributed Open Community Runtime system for High Performance Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Landwehr, Joshua B.; Suetterlein, Joshua D.; Marquez, Andres

    2016-05-16

    Since 2012, the U.S. Department of Energy's X-Stack program has been developing solutions including runtime systems, programming models, languages, compilers, and tools for Exascale system software to address crucial performance and power requirements. Fine-grain programming models and runtime systems show great potential to efficiently utilize the underlying hardware, so they are essential to many X-Stack efforts. An abundance of small tasks can better utilize the vast parallelism available on current and future machines. Moreover, finer tasks can recover faster and adapt better, due to a decrease in state and control. Nevertheless, current applications have been written to exploit old paradigms (such as Communicating Sequential Processes and Bulk Synchronous Parallel processing). To fully realize the advantages of these new systems, applications need to be adapted to the new paradigms. As part of the application porting process, in-depth characterization studies, focused on both application characteristics and runtime features, need to take place to fully understand application performance bottlenecks and how to resolve them. This paper presents a characterization study for a novel high performance runtime system, the Open Community Runtime, using key HPC kernels as its vehicle. The study makes the following contributions: one of the first high performance, fine grain, distributed memory runtime systems implementing the OCR standard (version 0.99a), and a characterization of key HPC kernels in terms of runtime primitives running in both intra- and inter-node environments. Running on a general purpose cluster, we found up to a 1635x relative speed-up for a parallel tiled Cholesky kernel on 128 nodes with 16 cores each and a 1864x relative speed-up for a parallel tiled Smith-Waterman kernel on 128 nodes with 30 cores.
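
    Given the node and core counts reported, those relative speed-ups imply roughly 80% and 49% parallel efficiency. A quick sketch of that arithmetic (it assumes "relative speed-up" is measured against a single-core baseline, which the abstract does not state):

```python
def parallel_efficiency(speedup, nodes, cores_per_node):
    """Efficiency = speedup / total cores (assumes a single-core baseline)."""
    total_cores = nodes * cores_per_node
    return speedup / total_cores

# Figures reported for the OCR runtime study:
cholesky = parallel_efficiency(1635, 128, 16)        # 2048 cores
smith_waterman = parallel_efficiency(1864, 128, 30)  # 3840 cores
print(f"Cholesky: {cholesky:.2f}, Smith-Waterman: {smith_waterman:.2f}")
# prints "Cholesky: 0.80, Smith-Waterman: 0.49"
```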

  12. Application and systems software in Ada: Development experiences

    NASA Technical Reports Server (NTRS)

    Kuschill, Jim

    1986-01-01

    In its most basic sense, software development involves describing the tasks to be solved, including the given objects and the operations to be performed on those objects. Unfortunately, the way people describe objects and operations usually bears little resemblance to source code in most contemporary computer languages. There are two ways around this problem. One is to allow users to describe what they want the computer to do in everyday, typically imprecise English. The PRODOC methodology and software development environment is based on a second, more flexible, and possibly easier-to-use approach. Rather than hiding program structure, PRODOC represents such structure graphically using visual programming techniques. In addition, the program terminology used in PRODOC may be customized to match the way human experts in any given application area naturally describe the relevant data and operations. The PRODOC methodology is described in detail.

  13. TES: A modular systems approach to expert system development for real-time space applications

    NASA Technical Reports Server (NTRS)

    Cacace, Ralph; England, Brenda

    1988-01-01

    A major goal of the Space Station era is to reduce reliance on support from ground based experts. The development of software programs using expert systems technology is one means of reaching this goal without requiring crew members to become intimately familiar with the many complex spacecraft subsystems. Development of an expert systems program requires a validation of the software with actual flight hardware. By combining accurate hardware and software modelling techniques with a modular systems approach to expert systems development, the validation of these software programs can be successfully completed with minimum risk and effort. The TIMES Expert System (TES) is an application that monitors and evaluates real time data to perform fault detection and fault isolation tasks as they would otherwise be carried out by a knowledgeable designer. The development process and primary features of TES, a modular systems approach, and the lessons learned are discussed.

  14. Decision blocks: A tool for automating decision making in CLIPS

    NASA Technical Reports Server (NTRS)

    Eick, Christoph F.; Mehta, Nikhil N.

    1991-01-01

    The human capability of making complex decisions is one of the most fascinating facets of human intelligence, especially if vague, judgemental, default, or uncertain knowledge is involved. Unfortunately, most existing rule-based forward chaining languages are not very suitable for simulating this aspect of human intelligence, because they lack support for the approximate reasoning techniques needed for this task and offer no specific constructs to facilitate the coding of frequently recurring decision blocks. To provide better support for the design and implementation of rule-based decision support systems, a language called BIRBAL, defined on top of CLIPS, is introduced for the specification of decision blocks. Empirical experiments comparing the length of a CLIPS program with that of the corresponding BIRBAL program for three different applications are surveyed. The results of these experiments suggest that for decision-making intensive applications, a CLIPS program tends to be about three times longer than the corresponding BIRBAL program.
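
    The rule-based forward chaining that CLIPS provides, and that BIRBAL builds on, can be sketched in a few lines. This is a generic illustration only, not BIRBAL or CLIPS syntax, and the rules below are made up:

```python
# Minimal forward-chaining sketch: a rule fires when all of its
# conditions are present in working memory, asserting its conclusion;
# iteration continues until no rule can add a new fact (quiescence).
def forward_chain(facts, rules):
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conclusion not in facts and all(c in facts for c in conditions):
                facts.add(conclusion)
                changed = True
    return facts

# Hypothetical decision rules (illustrative, not from the paper):
rules = [
    ({"credit-good", "income-stable"}, "low-risk"),
    ({"low-risk"}, "approve-loan"),
]
print(sorted(forward_chain({"credit-good", "income-stable"}, rules)))
# prints "['approve-loan', 'credit-good', 'income-stable', 'low-risk']"
```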

  15. An application of multiattribute decision analysis to the Space Station Freedom program. Case study: Automation and robotics technology evaluation

    NASA Technical Reports Server (NTRS)

    Smith, Jeffrey H.; Levin, Richard R.; Carpenter, Elisabeth J.

    1990-01-01

    The results are described of an application of multiattribute analysis to the evaluation of high-leverage prototyping technologies in the automation and robotics (A&R) areas that might contribute to the Space Station (SS) Freedom baseline design. An implication is that high-leverage prototyping is beneficial to the SS Freedom Program as a means for transferring technology from the advanced development program to the baseline program. The process also highlights the tradeoffs to be made between subsidizing high-value, low-risk technology developments versus high-value, high-risk technology developments. Twenty-one A&R technology tasks spanning a diverse array of technical concepts were evaluated using multiattribute decision analysis. Because of large uncertainties associated with characterizing the technologies, the methodology was modified to incorporate uncertainty. Eight attributes affected the rankings: initial cost, operations cost, crew productivity, safety, resource requirements, growth potential, and spinoff potential. The four attributes of initial cost, operations cost, crew productivity, and safety affected the rankings the most.

  16. Technology Reinvestment Project Manufacturing Education and Training. Volume 1

    NASA Technical Reports Server (NTRS)

    Schroer, Bernard J.; Bond, Arthur J.

    1997-01-01

    The manufacturing education program is a joint program between the University of Alabama in Huntsville's (UAH) College of Engineering and Alabama A&M University's (AAMU) School of Engineering and Technology. The objective of the program is to provide more hands-on experiences to undergraduate engineering and engineering technology students. The scope of work consisted of the following. Year 1, Task 1: Review courses at Alabama Industrial Development Training (AIDT); Task 2: Review courses at UAH and AAMU; Task 3: Develop new lab manuals; Task 4: Field test manuals; Task 5: Prepare annual report. Year 2, Task 1: Incorporate feedback into lab manuals; Task 2: Introduce lab manuals into classes; Task 3: Field test manuals; Task 4: Prepare annual report. Year 3, Task 1: Incorporate feedback into lab manuals; Task 2: Introduce lab manuals into remaining classes; Task 3: Conduct evaluation with assistance of industry; Task 4: Prepare final report. This report only summarizes the activities of the University of Alabama in Huntsville. The activities of Alabama A&M University are contained in a separate report.

  17. An efficient liner cooling scheme for advanced small gas turbine combustors

    NASA Technical Reports Server (NTRS)

    Paskin, Marc D.; Mongia, Hukam C.; Acosta, Waldo A.

    1993-01-01

    A joint Army/NASA program was conducted to design, fabricate, and test an advanced, small gas turbine, reverse-flow combustor utilizing a compliant metal/ceramic (CMC) wall cooling concept. The objectives of this effort were to develop a design method (basic design data base and analysis) for the CMC cooling technique and then demonstrate its application to an advanced cycle, small, reverse-flow combustor with 3000 F burner outlet temperature. The CMC concept offers significant improvements in wall cooling effectiveness resulting in a large reduction in cooling air requirements. Therefore, more air is available for control of burner outlet temperature pattern in addition to the benefits of improved efficiency, reduced emissions, and lower smoke levels. The program was divided into four tasks. Task 1 defined component materials and localized design of the composite wall structure in conjunction with development of basic design models for the analysis of flow and heat transfer through the wall. Task 2 included implementation of the selected materials and validated design models during combustor preliminary design. Detail design of the selected combustor concept and its refinement with 3D aerothermal analysis were completed in Task 3. Task 4 covered detail drawings, process development and fabrication, and a series of burner rig tests. The purpose of this paper is to provide details of the investigation into the fundamental flow and heat transfer characteristics of the CMC wall structure as well as implementation of the fundamental analysis method for full-scale combustor design.

  18. Distributed semantic networks and CLIPS

    NASA Technical Reports Server (NTRS)

    Snyder, James; Rodriguez, Tony

    1991-01-01

    Semantic networks of frames are commonly used as a method of reasoning in many problems. In most of these applications the semantic network exists as a single entity in a single process environment. Advances in workstation hardware provide support for more sophisticated applications involving multiple processes, interacting in a distributed environment. In these applications the semantic network may well be distributed over several concurrently executing tasks. This paper describes the design and implementation of a frame based, distributed semantic network in which frames are accessed both through C Language Integrated Production System (CLIPS) expert systems and procedural C++ language programs. The application area is a knowledge based, cooperative decision making model utilizing both rule based and procedural experts.
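
    The frame abstraction described above can be sketched compactly. This is a generic illustration of frames (named slots plus inheritance), not the authors' CLIPS/C++ implementation, and the example frames are hypothetical:

```python
class Frame:
    """A minimal frame: named slots plus inheritance from a parent frame."""
    def __init__(self, name, parent=None, **slots):
        self.name, self.parent, self.slots = name, parent, slots

    def get(self, slot):
        # Look up the slot locally, then follow the parent chain.
        if slot in self.slots:
            return self.slots[slot]
        if self.parent is not None:
            return self.parent.get(slot)
        raise KeyError(slot)

vehicle = Frame("vehicle", wheels=4, powered=True)
truck = Frame("truck", parent=vehicle, payload_kg=5000)
print(truck.get("wheels"))  # inherited from the vehicle frame; prints "4"
```

    In a distributed setting like the one described, each process would hold some frames locally and resolve `get` calls on remote frames through inter-process messages.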

  19. Use of EPANET solver to manage water distribution in Smart City

    NASA Astrophysics Data System (ADS)

    Antonowicz, A.; Brodziak, R.; Bylka, J.; Mazurkiewicz, J.; Wojtecki, S.; Zakrzewski, P.

    2018-02-01

    The paper presents a method of using the EPANET solver to support management of a water distribution system in a Smart City. The main task is to develop an application that allows remote access to the simulation model of the water distribution network developed in the EPANET environment. The application can perform both single and cyclic simulations with a specified step for changing the values of selected process variables. The paper presents the architecture of the application. The application supports the selection of the best device control algorithm using optimization methods; the available procedures are brute force, SLSQP (Sequential Least SQuares Programming), and the modified Powell method. The article is supplemented by an example of using the developed computer tool.
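
    Of the three optimization methods named, brute force is the simplest to sketch. The cost function below is purely illustrative (it is not the paper's hydraulic model), standing in for a trade-off between pumping energy and a pressure-deficit penalty:

```python
# Brute-force search: evaluate a cost function at every candidate pump
# speed on a uniform grid and keep the best candidate.
def brute_force(cost, lo, hi, steps=1000):
    best_x, best_c = lo, cost(lo)
    for i in range(1, steps + 1):
        x = lo + (hi - lo) * i / steps
        c = cost(x)
        if c < best_c:
            best_x, best_c = x, c
    return best_x, best_c

# Hypothetical cost: energy grows with speed, a pressure-deficit penalty
# shrinks with it; the true minimum lies at speed = 10 ** (1/3) ~ 2.154.
speed, best_cost = brute_force(lambda s: 0.5 * s**2 + 10.0 / s, 0.5, 5.0)
print(round(speed, 3))
```

    Gradient-based methods such as SLSQP reach the same optimum in far fewer cost evaluations, which matters when each evaluation is a full EPANET simulation.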

  20. Teaching program for the Unified Dyskinesia Rating Scale.

    PubMed

    Goetz, Christopher G; Nutt, John G; Stebbins, Glenn T; Chmura, Teresa A

    2009-07-15

    The Unified Dyskinesia Rating Scale (UDysRS) has been introduced as a comprehensive rating tool for the evaluation of dyskinesias in Parkinson's disease (PD). To enhance a uniform application, we developed a DVD-based training program with instructions, patient examples, and a certification exercise. For training on the objective assessment of dyskinesia, seventy PD patients spanning the gamut of dyskinesias (none to severe) were videotaped during four tasks of daily living (speaking, drinking from a cup, putting on a coat, and walking). Dyskinesia severity in seven body parts was rated by 20 international movement disorder specialists using the UDysRS for impairment. Each task was also rated for disability. Inter-rater reliability was assessed with generalized weighted kappa and intraclass correlation coefficients. For the teaching program, examples of each severity level and each body part were selected based on the criterion that they received a uniform rating (+/- 1 point) by at least 75% of the raters. For the certification exercise, four cases were selected to represent the four quartiles of overall objective UDysRS scores to reflect slight, mild, moderate, and severe dyskinesia. Each selection was based on the highest inter-rater reliability score for that quartile (minimum kappa or intraclass correlation coefficient = 0.6). UDysRS ranges for certification were calculated based on the 95% confidence interval. The teaching program lasts 41 min, and the certification exercise requires 10 min (total 51 min). This training program, based on visual examples of dyskinesia and anchored in scores generated by movement disorder experts, is aimed at increasing homogeneity of ratings among and within raters and centers. Large-scale multicenter randomized clinical trials of dyskinesia treatment are strengthened by a uniform standard of scale application. 2009 Movement Disorder Society.
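
    Weighted kappa, the reliability statistic named above, credits near-misses on an ordinal scale instead of counting only exact agreement. A minimal two-rater, linear-weight sketch (the study used generalized weighted kappa across 20 raters, and these ratings are invented):

```python
# Linear weighted kappa for two raters on an ordinal 0..k scale.
def weighted_kappa(r1, r2, k):
    n = len(r1)
    weight = lambda a, b: abs(a - b) / k       # linear disagreement weight
    observed = sum(weight(a, b) for a, b in zip(r1, r2)) / n
    # Expected disagreement if the raters were independent (marginals)
    expected = sum(
        weight(a, b) * r1.count(a) * r2.count(b)
        for a in range(k + 1) for b in range(k + 1)
    ) / n ** 2
    return 1 - observed / expected

ratings_a = [0, 1, 2, 3, 3, 2, 1, 0]
ratings_b = [0, 1, 2, 3, 2, 2, 1, 1]
print(round(weighted_kappa(ratings_a, ratings_b, k=3), 2))  # prints "0.78"
```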

  1. Using LOTOS for Formalizing Wireless Sensor Network Applications

    PubMed Central

    Rosa, Nelson Souto; Cunha, Paulo Roberto Freire

    2007-01-01

    The number of wireless sensor network (WSN) applications is rapidly increasing and becoming an integral part of sensor nodes. These applications have been widely developed on the TinyOS operating system using the nesC programming language. However, due to the tight integration with the physical world, limited node power and resources (CPU and memory), and the complexity of combining components into an application, building such applications is not a trivial task. In this context, we present an approach for dealing with this complexity by adopting a formal description technique, namely LOTOS, for formalising the behaviour of WSN applications. The formalisation has three main benefits: better understanding of how the application actually works, checking of desired properties of the application's behaviour, and simulation facilities. In order to illustrate the proposed approach, we apply it to two traditional nesC applications, namely Blink and Sense.

  2. Gesture-Controlled Interface for Contactless Control of Various Computer Programs with a Hooking-Based Keyboard and Mouse-Mapping Technique in the Operating Room

    PubMed Central

    Park, Ben Joonyeon; Jang, Taekjin; Choi, Jong Woo; Kim, Namkug

    2016-01-01

    We developed a contactless interface that exploits hand gestures to effectively control medical images in the operating room. We developed an in-house program called GestureHook that exploits message hooking techniques to convert gestures into specific functions. For quantitative evaluation of this program, we used gestures to control images of a dynamic biliary CT study and compared the results with those of a mouse (8.54 ± 1.77 s to 5.29 ± 1.00 s; p < 0.001) and measured the recognition rates of specific gestures and the success rates of tasks based on clinical scenarios. For clinical applications, this program was set up in the operating room to browse images for plastic surgery. A surgeon browsed images from three different programs: CT images from a PACS program, volume-rendered images from a 3D PACS program, and surgical planning photographs from a basic image viewing program. All programs could be seamlessly controlled by gestures and motions. This approach can control all operating room programs without source code modification and provide surgeons with a new way to safely browse through images and easily switch applications during surgical procedures. PMID:26981146

  3. Gesture-Controlled Interface for Contactless Control of Various Computer Programs with a Hooking-Based Keyboard and Mouse-Mapping Technique in the Operating Room.

    PubMed

    Park, Ben Joonyeon; Jang, Taekjin; Choi, Jong Woo; Kim, Namkug

    2016-01-01

    We developed a contactless interface that exploits hand gestures to effectively control medical images in the operating room. We developed an in-house program called GestureHook that exploits message hooking techniques to convert gestures into specific functions. For quantitative evaluation of this program, we used gestures to control images of a dynamic biliary CT study and compared the results with those of a mouse (8.54 ± 1.77 s to 5.29 ± 1.00 s; p < 0.001) and measured the recognition rates of specific gestures and the success rates of tasks based on clinical scenarios. For clinical applications, this program was set up in the operating room to browse images for plastic surgery. A surgeon browsed images from three different programs: CT images from a PACS program, volume-rendered images from a 3D PACS program, and surgical planning photographs from a basic image viewing program. All programs could be seamlessly controlled by gestures and motions. This approach can control all operating room programs without source code modification and provide surgeons with a new way to safely browse through images and easily switch applications during surgical procedures.
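
    The keyboard-mapping idea at the core of the two records above can be sketched as a dispatch table from recognized gestures to the key commands a target viewer already understands. All gesture and key names here are hypothetical, not taken from GestureHook:

```python
# Sketch of gesture-to-keystroke mapping: recognized gesture names are
# translated into synthetic key commands for the active program, so the
# program itself needs no source code modification.
GESTURE_KEYMAP = {
    "swipe_left": "PAGE_UP",      # previous image
    "swipe_right": "PAGE_DOWN",   # next image
    "spread": "PLUS",             # zoom in
    "pinch": "MINUS",             # zoom out
}

def dispatch(gesture, send_key):
    """Convert a gesture into a keystroke via an injection callback."""
    key = GESTURE_KEYMAP.get(gesture)
    if key is not None:
        send_key(key)
    return key

sent = []
dispatch("swipe_right", sent.append)
print(sent)  # prints "['PAGE_DOWN']"
```

    In a real hooking implementation, `send_key` would inject an OS-level keyboard event into the foreground application; per-program keymaps allow the same gestures to drive different viewers.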

  4. Meta-tools for software development and knowledge acquisition

    NASA Technical Reports Server (NTRS)

    Eriksson, Henrik; Musen, Mark A.

    1992-01-01

    The effectiveness of tools that provide support for software development is highly dependent on the match between the tools and their task. Knowledge-acquisition (KA) tools constitute a class of development tools targeted at knowledge-based systems. Generally, KA tools that are custom-tailored for particular application domains are more effective than general KA tools that cover a large class of domains. The high cost of custom-tailoring KA tools manually has encouraged researchers to develop meta-tools for KA tools. Current research issues in meta-tools for knowledge acquisition are the specification styles, or meta-views, used for target KA tools, and the relationships between the specification entered in the meta-tool and other specifications for the target program under development. We examine different types of meta-views and meta-tools. Our current project is to provide meta-tools that produce KA tools from multiple specification sources, for instance, from a task analysis of the target application.

  5. ROSCOE Manual, Volume 14A-1 - Ambient Atmosphere (Major and Minor Neutral Species and Ionosphere).

    DTIC Science & Technology

    1979-06-30

    Only fragments of the report documentation page survive the extraction: the performing organization (Science Applications, Inc., P.O. Box 2351, La Jolla, California 92038); the report date and period (30 June 1979; final report for the period 1 January 1976 - 30 June 1979); the subtask number (S99QAXHCO62-37); and a parameter-table fragment (EDDSCH, 100-300 km: a parabola determined by data-point values EBOTD and EF2MXD at altitudes HEBOTD and HF2MXD, with vertical slope at altitude HF2MXD).

  6. Artificial Intelligence Applications to Maintenance Technology Working Group Report (IDA/OSD R&M (Institute for Defense Analyses/Office of the Secretary of Defense Reliability and Maintainability) Study).

    DTIC Science & Technology

    1983-08-01

    Only fragments of the report documentation page survive the extraction: the sponsoring offices (Under Secretary of Defense for Research and Engineering, and the Office of the Assistant Secretary of Defense (Manpower, Reserve Affairs and Logistics)); the performing organization (Institute for Defense Analyses); the report type and period (final report, July 1982 - August 1983); the title (Artificial Intelligence Applications to Maintenance Technology); and a note that the task order was structured to address the improvement of R&M and readiness through innovative program structuring (short title: R&M Study).

  7. (Lack of) Corticospinal facilitation in association with hand laterality judgments.

    PubMed

    Ferron, Lucas; Tremblay, François

    2017-07-01

    In recent years, mental practice strategies have drawn much interest in the field of rehabilitation. One form of mental practice particularly advocated involves judging the laterality of images depicting body parts. Such laterality judgments are thought to rely on implicit motor imagery via mental rotation of one's own limb. In this study, we sought to further characterize the involvement of the primary motor cortex (M1) in hand laterality judgments (HLJ) as performed in the context of an application designed for rehabilitation. To this end, we measured variations in corticospinal excitability in both hemispheres with motor evoked potentials (MEPs) while participants (n = 18, young adults) performed either HLJ or a mental counting task. A third condition (foot observation) provided additional control. We hypothesized that HLJ would lead to a selective MEP facilitation when compared to the other tasks and that this facilitation would be greater in the right hemisphere than in the left. Contrary to our predictions, we found no evidence of task or hemispheric effects for the HLJ task. Significant task-related MEP facilitation was detected only for the mental counting task. A secondary experiment performed in a subset of participants (n = 6) to further test modulation during HLJ yielded the same results. We interpret the lack of facilitation with HLJ in light of evidence that participants may rely on alternative strategies when asked to judge laterality while viewing depictions of body parts. The use of visual strategies notably would reduce the need to engage in mental rotation, thus reducing M1 involvement. These results have implications for applications of laterality tasks in the context of rehabilitation programs.

  8. Applications Explorer Missions (AEM): Mission planners handbook

    NASA Technical Reports Server (NTRS)

    Smith, S. R. (Editor)

    1974-01-01

    The Applications Explorer Missions (AEM) Program is a planned series of space applications missions whose purpose is to perform various tasks that require a low-cost, quick-reaction, small spacecraft in a dedicated orbit. The Heat Capacity Mapping Mission (HCMM) is the first mission of this series. The spacecraft described in this document was conceived to support a variety of applications instruments and the HCMM instrument in particular. The maximum use of commonality has been achieved: all of the subsystems employed are taken directly or modified from other programs such as IUE, IMP, RAE, and Nimbus. The result is a small, versatile spacecraft. The purpose of this document, the AEM Mission Planners Handbook (AEM/MPH), is to describe the spacecraft and its capabilities in general and the HCMM in particular. The document will also serve as a guide for potential users as to the capabilities of the AEM spacecraft and its achievable orbits, and should enable each potential user to determine the suitability of the AEM concept to their mission.

  9. Microgravity sciences application visiting scientist program

    NASA Technical Reports Server (NTRS)

    Glicksman, Martin; Vanalstine, James

    1995-01-01

    Marshall Space Flight Center pursues scientific research in the area of low-gravity effects on materials and processes. To support these Government-performed research responsibilities, a number of supplementary research tasks were accomplished by a group of specialized visiting scientists. They participated in work on contemporary research problems with specific objectives related to current or future space flight experiments, and they defined and established independent programs of research based on scientific peer review and the relevance of the defined research to NASA microgravity goals, thereby implementing a portion of the national program. The programs included research in the following areas: protein crystal growth, X-ray crystallography and computer analysis of protein crystal structure, optimization and analysis of protein crystal growth techniques, and design and testing of flight hardware.

  10. A review of the silicon material task

    NASA Technical Reports Server (NTRS)

    Lutwack, R.

    1984-01-01

    The Silicon Material Task of the Flat-Plate Solar Array Project was assigned the objective of developing the technology for low-cost processes for producing polysilicon suitable for terrestrial solar-cell applications. The Task program comprised sections for process developments for semiconductor-grade and solar-cell-grade products. To provide information for deciding upon process designs, extensive investigations of the effects of impurities on material properties and the performance of cells were conducted. The silane process of the Union Carbide Corporation was carried through several stages of technical and engineering development; a pilot plant was the culmination of this effort. The work to establish silane fluidized-bed technology for a low-cost process is continuing. The advantages of using dichlorosilane in a Siemens-type reactor were shown by Hemlock Semiconductor Corporation. The development of other processes is described.

  11. A review of the silicon material task

    NASA Astrophysics Data System (ADS)

    Lutwack, R.

    1984-02-01

    The Silicon Material Task of the Flat-Plate Solar Array Project was assigned the objective of developing the technology for low-cost processes for producing polysilicon suitable for terrestrial solar-cell applications. The Task program comprised sections for process developments for semiconductor-grade and solar-cell-grade products. To provide information for deciding upon process designs, extensive investigations of the effects of impurities on material properties and the performance of cells were conducted. The silane process of the Union Carbide Corporation was carried through several stages of technical and engineering development; a pilot plant was the culmination of this effort. The work to establish silane fluidized-bed technology for a low-cost process is continuing. The advantages of using dichlorosilane in a Siemens-type reactor were shown by Hemlock Semiconductor Corporation. The development of other processes is described.

  12. Novel Process for Removal and Recovery of Vapor Phase Mercury

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greenwell, Collin; Roberts, Daryl L; Albiston, Jason

    We demonstrated in the Phase I program all key attributes of a new technology for removing mercury from flue gases, namely: (a) removal of greater than 95% of both elemental and oxidized forms of mercury, both in the laboratory and in the field; (b) regenerability of the sorbent; (c) ability to scale up; and (d) favorable economics. The Phase I program consisted of four tasks other than project reporting: Task I-1, Screen Sorbent Configurations in the Laboratory; Task I-2, Design and Fabricate Bench-Scale Equipment; Task I-3, Test Bench-Scale Equipment on Pilot Combustor; and Task I-4, Evaluate Economics Based on Bench-Scale Results. In Task I-1, we demonstrated that the sorbents are thermally durable and are regenerable through at least 55 cycles of mercury uptake and desorption. We also demonstrated two low-pressure-drop configurations of the sorbent, namely, a particulate form and a monolithic form. We showed that the particulate form of the sorbent would take up 100% of the mercury so long as the residence time in a bed of the sorbent exceeded 0.1 seconds. In principle, the particulate form of the sorbent could be embedded in the back side of a higher temperature bag filter in a full-scale application. With typical bag face velocities of four feet per minute, the thickness of the particulate layer would need to be about 2000 microns to accomplish the uptake of the mercury. For heat transfer efficiency, however, we believed the monolithic form of the sorbent would be the more practical in a full-scale application. Therefore, we purchased commercially available metallic monoliths and applied the sorbent to the inside of the flow channels of the monoliths. At the face velocities we tested (up to 1.5 ft/sec), these monoliths had less than 0.05 inches of water pressure drop. We tested the monolithic form of the sorbent through 21 cycles of mercury sorption and desorption in the laboratory and included a test of simultaneous uptake of both mercury and mercuric chloride.
    Overall, in Task I-1, we found that the particulate and monolith forms of the sorbent were thermally stable and durable and would repeatedly sorb and desorb 100% of the mercury, including mercuric chloride, with low pressure drop and short residence times at realistic flue gas conditions.
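
    The quoted 2000-micron layer thickness is consistent with the stated 0.1 s residence time and 4 ft/min face velocity; a quick unit-conversion check (the function name is ours, not from the report):

```python
# Check the quoted sorbent-layer thickness: layer depth = face velocity
# times required residence time, converted from ft/min to microns.
FT_TO_CM = 30.48

def layer_thickness_um(face_velocity_ft_per_min, residence_time_s):
    velocity_cm_s = face_velocity_ft_per_min * FT_TO_CM / 60.0
    return velocity_cm_s * residence_time_s * 1e4  # cm -> microns

print(round(layer_thickness_um(4, 0.1)))  # prints "2032", i.e. ~2000 microns
```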

  13. Microgravity Science and Applications: Program Tasks and Bibliography for Fiscal Year 1996

    NASA Technical Reports Server (NTRS)

    1997-01-01

    NASA's Microgravity Science and Applications Division (MSAD) sponsors a program that expands the use of space as a laboratory for the study of important physical, chemical, and biochemical processes. The primary objective of the program is to broaden the value and capabilities of human presence in space by exploiting the unique characteristics of the space environment for research. However, since flight opportunities are rare and flight research development is expensive, a vigorous ground-based research program, from which only the best experiments evolve, is critical to the continuing strength of the program. The microgravity environment affords unique characteristics that allow the investigation of phenomena and processes that are difficult or impossible to study on Earth. The ability to control gravitational effects such as buoyancy-driven convection, sedimentation, and hydrostatic pressures makes it possible to isolate phenomena and make measurements that have significantly greater accuracy than can be achieved in normal gravity. Space flight gives scientists the opportunity to study the fundamental states of physical matter (solids, liquids, and gases) and the forces that affect those states. Because the orbital environment allows the treatment of gravity as a variable, research in microgravity leads to a greater fundamental understanding of the influence of gravity on the world around us. With appropriate emphasis, the results of space experiments lead to both knowledge and technological advances that have direct applications on Earth. Microgravity research also provides the practical knowledge essential to the development of future space systems. The Office of Life and Microgravity Sciences and Applications (OLMSA) is responsible for planning and executing research stimulated by the Agency's broad scientific goals.
    OLMSA's Microgravity Science and Applications Division (MSAD) is responsible for guiding and focusing a comprehensive program, and currently manages its research and development tasks through five major scientific areas: biotechnology, combustion science, fluid physics, fundamental physics, and materials science. FY 1996 was an important year for MSAD. NASA continued to build a solid research community for the coming space station era. During FY 1996, the NASA Microgravity Research Program continued investigations selected from the 1994 combustion science, fluid physics, and materials science NRAs. MSAD also released a NASA Research Announcement in microgravity biotechnology, with more than 130 proposals received in response. Selection of research for funding is expected in early 1997. The principal investigators chosen from these NRAs will form the core of the MSAD research program at the beginning of the space station era. The third United States Microgravity Payload (USMP-3) and the Life and Microgravity Spacelab (LMS) missions yielded a wealth of microgravity data in FY 1996. The USMP-3 mission included a fluids facility and three solidification furnaces, each designed to examine a different type of crystal growth.

  14. Development of a fiber optic high temperature strain sensor

    NASA Technical Reports Server (NTRS)

    Rausch, E. O.; Murphy, K. E.; Brookshire, S. P.

    1992-01-01

    From 1 Apr. 1991 to 31 Aug. 1992, the Georgia Tech Research Institute conducted a research program to develop a high temperature fiber optic strain sensor as part of a measurement program for the space shuttle booster rocket motor. The major objectives of this program were divided into four tasks. Under Task 1, the literature on high-temperature fiber optic strain sensors was reviewed. Task 2 addressed the design and fabrication of the strain sensor. Tests and calibration were conducted under Task 3, and Task 4 was to generate recommendations for a follow-on study of a distributed strain sensor. Task 4 was submitted to NASA as a separate proposal.

  15. Incorporating Language Structure in a Communicative Task: An Analysis of the Language Component of a Communicative Task in the LINC Home Study Program

    ERIC Educational Resources Information Center

    Lenchuk, Iryna

    2014-01-01

    The purpose of this article is to analyze a task included in the LINC Home Study (LHS) program. LHS is a federally funded distance education program offered to newcomers to Canada who are unable to attend regular LINC classes. A task, in which a language structure (a gerund) is chosen and analyzed, was selected from one instructional module of LHS…

  16. Next Generation Electromagnetic Pump Analysis Tools (PLM DOC-0005-2188). Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stregy, Seth; Dasilva, Ana; Yilmaz, Serkan

    2015-10-29

    This report provides the broad historical review of EM Pump development and details of MATRIX development under this project. This report summarizes the efforts made to modernize the legacy performance models used in previous EM Pump designs and the improvements made to the analysis tools. This report provides information on Tasks 1, 3, and 4 of the entire project. The research for Task 4 builds upon Task 1: Update EM Pump Databank and Task 3: Modernize the Existing EM Pump Analysis Model, which are summarized within this report. Where research for Task 2: Insulation Materials Development and Evaluation identified parameters applicable to the analysis model with Task 4, the analysis code was updated, and analyses were made for additional materials. The important design variables for the manufacture and operation of an EM Pump that the model improvement can evaluate are: space constraints; voltage capability of insulation system; maximum flux density through iron; flow rate and outlet pressure; efficiency and manufacturability. The development of the next-generation EM Pump analysis tools during this two-year program provides information in three broad areas: status of analysis model development; improvements made to older simulations; and comparison to experimental data.

  17. Applications of neural networks to landmark detection in 3-D surface data

    NASA Astrophysics Data System (ADS)

    Arndt, Craig M.

    1992-09-01

    The problem of identifying key landmarks in 3-dimensional surface data is of considerable interest in solving a number of difficult real-world tasks, including object recognition and image processing. The specific problem that we address in this research is to identify the specific landmarks (anatomical) in human surface data. This is a complex task, currently performed visually by an expert human operator. In order to replace these human operators and increase reliability of the data acquisition, we need to develop a computer algorithm which will utilize the interrelations between the 3-dimensional data to identify the landmarks of interest. The current presentation describes a method for designing, implementing, training, and testing a custom architecture neural network which will perform the landmark identification task. We discuss the performance of the net in relationship to human performance on the same task and how this net has been integrated with other AI and traditional programming methods to produce a powerful analysis tool for computer anthropometry.

  18. Methods for design and evaluation of integrated hardware-software systems for concurrent computation

    NASA Technical Reports Server (NTRS)

    Pratt, T. W.

    1985-01-01

    Research activities and publications are briefly summarized. The major tasks reviewed are: (1) VAX implementation of the PISCES parallel programming environment; (2) Apollo workstation network implementation of the PISCES environment; (3) FLEX implementation of the PISCES environment; (4) sparse matrix iterative solver in PISCES Fortran; (5) image processing application of PISCES; and (6) development of a formal model of concurrent computation.

  19. Ultraviolet Communication for Medical Applications

    DTIC Science & Technology

    2014-05-01

    parent company Imaging Systems Technology (IST) demonstrated feasibility of several key concepts that are being developed into a working prototype in the...program using multiple high-end GPUs (NVIDIA Tesla K20). Finally, the Monte Carlo simulation task will be resumed after the Milestone 2 demonstration...is acceptable for automated printing and handling. Next, the option of having our shells electroded by an external company was investigated and DEI

  20. Training and Personnel Systems Technology R and D Program Description FY 93

    DTIC Science & Technology

    1992-07-24

    instructional strategies provide the best training in ICAT applications, and (c) demonstration of microcomputer authoring techniques for rapid development...learning strategies for language training, (b) develop a behavioral taxonomy to evaluate Military Intelligence (MI) performance and to characterize the...training requirements for collective tasks. In FY93, plans are to: (a) develop training strategies for sustaining command and control skills, and (b

  1. Integrating Images, Applications, and Communications Networks. Volume 5.

    DTIC Science & Technology

    1987-12-01


  2. Binary Classification using Decision Tree based Genetic Programming and Its Application to Analysis of Bio-mass Data

    NASA Astrophysics Data System (ADS)

    To, Cuong; Pham, Tuan D.

    2010-01-01

    In machine learning, pattern recognition may be the most popular task. Identifying "similar" patterns is also very important in biology because, first, it is useful for predicting patterns associated with disease, for example cancer tissue (normal or tumor); second, similarity or dissimilarity of kinetic patterns is used to identify coordinately controlled genes or proteins involved in the same regulatory process; and third, similar genes (proteins) share similar functions. In this paper, we present an algorithm that uses genetic programming to create a decision tree for the binary classification problem. The algorithm was applied to five real biological databases. Based on comparisons with well-known methods, the algorithm is outstanding in most cases.
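The abstract gives no implementation details, but the core idea (evolving decision trees with genetic programming) can be sketched in a few dozen lines. Everything below, the toy dataset, the tuple tree encoding, and the truncation-plus-crossover loop, is an assumed illustration, not the authors' actual algorithm:

```python
import random

random.seed(0)  # deterministic toy run

# Toy two-feature dataset; the labeling rule is arbitrary.
points = [(random.random(), random.random()) for _ in range(200)]
DATA = [(p, int(p[0] > 0.5 and p[1] > 0.3)) for p in points]

def random_tree(depth=3):
    """A tree is either a class leaf (0 or 1) or (feature, threshold, left, right)."""
    if depth == 0 or random.random() < 0.3:
        return random.randint(0, 1)
    return (random.randint(0, 1), random.random(),
            random_tree(depth - 1), random_tree(depth - 1))

def classify(tree, x):
    while isinstance(tree, tuple):
        feat, thr, left, right = tree
        tree = left if x[feat] <= thr else right
    return tree

def fitness(tree):
    return sum(classify(tree, x) == y for x, y in DATA) / len(DATA)

def paths(tree, path=()):
    """All node positions in the tree, as index paths from the root."""
    yield path
    if isinstance(tree, tuple):
        yield from paths(tree[2], path + (2,))
        yield from paths(tree[3], path + (3,))

def get(tree, path):
    for i in path:
        tree = tree[i]
    return tree

def put(tree, path, sub):
    if not path:
        return sub
    t = list(tree)
    t[path[0]] = put(tree[path[0]], path[1:], sub)
    return tuple(t)

def crossover(a, b):
    """Replace a random subtree of a with a random subtree of b."""
    return put(a, random.choice(list(paths(a))), get(b, random.choice(list(paths(b)))))

pop = [random_tree() for _ in range(60)]
for _ in range(30):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:20]  # truncation selection
    pop = parents + [crossover(random.choice(parents), random.choice(parents))
                     for _ in range(40)]
best = max(pop, key=fitness)
print("best training accuracy:", fitness(best))
```

A real implementation would also control tree bloat and hold out test data; this sketch only shows how subtree crossover evolves classification trees against a fitness function.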

  3. Continued Development and Improvement of Pneumatic Heavy Vehicles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robert J. Englar

    2005-07-15

    The objective of this applied research effort led by Georgia Tech Research Institute is the application of pneumatic aerodynamic technology previously developed and patented by us to the design of an appropriate Heavy Vehicle (HV) tractor-trailer configuration, and experimental confirmation of this pneumatic configuration's improved aerodynamic characteristics. In Phases I to IV of our previous DOE program (Reference 1), GTRI has developed, patented, wind-tunnel tested and road-tested blown aerodynamic devices for Pneumatic Heavy Vehicles (PHVs) and Pneumatic Sports Utility Vehicles (PSUVs). To further advance these pneumatic technologies towards HV and SUV applications, additional Phase V tasks were included in the first year of a continuing DOE program (Reference 2). Based on the results of the Phase IV full-scale test programs, these Phase V tasks extended the application of pneumatic aerodynamics to include: further economy and performance improvements; increased aerodynamic stability and control; and safety of operation of Pneumatic HVs. Continued development of a Pneumatic SUV was also conducted during the Phase V program. Phase V was completed in July 2003; its positive results towards development and confirmation of this pneumatic technology are reported in References 3 and 4. The current Phase VI of this program was incrementally funded by DOE in order to continue this technology development towards a second fuel economy test on the Pneumatic Heavy Vehicle. The objectives of this current Phase VI research and development effort (Ref. 5) fall into two categories: (1) develop improved pneumatic aerodynamic technology and configurations on smaller-scale models of the advanced Pneumatic Heavy Vehicle (PHV); and based on these findings, (2) redesign, modify, and re-test the modified full-scale PHV test vehicle.
This second objective includes conduct of a preliminary on-road test of this configuration to prepare it for a second series of SAE Type-U fuel economy evaluations, as described in Ref. 5. Both objectives are based on the pneumatic technology already developed and confirmed for DOE OHVT/OAAT in Phases I-V. This new Phase VI effort was initiated by contract amendment to the Phase V effort using carryover FY02 funds. The work was conducted under a new and distinct project number, GTRI Project A-6935, separate from the Phase I-IV program. However, the two programs are closely integrated, and thus Phase VI continues the previous program and its goals.

  4. A Verification Method of Inter-Task Cooperation in Embedded Real-time Systems and its Evaluation

    NASA Astrophysics Data System (ADS)

    Yoshida, Toshio

    In the software development process for embedded real-time systems, the design of the task cooperation process is very important. The cooperation of such tasks is specified by task cooperation patterns. Adoption of unsuitable task cooperation patterns has a fatal influence on system performance, quality, and extendibility. In order to prevent repetitive work caused by a shortage of task cooperation performance, it is necessary to verify task cooperation patterns in an early software development stage. However, this is very difficult at a stage where the task program code is not yet complete. Therefore, we propose a verification method that uses task skeleton program code together with a real-time kernel that records all events during software execution, such as system calls issued by the task code, external interrupts, and timer interrupts. In order to evaluate the proposed verification method, we applied it to the software development process of a mechatronics control system.
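The verification idea, skeleton tasks containing only the cooperation pattern plus a kernel that logs every call, can be illustrated with a toy sketch. The RecordingKernel class and the producer/consumer pattern below are hypothetical stand-ins for a real RTOS kernel and real task cooperation patterns:

```python
from collections import deque

class RecordingKernel:
    """Toy kernel that logs every 'system call' issued by skeleton tasks."""
    def __init__(self):
        self.log = []           # recorded events: (task, call, payload)
        self.mailbox = deque()  # one message queue shared by the tasks

    def send(self, task, msg):
        self.log.append((task, "send", msg))
        self.mailbox.append(msg)

    def receive(self, task):
        msg = self.mailbox.popleft() if self.mailbox else None
        self.log.append((task, "receive", msg))
        return msg

# Skeleton tasks: no application logic, only the cooperation pattern.
def producer(k):
    for i in range(3):
        k.send("producer", i)

def consumer(k):
    return [k.receive("consumer") for _ in range(3)]

kernel = RecordingKernel()
producer(kernel)
received = consumer(kernel)

# Verification over the recorded event trace: every receive must have
# found a previously sent message (no receive-before-send).
recvs = [e for e in kernel.log if e[1] == "receive"]
assert all(m is not None for _, _, m in recvs), "receive before send detected"
print("recorded", len(kernel.log), "events; cooperation pattern OK")
```

The point of the pattern is that the trace, not the (unfinished) application code, is what gets checked, so the cooperation design can be validated before implementation is complete.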

  5. Exploiting on-node heterogeneity for in-situ analytics of climate simulations via a functional partitioning framework

    NASA Astrophysics Data System (ADS)

    Sapra, Karan; Gupta, Saurabh; Atchley, Scott; Anantharaj, Valentine; Miller, Ross; Vazhkudai, Sudharshan

    2016-04-01

    Efficient resource utilization is critical for improved end-to-end computing and workflow of scientific applications. Heterogeneous node architectures, such as the GPU-enabled Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF), present us with further challenges. In many HPC applications on Titan, the accelerators are the primary compute engines while the CPUs orchestrate the offloading of work onto the accelerators and move the output back to main memory. In applications that do not exploit GPUs, on the other hand, CPU usage is dominant while the GPUs idle. We utilized the Heterogeneous Functional Partitioning (HFP) runtime framework, which can optimize usage of resources on a compute node to expedite an application's end-to-end workflow. This approach differs from existing techniques for in-situ analyses in that it provides a framework for on-the-fly, on-node analysis by dynamically exploiting under-utilized resources. We have implemented in the Community Earth System Model (CESM) a new concurrent diagnostic processing capability enabled by the HFP framework. Various univariate statistics, such as means and distributions, are computed in situ by launching HFP tasks on the GPU via the node-local HFP daemon. Since our current configuration of CESM does not use GPU resources heavily, we can move these tasks to the GPU using the HFP framework. Each rank running the atmospheric model in CESM pushes the variables of interest via HFP function calls to the HFP daemon. This node-local daemon is responsible for receiving the data from the main program and launching the designated analytics tasks on the GPU. We have implemented these analytics tasks in C and use OpenACC directives to enable GPU acceleration. This methodology is also advantageous when executing GPU-enabled configurations of CESM, where the CPUs would otherwise be idle during portions of the runtime.
In our implementation results, we demonstrate that it is more efficient to offload the tasks to GPUs through the HFP framework than to perform them in the main application. We observe increased resource utilization and overall productivity in the end-to-end workflow with this approach.
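The push-to-daemon pattern described above can be sketched in miniature. The sketch below uses a Python thread and queue in place of the actual HFP daemon and its C/OpenACC analytics tasks; the variable name and values are invented for illustration:

```python
import queue
import threading

class AnalyticsDaemon(threading.Thread):
    """Node-local analytics daemon: receives variables from the main
    program and computes running statistics (here, a mean per name)."""
    def __init__(self):
        super().__init__()
        self.inbox = queue.Queue()
        self.sums, self.counts = {}, {}

    def run(self):
        while True:
            item = self.inbox.get()
            if item is None:          # shutdown sentinel
                break
            name, value = item
            self.sums[name] = self.sums.get(name, 0.0) + value
            self.counts[name] = self.counts.get(name, 0) + 1

    def mean(self, name):
        return self.sums[name] / self.counts[name]

daemon = AnalyticsDaemon()
daemon.start()

# The "simulation" loop pushes variables of interest instead of computing
# statistics itself, so the main time-stepping code is not blocked.
for step in range(1, 101):
    daemon.inbox.put(("temperature", 280.0 + step * 0.1))

daemon.inbox.put(None)
daemon.join()
print("mean temperature:", daemon.mean("temperature"))
```

The design point is the decoupling: the main loop's only cost is a non-blocking enqueue, while the statistics run on whatever resource (here a thread, in HFP a GPU) would otherwise sit idle.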

  6. Cancer Patient Navigator Tasks across the Cancer Care Continuum

    PubMed Central

    Braun, Kathryn L.; Kagawa-Singer, Marjorie; Holden, Alan E. C.; Burhansstipanov, Linda; Tran, Jacqueline H.; Seals, Brenda F.; Corbie-Smith, Giselle; Tsark, JoAnn U.; Harjo, Lisa; Foo, Mary Anne; Ramirez, Amelie G.

    2011-01-01

    Cancer patient navigation (PN) programs have been shown to increase access to and utilization of cancer care for poor and underserved individuals. Despite mounting evidence of its value, cancer patient navigation is not universally understood or provided. We describe five PN programs and the range of tasks their navigators provide across the cancer care continuum (education and outreach, screening, diagnosis and staging, treatment, survivorship, and end-of-life). Tasks are organized by their potential to make cancer services understandable, available, accessible, affordable, appropriate, and accountable. Although navigators perform similar tasks across the five programs, their specific approaches reflect differences in community culture, context, program setting, and funding. Task lists can inform the development of programs, job descriptions, training, and evaluation. They also may be useful in the move to certify navigators and establish mechanisms for reimbursement for navigation services. PMID:22423178

  7. Orbital transfer vehicle concept definition and systems analysis study. Volume 11: Study extension 2 results

    NASA Technical Reports Server (NTRS)

    Willcockson, W. H.

    1988-01-01

    Work conducted in the second extension of the Phase A Orbit Transfer Vehicle Concept Definition and Systems Analysis Study is summarized. Four major tasks were identified: (1) define an initial OTV program consistent with near term Civil Space Leadership Initiative missions; (2) develop program evolution to long term advanced missions; (3) investigate the implications of current STS safety policy on an Aft Cargo Carrier based OTV; and (4) expand the analysis of high entry velocity aeroassist. An increased emphasis on the breadth of OTV applications was undertaken to show the need for the program on the basis of the expansion of the nation's capabilities in space.

  8. Usability test of an internet-based informatics tool for diabetes care providers: the comprehensive diabetes management program.

    PubMed

    Fonda, Stephanie J; Paulsen, Christine A; Perkins, Joan; Kedziora, Richard J; Rodbard, David; Bursell, Sven-Erik

    2008-02-01

    Research suggests Internet-based care management tools are associated with improvements in care and patient outcomes. However, although such tools change workflow, rarely is their usability addressed and reported. This article presents a usability study of an Internet-based informatics application called the Comprehensive Diabetes Management Program (CDMP), developed by content experts and technologists. Our aim is to demonstrate a process for conducting a usability study of such a tool and to report results. We conducted the usability test with six diabetes care providers under controlled conditions. Each provider worked with the CDMP in a single session using a "think aloud" process. Providers performed standardized tasks with fictitious patient data, and we observed how they approached these tasks, documenting verbalizations and subjective ratings. The providers then completed a usability questionnaire and interviews. Overall, the scores on the usability questionnaire were neutral to favorable. For specific subdomains of the questionnaire, the providers reported problems with the application's ease of use, performance, and support features, but were satisfied with its visual appeal and content. The results from the observational and interview data indicated areas for improvement, particularly in navigation and terminology. The usability study identified several issues for improvement, confirming the need for usability testing of Internet-based informatics applications, even those developed by experts. To our knowledge, there have been no other usability studies of an Internet-based informatics application with the functionality of the CDMP. Such studies can form the foundation for translation of Internet-based medical informatics tools into clinical practice.

  9. Parallel design patterns for a low-power, software-defined compressed video encoder

    NASA Astrophysics Data System (ADS)

    Bruns, Michael W.; Hunt, Martin A.; Prasad, Durga; Gunupudi, Nageswara R.; Sonachalam, Sekar

    2011-06-01

    Video compression algorithms such as H.264 offer much potential for parallel processing that is not always exploited by the technology of a particular implementation. Consumer mobile encoding devices often achieve real-time performance and low power consumption through parallel processing in Application Specific Integrated Circuit (ASIC) technology, but many other applications require a software-defined encoder. High quality compression features needed for some applications, such as 10-bit sample depth or 4:2:2 chroma format, often go beyond the capability of a typical consumer electronics device. An application may also need to efficiently combine compression with other functions such as noise reduction, image stabilization, real time clocks, GPS data, mission/ESD/user data or software-defined radio in a low power, field upgradable implementation. Low power, software-defined encoders may be implemented using a massively parallel memory-network processor array with 100 or more cores and distributed memory. The large number of processor elements allows the silicon device to operate more efficiently than conventional DSP or CPU technology. A dataflow programming methodology may be used to express all of the encoding processes including motion compensation, transform and quantization, and entropy coding. This is a declarative programming model in which the parallelism of the compression algorithm is expressed as a hierarchical graph of tasks with message communication. Data parallel and task parallel design patterns are supported without the need for explicit global synchronization control. An example is described of an H.264 encoder developed for a commercially available, massively parallel memory-network processor device.
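A minimal sketch of the dataflow style described above: independent stages that communicate only through message queues, with no explicit global synchronization. The three stage functions are toy stand-ins for illustration, not real H.264 processing:

```python
import queue
import threading

# Each encoding stage is an independent task; stages communicate only by
# messages (task parallelism), forwarding a None sentinel at end of stream.
def stage(fn, inq, outq):
    while True:
        item = inq.get()
        if item is None:
            outq.put(None)
            break
        outq.put(fn(item))

# Toy stand-ins for the real encoder stages.
def motion_compensate(block):
    return [v - 1 for v in block]      # subtract a fake prediction

def quantize(block):
    return [v // 2 for v in block]     # coarse transform/quantization

def entropy_code(block):
    return sum(block)                  # fake "bitstream size"

q1, q2, q3, q4 = (queue.Queue() for _ in range(4))
threads = [threading.Thread(target=stage, args=a) for a in
           [(motion_compensate, q1, q2), (quantize, q2, q3), (entropy_code, q3, q4)]]
for t in threads:
    t.start()

for block in ([3, 5, 7], [9, 11, 13]):   # two "macroblocks" flow through the graph
    q1.put(block)
q1.put(None)

coded = []
while (out := q4.get()) is not None:
    coded.append(out)
for t in threads:
    t.join()
print("coded sizes:", coded)
```

Because each stage only blocks on its input queue, the pipeline stages can run concurrently on separate cores, which is the property the massively parallel processor array exploits at much larger scale.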

  10. Intelligent behavior generator for autonomous mobile robots using planning-based AI decision making and supervisory control logic

    NASA Astrophysics Data System (ADS)

    Shah, Hitesh K.; Bahl, Vikas; Martin, Jason; Flann, Nicholas S.; Moore, Kevin L.

    2002-07-01

    In earlier research, the Center for Self-Organizing and Intelligent Systems (CSOIS) at Utah State University (USU) has been funded by the US Army Tank-Automotive and Armaments Command's (TACOM) Intelligent Mobility Program to develop and demonstrate enhanced mobility concepts for unmanned ground vehicles (UGVs). One of several outgrowths of this work has been the development of a grammar-based approach to intelligent behavior generation for commanding autonomous robotic vehicles. In this paper we describe the use of this grammar for enabling autonomous behaviors. A supervisory task controller (STC) sequences high-level action commands (taken from the grammar) to be executed by the robot. It takes as input a set of goals and a partial (static) map of the environment and produces, from the grammar, a flexible script (or sequence) of the high-level commands that are to be executed by the robot. The sequence is derived by a planning function that uses a graph-based heuristic search (A* algorithm). Each action command has specific exit conditions that are evaluated by the STC following each task completion or interruption (in the case of disturbances or new operator requests). Depending on the system's state at task completion or interruption (including updated environmental and robot sensor information), the STC invokes a reactive response. This can include sequencing the pending tasks or initiating a re-planning event, if necessary. Though applicable to a wide variety of autonomous robots, an application of this approach is demonstrated via simulations of ODIS, an omni-directional inspection system developed for security applications.
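The planning function described above is a heuristic graph search over high-level commands. The sketch below is a generic A* implementation with an invented task graph, cost values, and heuristic, not the actual STC grammar or ODIS command set:

```python
import heapq

def a_star(graph, start, goal, h):
    """Plan a command sequence with A*: g = cost so far, h = heuristic to goal."""
    frontier = [(h(start), 0, start, [start])]
    best_g = {}
    while frontier:
        f, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        if node in best_g and best_g[node] <= g:
            continue                      # already expanded with lower cost
        best_g[node] = g
        for nxt, cost in graph[node]:
            heapq.heappush(frontier,
                           (g + cost + h(nxt), g + cost, nxt, [*path, nxt]))
    return None

# Hypothetical task graph: nodes are high-level action commands, edges are
# feasible transitions with estimated execution costs.
graph = {
    "start":      [("drive_to_A", 2), ("drive_to_B", 4)],
    "drive_to_A": [("inspect", 4)],
    "drive_to_B": [("inspect", 1)],
    "inspect":    [("report", 1)],
    "report":     [],
}
# Admissible heuristic: at least one unit-cost step per remaining stage.
stage = {"start": 0, "drive_to_A": 1, "drive_to_B": 1, "inspect": 2, "report": 3}
def h(n):
    return 3 - stage[n]

plan = a_star(graph, "start", "report", h)
print("planned sequence:", plan)
```

In the paper's setting each node would carry exit conditions, and a failed exit condition would trigger re-planning (a fresh A* call from the current state).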

  11. MIA - A free and open source software for gray scale medical image analysis

    PubMed Central

    2013-01-01

    Background Gray scale images make the bulk of data in bio-medical image analysis, and hence, the main focus of many image processing tasks lies in the processing of these monochrome images. With ever-improving acquisition devices, spatial and temporal image resolution increases, and data sets become very large. Various image processing frameworks exist that make the development of new algorithms easy by using high-level programming languages or visual programming. These frameworks are also accessible to researchers who have little or no background in software development, because they take care of otherwise complex tasks. Specifically, the management of working memory is taken care of automatically, usually at the price of requiring more of it. As a result, processing large data sets with these tools becomes increasingly difficult on workstation-class computers. One alternative to using these high-level processing tools is the development of new algorithms in a language like C++, which gives the developer full control over how memory is handled, but the resulting workflow for the prototyping of new algorithms is rather time intensive, and also not appropriate for a researcher with little or no knowledge in software development. Another alternative is using command line tools that run image processing tasks, use the hard disk to store intermediate results, and provide automation by using shell scripts. Although not as convenient as, e.g., visual programming, this approach is still accessible to researchers without a background in computer science. However, only few tools exist that provide this kind of processing interface; they are usually quite task specific, and don't provide a clear approach when one wants to shape a new command line tool from a prototype shell script.
Results The proposed framework, MIA, provides a combination of command line tools, plug-ins, and libraries that make it possible to run image processing tasks interactively in a command shell and to prototype by using the corresponding shell scripting language. Since the hard disk serves as the temporary storage, memory management is usually a non-issue in the prototyping phase. By using string-based descriptions for filters, optimizers, and the like, the transition from shell scripts to full-fledged programs implemented in C++ is also made easy. In addition, its design based on atomic plug-ins and single-task command line tools makes it easy to extend MIA, usually without the requirement to touch or recompile existing code. Conclusion In this article, we describe the general design of MIA, a general-purpose framework for gray scale image processing. We demonstrate the applicability of the software with example applications from three different research scenarios, namely motion compensation in myocardial perfusion imaging, the processing of high-resolution image data that arises in virtual anthropology, and retrospective analysis of treatment outcome in orthognathic surgery. With MIA, prototyping algorithms by using shell scripts that combine small, single-task command line tools is a viable alternative to the use of high-level languages, an approach that is especially useful when large data sets need to be processed. PMID:24119305

  12. MIA - A free and open source software for gray scale medical image analysis.

    PubMed

    Wollny, Gert; Kellman, Peter; Ledesma-Carbayo, María-Jesus; Skinner, Matthew M; Hublin, Jean-Jaques; Hierl, Thomas

    2013-10-11

    Gray scale images make the bulk of data in bio-medical image analysis, and hence, the main focus of many image processing tasks lies in the processing of these monochrome images. With ever-improving acquisition devices, spatial and temporal image resolution increases, and data sets become very large. Various image processing frameworks exist that make the development of new algorithms easy by using high-level programming languages or visual programming. These frameworks are also accessible to researchers who have little or no background in software development, because they take care of otherwise complex tasks. Specifically, the management of working memory is taken care of automatically, usually at the price of requiring more of it. As a result, processing large data sets with these tools becomes increasingly difficult on workstation-class computers. One alternative to using these high-level processing tools is the development of new algorithms in a language like C++, which gives the developer full control over how memory is handled, but the resulting workflow for the prototyping of new algorithms is rather time intensive, and also not appropriate for a researcher with little or no knowledge in software development. Another alternative is using command line tools that run image processing tasks, use the hard disk to store intermediate results, and provide automation by using shell scripts. Although not as convenient as, e.g., visual programming, this approach is still accessible to researchers without a background in computer science. However, only few tools exist that provide this kind of processing interface; they are usually quite task specific, and don't provide a clear approach when one wants to shape a new command line tool from a prototype shell script.
The proposed framework, MIA, provides a combination of command line tools, plug-ins, and libraries that make it possible to run image processing tasks interactively in a command shell and to prototype by using the corresponding shell scripting language. Since the hard disk serves as the temporary storage, memory management is usually a non-issue in the prototyping phase. By using string-based descriptions for filters, optimizers, and the like, the transition from shell scripts to full-fledged programs implemented in C++ is also made easy. In addition, its design based on atomic plug-ins and single-task command line tools makes it easy to extend MIA, usually without the requirement to touch or recompile existing code. In this article, we describe the general design of MIA, a general-purpose framework for gray scale image processing. We demonstrate the applicability of the software with example applications from three different research scenarios, namely motion compensation in myocardial perfusion imaging, the processing of high-resolution image data that arises in virtual anthropology, and retrospective analysis of treatment outcome in orthognathic surgery. With MIA, prototyping algorithms by using shell scripts that combine small, single-task command line tools is a viable alternative to the use of high-level languages, an approach that is especially useful when large data sets need to be processed.

  13. 76 FR 49527 - Joint Motor Carrier Safety Advisory Committee and Medical Review Board Public Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-10

    ... Task 11-03, regarding the Agency's Cross Border Trucking Pilot Program, will meet. Copies of all MCSAC... Trucking Pilot Program Task The MCSAC Subcommittee will continue its work on Task 11-03 concerning the... a meeting of the Cross-Border Trucking Pilot Program subcommittee. All three days of the meeting...

  14. Development of flight experiment task requirements. Volume 1: Summary

    NASA Technical Reports Server (NTRS)

    Hatterick, G. R.

    1972-01-01

    A study was conducted to develop the means to identify skills required of scientist passengers on advanced missions related to the space shuttle and RAM programs. The scope of the study was defined to include only the activities of on-orbit personnel which are directly related to, or required by, on-orbit experimentation and scientific investigations conducted on or supported by the shuttle orbiter. A program summary is presented which provides a description of the methodology developed, an overview of the activities performed during the study, and the results obtained through application of the methodology.

  15. Research and technology: 1986 annual report of the Lyndon B. Johnson Space Center

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Johnson Space Center accomplishments in new and advanced concepts during 1986 are highlighted. Included are research funded by the Office of Aeronautics and Space Technology; Solar System Exploration and Life Sciences research funded by the Office of Space Sciences and Applications; and Advanced Programs tasks funded by the Office of Space Flight. Summary sections describing the role of the Johnson Space Center in each program are followed by one-page descriptions of significant projects. Descriptions are suitable for external consumption, free of technical jargon, and illustrated to increase ease of comprehension.

  16. Research and technology: 1985 annual report of the Lyndon B. Johnson Space Center

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Johnson Space Center accomplishments in new and advanced concepts during 1985 are highlighted. Included are research funded by the Office of Aeronautics and Space Technology; Solar System Exploration and Life Sciences research funded by the Office of Space Sciences and Applications; and Advanced Programs tasks funded by the Office of Space Flight. Summary sections describing the role of the Johnson Space Center in each program are followed by one-page descriptions of significant projects. Descriptions are suitable for external consumption, free of technical jargon, and illustrated to increase ease of comprehension.

  17. An introduction to scripting in Ruby for biologists.

    PubMed

    Aerts, Jan; Law, Andy

    2009-07-16

    The Ruby programming language has a lot to offer to any scientist with electronic data to process. Not only is the initial learning curve very shallow, but its reflection and meta-programming capabilities allow for the rapid creation of relatively complex applications while still keeping the code short and readable. This paper provides a gentle introduction to this scripting language for researchers without formal informatics training such as many wet-lab scientists. We hope this will provide such researchers an idea of how powerful a tool Ruby can be for their data management tasks and encourage them to learn more about it.

  18. Fault tolerant architectures for integrated aircraft electronics systems, task 2

    NASA Technical Reports Server (NTRS)

    Levitt, K. N.; Melliar-Smith, P. M.; Schwartz, R. L.

    1984-01-01

    The architectural basis for an advanced fault tolerant on-board computer to succeed the current generation of fault tolerant computers is examined. The network error tolerant system architecture is studied with particular attention to intercluster configurations and communication protocols, and to refined reliability estimates. The diagnosis of faults, so that appropriate choices for reconfiguration can be made, is discussed. The analysis relates particularly to the recognition of transient faults in a system with tasks at many levels of priority. The demand driven data-flow architecture, which appears to have possible application in fault tolerant systems, is described, and work investigating the feasibility of automatic generation of aircraft flight control programs from abstract specifications is reported.

  19. The effectiveness of robotic training depends on motor task characteristics.

    PubMed

    Marchal-Crespo, Laura; Rappo, Nicole; Riener, Robert

    2017-12-01

    Previous research suggests that the effectiveness of robotic training depends on the motor task to be learned. However, it is still an open question which specific task characteristics influence the efficacy of error-modulating training strategies. Motor tasks can be classified based on the time characteristics of the task, in particular the task's duration (discrete vs. continuous). Continuous tasks require movements without distinct beginning or end. Discrete tasks require fast movements that include well-defined postures at the beginning and the end. We developed two games, one requiring a continuous movement (a tracking task) and one requiring discrete movements (a fast reaching task). We conducted an experiment with thirty healthy subjects to evaluate the effectiveness of three error-modulating training strategies-no guidance, error amplification (i.e., repulsive forces proportional to errors) and haptic guidance-on self-reported motivation and learning of the continuous and discrete games. Training with error amplification resulted in better motor learning than haptic guidance, even though error amplification reduced subjects' interest/enjoyment and perceived competence during training. Only subjects trained with error amplification improved their performance after training the discrete game. In fact, subjects trained without guidance improved performance in the continuous game significantly more than in the discrete game, probably because the continuous task required greater attentional levels. Error-amplifying training strategies have great potential to produce better motor learning in continuous and discrete tasks. However, their long-lasting negative effects on motivation might limit their applicability in intense neurorehabilitation programs.

  20. Centralized Alert-Processing and Asset Planning for Sensorwebs

    NASA Technical Reports Server (NTRS)

    Castano, Rebecca; Chien, Steve A.; Rabideau, Gregg R.; Tang, Benyang

    2010-01-01

    A software program provides a Sensorweb architecture for alert-processing, event detection, asset allocation and planning, and visualization. It automatically tasks and re-tasks various types of assets, such as satellites and robotic vehicles, in response to alerts (fire, weather) extracted from various data sources, including low-level Webcam data. JPL has adapted considerable Sensorweb infrastructure that had previously been applied to NASA Earth Science applications. This NASA Earth Science Sensorweb has been in operational use since 2003 and has proven the reliability of the Sensorweb technologies for robust event detection and autonomous response using space and ground assets. Unique features of the software include the flexibility to accommodate a range of detection and tasking methods, including those that require aggregation of data over spatial and temporal ranges; the generality of the response structure, which can represent and implement a range of response campaigns; and the ability to respond rapidly.

  1. System engineering techniques for establishing balanced design and performance guidelines for the advanced telerobotic testbed

    NASA Technical Reports Server (NTRS)

    Zimmerman, W. F.; Matijevic, J. R.

    1987-01-01

    Novel system engineering techniques have been developed and applied to establishing structured design and performance objectives for the Telerobotics Testbed that reduce technical risk while still allowing the testbed to demonstrate an advancement in state-of-the-art robotic technologies. To establish the appropriate tradeoff structure and balance of technology performance against technical risk, an analytical data base was developed which drew on: (1) automation/robot-technology availability projections, (2) typical or potential application mission task sets, (3) performance simulations, (4) project schedule constraints, and (5) project funding constraints. Design tradeoffs and configuration/performance iterations were conducted by comparing feasible technology/task set configurations against schedule/budget constraints as well as original program target technology objectives. The final system configuration, task set, and technology set reflected a balanced advancement in state-of-the-art robotic technologies, while meeting programmatic objectives and schedule/cost constraints.

  2. Plan recognition and generalization in command languages with application to telerobotics

    NASA Technical Reports Server (NTRS)

    Yared, Wael I.; Sheridan, Thomas B.

    1991-01-01

    A method for pragmatic inference as a necessary accompaniment to command languages is proposed. The approach taken focuses on the modeling and recognition of the human operator's intent, which relates sequences of domain actions ('plans') to changes in some model of the task environment. The salient feature of this module is that it captures some of the physical and linguistic contextual aspects of an instruction. This provides a basis for generalization and reinterpretation of the instruction in different task environments. The theoretical development is founded on previous work in computational linguistics and some recent models in the theory of action and intention. To illustrate these ideas, an experimental command language to a telerobot is implemented. The program consists of three different components: a robot graphic simulation, the command language itself, and the domain-independent pragmatic inference module. Examples of task instruction processes are provided to demonstrate the benefits of this approach.

  3. Development and Applications of a Self-Contained, Non-Invasive EVA Joint Angle and Muscle Fatigue Sensor System

    NASA Technical Reports Server (NTRS)

    Ranniger, C. U.; Sorenson, E. A.; Akin, D. L.

    1995-01-01

    The University of Maryland Space Systems Laboratory, as a participant in NASA's INSTEP program, is developing a non-invasive, self-contained sensor system which can provide quantitative measurements of joint angles and muscle fatigue in the hand and forearm. The goal of this project is to develop a system with which hand/forearm motion and fatigue metrics can be determined in various terrestrial and zero-G work environments. A preliminary study of the prototype sensor systems and data reduction techniques for the fatigue measurement system are presented. The sensor systems evaluated include fiberoptics, used to measure joint angle; surface electrodes, which measure the electrical signals created in muscle as it contracts; microphones, which measure the noise made by contracting muscle; and accelerometers, which measure the lateral muscle acceleration during contraction. The prototype sensor systems were used to monitor joint motion of the metacarpophalangeal joint and muscle fatigue in flexor digitorum superficialis and flexor carpi ulnaris in subjects performing gripping tasks. Subjects were asked to sustain a 60-second constant-contraction (isometric) exercise and subsequently to perform a repetitive handgripping task to failure. Comparison of the electrical and mechanical signals of the muscles during the different tasks will be used to evaluate the applicability of muscle signal measurement techniques developed for isometric contraction tasks to fatigue prediction in quasi-dynamic exercises. Potential data reduction schemes are presented.

  4. Aspect-object alignment with Integer Linear Programming in opinion mining.

    PubMed

    Zhao, Yanyan; Qin, Bing; Liu, Ting; Yang, Wei

    2015-01-01

    Target extraction is an important task in opinion mining. In this task, a complete target consists of an aspect and its corresponding object. However, previous work has always simply regarded the aspect as the target itself and has ignored the important "object" element. Thus, these studies have addressed incomplete targets, which are of limited use for practical applications. This paper proposes a novel and important sentiment analysis task, termed aspect-object alignment, to solve the "object neglect" problem. The objective of this task is to obtain the correct corresponding object for each aspect. We design a two-step framework for this task. We first provide an aspect-object alignment classifier that incorporates three sets of features, namely, the basic, relational, and special target features. However, the objects that are assigned to aspects in a sentence often contradict each other and possess many complicated features that are difficult to incorporate into a classifier. To resolve these conflicts, we impose two types of constraints in the second step: intra-sentence constraints and inter-sentence constraints. These constraints are encoded as linear formulations, and Integer Linear Programming (ILP) is used as an inference procedure to obtain a final global decision that is consistent with the constraints. Experiments on a corpus in the camera domain demonstrate that the three feature sets used in the aspect-object alignment classifier are effective in improving its performance. Moreover, the classifier with ILP inference performs better than the classifier without it, thereby illustrating that the two types of constraints that we impose are beneficial.
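The two-step scheme described above (a per-pair classifier followed by constrained global inference) can be illustrated with a minimal sketch. This is not the authors' system: a hypothetical score table stands in for the classifier, and an exhaustive search over joint assignments stands in for the ILP solver; all names, scores, and the example constraint are assumptions for illustration only.

```python
# Sketch of constraint-based aspect-object alignment: pick one object per
# aspect so the total classifier score is maximized subject to constraints.
# Brute-force enumeration plays the role an ILP solver plays in the paper.
from itertools import product

def align(aspects, objects, score, constraints):
    """Return the best constraint-satisfying {aspect: object} assignment."""
    best, best_total = None, float("-inf")
    for assignment in product(objects, repeat=len(aspects)):
        pairs = list(zip(aspects, assignment))
        if not all(c(pairs) for c in constraints):
            continue  # violates an intra-/inter-sentence constraint
        total = sum(score(a, o) for a, o in pairs)
        if total > best_total:
            best, best_total = dict(pairs), total
    return best

# Hypothetical classifier scores for a camera-domain sentence.
scores = {("zoom", "lens"): 0.9, ("zoom", "battery pack"): 0.1,
          ("battery", "lens"): 0.2, ("battery", "battery pack"): 0.8}
score = lambda a, o: scores[(a, o)]
# Example constraint: aspects in one sentence may not share an object.
distinct = lambda pairs: len({o for _, o in pairs}) == len(pairs)

result = align(["zoom", "battery"], ["lens", "battery pack"], score, [distinct])
print(result)  # → {'zoom': 'lens', 'battery': 'battery pack'}
```

In a real system the enumeration would be replaced by an ILP formulation with one binary variable per aspect-object pair, which scales far beyond the toy search shown here.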

  5. A Recursive Approach to Compute Normal Forms

    NASA Astrophysics Data System (ADS)

    Hsu, L.; Min, L. J.; Favretto, L.

    2001-06-01

    Normal forms are instrumental in the analysis of dynamical systems described by ordinary differential equations, particularly when singularities close to a bifurcation are to be characterized. However, the computation of a normal form up to an arbitrary order is numerically hard. This paper focuses on the computer programming of some recursive formulas developed earlier to compute higher order normal forms. A computer program to reduce the system to its normal form on a center manifold is developed using the Maple symbolic language. However, it should be stressed that the program relies essentially on recursive numerical computations, while symbolic calculations are used only for minor tasks. Some strategies are proposed to save computation time. Examples are presented to illustrate the application of the program to obtain high order normalization or to handle systems with large dimension.

  6. Developing CORBA-Based Distributed Scientific Applications from Legacy Fortran Programs

    NASA Technical Reports Server (NTRS)

    Sang, Janche; Kim, Chan; Lopez, Isaac

    2000-01-01

    Recent progress in distributed object technology has enabled software applications to be developed and deployed easily such that objects or components can work together across the boundaries of the network, different operating systems, and different languages. A distributed object is not necessarily a complete application but rather a reusable, self-contained piece of software that co-operates with other objects in a plug-and-play fashion via a well-defined interface. The Common Object Request Broker Architecture (CORBA), a middleware standard defined by the Object Management Group (OMG), uses the Interface Definition Language (IDL) to specify such an interface for transparent communication between distributed objects. Since IDL can be mapped to any programming language, such as C++, Java, Smalltalk, etc., existing applications can be integrated into a new application and hence the tasks of code re-writing and software maintenance can be reduced. Many scientific applications in aerodynamics and solid mechanics are written in Fortran. Refitting these legacy Fortran codes with CORBA objects can increase the codes' reusability. For example, scientists could link their scientific applications to vintage Fortran programs such as Partial Differential Equation (PDE) solvers in a plug-and-play fashion. Unfortunately, a CORBA IDL to Fortran mapping has not been proposed, and there seems to be no direct method of generating CORBA objects from Fortran without having to resort to manually writing C/C++ wrappers. In this paper, we present an efficient methodology to integrate Fortran legacy programs into a distributed object framework. Issues and strategies regarding the conversion and decomposition of Fortran codes into CORBA objects are discussed. The following diagram shows the conversion and decomposition mechanism we proposed. Our goal is to keep the Fortran codes unmodified.
The conversion-aided tool takes the Fortran application program as input and helps programmers generate the C/C++ header file and IDL file for wrapping the Fortran code. Programmers need to determine by themselves how to decompose the legacy application into several reusable components based on the cohesion and coupling factors among the functions and subroutines. However, programming effort still can be greatly reduced because function headings and types have been converted to C++ and IDL styles. Most Fortran applications use the COMMON block to facilitate the transfer of a large number of variables among several functions. The COMMON block plays a role similar to that of global variables in C. In the CORBA-compliant programming environment, global variables cannot be used to pass values between objects. One approach to dealing with this problem is to put the COMMON variables into the parameter list. We do not adopt this approach because it requires modification of the Fortran source code, which violates our design consideration. Our approach is to extract the COMMON blocks and convert them into a structure-typed attribute in C++. Through attributes, each component can initialize the variables and return the computation result back to the client. We have successfully tested the proposed conversion methodology based on the f2c converter. Since f2c only translates Fortran to C, we still needed to edit the converted code to meet C++ and IDL syntax. For example, C++/IDL requires a tag in the structure type, while C does not. In this paper, we identify the necessary changes to the f2c converter in order to directly generate the C++ header and the IDL file. Our future work is to add a GUI interface to ease the decomposition task by simply dragging and dropping icons.
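The design choice described above (shared COMMON-block variables repackaged as a structure-typed attribute that clients read and write through the object's interface) can be sketched in a language-neutral way. This is a Python analogy of that pattern, not the paper's Fortran/CORBA tooling; every name below is illustrative.

```python
# Sketch: instead of components sharing state through globals (the role a
# Fortran COMMON block plays), the shared variables are bundled into a
# structure-typed attribute exposed on the wrapped component.
from dataclasses import dataclass

@dataclass
class SolverState:            # stands in for a COMMON block turned into a struct
    grid_size: int = 1
    tolerance: float = 0.5
    residual: float = 0.0

class PdeSolverComponent:
    """A wrapped legacy component: clients set/get state via the attribute."""
    def __init__(self):
        self.state = SolverState()   # the structure-typed attribute

    def run(self):
        # the legacy computation would go here; we just record a toy result
        self.state.residual = self.state.tolerance / self.state.grid_size

solver = PdeSolverComponent()
solver.state.grid_size = 4        # client initializes the "COMMON" variables
solver.run()
print(solver.state.residual)      # client reads the result back → 0.125
```

In the CORBA setting the `SolverState` structure would be declared in IDL and the attribute accessed remotely through the ORB, but the data flow is the same: initialize the attribute, invoke the component, read the result back.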

  7. An Agent-Based Cockpit Task Management System

    NASA Technical Reports Server (NTRS)

    Funk, Ken

    1997-01-01

    An agent-based program to facilitate Cockpit Task Management (CTM) in commercial transport aircraft is developed and evaluated. The agent-based program called the AgendaManager (AMgr) is described and evaluated in a part-task simulator study using airline pilots.

  8. USSR and Eastern Europe Scientific Abstracts, Cybernetics, Computers and Automation Technology, Number 31

    DTIC Science & Technology

    1978-02-09

    incorporation of subsystems and tasks into systems are specified by state and industrial-sector standards. However, such rigid requirements interfere...construction of the step-down substation, without which the new sector is inoperable. The question of who will build the substation is still unresolved... economico-mathematical evaluation). Among such applications, program decks completed by the "Soyuzsistemprom" are those for data integration and

  9. The National Shipbuilding Research Program Executive Summary Robotics in Shipbuilding Workshop

    DTIC Science & Technology

    1981-01-01

    based on technoeconomic analysis and consideration of the working environment. (3) The conceptual designs were based on application of commercial...results of our study. We identified shipbuilding tasks that should be performed by industrial robots based on technoeconomic and working-life incentives...is the TV image of the illuminated workplaces. The image is analyzed by the computer. The analysis includes noise rejection and fitting of straight

  10. Application of shuttle EVA systems to payloads. Volume 1: EVA systems and operational modes description

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Descriptions of the EVA system baselined for the space shuttle program were provided, as well as a compendium of data on available EVA operational modes for payload and orbiter servicing. Operational concepts and techniques to accomplish representative EVA payload tasks are proposed. Some of the subjects discussed include: extravehicular mobility unit, remote manipulator system, airlock, EVA translation aids, restraints, workstations, tools and support equipment.

  11. GPHS-RTGs in support of the Cassini Mission

    NASA Astrophysics Data System (ADS)

    1994-10-01

    The progress on the radioisotope generators and ancillary activities is described. This report is organized by program task as follows: spacecraft integration and liaison; engineering support; safety; qualified unicouple fabrication; ETG fabrication, assembly, and test; ground support equipment; RTG shipping and launch support; design, reviews, and mission applications; project management, quality assurance and reliability, contract changes, non-capital CAGO acquisition, and CAGO maintenance; contractor acquired government owned property (CAGO) acquisition.

  12. Evaluation of the ACEC Benchmark Suite for Real-Time Applications

    DTIC Science & Technology

    1990-07-23

    1.0 benchmark suite was analyzed with respect to its measuring of Ada real-time features such as tasking, memory management, input/output, scheduling...and delay statement, Chapter 13 features, pragmas, interrupt handling, subprogram overhead, numeric computations, etc. For most of the features that...meant for programming real-time systems. The ACEC benchmarks have been analyzed extensively with respect to their measuring of Ada real-time features

  13. Development of OTM Syngas Process and Testing of Syngas Derived Ultra-clean Fuels in Diesel Engines and Fuel Cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    E.T.; James P. Meagher; Prasad Apte

    2002-12-31

    This topical report summarizes work accomplished for the Program from November 1, 2001 to December 31, 2002 in the following task areas: Task 1: Materials Development; Task 2: Composite Development; Task 4: Reactor Design and Process Optimization; Task 8: Fuels and Engine Testing; 8.1 International Diesel Engine Program; 8.2 Nuvera Fuel Cell Program; and Task 10: Program Management. Major progress has been made towards developing high temperature, high performance, robust, oxygen transport elements. In addition, a novel reactor design has been proposed that co-produces hydrogen, lowers cost and improves system operability. Fuel and engine testing is progressing well, but was delayed somewhat due to the hiatus in program funding in 2002. The Nuvera fuel cell portion of the program was completed on schedule and delivered promising results regarding low emission fuels for transportation fuel cells. The evaluation of ultra-clean diesel fuels continues in single cylinder (SCTE) and multiple cylinder (MCTE) test rigs at International Truck and Engine. FT diesel and a BP oxygenate showed significant emissions reductions in comparison to baseline petroleum diesel fuels. Overall through the end of 2002 the program remains under budget, but behind schedule in some areas.

  14. Impact of Frequent Interruption on Nurses' Patient-Controlled Analgesia Programming Performance.

    PubMed

    Campoe, Kristi R; Giuliano, Karen K

    2017-12-01

    The purpose was to add to the body of knowledge regarding the impact of interruption on acute care nurses' cognitive workload, total task completion times, nurse frustration, and medication administration error while programming a patient-controlled analgesia (PCA) pump. Data support that the severity of medication administration error increases with the number of interruptions, which is especially critical during the administration of high-risk medications. Bar code technology, interruption-free zones, and medication safety vests have been shown to decrease administration-related errors. However, there are few published data regarding the impact of number of interruptions on nurses' clinical performance during PCA programming. Nine acute care nurses completed three PCA pump programming tasks in a simulation laboratory. Programming tasks were completed under three conditions where the number of interruptions varied between two, four, and six. Outcome measures included cognitive workload (six NASA Task Load Index [NASA-TLX] subscales), total task completion time (seconds), nurse frustration (NASA-TLX Subscale 6), and PCA medication administration error (incorrect final programming). Increases in the number of interruptions were associated with significant increases in total task completion time (p = .003). We also found increases in nurses' cognitive workload, nurse frustration, and PCA pump programming errors, but these increases were not statistically significant. Complex technology use permeates the acute care nursing practice environment. These results add new knowledge on nurses' clinical performance during PCA pump programming and high-risk medication administration.

  15. Spatial cluster detection using dynamic programming.

    PubMed

    Sverchkov, Yuriy; Jiang, Xia; Cooper, Gregory F

    2012-03-25

    The task of spatial cluster detection involves finding spatial regions where some property deviates from the norm or the expected value. In a probabilistic setting this task can be expressed as finding a region where some event is significantly more likely than usual. Spatial cluster detection is of interest in fields such as biosurveillance, mining of astronomical data, military surveillance, and analysis of fMRI images. In almost all such applications we are interested both in the question of whether a cluster exists in the data, and if it exists, we are interested in finding the most accurate characterization of the cluster. We present a general dynamic programming algorithm for grid-based spatial cluster detection. The algorithm can be used for both Bayesian maximum a-posteriori (MAP) estimation of the most likely spatial distribution of clusters and Bayesian model averaging over a large space of spatial cluster distributions to compute the posterior probability of an unusual spatial clustering. The algorithm is explained and evaluated in the context of a biosurveillance application, specifically the detection and identification of Influenza outbreaks based on emergency department visits. A relatively simple underlying model is constructed for the purpose of evaluating the algorithm, and the algorithm is evaluated using the model and semi-synthetic test data. When compared to baseline methods, tests indicate that the new algorithm can improve MAP estimates under certain conditions: the greedy algorithm we compared our method to was found to be more sensitive to smaller outbreaks, while as the size of the outbreaks increases, in terms of area affected and proportion of individuals affected, our method overtakes the greedy algorithm in spatial precision and recall. The new algorithm performs on-par with baseline methods in the task of Bayesian model averaging. 
We conclude that the dynamic programming algorithm performs on-par with other available methods for spatial cluster detection and point to its low computational cost and extendability as advantages in favor of further research and use of the algorithm.
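The idea of scoring grid regions for anomalous activity can be illustrated with a much simpler stand-in than the paper's Bayesian dynamic programming algorithm: using 2-D prefix sums, score every axis-aligned rectangle by its total anomaly value and return the highest-scoring one. The grid values below are hypothetical (e.g., observed minus expected counts per cell); this sketch is illustrative only and performs no statistical inference.

```python
# Find the axis-aligned rectangle with the largest total anomaly score,
# using a 2-D prefix-sum table so each rectangle is scored in O(1).
def best_rectangle(grid):
    rows, cols = len(grid), len(grid[0])
    # prefix[i][j] = sum of grid cells in the rectangle [0, i) x [0, j)
    prefix = [[0] * (cols + 1) for _ in range(rows + 1)]
    for i in range(rows):
        for j in range(cols):
            prefix[i + 1][j + 1] = (grid[i][j] + prefix[i][j + 1]
                                    + prefix[i + 1][j] - prefix[i][j])
    best = (float("-inf"), None)
    for r0 in range(rows):
        for r1 in range(r0 + 1, rows + 1):
            for c0 in range(cols):
                for c1 in range(c0 + 1, cols + 1):
                    total = (prefix[r1][c1] - prefix[r0][c1]
                             - prefix[r1][c0] + prefix[r0][c0])
                    if total > best[0]:
                        best = (total, (r0, r1, c0, c1))
    return best

# Hypothetical anomaly grid: an elevated 2x2 block amid background noise.
anomaly = [[-1, -1, -1, -1],
           [-1,  3,  4, -1],
           [-1,  2,  5, -1],
           [-1, -1, -1, -1]]
print(best_rectangle(anomaly))   # → (14, (1, 3, 1, 3))
```

The paper's algorithm goes well beyond this sketch, computing Bayesian MAP estimates and model averages over spatial cluster distributions, but the underlying move is similar: precomputed aggregates make scoring many candidate regions cheap.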

  16. Spatial cluster detection using dynamic programming

    PubMed Central

    2012-01-01

    Background The task of spatial cluster detection involves finding spatial regions where some property deviates from the norm or the expected value. In a probabilistic setting this task can be expressed as finding a region where some event is significantly more likely than usual. Spatial cluster detection is of interest in fields such as biosurveillance, mining of astronomical data, military surveillance, and analysis of fMRI images. In almost all such applications we are interested both in the question of whether a cluster exists in the data, and if it exists, we are interested in finding the most accurate characterization of the cluster. Methods We present a general dynamic programming algorithm for grid-based spatial cluster detection. The algorithm can be used for both Bayesian maximum a-posteriori (MAP) estimation of the most likely spatial distribution of clusters and Bayesian model averaging over a large space of spatial cluster distributions to compute the posterior probability of an unusual spatial clustering. The algorithm is explained and evaluated in the context of a biosurveillance application, specifically the detection and identification of Influenza outbreaks based on emergency department visits. A relatively simple underlying model is constructed for the purpose of evaluating the algorithm, and the algorithm is evaluated using the model and semi-synthetic test data. Results When compared to baseline methods, tests indicate that the new algorithm can improve MAP estimates under certain conditions: the greedy algorithm we compared our method to was found to be more sensitive to smaller outbreaks, while as the size of the outbreaks increases, in terms of area affected and proportion of individuals affected, our method overtakes the greedy algorithm in spatial precision and recall. The new algorithm performs on-par with baseline methods in the task of Bayesian model averaging. 
Conclusions We conclude that the dynamic programming algorithm performs on-par with other available methods for spatial cluster detection and point to its low computational cost and extendability as advantages in favor of further research and use of the algorithm. PMID:22443103

  17. Integrated Application of Active Controls (IAAC) technology to an advanced subsonic transport project: Current and advanced act control system definition study

    NASA Technical Reports Server (NTRS)

    1982-01-01

    The Current and Advanced Technology ACT control system definition tasks of the Integrated Application of Active Controls (IAAC) Technology project within the Energy Efficient Transport Program are summarized. The systems mechanize six active control functions: (1) pitch augmented stability; (2) angle of attack limiting; (3) lateral/directional augmented stability; (4) gust load alleviation; (5) maneuver load control; and (6) flutter mode control. The redundant digital control systems meet all function requirements with required reliability and declining weight and cost as advanced technology is introduced.

  18. ISIS and META projects

    NASA Technical Reports Server (NTRS)

    Birman, Kenneth; Cooper, Robert; Marzullo, Keith

    1990-01-01

    The ISIS project has developed a new methodology, virtual synchrony, for writing robust distributed software. High performance multicast, large scale applications, and wide area networks are the focus of interest. Several interesting applications that exploit the strengths of ISIS, including an NFS-compatible replicated file system, are being developed. The META project addresses distributed control in a soft real-time environment incorporating feedback. This domain encompasses examples as diverse as monitoring inventory and consumption on a factory floor, and performing load-balancing on a distributed computing system. One of the first uses of META is for distributed application management: the tasks of configuring a distributed program, dynamically adapting to failures, and monitoring its performance. Recent progress and current plans are reported.

  19. Using CLIPS in the domain of knowledge-based massively parallel programming

    NASA Technical Reports Server (NTRS)

    Dvorak, Jiri J.

    1994-01-01

    The Program Development Environment (PDE) is a tool for massively parallel programming of distributed-memory architectures. Adopting a knowledge-based approach, the PDE eliminates the complexity introduced by parallel hardware with distributed memory and offers complete transparency with respect to parallelism exploitation. The knowledge-based part of the PDE is realized in CLIPS. Its principal task is to find an efficient parallel realization of the application specified by the user in a comfortable, abstract, domain-oriented formalism. A large collection of fine-grain parallel algorithmic skeletons, represented as COOL objects in a tree hierarchy, contains the algorithmic knowledge. A hybrid knowledge base with rule modules and procedural parts, encoding expertise about the application domain, parallel programming, software engineering, and parallel hardware, enables a high degree of automation in the software development process. In this paper, important aspects of the implementation of the PDE using CLIPS and COOL are shown, including the embedding of CLIPS with C++-based parts of the PDE. The appropriateness of the chosen approach and of the CLIPS language for knowledge-based software engineering is discussed.

  20. PC_Eyewitness: a computerized framework for the administration and practical application of research in eyewitness psychology.

    PubMed

    MacLin, Otto H; Meissner, Christian A; Zimmerman, Laura A

    2005-05-01

    Eyewitness identification evidence is an important aspect of our legal system. Society relies on witnesses to identify suspects whom they have observed during the commission of a crime. Because a witness has only a mental representation of the individual he or she observed, law enforcement must rely on verbal descriptions and identification procedures to document eyewitness evidence. The present article introduces and details a computer program, referred to as PC_Eyewitness (PCE), which can be used in laboratories to conduct research on eyewitness memory. PCE is a modular program written in Visual Basic 6.0 that allows a researcher to present stimuli to a participant, to conduct distractor tasks, to elicit verbal descriptors regarding a target individual, and to present a lineup for the participant to provide an identification response. To illustrate the versatility of the program, several classic studies in the eyewitness literature are recreated in the context of PCE. The program is also shown to have applications in the conduct of field research and for use by law enforcement to administer lineups in everyday practice. PCE is distributed at no cost.

  1. Towards the Automatic Generation of Programmed Foreign-Language Instructional Materials.

    ERIC Educational Resources Information Center

    Van Campen, Joseph A.

    The purpose of this report is to describe a set of programs which either perform certain tasks useful in the generation of programmed foreign-language instructional material or facilitate the writing of such task-oriented programs by other researchers. The programs described are these: (1) a PDP-10 assembly language program for the selection from a…

  2. Performance Evaluation Tests for Environmental Research (PETER): evaluation of 114 measures

    NASA Technical Reports Server (NTRS)

    Bittner, A. C. Jr; Carter, R. C.; Kennedy, R. S.; Harbeson, M. M.; Krause, M.

    1986-01-01

    The goal of the Performance Evaluation Tests for Environmental Research (PETER) Program was to identify a set of measures of human capabilities for use in the study of environmental and other time-course effects. The 114 measures studied in the PETER Program were evaluated and categorized into four groups based upon task stability and task definition. The Recommended category contained 30 measures that clearly obtained total stabilization and had an acceptable level of reliability efficiency. The Acceptable-But-Redundant category contained 15 measures. The 37 measures in the Marginal category, which included an inordinate number of slope and other derived measures, usually had desirable features that were outweighed by faults. The 32 measures in the Unacceptable category had either differential instability or weak reliability efficiency. It is our opinion that the 30 measures in the Recommended category should be given first consideration for environmental research applications. Further, it is recommended that information pertaining to preexperimental practice requirements and stabilized reliabilities be utilized in repeated-measures environmental studies.

  3. Intelligent systems for KSC ground processing

    NASA Technical Reports Server (NTRS)

    Heard, Astrid E.

    1992-01-01

    The ground processing and launch of Shuttle vehicles and their payloads is the primary task of Kennedy Space Center. It is a process which is largely manual and contains little inherent automation. Business is conducted today much as it was during previous NASA programs such as Apollo. In light of new programs and decreasing budgets, NASA must find more cost-effective ways in which to do business while retaining the quality and safety of activities. Advanced technologies, including artificial intelligence, could cut manpower and processing time. This paper is an overview of the research and development in AI technology at KSC, with descriptions of the systems which have been implemented, as well as a few under development which are promising additions to ground processing software. Projects discussed cover many facets of ground processing activities, including computer sustaining engineering, subsystem monitor and diagnosis tools, and launch team assistants. The deployed AI applications have proven effective, which has helped to demonstrate the benefits of utilizing intelligent software in the ground processing task.

  4. PPC750 Performance Monitor

    NASA Technical Reports Server (NTRS)

    Meyer, Donald; Uchenik, Igor

    2007-01-01

    The PPC750 Performance Monitor (Perfmon) is a computer program that helps the user to assess the performance characteristics of application programs running under the Wind River VxWorks real-time operating system on a PPC750 computer. Perfmon generates a user-friendly interface and collects performance data by use of the performance registers provided by the PPC750 architecture. It processes and presents run-time statistics on a per-task basis over a repeating time interval (typically, several seconds or minutes) specified by the user. When the Perfmon software module is loaded with the user's software modules, it is available for use through Perfmon commands, without any modification of the user's code and at negligible performance penalty. Per-task run-time performance data made available by Perfmon include percentage time, number of instructions executed per unit time, dispatch ratio, stack high-water mark, and level-1 instruction and data cache miss rates. The performance data are written to a file specified by the user or to the serial port of the computer.
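The kind of per-interval, per-task statistics Perfmon reports can be illustrated with a small sketch. This is not the Perfmon source; the task names, counter values, and the `interval_stats` helper are all hypothetical, and the actual PPC750 performance registers and VxWorks task hooks are not modeled:

```python
from collections import defaultdict

def interval_stats(samples, interval_ns):
    """Aggregate raw counter samples into per-task statistics for one interval.

    samples: list of (task, instructions, icache_misses, icache_accesses)
    interval_ns: length of the repeating measurement interval, in nanoseconds
    """
    per_task = defaultdict(lambda: [0, 0, 0])
    for task, instr, misses, accesses in samples:
        acc = per_task[task]
        acc[0] += instr
        acc[1] += misses
        acc[2] += accesses
    return {
        task: {
            # instructions executed per second over the interval
            "instructions_per_sec": instr * 1e9 / interval_ns,
            # level-1 instruction cache miss rate (misses / accesses)
            "icache_miss_rate": misses / accesses if accesses else 0.0,
        }
        for task, (instr, misses, accesses) in per_task.items()
    }

# Hypothetical one-second interval with two VxWorks-style tasks.
stats = interval_stats(
    [("tShell", 1_000_000, 500, 100_000), ("tNet", 2_000_000, 4_000, 400_000)],
    1_000_000_000,
)
```

The same aggregation would run once per interval, resetting the counters each time, which is how a repeating-interval monitor avoids unbounded accumulation.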

  5. Improving the human readability of Arden Syntax medical logic modules using a concept-oriented terminology and object-oriented programming expressions.

    PubMed

    Choi, Jeeyae; Bakken, Suzanne; Lussier, Yves A; Mendonça, Eneida A

    2006-01-01

    Medical logic modules are a procedural representation for sharing task-specific knowledge for decision support systems. Based on the premise that clinicians may perceive object-oriented expressions as easier to read than procedural rules in Arden Syntax-based medical logic modules, we developed a method for improving the readability of medical logic modules. Two approaches were applied: exploiting the concept-oriented features of the Medical Entities Dictionary and building an executable Java program to replace Arden Syntax procedural expressions. The usability evaluation showed that 66% of participants successfully mapped all Arden Syntax rules to Java methods. These findings suggest that these approaches can play an essential role in the creation of human readable medical logic modules and can potentially increase the number of clinical experts who are able to participate in the creation of medical logic modules. Although our approaches are broadly applicable, we specifically discuss the relevance to concept-oriented nursing terminologies and automated processing of task-specific nursing knowledge.

  6. High-Performance Signal Detection for Adverse Drug Events using MapReduce Paradigm.

    PubMed

    Fan, Kai; Sun, Xingzhi; Tao, Ying; Xu, Linhao; Wang, Chen; Mao, Xianling; Peng, Bo; Pan, Yue

    2010-11-13

    Post-marketing pharmacovigilance is important for public health, as many Adverse Drug Events (ADEs) are unknown when the drugs involved are approved for marketing. However, because of the large number of reported drugs and drug combinations, detecting ADE signals by mining these reports is becoming a challenging task in terms of computational complexity. Recently, Google introduced MapReduce, a parallel programming model, to support large-scale, data-intensive applications. In this study, we proposed a MapReduce-based algorithm for a common ADE detection approach, the Proportional Reporting Ratio (PRR), and tested it by mining spontaneous ADE reports from the FDA. The purpose is to investigate the possibility of using the MapReduce principle to speed up biomedical data mining tasks, with this pharmacovigilance case as one specific example. The results demonstrated that the MapReduce programming model can improve the performance of a common signal detection algorithm for pharmacovigilance in a distributed computation environment at approximately linear speedup rates.
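The standard PRR for a drug-event pair (D, E) is (a/(a+b)) / (c/(c+d)), where a counts reports mentioning both D and E, b reports with D but not E, c reports with E but not D, and d reports with neither. A minimal sketch of the map/reduce split (toy reports; the paper's distributed implementation and the FDA report format are not reproduced, and each toy report lists one drug and one event, so per-pair counts equal per-report counts):

```python
from collections import Counter
from itertools import product

# Hypothetical toy report set: each report is (drugs taken, events observed).
reports = [
    ({"aspirin"}, {"nausea"}),
    ({"aspirin"}, {"nausea"}),
    ({"aspirin"}, {"rash"}),
    ({"ibuprofen"}, {"nausea"}),
    ({"ibuprofen"}, {"rash"}),
    ({"ibuprofen"}, {"rash"}),
]

def map_phase(report):
    """Emit ((drug, event), 1) for every drug-event combination in one report."""
    drugs, events = report
    for drug, event in product(drugs, events):
        yield (drug, event), 1

def reduce_phase(mapped):
    """Sum the emitted counts by key."""
    counts = Counter()
    for key, n in mapped:
        counts[key] += n
    return counts

pair_counts = reduce_phase(kv for r in reports for kv in map_phase(r))
drug_totals, event_totals = Counter(), Counter()
for (drug, event), n in pair_counts.items():
    drug_totals[drug] += n
    event_totals[event] += n
total = sum(pair_counts.values())

def prr(drug, event):
    a = pair_counts[(drug, event)]
    b = drug_totals[drug] - a     # drug with other events
    c = event_totals[event] - a   # event with other drugs
    d = total - a - b - c         # neither
    return (a / (a + b)) / (c / (c + d))
```

Because the reduce phase only sums independent counts, the work partitions cleanly across mappers and reducers, which is what makes the near-linear speedup plausible.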

  7. Neural networks as a control methodology

    NASA Technical Reports Server (NTRS)

    Mccullough, Claire L.

    1990-01-01

    While conventional computers must be programmed in a logical fashion by a person who thoroughly understands the task to be performed, the motivation behind neural networks is to develop machines which can train themselves to perform tasks, using available information about desired system behavior and learning from experience. There are three goals of this fellowship program: (1) to evaluate various neural net methods and generate computer software to implement those deemed most promising on a personal computer equipped with Matlab; (2) to evaluate methods currently in the professional literature for system control using neural nets to choose those most applicable to control of flexible structures; and (3) to apply the control strategies chosen in (2) to a computer simulation of a test article, the Control Structures Interaction Suitcase Demonstrator, which is a portable system consisting of a small flexible beam driven by a torque motor and mounted on springs tuned to the first flexible mode of the beam. Results of each are discussed.

  8. Applications of artificial intelligence to mission planning

    NASA Technical Reports Server (NTRS)

    Ford, Donnie R.; Rogers, John S.; Floyd, Stephen A.

    1990-01-01

    The scheduling problem facing NASA-Marshall mission planning is extremely difficult for several reasons. The most critical factor is the computational complexity involved in developing a schedule: the size of the search space is large along some dimensions and infinite along others. Because of this and other difficulties, many conventional operations research techniques are infeasible or inadequate to solve the problems by themselves. Therefore, the purpose is to examine various artificial intelligence (AI) techniques to assist or replace conventional techniques. The specific tasks performed were as follows: (1) to identify mission planning applications for object-oriented and rule-based programming; (2) to investigate interfacing AI-dedicated hardware (Lisp machines) to VAX hardware; (3) to demonstrate how Lisp may be called from within FORTRAN programs; (4) to investigate and report on programming techniques used in some commercial AI shells, such as the Knowledge Engineering Environment (KEE); and (5) to study and report on algorithmic methods to reduce complexity as related to AI techniques.

  9. MHSS: a material handling system simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pomernacki, L.; Hollstien, R.B.

    1976-04-07

    A Material Handling System Simulator (MHSS) program is described that provides specialized functional blocks for modeling and simulation of nuclear material handling systems. Models of nuclear fuel fabrication plants may be built using functional blocks that simulate material receiving, storage, transport, inventory, processing, and shipping operations as well as the control and reporting tasks of operators or on-line computers. Blocks are also provided that allow the user to observe and gather statistical information on the dynamic behavior of simulated plants over single or replicated runs. Although it is currently being developed for the nuclear materials handling application, MHSS can be adapted to other industries in which material accountability is important. In this paper, emphasis is on the simulation methodology of the MHSS program with application to the nuclear material safeguards problem.
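The functional-block idea can be sketched minimally. This is illustrative Python, not the MHSS code: each block logs the batches it handles to a shared ledger, so material remains accountable from receiving through shipping, and a balance check flags any loss between stages:

```python
# Each functional block processes a batch and records it in the inventory
# ledger; chaining blocks models the plant's material flow.
class Block:
    def __init__(self, name, ledger):
        self.name = name
        self.ledger = ledger

    def process(self, batch_id, mass_kg):
        self.ledger.append((self.name, batch_id, mass_kg))
        return batch_id, mass_kg

ledger = []
line = [Block(n, ledger) for n in ("receiving", "storage", "processing", "shipping")]

# Two hypothetical batches flow through the whole line.
for batch_id, mass in [("B1", 4.0), ("B2", 2.5)]:
    for block in line:
        batch_id, mass = block.process(batch_id, mass)

# Safeguards-style material balance: mass recorded at receiving must equal
# mass recorded at shipping.
received = sum(m for name, _, m in ledger if name == "receiving")
shipped = sum(m for name, _, m in ledger if name == "shipping")
```

In a real safeguards model, each block would also apply losses, holdup, and delays, and the balance check would run per measurement period rather than once.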

  10. FY 1987 current fiscal year work plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    This Current Year Work Plan presents a detailed description of the activities to be performed by the Joint Integration Office during FY87. It breaks down the activities into two major work areas: Program Management and Program Analysis. Program Management is performed by the JIO by providing technical planning and guidance for the development of advanced TRU waste management capabilities. This includes equipment/facility design, engineering, construction, and operations. These functions are integrated to allow transition from interim storage to final disposition. JIO tasks include program requirements identification, long-range technical planning, budget development, program planning document preparation, task guidance, task monitoring, information gathering and task reporting to DOE, interfacing with other agencies and DOE lead programs, integrating public involvement with program efforts, and preparation of program status reports for DOE. Program Analysis is performed by the JIO to support identification and assessment of alternatives, and development of long-term TRU waste program capabilities. This work plan includes: system analyses, requirements analyses, interim and procedure development, legislative and regulatory analyses, dispatch and traffic analyses, and databases.

  11. Application of Cognitive Task Analysis in User Requirements and Prototype Design Presentation/Briefing

    DTIC Science & Technology

    2005-10-01

    AFRL-HE-WP-TP-2005-0030, Air Force Research Laboratory: presentation/briefing on the application of cognitive task analysis in user requirements definition and prototype design (contract FA8650-04-C-6406; presented by Christopher Curtis).

  12. A scientific assessment of a new technology orbital telescope

    NASA Technical Reports Server (NTRS)

    1995-01-01

    As part of a program designed to test the Alpha chemical laser weapons system in space, the Ballistic Missile Defense Organization (BMDO) developed components of an agile, lightweight, 4-meter telescope equipped with an advanced active-optics system. BMDO had proposed to make space available in the telescope's focal plane for instrumentation optimized for scientific applications in astrophysics and planetary astronomy for a potential flight mission. Such a flight mission could be undertaken if new or additional sponsorship can be found. Despite this uncertainty, BMDO requested assistance in defining the instrumentation and other design aspects necessary to enhance the scientific value of a pointing and tracking mission. In response to this request, the Space Studies Board established the Task Group on BMDO New Technology Orbital Observatory (TGBNTOO) and charged it to: (1) provide instrumentation, data management, and science-operations advice to BMDO to optimize the scientific value of a 4-meter mission; and (2) support a Space Studies Board assessment of the relative scientific merit of the program. This report deals with the first of these tasks, assessing the scientific potential of the Advanced Technology Demonstrator (ATD) program. Given the potential scientific aspects of the 4-meter telescope, this project is referred to as the New Technology Orbital Telescope (NTOT), or as the ATD/NTOT, to emphasize its dual-use character. The task group's basic conclusion is that the ATD/NTOT mission does have the potential for contributing in a major way to astronomical goals.

  13. Predictors of FIFA 11+ Implementation Intention in Female Adolescent Soccer: An Application of the Health Action Process Approach (HAPA) Model

    PubMed Central

    McKay, Carly D.; Merrett, Charlotte K.; Emery, Carolyn A.

    2016-01-01

    The Fédération Internationale de Football (FIFA) 11+ warm-up program is efficacious at preventing lower limb injury in youth soccer; however, there has been poor adoption of the program in the community. The purpose of this study was to determine the utility of the Health Action Process Approach (HAPA) behavior change model in predicting intention to use the FIFA 11+ in a sample of 12 youth soccer teams (coaches n = 10; 12–16 year old female players n = 200). A bespoke cross-sectional questionnaire measured pre-season risk perceptions, outcome expectancies, task self-efficacy, facilitators, barriers, and FIFA 11+ implementation intention. Most coaches (90.0%) and players (80.0%) expected the program to reduce injury risk but reported limited intention to use it. Player data demonstrated an acceptable fit to the hypothesized model (standardized root mean square residual (SRMR) = 0.08; root mean square error of approximation (RMSEA) = 0.06 (0.047–0.080); comparative fit index (CFI) = 0.93; Tucker Lewis index (TLI) = 0.91). Task self-efficacy (β = 0.53, p ≤ 0.01) and outcome expectancies (β = 0.13, p ≤ 0.05) were positively associated with intention, but risk perceptions were not (β = −0.02). The findings suggest that the HAPA model is appropriate for use in this context, and highlight the need to target task self-efficacy and outcome expectancies in FIFA 11+ implementation strategies. PMID:27399746

  14. Automation and robotics technology for intelligent mining systems

    NASA Technical Reports Server (NTRS)

    Welsh, Jeffrey H.

    1989-01-01

    The U.S. Bureau of Mines is approaching the problems of accidents and efficiency in the mining industry through the application of automation and robotics to mining systems. This technology can increase safety by removing workers from hazardous areas of the mines or from performing hazardous tasks. The short-term goal of the Automation and Robotics program is to develop technology that can be implemented in the form of an autonomous mining machine using current continuous mining machine equipment. In the longer term, the goal is to conduct research that will lead to new intelligent mining systems that capitalize on the capabilities of robotics. The Bureau of Mines Automation and Robotics program has been structured to produce the technology required for the short- and long-term goals. The short-term goal of application of automation and robotics to an existing mining machine, resulting in autonomous operation, is expected to be accomplished within five years. Key technology elements required for an autonomous continuous mining machine are well underway and include machine navigation systems, coal-rock interface detectors, machine condition monitoring, and intelligent computer systems. The Bureau of Mines program is described, including status of key technology elements for an autonomous continuous mining machine, the program schedule, and future work. Although the program is directed toward underground mining, much of the technology being developed may have applications for space systems or mining on the Moon or other planets.

  15. After-effects of human-computer interaction indicated by P300 of the event-related brain potential.

    PubMed

    Trimmel, M; Huber, R

    1998-05-01

    After-effects of human-computer interaction (HCI) were investigated by using the P300 component of the event-related brain potential (ERP). Forty-nine subjects (naive non-users, beginners, experienced users, programmers) completed three paper/pencil tasks (text editing, solving intelligence test items, filling out a questionnaire on sensation seeking) and three HCI tasks (text editing, executing a tutor program or programming, playing Tetris). The sequence of 7-min tasks was randomized between subjects and balanced between groups. After each experimental condition ERPs were recorded during an acoustic discrimination task at F3, F4, Cz, P3 and P4. Data indicate that: (1) mental after-effects of HCI can be detected by P300 of the ERP; (2) HCI showed in general a reduced amplitude; (3) P300 amplitude varied also with type of task, mainly at F4 where it was smaller after cognitive tasks (intelligence test/programming) and larger after emotion-based tasks (sensation seeking/Tetris); (4) cognitive tasks showed shorter latencies; (5) latencies were widely location-independent (within the range of 356-358 ms at F3, F4, P3 and P4) after executing the tutor program or programming; and (6) all observed after-effects were independent of the user's experience in operating computers and may therefore reflect short-term after-effects only and no structural changes of information processing caused by HCI.

  16. JSpOC Mission System Application Development Environment

    NASA Astrophysics Data System (ADS)

    Luce, R.; Reele, P.; Sabol, C.; Zetocha, P.; Echeverry, J.; Kim, R.; Golf, B.

    2012-09-01

    The Joint Space Operations Center (JSpOC) Mission System (JMS) is the program of record tasked with replacing the legacy Space Defense Operations Center (SPADOC) and Astrodynamics Support Workstation (ASW) capabilities by the end of FY2015 as well as providing additional Space Situational Awareness (SSA) and Command and Control (C2) capabilities post-FY2015. To meet the legacy replacement goal, the JMS program is maturing a government Service Oriented Architecture (SOA) infrastructure that supports the integration of mission applications while acquiring mature industry and government mission applications. Future capabilities required by the JSpOC after 2015 will require development of new applications and procedures as well as the exploitation of new SSA data sources. To support the post FY2015 efforts, the JMS program is partnering with the Air Force Research Laboratory (AFRL) to build a JMS application development environment. The purpose of this environment is to: 1) empower the research & development community, through access to relevant tools and data, to accelerate technology development, 2) allow the JMS program to communicate user capability priorities and requirements to the developer community, 3) provide the JMS program with access to state-of-the-art research, development, and computing capabilities, and 4) support market research efforts by identifying outstanding performers that are available to shepherd into the formal transition process. The application development environment will consist of both unclassified and classified environments that can be accessed over common networks (including the Internet) to provide software developers, scientists, and engineers everything they need (e.g., building block JMS services, modeling and simulation tools, relevant test scenarios, documentation, data sources, user priorities/requirements, and SOA integration tools) to develop and test mission applications. 
The developed applications will be exercised in these relevant environments with representative data sets to help bridge the gap between development and integration into the operational JMS enterprise.

  17. User-Assisted Store Recycling for Dynamic Task Graph Schedulers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurt, Mehmet Can; Krishnamoorthy, Sriram; Agrawal, Gagan

    The emergence of the multi-core era has led to increased interest in designing effective yet practical parallel programming models. Models based on task graphs that operate on single-assignment data are attractive in several ways: they can support dynamic applications and precisely represent the available concurrency. However, they also require nuanced algorithms for scheduling and memory management for efficient execution. In this paper, we consider memory-efficient dynamic scheduling of task graphs. Specifically, we present a novel approach for dynamically recycling the memory locations assigned to data items as they are produced by tasks. We develop algorithms to identify memory-efficient store recycling functions by systematically evaluating the validity of a set of (user-provided or automatically generated) alternatives. Because the recycling function can be input-data-dependent, we have also developed support for continued correct execution of a task graph in the presence of a potentially incorrect store recycling function. Experimental evaluation demonstrates that our approach to automatic store recycling incurs little to no overhead, achieves memory usage comparable to the best manually derived solutions, often produces recycling functions valid across problem sizes and input parameters, and efficiently recovers from an incorrect choice of store recycling functions.
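The validity check at the heart of such an approach can be sketched as follows (illustrative names only, not the authors' API, and assuming for simplicity a fixed sequential schedule): a candidate recycling function is valid iff no two data items with overlapping lifetimes are mapped to the same memory slot:

```python
# lifetimes: item -> (produced_at, last_used_at) positions in the schedule.
# slot_of:   candidate recycling function mapping item -> memory slot.
def recycling_valid(lifetimes, slot_of):
    """Return True iff no two simultaneously-live items share a slot."""
    by_slot = {}
    for item, (start, end) in lifetimes.items():
        by_slot.setdefault(slot_of(item), []).append((start, end))
    for intervals in by_slot.values():
        intervals.sort()
        for (s1, e1), (s2, e2) in zip(intervals, intervals[1:]):
            # Next item is produced before the previous item's last use:
            # the slot would be overwritten while still live.
            if s2 <= e1:
                return False
    return True

# Hypothetical three-item graph: "a" and "b" are live at the same time,
# so they must not share a slot; "c" starts after "a" ends and may reuse it.
lifetimes = {"a": (0, 2), "b": (1, 3), "c": (3, 5)}
two_slots = lambda i: {"a": 0, "b": 1, "c": 0}[i]   # valid: reuses slot 0 for c
one_slot = lambda i: 0                              # invalid: a and b collide
```

Systematically running this check over a set of candidate functions, and falling back to a safe default when a candidate fails at run time, mirrors the evaluate-then-recover structure the abstract describes.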

  18. Elevated temperature crack growth

    NASA Technical Reports Server (NTRS)

    Kim, K. S.; Vanstone, R. H.

    1992-01-01

    The purpose of this program was to extend the work performed in the base program (CR 182247) into the regime of time-dependent crack growth under isothermal and thermal mechanical fatigue (TMF) loading, where creep deformation also influences the crack growth behavior. The investigation was performed in a two-year, six-task, combined experimental and analytical program. The path-independent integrals for application to time-dependent crack growth were critically reviewed. The crack growth was simulated using a finite element method, and the path-independent integrals were computed from the results of the finite-element analyses. The ability of these integrals to correlate experimental crack growth data was evaluated under various loading and temperature conditions. The results indicate that some of these integrals are viable parameters for crack growth prediction at elevated temperatures.
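For reference, the best-known integral in this family is Rice's J-integral in its standard two-dimensional form (the time-dependent generalizations evaluated in the program are not reproduced here):

```latex
J = \int_{\Gamma} \left( W \, \mathrm{d}y \;-\; T_i \, \frac{\partial u_i}{\partial x} \, \mathrm{d}s \right)
```

where Γ is a contour surrounding the crack tip, W is the strain energy density, T_i the traction vector on Γ, u_i the displacement vector, and ds arc length along the contour; path independence is what allows the integral to be evaluated away from the poorly resolved crack-tip fields.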

  19. Automatic programming of simulation models

    NASA Technical Reports Server (NTRS)

    Schroer, Bernard J.; Tseng, Fan T.; Zhang, Shou X.; Dwan, Wen S.

    1990-01-01

    The concepts of software engineering were used to improve the simulation modeling environment. Emphasis was placed on applying an element of rapid prototyping, or automatic programming, to assist the modeler in defining the problem specification. Once the problem specification has been defined, an automatic code generator is used to write the simulation code. Two domains were selected for evaluating the concepts of software engineering for discrete event simulation: a manufacturing domain and a spacecraft countdown network sequence. The specific tasks were to: (1) define the software requirements for a graphical user interface to the Automatic Manufacturing Programming System (AMPS); (2) develop a graphical user interface for AMPS; and (3) compare the AMPS graphical interface with the AMPS interactive user interface.
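The automatic-programming idea, a declarative specification in and executable simulation code out, can be caricatured in a few lines (a hypothetical miniature; the real AMPS specifications and generated simulation code are far richer):

```python
# Hypothetical problem specification: station names and a fixed interarrival time.
spec = {"stations": ["mill", "lathe"], "interarrival": 5}

def generate(spec):
    """Emit Python source for a trivial flow-line simulation from the spec."""
    lines = [
        "def simulate(n_parts):",
        "    t = 0",
        "    log = []",
        "    for p in range(n_parts):",
        f"        t += {spec['interarrival']}",
    ]
    for station in spec["stations"]:
        lines.append(f"        log.append((p, {station!r}, t))")
    return "\n".join(lines) + "\n    return log"

# The generated code is ordinary source: compile it and run the model.
code = generate(spec)
namespace = {}
exec(code, namespace)
log = namespace["simulate"](2)
```

The point of the pattern is the separation of concerns: the modeler edits only `spec`, and the generator owns the (here trivial) simulation idiom.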

  20. New Mexico statewide geothermal energy program. Final technical report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Icerman, L.; Parker, S.K.

    1988-04-01

    This report summarizes the results of geothermal energy resource assessment work conducted by the New Mexico Statewide Geothermal Energy Program during the period September 7, 1984, through February 29, 1988, under the sponsorship of the US Dept. of Energy and the State of New Mexico Research and Development Institute. The research program was administered by the New Mexico Research and Development Institute and was conducted by professional staff members at New Mexico State University and Lightning Dock Geothermal, Inc. The report is divided into four chapters, which correspond to the principal tasks delineated in the above grant. This work extends the knowledge of the geothermal energy resource base in southern New Mexico with the potential for commercial applications.

  1. Task Description Language

    NASA Technical Reports Server (NTRS)

    Simmons, Reid; Apfelbaum, David

    2005-01-01

    Task Description Language (TDL) is an extension of the C++ programming language that enables programmers to quickly and easily write complex, concurrent computer programs for controlling real-time autonomous systems, including robots and spacecraft. TDL is based on earlier work (circa 1984 through 1989) on the Task Control Architecture (TCA). TDL provides syntactic support for hierarchical task-level control functions, including task decomposition, synchronization, execution monitoring, and exception handling. A Java-language-based compiler transforms TDL programs into pure C++ code that includes calls to a platform-independent task-control-management (TCM) library. TDL has been used to control and coordinate multiple heterogeneous robots in projects sponsored by NASA and the Defense Advanced Research Projects Agency (DARPA). It has also been used in Brazil to control an autonomous airship and in Canada to control a robotic manipulator.
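TDL's control constructs can be illustrated conceptually. This is a Python analogue only: TDL itself extends C++, and its real syntax and TCM library are not reproduced here. The sketch shows the ideas the abstract lists, task decomposition (children before parent completion) and exception handling (recover and continue):

```python
class Task:
    """A node in a hierarchical task tree, run depth-first."""
    def __init__(self, name, action=None):
        self.name, self.action, self.children = name, action, []

    def add(self, child):
        self.children.append(child)
        return child

    def run(self, trace):
        for child in self.children:   # decomposition: subtasks run first
            child.run(trace)
        if self.action:
            try:
                self.action()         # execute this task's own work
            except RuntimeError:      # exception handling: recover locally
                trace.append(self.name + ": recovered")
                return
        trace.append(self.name + ": done")

def failing_grasp():
    raise RuntimeError("gripper fault")  # hypothetical hardware failure

trace = []
root = Task("deliver_sample")
root.add(Task("navigate", lambda: None))
root.add(Task("grasp", failing_grasp))
root.run(trace)
```

Real TDL adds what this sketch omits: concurrency between sibling tasks, synchronization constraints, and execution monitors, all expressed syntactically and compiled down to TCM library calls.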

  2. Cooperative Scheduling of Imaging Observation Tasks for High-Altitude Airships Based on Propagation Algorithm

    PubMed Central

    Chuan, He; Dishan, Qiu; Jin, Liu

    2012-01-01

    The cooperative scheduling problem on high-altitude airships for imaging observation tasks is discussed. A constraint programming model is established by analyzing the main constraints, taking the maximum task benefit and the minimum cruising distance as two optimization objectives. The cooperative scheduling problem of high-altitude airships is converted into a main problem and a subproblem by adopting a hierarchical architecture. The solution to the main problem constructs a preliminary matching between tasks and observation resources in order to reduce the search space of the original problem, while the solution to the subproblem detects the key nodes that each airship needs to fly through in sequence, so as to obtain the cruising path. First, the task set is divided using the k-core neighborhood growth cluster algorithm (K-NGCA). Then, a novel swarm intelligence algorithm named the propagation algorithm (PA) is combined with the key node search algorithm (KNSA) to optimize the cruising path of each airship and determine the execution time interval of each task. This paper also provides the realization approach of the above algorithms, with a detailed introduction to the encoding rules, search models, and propagation mechanism of the PA. Finally, the application results and comparative analysis show that the proposed models and algorithms are effective and feasible. PMID:23365522
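The two-level structure can be illustrated with a greedy stand-in (the paper's K-NGCA, PA, and KNSA are specialized algorithms not reproduced here, and all coordinates below are hypothetical): the main problem assigns task clusters to airships, and the subproblem orders each airship's key nodes to shorten the cruising path:

```python
import math

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def nearest_neighbor_path(start, nodes):
    """Subproblem stand-in: order key nodes greedily from the start point."""
    path, here, todo = [], start, list(nodes)
    while todo:
        nxt = min(todo, key=lambda n: dist(here, n))
        todo.remove(nxt)
        path.append(nxt)
        here = nxt
    return path

# Main-problem stand-in: each cluster of task locations goes to the airship
# whose start position is closest to the cluster's first task.
starts = {"A1": (0.0, 0.0), "A2": (10.0, 0.0)}
clusters = [[(1.0, 1.0), (2.0, 0.0)], [(9.0, 1.0), (8.0, 0.0)]]
assignment = {min(starts, key=lambda a: dist(starts[a], c[0])): c for c in clusters}
routes = {a: nearest_neighbor_path(starts[a], nodes) for a, nodes in assignment.items()}
```

The decomposition is what matters: solving assignment first shrinks the routing search space per airship, which is the same rationale the paper gives for its main-problem/subproblem split.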

  3. SCTE: An open-source Perl framework for testing equipment control and data acquisition

    NASA Astrophysics Data System (ADS)

    Mostaço-Guidolin, Luiz C.; Frigori, Rafael B.; Ruchko, Leonid; Galvão, Ricardo M. O.

    2012-07-01

    SCTE intends to provide a simple yet powerful framework for building data acquisition and equipment control systems for experimental physics and related areas. Via its SCTE::Instrument module, RS-232, USB, and LAN buses are supported, and the intricacies of hardware communication are encapsulated beneath an object-oriented abstraction layer. Written in Perl and built on the SCPI protocol, the framework allows enabled instruments to be easily programmed to perform a wide variety of tasks. While this work presents general aspects of the development of data acquisition systems using the SCTE framework, it is illustrated by particular applications designed for the calibration of several in-house developed devices for power measurement in the tokamak TCABR Alfvén Waves Excitement System.
    Catalogue identifier: AELZ_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AELZ_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: GNU General Public License Version 3
    No. of lines in distributed program, including test data, etc.: 13 811
    No. of bytes in distributed program, including test data, etc.: 743 709
    Distribution format: tar.gz
    Programming language: Perl version 5.10.0 or higher
    Computer: PC; SCPI-capable digital oscilloscope with RS-232, USB, or LAN communication ports; null modem, USB, or Ethernet cables
    Operating system: GNU/Linux (2.6.28-11); should also work on any Unix-based operating system
    Classification: 4.14
    External routines: Perl modules Device::SerialPort, Term::ANSIColor, Math::GSL, Net::HTTP; Gnuplot 4.0 or higher
    Nature of problem: Automation of experiments and data acquisition often requires expensive equipment and in-house development of software applications. Personal computers and test equipment now come with fast and easy-to-use communication ports, and instrument vendors often supply application programs capable of controlling such devices, but these programs are very restricted in functionality; for instance, they cannot control more than one test instrument at a time or automate repetitive tasks. SCTE provides a way of using auxiliary equipment to automate experimental procedures at low cost using only a free, open-source operating system and libraries.
    Solution method: SCTE provides a Perl module that implements RS-232, USB, and LAN communication, allowing the use of SCPI-capable instruments [1] and thereby providing a straightforward way of creating automation and data acquisition applications with personal computers and test instruments [2].
    References: [1] SCPI Consortium, Standard Commands for Programmable Instruments, 1999, http://www.scpiconsortium.org. [2] L.C.B. Mostaço-Guidolin, Determinação da configuração de ondas de Alfvén excitadas no tokamak TCABR, Master's thesis, Universidade de São Paulo (2007), http://www.teses.usp.br/teses/disponiveis/43/43134/tde-23042009-230419/.

  4. Enhancing UCSF Chimera through web services

    PubMed Central

    Huang, Conrad C.; Meng, Elaine C.; Morris, John H.; Pettersen, Eric F.; Ferrin, Thomas E.

    2014-01-01

    Integrating access to web services with desktop applications allows for an expanded set of application features, including performing computationally intensive tasks and convenient searches of databases. We describe how we have enhanced UCSF Chimera (http://www.rbvi.ucsf.edu/chimera/), a program for the interactive visualization and analysis of molecular structures and related data, through the addition of several web services (http://www.rbvi.ucsf.edu/chimera/docs/webservices.html). By streamlining access to web services, including the entire job submission, monitoring and retrieval process, Chimera makes it simpler for users to focus on their science projects rather than data manipulation. Chimera uses Opal, a toolkit for wrapping scientific applications as web services, to provide scalable and transparent access to several popular software packages. We illustrate Chimera's use of web services with an example workflow that interleaves use of these services with interactive manipulation of molecular sequences and structures, and we provide an example Python program to demonstrate how easily Opal-based web services can be accessed from within an application. Web server availability: http://webservices.rbvi.ucsf.edu/opal2/dashboard?command=serviceList. PMID:24861624
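
The job handling the abstract describes (submission, monitoring, retrieval) follows the generic loop sketched below. This is not Opal's or Chimera's actual API; the status vocabulary and the callback shape are assumptions for illustration of the poll-until-done pattern such a client runs.

```python
import time

def wait_for_job(poll_status, interval=1.0, max_polls=60):
    """Generic web-service job loop: call poll_status() until it reports a
    terminal state. The status names here are illustrative, not Opal's."""
    for _ in range(max_polls):
        status = poll_status()
        if status in ("DONE", "FAILED"):
            return status
        time.sleep(interval)
    raise TimeoutError("job did not finish within max_polls")
```

Streamlining this loop inside the application, as Chimera does, is what lets users stay focused on the structures rather than on job bookkeeping.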

  5. Pilot GPS LORAN Receiver Programming Performance A Laboratory Evaluation

    DOT National Transportation Integrated Search

    1994-02-01

    This study was designed to explore GPS/LORAN receiver programming performance under simulated flight conditions. The programming task consisted of entering, editing, and verifying a four-waypoint flight plan. The task demands were manipulated by ...

  6. Simulation For Task Practice in Technical Training.

    ERIC Educational Resources Information Center

    Mallory, W. J.

    1981-01-01

    Describes two programs used by the Ford Motor Company to train manufacturing skilled trades personnel. Programmable Controller Maintenance Training Program for Industrial Technicians and Troubleshooting Strategy Program use simulation and provide improved task performance after training. (JOW)

  7. Combined Diet and Physical Activity Promotion Programs for Prevention of Diabetes: Community Preventive Services Task Force Recommendation Statement.

    PubMed

    Pronk, Nicolaas P; Remington, Patrick L

    2015-09-15

    This article presents the Community Preventive Services Task Force recommendation on the use of combined diet and physical activity promotion programs to reduce progression to type 2 diabetes in persons at increased risk. The Task Force commissioned an evidence review that assessed the benefits and harms of programs to promote and support individual improvements in diet, exercise, and weight, and supervised a review of the economic efficiency of these programs in clinical trial, primary care, and primary care-referable settings. The target population is adolescents and adults at increased risk for progression to type 2 diabetes. The Task Force recommends the use of combined diet and physical activity promotion programs by health care systems, communities, and other implementers to provide counseling and support to clients identified as being at increased risk for type 2 diabetes. Economic evidence indicates that these programs are cost-effective.

  8. Exploring Aeronautics

    NASA Technical Reports Server (NTRS)

    Robinson, Brandi

    2004-01-01

    This summer I have been working with the N.A.S.A. Project at Cuyahoga Community College (Tri-C) under the title of Exploring Aeronautics Project Leader. The class that I have worked with is composed of students who will enter the eighth grade in the fall of 2004. The program primarily focuses upon math proficiency and individualized class projects. My duties have encompassed both realms. During the first 2-3 weeks of my internship, I worked at NASA Glenn Research Center (GRC) researching, organizing, and compiling information for the weekly Scholastic Challenges and the Super Scholastic Challenge. I was able to complete an overview of the Scholastic Challenge and staff responsibilities regarding the competition; a proposal for an interactive learning system, Quizdom; a schedule for challenge equipment; and a schedule listing submission deadlines for the staff. Also included in my tasks during these first 2-3 weeks was assisting Tammy Allen and Candice Thomas with the application review and interview processes for student applicants. For the student and parent orientation, I was assigned publications and other varying tasks to complete before the start of the program. Upon the commencement of the program, I changed location from NASA GRC to the Tri-C Metro Campus, where student classes for the Cleveland site are held. For the duration of the program, I work with the instructor for the Exploring Aeronautics class, assisting in classroom management, daily attendance, curriculum, project building, and other tasks as needed. These tasks include conducting the weekly competition known as the Scholastic Challenge. As a Project Leader, I am also responsible for one subject area of the Scholastic Challenge aspect of the N.A.S.A. Project curriculum.
Each week I have to prepare a mission that the participants will take home the following Monday and at least 10 questions that will be included in the pool of questions used for the Scholastic Challenge competition on Thursdays. For at least one of these competitions, I must compile all mission and question information submitted by the staff, distribute missions to the students, and enter questions into a Jeopardy-formatted PowerPoint presentation. Unique to the N.A.S.A. Project are its Saturday sessions and opportunities for field trips. As a Project Leader, I am required to attend all field trips and Saturday sessions held for participants and their parent(s)/guardian(s). The Saturday sessions do not require my assistance because they are facilitated by a contracting company, Imhotep, leaving my duties to observation unless instructed otherwise.

  9. About the mechanism of ERP-system pilot test

    NASA Astrophysics Data System (ADS)

    Mitkov, V. V.; Zimin, V. V.

    2018-05-01

    In the paper the mathematical problem of defining the scope of pilot test is stated, which is a task of quadratic programming. The procedure of the problem solving includes the method of network programming based on the structurally similar network representation of the criterion and constraints and which reduces the original problem to a sequence of simpler evaluation tasks. The evaluation tasks are solved by the method of dichotomous programming.

  10. Network issues for large mass storage requirements

    NASA Technical Reports Server (NTRS)

    Perdue, James

    1992-01-01

    File servers and supercomputing environments need high-performance networks to balance the I/O requirements seen in today's demanding computing scenarios. UltraNet is one solution, permitting both high aggregate transfer rates and high task-to-task transfer rates, as demonstrated in actual tests. UltraNet provides this capability as both a server-to-server and server-to-client access network, giving the supercomputing center the following advantages: highest-performance transport-level connections (up to 40 MBytes/sec effective rates); throughput that matches the emerging high-performance disk technologies, such as RAID, parallel-head transfer devices, and software striping; support for standard network and file system applications using a sockets-based application program interface, such as FTP, rcp, rdump, etc.; access to the Network File System (NFS) and large aggregate bandwidth for heavy NFS usage; access to a distributed, hierarchical data server capability using the DISCOS UniTree product; and support for file server solutions available from multiple vendors, including Cray, Convex, Alliant, FPS, IBM, and others.

  11. Simulation verification techniques study: Simulation self test hardware design and techniques report

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The final results of the hardware verification task are presented. The basic objectives of the various subtasks are reviewed, along with the ground rules under which the overall task was conducted and which shaped the approach taken in deriving techniques for hardware self test. The results of the first subtask, the definition of simulation hardware, are presented; the hardware definition is based primarily on a brief review of the simulator configurations anticipated for the shuttle training program. The results of the survey of current self test techniques are also presented. The data sources that were considered in the search for current techniques are reviewed, and the results of the survey are presented in terms of the specific types of tests that are of interest for training simulator applications: readiness tests, fault isolation tests, and incipient fault detection techniques. The most applicable techniques were structured into software flows that are then referenced in discussions of techniques for specific subsystems.

  12. Design and Implementation of a Modern Automatic Deformation Monitoring System

    NASA Astrophysics Data System (ADS)

    Engel, Philipp; Schweimler, Björn

    2016-03-01

    The deformation monitoring of structures and buildings is an important field of modern engineering surveying, ensuring the stability and reliability of supervised objects over a long period. Several commercial hardware and software solutions for the realization of such monitoring measurements are available on the market. In addition, a research team at the University of Applied Sciences in Neubrandenburg (NUAS) is actively developing a software package for monitoring purposes in geodesy and geotechnics, which is distributed under an open source licence and free of charge. The task of managing an open source project is well known in computer science, but it is fairly new in a geodetic context. This paper contributes to that issue by detailing applications, frameworks, and interfaces for the design and implementation of open hardware and software solutions for sensor control, sensor networks, and data management in automatic deformation monitoring. The paper also discusses how the development effort for networked applications can be reduced by using free programming tools, cloud computing technologies, and rapid prototyping methods.

  13. Software Issues in High-Performance Computing and a Framework for the Development of HPC Applications

    DTIC Science & Technology

    1995-01-01

    possible to determine communication points. For this version, a C program spawning POSIX threads and using semaphores to synchronize would have to...performance such as the time required for network communication and synchronization as well as issues of asynchrony and memory hierarchy. For example...enhances reusability. Process (or task) parallel computations can also be succinctly expressed with a small set of process creation and synchronization

  14. Screwworm Eradication Data System (SEDS) applications program documentation, part 2

    NASA Technical Reports Server (NTRS)

    1976-01-01

    National Oceanic and Atmospheric Administration (NOAA) satellite data, which is in analog tape form, is converted into output data products. These products are used by investigators to determine optimum areas for airdrops of sterile screwworm flies over Mexico. The output product of SEDS takes the form of imagery data film slides showing Mexico and the lower southwest portion of the United States. The area analog preprocessing phase of the SEDS task is covered.

  15. Pressure Studies of Protein Dynamics.

    DTIC Science & Technology

    1987-02-20

    Report documentation page excerpt: sponsoring organization Office of Naval Research, contract ONR N00014-86-K-0270; title: Pressure Studies of Protein Dynamics; personal authors: Hans Frauenfelder and Robert D. Young. Abstract fragment: "...relation between dynamic structure and function of proteins... protein dynamics... by observing the phenomena induced by flash photolysis using near-ultraviolet..."

  16. Space station System Engineering and Integration (SE and I). Volume 2: Study results

    NASA Technical Reports Server (NTRS)

    1987-01-01

    A summary of significant study results produced by the Phase B conceptual design task is presented. Major elements are addressed, and study results applicable to each major element or area of design are summarized and included where appropriate. Areas addressed include: system engineering and integration; customer accommodations; test and program verification; product assurance; conceptual design; operations and planning; technical and management information system (TMIS); and advanced development.

  17. Navy LPD-17 Amphibious Ship Procurement: Background, Issues, and Options for Congress

    DTIC Science & Technology

    2010-07-01

    Report documentation page excerpt: "...performed out of sequence and significant rework has been required, disrupting the optimal construction sequence and application of lessons learned... deeply concerned about Northrop Grumman Ship Systems' (NGSS) ability to recover in the aftermath of Hurricane Katrina, particularly in regard to..."

  18. Navy LPD-17 Amphibious Ship Procurement: Background, Issues, and Options for Congress

    DTIC Science & Technology

    2010-06-10

    Report documentation page excerpt: "...out of sequence and significant rework has been required, disrupting the optimal construction sequence and application of lessons learned for... concerned about Northrop Grumman Ship Systems' (NGSS) ability to recover in the aftermath of Hurricane Katrina, particularly in regard to construction..."

  19. Navy LPD-17 Amphibious Ship Procurement: Background, Issues, and Options for Congress

    DTIC Science & Technology

    2010-03-29

    Report documentation page excerpt: "...performed out of sequence and significant rework has been required, disrupting the optimal construction sequence and application of lessons learned... deeply concerned about Northrop Grumman Ship Systems' (NGSS) ability to recover in the aftermath of Hurricane Katrina, particularly in regard to..."

  20. An Application of Instructional System Development to Determine Financial Management Education Needs for Logistics Management Positions.

    DTIC Science & Technology

    1976-09-01

    The purpose of this research effort was to determine the financial management educational needs of USAF graduate logistics positions. Goal analysis...was used to identify financial management techniques and task analysis was used to develop a method to identify the use of financial management techniques...positions. The survey identified financial management techniques in five areas: cost accounting, capital budgeting, working capital, financial forecasting, and programming. (Author)

  1. Overview of the NASA automation and robotics research program

    NASA Technical Reports Server (NTRS)

    Holcomb, Lee; Larsen, Ron

    1985-01-01

    NASA studies over the last eight years have identified five opportunities for the application of automation and robotics technology: (1) satellite servicing; (2) system monitoring, control, sequencing and diagnosis; (3) space manufacturing; (4) space structure assembly; and (5) planetary rovers. The development of these opportunities entails two technology R&D thrusts: telerobotics and system autonomy; both encompass such concerns as operator interface, task planning and reasoning, control execution, sensing, and systems integration.

  2. DebrisLV Hypervelocity Impact Post-Shot Physical Results Summary

    DTIC Science & Technology

    2015-02-27

    Report documentation page excerpt: authors Patti M. Sheaffer and Christopher Hartney (Space Science Applications Laboratory), Paul M. Adams (Space Materials...), and Naoki Hemmi, Physical Sciences Laboratories. Acknowledgment fragment: "...could not have been acquired without the active help and support of NASA (J.-C. Liou, Robert Markowicz); Jacobs Technologies (John Opiela..."

  3. GPHS-RTGs in support of the Cassini mission

    NASA Astrophysics Data System (ADS)

    1994-04-01

    This report is organized by the program task structure as follows: (1) spacecraft integration and liaison; (2) engineering support; (3) safety; (4) qualified unicouple fabrication; (5) ETG fabrication, assembly, and test; (6) ground support equipment (GSE); (7) RTG shipping and launch support; (8) designs, reviews, and mission applications; (9) project management, quality assurance and reliability, contract changes, noncapital contractor acquired government owned property (CAGO) acquisition, and CAGO maintenance; and (10) CAGO acquisition.

  4. Targeting histone abnormality in triple negative breast cancer

    DTIC Science & Technology

    2017-08-01

    Report documentation page excerpt: PI Steffi Oesterreich, PhD (e-mail: oesterreichs@upmc.edu); distribution statement: approved for public release, distribution unlimited. Abstract fragment: "During this funding period, Dr. Nancy E..." Under products, the report lists nothing to report for technologies or techniques; inventions, patent applications, and/or licenses; and other products.

  5. Integrated propulsion technology demonstrator. Program plan

    NASA Technical Reports Server (NTRS)

    1994-01-01

    NASA and Rockwell have embarked on a cooperative agreement to define, develop, fabricate, and operate an integrated propulsion technology demonstrator (IPTD) for the purpose of validating design, process, and technology improvements of launch vehicle propulsion systems. This program, a result of NRA8-11, Task Area 1 A, is jointly funded by NASA and Rockwell and is sponsored by the Reusable Launch Vehicle office at NASA Marshall Space Flight Center. This program plan provides the joint NASA/Rockwell IPTD team a description of the activities within tasks/subtasks and the associated schedules required to successfully achieve program objectives. This document also defines the cost elements and manpower allocations for each subtask for purposes of program control. The plan is updated periodically by developing greater depth of direction for out-year tasks as the program matures. Updating is accomplished by adding revisions to existing pages or attaching page revisions to this plan; in either case, revisions are identified by appropriate highlighting of the change or by specifying a revision page through the use of footnotes at the bottom right of each change page. Authorization for the change is provided by the principal investigators to maintain control of this program plan document and IPTD program activities.

  6. A lightweight messaging-based distributed processing and workflow execution framework for real-time and big data analysis

    NASA Astrophysics Data System (ADS)

    Laban, Shaban; El-Desouky, Aly

    2014-05-01

    To achieve rapid, simple, and reliable parallel processing of different types of tasks and big data processing on any compute cluster, a lightweight messaging-based distributed applications processing and workflow execution framework model is proposed. The framework is based on Apache ActiveMQ and the Simple (or Streaming) Text Oriented Message Protocol (STOMP). ActiveMQ, a popular and powerful open-source persistent messaging and integration patterns server with scheduler capabilities, acts as the message broker in the framework. STOMP provides an interoperable wire format that allows framework programs to talk and interact with each other and with ActiveMQ easily. In order to use the message broker efficiently, a unified message and topic naming pattern is utilized. Only three Python programs and a simple library, which unifies and simplifies the use of ActiveMQ and the STOMP protocol, are needed to use the framework. A watchdog program is used to monitor, remove, add, start, and stop any machine and/or its different tasks when necessary. For every machine, exactly one dedicated zookeeper program is used to start the different functions or tasks, including the stompShell program needed for executing the user-required workflow. The stompShell instances execute workflow jobs based on received messages. A well-defined, simple, and flexible message structure, based on JavaScript Object Notation (JSON), is used to build complex workflow systems; JSON is also used for configuration and for communication between machines and programs. The framework is platform independent, and although it is built using Python, the actual workflow programs or jobs can be implemented in any programming language.
The generic framework can be used in small national data centres for processing seismological and radionuclide data received from the International Data Centre (IDC) of the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). It is also possible to extend the use of the framework to monitoring the IDC pipeline. The detailed design, implementation, conclusions, and future work of the proposed framework are presented.
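
As a sketch of the JSON-based message structure idea, a control message can be built and decoded with the standard library alone. The field names below are hypothetical, chosen for illustration, and are not the framework's actual schema.

```python
import json

def make_task_message(workflow, task, machine, payload):
    """Serialize a framework-style control message as JSON. Field names are
    illustrative; the paper's actual schema is not reproduced here."""
    return json.dumps({"workflow": workflow, "task": task,
                       "machine": machine, "payload": payload})

def parse_task_message(text):
    """Decode a message back into a dict for dispatch by a stompShell-like worker."""
    return json.loads(text)
```

Because the wire format is plain JSON over STOMP, a worker written in any language can produce or consume the same messages, which is what makes the workflow jobs language independent.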

  7. On Why It Is Impossible to Prove that the BDX930 Dispatcher Implements a Time-sharing System

    NASA Technical Reports Server (NTRS)

    Boyer, R. S.; Moore, J. S.

    1983-01-01

    The Software Implemented Fault Tolerance (SIFT) system is written in PASCAL except for about a page of machine code. The SIFT system implements a small time-sharing system in which PASCAL programs for separate application tasks are executed according to a schedule with real-time constraints. The PASCAL language has no provision for handling the notion of an interrupt such as the BDX930 clock interrupt. The PASCAL language also lacks the notion of running a PASCAL subroutine for a given amount of time, suspending it, saving away the suspension, and later activating the suspension. Machine code was used to overcome these inadequacies of PASCAL; code which handles clock interrupts and suspends processes is called a dispatcher. The time-sharing/virtual machine idea is completely destroyed by the reconfiguration task: after termination of the reconfiguration task, the tasks run by the dispatcher have no relation to those run before reconfiguration. It is impossible to view the dispatcher as a time-sharing system implementing virtual BDX930s running concurrently when one process can wipe out the others.

  8. Laser ablation/ionization characterization of solids: Second interim progress report of the strategic environmental research development program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hess, W.P.; Bushaw, B.A.; McCarthy, M.I.

    1996-10-01

    The Department of Energy is undertaking the enormous task of remediating defense wastes and environmental insults which have occurred over 50 years of nuclear weapons production. It is abundantly clear that significant technology advances are needed to characterize, process, and store highly radioactive waste and to remediate contaminated zones. In addition to the processing and waste form issues, analytical technologies needed for the characterization of solids, and for monitoring storage tanks and contaminated sites, do not exist or are currently expensive, labor-intensive tasks. This report describes progress in developing sensitive, rapid, and widely applicable laser-based mass spectrometry techniques for analysis of mixed chemical wastes and contaminated soils.

  9. Research, development and demonstration of nickel-zinc batteries for electric vehicle propulsion

    NASA Astrophysics Data System (ADS)

    1980-06-01

    The feasibility of the nickel-zinc battery for electric vehicle propulsion is discussed. The program is divided into seven distinct but highly interactive tasks collectively aimed at the development and commercialization of nickel-zinc technology. These basic technical tasks are separator development, electrode development, product design and analysis, cell/module battery testing, process development, pilot manufacturing, and thermal management. Significant progress has been made in the understanding of separator failure mechanisms, and a generic category of materials has been specified for 300+ deep-discharge applications. Shape change has been reduced significantly. Progress in the area of thermal management was also significant, with the development of a model that accurately represents heat generation and rejection rates during battery operation.

  10. Small Engine Technology (SET) - Task 13 ANOPP Noise Prediction for Small Engines: Jet Noise Prediction Module, Wing Shielding Module, and System Studies Results

    NASA Technical Reports Server (NTRS)

    Lieber, Lysbeth; Golub, Robert (Technical Monitor)

    2000-01-01

    This Final Report has been prepared by AlliedSignal Engines and Systems, Phoenix, Arizona, documenting work performed during the period May 1997 through June 1999 under the Small Engines Technology Program, Contract No. NAS3-27483, Task Order 13, ANOPP Noise Prediction for Small Engines. The report specifically covers the work performed under Subtasks 4, 5, and 6. Subtask 4 describes the application of a semi-empirical procedure for jet noise prediction; Subtask 5 describes the development of a procedure to predict the effects of wing shielding; and Subtask 6 describes the results of system studies of the benefits of the new noise technology on business and regional aircraft.

  11. GLAD: a system for developing and deploying large-scale bioinformatics grid.

    PubMed

    Teo, Yong-Meng; Wang, Xianbing; Ng, Yew-Kwong

    2005-03-01

    Grid computing is used to solve large-scale bioinformatics problems involving gigabyte-scale databases by distributing the computation across multiple platforms. Until now, in developing bioinformatics grid applications it has been extremely tedious to design and implement the component algorithms and parallelization techniques for different classes of problems, and to access remotely located sequence database files of varying formats across the grid. In this study, we propose a grid programming toolkit, GLAD (Grid Life sciences Applications Developer), which facilitates the development and deployment of bioinformatics applications on a grid. GLAD has been developed using ALiCE (Adaptive scaLable Internet-based Computing Engine), a Java-based grid middleware that exploits task-based parallelism. Two benchmark bioinformatics applications, distributed sequence comparison and distributed progressive multiple sequence alignment, have been developed using GLAD.

  12. High-Throughput Tabular Data Processor - Platform independent graphical tool for processing large data sets.

    PubMed

    Madanecki, Piotr; Bałut, Magdalena; Buckley, Patrick G; Ochocka, J Renata; Bartoszewski, Rafał; Crossman, David K; Messiaen, Ludwine M; Piotrowski, Arkadiusz

    2018-01-01

    High-throughput technologies generate considerable amounts of data which often require bioinformatic expertise to analyze. Here we present High-Throughput Tabular Data Processor (HTDP), a platform independent Java program. HTDP works on any character-delimited column data (e.g. BED, GFF, GTF, PSL, WIG, VCF) from multiple text files and supports merging, filtering and converting of data that is produced in the course of high-throughput experiments. HTDP can also utilize itemized sets of conditions from external files for complex or repetitive filtering/merging tasks. The program is intended to aid global, real-time processing of large data sets using a graphical user interface (GUI). Therefore, no prior expertise in programming, regular expressions, or command line usage is required of the user. Additionally, no a priori assumptions are imposed on the internal file composition. We demonstrate the flexibility and potential of HTDP in real-life research tasks including microarray and massively parallel sequencing, i.e. identification of disease predisposing variants in the next generation sequencing data as well as comprehensive concurrent analysis of microarray and sequencing results. We also show the utility of HTDP in technical tasks including data merge, reduction and filtering with external criteria files. HTDP was developed to address functionality that is missing or rudimentary in other GUI software for processing character-delimited column data from high-throughput technologies. Flexibility, in terms of input file handling, provides long term potential functionality in high-throughput analysis pipelines, as the program is not limited by the currently existing applications and data formats. HTDP is available as Open Source software (https://github.com/pmadanecki/htdp).
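
To illustrate the kind of external-criteria filtering described above (this is not HTDP's interface, just a minimal Python sketch of the task, with illustrative column names), one can keep only the rows of a character-delimited table whose key appears in a criteria set loaded from another file:

```python
import csv
import io

def filter_rows(table_text, column, criteria, delimiter="\t"):
    """Keep rows whose value in `column` appears in an external criteria set,
    the kind of itemized-condition filtering the abstract describes."""
    reader = csv.DictReader(io.StringIO(table_text), delimiter=delimiter)
    return [row for row in reader if row[column] in criteria]
```

In a GUI tool like HTDP the same operation is driven by point-and-click configuration rather than code, which is what removes the programming prerequisite for the user.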

  13. High-Throughput Tabular Data Processor – Platform independent graphical tool for processing large data sets

    PubMed Central

    Bałut, Magdalena; Buckley, Patrick G.; Ochocka, J. Renata; Bartoszewski, Rafał; Crossman, David K.; Messiaen, Ludwine M.; Piotrowski, Arkadiusz

    2018-01-01

    High-throughput technologies generate considerable amounts of data which often require bioinformatic expertise to analyze. Here we present High-Throughput Tabular Data Processor (HTDP), a platform independent Java program. HTDP works on any character-delimited column data (e.g. BED, GFF, GTF, PSL, WIG, VCF) from multiple text files and supports merging, filtering and converting of data that is produced in the course of high-throughput experiments. HTDP can also utilize itemized sets of conditions from external files for complex or repetitive filtering/merging tasks. The program is intended to aid global, real-time processing of large data sets using a graphical user interface (GUI). Therefore, no prior expertise in programming, regular expressions, or command line usage is required of the user. Additionally, no a priori assumptions are imposed on the internal file composition. We demonstrate the flexibility and potential of HTDP in real-life research tasks including microarray and massively parallel sequencing, i.e. identification of disease predisposing variants in the next generation sequencing data as well as comprehensive concurrent analysis of microarray and sequencing results. We also show the utility of HTDP in technical tasks including data merge, reduction and filtering with external criteria files. HTDP was developed to address functionality that is missing or rudimentary in other GUI software for processing character-delimited column data from high-throughput technologies. Flexibility, in terms of input file handling, provides long term potential functionality in high-throughput analysis pipelines, as the program is not limited by the currently existing applications and data formats. HTDP is available as Open Source software (https://github.com/pmadanecki/htdp). PMID:29432475

  14. Preliminary Sizing of Vertical Take-off Rocket-based Combined-cycle Powered Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Roche, Joseph M.; McCurdy, David R.

    2001-01-01

    Single-stage-to-orbit has been an elusive goal due to propulsion performance, materials limitations, and complex system integration. Glenn Research Center has begun to assemble a suite of relationships that tie Rocket-Based Combined-Cycle (RBCC) performance and advanced material data into a database for the purpose of preliminary sizing of RBCC-powered launch vehicles. To accomplish this, a near-optimum aerodynamic and structural shape was established as a baseline. The program synthesizes a vehicle to meet the mission requirements, tabulates the results, and plots the derived shape. The program architecture and an example application are discussed herein.

  15. XPRESS: eXascale PRogramming Environment and System Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brightwell, Ron; Sterling, Thomas; Koniges, Alice

    The XPRESS Project is one of four major projects of the DOE Office of Science Advanced Scientific Computing Research X-stack Program initiated in September 2012. The purpose of XPRESS is to devise an innovative system software stack to enable practical and useful exascale computing around the end of the decade, with near-term contributions to efficient and scalable operation of trans-petaflops performance systems in the next two to three years, both for DOE mission-critical applications. To this end, XPRESS directly addresses critical challenges in computing of efficiency, scalability, and programmability through introspective methods of dynamic adaptive resource management and task scheduling.

  16. Paranoia.Ada: A diagnostic program to evaluate Ada floating-point arithmetic

    NASA Technical Reports Server (NTRS)

    Hjermstad, Chris

    1986-01-01

    Many essential software functions in the mission-critical computer resource application domain depend on floating point arithmetic. Numerically intensive functions associated with the Space Station project, such as ephemeris generation or the implementation of Kalman filters, are likely to employ the floating point facilities of Ada. Paranoia.Ada appears to be a valuable program to ensure that Ada environments and their underlying hardware exhibit the precision and correctness required to satisfy mission computational requirements. As a diagnostic tool, Paranoia.Ada reveals many essential characteristics of an Ada floating point implementation. Equipped with such knowledge, programmers need not tremble before the complex task of floating point computation.
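    The style of probe that Paranoia performs can be illustrated with two classic diagnostics: finding machine epsilon and the floating-point radix purely through arithmetic. This sketch is in Python rather than Ada and is not Paranoia.Ada's actual code, just the same idea in miniature.

    ```python
    import sys

    def machine_epsilon():
        """Smallest eps with 1.0 + eps != 1.0, found by repeated halving."""
        eps = 1.0
        while 1.0 + eps / 2.0 != 1.0:
            eps /= 2.0
        return eps

    def float_radix():
        """Probe the arithmetic's radix: grow a until adding 1.0 becomes inexact,
        then grow b until a + b is distinguishable from a; that difference is the radix."""
        a = 1.0
        while (a + 1.0) - a == 1.0:
            a *= 2.0
        b = 1.0
        while (a + b) - a == 0.0:
            b *= 2.0
        return (a + b) - a

    print(machine_epsilon() == sys.float_info.epsilon)  # True on IEEE-754 doubles
    print(float_radix())                                # 2.0 on binary hardware
    ```

    Paranoia runs many such probes (rounding modes, guard digits, over/underflow behavior) and flags any result that deviates from a correctly implemented arithmetic.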

  17. Monopropellant hydrazine resistojet: Flight application design

    NASA Technical Reports Server (NTRS)

    Kurch, C. K.

    1973-01-01

    This program covers the design, development, and testing of an engineering-model, nominal 20-millipound-thrust monopropellant hydrazine resistojet, directed toward the advanced development of an electrothermal hydrazine thruster (EHT). The EHT decomposes hydrazine thermally and expands the decomposition products through a nozzle to provide the impulse necessary to fulfill spacecraft propulsive requirements. The thruster is capable of operation at pulse widths from 0.050 second to steady state and delivers specific impulse values up to about 230 seconds depending on the duty cycle. The program is comprised of six tasks including analyses, the generation of specifications and other documentation, design, fabrication and test, data correlation, and recommendations for the design of flight units.
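    The thrust and specific-impulse figures quoted above imply a steady-state propellant mass flow through the standard relation F = Isp * g0 * mdot. A quick sketch (the thrust and Isp values are taken from the abstract; the conversion constants are standard, and the computed flow rate is an illustration, not a figure from the report):

    ```python
    G0 = 9.80665           # standard gravity, m/s^2
    LBF_TO_N = 4.4482216   # pounds-force to newtons

    def mass_flow(thrust_n, isp_s):
        """Steady-state propellant mass flow (kg/s) from F = Isp * g0 * mdot."""
        return thrust_n / (isp_s * G0)

    thrust_n = 0.020 * LBF_TO_N        # nominal 20-millipound thrust
    mdot = mass_flow(thrust_n, 230.0)  # best-case steady-state Isp from the abstract
    print(f"{mdot * 1e6:.1f} mg/s")    # roughly 39.4 mg/s of hydrazine
    ```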

  18. An introduction to scripting in Ruby for biologists

    PubMed Central

    Aerts, Jan; Law, Andy

    2009-01-01

    The Ruby programming language has a lot to offer to any scientist with electronic data to process. Not only is the initial learning curve very shallow, but its reflection and meta-programming capabilities allow for the rapid creation of relatively complex applications while still keeping the code short and readable. This paper provides a gentle introduction to this scripting language for researchers without formal informatics training such as many wet-lab scientists. We hope this will provide such researchers an idea of how powerful a tool Ruby can be for their data management tasks and encourage them to learn more about it. PMID:19607723

  19. The PyRosetta Toolkit: a graphical user interface for the Rosetta software suite.

    PubMed

    Adolf-Bryfogle, Jared; Dunbrack, Roland L

    2013-01-01

    The Rosetta Molecular Modeling suite is a command-line-only collection of applications that enable high-resolution modeling and design of proteins and other molecules. Although extremely useful, Rosetta can be difficult to learn for scientists with little computational or programming experience. To that end, we have created a Graphical User Interface (GUI) for Rosetta, called the PyRosetta Toolkit, for creating and running protocols in Rosetta for common molecular modeling and protein design tasks and for analyzing the results of Rosetta calculations. The program is highly extensible so that developers can add new protocols and analysis tools to the PyRosetta Toolkit GUI.

  20. Exploiting Vector and Multicore Parallelism for Recursive, Data- and Task-Parallel Programs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ren, Bin; Krishnamoorthy, Sriram; Agrawal, Kunal

    Modern hardware contains parallel execution resources that are well suited for data parallelism (vector units) and for task parallelism (multicores). However, most work on parallel scheduling focuses on one type of hardware or the other. In this work, we present a scheduling framework that allows for a unified treatment of task- and data-parallelism. Our key insight is an abstraction, task blocks, that uniformly handles data-parallel iterations and task-parallel tasks, allowing them to be scheduled on vector units or executed independently on multicores. Our framework allows us to define schedulers that can dynamically select between executing task blocks on vector units or multicores. We show that these schedulers are asymptotically optimal and deliver the maximum amount of parallelism available in computation trees. To evaluate our schedulers, we develop program transformations that can convert mixed data- and task-parallel programs into task-block-based programs. Using a prototype instantiation of our scheduling framework, we show that, on an 8-core system, we can simultaneously exploit vector and multicore parallelism to achieve 14×-108× speedup over sequential baselines.
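    The task-block idea from the abstract can be sketched in a few lines: one uniform work unit that either runs a batch of data-parallel iterations in a tight loop (standing in for a vector unit) or scatters independent tasks across cores. This is a minimal Python illustration of the abstraction, not the paper's scheduler, and all names here are hypothetical.

    ```python
    from concurrent.futures import ThreadPoolExecutor

    class TaskBlock:
        """Uniform unit of work: either a batch of data-parallel iterations
        (one function over many items) or a list of independent task closures."""
        def __init__(self, fn=None, items=None, tasks=None):
            self.fn, self.items, self.tasks = fn, items, tasks

    def run_block(block, pool):
        if block.items is not None:
            # Data-parallel path: a tight loop stands in for execution on a vector unit.
            return [block.fn(x) for x in block.items]
        # Task-parallel path: scatter independent tasks across worker threads.
        return [f.result() for f in [pool.submit(t) for t in block.tasks]]

    with ThreadPoolExecutor(max_workers=4) as pool:
        data_block = TaskBlock(fn=lambda x: x * x, items=range(5))
        task_block = TaskBlock(tasks=[lambda: sum(range(10)), lambda: max(3, 7)])
        squares = run_block(data_block, pool)
        results = run_block(task_block, pool)

    print(squares)   # [0, 1, 4, 9, 16]
    print(results)   # [45, 7]
    ```

    A real scheduler in the paper's framework would choose between the two paths dynamically based on available parallelism, rather than by inspecting which field is set.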
