Sample records for processing system capable

  1. Enhanced Training by a Systemic Governance of Force Capabilities, Tasks, and Processes

    DTIC Science & Technology

    2013-06-01

    18th ICCRTS “C2 in Underdeveloped, Degraded and Denied Operational Environments.” Enhanced Training by a Systemic Governance of Force Capabilities, Tasks, and Processes...assess, evaluate and accredit the Swedish forces. This paper presents a Systemic Governance of Capabilities, Tasks, and Processes applied to the

  2. Toward a Capability Engineering Process

    DTIC Science & Technology

    2004-12-01

    TOWARD A CAPABILITY ENGINEERING PROCESS. M. Lizotte, F. Bernier, M. Mokhtari, M. Couture, G. Dussault, C. Lalancette, F. Lemieux. System of Systems...Lizotte, F. Lemieux, the US DoD 5000 acquisition strategies?; and (8) since a M. Mokhtari, 2004: Toward Capability Engineering capability can be

  3. Systems Thinking for the Enterprise: A Thought Piece

    NASA Astrophysics Data System (ADS)

    Rebovich, George

    This paper suggests a way of managing the acquisition of capabilities for large-scale government enterprises that is different from traditional "specify and build" approaches commonly employed by U.S. government agencies in acquiring individual systems or systems of systems (SoS). Enterprise capabilities evolve through the emergence and convergence of information and other technologies and their integration into social, institutional and operational organizations and processes. Enterprise capabilities evolve whether or not the enterprise has processes in place to actively manage them. Thus the critical role of enterprise system engineering (ESE) processes should be to shape, enhance and accelerate the "natural" evolution of enterprise capabilities. ESE processes do not replace or add a layer to traditional system engineering (TSE) processes used in developing individual systems or SoS. ESE processes should complement TSE processes by shaping outcome spaces and stimulating interactions among enterprise participants through marketlike mechanisms to reward those that create innovation which moves and accelerates the evolution of the enterprise.

  4. Human Planetary Landing System (HPLS) Capability Roadmap NRC Progress Review

    NASA Technical Reports Server (NTRS)

    Manning, Rob; Schmitt, Harrison H.; Graves, Claude

    2005-01-01

    Capability Roadmap Team. Capability Description, Scope and Capability Breakdown Structure. Benefits of the HPLS. Roadmap Process and Approach. Current State-of-the-Art, Assumptions and Key Requirements. Top Level HPLS Roadmap. Capability Presentations by Leads. Mission Drivers Requirements. "AEDL" System Engineering. Communication & Navigation Systems. Hypersonic Systems. Super to Subsonic Decelerator Systems. Terminal Descent and Landing Systems. A Priori In-Situ Mars Observations. AEDL Analysis, Test and Validation Infrastructure. Capability Technical Challenges. Capability Connection Points to other Roadmaps/Crosswalks. Summary of Top Level Capability. Forward Work.

  5. Toshiba TDF-500 High Resolution Viewing And Analysis System

    NASA Astrophysics Data System (ADS)

    Roberts, Barry; Kakegawa, M.; Nishikawa, M.; Oikawa, D.

    1988-06-01

    A high resolution, operator interactive, medical viewing and analysis system has been developed by Toshiba and Bio-Imaging Research. This system provides many advanced features including high resolution displays, a very large image memory and advanced image processing capability. In particular, the system provides CRT frame buffers capable of update in one frame period, an array processor capable of image processing at operator interactive speeds, and a memory system capable of updating multiple frame buffers at frame rates whilst supporting multiple array processors. The display system provides 1024 x 1536 display resolution at 40 Hz frame and 80 Hz field rates. In particular, the ability to provide whole or partial update of the screen at the scanning rate is a key feature. This allows multiple viewports or windows in the display buffer with both fixed and cine capability. To support image processing features such as windowing, pan, zoom, minification, filtering, ROI analysis, multiplanar and 3D reconstruction, a high performance CPU is integrated into the system. This CPU is an array processor capable of up to 400 million instructions per second. To support the multiple viewers' and array processors' instantaneous high memory bandwidth requirement, an ultra fast memory system is used. This memory system has a bandwidth capability of 400 MB/sec and a total capacity of 256 MB. This bandwidth is more than adequate to support several high resolution CRTs and also the fast processing unit. This fully integrated approach allows effective real time image processing. The integrated design of the viewing system, memory system and array processor is key to the imaging system. This paper describes the architecture of the imaging system.
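
    The bandwidth claim above can be checked with simple arithmetic. The Python sketch below is not from the paper; it assumes 8 bits per displayed pixel (the abstract does not state the pixel depth) and estimates how many 1024 x 1536, 40 Hz refresh streams a 400 MB/sec memory system could sustain.

      # Back-of-the-envelope check of the display/memory bandwidth figures quoted above.
      # Assumption (not stated in the abstract): 1 byte (8 bits) per displayed pixel.
      BYTES_PER_PIXEL = 1
      WIDTH, HEIGHT = 1536, 1024       # display resolution quoted in the abstract
      FRAME_RATE_HZ = 40               # 40 Hz frame rate (80 Hz field rate, interlaced)
      MEMORY_BW_MB_S = 400             # quoted memory-system bandwidth, MB/sec
      MB = 1_000_000

      refresh_mb_s = WIDTH * HEIGHT * BYTES_PER_PIXEL * FRAME_RATE_HZ / MB
      print(f"One refresh stream needs about {refresh_mb_s:.1f} MB/sec")   # ~62.9 MB/sec
      print(f"Refresh streams sustainable on {MEMORY_BW_MB_S} MB/sec: {int(MEMORY_BW_MB_S // refresh_mb_s)}")  # 6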

  6. JSC earth resources data analysis capabilities available to EOD revision B

    NASA Technical Reports Server (NTRS)

    1974-01-01

    A list and summary description of all Johnson Space Center electronic laboratory and photographic laboratory capabilities available to earth resources division personnel for processing earth resources data are provided. The electronic capabilities pertain to those facilities and systems that produce electronic and/or photographic products as output. The photographic capabilities pertain to equipment that uses photographic images as input and produces electronic and/or photographic output; a table summarizes the processing steps. A general hardware description is presented for each of the data processing systems, and the titles of computer programs are used to identify the capabilities and data flow.

  7. Implementing NASA's Capability-Driven Approach: Insight into NASA's Processes for Maturing Exploration Systems

    NASA Technical Reports Server (NTRS)

    Williams-Byrd, Julie; Arney, Dale; Rodgers, Erica; Antol, Jeff; Simon, Matthew; Hay, Jason; Larman, Kevin

    2015-01-01

    NASA is engaged in transforming human spaceflight. The Agency is shifting from an exploration-based program with human activities focused on low Earth orbit (LEO) and targeted robotic missions in deep space to a more sustainable and integrated pioneering approach. Through pioneering, NASA seeks to address national goals to develop the capacity for people to work, learn, operate, live, and thrive safely beyond the Earth for extended periods of time. However, pioneering space involves more than the daunting technical challenges of transportation, maintaining health, and enabling crew productivity for long durations in remote, hostile, and alien environments. This shift also requires a change in operating processes for NASA. The Agency can no longer afford to engineer systems for specific missions and destinations and instead must focus on common capabilities that enable a range of destinations and missions. NASA has codified a capability-driven approach, which provides flexible guidance for the development and maturation of common capabilities necessary for human pioneers beyond LEO. This approach has been included in NASA policy and is captured in the Agency's strategic goals. It is currently being implemented across NASA's centers and programs. Throughout 2014, NASA engaged in an Agency-wide process to define and refine exploration-related capabilities and associated gaps, focusing only on those that are critical for human exploration beyond LEO. NASA identified 12 common capabilities ranging from Environmental Control and Life Support Systems to Robotics, and established Agency-wide teams or working groups composed of subject matter experts that are responsible for the maturation of these exploration capabilities. These teams, called the System Maturation Teams (SMTs), help formulate, guide and resolve performance gaps associated with the identified exploration capabilities. The SMTs are defining performance parameters and goals for each of the 12 capabilities, developing maturation plans and roadmaps for the identified performance gaps, specifying the interfaces between the various capabilities, and ensuring that the capabilities mature and integrate to enable future pioneering missions. By managing system development through the SMTs instead of traditional NASA programs and projects, the Agency is shifting from mission-driven development to a more flexible, capability-driven development. The process NASA uses to establish, integrate, prioritize, and manage the SMTs and associated capabilities is iterative. NASA relies on the Human Exploration and Operations Mission Directorate's SMT Integration Team within Advanced Exploration Systems to coordinate and facilitate the SMT process. The SMT Integration team conducts regular reviews and coordination meetings among the SMTs and has developed a number of tools to help the Agency implement capability-driven processes. The SMT Integration team is uniquely positioned to help the Agency coordinate the SMTs and other processes that are making the capability-driven approach a reality. This paper will introduce the SMTs and the 12 key capabilities they represent. The role of the SMTs will be discussed with respect to Agency-wide processes to shift from mission-focused exploration to a capability-driven pioneering approach. Specific examples will be given to highlight systems development and testing within the SMTs. These examples will also show how NASA is using current investments in the International Space Station and future investments to develop and demonstrate capabilities. The paper will conclude by describing next steps and a process for soliciting feedback from the space exploration community to refine NASA's process for developing common exploration capabilities.

  8. Low-cost Landsat digital processing system for state and local information systems

    NASA Technical Reports Server (NTRS)

    Hooper, N. J.; Spann, G. W.; Faust, N. L.; Paludan, C. T. N.

    1979-01-01

    The paper details a minicomputer-based system which is well within the budget of many state, regional, and local agencies that previously could not afford digital processing capability. In order to achieve this goal a workable small-scale Landsat system is examined to provide low-cost automated processing. It is anticipated that the alternative systems will be based on a single minicomputer, but that the peripherals will vary depending on the capability emphasized in a particular system.

  9. Digital processing of mesoscale analysis and space sensor data

    NASA Technical Reports Server (NTRS)

    Hickey, J. S.; Karitani, S.

    1985-01-01

    The mesoscale analysis and space sensor (MASS) data management and analysis system on the research computer system is presented. The MASS data base management and analysis system was implemented on the research computer system, which provides a wide range of capabilities for processing and displaying large volumes of conventional and satellite-derived meteorological data. The research computer system consists of three primary computers (HP-1000F, Harris/6, and Perkin-Elmer 3250), each of which performs a specific function according to its unique capabilities. The software, data base management, and display capabilities of the research computer system are described in terms of providing a very effective interactive research tool for the digital processing of mesoscale analysis and space sensor data.

  10. ISO 9000 and/or Systems Engineering Capability Maturity Model?

    NASA Technical Reports Server (NTRS)

    Gholston, Sampson E.

    2002-01-01

    For businesses and organizations to remain competitive today, they must have processes and systems in place that will allow them to first identify customer needs and then develop products/processes that will meet or exceed the customers' needs and expectations. Customer needs, once identified, are normally stated as requirements. Designers can then develop products/processes that will meet these requirements. Several functions, such as quality management and systems engineering management, are used to assist product development teams in the development process. Both functions exist in all organizations and both have a similar objective, which is to ensure that developed processes will meet customer requirements. Are efforts in these organizations being duplicated? Are both functions needed by organizations? What are the similarities and differences between the functions listed above? ISO 9000 is an international standard for goods and services. It sets broad requirements for the assurance of quality and for management's involvement. It requires organizations to document the processes and to follow these documented processes. ISO 9000 gives customers assurance that the suppliers have control of the process for product development. Systems engineering can broadly be defined as a discipline that seeks to ensure that all requirements for a system are satisfied throughout the life of the system by preserving their interrelationship. The key activities of systems engineering include requirements analysis, functional analysis/allocation, design synthesis and verification, and system analysis and control. The systems engineering process, when followed properly, will lead to higher quality products, lower cost products, and shorter development cycles. The System Engineering Capability Maturity Model (SE-CMM) will allow companies to measure their system engineering capability and continuously improve those capabilities. ISO 9000 and SE-CMM seem to have a similar objective, which is to document the organization's processes and certify to potential customers the capability of a supplier to control the processes that determine the quality of the product or services being produced. The remaining sections of this report examine the differences and similarities between ISO 9000 and SE-CMM and make recommendations for implementation.

  11. Intelligent robotic tracker

    NASA Technical Reports Server (NTRS)

    Otaguro, W. S.; Kesler, L. O.; Land, K. C.; Rhoades, D. E.

    1987-01-01

    An intelligent tracker capable of robotic applications requiring guidance and control of platforms, robotic arms, and end effectors has been developed. This packaged system capable of supervised autonomous robotic functions is partitioned into a multiple processor/parallel processing configuration. The system currently interfaces to cameras but has the capability to also use three-dimensional inputs from scanning laser rangers. The inputs are fed into an image processing and tracking section where the camera inputs are conditioned for the multiple tracker algorithms. An executive section monitors the image processing and tracker outputs and performs all the control and decision processes. The present architecture of the system is presented with discussion of its evolutionary growth for space applications. An autonomous rendezvous demonstration of this system was performed last year. More realistic demonstrations in planning are discussed.

  12. 10 Steps to Building an Architecture for Space Surveillance Projects

    NASA Astrophysics Data System (ADS)

    Gyorko, E.; Barnhart, E.; Gans, H.

    Space surveillance is an increasingly complex task, requiring the coordination of a multitude of organizations and systems, while dealing with competing capabilities, proprietary processes, differing standards, and compliance issues. In order to fully understand space surveillance operations, analysts and engineers need to analyze and break down their operations and systems using what are essentially enterprise architecture processes and techniques. These techniques can be daunting to the first-time architect. This paper provides a summary of simplified steps to analyze a space surveillance system at the enterprise level in order to determine capabilities, services, and systems. These steps form the core of an initial Model-Based Architecting process. For new systems, a well-defined, or well-architected, space surveillance enterprise leads to an easier transition from model-based architecture to model-based design and provides a greater likelihood that requirements are fulfilled the first time. Both new and existing systems benefit from being easier to manage, and can be sustained more easily using portfolio management techniques, based around capabilities documented in the model repository. The resulting enterprise model helps an architect avoid 1) costly, faulty portfolio decisions; 2) wasteful technology refresh efforts; 3) upgrade and transition nightmares; and 4) non-compliance with DoDAF directives. The Model-Based Architecting steps are based on a process that Harris Corporation has developed from practical experience architecting space surveillance systems and ground systems. Examples are drawn from current work on documenting space situational awareness enterprises. The process is centered on DoDAF 2 and its corresponding meta-model so that terminology is standardized and communicable across any disciplines that know DoDAF architecting, including acquisition, engineering and sustainment disciplines. Each step provides a guideline for the type of data to collect, and also the appropriate views to generate. The steps include 1) determining the context of the enterprise, including active elements and high level capabilities or goals; 2) determining the desired effects of the capabilities and mapping capabilities against the project plan; 3) determining operational performers and their inter-relationships; 4) building information and data dictionaries; 5) defining resources associated with capabilities; 6) determining the operational behavior necessary to achieve each capability; 7) analyzing existing or planned implementations to determine systems, services and software; 8) cross-referencing system behavior to operational behavior; 9) documenting system threads and functional implementations; and 10) creating any required textual documentation from the model.

  13. Bibliographic Post-Processing with the TIS Intelligent Gateway: Analytical and Communication Capabilities.

    ERIC Educational Resources Information Center

    Burton, Hilary D.

    TIS (Technology Information System) is an intelligent gateway system capable of performing quantitative evaluation and analysis of bibliographic citations using a set of Process functions. Originally developed by Lawrence Livermore National Laboratory (LLNL) to analyze information retrieved from three major federal databases, DOE/RECON,…

  14. Informing future NRT satellite distribution capabilities: Lessons learned from NASA's Land Atmosphere NRT capability for EOS (LANCE)

    NASA Astrophysics Data System (ADS)

    Davies, D.; Murphy, K. J.; Michael, K.

    2013-12-01

    NASA's Land Atmosphere Near real-time Capability for EOS (Earth Observing System) (LANCE) provides data and imagery from the Terra, Aqua and Aura satellites in less than 3 hours from satellite observation, to meet the needs of the near real-time (NRT) applications community. This article describes the architecture of LANCE and outlines the modifications made to achieve the 3-hour latency requirement, with a view to informing future NRT satellite distribution capabilities. It also describes how latency is determined. LANCE is a distributed system that builds on the existing EOS Data and Information System (EOSDIS) capabilities. To achieve the NRT latency requirement, many components of the EOS satellite operations, ground and science processing systems have been made more efficient without compromising the quality of science data processing. The EOS Data and Operations System (EDOS) processes the NRT stream with higher priority than the science data stream in order to minimize latency. In addition to expediting transfer times, the key difference between the NRT Level 0 products and those for standard science processing is the data used to determine the precise location and tilt of the satellite. Standard products use definitive geo-location (attitude and ephemeris) data provided daily, whereas NRT products use predicted geo-location provided by the instrument Global Positioning System (GPS) or an approximation of navigational data (depending on platform). Level 0 data are processed into higher-level products at designated Science Investigator-led Processing Systems (SIPS). The processes used by LANCE have been streamlined and adapted to work with datasets as soon as they are downlinked from satellites or transmitted from ground stations. Level 2 products that require ancillary data have modified production rules that relax the requirements for ancillary data, thus reducing processing times. Looking to the future, experience gained from LANCE can provide valuable lessons on satellite and ground system architectures and on how the delivery of NRT products from other NASA missions might be achieved.
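
    Latency in this context is simply the elapsed time from satellite observation to availability of the NRT product, compared against the 3-hour goal. The minimal Python sketch below illustrates that bookkeeping; the field names and times are illustrative and are not LANCE's actual metadata schema.

      # Minimal latency bookkeeping of the kind described above; names and times are illustrative.
      from datetime import datetime, timedelta

      NRT_GOAL = timedelta(hours=3)

      def product_latency(observation_time, available_time):
          """Elapsed time from satellite observation to availability of the NRT product."""
          return available_time - observation_time

      obs = datetime(2013, 7, 1, 12, 0, 0)       # time of satellite observation (example)
      avail = datetime(2013, 7, 1, 14, 35, 0)    # time the NRT product became available (example)
      latency = product_latency(obs, avail)
      print(f"Latency: {latency}, meets 3-hour goal: {latency <= NRT_GOAL}")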

  15. Capability Maturity Model (CMM) for Software Process Improvements

    NASA Technical Reports Server (NTRS)

    Ling, Robert Y.

    2000-01-01

    This slide presentation reviews the Avionic Systems Division's implementation of the Capability Maturity Model (CMM) for improvements in the software development process. The presentation reviews the process involved in implementing the model and the benefits of using CMM to improve the software development process.

  16. EOS MLS Science Data Processing System: A Description of Architecture and Capabilities

    NASA Technical Reports Server (NTRS)

    Cuddy, David T.; Echeverri, Mark D.; Wagner, Paul A.; Hanzel, Audrey T.; Fuller, Ryan A.

    2006-01-01

    This paper describes the architecture and capabilities of the Science Data Processing System (SDPS) for the EOS MLS. The SDPS consists of two major components--the Science Computing Facility and the Science Investigator-led Processing System. The Science Computing Facility provides the facilities for the EOS MLS Science Team to perform the functions of scientific algorithm development, processing software development, quality control of data products, and scientific analyses. The Science Investigator-led Processing System processes and reprocesses the science data for the entire mission and delivers the data products to the Science Computing Facility and to the Goddard Space Flight Center Earth Science Distributed Active Archive Center, which archives and distributes the standard science products.

  17. Process Improvement Should Link to Security: SEPG 2007 Security Track Recap

    DTIC Science & Technology

    2007-09-01

    the Systems Security Engineering Capability Maturity Model (SSE-CMM / ISO 21827) and its use in system software developments...software development life cycle (SDLC)? 6. In what ways should process improvement support security in the SDLC? 1.2 PANEL RESOURCES For each...project management, and support practices through the use of the capability maturity models including the CMMI and the Systems Security

  18. Integrated cockpit display and processor: the best solution for Link-16 applications

    NASA Astrophysics Data System (ADS)

    Smeyne, Alan L.; Savaya, John

    2000-08-01

    Link-16 Data Link systems are being added to current avionics systems to provide increased situational awareness and command data. By using a single intelligent display system, the impact to existing aircraft systems to implement Link-16 capabilities is minimized. Litton Guidance & Control Systems (G&CS), a military avionics supplier for more than forty years, provides Open System Architecture (OSA), large screen aircraft display systems. Based on a common set of plug-in modules, these Smart Multi-Function Displays (SMFD) are available in a variety of sizes and processing capabilities, any one of which can meet the Link-16 requirements. Using a single smart SMFD connected to a Link-16 subsystem has many advantages. With digital moving map capability, the SMFD can monitor and display air and ground tracks of both friendly and hostile forces while providing potential threat data to the operator. The SMFD can also monitor vehicle status and mission data to share between friendly air and surface forces. To support the integrated digital battlefield, Link-16 capability is required and the Litton G&CS SMFD provides the processing/display functionality to implement this capability.

  19. Information Processing Capabilities in Performers Differing in Levels of Motor Skill

    DTIC Science & Technology

    1979-01-01

    Craik, F. I. M., & Lockhart, R. S. Levels of processing: A framework for memory research. Journal of Verbal Learning and Verbal Behavior, 1972, 11, 671-684...ARI TECHNICAL REPORT: Information Processing Capabilities in Performers Differing in Levels of Motor Skill, by Robert N. Singer...PROCESSING CAPABILITIES IN PERFORMERS DIFFERING IN LEVELS OF MOTOR SKILL. INTRODUCTION: In the human behaving systems model developed by Singer, Gerson, and

  20. The Use of a UNIX-Based Workstation in the Information Systems Laboratory

    DTIC Science & Technology

    1989-03-01

    system. The conclusions of the research and the resulting recommendations are presented in Chapter III. These recommendations include how to manage...required to run the program on a new system, these should not be significant changes. 2. Processing Environment The UNIX processing environment is...interactive with multi-tasking and multi-user capabilities. Multi-tasking refers to the fact that many programs can be run concurrently. This capability

  1. Station to instrumented aircraft L-band telemetry system and RF signal controller for spacecraft simulations and station calibration

    NASA Technical Reports Server (NTRS)

    Scaffidi, C. A.; Stocklin, F. J.; Feldman, M. B.

    1971-01-01

    An L-band telemetry system designed to provide the capability of near-real-time processing of calibration data is described. The system also provides the capability of performing computerized spacecraft simulations, with the aircraft as a data source, and evaluating the network response. The salient characteristics of a telemetry analysis and simulation program (TASP) are discussed, together with the results of TASP testing. The results of the L-band system testing have successfully demonstrated the capability of near-real-time processing of telemetry test data, the control of the ground-received signal to within ±0.5 dB, and the computer generation of test signals.

  2. Vortex information display system program description manual. [data acquisition from laser Doppler velocimeters and real time operation

    NASA Technical Reports Server (NTRS)

    Conway, R.; Matuck, G. N.; Roe, J. M.; Taylor, J.; Turner, A.

    1975-01-01

    A vortex information display system is described which provides flexible control through system-user interaction for collecting wing-tip-trailing vortex data, processing this data in real time, displaying the processed data, storing raw data on magnetic tape, and post processing raw data. The data is received from two asynchronous laser Doppler velocimeters (LDV's) and includes position, velocity, and intensity information. The raw data is written onto magnetic tape for permanent storage and is also processed in real time to locate vortices and plot their positions as a function of time. The interactive capability enables the user to make real time adjustments in processing data and provides a better definition of vortex behavior. Displaying the vortex information in real time produces a feedback capability to the LDV system operator allowing adjustments to be made in the collection of raw data. Both raw data and processing can be continually upgraded during flyby testing to improve vortex behavior studies. The post-analysis capability permits the analyst to perform in-depth studies of test data and to modify vortex behavior models to improve transport predictions.

  3. Aviation System Analysis Capability Executive Assistant Design

    NASA Technical Reports Server (NTRS)

    Roberts, Eileen; Villani, James A.; Osman, Mohammed; Godso, David; King, Brent; Ricciardi, Michael

    1998-01-01

    In this technical document, we describe the design developed for the Aviation System Analysis Capability (ASAC) Executive Assistant (EA) Proof of Concept (POC). We describe the genesis and role of the ASAC system, discuss the objectives of the ASAC system and provide an overview of components and models within the ASAC system, and describe the design process and the results of the ASAC EA POC system design. We also describe the evaluation process and results for applicable COTS software. The document has six chapters, a bibliography, three appendices and one attachment.

  4. Aviation System Analysis Capability Executive Assistant Development

    NASA Technical Reports Server (NTRS)

    Roberts, Eileen; Villani, James A.; Anderson, Kevin; Book, Paul

    1999-01-01

    In this technical document, we describe the development of the Aviation System Analysis Capability (ASAC) Executive Assistant (EA) Proof of Concept (POC) and Beta version. We describe the genesis and role of the ASAC system, discuss the objectives of the ASAC system and provide an overview of components and models in the ASAC system, and describe the design process and the results of the ASAC EA POC and Beta system development. We also describe the evaluation process and results for applicable COTS software. The document has seven chapters, a bibliography, and two appendices.

  5. MINIS: Multipurpose Interactive NASA Information System

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The Multipurpose Interactive NASA Information System (MINIS) was developed in response to the need for a data management system capable of operation on several different minicomputer systems. The desired system had to be capable of performing the functions of a LANDSAT photo descriptive data retrieval system while remaining general in terms of other acceptable user-definable data bases. The system also had to be capable of performing data base updates and providing user-formatted output reports. The resultant MINI System provides all of these capabilities and several other features to complement the data management system. The MINI System is currently implemented on two minicomputer systems and is in the process of being installed on another minicomputer system. The MINIS is operational on four different data bases.

  6. Information processing of earth resources data

    NASA Technical Reports Server (NTRS)

    Zobrist, A. L.; Bryant, N. A.

    1982-01-01

    Current trends in the use of remotely sensed data include integration of multiple data sources of various formats and use of complex models. These trends have placed a strain on information processing systems because an enormous number of capabilities are needed to perform a single application. A solution to this problem is to create a general set of capabilities which can perform a wide variety of applications. General capabilities for the Image-Based Information System (IBIS) are outlined in this report. They are then cross-referenced for a set of applications performed at JPL.

  7. Applying Early Systems Engineering: Injecting Knowledge into the Capability Development Process

    DTIC Science & Technology

    2012-10-01

    involves early use of systems engineering and technical analyses to supplement the existing operational analysis techniques currently used in...complexity, and costs of systems now being developed require tight coupling between operational requirements stated in the CDD, system requirements...Fleischer. Keywords: Capability Development, Competitive Prototyping, Knowledge Points, Early Systems Engineering. Applying Early Systems

  8. A Contrast in Use of Metrics in Earth Science Data Systems

    NASA Technical Reports Server (NTRS)

    Ramapriyan, Hampapuram; Behnke, Jeanne; Hines-Watts, Tonjua

    2007-01-01

    In recent years there has been a surge in the number of systems for processing, archiving and distributing remotely sensed data. Such systems, working independently as well as in collaboration, have been contributing greatly to the advances in the scientific understanding of the Earth system, as well as utilization of the data for nationally and internationally important applications. Among such systems, we consider those that are developed by or under the sponsorship of NASA to fulfill one of its strategic objectives: "Study Earth from space to advance scientific understanding and meet societal needs." NASA's Earth science data systems are of varying size and complexity depending on the requirements they are intended to meet. Some data systems are regarded as NASA's "Core Capabilities" that provide the basic infrastructure for processing, archiving and distributing a set of data products to a large and diverse user community in a robust and reliable manner. Other data systems constitute "Community Capabilities". These provide specialized and innovative services to data users and/or research products offering new scientific insight. Such data systems are generally supported by NASA through peer reviewed competition. Examples of Core Capabilities are 1. Earth Observing Data and Information System (EOSDIS) with its Distributed Active Archive Centers (DAACs), Science Investigator-led Processing Systems (SIPSs), and the EOS Clearing House (ECHO); 2. Tropical Rainfall Measurement Mission (TRMM) Science Data and Information System (TSDIS); 3. Ocean Data Processing System (ODPS); and 4. CloudSat Data Processing Center. Examples of Community Capabilities are projects under the Research, Education and Applications Solutions Network (REASON), and Advancing Collaborative Connections for Earth System Science (ACCESS) Programs. In managing these data system capabilities, it is necessary to have well-established goals and to measure progress relative to them. Progress is measured through "metrics", which can be a combination of quantitative as well as qualitative assessments. The specific metrics of interest depend on the user of the metrics as well as the type of data system. The users of metrics can be data system managers, program managers, funding agency or the public. Data system managers need metrics for assessing and improving the performance of the system and for future planning. Program managers need metrics to assess progress and the value of the data systems sponsored by them. Also, there is a difference in the metrics needed for core capabilities that tend to be more complex, larger and longer-term compared to community capabilities and the community capabilities that tend to be simpler, smaller and shorter-term. Even among community capabilities there are differences; hence the same set of metrics does not apply to all. Some provide data products to users, some provide services that enable better utilization of data or interoperability among other systems, and some are a part of a larger project where provision of data or services is only a minor activity. There is also a contrast between metrics used for internal and external purposes. Examples of internal purposes are: ensuring that the system meets its requirements, and planning for evolution and growth. Examples of external purposes are: providing to sponsors indicators of success of the systems, demonstrating the contributions of the system to overall program success, etc. 
This paper will consider EOSDIS, REASON and ACCESS programs to show the various types of metrics needed and how they need to be tailored to the types of data systems while maintaining the overall management goals of measuring progress and contributions made by the data systems.

  9. A Contrast in Use of Metrics in Earth Science Data Systems

    NASA Astrophysics Data System (ADS)

    Ramapriyan, H. K.; Behnke, J.; Hines-Watts, T. M.

    2007-12-01

    In recent years there has been a surge in the number of systems for processing, archiving and distributing remotely sensed data. Such systems, working independently as well as in collaboration, have been contributing greatly to the advances in the scientific understanding of the Earth system, as well as utilization of the data for nationally and internationally important applications. Among such systems, we consider those that are developed by or under the sponsorship of NASA to fulfill one of its strategic objectives: "Study Earth from space to advance scientific understanding and meet societal needs." NASA's Earth science data systems are of varying size and complexity depending on the requirements they are intended to meet. Some data systems are regarded as NASA's Core Capabilities that provide the basic infrastructure for processing, archiving and distributing a set of data products to a large and diverse user community in a robust and reliable manner. Other data systems constitute Community Capabilities. These provide specialized and innovative services to data users and/or research products offering new scientific insight. Such data systems are generally supported by NASA through peer reviewed competition. Examples of Core Capabilities are 1. Earth Observing Data and Information System (EOSDIS) with its Distributed Active Archive Centers (DAACs), Science Investigator-led Processing Systems (SIPSs), and the EOS Clearing House (ECHO); 2. Tropical Rainfall Measurement Mission (TRMM) Science Data and Information System (TSDIS); 3. Ocean Data Processing System (ODPS); and 4. CloudSat Data Processing Center. Examples of Community Capabilities are projects under the Research, Education and Applications Solutions Network (REASoN), and Advancing Collaborative Connections for Earth System Science (ACCESS) Programs. In managing these data system capabilities, it is necessary to have well-established goals and to measure progress relative to them. Progress is measured through metrics, which can be a combination of quantitative as well as qualitative assessments. The specific metrics of interest depend on the user of the metrics as well as the type of data system. The users of metrics can be data system managers, program managers, funding agency or the public. Data system managers need metrics for assessing and improving the performance of the system and for future planning. Program managers need metrics to assess progress and the value of the data systems sponsored by them. Also, there is a difference in the metrics needed for core capabilities that tend to be more complex, larger and longer-term compared to community capabilities and the community capabilities that tend to be simpler, smaller and shorter-term. Even among community capabilities there are differences; hence the same set of metrics does not apply to all. Some provide data products to users, some provide services that enable better utilization of data or interoperability among other systems, and some are a part of a larger project where provision of data or services is only a minor activity. There is also a contrast between metrics used for internal and external purposes. Examples of internal purposes are: ensuring that the system meets its requirements, and planning for evolution and growth. Examples of external purposes are: providing to sponsors indicators of success of the systems, demonstrating the contributions of the system to overall program success, etc. 
This paper will consider EOSDIS, REASoN and ACCESS programs to show the various types of metrics needed and how they need to be tailored to the types of data systems while maintaining the overall management goals of measuring progress and contributions made by the data systems.

  10. DSN command system Mark III-78. [data processing

    NASA Technical Reports Server (NTRS)

    Stinnett, W. G.

    1978-01-01

    The Deep Space Network command system Mark III-78 data processing includes a store-and-forward command handling capability. The functions of (1) storing the command files at a Deep Space station, (2) attaching the files to a queue, and (3) radiating the commands to the spacecraft are straightforward. However, the total data processing capability is largely a result of accommodating worst-case, failure-recovery, or nonnominal operating conditions. Optional data processing functions include: file erase, clearing the queue, suspend radiation, command abort, resume command radiation, and close window time override.
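
    The store-and-forward flow and the off-nominal queue controls listed above can be pictured with a small Python sketch. This is an illustration only, not the Mark III-78 software; the class and method names are invented, and the command-abort and window-override functions are omitted for brevity.

      # Illustrative store-and-forward command queue; not the actual Mark III-78 software.
      from collections import deque

      class CommandQueue:
          def __init__(self):
              self.stored_files = {}    # command files held at the station, keyed by name
              self.queue = deque()      # files attached for radiation, in order
              self.suspended = False

          def store(self, name, commands):          # store a command file at the station
              self.stored_files[name] = list(commands)

          def attach(self, name):                   # attach a stored file to the queue
              self.queue.append(name)

          def erase(self, name):                    # "file erase"
              self.stored_files.pop(name, None)

          def clear_queue(self):                    # "clearing the queue"
              self.queue.clear()

          def suspend(self):                        # "suspend radiation"
              self.suspended = True

          def resume(self):                         # "resume command radiation"
              self.suspended = False

          def radiate_next(self):                   # radiate the next attached file, if allowed
              if self.suspended or not self.queue:
                  return None
              name = self.queue.popleft()
              for cmd in self.stored_files.get(name, []):
                  print(f"radiating {name}: {cmd}")
              return name

      q = CommandQueue()
      q.store("FILE_A", ["CMD1", "CMD2"])
      q.attach("FILE_A")
      q.radiate_next()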

  11. Validation Test Report for the Automated Optical Processing System (AOPS) Version 4.12

    DTIC Science & Technology

    2015-09-03

    the Geostationary Ocean Color Imager (GOCI) sensor aboard the Communication, Ocean and Meteorological Satellite (COMS). Additionally, this...this capability works in conjunction with AOPS; improvements to the AOPS mosaicking capability; prepare the NRT Geostationary Ocean Color Imager...Warfare (EXW), Geostationary Ocean Color Imager (GOCI), Gulf of Mexico (GOM), Hierarchical Data Format (HDF), Integrated Data Processing System (IDPS

  12. Performance evaluation capabilities for the design of physical systems

    NASA Technical Reports Server (NTRS)

    Pilkey, W. D.; Wang, B. P.

    1972-01-01

    The results are presented of a study aimed at developing and formulating a capability for the limiting performance of large steady state systems. The accomplishments reported include: (1) development of a theory of limiting performance of large systems subject to steady state inputs; (2) application and modification of PERFORM, the computational capability for the limiting performance of systems with transient inputs; and (3) demonstration that use of an inherently smooth control force for a limiting performance calculation improves the system identification phase of the design process for physical systems subjected to transient loading.

  13. Towards Automatic Treatment of Natural Language.

    ERIC Educational Resources Information Center

    Lonsdale, Deryle

    1984-01-01

    Because automated natural language processing relies heavily on the still developing fields of linguistics, knowledge representation, and computational linguistics, no system is capable of mimicking human linguistic capabilities. For the present, interactive systems may be used to augment today's technology. (MSE)

  14. Integrated System Health Management (ISHM): Systematic Capability Implementation

    NASA Technical Reports Server (NTRS)

    Figueroa, Fernando; Holland, Randy; Schmalzwel, John; Duncavage, Dan

    2006-01-01

    This paper provides a credible approach for implementation of ISHM capability in any system. The requirements and processes to implement ISHM capability are unique in that a credible capability is initially implemented at a low level, and it evolves to achieve higher levels by incremental augmentation. In contrast, typical capabilities, such as thrust of an engine, are implemented once at full Functional Capability Level (FCL), which is not designed to change during the life of the product. The approach will describe core ingredients (e.g. technologies, architectures, etc.) and when and how ISHM capabilities may be implemented. A specific architecture/taxonomy/ontology will be described, as well as a prototype software environment that supports development of ISHM capability. This paper will address implementation of system-wide ISHM as a core capability, and ISHM for specific subsystems as expansions and evolution, but always focusing on achieving an integrated capability.

  15. An open-loop system design for deep space signal processing applications

    NASA Astrophysics Data System (ADS)

    Tang, Jifei; Xia, Lanhua; Mahapatra, Rabi

    2018-06-01

    A novel open-loop system design with high performance is proposed for space positioning and navigation signal processing. Divided by function, the system has four modules: a bandwidth-selectable data recorder, a narrowband signal analyzer, a time-delay difference of arrival estimator, and an ANFIS supplement processor. A hardware-software co-design approach is taken to accelerate computing capability and improve system efficiency. Embedded with the proposed signal processing algorithms, the designed system is capable of handling tasks with high accuracy over long periods of continuous measurement. The experimental results show that the Doppler frequency tracking root mean square error during a 3 h observation is 0.0128 Hz, while the TDOA residue analysis in the correlation power spectrum is 0.1166 rad.
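
    As a generic illustration of the time-delay estimation step mentioned above, the Python sketch below recovers a known delay between two noisy channels from the peak of their cross-correlation. It is not the paper's algorithm; the sample rate, delay, and noise level are invented.

      # Generic cross-correlation time-delay estimate; all parameters are invented for illustration.
      import numpy as np

      fs = 1_000_000                       # sample rate in Hz (illustrative)
      true_delay = 37                      # delay in samples (illustrative)
      rng = np.random.default_rng(0)

      x = rng.standard_normal(10_000)                                   # channel 1
      y = np.roll(x, true_delay) + 0.1 * rng.standard_normal(10_000)    # delayed, noisy copy

      corr = np.correlate(y, x, mode="full")        # cross-correlation over all lags
      lag = int(np.argmax(corr)) - (len(x) - 1)     # lag of the correlation peak

      print(f"Estimated delay: {lag} samples ({lag / fs * 1e6:.1f} microseconds)")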

  16. Functional and performance requirements of the next NOAA-Kansas City computer system

    NASA Technical Reports Server (NTRS)

    Mosher, F. R.

    1985-01-01

    The development of the Advanced Weather Interactive Processing System for the 1990's (AWIPS-90) will result in more timely and accurate forecasts with improved cost effectiveness. As part of the AWIPS-90 initiative, the National Meteorological Center (NMC), the National Severe Storms Forecast Center (NSSFC), and the National Hurricane Center (NHC) are to receive upgrades of interactive processing systems. This National Center Upgrade program will support the specialized inter-center communications, data acquisition, and processing needs of these centers. The missions, current capabilities and general functional requirements for the upgrade to the NSSFC are addressed. System capabilities are discussed along with the requirements for the upgraded system.

  17. KSC Technical Capabilities Website

    NASA Technical Reports Server (NTRS)

    Nufer, Brian; Bursian, Henry; Brown, Laurette L.

    2010-01-01

    This document is the website pages that review the technical capabilities that the Kennedy Space Center (KSC) has for partnership opportunities. The purpose of this information is to make prospective customers aware of the capabilities and provide an opportunity to form relationships with the experts at KSC. The technical capabilities fall into these areas: (1) Ground Operations and Processing Services, (2) Design and Analysis Solutions, (3) Command and Control Systems / Services, (4) Materials and Processes, (5) Research and Technology Development and (6) Laboratories, Shops and Test Facilities.

  18. Test Capability Enhancements to the NASA Langley 8-Foot High Temperature Tunnel

    NASA Technical Reports Server (NTRS)

    Harvin, S. F.; Cabell, K. F.; Gallimore, S. D.; Mekkes, G. L.

    2006-01-01

    The NASA Langley 8-Foot High Temperature Tunnel produces true enthalpy environments simulating flight from Mach 4 to Mach 7, primarily for airbreathing propulsion and aerothermal/thermo-structural testing. Flow conditions are achieved through a methane-air heater and nozzles that produce aerodynamic Mach numbers of 4, 5 or 7 and have exit diameters of 8 feet or 4.5 feet. The 12-ft long free-jet test section, housed inside a 26-ft vacuum sphere, accommodates large test articles. Recently, the facility underwent significant upgrades to support hydrocarbon-fueled scramjet engine testing and to expand flight simulation capability. The upgrades were required to meet engine system development and flight clearance verification requirements originally defined by the joint NASA-Air Force X-43C Hypersonic Flight Demonstrator Project and now the Air Force X-51A Program. Enhancements to the 8-Ft. HTT were made in four areas: 1) hydrocarbon fuel delivery; 2) flight simulation capability; 3) controls and communication; and 4) data acquisition/processing. The upgrades include the addition of systems to supply ethylene and liquid JP-7 to test articles; a Mach 5 nozzle with dynamic pressure simulation capability up to 3200 psf; a real-time model angle-of-attack system; a new programmable logic controller sub-system to improve process controls and communication with model controls; MIL-STD-1553B and high-speed data acquisition systems; and a classified data processing environment. These additions represent a significant increase to the already unique test capability and flexibility of the facility, and complement the existing array of test support hardware such as a model injection system, radiant heaters, a six-component force measurement system, and optical flow field visualization hardware. The new systems support complex test programs that require sophisticated test sequences and precise management of process fluids. Furthermore, new systems such as the real-time angle-of-attack system and the programmable logic controller enhance the test efficiency of the facility. The motivation for the upgrades and the expanded capabilities is described here.

  19. Smart laser hole drilling for gas turbine combustors

    NASA Astrophysics Data System (ADS)

    Laraque, Edy

    1991-04-01

    A smart laser drilling system, which incorporates in-process air-flow inspection of the holes and intelligent real-time process parameter corrections, is described. The system, along with well-developed laser parameters, has proved efficient for producing cooling holes that meet the highest aeronautical standards. To date, the system is used for percussion drilling of combustion chamber cooling holes. The system is considered very economical due to its drilling-on-the-fly capability, which allows up to 3 holes of 0.025-in. diameter to be drilled per second.

  20. Knowledge-based simulation for aerospace systems

    NASA Technical Reports Server (NTRS)

    Will, Ralph W.; Sliwa, Nancy E.; Harrison, F. Wallace, Jr.

    1988-01-01

    Knowledge-based techniques, which offer many features that are desirable in the simulation and development of aerospace vehicle operations, exhibit many similarities to traditional simulation packages. The eventual solution of these systems' current symbolic processing/numeric processing interface problem will lead to continuous and discrete-event simulation capabilities in a single language, such as TS-PROLOG. Qualitative, totally symbolic simulation methods are noted to possess several intrinsic characteristics that are especially revelatory of the system being simulated and are capable of ensuring that all possible behaviors are considered.

  1. Coal conversion systems design and process modeling. Volume 1: Application of MPPR and Aspen computer models

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The development of a coal gasification system design and mass and energy balance simulation program for the TVA and other similar facilities is described. The materials-process-product model (MPPM) and the advanced system for process engineering (ASPEN) computer program were selected from available steady-state and dynamic models. The MPPM was selected to serve as the basis for development of the system-level design model structure because it provided the capability for process-block material and energy balances and high-level systems sizing and costing. The ASPEN simulation serves as the basis for assessing detailed component models for the system design modeling program. The ASPEN components were analyzed to identify particular process blocks and data packages (physical properties) which could be extracted and used in the system design modeling program. While the ASPEN physical properties calculation routines are capable of generating the physical properties required for process simulation, not all required physical property data are available; missing data must be user-entered.
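
    As a toy illustration of the process-block material balances such system-level models compute, the Python sketch below checks that mass into a single block equals mass out. The stream names and flow rates are invented and do not come from the report.

      # Toy single-block material balance; stream names and rates are invented.
      def mass_balance(inlets, outlets, tol=1e-6):
          """Return (mass in, mass out, balanced?) for a single process block, in kg/h."""
          m_in = sum(inlets.values())
          m_out = sum(outlets.values())
          return m_in, m_out, abs(m_in - m_out) <= tol * max(m_in, 1.0)

      inlets = {"coal": 1000.0, "steam": 400.0, "oxygen": 300.0}
      outlets = {"raw syngas": 1550.0, "slag": 150.0}
      print(mass_balance(inlets, outlets))    # (1700.0, 1700.0, True)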

  2. State/federal interaction of LANDSAT system and related technical assistance

    NASA Technical Reports Server (NTRS)

    Tesser, P. A.

    1981-01-01

    The history of state involvement in LANDSAT systems planning and related efforts is described. Currently 16 states have visual LANDSAT capabilities and 10 others are planning on developing such capabilities. The federal government's future plans for the LANDSAT system, the impacts of recent budget decisions on the systems, and the FY 82 budget process are examined.

  3. Aircraft Alerting Systems Standardization Study. Phase IV. Accident Implications on Systems Design.

    DTIC Science & Technology

    1982-06-01

    computing and processing to assimilate and process status information using...provided with capabilities in computing and processing, sensing, interfacing, and controlling and displaying. o Computing and Processing - Algorithms...alerting system to perform a flight status monitor function would require additional sensing, computing and processing, interfacing, and controlling

  4. Investigation of Capabilities and Technologies Supporting Rapid UAV Launch System Development

    DTIC Science & Technology

    2015-06-01

    Patrick Alan Livesay, Naval Postgraduate School, Monterey, CA 93943...to operate. This enabled the launcher design team to more clearly determine and articulate system requirements and performance parameters. Next, a...Process (AHP) was performed to prioritize the capabilities and assist in the decision-making process [1]. The AHP decision-analysis technique is

  5. Integrated Systems Health Management for Intelligent Systems

    NASA Technical Reports Server (NTRS)

    Figueroa, Fernando; Melcher, Kevin

    2011-01-01

    The implementation of an integrated system health management (ISHM) capability is fundamentally linked to the management of data, information, and knowledge (DIaK) with the purposeful objective of determining the health of a system. Management implies storage, distribution, sharing, maintenance, processing, reasoning, and presentation. ISHM is akin to having a team of experts who are all individually and collectively observing and analyzing a complex system, and communicating effectively with each other in order to arrive at an accurate and reliable assessment of its health. In this chapter, concepts, procedures, and approaches are presented as a foundation for implementing an ISHM capability relevant to intelligent systems. The capability stresses integration of DIaK from all elements of a system, emphasizing an advance toward an on-board, autonomous capability. Both ground-based and on-board ISHM capabilities are addressed. The information presented is the result of many years of research, development, and maturation of technologies, and of prototype implementations in operational systems.

  6. First Results From A Multi-Ion Beam Lithography And Processing System At The University Of Florida

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gila, Brent; Appleton, Bill R.; Fridmann, Joel

    2011-06-01

    The University of Florida (UF) has collaborated with Raith to develop a version of the Raith ionLiNE IBL system that has the capability to deliver multiple ion species in addition to the Ga ions normally available. The UF system is currently equipped with an AuSi liquid metal alloy ion source (LMAIS) and an ExB filter, making it capable of delivering Au and Si ions and ion clusters for ion beam processing. Other LMAIS systems could be developed in the future to deliver other ion species. This system is capable of high performance ion beam lithography, sputter profiling, maskless ion implantation, ion beam mixing, and spatial and temporal ion beam assisted writing and processing over large areas (100 mm2)--all with selected ion species at voltages from 15-40 kV and nanometer precision. We discuss the performance of the system with the AuSi LMAIS source and ExB mass separator. We report on initial results from the basic system characterization and ion beam lithography, as well as on basic ion-solid interactions.

  7. Data Visualization and Animation Lab (DVAL) overview

    NASA Technical Reports Server (NTRS)

    Stacy, Kathy; Vonofenheim, Bill

    1994-01-01

    The general capabilities of the Langley Research Center Data Visualization and Animation Laboratory are described. These capabilities include digital image processing, 3-D interactive computer graphics, data visualization and analysis, video-rate acquisition and processing of video images, photo-realistic modeling and animation, video report generation, and color hardcopies. A specialized video image processing system is also discussed.

  8. Enterprise and system of systems capability development life-cycle processes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beck, David Franklin

    2014-08-01

    This report and set of appendices are a collection of memoranda originally drafted circa 2007-2009 for the purpose of describing and detailing a models-based systems engineering approach for satisfying enterprise and system-of-systems life cycle process requirements. At the time there was interest and support to move from Capability Maturity Model Integration (CMMI) Level One (ad hoc processes) to Level Three. The main thrust of the material presents a rational exposé of a structured enterprise development life cycle that uses the scientific method as a framework, with further rigor added from adapting relevant portions of standard systems engineering processes. While the approach described invokes application of the Department of Defense Architectural Framework (DoDAF), it is suitable for use with other architectural description frameworks.

  9. Simulator for concurrent processing data flow architectures

    NASA Technical Reports Server (NTRS)

    Malekpour, Mahyar R.; Stoughton, John W.; Mielke, Roland R.

    1992-01-01

    A software simulator capable of simulating the execution of an algorithm graph on a given system under the Algorithm to Architecture Mapping Model (ATAMM) rules is presented. ATAMM is capable of modeling the execution of large-grained algorithms on distributed data flow architectures. Investigating the behavior and determining the performance of an ATAMM-based system requires the aid of software tools. The ATAMM Simulator presented is capable of determining the performance of a system without having to build a hardware prototype. Case studies are performed on four algorithms to demonstrate the capabilities of the ATAMM Simulator. Simulated results are shown to be comparable to the experimental results of the Advanced Development Model System.
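
    To illustrate the kind of analysis such a simulator performs, the following is a minimal sketch of event-driven execution of an algorithm graph on a fixed pool of processors. The node names, durations, single-token firing rule, and scheduling policy are illustrative assumptions, not the ATAMM rules themselves.

```python
import heapq

def simulate_graph(graph, durations, num_processors):
    """Event-driven simulation of one data packet flowing through an
    algorithm graph on a fixed processor pool (illustrative only)."""
    indegree = {n: 0 for n in graph}
    for n, succs in graph.items():
        for s in succs:
            indegree[s] += 1
    ready = [n for n, d in indegree.items() if d == 0]  # fireable nodes
    free = num_processors                               # assume >= 1
    running = []                                        # (finish_time, node) heap
    clock, finish = 0.0, {}
    while ready or running:
        while ready and free > 0:                       # dispatch ready nodes
            node = ready.pop()
            free -= 1
            heapq.heappush(running, (clock + durations[node], node))
        clock, node = heapq.heappop(running)            # advance to next completion
        free += 1
        finish[node] = clock
        for s in graph[node]:
            indegree[s] -= 1
            if indegree[s] == 0:
                ready.append(s)
    return finish  # per-node completion times; max(...) gives graph latency

# Example: a diamond-shaped algorithm graph mapped onto 2 processors
g = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
t = {"A": 1.0, "B": 2.0, "C": 3.0, "D": 1.0}
print(simulate_graph(g, t, num_processors=2))
```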

  10. Ultra-Wideband Time-Difference-of-Arrival Two-Point-Tracking System

    NASA Technical Reports Server (NTRS)

    Ni, Jianjun David; Arndt, Dickey; Ngo, Phong; Phan, Chau; Dekome, Kent; Dusl, John

    2009-01-01

    A UWB TDOA Two-Point-Tracking System has been conceived and developed at JSC. This system can provide sub-inch tracking capability of two points on one target. This capability can be applied to guide a docking process in a 2D space. Lab tests demonstrate the feasibility of this technology.
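
    The abstract does not give the estimation algorithm, but the underlying geometry is a time-difference-of-arrival position fix for each tracked point. The sketch below shows a generic 2-D TDOA least-squares solve with invented receiver positions; a two-point tracker would simply run it for both tags on the target.

```python
import numpy as np
from scipy.optimize import least_squares

C = 0.299792458  # propagation speed in metres per nanosecond

def locate(anchors, tdoas_ns, x0):
    """Estimate a 2-D position from TDOAs measured against receiver 0.
    anchors: (N, 2) receiver coordinates in metres; tdoas_ns: length N-1."""
    def residuals(p):
        d = np.linalg.norm(anchors - p, axis=1)          # range to each receiver
        return (d[1:] - d[0]) - C * np.asarray(tdoas_ns)
    return least_squares(residuals, x0).x

# Synthetic check: 4 receivers, a known point, noise-free TDOAs
anchors = np.array([[0.0, 0.0], [10.0, 0.0], [10.0, 8.0], [0.0, 8.0]])
truth = np.array([3.0, 2.0])
ranges = np.linalg.norm(anchors - truth, axis=1)
tdoas = (ranges[1:] - ranges[0]) / C
print(locate(anchors, tdoas, x0=np.array([5.0, 4.0])))   # ~ [3. 2.]
```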

  11. Future Directions in Navy Electronic System Reliability and Survivability.

    DTIC Science & Technology

    1981-06-01

    ...CENTER, SAN DIEGO, CA 92152, an activity of the Naval Material Command. ...A maintenance policy is proposed as one remedy to these problems. To implement this policy, electronic systems which are very reliable and which include health ...distribute vital data, data-processing capability, and communication capability through the use of intraship and intership networks. The capability to

  12. Functional Fault Model Development Process to Support Design Analysis and Operational Assessment

    NASA Technical Reports Server (NTRS)

    Melcher, Kevin J.; Maul, William A.; Hemminger, Joseph A.

    2016-01-01

    A functional fault model (FFM) is an abstract representation of the failure space of a given system. As such, it simulates the propagation of failure effects along paths between the origin of the system failure modes and points within the system capable of observing the failure effects. As a result, FFMs may be used to diagnose the presence of failures in the modeled system. FFMs necessarily contain a significant amount of information about the design, operations, and failure modes and effects. One of the important benefits of FFMs is that they may be qualitative, rather than quantitative and, as a result, may be implemented early in the design process when there is more potential to positively impact the system design. FFMs may therefore be developed and matured throughout the monitored system's design process and may subsequently be used to provide real-time diagnostic assessments that support system operations. This paper provides an overview of a generalized NASA process that is being used to develop and apply FFMs. FFM technology has been evolving for more than 25 years. The FFM development process presented in this paper was refined during NASA's Ares I, Space Launch System, and Ground Systems Development and Operations programs (i.e., from about 2007 to the present). Process refinement took place as new modeling, analysis, and verification tools were created to enhance FFM capabilities. In this paper, standard elements of a model development process (i.e., knowledge acquisition, conceptual design, implementation & verification, and application) are described within the context of FFMs. Further, newer tools and analytical capabilities that may benefit the broader systems engineering process are identified and briefly described. The discussion is intended as a high-level guide for future FFM modelers.
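
    A minimal sketch of the qualitative idea described above: failure effects propagate along directed edges toward observable monitors, and candidate failure modes are those whose predicted monitor signature matches what is observed. The graph, node names, and matching rule are invented for illustration and are far simpler than a real FFM.

```python
from collections import deque

# Illustrative qualitative failure-propagation graph: edges point from a
# failure mode or intermediate effect toward its downstream effects.
EDGES = {
    "valve_stuck":   ["low_flow"],
    "pump_degraded": ["low_flow", "high_vibration"],
    "low_flow":      ["low_chamber_pressure"],
    "low_chamber_pressure": [],   # observable at a pressure transducer
    "high_vibration":       [],   # observable at an accelerometer
}
FAILURE_MODES = {"valve_stuck", "pump_degraded"}
MONITORS = {"low_chamber_pressure", "high_vibration"}

def downstream(node):
    """All effects reachable from a node, found by breadth-first search."""
    seen, queue = set(), deque([node])
    while queue:
        for nxt in EDGES.get(queue.popleft(), []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

def candidate_failures(observed):
    """Failure modes whose predicted monitor signature matches the observations."""
    return sorted(fm for fm in FAILURE_MODES
                  if (downstream(fm) & MONITORS) == set(observed))

print(candidate_failures({"low_chamber_pressure"}))                    # ['valve_stuck']
print(candidate_failures({"low_chamber_pressure", "high_vibration"}))  # ['pump_degraded']
```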

  13. The Capabilities of the Graphical Observation Scheduling System (GROSS) as Used by the Astro-2 Spacelab Mission

    NASA Technical Reports Server (NTRS)

    Phillips, Shaun

    1996-01-01

    The Graphical Observation Scheduling System (GROSS) and its functionality and editing capabilities are reported on. The GROSS system was developed as a replacement for a suite of existing programs and associated processes with the aim of providing a software tool that combines the functionality of several of the existing programs and provides a Graphical User Interface (GUI) that gives greater data visibility and editing capabilities. It is considered that the improved editing capability provided by this approach enhanced the efficiency of mission planning for the second astronomical Spacelab mission (ASTRO-2).

  14. An end-to-end communications architecture for condition-based maintenance applications

    NASA Astrophysics Data System (ADS)

    Kroculick, Joseph

    2014-06-01

    This paper explores challenges in implementing an end-to-end communications architecture for Condition-Based Maintenance Plus (CBM+) data transmission which aligns with the Army's Network Modernization Strategy. The Army's Network Modernization strategy is based on rolling out network capabilities which connect the smallest unit and Soldier level to enterprise systems. CBM+ is a continuous improvement initiative over the life cycle of a weapon system or equipment to improve the reliability and maintenance effectiveness of Department of Defense (DoD) systems. CBM+ depends on the collection, processing and transport of large volumes of data. An important capability that enables CBM+ is an end-to-end network architecture that enables data to be uploaded from the platform at the tactical level to enterprise data analysis tools. To connect end-to-end maintenance processes in the Army's supply chain, a CBM+ network capability can be developed from available network capabilities.

  15. DDS-Suite - A Dynamic Data Acquisition, Processing, and Analysis System for Wind Tunnel Testing

    NASA Technical Reports Server (NTRS)

    Burnside, Jathan J.

    2012-01-01

    Wind tunnels have optimized their steady-state data systems for acquisition and analysis and have even implemented large dynamic-data acquisition systems; however, development of near real-time processing and analysis tools for dynamic data has lagged. DDS-Suite is a set of tools used to acquire, process, and analyze large amounts of dynamic data. Each phase of the testing process (acquisition, processing, and analysis) is handled by a separate component so that bottlenecks in one phase of the process do not affect the others, leading to a robust system. DDS-Suite is capable of acquiring 672 channels of dynamic data at a rate of 275 MB/s. More than 300 channels of the system use 24-bit analog-to-digital cards and are capable of producing data with less than 0.01 of phase difference at 1 kHz. System architecture, design philosophy, and examples of use during NASA Constellation and Fundamental Aerodynamic tests are discussed.
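
    As a quick back-of-envelope check of the quoted aggregate figures, the snippet below converts the 275 MB/s total into an approximate per-channel rate. The bytes-per-sample value is an assumption (the abstract does not state the storage width of the 24-bit samples).

```python
# Back-of-envelope check of the quoted aggregate rate (assumptions flagged).
channels = 672
aggregate_rate = 275e6          # bytes per second, as quoted
bytes_per_sample = 4            # ASSUMPTION: 24-bit samples stored in 32-bit words

per_channel_bytes = aggregate_rate / channels
per_channel_samples = per_channel_bytes / bytes_per_sample
print(f"{per_channel_bytes/1e3:.0f} kB/s per channel "
      f"(~{per_channel_samples/1e3:.0f} kS/s at {bytes_per_sample} B/sample)")
```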

  16. The effect of requirements prioritization on avionics system conceptual design

    NASA Astrophysics Data System (ADS)

    Lorentz, John

    This dissertation will provide a detailed approach and analysis of a new collaborative requirements prioritization methodology that has been used successfully on four Coast Guard avionics acquisition and development programs valued at $400M+. A statistical representation of participant study results will be discussed and analyzed in detail. Many technically compliant projects fail to deliver levels of performance and capability that the customer desires. Some of these systems completely meet "threshold" levels of performance; however, the distribution of resources in the process devoted to the development and management of the requirements does not always represent the voice of the customer. This is especially true for technically complex projects such as modern avionics systems. A simplified facilitated process for prioritization of system requirements will be described. The collaborative prioritization process, and resulting artifacts, aids the systems engineer during early conceptual design. All requirements are not the same in terms of customer priority. While there is a tendency to have many thresholds inside of a system design, there is usually a subset of requirements and system performance that is of the utmost importance to the design. These critical capabilities and critical levels of performance typically represent the reason the system is being built. The systems engineer needs processes to identify these critical capabilities, the associated desired levels of performance, and the risks associated with the specific requirements that define the critical capability. The facilitated prioritization exercise is designed to collaboratively draw out these critical capabilities and levels of performance so they can be emphasized in system design. Developing the purpose, scheduling and process for prioritization events are key elements of systems engineering and modern project management. The benefits of early collaborative prioritization flow throughout the project schedule, resulting in greater success during system deployment and operational testing. This dissertation will discuss the data and findings from participant studies, present a literature review of systems engineering and design processes, and test the hypothesis that the prioritization process had no effect on stakeholder sentiment related to the conceptual design. In addition, the "Requirements Rationalization" process will be discussed in detail. Avionics, like many other systems, has transitioned from a discrete electronics engineering, hard engineering discipline to incorporate software engineering as a core process of the technology development cycle. As with other software-based systems, avionics now has significant soft system attributes that must be considered in the design process. The boundless opportunities that exist in software design demand prioritization to focus effort onto the critical functions that the software must provide. This has been a well documented and understood phenomenon in the software development community for many years. This dissertation will attempt to link the effect of software integrated avionics to the benefits of prioritization of requirements in the problem space and demonstrate the sociological and technical benefits of early prioritization practices.

  17. Defining Medical Capabilities for Exploration Missions

    NASA Technical Reports Server (NTRS)

    Hailey, M.; Antonsen, E.; Blue, R.; Reyes, D.; Mulcahy, R.; Kerstman, E.; Bayuse, T.

    2018-01-01

    Exploration-class missions to the moon, Mars and beyond will require a significant change in medical capability from today's low earth orbit centric paradigm. Significant increases in autonomy will be required due to differences in duration, distance and orbital mechanics. Aerospace medicine and systems engineering teams are working together within ExMC to meet these challenges. Identifying exploration medical system needs requires accounting for planned and unplanned medical care as defined in the concept of operations. In 2017, the ExMC Clinicians group identified medical capabilities to feed into the Systems Engineering process, including: determining what and how to address planned and preventive medical care; defining an Accepted Medical Condition List (AMCL) of conditions that may occur and a subset of those that can be treated effectively within the exploration environment; and listing the medical capabilities needed to treat those conditions in the AMCL. This presentation will discuss the team's approach to addressing these issues, as well as how the outputs of the clinical process impact the systems engineering effort.

  18. IT Acquisition: Expediting the Process to Deliver Business Capabilities to the DoD Enterprise. Revised Edition

    DTIC Science & Technology

    2012-07-01

    ...effectively manage delivery of information capabilities. Under IT 360, they will need to incorporate constantly evolving, market-driven commercial systems ...traditional acquisition system; under IT 360, these processes are largely obsolete and create oversight ambiguities. • Congress requires that funds be ...2004). Furthermore, because the product is not available on the commercial market, the development of any complementary updates will also need to be

  19. Implementation of Integrated System Fault Management Capability

    NASA Technical Reports Server (NTRS)

    Figueroa, Fernando; Schmalzel, John; Morris, Jon; Smith, Harvey; Turowski, Mark

    2008-01-01

    Fault management is implemented to support the rocket engine test mission with highly reliable and accurate measurements while improving availability and lifecycle costs. Core elements include: an architecture, taxonomy, and ontology (ATO) for DIaK management; intelligent sensor processes; intelligent element processes; intelligent controllers; intelligent subsystem processes; intelligent system processes; and intelligent component processes.

  20. Archiving Software Systems: Approaches to Preserve Computational Capabilities

    NASA Astrophysics Data System (ADS)

    King, T. A.

    2014-12-01

    A great deal of effort is made to preserve scientific data, not only because data is knowledge, but also because it is often costly to acquire and is sometimes collected under unique circumstances. Another part of the science enterprise is the development of software to process and analyze the data. Developed software is also a large investment and worthy of preservation. However, the long term preservation of software presents some challenges. Software often requires a specific technology stack to operate. This can include software, operating system, and hardware dependencies. One past approach to preserving computational capabilities is to maintain ancient hardware long past its typical viability. On an archive horizon of 100 years, this is not feasible. Another approach to preserving computational capabilities is to archive source code. While this can preserve details of the implementation and algorithms, it may not be possible to reproduce the technology stack needed to compile and run the resulting applications. This forward-looking dilemma has a solution. Technology used to create clouds and process big data can also be used to archive and preserve computational capabilities. We explore how basic hardware, virtual machines, containers, and appropriate metadata can be used to preserve computational capabilities and to archive functional software systems. In conjunction with data archives, this provides scientists with both the data and the capability to reproduce the processing and analysis used to generate past scientific results.

  1. Photoresist thin-film effects on alignment process capability

    NASA Astrophysics Data System (ADS)

    Flores, Gary E.; Flack, Warren W.

    1993-08-01

    Two photoresists were selected for alignment characterization based on their dissimilar coating properties and observed differences in alignment capability. The materials are Dynachem OFPR-800 and Shipley System 8. Both photoresists were examined on two challenging alignment levels in a submicron CMOS process: a nitride level and a planarized second-level metal. An Ultratech Stepper model 1500, which features a darkfield alignment system with a broadband green light for alignment signal detection, was used for this project. Initially, statistically designed linear screening experiments were performed to examine six process factors for each photoresist: viscosity, spin acceleration, spin speed, spin time, softbake time, and softbake temperature. Using the results derived from the screening experiments, a more thorough examination of the statistically significant process factors was performed. A full quadratic experimental design was conducted to examine the effects of viscosity, spin speed, and spin time coating properties on alignment. This included a characterization of both intra- and inter-wafer alignment control and alignment process capability. Insight into the different alignment behavior is provided in terms of photoresist material properties and the physical nature of the alignment detection system.
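
    A minimal sketch of how a full quadratic model over three factors can be fitted with ordinary least squares, assuming a face-centered central composite design in coded units. The factor settings, synthetic response, and coefficients are invented for illustration and are not the study's data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Face-centered central composite design in coded units for three factors
# (viscosity, spin speed, spin time) -- illustrative, not the study's runs.
corners = np.array([[i, j, k] for i in (-1, 1) for j in (-1, 1) for k in (-1, 1)])
axial = np.array([[1, 0, 0], [-1, 0, 0], [0, 1, 0], [0, -1, 0], [0, 0, 1], [0, 0, -1]])
X = np.vstack([corners, axial, [[0, 0, 0]]]).astype(float)

# Synthetic alignment-error response with curvature in spin speed plus noise
y = 0.40 - 0.03*X[:, 0] + 0.05*X[:, 1]**2 - 0.02*X[:, 0]*X[:, 2] \
    + rng.normal(0, 0.005, len(X))

def quadratic_design_matrix(X):
    """Intercept, linear, two-factor interaction, and pure quadratic terms."""
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1*x2, x1*x3, x2*x3, x1**2, x2**2, x3**2])

coef, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)
for name, c in zip(["b0", "x1", "x2", "x3", "x1x2", "x1x3", "x2x3",
                    "x1^2", "x2^2", "x3^2"], coef):
    print(f"{name:>4s}: {c:+.3f}")   # fitted full-quadratic coefficients
```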

  2. Toward information management in corporations (4)

    NASA Astrophysics Data System (ADS)

    Yamamoto, Takeo

    The roles of personal computers (PCs) and workstations (WSs) in developing the corporate information system are discussed. The history and state of the art for PCs and WSs are reviewed. Checkpoints for introducing PCs and WSs are: Japanese word-processing capabilities, multi-media capabilities, and network capabilities.

  3. Requirements Development for Interoperability Simulation Capability for Law Enforcement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holter, Gregory M.

    2004-05-19

    The National Counterdrug Center (NCC) was initially authorized by Congress in FY 1999 appropriations to create a simulation-based counterdrug interoperability training capability. As the lead organization for Research and Analysis to support the NCC, the Pacific Northwest National Laboratory (PNNL) was responsible for developing the requirements for this interoperability simulation capability. These requirements were structured to address the hardware and software components of the system, as well as the deployment and use of the system. The original set of requirements was developed through a process of conducting a user-based survey of requirements for the simulation capability, coupled with an analysis of similar development efforts. The user-based approach ensured that existing concerns with respect to interoperability within the law enforcement community would be addressed. Law enforcement agencies within the designated pilot area of Cochise County, Arizona, were surveyed using interviews and ride-alongs during actual operations. The results of this survey were then accumulated, organized, and validated with the agencies to ensure the accuracy of the results. These requirements were then supplemented by adapting operational requirements from existing systems to ensure system reliability and operability. The NCC adopted a development approach providing incremental capability through the fielding of a phased series of progressively more capable versions of the system. This allowed for feedback from system users to be incorporated into subsequent revisions of the system requirements, and also allowed the addition of new elements as needed to adapt the system to broader geographic and geopolitical areas, including areas along the southwest and northwest U.S. borders. This paper addresses the processes used to develop and refine requirements for the NCC interoperability simulation capability, as well as the response of the law enforcement community to the use of the NCC system. The paper also addresses the applicability of such an interoperability simulation capability to a broader set of law enforcement, border protection, site/facility security, and first-responder needs.

  4. Development of an imaging system for the detection of alumina on turbine blades

    NASA Astrophysics Data System (ADS)

    Greenwell, S. J.; Kell, J.; Day, J. C. C.

    2014-03-01

    An imaging system capable of detecting alumina on turbine blades by acquiring LED-induced fluorescence images has been developed. Acquiring fluorescence images at adjacent spectral bands allows the system to distinguish alumina from fluorescent surface contaminants. Repair and overhaul processes require that alumina is entirely removed from the blades by grit blasting and chemical stripping. The capability of the system to detect alumina has been investigated with two series of turbine blades provided by Rolls-Royce plc. The results illustrate that the system provides a superior inspection method to visual assessment when ascertaining whether alumina is present on turbine blades during repair and overhaul processes.
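
    The paper's key idea is distinguishing alumina from other fluorescent contaminants by comparing fluorescence in adjacent spectral bands. Below is a hedged sketch of one plausible decision rule on two co-registered band images; the threshold values, minimum-signal cutoff, and band choice are illustrative assumptions, not the published method.

```python
import numpy as np

def alumina_mask(band_on, band_off, ratio_threshold=1.5, min_signal=10.0):
    """Flag pixels whose fluorescence in the alumina emission band clearly
    exceeds the adjacent reference band (illustrative decision rule)."""
    band_on = band_on.astype(float)
    band_off = band_off.astype(float)
    ratio = band_on / np.maximum(band_off, 1e-6)   # avoid divide-by-zero
    return (ratio > ratio_threshold) & (band_on > min_signal)

# Synthetic 4x4 images: one bright, band-selective region in the corner
on = np.full((4, 4), 5.0);  on[:2, :2] = 60.0
off = np.full((4, 4), 5.0); off[:2, :2] = 20.0
print(alumina_mask(on, off))   # True only where the band ratio is high
```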

  5. Automated drug identification system

    NASA Technical Reports Server (NTRS)

    Campen, C. F., Jr.

    1974-01-01

    The system speeds up the analysis of blood and urine and is capable of identifying 100 commonly abused drugs. It includes a computer that controls the entire analytical process by ordering the various steps in specific sequences. The computer processes the data output and provides a readout of identified drugs.

  6. JIMM: the next step for mission-level models

    NASA Astrophysics Data System (ADS)

    Gump, Jamieson; Kurker, Robert G.; Nalepka, Joseph P.

    2001-09-01

    The Simulation Based Acquisition (SBA) process is one in which the planning, design, and test of a weapon system or other product is done through the more effective use of modeling and simulation, information technology, and process improvement. This process results in a product that is produced faster, cheaper, and more reliably than its predecessors. Because the SBA process requires realistic and detailed simulation conditions, it was necessary to develop a simulation tool that would provide a simulation environment acceptable for doing SBA analysis. The Joint Integrated Mission Model (JIMM) was created to help define and meet the analysis, test and evaluation, and training requirements of a Department of Defense program utilizing SBA. Through its generic nature of representing simulation entities, its data analysis capability, and its robust configuration management process, JIMM can be used to support a wide range of simulation applications as both a constructive and a virtual simulation tool. JIMM is a Mission Level Model (MLM). An MLM is capable of evaluating the effectiveness and survivability of a composite force of air and space systems executing operational objectives in a specific scenario against an integrated air and space defense system. Because MLMs are useful for assessing a system's performance in a realistic, integrated threat environment, they are key to implementing the SBA process. JIMM is a merger of the capabilities of one legacy model, the Suppressor MLM, into another, the Simulated Warfare Environment Generator (SWEG) MLM. By creating a more capable MLM, JIMM will not only be a tool to support the SBA initiative, but could also provide the framework for the next generation of MLMs.

  7. MIRADS-2 Implementation Manual

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The Marshall Information Retrieval and Display System (MIRADS), a data base management system designed to provide the user with a set of generalized file capabilities, is presented. The system provides a wide variety of ways to process the contents of the data base and includes capabilities to search, sort, compute, update, and display the data. The process of creating, defining, and loading a data base is generally called the loading process. The steps in the loading process, which include (1) structuring, (2) creating, (3) defining, and (4) implementing the data base for use by MIRADS, are defined. The execution of several computer programs is required to successfully complete all steps of the loading process. The MIRADS library must be established as a cataloged mass storage file as the first step in MIRADS implementation. The procedure for establishing the MIRADS Library is given. The system is currently operational for the UNIVAC 1108 computer system utilizing the Executive Operating System. All procedures relate to the use of MIRADS on the U-1108 computer.

  8. A wireless data acquisition system for acoustic emission testing

    NASA Astrophysics Data System (ADS)

    Zimmerman, A. T.; Lynch, J. P.

    2013-01-01

    As structural health monitoring (SHM) systems have seen increased demand due to lower costs and greater capabilities, wireless technologies have emerged that enable the dense distribution of transducers and the distributed processing of sensor data. In parallel, ultrasonic techniques such as acoustic emission (AE) testing have become increasingly popular in the non-destructive evaluation of materials and structures. These techniques, which involve the analysis of frequency content between 1 kHz and 1 MHz, have proven effective in detecting the onset of cracking and other early-stage failure in active structures such as airplanes in flight. However, these techniques typically involve the use of expensive and bulky monitoring equipment capable of accurately sensing AE signals at sampling rates greater than 1 million samples per second. In this paper, a wireless data acquisition system is presented that is capable of collecting, storing, and processing AE data at rates of up to 20 MHz. Processed results can then be wirelessly transmitted in real-time, creating a system that enables the use of ultrasonic techniques in large-scale SHM systems.
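
    A minimal sketch of the kind of onboard processing such a node might perform: a fixed-threshold acoustic-emission hit detector that reduces raw waveforms to a few features (start time, duration, peak amplitude, threshold-crossing count) so that only compact results need be transmitted. The threshold, dead time, and synthetic burst are invented for illustration.

```python
import numpy as np

def detect_hits(signal, fs, threshold, dead_time=1e-4):
    """Very simple AE hit detector: a hit begins when |x| crosses the
    threshold and ends after `dead_time` with no further crossing.
    Returns (start_time, duration, peak_amplitude, samples_over_threshold)."""
    above = np.flatnonzero(np.abs(signal) > threshold)
    hits, gap = [], int(dead_time * fs)
    while above.size:
        start = above[0]
        brk = np.flatnonzero(np.diff(above) > gap)      # split separate hits
        end_idx = brk[0] if brk.size else above.size - 1
        seg = above[:end_idx + 1]
        hits.append((start / fs, (seg[-1] - start) / fs,
                     float(np.max(np.abs(signal[start:seg[-1] + 1]))), seg.size))
        above = above[end_idx + 1:]
    return hits

# Synthetic test: a decaying 150 kHz burst buried in noise, sampled at 20 MHz
fs = 20e6
t = np.arange(0, 2e-3, 1 / fs)
burst = np.exp(-(t - 5e-4) ** 2 / (2 * (3e-5) ** 2)) * np.sin(2 * np.pi * 150e3 * t)
x = burst + 0.02 * np.random.default_rng(1).standard_normal(t.size)
print(detect_hits(x, fs, threshold=0.2))   # one hit near t = 0.5 ms
```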

  9. Risk Quantification of Systems Engineering Documents Improves Probability of DOD Project Success

    DTIC Science & Technology

    2009-09-01

    ...comprehensive risk model for DoD milestone review documentation as well as recommended changes to the Capability Maturity Model Integration (CMMI) Project... Milestone Documentation, Project Planning, Rational Frame, Political Frame, CMMI Project Planning Process Area, CMMI Risk Management Process Area ...as well as recommended changes to the Capability Maturity Model Integration (CMMI) Project Planning and Risk Management process areas. The intent is to

  10. Interactive information processing for NASA's mesoscale analysis and space sensor program

    NASA Technical Reports Server (NTRS)

    Parker, K. G.; Maclean, L.; Reavis, N.; Wilson, G.; Hickey, J. S.; Dickerson, M.; Karitani, S.; Keller, D.

    1985-01-01

    The Atmospheric Sciences Division (ASD) of the Systems Dynamics Laboratory at NASA's Marshall Space Flight Center (MSFC) is currently involved in interactive information processing for the Mesoscale Analysis and Space Sensor (MASS) program. Specifically, the ASD is engaged in the development and implementation of new space-borne remote sensing technology to observe and measure mesoscale atmospheric processes. These space measurements and conventional observational data are being processed together to gain an improved understanding of the mesoscale structure and the dynamical evolution of the atmosphere relative to cloud development and precipitation processes. To satisfy its vast data processing requirements, the ASD has developed a Research Computer System consisting of three primary computer systems which provides over 20 scientists with a wide range of capabilities for processing and displaying large volumes of remote sensing data. Each of the computers performs a specific function according to its unique capabilities.

  11. David Florida Laboratory Thermal Vacuum Data Processing System

    NASA Technical Reports Server (NTRS)

    Choueiry, Elie

    1994-01-01

    During 1991, the Space Simulation Facility conducted a survey to assess the requirements and analyze the merits for purchasing a new thermal vacuum data processing system for its facilities. A new, integrated, cost effective PC-based system was purchased which uses commercial off-the-shelf software for operation and control. This system can be easily reconfigured and allows its users to access a local area network. In addition, it provides superior performance compared to that of the former system which used an outdated mini-computer and peripheral hardware. This paper provides essential background on the old data processing system's features, capabilities, and the performance criteria that drove the genesis of its successor. This paper concludes with a detailed discussion of the thermal vacuum data processing system's components, features, and its important role in supporting our space-simulation environment and our capabilities for spacecraft testing. The new system was tested during the ANIK E spacecraft test, and was fully operational in November 1991.

  12. Conserving analyst attention units: use of multi-agent software and CEP methods to assist information analysis

    NASA Astrophysics Data System (ADS)

    Rimland, Jeffrey; McNeese, Michael; Hall, David

    2013-05-01

    Although the capability of computer-based artificial intelligence techniques for decision-making and situational awareness has seen notable improvement over the last several decades, the current state-of-the-art still falls short of creating computer systems capable of autonomously making complex decisions and judgments in many domains where data is nuanced and accountability is high. However, there is a great deal of potential for hybrid systems in which software applications augment human capabilities by focusing the analyst's attention to relevant information elements based on both a priori knowledge of the analyst's goals and the processing/correlation of a series of data streams too numerous and heterogeneous for the analyst to digest without assistance. Researchers at Penn State University are exploring ways in which an information framework influenced by Klein's (Recognition Primed Decision) RPD model, Endsley's model of situational awareness, and the Joint Directors of Laboratories (JDL) data fusion process model can be implemented through a novel combination of Complex Event Processing (CEP) and Multi-Agent Software (MAS). Though originally designed for stock market and financial applications, the high performance data-driven nature of CEP techniques provide a natural compliment to the proven capabilities of MAS systems for modeling naturalistic decision-making, performing process adjudication, and optimizing networked processing and cognition via the use of "mobile agents." This paper addresses the challenges and opportunities of such a framework for augmenting human observational capability as well as enabling the ability to perform collaborative context-aware reasoning in both human teams and hybrid human / software agent teams.
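
    To make the CEP-plus-agent idea concrete, here is a minimal sketch in which a stream of heterogeneous events is reduced by one windowed correlation rule, and only the correlated alerts would be escalated to the analyst (or an analyst-facing agent). The event types, window length, and rule are invented for illustration, not taken from the cited framework.

```python
from collections import deque
from dataclasses import dataclass

@dataclass
class Event:
    t: float        # timestamp, seconds
    kind: str       # e.g. "radar_track", "intel_report"
    region: str

class CorrelationRule:
    """Toy complex-event-processing rule: alert when events of two different
    kinds occur in the same region within `window` seconds."""
    def __init__(self, kinds, window=60.0):
        self.kinds, self.window = set(kinds), window
        self.recent = deque()

    def push(self, ev):
        self.recent.append(ev)
        while self.recent and ev.t - self.recent[0].t > self.window:
            self.recent.popleft()                  # drop stale events
        matched = {e.kind for e in self.recent
                   if e.region == ev.region and e.kind in self.kinds}
        if matched == self.kinds:
            return f"ALERT {ev.region}: correlated {sorted(matched)} at t={ev.t}"
        return None

rule = CorrelationRule({"radar_track", "intel_report"}, window=60.0)
stream = [Event(0, "radar_track", "sector7"), Event(20, "intel_report", "sector9"),
          Event(45, "intel_report", "sector7"), Event(200, "radar_track", "sector7")]
for ev in stream:                    # only the correlated event escalates
    alert = rule.push(ev)
    if alert:
        print(alert)                 # -> ALERT sector7 ... at t=45
```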

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zitney, S.E.

    This paper highlights the use of the CAPE-OPEN (CO) standard interfaces in the Advanced Process Engineering Co-Simulator (APECS) developed at the National Energy Technology Laboratory (NETL). The APECS system uses the CO unit operation, thermodynamic, and reaction interfaces to provide its plug-and-play co-simulation capabilities, including the integration of process simulation with computational fluid dynamics (CFD) simulation. APECS also relies heavily on the use of a CO COM/CORBA bridge for running process/CFD co-simulations on multiple operating systems. For process optimization in the face of multiple and sometimes conflicting objectives, APECS offers stochastic modeling and multi-objective optimization capabilities developed to comply with the CO software standard. At NETL, system analysts are applying APECS to a wide variety of advanced power generation systems, ranging from small fuel cell systems to commercial-scale power plants including the coal-fired, gasification-based FutureGen power and hydrogen production plant.

  14. Autonomous Systems, Robotics, and Computing Systems Capability Roadmap: NRC Dialogue

    NASA Technical Reports Server (NTRS)

    Zornetzer, Steve; Gage, Douglas

    2005-01-01

    Contents include the following: Introduction. Process, Mission Drivers, Deliverables, and Interfaces. Autonomy. Crew-Centered and Remote Operations. Integrated Systems Health Management. Autonomous Vehicle Control. Autonomous Process Control. Robotics. Robotics for Solar System Exploration. Robotics for Lunar and Planetary Habitation. Robotics for In-Space Operations. Computing Systems. Conclusion.

  15. Using Design Capability Indices to Satisfy Ranged Sets of Design Requirements

    NASA Technical Reports Server (NTRS)

    Chen, Wei; Allen, Janet K.; Simpson, Timothy W.; Mistree, Farrokh

    1996-01-01

    For robust design it is desirable to allow the design requirements to vary within a certain range rather than setting point targets. This is particularly important during the early stages of design when little is known about the system and its requirements. Toward this end, design capability indices are developed in this paper to assess the capability of a family of designs, represented by a range of top-level design specifications, to satisfy a ranged set of design requirements. Design capability indices are based on process capability indices from statistical process control and provide a single objective, alternate approach to the use of Taguchi's signal-to-noise ratio, which is often used for robust design. Successful implementation of design capability indices ensures that a family of designs conforms to a given ranged set of design requirements. To demonstrate an application and the usefulness of design capability indices, the design of a solar powered irrigation system is presented. Our focus in this paper is on the development and implementation of design capability indices as an alternate approach to the use of the signal-to-noise ratio and not on the results of the example problem, per se.
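
    Since design capability indices are adapted from classical process capability indices, a short sketch of the underlying Cp/Cpk calculation is shown below, applied to a Monte Carlo sample of a top-level design response. The solar-pump response model, noise distributions, and requirement limits are invented for illustration and are not the paper's formulation.

```python
import numpy as np

def capability_indices(samples, lower, upper):
    """Classical Cp / Cpk computed from a sample of the design response."""
    mu, sigma = samples.mean(), samples.std(ddof=1)
    cp = (upper - lower) / (6 * sigma)
    cpk = min(upper - mu, mu - lower) / (3 * sigma)
    return cp, cpk

# Hypothetical design family: delivered pump power (kW) varies with noise in
# insolation and efficiency; the ranged requirement is 4.5-6.0 kW.
rng = np.random.default_rng(42)
insolation = rng.normal(5.5, 0.3, 10_000)
efficiency = rng.normal(0.92, 0.02, 10_000)
power = insolation * efficiency

cp, cpk = capability_indices(power, lower=4.5, upper=6.0)
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")  # Cpk >= 1 would indicate conformance
```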

  16. Engineering Supply Management System: The Next Generation

    DTIC Science & Technology

    1991-09-01

    ...Partial receipts, automatic inventory update, discrepant material, order processing requirements, transaction reversal capability ...August 1991. The system's modules that support the DEH's needs are the Sales Order Processing, Register Sales, Purchase Order Processing, Inventory ...modular system developed by PIC Business Systems, Incorporated. This system possesses Order Processing, Inventory Management, Purchase Orders, and

  17. Supportability Technologies for Future Exploration Missions

    NASA Technical Reports Server (NTRS)

    Watson, Kevin; Thompson, Karen

    2007-01-01

    Future long-duration human exploration missions will be challenged by resupply limitations and mass and volume constraints. Consequently, it will be essential that the logistics footprint required to support these missions be minimized and that capabilities be provided to make them highly autonomous from a logistics perspective. Strategies to achieve these objectives include broad implementation of commonality and standardization at all hardware levels and across all systems, repair of failed hardware at the lowest possible hardware level, and manufacture of structural and mechanical replacement components as needed. Repair at the lowest hardware levels will require the availability of compact, portable systems for diagnosis of failures in electronic systems and verification of system functionality following repair. Rework systems will be required that enable the removal and replacement of microelectronic components with minimal human intervention to minimize skill requirements and training demand for crews. Materials used in the assembly of electronic systems (e.g. solders, fluxes, conformal coatings) must be compatible with the available repair methods and the spacecraft environment. Manufacturing of replacement parts for structural and mechanical applications will require additive manufacturing systems that can generate near-net-shape parts from the range of engineering alloys employed in the spacecraft structure and in the parts utilized in other surface systems. These additive manufacturing processes will need to be supported by real-time non-destructive evaluation during layer-additive processing for on-the-fly quality control. This will provide capabilities for quality control and may serve as an input for closed-loop process control. Additionally, non-destructive methods should be available for material property determination. These nondestructive evaluation processes should be incorporated with the additive manufacturing process - providing an in-process capability to ensure that material deposited during layer-additive processing meets required material property criteria.

  18. Automated fiber placement composite manufacturing: The mission at MSFC's Productivity Enhancement Complex

    NASA Technical Reports Server (NTRS)

    Vickers, John H.; Pelham, Larry I.

    1993-01-01

    Automated fiber placement is a manufacturing process used for producing complex composite structures. It is a notable leap to the state-of-the-art in technology for automated composite manufacturing. The fiber placement capability was established at the Marshall Space Flight Center's (MSFC) Productivity Enhancement Complex in 1992 in collaboration with Thiokol Corporation to provide materials and processes research and development, and to fabricate components for many of the Center's Programs. The Fiber Placement System (FPX) was developed as a distinct solution to problems inherent to other automated composite manufacturing systems. This equipment provides unique capabilities to build composite parts in complex 3-D shapes with concave and other asymmetrical configurations. Components with complex geometries and localized reinforcements usually require labor intensive efforts resulting in expensive, less reproducible components; the fiber placement system has the features necessary to overcome these conditions. The mechanical systems of the equipment have the motion characteristics of a filament winder and the fiber lay-up attributes of a tape laying machine, with the additional capabilities of differential tow payout speeds, compaction and cut-restart to selectively place the correct number of fibers where the design dictates. This capability will produce a repeatable process resulting in lower cost and improved quality and reliability.

  19. Human Support Technology Research to Enable Exploration

    NASA Technical Reports Server (NTRS)

    Joshi, Jitendra

    2003-01-01

    Contents include the following: Advanced life support. System integration, modeling, and analysis. Progressive capabilities. Water processing. Air revitalization systems. Why advanced CO2 removal technology? Solid waste resource recovery systems: lyophilization. ISRU technologies for Mars life support. Atmospheric resources of Mars. N2 consumable/make-up for Mars life. Integrated test beds. Monitoring and controlling the environment. Ground-based commercial technology. Optimizing size vs capability. Water recovery systems. Flight verification topics.

  20. Improvements to information management systems simulator

    NASA Technical Reports Server (NTRS)

    Bilek, R. W.

    1972-01-01

    The performance of personnel in the augmentation and improvement of the interactive IMSIM information management simulation model is summarized. With this augmented model, NASA now has even greater capabilities for the simulation of computer system configurations, data processing loads imposed on these configurations, and executive software to control system operations. Through these simulations, NASA has an extremely cost effective capability for the design and analysis of computer-based data management systems.

  1. Design of a low cost earth resources system

    NASA Technical Reports Server (NTRS)

    Faust, N. L.; Furman, M. D.; Spann, G. W. (Principal Investigator)

    1978-01-01

    The author has identified the following significant results. Survey results indicated that users of remote sensing data in the Southeastern U.S. were increasingly turning to digital processing techniques. All the states surveyed have had some involvement in projects using digitally processed data. Even those states which do not yet have in-house capabilities for digital processing were extremely interested in and were planning to develop such capabilities.

  2. Overview of TPS Tasks

    NASA Technical Reports Server (NTRS)

    Johnson, Sylvia M.

    2000-01-01

    The objectives of the project summarized in this viewgraph presentation are the following: (1) Develop a lightweight and low cost durable Thermal Protection System (TPS) for easy application to reusable launch vehicle payload launchers; (2) Develop quickly processed composite TPS processing and repair techniques; and (3) Develop higher temperature capability tile TPS. The benefits of this technology include reduced installation and operations cost, enhanced payload capability resulting from TPS weight reduction, and enhanced flight envelope and performance resulting from higher temperature capability TPS which can result in improved safety.

  3. SiC/SiC Composites for 1200 C and Above

    NASA Technical Reports Server (NTRS)

    DiCarlo, J. A.; Yun, H.-M.; Morscher, G. N.; Bhatt, R. T.

    2004-01-01

    The successful replacement of metal alloys by ceramic matrix composites (CMC) in high-temperature engine components will require the development of constituent materials and processes that can provide CMC systems with enhanced thermal capability along with the key thermostructural properties required for long-term component service. This chapter presents information concerning processes and properties for five silicon carbide (SiC) fiber-reinforced SiC matrix composite systems recently developed by NASA that can operate under mechanical loading and oxidizing conditions for hundreds of hours at 1204, 1315, and 1427 C, temperatures well above current metal capability. This advanced capability stems in large part from specific NASA-developed processes that significantly improve the creep-rupture and environmental resistance of the SiC fiber as well as the thermal conductivity, creep resistance, and intrinsic thermal stability of the SiC matrices.

  4. Preliminary Retrospective Analysis of Daily Tomotherapy Output Constancy Checks Using Statistical Process Control.

    PubMed

    Mezzenga, Emilio; D'Errico, Vincenzo; Sarnelli, Anna; Strigari, Lidia; Menghi, Enrico; Marcocci, Francesco; Bianchini, David; Benassi, Marcello

    2016-01-01

    The purpose of this study was to retrospectively evaluate the results from a Helical TomoTherapy Hi-Art treatment system relating to quality controls based on daily static and dynamic output checks using statistical process control methods. Individual value X-charts, exponentially weighted moving average charts, and process capability and acceptability indices were used to monitor the treatment system performance. Daily output values measured from January 2014 to January 2015 were considered. The results obtained showed that, although the process was in control, there was an out-of-control situation in the principal maintenance intervention for the treatment system. In particular, process capability indices showed a decreasing percentage of points in control which was, however, acceptable according to AAPM TG148 guidelines. Our findings underline the importance of restricting the acceptable range of daily output checks and suggest a future line of investigation for a detailed process control of daily output checks for the Helical TomoTherapy Hi-Art treatment system.
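
    A minimal sketch of the individual-value and EWMA monitoring described, applied to synthetic daily output measurements. The smoothing constant, control-limit multiplier, baseline length, and data are textbook-style assumptions, not the study's values.

```python
import numpy as np

def ewma_chart(x, lam=0.2, L=2.7):
    """EWMA chart for individual values: returns the EWMA series and its
    per-point lower/upper control limits (standard textbook formulas)."""
    x = np.asarray(x, float)
    mu0 = x[:20].mean()                               # baseline from early data
    sigma = np.mean(np.abs(np.diff(x[:20]))) / 1.128  # moving-range estimate
    z = np.empty_like(x)
    z[0] = lam * x[0] + (1 - lam) * mu0
    for i in range(1, len(x)):
        z[i] = lam * x[i] + (1 - lam) * z[i - 1]
    i = np.arange(1, len(x) + 1)
    half_width = L * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * i)))
    return z, mu0 - half_width, mu0 + half_width

# Synthetic daily output (% of nominal): in control, then a small upward drift
rng = np.random.default_rng(7)
daily = np.concatenate([rng.normal(100.0, 0.4, 60), rng.normal(100.6, 0.4, 30)])
z, lcl, ucl = ewma_chart(daily)
print("out-of-control days:", np.flatnonzero((z < lcl) | (z > ucl)))
```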

  5. Self-conscious robotic system design process--from analysis to implementation.

    PubMed

    Chella, Antonio; Cossentino, Massimo; Seidita, Valeria

    2011-01-01

    Developing robotic systems endowed with self-conscious capabilities means realizing complex sub-systems needing ad-hoc software engineering techniques for their modelling, analysis and implementation. In this chapter the whole process (from analysis to implementation) to model the development of self-conscious robotic systems is presented, and the newly created design process, PASSIC, supporting each part of it, is fully illustrated.

  6. Spacelab Data Processing Facility (SLDPF) quality assurance expert systems development

    NASA Technical Reports Server (NTRS)

    Basile, Lisa R.; Kelly, Angelita C.

    1987-01-01

    The Spacelab Data Processing Facility (SLDPF) is an integral part of the Space Shuttle data network for missions that involve attached scientific payloads. Expert system prototypes were developed to aid in the performance of the quality assurance function of the Spacelab and/or Attached Shuttle Payloads processed telemetry data. The Spacelab Input Processing System (SIPS) and the Spacelab Output Processing System (SOPS), two expert systems, were developed to determine their feasibility and potential in the quality assurance of processed telemetry data. The capabilities and performance of these systems are discussed.

  7. The Community WRF-Hydro Modeling System Version 4 Updates: Merging Toward Capabilities of the National Water Model

    NASA Astrophysics Data System (ADS)

    McAllister, M.; Gochis, D.; Dugger, A. L.; Karsten, L. R.; McCreight, J. L.; Pan, L.; Rafieeinasab, A.; Read, L. K.; Sampson, K. M.; Yu, W.

    2017-12-01

    The community WRF-Hydro modeling system is publicly available and provides researchers and operational forecasters a flexible and extensible capability for performing multi-scale, multi-physics options for hydrologic modeling that can be run independent or fully-interactive with the WRF atmospheric model. The core WRF-Hydro physics model contains very high-resolution descriptions of terrestrial hydrologic process representations such as land-atmosphere exchanges of energy and moisture, snowpack evolution, infiltration, terrain routing, channel routing, basic reservoir representation and hydrologic data assimilation. Complementing the core physics components of WRF-Hydro are an ecosystem of pre- and post-processing tools that facilitate the preparation of terrain and meteorological input data, an open-source hydrologic model evaluation toolset (Rwrfhydro), hydrologic data assimilation capabilities with DART and advanced model visualization capabilities. The National Center for Atmospheric Research (NCAR), through collaborative support from the National Science Foundation and other funding partners, provides community support for the entire WRF-Hydro system through a variety of mechanisms. This presentation summarizes the enhanced user support capabilities that are being developed for the community WRF-Hydro modeling system. These products and services include a new website, open-source code repositories, documentation and user guides, test cases, online training materials, live, hands-on training sessions, an email list serve, and individual user support via email through a new help desk ticketing system. The WRF-Hydro modeling system and supporting tools which now include re-gridding scripts and model calibration have recently been updated to Version 4 and are merging toward capabilities of the National Water Model.

  8. Concept for Highly Mechanized Data Processing, Project 111.

    DTIC Science & Technology

    A concept is developed for a highly mechanized maintenance data processing system capable of deriving factors, influences, and correlations to raise the level of logistics knowledge and lead to the design of a management-control system. (Author)

  9. A Description of the Development, Capabilities, and Operational Status of the Test SLATE Data Acquisition System at the National Transonic Facility

    NASA Technical Reports Server (NTRS)

    Cramer, Christopher J.; Wright, James D.; Simmons, Scott A.; Bobbitt, Lynn E.; DeMoss, Joshua A.

    2015-01-01

    The paper will present a brief background of the previous data acquisition system at the National Transonic Facility (NTF) and the reasoning and goals behind the upgrade to the current Test SLATE (Test Software Laboratory and Automated Testing Environments) data acquisition system. The components, performance characteristics, and layout of the Test SLATE system within the NTF control room will be discussed. The development, testing, and integration of Test SLATE within NTF operations will be detailed. The operational capabilities of the system will be outlined including: test setup, instrumentation calibration, automatic test sequencer setup, data recording, communication between data and facility control systems, real time display monitoring, and data reduction. The current operational status of the Test SLATE system and its performance during recent NTF testing will be highlighted including high-speed, frame-by-frame data acquisition with conditional sampling post-processing applied. The paper concludes with current development work on the system including the capability for real-time conditional sampling during data acquisition and further efficiency enhancements to the wind tunnel testing process.

  10. SIGMA Release v1.2 - Capabilities, Enhancements and Fixes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mahadevan, Vijay; Grindeanu, Iulian R.; Ray, Navamita

    In this report, we present details on the SIGMA toolkit along with its component structure, capabilities, feature additions in FY15, release cycles, and continuous integration process. These software processes, along with updated documentation, are imperative to successfully integrate and utilize SIGMA in several applications, including the SHARP coupled analysis toolkit for reactor core systems funded under the NEAMS DOE-NE program.

  11. A comparison of the UHF Follow-On and MILSTAR satellite communication systems

    NASA Astrophysics Data System (ADS)

    Perkins, Clifton E., Jr.

    1991-09-01

    The author compares the UHF Follow-On and MILSTAR satellite communication systems. The comparison uses an analytical hierarchy process. Although the two systems have been tasked with different missions, a comparison of cost, capability, and orbit is conducted. UFO provides many of the same capabilities as MILSTAR, but on a smaller scale. Since UFO is also a new space system acquisition, it is used to compare dollars spent to field a viable communication system. A review of frequency bands, losses, and problems is conducted to establish the relationship. Cost data is provided to establish the major difference in the systems. While MILSTAR does possess more total capability than UFO, it is 10 times more costly. Additionally, UFO is a satellite that will evolve with new technology while MILSTAR is built to full capability immediately. In the author's opinion, the incremental performance of MILSTAR does not justify its incremental cost.
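
    Since the comparison uses the analytic hierarchy process, here is a hedged sketch of the core AHP step: deriving priority weights from a pairwise comparison matrix via its principal eigenvector and checking judgment consistency. The criteria and pairwise judgments are invented, not the thesis's data.

```python
import numpy as np

# Hypothetical pairwise judgments (Saaty 1-9 scale) for three criteria:
# cost, capability, orbit.  A[i, j] = importance of criterion i relative to j.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                 # principal eigenvalue (Perron root)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                # priority weights

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)        # consistency index
cr = ci / 0.58                              # random index for n = 3 is 0.58
print("weights (cost, capability, orbit):", np.round(w, 3), " CR =", round(cr, 3))
```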

  12. Improvement of Organizational Performance and Instructional Design: An Analogy Based on General Principles of Natural Information Processing Systems

    ERIC Educational Resources Information Center

    Darabi, Aubteen; Kalyuga, Slava

    2012-01-01

    The process of improving organizational performance through designing systemic interventions has remarkable similarities to designing instruction for improving learners' performance. Both processes deal with subjects (learners and organizations, respectively) with certain capabilities that are exposed to novel information designed for producing…

  13. System enhancements of Mesoscale Analysis and Space Sensor (MASS) computer system

    NASA Technical Reports Server (NTRS)

    Hickey, J. S.; Karitani, S.

    1985-01-01

    The interactive information processing for the mesoscale analysis and space sensor (MASS) program is reported. The development and implementation of new spaceborne remote sensing technology to observe and measure atmospheric processes is described. The space measurements and conventional observational data are processed together to gain an improved understanding of the mesoscale structure and dynamical evolution of the atmosphere relative to cloud development and precipitation processes. A Research Computer System consisting of three primary computers was developed (HP-1000F, Perkin-Elmer 3250, and Harris/6) which provides a wide range of capabilities for processing and displaying interactively large volumes of remote sensing data. The development of a MASS data base management and analysis system on the HP-1000F computer and extending these capabilities by integration with the Perkin-Elmer and Harris/6 computers using the MSFC's Apple III microcomputer workstations is described. The objectives are: to design hardware enhancements for computer integration and to provide data conversion and transfer between machines.

  14. Development of a preliminary design of a method to measure the effectiveness of virus exclusion during water process reclamation at zero-G

    NASA Technical Reports Server (NTRS)

    Fraser, A. S.; Wells, A. F.; Tenoso, H. J.; Linnecke, C. B.

    1976-01-01

    Organon Diagnostics has developed, under NASA sponsorship, a monitoring system to test the capability of a water recovery system to reject the passage of viruses into the recovered water. In this system, a non-pathogenic marker virus, bacteriophage F2, is fed into the process stream before the recovery unit and the reclaimed water is assayed for its presence. An engineering preliminary design has been performed as a parallel effort to the laboratory development of the marker virus test system. Engineering schematics and drawings present a preliminary instrument design of a fully functional laboratory prototype capable of zero-G operation.

  15. Sensing Super-position: Visual Instrument Sensor Replacement

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Schipper, John F.

    2006-01-01

    The coming decade of fast, cheap and miniaturized electronics and sensory devices opens new pathways for the development of sophisticated equipment to overcome limitations of the human senses. This project addresses the technical feasibility of augmenting human vision through Sensing Super-position using a Visual Instrument Sensory Organ Replacement (VISOR). The current implementation of the VISOR device translates visual and other passive or active sensory instruments into sounds, which become relevant when the visual resolution is insufficient for very difficult and particular sensing tasks. A successful Sensing Super-position meets many human and pilot vehicle system requirements. The system can be further developed into a cheap, portable, and low-power device, taking into account the limited capabilities of the human user as well as the typical characteristics of his dynamic environment. The system operates in real time, giving the desired information for the particular augmented sensing tasks. The Sensing Super-position device increases image resolution perception, which is obtained via an auditory representation as well as the visual representation. Auditory mapping is performed to distribute an image in time. The three-dimensional spatial brightness and multi-spectral maps of a sensed image are processed using real-time image processing techniques (e.g. histogram normalization) and transformed into a two-dimensional map of an audio signal as a function of frequency and time. This paper details the approach of developing Sensing Super-position systems as a way to augment the human vision system by exploiting the capabilities of the human hearing system as an additional neural input. The human hearing system is capable of learning to process and interpret extremely complicated and rapidly changing auditory patterns. The known capabilities of the human hearing system to learn and understand complicated auditory patterns provided the basic motivation for developing an image-to-sound mapping system.
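
    A minimal sketch of the core image-to-sound idea described above: the image is scanned column by column, and each row's brightness sets the amplitude of a tone whose frequency encodes vertical position. The sample rate, frequency range, scan duration, and normalization are illustrative assumptions, not the VISOR implementation.

```python
import numpy as np

def image_to_sound(image, fs=22050, col_duration=0.02, f_lo=200.0, f_hi=5000.0):
    """Scan the image left-to-right; each column becomes a short audio frame in
    which row brightness weights a tone at that row's assigned frequency."""
    img = image.astype(float)
    img = (img - img.min()) / (np.ptp(img) + 1e-9)             # normalize brightness
    rows, cols = img.shape
    freqs = np.logspace(np.log10(f_hi), np.log10(f_lo), rows)  # top row = high pitch
    t = np.arange(int(col_duration * fs)) / fs
    tones = np.sin(2 * np.pi * freqs[:, None] * t[None, :])    # one tone per row
    frames = [img[:, c] @ tones for c in range(cols)]          # weighted tone mix
    audio = np.concatenate(frames)
    return audio / (np.max(np.abs(audio)) + 1e-9)

# Example: a 64x64 image with a bright diagonal produces a falling pitch sweep
img = np.eye(64) * 255
samples = image_to_sound(img)
print(samples.shape)   # (28224,) mono samples: 64 columns x 441 samples each
```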

  16. Tunable Low Energy, Compact and High Performance Neuromorphic Circuit for Spike-Based Synaptic Plasticity

    PubMed Central

    Rahimi Azghadi, Mostafa; Iannella, Nicolangelo; Al-Sarawi, Said; Abbott, Derek

    2014-01-01

    Cortical circuits in the brain have long been recognised for their information processing capabilities and have been studied both experimentally and theoretically via spiking neural networks. Neuromorphic engineers are primarily concerned with translating the computational capabilities of biological cortical circuits, using the Spiking Neural Network (SNN) paradigm, into in silico applications that can mimic the behaviour and capabilities of real biological circuits/systems. These capabilities include low power consumption, compactness, and relevant dynamics. In this paper, we propose a new accelerated-time circuit that has several advantages over its previous neuromorphic counterparts in terms of compactness, power consumption, and capability to mimic the outcomes of biological experiments. The presented circuit simulation results demonstrate that, in comparing the new circuit to previous published synaptic plasticity circuits, reduced silicon area and lower energy consumption for processing each spike is achieved. In addition, it can be tuned in order to closely mimic the outcomes of various spike timing- and rate-based synaptic plasticity experiments. The proposed circuit is also investigated and compared to other designs in terms of tolerance to mismatch and process variation. Monte Carlo simulation results show that the proposed design is much more stable than its previous counterparts in terms of vulnerability to transistor mismatch, which is a significant challenge in analog neuromorphic design. All these features make the proposed design an ideal circuit for use in large scale SNNs, which aim at implementing neuromorphic systems with an inherent capability that can adapt to a continuously changing environment, thus leading to systems with significant learning and computational abilities. PMID:24551089

  17. Tunable low energy, compact and high performance neuromorphic circuit for spike-based synaptic plasticity.

    PubMed

    Rahimi Azghadi, Mostafa; Iannella, Nicolangelo; Al-Sarawi, Said; Abbott, Derek

    2014-01-01

    Cortical circuits in the brain have long been recognised for their information processing capabilities and have been studied both experimentally and theoretically via spiking neural networks. Neuromorphic engineers are primarily concerned with translating the computational capabilities of biological cortical circuits, using the Spiking Neural Network (SNN) paradigm, into in silico applications that can mimic the behaviour and capabilities of real biological circuits/systems. These capabilities include low power consumption, compactness, and relevant dynamics. In this paper, we propose a new accelerated-time circuit that has several advantages over its previous neuromorphic counterparts in terms of compactness, power consumption, and capability to mimic the outcomes of biological experiments. Circuit simulation results demonstrate that, compared with previously published synaptic plasticity circuits, the new circuit achieves reduced silicon area and lower energy consumption per processed spike. In addition, it can be tuned in order to closely mimic the outcomes of various spike timing- and rate-based synaptic plasticity experiments. The proposed circuit is also investigated and compared to other designs in terms of tolerance to mismatch and process variation. Monte Carlo simulation results show that the proposed design is much more stable than its previous counterparts in terms of vulnerability to transistor mismatch, which is a significant challenge in analog neuromorphic design. All these features make the proposed design an ideal circuit for use in large scale SNNs, which aim at implementing neuromorphic systems with an inherent capability that can adapt to a continuously changing environment, thus leading to systems with significant learning and computational abilities.
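
    The synaptic plasticity dynamics that the circuit above realizes in analog VLSI can be illustrated in software with the classic pair-based STDP rule, as in the Python sketch below. The time constants, learning rates, and spike trains are illustrative assumptions and are not taken from the paper.

      import numpy as np

      def pair_stdp(pre_spikes, post_spikes, w0=0.5,
                    a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0,
                    w_min=0.0, w_max=1.0):
          """All-to-all pair-based STDP: potentiate when a pre-spike precedes a
          post-spike, depress otherwise.  Constants are illustrative assumptions."""
          w = w0
          for t_post in post_spikes:
              for t_pre in pre_spikes:
                  dt = t_post - t_pre              # spike time difference, ms
                  if dt > 0:                       # pre before post -> potentiation
                      w += a_plus * np.exp(-dt / tau_plus)
                  elif dt < 0:                     # post before pre -> depression
                      w -= a_minus * np.exp(dt / tau_minus)
          return float(np.clip(w, w_min, w_max))

      # Pre-spikes consistently leading post-spikes should strengthen the synapse.
      print(pair_stdp(pre_spikes=[10.0, 30.0, 50.0], post_spikes=[12.0, 32.0, 52.0]))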

  18. Improvements and Extensions for Joint Polar Satellite System Algorithms

    NASA Astrophysics Data System (ADS)

    Grant, K. D.

    2016-12-01

    The National Oceanic and Atmospheric Administration (NOAA) and National Aeronautics and Space Administration (NASA) are jointly acquiring the next-generation civilian weather satellite system: the Joint Polar Satellite System (JPSS). JPSS replaced the afternoon orbit component and ground processing of the old POES system managed by NOAA. JPSS satellites carry sensors designed to collect meteorological, oceanographic, climatological, and solar-geophysical observations of the earth, atmosphere, and space. The ground processing system for JPSS is the Common Ground System (CGS), which provides command, control, and communications (C3), data processing, and product delivery. CGS's data processing capability provides environmental data products (Sensor Data Records (SDRs) and Environmental Data Records (EDRs)) to the NOAA Satellite Operations Facility. The first satellite in the JPSS constellation, S-NPP, was launched in October 2011. The second satellite, JPSS-1, is scheduled for launch in January 2017. During a satellite's calibration and validation (Cal/Val) campaign, numerous algorithm updates occur. Changes identified during Cal/Val become available for implementation into the operational system for both S-NPP and JPSS-1. In addition, new capabilities, such as higher spectral and spatial resolution, will be exercised on JPSS-1. This paper will describe changes to current algorithms and products resulting from S-NPP Cal/Val and related initiatives for improved capabilities. Improvements include Cross-track Infrared Sounder high spectral resolution processing, extended spectral and spatial ranges for the Ozone Mapping and Profiler Suite ozone Total Column and Nadir Profile products, and updates to Vegetation Index, Snow Cover, Active Fires, Suspended Matter, and Ocean Color. Additional updates will include Sea Surface Temperature, Cloud Mask, Cloud Properties, and other improvements.

  19. Capability Description for NASA's F/A-18 TN 853 as a Testbed for the Integrated Resilient Aircraft Control Project

    NASA Technical Reports Server (NTRS)

    Hanson, Curt

    2009-01-01

    The NASA F/A-18 tail number (TN) 853 full-scale Integrated Resilient Aircraft Control (IRAC) testbed has been designed with a full array of capabilities in support of the Aviation Safety Program. Highlights of the system's capabilities include: 1) a quad-redundant research flight control system for safely interfacing controls experiments to the aircraft's control surfaces; 2) a dual-redundant airborne research test system for hosting multi-disciplinary state-of-the-art adaptive control experiments; 3) a robust reversionary configuration for recovery from unusual attitudes and configurations; 4) significant research instrumentation, particularly in the area of static loads; 5) extensive facilities for experiment simulation, data logging, real-time monitoring and post-flight analysis capabilities; and 6) significant growth capability in terms of interfaces and processing power.

  20. Oak Ridge Computerized Hierarchical Information System (ORCHIS) status report, July 1973

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brooks, A.A.

    1974-01-01

    This report summarizes the concepts, software, and contents of the Oak Ridge Computerized Hierarchical Information System. This data analysis and text processing system was developed as an integrated, comprehensive information processing capability to meet the needs of an on-going multidisciplinary research and development organization. (auth)

  1. Eigensystem realization algorithm user's guide for VAX/VMS computers: Version 931216

    NASA Technical Reports Server (NTRS)

    Pappa, Richard S.

    1994-01-01

    The eigensystem realization algorithm (ERA) is a multiple-input, multiple-output, time domain technique for structural modal identification and minimum-order system realization. Modal identification is the process of calculating structural eigenvalues and eigenvectors (natural vibration frequencies, damping, mode shapes, and modal masses) from experimental data. System realization is the process of constructing state-space dynamic models for modern control design. This user's guide documents VAX/VMS-based FORTRAN software developed by the author since 1984 in conjunction with many applications. It consists of a main ERA program and 66 pre- and post-processors. The software provides complete modal identification capabilities and most system realization capabilities.
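
    For readers unfamiliar with ERA itself, the following Python sketch shows the textbook single-input, single-output form of the algorithm: block Hankel matrices built from impulse-response Markov parameters, an SVD, and a truncated state-space realization. It is a minimal illustration, not the VAX/VMS FORTRAN software the guide documents, and the demo signal and matrix dimensions are arbitrary.

      import numpy as np

      def era(markov, order, m=1, r=1, alpha=20, beta=20):
          """Minimal SISO ERA: `markov` holds the impulse response h[1], h[2], ...;
          `order` is the retained model order (textbook form, illustrative only)."""
          # Block Hankel matrices H(0) and H(1) from the Markov parameters.
          H0 = np.array([[markov[i + j] for j in range(beta)] for i in range(alpha)])
          H1 = np.array([[markov[i + j + 1] for j in range(beta)] for i in range(alpha)])
          U, s, Vt = np.linalg.svd(H0, full_matrices=False)
          U, s, Vt = U[:, :order], s[:order], Vt[:order, :]
          S_inv_sqrt = np.diag(1.0 / np.sqrt(s))
          S_sqrt = np.diag(np.sqrt(s))
          A = S_inv_sqrt @ U.T @ H1 @ Vt.T @ S_inv_sqrt    # state matrix
          B = (S_sqrt @ Vt)[:, :r]                         # input matrix
          C = (U @ S_sqrt)[:m, :]                          # output matrix
          return A, B, C

      # Identify a 2-state model from the impulse response of a damped oscillator.
      dt, zeta, wn = 0.05, 0.02, 2 * np.pi
      k = np.arange(1, 101)
      h = np.exp(-zeta * wn * k * dt) * np.sin(wn * np.sqrt(1 - zeta**2) * k * dt)
      A, B, C = era(h, order=2)
      print(np.abs(np.linalg.eigvals(A)))   # discrete-time poles just inside the unit circle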

  2. Development and fabrication of a solar cell junction processing system

    NASA Technical Reports Server (NTRS)

    1984-01-01

    A processing system capable of producing solar cell junctions by ion implantation followed by pulsed electron beam annealing was developed and constructed. The machine was to be capable of processing 4-inch diameter single-crystal wafers at a rate of 10^7 wafers per year. A microcomputer-controlled pulsed electron beam annealer with a vacuum-interlocked wafer transport system was designed, built, and demonstrated to produce solar cell junctions on 4-inch wafers with an AM1 efficiency of 12%. Experiments showed that a non-mass-analyzed (NMA) ion beam could implant 10 keV phosphorus dopant to form solar cell junctions equivalent to mass-analyzed implants. An NMA ion implanter, compatible with the pulsed electron beam annealer and wafer transport system, was designed in detail but was not built because of program termination.

  3. NASA Stennis Space Center Integrated System Health Management Test Bed and Development Capabilities

    NASA Technical Reports Server (NTRS)

    Figueroa, Fernando; Holland, Randy; Coote, David

    2006-01-01

    Integrated System Health Management (ISHM) is a capability that focuses on determining the condition (health) of every element in a complex system (detecting anomalies, diagnosing causes, and prognosing future anomalies) and on providing data, information, and knowledge (DIaK), not just data, to control systems for safe and effective operation. This capability is currently provided by large teams of people, primarily on the ground, but needs to be embedded in on-board systems to a higher degree to enable NASA's new Exploration Mission (long-term travel and stay in space) while increasing safety and decreasing the life cycle costs of spacecraft (vehicles; platforms; bases or outposts; and ground test, launch, and processing operations). The topics related to this capability include: 1) ISHM Related News Articles; 2) ISHM Vision For Exploration; 3) Layers Representing How ISHM is Currently Performed; 4) ISHM Testbeds & Prototypes at NASA SSC; 5) ISHM Functional Capability Level (FCL); 6) ISHM Functional Capability Level (FCL) and Technology Readiness Level (TRL); 7) Core Elements: Capabilities Needed; 8) Core Elements; 9) Open Systems Architecture for Condition-Based Maintenance (OSA-CBM); 10) Core Elements: Architecture, taxonomy, and ontology (ATO) for DIaK management; 11) Core Elements: ATO for DIaK Management; 12) ISHM Architecture Physical Implementation; 13) Core Elements: Standards; 14) Systematic Implementation; 15) Sketch of Work Phasing; 16) Interrelationship Between Traditional Avionics Systems, Time Critical ISHM and Advanced ISHM; 17) Testbeds and On-Board ISHM; 18) Testbed Requirements: RETS AND ISS; 19) Sustainable Development and Validation Process; 20) Development of on-board ISHM; 21) Taxonomy/Ontology of Object Oriented Implementation; 22) ISHM Capability on the E1 Test Stand Hydraulic System; 23) Define Relationships to Embed Intelligence; 24) Intelligent Elements Physical and Virtual; 25) ISHM Testbeds and Prototypes at SSC Current Implementations; 26) Trailer-Mounted RETS; 27) Modeling and Simulation; 28) Summary ISHM Testbed Environments; 29) Data Mining - ARC; 30) Transitioning ISHM to Support NASA Missions; 31) Feature Detection Routines; 32) Sample Features Detected in SSC Test Stand Data; and 33) Health Assessment Database (DIaK Repository).

  4. Material Processing Laser Systems In Production

    NASA Astrophysics Data System (ADS)

    Taeusch, David R.

    1988-11-01

    The laser processing system is now a respected, productive machine tool in the manufacturing industries. Systems in use today are proving their cost-effectiveness and their ability to process quality parts. Several types of industrial lasers are described and their applications are discussed, with emphasis placed on the production environment and on the methods required to protect optical equipment against this normally hostile environment.

  5. Architecture for Survivable System Processing (ASSP)

    NASA Astrophysics Data System (ADS)

    Wood, Richard J.

    1991-11-01

    The Architecture for Survivable System Processing (ASSP) Program is a multi-phase effort to implement Department of Defense (DOD) and commercially developed high-tech hardware, software, and architectures for reliable space avionics and ground based systems. System configuration options provide processing capabilities to address Time Dependent Processing (TDP), Object Dependent Processing (ODP), and Mission Dependent Processing (MDP) requirements through Open System Architecture (OSA) alternatives that allow for the enhancement, incorporation, and capitalization of a broad range of development assets. High technology developments in hardware, software, and networking models address the technology challenges of long processor lifetimes, fault tolerance, reliability, throughput, memories, radiation hardening, size, weight, power (SWAP), and security. Hardware and software design, development, and implementation focus on the interconnectivity and interoperability of an open system architecture, with the aim of applying new technology in practical OSA components. To ensure a widely acceptable architecture capable of interfacing with various commercial and military components, the program provides for regular interaction with standardization working groups such as the International Standards Organization (ISO), the American National Standards Institute (ANSI), the Society of Automotive Engineers (SAE), and the Institute of Electrical and Electronics Engineers (IEEE). Selection of a viable open architecture is based on the widely accepted standards that implement the ISO/OSI Reference Model.

  6. Architecture for Survivable System Processing (ASSP)

    NASA Technical Reports Server (NTRS)

    Wood, Richard J.

    1991-01-01

    The Architecture for Survivable System Processing (ASSP) Program is a multi-phase effort to implement Department of Defense (DOD) and commercially developed high-tech hardware, software, and architectures for reliable space avionics and ground based systems. System configuration options provide processing capabilities to address Time Dependent Processing (TDP), Object Dependent Processing (ODP), and Mission Dependent Processing (MDP) requirements through Open System Architecture (OSA) alternatives that allow for the enhancement, incorporation, and capitalization of a broad range of development assets. High technology developments in hardware, software, and networking models address the technology challenges of long processor lifetimes, fault tolerance, reliability, throughput, memories, radiation hardening, size, weight, power (SWAP), and security. Hardware and software design, development, and implementation focus on the interconnectivity and interoperability of an open system architecture, with the aim of applying new technology in practical OSA components. To ensure a widely acceptable architecture capable of interfacing with various commercial and military components, the program provides for regular interaction with standardization working groups such as the International Standards Organization (ISO), the American National Standards Institute (ANSI), the Society of Automotive Engineers (SAE), and the Institute of Electrical and Electronics Engineers (IEEE). Selection of a viable open architecture is based on the widely accepted standards that implement the ISO/OSI Reference Model.

  7. Printed Carbon Nanotube Electronics and Sensor Systems.

    PubMed

    Chen, Kevin; Gao, Wei; Emaminejad, Sam; Kiriya, Daisuke; Ota, Hiroki; Nyein, Hnin Yin Yin; Takei, Kuniharu; Javey, Ali

    2016-06-01

    Printing technologies offer large-area, high-throughput production capabilities for electronics and sensors on mechanically flexible substrates that can conformally cover different surfaces. These capabilities enable a wide range of new applications such as low-cost disposable electronics for health monitoring and wearables, extremely large format electronic displays, interactive wallpapers, and sensing arrays. Solution-processed carbon nanotubes have been shown to be a promising candidate for such printing processes, offering stable devices with high performance. Here, recent progress made in printed carbon nanotube electronics is discussed in terms of materials, processing, devices, and applications. Research challenges and opportunities moving forward from processing and system-level integration points of view are also discussed for enabling practical applications. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Matrix evaluation of science objectives

    NASA Technical Reports Server (NTRS)

    Wessen, Randii R.

    1994-01-01

    The most fundamental objective of all robotic planetary spacecraft is to return science data. To accomplish this, a spacecraft is fabricated and built, software is planned and coded, and a ground system is designed and implemented. However, the quantitative analysis required to determine how the collection of science data drives ground system capabilities has received very little attention. This paper defines a process by which science objectives can be quantitatively evaluated. By applying it to the Cassini Mission to Saturn, this paper further illustrates the power of this technique. The results show which science objectives drive specific ground system capabilities. In addition, this process can assist system engineers and scientists in the selection of the science payload during pre-project mission planning; ground system designers during ground system development and implementation; and operations personnel during mission operations.
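
    A minimal sketch of the kind of objective-versus-capability matrix evaluation described above is shown below in Python. The objectives, capabilities, scores, and weights are hypothetical placeholders, not values from the Cassini analysis.

      import numpy as np

      # Hypothetical science objectives (rows) scored against ground-system
      # capabilities (columns); entries are 0-3 "how strongly this objective
      # stresses this capability" scores, and weights reflect objective priority.
      objectives = ["ring imaging", "magnetosphere survey", "Titan radar mapping"]
      capabilities = ["downlink rate", "pointing control", "onboard storage"]
      scores = np.array([[3, 2, 1],
                         [1, 0, 2],
                         [3, 3, 3]], dtype=float)
      weights = np.array([0.5, 0.2, 0.3])          # priorities, summing to 1

      capability_demand = weights @ scores          # weighted demand per capability
      for cap, demand in zip(capabilities, capability_demand):
          print(f"{cap}: {demand:.2f}")
      # The largest entries identify which capabilities the science objectives drive.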

  9. A modular, programmable measurement system for physiological and spaceflight applications

    NASA Technical Reports Server (NTRS)

    Hines, John W.; Ricks, Robert D.; Miles, Christopher J.

    1993-01-01

    The NASA-Ames Sensors 2000! Program has developed a small, compact, modular, programmable, sensor signal conditioning and measurement system, initially targeted for Life Sciences Spaceflight Programs. The system consists of a twelve-slot, multi-layer, distributed function backplane, a digital microcontroller/memory subsystem, conditioned and isolated power supplies, and six application-specific, physiological signal conditioners. Each signal conditioner is capable of being programmed for gains, offsets, calibration and operate modes, and, in some cases, selectable outputs and functional modes. Presently, the system has the capability for measuring ECG, EMG, EEG, Temperature, Respiration, Pressure, Force, and Acceleration parameters, in physiological ranges. The measurement system makes heavy use of surface-mount packaging technology, resulting in plug-in modules sized 125 x 55 mm. The complete 12-slot system is contained within a volume of 220 x 150 x 70 mm. The system's capabilities extend well beyond the specific objectives of NASA programs. Indeed, the potential commercial uses of the technology are virtually limitless. In addition to applications in medical and biomedical sensing, the system might also be used in process control situations, in clinical or research environments, in general instrumentation systems, factory processing, or any other applications where high quality measurements are required.

  10. A modular, programmable measurement system for physiological and spaceflight applications

    NASA Astrophysics Data System (ADS)

    Hines, John W.; Ricks, Robert D.; Miles, Christopher J.

    1993-02-01

    The NASA-Ames Sensors 2000! Program has developed a small, compact, modular, programmable, sensor signal conditioning and measurement system, initially targeted for Life Sciences Spaceflight Programs. The system consists of a twelve-slot, multi-layer, distributed function backplane, a digital microcontroller/memory subsystem, conditioned and isolated power supplies, and six application-specific, physiological signal conditioners. Each signal conditioner is capable of being programmed for gains, offsets, calibration and operate modes, and, in some cases, selectable outputs and functional modes. Presently, the system has the capability for measuring ECG, EMG, EEG, Temperature, Respiration, Pressure, Force, and Acceleration parameters, in physiological ranges. The measurement system makes heavy use of surface-mount packaging technology, resulting in plug-in modules sized 125 x 55 mm. The complete 12-slot system is contained within a volume of 220 x 150 x 70 mm. The system's capabilities extend well beyond the specific objectives of NASA programs. Indeed, the potential commercial uses of the technology are virtually limitless. In addition to applications in medical and biomedical sensing, the system might also be used in process control situations, in clinical or research environments, in general instrumentation systems, factory processing, or any other applications where high quality measurements are required.

  11. Ion Implantation with in-situ Patterning for IBC Solar Cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Graff, John W.

    2014-10-24

    Interdigitated back-side Contact (IBC) solar cells are the highest efficiency silicon solar cells currently on the market. Unfortunately, the cost to produce these solar cells is also very high, due to the large number of processing steps required. Varian believes that only the combination of high efficiency and low cost can meet the stated goal of $1/Wp. The core of this program has been to develop an in-situ patterning capability for an ion implantation system capable of producing patterned doped regions for IBC solar cells. Such a patterning-capable ion implanter can reduce the number of process steps required to manufacture IBC cells, and therefore significantly reduce the cost. The present program was organized into three phases. Phase I was to select a patterning approach and determine the patterning requirements for IBC cells. Phase II consists of construction of a Beta ion implantation system containing in-situ patterning capability. Phase III consists of shipping and installation of the ion implant system in a customer factory where it will be tested and proven in a pilot production line.

  12. From Process to Product: Your Risk Process at Work

    NASA Technical Reports Server (NTRS)

    Kundrot, Craig E.; Fogarty, Jenifer; Charles, John; Buquo, Lynn; Sibonga, Jean; Alexander, David; Horn, Wayne G.; Edwards, J. Michelle

    2010-01-01

    The Space Life Sciences Directorate (SLSD) and Human Research Program (HRP) at the NASA/Johnson Space Center work together to address and manage the human health and performance risks associated with human space flight. This includes all human system requirements before, during, and after space flight, providing for research, and managing the risk of adverse long-term health outcomes for the crew. We previously described the framework and processes developed for identifying and managing these human system risks. The focus of this panel is to demonstrate how the implementation of the framework and associated processes has provided guidance in the management and communication of human system risks. The risks of early onset osteoporosis, CO2 exposure, and intracranial hypertension in particular have all benefitted from the processes developed for human system risk management. Moreover, we are continuing to develop capabilities, particularly in the area of information architecture, which will also be described. We are working to create a system whereby all risks and associated actions can be tracked and related to one another electronically. Such a system will enhance the management and communication capabilities for the human system risks, thereby increasing the benefit to researchers and flight surgeons.

  13. Wake Vortex Tangential Velocity Adaptive Spectral (TVAS) algorithm for pulsed Lidar systems.

    DOT National Transportation Integrated Search

    2011-06-20

    In 2008 the FAA tasked the Volpe Center with the development of a government-owned processing package capable of performing wake detection, characterization, and tracking. The current paper presents the background, progress, and capabilities to date...

  14. The Business Case for Systems Engineering Study: Results of the Systems Engineering Effectiveness Survey

    DTIC Science & Technology

    2012-11-01

    reflecting the fact that project managers can often optimize the value of one of these parameters, but only at the expense of the other two. For example...which system developers can compare their SE capabilities to manage SE process improvements. As a reward for their participation, the companion...higher requirements development and management capability is strongly associated with better program performance, particularly on challenging projects

  15. Integrating a Genetic Algorithm Into a Knowledge-Based System for Ordering Complex Design Processes

    NASA Technical Reports Server (NTRS)

    Rogers, James L.; McCulley, Collin M.; Bloebaum, Christina L.

    1996-01-01

    The design cycle associated with large engineering systems requires an initial decomposition of the complex system into design processes which are coupled through the transference of output data. Some of these design processes may be grouped into iterative subcycles. In analyzing or optimizing such a coupled system, it is essential to be able to determine the best ordering of the processes within these subcycles to reduce design cycle time and cost. Many decomposition approaches assume the capability is available to determine what design processes and couplings exist and what order of execution will be imposed during the design cycle. Unfortunately, this is often a complex problem and beyond the capabilities of a human design manager. A new feature, a genetic algorithm, has been added to DeMAID (Design Manager's Aid for Intelligent Decomposition) to allow the design manager to rapidly examine many different combinations of ordering processes in an iterative subcycle and to optimize the ordering based on cost, time, and iteration requirements. Two sample test cases are presented to show the effects of optimizing the ordering with a genetic algorithm.
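
    To illustrate the idea of genetically optimizing the ordering of processes within an iterative subcycle, the Python sketch below uses a deliberately simplified permutation GA (tournament selection with swap mutation, no crossover) to minimize a hypothetical count of feedback couplings. The process names, coupling list, and cost function are assumptions for the example and do not reproduce DeMAID.

      import random

      def feedback_cost(order, couples):
          """Count feedback couplings: outputs of a later process feeding an earlier one."""
          pos = {p: i for i, p in enumerate(order)}
          return sum(1 for src, dst in couples if pos[src] > pos[dst])

      def ga_order(processes, couples, pop_size=60, generations=200, seed=1):
          """Tiny permutation GA: tournament selection plus swap mutation."""
          rng = random.Random(seed)
          pop = [rng.sample(processes, len(processes)) for _ in range(pop_size)]
          for _ in range(generations):
              nxt = []
              for _ in range(pop_size):
                  a, b = rng.sample(pop, 2)                     # tournament of two
                  parent = min(a, b, key=lambda o: feedback_cost(o, couples))
                  child = parent[:]
                  i, j = rng.sample(range(len(child)), 2)       # swap mutation
                  child[i], child[j] = child[j], child[i]
                  nxt.append(child)
              pop = nxt
          return min(pop, key=lambda o: feedback_cost(o, couples))

      # Hypothetical subcycle: (src feeds dst) couplings among five processes.
      procs = list("ABCDE")
      couples = [("A", "B"), ("B", "C"), ("C", "D"), ("D", "E"), ("E", "B")]
      best = ga_order(procs, couples)
      print(best, feedback_cost(best, couples))   # an ordering with few feedback loops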

  16. Low-cost, high-speed back-end processing system for high-frequency ultrasound B-mode imaging.

    PubMed

    Chang, Jin Ho; Sun, Lei; Yen, Jesse T; Shung, K Kirk

    2009-07-01

    For real-time visualization of the mouse heart (6 to 13 beats per second), a back-end processing system involving high-speed signal processing functions to form and display images has been developed. This back-end system was designed with new signal processing algorithms to achieve a frame rate of more than 400 images per second. These algorithms were implemented in a simple and cost-effective manner with a single field-programmable gate array (FPGA) and software programs written in C++. The operating speed of the back-end system was investigated by recording the time required for transferring an image to a personal computer. Experimental results showed that the back-end system is capable of producing 433 images per second. To evaluate the imaging performance of the back-end system, a complete imaging system was built. This imaging system, which consisted of a recently reported high-speed mechanical sector scanner assembled with the back-end system, was tested by imaging a wire phantom, a pig eye (in vitro), and a mouse heart (in vivo). It was shown that this system is capable of providing high spatial resolution images with fast temporal resolution.

  17. Low-Cost, High-Speed Back-End Processing System for High-Frequency Ultrasound B-Mode Imaging

    PubMed Central

    Chang, Jin Ho; Sun, Lei; Yen, Jesse T.; Shung, K. Kirk

    2009-01-01

    For real-time visualization of the mouse heart (6 to 13 beats per second), a back-end processing system involving high-speed signal processing functions to form and display images has been developed. This back-end system was designed with new signal processing algorithms to achieve a frame rate of more than 400 images per second. These algorithms were implemented in a simple and cost-effective manner with a single field-programmable gate array (FPGA) and software programs written in C++. The operating speed of the back-end system was investigated by recording the time required for transferring an image to a personal computer. Experimental results showed that the back-end system is capable of producing 433 images per second. To evaluate the imaging performance of the back-end system, a complete imaging system was built. This imaging system, which consisted of a recently reported high-speed mechanical sector scanner assembled with the back-end system, was tested by imaging a wire phantom, a pig eye (in vitro), and a mouse heart (in vivo). It was shown that this system is capable of providing high spatial resolution images with fast temporal resolution. PMID:19574160

  18. Space Station Freedom extravehicular activity systems evolution study

    NASA Technical Reports Server (NTRS)

    Rouen, Michael

    1990-01-01

    Evaluation of Space Station Freedom (SSF) support of manned exploration is in progress to identify SSF extravehicular activity (EVA) system evolution requirements and capabilities. The output from these studies will provide data to support the preliminary design process to ensure that Space Station EVA system requirements for future missions (including the transportation node) are adequately considered and reflected in the baseline design. The study considers SSF support of future missions and the EVA system baseline to determine adequacy of EVA requirements and capabilities and to identify additional requirements, capabilities, and necessary technology upgrades. The EVA demands levied by formal requirements and indicated by evolutionary mission scenarios are high for the out-years of Space Station Freedom. An EVA system designed to meet the baseline requirements can easily evolve to meet evolution demands with few exceptions. Results to date indicate that upgrades or modifications to the EVA system may be necessary to meet the full range of EVA thermal environments associated with the transportation node. Work continues to quantify the EVA capability in this regard. Evolution mission scenarios with EVA and ground unshielded nuclear propulsion engines are inconsistent with anthropomorphic EVA capabilities.

  19. Data Requirements for Oceanic Processes in the Open Ocean, Coastal Zone, and Cryosphere

    NASA Technical Reports Server (NTRS)

    Nagler, R. G.; Mccandless, S. W., Jr.

    1978-01-01

    The type of information system that is needed to meet the requirements of ocean, coastal, and polar region users was examined. The requisite qualities of the system are: (1) availability, (2) accessibility, (3) responsiveness, (4) utility, (5) continuity, and (6) NASA participation. The system would not displace existing capabilities, but would have to integrate and expand the capabilities of existing systems and resolve the deficiencies that currently exist in producer-to-user information delivery options.

  20. The Ion Propulsion System for the Solar Electric Propulsion Technology Demonstration Mission

    NASA Technical Reports Server (NTRS)

    Herman, Daniel A.; Santiago, Walter; Kamhawi, Hani; Polk, James E.; Snyder, John Steven; Hofer, Richard R.; Parker, J. Morgan

    2015-01-01

    The Asteroid Redirect Robotic Mission is a candidate Solar Electric Propulsion Technology Demonstration Mission whose main objectives are to develop and demonstrate a high-power solar electric propulsion capability for the Agency and return an asteroidal mass for rendezvous and characterization in a companion human-crewed mission. The ion propulsion system must be capable of operating over an 8-year time period and processing up to 10,000 kg of xenon propellant. This high-power solar electric propulsion capability, or an extensible derivative of it, has been identified as a critical part of an affordable, beyond-low-Earth-orbit, manned-exploration architecture. Under the NASA Space Technology Mission Directorate the critical electric propulsion and solar array technologies are being developed. The ion propulsion system being co-developed by the NASA Glenn Research Center and the Jet Propulsion Laboratory for the Asteroid Redirect Vehicle is based on the NASA-developed 12.5 kW Hall Effect Rocket with Magnetic Shielding (HERMeS) thruster and power processing technologies. This paper presents the conceptual design for the ion propulsion system, the status of the NASA in-house thruster and power processing activity, and an update on flight hardware.

  1. The Ion Propulsion System for the Solar Electric Propulsion Technology Demonstration Mission

    NASA Technical Reports Server (NTRS)

    Herman, Daniel A.; Santiago, Walter; Kamhawi, Hani; Polk, James E.; Snyder, John Steven; Hofer, Richard; Parker, J. Morgan

    2015-01-01

    The Asteroid Redirect Robotic Mission is a candidate Solar Electric Propulsion Technology Demonstration Mission whose main objectives are to develop and demonstrate a high-power solar electric propulsion capability for the Agency and return an asteroidal mass for rendezvous and characterization in a subsequent human-crewed mission. The ion propulsion subsystem must be capable of operating over an 8-year time period and processing up to 10,000 kg of xenon propellant. This high-power solar electric propulsion capability, or an extensible derivative of it, has been identified as an enabling element of an affordable, beyond-low-Earth-orbit, human-crewed exploration architecture. Under the NASA Space Technology Mission Directorate the critical electric propulsion and solar array technologies are being developed. The ion propulsion system for the Asteroid Redirect Vehicle is based on the NASA-developed 12.5 kW Hall Effect Rocket with Magnetic Shielding thruster and power processing technologies. This paper presents the conceptual design for the ion propulsion system, the status of the NASA in-house thruster and power processing activity, and an update on flight hardware acquisition.

  2. Foliage penetration by using 4-D point cloud data

    NASA Astrophysics Data System (ADS)

    Méndez Rodríguez, Javier; Sánchez-Reyes, Pedro J.; Cruz-Rivera, Sol M.

    2012-06-01

    Real-time awareness and rapid target detection are critical for the success of military missions. New technologies capable of detecting targets concealed in forest areas are needed in order to track and identify possible threats. Currently, LAser Detection And Ranging (LADAR) systems are capable of detecting obscured targets; however, tracking capabilities are severely limited. Now, a new LADAR-derived technology is under development to generate 4-D datasets (3-D video in a point cloud format). As such, there is a new need for algorithms that are able to process data in real time. We propose an algorithm capable of removing vegetation and other objects that may obscure concealed targets in a real 3-D environment. The algorithm is based on wavelets and can be used as a pre-processing step in a target recognition algorithm. Applications of the algorithm in a real-time 3-D system could help make pilots aware of high-risk hidden targets such as tanks and weapons, among others. We use simulated 4-D point cloud data to demonstrate the capabilities of our algorithm.
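
    As a rough, one-dimensional stand-in for the wavelet-based vegetation-removal step described above (the actual algorithm operates on 4-D point cloud data), the Python sketch below applies single-level Haar wavelet shrinkage to a range profile, suppressing sparse high-frequency returns that play the role of foliage clutter. The data layout and threshold are assumptions.

      import numpy as np

      def haar_denoise(signal, threshold):
          """One-level Haar wavelet shrinkage: suppress small high-frequency details,
          which here stand in for scattered foliage returns along a range profile."""
          x = np.asarray(signal, dtype=float)
          if len(x) % 2:                                 # pad to even length
              x = np.append(x, x[-1])
          approx = (x[0::2] + x[1::2]) / np.sqrt(2)      # low-pass (smooth surface)
          detail = (x[0::2] - x[1::2]) / np.sqrt(2)      # high-pass (clutter)
          detail = np.sign(detail) * np.maximum(np.abs(detail) - threshold, 0.0)
          even = (approx + detail) / np.sqrt(2)          # inverse Haar transform
          odd = (approx - detail) / np.sqrt(2)
          out = np.empty_like(x)
          out[0::2], out[1::2] = even, odd
          return out[:len(signal)]

      # A flat target range profile corrupted by sparse foliage-like spikes.
      rng = np.random.default_rng(0)
      profile = np.full(128, 50.0)
      profile[rng.choice(128, 10, replace=False)] -= rng.uniform(5, 15, 10)
      cleaned = haar_denoise(profile, threshold=3.0)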

  3. AOIPS - An interactive image processing system. [Atmospheric and Oceanic Information Processing System

    NASA Technical Reports Server (NTRS)

    Bracken, P. A.; Dalton, J. T.; Quann, J. J.; Billingsley, J. B.

    1978-01-01

    The Atmospheric and Oceanographic Information Processing System (AOIPS) was developed to help applications investigators perform required interactive image data analysis rapidly and to eliminate the inefficiencies and problems associated with batch operation. This paper describes the configuration and processing capabilities of AOIPS and presents unique subsystems for displaying, analyzing, storing, and manipulating digital image data. Applications of AOIPS to research investigations in meteorology and earth resources are featured.

  4. 30 CFR 1227.103 - What must a State's delegation proposal contain?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... processing, including compatibility with ONRR automated systems, electronic commerce capabilities, and data storage capabilities; (B) Accessing reference data; (C) Contacting production or royalty reporters; (D...) Maintaining security of confidential and proprietary information; and (H) Providing data to other Federal...

  5. 30 CFR 1227.103 - What must a State's delegation proposal contain?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... processing, including compatibility with ONRR automated systems, electronic commerce capabilities, and data storage capabilities; (B) Accessing reference data; (C) Contacting production or royalty reporters; (D...) Maintaining security of confidential and proprietary information; and (H) Providing data to other Federal...

  6. 30 CFR 1227.103 - What must a State's delegation proposal contain?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... processing, including compatibility with ONRR automated systems, electronic commerce capabilities, and data storage capabilities; (B) Accessing reference data; (C) Contacting production or royalty reporters; (D...) Maintaining security of confidential and proprietary information; and (H) Providing data to other Federal...

  7. A 99 percent purity molecular sieve oxygen generator

    NASA Technical Reports Server (NTRS)

    Miller, G. W.

    1991-01-01

    Molecular sieve oxygen generating systems (MSOGS) have become the accepted method for the production of breathable oxygen on military aircraft. These systems separate oxygen from aircraft engine bleed air by applying pressure swing adsorption (PSA) technology. Oxygen is concentrated by the preferential adsorption of nitrogen in a zeolite molecular sieve. However, the inability of current zeolite molecular sieves to discriminate between oxygen and argon results in an oxygen purity limitation of 93-95 percent (both oxygen and argon concentrate). The goal was to develop a new PSA process capable of exceeding the present oxygen purity limitations. A novel molecular sieve oxygen concentrator was developed which is capable of generating oxygen concentrations of up to 99.7 percent directly from air. The process consists of four adsorbent beds, two containing a zeolite molecular sieve and two containing a carbon molecular sieve. This new process may find use in aircraft and medical breathing systems, and in industrial air separation systems. The commercial potential of the process is currently being evaluated.

  8. Advanced sensor-simulation capability

    NASA Astrophysics Data System (ADS)

    Cota, Stephen A.; Kalman, Linda S.; Keller, Robert A.

    1990-09-01

    This paper provides an overview of an advanced simulation capability currently in use for analyzing visible and infrared sensor systems. The software system, called VISTAS (VISIBLE/INFRARED SENSOR TRADES, ANALYSES, AND SIMULATIONS) combines classical image processing techniques with detailed sensor models to produce static and time dependent simulations of a variety of sensor systems including imaging, tracking, and point target detection systems. Systems modelled to date include space-based scanning line-array sensors as well as staring 2-dimensional array sensors which can be used for either imaging or point source detection.

  9. Extension of ERIM multispectral data processing capabilities through improved data handling techniques

    NASA Technical Reports Server (NTRS)

    Kriegler, F. J.

    1973-01-01

    The improvement and extension of the capabilities of the Environmental Research Institute of Michigan processing facility in handling multispectral data are discussed. Improvements consisted of implementing hardware modifications which permitted more rapid access to the recorded data through improved numbering and indexing of such data. In addition, techniques are discussed for handling data from sources other than the ERIM M-5 and M-7 scanner systems.

  10. High-resolution (>5800 time-bandwidth product) shear mode TeO2 deflector

    NASA Astrophysics Data System (ADS)

    Soos, Jolanta I.; Caviris, Nicholas P.; Phuvan, Sonlinh

    1992-12-01

    Acousto-optic deflectors play an important role in optical signal processing systems due to their real-time processing capabilities and their ability to convert a function of time into a function of space and time. In this work Brimrose investigated the design and fabrication of state-of-the-art, very large time-bandwidth acousto-optic devices made from TeO2 single crystals.

  11. Development of a Fiber Laser Welding Capability for the W76, MC4702 Firing Set

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Samayoa, Jose

    2010-05-12

    Development work to implement a new welding system for a Firing Set is presented. The new system is significant because it represents the first use of fiber laser welding technology at the KCP. The work used Six-Sigma tools to characterize the weld and define process performance. Workable weld parameters were determined and compared to those of existing equipment. Existing waveforms were replicated using an Arbitrary Pulse Generator (APG) to modulate the fiber laser's exclusively continuous wave (CW) output. Fiber laser weld process capability for a Firing Set is demonstrated.

  12. Adaptive guidance and control for future remote sensing systems

    NASA Technical Reports Server (NTRS)

    Lowrie, J. W.; Myers, J. E.

    1980-01-01

    A unique approach to onboard processing was developed that is capable of acquiring high quality image data for users in near real time. The approach is divided into two steps: the development of an onboard cloud detection system; and the development of a landmark tracker. The results of these two developments are outlined and the requirements of an operational guidance and control system capable of providing continuous estimation of the sensor boresight position are summarized.

  13. The reactive bed plasma system for contamination control

    NASA Technical Reports Server (NTRS)

    Birmingham, Joseph G.; Moore, Robert R.; Perry, Tony R.

    1990-01-01

    The contamination control capabilities of the Reactive Bed Plasma (RBP) system are described by delineating the results of toxic chemical composition studies, aerosol filtration work, and other testing. The RBP system has demonstrated its ability to decompose toxic materials and process hazardous aerosols. Possible solutions exist for the post-treatment requirements of the reaction products. Although additional work is required to meet NASA requirements, the RBP may be able to address contamination control problems aboard the Space Station.

  14. Configuration of electro-optic fire source detection system

    NASA Astrophysics Data System (ADS)

    Fabian, Ram Z.; Steiner, Zeev; Hofman, Nir

    2007-04-01

    Recent fighting in various parts of the world has highlighted the need for accurate fire source detection on one hand and fast "sensor to shooter cycle" capabilities on the other. Both needs can be met by the SPOTLITE system, which dramatically enhances the capability to rapidly engage a hostile fire source with minimal casualties to friendly forces and innocent bystanders. The modular system design makes it possible to meet each customer's specific requirements and provides excellent potential for future growth and upgrades. The design and build of a fire source detection system are governed by sets of requirements issued by the operators, which can be translated into the following design criteria: I) long-range, fast, and accurate fire source detection capability; II) detection and classification capability for different threats; III) threat investigation capability; IV) fire source data distribution capability (location, direction, video image, voice); and V) man portability. In order to meet these design criteria, an optimized concept was presented and exercised for the SPOTLITE system. Three major modular components were defined: I) an Electro-Optical Unit, including FLIR camera, CCD camera, laser range finder, and marker; II) an Electronic Unit, including the system computer and electronics; and III) a Controller Station Unit, including the HMI of the system. This article discusses the definition and optimization of the system's components and also shows how the SPOTLITE designers introduced effective solutions for other system parameters.

  15. Spacecraft Data Simulator for the test of level zero processing systems

    NASA Technical Reports Server (NTRS)

    Shi, Jeff; Gordon, Julie; Mirchandani, Chandru; Nguyen, Diem

    1994-01-01

    The Microelectronic Systems Branch (MSB) at Goddard Space Flight Center (GSFC) has developed a Spacecraft Data Simulator (SDS) to support the development, test, and verification of prototype and production Level Zero Processing (LZP) systems. Based on a disk array system, the SDS is capable of generating large test data sets up to 5 Gigabytes and outputting serial test data at rates up to 80 Mbps. The SDS supports data formats including NASA Communication (Nascom) blocks, Consultative Committee for Space Data Systems (CCSDS) Version 1 and 2 frames and packets, and all the Advanced Orbiting Systems (AOS) services. The capability to simulate both sequential and non-sequential time-ordered downlink data streams with errors and gaps is crucial for testing LZP systems. This paper describes the system architecture, hardware and software designs, and test data designs. Examples of test data designs are included to illustrate the application of the SDS.
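
    A minimal sketch of the kind of test-data generation the SDS performs is shown below: it assembles simplified CCSDS-like transfer frames behind the standard attached sync marker and skips selected sequence counts to simulate downlink gaps. The header packing, frame length, and IDs are illustrative assumptions, not the actual Nascom or CCSDS formats the SDS supports.

      import struct

      SYNC = bytes.fromhex("1ACFFC1D")        # CCSDS attached sync marker

      def make_frame(scid, vcid, seq, payload, frame_len=64):
          """Build a simplified CCSDS-like transfer frame: sync marker, a 6-byte
          header carrying spacecraft ID / virtual channel / sequence count, payload,
          and fill bytes.  Field packing is illustrative, not a compliant encoder."""
          header = struct.pack(">HBBH", scid, vcid, seq & 0xFF, len(payload))
          body = header + payload
          fill = b"\x55" * (frame_len - len(SYNC) - len(body))
          return SYNC + body + fill

      def frame_stream(n_frames, drop=()):
          """Generate a downlink-like stream, skipping sequence counts in `drop`
          to simulate data gaps for level-zero-processing tests."""
          stream = b""
          for seq in range(n_frames):
              if seq in drop:
                  continue                     # injected gap
              stream += make_frame(scid=0x00A5, vcid=1, seq=seq,
                                   payload=bytes([seq]) * 16)
          return stream

      test_data = frame_stream(8, drop={3, 4})
      print(len(test_data), "bytes,", test_data.count(SYNC), "frames")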

  16. MIRIADS: miniature infrared imaging applications development system description and operation

    NASA Astrophysics Data System (ADS)

    Baxter, Christopher R.; Massie, Mark A.; McCarley, Paul L.; Couture, Michael E.

    2001-10-01

    A cooperative effort between the U.S. Air Force Research Laboratory, Nova Research, Inc., the Raytheon Infrared Operations (RIO) and Optics 1, Inc. has successfully produced a miniature infrared camera system that offers significant real-time signal and image processing capabilities by virtue of its modular design. This paper will present an operational overview of the system as well as results from initial testing of the 'Modular Infrared Imaging Applications Development System' (MIRIADS) configured as a missile early-warning detection system. The MIRIADS device can operate virtually any infrared focal plane array (FPA) that currently exists. Programmable on-board logic applies user-defined processing functions to the real-time digital image data for a variety of functions. Daughterboards may be plugged onto the system to expand the digital and analog processing capabilities of the system. A unique full hemispherical infrared fisheye optical system designed and produced by Optics 1, Inc. is utilized by the MIRIADS in a missile warning application to demonstrate the flexibility of the overall system to be applied to a variety of current and future AFRL missions.

  17. Test/score/report: Simulation techniques for automating the test process

    NASA Technical Reports Server (NTRS)

    Hageman, Barbara H.; Sigman, Clayton B.; Koslosky, John T.

    1994-01-01

    A Test/Score/Report capability is currently being developed for the Transportable Payload Operations Control Center (TPOCC) Advanced Spacecraft Simulator (TASS) system, which will automate testing of the Goddard Space Flight Center (GSFC) Payload Operations Control Center (POCC) and Mission Operations Center (MOC) software in three areas: telemetry decommutation, spacecraft command processing, and spacecraft memory load and dump processing. Automated computer control of the acceptance test process is one of the primary goals of a test team. With the proper simulation tools and user interface, acceptance testing, regression testing, and the repetition of specific test procedures for a ground data system become simpler tasks. Ideally, the goal for complete automation would be to plug the operational deliverable into the simulator, press the start button, execute the test procedure, accumulate and analyze the data, score the results, and report them to the test team along with a go/no-go recommendation. In practice, this may not be possible because of inadequate test tools, schedule pressures, limited resources, etc. Most tests are accomplished using a certain degree of automation and test procedures that are labor intensive. This paper discusses some simulation techniques that can improve the automation of the test process. The TASS system tests the POCC/MOC software and provides a score based on the test results. The TASS system displays statistics on the success of the POCC/MOC system processing in each of the three areas as well as event messages pertaining to the Test/Score/Report processing. The TASS system also provides formatted reports documenting each step performed during the tests and the results of each step. A prototype of the Test/Score/Report capability is available and currently being used to test some POCC/MOC software deliveries. When this capability is fully operational it should greatly reduce the time necessary to test a POCC/MOC software delivery, as well as improve the quality of the test process.

  18. Progress in the Development of a Prototype Reuse Enablement System

    NASA Astrophysics Data System (ADS)

    Marshall, J. J.; Downs, R. R.; Gilliam, L. J.; Wolfe, R. E.

    2008-12-01

    An important part of promoting software reuse is to ensure that reusable software assets are readily available to the software developers who want to use them. Through dialogs with the community, the NASA Earth Science Data Systems Software Reuse Working Group has learned that the lack of a centralized, domain-specific software repository or catalog system addressing the needs of the Earth science community is a major barrier to software reuse within the community. The Working Group has proposed the creation of such a reuse enablement system, which would provide capabilities for contributing and obtaining reusable software, to remove this barrier. The Working Group has recommended the development of a Reuse Enablement System to NASA and has performed a trade study to review systems with similar capabilities and to identify potential platforms for the proposed system. This was followed by an architecture study to determine an expeditious and cost-effective solution for this system. A number of software packages and systems were examined, both by creating prototypes and by examining existing systems that use the same software packages and systems. Based on the results of the architecture study, the Working Group developed a prototype of the proposed system using the recommended software package, through an iterative process of identifying needed capabilities and improving the system to provide those capabilities. Policies for the operation and maintenance of the system are being established, and the identification of system policies has also contributed to the development process. Additionally, a test plan is being developed for formal testing of the prototype, to ensure that it meets all of the requirements previously developed by the Working Group. This poster summarizes the results of our work to date, focusing on the most recent activities.

  19. Semantic Service Matchmaking in the ATM Domain Considering Infrastructure Capability Constraints

    NASA Astrophysics Data System (ADS)

    Moser, Thomas; Mordinyi, Richard; Sunindyo, Wikan Danar; Biffl, Stefan

    In a service-oriented environment business processes flexibly build on software services provided by systems in a network. A key design challenge is the semantic matchmaking of business processes and software services in two steps: 1. Find for one business process the software services that meet or exceed the BP requirements; 2. Find for all business processes the software services that can be implemented within the capability constraints of the underlying network, which poses a major problem since even for small scenarios the solution space is typically very large. In this chapter we analyze requirements from mission-critical business processes in the Air Traffic Management (ATM) domain and introduce an approach for semi-automatic semantic matchmaking for software services, the “System-Wide Information Sharing” (SWIS) business process integration framework. A tool-supported semantic matchmaking process like SWIS can provide system designers and integrators with a set of promising software service candidates and therefore strongly reduces the human matching effort by focusing on a much smaller space of matchmaking candidates. We evaluate the feasibility of the SWIS approach in an industry use case from the ATM domain.

  20. Cost, capability, and risk for planetary operations

    NASA Technical Reports Server (NTRS)

    Mclaughlin, William I.; Deutsch, Marie J.; Miller, Lanny J.; Wolff, Donna M.; Zawacki, Steven J.

    1992-01-01

    The three key factors for flight projects - cost, capability, and risk - are examined with respect to their interplay, the uplink process, cost drivers, and risk factors. Scientific objectives are translated into a computer program during the uplink process, and examples are given relating to the Voyager Interstellar Mission, Galileo, and the Comet Rendezvous Asteroid Flyby. The development of a multimission sequence system based on these uplinks is described with reference to specific subsystems such as the pointer and the sequence generator. Operational cost drivers include mission, flight-system, and ground-system complexity, uplink traffic, and work force. Operational risks are listed in terms of the mission operations, the environment, and the mission facilities. The uplink process can be analyzed in terms of software development, and spacecraft operability is shown to be an important factor from the initial stages of spacecraft development.

  1. The composite load spectra project

    NASA Technical Reports Server (NTRS)

    Newell, J. F.; Ho, H.; Kurth, R. E.

    1990-01-01

    Probabilistic methods and generic load models capable of simulating the load spectra that are induced in space propulsion system components are being developed. Four engine component types (the transfer ducts, the turbine blades, the liquid oxygen posts and the turbopump oxidizer discharge duct) were selected as representative hardware examples. The composite load spectra that simulate the probabilistic loads for these components are typically used as the input loads for a probabilistic structural analysis. The knowledge-based system approach used for the composite load spectra project provides an ideal environment for incremental development. The intelligent database paradigm employed in developing the expert system provides a smooth coupling between the numerical processing and the symbolic (information) processing. Large volumes of engine load information and engineering data are stored in database format and managed by a database management system. Numerical procedures for probabilistic load simulation and database management functions are controlled by rule modules. Rules were hard-wired as decision trees into rule modules to perform process control tasks. There are modules to retrieve load information and models. There are modules to select loads and models to carry out quick load calculations or make an input file for full duty-cycle time dependent load simulation. The composite load spectra load expert system implemented today is capable of performing intelligent rocket engine load spectra simulation. Further development of the expert system will provide tutorial capability for users to learn from it.
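
    A minimal Monte Carlo sketch of composite load spectrum simulation in the spirit of the project described above is given below in Python. The load sources, distributions, and parameters are assumed for illustration and are not the project's engine load models.

      import numpy as np

      def composite_load_samples(n=10_000, seed=0):
          """Sample a hypothetical composite duct load: a quasi-steady pressure load
          plus random thermal and vibration contributions (all distributions assumed)."""
          rng = np.random.default_rng(seed)
          pressure = rng.normal(loc=100.0, scale=5.0, size=n)       # kN, quasi-steady
          thermal = rng.normal(loc=20.0, scale=4.0, size=n)         # kN equivalent
          vibration = rng.rayleigh(scale=6.0, size=n)               # kN, always positive
          return pressure + thermal + vibration

      loads = composite_load_samples()
      p50, p99 = np.percentile(loads, [50, 99])
      print(f"median composite load {p50:.1f} kN, 99th percentile {p99:.1f} kN")
      # The resulting percentile spectrum would feed a probabilistic structural analysis.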

  2. REPORT ON AN ORBITAL MAPPING SYSTEM.

    USGS Publications Warehouse

    Colvocoresses, Alden P.; ,

    1984-01-01

    During June 1984, the International Society for Photogrammetry and Remote Sensing accepted a committee report that defines an Orbital Mapping System (OMS) to follow Landsat and other Earth-sensing systems. The OMS involves the same orbital parameters as Landsats 1, 2, and 3; three wave bands (two in the visible and one in the near infrared); and continuous stereoscopic capability. The sensors involve solid-state linear arrays, with data acquisition (including stereo) designed for one-dimensional data processing. The system has a resolution capability of 10-m pixels and can produce 1:50,000-scale image maps with 20-m contours. In addition to mapping, the system is designed to monitor the works of man as well as nature in a cost-effective manner.

  3. Overview of the NASA Wallops Flight Facility Mobile Range Control System

    NASA Technical Reports Server (NTRS)

    Davis, Rodney A.; Semancik, Susan K.; Smith, Donna C.; Stancil, Robert K.

    1999-01-01

    The NASA GSFC's Wallops Flight Facility (WFF) Mobile Range Control System (MRCS) is based on the functionality of the WFF Range Control Center at Wallops Island, Virginia. The MRCS provides real time instantaneous impact predictions, real time flight performance data, and other critical information needed by mission and range safety personnel in support of range operations at remote launch sites. The MRCS integrates a PC telemetry processing system (TELPro), a PC radar processing system (PCDQS), multiple Silicon Graphics display workstations (IRIS), and communication links within a mobile van for worldwide support of orbital, suborbital, and aircraft missions. This paper describes the MRCS configuration; the TELPro's capability to provide single/dual telemetry tracking and vehicle state data processing; the PCDQS' capability to provide real time positional data and instantaneous impact prediction for up to 8 data sources; and the IRIS' user interface for setup/display options. With portability, PC-based data processing, high resolution graphics, and flexible multiple source support, the MRCS system is proving to be responsive to the ever-changing needs of a variety of increasingly complex missions.
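
    The instantaneous impact prediction (IIP) computed by systems such as the MRCS can be illustrated, in its most stripped-down form, with the flat-Earth, drag-free ballistic sketch below: the current position and velocity are propagated to ground impact. Operational range-safety IIP models account for Earth rotation, oblateness, and atmospheric drag, so this is only the basic kinematics, and the sample state vector is hypothetical.

      import math

      def flat_earth_iip(pos, vel, g=9.80665):
          """Drag-free, flat-Earth instantaneous impact prediction.
          pos = (x, y, z) metres with z = altitude; vel = (vx, vy, vz) m/s.
          Returns the ground (x, y) impact coordinates and the time of flight
          if thrust ended now.  Operational IIP models are far more detailed."""
          x, y, z = pos
          vx, vy, vz = vel
          # Solve z + vz*t - 0.5*g*t^2 = 0 for the positive root.
          t_impact = (vz + math.sqrt(vz * vz + 2.0 * g * z)) / g
          return x + vx * t_impact, y + vy * t_impact, t_impact

      # Hypothetical state: 12 km up, 600 m/s downrange, 250 m/s climb rate.
      ix, iy, t = flat_earth_iip(pos=(0.0, 0.0, 12_000.0), vel=(600.0, 50.0, 250.0))
      print(f"impact in {t:.1f} s at downrange {ix/1000:.1f} km, crossrange {iy/1000:.1f} km")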

  4. Integrating reliability and maintainability into a concurrent engineering environment

    NASA Astrophysics Data System (ADS)

    Phillips, Clifton B.; Peterson, Robert R.

    1993-02-01

    This paper describes the results of a reliability and maintainability study conducted at the University of California, San Diego and supported by private industry. Private industry thought the study was important and provided the university access to innovative tools under cooperative agreement. The current capability of reliability and maintainability tools and how they fit into the design process is investigated. The evolution of design methodologies leading up to today's capability is reviewed for ways to enhance the design process while keeping cost under control. A method for measuring the consequences of reliability and maintainability policy for design configurations in an electronic environment is provided. The interaction of selected modern computer tool sets is described for reliability, maintainability, operations, and other elements of the engineering design process. These tools provide a robust system evaluation capability that brings life cycle performance improvement information to engineers and their managers before systems are deployed, and allow them to monitor and track performance while it is in operation.

  5. Dynamic Systems Analysis for Turbine Based Aero Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Csank, Jeffrey T.

    2016-01-01

    The aircraft engine design process seeks to optimize the overall system-level performance, weight, and cost for a given concept. Steady-state simulations and data are used to identify trade-offs that should be balanced to optimize the system in a process known as systems analysis. These systems analysis simulations and data may not adequately capture the true performance trade-offs that exist during transient operation. Dynamic systems analysis provides the capability for assessing the dynamic trade-offs at an earlier stage of the engine design process. The dynamic systems analysis concept, developed tools, and potential benefit are presented in this paper. To provide this capability, the Tool for Turbine Engine Closed-loop Transient Analysis (TTECTrA) was developed to provide the user with an estimate of the closed-loop performance (response time) and operability (high pressure compressor surge margin) for a given engine design and set of control design requirements. TTECTrA, along with engine deterioration information, can be used to develop a more generic relationship between performance and operability that can impact the engine design constraints and potentially lead to a more efficient engine.

  6. Pulsed Acoustic Vortex Sensing System : Volume 2, Studies of Improved PAVSS Processing Techniques

    DOT National Transportation Integrated Search

    1977-06-01

    Avco Corporation's Systems Division designed and developed an engineered Pulsed Acoustic Vortex Sensing System (PAVSS). This system is capable of real-time detection, tracking, recording, and graphic display of aircraft trailing vortices. This volume...

  7. A knowledge based expert system for propellant system monitoring at the Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Jamieson, J. R.; Delaune, C.; Scarl, E.

    1985-01-01

    The Lox Expert System (LES) is the first attempt to build a realtime expert system capable of simulating the thought processes of NASA system engineers, with regard to fluids systems analysis and troubleshooting. An overview of the hardware and software describes the techniques used, and possible applications to other process control systems. LES is now in the advanced development stage, with a full implementation planned for late 1985.

  8. Advanced multilateration theory, software development, and data processing: The MICRODOT system

    NASA Technical Reports Server (NTRS)

    Escobal, P. R.; Gallagher, J. F.; Vonroos, O. H.

    1976-01-01

    The process of geometric parameter estimation to accuracies of one centimeter, i.e., multilateration, is defined and applications are listed. A brief functional explanation of the theory is presented. Next, various multilateration systems are described in order of increasing system complexity. Expected system accuracy is discussed from a general point of view and a summary of the errors is listed. An outline of the design of a software processing system for multilateration, called MICRODOT, is presented next. The links of this software, which can be used for multilateration data simulations or operational data reduction, are examined on an individual basis. Functional flow diagrams are presented to aid in understanding the software capability. MICRODOT capability is described with respect to vehicle configurations, interstation coordinate reduction, geophysical parameter estimation, and orbit determination. Numerical results obtained from MICRODOT via data simulations are displayed both for hypothetical and real-world vehicle/station configurations such as those used in the GEOS-3 Project. These simulations show the inherent power of the multilateration procedure.
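
    As a flavor of the multilateration principle itself (not of the MICRODOT software), a position can be recovered from ranges to known stations by Gauss-Newton least squares. A minimal sketch with synthetic data:

```python
import numpy as np

def multilaterate(stations, ranges, x0, iterations=10):
    """Estimate a 3-D position from ranges to known station coordinates
    using Gauss-Newton iteration. `stations` is (N, 3), `ranges` is (N,)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iterations):
        diff = x - stations                       # (N, 3)
        predicted = np.linalg.norm(diff, axis=1)  # predicted ranges
        residual = ranges - predicted             # observed minus computed
        jacobian = diff / predicted[:, None]      # d(range)/d(position)
        dx, *_ = np.linalg.lstsq(jacobian, residual, rcond=None)
        x = x + dx
    return x

if __name__ == "__main__":
    stations = np.array([[0.0, 0.0, 0.0],
                         [1000.0, 0.0, 0.0],
                         [0.0, 1000.0, 0.0],
                         [0.0, 0.0, 1000.0]])
    truth = np.array([300.0, 400.0, 500.0])
    ranges = np.linalg.norm(truth - stations, axis=1)
    print(multilaterate(stations, ranges, x0=[100.0, 100.0, 100.0]))
```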

  9. Modeling Stochastic Complexity in Complex Adaptive Systems: Non-Kolmogorov Probability and the Process Algebra Approach.

    PubMed

    Sulis, William H

    2017-10-01

    Walter Freeman III pioneered the application of nonlinear dynamical systems theories and methodologies in his work on mesoscopic brain dynamics. Sadly, mainstream psychology and psychiatry still cling to linear correlation-based data analysis techniques, which threaten to subvert the process of experimentation and theory building. In order to progress, it is necessary to develop tools capable of managing the stochastic complexity of complex biopsychosocial systems, which includes multilevel feedback relationships, nonlinear interactions, chaotic dynamics and adaptability. In addition, however, these systems exhibit intrinsic randomness, non-Gaussian probability distributions, non-stationarity, contextuality, and non-Kolmogorov probabilities, as well as the absence of mean and/or variance and conditional probabilities. These properties and their implications for statistical analysis are discussed. An alternative approach, the Process Algebra approach, is described. It is a generative model, capable of generating non-Kolmogorov probabilities. It has proven useful in addressing fundamental problems in quantum mechanics and in the modeling of developing psychosocial systems.

  10. Information Technology. DOD Needs to Strengthen Management of Its Statutorily Mandated Software and System Process Improvement Efforts

    DTIC Science & Technology

    2009-09-01

    NII/CIO: Assistant Secretary of Defense for Networks and Information Integration/Chief Information Officer; CMMI: Capability Maturity Model Integration ... a Web-based portal to share knowledge about software process-related methodologies, such as the SEI's Capability Maturity Model Integration (CMMI) ... the SEI's IDEAL(SM) model, and Lean Six Sigma. For example, the portal features content areas such as software acquisition management, the SEI CMMI ...

  11. Signal Processing in Reverberation- A Summary of Performance Capability

    DTIC Science & Technology

    1972-08-30

    Naval Underwater Systems Center Technical Memorandum TM No. TC-173-72, "Signal Processing in Reverberation - A Summary of Performance Capability," dated 30 August 1972; prepared by Albert H. Nuttall, Office of the Director of Science and Technology. Approved for public release; distribution unlimited.

  12. Evaluation of VICAR software capability for land information support system needs. [Elk River quadrangle, Idaho

    NASA Technical Reports Server (NTRS)

    Yao, S. S. (Principal Investigator)

    1981-01-01

    A preliminary evaluation of the processing capability of the VICAR software for land information support system needs is presented. The geometric and radiometric properties of four sets of LANDSAT data taken over the Elk River, Idaho quadrangle were compared. Storage of data sets, the means of location, pixel resolution, and radiometric and geometric characteristics are described. Recommended modifications of VICAR programs are presented.

  13. Defence Test and Evaluation Roadmap

    DTIC Science & Technology

    2008-01-01

    T&E can be employed to prove, demonstrate or assess the ability of proposed and existing capability systems, new or upgraded, to satisfy specified ... T&E is a process to obtain information to support the objective assessment of a Capability System with known confidence, and to confirm whether ... for the ADF is a ‘balanced, networked, and deployable force, staffed by dedicated and professional people, that operates within a culture of

  14. Safe Operations of Unmanned Systems for Reconnaissance in Complex Environments Army Technology Objective (SOURCE ATO)

    DTIC Science & Technology

    2011-04-25

    must adapt its planning to vehicle size, shape, wheelbase, wheel and axle configuration, the specific obstacle-crossing capabilities of the vehicle...scalability of the ANS is a consequence of making each sensing modality capable of performing reasonable perception tasks while allowing a wider...autonomous system design achieves flexibility by exploiting redundant sensing modalities where possible, and by a decision-making process that

  15. Computer graphics application in the engineering design integration system

    NASA Technical Reports Server (NTRS)

    Glatt, C. R.; Abel, R. W.; Hirsch, G. N.; Alford, G. E.; Colquitt, W. N.; Stewart, W. A.

    1975-01-01

    The computer graphics aspect of the Engineering Design Integration (EDIN) system and its application to design problems were discussed. Three basic types of computer graphics may be used with the EDIN system for the evaluation of aerospace vehicle preliminary designs: offline graphics systems using vellum-inking or photographic processes, online graphics systems characterized by direct coupled low cost storage tube terminals with limited interactive capabilities, and a minicomputer based refresh terminal offering highly interactive capabilities. The offline systems are characterized by high quality (resolution better than 0.254 mm) and slow turnaround (one to four days). The online systems are characterized by low cost, instant visualization of the computer results, slow line speed (300 baud), poor hard copy, and the early limitations on vector graphic input capabilities. The recent acquisition of the Adage 330 Graphic Display system has greatly enhanced the potential for interactive computer aided design.

  16. Transforming Lepidopteran Insect Cells for Improved Protein Processing and Expression

    USDA-ARS?s Scientific Manuscript database

    The lepidopteran insect cells used with the baculovirus expression vector system (BEVS) are capable of synthesizing and accurately processing foreign proteins. However, proteins expressed in baculovirus-infected cells often fail to be completely processed, or are not processed in a manner that meet...

  17. Coherence-generating power of quantum dephasing processes

    NASA Astrophysics Data System (ADS)

    Styliaris, Georgios; Campos Venuti, Lorenzo; Zanardi, Paolo

    2018-03-01

    We provide a quantification of the capability of various quantum dephasing processes to generate coherence out of incoherent states. The measures defined, admitting computable expressions for any finite Hilbert-space dimension, are based on probabilistic averages and arise naturally from the viewpoint of coherence as a resource. We investigate how the capability of a dephasing process (e.g., a nonselective orthogonal measurement) to generate coherence depends on the relevant bases of the Hilbert space over which coherence is quantified and the dephasing process occurs, respectively. We extend our analysis to include those Lindblad time evolutions which, in the infinite-time limit, dephase the system under consideration and calculate their coherence-generating power as a function of time. We further identify specific families of such time evolutions that, although dephasing, have optimal (over all quantum processes) coherence-generating power for some intermediate time. Finally, we investigate the coherence-generating capability of random dephasing channels.
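
    Schematically, and only schematically (the paper gives the precise definitions), a coherence-generating power of this kind is a probabilistic average of the coherence created when the channel acts on incoherent inputs, for example

```latex
C_B(\mathcal{E}) = \int d\mu(\rho_{\mathrm{inc}})\, c_B\!\left(\mathcal{E}(\rho_{\mathrm{inc}})\right),
\qquad
c_B(\sigma) = \left\lVert \sigma - \mathcal{D}_B(\sigma) \right\rVert_2^2 ,
```

    where $\mathcal{D}_B$ denotes full dephasing in the reference basis $B$ and $\mu$ is a probability measure over the incoherent states; the particular coherence measure $c_B$ and measure $\mu$ shown here are assumptions made for illustration only.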

  18. Eddy Current System for Detection of Cracking Beneath Braiding in Corrugated Metal Hose

    NASA Astrophysics Data System (ADS)

    Wincheski, Buzz; Simpson, John; Hall, George

    2009-03-01

    In this paper, an eddy current system for the detection of partially-through-the-thickness cracks in corrugated metal hose is presented. Design criteria based upon the geometry and conductivity of the part are developed and applied to the fabrication of a prototype inspection system. Experimental data are used to highlight the capabilities of the system and an image processing technique is presented to improve flaw detection capabilities. A case study for detection of cracking damage in space shuttle radiator retract flex hoses is also presented.

  19. Eddy Current System for Detection of Cracking Beneath Braiding in Corrugated Metal Hose

    NASA Technical Reports Server (NTRS)

    Wincheski, Buzz; Simpson, John; Hall, George

    2008-01-01

    In this paper, an eddy current system for the detection of partially-through-the-thickness cracks in corrugated metal hose is presented. Design criteria based upon the geometry and conductivity of the part are developed and applied to the fabrication of a prototype inspection system. Experimental data are used to highlight the capabilities of the system and an image processing technique is presented to improve flaw detection capabilities. A case study for detection of cracking damage in space shuttle radiator retract flex hoses is also presented.

  20. Study of Fluid Experiment System (FES)/CAST/Holographic Ground System (HGS)

    NASA Technical Reports Server (NTRS)

    Workman, Gary L.; Cummings, Rick; Jones, Brian

    1992-01-01

    Holographic and schlieren optical techniques for studying the concentration gradients in solidification processes have been used by several investigators over the years. The HGS facility at MSFC has been a primary resource in researching this capability. Consequently, scientific personnel have been able to utilize these techniques in both ground-based research and in space experiments. An important event in the scientific utilization of the HGS facilities was the TGS Crystal Growth and the casting and solidification technology (CAST) experiments that were flown on the International Microgravity Laboratory (IML) mission in March of this year. The preparation and processing of these space observations are the primary experiments reported in this work. This project provides some ground-based studies to optimize the holographic techniques used to acquire information about the crystal growth processes flown on IML. Since the ground-based studies will be compared with the space-based experimental results, it is necessary to conduct sufficient ground-based studies to best determine how the experiment worked in space. The current capabilities in computer-based systems for image processing and numerical computation have certainly assisted in those efforts. As anticipated, this study has shown that these advanced computing capabilities are helpful in the data analysis of such experiments.

  1. Transitioning mine warfare to network-centric sensor analysis: future PMA technologies & capabilities

    NASA Astrophysics Data System (ADS)

    Stack, J. R.; Guthrie, R. S.; Cramer, M. A.

    2009-05-01

    The purpose of this paper is to outline the requisite technologies and enabling capabilities for network-centric sensor data analysis within the mine warfare community. The focus includes both automated processing and the traditional human-centric post-mission analysis (PMA) of tactical and environmental sensor data. This is motivated by first examining the high-level network-centric guidance and noting the breakdown in the process of distilling actionable requirements from this guidance. Examples are provided that illustrate the intuitive and substantial capability improvement resulting from processing sensor data jointly in a network-centric fashion. Several candidate technologies are introduced, including the ability to fully process multi-sensor data given only partial overlap in sensor coverage and the ability to incorporate target identification information in stride. Finally the critical enabling capabilities are outlined including open architecture, open business, and a concept of operations. This ability to process multi-sensor data in a network-centric fashion is a core enabler of the Navy's vision and will become a necessity with the increasing number of manned and unmanned sensor systems and the requirement for their simultaneous use.

  2. Laser dimpling process parameters selection and optimization using surrogate-driven process capability space

    NASA Astrophysics Data System (ADS)

    Ozkat, Erkan Caner; Franciosa, Pasquale; Ceglarek, Dariusz

    2017-08-01

    Remote laser welding technology offers opportunities for high production throughput at a competitive cost. However, the remote laser welding process of zinc-coated sheet metal parts in lap joint configuration poses a challenge due to the difference between the melting temperature of the steel (∼1500 °C) and the vapourizing temperature of the zinc (∼907 °C). In fact, the zinc layer at the faying surface is vapourized and the vapour might be trapped within the melting pool, leading to weld defects. Various solutions have been proposed to overcome this problem over the years. Among them, laser dimpling has been adopted by manufacturers because of its flexibility and effectiveness along with its cost advantages. In essence, the dimple works as a spacer between the two sheets in the lap joint and allows the zinc vapour to escape during the welding process, thereby preventing weld defects. However, there is a lack of comprehensive characterization of the dimpling process for effective implementation in a real manufacturing system, taking into consideration the inherent variability of the process parameters. This paper introduces a methodology to develop (i) a surrogate model for dimpling process characterization, considering a multiple-input (i.e. key control characteristics) and multiple-output (i.e. key performance indicators) system, by conducting physical experimentation and using multivariate adaptive regression splines; (ii) a process capability space (Cp-Space), based on the developed surrogate model, that allows the estimation of a desired process fallout rate in the case of violation of process requirements in the presence of stochastic variation; and (iii) selection and optimization of the process parameters based on the process capability space. The proposed methodology provides a unique capability to: (i) simulate the effect of process variation as generated by the manufacturing process; (ii) model multiple and coupled quality requirements; and (iii) optimize process parameters under competing quality requirements, such as maximizing the dimple height while minimizing the dimple lower surface area.
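
    The process-capability-space idea can be illustrated with a much simpler surrogate than the MARS model used by the authors: fit any response surface for a key performance indicator (here a hypothetical dimple height) and estimate the fallout rate by Monte Carlo sampling of the stochastic process parameters. A sketch under those assumptions; the coefficients and limits below are invented, not experimental values.

```python
import numpy as np

rng = np.random.default_rng(0)

def surrogate_dimple_height(power_kw, speed_mm_s):
    """Stand-in surrogate model (quadratic response surface); the actual study
    fits multivariate adaptive regression splines to physical experiments."""
    return 0.08 * power_kw - 0.002 * power_kw**2 - 0.01 * speed_mm_s + 0.33

def fallout_rate(power_nominal, speed_nominal, power_sd, speed_sd,
                 height_min=0.3, n_samples=100_000):
    """Monte Carlo estimate of the fraction of parts violating a
    minimum-dimple-height requirement under parameter variation."""
    power = rng.normal(power_nominal, power_sd, n_samples)
    speed = rng.normal(speed_nominal, speed_sd, n_samples)
    height = surrogate_dimple_height(power, speed)
    return np.mean(height < height_min)

if __name__ == "__main__":
    print(f"estimated fallout: {fallout_rate(3.0, 20.0, 0.1, 1.5):.4%}")
```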

  3. ATLAS, an integrated structural analysis and design system. Volume 1: ATLAS user's guide

    NASA Technical Reports Server (NTRS)

    Dreisbach, R. L. (Editor)

    1979-01-01

    Some of the many analytical capabilities provided by the ATLAS Version 4.0 System are described in the logical sequence in which model-definition data are prepared and the subsequent computer job is executed. The example data presented and the fundamental technical considerations that are highlighted can be used as guides during the problem-solving process. This guide does not describe the details of the ATLAS capabilities, but provides an introduction for the new user of ATLAS to the level at which the complete array of capabilities described in the ATLAS User's Manual can be exploited fully.

  4. CCSDS Mission Operations Action Service Core Capabilities

    NASA Technical Reports Server (NTRS)

    Reynolds, Walter F.; Lucord, Steven A.; Stevens, John E.

    2009-01-01

    This slide presentation reviews the operations concepts of the command (action) services. Since the consequences of sending the wrong command are unacceptable, the command system provides a collaborative and distributed work environment for flight controllers and operators. The system prescribes a review and approval process in which each command is reviewed by other individuals before being sent to the vehicle. The action service needs additional capabilities to support the operations concepts of manned space flight. These are: (1) Action Service methods; (2) Action attributes; (3) Action parameter/argument attributes; (4) Support for dynamically maintained action data; and (5) Publish/subscribe capabilities.

  5. Next Generation MODTRAN for Improved Atmospheric Correction of Spectral Imagery

    DTIC Science & Technology

    2016-01-29

    DoD operational and research sensor and data processing systems, particularly those involving the removal of atmospheric effects, commonly referred ... atmospheric correction process. Given the ever increasing capabilities of spectral sensors to quickly generate enormous quantities of data, combined ...

  6. Enabling Dissimilar Material Joining Using Friction Stir Scribe Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hovanski, Yuri; Upadyay, Piyush; Kleinbaum, Sarah

    2017-04-05

    One challenge in adapting welding processes to dissimilar material joining is the diversity of melting temperatures of the different materials. Although the use of mechanical fasteners and adhesives has mostly paved the way for near-term implementation of dissimilar material systems, these processes only accentuate the need for low-cost welding processes capable of joining dissimilar material components regardless of alloy, properties, or melting temperature. Friction stir scribe technology was developed to overcome the challenges of joining dissimilar material components where melting temperatures vary greatly, and properties and/or chemistry are not compatible with more traditional welding processes. Although the friction stir scribe process is capable of joining dissimilar metals and metal/polymer systems, a more detailed evaluation of several aluminum/steel joints is presented herein to demonstrate the ability to both chemically and mechanically join dissimilar materials.

  7. Enabling Dissimilar Material Joining Using Friction Stir Scribe Technology

    DOE PAGES

    Hovanski, Yuri; Upadyay, Piyush; Kleinbaum, Sarah; ...

    2017-04-05

    One challenge in adapting welding processes to dissimilar material joining is the diversity of melting temperatures of the different materials. Although the use of mechanical fasteners and adhesives has mostly paved the way for near-term implementation of dissimilar material systems, these processes only accentuate the need for low-cost welding processes capable of impartially joining dissimilar material components regardless of alloy, properties, or melting temperature. Friction stir scribe technology was developed to overcome the challenges of joining dissimilar material components where melting temperatures vary greatly, and properties and/or chemistry are not compatible with more traditional welding processes. Finally, although the friction stir scribe process is capable of joining dissimilar metals and metal/polymer systems, a more detailed evaluation of several aluminum/steel joints is presented herein to demonstrate the ability to both chemically and mechanically join dissimilar materials.

  8. Collision avoidance system cost-benefit analysis : volume I - technical manual

    DOT National Transportation Integrated Search

    1981-09-01

    Collision-avoidance systems under development in the U.S.A., Japan and Germany were evaluated. The performance evaluation showed that the signal processing and the control law of a system were the key parameters that decided the system's capability, ...

  9. Development of a material processing plant for lunar soil

    NASA Technical Reports Server (NTRS)

    Goettsch, Ulix; Ousterhout, Karl

    1992-01-01

    Currently there is considerable interest in developing in-situ materials processing plants for both the Moon and Mars. Two of the most important aspects of developing such a materials processing plant are the overall system design and the integration of the different technologies into a reliable, lightweight, and cost-effective unit. The concept of an autonomous materials processing plant that is capable of producing useful substances from lunar regolith was developed. In order for such a materials processing plant to be considered as a viable option, it must be totally self-contained, able to operate autonomously, cost effective, lightweight, and fault tolerant. In order to assess the impact of different technologies on the overall system design and integration, a one-half scale model was constructed that is capable of scooping up (or digging) lunar soil, transferring the soil to a solar furnace, heating the soil in the furnace to liberate the gasses, and transferring the spent soil to a 'tile' processing center. All aspects of the control system are handled by a 386-class PC via D/A, A/D, and DSP (Digital Signal Processor) control cards.
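
    A hedged sketch of the kind of top-level sequencing such a controller performs is given below; the actual D/A, A/D, and DSP card interfaces are hardware-specific and are represented here only by a placeholder callback, and the step names are illustrative.

```python
from enum import Enum, auto

class PlantState(Enum):
    SCOOP = auto()
    TRANSFER_TO_FURNACE = auto()
    HEAT_AND_COLLECT_GAS = auto()
    TRANSFER_TO_TILE_CENTER = auto()
    IDLE = auto()

# Ordered duty cycle roughly corresponding to the steps described above.
SEQUENCE = [PlantState.SCOOP,
            PlantState.TRANSFER_TO_FURNACE,
            PlantState.HEAT_AND_COLLECT_GAS,
            PlantState.TRANSFER_TO_TILE_CENTER,
            PlantState.IDLE]

def run_duty_cycle(actuate):
    """Step through one processing cycle, delegating hardware I/O to the
    caller-supplied `actuate(state)` callback (placeholder for the D/A, A/D,
    and DSP card access)."""
    for state in SEQUENCE:
        actuate(state)

if __name__ == "__main__":
    run_duty_cycle(lambda state: print(f"executing {state.name}"))
```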

  10. Appendix Y. The Integrated Communications Experiment (ICE) Summary.

    ERIC Educational Resources Information Center

    Coffin, Robert

    This appendix describes the Integrated Communications Experiment (ICE), a comprehensive computer software capability developed for the ComField Project. Each major characteristic of the data processing system is treated separately: natural language processing, flexibility, noninterference with the educational process, multipurposeness,…

  11. Library Information-Processing System

    NASA Technical Reports Server (NTRS)

    1985-01-01

    System works with Library of Congress MARC II format. System composed of subsystems that provide wide range of library information-processing capabilities. Format is American National Standards Institute (ANSI) format for machine-readable bibliographic data. Adaptable to any medium-to-large library.

  12. Readiness of the ATLAS Trigger and Data Acquisition system for the first LHC beams

    NASA Astrophysics Data System (ADS)

    Vandelli, W.; Atlas Tdaq Collaboration

    2009-12-01

    The ATLAS Trigger and Data Acquisition (TDAQ) system is based on O(2k) processing nodes, interconnected by a multi-layer Gigabit network, and consists of a combination of custom electronics and commercial products. In its final configuration, O(20k) applications will provide the needed capabilities in terms of event selection, data flow, local storage and data monitoring. In preparation for the first LHC beams, many TDAQ sub-systems have already reached their final configuration and roughly one third of the final processing power has been deployed. Therefore, the current system allows for a sensible evaluation of the performance and scaling properties. In this paper we introduce the ATLAS TDAQ system requirements and architecture and we discuss the status of the software and hardware components. We moreover present the results of performance measurements validating the system design and providing a figure for the ATLAS data acquisition capabilities in the initial data taking period.

  13. Theory on data processing and instrumentation. [remote sensing

    NASA Technical Reports Server (NTRS)

    Billingsley, F. C.

    1978-01-01

    A selection of NASA Earth observations programs are reviewed, emphasizing hardware capabilities. Sampling theory, noise and detection considerations, and image evaluation are discussed for remote sensor imagery. Vision and perception are considered, leading to numerical image processing. The use of multispectral scanners and of multispectral data processing systems, including digital image processing, is depicted. Multispectral sensing and analysis in application with land use and geographical data systems are also covered.

  14. Process of activation of a palladium catalyst system

    DOEpatents

    Sobolevskiy, Anatoly [Orlando, FL; Rossin, Joseph A [Columbus, OH; Knapke, Michael J [Columbus, OH

    2011-08-02

    Improved processes for activating a catalyst system used for the reduction of nitrogen oxides are provided. In one embodiment, the catalyst system is activated by passing an activation gas stream having an amount of each of oxygen, water vapor, nitrogen oxides, and hydrogen over the catalyst system and increasing the temperature of the catalyst system to at least 180 °C at a heating rate of 1-20°/min. Use of the activation processes described herein leads to a catalyst system with superior NOx reduction capabilities.

  15. Bringing the Ocean to the Precollege Classroom through field Investigations at a National Underwater Laboratory

    DTIC Science & Technology

    1998-09-30

    was to use field experiences to 1) enhance educator capability in science content and skills, 2) immerse school systems in an inquiry-driven, active learning process, and 3) establish links to real-time scientific information in support of classroom activities. Participants' capability in marine

  16. Assessment of the Orion-SLS Interface Management Process in Achieving the EIA 731.1 Systems Engineering Capability Model Generic Practices Level 3 Criteria

    NASA Technical Reports Server (NTRS)

    Jellicorse, John J.; Rahman, Shamin A.

    2016-01-01

    NASA is currently developing the next generation crewed spacecraft and launch vehicle for exploration beyond earth orbit, including returning to the Moon and making the transit to Mars. Managing the design integration of major hardware elements of a space transportation system is critical for overcoming both the technical and programmatic challenges in taking a complex system from concept to space operations. An established method of accomplishing this is formal interface management. In this paper we set forth an argument that the interface management process implemented by NASA between the Orion Multi-Purpose Crew Vehicle (MPCV) and the Space Launch System (SLS) achieves the Level 3 tier of the EIA 731.1 System Engineering Capability Model (SECM) for Generic Practices. We describe the relevant NASA systems and associated organizations, and define the EIA SECM Level 3 Generic Practices. We then provide evidence for our compliance with those practices. This evidence includes discussions of: the NASA Systems Engineering (SE) Interface Management standard process and best practices; the tailoring of that process for implementation on the Orion-to-SLS interface; changes made over time to improve the tailored process; and the opportunities to take the resulting lessons learned and propose improvements to our institutional processes and best practices. We compare this evidence against the practices to form the rationale for the declared SECM maturity level.

  17. Diagnosis and Prognosis of Weapon Systems

    NASA Technical Reports Server (NTRS)

    Nolan, Mary; Catania, Rebecca; deMare, Gregory

    2005-01-01

    The Prognostics Framework is a set of software tools with an open architecture that affords a capability to integrate various prognostic software mechanisms and to provide information for operational and battlefield decision-making and logistical planning pertaining to weapon systems. The Prognostics Framework is also a system-level health-management software system that (1) receives data from performance-monitoring and built-in-test sensors and from other prognostic software and (2) processes the received data to derive a diagnosis and a prognosis for a weapon system. This software relates the diagnostic and prognostic information to the overall health of the system, to the ability of the system to perform specific missions, and to needed maintenance actions and maintenance resources. In the development of the Prognostics Framework, effort was focused primarily on extending previously developed model-based diagnostic-reasoning software to add prognostic reasoning capabilities, including capabilities to perform statistical analyses and to utilize information pertaining to deterioration of parts, failure modes, time sensitivity of measured values, mission criticality, historical data, and trends in measurement data. As thus extended, the software offers an overall health-monitoring capability.

  18. A flexible flight display research system using a ground-based interactive graphics terminal

    NASA Technical Reports Server (NTRS)

    Hatfield, J. J.; Elkins, H. C.; Batson, V. M.; Poole, W. L.

    1975-01-01

    Requirements and research areas for the air transportation system of the 1980s to 1990s were reviewed briefly to establish the need for a flexible flight display generation research tool. Specific display capabilities required by aeronautical researchers are listed and a conceptual system for providing these capabilities is described. The conceptual system uses a ground-based interactive graphics terminal driven by real-time radar and telemetry data to generate dynamic, experimental flight displays. These displays are scan converted to television format, processed, and transmitted to the cockpits of evaluation aircraft. The attendant advantages of a Flight Display Research System (FDRS) designed to employ this concept are presented. The detailed implementation of an FDRS is described. The basic characteristics of the interactive graphics terminal and supporting display electronic subsystems are presented and the resulting system capability is summarized. Finally, the system status and utilization are reviewed.

  19. Space Station Mission Planning System (MPS) development study. Volume 2

    NASA Technical Reports Server (NTRS)

    Klus, W. J.

    1987-01-01

    The process and existing software used for Spacelab payload mission planning were studied. A complete baseline definition of the Spacelab payload mission planning process was established, along with a definition of existing software capabilities for potential extrapolation to the Space Station. This information was used as a basis for defining system requirements to support Space Station mission planning. The Space Station mission planning concept was reviewed for the purpose of identifying areas where artificial intelligence concepts might offer substantially improved capability. Three specific artificial intelligence concepts were to be investigated for applicability: natural language interfaces; expert systems; and automatic programming. The advantages and disadvantages of interfacing an artificial intelligence language with existing FORTRAN programs or of converting totally to a new programming language were identified.

  20. Development and testing of controller performance evaluation methodology for multi-input/multi-output digital control systems

    NASA Technical Reports Server (NTRS)

    Pototzky, Anthony; Wieseman, Carol; Hoadley, Sherwood Tiffany; Mukhopadhyay, Vivek

    1991-01-01

    Described here are the development and implementation of an on-line, near-real-time controller performance evaluation (CPE) capability. Briefly discussed are the structure of data flow, the signal processing methods used to process the data, and the software developed to generate the transfer functions. This methodology is generic in nature and can be used in any type of multi-input/multi-output (MIMO) digital controller application, including digital flight control systems, digitally controlled spacecraft structures, and actively controlled wind tunnel models. Results of applying the CPE methodology to evaluate (in near real time) MIMO digital flutter suppression systems being tested on the Rockwell Active Flexible Wing (AFW) wind tunnel model are presented to demonstrate the CPE capability.
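
    For a flavor of what generating transfer functions from measured excitation/response data involves, one common approach (not necessarily the one used in the AFW work) estimates the frequency response from cross- and auto-spectral densities. A sketch using SciPy with a simulated second-order plant:

```python
import numpy as np
from scipy import signal

def estimate_frf(u, y, fs, nperseg=1024):
    """Estimate a single-input/single-output frequency response function
    H(f) = S_uy(f) / S_uu(f) from time histories u (input) and y (output).
    For a MIMO controller this is repeated for every input/output pair."""
    f, s_uy = signal.csd(u, y, fs=fs, nperseg=nperseg)
    _, s_uu = signal.welch(u, fs=fs, nperseg=nperseg)
    return f, s_uy / s_uu

if __name__ == "__main__":
    fs = 200.0
    t = np.arange(0, 60, 1 / fs)
    u = np.random.default_rng(1).standard_normal(t.size)  # broadband excitation
    # Simulated "plant": a lightly damped second-order system near 5 Hz.
    wn = 2 * np.pi * 5.0
    b, a = signal.bilinear([1.0], [1.0 / wn**2, 2 * 0.05 / wn, 1.0], fs)
    y = signal.lfilter(b, a, u)
    f, h = estimate_frf(u, y, fs)
    print(f"peak response near {f[np.abs(h).argmax()]:.1f} Hz")
```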

  1. The automated Army ROTC Questionnaire (ARQ)

    NASA Technical Reports Server (NTRS)

    Young, David L. H.

    1991-01-01

    The Reserve Officer Training Corps Cadet Command (ROTCCC) takes applications for its officer training program from college students and Army enlisted personnel worldwide. Each applicant is required to complete a set of application forms prior to acceptance into the ROTC program. These forms are covered by several regulations that govern the eligibility of potential applicants and guide the applicant through the application process. Eligibility criteria change as Army regulations are periodically revised. Outdated information results in a loss of applications attributable to frustration and error. ROTCCC asked for an inexpensive and reliable way of automating its application process. After reviewing the process, it was determined that an expert system with good end-user interface capabilities could be used to solve a large part of the problem. The system captures the knowledge contained within the regulations, enables the quick distribution and implementation of eligibility criteria changes, and distributes the expertise of the admissions personnel to the education centers and colleges. The expert system uses a modified version of CLIPS that was streamlined to make the most efficient use of its capabilities. A user interface with windowing capabilities provides the applicant with a simple and effective way to input his/her personal data.
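
    The original system encodes Army regulation eligibility rules in a modified CLIPS shell. Purely to illustrate the rule-checking idea, the sketch below uses invented criteria; none of the thresholds correspond to actual ROTC regulations.

```python
# Illustrative only: invented eligibility rules, not actual Army ROTC criteria.
def check_eligibility(applicant: dict) -> list[str]:
    """Return a list of human-readable reasons an applicant is ineligible;
    an empty list means no rule fired."""
    reasons = []
    if applicant.get("age", 0) > 30:                 # hypothetical limit
        reasons.append("exceeds maximum age")
    if applicant.get("gpa", 0.0) < 2.5:              # hypothetical limit
        reasons.append("GPA below minimum")
    if not applicant.get("us_citizen", False):       # hypothetical rule
        reasons.append("must be a U.S. citizen")
    return reasons

if __name__ == "__main__":
    print(check_eligibility({"age": 24, "gpa": 3.1, "us_citizen": True}))
```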

  2. The Making of a Government LSI - From Warfare Capability to Operational System

    DTIC Science & Technology

    2015-04-30

    continues to evolve and implement Lead System Integrator (LSI) acquisition strategies, they have started to define numerous program initiatives that...employ more integrated engineering and management processes and techniques. These initiatives are developing varying acquisition approaches that define (1...government LSI transformation. Navy Systems Commands have begun adding a higher level of integration into their acquisition process with the

  3. Formal logic rewrite system bachelor in teaching mathematical informatics

    NASA Astrophysics Data System (ADS)

    Habiballa, Hashim; Jendryscik, Radek

    2017-07-01

    The article presents the capabilities of the formal rewrite logic system Bachelor for teaching theoretical computer science (mathematical informatics). The Bachelor system enables a constructivist approach to teaching and may therefore enhance the learning process in essential disciplines of hard informatics. It not only provides a detailed description of the formal rewrite process but can also demonstrate algorithmic principles for the manipulation of logic formulae.
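
    To give a concrete feel for formal rewriting of logic formulae (independently of the Bachelor system itself), the sketch below applies two classical rewrite rules, double-negation elimination and De Morgan's laws, to a small tuple-based formula representation.

```python
# Formulas are nested tuples: ("not", f), ("and", f, g), ("or", f, g),
# or a variable name as a plain string.

def rewrite(formula):
    """Apply double-negation elimination and De Morgan's laws bottom-up
    until a fixed point is reached."""
    if isinstance(formula, str):
        return formula
    op, *args = formula
    args = tuple(rewrite(a) for a in args)
    if op == "not":
        (inner,) = args
        if isinstance(inner, tuple):
            if inner[0] == "not":                    # not not f  ->  f
                return rewrite(inner[1])
            if inner[0] == "and":                    # not (f and g) -> (not f) or (not g)
                return rewrite(("or", ("not", inner[1]), ("not", inner[2])))
            if inner[0] == "or":                     # not (f or g) -> (not f) and (not g)
                return rewrite(("and", ("not", inner[1]), ("not", inner[2])))
    return (op, *args)

if __name__ == "__main__":
    print(rewrite(("not", ("and", ("not", "p"), "q"))))  # -> ('or', 'p', ('not', 'q'))
```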

  4. Applications of High-speed motion analysis system on Solid Rocket Motor (SRM)

    NASA Astrophysics Data System (ADS)

    Liu, Yang; He, Guo-qiang; Li, Jiang; Liu, Pei-jin; Chen, Jian

    2007-01-01

    The high-speed motion analysis system can record images at up to 12,000 fps, which are then analyzed with its image processing system. The system stores data and images directly in electronic memory, which is convenient for managing and analyzing them. The high-speed motion analysis system and the X-ray radiography system were combined to establish a high-speed, real-time X-ray radiography system that can diagnose and measure dynamic, high-speed processes inside opaque hardware. Image processing software was developed to improve the quality of the original images and to acquire more precise information. Typical applications of the high-speed motion analysis system on solid rocket motors (SRM) are introduced in this paper. Studies of the anomalous combustion of solid propellant grains with defects, real-time measurement of insulator erosion, the explosion incision process of a motor, the structure and wave character of the plume during ignition and flameout, measurement of the end burning of solid propellant, measurement of the flame front, and the compatibility between airplane and missile during missile launching were carried out using the high-speed motion analysis system. Significant results were achieved through this research. For application of the high-speed motion analysis system to solid rocket motors, the key problems that degraded image quality, such as motor vibration, power supply instability, geometric distortion, and noise disturbance, were solved. The image processing software improved the capability of measuring image characteristics. The experimental results showed that the system is a powerful facility for studying instantaneous, high-speed processes in solid rocket motors. With the development of image processing techniques, the capability of the high-speed motion analysis system has been further enhanced.
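
    One of the listed measurements, locating the flame front in successive frames, can be illustrated with a very small image-processing sketch: threshold each frame and take the furthest bright column as the front position. The production software is considerably more elaborate; this is only a toy example on synthetic data.

```python
import numpy as np

def flame_front_position(frame, threshold=0.5):
    """Return the right-most column index whose peak intensity exceeds the
    threshold, as a crude estimate of the flame-front location.
    `frame` is a 2-D array of intensities normalised to [0, 1]."""
    column_peaks = frame.max(axis=0)
    bright = np.flatnonzero(column_peaks > threshold)
    return int(bright[-1]) if bright.size else None

if __name__ == "__main__":
    # Synthetic 100x200 frame whose bright region ends at column 120.
    frame = np.zeros((100, 200))
    frame[:, :121] = 0.9
    print(flame_front_position(frame))  # -> 120
```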

  5. Implementation of a low-cost, commercial orbit determination system

    NASA Technical Reports Server (NTRS)

    Corrigan, Jim

    1994-01-01

    This paper describes the implementation and potential applications of a workstation-based orbit determination system developed by Storm Integration, Inc. called the Precision Orbit Determination System (PODS). PODS is offered as a layered product to the commercially-available Satellite Tool Kit (STK) produced by Analytical Graphics, Inc. PODS also incorporates the Workstation/Precision Orbit Determination (WS/POD) product offered by Van Martin System, Inc. The STK graphical user interface is used to access and invoke the PODS capabilities and to display the results. WS/POD is used to compute a best-fit solution to user-supplied tracking data. PODS provides the capability to simultaneously estimate the orbits of up to 99 satellites based on a wide variety of observation types including angles, range, range rate, and Global Positioning System (GPS) data. PODS can also estimate ground facility locations, Earth geopotential model coefficients, solar pressure and atmospheric drag parameters, and observation data biases. All determined data is automatically incorporated into the STK data base, which allows storage, manipulation and export of the data to other applications. PODS is offered in three levels: Standard, Basic GPS and Extended GPS. Standard allows processing of non-GPS observation types for any number of vehicles and facilities. Basic GPS adds processing of GPS pseudo-ranging data to the Standard capabilities. Extended GPS adds the ability to process GPS carrier phase data.

  6. The Space Systems Environmental Test Facility Database (SSETFD), Website Development Status

    NASA Technical Reports Server (NTRS)

    Snyder, James M.

    2008-01-01

    The Aerospace Corporation has been developing a database of U.S. environmental test laboratory capabilities utilized by the space systems hardware development community. To date, 19 sites have been visited by The Aerospace Corporation and verbal agreements reached to include their capability descriptions in the database. A website is being developed to make this database accessible by all interested government, civil, university and industry personnel. The website will be accessible by all interested in learning more about the extensive collective capability that the US based space industry has to offer. The Environments, Test & Assessment Department within The Aerospace Corporation will be responsible for overall coordination and maintenance of the database. Several US government agencies are interested in utilizing this database to assist in the source selection process for future spacecraft programs. This paper introduces the website by providing an overview of its development, location and search capabilities. It will show how the aerospace community can apply this new tool as a way to increase the utilization of existing lab facilities, and as a starting point for capital expenditure/upgrade trade studies. The long term result is expected to be increased utilization of existing laboratory capability and reduced overall development cost of space systems hardware. Finally, the paper will present the process for adding new participants, and how the database will be maintained.

  7. Evaluation of Methods for Multidisciplinary Design Optimization (MDO). Part 2

    NASA Technical Reports Server (NTRS)

    Kodiyalam, Srinivas; Yuan, Charles; Sobieski, Jaroslaw (Technical Monitor)

    2000-01-01

    A new MDO method, BLISS, and two different variants of the method, BLISS/RS and BLISS/S, have been implemented using iSIGHT's scripting language and are evaluated in this report on multidisciplinary problems. All of these methods are based on decomposing a modular system optimization into several subtask optimizations that may be executed concurrently, together with a system-level optimization that coordinates the subtask optimizations. The BLISS method and its variants are well suited for exploiting the concurrent processing capabilities of a multiprocessor machine. Several steps, including the local sensitivity analysis, local optimization, and response surface construction and updates, are all ideally suited for concurrent processing. Needless to say, algorithms that can effectively exploit the concurrent processing capabilities of compute servers will be a key requirement for solving large-scale industrial design problems, such as the automotive vehicle problem detailed in Section 3.4.
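
    The concurrency argument can be illustrated with a generic sketch: independent subsystem optimizations executed in parallel, followed by a coordinating system-level step. This does not reproduce BLISS itself, whose coupling variables and sensitivities are much richer; the subtask objective and coordination rule below are invented.

```python
from concurrent.futures import ProcessPoolExecutor
from scipy.optimize import minimize_scalar

def subtask(weight):
    """Stand-in for a single discipline's local optimization."""
    result = minimize_scalar(lambda x: (x - weight) ** 2 + 0.1 * x ** 4)
    return result.x

def system_iteration(weights):
    """Run all subtask optimizations concurrently, then coordinate by
    averaging the local optima (placeholder for the system-level step)."""
    with ProcessPoolExecutor() as pool:
        local_optima = list(pool.map(subtask, weights))
    return sum(local_optima) / len(local_optima)

if __name__ == "__main__":
    print(system_iteration([1.0, 2.0, 3.0]))
```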

  8. Microtechnology management considering test and cost aspects for stacked 3D ICs with MEMS

    NASA Astrophysics Data System (ADS)

    Hahn, K.; Wahl, M.; Busch, R.; Grünewald, A.; Brück, R.

    2018-01-01

    Innovative automotive systems require complex semiconductor devices currently only available in consumer grade quality. The European project TRACE will develop and demonstrate methods, processes, and tools to facilitate usage of Consumer Electronics (CE) components so that they can be deployed more rapidly in the life-critical automotive domain. Consumer electronics increasingly use heterogeneous system integration methods and "More than Moore" technologies, which are capable of combining different circuit domains (Analog, Digital, RF, MEMS) and which are integrated within SiP or 3D stacks. Making these technologies, or at least some of their process steps, available under automotive electronics requirements is an important goal to keep pace with the growing demand for information processing within cars. The approach presented in this paper aims at a technology management and recommendation system that covers technology data, functional and non-functional constraints, and application scenarios, and that will incorporate test planning and cost consideration capabilities.

  9. An AI approach for scheduling space-station payloads at Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Castillo, D.; Ihrie, D.; Mcdaniel, M.; Tilley, R.

    1987-01-01

    The Payload Processing for Space-Station Operations (PHITS) is a prototype modeling tool capable of addressing many Space Station-related concerns. The system's object-oriented design approach, coupled with a powerful user interface, provides the user with capabilities to easily define and model many applications. PHITS differs from many artificial intelligence-based systems in that it couples scheduling and goal-directed simulation to ensure that on-orbit requirement dates are satisfied.

  10. Collision avoidance system cost-benefit analysis : volume III - appendices F-M

    DOT National Transportation Integrated Search

    1981-09-01

    Collision-avoidance systems under development in the U.S.A., Japan and Germany were evaluated. The performance evaluation showed that the signal processing and the control law of a system were the key parameters that decided the system's capability, ...

  11. Collision avoidance system cost-benefit analysis : volume II - appendices A-E

    DOT National Transportation Integrated Search

    1981-09-01

    Collision-avoidance systems under development in the U.S.A., Japan and Germany were evaluated. The performance evaluation showed that the signal processing and the control law of a system were the key parameters that decided the system's capability, ...

  12. System verification and validation: a fundamental systems engineering task

    NASA Astrophysics Data System (ADS)

    Ansorge, Wolfgang R.

    2004-09-01

    Systems Engineering (SE) is the discipline in a project management team which transfers the user's operational needs and justifications for an Extremely Large Telescope (ELT), or any other telescope, into a set of validated required system performance characteristics; subsequently transfers these validated required system performance characteristics into a validated system configuration; and eventually delivers the assembled, integrated telescope system with verified performance characteristics, providing "objective evidence that the particular requirements for the specified intended use are fulfilled". The latter is the ISO Standard 8402 definition of "Validation". This presentation describes the verification and validation processes of an ELT project and outlines the key role Systems Engineering plays in these processes throughout all project phases. If these processes are implemented correctly in the project execution and are started at the proper time, namely at the very beginning of the project, and if all capabilities of experienced system engineers are used, the project costs and the life-cycle costs of the telescope system can be reduced by between 25 and 50%. The intention of this article is to motivate and encourage project managers of astronomical telescopes and scientific instruments to involve the entire spectrum of Systems Engineering capabilities, performed by trained and experienced system engineers, for the benefit of the project, by explaining to them the importance of Systems Engineering in the AIV and validation processes.

  13. The E-3 Test Facility at Stennis Space Center: Research and Development Testing for Cryogenic and Storable Propellant Combustion Systems

    NASA Technical Reports Server (NTRS)

    Pazos, John T.; Chandler, Craig A.; Raines, Nickey G.

    2009-01-01

    This paper will provide the reader a broad overview of the current upgraded capabilities of NASA's John C. Stennis Space Center E-3 Test Facility to perform testing for rocket engine combustion systems and components using liquid and gaseous oxygen, gaseous and liquid methane, gaseous hydrogen, hydrocarbon based fuels, hydrogen peroxide, high pressure water and various inert fluids. Details of propellant system capabilities will be highlighted as well as their application to recent test programs and accomplishments. Data acquisition and control, test monitoring, systems engineering and test processes will be discussed as part of the total capability of E-3 to provide affordable alternatives for subscale to full scale testing for many different requirements in the propulsion community.

  14. The Montana experience

    NASA Technical Reports Server (NTRS)

    Dundas, T. R.

    1981-01-01

    The development and capabilities of the Montana geodata system are discussed. The system is entirely dependent on the state's central data processing facility which serves all agencies and is therefore restricted to batch mode processing. The computer graphics equipment is briefly described along with its application to state lands and township mapping and the production of water quality interval maps.

  15. Integration of a three-dimensional process-based hydrological model into the Object Modeling System

    USDA-ARS?s Scientific Manuscript database

    The integration of a spatial process model into an environmental modelling framework can enhance the model’s capabilities. We present the integration of the GEOtop model into the Object Modeling System (OMS) version 3.0 and illustrate its application in a small watershed. GEOtop is a physically base...

  16. Human Spaceflight Safety for the Next Generation on Orbital Space Systems

    NASA Technical Reports Server (NTRS)

    Mango, Edward J.

    2011-01-01

    The National Aeronautics and Space Administration (NASA) Commercial Crew Program (CCP) has been chartered to facilitate the development of a United States (U.S.) commercial crew space transportation capability with the goal of achieving safe, reliable, and cost effective access to and from low Earth orbit (LEO) and the International Space Station (ISS) as soon as possible. Once the capability is matured and is available to the Government and other customers, NASA expects to purchase commercial services to meet its ISS crew rotation and emergency return objectives. The primary role of the CCP is to enable and ensure safe human spaceflight and processes for the next generation of earth orbital space systems. The architecture of the Program delineates the process for investment performance in safe orbital systems, Crew Transportation System (CTS) certification, and CTS Flight Readiness. A series of six technical documents build up the architecture to address the top-level CTS requirements and standards. They include Design Reference Missions, with the near term focus on ISS crew services, Certification and Service Requirements, Technical Management Processes, and Technical and Operations Standards Evaluation Processes.

  17. From photons to big-data applications: terminating terabits

    PubMed Central

    2016-01-01

    Computer architectures have entered a watershed as the quantity of network data generated by user applications exceeds the data-processing capacity of any individual computer end-system. It will become impossible to scale existing computer systems while a gap grows between the quantity of networked data and the capacity for per system data processing. Despite this, the growth in demand in both task variety and task complexity continues unabated. Networked computer systems provide a fertile environment in which new applications develop. As networked computer systems become akin to infrastructure, any limitation upon the growth in capacity and capabilities becomes an important constraint of concern to all computer users. Considering a networked computer system capable of processing terabits per second, as a benchmark for scalability, we critique the state of the art in commodity computing, and propose a wholesale reconsideration in the design of computer architectures and their attendant ecosystem. Our proposal seeks to reduce costs, save power and increase performance in a multi-scale approach that has potential application from nanoscale to data-centre-scale computers. PMID:26809573

  18. Automated carbon dioxide cleaning system

    NASA Technical Reports Server (NTRS)

    Hoppe, David T.

    1991-01-01

    Solidified CO2 pellets are an effective blast media for the cleaning of a variety of materials. CO2 is obtained from the waste gas streams generated from other manufacturing processes and therefore does not contribute to the greenhouse effect, depletion of the ozone layer, or the environmental burden of hazardous waste disposal. The system is capable of removing as much as 90 percent of the contamination from a surface in one pass or to a high cleanliness level after multiple passes. Although the system is packaged and designed for manual hand held cleaning processes, the nozzle can easily be attached to the end effector of a robot for automated cleaning of predefined and known geometries. Specific tailoring of cleaning parameters are required to optimize the process for each individual geometry. Using optimum cleaning parameters the CO2 systems were shown to be capable of cleaning to molecular levels below 0.7 mg/sq ft. The systems were effective for removing a variety of contaminants such as lubricating oils, cutting oils, grease, alcohol residue, biological films, and silicone. The system was effective on steel, aluminum, and carbon phenolic substrates.

  19. From photons to big-data applications: terminating terabits.

    PubMed

    Zilberman, Noa; Moore, Andrew W; Crowcroft, Jon A

    2016-03-06

    Computer architectures have entered a watershed as the quantity of network data generated by user applications exceeds the data-processing capacity of any individual computer end-system. It will become impossible to scale existing computer systems while a gap grows between the quantity of networked data and the capacity for per system data processing. Despite this, the growth in demand in both task variety and task complexity continues unabated. Networked computer systems provide a fertile environment in which new applications develop. As networked computer systems become akin to infrastructure, any limitation upon the growth in capacity and capabilities becomes an important constraint of concern to all computer users. Considering a networked computer system capable of processing terabits per second, as a benchmark for scalability, we critique the state of the art in commodity computing, and propose a wholesale reconsideration in the design of computer architectures and their attendant ecosystem. Our proposal seeks to reduce costs, save power and increase performance in a multi-scale approach that has potential application from nanoscale to data-centre-scale computers. © 2016 The Authors.

  20. Differential correction capability of the GTDS using TDRSS data

    NASA Technical Reports Server (NTRS)

    Liu, S. Y.; Soskey, D. G.; Jacintho, J.

    1980-01-01

    A differential correction (DC) capability was implemented in the Goddard Trajectory Determination System (GTDS) to process satellite tracking data acquired via the Tracking and Data Relay Satellite System (TDRSS). Configuration of the TDRSS is reviewed, observation modeling is presented, and major features of the capability are discussed. The following types of TDRSS data can be processed by GTDS: two way relay range and Doppler measurements, hybrid relay range and Doppler measurements, one way relay Doppler measurements, and differenced one way relay Doppler measurements. These data may be combined with conventional ground based direct tracking data. By using Bayesian weighted least squares techniques, the software allows the simultaneous determination of the trajectories of up to four different satellites - one user satellite and three relay satellites. In addition to satellite trajectories, the following parameters can optionally be solved for: drag coefficient, reflectivity of a satellite for solar radiation pressure, transponder delay, station position, and biases.
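
    To make the estimation step concrete, the following sketch shows a single Bayesian weighted least-squares differential-correction update of the general kind the abstract describes; the function name, the matrix shapes, and the way the a-priori information is folded in are illustrative assumptions, not the GTDS implementation.

      import numpy as np

      def bayesian_wls_correction(H, residuals, obs_weights, x_prior_dev, prior_cov):
          """One Bayesian weighted least-squares differential-correction step.

          H           : (m, n) partials of the observations w.r.t. the solve-for parameters
          residuals   : (m,) observed-minus-computed tracking residuals
          obs_weights : (m,) measurement weights (1 / sigma**2)
          x_prior_dev : (n,) a-priori estimate minus the current estimate
          prior_cov   : (n, n) a-priori covariance of the solve-for parameters
          Returns the correction to add to the current parameter estimate.
          """
          W = np.diag(obs_weights)
          P0_inv = np.linalg.inv(prior_cov)
          normal_matrix = H.T @ W @ H + P0_inv            # information from data plus prior
          rhs = H.T @ W @ residuals + P0_inv @ x_prior_dev
          return np.linalg.solve(normal_matrix, rhs)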

  1. Configuration management issues and objectives for a real-time research flight test support facility

    NASA Technical Reports Server (NTRS)

    Yergensen, Stephen; Rhea, Donald C.

    1988-01-01

    Presented are some of the critical issues and objectives pertaining to configuration management for the NASA Western Aeronautical Test Range (WATR) of Ames Research Center. The primary mission of the WATR is to provide a capability for the conduct of aeronautical research flight test through real-time processing and display, tracking, and communications systems. In providing this capability, the WATR must maintain and enforce a configuration management plan which is independent of, but complementary to, various research flight test project configuration management systems. A primary WATR objective is the continued development of generic research flight test project support capability, wherein the reliability of WATR support provided to all project users is a constant priority. Therefore, the processing of configuration change requests for specific research flight test project requirements must be evaluated within a perspective that maintains this primary objective.

  2. Processing multilevel secure test and evaluation information

    NASA Astrophysics Data System (ADS)

    Hurlburt, George; Hildreth, Bradley; Acevedo, Teresa

    1994-07-01

    The Test and Evaluation Community Network (TECNET) is building a Multilevel Secure (MLS) system. This system features simultaneous access to classified and unclassified information and easy access through widely available communications channels. It provides the necessary separation of classification levels, assured through the use of trusted system design techniques, security assessments and evaluations. This system enables cleared T&E users to view and manipulate classified and unclassified information resources either using a single terminal interface or multiple windows in a graphical user interface. TECNET is in direct partnership with the National Security Agency (NSA) to develop and field the MLS TECNET capability in the near term. The centerpiece of this partnership is a state-of-the-art Concurrent Systems Security Engineering (CSSE) process. In developing the MLS TECNET capability, TECNET and NSA are providing members, with various expertise and diverse backgrounds, to participate in the CSSE process. The CSSE process is founded on the concepts of both Systems Engineering and Concurrent Engineering. Systems Engineering is an interdisciplinary approach to evolve and verify an integrated and life cycle balanced set of system product and process solutions that satisfy customer needs (ASD/ENS-MIL STD 499B 1992). Concurrent Engineering is design and development using the simultaneous, applied talents of a diverse group of people with the appropriate skills. Harnessing diverse talents to support CSSE requires active participation by team members in an environment that both respects and encourages diversity.

  3. AOIPS water resources data management system

    NASA Technical Reports Server (NTRS)

    Vanwie, P.

    1977-01-01

    The text and computer-generated displays used to demonstrate the AOIPS (Atmospheric and Oceanographic Information Processing System) water resources data management system are investigated. The system was developed to assist hydrologists in analyzing the physical processes occurring in watersheds. It was designed to alleviate some of the problems encountered while investigating the complex interrelationships of variables such as land-cover type, topography, precipitation, snow melt, surface runoff, evapotranspiration, and streamflow rates. The system provides interactive image processing capability and a color video display for presenting results as they are obtained.

  4. Development and evaluation of a profile negotiation process for integrating aircraft and air traffic control automation

    NASA Technical Reports Server (NTRS)

    Green, Steven M.; Denbraven, Wim; Williams, David H.

    1993-01-01

    The development and evaluation of the profile negotiation process (PNP), an interactive process between an aircraft and air traffic control (ATC) that integrates airborne and ground-based automation capabilities to determine conflict-free trajectories that are as close to an aircraft's preference as possible, are described. The PNP was evaluated in a real-time simulation experiment conducted jointly by NASA's Ames and Langley Research Centers. The Ames Center/TRACON Automation System (CTAS) was used to support the ATC environment, and the Langley Transport Systems Research Vehicle (TSRV) piloted cab was used to simulate a 4D Flight Management System (FMS) capable aircraft. Both systems were connected in real time by way of voice and data lines; digital datalink communications capability was developed and evaluated as a means of supporting the air/ground exchange of trajectory data. The controllers were able to consistently and effectively negotiate nominally conflict-free vertical profiles with the 4D-equipped aircraft. The actual profiles flown were substantially closer to the aircraft's preference than would have been possible without the PNP. However, there was a strong consensus among the pilots and controllers that the level of automation of the PNP should be increased to make the process more transparent. The experiment demonstrated the importance of an aircraft's ability to accurately execute a negotiated profile as well as the need for digital datalink to support advanced air/ground data communications. The concept of trajectory space is proposed as a comprehensive approach for coupling the processes of trajectory planning and tracking to allow maximum pilot discretion in meeting ATC constraints.

  5. Application of statistical process control and process capability analysis procedures in orbiter processing activities at the Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Safford, Robert R.; Jackson, Andrew E.; Swart, William W.; Barth, Timothy S.

    1994-01-01

    Successful ground processing at KSC requires that flight hardware and ground support equipment conform to specifications at tens of thousands of checkpoints. Knowledge of conformance is an essential requirement for launch. That knowledge of conformance at every requisite point does not, however, enable identification of past problems with equipment, or potential problem areas. This paper describes how the introduction of Statistical Process Control and Process Capability Analysis identification procedures into existing shuttle processing procedures can enable identification of potential problem areas and candidates for improvements to increase processing performance measures. Results of a case study describing application of the analysis procedures to Thermal Protection System processing are used to illustrate the benefits of the approaches described in the paper.
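
    The process capability indices referred to above follow directly from a checkpoint's specification limits and its measured spread. The short sketch below computes Cp and Cpk for a hypothetical set of readings; the limits and data are invented for illustration and are not drawn from the Thermal Protection System case study.

      import statistics

      def process_capability(samples, lsl, usl):
          """Cp and Cpk of a measured checkpoint against its lower/upper spec limits."""
          mean = statistics.mean(samples)
          sigma = statistics.stdev(samples)                  # sample standard deviation
          cp = (usl - lsl) / (6 * sigma)                     # potential capability
          cpk = min(usl - mean, mean - lsl) / (3 * sigma)    # capability allowing for centering
          return cp, cpk

      # Hypothetical bond-line thickness readings (mm) against 1.90-2.10 mm limits
      readings = [1.98, 2.02, 2.01, 1.97, 2.03, 2.00, 1.99, 2.02]
      print(process_capability(readings, lsl=1.90, usl=2.10))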

  6. Digital image processing of bone - Problems and potentials

    NASA Technical Reports Server (NTRS)

    Morey, E. R.; Wronski, T. J.

    1980-01-01

    The development of a digital image processing system for bone histomorphometry and fluorescent marker monitoring is discussed. The system in question is capable of making measurements of UV or light microscope features on a video screen with either video or computer-generated images, and comprises a microscope, low-light-level video camera, video digitizer and display terminal, color monitor, and PDP 11/34 computer. Capabilities demonstrated in the analysis of an undecalcified rat tibia include the measurement of perimeter and total bone area, and the generation of microscope images, false color images, digitized images and contoured images for further analysis. Software development will be based on an existing software library, specifically the mini-VICAR system developed at JPL. It is noted that the potentials of the system in terms of speed and reliability far exceed any problems associated with hardware and software development.
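
    As a rough illustration of the histomorphometric measurements mentioned above, the sketch below estimates total bone area and perimeter from a thresholded (binary) image; the boundary-pixel approximation and the pixel-size parameter are assumptions made for illustration, not the mini-VICAR routines.

      import numpy as np

      def bone_area_and_perimeter(mask, pixel_size_um=1.0):
          """Area and perimeter of a thresholded (binary) bone image.

          mask: 2-D boolean array, True where bone is present. The perimeter is
          approximated by counting bone pixels with at least one 4-connected
          non-bone neighbour.
          """
          mask = mask.astype(bool)
          area = mask.sum() * pixel_size_um ** 2

          padded = np.pad(mask, 1, constant_values=False)
          interior = (padded[:-2, 1:-1] & padded[2:, 1:-1] &
                      padded[1:-1, :-2] & padded[1:-1, 2:])
          boundary = mask & ~interior
          perimeter = boundary.sum() * pixel_size_um
          return area, perimeter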

  7. A Proposed Operational Concept for the Defense Communications Operations Support System.

    DTIC Science & Technology

    1986-01-01

    Artificial Intelligence AMA Automatic Message Accounting AMIE AUTODIN Management Index System AMPE Automated Message Processing Exchange ANCS AUTOVON Network...Support IMPRESS Impact/Restoral System INFORM Information Retrieval System IOC Initial Operational Capability IRU Intelligent Remote Unit I-S/A AMPE

  8. Low cost solar array project. Task 1: Silicon material, gaseous melt replenishment system

    NASA Technical Reports Server (NTRS)

    Jewett, D. N.; Bates, H. E.; Hill, D. M.

    1979-01-01

    A system to combine silicon formation, by hydrogen reduction of trichlorosilane, with the capability to replenish a crystal growth system is described. A variety of process parameters were estimated to allow sizing and specification of gas handling system components.

  9. Autonomous spacecraft maintenance study group

    NASA Technical Reports Server (NTRS)

    Marshall, M. H.; Low, G. D.

    1981-01-01

    A plan to incorporate autonomous spacecraft maintenance (ASM) capabilities into Air Force spacecraft by 1989 is outlined. It includes the successful operation of the spacecraft without ground operator intervention for extended periods of time. Mechanisms, along with a fault tolerant data processing system (including a nonvolatile backup memory) and an autonomous navigation capability, are needed to replace the routine servicing that is presently performed by the ground system. The state-of-the-art fault-handling capabilities of various spacecraft and computers are described, and a set of conceptual design requirements needed to achieve ASM is established. An implementation plan for the near-term technology development needed for an ASM proof-of-concept demonstration by 1985, and a research agenda addressing long-range academic research for an advanced ASM system for the 1990s, are also established.

  10. The Development of HfO2-Rare Earth Based Oxide Materials and Barrier Coatings for Thermal Protection Systems

    NASA Technical Reports Server (NTRS)

    Zhu, Dongming; Harder, Bryan James

    2014-01-01

    Advanced hafnia-rare earth oxides, rare earth aluminates and silicates have been developed for thermal environmental barrier systems for aerospace propulsion engine and thermal protection applications. The high temperature stability, low thermal conductivity, excellent oxidation resistance and mechanical properties of these oxide material systems make them attractive and potentially viable for thermal protection systems. This paper focuses on the development of high performance, high temperature capable ZrO2/HfO2-rare earth based alloy and compound oxide materials, processed as protective coating systems using state-of-the-art processing techniques. Particular emphasis has been placed on assessing their temperature capability, stability and suitability for advanced space vehicle entry thermal protection systems. Fundamental thermophysical and thermomechanical properties of the material systems have been investigated at high temperatures. Laser high-heat-flux testing has also been developed to validate the material systems and demonstrate durability under space entry high heat flux conditions.

  11. Final Report Collaborative Project. Improving the Representation of Coastal and Estuarine Processes in Earth System Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bryan, Frank; Dennis, John; MacCready, Parker

    This project aimed to improve long term global climate simulations by resolving and enhancing the representation of the processes involved in the cycling of freshwater through estuaries and coastal regions. This was a collaborative multi-institution project consisting of physical oceanographers, climate model developers, and computational scientists. It specifically targeted the DOE objectives of advancing simulation and predictive capability of climate models through improvements in resolution and physical process representation. The main computational objectives were: 1. To develop computationally efficient, but physically based, parameterizations of estuary and continental shelf mixing processes for use in an Earth System Model (CESM). 2. To develop a two-way nested regional modeling framework in order to dynamically downscale the climate response of particular coastal ocean regions and to upscale the impact of the regional coastal processes to the global climate in an Earth System Model (CESM). 3. To develop computational infrastructure to enhance the efficiency of data transfer between specific sources and destinations, i.e., a point-to-point communication capability (used in objective 1), within POP, the ocean component of CESM.

  12. A novel pulsed gas metal arc welding system with direct droplet transfer closed-loop control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Q.; Li, P.; Zhang, L.

    1994-12-31

    In pulsed gas metal arc welding (GMAW), a predominant parameter that has to be monitored and controlled in real time to maintain process stability and ensure weld quality is droplet transfer. Based on the close correlation between droplet transfer and arc light radiant flux in GMAW of steel and aluminum, a direct closed-loop droplet transfer control system for pulsed GMAW with an arc light sensor has been developed. By sensing the droplet transfer directly via the arc light signal, a pulsed GMAW process with exact one-pulse, one-droplet transfer has been achieved. The novel pulsed GMAW machine consists of three parts: a sensing system, a controlling system, and a welding power system. The software used in this control system is capable of data sampling and processing, parameter matching, optimum parameter restoring, and resetting. A novel arc light sensing system has been developed; the sensor is small enough to be clamped to a semiautomatic welding torch. Based on this sensing system, a closed-loop droplet transfer control system for GMAW of steel and aluminum has been built and a commercial prototype has been made. The system is capable of maintaining one-pulse, one-droplet transfer against external interferences. The welding process with this control system has proved to be stable, quiet, and spatter-free, and to provide good weld formation.
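
    A minimal sketch of the one-pulse, one-droplet control idea follows: a droplet detachment is assumed to appear as an abrupt dip in the arc-light signal within a pulse, and the peak current of the next pulse is nudged accordingly. The detection threshold, the bang-bang adjustment, and the current limits are illustrative assumptions, not the commercial controller's algorithm.

      def droplet_detected(arc_light_samples, dip_threshold):
          """A droplet detachment is taken to show up as a dip in arc-light flux."""
          return min(arc_light_samples) < dip_threshold

      def adjust_peak_current(peak_current, detected, step=5.0, i_min=250.0, i_max=450.0):
          """Nudge the pulse peak current so each pulse detaches exactly one droplet."""
          if detected:
              peak_current -= step   # transfer achieved: back off to avoid multiple droplets
          else:
              peak_current += step   # no transfer: add energy on the next pulse
          return max(i_min, min(i_max, peak_current))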

  13. Implementation of a VLSI Level Zero Processing system utilizing the functional component approach

    NASA Technical Reports Server (NTRS)

    Shi, Jianfei; Horner, Ward P.; Grebowsky, Gerald J.; Chesney, James R.

    1991-01-01

    A high rate Level Zero Processing system is currently being prototyped at NASA/Goddard Space Flight Center (GSFC). Based on state-of-the-art VLSI technology and the functional component approach, the new system promises capabilities of handling multiple Virtual Channels and Applications with a combined data rate of up to 20 Megabits per second (Mbps) at low cost.

  14. Defense AT&L Magazine: A Publication of the Defense Acquisition University. Volume 34, Number 3, DAU 184

    DTIC Science & Technology

    2005-01-01

    developed a partnership with the Defense Acquisition University to integrate DISA's systems engineering processes, software, and network...in place, with processes being implemented: deployment management; systems engineering; software engineering; configuration management; test and...CSS systems engineering is a transition partner with Carnegie Mellon University's Software Engineering Institute and its work on the capability

  15. Impact of self-healing capability on network robustness

    NASA Astrophysics Data System (ADS)

    Shang, Yilun

    2015-04-01

    A wide spectrum of real-life systems ranging from neurons to botnets display spontaneous recovery ability. Using the generating function formalism applied to static uncorrelated random networks with arbitrary degree distributions, the microscopic mechanism underlying the depreciation-recovery process is characterized and the effect of varying self-healing capability on network robustness is revealed. It is found that the self-healing capability of nodes has a profound impact on the phase transition in the emergence of percolating clusters, and that salient difference exists in upholding network integrity under random failures and intentional attacks. The results provide a theoretical framework for quantitatively understanding the self-healing phenomenon in varied complex systems.
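
    The generating-function calculation behind results of this kind can be sketched compactly. The snippet below computes the giant-component fraction of a configuration-model network under site percolation, folding the failure and self-healing probabilities into a single effective occupation probability; that folding, and the Poisson degree distribution used in the example, are illustrative assumptions rather than the paper's exact formulation.

      import numpy as np
      from math import exp, factorial

      def giant_component_fraction(pk, phi, tol=1e-10):
          """Giant-component size of a configuration-model network whose nodes
          stay functional with probability phi (classical site percolation).

          pk : array, pk[k] = probability that a node has degree k.
          """
          k = np.arange(len(pk))
          mean_k = (k * pk).sum()
          G0 = lambda x: (pk * x ** k).sum()
          G1 = lambda x: (k * pk * x ** np.maximum(k - 1, 0)).sum() / mean_k

          u = 0.5                      # prob. an edge does not lead to the giant cluster
          while True:
              u_new = 1.0 - phi + phi * G1(u)
              if abs(u_new - u) < tol:
                  break
              u = u_new
          return phi * (1.0 - G0(u))

      # Poisson degree distribution with mean degree 4; failure probability 0.5 and
      # self-healing (recovery) probability 0.4 folded into an effective phi.
      pk = np.array([exp(-4) * 4 ** k / factorial(k) for k in range(60)])
      q, r = 0.5, 0.4
      print(giant_component_fraction(pk, phi=1 - q * (1 - r)))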

  16. Impact of self-healing capability on network robustness.

    PubMed

    Shang, Yilun

    2015-04-01

    A wide spectrum of real-life systems ranging from neurons to botnets display spontaneous recovery ability. Using the generating function formalism applied to static uncorrelated random networks with arbitrary degree distributions, the microscopic mechanism underlying the depreciation-recovery process is characterized and the effect of varying self-healing capability on network robustness is revealed. It is found that the self-healing capability of nodes has a profound impact on the phase transition in the emergence of percolating clusters, and that salient difference exists in upholding network integrity under random failures and intentional attacks. The results provide a theoretical framework for quantitatively understanding the self-healing phenomenon in varied complex systems.

  17. On the formal definition of the systems' interoperability capability: an anthropomorphic approach

    NASA Astrophysics Data System (ADS)

    Zdravković, Milan; Luis-Ferreira, Fernando; Jardim-Goncalves, Ricardo; Trajanović, Miroslav

    2017-03-01

    The extended view of enterprise information systems in the Internet of Things (IoT) introduces additional complexity to the interoperability problems. In response to this, the problem of systems' interoperability is revisited by taking into account the different aspects of philosophy, psychology, linguistics and artificial intelligence, namely by analysing the potential analogies between the processes of human and system communication. Then, the capability to interoperate, as a property of the system, is defined as a complex ability to seamlessly sense and perceive a stimulus from its environment (assumingly, a message from any other system), make an informed decision about this perception and, consequently, articulate a meaningful and useful action or response based on this decision. Although this capability is defined on the basis of the existing interoperability theories, the proposed approach to its definition excludes the assumption of the awareness of co-existence of two interoperating systems. Thus, it establishes the links between the research of interoperability of systems and intelligent software agents, as one of the systems' digital identities.

  18. Some considerations for various positioning systems and their science capabilities

    NASA Technical Reports Server (NTRS)

    Rey, Charles A.; Merkley, D. R.; Danley, T. J.

    1990-01-01

    Containerless processing of materials at elevated temperatures is discussed with emphasis on high temperature chemistry, thermophysical properties, materials science, and materials processing. Acoustic and electromagnetic positioning of high temperature melts are discussed. Results from recent ground based experiments, including KC-135 testing of an acoustic levitator, are presented. Some current positioning technologies and the potential for enhancing them are considered. Further, a summary of these technologies and their science capabilities for the development of future experiments is given.

  19. A Comparative Assessment of the Navy’s Future Naval Capabilities (FNC) Process and Joint Staff Capability Gap Assessment Process as Related to Pacific Commands (PACOM) Integrated Priority List Submission

    DTIC Science & Technology

    2013-04-01

    University Eugene Rex Jalao, Arizona State University and University of the Philippines Christopher Auger, Lars Baldus, Brian Yoshimoto, J. Robert...Approach to Agile Acquisition Timothy Boyce, Iva Sherman, and Nicholas Roussel Space and Naval Warfare Systems Center Pacific Challenge-Based...Problem Solving as a Mechanism for Adaptive Change Kathryn Aten and John T. Dillard Naval Postgraduate School A Comparative Assessment of the Navy's

  20. TheHiveDB image data management and analysis framework.

    PubMed

    Muehlboeck, J-Sebastian; Westman, Eric; Simmons, Andrew

    2014-01-06

    The hive database system (theHiveDB) is a web-based brain imaging database, collaboration, and activity system which has been designed as an imaging workflow management system capable of handling cross-sectional and longitudinal multi-center studies. It can be used to organize and integrate existing data from heterogeneous projects as well as data from ongoing studies. It has been conceived to guide and assist the researcher throughout the entire research process, integrating all relevant types of data across modalities (e.g., brain imaging, clinical, and genetic data). TheHiveDB is a modern activity and resource management system capable of scheduling image processing on both private compute resources and the cloud. The activity component supports common image archival and management tasks as well as established pipeline processing (e.g., Freesurfer for extraction of scalar measures from magnetic resonance images). Furthermore, via theHiveDB activity system algorithm developers may grant access to virtual machines hosting versioned releases of their tools to collaborators and the imaging community. The application of theHiveDB is illustrated with a brief use case based on organizing, processing, and analyzing data from the publicly available Alzheimer Disease Neuroimaging Initiative.

  1. TheHiveDB image data management and analysis framework

    PubMed Central

    Muehlboeck, J-Sebastian; Westman, Eric; Simmons, Andrew

    2014-01-01

    The hive database system (theHiveDB) is a web-based brain imaging database, collaboration, and activity system which has been designed as an imaging workflow management system capable of handling cross-sectional and longitudinal multi-center studies. It can be used to organize and integrate existing data from heterogeneous projects as well as data from ongoing studies. It has been conceived to guide and assist the researcher throughout the entire research process, integrating all relevant types of data across modalities (e.g., brain imaging, clinical, and genetic data). TheHiveDB is a modern activity and resource management system capable of scheduling image processing on both private compute resources and the cloud. The activity component supports common image archival and management tasks as well as established pipeline processing (e.g., Freesurfer for extraction of scalar measures from magnetic resonance images). Furthermore, via theHiveDB activity system algorithm developers may grant access to virtual machines hosting versioned releases of their tools to collaborators and the imaging community. The application of theHiveDB is illustrated with a brief use case based on organizing, processing, and analyzing data from the publicly available Alzheimer Disease Neuroimaging Initiative. PMID:24432000

  2. Biosonar-inspired technology: goals, challenges and insights.

    PubMed

    Müller, Rolf; Kuc, Roman

    2007-12-01

    Bioinspired engineering based on biosonar systems in nature is reviewed and discussed in terms of the merits of different approaches and their results: biosonar systems are attractive technological paragons because of their capabilities, built-in task-specific knowledge, intelligent system integration and diversity. Insights from the diverse set of sensing tasks solved by bats are relevant to a wide range of application areas such as sonar, biomedical ultrasound, non-destructive testing, sensors for autonomous systems and wireless communication. Challenges in the design of bioinspired sonar systems are posed by transducer performance, actuation for sensor mobility, design, actuation and integration of beamforming baffle shapes, echo encoding for signal processing, estimation algorithms and their implementations, as well as system integration and feedback control. The discussed examples of experimental systems have capabilities that include localization and tracking using binaural and multiple-band hearing as well as self-generated dynamic cues, classification of small deterministic and large random targets, beamforming with bioinspired baffle shapes, neuromorphic spike processing, artifact rejection in sonar maps and passing range estimation. In future research, bioinspired engineering could capitalize on some of its strengths to serve as a model system for basic automation methodologies for the bioinspired engineering process.

  3. An approach to knowledge engineering to support knowledge-based simulation of payload ground processing at the Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Mcmanus, Shawn; Mcdaniel, Michael

    1989-01-01

    Planning for processing payloads has always been difficult and time-consuming. With the advent of Space Station Freedom and its capability to support a myriad of complex payloads, the planning to support this ground processing maze involves thousands of man-hours of often tedious data manipulation. To provide the capability to analyze various processing schedules, an object-oriented knowledge-based simulation environment called the Advanced Generic Accommodations Planning Environment (AGAPE) is being developed. Having nearly completed the baseline system, the emphasis in this paper is directed toward rule definition and its relation to model development and simulation. The focus is specifically on the methodologies implemented during knowledge acquisition, analysis, and representation within the AGAPE rule structure. A model is provided to illustrate the concepts presented. The approach demonstrates a framework for AGAPE rule development to assist expert system development.

  4. Innovative vitrification for soil remediation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jetta, N.W.; Patten, J.S.; Hart, J.G.

    1995-12-01

    The objective of this DOE demonstration program is to validate the performance and operation of the Vortec Cyclone Melting System (CMS(TM)) for the processing of LLW contaminated soils found at DOE sites. This DOE vitrification demonstration project has successfully progressed through the first two phases. Phase 1 consisted of pilot scale testing with surrogate wastes and the conceptual design of a process plant operating at a generic DOE site. The objective of Phase 2, which is scheduled to be completed by the end of FY 95, is to develop a definitive process plant design for the treatment of wastes at a specific DOE facility. During Phase 2, a site specific design was developed for the processing of LLW soils and muds containing TSCA organics and RCRA metal contaminants. Phase 3 will consist of a full scale demonstration at the DOE gaseous diffusion plant located in Paducah, KY. Several DOE sites were evaluated for potential application of the technology. Paducah was selected for the demonstration program because of its urgent waste remediation needs as well as its strong management and cost sharing financial support for the project. During Phase 2, the basic vitrification process design was modified to meet the specific needs of the new waste streams available at Paducah. The system design developed for Paducah has significantly enhanced the processing capabilities of the Vortec vitrification process. The overall system design now includes the capability to shred entire drums and drum packs containing mud, concrete, plastics and PCBs as well as bulk waste materials. This enhanced processing capability will substantially expand the total DOE waste remediation applications of the technology.

  5. IV&V Project Assessment Process Validation

    NASA Technical Reports Server (NTRS)

    Driskell, Stephen

    2012-01-01

    The Space Launch System (SLS) will launch NASA's Multi-Purpose Crew Vehicle (MPCV). This launch vehicle will provide American launch capability for human exploration and travel beyond Earth orbit. SLS is designed to be flexible for crew or cargo missions. The first test flight is scheduled for December 2017. The SLS SRR/SDR provided insight into the project development life cycle. NASA IV&V ran the standard Risk Based Assessment and Portfolio Based Risk Assessment to identify analysis tasking for the SLS program. This presentation examines the SLS System Requirements Review/System Definition Review (SRR/SDR) and correlates IV&V findings with the selected IV&V tasking and capabilities in order to validate the IV&V process. It also provides a reusable IEEE 1012 scorecard for programmatic completeness across the software development life cycle.

  6. Multi-kilowatt modularized spacecraft power processing system development

    NASA Technical Reports Server (NTRS)

    Andrews, R. E.; Hayden, J. H.; Hedges, R. T.; Rehmann, D. W.

    1975-01-01

    A review of existing information pertaining to spacecraft power processing systems and equipment was accomplished with a view towards applicability to the modularization of multi-kilowatt power processors. Power requirements for future spacecraft were determined from the NASA mission model-shuttle systems payload data study which provided the limits for modular power equipment capabilities. Three power processing systems were compared to evaluation criteria to select the system best suited for modularity. The shunt regulated direct energy transfer system was selected by this analysis for a conceptual design effort which produced equipment specifications, schematics, envelope drawings, and power module configurations.

  7. Crew systems: integrating human and technical subsystems for the exploration of space.

    PubMed

    Connors, M M; Harrison, A A; Summit, J

    1994-07-01

    Space exploration missions will require combining human and technical subsystems into overall "crew systems" capable of performing under the rigorous conditions of outer space. This report describes substantive and conceptual relationships among humans, intelligent machines, and communication systems, and explores how these components may be combined to complement and strengthen one another. We identify key research issues in the combination of humans and technology and examine the role of individual differences, group processes, and environmental conditions. We conclude that a crew system is, in effect, a social cyborg, a living system consisting of multiple individuals whose capabilities are extended by advanced technology.

  8. Crew systems: integrating human and technical subsystems for the exploration of space

    NASA Technical Reports Server (NTRS)

    Connors, M. M.; Harrison, A. A.; Summit, J.

    1994-01-01

    Space exploration missions will require combining human and technical subsystems into overall "crew systems" capable of performing under the rigorous conditions of outer space. This report describes substantive and conceptual relationships among humans, intelligent machines, and communication systems, and explores how these components may be combined to complement and strengthen one another. We identify key research issues in the combination of humans and technology and examine the role of individual differences, group processes, and environmental conditions. We conclude that a crew system is, in effect, a social cyborg, a living system consisting of multiple individuals whose capabilities are extended by advanced technology.

  9. An optoelectronic system for fringe pattern analysis

    NASA Astrophysics Data System (ADS)

    Sciammarella, C. A.; Ahmadshahi, M.

    A system capable of retrieving and processing information recorded in fringe patterns is reported. The principal components are described as well as the architecture in which they are assembled. An example of application is given.

  10. ISHM Implementation for Constellation Systems

    NASA Technical Reports Server (NTRS)

    Figueroa, Fernando; Holland, Randy; Schmalzel, John; Duncavage, Dan; Crocker, Alan; Alena, Rick

    2006-01-01

    Integrated System Health Management (ISHM) is a capability that focuses on determining the condition (health) of every element in a complex system (detecting anomalies, diagnosing causes, and predicting future anomalies) and on providing data, information, and knowledge (DIaK), not just data, to control systems for safe and effective operation. This capability is currently provided by large teams of people, primarily on the ground, but needs to be embedded in on-board systems to a higher degree to enable NASA's new Exploration Mission (long term travel and stay in space), while increasing safety and decreasing life cycle costs of systems (vehicles; platforms; bases or outposts; and ground test, launch, and processing operations). This viewgraph presentation reviews the use of ISHM for the Constellation system.

  11. Advanced Distributed Measurements and Data Processing at the Vibro-Acoustic Test Facility, GRC Space Power Facility, Sandusky, Ohio - an Architecture and an Example

    NASA Technical Reports Server (NTRS)

    Hill, Gerald M.; Evans, Richard K.

    2009-01-01

    A large-scale, distributed, high-speed data acquisition system (HSDAS) is currently being installed at the Space Power Facility (SPF) at NASA Glenn Research Center's Plum Brook Station in Sandusky, OH. This installation is being done as part of a facility construction project to add Vibro-acoustic Test Capabilities (VTC) to the current thermal-vacuum testing capability of SPF in support of the Orion Project's requirement for Space Environments Testing (SET). The HSDAS architecture is a modular design, which utilizes fully remotely managed components, enabling the system to support multiple test locations with a wide range of measurement types and a very large system channel count. The architecture of the system is presented along with details on system scalability and measurement verification. In addition, the ability of the system to automate many of its processes, such as measurement verification and measurement system analysis, is also discussed.

  12. Using fuzzy logic to integrate neural networks and knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Yen, John

    1991-01-01

    Outlined here is a novel hybrid architecture that uses fuzzy logic to integrate neural networks and knowledge-based systems. The author's approach offers important synergistic benefits to neural nets, approximate reasoning, and symbolic processing. Fuzzy inference rules extend symbolic systems with approximate reasoning capabilities, which are used for integrating and interpreting the outputs of neural networks. The symbolic system captures meta-level information about neural networks and defines its interaction with neural networks through a set of control tasks. Fuzzy action rules provide a robust mechanism for recognizing the situations in which neural networks require certain control actions. The neural nets, on the other hand, offer flexible classification and adaptive learning capabilities, which are crucial for dynamic and noisy environments. By combining neural nets and symbolic systems at their system levels through the use of fuzzy logic, the author's approach alleviates current difficulties in reconciling differences between low-level data processing mechanisms of neural nets and artificial intelligence systems.
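
    One way to picture the role of the fuzzy layer described above is as a small set of membership functions that translate a neural network's confidence score into graded symbolic control actions. The rule names, membership shapes, and breakpoints below are invented for illustration and are not taken from the author's architecture.

      def tri(x, a, b, c):
          """Triangular membership function with support [a, c] and peak at b."""
          if x <= a or x >= c:
              return 0.0
          return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

      def interpret_network_output(confidence):
          """Fuzzy rules mapping a neural-net confidence score onto symbolic actions
          that a meta-level symbolic controller could act on."""
          return {
              "accept_classification": tri(confidence, 0.6, 1.0, 1.4),   # high confidence
              "request_resample":      tri(confidence, 0.3, 0.5, 0.7),   # borderline
              "fall_back_to_rules":    tri(confidence, -0.4, 0.0, 0.4),  # low confidence
          }

      print(interpret_network_output(0.55))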

  13. Containerless processing of undercooled melts

    NASA Technical Reports Server (NTRS)

    Perepezko, J. H.

    1993-01-01

    The investigation focused on the control of microstructural evolution in Mn-Al, Fe-Ni, Ni-V, and Au-Pb-Sb alloys through the high undercooling levels provided by containerless processing, and provided fundamental new information on the control of nucleation. Solidification analysis was conducted by means of thermal analysis, x-ray diffraction, and metallographic characterization on samples processed in a laboratory scale drop tube system. The Mn-Al alloy system offers a useful model system with the capability of phase separation on an individual particle basis, thus permitting a more complete understanding of the operative kinetics and the key containerless processing variables. This system provided the opportunity of analyzing the nucleation rate as a function of processing conditions and allowed for the quantitative assessment of the relevant processing parameters. These factors are essential in the development of a containerless processing model which has a predictive capability. Similarly, Ni-V is a model system that was used to study duplex partitionless solidification, a structure possible only in high-undercooling solidification processes. Nucleation kinetics for the competing bcc and fcc phases were studied to determine how this structure can develop and the conditions under which it may occur. The Fe-Ni alloy system was studied to identify microstructural transitions with controlled variations in sample size and composition during containerless solidification. This work was directed toward developing a microstructure map which delineates regimes of structural evolution and provides a unified analysis of experimental observations. The Au-Pb-Sb system was investigated to characterize the thermodynamic properties of the undercooled liquid phase and to characterize the glass transition under a variety of processing conditions. By analyzing key containerless processing parameters in a ground based drop tube study, a carefully designed flight experiment may be planned to utilize the extended duration microgravity conditions of orbiting spacecraft.

  14. Integrated system for single leg walking

    NASA Astrophysics Data System (ADS)

    Simmons, Reid; Krotkov, Eric; Roston, Gerry

    1990-07-01

    The Carnegie Mellon University Planetary Rover project is developing a six-legged walking robot capable of autonomously navigating, exploring, and acquiring samples in rugged, unknown environments. This report describes an integrated software system capable of navigating a single leg of the robot over rugged terrain. The leg, based on an early design of the Ambler Planetary Rover, is suspended below a carriage that slides along rails. To walk, the system creates an elevation map of the terrain from laser scanner images, plans an appropriate foothold based on terrain and geometric constraints, weaves the leg through the terrain to position it above the foothold, contacts the terrain with the foot, and applies force enough to advance the carriage along the rails. Walking both forward and backward, the system has traversed hundreds of meters of rugged terrain including obstacles too tall to step over, trenches too deep to step in, closely spaced obstacles, and sand hills. The implemented system consists of a number of task-specific processes (two for planning, two for perception, one for real-time control) and a central control process that directs the flow of communication between processes.
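
    The foothold-planning step in the walking pipeline above can be illustrated with a small grid search over a laser-derived elevation map. The slope-based cost, the reachability mask, and the parameter values in this sketch are assumptions for illustration, not the Ambler planner itself.

      import numpy as np

      def select_foothold(elevation, reach_mask, max_slope=0.3, cell=0.1):
          """Pick a foothold cell from an elevation map (heights in metres, NaN = unknown).

          reach_mask marks cells the leg can geometrically reach; flatness is
          scored by the local gradient magnitude, and the flattest admissible
          cell wins.
          """
          filled = np.nan_to_num(elevation, nan=np.nanmean(elevation))
          gy, gx = np.gradient(filled, cell)
          slope = np.hypot(gx, gy)
          valid = reach_mask & ~np.isnan(elevation) & (slope <= max_slope)
          if not valid.any():
              return None              # no admissible foothold: replan the carriage move
          cost = np.where(valid, slope, np.inf)
          return np.unravel_index(np.argmin(cost), cost.shape)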

  15. Advanced Environmental Barrier Coating Development for SiC-SiC Ceramic Matrix Composite Components

    NASA Technical Reports Server (NTRS)

    Zhu, Dongming; Harder, Bryan; Hurst, Janet B.; Halbig, Michael Charles; Puleo, Bernadette J.; Costa, Gustavo; Mccue, Terry R.

    2017-01-01

    This presentation reviews the NASA advanced environmental barrier coating (EBC) system development for SiC-SiC Ceramic Matrix Composite (CMC) combustors, particularly under the NASA Environmentally Responsible Aviation, Fundamental Aeronautics and Transformative Aeronautics Concepts Programs. The emphasis has been placed on the current design challenges of 2700-3000 F capable environmental barrier coatings for low NOx emission combustors for next generation turbine engines using advanced plasma spray based processes, and on the coating processing and integration with SiC-SiC CMCs and component systems. The development has also included candidate coating composition system designs, degradation mechanisms, performance evaluation and down-selects; processing optimization using TriplexPro air plasma spray, low pressure plasma spray (LPPS) and plasma spray physical vapor deposition; and demonstration of EBC-CMC systems. This presentation also highlights the EBC-CMC system temperature capability and durability improvements under the NASA development programs, as demonstrated in simulated engine high heat flux combustion environments, in conjunction with high heat flux, mechanical creep and fatigue loading testing conditions.

  16. Heterogeneous delivering capability promotes traffic efficiency in complex networks

    NASA Astrophysics Data System (ADS)

    Zhu, Yan-Bo; Guan, Xiang-Min; Zhang, Xue-Jun

    2015-12-01

    Traffic is one of the most fundamental dynamical processes in networked systems. With a homogeneous delivery capability of nodes, the global dynamic routing strategy proposed by Ling et al. [Phys. Rev. E 81, 016113 (2010)] adequately uses the dynamic information during the process and thus reaches quite a high network capacity. In this paper, building on the global dynamic routing strategy, we propose a heterogeneous delivery-capability allocation strategy for nodes on scale-free networks that takes node degree into account. It is found that the network capacity, as well as some other indexes reflecting transportation efficiency, is further improved. Our work may be useful for the design of more efficient routing strategies in communication or transportation systems.
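
    A degree-weighted allocation of the kind described above can be written in a few lines; the power-law form below (capacity proportional to degree**alpha, with alpha = 0 recovering the homogeneous case) is an illustrative assumption and not necessarily the paper's exact rule.

      def delivery_capacities(degrees, total_capacity, alpha=1.0):
          """Split a fixed total delivery capacity across nodes according to degree."""
          weights = [k ** alpha for k in degrees]
          norm = sum(weights)
          return [total_capacity * w / norm for w in weights]

      # Example: a small scale-free-like degree sequence sharing 31 packets per step
      print(delivery_capacities([1, 2, 2, 3, 8, 15], total_capacity=31, alpha=1.0))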

  17. Synthetic Analog and Digital Circuits for Cellular Computation and Memory

    PubMed Central

    Purcell, Oliver; Lu, Timothy K.

    2014-01-01

    Biological computation is a major area of focus in synthetic biology because it has the potential to enable a wide range of applications. Synthetic biologists have applied engineering concepts to biological systems in order to construct progressively more complex gene circuits capable of processing information in living cells. Here, we review the current state of computational genetic circuits and describe artificial gene circuits that perform digital and analog computation. We then discuss recent progress in designing gene circuits that exhibit memory, and how memory and computation have been integrated to yield more complex systems that can both process and record information. Finally, we suggest new directions for engineering biological circuits capable of computation. PMID:24794536

  18. Apparatus and method for converting biomass to feedstock for biofuel and biochemical manufacturing processes

    DOEpatents

    Kania, John; Qiao, Ming; Woods, Elizabeth M.; Cortright, Randy D.; Myren, Paul

    2015-12-15

    The present invention includes improved systems and methods for producing biomass-derived feedstocks for biofuel and biochemical manufacturing processes. The systems and methods use components that are capable of transferring relatively high concentrations of solid biomass utilizing pressure variations between vessels, and allows for the recovery and recycling of heterogeneous catalyst materials.

  19. Apple Image Processing Educator

    NASA Technical Reports Server (NTRS)

    Gunther, F. J.

    1981-01-01

    A software system design is proposed and demonstrated with pilot-project software. The system permits the Apple II microcomputer to be used for personalized computer-assisted instruction in the digital image processing of LANDSAT images. The programs provide data input, menu selection, graphic and hard-copy displays, and both general and detailed instructions. The pilot-project results are considered to be successful indicators of the capabilities and limits of microcomputers for digital image processing education.

  20. Application and Validation of Workload Assessment Techniques

    DTIC Science & Technology

    1993-03-01

    This technical report documents the process and outcome of meeting this objective. Procedure: A series of eight separate studies was conducted using three...development process. The task analysis and simulation technique was shown to have the capability to track empirical workload ratings. More research is...operator workload during the systems acquisition process, and (b) a pamphlet for the managers of Army systems that describes the need and some procedures

  1. Global positioning system : challenges in sustaining and upgrading capabilities persist.

    DOT National Transportation Integrated Search

    2010-09-01

    The Global Positioning System (GPS) provides positioning, navigation, and timing (PNT) data to users worldwide. The U.S. Air Force, which is responsible for GPS acquisition, is in the process of modernizing the system. Last year GAO reported that it ...

  2. Major system acquisitions process (A-109)

    NASA Technical Reports Server (NTRS)

    Saric, C.

    1991-01-01

    The Major System examined is a combination of elements (hardware, software, facilities, and services) that function together to produce the capabilities required to fulfill a mission need. The system acquisition process is a sequence of activities beginning with documentation of a mission need and ending with introduction of the major system into operational use or otherwise successful achievement of program objectives. It is concluded that the A-109 process makes sense and provides a systematic, integrated management approach, along with appropriate management-level involvement and innovative 'best ideas' from the private sector, in satisfying mission needs.

  3. Photonics and bioinspiration

    NASA Astrophysics Data System (ADS)

    Lewis, Keith

    2014-10-01

    Biological systems exploiting light have benefitted from thousands of years of genetic evolution and can provide insight to support the development of new approaches for imaging, image processing and communication. For example, biological vision systems can provide significant diversity, yet are able to function with only a minimal degree of neural processing. Examples will be described underlying the processes used to support the development of new concepts for photonic systems, ranging from uncooled bolometers and tunable filters, to asymmetric free-space optical communication systems and new forms of camera capable of simultaneously providing spectral and polarimetric diversity.

  4. Architecture for Survivable Systems Processing (ASSP). Technology benefits for Open System Interconnects

    NASA Technical Reports Server (NTRS)

    Wood, Richard J.

    1992-01-01

    The Architecture for Survivable Systems Processing (ASSP) program is a two phase program whose objective is the derivation, specification, development and validation of an open system architecture capable of supporting advanced processing needs of space, ground, and launch vehicle operations. The output of the first phase is a set of hardware and software standards and specifications defining this architecture at three levels. The second phase will validate these standards and develop the technology necessary to achieve strategic hardness, packaging density, throughput requirements, and interoperability/interchangeability.

  5. Graphical workstation capability for reliability modeling

    NASA Technical Reports Server (NTRS)

    Bavuso, Salvatore J.; Koppen, Sandra V.; Haley, Pamela J.

    1992-01-01

    In addition to computational capabilities, software tools for estimating the reliability of fault-tolerant digital computer systems must also provide a means of interfacing with the user. Described here is the new graphical interface capability of the hybrid automated reliability predictor (HARP), a software package that implements advanced reliability modeling techniques. The graphics oriented (GO) module provides the user with a graphical language for modeling system failure modes through the selection of various fault-tree gates, including sequence-dependency gates, or by a Markov chain. By using this graphical input language, a fault tree becomes a convenient notation for describing a system. In accounting for any sequence dependencies, HARP converts the fault-tree notation to a complex stochastic process that is reduced to a Markov chain, which it can then solve for system reliability. The graphics capability is available for use on an IBM-compatible PC, a Sun, and a VAX workstation. The GO module is written in the C programming language and uses the Graphical Kernel System (GKS) standard for graphics implementation. The PC, VAX, and Sun versions of the HARP GO module are currently in beta-testing stages.
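
    Once a fault tree has been reduced to a Markov chain, as described above, system reliability follows from the chain's transient state probabilities. The sketch below integrates a tiny continuous-time chain for a hypothetical two-unit parallel system; the states, rates, and the simple Euler integrator are illustrative assumptions, not HARP's solver.

      import numpy as np

      def transient_probabilities(Q, p0, t, steps=10000):
          """Transient state probabilities of a continuous-time Markov chain,
          dp/dt = p Q, integrated with a fixed-step Euler scheme."""
          p = np.asarray(p0, dtype=float)
          dt = t / steps
          for _ in range(steps):
              p = p + dt * (p @ Q)
          return p

      # Hypothetical 2-unit parallel system: states = (both up, one up, system failed)
      lam = 1e-3   # per-unit failure rate (1/hour)
      Q = np.array([[-2 * lam, 2 * lam,  0.0],
                    [     0.0,    -lam,  lam],
                    [     0.0,     0.0,  0.0]])
      p = transient_probabilities(Q, p0=[1.0, 0.0, 0.0], t=1000.0)
      print("reliability at 1000 h:", p[0] + p[1])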

  6. Neural network for processing both spatial and temporal data with time based back-propagation

    NASA Technical Reports Server (NTRS)

    Villarreal, James A. (Inventor); Shelton, Robert O. (Inventor)

    1993-01-01

    Neural networks are computing systems modeled after the paradigm of the biological brain. For years, researchers using various forms of neural networks have attempted to model the brain's information processing and decision-making capabilities. Neural network algorithms have impressively demonstrated the capability of modeling spatial information. On the other hand, the application of parallel distributed models to the processing of temporal data has been severely restricted. The invention introduces a novel technique which adds the dimension of time to the well known back-propagation neural network algorithm. In the space-time neural network disclosed herein, the synaptic weights between two artificial neurons (processing elements) are replaced with an adaptable-adjustable filter. Instead of a single synaptic weight, the invention provides a plurality of weights representing not only association, but also temporal dependencies. In this case, the synaptic weights are the coefficients to the adaptable digital filters. Novelty is believed to lie in the disclosure of a processing element and a network of the processing elements which are capable of processing temporal as well as spatial data.
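
    A minimal sketch of such a processing element follows: each synapse is a short FIR (tapped-delay-line) filter over the recent history of its input rather than a single scalar weight, and the filtered contributions are summed and passed through a squashing function. The buffer layout, tap counts, and sigmoid choice are assumptions made for illustration, not the patented network.

      import numpy as np

      def synapse_output(input_history, taps):
          """Output of one adaptable-filter synapse: an FIR filter over recent inputs."""
          recent = input_history[-len(taps):][::-1]    # most recent sample aligned with taps[0]
          return float(np.dot(taps, recent))

      def space_time_neuron(histories, tap_vectors, bias=0.0):
          """Sum the filtered contributions of all synapses, then apply a sigmoid."""
          net = bias + sum(synapse_output(h, w) for h, w in zip(histories, tap_vectors))
          return 1.0 / (1.0 + np.exp(-net))

      # Two inputs, each feeding the neuron through a 3-tap synaptic filter
      histories = [np.array([0.1, 0.4, 0.9]), np.array([0.0, -0.2, 0.5])]
      taps = [np.array([0.5, 0.3, 0.1]), np.array([0.8, 0.0, -0.4])]
      print(space_time_neuron(histories, taps))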

  7. Development of wireless brain computer interface with embedded multitask scheduling and its application on real-time driver's drowsiness detection and warning.

    PubMed

    Lin, Chin-Teng; Chen, Yu-Chieh; Huang, Teng-Yi; Chiu, Tien-Ting; Ko, Li-Wei; Liang, Sheng-Fu; Hsieh, Hung-Yi; Hsu, Shang-Hwa; Duann, Jeng-Ren

    2008-05-01

    Biomedical signal monitoring systems have been rapidly advanced with electronic and information technologies in recent years. However, most of the existing physiological signal monitoring systems can only record the signals without the capability of automatic analysis. In this paper, we proposed a novel brain-computer interface (BCI) system that can acquire and analyze electroencephalogram (EEG) signals in real-time to monitor human physiological as well as cognitive states, and, in turn, provide warning signals to the users when needed. The BCI system consists of a four-channel biosignal acquisition/amplification module, a wireless transmission module, a dual-core signal processing unit, and a host system for display and storage. The embedded dual-core processing system with multitask scheduling capability was proposed to acquire and process the input EEG signals in real time. In addition, the wireless transmission module, which eliminates the inconvenience of wiring, can be switched between radio frequency (RF) and Bluetooth according to the transmission distance. Finally, the real-time EEG-based drowsiness monitoring and warning algorithms were implemented and integrated into the system to close the loop of the BCI system. The practical online testing demonstrates the feasibility of using the proposed system with the ability of real-time processing, automatic analysis, and online warning feedback in real-world operation and living environments.
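
    As a rough illustration of the real-time analysis step described above, the sketch below flags drowsiness from a single EEG epoch using a slow-wave to fast-wave band-power ratio; the band edges, the ratio, and the threshold are stand-in assumptions, not the trained model reported in the paper.

      import numpy as np

      def band_power(epoch, fs, lo, hi):
          """Power of one EEG epoch within a frequency band, from the FFT periodogram."""
          freqs = np.fft.rfftfreq(len(epoch), d=1.0 / fs)
          psd = np.abs(np.fft.rfft(epoch)) ** 2
          return psd[(freqs >= lo) & (freqs < hi)].sum()

      def drowsiness_flag(epoch, fs=250.0, threshold=2.5):
          """Flag drowsiness when slow-wave (theta + alpha) activity dominates beta."""
          theta = band_power(epoch, fs, 4.0, 8.0)
          alpha = band_power(epoch, fs, 8.0, 13.0)
          beta = band_power(epoch, fs, 13.0, 30.0)
          return (theta + alpha) / max(beta, 1e-12) > threshold

      # Synthetic 2-second epoch dominated by 6 Hz activity (should be flagged)
      t = np.arange(0, 2.0, 1.0 / 250.0)
      epoch = np.sin(2 * np.pi * 6.0 * t) + 0.2 * np.random.randn(len(t))
      print(drowsiness_flag(epoch))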

  8. Leveraging People-Related Maturity Issues for Achieving Higher Maturity and Capability Levels

    NASA Astrophysics Data System (ADS)

    Buglione, Luigi

    During the past 20 years Maturity Models (MM) have become a buzzword in the ICT world. Since Crosby's initial idea in 1979, plenty of models have been created in the Software & Systems Engineering domains, addressing various perspectives. By analyzing the content of the Process Reference Models (PRM) in many of them, it can be noticed that people-related issues have little weight in the appraisals of the capabilities of organizations, while in practice they are considered significant contributors in traditional process and organizational performance appraisals, as stressed in well-known Performance Management models such as MBQA, EFQM and BSC. This paper proposes some ways of leveraging people-related maturity issues by merging HR practices from several types of maturity models into the organizational Business Process Model (BPM) in order to achieve higher organizational maturity and capability levels.

  9. Creation of a full color geologic map by computer: A case history from the Port Moller project resource assessment, Alaska Peninsula: A section in Geologic studies in Alaska by the U.S. Geological Survey, 1988

    USGS Publications Warehouse

    Wilson, Frederic H.

    1989-01-01

    Graphics programs on computers can facilitate the compilation and production of geologic maps, including full color maps of publication quality. This paper describes the application of two different programs, GSMAP and ARC/INFO, to the production of a geologic map of the Port Moller and adjacent 1:250,000-scale quadrangles on the Alaska Peninsula. GSMAP was used at first because of easy digitizing on inexpensive computer hardware. Limitations in its editing capability led to transfer of the digital data to ARC/INFO, a Geographic Information System, which has better editing and also added data analysis capability. Although these improved capabilities are accompanied by increased complexity, the availability of ARC/INFO's data analysis capability provides unanticipated advantages. It allows digital map data to be processed as one of multiple data layers for mineral resource assessment. As a result of development of both software packages, it is now easier to apply both software packages to geologic map production. Both systems accelerate the drafting and revision of maps and enhance the compilation process. Additionally, ARC/INFO's analysis capability enhances the geologist's ability to develop answers to questions of interest that were previously difficult or impossible to obtain.

  10. A Systems Engineering Process Supporting the Development of Operational Requirements Driven Federations

    DTIC Science & Technology

    2008-12-01

    A Systems Engineering Process Supporting the Development of Operational Requirements Driven Federations, Andreas Tolk & Thomas G. Litwin...Executive Office (PEO...capabilities and their relative changes based on the system to be evaluated as well, in particular when it comes to

  11. Requirements for company-wide management

    NASA Technical Reports Server (NTRS)

    Southall, J. W.

    1980-01-01

    Computing system requirements were developed for company-wide management of information and computer programs in an engineering data processing environment. The requirements are essential to the successful implementation of a computer-based engineering data management system; they exceed the capabilities provided by the commercially available data base management systems. These requirements were derived from a study entitled The Design Process, which was prepared by design engineers experienced in development of aerospace products.

  12. HALO: a reconfigurable image enhancement and multisensor fusion system

    NASA Astrophysics Data System (ADS)

    Wu, F.; Hickman, D. L.; Parker, Steve J.

    2014-06-01

    Contemporary high definition (HD) cameras and affordable infrared (IR) imagers are set to dramatically improve the effectiveness of security, surveillance and military vision systems. However, the quality of imagery is often compromised by camera shake, or poor scene visibility due to inadequate illumination or bad atmospheric conditions. A versatile vision processing system called HALO™ is presented that can address these issues, by providing flexible image processing functionality on a low size, weight and power (SWaP) platform. Example processing functions include video distortion correction, stabilisation, multi-sensor fusion and image contrast enhancement (ICE). The system is based around an all-programmable system-on-a-chip (SoC), which combines the computational power of a field-programmable gate array (FPGA) with the flexibility of a CPU. The FPGA accelerates computationally intensive real-time processes, whereas the CPU provides management and decision making functions that can automatically reconfigure the platform based on user input and scene content. These capabilities enable a HALO™ equipped reconnaissance or surveillance system to operate in poor visibility, providing potentially critical operational advantages in visually complex and challenging usage scenarios. The choice of an FPGA based SoC is discussed, and the HALO™ architecture and its implementation are described. The capabilities of image distortion correction, stabilisation, fusion and ICE are illustrated using laboratory and trials data.
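
    As a generic stand-in for the image contrast enhancement (ICE) function mentioned above (the actual HALO algorithm is not described in this record), the following sketch applies plain histogram equalization to an 8-bit frame with NumPy.

      import numpy as np

      def equalize_histogram(frame):
          """Histogram-equalize an 8-bit grayscale frame (H x W uint8 array)."""
          hist = np.bincount(frame.ravel(), minlength=256)
          cdf = np.cumsum(hist).astype(np.float64)
          cdf = (cdf - cdf.min()) / max(cdf.max() - cdf.min(), 1.0)  # normalize to [0, 1]
          lut = np.round(cdf * 255).astype(np.uint8)                 # build a lookup table
          return lut[frame]

      # Example: enhance a synthetic low-contrast frame.
      frame = (np.random.rand(480, 640) * 60 + 100).astype(np.uint8)
      enhanced = equalize_histogram(frame)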

  13. Improving The Prototyping Process In Department Of Defense Acquisition

    DTIC Science & Technology

    2014-06-01

    Only fragments of this record survive. Recoverable acronym definitions: ASD (R&E), Assistant Secretary of Defense for Research and Engineering; BCL, Business Capability Life cycle; CDD, Capability Development Document. The recoverable body text notes that TRL 6 cannot be attained until the technology has been demonstrated in a relevant operational environment (ASD [R&E] 2011).

  14. NASA JSC water monitor system: City of Houston field demonstration

    NASA Technical Reports Server (NTRS)

    Taylor, R. E.; Jeffers, E. L.; Fricks, D. H.

    1979-01-01

    A water quality monitoring system with on-line and real time operation similar to the function in a spacecraft was investigated. A system with the capability to determine conformance to future high effluent quality standards and to increase the potential for reclamation and reuse of water was designed. Although all system capabilities were not verified in the initial field trial, fully automated operation over a sustained period with only routine manual adjustments was accomplished. Two major points were demonstrated: (1) the water monitor system has great potential in water monitoring and/or process control applications; and (2) the water monitor system represents a vast improvement over conventional (grab sample) water monitoring techniques.

  15. A rapid prototyping/artificial intelligence approach to space station-era information management and access

    NASA Technical Reports Server (NTRS)

    Carnahan, Richard S., Jr.; Corey, Stephen M.; Snow, John B.

    1989-01-01

    Applications of rapid prototyping and Artificial Intelligence techniques to problems associated with Space Station-era information management systems are described. In particular, the work is centered on issues related to: (1) intelligent man-machine interfaces applied to scientific data user support, and (2) the requirement that intelligent information management systems (IIMS) be able to efficiently process metadata updates concerning the types of data handled. The advanced IIMS represents functional capabilities driven almost entirely by the needs of potential users. The volume of scientific data projected to be generated in the Space Station era is likely to be significantly greater than the volume currently processed and analyzed. Information about scientific data must be presented clearly, concisely, and with support features to allow users at all levels of expertise efficient and cost-effective data access. Additionally, mechanisms for allowing more efficient IIMS metadata update processes must be addressed. The work reported covers the following IIMS design aspects: IIMS data and metadata modeling, including the automatic updating of IIMS-contained metadata; IIMS user-system interface considerations, including significant problems associated with remote access, user profiles, and on-line tutorial capabilities; and development of an IIMS query and browse facility, including the capability to deal with spatial information. A working prototype has been developed and is being enhanced.

  16. Packaging data products using data grid middleware for Deep Space Mission Systems

    NASA Technical Reports Server (NTRS)

    Mattmann, Chris A.; Ramirez, Paul M.; Chrichton, Daniel J.; Hughes, J. Steven

    2004-01-01

    Deep Space Mission Systems lack the capability to provide end-to-end tracing of mission data products. These data products are simple products such as telemetry data, processing history, and uplink data.

  17. Data Processing for High School Students

    ERIC Educational Resources Information Center

    Spiegelberg, Emma Jo

    1974-01-01

    Data processing should be taught at the high school level so students may develop a general understanding of and appreciation for the capabilities and the limitations of automated data processing systems. Card machines, wiring, logic, flowcharting, and COBOL programming are to be taught, with behavioral objectives for each section listed. (SC)

  18. An FPGA-based High Speed Parallel Signal Processing System for Adaptive Optics Testbed

    NASA Astrophysics Data System (ADS)

    Kim, H.; Choi, Y.; Yang, Y.

    In this paper a state-of-the-art FPGA (Field Programmable Gate Array) based high speed parallel signal processing system (SPS) for an adaptive optics (AO) testbed with 1 kHz wavefront error (WFE) correction frequency is reported. The AO system consists of a Shack-Hartmann sensor (SHS), a deformable mirror (DM), a tip-tilt sensor (TTS), a tip-tilt mirror (TTM), and an FPGA-based high-performance SPS to correct wavefront aberrations. The SHS is composed of 400 subapertures and the DM of 277 actuators in a Fried geometry, requiring an SPS with high-speed parallel computing capability. In this study, the target WFE correction speed is 1 kHz; therefore, massive parallel computing capability is required, along with strict hard real-time constraints on sensor measurements, matrix computation latency for the correction algorithms, and output of control signals to the actuators. To meet these requirements, an FPGA-based real-time SPS with parallel computing capabilities is proposed. In particular, the SPS is made up of a National Instruments (NI) real-time computer and five FPGA boards based on the state-of-the-art Xilinx Kintex-7 FPGA. Programming is done with NI's LabView environment, providing flexibility when applying different algorithms for WFE correction. It also provides a faster programming and debugging environment compared with conventional ones. One of the five FPGAs is assigned to measure the TTS and calculate control signals for the TTM, while the remaining four are used to receive the SHS signal, calculate slopes for each subaperture, and compute the correction signal for the DM. With these parallel processing capabilities of the SPS, the overall closed-loop WFE correction speed of 1 kHz has been achieved. System requirements, architecture and implementation issues are described; furthermore, experimental results are also given.
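
    A highly simplified sketch of the per-frame computation the abstract describes: spot displacements from the Shack-Hartmann sensor are converted to slopes, a precomputed reconstruction matrix maps them to actuator commands, and a simple integrator updates the DM. The matrix contents, gain, and control law are illustrative assumptions; only the 400-subaperture and 277-actuator counts come from the abstract.

      import numpy as np

      N_SUB, N_ACT = 400, 277          # subapertures and DM actuators, per the abstract

      rng = np.random.default_rng(0)
      # Reconstructor mapping 2*N_SUB slopes to N_ACT actuator commands
      # (in practice obtained from a calibrated interaction matrix; random here).
      R = rng.standard_normal((N_ACT, 2 * N_SUB)) * 1e-3

      def wfe_correction_step(spot_xy, ref_xy, dm_cmd, gain=0.5):
          """One 1 kHz loop iteration: slopes -> wavefront reconstruction -> integrator."""
          slopes = (spot_xy - ref_xy).ravel()      # x/y displacement of each spot
          residual = R @ slopes                    # matrix-vector reconstruction
          return dm_cmd - gain * residual          # simple integrator control law

      dm_cmd = np.zeros(N_ACT)
      spots = rng.standard_normal((N_SUB, 2))
      refs = np.zeros((N_SUB, 2))
      dm_cmd = wfe_correction_step(spots, refs, dm_cmd)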

  19. The X-33 range Operations Control Center

    NASA Technical Reports Server (NTRS)

    Shy, Karla S.; Norman, Cynthia L.

    1998-01-01

    This paper describes the capabilities and features of the X-33 Range Operations Center at NASA Dryden Flight Research Center. All the unprocessed data will be collected and transmitted over fiber optic lines to the Lockheed Operations Control Center for real-time flight monitoring of the X-33 vehicle. By using the existing capabilities of the Western Aeronautical Test Range, the Range Operations Center will provide the ability to monitor all down-range tracking sites for the Extended Test Range systems. In addition to radar tracking and aircraft telemetry data, the Telemetry and Radar Acquisition and Processing System is being enhanced to acquire vehicle command data, differential Global Positioning System corrections and telemetry receiver signal level status. The Telemetry and Radar Acquisition Processing System provides the flexibility to satisfy all X-33 data processing requirements quickly and efficiently. Additionally, the Telemetry and Radar Acquisition Processing System will run a real-time link margin analysis program. The results of this model will be compared in real-time with actual flight data. The hardware and software concepts presented in this paper describe a method of merging all types of data into a common database for real-time display in the Range Operations Center in support of the X-33 program. All types of data will be processed for real-time analysis and display of the range system status to ensure public safety.

  20. Robust flight design for an advanced launch system vehicle

    NASA Astrophysics Data System (ADS)

    Dhand, Sanjeev K.; Wong, Kelvin K.

    Current launch vehicle trajectory design philosophies are generally based on maximizing payload capability. This approach results in an expensive trajectory design process for each mission. Two concepts of robust flight design have been developed to significantly reduce this cost: Standardized Trajectories and Command Multiplier Steering (CMS). These concepts were analyzed for an Advanced Launch System (ALS) vehicle, although their applicability is not restricted to any particular vehicle. Preliminary analysis has demonstrated the feasibility of these concepts at minimal loss in payload capability.

  1. Design requirements for SRB production control system. Volume 3: Package evaluation, modification and hardware

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The software package evaluation was designed to analyze commercially available, field-proven, production control or manufacturing resource planning management technology and software package. The analysis was conducted by comparing SRB production control software requirements and conceptual system design to software package capabilities. The methodology of evaluation and the findings at each stage of evaluation are described. Topics covered include: vendor listing; request for information (RFI) document; RFI response rate and quality; RFI evaluation process; and capabilities versus requirements.

  2. Stochastic Feedforward Control Technique

    NASA Technical Reports Server (NTRS)

    Halyo, Nesim

    1990-01-01

    Class of commanded trajectories modeled as stochastic process. Advanced Transport Operating Systems (ATOPS) research and development program conducted by NASA Langley Research Center aimed at developing capabilities for increases in capacities of airports, safe and accurate flight in adverse weather conditions including shear, winds, avoidance of wake vortexes, and reduced consumption of fuel. Advances in techniques for design of modern controls and increased capabilities of digital flight computers coupled with accurate guidance information from Microwave Landing System (MLS). Stochastic feedforward control technique developed within context of ATOPS program.

  3. Achieving Space Shuttle Abort-to-Orbit Using the Five-Segment Booster

    NASA Technical Reports Server (NTRS)

    Craft, Joe; Ess, Robert; Sauvageau, Don

    2003-01-01

    The Five-Segment Booster design concept was evaluated by a team that determined the concept to be feasible and capable of achieving the desired abort-to-orbit capability when used in conjunction with increased Space Shuttle main engine throttle capability. The team (NASA Johnson Space Center, NASA Marshall Space Flight Center, ATK Thiokol Propulsion, United Space Alliance, Lockheed-Martin Space Systems, and Boeing) selected the concept that provided abort-to-orbit capability while: 1) minimizing Shuttle system impacts by maintaining the current interface requirements with the orbiter, external tank, and ground operation systems; 2) minimizing changes to the flight-proven design, materials, and processes of the current four-segment Shuttle booster; 3) maximizing use of existing booster hardware; and 4) taking advantage of demonstrated Shuttle main engine throttle capability. The added capability can also provide Shuttle mission planning flexibility. Additional performance could be used to: enable implementation of more desirable Shuttle safety improvements like crew escape, while maintaining current payload capability; compensate for off nominal performance in no-fail missions; and support missions to high altitudes and inclinations. This concept is a low-cost, low-risk approach to meeting Shuttle safety upgrade objectives. The Five-Segment Booster also has the potential to support future heavy-lift missions.

  4. Unified Simulation and Analysis Framework for Deep Space Navigation Design

    NASA Technical Reports Server (NTRS)

    Anzalone, Evan; Chuang, Jason; Olsen, Carrie

    2013-01-01

    As the technology that enables advanced deep space autonomous navigation continues to develop and the requirements for such capability continue to grow, there is a clear need for a modular, expandable simulation framework. This tool's purpose is to address multiple measurement and information sources in order to capture system capability. This is needed to analyze the capability of competing navigation systems as well as to develop system requirements, in order to determine its effect on the sizing of the integrated vehicle. The development of such a framework is built upon Model-Based Systems Engineering techniques to capture the architecture of the navigation system and possible state measurements and observations to feed into the simulation implementation structure. These models also allow a common environment for the capture of an increasingly complex operational architecture, involving multiple spacecraft, ground stations, and communication networks. In order to address these architectural developments, a framework of agent-based modules is implemented to capture the independent operations of individual spacecraft as well as the network interactions amongst spacecraft. This paper describes the development of this framework, and the modeling processes used to capture a deep space navigation system. Additionally, a sample implementation describing a concept of network-based navigation utilizing digitally transmitted data packets is described in detail. This developed package shows the capability of the modeling framework, including its modularity, analysis capabilities, and its unification back to the overall system requirements and definition.
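
    A minimal sketch of the agent-based structure described above, with each spacecraft agent producing measurement packets and a network object relaying them to the other agents; every class, field, and method name here is invented for illustration.

      from dataclasses import dataclass, field

      @dataclass
      class MeasurementPacket:
          sender: str
          epoch: float
          range_estimate: float        # e.g., a crosslink range measurement

      @dataclass
      class SpacecraftAgent:
          name: str
          inbox: list = field(default_factory=list)

          def observe(self, epoch):
              # Placeholder observation model; a real agent would produce radiometric
              # or optical measurements from its dynamics and sensor models.
              return MeasurementPacket(self.name, epoch, range_estimate=42.0)

          def receive(self, packet):
              self.inbox.append(packet)

      class Network:
          """Relays packets between agents, mimicking ground stations or crosslinks."""
          def __init__(self, agents):
              self.agents = {a.name: a for a in agents}

          def broadcast(self, packet):
              for name, agent in self.agents.items():
                  if name != packet.sender:
                      agent.receive(packet)

      agents = [SpacecraftAgent("chaser"), SpacecraftAgent("target")]
      net = Network(agents)
      net.broadcast(agents[0].observe(epoch=0.0))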

  5. Design Features and Capabilities of the First Materials Science Research Rack

    NASA Technical Reports Server (NTRS)

    Pettigrew, P. J.; Lehoczky, S. L.; Cobb, S. D.; Holloway, T.; Kitchens, L.

    2003-01-01

    The First Materials Science Research Rack (MSRR-1) aboard the International Space Station (ISS) will offer many unique capabilities and design features to facilitate a wide range of materials science investigations. The initial configuration of MSRR-1 will accommodate two independent Experiment Modules (EMs) and provide the capability for simultaneous on-orbit processing. The facility will provide the common subsystems and interfaces required for the operation of experiment hardware and accommodate telescience capabilities. MSRR-1 will utilize an International Standard Payload Rack (ISPR) equipped with an Active Rack Isolation System (ARIS) for vibration isolation of the facility.

  6. Design of launch systems using continuous improvement process

    NASA Technical Reports Server (NTRS)

    Brown, Richard W.

    1995-01-01

    The purpose of this paper is to identify a systematic process for improving ground operations for future launch systems. This approach is based on the Total Quality Management (TQM) continuous improvement process. While the continuous improvement process is normally identified with making incremental changes to an existing system, it can be used on new systems if they use past experience as a knowledge base. In the case of the Reusable Launch Vehicle (RLV), Space Shuttle operations provide many lessons. The TQM methodology used for this paper will be borrowed from the United States Air Force 'Quality Air Force' Program. There is a general overview of the continuous improvement process, with concentration on the formulation phase. During this phase, critical analyses are conducted to determine the strategy and goals for the remaining development process. These analyses include analyzing the mission from the customer's point of view, developing an operations concept for the future, assessing current capabilities, and determining the gap to be closed between current capabilities and future needs and requirements. A brief analysis of the RLV, relative to the Space Shuttle, will be used to illustrate the concept. Using the continuous improvement design concept has many advantages. These include a customer-oriented process, which will develop a more marketable product, and a better integration of operations and systems during the design phase. However, the use of TQM techniques will require changes, including more discipline in the design process and more emphasis on data gathering for operational systems. The benefits will far outweigh the additional effort.

  7. ISAAC Advanced Composites Research Testbed

    NASA Technical Reports Server (NTRS)

    Wu, K. Chauncey; Stewart, Brian K.; Martin, Robert A.

    2014-01-01

    The NASA Langley Research Center is acquiring a state-of-the-art composites fabrication capability to support the Center's advanced research and technology mission. The system introduced in this paper is named ISAAC (Integrated Structural Assembly of Advanced Composites). The initial operational capability of ISAAC is automated fiber placement, built around a commercial system from Electroimpact, Inc. that consists of a multi-degree-of-freedom robot platform, a tool changer mechanism, and a purpose-built fiber placement end effector. Examples are presented of the advanced materials, structures, structural concepts, fabrication processes and technology development that may be enabled using the ISAAC system. The fiber placement end effector may be used directly or with appropriate modifications for these studies, or other end effectors with different capabilities may either be bought or developed with NASA's partners in industry and academia.

  8. Assessment of Space Nuclear Thermal Propulsion Facility and Capability Needs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    James Werner

    The development of a Nuclear Thermal Propulsion (NTP) system rests heavily upon being able to fabricate and demonstrate the performance of a high temperature nuclear fuel as well as demonstrating an integrated system prior to launch. A number of studies have been performed in the past which identified the facilities needed and the capabilities available to meet the needs and requirements identified at that time. Since that time, many facilities and capabilities within the Department of Energy have been removed or decommissioned. This paper provides a brief overview of the anticipated facility needs and identifies some promising concepts to be considered which could support the development of a nuclear thermal propulsion system. Detailed trade studies will need to be performed to support the decision making process.

  9. Real-Time Implementation of Intelligent Actuator Control with a Transducer Health Monitoring Capability

    NASA Technical Reports Server (NTRS)

    Jethwa, Dipan; Selmic, Rastko R.; Figueroa, Fernando

    2008-01-01

    This paper presents a concept of feedback control for smart actuators that are compatible with smart sensors, communication protocols, and a hierarchical Integrated System Health Management (ISHM) architecture developed by NASA's Stennis Space Center. Smart sensors and actuators typically provide functionalities such as automatic configuration, system condition awareness and self-diagnosis. Spacecraft and rocket test facilities are in the early stages of adopting these concepts. The paper presents a concept combining the IEEE 1451-based ISHM architecture with a transducer health monitoring capability to enhance the control process. A control system testbed for intelligent actuator control, with on-board ISHM capabilities, has been developed and implemented. Overviews of the IEEE 1451 standard, the smart actuator architecture, and control based on this architecture are presented.

  10. Graphical Visualization of Human Exploration Capabilities

    NASA Technical Reports Server (NTRS)

    Rodgers, Erica M.; Williams-Byrd, Julie; Arney, Dale C.; Simon, Matthew A.; Williams, Phillip A.; Barsoum, Christopher; Cowan, Tyler; Larman, Kevin T.; Hay, Jason; Burg, Alex

    2016-01-01

    NASA's pioneering space strategy will require advanced capabilities to expand the boundaries of human exploration on the Journey to Mars (J2M). The Evolvable Mars Campaign (EMC) architecture serves as a framework to identify critical capabilities that need to be developed and tested in order to enable a range of human exploration destinations and missions. Agency-wide System Maturation Teams (SMT) are responsible for the maturation of these critical exploration capabilities and help formulate, guide and resolve performance gaps associated with the EMC-identified capabilities. Systems Capability Organization Reporting Engine boards (SCOREboards) were developed to integrate the SMT data sets into cohesive human exploration capability stories that can be used to promote dialog and communicate NASA's exploration investments. Each SCOREboard provides a graphical visualization of SMT capability development needs that enable exploration missions, and presents a comprehensive overview of data that outlines a roadmap of system maturation needs critical for the J2M. SCOREboards are generated by a computer program that extracts data from a main repository, sorts the data based on a tiered data reduction structure, and then plots the data according to specified user inputs. The ability to sort and plot varying data categories provides the flexibility to present specific SCOREboard capability roadmaps based on customer requests. This paper presents the development of the SCOREboard computer program and shows multiple complementary, yet different datasets through a unified format designed to facilitate comparison between datasets. Example SCOREboard capability roadmaps are presented followed by a discussion of how the roadmaps are used to: 1) communicate capability developments and readiness of systems for future missions, and 2) influence the definition of NASA's human exploration investment portfolio through capability-driven processes. The paper concludes with a description of planned future work to modify the computer program to include additional data and of alternate capability roadmap formats currently under consideration.
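
    The abstract describes a program that extracts capability records from a repository, reduces them by tier, and plots roadmaps on request. A toy version of that pipeline, with invented field names and a plain horizontal bar chart standing in for the actual SCOREboard layout, might look like the following.

      import matplotlib.pyplot as plt

      # Stand-in for the main repository: each record carries a capability name,
      # a tier used for data reduction, and a readiness value to plot.
      repository = [
          {"capability": "Cryo fluid management", "tier": 1, "readiness": 4},
          {"capability": "EDL systems",           "tier": 1, "readiness": 3},
          {"capability": "Surface power",         "tier": 2, "readiness": 5},
      ]

      def build_scoreboard(records, tier):
          """Filter to one tier, sort by readiness, and plot a simple roadmap chart."""
          rows = sorted((r for r in records if r["tier"] == tier),
                        key=lambda r: r["readiness"])
          names = [r["capability"] for r in rows]
          values = [r["readiness"] for r in rows]
          plt.barh(names, values)
          plt.xlabel("Readiness (notional)")
          plt.title(f"Tier {tier} capability roadmap (illustrative)")
          plt.tight_layout()
          plt.savefig(f"scoreboard_tier{tier}.png")

      build_scoreboard(repository, tier=1)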

  11. Sensing Super-Position: Human Sensing Beyond the Visual Spectrum

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Schipper, John F.

    2007-01-01

    The coming decade of fast, cheap and miniaturized electronics and sensory devices opens new pathways for the development of sophisticated equipment to overcome limitations of the human senses. This paper addresses the technical feasibility of augmenting human vision through Sensing Super-position by mixing natural human sensing. The current implementation of the device translates visual and other passive or active sensory instruments into sounds, which become relevant when the visual resolution is insufficient for very difficult and particular sensing tasks. A successful Sensing Super-position meets many human and pilot vehicle system requirements. The system can be further developed into a cheap, portable, and low-power device, taking into account the limited capabilities of the human user as well as the typical characteristics of the user's dynamic environment. The system operates in real time, giving the desired information for the particular augmented sensing tasks. The Sensing Super-position device increases perceived image resolution, which is obtained via an auditory representation as well as the visual representation. Auditory mapping is performed to distribute an image in time. The three-dimensional spatial brightness and multi-spectral maps of a sensed image are processed using real-time image processing techniques (e.g. histogram normalization) and transformed into a two-dimensional map of an audio signal as a function of frequency and time. This paper details the approach of developing Sensing Super-position systems as a way to augment the human vision system by exploiting the capabilities of the human hearing system as an additional neural input. The human hearing system is capable of learning to process and interpret extremely complicated and rapidly changing auditory patterns. The known capabilities of the human hearing system to learn and understand complicated auditory patterns provided the basic motivation for developing an image-to-sound mapping system. The human brain is superior to most existing computer systems in rapidly extracting relevant information from blurred, noisy, and redundant images. From a theoretical viewpoint, this means that the available bandwidth is not exploited in an optimal way. While image-processing techniques can manipulate, condense and focus the information (e.g., Fourier Transforms), keeping the mapping as direct and simple as possible might also reduce the risk of accidentally filtering out important clues. After all, a perfectly non-redundant sound representation in particular is prone to loss of relevant information in the imperfect human hearing system. Also, a complicated non-redundant image-to-sound mapping may well be far more difficult to learn and comprehend than a straightforward mapping, while the mapping system would increase in complexity and cost. This work will demonstrate some basic information processing for optimal information capture for head-mounted systems.
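
    A bare-bones version of the image-to-sound mapping sketched above: image columns are scanned left to right over time, and each row drives a sinusoid whose frequency rises toward the top of the image, weighted by pixel brightness. The frequency range, scan duration, and sample rate are arbitrary choices for illustration.

      import numpy as np

      def image_to_sound(image, fs=16000, duration=1.0, f_lo=200.0, f_hi=4000.0):
          """Map a 2-D brightness image (rows x cols, values in [0, 1]) to audio.
          Columns become time slices; rows become sinusoid frequencies."""
          rows, cols = image.shape
          freqs = np.linspace(f_hi, f_lo, rows)          # top of image -> high pitch
          samples_per_col = int(fs * duration / cols)
          t = np.arange(samples_per_col) / fs
          audio = []
          for c in range(cols):
              column = image[:, c][:, None]              # brightness weights
              tones = np.sin(2 * np.pi * freqs[:, None] * t)
              audio.append((column * tones).sum(axis=0))
          audio = np.concatenate(audio)
          return audio / (np.abs(audio).max() + 1e-12)   # normalize to [-1, 1]

      sound = image_to_sound(np.random.rand(64, 64))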

  12. IRLooK: an advanced mobile infrared signature measurement, data reduction, and analysis system

    NASA Astrophysics Data System (ADS)

    Cukur, Tamer; Altug, Yelda; Uzunoglu, Cihan; Kilic, Kayhan; Emir, Erdem

    2007-04-01

    Infrared signature measurement capability has a key role in the development of electronic warfare (EW) self-protection systems. In this article, the IRLooK System and its capabilities are introduced. IRLooK is a truly innovative mobile infrared signature measurement system, with all of its design, manufacturing and integration accomplished by an engineering philosophy peculiar to ASELSAN. IRLooK measures the infrared signatures of military and civil platforms such as fixed/rotary-wing aircraft, tracked/wheeled vehicles and navy vessels. IRLooK has the capabilities of data acquisition, pre-processing, post-processing, analysis, storing and archiving over the shortwave, mid-wave and long-wave infrared spectrum by means of its high resolution radiometric sensors and highly sophisticated software analysis tools. The sensor suite of the IRLooK System includes imaging and non-imaging radiometers and a spectroradiometer. Single or simultaneous multiple in-band measurements as well as high radiant intensity measurements can be performed. The system provides detailed information on the spectral, spatial and temporal infrared signature characteristics of the targets. It also determines IR decoy characteristics. The system is equipped with a high quality, field-proven, two-axis tracking mount to facilitate target tracking. Manual or automatic tracking is achieved by using a passive imaging tracker. The system also includes a high quality weather station and field-calibration equipment including cavity and extended area blackbodies. The units composing the system are mounted on flat-bed trailers and the complete system is designed to be transportable by large body aircraft.

  13. Management Sciences Division Annual Report (9th)

    DTIC Science & Technology

    1992-01-01

    Only table-of-contents and body fragments of this record survive; they reference Actuarial Process Consolidation and Review, Malfunction Code Reduction, and Sun Work Stations, as well as an Information System (WSMIS). Dyna-METRIC is used for wartime supply support capability assessments, and the Aircraft Sustainability Model (ASM) is also discussed.

  14. Low cost solar array project: Experimental process system development unit for producing semiconductor-grade silicon using silane-to-silicon process

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The design, fabrication, and installation of an experimental process system development unit (EPSDU) were analyzed. Supporting research and development were performed to provide an information data base usable for the EPSDU and for the technological design and economic analysis of a potential scale-up of the process. Iterative economic analyses were conducted of the estimated product cost for the production of semiconductor-grade silicon in a facility capable of producing 1000 MT/yr.

  15. SPS Energy Conversion Power Management Workshop

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Energy technology concerning photovoltaic conversion, solar thermal conversion systems, and electrical power distribution and processing is discussed. The manufacturing processes involving solar cells and solar array production are summarized. Resource issues concerning gallium arsenide and silicon alternatives are reported. Collector structures for solar construction are described, and estimates of their service life, failure rates, and capabilities are presented. Theories of advanced thermal power cycles are summarized. Power distribution system configurations and processing components are presented.

  16. Massively Parallel Signal Processing using the Graphics Processing Unit for Real-Time Brain-Computer Interface Feature Extraction.

    PubMed

    Wilson, J Adam; Williams, Justin C

    2009-01-01

    The clock speeds of modern computer processors have nearly plateaued in the past 5 years. Consequently, neural prosthetic systems that rely on processing large quantities of data in a short period of time face a bottleneck, in that it may not be possible to process all of the data recorded from an electrode array with high channel counts and bandwidth, such as electrocorticographic grids or other implantable systems. Therefore, in this study a method of using the processing capabilities of a graphics card [graphics processing unit (GPU)] was developed for real-time neural signal processing of a brain-computer interface (BCI). The NVIDIA CUDA system was used to offload processing to the GPU, which is capable of running many operations in parallel, potentially greatly increasing the speed of existing algorithms. The BCI system records many channels of data, which are processed and translated into a control signal, such as the movement of a computer cursor. This signal processing chain involves computing a matrix-matrix multiplication (i.e., a spatial filter), followed by calculating the power spectral density on every channel using an auto-regressive method, and finally classifying appropriate features for control. In this study, the first two computationally intensive steps were implemented on the GPU, and the speed was compared to both the current implementation and a central processing unit-based implementation that uses multi-threading. Significant performance gains were obtained with GPU processing: the current implementation processed 250 ms of data from 1000 channels in 933 ms, while the new GPU method took only 27 ms, an improvement of nearly 35 times.
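
    The two GPU-accelerated stages named in the abstract are the spatial filter (a matrix-matrix multiplication) and per-channel autoregressive spectral estimation. The sketch below reproduces that chain on the CPU with NumPy/SciPy using a Yule-Walker AR fit; the filter matrix, model order, and channel counts are placeholders, and the CUDA offload itself is not shown.

      import numpy as np
      from scipy.linalg import solve_toeplitz

      def ar_psd(x, order=16, nfreq=128):
          """Yule-Walker AR fit of one channel, then its power spectral density."""
          x = x - x.mean()
          r = np.correlate(x, x, mode="full")[x.size - 1:] / x.size   # autocorrelation
          a = solve_toeplitz((r[:order], r[:order]), r[1:order + 1])  # AR coefficients
          sigma2 = r[0] - a @ r[1:order + 1]                          # driving-noise variance
          w = np.linspace(0, np.pi, nfreq)
          denom = np.abs(1 - np.exp(-1j * np.outer(w, np.arange(1, order + 1))) @ a) ** 2
          return sigma2 / denom

      # Spatial filter: rows of W combine raw channels into derived channels.
      raw = np.random.randn(16, 250)      # 16 channels x 250 samples (e.g., 250 ms at 1 kHz)
      W = np.eye(16) - 1.0 / 16           # common-average reference as an example filter
      filtered = W @ raw                  # the matrix-matrix multiplication stage
      features = np.array([ar_psd(ch) for ch in filtered])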

  17. A new technology for manufacturing scheduling derived from space system operations

    NASA Technical Reports Server (NTRS)

    Hornstein, R. S.; Willoughby, J. K.

    1993-01-01

    A new technology for producing finite capacity schedules has been developed in response to complex requirements for operating space systems such as the Space Shuttle, the Space Station, and the Deep Space Network for telecommunications. This technology has proven its effectiveness in manufacturing environments where popular scheduling techniques associated with Materials Resources Planning (MRP II) and with factory simulation are not adequate for shop-floor work planning and control. The technology has three components. The first is a set of data structures that accommodate an extremely general description of a factory's resources, its manufacturing activities, and the constraints imposed by the environment. The second component is a language and set of software utilities that enable a rapid synthesis of functional capabilities. The third component is an algorithmic architecture called the Five Ruleset Model which accommodates the unique needs of each factory. Using the new technology, systems can model activities that generate, consume, and/or obligate resources. This allows work-in-process (WIP) to be generated and used; it permits constraints to be imposed on intermediate as well as finished goods inventories. It is also possible to match both the current factory state and future conditions, such as promise dates, as closely as possible. Schedule revisions can be accommodated without impacting the entire production schedule. Applications have been successful in both discrete and process manufacturing environments. The availability of a high-quality finite capacity production planning capability enhances the data management capabilities of MRP II systems. These schedules can be integrated with shop-floor data collection systems and accounting systems. Using the new technology, semi-custom systems can be developed at costs that are comparable to products that do not have equivalent functional capabilities and/or extensibility.
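
    A toy illustration of the kind of data structures the first component alludes to: activities that consume a finite resource while they run, placed at the earliest start time where capacity remains. The Five Ruleset Model itself is not described in this record, so this greedy placeholder only shows the resource-feasibility bookkeeping.

      from dataclasses import dataclass

      @dataclass
      class Activity:
          name: str
          duration: int      # time buckets
          demand: int        # units of the shared resource consumed while running

      def schedule(activities, capacity, horizon=100):
          """Greedy finite-capacity scheduler over discrete time buckets."""
          usage = [0] * horizon                      # resource units committed per bucket
          plan = {}
          for act in activities:
              for start in range(horizon - act.duration + 1):
                  window = range(start, start + act.duration)
                  if all(usage[t] + act.demand <= capacity for t in window):
                      for t in window:
                          usage[t] += act.demand
                      plan[act.name] = start
                      break
          return plan

      jobs = [Activity("weld", 3, 2), Activity("paint", 2, 2), Activity("inspect", 1, 1)]
      print(schedule(jobs, capacity=3))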

  18. Programmable Ultra-Lightweight System Adaptable Radio Satellite Base Station

    NASA Technical Reports Server (NTRS)

    Varnavas, Kosta; Sims, Herb

    2015-01-01

    With the explosion of the CubeSat, small sat, and nanosat markets, the need for a robust, highly capable, yet affordable satellite base station, capable of telemetry capture and relay, is significant. The Programmable Ultra-Lightweight System Adaptable Radio (PULSAR) is NASA Marshall Space Flight Center's (MSFC's) software-defined digital radio, developed with previous Technology Investment Programs and Technology Transfer Office resources. The current PULSAR will have achieved a Technology Readiness Level-6 by the end of FY 2014. The extensibility of the PULSAR will allow it to be adapted to perform the tasks of a mobile base station capable of commanding, receiving, and processing satellite, rover, or planetary probe data streams with an appropriate antenna.

  19. Timing module for MTCA MCH

    NASA Astrophysics Data System (ADS)

    Gumiński, M.; Kasprowicz, G.

    2016-09-01

    White Rabbit (WR) is an extension of the Precise Time Protocol for synchronous Ethernet networks. Networks created with dedicated WR switches enable synchronisation of WR-capable devices with 1 ns precision. MicroTCA (MTCA), on the other hand, is an open standard defining cost-efficient shelves capable of housing AMC modules used for data processing. The presented article gives a further introduction to the WR and MTCA standards. The most important aspects of an MTCA system are described, with a focus on the shelf controller (MCH) and its functionality. The following part describes timing difficulties in MTCA systems and possible solutions. The main section describes an extension module for the MCH, capable of implementing a White Rabbit node and distributing the acquired timing to all modules connected to the MTCA shelf. Conclusions are given at the end of the article.

  20. Develop Advanced Nonlinear Signal Analysis Topographical Mapping System

    NASA Technical Reports Server (NTRS)

    Jong, Jen-Yi

    1997-01-01

    During the development of the SSME, a hierarchy of advanced signal analysis techniques for mechanical signature analysis was developed by NASA and AI Signal Research Inc. (ASRI) to improve the safety and reliability of Space Shuttle operations. These techniques can process and identify intelligent information hidden in a measured signal which is often unidentifiable using conventional signal analysis methods. Currently, due to the highly interactive processing requirements and the volume of dynamic data involved, detailed diagnostic analysis is being performed manually, which requires immense man-hours with extensive human interface. To overcome this manual process, NASA implemented this program to develop an Advanced Nonlinear Signal Analysis Topographical Mapping System (ATMS) to provide automatic/unsupervised engine diagnostic capabilities. The ATMS utilizes a rule-based CLIPS expert system to supervise a hierarchy of diagnostic signature analysis techniques in the Advanced Signal Analysis Library (ASAL). ASAL performs automatic signal processing, archiving, and anomaly detection/identification tasks in order to provide an intelligent and fully automated engine diagnostic capability. The ATMS has been successfully developed under this contract. In summary, the program objectives to design, develop, test and conduct performance evaluation for an automated engine diagnostic system have been successfully achieved. Software implementation of the entire ATMS system on MSFC's OISPS computer has been completed. The significance of the ATMS developed under this program is attributed to the fully automated coherence analysis capability for anomaly detection and identification, which can greatly enhance the power and reliability of engine diagnostic evaluation. The results have demonstrated that ATMS can significantly save time and man-hours in performing engine test/flight data analysis and performance evaluation of large volumes of dynamic test data.
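
    The coherence analysis capability highlighted above can be illustrated with SciPy's standard estimator: compute the magnitude-squared coherence between two engine measurements and flag frequency bands where it exceeds a threshold. The channel roles, sampling rate, and threshold are invented for the example and are not taken from the ATMS.

      import numpy as np
      from scipy.signal import coherence

      fs = 2048.0                      # notional sample rate, Hz
      t = np.arange(0, 4.0, 1.0 / fs)
      shared = np.sin(2 * np.pi * 300.0 * t)                 # common 300 Hz component
      accel_a = shared + 0.5 * np.random.randn(t.size)       # e.g., pump accelerometer
      accel_b = shared + 0.5 * np.random.randn(t.size)       # e.g., casing accelerometer

      f, cxy = coherence(accel_a, accel_b, fs=fs, nperseg=1024)
      suspect = f[cxy > 0.8]           # frequencies where the two channels cohere strongly
      print(suspect[:5])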

  1. First Materials Science Research Facility Rack Capabilities and Design Features

    NASA Technical Reports Server (NTRS)

    Cobb, S.; Higgins, D.; Kitchens, L.; Curreri, Peter (Technical Monitor)

    2002-01-01

    The first Materials Science Research Rack (MSRR-1) is the primary facility for U.S. sponsored materials science research on the International Space Station. MSRR-1 is contained in an International Standard Payload Rack (ISPR) equipped with the Active Rack Isolation System (ARIS) for the best possible microgravity environment. MSRR-1 will accommodate dual Experiment Modules and provide simultaneous on-orbit processing operations capability. The first Experiment Module for the MSRR-1, the Materials Science Laboratory (MSL), is an international cooperative activity between NASA's Marshall Space Flight Center (MSFC) and the European Space Agency's (ESA) European Space Research and Technology Center (ESTEC). The MSL Experiment Module will accommodate several on-orbit exchangeable experiment-specific Module Inserts which provide distinct thermal processing capabilities. Module Inserts currently planned for the MSL are a Quench Module Insert, Low Gradient Furnace, and a Solidification with Quench Furnace. The second Experiment Module for the MSRR-1 configuration is a commercial device supplied by MSFC's Space Products Development (SPD) Group. Transparent furnace assemblies include capabilities for vapor transport processes and annealing of glass fiber preforms. This Experiment Module is replaceable on-orbit. This paper will describe facility capabilities, schedule to flight and research opportunities.

  2. Spectrum Situational Awareness Capability: The Military Need and Potential Implementation Issues

    DTIC Science & Technology

    2006-10-01

    Only briefing-slide fragments of this record survive (© Dstl 2006); they reference sensor-system and EW-system frequency management, allied battlespace spectrum management, the restricted frequency list, the frequency allocation table, civil frequency use, data inputs, and the negotiation and allocation process.

  3. Using Additive Manufacturing to Print a CubeSat Propulsion System

    NASA Technical Reports Server (NTRS)

    Marshall, William M.

    2015-01-01

    CubeSats are increasingly being utilized for missions traditionally ascribed to larger satellites. A CubeSat unit (1U) is defined as 10 cm x 10 cm x 11 cm, and CubeSats have been built up to 6U sizes. CubeSats are typically built up from commercially available off-the-shelf components, but have limited capabilities. By using additive manufacturing, mission-specific capabilities (such as propulsion) can be built into a system. This effort is part of the STMD Small Satellite program Printing the Complete CubeSat. Interest in propulsion concepts for CubeSats is growing rapidly, and numerous concepts exist for CubeSat-scale propulsion. The focus of this effort is how to incorporate propulsion into the structure using additive manufacturing. The end use of the propulsion system dictates which type of system to develop: a pulse-mode RCS would require a different system than a delta-V orbital maneuvering system. The team chose an RCS system based on available propulsion systems and the feasibility of printing using a materials extrusion process. A cold-gas propulsion system for RCS applications was initially investigated, but the materials extrusion process did not permit adequate sealing of the part to make this a functional approach.

  4. Transformational Systems Concepts and Technologies for Our Future in Space

    NASA Technical Reports Server (NTRS)

    Howell, J. T.; George,P.; Mankins, J. C. (Editor); Christensen, C. B.

    2004-01-01

    NASA is constantly searching for new ideas and approaches yielding opportunities for assuring maximum returns on space infrastructure investments. Perhaps the idea of transformational innovation in developing space systems is long overdue. However, the concept of utilizing modular space system designs combined with stepping-stone development processes has merit and promises to return several times the original investment, since each new space system or component is not treated as a unique and/or discrete design and development challenge. New space systems can be planned and designed so that each builds on the technology of previous systems and provides capabilities to support future advanced systems. Subsystems can be designed to use common modular components and achieve economies of scale, production, and operation. Standards, interoperability, and "plug and play" capabilities, when implemented vigorously and consistently, will result in systems that can be upgraded effectively with new technologies. This workshop explored many building-block approaches by way of example across a broad spectrum of technology discipline areas for potentially transforming space systems and inspiring future innovation. Details describing the workshop structure, process, and results are contained in this Conference Publication.

  5. A user need study and system plan for an Arizona Natural Resources Information System report to the Arizona state legislature

    NASA Technical Reports Server (NTRS)

    1980-01-01

    A survey instrument was developed and implemented in order to evaluate the current needs for natural resource information in Arizona and to determine which state agencies have information systems capable of coordinating, accessing and analyzing the data. Data and format requirements were determined for the following categories: air quality, animals, cultural resources, geology, land use, soils, water, vegetation, ownership, and social and economic aspects. Hardware and software capabilities were assessed and a data processing plan was developed. Possible future applications with the next generation LANDSAT were also identified.

  6. NAS: The first year

    NASA Technical Reports Server (NTRS)

    Bailey, F. R.; Kutler, Paul

    1988-01-01

    Discussed are the capabilities of NASA's Numerical Aerodynamic Simulation (NAS) Program and its application as an advanced supercomputing system for computational fluid dynamics (CFD) research. First, the paper describes the NAS computational system, called the NAS Processing System Network, and the advanced computational capabilities it offers as a consequence of carrying out the NAS pathfinder objective. Second, it presents examples of pioneering CFD research accomplished during NAS's first operational year. Examples are included which illustrate CFD applications for predicting fluid phenomena, complementing and supplementing experimentation, and aiding in design. Finally, pacing elements and future directions for CFD and NAS are discussed.

  7. Guidelines of the Design of Electropyrotechnic Firing Circuit for Unmanned Flight and Ground Test Projects

    NASA Technical Reports Server (NTRS)

    Gonzalez, Guillermo A.; Lucy, Melvin H.; Massie, Jeffrey J.

    2013-01-01

    The NASA Langley Research Center, Engineering Directorate, Electronic System Branch, is responsible for providing pyrotechnic support capabilities to Langley Research Center unmanned flight and ground test projects. These capabilities include device selection, procurement, testing, problem solving, firing system design, fabrication and testing; ground support equipment design, fabrication and testing; checkout procedures; and procedures training for pyro technicians. This technical memorandum will serve as a guideline for the design, fabrication and testing of electropyrotechnic firing systems. The guidelines will discuss the entire process, beginning with requirements definition and ending with development and execution.

  8. Combinatorial Optimization by Amoeba-Based Neurocomputer with Chaotic Dynamics

    NASA Astrophysics Data System (ADS)

    Aono, Masashi; Hirata, Yoshito; Hara, Masahiko; Aihara, Kazuyuki

    We demonstrate a computing system based on an amoeba of the true slime mold Physarum, which is capable of producing rich spatiotemporal oscillatory behavior. Our system operates as a neurocomputer because an optical feedback control in accordance with a recurrent neural network algorithm leads the amoeba's photosensitive branches to search for a stable configuration concurrently. We show our system's capability of solving the traveling salesman problem. Furthermore, we apply various types of nonlinear time series analysis to the amoeba's oscillatory behavior in the problem-solving process. The results suggest that an individual amoeba might be characterized as a set of coupled chaotic oscillators.

  9. Compact holographic optical neural network system for real-time pattern recognition

    NASA Astrophysics Data System (ADS)

    Lu, Taiwei; Mintzer, David T.; Kostrzewski, Andrew A.; Lin, Freddie S.

    1996-08-01

    One of the important characteristics of artificial neural networks is their capability for massive interconnection and parallel processing. Recently, specialized electronic neural network processors and VLSI neural chips have been introduced in the commercial market. The number of parallel channels they can handle is limited because of the limited parallel interconnections that can be implemented with 1D electronic wires. High-resolution pattern recognition problems can require a large number of neurons for parallel processing of an image. This paper describes a holographic optical neural network (HONN) that is based on high- resolution volume holographic materials and is capable of performing massive 3D parallel interconnection of tens of thousands of neurons. A HONN with more than 16,000 neurons packaged in an attache case has been developed. Rotation- shift-scale-invariant pattern recognition operations have been demonstrated with this system. System parameters such as the signal-to-noise ratio, dynamic range, and processing speed are discussed.

  10. Conceptual design of the CZMIL data processing system (DPS): algorithms and software for fusing lidar, hyperspectral data, and digital images

    NASA Astrophysics Data System (ADS)

    Park, Joong Yong; Tuell, Grady

    2010-04-01

    The Data Processing System (DPS) of the Coastal Zone Mapping and Imaging Lidar (CZMIL) has been designed to automatically produce a number of novel environmental products through the fusion of Lidar, spectrometer, and camera data in a single software package. These new products significantly transcend use of the system as a bathymeter, and support use of CZMIL as a complete coastal and benthic mapping tool. The DPS provides a spinning globe capability for accessing data files; automated generation of combined topographic and bathymetric point clouds; a fully-integrated manual editor and data analysis tool; automated generation of orthophoto mosaics; automated generation of reflectance data cubes from the imaging spectrometer; a coupled air-ocean spectral optimization model producing images of chlorophyll and CDOM concentrations; and a fusion based capability to produce images and classifications of the shallow water seafloor. Adopting a multitasking approach, we expect to achieve computation of the point clouds, DEMs, and reflectance images at a 1:1 processing to acquisition ratio.

  11. The Experience Factory: Strategy and Practice

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Caldiera, Gianluigi

    1995-01-01

    The quality movement, which in recent years has had a dramatic impact on all industrial sectors, has recently reached the system and software industry. Although some concepts of quality management, originally developed for other product types, can be applied to software, its specificity as a product which is developed and not produced requires a special approach. This paper introduces a quality paradigm specifically tailored to the problems of the systems and software industry. Reuse of products, processes and experiences originating from the system life cycle is seen today as a feasible solution to the problem of developing higher quality systems at a lower cost. In fact, quality improvement is very often achieved by defining and developing an appropriate set of strategic capabilities and core competencies to support them. A strategic capability is, in this context, a corporate goal defined by the business position of the organization and implemented by key business processes. Strategic capabilities are supported by core competencies, which are aggregate technologies tailored to the specific needs of the organization in performing the needed business processes. Core competencies are non-transitional, have a consistent evolution, and are typically fueled by multiple technologies. Their selection and development requires commitment, investment and leadership. The paradigm introduced in this paper for developing core competencies is the Quality Improvement Paradigm, which consists of six steps: (1) characterize the environment, (2) set the goals, (3) choose the process, (4) execute the process, (5) analyze the process data, and (6) package experience. The process must be supported by a goal-oriented approach to measurement and control, and an organizational infrastructure called the Experience Factory. The Experience Factory is a logical and physical organization distinct from the project organizations it supports. Its goal is the development and support of core competencies through capitalization and reuse of life cycle experience and products. The paper introduces the major concepts of the proposed approach, discusses their relationship with other approaches used in the industry, and presents a case in which those concepts have been successfully applied.

  12. Method for producing a selectively permeable separation module

    DOEpatents

    Stone, Mark L.; Orme, Christopher J.; Peterson, Eric S.

    2000-03-14

    A method and apparatus is provided for casting a polymeric membrane on the inside surface of porous tubes to provide a permeate filter system capable of withstanding hostile operating conditions and having excellent selectivity capabilities. Any polymer in solution, by either solvent means or melt processing means, is capable of being used in the present invention to form a thin polymer membrane having uniform thickness on the inside surface of a porous tube. Multiple tubes configured as a tubular module can also be coated with the polymer solution. By positioning the longitudinal axis of the tubes in a substantially horizontal position and rotating the tube about the longitudinal axis, the polymer solution coats the inside surface of the porous tubes without substantially infiltrating the pores of the porous tubes, thereby providing a permeate filter system having enhanced separation capabilities.

  13. Preliminary Human Factors Guidelines for Automated Highway System Designers, Second Edition - Volume 2: User-System Transactions

    DOT National Transportation Integrated Search

    1998-04-01

    Human factors can be defined as "designing to match the capabilities and limitations of the human user." The objectives of this human-centered design process are to maximize the effectiveness and efficiency of system performance, ensure a high level ...

  14. Performance of the Landsat-Data Collection System in a Total System Context

    NASA Technical Reports Server (NTRS)

    Paulson, R. W. (Principal Investigator); Merk, C. F.

    1975-01-01

    The author has identified the following significant results. This experiment was, and continues to be, an integration of the LANDSAT-DCS with the data collection and processing system of the Geological Survey. Although an experimental demonstration, it was a successful integration of a satellite relay system capable of continental data collection with an existing governmental, nationwide, operational data processing and distribution network. The Survey's data processing system uses a large general purpose computer with insufficient redundancy for 24-hour-a-day, 7-day-a-week operation. This is a significant, but soluble, obstacle to converting the experimental integration of the system into an operational integration.

  15. Interactive degraded document enhancement and ground truth generation

    NASA Astrophysics Data System (ADS)

    Bal, G.; Agam, G.; Frieder, O.; Frieder, G.

    2008-01-01

    Degraded documents are frequently obtained in various situations. Examples of degraded document collections include historical document depositories, documents obtained in legal and security investigations, and legal and medical archives. Degraded document images are hard to read and are hard to analyze using computerized techniques. There is hence a need for systems that are capable of enhancing such images. We describe a language-independent semi-automated system for enhancing degraded document images that is capable of exploiting inter- and intra-document coherence. The system is capable of processing document images with high levels of degradation and can be used for ground truthing of degraded document images. Ground truthing of degraded document images is extremely important in several respects: it enables quantitative performance measurements of enhancement systems and facilitates model estimation that can be used to improve performance. Performance evaluation is provided using the historical Frieder diaries collection.1
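
    As a generic stand-in for the enhancement step (the paper's own interactive, coherence-based method is not reproduced here), the sketch below binarizes a degraded grayscale page with a simple local-mean threshold; the window size and offset are arbitrary.

      import numpy as np
      from scipy.ndimage import uniform_filter

      def binarize_local_mean(page, window=31, offset=10):
          """Binarize an 8-bit grayscale page: a pixel is treated as ink when it is
          darker than the local mean over a window, minus a small offset."""
          local_mean = uniform_filter(page.astype(np.float64), size=window)
          return (page < local_mean - offset).astype(np.uint8) * 255

      page = (np.random.rand(200, 200) * 255).astype(np.uint8)
      clean = binarize_local_mean(page)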

  16. Automated simulation as part of a design workstation

    NASA Technical Reports Server (NTRS)

    Cantwell, E.; Shenk, T.; Robinson, P.; Upadhye, R.

    1990-01-01

    A development project for a design workstation for advanced life-support systems incorporating qualitative simulation required the implementation of a useful qualitative simulation capability and the integration of qualitative and quantitative simulations, such that simulation capabilities are maximized without duplication. The reason is that, to produce design solutions to a system goal, the behavior of the system in both steady and perturbed states must be represented. The paper reports on the Qualitative Simulation Tool (QST), on an expert-system-like model building and simulation interface tool called ScratchPad (SP), and on the integration of QST and SP with more conventional, commercially available simulation packages now being applied in the evaluation of life-support system processes and components.

  17. Automated extraction of knowledge for model-based diagnostics

    NASA Technical Reports Server (NTRS)

    Gonzalez, Avelino J.; Myler, Harley R.; Towhidnejad, Massood; Mckenzie, Frederic D.; Kladke, Robin R.

    1990-01-01

    The concept of accessing computer-aided design (CAD) databases and extracting a process model automatically is investigated as a possible source for the generation of knowledge bases for model-based reasoning systems. The resulting system, referred to as automated knowledge generation (AKG), uses an object-oriented programming structure and constraint techniques, as well as an internal database of component descriptions, to generate a frame-based structure that describes the model. The procedure has been designed to be general enough to be easily coupled to CAD systems that feature a database capable of providing label and connectivity data from the drawn system. The AKG system is capable of defining knowledge bases in the formats required by various model-based reasoning tools.

  18. Numerical propulsion system simulation: An interdisciplinary approach

    NASA Technical Reports Server (NTRS)

    Nichols, Lester D.; Chamis, Christos C.

    1991-01-01

    The tremendous progress being made in computational engineering and the rapid growth in computing power that is resulting from parallel processing now make it feasible to consider the use of computer simulations to gain insights into the complex interactions in aerospace propulsion systems and to evaluate new concepts early in the design process before a commitment to hardware is made. Described here is a NASA initiative to develop a Numerical Propulsion System Simulation (NPSS) capability.

  20. Supporting users through integrated retrieval, processing, and distribution systems at the land processes distributed active archive center

    USGS Publications Warehouse

    Kalvelage, T.; Willems, Jennifer

    2003-01-01

    The design of the EOS Data and Information System (EOSDIS) to acquire, archive, manage and distribute Earth observation data to the broadest possible user community was discussed. Several integrated retrieval, processing and distribution capabilities were explained, their value to users was described, and potential future improvements were laid out. Users were interested in having the retrieval, processing and archiving systems integrated so that they can get the data they want in the format and through the delivery mechanism of their choice.

  1. High resolution image processing on low-cost microcomputers

    NASA Technical Reports Server (NTRS)

    Miller, R. L.

    1993-01-01

    Recent advances in microcomputer technology have resulted in systems that rival the speed, storage, and display capabilities of traditionally larger machines. Low-cost microcomputers can provide a powerful environment for image processing. A new software program which offers sophisticated image display and analysis on IBM-based systems is presented. Designed specifically for a microcomputer, this program provides a wide range of functions normally found only on dedicated graphics systems, and therefore can provide most students, universities and research groups with an affordable computer platform for processing digital images. The processing of AVHRR images within this environment is presented as an example.
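
    As a small illustration of the kind of interactive display function such a program offers (the record does not describe its internals), the sketch below implements a window/level contrast stretch in Python with NumPy; the band values and window settings are made up.

        import numpy as np

        def window_level(img, center, width):
            """Map raw sensor counts to 8-bit display values with a window/level
            (contrast stretch) operation, a staple of interactive image display."""
            lo, hi = center - width / 2.0, center + width / 2.0
            out = (np.clip(img, lo, hi) - lo) / max(hi - lo, 1e-9) * 255.0
            return out.astype(np.uint8)

        # Example: stretch a 10-bit AVHRR-like band around a chosen brightness range
        band = np.random.randint(0, 1024, size=(512, 512))
        display = window_level(band, center=400, width=300)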

  2. Status Report on Modelling and Simulation Capabilities for Nuclear-Renewable Hybrid Energy Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rabiti, C.; Epiney, A.; Talbot, P.

    This report summarizes the current status of the modeling and simulation capabilities developed for the economic assessment of Nuclear-Renewable Hybrid Energy Systems (N-R HES). The increasing penetration of variable renewables is altering the profile of the net demand, with which the other generators on the grid have to cope. N-R HES analyses are being conducted to determine the potential feasibility of mitigating the resultant volatility in the net electricity demand by adding industrial processes that utilize either thermal or electrical energy as stabilizing loads. This coordination of energy generators and users is proposed to mitigate the increase in electricity cost and cost volatility through the production of a saleable commodity. Overall, the financial performance of a system that is comprised of peaking units (i.e. gas turbine), baseload supply (i.e. nuclear power plant), and an industrial process (e.g. hydrogen plant) should be optimized under the constraint of satisfying an electricity demand profile with a certain level of variable renewable (wind) penetration. The optimization should entail both the sizing of the components/subsystems that comprise the system and the optimal dispatch strategy (output at any given moment in time from the different subsystems). Some of the capabilities here described have been reported separately in [1, 2, 3]. The purpose of this report is to provide an update on the improvement and extension of those capabilities and to illustrate their integrated application in the economic assessment of N-R HES.
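
    To make the dispatch idea concrete (this is an editor's toy example, not one of the report's models), the Python sketch below dispatches a single hour: nuclear runs as baseload, surplus generation is diverted to a hydrogen plant, and a gas turbine covers any shortfall. All capacities and demand figures are invented.

        def dispatch_hour(demand_mw, wind_mw, nuclear_mw=1000.0, gas_cap_mw=400.0):
            """Toy one-hour dispatch for a nuclear-renewable hybrid system."""
            net_demand = demand_mw - wind_mw            # demand left after renewables
            surplus = max(nuclear_mw - net_demand, 0)   # energy routed to the hydrogen plant
            shortfall = max(net_demand - nuclear_mw, 0)
            gas = min(shortfall, gas_cap_mw)            # peaking unit covers the rest
            return {"to_hydrogen_mw": surplus, "gas_mw": gas,
                    "unserved_mw": shortfall - gas}

        print(dispatch_hour(demand_mw=1300.0, wind_mw=150.0))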

  3. Clinical Parameters and Tools for Home-Based Assessment of Parkinson's Disease: Results from a Delphi study.

    PubMed

    Ferreira, Joaquim J; Santos, Ana T; Domingos, Josefa; Matthews, Helen; Isaacs, Tom; Duffen, Joy; Al-Jawad, Ahmed; Larsen, Frank; Artur Serrano, J; Weber, Peter; Thoms, Andrea; Sollinger, Stefan; Graessner, Holm; Maetzler, Walter

    2015-01-01

    Parkinson's disease (PD) is a neurodegenerative disorder with fluctuating symptoms. To aid the development of a system to evaluate people with PD (PwP) at home (SENSE-PARK system), there was a need to define parameters and tools to be applied in the assessment of 6 domains: gait, bradykinesia/hypokinesia, tremor, sleep, balance and cognition. To identify relevant parameters and assessment tools for the 6 domains, from the perspective of PwP, caregivers and movement disorders specialists, a 2-round Delphi study was conducted to select a core set of parameters and assessment tools to be applied. This process included PwP, caregivers and movement disorders specialists. Two hundred and thirty-three PwP, caregivers and physicians completed the first-round questionnaire, and 50 the second. Results allowed the identification of parameters and assessment tools to be added to the SENSE-PARK system. The most consensual parameters were: Falls and Near Falls; Capability to Perform Activities of Daily Living; Interference with Activities of Daily Living; Capability to Process Tasks; and Capability to Recall and Retrieve Information. The most cited assessment strategies included Walkers; the Evaluation of Performance Doing Fine Motor Movements; Capability to Eat; Assessment of Sleep Quality; Identification of Circumstances and Triggers for Loss of Balance; and Memory Assessment. An agreed set of measuring parameters, tests, tools and devices was achieved to be part of a system to evaluate PwP at home. A pattern of different perspectives was identified for each stakeholder.

  4. Marshall Space Flight Center Materials and Processes Laboratory

    NASA Technical Reports Server (NTRS)

    Tramel, Terri L.

    2012-01-01

    Marshall's Materials and Processes Laboratory has been a core capability for NASA for over fifty years. MSFC has a proven heritage and recognized expertise in materials and manufacturing that are essential to enable and sustain space exploration. Marshall provides a "systems-wise" capability for applied research, flight hardware development, and sustaining engineering. Our history of leadership and achievements in materials, manufacturing, and flight experiments includes Apollo, Skylab, Mir, Spacelab, Shuttle (Space Shuttle Main Engine, External Tank, Reusable Solid Rocket Motor, and Solid Rocket Booster), Hubble, Chandra, and the International Space Station. MSFC's National Center for Advanced Manufacturing (NCAM) facilitates major M&P advanced manufacturing partnership activities with academia, industry and other local, state and federal government agencies. The Materials and Processes Laboratory's principal competencies in metals, composites, ceramics, additive manufacturing, materials and process modeling and simulation, space environmental effects, non-destructive evaluation, and fracture and failure analysis provide products ranging from materials research in space to fully integrated solutions for large, complex systems challenges. Marshall's materials research, development and manufacturing capabilities assure that NASA and national missions have access to cutting-edge, cost-effective engineering design and production options that are frugal in using design margins and are verified as safe and reliable. These are all critical factors in both future mission success and affordability.

  5. Automatic welding systems for large ship hulls

    NASA Astrophysics Data System (ADS)

    Arregi, B.; Granados, S.; Hascoet, JY.; Hamilton, K.; Alonso, M.; Ares, E.

    2012-04-01

    Welding processes represent about 40% of the total production time in shipbuilding. Although most of the indoor welding work is automated, outdoor operations still require the involvement of numerous operators. Automating hull welding operations is a priority in large shipyards. The objective of the present work is to develop a comprehensive welding system capable of working with several welding layers in an automated way. There are several difficulties in automating seam tracking for the welding process. The proposed solution is the development of a welding machine capable of moving autonomously along the welding seam, controlling both the position of the torch and the welding parameters to adjust the thickness of the weld bead to the actual gap between the hull plates.
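
    The record does not give the control law, so purely as an illustration the Python sketch below shows a proportional seam-following loop in which the torch is steered toward the measured seam offset and the welding current is scaled with the measured plate gap. The gain and the current/gap relation are invented placeholders.

        def track_seam(offsets_mm, gaps_mm, kp=0.4, base_current_a=180.0):
            """Toy seam-following loop: proportional steering plus gap-based current scaling."""
            torch_mm, log = 0.0, []
            for offset, gap in zip(offsets_mm, gaps_mm):
                torch_mm += kp * (offset - torch_mm)           # steer toward the seam
                current = base_current_a * (1.0 + 0.1 * gap)   # widen the bead for larger gaps
                log.append((round(torch_mm, 2), round(current, 1)))
            return log

        print(track_seam(offsets_mm=[0.0, 0.5, 1.2, 1.0], gaps_mm=[1.0, 1.2, 1.5, 1.4]))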

  6. Synthetic analog and digital circuits for cellular computation and memory.

    PubMed

    Purcell, Oliver; Lu, Timothy K

    2014-10-01

    Biological computation is a major area of focus in synthetic biology because it has the potential to enable a wide range of applications. Synthetic biologists have applied engineering concepts to biological systems in order to construct progressively more complex gene circuits capable of processing information in living cells. Here, we review the current state of computational genetic circuits and describe artificial gene circuits that perform digital and analog computation. We then discuss recent progress in designing gene networks that exhibit memory, and how memory and computation have been integrated to yield more complex systems that can both process and record information. Finally, we suggest new directions for engineering biological circuits capable of computation. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.

  7. Dynamic nanoplatforms in biosensor and membrane constitutional systems.

    PubMed

    Mahon, Eugene; Aastrup, Teodor; Barboiu, Mihail

    2012-01-01

    Molecular recognition in biological systems occurs mainly at interfacial environments such as membrane surfaces, enzyme active sites, or the interior of the DNA double helix. At the cell membrane surface, carbohydrate-protein recognition principles apply to a range of specific non-covalent interactions including immune response, cell proliferation, adhesion and death, cell-cell interaction and communication. Protein-protein recognition meanwhile accounts for signalling processes and ion channel structure. In this chapter we aim to describe such constitutional dynamic interfaces for biosensing and membrane transport applications. Constitutionally adaptive interfaces may mimic the recognition capabilities intrinsic to natural recognition processes. We present some recent examples of 2D and 3D constructed sensors and membranes of this type and describe their sensing and transport capabilities.

  8. The Contribution of Cognitive Engineering to the Effective Design and Use of Information Systems.

    ERIC Educational Resources Information Center

    Garg-Janardan, Chaya; Salvendy, Gavriel

    1986-01-01

    Examines the role of human information processing and decision-making capabilities and limitations in the design of effective human-computer interfaces. Several cognitive engineering principles that should guide the design process are outlined. (48 references) (Author/CLB)

  9. Quality and noise measurements in mobile phone video capture

    NASA Astrophysics Data System (ADS)

    Petrescu, Doina; Pincenti, John

    2011-02-01

    The quality of videos captured with mobile phones has become increasingly important particularly since resolutions and formats have reached a level that rivals the capabilities available in the digital camcorder market, and since many mobile phones now allow direct playback on large HDTVs. The video quality is determined by the combined quality of the individual parts of the imaging system including the image sensor, the digital color processing, and the video compression, each of which has been studied independently. In this work, we study the combined effect of these elements on the overall video quality. We do this by evaluating the capture under various lighting, color processing, and video compression conditions. First, we measure full reference quality metrics between encoder input and the reconstructed sequence, where the encoder input changes with light and color processing modifications. Second, we introduce a system model which includes all elements that affect video quality, including a low light additive noise model, ISP color processing, as well as the video encoder. Our experiments show that in low light conditions and for certain choices of color processing the system level visual quality may not improve when the encoder becomes more capable or the compression ratio is reduced.
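
    As a concrete example of a full-reference metric of the kind used in such comparisons (the record does not say which metrics were applied), the sketch below computes PSNR between an encoder-input frame and its reconstruction; the frames here are synthetic.

        import numpy as np

        def psnr(reference, reconstructed, peak=255.0):
            """Full-reference PSNR in dB; higher is better, ~30-40 dB is typical for consumer video."""
            diff = reference.astype(np.float64) - reconstructed.astype(np.float64)
            mse = np.mean(diff ** 2)
            return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

        ref = np.random.randint(0, 256, (720, 1280), dtype=np.uint8)
        noisy = np.clip(ref + np.random.normal(0, 5, ref.shape), 0, 255).astype(np.uint8)
        print(f"PSNR: {psnr(ref, noisy):.1f} dB")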

  10. NATO initial common operational picture capability project

    NASA Astrophysics Data System (ADS)

    Fanti, Laura; Beach, David

    2002-08-01

    The Common Operational Picture (COP) capability can be defined as the ability to display on a single screen integrated views of the Recognized Maritime, Air and Ground Pictures, enriched by other tactical data, such as theater plans, assets, intelligence and logistics information. The purpose of the COP capability is to provide military forces a comprehensive view of the battle space, thereby enhancing situational awareness and the decision-making process across the military command and control spectrum. The availability of a COP capability throughout the command structure is a high priority operational requirement in NATO. A COP capability for NATO is being procured and implemented in an incremental way within the NATO Automated Information System (Bi-SC AIS) Functional Services programme under the coordination of the NATO Consultation, Command and Control Agency (NC3A) Integrated Programme Team 5 (IPT5). The NATO Initial COP (iCOP) capability project, first step of this evolutionary procurement, will provide an initial COP capability to NATO in a highly pragmatic and low-risk fashion, by using existing operational communications infrastructure and NATO systems, i.e. the NATO-Wide Integrated Command and Control Software for Air Operations (ICC), the Maritime Command and Control Information System (MCCIS), and the Joint Operations and Intelligence Information System (JOIIS), which will provide respectively the Recognized Air, Maritime and Ground Pictures. This paper gives an overview of the NATO Initial COP capability project, including its evolutionary implementation approach, and describes the technical solution selected to satisfy the urgent operational requirement in a timely and cost effective manner.

  11. Security Systems Commissioning: An Old Trick for Your New Dog

    ERIC Educational Resources Information Center

    Black, James R.

    2009-01-01

    Sophisticated, software-based security systems can provide powerful tools to support campus security. By nature, such systems are flexible, with many capabilities that can help manage the process of physical protection. However, the full potential of these systems can be overlooked because of unfamiliarity with the products, weaknesses in security…

  12. A high-rate PCI-based telemetry processor system

    NASA Astrophysics Data System (ADS)

    Turri, R.

    2002-07-01

    The high performance reached by satellite on-board telemetry generation and transmission will consequently require ground facilities with higher processing capabilities at low cost, to allow wide deployment of such ground stations. The equipment normally used is based on complex, proprietary bus and computing architectures that prevent the systems from exploiting the continuous and rapid increase in computing power available on the market. PCI bus systems now allow processing of high-rate data streams in a standard PC system. At the same time, the Windows NT operating system supports multitasking and symmetric multiprocessing, giving the capability to process high-data-rate signals. In addition, high-speed networking, 64-bit PCI-bus technologies and the increase in processor power and software allow creating a system based on COTS products (which in future may be easily and inexpensively upgraded). In the frame of the EUCLID RTP 9.8 project, a specific work element was dedicated to developing the architecture of a system able to acquire telemetry data at up to 600 Mbps. Laben S.p.A., a Finmeccanica company entrusted with this work, has designed a PCI-based telemetry system making possible the communication between a satellite down-link and a wide area network at the required rate.

  13. Application of smart optical fiber sensors for structural load monitoring

    NASA Astrophysics Data System (ADS)

    Davies, Heddwyn; Everall, Lorna A.; Gallon, Andrew M.

    2001-06-01

    This paper describes a smart monitoring system, incorporating optical fiber sensing techniques, capable of providing important structural information to designers and users alike. This technology has wide industrial and commercial application in areas including aerospace, civil, maritime and automotive engineering. In order to demonstrate the capability of the sensing system, it has been installed in a 35 m free-standing carbon fiber yacht mast, where a complete optical network of strain and temperature sensors was embedded into the composite mast and boom during lay-up. The system was able to monitor the behavior of the composite rig through a range of handling conditions. The resulting strain information can be used by engineers to improve the structural design process. Embedded fiber optic sensors have wide-ranging applications for structural load monitoring. Due to their small size, optical fiber sensors can be readily embedded into composite materials. Other advantages include their immediate multiplexing capability and immunity to electro-magnetic interference. The capability of this system has been demonstrated within the maritime and industrial environment, but can be adapted for any application.
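
    The record does not name the sensor type, but fiber Bragg gratings are a common choice for embedded strain networks; assuming such gratings, the Python sketch below converts a wavelength shift to strain with the standard relation Δλ/λ ≈ (1 − p_e)·ε, temperature effects ignored. The example numbers are illustrative.

        def fbg_strain_microstrain(wavelength_nm, reference_nm, photoelastic=0.22):
            """Strain (in microstrain) from a fiber Bragg grating wavelength shift,
            using delta_lambda / lambda = (1 - p_e) * strain with p_e ~ 0.22."""
            shift = wavelength_nm - reference_nm
            return shift / (reference_nm * (1.0 - photoelastic)) * 1e6

        # A 1.2 nm shift on a 1550 nm grating corresponds to roughly 990 microstrain
        print(round(fbg_strain_microstrain(1551.2, 1550.0), 1))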

  14. Clinical Summarization Capabilities of Commercially-available and Internally-developed Electronic Health Records

    PubMed Central

    Laxmisan, A.; McCoy, A.B.; Wright, A.; Sittig, D.F.

    2012-01-01

    Objective: Clinical summarization, the process by which relevant patient information is electronically summarized and presented at the point of care, is of increasing importance given the increasing volume of clinical data in electronic health record systems (EHRs). There is a paucity of research on electronic clinical summarization, including the capabilities of currently available EHR systems. Methods: We compared different aspects of general clinical summary screens used in twelve different EHR systems using a previously described conceptual model: AORTIS (Aggregation, Organization, Reduction, Interpretation and Synthesis). Results: We found a wide variation in the EHRs’ summarization capabilities: all systems were capable of simple aggregation and organization of limited clinical content, but only one demonstrated an ability to synthesize information from the data. Conclusion: Improvement of the clinical summary screen functionality for currently available EHRs is necessary. Further research should identify strategies and methods for creating easy-to-use, well-designed clinical summary screens that aggregate, organize and reduce all pertinent patient information as well as provide clinical interpretations and synthesis as required. PMID:22468161

  15. An Adaptive Technique for a Redundant-Sensor Navigation System. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Chien, T. T.

    1972-01-01

    An on-line adaptive technique is developed to provide a self-contained redundant-sensor navigation system with a capability to utilize its full potential in reliability and performance. The gyro navigation system is modeled as a Gauss-Markov process, with degradation modes defined as changes in characteristics specified by parameters associated with the model. The adaptive system is formulated as a multistage stochastic process: (1) a detection system, (2) an identification system and (3) a compensation system. It is shown that the sufficient statistic for the partially observable process in the detection and identification system is the posterior measure of the state of degradation, conditioned on the measurement history.
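
    To illustrate the role of the posterior measure (this is an editor's sketch, not the thesis algorithm), the Python fragment below performs a recursive Bayesian update of the probability of a few hypothesised degradation modes, each of which predicts a different spread of the measurement residual. The modes, sigmas and residuals are invented.

        import numpy as np

        def update_posterior(prior, residual, sigmas):
            """One recursive Bayes step over degradation hypotheses: weight the prior
            by each mode's Gaussian likelihood of the residual, then renormalise."""
            sigmas = np.asarray(sigmas, dtype=float)
            likelihood = np.exp(-0.5 * (residual / sigmas) ** 2) / sigmas
            posterior = prior * likelihood
            return posterior / posterior.sum()

        p = np.array([0.90, 0.07, 0.03])        # nominal, noisy, strongly degraded
        for r in [0.1, 0.4, 1.5, 1.8]:          # residuals observed over time
            p = update_posterior(p, r, sigmas=[0.2, 0.6, 2.0])
        print(p.round(3))                        # mass shifts toward a degraded mode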

  16. Current and Prospective Li-Ion Battery Recycling and Recovery Processes

    NASA Astrophysics Data System (ADS)

    Heelan, Joseph; Gratz, Eric; Zheng, Zhangfeng; Wang, Qiang; Chen, Mengyuan; Apelian, Diran; Wang, Yan

    2016-10-01

    The lithium ion (Li-ion) battery industry has been growing exponentially since its inception in the late 20th century. As battery materials evolve, the applications for Li-ion batteries have become even more diverse. To date, Li-ion battery use ranges from consumer portable electronics to electric/hybrid electric vehicles. However, even with the continued rise of Li-ion battery development and commercialization, the recycling industry is lagging; approximately 95% of Li-ion batteries are landfilled instead of recycled upon reaching end of life. Industrialized recycling processes are limited and only capable of recovering secondary raw materials, not suitable for direct reuse in new batteries. Most technologies are also reliant on high concentrations of cobalt to be profitable, and intense battery sortation is necessary prior to processing. For this reason, it is critical that a new recycling process be commercialized that is capable of recovering more valuable materials at a higher efficiency. A new technology has been developed by researchers at Worcester Polytechnic Institute which is capable of recovering LiNixMnyCozO2 cathode material from a hydrometallurgical process, making the recycling system as a whole more economically viable. By implementing a flexible, closed-loop recycling system, recycling of Li-ion batteries will become more prevalent, saving millions of pounds of batteries from entering the waste stream each year.

  17. An Extreme Learning Machine-Based Neuromorphic Tactile Sensing System for Texture Recognition.

    PubMed

    Rasouli, Mahdi; Chen, Yi; Basu, Arindam; Kukreja, Sunil L; Thakor, Nitish V

    2018-04-01

    Despite significant advances in computational algorithms and the development of tactile sensors, artificial tactile sensing is strikingly less efficient and capable than human tactile perception. Inspired by the efficiency of biological systems, we aim to develop a neuromorphic system for tactile pattern recognition. We particularly target texture recognition as it is one of the most necessary and challenging tasks for artificial sensory systems. Our system consists of a piezoresistive fabric material as the sensor to emulate skin, an interface that produces spike patterns to mimic neural signals from mechanoreceptors, and an extreme learning machine (ELM) chip to analyze spiking activity. Benefiting from intrinsic advantages of biologically inspired event-driven systems and the massively parallel and energy-efficient processing capabilities of the ELM chip, the proposed architecture offers a fast and energy-efficient alternative for processing tactile information. Moreover, it provides the opportunity for the development of low-cost tactile modules for large-area applications by integration of sensors and processing circuits. We demonstrate the recognition capability of our system in a texture discrimination task, where it achieves a classification accuracy of 92% for categorization of ten graded textures. Our results confirm that there exists a tradeoff between response time and classification accuracy (and information transfer rate). A faster decision can be achieved at early time steps or by using a shorter time window. This, however, results in deterioration of the classification accuracy and information transfer rate. We further observe that there exists a tradeoff between the classification accuracy and the input spike rate (and thus energy consumption). Our work substantiates the importance of developing efficient sparse codes for encoding sensory data to improve energy efficiency. These results have significance for a wide range of wearable, robotic, prosthetic, and industrial applications.
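
    For readers unfamiliar with extreme learning machines, the Python sketch below shows the core idea in software form: a fixed random hidden layer followed by a closed-form least-squares read-out. The spike-count features and class counts are synthetic stand-ins, not the paper's data or its hardware implementation.

        import numpy as np

        def train_elm(x, y, hidden=200, seed=0):
            """Minimal ELM: random fixed hidden layer + pseudo-inverse read-out."""
            rng = np.random.default_rng(seed)
            w_in = rng.normal(size=(x.shape[1], hidden))
            b = rng.normal(size=hidden)
            h = np.tanh(x @ w_in + b)             # hidden activations
            w_out = np.linalg.pinv(h) @ y         # closed-form output weights
            return w_in, b, w_out

        def predict_elm(x, w_in, b, w_out):
            return np.tanh(x @ w_in + b) @ w_out

        # Hypothetical spike-count features for ten texture classes (one-hot targets)
        x = np.random.rand(500, 32)
        y = np.eye(10)[np.random.randint(0, 10, 500)]
        labels = predict_elm(x, *train_elm(x, y)).argmax(axis=1)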

  18. Multispectral Image Processing for Plants

    NASA Technical Reports Server (NTRS)

    Miles, Gaines E.

    1991-01-01

    The development of a machine vision system to monitor plant growth and health is one of three essential steps towards establishing an intelligent system capable of accurately assessing the state of a controlled ecological life support system for long-term space travel. Besides a network of sensors, simulators are needed to predict plant features, and artificial intelligence algorithms are needed to determine the state of a plant-based life support system. Multispectral machine vision and image processing can be used to sense plant features, including health and nutritional status.
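
    A standard multispectral indicator of plant health is the Normalized Difference Vegetation Index; the short Python sketch below computes it from near-infrared and red bands (the record does not state which indices its system uses, and the bands here are random placeholders).

        import numpy as np

        def ndvi(nir, red, eps=1e-6):
            """NDVI = (NIR - red) / (NIR + red); dense healthy canopy pushes it toward +1."""
            nir = nir.astype(np.float64)
            red = red.astype(np.float64)
            return (nir - red) / (nir + red + eps)

        nir_band = np.random.randint(50, 255, (480, 640))
        red_band = np.random.randint(10, 120, (480, 640))
        canopy_mask = ndvi(nir_band, red_band) > 0.4   # crude "healthy vegetation" mask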

  19. Toward an optimisation technique for dynamically monitored environment

    NASA Astrophysics Data System (ADS)

    Shurrab, Orabi M.

    2016-10-01

    The data fusion community has introduced multiple procedures for situational assessment to facilitate timely responses to emerging situations. More directly, the process refinement level of the Joint Directors of Laboratories (JDL) model is a meta-process to assess and improve the data fusion task during real-time operation. In other words, it is an optimisation technique to verify the overall data fusion performance and enhance it toward the top-level goals of the decision-making resources. This paper discusses the theoretical concept of prioritisation, where the analyst team is required to keep up to date with a dynamically changing environment spanning domains such as air, sea, land, space and cyberspace. Furthermore, it demonstrates an illustrative example of how various tracking activities are ranked simultaneously into a predetermined order. Specifically, it presents a modelling scheme for a case-study-based scenario, where the real-time system reports different classes of prioritised events, followed by a performance metric for evaluating the prioritisation process in the situational awareness (SWA) domain. The proposed performance metric has been designed and evaluated using an analytical approach. The modelling scheme represents the situational awareness system outputs mathematically, in the form of a list of activities. Such methods allowed the evaluation process to conduct a rigorous analysis of the prioritisation process, despite any constraints related to a domain-specific configuration. After three levels of assessment over three separate scenarios, the Prioritisation Capability Score (PCS) provided an appropriate scoring scheme for different ranking instances. Indeed, from the data fusion perspective, the proposed metric assessed real-time system performance adequately, and it is capable of supporting a verification process that directs the operator's attention to any issue concerning the prioritisation capability of the situational awareness domain.
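
    The record does not define the PCS formula, so the Python sketch below shows only a generic pairwise rank-agreement score of the kind such a metric could build on: the fraction of activity pairs whose relative order in the system's reported list matches the reference priority. The domain labels are invented.

        def rank_agreement(reported_order, true_priority):
            """Fraction of pairs ordered consistently with the reference priority
            (1.0 = perfect ordering, 0.0 = fully inverted)."""
            rank = {a: i for i, a in enumerate(reported_order)}
            pairs = agree = 0
            for i, a in enumerate(true_priority):
                for b in true_priority[i + 1:]:
                    pairs += 1
                    agree += rank[a] < rank[b]
            return agree / pairs if pairs else 1.0

        print(rank_agreement(["air", "cyber", "sea", "land"],
                             ["air", "sea", "land", "cyber"]))   # 4 of 6 pairs agree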

  20. Water cooler towers and other man-made aquatic systems as environmental collection systems for agents of concern

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brigmon, Robin; Kingsley, Mark T.

    An apparatus and process are provided for using existing process water sources, such as cooling towers, fountains, and waterfalls, as a monitoring system for the detection of environmental agents which may be present in the environment. The process water is associated with structures whose materials have an inherent filtering or absorbing capability, and it can therefore be used as a rapid screening tool for qualitative and quantitative assessment of environmental agents.

  1. Optimization of MLS receivers for multipath environments

    NASA Technical Reports Server (NTRS)

    Mcalpine, G. A.; Highfill, J. H., III

    1976-01-01

    The design of a microwave landing system (MLS) aircraft receiver, capable of optimal performance in the multipath environments found in air terminal areas, is reported. Special attention was given to the angle tracking problem of the receiver, including tracking system design considerations, the study and application of locally optimum estimation involving multipath-adaptive reception and envelope processing, and microcomputer system design. Results show that envelope processing is competitive, performance-wise, with i-f signal processing in this application and is much simpler and cheaper. A summary of the signal model is given.

  2. Ground standoff mine detection system (GSTAMIDS) engineering, manufacturing, and development (EMD) Block 0

    NASA Astrophysics Data System (ADS)

    Pressley, Jackson R.; Pabst, Donald; Sower, Gary D.; Nee, Larry; Green, Brian; Howard, Peter

    2001-10-01

    The United States Army has contracted EG&G Technical Services to build the GSTAMIDS EMD Block 0. This system autonomously detects and marks buried anti-tank land mines from an unmanned vehicle. It consists of a remotely operated host vehicle, standard teleoperation system (STS) control, a mine detection system (MDS) and a control vehicle. Two complete systems are being fabricated, along with a third MDS. The host vehicle for Block 0 is the South African Meerkat, which has overpass capability for anti-tank mines, as well as armor anti-mine blast protection and ballistic protection. It is operated via the STS radio link from within the control vehicle. The Main Computer System (MCS), located in the control vehicle, receives sensor data from the MDS via a high-speed radio link, processes and fuses the data to make a mine detection decision, and sends the information back to the host vehicle for a mark to be placed on the mine location. The MCS also has the capability to interface with the FBCB2 system via SINCGARS radio. The GSTAMIDS operator station and the control vehicle communications system also connect to the MCS. The MDS sensors are mounted on the host vehicle and include Ground Penetrating Radar (GPR), a Pulsed Magnetic Induction (PMI) metal detector, and (as an option) long-wave infrared (LWIR). A distributed processing architecture is used so that pre-processing is performed on data at the sensor level before transmission to the MCS, minimizing required throughput. Nine (9) channels each of GPR and PMI are mounted underneath the Meerkat to provide a three-meter detection swath. Two IR cameras are mounted on the upper sides of the Meerkat, providing a field of view of the required swath with overlap underneath the vehicle. Also included on the host vehicle are an Inertial Navigation System (INS), Global Positioning System (GPS), and radio communications for remote control and data transmission. The GSTAMIDS Block 0 is designed as a modular, expandable system with sufficient bandwidth and processing capability for incorporation of additional sensor systems in future Blocks. It is also designed to operate in adverse weather conditions and to be transportable around the world.

  3. Assured Human-Autonomy Interaction through Machine Self-Confidence

    NASA Astrophysics Data System (ADS)

    Aitken, Matthew

    Autonomous systems employ many layers of approximations in order to operate in increasingly uncertain and unstructured environments. The complexity of these systems makes it hard for a user to understand the system's capabilities, especially if the user is not an expert. However, if autonomous systems are to be used efficiently, their users must trust them appropriately. The purpose of this work is to implement and assess an 'assurance' that an autonomous system can provide to the user to elicit appropriate trust. Specifically, the autonomous system's perception of its own capabilities is reported to the user as the self-confidence assurance. The self-confidence assurance should allow the user to more quickly and accurately assess the autonomous system's capabilities, generating appropriate trust in the autonomous system. First, this research defines self-confidence and discusses what the self-confidence assurance is attempting to communicate to the user. Then it provides a framework for computing the autonomous system's self-confidence as a function of self-confidence factors which correspond to individual elements in the autonomous system's process. In order to explore this idea, self-confidence is implemented on an autonomous system that uses a mixed observability Markov decision process model to solve a pursuit-evasion problem on a road network. In particular, the implementation focuses on a factor assessing the goodness of the autonomy's expected performance. This work highlights some of the issues and considerations in the design of appropriate metrics for the self-confidence factors, and provides the basis for future research on computing self-confidence in autonomous systems.
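
    The thesis's actual factor definitions are not given in this record; as a purely illustrative sketch, the Python fragment below aggregates a few hypothetical per-element self-confidence factors (each scaled to 0..1) with a weighted geometric mean, so that one weak factor pulls the overall report down. The factor names and weights are made up.

        def overall_self_confidence(factors, weights=None):
            """Weighted geometric mean of per-element self-confidence factors in [0, 1]."""
            weights = weights or {name: 1.0 for name in factors}
            total = sum(weights.values())
            score = 1.0
            for name, value in factors.items():
                score *= max(value, 1e-6) ** (weights[name] / total)
            return score

        factors = {"model_validity": 0.9, "solver_quality": 0.8, "expected_performance": 0.4}
        print(round(overall_self_confidence(factors), 2))   # one weak factor drags the score down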

  4. The Perception of Human Resources Enterprise Architecture within the Department of Defense

    ERIC Educational Resources Information Center

    Delaquis, Richard Serge

    2012-01-01

    The Clinger Cohen Act of 1996 requires that all major Federal Government Information Technology (IT) systems prepare an Enterprise Architecture prior to IT acquisitions. Enterprise Architecture, like house blueprints, represents the system build, capabilities, processes, and data across the enterprise of IT systems. Enterprise Architecture is used…

  5. Hipe, Hipe, Hooray

    NASA Astrophysics Data System (ADS)

    Ott, Stephan; Herschel Science Ground Segment Consortium

    2010-05-01

    The Herschel Space Observatory, the fourth cornerstone mission in the ESA science program, was launched on 14 May 2009. With a 3.5 m telescope, it is the largest space telescope ever launched. Herschel's three instruments (HIFI, PACS, and SPIRE) perform photometry and spectroscopy in the 55 - 672 micron range and will deliver exciting science for the astronomical community during at least three years of routine observations. Since 2 December 2009, Herschel has been performing and processing observations in routine science mode. The development of the Herschel Data Processing System started eight years ago to support the data analysis for Instrument Level Tests. To fulfil the expectations of the astronomical community, additional resources were made available to implement a freely distributable Data Processing System capable of interactively and automatically reducing Herschel data at different processing levels. The system combines data retrieval, pipeline execution and scientific analysis in one single environment. The Herschel Interactive Processing Environment (HIPE) is the user-friendly face of Herschel Data Processing. The software is coded in Java and Jython to be platform independent and to avoid the need for commercial licenses. It is distributed under the GNU Lesser General Public License (LGPL), permitting everyone to access and to re-use its code. We will summarise the current capabilities of the Herschel Data Processing System and give an overview of future development milestones and plans, and how the astronomical community can contribute to HIPE. The Herschel Data Processing System is a joint development by the Herschel Science Ground Segment Consortium, consisting of ESA, the NASA Herschel Science Center, and the HIFI, PACS and SPIRE consortium members.

  6. Advanced Constituents and Processes for Ceramic Composite Engine Components

    NASA Technical Reports Server (NTRS)

    Yun, H. M.; DiCarlo, J. A.; Bhatt, R. T.

    2004-01-01

    The successful replacement of metal alloys by ceramic matrix composites (CMC) in hot-section engine components will depend strongly on optimizing the processes and properties of the CMC microstructural constituents so that they can synergistically provide the total CMC system with improved temperature capability and with the key properties required by the components for long-term structural service. This presentation provides the results of recent activities at NASA aimed at developing advanced silicon carbide (SiC) fiber-reinforced hybrid SiC matrix composite systems that can operate under mechanical loading and oxidizing conditions for hundreds of hours at 2400 and 2600 F, temperatures well above current metal capability. These SiC/SiC composite systems are lightweight (~30% of metal density) and, in comparison to monolithic ceramics and carbon fiber-reinforced ceramic composites, are able to reliably retain their structural properties for long times under aggressive engine environments. It is shown that the improved temperature capability of the SiC/SiC systems is related first to the NASA development of the Sylramic-iBN SiC fiber, which displays high thermal stability, creep resistance, rupture resistance, and thermal conductivity, and possesses an in-situ grown BN surface layer for added environmental durability. This fiber is simply derived from the Sylramic SiC fiber type that is currently produced at ATK COI Ceramics. Further capability is then derived by using chemical vapor infiltration (CVI) to form the initial portion of the hybrid SiC matrix. Because of its high creep resistance and thermal conductivity, the CVI SiC matrix is a required base constituent for all the high temperature SiC/SiC systems. By subsequently thermo-mechanical-treating the CMC preform, which consists of the Sylramic-iBN fibers and CVI SiC matrix, process-related defects in the matrix are removed, further improving matrix and CMC creep resistance and conductivity.

  7. Microcomputer-based artificial vision support system for real-time image processing for camera-driven visual prostheses

    NASA Astrophysics Data System (ADS)

    Fink, Wolfgang; You, Cindy X.; Tarbell, Mark A.

    2010-01-01

    It is difficult to predict exactly what blind subjects with camera-driven visual prostheses (e.g., retinal implants) can perceive. Thus, it is prudent to offer them a wide variety of image processing filters and the capability to engage these filters repeatedly in any user-defined order to enhance their visual perception. To attain true portability, we employ a commercial off-the-shelf battery-powered general purpose Linux microprocessor platform to create the microcomputer-based artificial vision support system (μAVS2) for real-time image processing. Truly standalone, μAVS2 is smaller than a deck of playing cards, lightweight, fast, and equipped with USB, RS-232 and Ethernet interfaces. Image processing filters on μAVS2 operate in a user-defined linear sequential-loop fashion, resulting in vastly reduced memory and CPU requirements during execution. μAVS2 imports raw video frames from a USB or IP camera, performs image processing, and issues the processed data over an outbound Internet TCP/IP or RS-232 connection to the visual prosthesis system. Hence, μAVS2 affords users of current and future visual prostheses independent mobility and the capability to customize the visual perception generated. Additionally, μAVS2 can easily be reconfigured for other prosthetic systems. Testing of μAVS2 with actual retinal implant carriers is envisioned in the near future.
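
    To illustrate the user-defined sequential filter loop described above (the actual μAVS2 filter set is not listed in this record), the Python sketch below chains a few stand-in filters and applies them to a synthetic camera frame in a fixed order.

        import numpy as np

        def run_filter_chain(frame, chain):
            """Apply image processing filters one after another, in the user-chosen order."""
            for filt in chain:
                frame = filt(frame)
            return frame

        invert = lambda img: 255 - img
        threshold = lambda img: np.where(img > 128, 255, 0).astype(np.uint8)
        downsample = lambda img: img[::4, ::4]      # coarse grid, e.g. for a sparse electrode array

        frame = np.random.randint(0, 256, (240, 320), dtype=np.uint8)
        out = run_filter_chain(frame, [downsample, invert, threshold])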

  8. Microcomputer-based artificial vision support system for real-time image processing for camera-driven visual prostheses.

    PubMed

    Fink, Wolfgang; You, Cindy X; Tarbell, Mark A

    2010-01-01

    It is difficult to predict exactly what blind subjects with camera-driven visual prostheses (e.g., retinal implants) can perceive. Thus, it is prudent to offer them a wide variety of image processing filters and the capability to engage these filters repeatedly in any user-defined order to enhance their visual perception. To attain true portability, we employ a commercial off-the-shelf battery-powered general purpose Linux microprocessor platform to create the microcomputer-based artificial vision support system (microAVS(2)) for real-time image processing. Truly standalone, microAVS(2) is smaller than a deck of playing cards, lightweight, fast, and equipped with USB, RS-232 and Ethernet interfaces. Image processing filters on microAVS(2) operate in a user-defined linear sequential-loop fashion, resulting in vastly reduced memory and CPU requirements during execution. MicroAVS(2) imports raw video frames from a USB or IP camera, performs image processing, and issues the processed data over an outbound Internet TCP/IP or RS-232 connection to the visual prosthesis system. Hence, microAVS(2) affords users of current and future visual prostheses independent mobility and the capability to customize the visual perception generated. Additionally, microAVS(2) can easily be reconfigured for other prosthetic systems. Testing of microAVS(2) with actual retinal implant carriers is envisioned in the near future.

  9. An efficient approach to integrated MeV ion imaging.

    PubMed

    Nikbakht, T; Kakuee, O; Solé, V A; Vosuoghi, Y; Lamehi-Rachti, M

    2018-03-01

    An ionoluminescence (IL) spectral imaging system, complementing the common MeV ion imaging facilities such as µ-PIXE and µ-RBS, is implemented at the Van de Graaff laboratory of Tehran. Versatile processing software is required to handle the large amount of data concurrently collected in µ-IL and common MeV ion imaging measurements through the respective methodologies. The open-source freeware PyMca, with image processing and multivariate analysis capabilities, is employed to simultaneously process common MeV ion imaging and µ-IL data. Herein, the program was adapted to support the OM_DAQ listmode data format. The appropriate performance of the µ-IL data acquisition system is confirmed through a case study. Moreover, the capabilities of the software for simultaneous analysis of µ-PIXE and µ-RBS experimental data are presented. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. End-of-Life Care Planning in Accountable Care Organizations: Associations with Organizational Characteristics and Capabilities.

    PubMed

    Ahluwalia, Sangeeta C; Harris, Benjamin J; Lewis, Valerie A; Colla, Carrie H

    2018-06-01

    To measure the extent to which accountable care organizations (ACOs) have adopted end-of-life (EOL) care planning processes and characterize those ACOs that have established processes related to EOL. This study uses data from three waves (2012-2015) of the National Survey of ACOs. Respondents were 397 ACOs participating in Medicare, Medicaid, and commercial ACO contracts. This is a cross-sectional survey study using multivariate ordered logit regression models. We measured the extent to which the ACO had adopted EOL care planning processes as well as organizational characteristics, including care management, utilization management, health informatics, and shared decision-making capabilities, palliative care, and patient-centered medical home experience. Twenty-one percent of ACOs had few or no EOL care planning processes, 60 percent had some processes, and 19.6 percent had advanced processes. ACOs with a hospital in their system (OR: 3.07; p = .01), and ACOs with advanced care management (OR: 1.43; p = .02), utilization management (OR: 1.58, p = .00), and shared decision-making capabilities (OR: 16.3, p = .000) were more likely to have EOL care planning processes than those with no hospital or few to no capabilities. There remains considerable room for today's ACOs to increase uptake of EOL care planning, possibly by leveraging existing care management, utilization management, and shared decision-making processes. © Health Research and Educational Trust.

  11. Early Performance Results from the GOES-R Product Generation System

    NASA Astrophysics Data System (ADS)

    Marley, S.; Weiner, A.; Kalluri, S. N.; Hansen, D.; Dittberner, G.

    2013-12-01

    Enhancements to remote sensing capabilities for the next generation of Geostationary Operational Environmental Satellites (GOES R-series), scheduled to be launched in 2015, require high performance computing capabilities to output meteorological observations and products at low latency compared to the legacy processing systems. The GOES R-series (GOES-R, -S, -T, and -U) represents a generational change in both spacecraft and instrument capability, and the GOES Re-Broadcast (GRB) data, which contains calibrated and navigated radiances from all the instruments, will be at a data rate of 31 Mb/sec compared to the current 2.11 Mb/sec from existing GOES satellites. To keep up with the data processing rates, the Product Generation (PG) system in the ground segment is designed on a Service Based Architecture (SBA). Each algorithm is executed as a service and subscribes to the data it needs to create higher level products via an enterprise service bus. Various levels of product data are published and retrieved from a data fabric. Together, the SBA and the data fabric provide a flexible, scalable, high performance architecture that meets the needs of product processing now and can grow to accommodate new algorithms in the future. The algorithms are linked together in a precedence chain starting from Level 0 to Level 1b and higher order Level 2 products that are distributed to data distribution nodes for external users. Qualification testing for more than half the product algorithms has so far been completed on the PG system.
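
    The publish/subscribe pattern behind the service-based architecture can be pictured with a few lines of Python; this is an editor's toy sketch, not the GOES-R ground segment software, and the topic names and "cloud mask" rule are invented.

        from collections import defaultdict

        class ServiceBus:
            """Minimal in-process publish/subscribe bus: services subscribe to the
            product levels they need and publish the products they generate."""
            def __init__(self):
                self.subscribers = defaultdict(list)

            def subscribe(self, topic, handler):
                self.subscribers[topic].append(handler)

            def publish(self, topic, data):
                for handler in self.subscribers[topic]:
                    handler(data)

        bus = ServiceBus()
        # A toy Level-2 "cloud mask" service that consumes Level-1b radiances
        bus.subscribe("L1b_radiances", lambda rad: bus.publish("L2_cloud_mask", [r > 0.3 for r in rad]))
        bus.subscribe("L2_cloud_mask", lambda mask: print("cloud pixels:", sum(mask)))
        bus.publish("L1b_radiances", [0.1, 0.5, 0.7, 0.2])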

  12. Issues in knowledge representation to support maintainability: A case study in scientific data preparation

    NASA Technical Reports Server (NTRS)

    Chien, Steve; Kandt, R. Kirk; Roden, Joseph; Burleigh, Scott; King, Todd; Joy, Steve

    1992-01-01

    Scientific data preparation is the process of extracting usable scientific data from raw instrument data. This task involves noise detection (and subsequent noise classification and flagging or removal), extracting data from compressed forms, and construction of derivative or aggregate data (e.g. spectral densities or running averages). A software system called PIPE provides intelligent assistance to users developing scientific data preparation plans using a programming language called Master Plumber. PIPE provides this assistance capability by using a process description to create a dependency model of the scientific data preparation plan. This dependency model can then be used to verify syntactic and semantic constraints on processing steps to perform limited plan validation. PIPE also provides capabilities for using this model to assist in debugging faulty data preparation plans. In this case, the process model is used to focus the developer's attention upon those processing steps and data elements that were used in computing the faulty output values. Finally, the dependency model of a plan can be used to perform plan optimization and runtime estimation. These capabilities allow scientists to spend less time developing data preparation procedures and more time on scientific analysis tasks. Because the scientific data processing modules (called fittings) evolve to match scientists' needs, issues regarding maintainability are of prime importance in PIPE. This paper describes the PIPE system and describes how issues in maintainability affected the knowledge representation used in PIPE to capture knowledge about the behavior of fittings.
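
    The dependency-model idea can be pictured with a small Python sketch: map each derived data element to the step that produces it and the inputs that step consumes, then walk backwards from a faulty output to list the steps worth inspecting. The plan contents are invented and not PIPE's actual representation.

        # Toy dependency model of a data-preparation plan
        PLAN = {
            "spectral_density": ("fft_step", ["despiked_signal"]),
            "running_average": ("smooth_step", ["despiked_signal"]),
            "despiked_signal": ("despike_step", ["raw_signal"]),
        }

        def trace_suspects(faulty_output, plan):
            """Return the processing steps upstream of a faulty data element."""
            suspects, frontier = [], [faulty_output]
            while frontier:
                item = frontier.pop()
                if item in plan:
                    step, inputs = plan[item]
                    suspects.append(step)
                    frontier.extend(inputs)
            return suspects

        print(trace_suspects("spectral_density", PLAN))   # ['fft_step', 'despike_step']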

  13. Implementing Information Assurance - Beyond Process

    DTIC Science & Technology

    2009-01-01

    disabled or properly configured. Tools and scripts are available to expedite the configuration process on some platforms. For example, approved Windows...in the System Security Plan (SSP) or Information Security Plan (ISP). Any PPSs not required for operation by the system must be disabled. This...Services must be disabled. Implementing an IM capability within the boundary carries many policy and documentation requirements. Username and passwords

  14. A framework for the computer-aided planning and optimisation of manufacturing processes for components with functional graded properties

    NASA Astrophysics Data System (ADS)

    Biermann, D.; Gausemeier, J.; Heim, H.-P.; Hess, S.; Petersen, M.; Ries, A.; Wagner, T.

    2014-05-01

    In this contribution a framework for the computer-aided planning and optimisation of functionally graded components is presented. The framework is divided into three modules - the "Component Description", the "Expert System" for the synthesis of several process chains, and the "Modelling and Process Chain Optimisation". The Component Description module enhances a standard computer-aided design (CAD) model with a voxel-based representation of the graded properties. The Expert System synthesises process steps stored in the knowledge base to generate several alternative process chains. Each process chain is capable of producing components according to the enhanced CAD model and usually consists of a sequence of heating, cooling, and forming processes. The dependencies between the component and the applied manufacturing processes, as well as between the processes themselves, need to be considered. The Expert System utilises an ontology for that purpose. The ontology represents all dependencies in a structured way and connects the information of the knowledge base via relations. The third module performs the evaluation of the generated process chains. To accomplish this, the parameters of each process are optimised with respect to the component specification, whereby the result of the best parameterisation is used as the representative value. Finally, the process chain which is capable of manufacturing a functionally graded component in an optimal way with regard to the property distributions of the component description is presented by means of a dedicated specification technique.
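
    As a toy illustration of the evaluation step (the framework's real models and optimiser are not described in this record), the Python sketch below "optimises" one parameter per candidate process chain against a target property value and reports the best chain; the chains and the linear property models are entirely fictitious.

        def best_process_chain(chains, target_hardness):
            """Pick the chain whose best parameter setting comes closest to the target property."""
            best = None
            for name, property_model in chains.items():
                # Crude 1-D parameter sweep standing in for per-chain optimisation
                error, param = min((abs(property_model(p) - target_hardness), p)
                                   for p in range(0, 101))
                if best is None or error < best[1]:
                    best = (name, error, param)
            return best

        chains = {
            "heat_then_form": lambda p: 300 + 2.0 * p,   # hardness as a function of parameter p
            "form_then_heat": lambda p: 280 + 2.5 * p,
        }
        print(best_process_chain(chains, target_hardness=450))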

  15. Central Data Processing System (CDPS) user's manual: Solar heating and cooling program

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The software and data base management system required to assess the performance of solar heating and cooling systems installed at multiple sites are presented. The instrumentation data associated with these systems is collected, processed, and presented in a form which supports continuity of performance evaluation across all applications. The CDPS consists of three major elements: a communication interface computer, a central data processing computer, and a performance evaluation data base. Users of the performance data base are identified, procedures for operation are given, and guidelines for software maintenance are outlined. The manual also defines the output capabilities of the CDPS in support of external users of the system.

  16. A Polymer Visualization System with Accurate Heating and Cooling Control and High-Speed Imaging

    PubMed Central

    Wong, Anson; Guo, Yanting; Park, Chul B.; Zhou, Nan Q.

    2015-01-01

    A visualization system to observe crystal and bubble formation in polymers under high temperature and pressure has been developed. Using this system, polymer can be subjected to a programmable thermal treatment to simulate the process in high pressure differential scanning calorimetry (HPDSC). With a high-temperature/high-pressure view-cell unit, this system enables in situ observation of crystal formation in semi-crystalline polymers to complement thermal analyses with HPDSC. The high-speed recording capability of the camera not only allows detailed recording of crystal formation, it also enables in situ capture of plastic foaming processes with a high temporal resolution. To demonstrate the system’s capability, crystal formation and foaming processes of polypropylene/carbon dioxide systems were examined. It was observed that crystals nucleated and grew into spherulites, and they grew at faster rates as temperature decreased. This observation agrees with the crystallinity measurement obtained with the HPDSC. Cell nucleation first occurred at crystals’ boundaries due to CO2 exclusion from crystal growth fronts. Subsequently, cells were nucleated around the existing ones due to tensile stresses generated in the constrained amorphous regions between networks of crystals. PMID:25915031

  17. Self-contained microfluidic systems: a review.

    PubMed

    Boyd-Moss, Mitchell; Baratchi, Sara; Di Venere, Martina; Khoshmanesh, Khashayar

    2016-08-16

    Microfluidic systems enable rapid diagnosis, screening and monitoring of diseases and health conditions using small amounts of biological samples and reagents. Despite these remarkable features, conventional microfluidic systems rely on bulky expensive external equipment, which hinders their utility as powerful analysis tools outside of research laboratories. 'Self-contained' microfluidic systems, which contain all necessary components to facilitate a complete assay, have been developed to address this limitation. In this review, we provide an in-depth overview of self-contained microfluidic systems. We categorise these systems based on their operating mechanisms into three major groups: passive, hand-powered and active. Several examples are provided to discuss the structure, capabilities and shortcomings of each group. In particular, we discuss the self-contained microfluidic systems enabled by active mechanisms, due to their unique capability for running multi-step and highly controllable diagnostic assays. Integration of self-contained microfluidic systems with the image acquisition and processing capabilities of smartphones, especially those equipped with accessory optical components, enables highly sensitive and quantitative assays, which are discussed. Finally, the future trends and possible solutions to expand the versatility of self-contained, stand-alone microfluidic platforms are outlined.

  18. A numerical investigation of a thermodielectric power generation system

    NASA Astrophysics Data System (ADS)

    Sklar, Akiva A.

    The performance of a novel micro-thermodielectric power generation system was investigated in order to determine if thermodielectric power generation can be practically employed and if its performance can compete with current portable power generation technologies. Thermodielectric power generation is a direct energy conversion technology that converts heat directly into high voltage direct current. It requires dielectric (i.e., capacitive) materials whose charge storing capabilities are a function of temperature. This property can be exploited by heating these materials after they are charged; as their temperature increases, their charge storage capability decreases, forcing them to eject a portion of their surface charge. This ejected charge can then be supplied to an appropriate electronic storage device. There are several advantages associated with thermodielectric energy conversion; first, it requires heat addition at temperatures that are low relative to conventional power generation, i.e., less than 600 K, and second, devices that utilize it have the potential for excellent power density and device reliability. The predominant disadvantage of using this power generation technique is that the device must operate in an unsteady manner; this can lead to substantial heat transfer losses that limit the device's thermal efficiency. The studied power generation system was designed so that the power generating components of the system (i.e., the thermodielectric materials) are integrated within a micro-scale heat exchange apparatus designed specifically to provide the thermodielectric materials with the unsteady heating and cooling necessary for efficient power generation. This apparatus is designed to utilize a liquid as a working fluid in order to maximize its heat transfer capabilities, minimize the size of the heat exchanger, and maximize the power density of the power generation system. The thermodielectric materials are operated through a power generation cycle that consists of four processes; the first process is a charging process, during which an electric field is applied to a thermodielectric material, causing it to acquire electrical charge on its surface (this process is analogous to the isentropic compression process of a Brayton cycle). The second process is a heating process in which the temperature of the dielectric material is increased via heat transfer from an external source. During this process, the thermodielectric material is forced to eject a portion of its surface charge because its charge storing capability decreases as the temperature increases; the ejected charge is intended for capture by external circuitry connected to the thermodielectric material, where it can be routed to an electrochemical storage device or an electromechanical device requiring high voltage direct current. The third process is a discharging process, during which the applied electric field is reduced to its initial strength (analogous to the isentropic expansion process of a Brayton cycle). The final process is a cooling process in which the temperature of the dielectric material is decreased via heat transfer to an external sink, returning it to its initial temperature. Previously, predicting the performance of a thermodielectric power generator was hindered by a poor understanding of the material's thermodynamic properties and the effect unsteady heat transfer losses have on system performance.
In order to improve predictive capabilities in this study, a thermodielectric equation of state was developed that relates the strength of the applied electric field, the amount of surface charge stored by the thermodielectric material, and its temperature. This state equation was then used to derive expressions for the material's thermodynamic states (internal energy, entropy), which were subsequently used to determine the optimum material properties for power generation. Next, a numerical simulation code was developed to determine the heat transfer capabilities of a micro-scale parallel plate heat recuperator (MPPHR), a device designed specifically to (a) provide the unsteady heating and cooling necessary for thermodielectric power generation and (b) minimize the unsteady heat transfer losses of the system. The simulation code was used to find the optimum heat transfer and heat recuperation regimes of the MPPHR. The previously derived thermodynamic equations that describe the behavior of the thermodielectric materials were then incorporated into the model for the walls of the parallel plate channel in the numerical simulation code, creating a tool capable of determining the thermodynamic performance of a micro-thermodielectric power generator (MTDPG), in terms of the thermal efficiency, percent Carnot efficiency, and energy/power density. A detailed parameterization of the MTDPG with the simulation code yielded the critical non-dimensional numbers that determine the relationship between the heat exchange/recuperation abilities of the flow and the power generation capabilities of the thermodielectric materials. These relationships were subsequently used to optimize the performance of an MTDPG with an operating temperature range of 300--500 K. The optimization predicted that the MTDPG could provide a thermal efficiency of 29.7 percent with the potential to reach 34 percent. These thermal efficiencies correspond to 74.2 and 85 percent of the Carnot efficiency, respectively. The power density of this MTDPG depends on the operating frequency and can exceed 1,000,000 W/m³.
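
    To make the cycle described above concrete, the following is a minimal illustrative sketch, not the state equation derived in the study: it assumes a parallel-plate thermodielectric capacitor whose permittivity falls linearly with temperature, and the symbols epsilon_r, alpha, and T_0 are assumptions introduced here for illustration.

      % Illustrative linear thermodielectric state equation (assumed form, not the dissertation's):
      \sigma(E, T) = \epsilon_0 \epsilon_r \left[ 1 - \alpha \, (T - T_0) \right] E
      % Surface charge ejected when the material is heated from T_1 to T_2 at constant applied field E:
      \Delta \sigma = \epsilon_0 \epsilon_r \, \alpha \, (T_2 - T_1) \, E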

  19. A methodology for evaluation of an interactive multispectral image processing system

    NASA Technical Reports Server (NTRS)

    Kovalick, William M.; Newcomer, Jeffrey A.; Wharton, Stephen W.

    1987-01-01

    Because of the considerable cost of an interactive multispectral image processing system, an evaluation of a prospective system should be performed to ascertain if it will be acceptable to the anticipated users. Evaluation of a developmental system indicated that the important system elements include documentation, user friendliness, image processing capabilities, and system services. The criteria and evaluation procedures for these elements are described herein. The following factors contributed to the success of the evaluation of the developmental system: (1) careful review of documentation prior to program development, (2) construction and testing of macromodules representing typical processing scenarios, (3) availability of other image processing systems for referral and verification, and (4) use of testing personnel with an applications perspective and experience with other systems. This evaluation was done in addition to and independently of program testing by the software developers of the system.

  20. An integrated autonomous rendezvous and docking system architecture using Centaur modern avionics

    NASA Technical Reports Server (NTRS)

    Nelson, Kurt

    1991-01-01

    The avionics system for the Centaur upper stage is in the process of being modernized with the current state-of-the-art in strapdown inertial guidance equipment. This equipment includes an integrated flight control processor with a ring laser gyro based inertial guidance system. This inertial navigation unit (INU) uses two MIL-STD-1750A processors and communicates over the MIL-STD-1553B data bus. Commands are translated into load activation through a Remote Control Unit (RCU) which incorporates the use of solid state relays. Also, a programmable data acquisition system replaces separate multiplexer and signal conditioning units. This modern avionics suite is currently being enhanced through independent research and development programs to provide autonomous rendezvous and docking capability using advanced cruise missile image processing technology and integrated GPS navigational aids. A system concept was developed to combine these technologies in order to achieve a fully autonomous rendezvous, docking, and autoland capability. The current system architecture and the evolution of this architecture using advanced modular avionics concepts being pursued for the National Launch System are discussed.

  1. Top-level modeling of an ALS system utilizing object-oriented techniques

    NASA Astrophysics Data System (ADS)

    Rodriguez, L. F.; Kang, S.; Ting, K. C.

    The possible configuration of an Advanced Life Support (ALS) System capable of supporting human life for long-term space missions continues to evolve as researchers investigate potential technologies and configurations. To facilitate the decision process, the development of acceptable, flexible, and dynamic mathematical computer modeling tools capable of system level analysis is desirable. Object-oriented techniques have been adopted to develop a dynamic top-level model of an ALS system. This approach has several advantages: among these, object-oriented abstractions of systems are inherently modular in architecture. Thus, models can initially be somewhat simplistic, while allowing for adjustments and improvements. In addition, by coding the model in Java, the model can be implemented via the World Wide Web, greatly encouraging the utilization of the model. Systems analysis is further enabled with the utilization of a readily available backend database containing information supporting the model. The subsystem models of the ALS system model include Crew, Biomass Production, Waste Processing and Resource Recovery, Food Processing and Nutrition, and the Interconnecting Space. Each subsystem model and an overall model have been developed. Presented here is the procedure utilized to develop the modeling tool, the vision of the modeling tool, and the current focus for each of the subsystem models.
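
    As a rough illustration of the modular, object-oriented structure described above (the actual model is coded in Java and backed by a database; the class names, method names, and exchange rates in this Python sketch are hypothetical), a top-level simulation loop might look like the following.

      from abc import ABC, abstractmethod

      class Subsystem(ABC):
          """Base class: each ALS subsystem exchanges resources through a shared buffer."""
          @abstractmethod
          def step(self, dt: float, buffer: dict) -> None:
              ...

      class Crew(Subsystem):
          def step(self, dt, buffer):
              # Crew consumes O2 and produces CO2 (illustrative rates only).
              buffer["o2"] -= 0.84 * dt
              buffer["co2"] = buffer.get("co2", 0.0) + 1.0 * dt

      class BiomassProduction(Subsystem):
          def step(self, dt, buffer):
              # Plants consume CO2 and regenerate O2 (illustrative rates only).
              used = min(buffer.get("co2", 0.0), 0.9 * dt)
              buffer["co2"] -= used
              buffer["o2"] += used

      class ALSSystemModel:
          """Top-level model: steps every subsystem against a shared resource buffer."""
          def __init__(self, subsystems, buffer):
              self.subsystems = subsystems
              self.buffer = buffer

          def run(self, duration, dt=1.0):
              t = 0.0
              while t < duration:
                  for subsystem in self.subsystems:
                      subsystem.step(dt, self.buffer)
                  t += dt
              return self.buffer

      model = ALSSystemModel([Crew(), BiomassProduction()], {"o2": 100.0, "co2": 0.0})
      print(model.run(duration=24.0))

    Because each subsystem only sees the shared buffer, a subsystem model can be refined or replaced without touching the top-level loop, which is the modularity argument made in the abstract.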

  2. Advances and Computational Tools towards Predictable Design in Biological Engineering

    PubMed Central

    2014-01-01

    The design process of complex systems in all the fields of engineering requires a set of quantitatively characterized components and a method to predict the output of systems composed of such elements. This strategy relies on the modularity of the components used or on the prediction of their context-dependent behaviour, when the functioning of parts depends on the specific context. Mathematical models usually support the whole process by guiding the selection of parts and by predicting the output of interconnected systems. Such a bottom-up design process cannot be trivially adopted for biological systems engineering, since the function of parts is hard to predict when components are reused in different contexts. This issue and the intrinsic complexity of living systems limit the capability of synthetic biologists to predict the quantitative behaviour of biological systems. The high potential of synthetic biology strongly depends on the capability of mastering this issue. This review discusses the predictability issues of basic biological parts (promoters, ribosome binding sites, coding sequences, transcriptional terminators, and plasmids) when used to engineer simple and complex gene expression systems in Escherichia coli. A comparison between bottom-up and trial-and-error approaches is performed for all the discussed elements, and mathematical models supporting the prediction of part behaviour are illustrated. PMID:25161694

  3. Structural equation modeling and natural systems

    USGS Publications Warehouse

    Grace, James B.

    2006-01-01

    This book, first published in 2006, presents an introduction to the methodology of structural equation modeling, illustrates its use, and goes on to argue that it has revolutionary implications for the study of natural systems. A major theme of this book is that we have, up to this point, attempted to study systems primarily using methods (such as the univariate model) that were designed only for considering individual processes. Understanding systems requires the capacity to examine simultaneous influences and responses. Structural equation modeling (SEM) has such capabilities. It also possesses many other traits that add strength to its utility as a means of making scientific progress. In light of the capabilities of SEM, it can be argued that much of ecological theory is currently locked in an immature state that impairs its relevance. It is further argued that the principles of SEM are capable of leading to the development and evaluation of multivariate theories of the sort vitally needed for the conservation of natural systems.

  4. Portable data collection terminal in the automated power consumption measurement system

    NASA Astrophysics Data System (ADS)

    Vologdin, S. V.; Shushkov, I. D.; Bysygin, E. K.

    2018-01-01

    With the aim of increasing efficiency, automating the collection and processing of electric energy data is very important at present. The high cost of classic electric energy billing systems prevents their mass application. The Udmurtenergo Branch of IDGC of Center and Volga Region developed an electronic automated system called “Mobile Energy Billing” based on data collection terminals. The system combines electronic components built on a service-oriented architecture using WCF services. At present, all parts of the Udmurtenergo Branch electric network are connected to the “Mobile Energy Billing” project, and system capabilities can be expanded thanks to its flexible architecture.

  5. NASA information sciences and human factors program

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The Data Systems Program consists of research and technology devoted to controlling, processing, storing, manipulating, and analyzing space-derived data. The objectives of the program are to provide the technology advancements needed to enable affordable utilization of space-derived data, to substantially increase the on-board processing and recording capability for future missions, and to provide high-speed, high-volume computational systems that are anticipated for missions such as the evolutionary Space Station and Earth Observing System.

  6. Using location tracking data to assess efficiency in established clinical workflows.

    PubMed

    Meyer, Mark; Fairbrother, Pamela; Egan, Marie; Chueh, Henry; Sandberg, Warren S

    2006-01-01

    Location tracking systems are becoming more prevalent in clinical settings, yet applications are still not common. We have designed a system to aid in the assessment of clinical workflow efficiency. Location data is captured from active RFID tags and processed into a usable form. These data are stored and presented visually with trending capability over time. The system allows quick assessments of the impact of process changes on workflow, and isolates areas for improvement.
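
    A minimal sketch of the kind of post-processing such a system might perform is shown below: turning raw, timestamped RFID reads into per-location dwell times that can then be trended over time. The event format, tag IDs, and location names are assumptions for illustration, not the authors' data model.

      from collections import defaultdict
      from datetime import datetime

      # Hypothetical raw reads: (tag_id, location, timestamp), already sorted by time.
      reads = [
          ("tag-17", "waiting_room", datetime(2006, 3, 1, 8, 0)),
          ("tag-17", "exam_room_2", datetime(2006, 3, 1, 8, 25)),
          ("tag-17", "discharge",   datetime(2006, 3, 1, 9, 10)),
      ]

      def dwell_times(events):
          """Compute minutes spent at each location per tag from consecutive reads."""
          totals = defaultdict(float)
          by_tag = defaultdict(list)
          for tag, loc, ts in events:
              by_tag[tag].append((loc, ts))
          for tag, track in by_tag.items():
              for (loc, start), (_, end) in zip(track, track[1:]):
                  totals[(tag, loc)] += (end - start).total_seconds() / 60.0
          return dict(totals)

      print(dwell_times(reads))
      # {('tag-17', 'waiting_room'): 25.0, ('tag-17', 'exam_room_2'): 45.0}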

  7. Laboratory process control using natural language commands from a personal computer

    NASA Technical Reports Server (NTRS)

    Will, Herbert A.; Mackin, Michael A.

    1989-01-01

    PC software is described which provides flexible natural language process control capability with an IBM PC or compatible machine. Hardware requirements include the PC, and suitable hardware interfaces to all controlled devices. Software required includes the Microsoft Disk Operating System (MS-DOS) operating system, a PC-based FORTRAN-77 compiler, and user-written device drivers. Instructions for use of the software are given as well as a description of an application of the system.

  8. Utilizing New Audiovisual Resources

    ERIC Educational Resources Information Center

    Miller, Glen

    1975-01-01

    The University of Arizona's Agriculture Department has found that video cassette systems and 8 mm films are excellent audiovisual aids to classroom instruction at the high school level in small gasoline engines. Each system is capable of improving the instructional process for motor skill development. (MW)

  9. Improving transportation systems management and operations (TSM&O), capability maturity model workshop white paper : business processes.

    DOT National Transportation Integrated Search

    2015-04-01

    Research done through the Second Strategic Highway Research Program (SHRP 2) determined that agencies with the most effective transportation systems management and operations (TSM&O) activities were differentiated not by budgets or technical skills a...

  10. Conceptual models of information processing

    NASA Technical Reports Server (NTRS)

    Stewart, L. J.

    1983-01-01

    The conceptual information processing issues are examined. Human information processing is defined as an active cognitive process that is analogous to a system. It is the flow and transformation of information within a human. The human is viewed as an active information seeker who is constantly receiving, processing, and acting upon the surrounding environmental stimuli. Human information processing models are conceptual representations of cognitive behaviors. Models of information processing are useful in representing the different theoretical positions and in attempting to define the limits and capabilities of human memory. It is concluded that an understanding of conceptual human information processing models and their applications to systems design leads to a better human factors approach.

  11. SSE software test management STM capability: Using STM in the Ground Systems Development Environment (GSDE)

    NASA Technical Reports Server (NTRS)

    Church, Victor E.; Long, D.; Hartenstein, Ray; Perez-Davila, Alfredo

    1992-01-01

    This report is one of a series discussing configuration management (CM) topics for Space Station ground systems software development. It provides a description of the Software Support Environment (SSE)-developed Software Test Management (STM) capability, and discusses the possible use of this capability for management of developed software during testing performed on target platforms. This is intended to supplement the formal documentation of STM provided by the SSE Project. How STM can be used to integrate contractor CM and formal CM for software before delivery to operations is described. STM provides a level of control that is flexible enough to support integration and debugging, but sufficiently rigorous to ensure the integrity of the testing process.

  12. A Hybrid Robotic Control System Using Neuroblastoma Cultures

    NASA Astrophysics Data System (ADS)

    Ferrández, J. M.; Lorente, V.; Cuadra, J. M.; Delapaz, F.; Álvarez-Sánchez, José Ramón; Fernández, E.

    The main objective of this work is to analyze the computing capabilities of cultured human neuroblastoma cells and to define connection schemes for controlling robot behavior. Multielectrode Array (MEA) setups have been designed for directly culturing neural cells over silicon or glass substrates, providing the capability to simultaneously stimulate and record populations of neural cells. This paper describes the process of growing human neuroblastoma cells over MEA substrates and attempts to modulate the natural physiologic responses of these cells by tetanic stimulation of the culture. We show that the large neuroblastoma networks developed in cultured MEAs are capable of learning: establishing numerous and dynamic connections, with modifiability induced by external stimuli. We propose a hybrid system for controlling a robot to avoid obstacles.

  13. On adaptive robustness approach to Anti-Jam signal processing

    NASA Astrophysics Data System (ADS)

    Poberezhskiy, Y. S.; Poberezhskiy, G. Y.

    An effective approach to exploiting statistical differences between desired and jamming signals named adaptive robustness is proposed and analyzed in this paper. It combines conventional Bayesian, adaptive, and robust approaches that are complementary to each other. This combining strengthens the advantages and mitigates the drawbacks of the conventional approaches. Adaptive robustness is equally applicable to both jammers and their victim systems. The capabilities required for realization of adaptive robustness in jammers and victim systems are determined. The employment of a specific nonlinear robust algorithm for anti-jam (AJ) processing is described and analyzed. Its effectiveness in practical situations has been proven analytically and confirmed by simulation. Since adaptive robustness can be used by both sides in electronic warfare, it is more advantageous for the fastest and most intelligent side. Many results obtained and discussed in this paper are also applicable to commercial applications such as communications in unregulated or poorly regulated frequency ranges and systems with cognitive capabilities.

  14. Custom FPGA processing for real-time fetal ECG extraction and identification.

    PubMed

    Torti, E; Koliopoulos, D; Matraxia, M; Danese, G; Leporati, F

    2017-01-01

    Monitoring the fetal cardiac activity during pregnancy is of crucial importance for evaluating fetal health. However, there is a lack of automatic and reliable methods for Fetal ECG (FECG) monitoring that can perform this processing in real time. In this paper, we present a hardware architecture, implemented on the Altera Stratix V FPGA, capable of separating the FECG from the maternal ECG and correctly identifying it. We evaluated our system using both synthetic and real tracks acquired from patients beyond the 20th pregnancy week. This work is part of a project aiming at developing a portable system for continuous real-time FECG monitoring. Its characteristics of reduced power consumption, real-time processing capability and reduced size make it suitable to be embedded in the overall system, which, to the best of our knowledge, is the first proposed system exploiting Blind Source Separation with this technology.
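
    The separation principle exploited here (Blind Source Separation) can be illustrated offline with a standard ICA routine. The sketch below is not the paper's FPGA design; the synthetic signals, sampling rate, and mixing matrix are assumptions made purely for illustration.

      import numpy as np
      from sklearn.decomposition import FastICA

      fs = 500  # Hz, assumed sampling rate
      t = np.arange(0, 10, 1 / fs)
      maternal = np.sin(2 * np.pi * 1.2 * t)        # ~72 bpm surrogate waveform
      fetal = 0.2 * np.sin(2 * np.pi * 2.3 * t)     # ~138 bpm, much weaker
      noise = 0.05 * np.random.randn(2, t.size)

      # Two abdominal leads modeled as different linear mixtures of the two sources.
      mixing = np.array([[1.0, 0.6], [0.8, 1.0]])
      leads = mixing @ np.vstack([maternal, fetal]) + noise

      # Blind Source Separation: recover maternal and fetal components without
      # knowing the mixing matrix.
      ica = FastICA(n_components=2, random_state=0)
      sources = ica.fit_transform(leads.T)  # columns: estimated source signals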

  15. Real-Time Mapping alert system; characteristics and capabilities

    USGS Publications Warehouse

    Torres, L.A.; Lambert, S.C.; Liebermann, T.D.

    1995-01-01

    The U.S. Geological Survey has an extensive hydrologic network that records and transmits precipitation, stage, discharge, and other water-related data on a real-time basis to an automated data processing system. Data values are recorded on electronic data collection platforms at field sampling sites. These values are transmitted by means of orbiting satellites to receiving ground stations, and by way of telecommunication lines to a U.S. Geological Survey office where they are processed on a computer system. Data that exceed predefined thresholds are identified as alert values. The current alert status at monitoring sites within a state or region is of critical importance during floods, hurricanes, and other extreme hydrologic events. This report describes the characteristics and capabilities of a series of computer programs for real-time mapping of hydrologic data. The software provides interactive graphics display and query of hydrologic information from the network in a real-time, map-based, menu-driven environment.
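
    A minimal sketch of the alert-flagging step described above, assuming hypothetical site identifiers, stage values, and thresholds:

      # Hypothetical per-site alert thresholds (e.g., stage in feet).
      thresholds = {"site_01": 12.5, "site_02": 8.0}

      def flag_alerts(observations, thresholds):
          """Return the real-time values that exceed their site's predefined threshold."""
          return {
              site: value
              for site, value in observations.items()
              if site in thresholds and value > thresholds[site]
          }

      latest = {"site_01": 13.1, "site_02": 6.4}
      print(flag_alerts(latest, thresholds))  # {'site_01': 13.1}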

  16. MISSE in the Materials and Processes Technical Information System (MAPTIS )

    NASA Technical Reports Server (NTRS)

    Burns, DeWitt; Finckenor, Miria; Henrie, Ben

    2013-01-01

    Materials International Space Station Experiment (MISSE) data is now being collected and distributed through the Materials and Processes Technical Information System (MAPTIS) at Marshall Space Flight Center in Huntsville, Alabama. MISSE data has been instrumental in many programs and continues to be an important source of data for the space community. To facilitate greater access to the MISSE data, the International Space Station (ISS) program office and MAPTIS are working to gather this data into a central location. The MISSE database contains information about materials, samples, and flights along with pictures, PDFs, Excel files, Word documents, and other file types. Major capabilities of the system are: access control, browsing, searching, reports, and record comparison. The search capabilities will search within any searchable files, so data can still be retrieved even if the desired metadata has not been associated. Other functionality will continue to be added to the MISSE database as the Athena Platform is expanded.

  17. Space Station tethered waste disposal

    NASA Technical Reports Server (NTRS)

    Rupp, Charles C.

    1988-01-01

    The Shuttle Transportation System (STS) launches more payload to the Space Station than can be returned, creating an accumulation of waste. Several methods of deorbiting the waste are compared, including an OMV, solid rocket motors, and a tether system. The use of tethers is shown to offer the unique potential of a net savings in STS launch requirements. Tether technology is being developed which can satisfy the deorbit requirements, but additional effort is required in waste processing, packaging, and container design. The first step in developing this capability is already underway in the Small Expendable Deployer System program. A developmental flight test of a tether-initiated recovery system is seen as the second step in the evolution of this capability.

  18. Conceptual design for aerospace vehicles

    NASA Technical Reports Server (NTRS)

    Gratzer, Louis B.

    1989-01-01

    The designers of aircraft and, more recently, aerospace vehicles have always struggled with the problems of evolving their designs to produce a machine which would perform its assigned task(s) in some optimum fashion. Almost invariably this involved dealing with more variables and constraints than could be handled in any computationally feasible way. With the advent of the electronic digital computer, the possibilities for introducing more variables and constraints into the initial design process led to greater expectations for improvement in vehicle (system) efficiency. The creation of the large scale systems necessary to achieve optimum designs has, for many reasons, proved to be difficult. From a technical standpoint, significant problems arise in the development of satisfactory algorithms for processing of data from the various technical disciplines in a way that would be compatible with the complex optimization function. Also, the creation of effective optimization routines for multi-variable and constraint situations which could lead to consistent results has lagged. The current capability for carrying out the conceptual design of an aircraft on an interdisciplinary basis was evaluated to determine the need for extending this capability, and if necessary, to recommend means by which this could be carried out. Based on a review of available documentation and individual consultations, it appears that there is extensive interest at Langley Research Center as well as in the aerospace community in providing a higher level of capability that meets the technical challenges. By implication, the current design capability is inadequate and it does not operate in a way that allows the various technical disciplines to participate and cooperatively interact in the design process. Based on this assessment, it was concluded that substantial effort should be devoted to developing a computer-based conceptual design system that would provide the capability needed for the near-term as well as a framework for the development of more advanced methods to serve future needs.

  19. Polysilicon planarization and plug recess etching in a decoupled plasma source chamber using two endpoint techniques

    NASA Astrophysics Data System (ADS)

    Kaplita, George A.; Schmitz, Stefan; Ranade, Rajiv; Mathad, Gangadhara S.

    1999-09-01

    The planarization and recessing of polysilicon to form a plug are processes of increasing importance in silicon IC fabrication. While this technology has been developed and applied to DRAM technology using Trench Storage Capacitors, the need for such processes in other IC applications (i.e. polysilicon studs) has increased. Both planarization and recess processes usually have stringent requirements on etch rate, recess uniformity, and selectivity to underlying films. Additionally, both processes generally must be isotropic, yet must not expand any seams that might be present in the polysilicon fill. These processes should also be insensitive to changes in exposed silicon area (pattern factor) on the wafer. An SF6 plasma process in a polysilicon DPS (Decoupled Plasma Source) reactor has demonstrated the capability of achieving the above process requirements for both planarization and recess etch. The SF6 process in the decoupled plasma source reactor exhibited less sensitivity to pattern factor than in other types of reactors. Control of these planarization and recess processes requires two endpoint systems to work sequentially in the same recipe: one for monitoring the endpoint when blanket polysilicon (100% Si loading) is being planarized and one for monitoring the recess depth while the plug is being recessed (less than 10% Si loading). The planarization process employs an optical emission endpoint system (OES). An interferometric endpoint system (IEP), capable of monitoring lateral interference, is used for determining the recess depth. The ability to use either or both systems is required to make these plug processes manufacturable. Measuring the recess depth resulting from the recess process can be difficult, costly and time-consuming. An Atomic Force Microscope (AFM) can greatly alleviate these problems and can serve as a critical tool in the development of recess processes.

  20. Engineering study for the functional design of a multiprocessor system

    NASA Technical Reports Server (NTRS)

    Miller, J. S.; Vandever, W. H.; Stanten, S. F.; Avakian, A. E.; Kosmala, A. L.

    1972-01-01

    The results are presented of a study to generate a functional system design of a multiprocessing computer system capable of satisfying the computational requirements of a space station. These data management system requirements were specified to include: (1) real time control, (2) data processing and storage, (3) data retrieval, and (4) remote terminal servicing.

  1. The Alaska experience

    NASA Technical Reports Server (NTRS)

    Mutter, D. L.

    1981-01-01

    The management responsibilities of the Alaska Department of Natural Resources are summarized and the establishment of a geoprocessor system is described. Specific capabilities were defined based on surveys of potential users and pre-existing systems. The procurement process, the initially purchased equipment, and system upgrading are described. Cost, installation and maintenance, site location, training, and staffing of the system are examined.

  2. The Alaska experience

    NASA Astrophysics Data System (ADS)

    Mutter, D. L.

    1981-09-01

    The management responsibilities of the Alaska Department of Natural Resources are summarized and the establishment of a geoprocessor system is described. Specific capabilities were defined based on surveys of potential users and pre-existing systems. The procurement process, the initially purchased equipment, and system upgrading are described. Cost, installation and maintenance, site location, training, and staffing of the system are examined.

  3. Education Technology Policy for a 21st Century Learning System. Policy Brief 13-3

    ERIC Educational Resources Information Center

    Kerchner, Charles Taylor

    2013-01-01

    Internet-related technology has the capacity to change the learning production system in three important ways. First, it creates the capacity to move from the existing batch processing system of teaching and learning to a much more individualized learning system capable of matching instructional style and pace to a student's needs. Second,…

  4. Software System Architecture Modeling Methodology for Naval Gun Weapon Systems

    DTIC Science & Technology

    2010-12-01

    Weapon System HAR Hazard Action Report HERO Hazards of Electromagnetic Radiation to Ordnance IOC Initial Operational Capability... radiation to ordnance ; and combinations therein. Equipment, systems, or procedures and processes whose malfunction would hazard the safe manufacturing...NDI Non-Development Item OPEVAL Operational Evaluation ORDALTS Ordnance Alterations O&SHA Operating and Support Hazard Analysis PDA

  5. Washington Community Colleges Factbook. Addendum B: A Description of the Community College Management Information System.

    ERIC Educational Resources Information Center

    Meier, Terre; Bundy, Larry

    The Management Information System (MIS) of the Washington State system of community colleges was designed to be responsive to legislative and district requests for information and to enhance the State Board's capabilities to manage the community college system and integrate its budgeting and planning processes. The MIS consists of seven…

  6. Optical information processing for NASA's space exploration

    NASA Technical Reports Server (NTRS)

    Chao, Tien-Hsin; Ochoa, Ellen; Juday, Richard

    1990-01-01

    The development status of optical processing techniques under development at NASA-JPL, NASA-Ames, and NASA-Johnson, is evaluated with a view to their potential applications in future NASA planetary exploration missions. It is projected that such optical processing systems can yield major reductions in mass, volume, and power requirements relative to exclusively electronic systems of comparable processing capabilities. Attention is given to high-order neural networks for distortion-invariant classification and pattern recognition, multispectral imaging using an acoustooptic tunable filter, and an optical matrix processor for control problems.

  7. QuickStrike ASOC Battlefield Simulation: Preparing the War Fighter to Win

    NASA Technical Reports Server (NTRS)

    Jones, Richard L.

    2010-01-01

    The QuickStrike ASOC (Air Support Operations Center) Battlefield Simulation fills a crucial gap in USAF and United Kingdom Close Air Support (CAS) and airspace manager training. The system now provides six squadrons with the capability to conduct total-mission training events whenever the personnel and time are available. When the 111th ASOC returned from their first deployment to Afghanistan, they realized the training available prior to deployment was inadequate. They sought an organic training capability focused on the ASOC mission that was low cost, simple to use, adaptable, and available now. Using a commercial off-the-shelf simulation, they developed a complete training system by adapting the simulation to their training needs. Through more than two years of spiral development, incorporating lessons learned, the system has matured, and can now realistically replicate the Tactical Operations Center (TOC) in Kabul, Afghanistan, the TOC supporting the mission in Iraq, or can expand to support a major conflict scenario. The training system provides a collaborative workspace for the training audience and exercise control group via integrated software and workstations that can easily adapt to new mission requirements and TOC configurations. The system continues to mature. Based on inputs from the war fighter, new capabilities have been incorporated to add realism and simplify the scenario development process. The QuickStrike simulation can now import TBMCS Air Tasking Order air mission data and can provide air and ground tracks to a common operating picture, presented through either C2PC or JADOCS. This organic capability to practice team processes and tasks and to conduct mission rehearsals proved its value in the 111th ASOS's next deployment. The ease of scenario development and the simple-to-learn, intuitive, game-like interface enable the squadrons to develop and share scenarios incorporating lessons learned from every deployment. These war fighters have now filled the training gap and have the capability they need to train to win.

  8. Design requirements for operational earth resources ground data processing

    NASA Technical Reports Server (NTRS)

    Baldwin, C. J.; Bradford, L. H.; Burnett, E. S.; Hutson, D. E.; Kinsler, B. A.; Kugle, D. R.; Webber, D. S.

    1972-01-01

    Realistic tradeoff data and evaluation techniques were studied that permit conceptual design of operational earth resources ground processing systems. Methodology for determining user requirements that utilize the limited information available from users is presented along with definitions of sensor capabilities projected into the shuttle/station era. A tentative method is presented for synthesizing candidate ground processing concepts.

  9. Multimission image processing and science data visualization

    NASA Technical Reports Server (NTRS)

    Green, William B.

    1993-01-01

    The Operational Science Analysis (OSA) functional area supports science instrument data display, analysis, visualization and photo processing in support of flight operations of planetary spacecraft managed by the Jet Propulsion Laboratory (JPL). This paper describes the data products generated by the OSA functional area, and the current computer system used to generate these data products. The objectives of a system upgrade now in process are described. The design approach to development of the new system is reviewed, including the use of the Unix operating system and X-Window display standards to provide platform independence, portability, and modularity within the new system. The new system should provide a modular and scalable capability supporting a variety of future missions at JPL.

  10. Systems Security Engineering Capability Maturity Model SSE-CMM Model Description Document

    DTIC Science & Technology

    1999-04-01

    management is the process of assessing and quantifying risk, and establishing an acceptable level of risk for the organization. Managing risk is an... Process of assessing and quantifying risk and establishing an acceptable level of risk for the organization. [IEEE 13335-1:1996] Security Engineering

  11. HIPE, HIPE, Hooray!

    NASA Astrophysics Data System (ADS)

    Ott, S.

    2011-07-01

    (On behalf of all contributors to the Herschel mission) The Herschel Space Observatory, the fourth cornerstone mission in the ESA science program, was launched on 14 May 2009. With a 3.5 m telescope, it is the largest space telescope ever launched. Herschel's three instruments (HIFI, PACS, and SPIRE) perform photometry and spectroscopy in the 55-671 micron range and will deliver exciting science for the astronomical community during at least three years of routine observations. Since October 2009, Herschel has been performing and processing observations in routine science mode. The development of the Herschel Data Processing System (HIPE) started nine years ago to support the data analysis for Instrument Level Tests. To fulfil the expectations of the astronomical community, additional resources were made available to implement a freely distributable Data Processing System capable of interactively and automatically reducing Herschel data at different processing levels. The system combines data retrieval, pipeline execution, data quality checking and scientific analysis in one single environment. HIPE is the user-friendly face of Herschel interactive Data Processing. The software is coded in Java and Jython to be platform independent and to avoid the need for commercial licenses. It is distributed under the GNU Lesser General Public License (LGPL), permitting everyone to access and to re-use its code. We will summarise the current capabilities of the Herschel Data Processing system, highlight how the Herschel Data Processing system supported the Herschel observatory in meeting the challenges of this large project, give an overview of future development milestones and plans, and explain how the astronomical community can contribute to HIPE.

  12. NASA research in aircraft propulsion

    NASA Technical Reports Server (NTRS)

    Beheim, M. A.

    1982-01-01

    A broad overview of the scope of research presently being supported by NASA in aircraft propulsion is presented with emphasis on Lewis Research Center activities related to civil air transports, CTOL and V/STOL systems. Aircraft systems work is performed to identify the requirements for the propulsion system that enhance the mission capabilities of the aircraft. This important source of innovation and creativity drives the direction of propulsion research. In a companion effort, component research of a generic nature is performed to provide a better basis for design and provides an evolutionary process for technological growth that increases the capabilities of all types of aircraft. Both are important.

  13. Automated simulation as part of a design workstation

    NASA Technical Reports Server (NTRS)

    Cantwell, Elizabeth; Shenk, T.; Robinson, P.; Upadhye, R.

    1990-01-01

    A development project for a design workstation for advanced life-support systems (called the DAWN Project, for Design Assistant Workstation), incorporating qualitative simulation, required the implementation of a useful qualitative simulation capability and the integration of qualitative and quantitative simulation such that simulation capabilities are maximized without duplication. The reason is that to produce design solutions to a system goal, the behavior of the system in both a steady and perturbed state must be represented. The discussion focuses on the Qualitative Simulation Tool (QST), on an expert-system-like model building and simulation interface tool called ScratchPad (SP), and on the integration of QST and SP with more conventional, commercially available simulation packages now being applied in the evaluation of life-support system processes and components.

  14. Data Processing for NASA's TDRSS DAMA Channel

    NASA Technical Reports Server (NTRS)

    Long, Christopher C.; Horan, Stephen

    1996-01-01

    A concept for the addition of a Demand Assignment Multiple Access (DAMA) service to NASA's current Space Network (SN) is developed. Specifically, the design of a receiver for the DAMA channel is outlined. The procedures used to process received service requests are also presented. The modifications to the SN are minimal. The post-reception processing is accomplished using standard commercial off-the-shelf (COTS) packages. The result is a random access system capable of receiving requests for service.

  15. NeuroSeek dual-color image processing infrared focal plane array

    NASA Astrophysics Data System (ADS)

    McCarley, Paul L.; Massie, Mark A.; Baxter, Christopher R.; Huynh, Buu L.

    1998-09-01

    Several technologies have been developed in recent years to advance the state of the art of IR sensor systems including dual color affordable focal planes, on-focal plane array biologically inspired image and signal processing techniques and spectral sensing techniques. Pacific Advanced Technology (PAT) and the Air Force Research Lab Munitions Directorate have developed a system which incorporates the best of these capabilities into a single device. The 'NeuroSeek' device integrates these technologies into an IR focal plane array (FPA) which combines multicolor Midwave IR/Longwave IR radiometric response with on-focal plane 'smart' neuromorphic analog image processing. The readout and processing integrated circuit very large scale integration chip which was developed under this effort will be hybridized to a dual color detector array to produce the NeuroSeek FPA, which will have the capability to fuse multiple pixel-based sensor inputs directly on the focal plane. Great advantages are afforded by application of massively parallel processing algorithms to image data in the analog domain; the high speed and low power consumption of this device mimic operations performed in the human retina.

  16. Challenges and Successes Managing Airborne Science Data for CARVE

    NASA Astrophysics Data System (ADS)

    Hardman, S. H.; Dinardo, S. J.; Lee, E. C.

    2014-12-01

    The Carbon in Arctic Reservoirs Vulnerability Experiment (CARVE) mission collects detailed measurements of important greenhouse gases on local to regional scales in the Alaskan Arctic and demonstrates new remote sensing and improved modeling capabilities to quantify Arctic carbon fluxes and carbon cycle-climate processes. Airborne missions offer a number of challenges when it comes to collecting and processing the science data, and CARVE is no different. The biggest challenge relates to the flexibility of the instrument payload. Within the life of the mission, instruments may be removed from or added to the payload, or even reconfigured on a yearly, monthly or daily basis. Although modification of the instrument payload provides a distinct advantage for airborne missions compared to spaceborne missions, it does tend to wreak havoc on the underlying data system when introducing changes to existing data inputs or new data inputs that require modifications to the pipeline for processing the data. In addition to payload flexibility, it is not uncommon to find unsupported files in the field data submission. In the case of CARVE, these include video files, photographs taken during the flight and screen shots from terminal displays. These need to be captured, saved, and somehow integrated into the data system. The CARVE data system was built on a multi-mission data system infrastructure for airborne instruments called the Airborne Cloud Computing Environment (ACCE). ACCE encompasses the end-to-end lifecycle covering planning, provisioning of data system capabilities, and support for scientific analysis in order to improve the quality, cost effectiveness, and capabilities to enable new scientific discovery and research in earth observation. This well-tested and proven infrastructure allows the CARVE data system to be easily adapted in order to handle the challenges posed by the CARVE mission and to successfully process, manage and distribute the mission's science data. This research was carried out at the Jet Propulsion Laboratory, California Institute of Technology, under a contract with the National Aeronautics and Space Administration.

  17. Use of artificial intelligence in analytical systems for the clinical laboratory

    PubMed Central

    Truchaud, Alain; Ozawa, Kyoichi; Pardue, Harry; Schnipelsky, Paul

    1995-01-01

    The incorporation of information-processing technology into analytical systems in the form of standard computing software has recently been advanced by the introduction of artificial intelligence (AI), both as expert systems and as neural networks. This paper considers the role of software in system operation, control and automation, and attempts to define intelligence. AI is characterized by its ability to deal with incomplete and imprecise information and to accumulate knowledge. Expert systems, building on standard computing techniques, depend heavily on the domain experts and knowledge engineers that have programmed them to represent the real world. Neural networks are intended to emulate the pattern-recognition and parallel processing capabilities of the human brain and are taught rather than programmed. The future may lie in a combination of the recognition ability of the neural network and the rationalization capability of the expert system. In the second part of the paper, examples are given of applications of AI in stand-alone systems for knowledge engineering and medical diagnosis and in embedded systems for failure detection, image analysis, user interfacing, natural language processing, robotics and machine learning, as related to clinical laboratories. It is concluded that AI constitutes a collective form of intellectual property, and that there is a need for better documentation, evaluation and regulation of the systems already being used in clinical laboratories. PMID:18924784

  18. Industrial implementation of spatial variability control by real-time SPC

    NASA Astrophysics Data System (ADS)

    Roule, O.; Pasqualini, F.; Borde, M.

    2016-10-01

    Advanced technology nodes require more and more information to get the wafer process well set up. The critical dimension of components decreases following Moore's law; at the same time, the intra-wafer dispersion linked to the spatial non-uniformity of tool processes does not decrease in the same proportion. APC (Advanced Process Control) systems are being developed in the waferfab to automatically adjust and tune wafer processing based on extensive process context information. They can generate and monitor complex intra-wafer process profile corrections between different process steps. This leads us to put the spatial variability under control in real time with our SPC (Statistical Process Control) system. This paper outlines the architecture of an integrated process control system for shape monitoring in 3D, implemented in the waferfab.
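
    As a rough sketch of how spatial variability can be monitored by SPC, the snippet below charts the within-wafer dispersion with a simplified individuals-chart rule. The measurement values, units, and 3-sigma limits are illustrative assumptions, not the integrated system described in the paper.

      import statistics

      def within_wafer_sigma(site_measurements):
          """Spatial variability of one wafer = standard deviation across its measurement sites."""
          return statistics.stdev(site_measurements)

      def control_limits(history_sigmas, k=3.0):
          """Simplified individuals-chart limits on the per-wafer sigma statistic."""
          center = statistics.mean(history_sigmas)
          spread = statistics.stdev(history_sigmas)
          return center - k * spread, center, center + k * spread

      history = [0.82, 0.79, 0.85, 0.80, 0.83, 0.81]   # past wafers' within-wafer sigmas (nm)
      lcl, cl, ucl = control_limits(history)

      new_wafer_sites = [50.1, 50.9, 49.2, 51.0, 48.8]  # site CDs of the newest wafer (nm)
      sigma = within_wafer_sigma(new_wafer_sites)
      print("out of control" if not (lcl <= sigma <= ucl) else "in control", round(sigma, 2))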

  19. NASA's Earth Observing System Data and Information System - EOSDIS

    NASA Technical Reports Server (NTRS)

    Ramapriyan, Hampapuram K.

    2011-01-01

    This slide presentation reviews the work of NASA's Earth Observing System Data and Information System (EOSDIS), a petabyte-scale archive of environmental data that supports global climate change research. The Earth Science Data Systems provide end-to-end capabilities to deliver data and information products to users in support of understanding the Earth system. The presentation contains photographs from space of recent events (e.g., the effects of the tsunami in Japan and the wildfires in Australia). It also includes details of the Data Centers that provide the data to EOSDIS and Science Investigator-led Processing Systems. Information about the Land, Atmosphere Near-real-time Capability for EOS (LANCE) and some of the uses that the system has made possible is reviewed. Also included is information about how to access the data, and evolutionary plans for the future of the system.

  20. Weather data dissemination to aircraft

    NASA Technical Reports Server (NTRS)

    Mcfarland, Richard H.; Parker, Craig B.

    1990-01-01

    Documentation exists that shows weather to be responsible for approximately 40 percent of all general aviation accidents with fatalities. Weather data products available on the ground are becoming more sophisticated and greater in number. Although many of these data are critical to aircraft safety, they currently must be transmitted verbally to the aircraft. This process is labor intensive and provides a low rate of information transfer. Consequently, the pilot is often forced to make life-critical decisions based on incomplete and outdated information. Automated transmission of weather data from the ground to the aircraft can provide the aircrew with accurate data in near-real time. The current National Airspace System Plan calls for such an uplink capability to be provided by the Mode S Beacon System data link. Although this system has a very advanced data link capability, it will not be capable of providing adequate weather data to all airspace users in its planned configuration. This paper delineates some of the important weather data uplink system requirements, and describes a system which is capable of meeting these requirements. The proposed system utilizes a run-length coding technique for image data compression and a hybrid phase and amplitude modulation technique for the transmission of both voice and weather data on existing aeronautical Very High Frequency (VHF) voice communication channels.
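
    A minimal sketch of the run-length coding idea mentioned above, operating on a row of quantized image values; this is an illustration of the technique, not the paper's actual coding scheme.

      def rle_encode(row):
          """Collapse runs of identical values into (value, run_length) pairs."""
          if not row:
              return []
          encoded = [[row[0], 1]]
          for value in row[1:]:
              if value == encoded[-1][0]:
                  encoded[-1][1] += 1
              else:
                  encoded.append([value, 1])
          return [tuple(pair) for pair in encoded]

      def rle_decode(pairs):
          """Inverse transform: expand (value, run_length) pairs back into the row."""
          return [value for value, count in pairs for _ in range(count)]

      row = [0, 0, 0, 3, 3, 1, 1, 1, 1, 0]   # e.g., one scan line of a quantized weather image
      packed = rle_encode(row)
      assert rle_decode(packed) == row
      print(packed)  # [(0, 3), (3, 2), (1, 4), (0, 1)]

    Run-length coding pays off when weather imagery contains large uniform regions, which is the property the proposed uplink relies on to fit image data into the narrowband VHF channel.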

  1. Real-time capability of GEONET system and its application to crust monitoring

    NASA Astrophysics Data System (ADS)

    Yamagiwa, Atsushi; Hatanaka, Yuki; Yutsudo, Toru; Miyahara, Basara

    2006-03-01

    The GPS Earth Observation Network system (GEONET) has been playing an important role in monitoring the crustal deformation of Japan. Since its start of operation, the requirements for accuracy and timeliness have become higher and higher. At the same time, recent broadband communication infrastructure has the capability to realize real-time crustal monitoring and to aid the development of location-based services. In early 2003, the Geographical Survey Institute (GSI) upgraded the GEONET system to meet new requirements. The number of stations reached 1200 in total by March 2003. The antennas were standardized on Dorne Margolin T-type choke ring antennas, and the receivers were replaced with new ones that are capable of real-time observation and data transfer. The new system uses IP connections through an IP-VPN (Internet Protocol Virtual Private Network), provided by communication companies, for data transfer. The Data Processing System, which manages the observation data and analyses in GEONET, has seven units. GEONET carries out three kinds of routine analyses and an RTK-type analysis for emergencies. The new system has shown its capability for real-time crust monitoring, for example, the precise and rapid detection of coseismic (and post-seismic) motion caused by the 2003 Tokachi-Oki earthquake.

  2. Medical Data Architecture Project Status

    NASA Technical Reports Server (NTRS)

    Krihak, M.; Middour, C.; Lindsey, A.; Marker, N.; Wolfe, S.; Winther, S.; Ronzano, K.; Bolles, D.; Toscano, W.; Shaw, T.

    2017-01-01

    The Medical Data Architecture (MDA) project supports the Exploration Medical Capability (ExMC) effort to minimize or reduce the risk of adverse health outcomes and decrements in performance due to limited in-flight medical capabilities on human exploration missions. To mitigate this risk, the ExMC MDA project addresses the technical limitations identified in ExMC Gap Med 07: We do not have the capability to comprehensively process medically-relevant information to support medical operations during exploration missions. This gap identifies that the current International Space Station (ISS) medical data management includes a combination of data collection and distribution methods that are minimally integrated with on-board medical devices and systems. Furthermore, there are a variety of data sources and methods of data collection. For an exploration mission, the seamless management of such data will enable a more autonomous crew than the current ISS paradigm allows. The MDA will develop capabilities that support automated data collection, as well as the functionality needed, and address the challenges involved, in executing a self-contained medical system that approaches crew health care delivery without assistance from ground support. To attain this goal, the first year of the MDA project focused on reducing technical risk, developing documentation and instituting iterative development processes that established the basis for the first version of MDA software (or Test Bed 1). Test Bed 1 is based on a nominal operations scenario authored by the ExMC Element Scientist. This narrative was decomposed into a Concept of Operations that formed the basis for Test Bed 1 requirements. These requirements were successfully vetted through the MDA Test Bed 1 System Requirements Review, which permitted the MDA project to begin software code development and component integration. This paper highlights the MDA objectives, development processes, and accomplishments, and identifies the fiscal year 2017 milestones and deliverables in the upcoming year.

  3. Factors Affecting Relationships between the Contextual Variables and the Information Characteristics of Accounting Information Systems.

    ERIC Educational Resources Information Center

    Choe, Jong-Min; Lee, Jinjoo

    1993-01-01

    Reports on a study of accounting information systems that explored the interactions among influence factors (e.g., user participation in the development process, top management support, capability of information systems personnel, and existence of steering committees), contextual variables (e.g., organizational structure and task characteristics),…

  4. Integrated testing and verification system for research flight software design document

    NASA Technical Reports Server (NTRS)

    Taylor, R. N.; Merilatt, R. L.; Osterweil, L. J.

    1979-01-01

    The NASA Langley Research Center is developing the MUST (Multipurpose User-oriented Software Technology) program to cut the cost of producing research flight software through a system of software support tools. The HAL/S language is the primary subject of the design. Boeing Computer Services Company (BCS) has designed an integrated verification and testing capability as part of MUST. Documentation, verification and test options are provided with special attention to real-time, multiprocessing issues. The needs of the entire software production cycle have been considered, with effective management and reduced lifecycle costs as foremost goals. Capabilities have been included in the design for static detection of data flow anomalies involving communicating concurrent processes. Some types of ill-formed process synchronization and deadlock are also detected statically.

  5. Situational awareness in the commercial aircraft cockpit - A cognitive perspective

    NASA Technical Reports Server (NTRS)

    Adams, Marilyn J.; Pew, Richard W.

    1990-01-01

    A cognitive theory is presented that has relevance for the definition and assessment of situational awareness in the cockpit. The theory asserts that maintenance of situation awareness is a constructive process that demands mental resources in competition with ongoing task performance. Implications of this perspective for assessing and improving situational awareness are discussed. It is concluded that the goal of inserting advanced technology into any system is that it results in an increase in the effectiveness, timeliness, and safety with which the system's activities can be accomplished. The inherent difficulties of the multitask situation are very often compounded by the introduction of automation. To maximize situational awareness, the dynamics and capabilities of such technologies must be designed with thorough respect for the dynamics and capabilities of human information-processing.

  6. A closed-loop air revitalization process technology demonstrator

    NASA Astrophysics Data System (ADS)

    Mulloth, Lila; Perry, Jay; Luna, Bernadette; Kliss, Mark

    Demonstrating a sustainable, reliable life support system process design that can close the oxygen cycle to the greatest extent possible is required for extensive surface exploration of the Moon and Mars by humans. A conceptual closed-loop air revitalization system process technology demonstrator that combines the CO2 removal, recovery, and reduction and oxygen generation operations in a single compact envelope is described. Among a number of candidates, NASA has developed, and in some cases flown, process technologies for capturing metabolic CO2 from air, reducing CO2 to H2O and CH4, electrolyzing H2O to O2, and electrolyzing CO2 to O2 and CO. Traditionally, these processes have either operated in parallel with one another or have not fully exploited a unit-operation-based design approach that takes advantage of the synergy between individual technologies. The appropriate combination of process technologies must capitalize on the advantageous aspects of individual technologies while eliminating or transforming the features that limit their feasibility when considered alone. Such a process technology integration approach also yields optimized mass, power, and volume characteristics for the hardware embodiment. The conceptual air revitalization system process design is an ideal technology demonstrator for the critically needed closed-loop life support capabilities for long-duration human exploration of the lunar surface and for extending crewed space exploration toward Mars. The conceptual process design incorporates low-power CO2 removal, process gas drying, and advanced engineered adsorbents being developed by NASA and industry.

  7. INTEGRATED POWER GENERATION SYSTEMS FOR COAL MINE WASTE METHANE UTILIZATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peet M. Soot; Dale R. Jesse; Michael E. Smith

    2005-08-01

    An integrated system to utilize the waste coal mine methane (CMM) at the Federal No. 2 Coal Mine in West Virginia was designed and built. The system includes power generation, using internal combustion engines, along with gas processing equipment to upgrade sub-quality waste methane to pipeline quality standards. The power generation has a nominal capacity of 1,200 kW and the gas processing system can treat about 1 million cubic feet per day (1 MMCFD) of gas. The gas processing is based on the Northwest Fuel Development, Inc. (NW Fuel) proprietary continuous pressure swing adsorption (CPSA) process that can remove nitrogen from CMM streams. The two major components of the integrated system are synergistic. The byproduct gas stream from the gas processing equipment can be used as fuel for the power generating equipment. In return, the power generating equipment provides the nominal power requirements of the gas processing equipment. This Phase III effort followed Phase I, which comprised a feasibility study for the project, and Phase II, in which the final design for the commercial-scale demonstration was completed. The fact that NW Fuel intends to continue operating the equipment on a commercial basis provides the validation for having advanced the project through all of these phases. The limitation experienced by the project during Phase III was that the CMM available to operate the CPSA system on a commercial basis was not of sufficiently high quality. NW Fuel's CPSA process is limited in its applicability, requiring a relatively high quality of gas as the feed to the process. The CPSA process was demonstrated during Phase III for a limited time, during which the processing capabilities met the expected results, but the process was never capable of providing pipeline quality gas from the available low quality CMM. The NW Fuel CPSA process is a low-cost "polishing unit" capable of removing a few percent nitrogen. It was never intended to process CMM streams containing high levels of nitrogen, as is now the case at the Federal No. 2 Mine. Even lacking the CPSA pipeline delivery demonstration, the project was successful in laying the groundwork for future commercial applications of the integrated system. This operation can still provide a guide for other coal mines that need options for utilization of their methane resources. The designed system can be used as a complete template, or individual components of the system can be segregated and utilized separately at other mines. The use of the CMM not only provides an energy fuel from an otherwise wasted resource, but also yields an environmental benefit by reducing greenhouse gas emissions. Methane has roughly twenty times the greenhouse effect of the carbon dioxide that its combustion generates, so the net greenhouse gas emission mitigation is substantial.

  8. A Conceptual Methodology for Assessing Acquisition Requirements Robustness against Technology Uncertainties

    NASA Astrophysics Data System (ADS)

    Chou, Shuo-Ju

    2011-12-01

    In recent years the United States has shifted from a threat-based acquisition policy that developed systems for countering specific threats to a capabilities-based strategy that emphasizes the acquisition of systems that provide critical national defense capabilities. This shift in policy, in theory, allows for the creation of an "optimal force" that is robust against current and future threats regardless of the tactics and scenario involved. In broad terms, robustness can be defined as the insensitivity of an outcome to "noise" or non-controlled variables. Within this context, the outcome is the successful achievement of defense strategies and the noise variables are the tactics and scenarios that will be associated with current and future enemies. Unfortunately, a lack of system capability, budget, and schedule robustness against technology performance and development uncertainties has led to major setbacks in recent acquisition programs. This lack of robustness stems from the fact that immature technologies have uncertainties in their expected performance, development cost, and schedule that cause variations in system effectiveness and in program development budget and schedule requirements. The Technology Readiness Assessment (TRA) process currently used by acquisition program managers and decision-makers to measure technology uncertainty at critical program decision junctions does not adequately capture the impact of technology performance and development uncertainty on program capability and development metrics. The Technology Readiness Level metric employed by the TRA to describe program technology element uncertainties provides only a qualitative, non-descriptive estimate of those uncertainties. In order to assess program robustness, specifically requirements robustness, against technology performance and development uncertainties, a new process is needed. This process should provide acquisition program managers and decision-makers with the ability to assess or measure the robustness of program requirements against such uncertainties. A literature review of techniques for forecasting technology performance and development uncertainties, and their subsequent impacts on capability, budget, and schedule requirements, led to the conclusion that an analysis process coupling a probabilistic technique such as Monte Carlo simulation with quantitative, parametric models of technology performance impact and technology development time and cost requirements would allow the probabilities of meeting specific constraints on these requirements to be established. These probability-of-requirements-success metrics can then be used as a quantitative, probabilistic measure of program requirements robustness against technology uncertainties. Combined with a Multi-Objective Genetic Algorithm optimization process and a computer-based Decision Support System, critical information regarding requirements robustness against technology uncertainties can be captured and quantified for acquisition decision-makers. This results in a more informed and justifiable selection of program technologies during initial program definition, as well as in the formulation of program development and risk management strategies.
To meet the stated research objective, the ENhanced TEchnology Robustness Prediction and RISk Evaluation (ENTERPRISE) methodology was formulated to provide a structured and transparent process for integrating these enabling techniques into a probabilistic and quantitative assessment of acquisition program requirements robustness against technology performance and development uncertainties. To demonstrate the capabilities of the ENTERPRISE method and test the research hypotheses, a demonstration application of the method was performed on a notional program for acquiring a carrier-based Suppression of Enemy Air Defenses (SEAD) capability using Unmanned Combat Aircraft Systems (UCAS) and their enabling technologies. The results of this implementation provided valuable insights into the benefits and inner workings of the methodology, as well as the limitations that should be addressed in the future to narrow the gap between the current state and the desired state.
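    As a purely illustrative sketch of the probability-of-requirements-success idea described above, the following Python snippet samples hypothetical technology performance, development cost, and schedule outcomes from assumed triangular distributions and estimates the probability that notional requirement constraints are met. All distributions, variable names, and thresholds are invented for illustration and are not taken from the ENTERPRISE study.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # Monte Carlo trials

# Hypothetical technology uncertainty models (triangular: low, mode, high).
# These numbers are illustrative only.
range_gain_nmi = rng.triangular(80, 120, 150, N)   # performance impact
dev_cost_musd  = rng.triangular(40, 60, 110, N)    # development cost, $M
dev_time_mo    = rng.triangular(18, 24, 40, N)     # development time, months

# Notional program requirement constraints (also illustrative).
req_range, req_cost, req_sched = 100.0, 90.0, 36.0

# Probability-of-success metrics used as robustness measures.
p_perf  = np.mean(range_gain_nmi >= req_range)
p_cost  = np.mean(dev_cost_musd  <= req_cost)
p_sched = np.mean(dev_time_mo    <= req_sched)
p_all   = np.mean((range_gain_nmi >= req_range) &
                  (dev_cost_musd  <= req_cost) &
                  (dev_time_mo    <= req_sched))

print(f"P(performance met) = {p_perf:.3f}")
print(f"P(cost met)        = {p_cost:.3f}")
print(f"P(schedule met)    = {p_sched:.3f}")
print(f"P(all met)         = {p_all:.3f}")
```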

  9. Multidimensional Data Modeling for Business Process Analysis

    NASA Astrophysics Data System (ADS)

    Mansmann, Svetlana; Neumuth, Thomas; Scholl, Marc H.

    The emerging area of business process intelligence attempts to enhance the analytical capabilities of business process management systems by employing data warehousing and mining technologies. This paper presents an approach to re-engineering the business process modeling in conformity with the multidimensional data model. Since the business process and the multidimensional model are driven by rather different objectives and assumptions, there is no straightforward solution to converging these models.

  10. Power Processing for a Conceptual Project Prometheus Electric Propulsion System

    NASA Technical Reports Server (NTRS)

    Scina, Joseph E., Jr.; Aulisio, Michael; Gerber, Scott S.; Hewitt, Frank; Miller, Leonard; Elbuluk, Malik; Pinero, Luis R. (Technical Monitor)

    2005-01-01

    NASA has proposed a bold mission to orbit and explore the moons of Jupiter. This mission, known as the Jupiter Icy Moons Orbiter (JIMO), would significantly increase NASA's capability to explore deep space by making use of high power electric propulsion. One electric propulsion option under study for JIMO is an ion propulsion system. An early version of an ion propulsion system was successfully used on NASA's Deep Space 1 mission. One concept for an ion thruster system capable of meeting the current JIMO mission requirement would have individual thrusters of 16 to 25 kW each that require voltages as high as 8.0 kV. The purpose of this work is to develop power processing schemes for delivering the high voltage power to the spacecraft ion thrusters based upon a three-phase AC distribution system. In addition, a proposed DC-DC converter topology is presented for an ion thruster ancillary supply based upon a DC distribution system. All specifications discussed in this paper are for design convenience and are speculative in nature.

  11. Speech Processing.

    DTIC Science & Technology

    1983-05-01

    The VDE system developed had the capability of recognizing up to 248 separate words in syntactic structures. The two systems described are isolated...size, weight, and power consumption of VDE devices (see Fig. 19). DUU and NATU Advisory Groups on Voice Technology: at the present time, two major...

  12. The Coastal Ocean Prediction Systems program: Understanding and managing our coastal ocean

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eden, H.F.; Mooers, C.N.K.

    1990-06-01

    The goal of COPS is to couple a program of regular observations to numerical models, through techniques of data assimilation, in order to provide a predictive capability for the US coastal ocean including the Great Lakes, estuaries, and the entire Exclusive Economic Zone (EEZ). The objectives of the program include: determining the predictability of the coastal ocean and the processes that govern the predictability; developing efficient prediction systems for the coastal ocean based on the assimilation of real-time observations into numerical models; and coupling the predictive systems for the physical behavior of the coastal ocean to predictive systems for biological, chemical, and geological processes to achieve an interdisciplinary capability. COPS will provide the basis for effective monitoring and prediction of coastal ocean conditions by optimizing the use of increased scientific understanding, improved observations, advanced computer models, and computer graphics to make the best possible estimates of sea level, currents, temperatures, salinities, and other properties of entire coastal regions.
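    To make the data assimilation concept concrete, the sketch below runs a minimal scalar Kalman-filter assimilation cycle in Python, blending a toy model forecast with synthetic observations according to their assumed error variances. The model, noise levels, and observations are invented and stand in for the far more complex ocean models and observing systems COPS envisions.

```python
import numpy as np

# Minimal scalar Kalman-filter assimilation cycle: a toy "coastal model"
# forecast is corrected by each new observation.
rng = np.random.default_rng(0)

x_est, P = 15.0, 4.0        # initial sea-surface temperature estimate and variance
Q, R = 0.5, 1.0             # model-error and observation-error variances
obs = 17.0 + rng.normal(0.0, np.sqrt(R), size=10)   # synthetic observations

for z in obs:
    # Forecast step: persistence model with growing uncertainty.
    x_fc, P_fc = x_est, P + Q
    # Analysis step: blend forecast and observation by their uncertainties.
    K = P_fc / (P_fc + R)               # Kalman gain
    x_est = x_fc + K * (z - x_fc)
    P = (1.0 - K) * P_fc
    print(f"obs={z:5.2f}  analysis={x_est:5.2f}  variance={P:4.2f}")
```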

  13. An embedded vision system for an unmanned four-rotor helicopter

    NASA Astrophysics Data System (ADS)

    Lillywhite, Kirt; Lee, Dah-Jye; Tippetts, Beau; Fowers, Spencer; Dennis, Aaron; Nelson, Brent; Archibald, James

    2006-10-01

    In this paper an embedded vision system and control module is introduced that is capable of controlling an unmanned four-rotor helicopter and processing live video for various law enforcement, security, military, and civilian applications. The vision system is implemented on a newly designed compact FPGA board (Helios). The Helios board contains a Xilinx Virtex-4 FPGA chip and memory, making it capable of implementing real-time vision algorithms. A Smooth Automated Intelligent Leveling daughter board (SAIL), attached to the Helios board, collects attitude and heading information to be processed in order to control the unmanned helicopter. The SAIL board uses an electrolytic tilt sensor, compass, voltage level converters, and analog-to-digital converters to perform its operations. While level flight can be maintained, problems stemming from the characteristics of the tilt sensor limit the maneuverability of the helicopter. The embedded vision system has proven to give very good results in its performance of a number of real-time robotic vision algorithms.

  14. PScan 1.0: flexible software framework for polygon based multiphoton microscopy

    NASA Astrophysics Data System (ADS)

    Li, Yongxiao; Lee, Woei Ming

    2016-12-01

    Multiphoton laser scanning microscopes exhibit highly localized nonlinear optical excitation and are powerful instruments for in-vivo deep tissue imaging. Customized multiphoton microscopy has a significantly superior performance for in-vivo imaging because of precise control over the scanning and detection system. To date, several flexible software platforms have catered to custom-built microscopy systems (e.g., ScanImage, HelioScan, MicroManager) that perform at imaging speeds of 30-100 fps. In this paper, we describe a flexible software framework for high speed imaging systems capable of operating from 5 fps to 1600 fps. The software is based on the MATLAB image processing toolbox. It has the capability to communicate directly with a high-performing imaging card (Matrox Solios eA/XA), thus retaining high speed acquisition. The program is also designed to communicate with LabVIEW and Fiji for instrument control and image processing. PScan 1.0 can handle high imaging rates and contains sufficient flexibility for users to adapt it to their high speed imaging systems.

  15. Coincidence ion imaging with a fast frame camera

    NASA Astrophysics Data System (ADS)

    Lee, Suk Kyoung; Cudry, Fadia; Lin, Yun Fei; Lingenfelter, Steven; Winney, Alexander H.; Fan, Lin; Li, Wen

    2014-12-01

    A new time- and position-sensitive particle detection system based on a fast frame CMOS (complementary metal-oxide semiconductor) camera is developed for coincidence ion imaging. The system is composed of four major components: a conventional microchannel plate/phosphor screen ion imager, a fast frame CMOS camera, a single anode photomultiplier tube (PMT), and a high-speed digitizer. The system collects the positional information of ions from the fast frame camera through real-time centroiding, while the arrival times are obtained from the timing signal of the PMT processed by the high-speed digitizer. Multi-hit capability is achieved by correlating the intensity of ion spots on each camera frame with the peak heights on the corresponding time-of-flight spectrum from the PMT. Efficient computer algorithms are developed to process camera frames and digitizer traces in real time at a 1 kHz laser repetition rate. We demonstrate the capability of this system by detecting a momentum-matched pair of co-fragments (methyl and iodine cations) produced from strong field dissociative double ionization of methyl iodide.
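    A simplified Python sketch of the frame-processing idea is given below: bright ion spots are centroided on a synthetic camera frame and then matched to time-of-flight peak heights by amplitude ordering. The thresholds, the synthetic frame, and the simple sort-based matching are assumptions for illustration; the actual system's real-time algorithms and calibration are more involved. The sketch uses NumPy and SciPy's ndimage module.

```python
import numpy as np
from scipy import ndimage

def centroid_ion_spots(frame, threshold):
    """Label connected bright regions and return (y, x) centroids
    plus integrated spot intensities."""
    mask = frame > threshold
    labels, n = ndimage.label(mask)
    centroids = ndimage.center_of_mass(frame, labels, range(1, n + 1))
    intensities = ndimage.sum(frame, labels, range(1, n + 1))
    return np.array(centroids), np.array(intensities)

def pair_hits(intensities, tof_peak_heights):
    """Toy multi-hit assignment: match camera spots to time-of-flight peaks
    by sorting both on amplitude (brighter spot <-> taller PMT peak)."""
    spot_order = np.argsort(intensities)[::-1]
    peak_order = np.argsort(tof_peak_heights)[::-1]
    return list(zip(spot_order, peak_order))

# Synthetic frame with two Gaussian ion spots (illustrative only).
yy, xx = np.mgrid[0:64, 0:64]
frame = (200 * np.exp(-((yy - 20)**2 + (xx - 30)**2) / 8.0) +
         120 * np.exp(-((yy - 45)**2 + (xx - 50)**2) / 8.0))
cents, amps = centroid_ion_spots(frame, threshold=30.0)
pairs = pair_hits(amps, tof_peak_heights=np.array([0.8, 1.3]))
print(cents, amps, pairs)
```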

  16. Using Discrete Event Simulation to Model Integrated Commodities Consumption for a Launch Campaign of the Space Launch System

    NASA Technical Reports Server (NTRS)

    Leonard, Daniel; Parsons, Jeremy W.; Cates, Grant

    2014-01-01

    In May 2013, NASA's GSDO Program requested a study to develop a discrete event simulation (DES) model that analyzes the launch campaign process of the Space Launch System (SLS) from an integrated commodities perspective. The scope of the study includes launch countdown and scrub turnaround and focuses on four core launch commodities: hydrogen, oxygen, nitrogen, and helium. Previously, the commodities were only analyzed individually and deterministically for their launch support capability, but this study was the first to integrate them to examine the impact of their interactions on a launch campaign as well as the effects of process variability on commodity availability. The study produced a validated DES model with Rockwell Arena that showed that Kennedy Space Center's ground systems were capable of supporting a 48-hour scrub turnaround for the SLS. The model will be maintained and updated to provide commodity consumption analysis of future ground system and SLS configurations.
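    The study's model was built in Rockwell Arena, but the underlying discrete event simulation idea can be sketched with the third-party Python package simpy, as below: a launch campaign process draws a cryogenic commodity for each countdown attempt, scrubs probabilistically, and waits out a 48-hour turnaround while a ground resupply process replenishes the commodity. All quantities, rates, and probabilities are invented for illustration.

```python
import random
import simpy

# Toy discrete-event model of commodity draw-down during launch attempts
# with possible scrubs. Numbers below are illustrative only.
SCRUB_PROB = 0.3
ATTEMPT_DRAW = 120.0      # units of LH2 consumed per countdown attempt
RESUPPLY_RATE = 4.0       # units per hour delivered by ground systems

def launch_campaign(env, lh2):
    attempt = 0
    while True:
        attempt += 1
        yield lh2.get(ATTEMPT_DRAW)           # load vehicle for countdown
        yield env.timeout(8)                  # countdown duration, hours
        if random.random() > SCRUB_PROB:
            print(f"t={env.now:6.1f} h  attempt {attempt}: LAUNCH")
            return
        print(f"t={env.now:6.1f} h  attempt {attempt}: scrub, 48 h turnaround")
        yield env.timeout(48)                 # scrub turnaround

def resupply(env, lh2):
    while True:
        yield env.timeout(1)
        yield lh2.put(RESUPPLY_RATE)

random.seed(1)
env = simpy.Environment()
lh2 = simpy.Container(env, capacity=500, init=300)
env.process(launch_campaign(env, lh2))
env.process(resupply(env, lh2))
env.run(until=500)
```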

  17. Proceedings of the Goddard Space Flight Center Workshop on Robotics for Commercial Microelectronic Processes in Space

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Potential applications of robots for cost effective commercial microelectronic processes in space were studied and the associated robotic requirements were defined. Potential space application areas include advanced materials processing, bulk crystal growth, and epitaxial thin film growth and related processes. All possible automation of these processes was considered, along with energy and environmental requirements. Aspects of robot capabilities considered include system intelligence, ROM requirements, kinematic and dynamic specifications, sensor design and configuration, flexibility and maintainability. Support elements discussed included facilities, logistics, ground support, launch and recovery, and management systems.

  18. Measuring the In-Process Figure, Final Prescription, and System Alignment of Large Optics and Segmented Mirrors Using Lidar Metrology

    NASA Technical Reports Server (NTRS)

    Ohl, Raymond; Slotwinski, Anthony; Eegholm, Bente; Saif, Babak

    2011-01-01

    The fabrication of large optics is traditionally a slow process, and fabrication capability is often limited by measurement capability. While techniques exist to measure mirror figure with nanometer precision, measurements of large-mirror prescription are typically limited to submillimeter accuracy. Using a lidar instrument enables one to measure the optical surface rough figure and prescription in virtually all phases of fabrication without moving the mirror from its polishing setup. This technology improves the uncertainty of mirror prescription measurement to the micron regime.

  19. VIM: A Platform for Violent Intent Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanfilippo, Antonio P.; Schryver, Jack C.; Whitney, Paul D.

    2009-03-31

    Radical and contentious political/religious activism may or may not evolve into violent behavior depending on contextual factors related to social, political, cultural, and infrastructural conditions. Significant theoretical advances have been made in understanding these contextual factors and the import of their interrelations. However, there has been relatively little progress in the development of processes and capabilities that leverage such theoretical advances to automate the anticipatory analysis of violent intent. In this paper, we describe a framework that implements such processes and capabilities, and discuss the implications of using the resulting system to assess the emergence of radicalization leading to violence.

  20. Space Shuttle Orbiter oxygen partial pressure sensing and control system improvements

    NASA Technical Reports Server (NTRS)

    Frampton, Robert F.; Hoy, Dennis M.; Kelly, Kevin J.; Walleshauser, James J.

    1992-01-01

    A program aimed at developing a new PPO2 oxygen sensor and a replacement amplifier for the Space Shuttle Orbiter is described. Experimental design methodologies used in the test and modeling process made it possible to enhance the effectiveness of the program and to reduce its cost. Significant cost savings are due to the increased lifetime of the basic sensor cell, the maximization of useful sensor life through an increased amplifier gain adjustment capability, the use of streamlined production processes for the manufacture of the assemblies, and the refurbishment capability of the replacement sensor.

  1. Development of a multi-disciplinary ERTS user program in the state of Ohio. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Baldridge, P. E.; Weber, C.; Schaal, G.; Wilhelm, C.; Wurelic, G. E.; Stephan, J. G.; Ebbert, T. F.; Smail, H. E.; Mckeon, J.; Schmidt, N. (Principal Investigator)

    1977-01-01

    The author has identified the following significant results. A current uniform land inventory was derived, in part, from LANDSAT data. The State has the ability to convert processed land information from LANDSAT to the Ohio Capability Analysis Program (OCAP). The OCAP is a computer information and mapping system comprised of various programs used to digitally store, analyze, and display land capability information. More accurate processing of LANDSAT data could lead to reasonably accurate, useful land allocation models. It was feasible to use LANDSAT data to investigate minerals, pollution, land use, and resource inventory.

  2. OPTICAL PROCESSING OF INFORMATION: Potential applications of quasi-cw partially coherent radiation in optical data recording and processing

    NASA Astrophysics Data System (ADS)

    Volkov, L. V.; Larkin, A. I.

    1994-04-01

    Theoretical and experimental investigations are reported of the potential applications of quasi-cw partially coherent radiation in optical systems based on diffraction-interference principles. It is shown that the spectral characteristics of quasi-cw radiation influence the data-handling capabilities of a holographic correlator and of a partially coherent holographic system for data acquisition. Relevant experimental results are reported.

  3. Enhancing GIS Capabilities for High Resolution Earth Science Grids

    NASA Astrophysics Data System (ADS)

    Koziol, B. W.; Oehmke, R.; Li, P.; O'Kuinghttons, R.; Theurich, G.; DeLuca, C.

    2017-12-01

    Applications for high performance GIS will continue to increase as Earth system models pursue more realistic representations of Earth system processes. Finer spatial resolution model input and output, unstructured or irregular modeling grids, data assimilation, and regional coordinate systems present novel challenges for GIS frameworks operating in the Earth system modeling domain. This presentation provides an overview of two GIS-driven applications that combine high performance software with big geospatial datasets to produce value-added tools for the modeling and geoscientific community. First, a large-scale interpolation experiment using National Hydrography Dataset (NHD) catchments, a high resolution rectilinear CONUS grid, and the Earth System Modeling Framework's (ESMF) conservative interpolation capability will be described. ESMF is a parallel, high-performance software toolkit that provides capabilities (e.g. interpolation) for building and coupling Earth science applications. ESMF is developed primarily by the NOAA Environmental Software Infrastructure and Interoperability (NESII) group. The purpose of this experiment was to test and demonstrate the utility of high performance scientific software in traditional GIS domains. Special attention will be paid to the nuanced requirements for dealing with high resolution, unstructured grids in scientific data formats. Second, a chunked interpolation application using ESMF and OpenClimateGIS (OCGIS) will demonstrate how spatial subsetting can virtually remove computing resource ceilings for very high spatial resolution interpolation operations. OCGIS is a NESII-developed Python software package designed for the geospatial manipulation of high-dimensional scientific datasets. An overview of the data processing workflow, why a chunked approach is required, and how the application could be adapted to meet operational requirements will be discussed here. In addition, we'll provide a general overview of OCGIS's parallel subsetting capabilities including challenges in the design and implementation of a scientific data subsetter.
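    The chunked-processing idea can be illustrated with the short Python sketch below, which walks a large rectilinear grid in latitude bands, skips bands that do not intersect a target bounding box, and hands each intersecting band to a stand-in processing function. The grid size, chunk size, and bounding box are invented, and the stand-in function replaces the real OCGIS/ESMF subsetting and regridding calls.

```python
import numpy as np

# Illustrative chunked spatial subsetting: process a large rectilinear grid
# in latitude bands so memory stays bounded.
nlat, nlon = 21600, 43200                 # conceptually a very high resolution global grid
lat_edges = np.linspace(-90, 90, nlat + 1)
bbox = (24.0, 50.0, -125.0, -66.0)        # CONUS-like lat/lon bounding box (illustrative)

def process_chunk(lat_slice):
    """Stand-in for interpolation/subsetting of one latitude band.
    A real workflow would read only this band from disk and pass it
    to the regridder; here we just report the rows handled."""
    return lat_slice.stop - lat_slice.start

chunk_rows = 2000
rows_done = 0
for start in range(0, nlat, chunk_rows):
    stop = min(start + chunk_rows, nlat)
    band_min, band_max = lat_edges[start], lat_edges[stop]
    # Skip chunks that do not intersect the target bounding box.
    if band_max < bbox[0] or band_min > bbox[1]:
        continue
    rows_done += process_chunk(slice(start, stop))

print(f"processed {rows_done} of {nlat} grid rows for the subset region")
```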

  4. Cogeneration technology alternatives study. Volume 6: Computer data

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The potential technical capabilities of energy conversion systems in the 1985 - 2000 time period were defined with emphasis on systems using coal, coal-derived fuels or alternate fuels. Industrial process data developed for the large energy consuming industries serve as a framework for the cogeneration applications. Ground rules for the study were established and other necessary equipment (balance-of-plant) was defined. This combination of technical information, energy conversion system data ground rules, industrial process information and balance-of-plant characteristics was analyzed to evaluate energy consumption, capital and operating costs and emissions. Data in the form of computer printouts developed for 3000 energy conversion system-industrial process combinations are presented.

  5. NASA-JSC antenna near-field measurement system

    NASA Technical Reports Server (NTRS)

    Cooke, W. P.; Friederich, P. G.; Jenkins, B. M.; Jameson, C. R.; Estrada, J. P.

    1988-01-01

    Work was completed on the near-field range control software. The capabilities of the data processing software were expanded with the addition of probe compensation. In addition, the user can process the measured data from the same computer terminal used for range control. The design of the laser metrology system was completed. It provides precise measurement of probe location during near-field measurements as well as position data for control of the translation beam and probe cart. A near-field range measurement system was designed, fabricated, and tested.

  6. Earth physicist describes US nuclear test monitoring system

    NASA Astrophysics Data System (ADS)

    1986-01-01

    The U.S. capabilities to monitor underground nuclear weapons tests in the USSR were examined. American methods used in monitoring underground nuclear tests are enumerated. The U.S. technical means of monitoring Soviet nuclear weapons testing, and whether it is possible to conduct tests that could not be detected by these means, are examined. The worldwide seismic station network in 55 countries available to the U.S. for seismic detection and measurement of underground nuclear explosions is outlined, along with the systems of seismic research observatories in 15 countries and seismic grouping stations in 12 countries, including the advanced computerized data processing capabilities of these facilities. The level of capability of the U.S. seismic system for monitoring nuclear tests, and other, nonseismic means of monitoring, such as hydroacoustic sensing and recording of effects in the atmosphere, ionosphere, and the Earth's magnetic field, are discussed.

  7. A neural approach for improving the measurement capability of an electronic nose

    NASA Astrophysics Data System (ADS)

    Chimenti, M.; DeRossi, D.; Di Francesco, F.; Domenici, C.; Pieri, G.; Pioggia, G.; Salvetti, O.

    2003-06-01

    Electronic noses, instruments for automatic recognition of odours, are typically composed of an array of partially selective sensors, a sampling system, a data acquisition device and a data processing system. For the purpose of evaluating the quality of olive oil, an electronic nose based on an array of conducting polymer sensors capable of discriminating olive oil aromas was developed. The selection of suitable pattern recognition techniques for a particular application can enhance the performance of electronic noses. Therefore, an advanced neural recognition algorithm for improving the measurement capability of the device was designed and implemented. This method combines multivariate statistical analysis and a hierarchical neural-network architecture based on self-organizing maps and error back-propagation. The complete system was tested using samples composed of characteristic olive oil aromatic components in refined olive oil. The results obtained have shown that this approach is effective in grouping aromas into different categories representative of their chemical structure.
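    The paper's recognition pipeline combines multivariate statistics, self-organizing maps, and error back-propagation; the sketch below shows only the self-organizing map stage, implemented from scratch in NumPy and trained on synthetic sensor-array responses rather than the olive-oil dataset. The grid size, learning-rate schedule, and the two synthetic aroma classes are assumptions for illustration.

```python
import numpy as np

# Minimal self-organizing map (SOM) for clustering sensor-array responses.
rng = np.random.default_rng(3)

n_sensors = 8
# Two synthetic aroma classes with different mean response patterns.
class_a = rng.normal(0.8, 0.05, size=(100, n_sensors))
class_b = rng.normal(0.3, 0.05, size=(100, n_sensors))
X = np.vstack([class_a, class_b])
rng.shuffle(X)

rows, cols = 6, 6
W = rng.random((rows, cols, n_sensors))          # SOM codebook vectors
grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols),
                            indexing="ij"), axis=-1)

n_epochs = 30
for epoch in range(n_epochs):
    lr = 0.5 * (1 - epoch / n_epochs)            # decaying learning rate
    sigma = 3.0 * (1 - epoch / n_epochs) + 0.5   # shrinking neighborhood
    for x in X:
        # Best-matching unit: node whose weight vector is closest to x.
        d = np.linalg.norm(W - x, axis=-1)
        bmu = np.unravel_index(np.argmin(d), d.shape)
        # Gaussian neighborhood pulls nearby nodes toward the sample.
        dist2 = np.sum((grid - np.array(bmu))**2, axis=-1)
        h = np.exp(-dist2 / (2 * sigma**2))[..., None]
        W += lr * h * (x - W)

# Map a new sample to its best-matching unit (cluster label).
sample = rng.normal(0.8, 0.05, size=n_sensors)
bmu = np.unravel_index(np.argmin(np.linalg.norm(W - sample, axis=-1)), (rows, cols))
print("sample mapped to SOM node", bmu)
```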

  8. Neural Correlates of Olfactory Learning: Critical Role of Centrifugal Neuromodulation

    ERIC Educational Resources Information Center

    Fletcher, Max L.; Chen, Wei R.

    2010-01-01

    The mammalian olfactory system is well established for its remarkable capability of undergoing experience-dependent plasticity. Although this process involves changes at multiple stages throughout the central olfactory pathway, even the early stages of processing, such as the olfactory bulb and piriform cortex, can display a high degree of…

  9. Information systems on human resources for health: a global review

    PubMed Central

    2012-01-01

    Background Although attainment of the health-related Millennium Development Goals relies on countries having adequate numbers of human resources for health (HRH) and their appropriate distribution, global understanding of the systems used to generate information for monitoring HRH stock and flows, known as human resources information systems (HRIS), is minimal. While HRIS are increasingly recognized as integral to health system performance assessment, baseline information regarding their scope and capability around the world has been limited. We conducted a review of the available literature on HRIS implementation processes in order to draw this baseline. Methods Our systematic search initially retrieved 11 923 articles in four languages published in peer-reviewed and grey literature. Following the selection of those articles which detailed HRIS implementation processes, reviews of their contents were conducted using two-person teams, each assigned to a national system. A data abstraction tool was developed and used to facilitate objective assessment. Results Ninety-five articles with relevant HRIS information were reviewed, mostly from the grey literature, which comprised 84 % of all documents. The articles represented 63 national HRIS and two regionally integrated systems. Whereas a high percentage of countries reported the capability to generate workforce supply and deployment data, few systems were documented as being used for HRH planning and decision-making. Of the systems examined, only 23 % explicitly stated they collect data on workforce attrition. The majority of countries experiencing crisis levels of HRH shortages (56 %) did not report data on health worker qualifications or professional credentialing as part of their HRIS. Conclusion Although HRIS are critical for evidence-based human resource policy and practice, there is a dearth of information about these systems, including their current capabilities. The absence of standardized HRIS profiles (including documented processes for data collection, management, and use) limits understanding of the availability and quality of information that can be used to support effective and efficient HRH strategies and investments at the national, regional, and global levels. PMID:22546089

  10. A database management capability for Ada

    NASA Technical Reports Server (NTRS)

    Chan, Arvola; Danberg, SY; Fox, Stephen; Landers, Terry; Nori, Anil; Smith, John M.

    1986-01-01

    The data requirements of mission critical defense systems have been increasing dramatically. Command and control, intelligence, logistics, and even weapons systems are being required to integrate, process, and share ever increasing volumes of information. To meet this need, systems are now being specified that incorporate data base management subsystems for handling storage and retrieval of information. It is expected that a large number of the next generation of mission critical systems will contain embedded data base management systems. Since the use of Ada has been mandated for most of these systems, it is important to address the issues of providing data base management capabilities that can be closely coupled with Ada. A comprehensive distributed data base management project has been investigated. The key deliverables of this project are three closely related prototype systems implemented in Ada. These three systems are discussed.

  11. Parallel evolution of image processing tools for multispectral imagery

    NASA Astrophysics Data System (ADS)

    Harvey, Neal R.; Brumby, Steven P.; Perkins, Simon J.; Porter, Reid B.; Theiler, James P.; Young, Aaron C.; Szymanski, John J.; Bloch, Jeffrey J.

    2000-11-01

    We describe the implementation and performance of a parallel, hybrid evolutionary-algorithm-based system, which optimizes image processing tools for feature-finding tasks in multi-spectral imagery (MSI) data sets. Our system uses an integrated spatio-spectral approach and is capable of combining suitably-registered data from different sensors. We investigate the speed-up obtained by parallelization of the evolutionary process via multiple processors (a workstation cluster) and develop a model for prediction of run-times for different numbers of processors. We demonstrate our system on Landsat Thematic Mapper MSI, covering the recent Cerro Grande fire at Los Alamos, NM, USA.
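    A run-time prediction model of the kind mentioned above can be as simple as a serial fraction plus evaluation work that divides across processors plus a communication term that grows with the number of processors. The Python sketch below uses such a toy model with invented constants; it is not the model or the measurements from the paper.

```python
# Toy run-time model for a master-worker parallel evolutionary algorithm:
# serial work (selection, bookkeeping) + evaluation work divided across
# workers + per-generation communication overhead. Constants are illustrative.
def predicted_runtime(n_proc, t_serial=2.0, t_eval_total=120.0, t_comm=0.4):
    """Seconds per generation on n_proc processors (toy model)."""
    return t_serial + t_eval_total / n_proc + t_comm * n_proc

for p in (1, 2, 4, 8, 16, 32):
    t = predicted_runtime(p)
    speedup = predicted_runtime(1) / t
    print(f"{p:3d} processors: {t:7.2f} s/generation, speedup {speedup:4.1f}x")
```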

  12. Data systems elements technology assessment and system specifications, issue no. 2. [nasa programs

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The ability to satisfy the objectives of future NASA Office of Applications programs is dependent on technology advances in a number of areas of data systems. The hardware and software technology of end-to-end systems (data processing elements through ground processing, dissemination, and presentation) are examined in terms of state of the art, trends, and projected developments in the 1980 to 1985 timeframe. Capability is considered in terms of elements that are either commercially available or that can be implemented from commercially available components with minimal development.

  13. An adaptive process-based cloud infrastructure for space situational awareness applications

    NASA Astrophysics Data System (ADS)

    Liu, Bingwei; Chen, Yu; Shen, Dan; Chen, Genshe; Pham, Khanh; Blasch, Erik; Rubin, Bruce

    2014-06-01

    Space situational awareness (SSA) and defense space control capabilities are top priorities for groups that own or operate man-made spacecraft. Also, with the growing amount of space debris, there is an increased demand for contextual understanding that necessitates the capability of collecting and processing a vast amount of sensor data. Cloud computing, which features scalable and flexible storage and computing services, has been recognized as an ideal candidate that can meet the large-data contextual challenges of SSA. Cloud computing consists of physical service providers and middleware virtual machines together with infrastructure, platform, and software as a service (IaaS, PaaS, SaaS) models. However, the typical Virtual Machine (VM) abstraction is on a per-operating-system basis, which is too low-level and limits the flexibility of a mission application architecture. In response to this technical challenge, a novel adaptive process-based cloud infrastructure for SSA applications is proposed in this paper. In addition, the design rationale and a prototype are examined in detail. The SSA Cloud (SSAC) conceptual capability will potentially support space situation monitoring and tracking, object identification, and threat assessment. Lastly, the benefits of more granular and flexible cloud computing resource allocation are illustrated for data processing and implementation considerations within a representative SSA system environment. We show that container-based virtualization performs better than hypervisor-based virtualization technology in an SSA scenario.

  14. Computer-aided analysis and design of the shape rolling process for producing turbine engine airfoils

    NASA Technical Reports Server (NTRS)

    Lahoti, G. D.; Akgerman, N.; Altan, T.

    1978-01-01

    Mild steel (AISI 1018) was selected as model cold-rolling material and Ti-6Al-4V and INCONEL 718 were selected as typical hot-rolling and cold-rolling alloys, respectively. The flow stress and workability of these alloys were characterized and friction factor at the roll/workpiece interface was determined at their respective working conditions by conducting ring tests. Computer-aided mathematical models for predicting metal flow and stresses, and for simulating the shape-rolling process were developed. These models utilize the upper-bound and the slab methods of analysis, and are capable of predicting the lateral spread, roll-separating force, roll torque and local stresses, strains and strain rates. This computer-aided design (CAD) system is also capable of simulating the actual rolling process and thereby designing roll-pass schedule in rolling of an airfoil or similar shape. The predictions from the CAD system were verified with respect to cold rolling of mild steel plates. The system is being applied to cold and hot isothermal rolling of an airfoil shape, and will be verified with respect to laboratory experiments under controlled conditions.

  15. Noncontact temperature measurement: Requirements and applications for metals and alloys research

    NASA Technical Reports Server (NTRS)

    Perepezko, J. H.

    1988-01-01

    Temperature measurement is an essential capability for almost all areas of metals and alloys research. In the microgravity environment many of the science priorities that have been identified for metals and alloys also require noncontact temperature measurement capability. For example, in order to exploit the full potential of containerless processing, it is critical to have available a suitable noncontact temperature measurement system. This system is needed to track continuously the thermal history, including melt undercooling and rapid recalescence, of relatively small metal spheres during free-fall motion in drop tube systems. During containerless processing with levitation-based equipment, accurate noncontact temperature measurement is required to monitor one or more quasi-static samples with sufficient spatial and thermal resolution to follow the progress of solidification fronts originating in undercooled melts. In crystal growth, thermal migration, coarsening and other experiments high resolution thermal maps would be a valuable asset in the understanding and modeling of solidification processes, fluid flows and microstructure development. The science and applications requirements place several constraints on the spatial resolution, response time and accuracy of suitable instrumentation.

  16. Exploration of a Capability-Focused Aerospace System of Systems Architecture Alternative with Bilayer Design Space, Based on RST-SOM Algorithmic Methods

    PubMed Central

    Li, Zhifei; Qin, Dongliang

    2014-01-01

    In defense related programs, the use of capability-based analysis, design, and acquisition has been significant. In order to confront one of the most challenging features of a huge design space in capability based analysis (CBA), a literature review of design space exploration was first examined. Then, in the process of an aerospace system of systems design space exploration, a bilayer mapping method was put forward, based on the existing experimental and operating data. Finally, the feasibility of the foregoing approach was demonstrated with an illustrative example. With the data mining RST (rough sets theory) and SOM (self-organized mapping) techniques, the alternative to the aerospace system of systems architecture was mapped from P-space (performance space) to C-space (configuration space), and then from C-space to D-space (design space), respectively. Ultimately, the performance space was mapped to the design space, which completed the exploration and preliminary reduction of the entire design space. This method provides a computational analysis and implementation scheme for large-scale simulation. PMID:24790572

  17. Efficient Power-Transfer Capability Analysis of the TET System Using the Equivalent Small Parameter Method.

    PubMed

    Yanzhen Wu; Hu, A P; Budgett, D; Malpas, S C; Dissanayake, T

    2011-06-01

    Transcutaneous energy transfer (TET) enables the transfer of power across the skin without direct electrical connection. It is a mechanism for powering implantable devices for the lifetime of a patient. For maximum power transfer, it is essential that TET systems be resonant on both the primary and secondary sides, which requires considerable design effort. Consequently, a strong need exists for an efficient method to aid the design process. This paper presents an analytical technique appropriate for analyzing complex TET systems. The system's steady-state solution in closed form with sufficient accuracy is obtained by employing the proposed equivalent small parameter method. It is shown that power-transfer capability can be correctly predicted without tedious iterative simulations or practical measurements. Furthermore, for TET systems utilizing a current-fed push-pull soft switching resonant converter, it is found that maximum energy transfer does not occur when the primary and secondary resonant tanks are "tuned" to the nominal resonant frequency. An optimal tuning point exists, corresponding to the system's maximum power-transfer capability when optimal tuning capacitors are applied.
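    As a rough illustration of why tuning-capacitor selection matters, the Python sketch below performs a phasor-domain sweep of the secondary tuning capacitance for a simple series-series compensated inductive link and reports the capacitance that maximizes load power. This linear two-loop model, and all component values in it, are assumptions for illustration; it does not reproduce the paper's current-fed push-pull converter or the equivalent small parameter method.

```python
import numpy as np

# Phasor sweep of a series-series compensated inductive link, used as a
# simplified stand-in for a TET link. All component values are illustrative.
f = 200e3                      # drive frequency, Hz
w = 2 * np.pi * f
L1, L2, M = 15e-6, 15e-6, 3e-6 # coil inductances and mutual inductance
R1, R2, RL = 0.3, 0.3, 10.0    # winding resistances and load
C1 = 1 / (w**2 * L1)           # primary tuned to the drive frequency
V = 5.0                        # drive amplitude, volts

def load_power(C2):
    """Average power delivered to RL for a given secondary capacitance."""
    Z11 = R1 + 1j * w * L1 + 1 / (1j * w * C1)
    Z22 = R2 + RL + 1j * w * L2 + 1 / (1j * w * C2)
    Z = np.array([[Z11, 1j * w * M], [1j * w * M, Z22]])
    I1, I2 = np.linalg.solve(Z, np.array([V, 0.0]))
    return 0.5 * abs(I2) ** 2 * RL

C2_nominal = 1 / (w**2 * L2)
C2_sweep = C2_nominal * np.linspace(0.8, 1.2, 401)
P = np.array([load_power(c) for c in C2_sweep])
best = C2_sweep[np.argmax(P)]
print(f"nominal C2 = {C2_nominal*1e9:.1f} nF -> {load_power(C2_nominal):.3f} W")
print(f"best    C2 = {best*1e9:.1f} nF -> {P.max():.3f} W")
```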

  18. Exploration of a capability-focused aerospace system of systems architecture alternative with bilayer design space, based on RST-SOM algorithmic methods.

    PubMed

    Li, Zhifei; Qin, Dongliang; Yang, Feng

    2014-01-01

    In defense related programs, the use of capability-based analysis, design, and acquisition has been significant. In order to confront one of the most challenging features of a huge design space in capability based analysis (CBA), a literature review of design space exploration was first examined. Then, in the process of an aerospace system of systems design space exploration, a bilayer mapping method was put forward, based on the existing experimental and operating data. Finally, the feasibility of the foregoing approach was demonstrated with an illustrative example. With the data mining RST (rough sets theory) and SOM (self-organized mapping) techniques, the alternative to the aerospace system of systems architecture was mapped from P-space (performance space) to C-space (configuration space), and then from C-space to D-space (design space), respectively. Ultimately, the performance space was mapped to the design space, which completed the exploration and preliminary reduction of the entire design space. This method provides a computational analysis and implementation scheme for large-scale simulation.

  19. Material requirements for bio-inspired sensing systems

    NASA Astrophysics Data System (ADS)

    Biggins, Peter; Lloyd, Peter; Salmond, David; Kusterbeck, Anne

    2008-10-01

    The aim of developing bio-inspired sensing systems is to try and emulate the amazing sensitivity and specificity observed in the natural world. These capabilities have evolved, often for specific tasks, which provide the organism with an advantage in its fight to survive and prosper. Capabilities cover a wide range of sensing functions including vision, temperature, hearing, touch, taste and smell. For some functions, the capabilities of natural systems are still greater than that achieved by traditional engineering solutions; a good example being a dog's sense of smell. Furthermore, attempting to emulate aspects of biological optics, processing and guidance may lead to more simple and effective devices. A bio-inspired sensing system is much more than the sensory mechanism. A system will need to collect samples, especially if pathogens or chemicals are of interest. Other functions could include the provision of power, surfaces and receptors, structure, locomotion and control. In fact it is possible to conceive of a complete bio-inspired system concept which is likely to be radically different from more conventional approaches. This concept will be described and individual component technologies considered.

  20. Demonstration of Plasma Arc Environmental Technology Applications for the Demilitarization of DOD Stockpiles

    NASA Technical Reports Server (NTRS)

    Smith, Ed; Dee, P. E.; Zaghloul, Hany; Filius, Krag; Rivers, Tim

    2000-01-01

    Since 1989 the US Army Construction Engineering Research Laboratories (USACERL) have been active participants in the research and development toward establishing Plasma Arc Technology (PAT) as an efficient, economical, and safe hazardous waste immobilization tool. A plasma torch capable of generating high temperatures makes this technology a viable and powerful tool for the thermal destruction of various military industrial waste streams into an innocuous ceramic material no longer requiring hazardous waste landfill disposal. The emerging plasma environmental thermal treatment process has been used to safely and efficiently meet the disposal needs of various demilitarized components, such as: (1) pyrotechnic smoke assemblies, (2) thermal batteries, (3) proximity fuses, (4) cartridge actuated devices (CADs), and (5) propellant actuated devices (PADs). MSE Technology Applications, Inc. (MSE) has proposed and fabricated a Mobile Plasma Treatment System to be a technology demonstrator for pilot-scale mobile plasma waste processing. The system is capable of providing small-scale waste remediation services and conducting waste stream applicability demonstrations. The Mobile Plasma Treatment System's innovative concept provides the flexibility to treat waste streams at numerous sites, including sites with only a limited quantity of waste that is nonetheless too hazardous to transport to a regional fixed facility. The system was designed to be operated as skid-mounted modules consisting of a furnace module, controls module, offgas module, and ancillary systems module. All system components have been integrated to be operated from a single control station with both semi-continuous feeding and batch slag-pouring capability.

  1. Demonstration of Plasma Arc Environmental Technology Applications for the Demilitarization of DOD Stockpiles

    NASA Technical Reports Server (NTRS)

    Smith, Ed; Zaghloul, Hany; Filius, Krag; Rivers, Tim

    2000-01-01

    Since 1989 the U.S. Army Construction Engineering Research Laboratories (USACERL) have been active participants in the research and development toward establishing Plasma Arc Technology (PAT) as an efficient, economical, and safe hazardous waste immobilization tool. A plasma torch capable of generating high temperatures makes this technology a viable and powerful tool for the thermal destruction of various military industrial waste streams into an innocuous ceramic material no longer requiring hazardous waste landfill (Class 1) disposal. The emerging plasma environmental thermal treatment process has been used to safely and efficiently meet the disposal needs of various demilitarized components, such as: pyrotechnic smoke assemblies, thermal batteries, proximity fuses, cartridge actuated devices (CAD's), and propellant actuated devices (PAD's). MSE Technology Applications, Inc. (MSE) has proposed and fabricated a Mobile Plasma Treatment System to be a technology demonstrator for pilot-scale mobile plasma waste processing. The system is capable of providing small-scale waste remediation services and conducting waste stream applicability demonstrations. The Mobile Plasma Treatment System's innovative concept provides the flexibility to treat waste streams at numerous sites, including sites with only a limited quantity of waste that is nonetheless too hazardous to transport to a regional fixed facility. The system was designed to be operated as skid-mounted modules consisting of a furnace module, controls module, offgas module, and ancillary systems module. All system components have been integrated to be operated from a single control station with both semi-continuous feeding and batch slag-pouring capability.

  2. A Real-Time Capable Software-Defined Receiver Using GPU for Adaptive Anti-Jam GPS Sensors

    PubMed Central

    Seo, Jiwon; Chen, Yu-Hsuan; De Lorenzo, David S.; Lo, Sherman; Enge, Per; Akos, Dennis; Lee, Jiyun

    2011-01-01

    Due to their weak received signal power, Global Positioning System (GPS) signals are vulnerable to radio frequency interference. Adaptive beam and null steering of the gain pattern of a GPS antenna array can significantly increase the resistance of GPS sensors to signal interference and jamming. Since adaptive array processing requires intensive computational power, beamsteering GPS receivers were usually implemented using hardware such as field-programmable gate arrays (FPGAs). However, a software implementation using general-purpose processors is much more desirable because of its flexibility and cost effectiveness. This paper presents a GPS software-defined radio (SDR) with adaptive beamsteering capability for anti-jam applications. The GPS SDR design is based on an optimized desktop parallel processing architecture using a quad-core Central Processing Unit (CPU) coupled with a new generation Graphics Processing Unit (GPU) having massively parallel processors. This GPS SDR demonstrates sufficient computational capability to support a four-element antenna array and future GPS L5 signal processing in real time. After providing the details of our design and optimization schemes for future GPU-based GPS SDR developments, the jamming resistance of our GPS SDR under synthetic wideband jamming is presented. Since the GPS SDR uses commercial-off-the-shelf hardware and processors, it can be easily adopted in civil GPS applications requiring anti-jam capabilities. PMID:22164116

  3. A real-time capable software-defined receiver using GPU for adaptive anti-jam GPS sensors.

    PubMed

    Seo, Jiwon; Chen, Yu-Hsuan; De Lorenzo, David S; Lo, Sherman; Enge, Per; Akos, Dennis; Lee, Jiyun

    2011-01-01

    Due to their weak received signal power, Global Positioning System (GPS) signals are vulnerable to radio frequency interference. Adaptive beam and null steering of the gain pattern of a GPS antenna array can significantly increase the resistance of GPS sensors to signal interference and jamming. Since adaptive array processing requires intensive computational power, beamsteering GPS receivers were usually implemented using hardware such as field-programmable gate arrays (FPGAs). However, a software implementation using general-purpose processors is much more desirable because of its flexibility and cost effectiveness. This paper presents a GPS software-defined radio (SDR) with adaptive beamsteering capability for anti-jam applications. The GPS SDR design is based on an optimized desktop parallel processing architecture using a quad-core Central Processing Unit (CPU) coupled with a new generation Graphics Processing Unit (GPU) having massively parallel processors. This GPS SDR demonstrates sufficient computational capability to support a four-element antenna array and future GPS L5 signal processing in real time. After providing the details of our design and optimization schemes for future GPU-based GPS SDR developments, the jamming resistance of our GPS SDR under synthetic wideband jamming is presented. Since the GPS SDR uses commercial-off-the-shelf hardware and processors, it can be easily adopted in civil GPS applications requiring anti-jam capabilities.
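    The adaptive beamsteering idea can be illustrated with the NumPy sketch below, which forms minimum-variance distortionless response (MVDR) weights from a sample covariance matrix so that a four-element array keeps unit gain toward an assumed satellite direction while attenuating a strong jammer. The array geometry, signal directions, and power levels are invented, and the narrowband snapshot model is a simplification of the wideband sampled-IF processing an actual GPS SDR performs.

```python
import numpy as np

# Toy narrowband MVDR beamformer for a 4-element uniform linear array.
rng = np.random.default_rng(7)
c, f = 3e8, 1.57542e9                    # speed of light, GPS L1 frequency
lam = c / f
pos = np.arange(4) * lam / 2             # half-wavelength element spacing

def steering(theta_deg):
    """Array response for a plane wave arriving from angle theta (broadside = 0)."""
    k = 2 * np.pi / lam
    return np.exp(1j * k * pos * np.sin(np.radians(theta_deg)))

a_sat, a_jam = steering(10.0), steering(-40.0)

# Simulated snapshots: weak GPS signal, strong jammer, receiver noise.
N = 4096
s = 0.01 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
j = 10.0 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
n = (rng.standard_normal((4, N)) + 1j * rng.standard_normal((4, N))) / np.sqrt(2)
X = np.outer(a_sat, s) + np.outer(a_jam, j) + n

R = X @ X.conj().T / N                   # sample covariance matrix
w = np.linalg.solve(R, a_sat)            # MVDR weights (unnormalized)
w /= (a_sat.conj() @ w)                  # unit gain toward the satellite

print("gain toward satellite:", abs(w.conj() @ a_sat))
print("gain toward jammer   :", abs(w.conj() @ a_jam))
```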

  4. Network command processing system overview

    NASA Technical Reports Server (NTRS)

    Nam, Yon-Woo; Murphy, Lisa D.

    1993-01-01

    The Network Command Processing System (NCPS) developed for the National Aeronautics and Space Administration (NASA) Ground Network (GN) stations is a spacecraft command system utilizing a MULTIBUS I/68030 microprocessor. This system was developed and implemented at ground stations worldwide to provide a Project Operations Control Center (POCC) with command capability for support of spacecraft operations such as the LANDSAT, Shuttle, Tracking and Data Relay Satellite, and Nimbus-7. The NCPS consolidates multiple modulation schemes for supporting various manned/unmanned orbital platforms. The NCPS interacts with the POCC and a local operator to process configuration requests, generate modulated uplink sequences, and inform users of the ground command link status. This paper presents the system functional description, hardware description, and the software design.

  5. The automated multi-stage substructuring system for NASTRAN

    NASA Technical Reports Server (NTRS)

    Field, E. I.; Herting, D. N.; Herendeen, D. L.; Hoesly, R. L.

    1975-01-01

    The substructuring capability developed for eventual installation in Level 16 is now operational in a test version of NASTRAN. Its features are summarized. These include the user-oriented, Case Control type control language, the automated multi-stage matrix processing, the independent direct access data storage facilities, and the static and normal modes solution capabilities. A complete problem analysis sequence is presented with card-by-card description of the user input.

  6. Technetium recovery from high alkaline solution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nash, Charles A.

    2016-07-12

    Disclosed are methods for recovering technetium from a highly alkaline solution. The highly alkaline solution can be a liquid waste solution from a nuclear waste processing system. Methods can include combining the solution with a reductant capable of reducing technetium at the high pH of the solution and adding to or forming in the solution an adsorbent capable of adsorbing the precipitated technetium at the high pH of the solution.

  7. Real-time embedded atmospheric compensation for long-range imaging using the average bispectrum speckle method

    NASA Astrophysics Data System (ADS)

    Curt, Petersen F.; Bodnar, Michael R.; Ortiz, Fernando E.; Carrano, Carmen J.; Kelmelis, Eric J.

    2009-02-01

    While imaging over long distances is critical to a number of security and defense applications, such as homeland security and launch tracking, current optical systems are limited in resolving power. This is largely a result of the turbulent atmosphere in the path between the region under observation and the imaging system, which can severely degrade captured imagery. There are a variety of post-processing techniques capable of recovering this obscured image information; however, the computational complexity of such approaches has prohibited real-time deployment and hampers the usability of these technologies in many scenarios. To overcome this limitation, we have designed and manufactured an embedded image processing system based on commodity hardware which can compensate for these atmospheric disturbances in real-time. Our system consists of a reformulation of the average bispectrum speckle method coupled with a high-end FPGA processing board, and employs modular I/O capable of interfacing with most common digital and analog video transport methods (composite, component, VGA, DVI, SDI, HD-SDI, etc.). By leveraging the custom, reconfigurable nature of the FPGA, we have achieved performance twenty times faster than a modern desktop PC, in a form-factor that is compact, low-power, and field-deployable.
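
    For orientation only, and as a hedged sketch rather than the authors' FPGA pipeline, the fragment below accumulates the average power spectrum and average bispectrum of short-exposure frames in one dimension. The bispectrum's insensitivity to per-frame shifts is the property that lets the object's Fourier phase be recovered in a full speckle-imaging implementation; the toy object, frame count, and noise level are assumptions.

      # Average-bispectrum accumulation sketch (1-D, NumPy); phase recovery and
      # inverse filtering, which a full implementation needs, are omitted.
      import numpy as np

      def frame_bispectrum_1d(x):
          """Bispectrum B(u, v) = X(u) X(v) conj(X(u+v)) of a 1-D frame."""
          X = np.fft.fft(x)
          n = len(X)
          u = np.arange(n)
          return X[:, None] * X[None, :] * np.conj(X[(u[:, None] + u[None, :]) % n])

      rng = np.random.default_rng(1)
      n, frames = 64, 200
      truth = np.zeros(n); truth[20] = 1.0; truth[25] = 0.6   # simple toy "object"
      avg_power = np.zeros(n)
      avg_bispec = np.zeros((n, n), complex)
      for _ in range(frames):
          shift = rng.integers(0, n)            # random tilt: shifts scramble the raw Fourier phase
          frame = np.roll(truth, shift) + 0.01 * rng.standard_normal(n)
          avg_power += np.abs(np.fft.fft(frame)) ** 2 / frames
          avg_bispec += frame_bispectrum_1d(frame) / frames
      # The averaged bispectrum phase is insensitive to the per-frame shifts, which is what
      # allows the object's Fourier phase to be reconstructed recursively in practice.
      print("bispectrum phase at (1, 1):", np.angle(avg_bispec[1, 1]))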

  8. Information services and information processing

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Attempts made to design and extend space system capabilities are reported. Special attention was given to establishing user needs for information or services which might be provided by space systems. Data given do not attempt to detail scientific, technical, or economic bases for the needs expressed by the users.

  9. Cost Benefit Analysis of Enterprise Resource Planning System for the Naval Postgraduate School

    DTIC Science & Technology

    2002-06-01

    Department-wide introduction and use of appropriate commercial financial practices and reporting • Develop a strategic plan for implementing a business... Development of a process innovation approach given the current capabilities of the system, recommend possible alternatives to close gaps.

  10. DSN telemetry system data records

    NASA Technical Reports Server (NTRS)

    Gatz, E. C.

    1976-01-01

    The DSN telemetry system now includes the capability to provide a complete magnetic tape record, within 24 hours of reception, of all telemetry data received from a spacecraft. This record, the intermediate data record, is processed and generated almost entirely automatically, and provides a detailed accounting of any missing data.

  11. Microware: Hard, Soft, and Firm.

    ERIC Educational Resources Information Center

    Hutten, Leah R.

    1984-01-01

    Because a microcomputer system can be an expensive acquisition, the purchase decision needs to be made carefully. Steps to purchasing a microcomputer include: evaluating the data and word processing needs of the office, determining needs being met with existing capabilities, and making a cost-benefit comparison among systems. (MLW)

  12. Preliminary human factors guidelines for automated highway system designers. Volume 1 : guidelines for AHS designers

    DOT National Transportation Integrated Search

    1998-04-01

    Human factors can be defined as "designing to match the capabilities and limitations of the human user." The objectives of this human-centered design process are to maximize the effectiveness and efficiency of system performance, ensure a high level ...

  13. Optical memories in digital computing

    NASA Technical Reports Server (NTRS)

    Alford, C. O.; Gaylord, T. K.

    1979-01-01

    High-capacity optical memories with relatively high data-transfer rates and multiport simultaneous access capability may serve as the basis for new computer architectures. Several computer structures that might profitably use such memories are: (a) a simultaneous record-access system, (b) a simultaneously shared memory computer system, and (c) a parallel digital processing structure.

  14. The Application of a Trade Study Methodology to Determine Which Capabilities to Implement in a Test Facility Data Acquisition System Upgrade

    NASA Technical Reports Server (NTRS)

    McDougal, Kristopher J.

    2008-01-01

    More and more test programs are requiring high-frequency measurements. Marshall Space Flight Center's Cold Flow Test Facility has an interest in acquiring such data. The acquisition of this data requires special hardware and capabilities. This document provides a structured trade study approach for determining which additional capabilities of a VXI-based data acquisition system should be utilized to meet the test facility objectives. The paper focuses on the trade study approach, detailing and demonstrating the methodology. A case is presented in which a trade study was initially performed to provide a recommendation for the data system capabilities. Implementation details of the recommended alternative are briefly provided, as well as the system's performance during a subsequent test program. The paper then addresses revisiting the trade study with modified alternatives and attributes to address issues that arose during the subsequent test program. Although the model does not identify a single best alternative for all sensitivities, the trade study process does provide a much better understanding. This better understanding makes it possible to confidently recommend Alternative 3 as the preferred alternative.
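
    A generic weighted-sum trade matrix with a simple weight-sensitivity sweep, of the kind such a trade study might use, is sketched below; the attribute names, weights, and scores are invented for illustration and are not the values from the study.

      # Weighted-sum trade matrix with a crude weight-sensitivity sweep (illustrative only).
      import numpy as np

      attributes = ["channel count", "max sample rate", "cost", "integration effort"]
      weights = np.array([0.3, 0.4, 0.2, 0.1])            # must sum to 1
      # Rows = alternatives 1..3, columns = normalized scores (higher is better) per attribute.
      scores = np.array([[0.6, 0.5, 0.9, 0.8],
                         [0.8, 0.7, 0.6, 0.6],
                         [0.9, 0.9, 0.4, 0.5]])

      totals = scores @ weights
      print("baseline ranking (best first):", np.argsort(-totals) + 1)

      # Sensitivity: perturb each weight by +/-50% (renormalized) and see if the winner changes.
      for i, name in enumerate(attributes):
          for factor in (0.5, 1.5):
              w = weights.copy(); w[i] *= factor; w /= w.sum()
              print(f"{name} x{factor}: best = Alternative {np.argmax(scores @ w) + 1}")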

  15. Development of visual 3D virtual environment for control software

    NASA Technical Reports Server (NTRS)

    Hirose, Michitaka; Myoi, Takeshi; Amari, Haruo; Inamura, Kohei; Stark, Lawrence

    1991-01-01

    Virtual environments for software visualization may enable complex programs to be created and maintained. A typical application might be for control of regional electric power systems. As these encompass broader computer networks than ever, construction of such systems becomes very difficult. Conventional text-oriented environments are useful in programming individual processors. However, they are obviously insufficient to program a large and complicated system that includes large numbers of computers connected to each other; such programming is called 'programming in the large.' As a solution for this problem, the authors are developing a graphic programming environment wherein one can visualize complicated software in a virtual 3D world. One of the major features of the environment is the 3D representation of concurrent processes. The 3D representation is used to supply both network-wide interprocess programming capability (capability for 'programming in the large') and real-time programming capability. The authors' idea is to fuse both the block diagram (which is useful for checking relationships among a large number of processes or processors) and the time chart (which is useful for checking precise timing for synchronization) into a single 3D space. The 3D representation gives us a capability for direct and intuitive planning or understanding of complicated relationships among many concurrent processes. To realize the 3D representation, a technology to enable easy handling of virtual 3D objects is a definite necessity. Using a stereo display system and a gesture input device (VPL DataGlove), our prototype of the virtual workstation has been implemented. The workstation can supply the 'sensation' of the virtual 3D space to a programmer. Software for the 3D programming environment is implemented on the workstation. According to preliminary assessments, a 50 percent reduction of programming effort is achieved by using the virtual 3D environment. The authors expect that the 3D environment has considerable potential in the field of software engineering.

  16. Space station systems analysis study. Part 2, Volume 2. [technical report

    NASA Technical Reports Server (NTRS)

    1977-01-01

    Specific system options are defined and identified for a cost effective space station capable of orderly growth with regard to both function and orbit location. Selected program options are analyzed and configuration concepts are developed to meet objectives for the satellite power system, earth servicing, space processing, and supporting activities. Transportation systems are analyzed for both LEO and GEO orbits.

  17. Interoperable Architecture for Command and Control

    DTIC Science & Technology

    2014-06-01

    defined objective. Elements can include other systems, people, processes, technology and other support elements (Adapted from [9]). Enterprise System...An enterprise is an intentionally created entity of human endeavour with a certain purpose. An enterprise could be considered a type of system [7]. In...this case the enterprise is a Defence Enterprise System required by government as a tool to maintain national sovereignty. Capability

  18. Interactive Image Analysis System Design,

    DTIC Science & Technology

    1982-12-01

    This report describes a design for an interactive image analysis system (IIAS), which implements terrain data extraction techniques. The design employs commercially available, state of the art minicomputers and image display devices with proven software to achieve a cost effective, reliable image analysis system. Additionally, the system is fully capable of supporting many generic types of image analysis and data processing, and is modularly...

  19. The Human Nervous System: A Framework for Teaching and the Teaching Brain

    ERIC Educational Resources Information Center

    Rodriguez, Vanessa

    2013-01-01

    The teaching brain is a new concept that mirrors the complex, dynamic, and context-dependent nature of the learning brain. In this article, I use the structure of the human nervous system and its sensing, processing, and responding components as a framework for a re-conceptualized teaching system. This teaching system is capable of responses on an…

  20. ROBUS-2: A Fault-Tolerant Broadcast Communication System

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo; Malekpour, Mahyar R.; Miner, Paul S.

    2005-01-01

    The Reliable Optical Bus (ROBUS) is the core communication system of the Scalable Processor-Independent Design for Enhanced Reliability (SPIDER), a general-purpose fault-tolerant integrated modular architecture currently under development at NASA Langley Research Center. The ROBUS is a time-division multiple access (TDMA) broadcast communication system with medium access control by means of a time-indexed communication schedule. ROBUS-2 is a developmental version of the ROBUS providing guaranteed fault-tolerant services to the attached processing elements (PEs) in the presence of a bounded number of faults. These services include message broadcast (Byzantine Agreement), dynamic communication schedule update, clock synchronization, and distributed diagnosis (group membership). The ROBUS also features fault-tolerant startup and restart capabilities. ROBUS-2 is tolerant to internal as well as PE faults, and incorporates a dynamic self-reconfiguration capability driven by the internal diagnostic system. This version of the ROBUS is intended for laboratory experimentation and demonstrations of the capability to reintegrate failed nodes, dynamically update the communication schedule, and tolerate and recover from correlated transient faults.

  1. The Buffer Diagnostic Prototype: A fault isolation application using CLIPS

    NASA Technical Reports Server (NTRS)

    Porter, Ken

    1994-01-01

    This paper describes problem domain characteristics and development experiences from using CLIPS 6.0 in a proof-of-concept troubleshooting application called the Buffer Diagnostic Prototype. The problem domain is a large digital communications subsystem called the real-time network (RTN), which was designed to upgrade the launch processing system used for shuttle support at KSC. The RTN enables up to 255 computers to share 50,000 data points with millisecond response times. The RTN's extensive built-in test capability, combined with its lack of any automatic fault isolation capability, presents a unique opportunity for a diagnostic expert system application. The Buffer Diagnostic Prototype addresses RTN diagnosis with a multiple strategy approach. A novel technique called 'faulty causality' employs inexact qualitative models to process test results. Experimental knowledge provides a capability to recognize symptom-fault associations. The implementation utilizes rule-based and procedural programming techniques, including a goal-directed control structure and a simple text-based generic user interface that may be reusable for other rapid prototyping applications. Although limited in scope, this project demonstrates a diagnostic approach that may be adapted to troubleshoot a broad range of equipment.
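
    The symptom-fault association strategy can be illustrated with a toy lookup-and-ranking step. This is a hedged sketch only: the actual prototype is rule-based CLIPS, and the symptom and fault names below are hypothetical.

      # Toy symptom-to-fault association table and ranking step (hypothetical names).
      from collections import Counter

      SYMPTOM_FAULTS = {
          "buffer_parity_error": ["buffer_card", "backplane"],
          "missed_heartbeat":    ["network_interface", "cable"],
          "stale_data_points":   ["buffer_card", "network_interface"],
      }

      def rank_faults(observed_symptoms):
          """Count how many observed symptoms implicate each candidate fault."""
          votes = Counter()
          for s in observed_symptoms:
              votes.update(SYMPTOM_FAULTS.get(s, []))
          return votes.most_common()

      print(rank_faults(["buffer_parity_error", "stale_data_points"]))
      # e.g. [('buffer_card', 2), ('backplane', 1), ('network_interface', 1)]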

  2. (abstract) The EOS SAR Mission: A New Approach

    NASA Technical Reports Server (NTRS)

    Way, JoBea

    1993-01-01

    The goal of the Earth Orbiting System Synthetic Aperture Radar (EOS SAR) program is to help develop the modeling and observational capabilities to predict and monitor terrestrial and oceanic processes that are either causing global change or resulting from global change. Specifically, the EOS SAR will provide important geophysical products to the EOS data set to improve our understanding of the state and functioning of the Earth system. The strategy for the EOS SAR program is to define the instrument requirements based on required input to geophysical algorithms, provide the processing capability and algorithms to generate such products on the required spatial (global) and temporal (3-5 days) scales, and to provide the spaceborne instrumentation through international partnerships. Initially this partnership has been with Germany; currently we are exploring broader international partnerships. A MultiSAR approach to the EOS SAR, which includes a number of SARs provided by Japan, ESA, Germany, Canada, and the US in synergistic orbits, could be used to attain a truly global monitoring capability using multifrequency polarimetric signatures. These concepts and several options for mission scenarios will be presented.

  3. Large-scale neuromorphic computing systems

    NASA Astrophysics Data System (ADS)

    Furber, Steve

    2016-10-01

    Neuromorphic computing covers a diverse range of approaches to information processing, all of which demonstrate some degree of neurobiological inspiration that differentiates them from mainstream conventional computing systems. The philosophy behind neuromorphic computing has its origins in the seminal work carried out by Carver Mead at Caltech in the late 1980s. This early work influenced others to carry developments forward, and advances in VLSI technology supported steady growth in the scale and capability of neuromorphic devices. Recently, a number of large-scale neuromorphic projects have emerged, taking the approach to unprecedented scales and capabilities. These large-scale projects are associated with major new funding initiatives for brain-related research, creating a sense that the time and circumstances are right for progress in our understanding of information processing in the brain. In this review we present a brief history of neuromorphic engineering, then focus on some of the principal current large-scale projects (their main features, how their approaches are complementary and distinct, and their advantages and drawbacks), and highlight the sorts of capabilities that each can deliver to neural modellers.

  4. VISSR Atmospheric Sounder (VAS) Research Review

    NASA Technical Reports Server (NTRS)

    Greaves, J. R. (Editor)

    1983-01-01

    The VAS, an experimental instrument flown onboard the Geostationary Operational Environmental Satellite (GOES), is capable of achieving multispectral imagery of atmospheric temperature, water vapor, and cloudiness patterns over short time intervals. In addition, this instrument provides an atmospheric sounding capability from geosynchronous orbit. The VAS demonstration is an effort to evaluate the VAS instrument's performance and to demonstrate the capabilities of a VAS prototype system to provide useful geosynchronous satellite data for supporting weather forecasts and atmospheric research. The demonstration evaluates the performance of the VAS instruments on GOES-4, -5, and -6, develops research-oriented and prototype/operational VAS data processing systems, determines the accuracy of certain basic and derived meteorological parameters that can be obtained from the VAS instrument, and assesses the utility of VAS-derived information in analyzing severe weather situations.

  5. Integration of Multifidelity Multidisciplinary Computer Codes for Design and Analysis of Supersonic Aircraft

    NASA Technical Reports Server (NTRS)

    Geiselhart, Karl A.; Ozoroski, Lori P.; Fenbert, James W.; Shields, Elwood W.; Li, Wu

    2011-01-01

    This paper documents the development of a conceptual-level integrated process for design and analysis of efficient and environmentally acceptable supersonic aircraft. To overcome the technical challenges of achieving this goal, a conceptual design capability is needed that provides users with the ability to examine the integrated solution across all disciplines and facilitates the application of multidisciplinary design, analysis, and optimization on a scale greater than previously achieved. The described capability is both an interactive design environment and a high-powered optimization system with a unique blend of low-, mixed-, and high-fidelity engineering tools combined in the software integration framework, ModelCenter. The various modules are described and capabilities of the system are demonstrated. The current limitations and proposed future enhancements are also discussed.

  6. ASAP progress and expenditure report for the month of December 1--31, 1995. Joint UK/US radar program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Twogood, R.E.; Brase, J.M.; Chambers, D.H.

    1996-01-19

    The RAR/SAR is a high-priority radar system for the joint US/UK Program. Based on previous experiment results and coordination with the UK, specifications needed for future radar experiments were identified as follows: dual polarimetric (HH and VV) with medium to high resolution in SAR mode. Secondary airborne installation requirements included: high power (circa 10 kW) and SLIER capability to emulate a Tupolev-134 type system; initially X-band but easily extendible to other frequencies. In FY96 we intended to enhance the radar system's capabilities by providing a second polarization (VV), a spotlight imaging mode, extended frequency of operation to include S-band, increased power, and an interface to an existing infrared sensor. Short-term objectives are: continue to evaluate and characterize the radar system; upgrade navigation and real-time processing capability to refine motion compensation; upgrade to dual polarimetry (add VV); and develop a 'spotlight' mode capability. Accomplishments this reporting period: design specifications for the SAR system polarimetric upgrade are complete. The upgrade is ready to begin the procurement cycle when funds become available. System characterization is one of the highest priority tasks for the SAR. Although the radar is dedicated for our use, Hughes is waiting for contract funding before allowing us access to the hardware.

  7. Implementing an International Consultation on Earth System Research Priorities Using Web 2.0 Tools

    NASA Astrophysics Data System (ADS)

    Goldfarb, L.; Yang, A.

    2009-12-01

    Leah Goldfarb, Paul Cutler, Andrew Yang*, Mustapha Mokrane, Jacinta Legg, and Deliang Chen. The scientific community has been engaged in developing an international strategy on Earth system research. The initial consultation in this “visioning” process focused on gathering suggestions for Earth system research priorities that are interdisciplinary and address the most pressing societal issues. This was implemented through a website that utilized Web 2.0 capabilities. The website (http://www.icsu-visioning.org/) collected input from 15 July to 1 September 2009. This consultation was the first in which the international scientific community was asked to help shape the future of a research theme. The site attracted over 7000 visitors from 133 countries, more than 1000 of whom registered and took advantage of the site’s functionality to contribute research questions (~300 questions), comment on posts, and/or vote on questions. To facilitate analysis of results, the site captured a small set of voluntary information about each contributor and their contribution. A group of ~50 international experts were invited to analyze the inputs at a “Visioning Earth System Research” meeting held in September 2009. The outcome of this meeting—a prioritized list of research questions to be investigated over the next decade—was then posted on the visioning website for additional comment from the community through an online survey tool. In general, many lessons were learned in the development and implementation of this website, both in terms of the opportunities offered by Web 2.0 capabilities and the application of these capabilities. It is hoped that this process may serve as a model for other scientific communities. The International Council for Science (ICSU) in cooperation with the International Social Science Council (ISSC) is responsible for organizing this Earth system visioning process.

  8. Damage Precursor Identification via Microstructure-Sensitive Nondestructive Evaluation

    NASA Astrophysics Data System (ADS)

    Wisner, Brian John

    Damage in materials is a complex and stochastic process bridging several time and length scales. This dissertation focuses on investigating the damage process in a particular class of precipitate-hardened aluminum alloys that are widely used in automotive and aerospace applications. Most emphasis in the literature has been given either to their ductility for manufacturing purposes or to their fracture for performance considerations. In this dissertation, emphasis is placed on using nondestructive evaluation (NDE) combined with mechanical testing and characterization methods applied at a scale where damage incubation and initiation is occurring. Specifically, a novel setup built inside a Scanning Electron Microscope (SEM) and retrofitted to be combined with characterization and NDE capabilities was developed with the goal of tracking the early stages of the damage process in this type of material. The characterization capabilities include Electron Backscatter Diffraction (EBSD) and Energy Dispersive Spectroscopy (EDS), along with X-ray micro-computed tomography (μ-CT), nanoindentation, and microscopy via the Secondary Electron (SE) and Back Scatter Electron (BSE) detectors. The mechanical testing inside the SEM was achieved with the use of an appropriate stage that fits within its chamber and is capable of applying both axial and bending monotonic and cyclic loads. The NDE capabilities, beyond the microscopy and μ-CT, include the methods of Acoustic Emission and Digital Image Correlation (DIC). This setup was used to identify damage precursors in this material system and their evolution over time and space. The experimental results were analyzed by a custom signal processing scheme that involves both feature-based analyses and a machine learning method to relate recorded microstructural data to damage in this material. Extensions of the presented approach to include information from computational methods, as well as its applicability to other material systems, are discussed.
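
    As a hedged illustration of the feature-based/machine-learning side of such a processing scheme (not the dissertation's actual pipeline), the sketch below clusters simple acoustic-emission hit features into two families on synthetic data. The feature choices, distributions, and use of scikit-learn are assumptions for the example.

      # Clustering synthetic acoustic-emission hit features (requires scikit-learn).
      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(2)
      # Synthetic hit features: columns = [peak amplitude (dB), duration (us), energy (a.u.)]
      background = np.column_stack([rng.normal(45, 3, 200), rng.normal(30, 8, 200), rng.normal(1, 0.3, 200)])
      cracking   = np.column_stack([rng.normal(70, 5, 40),  rng.normal(120, 25, 40), rng.normal(8, 2, 40)])
      hits = np.vstack([background, cracking])

      # Standardize features, then cluster into two families.
      z = (hits - hits.mean(axis=0)) / hits.std(axis=0)
      labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(z)
      for k in range(2):
          print(f"cluster {k}: {np.sum(labels == k)} hits, mean amplitude "
                f"{hits[labels == k, 0].mean():.1f} dB")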

  9. Lunar Landing Trajectory Design for Onboard Hazard Detection and Avoidance

    NASA Technical Reports Server (NTRS)

    Paschall, Steve; Brady, Tye; Sostaric, Ron

    2009-01-01

    The Autonomous Landing and Hazard Avoidance Technology (ALHAT) Project is developing the software and hardware technology needed to support a safe and precise landing for the next generation of lunar missions. ALHAT provides this capability through terrain-relative navigation measurements to enhance global-scale precision, an onboard hazard detection system to select safe landing locations, and an Autonomous Guidance, Navigation, and Control (AGNC) capability to process these measurements and safely direct the vehicle to a landing location. This paper focuses on the key trajectory design issues relevant to providing an onboard Hazard Detection and Avoidance (HDA) capability for the lander. Hazard detection can be accomplished by the crew visually scanning the terrain through a window, a sensor system imaging the terrain, or some combination of both. For ALHAT, this hazard detection activity is provided by a sensor system, which either augments the crew's perception or entirely replaces the crew in the case of a robotic landing. Detecting hazards influences the trajectory design by requiring the proper perspective, range to the landing site, and sufficient time to view the terrain. Following this, the trajectory design must provide additional time to process this information and make a decision about where to safely land. During the final part of the HDA process, the trajectory design must provide sufficient margin to enable a hazard avoidance maneuver. In order to demonstrate the effects of these constraints on the landing trajectory, a tradespace of trajectory designs was created for the initial ALHAT Design Analysis Cycle (ALDAC-1) and each case was evaluated with these HDA constraints active. The ALHAT analysis process, described in this paper, narrows down this tradespace and subsequently better defines the trajectory design needed to support onboard HDA. Future ALDACs will enhance this trajectory design by balancing these issues and others in an overall system design process.

  10. Geographic Information System (GIS) capabilities in traffic accident information management: a qualitative approach.

    PubMed

    Ahmadi, Maryam; Valinejadi, Ali; Goodarzi, Afshin; Safari, Ameneh; Hemmat, Morteza; Majdabadi, Hesamedin Askari; Mohammadi, Ali

    2017-06-01

    Traffic accidents are one of the more important national and international issues, and their consequences matter at the political, economic, and social levels in a country. Management of traffic accident information requires information systems with analytical capabilities and access to spatial and descriptive data. The aim of this study was to determine the capabilities of a Geographic Information System (GIS) in the management of traffic accident information. This qualitative cross-sectional study was performed in 2016. In the first step, GIS capabilities were identified via literature retrieved from the Internet and based on the inclusion criteria. Review of the literature was performed until data saturation was reached; a form was used to extract the capabilities. In the second step, the study population comprised hospital managers, police, emergency personnel, statisticians, and IT experts in trauma, emergency, and police centers. Sampling was purposive. Data were collected using a questionnaire based on the first-step data; validity and reliability were established by content validity and a Cronbach's alpha of 0.75. Data were analyzed using the decision Delphi technique. GIS capabilities were identified in ten categories and 64 sub-categories. Importing and processing spatial and descriptive data, and analyzing those data, were the most important capabilities of GIS in traffic accident information management. Storing and retrieving descriptive and spatial data; providing statistical analysis in table, chart, and zoning formats; managing ill-structured issues; determining the cost-effectiveness of decisions; and prioritizing their implementation were the GIS capabilities most useful in the management of traffic accident information.
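
    For reference, the reliability statistic cited above can be computed as in the hedged sketch below; the response matrix is synthetic and the item counts are invented.

      # Cronbach's alpha: k/(k-1) * (1 - sum(item variances) / variance(total score)).
      import numpy as np

      def cronbach_alpha(items):
          """items: (respondents, questions) matrix of scores."""
          items = np.asarray(items, float)
          k = items.shape[1]
          item_vars = items.var(axis=0, ddof=1)
          total_var = items.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1 - item_vars.sum() / total_var)

      rng = np.random.default_rng(3)
      latent = rng.normal(size=(30, 1))                       # shared "true" opinion
      responses = latent + 0.7 * rng.normal(size=(30, 10))    # 30 respondents, 10 items
      print(f"alpha = {cronbach_alpha(responses):.2f}")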

  11. Capability 9.3 Assembly and Deployment

    NASA Technical Reports Server (NTRS)

    Dorsey, John

    2005-01-01

    Large space systems are required for a range of operational, commercial, and scientific mission objectives; however, current launch vehicle capacities substantially limit the size of space systems (on-orbit or planetary). Assembly and deployment is the process of constructing a spacecraft or system from modules, which may in turn have been constructed from sub-modules in a hierarchical fashion. In-situ assembly of space exploration vehicles and systems will require a broad range of operational capabilities, including component transfer and storage, fluid handling, construction and assembly, and test and verification. Efficient execution of these functions will require supporting infrastructure that can: receive, store, and protect materials and components; hold and secure; position, align, and control; deploy; connect/disconnect; construct; join; assemble/disassemble; dock/undock; and mate/demate.

  12. Comparison of existing digital image analysis systems for the analysis of Thematic Mapper data

    NASA Technical Reports Server (NTRS)

    Likens, W. C.; Wrigley, R. C.

    1984-01-01

    Most existing image analysis systems were designed with the Landsat Multi-Spectral Scanner in mind, leaving open the question of whether or not these systems could adequately process Thematic Mapper data. In this report, both hardware and software systems have been evaluated for compatibility with TM data. Lack of spectral analysis capability was not found to be a problem, though techniques for spatial filtering and texture varied. Computer processing speed and data storage of currently existing mini-computer based systems may be less than adequate. Upgrading to more powerful hardware may be required for many TM applications.

  13. From ecological test site to geographic information system: lessons for the 1980's

    USGS Publications Warehouse

    Alexander, Robert H.

    1981-01-01

    Geographic information systems were common elements in two kinds of interdisciplinary regional demonstration projects in the 1970's. Ecological test sites attempted to provide for more efficient remote-sensing data delivery for regional environmental management. Regional environmental systems analysis attempted to formally describe and model the interacting regional social and environmental processes, including the resource-use decision-making process. Lessons for the 1980's are drawn from recent evaluations and assessments of these programs, focusing on cost, rates of system development and technology transfer, program coordination, integrative analysis capability, and the involvement of system users and decision makers.

  14. The Advanced Technology Development Center (ATDC)

    NASA Technical Reports Server (NTRS)

    Clements, G. R.; Willcoxon, R. (Technical Monitor)

    2001-01-01

    NASA is building the Advanced Technology Development Center (ATDC) to provide a 'national resource' for the research, development, demonstration, testing, and qualification of Spaceport and Range Technologies. The ATDC will be located at Space Launch Complex 20 (SLC-20) at Cape Canaveral Air Force Station (CCAFS) in Florida. SLC-20 currently provides a processing and launch capability for small-scale rockets; this capability will be augmented with additional ATDC facilities to provide a comprehensive and integrated in situ environment. Examples of Spaceport Technologies that will be supported by ATDC infrastructure include densified cryogenic systems, intelligent automated umbilicals, integrated vehicle health management systems, next-generation safety systems, and advanced range systems. The ATDC can be thought of as a prototype spaceport where industry, government, and academia, in partnership, can work together to improve safety of future space initiatives. The ATDC is being deployed in five separate phases. Major ATDC facilities will include a Liquid Oxygen Area; a Liquid Hydrogen Area, a Liquid Nitrogen Area, and a multipurpose Launch Mount; 'Iron Rocket' Test Demonstrator; a Processing Facility with a Checkout and Control System; and Future Infrastructure Developments. Initial ATDC development will be completed in 2006.

  15. Computer-Aided Systems Engineering for Flight Research Projects Using a Workgroup Database

    NASA Technical Reports Server (NTRS)

    Mizukami, Masahi

    2004-01-01

    An online systems engineering tool for flight research projects has been developed through the use of a workgroup database. Capabilities are implemented for typical flight research systems engineering needs in document library, configuration control, hazard analysis, hardware database, requirements management, action item tracking, project team information, and technical performance metrics. Repetitive tasks are automated to reduce workload and errors. Current data and documents are instantly available online and can be worked on collaboratively. Existing forms and conventional processes are used, rather than inventing or changing processes to fit the tool. An integrated tool set offers advantages by automatically cross-referencing data, minimizing redundant data entry, and reducing the number of programs that must be learned. With a simplified approach, significant improvements are attained over existing capabilities for minimal cost. By using a workgroup-level database platform, personnel most directly involved in the project can develop, modify, and maintain the system, thereby saving time and money. As a pilot project, the system has been used to support an in-house flight experiment. Options are proposed for developing and deploying this type of tool on a more extensive basis.

  16. Hydra—The National Earthquake Information Center’s 24/7 seismic monitoring, analysis, catalog production, quality analysis, and special studies tool suite

    USGS Publications Warehouse

    Patton, John M.; Guy, Michelle R.; Benz, Harley M.; Buland, Raymond P.; Erickson, Brian K.; Kragness, David S.

    2016-08-18

    This report provides an overview of the capabilities and design of Hydra, the global seismic monitoring and analysis system used for earthquake response and catalog production at the U.S. Geological Survey National Earthquake Information Center (NEIC). Hydra supports the NEIC's worldwide earthquake monitoring mission in areas such as seismic event detection, seismic data insertion and storage, seismic data processing and analysis, and seismic data output. The Hydra system automatically identifies seismic phase arrival times and detects the occurrence of earthquakes in near-real time. The system integrates and inserts parametric and waveform seismic data into discrete events in a database for analysis. Hydra computes seismic event parameters, including locations, multiple magnitudes, moment tensors, and depth estimates. Hydra supports the NEIC's 24/7 analyst staff with a suite of seismic analysis graphical user interfaces. In addition to the NEIC's monitoring needs, the system supports the processing of aftershock and temporary deployment data, and supports the NEIC's quality assurance procedures. The Hydra system continues to be developed to expand its seismic analysis and monitoring capabilities.
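
    As a hedged illustration of automated phase detection in general (a classic STA/LTA trigger, not necessarily Hydra's detector), the sketch below flags a synthetic arrival in a noisy trace; the sampling rate, window lengths, and threshold are assumed values.

      # Short-term / long-term average (STA/LTA) trigger on a synthetic trace.
      import numpy as np

      def sta_lta(trace, fs, sta_s=1.0, lta_s=20.0):
          """Ratio of short-term to long-term average signal energy (trailing windows)."""
          e = trace.astype(float) ** 2
          def moving_avg(x, n):
              c = np.cumsum(np.concatenate(([0.0], x)))
              out = np.full(x.shape, np.nan)
              out[n - 1:] = (c[n:] - c[:-n]) / n
              return out
          with np.errstate(invalid="ignore", divide="ignore"):
              return moving_avg(e, int(sta_s * fs)) / moving_avg(e, int(lta_s * fs))

      rng = np.random.default_rng(4)
      fs = 100.0
      trace = rng.normal(0.0, 1.0, int(120 * fs))
      trace[int(60 * fs):int(62 * fs)] += 8.0 * rng.normal(0.0, 1.0, int(2 * fs))  # synthetic arrival
      ratio = np.nan_to_num(sta_lta(trace, fs))
      onsets = np.flatnonzero(ratio > 4.0)
      print(f"first trigger at t = {onsets[0] / fs:.1f} s" if onsets.size else "no trigger")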

  17. Conversion of LARSYS III.1 to an IBM 370 computer

    NASA Technical Reports Server (NTRS)

    Williams, G. N.; Leggett, J.; Hascall, G. A.

    1975-01-01

    A software system for processing multispectral aircraft or satellite data (LARSYS) was designed and written at the Laboratory for Applications of Remote Sensing at Purdue University. This system, being implemented on an IBM 360/67 computer utilizing the Cambridge Monitor System, is of an interactive nature. TAMU LARSYS maintains the essential capabilities of Purdue's LARSYS. The machine configuration for which it has been converted is an IBM-compatible Amdahl 470V/6 computer utilizing the time sharing option of the currently implemented OS/VS2 Operating System. Due to TSO limitations, the NASA-JSC deliverable TAMU LARSYS is comprised of two parts. Part one is a TSO Control Card Checker for LARSYS control cards, and part two is a batch version of LARSYS. Used together, they afford most of the capabilities of the original LARSYS III.1. Additionally, two programs have been written by TAMU to support LARSYS processing. The first is an ERTS-to-MIST conversion program used to convert ERTS data to the LARSYS input form, the MIST tape. The second is a system runtable code which maintains tape/file location information for the MIST data sets.

  18. Ordering Design Tasks Based on Coupling Strengths

    NASA Technical Reports Server (NTRS)

    Rogers, J. L.; Bloebaum, C. L.

    1994-01-01

    The design process associated with large engineering systems requires an initial decomposition of the complex system into modules of design tasks which are coupled through the transference of output data. In analyzing or optimizing such a coupled system, it is essential to be able to determine which interactions figure prominently enough to significantly affect the accuracy of the system solution. Many decomposition approaches assume the capability is available to determine what design tasks and interactions exist and what order of execution will be imposed during the analysis process. Unfortunately, this is often a complex problem and beyond the capabilities of a human design manager. A new feature for DeMAID (Design Manager's Aid for Intelligent Decomposition) will allow the design manager to use coupling strength information to find a proper sequence for ordering the design tasks. In addition, these coupling strengths aid in deciding if certain tasks or couplings could be removed (or temporarily suspended) from consideration to achieve computational savings without a significant loss of system accuracy. New rules are presented and two small test cases are used to show the effects of using coupling strengths in this manner.
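
    The idea of sequencing tasks by coupling strength can be illustrated with a hedged sketch that brute-forces an ordering minimizing the total strength of feedback couplings on a small, invented task set; this is not DeMAID's knowledge-based algorithm, and the task names and strengths are hypothetical.

      # Order coupled design tasks to minimize feedback coupling strength (brute force).
      from itertools import permutations

      tasks = ["aero", "structures", "propulsion", "controls", "sizing"]
      # strength[i][j] = strength of the coupling from task i's output into task j.
      strength = [
          [0,   0.9, 0.2, 0.6, 0.8],   # aero ->
          [0.7, 0,   0.0, 0.1, 0.9],   # structures ->
          [0.3, 0.0, 0,   0.5, 0.7],   # propulsion ->
          [0.1, 0.0, 0.4, 0,   0.2],   # controls ->
          [0.4, 0.5, 0.6, 0.3, 0  ],   # sizing ->
      ]

      def feedback_cost(order):
          """Sum of coupling strengths that flow from a later task back to an earlier one."""
          pos = {t: k for k, t in enumerate(order)}
          return sum(strength[i][j]
                     for i in range(len(tasks)) for j in range(len(tasks))
                     if strength[i][j] > 0 and pos[tasks[i]] > pos[tasks[j]])

      best = min(permutations(range(len(tasks))),
                 key=lambda p: feedback_cost([tasks[i] for i in p]))
      print("suggested order:", [tasks[i] for i in best])
      print("remaining feedback strength:", feedback_cost([tasks[i] for i in best]))

    Couplings whose strength falls below some tolerance could simply be zeroed out before the search, which is the computational-savings idea mentioned above.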

  19. Ordering design tasks based on coupling strengths

    NASA Technical Reports Server (NTRS)

    Rogers, James L., Jr.; Bloebaum, Christina L.

    1994-01-01

    The design process associated with large engineering systems requires an initial decomposition of the complex system into modules of design tasks which are coupled through the transference of output data. In analyzing or optimizing such a coupled system, it is essential to be able to determine which interactions figure prominently enough to significantly affect the accuracy of the system solution. Many decomposition approaches assume the capability is available to determine what design tasks and interactions exist and what order of execution will be imposed during the analysis process. Unfortunately, this is often a complex problem and beyond the capabilities of a human design manager. A new feature for DeMAID (Design Manager's Aid for Intelligent Decomposition) will allow the design manager to use coupling strength information to find a proper sequence for ordering the design tasks. In addition, these coupling strengths aid in deciding if certain tasks or couplings could be removed (or temporarily suspended) from consideration to achieve computational savings without a significant loss of system accuracy. New rules are presented and two small test cases are used to show the effects of using coupling strengths in this manner.

  20. Monitoring real-time navigation processes using the automated reasoning tool (ART)

    NASA Technical Reports Server (NTRS)

    Maletz, M. C.; Culbert, C. J.

    1985-01-01

    An expert system is described for monitoring and controlling navigation processes in real-time. The ART-based system features data-driven computation, accommodation of synchronous and asynchronous data, temporal modeling for individual time intervals and chains of time intervals, and hypothetical reasoning capabilities that consider alternative interpretations of the state of navigation processes. The concept is illustrated in terms of the NAVEX system for monitoring and controlling the high speed ground navigation console for Mission Control at Johnson Space Center. The reasoning processes are outlined, including techniques used to consider alternative data interpretations. Installation of the system has permitted using a single operator, instead of three, to monitor the ascent and entry phases of a Shuttle mission.

  1. Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC) : gap analysis for high fidelity and performance assessment code development.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe, Jr.

    2011-03-01

    This report describes a gap analysis performed in the process of developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with rigorous verification, validation, and software quality requirements. The gap analyses documented in this report were performed during an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC, and during follow-on activities that delved into more detailed assessments of the various codes that were acquired, studied, and tested. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and to develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. The gap analysis indicates that significant capabilities may already exist in the existing THC codes, although there is no single code able to fully account for all physical and chemical processes involved in a waste disposal system. Large gaps exist in modeling chemical processes and their couplings with other processes. The coupling of chemical processes with flow transport and mechanical deformation remains challenging. The data for extreme environments (e.g., for elevated temperature and high ionic strength media) that are needed for repository modeling are severely lacking. In addition, most existing reactive transport codes were developed for non-radioactive contaminants, and they need to be adapted to account for radionuclide decay and in-growth. Accessibility to the source codes is generally limited. Because the problems of interest for the Waste IPSC are likely to result in relatively large computational models, a compact memory-usage footprint and a fast, robust solution procedure will be needed. A robust massively parallel processing (MPP) capability will also be required to provide reasonable turnaround times on the analyses that will be performed with the code. A performance assessment (PA) calculation for a waste disposal system generally requires a large number (hundreds to thousands) of model simulations to quantify the effect of model parameter uncertainties on the predicted repository performance. A set of codes for a PA calculation must be sufficiently robust and fast in terms of code execution. A PA system as a whole must be able to provide multiple alternative models for a specific set of physical/chemical processes, so that users can choose various levels of modeling complexity based on their modeling needs. This requires PA codes, preferably, to be highly modularized. Most of the existing codes have difficulties meeting these requirements.
Based on the gap analysis results, we have made the following recommendations for the code selection and code development for the NEAMS Waste IPSC: (1) build fully coupled high-fidelity THCMBR codes using the existing SIERRA codes (e.g., ARIA and ADAGIO) and platform, (2) use DAKOTA to build an enhanced performance assessment system (EPAS), and (3) build a modular code architecture and key code modules for performance assessments. The key chemical calculation modules will be built by expanding the existing CANTERA capabilities as well as by extracting useful components from other existing codes.

  2. Wood transportation systems-a spin-off of a computerized information and mapping technique

    Treesearch

    William W. Phillips; Thomas J. Corcoran

    1978-01-01

    A computerized mapping system originally developed for planning the control of the spruce budworm in Maine has been extended into a tool for planning road network development and optimizing transportation costs. A budgetary process and a mathematical linear programming routine are used interactively with the mapping and information retrieval capabilities of the system...

  3. Digital Avionics Information System (DAIS): Mid-1980's Maintenance Task Analysis. Final Report.

    ERIC Educational Resources Information Center

    Czuchry, Andrew J.; And Others

    The fundamental objective of the Digital Avionics Information System (DAIS) Life Cycle Cost (LCC) Study is to provide the Air Force with an enhanced in-house capability to incorporate LCC considerations during all stages of the system acquisition process. The purpose of this report is to describe the technical approach, results, and conclusions…

  4. NOUS: A Knowledge Graph Management System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Knowledge graphs represent information as entities and relationships between them. For tasks such as natural language question answering or automated analysis of text, a knowledge graph provides valuable context to establish the specific type of entities being discussed. It allows us to derive better context about newly arriving information and leads to intelligent reasoning capabilities. We address two primary needs: A) Automated construction of knowledge graphs is a technically challenging, expensive process; and B) The ability to synthesize new information by monitoring newly emerging knowledge is a transformational capability that does not exist in state-of-the-art systems.
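
    As a hedged illustration of the underlying data model only (not NOUS itself), a toy in-memory triple store with the entity/relationship lookups a knowledge graph supports might look like the sketch below; the entities and predicates are invented.

      # Minimal triple store: subject-predicate-object facts with simple pattern queries.
      from collections import defaultdict

      class TripleStore:
          def __init__(self):
              self.by_subject = defaultdict(list)

          def add(self, subject, predicate, obj):
              self.by_subject[subject].append((predicate, obj))

          def query(self, subject=None, predicate=None):
              """Return (subject, predicate, object) triples matching the given pattern."""
              subjects = [subject] if subject else list(self.by_subject)
              return [(s, p, o) for s in subjects for (p, o) in self.by_subject[s]
                      if predicate is None or p == predicate]

      kg = TripleStore()
      kg.add("Curiosity", "type", "rover")
      kg.add("Curiosity", "operated_by", "NASA")
      kg.add("NASA", "type", "agency")
      print(kg.query(subject="Curiosity"))
      print(kg.query(predicate="type"))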

  5. A VHDL Core for Intrinsic Evolution of Discrete Time Filters with Signal Feedback

    NASA Technical Reports Server (NTRS)

    Gwaltney, David A.; Dutton, Kenneth

    2005-01-01

    The design of an Evolvable Machine VHDL Core is presented, representing a discrete-time processing structure capable of supporting control system applications. This VHDL Core is implemented in an FPGA and is interfaced with an evolutionary algorithm implemented in firmware on a Digital Signal Processor (DSP) to create an evolvable system platform. The salient features of this architecture are presented. The capability to implement IIR filter structures is presented along with the results of the intrinsic evolution of a filter. The robustness of the evolved filter design is tested and its unique characteristics are described.
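
    A hedged sketch of the general evolve-and-evaluate loop for IIR coefficients is shown below. It is illustrative only: the paper evolves the filter intrinsically in an FPGA with the algorithm hosted on a DSP, whereas this sketch runs entirely in Python, and the target filter, genome encoding, and mutation parameters are assumptions.

      # Toy evolutionary search over second-order IIR coefficients with a stability check.
      import numpy as np

      def iir_impulse(b, a, n=64):
          """Impulse response of y[k] = b0*x[k] + b1*x[k-1] + b2*x[k-2] - a1*y[k-1] - a2*y[k-2]."""
          y = np.zeros(n)
          x = np.zeros(n); x[0] = 1.0
          for k in range(n):
              y[k] = (b[0] * x[k]
                      + b[1] * (x[k - 1] if k >= 1 else 0.0) + b[2] * (x[k - 2] if k >= 2 else 0.0)
                      - a[0] * (y[k - 1] if k >= 1 else 0.0) - a[1] * (y[k - 2] if k >= 2 else 0.0))
          return y

      def stable(a):
          """Second-order stability: poles of z^2 + a1*z + a2 inside the unit circle."""
          return np.all(np.abs(np.roots([1.0, a[0], a[1]])) < 1.0)

      rng = np.random.default_rng(5)
      target = iir_impulse(b=[0.2, 0.1, 0.05], a=[-1.1, 0.45])   # hypothetical target filter
      pop = [rng.uniform(-0.5, 0.5, 5) for _ in range(40)]       # genome = [b0, b1, b2, a1, a2]

      def fitness(g):
          if not stable(g[3:]):
              return 1e9                                         # penalize unstable candidates
          return float(np.sum((iir_impulse(g[:3], g[3:]) - target) ** 2))

      for gen in range(200):                                     # (mu + lambda)-style loop
          pop.sort(key=fitness)
          parents = pop[:10]
          pop = parents + [p + rng.normal(0.0, 0.05, 5) for p in parents for _ in range(3)]

      best = min(pop, key=fitness)
      print("best squared error:", round(fitness(best), 4))
      print("best [b0, b1, b2, a1, a2]:", np.round(best, 3))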

  6. SIRU development. Volume 3: Software description and program documentation

    NASA Technical Reports Server (NTRS)

    Oehrle, J.

    1973-01-01

    The development and initial evaluation of a strapdown inertial reference unit (SIRU) system are discussed. The SIRU configuration is a modular inertial subsystem with hardware and software features that achieve fault-tolerant operational capabilities. The SIRU redundant hardware design is formulated about a six-gyro and six-accelerometer instrument module package. The six-axis array provides redundant, independent sensing, and the symmetry enables the formulation of an optimal software redundant data processing structure with self-contained fault detection and isolation (FDI) capabilities. The basic SIRU software coding system used in the DDP-516 computer is documented.
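
    The generic idea behind self-contained FDI in a redundant, skewed sensor array can be sketched with a parity-space check. This is hedged: the geometry, noise levels, and fault size below are illustrative and do not reproduce the actual SIRU instrument arrangement.

      # Parity-space fault detection/isolation for a redundant single-axis sensor array.
      import numpy as np

      rng = np.random.default_rng(6)
      # Measurement model: m = H @ omega + noise (+ fault), six skewed single-axis sensors.
      H = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1],
                    [0.577, 0.577, 0.577], [0.577, -0.577, 0.577], [-0.577, 0.577, 0.577]])
      # Parity matrix V spans the left null space of H (V @ H = 0), so the parity vector
      # V @ m depends only on noise and faults, not on the true angular rate.
      U, s, Vt = np.linalg.svd(H)
      V = U[:, 3:].T                          # (3, 6) because rank(H) = 3

      omega = np.array([0.02, -0.01, 0.03])   # true body rate (rad/s)
      m = H @ omega + 1e-4 * rng.standard_normal(6)
      m[4] += 5e-3                            # inject a bias fault on sensor 4

      parity = V @ m
      # Attribute the fault to the sensor whose column of V best aligns with the parity vector.
      scores = np.abs(V.T @ parity) / np.linalg.norm(V, axis=0)
      print("parity norm:", np.linalg.norm(parity))
      print("most suspect sensor:", int(np.argmax(scores)))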

  7. Development INTERDATA 8/32 computer system

    NASA Technical Reports Server (NTRS)

    Sonett, C. P.

    1983-01-01

    The capabilities of the Interdata 8/32 minicomputer were examined regarding data and word processing, editing, retrieval, and budgeting as well as data management demands of the user groups in the network. Based on four projected needs: (1) a hands on (open shop) computer for data analysis with large core and disc capability; (2) the expected requirements of the NASA data networks; (3) the need for intermittent large core capacity for theoretical modeling; (4) the ability to access data rapidly either directly from tape or from core onto hard copy, the system proved useful and adequate for the planned requirements.

  8. Multiwell cell culture plate format with integrated microfluidic perfusion system

    NASA Astrophysics Data System (ADS)

    Domansky, Karel; Inman, Walker; Serdy, Jim; Griffith, Linda G.

    2006-01-01

    A new cell culture analog has been developed. It is based on the standard multiwell cell culture plate format, but it provides perfused three-dimensional cell culture capability. The new capability is achieved by integrating microfluidic valves and pumps into the plate. The system provides a means to conduct high-throughput assays for target validation and predictive toxicology in the drug discovery and development process. It can also be used for evaluation of long-term exposure to drugs or environmental agents, or as a model to study viral hepatitis, cancer metastasis, and other diseases and pathological conditions.

  9. Space processing applications payload equipment study. Volume 2C: Data acquisition and process control

    NASA Technical Reports Server (NTRS)

    Kayton, M.; Smith, A. G.

    1974-01-01

    The services provided by the Spacelab Information Management System are discussed. The majority of the services are provided by the common-support subsystems in the Support Module furnished by the Spacelab manufacturer. The information processing requirements for the space processing applications (SPA) are identified. The requirements and capabilities for electric power, display and control panels, recording and telemetry, intercom, and closed circuit television are analyzed.

  10. High-Efficiency Nested Hall Thrusters for Robotic Solar System Exploration

    NASA Technical Reports Server (NTRS)

    Hofer, Richard R.

    2013-01-01

    This work describes the scaling and design attributes of Nested Hall Thrusters (NHT) with extremely large operational envelopes, including a wide range of throttleability in power and specific impulse at high efficiency (>50%). NHTs have the potential to provide the game-changing performance, power-processing capabilities, and cost effectiveness required to enable missions that cannot otherwise be accomplished. NHTs were first identified in the electric propulsion community as a path to 100-kW class thrusters for human missions. This study aimed to identify the performance capabilities NHTs can provide for NASA robotic and human missions, with an emphasis on 10-kW class thrusters well-suited for robotic exploration. A key outcome of this work has been the identification of NHTs as nearly constant-efficiency devices over large power throttling ratios, especially in direct-drive power systems. NHT systems sized for robotic solar system exploration are predicted to be capable of high-efficiency operation over nearly their entire power throttling range. A traditional Annular Hall Thruster (AHT) consists of a single annular discharge chamber where the propellant is ionized and accelerated. In an NHT, multiple annular channels are concentrically stacked. The channels can be operated in unison or individually depending on the available power or required performance. When throttling an AHT, performance must be sacrificed since a single channel cannot satisfy the diverse design attributes needed to maintain high thrust efficiency. NHTs can satisfy these requirements by varying which channels are operated and thereby offer significant benefits in terms of thruster performance, especially under deep power throttling conditions where the efficiency of an AHT suffers since a single channel can only operate efficiently (>50%) over a narrow power throttling ratio (3:1). Designs for 10-kW class NHTs were developed and compared with AHT systems. Power processing systems were considered using either traditional Power Processing Units (PPU) or Direct Drive Units (DDU). In a PPU-based system, power from the solar arrays is transformed from the low voltage of the arrays to the high voltage needed by the thruster. In a DDU-based system, power from the solar arrays is fed to the thruster without conversion. DDU-based systems are attractive for their simplicity since they eliminate the most complex and expensive part of the propulsion system. The results point to the strong potential of NHTs operating with either PPUs or DDUs to benefit robotic and human missions through their unprecedented power and specific impulse throttling capabilities. NHTs coupled to traditional PPUs are predicted to offer high-efficiency (>50%) power throttling ratios 320% greater than present capabilities, while NHTs with direct-drive power systems (DDU) could exceed existing capabilities by 340%. Because the NHT-DDU approach is implicitly low-cost, NHT-DDU technology has the potential to radically reduce the cost of SEP-enabled NASA missions while simultaneously enabling unprecedented performance capability.

  11. Display system for imaging scientific telemetric information

    NASA Technical Reports Server (NTRS)

    Zabiyakin, G. I.; Rykovanov, S. N.

    1979-01-01

    A system for imaging scientific telemetric information, based on the M-6000 minicomputer and the SIGD graphic display, is described. The system provides two-dimensional graphic display of telemetric information and interaction with the computer for analysis and processing of the telemetric parameters displayed on the screen. The running parameter information output method is presented. User capabilities in the analysis and processing of telemetric information imaged on the display screen, and the user language, are discussed and illustrated.

  12. An Agile Systems Engineering Process: The Missing Link?

    DTIC Science & Technology

    2011-05-01

    has a number of standards available such as ISO 12207, ISO 9001, and the Capability Maturity Model Integrated (CMMI®) [24,25,26]. The CMMI was a...addressing activities throughout the product's lifecycle [24]. ISO 12207 “contains processes, activities and tasks that are to be applied during...the acquisition of a system that contains software” [26]. A limitation identified within ISO 12207 is that it does not specify details on how to

  13. A systems approach for data compression and latency reduction in cortically controlled brain machine interfaces.

    PubMed

    Oweiss, Karim G

    2006-07-01

    This paper suggests a new approach for data compression during extracutaneous transmission of neural signals recorded by a high-density microelectrode array in the cortex. The approach is based on exploiting the temporal and spatial characteristics of the neural recordings in order to strip the redundancy and infer the useful information early in the data stream. The proposed signal processing algorithms augment current filtering and amplification capability and may be a viable replacement for on-chip spike detection and sorting currently employed to remedy the bandwidth limitations. Temporal processing is devised by exploiting the sparseness capabilities of the discrete wavelet transform, while spatial processing exploits the reduction in the number of physical channels through quasi-periodic eigendecomposition of the data covariance matrix. Our results demonstrate that substantial improvements are obtained in terms of lower transmission bandwidth, reduced latency, and optimized processor utilization. We also demonstrate the improvements qualitatively in terms of superior denoising capabilities and higher fidelity of the obtained signals.
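
    A hedged sketch of the two ingredients described above, applied to synthetic multichannel data, is given below: an eigendecomposition of the channel covariance for spatial reduction, and a wavelet decomposition with coefficient thresholding for temporal redundancy. The wavelet family, decomposition levels, thresholds, and use of the PyWavelets (pywt) package are assumptions, not the paper's choices.

      # Spatial (covariance eigendecomposition) + temporal (wavelet thresholding) reduction.
      import numpy as np
      import pywt

      rng = np.random.default_rng(7)
      channels, samples = 16, 1024
      t = np.arange(samples) / samples
      latents = np.vstack([np.sin(2 * np.pi * 5 * t),          # a few shared, structured sources
                           np.sin(2 * np.pi * 12 * t),
                           np.sign(np.sin(2 * np.pi * 3 * t))])
      mixing = rng.standard_normal((channels, latents.shape[0]))
      data = mixing @ latents + 0.05 * rng.standard_normal((channels, samples))

      # Spatial step: eigendecomposition of the channel covariance; keep dominant eigen-channels.
      evals, evecs = np.linalg.eigh(np.cov(data))
      keep = evals > 0.01 * evals.max()
      reduced = evecs[:, keep].T @ data
      print(f"spatial reduction: {channels} -> {int(keep.sum())} virtual channels")

      # Temporal step: wavelet decomposition of each retained channel, dropping small coefficients.
      kept = total = 0
      for ch in reduced:
          flat = np.concatenate(pywt.wavedec(ch, "db4", level=5))
          kept += int(np.sum(np.abs(flat) > 0.05 * np.abs(flat).max()))
          total += flat.size
      print(f"wavelet coefficients retained: {kept}/{total}")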

  14. Bidding-based autonomous process planning and scheduling

    NASA Astrophysics Data System (ADS)

    Gu, Peihua; Balasubramanian, Sivaram; Norrie, Douglas H.

    1995-08-01

    Improving productivity through computer integrated manufacturing systems (CIMS) and concurrent engineering requires that the islands of automation in an enterprise be completely integrated. The first step in this direction is to integrate design, process planning, and scheduling. This can be achieved through a bidding-based process planning approach. The product is represented in a STEP model with detailed design and administrative information including design specifications, batch size, and due dates. Upon arrival at the manufacturing facility, the product is registered with the shop floor manager, which is essentially a coordinating agent. The shop floor manager broadcasts the product's requirements to the machines. The shop contains autonomous machines that have knowledge about their functionality, capabilities, tooling, and schedule. Each machine has its own process planner and responds to the product's request in its own way, consistent with its capabilities and capacities. When more than one machine offers certain process(es) for the same requirements, they enter into negotiation. Based on processing time, due date, and cost, one of the machines wins the contract. The successful machine updates its schedule and advises the product to request raw material for processing. The concept was implemented using a multi-agent system with the task decomposition and planning achieved through contract nets. Examples are included to illustrate the approach.
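
    The bidding flow described above can be sketched in a few lines. The following toy simulation is an assumption-laden illustration of a contract-net style negotiation (the machine attributes, announcement format, and bid-scoring rule are all made up for the example), not the multi-agent system used in the paper.

```python
# Toy contract-net style bidding: the shop floor manager broadcasts a task,
# machines that can perform the process reply with bids, the best bid wins.
# All machine data and the scoring rule are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Machine:
    name: str
    processes: set
    hourly_cost: float
    queue_hours: float          # current schedule backlog

    def bid(self, process, hours, due_in_hours):
        if process not in self.processes:
            return None                                  # cannot perform task
        finish = self.queue_hours + hours
        if finish > due_in_hours:
            return None                                  # would miss due date
        return {"machine": self, "finish": finish, "cost": hours * self.hourly_cost}

def award(task, machines):
    bids = [b for m in machines if (b := m.bid(**task))]
    if not bids:
        return None
    # simple scoring: prefer earlier finish, break ties on cost
    winner = min(bids, key=lambda b: (b["finish"], b["cost"]))
    winner["machine"].queue_hours += task["hours"]       # update its schedule
    return winner

shop = [
    Machine("mill-1",  {"milling", "drilling"}, 80.0, 4.0),
    Machine("mill-2",  {"milling"},             60.0, 1.0),
    Machine("lathe-1", {"turning"},             50.0, 0.0),
]
task = {"process": "milling", "hours": 2.5, "due_in_hours": 8.0}
print(award(task, shop))
```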

  15. Putting Integrated Systems Health Management Capabilities to Work: Development of an Advanced Caution and Warning System for Next-Generation Crewed Spacecraft Missions

    NASA Technical Reports Server (NTRS)

    Mccann, Robert S.; Spirkovska, Lilly; Smith, Irene

    2013-01-01

    Integrated System Health Management (ISHM) technologies have advanced to the point where they can provide significant automated assistance with real-time fault detection, diagnosis, guided troubleshooting, and failure consequence assessment. To exploit these capabilities in actual operational environments, however, ISHM information must be integrated into operational concepts and associated information displays in ways that enable human operators to process and understand the ISHM system information rapidly and effectively. In this paper, we explore these design issues in the context of an advanced caution and warning system (ACAWS) for next-generation crewed spacecraft missions. User interface concepts for depicting failure diagnoses, failure effects, redundancy loss, "what-if" failure analysis scenarios, and resolution of ambiguity groups are discussed and illustrated.

  16. Spacecraft optical disk recorder memory buffer control

    NASA Technical Reports Server (NTRS)

    Hodson, Robert F.

    1993-01-01

    This paper discusses the research completed under the NASA-ASEE summer faculty fellowship program. The project involves development of an Application Specific Integrated Circuit (ASIC) to be used as a Memory Buffer Controller (MBC) in the Spacecraft Optical Disk System (SODR). The SODR system has demanding capacity and data rate specifications requiring specialized electronics to meet processing demands. The system is being designed to support Gigabit transfer rates with Terabit storage capability. The complete SODR system is designed to exceed the capability of all existing mass storage systems today. The ASIC development for SODR consists of developing a 144-pin CMOS device to perform format conversion and data buffering. The final simulations of the MBC were completed during this summer's NASA-ASEE fellowship, along with design preparations for fabrication to be performed by an ASIC manufacturer.

  17. Performance analysis of a multispectral framing camera for detecting mines in the littoral zone and beach zone

    NASA Astrophysics Data System (ADS)

    Louchard, Eric; Farm, Brian; Acker, Andrew

    2008-04-01

    BAE Systems Sensor Systems Identification & Surveillance (IS) has developed, under contract with the Office of Naval Research, a multispectral airborne sensor system and processing algorithms capable of detecting mine-like objects in the surf zone and land mines in the beach zone. BAE Systems has used this system in a blind test at a test range established by the Naval Surface Warfare Center - Panama City Division (NSWC-PCD) at Eglin Air Force Base. The airborne and ground subsystems used in this test are described, with graphical illustrations of the detection algorithms. We report on the performance of the system configured to operate with a human operator analyzing data on a ground station. A subsurface mine detection capability (bottom proud mines in the surf zone and moored mines in shallow water) is demonstrated. Surface float detection and proud land mine detection capability are also demonstrated. Our analysis shows that this BAE Systems-developed multispectral airborne sensor provides a robust technical foundation for a viable system for mine countermeasures, and would be a valuable asset for use prior to an amphibious assault.

  18. Imaging characteristics of photogrammetric camera systems

    USGS Publications Warehouse

    Welch, R.; Halliday, J.

    1973-01-01

    In view of the current interest in high-altitude and space photographic systems for photogrammetric mapping, the United States Geological Survey (U.S.G.S.) undertook a comprehensive research project designed to explore the practical aspects of applying the latest image quality evaluation techniques to the analysis of such systems. The project had two direct objectives: (1) to evaluate the imaging characteristics of current U.S.G.S. photogrammetric camera systems; and (2) to develop methodologies for predicting the imaging capabilities of photogrammetric camera systems, comparing conventional systems with new or different types of systems, and analyzing the image quality of photographs. Image quality was judged in terms of a number of evaluation factors including response functions, resolving power, and the detectability and measurability of small detail. The limiting capabilities of the U.S.G.S. 6-inch and 12-inch focal length camera systems were established by analyzing laboratory and aerial photographs in terms of these evaluation factors. In the process, the contributing effects of relevant parameters such as lens aberrations, lens aperture, shutter function, image motion, film type, and target contrast were examined, yielding procedures for analyzing image quality and for predicting and comparing performance capabilities. © 1973.
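
    Response functions such as the modulation transfer function (MTF) mentioned above are commonly estimated from a measured line spread function; the sketch below shows that generic computation on a synthetic Gaussian line spread function with an arbitrary sampling interval, not the U.S.G.S. procedure.

```python
# Generic MTF estimate from a line spread function (LSF): the MTF is the
# magnitude of the Fourier transform of the LSF, normalized at zero frequency.
# The Gaussian LSF and the sampling interval below are illustrative assumptions.
import numpy as np

dx = 0.005                                # sampling interval, mm (assumed)
x = np.arange(-0.5, 0.5, dx)              # mm
sigma = 0.02                              # mm, assumed blur width
lsf = np.exp(-x**2 / (2 * sigma**2))

mtf = np.abs(np.fft.rfft(lsf))
mtf /= mtf[0]                             # normalize so MTF(0) = 1
freq = np.fft.rfftfreq(len(lsf), d=dx)    # cycles per mm

# frequency at which modulation drops to 10% (a common resolving-power proxy)
f10 = freq[np.argmax(mtf < 0.1)]
print(f"MTF falls below 0.1 near {f10:.0f} cycles/mm")
```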

  19. Robust algebraic image enhancement for intelligent control systems

    NASA Technical Reports Server (NTRS)

    Lerner, Bao-Ting; Morrelli, Michael

    1993-01-01

    Robust vision capability for intelligent control systems has been an elusive goal in image processing. The computationally intensive techniques necessary for conventional image processing make real-time applications, such as object tracking and collision avoidance, difficult. In order to endow an intelligent control system with the needed vision robustness, an adequate image enhancement subsystem, capable of compensating for the wide variety of real-world degradations, must exist between the image-capturing and object-recognition subsystems. This enhancement stage must be adaptive and must operate with consistency in the presence of both statistical and shape-based noise. To deal with this problem, we have developed an innovative algebraic approach which provides a sound mathematical framework for image representation and manipulation. Our image model provides a natural platform from which to pursue dynamic scene analysis, and its incorporation into a vision system would serve as the front-end to an intelligent control system. We have developed a unique polynomial representation of gray-level imagery and applied this representation to develop polynomial operators on complex gray-level scenes. This approach is highly advantageous since polynomials can be manipulated very easily, and are readily understood, thus providing a very convenient environment for image processing. Our model presents a highly structured and compact algebraic representation of gray-level images which can be viewed as fuzzy sets.
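
    As a rough illustration of representing gray-level imagery with polynomials (not the authors' algebraic model), the sketch below fits a low-order two-dimensional polynomial to an image patch by least squares; the patch, polynomial order, and fitting method are assumptions chosen only for the example.

```python
# Illustrative least-squares fit of a 2-D polynomial to a gray-level patch.
# This is a generic polynomial surface fit, not the paper's algebraic model.
import numpy as np

rng = np.random.default_rng(1)
patch = rng.integers(0, 256, size=(16, 16)).astype(float)  # stand-in patch

ys, xs = np.mgrid[0:patch.shape[0], 0:patch.shape[1]]
order = 3
# design matrix of monomials x^i * y^j with i + j <= order
terms = [(i, j) for i in range(order + 1) for j in range(order + 1) if i + j <= order]
A = np.column_stack([xs.ravel()**i * ys.ravel()**j for i, j in terms])
coeffs, *_ = np.linalg.lstsq(A, patch.ravel(), rcond=None)

approx = (A @ coeffs).reshape(patch.shape)
rmse = np.sqrt(np.mean((patch - approx) ** 2))
print(f"{len(terms)} polynomial coefficients, RMSE = {rmse:.1f} gray levels")
```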

  20. Application of a digital high-speed camera and image processing system for investigations of short-term hypersonic fluids

    NASA Astrophysics Data System (ADS)

    Renken, Hartmut; Oelze, Holger W.; Rath, Hans J.

    1998-04-01

    The design and application of a digital high-speed image data capturing system, followed by an image processing system, applied to the Bremer Hochschul Hyperschallkanal (BHHK) is the content of this presentation. It is the result of cooperation between the aerodynamics and image processing departments at the ZARM institute at the Drop Tower of Bremen. Similar systems are used by the combustion working group at ZARM and other external project partners. The BHHK, the camera and image storage system, as well as the personal-computer-based image processing software are described next. Some examples of images taken at the BHHK are shown to illustrate the application. The new and very user-friendly Windows 32-bit system is capable of capturing all camera data at a maximum pixel clock of 43 MHz and of processing complete sequences of images in one step using a single convenient program.

  1. Control of Bethlehem's coke-oven battery A at Sparrow Point

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michel, A.

    1984-02-01

    A new 6 m 80-oven compound-fired coke battery capable of producing in excess of 850,000 ton/year began production at Sparrow Point, Maryland, in 1982. The electrical, fuel distribution and control systems are described, together with the computer process control and monitoring systems.

  2. Automating Technical Processes and Reference Services Using SPIRES.

    ERIC Educational Resources Information Center

    Buckley, Joseph James

    1983-01-01

    Examines the capabilities, cost-effectiveness, and flexibility of the Stanford Public Information Retrieval System (SPIRES), an online information retrieval system producing a variety of printed products, and notes its use in the Title I Evaluation Clearinghouse, advantages of SPIRES, programing, and availability. Eleven references and a five-item…

  3. Managing the data explosion

    USGS Publications Warehouse

    Hooper, Richard P.; Aulenbach, Brent T.

    1993-01-01

    The 'data explosion' brought on by electronic sensors and automatic samplers can strain the capabilities of existing water-quality data-management systems just when they're needed most to process the information. The U.S. Geological Survey has responded to the problem by setting up an innovative system that allows rapid data analysis.

  4. Information Robots and Manipulators.

    ERIC Educational Resources Information Center

    Katys, G. P.; And Others

    In the modern conception, a robot is a complex automatic cybernetic system capable of executing various operations in the sphere of human activity, combining in various respects the imitation of human physical and mental activity. Robots are a class of automatic information systems intended for search, collection, processing, and…

  5. Preliminary Human Factors Guidelines for Automated Highway System Designers, Second Edition - Volume 1: Guidelines for AHS Designers

    DOT National Transportation Integrated Search

    1998-04-01

    Human factors can be defined as "designing to match the capabilities and limitations of the human user." The objectives of this human-centered design process are to maximize the effectiveness and efficiency of system performance, ensure a high level ...

  6. Introducing the Pressure-Sensing Palatograph--The Next Frontier in Electropalatography

    ERIC Educational Resources Information Center

    Murdoch, Bruce; Goozee, Justine; Veidt, Martin; Scott, Dion; Meyers, Ian

    2004-01-01

    Primary Objective. To extend the capabilities of current electropalatography (EPG) systems by developing a pressure-sensing EPG system. An initial trial of a prototype pressure-sensing palate will be presented. Research Design. The processes involved in designing the pressure sensors are outlined, with Hall effect transistors being selected. These…

  7. Using Computer Symbolic Algebra to Solve Differential Equations.

    ERIC Educational Resources Information Center

    Mathews, John H.

    1989-01-01

    This article illustrates that mathematical theory can be incorporated into the process to solve differential equations by a computer algebra system, muMATH. After an introduction to functions of muMATH, several short programs for enhancing the capabilities of the system are discussed. Listed are six references. (YP)
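
    muMATH itself is long obsolete, but the idea of letting a computer algebra system carry out the symbolic steps of solving a differential equation can be illustrated with any modern CAS; the sketch below uses SymPy as a stand-in, and the equation and initial condition are chosen only for the example.

```python
# Symbolic solution of a first-order linear ODE with a computer algebra system
# (SymPy stands in for muMATH here; the equation is just an example).
import sympy as sp

x = sp.symbols("x")
y = sp.Function("y")

# y' + 2*y = sin(x), with initial condition y(0) = 1
ode = sp.Eq(y(x).diff(x) + 2 * y(x), sp.sin(x))
solution = sp.dsolve(ode, y(x), ics={y(0): 1})
print(solution)   # y(x) = ... (closed-form expression)
```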

  8. Programmable DNA-Mediated Multitasking Processor.

    PubMed

    Shu, Jian-Jun; Wang, Qi-Wen; Yong, Kian-Yan; Shao, Fangwei; Lee, Kee Jin

    2015-04-30

    Because of DNA's appealing features as a material, including minuscule size, defined structural repeats, and rigidity, programmable DNA-mediated processing is a promising computing paradigm that employs DNA as an information-storing and -processing substrate to tackle computational problems. The massive parallelism of DNA hybridization exhibits transcendent potential to improve multitasking capabilities and yield a tremendous speed-up over conventional electronic processors with their stepwise signal cascades. As an example of multitasking capability, we present an in vitro programmable DNA-mediated optimal route planning processor as a functional unit embedded in contemporary navigation systems. The novel programmable DNA-mediated processor has several advantages over existing silicon-mediated methods, such as massive data storage and simultaneous processing using far less material than conventional silicon devices.
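
    The route-planning problem the DNA processor targets can be stated conventionally; the brute-force sketch below (a tiny made-up road network, exhaustive enumeration in Python) only illustrates the combinatorial search that massively parallel DNA hybridization is proposed to speed up, and is in no way the paper's method.

```python
# Brute-force optimal route planning on a toy road network, to illustrate the
# combinatorial problem; the graph and travel times are made up.
from itertools import permutations

edges = {("A", "B"): 4, ("A", "C"): 2, ("B", "C"): 1,
         ("B", "D"): 5, ("C", "D"): 8, ("B", "E"): 10, ("D", "E"): 2}

def cost(a, b):
    return edges.get((a, b)) or edges.get((b, a))

def best_route(start, goal, nodes):
    """Exhaustively try every ordering of intermediate nodes."""
    best = (float("inf"), None)
    others = [n for n in nodes if n not in (start, goal)]
    for r in range(len(others) + 1):
        for middle in permutations(others, r):
            path = (start, *middle, goal)
            legs = [cost(a, b) for a, b in zip(path, path[1:])]
            if None in legs:
                continue                      # some leg has no road
            total = sum(legs)
            if total < best[0]:
                best = (total, path)
    return best

print(best_route("A", "E", {"A", "B", "C", "D", "E"}))
```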

  9. Integrating SAR and derived products into operational volcano monitoring and decision support systems

    NASA Astrophysics Data System (ADS)

    Meyer, F. J.; McAlpin, D. B.; Gong, W.; Ajadi, O.; Arko, S.; Webley, P. W.; Dehn, J.

    2015-02-01

    Remote sensing plays a critical role in operational volcano monitoring due to the often remote locations of volcanic systems and the large spatial extent of potential eruption precursor signals. Despite the all-weather capabilities of radar remote sensing and its high performance in monitoring of change, the contribution of radar data to operational monitoring activities has been limited in the past. This is largely due to: (1) the high costs associated with radar data; (2) traditionally slow data processing and delivery procedures; and (3) the limited temporal sampling provided by spaceborne radars. With this paper, we present new data processing and data integration techniques that mitigate some of these limitations and allow for a meaningful integration of radar data into operational volcano monitoring decision support systems. Specifically, we present fast data access procedures as well as new approaches to multi-track processing that improve near real-time data access and temporal sampling of volcanic systems with SAR data. We introduce phase-based (coherent) and amplitude-based (incoherent) change detection procedures that are able to extract dense time series of hazard information from these data. For a demonstration, we present an integration of our processing system with an operational volcano monitoring system that was developed for use by the Alaska Volcano Observatory (AVO). Through an application to a historic eruption, we show that the integration of SAR into systems such as AVO can significantly improve the ability of operational systems to detect eruptive precursors. Therefore, the developed technology is expected to improve operational hazard detection, alerting, and management capabilities.
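
    Amplitude-based (incoherent) change detection of the kind mentioned above is commonly implemented as a log-ratio of co-registered SAR amplitude images followed by thresholding; the sketch below shows that generic operation on synthetic data and is not the processing system described in the paper (the speckle model and threshold are assumptions).

```python
# Generic incoherent (amplitude-based) SAR change detection: log-ratio of two
# co-registered amplitude images, then a simple threshold. Synthetic data only.
import numpy as np

rng = np.random.default_rng(2)
size = (256, 256)
pre = rng.gamma(shape=4.0, scale=25.0, size=size)            # speckled backscatter
post = pre * rng.gamma(shape=16.0, scale=1/16.0, size=size)  # mild re-speckling
post[100:140, 60:120] *= 3.0                                 # simulated new deposit

log_ratio = np.log(post) - np.log(pre)

# flag pixels whose log-ratio departs strongly from the image mean
k = 2.5                                                      # threshold, an assumption
mask = np.abs(log_ratio - log_ratio.mean()) > k * log_ratio.std()
print(f"flagged pixels: {mask.sum()} of {mask.size}")
```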

  10. The R-Shell approach - Using scheduling agents in complex distributed real-time systems

    NASA Technical Reports Server (NTRS)

    Natarajan, Swaminathan; Zhao, Wei; Goforth, Andre

    1993-01-01

    Large, complex real-time systems such as space and avionics systems are extremely demanding in their scheduling requirements. The current OS design approaches are quite limited in the capabilities they provide for task scheduling. Typically, they simply implement a particular uniprocessor scheduling strategy and do not provide any special support for network scheduling, overload handling, fault tolerance, distributed processing, etc. Our design of the R-Shell real-time environment facilitates the implementation of a variety of sophisticated but efficient scheduling strategies, including incorporation of all these capabilities. This is accomplished by the use of scheduling agents which reside in the application run-time environment and are responsible for coordinating the scheduling of the application.
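
    The scheduling-agent idea can be illustrated with a minimal earliest-deadline-first agent that sits beside the tasks it coordinates; this is a generic sketch with invented task parameters and crude overload handling, not R-Shell's design.

```python
# Minimal "scheduling agent" sketch: the agent owns the policy (here EDF,
# earliest deadline first) and hands the run-time the next task to dispatch.
# Generic illustration only; not the R-Shell environment.
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Task:
    deadline: float
    name: str = field(compare=False)
    runtime: float = field(compare=False)

class EDFAgent:
    def __init__(self):
        self._ready = []                       # min-heap ordered by deadline

    def submit(self, task):
        heapq.heappush(self._ready, task)

    def next_task(self, now):
        if not self._ready:
            return None
        task = heapq.heappop(self._ready)
        if now + task.runtime > task.deadline:
            print(f"overload: dropping {task.name}")   # crude overload handling
            return self.next_task(now)
        return task

agent = EDFAgent()
for t in [Task(10.0, "telemetry", 2.0), Task(4.0, "attitude", 1.5),
          Task(5.0, "thermal", 3.0)]:
    agent.submit(t)

clock = 0.0
while (t := agent.next_task(clock)) is not None:
    print(f"t={clock:.1f}: run {t.name}")
    clock += t.runtime
```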

  11. Online Multitasking Line-Scan Imaging Techniques for Simultaneous Safety and Quality Evaluation of Apples

    NASA Astrophysics Data System (ADS)

    Kim, Moon Sung; Lee, Kangjin; Chao, Kaunglin; Lefcourt, Alan; Cho, Byung-Kwan; Jun, Won

    We developed a push-broom, line-scan imaging system capable of simultaneous measurements of reflectance and fluorescence. The system allows multitasking inspections for quality and safety attributes of apples due to its dynamic capability of simultaneously capturing fluorescence and reflectance and its selectivity in multispectral bands. A multitasking image-based inspection system has been suggested for online applications, in which a single imaging device could address a multitude of both safety and quality inspection needs. The presented multitask inspection approach may provide an economically viable means for a number of food processing industries to meet their dynamic and specific online inspection and sorting needs.

  12. Artist Concept of Atlantis' new home

    NASA Image and Video Library

    2012-01-18

    CAPE CANAVERAL, Fla. – At NASA’s Kennedy Space Center in Florida, workers are constructing 40-foot-diameter dish antenna arrays for the Ka-Band Objects Observation and Monitoring, or Ka-BOOM system. The antennas will be part of the operations command center facility. The construction site is near the former Vertical Processing Facility, which has been demolished. The Ka-BOOM project is one of the final steps in developing the techniques to build a high power, high resolution radar system capable of becoming a Near Earth Object Early Warning System. While also capable of space communication and radio science experiments, developing radar applications is the primary focus of the arrays. Photo credit: NASA/ Ben Smegelsky

  13. Smartphone based monitoring system for long-term sleep assessment.

    PubMed

    Domingues, Alexandre

    2015-01-01

    The diagnosis of sleep disorders, highly prevalent in Western countries, typically involves sophisticated procedures and equipment that are highly intrusive to the patient. The high processing capabilities and storage capacity of current portable devices, together with a wide range of available sensors, many of them with wireless capabilities, create new opportunities and change the paradigms in sleep studies. In this work, a smartphone based sleep monitoring system is presented along with the details of the hardware, software and algorithm implementation. The aim of this system is to provide a way for subjects with no pre-diagnosed sleep disorders to monitor their sleep habits and to support initial screening of abnormal sleep patterns.

  14. The Venus Balloon Project telemetry processing

    NASA Technical Reports Server (NTRS)

    Urech, J. M.; Chamarro, A.; Morales, J. L.; Urech, M. A.

    1986-01-01

    The peculiarities of the Venus Balloon telemetry system required the development of a new methodology for the telemetry processing, since the capabilities of the Deep Space Network (DSN) telemetry system do not include burst processing of short frames with two different bit rates and first bit acquisition. A software package was produced for the non-real-time detection, demodulation, and decoding of the telemetry streams obtained from an open loop recording utilizing the DSN spectrum processing subsystem-radio science (DSP-RS). A general description of the resulting software package (DMO-5539-SP) and its adaptability to the real mission's variations is provided.

  15. Practical vision based degraded text recognition system

    NASA Astrophysics Data System (ADS)

    Mohammad, Khader; Agaian, Sos; Saleh, Hani

    2011-02-01

    Rapid growth and progress in the medical, industrial, security, and technology fields mean ever more consideration for the use of camera-based optical character recognition (OCR). Applying OCR to scanned documents is quite mature, and there are many commercial and research products available on this topic. These products achieve acceptable recognition accuracy and reasonable processing times, especially with trained software and constrained text characteristics. Even though the application space for OCR is huge, it is quite challenging to design a single system that is capable of performing automatic OCR for text embedded in an image irrespective of the application. Challenges for OCR systems include images taken under natural real-world conditions, surface curvature, text orientation, font, size, lighting conditions, and noise. These and many other conditions make it extremely difficult to achieve reasonable character recognition. Performance for conventional OCR systems drops dramatically as the degradation level of the text image quality increases. In this paper, a new recognition method is proposed to recognize solid or dotted-line degraded characters. The degraded text string is localized and segmented using a new algorithm. The new method was implemented and tested using a development framework system that is capable of performing OCR on camera-captured images. The framework allows parameter tuning of the image-processing algorithm based on a training set of camera-captured text images. Novel methods were used for enhancement, text localization, and segmentation, enabling a custom system that is capable of performing automatic OCR and can be used for different applications. The developed framework includes new image enhancement, filtering, and segmentation techniques that enabled higher recognition accuracies, faster processing times, and lower energy consumption compared with the best published state-of-the-art techniques. The system successfully produced impressive OCR accuracies (90% to 93%) using customized systems generated by our development framework in two industrial OCR applications: water bottle label text recognition and concrete slab plate text recognition. The system was also trained for the Arabic alphabet and demonstrated very high recognition accuracy (99%) for Arabic license name-plate text recognition, with processing times of 10 seconds. The accuracy and run times of the system were compared to conventional and state-of-the-art methods; the proposed system shows excellent results.
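
    A generic version of the enhancement, localization, and segmentation pipeline described above can be sketched with OpenCV; the steps, parameters, and functions below are common text-segmentation choices assumed for illustration, not the authors' proprietary framework, and the input file name is hypothetical.

```python
# Generic camera-OCR preprocessing sketch: enhance, binarize, and segment
# candidate character regions. Not the paper's framework; parameters are
# arbitrary illustrative choices.
import cv2
import numpy as np

image = cv2.imread("label.jpg")                      # hypothetical input image
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# enhancement: mild denoising plus local contrast normalization
gray = cv2.GaussianBlur(gray, (3, 3), 0)
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
gray = clahe.apply(gray)

# binarization robust to uneven lighting (block size 31, offset 15)
binary = cv2.adaptiveThreshold(gray, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                               cv2.THRESH_BINARY_INV, 31, 15)

# segmentation: connected regions of plausible character size
contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                               cv2.CHAIN_APPROX_SIMPLE)
boxes = [cv2.boundingRect(c) for c in contours]
chars = [(x, y, w, h) for x, y, w, h in boxes if 8 < h < 120 and w < 2 * h]

# left-to-right ordering before handing the crops to a recognizer
chars.sort(key=lambda b: b[0])
print(f"{len(chars)} candidate character regions")
```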

  16. CNPq/INPE-LANDSAT system report of activities

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J. (Principal Investigator); Barbosa, M. N.

    1982-01-01

    The status of the Brazilian LANDSAT facilities and the results achieved are presented. In addition, a LANDSAT product sales/distribution analysis is provided. Data recording and processing capabilities and planned products are addressed.

  17. Solar thermal technology evaluation, fiscal year 1982. Volume 2: Technical

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The technology base of solar thermal energy is investigated. The materials, components, subsystems, and processes capable of meeting specific energy cost targets are emphasized, as are system efficiency and reliability.

  18. Heterogeneous concurrent computing with exportable services

    NASA Technical Reports Server (NTRS)

    Sunderam, Vaidy

    1995-01-01

    Heterogeneous concurrent computing, based on the traditional process-oriented model, is approaching its functionality and performance limits. An alternative paradigm, based on the concept of services, supporting data driven computation, and built on a lightweight process infrastructure, is proposed to enhance the functional capabilities and the operational efficiency of heterogeneous network-based concurrent computing. TPVM is an experimental prototype system supporting exportable services, thread-based computation, and remote memory operations that is built as an extension of and an enhancement to the PVM concurrent computing system. TPVM offers a significantly different computing paradigm for network-based computing, while maintaining a close resemblance to the conventional PVM model in the interest of compatibility and ease of transition. Preliminary experience has demonstrated that the TPVM framework presents a natural yet powerful concurrent programming interface, while being capable of delivering performance improvements of up to thirty percent.

  19. Image based performance analysis of thermal imagers

    NASA Astrophysics Data System (ADS)

    Wegner, D.; Repasi, E.

    2016-05-01

    Due to advances in technology, modern thermal imagers resemble sophisticated image processing systems in functionality. Advanced signal and image processing tools enclosed in the camera body extend the basic image capturing capability of thermal cameras. This is done to enhance the display presentation of the captured scene or specific scene details. Usually, the implemented methods are proprietary company expertise, distributed without extensive documentation. This makes the comparison of thermal imagers, especially from different companies, a difficult task (or at least a very time-consuming and expensive one, e.g. requiring a field trial and/or an observer trial). For example, a thermal camera equipped with turbulence mitigation capability stands for such a closed system. The Fraunhofer IOSB has started to build up a system for testing thermal imagers by image-based methods in the lab environment. This will extend our capability of measuring the classical IR-system parameters (e.g. MTF, MTDP, etc.) in the lab. The system is set up around the IR scene projector, which is necessary for the thermal display (projection) of an image sequence for the IR camera under test. The same set of thermal test sequences might be presented to every unit under test. For turbulence mitigation tests, this could be, e.g., the same turbulence sequence. During system tests, gradual variation of input parameters (e.g. thermal contrast) can be applied. First ideas on test scene selection and on how to assemble an imaging suite (a set of image sequences) for the analysis of thermal imaging systems containing such black boxes in the image-forming path are discussed.

  20. The Informatics Challenges Facing Biobanks: A Perspective from a United Kingdom Biobanking Network

    PubMed Central

    Groves, Martin; Jordan, Lee B.; Stobart, Hilary; Purdie, Colin A.; Thompson, Alastair M

    2015-01-01

    The challenges facing biobanks are changing as biobanks evolve from simple collections of materials to sources of quality-assured, fit-for-purpose, clinically annotated samples. As a result, the informatics awareness and capabilities of a biobank are now intrinsically related to quality. A biobank may be considered a data repository, in the form of raw data (the unprocessed samples), data surrounding the samples (processing and storage conditions), supplementary data (such as clinical annotations), and an increasing ethical requirement for biobanks to have a mechanism for researchers to return their data. The informatics capabilities of a biobank are no longer simply knowing sample locations; instead the capabilities will become a distinguishing factor in the ability of a biobank to provide appropriate samples. There is an increasing requirement for biobanking systems (whether in-house or commercially sourced) to ensure the informatics systems stay apace with the changes being experienced by the biobanking community. In turn, there is a requirement for biobanks to have a clear informatics policy and directive that is embedded into the wider decision-making process. As an example, the Breast Cancer Campaign Tissue Bank was a collaboration between four individual and diverse biobanks in the UK, and an informatics platform has been developed to address the challenges of running a distributed network. From developing such a system there are key observations about what can or cannot be achieved by informatics in isolation. This article will highlight some of the lessons learned during this development process. PMID:26418270
