Sample records for test procedures software

  1. Improving Building Energy Simulation Programs Through Diagnostic Testing (Fact Sheet)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2012-02-01

    A new test procedure evaluates the quality and accuracy of energy analysis tools for the residential building retrofit market. Reducing the energy use of existing homes in the United States offers significant energy-saving opportunities, which can be identified through building simulation software tools that calculate optimal packages of efficiency measures. To improve the accuracy of energy analysis for residential buildings, the National Renewable Energy Laboratory's (NREL) Buildings Research team developed the Building Energy Simulation Test for Existing Homes (BESTEST-EX), a method for diagnosing and correcting errors in building energy audit software and calibration procedures. BESTEST-EX consists of building physics and utility bill calibration test cases, which software developers can use to compare their tools' simulation results to reference results generated with state-of-the-art simulation tools. Overall, the BESTEST-EX methodology: (1) tests software predictions of retrofit energy savings in existing homes; (2) ensures building physics calculations and utility bill calibration procedures perform to a minimum standard; and (3) quantifies the impacts of uncertainties in input audit data and occupant behavior. BESTEST-EX is helping software developers identify and correct bugs in their software, as well as develop and test utility bill calibration procedures.
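    The comparison step at the heart of BESTEST-EX, checking a tool's predicted savings against the band spanned by reference results, can be sketched roughly as follows (a hypothetical Python illustration; the case names and savings figures are invented, not actual BESTEST-EX reference values):

```python
# Hypothetical sketch of a BESTEST-EX-style check: a tool's predicted
# retrofit savings should fall within the band spanned by the reference
# simulation results.  All names and numbers below are invented.

REFERENCE_SAVINGS = {                    # kWh/yr from reference tools
    "attic_insulation": [1480.0, 1520.0, 1610.0],
    "air_sealing":      [620.0, 700.0, 655.0],
}

def within_reference_range(case, predicted):
    """True if the prediction lies inside the reference min..max band."""
    refs = REFERENCE_SAVINGS[case]
    return min(refs) <= predicted <= max(refs)
```

    A real test suite would of course use the published reference results and acceptance ranges rather than invented figures.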

  2. IMCS reflight certification requirements and design specifications

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The requirements for reflight certification are established. Software requirements encompass the software programs that are resident in the PCC, DEP, PDSS, EC, or any related GSE. A design approach for the reflight software packages is recommended. These designs will be of sufficient detail to permit the implementation of reflight software. The PDSS/IMC Reflight Certification system provides the tools and mechanisms for the user to perform the reflight certification test procedures, test data capture, test data display, and test data analysis. The system as defined will be structured to permit maximum automation of reflight certification procedures and test data analysis.

  3. Agile deployment and code coverage testing metrics of the boot software on-board Solar Orbiter's Energetic Particle Detector

    NASA Astrophysics Data System (ADS)

    Parra, Pablo; da Silva, Antonio; Polo, Óscar R.; Sánchez, Sebastián

    2018-02-01

    In this day and age, successful embedded critical software needs agile and continuous development and testing procedures. This paper presents the overall testing and code coverage metrics obtained during the unit testing procedure carried out to verify the correctness of the boot software that will run in the Instrument Control Unit (ICU) of the Energetic Particle Detector (EPD) on-board Solar Orbiter. The ICU boot software is a critical part of the project, so its verification must be addressed at an early development stage; any test case missed in this process may affect the quality of the overall on-board software. According to the European Cooperation for Space Standardization (ECSS) standards, testing this kind of critical software must cover 100% of the source code statements and decision paths. This leads to the complete testing of the fault tolerance and recovery mechanisms that have to resolve every possible memory corruption or communication error brought about by the space environment. The introduced procedure enables fault injection from the beginning of the development process and makes it possible to fulfill the demanding code coverage requirements on the boot software.
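    The fault-injection idea, forcing execution down every error-handling branch so that statement and decision coverage can reach 100%, can be illustrated with a small sketch (hypothetical Python stand-ins, not the actual ICU boot code):

```python
# Stand-ins for the ICU boot code: a loader that must tolerate a corrupted
# word by substituting a fallback value (e.g. from redundant memory).
def load_image(read_word, n_words):
    words = []
    for addr in range(n_words):
        try:
            words.append(read_word(addr))
        except IOError:              # injected fault drives this branch
            words.append(0)          # fallback value for the corrupted word
    return words

def good_read(addr):                 # nominal memory: covers the try branch
    return addr * 2

def faulty_read(addr):               # injected fault: covers the except branch
    if addr == 1:
        raise IOError("simulated bit flip")
    return addr * 2
```

    Running the loader once with `good_read` and once with `faulty_read` exercises both decision paths, which is the effect fault injection achieves for the recovery code described above.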

  4. ENVIRONMENTAL METHODS TESTING SITE PROJECT: DATA MANAGEMENT PROCEDURES PLAN

    EPA Science Inventory

    The Environmental Methods Testing Site (EMTS) Data Management Procedures Plan identifies the computer hardware and software resources used in the EMTS project. It identifies the major software packages that are available for use by principal investigators for the analysis of data...

  5. PDSS/IMC qualification test software acceptance procedures

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Tests to be performed for qualifying the payload development support system image motion compensator (IMC) are identified. The performance of these tests will verify the IMC interfaces and thereby verify the qualification test software.

  6. SPSS and SAS programs for determining the number of components using parallel analysis and Velicer's MAP test.

    PubMed

    O'Connor, B P

    2000-08-01

    Popular statistical software packages do not have the proper procedures for determining the number of components in factor and principal components analyses. Parallel analysis and Velicer's minimum average partial (MAP) test are validated procedures, recommended widely by statisticians. However, many researchers continue to use alternative, simpler, but flawed procedures, such as the eigenvalues-greater-than-one rule. Use of the proper procedures might be increased if these procedures could be conducted within familiar software environments. This paper describes brief and efficient programs for using SPSS and SAS to conduct parallel analyses and the MAP test.
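    For readers outside SPSS and SAS, the core of Horn's parallel analysis can be sketched in a few lines (an assumed simple form using mean random-data eigenvalues; the programs described in the paper offer more options, such as percentile thresholds):

```python
import numpy as np

def parallel_analysis(data, n_iter=100, seed=0):
    """Retain components while the observed eigenvalue exceeds the mean
    eigenvalue of correlation matrices from random data of the same shape."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
    rand = np.zeros(p)
    for _ in range(n_iter):
        r = rng.standard_normal((n, p))
        rand += np.linalg.eigvalsh(np.corrcoef(r, rowvar=False))[::-1]
    rand /= n_iter
    keep = 0
    while keep < p and obs[keep] > rand[keep]:
        keep += 1
    return keep
```

    Unlike the eigenvalues-greater-than-one rule, the threshold here comes from random data of the same dimensions, which is what makes the procedure robust to sample size and number of variables.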

  7. SEPAC software configuration control plan and procedures, revision 1

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The SEPAC Software Configuration Control Plan and Procedures are presented. The objective of the software configuration control is to establish the process for maintaining configuration control of the SEPAC software, beginning with the baselining of SEPAC Flight Software Version 1 and encompassing the integration and verification tests through Spacelab Level IV Integration. The procedures are designed to provide a simplified but complete configuration control process. The intent is to require a minimum amount of paperwork while providing total traceability of SEPAC software.

  8. Product assurance policies and procedures for flight dynamics software development

    NASA Technical Reports Server (NTRS)

    Perry, Sandra; Jordan, Leon; Decker, William; Page, Gerald; Mcgarry, Frank E.; Valett, Jon

    1987-01-01

    The product assurance policies and procedures necessary to support flight dynamics software development projects for Goddard Space Flight Center are presented. The quality assurance and configuration management methods and tools for each phase of the software development life cycles are described, from requirements analysis through acceptance testing; maintenance and operation are not addressed.

  9. Acceptance test procedure bldg. 271-U remote monitoring of project W-059 B-Plant canyon exhaust system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MCDANIEL, K.S.

    1999-09-01

    The test procedure provides for verifying indications and alarms associated with the B Plant Canyon Ventilation System as they are displayed on a remote monitoring workstation located in building 271-U. The system application software was installed by PLCS Plus under contract from B&W Hanford Company. The application software was installed on an existing operator workstation in building 271-U, which is owned and operated by Bechtel Hanford Inc.

  10. Man-rated flight software for the F-8 DFBW program

    NASA Technical Reports Server (NTRS)

    Bairnsfather, R. R.

    1976-01-01

    The design, implementation, and verification of the flight control software used in the F-8 DFBW program are discussed. Since the DFBW utilizes an Apollo computer and hardware, the procedures, controls, and basic management techniques employed are based on those developed for the Apollo software system. Program assembly control, simulator configuration control, erasable-memory load generation, change procedures and anomaly reporting are discussed. The primary verification tools are described, as well as the program test plans and their implementation on the various simulators. Failure effects analysis and the creation of special failure generating software for testing purposes are described.

  11. Guidelines for testing and release procedures

    NASA Technical Reports Server (NTRS)

    Molari, R.; Conway, M.

    1984-01-01

    Guidelines and procedures are recommended for the testing and release of the types of computer software efforts commonly performed at NASA/Ames Research Center. All recommendations are based on the premise that testing and release activities must be specifically selected for the environment, size, and purpose of each individual software project. Guidelines are presented for building a Test Plan and using formal Test Plan and Test Case Inspections on it. Frequent references are made to the NASA/Ames Guidelines for Software Inspections. Guidelines are presented for selecting an Overall Test Approach and for each of the four main phases of testing: (1) Unit Testing of Components, (2) Integration Testing of Components, (3) System Integration Testing, and (4) Acceptance Testing. Tools used for testing are listed, including those available from operating systems used at Ames, specialized tools which can be developed, unit test drivers, stub module generators, and the use of formal test reporting schemes.

  12. The Design of Software for Three-Phase Induction Motor Test System

    NASA Astrophysics Data System (ADS)

    Haixiang, Xu; Fengqi, Wu; Jiai, Xue

    2017-11-01

    The design and development of control system software is important for three-phase induction motor test equipment, and requires thorough familiarity with the test process and the control procedure of the test equipment. In this paper, the software is developed in the VB language according to the national standard (GB/T 1032-2005) for three-phase induction motor test methods. The control system, the data analysis software, and the implementation of the motor test system are described individually; the system has the advantages of high automation and high accuracy.

  13. Advanced communications technology satellite high burst rate link evaluation terminal communication protocol software user's guide, version 1.0

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.

    1993-01-01

    The Communication Protocol Software was developed at the NASA Lewis Research Center to support the Advanced Communications Technology Satellite High Burst Rate Link Evaluation Terminal (ACTS HBR-LET). The HBR-LET is an experimenter's terminal for communicating with the ACTS for various experiments by government, university, and industry agencies. The Communication Protocol Software is one segment of the Control and Performance Monitor (C&PM) Software system of the HBR-LET. The Communication Protocol Software allows users to control and configure the Intermediate Frequency Switch Matrix (IFSM) on board the ACTS to yield a desired path through the spacecraft payload. Besides IFSM control, the C&PM Software System is also responsible for instrument control during HBR-LET experiments, uplink power control of the HBR-LET to demonstrate power augmentation during signal fade events, and data display. The Communication Protocol Software User's Guide, Version 1.0 (NASA CR-189162) outlines the commands and procedures to install and operate the Communication Protocol Software. Configuration files used to control the IFSM, operator commands, and error recovery procedures are discussed. The Communication Protocol Software Maintenance Manual, Version 1.0 (NASA CR-189163, to be published) is a programmer's guide to the Communication Protocol Software. This manual details the current implementation of the software from a technical perspective. Included is an overview of the Communication Protocol Software, computer algorithms, format representations, and computer hardware configuration. The Communication Protocol Software Test Plan (NASA CR-189164, to be published) provides a step-by-step procedure to verify the operation of the software. Included in the Test Plan are command transmission, telemetry reception, error detection, and error recovery procedures.

  14. Autonomous Real Time Requirements Tracing

    NASA Technical Reports Server (NTRS)

    Plattsmier, George I.; Stetson, Howard K.

    2014-01-01

    One of the more challenging aspects of software development is verifying and validating the functional software requirements dictated by the Software Requirements Specification (SRS) and the Software Detail Design (SDD). Ensuring the software has achieved the intended requirements is the responsibility of the Software Quality team and the Software Test team. The utilization of Timeliner-TLX™ Auto-Procedures for relocating ground operations positions to ISS automated on-board operations has begun the transition that would be required for manned deep space missions with minimal crew requirements. This transition also moves the auto-procedures from the procedure realm into the flight software arena, and as such the operational requirements and testing will be more structured and rigorous. The auto-procedures would be required to meet NASA software standards as specified in the Software Safety Standard (NASA-STD-8719), the Software Engineering Requirements (NPR 7150), the Software Assurance Standard (NASA-STD-8739), and the Human Rating Requirements (NPR 8705). The Autonomous Fluid Transfer System (AFTS) test-bed utilizes the Timeliner-TLX™ language for development of autonomous command and control software. The Timeliner-TLX™ system has the unique feature of reporting the current line of the statement in execution during real-time execution of the software. This internal line-number reporting unlocks the capability of monitoring the execution autonomously with a companion Timeliner-TLX™ sequence, as the line-number reporting is embedded inside the Timeliner-TLX™ execution engine. This avoids I/O processing of this type of data, as the line-number status of executing sequences is built in as a function reference.
    This paper outlines the design and capabilities of the AFTS Autonomous Requirements Tracker, which traces and logs SRS requirements as they are met during real-time execution of the targeted system. It is envisioned that real-time requirements tracing will greatly assist the movement of auto-procedures to flight software, enhancing the software assurance of auto-procedures and also their acceptance as reliable commanders.
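    The line-number-reporting idea can be illustrated outside Timeliner-TLX with Python's standard trace hook, which likewise reports the executing line so a companion tracer can log requirements as their lines are reached (the function and SRS IDs below are invented for illustration):

```python
import sys

def make_tracer(line_to_req, log):
    """Trace hook: log a requirement ID the first time its mapped
    source line executes."""
    def tracer(frame, event, arg):
        if event == "line":
            req = line_to_req.get(frame.f_lineno)
            if req is not None and req not in log:
                log.append(req)
        return tracer
    return tracer

def transfer(amount):                # invented stand-in for an auto-procedure
    valve_open = True                # line mapped to hypothetical SRS-001
    return amount * 0.5              # line mapped to hypothetical SRS-002

# Map the procedure's source lines to requirement IDs.
base = transfer.__code__.co_firstlineno
line_to_req = {base + 1: "SRS-001", base + 2: "SRS-002"}

trace_log = []
sys.settrace(make_tracer(line_to_req, trace_log))
transfer(10)
sys.settrace(None)
# trace_log now records each requirement as its line executed
```

    As in the Timeliner-TLX engine, the line number comes from the execution machinery itself, so no extra I/O from the procedure under test is needed.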

  16. Acquisition Handbook - Update. Comprehensive Approach to Reusable Defensive Software (CARDS)

    DTIC Science & Technology

    1994-03-25

    designs, and implementation components (source code, test plans, procedures and results, and system/software documentation). This handbook provides a...activities where software components are acquired, evaluated, tested and sometimes modified. In addition to serving as a facility for the acquisition and...systems from such components [1]. Implementation components are at the lowest level and consist of: specifications; detailed designs; code, test

  17. Man-rated flight software for the F-8 DFBW program

    NASA Technical Reports Server (NTRS)

    Bairnsfather, R. R.

    1975-01-01

    The design, implementation, and verification of the flight control software used in the F-8 DFBW program are discussed. Since the DFBW utilizes an Apollo computer and hardware, the procedures, controls, and basic management techniques employed are based on those developed for the Apollo software system. Program Assembly Control, simulator configuration control, erasable-memory load generation, change procedures and anomaly reporting are discussed. The primary verification tools--the all-digital simulator, the hybrid simulator, and the Iron Bird simulator--are described, as well as the program test plans and their implementation on the various simulators. Failure-effects analysis and the creation of special failure-generating software for testing purposes are described. The quality of the end product is evidenced by the F-8 DFBW flight test program in which 42 flights, totaling 58 hours of flight time, were successfully made without any DFCS inflight software, or hardware, failures.

  18. Software Reliability, Measurement, and Testing. Volume 2. Guidebook for Software Reliability Measurement and Testing

    DTIC Science & Technology

    1992-04-01

    contractor’s existing data collection, analysis and corrective action system shall be utilized, with modification only as necessary to meet the...either from test or from analysis of field data . The procedures of MIL-STD-756B assume that the reliability of a 18 DEFINE IDENTIFY SOFTWARE LIFE CYCLE...to generate sufficient data to report a statistically valid reliability figure for a class of software. Casual data gathering accumulates data more

  19. System Testing of Ground Cooling System Components

    NASA Technical Reports Server (NTRS)

    Ensey, Tyler Steven

    2014-01-01

    This internship focused primarily upon software unit testing of Ground Cooling System (GCS) components, one of the three types of tests (unit, integrated, and COTS/regression) utilized in software verification. Unit tests are used to test the software of necessary components before it is implemented into the hardware. A unit test exercises the control data, usage procedures, and operating procedures of a particular component to determine whether the program is fit for use. Three different files are used to build and complete an efficient unit test: the Model Test file (.mdl), the Simulink SystemTest (.test), and the autotest (.m). The Model Test file includes the component being tested with the appropriate Discrete Physical Interface (DPI) for testing. The Simulink SystemTest is a program used to test all of the requirements of the component. The autotest verifies that the component passes Model Advisor and System Testing, and puts the results into the proper files. Once unit testing is completed on the GCS components, they can be implemented into the GCS schematic, and the software of the GCS model as a whole can be tested using integrated testing. Unit testing is a critical part of software verification; it allows for the testing of more basic components before a model of higher fidelity is tested, making the process of testing flow in an orderly manner.
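    The pass/fail character of a component unit test can be shown with a minimal sketch (a hypothetical Python component, not the actual Simulink-based GCS files described above):

```python
# Hypothetical component model: a valve command that must clamp its
# position to the 0..100 percent range expected by downstream hardware.
def clamp_valve_position(cmd):
    return max(0.0, min(100.0, cmd))

def test_clamp_valve_position():
    assert clamp_valve_position(50.0) == 50.0    # nominal command passes through
    assert clamp_valve_position(-10.0) == 0.0    # under-range clamps low
    assert clamp_valve_position(250.0) == 100.0  # over-range clamps high
```

    Each component gets such a test before it is wired into the larger model, mirroring the unit-then-integrated flow described above.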

  20. Artificial intelligence and expert systems in-flight software testing

    NASA Technical Reports Server (NTRS)

    Demasie, M. P.; Muratore, J. F.

    1991-01-01

    The authors discuss the introduction of advanced information systems technologies such as artificial intelligence, expert systems, and advanced human-computer interfaces directly into Space Shuttle software engineering. The reconfiguration automation project (RAP) was initiated to coordinate this move towards 1990s software technology. The idea behind RAP is to automate several phases of the flight software testing procedure and to introduce AI and ES into space shuttle flight software testing. In the first phase of RAP, conventional tools to automate regression testing have already been developed or acquired. There are currently three tools in use.

  1. Guidelines for software inspections

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Quality control inspections are software problem-finding procedures that provide defect removal as well as improvements in software functionality, maintenance, quality, and development and testing methodology. The many side benefits include education, documentation, training, and scheduling.

  2. Application of industry-standard guidelines for the validation of avionics software

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J.; Shagnea, Anita M.

    1990-01-01

    The application of industry standards to the development of avionics software is discussed, focusing on verification and validation activities. It is pointed out that the procedures that guide the avionics software development and testing process are under increased scrutiny. The DO-178A guidelines, Software Considerations in Airborne Systems and Equipment Certification, are used by the FAA for certifying avionics software. To investigate the effectiveness of the DO-178A guidelines for improving the quality of avionics software, guidance and control software (GCS) is being developed according to the DO-178A development method. It is noted that, due to the extent of the data collection and configuration management procedures, any phase in the life cycle of a GCS implementation can be reconstructed. Hence, a fundamental development and testing platform has been established that is suitable for investigating the adequacy of various software development processes. In particular, the overall effectiveness and efficiency of the development method recommended by the DO-178A guidelines are being closely examined.

  3. Proceedings of the Joint Logistics Commanders Joint Policy Coordinating Group on Computer Resource Management; Computer Software Management Software Workshop, 2-5 April 1979.

    DTIC Science & Technology

    1979-08-21

    Appendix s - Outline and Draft Material for Proposed Triservice Interim Guideline on Application of Software Acceptance Criteria....... 269 Appendix 9...AND DRAFT MATERIAL FOR PROPOSED TRISERVICE INTERIM GUIDELINE ON APPLICATION OF SOFTWARE ACCEPTANCE CRITERIA I I INTRODUCTION The purpose of this guide...contract item (CPCI) (code) 5. CPCI test plan 6. CPCI test procedures 7. CPCI test report 8. Handbooks and manuals. Al though additional material does

  4. AZ-101 Mixer Pump Test Qualification Test Procedures (QTP)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    THOMAS, W.K.

    2000-01-10

    Describes the qualification test procedure for the AZ-101 Mixer Pump Data Acquisition System (DAS). The purpose of this Qualification Test Procedure (QTP) is to confirm that the AZ-101 Mixer Pump System has been programmed properly and that the hardware is configured correctly. This QTP will test the software setpoints for the alarms and also check the wiring configuration from the SIMcart to the HMI. An Acceptance Test Procedure (ATP), similar to this QTP, will be performed to test field devices and connections from the field.

  5. Ground Software Maintenance Facility (GSMF) user's manual

    NASA Technical Reports Server (NTRS)

    Aquila, V.; Derrig, D.; Griffith, G.

    1986-01-01

    Instructions are provided for the Ground Software Maintenance Facility (GSMF) system user to operate the GSMF in all modes. The GSMF provides the resources for Automatic Test Equipment (ATE) computer program maintenance (GCOS and GOAL). Applicable reference documents are listed. An operational overview and descriptions of the modes in terms of operator interface, options, equipment, material utilization, and operational procedures are included. Test restart procedures are described. The GSMF documentation tree is presented, including the user manual.

  6. Firing Room Remote Application Software Development

    NASA Technical Reports Server (NTRS)

    Liu, Kan

    2015-01-01

    The Engineering and Technology Directorate (NE) at the National Aeronautics and Space Administration (NASA) Kennedy Space Center (KSC) is designing a new command and control system for the checkout and launch of the Space Launch System (SLS) and future rockets. The purposes of the semester-long internship as a remote application software developer included the design, development, integration, and verification of the software and hardware in the firing rooms, in particular for the Mobile Launcher (ML) Launch Accessories (LACC) subsystem. In addition, a software test verification procedure document was created to verify and check out LACC software for Launch Equipment Test Facility (LETF) testing.

  7. Automation of electromagnetic compatibility (EMC) test facilities

    NASA Technical Reports Server (NTRS)

    Harrison, C. A.

    1986-01-01

    Efforts to automate electromagnetic compatibility (EMC) test facilities at Marshall Space Flight Center are discussed. The present facility is used to accomplish a battery of nine standard tests (with limited variations) designed to certify EMC of Shuttle payload equipment. Prior to this project, some EMC tests were partially automated, but others were performed manually. Software was developed to integrate all testing by means of a desk-top computer-controller. Near real-time data reduction and onboard graphics capabilities permit immediate assessment of test results. Provisions for disk storage of test data permit computer production of the test engineer's certification report. Software flexibility permits variation in the test procedures, the ability to examine more closely those frequency bands which indicate compatibility problems, and the capability to incorporate additional test procedures.
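    The band-by-band flagging described above can be sketched as follows (the instrument driver, limit line, and readings are all invented; a real facility would drive the receiver and read levels over its instrument bus):

```python
# Invented limit line (dBuV) and instrument stand-in for illustration.
LIMIT_DBUV = 40.0

def scan(bands_mhz, measure):
    """Measure each band; return all readings plus the bands over the limit."""
    results = {band: measure(band) for band in bands_mhz}
    flagged = [(b, lvl) for b, lvl in results.items() if lvl > LIMIT_DBUV]
    return results, flagged

def fake_measure(band_mhz):          # stand-in for the receiver driver
    return 35.0 if band_mhz < 100 else 47.5
```

    The flagged bands are the ones the test engineer would re-scan at finer resolution, as the abstract notes.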

  8. User systems guidelines for software projects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abrahamson, L.

    1986-04-01

    This manual presents guidelines for software standards which were developed so that software project-development teams and management involved in approving the software could have a generalized view of all phases in the software production procedure and the steps involved in completing each phase. Guidelines are presented for six phases of software development: project definition, building a user interface, designing software, writing code, testing code, and preparing software documentation. The discussions for each phase include examples illustrating the recommended guidelines. 45 refs. (DWL)

  9. Improving Flight Software Module Validation Efforts : a Modular, Extendable Testbed Software Framework

    NASA Technical Reports Server (NTRS)

    Lange, R. Connor

    2012-01-01

    Ever since Explorer-1, the United States' first Earth satellite, was developed and launched in 1958, JPL has developed many more spacecraft, including landers and orbiters. While these spacecraft vary greatly in their missions, capabilities, and destinations, they all have something in common: all of their components had to be comprehensively tested. While thorough testing is important to mitigate risk, it is also a very expensive and time-consuming process. Thankfully, since virtually all of the software testing procedures for SMAP are computer controlled, these procedures can be automated. Most people testing SMAP flight software (FSW) would only need to write tests that exercise specific requirements and then check the filtered results to verify everything occurred as planned. This gives developers the ability to automatically launch tests on the testbed, distill the resulting logs into only the important information, generate validation documentation, and then deliver the documentation to management. With many of the steps in FSW testing automated, developers can use their limited time more effectively, validate SMAP FSW modules more quickly, and test them more rigorously. As a result of the various benefits of automating much of the testing process, management is considering using these automated tools in future FSW validation efforts.
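    The log-distillation step, reducing raw testbed logs to only the information validation cares about, might look roughly like this (the tags and log lines are invented):

```python
# Invented tags: keep only errors, failures, and requirement hits.
IMPORTANT = ("ERROR", "FAIL", "REQ-")

def distill(log_lines):
    """Filter a raw testbed log and summarize what validation cares about."""
    kept = [ln for ln in log_lines if any(tag in ln for tag in IMPORTANT)]
    summary = {
        "errors": sum("ERROR" in ln for ln in kept),
        "requirements_hit": sum("REQ-" in ln for ln in kept),
    }
    return kept, summary
```

    The summary dictionary is the sort of distilled result that would feed the generated validation documentation.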

  10. PREDICTING CHRONIC LETHALITY OF CHEMICALS TO FISHES FROM ACUTE TOXICITY TEST DATA: THEORY OF ACCELERATED LIFE TESTING

    EPA Science Inventory

    A method for modeling aquatic toxicity data based on the theory of accelerated life testing and a procedure for maximum likelihood fitting of the proposed model are presented. The procedure is computerized as software, which can predict chronic lethality of chemicals using data from a...
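    The extrapolation idea can be sketched under an assumed log-linear accelerated-life relationship between LC50 and exposure time (invented numbers; the actual model and maximum-likelihood procedure in the work above are more sophisticated):

```python
import math

# Acute test results: (exposure hours, LC50 in mg/L) -- invented numbers.
acute = [(24, 12.0), (48, 9.0), (96, 6.8)]

# Least-squares fit of the assumed accelerated-life form
#   log(LC50) = intercept + slope * log(t)
xs = [math.log(t) for t, _ in acute]
ys = [math.log(c) for _, c in acute]
n = len(acute)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
    (x - mx) ** 2 for x in xs)
intercept = my - slope * mx

def predicted_lc50(hours):
    """Extrapolate the fitted line to a (chronic) exposure duration."""
    return math.exp(intercept + slope * math.log(hours))
```

    The fitted line is then evaluated at a chronic exposure duration well beyond the acute test times, which is the essence of predicting chronic lethality from acute data.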

  11. Building Energy Simulation Test for Existing Homes (BESTEST-EX) (Presentation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Judkoff, R.; Neymark, J.; Polly, B.

    2011-12-01

    This presentation discusses the goals of NREL Analysis Accuracy R&D; BESTEST-EX goals; what BESTEST-EX is; how it works; 'Building Physics' cases; 'Building Physics' reference results; 'utility bill calibration' cases; and limitations and potential future work. The goals of NREL Analysis Accuracy R&D are to: (1) provide industry with the tools and technical information needed to improve the accuracy and consistency of analysis methods; (2) reduce the risks associated with purchasing, financing, and selling energy efficiency upgrades; and (3) enhance software and input collection methods considering impacts on accuracy, cost, and time of energy assessments. The BESTEST-EX goals are to: (1) test software predictions of retrofit energy savings in existing homes; (2) ensure building physics calculations and utility bill calibration procedures perform up to a minimum standard; and (3) quantify the impact of uncertainties in input audit data and occupant behavior. BESTEST-EX is a repeatable procedure that tests how well audit software predictions compare to the current state of the art in building energy simulation. There is no direct truth standard; however, the reference software has been subjected to validation testing, including comparisons with empirical data.

  12. Test/score/report: Simulation techniques for automating the test process

    NASA Technical Reports Server (NTRS)

    Hageman, Barbara H.; Sigman, Clayton B.; Koslosky, John T.

    1994-01-01

    A Test/Score/Report capability is currently being developed for the Transportable Payload Operations Control Center (TPOCC) Advanced Spacecraft Simulator (TASS) system, which will automate testing of the Goddard Space Flight Center (GSFC) Payload Operations Control Center (POCC) and Mission Operations Center (MOC) software in three areas: telemetry decommutation, spacecraft command processing, and spacecraft memory load and dump processing. Automated computer control of the acceptance test process is one of the primary goals of a test team. With the proper simulation tools and user interface, acceptance testing, regression testing, and repeating specific test procedures of a ground data system become simpler tasks. Ideally, complete automation would mean plugging the operational deliverable into the simulator, pressing the start button, executing the test procedure, accumulating and analyzing the data, scoring the results, and reporting them, along with a go/no-go recommendation, to the test team. In practice, this may not be possible because of inadequate test tools, schedule pressures, limited resources, etc. Most tests are accomplished with some degree of automation and with labor-intensive test procedures. This paper discusses simulation techniques that can improve the automation of the test process. The TASS system tests the POCC/MOC software and provides a score based on the test results. The TASS system displays statistics on the success of the POCC/MOC system processing in each of the three areas, as well as event messages pertaining to the Test/Score/Report processing. The TASS system also provides formatted reports documenting each step performed during the tests and the results of each step. A prototype of the Test/Score/Report capability is available and is currently being used to test some POCC/MOC software deliveries. When this capability is fully operational, it should greatly reduce the time necessary to test a POCC/MOC software delivery and improve the quality of the test process.

  13. Advanced communications technology satellite high burst rate link evaluation terminal power control and rain fade software test plan, version 1.0

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.

    1993-01-01

    The Power Control and Rain Fade Software was developed at the NASA Lewis Research Center to support the Advanced Communications Technology Satellite High Burst Rate Link Evaluation Terminal (ACTS HBR-LET). The HBR-LET is an experimenter's terminal used to communicate with the ACTS for various experiments by government, university, and industry organizations. The Power Control and Rain Fade Software is one segment of the Control and Performance Monitor (C&PM) Software system of the HBR-LET. The Power Control and Rain Fade Software automatically controls the LET uplink power to compensate for signal fades. Besides power augmentation, the C&PM Software system is also responsible for instrument control during HBR-LET experiments, control of the Intermediate Frequency Switch Matrix on board the ACTS to yield a desired path through the spacecraft payload, and data display. The Power Control and Rain Fade Software User's Guide, Version 1.0, outlines the commands and procedures to install and operate the Power Control and Rain Fade Software. The Power Control and Rain Fade Software Maintenance Manual, Version 1.0, is a programmer's guide to the Power Control and Rain Fade Software. This manual details the current implementation of the software from a technical perspective. Included are an overview of the Power Control and Rain Fade Software, computer algorithms, format representations, and the computer hardware configuration. The Power Control and Rain Fade Test Plan provides a step-by-step procedure to verify the operation of the software using a predetermined signal fade event. The Test Plan also provides a means to demonstrate the capability of the software.

  14. Ground Systems Development Environment (GSDE) interface requirements analysis

    NASA Technical Reports Server (NTRS)

    Church, Victor E.; Philips, John; Hartenstein, Ray; Bassman, Mitchell; Ruskin, Leslie; Perez-Davila, Alfredo

    1991-01-01

    A set of procedural and functional requirements is presented for the interface between software development environments and the software integration and test systems used for space station ground systems software. The requirements focus on the need for centralized configuration management of software as it is transitioned from development to formal, target-based testing. This concludes the GSDE Interface Requirements study. A summary is presented of findings concerning the interface itself, possible interface and prototyping directions for further study, and results of the investigation of the Cronus distributed applications environment.

  15. NREL Improves Building Energy Simulation Programs Through Diagnostic Testing (Fact Sheet)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2012-01-01

    This technical highlight describes NREL research to develop the Building Energy Simulation Test for Existing Homes (BESTEST-EX) to increase the quality and accuracy of energy analysis tools for the building retrofit market. Researchers at the National Renewable Energy Laboratory (NREL) have developed a new test procedure to increase the quality and accuracy of energy analysis tools for the building retrofit market. The Building Energy Simulation Test for Existing Homes (BESTEST-EX) is a test procedure that enables software developers to evaluate the performance of their audit tools in modeling energy use and savings in existing homes when utility bills are available for model calibration. Similar to NREL's previous energy analysis tests, such as HERS BESTEST and other BESTEST suites included in ANSI/ASHRAE Standard 140, BESTEST-EX compares software simulation findings to reference results generated with state-of-the-art simulation tools such as EnergyPlus, SUNREL, and DOE-2.1E. The BESTEST-EX methodology: (1) Tests software predictions of retrofit energy savings in existing homes; (2) Ensures building physics calculations and utility bill calibration procedures perform to a minimum standard; and (3) Quantifies impacts of uncertainties in input audit data and occupant behavior. BESTEST-EX includes building physics and utility bill calibration test cases. The diagram illustrates the utility bill calibration test cases. Participants are given input ranges and synthetic utility bills. Software tools use the utility bills to calibrate key model inputs and predict energy savings for the retrofit cases. Participant energy savings predictions using calibrated models are compared to NREL predictions using state-of-the-art building energy simulation programs.

  16. Florida specific NTCIP MIB development for actuated signal controller (ASC), closed-circuit television (CCTV), and center-to-center (C2C) communications with SunGuideSM software and ITS device test procedure development : summary of final report.

    DOT National Transportation Integrated Search

    2009-06-01

    To provide hardware, software, network, systems research, and testing for multi-million dollar traffic : operations, Intelligent Transportation Systems (ITS), and statewide communications investments, the : Traffic Engineering and Operations Office h...

  17. Florida specific NTCIP MIB development for actuated signal controller (ASC), closed-circuit television (CCTV), and center-to-center (C2C) communications with SunGuideSM software and ITS device test procedure development : executive summary.

    DOT National Transportation Integrated Search

    2009-06-01

    To provide hardware, software, network, systems research, and testing for multi-million : dollar traffic operations, Intelligent Transportation Systems (ITS), and statewide : communications investments, the Traffic Engineering and Operations Office h...

  18. Software verification plan for GCS. [guidance and control software

    NASA Technical Reports Server (NTRS)

    Dent, Leslie A.; Shagnea, Anita M.; Hayhurst, Kelly J.

    1990-01-01

    This verification plan is written as part of an experiment designed to study the fundamental characteristics of the software failure process. The experiment will be conducted using several implementations of software that were produced according to industry-standard guidelines, namely the Radio Technical Commission for Aeronautics RTCA/DO-178A guidelines, Software Considerations in Airborne Systems and Equipment Certification, for the development of flight software. This plan fulfills the DO-178A requirements for providing instructions on the testing of each implementation of software. The plan details the verification activities to be performed at each phase in the development process, contains a step-by-step description of the testing procedures, and discusses all of the tools used throughout the verification process.

  19. Software engineering project management - A state-of-the-art report

    NASA Technical Reports Server (NTRS)

    Thayer, R. H.; Lehman, J. H.

    1977-01-01

    The management of software engineering projects in the aerospace industry was investigated. The survey assessed such features as contract type, specification preparation techniques, software documentation required by customers, planning and cost-estimating, quality control, the use of advanced program practices, software tools and test procedures, the education levels of project managers, programmers and analysts, work assignment, automatic software monitoring capabilities, design and coding reviews, production times, success rates, and organizational structure of the projects.

  20. LV software support for supersonic flow analysis

    NASA Technical Reports Server (NTRS)

    Bell, W. A.; Lepicovsky, J.

    1992-01-01

    The software for configuring an LV counter processor system has been developed using structured design. The LV system includes up to three counter processors and a rotary encoder. The software for configuring and testing the LV system has been developed, tested, and included in an overall software package for data acquisition, analysis, and reduction. Error handling routines respond to both operator and instrument errors which often arise in the course of measuring complex, high-speed flows. The use of networking capabilities greatly facilitates the software development process by allowing software development and testing from a remote site. In addition, high-speed transfers allow graphics files or commands to provide viewing of the data from a remote site. Further advances in data analysis require corresponding advances in procedures for statistical and time series analysis of nonuniformly sampled data.

  1. LV software support for supersonic flow analysis

    NASA Technical Reports Server (NTRS)

    Bell, William A.

    1992-01-01

    The software for configuring a Laser Velocimeter (LV) counter processor system was developed using structured design. The LV system includes up to three counter processors and a rotary encoder. The software for configuring and testing the LV system was developed, tested, and included in an overall software package for data acquisition, analysis, and reduction. Error handling routines respond to both operator and instrument errors which often arise in the course of measuring complex, high-speed flows. The use of networking capabilities greatly facilitates the software development process by allowing software development and testing from a remote site. In addition, high-speed transfers allow graphics files or commands to provide viewing of the data from a remote site. Further advances in data analysis require corresponding advances in procedures for statistical and time series analysis of nonuniformly sampled data.

  2. Testing of CMA-2000 Microwave Landing System (MLS) airborne receiver

    NASA Astrophysics Data System (ADS)

    Labreche, L.; Murfin, A. J.

    1989-09-01

    The microwave landing system (MLS) is a precision approach and landing guidance system that provides position information and various air-to-ground data. Position information is provided over a wide coverage sector and is determined by an azimuth angle measurement, an elevation angle measurement, and a range measurement. MLS performance standards and the testing of the MLS airborne receiver are mainly governed by Technical Standard Order TSO-C104, issued by the Federal Aviation Administration. This TSO defines detailed test procedures for use in determining the required performance under standard and stressed conditions. It also imposes disciplines on software development and testing procedures. The testing performed on the CMA-2000 MLS receiver and the methods used in its validation are described. A computer-automated test system has been developed to test for compliance with the RTCA/DO-177 Minimum Operational Performance Standards. Extensive software verification and traceability tests designed to ensure compliance with RTCA/DO-178 are outlined.

  3. 242A Distributed Control System Year 2000 Acceptance Test Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    TEATS, M.C.

    1999-08-31

    This report documents acceptance test results for the 242-A Evaporator distributed control system upgrade to D/3 version 9.0-2 for year 2000 compliance. It documents the test results obtained from acceptance testing as directed by procedure HNF-2695. This verification procedure documents the initial testing and evaluation of potential 242-A Distributed Control System (DCS) operating difficulties across the year 2000 boundary and the calendar adjustments needed for the leap year. Baseline system performance data will be recorded using current, as-is operating system software. Data will also be collected for operating system software that has been modified to correct year 2000 problems. This verification procedure is intended to be generic, such that it may be performed on any D/3 (GSE Process Solutions, Inc.) distributed control system that runs with the VMS (Digital Equipment Corporation) operating system. This test may be run on simulation or production systems, depending upon facility status. On production systems, DCS outages will occur nine times throughout performance of the test. These outages are expected to last about 10 minutes each.

  4. Certification of highly complex safety-related systems.

    PubMed

    Reinert, D; Schaefer, M

    1999-01-01

    The BIA now has 15 years of experience with the certification of complex electronic systems for safety-related applications in the machinery sector. Using the example of machining centres, this presentation shows the systematic procedure for verifying and validating control systems that use Application Specific Integrated Circuits (ASICs) and microcomputers for safety functions. One section describes the control structure of machining centres with control systems using "integrated safety." A diverse redundant architecture combined with cross-monitoring and forced dynamization is explained. The main section explains the steps of the systematic certification procedure, showing some results of the certification of drilling machines. Specification reviews, design reviews with test case specification, statistical analysis, and walk-throughs are the analytical measures in the testing process. Systematic tests based on the test case specification, electromagnetic interference (EMI) and environmental testing, and site acceptance tests on the machines are the testing measures for validation. A complex software-driven system is always undergoing modification. Most of the changes are not safety-relevant, but this has to be proven. A systematic procedure for certifying software modifications is presented in the last section of the paper.

  5. Shade matching assisted by digital photography and computer software.

    PubMed

    Schropp, Lars

    2009-04-01

    To evaluate the efficacy of digital photographs and graphic computer software for color matching compared to conventional visual matching. The shade of a tab from a shade guide (Vita 3D-Master Guide) placed in a phantom head was matched to a second guide of the same type by nine observers. This was done for twelve selected shade tabs (tests). The shade-matching procedure was performed visually in a simulated clinic environment and with digital photographs, and the time spent for both procedures was recorded. An alternative arrangement of the shade tabs was used in the digital photographs. In addition, a graphic software program was used for color analysis. Hue, chroma, and lightness values of the test tab and all tabs of the second guide were derived from the digital photographs. According to the CIE L*C*h* color system, the color differences between the test tab and tabs of the second guide were calculated. The shade guide tab that deviated least from the test tab was determined to be the match. Shade matching performance by means of graphic software was compared with the two visual methods and tested by Chi-square tests (alpha= 0.05). Eight of twelve test tabs (67%) were matched correctly by the computer software method. This was significantly better (p < 0.02) than the performance of the visual shade matching methods conducted in the simulated clinic (32% correct match) and with photographs (28% correct match). No correlation between time consumption for the visual shade matching methods and frequency of correct match was observed. Shade matching assisted by digital photographs and computer software was significantly more reliable than by conventional visual methods.
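    The color-difference step described above can be sketched in a few lines. This is an illustrative sketch only: the tab values below are invented, the study's actual software is not specified, and the CIE 1976 ΔE*ab formula expressed in L*C*h* terms is a standard choice assumed here.

```python
import math

def delta_e_lch(t1, t2):
    """CIE 1976 colour difference between two (L*, C*, h[degrees]) triples."""
    L1, C1, h1 = t1
    L2, C2, h2 = t2
    dh = math.radians(h2 - h1)
    dH = 2.0 * math.sqrt(C1 * C2) * math.sin(dh / 2.0)  # hue-difference term
    return math.sqrt((L1 - L2) ** 2 + (C1 - C2) ** 2 + dH ** 2)

def best_match(test_tab, guide_tabs):
    """Return the guide tab whose colour difference from the test tab is smallest."""
    return min(guide_tabs, key=lambda tab: delta_e_lch(test_tab, tab))
```

    Applying `best_match` to the L*, C*, h values derived from the photographs reproduces the paper's decision rule: the guide tab that deviates least from the test tab is declared the match.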

  6. SigmaPlot 2000, Version 6.00, SPSS Inc. Computer Software Test Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    HURLBUT, S.T.

    2000-10-24

    SigmaPlot is a vendor software product used in conjunction with the supercritical fluid extraction Fourier transform infrared spectrometer (SFE-FTIR) system. This product converts the raw spectral data to useful area numbers. SigmaPlot will be used in conjunction with procedure ZA-565-301, ''Determination of Moisture by Supercritical Fluid Extraction and Infrared Detection.'' This test plan will be performed in conjunction with or prior to HNF-6936, ''HA-53 Supercritical Fluid Extraction System Acceptance Test Plan'', to perform analyses for water. The test will ensure that the software can be installed properly and will manipulate the analytical data correctly.

  7. CT and MRI slice separation evaluation by LabView developed software.

    PubMed

    Acri, Giuseppe; Testagrossa, Barbara; Sestito, Angela; Bonanno, Lilla; Vermiglio, Giuseppe

    2018-02-01

    The efficient use of Computed Tomography (CT) and Magnetic Resonance Imaging (MRI) equipment necessitates establishing adequate quality-control (QC) procedures. In particular, verifying the accuracy of slice separation during multislice acquisition requires scan exploration of phantoms containing test objects. To simplify such procedures, a novel phantom and a computerised LabView-based procedure have been devised, enabling determination of the midpoint of the full width at half maximum (FWHM) in real time while the distance between the profile midpoints of two successive images is evaluated and measured. The results were compared with those obtained by processing the same phantom images with commercial software. To validate the proposed methodology, the Fisher test was conducted on the resulting data sets. In all cases, there was no statistically significant variation between the commercial procedure and the LabView one, which can be used on any CT and MRI diagnostic device. Copyright © 2017. Published by Elsevier GmbH.
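    The FWHM-midpoint calculation described above can be sketched as follows. The function names and profile data are hypothetical, and the paper's LabView implementation is not reproduced here; half-maximum crossings are located by linear interpolation between samples, and slice separation is taken as the distance between the midpoints of two successive profiles.

```python
def fwhm_midpoint(positions, values):
    """Midpoint of the full width at half maximum of a 1-D profile.
    Half-max crossings are found by linear interpolation between samples."""
    half = max(values) / 2.0
    crossings = []
    for i in range(len(values) - 1):
        v0, v1 = values[i], values[i + 1]
        if (v0 - half) * (v1 - half) < 0:  # sign change -> crossing in this interval
            frac = (half - v0) / (v1 - v0)
            crossings.append(positions[i] + frac * (positions[i + 1] - positions[i]))
    return (crossings[0] + crossings[-1]) / 2.0

def slice_separation(pos, profile_a, profile_b):
    """Distance between the FWHM midpoints of two successive slice profiles."""
    return abs(fwhm_midpoint(pos, profile_b) - fwhm_midpoint(pos, profile_a))
```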

  8. SDO FlatSat Facility

    NASA Technical Reports Server (NTRS)

    Amason, David L.

    2008-01-01

    The goal of the Solar Dynamics Observatory (SDO) is to understand and, ideally, predict the solar variations that influence life and society. Its instruments will measure the properties of the Sun and will take high-definition images of the Sun every few seconds, all day, every day. The FlatSat is a high-fidelity electrical and functional representation of the SDO spacecraft bus. It is a high-fidelity test bed for Integration & Test (I&T), flight software, and flight operations. For I&T purposes, FlatSat will be a driver for developing and dry-running electrical integration procedures, STOL test procedures, page displays, and the command and telemetry database. FlatSat will also serve as a platform for flight software acceptance and systems testing for the flight software system components, including the spacecraft main processors, power supply electronics, attitude control electronics, gimbal control electronics, and the S-band communications card. FlatSat will also benefit the flight operations team through post-launch flight software code and table update development and verification, and verification of new and updated flight operations products. This document highlights the benefits of FlatSat; describes the building of FlatSat; provides FlatSat facility requirements, access roles, and responsibilities; and discusses FlatSat mechanical and electrical integration and functional testing.

  9. Analyzing the test process using structural coverage

    NASA Technical Reports Server (NTRS)

    Ramsey, James; Basili, Victor R.

    1985-01-01

    A large, commercially developed FORTRAN program was modified to produce structural coverage metrics. The modified program was executed on a set of functionally generated acceptance tests and a large sample of operational usage cases. The resulting structural coverage metrics are combined with fault and error data to evaluate structural coverage. It was shown that in the software environment the functionally generated tests seem to be a good approximation of operational use. The relative proportions of the exercised statement subclasses change as the structural coverage of the program increases. A method was also proposed for evaluating whether two sets of input data exercise a program in a similar manner. Evidence was provided that implies that in this environment, faults revealed in a procedure are independent of the number of times the procedure is executed and that it may be reasonable to use procedure coverage in software models that use statement coverage. Finally, the evidence suggests that it may be possible to use structural coverage to aid in the management of the acceptance test process.
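    The idea of comparing how similarly two input sets exercise a program can be sketched with statement-coverage sets. The Jaccard similarity used below is an assumption, a crude stand-in for the paper's actual comparison method, and the function names are hypothetical.

```python
def coverage_percent(executed, all_statements):
    """Percentage of the program's statements exercised by an input set."""
    return 100.0 * len(set(executed) & set(all_statements)) / len(all_statements)

def coverage_similarity(exec_a, exec_b):
    """Jaccard similarity of the statement sets executed by two input sets:
    1.0 means both inputs exercise exactly the same statements."""
    covered_a, covered_b = set(exec_a), set(exec_b)
    union = covered_a | covered_b
    if not union:
        return 1.0  # neither input executed anything; trivially identical
    return len(covered_a & covered_b) / len(union)
```

    Under this sketch, a functionally generated test suite "approximates operational use" when its executed-statement set has high similarity to the set executed by operational cases.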

  10. Cleanroom certification model

    NASA Technical Reports Server (NTRS)

    Currit, P. A.

    1983-01-01

    The Cleanroom software development methodology is designed to take the gamble out of product releases for both suppliers and receivers of the software. The ingredients of this procedure are a life cycle of executable product increments, representative statistical testing, and a standard estimate of the MTTF (Mean Time To Failure) of the product at the time of its release. A statistical approach to software product testing using randomly selected samples of test cases is considered. A statistical model is defined for the certification process which uses the timing data recorded during test. A reasonableness argument for this model is provided that uses previously published data on software product execution. Also included is a derivation of the certification model estimators and a comparison of the proposed least squares technique with the more commonly used maximum likelihood estimators.
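    The least-squares side of such a certification model can be sketched for the common reliability-growth form MTTF_k = A·B^k, fitted in log space against successive interfailure times recorded during statistical testing. This is a sketch under stated assumptions: the exact model in the paper may differ, and the function names and fitting details are invented for illustration.

```python
import math

def fit_mttf_growth(interfail_times):
    """Least-squares fit of log t_k = log A + k*log B to successive
    interfailure times t_1..t_n (reliability-growth form MTTF_k = A*B^k).
    Returns (A, B); B > 1 indicates reliability growth."""
    ys = [math.log(t) for t in interfail_times]
    ks = list(range(1, len(ys) + 1))
    n = len(ks)
    kbar = sum(ks) / n
    ybar = sum(ys) / n
    logB = sum((k - kbar) * (y - ybar) for k, y in zip(ks, ys)) / \
           sum((k - kbar) ** 2 for k in ks)
    logA = ybar - logB * kbar
    return math.exp(logA), math.exp(logB)

def predicted_mttf(A, B, k):
    """Predicted MTTF after the k-th failure (and its fix)."""
    return A * B ** k
```

    The fitted curve yields the release-time MTTF estimate: extrapolate `predicted_mttf` one step past the last observed failure.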

  11. Annual Research Progress Report.

    DTIC Science & Technology

    1979-09-30

    will be trained in SLRL test procedures and the methodology will be developed for the incorporation of test materials into the standard rearing diet ...requirements exist for system software maintenance and development of software to report dosing data, to calculate diet preparation data, to manage collected...influence of diet and exercise on myoglobin and metmyoglobin reductase were evaluated in the rat. The activity of metmyoglobin reductase was

  12. A systematic approach to the Planck LFI end-to-end test and its application to the DPC Level 1 pipeline

    NASA Astrophysics Data System (ADS)

    Frailis, M.; Maris, M.; Zacchei, A.; Morisset, N.; Rohlfs, R.; Meharga, M.; Binko, P.; Türler, M.; Galeotta, S.; Gasparo, F.; Franceschi, E.; Butler, R. C.; D'Arcangelo, O.; Fogliani, S.; Gregorio, A.; Lowe, S. R.; Maggio, G.; Malaspina, M.; Mandolesi, N.; Manzato, P.; Pasian, F.; Perrotta, F.; Sandri, M.; Terenzi, L.; Tomasi, M.; Zonca, A.

    2009-12-01

    The Level 1 of the Planck LFI Data Processing Centre (DPC) is devoted to the handling of the scientific and housekeeping telemetry. It is a critical component of the Planck ground segment, which must strictly adhere to the project schedule to be ready for launch and flight operations. In order to guarantee the quality necessary to achieve the objectives of the Planck mission, the design and development of the Level 1 software has followed the ESA Software Engineering Standards. A fundamental step in the software life cycle is the verification and validation of the software. The purpose of this work is to show an example of procedures, test development, and analysis successfully applied to a key software project of an ESA mission. We present the end-to-end validation tests performed on the Level 1 of the LFI-DPC, detailing the methods used and the results obtained. Different approaches have been used to test the scientific and housekeeping data processing. Scientific data processing has been tested by injecting signals with known properties directly into the acquisition electronics, in order to generate a test dataset of real telemetry data and reproduce nominal conditions as closely as possible. For the housekeeping telemetry processing, validation software has been developed to inject known parameter values into a set of real housekeeping packets and perform a comparison with the corresponding timelines generated by the Level 1. With the proposed validation and verification procedure, where the on-board and ground processing are viewed as a single pipeline, we demonstrated that the scientific and housekeeping processing of the Planck-LFI raw data is correct and meets the project requirements.

  13. Model-Based Development of Automotive Electronic Climate Control Software

    NASA Astrophysics Data System (ADS)

    Kakade, Rupesh; Murugesan, Mohan; Perugu, Bhupal; Nair, Mohanan

    With the increasing complexity of software in today's products, writing and maintaining thousands of lines of code is a tedious task. Instead, an alternative methodology must be employed. Model-based development is one candidate that offers several benefits and allows engineers to focus on their domain of expertise rather than writing huge amounts of code. In this paper, we discuss the application of model-based development to the electronic climate control software of vehicles. The back-to-back testing approach is presented, which ensures a flawless and smooth transition from legacy designs to model-based development. The Simulink report generator, which creates design documents from the models, is presented along with its use to run the simulation model and capture the results in the test report. Test automation using a model-based development tool that supports the use of a unique set of test cases across several testing levels, with a test procedure independent of the software and hardware platform, is also presented.

  14. Design ATE systems for complex assemblies

    NASA Astrophysics Data System (ADS)

    Napier, R. S.; Flammer, G. H.; Moser, S. A.

    1983-06-01

    The use of ATE systems in radio specification testing can reduce the test time by approximately 90 to 95 percent. What is more, the test station does not require a highly trained operator. Since the system controller has full power over all the measurements, human errors are not introduced into the readings. The controller is immune to any need to increase output by allowing marginal units to pass through the system. In addition, the software compensates for predictable, repeatable system errors, for example, cabling losses, which are an inherent part of the test setup. With no variation in test procedures from unit to unit, there is a constant repeatability factor. Preparing the software, however, usually entails considerable expense. It is pointed out that many of the problems associated with ATE system software can be avoided with the use of a software-intensive, or computer-intensive, system organization. Its goal is to minimize the user's need for software development, thereby saving time and money.

  15. Reverberation Chamber Uniformity Validation and Radiated Susceptibility Test Procedures for the NASA High Intensity Radiated Fields Laboratory

    NASA Technical Reports Server (NTRS)

    Koppen, Sandra V.; Nguyen, Truong X.; Mielnik, John J.

    2010-01-01

    The NASA Langley Research Center's High Intensity Radiated Fields Laboratory has developed a capability based on the RTCA/DO-160F Section 20 guidelines for radiated electromagnetic susceptibility testing in reverberation chambers. Phase 1 of the test procedure utilizes mode-tuned stirrer techniques and E-field probe measurements to validate chamber uniformity, determines chamber loading effects, and defines a radiated susceptibility test process. The test procedure is segmented into numbered operations that are largely software controlled. This document is intended as a laboratory test reference and includes diagrams of test setups, equipment lists, as well as test results and analysis. Phase 2 of development is discussed.
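    The chamber-uniformity check described in Phase 1 typically reduces to a spread statistic computed over per-position maximum E-field probe readings. A minimal sketch, assuming the common reverberation-chamber formulation sigma_dB = 20·log10((σ + mean)/mean) and a hypothetical 3 dB acceptance limit (the actual DO-160F criteria are frequency-dependent and not reproduced here):

```python
import math

def field_uniformity_db(max_fields):
    """Field-uniformity figure in dB for a set of per-position maximum
    E-field probe readings (V/m): 20*log10((sigma + mean)/mean), where
    sigma is the sample standard deviation of the readings."""
    n = len(max_fields)
    mean = sum(max_fields) / n
    var = sum((x - mean) ** 2 for x in max_fields) / (n - 1)
    return 20.0 * math.log10((math.sqrt(var) + mean) / mean)

def uniform_enough(max_fields, limit_db=3.0):
    """True if the spread is within the (assumed, illustrative) dB limit."""
    return field_uniformity_db(max_fields) <= limit_db
```

    Identical readings at every probe position give 0 dB; larger spreads raise the figure until the chamber fails the (assumed) limit.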

  16. Is Your School Y2K-OK?

    ERIC Educational Resources Information Center

    Bates, Martine G.

    1999-01-01

    The most vulnerable Y2K areas for schools are networked computers, free-standing personal computers, software, and embedded chips in utilities such as telephones and fire alarms. Expensive, time-consuming procedures and software have been developed for testing and bringing most computers into compliance. Districts need a triage prioritization…

  17. Moving base Gravity Gradiometer Survey System (GGSS) program

    NASA Astrophysics Data System (ADS)

    Pfohl, Louis; Rusnak, Walter; Jircitano, Albert; Grierson, Andrew

    1988-04-01

    The GGSS program began in early 1983 with the objective of delivering a landmobile and airborne system capable of fast, accurate, and economical gravity gradient surveys of large areas anywhere in the world. The objective included the development and use of post-mission data reduction software to process the survey data into solutions for the gravity disturbance vector components (north, east and vertical). This document describes the GGSS equipment hardware and software, integration and lab test procedures and results, and airborne and land survey procedures and results. Included are discussions on test strategies, post-mission data reduction algorithms, and the data reduction processing experience. Perspectives and conclusions are drawn from the results.

  18. Joint Ordnance Test Procedure (JOTP)-001 Allied Ammunition Safety and Suitability for Service Assessment Testing Publication - Guidance

    DTIC Science & Technology

    2013-01-08

    hazard due to enemy attack or accident (e.g. Insensitive Munitions (IM) and Electromagnetic Environmental Effects (E3)) and the explosive materials...of mitigating potential explosive remnants of war hazards, particularly from unexploded ordnance, should be conducted. 6.5 Munition Software System...

  19. LV software support for supersonic flow analysis

    NASA Technical Reports Server (NTRS)

    Bell, William A.

    1991-01-01

    During 1991, the software developed allowed an operator to configure and checkout the TSI, Inc. laser velocimeter (LV) system prior to a run. This setup procedure established the operating conditions for the TSI MI-990 multichannel interface and the RMR-1989 rotating machinery resolver. In addition to initializing the instruments, the software package provides a means of specifying LV calibration constants, controlling the sampling process, and identifying the test parameters.

  20. Software design for automated assembly of truss structures

    NASA Technical Reports Server (NTRS)

    Herstrom, Catherine L.; Grantham, Carolyn; Allen, Cheryl L.; Doggett, William R.; Will, Ralph W.

    1992-01-01

    Concern over the limited intravehicular activity time has increased the interest in performing in-space assembly and construction operations with automated robotic systems. A technique being considered at LaRC is a supervised-autonomy approach, which can be monitored by an Earth-based supervisor that intervenes only when the automated system encounters a problem. A test-bed to support evaluation of the hardware and software requirements for supervised-autonomy assembly methods was developed. This report describes the design of the software system necessary to support the assembly process. The software is hierarchical and supports both automated assembly operations and supervisor error-recovery procedures, including the capability to pause and reverse any operation. The software design serves as a model for the development of software for more sophisticated automated systems and as a test-bed for evaluation of new concepts and hardware components.

  1. An Integrated Analysis-Test Approach

    NASA Technical Reports Server (NTRS)

    Kaufman, Daniel

    2003-01-01

This viewgraph presentation provides an overview of a project to develop a computer program that integrates data analysis and test procedures. The software aims to bring a new perspective to traditional mechanical analysis and test procedures and to integrate pre-test and test analysis calculation methods. The program should also run on portable devices and allow 'quasi-real time' analysis of data sent by electronic means. Test methods reviewed in this presentation include shaker swept-sine and random tests, shaker shock mode tests, shaker base-driven modal survey tests, and acoustic tests.

  2. A Survey and Evaluation of Software Quality Assurance.

    DTIC Science & Technology

    1984-09-01

    activities; 2. Cryptologic activities related to national security; 3. Command and control of military forces; 4. Equipment that is an integral part of a...Testing and Integration , and Performance or Operation (6). Figure 3 shows the software life cycle and the key outputs of the phases. The first phase to...defects. This procedure is considered the Checkout (13:09-91). Once coding is complete, the Testing and Integration Phase begins. Here the developed

  3. A Unique Software System For Simulation-to-Flight Research

    NASA Technical Reports Server (NTRS)

    Chung, Victoria I.; Hutchinson, Brian K.

    2001-01-01

    "Simulation-to-Flight" is a research development concept to reduce costs and increase testing efficiency of future major aeronautical research efforts at NASA. The simulation-to-flight concept is achieved by using common software and hardware, procedures, and processes for both piloted-simulation and flight testing. This concept was applied to the design and development of two full-size transport simulators, a research system installed on a NASA B-757 airplane, and two supporting laboratories. This paper describes the software system that supports the simulation-to-flight facilities. Examples of various simulation-to-flight experimental applications were also provided.

  4. Development and integration of a LabVIEW-based modular architecture for automated execution of electrochemical catalyst testing.

    PubMed

    Topalov, Angel A; Katsounaros, Ioannis; Meier, Josef C; Klemm, Sebastian O; Mayrhofer, Karl J J

    2011-11-01

    This paper describes a system for performing electrochemical catalyst testing where all hardware components are controlled simultaneously using a single LabVIEW-based software application. The software that we developed can be operated in both manual mode for exploratory investigations and automatic mode for routine measurements, by using predefined execution procedures. The latter enables the execution of high-throughput or combinatorial investigations, which decrease substantially the time and cost for catalyst testing. The software was constructed using a modular architecture which simplifies the modification or extension of the system, depending on future needs. The system was tested by performing stability tests of commercial fuel cell electrocatalysts, and the advantages of the developed system are discussed. © 2011 American Institute of Physics
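
The modular, automatable design described in the abstract can be sketched in a few lines. This is an illustrative Python sketch, not the authors' LabVIEW code, and the module and step names are invented: registered measurement modules are chained into predefined execution procedures, which is what makes automatic, high-throughput runs possible.

```python
# Registry of measurement modules; "automatic mode" runs a predefined
# sequence of registered steps over a shared context (names invented).
STEPS = {}

def step(name):
    """Decorator registering a measurement module under a name."""
    def register(fn):
        STEPS[name] = fn
        return fn
    return register

@step("purge")
def purge(ctx):
    ctx["purged"] = True          # e.g. purge the electrolyte with Ar
    return ctx

@step("cv_sweep")
def cv_sweep(ctx):
    ctx["data"] = [0.1, 0.2]      # placeholder for acquired current data
    return ctx

def run_procedure(sequence, ctx=None):
    """Execute a predefined sequence of steps, as in automatic mode."""
    ctx = {} if ctx is None else ctx
    for name in sequence:
        ctx = STEPS[name](ctx)
    return ctx

result = run_procedure(["purge", "cv_sweep"])
```

Adding a new instrument then amounts to registering one more module, which is the extensibility benefit a modular architecture of this kind targets.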

  5. Technical Note: Unified imaging and robotic couch quality assurance.

    PubMed

    Cook, Molly C; Roper, Justin; Elder, Eric S; Schreibmann, Eduard

    2016-09-01

    To introduce a simplified quality assurance (QA) procedure that integrates tests for the linac's imaging components and the robotic couch. Current QA procedures for evaluating the alignment of the imaging system and linac require careful positioning of a phantom at isocenter before image acquisition and analysis. A complementary procedure for the robotic couch requires an initial displacement of the phantom and then evaluates the accuracy of repositioning the phantom at isocenter. We propose a two-in-one procedure that introduces a custom software module and incorporates both checks into one motion for increased efficiency. The phantom was manually set with random translational and rotational shifts, imaged with the in-room imaging system, and then registered to the isocenter using a custom software module. The software measured positioning accuracy by comparing the location of the repositioned phantom with a CAD model of the phantom at isocenter, which is physically verified using the MV port graticule. Repeatability of the custom software was tested by an assessment of internal marker location extraction on a series of scans taken over differing kV and CBCT acquisition parameters. The proposed method was able to correctly position the phantom at isocenter within acceptable 1 mm and 1° SRS tolerances, verified by both physical inspection and the custom software. Residual errors for mechanical accuracy were 0.26 mm vertically, 0.21 mm longitudinally, 0.55 mm laterally, 0.21° in pitch, 0.1° in roll, and 0.67° in yaw. The software module was shown to be robust across various scan acquisition parameters, detecting markers within 0.15 mm translationally in kV acquisitions and within 0.5 mm translationally and 0.3° rotationally across CBCT acquisitions with significant variations in voxel size. Agreement with vendor registration methods was well within 0.5 mm; differences were not statistically significant. 
As compared to the current two-step approach, the proposed QA procedure streamlines the workflow, accounts for rotational errors in imaging alignment, and simulates a broad range of variations in setup errors seen in clinical practice.
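
The pass/fail logic of the combined check can be illustrated with the residual values reported above (the function name and structure are a sketch, not the authors' software module):

```python
def within_srs_tolerance(residual_mm, residual_deg, tol_mm=1.0, tol_deg=1.0):
    """Check 6-DOF repositioning residuals against the 1 mm / 1 degree
    SRS tolerances (illustrative helper, not the paper's code)."""
    trans_ok = all(abs(r) <= tol_mm for r in residual_mm)
    rot_ok = all(abs(r) <= tol_deg for r in residual_deg)
    return trans_ok and rot_ok

# Residuals reported above: vertical, longitudinal, lateral (mm)
# and pitch, roll, yaw (degrees) -- all within tolerance.
ok = within_srs_tolerance([0.26, 0.21, 0.55], [0.21, 0.10, 0.67])
```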

  6. The Production Data Approach for Full Lifecycle Management

    NASA Astrophysics Data System (ADS)

    Schopf, J.

    2012-04-01

The amount of data generated by scientists is growing exponentially, and studies have shown [Koe04] that un-archived data sets have a resource half-life that is only a fraction of that of electronically archived resources. Most groups still lack standard approaches and procedures for data management. Arguably, however, scientists know something about building software. A recent article in Nature [Mer10] stated that 45% of research scientists spend more time developing software now than they did 5 years ago, and 38% spend at least one fifth of their time developing software. Fox argues [Fox10] that a simple release of data is not the correct approach to data curation. In addition, just as software is used in a wide variety of ways never initially envisioned by its developers, we see this to an even greater extent with data sets. To address the need for better data preservation and access, we propose that data sets be managed in much the same way as production-quality software. These production data sets are not simply published once; they go through a cyclical process, with phases such as design, development, verification, deployment, support, analysis, and then development again, thereby supporting the full lifecycle of a data set. The process used for academically produced software changes over time with respect to issues such as how much the software is used outside the development group, but it factors in aspects such as knowing who is using the code, enabling multiple developers to contribute to code development with common procedures, formal testing and release processes, developing documentation, and licensing. When we work with data, whether as a collection source, as someone tagging data, or as someone re-using it, many of the lessons learned in building production software apply.
Table 1: Comparison of production software and production data.

    Production Software                          Production Data
    End-user considerations                      End-user considerations
    Multiple coders: repository with             Multiple producers/collectors: local archive
      check-in procedures, coding standards        with check-in procedure, metadata standards
    Formal testing                               Formal testing
    Bug tracking and fixes                       Bug tracking and fixes, QA/QC
    Documentation                                Documentation
    Formal release process                       Formal release process to external archive
    License                                      Citation/usage statement

The full presentation of this abstract will include a detailed discussion of these issues so that researchers can produce usable and accessible data sets as a first step toward reproducible science. By creating production-quality data sets, we extend the potential of our data, both in terms of usability and usefulness to ourselves and other researchers. The more we treat data with formal processes and release cycles, the more relevant and useful it can be to the scientific community.

  7. Jeff Cheatham, senior metrologist

    NASA Image and Video Library

    2015-01-27

Jeff Cheatham, senior metrologist at the Marshall Metrology and Calibration Laboratory, spent 12 years developing 2,400 automated software procedures used for calibration and testing of space vehicles and equipment.

  8. Research and emulation of ranging in BPON system

    NASA Astrophysics Data System (ADS)

    Yang, Guangxiang; Tao, Dexin; He, Yan

    2005-12-01

Ranging is one of the key technologies in the ATM-based Broadband Passive Optical Network (BPON) system. It is complex for software designers and difficult to test. To simplify the ranging procedure, improve its efficiency, and find an appropriate method of verifying it, this paper proposes a new ranging procedure that fully satisfies the requirements of ITU-T G.983.1, together with a verification method. A ranging procedure without the serial number (SN) searching function, called one-by-one ranging, is developed for the case of a cold PON with cold Optical Network Units (ONUs). The procedure is specified by flow charts for the OLT and the ONU respectively. Using the network emulation software OPNET, the BPON system is modeled and the ranging procedure is simulated. The simulation results show that the proposed procedure effectively eliminates collisions between the burst-mode signals of different ONUs, which are ranged one by one under the control of the OLT, while also improving ranging efficiency. Because all message formats used in this work conform to ITU-T G.983.1, the ranging procedure meets the protocol specification, offers good interoperability, and is compatible with other manufacturers' products. The study also provides guidelines and design principles that remove some difficulties from the software design.
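
The delay-equalization arithmetic at the heart of PON ranging can be sketched as follows (a minimal illustration; the figures and names are invented, while the actual procedure and message formats follow ITU-T G.983.1):

```python
def equalization_delay(round_trip_ns, max_rtt_ns=400_000):
    """Extra delay assigned to an ONU so that every ONU appears to sit
    at the same logical distance from the OLT (illustrative figures)."""
    if round_trip_ns > max_rtt_ns:
        raise ValueError("ONU beyond supported reach")
    return max_rtt_ns - round_trip_ns

# Ranging ONUs one by one avoids upstream collisions: only the ONU
# currently granted a ranging window is allowed to reply.
delays = [equalization_delay(rtt) for rtt in (120_000, 250_000, 399_000)]
```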

  9. Multiple IMU system test plan, volume 4. [subroutines for space shuttle requirements

    NASA Technical Reports Server (NTRS)

    Landey, M.; Vincent, K. T., Jr.; Whittredge, R. S.

    1974-01-01

    Operating procedures for this redundant system are described. A test plan is developed with two objectives. First, performance of the hardware and software delivered is demonstrated. Second, applicability of multiple IMU systems to the space shuttle mission is shown through detailed experiments with FDI algorithms and other multiple IMU software: gyrocompassing, calibration, and navigation. Gimbal flip is examined in light of its possible detrimental effects on FDI and navigation. For Vol. 3, see N74-10296.

  10. SLS Flight Software Testing: Using a Modified Agile Software Testing Approach

    NASA Technical Reports Server (NTRS)

    Bolton, Albanie T.

    2016-01-01

NASA's Space Launch System (SLS) is an advanced launch vehicle for a new era of exploration beyond Earth orbit (BEO). The world's most powerful rocket, SLS will launch crews of up to four astronauts in the agency's Orion spacecraft on missions to explore multiple deep-space destinations. Boeing is developing the SLS core stage, including the avionics that will control the vehicle during flight. The core stage will be built at NASA's Michoud Assembly Facility (MAF) in New Orleans, LA, using state-of-the-art manufacturing equipment. At the same time, the rocket's avionics computer software is being developed at Marshall Space Flight Center in Huntsville, AL. At Marshall, the Flight and Ground Software division provides comprehensive engineering expertise for the development of flight and ground software. Within that division, the Software Systems Engineering Branch's test and verification (T&V) team uses an agile approach to the testing and verification of software. The agile test method opens the door for regular short sprint release cycles. The basic premise of agile software development and testing is that work proceeds iteratively and incrementally: requirements and solutions evolve through collaboration between cross-functional teams. Because testing and development are done incrementally, each release can deliver more features and greater value. This value can be seen throughout the T&V team processes, which are documented in various work instructions within the branch. The T&V team produces procedural test results at a higher rate, resolves issues found in software with designers at an earlier stage rather than in a later release, and its members gain increased knowledge of the system architecture by interfacing with designers. The SLS flight software teams want to continue uncovering better ways of developing software efficiently and to the project's benefit.
Through agile testing, there has been increased value through individuals and interactions over processes and tools, improved customer collaboration, and improved responsiveness to changes through controlled planning. The presentation will describe agile testing methodology as taken with the SLS FSW Test and Verification team at Marshall Space Flight Center.

  11. NASA software documentation standard software engineering program

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The NASA Software Documentation Standard (hereinafter referred to as Standard) can be applied to the documentation of all NASA software. This Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. This basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.

  12. A rule-based software test data generator

    NASA Technical Reports Server (NTRS)

    Deason, William H.; Brown, David B.; Chang, Kai-Hsiung; Cross, James H., II

    1991-01-01

Rule-based software test data generation is proposed as an alternative to path/predicate analysis and random data generation. A prototype rule-based test data generator for Ada programs is constructed and compared to a random test data generator. Four Ada procedures are used in the comparison. Approximately 2,000 rule-based test cases and 100,000 randomly generated test cases are automatically generated and executed, and the success of the two methods is compared using standard coverage metrics. Simple statistical tests show that even this primitive rule-based prototype is significantly better than random data generation. This result demonstrates that rule-based test data generation is feasible and shows great promise in assisting test engineers, especially as the rule base is developed further.
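
A toy illustration of why rule-based generation can beat random generation on coverage (a Python sketch, not the paper's Ada prototype): boundary-value rules derived from a procedure's predicates hit a rare branch that random inputs almost never reach.

```python
import random

def program_under_test(x, y):
    # Toy procedure with a hard-to-hit branch.
    if x == 0 and y > 100:
        return "rare"
    return "common"

def random_cases(n, rng):
    return [(rng.randint(-1000, 1000), rng.randint(-1000, 1000))
            for _ in range(n)]

def rule_based_cases():
    # Rules derived from the predicates in the code: boundary values
    # for each condition, combined pairwise.
    return [(x, y) for x in (0, 1, -1) for y in (99, 100, 101)]

rng = random.Random(1)
random_hits = {program_under_test(x, y) for x, y in random_cases(2000, rng)}
rule_hits = {program_under_test(x, y) for x, y in rule_based_cases()}
# rule_hits covers both branches; 2000 random cases usually miss "rare".
```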

  13. Software control and system configuration management - A process that works

    NASA Technical Reports Server (NTRS)

    Petersen, K. L.; Flores, C., Jr.

    1983-01-01

    A comprehensive software control and system configuration management process for flight-crucial digital control systems of advanced aircraft has been developed and refined to insure efficient flight system development and safe flight operations. Because of the highly complex interactions among the hardware, software, and system elements of state-of-the-art digital flight control system designs, a systems-wide approach to configuration control and management has been used. Specific procedures are implemented to govern discrepancy reporting and reconciliation, software and hardware change control, systems verification and validation testing, and formal documentation requirements. An active and knowledgeable configuration control board reviews and approves all flight system configuration modifications and revalidation tests. This flexible process has proved effective during the development and flight testing of several research aircraft and remotely piloted research vehicles with digital flight control systems that ranged from relatively simple to highly complex, integrated mechanizations.

  14. Software control and system configuration management: A systems-wide approach

    NASA Technical Reports Server (NTRS)

    Petersen, K. L.; Flores, C., Jr.

    1984-01-01

    A comprehensive software control and system configuration management process for flight-crucial digital control systems of advanced aircraft has been developed and refined to insure efficient flight system development and safe flight operations. Because of the highly complex interactions among the hardware, software, and system elements of state-of-the-art digital flight control system designs, a systems-wide approach to configuration control and management has been used. Specific procedures are implemented to govern discrepancy reporting and reconciliation, software and hardware change control, systems verification and validation testing, and formal documentation requirements. An active and knowledgeable configuration control board reviews and approves all flight system configuration modifications and revalidation tests. This flexible process has proved effective during the development and flight testing of several research aircraft and remotely piloted research vehicles with digital flight control systems that ranged from relatively simple to highly complex, integrated mechanizations.

  15. NHPP-Based Software Reliability Models Using Equilibrium Distribution

    NASA Astrophysics Data System (ADS)

    Xiao, Xiao; Okamura, Hiroyuki; Dohi, Tadashi

    Non-homogeneous Poisson processes (NHPPs) have gained much popularity in actual software testing phases to estimate the software reliability, the number of remaining faults in software and the software release timing. In this paper, we propose a new modeling approach for the NHPP-based software reliability models (SRMs) to describe the stochastic behavior of software fault-detection processes. The fundamental idea is to apply the equilibrium distribution to the fault-detection time distribution in NHPP-based modeling. We also develop efficient parameter estimation procedures for the proposed NHPP-based SRMs. Through numerical experiments, it can be concluded that the proposed NHPP-based SRMs outperform the existing ones in many data sets from the perspective of goodness-of-fit and prediction performance.
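
The fundamental idea above, substituting the equilibrium distribution of the fault-detection time distribution, can be illustrated numerically (a sketch; for the memoryless exponential case the equilibrium distribution coincides with the original, which makes it a convenient check):

```python
import math

def equilibrium_cdf(survival, mean, t, steps=10_000):
    """F_e(t) = (1/mu) * integral_0^t S(x) dx, via the trapezoidal rule."""
    h = t / steps
    total = sum(survival(i * h) for i in range(steps + 1))
    integral = h * (total - 0.5 * (survival(0.0) + survival(t)))
    return integral / mean

# Exponential fault-detection times: by memorylessness the equilibrium
# distribution equals the original, so F_e(t) should match 1 - e^(-lam*t).
lam = 2.0
S = lambda x: math.exp(-lam * x)
t = 1.3
fe = equilibrium_cdf(S, 1.0 / lam, t)
```

In an NHPP-based SRM the mean value function would then take the form m(t) = omega * F_e(t), with omega the expected total number of faults.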

  16. Markov Chains For Testing Redundant Software

    NASA Technical Reports Server (NTRS)

    White, Allan L.; Sjogren, Jon A.

    1990-01-01

    Preliminary design developed for validation experiment that addresses problems unique to assuring extremely high quality of multiple-version programs in process-control software. Approach takes into account inertia of controlled system in sense it takes more than one failure of control program to cause controlled system to fail. Verification procedure consists of two steps: experimentation (numerical simulation) and computation, with Markov model for each step.
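
The role of system inertia, that it takes more than one consecutive control-program failure to bring the controlled system down, can be captured by a small Markov chain over the current run of failures (an illustrative model, not the experiment's actual chain):

```python
def system_failure_prob(p_fail, k, n_steps):
    """Probability the controlled system fails within n_steps control
    frames, where failure requires k consecutive control-program
    failures (the plant's inertia). State i = current run of i
    consecutive failed frames; state k is absorbing."""
    probs = [1.0] + [0.0] * k
    for _ in range(n_steps):
        nxt = [0.0] * (k + 1)
        for i in range(k):
            nxt[0] += probs[i] * (1.0 - p_fail)   # frame ok: run resets
            nxt[i + 1] += probs[i] * p_fail       # frame bad: run grows
        nxt[k] += probs[k]                        # already failed: stays failed
        probs = nxt
    return probs[k]

# With inertia k=2, two isolated glitches are survivable; only
# back-to-back failures (prob 0.1 * 0.1) bring the system down.
p = system_failure_prob(0.1, 2, 2)
```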

  17. Design and implementation of software for automated quality control and data analysis for a complex LC/MS/MS assay for urine opiates and metabolites.

    PubMed

    Dickerson, Jane A; Schmeling, Michael; Hoofnagle, Andrew N; Hoffman, Noah G

    2013-01-16

    Mass spectrometry provides a powerful platform for performing quantitative, multiplexed assays in the clinical laboratory, but at the cost of increased complexity of analysis and quality assurance calculations compared to other methodologies. Here we describe the design and implementation of a software application that performs quality control calculations for a complex, multiplexed, mass spectrometric analysis of opioids and opioid metabolites. The development and implementation of this application improved our data analysis and quality assurance processes in several ways. First, use of the software significantly improved the procedural consistency for performing quality control calculations. Second, it reduced the amount of time technologists spent preparing and reviewing the data, saving on average over four hours per run, and in some cases improving turnaround time by a day. Third, it provides a mechanism for coupling procedural and software changes with the results of each analysis. We describe several key details of the implementation including the use of version control software and automated unit tests. These generally useful software engineering principles should be considered for any software development project in the clinical lab. Copyright © 2012 Elsevier B.V. All rights reserved.
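
The kind of per-analyte QC rule such software automates, together with the automated unit tests the authors recommend, might look like this (illustrative Python; the function name, ratio rule, and tolerance are assumptions, not the paper's code):

```python
def ion_ratio_ok(quant_area, qual_area, expected_ratio, rel_tol=0.2):
    """Flag results whose qualifier/quantifier ion ratio strays from the
    expected ratio by more than a relative tolerance (rule invented
    for illustration)."""
    if quant_area <= 0:
        return False
    ratio = qual_area / quant_area
    return abs(ratio - expected_ratio) <= rel_tol * expected_ratio

# The automated unit tests the authors advocate pin such rules down:
assert ion_ratio_ok(1000, 510, 0.5)        # 0.51 is within 20% of 0.5
assert not ion_ratio_ok(1000, 700, 0.5)    # 0.70 deviates by 40%
```

Coupling such tests to version control, as the paper describes, ties every procedural or software change to the results of each analysis.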

  18. Development strategies for the satellite flight software on-board Meteosat Third Generation

    NASA Astrophysics Data System (ADS)

    Tipaldi, Massimo; Legendre, Cedric; Koopmann, Olliver; Ferraguto, Massimo; Wenker, Ralf; D'Angelo, Gianni

    2018-04-01

    Nowadays, satellites are becoming increasingly software dependent. Satellite Flight Software (FSW), that is to say, the application software running on the satellite main On-Board Computer (OBC), plays a relevant role in implementing complex space mission requirements. In this paper, we examine relevant technical approaches and programmatic strategies adopted for the development of the Meteosat Third Generation Satellite (MTG) FSW. To begin with, we present its layered model-based architecture, and the means for ensuring a robust and reliable interaction among the FSW components. Then, we focus on the selection of an effective software development life cycle model. In particular, by combining plan-driven and agile approaches, we can fulfill the need of having preliminary SW versions. They can be used for the elicitation of complex system-level requirements as well as for the initial satellite integration and testing activities. Another important aspect can be identified in the testing activities. Indeed, very demanding quality requirements have to be fulfilled in satellite SW applications. This manuscript proposes a test automation framework, which uses an XML-based test procedure language independent of the underlying test environment. Finally, a short overview of the MTG FSW sizing and timing budgets concludes the paper.
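
An environment-independent, XML-based test procedure language of the kind proposed can be sketched as a tiny interpreter that dispatches each step to a pluggable backend (the schema, step names, and telecommand identifiers here are invented for illustration):

```python
import xml.etree.ElementTree as ET

# A toy procedure in an invented, environment-independent schema.
PROC = """
<procedure name="obc_boot_check">
  <step cmd="send" arg="TC_REBOOT"/>
  <step cmd="expect" arg="TM_BOOT_OK"/>
</procedure>
"""

def run(xml_text, backend):
    """Dispatch each <step> to a backend, so one procedure can drive a
    software simulator or the real test bench unchanged."""
    root = ET.fromstring(xml_text)
    return [backend(s.get("cmd"), s.get("arg")) for s in root.findall("step")]

log = run(PROC, lambda cmd, arg: f"{cmd}:{arg}")   # stub backend for the sketch
```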

  19. Representation of the Physiological Factors Contributing to Postflight Changes in Functional Performance Using Motion Analysis Software

    NASA Technical Reports Server (NTRS)

    Parks, Kelsey

    2010-01-01

    Astronauts experience changes in multiple physiological systems due to exposure to the microgravity conditions of space flight. To understand how changes in physiological function influence functional performance, a testing procedure has been developed that evaluates both astronaut postflight functional performance and related physiological changes. Astronauts complete seven functional and physiological tests. The objective of this project is to use motion tracking and digitizing software to visually display the postflight decrement in the functional performance of the astronauts. The motion analysis software will be used to digitize astronaut data videos into stick figure videos to represent the astronauts as they perform the Functional Tasks Tests. This project will benefit NASA by allowing NASA scientists to present data of their neurological studies without revealing the identities of the astronauts.

  20. Multi-crop area estimation and mapping on a microprocessor/mainframe network

    NASA Technical Reports Server (NTRS)

    Sheffner, E.

    1985-01-01

    The data processing system is outlined for a 1985 test aimed at determining the performance characteristics of area estimation and mapping procedures connected with the California Cooperative Remote Sensing Project. The project is a joint effort of the USDA Statistical Reporting Service-Remote Sensing Branch, the California Department of Water Resources, NASA-Ames Research Center, and the University of California Remote Sensing Research Program. One objective of the program was to study performance when data processing is done on a microprocessor/mainframe network under operational conditions. The 1985 test covered the hardware, software, and network specifications and the integration of these three components. Plans for the year - including planned completion of PEDITOR software, testing of software on MIDAS, and accomplishment of data processing on the MIDAS-VAX-CRAY network - are discussed briefly.

  1. Application of newly developed Fluoro-QC software for image quality evaluation in cardiac X-ray systems.

    PubMed

    Oliveira, M; Lopez, G; Geambastiani, P; Ubeda, C

    2018-05-01

A quality assurance (QA) program is a valuable tool for the continuous production of optimal-quality images. The aim of this paper is to assess newly developed automatic computer software for image quality (IQ) evaluation in fluoroscopy X-ray systems. Test-object images were acquired using one fluoroscopy system, a Siemens Axiom Artis (Siemens AG, Medical Solutions, Erlangen, Germany). The software was developed as an ImageJ plugin. Two image quality parameters were assessed: high-contrast spatial resolution (HCSR) and signal-to-noise ratio (SNR). The times required for the manual and automatic image quality assessment procedures were compared. The paired t-test was used to assess the data; p values of less than 0.05 were considered significant. The Fluoro-QC software generated IQ evaluation results faster (mean = 0.31 ± 0.08 min) than the manual procedure (mean = 4.68 ± 0.09 min); the mean difference between techniques was 4.36 min. Discrepancies were identified in the region-of-interest (ROI) areas drawn manually, with evidence of user dependence. The new software presented the results of the two tests (HCSR = 3.06, SNR = 5.17) and also collected information from the DICOM header. Significant differences were not identified between manual and automatic measures of SNR (p value = 0.22) or HCSR (p value = 0.46). The Fluoro-QC software is a feasible, fast, and free-to-use method for evaluating image quality parameters on fluoroscopy systems. Copyright © 2017 The College of Radiographers. Published by Elsevier Ltd. All rights reserved.
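
The SNR measurement on a nominally uniform ROI can be sketched as follows (one common definition, mean signal over its standard deviation; the plugin's exact formula and ROI placement may differ):

```python
from statistics import mean, stdev

def roi_snr(pixels):
    """SNR of a nominally uniform ROI: mean signal over sample
    standard deviation (illustrative definition)."""
    sigma = stdev(pixels)
    return mean(pixels) / sigma if sigma else float("inf")

# Fixed, software-placed ROIs remove the user dependence observed
# with manually drawn ROIs.
roi = [100, 102, 98, 101, 99, 100, 103, 97]   # toy pixel values
snr = roi_snr(roi)                            # mean 100, stdev 2 -> 50.0
```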

  2. Software analysis handbook: Software complexity analysis and software reliability estimation and prediction

    NASA Technical Reports Server (NTRS)

    Lee, Alice T.; Gunn, Todd; Pham, Tuan; Ricaldi, Ron

    1994-01-01

    This handbook documents the three software analysis processes the Space Station Software Analysis team uses to assess space station software, including their backgrounds, theories, tools, and analysis procedures. Potential applications of these analysis results are also presented. The first section describes how software complexity analysis provides quantitative information on code, such as code structure and risk areas, throughout the software life cycle. Software complexity analysis allows an analyst to understand the software structure, identify critical software components, assess risk areas within a software system, identify testing deficiencies, and recommend program improvements. Performing this type of analysis during the early design phases of software development can positively affect the process, and may prevent later, much larger, difficulties. The second section describes how software reliability estimation and prediction analysis, or software reliability, provides a quantitative means to measure the probability of failure-free operation of a computer program, and describes the two tools used by JSC to determine failure rates and design tradeoffs between reliability, costs, performance, and schedule.

  3. Chemorheology of highly filled thermosets: Effects of fillers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Halley, P.J.

    1996-12-31

Highly filled thermosets are utilised in the manufacture of many high value added products in the aerospace, communication, computer and automobile industries. A fundamental understanding of the processing of these materials has, however, been hindered by the inherent complexities of the flow and cure properties of these composite thermoset materials. A chemorheological (gel point and chemoviscosity) testing procedure is described here that uses dynamic multiwave tests on a modified Rheometrics RDSII system, using a filled epoxy moulding compound (EMC). Data from this testing procedure has been combined with physical property and kinetic data to produce realistic simulation results using flow simulation software (TSET; MOLDFLOW Pty Ltd). This chemorheological testing procedure has now been successfully implemented commercially by MOLDFLOW Pty Ltd.

  4. BESTEST-EX | Buildings | NREL

    Science.gov Websites

    method for testing home energy audit software and associated calibration methods. BESTEST-EX is one of Energy Analysis Model Calibration Methods. When completed, the ANSI/RESNET SMOT will specify test procedures for evaluating calibration methods used in conjunction with predicting building energy use and

  5. Bayesian networks for satellite payload testing

    NASA Astrophysics Data System (ADS)

    Przytula, Krzysztof W.; Hagen, Frank; Yung, Kar

    1999-11-01

Satellite payloads are fast increasing in complexity, resulting in commensurate growth in the cost of manufacturing and operation. A need exists for a software tool that would assist engineers in the production and operation of satellite systems. We have designed and implemented a software tool that performs part of this task. The tool aids a test engineer in debugging satellite payloads during system testing. At this stage of satellite integration and testing, both the tested payload and the testing equipment represent complicated systems consisting of a very large number of components and devices. When an error is detected during execution of a test procedure, the tool presents to the engineer a ranked list of potential sources of the error and a list of recommended further tests. The engineer then decides on this basis whether to perform some of the recommended additional tests or to replace the suspect component. The tool has been installed in a payload testing facility. The tool is based on Bayesian networks, a graphical method of representing uncertainty in terms of probabilistic influences. The Bayesian network was configured using detailed flow diagrams of testing procedures and block diagrams of the payload and testing hardware. The conditional and prior probability values were initially obtained from experts and refined in later stages of design. The Bayesian network provided a very informative model of the payload and testing equipment and inspired many new ideas regarding future test procedures and testing equipment configurations. The tool is the first step in developing a family of tools for various phases of satellite integration and operation.
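The payload tool and its Bayesian network are specific to the testing facility; as a toy sketch of the ranking step only, with hypothetical component names, priors, and likelihoods (a real network encodes many interdependent test and hardware variables rather than a single-fault table):

```python
# Hypothetical single-fault diagnosis: rank candidate components by
# posterior probability given one observed test failure (Bayes' rule).
priors = {"mixer": 0.05, "amplifier": 0.10, "test_cable": 0.20}
# P(observed failure | this component is faulty) -- illustrative values only
likelihood = {"mixer": 0.9, "amplifier": 0.6, "test_cable": 0.95}

unnormalized = {c: priors[c] * likelihood[c] for c in priors}
total = sum(unnormalized.values())
posterior = {c: p / total for c, p in unnormalized.items()}

ranked = sorted(posterior, key=posterior.get, reverse=True)
print(ranked)  # most probable error source first
```

With these made-up numbers the inexpensive, failure-prone test cable outranks the payload hardware, which mirrors the abstract's point that the testing equipment itself is a candidate error source.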

  6. Data collection procedures for the Software Engineering Laboratory (SEL) database

    NASA Technical Reports Server (NTRS)

    Heller, Gerard; Valett, Jon; Wild, Mary

    1992-01-01

    This document is a guidebook to collecting software engineering data on software development and maintenance efforts, as practiced in the Software Engineering Laboratory (SEL). It supersedes the document entitled Data Collection Procedures for the Rehosted SEL Database, number SEL-87-008 in the SEL series, which was published in October 1987. It presents procedures to be followed on software development and maintenance projects in the Flight Dynamics Division (FDD) of Goddard Space Flight Center (GSFC) for collecting data in support of SEL software engineering research activities. These procedures include detailed instructions for the completion and submission of SEL data collection forms.

  7. Test Driven Development: Lessons from a Simple Scientific Model

    NASA Astrophysics Data System (ADS)

    Clune, T. L.; Kuo, K.

    2010-12-01

    In the commercial software industry, unit testing frameworks have emerged as a disruptive technology that has permanently altered the process by which software is developed. Unit testing frameworks significantly reduce traditional barriers, both practical and psychological, to creating and executing tests that verify software implementations. A new development paradigm, known as test driven development (TDD), has emerged from unit testing practices, in which low-level tests (i.e. unit tests) are created by developers prior to implementing new pieces of code. Although somewhat counter-intuitive, this approach actually improves developer productivity. In addition to reducing the average time for detecting software defects (bugs), the requirement to provide procedure interfaces that enable testing frequently leads to superior design decisions. Although TDD is widely accepted in many software domains, its applicability to scientific modeling still warrants reasonable skepticism. While the technique is clearly relevant for infrastructure layers of scientific models such as the Earth System Modeling Framework (ESMF), numerical and scientific components pose a number of challenges to TDD that are not often encountered in commercial software. Nonetheless, our experience leads us to believe that the technique has great potential not only for developer productivity, but also as a tool for understanding and documenting the basic scientific assumptions upon which our models are implemented. We will provide a brief introduction to test driven development and then discuss our experience in using TDD to implement a relatively simple numerical model that simulates the growth of snowflakes. Many of the lessons learned are directly applicable to larger scientific models.
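The snowflake model itself is not given in the abstract; as a minimal sketch of the TDD workflow with a hypothetical growth-step function, the tests are written first and then the simplest implementation that passes them:

```python
import math
import unittest

# Step 1 (written first, before any implementation exists):
# the tests pin down the behavior we want from the growth step.
class TestGrowthStep(unittest.TestCase):
    def test_zero_rate_leaves_mass_unchanged(self):
        self.assertAlmostEqual(mass_after_growth(1.0, 0.0, 5.0), 1.0)

    def test_positive_rate_increases_mass(self):
        self.assertGreater(mass_after_growth(1.0, 0.1, 1.0), 1.0)

# Step 2: the simplest implementation that makes the tests pass.
def mass_after_growth(mass, rate, dt):
    # Hypothetical model step: exponential mass growth over one timestep
    return mass * math.exp(rate * dt)
```

Running `python -m unittest` against this module executes both tests; in TDD practice each new behavior starts with a failing test, which also documents the assumption being encoded.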

  8. APEX 3: a multi-purpose test platform for auditory psychophysical experiments.

    PubMed

    Francart, Tom; van Wieringen, Astrid; Wouters, Jan

    2008-07-30

    APEX 3 is a software test platform for auditory behavioral experiments. It provides a generic means of setting up experiments without any programming. The supported output devices include sound cards and cochlear implants from Cochlear Corporation and Advanced Bionics Corporation. Many psychophysical procedures are provided and there is an interface to add custom procedures. Plug-in interfaces are provided for data filters and external controllers. APEX 3 is supported under Linux and Windows and is available free of charge.

  9. 75 FR 64737 - Automated Commercial Environment (ACE): Announcement of a National Customs Automation Program...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-20

    ... Commissioner of CBP with authority to conduct limited test programs or procedures designed to evaluate planned.... Specifically, CBP is looking for test participants to include: 2-3 Ocean Carriers. At least one must be filing... their software ready to test with CBP once CBP begins the certification process. CBP will post the...

  10. Optimal Sample Size Determinations for the Heteroscedastic Two One-Sided Tests of Mean Equivalence: Design Schemes and Software Implementations

    ERIC Educational Resources Information Center

    Jan, Show-Li; Shieh, Gwowen

    2017-01-01

    Equivalence assessment is becoming an increasingly important topic in many application areas including behavioral and social sciences research. Although there exist more powerful tests, the two one-sided tests (TOST) procedure is a technically transparent and widely accepted method for establishing statistical equivalence. Alternatively, a direct…
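The paper's procedure uses exact heteroscedastic t distributions; as a simplified sketch of the TOST logic under a large-sample normal approximation, with an illustrative equivalence margin and made-up data:

```python
import math
from statistics import mean, stdev

def tost_equivalent(sample1, sample2, margin, z_crit=1.645):
    # Two one-sided tests (TOST) sketch: large-sample normal approximation
    # with a Welch-style (heteroscedastic) standard error; alpha = 0.05.
    se = math.sqrt(stdev(sample1) ** 2 / len(sample1)
                   + stdev(sample2) ** 2 / len(sample2))
    diff = mean(sample1) - mean(sample2)
    z_lower = (diff + margin) / se  # tests H0: diff <= -margin
    z_upper = (diff - margin) / se  # tests H0: diff >= +margin
    # Equivalence is declared only if BOTH one-sided tests reject.
    return z_lower > z_crit and z_upper < -z_crit

a = [10.0, 10.1, 9.9, 10.2, 9.8, 10.05, 9.95, 10.1, 9.9, 10.0]
b = [10.05, 9.95, 10.1, 9.9, 10.0, 10.2, 9.8, 10.0, 10.1, 9.9]
print(tost_equivalent(a, b, margin=0.5))  # → True
```

The sample-size question the paper addresses is how large the two groups must be for this pair of one-sided tests to reject with the desired power when the means truly are equivalent.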

  11. SEDS1 mission software verification using a signal simulator

    NASA Technical Reports Server (NTRS)

    Pierson, William E.

    1992-01-01

The first flight of the Small Expendable Deployer System (SEDS1) is scheduled to fly as the secondary payload of a Delta 2 in March 1993. The objective of the SEDS1 mission is to collect data to validate the concept of tethered satellite systems and to verify computer simulations used to predict their behavior. SEDS1 will deploy a 50 lb. instrumented satellite as an end mass using a 20 km tether. Langley Research Center is providing the end mass instrumentation, while the Marshall Space Flight Center is designing and building the deployer. The objective of the experiment is to test the SEDS design concept by demonstrating that the system will satisfactorily deploy the full 20 km tether without stopping prematurely, come to a smooth stop on the application of a brake, and cut the tether at the proper time after it swings to the local vertical. Also, SEDS1 will collect data which will be used to test the accuracy of tether dynamics models used to simulate this type of deployment. The experiment will last about 1.5 hours and complete approximately 1.5 orbits. Radar tracking of the Delta II and end mass is planned. In addition, the SEDS1 on-board computer will continuously record, store, and transmit mission data over the Delta II S-band telemetry system. The Data System will count tether windings as the tether unwinds, log the times of each turn and other mission events, monitor tether tension, and record the temperature of system components. A summary of the measurements taken during the SEDS1 mission is shown. The Data System will also control the tether brake and cutter mechanisms. Preliminary versions of two major sections of the flight software, the data telemetry modules and the data collection modules, were developed and tested under the 1990 NASA/ASEE Summer Faculty Fellowship Program. To facilitate the debugging of these software modules, a prototype SEDS Data System was programmed to simulate turn count signals.
During the 1991 summer program, the concept of simulating signals produced by the SEDS electronics systems and circuits was expanded and more precisely defined. During the 1992 summer program, the SEDS signal simulator was programmed to test the requirements of the SEDS Mission software, and this simulator will be used in the formal verification of the SEDS Mission Software. A formal test procedures specification was written that incorporates the use of the signal simulator to test the SEDS Mission Software, as well as procedures for testing the other major component of the SEDS software, the Monitor Software.

  12. NOAO testing procedures for large optics

    NASA Astrophysics Data System (ADS)

    Stepp, Larry M.; Poczulp, Gary A.; Pearson, Earl T.; Roddier, Nicolas A.

    1992-03-01

    This paper describes optical testing procedures used at the National Optical Astronomy Observatories (NOAO) for testing large optics. It begins with a discussion of the philosophy behind the testing approach and then describes a number of different testing methods used at NOAO, including the wire test, full-aperture and sub-aperture Hartmann testing, and scatterplate interferometry. Specific innovations that enhance the testing capabilities are mentioned. NOAO data reduction software is described. Examples are given of specific output formats that are useful to the optician, using illustrations taken from recent testing of a 3.5- meter, f/1.75 borosilicate honeycomb mirror. Finally, we discuss some of the optical testing challenges posed by the large optics for the Gemini 8-meter Telescopes Project.

  13. State-of-the-art software for window energy-efficiency rating and labeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arasteh, D.; Finlayson, E.; Huang, J.

    1998-07-01

Measuring the thermal performance of windows in typical residential buildings is an expensive proposition. Not only is laboratory testing expensive, but each window manufacturer typically offers hundreds of individual products, each of which has different thermal performance properties. With over a thousand window manufacturers nationally, a testing-based rating system would be prohibitively expensive to the industry and to consumers. Beginning in the early 1990s, simulation software began to be used as part of a national program for rating window U-values. The rating program has since been expanded to include Solar Heat Gain Coefficients and is now being extended to annual energy performance. This paper describes four software packages available to the public from Lawrence Berkeley National Laboratory (LBNL). These software packages are used to evaluate window thermal performance: RESFEN (for evaluating annual energy costs), WINDOW (for calculating a product's thermal performance properties), THERM (a preprocessor for WINDOW that determines two-dimensional heat-transfer effects), and Optics (a preprocessor for WINDOW's glass database). Software not only offers a less expensive means than testing to evaluate window performance, it can also be used during the design process to help manufacturers produce windows that will meet target specifications. In addition, software can show small improvements in window performance that might not be detected in actual testing because of large uncertainties in test procedures.

  14. Small-scale fixed wing airplane software verification flight test

    NASA Astrophysics Data System (ADS)

    Miller, Natasha R.

    The increased demand for micro Unmanned Air Vehicles (UAV) driven by military requirements, commercial use, and academia is creating a need for the ability to quickly and accurately conduct low Reynolds Number aircraft design. There exist several open source software programs that are free or inexpensive that can be used for large scale aircraft design, but few software programs target the realm of low Reynolds Number flight. XFLR5 is an open source, free to download, software program that attempts to take into consideration viscous effects that occur at low Reynolds Number in airfoil design, 3D wing design, and 3D airplane design. An off the shelf, remote control airplane was used as a test bed to model in XFLR5 and then compared to flight test collected data. Flight test focused on the stability modes of the 3D plane, specifically the phugoid mode. Design and execution of the flight tests were accomplished for the RC airplane using methodology from full scale military airplane test procedures. Results from flight test were not conclusive in determining the accuracy of the XFLR5 software program. There were several sources of uncertainty that did not allow for a full analysis of the flight test results. An off the shelf drone autopilot was used as a data collection device for flight testing. The precision and accuracy of the autopilot is unknown. Potential future work should investigate flight test methods for small scale UAV flight.

  15. Payload and Components Real-Time Automated Test System (PACRATS), Data Acquisition of Leak Rate and Pressure Data Test Procedure

    NASA Technical Reports Server (NTRS)

    Rinehart, Maegan L.

    2011-01-01

The purpose of this activity is to provide the Mechanical Components Test Facility (MCTF) with the capability to obtain electronic leak test and proof pressure data. Payload and Components Real-Time Automated Test System (PACRATS) data acquisition software will be utilized to display real-time data. It will record leak rates and pressure/vacuum level(s) simultaneously. This added functionality will provide electronic leak test and pressure data at specified sampling frequencies. Electronically stored data will provide ES61 with increased data security, analysis, and accuracy. The tasks performed in this procedure verify PACRATS only and are not intended to provide verification for MCTF equipment.

  16. Markov chains for testing redundant software

    NASA Technical Reports Server (NTRS)

    White, Allan L.; Sjogren, Jon A.

    1988-01-01

    A preliminary design for a validation experiment has been developed that addresses several problems unique to assuring the extremely high quality of multiple-version programs in process-control software. The procedure uses Markov chains to model the error states of the multiple version programs. The programs are observed during simulated process-control testing, and estimates are obtained for the transition probabilities between the states of the Markov chain. The experimental Markov chain model is then expanded into a reliability model that takes into account the inertia of the system being controlled. The reliability of the multiple version software is computed from this reliability model at a given confidence level using confidence intervals obtained for the transition probabilities during the experiment. An example demonstrating the method is provided.
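The experiment's actual state space and reliability model are more elaborate; as a minimal sketch of the first step, estimating the transition probabilities of an error-state Markov chain from an observed state sequence (states and data here are hypothetical):

```python
from collections import Counter, defaultdict

def estimate_transitions(states):
    # Count observed state-to-state transitions, then normalize each row
    # to obtain estimated transition probabilities.
    counts = defaultdict(Counter)
    for current, nxt in zip(states, states[1:]):
        counts[current][nxt] += 1
    return {s: {t: c / sum(row.values()) for t, c in row.items()}
            for s, row in counts.items()}

# Hypothetical states: 0 = all versions agree, 1 = one version in error
observed = [0, 0, 1, 0, 0, 0, 1, 1, 0, 0]
probs = estimate_transitions(observed)
print(probs[1][0])  # estimated P(return to agreement | one version in error)
```

In the experiment these point estimates are replaced by confidence intervals, which then propagate into the reliability computed from the chain.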

  17. Evaluation plan for space station network interface units

    NASA Technical Reports Server (NTRS)

    Weaver, Alfred C.

    1990-01-01

    Outlined here is a procedure for evaluating network interface units (NIUs) produced for the Space Station program. The procedures should be equally applicable to the data management system (DMS) testbed NIUs produced by Honeywell and IBM. The evaluation procedures are divided into four areas. Performance measurement tools are hardware and software that must be developed in order to evaluate NIU performance. Performance tests are a series of tests, each of which documents some specific characteristic of NIU and/or network performance. In general, these performance tests quantify the speed, capacity, latency, and reliability of message transmission under a wide variety of conditions. Functionality tests are a series of tests and code inspections that demonstrate the functionality of the particular subset of ISO protocols which have been implemented in a given NIU. Conformance tests are a series of tests which would expose whether or not selected features within the ISO protocols are present and interoperable.

  18. Computer-Aided Teaching Using MATLAB/Simulink for Enhancing an IM Course With Laboratory Tests

    ERIC Educational Resources Information Center

    Bentounsi, A.; Djeghloud, H.; Benalla, H.; Birem, T.; Amiar, H.

    2011-01-01

    This paper describes an automatic procedure using MATLAB software to plot the circle diagram for two induction motors (IMs), with wound and squirrel-cage rotors, from no-load and blocked-rotor tests. The advantage of this approach is that it avoids the need for a direct load test in predetermining the IM characteristics under reduced power.…

  19. Exact and Monte carlo resampling procedures for the Wilcoxon-Mann-Whitney and Kruskal-Wallis tests.

    PubMed

    Berry, K J; Mielke, P W

    2000-12-01

    Exact and Monte Carlo resampling FORTRAN programs are described for the Wilcoxon-Mann-Whitney rank sum test and the Kruskal-Wallis one-way analysis of variance for ranks test. The program algorithms compensate for tied values and do not depend on asymptotic approximations for probability values, unlike most algorithms contained in PC-based statistical software packages.
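The published programs are in FORTRAN; as a rough Python sketch of the Monte Carlo resampling idea for the two-sample rank-sum statistic, including midranks to compensate for tied values (the statistic and p-value convention are simplified relative to the paper):

```python
import random

def midranks(values):
    # Assign 1-based ranks; tied values share the average (mid) rank.
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of ranks i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def mc_rank_sum_pvalue(x, y, trials=10000, seed=0):
    # Two-sided Monte Carlo permutation p-value for the rank-sum statistic.
    rng = random.Random(seed)
    pooled = x + y
    ranks = midranks(pooled)
    n = len(x)
    observed = sum(ranks[:n])
    expected = n * (len(pooled) + 1) / 2
    idx = list(range(len(pooled)))
    extreme = 0
    for _ in range(trials):
        rng.shuffle(idx)
        stat = sum(ranks[i] for i in idx[:n])
        if abs(stat - expected) >= abs(observed - expected):
            extreme += 1
    return (extreme + 1) / (trials + 1)  # add-one correction
```

For clearly separated samples such as `mc_rank_sum_pvalue([1, 2, 3, 4, 5], [10, 11, 12, 13, 14])` the p-value is small (the exact two-sided value is 2/252 ≈ 0.008); an exact program enumerates all partitions instead of sampling them.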

  20. Ultraviolet Spectrometer and Polarimeter (UVSP) software development and hardware tests for the solar maximum mission

    NASA Technical Reports Server (NTRS)

    1984-01-01

    An analysis of UVSP wavelength drive hardware, problems, and recovery procedures; radiative power loss from solar plasmas; and correlations between observed UV brightness and inferred photospheric currents are given.

  1. Stand Alone Pressure Measurement Device (SAPMD) for the space shuttle Orbiter, part 2

    NASA Technical Reports Server (NTRS)

    Tomlinson, Bill

    1989-01-01

The Stand Alone Pressure Measurement Device (SAPMD) specifications are examined. The HP.SAPMD GSE software is listed; the HP/SGA readme program is presented; and the SAPMD acceptance test procedure is described.

  2. Defensive Swarm: An Agent Based Modeling Analysis

    DTIC Science & Technology

    2017-12-01

INITIAL ALGORITHM (SINGLE-RUN) TESTING ... Patrol Algorithm—Passive... scalability are therefore quite important to modeling in this highly variable domain. One can force the software to run the gamut of options to see... changes in operating constructs or procedures. Additionally, modelers can run thousands of iterations testing the model under different circumstances

  3. Information Flow Analysis of Level 4 Payload Processing Operations

    NASA Technical Reports Server (NTRS)

    Danz, Mary E.

    1991-01-01

    The Level 4 Mission Sequence Test (MST) was studied to develop strategies and recommendations to facilitate information flow. Recommendations developed as a result of this study include revised format of the Test and Assembly Procedure (TAP) document and a conceptualized software based system to assist in the management of information flow during the MST.

  4. Increasing productivity through Total Reuse Management (TRM)

    NASA Technical Reports Server (NTRS)

    Schuler, M. P.

    1991-01-01

    Total Reuse Management (TRM) is a new concept currently being promoted by the NASA Langley Software Engineering and Ada Lab (SEAL). It uses concepts similar to those promoted in Total Quality Management (TQM). Both technical and management personnel are continually encouraged to think in terms of reuse. Reuse is not something that is aimed for after a product is completed, but rather it is built into the product from inception through development. Lowering software development costs, reducing risk, and increasing code reliability are the more prominent goals of TRM. Procedures and methods used to adopt and apply TRM are described. Reuse is frequently thought of as only being applicable to code. However, reuse can apply to all products and all phases of the software life cycle. These products include management and quality assurance plans, designs, and testing procedures. Specific examples of successfully reused products are given and future goals are discussed.

  5. Automation is an Effective Way to Improve Quality of Verification (Calibration) of Measuring Instruments

    NASA Astrophysics Data System (ADS)

    Golobokov, M.; Danilevich, S.

    2018-04-01

In order to assess calibration reliability and automate such assessment, procedures for data collection and a simulation study of a thermal imager calibration procedure have been developed. The existing calibration techniques do not always provide high reliability. A new method for analyzing the existing calibration techniques and developing new, efficient ones has been suggested and tested. A type of software has been studied that automatically generates instrument calibration reports, monitors their proper configuration, processes measurement results, and assesses instrument validity. The use of such software reduces the man-hours spent finalizing calibration data by a factor of 2 to 5 and eliminates a whole set of typical operator errors.

  6. Slice-thickness evaluation in CT and MRI: an alternative computerised procedure.

    PubMed

    Acri, G; Tripepi, M G; Causa, F; Testagrossa, B; Novario, R; Vermiglio, G

    2012-04-01

    The efficient use of computed tomography (CT) and magnetic resonance imaging (MRI) equipment necessitates establishing adequate quality-control (QC) procedures. In particular, the accuracy of slice thickness (ST) requires scan exploration of phantoms containing test objects (plane, cone or spiral). To simplify such procedures, a novel phantom and a computerised LabView-based procedure have been devised, enabling determination of full width at half maximum (FWHM) in real time. The phantom consists of a polymethyl methacrylate (PMMA) box, diagonally crossed by a PMMA septum dividing the box into two sections. The phantom images were acquired and processed using the LabView-based procedure. The LabView (LV) results were compared with those obtained by processing the same phantom images with commercial software, and the Fisher exact test (F test) was conducted on the resulting data sets to validate the proposed methodology. In all cases, there was no statistically significant variation between the two different procedures and the LV procedure, which can therefore be proposed as a valuable alternative to other commonly used procedures and be reliably used on any CT and MRI scanner.
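The LabView procedure itself is not reproduced here; as a minimal sketch of a FWHM measurement on a 1-D intensity profile (hypothetical data, with linear interpolation at the two half-maximum crossings):

```python
def fwhm(profile, pixel_spacing=1.0):
    # Full width at half maximum of a 1-D intensity profile, using linear
    # interpolation at the left and right half-maximum crossings.
    peak = max(profile)
    half = peak / 2.0
    left = right = None
    for i in range(1, len(profile)):
        if profile[i - 1] < half <= profile[i]:
            left = (i - 1) + (half - profile[i - 1]) / (profile[i] - profile[i - 1])
            break
    for i in range(len(profile) - 1, 0, -1):
        if profile[i] < half <= profile[i - 1]:
            right = (i - 1) + (profile[i - 1] - half) / (profile[i - 1] - profile[i])
            break
    if left is None or right is None:
        raise ValueError("profile does not cross its half maximum twice")
    return (right - left) * pixel_spacing

print(fwhm([0, 1, 2, 3, 4, 3, 2, 1, 0], pixel_spacing=0.5))  # → 2.0
```

In the ST measurement the profile would be extracted perpendicular to the imaged septum and `pixel_spacing` taken from the scanner's reconstruction geometry.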

  7. Towards Test Driven Development for Computational Science with pFUnit

    NASA Technical Reports Server (NTRS)

    Rilee, Michael L.; Clune, Thomas L.

    2014-01-01

Developers working in Computational Science & Engineering (CSE)/High Performance Computing (HPC) must contend with constant change due to advances in computing technology and science. Test Driven Development (TDD) is a methodology that mitigates software development risks due to change at the cost of adding comprehensive and continuous testing to the development process. Testing frameworks tailored for CSE/HPC, like pFUnit, can lower the barriers to such testing, yet CSE software faces unique constraints foreign to the broader software engineering community. Effective testing of numerical software requires a comprehensive suite of oracles, i.e., use cases with known answers, as well as robust estimates for the unavoidable numerical errors associated with implementation with finite-precision arithmetic. At first glance these concerns often seem exceedingly challenging or even insurmountable for real-world scientific applications. However, we argue that this common perception is incorrect and driven by (1) a conflation between model validation and software verification and (2) the general tendency in the scientific community to develop relatively coarse-grained, large procedures that compound numerous algorithmic steps. We believe TDD can be applied routinely to numerical software if developers pursue fine-grained implementations that permit testing, neatly side-stepping concerns about needing nontrivial oracles as well as the accumulation of errors. We present an example of a successful, complex legacy CSE/HPC code whose development process shares some aspects with TDD, which we contrast with current and potential capabilities. A mix of our proposed methodology and framework support should enable everyday use of TDD by CSE-expert developers.
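As a small illustration of the fine-grained testing advocated above, a single numerical step can be verified against an oracle with a known answer plus a tolerance derived from the method's theoretical error bound (the integrand and bound here are illustrative, not from pFUnit):

```python
import math

def trapezoid(f, a, b, n):
    # Composite trapezoid rule: a small, separately testable numerical step.
    h = (b - a) / n
    return h * (f(a) / 2 + sum(f(a + i * h) for i in range(1, n)) + f(b) / 2)

# Oracle: the integral of sin on [0, pi] is exactly 2. The trapezoid rule's
# theoretical error bound, (b-a)^3 * max|f''| / (12 n^2), is about 2.6e-6
# here, so a 1e-5 tolerance is justified rather than guessed.
approx = trapezoid(math.sin, 0.0, math.pi, 1000)
assert abs(approx - 2.0) < 1e-5
```

Because the step is fine-grained, the oracle is trivial and the tolerance follows from theory, which is exactly the situation coarse-grained, multi-step procedures make impossible.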

  8. NASA Software Documentation Standard

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The NASA Software Documentation Standard (hereinafter referred to as "Standard") is designed to support the documentation of all software developed for NASA; its goal is to provide a framework and model for recording the essential information needed throughout the development life cycle and maintenance of a software system. The NASA Software Documentation Standard can be applied to the documentation of all NASA software. The Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. The basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.

  9. A guide to onboard checkout. Volume 2: Environmental control and life support

    NASA Technical Reports Server (NTRS)

    1971-01-01

    A description of space station equipment for environmental control and life support is presented. Reliability and maintenance procedures are reviewed. Failure analysis and checkout tests are discussed. The strategy for software checkout is noted.

  10. Clean Air Act Second Prospective Report Study Science Advisory Board Review Teleconference, August 11, 2010

    EPA Pesticide Factsheets

    This technical memorandum provides a description of the Adjustment to the Primary Particulate Matter Emissions Estimates and the Modeled Attainment Test Software Analysis (MATS) Procedure for the 812 Second Prospective Analysis

  11. [Quality assurance of a virtual simulation software: application to IMAgo and SIMAgo (ISOgray)].

    PubMed

    Isambert, A; Beaudré, A; Ferreira, I; Lefkopoulos, D

    2007-06-01

The virtual simulation process is often used to prepare three-dimensional conformal radiation therapy treatments. As the quality of the treatment is widely dependent on this step, it is mandatory to perform extensive controls on this software before clinical use. The tests presented in this work have been carried out on the treatment planning system ISOgray (DOSIsoft), including the delineation module IMAgo and the virtual simulation module SIMAgo. According to our experience, the most relevant controls of international protocols have been selected. These tests mainly focused on measuring and delineation tools and virtual simulation functionalities, and have been performed with three phantoms: the Quasar Multi-Purpose Body Phantom, the Quasar MLC Beam Geometry Phantom (Modus Medical Devices Inc.) and a phantom developed at Hospital Tenon. No major issues were identified while performing the tests. These controls have emphasized the necessity for the user to consider with a critical eye the results displayed by virtual simulation software. The contrast of visualisation, the slice thickness, and the calculation and display mode of 3D structures used by the software are all sources of uncertainty. A virtual simulation software quality assurance procedure has been written and applied on a set of CT images. Similar tests have to be performed periodically, and at a minimum at each major version change.

  12. Intelligent Medical Systems for Aerospace Emergency Medical Services

    NASA Technical Reports Server (NTRS)

    Epler, John; Zimmer, Gary

    2004-01-01

The purpose of this project is to develop a portable, hands-free device for emergency medical decision support to be used in remote or confined settings by non-physician providers. Phase I of the project will entail the development of a voice-activated device that will utilize an intelligent algorithm to provide guidance in establishing an airway in an emergency situation. The interactive, hands-free software will process requests for assistance based on verbal prompts and algorithmic decision-making. The device will allow the CMO to attend to the patient while receiving verbal instruction. The software will also feature graphic representations where these are felt to be helpful in aiding procedures. We will also develop a training program to orient users to the algorithmic approach, the use of the hardware, and specific procedural considerations. We will validate the efficacy of this mode of technology application by testing in the Johns Hopkins Department of Emergency Medicine. Phase I of the project will focus on the validation of the proposed algorithm, testing and validation of the decision-making tool, and modifications of medical equipment. In Phase II, we will produce the first-generation software for hands-free, interactive medical decision making for use in acute care environments.

  13. Study of fault tolerant software technology for dynamic systems

    NASA Technical Reports Server (NTRS)

    Caglayan, A. K.; Zacharias, G. L.

    1985-01-01

    The major aim of this study is to investigate the feasibility of using systems-based failure detection isolation and compensation (FDIC) techniques in building fault-tolerant software and extending them, whenever possible, to the domain of software fault tolerance. First, it is shown that systems-based FDIC methods can be extended to develop software error detection techniques by using system models for software modules. In particular, it is demonstrated that systems-based FDIC techniques can yield consistency checks that are easier to implement than acceptance tests based on software specifications. Next, it is shown that systems-based failure compensation techniques can be generalized to the domain of software fault tolerance in developing software error recovery procedures. Finally, the feasibility of using fault-tolerant software in flight software is investigated. In particular, possible system and version instabilities, and functional performance degradation that may occur in N-Version programming applications to flight software are illustrated. Finally, a comparative analysis of N-Version and recovery block techniques in the context of generic blocks in flight software is presented.

  14. New software tools for enhanced precision in robot-assisted laser phonomicrosurgery.

    PubMed

    Dagnino, Giulio; Mattos, Leonardo S; Caldwell, Darwin G

    2012-01-01

    This paper describes a new software package created to enhance precision during robot-assisted laser phonomicrosurgery procedures. The new software is composed of three tools for camera calibration, automatic tumor segmentation, and laser tracking. These were designed and developed to improve the outcome of this demanding microsurgical technique, and were tested herein to produce quantitative performance data. The experimental setup was based on the motorized laser micromanipulator created by Istituto Italiano di Tecnologia and the experimental protocols followed are fully described in this paper. The results show the new tools are robust and effective: the camera calibration tool reduced residual errors (RMSE) to 0.009 ± 0.002 mm under 40× microscope magnification; the automatic tumor segmentation tool resulted in deep lesion segmentations comparable to manual segmentations (RMSE = 0.160 ± 0.028 mm under 40× magnification); and the laser tracker tool proved to be reliable even during cutting procedures (RMSE = 0.073 ± 0.023 mm under 40× magnification). These results demonstrate the new software package can provide excellent improvements to the previous microsurgical system, leading to important enhancements in surgical outcome.

  15. Integrating Formal Methods and Testing 2002

    NASA Technical Reports Server (NTRS)

    Cukic, Bojan

    2002-01-01

    Traditionally, qualitative program verification methodologies and program testing are studied in separate research communities. Neither alone is powerful and practical enough to provide sufficient confidence in ultra-high reliability assessment when used exclusively. Significant advances can be made by accounting not only for formal verification and program testing, but also for the impact of many other standard V&V techniques, in a unified software reliability assessment framework. The first year of this research resulted in a statistical framework that, given the assumptions on the success of the qualitative V&V and QA procedures, significantly reduces the amount of testing needed to confidently assess reliability at so-called high and ultra-high levels (failure rates of 10^-4 or better). The coming years will address methodologies to realistically estimate the impacts of various V&V techniques on system reliability and include the impact of operational risk in reliability assessment. The goals are to: A) Combine formal correctness verification, process and product metrics, and other standard qualitative software assurance methods with statistical testing, with the aim of gaining higher confidence in software reliability assessment for high-assurance applications; B) Quantify the impact of these methods on software reliability; C) Demonstrate that accounting for the effectiveness of these methods reduces the number of tests needed to attain a certain confidence level; and D) Quantify and justify the reliability estimate for systems developed using various methods.

  16. Generation and reduction of the data for the Ulysses gravitational wave experiment

    NASA Technical Reports Server (NTRS)

    Agresti, R.; Bonifazi, P.; Iess, L.; Trager, G. B.

    1987-01-01

    A procedure for the generation and reduction of the radiometric data known as REGRES is described. The software is implemented on a HP-1000F computer and was tested on REGRES data relative to the Voyager I spacecraft. The REGRES data are a current output of NASA's Orbit Determination Program. The software package was developed in view of the data analysis of the gravitational wave experiment planned for the European spacecraft Ulysses.

  17. Progress on the development of automated data analysis algorithms and software for ultrasonic inspection of composites

    NASA Astrophysics Data System (ADS)

    Aldrin, John C.; Coughlin, Chris; Forsyth, David S.; Welter, John T.

    2014-02-01

    Progress is presented on the development and implementation of automated data analysis (ADA) software to address the burden in interpreting ultrasonic inspection data for large composite structures. The automated data analysis algorithm is presented in detail, which follows standard procedures for analyzing signals for time-of-flight indications and backwall amplitude dropout. ADA processing results are presented for test specimens that include inserted materials and discontinuities produced under poor manufacturing conditions.
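    One of the named criteria, backwall amplitude dropout, can be illustrated by a deliberately simplified check (not the ADA algorithm itself): flag an A-scan when the peak amplitude inside the expected backwall time gate falls below a fraction of a reference amplitude. The gate positions, 6 dB threshold, and waveforms below are invented for illustration.

```python
# Highly simplified stand-in for one ADA criterion: a backwall amplitude
# dropout is flagged when the peak inside the backwall time gate drops
# below a -6 dB threshold. Gates, threshold, and data are illustrative.

def backwall_dropout(ascan, gate_start, gate_end, reference_amp, drop_db=6.0):
    """ascan: list of absolute echo amplitudes per time sample."""
    peak = max(ascan[gate_start:gate_end])
    threshold = reference_amp * 10 ** (-drop_db / 20.0)  # -6 dB is ~50%
    return peak < threshold

# Healthy region: strong backwall echo inside the gate.
healthy = [0.05] * 80 + [0.9, 1.0, 0.8] + [0.05] * 17
# Region over a delamination: energy lost before reaching the backwall.
damaged = [0.05] * 80 + [0.2, 0.25, 0.2] + [0.05] * 17

print(backwall_dropout(healthy, 78, 86, reference_amp=1.0))  # False
print(backwall_dropout(damaged, 78, 86, reference_amp=1.0))  # True
```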

  18. A Measurement Plane for Optical Networks to Manage Emergency Events

    NASA Astrophysics Data System (ADS)

    Tego, E.; Carciofi, C.; Grazioso, P.; Petrini, V.; Pompei, S.; Matera, F.; Attanasio, V.; Nastri, E.; Restuccia, E.

    2017-11-01

    In this work, we show a wide geographical area optical network test bed adopting the mPlane measurement plane for monitoring its performance and for managing software-defined network approaches, with specific tests and procedures dedicated to responding to disaster events and supporting emergency networks. The test bed includes FTTX accesses and is currently implemented to support future 5G wireless services with slicing procedures based on Carrier Ethernet. The characteristics of this platform have been experimentally tested in the case of a damage-causing link failure and traffic congestion, showing fast reactions to these disastrous events and allowing the user to restore the initial QoS parameters.

  19. 48 CFR 227.7203-11 - Contractor procedures and records.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Rights in Computer Software and Computer Software Documentation 227.7203-11 Contractor procedures and records. (a) The clause at 252.227-7014, Rights in Noncommercial Computer Software and Noncommercial Computer Software Documentation, requires a contractor, and its subcontractors or suppliers that will...

  20. 48 CFR 227.7203-11 - Contractor procedures and records.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Rights in Computer Software and Computer Software Documentation 227.7203-11 Contractor procedures and records. (a) The clause at 252.227-7014, Rights in Noncommercial Computer Software and Noncommercial Computer Software Documentation, requires a contractor, and its subcontractors or suppliers that will...

  1. 48 CFR 227.7203-11 - Contractor procedures and records.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Rights in Computer Software and Computer Software Documentation 227.7203-11 Contractor procedures and records. (a) The clause at 252.227-7014, Rights in Noncommercial Computer Software and Noncommercial Computer Software Documentation, requires a contractor, and its subcontractors or suppliers that will...

  2. 48 CFR 227.7203-11 - Contractor procedures and records.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Rights in Computer Software and Computer Software Documentation 227.7203-11 Contractor procedures and records. (a) The clause at 252.227-7014, Rights in Noncommercial Computer Software and Noncommercial Computer Software Documentation, requires a contractor, and its subcontractors or suppliers that will...

  3. 48 CFR 227.7203-11 - Contractor procedures and records.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Rights in Computer Software and Computer Software Documentation 227.7203-11 Contractor procedures and records. (a) The clause at 252.227-7014, Rights in Noncommercial Computer Software and Noncommercial Computer Software Documentation, requires a contractor, and its subcontractors or suppliers that will...

  4. Quality assurance software inspections at NASA Ames: Metrics for feedback and modification

    NASA Technical Reports Server (NTRS)

    Wenneson, G.

    1985-01-01

    Software inspections--a set of formal technical review procedures held at selected key points during software development in order to find defects in software documents--are described in terms of history, participants, tools, procedures, statistics, and database analysis.

  5. Computer Administering of the Psychological Investigations: Set-Relational Representation

    NASA Astrophysics Data System (ADS)

    Yordzhev, Krasimir

    Computer administering of a psychological investigation is the computer representation of the entire procedure of psychological assessment - test construction, test implementation, results evaluation, storage and maintenance of the developed database, and its statistical processing, analysis, and interpretation. A mathematical description of psychological assessment with the aid of personality tests is discussed in this article. Set theory and relational algebra are used in this description. A relational model of the data needed to design a computer system for the automation of certain psychological assessments is given. Some finite sets, and relations on them, which are necessary for creating a personality psychological test, are described. The described model could be used to develop real software for computer administering of any psychological test, with full automation of the whole process: test construction, test implementation, result evaluation, storage of the developed database, statistical processing, analysis, and interpretation. A software project for computer administering of personality psychological tests is suggested.
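    The set-relational idea can be sketched in a few lines (this is an illustration of the general approach, not the author's model; all item names, scales, and answers are hypothetical): items and scales are finite sets, a relation R ⊆ items × scales assigns items to scales, and scoring is an aggregation over that relation.

```python
# Illustrative sketch of a set-relational test representation; all names
# and data are hypothetical, not the model from the article.
items = {"q1", "q2", "q3", "q4"}
scales = {"extraversion", "neuroticism"}

# Relation R subset-of items x scales: which item feeds which scale.
R = {("q1", "extraversion"), ("q2", "extraversion"),
     ("q3", "neuroticism"), ("q4", "neuroticism")}

# One subject's answers (item -> raw score).
answers = {"q1": 4, "q2": 5, "q3": 1, "q4": 2}

def scale_score(scale):
    """Aggregate a subject's raw scores over the items related to a scale."""
    return sum(answers[i] for (i, s) in R if s == scale)

print(scale_score("extraversion"), scale_score("neuroticism"))  # 9 3
```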

  6. APPLICATION OF SOFTWARE QUALITY ASSURANCE CONCEPTS AND PROCEDURES TO ENVIORNMENTAL RESEARCH INVOLVING SOFTWARE DEVELOPMENT

    EPA Science Inventory

    As EPA’s environmental research expands into new areas that involve the development of software, quality assurance concepts and procedures that were originally developed for environmental data collection may not be appropriate. Fortunately, software quality assurance is a ...

  7. Software engineering and automatic continuous verification of scientific software

    NASA Astrophysics Data System (ADS)

    Piggott, M. D.; Hill, J.; Farrell, P. E.; Kramer, S. C.; Wilson, C. R.; Ham, D.; Gorman, G. J.; Bond, T.

    2011-12-01

    Software engineering of scientific code is challenging for a number of reasons, including pressure to publish and a lack of awareness among scientists of the pitfalls of software engineering. The Applied Modelling and Computation Group at Imperial College is a diverse group of researchers that employs best-practice software engineering methods whilst developing open source scientific software. Our main code is Fluidity - a multi-purpose computational fluid dynamics (CFD) code that can be used for a wide range of scientific applications from earth-scale mantle convection, through basin-scale ocean dynamics, to laboratory-scale classic CFD problems, and is coupled to a number of other codes including nuclear radiation and solid modelling. Our software development infrastructure consists of a number of free tools that could be employed by any group that develops scientific code, and has been developed over a number of years with many lessons learnt. A single code base is developed by over 30 people, for which we use bazaar for revision control, making good use of its strong branching and merging capabilities. Using features of Canonical's Launchpad platform, such as code review, blueprints for designing features, and bug reporting, gives the group, partners, and other Fluidity users an easy-to-use platform to collaborate, and allows the induction of new members of the group into an environment where software development forms a central part of their work. The code repository is coupled to an automated test and verification system which performs over 20,000 tests, including unit tests, short regression tests, code verification, and large parallel tests. Included in these tests are build tests on HPC systems, including local and UK National HPC services. Testing code in this manner leads to a continuous verification process, not a discrete event performed once development has ceased. Much of the code verification is done via the "gold standard" of comparison to analytical solutions via the method of manufactured solutions. By developing and verifying code in tandem we avoid a number of pitfalls in scientific software development, and we advocate similar procedures for other scientific code applications.
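    The method of manufactured solutions can be illustrated with a minimal, generic example (not Fluidity code): choose an exact solution, derive the source term it implies, solve numerically, and check that the error shrinks at the scheme's theoretical rate. Here u(x) = sin(x) is manufactured for -u'' = f on [0, π], so f = sin(x), and a second-order finite-difference solve should quarter its error when h is halved.

```python
# Minimal, generic method-of-manufactured-solutions demonstration.
import math

def solve(n):
    """Solve -u'' = sin(x) on [0, pi] with u(0) = u(pi) = 0 using n
    interior points and the Thomas algorithm; return the max error
    against the manufactured solution u(x) = sin(x)."""
    h = math.pi / (n + 1)
    a = [-1.0] * n  # sub-diagonal of (2u_i - u_{i-1} - u_{i+1})/h^2 = f
    b = [2.0] * n   # diagonal
    c = [-1.0] * n  # super-diagonal
    d = [h * h * math.sin((i + 1) * h) for i in range(n)]
    for i in range(1, n):            # forward elimination
        m = a[i] / b[i - 1]
        b[i] -= m * c[i - 1]
        d[i] -= m * d[i - 1]
    u = [0.0] * n                    # back substitution
    u[-1] = d[-1] / b[-1]
    for i in range(n - 2, -1, -1):
        u[i] = (d[i] - c[i] * u[i + 1]) / b[i]
    return max(abs(u[i] - math.sin((i + 1) * h)) for i in range(n))

# Halving h (n = 20 -> 41 interior points) should quarter the error.
e_coarse, e_fine = solve(20), solve(41)
print(e_coarse / e_fine)  # close to 4: observed second-order convergence
```

Observing the theoretical convergence rate, rather than merely a "small" error, is what makes this a code-verification test: a coding mistake that degrades the scheme's order is caught even if the error looks acceptable.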

  8. Significant lexical relationships

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pedersen, T.; Kayaalp, M.; Bruce, R.

    Statistical NLP inevitably deals with a large number of rare events. As a consequence, NLP data often violates the assumptions implicit in traditional statistical procedures such as significance testing. We describe a significance test, an exact conditional test, that is appropriate for NLP data and can be performed using freely available software. We apply this test to the study of lexical relationships and demonstrate that the results obtained using this test are both theoretically more reliable and different from the results obtained using previously applied tests.
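    An exact conditional test of the kind the abstract refers to is Fisher's exact test on a 2x2 contingency table of word co-occurrence counts. The sketch below implements the two-sided test from scratch with standard-library combinatorics; the bigram counts are hypothetical.

```python
# Two-sided Fisher's exact test on a 2x2 table, stdlib only.
# The word-pair counts below are invented for illustration.
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Sum the hypergeometric probabilities of all tables with the same
    margins that are no more probable than the observed table."""
    r1, r2, c1, n = a + b, c + d, a + c, a + b + c + d

    def prob(x):  # P(top-left cell = x | fixed margins)
        return comb(r1, x) * comb(r2, c1 - x) / comb(n, c1)

    p_obs = prob(a)
    lo, hi = max(0, c1 - r2), min(r1, c1)
    return sum(prob(x) for x in range(lo, hi + 1)
               if prob(x) <= p_obs * (1 + 1e-9))

# Counts for a word pair: (together, w1 only, w2 only, neither).
p = fisher_exact_2x2(10, 2, 3, 40)
print(p)  # small p-value: the pair co-occurs far more than chance
```

Because the computation is exact rather than asymptotic, it remains valid for the sparse counts typical of NLP data, where chi-square approximations break down.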

  9. Software for Optimizing Quality Assurance of Other Software

    NASA Technical Reports Server (NTRS)

    Feather, Martin; Cornford, Steven; Menzies, Tim

    2004-01-01

    Software assurance is the planned and systematic set of activities that ensures that software processes and products conform to requirements, standards, and procedures. Examples of such activities are the following: code inspections, unit tests, design reviews, performance analyses, construction of traceability matrices, etc. In practice, software development projects have only limited resources (e.g., schedule, budget, and availability of personnel) to cover the entire development effort, of which assurance is but a part. Projects must therefore select judiciously from among the possible assurance activities. At its heart, this can be viewed as an optimization problem; namely, to determine the allocation of limited resources (time, money, and personnel) to minimize risk or, alternatively, to minimize the resources needed to reduce risk to an acceptable level. The end result of the work reported here is a means to optimize quality-assurance processes used in developing software.
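    The optimization problem the abstract describes can be illustrated in toy form as a 0/1 knapsack: choose the subset of assurance activities that maximizes total risk reduction within a fixed budget. This is a hypothetical simplification for illustration, not the actual tool's model; all costs and risk-reduction values are invented.

```python
# Toy resource-allocation view of software assurance planning:
# 0/1 knapsack over assurance activities. All numbers are invented.
activities = [
    ("code inspection",      3, 7.0),  # (name, cost, risk reduction)
    ("unit tests",           2, 5.0),
    ("design review",        2, 4.0),
    ("performance analysis", 4, 6.0),
    ("traceability matrix",  1, 2.0),
]

def best_plan(activities, budget):
    # dp[b] = (best risk reduction, chosen activities) within budget b
    dp = [(0.0, [])] * (budget + 1)
    for name, cost, value in activities:
        for b in range(budget, cost - 1, -1):
            cand = (dp[b - cost][0] + value, dp[b - cost][1] + [name])
            if cand[0] > dp[b][0]:
                dp[b] = cand
    return dp[budget]

reduction, chosen = best_plan(activities, budget=6)
print(reduction, chosen)  # 14.0 with inspection, unit tests, traceability
```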

  10. ISO-IEC MPEG-2 software video codec

    NASA Astrophysics Data System (ADS)

    Eckart, Stefan; Fogg, Chad E.

    1995-04-01

    Part 5 of the International Standard ISO/IEC 13818 `Generic Coding of Moving Pictures and Associated Audio' (MPEG-2) is a Technical Report: a sample software implementation of the procedures in parts 1, 2, and 3 of the standard (systems, video, and audio). This paper focuses on the video software, which gives an example of a fully compliant implementation of the standard and of a good-video-quality encoder, and serves as a tool for compliance testing. The implementation and some development aspects of the codec are described. The encoder is based on Test Model 5 (TM5), one of the best published non-proprietary coding models, which was used during the MPEG-2 collaborative stage to evaluate proposed algorithms and to verify the syntax. The most important part of the Test Model is controlling the quantization parameter based on the image content and bit-rate constraints, under both signal-to-noise and psycho-optical aspects. The decoder has been successfully tested for compliance with the MPEG-2 standard, using the ISO/IEC MPEG verification and compliance bitstream test suites as stimuli.

  11. The development procedures and tools applied for the attitude control software of the Italian satellite SAX

    NASA Astrophysics Data System (ADS)

    Hameetman, G. J.; Dekker, G. J.

    1993-11-01

    The Italian satellite (with a large Dutch contribution) SAX is a scientific satellite whose mission is to study X-ray sources. One main requirement for the Attitude and Orbit Control Subsystem (AOCS) is to achieve and maintain a stable pointing accuracy, with a limit cycle of less than 90 arcsec, during pointings of up to 28 hours. The main SAX instrument, the Narrow Field Instrument, is highly sensitive to (indirect) radiation coming from the Sun. This sensitivity leads to another main requirement: that under no circumstances may the safe attitude domain be left. The paper describes the application software in relation to the overall SAX AOCS subsystem, the CASE tools that were used during development, some advantages and disadvantages of the use of these tools, the measures taken to meet the more or less conflicting requirements of reliability and flexibility, and the lessons learned during development. The quality of the development approach was proven by the (separately executed) hardware/software integration tests, during which a negligible number of software errors was detected in the application software.

  12. A Java software for creation of image mosaics.

    PubMed

    Bossert, Oliver

    2004-08-01

    Besides the dimensions of the selected image field width, the resolution of the individual objects is also of major importance for automatic reconstruction and other sophisticated histological work. The software solution presented here allows the user to create image mosaics by combining several photographs. Optimum control is achieved by combining two procedures and several control mechanisms. In sample tests involving 50 image pairs, all images were mosaicked without error. The program is ready for public download.

  13. Discrete Element Modeling (DEM) of Triboelectrically Charged Particles: Revised Experiments

    NASA Technical Reports Server (NTRS)

    Hogue, Michael D.; Calle, Carlos I.; Curry, D. R.; Weitzman, P. S.

    2008-01-01

    In a previous work, the addition of basic screened Coulombic electrostatic forces to existing commercial discrete element modeling (DEM) software was reported. Triboelectric experiments were performed to charge glass spheres rolling on inclined planes of various materials. Charge generation constants and the Q/m ratios for the test materials were calculated from the experimental data and compared to the simulation output of the DEM software. In this paper, we will discuss new values of the charge generation constants calculated from improved experimental procedures and data. Also, planned work to include dielectrophoretic forces, van der Waals forces, and advanced mechanical forces in the software will be discussed.
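    A screened Coulomb interaction of the general kind mentioned can be sketched with the standard Yukawa-type form, where the bare Coulomb force acquires an exponential screening factor; the charges, separation, and screening length below are arbitrary example values, not those from the paper.

```python
# Generic screened Coulomb (Yukawa-type) force magnitude; all parameter
# values are illustrative, not taken from the paper.
import math

K = 8.9875517873681764e9  # Coulomb constant, N*m^2/C^2

def screened_coulomb_force(q1, q2, r, screening_length):
    """Force from the screened potential phi ~ (q/r) * exp(-r/lambda):
    F = K q1 q2 / r^2 * exp(-r/lambda) * (1 + r/lambda)."""
    s = r / screening_length
    return K * q1 * q2 / r**2 * math.exp(-s) * (1 + s)

q1 = q2 = 1e-9   # 1 nC example charges
r = 0.01         # 1 cm separation
f_bare = K * q1 * q2 / r**2
f_screened = screened_coulomb_force(q1, q2, r, screening_length=0.02)
print(f_screened < f_bare)  # True: screening always weakens the force
```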

  14. Florida specific NTCIP MIB development for actuated signal controller (ASC), closed-circuit television (CCTV), and center-to-center (C2C) communications with SunGuideSM software and ITS device test procedure development : technical report documentation pa

    DOT National Transportation Integrated Search

    2009-06-30

    The project has been focused on National Transportation Communications for ITS Protocol : (NTCIP) research and testing across the entire life cycle of traffic operations, ITS, and statewide : communications deployments. This life cycle includes desig...

  15. GNAS Maintenance Control Center (GMCC) Design Qualification Test and Evaluation (DQT&E) Test Procedures

    DTIC Science & Technology

    1992-06-01

    connected to one MPS via BROUTERS and modems ). b. MPS - RMMS interface to the GMCC. The IMCS software running on the MPS will.be used to monitor the...Start Date: TODAYS DATE REGION: SO Sector: 56K 69. GPWS: Press FI0 (PRODUCE REPORT) to generate the report. 70. GPWS: Press F12 (REPORT STATUS) to

  16. Staged-Fault Testing of Distance Protection Relay Settings

    NASA Astrophysics Data System (ADS)

    Havelka, J.; Malarić, R.; Frlan, K.

    2012-01-01

    In order to analyze the operation of the protection system during induced fault testing in the Croatian power system, a simulation using the CAPE software has been performed. The CAPE software (Computer-Aided Protection Engineering) is expert software intended primarily for relay protection engineers, which calculates current and voltage values during faults in the power system, so that relay protection devices can be properly set up. Once the accuracy of the simulation model had been confirmed, a series of simulations were performed in order to obtain the optimal fault location to test the protection system. The simulation results were used to specify the test sequence definitions for the end-to-end relay testing using advanced testing equipment with GPS synchronization for secondary injection in protection schemes based on communication. The objective of the end-to-end testing was to perform field validation of the protection settings, including verification of the circuit breaker operation, telecommunication channel time and the effectiveness of the relay algorithms. Once the end-to-end secondary injection testing had been completed, the induced fault testing was performed with three-end lines loaded and in service. This paper describes and analyses the test procedure, consisting of CAPE simulations, end-to-end test with advanced secondary equipment and staged-fault test of a three-end power line in the Croatian transmission system.

  17. Generation of Long-time Complex Signals for Testing the Instruments for Detection of Voltage Quality Disturbances

    NASA Astrophysics Data System (ADS)

    Živanović, Dragan; Simić, Milan; Kokolanski, Zivko; Denić, Dragan; Dimcev, Vladimir

    2018-04-01

    A software-supported procedure for the generation of long-time complex test sequences, suitable for testing instruments for the detection of standard voltage quality (VQ) disturbances, is presented in this paper. This solution for test signal generation includes significant improvements over the computer-based signal generator presented and described in a previously published paper [1]. The generator is based on virtual instrumentation software for defining the basic signal parameters, a data acquisition card NI 6343, and a power amplifier for amplification of the output voltage level to the nominal RMS voltage value of 230 V. Definition of the basic signal parameters in the LabVIEW application software is supported using script files, which allows simple repetition of specific test signals and the combination of several different test sequences into a complex composite test waveform. The basic advantage of this generator compared to similar signal generation solutions is the possibility of long-time test sequence generation according to predefined complex test scenarios, including various combinations of VQ disturbances defined in accordance with the European standard EN50160. Experimental verification of the presented signal generator's capability is performed by testing the commercial power quality analyzer Fluke 435 Series II. Some characteristic complex test signals with various disturbances, together with logged data obtained from the tested power quality analyzer, are shown in this paper.
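    Composing such a test sequence can be sketched as concatenating phase-continuous segments of a 230 V RMS / 50 Hz sine, with one segment scaled down to emulate a voltage dip. The sample rate, durations, and 40% residual voltage below are illustrative choices, not parameters from the paper or EN50160 limits.

```python
# Sketch of composing a VQ test sequence: nominal 230 V RMS / 50 Hz with
# an inserted voltage dip (40% residual voltage). All parameters are
# illustrative example values.
import math

FS = 10_000                   # samples per second
AMP = 230 * math.sqrt(2)      # peak of a 230 V RMS sine

def segment(duration_s, scale):
    """Append a scaled sine segment, keeping the phase continuous."""
    n = int(duration_s * FS)
    start = segment.t
    segment.t += n
    return [scale * AMP * math.sin(2 * math.pi * 50 * (start + i) / FS)
            for i in range(n)]
segment.t = 0  # running sample counter shared across segments

signal = segment(1.0, 1.0) + segment(0.2, 0.4) + segment(1.0, 1.0)

def rms(x):
    return math.sqrt(sum(v * v for v in x) / len(x))

print(round(rms(signal[:FS]), 1))           # 230.0 (nominal section)
print(round(rms(signal[FS:FS + 2000]), 1))  # 92.0  (40% dip section)
```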

  18. Chandra X-ray Center Science Data Systems Regression Testing of CIAO

    NASA Astrophysics Data System (ADS)

    Lee, N. P.; Karovska, M.; Galle, E. C.; Bonaventura, N. R.

    2011-07-01

    The Chandra Interactive Analysis of Observations (CIAO) is a software system developed for the analysis of Chandra X-ray Observatory observations. An important component of a successful CIAO release is the repeated testing of the tools across various platforms to ensure consistent and scientifically valid results. We describe the procedures of the scientific regression testing of CIAO and the enhancements made to the testing system to increase the efficiency of run time and result validation.
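    The core of such cross-platform scientific regression testing can be shown schematically (this is a generic illustration, not CIAO code): compare a tool's numeric outputs against stored reference values within a relative tolerance, so platform-level floating-point differences pass while genuine regressions fail. The quantity names and values are invented.

```python
# Schematic of a tolerance-based scientific regression check; the keys,
# values, and tolerance are hypothetical, not from CIAO.
def check(outputs, reference, rtol=1e-5):
    """Return the list of (key, got, expected) that exceed the relative
    tolerance; an empty list means the regression test passes."""
    failures = []
    for key, ref in reference.items():
        got = outputs[key]
        if abs(got - ref) > rtol * abs(ref):
            failures.append((key, got, ref))
    return failures

reference = {"flux": 1.2345e-12, "counts": 8042.0}
outputs = {"flux": 1.23451e-12, "counts": 8042.0}  # tiny platform jitter
print(check(outputs, reference))  # [] -> passes despite the jitter
```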

  19. Proprioceptive isokinetic exercise test

    NASA Technical Reports Server (NTRS)

    Dempster, P. T.; Bernauer, E. M.; Bond, M.; Greenleaf, J. E.

    1993-01-01

    Proprioception, the reception of stimuli within the body that indicates position, is an important mechanism for optimal human performance. People exposed to prolonged bed rest, microgravity, or other deconditioning situations usually experience reduced proprioceptor and kinesthetic stimuli that compromise body balance, posture, and equilibrium. A new proprioceptive test is described that utilizes the computer-driven LIDO isokinetic ergometer. An overview of the computer logic, software, and testing procedure for this proprioceptive test, which can be performed with the arms or legs, is described.

  20. A cognitive mobile BTS solution with software-defined radioelectric sensing.

    PubMed

    Muñoz, Jorge; Alonso, Javier Vales; García, Francisco Quiñoy; Costas, Sergio; Pillado, Marcos; Castaño, Francisco Javier González; Sánchez, Manuel García; Valcarce, Roberto López; Bravo, Cristina López

    2013-02-05

    Private communications inside large vehicles such as ships may be effectively provided using standard cellular systems. In this paper we propose a new solution based on software-defined radio with electromagnetic sensing support. Software-defined radio allows low-cost developments and, potentially, added-value services not available in commercial cellular networks. The platform of reference, OpenBTS, only supports single-channel cells. Our proposal, however, has the ability of changing BTS channel frequency without disrupting ongoing communications. This ability should be mandatory in vehicular environments, where neighbouring cell configurations may change rapidly, so a moving cell must be reconfigured in real-time to avoid interferences. Full details about frequency occupancy sensing and the channel reselection procedure are provided in this paper. Moreover, a procedure for fast terminal detection is proposed. This may be decisive in emergency situations, e.g., if someone falls overboard. Different tests confirm the feasibility of our proposal and its compatibility with commercial GSM terminals.
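    The sensing-driven channel reselection idea can be sketched as follows (a schematic illustration, not the OpenBTS-based implementation): given measured power per candidate channel, move the cell to the least occupied channel whose level is below an interference threshold. The channel numbers, power readings, and threshold are invented.

```python
# Schematic channel reselection from spectrum-occupancy measurements.
# Channels, power levels (dBm), and the threshold are hypothetical.
def select_channel(measurements_dbm, current, threshold_dbm=-100.0):
    """Pick the quietest candidate channel below the interference
    threshold; stay on the current channel if none qualifies."""
    candidates = {ch: p for ch, p in measurements_dbm.items()
                  if p < threshold_dbm and ch != current}
    if not candidates:
        return current
    return min(candidates, key=candidates.get)

scan = {51: -85.0, 62: -107.0, 75: -112.0, 80: -99.0}
print(select_channel(scan, current=51))  # 75: the quietest free channel
```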

  1. A Cognitive Mobile BTS Solution with Software-Defined Radioelectric Sensing

    PubMed Central

    Muñoz, Jorge; Alonso, Javier Vales; García, Francisco Quiñoy; Costas, Secundino; Pillado, Marcos; Castaño, Francisco Javier González; Sánchez, Manuel Garćia; Valcarce, Roberto López; Bravo, Cristina López

    2013-01-01

    Private communications inside large vehicles such as ships may be effectively provided using standard cellular systems. In this paper we propose a new solution based on software-defined radio with electromagnetic sensing support. Software-defined radio allows low-cost developments and, potentially, added-value services not available in commercial cellular networks. The platform of reference, OpenBTS, only supports single-channel cells. Our proposal, however, has the ability of changing BTS channel frequency without disrupting ongoing communications. This ability should be mandatory in vehicular environments, where neighbouring cell configurations may change rapidly, so a moving cell must be reconfigured in real-time to avoid interferences. Full details about frequency occupancy sensing and the channel reselection procedure are provided in this paper. Moreover, a procedure for fast terminal detection is proposed. This may be decisive in emergency situations, e.g., if someone falls overboard. Different tests confirm the feasibility of our proposal and its compatibility with commercial GSM terminals. PMID:23385417

  2. Object oriented development of engineering software using CLIPS

    NASA Technical Reports Server (NTRS)

    Yoon, C. John

    1991-01-01

    Engineering applications involve numeric complexity and manipulation of large amounts of data. Traditionally, numeric computation has been the main concern in developing engineering software. As engineering application software became larger and more complex, management of resources such as data, rather than numeric complexity, became the major software design problem. Object-oriented design and implementation methodologies can improve the reliability, flexibility, and maintainability of the resulting software; however, some tasks are better solved with the traditional procedural paradigm. The C Language Integrated Production System (CLIPS), with its deffunction and defgeneric constructs, supports the procedural paradigm. The natural blending of the object-oriented and procedural paradigms has been cited as the reason for the popularity of the C++ language. The object-oriented features of the CLIPS Object Oriented Language (COOL) are more versatile than those of C++. A software design methodology appropriate for engineering software, based on object-oriented and procedural approaches and implemented in CLIPS, is outlined. As a sample problem, a method for sensor placement for Space Station Freedom is being implemented in COOL.

  3. Evaluation of software tools for automated identification of neuroanatomical structures in quantitative β-amyloid PET imaging to diagnose Alzheimer's disease.

    PubMed

    Tuszynski, Tobias; Rullmann, Michael; Luthardt, Julia; Butzke, Daniel; Tiepolt, Solveig; Gertz, Hermann-Josef; Hesse, Swen; Seese, Anita; Lobsien, Donald; Sabri, Osama; Barthel, Henryk

    2016-06-01

    For regional quantification of nuclear brain imaging data, defining volumes of interest (VOIs) by hand is still the gold standard. As this procedure is time-consuming and operator-dependent, a variety of software tools for automated identification of neuroanatomical structures were developed. As the quality and performance of those tools are poorly investigated so far in analyzing amyloid PET data, we compared in this project four algorithms for automated VOI definition (HERMES Brass, two PMOD approaches, and FreeSurfer) against the conventional method. We systematically analyzed florbetaben brain PET and MRI data of ten patients with probable Alzheimer's dementia (AD) and ten age-matched healthy controls (HCs) collected in a previous clinical study. VOIs were manually defined on the data as well as through the four automated workflows. Standardized uptake value ratios (SUVRs) with the cerebellar cortex as a reference region were obtained for each VOI. SUVR comparisons between ADs and HCs were carried out using Mann-Whitney-U tests, and effect sizes (Cohen's d) were calculated. SUVRs of automatically generated VOIs were correlated with SUVRs of conventionally derived VOIs (Pearson's tests). The composite neocortex SUVRs obtained by manually defined VOIs were significantly higher for ADs vs. HCs (p=0.010, d=1.53). This was also the case for the four tested automated approaches which achieved effect sizes of d=1.38 to d=1.62. SUVRs of automatically generated VOIs correlated significantly with those of the hand-drawn VOIs in a number of brain regions, with regional differences in the degree of these correlations. Best overall correlation was observed in the lateral temporal VOI for all tested software tools (r=0.82 to r=0.95, p<0.001). Automated VOI definition by the software tools tested has a great potential to substitute for the current standard procedure to manually define VOIs in β-amyloid PET data analysis.
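    The effect-size measure used for the AD-vs-HC comparisons, Cohen's d with a pooled standard deviation, is straightforward to compute; the two SUVR samples below are invented for illustration, not data from the study.

```python
# Cohen's d with pooled standard deviation, as used for group
# comparisons of SUVRs. The sample values are hypothetical.
import math

def cohens_d(x, y):
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)  # sample variances
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)
    pooled_sd = math.sqrt(((nx - 1) * vx + (ny - 1) * vy) / (nx + ny - 2))
    return (mx - my) / pooled_sd

ad_suvr = [1.8, 2.1, 1.9, 2.3, 2.0]  # hypothetical patient SUVRs
hc_suvr = [1.2, 1.4, 1.3, 1.5, 1.1]  # hypothetical control SUVRs
print(round(cohens_d(ad_suvr, hc_suvr), 2))
```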

  4. Estimating multilevel logistic regression models when the number of clusters is low: a comparison of different statistical software procedures.

    PubMed

    Austin, Peter C

    2010-04-22

    Multilevel logistic regression models are increasingly being used to analyze clustered data in medical, public health, epidemiological, and educational research. Procedures for estimating the parameters of such models are available in many statistical software packages. There is currently little evidence on the minimum number of clusters necessary to reliably fit multilevel regression models. We conducted a Monte Carlo study to compare the performance of different statistical software procedures for estimating multilevel logistic regression models when the number of clusters was low. We examined procedures available in BUGS, HLM, R, SAS, and Stata. We found that there were qualitative differences in the performance of different software procedures for estimating multilevel logistic models when the number of clusters was low. Among the likelihood-based procedures, estimation methods based on adaptive Gauss-Hermite approximations to the likelihood (glmer in R and xtlogit in Stata) or adaptive Gaussian quadrature (Proc NLMIXED in SAS) tended to have superior performance for estimating variance components when the number of clusters was small, compared to software procedures based on penalized quasi-likelihood. However, only Bayesian estimation with BUGS allowed for accurate estimation of variance components when there were fewer than 10 clusters. For all statistical software procedures, estimation of variance components tended to be poor when there were only five subjects per cluster, regardless of the number of clusters.
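    The kind of clustered binary data such a Monte Carlo study simulates can be sketched with a random-intercept logistic model: each cluster draws an intercept on the logit scale, and subjects' binary outcomes depend on it. The cluster count, cluster size, variance, and fixed intercept below are invented example values, not the study's design.

```python
# Sketch of simulating clustered binary outcomes under a random-intercept
# logistic model; all parameter values are illustrative.
import math
import random

def simulate(n_clusters=5, cluster_size=10, sigma_u=1.0, beta0=-0.5, seed=42):
    rng = random.Random(seed)
    data = []
    for cluster in range(n_clusters):
        u = rng.gauss(0.0, sigma_u)  # cluster-level random intercept
        for _ in range(cluster_size):
            p = 1.0 / (1.0 + math.exp(-(beta0 + u)))  # logistic link
            y = 1 if rng.random() < p else 0
            data.append((cluster, y))
    return data

data = simulate()
print(len(data))  # 50 observations across 5 small clusters
```

With only five clusters, the single draw of u per cluster carries little information about sigma_u, which is the estimation difficulty the abstract reports.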

  5. Software design as a problem in learning theory (a research overview)

    NASA Technical Reports Server (NTRS)

    Fass, Leona F.

    1992-01-01

    Our interest in automating software design has come out of our research in automated reasoning, inductive inference, learnability, and algebraic machine theory. We have investigated these areas extensively, in connection with specific problems of language representation, acquisition, processing, and design. In the case of formal context-free (CF) languages we established the existence of finite learnable models ('behavioral realizations') and procedures for constructing them effectively. We also determined techniques for automatic construction of the models, inductively inferring them from finite examples of how they should 'behave'. These results were obtainable due to appropriate representation of domain knowledge, and constraints on the domain that the representation defined. It was when we sought to generalize our results, and adapt or apply them, that we began investigating the possibility of determining similar procedures for constructing correct software. Discussions with other researchers led us to examine testing and verification processes, both because they are related to inference and because of their considerable importance in correct software design. Motivating papers by other researchers led us to examine these processes in some depth. Here we present our approach to those software design issues raised by other researchers, within our own theoretical context. We describe our results relative to those of the other researchers, and conclude that they do not compare unfavorably.

  6. [Development of a software standardizing optical density with operation settings related to several limitations].

    PubMed

    Tu, Xiao-Ming; Zhang, Zuo-Heng; Wan, Cheng; Zheng, Yu; Xu, Jin-Mei; Zhang, Yuan-Yuan; Luo, Jian-Ping; Wu, Hai-Wei

    2012-12-01

    To develop software that standardizes optical density and normalizes the procedures and results of standardization, in order to effectively solve several problems that arise during standardization of indirect ELISA results. The software was designed based on the I-STOD method, with operation settings to address the problems that one might encounter during standardization. Matlab GUI was used as the development tool. The software was tested with results from the detection of sera of persons from schistosomiasis japonica endemic areas. I-STOD V1.0 (WINDOWS XP/WIN 7, 0.5 GB) was successfully developed to standardize optical density. A series of serum samples from schistosomiasis japonica endemic areas was used to examine the operational effectiveness of I-STOD V1.0. The results indicated that the software successfully overcame several problems, including the reliability of the standard curve, the applicable scope of samples, and the determination of dilution for samples outside that scope, so that I-STOD was performed more conveniently and the results of standardization were more consistent. I-STOD V1.0 is a professional software package based on the I-STOD method. It is easy to operate and can effectively standardize the test results of indirect ELISA.
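
    The abstract does not detail I-STOD's curve model, but the general idea of standardizing sample ODs against a fitted standard curve can be sketched as follows. The linear-in-log-concentration form, the numbers, and the function names are assumptions for illustration, not I-STOD's actual implementation:

```python
import numpy as np

def fit_standard_curve(concentrations, ods):
    """Fit a linear standard curve OD = a * log(conc) + b (a hypothetical
    simplification; the I-STOD curve model is not specified here)."""
    a, b = np.polyfit(np.log(concentrations), ods, 1)
    return a, b

def standardize_od(sample_od, a, b):
    """Map a raw OD back onto the standard-curve scale (log concentration)."""
    return (sample_od - b) / a

# Calibrators with known concentrations and their measured ODs
conc = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
ods = 0.3 * np.log(conc) + 0.1           # noise-free for illustration

a, b = fit_standard_curve(conc, ods)
print(round(a, 3), round(b, 3))           # recovers 0.3 and 0.1
val = standardize_od(0.3 * np.log(5.0) + 0.1, a, b)
print(round(np.exp(val), 3))              # back-calculated concentration ≈ 5.0
```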

  7. New techniques for test development for tactical auto-pilots using microprocessors

    NASA Astrophysics Data System (ADS)

    Shemeta, E. H.

    1980-07-01

    This paper reports on a demonstration of the application of the method to generate system level tests for a typical tactical missile autopilot. The test algorithms are based on the autopilot control law. When loaded on the tester with appropriate control information, the complete autopilot is tested to establish whether the specified control law requirements are met. Thus, the test procedure not only checks that the hardware is functional, but also checks the operational software. The technique also uses a 'learning' mode to allow minor timing or functional deviations from the expected responses to be incorporated in the test procedures. A potential application of this test development technique is the extraction of production test data for the various subassemblies. The technique will 'learn' the input-output patterns forming the basis for development and production tests. If successful, these new techniques should allow the test development process to keep pace with semiconductor progress.

  8. Orbiter multiplexer-demultiplexer (MDM)/Space Lab Bus Interface Unit (SL/BIU) serial data interface evaluation, volume 2

    NASA Technical Reports Server (NTRS)

    Tobey, G. L.

    1978-01-01

    Tests were performed to evaluate the operating characteristics of the interface between the Space Lab Bus Interface Unit (SL/BIU) and the Orbiter Multiplexer-Demultiplexer (MDM) serial data input-output (SIO) module. This volume contains the test equipment preparation procedures and a detailed description of the Nova/Input Output Processor Simulator (IOPS) software used during the data transfer tests to determine word error rates (WER).

  9. Ada Programming Support Environment (APSE) Evaluation and Validation (E&V) Team

    DTIC Science & Technology

    1991-12-31

    standards. The purpose of the team was to assist the project in several ways. Raymond Szymanski of Wright Research and Development Center (WRDC, now...debuggers, program library systems, and compiler diagnostics. The test suite does not include explicit tests for the existence of language features. The...support software is a set of tools and procedures which assist in preparing and executing the test suite, in extracting data from the results of

  10. A Method of Retrospective Computerized System Validation for Drug Manufacturing Software Considering Modifications

    NASA Astrophysics Data System (ADS)

    Takahashi, Masakazu; Fukue, Yoshinori

    This paper proposes a Retrospective Computerized System Validation (RCSV) method for Drug Manufacturing Software (DMSW) that takes software modification into account. Because DMSW used for quality management and facility control has a significant impact on drug quality, regulatory agencies require proof of the adequacy of DMSW functions and performance, based on development documents and test results. In particular, the work of demonstrating the adequacy of previously developed DMSW on the basis of existing documents and operational records is called RCSV. When modifying DMSW that had already undergone RCSV, it was difficult to maintain consistency between the documents and test results for the modified parts and the existing documents and operational records for the unmodified parts, which made conducting RCSV difficult. In this paper, we propose (a) a defined document architecture, (b) defined descriptive items and levels within the documents, (c) management of design information using a database, (d) exhaustive testing, and (e) an integrated RCSV procedure. As a result, we were able to conduct adequate RCSV while maintaining consistency.

  11. Ultrasonic interface level analyzer shop test procedure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    STAEHR, T.W.

    1999-05-24

    The Royce Instrument Corporation Model 2511 Interface Level Analyzer (URSILLA) system uses an ultrasonic ranging technique (SONAR) to measure sludge depths in holding tanks. Three URSILLA instrument assemblies provided by the W-151 project are planned to be used during mixer pump testing to provide data for determining the sludge mobilization effectiveness of the mixer pumps and sludge settling rates. The purpose of this test is to provide a documented means of verifying that the functional components of the three URSILLA instruments operate properly. Successful completion of this Shop Test Procedure (STP) is a prerequisite for installation in the AZ-101 tank. The objective of the test is to verify the operation of the URSILLA instruments and to verify data collection using a stand-alone software program.

  12. Software quality assurance plan for GCS

    NASA Technical Reports Server (NTRS)

    Duncan, Stephen E.; Bailey, Elizabeth K.

    1990-01-01

    The software quality assurance (SQA) function for the Guidance and Control Software (GCS) project which is part of a software error studies research program is described. The SQA plan outlines all of the procedures, controls, and audits to be carried out by the SQA organization to ensure adherence to the policies, procedures, and standards for the GCS project.

  13. Experimental and simulation flow rate analysis of the 3/2 directional pneumatic valve

    NASA Astrophysics Data System (ADS)

    Blasiak, Slawomir; Takosoglu, Jakub E.; Laski, Pawel A.; Pietrala, Dawid S.; Zwierzchowski, Jaroslaw; Bracha, Gabriel; Nowakowski, Lukasz; Blasiak, Malgorzata

    This work presents a comparative analysis of two test methods. The first, a numerical method, consists in determining the flow characteristics using ANSYS CFX. A 3/2 poppet directional valve was modeled for this purpose in the 3D CAD software SolidWorks. Based on the solid model, simulation studies of the air flow through the directional valve were conducted in the computational fluid dynamics software ANSYS CFX. The second, experimental, method entailed conducting tests on a specially constructed test stand. Comparing the results obtained with both methods made it possible to determine the cross-correlation. The high agreement of the results confirms the usefulness of the numerical procedures; thus, they might serve to determine the flow characteristics of directional valves as an alternative to a costly and time-consuming test stand.

  14. 48 CFR 208.7403 - Acquisition procedures.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... SYSTEM, DEPARTMENT OF DEFENSE ACQUISITION PLANNING REQUIRED SOURCES OF SUPPLIES AND SERVICES Enterprise Software Agreements 208.7403 Acquisition procedures. Follow the procedures at PGI 208.7403 when acquiring commercial software and related services. [71 FR 39005, July 11, 2006] ...

  15. Development of an Expert System for Representing Procedural Knowledge

    NASA Technical Reports Server (NTRS)

    Georgeff, Michael P.; Lansky, Amy L.

    1985-01-01

    A high level of automation is of paramount importance in most space operations. It is critical for unmanned missions and greatly increases the effectiveness of manned missions. However, although many functions can be automated by using advanced engineering techniques, others require complex reasoning, sensing, and manipulatory capabilities that go beyond this technology. Automation of fault diagnosis and malfunction handling is a case in point. The military have long been interested in this problem, and have developed automatic test equipment to aid in the maintenance of complex military hardware. These systems are all based on conventional software and engineering techniques. However, the effectiveness of such test equipment is severely limited. The equipment is inflexible and unresponsive to the skill level of the technicians using it. The diagnostic procedures cannot be matched to the exigencies of the current situation nor can they cope with reconfiguration or modification of the items under test. The diagnosis cannot be guided by useful advice from technicians and, when a fault cannot be isolated, no explanation is given as to the cause of failure. Because these systems perform a prescribed sequence of tests, they cannot utilize knowledge of a particular situation to focus attention on more likely trouble spots. Consequently, real-time performance is highly unsatisfactory. Furthermore, the cost of developing test software is substantial and time to maturation is excessive. Significant advances in artificial intelligence (AI) have recently led to the development of powerful and flexible reasoning systems, known as expert or knowledge-based systems. We have devised a powerful and theoretically sound scheme for representing and reasoning about procedural knowledge.

  16. A framework for assessing the adequacy and effectiveness of software development methodologies

    NASA Technical Reports Server (NTRS)

    Arthur, James D.; Nance, Richard E.

    1990-01-01

    Tools, techniques, environments, and methodologies dominate the software engineering literature, but relatively little research in the evaluation of methodologies is evident. This work reports an initial attempt to develop a procedural approach to evaluating software development methodologies. Prominent in this approach are: (1) an explication of the role of a methodology in the software development process; (2) the development of a procedure based on linkages among objectives, principles, and attributes; and (3) the establishment of a basis for reduction of the subjective nature of the evaluation through the introduction of properties. An application of the evaluation procedure to two Navy methodologies has provided consistent results that demonstrate the utility and versatility of the evaluation procedure. Current research efforts focus on the continued refinement of the evaluation procedure through the identification and integration of product quality indicators reflective of attribute presence, and the validation of metrics supporting the measure of those indicators. The consequent refinement of the evaluation procedure offers promise of a flexible approach that admits to change as the field of knowledge matures. In conclusion, the procedural approach presented in this paper represents a promising path toward the end goal of objectively evaluating software engineering methodologies.

  17. [Quality assurance of the renal applications software].

    PubMed

    del Real Núñez, R; Contreras Puertas, P I; Moreno Ortega, E; Mena Bares, L M; Maza Muret, F R; Latre Romero, J M

    2007-01-01

    The need for quality assurance of all technical aspects of nuclear medicine studies is widely recognised. However, little attention has been paid to the quality assurance of applications software. The work reported here aims at verifying the analysis software for processing renal nuclear medicine studies (renograms). The software tools were used to build a synthetic dynamic model of the renal system. The model consists of two phases: perfusion and function. The organs of interest (kidneys, bladder, and aortic artery) were simple geometric forms, and the uptake of the renal structures was described by mathematical functions. Curves corresponding to normal or pathological conditions were simulated for the kidneys, bladder, and aortic artery by appropriate selection of parameters. There was no difference between the parameters of the mathematical curves and the quantitative data produced by the renal analysis program. Our test procedure is simple to apply, reliable, reproducible, and rapid for verifying the renal applications software.
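
    A synthetic time-activity curve of the kind described, with analytically known parameters against which the analysis program can be checked, might be generated as follows. The gamma-variate form and the parameter values are illustrative assumptions, not the article's exact functions:

```python
import numpy as np

def gamma_variate(t, amplitude=1.0, alpha=3.0, beta=2.0):
    """Gamma-variate time-activity curve, a common analytic form for
    synthetic renogram phantoms: c(t) = A * t^alpha * exp(-t / beta)."""
    return amplitude * np.power(t, alpha) * np.exp(-t / beta)

t = np.linspace(0.0, 40.0, 4001)   # acquisition time in minutes
curve = gamma_variate(t)

# A quantitative parameter the analysis software should recover:
t_peak = t[np.argmax(curve)]       # analytically, t_peak = alpha * beta = 6
print(round(t_peak, 2))
```

Because the ground-truth parameters are known exactly, any discrepancy in the program's reported values flags a software error rather than measurement noise.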

  18. Software for Analyzing Laminar-to-Turbulent Flow Transitions

    NASA Technical Reports Server (NTRS)

    Chang, Chau-Lyan

    2004-01-01

    Software assurance is the planned and systematic set of activities that ensures that software processes and products conform to requirements, standards, and procedures. Examples of such activities are code inspections, unit tests, design reviews, performance analyses, construction of traceability matrices, etc. In practice, software development projects have only limited resources (e.g., schedule, budget, and availability of personnel) to cover the entire development effort, of which assurance is but a part. Projects must therefore select judiciously from among the possible assurance activities. At its heart, this can be viewed as an optimization problem; namely, to determine the allocation of limited resources (time, money, and personnel) to minimize risk or, alternatively, to minimize the resources needed to reduce risk to an acceptable level. The end result of the work reported here is a means to optimize the quality-assurance processes used in developing software. This is achieved by combining two prior programs in an innovative manner.

  19. Virtual Service, Real Data: Results of a Pilot Study.

    ERIC Educational Resources Information Center

    Kibbee, Jo; Ward, David; Ma, Wei

    2002-01-01

    Describes a pilot project at the University of Illinois at Urbana-Champaign reference and undergraduate libraries to test the feasibility of offering real-time online reference service via their Web site. Discusses software selection, policies and procedures, promotion and marketing, user interface, training and staffing, data collection, and…

  20. Development of a High Accuracy Angular Measurement System for Langley Research Center Hypersonic Wind Tunnel Facilities

    NASA Technical Reports Server (NTRS)

    Newman, Brett; Yu, Si-bok; Rhew, Ray D. (Technical Monitor)

    2003-01-01

    Modern experimental and test activities demand innovative and adaptable procedures to maximize data content and quality while working within severely constrained budgetary and facility resource environments. This report describes the development of a high accuracy angular measurement capability for NASA Langley Research Center hypersonic wind tunnel facilities to meet these demands. Specifically, utilization of micro-electro-mechanical sensors, including accelerometers and gyros, coupled with software-driven data acquisition hardware, integrated within a prototype measurement system, is considered. The development methodology addresses basic design requirements formulated from wind tunnel facility constraints and current operating procedures, as well as engineering and scientific test objectives. A description of the analytical framework governing the relationships between time-dependent multi-axis acceleration and angular rate sensor data and the desired three-dimensional Eulerian angular state of the test model is given. Calibration procedures for identifying and estimating critical parameters in the sensor hardware are also addressed.

  1. Software technology testbed softpanel prototype

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The following subject areas are covered: analysis of using Ada for the development of real-time control systems for the Space Station; analysis of the functionality of the Application Generator; analysis of the User Support Environment criteria; analysis of the SSE tools and procedures which are to be used for the development of ground/flight software for the Space Station; analysis of the CBATS tutorial (an Ada tutorial package); analysis of Interleaf; analysis of the Integration, Test and Verification process of the Space Station; analysis of the DMS on-orbit flight architecture; and analysis of the simulation architecture.

  2. Parallel Processing with Digital Signal Processing Hardware and Software

    NASA Technical Reports Server (NTRS)

    Swenson, Cory V.

    1995-01-01

    The assembling and testing of a parallel processing system is described which will allow a user to move a Digital Signal Processing (DSP) application from the design stage to the execution/analysis stage through the use of several software tools and hardware devices. The system will be used to demonstrate the feasibility of the Algorithm To Architecture Mapping Model (ATAMM) dataflow paradigm for static multiprocessor solutions of DSP applications. The individual components comprising the system are described followed by the installation procedure, research topics, and initial program development.

  3. A soft computing based approach using modified selection strategy for feature reduction of medical systems.

    PubMed

    Zuhtuogullari, Kursat; Allahverdi, Novruz; Arikan, Nihat

    2013-01-01

    Systems with high-dimensional input spaces require long processing times and large memory usage. Most attribute selection algorithms suffer from input dimension limits and information storage problems. These problems are eliminated by the developed feature reduction software, which uses a new modified selection mechanism that adds solution candidates from the middle region. The hybrid system software was constructed to reduce the input attributes of systems with a large number of input variables. The software also supports the roulette wheel selection mechanism, and linear order crossover is used as the recombination operator. Locking into local solutions, a common problem of genetic algorithm based soft computing methods, is also eliminated by the developed software. Faster and more effective results are obtained in the test procedures. Twelve input variables of the urological system were reduced to reducts (reduced input attribute sets) with seven, six, and five elements. The obtained results show that the developed software with modified selection has advantages in memory allocation, execution time, classification accuracy, sensitivity, and specificity when compared with other reduction algorithms on the urological test data.
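
    Roulette wheel (fitness-proportionate) selection, which the software supports, can be sketched as follows. This is a generic textbook implementation with toy fitness values, not the authors' code:

```python
import numpy as np

rng = np.random.default_rng(7)

def roulette_wheel_select(fitness, n_select, rng=rng):
    """Roulette-wheel selection: each individual is chosen with
    probability fitness_i / sum(fitness)."""
    fitness = np.asarray(fitness, dtype=float)
    probs = fitness / fitness.sum()
    return rng.choice(len(fitness), size=n_select, replace=True, p=probs)

fitness = [1.0, 2.0, 3.0, 10.0]          # toy population fitnesses
picks = roulette_wheel_select(fitness, n_select=1000)

# The fittest individual (index 3) should dominate the mating pool
counts = np.bincount(picks, minlength=4)
print(counts / counts.sum())
```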

  4. A Soft Computing Based Approach Using Modified Selection Strategy for Feature Reduction of Medical Systems

    PubMed Central

    Zuhtuogullari, Kursat; Allahverdi, Novruz; Arikan, Nihat

    2013-01-01

    Systems with high-dimensional input spaces require long processing times and large memory usage. Most attribute selection algorithms suffer from input dimension limits and information storage problems. These problems are eliminated by the developed feature reduction software, which uses a new modified selection mechanism that adds solution candidates from the middle region. The hybrid system software was constructed to reduce the input attributes of systems with a large number of input variables. The software also supports the roulette wheel selection mechanism, and linear order crossover is used as the recombination operator. Locking into local solutions, a common problem of genetic algorithm based soft computing methods, is also eliminated by the developed software. Faster and more effective results are obtained in the test procedures. Twelve input variables of the urological system were reduced to reducts (reduced input attribute sets) with seven, six, and five elements. The obtained results show that the developed software with modified selection has advantages in memory allocation, execution time, classification accuracy, sensitivity, and specificity when compared with other reduction algorithms on the urological test data. PMID:23573172

  5. Experimental Applications of Automatic Test Markup Language (ATML)

    NASA Technical Reports Server (NTRS)

    Lansdowne, Chatwin A.; McCartney, Patrick; Gorringe, Chris

    2012-01-01

    The authors describe challenging use-cases for Automatic Test Markup Language (ATML) and evaluate solutions. The first case uses ATML Test Results to deliver active features to support test procedure development and test flow, and to bridge mixed software development environments. The second case examines adding attributes to the Systems Modeling Language (SysML) to create a linkage for deriving information from a model to fill in an ATML document set. Both cases are outside the original concept of operations for ATML but are typical when integrating large heterogeneous systems with modular contributions from multiple disciplines.

  6. PC based temporary shielding administrative procedure (TSAP)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olsen, D.E.; Pederson, G.E.; Hamby, P.N.

    1995-03-01

    A completely new administrative procedure for temporary shielding was developed for use at Commonwealth Edison's six nuclear stations. This procedure promotes the use of shielding, and addresses industry requirements for the use and control of temporary shielding. The importance of an effective procedure has increased since more temporary shielding is being used as ALARA goals become more ambitious. To help implement the administrative procedure, a personal computer software program was written to incorporate the procedural requirements. This software combines the usability of a Windows graphical user interface with extensive help and database features. This combination of a comprehensive administrative procedure and user-friendly software promotes the effective use and management of temporary shielding while ensuring that industry requirements are met.

  7. Generating DEM from LIDAR data - comparison of available software tools

    NASA Astrophysics Data System (ADS)

    Korzeniowska, K.; Lacka, M.

    2011-12-01

    In recent years many software tools and applications have appeared that offer procedures, scripts, and algorithms to process and visualize ALS data. This variety of software tools and of "point cloud" processing methods motivated the aim of this study: to assess the algorithms available in various software tools used to classify LIDAR "point cloud" data, through a careful examination of the Digital Elevation Models (DEMs) generated from LIDAR data with these algorithms. The work focused on the most important available software tools, both commercial and open source. Two sites in a mountain area, each covering 0.645 sq km, were selected for the study. DEMs generated with the analysed software tools were compared with a reference dataset generated using manual methods to eliminate non-ground points. The surfaces were analysed using raster analysis: minimum, maximum, and mean differences between the reference DEM and the DEMs generated with the analysed software tools were calculated, together with the Root Mean Square Error. Differences between DEMs were also examined visually using transects along the grid axes in the test sites.
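
    The raster comparison described (minimum, maximum, and mean differences plus RMSE against a reference DEM) reduces to a few array operations. A minimal sketch on a toy grid with an assumed constant 0.5 m bias:

```python
import numpy as np

def dem_difference_stats(reference, test):
    """Raster comparison of two DEMs on the same grid: minimum, maximum,
    and mean difference, plus Root Mean Square Error."""
    diff = test - reference
    rmse = np.sqrt(np.mean(diff ** 2))
    return diff.min(), diff.max(), diff.mean(), rmse

# Toy grid: flat reference surface and a test DEM with a 0.5 m bias
ref = np.zeros((100, 100))
test_dem = ref + 0.5
dmin, dmax, dmean, rmse = dem_difference_stats(ref, test_dem)
print(dmin, dmax, dmean, rmse)
```

For a pure bias like this, all four statistics coincide; with spatially varying errors the spread between the mean difference and the RMSE indicates how much of the error is random rather than systematic.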

  8. Detailed analysis of complex single molecule FRET data with the software MASH

    NASA Astrophysics Data System (ADS)

    Hadzic, Mélodie C. A. S.; Kowerko, Danny; Börner, Richard; Zelger-Paulus, Susann; Sigel, Roland K. O.

    2016-04-01

    The processing and analysis of surface-immobilized single molecule FRET (Förster resonance energy transfer) data follows systematic steps (e.g. single molecule localization, clearance of different sources of noise, selection of the conformational and kinetic model, etc.) that require solid knowledge of optics, photophysics, signal processing, and statistics. The present proceeding aims at standardizing and facilitating procedures for single molecule detection by guiding the reader through an optimization protocol for a particular experimental data set. Relevant features were determined from synthetically recreated single molecule movies (SMM) imaging Cy3- and Cy5-labeled Sc.ai5γ group II intron molecules, to test the performance of four different detection algorithms. Up to 120 different parameterizations per method were routinely evaluated to finally establish an optimum detection procedure. The present protocol is adaptable to any movie displaying surface-immobilized molecules, and can be easily reproduced with our home-written software MASH (multifunctional analysis software for heterogeneous data) and script routines (both available in the download section of www.chem.uzh.ch/rna).

  9. Second order tensor finite element

    NASA Technical Reports Server (NTRS)

    Oden, J. Tinsley; Fly, J.; Berry, C.; Tworzydlo, W.; Vadaketh, S.; Bass, J.

    1990-01-01

    The results of a research and software development effort are presented for the finite element modeling of the static and dynamic behavior of anisotropic materials, with emphasis on single crystal alloys. Various versions of two dimensional and three dimensional hybrid finite elements were implemented and compared with displacement-based elements. Both static and dynamic cases are considered. The hybrid elements developed in the project were incorporated into the SPAR finite element code. In an extension of the first phase of the project, optimization of experimental tests for anisotropic materials was addressed. In particular, the problems of calculating material properties from tensile tests and of calculating stresses from strain measurements were considered. For both cases, numerical procedures and software for the optimization of strain gauge and material axes orientation were developed.

  10. Low cost airborne microwave landing system receiver, task 3

    NASA Technical Reports Server (NTRS)

    Hager, J. B.; Vancleave, J. R.

    1979-01-01

    Work performed on the low cost airborne Microwave Landing System (MLS) receiver is summarized. A detailed description of the prototype low cost MLS receiver is presented. This detail includes block diagrams, schematics, board assembly drawings, photographs of subassemblies, mechanical construction, parts lists, and microprocessor software. Test procedures are described and results are presented.

  11. The microcomputer scientific software series 6: ECOPHYS user's manual.

    Treesearch

    George E. Host; H. Michael Rauscher; J. G. Isebrands; Donald I. Dickmann; Richard E. Dickson; Thomas R. Crow; D.A. Michael

    1990-01-01

    ECOPHYS is an ecophysiological whole-tree growth process model designed to simulate the growth of poplar in the establishment year. This microcomputer-based model may be used to test the influence of genetically determined physiological or morphological attributes on plant growth. This manual describes the installation, file structures, and operational procedures for...

  12. Space Telecommunications Radio System (STRS) Compliance Testing

    NASA Technical Reports Server (NTRS)

    Handler, Louis M.

    2011-01-01

    The Space Telecommunications Radio System (STRS) defines an open architecture for software defined radios. This document describes the testing methodology to aid in determining the degree of compliance to the STRS architecture. Non-compliances are reported to the software and hardware developers as well as the NASA project manager so that any non-compliances may be fixed or waivers issued. Since the software developers may be divided into those that provide the operating environment including the operating system and STRS infrastructure (OE) and those that supply the waveform applications, the tests are divided accordingly. The static tests are also divided by the availability of an automated tool that determines whether the source code and configuration files contain the appropriate items. Thus, there are six separate step-by-step test procedures described as well as the corresponding requirements that they test. The six types of STRS compliance tests are: STRS application automated testing, STRS infrastructure automated testing, STRS infrastructure testing by compiling WFCCN with the infrastructure, STRS configuration file testing, STRS application manual code testing, and STRS infrastructure manual code testing. Examples of the input and output of the scripts are shown in the appendices as well as more specific information about what to configure and test in WFCCN for non-compliance. In addition, each STRS requirement is listed and the type of testing briefly described. Attached is also a set of guidelines on what to look for in addition to the requirements to aid in the document review process.

  13. Micro-tensile testing system

    DOEpatents

    Wenski, Edward G [Lenexa, KS

    2007-08-21

    A micro-tensile testing system providing a stand-alone test platform for testing and reporting physical or engineering properties of test samples of materials having thicknesses of approximately between 0.002 inch and 0.030 inch, including, for example, LiGA engineered materials. The testing system is able to perform a variety of static, dynamic, and cyclic tests. The testing system includes a rigid frame and adjustable gripping supports to minimize measurement errors due to deflection or bending under load; serrated grips for securing the extremely small test sample; high-speed laser scan micrometers for obtaining accurate results; and test software for controlling the testing procedure and reporting results.

  14. Micro-tensile testing system

    DOEpatents

    Wenski, Edward G.

    2006-01-10

    A micro-tensile testing system providing a stand-alone test platform for testing and reporting physical or engineering properties of test samples of materials having thicknesses between approximately 0.002 inch and 0.030 inch, including, for example, LiGA engineered materials. The testing system is able to perform a variety of static, dynamic, and cyclic tests. The testing system includes a rigid frame and adjustable gripping supports to minimize measurement errors due to deflection or bending under load; serrated grips for securing the extremely small test sample; high-speed laser scan micrometers for obtaining accurate results; and test software for controlling the testing procedure and reporting results.

  15. Micro-tensile testing system

    DOEpatents

    Wenski, Edward G [Lenexa, KS]

    2007-07-17

    A micro-tensile testing system providing a stand-alone test platform for testing and reporting physical or engineering properties of test samples of materials having thicknesses between approximately 0.002 inch and 0.030 inch, including, for example, LiGA engineered materials. The testing system is able to perform a variety of static, dynamic, and cyclic tests. The testing system includes a rigid frame and adjustable gripping supports to minimize measurement errors due to deflection or bending under load; serrated grips for securing the extremely small test sample; high-speed laser scan micrometers for obtaining accurate results; and test software for controlling the testing procedure and reporting results.

  16. Estimation of sample size and testing power (part 5).

    PubMed

    Hu, Liang-ping; Bao, Xiao-lei; Guan, Xue; Zhou, Shi-guo

    2012-02-01

    Estimation of sample size and testing power is an important component of research design. This article introduced methods for estimating the sample size and testing power of difference tests for quantitative and qualitative data under the single-group, paired and crossover designs. Specifically, it presented the corresponding formulas for each of the three designs, their realization through both direct calculation and the POWER procedure of SAS software, and worked examples, which will help researchers implement the repetition principle.
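    As a minimal illustration of such a formula, a two-sided paired-design sample-size calculation can be sketched with the large-sample normal approximation (the article and SAS's POWER procedure use more exact noncentral-t methods; this is only the z approximation):

```python
from math import ceil
from statistics import NormalDist

def paired_sample_size(delta, sd_diff, alpha=0.05, power=0.80):
    """Pairs needed to detect a mean within-pair difference delta, given the
    standard deviation of the differences (two-sided z approximation)."""
    z = NormalDist().inv_cdf
    n = ((z(1 - alpha / 2) + z(power)) * sd_diff / delta) ** 2
    return ceil(n)

print(paired_sample_size(delta=0.5, sd_diff=1.0))  # 32
```

For a standardized effect of 0.5 at alpha = 0.05 and 80% power this gives the familiar figure of 32 pairs.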

  17. System Engineering Strategy for Distributed Multi-Purpose Simulation Architectures

    NASA Technical Reports Server (NTRS)

    Bhula, Dlilpkumar; Kurt, Cindy Marie; Luty, Roger

    2007-01-01

    This paper describes the system engineering approach used to develop distributed multi-purpose simulations. The multi-purpose simulation architecture focuses on user needs, operations, flexibility, cost and maintenance. This approach was used to develop an International Space Station (ISS) simulator, called the International Space Station Integrated Simulation (ISIS). The ISIS runs unmodified ISS flight software, system models, and the astronaut command and control interface in an open system design that allows for rapid integration of multiple ISS models. The initial intent of ISIS was to provide a distributed system that allows access to ISS flight software and models for the creation, testing, and validation of crew and ground controller procedures. This capability reduces the cost and scheduling issues associated with using standalone simulators in fixed locations, and facilitates discovering unknowns and errors earlier in the development lifecycle. Since its inception, the flexible architecture of the ISIS has allowed its purpose to evolve to include ground operator system and display training, flight software modification testing, and service as a realistic test bed for Exploration automation technology research and development.

  18. Validation of the Simbionix PROcedure Rehearsal Studio sizing module: A comparison of software for endovascular aneurysm repair sizing and planning.

    PubMed

    Velu, Juliëtte F; Groot Jebbink, Erik; de Vries, Jean-Paul P M; Slump, Cornelis H; Geelkerken, Robert H

    2017-02-01

    An important determinant of successful endovascular aortic aneurysm repair is proper sizing of the dimensions of the aortic-iliac vessels. The goal of the present study was to determine the concurrent validity, a method for comparison of test scores, for EVAR sizing and planning of the recently introduced Simbionix PROcedure Rehearsal Studio (PRORS). Seven vascular specialists analyzed anonymized computed tomography angiography scans of 70 patients with an infrarenal aneurysm of the abdominal aorta, using three different sizing software packages: Simbionix PRORS (Simbionix USA Corp., Cleveland, OH, USA), 3mensio (Pie Medical Imaging BV, Maastricht, The Netherlands), and TeraRecon (Aquarius, Foster City, CA, USA). The following measurements were included in the protocol: diameter 1 mm below the most distal main renal artery, diameter 15 mm below the lowest renal artery, maximum aneurysm diameter, and length from the most distal renal artery to the left iliac artery bifurcation. Averaged over the locations, the intraclass correlation coefficient was 0.83 for Simbionix versus 3mensio, 0.81 for Simbionix versus TeraRecon, and 0.86 for 3mensio versus TeraRecon. It can be concluded that the Simbionix sizing software is as precise as the two other validated and commercially available software packages.

  19. Software Testing for Evolutionary Iterative Rapid Prototyping

    DTIC Science & Technology

    1990-12-01


  20. Randomized comparative evaluation of endoscopic submucosal dissection self-learning software in France and Japan.

    PubMed

    Pioche, Mathieu; Rivory, Jérôme; Nishizawa, Toshihiro; Uraoka, Toshio; Touzet, Sandrine; O'Brien, Marc; Saurin, Jean-Christophe; Ponchon, Thierry; Denis, Angélique; Yahagi, Naohisa

    2016-12-01

    Background and study aim: Endoscopic submucosal dissection (ESD) is currently the reference method to achieve an en bloc resection of large lesions; however, the technique is difficult and risky, with a long learning curve. In order to reduce the morbidity, training courses that use animal models are recommended. Recently, self-learning software has been developed to assist students in their training. The aim of this study was to evaluate the impact of this tool on the ESD learning curve. Methods: A prospective, randomized, comparative study enrolled 39 students who were experienced in interventional endoscopy. Each student was randomized to one of two groups and performed 30 ESDs of 30 mm standardized lesions in a bovine colon model. The software group used the self-learning software whereas the control group only observed an ESD procedure video. The primary outcome was the rate of successful ESD procedures, defined as complete en bloc resection without any perforation, performed in less than 75 minutes. Results: A total of 39 students performed 1170 ESDs. Success was achieved in 404 procedures (70.9%) in the software group and 367 (61.2%) in the control group (P = 0.03). Among the successful procedures, there were no significant differences between the software and control groups in terms of perforation rate (22 [4.0%] vs. 29 [5.1%], respectively; P = 0.27) and mean (SD) procedure duration (34.1 [13.4] vs. 32.3 [14.0] minutes, respectively; P = 0.52). For the 30th procedure, the rate of complete resection was superior in the software group (84.2%) compared with the control group (50.0%; P = 0.01). Conclusion: ESD self-learning software was effective in improving the quality of resection compared with a standard teaching method using procedure videos. This result suggests the benefit of incorporating such software into teaching programs. © Georg Thieme Verlag KG Stuttgart · New York.

  1. Automatic sample changer control software for automation of neutron activation analysis process in Malaysian Nuclear Agency

    NASA Astrophysics Data System (ADS)

    Yussup, N.; Ibrahim, M. M.; Rahman, N. A. A.; Mokhtar, M.; Salim, N. A. A.; Soh@Shaari, S. C.; Azman, A.; Lombigit, L.; Azman, A.; Omar, S. A.

    2018-01-01

    Most of the procedures in the neutron activation analysis (NAA) process established at the Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s have been performed manually. These manual procedures, carried out by the NAA laboratory personnel, are time consuming and inefficient, especially for the sample counting and measurement process: the sample needs to be changed and the measurement software set up for every one-hour counting time, and both steps are performed manually for every sample. Hence, an automatic sample changer (ASC) system consisting of hardware and software was developed to automate the counting of up to 30 samples consecutively. This paper describes the ASC control software for the NAA process, which is designed and developed to control the ASC hardware and to call the GammaVision software for sample measurement. The software is developed using the National Instruments LabVIEW development package.

  2. NASA UAS Traffic Management National Campaign Operations across Six UAS Test Sites

    NASA Technical Reports Server (NTRS)

    Rios, Joseph; Mulfinger, Daniel; Homola, Jeff; Venkatesan, Priya

    2016-01-01

    NASA's Unmanned Aircraft Systems Traffic Management research aims to develop policies, procedures, requirements, and other artifacts to inform the implementation of a future system that enables small drones to access the low-altitude airspace. In this endeavor, NASA conducted a geographically diverse flight test in conjunction with the FAA's six unmanned aircraft systems Test Sites. A control center at NASA Ames Research Center autonomously managed the airspace for all participants in eight states as they flew operations (both real and simulated). The system allowed for common situational awareness across all stakeholders, kept traffic procedurally separated, and offered messages informing the participants of activity relevant to their operations. Over the 3-hour test, 102 flight operations connected to the central research platform, with 17 different vehicle types and 8 distinct software client implementations, while seamlessly interacting with simulated traffic.

  3. A Procedure for High Resolution Satellite Imagery Quality Assessment

    PubMed Central

    Crespi, Mattia; De Vendictis, Laura

    2009-01-01

    Data products generated from High Resolution Satellite Imagery (HRSI) are routinely evaluated during the so-called in-orbit test period, in order to verify whether their quality fits the desired features and, if necessary, to obtain the image correction parameters to be used at the ground processing center. Nevertheless, it is often useful to have tools to evaluate image quality at the final user level as well. Image quality is defined by several parameters, such as the radiometric resolution and its accuracy, represented by the noise level, and the geometric resolution and sharpness, described by the Modulation Transfer Function (MTF). This paper proposes a procedure to evaluate these image quality parameters; the procedure was implemented in suitable software and tested on high resolution imagery acquired by the QuickBird, WorldView-1 and Cartosat-1 satellites. PMID:22412312

  4. UrQt: an efficient software for the Unsupervised Quality trimming of NGS data.

    PubMed

    Modolo, Laurent; Lerat, Emmanuelle

    2015-04-29

    Quality control is a necessary step of any Next Generation Sequencing analysis. Although customary, this step still requires manual intervention to empirically choose tuning parameters according to various quality statistics. Moreover, current quality control procedures that provide a "good quality" data set are not optimal and discard many informative nucleotides. To address these drawbacks, we present a new quality control method, implemented in the UrQt software, for Unsupervised Quality trimming of Next Generation Sequencing reads. Our trimming procedure relies on a well-defined probabilistic framework to detect the best segmentation between two segments of unreliable nucleotides framing a segment of informative nucleotides. Our software only requires one user-friendly parameter to define the minimal quality threshold (phred score) for a nucleotide to be considered informative, and this threshold is independent of both the experiment and the quality of the data. The procedure is implemented in C++ in an efficient and parallelized software tool with a low memory footprint. We tested the performance of UrQt against the best-known trimming programs on seven RNA and DNA sequencing experiments and demonstrated its optimality in the resulting tradeoff between the number of trimmed nucleotides and the quality objective. By finding the best segmentation to delimit a segment of good quality nucleotides, UrQt greatly increases the number of reads and of nucleotides that can be retained for a given quality objective. UrQt source files, binary executables for different operating systems and documentation are freely available (under the GPLv3) at the following address: https://lbbe.univ-lyon1.fr/-UrQt-.html.
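    UrQt's segmentation rests on a probabilistic model; a greatly simplified stand-in for the same idea is to keep the contiguous run of nucleotides that maximizes the summed excess of phred quality over the threshold (Kadane's maximum-subarray algorithm). A sketch, not the authors' algorithm:

```python
def trim_read(quals, threshold=20):
    """Return (start, end) of the kept segment (end exclusive): the contiguous
    run maximizing the summed excess of phred quality over the threshold."""
    best_score, best = 0, (0, 0)
    score, start = 0, 0
    for i, q in enumerate(quals):
        score += q - threshold
        if score > best_score:
            best_score, best = score, (start, i + 1)
        if score < 0:           # segment no longer worth extending
            score, start = 0, i + 1
    return best

print(trim_read([5, 30, 30, 30, 5]))  # (1, 4): both low-quality ends trimmed
```

Unlike fixed-window trimmers, this keeps an occasional low-quality base inside an otherwise good stretch, which is the behaviour the abstract credits for retaining more informative nucleotides.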

  5. MATLAB-implemented estimation procedure for model-based assessment of hepatic insulin degradation from standard intravenous glucose tolerance test data.

    PubMed

    Di Nardo, Francesco; Mengoni, Michele; Morettini, Micaela

    2013-05-01

    The present study provides a novel MATLAB-based parameter estimation procedure for the individual assessment of the hepatic insulin degradation (HID) process from standard frequently-sampled intravenous glucose tolerance test (FSIGTT) data. Direct access to the source code, offered by MATLAB, enabled us to design an optimization procedure based on the alternating use of Gauss-Newton's and Levenberg-Marquardt's algorithms, which assures the full convergence of the process and contains the computational time. Reliability was tested by direct comparison with the well-known kinetic analysis software package SAAM II, applied in eighteen non-diabetic subjects, and by application to different data. Agreement between MATLAB and SAAM II was supported by intraclass correlation coefficients ≥0.73, by the absence of significant differences between corresponding mean parameter estimates and predictions of HID rate, and by consistent residual analysis. Moreover, the MATLAB optimization procedure yielded a significant 51% reduction in CV% for the parameter worst estimated by SAAM II while maintaining all model-parameter CV% <20%. In conclusion, our MATLAB-based procedure is suggested as a suitable tool for the individual assessment of the HID process. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
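    The alternation between Gauss-Newton and Levenberg-Marquardt behaviour can be mimicked with the classic damping parameter, which interpolates between the two. A self-contained sketch in Python (not the authors' MATLAB code), fitting a toy two-parameter exponential model:

```python
import math

def levenberg_marquardt(residual, p0, data, iters=200):
    """Fit a two-parameter model by damped least squares. The damping lam
    moves each step between Gauss-Newton-like (small lam) and
    gradient-descent-like (large lam) behaviour."""
    p, lam, h = list(p0), 1e-3, 1e-6

    def cost(q):
        return sum(residual(q, x, y) ** 2 for x, y in data)

    for _ in range(iters):
        r = [residual(p, x, y) for x, y in data]
        J = []                                    # forward-difference Jacobian
        for k, (x, y) in enumerate(data):
            row = []
            for j in range(2):
                q = list(p)
                q[j] += h
                row.append((residual(q, x, y) - r[k]) / h)
            J.append(row)
        n = len(data)
        JTJ = [[sum(J[k][i] * J[k][j] for k in range(n)) for j in range(2)]
               for i in range(2)]
        JTr = [sum(J[k][i] * r[k] for k in range(n)) for i in range(2)]
        # solve (JTJ + lam*I) dp = -JTr by Cramer's rule (2x2 system)
        A = [[JTJ[0][0] + lam, JTJ[0][1]], [JTJ[1][0], JTJ[1][1] + lam]]
        det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
        dp = [(-JTr[0] * A[1][1] + JTr[1] * A[0][1]) / det,
              (-JTr[1] * A[0][0] + JTr[0] * A[1][0]) / det]
        trial = [p[0] + dp[0], p[1] + dp[1]]
        if cost(trial) < cost(p):
            p, lam = trial, lam / 2               # accept: toward Gauss-Newton
        else:
            lam *= 2                              # reject: toward gradient descent
    return p

# recover a = 2, b = 0.5 from exact data for y = a * exp(-b * x)
data = [(x, 2 * math.exp(-0.5 * x)) for x in range(5)]
def res(p, x, y):
    return p[0] * math.exp(-p[1] * x) - y
a, b = levenberg_marquardt(res, [1.0, 1.0], data)
```

The accept/reject rule is what gives the method the convergence robustness the abstract attributes to alternating the two algorithms: rejected steps simply raise the damping and retry.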

  6. The Muon Ionization Cooling Experiment User Software

    NASA Astrophysics Data System (ADS)

    Dobbs, A.; Rajaram, D.; MICE Collaboration

    2017-10-01

    The Muon Ionization Cooling Experiment (MICE) is a proof-of-principle experiment designed to demonstrate muon ionization cooling for the first time. MICE is currently on Step IV of its data-taking programme, in which transverse emittance reduction will be demonstrated. The MICE Analysis User Software (MAUS) is the reconstruction, simulation and analysis framework for the MICE experiment. MAUS is used both for offline data analysis and for fast online data reconstruction and visualization to serve MICE data taking. This paper provides an introduction to MAUS, describing the central Python and C++ based framework, the data structure, and the code management and testing procedures.

  7. The SPORT-NMR Software: A Tool for Determining Relaxation Times in Unresolved NMR Spectra

    NASA Astrophysics Data System (ADS)

    Geppi, Marco; Forte, Claudia

    1999-03-01

    A software package which allows the correct determination of individual relaxation times for all the nonequivalent nuclei in poorly resolved NMR spectra is described. The procedure used, based on the fitting of each spectrum in the series recorded in the relaxation experiment, should improve the analysis of relaxation data in terms of quantitative dynamic information, especially in anisotropic phases. Tests on simulated data and experimental examples concerning ¹H and ¹³C T1ρ measurements in a solid copolymer and ²H T1Z and T1Q measurements in a liquid crystal are shown and discussed.

  8. A Cloud-based, Open-Source, Command-and-Control Software Paradigm for Space Situational Awareness (SSA)

    NASA Astrophysics Data System (ADS)

    Melton, R.; Thomas, J.

    With the rapid growth in the number of space actors, there has been a marked increase in the complexity and diversity of software systems utilized to support SSA target tracking, indication, warning, and collision avoidance. Historically, most SSA software has been constructed with "closed" proprietary code, which limits interoperability, inhibits the code transparency that some SSA customers need to develop domain expertise, and prevents the rapid injection of innovative concepts into these systems. Open-source aerospace software, a rapidly emerging, alternative trend in code development, is based on open collaboration, which has the potential to bring greater transparency, interoperability, flexibility, and reduced development costs. Open-source software is easily adaptable, geared to rapidly changing mission needs, and can generally be delivered at lower costs to meet mission requirements. This paper outlines Ball's COSMOS C2 system, a fully open-source, web-enabled, command-and-control software architecture which provides several unique capabilities to move the current legacy SSA software paradigm to an open-source model that effectively enables pre- and post-launch asset command and control. Among the unique characteristics of COSMOS is the ease with which it can integrate with diverse hardware. This characteristic enables COSMOS to serve as the command-and-control platform for the full life-cycle development of SSA assets, from board test, to box test, to system integration and test, to on-orbit operations. The use of a modern scripting language, Ruby, also permits automated procedures to provide highly complex decision making for the tasking of SSA assets based on both telemetry data and data received from outside sources. Detailed logging enables quick anomaly detection and resolution. Integrated real-time and offline data graphing renders the visualization of both ground and on-orbit assets simple and straightforward.

  9. Virtually Out of This World!

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Ames Research Center granted Reality Capture Technologies (RCT), Inc., a license to further develop NASA's Mars Map software platform. The company incorporated NASA's innovation into software that uses the Virtual Plant Model (VPM)(TM) to structure, modify, and implement the construction sites of industrial facilities, as well as to develop, validate, and train operators on procedures. The VPM orchestrates the exchange of information between engineering, production, and business transaction systems. This enables users to simulate, control, and optimize work processes while increasing the reliability of critical business decisions. Engineers can complete the construction process and test various aspects of it in virtual reality before building the actual structure. With virtual access to and simulation of the construction site, project personnel can manage, access, control, and respond to changes on complex constructions more effectively. Engineers can also create operating procedures, training, and documentation. Virtual Plant Model(TM) is a trademark of Reality Capture Technologies, Inc.

  10. Design and implementation of Ada programs to facilitate automated testing

    NASA Technical Reports Server (NTRS)

    Dean, Jack; Fox, Barry; Oropeza, Michael

    1991-01-01

    An automated method used to test the software components of COMPASS, an interactive computer-aided scheduling system, is presented. Each package of this system introduces a private type and provides routines to construct instances of that type, along with read and write routines for that type. Generic procedures that can generate test drivers for these functions are given, showing how the test drivers can read from a test data file the functions to call, the arguments for those functions, the anticipated result, and whether an exception should be raised for the function given the arguments.
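    In Python rather than Ada, a minimal data-driven test driver in the same spirit; the record format and the functions under test here are invented for illustration:

```python
import io

# Invented record format: function;comma-separated args;expected;raises
TEST_DATA = """\
add;2,3;5;no
div;6,2;3;no
div;1,0;;yes
"""

FUNCS = {"add": lambda a, b: a + b, "div": lambda a, b: a // b}

def run_driver(stream):
    """Run every record and return the number of passing cases. Each record
    names the function to call, its arguments, the anticipated result, and
    whether an exception is anticipated."""
    passed = 0
    for line in stream:
        name, args, expected, raises = line.strip().split(";")
        argv = [int(a) for a in args.split(",")]
        try:
            result = FUNCS[name](*argv)
            ok = raises == "no" and str(result) == expected
        except Exception:
            ok = raises == "yes"
        passed += ok
    return passed

print(run_driver(io.StringIO(TEST_DATA)))  # 3
```

As in the COMPASS approach, adding a test case means adding a data record, not writing new driver code.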

  11. Noninvasive Test Detects Cardiovascular Disease

    NASA Technical Reports Server (NTRS)

    2007-01-01

    At NASA's Jet Propulsion Laboratory (JPL), NASA-developed Video Imaging Communication and Retrieval (VICAR) software laid the groundwork for analyzing images of all kinds. A project seeking to use imaging technology for health care diagnosis began when the imaging team considered using the VICAR software to analyze X-ray images of soft tissue. With marginal success using X-rays, the team applied the same methodology to ultrasound imagery, which was already digitally formatted. The new approach proved successful for assessing amounts of plaque build-up and arterial wall thickness, direct predictors of heart disease, and the result was a noninvasive diagnostic system with the ability to accurately predict heart health. Medical Technologies International Inc. (MTI) further developed the technology and then submitted it to a rigorous review process at the FDA, which cleared the software for public use. The software, patented under the name Prowin, is being used in MTI's patented ArterioVision, a carotid intima-media thickness (CIMT) test that uses ultrasound image-capturing and analysis software to noninvasively identify the risk for the major cause of heart attacks and strokes: atherosclerosis. ArterioVision provides a direct measurement of atherosclerosis by safely and painlessly measuring the thickness of the first two layers of the carotid artery wall using an ultrasound procedure and advanced image-analysis software. The technology is now in use in all 50 states and in many countries throughout the world.

  12. Software for computerised analysis of cardiotocographic traces.

    PubMed

    Romano, M; Bifulco, P; Ruffo, M; Improta, G; Clemente, F; Cesarelli, M

    2016-02-01

    Despite the widespread use of cardiotocography in foetal monitoring, the evaluation of foetal status suffers from considerable inter- and intra-observer variability. In order to overcome the main limitations of visual cardiotocographic assessment, computerised methods to analyse cardiotocographic recordings have recently been developed. In this study, new software for the automated analysis of foetal heart rate is presented. It provides an automatic procedure for measuring the most relevant parameters derivable from cardiotocographic traces. Simulated and real cardiotocographic traces were analysed to test software reliability. In the artificial traces, we simulated a set number of events (accelerations, decelerations and contractions) to be recognised. In the case of real signals, the results of the computerised analysis were instead compared with the visual assessment performed by 18 expert clinicians, and three performance indexes were computed to characterise the performance of the proposed software. The software showed preliminary performance that we judged satisfactory, in that the results completely matched the requirements: in the tests on artificial signals, all simulated events were detected by the software. The performance indexes computed against the obstetricians' evaluations are, by contrast, less satisfactory, yielding a sensitivity of 93%, a positive predictive value of 82% and an accuracy of 77%. Very probably this arises from the high variability of the trace annotation carried out by the clinicians. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
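    The three performance indexes reported above follow directly from the confusion-matrix counts. A small helper (the counts in the example are hypothetical, not the study's data):

```python
def performance_indexes(tp, fp, fn, tn):
    """Sensitivity, positive predictive value and accuracy from the counts of
    true/false positives and true/false negatives."""
    sensitivity = tp / (tp + fn)
    ppv = tp / (tp + fp)
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return sensitivity, ppv, accuracy

# hypothetical counts for illustration only
print(performance_indexes(90, 10, 10, 50))  # (0.9, 0.9, 0.875)
```

Note that accuracy uses true negatives while sensitivity and PPV do not, which is why the three values can diverge as they do in the abstract.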

  13. Features of the Upgraded Imaging for Hypersonic Experimental Aeroheating Testing (IHEAT) Software

    NASA Technical Reports Server (NTRS)

    Mason, Michelle L.; Rufer, Shann J.

    2016-01-01

    The Imaging for Hypersonic Experimental Aeroheating Testing (IHEAT) software is used at the NASA Langley Research Center to analyze global aeroheating data on wind tunnel models tested in the Langley Aerothermodynamics Laboratory. One-dimensional, semi-infinite heating data derived from IHEAT are used in the design of thermal protection systems for hypersonic vehicles that are exposed to severe aeroheating loads, such as reentry vehicles during descent and landing procedures. This software program originally was written in the PV-WAVE® programming language to analyze phosphor thermography data from the two-color, relative-intensity system developed at Langley. To increase the efficiency, functionality, and reliability of IHEAT, the program was migrated to MATLAB® syntax and compiled as a stand-alone executable file labeled version 4.0. New features of IHEAT 4.0 include the options to perform diagnostic checks of the accuracy of the acquired data during a wind tunnel test, to extract data along a specified multi-segment line following a feature such as a leading edge or a streamline, and to batch process all of the temporal frame data from a wind tunnel run. Results from IHEAT 4.0 were compared on a pixel level to the output images from the legacy software to validate the program. The absolute differences between the heat transfer data output from the two programs were on the order of 10⁻⁵ to 10⁻⁷. IHEAT 4.0 replaces the PV-WAVE® version as the production software for aeroheating experiments conducted in the hypersonic facilities at NASA Langley.

  14. Waste retrieval sluicing system data acquisition system acceptance test report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bevins, R.R.

    1998-07-31

    This document describes the test procedure for the Project W-320 Tank C-106 Sluicing Data Acquisition System (W-320 DAS). The Software Test portion will test items identified in the WRSS DAS System Description (SD), HNF-2115. Traceability to HNF-2115 will be via a reference that follows in parentheses after each test section title. The Field Test portion will test sensor operability, analog-to-digital conversion, and alarm setpoints for field instrumentation. The W-320 DAS supplies data to assist thermal modeling of tanks 241-C-106 and 241-AY-102. It is designed to be a central repository for information from sources that would otherwise have to be read, recorded, and integrated manually. Thus, completion of the DAS requires communication with several different data collection devices and output to usable PC data formats. This test procedure will demonstrate that the DAS functions as required by the project requirements stated in Section 3 of the W-320 DAS System Description, HNF-2115.

  15. Software Selection: A Primer on Source and Evaluation.

    ERIC Educational Resources Information Center

    Burston, Jack

    2003-01-01

    Provides guidance on making decisions regarding the selection of foreign language instructional software. Identifies sources of foreign language software, indicates sources of foreign language software reviews, and outlines essential procedures of software evaluation. (Author/VWL)

  16. Artificial Neural Networks-Based Software for Measuring Heat Collection Rate and Heat Loss Coefficient of Water-in-Glass Evacuated Tube Solar Water Heaters

    PubMed Central

    Liu, Zhijian; Liu, Kejun; Li, Hao; Zhang, Xinyu; Jin, Guangya; Cheng, Kewei

    2015-01-01

    Measurements of heat collection rate and heat loss coefficient are crucial for the evaluation of in-service water-in-glass evacuated tube solar water heaters. However, conventional measurement requires expensive detection devices and a series of complicated procedures. To simplify the measurement and reduce the cost, software based on artificial neural networks for measuring the heat collection rate and heat loss coefficient of water-in-glass evacuated tube solar water heaters was developed. Using multilayer feed-forward neural networks with the back-propagation algorithm, we developed and tested our program on the basis of 915 measured samples of water-in-glass evacuated tube solar water heaters. This artificial neural networks-based software program automatically obtained accurate heat collection rate and heat loss coefficient using parameters acquired simply with "portable test instruments", including tube length, number of tubes, tube center distance, hot water mass in tank, collector area, angle between tubes and ground, and final temperature. Our results show that this software (on both personal computer and Android platforms) is efficient and convenient for predicting the heat collection rate and heat loss coefficient, owing to its low root mean square errors in prediction. The software can now be downloaded from http://t.cn/RLPKF08. PMID:26624613
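    Once trained, such a network maps the listed input parameters to a prediction through ordinary feed-forward evaluation. A single-hidden-layer forward pass can be sketched as follows (the weights below are arbitrary placeholders, not the trained model):

```python
import math

def forward(x, W1, b1, W2, b2):
    """One forward pass: hidden layer with tanh activation, linear output.
    W1 is a list of hidden-unit weight rows, b1 the hidden biases;
    W2 and b2 form the linear output layer."""
    h = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
         for row, b in zip(W1, b1)]
    return sum(w * hi for w, hi in zip(W2, h)) + b2

# placeholder weights: two inputs, two hidden units, one scalar output
print(forward([0.0, 0.0], [[1, 0], [0, 1]], [0, 0], [1, 1], 0.0))  # 0.0
```

In the software described above, back-propagation adjusts W1, b1, W2 and b2 against the 915 measured samples; only this cheap forward pass runs at prediction time, which is what makes an Android port practical.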

  17. Artificial Neural Networks-Based Software for Measuring Heat Collection Rate and Heat Loss Coefficient of Water-in-Glass Evacuated Tube Solar Water Heaters.

    PubMed

    Liu, Zhijian; Liu, Kejun; Li, Hao; Zhang, Xinyu; Jin, Guangya; Cheng, Kewei

    2015-01-01

    Measurements of heat collection rate and heat loss coefficient are crucial for the evaluation of in-service water-in-glass evacuated tube solar water heaters. However, conventional measurement requires expensive detection devices and a series of complicated procedures. To simplify the measurement and reduce the cost, software based on artificial neural networks for measuring the heat collection rate and heat loss coefficient of water-in-glass evacuated tube solar water heaters was developed. Using multilayer feed-forward neural networks with the back-propagation algorithm, we developed and tested our program on the basis of 915 measured samples of water-in-glass evacuated tube solar water heaters. This artificial neural networks-based software program automatically obtained accurate heat collection rate and heat loss coefficient using parameters acquired simply with "portable test instruments", including tube length, number of tubes, tube center distance, hot water mass in tank, collector area, angle between tubes and ground, and final temperature. Our results show that this software (on both personal computer and Android platforms) is efficient and convenient for predicting the heat collection rate and heat loss coefficient, owing to its low root mean square errors in prediction. The software can now be downloaded from http://t.cn/RLPKF08.

  18. A guide to onboard checkout. Volume 1: Guidance, navigation and control

    NASA Technical Reports Server (NTRS)

    1971-01-01

    The results are presented of a study of onboard checkout techniques, as they relate to space station subsystems, as a guide to those who may need to implement onboard checkout in similar subsystems. Guidance, navigation, and control subsystems, and their reliability and failure analyses are presented. Software and testing procedures are also given.

  19. Getting expert systems off the ground: Lessons learned from integrating model-based diagnostics with prototype flight hardware

    NASA Technical Reports Server (NTRS)

    Stephan, Amy; Erikson, Carol A.

    1991-01-01

    As an initial attempt to introduce expert system technology into an onboard environment, a model-based diagnostic system using the TRW MARPLE software tool was integrated with prototype flight hardware and its corresponding control software. Because this experiment was designed primarily to test the effectiveness of the model-based reasoning technique used, the expert system ran on a separate hardware platform, and interactions between the control software and the model-based diagnostics were limited. While this project met its objective of showing that model-based reasoning can effectively isolate failures in flight hardware, it also identified the need for an integrated development path for expert system and control software for onboard applications. To develop expert systems that are ready for flight, developers must evaluate artificial intelligence techniques to determine whether they offer a real advantage onboard, identify which diagnostic functions should be performed by the expert system and which are better left to the procedural software, and work closely with both the hardware and the software developers from the beginning of a project to produce a well designed and thoroughly integrated application.

  20. An effective automatic procedure for testing parameter identifiability of HIV/AIDS models.

    PubMed

    Saccomani, Maria Pia

    2011-08-01

    Realistic HIV models tend to be rather complex and many recent models proposed in the literature could not yet be analyzed by traditional identifiability testing techniques. In this paper, we check a priori global identifiability of some of these nonlinear HIV models taken from the recent literature, by using a differential algebra algorithm based on previous work of the author. The algorithm is implemented in a software tool, called DAISY (Differential Algebra for Identifiability of SYstems), which has been recently released (DAISY is freely available on the web site http://www.dei.unipd.it/~pia/ ). The software can be used to automatically check global identifiability of (linear and) nonlinear models described by polynomial or rational differential equations, thus providing a general and reliable tool to test global identifiability of several HIV models proposed in the literature. It can be used by researchers with a minimum of mathematical background.

  1. APEX/SPIN: a free test platform to measure speech intelligibility.

    PubMed

    Francart, Tom; Hofmann, Michael; Vanthornhout, Jonas; Van Deun, Lieselot; van Wieringen, Astrid; Wouters, Jan

    2017-02-01

    Measuring speech intelligibility in quiet and noise is important in clinical practice and research. An easy-to-use free software platform for conducting speech tests, called APEX/SPIN, is presented. The APEX/SPIN platform allows the use of any speech material in combination with any noise. A graphical user interface provides control over a large range of parameters, such as the number of loudspeakers, the signal-to-noise ratio, and parameters of the procedure. An easy-to-use graphical interface is provided for calibration and storage of calibration values. To validate the platform, perception of words in quiet and sentences in noise was measured both with APEX/SPIN and with an audiometer and CD player, which is a conventional setup in current clinical practice. Five normal-hearing listeners participated in the experimental evaluation. Speech perception results were similar for the APEX/SPIN platform and conventional procedures. APEX/SPIN is a freely available, open-source platform that allows the administration of all kinds of custom speech perception tests and procedures.

  2. Using 3D printed models for planning and guidance during endovascular intervention: a technical advance.

    PubMed

    Itagaki, Michael W

    2015-01-01

    Three-dimensional (3D) printing applications in medicine have been limited due to high cost and technical difficulty of creating 3D printed objects. It is not known whether patient-specific, hollow, small-caliber vascular models can be manufactured with 3D printing, and used for small vessel endoluminal testing of devices. Manufacture of anatomically accurate, patient-specific, small-caliber arterial models was attempted using data from a patient's CT scan, free open-source software, and low-cost Internet 3D printing services. Prior to endovascular treatment of a patient with multiple splenic artery aneurysms, a 3D printed model was used preoperatively to test catheter equipment and practice the procedure. A second model was used intraoperatively as a reference. Full-scale plastic models were successfully produced. Testing determined the optimal puncture site for catheter positioning. A guide catheter, base catheter, and microcatheter combination selected during testing was used intraoperatively with success, and the need for repeat angiograms to optimize image orientation was minimized. A difficult and unconventional procedure was successful in treating the aneurysms while preserving splenic function. We conclude that creation of small-caliber vascular models with 3D printing is possible. Free software and low-cost printing services make creation of these models affordable and practical. Models are useful in preoperative planning and intraoperative guidance.

  3. Inductive knowledge acquisition experience with commercial tools for space shuttle main engine testing

    NASA Technical Reports Server (NTRS)

    Modesitt, Kenneth L.

    1990-01-01

    Since 1984, an effort has been underway at Rocketdyne, manufacturer of the Space Shuttle Main Engine (SSME), to automate much of the analysis procedure conducted after engine test firings. Previously published articles at national and international conferences have presented the context of and justification for this effort. Here, progress is reported on building the full system, known as Scotty, including extensions that integrate large databases with the system. Inductive knowledge acquisition has proven to be a key factor in Scotty's success. The combination of a powerful inductive expert system building tool (ExTran), a relational database management system (Reliance), and software engineering principles and Computer-Assisted Software Engineering (CASE) tools makes for a practical, useful, and state-of-the-art expert system application.

  4. NASA/MSFC ground experiment for large space structure control verification

    NASA Technical Reports Server (NTRS)

    Waites, H. B.; Seltzer, S. M.; Tollison, D. K.

    1984-01-01

    Marshall Space Flight Center has developed a facility in which closed-loop control of Large Space Structures (LSS) can be demonstrated and verified. The main objective of the facility is to verify LSS control system techniques so that on-orbit performance can be ensured. The facility consists of an LSS test article connected to a payload mounting system that provides control torque commands, attached in turn to a base excitation system that will simulate the disturbances most likely to occur for Orbiter and DOD payloads. A control computer will contain the calibration software, the reference system, the alignment procedures, the telemetry software, and the control algorithms. The total system will be suspended in such a fashion that the LSS test article has the characteristics common to all LSS.

  5. Automatic Nanodesign Using Evolutionary Techniques

    NASA Technical Reports Server (NTRS)

    Globus, Al; Saini, Subhash (Technical Monitor)

    1998-01-01

    Many problems associated with the development of nanotechnology require custom-designed molecules. We use genetic graph software, a new development, to automatically evolve molecules of interest when only the requirements are known. Genetic graph software designs molecules, and potentially nanoelectronic circuits, given a fitness function that determines which of two molecules is better. A set of molecules, the first generation, is generated at random and then tested with the fitness function. Subsequent generations are created by randomly choosing two parent molecules with a bias towards high-scoring molecules, tearing each molecule in two at random, and mating parts from the mother and father to create two children. This procedure is repeated until a satisfactory molecule is found. An atom pair similarity test is currently used as the fitness function to evolve molecules similar to existing pharmaceuticals.
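
    The generational loop described above (random first generation, fitness testing, biased parent selection, tearing each parent in two, mating the halves into two children) can be sketched with character strings standing in for molecular graphs. The target string, alphabet, and parameters here are invented illustrations; the real system evolves molecular graphs against an atom-pair similarity fitness function:

    ```python
    import random

    random.seed(7)

    # Made-up example target: fitness counts positions that match it,
    # playing the role of the paper's atom-pair similarity test.
    TARGET = "C6H5OH"
    ALPHABET = "CHON0123456789"

    def fitness(mol):
        return sum(a == b for a, b in zip(mol, TARGET))

    def random_molecule():
        return "".join(random.choice(ALPHABET) for _ in range(len(TARGET)))

    def pick_parent(population):
        # Bias toward high-scoring molecules: tournament of two.
        a, b = random.sample(population, 2)
        return a if fitness(a) >= fitness(b) else b

    def mate(mother, father):
        # Tear each parent in two at a random point and swap the halves,
        # producing two children.
        cut = random.randint(1, len(TARGET) - 1)
        return mother[:cut] + father[cut:], father[:cut] + mother[cut:]

    def mutate(mol, rate=0.1):
        # Occasional point mutation keeps the search from stalling.
        if random.random() < rate:
            i = random.randrange(len(mol))
            return mol[:i] + random.choice(ALPHABET) + mol[i + 1:]
        return mol

    population = [random_molecule() for _ in range(40)]
    best = max(population, key=fitness)
    for _ in range(200):
        children = []
        while len(children) < len(population):
            children.extend(mate(pick_parent(population), pick_parent(population)))
        population = [mutate(c) for c in children]
        best = max(population + [best], key=fitness)
        if fitness(best) == len(TARGET):  # "satisfactory molecule found"
            break
    print(best, fitness(best))
    ```

    String crossover is only an analogy: cutting a molecular graph in two and rejoining the halves requires chemistry-aware repair that this sketch omits.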

  6. Automating approximate Bayesian computation by local linear regression.

    PubMed

    Thornton, Kevin R

    2009-07-07

    In several biological contexts, parameter inference often relies on computationally intensive techniques. "Approximate Bayesian Computation", or ABC, methods based on summary statistics have become increasingly popular. A particular flavor of ABC that uses linear regression to approximate the posterior distribution of the parameters, conditional on the summary statistics, is computationally appealing, yet no standalone tool exists to automate the procedure. Here, I describe a program to implement the method. The software package ABCreg implements the local linear-regression approach to ABC. Its advantages are: 1. The code is standalone and fully documented. 2. The program automatically processes multiple data sets and creates unique output files for each (which may be processed immediately in R), facilitating the testing of inference procedures on simulated data and the analysis of multiple data sets. 3. The program implements two different transformation methods for the regression step. 4. Analysis options are controlled on the command line by the user, and the program is designed to output warnings for cases where the regression fails. 5. The program does not depend on any particular simulation machinery (coalescent, forward-time, etc.), and is therefore a general tool for processing the results of any simulation. 6. The code is open source and modular. Examples of applying the software to empirical data from Drosophila melanogaster, and of testing the procedure on simulated data, are shown. In practice, ABCreg simplifies implementing ABC based on local linear regression.
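
    The regression-adjustment step that ABCreg automates can be illustrated on a toy one-parameter problem. Everything below (the model, prior, acceptance count) is a made-up example of the general Beaumont-style approach, not ABCreg's actual code or interface:

    ```python
    import random, statistics

    random.seed(1)

    # Toy problem: infer the mean theta of a normal distribution, using the
    # sample mean as the summary statistic.
    def simulate(theta, n=30):
        data = [random.gauss(theta, 1.0) for _ in range(n)]
        return sum(data) / n

    observed = simulate(2.0)  # pretend this summary came from real data

    # 1. Draw parameters from the prior and record simulated summaries.
    sims = []
    for _ in range(5000):
        theta = random.uniform(-5, 5)          # flat prior draw
        sims.append((theta, simulate(theta)))

    # 2. Rejection step: keep the simulations whose summary is closest to
    #    the observed summary.
    sims.sort(key=lambda ts: abs(ts[1] - observed))
    accepted = sims[:200]

    # 3. Local linear regression: fit theta ~ a + b*s on the accepted set,
    #    then shift each accepted theta to the observed summary
    #    (the Beaumont-style adjustment ABCreg automates).
    thetas = [t for t, _ in accepted]
    ss = [s for _, s in accepted]
    s_mean, t_mean = statistics.mean(ss), statistics.mean(thetas)
    b = sum((s - s_mean) * (t - t_mean) for s, t in zip(ss, thetas)) / \
        sum((s - s_mean) ** 2 for s in ss)
    adjusted = [t + b * (observed - s) for t, s in zip(thetas, ss)]
    print(statistics.mean(adjusted))
    ```

    The adjusted values approximate a sample from the posterior of theta given the observed summary; ABCreg additionally offers transformations of the parameters before the regression step.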

  7. GEDAE-LaB: A Free Software to Calculate the Energy System Contributions during Exercise

    PubMed Central

    Bertuzzi, Rômulo; Melegati, Jorge; Bueno, Salomão; Ghiarone, Thaysa; Pasqua, Leonardo A.; Gáspari, Arthur Fernandes; Lima-Silva, Adriano E.; Goldman, Alfredo

    2016-01-01

    Purpose The aim of the current study is to describe the functionality of GEDAE-LaB, free software developed to calculate energy system contributions and energy expenditure during exercise. Methods Eleven participants performed the following tests: 1) a maximal incremental cycling test to measure the ventilatory threshold and maximal oxygen uptake (V˙O2max); 2) a constant-workload cycling test in the moderate domain (90% ventilatory threshold); 3) a constant-workload cycling test in the severe domain (110% V˙O2max). Oxygen uptake and plasma lactate were measured during the tests. The contributions of the aerobic (AMET), anaerobic lactic (LAMET), and anaerobic alactic (ALMET) systems were calculated based on the oxygen uptake during exercise, the oxygen energy equivalents provided by lactate accumulation, and the fast component of excess post-exercise oxygen consumption, respectively. To assess the intra-investigator variation, four different investigators performed the analyses independently using GEDAE-LaB. A direct comparison with commercial software was also provided. Results All subjects completed 10 min of exercise in the moderate domain, while the time to exhaustion in the severe domain was 144 ± 65 s. The AMET, LAMET, and ALMET contributions in the moderate domain were about 93, 2, and 5%, respectively. The AMET, LAMET, and ALMET contributions in the severe domain were about 66, 21, and 13%, respectively. No statistical differences were found between the energy system contributions and energy expenditure obtained by GEDAE-LaB and by commercial software for either the moderate or the severe domain (P > 0.05). The ICC revealed that these estimates were highly reliable among the four investigators for both domains (all ICC ≥ 0.94). 
Conclusion These findings suggest that GEDAE-LaB is free software easily comprehended by users minimally familiarized with the procedures adopted for calculating an energetic profile from oxygen uptake and lactate accumulation during exercise. By making the software and its source code available, we hope to facilitate future related research. PMID:26727499
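
    The three calculations the abstract describes can be sketched as follows. The oxygen-to-energy conversion (20.9 kJ per litre of O2) and the lactate oxygen equivalent (3 ml O2 per kg body mass per mmol/L) are standard literature values assumed here; the input series is invented, and none of this is GEDAE-LaB's actual code:

    ```python
    # Assumed literature constants (not taken from GEDAE-LaB's source).
    O2_ENERGY_KJ_PER_L = 20.9
    LACTATE_O2_ML_PER_KG_PER_MMOL = 3.0

    def trapezoid(samples, dt):
        """Integrate evenly spaced samples with the trapezoidal rule."""
        return dt * (sum(samples) - 0.5 * (samples[0] + samples[-1]))

    def energy_contributions(vo2_exercise_l_min, vo2_rest_l_min, dt_min,
                             delta_lactate_mmol_l, body_mass_kg, epoc_fast_l):
        # Aerobic (AMET): O2 consumed above rest during exercise.
        net = [max(v - vo2_rest_l_min, 0.0) for v in vo2_exercise_l_min]
        aerobic_l = trapezoid(net, dt_min)
        # Anaerobic lactic (LAMET): O2 equivalent of net lactate accumulation.
        lactic_l = (delta_lactate_mmol_l * LACTATE_O2_ML_PER_KG_PER_MMOL
                    * body_mass_kg / 1000.0)
        # Anaerobic alactic (ALMET): fast component of post-exercise O2 uptake.
        alactic_l = epoc_fast_l
        total_l = aerobic_l + lactic_l + alactic_l
        kj = {"AMET": aerobic_l * O2_ENERGY_KJ_PER_L,
              "LAMET": lactic_l * O2_ENERGY_KJ_PER_L,
              "ALMET": alactic_l * O2_ENERGY_KJ_PER_L}
        pct = {k: 100.0 * v / (total_l * O2_ENERGY_KJ_PER_L)
               for k, v in kj.items()}
        return kj, pct

    # Invented 10-min moderate bout sampled once per minute (litres O2/min).
    vo2 = [0.4, 1.2, 1.6, 1.7, 1.7, 1.7, 1.7, 1.7, 1.7, 1.7, 1.7]
    kj, pct = energy_contributions(vo2, vo2_rest_l_min=0.4, dt_min=1.0,
                                   delta_lactate_mmol_l=1.0, body_mass_kg=70.0,
                                   epoc_fast_l=0.5)
    print(pct)
    ```

    With these invented inputs the aerobic share dominates, consistent with the roughly 93/2/5% split the study reports for the moderate domain.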

  8. GEDAE-LaB: A Free Software to Calculate the Energy System Contributions during Exercise.

    PubMed

    Bertuzzi, Rômulo; Melegati, Jorge; Bueno, Salomão; Ghiarone, Thaysa; Pasqua, Leonardo A; Gáspari, Arthur Fernandes; Lima-Silva, Adriano E; Goldman, Alfredo

    2016-01-01

    The aim of the current study is to describe the functionality of free software developed for energy system contributions and energy expenditure calculation during exercise, namely GEDAE-LaB. Eleven participants performed the following tests: 1) a maximal cycling incremental test to measure the ventilatory threshold and maximal oxygen uptake (V̇O2max); 2) a cycling workload constant test at moderate domain (90% ventilatory threshold); 3) a cycling workload constant test at severe domain (110% V̇O2max). Oxygen uptake and plasma lactate were measured during the tests. The contributions of the aerobic (AMET), anaerobic lactic (LAMET), and anaerobic alactic (ALMET) systems were calculated based on the oxygen uptake during exercise, the oxygen energy equivalents provided by lactate accumulation, and the fast component of excess post-exercise oxygen consumption, respectively. In order to assess the intra-investigator variation, four different investigators performed the analyses independently using GEDAE-LaB. A direct comparison with commercial software was also provided. All subjects completed 10 min of exercise at moderate domain, while the time to exhaustion at severe domain was 144 ± 65 s. The AMET, LAMET, and ALMET contributions during moderate domain were about 93, 2, and 5%, respectively. The AMET, LAMET, and ALMET contributions during severe domain were about 66, 21, and 13%, respectively. No statistical differences were found between the energy system contributions and energy expenditure obtained by GEDAE-LaB and commercial software for both moderate and severe domains (P > 0.05). The ICC revealed that these estimates were highly reliable among the four investigators for both moderate and severe domains (all ICC ≥ 0.94). These findings suggest that GEDAE-LaB is a free software easily comprehended by users minimally familiarized with adopted procedures for calculations of energetic profile using oxygen uptake and lactate accumulation during exercise. 
By making the software and its source code available, we hope to facilitate future related research.

  9. Industrial Technology Modernization Program. Project 80. Increase Efficiency of Card Test/Device Test Areas by the Usage of Improved Material Handling Systems. Revision 1. Phase 2

    DTIC Science & Technology

    1988-03-01

    [Table-of-contents fragments from the source document: Industry Analysis/Findings; Equipment/Machinery Alternatives; MIS Requirements/Improvements; Cost Benefit Analysis and Procedure; Software Diagram; Cost Benefit Analysis Methodology; Project 80 Expenditure Schedule; Project 80 Cash Flows.] ...testing, streamlining work flow, and installation of ergonomically designed work cells/work centers. The benefits associated with the implementation of ITM...

  10. Clinical Outcome Metrics for Optimization of Robust Training

    NASA Technical Reports Server (NTRS)

    Ebert, D.; Byrne, V. E.; McGuire, K. M.; Hurst, V. W., IV; Kerstman, E. L.; Cole, R. W.; Sargsyan, A. E.; Garcia, K. M.; Reyes, D.; Young, M.

    2016-01-01

    Introduction: The emphasis of this research is on the Human Research Program (HRP) Exploration Medical Capability's (ExMC) "Risk of Unacceptable Health and Mission Outcomes Due to Limitations of In-Flight Medical Capabilities." Specifically, this project aims to contribute to the closure of gap ExMC 2.02: We do not know how the inclusion of a physician crew medical officer quantitatively impacts clinical outcomes during exploration missions. The experiments are specifically designed to address clinical outcome differences between physician and non-physician cohorts in both near-term and longer-term (mission impacting) outcomes. Methods: Medical simulations will systematically compare success of individual diagnostic and therapeutic procedure simulations performed by physician and non-physician crew medical officer (CMO) analogs using clearly defined short-term (individual procedure) outcome metrics. In the subsequent step of the project, the procedure simulation outcomes will be used as input to a modified version of the NASA Integrated Medical Model (IMM) to analyze the effect of the outcome (degree of success) of individual procedures (including successful, imperfectly performed, and failed procedures) on overall long-term clinical outcomes and the consequent mission impacts. The procedures to be simulated are endotracheal intubation, fundoscopic examination, kidney/urinary ultrasound, ultrasound-guided intravenous catheter insertion, and a differential diagnosis exercise. Multiple assessment techniques will be used, centered on medical procedure simulation studies occurring at 3, 6, and 12 months after initial training (as depicted in the following flow diagram of the experiment design). 
Discussion: Analysis of procedure outcomes in the physician and non-physician groups and their subsets (tested at different elapsed times post training) will allow the team to 1) define differences between physician and non-physician CMOs in terms of both procedure performance (pre-IMM analysis) and overall mitigation of the mission medical impact (IMM analysis); 2) refine the procedure outcome and clinical outcome metrics themselves; 3) refine or develop innovative medical training products and solutions to maximize CMO performance; and 4) validate the methods and products of this experiment for operational use in the planning, execution, and quality assurance of the CMO training process. The team has finalized training protocols and developed a software training/testing tool in collaboration with Butler Graphics (Detroit, MI). In addition to the "hands-on" medical procedure modules, the software includes a differential diagnosis exercise (a limited clinical decision support tool) to evaluate the diagnostic skills of participants. Human subject testing will occur over the next year.

  11. Performance of the Primary Mirror Center-of-Curvature Optical Metrology System during Cryogenic Testing of the JWST Pathfinder Telescope

    NASA Technical Reports Server (NTRS)

    Hadaway, James B.; Wells, Conrad; Olczak, Gene; Waldman, Mark; Whitman, Tony; Cosentino, Joseph; Connolly, Mark; Chaney, David; Telfer, Randal

    2016-01-01

    The JWST primary mirror consists of 18 hexagonal segments, each 1.5 m across and adjustable in six degrees of freedom (DoF) plus radius of curvature (RoC). The telescope will be tested at its cryogenic operating temperature at Johnson Space Center (JSC). The testing will include center-of-curvature measurements of the primary mirror using the Center-of-Curvature Optical Assembly (COCOA) and the Absolute Distance Meter Assembly (ADMA). The performance of these metrology systems, including hardware, software, and procedures, was assessed during two cryogenic tests at JSC using the JWST Pathfinder telescope. This paper describes the test setup, the testing performed, and the resulting metrology system performance.

  12. An approach to software cost estimation

    NASA Technical Reports Server (NTRS)

    Mcgarry, F.; Page, J.; Card, D.; Rohleder, M.; Church, V.

    1984-01-01

    A general procedure for software cost estimation in any environment is outlined. The basic concepts of work and effort estimation are explained, some popular resource estimation models are reviewed, and the accuracy of resource estimates is discussed. A software cost prediction procedure based on the experiences of the Software Engineering Laboratory in the flight dynamics area, incorporating management expertise, cost models, and historical data, is described. The sources of information and relevant parameters available during each phase of the software life cycle are identified. The suggested methodology incorporates these elements into a customized management tool for software cost prediction. Detailed guidelines for estimation in the flight dynamics environment, developed using this methodology, are presented.
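
    As a concrete stand-in for the "cost models" ingredient, the basic COCOMO effort equation (effort in person-months = a · KLOC^b, with its published organic-mode coefficients) can be combined with a simple historical-calibration step in the spirit of the SEL procedure. The calibration heuristic and project history below are invented illustrations, not the SEL's own model:

    ```python
    def cocomo_effort(kloc, a=2.4, b=1.05):
        """Basic COCOMO, organic mode: effort in person-months."""
        return a * kloc ** b

    def calibrated_estimate(kloc, past_projects):
        """Scale the model by the mean ratio of actual to predicted effort
        observed on past projects (a simple local-calibration heuristic,
        standing in for the SEL's use of historical data)."""
        ratios = [actual / cocomo_effort(size) for size, actual in past_projects]
        correction = sum(ratios) / len(ratios)
        return cocomo_effort(kloc) * correction

    # Invented history: (size in KLOC, actual effort in person-months).
    history = [(10.0, 30.0), (25.0, 70.0), (50.0, 160.0)]
    print(round(cocomo_effort(32.0), 1), round(calibrated_estimate(32.0, history), 1))
    ```

    The correction factor is the point of the sketch: a generic cost model becomes useful once it is tuned to an organization's own completed projects.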

  13. SCaN Testbed Software Development and Lessons Learned

    NASA Technical Reports Server (NTRS)

    Kacpura, Thomas J.; Varga, Denise M.

    2012-01-01

    The National Aeronautics and Space Administration (NASA) has developed an on-orbit, adaptable, Software Defined Radio (SDR) / Space Telecommunications Radio System (STRS)-based testbed facility to conduct a suite of experiments to advance technologies, reduce risk, and enable future mission capabilities on the International Space Station (ISS). The SCaN Testbed Project will provide NASA, industry, other Government agencies, and academic partners the opportunity to develop and field communications, navigation, and networking technologies in the laboratory and space environment based on reconfigurable SDR platforms and the STRS Architecture. SDRs are a new technology for NASA, and the support infrastructure they require differs from that of legacy, fixed-function radios. SDRs offer the ability to reconfigure on-orbit communications by changing software for new waveforms and operating systems to enable new capabilities or fix anomalies, an option not previously available. They are not stand-alone devices, and they required a new approach to control them effectively and flow data. Extensive software had to be developed to utilize the full potential of these reconfigurable platforms. The paper focuses on development, integration, and testing as related to the avionics processor system, and on the software required to command, control, monitor, and interact with the SDRs as well as the other communication payload elements. An extensive effort was required to develop the flight software and meet the NASA requirements for software quality and safety. The flight avionics must be radiation tolerant, and these processors have limited capability in comparison to terrestrial counterparts. A particular challenge is that there are three SDRs onboard, and interfacing with multiple SDRs simultaneously complicated the effort. The effort also included ground software, a key element both for commanding the payload and for displaying data created by the payload. 
The verification of the software was an extensive effort. The challenges of specifying a suitable test matrix for reconfigurable systems that offer numerous configurations are highlighted. Since flight system testing requires methodical, controlled testing that limits risk, a ground system nearly identical to the on-orbit flight system was required so that software could be developed and verification procedures written before installation and testing on the flight system. The development of the SCaN Testbed was an accelerated effort to meet launch constraints, and this paper discusses tradeoffs made to balance needed software functionality against the schedule. Future upgrades are discussed that optimize the avionics and allow experimenters to utilize the SCaN Testbed's full potential.

  14. Identification and evaluation of software measures

    NASA Technical Reports Server (NTRS)

    Card, D. N.

    1981-01-01

    A large scale, systematic procedure for identifying and evaluating measures that meaningfully characterize one or more elements of software development is described. The background of this research, the nature of the data involved, and the steps of the analytic procedure are discussed. An example of the application of this procedure to data from real software development projects is presented. As the term is used here, a measure is a count or numerical rating of the occurrence of some property. Examples of measures include lines of code, number of computer runs, person hours expended, and degree of use of top down design methodology. Measures appeal to the researcher and the manager as a potential means of defining, explaining, and predicting software development qualities, especially productivity and reliability.

  15. Photovoltaic array space power plus diagnostics experiment

    NASA Technical Reports Server (NTRS)

    Burger, D. R.

    1990-01-01

    The objective is to summarize the five years of hardware development and fabrication represented by the Photovoltaic Array Space Power Plus Diagnostics (PASP Plus) Instrument. The original PASP Experiment requirements and background are presented, along with the modifications requested to transform the PASP Experiment into the PASP Plus Instrument. The PASP Plus hardware and software are described. Test results for components and subsystems are given, as well as final system tests. Also included are appendices that describe the major subsystems and present supporting documentation such as block diagrams, schematics, circuit board artwork, drawings, test procedures, and test reports.

  16. Simultaneous real-time data collection methods

    NASA Technical Reports Server (NTRS)

    Klincsek, Thomas

    1992-01-01

    This paper describes the development of electronic test equipment which executes, supervises, and reports on various tests. This validation process uses computers to analyze test results and report conclusions. The test equipment consists of an electronics component and the data collection and reporting unit. The PC software, display screens, and real-time database are described. Pass/fail procedures and data replay are discussed. The OS/2 operating system and the Presentation Manager user interface were used to create a highly interactive automated system. The system outputs are hardcopy printouts and MS-DOS format files which may be used as input for other PC programs.

  17. Shuttle orbiter Ku-band radar/communications system design evaluation

    NASA Technical Reports Server (NTRS)

    Dodds, J.; Holmes, J.; Huth, G. K.; Iwasaki, R.; Maronde, R.; Polydoros, A.; Weber, C.; Broad, P.

    1980-01-01

    Tasks performed in an examination and critique of a Ku-band radar communications system for the shuttle orbiter are reported. Topics cover: (1) Ku-band high gain antenna/widebeam horn design evaluation; (2) evaluation of the Ku-band SPA and EA-1 LRU software; (3) system test evaluation; (4) critical design review and development test evaluation; (5) Ku-band bent pipe channel performance evaluation; (6) Ku-band LRU interchangeability analysis; and (7) deliverable test equipment evaluation. Where discrepancies were found, modifications and improvements to the Ku-band system and the associated test procedures are suggested.

  18. Publishing Platform for Scientific Software - Lessons Learned

    NASA Astrophysics Data System (ADS)

    Hammitzsch, Martin; Fritzsch, Bernadette; Reusser, Dominik; Brembs, Björn; Deinzer, Gernot; Loewe, Peter; Fenner, Martin; van Edig, Xenia; Bertelmann, Roland; Pampel, Heinz; Klump, Jens; Wächter, Joachim

    2015-04-01

    Scientific software has become an indispensable commodity for the production, processing, and analysis of empirical data, and for the modelling and simulation of complex processes. Software has a significant influence on the quality of research results. To strengthen recognition of the academic performance of scientific software development, increase its visibility, and promote the reproducibility of research results, concepts for the publication of scientific software have to be developed, tested, evaluated, and then transferred into operations. For this, the publication and citability of scientific software have to fulfil scientific criteria by means of defined processes and the use of persistent identifiers, similar to data publications. The SciForge project is addressing these challenges. Based on interviews, a blueprint for a scientific software publishing platform and a systematic implementation plan have been designed. In addition, the potential of journals, software repositories, and persistent identifiers to improve the publication and dissemination of reusable software solutions has been evaluated. It is important that procedures for publishing software, as well as methods and tools for software engineering, are reflected in the architecture of the platform, in order to improve the quality of the software and of research results. In addition, it is necessary to work continuously on improving the conditions that promote the adoption and sustainable utilization of scientific software publications. Among others, this would include institutional policies for the development and publication of scientific software, as well as policies for establishing the necessary competencies and skills of scientists and IT personnel. To implement the concepts developed in SciForge, a combined bottom-up / top-down approach is considered that will be implemented in parallel in different scientific domains, e.g. 
in earth sciences, climate research and the life sciences. Based on the developed blueprints a scientific software publishing platform will be iteratively implemented, tested, and evaluated. Thus the platform should be developed continuously on the basis of gained experiences and results. The platform services will be extended one by one corresponding to the requirements of the communities. Thus the implemented platform for the publication of scientific software can be improved and stabilized incrementally as a tool with software, science, publishing, and user oriented features.

  19. Simulation and Flight Test Capability for Testing Prototype Sense and Avoid System Elements

    NASA Technical Reports Server (NTRS)

    Howell, Charles T.; Stock, Todd M.; Verstynen, Harry A.; Wehner, Paul J.

    2012-01-01

    NASA Langley Research Center (LaRC) and The MITRE Corporation (MITRE) have developed, and successfully demonstrated, an integrated simulation-to-flight capability for evaluating sense and avoid (SAA) system elements. This integrated capability consists of a MITRE-developed fast-time computer simulation for evaluating SAA algorithms, and a NASA LaRC surrogate unmanned aircraft system (UAS) equipped to support hardware- and software-in-the-loop evaluation of SAA system elements (e.g., algorithms, sensors, architecture, communications, autonomous systems), concepts, and procedures. The fast-time computer simulation subjects algorithms to simulated flight encounters/conditions and generates a fitness report that records strengths, weaknesses, and overall performance. Reviewed algorithms (and their fitness reports) are then transferred to NASA LaRC, where additional (joint) airworthiness evaluations are performed on the candidate SAA system-element configurations, concepts, and/or procedures of interest; software and hardware components are integrated into the Surrogate UAS research systems; and flight safety and mission planning activities are completed. Onboard the Surrogate UAS, candidate SAA system element configurations, concepts, and/or procedures are subjected to flight evaluations and in-flight performance is monitored. The Surrogate UAS, which can be controlled remotely via generic Ground Station uplink or automatically via onboard systems, operates with a NASA Safety Pilot/Pilot in Command onboard to permit safe operations in mixed airspace with manned aircraft. An end-to-end demonstration of a typical application of the capability was performed in non-exclusionary airspace in October 2011; additional research, development, flight testing, and evaluation efforts using this integrated capability are planned throughout fiscal years 2012 and 2013.

  20. Target controlled infusion for kids: trials and simulations.

    PubMed

    Mehta, Disha; McCormack, Jon; Fung, Parry; Dumont, Guy; Ansermino, J

    2008-01-01

    Target controlled infusion (TCI) for Kids is a computer-controlled system designed to administer propofol for general anesthesia. A controller establishes the infusion rates required to achieve a specified concentration at the drug's effect site (C(e)) by implementing a continuously updated pharmacokinetic-pharmacodynamic model. This manuscript provides an overview of the system's design, preclinical tests, and a clinical pilot study. In preclinical tests, predicted infusion rates for 20 simulated procedures displayed complete convergent validity between two software implementations, Labview and Matlab, at computational intervals of 5, 10, and 15 s, but diverged at 20 s intervals due to system rounding errors. The volume of drug delivered by the TCI system also displayed convergent validity with Tivatrainer, a widely used TCI simulation software. Further tests were conducted for 50 random procedures to evaluate discrepancies between volumes reported and those actually delivered by the system. Accuracies were within clinically acceptable ranges and normally distributed with a mean of 0.08 +/- 0.01 ml. In the clinical study, propofol pharmacokinetics were simulated for 30 surgical procedures involving children aged 3 months to 9 years. Predicted C(e) values during standard clinical practice, the accuracy of wake-up times predicted by the system, and potential correlations between patient wake-up times, C(e), and state entropy (SE) were assessed. Neither C(e) nor SE was a reliable predictor of wake-up time in children, but the small sample size of this study cannot fully capture the noted variation in children's responses to propofol. A C(e) value of 1.9 μg/ml was found to best predict emergence from anesthesia in children.
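
    The effect-site targeting described above can be illustrated with a discrete-time compartment-model update. The sketch below is purely illustrative and is not the published pediatric model: the one-compartment structure, the parameter values (v, ke, ke0), and the function name simulate_ce are all assumptions.

```python
# Illustrative sketch of effect-site targeting: plasma concentration from a
# one-compartment model, with the effect site lagging behind at first-order
# rate ke0. All parameter values here are invented for illustration.
def simulate_ce(infusion_rates, dt=5.0, v=10.0, ke=0.1 / 60, ke0=0.26 / 60):
    cp = ce = 0.0                        # plasma and effect-site concentrations
    history = []
    for rate in infusion_rates:          # one infusion rate per dt-second step
        cp += (rate / v - ke * cp) * dt  # Euler step of plasma kinetics
        ce += ke0 * (cp - ce) * dt       # effect site approaches plasma level
        history.append(ce)
    return history
```

    Shortening dt mimics the shorter computational intervals at which the paper found the two implementations to agree.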

  1. Development and verification testing of automation and robotics for assembly of space structures

    NASA Technical Reports Server (NTRS)

    Rhodes, Marvin D.; Will, Ralph W.; Quach, Cuong C.

    1993-01-01

    A program was initiated within the past several years to develop operational procedures for automated assembly of truss structures suitable for large-aperture antennas. The assembly operations require the use of a robotic manipulator and are based on the principle of supervised autonomy to minimize crew resources. A hardware testbed was established to support development and evaluation testing. A brute-force automation approach was used to develop the baseline assembly hardware and software techniques. As the system matured and an operation was proven, upgrades were incorporated and assessed against the baseline test results. This paper summarizes the developmental phases of the program, the results of several assembly tests, the current status, and a series of proposed developments for additional hardware and software control capability. No problems that would preclude automated in-space assembly of truss structures have been encountered. The current system was developed at a breadboard level, and continued development at an enhanced level is warranted.

  2. Instrumentation & Data Acquisition System (DAS) Engineer

    NASA Technical Reports Server (NTRS)

    Jackson, Markus Deon

    2015-01-01

    The primary job of an Instrumentation and Data Acquisition System (DAS) Engineer is to properly measure the physical phenomena of hardware using appropriate instrumentation and DAS equipment designed to record data during a specified test of the hardware. A DAS includes a CPU or processor; a data storage device, such as a hard drive; a data communication bus, such as Universal Serial Bus; software to control DAS processes such as calibration, data recording, and data processing; signal conditioning amplifiers; and sensors for specified measurements. My internship responsibilities have included testing and adjusting Pacific Instruments Model 9355 signal conditioning amplifiers and writing and performing checkout and calibration procedures while learning the basics of instrumentation.

  3. Development of Advanced Verification and Validation Procedures and Tools for the Certification of Learning Systems in Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Jacklin, Stephen; Schumann, Johann; Gupta, Pramod; Richard, Michael; Guenther, Kurt; Soares, Fola

    2005-01-01

    Adaptive control technologies that incorporate learning algorithms have been proposed to enable automatic flight control and vehicle recovery, autonomous flight, and to maintain vehicle performance in the face of unknown, changing, or poorly defined operating environments. In order for adaptive control systems to be used in safety-critical aerospace applications, they must be proven to be highly safe and reliable. Rigorous methods for adaptive software verification and validation must be developed to ensure that control system software failures will not occur. Of central importance in this regard is the need to establish reliable methods that guarantee convergent learning, rapid convergence (learning) rate, and algorithm stability. This paper presents the major problems of adaptive control systems that use learning to improve performance. The paper then presents the major procedures and tools presently developed or currently being developed to enable the verification, validation, and ultimate certification of these adaptive control systems. These technologies include the application of automated program analysis methods, techniques to improve the learning process, analytical methods to verify stability, methods to automatically synthesize code, simulation and test methods, and tools to provide on-line software assurance.

  4. Viewing the Reviewing: An Observational Study of the Use of an Interactive Digital Video To Help Teach the Concepts of Design Inspection Reviews.

    ERIC Educational Resources Information Center

    Love, Matthew

    "Design Inspection Reviews" are structured meetings in which participants follow certain rules of procedure and behavior when conducting detailed readings of design plans to identify errors and misunderstandings. The technique is widely used in the software engineering industry, where it is demonstrably more effective than testing at…

  5. Geometric error characterization and error budgets. [thematic mapper]

    NASA Technical Reports Server (NTRS)

    Beyer, E.

    1982-01-01

    Procedures used in characterizing geometric error sources for a spaceborne imaging system are described, using the LANDSAT D thematic mapper ground segment processing as the prototype. Software was tested through simulation and is undergoing tests with the operational hardware as part of the prelaunch system evaluation. Geometric accuracy specifications, geometric correction, and control point processing are discussed. Cross-track and along-track errors are tabulated for the thematic mapper, the spacecraft, and ground processing to show the temporal registration error budget in pixels (42.5 microrad) at the 90% level.
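
    Error budgets of this kind are commonly rolled up by root-sum-square (RSS) combination of independent contributors. The sketch below illustrates that arithmetic only; the contributor names and pixel values are invented, not the LANDSAT D figures.

```python
import math

# Illustrative RSS roll-up of independent geometric error sources into a
# single registration budget. Contributor values are hypothetical.
def rss_budget(errors_pixels):
    return math.sqrt(sum(e * e for e in errors_pixels))

contributors = {"thematic_mapper": 0.20, "spacecraft": 0.15, "ground_processing": 0.10}
total_pixels = rss_budget(contributors.values())   # combined registration error
```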

  6. Statistical correlation of structural mode shapes from test measurements and NASTRAN analytical values

    NASA Technical Reports Server (NTRS)

    Purves, L.; Strang, R. F.; Dube, M. P.; Alea, P.; Ferragut, N.; Hershfeld, D.

    1983-01-01

    The software and procedures of a system of programs used to generate a report of the statistical correlation between NASTRAN modal analysis results and physical tests results from modal surveys are described. Topics discussed include: a mathematical description of statistical correlation, a user's guide for generating a statistical correlation report, a programmer's guide describing the organization and functions of individual programs leading to a statistical correlation report, and a set of examples including complete listings of programs, and input and output data.
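
    A widely used statistic for comparing test and analytical mode shapes is the Modal Assurance Criterion (MAC); the report's specific correlation measure may differ, so the sketch below is offered only as an example of this kind of computation.

```python
import numpy as np

# Modal Assurance Criterion: squared normalized projection of a measured
# mode shape onto an analytical (e.g., NASTRAN) mode shape. A value near 1
# indicates the same shape; near 0 indicates uncorrelated shapes.
def mac(phi_test, phi_fem):
    phi_test = np.asarray(phi_test, dtype=float)
    phi_fem = np.asarray(phi_fem, dtype=float)
    num = (phi_test @ phi_fem) ** 2
    return num / ((phi_test @ phi_test) * (phi_fem @ phi_fem))
```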

  7. 3D echocardiographic analysis of aortic annulus for transcatheter aortic valve replacement using novel aortic valve quantification software: Comparison with computed tomography.

    PubMed

    Mediratta, Anuj; Addetia, Karima; Medvedofsky, Diego; Schneider, Robert J; Kruse, Eric; Shah, Atman P; Nathan, Sandeep; Paul, Jonathan D; Blair, John E; Ota, Takeyoshi; Balkhy, Husam H; Patel, Amit R; Mor-Avi, Victor; Lang, Roberto M

    2017-05-01

    With the increasing use of transcatheter aortic valve replacement (TAVR) in patients with aortic stenosis (AS), computed tomography (CT) remains the standard for annulus sizing. However, 3D transesophageal echocardiography (TEE) has been an alternative in patients with contraindications to CT. We sought to (1) test the feasibility, accuracy, and reproducibility of prototype 3DTEE analysis software (Philips) for aortic annular measurements and (2) compare the new approach to existing echocardiographic techniques. We prospectively studied 52 patients who underwent gated contrast CT, procedural 3DTEE, and TAVR. 3DTEE images were analyzed using novel semi-automated software designed for 3D measurements of the aortic root, which uses multiplanar reconstruction, similar to CT analysis. Aortic annulus measurements included area, perimeter, and diameters calculated from these measurements. The results were compared to CT-derived values. Additionally, existing 3D echocardiographic measurements (3D planimetry and mitral valve analysis software adapted for the aortic valve) were also compared to the CT reference values. 3DTEE image quality was sufficient in 90% of patients for aortic annulus measurements using the new software, which were in good agreement with CT (r-values: .89-.91) with small (<4%), nonsignificant inter-modality biases. Repeated measurements showed <10% measurement variability. The new 3D analysis was more accurate and reproducible than the existing echocardiographic techniques. Novel semi-automated 3DTEE analysis software can accurately measure the aortic annulus in patients with severe AS undergoing TAVR, in better agreement with CT than the existing methodology. Accordingly, intra-procedural TEE could potentially replace CT in patients for whom CT carries significant risk. © 2017, Wiley Periodicals, Inc.

  8. Application of QC_DR software for acceptance testing and routine quality control of direct digital radiography systems: initial experiences using the Italian Association of Physicist in Medicine quality control protocol.

    PubMed

    Nitrosi, Andrea; Bertolini, Marco; Borasi, Giovanni; Botti, Andrea; Barani, Adriana; Rivetti, Stefano; Pierotti, Luisa

    2009-12-01

    Ideally, medical x-ray imaging systems should be designed to deliver maximum image quality at an acceptable radiation risk to the patient. Quality assurance procedures are employed to ensure that these standards are maintained. A quality control protocol for direct digital radiography (DDR) systems is described and discussed. Software to automatically process and analyze the required images was developed. In this paper, the initial results obtained on equipment of different DDR manufacturers were reported. The protocol was developed to highlight even small discrepancies in standard operating performance.

  9. Flight Experiment Demonstration System (FEDS) functional description and interface document

    NASA Technical Reports Server (NTRS)

    Belcher, R. C.; Shank, D. E.

    1984-01-01

    This document presents a functional description of the Flight Experiment Demonstration System (FEDS) and of interfaces between FEDS and external hardware and software. FEDS is a modification of the Automated Orbit Determination System (AODS). FEDS has been developed to support a ground demonstration of microprocessor-based onboard orbit determination. This document provides an overview of the structure and logic of FEDS and details the various operational procedures to build and execute FEDS. It also documents a microprocessor interface between FEDS and a TDRSS user transponder and describes a software simulator of the interface used in the development and system testing of FEDS.

  10. Taming the Viper: Software Upgrade for VFAUser and Viper

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DORIN, RANDALL T.; MOSER III, JOHN C.

    2000-08-08

    This report describes the procedure and properties of the software upgrade for the Vibration Performance Recorder. The upgrade will check the 20 memory cards for proper read/write operation. The upgrade was successfully installed and uploaded into the Viper and the field laptop. The memory checking routine must run overnight to complete the test, although the laptop need only be connected to the Viper unit until the downloading routine is finished. The routine has limited ability to recognize incomplete or corrupt header and footer files. The routine requires 400 Megabytes of free hard disk space. There is one minor technical flaw detailed in the conclusion.

  11. Data and Analysis Center for Software.

    DTIC Science & Technology

    1980-06-01

    can make use of it in their day-to-day activities of developing, maintaining, and managing software. The bibliographic collection is composed of...which refer to development, design, or programming approaches which view a software system, component, or module in terms of its required or intended... "practices" are also included in this group. PROCEDURES (1 keyword) Procedures is a term used ambiguously in the literature to refer to functions

  12. On Fitting Generalized Linear Mixed-effects Models for Binary Responses using Different Statistical Packages

    PubMed Central

    Zhang, Hui; Lu, Naiji; Feng, Changyong; Thurston, Sally W.; Xia, Yinglin; Tu, Xin M.

    2011-01-01

    The generalized linear mixed-effects model (GLMM) is a popular paradigm for extending models for cross-sectional data to a longitudinal setting. When applied to modeling binary responses, different software packages and even different procedures within a package may give quite different results. In this report, we describe the statistical approaches that underlie these different procedures and discuss their strengths and weaknesses when applied to fit correlated binary responses. We then illustrate these considerations by applying the procedures implemented in some popular software packages to simulated and real study data. Our simulation results indicate a lack of reliability for most of the procedures considered, which carries significant implications for applying such popular software packages in practice. PMID:21671252
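
    The kind of clustered binary data on which these procedures disagree can be generated from a random-intercept logistic GLMM. The simulator below is a sketch; the coefficients, cluster sizes, and function name are arbitrary choices, not those of the paper's simulation study.

```python
import math
import random

# Sketch: simulate correlated binary responses from a logistic GLMM with a
# subject-level random intercept u ~ N(0, sigma_u^2). All parameters are
# illustrative assumptions.
def simulate_glmm(n_subjects=100, n_obs=5, beta0=-0.5, beta1=1.0,
                  sigma_u=1.0, seed=1):
    rng = random.Random(seed)
    data = []
    for subject in range(n_subjects):
        u = rng.gauss(0.0, sigma_u)      # shared within-subject intercept
        for _ in range(n_obs):
            x = rng.random()
            p = 1.0 / (1.0 + math.exp(-(beta0 + beta1 * x + u)))
            data.append((subject, x, 1 if rng.random() < p else 0))
    return data                          # (subject id, covariate, response)
```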

  13. Development of an automated ultrasonic testing system

    NASA Astrophysics Data System (ADS)

    Shuxiang, Jiao; Wong, Brian Stephen

    2005-04-01

    Non-destructive testing is necessary in areas where defects in structures emerge over time due to wear and tear, and structural integrity must be maintained to preserve usability. However, manual testing has many limitations: high training cost, a long training procedure, and, worse, inconsistent test results. A prime objective of this project is to develop an automatic non-destructive testing system for the shaft of a wheel axle of a railway carriage. Various methods, such as neural networks, pattern recognition methods, and knowledge-based systems, are used for such artificial intelligence problems. In this paper, a statistical pattern recognition approach, the classification tree, is applied. Before feature selection, a thorough study of the ultrasonic signals produced was carried out. Based on this analysis, three signal processing methods were developed to enhance the ultrasonic signals: cross-correlation, zero-phase filtering, and averaging. The goal of this step is to reduce noise and make the signal characteristics more distinguishable. Four features are selected: (1) autoregressive model coefficients, (2) standard deviation, (3) Pearson correlation, and (4) dispersion uniformity degree. A classification tree is then created and applied to recognize peak positions and amplitudes. A local-maximum search is carried out before feature computation, which considerably reduces computation time in real-time testing. Based on this algorithm, a software package called SOFRA was developed to recognize the peaks, calibrate automatically, and test a simulated shaft automatically.
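
    Two of the preprocessing steps named above, coherent averaging and cross-correlation against a reference pulse, can be sketched as follows. The signals in the usage test are synthetic and the function names are invented.

```python
import numpy as np

# Sketch of two preprocessing steps from the paper: coherent averaging of
# repeated A-scans to suppress uncorrelated noise, and cross-correlation
# against a reference pulse to locate an echo in the trace.
def average_scans(scans):
    return np.mean(np.asarray(scans, dtype=float), axis=0)

def locate_echo(signal, reference):
    corr = np.correlate(signal, reference, mode="valid")
    return int(np.argmax(corr))   # sample offset where the pulse best aligns
```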

  14. A management system for evaluating the Virginia periodic motor vehicle inspection program : software manual and implementation procedures : final report.

    DOT National Transportation Integrated Search

    1978-01-01

    This report deals with the Periodic Motor Vehicle Inspection Management Evaluation System software documentation and implementation procedures. A companion report entitled "A Management System for Evaluating the Virginia Periodic Motor Vehicle Inspec...

  15. Design and validation of Segment--freely available software for cardiovascular image analysis.

    PubMed

    Heiberg, Einar; Sjögren, Jane; Ugander, Martin; Carlsson, Marcus; Engblom, Henrik; Arheden, Håkan

    2010-01-11

    Commercially available software for cardiovascular image analysis often has limited functionality and frequently lacks the careful validation that is required for clinical studies. We have already implemented a cardiovascular image analysis software package and released it as freeware for the research community. However, it was distributed as a stand-alone application and other researchers could not extend it by writing their own custom image analysis algorithms. We believe that the work required to make a clinically applicable prototype can be reduced by making the software extensible, so that researchers can develop their own modules or improvements. Such an initiative might then serve as a bridge between image analysis research and cardiovascular research. The aim of this article is therefore to present the design and validation of a cardiovascular image analysis software package (Segment) and to announce its release in a source code format. Segment can be used for image analysis in magnetic resonance imaging (MRI), computed tomography (CT), single photon emission computed tomography (SPECT) and positron emission tomography (PET). Some of its main features include loading of DICOM images from all major scanner vendors, simultaneous display of multiple image stacks and plane intersections, automated segmentation of the left ventricle, quantification of MRI flow, tools for manual and general object segmentation, quantitative regional wall motion analysis, myocardial viability analysis and image fusion tools. Here we present an overview of the validation results and validation procedures for the functionality of the software. We describe a technique to ensure continued accuracy and validity of the software by implementing and using a test script that tests the functionality of the software and validates the output. The software has been made freely available for research purposes in a source code format on the project home page http://segment.heiberg.se. 
Segment is a well-validated comprehensive software package for cardiovascular image analysis. It is freely available for research purposes provided that relevant original research publications related to the software are cited.
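
    The output-validation idea described (a test script that checks the software's results against stored reference values) can be sketched as below. The analysis function, case name, and reference value are all hypothetical, not Segment's actual test suite.

```python
# Hypothetical regression-test sketch in the style described: run a fixed
# analysis and compare against a previously validated reference output.
def ejection_fraction(edv_ml, esv_ml):
    return 100.0 * (edv_ml - esv_ml) / edv_ml

REFERENCE = {"case_01": 60.0}   # stored, previously validated result (invented)

def test_ejection_fraction_regression():
    result = ejection_fraction(edv_ml=150.0, esv_ml=60.0)
    assert abs(result - REFERENCE["case_01"]) < 1e-6
```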

  16. DNA Commission of the International Society for Forensic Genetics: Recommendations on the validation of software programs performing biostatistical calculations for forensic genetics applications.

    PubMed

    Coble, M D; Buckleton, J; Butler, J M; Egeland, T; Fimmers, R; Gill, P; Gusmão, L; Guttman, B; Krawczak, M; Morling, N; Parson, W; Pinto, N; Schneider, P M; Sherry, S T; Willuweit, S; Prinz, M

    2016-11-01

    The use of biostatistical software programs to assist in data interpretation and calculate likelihood ratios is essential to forensic geneticists and part of the daily casework flow for both kinship and DNA identification laboratories. Previous recommendations issued by the DNA Commission of the International Society for Forensic Genetics (ISFG) covered the application of biostatistical evaluations for STR typing results in identification and kinship cases, and this is now being expanded to provide best practices regarding validation and verification of the software required for these calculations. With larger multiplexes, more complex mixtures, and increasing requests for extended family testing, laboratories are relying more than ever on specific software solutions, and sufficient validation, training, and extensive documentation are of utmost importance. Here, we present recommendations for the minimum requirements to validate biostatistical software to be used in forensic genetics. We distinguish between developmental validation and the responsibilities of the software developer or provider, and the internal validation studies to be performed by the end user. Recommendations for the software provider address, for example, the documentation of the underlying models used by the software, validation data expectations, version control, implementation and training support, as well as continuity and user notifications. For the internal validations the recommendations include: creating a validation plan, requirements for the range of samples to be tested, Standard Operating Procedure development, and internal laboratory training and education. To ensure that all laboratories have access to a wide range of samples for validation and training purposes the ISFG DNA Commission encourages collaborative studies and public repositories of STR typing results. Published by Elsevier Ireland Ltd.

  17. Communications processor for C3 analysis and wargaming

    NASA Astrophysics Data System (ADS)

    Clark, L. N.; Pless, L. D.; Rapp, R. L.

    1982-03-01

    This thesis developed the software capability to allow the investigation of C3 problems, procedures, and methodologies. The resultant communications model, while independent of a specific wargame, is currently implemented in conjunction with the McClintic Theater Model. It provides a computerized message handling system (C3 Model) which allows simulation of communication links (circuits) with user-definable delays; garble and loss rates; and multiple circuit types, addresses, and levels of command. It is designed to be used for test and evaluation of command and control problems in the areas of organizational relationships, communication networks and procedures, and combat doctrine or tactics.

  18. A Re-programmable Platform for Dynamic Burn-in Test of Xilinx Virtex-II 3000 FPGA for Military and Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Roosta, Ramin; Wang, Xinchen; Sadigursky, Michael; Tracton, Phil

    2004-01-01

    Field Programmable Gate Arrays (FPGAs) have played increasingly important roles in military and aerospace applications. Xilinx SRAM-based FPGAs have been extensively used in commercial applications. They have been used less frequently in space flight applications due to their susceptibility to single-event upsets. The reliability of these devices in space applications is a concern that has not been addressed. The objective of this project is to design a fully programmable hardware/software platform that allows (but is not limited to) comprehensive static/dynamic burn-in testing of Virtex-II 3000 FPGAs, at-speed testing, and SEU testing. Conventional methods test very few discrete AC parameters (primarily switching) of a given integrated circuit. This approach will test any possible configuration of the FPGA and any associated performance parameters. It allows complete or partial re-programming of the FPGA and verification of the program by using readback followed by dynamic test. Designers have full control over which functional elements of the FPGA to stress. They can completely simulate all possible types of configurations/functions. Another benefit of this platform is that it allows collecting information on the elevation of the junction temperature as a function of gate utilization, operating frequency, and functionality. A software tool has been implemented to demonstrate the various features of the system. The software consists of three major parts: the parallel interface driver, the main system procedure, and a graphical user interface (GUI).

  19. Test plan for 32-bit microcomputers for the Water Resources Division; Chapter A, Test plan for acquisition of prototype 32-bit microcomputers

    USGS Publications Warehouse

    Hutchison, N.E.; Harbaugh, A.W.; Holloway, R.A.; Merk, C.F.

    1987-01-01

    The Water Resources Division (WRD) of the U.S. Geological Survey is evaluating 32-bit microcomputers to determine how they can complement, and perhaps later replace, the existing network of minicomputers. The WRD is also designing a National Water Information System (NWIS) that will combine and integrate the existing National Water Data Storage and Retrieval System (WATSTORE), National Water Data Exchange (NAWDEX), and components of several other existing systems. The procedures and testing done in a market evaluation of 32-bit microcomputers are documented. The results of the testing are documented in the NWIS Project Office. The market evaluation was done to identify commercially available hardware and software that could be used for implementing early NWIS prototypes to determine the applicability of 32-bit microcomputers for data base and general computing applications. Three microcomputers will be used for these prototype studies. The results of the prototype studies will be used to compile requirements for a Request for Procurement (RFP) for hardware and software to meet the WRD's needs in the early 1990's. The identification of qualified vendors to provide the prototype hardware and software included reviewing industry literature, and making telephone calls and personal visits to prospective vendors. Those vendors that appeared to meet general requirements were required to run benchmark tests. (Author's abstract)

  20. E-Control: First Public Release of Remote Control Software for VLBI Telescopes

    NASA Technical Reports Server (NTRS)

    Neidhardt, Alexander; Ettl, Martin; Rottmann, Helge; Ploetz, Christian; Muehlbauer, Matthias; Hase, Hayo; Alef, Walter; Sobarzo, Sergio; Herrera, Cristian; Himwich, Ed

    2010-01-01

    Automating and remotely controlling observations are important for future operations in a Global Geodetic Observing System (GGOS). At the Geodetic Observatory Wettzell, in cooperation with the Max-Planck-Institute for Radio Astronomy in Bonn, a software extension to the existing NASA Field System has been developed for remote control. It uses the principle of a remotely accessible, autonomous process cell as a server extension for the Field System. The communication is realized for low transfer rates using Remote Procedure Calls (RPC). It uses generative programming with the interface software generator idl2rpc.pl developed at Wettzell. The user interacts with this system over a modern graphical user interface created with wxWidgets. For security reasons the communication is automatically tunneled through a Secure Shell (SSH) session to the telescope. There have already been successful test observations with the telescopes at O'Higgins, Concepcion, and Wettzell. At Wettzell the software is already used routinely for weekend observations. Therefore the first public release of the software is now available, which will also be useful for other telescopes.
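
    The actual system generates RPC stubs with idl2rpc.pl against the NASA Field System; as an analogous sketch only, Python's built-in xmlrpc below shows the same remote-procedure pattern. The method name telescope_status and its reply are invented.

```python
import threading
from xmlrpc.client import ServerProxy
from xmlrpc.server import SimpleXMLRPCServer

# Analogous sketch of the remote-procedure pattern: a server process exposes
# a callable that a remote client invokes over the network. This is NOT the
# Field System interface; names and payloads are hypothetical.
server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
server.register_function(lambda: "idle", "telescope_status")
threading.Thread(target=server.serve_forever, daemon=True).start()

port = server.server_address[1]                 # OS-assigned port
client = ServerProxy(f"http://127.0.0.1:{port}")
status = client.telescope_status()              # executes on the server side
server.shutdown()
```

    In the real system this channel would additionally be tunneled through SSH, as the abstract notes.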

  1. A Roadmap for Using Agile Development in a Traditional Environment

    NASA Technical Reports Server (NTRS)

    Streiffert, Barbara A.; Starbird, Thomas; Grenander, Sven

    2006-01-01

    One of the newer classes of software engineering techniques is called 'Agile Development'. In Agile Development, software engineers take small implementation steps and, in some cases, program in pairs. In addition, they develop automated tests prior to implementing each small functional piece. Agile Development focuses on rapid turnaround, incremental planning, customer involvement, and continuous integration. Agile Development is not the traditional waterfall method, nor is it rapid prototyping (although that methodology is closer to Agile Development). At the Jet Propulsion Laboratory (JPL), a few groups have begun Agile Development software implementations. The difficulty with this approach becomes apparent when Agile Development is used in an organization that has specific criteria and requirements handed down for how software development is to be performed. The work at JPL is performed for the National Aeronautics and Space Administration (NASA). Both organizations have specific requirements, rules, and procedures for developing software. This paper discusses some of the initial uses of the Agile Development methodology, the spread of this method, and the current status of its successful incorporation into current JPL development policies.
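
    The test-first practice mentioned (writing the automatic test before the small functional piece) looks like the sketch below; the function and its orbital arithmetic are an invented example, not JPL code.

```python
# Test-first sketch: the assertion is written before the implementation it
# exercises. The function name and the numbers are invented for illustration.
def test_complete_orbits():
    assert count_complete_orbits(elapsed_s=16200, period_s=5400) == 3

# Minimal implementation written afterwards to make the test above pass.
def count_complete_orbits(elapsed_s, period_s):
    return elapsed_s // period_s
```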

  2. relaxGUI: a new software for fast and simple NMR relaxation data analysis and calculation of ps-ns and μs motion of proteins.

    PubMed

    Bieri, Michael; d'Auvergne, Edward J; Gooley, Paul R

    2011-06-01

    Investigation of protein dynamics on the ps-ns and μs-ms timeframes provides detailed insight into the mechanisms of enzymes and the binding properties of proteins. Nuclear magnetic resonance (NMR) is an excellent tool for studying protein dynamics at atomic resolution. Analysis of relaxation data using model-free analysis can be a tedious and time-consuming process, which requires good knowledge of scripting procedures. The software relaxGUI was developed for fast and simple model-free analysis and is fully integrated into the software package relax. It is written in Python and uses wxPython to build the graphical user interface (GUI) for maximum performance and multi-platform use. This software allows the analysis of NMR relaxation data with ease and the generation of publication-quality graphs as well as color-coded images of molecular structures. The interface is designed for simple data analysis and management. The software was tested and validated against the command line version of relax.

  3. Guidance and Control Software Project Data - Volume 3: Verification Documents

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the verification documents from the GCS project. Volume 3 contains four appendices: A. Software Verification Cases and Procedures for the Guidance and Control Software Project; B. Software Verification Results for the Pluto Implementation of the Guidance and Control Software; C. Review Records for the Pluto Implementation of the Guidance and Control Software; and D. Test Results Logs for the Pluto Implementation of the Guidance and Control Software.

  4. WalkThrough Example Procedures for MAMA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruggiero, Christy E.; Gaschen, Brian Keith; Bloch, Jeffrey Joseph

    This documentation is a growing set of walk-through examples of analyses using the MAMA V2.0 software. It does not cover all the features or possibilities of the MAMA software, but it addresses using many of the basic analysis tools to quantify particle size and shape in an image. This document will continue to evolve as additional procedures and examples are added. The starting assumption is that the MAMA software has been successfully installed.

  5. Three-dimensional path planning software-assisted transjugular intrahepatic portosystemic shunt: a technical modification.

    PubMed

    Tsauo, Jiaywei; Luo, Xuefeng; Ye, Linchao; Li, Xiao

    2015-06-01

    This study was designed to report our results with a modified technique of three-dimensional (3D) path planning software assisted transjugular intrahepatic portosystemic shunt (TIPS). 3D path planning software was recently developed to facilitate TIPS creation by using two carbon dioxide portograms acquired at least 20° apart to generate a 3D path for overlay needle guidance. However, one shortcoming is that puncturing along the overlay would be technically impossible if the angle of the liver access set and the angle of the 3D path are not the same. To solve this problem, a prototype 3D path planning software was fitted with a utility to calculate the angle of the 3D path. Using this, we modified the angle of the liver access set accordingly during the procedure in ten patients. Failure for technical reasons occurred in three patients (unsuccessful wedged hepatic venography in two cases, software technical failure in one case). The procedure was successful in the remaining seven patients, and only one needle pass was required to obtain portal vein access in each case. The course of puncture was comparable to the 3D path in all patients. No procedure-related complication occurred following the procedures. Adjusting the angle of the liver access set to match the angle of the 3D path determined by the software appears to be a favorable modification to the technique of 3D path planning software assisted TIPS.

  6. The ISO SWS on-line system

    NASA Technical Reports Server (NTRS)

    Roelfsema, P. R.; Kester, D. J. M.; Wesselius, P. R.; Wieprech, E.; Sym, N.

    1992-01-01

    The software which is currently being developed for the Short Wavelength Spectrometer (SWS) of the Infrared Space Observatory (ISO) is described. The spectrometer has a wide range of capabilities in the 2-45 micron infrared band. SWS contains two independent gratings, one for the long and one for the short wavelength section of the band. With the gratings a spectral resolution of approximately 1000 to approximately 2500 can be obtained. The instrument also contains two Fabry-Pérot interferometers yielding a resolution between approximately 1000 and approximately 20000. Software is currently being developed for the acquisition, calibration, and analysis of SWS data. The software is first required to run in a pipeline mode without human interaction, to process data as they are received from the telescope. Second, for testing and calibration of the instrument, as well as for evaluation of the planned operating procedures, the software should also be suitable for interactive use. Third, the same software will be used for long-term characterization of the instrument. The software must work properly within the environment designed by the European Space Agency (ESA) for the spacecraft operations. As a result, strict constraints are put on I/O devices, throughput, etc.

  7. Users manual for an expert system (HSPEXP) for calibration of the Hydrological Simulation Program--Fortran

    USGS Publications Warehouse

    Lumb, A.M.; McCammon, R.B.; Kittle, J.L.

    1994-01-01

    Expert system software was developed to assist less experienced modelers with calibration of a watershed model and to facilitate the interaction between the modeler and the modeling process not provided by mathematical optimization. A prototype was developed with artificial intelligence software tools, a knowledge engineer, and two domain experts. The manual procedures used by the domain experts were identified and the prototype was then coded by the knowledge engineer. The expert system consists of a set of hierarchical rules designed to guide the calibration of the model through a systematic evaluation of model parameters. When the prototype was completed and tested, it was rewritten for portability and operational use and was named HSPEXP. The watershed model Hydrological Simulation Program--Fortran (HSPF) is used in the expert system. This report is the users manual for HSPEXP and contains a discussion of the concepts and detailed steps and examples for using the software. The system has been tested on watersheds in the States of Washington and Maryland, and the system correctly identified the model parameters to be adjusted and the adjustments led to improved calibration.

  8. Rule groupings: An approach towards verification of expert systems

    NASA Technical Reports Server (NTRS)

    Mehrotra, Mala

    1991-01-01

    Knowledge-based expert systems are playing an increasingly important role in NASA space and aircraft systems. However, many of NASA's software applications are life- or mission-critical and knowledge-based systems do not lend themselves to the traditional verification and validation techniques for highly reliable software. Rule-based systems lack the control abstractions found in procedural languages. Hence, it is difficult to verify or maintain such systems. Our goal is to automatically structure a rule-based system into a set of rule-groups having a well-defined interface to other rule-groups. Once a rule base is decomposed into such 'firewalled' units, studying the interactions between rules would become more tractable. Verification-aid tools can then be developed to test the behavior of each such rule-group. Furthermore, the interactions between rule-groups can be studied in a manner similar to integration testing. Such efforts will go a long way towards increasing our confidence in the expert-system software. Our research efforts address the feasibility of automating the identification of rule groups, in order to decompose the rule base into a number of meaningful units.

  9. Software Engineering Laboratory (SEL) data base reporting software user's guide and system description. Volume 1: Introduction and user's guide

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Reporting software programs provide formatted listings and summary reports of the Software Engineering Laboratory (SEL) data base contents. The operating procedures and system information for 18 different reporting software programs are described. Sample output reports from each program are provided.

  10. On fitting generalized linear mixed-effects models for binary responses using different statistical packages.

    PubMed

    Zhang, Hui; Lu, Naiji; Feng, Changyong; Thurston, Sally W; Xia, Yinglin; Zhu, Liang; Tu, Xin M

    2011-09-10

    The generalized linear mixed-effects model (GLMM) is a popular paradigm to extend models for cross-sectional data to a longitudinal setting. When applied to modeling binary responses, different software packages and even different procedures within a package may give quite different results. In this report, we describe the statistical approaches that underlie these different procedures and discuss their strengths and weaknesses when applied to fit correlated binary responses. We then illustrate these considerations by applying these procedures implemented in some popular software packages to simulated and real study data. Our simulation results indicate a lack of reliability for most of the procedures considered, which carries significant implications for applying such popular software packages in practice. Copyright © 2011 John Wiley & Sons, Ltd.
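Much of the disagreement between procedures stems from how each one approximates the intractable integral over the random effects. A minimal sketch of that integral for one cluster of a random-intercept logistic model, using simple midpoint-rule integration as a stand-in for the adaptive quadrature or linearization that real packages use (all function and variable names here are hypothetical):

```python
import math

def cluster_marginal_likelihood(y, x, beta, sigma_b, grid=2001, span=6.0):
    """Marginal likelihood of one cluster's binary responses under a
    random-intercept logistic model, integrating the random effect b
    over b ~ N(0, sigma_b^2) by the midpoint rule."""
    step = 2 * span * sigma_b / grid
    total = 0.0
    for i in range(grid):
        b = -span * sigma_b + (i + 0.5) * step
        # Conditional likelihood of the cluster's responses given b
        lik = 1.0
        for yi, xi in zip(y, x):
            p = 1.0 / (1.0 + math.exp(-(beta * xi + b)))
            lik *= p if yi else (1.0 - p)
        # Normal density of the random intercept
        phi = math.exp(-0.5 * (b / sigma_b) ** 2) / (sigma_b * math.sqrt(2 * math.pi))
        total += lik * phi * step
    return total

L = cluster_marginal_likelihood(y=[1, 0, 1], x=[0.0, 1.0, 2.0],
                                beta=0.5, sigma_b=1.0)
```

Procedures that linearize this integral (e.g., penalized quasi-likelihood) can behave quite differently from ones that evaluate it by quadrature, which is one source of the package-to-package discrepancies the abstract reports.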

  11. Quality Assurance of RNA Expression Profiling in Clinical Laboratories

    PubMed Central

    Tang, Weihua; Hu, Zhiyuan; Muallem, Hind; Gulley, Margaret L.

    2012-01-01

    RNA expression profiles are increasingly used to diagnose and classify disease, based on expression patterns of as many as several thousand RNAs. To ensure quality of expression profiling services in clinical settings, a standard operating procedure incorporates multiple quality indicators and controls, beginning with preanalytic specimen preparation and proceeding through analysis, interpretation, and reporting. Before testing, histopathological examination of each cellular specimen, along with optional cell enrichment procedures, ensures adequacy of the input tissue. Other tactics include endogenous controls to evaluate adequacy of RNA and exogenous or spiked controls to evaluate run- and patient-specific performance of the test system, respectively. Unique aspects of quality assurance for array-based tests include controls for the pertinent outcome signatures that often supersede controls for each individual analyte, built-in redundancy for critical analytes or biochemical pathways, and software-supported scrutiny of abundant data by a laboratory physician who interprets the findings in a manner facilitating appropriate medical intervention. Access to high-quality reagents, instruments, and software from commercial sources promotes standardization and adoption in clinical settings, once an assay is vetted in validation studies as being analytically sound and clinically useful. Careful attention to the well-honed principles of laboratory medicine, along with guidance from government and professional groups on strategies to preserve RNA and manage large data sets, promotes clinical-grade assay performance. PMID:22020152

  12. Software Development Standard Processes (SDSP)

    NASA Technical Reports Server (NTRS)

    Lavin, Milton L.; Wang, James J.; Morillo, Ronald; Mayer, John T.; Jamshidian, Barzia; Shimizu, Kenneth J.; Wilkinson, Belinda M.; Hihn, Jairus M.; Borgen, Rosana B.; Meyer, Kenneth N.; hide

    2011-01-01

    A JPL-created set of standard processes is to be used throughout the lifecycle of software development. These SDSPs cover a range of activities, from management and engineering activities, to assurance and support activities. These processes must be applied to software tasks per a prescribed set of procedures. JPL's Software Quality Improvement Project is currently working at the behest of the JPL Software Process Owner to ensure that all applicable software tasks follow these procedures. The SDSPs are captured as a set of 22 standards in JPL's software process domain. They were developed in-house at JPL by a number of Subject Matter Experts (SMEs) residing primarily within the Engineering and Science Directorate, but also from the Business Operations Directorate and Safety and Mission Success Directorate. These practices include not only currently performed best practices, but also JPL-desired future practices in key thrust areas like software architecting and software reuse analysis. Additionally, these SDSPs conform to many standards and requirements to which JPL projects are beholden.

  13. An approach for assessing software prototypes

    NASA Technical Reports Server (NTRS)

    Church, V. E.; Card, D. N.; Agresti, W. W.; Jordan, Q. L.

    1986-01-01

    A procedure for evaluating a software prototype is presented. The need to assess the prototype itself arises from the use of prototyping to demonstrate the feasibility of a design or development strategy. The assessment procedure can also be of use in deciding whether to evolve a prototype into a complete system. The procedure consists of identifying evaluation criteria, defining alternative design approaches, and ranking the alternatives according to the criteria.
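The three-step procedure (identify criteria, define alternatives, rank) amounts to a weighted scoring exercise; a minimal sketch with hypothetical criteria, alternatives, and weights:

```python
def rank_alternatives(scores, weights):
    """Rank design alternatives by the weighted sum of per-criterion scores.

    scores  : {alternative: {criterion: score}}
    weights : {criterion: weight}
    Returns (alternative, total) pairs sorted best-first.
    """
    totals = {
        alt: sum(weights[c] * s for c, s in crit.items())
        for alt, crit in scores.items()
    }
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical evaluation of two prototype evolution strategies
ranking = rank_alternatives(
    scores={"evolve_prototype": {"cost": 3, "risk": 4},
            "rebuild": {"cost": 2, "risk": 5}},
    weights={"cost": 0.6, "risk": 0.4},
)
```

The paper's procedure leaves the choice of criteria and weights to the assessors; the scoring mechanics above are only one conventional way to combine them.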

  14. The Effectiveness of the GeoGebra Software: The Intermediary Role of Procedural Knowledge on Students' Conceptual Knowledge and Their Achievement in Mathematics

    ERIC Educational Resources Information Center

    Zulnaidi, Hutkemri; Zamri, Sharifah Norul Akmar Syed

    2017-01-01

    The aim of this study was to identify the effects of GeoGebra software on students' conceptual and procedural knowledge and the achievement of Form Two students in Mathematics particularly on the topic Function. In addition, this study determined the role of procedural knowledge as a mediator between conceptual knowledge and students' achievement.…

  15. A system study for the application of microcomputers to research flight test techniques

    NASA Technical Reports Server (NTRS)

    Smyth, R. K.

    1983-01-01

    The onboard simulator is a three-degree-of-freedom aircraft behavior simulator that provides parameters used by the interception procedure. These parameters can be used for verifying closed-loop performance before flight. The air-to-air intercept mode is a software package integrated into the simulation process that generates target motion and performs a tracking procedure that predicts the most likely next target position for a defined time step. This procedure also updates relative position parameters and issues appropriate fire commands. A microcomputer-based aircraft spin warning system periodically samples the asymmetric thrust and yaw rate of an airplane and then issues voice-synthesized warnings and/or suggests to the pilot how to respond to the situation.

  16. Development of flying qualities criteria for single pilot instrument flight operations

    NASA Technical Reports Server (NTRS)

    Bar-Gill, A.; Nixon, W. B.; Miller, G. E.

    1982-01-01

    Flying qualities criteria for Single Pilot Instrument Flight Rule (SPIFR) operations were investigated. The ARA aircraft was modified and adapted for SPIFR operations. Aircraft configurations to be flight-tested were chosen and matched on the ARA in-flight simulator, implementing modern control theory algorithms. Mission planning and experimental matrix design were completed. Microprocessor software for the onboard data acquisition system was debugged and flight-tested. Flight-path reconstruction procedure and the associated FORTRAN program were developed. Algorithms associated with the statistical analysis of flight test results and the SPIFR flying qualities criteria deduction are discussed.

  17. A Remote Registration Based on MIDAS

    NASA Astrophysics Data System (ADS)

    JIN, Xin

    2017-04-01

    Software registration is often needed to protect the interests of software developers. This article describes a remote software registration technique. The registration method works as follows: the registration information is stored in a database table; when the program starts, it checks the registration information in the table, and if the software is registered the program runs normally. Otherwise, the user must enter a serial number, which is verified over the network on a remote server. If registration succeeds, the registration information is recorded in the database table. This remote registration method can protect the rights of software developers.
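The startup flow described above can be sketched as follows; the table layout, serial-number format, and the `verify_serial_remotely` stub are assumptions, with the real network call to the remote server omitted:

```python
import sqlite3

def is_registered(db):
    """Check the registration table the program consults at startup."""
    return db.execute("SELECT serial FROM registration").fetchone() is not None

def verify_serial_remotely(serial):
    # Placeholder for the remote-server check described in the abstract;
    # a real system would validate the serial over the network.
    return len(serial) == 16

def register(db, serial):
    """On successful verification, record the registration in the table."""
    if verify_serial_remotely(serial):
        db.execute("INSERT INTO registration (serial) VALUES (?)", (serial,))
        db.commit()
        return True
    return False

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE registration (serial TEXT)")
first_run = is_registered(db)          # False: not yet registered
ok = register(db, "ABCD-1234-EFGH-5")  # hypothetical 16-character serial
```

On the next startup, `is_registered` finds the stored serial and the program proceeds without contacting the server again.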

  18. A system for automatic evaluation of simulation software

    NASA Technical Reports Server (NTRS)

    Ryan, J. P.; Hodges, B. C.

    1976-01-01

    Within the field of computer software, simulation and verification are complementary processes. Simulation methods can be used to verify software by performing variable range analysis. More general verification procedures, such as those described in this paper, can be implicitly viewed as attempts at modeling the end-product software. From the standpoint of software requirements methodology, each component of the verification system has some element of simulation in it. Conversely, general verification procedures can be used to analyze simulation software. A dynamic analyzer is described which can be used to obtain properly scaled variables for an analog simulation, which is first digitally simulated. In a similar way, it is thought that the other system components, and indeed the whole system itself, have the potential of being effectively used in a simulation environment.

  19. Real-time fluoroscopic needle guidance in the interventional radiology suite using navigational software for percutaneous bone biopsies in children.

    PubMed

    Shellikeri, Sphoorti; Setser, Randolph M; Hwang, Tiffany J; Srinivasan, Abhay; Krishnamurthy, Ganesh; Vatsky, Seth; Girard, Erin; Zhu, Xiaowei; Keller, Marc S; Cahill, Anne Marie

    2017-07-01

    Navigational software provides real-time fluoroscopic needle guidance for percutaneous procedures in the Interventional Radiology (IR) suite. We describe our experience with navigational software for pediatric percutaneous bone biopsies in the IR suite and compare technical success, diagnostic accuracy, radiation dose and procedure time with that of CT-guided biopsies. Pediatric bone biopsies performed using navigational software (Syngo iGuide, Siemens Healthcare) from 2011 to 2016 were prospectively included and anatomically matched CT-guided bone biopsies from 2008 to 2016 were retrospectively reviewed with institutional review board approval. C-arm CT protocols used for navigational software-assisted cases included institution-developed low-dose (0.1/0.17 μGy/projection), regular-dose (0.36 μGy/projection), or a combination of low-dose/regular-dose protocols. Estimated effective radiation dose and procedure times were compared between software-assisted and CT-guided biopsies. Twenty-six patients (15 male; mean age: 10 years) underwent software-assisted biopsies (15 pelvic, 7 lumbar and 4 lower extremity) and 33 patients (13 male; mean age: 9 years) underwent CT-guided biopsies (22 pelvic, 7 lumbar and 4 lower extremity). Both modality biopsies resulted in a 100% technical success rate. Twenty-five of 26 (96%) software-assisted and 29/33 (88%) CT-guided biopsies were diagnostic. Overall, the effective radiation dose was significantly lower in software-assisted than CT-guided cases (3.0±3.4 vs. 6.6±7.7 mSv, P=0.02). The effective dose difference was most dramatic in software-assisted cases using low-dose C-arm CT (1.2±1.8 vs. 6.6±7.7 mSv, P=0.001) or combined low-dose/regular-dose C-arm CT (1.9±2.4 vs. 6.6±7.7 mSv, P=0.04), whereas effective dose was comparable in software-assisted cases using regular-dose C-arm CT (6.0±3.5 vs. 6.6±7.7 mSv, P=0.7). Mean procedure time was significantly lower for software-assisted cases (91±54 vs. 141±68 min, P=0.005). 
In our experience, navigational software technology in the IR suite is a promising alternative to CT guidance for pediatric bone biopsies providing comparable technical success and diagnostic accuracy with lower radiation dose and procedure time, in addition to providing real-time fluoroscopic needle guidance.

  20. Identifying fMRI Model Violations with Lagrange Multiplier Tests

    PubMed Central

    Cassidy, Ben; Long, Christopher J; Rae, Caroline; Solo, Victor

    2013-01-01

    The standard modeling framework in Functional Magnetic Resonance Imaging (fMRI) is predicated on assumptions of linearity, time invariance and stationarity. These assumptions are rarely checked because doing so requires specialised software, although failure to do so can lead to bias and mistaken inference. Identifying model violations is an essential but largely neglected step in standard fMRI data analysis. Using Lagrange Multiplier testing methods we have developed simple and efficient procedures for detecting model violations such as non-linearity, non-stationarity and validity of the common Double Gamma specification for hemodynamic response. These procedures are computationally cheap and can easily be added to a conventional analysis. The test statistic is calculated at each voxel and displayed as a spatial anomaly map which shows regions where a model is violated. The methodology is illustrated with a large number of real data examples. PMID:22542665

  1. 48 CFR 227.7200 - Scope of subpart.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... OF DEFENSE GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7200 Scope of subpart. This subpart— (a) Prescribes policies and procedures for the acquisition of computer software and computer software documentation, and the...

  2. 48 CFR 227.7200 - Scope of subpart.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... OF DEFENSE GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7200 Scope of subpart. This subpart— (a) Prescribes policies and procedures for the acquisition of computer software and computer software documentation, and the...

  3. 48 CFR 227.7200 - Scope of subpart.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... OF DEFENSE GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7200 Scope of subpart. This subpart— (a) Prescribes policies and procedures for the acquisition of computer software and computer software documentation, and the...

  4. 48 CFR 227.7200 - Scope of subpart.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... OF DEFENSE GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7200 Scope of subpart. This subpart— (a) Prescribes policies and procedures for the acquisition of computer software and computer software documentation, and the...

  5. 48 CFR 227.7200 - Scope of subpart.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... OF DEFENSE GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7200 Scope of subpart. This subpart— (a) Prescribes policies and procedures for the acquisition of computer software and computer software documentation, and the...

  6. A Procedural Electroencephalogram Simulator for Evaluation of Anesthesia Monitors.

    PubMed

    Petersen, Christian Leth; Görges, Matthias; Massey, Roslyn; Dumont, Guy Albert; Ansermino, J Mark

    2016-11-01

    Recent research and advances in the automation of anesthesia are driving the need to better understand electroencephalogram (EEG)-based anesthesia end points and to test the performance of anesthesia monitors. This effort is currently limited by the need to collect raw EEG data directly from patients. A procedural method to synthesize EEG signals was implemented in a mobile software application. The application is capable of sending the simulated signal to an anesthesia depth of hypnosis monitor. Systematic sweeps of the simulator generate functional monitor response profiles reminiscent of how network analyzers are used to test electronic components. Three commercial anesthesia monitors (Entropy, NeuroSENSE, and BIS) were compared with this new technology, and significant response and feature variations between the monitor models were observed; this includes reproducible, nonmonotonic apparent multistate behavior and significant hysteresis at light levels of anesthesia. Anesthesia monitor response to a procedural simulator can reveal significant differences in internal signal processing algorithms. The ability to synthesize EEG signals at different anesthetic depths potentially provides a new method for systematically testing EEG-based monitors and automated anesthesia systems with all sensor hardware fully operational before human trials.
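The abstract does not disclose the synthesis algorithm, but a toy sketch of procedural EEG generation conveys the idea: sum sinusoids across clinical frequency bands, with band weights that shift as anesthetic depth increases. The depth-to-weight mapping below is entirely illustrative, not the application's model:

```python
import math
import random

def synth_eeg(depth, duration_s=1.0, fs=128, seed=0):
    """Synthesize a toy EEG trace whose spectral content shifts with
    anesthetic depth (0 = awake, 1 = deep): deeper anesthesia boosts
    slow delta activity and suppresses fast beta activity."""
    rng = random.Random(seed)
    bands = {  # (low_hz, high_hz): illustrative depth-dependent weight
        (1, 4): 0.2 + 0.8 * depth,     # delta
        (8, 12): 0.5,                  # alpha
        (13, 30): 1.0 - 0.8 * depth,   # beta
    }
    n = int(duration_s * fs)
    samples = [0.0] * n
    for (lo, hi), w in bands.items():
        for f in range(lo, hi + 1):
            phase = rng.uniform(0, 2 * math.pi)  # random phase per component
            for i in range(n):
                samples[i] += w * math.sin(2 * math.pi * f * i / fs + phase)
    return samples

trace = synth_eeg(depth=0.8)
```

Sweeping `depth` from 0 to 1 and feeding the resulting traces to a monitor is the essence of the "network analyzer" style response profiling the abstract describes.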

  7. GCS plan for software aspects of certification

    NASA Technical Reports Server (NTRS)

    Shagnea, Anita M.; Lowman, Douglas S.; Withers, B. Edward

    1990-01-01

    As part of the Guidance and Control Software (GCS) research project being sponsored by NASA to evaluate the failure processes of software, standard industry software development procedures are being employed. To ensure that these procedures are authentic, the guidelines outlined in the Radio Technical Commission for Aeronautics (RTCA) DO-178A document entitled "Software Considerations in Airborne Systems and Equipment Certification" were adopted. A major aspect of these guidelines is proper documentation. As such, this report, the plan for software aspects of certification, was produced in accordance with DO-178A. An overview is given of the GCS research project, including the goals of the project, project organization, and project schedules. It also specifies the plans for all aspects of the project which relate to the certification of the GCS implementations developed under a NASA contract. These plans include decisions made regarding the software specification, accuracy requirements, configuration management, implementation development and verification, and the development of the GCS simulator.

  8. Development of parallel line analysis criteria for recombinant adenovirus potency assay and definition of a unit of potency.

    PubMed

    Ogawa, Yasushi; Fawaz, Farah; Reyes, Candice; Lai, Julie; Pungor, Erno

    2007-01-01

    Parameter settings of a parallel line analysis procedure were defined by applying statistical analysis procedures to the absorbance data from a cell-based potency bioassay for a recombinant adenovirus, Adenovirus 5 Fibroblast Growth Factor-4 (Ad5FGF-4). The parallel line analysis was performed with a commercially available software, PLA 1.2. The software performs Dixon outlier test on replicates of the absorbance data, performs linear regression analysis to define linear region of the absorbance data, and tests parallelism between the linear regions of standard and sample. Width of Fiducial limit, expressed as a percent of the measured potency, was developed as a criterion for rejection of the assay data and to significantly improve the reliability of the assay results. With the linear range-finding criteria of the software set to a minimum of 5 consecutive dilutions and best statistical outcome, and in combination with the Fiducial limit width acceptance criterion of <135%, 13% of the assay results were rejected. With these criteria applied, the assay was found to be linear over the range of 0.25 to 4 relative potency units, defined as the potency of the sample normalized to the potency of Ad5FGF-4 standard containing 6 x 10^6 adenovirus particles/mL. The overall precision of the assay was estimated to be 52%. Without the application of Fiducial limit width criterion, the assay results were not linear over the range, and an overall precision of 76% was calculated from the data. An absolute unit of potency for the assay was defined by using the parallel line analysis procedure as the amount of Ad5FGF-4 that results in an absorbance value that is 121% of the average absorbance readings of the wells containing cells not infected with the adenovirus.
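The heart of parallel line analysis is fitting linear log-dose regions for standard and sample and, once parallelism is accepted, reading relative potency off the horizontal shift between the two lines. A minimal sketch of that calculation (not the PLA 1.2 algorithm, which additionally applies outlier, range-finding, and Fiducial-limit criteria):

```python
import math

def linfit(xs, ys):
    """Ordinary least squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

def relative_potency(log_dose, resp_std, resp_smp):
    """Relative potency from the horizontal shift between two parallel
    log-dose/response lines, using the pooled slope."""
    b1, a1 = linfit(log_dose, resp_std)
    b2, a2 = linfit(log_dose, resp_smp)
    b = (b1 + b2) / 2  # pooled slope; parallelism should be tested first
    return 10 ** ((a2 - a1) / b)

# Synthetic example: sample behaves like the standard at half the dose,
# i.e. a true relative potency of 2
lg = [-1.0, -0.5, 0.0, 0.5, 1.0]
std = [0.2 + 1.0 * x for x in lg]
smp = [0.2 + 1.0 * (x + math.log10(2.0)) for x in lg]
rp = relative_potency(lg, std, smp)
```

With real absorbance data the shift is estimated with confidence (Fiducial) limits, whose width is the rejection criterion discussed in the abstract.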

  9. Large - scale Rectangular Ruler Automated Verification Device

    NASA Astrophysics Data System (ADS)

    Chen, Hao; Chang, Luping; Xing, Minjian; Xie, Xie

    2018-03-01

    This paper introduces a large-scale rectangular ruler automated verification device, which consists of a photoelectric autocollimator, a self-designed mechanical drive car, and an automatic data acquisition system. The mechanical design of the device covers the optical axis, the drive, the fixture, and the wheels. The control system comprises hardware and software: the hardware is based on a single-chip microcomputer, and the software implements the photoelectric autocollimation measurement and the automatic data acquisition process. The device can acquire vertical measurement data automatically. Its reliability was verified by experimental comparison, and the results meet the requirements of the right-angle test procedure.

  10. FMT (Flight Software Memory Tracker) For Cassini Spacecraft-Software Engineering Using JAVA

    NASA Technical Reports Server (NTRS)

    Kan, Edwin P.; Uffelman, Hal; Wax, Allan H.

    1997-01-01

    The software engineering design of the Flight Software Memory Tracker (FMT) Tool is discussed in this paper. FMT is a ground analysis software set, consisting of utilities and procedures, designed to track the flight software, i.e., images of memory load and updatable parameters of the computers on-board Cassini spacecraft. FMT is implemented in Java.

  11. Pre-Flight Dark Forward Electrical Testing of the Mir Cooperative Solar Array

    NASA Technical Reports Server (NTRS)

    Kerslake, Thomas W.; Scheiman, David A.; Hoffman, David J.

    1997-01-01

    The Mir Cooperative Solar Array (MCSA) was developed jointly by the United States (US) and Russia to provide approximately 6 kW of photovoltaic power to the Russian space station Mir. After final assembly in Russia, the MCSA was shipped to the NASA Kennedy Space Center (KSC) in the summer of 1995 and launched to Mir in November 1995. Program managers were concerned of the potential for MCSA damage during the transatlantic shipment and the associated handling operations. To address this concern, NASA Lewis Research Center (LERC) developed an innovative dark-forward electrical test program to assess the gross electrical condition of each generator following shipment from Russia. The use of dark test techniques, which allow the array to remain in the stowed configuration, greatly simplifies the checkout of large area solar arrays. MCSA dark electrical testing was successfully performed at KSC in July 1995 following transatlantic shipment. Data from this testing enabled engineers to quantify the effects of potential MCSA physical damage that would degrade on-orbit electrical performance. In this paper, an overview of the principles and heritage of photovoltaic array dark testing is given. The specific MCSA dark test program is also described including the hardware, software, testing procedures and test results. The current-voltage (I-V) response of both solar cell circuitry and by-pass diode circuitry was obtained. To guide the development of dark test hardware, software and procedures, a dedicated FORTRAN computer code was developed to predict the dark I-V responses of generators with a variety of feasible damage modes. By comparing the actual test data with the predictions, the physical condition of the generator could be inferred. Based on this data analysis, no electrical short-circuits or open-circuits were detected. This suggested the MCSA did not sustain physical damage that affected electrical performance during handling and shipment from Russia to the US.
Good agreement between the test data and computational predictions indicated MCSA electrical performance was amenable to accurate analysis and was well understood.
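A sketch of the kind of dark I-V prediction the abstract describes, using the standard single-diode equation for a series string and treating shorted cells as an illustrative damage mode. All parameter values here are assumptions for illustration, not MCSA data:

```python
import math

def dark_iv_voltage(current_a, n_cells, shorted_cells=0,
                    i0=1e-9, nkt_q=0.026 * 1.3, r_s=0.01):
    """Dark forward voltage across a series string of solar cells at a
    forced current, from the single-diode equation per cell:
        V_cell = n*kT/q * ln(I/I0 + 1) + I*Rs
    A shorted (damaged) cell contributes no junction voltage."""
    v_cell = nkt_q * math.log(current_a / i0 + 1.0) + current_a * r_s
    return (n_cells - shorted_cells) * v_cell

# Comparing a healthy string with one that has four shorted cells shows
# the kind of signature the predicted-vs-measured comparison looks for
v_good = dark_iv_voltage(0.5, n_cells=80)
v_damaged = dark_iv_voltage(0.5, n_cells=80, shorted_cells=4)
```

Sweeping the forced current traces out the full dark I-V curve; a measured curve falling below the healthy prediction by an integer number of cell voltages would indicate shorted cells, while an open circuit would show no forward conduction at all.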

  12. Analysis of Photogrammetry Data from ISIM Mockup, June 1, 2007

    NASA Technical Reports Server (NTRS)

    Nowak, Maria; Hill, Mike

    2007-01-01

    During ground testing of the Integrated Science Instrument Module (ISIM) for the James Webb Space Telescope (JWST), the ISIM Optics group plans to use a Photogrammetry Measurement System for cryogenic calibration of specific target points on the ISIM composite structure, the Science Instrument optical benches, and other GSE equipment. This testing will occur in the Space Environmental Systems (SES) chamber at Goddard Space Flight Center. Close-range photogrammetry is a 3-dimensional metrology technique that uses triangulation to locate custom targets in 3 coordinates via a collection of digital photographs taken from various locations and orientations. These photos are connected using coded targets, special targets that are recognized by the software so that the images can be correlated into a 3-dimensional map of the targets, scaled via well-calibrated scale bars. Photogrammetry solves for the camera locations and the coordinates of the targets simultaneously through the bundling procedure contained in the V-STARS software.

  13. Leadership Development Program Final Project

    NASA Technical Reports Server (NTRS)

    Parrish, Teresa C.

    2016-01-01

    TOSC is NASA's prime contractor tasked to successfully assemble, test, and launch the EM1 spacecraft. TOSC's success is highly dependent on design products from the other NASA programs manufacturing and delivering the flight hardware: the Space Launch System (SLS) and the Multi-Purpose Crew Vehicle (MPCV). Design products directly feed into TOSC's procedures, personnel training, hardware assembly, software development, integrated vehicle test and checkout, and launch. TOSC senior management recognized a significant schedule risk because these products are still being developed by the other two programs, so the SVE and ACE positions were created.

  14. Three-Dimensional Path Planning Software-Assisted Transjugular Intrahepatic Portosystemic Shunt: A Technical Modification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsauo, Jiaywei, E-mail: 80732059@qq.com; Luo, Xuefeng, E-mail: luobo-913@126.com; Ye, Linchao, E-mail: linchao.ye@siemens.com

    2015-06-15

    Purpose: This study was designed to report our results with a modified technique of three-dimensional (3D) path planning software-assisted transjugular intrahepatic portosystemic shunt (TIPS) creation. Methods: 3D path planning software was recently developed to facilitate TIPS creation by using two carbon dioxide portograms acquired at least 20° apart to generate a 3D path for overlay needle guidance. However, one shortcoming is that puncturing along the overlay would be technically impossible if the angle of the liver access set and the angle of the 3D path are not the same. To solve this problem, a prototype of the 3D path planning software was fitted with a utility to calculate the angle of the 3D path. Using this, we modified the angle of the liver access set accordingly during the procedure in ten patients. Results: Failure for technical reasons occurred in three patients (unsuccessful wedged hepatic venography in two cases, software technical failure in one case). The procedure was successful in the remaining seven patients, and only one needle pass was required to obtain portal vein access in each case. The course of the puncture was comparable to the 3D path in all patients. No procedure-related complications occurred. Conclusions: Adjusting the angle of the liver access set to match the angle of the 3D path determined by the software appears to be a favorable modification to the technique of 3D path planning software-assisted TIPS.
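
    The angle utility at the heart of the modification reduces to elementary vector geometry. A hedged sketch (function and argument names are hypothetical, not the prototype software's API) of computing the angle between a planned puncture path and a reference axis:

    ```python
    import math

    def path_angle_deg(entry, target, axis=(0.0, 0.0, 1.0)):
        """Angle between the planned 3-D puncture path and a reference axis.

        Points are (x, y, z) in any consistent unit; the result is what the
        operator would match the liver access set's angle against.
        """
        v = tuple(t - e for e, t in zip(entry, target))
        dot = sum(a * b for a, b in zip(v, axis))
        norm_v = math.sqrt(sum(c * c for c in v))
        norm_a = math.sqrt(sum(c * c for c in axis))
        return math.degrees(math.acos(dot / (norm_v * norm_a)))
    ```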

  15. Hardware in-the-Loop Demonstration of Real-Time Orbit Determination in High Earth Orbits

    NASA Technical Reports Server (NTRS)

    Moreau, Michael; Naasz, Bo; Leitner, Jesse; Carpenter, J. Russell; Gaylor, Dave

    2005-01-01

    This paper presents results from a study conducted at Goddard Space Flight Center (GSFC) to assess the real-time orbit determination accuracy of GPS-based navigation in a number of different high Earth orbital regimes. Measurements collected from a GPS receiver (connected to a GPS radio frequency (RF) signal simulator) were processed in a navigation filter in real-time, and resulting errors in the estimated states were assessed. For the most challenging orbit simulated, a 12 hour Molniya orbit with an apogee of approximately 39,000 km, mean total position and velocity errors were approximately 7 meters and 3 mm/s respectively. The study also makes direct comparisons between the results from the above hardware in-the-loop tests and results obtained by processing GPS measurements generated from software simulations. Care was taken to use the same models and assumptions in the generation of both the real-time and software simulated measurements, in order that the real-time data could be used to help validate the assumptions and models used in the software simulations. The study makes use of the unique capabilities of the Formation Flying Test Bed at GSFC, which provides a capability to interface with different GPS receivers and to produce real-time, filtered orbit solutions even when less than four satellites are visible. The result is a powerful tool for assessing onboard navigation performance in a wide range of orbital regimes, and a test-bed for developing software and procedures for use in real spacecraft applications.

  16. Analyzing longitudinal data with the linear mixed models procedure in SPSS.

    PubMed

    West, Brady T

    2009-09-01

    Many applied researchers analyzing longitudinal data share a common misconception: that specialized statistical software is necessary to fit hierarchical linear models (also known as linear mixed models [LMMs], or multilevel models) to longitudinal data sets. Although several specialized statistical software programs of high quality are available that allow researchers to fit these models to longitudinal data sets (e.g., HLM), rapid advances in general purpose statistical software packages have recently enabled analysts to fit these same models when using preferred packages that also enable other more common analyses. One of these general purpose statistical packages is SPSS, which includes a very flexible and powerful procedure for fitting LMMs to longitudinal data sets with continuous outcomes. This article aims to present readers with a practical discussion of how to analyze longitudinal data using the LMMs procedure in the SPSS statistical software package.
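
    As an illustration of the procedure the article discusses, a random-intercept, random-slope growth model for repeated measures might be specified in SPSS MIXED syntax roughly as follows (variable names `outcome`, `time`, and `id` are hypothetical placeholders, not from the article):

    ```
    * Linear growth model with random intercept and slope per subject.
    MIXED outcome WITH time
      /FIXED = time
      /RANDOM = INTERCEPT time | SUBJECT(id) COVTYPE(UN)
      /METHOD = REML
      /PRINT = SOLUTION TESTCOV.
    ```

    The `RANDOM` subcommand is what distinguishes the LMM from ordinary regression: it lets each subject carry its own intercept and time trend, with an unstructured covariance between them.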

  17. RARtool: A MATLAB Software Package for Designing Response-Adaptive Randomized Clinical Trials with Time-to-Event Outcomes.

    PubMed

    Ryeznik, Yevgen; Sverdlov, Oleksandr; Wong, Weng Kee

    2015-08-01

    Response-adaptive randomization designs are becoming increasingly popular in clinical trial practice. In this paper, we present RARtool , a user interface software developed in MATLAB for designing response-adaptive randomized comparative clinical trials with censored time-to-event outcomes. The RARtool software can compute different types of optimal treatment allocation designs, and it can simulate response-adaptive randomization procedures targeting selected optimal allocations. Through simulations, an investigator can assess design characteristics under a variety of experimental scenarios and select the best procedure for practical implementation. We illustrate the utility of our RARtool software by redesigning a survival trial from the literature.
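
    The flavor of response-adaptive randomization that RARtool simulates can be sketched with a classic urn rule. The sketch below uses binary outcomes for brevity (RARtool itself targets censored time-to-event data and optimal allocation designs, which are considerably more involved):

    ```python
    import random

    def rar_trial(p_success, n_patients, seed=0):
        """Randomized play-the-winner urn, a simple response-adaptive rule.

        Each success adds a ball for the winning arm, each failure adds a
        ball for the other arm, so allocation drifts toward the better
        treatment as responses accrue. Returns per-arm allocation counts.
        """
        rng = random.Random(seed)
        urn = [0, 1]        # one ball per treatment arm to start
        counts = [0, 0]
        for _ in range(n_patients):
            arm = rng.choice(urn)
            counts[arm] += 1
            success = rng.random() < p_success[arm]
            urn.append(arm if success else 1 - arm)
        return counts

    counts = rar_trial([0.7, 0.3], 200)
    ```

    Simulating many such trials under different scenarios, as RARtool does, lets an investigator compare allocation rules on power, expected failures, and allocation variability before choosing one for practice.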

  18. Extended System Operations Studies for Automated Guideway Transit Systems : Procedure for the Analysis of Representative AGT Deployments

    DOT National Transportation Integrated Search

    1981-12-01

    The purpose of this report is to present a general procedure for using the SOS software to analyze AGT systems. Data to aid the analyst in specifying input information, required as input to the software, are summarized in the appendices. The data are...

  19. 75 FR 34734 - Improving Market and Planning Efficiency Through Improved Software; Notice of Agenda and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-18

    ... Market and Planning Efficiency Through Improved Software; Notice of Agenda and Procedures for Staff Technical Conference June 10, 2010. This notice establishes the agenda and procedures for the staff... Kimberly D. Bose, Secretary. Agenda for AD10-12 Staff Technical Conference on Enhanced Power...

  20. System administrator`s guide to CDPS. Version 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Didier, B.T.; Portwood, M.H.

    The System Administrator's Guide to CDPS is intended for those responsible for setting up and maintaining the hardware and software of a Common Mapping Standard (CMS) Data Production System (CDPS) installation. This guide assists the system administrator in performing typical administrative functions. It is not intended to replace the Ultrix Documentation Set, which should be available for a CDPS installation and will be required for details on referenced Ultrix commands as well as procedures for performing Ultrix maintenance functions. There are six major sections in this guide. Section 1 introduces the system administrator to CDPS and describes the assumptions that are made by this guide. Section 2 describes the CDPS platform configuration. Section 3 describes the platform preparation that is required to install the CDPS software. Section 4 describes the CPS software and its installation procedures. Section 5 describes the CDS software and its installation procedures. Section 6 describes various operation and maintenance procedures. Four appendices are also provided. Appendix A contains a list of used acronyms. Appendix B provides a terse description of common Ultrix commands that are used in administrative functions. Appendix C provides sample CPS and CDS configuration files. Appendix D provides a required list and a recommended list of Ultrix software subsets for installation on a CDPS platform.

  1. SpaceHab 1 maintenance experiment

    NASA Technical Reports Server (NTRS)

    Bohannon, Jackie W.

    1994-01-01

    The SpaceHab 1 flight on STS-57 served as a test platform for the evaluation of two space station payloads. The first payload evaluated a space station maintenance concept using a sweep signal generator and a 48-channel logic analyzer to perform fault detection and isolation. Crew procedures files, test setup diagram files, and software to configure the test equipment were created on the ground and uplinked on the astronauts' voice communication circuit to perform tests in flight. In order to use these files, the portable computer was operated in a multi-window configuration. The test data were transmitted to the ground, allowing the ground staff to identify the cause of the fault and provide the crew with repair procedures and diagrams. The crew successfully repaired the system under test. The second payload investigated hand soldering and de-soldering of standard components on printed circuit (PC) boards in zero gravity. It also used a new type of intra-vehicular foot restraint that exploits the neutral body posture in zero-g to retain the crew without their conscious attention.

  2. Design and Test of a Transonic Axial Splittered Rotor

    DTIC Science & Technology

    2015-06-15

    A new design procedure was developed that uses commercial-off-the-shelf software (MATLAB, SolidWorks, and ANSYS-CFX) for the...geometric rendering and analysis of a transonic axial compressor rotor with splitter blades. Predictive numerical simulations were conducted and...

  3. Launch Vehicle Operations Simulator

    NASA Technical Reports Server (NTRS)

    Blackledge, J. W.

    1974-01-01

    The Saturn Launch Vehicle Operations Simulator (LVOS) was developed for NASA at Kennedy Space Center. LVOS simulates the Saturn launch vehicle and its ground support equipment. The simulator was intended primarily to be used as a launch crew trainer but it is also being used for test procedure and software validation. A NASA/contractor team of engineers and programmers implemented the simulator after the Apollo XI lunar landing during the low activity periods between launches.

  4. Pressure and Temperature Sensitive Paint Field System

    NASA Technical Reports Server (NTRS)

    Sprinkle, Danny R.; Obara, Clifford J.; Amer, Tahani R.; Faulcon, Nettie D.; Carmine, Michael T.; Burkett, Cecil G.; Pritchard, Daniel W.; Oglesby, Donald M.

    2004-01-01

    This report documents the Pressure and Temperature Sensitive Paint Field System that is used to provide global surface pressure and temperature measurements on models tested in Langley wind tunnels. The system was developed and is maintained by Global Surface Measurements Team personnel of the Data Acquisition and Information Management Branch in the Research Facilities Services Competency. Descriptions of the system hardware and software are presented and operational procedures are detailed.

  5. Low latency messages on distributed memory multiprocessors

    NASA Technical Reports Server (NTRS)

    Rosing, Matthew; Saltz, Joel

    1993-01-01

    Many of the issues in developing an efficient interface for communication on distributed memory machines are described and a portable interface is proposed. Although the hardware component of message latency is less than one microsecond on many distributed memory machines, the software latency associated with sending and receiving typed messages is on the order of 50 microseconds. The reason for this imbalance is that the software interface does not match the hardware. By changing the interface to match the hardware more closely, applications with fine grained communication can be put on these machines. Based on several tests that were run on the iPSC/860, an interface that will better match current distributed memory machines is proposed. The model used in the proposed interface consists of a computation processor and a communication processor on each node. Communication between these processors and other nodes in the system is done through a buffered network. Information that is transmitted is either data or procedures to be executed on the remote processor. The dual processor system is better suited for efficiently handling asynchronous communications compared to a single processor system. The ability to send either data or procedures is very flexible for minimizing message latency, based on the type of communication being performed. The tests performed and the proposed interface are described.
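
    The data-or-procedure model described above (essentially active messages) can be sketched with a dedicated communication worker draining a buffered inbox. This is a thread-based toy, not the iPSC/860 interface itself:

    ```python
    import queue
    import threading

    def communication_processor(inbox, results):
        """Drains the buffered network; each message carries either plain
        data or a procedure to execute on behalf of the remote sender."""
        while True:
            msg = inbox.get()
            if msg is None:          # shutdown sentinel
                break
            kind, payload = msg
            if kind == "proc":
                results.append(payload())   # execute the shipped procedure
            else:
                results.append(payload)     # deliver the data as-is

    inbox, results = queue.Queue(), []
    t = threading.Thread(target=communication_processor, args=(inbox, results))
    t.start()
    inbox.put(("data", 42))              # typed-data delivery
    inbox.put(("proc", lambda: 6 * 7))   # remote procedure execution
    inbox.put(None)
    t.join()
    ```

    Because the communication worker runs independently of the "computation" thread, asynchronous messages are handled without interrupting computation, which is the asymmetry the proposed dual-processor node exploits.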

  6. Managing the Implementation of Mission Operations Automation

    NASA Technical Reports Server (NTRS)

    Sodano, R.; Crouse, P.; Odendahl, S.; Fatig, M.; McMahon, K.; Lakin, J.

    2006-01-01

    Reducing the cost of mission operations has necessitated a high level of automation both on spacecraft and ground systems. While automation on spacecraft is implemented during the design phase, ground system automation tends to be implemented during the prime mission operations phase. Experience has shown that this tendency for late automation development can be hindered by several factors: additional hardware and software resources may need to be procured; software must be developed and tested on a non-interference basis with primary operations with limited manpower; and established procedures may not be suited for automation, requiring substantial rework. In this paper we will review the experience of successfully automating mission operations for seven on-orbit missions: the Compton Gamma Ray Observatory (CGRO), the Rossi X-Ray Timing Explorer (RXTE), the Advanced Composition Explorer (ACE), the Far Ultraviolet Spectroscopic Explorer (FUSE), Interplanetary Physics Laboratory (WIND), Polar Plasma Laboratory (POLAR), and the Imager for Magnetopause-to-Aurora Global Exploration (IMAGE). We will provide lessons learned in areas such as: spacecraft recorder management, procedure development, lights-out commanding from the ground system vs. stored command loads, spacecraft contingency response time, and ground station interfaces. Implementing automation strategies during the mission concept and spacecraft integration and test phases, which experience shows to be the most efficient approach, will also be discussed.

  7. Coupled rotor/airframe vibration analysis program manual. Volume 1: User's and programmer's instructions

    NASA Technical Reports Server (NTRS)

    Cassarino, S.; Sopher, R.

    1982-01-01

    User instructions and software descriptions for the base program of the coupled rotor/airframe vibration analysis are provided. The functional capabilities and procedures for running the program are described, and interfaces with external programs are discussed. The procedure for synthesizing a dynamic system and the various solution methods are described. Input data and output results are presented. Detailed information is provided on the program structure. Sample test case results for five representative dynamic configurations are provided and discussed. System responses are plotted to demonstrate the available plotting capabilities. Instructions to install and execute SIMVIB on the CDC computer system are provided.

  8. Sharing programming resources between Bio* projects through remote procedure call and native call stack strategies.

    PubMed

    Prins, Pjotr; Goto, Naohisa; Yates, Andrew; Gautier, Laurent; Willis, Scooter; Fields, Christopher; Katayama, Toshiaki

    2012-01-01

    Open-source software (OSS) encourages computer programmers to reuse software components written by others. In evolutionary bioinformatics, OSS comes in a broad range of programming languages, including C/C++, Perl, Python, Ruby, Java, and R. To avoid writing the same functionality multiple times for different languages, it is possible to share components by bridging computer languages and Bio* projects, such as BioPerl, Biopython, BioRuby, BioJava, and R/Bioconductor. In this chapter, we compare the two principal approaches for sharing software between different programming languages: either by remote procedure call (RPC) or by sharing a local call stack. RPC provides a language-independent protocol over a network interface; examples are RSOAP and Rserve. The local call stack provides a between-language mapping not over the network interface, but directly in computer memory; examples are R bindings, RPy, and languages sharing the Java Virtual Machine stack. This functionality provides strategies for sharing of software between Bio* projects, which can be exploited more often. Here, we present cross-language examples for sequence translation, and measure throughput of the different options. We compare calling into R through native R, RSOAP, Rserve, and RPy interfaces, with the performance of native BioPerl, Biopython, BioJava, and BioRuby implementations, and with call stack bindings to BioJava and the European Molecular Biology Open Software Suite. In general, call stack approaches outperform native Bio* implementations and these, in turn, outperform RPC-based approaches. To test and compare strategies, we provide a downloadable BioNode image with all examples, tools, and libraries included. The BioNode image can be run on VirtualBox-supported operating systems, including Windows, OSX, and Linux.
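
    The two strategies the chapter compares can be demonstrated in miniature: the same function invoked through a language-independent RPC round trip and through a direct in-process call. This sketch uses Python's stdlib XML-RPC in place of RSOAP/Rserve, and a toy sequence-translation payload echoing the chapter's cross-language example:

    ```python
    import threading
    from xmlrpc.server import SimpleXMLRPCServer
    from xmlrpc.client import ServerProxy

    CODON = {"ATG": "M", "TGG": "W", "TAA": "*"}  # tiny illustrative table

    def translate(seq):
        """Toy sequence translation shared between the two call routes."""
        return "".join(CODON.get(seq[i:i + 3], "X") for i in range(0, len(seq) - 2, 3))

    # RPC route: language-independent, but every call crosses the network stack.
    server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
    server.register_function(translate)
    port = server.server_address[1]
    threading.Thread(target=server.handle_request, daemon=True).start()

    proxy = ServerProxy(f"http://127.0.0.1:{port}")
    rpc_result = proxy.translate("ATGTGGTAA")

    # Call-stack route: direct, in-process, no serialization or network hop.
    local_result = translate("ATGTGGTAA")
    ```

    Both routes return the same answer; the difference is overhead per call, which is why the chapter finds call-stack bindings outperforming RPC-based approaches.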

  9. Automated sequence analysis and editing software for HIV drug resistance testing.

    PubMed

    Struck, Daniel; Wallis, Carole L; Denisov, Gennady; Lambert, Christine; Servais, Jean-Yves; Viana, Raquel V; Letsoalo, Esrom; Bronze, Michelle; Aitken, Sue C; Schuurman, Rob; Stevens, Wendy; Schmit, Jean Claude; Rinke de Wit, Tobias; Perez Bercoff, Danielle

    2012-05-01

    Access to antiretroviral treatment in resource-limited-settings is inevitably paralleled by the emergence of HIV drug resistance. Monitoring treatment efficacy and HIV drugs resistance testing are therefore of increasing importance in resource-limited settings. Yet low-cost technologies and procedures suited to the particular context and constraints of such settings are still lacking. The ART-A (Affordable Resistance Testing for Africa) consortium brought together public and private partners to address this issue. To develop an automated sequence analysis and editing software to support high throughput automated sequencing. The ART-A Software was designed to automatically process and edit ABI chromatograms or FASTA files from HIV-1 isolates. The ART-A Software performs the basecalling, assigns quality values, aligns query sequences against a set reference, infers a consensus sequence, identifies the HIV type and subtype, translates the nucleotide sequence to amino acids and reports insertions/deletions, premature stop codons, ambiguities and mixed calls. The results can be automatically exported to Excel to identify mutations. Automated analysis was compared to manual analysis using a panel of 1624 PR-RT sequences generated in 3 different laboratories. Discrepancies between manual and automated sequence analysis were 0.69% at the nucleotide level and 0.57% at the amino acid level (668,047 AA analyzed), and discordances at major resistance mutations were recorded in 62 cases (4.83% of differences, 0.04% of all AA) for PR and 171 (6.18% of differences, 0.03% of all AA) cases for RT. The ART-A Software is a time-sparing tool for pre-analyzing HIV and viral quasispecies sequences in high throughput laboratories and highlighting positions requiring attention. Copyright © 2012 Elsevier B.V. All rights reserved.
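
    One of the checks listed above, translating the nucleotide sequence and flagging premature stop codons, is easy to illustrate. The codon table below is deliberately truncated and the function is a stand-in for the ART-A step, not its implementation:

    ```python
    # Truncated codon table for illustration; a real one covers all 64 codons.
    CODON = {
        "ATG": "M", "AAA": "K", "AGA": "R", "GGG": "G",
        "TAA": "*", "TAG": "*", "TGA": "*",
    }

    def translate_and_check(nt_seq):
        """Translate a nucleotide sequence and flag premature stop codons.

        Unknown or ambiguous codons are reported as "X"; a "*" anywhere
        before the final codon marks a premature stop.
        """
        aa = []
        for i in range(0, len(nt_seq) - 2, 3):
            aa.append(CODON.get(nt_seq[i:i + 3], "X"))
        protein = "".join(aa)
        premature_stop = "*" in protein[:-1]
        return protein, premature_stop
    ```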

  10. Workflow in interventional radiology: nerve blocks and facet blocks

    NASA Astrophysics Data System (ADS)

    Siddoway, Donald; Ingeholm, Mary Lou; Burgert, Oliver; Neumuth, Thomas; Watson, Vance; Cleary, Kevin

    2006-03-01

    Workflow analysis has the potential to dramatically improve the efficiency and clinical outcomes of medical procedures. In this study, we recorded the workflow for nerve block and facet block procedures in the interventional radiology suite at Georgetown University Hospital in Washington, DC, USA. We employed a custom client/server software architecture developed by the Innovation Center for Computer Assisted Surgery (ICCAS) at the University of Leipzig, Germany. This software runs in an internet browser and allows the user to record the actions taken by the physician during a procedure. The data recorded during the procedure are stored as an XML document, which can then be further processed. We have successfully gathered data on a number of cases using a tablet PC, and these preliminary results show the feasibility of using this software in an interventional radiology setting. We are currently accruing additional cases, and when more data have been collected we will analyze the workflow of these procedures to look for inefficiencies and potential improvements.
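
    Recording observed actions as an XML document, as described above, might look like the following sketch. The element names are hypothetical, chosen for illustration rather than taken from the ICCAS schema:

    ```python
    import datetime
    import xml.etree.ElementTree as ET

    def record_step(root, actor, action, instrument):
        """Append one observed workflow step to the procedure document."""
        step = ET.SubElement(root, "step",
                             timestamp=datetime.datetime.now().isoformat())
        ET.SubElement(step, "actor").text = actor
        ET.SubElement(step, "action").text = action
        ET.SubElement(step, "instrument").text = instrument
        return step

    root = ET.Element("procedure", type="facet_block")
    record_step(root, "physician", "insert needle", "22G spinal needle")
    record_step(root, "physician", "inject anesthetic", "syringe")
    xml_doc = ET.tostring(root, encoding="unicode")
    ```

    Timestamped step elements like these are what make downstream analysis possible: durations between steps, repeated actions, and idle gaps all fall out of the recorded document.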

  11. Recent Survey and Application of the simSUNDT Software

    NASA Astrophysics Data System (ADS)

    Persson, G.; Wirdelius, H.

    2010-02-01

    The simSUNDT software is based on a previously developed program (SUNDT). The latest version has been customized to generate realistic synthetic data (including a grain noise model) compatible with a number of off-line analysis software packages. The software consists of a Windows®-based preprocessor and postprocessor together with a mathematical kernel (UTDefect) that performs the actual mathematical modeling. The model employs various integral transforms and integral equations and enables simulation of the entire ultrasonic testing situation. The model is completely three-dimensional, though the simulated component is two-dimensional, bounded by the scanning surface and, optionally, a planar back surface. It is of great importance that the inspection methods applied are properly validated and that their capability to detect cracks and defects is quantified. To achieve this, statistical methods such as Probability of Detection (POD) are often applied, with the ambition to estimate detectability as a function of defect size. The conventional procedure, which relies on physical test pieces, is not only very expensive but also tends to introduce misalignments between the actual NDT situation to be performed and the experimental simulation of it. The presentation will describe the developed model, which will enable simulation of a phased array NDT inspection, and the ambition to use this simulation software to generate POD information. The paper also includes the most recent developments of the model, including some initial experimental validation of the phased array probe model.
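
    POD curves of the kind simSUNDT aims to generate are commonly parameterized with a log-logistic model. A hedged sketch (the coefficients here are arbitrary, and the actual POD methodology may differ) of evaluating and inverting such a curve:

    ```python
    import math

    def pod(a_mm, b0, b1):
        """Log-logistic POD model: detection probability vs defect size a."""
        return 1.0 / (1.0 + math.exp(-(b0 + b1 * math.log(a_mm))))

    def a90(b0, b1):
        """Defect size detected with 90% probability (inverts the model)."""
        return math.exp((math.log(0.9 / 0.1) - b0) / b1)
    ```

    Quantities such as a90 (or a90/95 with confidence bounds) are the usual deliverables of a POD study, whether the underlying hit/miss data come from physical test pieces or, as proposed here, from simulation.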

  12. Computer program for solving laminar, transitional, or turbulent compressible boundary-layer equations for two-dimensional and axisymmetric flow

    NASA Technical Reports Server (NTRS)

    Harris, J. E.; Blanchard, D. K.

    1982-01-01

    A numerical algorithm and computer program are presented for solving the laminar, transitional, or turbulent two dimensional or axisymmetric compressible boundary-layer equations for perfect-gas flows. The governing equations are solved by an iterative three-point implicit finite-difference procedure. The software, program VGBLP, is a modification of the approach presented in NASA TR R-368 and NASA TM X-2458. The major modifications are: (1) replacement of the fourth-order Runge-Kutta integration technique with a finite-difference procedure for numerically solving the equations required to initiate the parabolic marching procedure; (2) introduction of the Blottner variable-grid scheme; (3) implementation of an iteration scheme allowing the coupled system of equations to be converged to a specified accuracy level; and (4) inclusion of an iteration scheme for variable-entropy calculations. These modifications yield a software package with high computational efficiency and flexibility. Turbulence-closure options include either two-layer eddy-viscosity or mixing-length models. Eddy conductivity is modeled as a function of eddy viscosity through a static turbulent Prandtl number formulation. Several options are provided for specifying the static turbulent Prandtl number. The transitional boundary layer is treated through a streamwise intermittency function which modifies the turbulence-closure model. This model is based on the probability distribution of turbulent spots and ranges from zero to unity for laminar and turbulent flow, respectively. Several test cases are presented as guides for potential users of the software.
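
    A streamwise intermittency of the kind described, zero in laminar flow and tending to unity downstream, is often written in the Dhawan-Narasimha form based on turbulent-spot statistics. The sketch below uses that common form for illustration; VGBLP's exact function is the one defined in the report:

    ```python
    import math

    def intermittency(x, x_t, lam):
        """Streamwise intermittency factor for the transitional region.

        x     : streamwise position
        x_t   : transition onset location (gamma = 0 upstream of it)
        lam   : transition length scale
        Returns a value in [0, 1) that weights the turbulence closure.
        """
        if x <= x_t:
            return 0.0
        return 1.0 - math.exp(-0.412 * ((x - x_t) / lam) ** 2)
    ```

    Multiplying the eddy viscosity by this factor blends the closure smoothly from the laminar to the fully turbulent solution as the march proceeds downstream.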

  13. CoRoTlog

    NASA Astrophysics Data System (ADS)

    Plasson, Ph.

    2006-11-01

    LESIA, in close cooperation with CNES, DLR and IWF, is responsible for the tests and validation of the CoRoT instrument digital process unit which is made up of the BEX and DPU assembly. The main part of the work has consisted in validating the DPU software and in testing the BEX/DPU coupling. This work took more than two years due to the central role of the software tested and its technical complexity. The first task, in the validation process, was to carry out the acceptance tests of the DPU software. These tests consisted in checking each of the 325 requirements identified in the URD (User Requirements Document) and were played in a configuration using the DPU coupled to a BEX simulator. During the acceptance tests, all the transversal functionalities of the DPU software, like the TC/TM management, the state machine management, the BEX driving, the system monitoring or the maintenance functionalities were checked in depth. The functionalities associated with the seismology and exoplanetology processing, like the loading of window and mask descriptors or the configuration of the service execution parameters, were also exhaustively tested. After having validated the DPU software against the user requirements using a BEX simulator, the following step consisted in coupling the DPU and the BEX in order to check that the formed unit worked correctly and met the performance requirements. These tests were conducted in two phases: the first one was devoted to the functional aspects and the tests of interface, the second one to the performance aspects. The performance tests were based on the use of the DPU software scientific services and on the use of full images representative of a realistic sky as inputs. 
These tests were also based on the use of a reference set of windows and parameters, which was provided by the scientific team and was representative, in terms of load and complexity, of the set that could be used during the observation mode of the CoRoT instrument. They were played in a configuration using either a BCC simulator or a real BCC coupled to a video simulator to feed the BEX/DPU unit. The validation of the scientific algorithms was conducted in parallel with the BEX/DPU coupling tests. The objective of this phase was to check that the algorithms implemented in the scientific services of the DPU software conformed to those specified in the URD and that the numerical precision obtained corresponded to that expected. Forty test cases were defined, covering the fine and rough angular error measurement processing, the rejection of brilliant pixels, the subtraction of the offset and the sky background, the photometry algorithms, the SAA handling, and reference image management. For each test case, the LESIA scientific team produced by simulation, using the instrument model, the dynamic data files and parameter sets that fed the DPU on the one hand and a model of the onboard software on the other. These data files correspond to FITS images (black windows, star windows, offset windows) containing more or less disturbance, making it possible to test the DPU software in dynamic mode over durations of up to 48 hours. To perform the test and validation activities of the CoRoT instrument digital process unit, a set of software testing tools was developed by LESIA (Software Ground Support Equipment, hereafter "SGSE"). Thanks to their versatility and modularity, these software testing tools were used during all the activities of integration, test, and validation of the instrument and its subsystems CoRoTCase and CoRoTCam. The CoRoT SGSE were specified, designed, and developed by LESIA.
The objective was to have a software system allowing the users (validation team of the onboard software, instrument integration team, etc.) to remotely control and monitor the whole instrument or only one of the subsystems of the instrument like the DPU coupled to a simulator BEX or the BEX/DPU unit coupled to a BCC simulator. The idea was to be able to interact in real time with the system under test by driving the various EGSE, but also to play test procedures implemented as scripts organized into libraries, to record the telemetries and housekeeping data in a database, and to be able to carry out post-mortem analyses.

  14. Data collection and analysis software development for rotor dynamics testing in spin laboratory

    NASA Astrophysics Data System (ADS)

    Abdul-Aziz, Ali; Arble, Daniel; Woike, Mark

    2017-04-01

    Gas turbine engine components undergo high rotational loading and other complex environmental conditions. Such operating environments lead these components to experience damage and cracks that can cause catastrophic failure during flight. Traditional crack-detection and health-monitoring methodologies rely on periodic routine maintenance and nondestructive inspections that often involve engine and component disassembly. These methods also do not offer adequate information about faults, especially if the faults are subsurface or not clearly evident. At NASA Glenn Research Center, the rotor dynamics laboratory is presently developing newer techniques that depend heavily on sensor technology to enable health monitoring and prediction of damage and cracks in rotor disks. These approaches are noninvasive and relatively economical. Spin tests are performed using a subscale test article mimicking a turbine rotor disk undergoing rotational load. Non-contact instruments such as capacitive and microwave sensors are used to measure blade tip gap displacement and blade vibration characteristics in an attempt to develop a physics-based model to assess and predict faults in the rotor disk. Data collection is a major component of this experimental-analytical procedure, and as a result, an upgrade to an older version of the LabVIEW-based data acquisition software has been implemented to support efficient test runs and analysis of the results. Outcomes obtained from the test data and the related experimental and analytical rotor dynamics modeling, including key features of the updated software, are presented and discussed.
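
    A first-order view of how a capacitive probe yields blade tip clearance is the ideal parallel-plate relation d = ε0·A/C. This is a sketch only; real probes use empirically calibrated curves rather than this idealization, and the probe area value below is hypothetical:

    ```python
    EPS0 = 8.854e-12  # vacuum permittivity, F/m

    def tip_gap_m(capacitance_f, probe_area_m2):
        """Blade-tip clearance from a capacitive probe reading.

        Ideal parallel-plate model: gap = eps0 * A / C. Smaller gaps give
        larger capacitance, which is the signal the spin rig monitors for
        crack-induced disk deformation.
        """
        return EPS0 * probe_area_m2 / capacitance_f
    ```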

  15. Usability Testing of an Interactive Virtual Reality Distraction Intervention to Reduce Procedural Pain in Children and Adolescents With Cancer.

    PubMed

    Birnie, Kathryn A; Kulandaivelu, Yalinie; Jibb, Lindsay; Hroch, Petra; Positano, Karyn; Robertson, Simon; Campbell, Fiona; Abla, Oussama; Stinson, Jennifer

    2018-06-01

    Needle procedures are among the most distressing aspects of pediatric cancer-related treatment. Virtual reality (VR) distraction offers promise for needle-related pain and distress given its highly immersive and interactive virtual environment. This study assessed the usability (ease of use and understanding, acceptability) of a custom VR intervention for children with cancer undergoing implantable venous access device (IVAD) needle insertion. Three iterative cycles of mixed-method usability testing with semistructured interviews were undertaken to refine the VR. Participants included 17 children and adolescents (8-18 years old) with cancer who used the VR intervention prior to or during IVAD access. Most participants reported the VR as easy to use (82%) and understand (94%), and would like to use it during subsequent needle procedures (94%). Based on usability testing, refinements were made to VR hardware, software, and clinical implementation. Refinements focused on increasing responsiveness, interaction, and immersion of the VR program, reducing head movement for VR interaction, and enabling participant alerts to steps of the procedure by clinical staff. No adverse events of nausea or dizziness were reported. The VR intervention was deemed acceptable and safe. Next steps include assessing feasibility and effectiveness of the VR intervention for pain and distress.

  16. Automation photometer of Hitachi U–2000 spectrophotometer with RS–232C–based computer

    PubMed Central

    Kumar, K. Senthil; Lakshmi, B. S.; Pennathur, Gautam

    1998-01-01

    The interfacing of a commonly used spectrophotometer, the Hitachi U2000, through its RS–232C port to an IBM-compatible computer is described. The hardware for data acquisition was designed by suitably modifying readily available materials, and the software was written in the C programming language. The various steps involved in these procedures are elucidated in detail. The efficacy of the procedure was tested experimentally by running the visible spectrum of a cyanine dye. The spectrum was plotted through a printer hooked to the computer. The spectrum was also plotted by transforming the abscissa to the wavenumber scale; this was carried out using another module written in C. The efficiency of the whole set-up has been calculated using standard procedures. PMID:18924834
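
    The abscissa transformation mentioned above is a simple reciprocal conversion: a wavenumber in cm⁻¹ equals 10⁷ divided by the wavelength in nm. A minimal sketch of such a module, written here in Python rather than the paper's C, with illustrative function names not taken from the paper:

```python
def nm_to_wavenumber(wavelength_nm):
    """Convert a wavelength in nanometres to a wavenumber in cm^-1.

    Since 1 cm = 1e7 nm, wavenumber (cm^-1) = 1e7 / wavelength (nm).
    """
    if wavelength_nm <= 0:
        raise ValueError("wavelength must be positive")
    return 1e7 / wavelength_nm


def transform_spectrum(spectrum):
    """Re-express (wavelength_nm, absorbance) pairs on the wavenumber
    scale, sorted in order of increasing wavenumber for plotting."""
    converted = [(nm_to_wavenumber(wl), a) for wl, a in spectrum]
    return sorted(converted)
```

    For example, a point recorded at 500 nm maps to 20 000 cm⁻¹.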

  17. On Quality and Measures in Software Engineering

    ERIC Educational Resources Information Center

    Bucur, Ion I.

    2006-01-01

    Complexity measures are mainly used to estimate vital information about reliability and maintainability of software systems from regular analysis of the source code. Such measures also provide constant feedback during a software project to assist the control of the development procedure. There exist several models to classify a software product's…

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, K.; Tsai, H.; Decision and Information Sciences

    The technical basis for extending the Model 9977 shipping package periodic maintenance beyond the one-year interval to a maximum of five years is based on the performance of the O-ring seals and the environmental conditions. The DOE Packaging Certification Program (PCP) has tasked Argonne National Laboratory to develop a Radio-Frequency Identification (RFID) temperature monitoring system for use by the facility personnel at DAF/NTS. The RFID temperature monitoring system, depicted in the figure below, consists of the Mk-1 RFID tags, a reader, and a control computer mounted on a mobile platform that can operate as a stand-alone system or be connected to the local IT network. As part of the Conditions of Approval of the CoC, the user must complete the prescribed training to become qualified and be certified for operation of the RFID temperature monitoring system. The training course will be administered by Argonne National Laboratory on behalf of the Headquarters Certifying Official. This is a complete documentation package for the RFID temperature monitoring system of the Model 9977 packagings at NTS. The documentation package will be used for training and certification.
    The contents are: Acceptance Testing Procedure of MK-1 RFID Tags for DOE/EM Nuclear Materials Management Applications; Acceptance Testing Result of MK-1 RFID Tags for DOE/EM Nuclear Materials Management Applications; Performance Test of the Single Bolt Seal Sensor for the Model 9977 Packaging; Calibration of Built-in Thermistors in RFID Tags for Nevada Test Site; Results of Calibration of Built-in Thermistors in RFID Tags; Results of Thermal Calibration of Second Batch of MK-I RFID Tags; Procedure for Installing and Removing MK-1 RFID Tag on Model 9977 Drum; User Guide for RFID Reader and Software for Temperature Monitoring of Model 9977 Drums at NTS; Software Quality Assurance Plan (SQAP) for the ARG-US System; Quality Category for the RFID Temperature Monitoring System; The Documentation Package for the RFID Temperature Monitoring System; Software Test Plan and Results for ARG-US OnSite; Configuration Management Plan (CMP) for the ARG-US System; Requirements Management Plan for the ARG-US System; and Design Management Plan for ARG-US.

  19. Software Engineering Laboratory (SEL) data and information policy

    NASA Technical Reports Server (NTRS)

    Mcgarry, Frank

    1991-01-01

    The policies and overall procedures that are used in distributing and in making available products of the Software Engineering Laboratory (SEL) are discussed. The products include project data and measures, project source code, reports, and software tools.

  20. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR PERFORMANCE OF COMPUTER SOFTWARE: VERIFICATION AND VALIDATION (UA-D-2.0)

    EPA Science Inventory

    The purpose of this SOP is to define the procedures used for the initial and periodic verification and validation of computer programs used during the Arizona NHEXAS project and the "Border" study. Keywords: Computers; Software; QA/QC.

    The National Human Exposure Assessment Sur...

  1. QALMA: A computational toolkit for the analysis of quality protocols for medical linear accelerators in radiation therapy

    NASA Astrophysics Data System (ADS)

    Rahman, Md Mushfiqur; Lei, Yu; Kalantzis, Georgios

    2018-01-01

    Quality Assurance (QA) for the medical linear accelerator (linac) is one of the primary concerns in external beam radiation therapy. Continued advancements in clinical accelerators and computer control technology make the QA procedures more complex and time consuming, often requiring dedicated software accompanied by specific phantoms. To ameliorate that matter, we introduce QALMA (Quality Assurance for Linac with MATLAB), a MATLAB toolkit which aims to simplify the quantitative analysis of linac QA, including Star-Shot analysis, the Picket Fence test, the Winston-Lutz test, Multileaf Collimator (MLC) log file analysis and verification of light & radiation field coincidence.
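
    To illustrate the kind of quantity such a toolkit computes: a Winston-Lutz test reduces, per image, to the distance between the centroid of the radiation field and the centroid of a ball-bearing marker. A minimal sketch in Python (QALMA itself is a MATLAB package; the function names and the intensity-weighted-centroid method here are illustrative, not taken from QALMA):

```python
import numpy as np


def centroid(image):
    """Intensity-weighted centroid (row, col) of a 2D image array."""
    total = image.sum()
    rows, cols = np.indices(image.shape)
    return (rows * image).sum() / total, (cols * image).sum() / total


def winston_lutz_offset(field_image, bb_image, pixel_mm=1.0):
    """Distance in mm between the radiation-field centroid and the
    ball-bearing centroid -- the basic Winston-Lutz displacement."""
    fr, fc = centroid(field_image)
    br, bc = centroid(bb_image)
    return pixel_mm * np.hypot(fr - br, fc - bc)
```

    In practice the field and ball-bearing would be segmented from the same EPID image; they are passed separately here to keep the sketch short.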

  2. Spaceport Command and Control System Software Development

    NASA Technical Reports Server (NTRS)

    Glasser, Abraham

    2017-01-01

    The Spaceport Command and Control System (SCCS) is the National Aeronautics and Space Administration's (NASA) launch control system for the Orion capsule and Space Launch System, the next generation manned rocket currently in development. This large system requires a large amount of intensive testing that will properly measure the capabilities of the system. Automating the test procedures would save the project money from human labor costs, as well as making the testing process more efficient. Therefore, the Exploration Systems Division (formerly the Electrical Engineering Division) at Kennedy Space Center (KSC) has recruited interns for the past two years to work alongside full-time engineers to develop these automated tests, as well as innovate upon the current automation process.

  3. Ground station software for receiving and handling Irecin telemetry data

    NASA Astrophysics Data System (ADS)

    Ferrante, M.; Petrozzi, M.; Di Ciolo, L.; Ortenzi, A.; Troso, G

    2004-11-01

    The on-board resources needed to perform the mission tasks are very limited in nano-satellites. This paper proposes a software system to receive, manage and process in real time the telemetry data coming from the IRECIN nanosatellite and to transmit operator manual commands and operative procedures. During the receiving phase, it shows the IRECIN subsystem physical values, visualizes the IRECIN attitude, and performs other suitable functions. The IRECIN Ground Station program is in charge of exchanging information between IRECIN and the ground segment. In real time during the IRECIN transmission phase, it carries out attitude drawing, sun direction drawing, display of the power supplied by the Sun, visualization of the telemetry data, visualization of the Earth's magnetic field, and other functions. The received data are stored and interpreted by a parser module and distributed to the appropriate modules. The program also allows sending manual and automatic commands: manual commands are delivered by an operator, whereas automatic commands are provided by pre-configured operative procedures. Operative procedures are developed in an earlier phase called the configuration phase. The program can also carry out a test session by means of the scheduler and commanding modules, allowing execution of specific tasks without operator control. A log module records received and transmitted data. An offline post-analysis phase to analyze, filter and visualize the collected data is based on data extracted from the log module. At the same time, the Ground Station Software can operate over a network, allowing data and commands to be managed, received and sent from different sites. The proposed system constitutes the software of the IRECIN Ground Station. IRECIN is a modular nanosatellite weighing less than 2 kg, constituted by sixteen external sides with surface-mounted solar cells and three internal Al plates, kept together by four steel bars. 
Lithium-ion batteries are used. Attitude is determined by two three-axis magnetometers and the solar panel data. Control is provided by an active magnetic control system. The spacecraft will be spin-stabilized with the spin axis normal to the orbit. All IRECIN electronic components use SMD technology in order to reduce weight and size. The electronic boards were fully developed, built and tested at Vitrociset S.p.A. under the control of its Research and Development Group.
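
    The parse-and-dispatch pattern described above (a parser decodes each frame, logs it, and routes the values to the appropriate display module) can be sketched in a few lines. The frame layout below is purely hypothetical for illustration; the abstract does not specify the real IRECIN telemetry format:

```python
import struct

# Hypothetical frame layout (NOT the real IRECIN format): a 1-byte
# subsystem id, a 1-byte packet counter, and three float32 values.
FRAME = struct.Struct(">BB3f")

HANDLERS = {}  # subsystem id -> callable, registered by display modules


def register(subsystem_id, handler):
    """Let a display module subscribe to one subsystem's telemetry."""
    HANDLERS[subsystem_id] = handler


def parse_and_dispatch(raw, log):
    """Decode one telemetry frame, append it to the log, and hand the
    physical values to the module registered for that subsystem."""
    sub_id, counter, v1, v2, v3 = FRAME.unpack(raw)
    log.append((sub_id, counter, (v1, v2, v3)))
    handler = HANDLERS.get(sub_id)
    if handler:
        handler((v1, v2, v3))
    return sub_id, counter
```

    The same log list then feeds the offline post-analysis phase, mirroring the architecture the abstract describes.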

  4. Radiation dose optimisation for conventional imaging in infants and newborns using automatic dose management software: an application of the new 2013/59 EURATOM directive.

    PubMed

    Alejo, L; Corredoira, E; Sánchez-Muñoz, F; Huerga, C; Aza, Z; Plaza-Núñez, R; Serrada, A; Bret-Zurita, M; Parrón, M; Prieto-Areyano, C; Garzón-Moll, G; Madero, R; Guibelalde, E

    2018-04-09

    Objective: The new 2013/59 EURATOM Directive (ED) demands dosimetric optimisation procedures without undue delay. The aim of this study was to optimise paediatric conventional radiology examinations applying the ED without compromising the clinical diagnosis. Automatic dose management software (ADMS) was used to analyse 2678 studies of children from birth to 5 years of age, obtaining local diagnostic reference levels (DRLs) in terms of entrance surface air kerma. Because the local DRL for infant chest examinations exceeded the European Commission (EC) DRL, an optimisation was performed, decreasing the kVp and applying automatic exposure control. To assess the image quality, an analysis of high-contrast spatial resolution (HCSR), signal-to-noise ratio (SNR) and figure of merit (FOM) was performed, as well as a blind test based on the generalised estimating equations method. For newborn chest examinations, the local DRL exceeded the EC DRL by 113%. After the optimisation, a reduction of 54% was obtained. No significant differences were found in the image quality blind test. A decrease in SNR (-37%) and HCSR (-68%), and an increase in FOM (42%), was observed. ADMS allows the fast calculation of local DRLs and the performance of optimisation procedures in babies without delay. However, physical and clinical analyses of image quality are still needed to ensure diagnostic integrity after the optimisation process. Advances in knowledge: ADMS is useful for detecting radiation protection problems and performing optimisation procedures in paediatric conventional imaging without undue delay, as the ED requires.
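
    The DRL bookkeeping behind such a comparison is simple arithmetic. Conventionally a local DRL is taken as the third quartile (75th percentile) of the observed dose distribution, although the study's exact recipe is not stated in the abstract; the sketch below assumes that convention and uses illustrative names:

```python
def local_drl(esak_values):
    """Local diagnostic reference level as the 3rd quartile (75th
    percentile) of the entrance-surface-air-kerma distribution --
    the conventional DRL definition (nearest-rank estimate)."""
    ordered = sorted(esak_values)
    rank = max(0, round(0.75 * len(ordered)) - 1)
    return ordered[rank]


def percent_excess(local, reference):
    """How far a local DRL exceeds a reference DRL, in percent."""
    return 100.0 * (local - reference) / reference
```

    A local DRL 113% above the EC value, as reported for newborn chest examinations, corresponds to a local value 2.13 times the reference.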

  5. Active Mirror Predictive and Requirements Verification Software (AMP-ReVS)

    NASA Technical Reports Server (NTRS)

    Basinger, Scott A.

    2012-01-01

    This software is designed to predict large active mirror performance at various stages in the fabrication lifecycle of the mirror. It was developed for 1-meter-class powered mirrors for astronomical purposes, but is extensible to other geometries. The package accepts finite element model (FEM) inputs and laboratory-measured data for large optical-quality mirrors with active figure control. It computes phenomenological contributions to the surface figure error using several built-in optimization techniques. These phenomena include stresses induced in the mirror by the manufacturing process and the support structure, the test procedure, high spatial frequency errors introduced by the polishing process, and other process-dependent deleterious effects due to light-weighting of the mirror. Then, depending on the maturity of the mirror, it either predicts the best surface figure error that the mirror will attain, or it verifies that the requirements for the error sources have been met once the best surface figure error has been measured. The unique feature of this software is that it ties together physical phenomenology with wavefront sensing and control techniques and various optimization methods including convex optimization, Kalman filtering, and quadratic programming, both to generate predictive models and to perform requirements verification. This software combines three distinct disciplines: wavefront control, predictive models based on FEM, and requirements verification using measured data in a robust, reusable code that is applicable to any large optics for ground and space telescopes. The software also includes state-of-the-art wavefront control algorithms that allow closed-loop performance to be computed. It allows for quantitative trade studies to be performed for optical systems engineering, including computing the best surface figure error under various testing and operating conditions. 
After the mirror manufacturing process and testing have been completed, the software package can be used to verify that the underlying requirements have been met.
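
    The simplest instance of the surface-figure optimization such a tool performs is a linear least-squares fit of actuator influence functions to a measured surface error. A hedged Python sketch of that core step (the actual package's solvers and phenomenological models are far richer; the names here are illustrative):

```python
import numpy as np


def best_actuator_commands(influence, surface_error):
    """Least-squares actuator commands minimizing the residual surface
    figure error: solve min_u || influence @ u + surface_error ||.

    influence     : (n_points, n_actuators) influence-function matrix
    surface_error : (n_points,) measured surface error
    Returns (commands, residual_rms).
    """
    u, *_ = np.linalg.lstsq(influence, -surface_error, rcond=None)
    residual = influence @ u + surface_error
    return u, np.sqrt(np.mean(residual ** 2))
```

    The residual RMS after this fit is the "best surface figure error" in the abstract's sense: the figure attainable once the active control has removed everything the actuators can reach.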

  6. Risk-Informed Safety Assurance and Probabilistic Assessment of Mission-Critical Software-Intensive Systems

    NASA Technical Reports Server (NTRS)

    Guarro, Sergio B.

    2010-01-01

    This report validates and documents the detailed features and practical application of the framework for software-intensive digital systems risk assessment and risk-informed safety assurance presented in the NASA PRA Procedures Guide for Managers and Practitioners. This framework, called herein the "Context-based Software Risk Model" (CSRM), enables the assessment of the contribution of software and software-intensive digital systems to overall system risk, in a manner which is entirely compatible and integrated with the format of a "standard" Probabilistic Risk Assessment (PRA), as currently documented and applied for NASA missions and applications. The CSRM also provides a risk-informed path and criteria for conducting organized and systematic digital system and software testing so that, within this risk-informed paradigm, the achievement of a quantitatively defined level of safety and mission success assurance may be targeted and demonstrated. The framework is based on the concept of context-dependent software risk scenarios and on the modeling of such scenarios via the use of traditional PRA techniques - i.e., event trees and fault trees - in combination with more advanced modeling devices such as the Dynamic Flowgraph Methodology (DFM) or other dynamic logic-modeling representations. The scenarios can be synthesized and quantified in a conditional logic and probabilistic formulation. The application of the CSRM method documented in this report refers to the MiniAERCam system designed and developed by the NASA Johnson Space Center.

  7. WFF TOPEX Software Documentation Overview, May 1999. Volume 2

    NASA Technical Reports Server (NTRS)

    Brooks, Ronald L.; Lee, Jeffrey

    2003-01-01

    This document provides an overview of software development activities and the resulting products and procedures developed by the TOPEX Software Development Team (SWDT) at Wallops Flight Facility, in support of the WFF TOPEX Engineering Assessment and Verification efforts.

  8. Advanced crew procedures development techniques: Procedures and performance program description

    NASA Technical Reports Server (NTRS)

    Arbet, J. D.; Mangiaracina, A. A.

    1975-01-01

    The Procedures and Performance Program (PPP) for operation in conjunction with the Shuttle Procedures Simulator (SPS) is described. The PPP user interface, the SPS/PPP interface, and the PPP applications software are discussed.

  9. New developments on the homogenization of Canadian daily temperature data

    NASA Astrophysics Data System (ADS)

    Vincent, Lucie A.; Wang, Xiaolan L.

    2010-05-01

    Long-term, homogenized surface air temperature datasets have been prepared for the analysis of climate trends in Canada (Vincent and Gullett 1999). Non-climatic steps due to instrument relocations/changes and changes in observing procedures were identified in the annual means of the daily maximum and minimum temperatures using a technique based on regression models (Vincent 1998). Monthly adjustments were derived from the regression models, and daily adjustments were obtained from an interpolation procedure using the monthly adjustments (Vincent et al. 2002). Recently, new statistical tests have been developed to improve the power of detecting changepoints in climatological data time series. The penalized maximal t (PMT) test (Wang et al. 2007) and the penalized maximal F (PMF) test (Wang 2008b) were developed to take into account the position of each changepoint in order to minimize the effect of unequal and small sample sizes. A software package, RHtestsV3 (Wang and Feng 2009), has also been developed to implement these tests for homogenizing climate data series. A recursive procedure was developed to estimate the annual cycle, linear trend, and lag-1 autocorrelation of the base series in tandem, so that the effect of lag-1 autocorrelation is accounted for in the tests. A Quantile Matching (QM) algorithm (Wang 2009) was also developed for adjusting Gaussian daily data so that the empirical distributions of all segments of the detrended series match each other. The RHtestsV3 package was used to prepare a second generation of homogenized temperatures for Canada. Both the PMT test and the PMF test were applied to detect shifts in the monthly mean temperature series. A reference series was used when conducting the PMT test. Whenever possible, the main causes of the shifts were retrieved through historical evidence such as station inspection reports. 
Finally, the QM algorithm was used to adjust the daily temperature series for the artificial shifts identified from the respective monthly mean series. These procedures were applied to homogenize the daily maximum and minimum temperatures recorded at 336 stations across Canada. During the presentation, the procedures will be summarized and their application illustrated through selected examples. References: Vincent, L.A., X. Zhang, B.R. Bonsal and W.D. Hogg, 2002: Homogenization of daily temperatures over Canada. J. Climate, 15, 1322-1334. Vincent, L.A., and D.W. Gullett, 1999: Canadian historical and homogeneous temperature datasets for climate change analyses. Int. J. Climatol., 19, 1375-1388. Vincent, L.A., 1998: A technique for the identification of inhomogeneities in Canadian temperature series. J. Climate, 11, 1094-1104. Wang, X. L., 2009: A quantile matching adjustment algorithm for Gaussian data series. Climate Research Division, Atmospheric Science and Technology Directorate, Science and Technology Branch, Environment Canada. 5 pp. [Available online at http://cccma.seos.uvic.ca/ETCCDMI/software.shtml]. Wang, X. L. and Y. Feng, 2009: RHtestsV3 User Manual. Climate Research Division, Atmospheric Science and Technology Directorate, Science and Technology Branch, Environment Canada. 26 pp. [Available online at http://cccma.seos.uvic.ca/ETCCDMI/software.shtml]. Wang, X. L., 2008a: Accounting for autocorrelation in detecting mean-shifts in climate data series using the penalized maximal t or F test. J. Appl. Meteor. Climatol., 47, 2423-2444. Wang, X. L., 2008b: Penalized maximal F-test for detecting undocumented mean-shifts without trend-change. J. Atmos. Oceanic Tech., 25 (No. 3), 368-384. DOI:10.1175/2007/JTECHA982.1. Wang, X. L., Q. H. Wen, and Y. Wu, 2007: Penalized maximal t test for detecting undocumented mean change in climate data series. J. Appl. Meteor. Climatol., 46 (No. 6), 916-931. DOI:10.1175/JAM2504.1
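
    The quantile-matching idea can be illustrated in a few lines: estimate the quantile-dependent offset between a segment and a reference segment, then add the offset, interpolated at each value's own empirical quantile, to that value. This is a stripped-down sketch of the concept behind Wang's (2009) QM algorithm, not the published implementation (which operates on detrended series and interpolates more carefully):

```python
import numpy as np


def quantile_match(segment, reference, probs=np.linspace(0.05, 0.95, 19)):
    """Adjust `segment` so its empirical quantiles match `reference`'s.

    Simplified illustration only: compute offsets between matching
    quantiles of the two samples, then add to each data value the
    offset interpolated at that value's empirical quantile.
    """
    seg_q = np.quantile(segment, probs)
    ref_q = np.quantile(reference, probs)
    offsets = ref_q - seg_q
    # empirical quantile (rank) of each value within its own segment
    ranks = np.searchsorted(np.sort(segment), segment, side="right") / len(segment)
    adj = np.interp(ranks, probs, offsets)
    return segment + adj
```

    For a segment that is simply shifted by a constant, every quantile offset equals that constant, so the adjustment reduces to the mean-shift correction of the older approach; the gain appears when the shift varies across the distribution.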

  10. Antibiogramj: A tool for analysing images from disk diffusion tests.

    PubMed

    Alonso, C A; Domínguez, C; Heras, J; Mata, E; Pascual, V; Torres, C; Zarazaga, M

    2017-05-01

    Disk diffusion testing, known as the antibiogram, is widely applied in microbiology to determine the antimicrobial susceptibility of microorganisms. The measurement of the diameter of the zone of growth inhibition around the antimicrobial disks in the antibiogram is frequently performed manually by specialists using a ruler. This is a time-consuming and error-prone task that might be simplified using automated or semi-automated inhibition zone readers. However, most readers are expensive instruments with embedded software that require significant changes in laboratory design and workflow. Based on the workflow employed by specialists to determine the antimicrobial susceptibility of microorganisms, we have designed a software tool that, from images of disk diffusion tests, semi-automatises the process. Standard computer vision techniques are employed to achieve this automatisation. We present AntibiogramJ, a user-friendly and open-source software tool to semi-automatically determine, measure and categorise inhibition zones in images from disk diffusion tests. AntibiogramJ is implemented in Java and deals with images captured by any device that incorporates a camera, including digital cameras and mobile phones. The fully automatic procedure of AntibiogramJ for measuring inhibition zones achieves an overall agreement of 87% with an expert microbiologist; moreover, AntibiogramJ includes features to easily detect when the automatic reading is incorrect and to fix it manually to obtain the correct result. AntibiogramJ is a user-friendly, platform-independent, open-source and free tool that, to the best of our knowledge, is the most complete software tool for antibiogram analysis, requiring no investment in new equipment or changes in the laboratory. Copyright © 2017 Elsevier B.V. All rights reserved.
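
    The measuring step itself can be sketched with a radial intensity profile: starting from the disk centre, grow the radius until the mean intensity on a ring drops below a threshold, i.e. where bacterial growth resumes. A simplified Python illustration of the idea only (AntibiogramJ is Java and uses fuller computer vision techniques; the names and the brighter-clear-zone assumption are illustrative):

```python
import numpy as np


def zone_diameter_px(image, center, threshold):
    """Estimate the inhibition-zone diameter, in pixels, around a disk.

    Walk outward from the disk centre and return twice the last radius
    whose ring of pixels still averages above `threshold`.  Assumes the
    clear zone is brighter than the surrounding lawn of growth.
    """
    cy, cx = center
    rows, cols = np.indices(image.shape)
    radii = np.hypot(rows - cy, cols - cx).round().astype(int)
    max_r = radii.max()
    for r in range(1, max_r + 1):
        ring = image[radii == r]
        if ring.size and ring.mean() < threshold:
            return 2 * (r - 1)  # last fully clear radius -> diameter
    return 2 * max_r
```

    A real reader must also locate the disk centres and calibrate pixels to millimetres before categorising the result against breakpoint tables.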

  11. Feasibility of Close-Range Photogrammetric Models for Geographic Information System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Luke; /Rice U.

    2011-06-22

    The objective of this project was to determine the feasibility of using close-range architectural photogrammetry as an alternative three-dimensional modeling technique in order to place the digital models in a geographic information system (GIS) at SLAC. With the available equipment and Australis photogrammetry software, the creation of full and accurate models of an example building, Building 281 on the SLAC campus, was attempted. After conducting several equipment tests to determine the achievable precision, a complete photogrammetric survey was attempted. The dimensions of the resulting models were then compared against the true dimensions of the building. A complete building model could not be obtained using the current equipment and software. This failure was likely attributable to the limits of the software rather than the precision of the physical equipment. However, partial models of the building were shown to be accurate and determined to still be usable in a GIS. With further development of the photogrammetric software and survey procedure, the desired generation of a complete three-dimensional model is likely still feasible.

  12. Simulation-To-Flight (STF-1): A Mission to Enable CubeSat Software-Based Validation and Verification

    NASA Technical Reports Server (NTRS)

    Morris, Justin; Zemerick, Scott; Grubb, Matt; Lucas, John; Jaridi, Majid; Gross, Jason N.; Ohi, Nicholas; Christian, John A.; Vassiliadis, Dimitris; Kadiyala, Anand

    2016-01-01

    The Simulation-to-Flight 1 (STF-1) CubeSat mission aims to demonstrate how legacy simulation technologies may be adapted for flexible and effective use on missions using the CubeSat platform. These technologies, named the NASA Operational Simulator (NOS), have demonstrated significant value on several missions such as the James Webb Space Telescope, Global Precipitation Measurement, Juno, and the Deep Space Climate Observatory in the areas of software development, mission operations/training, verification and validation (V&V), test procedure development and software systems check-out. STF-1 will demonstrate a highly portable simulation and test platform that allows seamless transition of mission development artifacts to flight products. This environment will decrease development time of future CubeSat missions by lessening the dependency on hardware resources. In addition, through a partnership between NASA GSFC, the West Virginia Space Grant Consortium and West Virginia University, the STF-1 CubeSat will host payloads for three secondary objectives that aim to advance engineering and physical-science research in the area of navigation systems for small satellites, provide useful data for understanding magnetosphere-ionosphere coupling and space weather, and verify the performance and durability of III-V Nitride-based materials.

  13. A verification library for multibody simulation software

    NASA Technical Reports Server (NTRS)

    Kim, Sung-Soo; Haug, Edward J.; Frisch, Harold P.

    1989-01-01

    A multibody dynamics verification library that maintains and manages test and validation data is proposed, based on RRC robot arm and CASE backhoe validation and on a comparative study of DADS, DISCOS, and CONTOPS, which are existing public-domain and commercial multibody dynamics simulation programs. Using simple representative problems, simulation results from each program are cross-checked, and the validation results are presented. Functionalities of the verification library are defined in order to automate the validation procedure.

  14. Integrated Advanced Sounding Unit-A (AMSU-A). Configuration Management Plan

    NASA Technical Reports Server (NTRS)

    Cavanaugh, J.

    1996-01-01

    The purpose of this plan is to identify the baseline to be established during the development life cycle of the integrated AMSU-A and to define the methods and procedures which Aerojet will follow in the implementation of configuration control for each established baseline. This plan also establishes the Configuration Management process to be used for the deliverable hardware, software, and firmware of the Integrated AMSU-A during development, design, fabrication, test, and delivery.

  15. A needle guidance system for biopsy and therapy using two-dimensional ultrasound

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bluvol, Nathan; Sheikh, Allison; Kornecki, Anat

    2008-02-15

    Image-guided needle biopsies are currently used to provide a definitive diagnosis of breast cancer; however, difficulties in tumor targeting exist because the ultrasound (US) scan plane and biopsy needle must remain coplanar throughout the procedure to display the actual needle tip position. The additional time associated with aligning and maintaining this coplanar relationship results in increased patient discomfort. Biopsy procedural efficiency is further hindered because needle pathway interpretation is often difficult, especially for needle insertions at large depths, which usually require multiple reinsertions. The authors developed a system to increase the speed and accuracy of current breast biopsy procedures using readily available two-dimensional (2D) US technology. This system is composed of a passive articulated mechanical arm that attaches to a 2D US transducer. The arm is connected to a computer through custom electronics and software, which were developed as an interface for tracking the positioning of the mechanical components in real time. The arm couples to the biopsy needle and provides visual guidance for the physician performing the procedure in the form of a real-time projected needle pathway overlay on a US image of the breast. An agar test phantom, with stainless steel targets interspersed randomly throughout, was used to validate needle trajectory positioning accuracy. The biopsy needle was guided by both the software and hardware components to the targets. The phantom, with the needle inserted and the device decoupled, was placed in an x-ray stereotactic mammography (SM) machine. The needle trajectory and bead target locations were determined in three dimensions from the SM images. Results indicated a mean needle trajectory accuracy error of 0.75 ± 0.42 mm. This is adequate to sample lesions that are <2 mm in diameter. 
Chicken tissue test phantoms were used to compare core needle biopsy procedure times between experienced radiologists and inexperienced resident radiologists using free-hand US and the needle guidance system. Cylindrical polyvinyl alcohol cryogel lesions, colored blue, were embedded in chicken tissue. Radiologists identified the lesions, visible as hypoechoic masses in the US images, and performed biopsies using a 14-gauge needle. Procedure times were compared based on experience and the technique performed. Using a pair-wise t test, lower biopsy procedure times were observed when using the guidance system versus the free-hand technique (t=12.59, p<0.001). The authors believe that with this improved biopsy guidance they will be able to reduce the "false negative" rate of biopsies, especially in the hands of less experienced physicians.
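
    The reported accuracy figure is, per target, a perpendicular point-to-line distance in 3D: the distance from a bead target to the line defined by the reconstructed needle trajectory. A minimal sketch of that geometry (hypothetical function names; not the authors' code):

```python
import numpy as np


def trajectory_error(target, tip, direction):
    """Perpendicular distance from a bead target to the needle
    trajectory, a 3D line through `tip` along `direction`."""
    d = np.asarray(direction, float)
    d = d / np.linalg.norm(d)
    v = np.asarray(target, float) - np.asarray(tip, float)
    # subtract the component of v along the line; what remains is
    # the perpendicular offset
    return np.linalg.norm(v - (v @ d) * d)


def accuracy_summary(errors):
    """Mean and sample standard deviation of per-target errors,
    the form in which the 0.75 +/- 0.42 mm result is reported."""
    e = np.asarray(errors, float)
    return e.mean(), e.std(ddof=1)
```

    Aggregating this distance over all bead targets yields the mean ± SD accuracy quoted in the abstract.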

  16. Software For Computer-Aided Design Of Control Systems

    NASA Technical Reports Server (NTRS)

    Wette, Matthew

    1994-01-01

    Computer Aided Engineering System (CAESY) software developed to provide means to evaluate methods for dealing with users' needs in computer-aided design of control systems. Interpreter program for performing engineering calculations. Incorporates features of both Ada and MATLAB. Designed to be flexible and powerful. Includes internally defined functions, procedures and provides for definition of functions and procedures by user. Written in C language.

  17. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--STANDARD OPERATING PROCEDURE FOR PERFORMANCE OF COMPUTER SOFTWARE: VERIFICATION AND VALIDATION (UA-D-2.0)

    EPA Science Inventory

    The purpose of this SOP is to define the procedures used for the initial and periodic verification and validation of computer programs used during the Arizona NHEXAS project and the Border study. Keywords: Computers; Software; QA/QC.

    The U.S.-Mexico Border Program is sponsored ...

  18. SU-G-IeP3-05: Effects of Image Receptor Technology and Dose Reduction Software On Radiation Dose Estimates for Fluoroscopically-Guided Interventional (FGI) Procedures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Merritt, Z; Dave, J; Eschelman, D

    Purpose: To investigate the effects of image receptor technology and dose reduction software on radiation dose estimates for the most frequently performed fluoroscopically-guided interventional (FGI) procedures at a tertiary health care center. Methods: IRB approval was obtained for retrospective analysis of FGI procedures performed in the interventional radiology suites between January 2011 and December 2015. This included procedures performed using image-intensifier (II) based systems which were subsequently replaced, flat-panel-detector (FPD) based systems which were later upgraded with ClarityIQ dose reduction software (Philips Healthcare), and a relatively new FPD system already equipped with ClarityIQ. Post procedure, technologists entered the system-reported cumulative air kerma (CAK) and kerma-area product (KAP; only KAP for II based systems) in the RIS; these values were analyzed. Data pre-processing included correcting typographical errors and cross-verifying CAK and KAP. The most frequent high and low dose FGI procedures were identified and the corresponding CAK and KAP values were compared. Results: Out of 27,251 procedures within this time period, the most frequent high and low dose procedures were chemo/immuno-embolization (n=1967) and abscess drainage (n=1821). Mean KAP for embolization and abscess drainage procedures were 260,657, 310,304 and 94,908 mGy·cm², and 14,497, 15,040 and 6307 mGy·cm², using II-, FPD- and FPD with ClarityIQ- based systems, respectively. Statistically significant differences were observed in KAP values for embolization procedures with respect to different systems, but for abscess drainage procedures significant differences were only noted between systems with FPD and FPD with ClarityIQ (p<0.05). Mean CAK was reduced significantly from 823 to 308 mGy and from 43 to 21 mGy for embolization and abscess drainage procedures, respectively, in transitioning to FPD systems with ClarityIQ (p<0.05).
Conclusion: While transitioning from II- to FPD- based systems was not associated with dose reduction for the most frequently performed FGI procedures, substantial dose reduction was noted with relatively newer systems and dose reduction software.

  19. Validation of a Video Analysis Software Package for Quantifying Movement Velocity in Resistance Exercises.

    PubMed

    Sañudo, Borja; Rueda, David; Pozo-Cruz, Borja Del; de Hoyo, Moisés; Carrasco, Luis

    2016-10-01

    Sañudo, B, Rueda, D, del Pozo-Cruz, B, de Hoyo, M, and Carrasco, L. Validation of a video analysis software package for quantifying movement velocity in resistance exercises. J Strength Cond Res 30(10): 2934-2941, 2016-The aim of this study was to establish the validity of a video analysis software package in measuring mean propulsive velocity (MPV) and the maximal velocity during bench press. Twenty-one healthy males (21 ± 1 year) with weight training experience were recruited, and the MPV and the maximal velocity of the concentric phase (Vmax) were compared with a linear position transducer system during a standard bench press exercise. Participants performed a 1 repetition maximum test using the supine bench press exercise. The testing procedures involved the simultaneous assessment of bench press propulsive velocity using 2 kinematic (linear position transducer and semi-automated tracking software) systems. High Pearson's correlation coefficients for MPV and Vmax between both devices (r = 0.473 to 0.993) were observed. The intraclass correlation coefficients for barbell velocity data and the kinematic data obtained from video analysis were high (>0.79). In addition, the low coefficients of variation indicate that measurements had low variability. Finally, Bland-Altman plots with the limits of agreement of the MPV and Vmax with different loads showed a negative trend, which indicated that the video analysis had higher values than the linear transducer. In conclusion, this study has demonstrated that the software used for the video analysis was an easy to use and cost-effective tool with a very high degree of concurrent validity. This software can be used to evaluate changes in velocity of training load in resistance training, which may be important for the prescription and monitoring of training programmes.
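
The Bland-Altman limits of agreement used above can be computed as follows. This is a minimal sketch with hypothetical paired MPV readings, not the study's data; a positive mean bias corresponds to the video analysis reading higher than the linear transducer, as reported.

```python
import statistics

def bland_altman_limits(a, b):
    """Mean bias and 95% limits of agreement between two paired methods."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical mean propulsive velocity readings (m/s) for six loads
video      = [0.52, 0.61, 0.75, 0.83, 0.95, 1.10]   # video analysis software
transducer = [0.50, 0.60, 0.72, 0.80, 0.93, 1.05]   # linear position transducer
bias, lo, hi = bland_altman_limits(video, transducer)
```

In a Bland-Altman plot, each pair's difference is plotted against its mean, with horizontal lines at the bias and the two limits of agreement.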

  20. Software use cases to elicit the software requirements analysis within the ASTRI project

    NASA Astrophysics Data System (ADS)

    Conforti, Vito; Antolini, Elisa; Bonnoli, Giacomo; Bruno, Pietro; Bulgarelli, Andrea; Capalbi, Milvia; Fioretti, Valentina; Fugazza, Dino; Gardiol, Daniele; Grillo, Alessandro; Leto, Giuseppe; Lombardi, Saverio; Lucarelli, Fabrizio; Maccarone, Maria Concetta; Malaguti, Giuseppe; Pareschi, Giovanni; Russo, Federico; Sangiorgi, Pierluca; Schwarz, Joseph; Scuderi, Salvatore; Tanci, Claudio; Tosti, Gino; Trifoglio, Massimo; Vercellone, Stefano; Zanmar Sanchez, Ricardo

    2016-07-01

    The Italian National Institute for Astrophysics (INAF) is leading the Astrofisica con Specchi a Tecnologia Replicante Italiana (ASTRI) project whose main purpose is the realization of small size telescopes (SST) for the Cherenkov Telescope Array (CTA). The first goal of the ASTRI project has been the development and operation of an innovative end-to-end telescope prototype using a dual-mirror optical configuration (SST-2M) equipped with a camera based on silicon photo-multipliers and very fast read-out electronics. The ASTRI SST-2M prototype has been installed in Italy at the INAF "M.G. Fracastoro" Astronomical Station located at Serra La Nave, on Mount Etna, Sicily. This prototype will be used to test several mechanical, optical, control hardware and software solutions which will be used in the ASTRI mini-array, comprising nine telescopes proposed to be placed at the CTA southern site. The ASTRI mini-array is a collaborative and international effort led by INAF and carried out by Italy, Brazil and South Africa. We present here the use cases, through UML (Unified Modeling Language) diagrams and text details, that describe the functional requirements of the software that will manage the ASTRI SST-2M prototype, and the lessons learned thanks to these activities. We intend to adopt the same approach for the Mini Array Software System that will manage the ASTRI mini-array operations. Use cases are of importance for the whole software life cycle; in particular they provide valuable support to the validation and verification activities. Following the iterative development approach, which breaks down the software development into smaller chunks, we have analysed the requirements, developed, and then tested the code in repeated cycles. The use case technique allowed us to formalize the problem through user stories that describe how the user procedurally interacts with the software system.
Through the use cases we improved the communication among team members, fostered common agreement about system requirements, defined the normal and alternative course of events, understood better the business process, and defined the system test to ensure that the delivered software works properly. We present a summary of the ASTRI SST-2M prototype use cases, and how the lessons learned can be exploited for the ASTRI mini-array proposed for the CTA Observatory.

  1. Final report on fiscal year 1992 activities for the environmental monitors line-loss study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kenoyer, J.L.

    The work on this Environmental Monitors Line-Loss Study was performed under Contract Numbers MLW-SVV-073750 and MFH-SVV-207554. Work on the task was initiated in mid-December 1991, and this report documents and summarizes the work performed through January 18, 1993. The sections of this report summarize the work performed on the Environmental Monitors Line-Loss Study and are arranged to reflect individual sub-tasks. They include: descriptions of the measurement systems and procedures used to obtain cascade impactor samples and laser spectrometer measurements from multiple stacks and locations; information on data acquisition, analyses, assessment, and software; discussion of the analyses and measurement results from the cascade impactor and laser spectrometer systems and the software used; discussion of the development of general test methods and procedures for line-loss determinations; and an overall summary and specific conclusions that can be made with regard to efforts performed on this task during FY 1992 and FY 1993. Supporting information for these sections is included in this report as appendices.

  2. A methodological, task-based approach to Procedure-Specific Simulations training.

    PubMed

    Setty, Yaki; Salzman, Oren

    2016-12-01

    Procedure-Specific Simulations (PSS) are 3D realistic simulations that provide a platform to practice complete surgical procedures in a virtual-reality environment. While PSS have the potential to improve surgeons' proficiency, there are no existing standards or guidelines for PSS development in a structured manner. We employ a unique platform inspired by game design to develop three-dimensional virtual reality simulations of urethrovesical anastomosis during radical prostatectomy. 3D visualization is supported by stereo vision, providing a fully realistic view of the simulation. The software can be executed for any robotic surgery platform. Specifically, we tested the simulation under a Windows environment on the RobotiX Mentor. Using the urethrovesical anastomosis during radical prostatectomy simulation as a representative example, we present a task-based methodological approach to PSS training. The methodology provides tasks in increasing levels of difficulty, from a novice level of basic anatomy identification to an expert level that permits testing new surgical approaches. The modular methodology presented here can be easily extended to support more complex tasks. We foresee this methodology as a tool used to integrate PSS as a complementary training process for surgical procedures.

  3. The use of Graphic User Interface for development of a user-friendly CRS-Stack software

    NASA Astrophysics Data System (ADS)

    Sule, Rachmat; Prayudhatama, Dythia; Perkasa, Muhammad D.; Hendriyana, Andri; Fatkhan; Sardjito; Adriansyah

    2017-04-01

    The development of a user-friendly Common Reflection Surface (CRS) Stack software built around a Graphical User Interface (GUI) is described in this paper. The original CRS-Stack software developed by the WIT Consortium is compiled in the unix/linux environment and is not user-friendly: the user must write the commands and parameters manually in a script file. Because of this limitation, the CRS-Stack method has not become popular, even though it is a promising way to obtain better seismic sections with improved reflector continuity and S/N ratio. After obtaining successful results on several seismic data sets belonging to oil companies in Indonesia, we decided to develop a user-friendly software in our own laboratory. A Graphical User Interface (GUI) allows people to interact with computer programs in a better way: rather than typing commands and module parameters, users can operate the programs much more simply and easily, since the GUI transforms the text-based interface into graphical icons and visual indicators and avoids the use of complicated seismic unix shell scripts. The Java Swing GUI library is used to develop this CRS-Stack GUI; every shell script that represents a seismic processing step is invoked from the Java environment. Besides providing interactive CRS-Stack processing, this GUI is designed to help geophysicists manage projects with complex seismic processing procedures. The CRS-Stack GUI software is organized around input directories, operators, and output directories, which together define a seismic data processing workflow. The CRS-Stack processing workflow involves four steps: automatic CMP stack, initial CRS-Stack, optimized CRS-Stack, and CRS-Stack Supergather.
These operations are visualized in an informative flowchart with a self-explanatory system that guides the user in entering the parameter values for each operation. The knowledge of the CRS-Stack processing procedure is thus preserved in the software and is easy and efficient to learn. The software will continue to be developed, and any new innovative seismic processing workflow can also be added to this GUI software.
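
The four-step workflow pattern described above, where a GUI invokes one shell script per processing step in a fixed order, can be sketched as follows. The actual software does this from Java Swing; this is a Python sketch of the same pattern, and the script names are hypothetical.

```python
import subprocess

# Hypothetical script names for the four CRS-Stack workflow steps
WORKFLOW = [
    "cmp_stack.sh",        # automatic CMP stack
    "initial_crs.sh",      # initial CRS-Stack
    "optimized_crs.sh",    # optimized CRS-Stack
    "supergather.sh",      # CRS-Stack Supergather
]

def run_workflow(steps, dry_run=True):
    """Run each processing step in order, stopping on the first failure.

    With dry_run=True the scripts are only listed, not executed.
    """
    executed = []
    for script in steps:
        if not dry_run:
            result = subprocess.run(["sh", script], capture_output=True, text=True)
            if result.returncode != 0:
                raise RuntimeError(f"{script} failed: {result.stderr}")
        executed.append(script)
    return executed

order = run_workflow(WORKFLOW)   # dry run: no scripts are actually executed
```

Keeping the workflow as an ordered list of steps is what lets the GUI render it as a flowchart and prompt for each step's parameters in turn.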

  4. Modal Analysis for Grid Operation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    The MANGO software provides a solution for improving small signal stability of power systems by adjusting operator-controllable variables using PMU measurements. System oscillation problems are one of the major threats to grid stability and reliability in California and the Western Interconnection. These problems result in power fluctuations and lower grid operation efficiency, and may even lead to large-scale grid breakup and outages. The MANGO software addresses this problem by automatically generating recommended operation procedures, termed Modal Analysis for Grid Operation (MANGO), to improve damping of inter-area oscillation modes. The MANGO procedure includes three steps: recognizing small signal stability problems, implementing operating point adjustment using modal sensitivity, and evaluating the effectiveness of the adjustment. The MANGO software package is designed to help implement the MANGO procedure.

  5. A standardized test battery for the study of synesthesia

    PubMed Central

    Eagleman, David M.; Kagan, Arielle D.; Nelson, Stephanie S.; Sagaram, Deepak; Sarma, Anand K.

    2014-01-01

    Synesthesia is an unusual condition in which stimulation of one modality evokes sensation or experience in another modality. Although discussed in the literature well over a century ago, synesthesia slipped out of the scientific spotlight for decades because of the difficulty in verifying and quantifying private perceptual experiences. In recent years, the study of synesthesia has enjoyed a renaissance due to the introduction of tests that demonstrate the reality of the condition, its automatic and involuntary nature, and its measurable perceptual consequences. However, while several research groups now study synesthesia, there is no single protocol for comparing, contrasting and pooling synesthetic subjects across these groups. There is no standard battery of tests, no quantifiable scoring system, and no standard phrasing of questions. Additionally, the tests that exist offer no means for data comparison. To remedy this deficit we have devised the Synesthesia Battery. This unified collection of tests is freely accessible online (http://www.synesthete.org). It consists of a questionnaire and several online software programs, and test results are immediately available for use by synesthetes and invited researchers. Performance on the tests is quantified with a standard scoring system. We introduce several novel tests here, and offer the software for running the tests. By presenting standardized procedures for testing and comparing subjects, this endeavor hopes to speed scientific progress in synesthesia research. PMID:16919755
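
The battery's quantified scoring rests on consistency of color choices across repeated presentations of the same grapheme. The sketch below illustrates the idea with a mean pairwise RGB distance; the data are hypothetical, and this is not necessarily the battery's exact scoring formula.

```python
import math

def consistency_score(trials):
    """Mean pairwise RGB distance between colors chosen on repeated
    presentations of the same grapheme (components normalized to [0, 1]).
    Lower scores mean more consistent (more synesthete-like) choices."""
    total, count = 0.0, 0
    for colors in trials.values():          # colors: list of (r, g, b) per grapheme
        for i in range(len(colors)):
            for j in range(i + 1, len(colors)):
                total += math.dist(colors[i], colors[j])
                count += 1
    return total / count

# Hypothetical data: each grapheme presented three times
trials = {
    "A": [(0.90, 0.10, 0.10), (0.88, 0.12, 0.10), (0.91, 0.10, 0.12)],
    "B": [(0.10, 0.20, 0.90), (0.12, 0.18, 0.92), (0.10, 0.22, 0.88)],
}
score = consistency_score(trials)           # near 0: highly consistent choices
```

A subject who picks nearly the same color on every presentation gets a score near zero; random color choices produce a much larger mean distance.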

  6. 32 CFR 291.3 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., functions, decisions, or procedures of a DNA organization. Normally, computer software, including source.... (This does not include the underlying data which is processed and produced by such software and which may in some instances be stored with the software.) Exceptions to this position are outlined in...

  7. 75 FR 30010 - Improving Market and Planning Efficiency Through Improved Software; Notice of Agenda and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-28

    ... Market and Planning Efficiency Through Improved Software; Notice of Agenda and Procedures for Staff... conference to be held on June 2, 2010 and June 3, 2010, to discuss issues related to unit commitment software... Unit Commitment Software Federal Energy Regulatory Commission June 2, 2010 8 a.m Richard O'Neill, FERC...

  8. 75 FR 30387 - Improving Market and Planning Efficiency Through Improved Software; Notice of Agenda and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-01

    ... Market and Planning Efficiency Through Improved Software; Notice of Agenda and Procedures for Staff... planning models and software. The technical conference will be held from 8 a.m. to 5:30 p.m. (EDT) on June.... Agenda for AD10-12 Staff Technical Conference on Planning Models and Software Federal Energy Regulatory...

  9. Reaction control system/remote manipulator system automation

    NASA Technical Reports Server (NTRS)

    Hiers, Harry K.

    1990-01-01

    The objectives of this project are to evaluate the capability of the Procedural Reasoning System (PRS) in a typical real-time space shuttle application and to assess its potential for use in the Space Station Freedom. PRS, developed by SRI International, is a result of research in automating the monitoring and control of spacecraft systems. The particular application selected for the present work is the automation of malfunction handling procedures for the Shuttle Remote Manipulator System (SRMS). The SRMS malfunction procedures will be encoded within the PRS framework, a crew interface appropriate to the RMS application will be developed, and the real-time data interface software developed. The resulting PRS will then be integrated with the high-fidelity On-orbit Simulation of the NASA Johnson Space Center's System Engineering Simulator, and tests under various SRMS fault scenarios will be conducted.

  10. Data Analysis for the LISA Pathfinder Mission

    NASA Technical Reports Server (NTRS)

    Thorpe, James Ira

    2009-01-01

    The LTP (LISA Technology Package) is the core part of the Laser Interferometer Space Antenna (LISA) Pathfinder mission. The main goal of the mission is to study the sources of any disturbances that perturb the motion of the freely-falling test masses from their geodesic trajectories, as well as to test various technologies needed for LISA. The LTP experiment is designed as a sequence of experimental runs in which the performance of the instrument is studied and characterized under different operating conditions. In order to best optimize subsequent experimental runs, each run must be promptly analysed to ensure that the following ones make the best use of the available knowledge of the instrument. In order to do this, all analyses must be designed and tested in advance of the mission and have sufficient built-in flexibility to account for unexpected results or behaviour. To support this activity, a robust and flexible data analysis software package is also required. This poster presents two of the main components that make up the data analysis effort: the data analysis software and the mock-data challenges used to validate analysis procedures and experiment designs.

  11. Modeling reliability measurement of interface on information system: Towards the forensic of rules

    NASA Astrophysics Data System (ADS)

    Nasution, M. K. M.; Sitompul, Darwin; Harahap, Marwan

    2018-02-01

    Today almost all machines depend on software, and a software and hardware system depends in turn on the rules that are the procedures for its use. If a procedure or program can be reliably characterized using the concepts of graphs, logic, and probability, then regulatory strength can also be measured accordingly. Therefore, this paper initiates an enumeration model for measuring the reliability of interfaces, based on the case of information systems governed by the usage rules of the relevant agencies. The enumeration model is obtained from software reliability calculation.

  12. Aeroservoelastic Testing of a Sidewall Mounted Free Flying Wind-Tunnel Model

    NASA Technical Reports Server (NTRS)

    Scott, Robert C.; Vetter, Travis K.; Penning, Kevin B.; Coulson, David A.; Heeg, Jennifer

    2008-01-01

    A team comprised of the Air Force Research Laboratory (AFRL), Northrop Grumman, Lockheed Martin, and the NASA Langley Research Center conducted three wind-tunnel tests in the Transonic Dynamics Tunnel to demonstrate active control technologies relevant to large, flexible vehicles. In the first of these three tests, a semispan, aeroelastically scaled, wind-tunnel model of a flying wing SensorCraft vehicle was mounted to a force balance to demonstrate gust load alleviation. In the second and third tests, the same wing was mated to a new, multi-degree-of-freedom, sidewall mount. This mount allowed the half-span model to translate vertically and pitch at the wing root, allowing better simulation of the full-span vehicle's rigid-body modes. Gust Load Alleviation (GLA) and Body Freedom Flutter (BFF) suppression were successfully demonstrated. The rigid-body degrees of freedom required that the model be flown in the wind tunnel using an active control system. This risky mode of testing necessitated that a model arrestment system be integrated into the new mount. The safe and successful completion of these free-flying tests required the development and integration of custom hardware and software. This paper describes the many systems, software, and procedures that were developed as part of this effort.

  13. Detecting agricultural to urban land use change from multi-temporal MSS digital data. [Salt Lake County, Utah

    NASA Technical Reports Server (NTRS)

    Ridd, M. K.; Merola, J. A.; Jaynes, R. A.

    1983-01-01

    Conversion of agricultural land to a variety of urban uses is a major problem along the Wasatch Front, Utah. Although LANDSAT MSS data is a relatively coarse tool for discriminating categories of change in urban-size plots, its availability prompts a thorough test of its power to detect change. The procedures being applied to a test area in Salt Lake County, Utah, where the land conversion problem is acute are presented. The identity of land uses before and after conversion was determined and digital procedures for doing so were compared. Several algorithms were compared, utilizing both raw data and preprocessed data. Verification of results involved high quality color infrared photography and field observation. Two data sets were digitally registered, specific change categories internally identified in the software, results tabulated by computer, and change maps printed at 1:24,000 scale.

  14. X-Graphs: Language and Algorithms for Heterogeneous Graph Streams

    DTIC Science & Technology

    2017-09-01

    INTRODUCTION ... METHODS, ASSUMPTIONS, AND PROCEDURES ... Software Abstractions for Graph Analytic Applications ... High-performance Platforms for Graph Processing ... data is stored in a distributed file system. METHODS, ASSUMPTIONS, AND PROCEDURES: Software Abstractions for Graph Analytic Applications ... implementations of novel methods for network analysis: several methods for detection of overlapping communities, personalized PageRank, node embeddings into a d

  15. Copyright: Know Your Electronic Rights!

    ERIC Educational Resources Information Center

    Valauskas, Edward J.

    1992-01-01

    Defines copyright and examines the interests of computer software publishers. Issues related to the rights of libraries in the circulation of software are discussed, including the fair use principle, software vendors' licensing agreements, and cooperation between libraries and vendors. An inset describes procedures for internal auditing of…

  16. Software Certification - Coding, Code, and Coders

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Holzmann, Gerard J.

    2011-01-01

    We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.

  17. Automated ILA design for synchronous sequential circuits

    NASA Technical Reports Server (NTRS)

    Liu, M. N.; Liu, K. Z.; Maki, G. K.; Whitaker, S. R.

    1991-01-01

    An iterative logic array (ILA) architecture for synchronous sequential circuits is presented. This technique utilizes linear algebra to produce the design equations. The ILA realization of synchronous sequential logic can be fully automated with a computer program. A programmable design procedure is proposed to fulfill the design task and layout generation. A software algorithm in the C language has been developed and tested to generate 1 micron CMOS layouts using the Hewlett-Packard FUNGEN module generator shell.

  18. SHELL: A Simulator for the Software Test Vehicle of the INFOPLEX Database Computer.

    DTIC Science & Technology

    1982-01-01

    3.6. GCER: Incoming Messages ... 3.7. AAHER ... 3.8. SAHER ... Data ... equal to the STIME clock of the AAH. The LOGMSG event will be handled by the SAHER. 3.8. SAHER: SAHER is the procedure which simulates the action of a SAH. A full description of its operation is given in Chapter 5. In short, each activation of SAHER resembles a scheduling cycle (of the operating

  19. Test and evaluation of the Argonne BPAC10 Series air chamber calorimeter designed for 20 minute measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perry, R.B.; Fiarman, S.; Jung, E.A.

    1990-10-01

    This paper is the final report on DOE-OSS Task ANLE88002, "Fast Air Chamber Calorimetry." The task objective was to design, construct, and test an isothermal air chamber calorimeter for plutonium assay of bulk samples that would meet the following requirements for sample power measurement: average sample measurement time of less than 20 minutes; measurement of samples with power output up to 10 W; precision better than 1% RSD for sample power greater than 1 W; and precision better than 0.010 W SD for sample power less than 1 W. This report gives a description of the calorimeter hardware and software and discusses the test results. The instrument operating procedure, included as an appendix, gives examples of typical input/output and explains the menu-driven software. A sample measurement time of less than 20 minutes was attained by pre-equilibration of the samples in low-cost precision preheaters and by prediction of equilibrium measurements. Tests at the TA55 Plutonium Facility at Los Alamos National Laboratory, on typical samples, indicate that the instrument meets all the measurement requirements.

  20. Initial Experience With A Prototype Storage System At The University Of North Carolina

    NASA Astrophysics Data System (ADS)

    Creasy, J. L.; Loendorf, D. D.; Hemminger, B. M.

    1986-06-01

    A prototype archiving system manufactured by the 3M Corporation has been in place at the University of North Carolina for approximately 12 months. The system was installed as a result of a collaboration between 3M and UNC, with 3M seeking testing of their system, and UNC realizing the need for an archiving system as an essential part of their PACS test-bed facilities. System hardware includes appropriate network and disk interface devices as well as media for both short and long term storage of images and their associated information. The system software includes those procedures necessary to communicate with the network interface elements (NIEs) as well as those procedures necessary to interpret the ACR-NEMA header blocks and to store the images. A subset of the total ACR-NEMA header is parsed and stored in a relational database system. The entire header is stored on disk with the completed study. Interactive programs have been developed that allow radiologists to easily retrieve information about the archived images and to send the full images to a viewing console. Initial experience with the system has consisted primarily of hardware and software debugging. Although the system is ACR-NEMA compatible, further objective and subjective assessments of system performance are awaiting the connection of compatible consoles and acquisition devices to the network.

  1. Systematic comparisons between PRISM version 1.0.0, BAP, and CSMIP ground-motion processing

    USGS Publications Warehouse

    Kalkan, Erol; Stephens, Christopher

    2017-02-23

    A series of benchmark tests was run by comparing results of the Processing and Review Interface for Strong Motion data (PRISM) software version 1.0.0 to Basic Strong-Motion Accelerogram Processing Software (BAP; Converse and Brady, 1992), and to California Strong Motion Instrumentation Program (CSMIP) processing (Shakal and others, 2003, 2004). These tests were performed by using the MATLAB implementation of PRISM, which is equivalent to its public release version in the Java language. Systematic comparisons were made in time and frequency domains of records processed in PRISM and BAP, and in CSMIP, by using a set of representative input motions with varying resolutions, frequency content, and amplitudes. Although the details of strong-motion records vary among the processing procedures, there are only minor differences among the waveforms for each component and within the frequency passband common to these procedures. A comprehensive statistical evaluation considering more than 1,800 ground-motion components demonstrates that differences in peak amplitudes of acceleration, velocity, and displacement time series obtained from PRISM and CSMIP processing are equal to or less than 4 percent for 99 percent of the data, and equal to or less than 2 percent for 96 percent of the data. Other statistical measures, including the Euclidean distance (L2 norm) and the windowed root mean square level of processed time series, also indicate that both processing schemes produce statistically similar products.
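
The two statistical measures named above, the L2 norm between processed time series and the windowed RMS level, can be sketched as follows. The short series here are hypothetical stand-ins for processed accelerograms, not benchmark data.

```python
import math

def l2_distance(a, b):
    """Euclidean (L2) distance between two equal-length time series."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def windowed_rms(x, window):
    """RMS level of consecutive non-overlapping windows of a time series."""
    out = []
    for i in range(0, len(x) - window + 1, window):
        seg = x[i:i + window]
        out.append(math.sqrt(sum(v * v for v in seg) / window))
    return out

# Hypothetical processed accelerograms from two pipelines
rec_a = [0.0, 0.20, 0.50, 0.30, -0.10, -0.40, -0.20, 0.10]
rec_b = [0.0, 0.21, 0.49, 0.31, -0.10, -0.41, -0.19, 0.10]
d = l2_distance(rec_a, rec_b)      # small distance: near-identical processing
rms_a = windowed_rms(rec_a, 4)     # one RMS level per 4-sample window
```

Comparing the windowed RMS profiles of two pipelines shows whether their outputs agree segment by segment rather than only in peak amplitude.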

  2. Implementation of an experimental fault-tolerant memory system

    NASA Technical Reports Server (NTRS)

    Carter, W. C.; Mccarthy, C. E.

    1976-01-01

    The experimental fault-tolerant memory system described in this paper has been designed to enable the modular addition of spares, to validate the theoretical fault-secure and self-testing properties of the translator/corrector, to provide a basis for experiments using the new testing and correction processes for recovery, and to determine the practicality of such systems. The hardware design and implementation are described, together with methods of fault insertion. The hardware/software interface, including a restricted single error correction/double error detection (SEC/DED) code, is specified. Procedures are carefully described that (1) test for specified physical faults, (2) ensure that single-error corrections are not miscorrections due to triple faults, and (3) enable recovery from double errors.
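    The paper's restricted SEC/DED code is not reproduced here, but the general mechanism can be illustrated with a textbook Hamming(7,4) code plus an overall parity bit: the syndrome locates a single flipped bit, while the overall parity distinguishes single errors (correctable) from double errors (detectable only). This is a generic sketch, not the code specified in the paper:

```python
def encode(d):
    """Encode 4 data bits as Hamming(7,4) plus an overall parity bit."""
    c = [0] * 8                       # positions 1..7 used; 0 is a placeholder
    c[3], c[5], c[6], c[7] = d
    c[1] = c[3] ^ c[5] ^ c[7]         # parity over positions 1,3,5,7
    c[2] = c[3] ^ c[6] ^ c[7]         # parity over positions 2,3,6,7
    c[4] = c[5] ^ c[6] ^ c[7]         # parity over positions 4,5,6,7
    overall = 0
    for i in range(1, 8):
        overall ^= c[i]
    return c[1:] + [overall]          # 8-bit word

def decode(bits):
    """Return (data, status); status is 'ok', 'corrected', or 'double'."""
    c = [0] + bits[:7]
    s = 0                             # syndrome = position of a single error
    if c[1] ^ c[3] ^ c[5] ^ c[7]: s |= 1
    if c[2] ^ c[3] ^ c[6] ^ c[7]: s |= 2
    if c[4] ^ c[5] ^ c[6] ^ c[7]: s |= 4
    p = bits[7]
    for i in range(1, 8):
        p ^= c[i]                     # p == 1 means an odd number of flips
    if s == 0 and p == 0:
        status = "ok"
    elif p == 1:                      # single error, possibly in parity bit
        if s:
            c[s] ^= 1
        status = "corrected"
    else:                             # syndrome set but even parity: double error
        status = "double"
    return [c[3], c[5], c[6], c[7]], status
```

    Fault insertion as described in the paper corresponds to flipping one or two bits of the codeword and checking that the decoder reports the expected status.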

  3. Development of probabilistic multimedia multipathway computer codes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, C.; LePoire, D.; Gnanapragasam, E.

    2002-01-01

    The deterministic multimedia dose/risk assessment codes RESRAD and RESRAD-BUILD have been widely used for many years for evaluation of sites contaminated with residual radioactive materials. The RESRAD code applies to the cleanup of sites (soils) and the RESRAD-BUILD code applies to the cleanup of buildings and structures. This work describes the procedure used to enhance the deterministic RESRAD and RESRAD-BUILD codes for probabilistic dose analysis. A six-step procedure was used in developing default parameter distributions and the probabilistic analysis modules. These six steps include (1) listing and categorizing parameters; (2) ranking parameters; (3) developing parameter distributions; (4) testing parameter distributions for probabilistic analysis; (5) developing probabilistic software modules; and (6) testing probabilistic modules and integrated codes. The procedures used can be applied to the development of other multimedia probabilistic codes. The probabilistic versions of RESRAD and RESRAD-BUILD codes provide tools for studying the uncertainty in dose assessment caused by uncertain input parameters. The parameter distribution data collected in this work can also be applied to other multimedia assessment tasks and multimedia computer codes.
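    Steps (3) through (6) above amount to propagating input-parameter distributions through a deterministic model by Monte Carlo sampling. A minimal sketch of that pattern follows; the dose model, parameter names, and distributions are invented stand-ins, not RESRAD's:

```python
import random
random.seed(42)

# Hypothetical deterministic dose model: dose scales with a source
# concentration and an occupancy factor (illustrative stand-ins for
# RESRAD-style input parameters).
def dose_model(concentration, occupancy):
    return 0.8 * concentration * occupancy   # illustrative units

# Steps 3-4 analogue: assign distributions to the uncertain inputs.
def sample_inputs():
    concentration = random.lognormvariate(0.0, 0.5)   # right-skewed, positive
    occupancy = random.uniform(0.5, 1.0)              # fraction of time on site
    return concentration, occupancy

# Steps 5-6 analogue: propagate the distributions through the deterministic code
# and summarize the resulting dose distribution.
doses = sorted(dose_model(*sample_inputs()) for _ in range(10_000))
mean_dose = sum(doses) / len(doses)
p95_dose = doses[int(0.95 * len(doses))]   # 95th-percentile dose
```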

  4. Evaluating a Service-Oriented Architecture

    DTIC Science & Technology

    2007-09-01

    See the description on page 13. Phil Bianco, Software Engineering Institute; Rick Kotermanski, Summa Technologies; Paulo Merson. Software as a service (SaaS) is a software delivery model where customers don't own a copy of the application... From the report's acronym list: REST, Representational State Transfer; RIA, rich internet application; RPC, remote procedure call; SAML, Security...

  5. Application of SNMP on CATV

    NASA Astrophysics Data System (ADS)

    Huang, Hong-bin; Liu, Wei-ping; Chen, Shun-er; Zheng, Liming

    2005-02-01

    A new type of CATV network management system built around a universal MCU that supports SNMP is proposed in this paper. The function and implementation of every module in the system, including physical-layer communication, protocol processing, and data processing, are analyzed from both the hardware and software points of view. In our design, the management system uses an IP MAN as the data transmission channel, and every managed object in the management structure has an SNMP agent. The SNMP agent developed contains four functional modules: a physical-layer communication module, a protocol-processing module, an internal data-processing module, and a MIB management module. The structure and function of every module are designed and demonstrated, and the related hardware circuits, software flow, and experimental results are presented. Furthermore, by introducing an RTOS into the software, the universal MCU can conduct multi-threaded tasks such as Fast Ethernet controller driving, TCP/IP processing, and serial-port monitoring, which greatly improves CPU efficiency.
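    The interplay of the MIB management module and the protocol-processing module can be sketched as a table of managed objects plus a get/set dispatcher. The OIDs, object names, and values below are invented for illustration; a real agent would encode responses in ASN.1/BER:

```python
# Hypothetical MIB table: OID -> managed value plus access rights.
mib = {
    "1.3.6.1.4.1.9999.1.1": {"name": "ampGainLevel",   "value": 30, "writable": True},
    "1.3.6.1.4.1.9999.1.2": {"name": "opticalPowerDbm", "value": -3, "writable": False},
}

def snmp_get(oid):
    """Protocol-process analogue of an SNMP GET: look up a managed value."""
    entry = mib.get(oid)
    return entry["value"] if entry else None

def snmp_set(oid, value):
    """SNMP SET analogue: reject unknown or read-only objects, as an agent would."""
    entry = mib.get(oid)
    if entry is None or not entry["writable"]:
        return False
    entry["value"] = value
    return True
```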

  6. Definition and testing of the hydrologic component of the pilot land data system

    NASA Technical Reports Server (NTRS)

    Ragan, Robert M.; Sircar, Jayanta K.

    1987-01-01

    The specific aim was to develop, within the Pilot Land Data System (PLDS) software design environment, an easily implemented and user-friendly geometric correction procedure to enable the georeferencing of imagery data from the Advanced Very High Resolution Radiometer (AVHRR) onboard the NOAA series spacecraft. A software subsystem was developed within the guidelines set by the PLDS development environment, following the NASA Goddard Space Flight Center (GSFC) Image Analysis Facility's (IAF's) Land Analysis Software (LAS) coding standards. The IAF's current program development environment, the Transportable Applications Executive (TAE), operates under the VAX VMS operating system and was used as the user interface. A brief overview of the ICARUS algorithm implemented in the set of functions developed is provided. The functional specifications are described, and the individual programs and the directory names containing the source and executables installed in the IAF system are listed. A user guide is provided in the LAS system documentation format for the three functions developed.
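    Geometric correction of this kind commonly fits a transform from image (line, sample) coordinates to map coordinates using ground control points. The following sketch solves for an affine transform from three control points; the GCP values are invented for illustration, and this is not the ICARUS algorithm itself:

```python
def solve3(a, b):
    """Solve a 3x3 linear system a*x = b by Cramer's rule."""
    def det(m):
        return (m[0][0]*(m[1][1]*m[2][2] - m[1][2]*m[2][1])
              - m[0][1]*(m[1][0]*m[2][2] - m[1][2]*m[2][0])
              + m[0][2]*(m[1][0]*m[2][1] - m[1][1]*m[2][0]))
    d = det(a)
    xs = []
    for col in range(3):
        m = [row[:] for row in a]
        for r in range(3):
            m[r][col] = b[r]          # replace one column with the RHS
        xs.append(det(m) / d)
    return xs

# Three hypothetical ground control points: (line, sample) -> (lat, lon).
gcps = [((0, 0),   (40.0, -100.0)),
        ((0, 100), (40.0,  -99.0)),
        ((100, 0), (39.0, -100.0))]

a = [[l, s, 1.0] for (l, s), _ in gcps]
lat_coef = solve3(a, [lat for _, (lat, _lon) in gcps])
lon_coef = solve3(a, [lon for _, (_lat, lon) in gcps])

def pixel_to_geo(line, sample):
    """Map an image pixel to geographic coordinates via the fitted affine."""
    lat = lat_coef[0]*line + lat_coef[1]*sample + lat_coef[2]
    lon = lon_coef[0]*line + lon_coef[1]*sample + lon_coef[2]
    return lat, lon
```

    With more than three control points, the same model would be fitted by least squares instead of an exact solve.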

  7. Approximated adjusted fractional Bayes factors: A general method for testing informative hypotheses.

    PubMed

    Gu, Xin; Mulder, Joris; Hoijtink, Herbert

    2018-05-01

    Informative hypotheses are increasingly being used in psychological sciences because they adequately capture researchers' theories and expectations. In the Bayesian framework, the evaluation of informative hypotheses often makes use of default Bayes factors such as the fractional Bayes factor. This paper approximates and adjusts the fractional Bayes factor such that it can be used to evaluate informative hypotheses in general statistical models. In the fractional Bayes factor, a fraction parameter must be specified that controls the amount of information in the data used for specifying an implicit prior. The remaining fraction is used for testing the informative hypotheses. We discuss different choices of this parameter and present a scheme for setting it. Furthermore, a software package is described which computes the approximated adjusted fractional Bayes factor. Using this software package, psychological researchers can evaluate informative hypotheses by means of Bayes factors in a straightforward manner. Two empirical examples are used to illustrate the procedure. © 2017 The British Psychological Society.
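    The core fit/complexity idea behind Bayes factors for order-constrained hypotheses can be sketched by Monte Carlo: the Bayes factor of H against the unconstrained model is approximated as the posterior probability of the constraint divided by its probability under the (fractional) prior. The numbers below are illustrative and the distributions are simplified stand-ins, not the paper's exact procedure:

```python
import random
random.seed(1)

# Informative hypothesis H: mu1 > mu2.
def prob_constraint(samples):
    """Fraction of (mu1, mu2) draws satisfying mu1 > mu2."""
    return sum(1 for m1, m2 in samples if m1 > m2) / len(samples)

n = 100_000
# Approximate normal posterior for (mu1, mu2): estimates 0.5 and 0.0, SE 0.2
# (hypothetical values).
posterior = [(random.gauss(0.5, 0.2), random.gauss(0.0, 0.2)) for _ in range(n)]
# Diffuse symmetric prior: the constraint has prior probability about 0.5.
prior = [(random.gauss(0.0, 1.0), random.gauss(0.0, 1.0)) for _ in range(n)]

fit = prob_constraint(posterior)     # near 1: the data support mu1 > mu2
complexity = prob_constraint(prior)  # near 0.5 by symmetry
bayes_factor = fit / complexity      # evidence for H vs. unconstrained
```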

  8. A Browser-Server-Based Tele-audiology System That Supports Multiple Hearing Test Modalities

    PubMed Central

    Yao, Daoyuan; Givens, Gregg

    2015-01-01

    Abstract Introduction: Millions of global citizens suffering from hearing disorders have limited or no access to much needed hearing healthcare. Although tele-audiology presents a solution to alleviate this problem, existing remote hearing diagnosis systems support only pure-tone tests, leaving speech and other test procedures unsolved, due to the lack of software and hardware to enable communication required between audiologists and their remote patients. This article presents a comprehensive remote hearing test system that integrates the two most needed hearing test procedures: a pure-tone audiogram and a speech test. Materials and Methods: This enhanced system is composed of a Web application server, an embedded smart Internet-Bluetooth® (Bluetooth SIG, Kirkland, WA) gateway (or console device), and a Bluetooth-enabled audiometer. Several graphical user interfaces and a relational database are hosted on the application server. The console device has been designed to support the tests and auxiliary communication between the local site and the remote site. Results: The study was conducted at an audiology laboratory. Pure-tone audiogram and speech test results from volunteers tested with this tele-audiology system are comparable with results from the traditional face-to-face approach. Conclusions: This browser-server–based comprehensive tele-audiology offers a flexible platform to expand hearing services to traditionally underserved groups. PMID:25919376

  9. A Browser-Server-Based Tele-audiology System That Supports Multiple Hearing Test Modalities.

    PubMed

    Yao, Jianchu Jason; Yao, Daoyuan; Givens, Gregg

    2015-09-01

    Millions of global citizens suffering from hearing disorders have limited or no access to much needed hearing healthcare. Although tele-audiology presents a solution to alleviate this problem, existing remote hearing diagnosis systems support only pure-tone tests, leaving speech and other test procedures unsolved, due to the lack of software and hardware to enable communication required between audiologists and their remote patients. This article presents a comprehensive remote hearing test system that integrates the two most needed hearing test procedures: a pure-tone audiogram and a speech test. This enhanced system is composed of a Web application server, an embedded smart Internet-Bluetooth(®) (Bluetooth SIG, Kirkland, WA) gateway (or console device), and a Bluetooth-enabled audiometer. Several graphical user interfaces and a relational database are hosted on the application server. The console device has been designed to support the tests and auxiliary communication between the local site and the remote site. The study was conducted at an audiology laboratory. Pure-tone audiogram and speech test results from volunteers tested with this tele-audiology system are comparable with results from the traditional face-to-face approach. This browser-server-based comprehensive tele-audiology offers a flexible platform to expand hearing services to traditionally underserved groups.

  10. Software support in automation of medicinal product evaluations.

    PubMed

    Juric, Radmila; Shojanoori, Reza; Slevin, Lindi; Williams, Stephen

    2005-01-01

    Medicinal product evaluation is one of the most important tasks undertaken by government health departments and their regulatory authorities in every country in the world. Automation and adequate software support are critical tasks that can improve the efficiency and interoperation of regulatory systems across the world. In this paper we propose a software solution that supports the automation of (i) the submission of licensing applications and (ii) the evaluation of submitted licensing applications, according to regulatory authorities' procedures. The novelty of our solution is in allowing licensing applications to be submitted in any country in the world and evaluated according to any evaluation procedure (which can be chosen by either regulatory authorities or pharmaceutical companies). Consequently, submission and evaluation procedures become interoperable, and the associated data repositories/databases can be shared between various countries and regulatory authorities.

  11. Testing Scientific Software: A Systematic Literature Review.

    PubMed

    Kanewala, Upulee; Bieman, James M

    2014-10-01

    Scientific software plays an important role in critical decision making, for example making weather predictions based on climate models, and computation of evidence for research publications. Recently, scientists have had to retract publications due to errors caused by software faults. Systematic testing can identify such faults in code. This study aims to identify specific challenges, proposed solutions, and unsolved problems faced when testing scientific software. We conducted a systematic literature survey to identify and analyze relevant literature. We identified 62 studies that provided relevant information about testing scientific software. We found that challenges faced when testing scientific software fall into two main categories: (1) testing challenges that occur due to characteristics of scientific software such as oracle problems and (2) testing challenges that occur due to cultural differences between scientists and the software engineering community such as viewing the code and the model that it implements as inseparable entities. In addition, we identified methods to potentially overcome these challenges and their limitations. Finally we describe unsolved challenges and how software engineering researchers and practitioners can help to overcome them. Scientific software presents special challenges for testing. Specifically, cultural differences between scientist developers and software engineers, along with the characteristics of the scientific software make testing more difficult. Existing techniques such as code clone detection can help to improve the testing process. Software engineers should consider special challenges posed by scientific software such as oracle problems when developing testing techniques.
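    One concrete response to the oracle problem named above, discussed in this literature, is metamorphic testing: when the exact expected output is unknown, check relations that must hold between outputs. A minimal sketch for a hypothetical numerical integrator (the integrator and relations are illustrative, not from the review):

```python
def integrate(f, a, b, n=10_000):
    """Composite midpoint rule; stands in for scientific code under test."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

f = lambda x: x * x + 1.0

# Metamorphic relation 1: integrals are additive over subintervals.
whole = integrate(f, 0.0, 2.0)
parts = integrate(f, 0.0, 1.0) + integrate(f, 1.0, 2.0)
assert abs(whole - parts) < 1e-6

# Metamorphic relation 2: scaling the integrand scales the integral.
scaled = integrate(lambda x: 3.0 * f(x), 0.0, 2.0)
assert abs(scaled - 3.0 * whole) < 1e-6
```

    Neither relation requires knowing the true value of the integral, which is exactly what makes the approach useful when no oracle exists.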

  12. Clinical Chemistry Laboratory Automation in the 21st Century - Amat Victoria curam (Victory loves careful preparation)

    PubMed Central

    Armbruster, David A; Overcash, David R; Reyes, Jaime

    2014-01-01

    The era of automation arrived with the introduction of the AutoAnalyzer using continuous flow analysis and the Robot Chemist that automated the traditional manual analytical steps. Successive generations of stand-alone analysers increased analytical speed, offered the ability to test high volumes of patient specimens, and provided large assay menus. A dichotomy developed, with a group of analysers devoted to performing routine clinical chemistry tests and another group dedicated to performing immunoassays using a variety of methodologies. Development of integrated systems greatly improved the analytical phase of clinical laboratory testing and further automation was developed for pre-analytical procedures, such as sample identification, sorting, and centrifugation, and post-analytical procedures, such as specimen storage and archiving. All phases of testing were ultimately combined in total laboratory automation (TLA) through which all modules involved are physically linked by some kind of track system, moving samples through the process from beginning to end. A newer and very powerful analytical methodology is liquid chromatography-mass spectrometry/mass spectrometry (LC-MS/MS). LC-MS/MS has been automated but a future automation challenge will be to incorporate LC-MS/MS into TLA configurations. Another important facet of automation is informatics, including middleware, which interfaces the analyser software to a laboratory information system (LIS) and/or hospital information system (HIS). This software includes control of the overall operation of a TLA configuration and combines analytical results with patient demographic information to provide additional clinically useful information. This review describes automation relevant to clinical chemistry, but it must be recognised that automation applies to other specialties in the laboratory, e.g. haematology, urinalysis, microbiology. 
It is a given that automation will continue to evolve in the clinical laboratory, limited only by the imagination and ingenuity of laboratory scientists. PMID:25336760

  13. Clinical Chemistry Laboratory Automation in the 21st Century - Amat Victoria curam (Victory loves careful preparation).

    PubMed

    Armbruster, David A; Overcash, David R; Reyes, Jaime

    2014-08-01

    The era of automation arrived with the introduction of the AutoAnalyzer using continuous flow analysis and the Robot Chemist that automated the traditional manual analytical steps. Successive generations of stand-alone analysers increased analytical speed, offered the ability to test high volumes of patient specimens, and provided large assay menus. A dichotomy developed, with a group of analysers devoted to performing routine clinical chemistry tests and another group dedicated to performing immunoassays using a variety of methodologies. Development of integrated systems greatly improved the analytical phase of clinical laboratory testing and further automation was developed for pre-analytical procedures, such as sample identification, sorting, and centrifugation, and post-analytical procedures, such as specimen storage and archiving. All phases of testing were ultimately combined in total laboratory automation (TLA) through which all modules involved are physically linked by some kind of track system, moving samples through the process from beginning to end. A newer and very powerful analytical methodology is liquid chromatography-mass spectrometry/mass spectrometry (LC-MS/MS). LC-MS/MS has been automated but a future automation challenge will be to incorporate LC-MS/MS into TLA configurations. Another important facet of automation is informatics, including middleware, which interfaces the analyser software to a laboratory information system (LIS) and/or hospital information system (HIS). This software includes control of the overall operation of a TLA configuration and combines analytical results with patient demographic information to provide additional clinically useful information. This review describes automation relevant to clinical chemistry, but it must be recognised that automation applies to other specialties in the laboratory, e.g. haematology, urinalysis, microbiology. 
It is a given that automation will continue to evolve in the clinical laboratory, limited only by the imagination and ingenuity of laboratory scientists.

  14. A procedure for automated land use mapping using remotely sensed multispectral scanner data

    NASA Technical Reports Server (NTRS)

    Whitley, S. L.

    1975-01-01

    A system of processing remotely sensed multispectral scanner data by computer programs to produce color-coded land use maps for large areas is described. The procedure is explained, the software and the hardware are described, and an analogous example of the procedure is presented. Detailed descriptions of the multispectral scanners currently in use are provided together with a summary of the background of current land use mapping techniques. The data analysis system used in the procedure and the pattern recognition software used are functionally described. Current efforts by the NASA Earth Resources Laboratory to operationally evaluate a less complex and less costly system are discussed in a separate section.
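    Per-pixel pattern recognition of the kind used for land use mapping can be sketched as a minimum-distance classifier: each multispectral pixel is assigned to the class whose mean signature is nearest. The class means and band values below are invented for illustration, not taken from the system described:

```python
# Hypothetical training means over four scanner bands for three land use classes.
class_means = {
    "water":  (10.0, 8.0, 5.0, 3.0),
    "forest": (20.0, 25.0, 60.0, 45.0),
    "urban":  (50.0, 48.0, 40.0, 35.0),
}

def classify(pixel):
    """Assign a pixel to the class with the nearest mean signature."""
    def dist2(mean):
        return sum((p - m) ** 2 for p, m in zip(pixel, mean))
    return min(class_means, key=lambda c: dist2(class_means[c]))

# Classifying each pixel of a small scene yields the color-codable map.
scene = [(11, 9, 6, 2), (48, 50, 42, 33), (19, 24, 58, 47)]
land_use_map = [classify(px) for px in scene]
```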

  15. PFLOTRAN Verification: Development of a Testing Suite to Ensure Software Quality

    NASA Astrophysics Data System (ADS)

    Hammond, G. E.; Frederick, J. M.

    2016-12-01

    In scientific computing, code verification ensures the reliability and numerical accuracy of a model simulation by comparing the simulation results to experimental data or known analytical solutions. The model is typically defined by a set of partial differential equations with initial and boundary conditions, and verification checks whether the mathematical model is solved correctly by the software. Code verification is especially important if the software is used to model high-consequence systems which cannot be physically tested in a fully representative environment [Oberkampf and Trucano (2007)]. Justified confidence in a particular computational tool requires clarity in the exercised physics and transparency in its verification process with proper documentation. We present a quality assurance (QA) testing suite developed by Sandia National Laboratories that performs code verification for PFLOTRAN, an open source, massively parallel subsurface simulator. PFLOTRAN solves systems of generally nonlinear partial differential equations describing multiphase, multicomponent and multiscale reactive flow and transport processes in porous media. PFLOTRAN's QA test suite compares the numerical solutions of benchmark problems in heat and mass transport against known, closed-form, analytical solutions, including documentation of the exercised physical process models implemented in each PFLOTRAN benchmark simulation. The QA test suite development strives to follow the recommendations given by Oberkampf and Trucano (2007), which describes four essential elements in high-quality verification benchmark construction: (1) conceptual description, (2) mathematical description, (3) accuracy assessment, and (4) additional documentation and user information. 
Several QA tests within the suite will be presented, including details of the benchmark problems and their closed-form analytical solutions, implementation of benchmark problems in PFLOTRAN simulations, and the criteria used to assess PFLOTRAN's performance in the code verification procedure. References Oberkampf, W. L., and T. G. Trucano (2007), Verification and Validation Benchmarks, SAND2007-0853, 67 pgs., Sandia National Laboratories, Albuquerque, NM.

  16. An integrated development workflow for community-driven FOSS-projects using continuous integration tools

    NASA Astrophysics Data System (ADS)

    Bilke, Lars; Watanabe, Norihiro; Naumov, Dmitri; Kolditz, Olaf

    2016-04-01

    A complex software project with high standards for code quality requires automated tools that help developers with repetitive and tedious tasks such as compiling on different platforms and configurations, running unit tests and end-to-end tests, and generating distributable binaries and documentation. This is known as continuous integration (CI). A community-driven FOSS project within the Earth Sciences benefits even more from CI, as time and resources for software development are often limited; testing developed code on more than the developer's PC is therefore a task that is often neglected and where CI can be the solution. We developed an integrated workflow based on GitHub, Travis, and Jenkins for the community project OpenGeoSys - a coupled multiphysics modeling and simulation package - allowing developers to concentrate on implementing new features in a tight feedback loop. Every interested developer/user can create a pull request containing source code modifications on the online collaboration platform GitHub. The modifications are checked (compilation, compiler warnings, memory leaks, undefined behavior, unit tests, end-to-end tests, differences in simulation run results between changes, etc.) by the CI system, which automatically responds to the pull request, or by email, on success or failure with detailed reports, requesting improvements to the modifications when necessary. Core team developers review the modifications and merge them into the main development line once they satisfy agreed standards. We aim for efficient data structures and algorithms, self-explanatory code, comprehensive documentation, and high test code coverage. This workflow keeps the entry barriers for getting involved in the project low and permits an agile development process that concentrates on feature additions rather than software maintenance procedures.

  17. Sensor control of robot arc welding

    NASA Technical Reports Server (NTRS)

    Sias, F. R., Jr.

    1985-01-01

    A basic problem in the application of robots for welding, namely how to guide a torch along a weld seam using sensory information, was studied. The goal was to improve the quality and consistency of certain gas tungsten arc welds on the Space Shuttle Main Engine (SSME) that are geometrically too complex for conventional automation and are therefore made by hand. The particular problems associated with SSME manufacturing and weld-seam tracking were analyzed, with an emphasis on computer vision methods. Special interface software was developed for the MINC computer, which allows it to be used both as a test system to check out the robot interface software and later as a development tool for further investigation of sensory systems to be incorporated into welding procedures.

  18. Parameter Optimization for Feature and Hit Generation in a General Unknown Screening Method-Proof of Concept Study Using a Design of Experiment Approach for a High Resolution Mass Spectrometry Procedure after Data Independent Acquisition.

    PubMed

    Elmiger, Marco P; Poetzsch, Michael; Steuer, Andrea E; Kraemer, Thomas

    2018-03-06

    High resolution mass spectrometry and modern data independent acquisition (DIA) methods enable the creation of general unknown screening (GUS) procedures. However, even when DIA is used, its potential is far from being exploited, because the untargeted acquisition is often followed by a targeted search. Applying an actual GUS (including untargeted screening) produces an immense amount of data that must be dealt with. An optimization of the parameters regulating the feature detection and hit generation algorithms of the data processing software could significantly reduce the amount of unnecessary data and thereby the workload. Design of experiment (DoE) approaches allow a simultaneous optimization of multiple parameters. In a first step, parameters are evaluated (crucial or noncrucial). Second, crucial parameters are optimized. The aim in this study was to reduce the number of hits without missing analytes. The parameter settings obtained from the optimization were compared to the standard settings by analyzing a test set of blood samples spiked with 22 relevant analytes as well as 62 authentic forensic cases. The optimization led to a marked reduction of workload (12.3 to 1.1% and 3.8 to 1.1% hits for the test set and the authentic cases, respectively) while simultaneously increasing the identification rate (68.2 to 86.4% and 68.8 to 88.1%, respectively). This proof of concept study emphasizes the great potential of DoE approaches to master the data overload resulting from modern data independent acquisition methods used for general unknown screening procedures by optimizing software parameters.
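    The optimization stage of a DoE study can be sketched as a full-factorial evaluation over candidate parameter settings, keeping the setting that minimizes reported hits without losing any known spiked analyte. The two parameters, their levels, and the screening function below are hypothetical stand-ins for the vendor software's feature-detection settings:

```python
from itertools import product

intensity_levels = [100, 500, 1000]   # hypothetical minimum feature intensity
ppm_levels = [2, 5, 10]               # hypothetical mass-tolerance window

def run_screening(min_intensity, ppm):
    """Stand-in for the data-processing software: (n_hits, n_spikes_found)."""
    n_hits = int(50_000 / min_intensity * ppm)            # looser settings -> more hits
    n_spikes_found = 22 if (min_intensity <= 500 and ppm >= 5) else 20
    return n_hits, n_spikes_found

# Full factorial: try every level combination; require all 22 spikes found.
best = None
for mi, ppm in product(intensity_levels, ppm_levels):
    hits, found = run_screening(mi, ppm)
    if found == 22 and (best is None or hits < best[0]):
        best = (hits, mi, ppm)
```

    Real DoE software would also fit a response-surface model to these runs; the exhaustive loop here just illustrates the objective of trading hit count against identification rate.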

  19. The Software Design Document: More than a User's Manual.

    ERIC Educational Resources Information Center

    Bowers, Dennis

    1989-01-01

    Discusses the value of creating design documentation for computer software so that it may serve as a model for similar design efforts. Components of the software design document are described, including program flowcharts, graphic representation of screen displays, storyboards, and evaluation procedures. An example is given using HyperCard. (three…

  20. Guidance on Software Maintenance. Final Report. Reports on Computer Science and Technology.

    ERIC Educational Resources Information Center

    Martin, Roger J.; Osborne, Wilma M.

    Based on informal discussions with personnel at selected federal agencies and private sector organizations and on additional research, this publication addresses issues and problems of software maintenance and suggests actions and procedures which can help software maintenance organizations meet the growing demands of maintaining existing systems.…

  1. 32 CFR 295.3 - Definition of OIG records.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., decisions, or procedures of the OIG. Normally, computer software, including source code, object code, and... the underlying data which is processed and produced by such software and which may in some instances be stored with the software.) Exceptions to this position are outlined in § 295.4(c). (3) Anything...

  2. 32 CFR 295.3 - Definition of OIG records.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., decisions, or procedures of the OIG. Normally, computer software, including source code, object code, and... the underlying data which is processed and produced by such software and which may in some instances be stored with the software.) Exceptions to this position are outlined in § 295.4(c). (3) Anything...

  3. 32 CFR 295.3 - Definition of OIG records.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., decisions, or procedures of the OIG. Normally, computer software, including source code, object code, and... the underlying data which is processed and produced by such software and which may in some instances be stored with the software.) Exceptions to this position are outlined in § 295.4(c). (3) Anything...

  4. 32 CFR 295.3 - Definition of OIG records.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., decisions, or procedures of the OIG. Normally, computer software, including source code, object code, and... the underlying data which is processed and produced by such software and which may in some instances be stored with the software.) Exceptions to this position are outlined in § 295.4(c). (3) Anything...

  5. 32 CFR 295.3 - Definition of OIG records.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ..., decisions, or procedures of the OIG. Normally, computer software, including source code, object code, and... the underlying data which is processed and produced by such software and which may in some instances be stored with the software.) Exceptions to this position are outlined in § 295.4(c). (3) Anything...

  6. 75 FR 38945 - Airworthiness Directives; The Boeing Company Model 777-200 and -300 Series Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-07

    ..., 2006. The service bulletin describes procedures for installing new operational software in the cabin... loading the new cabin services system central storage device software and CSCP OPS into the MMC. FAA's... cabin services system central storage device software and cabin system control panel operational...

  7. Integration of the Remote Agent for the NASA Deep Space One Autonomy Experiment

    NASA Technical Reports Server (NTRS)

    Dorais, Gregory A.; Bernard, Douglas E.; Gamble, Edward B., Jr.; Kanefsky, Bob; Kurien, James; Muscettola, Nicola; Nayak, P. Pandurang; Rajan, Kanna; Lau, Sonie (Technical Monitor)

    1998-01-01

    This paper describes the integration of the Remote Agent (RA), a spacecraft autonomy system which is scheduled to control the Deep Space 1 spacecraft during a flight experiment in 1999. The RA is a reusable, model-based autonomy system that is quite different from software typically used to control an aerospace system. We describe the integration challenges we faced, how we addressed them, and the lessons learned. We focus on those aspects of integrating the RA that were either easier or more difficult than integrating a more traditional large software application because the RA is a model-based autonomous system. A number of characteristics of the RA made integration process easier. One example is the model-based nature of RA. Since the RA is model-based, most of its behavior is not hard coded into procedural program code. Instead, engineers specify high level models of the spacecraft's components from which the Remote Agent automatically derives correct system-wide behavior on the fly. This high level, modular, and declarative software description allowed some interfaces between RA components and between RA and the flight software to be automatically generated and tested for completeness against the Remote Agent's models. In addition, the Remote Agent's model-based diagnosis system automatically diagnoses when the RA models are not consistent with the behavior of the spacecraft. In flight, this feature is used to diagnose failures in the spacecraft hardware. During integration, it proved valuable in finding problems in the spacecraft simulator or flight software. In addition, when modifications are made to the spacecraft hardware or flight software, the RA models are easily changed because they only capture a description of the spacecraft. one does not have to maintain procedural code that implements the correct behavior for every expected situation. 
On the other hand, several features of the RA made it more difficult to integrate than typical flight software. For example, the definition of correct behavior is more difficult to specify for a system that is expected to reason about and flexibly react to its environment than for a traditional flight software system. Consequently, whenever a change is made to the RA it is more time consuming to determine if the resulting behavior is correct. We conclude the paper with a discussion of future work on the Remote Agent as well as recommendations to ease integration of similar autonomy projects.
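    The core idea of deriving behavior from declarative component models, rather than hand-written fault cases, can be illustrated with a minimal consistency check. This is only a sketch in the spirit of the RA's model-based diagnosis, not its actual algorithm; the component names and model format are hypothetical:

    ```python
    def diagnose(models, observations):
        """Tiny model-based diagnosis sketch: for each component, keep the modes
        whose model-predicted observation matches what was actually observed.
        models: {component: {mode: predicted_observation}}
        observations: {component: observed_value}"""
        candidates = {}
        for comp, modes in models.items():
            obs = observations.get(comp)
            candidates[comp] = [m for m, pred in modes.items() if pred == obs]
        return candidates

    # Hypothetical valve model: 'open' predicts flow, 'stuck-closed' predicts none.
    models = {"valve1": {"open": "flow", "stuck-closed": "no-flow"}}
    diag = diagnose(models, {"valve1": "no-flow"})
    ```

    Given an observed lack of flow, only the `stuck-closed` mode remains consistent, so the diagnosis falls out of the model rather than a hand-coded rule for that failure.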

  8. C-Language Integrated Production System, Version 5.1

    NASA Technical Reports Server (NTRS)

    Riley, Gary; Donnell, Brian; Ly, Huyen-Anh VU; Culbert, Chris; Savely, Robert T.; Mccoy, Daniel J.; Giarratano, Joseph

    1992-01-01

CLIPS 5.1 provides a cohesive software tool for handling a wide variety of knowledge, with support for three different programming paradigms: rule-based, object-oriented, and procedural. Rule-based programming provides representation of knowledge by use of heuristics. Object-oriented programming enables modeling of complex systems as modular components. Procedural programming enables CLIPS to represent knowledge in ways similar to those allowed in such languages as C, Pascal, Ada, and LISP. Working with CLIPS 5.1, one can develop expert-system software by use of rule-based programming only, object-oriented programming only, procedural programming only, or combinations of the three.
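    The rule-based paradigm CLIPS supports amounts to a forward-chaining loop over condition-action rules. CLIPS itself uses its own LISP-like `defrule` syntax; the following is only an illustrative Python analogue with a hypothetical diagnostic knowledge base:

    ```python
    # Minimal forward-chaining rule engine illustrating the rule-based paradigm.
    def forward_chain(facts, rules):
        """Repeatedly fire rules whose conditions hold until no new facts appear.
        rules: list of (condition_set, conclusion) pairs."""
        facts = set(facts)
        changed = True
        while changed:
            changed = False
            for conditions, conclusion in rules:
                if conclusion not in facts and all(c in facts for c in conditions):
                    facts.add(conclusion)   # fire the rule
                    changed = True
        return facts

    # Hypothetical heuristics, purely for illustration.
    rules = [
        ({"engine-hot", "coolant-low"}, "suspect-leak"),
        ({"suspect-leak"}, "schedule-inspection"),
    ]
    derived = forward_chain({"engine-hot", "coolant-low"}, rules)
    ```

    Chaining happens automatically: the second rule fires only after the first has asserted `suspect-leak`, which is the essential difference from a procedural encoding of the same knowledge.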

  9. Testing Scientific Software: A Systematic Literature Review

    PubMed Central

    Kanewala, Upulee; Bieman, James M.

    2014-01-01

Context: Scientific software plays an important role in critical decision making, for example making weather predictions based on climate models, and in computation of evidence for research publications. Recently, scientists have had to retract publications due to errors caused by software faults. Systematic testing can identify such faults in code. Objective: This study aims to identify specific challenges, proposed solutions, and unsolved problems faced when testing scientific software. Method: We conducted a systematic literature survey to identify and analyze relevant literature. We identified 62 studies that provided relevant information about testing scientific software. Results: We found that challenges faced when testing scientific software fall into two main categories: (1) testing challenges that occur due to characteristics of scientific software, such as oracle problems, and (2) testing challenges that occur due to cultural differences between scientists and the software engineering community, such as viewing the code and the model that it implements as inseparable entities. In addition, we identified methods to potentially overcome these challenges and their limitations. Finally, we describe unsolved challenges and how software engineering researchers and practitioners can help to overcome them. Conclusions: Scientific software presents special challenges for testing. Specifically, cultural differences between scientist developers and software engineers, along with the characteristics of the scientific software, make testing more difficult. Existing techniques such as code clone detection can help to improve the testing process. Software engineers should consider special challenges posed by scientific software, such as oracle problems, when developing testing techniques. PMID:25125798
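    The oracle problem mentioned above is often attacked with metamorphic testing: even when the true answer is unknown, related runs of the code must agree with each other. A minimal sketch (the integrand and tolerances are illustrative, not from the survey):

    ```python
    import math

    def trapezoid(f, a, b, n=1000):
        """Composite trapezoidal rule -- a typical scientific kernel whose exact
        output is unknown for most integrands (the oracle problem)."""
        h = (b - a) / n
        s = 0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n))
        return s * h

    def f(x):
        return math.exp(-x * x)   # no closed-form antiderivative

    # Metamorphic relation: splitting the interval must not change the result
    # (holds to discretization accuracy even without knowing the true integral).
    whole = trapezoid(f, 0.0, 2.0)
    split = trapezoid(f, 0.0, 1.0) + trapezoid(f, 1.0, 2.0)
    ```

    A bug that breaks interval handling would violate the relation even though no exact expected value was ever written down.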

  10. Hubble Systems Optimize Hospital Schedules

    NASA Technical Reports Server (NTRS)

    2009-01-01

    Don Rosenthal, a former Ames Research Center computer scientist who helped design the Hubble Space Telescope's scheduling software, co-founded Allocade Inc. of Menlo Park, California, in 2004. Allocade's OnCue software helps hospitals reclaim unused capacity and optimize constantly changing schedules for imaging procedures. After starting to use the software, one medical center soon reported noticeable improvements in efficiency, including a 12 percent increase in procedure volume, 35 percent reduction in staff overtime, and significant reductions in backlog and technician phone time. Allocade now offers versions for outpatient and inpatient magnetic resonance imaging (MRI), ultrasound, interventional radiology, nuclear medicine, Positron Emission Tomography (PET), radiography, radiography-fluoroscopy, and mammography.
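    OnCue's scheduling algorithms are proprietary, but the flavor of "reclaiming unused capacity" can be sketched as greedy gap-filling. Everything here (function name, slot times, procedure IDs) is hypothetical; a real scheduler must also handle priorities, modalities, staff, and equipment constraints:

    ```python
    def fill_gaps(gaps, requests):
        """Greedily assign pending procedures to open schedule gaps.
        gaps: list of (start, end) in minutes since midnight.
        requests: list of (duration_minutes, procedure_id); longest are placed
        first to reduce fragmentation of the remaining gaps."""
        gaps = sorted(gaps)
        assigned = []
        for duration, rid in sorted(requests, reverse=True):
            for i, (s, e) in enumerate(gaps):
                if e - s >= duration:
                    assigned.append((rid, s, s + duration))
                    gaps[i] = (s + duration, e)   # shrink the used gap
                    break
        return assigned

    # Two open gaps (9:00-10:00 and 10:30-12:00) and three pending procedures.
    slots = fill_gaps([(540, 600), (630, 720)],
                      [(45, "MRI-1"), (30, "US-2"), (60, "CT-3")])
    ```

    The longest procedure claims the tight 60-minute gap exactly, leaving the larger gap to absorb the shorter requests back-to-back.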

  11. BAT3 Analyzer: Real-Time Data Display and Interpretation Software for the Multifunction Bedrock-Aquifer Transportable Testing Tool (BAT3)

    USGS Publications Warehouse

    Winston, Richard B.; Shapiro, Allen M.

    2007-01-01

    The BAT3 Analyzer provides real-time display and interpretation of fluid pressure responses and flow rates measured during geochemical sampling, hydraulic testing, or tracer testing conducted with the Multifunction Bedrock-Aquifer Transportable Testing Tool (BAT3) (Shapiro, 2007). Real-time display of the data collected with the Multifunction BAT3 allows the user to ensure that the downhole apparatus is operating properly, and that test procedures can be modified to correct for unanticipated hydraulic responses during testing. The BAT3 Analyzer can apply calibrations to the pressure transducer and flow meter data to display physically meaningful values. Plots of the time-varying data can be formatted for a specified time interval, and either saved to files, or printed. Libraries of calibrations for the pressure transducers and flow meters can be created, updated and reloaded to facilitate the rapid set up of the software to display data collected during testing with the Multifunction BAT3. The BAT3 Analyzer also has the functionality to estimate calibrations for pressure transducers and flow meters using data collected with the Multifunction BAT3 in conjunction with corroborating check measurements. During testing with the Multifunction BAT3, and also after testing has been completed, hydraulic properties of the test interval can be estimated by comparing fluid pressure responses with model results; a variety of hydrogeologic conceptual models of the formation are available for interpreting fluid-withdrawal, fluid-injection, and slug tests.
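    Estimating a transducer or flow-meter calibration from corroborating check measurements typically reduces to a least-squares linear fit. This is a generic sketch of that step, not the BAT3 Analyzer's own implementation (which is documented in Shapiro, 2007); the raw counts and reference values below are hypothetical:

    ```python
    def fit_linear_calibration(raw, reference):
        """Least-squares fit of reference = gain * raw + offset, the usual form
        of a pressure-transducer or flow-meter calibration."""
        n = len(raw)
        mx = sum(raw) / n
        my = sum(reference) / n
        sxx = sum((x - mx) ** 2 for x in raw)
        sxy = sum((x - mx) * (y - my) for x, y in zip(raw, reference))
        gain = sxy / sxx
        offset = my - gain * mx
        return gain, offset

    # Raw sensor counts vs. reference pressures from check measurements.
    gain, offset = fit_linear_calibration([100, 200, 300], [1.0, 2.0, 3.0])
    pressure = gain * 250 + offset   # convert a raw reading to physical units
    ```

    Storing `(gain, offset)` pairs per sensor is what makes a reloadable calibration library, as described in the abstract, straightforward.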

  12. Software safety

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy

    1987-01-01

Software safety and its relationship to other qualities are discussed. It is shown that standard reliability and fault tolerance techniques will not solve the safety problem for the present. A new attitude is required: looking at what you do NOT want software to do along with what you want it to do, and assuming things will go wrong. New procedures and changes to the entire software development process are necessary: special software safety analysis techniques are needed, and design techniques, especially eliminating complexity, can be very helpful.

  13. Reuse Metrics for Object Oriented Software

    NASA Technical Reports Server (NTRS)

    Bieman, James M.

    1998-01-01

    One way to increase the quality of software products and the productivity of software development is to reuse existing software components when building new software systems. In order to monitor improvements in reuse, the level of reuse must be measured. In this NASA supported project we (1) derived a suite of metrics which quantify reuse attributes for object oriented, object based, and procedural software, (2) designed prototype tools to take these measurements in Ada, C++, Java, and C software, (3) evaluated the reuse in available software, (4) analyzed the relationship between coupling, cohesion, inheritance, and reuse, (5) collected object oriented software systems for our empirical analyses, and (6) developed quantitative criteria and methods for restructuring software to improve reusability.
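    Reuse measurement starts from a simple idea: what fraction of a delivered system comes from pre-existing components? The metric below is one common, deliberately simplified formulation (Bieman's actual metric suite distinguishes verbatim from leveraged reuse, inheritance, and more); the component names and sizes are hypothetical:

    ```python
    def reuse_level(components, system):
        """Fraction of a system's code drawn unchanged from existing components.
        components: {name: lines_of_code}
        system: list of (component_name, reused_verbatim) pairs."""
        total = sum(components[name] for name, _ in system)
        reused = sum(components[name] for name, verbatim in system if verbatim)
        return reused / total if total else 0.0

    libs = {"matrix": 400, "io": 150, "app_glue": 250}
    level = reuse_level(libs, [("matrix", True), ("io", True), ("app_glue", False)])
    ```

    Tracking this ratio over successive releases is the kind of monitoring the project's prototype tools were designed to support.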

  14. Aeroservoelastic Wind-Tunnel Tests of a Free-Flying, Joined-Wing SensorCraft Model for Gust Load Alleviation

    NASA Technical Reports Server (NTRS)

    Scott, Robert C.; Castelluccio, Mark A.; Coulson, David A.; Heeg, Jennifer

    2011-01-01

A team comprised of the Air Force Research Laboratory (AFRL), Boeing, and the NASA Langley Research Center conducted three aeroservoelastic wind-tunnel tests in the Transonic Dynamics Tunnel to demonstrate active control technologies relevant to large, flexible vehicles. In the first of these three tests, a full-span, aeroelastically scaled, wind-tunnel model of a joined-wing SensorCraft vehicle was mounted to a force balance to acquire a basic aerodynamic data set. In the second and third tests, the same wind-tunnel model was mated to a new, two-degree-of-freedom, beam mount. This mount allowed the full-span model to translate vertically and pitch. Trimmed flight at -10% static margin and gust load alleviation were successfully demonstrated. The rigid body degrees of freedom required that the model be flown in the wind tunnel using an active control system. This risky mode of testing necessitated that a model arrestment system be integrated into the new mount. The safe and successful completion of these free-flying tests required the development and integration of custom hardware and software. This paper describes the many systems, software, and procedures that were developed as part of this effort. The balance and free-flying wind-tunnel tests will be summarized. The design of the trim and gust load alleviation control laws along with the associated results will also be discussed.

  15. Computer-aided boundary delineation of agricultural lands

    NASA Technical Reports Server (NTRS)

    Cheng, Thomas D.; Angelici, Gary L.; Slye, Robert E.; Ma, Matt

    1989-01-01

    The National Agricultural Statistics Service of the United States Department of Agriculture (USDA) presently uses labor-intensive aerial photographic interpretation techniques to divide large geographical areas into manageable-sized units for estimating domestic crop and livestock production. Prototype software, the computer-aided stratification (CAS) system, was developed to automate the procedure, and currently runs on a Sun-based image processing system. With a background display of LANDSAT Thematic Mapper and United States Geological Survey Digital Line Graph data, the operator uses a cursor to delineate agricultural areas, called sampling units, which are assigned to strata of land-use and land-cover types. The resultant stratified sampling units are used as input into subsequent USDA sampling procedures. As a test, three counties in Missouri were chosen for application of the CAS procedures. Subsequent analysis indicates that CAS was five times faster in creating sampling units than the manual techniques were.

  16. Application of reiteration of Hankel singular value decomposition in quality control

    NASA Astrophysics Data System (ADS)

    Staniszewski, Michał; Skorupa, Agnieszka; Boguszewicz, Łukasz; Michalczuk, Agnieszka; Wereszczyński, Kamil; Wicher, Magdalena; Konopka, Marek; Sokół, Maria; Polański, Andrzej

    2017-07-01

Medical centres are obliged to store past medical records, including the results of quality assurance (QA) tests of the medical equipment, which is especially useful in checking the reproducibility of medical devices and procedures. Analysis of multivariate time series is an important part of quality control of NMR data. In this work we propose an anomaly detection tool based on the Reiteration of Hankel Singular Value Decomposition method. The presented method was compared with external software and the authors obtained comparable results.
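    The building block of Hankel-SVD methods is embedding the time series in a trajectory (Hankel) matrix and examining how much of its energy the leading singular component captures: a regular series is nearly low-rank, while anomalies leave residual energy. The sketch below uses a rank-1 residual as the anomaly score, which is a simplification of the paper's reiterated procedure; the window length and test series are illustrative:

    ```python
    def hankel(series, window):
        """Trajectory (Hankel) matrix of a time series, one window per row."""
        return [series[i:i + window] for i in range(len(series) - window + 1)]

    def top_singular_value(H, iters=200):
        """Largest singular value of H via power iteration on H^T H (pure Python)."""
        cols = len(H[0])
        v = [1.0] * cols
        for _ in range(iters):
            Hv = [sum(row[j] * v[j] for j in range(cols)) for row in H]
            w = [sum(H[i][j] * Hv[i] for i in range(len(H))) for j in range(cols)]
            norm = sum(x * x for x in w) ** 0.5
            v = [x / norm for x in w]
        Hv = [sum(row[j] * v[j] for j in range(cols)) for row in H]
        return sum(x * x for x in Hv) ** 0.5

    def rank1_residual(series, window=4):
        """Anomaly score: fraction of the Hankel matrix's energy (squared
        Frobenius norm) NOT captured by a rank-1 reconstruction.
        0 for a perfectly regular (rank-1) series, larger for irregular data."""
        H = hankel(series, window)
        fro2 = sum(x * x for row in H for x in row)
        s1 = top_singular_value(H)
        return 1.0 - s1 * s1 / fro2
    ```

    A constant QA reading scores zero, while an irregular series leaves visible residual energy, so the score can be thresholded to flag runs that need inspection.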

  17. Reduction to Outside the Atmosphere and Statistical Tests Used in Geneva Photometry

    NASA Technical Reports Server (NTRS)

    Rufener, F.

    1984-01-01

Conditions for creating a precise photometric system are investigated. The analytical and discriminatory potential of a photometric system obviously results from the localization of the passbands in the spectrum; it does, however, also depend critically on the precision attained. This precision is the result of two different types of precautions. Two procedures which contribute in an efficient manner to achieving greater precision are examined. These two methods are known as hardware-related precision and software-related precision.

  18. Device for data-acquisition from transient signals: kinetic considerations

    PubMed Central

    Sampedro, A. Sanchez; Vives, S. Sagrado

    1990-01-01

This paper reports on the evaluation and testing of a home-made device. Data acquisition, treatment of transient signals, and the hardware and software involved are discussed. Some practical aspects are developed in order to increase the autonomy of procedures using the device. Kinetic and multi-signal calculations are considered in order to cover current tendencies in continuous-flow analysis. Some practical advantages versus the use of classical chart recorders or commercial computerized-instrument devices are pointed out. PMID:18925275

  19. A Self-Diagnostic System for the M6 Accelerometer

    NASA Technical Reports Server (NTRS)

    Flanagan, Patrick M.; Lekki, John

    2001-01-01

    The design of a Self-Diagnostic (SD) accelerometer system for the Space Shuttle Main Engine is presented. This retrofit system connects diagnostic electronic hardware and software to the current M6 accelerometer system. This paper discusses the general operation of the M6 accelerometer SD system and procedures for developing and evaluating the SD system. Signal processing techniques using M6 accelerometer diagnostic data are explained. Test results include diagnostic data responding to changing ambient temperature, mounting torque and base mounting impedance.

  20. A Simplified Mesh Deformation Method Using Commercial Structural Analysis Software

    NASA Technical Reports Server (NTRS)

    Hsu, Su-Yuen; Chang, Chau-Lyan; Samareh, Jamshid

    2004-01-01

Mesh deformation in response to redefined or moving aerodynamic surface geometries is a frequently encountered task in many applications. Most existing methods are either mathematically too complex or computationally too expensive for usage in practical design and optimization. We propose a simplified mesh deformation method based on linear elastic finite element analyses that can be easily implemented by using commercially available structural analysis software. Using a prescribed displacement at the mesh boundaries, a simple structural analysis is constructed based on a spatially varying Young's modulus to move the entire mesh in accordance with the surface geometry redefinitions. A variety of surface movements, such as translation, rotation, or incremental surface reshaping that often takes place in an optimization procedure, may be handled by the present method. We describe the numerical formulation and implementation using the NASTRAN software in this paper. The use of commercial software bypasses tedious reimplementation and takes advantage of the computational efficiency offered by the vendor. A two-dimensional airfoil mesh and a three-dimensional aircraft mesh were used as test cases to demonstrate the effectiveness of the proposed method. Euler and Navier-Stokes calculations were performed for the deformed two-dimensional meshes.
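    The mechanism can be seen in one dimension: treat mesh edges as linear springs whose stiffness plays the role of the spatially varying Young's modulus, prescribe the boundary displacements, and solve the equilibrium system for the interior nodes. This is only a 1D analogue of the paper's method (which uses NASTRAN on full elastic meshes); the node counts and stiffness values are illustrative:

    ```python
    def deform_mesh_1d(n_nodes, stiffness, left_disp, right_disp):
        """Propagate prescribed boundary displacements through a 1D chain of
        springs. stiffness[i] couples node i to node i+1; stiffer springs
        deform less, which is how cells near the surface are protected.
        Solves the tridiagonal equilibrium system with the Thomas algorithm."""
        # Interior equilibrium: -kl*u[i-1] + (kl+kr)*u[i] - kr*u[i+1] = 0
        a, b, c, d = [], [], [], []          # sub-, main-, super-diagonal, rhs
        for i in range(1, n_nodes - 1):
            kl, kr = stiffness[i - 1], stiffness[i]
            a.append(-kl)
            b.append(kl + kr)
            c.append(-kr)
            d.append(0.0)
        d[0] += stiffness[0] * left_disp     # known boundary values to the rhs
        d[-1] += stiffness[-1] * right_disp
        m = len(b)
        for i in range(1, m):                # forward elimination
            w = a[i] / b[i - 1]
            b[i] -= w * c[i - 1]
            d[i] -= w * d[i - 1]
        u = [0.0] * m
        u[-1] = d[-1] / b[-1]
        for i in range(m - 2, -1, -1):       # back substitution
            u[i] = (d[i] - c[i] * u[i + 1]) / b[i]
        return [left_disp] + u + [right_disp]

    uniform = deform_mesh_1d(5, [1.0] * 4, 0.0, 1.0)     # even spreading
    graded = deform_mesh_1d(3, [10.0, 1.0], 0.0, 1.0)    # stiff side barely moves
    ```

    With uniform stiffness the boundary motion spreads linearly; raising the stiffness near one end concentrates the deformation in the softer region, exactly the knob a spatially varying modulus provides.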

  1. A Framework of the Use of Information in Software Testing

    ERIC Educational Resources Information Center

    Kaveh, Payman

    2010-01-01

    With the increasing role that software systems play in our daily lives, software quality has become extremely important. Software quality is impacted by the efficiency of the software testing process. There are a growing number of software testing methodologies, models, and initiatives to satisfy the need to improve software quality. The main…

  2. High-level virtual reality simulator for endourologic procedures of lower urinary tract.

    PubMed

    Reich, Oliver; Noll, Margarita; Gratzke, Christian; Bachmann, Alexander; Waidelich, Raphaela; Seitz, Michael; Schlenker, Boris; Baumgartner, Reinhold; Hofstetter, Alfons; Stief, Christian G

    2006-06-01

    To analyze the limitations of existing simulators for urologic techniques, and then test and evaluate a novel virtual reality (VR) simulator for endourologic procedures of the lower urinary tract. Surgical simulation using VR has the potential to have a tremendous impact on surgical training, testing, and certification. Endourologic procedures seem to be an ideal target for VR systems. The URO-Trainer features genuine VR, obtained from digital video footage of more than 400 endourologic diagnostic and therapeutic procedures, as well as data from cross-sectional imaging. The software offers infinite random variations of the anatomy and pathologic features for diagnosis and surgical intervention. An advanced haptic force feedback is incorporated. Virtual cystoscopy and resection of bladder tumors were evaluated by 24 medical students and 12 residents at our department. The system was assessed by more than 150 international urologists with varying experience at different conventions and workshops from March 2003 to September 2004. Because of these evaluations and constant evolutions, the final version provides a genuine representation of endourologic procedures. Objective data are generated by a tutoring system that has documented evident teaching benefits for medical students and residents in cystoscopy and treatment of bladder tumors. The URO-Trainer represents the latest generation of endoscopy simulators. Authentic visual and haptic sensations, unlimited virtual cases, and an intelligent tutoring system make this modular system an important improvement in computer-based training and quality control in urology.

  3. Performance Improvements to the Naval Postgraduate School Turbopropulsion Labs Transonic Axially Splittered Rotor

    DTIC Science & Technology

    2013-12-01

Implementation of the current NPS TPL design procedure, which uses commercial-off-the-shelf (COTS) software (MATLAB, SolidWorks, and ANSYS CFX) for the geometric rendering and analysis, was modified and… CFX: the CFD simulation program in ANSYS Workbench. CFX-Pre: CFX boundary conditions and solver settings module. CFX-Solver: CFX solver program. …

  4. Estimation of sample size and testing power (Part 4).

    PubMed

    Hu, Liang-ping; Bao, Xiao-lei; Guan, Xue; Zhou, Shi-guo

    2012-01-01

Sample size estimation is necessary for any experimental or survey research. An appropriate estimation of sample size based on known information and statistical knowledge is of great significance. This article introduces methods of sample size estimation for difference tests with a one-factor, two-level design, including sample size estimation formulas and their realization, based on the formulas and on the POWER procedure of SAS software, for both quantitative and qualitative data. In addition, this article presents examples for analysis, which will help researchers implement the repetition principle during the research design phase.
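    For quantitative data in a one-factor, two-level design, the textbook normal-approximation formula is n = 2((z_{1-α/2} + z_{power})·σ/δ)² per group. The sketch below implements that approximation only; SAS's POWER procedure uses exact t-based computations and covers many more designs:

    ```python
    import math
    from statistics import NormalDist

    def sample_size_two_groups(delta, sigma, alpha=0.05, power=0.80):
        """Per-group n for a two-sided, two-sample comparison of means, by the
        normal-approximation formula n = 2*((z_{1-a/2} + z_power) * sigma / delta)^2.
        delta: smallest difference in means worth detecting; sigma: common SD."""
        z = NormalDist()
        z_alpha = z.inv_cdf(1 - alpha / 2)   # two-sided critical value
        z_beta = z.inv_cdf(power)            # quantile for the desired power
        n = 2 * ((z_alpha + z_beta) * sigma / delta) ** 2
        return math.ceil(n)                  # round up: n must be an integer

    # Detect a half-standard-deviation difference with 80% power at alpha = 0.05.
    n_per_group = sample_size_two_groups(delta=0.5, sigma=1.0)
    ```

    The exact t-based answer is slightly larger (64 per group here), which is why production tools refine the normal approximation.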

  5. On-line evaluation of multiloop digital controller performance

    NASA Technical Reports Server (NTRS)

    Wieseman, Carol D.

    1993-01-01

The purpose of this presentation is to inform the Guidance and Control community of capabilities which were developed by the Aeroservoelasticity Branch to evaluate the performance of multivariable control laws, on-line, during wind-tunnel testing. The capabilities are generic enough to be useful for all kinds of on-line analyses involving multivariable control in experimental testing. Consequently, it was decided to present this material at this workshop even though it has been presented elsewhere. Topics covered include: essential on-line analysis requirements; on-line analysis capabilities; on-line analysis software; frequency domain procedures; controller performance evaluation; frequency-domain flutter suppression; and plant determination.

  6. Co-streaming classes: a follow-up study in improving the user experience to better reach users.

    PubMed

    Hayes, Barrie E; Handler, Lara J; Main, Lindsey R

    2011-01-01

    Co-streaming classes have enabled library staff to extend open classes to distance education students and other users. Student evaluations showed that the model could be improved. Two areas required attention: audio problems experienced by online participants and staff teaching methods. Staff tested equipment and adjusted software configuration to improve user experience. Staff training increased familiarity with specialized teaching techniques and troubleshooting procedures. Technology testing and staff training were completed, and best practices were developed and applied. Class evaluations indicate improvements in classroom experience. Future plans include expanding co-streaming to more classes and on-going data collection, evaluation, and improvement of classes.

  7. Parallel Domain Decomposition Formulation and Software for Large-Scale Sparse Symmetrical/Unsymmetrical Aeroacoustic Applications

    NASA Technical Reports Server (NTRS)

    Nguyen, D. T.; Watson, Willie R. (Technical Monitor)

    2005-01-01

The overall objectives of this research work are to formulate and validate efficient parallel algorithms, and to efficiently design/implement computer software for solving large-scale acoustic problems arising from the unified frameworks of the finite element procedures. The adopted parallel Finite Element (FE) Domain Decomposition (DD) procedures should take full advantage of the multiple processing capabilities offered by most modern high performance computing platforms for efficient parallel computation. To achieve this objective, the formulation needs to integrate efficient sparse (and dense) assembly techniques, hybrid (or mixed) direct and iterative equation solvers, proper preconditioning strategies, unrolling strategies, and effective interprocessor communication schemes. Finally, the numerical performance of the developed parallel finite element procedures will be evaluated by solving a series of structural and acoustic (symmetrical and unsymmetrical) problems on different computing platforms. Comparisons with existing "commercialized" and/or "public domain" software are also included, whenever possible.
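    The "efficient sparse assembly" step referred to above typically means accumulating element-by-element stiffness contributions (COO triplets, where entries at shared nodes must be summed) into a compressed sparse row (CSR) structure that solvers can consume. A minimal sketch, using two 1D linear elements that share a node as the example; this is the generic technique, not the paper's specific implementation:

    ```python
    from collections import defaultdict

    def assemble_csr(n, triplets):
        """Assemble COO triplets (row, col, value) into CSR arrays
        (indptr, indices, data), summing duplicate entries -- the natural
        output of finite-element assembly, where adjacent elements contribute
        to the same global matrix entry."""
        rows = [defaultdict(float) for _ in range(n)]
        for i, j, v in triplets:
            rows[i][j] += v            # duplicates at shared nodes accumulate
        indptr, indices, data = [0], [], []
        for r in rows:
            for j in sorted(r):        # keep column indices sorted per row
                indices.append(j)
                data.append(r[j])
            indptr.append(len(indices))
        return indptr, indices, data

    # Two 1D elements sharing node 1: their unit stiffness blocks overlap there.
    triplets = [(0, 0, 1.0), (0, 1, -1.0), (1, 0, -1.0), (1, 1, 1.0),
                (1, 1, 1.0), (1, 2, -1.0), (2, 1, -1.0), (2, 2, 1.0)]
    indptr, indices, data = assemble_csr(3, triplets)
    ```

    The shared diagonal entry sums to 2.0, and the CSR layout (`indptr` marking each row's slice of `indices`/`data`) is what direct and iterative solvers alike traverse efficiently.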

  8. Development and evaluation of a web-based software for crash data collection, processing and analysis.

    PubMed

    Montella, Alfonso; Chiaradonna, Salvatore; Criscuolo, Giorgio; De Martino, Salvatore

    2017-02-05

The first step in the development of an effective safety management system is to create reliable crash databases, since the quality of decision making in road safety depends on the quality of the data on which decisions are based. Improving crash data is a worldwide priority, as highlighted in the Global Plan for the Decade of Action for Road Safety adopted by the United Nations, which recognizes that the overall goal of the plan will be attained by improving the quality of data collection at the national, regional and global levels. Crash databases provide the basic information for effective highway safety efforts at any level of government, but lack of uniformity among countries and among the different jurisdictions in the same country is observed. Several existing databases show significant drawbacks which hinder their effective use for safety analysis and improvement. Furthermore, modern technologies offer great potential for significant improvements of existing methods and procedures for crash data collection, processing and analysis. To address these issues, in this paper we present the development and evaluation of a web-based platform-independent software for crash data collection, processing and analysis. The software is designed for mobile and desktop electronic devices and enables a guided and automated drafting of the crash report, assisting police officers both on-site and in the office. The software development was based both on a detailed critical review of existing Australasian, EU, and U.S. crash databases and software as well as on continuous consultation with the stakeholders. The evaluation was carried out by comparing the completeness, timeliness, and accuracy of crash data before and after the use of the software in the city of Vico Equense, in the south of Italy, showing significant advantages. The amount of collected information increased from 82 variables to 268 variables, i.e., a 227% increase.
The time saving was more than one hour per crash, i.e., a 36% reduction. The on-site data collection did not produce time savings; however, this is a temporary weakness that will diminish as officers become more acquainted with the software. The phase of evaluation, processing and analysis carried out in the office was dramatically shortened, i.e., a 69% reduction. Even if all these benefits are remarkable, the most valuable benefit of the new procedure was the reduction of police officers' mistakes during the manual operations of survey and data evaluation. Another benefit was the standardization, which allowed fast and consistent data analysis and evaluation. Because of these benefits, the satisfaction questionnaires administered to the police officers after the testing phase showed very good acceptance of the procedure. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Evaluation of management measures of software development. Volume 1: Analysis summary

    NASA Technical Reports Server (NTRS)

    Page, J.; Card, D.; Mcgarry, F.

    1982-01-01

    The conceptual model, the data classification scheme, and the analytic procedures are explained. The analytic results are summarized and specific software measures for collection and monitoring are recommended.

  10. Quality-assurance plan for groundwater activities, U.S. Geological Survey, Washington Water Science Center

    USGS Publications Warehouse

    Kozar, Mark D.; Kahle, Sue C.

    2013-01-01

    This report documents the standard procedures, policies, and field methods used by the U.S. Geological Survey’s (USGS) Washington Water Science Center staff for activities related to the collection, processing, analysis, storage, and publication of groundwater data. This groundwater quality-assurance plan changes through time to accommodate new methods and requirements developed by the Washington Water Science Center and the USGS Office of Groundwater. The plan is based largely on requirements and guidelines provided by the USGS Office of Groundwater, or the USGS Water Mission Area. Regular updates to this plan represent an integral part of the quality-assurance process. Because numerous policy memoranda have been issued by the Office of Groundwater since the previous groundwater quality assurance plan was written, this report is a substantial revision of the previous report, supplants it, and contains significant additional policies not covered in the previous report. This updated plan includes information related to the organization and responsibilities of USGS Washington Water Science Center staff, training, safety, project proposal development, project review procedures, data collection activities, data processing activities, report review procedures, and archiving of field data and interpretative information pertaining to groundwater flow models, borehole aquifer tests, and aquifer tests. 
Important updates from the previous groundwater quality assurance plan include: (1) procedures for documenting and archiving of groundwater flow models; (2) revisions to procedures and policies for the creation of sites in the Groundwater Site Inventory database; (3) adoption of new water-level forms to be used within the USGS Washington Water Science Center; (4) procedures for future creation of borehole geophysics, surface geophysics, and aquifer-test archives; and (5) use of the USGS Multi Optional Network Key Entry System software for entry of routine water-level data collected as part of long-term water-level monitoring networks.

  11. Open core control software for surgical robots.

    PubMed

    Arata, Jumpei; Kozuka, Hiroaki; Kim, Hyung Wook; Takesue, Naoyuki; Vladimirov, B; Sakaguchi, Masamichi; Tokuda, Junichi; Hata, Nobuhiko; Chinzei, Kiyoyuki; Fujimoto, Hideo

    2010-05-01

Today, patients and doctors in the operating room are surrounded by many medical devices as a result of recent advances in medical technology. However, these cutting-edge medical devices work independently and do not collaborate with each other, even though collaboration between devices such as navigation systems and medical imaging devices is becoming very important for accomplishing complex surgical tasks (such as a tumor removal procedure while checking the tumor location in neurosurgery). Several surgical robots have been commercialized and are becoming common; however, these surgical robots are not yet open to collaboration with external medical devices. A cutting-edge "intelligent surgical robot" would become possible through collaboration among surgical robots, various kinds of sensors, navigation systems and so on. At the same time, most academic software development for surgical robots is "home-made" within individual research institutions and not open to the public. Therefore, open source control software for surgical robots can be beneficial in this field. From these perspectives, we developed the Open Core Control software for surgical robots to overcome these challenges. In general, control software has hardware dependencies based on actuators, sensors and various kinds of internal devices, and therefore cannot be used on different types of robots without modification. However, the structure of the Open Core Control software can be reused for various types of robots by abstracting the hardware-dependent parts. In addition, network connectivity is crucial for collaboration between advanced medical devices. The OpenIGTLink protocol is adopted in the Interface class, which communicates with external medical devices. At the same time, it is essential to maintain stable operation during asynchronous data transactions over the network.
In the Open Core Control software, several techniques for this purpose were introduced. The virtual fixture is a well-known technique that provides a "force guide" to support operators in performing precise manipulation with a master-slave robot. A virtual fixture for precise and safe surgery was implemented on the system to demonstrate high-level collaboration between a surgical robot and a navigation system. The virtual fixture extension is not part of the Open Core Control system itself; however, such a function cannot be realized without tight collaboration between cutting-edge medical devices. Using the virtual fixture, operators can pre-define an accessible area on the navigation system, and the area information can be transferred to the robot. In this manner, the surgical console generates a reflection force when the operator tries to leave the pre-defined accessible area during surgery. The Open Core Control software was implemented on a surgical master-slave robot and stable operation was observed in a motion test. The tip of the surgical robot was displayed on a navigation system by connecting the surgical robot with a 3D position sensor through OpenIGTLink. The accessible area was pre-defined before the operation, and the virtual fixture was displayed as a "force guide" on the surgical console. In addition, the system showed stable performance in a duration test with network disturbance. In this paper, the design of the Open Core Control software for surgical robots and the implementation of the virtual fixture were described. The Open Core Control software was implemented on a surgical robot system and showed stable performance in high-level collaboration work. The Open Core Control software is being developed to be a widely used platform for surgical robots. Safety issues are essential for the control software of these complex medical devices.
It is important to follow the global specifications such as a FDA requirement "General Principles of Software Validation" or IEC62304. For following these regulations, it is important to develop a self-test environment. Therefore, a test environment is now under development to test various interference in operation room such as a noise of electric knife by considering safety and test environment regulations such as ISO13849 and IEC60508. The Open Core Control software is currently being developed software in open-source manner and available on the Internet. A communization of software interface is becoming a major trend in this field. Based on this perspective, the Open Core Control software can be expected to bring contributions in this field.
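The reaction-force behaviour described above can be sketched as a simple forbidden-region fixture. This is a minimal illustration, not the actual system's algorithm: the spherical accessible area, the function name, and the stiffness value are all assumptions for the sketch.

```python
import math

def virtual_fixture_force(tip, center, radius, stiffness=200.0):
    """Reaction force guiding the tool tip back into a spherical
    accessible area. Positions in metres, stiffness in N/m."""
    # Vector from the centre of the accessible area to the tool tip
    off = [t - c for t, c in zip(tip, center)]
    dist = math.sqrt(sum(o * o for o in off))
    if dist <= radius:              # tip still inside the accessible area
        return [0.0, 0.0, 0.0]
    # Spring-like force proportional to penetration depth,
    # directed back toward the accessible area
    scale = -stiffness * (dist - radius) / dist
    return [scale * o for o in off]

# Inside the predefined area: no guidance force is generated
print(virtual_fixture_force([0.01, 0.0, 0.0], [0.0, 0.0, 0.0], 0.05))
# 1 cm outside along x: the console would render a force pushing back inward
print(virtual_fixture_force([0.06, 0.0, 0.0], [0.0, 0.0, 0.0], 0.05))
```

In a real master-slave system this force would be rendered on the surgical console's haptic device each control cycle, using the area definition received from the navigation system.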

  12. Poster - Thur Eve - 10: Long term stability of VMAT quality assurance parameters using an EPID.

    PubMed

    Pekar, J; Diamond, K R

    2012-07-01

    The rapidly growing use of volumetric modulated arc therapy (VMAT) treatments in radiation therapy calls for a quantitative, automated, and reliable quality assurance (QA) procedure that can be used routinely in the clinical setting. In this work, we present a series of VMAT QA procedures used to assess dynamic multi-leaf collimator (MLC) positional accuracy, variable dose-rate accuracy, and MLC leaf speed accuracy. The QA procedures were performed using amorphous silicon electronic portal imaging devices (EPIDs) to determine the long-term stability of the measured parameters on two Varian linear accelerators. The measurements were repeated weekly on both linear accelerators for a period of three months, and the EPID images were analyzed using custom Matlab software. The results of the picket fence tests indicate that MLC leaf positions can be identified to within 0.11 mm and 0.15 mm for static gantry delivery and VMAT delivery, respectively. In addition, the dose-rate, gantry speed, and MLC leaf speed tests all show very good stability over the measurement period. The measurements thus far suggest that a number of the dosimetry tests may be suitable for quarterly QA for Varian iX and Trilogy linacs. However, additional measurements are required to confirm the frequency with which each test is required for safe and reliable VMAT delivery at our centre. © 2012 American Association of Physicists in Medicine.
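A picket-fence analysis like the one described locates each MLC "picket" in the EPID image so its measured position can be compared with the plan. A minimal sketch of the position-finding step, assuming a 1-D profile already summed across image rows; the pixel pitch and threshold here are illustrative, not values from the paper:

```python
def picket_positions(profile, pixel_mm=0.392, threshold=0.5):
    """Locate picket centres (in mm) in a 1-D EPID profile by taking the
    intensity-weighted centre of each above-threshold run of pixels."""
    cut = threshold * max(profile)
    pickets, run = [], []
    for i, v in enumerate(profile):
        if v >= cut:
            run.append((i, v))         # pixel belongs to the current picket
        elif run:
            # Close the run: centre of mass of the picket, scaled to mm
            com = sum(i * v for i, v in run) / sum(v for _, v in run)
            pickets.append(com * pixel_mm)
            run = []
    if run:  # close a picket that touches the end of the profile
        com = sum(i * v for i, v in run) / sum(v for _, v in run)
        pickets.append(com * pixel_mm)
    return pickets
```

Subtracting the planned picket positions from these measured centres gives per-leaf offsets, whose spread over weekly runs is what a long-term stability study tracks.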

  13. From Print to Pixels: Practitioners' Reflections on the Use of Qualitative Data Analysis Software.

    ERIC Educational Resources Information Center

    Gilbert, Linda S.

    This paper studied how individual qualitative researchers perceive that their research procedures and perspectives have been influenced by the adoption of computer-assisted qualitative data analysis software. The study focused on NUD*IST software (Non-numerical Unstructured Data Indexing, Searching, and Theorizing). The seven participants ranged from new…

  14. 37 CFR 201.26 - Recordation of documents pertaining to computer shareware and donation of public domain computer...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... pertaining to computer shareware and donation of public domain computer software. 201.26 Section 201.26... of public domain computer software. (a) General. This section prescribes the procedures for... software under section 805 of Public Law 101-650, 104 Stat. 5089 (1990). Documents recorded in the...

  15. 37 CFR 201.26 - Recordation of documents pertaining to computer shareware and donation of public domain computer...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... pertaining to computer shareware and donation of public domain computer software. 201.26 Section 201.26... public domain computer software. (a) General. This section prescribes the procedures for submission of legal documents pertaining to computer shareware and the deposit of public domain computer software...

  16. 37 CFR 201.26 - Recordation of documents pertaining to computer shareware and donation of public domain computer...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... pertaining to computer shareware and donation of public domain computer software. 201.26 Section 201.26... public domain computer software. (a) General. This section prescribes the procedures for submission of legal documents pertaining to computer shareware and the deposit of public domain computer software...

  17. 37 CFR 201.26 - Recordation of documents pertaining to computer shareware and donation of public domain computer...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... pertaining to computer shareware and donation of public domain computer software. 201.26 Section 201.26... public domain computer software. (a) General. This section prescribes the procedures for submission of legal documents pertaining to computer shareware and the deposit of public domain computer software...

  18. 37 CFR 201.26 - Recordation of documents pertaining to computer shareware and donation of public domain computer...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... pertaining to computer shareware and donation of public domain computer software. 201.26 Section 201.26... public domain computer software. (a) General. This section prescribes the procedures for submission of legal documents pertaining to computer shareware and the deposit of public domain computer software...

  19. Software measurement guidebook

    NASA Technical Reports Server (NTRS)

    Bassman, Mitchell J.; Mcgarry, Frank; Pajerski, Rose

    1994-01-01

    This Software Measurement Guidebook presents information on the purpose and importance of measurement. It discusses the specific procedures and activities of a measurement program and the roles of the people involved. The guidebook also clarifies the roles that measurement can and must play in the goal of continual, sustained improvement for all software production and maintenance efforts.

  20. GIS Methodic and New Database for Magmatic Rocks. Application for Atlantic Oceanic Magmatism.

    NASA Astrophysics Data System (ADS)

    Asavin, A. M.

    2001-12-01

    Several geochemical databases are now available on the Internet. One of the main peculiarities of the stored geochemical information is the geographical coordinates of each sample. As a rule, the database software uses this spatial information only in user-interface search procedures. On the other hand, GIS software (Geographical Information System software) such as ARC/INFO, which is used to create and analyze special geological, geochemical, and geophysical e-maps, is deeply involved with the geographical coordinates of samples. We joined the capabilities of GIS systems and a relational geochemical database through special software. Our geochemical information system was created at the Vernadsky State Geological Museum and the Institute of Geochemistry and Analytical Chemistry in Moscow. We have tested the system with geochemical data on oceanic rocks from the Atlantic and Pacific oceans, about 10,000 chemical analyses. The GIS information content consists of e-map covers of the globe. Parts of these maps cover the Atlantic ocean: a gravimetric map (with a 2'' grid), oceanic-bottom heat flow, altimetric maps, seismic activity, a tectonic map, and a geological map. Combining this content makes it possible to create new geochemical maps and to combine spatial analysis with numerical geochemical modeling of volcanic processes in an ocean segment. We have tested the information system using a thick-client technology. The interface between the GIS system ArcView and the database resides in a special sequence of multiple SQL queries. The result of these queries is a simple DBF file with geographical coordinates, which is used when creating geochemical and other special e-maps of an oceanic region. A more complex method was used for geophysical data: from ArcView we created a grid cover for the polygon spatial geophysical information.
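The coupling described, in which SQL queries against the relational database return coordinate-tagged records that feed a GIS point layer, can be sketched roughly as follows. The table and column names are illustrative stand-ins, not the actual schema:

```python
import sqlite3

# Toy stand-in for the relational geochemical database
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE samples (id INTEGER, lat REAL, lon REAL, sio2 REAL)")
db.executemany("INSERT INTO samples VALUES (?, ?, ?, ?)",
               [(1, -0.5, -25.1, 49.8), (2, 10.2, -40.3, 51.2)])

# The query step: select analyses together with their coordinates,
# so the result set can be exported (e.g. as DBF) to the GIS
rows = db.execute(
    "SELECT lat, lon, sio2 FROM samples WHERE sio2 > 50"
).fetchall()

# Each (lat, lon, value) row would become one point in the map layer
for lat, lon, sio2 in rows:
    print(lat, lon, sio2)
```

The key design point is that the coordinates travel with every record, so any attribute query doubles as a map-layer query.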

  1. 48 CFR 242.503-2 - Post-award conference procedure.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... contracts that include the clause at 252.234-7004, Cost and Software Data Reporting, postaward conferences shall include a discussion of the contractor's standard cost and software data reporting (CSDR) process...

  2. 48 CFR 242.503-2 - Post-award conference procedure.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... contracts that include the clause at 252.234-7004, Cost and Software Data Reporting, postaward conferences shall include a discussion of the contractor's standard cost and software data reporting (CSDR) process...

  3. 48 CFR 242.503-2 - Post-award conference procedure.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... contracts that include the clause at 252.234-7004, Cost and Software Data Reporting, postaward conferences shall include a discussion of the contractor's standard cost and software data reporting (CSDR) process...

  4. Development of automation software for neutron activation analysis process in Malaysian nuclear agency

    NASA Astrophysics Data System (ADS)

    Yussup, N.; Rahman, N. A. A.; Ibrahim, M. M.; Mokhtar, M.; Salim, N. A. A.; Soh@Shaari, S. C.; Azman, A.

    2017-01-01

    Neutron Activation Analysis (NAA) has been established at the Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s. Most of the established procedures, especially from sample registration to sample analysis, are performed manually. These manual procedures carried out by the NAA laboratory personnel are time consuming and inefficient. Hence, software to support automation of the system was developed to provide an effective method that replaces redundant manual data entry and yields a faster sample analysis and calculation process. This paper describes the design and development of automation software for the NAA process, which consists of three sub-programs: sample registration; hardware control and data acquisition; and sample analysis. The data flow and the connections between the sub-programs are explained. The software was developed using the National Instruments LabVIEW development package.

  5. Control of Technology Transfer at JPL

    NASA Technical Reports Server (NTRS)

    Oliver, Ronald

    2006-01-01

    Controlled Technology: 1) Design: preliminary or critical design data, schematics, technical flow charts, SNV code/diagnostics, logic flow diagrams, wirelists, ICDs, detailed specifications or requirements. 2) Development: constraints, computations, configurations, technical analyses, acceptance criteria, anomaly resolution, detailed test plans, detailed technical proposals. 3) Production: process or how-to: assemble, operate, repair, maintain, modify. 4) Manufacturing: technical instructions, specific parts, specific materials, specific qualities, specific processes, specific flow. 5) Operations: how to operate, contingency or standard operating plans, Ops handbooks. 6) Repair: repair instructions, troubleshooting schemes, detailed schematics. 7) Test: specific procedures, data, analysis, detailed test and retest plans, detailed anomaly resolutions, detailed failure causes and corrective actions, troubleshooting, trended test data, flight readiness data. 8) Maintenance: maintenance schedules and plans, methods for regular upkeep, overhaul instructions. 9) Modification: modification instructions, upgrade kit parts, including software.

  6. Analysis of Photogrammetry Data from ISIM Mockup

    NASA Technical Reports Server (NTRS)

    Nowak, Maria; Hill, Mike

    2007-01-01

    During ground testing of the Integrated Science Instrument Module (ISIM) for the James Webb Space Telescope (JWST), the ISIM Optics group plans to use a Photogrammetry Measurement System for cryogenic calibration of specific target points on the ISIM composite structure, the Science Instrument optical benches, and other GSE equipment. This testing will occur in the Space Environmental Systems (SES) chamber at Goddard Space Flight Center. Close-range photogrammetry is a three-dimensional metrology technique that uses triangulation to locate custom targets in 3 coordinates via a collection of digital photographs taken from various locations and orientations. These photos are connected using coded targets, special targets that are recognized by the software and thus allow the images to be correlated into a three-dimensional map of the targets, scaled via well-calibrated scale bars. Photogrammetry solves for the camera locations and the coordinates of the targets simultaneously through the bundling procedure contained in the V-STARS software, proprietary software owned by Geodetic Systems Inc. The primary objectives of the metrology performed on the ISIM mock-up were (1) to quantify the accuracy of the INCA3 photogrammetry camera on a representative full-scale version of the ISIM structure at ambient temperature by comparing its measurements with measurements from the Leica laser tracker system, and (2) to empirically determine the smallest increment of target position movement that can be resolved by the PG camera in the test setup, i.e., its precision, or resolution. In addition, the geometrical details of the test setup defined during the mockup testing, such as target locations and camera positions, will contribute to the final design of the photogrammetry system to be used on the ISIM Flight Structure.

  7. Aeroservoelastic Testing of Free Flying Wind Tunnel Models Part 1: A Sidewall Supported Semispan Model Tested for Gust Load Alleviation and Flutter Suppression

    NASA Technical Reports Server (NTRS)

    Scott, Robert C.; Vetter, Travis K.; Penning, Kevin B.; Coulson, David A.; Heeg, Jennifer.

    2013-01-01

    This is Part 1 of a two-part document. Part 2 is titled: "Aeroservoelastic Testing of Free Flying Wind Tunnel Models, Part 2: A Centerline Supported Fullspan Model Tested for Gust Load Alleviation." A team comprising the Air Force Research Laboratory (AFRL), Northrop Grumman, Lockheed Martin, and the NASA Langley Research Center conducted three aeroservoelastic wind tunnel tests in the Transonic Dynamics Tunnel to demonstrate active control technologies relevant to large, flexible vehicles. In the first of these three tests, a semispan, aeroelastically scaled, wind tunnel model of a flying wing SensorCraft vehicle was mounted to a force balance to demonstrate gust load alleviation. In the second and third tests, the same wing was mated to a new, multi-degree-of-freedom sidewall mount. This mount allowed the half-span model to translate vertically and pitch at the wing root, allowing better simulation of the full-span vehicle's rigid body modes. Gust load alleviation (GLA) and body freedom flutter (BFF) suppression were successfully demonstrated. The rigid body degrees of freedom required that the model be flown in the wind tunnel using an active control system. This risky mode of testing necessitated that a model arrestment system be integrated into the new mount. The safe and successful completion of these free flying tests required the development and integration of custom hardware and software. This paper describes the many systems, software, and procedures that were developed as part of this effort.

  8. Remote monitoring of videourodynamics using smart phone and free instant messaging software.

    PubMed

    Hsieh, Po-Fan; Chang, Chao-Hsiang; Lien, Chi-Shun; Wu, Hsi-Chin; Hsiao, Po-Jen; Chou, Eric Chieh-Lung

    2013-11-01

    To evaluate the feasibility of using smart phones plus free instant messaging software for remote monitoring of videourodynamics. From November 2011 to October 2012, 85 females with voiding disorders were enrolled for videourodynamic tests. The patients were assigned to videourodynamics remotely monitored by the attending physician using an iPhone/iPad and Skype (group 1) or videourodynamics with the attending physician present (group 2). The procedural time and videourodynamic quality, assessed by the frequency of adherence to the modified Sullivan criteria, were recorded and compared for each group. There were 44 and 41 patients in group 1 and group 2, respectively. The mean procedural time was comparable between group 1 and group 2 (56.3 vs. 54.4 min, P = 0.25). The frequencies of adherence to the modified Sullivan criteria were similar in each group. The quality of videourodynamics under the attending physician's remote or direct monitoring was appropriate in both cases. Based on the convenience of the Internet, the popularity of smart phones, and the intention to help urologists use their time more efficiently, our study presents remote monitoring as an alternative way of performing videourodynamics. © 2013 Wiley Periodicals, Inc.

  9. Avoiding Human Error in Mission Operations: Cassini Flight Experience

    NASA Technical Reports Server (NTRS)

    Burk, Thomas A.

    2012-01-01

    Operating spacecraft is a never-ending challenge and the risk of human error is ever-present. Many missions have been significantly affected by human error on the part of ground controllers. The Cassini mission at Saturn has not been immune to human error, but Cassini operations engineers use tools and follow processes that find and correct most human errors before they reach the spacecraft. What is needed are skilled engineers with good technical knowledge, good interpersonal communications, quality ground software, regular peer reviews, up-to-date procedures, as well as careful attention to detail and the discipline to test and verify all commands that will be sent to the spacecraft. Two areas of special concern are changes to flight software and response to in-flight anomalies. The Cassini team has a lot of practical experience in all these areas and they have found that well-trained engineers with good tools who follow clear procedures can catch most errors before they get into command sequences to be sent to the spacecraft. Finally, having a robust and fault-tolerant spacecraft that allows ground controllers excellent visibility of its condition is the most important way to ensure human error does not compromise the mission.

  10. Advanced Communication and Control Solutions of Distributed Energy Resources (DER)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Asgeirsson, Haukur; Seguin, Richard; Sherding, Cameron

    2007-01-10

    This report covers work performed in Phase II of a two-phase project whose objective was to demonstrate the aggregation of multiple Distributed Energy Resources (DERs) and to offer them into the energy market. The Phase I work (DE-FC36-03CH11161) created an integrated, but distributed, system and procedures to monitor and control multiple DERs from numerous manufacturers connected to the electric distribution system. Procedures were created which protect the distribution network and personnel that may be working on the network. Using the web as the communication medium for control and monitoring of the DERs, the integration of information and security was accomplished through the use of industry standard protocols such as SSL, VPN, and ICCP. The primary objective of Phase II was to develop the procedures for marketing the power of the Phase I aggregated DERs in the energy market, increase the number of DER units, and implement the marketing procedures (interfacing with ISOs) for the DER-generated power. The team partnered with the Midwest Independent System Operator (MISO), the local ISO, to address the energy market and demonstrate the economic dispatch of DERs in response to market signals. The selection of standards-based communication technologies offers the ability of the system to be deployed and integrated with other utilities' resources. With the use of data historian technology to facilitate the aggregation, the developed algorithms and procedures can be verified, audited, and modified. The team demonstrated monitoring and control of multiple DERs as outlined in the Phase I report, including procedures to perform these operations in a secure and safe manner. In Phase II, additional DER units were added. We also expanded on our Phase I work to enhance communication security and to develop the market model of having DERs, both customer and utility owned, participate in the energy market.
We are proposing a two-part DER energy market model: a utility-need business model and an independent energy aggregator business model. The approach of developing two group models of DER energy participation in the market is unique. The Detroit Edison (DECo, utility)-led team includes DTE Energy Technologies (DTECH, DER provider); Electrical Distribution Design (EDD, a Virginia Tech company supporting EPRI's Distribution Engineering Workstation, DEW); Systems Integration Specialists Company (SISCO, economic scheduling and real-time protocol integrator); and OSIsoft (PI software system for managing real-time information). This team focused on developing the application engineering, including the software systems necessary for DER integration, control, and sale into the marketplace. Phase II highlights: An ICCP link with SSL (security) was installed and tested between DECo, the utility, and DTECH, the aggregator, making DER data available to the utility for both monitoring and control. PI Process Book displays with circuit and DER operational models were installed and tested for use by DECo SOC/ROC operators in monitoring both utility circuit and customer DER parameters; the models also included DER control for those operators, which was tested and demonstrated. DER tagging and operating procedures were developed, allowing that control to be performed safely, and were modified for the required MOC/MISO notification procedures. The Distribution Engineering Workstation (DEW) was modified to include temperature-normalized load research statistics using a 30-hour day-ahead weather feed. This allowed day-ahead forecasting of the customer load profile and the entire circuit to determine overload and low-voltage problems. The forecast at the point of common coupling was passed to the DTECH DR SOC for use in its economic dispatch algorithm.
Standard work instructions were developed for DER notification, sale, and operation in the MISO market. A software mechanism consisting of a suite of new and revised functionality was developed and integrated with the local ISO so that offers can be made electronically without human intervention. The DR SOC developed a suite of software enabling DER usage in real time and day-ahead: generation information file exchange with PI and the utility power flow; a utility day-ahead information file; an Energy Offer Web Service; a Market Result Web Service; a Real-Time Meter Data Web Service; and a Real-Time Notification Web Service. Over 20 DERs were registered with MISO in the Demand Response Market, and electronic sale to MISO was demonstrated.
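An electronic day-ahead offer of the kind submitted through an energy offer web service can be pictured as a simple structured message. The field names and values below are illustrative assumptions for the sketch, not MISO's actual schema:

```python
import json
from datetime import date

def build_energy_offer(unit_id, offer_mw, price_per_mwh, trade_date):
    """Assemble a minimal day-ahead energy offer for one DER unit:
    one (MW, $/MWh) pair per hour of the operating day."""
    return {
        "unit": unit_id,
        "trade_date": trade_date.isoformat(),
        "hourly": [{"hour": h, "mw": offer_mw, "price": price_per_mwh}
                   for h in range(24)],
    }

offer = build_energy_offer("DER-017", 0.5, 62.0, date(2006, 7, 15))
payload = json.dumps(offer)   # serialized body for an offer web service call
```

Automating this assembly and submission is what removes the human from the offer loop, as the report describes.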

  11. 47 CFR 1.2202 - Competitive bidding design options.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Section 1.2202 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Grants...) Procedures that utilize mathematical computer optimization software, such as integer programming, to evaluate... evaluating bids using a ranking based on specified factors. (B) Procedures that combine computer optimization...

  12. NASA's Software Safety Standard

    NASA Technical Reports Server (NTRS)

    Ramsay, Christopher M.

    2007-01-01

    NASA relies more and more on software to control, monitor, and verify its safety critical systems, facilities, and operations. Since the 1960s there has hardly been a spacecraft launched that does not have a computer on board providing command and control services. There have been recent incidents where software has played a role in high-profile mission failures and hazardous incidents. For example, the Mars Orbiter, Mars Polar Lander, the DART (Demonstration of Autonomous Rendezvous Technology), and MER (Mars Exploration Rover) Spirit anomalies were all caused or contributed to by software. The Mission Control Centers for the Shuttle, ISS, and unmanned programs are highly dependent on software for data displays, analysis, and mission planning. Despite this growing dependence on software control and monitoring, there has been little to no consistent application of software safety practices and methodology to NASA's projects with safety critical software. Meanwhile, academia and private industry have been stepping forward with procedures and standards for safety critical systems and software, for example Dr. Nancy Leveson's book Safeware: System Safety and Computers. The NASA Software Safety Standard, originally published in 1997, was widely ignored due to its complexity and poor organization. It also focused on concepts rather than definite procedural requirements organized around a software project lifecycle. Led by the NASA Headquarters Office of Safety and Mission Assurance, the NASA Software Safety Standard has recently undergone a significant update. This new standard provides the procedures and guidelines for evaluating a project for safety criticality and then lays out the minimum project lifecycle requirements to assure the software is created, operated, and maintained in the safest possible manner.
This update of the standard clearly delineates the minimum set of software safety requirements for a project without detailing the implementation of those requirements. This allows projects leeway to meet the requirements in whatever form best suits a particular project's needs and safety risk. In other words, it tells the project what to do, not how to do it. The update also incorporates advances in the state of the practice of software safety from academia and private industry. It addresses some of the more common issues now facing software developers in the NASA environment, such as the use of Commercial Off-the-Shelf (COTS) software, Modified OTS (MOTS), Government OTS (GOTS), and reused software. A team from across NASA developed the update, which has had NASA-wide internal reviews by software engineering, quality, safety, and project management, as well as expert external review. This presentation and paper discuss the new NASA Software Safety Standard, its organization, and its key features. They start with a brief discussion of some NASA mission failures and incidents that had software as one of their root causes, then give a brief overview of the NASA Software Safety Process, including the key personnel responsibilities and functions that must be performed for safety-critical software.

  13. Conversion from Tree to Graph Representation of Requirements

    NASA Technical Reports Server (NTRS)

    Mayank, Vimal; Everett, David Frank; Shmunis, Natalya; Austin, Mark

    2009-01-01

    A procedure and software to implement the procedure have been devised to enable conversion from a tree representation to a graph representation of the requirements governing the development and design of an engineering system. The need for this procedure and software and for other requirements-management tools arises as follows: In systems-engineering circles, it is well known that requirements-management capability improves the likelihood of success in the team-based development of complex systems involving multiple technological disciplines. It is especially desirable to be able to visualize (in order to identify and manage) requirements early in the system-design process, when errors can be corrected most easily and inexpensively.
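The essence of such a conversion is that a requirement appearing under several parents in the tree collapses into a single graph node with multiple incoming edges. A minimal sketch under that assumption (the data layout and requirement texts are illustrative, not from the actual tool):

```python
def tree_to_graph(tree):
    """Convert a requirements tree {parent: [children]}, where the same
    requirement may appear under several parents, into a graph with one
    node per unique requirement and a set of traceability edges."""
    nodes, edges = set(), set()
    for parent, children in tree.items():
        nodes.add(parent)
        for child in children:
            nodes.add(child)            # duplicates collapse into one node
            edges.add((parent, child))  # traceability link parent -> child
    return nodes, edges

tree = {
    "System shall be safe": ["Log all commands", "Verify inputs"],
    "System shall be testable": ["Log all commands"],  # shared requirement
}
nodes, edges = tree_to_graph(tree)
# "Log all commands" is now a single node with two incoming edges,
# making the shared dependency visible for review and management
```

The graph form is what makes cross-cutting requirements visible early, which is the motivation the abstract gives for the tool.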

  14. Executable assertions and flight software

    NASA Technical Reports Server (NTRS)

    Mahmood, A.; Andrews, D. M.; Mccluskey, E. J.

    1984-01-01

    Executable assertions are used to test flight control software. The techniques used for testing flight software, however, are different from the techniques used to test other kinds of software. This is because of the redundant nature of flight software. An experimental setup for testing flight software using executable assertions is described. Techniques for writing and using executable assertions to test flight software are presented. The error detection capability of assertions is studied and many examples of assertions are given. The issues of placement and complexity of assertions and the language features needed to support efficient use of assertions are discussed.
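An executable assertion embeds a range or reasonableness check directly in the running code so that violations are detected at run time rather than crashing the program. A minimal sketch in Python rather than a flight language; the limits, names, and fallback action are illustrative assumptions:

```python
def assert_in_range(name, value, lo, hi, log):
    """Executable assertion: record a violation instead of aborting,
    so a redundant channel or recovery routine can take over."""
    if not (lo <= value <= hi):
        log.append(f"ASSERT {name}={value} outside [{lo}, {hi}]")
        return False
    return True

def control_step(pitch_rate, log):
    # Assertion on a sensor input before it feeds the control law
    if not assert_in_range("pitch_rate", pitch_rate, -30.0, 30.0, log):
        pitch_rate = 0.0               # fall back to a safe default value
    return 0.1 * pitch_rate            # trivial stand-in for the control law

log = []
out = control_step(500.0, log)         # an implausible sensor reading
# The assertion fires, the violation is logged, and the safe default is used
```

Placement (before or after which computation) and the cost of evaluating each check are exactly the trade-offs the paper discusses.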

  15. 77 FR 50722 - Software Unit Testing for Digital Computer Software Used in Safety Systems of Nuclear Power Plants

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-22

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Software Unit Testing for Digital Computer Software...) is issuing for public comment draft regulatory guide (DG), DG-1208, ``Software Unit Testing for Digital Computer Software used in Safety Systems of Nuclear Power Plants.'' The DG-1208 is proposed...

  16. 78 FR 47011 - Software Unit Testing for Digital Computer Software Used in Safety Systems of Nuclear Power Plants

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-02

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Software Unit Testing for Digital Computer Software... revised regulatory guide (RG), revision 1 of RG 1.171, ``Software Unit Testing for Digital Computer Software Used in Safety Systems of Nuclear Power Plants.'' This RG endorses American National Standards...

  17. Benchmarking of software tools for optical proximity correction

    NASA Astrophysics Data System (ADS)

    Jungmann, Angelika; Thiele, Joerg; Friedrich, Christoph M.; Pforr, Rainer; Maurer, Wilhelm

    1998-06-01

    The point when optical proximity correction (OPC) will become a routine procedure for every design is not far away. For such daily use, the requirements for an OPC tool go far beyond the principal functionality of OPC, which has been proven by a number of approaches and is well documented in the literature. In this paper we first discuss the requirements for a productive OPC tool. Against these requirements, a benchmarking was performed with three different OPC tools available on the market (OPRX from TVT, OPTISSIMO from aiss, and PROTEUS from TMA). Each of these tools uses a different approach to perform the correction (rules, simulation, or model). To assess the accuracy of the correction, a test chip was fabricated which contains corrections done by each software tool. The advantages and weaknesses of the several solutions are discussed.

  18. Extreme Mapping: Looking for Water on the Moon

    NASA Technical Reports Server (NTRS)

    Cohen, Tamar

    2016-01-01

    There are many challenges when exploring extreme environments. Gathering accurate data to build maps about places that you cannot go is incredibly complex. NASA supports scientists by remotely operating robotic rovers to explore uncharted territories. One potential upcoming mission is to look for water near a lunar pole (the Resource Prospector mission). Learn about the technical hurdles and research steps that NASA takes before the mission. NASA practices on Earth with Mission Analogs which simulate the proposed mission. This includes going to lunar-type landscapes, building field networks, testing out rovers, instruments and operational procedures. NASA sets up remote science back rooms just as there are for actual missions. NASA develops custom Ground Data Systems software to support scientific mission planning and monitoring over variable time delays, and separate commanding software and infrastructure to operate the rovers.

  19. Software Engineering Laboratory (SEL) Data Base Maintenance System (DBAM) user's guide and system description

    NASA Technical Reports Server (NTRS)

    Lo, P. S.; Card, D.

    1983-01-01

    The Software Engineering Laboratory (SEL) Data Base Maintenance System (DBAM) is explained. The various software facilities of the SEL, DBAM operating procedures, and DBAM system information are described. The relationships among DBAM components (baseline diagrams), component descriptions, overlay descriptions, indirect command file listings, file definitions, and sample data collection forms are provided.

  20. Preparation guide for class B software specification documents

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1979-01-01

    General conceptual requirements and specific application rules and procedures are provided for the production of software specification documents in conformance with deep space network software standards and class B standards. Class B documentation is identified as the appropriate level applicable to implementation, sustaining engineering, and operational uses by qualified personnel. Special characteristics of class B documents are defined.

  1. Dtest Testing Software

    NASA Technical Reports Server (NTRS)

    Jain, Abhinandan; Cameron, Jonathan M.; Myint, Steven

    2013-01-01

    This software runs a suite of arbitrary software tests spanning various software languages and types of tests (unit level, system level, or file comparison tests). The dtest utility can be set to automate periodic testing of large suites of software, as well as running individual tests. It supports distributing multiple tests over multiple CPU cores, if available. The dtest tool is a utility program (written in Python) that scans through a directory (and its subdirectories) and finds all directories that match a certain pattern and then executes any tests in that directory as described in simple configuration files.
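The scan-and-execute pattern described above can be sketched in a few lines of Python. This is a hypothetical illustration of the approach, not the actual dtest source; the configuration file name and its one-command-per-file format are assumptions made for the example.

```python
import os
import subprocess

def find_test_configs(root, config_name="test.cfg"):
    """Walk a directory tree and collect every test configuration file.

    Mirrors the scanning behavior described for dtest; the config file
    name is an assumption for illustration.
    """
    configs = []
    for dirpath, _dirnames, filenames in os.walk(root):
        if config_name in filenames:
            configs.append(os.path.join(dirpath, config_name))
    return sorted(configs)

def run_tests(configs):
    """Run each configured test command and record pass/fail by exit code."""
    results = {}
    for cfg in configs:
        with open(cfg) as f:
            command = f.read().strip()  # one shell command per config, by assumption
        proc = subprocess.run(command, shell=True)
        results[cfg] = (proc.returncode == 0)
    return results
```

Distributing such tests over CPU cores, as dtest does, would be a natural extension via a process pool over the collected config list.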

  2. Bottom-up laboratory testing of the DKIST Visible Broadband Imager (VBI)

    NASA Astrophysics Data System (ADS)

    Ferayorni, Andrew; Beard, Andrew; Cole, Wes; Gregory, Scott; Wöeger, Friedrich

    2016-08-01

    The Daniel K. Inouye Solar Telescope (DKIST) is a 4-meter solar observatory under construction at Haleakala, Hawaii [1]. The Visible Broadband Imager (VBI) is a first light instrument that will record images at the highest possible spatial and temporal resolution of the DKIST at a number of scientifically important wavelengths [2]. The VBI is a pathfinder for DKIST instrumentation and a test bed for developing processes and procedures in the areas of unit, systems integration, and user acceptance testing. These test procedures have been developed and repeatedly executed during VBI construction in the lab as part of a "test early and test often" philosophy aimed at identifying and resolving issues early, thus saving cost during integration, test, and commissioning on the summit. The VBI team recently completed a bottom-up end-to-end system test of the instrument in the lab that allowed the instrument's functionality, performance, and usability to be validated against documented system requirements. The bottom-up testing approach includes four levels of testing, each introducing another layer in the control hierarchy that is tested before moving to the next level. First, the instrument mechanisms are tested for positioning accuracy and repeatability using a laboratory position-sensing detector (PSD). Second, the real-time motion controls are used to drive the mechanisms to verify that speed and timing synchronization requirements are being met. Next, the high-level software is introduced and the instrument is driven through a series of end-to-end tests that exercise the mechanisms, cameras, and simulated data processing. Finally, user acceptance testing is performed on operational and engineering use cases through the instrument engineering graphical user interface (GUI). In this paper we present the VBI bottom-up test plan, procedures, example test cases and tools used, as well as results from test execution in the laboratory. We will also discuss the benefits realized through completion of this testing, and share lessons learned from the bottom-up testing process.

  3. 15 CFR 995.27 - Format validation software testing.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 15 Commerce and Foreign Trade 3 2013-01-01 2013-01-01 false Format validation software testing... of NOAA ENC Products § 995.27 Format validation software testing. Tests shall be performed verifying, as far as reasonable and practicable, that CEVAD's data testing software performs the checks, as...

  4. 15 CFR 995.27 - Format validation software testing.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 15 Commerce and Foreign Trade 3 2014-01-01 2014-01-01 false Format validation software testing... of NOAA ENC Products § 995.27 Format validation software testing. Tests shall be performed verifying, as far as reasonable and practicable, that CEVAD's data testing software performs the checks, as...

  5. 15 CFR 995.27 - Format validation software testing.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 15 Commerce and Foreign Trade 3 2012-01-01 2012-01-01 false Format validation software testing... of NOAA ENC Products § 995.27 Format validation software testing. Tests shall be performed verifying, as far as reasonable and practicable, that CEVAD's data testing software performs the checks, as...

  6. 15 CFR 995.27 - Format validation software testing.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 15 Commerce and Foreign Trade 3 2011-01-01 2011-01-01 false Format validation software testing... of NOAA ENC Products § 995.27 Format validation software testing. Tests shall be performed verifying, as far as reasonable and practicable, that CEVAD's data testing software performs the checks, as...

  7. A 20-year period of orthotopic liver transplantation activity in a single center: a time series analysis performed using the R Statistical Software.

    PubMed

    Santori, G; Andorno, E; Morelli, N; Casaccia, M; Bottino, G; Di Domenico, S; Valente, U

    2009-05-01

    In many Western countries a "minimum volume rule" policy has been adopted as a quality measure for complex surgical procedures. In Italy, the National Transplant Centre set the minimum number of orthotopic liver transplantation (OLT) procedures/y at 25/center. OLT procedures performed in a single center over a reasonably large period may be treated as a time series to evaluate trend, seasonal cycles, and nonsystematic fluctuations. Between January 1, 1987 and December 31, 2006, we performed 563 cadaveric donor OLTs in adult recipients. During 2007, there were another 28 procedures. The greatest numbers of OLTs/y were performed in 2001 (n = 51), 2005 (n = 50), and 2004 (n = 49). A time series analysis performed using R (R Foundation for Statistical Computing, Vienna, Austria), a free software environment for statistical computing and graphics, showed an incremental trend after exponential smoothing as well as after seasonal decomposition. The predicted OLTs/mo for 2007, calculated by applying Holt-Winters exponential smoothing to the preceding period 1987-2006, helped to identify the months with a major difference between predicted and performed procedures. The time series approach may be helpful for establishing a minimum volume/y at the single-center level.
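As a rough illustration of the forecasting step, Holt's linear-trend variant of exponential smoothing can be written in a few lines. This is a simplification of the Holt-Winters method used in the study (the seasonal component is omitted), and the smoothing parameters are arbitrary choices for the sketch:

```python
def holt_linear_forecast(series, alpha=0.5, beta=0.5, horizon=1):
    """Holt's linear-trend exponential smoothing (no seasonal term).

    level[t] = alpha*y[t] + (1-alpha)*(level[t-1] + trend[t-1])
    trend[t] = beta*(level[t] - level[t-1]) + (1-beta)*trend[t-1]
    The h-step-ahead forecast is level + h*trend.
    """
    level, trend = series[0], series[1] - series[0]
    for y in series[1:]:
        prev_level = level
        level = alpha * y + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    return level + horizon * trend
```

On a perfectly linear monthly series the forecast reproduces the trend exactly; on real OLT counts the gap between forecast and observed months is what the study inspects.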

  8. Evaluation of three different validation procedures regarding the accuracy of template-guided implant placement: an in vitro study.

    PubMed

    Vasak, Christoph; Strbac, Georg D; Huber, Christian D; Lettner, Stefan; Gahleitner, André; Zechner, Werner

    2015-02-01

    The study aims to evaluate the accuracy of the NobelGuide™ (Medicim/Nobel Biocare, Göteborg, Sweden) concept while minimizing the influence of clinical and surgical parameters. Moreover, the study compared and validated two validation procedures against a reference method. Overall, 60 implants were placed in 10 artificial edentulous mandibles according to the NobelGuide™ protocol. For merging the pre- and postoperative DICOM data sets, three different fusion methods (Triple Scan Technique, NobelGuide™ Validation software, and AMIRA® software [VSG - Visualization Sciences Group, Burlington, MA, USA] as reference) were applied. Discrepancies between the virtual and the actual implant positions were measured. The mean deviations measured with AMIRA® were 0.49 mm (implant shoulder), 0.69 mm (implant apex), and 1.98° (implant axis). The Triple Scan Technique as well as the NobelGuide™ Validation software revealed deviations similar to the reference method. A significant correlation between angular and apical deviations was seen (r = 0.53; p < .001). A greater implant diameter was associated with greater deviations (p = .03). The Triple Scan Technique, as a system-independent validation procedure, and the NobelGuide™ Validation software are both in accordance with the AMIRA® software. The NobelGuide™ system showed similar or smaller spatial and angular deviations compared with other systems. © 2013 Wiley Periodicals, Inc.

  9. SPSS and SAS programs for comparing Pearson correlations and OLS regression coefficients.

    PubMed

    Weaver, Bruce; Wuensch, Karl L

    2013-09-01

    Several procedures that use summary data to test hypotheses about Pearson correlations and ordinary least squares regression coefficients have been described in various books and articles. To our knowledge, however, no single resource describes all of the most common tests. Furthermore, many of these tests have not yet been implemented in popular statistical software packages such as SPSS and SAS. In this article, we describe all of the most common tests and provide SPSS and SAS programs to perform them. When they are applicable, our code also computes 100 × (1 - α)% confidence intervals corresponding to the tests. For testing hypotheses about independent regression coefficients, we demonstrate one method that uses summary data and another that uses raw data (i.e., Potthoff analysis). When the raw data are available, the latter method is preferred, because use of summary data entails some loss of precision due to rounding.
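One of the classic procedures the article covers, testing whether two Pearson correlations from independent samples differ, needs only the summary data. The sketch below is the generic textbook Fisher r-to-z test, not the authors' SPSS/SAS code:

```python
import math

def independent_r_test(r1, n1, r2, n2):
    """Two-tailed z test for H0: rho1 == rho2 (independent samples).

    Uses Fisher's r-to-z transform; the variance of each transformed
    correlation is 1/(n - 3). Returns the z statistic and two-tailed
    p-value under the standard normal reference distribution.
    """
    z1 = math.atanh(r1)
    z2 = math.atanh(r2)
    se = math.sqrt(1.0 / (n1 - 3) + 1.0 / (n2 - 3))
    z = (z1 - z2) / se
    p = math.erfc(abs(z) / math.sqrt(2))  # two-tailed normal p-value
    return z, p
```

A 100 × (1 - α)% confidence interval for the difference follows the same transform, back-converted with tanh.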

  10. Performance Evaluation of a Data Validation System

    NASA Technical Reports Server (NTRS)

    Wong, Edmond (Technical Monitor); Sowers, T. Shane; Santi, L. Michael; Bickford, Randall L.

    2005-01-01

    Online data validation is a performance-enhancing component of modern control and health management systems. It is essential that performance of the data validation system be verified prior to its use in a control and health management system. A new Data Qualification and Validation (DQV) Test-bed application was developed to provide a systematic test environment for this performance verification. The DQV Test-bed was used to evaluate a model-based data validation package known as the Data Quality Validation Studio (DQVS). DQVS was employed as the primary data validation component of a rocket engine health management (EHM) system developed under NASA's NGLT (Next Generation Launch Technology) program. In this paper, the DQVS and DQV Test-bed software applications are described, and the DQV Test-bed verification procedure for this EHM system application is presented. Test-bed results are summarized and implications for EHM system performance improvements are discussed.

  11. ModBack - simplified contaminant source zone delineation using backtracking

    NASA Astrophysics Data System (ADS)

    Thielsch, K.; Herold, M.; Ptak, T.

    2012-12-01

    Contaminated groundwater poses a serious threat to drinking water resources all over the world. Even though contaminated water might be detected in observation wells, a proper clean-up is often only successful if the source of the contamination is detected and subsequently removed, contained or remediated. The high costs of groundwater remediation could be significantly reduced if, from the outset, a focus is placed on source zone detection. ModBack combines several existing modelling tools in one easy-to-use GIS-based interface helping to delineate potential contaminant source zones in the subsurface. The software is written in Visual Basic 3.5 and uses the ArcObjects library to implement all required GIS applications. It can run without modification on any Microsoft Windows based PC with sufficient RAM and at least Microsoft .NET Framework 3.5. Using ModBack requires additional installation of the following software: Processing Modflow Pro 7.0, ModPath, CSTREAM (Bayer-Raich et al., 2003), Golden Software Surfer and Microsoft Excel. The graphical user interface of ModBack is separated into four blocks of procedures dealing with: data input, groundwater modelling, backtracking and analyses. Geographical data input includes all georeferenced information pertaining to the study site. Information on subsurface contamination is gathered either by conventional sampling of monitoring wells or by conducting integral pumping tests at control planes with a specific sampling scheme. Hydraulic data from these pumping tests together with all other available information are then used to set up a groundwater flow model of the study site, which provides the flow field for transport simulations within the subsequent contamination backtracking procedures, starting from the defined control planes. The backtracking results are then analysed within ModBack. 
The potential areas of contamination source presence or absence are determined based on the procedure used by Jarsjö et al. (2005). The contaminant plume length can be estimated using plume length statistics, first-order rate degradation equations or calculations based on site-specific hydraulic and chemical parameters. Furthermore, an analytical tool is included to identify the distribution of contaminants across a control plane. All relevant output can be graphically displayed and saved as vector data to be later used in GIS software. ModBack has already been used to delimit the zones of source presence or absence at several test sites. With ModBack, a tool is now available which enables environmental consultants, engineers and environmental agencies to delineate possible sources of contamination already at the planning stage of site investigation and remediation measures, helping to significantly reduce the costs of contaminated site management. Bayer-Raich, M., Jarsjö, J., Holder, T. and Ptak, T. (2003): "Numerical estimations of contaminant mass flow rate based on concentration measurements in pumping wells", ModelCare 2002: A Few Steps Closer to Reality, IAHS Publication No. 277, 10-16. Jarsjö, J., Bayer-Raich, M., Ptak, T. (2005): "Monitoring groundwater contamination and delineating source zones at industrial sites: Uncertainty analyses using integral pumping tests", Journal of Contaminant Hydrology, 79, 107-134.

  12. Prospective Study of Neuroendoscopy versus Microscopy: 213 Cases of Microvascular Decompression for Trigeminal Neuralgia Performed by One Neurosurgeon.

    PubMed

    Xiang, Hui; Wu, Guangyong; Ouyang, Jia; Liu, Ruen

    2018-03-01

    To compare the efficacy and complications of microvascular decompression (MVD) by complete neuroendoscopy versus microscopy for 213 cases of trigeminal neuralgia (TN). Between January 2014 and January 2016, 213 patients with TN were randomly assigned to the neuroendoscopy (n = 105) or microscopy (n = 114) group for MVD via the suboccipital retrosigmoid approach. All procedures were performed by the same neurosurgeon. Follow-up was conducted by telephone interview. Statistical data were analyzed with the chi-square test, and a probability (P) value of ≤0.05 was considered statistically significant. Chi-square test was conducted using SAS 9.4 software (SAS Institute, Cary, North Carolina, USA). There were no statistical differences between the 2 groups in pain-free condition immediately post procedure, pain-free condition 1 year post procedure, hearing loss, facial hypoesthesia, transient ataxia, aseptic meningitis, intracranial infections, and herpetic lesions of the lips. There were no instances of death, facial paralysis, cerebral hemorrhage, or cerebrospinal fluid leakage in either group. There were no significant differences in the cure rates or incidences of surgical complications between neuroendoscopic and microscopic MVD. Copyright © 2017 Elsevier Inc. All rights reserved.

  13. Simulation Testing of Embedded Flight Software

    NASA Technical Reports Server (NTRS)

    Shahabuddin, Mohammad; Reinholtz, William

    2004-01-01

    Virtual Real Time (VRT) is a computer program for testing embedded flight software by computational simulation in a workstation, in contradistinction to testing it in its target central processing unit (CPU). The disadvantages of testing in the target CPU include the need for an expensive test bed, the necessity for testers and programmers to take turns using the test bed, and the lack of software tools for debugging in a real-time environment. By virtue of its architecture, most of the flight software of the type in question is amenable to development and testing on workstations, for which there is an abundance of commercially available debugging and analysis software tools. Unfortunately, the timing of a workstation differs from that of a target CPU in a test bed. VRT, in conjunction with closed-loop simulation software, provides a capability for executing embedded flight software on a workstation in a close-to-real-time environment. A scale factor is used to convert between execution time in VRT on a workstation and execution on a target CPU. VRT includes high-resolution operating- system timers that enable the synchronization of flight software with simulation software and ground software, all running on different workstations.
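The scale-factor idea can be illustrated with a minimal clock wrapper. The class name, interface, and the direction of scaling are assumptions for illustration; the abstract does not give VRT's actual calibration details:

```python
import time

class ScaledClock:
    """Maps wall-clock time on a workstation to simulated target-CPU time.

    A scale factor > 1 models a target CPU slower than the workstation:
    one wall-clock second corresponds to more than one second of
    simulated target execution time. Illustrative sketch only.
    """
    def __init__(self, scale, start=None):
        self.scale = scale
        self.start = time.monotonic() if start is None else start

    def target_time(self, wall_time=None):
        """Elapsed simulated target time for a given wall-clock reading."""
        if wall_time is None:
            wall_time = time.monotonic()
        return (wall_time - self.start) * self.scale
```

Synchronizing flight, simulation, and ground software then amounts to all parties reading the same scaled timebase.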

  14. Ground Software Maintenance Facility (GSMF) user's manual. Appendices NASA-CR-178806 NAS 1.26:178806 Rept-41849-G159-026-App HC A05/MF A01

    NASA Technical Reports Server (NTRS)

    Aquila, V.; Derrig, D.; Griffith, G.

    1986-01-01

    Procedures are presented that allow the user to assemble tasks, link, compile, backup the system, generate/establish/print display pages, cancel tasks in memory, and to TET an assembly task without having to enter the commands every time. A list of acronyms is provided. Software identification, payload checkout unit operating system services, data base generation, and MITRA operating procedures are also discussed.

  15. Performance evaluation of different types of particle representation procedures of Particle Swarm Optimization in Job-shop Scheduling Problems

    NASA Astrophysics Data System (ADS)

    Izah Anuar, Nurul; Saptari, Adi

    2016-02-01

    This paper addresses the types of particle representation (encoding) procedures in a population-based stochastic optimization technique for solving scheduling problems known in the job-shop manufacturing environment. It evaluates and compares the performance of different particle representation procedures in Particle Swarm Optimization (PSO) in the case of solving Job-shop Scheduling Problems (JSP). Particle representation procedures refer to the mapping between the particle position in PSO and the scheduling solution in JSP. This mapping is an important step, carried out so that each particle in PSO can represent a schedule in JSP. Three procedures, namely Operation and Particle Position Sequence (OPPS), random keys representation, and a random-key encoding scheme, are used in this study. These procedures have been tested on the FT06 and FT10 benchmark problems available in the OR-Library, where the objective function is to minimize the makespan, using MATLAB software. Based on the experimental results, it is discovered that OPPS gives the best performance in solving both benchmark problems. The contribution of this paper is that it demonstrates to practitioners involved in complex scheduling problems that different particle representation procedures can have significant effects on the performance of PSO in solving JSP.
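The core of the random-key idea, mapping a particle's continuous position vector onto a discrete sequence, can be sketched generically. This illustrates random-key decoding in general, not the paper's exact OPPS procedure:

```python
def decode_random_keys(position):
    """Decode a continuous particle position into a permutation.

    Each dimension carries a 'key'; sorting the keys in ascending order
    yields the operation/job sequence. Ties are broken by index so the
    decoding is deterministic.
    """
    return sorted(range(len(position)), key=lambda i: (position[i], i))
```

Because any real-valued vector decodes to a valid permutation, standard PSO velocity updates can be applied unchanged in the continuous space.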

  16. A.I.-based real-time support for high performance aircraft operations

    NASA Technical Reports Server (NTRS)

    Vidal, J. J.

    1985-01-01

    Artificial intelligence (AI) based software and hardware concepts are applied to the handling of system malfunctions during flight tests. A representation of malfunction procedure logic using Boolean normal forms is presented. The representation facilitates the automation of malfunction procedures and provides easy testing of the embedded rules. It also forms a potential basis for a parallel implementation in logic hardware. The extraction of logic control rules from dynamic simulation and their adaptive revision after partial failure are examined. A simplified two-dimensional aircraft model is used with a controller that adaptively extracts control rules for directional thrust to satisfy a navigational goal without exceeding pre-established position and velocity limits. Failure recovery (rule adjustment) is examined after partial actuator failure. While this experiment was performed with primitive aircraft and mission models, it illustrates an important paradigm and provides complexity extrapolations for the proposed extraction of expertise from simulation. The use of relaxation and inexact reasoning in expert systems was also investigated.
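Encoding malfunction-procedure logic in a Boolean normal form can be sketched as follows; the symptom names and the example rule are invented for illustration and do not come from the paper:

```python
def dnf_eval(rule, facts):
    """Evaluate a rule expressed in disjunctive normal form.

    `rule` is a list of conjunctive clauses; each clause is a list of
    (symptom_name, required_value) pairs. The rule fires if any clause
    has all of its literals satisfied by the `facts` dictionary.
    """
    return any(all(facts.get(name) == value for name, value in clause)
               for clause in rule)

# Hypothetical malfunction rule: the "engine fire" procedure applies if
# (fire_warning AND high_egt) OR (fire_warning AND fuel_flow_drop).
engine_fire = [
    [("fire_warning", True), ("high_egt", True)],
    [("fire_warning", True), ("fuel_flow_drop", True)],
]
```

Because each clause is independent, such a form maps naturally onto parallel evaluation, which is the hardware implementation the abstract alludes to.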

  17. A survey on the effectiveness of using GeoGebra software towards lecturers' conceptual knowledge and procedural mathematics

    NASA Astrophysics Data System (ADS)

    Wan Salleh, Masturah; Sulaiman, Hajar

    2013-04-01

    The use of technology in the teaching of mathematics at the university level was introduced long ago, but many lecturers, especially those who have taught for many years, still opt for the traditional teaching method of the lecture talk. One reason is that the lecturers themselves were not exposed to the available technologies and how these can assist in the teaching and learning (T&L) procedures in mathematics. GeoGebra is open and free mathematical software that has only recently been introduced in Malaysia. Compared with the software Cabri Geometry and Geometer's Sketchpad (GSP), which focus only on geometry, GeoGebra is able to connect geometry, algebra and numerical representation. Realizing this, the researchers conducted a study to expose university lecturers to the use of GeoGebra in T&L. The research was carried out on mathematics lecturers at the Department of Computer Science and Mathematics (JSKM), Universiti Teknologi Mara (UiTM), Penang. The objective of this study is to determine whether exposure to the GeoGebra software can affect the conceptual knowledge and procedural teaching of mathematics at the university level. This study combines descriptive and qualitative methods. One session was conducted as an open workshop for all 45 lecturers; from that total, four were selected as a sample using a simple random sampling method. The study used materials in the form of modules during the workshop. In terms of conceptual knowledge, the results showed that the GeoGebra software is appropriate, relevant and highly effective for in-depth understanding of the selected topics. In terms of the procedural aspects of teaching, the software can serve as a teaching aid and considerably facilitates the lecturers' work.

  18. A Genuine TEAM Player

    NASA Technical Reports Server (NTRS)

    2001-01-01

    Qualtech Systems, Inc. developed a complete software system with capabilities of multisignal modeling, diagnostic analysis, run-time diagnostic operations, and intelligent interactive reasoners. Commercially available as the TEAMS (Testability Engineering and Maintenance System) tool set, the software can be used to reveal unanticipated system failures. The TEAMS software package is broken down into four companion tools: TEAMS-RT, TEAMATE, TEAMS-KB, and TEAMS-RDS. TEAMS-RT identifies good, bad, and suspect components in the system in real-time. It reports system health results from onboard tests, and detects and isolates failures within the system, allowing for rapid fault isolation. TEAMATE takes over from where TEAMS-RT left off by intelligently guiding the maintenance technician through the troubleshooting procedure, repair actions, and operational checkout. TEAMS-KB serves as a model management and collection tool. TEAMS-RDS (TEAMS-Remote Diagnostic Server) has the ability to continuously assess a system and isolate any failure in that system or its components, in real time. RDS incorporates TEAMS-RT, TEAMATE, and TEAMS-KB in a large-scale server architecture capable of providing advanced diagnostic and maintenance functions over a network, such as the Internet, with a web browser user interface.

  19. WRAP low level waste restricted waste management (LLW RWM) glovebox acceptance test report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leist, K.J.

    1997-11-24

    On April 22, 1997, the Low Level Waste Restricted Waste Management (LLW RWM) glovebox was tested using acceptance test procedure 13027A-87. Mr. Robert L. Warmenhoven served as test director, Mr. Kendrick Leist acted as test operator and test witness, and Michael Lane provided miscellaneous software support. The primary focus of the glovebox acceptance test was to examine glovebox control system interlocks, Operator Interface Unit (OIU) menus, alarms, and messages. Basic drum port and lift table control sequences were demonstrated. OIU menus, messages, and alarm sequences were examined, with few exceptions noted. Barcode testing was bypassed, due to the lack of installed equipment as well as the switch from basic reliance on fixed bar code readers to the enhanced use of portable bar code readers. Bar code testing was completed during performance of the LLW RWM OTP. Mechanical and control deficiencies were documented as Test Exceptions during performance of this Acceptance Test. These items are attached as Appendix A to this report.

  20. Submarine pipeline on-bottom stability. Volume 2: Software and manuals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1998-12-01

    The state-of-the-art in pipeline stability design has been changing very rapidly in recent years. The physics governing on-bottom stability are much better understood now than they were eight years ago, largely because of research and large-scale model tests sponsored by PRCI. Analysis tools utilizing this new knowledge have been developed. These tools provide the design engineer with a rational approach for weight coating design, which can be used with confidence because the tools have been developed based on full-scale and near-full-scale model tests. These tools represent the state-of-the-art in stability design and model the complex behavior of pipes subjected to both wave and current loads. This includes: hydrodynamic forces which account for the effect of the wake (generated by flow over the pipe) washing back and forth over the pipe in oscillatory flow; and the embedment (digging) which occurs as a pipe resting on the seabed is exposed to oscillatory loadings and small oscillatory deflections. This report has been developed as a reference handbook for use in on-bottom pipeline stability analysis. It consists of two volumes. Volume one is devoted to descriptions of the various aspects of the problem: the pipeline design process; ocean physics, wave mechanics, hydrodynamic forces, and meteorological data determination; geotechnical data collection and soil mechanics; and stability design procedures. Volume two describes, lists, and illustrates the analysis software. Diskettes containing the software and examples of its use are also included in Volume two.

  1. AirSTAR Hardware and Software Design for Beyond Visual Range Flight Research

    NASA Technical Reports Server (NTRS)

    Laughter, Sean; Cox, David

    2016-01-01

    The National Aeronautics and Space Administration (NASA) Airborne Subscale Transport Aircraft Research (AirSTAR) Unmanned Aerial System (UAS) is a facility developed to study the flight dynamics of vehicles in emergency conditions, in support of aviation safety research. The system was upgraded to have its operational range significantly expanded, going beyond the line of sight of a ground-based pilot. A redesign of the airborne flight hardware was undertaken, as well as significant changes to the software base, in order to provide appropriate autonomous behavior in response to a number of potential failures and hazards. Ground hardware and system monitors were also upgraded to include redundant communication links, including ADS-B based position displays and an independent flight termination system. The design included both custom and commercially available avionics, combined to allow flexibility in flight experiment design while still benefiting from tested configurations in reversionary flight modes. A similar hierarchy was employed in the software architecture, to allow research codes to be tested, with a fallback to more thoroughly validated flight controls. As a remotely piloted facility, ground systems were also developed to ensure the flight modes and system state were communicated to ground operations personnel in real-time. Presented in this paper is a general overview of the concept of operations for beyond visual range flight, and a detailed review of the airborne hardware and software design. This discussion is held in the context of the safety and procedural requirements that drove many of the design decisions for the AirSTAR UAS Beyond Visual Range capability.

  2. Interaction Models for Functional Regression.

    PubMed

    Usset, Joseph; Staicu, Ana-Maria; Maity, Arnab

    2016-02-01

    A functional regression model with a scalar response and multiple functional predictors is proposed that accommodates two-way interactions in addition to their main effects. The proposed estimation procedure models the main effects using penalized regression splines, and the interaction effect by a tensor product basis. Extensions to generalized linear models and data observed on sparse grids or with measurement error are presented. A hypothesis testing procedure for the functional interaction effect is described. The proposed method can be easily implemented through existing software. Numerical studies show that fitting an additive model in the presence of interaction leads to both poor estimation performance and lost prediction power, while fitting an interaction model where there is in fact no interaction leads to negligible losses. The methodology is illustrated on the AneuRisk65 study data.
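The tensor-product construction for the interaction term can be illustrated generically: given the evaluations of two univariate bases at one observation, the interaction features are all pairwise products. This is a schematic sketch of the standard construction, not the authors' implementation:

```python
def tensor_product_features(b1, b2):
    """Pairwise products of two univariate basis evaluations.

    b1 and b2 are lists of basis-function values at a single
    observation; the result is the flattened outer (tensor) product,
    which spans the two-way interaction surface.
    """
    return [u * v for u in b1 for v in b2]
```

Stacking these rows over observations yields the design-matrix block for the interaction effect, to which a quadratic penalty can be applied as with the main-effect splines.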

  3. Simulation of Physical Experiments in Immersive Virtual Environments

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Wasfy, Tamer M.

    2001-01-01

    An object-oriented, event-driven immersive virtual environment is described for the creation of virtual labs (VLs) for simulating physical experiments. Discussion focuses on a number of aspects of the VLs, including interface devices, software objects, and various applications. The VLs interface with output devices, including immersive stereoscopic screen(s) and stereo speakers, and a variety of input devices, including body tracking (head and hands), haptic gloves, wand, joystick, mouse, microphone, and keyboard. The VL incorporates the following types of primitive software objects: interface objects, support objects, geometric entities, and finite elements. Each object encapsulates a set of properties, methods, and events that define its behavior, appearance, and functions. A container object allows grouping of several objects. Applications of the VLs include viewing the results of the physical experiment, viewing a computer simulation of the physical experiment, simulation of the experiment's procedure, computational steering, and remote control of the physical experiment. In addition, the VL can be used as a risk-free (safe) environment for training. The implementation of virtual structures testing machines, virtual wind tunnels, and a virtual acoustic testing facility is described.

  4. Modular Rocket Engine Control Software (MRECS)

    NASA Technical Reports Server (NTRS)

    Tarrant, C.; Crook, J.

    1998-01-01

    The Modular Rocket Engine Control Software (MRECS) Program is a technology demonstration effort designed to advance the state-of-the-art in launch vehicle propulsion systems. Its emphasis is on developing and demonstrating a modular software architecture for advanced engine control systems that will result in lower software maintenance (operations) costs. It effectively accommodates software requirement changes that occur due to hardware technology upgrades and engine development testing. Ground rules directed by MSFC were to optimize modularity and to implement the software in the Ada programming language. The MRECS system software and the software development environment utilize Commercial-Off-the-Shelf (COTS) products. This paper presents the objectives, benefits, and status of the program. The software architecture, design, and development environment are described. MRECS tasks are defined and timing relationships given. Major accomplishments are listed. MRECS offers benefits to a wide variety of advanced technology programs in the areas of modular software architecture, software reuse, and reduced software reverification time for software changes. MRECS was recently modified to support a Space Shuttle Main Engine (SSME) hot-fire test. Cold Flow and Flight Readiness Testing were completed before the test was cancelled. Currently, the program is focused on supporting NASA MSFC in accomplishing development testing of the Fastrac Engine, part of NASA's Low Cost Technologies (LCT) Program. MRECS will be used for all engine development testing.

  5. Radiofrequency-Based Identification Medical Device: An Evaluable Solution for Surgical Sponge Retrieval?

    PubMed

    Lazzaro, Alessandra; Corona, Arianna; Iezzi, Luca; Quaresima, Silvia; Armisi, Luca; Piccolo, Ilaria; Medaglia, Carlo Maria; Sbrenni, Sergio; Sileri, Pierpaolo; Rosato, Nicola; Gaspari, Achille Lucio; Di Lorenzo, Nicola

    2017-06-01

    A retained surgical item (gossypiboma) is a persisting problem, despite consistent improvements and existing guidelines for counting instruments and sponges. Previous experience with radiofrequency identification (RFID) technology for tracking sponges shows that it could represent an innovation that reduces critical incidents and increases effectiveness during surgical procedures. We present an automated system that allows reduction of errors and improves safety in the operating room. The system consists of 3 antennas, surgical sponges containing RFID tags, and dedicated software applications, with Wi-Fi real-time communication between devices. The first antenna provides the initial count of gauzes; the second provides a real-time count during surgery, including the sponges thrown into the kick-bucket; and the third can be used in the event of an uneven sponge count. The software allows management of all stages of the process. In vitro and in vivo tests were performed: the system provided excellent results in detecting sponges in the patients' bodies. One hundred percent of retained sponges were detected correctly, even when they were overlapped. No false positives or false negatives were recorded. The counting procedure turned out to be more streamlined and efficient, and it could save time in a standard procedure. The RFID system for sponge tracking was shown experimentally to be a reliable and feasible method to track sponges with full detection accuracy in the operating room. The results indicate the system to be safe and effective, with acceptable cost-effectiveness.
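The three-antenna counting workflow can be sketched as simple set bookkeeping; the class, station roles, and tag IDs below are illustrative, not the actual device protocol.

```python
class SpongeTracker:
    """Set bookkeeping for the three-antenna sponge count (illustrative)."""

    def __init__(self):
        self.registered = set()   # antenna 1: initial count before surgery
        self.discarded = set()    # antenna 2: sponges read in the kick-bucket

    def register(self, tag_id):
        self.registered.add(tag_id)

    def discard(self, tag_id):
        if tag_id not in self.registered:
            raise ValueError(f"unregistered tag {tag_id!r}")
        self.discarded.add(tag_id)

    def unaccounted(self):
        """Antenna 3 use case: resolve an uneven final count."""
        return self.registered - self.discarded

tracker = SpongeTracker()
for tag in ("S01", "S02", "S03"):
    tracker.register(tag)
tracker.discard("S01")
tracker.discard("S03")
print(tracker.unaccounted())  # → {'S02'}
```

The final set difference is what distinguishes a correct count from a potential gossypiboma: any tag still in it must be located before closure.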

  6. Modeling and Simulation of Shuttle Launch and Range Operations

    NASA Technical Reports Server (NTRS)

    Bardina, Jorge; Thirumalainambi, Rajkumar

    2004-01-01

    The simulation and modeling test bed is based on a mockup of a space flight operations control room suitable for experimenting with the physical, procedural, software, hardware, and psychological aspects of space flight operations. The test bed includes a weather expert system to advise on the effect of weather on launch operations. It also simulates a toxic gas dispersion model, the resulting human health risk, and a debris dispersion model with 3D visualization. Since all modeling and simulation is web-based, it could reduce the cost of launch and range safety operations by enabling extensive analysis before a particular launch. Each model has an independent decision-making module to derive the best decision for launch.

  7. A shift from significance test to hypothesis test through power analysis in medical research.

    PubMed

    Singh, G

    2006-01-01

    Until recently, the medical research literature exhibited a substantial dominance of Fisher's significance test approach to statistical inference, which concentrates on the probability of type I error, over the Neyman-Pearson hypothesis test approach, which considers the probabilities of both type I and type II errors. Fisher's approach dichotomises results into significant or non-significant with a P value. The Neyman-Pearson approach speaks of acceptance or rejection of the null hypothesis. Based on the same underlying theory, the two approaches address the same objective and reach conclusions in their own ways. Advances in computing techniques and the availability of statistical software have resulted in the increasing application of power calculations in medical research, and thereby in reporting the results of significance tests in the light of the power of the test as well. The significance test approach, when it incorporates power analysis, contains the essence of the hypothesis test approach. It may be safely argued that the rising application of power analysis in medical research may have initiated a shift from Fisher's significance test to the Neyman-Pearson hypothesis test procedure.
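As a concrete illustration of the power calculations the abstract describes, the sketch below computes the approximate power of a two-sided, two-sample test for a given standardized effect size. The normal (z-test) approximation and the fixed alpha of 0.05 are simplifying assumptions.

```python
from math import erf, sqrt

def normal_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def two_sample_power(effect_size, n_per_group):
    """Approximate power of a two-sided, two-sample z-test at alpha = 0.05.
    effect_size is Cohen's d: the mean difference in units of the common SD."""
    z_crit = 1.959963985          # Phi^-1(1 - 0.05/2)
    noncentrality = effect_size * sqrt(n_per_group / 2.0)
    return normal_cdf(noncentrality - z_crit)

# A medium effect (d = 0.5) with 64 subjects per group:
print(round(two_sample_power(0.5, 64), 2))  # ≈ 0.81
```

Read in the Neyman-Pearson direction, the same formula can be inverted to choose the sample size that achieves a target power (commonly 0.8 or 0.9) before the study is run.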

  8. Modeling Student Software Testing Processes: Attitudes, Behaviors, Interventions, and Their Effects

    ERIC Educational Resources Information Center

    Buffardi, Kevin John

    2014-01-01

    Effective software testing identifies potential bugs and helps correct them, producing more reliable and maintainable software. As software development processes have evolved, incremental testing techniques have grown in popularity, particularly with introduction of test-driven development (TDD). However, many programmers struggle to adopt TDD's…

  9. Prototype Common Bus Spacecraft: Hover Test Implementation and Results. Revision, Feb. 26, 2009

    NASA Technical Reports Server (NTRS)

    Hine, Butler Preston; Turner, Mark; Marshall, William S.

    2009-01-01

    In order to develop the capability to evaluate control system technologies, NASA Ames Research Center (Ames) began a test program to build a Hover Test Vehicle (HTV) - a ground-based simulated flight vehicle. The HTV would integrate simulated propulsion, avionics, and sensors into a simulated flight structure, and fly that test vehicle in terrestrial conditions intended to simulate a flight environment, in particular for attitude control. The ultimate purpose of the effort at Ames is to determine whether the low-cost hardware and flight software techniques are viable for future low cost missions. To enable these engineering goals, the project sought to develop a team, processes and procedures capable of developing, building and operating a fully functioning vehicle including propulsion, GN&C, structure, power and diagnostic sub-systems, through the development of the simulated vehicle.

  10. Modeling the Rapid Boil-Off of a Cryogenic Liquid When Injected into a Low Pressure Cavity

    NASA Technical Reports Server (NTRS)

    Lira, Eric

    2016-01-01

    Many launch vehicle cryogenic applications require modeling the injection of a cryogenic liquid into a low pressure cavity. The difficulty of such analyses lies in accurately predicting the heat transfer coefficient between the cold liquid and a warm wall in a low pressure environment. The heat transfer coefficient and the behavior of the liquid are highly dependent on the mass flow rate into the cavity, the cavity wall temperature, and the cavity volume. Testing was performed to correlate the modeling performed using the Thermal Desktop and SINDA/FLUINT Thermal and Fluids Analysis Software. This presentation describes a methodology to model the cryogenic process using SINDA/FLUINT, the cryogenic test setup, the test procedure, and how the model was correlated to match the test results.
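The dependence the abstract describes can be illustrated with a lumped energy balance, Q = hA(T_wall - T_sat) and boil-off rate = Q/h_fg. Every property value below (coefficient, area, temperatures, latent heat) is an assumed round number for illustration, not a value correlated against the test program.

```python
# Lumped-parameter sketch: a warm wall boiling off injected cryogenic liquid.
# The heat transfer coefficient h is the hard part in practice; here it is an
# assumed constant, not a correlated value from the test data.
h = 150.0        # W/m^2-K, assumed film-boiling coefficient
area = 0.05      # m^2, assumed wetted wall area
t_wall = 250.0   # K, cavity wall temperature
t_sat = 90.0     # K, saturation temperature of the liquid at cavity pressure
h_fg = 213e3     # J/kg, latent heat of vaporization (roughly liquid oxygen)

q = h * area * (t_wall - t_sat)   # heat rate into the liquid, W
boil_off = q / h_fg               # steady boil-off rate, kg/s
print(f"{boil_off * 1000:.2f} g/s")  # → 5.63 g/s
```

In the actual analysis, h would be a correlation (itself a function of wall superheat and pressure) iterated inside the thermal network rather than a constant, which is precisely what the test program was designed to anchor.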

  11. Educational Software Acquisition for Microcomputers.

    ERIC Educational Resources Information Center

    Erikson, Warren; Turban, Efraim

    1985-01-01

    Examination of issues involved in acquiring appropriate microcomputer software for higher education focuses on the following points: developing your own software; finding commercially available software; using published evaluations; pre-purchase testing; customizing and adapting commercial software; post-purchase testing; and software use. A…

  12. Development Of Ultrasonic Testing Based On Delphi Program As A Learning Media In The Welding Material Study Of Detection Of Welding Defects In The Environment Of Vocational Education

    NASA Astrophysics Data System (ADS)

    Dwi Cahyono, Bagus; Ainur, Chandra

    2018-04-01

    The development of science and technology has a direct impact on the preparation of qualified workers, including the preparation of vocational high school graduates. Law Number 20 of the Year 2003 on the National Education System explains that the purpose of vocational education is to prepare learners to be ready to work in certain fields. One of the learning materials in vocational high school is welding and the detection of welding defects. Teaching welding and the detection of welding defects by ultrasonic testing is very difficult when relying on books alone. Therefore this study aims to implement ultrasonic testing in a computer system, called the Delphi Program-based Ultrasonic Testing Expert System. The system determines the classification and type of a welding defect from a known weld-defect indication. The system also provides a brief explanation of ultrasonic testing and of its calibration and inspection procedures. Ultrasonic testing input data showing defects are entered into the computer manually. The system is built using the Delphi 7 software and the Inno Setup Compiler as an installer. The method used in this research is Research and Development (R&D), with the following stages: (1) preliminary research; (2) software design; (3) materials collection; (4) early product development; (5) validation by instructional media experts; (6) product analysis and revision; (7) media trials in learning; and (8) production of the final instructional media.
The results of the research show that: (1) according to an ultrasonic testing expert, the system is feasible for use as an instructional medium in welding and welding-defect detection courses in vocational education, because it contains a detailed explanation of the ultrasonic method for detecting welding defects; (2) according to media experts, the system has a very attractive visual design, is user friendly, is compatible with Windows and Linux, and has a modest installation size; and (3) tests using weld-defect indication data from PT PAL Surabaya produced defect classifications consistent with the calculated welding-defect classification.
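The defect-classification step of such an expert system can be sketched as a small rule base mapping indication features to defect classes. The features, thresholds, and labels below are invented placeholders, not the actual rules of the Delphi application.

```python
# Toy rule base in the spirit of the expert system described above. The
# indication features, thresholds, and labels are invented placeholders.
RULES = [
    (lambda ind: ind["echo_shape"] == "sharp" and ind["travels_with_probe"],
     "crack"),
    (lambda ind: ind["echo_shape"] == "broad" and ind["depth_mm"] < 2.0,
     "porosity"),
    (lambda ind: ind["echo_shape"] == "broad", "slag inclusion"),
]

def classify(indication):
    """Return the first matching defect class for an indication."""
    for predicate, label in RULES:
        if predicate(indication):
            return label
    return "unclassified"

ind = {"echo_shape": "sharp", "travels_with_probe": True, "depth_mm": 5.0}
print(classify(ind))  # → crack
```

An ordered rule list like this mirrors how a forward-chaining expert system resolves overlapping conditions: the more specific rules are placed before the general fallback.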

  13. Design and results of the pretest of the IDEFICS study.

    PubMed

    Suling, M; Hebestreit, A; Peplies, J; Bammann, K; Nappo, A; Eiben, G; Alvira, J M Fernández; Verbestel, V; Kovács, E; Pitsiladis, Y P; Veidebaum, T; Hadjigeorgiou, C; Knof, K; Ahrens, W

    2011-04-01

    During the preparatory phase of the baseline survey of the IDEFICS (Identification and prevention of dietary- and lifestyle-induced health effects in children and infants) study, standardised survey procedures including instruments, examinations, methods, biological sampling and software tools were developed and pretested for their feasibility, robustness and acceptability. A pretest of the full survey procedures was conducted in 119 children aged 2-9 years in nine European survey centres (N(per centre)=4-27, mean 13.22). Novel techniques such as ultrasound measurements to assess subcutaneous fat and bone health, heart rate monitors combined with accelerometers, and sensory taste perception tests were used. Biological sampling, physical examinations, sensory taste perception tests, the parental questionnaire and the medical interview required only minor amendments, whereas the physical fitness tests required major adaptations. Callipers for skinfold measurements were favoured over ultrasonography, as the latter showed only low-to-modest agreement with calliper measurements (correlation coefficients of r=-0.22 and r=0.67 across all children). The combination of accelerometers with heart rate monitors was feasible in school children only. Implementation of the computer-based 24-h dietary recall required a complex and intensive development stage. It was combined with the assessment of school meals, which was changed after the pretest from portion weighing to the more feasible observation of the consumed portion size per child. The inclusion of heel ultrasonometry as an indicator of bone stiffness was the most important amendment after the pretest. The feasibility and acceptability of all procedures had to be balanced against their scientific value.
Extensive pretesting, training and subsequent refinement of the methods were necessary to assess the feasibility of all instruments and procedures in routine fieldwork and to exchange or modify procedures that would otherwise give invalid or misleading results.

  14. The integration of the risk management process with the lifecycle of medical device software.

    PubMed

    Pecoraro, F; Luzi, D

    2014-01-01

    The application of software in the Medical Device (MD) domain has become central to the improvement of diagnoses and treatments. The new European regulations that specifically address software as an important component of MDs require complex procedures to make software compliant with safety requirements, thereby introducing new challenges in the qualification and classification of MD software as well as in the performance of risk management activities. From this perspective, the aim of this paper is to propose an integrated framework that combines the activities to be carried out by the manufacturer to develop safe software within the development lifecycle, based on the regulatory requirements reported in US and European regulations as well as in the relevant standards and guidelines. A comparative analysis was carried out to identify the main issues related to the application of the current new regulations. In addition, standards and guidelines recently released to harmonise procedures for the validation of MD software have been used to define the risk management activities to be carried out by the manufacturer during the software development process. This paper highlights the main issues related to the qualification and classification of MD software, providing an analysis of the different regulations applied in Europe and the US. A model that integrates the risk management process within the software development lifecycle is also proposed. It is based on regulatory requirements and considers software risk analysis as a central input to be managed by the manufacturer from the initial stages of software design, in order to prevent MD failures. Relevant changes in the process of MD development have been introduced with the recognition, in regulations and standards, of software as an important component of MDs.
This implies the performance of highly iterative processes that have to integrate the risk management in the framework of software development. It also makes it necessary to involve both medical and software engineering competences to safeguard patient and user safety.

  15. Healthcare software assurance.

    PubMed

    Cooper, Jason G; Pauley, Keith A

    2006-01-01

    Software assurance is a rigorous, lifecycle phase-independent set of activities which ensure the completeness, safety, and reliability of software processes and products. This is accomplished by guaranteeing conformance to all requirements, standards, procedures, and regulations. These assurance processes are even more important when coupled with healthcare software systems, embedded software in medical instrumentation, and other healthcare-oriented life-critical systems. The current Food and Drug Administration (FDA) regulatory requirements and guidance documentation do not address certain aspects of complete software assurance activities. In addition, the FDA's software oversight processes require enhancement to include increasingly complex healthcare systems such as Hospital Information Systems (HIS). The importance of complete software assurance is introduced, current regulatory requirements and guidance are discussed, and the necessity for enhancements to the current processes is highlighted.

  16. Healthcare Software Assurance

    PubMed Central

    Cooper, Jason G.; Pauley, Keith A.

    2006-01-01

    Software assurance is a rigorous, lifecycle phase-independent set of activities which ensure the completeness, safety, and reliability of software processes and products. This is accomplished by guaranteeing conformance to all requirements, standards, procedures, and regulations. These assurance processes are even more important when coupled with healthcare software systems, embedded software in medical instrumentation, and other healthcare-oriented life-critical systems. The current Food and Drug Administration (FDA) regulatory requirements and guidance documentation do not address certain aspects of complete software assurance activities. In addition, the FDA's software oversight processes require enhancement to include increasingly complex healthcare systems such as Hospital Information Systems (HIS). The importance of complete software assurance is introduced, current regulatory requirements and guidance are discussed, and the necessity for enhancements to the current processes is highlighted. PMID:17238324

  17. An empirical study of software design practices

    NASA Technical Reports Server (NTRS)

    Card, David N.; Church, Victor E.; Agresti, William W.

    1986-01-01

    Software engineers have developed a large body of software design theory and folklore, much of which was never validated. The results of an empirical study of software design practices in one specific environment are presented. The practices examined affect module size, module strength, data coupling, descendant span, unreferenced variables, and software reuse. Measures characteristic of these practices were extracted from 887 FORTRAN modules developed for five flight dynamics software projects monitored by the Software Engineering Laboratory (SEL). The relationship of these measures to cost and fault rate was analyzed using a contingency table procedure. The results show that some recommended design practices, despite their intuitive appeal, are ineffective in this environment, whereas others are very effective.
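The contingency table procedure mentioned above can be illustrated with a hand-computed chi-square statistic; the module counts below are invented, not the SEL data.

```python
# Chi-square test of association between following a design practice (rows)
# and module fault rate (columns). The counts are invented, not the SEL data.
table = [[30, 10],   # practice followed: low-fault, high-fault modules
         [10, 30]]   # practice ignored:  low-fault, high-fault modules

row = [sum(r) for r in table]            # row totals
col = [sum(c) for c in zip(*table)]      # column totals
n = sum(row)

chi2 = 0.0
for i in range(2):
    for j in range(2):
        expected = row[i] * col[j] / n   # expected count under independence
        chi2 += (table[i][j] - expected) ** 2 / expected

print(chi2)  # → 20.0, well above 6.63, the 1% critical value for 1 d.f.
```

A statistic this far above the critical value would mark the practice as effective in this environment; a table with near-equal cells would mark it ineffective despite its intuitive appeal, which is exactly the kind of result the study reports.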

  18. Technical Report: Benchmarking for Quasispecies Abundance Inference with Confidence Intervals from Metagenomic Sequence Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McLoughlin, K.

    2016-01-22

    The software application “MetaQuant” was developed by our group at Lawrence Livermore National Laboratory (LLNL). It is designed to profile microbial populations in a sample using data from whole-genome shotgun (WGS) metagenomic DNA sequencing. Several other metagenomic profiling applications have been described in the literature. We ran a series of benchmark tests to compare the performance of MetaQuant against that of a few existing profiling tools, using real and simulated sequence datasets. This report describes our benchmarking procedure and results.
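One common way such a benchmark scores a profiling tool against a simulated community of known composition is the L1 distance between the true and estimated abundance profiles; the taxa and values below are illustrative, not from the MetaQuant benchmark itself.

```python
# L1 distance between a known ("truth") abundance profile and a tool's
# estimate, one simple benchmarking score. Taxa and values are illustrative.
truth    = {"E. coli": 0.50, "B. subtilis": 0.30, "S. aureus": 0.20}
estimate = {"E. coli": 0.55, "B. subtilis": 0.25, "S. aureus": 0.15,
            "P. putida": 0.05}  # one false-positive call

taxa = set(truth) | set(estimate)
l1 = sum(abs(truth.get(t, 0.0) - estimate.get(t, 0.0)) for t in taxa)
print(round(l1, 2))  # → 0.2
```

Taking the union of taxa makes the metric penalize both missed organisms and false-positive calls, the two failure modes a profiler comparison needs to expose.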

  19. An expert system prototype for aiding in the development of software functional requirements for NASA Goddard's command management system: A case study and lessons learned

    NASA Technical Reports Server (NTRS)

    Liebowitz, Jay

    1986-01-01

    At NASA Goddard, the role of the command management system (CMS) is to transform general requests for spacecraft opeerations into detailed operational plans to be uplinked to the spacecraft. The CMS is part of the NASA Data System which entails the downlink of science and engineering data from NASA near-earth satellites to the user, and the uplink of command and control data to the spacecraft. Presently, it takes one to three years, with meetings once or twice a week, to determine functional requirements for CMS software design. As an alternative approach to the present technique of developing CMS software functional requirements, an expert system prototype was developed to aid in this function. Specifically, the knowledge base was formulated through interactions with domain experts, and was then linked to an existing expert system application generator called 'Knowledge Engineering System (Version 1.3).' Knowledge base development focused on four major steps: (1) develop the problem-oriented attribute hierachy; (2) determine the knowledge management approach; (3) encode the knowledge base; and (4) validate, test, certify, and evaluate the knowledge base and the expert system prototype as a whole. Backcasting was accomplished for validating and testing the expert system prototype. Knowledge refinement, evaluation, and implementation procedures of the expert system prototype were then transacted.

  20. Automatic segmentation software in locally advanced rectal cancer: READY (REsearch program in Auto Delineation sYstem)-RECTAL 02: prospective study.

    PubMed

    Gambacorta, Maria A; Boldrini, Luca; Valentini, Chiara; Dinapoli, Nicola; Mattiucci, Gian C; Chiloiro, Giuditta; Pasini, Danilo; Manfrida, Stefania; Caria, Nicola; Minsky, Bruce D; Valentini, Vincenzo

    2016-07-05

    To validate autocontouring software (AS) in clinical practice, including a two-step delineation quality assurance (QA) procedure. The existing delineation agreement among experts for rectal cancer, and the overlap and time criteria that have to be verified to allow the use of AS, were defined. Median Dice Similarity Coefficient (MDSC), Mean Slicewise Hausdorff Distance (MSHD) and Total Time saving (TT) were analyzed. Two expert Radiation Oncologists reviewed CT scans of 44 patients and agreed the reference CTV: the first 14 consecutive cases were used to populate the software Atlas and 30 were used as tests. Each expert performed a manual (group A) and an automatic delineation (group B) of 15 test patients. The delineations were compared with the reference contours. The overlap between the manual and automatic delineations, measured with MDSC and MSHD, and the TT were analyzed. Three acceptance criteria were set: MDSC ≥ 0.75, MSHD ≤ 1 mm and TT saving ≥ 50%. At least two criteria had to be met, one of which had to be TT saving, to validate the system. The MDSC was 0.75, MSHD 2.00 mm and the TT saving 55.5% between group A and group B. MDSC among experts was 0.84. Autosegmentation systems in rectal cancer partially met the acceptability criteria with the present version.
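The two overlap metrics used as acceptance criteria can be sketched directly; the toy single-slice "contours" below are voxel coordinate sets chosen for illustration, not clinical data.

```python
from math import hypot

def dice(a, b):
    """Dice Similarity Coefficient between two voxel sets."""
    return 2 * len(a & b) / (len(a) + len(b))

def hausdorff(a, b):
    """Symmetric Hausdorff distance between two 2D point sets (brute force)."""
    dist = lambda p, q: hypot(p[0] - q[0], p[1] - q[1])
    directed = lambda s, t: max(min(dist(p, q) for q in t) for p in s)
    return max(directed(a, b), directed(b, a))

# Toy single-slice contours as voxel coordinate sets, one shifted by 1 voxel:
manual = {(x, y) for x in range(10) for y in range(10)}
auto   = {(x, y) for x in range(1, 11) for y in range(10)}

print(dice(manual, auto))      # → 0.9
print(hausdorff(manual, auto)) # → 1.0
```

The study's MDSC and MSHD are the median and slicewise-mean of exactly these per-slice quantities, which is why a uniform 1-voxel shift can leave the Dice coefficient high while the Hausdorff distance flags the systematic offset.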

  1. Sample registration software for process automation in the Neutron Activation Analysis (NAA) Facility in Malaysia nuclear agency

    NASA Astrophysics Data System (ADS)

    Rahman, Nur Aira Abd; Yussup, Nolida; Salim, Nazaratul Ashifa Bt. Abdullah; Ibrahim, Maslina Bt. Mohd; Mokhtar, Mukhlis B.; Soh@Shaari, Syirrazie Bin Che; Azman, Azraf B.; Ismail, Nadiah Binti

    2015-04-01

    Neutron Activation Analysis (NAA) has been established at Nuclear Malaysia since the 1980s. Most of the established procedures, including sample registration, were performed manually. Samples were recorded manually in a logbook and given an ID number. All samples, standards, SRMs and blanks were then recorded on the irradiation vial and on several forms prior to irradiation. These manual procedures carried out by the NAA laboratory personnel were time consuming and inefficient. The sample registration software was developed as part of the IAEA/CRP project 'Development of Process Automation in the Neutron Activation Analysis (NAA) Facility in Malaysia Nuclear Agency (RC17399)'. The objective of the project is to create PC-based data entry software for the sample preparation stage. This is an effective way to replace the redundant manual data entries that previously had to be completed by laboratory personnel. The software automatically generates a sample code for each sample in a batch, creates printable registration forms for administrative purposes, and stores selected parameters that are passed to the sample analysis program. The software is developed using National Instruments LabVIEW 8.6.
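The automatic sample-code generation can be sketched as follows; the code format (year-batch-type-sequence) and the type abbreviations are assumptions for illustration, not the agency's actual numbering scheme.

```python
from datetime import date

def generate_sample_codes(batch_no, sample_types, today=None):
    """Generate one ID per vial in a batch. The format
    YYYY-Bnnn-TYPE-seq is an assumed scheme, not the agency's actual one."""
    today = today or date.today()
    return [f"{today.year}-B{batch_no:03d}-{stype}-{seq:03d}"
            for seq, stype in enumerate(sample_types, start=1)]

# Two samples, one standard reference material, one blank:
codes = generate_sample_codes(7, ["SMP", "SMP", "SRM", "BLK"],
                              today=date(2015, 4, 1))
for code in codes:
    print(code)
# 2015-B007-SMP-001
# 2015-B007-SMP-002
# 2015-B007-SRM-003
# 2015-B007-BLK-004
```

Deriving every ID from the batch number and sequence removes the transcription step between logbook, vial label, and analysis program that the manual procedure required.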

  2. Preoperative Planning of Orthopedic Procedures using Digitalized Software Systems.

    PubMed

    Steinberg, Ely L; Segev, Eitan; Drexler, Michael; Ben-Tov, Tomer; Nimrod, Snir

    2016-06-01

    The progression from standard celluloid films to digitalized technology led to the development of new software programs to fulfill the needs of preoperative planning. We describe here preoperative digitalized programs and the variety of conditions for which those programs can be used to facilitate preparation for surgery. A PubMed search using the keywords "digitalized software programs," "preoperative planning" and "total joint arthroplasty" was performed for all studies regarding preoperative planning of orthopedic procedures published in English from 1989 to 2014. Digitalized software programs can import and export all picture archiving and communication system (PACS) files (i.e., X-rays, computerized tomograms, magnetic resonance images) from either the local workstation or any remote PACS. Two-dimensional (2D) and 3D CT scans were found to be reliable tools with high preoperative accuracy in predicting implants. The short learning curve, user-friendly features, accurate prediction of implant size, decreased implant stocks and low-cost maintenance make digitalized software programs an attractive tool in the preoperative planning of total joint replacement, fracture fixation, limb deformity repair and pediatric skeletal disorders.

  3. How the NWC handles software as product

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vinson, D.

    1997-11-01

    This tutorial provides a hands-on view of how the Nuclear Weapons Complex project should be handling (or planning to handle) software as a product in response to Engineering Procedure 401099. The SQAS has published the document SQAS96-002, Guidelines for NWC Processes for Handling Software Product, that will be the basis for the tutorial. The primary scope of the tutorial is on software products that result from weapons and weapons-related projects, although the information presented is applicable to many software projects. Processes that involve the exchange, review, or evaluation of software product between or among NWC sites, DOE, and external customers will be described.

  4. Statistical Software and Artificial Intelligence: A Watershed in Applications Programming.

    ERIC Educational Resources Information Center

    Pickett, John C.

    1984-01-01

    AUTOBJ and AUTOBOX are revolutionary software programs which contain the first application of artificial intelligence to statistical procedures used in analysis of time series data. The artificial intelligence included in the programs and program features are discussed. (JN)

  5. The Associate Principal Astronomer for AI Management of Automatic Telescopes

    NASA Technical Reports Server (NTRS)

    Henry, Gregory W.

    1998-01-01

    This research program in scheduling and management of automatic telescopes had the following objectives: 1. To field test the 1993 Automatic Telescope Instruction Set (ATIS93) programming language, which was specifically developed to allow real-time control of an automatic telescope via an artificial intelligence scheduler running on a remote computer. 2. To develop and test the procedures for two-way communication between a telescope controller and a remote scheduler via the Internet. 3. To test various concepts in AI scheduling being developed at NASA Ames Research Center on an automatic telescope operated by Tennessee State University at the Fairborn Observatory site in southern Arizona. 4. To develop a prototype software package, dubbed the Associate Principal Astronomer, for the efficient scheduling and management of automatic telescopes.

  6. Factors That Affect Software Testability

    NASA Technical Reports Server (NTRS)

    Voas, Jeffrey M.

    1991-01-01

    Software faults that infrequently affect software's output are dangerous. When a software fault causes frequent software failures, testing is likely to reveal the fault before the software is released; when the fault remains undetected during testing, it can cause disaster after the software is installed. A technique for predicting whether a particular piece of software is likely to reveal faults within itself during testing is found in [Voas91b]. A piece of software that is likely to reveal faults within itself during testing is said to have high testability; one that is not is said to have low testability. It is preferable to design software with high testability from the outset, i.e., to create software with as high a degree of testability as possible, to avoid the problems of undetected faults that are associated with low testability. Information loss is a phenomenon that occurs during program execution and increases the likelihood that a fault will remain undetected. In this paper, I identify two broad classes of information loss, define them, and suggest ways of predicting the potential for information loss to occur, in order to decrease the likelihood that faults will remain undetected during testing.
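Information loss can be demonstrated with two versions of the same faulty computation: one returns its full result, the other collapses it, hiding the planted fault from most test inputs. The functions and the planted fault below are contrived for illustration, not taken from [Voas91b].

```python
import random

# Both functions contain the same planted fault: 99 where 100 was intended.
def full_output(x):
    return x + 99            # high testability: the fault always shows

def lossy_output(x):
    return (x + 99) // 50    # information loss: many sums collapse together

random.seed(1)
trials = [random.randint(0, 1000) for _ in range(10_000)]
reveal_full = sum(full_output(x) != x + 100 for x in trials) / len(trials)
reveal_lossy = sum(lossy_output(x) != (x + 100) // 50
                   for x in trials) / len(trials)
print(reveal_full, reveal_lossy)  # the lossy version hides the fault on ~98% of inputs
```

The integer division collapses 50 distinct sums to one output, so the off-by-one fault propagates to the output only when the input lands on a division boundary, which is precisely the low-testability situation the abstract warns about.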

  7. SAO mission support software and data standards, version 1.0

    NASA Technical Reports Server (NTRS)

    Hsieh, P.

    1993-01-01

    This document defines the software developed by the SAO AXAF Mission Support (MS) Program and defines standards for the software development process and for control of the data products generated by the software. The SAO MS is tasked to develop and use software to perform a variety of functions in support of the AXAF mission. Software is developed by software engineers and scientists, and commercial off-the-shelf (COTS) software is used either directly or customized through the use of scripts to implement analysis procedures. Software controls real-time laboratory instruments, performs data archiving, displays data, and generates model predictions. Much software is used in the analysis of data to generate data products required by the AXAF project, for example, on-orbit mirror performance predictions or detailed characterization of the mirror reflection performance as a function of energy.

  8. Space Geodesy Project Information and Configuration Management Procedure

    NASA Technical Reports Server (NTRS)

    Merkowitz, Stephen M.

    2016-01-01

    This plan defines the Space Geodesy Project (SGP) policies, procedures, and requirements for Information and Configuration Management (CM). This procedure describes a process that is intended to ensure that all proposed and approved technical and programmatic baselines and changes to the SGP hardware, software, support systems, and equipment are documented.

  9. A methodology for testing fault-tolerant software

    NASA Technical Reports Server (NTRS)

    Andrews, D. M.; Mahmood, A.; Mccluskey, E. J.

    1985-01-01

    A methodology for testing fault-tolerant software is presented. Testing fault-tolerant software is problematic because many errors are masked or corrected by voters, limiters, or automatic channel synchronization. This methodology illustrates how the same strategies used for testing fault-tolerant hardware can be applied to testing fault-tolerant software. For example, one strategy used in testing fault-tolerant hardware is to disable the redundancy during testing. A similar strategy is proposed for software, namely, to move the major emphasis of testing earlier in the development cycle (before the redundancy is in place), thus reducing the possibility that undetected errors will be masked when limiters and voters are added.
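    The masking effect of voters mentioned above can be sketched with a toy 2-of-3 majority voter (a hypothetical illustration, not taken from the paper):

    ```python
    from collections import Counter

    def majority_vote(outputs):
        """2-of-3 voter as used in triple-redundant software channels."""
        value, count = Counter(outputs).most_common(1)[0]
        return value if count >= 2 else None  # None: no majority agreement

    # a fault in one channel is silently out-voted, so testing only the
    # voted output would never reveal it -- hence the proposal to test
    # before the redundancy is in place
    voted = majority_vote([42, 42, 7])  # channel 3 is faulty
    ```

    As long as at most one channel fails, the voted output is indistinguishable from a fault-free run, which is exactly why system-level testing after the voter is added reveals so little.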

  10. Computing in the presence of soft bit errors. [caused by single event upset on spacecraft

    NASA Technical Reports Server (NTRS)

    Rasmussen, R. D.

    1984-01-01

    It is shown that single-event upsets (SEUs) due to cosmic rays are a significant source of single-bit errors in spacecraft computers. The physical mechanism of SEU, electron-hole generation by means of linear energy transfer (LET), is discussed with reference to the results of a study of the environmental effects on computer systems of the Galileo spacecraft. Techniques for making software more tolerant of cosmic-ray effects are considered, including: reducing the number of registers used by the software; continuity testing of variables; redundant execution of major procedures for error detection; and encoding state variables to detect single-bit changes. Attention is also given to design modifications which may reduce the cosmic-ray exposure of on-board hardware. These modifications include: shielding components operating in LEO; removing low-power Schottky parts; and the use of CMOS diodes. The SEU parameters of different electronic components are listed in a table.
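    Encoding state variables to detect single-bit changes, one of the techniques listed above, can be sketched with a simple even-parity code (a hypothetical illustration; the flight implementation is not described in the abstract):

    ```python
    def encode(value: int) -> int:
        """Append an even-parity bit to a state variable before storing it."""
        parity = bin(value).count("1") & 1
        return (value << 1) | parity

    def check(word: int) -> bool:
        """True if the stored word shows no single-bit error."""
        return bin(word).count("1") % 2 == 0

    w = encode(0b1011)           # protected state word
    intact = check(w)            # True: parity consistent
    flipped = check(w ^ 0b100)   # False: any single-bit flip is detected
    ```

    A single parity bit detects any odd number of flipped bits; detecting and correcting multi-bit upsets would require a stronger code such as Hamming.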

  11. [Dental arch form reverting by four-point method].

    PubMed

    Pan, Xiao-Gang; Qian, Yu-Fen; Weng, Si-En; Feng, Qi-Ping; Yu, Quan

    2008-04-01

    To explore a simple method of reverting an individual dental arch form template for wire bending, individual dental arch forms were reverted by a four-point method. By defining the central point of the bracket on the bilateral lower second premolars and first molars, a certain individual dental arch form could be generated. The arch form generating procedure was then developed into computer software for printing arch forms. The four-point-method arch form was evaluated by comparison with direct model measurement on linear and angular parameters. Accuracy and reproducibility were assessed by paired t test and concordance correlation coefficient with the MedCalc 9.3 software package. The arch form produced by the four-point method showed good accuracy and reproducibility (linear concordance correlation coefficient 0.9909; angular concordance correlation coefficient 0.8419). The dental arch form reverted by the four-point method could reproduce the individual dental arch form.
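    The concordance correlation coefficient used in the assessment above can be computed as follows (Lin's CCC; the measurement values shown are hypothetical, not the study's data):

    ```python
    from statistics import mean, pvariance

    def concordance_ccc(x, y):
        """Lin's concordance correlation coefficient between two sets of
        paired measurements (agreement = precision * accuracy)."""
        mx, my = mean(x), mean(y)
        sx, sy = pvariance(x), pvariance(y)
        sxy = sum((a - mx) * (b - my) for a, b in zip(x, y)) / len(x)
        return 2 * sxy / (sx + sy + (mx - my) ** 2)

    # hypothetical paired linear measurements (mm): template vs. direct model
    template = [34.1, 35.0, 46.2, 47.1, 52.3]
    direct   = [34.0, 35.2, 46.0, 47.3, 52.1]
    ccc = concordance_ccc(template, direct)
    ```

    Unlike Pearson's r, the CCC penalizes both location and scale shifts between the two measurement methods, which is why it is preferred for method-agreement studies like this one.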

  12. A virtual fluoroscopy system to verify seed positioning accuracy during prostate permanent seed implants.

    PubMed

    Sarkar, V; Gutierrez, A N; Stathakis, S; Swanson, G P; Papanikolaou, N

    2009-01-01

    The purpose of this project was to develop a software platform to produce a virtual fluoroscopic image as an aid for permanent prostate seed implants. Seed location information from a pre-plan was extracted and used as input to in-house developed software to produce a virtual fluoroscopic image. In order to account for differences in patient positioning on the day of treatment, the user was given the ability to make changes to the virtual image. The system has been shown to work as expected for all test cases. The system allows for quick (on average less than 10 sec) generation of a virtual fluoroscopic image of the planned seed pattern. The image can be used as a verification tool to aid the physician in evaluating how close the implant is to the planned distribution throughout the procedure and enable remedial action should a large deviation be observed.

  13. Implementing finite state machines in a computer-based teaching system

    NASA Astrophysics Data System (ADS)

    Hacker, Charles H.; Sitte, Renate

    1999-09-01

    Finite State Machines (FSM) are models for functions commonly implemented in digital circuits such as timers, remote controls, and vending machines. Teaching FSM is core to the curriculum of many university digital electronics or discrete mathematics subjects. Students often have difficulty grasping the theoretical concepts in the design and analysis of FSM. This has prompted the author to develop MS-Windows™-compatible software, WinState, that provides a tutorial-style teaching aid for understanding the mechanisms of FSM. The animated computer screen is ideal for visually conveying the required design and analysis procedures. WinState complements other software for combinatorial logic previously developed by the author, and enhances the existing teaching package by adding sequential logic circuits. WinState enables students to construct their own FSM, which can be simulated to test the design for functionality and possible errors.
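    A minimal sketch of the kind of FSM a student might build and simulate in such a tool (a hypothetical vending-machine example; WinState's internals are not described here):

    ```python
    # FSM as a transition table: state -> {input symbol: next state}
    TRANSITIONS = {
        "idle": {"coin": "paid", "button": "idle"},
        "paid": {"coin": "paid", "button": "vend"},
        "vend": {"coin": "paid", "button": "idle"},
    }

    def run_fsm(start, inputs):
        """Simulate the FSM on an input sequence, returning the visited
        state sequence -- the same trace a simulator would animate."""
        state, trace = start, [start]
        for symbol in inputs:
            state = TRANSITIONS[state][symbol]
            trace.append(state)
        return trace

    trace = run_fsm("idle", ["coin", "button"])
    ```

    Running the table against chosen input sequences is exactly the kind of functionality check the abstract describes: a design error shows up as an unexpected state in the trace.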

  14. Galileo Attitude Determination: Experiences with a Rotating Star Scanner

    NASA Technical Reports Server (NTRS)

    Merken, L.; Singh, G.

    1991-01-01

    The Galileo experience with a rotating star scanner is discussed in terms of problems encountered in flight, solutions implemented, and lessons learned. An overview of the Galileo project and the attitude and articulation control subsystem is given, and the star scanner hardware and relevant software algorithms are detailed. The star scanner is the sole source of inertial attitude reference for this spacecraft. Problem symptoms observed in flight are discussed in terms of effects on spacecraft performance and safety. Sources of these problems include contributions from flight software idiosyncrasies and inadequate validation of the ground procedures used to identify target stars for use by the autonomous on-board star identification algorithm. Problem fixes (some already implemented and some only proposed) are discussed. A general conclusion is drawn regarding the inherent difficulty of performing simulation tests to validate algorithms which are highly sensitive to external inputs of statistically 'rare' events.

  15. Development and preliminary evaluation of an ultrasonic motor actuated needle guide for 3T MRI-guided transperineal prostate interventions

    NASA Astrophysics Data System (ADS)

    Song, Sang-Eun; Tokuda, Junichi; Tuncali, Kemal; Tempany, Clare; Hata, Nobuhiko

    2012-02-01

    Image-guided prostate interventions have been accelerated by Magnetic Resonance Imaging (MRI) and robotic technologies in the past few years. However, the transrectal ultrasound (TRUS) guided procedure still accounts for the vast majority of clinical practice due to the engineering and clinical complexity of MRI-guided robotic interventions. Consequently, the great advantages and increasing availability of MRI have not been utilized to their maximum capacity in the clinic. To bring the advantages of MRI to patients, we developed an MRI-compatible motorized needle-guide device, the "Smart Template", that resembles a conventional prostate template and performs MRI-guided prostate interventions with minimal changes to the clinical procedure. The requirements and specifications of the Smart Template were identified from our latest MRI-guided intervention system, which has been used clinically in manual mode for prostate biopsy. The Smart Template consists of vertical and horizontal crossbars driven by two ultrasonic motors via timing-belt and miter-gear transmissions. Navigation software that controls the crossbar positions to provide needle insertion positions was also developed. The software can operate independently or interactively with 3D Slicer, an open-source navigation software package developed for prostate intervention. As a preliminary evaluation, MRI distortion and SNR tests were conducted. Significant MRI distortion was found close to the threaded brass-alloy components of the template; however, the affected volume lay outside the clinical region of interest. SNR values over routine MRI scan sequences for prostate biopsy indicated insignificant image degradation in the presence of the robotic system and during actuation of the ultrasonic motors.

  16. A high order approach to flight software development and testing

    NASA Technical Reports Server (NTRS)

    Steinbacher, J.

    1981-01-01

    The use of a software development facility is discussed as a means of producing a reliable and maintainable ECS software system, and as a means of providing efficient use of the ECS hardware test facility. Principles applied to software design are given, including modularity, abstraction, hiding, and uniformity. The general objectives of each phase of the software life cycle are also given, including testing, maintenance, code development, and requirement specifications. Software development facility tools are summarized, and tool deficiencies recognized in the code development and testing phases are considered. Due to limited lab resources, the functional simulation capabilities may be indispensable in the testing phase.

  17. Detection And Mapping (DAM) package. Volume 4B: Software System Manual, part 2

    NASA Technical Reports Server (NTRS)

    Schlosser, E. H.

    1980-01-01

    Computer programs, graphic devices, and an integrated set of manual procedures designed for efficient production of precisely registered and formatted maps from digital data are presented. The software can be used on any Univac 1100 series computer. The software includes pre-defined spectral limits for use in classifying and mapping surface water for LANDSAT-1, LANDSAT-2, and LANDSAT-3.

  18. A Framework for Testing Scientific Software: A Case Study of Testing Amsterdam Discrete Dipole Approximation Software

    NASA Astrophysics Data System (ADS)

    Shao, Hongbing

    Software testing of scientific software systems often suffers from the test oracle problem, i.e., the lack of test oracles. The Amsterdam discrete dipole approximation code (ADDA) is a scientific software system that can be used to simulate light scattering by scatterers of various types, and its testing suffers from the test oracle problem. In this thesis work, I established a framework for testing scientific software systems and evaluated it using ADDA as a case study. To test ADDA, I first used the CMMIE code as a pseudo-oracle for simulating light scattering by a homogeneous sphere scatterer. Comparable results were obtained between ADDA and the CMMIE code, validating ADDA for use with homogeneous sphere scatterers. I then compared ADDA against an experimental result for light scattering by a homogeneous sphere; ADDA produced simulations comparable to the experimentally measured result, further validating its use for sphere scatterers. Next, I used metamorphic testing to generate test cases covering scatterers of various geometries, orientations, and homogeneity or non-homogeneity; ADDA was tested under each of these test cases and all tests passed. The use of statistical analysis together with metamorphic testing is discussed as a future direction. In short, using ADDA as a case study, I established a testing framework combining pseudo-oracles, experimental results, and metamorphic testing techniques to test scientific software systems that suffer from the test oracle problem. Each of these techniques is necessary and contributes to the testing of the software under test.
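    A metamorphic test sidesteps the oracle problem by checking a relation between outputs rather than an exact expected value. The sketch below uses a hypothetical stand-in intensity function and a symmetry relation for a sphere; it illustrates the technique only and is not ADDA's actual interface:

    ```python
    import math

    def scattering_intensity(angle_rad):
        """Hypothetical stand-in for a light-scattering simulation:
        a Rayleigh-like intensity pattern for a sphere."""
        return (1 + math.cos(angle_rad) ** 2) / 2

    def metamorphic_symmetry_test(sim, angles, tol=1e-9):
        """Metamorphic relation: for a sphere, the scattered intensity must
        be symmetric about the forward axis, sim(t) == sim(-t).
        No exact oracle value is needed."""
        return all(abs(sim(t) - sim(-t)) <= tol for t in angles)

    ok = metamorphic_symmetry_test(scattering_intensity, [0.1, 0.5, 1.2])
    ```

    A fault that breaks the symmetry would fail the relation even though no reference output exists, which is precisely the appeal of metamorphic testing for scientific codes.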

  19. Building Energy Model Development for Retrofit Homes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chasar, David; McIlvaine, Janet; Blanchard, Jeremy

    2012-09-30

    Based on previous research conducted by Pacific Northwest National Laboratory and the Florida Solar Energy Center providing technical assistance to implement 22 deep energy retrofits across the nation, 6 homes were selected in Florida and Texas for detailed post-retrofit energy modeling to assess realized energy savings (Chandra et al., 2012). However, assessing realized savings can be difficult for homes where pre-retrofit occupancy and energy performance are unknown. Initially, savings had been estimated for these homes using a HERS Index comparison, which does not account for confounding factors such as occupancy and weather. This research addresses a method to more reliably assess energy savings achieved in deep energy retrofits for which pre-retrofit utility bills or occupancy information is not available. A metered home, Riverdale, was selected as a test case for development of a modeling procedure that accounts for occupancy and weather factors, potentially creating more accurate estimates of energy savings. This "true up" procedure was developed using Energy Gauge USA software together with post-retrofit homeowner information and utility bills. The 12-step process adjusts the post-retrofit modeling results to correlate with post-retrofit utility bills and known occupancy information. The "trued" post-retrofit model is then used to estimate pre-retrofit energy consumption by changing the building efficiency characteristics to reflect the pre-retrofit condition while keeping all weather- and occupancy-related factors the same. This creates a pre-retrofit model that is more comparable to the post-retrofit energy use profile and can improve energy savings estimates. For this test case, a home for which pre- and post-retrofit utility bills were available was selected for comparison and assessment of the accuracy of the "true up" procedure. As currently practiced, this procedure is quite time intensive; however, streamlined processing spreadsheets or incorporation into existing software tools would improve its efficiency. Retrofit activity appears to be gaining market share, and this would be a potentially valuable capability with relevance to marketing, program management, and retrofit success metrics.
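    The core of such a true-up step, matching modeled consumption to billed consumption, can be sketched as a single scaling adjustment. This is a deliberately crude stand-in for the 12-step Energy Gauge USA procedure, and the monthly values are hypothetical:

    ```python
    def true_up(model_monthly_kwh, utility_monthly_kwh):
        """Scale modeled monthly energy so the annual total matches the
        post-retrofit utility bills (crude single-factor true-up)."""
        factor = sum(utility_monthly_kwh) / sum(model_monthly_kwh)
        return [m * factor for m in model_monthly_kwh], factor

    # hypothetical post-retrofit data: model output vs. utility bills (kWh)
    model = [900, 850, 800, 700, 650, 700, 800, 850, 820, 760, 830, 880]
    bills = [870, 830, 790, 680, 640, 690, 790, 830, 800, 740, 810, 860]
    trued, factor = true_up(model, bills)
    ```

    The real procedure adjusts individual occupancy- and weather-related inputs rather than applying one global factor, but the goal is the same: a model whose totals reproduce the billed consumption before the pre-retrofit characteristics are swapped in.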

  20. Analysis of half diallel mating designs I: a practical analysis procedure for ANOVA approximation.

    Treesearch

    G.R. Johnson; J.N. King

    1998-01-01

    Procedures to analyze half-diallel mating designs using the SAS statistical package are presented. The procedure requires two runs of PROC VARCOMP and results in estimates of additive and non-additive genetic variation. The procedures described can be modified to work with most statistical software packages which can compute variance component estimates. The...

  1. Performance assessments of Android-powered military applications operating on tactical handheld devices

    NASA Astrophysics Data System (ADS)

    Weiss, Brian A.; Fronczek, Lisa; Morse, Emile; Kootbally, Zeid; Schlenoff, Craig

    2013-05-01

    Transformative Apps (TransApps) is a Defense Advanced Research Projects Agency (DARPA) funded program whose goal is to develop a range of militarily relevant software applications ("apps") to enhance the operational effectiveness of military personnel on (and off) the battlefield. TransApps is also developing a military apps marketplace to facilitate rapid development and dissemination of applications that address user needs by connecting engaged communities of end-users with development groups. The National Institute of Standards and Technology's (NIST) role in the TransApps program is to design and implement evaluation procedures to assess the performance of: 1) the various software applications, 2) software-hardware interactions, and 3) the supporting online application marketplace. Specifically, NIST is responsible for evaluating 50+ tactically relevant applications operating on numerous Android™-powered platforms. NIST efforts include functional regression testing and quantitative performance testing. This paper discusses the evaluation methodologies employed to assess the performance of three key program elements: 1) handheld-based applications and their integration with various hardware platforms, 2) client-based applications, and 3) network technologies operating on both the handheld and client systems, along with their integration into the application marketplace. Handheld-based applications are assessed using a combination of utility- and usability-based checklists and quantitative performance tests. Client-based applications are assessed to replicate current overseas disconnected operations (i.e., no network connectivity between handhelds) and to assess connected operations envisioned for later use. Finally, networked applications are assessed on handhelds to establish performance baselines for when connectivity becomes common.

  2. Open core control software for surgical robots

    PubMed Central

    Kozuka, Hiroaki; Kim, Hyung Wook; Takesue, Naoyuki; Vladimirov, B.; Sakaguchi, Masamichi; Tokuda, Junichi; Hata, Nobuhiko; Chinzei, Kiyoyuki; Fujimoto, Hideo

    2010-01-01

    Object: Patients and doctors in the operating room are now surrounded by many medical devices as a result of recent advances in medical technology. However, these cutting-edge medical devices work independently and do not collaborate with each other, even though collaboration between devices such as navigation systems and medical imaging devices is becoming very important for accomplishing complex surgical tasks (such as a tumor removal procedure while checking the tumor location in neurosurgery). Several surgical robots have been commercialized and are becoming common, but they are not open to collaboration with external medical devices. A cutting-edge "intelligent surgical robot" becomes possible only through collaboration among surgical robots, various kinds of sensors, navigation systems and so on. Moreover, most academic control software for surgical robots is "home-made" within the research institution and not open to the public. Open-source control software for surgical robots can therefore be beneficial in this field. From these perspectives, we developed the Open Core Control software for surgical robots to overcome these challenges. Materials and methods: In general, control software has hardware dependencies arising from actuators, sensors and various internal devices, and therefore cannot be used on different types of robots without modification. The structure of the Open Core Control software, however, can be reused for various types of robots because the hardware-dependent parts are abstracted. In addition, network connectivity is crucial for collaboration between advanced medical devices. OpenIGTLink is adopted in the Interface class, which communicates with external medical devices. At the same time, it is essential to maintain stable operation despite asynchronous data transactions over the network, and several techniques for this purpose were introduced in the Open Core Control software. 
    The virtual fixture is a well-known technique that acts as a "force guide", supporting operators in performing precise manipulation with a master-slave robot. A virtual fixture for precise and safe surgery was implemented on the system to demonstrate high-level collaboration between a surgical robot and a navigation system. The virtual fixture extension is not itself part of the Open Core Control system, but such a function cannot be realized without tight collaboration between cutting-edge medical devices. Using the virtual fixture, operators can pre-define an accessible area on the navigation system, and the area information can be transferred to the robot; the surgical console then generates a reflection force when the operator tries to leave the pre-defined accessible area during surgery. Results: The Open Core Control software was implemented on a surgical master-slave robot, and stable operation was observed in a motion test. The tip of the surgical robot was displayed on a navigation system by connecting the robot with a 3D position sensor through OpenIGTLink. The accessible area was pre-defined before the operation, and the virtual fixture was displayed as a "force guide" on the surgical console. In addition, the system showed stable performance in a duration test with network disturbance. Conclusion: This paper described the design of the Open Core Control software for surgical robots and the implementation of the virtual fixture. The software was implemented on a surgical robot system and showed stable performance in high-level collaboration work. The Open Core Control software is intended to become a widely used platform for surgical robots. Safety is essential for the control software of such complex medical devices, 
    and it is important to follow global specifications such as the FDA guidance "General Principles of Software Validation" or IEC 62304. Following these regulations requires a self-test environment; a test environment is therefore under development to test various kinds of interference in the operating room, such as noise from an electric knife, considering safety and test-environment regulations such as ISO 13849 and IEC 61508. The Open Core Control software is developed in an open-source manner and is available on the Internet. Standardization of software interfaces is becoming a major trend in this field, and from this perspective the Open Core Control software can be expected to contribute to it. PMID:20033506
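    The reflection-force behavior of the virtual fixture described above can be sketched in one dimension (a hypothetical spring-like guide; the stiffness value and geometry are illustrative, not from the paper):

    ```python
    def fixture_force(pos, boundary, stiffness=200.0):
        """1-D virtual fixture: zero force inside the pre-defined
        accessible region, spring-like reflection force pushing the tool
        back once it crosses the boundary."""
        lo, hi = boundary
        if pos < lo:
            return stiffness * (lo - pos)   # push back up into the region
        if pos > hi:
            return -stiffness * (pos - hi)  # push back down into the region
        return 0.0

    accessible = (0.0, 1.0)            # area pre-defined on the navigation system
    inside  = fixture_force(0.5, accessible)
    outside = fixture_force(1.2, accessible)
    ```

    In the actual system the accessible region is a 3-D volume defined on the navigation side and transferred over OpenIGTLink, but the principle is the same: force is generated only when the operator leaves the region.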

  3. HAL/S programmer's guide. [space shuttle flight software language

    NASA Technical Reports Server (NTRS)

    Newbold, P. M.; Hotz, R. L.

    1974-01-01

    HAL/S is a programming language developed to satisfy the flight software requirements of the space shuttle program. The user's guide explains pertinent language operating procedures and describes the various HAL/S facilities for manipulating integer, scalar, vector, and matrix data types.

  4. Using a fuzzy comprehensive evaluation method to determine product usability: A test case

    PubMed Central

    Zhou, Ronggang; Chan, Alan H. S.

    2016-01-01

    BACKGROUND: In order to take into account the inherent uncertainties during product usability evaluation, Zhou and Chan [1] proposed a comprehensive method of usability evaluation for products by combining the analytic hierarchy process (AHP) and fuzzy evaluation methods for synthesizing performance data and subjective response data. This method was designed to provide an integrated framework combining the inevitably vague judgments from the multiple stages of the product evaluation process. OBJECTIVE AND METHODS: To illustrate the effectiveness of the model, this study used a summative usability test case to assess the application and strength of the general fuzzy usability framework. To test the proposed fuzzy usability evaluation framework [1], a standard summative usability test was conducted to benchmark the overall usability of a specific network management software package. Based on the test data, the fuzzy method was applied to incorporate both the usability scores and the uncertainties involved in the multiple components of the evaluation. Then, with Monte Carlo simulation procedures, confidence intervals were used to compare the reliabilities of the fuzzy approach and two typical conventional methods that combine metrics based on percentages. RESULTS AND CONCLUSIONS: This case study showed that the fuzzy evaluation technique can be applied successfully to combine summative usability testing data into an overall usability quality for the network software evaluated. Greater differences in confidence interval widths between the equal-percentage averaging method and the weighted evaluation methods, including the weighted percentage average method, verified the strength of the fuzzy method. PMID:28035942

  5. Using a fuzzy comprehensive evaluation method to determine product usability: A test case.

    PubMed

    Zhou, Ronggang; Chan, Alan H S

    2017-01-01

    In order to take into account the inherent uncertainties during product usability evaluation, Zhou and Chan [1] proposed a comprehensive method of usability evaluation for products by combining the analytic hierarchy process (AHP) and fuzzy evaluation methods for synthesizing performance data and subjective response data. This method was designed to provide an integrated framework combining the inevitably vague judgments from the multiple stages of the product evaluation process. To illustrate the effectiveness of the model, this study used a summative usability test case to assess the application and strength of the general fuzzy usability framework. To test the proposed fuzzy usability evaluation framework [1], a standard summative usability test was conducted to benchmark the overall usability of a specific network management software package. Based on the test data, the fuzzy method was applied to incorporate both the usability scores and the uncertainties involved in the multiple components of the evaluation. Then, with Monte Carlo simulation procedures, confidence intervals were used to compare the reliabilities of the fuzzy approach and two typical conventional methods that combine metrics based on percentages. This case study showed that the fuzzy evaluation technique can be applied successfully to combine summative usability testing data into an overall usability quality for the network software evaluated. Greater differences in confidence interval widths between the equal-percentage averaging method and the weighted evaluation methods, including the weighted percentage average method, verified the strength of the fuzzy method.
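    The synthesis-plus-simulation idea can be sketched as follows: AHP-style weights aggregate component scores, and Monte Carlo perturbation of the scores yields a confidence interval for the overall usability. The weights, scores, and noise level below are hypothetical, not the study's data:

    ```python
    import random

    def fuzzy_overall(weights, scores):
        """Weighted synthesis of component usability scores (AHP weights)."""
        return sum(w * s for w, s in zip(weights, scores))

    def monte_carlo_ci(weights, scores, noise=0.05, n=5000, seed=1):
        """95% confidence interval for the overall score when each
        component score carries Gaussian uncertainty."""
        rng = random.Random(seed)
        draws = sorted(
            fuzzy_overall(weights, [s + rng.gauss(0, noise) for s in scores])
            for _ in range(n)
        )
        return draws[int(0.025 * n)], draws[int(0.975 * n)]

    weights = [0.5, 0.3, 0.2]     # hypothetical AHP-derived weights
    scores  = [0.82, 0.74, 0.90]  # task success, efficiency, satisfaction
    overall = fuzzy_overall(weights, scores)
    lo, hi = monte_carlo_ci(weights, scores)
    ```

    Comparing the widths of such intervals across aggregation schemes is the reliability comparison the abstract describes: a narrower interval under the same input uncertainty indicates a more robust synthesis method.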

  6. Surface extra-vehicular activity emergency scenario management: Tools, procedures, and geologically related implications

    NASA Astrophysics Data System (ADS)

    Zea, Luis; Diaz, Alejandro R.; Shepherd, Charles K.; Kumar, Ranganathan

    2010-07-01

    Extra-vehicular activities (EVAs) are an essential part of human space exploration, but involve inherently dangerous procedures which can put crew safety at risk during a space mission. To help mitigate this risk, astronauts' training programs spend substantial attention on preparing for surface EVA emergency scenarios. With the help of two Mars Desert Research Station (MDRS) crews (61 and 65), wearing simulated spacesuits, the most important of these emergency scenarios were examined at three different types of locations that geologically and environmentally resemble lunar and Martian landscapes. These three platforms were analyzed geologically as well as topographically (utilizing a laser range finder with slope estimation capabilities and a slope determination software). Emergency scenarios were separated into four main groups: (1) suit issues, (2) general physiological, (3) attacks and (4) others. Specific tools and procedures were developed to address each scenario. The tools and processes were tested in the field under Mars-analog conditions with the suited subjects for feasibility and speed of execution.

  7. MATTS- A Step Towards Model Based Testing

    NASA Astrophysics Data System (ADS)

    Herpel, H.-J.; Willich, G.; Li, J.; Xie, J.; Johansen, B.; Kvinnesland, K.; Krueger, S.; Barrios, P.

    2016-08-01

    In this paper we describe a model-based approach to testing on-board software and compare it with the traditional validation strategy currently applied to satellite software. The major problems that software engineering will face over at least the next two decades are increasing application complexity, driven by the need for autonomy, and serious application robustness. In other words, how do we actually get to declare success when trying to build applications one or two orders of magnitude more complex than today's? To solve these problems, the software engineering process has to be improved in at least two respects: (1) software design and (2) software testing. The design process has to evolve towards model-based approaches with extensive use of code generators. Today, testing is an essential but time- and resource-consuming activity in the software development process, and generating a short but effective test suite usually requires a lot of manual work and expert knowledge. In a model-based process, test construction and test execution, among other subtasks, can be partially automated. The basic idea behind the presented study was to start from a formal model (e.g., state machines), generate abstract test cases, and then convert them into concrete executable test cases (input and expected-output pairs). The generated concrete test cases were applied to on-board software, and the results were collected and evaluated with respect to applicability, cost-efficiency, effectiveness at fault finding, and scalability.
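    The generation of abstract test cases from a formal state-machine model can be sketched as a bounded walk over the transition table. The mode manager below is hypothetical, not the study's on-board software:

    ```python
    # hypothetical on-board mode manager: state -> {command: next state}
    MODEL = {
        "SAFE":    {"enable": "STANDBY"},
        "STANDBY": {"start": "SCIENCE", "disable": "SAFE"},
        "SCIENCE": {"stop": "STANDBY"},
    }

    def generate_tests(model, start, depth):
        """Enumerate every command sequence up to `depth` that the model
        accepts; each (inputs, expected states) pair is one abstract test
        case, ready to be concretized into an executable test."""
        cases = []

        def walk(state, inputs, states):
            if inputs:
                cases.append((inputs, states))
            if len(inputs) == depth:
                return
            for command, nxt in model[state].items():
                walk(nxt, inputs + [command], states + [nxt])

        walk(start, [], [start])
        return cases

    tests = generate_tests(MODEL, "SAFE", 3)
    ```

    Each generated pair supplies both the stimulus (the command sequence) and the oracle (the expected state sequence), which is what makes the later conversion to concrete input/expected-output test cases mechanical.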

  8. An image-processing software package: UU and Fig for optical metrology applications

    NASA Astrophysics Data System (ADS)

    Chen, Lujie

    2013-06-01

    Modern optical metrology applications are largely supported by computational methods, such as phase shifting [1], Fourier transform [2], digital image correlation [3], and camera calibration [4], in which image processing is a critical and indispensable component. While it is not too difficult to obtain a wide variety of image-processing programs from the internet, few cater to the relatively specialized area of optical metrology. This paper introduces an image-processing software package, UU (data processing) and Fig (data rendering), that incorporates many useful functions for processing optical metrological data. The cross-platform programs UU and Fig are developed on wxWidgets and, at the time of writing, have been tested on Windows, Linux and Mac OS. The user interface is designed to offer precise control of the underlying processing procedures in a scientific manner, and the data input/output mechanism is designed to accommodate diverse file formats and to facilitate interaction with other independent programs. Although the software was initially developed for personal use, it is comparable in stability and accuracy to most commercial software of a similar nature. In addition to functions for optical metrology, the package has a rich collection of useful tools in the following areas: real-time image streaming from USB and GigE cameras, computational geometry, computer vision, data fitting, 3D image processing, vector image processing, precision device control (rotary stages, PZT stages, etc.), point-cloud-to-surface reconstruction, volume rendering, and batch processing. The software package is currently used in a number of universities for teaching and research.

  9. A numerical procedure for failure mode detection of masonry arches reinforced with fiber reinforced polymeric materials

    NASA Astrophysics Data System (ADS)

    Galassi, S.

    2018-05-01

    In this paper a mechanical model of masonry arches strengthened with fibre-reinforced composite materials, and the corresponding numerical procedure for their analysis, are proposed. The arch is modelled as an assemblage of rigid blocks connected to each other, and to the supporting structures, by mortar joints. The reinforcement, usually a sheet placed at the intrados or the extrados, prevents the occurrence of cracks that could activate possible collapse mechanisms due to tensile failure of the mortar joints; a reinforced arch therefore generally fails in a different way from an unreinforced masonry (URM) arch. The proposed numerical procedure checks, as a function of an incremental external load, the inner stress state in the arch, the reinforcement and the adhesive layer, and thereby provides a prediction of the failure modes. Results obtained from experimental tests on four scale models, carried out in a laboratory, have been compared with those provided by the numerical procedure, implemented in ArchiVAULT, a software package developed by the author. In this regard, the numerical procedure is an extension of previous work. Although additional experimental investigations are necessary, these early results confirm that the proposed numerical procedure is promising.

  10. Using Automation to Improve the Flight Software Testing Process

    NASA Technical Reports Server (NTRS)

    O'Donnell, James R., Jr.; Andrews, Stephen F.; Morgenstern, Wendy M.; Bartholomew, Maureen O.; McComas, David C.; Bauer, Frank H. (Technical Monitor)

    2001-01-01

    One of the critical phases in the development of a spacecraft attitude control system (ACS) is the testing of its flight software. The testing (and test verification) of ACS flight software requires a mix of skills involving software, attitude control, data manipulation, and analysis. The process of analyzing and verifying flight software test results often creates a bottleneck which dictates the speed at which flight software verification can be conducted. In the development of the Microwave Anisotropy Probe (MAP) spacecraft ACS subsystem, an integrated design environment was used that included a MAP high fidelity (HiFi) simulation, a central database of spacecraft parameters, a script language for numeric and string processing, and plotting capability. In this integrated environment, it was possible to automate many of the steps involved in flight software testing, making the entire process more efficient and thorough than on previous missions. In this paper, we will compare the testing process used on MAP to that used on previous missions. The software tools that were developed to automate testing and test verification will be discussed, including the ability to import and process test data, synchronize test data and automatically generate HiFi script files used for test verification, and an automated capability for generating comparison plots. A summary of the perceived benefits of applying these test methods on MAP will be given. Finally, the paper will conclude with a discussion of re-use of the tools and techniques presented, and the ongoing effort to apply them to flight software testing of the Triana spacecraft ACS subsystem.
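    The verification step in such an automated pipeline can be sketched generically. A hedged example (not the actual MAP tooling) that checks time-aligned flight-software telemetry against high-fidelity simulation output sample by sample:

    ```python
    def compare_telemetry(flight, sim, tolerance):
        """Compare time-aligned (time, value) samples from the flight software
        against the simulation; return the samples that are out of tolerance."""
        if len(flight) != len(sim):
            raise ValueError("telemetry streams must be time-aligned first")
        discrepancies = []
        for (t, v_fsw), (_, v_sim) in zip(flight, sim):
            if abs(v_fsw - v_sim) > tolerance:
                discrepancies.append((t, v_fsw, v_sim))
        return discrepancies

    # Example: one attitude-error sample drifts past a 1e-3 rad tolerance
    fsw = [(0.0, 0.0100), (1.0, 0.0102), (2.0, 0.0150)]
    hifi = [(0.0, 0.0100), (1.0, 0.0101), (2.0, 0.0104)]
    bad = compare_telemetry(fsw, hifi, tolerance=1e-3)
    ```

    The payoff of automating this comparison is exactly the bottleneck the paper describes: the analyst reviews only the flagged discrepancies and the generated plots, instead of every sample.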

  11. Using Automation to Improve the Flight Software Testing Process

    NASA Technical Reports Server (NTRS)

    O'Donnell, James R., Jr.; Morgenstern, Wendy M.; Bartholomew, Maureen O.

    2001-01-01

    One of the critical phases in the development of a spacecraft attitude control system (ACS) is the testing of its flight software. The testing (and test verification) of ACS flight software requires a mix of skills involving software, knowledge of attitude control and attitude control hardware, data manipulation, and analysis. The process of analyzing and verifying flight software test results often creates a bottleneck which dictates the speed at which flight software verification can be conducted. In the development of the Microwave Anisotropy Probe (MAP) spacecraft ACS subsystem, an integrated design environment was used that included a MAP high fidelity (HiFi) simulation, a central database of spacecraft parameters, a script language for numeric and string processing, and plotting capability. In this integrated environment, it was possible to automate many of the steps involved in flight software testing, making the entire process more efficient and thorough than on previous missions. In this paper, we will compare the testing process used on MAP to that used on other missions. The software tools that were developed to automate testing and test verification will be discussed, including the ability to import and process test data, synchronize test data and automatically generate HiFi script files used for test verification, and an automated capability for generating comparison plots. A summary of the benefits of applying these test methods on MAP will be given. Finally, the paper will conclude with a discussion of re-use of the tools and techniques presented, and the ongoing effort to apply them to flight software testing of the Triana spacecraft ACS subsystem.

  12. Space shuttle orbiter avionics software: Post review report for the entry FACI (First Article Configuration Inspection). [including orbital flight tests integrated system

    NASA Technical Reports Server (NTRS)

    Markos, H.

    1978-01-01

    Status of the computer programs dealing with space shuttle orbiter avionics is reported. Specific topics covered include: delivery status; SSW software; SM software; DL software; GNC software; level 3/4 testing; level 5 testing; performance analysis; SDL readiness for entry first article configuration inspection; and verification assessment.

  13. PUS Services Software Building Block Automatic Generation for Space Missions

    NASA Astrophysics Data System (ADS)

    Candia, S.; Sgaramella, F.; Mele, G.

    2008-08-01

    The Packet Utilization Standard (PUS) has been specified by the European Committee for Space Standardization (ECSS) and issued as ECSS-E-70-41A to define the application-level interface between Ground Segments and Space Segments. The ECSS-E-70-41A complements the ECSS-E-50 and the Consultative Committee for Space Data Systems (CCSDS) recommendations for packet telemetry and telecommand. The ECSS-E-70-41A characterizes the identified PUS services from a functional point of view, and the ECSS-E-70-31 standard specifies the rules for their mission-specific tailoring. Current on-board software design for a space mission implies the production of several PUS terminals, each providing a specific tailoring of the PUS services. The associated on-board software building blocks are developed independently, leading to very different design choices and implementations even when the mission tailoring requires very similar services (from the Ground operations perspective). In this scenario, automatic production of the PUS service building blocks for a mission would optimize the overall mission economy and improve the robustness and reliability of the on-board software and of the Ground-Space interactions. This paper presents the Space Software Italia (SSI) activities for the development of an integrated environment to support the PUS services tailoring activity for a specific mission; the mission-specific PUS services configuration; and the generation of the UML model of the software building block implementing the mission-specific PUS services, together with the related source code, support documentation (software requirements, software architecture, test plans/procedures, operational manuals), and the TM/TC database.
The paper covers: (a) the project objectives; (b) the tailoring, configuration, and generation process; (c) the description of the environments supporting the process phases; (d) the characterization of the meta-model used for the generation; and (e) the characterization of the reference avionics architecture and of the reference on-board software high-level architecture.
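    The TM/TC packets that PUS services exchange ride on CCSDS space packets. As a small grounding example (a sketch of the standard CCSDS primary header layout, not output of SSI's generator), packing the 6-byte header:

    ```python
    import struct

    def ccsds_primary_header(apid, seq_count, data_len_octets, is_tc=True):
        """Pack the 6-byte CCSDS space packet primary header:
        version (3 bits) | type (1) | secondary header flag (1) | APID (11),
        sequence flags (2) | sequence count (14),
        packet data length (16) = data field octets minus one."""
        word1 = (0 << 13) | (int(is_tc) << 12) | (1 << 11) | (apid & 0x7FF)
        word2 = (0b11 << 14) | (seq_count & 0x3FFF)  # 0b11 = unsegmented
        word3 = data_len_octets - 1
        return struct.pack(">HHH", word1, word2, word3)

    header = ccsds_primary_header(apid=0x123, seq_count=42, data_len_octets=12)
    ```

    The PUS service type and subtype that ECSS-E-70-41A defines live in the packet's secondary header, which follows these six octets; a generator of the kind described would emit the encoding and decoding code for both layers from the tailored service model.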

  14. Mechanically assisted 3D ultrasound for pre-operative assessment and guiding percutaneous treatment of focal liver tumors

    NASA Astrophysics Data System (ADS)

    Sadeghi Neshat, Hamid; Bax, Jeffery; Barker, Kevin; Gardi, Lori; Chedalavada, Jason; Kakani, Nirmal; Fenster, Aaron

    2014-03-01

    Image-guided percutaneous ablation is the standard treatment for focal liver tumors deemed inoperable and is commonly used to maintain eligibility for patients on transplant waitlists. Radiofrequency ablation (RFA), microwave ablation (MWA), and cryoablation are all delivered via one or more needle-shaped probes inserted directly into the tumor. Planning is mostly based on contrast CT/MRI. While intra-procedural CT is commonly used to confirm the intended probe placement, 2D ultrasound (US) remains the main, and in some centers the only, imaging modality used for needle guidance. Registering intra-operative 2D US with planning and other intra-procedural imaging modalities is essential for accurate needle placement. However, identifying matching features of interest among these images is often challenging given the limited field of view (FOV) and low quality of 2D US images. We have developed a passive tracking arm with a motorized scan-head, together with software tools, to improve the guidance capabilities of conventional US through large-FOV 3D US scans that provide more anatomical landmarks and thereby facilitate registration of US with both planning and intra-procedural images. The tracker arm is used to scan the whole liver with high geometric accuracy, which facilitates multi-modality landmark-based image registration. Software tools are provided to assist with segmentation of the ablation probes and tumors, to find the 2D view that best shows the probe(s) in a 3D US image, and to identify the corresponding image from planning CT scans. In this paper, evaluation results from laboratory testing and a phase 1 clinical trial for planning and guiding RFA and MWA procedures with the developed system are presented. Early clinical results show performance comparable to intra-procedural CT, suggesting 3D US as a cost-effective alternative, with no side-effects, in centers where CT is not available.
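    The landmark-based registration step can be sketched with the standard least-squares rigid alignment (Kabsch/Procrustes). A minimal example, assuming corresponding landmark sets from the two modalities are already identified (this is a generic algorithm, not the authors' implementation):

    ```python
    import numpy as np

    def rigid_register(src, dst):
        """Least-squares rigid transform (R, t) with dst ~ R @ src + t,
        for Nx3 arrays of corresponding landmarks (Kabsch via SVD)."""
        cs, cd = src.mean(axis=0), dst.mean(axis=0)
        H = (src - cs).T @ (dst - cd)      # cross-covariance of centered sets
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = cd - R @ cs
        return R, t
    ```

    With three or more non-collinear landmark pairs this recovers the rotation and translation exactly in the noise-free case, and in the least-squares sense otherwise.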

  15. Experimental demonstration of software defined data center optical networks with Tbps end-to-end tunability

    NASA Astrophysics Data System (ADS)

    Zhao, Yongli; Zhang, Jie; Ji, Yuefeng; Li, Hui; Wang, Huitao; Ge, Chao

    2015-10-01

    The end-to-end tunability is important to provision elastic channel for the burst traffic of data center optical networks. Then, how to complete the end-to-end tunability based on elastic optical networks? Software defined networking (SDN) based end-to-end tunability solution is proposed for software defined data center optical networks, and the protocol extension and implementation procedure are designed accordingly. For the first time, the flexible grid all optical networks with Tbps end-to-end tunable transport and switch system have been online demonstrated for data center interconnection, which are controlled by OpenDayLight (ODL) based controller. The performance of the end-to-end tunable transport and switch system has been evaluated with wavelength number tuning, bit rate tuning, and transmit power tuning procedure.

  16. Development and evaluation of an instrumented linkage system for total knee surgery.

    PubMed

    Walker, Peter S; Wei, Chih-Shing; Forman, Rachel E; Balicki, M A

    2007-10-01

    The principles and application of total knee surgery using optical tracking have been well demonstrated, but electromagnetic tracking may offer further advantages. We asked whether an instrumented linkage that attaches directly to the bone can match the accuracy of the optical and electromagnetic systems while being quicker, more convenient, and less expensive to use. Initial testing using a table-mounted digitizer to navigate a drill guide for placing pins to mount a cutting guide demonstrated the feasibility in terms of access and availability. A first version (called the Mark 1) of an instrumented linkage designed to fix directly to the bone was constructed, and software was written to carry out a complete total knee replacement procedure. The results showed the system largely fulfilled these goals, but some surgeons found that using a visual display for pin placement was difficult and time consuming. As a result, a second version of the linkage system (called the K-Link) was designed to develop the concept further. User-friendly, flexible software was developed to carry out each step quickly and accurately, and the placement of cutting guides was streamlined. We concluded that an instrumented linkage system could be a useful and potentially lower-cost alternative to the current systems for total knee replacement and could possibly find application in other surgical procedures.

  17. DALMATIAN: An Algorithm for Automatic Cell Detection and Counting in 3D.

    PubMed

    Shuvaev, Sergey A; Lazutkin, Alexander A; Kedrov, Alexander V; Anokhin, Konstantin V; Enikolopov, Grigori N; Koulakov, Alexei A

    2017-01-01

    Current 3D imaging methods, including optical projection tomography, light-sheet microscopy, block-face imaging, and serial two-photon tomography, enable visualization of large samples of biological tissue. The large volumes of data obtained at high resolution require automatic image processing techniques, such as algorithms for automatic cell detection or, more generally, point-like object detection. Current approaches to automated cell detection struggle with particular cell types, cell populations of differing brightness, non-uniform staining, and overlapping cells. In this study, we present a set of algorithms for robust automatic cell detection in 3D. Our algorithms are suitable for, but not limited to, whole brain regions and individual brain sections. We used a watershed procedure to split regional maxima representing overlapping cells, and developed a bootstrap Gaussian fit procedure to evaluate the statistical significance of detected cells. We compared the cell detection quality of our algorithm against other software using 42 samples representing 6 staining and imaging techniques. The results provided by our algorithm matched manual expert quantification with signal-to-noise-dependent confidence, including samples with cells of differing brightness, non-uniform staining, and overlapping cells, for whole brain regions and individual tissue sections. Our algorithm provided the best cell detection quality among the tested free and commercial software.
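    Point-like object detection of this kind usually starts from a regional-maxima pass, whose output the watershed step then splits into individual cells. A hedged sketch of that front end (an illustrative baseline, not the DALMATIAN code):

    ```python
    import numpy as np
    from scipy import ndimage

    def detect_point_objects(volume, neighborhood=3, threshold=0.5):
        """Coordinates of voxels that are the maximum of their local
        neighborhood and brighter than the threshold: candidate cell centers."""
        local_max = volume == ndimage.maximum_filter(volume, size=neighborhood)
        return np.argwhere(local_max & (volume > threshold))

    # Two synthetic 'cells' in an otherwise empty image stack
    vol = np.zeros((20, 20, 20))
    vol[5, 5, 5] = 1.0
    vol[15, 12, 8] = 0.8
    cells = detect_point_objects(vol)
    ```

    On real data the fixed threshold is the weak point, which is why the paper replaces it with a bootstrap Gaussian fit to score each candidate's statistical significance against the local background.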

  18. T-Pattern Analysis and Cognitive Load Manipulation to Detect Low-Stake Lies: An Exploratory Study.

    PubMed

    Diana, Barbara; Zurloni, Valentino; Elia, Massimiliano; Cavalera, Cesare; Realdon, Olivia; Jonsson, Gudberg K; Anguera, M Teresa

    2018-01-01

    Deception has evolved to become a fundamental aspect of human interaction. Despite prolonged efforts in many disciplines, no univocally "deceptive" signal has been established. This work proposes an approach to deception detection combining cognitive load manipulation and T-pattern methodology, with the objectives of: (a) testing the efficacy of a dual-task procedure in enhancing differences between truth tellers and liars in a low-stakes situation; (b) exploring the efficacy of T-pattern methodology in discriminating truthful reports from deceitful ones in a low-stakes situation; and (c) establishing the experimental design and procedure for subsequent research. We manipulated cognitive load to enhance differences between truth tellers and liars, given the low-stakes lies involved in our experiment. We conducted an experimental study with a convenience sample of 40 students. A first analysis of the behavior frequencies coded through the observation software was carried out in SPSS (version 22), to describe the shape and characteristics of the behavior distributions and explore differences between groups. The datasets were then analyzed with the Theme 6.0 software, which detects repeated patterns (T-patterns) of coded events (non-verbal behaviors) that occur regularly or irregularly within a period of observation. A descriptive analysis of T-pattern frequencies was carried out to explore differences between groups, and an in-depth analysis of more complex patterns was performed to obtain qualitative information on the behavior structure expressed by the participants. Results show that the dual-task procedure enhances the differences observed between liars and truth tellers with T-pattern methodology; moreover, T-pattern detection reveals a greater variety and complexity of behavior in truth tellers than in liars.
These findings support the combination of cognitive load manipulation and T-pattern methodology for deception detection in low-stakes situations, and suggest testing directional hypotheses on a larger probabilistic sample.

  19. T-Pattern Analysis and Cognitive Load Manipulation to Detect Low-Stake Lies: An Exploratory Study

    PubMed Central

    Diana, Barbara; Zurloni, Valentino; Elia, Massimiliano; Cavalera, Cesare; Realdon, Olivia; Jonsson, Gudberg K.; Anguera, M. Teresa

    2018-01-01

    Deception has evolved to become a fundamental aspect of human interaction. Despite prolonged efforts in many disciplines, no univocally “deceptive” signal has been established. This work proposes an approach to deception detection combining cognitive load manipulation and T-pattern methodology, with the objectives of: (a) testing the efficacy of a dual-task procedure in enhancing differences between truth tellers and liars in a low-stakes situation; (b) exploring the efficacy of T-pattern methodology in discriminating truthful reports from deceitful ones in a low-stakes situation; and (c) establishing the experimental design and procedure for subsequent research. We manipulated cognitive load to enhance differences between truth tellers and liars, given the low-stakes lies involved in our experiment. We conducted an experimental study with a convenience sample of 40 students. A first analysis of the behavior frequencies coded through the observation software was carried out in SPSS (version 22), to describe the shape and characteristics of the behavior distributions and explore differences between groups. The datasets were then analyzed with the Theme 6.0 software, which detects repeated patterns (T-patterns) of coded events (non-verbal behaviors) that occur regularly or irregularly within a period of observation. A descriptive analysis of T-pattern frequencies was carried out to explore differences between groups, and an in-depth analysis of more complex patterns was performed to obtain qualitative information on the behavior structure expressed by the participants. Results show that the dual-task procedure enhances the differences observed between liars and truth tellers with T-pattern methodology; moreover, T-pattern detection reveals a greater variety and complexity of behavior in truth tellers than in liars.
These findings support the combination of cognitive load manipulation and T-pattern methodology for deception detection in low-stakes situations, and suggest testing directional hypotheses on a larger probabilistic sample. PMID:29551986

  20. Operative record using intraoperative digital data in neurosurgery.

    PubMed

    Houkin, K; Kuroda, S; Abe, H

    2000-01-01

    The purpose of this study was to develop a new method for more efficient and accurate operative records in neurosurgery using intra-operative digital data, covering both macroscopic procedures and microscopic procedures under an operating microscope. Macroscopic procedures were recorded using a digital camera, and microscopic procedures were recorded using a digital microcamera attached to the operating microscope. The operative records were then digitized and filed in a computer using image retouching software and database software. The time necessary for editing the digital data and completing the record was less than 30 minutes. Once digitally filed, the operative records are easily transferred and used as a database. Using digital operative records along with digital photography, neurosurgeons can document their procedures more accurately and efficiently than with the conventional handwritten method. A complete digital operative record is not only accurate but also time saving. Database construction, data transfer, and desktop publishing can all be achieved using the intra-operative data, including intra-operative photographs.

Top