Sample records for software testing collection

  1. Evaluating software development by analysis of changes: The data from the software engineering laboratory

    NASA Technical Reports Server (NTRS)

    1982-01-01

    An effective data collection methodology for evaluating software development methodologies was applied to four different software development projects. Goals of the data collection included characterizing changes and errors, characterizing projects and programmers, identifying effective error detection and correction techniques, and investigating ripple effects. The data collected consisted of changes (including error corrections) made to the software after code was written and baselined, but before testing began. Data collection and validation were concurrent with software development. Changes reported were verified by interviews with programmers.

  2. Four applications of a software data collection and analysis methodology

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Selby, Richard W., Jr.

    1985-01-01

    The evaluation of software technologies suffers because of the lack of quantitative assessment of their effect on software development and modification. A seven-step data collection and analysis methodology couples software technology evaluation with software measurement. Four in-depth applications of the methodology are presented. The four studies represent each of the general categories of analyses on the software product and development process: blocked subject-project studies, replicated project studies, multi-project variation studies, and single project strategies. The four applications are in the areas of, respectively, software testing, cleanroom software development, characteristic software metric sets, and software error analysis.

  3. Software error data collection and categorization

    NASA Technical Reports Server (NTRS)

    Ostrand, T. J.; Weyuker, E. J.

    1982-01-01

    Software errors detected during development of an interactive special purpose editor system were studied. This product was followed during nine months of coding, unit testing, function testing, and system testing. A new error categorization scheme was developed.

  4. Software Reliability, Measurement, and Testing. Volume 2. Guidebook for Software Reliability Measurement and Testing

    DTIC Science & Technology

    1992-04-01

    contractor’s existing data collection, analysis and corrective action system shall be utilized, with modification only as necessary to meet the...either from test or from analysis of field data. The procedures of MIL-STD-756B assume that the reliability of a...to generate sufficient data to report a statistically valid reliability figure for a class of software. Casual data gathering accumulates data more

  5. Development and validation of techniques for improving software dependability

    NASA Technical Reports Server (NTRS)

    Knight, John C.

    1992-01-01

    A collection of document abstracts is presented on the topic of improving software dependability through NASA grant NAG-1-1123. Specific topics include: modeling of error detection; software inspection; test cases; Magnetic Stereotaxis System safety specifications and fault trees; and injection of synthetic faults into software.

  6. Artificial Neural Networks-Based Software for Measuring Heat Collection Rate and Heat Loss Coefficient of Water-in-Glass Evacuated Tube Solar Water Heaters

    PubMed Central

    Liu, Zhijian; Liu, Kejun; Li, Hao; Zhang, Xinyu; Jin, Guangya; Cheng, Kewei

    2015-01-01

    Measurements of heat collection rate and heat loss coefficient are crucial for the evaluation of in-service water-in-glass evacuated tube solar water heaters. However, conventional measurement requires expensive detection devices and undergoes a series of complicated procedures. To simplify the measurement and reduce the cost, software based on artificial neural networks for measuring heat collection rate and heat loss coefficient of water-in-glass evacuated tube solar water heaters was developed. Using multilayer feed-forward neural networks with a back-propagation algorithm, we developed and tested our program on the basis of 915 measured samples of water-in-glass evacuated tube solar water heaters. This artificial neural networks-based software program automatically obtained accurate heat collection rate and heat loss coefficient from parameters acquired simply with portable test instruments, including tube length, number of tubes, tube center distance, hot water mass in tank, collector area, angle between tubes and ground, and final temperature. Our results show that this software (on both personal computer and Android platforms) is efficient and convenient for predicting the heat collection rate and heat loss coefficient, owing to its low root mean square errors in prediction. The software can be downloaded from http://t.cn/RLPKF08. PMID:26624613

  7. Artificial Neural Networks-Based Software for Measuring Heat Collection Rate and Heat Loss Coefficient of Water-in-Glass Evacuated Tube Solar Water Heaters.

    PubMed

    Liu, Zhijian; Liu, Kejun; Li, Hao; Zhang, Xinyu; Jin, Guangya; Cheng, Kewei

    2015-01-01

    Measurements of heat collection rate and heat loss coefficient are crucial for the evaluation of in-service water-in-glass evacuated tube solar water heaters. However, conventional measurement requires expensive detection devices and undergoes a series of complicated procedures. To simplify the measurement and reduce the cost, software based on artificial neural networks for measuring heat collection rate and heat loss coefficient of water-in-glass evacuated tube solar water heaters was developed. Using multilayer feed-forward neural networks with a back-propagation algorithm, we developed and tested our program on the basis of 915 measured samples of water-in-glass evacuated tube solar water heaters. This artificial neural networks-based software program automatically obtained accurate heat collection rate and heat loss coefficient from parameters acquired simply with portable test instruments, including tube length, number of tubes, tube center distance, hot water mass in tank, collector area, angle between tubes and ground, and final temperature. Our results show that this software (on both personal computer and Android platforms) is efficient and convenient for predicting the heat collection rate and heat loss coefficient, owing to its low root mean square errors in prediction. The software can be downloaded from http://t.cn/RLPKF08.
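
    As an illustration of the workflow records 6 and 7 describe, the sketch below trains a feed-forward network on the seven listed instrument parameters to predict both outputs. It is a hedged sketch only: scikit-learn's MLPRegressor stands in for the authors' implementation, and random placeholder arrays stand in for the 915 measured samples, so only the shape of the workflow is faithful.

```python
# Sketch: feed-forward network with backpropagation mapping seven
# instrument-acquired parameters to heat collection rate and heat loss
# coefficient. Placeholder data; not the authors' code or data.
import numpy as np
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
# columns: tube length, number of tubes, tube center distance, hot water
# mass in tank, collector area, tube/ground angle, final temperature
X = rng.uniform(size=(915, 7))
y = np.column_stack([X @ rng.uniform(size=7),   # heat collection rate
                     X @ rng.uniform(size=7)])  # heat loss coefficient

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
model.fit(X_train, y_train)

rmse = np.sqrt(mean_squared_error(y_test, model.predict(X_test)))
print(f"test RMSE: {rmse:.4f}")
```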

  8. An experiment in software reliability: Additional analyses using data from automated replications

    NASA Technical Reports Server (NTRS)

    Dunham, Janet R.; Lauterbach, Linda A.

    1988-01-01

    A study undertaken to collect software error data of laboratory quality, for use in developing credible methods for predicting the reliability of software used in life-critical applications, is summarized. The software error data reported were acquired through automated repetitive-run testing of three independent implementations of a launch interceptor condition module of a radar tracking problem. The results are based on 100 test applications to accumulate a sufficient sample size for error rate estimation. The data collected are used to confirm the results of two Boeing studies, reported in NASA-CR-165836, Software Reliability: Repetitive Run Experimentation and Modeling, and NASA-CR-172378, Software Reliability: Additional Investigations into Modeling With Replicated Experiments. That is, the results confirm the log-linear pattern of software error rates and reject the hypothesis of equal error rates per individual fault. This rejection casts doubt on the assumption that a program's failure rate is a constant multiple of the number of residual bugs, an assumption that underlies some current models of software reliability. The data also raise new questions concerning the phenomenon of interacting faults.
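
    One plausible way to operationalize the per-fault analysis this record reports is sketched below, with made-up counts: estimate each fault's error rate from replicated runs, then check how well the rates follow a log-linear pattern across fault ranks. This is an illustrative reading of the analysis, not the study's actual procedure or data.

```python
# Sketch: per-fault error rates from repetitive-run testing, checked
# against a log-linear pattern. Unequal per-fault rates argue against
# "failure rate = constant x residual bugs".
import numpy as np

n_runs = 100                                      # repetitive test runs
fault_failures = np.array([60, 25, 11, 5, 2, 1])  # runs where each fault fired

rates = np.sort(fault_failures / n_runs)[::-1]    # rate per fault, ranked
rank = np.arange(1, rates.size + 1)

# fit log(rate) as linear in rank; small residuals support log-linearity
slope, intercept = np.polyfit(rank, np.log(rates), 1)
resid = np.log(rates) - (slope * rank + intercept)
print(f"log-linear slope {slope:.3f}, RMS residual {np.sqrt(np.mean(resid**2)):.3f}")
```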

  9. Selecting Really Excellent Software for Young Adults.

    ERIC Educational Resources Information Center

    Polly, Jean Armour

    1985-01-01

    This article discusses criteria of a good computer software package to aid the public librarian in the building, weeding, and maintenance of a software collection for young adults. Highlights include manuals or documentation; bells, whistles, and color; and the true test of time. (EJS)

  10. 78 FR 77646 - Proposed Information Collection; Comment Request; 2014 Census Site Test

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-24

    ... pre-notification containing instructions about how to respond to the test online. Some households will... Adaptive Design Strategies portion will test a method of managing data collection by dynamically adapting... methodology. The objectives of this component of the test are to: Design and develop software solutions...

  11. Small-scale fixed wing airplane software verification flight test

    NASA Astrophysics Data System (ADS)

    Miller, Natasha R.

    The increased demand for micro Unmanned Air Vehicles (UAVs), driven by military requirements, commercial use, and academia, is creating a need for the ability to quickly and accurately conduct low Reynolds number aircraft design. Several free or inexpensive open source software programs can be used for large-scale aircraft design, but few target the realm of low Reynolds number flight. XFLR5 is an open source, freely downloadable software program that attempts to take into consideration the viscous effects that occur at low Reynolds numbers in airfoil design, 3D wing design, and 3D airplane design. An off-the-shelf remote-control airplane was used as a test bed, modeled in XFLR5, and then compared against data collected in flight test. Flight testing focused on the stability modes of the 3D plane, specifically the phugoid mode. Design and execution of the flight tests were accomplished for the RC airplane using methodology from full-scale military airplane test procedures. Results from flight test were not conclusive in determining the accuracy of the XFLR5 software program; there were several sources of uncertainty that did not allow for a full analysis of the flight test results. An off-the-shelf drone autopilot was used as the data collection device for flight testing; its precision and accuracy are unknown. Potential future work should investigate flight test methods for small-scale UAV flight.
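
    One standard way to reduce phugoid flight-test data of the kind this record describes is the logarithmic-decrement method: find successive oscillation peaks in the airspeed (or pitch) time history and estimate period and damping from them. The sketch below uses synthetic data in place of autopilot logs; it is not the thesis's actual reduction procedure.

```python
# Sketch: phugoid period and damping ratio via peak picking and the
# logarithmic decrement. Synthetic damped oscillation stands in for
# recorded airspeed deviations.
import numpy as np
from scipy.signal import find_peaks

t = np.linspace(0, 60, 3000)                 # time, s
zeta, wn = 0.05, 2 * np.pi / 20              # assumed phugoid parameters
v = 2.0 * np.exp(-zeta * wn * t) * np.cos(wn * np.sqrt(1 - zeta**2) * t)

peaks, _ = find_peaks(v)
period = np.mean(np.diff(t[peaks]))          # damped period, s
delta = np.mean(np.log(v[peaks][:-1] / v[peaks][1:]))   # log decrement
zeta_est = delta / np.sqrt(4 * np.pi**2 + delta**2)
print(f"period ~ {period:.1f} s, damping ratio ~ {zeta_est:.3f}")
```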

  12. General-Purpose Electronic System Tests Aircraft

    NASA Technical Reports Server (NTRS)

    Glover, Richard D.

    1989-01-01

    Versatile digital equipment supports research, development, and maintenance. Extended aircraft interrogation and display system is general-purpose assembly of digital electronic equipment on ground for testing of digital electronic systems on advanced aircraft. Many advanced features, including multiple 16-bit microprocessors, pipeline data-flow architecture, advanced operating system, and resident software-development tools. Basic collection of software includes program for handling many types of data and for displays in various formats. User easily extends basic software library. Hardware and software interfaces to subsystems provided by user designed for flexibility in configuration to meet user's requirements.

  13. MATTS- A Step Towards Model Based Testing

    NASA Astrophysics Data System (ADS)

    Herpel, H.-J.; Willich, G.; Li, J.; Xie, J.; Johansen, B.; Kvinnesland, K.; Krueger, S.; Barrios, P.

    2016-08-01

    In this paper we describe a model-based approach to testing on-board software and compare it with the traditional validation strategy currently applied to satellite software. The major problems that software engineering will face over at least the next two decades are increasing application complexity, driven by the need for autonomy, and serious application robustness; in other words, how do we actually get to declare success when trying to build applications one or two orders of magnitude more complex than today's? To solve these problems, the software engineering process has to improve in at least two respects: 1) software design and 2) software testing. The software design process has to evolve towards model-based approaches with extensive use of code generators. Today, testing is an essential but time- and resource-consuming activity in the software development process. Generating a short but effective test suite usually requires a lot of manual work and expert knowledge. In a model-based process, among other subtasks, test construction and test execution can be partially automated. The basic idea behind the presented study was to start from a formal model (e.g. state machines), generate abstract test cases, and then convert them to concrete executable test cases (input and expected output pairs). The generated concrete test cases were applied to on-board software. Results were collected and evaluated with respect to applicability, cost-efficiency, effectiveness at fault finding, and scalability.
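
    A minimal sketch of the flow this record outlines is shown below: start from a state-machine model, generate abstract test cases that cover every transition, and turn them into concrete (input sequence, expected output) pairs. The tiny mode-switch model is illustrative only, not the study's model or tooling.

```python
# Sketch: transition-coverage test generation from a state machine.
from collections import deque

# transitions: (state, input) -> (next_state, expected_output)
fsm = {
    ("STANDBY", "arm"):     ("ARMED", "ack_arm"),
    ("ARMED", "fire"):      ("ACTIVE", "ack_fire"),
    ("ARMED", "disarm"):    ("STANDBY", "ack_disarm"),
    ("ACTIVE", "shutdown"): ("STANDBY", "ack_shutdown"),
}

def transition_tour(fsm, start):
    """For each transition, find a shortest input prefix reaching its
    source state (BFS), then append the transition's own input."""
    tests = []
    for (state, inp), (_, out) in fsm.items():
        frontier, seen, prefix = deque([(start, [])]), {start}, None
        while frontier:
            s, path = frontier.popleft()
            if s == state:
                prefix = path
                break
            for (s2, i2), (n2, _) in fsm.items():
                if s2 == s and n2 not in seen:
                    seen.add(n2)
                    frontier.append((n2, path + [i2]))
        if prefix is not None:
            tests.append((prefix + [inp], out))
    return tests

for inputs, expected in transition_tour(fsm, "STANDBY"):
    print(inputs, "->", expected)
```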

  14. Statistical modeling of software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1992-01-01

    This working paper discusses the statistical simulation part of a controlled software development experiment being conducted under the direction of the System Validation Methods Branch, Information Systems Division, NASA Langley Research Center. The experiment uses guidance and control software (GCS) aboard a fictitious planetary landing spacecraft: real-time control software operating on a transient mission. Software execution is simulated to study the statistical aspects of reliability and other failure characteristics of the software during development, testing, and random usage. Quantification of software reliability is a major goal. Various reliability concepts are discussed. Experiments are described for performing simulations and collecting appropriate simulated software performance and failure data. This data is then used to make statistical inferences about the quality of the software development and verification processes as well as inferences about the reliability of software versions and reliability growth under random testing and debugging.

  15. Software development predictors, error analysis, reliability models and software metric analysis

    NASA Technical Reports Server (NTRS)

    Basili, Victor

    1983-01-01

    The use of dynamic characteristics as predictors for software development was studied. It was found that there are some significant factors that could be useful as predictors. From a study on software errors and complexity, it was shown that meaningful results can be obtained which allow insight into software traits and the environment in which it is developed. Reliability models were studied. The research included the field of program testing because the validity of some reliability models depends on the answers to some unanswered questions about testing. In studying software metrics, data collected from seven software engineering laboratory (FORTRAN) projects were examined and three effort reporting accuracy checks were applied to demonstrate the need to validate a data base. Results are discussed.

  16. Multi-version software reliability through fault-avoidance and fault-tolerance

    NASA Technical Reports Server (NTRS)

    Vouk, Mladen A.; Mcallister, David F.

    1989-01-01

    A number of experimental and theoretical issues associated with the practical use of multi-version software to provide run-time tolerance to software faults were investigated. A specialized tool was developed and evaluated for measuring testing coverage for a variety of metrics. The tool was used to collect information on the relationships between software faults and the coverage provided by the testing process as measured by different metrics (including data flow metrics). Considerable correlation was found between the coverage provided by some higher metrics and the elimination of faults in the code. Back-to-back testing was continued as an efficient mechanism for the removal of uncorrelated faults and common-cause faults of variable span. Work also continued on software reliability estimation methods based on non-random sampling and on the relationship between software reliability and the code coverage provided through testing. New fault tolerance models were formulated. Simulation studies of the Acceptance Voting and Multi-stage Voting algorithms were completed, and it was found that these two schemes for software fault tolerance are superior in many respects to some commonly used schemes. Particularly encouraging are the safety properties of the Acceptance Voting scheme.
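
    The back-to-back testing mentioned above compares independent implementations of one specification on common inputs; any disagreement points at a fault in at least one version. The sketch below shows the idea on a toy specification (median of three), with a deliberately seeded fault; it is illustrative, not the study's tooling.

```python
# Sketch: back-to-back testing of three versions of median-of-three.
import random

def version_a(x, y, z):
    return sorted([x, y, z])[1]

def version_b(x, y, z):
    return max(min(x, y), min(max(x, y), z))

def version_c(x, y, z):              # seeded fault: one comparison missing
    return min(max(x, y), max(y, z))

versions = [version_a, version_b, version_c]
random.seed(0)

for trial in range(10_000):
    args = [random.randint(-5, 5) for _ in range(3)]
    outputs = [v(*args) for v in versions]
    if len(set(outputs)) > 1:        # disagreement reveals a fault
        print(f"trial {trial}: inputs {args} -> outputs {outputs}")
        break
```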

  17. Imaging Sensor Flight and Test Equipment Software

    NASA Technical Reports Server (NTRS)

    Freestone, Kathleen; Simeone, Louis; Robertson, Byran; Frankford, Maytha; Trice, David; Wallace, Kevin; Wilkerson, DeLisa

    2007-01-01

    The Lightning Imaging Sensor (LIS) is one of the components onboard the Tropical Rainfall Measuring Mission (TRMM) satellite, and was designed to detect and locate lightning over the tropics. The LIS flight code was developed to run on a single onboard digital signal processor, and has operated the LIS instrument since 1997 when the TRMM satellite was launched. The software provides controller functions to the LIS Real-Time Event Processor (RTEP) and onboard heaters, collects the lightning event data from the RTEP, compresses and formats the data for downlink to the satellite, collects housekeeping data and formats the data for downlink to the satellite, provides command processing and interface to the spacecraft communications and data bus, and provides watchdog functions for error detection. The Special Test Equipment (STE) software was designed to operate specific test equipment used to support the LIS hardware through development, calibration, qualification, and integration with the TRMM spacecraft. The STE software provides the capability to control instrument activation, commanding (including both data formatting and user interfacing), data collection, decompression, and display and image simulation. The LIS STE code was developed for the DOS operating system in the C programming language. Because of the many unique data formats implemented by the flight instrument, the STE software was required to comprehend the same formats, and translate them for the test operator. The hardware interfaces to the LIS instrument using both commercial and custom computer boards, requiring that the STE code integrate this variety into a working system. In addition, the requirement to provide RTEP test capability dictated the need to provide simulations of background image data with short-duration lightning transients superimposed. This led to the development of unique code used to control the location, intensity, and variation above background for simulated lightning strikes at user-selected locations.

  18. Development of a Unix/VME data acquisition system

    NASA Astrophysics Data System (ADS)

    Miller, M. C.; Ahern, S.; Clark, S. M.

    1992-01-01

    The current status of a Unix-based VME data acquisition development project is described. It is planned to use existing Fortran data collection software to drive the existing CAMAC electronics via a VME CAMAC branch driver card and associated Daresbury Unix driving software. The first usable Unix driver has been written and produces single-action CAMAC cycles from test software. The data acquisition code has been implemented in test mode under Unix with few problems and effort is now being directed toward finalizing calls to the CAMAC-driving software and ultimate evaluation of the complete system.

  19. Reliability Testing Using the Vehicle Durability Simulator

    DTIC Science & Technology

    2017-11-20

    remote parameter control (RPC) software. The software is specifically designed for the data collection, analysis, and simulation processes outlined in...4516. 3. TOP 02-2-505 Inspection and Preliminary Operation of Vehicles, 4 February 1987. 4. Multi-Shaker Test and Control: Design, Test, and...

  20. Some Methods of Applied Numerical Analysis to 3d Facial Reconstruction Software

    NASA Astrophysics Data System (ADS)

    Roşu, Şerban; Ianeş, Emilia; Roşu, Doina

    2010-09-01

    This paper deals with the collective work performed by medical doctors from the University of Medicine and Pharmacy Timisoara and engineers from the Politechnical Institute Timisoara in the effort to create the first Romanian 3D reconstruction software based on CT or MRI scans and to test the created software in clinical practice.

  21. Establishing Qualitative Software Metrics in Department of the Navy Programs

    DTIC Science & Technology

    2015-10-29

    dedicated to provide the highest quality software to its users. In doing so, there is a need for a formalized set of Software Quality Metrics. The goal...of this paper is to establish the validity of those necessary Quality metrics. In our approach we collected the data of over a dozen programs...provide the necessary variable data for our formulas and tested the formulas for validity. Keywords: metrics; software; quality

  22. NDAS Hardware Translation Layer Development

    NASA Technical Reports Server (NTRS)

    Nazaretian, Ryan N.; Holladay, Wendy T.

    2011-01-01

    The NASA Data Acquisition System (NDAS) project aims to replace all DAS software for NASA's rocket testing facilities. There must be a software-hardware translation layer so the software can properly talk to the hardware. Since the hardware at each test stand varies, drivers have to be made for each stand. These drivers act more like plugins for the software: if the software is being used at E3, it should point to the E3 driver package; if it is being used at B2, it should point to the B2 driver package. The driver packages should also include hardware drivers that are universal to the DAS system. For example, since A1, A2, and B2 all use the Preston 8300AU signal conditioners, the driver for those three stands should be the same and updated collectively.

  23. Business Intelligence Applied to the ALMA Software Integration Process

    NASA Astrophysics Data System (ADS)

    Zambrano, M.; Recabarren, C.; González, V.; Hoffstadt, A.; Soto, R.; Shen, T.-C.

    2012-09-01

    Software quality assurance and planning for an astronomy project is a complex task, especially in a distributed collaborative project such as ALMA, where the development centers are spread across the globe. When you execute a software project, there is much valuable information about the process itself that you can collect. One way to receive this input is via an issue tracking system that gathers problem reports relating to software bugs captured during testing of the software, during integration of the different components, or, even worse, problems that occur in production. Usually little time is spent analyzing these reports, but with some multidimensional processing you can extract valuable information from them that helps with long-term planning and resource allocation. We present an analysis of the information collected at ALMA from a collection of key unbiased indicators. We describe the extraction, transformation, and load process and how the data were processed. The main goal is to assess a software process and get insights from this information.

  24. SEDS1 mission software verification using a signal simulator

    NASA Technical Reports Server (NTRS)

    Pierson, William E.

    1992-01-01

    The first flight of the Small Expendable Deployer System (SEDS1) is scheduled to fly as the secondary payload of a Delta II in March 1993. The objective of the SEDS1 mission is to collect data to validate the concept of tethered satellite systems and to verify computer simulations used to predict their behavior. SEDS1 will deploy a 50 lb. instrumented satellite as an end mass using a 20 km tether. Langley Research Center is providing the end mass instrumentation, while the Marshall Space Flight Center is designing and building the deployer. The objective of the experiment is to test the SEDS design concept by demonstrating that the system will satisfactorily deploy the full 20 km tether without stopping prematurely, come to a smooth stop on the application of a brake, and cut the tether at the proper time after it swings to the local vertical. Also, SEDS1 will collect data which will be used to test the accuracy of tether dynamics models used to simulate this type of deployment. The experiment will last about 1.5 hours and complete approximately 1.5 orbits. Radar tracking of the Delta II and end mass is planned. In addition, the SEDS1 on-board computer will continuously record, store, and transmit mission data over the Delta II S-band telemetry system. The Data System will count tether windings as the tether unwinds, log the times of each turn and other mission events, monitor tether tension, and record the temperature of system components. A summary of the measurements taken during SEDS1 is shown. The Data System will also control the tether brake and cutter mechanisms. Preliminary versions of two major sections of the flight software, the data telemetry modules and the data collection modules, were developed and tested under the 1990 NASA/ASEE Summer Faculty Fellowship Program. To facilitate the debugging of these software modules, a prototype SEDS Data System was programmed to simulate turn count signals. During the 1991 summer program, the concept of simulating signals produced by the SEDS electronics systems and circuits was expanded and more precisely defined. During the 1992 summer program, the SEDS signal simulator was programmed to test the requirements of the SEDS Mission Software, and this simulator will be used in the formal verification of the SEDS Mission Software. A formal test procedures specification was written that incorporates the use of the signal simulator to test the SEDS Mission Software and procedures for testing the other major component of the SEDS software, the Monitor Software.

  25. Developing Simulated Cyber Attack Scenarios Against Virtualized Adversary Networks

    DTIC Science & Technology

    2017-03-01

    MAST is a custom software framework originally designed to facilitate the training of network administrators on live networks using SimWare. The MAST...scenario development and testing in a virtual test environment. Commercial and custom software tools that provide the ability to conduct network

  26. Annual Research Progress Report.

    DTIC Science & Technology

    1979-09-30

    will be trained in SLRL test procedures and the methodology will be developed for the incorporation of test materials into the standard rearing diet...requirements exist for system software maintenance and development of software to report dosing data, to calculate diet preparation data, to manage collected...influence of diet and exercise on myoglobin and metmyoglobin reductase were evaluated in the rat. The activity of metmyoglobin reductase was

  27. Solar Constant (SOLCON) Experiment: Ground Support Equipment (GSE) software development

    NASA Technical Reports Server (NTRS)

    Gibson, M. Alan; Thomas, Susan; Wilson, Robert

    1991-01-01

    The Solar Constant (SOLCON) Experiment, the objective of which is to determine the solar constant value and its variability, is scheduled for launch as part of the Space Shuttle/Atmospheric Laboratory for Application and Science (ATLAS) spacelab mission. The Ground Support Equipment (GSE) software was developed to monitor and analyze the SOLCON telemetry data during flight and to test the instrument on the ground. The design and development of the GSE software are discussed. The SOLCON instrument was tested during the Davos International Solar Intercomparison in 1989, and the SOLCON data collected during the tests are analyzed to study the behavior of the instrument.

  28. 75 FR 14145 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-24

    ... made in response to usability testing as well as internal review. Changes have also been made to... burden of reporting using the electronic software and improve usability. The affected questions can be...

  29. Data Link Test and Analysis System/ATCRBS Transponder Test System Technical Reference

    DOT National Transportation Integrated Search

    1990-05-01

    This document references material for personnel using or making software changes to the Data Link Test and Analysis System (DATAS) for Air Traffic Control Radar Beacon System (ATCRBS) transponder testing and data collection. This is one of a se...

  30. Support for Diagnosis of Custom Computer Hardware

    NASA Technical Reports Server (NTRS)

    Molock, Dwaine S.

    2008-01-01

    The Coldfire SDN Diagnostics software is a flexible means of exercising, testing, and debugging custom computer hardware. The software is a set of routines that, collectively, serve as a common software interface through which one can gain access to various parts of the hardware under test and/or cause the hardware to perform various functions. The routines can be used to construct tests to exercise, and verify the operation of, various processors and hardware interfaces. More specifically, the software can be used to gain access to memory, to execute timer delays, to configure interrupts, and configure processor cache, floating-point, and direct-memory-access units. The software is designed to be used on diverse NASA projects, and can be customized for use with different processors and interfaces. The routines are supported, regardless of the architecture of a processor that one seeks to diagnose. The present version of the software is configured for Coldfire processors on the Subsystem Data Node processor boards of the Solar Dynamics Observatory. There is also support for the software with respect to Mongoose V, RAD750, and PPC405 processors or their equivalents.

  31. Application of industry-standard guidelines for the validation of avionics software

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J.; Shagnea, Anita M.

    1990-01-01

    The application of industry standards to the development of avionics software is discussed, focusing on verification and validation activities. It is pointed out that the procedures that guide the avionics software development and testing process are under increased scrutiny. The DO-178A guidelines, Software Considerations in Airborne Systems and Equipment Certification, are used by the FAA for certifying avionics software. To investigate the effectiveness of the DO-178A guidelines for improving the quality of avionics software, guidance and control software (GCS) is being developed according to the DO-178A development method. It is noted that, due to the extent of the data collection and configuration management procedures, any phase in the life cycle of a GCS implementation can be reconstructed. Hence, a fundamental development and testing platform has been established that is suitable for investigating the adequacy of various software development processes. In particular, the overall effectiveness and efficiency of the development method recommended by the DO-178A guidelines are being closely examined.

  32. DAQ: Software Architecture for Data Acquisition in Sounding Rockets

    NASA Technical Reports Server (NTRS)

    Ahmad, Mohammad; Tran, Thanh; Nichols, Heidi; Bowles-Martinez, Jessica N.

    2011-01-01

    A multithreaded software application was developed by the Jet Propulsion Laboratory (JPL) to collect a set of correlated imagery, Inertial Measurement Unit (IMU), and GPS data for a Wallops Flight Facility (WFF) sounding rocket flight. The data set will be used to advance Terrain Relative Navigation (TRN) technology algorithms being researched at JPL. This paper describes the software architecture and the tests used to meet the timing and data rate requirements of the data collection software. Also discussed are the challenges of using commercial off-the-shelf (COTS) flight hardware and open source software, including multiple Camera Link (C-link) based cameras, a Pentium-M based computer, and the Linux Fedora 11 operating system. Additionally, the paper covers the history of the software architecture's usage in other JPL projects and its applicability for future missions, such as cubesats, UAVs, and research planes/balloons. It also discusses the human aspect of the project, especially JPL's Phaeton program, and the results of the launch.
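
    A common pattern for the multithreaded, multi-sensor collection this record describes is sketched below: one producer thread per stream timestamps samples onto a shared queue, and a single consumer drains it so the streams can be correlated afterwards. This is a hedged sketch of the pattern, not JPL's code; read_sample() is a hypothetical stand-in for device I/O.

```python
# Sketch: per-sensor producer threads feeding one timestamped queue.
import queue
import threading
import time

buf = queue.Queue(maxsize=10_000)
stop = threading.Event()

def read_sample(source):
    """Hypothetical device read; sleep approximates each stream's rate."""
    time.sleep({"camera": 0.033, "imu": 0.01, "gps": 1.0}[source])
    return b"..."                       # placeholder payload

def producer(source):
    while not stop.is_set():
        buf.put((time.monotonic(), source, read_sample(source)))

threads = [threading.Thread(target=producer, args=(s,), daemon=True)
           for s in ("camera", "imu", "gps")]
for th in threads:
    th.start()

t_end = time.monotonic() + 2.0          # collect for two seconds
while time.monotonic() < t_end:
    ts, source, sample = buf.get()
    # the real system would write this record to disk for correlation
    print(f"{ts:.3f} {source} {len(sample)} bytes")
stop.set()
```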

  33. Survey Email Scheduling and Monitoring in eRCTs (SESAMe): A Digital Tool to Improve Data Collection in Randomized Controlled Clinical Trials.

    PubMed

    Skonnord, Trygve; Steen, Finn; Skjeie, Holgeir; Fetveit, Arne; Brekke, Mette; Klovning, Atle

    2016-11-22

    Electronic questionnaires can ease data collection in randomized controlled trials (RCTs) in clinical practice. We found no existing software that could automate the sending of emails to participants enrolled into an RCT at different study participant inclusion time points. Our aim was to develop suitable software to facilitate data collection in an ongoing multicenter RCT of low back pain (the Acuback study). For the Acuback study, we determined that we would need to send a total of 5130 emails to 270 patients recruited at different centers and at 19 different time points. The first version of the software was tested in a pilot study in November 2013 but was unable to deliver multiuser or Web-based access. We resolved these shortcomings in the next version, which we tested on the Web in February 2014. Our new version was able to schedule and send the required emails in the full-scale Acuback trial that started in March 2014. The system architecture evolved through an iterative, inductive process between the project study leader and the software programmer. The program was tested and updated when errors occurred. To evaluate the development of the software, we used a logbook, a research assistant dialogue, and Acuback trial participant queries. We have developed a Web-based app, Survey Email Scheduling and Monitoring in eRCTs (SESAMe), that monitors responses in electronic surveys and sends reminders by emails or text messages (short message service, SMS) to participants. The overall response rate for the 19 surveys in the Acuback study increased from 76.4% (655/857) before we introduced reminders to 93.11% (1149/1234) after the new function (P<.001). Further development will aim at securing encryption and data storage. The SESAMe software facilitates consecutive patient data collection in RCTs and can be used to increase response rates and quality of research, both in general practice and in other clinical trial settings.
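
    The core scheduling idea behind SESAMe, as this record describes it, is that each participant is enrolled on a different date and surveys fall at fixed offsets from that participant's inclusion. A minimal sketch under that assumption follows; send_email() is hypothetical, and a short offset list stands in for the 19 Acuback time points.

```python
# Sketch: inclusion-relative survey reminders, one daily scheduler pass.
from datetime import date, timedelta

followup_offsets = [timedelta(days=d) for d in (0, 7, 14, 28, 56, 84, 365)]

participants = {           # participant id -> inclusion date
    "P001": date(2014, 3, 3),
    "P002": date(2014, 4, 22),
}

def send_email(participant_id, survey_no):
    """Hypothetical mail hook; the real tool also sent SMS reminders."""
    print(f"reminder to {participant_id}: survey {survey_no} due today")

def run_daily(today):
    for pid, enrolled in participants.items():
        for i, offset in enumerate(followup_offsets, start=1):
            if enrolled + offset == today:
                send_email(pid, i)

run_daily(date(2014, 3, 10))   # P001 receives its day-7 survey reminder
```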

  34. Using Web Metric Software to Drive: Mobile Website Development

    ERIC Educational Resources Information Center

    Tidal, Junior

    2011-01-01

    Many libraries have developed mobile versions of their websites. In order to understand their users, web developers have conducted both usability tests and focus groups, yet analytical software and web server logs can also be used to better understand users. Using data collected from these tools, the Ursula C. Schwerin Library has made informed…

  35. Recommended approach to software development

    NASA Technical Reports Server (NTRS)

    Mcgarry, F. E.; Page, J.; Eslinger, S.; Church, V.; Merwarth, P.

    1983-01-01

    A set of guidelines for an organized, disciplined approach to software development is presented, based on data collected and studied for 46 flight dynamics software development projects. Methods and practices are described for each phase of a software development life cycle that starts with requirements analysis and ends with acceptance testing; maintenance and operation are not addressed. For each defined life cycle phase, guidelines are presented for the development process and its management, and for the products produced and their reviews.

  36. Detection of Subsurface Defects in Levees in Correlation to Weather Conditions Utilizing Ground Penetrating Radar

    NASA Astrophysics Data System (ADS)

    Martinez, I. A.; Eisenmann, D.

    2012-12-01

    Ground Penetrating Radar (GPR) has been used for many years for successful subsurface detection of conductive and non-conductive objects in all types of material, including different soils and concrete. Typical defect detection is based on subjective examination of processed scans, using data collection and analysis software to acquire and analyze the data; this often requires developed expertise or an awareness of how GPR works while collecting data. Processing programs, such as GSSI's RADAN analysis software, are then used to validate the collected information. Iowa State University's Center for Nondestructive Evaluation (CNDE) has built a test site, resembling a typical levee used near rivers, which contains known sub-surface targets of varying size, depth, and conductivity. Scientists at CNDE have developed software with enhanced capabilities to decipher a hyperbola's magnitude and amplitude for GPR signal processing, which has the potential to greatly improve GPR signal processing and defect detection. This study will examine the effects of test parameters such as antenna frequency (400 MHz), data manipulation methods (including data filters and restricting the depth range the chosen antenna's signal can reach), and real-world conditions at this test site (such as varying weather), with the goal of improving GPR test sensitivity for differing soil conditions.
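
    For context on the hyperbola analysis this record mentions: a buried point target at depth d and horizontal position x0 produces two-way travel times t(x) = (2/v)·sqrt((x − x0)² + d²), and fitting that curve to picked arrivals recovers position, depth, and soil velocity. The sketch below uses synthetic picks in place of RADAN-style processed data and is not CNDE's software.

```python
# Sketch: least-squares fit of a GPR diffraction hyperbola (units: m, ns).
import numpy as np
from scipy.optimize import curve_fit

def twt(x, x0, d, v):
    """Two-way travel time over a point target at (x0, d), velocity v."""
    return 2.0 * np.sqrt((x - x0) ** 2 + d ** 2) / v

x = np.linspace(0.0, 4.0, 41)                  # antenna positions, m
rng = np.random.default_rng(0)
t_obs = twt(x, 2.0, 0.6, 0.1) + rng.normal(0, 0.1, x.size)  # noisy picks

(x0, d, v), _ = curve_fit(twt, x, t_obs, p0=(1.5, 0.5, 0.12))
print(f"x0 = {x0:.2f} m, depth = {d:.2f} m, velocity = {v:.3f} m/ns")
```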

  37. Engineering intelligent tutoring systems

    NASA Technical Reports Server (NTRS)

    Warren, Kimberly C.; Goodman, Bradley A.

    1993-01-01

    We have defined an object-oriented software architecture for Intelligent Tutoring Systems (ITS's) to facilitate the rapid development, testing, and fielding of ITS's. This software architecture partitions the functionality of the ITS into a collection of software components with well-defined interfaces and execution concept. The architecture was designed to isolate advanced technology components, partition domain dependencies, take advantage of the increased availability of commercial software packages, and reduce the risks involved in acquiring ITS's. A key component of the architecture, the Executive, is a publish and subscribe message handling component that coordinates all communication between ITS components.

  38. Healthwatch-2 System Overview

    NASA Technical Reports Server (NTRS)

    Barszcz, Eric; Mosher, Marianne; Huff, Edward M.

    2004-01-01

    Healthwatch-2 (HW-2) is a research tool designed to facilitate the development and testing of in-flight health monitoring algorithms. HW-2 software is written in C/C++ and executes on an x86-based computer running the Linux operating system. The executive module has interfaces for collecting various signal data, such as vibration, torque, tachometer, and GPS. It is designed to perform in-flight time or frequency averaging based on specifications defined in a user-supplied configuration file. Averaged data are then passed to a user-supplied algorithm written as a Matlab function. This allows researchers a convenient method for testing in-flight algorithms. In addition to its in-flight capabilities, HW-2 software is also capable of reading archived flight data and processing it as if collected in-flight. This allows algorithms to be developed and tested in the laboratory before being flown. Currently HW-2 has passed its checkout phase and is collecting data on a Bell OH-58C helicopter operated by the U.S. Army at NASA Ames Research Center.
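
    The "in-flight time averaging" this record mentions is, in rotorcraft health monitoring, typically tachometer-driven time-synchronous averaging: vibration samples are cut into shaft revolutions and averaged so shaft-synchronous components stand out while asynchronous noise averages away. The numpy sketch below illustrates that technique under a constant-shaft-speed assumption; it is not HW-2 code (HW-2 is C/C++ with Matlab user functions).

```python
# Sketch: time-synchronous averaging (TSA) of a vibration record.
import numpy as np

fs = 10_000                                    # sample rate, Hz
t = np.arange(0, 5.0, 1 / fs)
shaft_hz = 20.0                                # assumed constant shaft speed
signal = np.sin(2 * np.pi * 3 * shaft_hz * t)  # 3x shaft-order tone
noise = np.random.default_rng(0).normal(0, 1.0, t.size)
vib = signal + noise

rev_len = int(fs / shaft_hz)                   # samples per revolution
n_revs = vib.size // rev_len
# stack revolutions and average; synchronous content survives
tsa = vib[: n_revs * rev_len].reshape(n_revs, rev_len).mean(axis=0)

print(f"noise std before: {noise.std():.2f}, "
      f"after TSA: {(tsa - signal[:rev_len]).std():.2f}")
```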

  39. ARC-2007-ACD07-0140-002

    NASA Image and Video Library

    2007-07-31

    David L. Iverson of NASA Ames Research Center, Moffett Field, California (in foreground) led development of computer software to monitor the conditions of the gyroscopes that keep the International Space Station (ISS) properly oriented in space as the ISS orbits Earth. Charles Lee is also pictured. During its development, researchers used the software to analyze archived gyroscope records. In these tests, users noticed problems with the gyroscopes long before the current systems flagged glitches. Testers trained using several months of normal space station gyroscope data collected by the International Space Station Mission Control Center at NASA Johnson Space Center, Houston. Promising test results convinced officials to start using the software in 2007.

  40. BrightStat.com: free statistics online.

    PubMed

    Stricker, Daniel

    2008-10-01

    Powerful software for statistical analysis is expensive. Here I present BrightStat, statistical software that runs on the Internet and is free of charge. BrightStat's goals and its main capabilities and functionalities are outlined. Three different sample runs, a Friedman test, a chi-square test, and a step-wise multiple regression, are presented. The results obtained by BrightStat are compared with results computed by SPSS, one of the global leaders in statistical software, and VassarStats, a collection of scripts for data analysis running on the Internet. Elementary statistics is an inherent part of academic education, and BrightStat is an alternative to commercial products.
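
    For comparison with the sample runs this record describes, two of the three analyses can be reproduced with open tooling as sketched below (SciPy has no built-in stepwise regression, so that one is omitted); the data are made up for illustration.

```python
# Sketch: Friedman test and chi-square test with SciPy.
import numpy as np
from scipy.stats import chi2_contingency, friedmanchisquare

rng = np.random.default_rng(0)

# Friedman test: 12 subjects measured under three conditions
a = rng.normal(0.0, 1.0, 12)
b = rng.normal(0.5, 1.0, 12)
c = rng.normal(1.0, 1.0, 12)
stat, p = friedmanchisquare(a, b, c)
print(f"Friedman: chi2 = {stat:.2f}, p = {p:.4f}")

# chi-square test of independence on a 2x3 contingency table
table = np.array([[20, 15, 5], [10, 18, 12]])
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi-square: chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```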

  41. Software metrics: The key to quality software on the NCC project

    NASA Technical Reports Server (NTRS)

    Burns, Patricia J.

    1993-01-01

    Network Control Center (NCC) Project metrics are captured during the implementation and testing phases of the NCCDS software development lifecycle. The metrics data collection and reporting function has interfaces with all elements of the NCC project. Close collaboration with all project elements has resulted in the development of a defined and repeatable set of metrics processes. The resulting data are used to plan and monitor release activities on a weekly basis. The use of graphical outputs facilitates the interpretation of progress and status. The successful application of metrics throughout the NCC project has been instrumental in the delivery of quality software. The use of metrics on the NCC Project supports the needs of the technical and managerial staff. This paper describes the project, the functions supported by metrics, the data that are collected and reported, how the data are used, and the improvements in the quality of deliverable software since the metrics processes and products have been in use.

  42. Bird Vision System

    NASA Technical Reports Server (NTRS)

    2008-01-01

    The Bird Vision system is a multicamera photogrammetry software application that runs on a Microsoft Windows XP platform and was developed at Kennedy Space Center by ASRC Aerospace. This software system collects data about the locations of birds within a volume centered on the Space Shuttle and transmits the data in real time to the laptop computer of a test director in the Launch Control Center (LCC) Firing Room.

  43. HITCal: a software tool for analysis of video head impulse test responses.

    PubMed

    Rey-Martinez, Jorge; Batuecas-Caletrio, Angel; Matiño, Eusebi; Perez Fernandez, Nicolás

    2015-09-01

    The developed software (HITCal) may be a useful tool for analyzing and measuring saccadic video head impulse test (vHIT) responses; with the experience obtained during its use, the authors suggest that HITCal is an excellent method for enhanced exploration of vHIT outputs. The aim was to develop a software method to analyze and explore the vHIT responses, mainly saccades. HITCal was written using a computational development program; the function to access a vHIT file was programmed; extended head impulse exploration and measurement tools were created; and an automated saccade analysis was developed using an experimental algorithm. For pre-release HITCal laboratory tests, a database of head impulse tests (HITs) was created with data collected retrospectively in three reference centers. This HITs database was evaluated by humans and was also processed with HITCal. The authors have successfully built HITCal and it has been released as open source software; the developed software was fully operative and all the proposed characteristics were incorporated in the released version. The automated saccades algorithm implemented in HITCal has good concordance with the assessment by human observers (Cohen's kappa coefficient = 0.7).
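
    The record describes HITCal's saccade analysis only as an experimental algorithm, so the sketch below shows a generic stand-in: mark samples whose eye velocity exceeds a threshold after the head impulse, then compare detector output with human labels via Cohen's kappa, the agreement statistic quoted above. Data and threshold are illustrative.

```python
# Sketch: velocity-threshold saccade pick plus per-sample Cohen's kappa.
import numpy as np

fs = 250.0                                    # vHIT sample rate, Hz
t = np.arange(0, 0.6, 1 / fs)
rng = np.random.default_rng(1)
eye_vel = rng.normal(0, 5, t.size)            # deg/s, synthetic trace
eye_vel[(t > 0.25) & (t < 0.30)] += 120.0     # overt catch-up saccade

detected = np.abs(eye_vel) > 60.0             # simple velocity threshold
human = (t > 0.24) & (t < 0.31)               # illustrative human labels

po = np.mean(detected == human)               # observed agreement
pe = (np.mean(detected) * np.mean(human)      # chance agreement
      + np.mean(~detected) * np.mean(~human))
kappa = (po - pe) / (1 - pe)
print(f"agreement kappa = {kappa:.2f}")
```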

  44. Operational Based Vision Assessment Automated Vision Test Collection User Guide

    DTIC Science & Technology

    2017-05-15

    repeatability to support correlation analysis. The AVT research grade tests also support interservice, international, industry, and academic partnerships...software, provides information concerning various menu options and operation of the test, and provides a brief description of each of the automated vision...

  45. Development of a calibrated software reliability model for flight and supporting ground software for avionic systems

    NASA Technical Reports Server (NTRS)

    Lawrence, Stella

    1991-01-01

    The object of this project was to develop and calibrate quantitative models for predicting the quality of software. Reliable flight and supporting ground software is a highly important factor in the successful operation of the space shuttle program. The present study used SMERFS (Statistical Modeling and Estimation of Reliability Functions for Software), which contains ten models. For a first run, modeling the cumulative number of failures versus execution time gave fairly good results for our data. Plots of cumulative software failures versus calendar weeks were made, and the model results were compared with the historical data on the same graph. If a model agrees with actual historical behavior for a set of data, then there is confidence in future predictions for those data. Considering the quality of the data, the models have given some significant results, even at this early stage. With better care in data collection, data analysis, recording of the fixing of failures, and CPU execution times, the models should prove extremely helpful in making predictions regarding the future pattern of failures, including an estimate of the number of errors remaining in the software and the additional testing time required for the software quality to reach acceptable levels. It appears that there is no one 'best' model for all cases; it is for this reason that the aim of this project was to test several models. One of the recommendations resulting from this study is that great care must be taken in the collection of data. When using a model, the data should satisfy the model assumptions.
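
    The fit this record describes, cumulative failures versus execution time, is sketched below for one classic model of the SMERFS family, the Goel-Okumoto NHPP with mean value function mu(t) = a(1 − exp(−bt)); the parameter a then directly yields the "errors remaining" estimate mentioned above. The weekly counts are placeholders, not the project's data, and SMERFS itself offers ten models, not just this one.

```python
# Sketch: Goel-Okumoto NHPP fit to cumulative failure counts.
import numpy as np
from scipy.optimize import curve_fit

def mu(t, a, b):
    """Expected cumulative failures by time t (a = total fault content)."""
    return a * (1.0 - np.exp(-b * t))

weeks = np.arange(1, 21, dtype=float)
cum_failures = np.array([5, 9, 14, 17, 21, 23, 26, 27, 29, 30,
                         32, 33, 33, 34, 35, 35, 36, 36, 37, 37], float)

(a, b), _ = curve_fit(mu, weeks, cum_failures, p0=(40.0, 0.1))
remaining = a - cum_failures[-1]
print(f"estimated total faults a = {a:.1f}, rate b = {b:.3f}, "
      f"estimated remaining ~ {remaining:.1f}")
```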

  46. Software Reliability Analysis of NASA Space Flight Software: A Practical Experience

    PubMed Central

    Sukhwani, Harish; Alonso, Javier; Trivedi, Kishor S.; Mcginnis, Issac

    2017-01-01

    In this paper, we present the software reliability analysis of the flight software of a recently launched space mission. For our analysis, we use the defect reports collected during the flight software development. We find that this software was developed in multiple releases, each release spanning across all software life-cycle phases. We also find that the software releases were developed and tested for four different hardware platforms, spanning from off-the-shelf or emulation hardware to actual flight hardware. For releases that exhibit reliability growth or decay, we fit Software Reliability Growth Models (SRGM); otherwise we fit a distribution function. We find that most releases exhibit reliability growth, with Log-Logistic (NHPP) and S-Shaped (NHPP) as the best-fit SRGMs. For the releases that experience reliability decay, we investigate the causes for the same. We find that such releases were the first software releases to be tested on a new hardware platform, and hence they encountered major hardware integration issues. Also such releases seem to have been developed under time pressure in order to start testing on the new hardware platform sooner. Such releases exhibit poor reliability growth, and hence exhibit high predicted failure rate. Other problems include hardware specification changes and delivery delays from vendors. Thus, our analysis provides critical insights and inputs to the management to improve the software development process. As NASA has moved towards a product line engineering for its flight software development, software for future space missions will be developed in a similar manner and hence the analysis results for this mission can be considered as a baseline for future flight software missions. PMID:29278255

  47. Software Reliability Analysis of NASA Space Flight Software: A Practical Experience.

    PubMed

    Sukhwani, Harish; Alonso, Javier; Trivedi, Kishor S; Mcginnis, Issac

    2016-01-01

    In this paper, we present the software reliability analysis of the flight software of a recently launched space mission. For our analysis, we use the defect reports collected during the flight software development. We find that this software was developed in multiple releases, each release spanning across all software life-cycle phases. We also find that the software releases were developed and tested for four different hardware platforms, spanning from off-the-shelf or emulation hardware to actual flight hardware. For releases that exhibit reliability growth or decay, we fit Software Reliability Growth Models (SRGM); otherwise we fit a distribution function. We find that most releases exhibit reliability growth, with Log-Logistic (NHPP) and S-Shaped (NHPP) as the best-fit SRGMs. For the releases that experience reliability decay, we investigate the causes for the same. We find that such releases were the first software releases to be tested on a new hardware platform, and hence they encountered major hardware integration issues. Also such releases seem to have been developed under time pressure in order to start testing on the new hardware platform sooner. Such releases exhibit poor reliability growth, and hence exhibit high predicted failure rate. Other problems include hardware specification changes and delivery delays from vendors. Thus, our analysis provides critical insights and inputs to the management to improve the software development process. As NASA has moved towards a product line engineering for its flight software development, software for future space missions will be developed in a similar manner and hence the analysis results for this mission can be considered as a baseline for future flight software missions.
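
    The analysis in the two records above first decides, per release, whether the defect data show reliability growth or decay before fitting an SRGM. A standard preliminary for that decision, though not necessarily the authors' exact procedure, is the Laplace trend test sketched below: negative scores indicate growth (failures thinning out over the window), positive ones decay. The defect-report times are placeholders.

```python
# Sketch: Laplace trend test on failure times over the window (0, T].
import numpy as np

def laplace_factor(failure_times, T):
    ft = np.asarray(failure_times, dtype=float)
    n = ft.size
    # u = (mean(t_i) - T/2) / (T * sqrt(1 / (12 n)))
    return (ft.mean() - T / 2.0) / (T * np.sqrt(1.0 / (12.0 * n)))

# placeholder defect-report times (days into a release's test phase)
times = [2, 5, 7, 12, 20, 31, 45, 62, 80, 95]
u = laplace_factor(times, T=100.0)
print(f"Laplace factor u = {u:.2f} ({'growth' if u < 0 else 'decay'})")
```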

  48. Pilot Study of an Open-source Image Analysis Software for Automated Screening of Conventional Cervical Smears.

    PubMed

    Sanyal, Parikshit; Ganguli, Prosenjit; Barui, Sanghita; Deb, Prabal

    2018-01-01

    The Pap-stained cervical smear is a screening tool for cervical cancer. Commercial systems are used for automated screening of liquid-based cervical smears. However, there is no image analysis software used for conventional cervical smears. The aim of this study was to develop and test the diagnostic accuracy of software for the analysis of conventional smears. The software was developed using the Python programming language and open source libraries. It was standardized with images from the Bethesda Interobserver Reproducibility Project. One hundred and thirty images from smears reported Negative for Intraepithelial Lesion or Malignancy (NILM), and 45 images in which some abnormality had been reported, were collected from the archives of the hospital. The software was then tested on the images. The software was able to segregate images based on overall nuclear:cytoplasmic ratio, coefficient of variation (CV) in nuclear size, nuclear membrane irregularity, and clustering. 68.88% of abnormal images were flagged by the software, as were 19.23% of NILM images. The major difficulties faced were segmentation of overlapping cell clusters and separation of neutrophils. The software shows potential as a screening tool for conventional cervical smears; however, further refinement of the technique is required.
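
    Two of the features this record lists, nuclear:cytoplasmic ratio and CV in nuclear size, reduce to simple arithmetic once segmentation is done, as the sketch below shows. The area values and cutoffs are placeholders, and the segmentation step (the hard part, per the record) is assumed already solved.

```python
# Sketch: N:C ratio and nuclear-size CV as image-level flagging features.
import numpy as np

nuclear_areas = np.array([310, 295, 880, 300, 915])   # px^2 per nucleus
cytoplasm_area = 52_000.0                             # px^2, whole image

nc_ratio = nuclear_areas.sum() / cytoplasm_area
cv_nuclear = nuclear_areas.std(ddof=1) / nuclear_areas.mean()

# flag the image if either feature exceeds a tuned cutoff
flagged = nc_ratio > 0.04 or cv_nuclear > 0.4
print(f"N:C = {nc_ratio:.3f}, CV = {cv_nuclear:.2f}, flagged = {flagged}")
```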

  49. Building Energy Simulation Test for Existing Homes (BESTEST-EX) (Presentation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Judkoff, R.; Neymark, J.; Polly, B.

    2011-12-01

    This presentation discusses the goals of NREL Analysis Accuracy R&D; BESTEST-EX goals; what BESTEST-EX is; how it works; 'Building Physics' cases; 'Building Physics' reference results; 'utility bill calibration' cases; and limitations and potential future work. The goals of NREL Analysis Accuracy R&D are: (1) Provide industry with the tools and technical information needed to improve the accuracy and consistency of analysis methods; (2) Reduce the risks associated with purchasing, financing, and selling energy efficiency upgrades; and (3) Enhance software and input collection methods considering impacts on accuracy, cost, and time of energy assessments. The BESTEST-EX goals are: (1) Test software predictions of retrofit energy savings in existing homes; (2) Ensure building physics calculations and utility bill calibration procedures perform up to a minimum standard; and (3) Quantify the impact of uncertainties in input audit data and occupant behavior. BESTEST-EX is a repeatable procedure that tests how well audit software predictions compare to the current state of the art in building energy simulation. There is no direct truth standard; however, the reference software have been subjected to validation testing, including comparisons with empirical data.

  50. An automated system for creep testing

    NASA Technical Reports Server (NTRS)

    Spiegel, F. Xavier; Weigman, Bernard J.

    1992-01-01

    A completely automated data collection system was devised to measure, analyze, and graph creep versus time using a PC, a 16-channel multiplexed analog-to-digital converter, and low-friction potentiometers to measure length. The sampling rate for each experiment can be adjusted in the software to meet the needs of the material tested. Data are collected and stored on a diskette for permanent record and for later analysis on a different machine.
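
    The acquisition loop such a system needs is sketched below: poll each multiplexed ADC channel at a per-test sampling rate, convert potentiometer voltage to elongation, and append records to a file. read_adc() and the calibration constant are hypothetical stand-ins for the real hardware driver; this is a pattern sketch, not the original system's code.

```python
# Sketch: timed polling loop logging calibrated creep readings to CSV.
import csv
import time

MM_PER_VOLT = 12.5            # hypothetical potentiometer calibration

def read_adc(channel):
    """Hypothetical driver call; returns volts from one of 16 channels."""
    return 0.0

def log_creep(channels, period_s, duration_s, path="creep.csv"):
    t0 = time.monotonic()
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["t_s"] + [f"ch{c}_mm" for c in channels])
        while (now := time.monotonic() - t0) < duration_s:
            writer.writerow([f"{now:.1f}"]
                            + [f"{read_adc(c) * MM_PER_VOLT:.4f}"
                               for c in channels])
            time.sleep(period_s)   # per-test sampling rate

log_creep(channels=[0, 1], period_s=1.0, duration_s=5.0)
```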

  51. BAT3 Analyzer: Real-Time Data Display and Interpretation Software for the Multifunction Bedrock-Aquifer Transportable Testing Tool (BAT3)

    USGS Publications Warehouse

    Winston, Richard B.; Shapiro, Allen M.

    2007-01-01

    The BAT3 Analyzer provides real-time display and interpretation of fluid pressure responses and flow rates measured during geochemical sampling, hydraulic testing, or tracer testing conducted with the Multifunction Bedrock-Aquifer Transportable Testing Tool (BAT3) (Shapiro, 2007). Real-time display of the data collected with the Multifunction BAT3 allows the user to ensure that the downhole apparatus is operating properly, and that test procedures can be modified to correct for unanticipated hydraulic responses during testing. The BAT3 Analyzer can apply calibrations to the pressure transducer and flow meter data to display physically meaningful values. Plots of the time-varying data can be formatted for a specified time interval, and either saved to files, or printed. Libraries of calibrations for the pressure transducers and flow meters can be created, updated and reloaded to facilitate the rapid set up of the software to display data collected during testing with the Multifunction BAT3. The BAT3 Analyzer also has the functionality to estimate calibrations for pressure transducers and flow meters using data collected with the Multifunction BAT3 in conjunction with corroborating check measurements. During testing with the Multifunction BAT3, and also after testing has been completed, hydraulic properties of the test interval can be estimated by comparing fluid pressure responses with model results; a variety of hydrogeologic conceptual models of the formation are available for interpreting fluid-withdrawal, fluid-injection, and slug tests.

  12. FPGA Based Reconfigurable ATM Switch Test Bed

    NASA Technical Reports Server (NTRS)

    Chu, Pong P.; Jones, Robert E.

    1998-01-01

    Various issues associated with "FPGA Based Reconfigurable ATM Switch Test Bed" are presented in viewgraph form. Specific topics include: 1) Network performance evaluation; 2) traditional approaches; 3) software simulation; 4) hardware emulation; 5) test bed highlights; 6) design environment; 7) test bed architecture; 8) abstract shared-memory switch; 9) detailed switch diagram; 10) traffic generator; 11) data collection circuit and user interface; 12) initial results; and 13) the following conclusions: Advances in FPGAs make hardware emulation feasible for performance evaluation; hardware emulation can provide several orders of magnitude of speed-up over software simulation; and, due to the complexity of the hardware synthesis process, development for emulation is much more difficult than for simulation and requires knowledge of both networks and digital design.

  13. NEXT GENERATION ANALYSIS SOFTWARE FOR COMPONENT EVALUATION - Results of Rotational Seismometer Evaluation

    NASA Astrophysics Data System (ADS)

    Hart, D. M.; Merchant, B. J.; Abbott, R. E.

    2012-12-01

    The Component Evaluation project at Sandia National Laboratories supports the Ground-based Nuclear Explosion Monitoring program by performing testing and evaluation of the components that are used in seismic and infrasound monitoring systems. In order to perform this work, Component Evaluation maintains a testing facility called the FACT (Facility for Acceptance, Calibration, and Testing) site, a variety of test bed equipment, and a suite of software tools for analyzing test data. Recently, Component Evaluation has successfully integrated several improvements to its software analysis tools and test bed equipment that have substantially improved our ability to test and evaluate components. The software tool that is used to analyze test data is called TALENT: Test and AnaLysis EvaluatioN Tool. TALENT is designed to be a single, standard interface to all test configuration, metadata, parameters, waveforms, and results that are generated in the course of testing monitoring systems. It provides traceability by capturing everything about a test in a relational database that is required to reproduce the results of that test. TALENT provides a simple, yet powerful, user interface to quickly acquire, process, and analyze waveform test data. The software tool has also been expanded recently to handle sensors whose output is proportional to rotation angle or rotation rate. As an example of this new processing capability, we show results from testing the new ATA ARS-16 rotational seismometer. The test data were collected at the USGS ASL. Four datasets were processed: 1) 1 Hz with increasing amplitude, 2) 4 Hz with increasing amplitude, 3) 16 Hz with increasing amplitude, and 4) twenty-six discrete frequencies between 0.353 Hz and 64 Hz. The results are compared to manufacturer-supplied data sheets.

  14. Pairwise-Comparison Software

    NASA Technical Reports Server (NTRS)

    Ricks, Wendell R.

    1995-01-01

    Pairwise comparison (PWC) is a computer program that collects data for psychometric scaling techniques now used in cognitive research. It applies the technique of pairwise comparisons, one of many techniques commonly used to acquire the data necessary for such analyses. PWC administers the task, collects data from the test subject, and formats the data for analysis. Written in Turbo Pascal v6.0.
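
    The core of a pairwise-comparison task is enumerating every unordered pair of stimuli and presenting the pairs in random order. The sketch below shows that structure with hypothetical stimulus names and a simulated response step; it is an illustration in Python, not the original Turbo Pascal program.

        # Enumerate all unordered stimulus pairs and present them in random order.
        import itertools
        import random

        stimuli = ["A", "B", "C", "D"]
        trials = list(itertools.combinations(stimuli, 2))  # all unordered pairs
        random.shuffle(trials)

        responses = []
        for left, right in trials:
            # In the real program the subject chooses one; here we simulate.
            choice = random.choice([left, right])
            responses.append((left, right, choice))
            print(f"Pair ({left}, {right}) -> chose {choice}")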

  15. Frequency Domain Identification Toolbox

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Juang, Jer-Nan; Chen, Chung-Wen

    1996-01-01

    This report documents software written in the MATLAB programming language for performing identification of systems from frequency response functions. MATLAB is a commercial software environment that allows easy manipulation of data matrices and provides other intrinsic matrix function capabilities. Algorithms programmed in this collection of subroutines have been documented elsewhere, but all references are provided in this document. A main feature of this software is the use of matrix fraction descriptions and system realization theory to identify state-space models directly from test data. All subroutines have templates for the user to use as guidelines.

  16. Measurement of the area of venous ulcers using two software programs 1

    PubMed Central

    Eberhardt, Thaís Dresch; de Lima, Suzinara Beatriz Soares; Lopes, Luis Felipe Dias; Borges, Eline de Lima; Weiller, Teresinha Heck; da Fonseca, Graziele Gorete Portella

    2016-01-01

    ABSTRACT Objective: to compare measurements of venous ulcer area made with AutoCAD(r) and Image Tool software. Method: this was an assessment of reproducibility conducted in an angiology clinic of a university hospital. Data were collected from 21 patients with venous ulcers, in the period from March to July of 2015, using a collection form and photographs of wounds. Five nurses (evaluators) of the hospital skin wound study group participated. The wounds were measured using both software programs. Data were analyzed using the intraclass correlation coefficient, the concordance correlation coefficient, and Bland-Altman analysis. The study met the ethical aspects in accordance with current legislation. Results: the size of the ulcers varied widely, but without significant difference between the measurements; an excellent intraclass and concordance correlation was found between the two software programs, which appear to be more accurate when measuring a wound area >10 cm². Conclusion: the use of either software program is appropriate for measurement of venous ulcers, appearing to be more accurate when used to measure a wound area >10 cm². PMID:27992028
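
    The Bland-Altman analysis used in the study reduces to computing the bias (the mean of the paired differences) and the 95% limits of agreement (bias plus or minus 1.96 standard deviations of the differences). A minimal sketch follows; the paired area values are hypothetical, not the study's data.

        # Bland-Altman agreement between two measurement methods.
        from statistics import mean, stdev

        areas_autocad = [12.4, 8.1, 15.0, 3.2, 22.7]    # cm², hypothetical
        areas_imagetool = [12.9, 7.8, 14.6, 3.5, 23.1]  # cm², hypothetical

        diffs = [a - b for a, b in zip(areas_autocad, areas_imagetool)]
        bias = mean(diffs)
        loa = 1.96 * stdev(diffs)  # half-width of the 95% limits of agreement
        print(f"bias = {bias:.2f} cm², limits of agreement = "
              f"[{bias - loa:.2f}, {bias + loa:.2f}] cm²")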

  17. Organizational Analysis of the United States Army Evaluation Center

    DTIC Science & Technology

    2014-12-01

    analysis of qualitative or quantitative data obtained from design reviews, hardware inspections, M&S, hardware and software testing, metrics review... Research Development Test & Evaluation (RDT&E) appropriation account. The Defense Acquisition Portal ACQuipedia website describes RDT&E as “one of the... research, design, development, test and evaluation, production, installation, operation, and maintenance; data collection; processing and analysis

  18. [Development of ophthalmologic software for handheld devices].

    PubMed

    Grottone, Gustavo Teixeira; Pisa, Ivan Torres; Grottone, João Carlos; Debs, Fernando; Schor, Paulo

    2006-01-01

    The formulas for calculation of intraocular lenses (IOLs) have evolved since the first theoretical formulas by Fyodorov. Among the second-generation formulas, the SRK-I formula has a simple calculation involving only the anteroposterior (axial) length, the IOL constant, and the average keratometry. As the formulas evolved, their complexity increased, making the reconfiguration of parameters in special situations impracticable. The production and development of software for this purpose can therefore help surgeons recalculate those values when needed. The objective was to conceive, develop, and test Brazilian software for calculation of IOL dioptric power on handheld computers. For the development and programming of the IOL calculation software, we used the PocketC program (OrbWorks Concentrated Software, USA). We compared the results collected from a gold-standard device (Ultrascan/Alcon Labs) with a simulation of 100 fictitious patients, using the same IOL parameters. The results were grouped into ULTRASCAN data and SOFTWARE data. Using the SRK/T formula, the range of those parameters included keratometry varying between 35 and 55 D, axial length between 20 and 28 mm, and IOL constants of 118.7, 118.3, and 115.8. Using the Wilcoxon test, it was shown that the groups do not differ (p=0.314). The Ultrascan sample varied between 11.82 and 27.97; in the tested program the variation was practically identical (11.83-27.98). The average of the Ultrascan group was 20.93, and the software group had a similar average. The standard deviation of the samples was also similar (4.53). The precision of the IOL software for handheld devices was similar to that of the standard device using the SRK/T formula. The software worked properly and was stable, without bugs, on the tested versions of the operating system.
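
    For orientation, the first-generation SRK regression formula referred to in the lineage above is simple enough to state in a few lines: P = A - 2.5 L - 0.9 K, where A is the lens constant, L the axial length in mm, and K the average keratometry in diopters. The sketch below illustrates that classic formula only; the study's SRK/T implementation is a more involved vergence-based calculation.

        def srk_iol_power(a_constant: float, axial_length_mm: float,
                          mean_keratometry_d: float) -> float:
            """First-generation SRK regression: P = A - 2.5*L - 0.9*K."""
            return a_constant - 2.5 * axial_length_mm - 0.9 * mean_keratometry_d

        # Example using values inside the parameter ranges quoted in the abstract.
        print(srk_iol_power(118.7, 23.5, 44.0))  # -> 20.35 D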

  19. Simultaneous real-time data collection methods

    NASA Technical Reports Server (NTRS)

    Klincsek, Thomas

    1992-01-01

    This paper describes the development of electronic test equipment which executes, supervises, and reports on various tests. This validation process uses computers to analyze test results and report conclusions. The test equipment consists of an electronics component and the data collection and reporting unit. The PC software, display screens, and real-time data-base are described. Pass-fail procedures and data replay are discussed. The OS2 operating system and Presentation Manager user interface system were used to create a highly interactive automated system. The system outputs are hardcopy printouts and MS DOS format files which may be used as input for other PC programs.

  20. Evaluating the feasibility of using online software to collect patient information in a chiropractic practice-based research network.

    PubMed

    Kania-Richmond, Ania; Weeks, Laura; Scholten, Jeffrey; Reney, Mikaël

    2016-03-01

    Practice based research networks (PBRNs) are increasingly used as a tool for evidence based practice. We developed and tested the feasibility of using software to enable online collection of patient data within a chiropractic PBRN to support clinical decision making and research in participating clinics. To assess the feasibility of using online software to collect quality patient information. The study consisted of two phases: 1) Assessment of the quality of information provided, using a standardized form; and 2) Exploration of patients' perspectives and experiences regarding online information provision through semi-structured interviews. Data analysis was descriptive. Forty-five new patients were recruited. Thirty-six completed online forms, which were submitted by an appropriate person 100% of the time, with an error rate of less than 1%, and submitted in a timely manner 83% of the time. Twenty-one participants were interviewed. Overall, online forms were preferred given perceived security, ease of use, and enabling provision of more accurate information. Use of online software is feasible, provides high quality information, and is preferred by most participants. A pen-and-paper format should be available for patients with this preference and in case of technical difficulties.

  1. Unified Engineering Software System

    NASA Technical Reports Server (NTRS)

    Purves, L. R.; Gordon, S.; Peltzman, A.; Dube, M.

    1989-01-01

    Collection of computer programs performs diverse functions in prototype engineering. NEXUS, NASA Engineering Extendible Unified Software system, is research set of computer programs designed to support full sequence of activities encountered in NASA engineering projects. Sequence spans preliminary design, design analysis, detailed design, manufacturing, assembly, and testing. Primarily addresses process of prototype engineering, task of getting single or small number of copies of product to work. Written in FORTRAN 77 and PROLOG.

  2. Data collection procedures for the Software Engineering Laboratory (SEL) database

    NASA Technical Reports Server (NTRS)

    Heller, Gerard; Valett, Jon; Wild, Mary

    1992-01-01

    This document is a guidebook to collecting software engineering data on software development and maintenance efforts, as practiced in the Software Engineering Laboratory (SEL). It supersedes the document entitled Data Collection Procedures for the Rehosted SEL Database, number SEL-87-008 in the SEL series, which was published in October 1987. It presents procedures to be followed on software development and maintenance projects in the Flight Dynamics Division (FDD) of Goddard Space Flight Center (GSFC) for collecting data in support of SEL software engineering research activities. These procedures include detailed instructions for the completion and submission of SEL data collection forms.

  3. MoniQA: a general approach to monitor quality assurance

    NASA Astrophysics Data System (ADS)

    Jacobs, J.; Deprez, T.; Marchal, G.; Bosmans, H.

    2006-03-01

    MoniQA ("Monitor Quality Assurance") is a new, non-commercial, independent quality assurance software application developed in our medical physics team. It is a complete Java™-based modular environment for the evaluation of radiological viewing devices, and it thus fits into the global quality assurance network of our (filmless) radiology department. The purpose of the software tool is to guide the medical physicist through an acceptance protocol and the radiologist through a constancy check protocol by presenting the necessary test patterns and by automated data collection. Data are then sent to a central management system for further analysis. At the moment more than 55 patterns have been implemented, which can be grouped into schemes to implement protocols (e.g., AAPM TG18, DIN, and EUREF). Some test patterns are dynamically created and 'drawn' on the viewing device with random parameters, as is the case in a recently proposed new pattern for constancy testing. The software is installed on 35 diagnostic stations (70 monitors) in a filmless radiology department. Learning time was very limited. A constancy check (with the new pattern that assesses luminance decrease, resolution problems, and geometric distortion) takes only 2 minutes and 28 seconds per monitor. The modular approach of the software allows the evaluation of new or emerging test patterns. We will report on the software and its usability: the practicality of the constancy check tests in our hospital and the results from acceptance tests of viewing stations for digital mammography.

  4. Data collection and analysis software development for rotor dynamics testing in spin laboratory

    NASA Astrophysics Data System (ADS)

    Abdul-Aziz, Ali; Arble, Daniel; Woike, Mark

    2017-04-01

    Gas turbine engine components undergo high rotational loading and other complex environmental conditions. Such an operating environment can lead these components to develop damage and cracks that may cause catastrophic failure during flight. Traditional crack detection and health monitoring methodologies currently in use rely on periodic routine maintenance and nondestructive inspections that often involve engine and component disassembly. These methods also do not offer adequate information about the faults, especially if the faults are subsurface or not clearly evident. At NASA Glenn Research Center, the rotor dynamics laboratory is presently developing newer techniques that are highly dependent on sensor technology to enable health monitoring and prediction of damage and cracks in rotor disks. These approaches are noninvasive and relatively economical. Spin tests are performed using a subscale test article mimicking a turbine rotor disk undergoing rotational load. Non-contact instruments such as capacitive and microwave sensors are used to measure the blade tip gap displacement and blade vibration characteristics in an attempt to develop a physics-based model to assess and predict faults in the rotor disk. Data collection is a major component of this experimental-analytical procedure, and as a result, an upgrade to an older version of the LabVIEW-based data acquisition software has been implemented to support efficient test runs and analysis of the results. Outcomes obtained from the test data, related experimental and analytical rotor dynamics modeling, and key features of the updated software are presented and discussed.

  5. Drainage identification analysis and mapping, phase 2 : technical brief.

    DOT National Transportation Integrated Search

    2017-01-01

    This research studied, tested and rectified the compatibility issue related to the recent upgrades of NJDOT vendor inspection software, and uploaded all collected data to make Drainage Identification Analysis and Mapping System (DIAMS) current an...

  6. Real-time data collection in Linux: a case study.

    PubMed

    Finney, S A

    2001-05-01

    Multiuser UNIX-like operating systems such as Linux are often considered unsuitable for real-time data collection because of the potential for indeterminate timing latencies resulting from preemptive scheduling. In this paper, Linux is shown to be fully adequate for precisely controlled programming with millisecond resolution or better. The Linux system calls that subserve such timing control are described and tested and then utilized in a MIDI-based program for tapping and music performance experiments. The timing of this program, including data input and output, is shown to be accurate at the millisecond level. This demonstrates that Linux, with proper programming, is suitable for real-time experiment software. In addition, the detailed description and test of both the operating system facilities and the application program itself may serve as a model for publicly documenting programming methods and software performance on other operating systems.
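
    An illustrative analogue of the timing checks described above (in Python rather than the article's C-level system calls) is to request a fixed 1 ms sleep many times and measure how far each wakeup overshoots; on a lightly loaded Linux system the overshoot distribution gives a quick picture of scheduling latency. This is a sketch of the methodology, not the author's code.

        # Measure sleep-wakeup overshoot as a proxy for scheduling latency.
        import time

        REQUEST_NS = 1_000_000  # request a 1 ms sleep
        overshoots = []
        for _ in range(1000):
            t0 = time.monotonic_ns()
            time.sleep(REQUEST_NS / 1e9)
            overshoots.append(time.monotonic_ns() - t0 - REQUEST_NS)

        print(f"max overshoot  = {max(overshoots) / 1e6:.3f} ms")
        print(f"mean overshoot = {sum(overshoots) / len(overshoots) / 1e6:.3f} ms")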

  7. SU-E-T-103: Development and Implementation of Web Based Quality Control Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Studinski, R; Taylor, R; Angers, C

    Purpose: Historically, many radiation medicine programs have maintained their quality control (QC) test results in paper records or Microsoft Excel worksheets. Both approaches present significant logistical challenges and are not predisposed to data review and approval. It has been our group's aim to develop and implement web-based software designed not just to record and store QC data in a centralized database, but to provide scheduling and data review tools to help manage a radiation therapy clinic's equipment quality control program. Methods: The software was written in the Python programming language using the Django web framework. In order to promote collaboration and validation from other centres, the code was made open source and is freely available to the public via an online source code repository. The code was written to provide a common user interface for data entry, formalize the review and approval process, and offer automated data trending and process control analysis of test results. Results: As of February 2014, our installation of QATrack+ has 180 tests defined in its database and has collected ∼22 000 test results, all of which have been reviewed and approved by a physicist via QATrack+'s review tools. These results include records for quality control of Elekta accelerators, CT simulators, our brachytherapy programme, TomoTherapy and CyberKnife units. Currently at least 5 other centres are known to be running QATrack+ clinically, forming the start of an international user community. Conclusion: QATrack+ has proven to be an effective tool for collecting radiation therapy QC data, allowing for rapid review and trending of data for a wide variety of treatment units. As free and open source software, all source code, documentation and a bug tracker are available to the public at https://bitbucket.org/tohccmedphys/qatrackplus/.
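
    The process control analysis mentioned above is, at its simplest, a Shewhart-style check: establish control limits from baseline QC results and flag new results that fall outside them. The sketch below illustrates that idea with hypothetical machine-output values; it is not QATrack+ code.

        # Flag QC results outside mean +/- 3 sigma control limits.
        from statistics import mean, stdev

        baseline = [1.002, 0.998, 1.001, 0.999, 1.000, 1.003, 0.997, 1.001]
        center, sigma = mean(baseline), stdev(baseline)
        ucl, lcl = center + 3 * sigma, center - 3 * sigma

        for value in [1.001, 0.996, 1.011]:  # new daily QC results
            status = "OK" if lcl <= value <= ucl else "OUT OF CONTROL"
            print(f"{value:.3f}: {status}")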

  8. Noninvasive Fetal ECG: the PhysioNet/Computing in Cardiology Challenge 2013.

    PubMed

    Silva, Ikaro; Behar, Joachim; Sameni, Reza; Zhu, Tingting; Oster, Julien; Clifford, Gari D; Moody, George B

    2013-03-01

    The PhysioNet/CinC 2013 Challenge aimed to stimulate rapid development and improvement of software for estimating fetal heart rate (FHR), fetal interbeat intervals (FRR), and fetal QT intervals (FQT), from multichannel recordings made using electrodes placed on the mother's abdomen. For the challenge, five data collections from a variety of sources were used to compile a large standardized database, which was divided into training, open test, and hidden test subsets. Gold-standard fetal QRS and QT interval annotations were developed using a novel crowd-sourcing framework. The challenge organizers used the hidden test subset, which was not available for study by participants, to evaluate open-source software entries submitted by 53 international teams of participants in three challenge events estimating FHR, FRR, and FQT. Two additional events required only user-submitted QRS annotations to evaluate FHR and FRR estimation accuracy using the open test subset available to participants. The challenge yielded a total of 91 open-source software entries. The best of these achieved average estimation errors of 187 bpm² for FHR, 20.9 ms for FRR, and 152.7 ms for FQT. The open data sets, scoring software, and open-source entries are available at PhysioNet for researchers interested in working on these problems.

  9. General software design for multisensor data fusion

    NASA Astrophysics Data System (ADS)

    Zhang, Junliang; Zhao, Yuming

    1999-03-01

    In this paper a general method of software design for multisensor data fusion is discussed in detail, adopting object-oriented technology under the UNIX operating system. The software for multisensor data fusion is divided into six functional modules: data collection, database management, GIS, target display and alarming, data simulation, etc. Furthermore, the primary function, the components, and some realization methods of each module are given. The interfaces among these functional modules are discussed. Data exchange among the functional modules is performed by interprocess communication (IPC), including message queues, semaphores, and shared memory. Thus, each functional module executes independently, which reduces the dependence among functional modules and helps software programming and testing. The software for multisensor data fusion is designed as a hierarchical structure using the inheritance property of classes. Each functional module is abstracted and encapsulated through a class structure, which avoids software redundancy and enhances readability.
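
    As a loose illustration of the decoupling described above, the sketch below passes messages between two independent processes through a queue; Python's multiprocessing stands in for the System V message queues, semaphores, and shared memory named in the paper, and the module names and payload are hypothetical.

        # Two independent "modules" exchanging data via IPC-style message passing.
        from multiprocessing import Process, Queue

        def data_collection(q: Queue) -> None:
            for i in range(3):
                q.put({"sensor": "radar", "track_id": i})  # produce fused-input data
            q.put(None)  # sentinel: no more data

        def target_display(q: Queue) -> None:
            while (msg := q.get()) is not None:
                print("display module received:", msg)

        if __name__ == "__main__":
            q = Queue()
            producer = Process(target=data_collection, args=(q,))
            consumer = Process(target=target_display, args=(q,))
            producer.start(); consumer.start()
            producer.join(); consumer.join()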

  10. Analysis of Wallops Flight Test Data Through an Automated COTS System

    NASA Technical Reports Server (NTRS)

    Blackstock, Dexter Lee; Theobalds, Andre B.

    2005-01-01

    During the summer of 2004 NASA Langley Research Center flight tested a Synthetic Vision System (SVS) at the Reno/Tahoe International Airport (RNO) and the Wallops Flight Facility (WAL). The SVS included a Runway Incursion Prevention System (RIPS) to improve pilot situational awareness while operating near and on the airport surface. The flight tests consisted of air and ground operations to evaluate and validate the performance of the system. This paper describes the flight test and emphasizes how positioning data was collected, post processed and analyzed through the use of a COTS-derived software system. The system that was developed to analyze the data was constructed within the MATLAB(TM) environment. The software was modified to read the data, perform several if-then scenarios and produce the relevant graphs, figures and tables.

  11. Evaluating the feasibility of using online software to collect patient information in a chiropractic practice-based research network

    PubMed Central

    Kania-Richmond, Ania; Weeks, Laura; Scholten, Jeffrey; Reney, Mikaël

    2016-01-01

    Background: Practice based research networks (PBRNs) are increasingly used as a tool for evidence based practice. We developed and tested the feasibility of using software to enable online collection of patient data within a chiropractic PBRN to support clinical decision making and research in participating clinics. Purpose: To assess the feasibility of using online software to collect quality patient information. Methods: The study consisted of two phases: 1) Assessment of the quality of information provided, using a standardized form; and 2) Exploration of patients’ perspectives and experiences regarding online information provision through semi-structured interviews. Data analysis was descriptive. Results: Forty-five new patients were recruited. Thirty-six completed online forms, which were submitted by an appropriate person 100% of the time, with an error rate of less than 1%, and submitted in a timely manner 83% of the time. Twenty-one participants were interviewed. Overall, online forms were preferred given perceived security, ease of use, and enabling provision of more accurate information. Conclusions: Use of online software is feasible, provides high quality information, and is preferred by most participants. A pen-and-paper format should be available for patients with this preference and in case of technical difficulties. PMID:27069272

  12. Acoustic Emission Analysis Applet (AEAA) Software

    NASA Technical Reports Server (NTRS)

    Nichols, Charles T.; Roth, Don J.

    2013-01-01

    NASA Glenn Research and NASA White Sands Test Facility have developed software supporting an automated pressure vessel structural health monitoring (SHM) system based on acoustic emissions (AE). The software, referred to as the Acoustic Emission Analysis Applet (AEAA), provides analysts with a tool that can interrogate data collected on Digital Wave Corp. and Physical Acoustics Corp. software using a wide spectrum of powerful filters and charts. This software can be made to work with any data once the data format is known. The applet will compute basic AE statistics, and statistics as a function of time and pressure (see figure). AEAA provides value added beyond the analysis provided by the respective vendors' analysis software. The software can handle data sets of unlimited size. A wide variety of government and commercial applications could benefit from this technology, notably requalification and usage tests for compressed gas and hydrogen-fueled vehicles. Future enhancements will add features similar to a "check engine" light on a vehicle. Once installed, the system will ultimately be used to alert International Space Station crewmembers to critical structural instabilities, but will have little impact to missions otherwise. Diagnostic information could then be transmitted to experienced technicians on the ground in a timely manner to determine whether pressure vessels have been impacted, are structurally unsound, or can be safely used to complete the mission.

  13. Advanced Mail Systems Scanner Technology. Executive Summary and Appendixes A-E.

    DTIC Science & Technology

    1980-10-01

    data base. 6. Perform color acquisition studies. 7. Investigate address and bar code reading. MASS MEMORY TECHNOLOGY 1. Collect performance data on...area of the 1728-by-2200 ICAS image memory and to transmit the data to any of the three color memories of the Comtal. Function table information can...for printing color images. The software allows the transmission of data from the ICAS frame-store memory via the MCU to the Dicomed. Software test

  14. LARGE BUILDING HVAC SIMULATION

    EPA Science Inventory

    The report discusses the monitoring and collection of data relating to indoor pressures and radon concentrations under several test conditions in a large school building in Bartow, Florida. The Florida Solar Energy Center (FSEC) used an integrated computational software, FSEC 3.0...

  15. Detection of faults and software reliability analysis

    NASA Technical Reports Server (NTRS)

    Knight, John C.

    1987-01-01

    Multi-version or N-version programming is proposed as a method of providing fault tolerance in software. The approach requires the separate, independent preparation of multiple versions of a piece of software for some application. These versions are executed in parallel in the application environment; each receives identical inputs and each produces its version of the required outputs. The outputs are collected by a voter and, in principle, they should all be the same. In practice there may be some disagreement. If this occurs, the results of the majority are taken to be the correct output, and that is the output used by the system. A total of 27 programs were produced. Each of these programs was then subjected to one million randomly-generated test cases. The experiment yielded a number of programs containing faults that are useful for general studies of software reliability as well as studies of N-version programming. Fault tolerance through data diversity and analytic models of comparison testing are discussed.
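
    The voter described above is straightforward to sketch: run every version on the same input, tally the outputs, and accept a value only if a strict majority of versions agree (practical voters compare within a numerical tolerance rather than exactly). The three toy "versions" below are stand-ins, not the experiment's programs.

        # Majority voter over N independently developed versions.
        from collections import Counter
        from typing import Callable, Optional

        def vote(versions: list[Callable[[float], float]], x: float) -> Optional[float]:
            outputs = Counter(f(x) for f in versions)
            value, count = outputs.most_common(1)[0]
            return value if count > len(versions) // 2 else None  # None = no majority

        v1 = lambda x: x * x
        v2 = lambda x: x ** 2
        v3 = lambda x: x * x + 1e-9  # a faulty version, disagreeing slightly

        print(vote([v1, v2, v3], 3.0))  # 9.0: the two correct versions outvote the fault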

  16. The Development of Point Doppler Velocimeter Data Acquisition and Processing Software

    NASA Technical Reports Server (NTRS)

    Cavone, Angelo A.

    2008-01-01

    In order to develop efficient and quiet aircraft and validate computational fluid dynamics predictions, aerodynamic researchers require flow parameter measurements to characterize flow fields about wind tunnel models and jet flows. A one-component Point Doppler Velocimeter (pDv), a non-intrusive, laser-based instrument, was constructed using a design/develop/test/validate/deploy approach. A primary component of the instrument is the software required for system control/management and data collection/reduction. This software, along with evaluation algorithms, advanced pDv from a laboratory curiosity to a production-level instrument. Simultaneous pDv and pitot probe velocity measurements obtained at the centerline of a flow exiting a two-inch jet matched within 0.4%. Flow turbulence spectra obtained with pDv and a hot-wire detected, with equal dynamic range, the primary and secondary harmonics produced by the fan driving the flow. Novel hardware and software methods were developed, tested, and incorporated into the system to eliminate and/or minimize error sources and improve system reliability.

  17. Optical fiber dispersion characterization study

    NASA Technical Reports Server (NTRS)

    Geeslin, A.; Arriad, A.; Riad, S. M.; Padgett, M. E.

    1979-01-01

    The theory, design, and results of optical fiber pulse dispersion measurements are considered. Both the hardware and software required to perform this type of measurement are described. Hardware includes a thermoelectrically cooled injection laser diode source, an 800 GHz gain-bandwidth product avalanche photodiode, and an input mode scrambler. Software for an HP 9825 computer includes fast Fourier transform, inverse Fourier transform, and optimal compensation deconvolution. Test set construction details are also included. Test results include data collected on a 1 km fiber, a 4 km fiber, a fused splice, eight 600 meter length fibers concatenated to form 4.8 km, and up to nine optical connectors.

  18. The Organization of a Computer Software Collection Using an Information Storage and Retrieval Software Package.

    ERIC Educational Resources Information Center

    Davies, Denise M.

    1985-01-01

    Discusses design, development, and use of a database to provide organization and access to a computer software collection at the University of Hawaii School of Library Studies. Field specifications, samples of report forms, and a description of the physical organization of the software collection are included. (MBR)

  19. Development of a Pediatric Visual Field Test

    PubMed Central

    Miranda, Marco A.; Henson, David B.; Fenerty, Cecilia; Biswas, Susmito; Aslam, Tariq

    2016-01-01

    Purpose We describe a pediatric visual field (VF) test based on a computer game in which software and hardware combine to provide an enjoyable test experience. Methods The test software consists of a platform-based computer game presented in the central VF. A storyline was created around the game, as was a structure surrounding the computer monitor, to enhance patients' experience. The patient is asked to help the central character collect magic coins (stimuli); to collect these coins, a series of obstacles must be overcome. The test was presented on a Sony PVM-2541A monitor calibrated from a central midpoint with a Minolta CS-100 photometer placed at 50 cm. Measurements were performed at 15 locations on the screen and the contrast calculated. Retinal sensitivity was determined by modulating stimulus size. To test the feasibility of the novel approach, 20 patients (4–16 years old) with no history of VF defects were recruited. Results For the 14 subjects completing the study, 31 ± 15 data points were collected on 1 eye of each patient. Mean background luminance and stimulus contrast were 9.9 ± 0.3 cd/m2 and 27.9 ± 0.1 dB, respectively. Sensitivity values obtained were similar to those of an adult population, but variability was considerably higher (8.3 ± 9.0 dB). Conclusions Preliminary data show the feasibility of a game-based VF test for pediatric use. Although the test was well accepted by the target population, test variability remained very high. Translational Relevance Traditional VF tests are not well tolerated by children. This study describes a child-friendly approach to testing visual fields in the targeted population. PMID:27980876

  20. Use of the Photo-Electromyogram to Objectively Diagnose and Monitor Treatment of Post-TBI Light Sensitivity

    DTIC Science & Technology

    2012-10-01

    in place. Mark Ginsberg, one of our local jewelry store owners, has acquired 3D extruding printers for medical instrumentation applications and will... tested out our software, which was written to control the monitor brightness, duration, and color for each visual stimulus. The software has been

  1. Collected software engineering papers, volume 2

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Topics addressed include: summaries of the software engineering laboratory (SEL) organization, operation, and research activities; results of specific research projects in the areas of resource models and software measures; and strategies for data collection for software engineering research.

  2. Training the Next Generation in Space Situational Awareness Research

    NASA Astrophysics Data System (ADS)

    Colpo, D.; Reddy, V.; Arora, S.; Tucker, S.; Jeffries, L.; May, D.; Bronson, R.; Hunten, E.

    Traditional academic SSA research has relied on commercial off-the-shelf (COTS) systems for collecting metric and lightcurve data. COTS systems have several advantages over a custom-built system, including cost, easy integration, technical support, and short deployment timescales. We at the University of Arizona took an alternative approach to developing a sensor system for space object characterization. Five engineering students designed and built two 0.6-meter F/4 electro-optical (EO) systems for collecting lightcurve and spectral data. All the design and fabrication work was carried out over the course of two semesters as part of their senior design project, which is mandatory for the completion of their bachelor's degree in engineering. The students designed over 200 individual parts using three-dimensional modeling software (SolidWorks) and conducted detailed optical design analysis using raytracing software (ZEMAX), with oversight and advice from the faculty sponsor and Starizona, a local small business in Tucson. The components of the design were verified by test, analysis, inspection, or demonstration, per the process that the University of Arizona requires for each of its design projects. Methods used to complete this project include mechanical FEA, optical testing methods (Foucault knife-edge test and Couder mask test), tests to verify the function of the thermometers, and a final pointing model test. A surprise outcome of our exercise is that the entire cost of the design and fabrication of these two EO systems was significantly lower than a COTS alternative. With careful planning and coordination we were also able to reduce the deployment times to those of a commercial system. Our experience shows that development of hardware and software for SSA research can be accomplished in an academic environment, enabling the training of the next generation with active support from local small businesses.

  3. The Effect on Prospective Teachers of the Learning Environment Supported by Dynamic Statistics Software

    ERIC Educational Resources Information Center

    Koparan, Timur

    2016-01-01

    In this study, the effect of a learning environment supported by dynamic statistics software on the achievement and attitudes of prospective teachers is examined. To this end, an achievement test, an attitude scale for statistics, and interviews were used as data collection tools. The achievement test comprises 8 problems based on statistical data, and the attitude scale comprises 13 Likert-type items. The study…

  4. Rapid access to information resources in clinical biochemistry: medical applications of Personal Digital Assistants (PDA).

    PubMed

    Serdar, Muhittin A; Turan, Mustafa; Cihan, Murat

    2008-06-01

    Laboratory specialists currently need to access science-based information anytime and anywhere. A considerable amount of time and effort is required to access this information through existing accumulated data. Personal digital assistants (PDAs) are supposed to provide an effective solution to this problem with commercial software. In this study, 11 commercial software products (UpToDate, ePocrates, Inforetrive, Pepid, eMedicine, FIRST Consult, and 5 laboratory e-books released by Skyscape and/or Isilo) were selected, and the benefits of their use were evaluated by seven laboratory specialists. The assessment of the software was performed based on the number of tests included; the coverage of detailed information for each test, such as process, method, interpretation of results, reference ranges, critical values, interferences, equations, and pathophysiology; supplementary technical details such as sample collection principles; and additional information such as linked references, evidence-based data, test cost, etc. In terms of technique, the following items were considered: the amount of memory required to run the software, the graphical user interface as a user-friendly instrument, and the frequency of new and/or updated releases. As we had anticipated, there is still no perfect program. Interpretation of laboratory results may require software with an integrated program. However, methodological data are mostly not included in the software evaluated. It seems that these shortcomings will be fixed in the near future, and PDAs and relevant medical applications will become indispensable for all physicians, including laboratory specialists, in the fields of training/education and patient care.

  5. Secure Video Surveillance System Acquisition Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2009-12-04

    The SVSS Acquisition Software collects and displays video images from two cameras through a VPN and stores the images on a collection controller. The software is configured to allow a user to enter a time window to display up to 2.5 hours of video for review. The software collects images from the cameras at a rate of 1 image per second and automatically deletes images older than 3 hours. The software operates in a Linux environment and can be run in a virtual machine on Windows XP. The Sandia software integrates the different COTS software packages together to build the video review system.

  6. Diagnostic accuracy of a novel software technology for detecting pneumothorax in a porcine model.

    PubMed

    Summers, Shane M; Chin, Eric J; April, Michael D; Grisell, Ronald D; Lospinoso, Joshua A; Kheirabadi, Bijan S; Salinas, Jose; Blackbourne, Lorne H

    2017-09-01

    Our objective was to measure the diagnostic accuracy of a novel software technology for detecting pneumothorax on brightness (B) mode and motion (M) mode ultrasonography. Ultrasonography-fellowship-trained emergency physicians performed thoracic ultrasonography at baseline and after surgically creating a pneumothorax in eight intubated, spontaneously breathing porcine subjects. Prior to pneumothorax induction, we captured sagittal M-mode still images and B-mode videos of each intercostal space with a linear array transducer at 4 cm of depth. After collection of baseline images, we placed a chest tube, injected air into the pleural space in 250 mL increments, and repeated the ultrasonography for pneumothorax volumes of 250 mL, 500 mL, 750 mL, and 1000 mL. We confirmed pneumothorax with intrapleural digital manometry and ultrasound by expert sonographers. We exported the collected images for interpretation by the software, treating each individual scan as a single test. Excluding indeterminate results, we collected 338 M-mode images, for which the software demonstrated a sensitivity of 98% (95% confidence interval [CI] 92-99%), specificity of 95% (95% CI 86-99%), positive likelihood ratio (LR+) of 21.6 (95% CI 7.1-65), and negative likelihood ratio (LR-) of 0.02 (95% CI 0.008-0.046). Among 364 B-mode videos, the software demonstrated a sensitivity of 86% (95% CI 81-90%), specificity of 85% (81-91%), LR+ of 5.7 (95% CI 3.2-10.2), and LR- of 0.17 (95% CI 0.12-0.22). This novel technology has potential as a useful adjunct for diagnosing pneumothorax on thoracic ultrasonography. Published by Elsevier Inc.
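
    For readers less familiar with the reported statistics, they all follow from a 2x2 table of software calls against ground truth. The sketch below uses hypothetical counts, chosen so the M-mode figures above are roughly reproduced; they are not the study's raw data.

        # Sensitivity, specificity, and likelihood ratios from a 2x2 table.
        def diagnostics(tp: int, fn: int, fp: int, tn: int) -> dict:
            sens = tp / (tp + fn)
            spec = tn / (tn + fp)
            return {
                "sensitivity": sens,
                "specificity": spec,
                "LR+": sens / (1 - spec),   # positive likelihood ratio
                "LR-": (1 - sens) / spec,   # negative likelihood ratio
            }

        # 338 hypothetical M-mode scans: sens 0.98, spec ~0.95, LR+ ~21.6, LR- ~0.02.
        print(diagnostics(tp=245, fn=5, fp=4, tn=84))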

  7. Advanced program development management software system. Software description and user's manual

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The objectives of this project were to apply emerging techniques and tools from the computer science discipline of paperless management to the activities of the Space Transportation and Exploration Office (PT01) in Marshall Space Flight Center (MSFC) Program Development, thereby enhancing the productivity of the workforce, the quality of the data products, and the collection, dissemination, and storage of information. The approach used to accomplish the objectives emphasized the utilization of finished form (off-the-shelf) software products to the greatest extent possible without impacting the performance of the end product, to pursue developments when necessary in the rapid prototyping environment to provide a mechanism for frequent feedback from the users, and to provide a full range of user support functions during the development process to promote testing of the software.

  8. Gravity Probe B data system description

    NASA Astrophysics Data System (ADS)

    Bennett, Norman R.

    2015-11-01

    The Gravity Probe B data system, developed, integrated, and tested by Lockheed Missiles & Space Company, and later Lockheed Martin Corporation, included flight and ground command, control, and communications software. The development was greatly facilitated, conceptually and by the transfer of key personnel, through Lockheed’s earlier flight and ground test software development for the Hubble Space Telescope (HST). Key design challenges included the tight mission timeline (17 months, 9 days of on-orbit operation), the need to tune the system once on-orbit, and limited 2 Kbps real-time data rates and ground asset availability. The result was a completely integrated space vehicle and Stanford mission operations center, which successfully collected and archived 97% of the ‘guide star valid’ data to support the science analysis. Lessons learned and incorporated from the HST flight software development and on-orbit support experience, and Lockheed’s independent research and development effort, will be discussed.

  9. A second generation experiment in fault-tolerant software

    NASA Technical Reports Server (NTRS)

    Knight, J. C.

    1986-01-01

    Information was collected on the efficacy of fault-tolerant software by conducting two large-scale controlled experiments. In the first, an empirical study of multi-version software (MVS) was conducted. The second experiment is an empirical evaluation of self-testing as a method of error detection (STED). The purpose of the MVS experiment was to obtain empirical measurements of the performance of multi-version systems. Twenty versions of a program were prepared at four different sites under reasonably realistic development conditions from the same specifications. The purpose of the STED experiment was to obtain empirical measurements of the performance of assertions in error detection. Eight versions of a program were modified to include assertions at two different sites under controlled conditions. The overall structure of the testing environment for the MVS experiment and its status are described. Work to date in the STED experiment is also presented.

  10. LevRad software as a tool to learn how to proceed with an evaluation of barriers.

    PubMed

    Ferreira, C C; Souza, S O

    2011-05-30

    We developed the LevRad software with the objective of teaching how to proceed with an analysis of barriers shielding against x-rays, minimizing the contact of the professional or the student with x-rays and also preventing wear on the x-ray equipment. Tests of the software were made, and preliminary results indicate that LevRad is efficient as a complementary tool for the development of professionals in diagnostic radiology. In education, an advantage is gained when the beginner uses the software before his or her first contact with x-ray equipment in loco. The software introduces basic knowledge about the evaluation of barriers, prevents wear on the x-ray tube, reinforces the teaching of barrier evaluation, and reduces the collective effective dose by avoiding unnecessary exposures when possible.

  11. A quantitative comparison of corrective and perfective maintenance

    NASA Technical Reports Server (NTRS)

    Henry, Joel; Cain, James

    1994-01-01

    This paper presents a quantitative comparison of corrective and perfective software maintenance activities. The comparison utilizes basic data collected throughout the maintenance process. The data collected are extensive and allow the impact of both types of maintenance to be quantitatively evaluated and compared. Basic statistical techniques test relationships between and among process and product data. The results show interesting similarities and important differences in both process and product characteristics.

  12. Development of a portable bicycle/pedestrian monitoring system for safety enhancement

    NASA Astrophysics Data System (ADS)

    Usher, Colin; Daley, W. D. R.

    2015-03-01

    Pedestrians involved in roadway accidents account for nearly 12 percent of all traffic fatalities and 59,000 injuries each year. Most injuries occur when pedestrians attempt to cross roads, and there have been noted differences in accident rates midblock vs. at intersections. Collecting data on pedestrian behavior is a time consuming manual process that is prone to error. This leads to a lack of quality information to guide the proper design of lane markings and traffic signals to enhance pedestrian safety. Researchers at the Georgia Tech Research Institute are developing and testing an automated system that can be rapidly deployed for data collection to support the analysis of pedestrian behavior at intersections and midblock crossings with and without traffic signals. This system will analyze the collected video data to automatically identify and characterize the number of pedestrians and their behavior. It consists of a mobile trailer with four high definition pan-tilt cameras for data collection. The software is custom designed and uses state of the art commercial pedestrian detection algorithms. We will be presenting the system hardware and software design, challenges, and results from the preliminary system testing. Preliminary results indicate the ability to provide representative quantitative data on pedestrian motion data more efficiently than current techniques.

  13. A Comparative Study of Point Cloud Data Collection and Processing

    NASA Astrophysics Data System (ADS)

    Pippin, J. E.; Matheney, M.; Gentle, J. N., Jr.; Pierce, S. A.; Fuentes-Pineda, G.

    2016-12-01

    Over the past decade, there has been dramatic growth in the acquisition of publicly funded high-resolution topographic data for scientific, environmental, engineering, and planning purposes. These data sets are valuable for applications of interest across a large and varied user community. However, because of the large volumes of data produced by high-resolution mapping technologies and the expense of aerial data collection, it is often difficult to collect and distribute these datasets. Furthermore, the data can be technically challenging to process, requiring software and computing resources not readily available to many users. This study presents a comparison of advanced computing hardware and software used to collect and process point cloud datasets, such as LIDAR scans. Activities included implementation and testing of open source libraries and applications for point cloud data processing such as MeshLab, Blender, PDAL, and PCL. Additionally, a suite of commercial-scale applications, Skanect and CloudCompare, were applied to raw datasets. Handheld hardware solutions, a Structure Scanner and an Xbox 360 Kinect V1, were tested for their ability to scan at three field locations. The resulting projects successfully scanned and processed subsurface karst features ranging from small stalactites to large rooms, as well as a surface waterfall feature. Outcomes support the feasibility of rapid sensing in 3D at field scales.

  14. Dam Failure Inundation Map Project

    NASA Technical Reports Server (NTRS)

    Johnson, Carl; Iokepa, Judy; Dahlman, Jill; Michaud, Jene; Paylor, Earnest (Technical Monitor)

    2000-01-01

    At the end of the first year, we remain on schedule. Property owners were identified and contacted for land access purposes. A prototype software package has been completed and was demonstrated to the Division of Land and Natural Resources (DLNR), National Weather Service (NWS) and Pacific Disaster Center (PDC). A field crew gathered data and surveyed the areas surrounding two dams in Waimea. (A field report is included in the annual report.) Data sensitivity analysis was initiated and completed. A user's manual has been completed. Beta testing of the software was initiated, but not completed. The initial TNK and property owner data collection for the additional test sites on Oahu and Kauai have been initiated.

  15. Comparing On-Orbit and Ground Performance for an S-Band Software-Defined Radio

    NASA Technical Reports Server (NTRS)

    Chelmins, David T.; Welch, Bryan W.

    2014-01-01

    NASA's Space Communications and Navigation Testbed was installed on an external truss of the International Space Station in 2012. The testbed contains several software-defined radios (SDRs), including the Jet Propulsion Laboratory (JPL) SDR, which underwent performance testing throughout 2013 with NASAs Tracking and Data Relay Satellite System (TDRSS). On-orbit testing of the JPL SDR was conducted at S-band with the Glenn Goddard TDRSS waveform and compared against an extensive dataset collected on the ground prior to launch. This paper will focus on the development of a waveform power estimator on the ground post-launch and discuss the performance challenges associated with operating the power estimator in space.

  16. Comparing On-Orbit and Ground Performance for an S-Band Software-Defined Radio

    NASA Technical Reports Server (NTRS)

    Chelmins, David; Welch, Bryan

    2014-01-01

    NASA's Space Communications and Navigation Testbed was installed on an external truss of the International Space Station in 2012. The testbed contains several software-defined radios (SDRs), including the Jet Propulsion Laboratory (JPL) SDR, which underwent performance testing throughout 2013 with NASA's Tracking and Data Relay Satellite System (TDRSS). On-orbit testing of the JPL SDR was conducted at S-band with the Glenn Goddard TDRSS waveform and compared against an extensive dataset collected on the ground prior to launch. This paper will focus on the development of a waveform power estimator on the ground post-launch and discuss the performance challenges associated with operating the power estimator in space.

  17. GenoCore: A simple and fast algorithm for core subset selection from large genotype datasets.

    PubMed

    Jeong, Seongmun; Kim, Jae-Yoon; Jeong, Soon-Chun; Kang, Sung-Taeg; Moon, Jung-Kyung; Kim, Namshin

    2017-01-01

    Selecting core subsets from plant genotype datasets is important for enhancing cost-effectiveness and shortening the time required for analyses such as genome-wide association studies (GWAS) and genomics-assisted breeding of crop species. Recently, large numbers of genetic markers (>100,000 single nucleotide polymorphisms) have been identified from high-density single nucleotide polymorphism (SNP) arrays and next-generation sequencing (NGS) data. However, no software has been available for picking out an efficient and consistent core subset from such a huge dataset. Software is needed that can consistently extract the genetically important samples in a population. We here present a new program, GenoCore, which can quickly and efficiently find a core subset representing the entire population. We introduce simple measures of coverage and diversity scores, which reflect genotype errors and genetic variations and can help to select samples rapidly and accurately for crop genotype datasets. Comparison of our method to other core collection software using example datasets is performed to validate the performance with respect to genetic distance, diversity, coverage, required system resources, and the number of selected samples. GenoCore selects the smallest, most consistent, and most representative core collection from all samples using less memory with more efficient scores, and shows greater genetic coverage compared to the other software tested. GenoCore was written in the R language, and can be accessed online with an example dataset and test results at https://github.com/lovemun/Genocore.
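
    To make the coverage idea concrete, the toy sketch below greedily adds whichever sample covers the most not-yet-covered marker alleles until every allele observed in the population is represented. The genotype matrix is hypothetical and the scoring is a simplification, not GenoCore's exact algorithm (which is implemented in R); the sketch here is Python.

        # Greedy core-subset selection by marker-allele coverage.
        genotypes = {           # sample -> allele observed at each of 5 markers
            "s1": "AACGT",
            "s2": "ATCGA",
            "s3": "GACTT",
            "s4": "ATCTA",
        }
        target = {(i, g[i]) for g in genotypes.values() for i in range(5)}

        core, covered = [], set()
        while covered != target:
            best = max(genotypes, key=lambda s: len(
                {(i, genotypes[s][i]) for i in range(5)} - covered))
            core.append(best)
            covered |= {(i, genotypes[best][i]) for i in range(5)}

        print("core subset:", core)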

  18. China SLAT Plan Template

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dietrich, Richard E.

    2016-07-01

    This document serves as the System-Level Acceptance Test (SLAT) Plan for Site Name, City, Country. This test plan is to provide independent testing of the Radiation Detection System (RDS) installed at Site Name to verify that Customs has been delivered a fully-functioning system as required by all contractual commitments. The system includes all installed hardware and software components. The SLAT plan will verify that separate components are working individually and collectively from a system perspective.

  19. Educational interactive multimedia software: The impact of interactivity on learning

    NASA Astrophysics Data System (ADS)

    Reamon, Derek Trent

    This dissertation discusses the design, development, deployment and testing of two versions of educational interactive multimedia software. Both versions of the software are focused on teaching mechanical engineering undergraduates about the fundamentals of direct-current (DC) motor physics and selection. The two versions of the Motor Workshop software cover the same basic materials on motors, but differ in the level of interactivity between the students and the software. Here, the level of interactivity refers to the particular role of the computer in the interaction between the user and the software. In one version, the students navigate through information that is organized by topic, reading text and viewing embedded video clips; this is referred to as "low-level interactivity" software because the computer simply presents the content. In the other version, the students are given a task to accomplish: they must design a small motor-driven 'virtual' vehicle that competes against computer-generated opponents. The interaction is guided by the software, which offers advice from 'experts' and provides contextual information; we refer to this as "high-level interactivity" software because the computer is actively participating in the interaction. The software was used in two sets of experiments, where students using the low-level interactivity software served as the 'control group,' and students using the highly interactive software were the 'treatment group.' Data, including pre- and post-performance tests, questionnaire responses, learning style characterizations, activity tracking logs, and videotapes, were collected for analysis. Statistical and observational research methods were applied to the various data to test the hypothesis that the level of interactivity affects the learning situation, with higher levels of interactivity being more effective for learning. The results show that both the low-level and high-level interactive versions of the software were effective in promoting learning about the subject of motors. The focus of learning varied between users of the two versions, however. The low-level version was more effective for teaching concepts and terminology, while the high-level version seemed to be more effective for teaching engineering applications.

  20. Texas flexible pavements and overlays : calibration plans for M-E models and related software.

    DOT National Transportation Integrated Search

    2013-06-01

    This five-year project was initiated to collect materials and pavement performance data on a minimum of 100 highway test sections around the State of Texas, incorporating flexible pavements and overlays. Besides being used to calibrate and validate m...

  1. ScriptingRT: A Software Library for Collecting Response Latencies in Online Studies of Cognition

    PubMed Central

    Schubert, Thomas W.; Murteira, Carla; Collins, Elizabeth C.; Lopes, Diniz

    2013-01-01

    ScriptingRT is a new open source tool to collect response latencies in online studies of human cognition. ScriptingRT studies run as Flash applets in enabled browsers. ScriptingRT provides the building blocks of response latency studies, which are then combined with generic Apache Flex programming. Six studies evaluate the performance of ScriptingRT empirically. Studies 1–3 use specialized hardware to measure the variance of response time measurement and stimulus presentation timing. Studies 4–6 implement a Stroop paradigm and run it both online and in the laboratory, comparing ScriptingRT to other response latency software. Altogether, the studies show that Flash programs developed in ScriptingRT exhibit a small lag and an increased variance in response latencies. However, this did not significantly influence measured effects: the Stroop effect was reliably replicated in all studies, and the effects found did not depend on the software used. We conclude that ScriptingRT can be used to test response latency effects online. PMID:23805326

  2. System IDentification Programs for AirCraft (SIDPAC)

    NASA Technical Reports Server (NTRS)

    Morelli, Eugene A.

    2002-01-01

    A collection of computer programs for aircraft system identification is described and demonstrated. The programs, collectively called System IDentification Programs for AirCraft, or SIDPAC, were developed in MATLAB as m-file functions. SIDPAC has been used successfully at NASA Langley Research Center with data from many different flight test programs and wind tunnel experiments. SIDPAC includes routines for experiment design, data conditioning, data compatibility analysis, model structure determination, equation-error and output-error parameter estimation in both the time and frequency domains, real-time and recursive parameter estimation, low order equivalent system identification, estimated parameter error calculation, linear and nonlinear simulation, plotting, and 3-D visualization. An overview of SIDPAC capabilities is provided, along with a demonstration of the use of SIDPAC with real flight test data from the NASA Glenn Twin Otter aircraft. The SIDPAC software is available without charge to U.S. citizens by request to the author, contingent on the requestor completing a NASA software usage agreement.
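
    In its simplest time-domain form, the equation-error method listed among SIDPAC's capabilities reduces to a linear least-squares fit of a measured state derivative onto measured states and controls. A hedged, self-contained illustration with an invented pitch-moment model and synthetic data (not SIDPAC code):

```python
# Hedged sketch of time-domain equation-error parameter estimation:
# fit qdot = M_alpha*alpha + M_q*q + M_de*de by linear least squares.
# Toy model and data are invented; this is not SIDPAC code.
import numpy as np

rng = np.random.default_rng(1)
N = 500
alpha = rng.normal(0, 0.05, N)      # angle of attack, rad
q     = rng.normal(0, 0.10, N)      # pitch rate, rad/s
de    = rng.normal(0, 0.03, N)      # elevator deflection, rad
true_theta = np.array([-4.0, -1.5, -6.0])          # M_alpha, M_q, M_de
X = np.column_stack([alpha, q, de])                # regressor matrix
qdot = X @ true_theta + rng.normal(0, 0.02, N)     # "measured" derivative

theta_hat, *_ = np.linalg.lstsq(X, qdot, rcond=None)
sigma2 = np.sum((qdot - X @ theta_hat) ** 2) / (N - X.shape[1])
cov = sigma2 * np.linalg.inv(X.T @ X)              # parameter error covariance
print(theta_hat, np.sqrt(np.diag(cov)))
```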

  3. STARS: a software application for the EBEX autonomous daytime star cameras

    NASA Astrophysics Data System (ADS)

    Chapman, Daniel; Didier, Joy; Hanany, Shaul; Hillbrand, Seth; Limon, Michele; Miller, Amber; Reichborn-Kjennerud, Britt; Tucker, Greg; Vinokurov, Yury

    2014-07-01

    The E and B Experiment (EBEX) is a balloon-borne telescope designed to probe polarization signals in the CMB resulting from primordial gravitational waves, gravitational lensing, and Galactic dust emission. EBEX completed an 11 day flight over Antarctica in January 2013 and data analysis is underway. EBEX employs two star cameras to achieve its real-time and post-flight pointing requirements. We wrote a software application called STARS to operate, command, and collect data from each of the star cameras, and to interface them with the main flight computer. We paid special attention to making the software robust against potential in-flight failures. We report on the implementation, testing, and successful in-flight performance of STARS.

  4. Precise Documentation: The Key to Better Software

    NASA Astrophysics Data System (ADS)

    Parnas, David Lorge

    The prime cause of the sorry “state of the art” in software development is our failure to produce good design documentation. Poor documentation is the cause of many errors and reduces efficiency in every phase of a software product's development and use. Most software developers believe that “documentation” refers to a collection of wordy, unstructured, introductory descriptions, thousands of pages that nobody wanted to write and nobody trusts. In contrast, Engineers in more traditional disciplines think of precise blueprints, circuit diagrams, and mathematical specifications of component properties. Software developers do not know how to produce precise documents for software. Software developers also think that documentation is something written after the software has been developed. In other fields of Engineering much of the documentation is written before and during the development. It represents forethought, not afterthought. Among the benefits of better documentation would be: easier reuse of old designs, better communication about requirements, more useful design reviews, easier integration of separately written modules, more effective code inspection, more effective testing, and more efficient corrections and improvements. This paper explains how to produce and use precise software documentation and illustrates the methods with several examples.

  5. A methodology for collecting valid software engineering data

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Weiss, David M.

    1983-01-01

    An effective data collection method for evaluating software development methodologies and for studying the software development process is described. The method uses goal-directed data collection to evaluate methodologies with respect to the claims made for them. Such claims are used as a basis for defining the goals of the data collection, establishing a list of questions of interest to be answered by data analysis, defining a set of data categorization schemes, and designing a data collection form. The data to be collected are based on the changes made to the software during development, and are obtained when the changes are made. To ensure accuracy of the data, validation is performed concurrently with software development and data collection. Validation is based on interviews with those people supplying the data. Results from using the methodology show that data validation is a necessary part of change data collection. Without it, as much as 50% of the data may be erroneous. Feasibility of the data collection methodology was demonstrated by applying it to five different projects in two different environments. The application showed that the methodology was both feasible and useful.

  6. Cataloging and Organizing Microcomputer Software--Where Do We Go from First Base?

    ERIC Educational Resources Information Center

    Choi, Susan E.

    This position paper addresses general topics to be considered when organizing library software collections. Tasks involved in organizing and cataloging educational software collections are discussed, including arrangement/classification; the type of catalog; descriptions of the software; the general materials designator; storage requirements; and…

  7. A Data Collection and Representation Framework for Software and Human-Computer Interaction Measurements.

    DTIC Science & Technology

    2000-01-04

    by Miara, Musselman, Navarro, and Shneiderman [Miara et al. 1983] they found that indentation correlated strongly with comprehension. They tested 47...Dissertation, Auburn University, Auburn, AL, August 1996. MIARA, R.J., MUSSELMAN, J.A., NAVARRO, J.A., AND SHNEIDERMAN, B. 1983. Program Indentation and

  8. Solutions for acceleration measurement in vehicle crash tests

    NASA Astrophysics Data System (ADS)

    Dima, D. S.; Covaciu, D.

    2017-10-01

    Crash tests are useful for validating computer simulations of road traffic accidents. One of the most important parameters measured is the acceleration. The evolution of acceleration versus time during a crash test forms a crash pulse. The correctness of the crash pulse determination depends on the data acquisition system used. Recommendations regarding the instrumentation for impact tests are given in standards, which are focused on the use of accelerometers as impact sensors. The goal of this paper is to present the device and software developed by the authors for data acquisition and processing. The system includes two accelerometers with different input ranges, a processing unit based on a 32-bit microcontroller, and a data logging unit with an SD card. Data collected on the card, as text files, are processed with dedicated software running on personal computers. The processing is based on diagrams and includes the digital filters recommended in standards.
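
    The digital filters recommended for crash channels (for example, the CFC filters of SAE J211) are commonly realized as low-order Butterworth low-pass filters run forward and backward over the record to avoid phase distortion. A hedged sketch of that style of post-processing on a synthetic pulse; the cutoff and sample rate are illustrative values, not the standard's exact specification:

```python
# Hedged sketch of crash-pulse post-processing with a zero-phase low-pass
# Butterworth filter. The 100 Hz cutoff and 10 kHz rate are illustrative.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 10_000.0                          # sample rate, Hz
t = np.arange(0, 0.2, 1 / fs)          # 200 ms record
# Synthetic crash pulse: half-sine deceleration plus sensor noise.
pulse = -30 * 9.81 * np.sin(np.pi * t / 0.1) * (t < 0.1)
accel = pulse + np.random.default_rng(2).normal(0, 20, t.size)

b, a = butter(2, 100 / (fs / 2))       # 2nd-order, 100 Hz cutoff
filtered = filtfilt(b, a, accel)       # forward-backward => zero phase lag
delta_v = np.trapz(filtered, t)        # delta-v from the filtered pulse
print(f"delta-v = {delta_v:.1f} m/s")
```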

  9. Brief Report: A Mobile Application to Treat Prosodic Deficits in Autism Spectrum Disorder and Other Communication Impairments: A Pilot Study.

    PubMed

    Simmons, Elizabeth Schoen; Paul, Rhea; Shic, Frederick

    2016-01-01

    This study examined the acceptability of a mobile application, SpeechPrompts, designed to treat prosodic disorders in children with ASD and other communication impairments. Ten speech-language pathologists (SLPs) in public schools and 40 of their students, aged 5-19 years, with prosody deficits participated. Students received treatment with the software over eight weeks. Pre- and post-treatment speech samples and student engagement data were collected. Feedback on the utility of the software was also obtained. SLPs implemented the software with their students in an authentic education setting. Student engagement ratings indicated that students' attention to the software was maintained during treatment. Although more testing is warranted, post-treatment prosody ratings suggest that SpeechPrompts has potential to be a useful tool in the treatment of prosodic disorders.

  10. Database Performance Monitoring for the Photovoltaic Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klise, Katherine A.

    The Database Performance Monitoring (DPM) software (copyright in process) is being developed at Sandia National Laboratories to perform quality control analysis on time series data. The software loads time-indexed databases (currently CSV format), performs a series of quality control tests defined by the user, and creates reports which include summary statistics, tables, and graphics. DPM can be set up to run on an automated schedule defined by the user. For example, the software can be run once per day to analyze data collected on the previous day. HTML formatted reports can be sent via email or hosted on a website. To compare the performance of several databases, summary statistics and graphics can be gathered in a dashboard view which links to detailed reporting information for each database. The software can be customized for specific applications.
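
    The report describes the software at a high level only; as a hedged sketch of what one user-defined quality-control test on a time-indexed CSV might look like (the file layout, column names, and limits below are invented):

```python
# Hedged sketch of a DPM-style range test on time-indexed CSV data.
# Column names ('timestamp', 'dc_power') and limits are invented.
import pandas as pd

def range_test(csv_path, column, lower, upper):
    df = pd.read_csv(csv_path, parse_dates=["timestamp"], index_col="timestamp")
    bad = df[(df[column] < lower) | (df[column] > upper)]
    summary = {
        "points": len(df),
        "failures": len(bad),
        "failure_rate": len(bad) / max(len(df), 1),
        "first_failure": bad.index.min() if len(bad) else None,
    }
    return summary, bad

summary, failures = range_test("pv_2024-01-01.csv", "dc_power", 0.0, 5000.0)
print(summary)
```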

  11. Development and Flight Testing of an Autonomous Landing Gear Health-Monitoring System

    NASA Technical Reports Server (NTRS)

    Woodard, Stanley E.; Coffey, Neil C.; Gonzalez, Guillermo A.; Taylor, B. Douglas; Brett, Rube R.; Woodman, Keith L.; Weathered, Brenton W.; Rollins, Courtney H.

    2003-01-01

    Development and testing of an adaptable vehicle health-monitoring architecture is presented. The architecture is being developed for a fleet of vehicles. It has three operational levels: one or more remote data acquisition units located throughout the vehicle; a command and control unit located within the vehicle; and a terminal collection unit to collect analysis results from all vehicles. Each level is capable of performing autonomous analysis with a trained expert system. Communication between all levels is done with wireless radio frequency interfaces. The remote data acquisition unit has an eight-channel programmable digital interface that gives the user discretion in choosing the type of sensors, the number of sensors, and the sampling rate and duration for each sensor. The architecture provides a framework for tributary analysis. All measurements at the lowest operational level are reduced to provide analysis results necessary to gauge changes from established baselines. These are then collected at the next level to identify any global trends or common features from the prior level. This process is repeated until the results are reduced at the highest operational level. In this framework, only analysis results are forwarded to the next level, to reduce telemetry congestion. The system's remote data acquisition hardware and non-analysis software have been flight tested on the NASA Langley B757's main landing gear. The flight tests were performed to validate the following: the wireless radio frequency communication capabilities of the system; the hardware design; command and control; software operation; and data acquisition, storage, and retrieval.

  12. Guide to data collection

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Guidelines and recommendations are presented for the collection of software development data. Motivation and planning for, and implementation and management of, a data collection effort are discussed. Topics covered include types, sources, and availability of data; methods and costs of data collection; types of analyses supported; and warnings and suggestions based on software engineering laboratory (SEL) experiences. This document is intended as a practical guide for software managers and engineers, abstracted and generalized from 5 years of SEL data collection.

  13. A New Method for Global Optimization Based on Stochastic Differential Equations.

    DTIC Science & Technology

    1984-12-01

    Optimizacion Global de Funciones, Universidad Nacional Autonoma de México, Instituto de Investigaciones en Matematicas Aplicadas y en Sistemas, Report...SIGMA package and its usage are described in full detail in Annex A5; the complete listing of the FORTRAN code is in Annex A6. 5. Test problems Since...software implementation on a number of test problems: and therefore a collection of test problems naturally began to build up during project development

  14. PopED lite: An optimal design software for preclinical pharmacokinetic and pharmacodynamic studies.

    PubMed

    Aoki, Yasunori; Sundqvist, Monika; Hooker, Andrew C; Gennemark, Peter

    2016-04-01

    Optimal experimental design approaches are seldom used in preclinical drug discovery. The objective is to develop an optimal design software tool specifically designed for preclinical applications in order to increase the efficiency of in vivo drug discovery studies. Several realistic experimental design case studies were collected, and many preclinical experimental teams were consulted to determine the design goal of the software tool. The tool obtains an optimized experimental design by solving a constrained optimization problem, where each experimental design is evaluated using some function of the Fisher Information Matrix. The software was implemented in C++ using the Qt framework to assure responsive user-software interaction through a rich graphical user interface while achieving the desired computational speed. In addition, a discrete global optimization algorithm was developed and implemented. The software design goals were simplicity, speed and intuition. Based on these design goals, we have developed the publicly available software PopED lite (http://www.bluetree.me/PopED_lite). Optimization computation was, on average over 14 test problems, 30 times faster in PopED lite than in an existing optimal design software tool. PopED lite is now used in real drug discovery projects, and a few of these case studies are presented in this paper. PopED lite is designed to be simple, fast and intuitive. Simple, to give many users access to basic optimal design calculations. Fast, to fit a short design-execution cycle and allow interactive experimental design (test one design, discuss the proposed design, test another design, etc.). Intuitive, so that the input to and output from the software tool can easily be understood by users without knowledge of the theory of optimal design. In this way, PopED lite is highly useful in practice and complements existing tools. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
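
    The abstract's central idea, scoring each candidate design through a function of the Fisher Information Matrix, can be illustrated in a greatly simplified, hedged form: a toy one-parameter-pair exponential model scored by the D-optimality log-determinant. This is not PopED lite's criterion, model, or code:

```python
# Hedged sketch of FIM-based design scoring (toy example, not PopED lite):
# rank sampling-time designs for C(t) = (Dose/V) * exp(-(CL/V) * t)
# by the log-determinant of J^T J (D-optimality, additive error).
import numpy as np

dose, V, CL = 100.0, 10.0, 2.0        # illustrative parameter values

def model(t, V, CL):
    return (dose / V) * np.exp(-(CL / V) * t)

def log_det_fim(times, h=1e-6):
    t = np.asarray(times, float)
    # Numerical sensitivities of the predictions w.r.t. (V, CL).
    dV  = (model(t, V + h, CL) - model(t, V - h, CL)) / (2 * h)
    dCL = (model(t, V, CL + h) - model(t, V, CL - h)) / (2 * h)
    J = np.column_stack([dV, dCL])
    return np.linalg.slogdet(J.T @ J)[1]   # higher = more informative

designs = {"early": [0.25, 0.5, 1.0], "spread": [0.5, 2.0, 8.0]}
for name, times in designs.items():
    print(name, log_det_fim(times))
```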

  15. COPS: Large-scale nonlinearly constrained optimization problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bondarenko, A.S.; Bortz, D.M.; More, J.J.

    2000-02-10

    The authors have started the development of COPS, a collection of large-scale nonlinearly Constrained Optimization Problems. The primary purpose of this collection is to provide difficult test cases for optimization software. Problems in the current version of the collection come from fluid dynamics, population dynamics, optimal design, and optimal control. For each problem they provide a short description of the problem, notes on the formulation of the problem, and results of computational experiments with general optimization solvers. They currently have results for DONLP2, LANCELOT, MINOS, SNOPT, and LOQO.

  16. Virtual Service, Real Data: Results of a Pilot Study.

    ERIC Educational Resources Information Center

    Kibbee, Jo; Ward, David; Ma, Wei

    2002-01-01

    Describes a pilot project at the University of Illinois at Urbana-Champaign reference and undergraduate libraries to test the feasibility of offering real-time online reference service via their Web site. Discusses software selection, policies and procedures, promotion and marketing, user interface, training and staffing, data collection, and…

  17. Towards a cross-platform software framework to support end-to-end hydrometeorological sensor network deployment

    NASA Astrophysics Data System (ADS)

    Celicourt, P.; Sam, R.; Piasecki, M.

    2016-12-01

    Global phenomena such as climate change and large-scale environmental degradation require the collection of accurate environmental data at detailed spatial and temporal scales, from which knowledge and actionable insights can be derived using data science methods. Despite significant advances in sensor network technologies, sensor and sensor network deployment remains a labor-intensive, time-consuming, cumbersome and expensive task. These factors demonstrate why environmental data collection remains a challenge, especially in developing countries where technical infrastructure, expertise and pecuniary resources are scarce, and why dense, long-term environmental data collection has historically been quite difficult. Moreover, hydrometeorological data collection efforts usually overlook the critically important inclusion of a standards-based system for storing, managing, organizing, indexing, documenting and sharing sensor data. We are developing a cross-platform software framework in the Python programming language that will allow us to develop a low-cost, end-to-end (from sensor to publication) system for monitoring hydrometeorological conditions. The software framework contains provisions for describing sensors, sensor platforms, calibration and network protocols, as well as for sensor programming, data storage, data publication and visualization, and, importantly, data retrieval in a desired unit system. It is being tested on the Raspberry Pi microcomputer as an end node and a laptop PC as the base station in a wireless setting.
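
    As a hedged sketch of the sensor-description and unit-conversion layer the abstract calls for (the class names, fields, and conversion table below are invented, not the framework's API):

```python
# Hedged sketch of a sensor-description layer with retrieval in a desired
# unit system; names and the conversion table are invented.
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Tuple

CONVERSIONS: Dict[Tuple[str, str], Callable[[float], float]] = {
    ("degC", "degF"): lambda c: c * 9 / 5 + 32,
    ("mm", "in"): lambda mm: mm / 25.4,
}

@dataclass
class Sensor:
    name: str
    quantity: str            # e.g. "air_temperature"
    native_unit: str         # unit the hardware reports
    readings: List[Tuple[str, float]] = field(default_factory=list)

    def record(self, timestamp: str, value: float) -> None:
        self.readings.append((timestamp, value))

    def retrieve(self, unit: str) -> List[Tuple[str, float]]:
        if unit == self.native_unit:
            return list(self.readings)
        convert = CONVERSIONS[(self.native_unit, unit)]
        return [(ts, convert(v)) for ts, v in self.readings]

s = Sensor("sht31-roof", "air_temperature", "degC")
s.record("2016-07-01T12:00:00Z", 31.4)
print(s.retrieve("degF"))
```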

  18. How well does voice interaction work in space?

    NASA Technical Reports Server (NTRS)

    Morris, Randy B.; Whitmore, Mihriban; Adam, Susan C.

    1993-01-01

    The methods and results of an evaluation of the Voice Navigator software package are discussed. The first phase, or ground phase, of the study consisted of creating, or training, computer voice files of specific commands; this involved repeating each of six commands eight times. The files were then tested for recognition accuracy by the software aboard the microgravity aircraft. During the second phase, both voice training and testing were performed in microgravity. In-flight training was done due to problems encountered in phase one, which were believed to be caused by ambient noise levels. Both quantitative and qualitative data were collected. Only one of the commands was found to offer consistently high recognition rates across subjects during the second phase.

  19. Recent enhancements to and applications of the SmartBrick structural health monitoring platform

    NASA Astrophysics Data System (ADS)

    Gunasekaran, A.; Cross, S.; Patel, N.; Sedigh, S.

    2012-04-01

    The SmartBrick network is an autonomous and wireless solution for structural health monitoring of civil infrastructure. The base station is currently in its third generation and has been laboratory- and field-tested in the United States and Italy. The second generation of the sensor nodes has been laboratory-tested as of publication. In this paper, we present recent enhancements made to the hardware and software of the SmartBrick platform. Salient improvements include the development of a new base station with fully integrated long-range GSM (cellular) and short-range ZigBee communication. The major software improvement described in this paper is migration to the ZigBee PRO stack, which was carried out in the interest of interoperability. To broaden the application of the platform to critical environments that require survivability and fault tolerance, we have striven to achieve compliance with military standards in the areas of hardware, software, and communication. We describe these efforts and present a survey of the military standards investigated. Also described is the instrumentation, with the SmartBrick platform, of a three-span experimental bridge in Washington County, Missouri. The sensors, whose outputs are conditioned and multiplexed, include strain gauges, thermocouples, push potentiometers, and three-axis inclinometers. Data collected are stored on site and reported over the cellular network. Real-time alerts are generated if any monitored parameter falls outside its acceptable range. Redundant sensing and communication provide reliability and facilitate corroboration of the data collected. A web interface is used to issue remote configuration commands and to facilitate access to and visualization of the data collected.

  20. Statistical analysis of Turbine Engine Diagnostic (TED) field test data

    NASA Astrophysics Data System (ADS)

    Taylor, Malcolm S.; Monyak, John T.

    1994-11-01

    During the summer of 1993, a field test of the Turbine Engine Diagnostic (TED) software, developed jointly by the U.S. Army Research Laboratory and the U.S. Army Ordnance Center and School, was conducted at Fort Stewart, GA. The data were collected in conformance with a cross-over design, some considerations of which are detailed. The initial analysis of the field test data was exploratory, followed by a more formal investigation. Technical aspects of the data analysis and the insights that were elicited are reported.

  1. Programmable, automated transistor test system

    NASA Technical Reports Server (NTRS)

    Truong, L. V.; Sundburg, G. R.

    1986-01-01

    A programmable, automated transistor test system was built to supply experimental data on new and advanced power semiconductors. The data will be used for analytical models and by engineers in designing space and aircraft electric power systems. A pulsed power technique was used at low duty cycles in a nondestructive test to examine the dynamic switching characteristic curves of power transistors in the 500 to 1000 V, 10 to 100 A range. Data collection, manipulation, storage, and output are operator interactive but are guided and controlled by the system software.

  2. Bayesian inference for psychology. Part II: Example applications with JASP.

    PubMed

    Wagenmakers, Eric-Jan; Love, Jonathon; Marsman, Maarten; Jamil, Tahira; Ly, Alexander; Verhagen, Josine; Selker, Ravi; Gronau, Quentin F; Dropmann, Damian; Boutin, Bruno; Meerhoff, Frans; Knight, Patrick; Raj, Akash; van Kesteren, Erik-Jan; van Doorn, Johnny; Šmíra, Martin; Epskamp, Sacha; Etz, Alexander; Matzke, Dora; de Jong, Tim; van den Bergh, Don; Sarafoglou, Alexandra; Steingroever, Helen; Derks, Koen; Rouder, Jeffrey N; Morey, Richard D

    2018-02-01

    Bayesian hypothesis testing presents an attractive alternative to p value hypothesis testing. Part I of this series outlined several advantages of Bayesian hypothesis testing, including the ability to quantify evidence and the ability to monitor and update this evidence as data come in, without the need to know the intention with which the data were collected. Despite these and other practical advantages, Bayesian hypothesis tests are still reported relatively rarely. An important impediment to the widespread adoption of Bayesian tests is arguably the lack of user-friendly software for the run-of-the-mill statistical problems that confront psychologists for the analysis of almost every experiment: the t-test, ANOVA, correlation, regression, and contingency tables. In Part II of this series we introduce JASP (http://www.jasp-stats.org), an open-source, cross-platform, user-friendly graphical software package that allows users to carry out Bayesian hypothesis tests for standard statistical problems. JASP is based in part on the Bayesian analyses implemented in Morey and Rouder's BayesFactor package for R. Armed with JASP, the practical advantages of Bayesian hypothesis testing are only a mouse click away.
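
    The t-test Bayes factors in JASP descend from the JZS Bayes factor of Rouder et al. (2009), via the BayesFactor package mentioned above. As a hedged sketch of the underlying computation, here is the one-sample version with a unit-scale Cauchy prior (illustrative only; JASP's default uses a different Cauchy scale, and numbers should be checked against JASP or BayesFactor before relying on them):

```python
# Hedged sketch of the one-sample JZS Bayes factor (Rouder et al., 2009)
# with a unit-scale Cauchy prior on effect size. Verify against
# JASP/BayesFactor before relying on the numbers.
import numpy as np
from scipy import integrate

def jzs_bf10(t, n):
    v = n - 1                                   # degrees of freedom
    def integrand(g):
        return ((1 + n * g) ** -0.5
                * (1 + t**2 / ((1 + n * g) * v)) ** (-(v + 1) / 2)
                * (2 * np.pi) ** -0.5 * g ** -1.5 * np.exp(-1 / (2 * g)))
    marginal_h1, _ = integrate.quad(integrand, 0, np.inf)
    marginal_h0 = (1 + t**2 / v) ** (-(v + 1) / 2)
    return marginal_h1 / marginal_h0            # evidence for H1 over H0

print(jzs_bf10(t=2.5, n=30))
```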

  3. Online Learning Flight Control for Intelligent Flight Control Systems (IFCS)

    NASA Technical Reports Server (NTRS)

    Niewoehner, Kevin R.; Carter, John (Technical Monitor)

    2001-01-01

    The research accomplishments for the cooperative agreement 'Online Learning Flight Control for Intelligent Flight Control Systems (IFCS)' include the following: (1) previous IFC program data collection and analysis; (2) IFC program support site (configured IFC systems support network, configured Tornado/VxWorks OS development system, made Configuration and Documentation Management Systems Internet accessible); (3) Airborne Research Test Systems (ARTS) II Hardware (developed hardware requirements specification, developing environmental testing requirements, hardware design, and hardware design development); (4) ARTS II software development laboratory unit (procurement of lab style hardware, configured lab style hardware, and designed interface module equivalent to ARTS II faceplate); (5) program support documentation (developed software development plan, configuration management plan, and software verification and validation plan); (6) LWR algorithm analysis (performed timing and profiling on algorithm); (7) pre-trained neural network analysis; (8) Dynamic Cell Structures (DCS) Neural Network Analysis (performing timing and profiling on algorithm); and (9) conducted technical interchange and quarterly meetings to define IFC research goals.

  4. Statistical Analysis of an Infrared Thermography Inspection of Reinforced Carbon-Carbon

    NASA Technical Reports Server (NTRS)

    Comeaux, Kayla

    2011-01-01

    Each piece of flight hardware used on the shuttle must be analyzed and pass NASA requirements before the shuttle is ready for launch. One tool used to detect cracks that lie within flight hardware is infrared flash thermography. This is a non-destructive testing technique which uses an intense flash of light to heat up the surface of a material, after which an infrared camera is used to record the cooling of the material. Since cracks within the material obstruct the natural heat flow through the material, they are visible when viewing the data from the infrared camera. We used Ecotherm, a software program, to collect data pertaining to the delaminations, and analyzed the data using Ecotherm and the University of Dayton Log Logistic Probability of Detection (POD) software. The goal was to reproduce the statistical analysis produced by the University of Dayton software by using scatter plots, log transforms, and residuals to test the assumption of normality for the residuals.

  5. Interactive specification acquisition via scenarios: A proposal

    NASA Technical Reports Server (NTRS)

    Hall, Robert J.

    1992-01-01

    Some reactive systems are most naturally specified by giving large collections of behavior scenarios. These collections not only specify the behavior of the system, but also provide good test suites for validating the implemented system. Due to the complexity of the systems and the number of scenarios, however, it appears that automated assistance is necessary to make this software development process workable. Interactive Specification Acquisition Tool (ISAT) is a proposed interactive system for supporting the acquisition and maintenance of a formal system specification from scenarios, as well as automatic synthesis of control code and automated test generation. This paper discusses the background, motivation, proposed functions, and implementation status of ISAT.

  6. Data Collection with Linux in the Undergraduate Physics Lab

    NASA Astrophysics Data System (ADS)

    Ramey, R. Dwayne

    2004-11-01

    Electronic data devices such as photogates can greatly facilitate data collection in the undergraduate physics laboratory. Unfortunately, these devices have several practical drawbacks. While the photogates themselves are not particularly expensive, manufacturers have created intermediary hardware devices for data buffering and manipulation. These devices, while useful in some contexts, greatly increase the overall price of data collection and, through the use of proprietary software, limit the ability of the end user to customize the software. As an alternative, I outline the procedure for establishing a computer-based data collection system that consists of open-source software and user-constructed connections. The data collection system consists of the wiring needed to connect a data device to a computer and the software needed to collect and manipulate data. Data devices can be connected to a computer through either the USB port or the game port of a sound card. Software capable of collecting and manipulating the data from a photogate-type device on a Linux system has been developed and will be discussed. Results for typical undergraduate photogate-based experiments will be shown, and error limits and data collection rates will be discussed for both the game port and USB connections.
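
    Whatever the connection, the core computation on photogate data is simple interval arithmetic; a hedged sketch with an invented event format:

```python
# Hedged sketch of the core photogate computation: speed from the time a
# flag of known length blocks the gate. The event format is invented.
FLAG_LENGTH = 0.05   # metres, width of the card passing through the gate

def speeds(events, flag_length=FLAG_LENGTH):
    """events: list of (timestamp_s, state) with state 1 = beam blocked."""
    out, t_block = [], None
    for t, state in events:
        if state == 1:
            t_block = t
        elif t_block is not None:
            out.append(flag_length / (t - t_block))   # m/s
            t_block = None
    return out

print(speeds([(0.000, 1), (0.025, 0), (0.400, 1), (0.412, 0)]))
```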

  7. Development and evaluation of a web-based software for crash data collection, processing and analysis.

    PubMed

    Montella, Alfonso; Chiaradonna, Salvatore; Criscuolo, Giorgio; De Martino, Salvatore

    2017-02-05

    The first step in the development of an effective safety management system is to create reliable crash databases, since the quality of decision making in road safety depends on the quality of the data on which decisions are based. Improving crash data is a worldwide priority, as highlighted in the Global Plan for the Decade of Action for Road Safety adopted by the United Nations, which recognizes that the overall goal of the plan will be attained by improving the quality of data collection at the national, regional and global levels. Crash databases provide the basic information for effective highway safety efforts at any level of government, but uniformity is lacking among countries and among the different jurisdictions within the same country. Several existing databases show significant drawbacks which hinder their effective use for safety analysis and improvement. Furthermore, modern technologies offer great potential for significant improvements of existing methods and procedures for crash data collection, processing and analysis. To address these issues, in this paper we present the development and evaluation of a web-based, platform-independent software for crash data collection, processing and analysis. The software is designed for mobile and desktop electronic devices and enables a guided and automated drafting of the crash report, assisting police officers both on-site and in the office. The software development was based on a detailed critical review of existing Australasian, EU, and U.S. crash databases and software, as well as on continuous consultation with the stakeholders. The evaluation was carried out by comparing the completeness, timeliness, and accuracy of crash data before and after the use of the software in the city of Vico Equense, in the south of Italy, and showed significant advantages. The amount of collected information increased from 82 variables to 268 variables, i.e., a 227% increase. The time saving was more than one hour per crash, i.e., a 36% reduction. The on-site data collection did not produce time savings; however, this is a temporary weakness that is expected to disappear once officers become more acquainted with the software. The phase of evaluation, processing and analysis carried out in the office was dramatically shortened, i.e., a 69% reduction. Another benefit was the standardization, which allowed fast and consistent data analysis and evaluation. While all these benefits are remarkable, the most valuable benefit of the new procedure was the reduction of police officers' mistakes during the manual operations of survey and data evaluation. Because of these benefits, the satisfaction questionnaires administered to the police officers after the testing phase showed very good acceptance of the procedure. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. 242A Distributed Control System Year 2000 Acceptance Test Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    TEATS, M.C.

    1999-08-31

    This report documents acceptance test results for the 242-A Evaporator distributed control system upgrade to D/3 version 9.0-2 for year 2000 compliance. It documents the test results obtained by acceptance testing as directed by procedure HNF-2695. This verification procedure will document the initial testing and evaluation of potential 242-A Distributed Control System (DCS) operating difficulties across the year 2000 boundary and the calendar adjustments needed for the leap year. Baseline system performance data will be recorded using current, as-is operating system software. Data will also be collected for operating system software that has been modified to correct year 2000 problems. This verification procedure is intended to be generic such that it may be performed on any D/3™ (GSE Process Solutions, Inc.) distributed control system that runs with the VMS™ (Digital Equipment Corporation) operating system. This test may be run on simulation or production systems depending upon facility status. On production systems, DCS outages will occur nine times throughout performance of the test. These outages are expected to last about 10 minutes each.

  9. Quantifying the movement of multiple insects using an optical insect counter

    USDA-ARS?s Scientific Manuscript database

    An optical insect counter (OIC) was designed and tested. The new system integrated a line-scan camera and a vertical light sheet along with data collection and image processing software to count numbers of flying insects crossing a vertical plane defined by the light sheet. The system also allows ...

  10. The Effectiveness of Cooperative Learning Activities in Enhancing EFL Learners' Fluency

    ERIC Educational Resources Information Center

    Alrayah, Hassan

    2018-01-01

    This research-paper aims at examining the effectiveness of cooperative learning activities in enhancing EFL learners' fluency. The researcher has used the descriptive approach, recorded interviews for testing fluency as tools of data collection and the software program SPSS as a tool for the statistical treatment of data. Research sample consists…

  11. Ionospheric Specifications for SAR Interferometry (ISSI)

    NASA Technical Reports Server (NTRS)

    Pi, Xiaoqing; Chapman, Bruce D; Freeman, Anthony; Szeliga, Walter; Buckley, Sean M.; Rosen, Paul A.; Lavalle, Marco

    2013-01-01

    The ISSI software package is designed to image the ionosphere from space by calibrating and processing polarimetric synthetic aperture radar (PolSAR) data collected from low Earth orbit satellites. Signals transmitted and received by a PolSAR are subject to the Faraday rotation effect as they traverse the magnetized ionosphere. The ISSI algorithms combine the horizontally and vertically polarized (with respect to the radar system) SAR signals to estimate Faraday rotation and ionospheric total electron content (TEC) with spatial resolutions of sub-kilometers to kilometers, and to derive radar system calibration parameters. The ISSI software package has been designed and developed to integrate the algorithms, process PolSAR data, and image as well as visualize the ionospheric measurements. A number of tests have been conducted using ISSI with PolSAR data collected from various latitude regions by the phased-array type L-band synthetic aperture radar (PALSAR) onboard the Japan Aerospace Exploration Agency's Advanced Land Observing Satellite mission, and also with Global Positioning System data. These tests have demonstrated and validated SAR-derived ionospheric images and data correction algorithms.
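
    The Faraday-rotation/TEC link behind the abstract can be made concrete with the standard first-order relation for the one-way rotation, Ω ≈ K·B∥·TEC/f² with K ≈ 2.365×10⁴ in SI units. The sketch below applies it with illustrative PALSAR-like numbers; this is the textbook approximation, not ISSI's estimation algorithm, and the field value B∥ is an assumed figure:

```python
# Hedged sketch of the textbook Faraday-rotation/TEC relation, not the
# ISSI estimation algorithm itself.
# One-way rotation (radians): omega = K * B_par * TEC / f**2  (SI units)
import numpy as np

K = 2.365e4          # SI constant of the standard first-order approximation
f = 1.27e9           # PALSAR L-band carrier, Hz
B_par = 3.0e-5       # line-of-sight geomagnetic field, Tesla (illustrative)

def faraday_deg(tec_tecu):
    tec = tec_tecu * 1e16            # 1 TECU = 1e16 electrons/m^2
    return np.degrees(K * B_par * tec / f**2)

def tec_from_faraday(omega_deg):
    return np.radians(omega_deg) * f**2 / (K * B_par) / 1e16

print(faraday_deg(10.0))             # roughly 2.5 deg at 10 TECU, L-band
print(tec_from_faraday(2.5))         # invert a measured rotation to TECU
```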

  12. Data acquisition and PV module power production in upgraded TEP/AzRISE solar test yard

    NASA Astrophysics Data System (ADS)

    Bennett, Whit E.; Fishgold, Asher D.; Lai, Teh; Potter, Barrett G.; Simmons-Potter, Kelly

    2017-08-01

    The Tucson Electric Power (TEP)/University of Arizona AzRISE (Arizona Research Institute for Solar Energy) solar test yard is continuing efforts to improve standardization and data acquisition reliability throughout the facility. Data reliability is ensured through temperature-insensitive data acquisition devices with battery backups in the upgraded test yard. Software improvements allow for real-time analysis of collected data while uploading to a web server. Sample data illustrate high-fidelity monitoring of the burn-in period of a polycrystalline silicon photovoltaic module test string, with no data failures over 365 days of data collection. In addition to improved DAQ systems, precision temperature monitoring has been implemented so that PV module backside temperatures are routinely obtained. Weather station data acquired at the test yard provide local ambient temperature, humidity, wind speed, and irradiance measurements that have been utilized to enable characterization of PV module performance over an extended test period.

  13. Design and development of data acquisition system based on WeChat hardware

    NASA Astrophysics Data System (ADS)

    Wang, Zhitao; Ding, Lei

    2018-06-01

    A data acquisition system based on WeChat hardware provides a practical and easily popularized approach to data acquisition. The whole system is built on the WeChat hardware platform: the hardware part is developed on the DA14580 development board, and the software part is based on Alibaba Cloud. We designed a service module, a logic processing module, a data processing module and a database module. The communication between hardware and software uses the AirSync protocol. We tested the system by collecting temperature and humidity data, and the results show that the system can acquire temperature and humidity in real time according to its settings.

  14. Porting the Starlink Software Collection to GNU Autotools

    NASA Astrophysics Data System (ADS)

    Gray, N.; Jenness, T.; Allan, A.; Berry, D. S.; Currie, M. J.; Draper, P. W.; Taylor, M. B.; Cavanagh, B.

    2005-12-01

    The Starlink software collection currently runs on three different Unix platforms and contains around 100 separate software items, totaling 2.5 million lines of code in a mixture of languages. We have changed the build system from a hand-maintained collection of makefiles with hard-wired OS variants to a scheme involving feature discovery via GNU Autoconf. As a result of this work, we have already ported the collection to Mac OS X and Cygwin. This had some unexpected benefits and costs, and yielded valuable lessons.

  15. The effect on prospective teachers of the learning environment supported by dynamic statistics software

    NASA Astrophysics Data System (ADS)

    Koparan, Timur

    2016-02-01

    In this study, the effect of a learning environment supported by dynamic statistics software on the achievement and attitudes of prospective teachers is examined. To this end, an achievement test, an attitude scale for statistics, and interviews were used as data collection tools. The achievement test comprises 8 problems based on statistical data, and the attitude scale comprises 13 Likert-type items. The study was carried out in the fall semester of the 2014-2015 academic year at a university in Turkey. The study, which employed the pre-test/post-test control group design of the quasi-experimental research method, was carried out with a group of 80 prospective teachers, 40 in the control group and 40 in the experimental group. Both groups had four hours of classes on descriptive statistics. The classes with the control group were carried out through traditional methods, while dynamic statistics software was used with the experimental group. Five prospective teachers from the experimental group were interviewed clinically after the application for a deeper examination of their views about it. The qualitative data gained are presented under various themes. At the end of the study, it was found that there is a significant difference in favour of the experimental group in terms of achievement and attitudes, and that the prospective teachers take an affirmative approach to the use of dynamic software and see it as an effective tool to enrich maths classes. In accordance with the findings of the study, it is suggested that dynamic software, which offers unique opportunities, be used in classes by teachers and students.

  16. The STARLINK software collection

    NASA Astrophysics Data System (ADS)

    Penny, A. J.; Wallace, P. T.; Sherman, J. C.; Terret, D. L.

    1993-12-01

    A demonstration will be given of some recent Starlink software. STARLINK is: a network of computers used by UK astronomers; a collection of programs for the calibration and analysis of astronomical data; a team of people giving hardware, software and administrative support. The Starlink Project has been in operation since 1980 to provide UK astronomers with interactive image processing and data reduction facilities. There are now Starlink computer systems at 25 UK locations, serving about 1500 registered users. The Starlink software collection now has about 25 major packages covering a wide range of astronomical data reduction and analysis techniques, as well as many smaller programs and utilities. At the core of most of the packages is a common `software environment', which provides many of the functions which applications need and offers standardized methods of structuring and accessing data. The software environment simplifies programming and support, and makes it easy to use different packages for different stages of the data reduction. Users see a consistent style, and can mix applications without hitting problems of differing data formats. The Project group coordinates the writing and distribution of this software collection, which is Unix based. Outside the UK, Starlink is used at a large number of places, which range from installations at major UK telescopes, which are Starlink-compatible and managed like Starlink sites, to individuals who run only small parts of the Starlink software collection.

  17. Application of newly developed Fluoro-QC software for image quality evaluation in cardiac X-ray systems.

    PubMed

    Oliveira, M; Lopez, G; Geambastiani, P; Ubeda, C

    2018-05-01

    A quality assurance (QA) program is a valuable tool for the continuous production of optimal quality images. The aim of this paper is to assess newly developed automatic computer software for image quality (IQ) evaluation in fluoroscopy X-ray systems. Test object images were acquired using one fluoroscopy system, a Siemens Axiom Artis model (Siemens AG, Medical Solutions, Erlangen, Germany). The software was developed as an ImageJ plugin. Two image quality parameters were assessed: high-contrast spatial resolution (HCSR) and signal-to-noise ratio (SNR). The times required for manual and automatic image quality assessment procedures were compared. The paired t-test was used to assess the data. p values of less than 0.05 were considered significant. The Fluoro-QC software generated faster IQ evaluation results (mean = 0.31 ± 0.08 min) than the manual procedure (mean = 4.68 ± 0.09 min). The mean difference between techniques was 4.36 min. Discrepancies were identified in the region of interest (ROI) areas drawn manually, with evidence of user dependence. The new software presented the results of the two tests (HCSR = 3.06, SNR = 5.17) and also collected information from the DICOM header. Significant differences were not identified between manual and automatic measures of SNR (p value = 0.22) and HCSR (p value = 0.46). The Fluoro-QC software is a feasible, fast and free-to-use method for evaluating image quality parameters on fluoroscopy systems. Copyright © 2017 The College of Radiographers. Published by Elsevier Ltd. All rights reserved.
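
    One common ROI-based SNR definition, the mean signal above background divided by the background standard deviation, can be sketched as follows (a hedged illustration; the paper's exact SNR definition and ROI placement may differ):

```python
# Hedged sketch of an ROI-based SNR measurement; the paper's exact
# definition and ROI placement may differ.
import numpy as np

def snr(image, signal_roi, background_roi):
    """ROIs are (row_slice, col_slice) tuples into a 2-D image array."""
    signal = image[signal_roi].mean()
    background = image[background_roi]
    return (signal - background.mean()) / background.std(ddof=1)

rng = np.random.default_rng(3)
img = rng.normal(100, 5, (512, 512))       # synthetic flat-field image
img[200:300, 200:300] += 40                # synthetic high-contrast insert
print(snr(img,
          (np.s_[200:300], np.s_[200:300]),
          (np.s_[10:110], np.s_[10:110])))
```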

  18. Virtual test: A student-centered software to measure student's critical thinking on human disease

    NASA Astrophysics Data System (ADS)

    Rusyati, Lilit; Firman, Harry

    2016-02-01

    The study "Virtual Test: A Student-Centered Software to Measure Student's Critical Thinking on Human Disease" is descriptive research. The background is importance of computer-based test that use element and sub element of critical thinking. Aim of this study is development of multiple choices to measure critical thinking that made by student-centered software. Instruments to collect data are (1) construct validity sheet by expert judge (lecturer and medical doctor) and professional judge (science teacher); and (2) test legibility sheet by science teacher and junior high school student. Participants consisted of science teacher, lecturer, and medical doctor as validator; and the students as respondent. Result of this study are describe about characteristic of virtual test that use to measure student's critical thinking on human disease, analyze result of legibility test by students and science teachers, analyze result of expert judgment by science teachers and medical doctor, and analyze result of trial test of virtual test at junior high school. Generally, result analysis shown characteristic of multiple choices to measure critical thinking was made by eight elements and 26 sub elements that developed by Inch et al.; complete by relevant information; and have validity and reliability more than "enough". Furthermore, specific characteristic of multiple choices to measure critical thinking are information in form science comic, table, figure, article, and video; correct structure of language; add source of citation; and question can guide student to critical thinking logically.

  19. VideoHacking: Automated Tracking and Quantification of Locomotor Behavior with Open Source Software and Off-the-Shelf Video Equipment.

    PubMed

    Conklin, Emily E; Lee, Kathyann L; Schlabach, Sadie A; Woods, Ian G

    2015-01-01

    Differences in nervous system function can result in differences in behavioral output. Measurements of animal locomotion enable the quantification of these differences. Automated tracking of animal movement is less labor-intensive and bias-prone than direct observation, and allows for simultaneous analysis of multiple animals, high spatial and temporal resolution, and data collection over extended periods of time. Here, we present a new video-tracking system built on Python-based software that is free, open source, and cross-platform, and that can analyze video input from widely available video capture devices such as smartphone cameras and webcams. We validated this software through four tests on a variety of animal species, including larval and adult zebrafish (Danio rerio), Siberian dwarf hamsters (Phodopus sungorus), and wild birds. These tests highlight the capacity of our software for long-term data acquisition, parallel analysis of multiple animals, and application to animal species of different sizes and movement patterns. We applied the software to an analysis of the effects of ethanol on thigmotaxis (wall-hugging) behavior on adult zebrafish, and found that acute ethanol treatment decreased thigmotaxis behaviors without affecting overall amounts of motion. The open source nature of our software enables flexibility, customization, and scalability in behavioral analyses. Moreover, our system presents a free alternative to commercial video-tracking systems and is thus broadly applicable to a wide variety of educational settings and research programs.
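
    As a hedged sketch of the general background-subtraction-and-centroid approach such Python trackers take (OpenCV 4 based; not the authors' actual code, and the video filename is invented):

```python
# Hedged sketch of background-subtraction centroid tracking for a single
# animal, in the general style of such tools (not the authors' code).
# Assumes OpenCV 4.x; the video filename is a placeholder.
import cv2

cap = cv2.VideoCapture("fish.mp4")           # any webcam/smartphone video
subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=False)
track = []

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)
    mask = cv2.medianBlur(mask, 5)           # suppress speckle noise
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        largest = max(contours, key=cv2.contourArea)
        m = cv2.moments(largest)
        if m["m00"] > 0:                     # centroid of the largest blob
            track.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))

cap.release()
print(f"{len(track)} tracked positions")
```

    A per-frame position trace like this is also what a thigmotaxis measure reduces to: the fraction of tracked positions falling within a wall margin of the arena.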

  20. A Re-programmable Platform for Dynamic Burn-in Test of Xilinx Virtex-II 3000 FPGA for Military and Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Roosta, Ramin; Wang, Xinchen; Sadigursky, Michael; Tracton, Phil

    2004-01-01

    Field Programmable Gate Arrays (FPGAs) have played increasingly important roles in military and aerospace applications. Xilinx SRAM-based FPGAs have been extensively used in commercial applications. They have been used less frequently in space flight applications due to their susceptibility to single-event upsets. The reliability of these devices in space applications is a concern that has not been addressed. The objective of this project is to design a fully programmable hardware/software platform that allows (but is not limited to) comprehensive static/dynamic burn-in testing of Virtex-II 3000 FPGAs, at-speed testing, and SEU testing. Conventional methods test very few discrete AC parameters (primarily switching) of a given integrated circuit. This approach will test any possible configuration of the FPGA and any associated performance parameters. It allows complete or partial re-programming of the FPGA and verification of the program by using readback followed by dynamic test. Designers have full control over which functional elements of the FPGA to stress. They can completely simulate all possible types of configurations/functions. Another benefit of this platform is that it allows collecting information on the elevation of the junction temperature as a function of gate utilization, operating frequency and functionality. A software tool has been implemented to demonstrate the various features of the system. The software consists of three major parts: the parallel interface driver, the main system procedure and a graphical user interface (GUI).

  1. Data Fusion Tool for Spiral Bevel Gear Condition Indicator Data

    NASA Technical Reports Server (NTRS)

    Dempsey, Paula J.; Antolick, Lance J.; Branning, Jeremy S.; Thomas, Josiah

    2014-01-01

    Tests were performed on two spiral bevel gear sets in the NASA Glenn Spiral Bevel Gear Fatigue Test Rig to simulate the fielded failures of spiral bevel gears installed in a helicopter. Gear sets were tested until damage initiated and progressed on two or more gear or pinion teeth. During testing, gear health monitoring data was collected with two different health monitoring systems. Operational parameters were measured with a third data acquisition system. Tooth damage progression was documented with photographs taken at inspection intervals throughout the test. A software tool was developed for fusing the operational data and the vibration based gear condition indicator (CI) data collected from the two health monitoring systems. Results of this study illustrate the benefits of combining the data from all three systems to indicate progression of damage for spiral bevel gears. The tool also enabled evaluation of the effectiveness of each CI with respect to operational conditions and fault mode.

  2. Java Web Start based software for automated quantitative nuclear analysis of prostate cancer and benign prostate hyperplasia.

    PubMed

    Singh, Swaroop S; Kim, Desok; Mohler, James L

    2005-05-11

    Androgen acts via the androgen receptor (AR), and accurate measurement of the level of AR protein expression is critical for prostate research. The expression of AR in paired specimens of benign prostate and prostate cancer from 20 African and 20 Caucasian Americans was compared to demonstrate an application of this system. A set of 200 immunopositive and 200 immunonegative nuclei was collected from the images using a macro developed in Image Pro Plus. Linear discriminant and logistic regression analyses were performed on the data to generate classification coefficients. Classification coefficients render the automated image analysis software independent of the type of immunostaining or image acquisition system used. The image analysis software performs local segmentation and uses nuclear shape and size to detect prostatic epithelial nuclei. AR expression is described by (a) the percentage of immunopositive nuclei; (b) the percentage of immunopositive nuclear area; and (c) the intensity of AR expression among immunopositive nuclei or areas. The percent positive nuclei and percent nuclear area were similar by race in both benign prostate hyperplasia and prostate cancer. In prostate cancer epithelial nuclei, African Americans exhibited 38% higher levels of AR immunostaining than Caucasian Americans (two-sided Student's t-tests; P < 0.05). The intensity of AR immunostaining was similar between races in benign prostate. The differences measured in the intensity of AR expression in prostate cancer were consistent with previous studies. Classification coefficients are required because of non-standardized immunostaining and image collection methods across medical institutions and research laboratories, and they help customize the software for the specimen under study. The availability of a free, automated system creates new opportunities for testing, evaluation and use of this image analysis system by many research groups who study nuclear protein expression.
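
    Once nuclei are segmented and classified, the three AR measures reduce to simple ratios. A hedged sketch with invented toy values, using a fixed threshold where the paper uses trained classification coefficients:

```python
# Hedged sketch of the three AR expression measures on segmented nuclei.
# A fixed threshold stands in for the paper's trained classification
# coefficients; the per-nucleus values are invented.
import numpy as np

# Per-nucleus mean staining intensity and pixel area (toy data).
intensity = np.array([0.82, 0.15, 0.67, 0.09, 0.91, 0.33])
area      = np.array([140,  120,  160,  110,  150,  130])
POSITIVE_THRESHOLD = 0.5         # stand-in for the learned classifier

positive = intensity > POSITIVE_THRESHOLD
pct_positive_nuclei = 100 * positive.mean()
pct_positive_area   = 100 * area[positive].sum() / area.sum()
mean_pos_intensity  = intensity[positive].mean()
print(pct_positive_nuclei, pct_positive_area, mean_pos_intensity)
```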

  3. Records Inventory Data Collection Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cole, Brian A.

    1995-03-01

    DATALINK was created to provide an easy-to-use data collection program for records management software products. It provides several useful tools for capturing and validating record index data in the field. It also allows users to easily create a comma-delimited ASCII text file for data export into most records management software products.
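
    A comma-delimited ASCII export of the kind described is a few lines in any modern language; a hedged sketch with invented field names (not DATALINK's actual schema):

```python
# Hedged sketch of a comma-delimited ASCII export of record-index data;
# the field names are invented, not DATALINK's actual schema.
import csv

records = [
    {"box": "B-0001", "title": "FY94 invoices", "date_range": "1993-1994"},
    {"box": "B-0002", "title": "Personnel files", "date_range": "1990-1992"},
]

with open("export.txt", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["box", "title", "date_range"])
    writer.writeheader()
    writer.writerows(records)
```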

  4. WISE: Automated support for software project management and measurement. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Ramakrishnan, Sudhakar

    1995-01-01

    One important aspect of software development and IV&V is measurement. Unless a software development effort is measured in some way, it is difficult to judge the effectiveness of current efforts and predict future performance. Collection of metrics and adherence to a process are difficult tasks in a software project. Change activity is a powerful indicator of project status. Automated systems that can handle change requests, issues, and other process documents provide an excellent platform for tracking the status of the project. A World Wide Web based architecture is developed for (a) making metrics collection an implicit part of the software process, (b) providing metric analysis dynamically, (c) supporting automated tools that can complement current practices of in-process improvement, and (d) overcoming geographical barriers. An operational system (WISE) instantiates this architecture, allowing for improvement of the software process in a realistic environment. The tool tracks issues in the software development process, provides informal communication among users with different roles, supports to-do lists (TDL), and helps in software process improvement. WISE minimizes the time devoted to metrics collection and analysis, and captures software change data. Automated tools like WISE focus on understanding and managing the software process. The goal is improvement through measurement.

  5. Earth radiation budget experiment software development

    NASA Technical Reports Server (NTRS)

    Edmonds, W. L.

    1985-01-01

    Computer programming and analysis efforts were carried out in support of the Earth Radiation Budget Experiment (ERBE) at NASA/Langley. The Earth Radiation Budget Experiment is described as well as data acquisition, analysis and modeling support for the testing of ERBE instruments. Also included are descriptions of the programs developed to analyze, format and display data collected during testing of the various ERBE instruments. Listings of the major programs developed under this contract are located in an appendix.

  6. An Internet Protocol-Based Software System for Real-Time, Closed-Loop, Multi-Spacecraft Mission Simulation Applications

    NASA Technical Reports Server (NTRS)

    Davis, George; Cary, Everett; Higinbotham, John; Burns, Richard; Hogie, Keith; Hallahan, Francis

    2003-01-01

    The paper will provide an overview of the web-based distributed simulation software system developed for end-to-end, multi-spacecraft mission design, analysis, and test at the NASA Goddard Space Flight Center (GSFC). This software system was developed for an internal research and development (IR&D) activity at GSFC called the Distributed Space Systems (DSS) Distributed Synthesis Environment (DSE). The long-term goal of the DSS-DSE is to integrate existing GSFC stand-alone test beds, models, and simulation systems to create a "hands on", end-to-end simulation environment for mission design, trade studies and simulations. The short-term goal of the DSE was therefore to develop the system architecture, and then to prototype the core software simulation capability based on a distributed computing approach, with demonstrations of some key capabilities by the end of Fiscal Year 2002 (FY02). To achieve the DSS-DSE IR&D objective, the team adopted a reference model and mission upon which FY02 capabilities were developed. The software was prototyped according to the reference model, and demonstrations were conducted for the reference mission to validate interfaces, concepts, etc. The reference model, illustrated in Fig. 1, included both space and ground elements and addressed functional capabilities such as spacecraft dynamics and control, science data collection, space-to-space and space-to-ground communications, mission operations, science operations, and data processing, archival, and distribution.

  7. A Distributed Simulation Software System for Multi-Spacecraft Missions

    NASA Technical Reports Server (NTRS)

    Burns, Richard; Davis, George; Cary, Everett

    2003-01-01

    The paper will provide an overview of the web-based distributed simulation software system developed for end-to-end, multi-spacecraft mission design, analysis, and test at the NASA Goddard Space Flight Center (GSFC). This software system was developed for an internal research and development (IR&D) activity at GSFC called the Distributed Space Systems (DSS) Distributed Synthesis Environment (DSE). The long-term goal of the DSS-DSE is to integrate existing GSFC stand-alone test beds, models, and simulation systems to create a "hands on", end-to-end simulation environment for mission design, trade studies and simulations. The short-term goal of the DSE was therefore to develop the system architecture, and then to prototype the core software simulation capability based on a distributed computing approach, with demonstrations of some key capabilities by the end of Fiscal Year 2002 (FY02). To achieve the DSS-DSE IR&D objective, the team adopted a reference model and mission upon which FY02 capabilities were developed. The software was prototyped according to the reference model, and demonstrations were conducted for the reference mission to validate interfaces, concepts, etc. The reference model, illustrated in Fig. 1, included both space and ground elements and addressed functional capabilities such as spacecraft dynamics and control, science data collection, space-to-space and space-to-ground communications, mission operations, science operations, and data processing, archival, and distribution.

  8. Testing Scientific Software: A Systematic Literature Review.

    PubMed

    Kanewala, Upulee; Bieman, James M

    2014-10-01

    Scientific software plays an important role in critical decision making, for example making weather predictions based on climate models and computing evidence for research publications. Recently, scientists have had to retract publications due to errors caused by software faults. Systematic testing can identify such faults in code. This study aims to identify specific challenges, proposed solutions, and unsolved problems faced when testing scientific software. We conducted a systematic literature survey to identify and analyze relevant literature. We identified 62 studies that provided relevant information about testing scientific software. We found that challenges faced when testing scientific software fall into two main categories: (1) testing challenges that occur due to characteristics of scientific software, such as oracle problems, and (2) testing challenges that occur due to cultural differences between scientists and the software engineering community, such as viewing the code and the model that it implements as inseparable entities. In addition, we identified methods to potentially overcome these challenges and their limitations. Finally, we describe unsolved challenges and how software engineering researchers and practitioners can help to overcome them. Scientific software presents special challenges for testing. Specifically, cultural differences between scientist developers and software engineers, along with the characteristics of scientific software, make testing more difficult. Existing techniques such as code clone detection can help to improve the testing process. Software engineers should consider the special challenges posed by scientific software, such as oracle problems, when developing testing techniques.
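
    A concrete instance of the oracle problem the review highlights: a scientific routine often has no independent check for a single output, but relations between outputs can still be tested. The sketch below illustrates metamorphic testing, one of the mitigation techniques discussed in this literature, using a trivial numerical integrator as a stand-in for scientific code (the tolerances are illustrative):

    ```python
    import math

    def integrate(f, a, b, n=10000):
        """Composite midpoint rule -- the 'scientific' code under test."""
        h = (b - a) / n
        return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

    # No exact oracle is assumed. Instead, check a metamorphic relation:
    # splitting the interval must not change the result (up to tolerance).
    whole = integrate(math.exp, 0.0, 2.0)
    split = integrate(math.exp, 0.0, 1.0) + integrate(math.exp, 1.0, 2.0)
    assert abs(whole - split) < 1e-6, "metamorphic relation violated"

    # A second relation: integrating the negated function flips the sign.
    neg = integrate(lambda x: -math.exp(x), 0.0, 2.0)
    assert abs(whole + neg) < 1e-12
    print("metamorphic checks passed")
    ```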

  9. From field data collection to earth sciences dissemination: mobile examples in the digital era

    NASA Astrophysics Data System (ADS)

    Giardino, Marco; Ghiraldi, Luca; Palomba, Mauro; Perotti, Luigi

    2015-04-01

    In the framework of the technological and cultural revolution brought about by the massive diffusion of mobile devices such as smartphones and tablets, information management and accessibility are changing, and many software houses and developer communities have produced applications that meet a wide variety of needs. Modern collection, storage, and sharing of data have changed radically, and advances in ICT increasingly involve field-based activities. Progress in this research and its applications depends on three main components: hardware, software, and web systems. Since 2008, the geoSITLab multidisciplinary group (Earth Sciences Department and NatRisk Centre of the University of Torino and the Natural Sciences Museum of the Piemonte Region) has been active in defining and testing methods for collecting, managing, and sharing field information using mobile devices. Key issues include Geomorphological Digital Mapping, Natural Hazards monitoring, Geoheritage assessment, and applications for the teaching of Earth Sciences. An overview of the application studies is offered here, including the use of mobile tools for data collection, the construction of relational databases for inventory activities, and tests of web-mapping tools and mobile apps for data dissemination. The common thread is a standardized digital approach allowing the use of mobile devices in each step of the process, analysed within different projects set up by the research group (Geonathaz, EgeoFieldwork, Progeo Piemonte, GeomediaWeb). The hardware component mainly consists of the available handheld mobile devices (e.g., smartphones, PDAs, and tablets). The software component corresponds to applications for spatial data visualization on mobile devices, such as composite mobile GIS or simple location-based apps. The web component allows the integration of collected data into a geodatabase based on a client-server architecture, where the information can be easily loaded, uploaded, and shared between field staff and the data management team, in order to disseminate collected information to the media or to inform decision makers. Results demonstrated the possibility of recording field observations in a fast and reliable way, using standardized formats that can improve the precision of collected information and lower the possibility of errors and data omission. Dedicated forms have been set up for gathering different thematic data (geologic/geomorphologic, faunal and floristic, path systems, etc.). Field data made it possible to produce maps and spatial data infrastructures (SDI) useful for many purposes: from country planning to disaster risk management, and from Geoheritage management to the dissemination of Earth Science concepts.

  10. DATALINK. Records Inventory Data Collection Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cole, B.

    1995-03-01

    DATALINK was created to provide an easy-to-use data collection program for records management software products. It provides several useful tools for capturing and validating record index data in the field. It also allows users to easily create a comma-delimited ASCII text file for export into most records management software products.

  11. Software quality: Process or people

    NASA Technical Reports Server (NTRS)

    Palmer, Regina; Labaugh, Modenna

    1993-01-01

    This paper will present data related to software development processes and personnel involvement from the perspective of software quality assurance. We examine eight years of data collected from six projects. Data collected varied by project but usually included defect and fault density, with limited use of code metrics, schedule adherence, and budget growth information. The data are a blend of AFSCP 800-14 and suggested productivity measures in Software Metrics: A Practitioner's Guide to Improved Product Development. A software quality assurance database tool, SQUID, was used to store and tabulate the data.

  12. Benchmarking the ATLAS software through the Kit Validation engine

    NASA Astrophysics Data System (ADS)

    De Salvo, Alessandro; Brasolin, Franco

    2010-04-01

    The measurement of the experiment software performance is a very important metric in order to choose the most effective resources to be used and to discover the bottlenecks of the code implementation. In this work we present the benchmark techniques used to measure the ATLAS software performance through the ATLAS offline testing engine Kit Validation and the online portal Global Kit Validation. The performance measurements, the data collection, the online analysis and display of the results will be presented. The results of the measurement on different platforms and architectures will be shown, giving a full report on the CPU power and memory consumption of the Monte Carlo generation, simulation, digitization and reconstruction of the most CPU-intensive channels. The impact of multi-core computing on the ATLAS software performance will also be presented, comparing the behavior of different architectures when increasing the number of concurrent processes. The benchmark techniques described in this paper have been used in the HEPiX group since the beginning of 2008 to help define the performance metrics for High Energy Physics applications, based on the real experiment software.

  13. Building Large Collections of Chinese and English Medical Terms from Semi-Structured and Encyclopedia Websites

    PubMed Central

    Xu, Yan; Wang, Yining; Sun, Jian-Tao; Zhang, Jianwen; Tsujii, Junichi; Chang, Eric

    2013-01-01

    To build large collections of medical terms from semi-structured information sources (e.g. tables, lists, etc.) and encyclopedia sites on the web. The terms are classified into the three semantic categories, Medical Problems, Medications, and Medical Tests, which were used in i2b2 challenge tasks. We developed two systems, one for Chinese and another for English terms. The two systems share the same methodology and use the same software with minimum language dependent parts. We produced large collections of terms by exploiting billions of semi-structured information sources and encyclopedia sites on the Web. The standard performance metric of recall (R) is extended to three different types of Recall to take the surface variability of terms into consideration. They are Surface Recall (R(S)), Object Recall (R(O)), and Surface Head recall (R(H)). We use two test sets for Chinese. For English, we use a collection of terms in the 2010 i2b2 text. Two collections of terms, one for English and the other for Chinese, have been created. The terms in these collections are classified as one of Medical Problems, Medications, or Medical Tests in the i2b2 challenge tasks. The English collection contains 49,249 (Problems), 89,591 (Medications) and 25,107 (Tests) terms, while the Chinese one contains 66,780 (Problems), 101,025 (Medications), and 15,032 (Tests) terms. The proposed method of constructing a large collection of medical terms is both efficient and effective, and, most of all, independent of language. The collections will be made publicly available. PMID:23874426

  14. Building large collections of Chinese and English medical terms from semi-structured and encyclopedia websites.

    PubMed

    Xu, Yan; Wang, Yining; Sun, Jian-Tao; Zhang, Jianwen; Tsujii, Junichi; Chang, Eric

    2013-01-01

    To build large collections of medical terms from semi-structured information sources (e.g. tables, lists, etc.) and encyclopedia sites on the web. The terms are classified into the three semantic categories, Medical Problems, Medications, and Medical Tests, which were used in i2b2 challenge tasks. We developed two systems, one for Chinese and another for English terms. The two systems share the same methodology and use the same software with minimum language dependent parts. We produced large collections of terms by exploiting billions of semi-structured information sources and encyclopedia sites on the Web. The standard performance metric of recall (R) is extended to three different types of Recall to take the surface variability of terms into consideration. They are Surface Recall (R(S)), Object Recall (R(O)), and Surface Head recall (R(H)). We use two test sets for Chinese. For English, we use a collection of terms in the 2010 i2b2 text. Two collections of terms, one for English and the other for Chinese, have been created. The terms in these collections are classified as one of Medical Problems, Medications, or Medical Tests in the i2b2 challenge tasks. The English collection contains 49,249 (Problems), 89,591 (Medications) and 25,107 (Tests) terms, while the Chinese one contains 66,780 (Problems), 101,025 (Medications), and 15,032 (Tests) terms. The proposed method of constructing a large collection of medical terms is both efficient and effective, and, most of all, independent of language. The collections will be made publicly available.
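
    The three recall variants differ only in what counts as a hit. A minimal sketch under assumed definitions (exact string match for R(S); credit for any surface variant of a gold concept for R(O); match on the head word for R(H)) -- the paper's precise definitions may differ:

    ```python
    def surface_recall(gold_terms, extracted):
        """R(S): a gold term counts only if extracted verbatim."""
        ext = set(extracted)
        return sum(t in ext for t in gold_terms) / len(gold_terms)

    def object_recall(gold_objects, extracted):
        """R(O): gold_objects is a list of sets of surface variants,
        one set per concept; any variant found scores the concept."""
        ext = set(extracted)
        return sum(bool(v & ext) for v in gold_objects) / len(gold_objects)

    def head_recall(gold_terms, extracted):
        """R(H): credit a gold term if its head word (assumed to be the
        final token, as in English noun phrases) appears as a head."""
        heads = {t.split()[-1] for t in extracted}
        return sum(t.split()[-1] in heads for t in gold_terms) / len(gold_terms)

    gold = ["myocardial infarction", "heart attack"]
    objects = [{"myocardial infarction", "heart attack"}]  # one concept
    out = ["acute myocardial infarction", "myocardial infarction"]
    print(surface_recall(gold, out),    # 0.5
          object_recall(objects, out),  # 1.0
          head_recall(gold, out))       # 0.5
    ```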

  15. A usability evaluation of medical software at an expert conference setting.

    PubMed

    Bond, Raymond Robert; Finlay, Dewar D; Nugent, Chris D; Moore, George; Guldenring, Daniel

    2014-01-01

    A usability test was employed to evaluate two medical software applications at an expert conference setting. One software application is a medical diagnostic tool (electrocardiogram [ECG] viewer) and the other is a medical research tool (electrode misplacement simulator [EMS]). These novel applications have yet to be adopted by the healthcare domain; thus, (1) we wanted to determine the potential user acceptance of these applications and (2) we wanted to determine the feasibility of evaluating medical diagnostic and medical research software at a conference setting as opposed to the conventional laboratory setting. The medical diagnostic tool (ECG viewer) was evaluated using seven delegates and the medical research tool (EMS) was evaluated using 17 delegates recruited at the 2010 International Conference on Computing in Cardiology. Each delegate/participant was required to use the software and undertake a set of predefined tasks during the session breaks at the conference. User interactions with the software were recorded using screen-recording software. The 'think-aloud' protocol was also used to elicit verbal feedback from the participants whilst they attempted the predefined tasks. Before and after each session, participants completed a pre-test and a post-test questionnaire respectively. The average duration of a usability session at the conference was 34.69 min (SD=10.28). However, taking into account that 10 min was dedicated to the pre-test and post-test questionnaires, the average time dedicated to user interaction with the medical software was 24.69 min (SD=10.28). Given that we have shown usability data can be collected at conferences, this paper details the advantages of conference-based usability studies over the laboratory-based approach. For example, given that delegates gather at one geographical location, a conference-based usability evaluation facilitates recruitment of a convenient sample of international subject experts, which would otherwise be very expensive to arrange. A conference-based approach also allows data to be collected over a few days as opposed to months, by avoiding the administrative duties normally involved in a laboratory-based approach, e.g. mailing invitation letters as part of a recruitment campaign. Following analysis of the user video recordings, 41 (previously unknown) use errors were identified in the advanced ECG viewer and 29 were identified in the EMS application. All use errors were given a consensus severity rating by two independent usability experts. On a rating scale of 4 (where 1=cosmetic and 4=critical), the average severity rating was 2.24 (SD=1.09) for the ECG viewer and 2.34 (SD=0.97) for the EMS application. We were also able to extract task completion rates and times from the video recordings to determine the effectiveness of the software applications. For example, six out of seven tasks were completed by all participants when using both applications. This statistic alone suggests both applications already have a high degree of usability. As well as extracting data from the video recordings, we were also able to extract data from the questionnaires. Using a semantic differential scale (where 1=poor and 5=excellent), delegates highly rated the 'responsiveness', 'usefulness', 'learnability' and the 'look and feel' of both applications. This study has shown the potential user acceptance and user-friendliness of the novel EMS and ECG viewer applications within the healthcare domain. It has also shown that both medical diagnostic software and medical research software can be evaluated for their usability at an expert conference setting. The primary advantage of a conference-based usability evaluation over a laboratory-based evaluation is the high concentration of experts at one location, which is convenient, less time consuming, and less expensive.

  16. Testing Scientific Software: A Systematic Literature Review

    PubMed Central

    Kanewala, Upulee; Bieman, James M.

    2014-01-01

    Context Scientific software plays an important role in critical decision making, for example making weather predictions based on climate models and computing evidence for research publications. Recently, scientists have had to retract publications due to errors caused by software faults. Systematic testing can identify such faults in code. Objective This study aims to identify specific challenges, proposed solutions, and unsolved problems faced when testing scientific software. Method We conducted a systematic literature survey to identify and analyze relevant literature. We identified 62 studies that provided relevant information about testing scientific software. Results We found that challenges faced when testing scientific software fall into two main categories: (1) testing challenges that occur due to characteristics of scientific software such as oracle problems and (2) testing challenges that occur due to cultural differences between scientists and the software engineering community such as viewing the code and the model that it implements as inseparable entities. In addition, we identified methods to potentially overcome these challenges and their limitations. Finally we describe unsolved challenges and how software engineering researchers and practitioners can help to overcome them. Conclusions Scientific software presents special challenges for testing. Specifically, cultural differences between scientist developers and software engineers, along with the characteristics of the scientific software make testing more difficult. Existing techniques such as code clone detection can help to improve the testing process. Software engineers should consider special challenges posed by scientific software such as oracle problems when developing testing techniques. PMID:25125798

  17. Investigation of periodically driven systems by x-ray absorption spectroscopy using asynchronous data collection mode

    NASA Astrophysics Data System (ADS)

    Singh, H.; Donetsky, D.; Liu, J.; Attenkofer, K.; Cheng, B.; Trelewicz, J. R.; Lubomirsky, I.; Stavitski, E.; Frenkel, A. I.

    2018-04-01

    We report the development, testing, and demonstration of a setup for modulation excitation spectroscopy experiments at the Inner Shell Spectroscopy beamline of the National Synchrotron Light Source II. A computer algorithm and dedicated software were developed for asynchronous data processing and analysis. We demonstrate the reconstruction of X-ray absorption spectra for different time points within the modulation pulse using a model system. This setup and the software are intended for a broad range of functional materials which exhibit structural and/or electronic responses to external stimulation, such as catalysts, energy and battery materials, and electromechanical devices.
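
    The heart of an asynchronous modulation-excitation analysis is re-binning continuously acquired spectra by their phase within the stimulation period, then averaging each bin. A minimal sketch of that reconstruction step (the array layout and averaging scheme are plausible assumptions, not the beamline's actual algorithm):

    ```python
    import numpy as np

    def rebin_by_phase(timestamps, spectra, period, n_bins):
        """Assign each asynchronously collected spectrum to a time bin
        within the modulation period and average each bin."""
        phase = np.mod(timestamps, period) / period        # in [0, 1)
        bins = np.floor(phase * n_bins).astype(int)
        out = np.zeros((n_bins, spectra.shape[1]))
        for b in range(n_bins):
            sel = spectra[bins == b]
            if len(sel):
                out[b] = sel.mean(axis=0)
        return out

    # Toy data: 1000 'spectra' of 50 energy points, 5 s modulation period.
    rng = np.random.default_rng(0)
    t = np.sort(rng.uniform(0, 200, 1000))
    s = rng.normal(size=(1000, 50))
    avg = rebin_by_phase(t, s, period=5.0, n_bins=10)
    print(avg.shape)  # (10, 50)
    ```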

  18. Assessing the Potential of Low-Cost 3D Cameras for the Rapid Measurement of Plant Woody Structure

    PubMed Central

    Nock, Charles A; Taugourdeau, Olivier; Delagrange, Sylvain; Messier, Christian

    2013-01-01

    Detailed 3D plant architectural data have numerous applications in plant science, but many existing approaches for 3D data collection are time-consuming and/or require costly equipment. Recently, there has been rapid growth in the availability of low-cost 3D cameras and related open source software applications. 3D cameras may provide measurements of key components of plant architecture such as stem diameters and lengths; however, few tests of 3D cameras for the measurement of plant architecture have been conducted. Here, we measured Salix branch segments ranging from 2 to 13 mm in diameter with an Asus Xtion camera to quantify the limits and accuracy of branch diameter measurement with a 3D camera. By scanning at a variety of distances we also quantified the effect of scanning distance. In addition, we tested the ability of KinFu, a program for continuous 3D object scanning and modeling, and other similar software to accurately record stem diameters and capture plant form (<3 m in height). Given its ability to accurately capture the diameter of branches >6 mm, the Asus Xtion may provide a novel method for the collection of 3D data on the branching architecture of woody plants. Improvements in camera measurement accuracy and available software are likely to further improve the utility of 3D cameras for the plant sciences in the future. PMID:24287538
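
    Per cross-section, estimating a stem diameter from 3D-camera output reduces to fitting a circle to the points in a thin slice of the point cloud. A minimal sketch using the algebraic (Kåsa) least-squares circle fit -- an illustration of the measurement principle, not the actual workflow of KinFu or the paper:

    ```python
    import numpy as np

    def fit_circle(xy):
        """Algebraic least-squares circle fit: solve for center (a, b) and
        radius r from x^2 + y^2 = 2ax + 2by + c, with c = r^2 - a^2 - b^2."""
        x, y = xy[:, 0], xy[:, 1]
        A = np.column_stack([2 * x, 2 * y, np.ones(len(x))])
        rhs = x**2 + y**2
        (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
        return (a, b), np.sqrt(c + a**2 + b**2)

    # Toy slice: noisy points on a branch cross-section of 4 mm radius.
    rng = np.random.default_rng(1)
    theta = rng.uniform(0, 2 * np.pi, 200)
    pts = np.column_stack([4.0 * np.cos(theta), 4.0 * np.sin(theta)])
    pts += rng.normal(scale=0.1, size=pts.shape)   # sensor depth noise
    center, radius = fit_circle(pts)
    print(f"diameter ~ {2 * radius:.2f} mm")
    ```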

  19. Development of a beam test telescope based on the Alibava readout system

    NASA Astrophysics Data System (ADS)

    Marco-Hernández, R.

    2011-01-01

    A telescope for beam tests has been developed as a result of a collaboration among the University of Liverpool, Centro Nacional de Microelectrónica (CNM) of Barcelona, and Instituto de Física Corpuscular (IFIC) of Valencia. This system is intended to carry out both analogue charge collection and spatial resolution measurements with different types of microstrip or pixel silicon detectors in a beam test environment. The telescope has four XY measurement and trigger planes (XYT boards) and can accommodate up to twelve devices under test (DUT boards). The DUT board uses two Beetle ASICs for the readout of chilled silicon detectors. The board can operate in a self-triggering mode. The board features a temperature sensor and can be mounted on a rotary stage. A Peltier element is used for cooling the DUT. Each XYT board measures the track space points using two silicon strip detectors connected to two Beetle ASICs. It can also trigger on the particle tracks in the beam test. The board includes a CPLD which allows the trigger signal to be synchronized to a common clock frequency, delayed, and put in coincidence with other XYT boards. An Alibava mother board is used to read out and control each XYT/DUT board from a common trigger signal and a common clock signal. The Alibava board has an on-board TDC to time-stamp each trigger. The data collected by each Alibava board are sent to a master card by means of a local data/address bus following a custom digital protocol. The master board distributes the trigger, clock, and reset signals. It also merges the data streams from up to sixteen Alibava boards. The master board also has a test channel for testing an XYT or DUT board in standard mode. This board is implemented with a Xilinx development board and a custom patch board. The master board is connected to the DAQ software via 100M Ethernet. Track-based alignment software has also been developed for the data obtained with the DAQ software.

  20. Evolution of Software-Only-Simulation at NASA IV and V

    NASA Technical Reports Server (NTRS)

    McCarty, Justin; Morris, Justin; Zemerick, Scott

    2014-01-01

    Software-only simulations have been an emerging but quickly developing field of study throughout NASA. The NASA Independent Verification & Validation (IV&V) Independent Test Capability (ITC) team has been rapidly building a collection of simulators for a wide range of NASA missions. ITC specializes in full end-to-end simulations that enable developers, V&V personnel, and operators to test-as-you-fly. In four years, the team has delivered a wide variety of spacecraft simulations, ranging from lower-complexity science missions such as the Global Precipitation Measurement (GPM) satellite and the Deep Space Climate Observatory (DSCOVR) to extremely complex missions such as the James Webb Space Telescope (JWST) and the Space Launch System (SLS). This paper describes the evolution of ITC's technologies and processes that have been utilized to design, implement, and deploy end-to-end simulation environments for various NASA missions. A comparison of mission simulators is presented, with focus on technology and lessons learned in complexity, hardware modeling, and continuous integration. The paper also describes the methods for executing the missions' unmodified flight software binaries (not cross-compiled) for verification and validation activities.

  1. Direct-Solve Image-Based Wavefront Sensing

    NASA Technical Reports Server (NTRS)

    Lyon, Richard G.

    2009-01-01

    A method of wavefront sensing (more precisely characterized as a method of determining the deviation of a wavefront from a nominal figure) has been invented as an improved means of assessing the performance of an optical system as affected by such imperfections as misalignments, design errors, and fabrication errors. The method is implemented by software running on a single-processor computer that is connected, via a suitable interface, to the image sensor (typically, a charge-coupled device) in the system under test. The software collects a single digitized image from the image sensor. The image is displayed on a computer monitor. The software directly solves for the wavefront in a fraction of a second, and a picture of the wavefront is displayed. The solution process involves, among other things, fast Fourier transforms. The wavefront is reportedly decomposed into modes of the optical system under test, but it has not been reported whether this decomposition is a postprocessing step or part of the solution process.
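
    Image-based wavefront sensing rests on a Fourier-optics forward model: the measured image is the squared magnitude of the Fourier transform of the pupil function carrying the wavefront error. The abstract gives too little detail to reproduce the direct solve itself, but a minimal sketch of the forward model such methods invert is:

    ```python
    import numpy as np

    def psf_from_wavefront(pupil, wavefront_waves):
        """Point-spread function from a pupil mask and a wavefront map in
        units of waves: PSF = |FFT(pupil * exp(2*pi*i*W))|^2."""
        field = pupil * np.exp(2j * np.pi * wavefront_waves)
        return np.abs(np.fft.fftshift(np.fft.fft2(field)))**2

    # Toy circular pupil with a small defocus-like quadratic error.
    n = 128
    y, x = np.mgrid[-1:1:n*1j, -1:1:n*1j]
    r2 = x**2 + y**2
    pupil = (r2 <= 1.0).astype(float)
    psf = psf_from_wavefront(pupil, 0.25 * r2)  # 0.25 waves at the pupil edge
    print(psf.shape, psf.max())
    ```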

  2. Development and Flight Testing of an Adaptive Vehicle Health-Monitoring Architecture

    NASA Technical Reports Server (NTRS)

    Woodard, Stanley E.; Coffey, Neil C.; Gonzalez, Guillermo A.; Taylor, B. Douglas; Brett, Rube R.; Woodman, Keith L.; Weathered, Brenton W.; Rollins, Courtney H.

    2002-01-01

    Ongoing development and testing of an adaptable vehicle health-monitoring architecture is presented. The architecture is being developed for a fleet of vehicles. It has three operational levels: one or more remote data acquisition units located throughout the vehicle; a command and control unit located within the vehicle; and a terminal collection unit to collect analysis results from all vehicles. Each level is capable of performing autonomous analysis with a trained expert system. The expert system is parameterized, which makes it adaptable and allows it to be trained on both a user's subjective reasoning and existing quantitative analytic tools. Communication between all levels is done with wireless radio-frequency interfaces. The remote data acquisition unit has an eight-channel programmable digital interface that gives the user discretion in choosing the type of sensors, the number of sensors, and the sampling rate and sampling duration for each sensor. The architecture provides a framework for a tributary analysis. All measurements at the lowest operational level are reduced to the analysis results necessary to gauge changes from established baselines. These are then collected at the next level to identify any global trends or common features from the prior level. This process is repeated until the results are reduced at the highest operational level. In this framework, only analysis results are forwarded to the next level, to reduce telemetry congestion. The system's remote data acquisition hardware and non-analysis software have been flight tested on the main landing gear of the NASA Langley B757. The flight tests were performed to validate the wireless radio-frequency communication capabilities of the system; the hardware design; command and control; software operation; and data acquisition, storage, and retrieval.
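
    The tributary analysis described here is a hierarchical reduction: each level forwards only analysis results, never raw measurements. A minimal sketch of that data flow (the reduction function and fields are placeholders, not the flight system's actual algorithms):

    ```python
    def reduce_level(results):
        """Summarize one level's inputs; only this summary moves up the
        hierarchy, which is how the architecture limits telemetry."""
        deviations = [r["deviation"] for r in results]
        return {"deviation": max(deviations),             # worst case seen
                "n_alarms": sum(d > 1.0 for d in deviations)}

    # Level 1: remote data acquisition units compare sensors to baselines.
    rdau_results = [{"deviation": 0.2}, {"deviation": 1.4}, {"deviation": 0.7}]
    # Level 2: the vehicle's command and control unit reduces RDAU output.
    vehicle_summary = reduce_level(rdau_results)
    # Level 3: the terminal collection unit reduces across the fleet.
    fleet_summary = reduce_level([vehicle_summary])
    print(vehicle_summary, fleet_summary)
    ```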

  3. Medical Data Architecture Project Status

    NASA Technical Reports Server (NTRS)

    Krihak, M.; Middour, C.; Lindsey, A.; Marker, N.; Wolfe, S.; Winther, S.; Ronzano, K.; Bolles, D.; Toscano, W.; Shaw, T.

    2017-01-01

    The Medical Data Architecture (MDA) project supports the Exploration Medical Capability (ExMC) effort to reduce the risk of adverse health outcomes and decrements in performance due to limitations of in-flight medical capabilities on human exploration missions. To mitigate this risk, the ExMC MDA project addresses the technical limitations identified in ExMC Gap Med 07: We do not have the capability to comprehensively process medically-relevant information to support medical operations during exploration missions. This gap identifies that current International Space Station (ISS) medical data management includes a combination of data collection and distribution methods that are minimally integrated with on-board medical devices and systems. Furthermore, there is a variety of data sources and methods of data collection. For an exploration mission, seamless management of such data will enable a more autonomous crew than the current ISS paradigm allows. The MDA project will develop capabilities that support automated data collection and address the functionality needed, and the challenges involved, in executing a self-contained medical system that approaches crew health care delivery without assistance from ground support. To attain this goal, the first year of the MDA project focused on reducing technical risk, developing documentation, and instituting iterative development processes that established the basis for the first version of the MDA software (Test Bed 1). Test Bed 1 is based on a nominal operations scenario authored by the ExMC Element Scientist. This narrative was decomposed into a Concept of Operations that formed the basis for Test Bed 1 requirements. These requirements were successfully vetted through the MDA Test Bed 1 System Requirements Review, which permitted the MDA project to begin software code development and component integration. This paper highlights the MDA objectives, development processes, and accomplishments, and identifies the fiscal year 2017 milestones and deliverables for the upcoming year.

  4. First use of LHC Run 3 Conditions Database infrastructure for auxiliary data files in ATLAS

    NASA Astrophysics Data System (ADS)

    Aperio Bella, L.; Barberis, D.; Buttinger, W.; Formica, A.; Gallas, E. J.; Rinaldi, L.; Rybkin, G.; ATLAS Collaboration

    2017-10-01

    Processing of the large amount of data produced by the ATLAS experiment requires fast and reliable access to what we call Auxiliary Data Files (ADF). These files, produced by the Combined Performance, Trigger, and Physics groups, contain conditions, calibrations, and other derived data used by the ATLAS software. In ATLAS, these data have thus far, for historical reasons, been collected and accessed outside the ATLAS Conditions Database infrastructure and related software. For this reason, along with the fact that ADF are effectively read by the software as binary objects, this class of data appears ideal for testing the proposed Run 3 conditions data infrastructure now in development. This paper describes this implementation as well as the lessons learned in exploring and refining the new infrastructure, with the potential for deployment during Run 2.

  5. View_SPECPR: Software for Plotting Spectra (Installation Manual and User's Guide, Version 1.2)

    USGS Publications Warehouse

    Kokaly, Raymond F.

    2008-01-01

    This document describes procedures for installing and using the 'View_SPECPR' software system to plot spectra stored in SPECPR (SPECtrum Processing Routines) files. The View_SPECPR software comprises programs written in IDL (Interactive Data Language) that run within the ENVI (ENvironment for Visualizing Images) image processing system. SPECPR files are used by earth-remote-sensing scientists and planetary scientists for storing spectra collected by laboratory, field, and remote sensing instruments. A widely distributed SPECPR file is the U.S. Geological Survey (USGS) spectral library that contains thousands of spectra of minerals, vegetation, and man-made materials (Clark and others, 2007). SPECPR files contain reflectance data and associated wavelength and spectral resolution data, as well as metadata on the time and date of collection and spectrometer settings. Furthermore, the SPECPR file automatically tracks changes to data records through its 'history' fields. For more details on the format and content of SPECPR files, see Clark (1993). For more details on ENVI, see ITT (2008). This program has been updated using an ENVI 4.5/IDL 7.0 full license operating on a Windows XP operating system and requires the installation of the iTools components of IDL 7.0; however, this program should work with full licenses on UNIX/LINUX systems. This software has not been tested with ENVI licenses on Windows Vista or Apple operating systems.

  6. Parallelization of Rocket Engine System Software (Press)

    NASA Technical Reports Server (NTRS)

    Cezzar, Ruknet

    1996-01-01

    The main goal is to assess parallelization requirements for the Rocket Engine Numeric Simulator (RENS) project, which, aside from gathering information on liquid-propelled rocket engines and setting forth requirements, involves a large FORTRAN-based package at NASA Lewis Research Center and TDK software developed by SUBR/UWF. The ultimate aim is to develop, test, integrate, and suitably deploy a family of software packages covering various aspects and facets of liquid-propellant rocket engines. At present, all project efforts by the funding agency, NASA Lewis Research Center, and the HBCU participants are disseminated over the Internet using World Wide Web home pages. Considering the obvious expense of actual field trials, the benefits of software simulators are potentially enormous. When realized, these benefits will be analogous to those provided by numerous CAD/CAM packages and flight-training simulators. According to the overall task assignments, Hampton University's role is to collect all available software, place it in a common format, assess and evaluate it, define interfaces, and provide integration. Most importantly, HU's mission is to see to it that real-time performance is assured. This involves source code translation, porting, and distribution. The porting will be done in two phases: first, all software will be placed on the Cray X-MP platform using FORTRAN. After testing and evaluation on the Cray X-MP, the code will be translated to C++ and ported to the parallel nCUBE platform. At present, we are evaluating another option: distributed processing over local area networks using Sun NFS, Ethernet, and TCP/IP. Considering the heterogeneous nature of the present software (e.g., it first started as an expert system using LISP machines and now involves FORTRAN code), the effort is expected to be quite challenging.

  7. A Framework of the Use of Information in Software Testing

    ERIC Educational Resources Information Center

    Kaveh, Payman

    2010-01-01

    With the increasing role that software systems play in our daily lives, software quality has become extremely important. Software quality is impacted by the efficiency of the software testing process. There are a growing number of software testing methodologies, models, and initiatives to satisfy the need to improve software quality. The main…

  8. Communal Resources in Open Source Software Development

    ERIC Educational Resources Information Center

    Spaeth, Sebastian; Haefliger, Stefan; von Krogh, Georg; Renzl, Birgit

    2008-01-01

    Introduction: Virtual communities play an important role in innovation. The paper focuses on the particular form of collective action in virtual communities that underlies Open Source software development projects. Method: Building on resource mobilization theory and private-collective innovation, we propose a theory of collective action in…

  9. Age estimation using exfoliative cytology and radiovisiography: A comparative study

    PubMed Central

    Nallamala, Shilpa; Guttikonda, Venkateswara Rao; Manchikatla, Praveen Kumar; Taneeru, Sravya

    2017-01-01

    Introduction: Age estimation is one of the essential factors in establishing the identity of an individual. Among various methods, exfoliative cytology (EC) is a unique, noninvasive technique involving simple and pain-free collection of intact cells from the oral cavity for microscopic examination. Objective: The study was undertaken with an aim to estimate the age of an individual from the average cell size of their buccal smears, calculated using image analysis morphometric software, and from the pulp–tooth area ratio in the mandibular canine of the same individual using radiovisiography (RVG). Materials and Methods: Buccal smears were collected from 100 apparently healthy individuals. After fixation in 95% alcohol, the smears were stained using Papanicolaou stain. The average cell size was measured using image analysis software (Image-Pro Insight 8.0). The RVG images of mandibular canines were obtained, pulp and tooth areas were traced using AutoCAD 2010 software, and the area ratio was calculated. The estimated age was then calculated using regression analysis. Results: The paired t-test between chronological age and estimated age by cell size and pulp–tooth area ratio was statistically nonsignificant (P > 0.05). Conclusion: In the present study, age estimated by pulp–tooth area ratio and EC yielded good results. PMID:29657491

  10. Age estimation using exfoliative cytology and radiovisiography: A comparative study.

    PubMed

    Nallamala, Shilpa; Guttikonda, Venkateswara Rao; Manchikatla, Praveen Kumar; Taneeru, Sravya

    2017-01-01

    Age estimation is one of the essential factors in establishing the identity of an individual. Among various methods, exfoliative cytology (EC) is a unique, noninvasive technique involving simple and pain-free collection of intact cells from the oral cavity for microscopic examination. The study was undertaken with an aim to estimate the age of an individual from the average cell size of their buccal smears, calculated using image analysis morphometric software, and from the pulp-tooth area ratio in the mandibular canine of the same individual using radiovisiography (RVG). Buccal smears were collected from 100 apparently healthy individuals. After fixation in 95% alcohol, the smears were stained using Papanicolaou stain. The average cell size was measured using image analysis software (Image-Pro Insight 8.0). The RVG images of mandibular canines were obtained, pulp and tooth areas were traced using AutoCAD 2010 software, and the area ratio was calculated. The estimated age was then calculated using regression analysis. The paired t-test between chronological age and estimated age by cell size and pulp-tooth area ratio was statistically nonsignificant (P > 0.05). In the present study, age estimated by pulp-tooth area ratio and EC yielded good results.

  11. Hardware in-the-Loop Demonstration of Real-Time Orbit Determination in High Earth Orbits

    NASA Technical Reports Server (NTRS)

    Moreau, Michael; Naasz, Bo; Leitner, Jesse; Carpenter, J. Russell; Gaylor, Dave

    2005-01-01

    This paper presents results from a study conducted at Goddard Space Flight Center (GSFC) to assess the real-time orbit determination accuracy of GPS-based navigation in a number of different high Earth orbital regimes. Measurements collected from a GPS receiver (connected to a GPS radio frequency (RF) signal simulator) were processed in a navigation filter in real time, and the resulting errors in the estimated states were assessed. For the most challenging orbit simulated, a 12-hour Molniya orbit with an apogee of approximately 39,000 km, mean total position and velocity errors were approximately 7 meters and 3 mm/s, respectively. The study also makes direct comparisons between the results from the above hardware-in-the-loop tests and results obtained by processing GPS measurements generated from software simulations. Care was taken to use the same models and assumptions in the generation of both the real-time and software-simulated measurements, so that the real-time data could be used to help validate the assumptions and models used in the software simulations. The study makes use of the unique capabilities of the Formation Flying Test Bed at GSFC, which provides a capability to interface with different GPS receivers and to produce real-time, filtered orbit solutions even when fewer than four satellites are visible. The result is a powerful tool for assessing onboard navigation performance in a wide range of orbital regimes, and a test bed for developing software and procedures for use in real spacecraft applications.

  12. Modeling Antimicrobial Activity of Clorox(R) Using an Agar-Diffusion Test: A New Twist On an Old Experiment.

    ERIC Educational Resources Information Center

    Mitchell, James K.; Carter, William E.

    2000-01-01

    Describes using a computer statistical software package called Minitab to model the sensitivity of several microbes to the disinfectant NaOCl (Clorox) using the Kirby-Bauer technique. Each group of students collects data from one microbe, conducts regression analyses, then chooses the best-fit model based on the highest r-values obtained.…
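
    The analysis step in this exercise -- fitting candidate regression models to zone-of-inhibition data and keeping the one with the highest r value -- is easy to mirror outside Minitab. A minimal sketch with numpy (the concentrations and zone diameters are fabricated for illustration):

    ```python
    import numpy as np

    # Fabricated Kirby-Bauer data: NaOCl concentration (%) vs. zone (mm).
    conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
    zone = np.array([6.1, 8.9, 11.8, 14.7, 17.9])

    def r_squared(y, yhat):
        ss_res = np.sum((y - yhat) ** 2)
        ss_tot = np.sum((y - y.mean()) ** 2)
        return 1 - ss_res / ss_tot

    # Candidate models: zone linear in concentration vs. in log2(concentration).
    models = {"linear": conc, "log2": np.log2(conc)}
    for name, xvar in models.items():
        slope, intercept = np.polyfit(xvar, zone, 1)
        print(f"{name}: r^2 = {r_squared(zone, slope * xvar + intercept):.4f}")
    # Keep whichever model yields the highest r^2, as in the exercise.
    ```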

  13. Data collection and evaluation for experimental computer science research

    NASA Technical Reports Server (NTRS)

    Zelkowitz, Marvin V.

    1983-01-01

    The Software Engineering Laboratory has been monitoring software development at NASA Goddard Space Flight Center since 1976. The data collection activities of the Laboratory and some of the difficulties of obtaining reliable data are described. In addition, the application of this data collection process to a current prototyping experiment is reviewed.

  14. Acquiring data in real time in Italy from the Antarctic Seismographic Argentinean Italian Network (ASAIN): testing the global capabilities of the EarthWorm and Antelope software suites.

    NASA Astrophysics Data System (ADS)

    Percy Plasencia Linares, Milton; Russi, Marino; Pesaresi, Damiano; Cravos, Claudio

    2010-05-01

    The Italian National Institute for Oceanography and Experimental Geophysics (Istituto Nazionale di Oceanografia e di Geofisica Sperimentale, OGS) runs the Antarctic Seismographic Argentinean Italian Network (ASAIN), made up of seven seismic stations located in the Scotia Sea region of Antarctica and in Tierra del Fuego, Argentina: data from these stations are transferred in real time to the OGS headquarters in Trieste (Italy) via satellite links provided by the Instituto Antártico Argentino (IAA). Data are collected and archived primarily in Güralp Compressed Format (GCF) through the Scream! software at OGS and IAA, and transmitted in real time to the Observatories and Research Facilities for European Seismology (ORFEUS). The main real-time seismic data acquisition and processing system of the ASAIN network is based on the EarthWorm 7.3 (open source) software suite installed on a Linux server at the OGS headquarters in Trieste. It runs several software modules for data collection, data archiving, and data publication on dedicated web servers (wave_serverV, Winston Wave Server), with data analysis and real-time monitoring through the Swarm program. OGS also runs, in close cooperation with the Friuli-Venezia Giulia Civil Defense, the North East (NI) Italy seismic network, making use of the Antelope commercial software suite from BRTT as the main acquisition system. As a test of the global capabilities of the Antelope software suite, we also set up an instance of Antelope acquiring data in real time from both the regional ASAIN seismic network in Antarctica and a subset of the Global Seismic Network (GSN) funded by the Incorporated Research Institutions for Seismology (IRIS). The facilities of the IRIS Data Management System, and specifically the IRIS Data Management Center, were used for real-time access to the waveforms required in this study. The first tests indicated that more than 80% of the earthquakes with magnitude M>5.0 listed in the Preliminary Determination of Epicenters (PDE) catalogue of the National Earthquake Information Center (NEIC) of the United States Geological Survey (USGS) were also correctly and automatically detected by Antelope, with an average location error of 0.05 degrees and an average body-wave magnitude (Mb) estimation error below 0.1. The average time between event origin and event determination by Antelope was about 45 minutes: comparison with the roughly 20-minute IASPEI91 P-wave travel time at 180 degrees distance and the estimated 25 minutes of test-system data latency indicates that Antelope is a serious candidate for regional and global early warning systems.

  15. Automated data collection in single particle electron microscopy

    PubMed Central

    Tan, Yong Zi; Cheng, Anchi; Potter, Clinton S.; Carragher, Bridget

    2016-01-01

    Automated data collection is an integral part of modern workflows in single particle electron microscopy (EM) research. This review surveys the software packages available for automated single particle EM data collection. The degree of automation at each stage of data collection is evaluated, and the capabilities of the software packages are described. Finally, future trends in automation are discussed. PMID:26671944

  16. Single-crystal diffraction at the Extreme Conditions beamline P02.2: procedure for collecting and analyzing high-pressure single-crystal data.

    PubMed

    Rothkirch, André; Gatta, G Diego; Meyer, Mathias; Merkel, Sébastien; Merlini, Marco; Liermann, Hanns Peter

    2013-09-01

    Fast detectors employed at third-generation synchrotrons have reduced collection times significantly and require the optimization of commercial as well as customized software packages for data reduction and analysis. In this paper a procedure to collect, process, and analyze single-crystal data sets collected at high pressure at the Extreme Conditions beamline (P02.2) at PETRA III, DESY, is presented. A new data image format called 'Esperanto' is introduced that is supported by the commercial software package CrysAlisPro (Agilent Technologies UK Ltd). The new format acts as a vehicle to transform the most common area-detector data formats via translator software. Such a conversion tool has been developed; it converts TIFF data collected on a Perkin Elmer detector, as well as data collected on a MAR345/555, for import into the CrysAlisPro software. In order to demonstrate the validity of the new approach, a complete structure refinement of boron-mullite (Al5BO9) collected at a pressure of 19.4 (2) GPa is presented. Details pertaining to the data collection and refinement of B-mullite are given.

  17. Injection molding lens metrology using software configurable optical test system

    NASA Astrophysics Data System (ADS)

    Zhan, Cheng; Cheng, Dewen; Wang, Shanshan; Wang, Yongtian

    2016-10-01

    Optical plastic lenses produced by injection molding machines possess numerous advantages: light weight, impact resistance, low cost, etc. The measuring methods in the optical shop are mainly interferometry and profilometry. However, these instruments are not only expensive but also difficult to align. The software configurable optical test system (SCOTS) is based on the geometry of fringe reflection and the phase measuring deflectometry (PMD) method, and can be used to measure large-diameter mirrors and aspheric and freeform surfaces rapidly, robustly, and accurately. In addition to the conventional phase-shifting method, we propose another data collection method referred to as dot-matrix projection. We also use Zernike polynomials to correct the camera distortion. This polynomial-fitting approach to mapping the distortion is not only simple to operate but also offers high conversion precision. We simulate this test system measuring a concave surface using CODE V and MATLAB. The simulation results show that the dot-matrix projection method has high accuracy and that SCOTS has important significance for on-line inspection in the optical shop.
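
    The camera-distortion correction described here is a smooth 2D least-squares map from distorted to ideal coordinates; the authors fit it with Zernike polynomials. A minimal sketch of the same machinery with a low-order 2D polynomial basis standing in for the Zernike terms (simpler to write compactly; the distortion model below is synthetic):

    ```python
    import numpy as np

    def basis(x, y):
        """Low-order 2D polynomial basis standing in for Zernike terms."""
        return np.column_stack([np.ones_like(x), x, y, x*y, x**2, y**2,
                                x**3, y**3, x**2 * y, x * y**2])

    def fit_distortion(measured, ideal):
        """Least-squares map from measured (distorted) coordinates to
        ideal coordinates, one coefficient column per output axis."""
        A = basis(measured[:, 0], measured[:, 1])
        coeffs, *_ = np.linalg.lstsq(A, ideal, rcond=None)
        return coeffs

    def undistort(points, coeffs):
        return basis(points[:, 0], points[:, 1]) @ coeffs

    # Toy calibration: a dot grid imaged with mild synthetic barrel distortion.
    g = np.linspace(-1, 1, 7)
    xx, yy = np.meshgrid(g, g)
    ideal = np.column_stack([xx.ravel(), yy.ravel()])
    r2 = (ideal ** 2).sum(axis=1, keepdims=True)
    measured = ideal * (1 - 0.05 * r2)
    coeffs = fit_distortion(measured, ideal)
    print("max residual:", np.abs(undistort(measured, coeffs) - ideal).max())
    ```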

  18. Clinical data miner: an electronic case report form system with integrated data preprocessing and machine-learning libraries supporting clinical diagnostic model research.

    PubMed

    Installé, Arnaud Jf; Van den Bosch, Thierry; De Moor, Bart; Timmerman, Dirk

    2014-10-20

    Using machine-learning techniques, clinical diagnostic model research extracts diagnostic models from patient data. Traditionally, patient data are often collected using electronic Case Report Form (eCRF) systems, while mathematical software is used for analyzing these data using machine-learning techniques. Due to the lack of integration between eCRF systems and mathematical software, extracting diagnostic models is a complex, error-prone process. Moreover, due to the complexity of this process, it is usually only performed once, after a predetermined number of data points have been collected, without insight into the predictive performance of the resulting models. The objective of the Clinical Data Miner (CDM) software framework is to offer an eCRF system with integrated data preprocessing and machine-learning libraries, improving the efficiency of the clinical diagnostic model research workflow, and to enable optimization of patient inclusion numbers through study performance monitoring. The CDM software framework was developed using a test-driven development (TDD) approach to ensure high software quality. Architecturally, CDM's design is split over a number of modules to ensure future extensibility. The TDD approach has enabled us to deliver high software quality. CDM's eCRF Web interface is in active use by the studies of the International Endometrial Tumor Analysis consortium, with over 4000 enrolled patients and more studies planned. Additionally, a derived user interface has been used in six separate interrater agreement studies. CDM's integrated data preprocessing and machine-learning libraries simplify some otherwise manual and error-prone steps in the clinical diagnostic model research workflow. Furthermore, CDM's libraries provide study coordinators with a method to monitor a study's predictive performance as patient inclusions increase. To our knowledge, CDM is the only eCRF system integrating data preprocessing and machine-learning libraries. This integration improves the efficiency of the clinical diagnostic model research workflow. Moreover, by simplifying the generation of learning curves, CDM enables study coordinators to assess more accurately when data collection can be terminated, resulting in better models or lower patient recruitment costs.
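
    The learning-curve monitoring CDM automates can be sketched with standard tools: retrain on growing subsets of the accumulated data and watch cross-validated performance. A minimal illustration with scikit-learn and synthetic data (this is not CDM's API, just the underlying idea):

    ```python
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import learning_curve

    # Stand-in for accumulating eCRF data: a synthetic diagnostic dataset.
    X, y = make_classification(n_samples=400, n_features=10, random_state=0)

    sizes, train_scores, val_scores = learning_curve(
        LogisticRegression(max_iter=1000), X, y,
        train_sizes=np.linspace(0.1, 1.0, 5), cv=5)

    for n, v in zip(sizes, val_scores.mean(axis=1)):
        print(f"{n:4d} patients -> cross-validated accuracy {v:.3f}")
    # When the curve flattens, further inclusions add little predictive
    # value -- the stopping signal the abstract describes.
    ```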

  19. Using an architectural approach to integrate heterogeneous, distributed software components

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Purtilo, James M.

    1995-01-01

    Many computer programs cannot be easily integrated because their components are distributed and heterogeneous, i.e., they are implemented in diverse programming languages, use different data representation formats, or their runtime environments are incompatible. In many cases, programs are integrated by modifying their components or interposing mechanisms that handle communication and conversion tasks. For example, remote procedure call (RPC) helps integrate heterogeneous, distributed programs. When configuring such programs, however, mechanisms like RPC must be used explicitly by software developers in order to integrate collections of diverse components. Each collection may require a unique integration solution. This paper describes improvements to the concepts of software packaging and some of our experiences in constructing complex software systems from a wide variety of components in different execution environments. Software packaging is a process that automatically determines how to integrate a diverse collection of computer programs based on the types of components involved and the capabilities of available translators and adapters in an environment. Software packaging provides a context that relates such mechanisms to software integration processes and reduces the cost of configuring applications whose components are distributed or implemented in different programming languages. Our software packaging tool subsumes traditional integration tools like UNIX make by providing a rule-based approach to software integration that is independent of execution environments.
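
    The packaging process described -- choosing translators and adapters from the types of the components at hand -- is essentially a search over conversion rules. A minimal sketch of that idea (the component types and translator table are invented for illustration):

    ```python
    from collections import deque

    # Available translators, as (from_type, to_type) edges.
    TRANSLATORS = {("fortran_data", "xdr"), ("xdr", "c_struct"),
                   ("ada_record", "xdr")}

    def integration_plan(src_type, dst_type):
        """Breadth-first search for a chain of translators letting a
        component producing src_type feed one expecting dst_type."""
        queue, seen = deque([(src_type, [])]), {src_type}
        while queue:
            t, path = queue.popleft()
            if t == dst_type:
                return path
            for a, b in TRANSLATORS:
                if a == t and b not in seen:
                    seen.add(b)
                    queue.append((b, path + [(a, b)]))
        return None  # no rule chain: components cannot be packaged

    print(integration_plan("fortran_data", "c_struct"))
    # [('fortran_data', 'xdr'), ('xdr', 'c_struct')]
    ```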

  20. Executable assertions and flight software

    NASA Technical Reports Server (NTRS)

    Mahmood, A.; Andrews, D. M.; Mccluskey, E. J.

    1984-01-01

    Executable assertions are used to test flight control software. The techniques used for testing flight software, however, are different from those used to test other kinds of software, because of the redundant nature of flight software. An experimental setup for testing flight software using executable assertions is described. Techniques for writing and using executable assertions to test flight software are presented. The error detection capability of assertions is studied and many examples of assertions are given. The issues of placement and complexity of assertions and the language features needed to support efficient use of assertions are discussed.
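
    Executable assertions of the kind studied here are runtime range and consistency checks embedded in the control code, logging violations rather than halting. A minimal sketch (the variable names and limits are illustrative, not from any flight system):

    ```python
    def check(condition, message, log):
        """Executable assertion: record the violation instead of
        aborting, since flight code must keep running."""
        if not condition:
            log.append(message)

    def pitch_control_step(pitch_deg, rate_dps, cmd_deg, log):
        # Range assertions on sensor inputs.
        check(-90.0 <= pitch_deg <= 90.0, "pitch out of range", log)
        check(abs(rate_dps) <= 20.0, "pitch rate out of range", log)
        new_cmd = cmd_deg + 0.1 * (0.0 - pitch_deg) - 0.05 * rate_dps
        # Consistency assertion on the computed output.
        check(abs(new_cmd - cmd_deg) <= 5.0, "command step too large", log)
        return new_cmd

    log = []
    pitch_control_step(120.0, 3.0, 1.0, log)  # inject a bad sensor value
    print(log)  # ['pitch out of range', 'command step too large']
    ```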

  1. 77 FR 50722 - Software Unit Testing for Digital Computer Software Used in Safety Systems of Nuclear Power Plants

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-22

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Software Unit Testing for Digital Computer Software...) is issuing for public comment draft regulatory guide (DG), DG-1208, ``Software Unit Testing for Digital Computer Software used in Safety Systems of Nuclear Power Plants.'' The DG-1208 is proposed...

  2. 78 FR 47011 - Software Unit Testing for Digital Computer Software Used in Safety Systems of Nuclear Power Plants

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-02

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Software Unit Testing for Digital Computer Software... revised regulatory guide (RG), revision 1 of RG 1.171, ``Software Unit Testing for Digital Computer Software Used in Safety Systems of Nuclear Power Plants.'' This RG endorses American National Standards...

  3. Collected Software Engineering Papers, Volume 10

    NASA Technical Reports Server (NTRS)

    1992-01-01

    This document is a collection of selected technical papers produced by participants in the Software Engineering Laboratory (SEL) from Oct. 1991 - Nov. 1992. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. Although these papers cover several topics related to software engineering, they do not encompass the entire scope of SEL activities and interests. Additional information about the SEL and its research efforts may be obtained from the sources listed in the bibliography at the end of this document. For the convenience of this presentation, the 11 papers contained here are grouped into 5 major sections: (1) the Software Engineering Laboratory; (2) software tools studies; (3) software models studies; (4) software measurement studies; and (5) Ada technology studies.

  4. Improving Data Collection and Analysis Interface for the Data Acquisition Software of the Spin Laboratory at NASA Glenn Research Center

    NASA Technical Reports Server (NTRS)

    Abdul-Aziz, Ali; Curatolo, Ben S.; Woike, Mark R.

    2011-01-01

    In jet engines, turbines spin at high rotational speeds. The forces generated at these high speeds make the rotating components of the turbines susceptible to developing cracks that can lead to major engine failures. Current inspection technologies allow only periodic examinations to check for cracks and other anomalies, because of the requirements involved, which often necessitate entire engine disassembly. Also, many of these technologies cannot detect cracks that are below the surface or closed when the crack is at rest. Therefore, to overcome these limitations, efforts at NASA Glenn Research Center are underway to develop techniques and algorithms to detect cracks in rotating engine components. As a part of these activities, a high-precision spin laboratory is being utilized to expand and conduct highly specialized tests to develop methodologies that can assist in detecting predetermined cracks in a rotating turbine engine rotor. This paper discusses the various features involved in the ongoing testing at the spin laboratory and elaborates on its functionality and on the supporting data system tools needed to enable successfully running optimal tests and collecting accurate results. The data acquisition system and the associated software were updated and customized to adapt to the changes implemented on the test rig system and to accommodate the data produced by various sensor technologies. These updates and the new attributes implemented are discussed and presented herein.

  5. QUEST/Ada: Query utility environment for software testing of Ada

    NASA Technical Reports Server (NTRS)

    Brown, David B.

    1989-01-01

    Results of research and development efforts are presented for Task 1, Phase 2 of a general project entitled, The Development of a Program Analysis Environment for Ada. A prototype of the QUEST/Ada system was developed to collect data to determine the effectiveness of the rule-based testing paradigm. The prototype consists of five parts: the test data generator, the parser/scanner, the test coverage analyzer, a symbolic evaluator, and a data management facility, known as the Librarian. These components are discussed at length. Also presented is an experimental design for the evaluations, an overview of the project, and a schedule for its completion.

  6. Jet Noise Reduction

    NASA Technical Reports Server (NTRS)

    Kenny, Patrick

    2004-01-01

    The Acoustics Branch is responsible for reducing noise levels for jet and fan components on aircraft engines. To do this, data must be measured and calibrated accurately to ensure the validity of test results. Noise reduction is accomplished by modifications to hardware such as jet nozzles, and by the use of other experimental hardware such as fluidic chevrons, elliptic cores, and fluidic shields. To ensure valid data calibration, a variety of software is used; this software adjusts the sound amplitude and frequency to be consistent with data taken on another day. Both the software and the hardware help make noise reduction possible. These software programs were designed to make corrections for atmosphere, shear, attenuation, electronic noise, and background noise. All data can be converted to a one-foot lossless condition using the proper software corrections, making a reading independent of weather and distance. Also, data can be transformed from model scale to full scale for noise predictions of a real flight. Other programs included calculations of Over All Sound Pressure Level (OASPL) and Effective Perceived Noise Level (EPNL): OASPL is the integration of sound with respect to frequency, and EPNL is weighted for a human's response to different sound frequencies and integrated with respect to time. With the proper software corrections, data taken in the Nozzle Acoustic Test Rig (NATR) are useful in determining ways to reduce noise. Another program was written to display any difference between two or more data files; using this program and graphs of the data, the actual and predicted data can be compared. This software was tested on data collected at the Aero Acoustic Propulsion Laboratory (AAPL) using a variety of window types and overlaps. Similarly, short scripts were written to test each individual program in the software suite for verification. Each graph displays both the original points and the adjusted points connected with lines. During this summer, data points were taken during a live experiment at the AAPL to measure NATR background noise levels. Six condenser microphones were placed in strategic locations around the dome and the inlet tunnel to measure different noise sources. From the control room the jet was monitored with the help of video cameras and other sensors. The data points were recorded, reduced, and plotted, and will be used to plan future modifications to the NATR. The primary goal, to create data reduction test programs and provide verification, was completed. As a result of the internship, I learned C/C++, UNIX/LINUX, Excel, and acoustic data processing methods. I also recorded data at the AAPL, then processed and plotted it; these data would be useful to compare against existing data. In addition, I adjusted software to work on the Mac OS X platform and used the available training resources.
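
    The OASPL calculation mentioned above has a common definition: per-band sound pressure levels are summed as mean-square pressures on a linear scale and converted back to decibels. A small sketch (the band levels below are made-up sample values, not AAPL measurements):

        import math

        def oaspl(band_spls_db):
            """Over All Sound Pressure Level: combine per-band SPLs (dB)
            by summing their mean-square pressures on a linear scale."""
            linear_sum = sum(10 ** (spl / 10.0) for spl in band_spls_db)
            return 10.0 * math.log10(linear_sum)

        # Hypothetical one-third-octave band levels in dB:
        bands = [78.2, 81.5, 84.0, 86.3, 85.1, 82.7]
        print(f"OASPL = {oaspl(bands):.1f} dB")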

  7. Dtest Testing Software

    NASA Technical Reports Server (NTRS)

    Jain, Abhinandan; Cameron, Jonathan M.; Myint, Steven

    2013-01-01

    This software runs a suite of arbitrary software tests spanning various software languages and types of tests (unit level, system level, or file comparison tests). The dtest utility can be set to automate periodic testing of large suites of software, as well as running individual tests. It supports distributing multiple tests over multiple CPU cores, if available. The dtest tool is a utility program (written in Python) that scans through a directory (and its subdirectories) and finds all directories that match a certain pattern and then executes any tests in that directory as described in simple configuration files.
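
    dtest's actual configuration format is not described in this abstract, so the sketch below only illustrates the general pattern it names: walk a directory tree, find test directories by a marker file, and run each test, distributing the work across CPU cores. All file names and the config convention here are hypothetical:

        import os
        import subprocess
        from concurrent.futures import ProcessPoolExecutor

        CONFIG_NAME = "TEST.config"   # hypothetical marker file, not dtest's real name

        def find_tests(root):
            # Yield every directory that contains a test configuration file.
            for dirpath, _dirs, files in os.walk(root):
                if CONFIG_NAME in files:
                    yield os.path.join(dirpath, CONFIG_NAME)

        def run_test(config_path):
            # Assume the config's first line is the command to run (illustrative only).
            with open(config_path) as f:
                command = f.readline().strip()
            result = subprocess.run(command, shell=True,
                                    cwd=os.path.dirname(config_path))
            return config_path, result.returncode

        if __name__ == "__main__":
            with ProcessPoolExecutor() as pool:   # distribute over CPU cores
                for path, rc in pool.map(run_test, list(find_tests("."))):
                    print("PASS" if rc == 0 else "FAIL", path)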

  8. Analysis of Photogrammetry Data from ISIM Mockup, June 1, 2007

    NASA Technical Reports Server (NTRS)

    Nowak, Maria; Hill, Mike

    2007-01-01

    During ground testing of the Integrated Science Instrument Module (ISIM) for the James Webb Space Telescope (JWST), the ISIM Optics group plans to use a photogrammetry measurement system for cryogenic calibration of specific target points on the ISIM composite structure, the Science Instrument optical benches, and other GSE equipment. This testing will occur in the Space Environmental Systems (SES) chamber at Goddard Space Flight Center. Close-range photogrammetry is a 3-dimensional metrology technique that uses triangulation to locate custom targets in 3 coordinates via a collection of digital photographs taken from various locations and orientations. These photos are connected using coded targets (special targets recognized by the software, which allow the images to be correlated into a 3-dimensional map of the targets) and scaled via well-calibrated scale bars. Photogrammetry solves for the camera locations and the coordinates of the targets simultaneously through the bundling procedure contained in the V-STARS software.

  9. An Independent Orbit Determination Simulation for the OSIRIS-REx Asteroid Sample Return Mission

    NASA Technical Reports Server (NTRS)

    Getzandanner, Kenneth; Rowlands, David; Mazarico, Erwan; Antreasian, Peter; Jackman, Coralie; Moreau, Michael

    2016-01-01

    After arriving at the near-Earth asteroid (101955) Bennu in late 2018, the OSIRIS-REx spacecraft will execute a series of observation campaigns and orbit phases to accurately characterize Bennu and ultimately collect a sample of pristine regolith from its surface. While in the vicinity of Bennu, the OSIRIS-REx navigation team will rely on a combination of ground-based radiometric tracking data and optical navigation (OpNav) images to generate and deliver precision orbit determination products. Long before arrival at Bennu, the navigation team is performing multiple orbit determination simulations and thread tests to verify navigation performance and ensure interfaces between multiple software suites function properly. In this paper, we will summarize the results of an independent orbit determination simulation of the Orbit B phase of the mission performed to test the interface between the OpNav image processing and orbit determination software packages.

  10. 15 CFR 995.27 - Format validation software testing.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 15 Commerce and Foreign Trade 3 2013-01-01 2013-01-01 false Format validation software testing... of NOAA ENC Products § 995.27 Format validation software testing. Tests shall be performed verifying, as far as reasonable and practicable, that CEVAD's data testing software performs the checks, as...

  11. 15 CFR 995.27 - Format validation software testing.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 15 Commerce and Foreign Trade 3 2014-01-01 2014-01-01 false Format validation software testing... of NOAA ENC Products § 995.27 Format validation software testing. Tests shall be performed verifying, as far as reasonable and practicable, that CEVAD's data testing software performs the checks, as...

  12. 15 CFR 995.27 - Format validation software testing.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 15 Commerce and Foreign Trade 3 2012-01-01 2012-01-01 false Format validation software testing... of NOAA ENC Products § 995.27 Format validation software testing. Tests shall be performed verifying, as far as reasonable and practicable, that CEVAD's data testing software performs the checks, as...

  13. 15 CFR 995.27 - Format validation software testing.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 15 Commerce and Foreign Trade 3 2011-01-01 2011-01-01 false Format validation software testing... of NOAA ENC Products § 995.27 Format validation software testing. Tests shall be performed verifying, as far as reasonable and practicable, that CEVAD's data testing software performs the checks, as...

  14. Software Architecture of Sensor Data Distribution In Planetary Exploration

    NASA Technical Reports Server (NTRS)

    Lee, Charles; Alena, Richard; Stone, Thom; Ossenfort, John; Walker, Ed; Notario, Hugo

    2006-01-01

    Data from mobile and stationary sensors will be vital in planetary surface exploration. The distribution and collection of sensor data in an ad-hoc wireless network presents a challenge. Irregular terrain, mobile nodes, new associations with access points and repeaters with stronger signals as the network reconfigures to adapt to new conditions, signal fade and hardware failures can cause: a) Data errors; b) Out of sequence packets; c) Duplicate packets; and d) Drop out periods (when node is not connected). To mitigate the effects of these impairments, a robust and reliable software architecture must be implemented. This architecture must also be tolerant of communications outages. This paper describes such a robust and reliable software infrastructure that meets the challenges of a distributed ad hoc network in a difficult environment and presents the results of actual field experiments testing the principles and actual code developed.
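
    As a loose sketch of one standard way to mitigate the duplicate and out-of-sequence packet problems listed above (not the architecture actually fielded in the paper), a receiver can buffer packets by sequence number and deliver them in order:

        class OrderedReceiver:
            """Deliver packets in sequence order, dropping duplicates.
            Packets arriving after a drop-out period are buffered until
            the missing sequence numbers show up."""
            def __init__(self):
                self.next_seq = 0
                self.buffer = {}

            def receive(self, seq, payload):
                if seq < self.next_seq or seq in self.buffer:
                    return []                          # duplicate: discard
                self.buffer[seq] = payload
                delivered = []
                while self.next_seq in self.buffer:    # release any in-order run
                    delivered.append(self.buffer.pop(self.next_seq))
                    self.next_seq += 1
                return delivered

        rx = OrderedReceiver()
        print(rx.receive(1, "b"))   # [] held: packet 0 still missing
        print(rx.receive(0, "a"))   # ['a', 'b'] delivered in order
        print(rx.receive(0, "a"))   # [] duplicate dropped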

  15. The effects of GeoGebra software on pre-service mathematics teachers' attitudes and views toward proof and proving

    NASA Astrophysics Data System (ADS)

    Zengin, Yılmaz

    2017-11-01

    The purpose of this study is to determine the effect of GeoGebra software on pre-service mathematics teachers' attitudes towards proof and proving and to determine pre-service teachers' pre- and post-views regarding proof. The study lasted nine weeks and the participants of the study consisted of 24 pre-service mathematics teachers. The study used the 'Attitude Scale Towards Proof and Proving' and an open-ended questionnaire that were administered before and after the intervention as data collection tools. Paired samples t-test analysis was used for the analysis of quantitative data and content and descriptive analyses were utilized for the analysis of qualitative data. As a result of the data analysis, it was determined that GeoGebra software was an effective tool in increasing pre-service teachers' attitudes towards proof and proving.
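
    The paired-samples t-test used for the quantitative data can be reproduced with standard tools; the scores below are invented placeholders, not the study's data:

        from scipy import stats

        # Hypothetical pre/post attitude scores for the same participants
        pre  = [3.1, 2.8, 3.5, 3.0, 2.6, 3.2, 2.9, 3.4]
        post = [3.8, 3.2, 4.0, 3.6, 3.1, 3.9, 3.3, 4.1]

        t, p = stats.ttest_rel(post, pre)   # paired: same subjects measured twice
        print(f"t = {t:.2f}, p = {p:.4f}")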

  16. Development of yarn breakage detection software system based on machine vision

    NASA Astrophysics Data System (ADS)

    Wang, Wenyuan; Zhou, Ping; Lin, Xiangyu

    2017-10-01

    In spinning mills, yarn breakage often cannot be detected in a timely manner, and timely detection would save textile enterprises cost. This paper presents a software system based on computer vision for real-time detection of yarn breakage. The system uses a Windows 8.1 tablet PC and a cloud server to perform yarn breakage detection and management. The software running on the tablet PC is designed to collect yarn images and location information for analysis and processing; the processed information is then sent over Wi-Fi via the HTTP protocol to the cloud server, where it is stored in a Microsoft SQL Server 2008 database for follow-up query and management of yarn-break information. Results are finally shown on the local display in real time to remind the operator to deal with broken yarn. The experimental results show that the system's missed-detection rate is no more than 5 per mille, with no false detections.

  17. DATALINK: Records inventory data collection software. User`s guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cole, B.A.

    1995-03-01

    DATALINK was created to provide an easy-to-use data collection program for records management software products. It provides several useful tools for capturing and validating record index data in the field. It also allows users to easily create a comma-delimited ASCII text file for data export into most records management software products. It runs on virtually any computer running MS-DOS.

  18. Evaluation of a Game to Teach Requirements Collection and Analysis in Software Engineering at Tertiary Education Level

    ERIC Educational Resources Information Center

    Hainey, Thomas; Connolly, Thomas M.; Stansfield, Mark; Boyle, Elizabeth A.

    2011-01-01

    A highly important part of software engineering education is requirements collection and analysis which is one of the initial stages of the Database Application Lifecycle and arguably the most important stage of the Software Development Lifecycle. No other conceptual work is as difficult to rectify at a later stage or as damaging to the overall…

  19. Automation is an Effective Way to Improve Quality of Verification (Calibration) of Measuring Instruments

    NASA Astrophysics Data System (ADS)

    Golobokov, M.; Danilevich, S.

    2018-04-01

    In order to assess calibration reliability and to automate such assessment, procedures for data collection and a simulation study of a thermal imager calibration procedure have been elaborated. The existing calibration techniques do not always provide high reliability. A new method for analyzing the existing calibration techniques and developing new, efficient ones has been suggested and tested. A type of software has been studied that generates instrument calibration reports automatically, monitors their proper configuration, processes measurement results, and assesses instrument validity. The use of such software reduces the man-hours spent on finalizing calibration data by a factor of 2 to 5 and eliminates a whole set of typical operator errors.

  20. Simulation Testing of Embedded Flight Software

    NASA Technical Reports Server (NTRS)

    Shahabuddin, Mohammad; Reinholtz, William

    2004-01-01

    Virtual Real Time (VRT) is a computer program for testing embedded flight software by computational simulation in a workstation, in contradistinction to testing it in its target central processing unit (CPU). The disadvantages of testing in the target CPU include the need for an expensive test bed, the necessity for testers and programmers to take turns using the test bed, and the lack of software tools for debugging in a real-time environment. By virtue of its architecture, most flight software of the type in question is amenable to development and testing on workstations, for which there is an abundance of commercially available debugging and analysis software tools. Unfortunately, the timing of a workstation differs from that of a target CPU in a test bed. VRT, in conjunction with closed-loop simulation software, provides a capability for executing embedded flight software on a workstation in a close-to-real-time environment. A scale factor is used to convert between execution time in VRT on a workstation and execution time on a target CPU. VRT includes high-resolution operating-system timers that enable the synchronization of flight software with simulation software and ground software, all running on different workstations.
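
    A minimal sketch of the scale-factor idea (the factor value is hypothetical, and VRT's actual mechanism is not detailed in this abstract): stretch workstation wall-clock time so that simulated flight-software time advances as it would on the slower target CPU:

        import time

        SCALE = 4.0   # hypothetical: target CPU assumed 4x slower than workstation

        class ScaledClock:
            """Map workstation elapsed time onto simulated target-CPU time."""
            def __init__(self, scale):
                self.scale = scale
                self.start = time.monotonic()

            def target_time(self):
                # Elapsed workstation seconds, stretched to target-CPU seconds.
                return (time.monotonic() - self.start) * self.scale

        clock = ScaledClock(SCALE)
        time.sleep(0.1)   # 0.1 s of workstation execution...
        print(f"{clock.target_time():.2f} s of simulated target-CPU time")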

  1. Advanced Chemistry Collection, 2nd Edition

    NASA Astrophysics Data System (ADS)

    2001-11-01

    Software requirements are given in Table 3. Some programs have additional special requirements. Please see the individual program abstracts at JCE Online or the documentation included on the CD-ROM for more specific information. Table 3. General software requirements for the Advanced Chemistry Collection.

     Computer             System                       Other software (required by one or more programs)
     Mac OS compatible    System 7.6.1 or higher       Acrobat Reader (included); Mathcad; Mathematica; MacMolecule2; QuickTime 4; HyperCard Player
     Windows compatible   Windows 2000, 98, 95, NT 4   Acrobat Reader (included); Mathcad; Mathematica; PCMolecule2; QuickTime 4; HyperChem; Excel

    Literature Cited

    1. General Chemistry Collection, 5th ed.; J. Chem. Educ. Software, 2001, SP16.
    2. Advanced Chemistry Collection; J. Chem. Educ. Software, 2001, SP28.

  2. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 15

    NASA Technical Reports Server (NTRS)

    1997-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  3. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 14

    NASA Technical Reports Server (NTRS)

    1996-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  4. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 13

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  5. Accuracy of metric sex analysis of skeletal remains using Fordisc based on a recent skull collection.

    PubMed

    Ramsthaler, F; Kreutz, K; Verhoff, M A

    2007-11-01

    It has been generally accepted in skeletal sex determination that the use of metric methods is limited due to the population dependence of the multivariate algorithms. The aim of the study was to verify the applicability of software-based sex estimation outside the reference population group for which the discriminant equations were developed. We examined 98 skulls from recent forensic cases of known age, sex, and Caucasian ancestry from cranium collections in Frankfurt and Mainz (Germany) to determine the accuracy of sex determination using the statistical software solution Fordisc, which derives its database and functions from the US American Forensic Database. In a comparison between metric analysis using Fordisc and morphological determination of sex, average accuracy across both sexes was 86% vs. 94%, respectively, and males were identified more accurately than females. The ratio of the true test result rate to the false test result rate was not statistically different for the two methodological approaches at a significance level of 0.05 but was statistically different at a level of 0.10 (p=0.06). Possible explanations for this difference comprise different ancestry, age distribution, and socio-economic status compared to the Fordisc reference sample. It is likely that a discriminant function analysis on the basis of more similar European reference samples will lead to more valid and reliable sexing results. The use of Fordisc as a single method for the estimation of sex of recent skeletal remains in Europe cannot be recommended without additional morphological assessment and without a built-in software update based on modern European reference samples.

  6. Geometric processing workflow for vertical and oblique hyperspectral frame images collected using UAV

    NASA Astrophysics Data System (ADS)

    Markelin, L.; Honkavaara, E.; Näsi, R.; Nurminen, K.; Hakala, T.

    2014-08-01

    Remote sensing based on unmanned airborne vehicles (UAVs) is a rapidly developing field of technology. UAVs enable accurate, flexible, low-cost and multiangular measurements of 3D geometric, radiometric, and temporal properties of land and vegetation using various sensors. In this paper we present a geometric processing chain for a multiangular measurement system that is designed for measuring object directional reflectance characteristics in a wavelength range of 400-900 nm. The technique is based on a novel, lightweight spectral camera designed for UAV use. The multiangular measurement is conducted by collecting vertical and oblique area-format spectral images. End products of the geometric processing are image exterior orientations, 3D point clouds, and digital surface models (DSMs). These data are needed for the radiometric processing chain that produces reflectance image mosaics and multiangular bidirectional reflectance factor (BRF) observations. The geometric processing workflow consists of the following three steps: (1) determining approximate image orientations using Visual Structure from Motion (VisualSFM) software; (2) optionally, calculating improved orientations and sensor calibration using a method based on self-calibrating bundle block adjustment (standard photogrammetric software); and (3) creating dense 3D point clouds and DSMs using Photogrammetric Surface Reconstruction from Imagery (SURE) software, which is based on a semi-global matching algorithm and is capable of providing a point density corresponding to the pixel size of the image. We have tested the geometric processing workflow over various targets, including test fields, agricultural fields, lakes, and complex 3D structures such as forests.

  7. Modular Rocket Engine Control Software (MRECS)

    NASA Technical Reports Server (NTRS)

    Tarrant, C.; Crook, J.

    1998-01-01

    The Modular Rocket Engine Control Software (MRECS) Program is a technology demonstration effort designed to advance the state of the art in launch vehicle propulsion systems. Its emphasis is on developing and demonstrating a modular software architecture for advanced engine control systems that will result in lower software maintenance (operations) costs. It effectively accommodates software requirement changes that occur due to hardware technology upgrades and engine development testing. Ground rules directed by MSFC were to optimize modularity and implement the software in the Ada programming language. MRECS system software and the software development environment utilize Commercial-Off-the-Shelf (COTS) products. This paper presents the objectives, benefits, and status of the program. The software architecture, design, and development environment are described. MRECS tasks are defined and timing relationships given. Major accomplishments are listed. MRECS offers benefits to a wide variety of advanced technology programs in the areas of modular software architecture, software reuse, and reduced software reverification time related to software changes. MRECS was recently modified to support a Space Shuttle Main Engine (SSME) hot-fire test. Cold Flow and Flight Readiness Testing were completed before the test was cancelled. Currently, the program is focused on supporting NASA MSFC in accomplishing development testing of the Fastrac Engine, part of NASA's Low Cost Technologies (LCT) Program. MRECS will be used for all engine development testing.

  8. Evaluation of ERTS multispectral signatures in relation to ground control signatures using a nested-sampling approach

    NASA Technical Reports Server (NTRS)

    Lyon, R. J. P. (Principal Investigator)

    1975-01-01

    The author has identified the following significant results. Ground-measured spectral signatures in wavelength bands matching the ERTS MSS were collected using a radiometer at several Californian and Nevadan sites and directly compared with similar data from ERTS CCTs. The comparison was tested at the highest possible spatial resolution for ERTS, using deconvoluted MSS data, and contrasted with that of ground-measured spectra, originally from 1-meter squares. In the mobile traverses of the grassland sites, these one-meter fields of view were integrated into eighty-meter transects along the five-km track across four major rock/soil types. Suitable software was developed to read the MSS CCT tapes and to shadeprint individual bands with user-determined greyscale stretching. Four new algorithms for unsupervised and supervised, normalized and unnormalized clustering were developed and combined into a program termed STANSORT. Parallel software allowed the field data to be calibrated; by using concurrently and continuously collected data from upward- and downward-viewing 4-band radiometers, bidirectional reflectances could be calculated.

  9. Evaluation of Potential Test Environments for Assessing the Impact of Multi-Sensor Data Fusion on Command and Control Operations in the HALIFAX Class Frigate

    DTIC Science & Technology

    2001-05-01

    Scenario definition involves specifying the gaming area and land masses, the meteorological and ocean environments, and the sea conditions that exist within the gaming area. The evaluation checklist also covers the ability to customise the environment for data collection, including extra audio and video, add-on software modules, standalone keypads, state boards, and manuals, charts, and tacpacs, several of which were noted as absent from the candidate environments but possible to add.

  10. Data collection system. Volume 1, Overview and operators manual; Volume 2, Maintenance manual; Appendices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Caudell, R.B.; Bauder, M.E.; Boyer, W.B.

    1993-09-01

    Sandia National Laboratories (SNL) Instrumentation Development Department was tasked by the Defense Nuclear Agency (DNA) to record data on Tektronix RTD720 digitizers on the HUNTERS TROPHY field test conducted at the Nevada Test Site (NTS) on September 18, 1992. This report contains an overview and description of the computer hardware and software that were used to acquire, reduce, and display the data. The document is divided into two volumes: an overview and operators manual (Volume 1) and a maintenance manual (Volume 2).

  11. Sensors Locate Radio Interference

    NASA Technical Reports Server (NTRS)

    2009-01-01

    After receiving a NASA Small Business Innovation Research (SBIR) contract from Kennedy Space Center, Soneticom Inc., based in West Melbourne, Florida, created algorithms for time difference of arrival and radio interferometry, which it used in its Lynx Location System (LLS) to locate electromagnetic interference that can disrupt radio communications. Soneticom is collaborating with the Federal Aviation Administration (FAA) to install and test the LLS at its field test center in New Jersey in preparation for deploying the LLS at commercial airports. The software collects data from each sensor in order to compute the location of the interfering emitter.
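
    Time difference of arrival (TDOA) locates an emitter from the relative delays at which synchronized sensors receive the same signal. Soneticom's actual algorithms are not described in this record, so the sketch below is only a generic least-squares TDOA solver; the sensor positions, emitter position, and delays are all made up:

        import numpy as np
        from scipy.optimize import least_squares

        C = 3.0e8   # propagation speed (m/s)
        sensors = np.array([[0.0, 0.0], [500.0, 0.0],
                            [0.0, 500.0], [500.0, 500.0]])

        def tdoa_residuals(xy, sensors, tdoas):
            # Residual: measured delay vs. delay predicted from the candidate
            # position, each taken relative to sensor 0.
            d = np.linalg.norm(sensors - xy, axis=1)
            predicted = (d[1:] - d[0]) / C
            return predicted - tdoas

        # Synthesize measurements for a hypothetical emitter at (120, 340).
        true_xy = np.array([120.0, 340.0])
        d = np.linalg.norm(sensors - true_xy, axis=1)
        tdoas = (d[1:] - d[0]) / C

        fit = least_squares(tdoa_residuals, x0=[250.0, 250.0],
                            args=(sensors, tdoas))
        print("estimated emitter position:", fit.x)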

  12. Modeling Student Software Testing Processes: Attitudes, Behaviors, Interventions, and Their Effects

    ERIC Educational Resources Information Center

    Buffardi, Kevin John

    2014-01-01

    Effective software testing identifies potential bugs and helps correct them, producing more reliable and maintainable software. As software development processes have evolved, incremental testing techniques have grown in popularity, particularly with introduction of test-driven development (TDD). However, many programmers struggle to adopt TDD's…

  13. Educational Software Acquisition for Microcomputers.

    ERIC Educational Resources Information Center

    Erikson, Warren; Turban, Efraim

    1985-01-01

    Examination of issues involved in acquiring appropriate microcomputer software for higher education focuses on the following points: developing your own software; finding commercially available software; using published evaluations; pre-purchase testing; customizing and adapting commercial software; post-purchase testing; and software use. A…

  14. Forecasting in foodservice: model development, testing, and evaluation.

    PubMed

    Miller, J L; Thompson, P A; Orabella, M M

    1991-05-01

    This study was designed to develop, test, and evaluate mathematical models appropriate for forecasting menu-item production demand in foodservice. Data were collected from residence and dining hall foodservices at Ohio State University. Objectives of the study were to collect, code, and analyze the data; develop and test models using actual operation data; and compare forecasting results with current methods in use. Customer count was forecast using deseasonalized simple exponential smoothing. Menu-item demand was forecast by multiplying the count forecast by a predicted preference statistic. Forecasting models were evaluated using mean squared error, mean absolute deviation, and mean absolute percentage error techniques. All models were more accurate than current methods. A broad spectrum of forecasting techniques could be used by foodservice managers with access to a personal computer and spread-sheet and database-management software. The findings indicate that mathematical forecasting techniques may be effective in foodservice operations to control costs, increase productivity, and maximize profits.
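
    A rough sketch of the forecasting scheme described above, with invented numbers and the seasonality handling omitted: smooth the deseasonalized customer count, multiply by each menu item's preference statistic, and score the result with MSE, MAD, and MAPE:

        def exp_smooth(series, alpha=0.3):
            """Simple exponential smoothing; returns the one-step-ahead forecast."""
            forecast = series[0]
            for x in series[1:]:
                forecast = alpha * x + (1 - alpha) * forecast
            return forecast

        counts = [412, 398, 430, 405, 441, 418]   # deseasonalized daily counts
        pref = {"pizza": 0.22, "salad": 0.15}     # predicted preference statistics

        count_fc = exp_smooth(counts)
        item_fc = {item: p * count_fc for item, p in pref.items()}

        def mse(e):  return sum(x * x for x in e) / len(e)
        def mad(e):  return sum(abs(x) for x in e) / len(e)
        def mape(e, actual):
            return 100 * sum(abs(x / a) for x, a in zip(e, actual)) / len(e)

        actual = [95, 60]   # hypothetical observed demand for the two items
        errors = [f - a for f, a in zip(item_fc.values(), actual)]
        print(item_fc, mse(errors), mad(errors), mape(errors, actual))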

  15. Web survey data collection and retrieval to plan teleradiology implementation

    NASA Astrophysics Data System (ADS)

    Alaoui, Adil; Collmann, Jeff R.; Johnson, Jeffrey A.; Lindisch, David; Nguyen, Dan; Mun, Seong K.

    2003-05-01

    This case study details the experience of system engineers of the Imaging Science and Information Systems Center, Georgetown University Medical Center (ISIS), and radiologists from the Department of Radiology in the implementation of a new teleradiology system. The teleradiology system enables radiologists to view medical images from remote sites under those circumstances where a resident radiologist needs assistance in evaluating the images after hours and during weekends; it also enables clinicians to access patients' medical images from different workstations within the hospital. Implementation of the teleradiology project was preceded by an evaluation phase to perform testing, gather user feedback using a web site, and collect information that helped eliminate system bugs, complete recommendations regarding minimum hardware configuration and bandwidth, and enhance the system's functions. This phase included a survey-based assessment of computer configurations, Internet connections, problem identification, and recommendations for improvement, as well as a testing period with two radiologists and ISIS engineers. The second phase was designed to launch the system and make it available to all attending radiologists in the department. To accomplish the first phase of the project, a web site was designed and ASP pages were created to enable users to securely log on and enter feedback and recommendations into an SQL database. This efficient, accurate data flow alleviated networking, software, and hardware problems. Corrective recommendations were immediately forwarded to the software vendor, and the vendor responded with software updates that better met the needs of the radiologists. The ISIS Center completed recommendations for minimum hardware and bandwidth requirements. This experience illustrates that the approach used in collecting the data and facilitating teamwork between the system engineers and radiologists was instrumental in the project's success. Major problems with the teleradiology system were discovered and remedied early by linking the actual practice experience of the physicians to the system improvements.

  16. Challenges of the Cassini Test Bed Simulating the Saturnian Environment

    NASA Technical Reports Server (NTRS)

    Hernandez, Juan C.; Badaruddin, Kareem S.

    2007-01-01

    The Cassini-Huygens mission is a joint NASA and European Space Agency (ESA) mission to collect scientific data of the Saturnian system and is managed by the Jet Propulsion Laboratory (JPL). After having arrived in Saturn orbit and releasing the ESA's Huygens probe for a highly successful descent and landing mission on Saturn's moon Titan, the Cassini orbiter continues on its tour of Saturn, its satellites, and the Saturnian environment. JPL's Cassini Integrated Test laboratory (ITL) is a dedicated high fidelity test bed that verifies and validates command sequences and flight software before upload to the Cassini spacecraft. The ITL provides artificial stimuli that allow a highly accurate hardware-in-the-loop test bed model that tests the operation of the Cassini spacecraft on the ground. This enables accurate prediction and recreation of mission events and flight software and hardware behavior. As we discovered more about the Saturnian environment, a combination of creative test methods and simulation changes were necessary to simulate the harmful effect that the optical and physical environment has on the pointing performance of Cassini. This paper presents the challenges experienced and overcome in that endeavor to simulate and test the post Saturn Orbit Insertion (SOI) and Probe Relay tour phase of the Cassini mission.

  17. Obtaining continuous BrAC/BAC estimates in the field: A hybrid system integrating transdermal alcohol biosensor, Intellidrink smartphone app, and BrAC Estimator software tools.

    PubMed

    Luczak, Susan E; Hawkins, Ashley L; Dai, Zheng; Wichmann, Raphael; Wang, Chunming; Rosen, I Gary

    2018-08-01

    Biosensors have been developed to measure transdermal alcohol concentration (TAC), but converting TAC into interpretable indices of blood/breath alcohol concentration (BAC/BrAC) is difficult because of variations that occur in TAC across individuals, drinking episodes, and devices. We have developed mathematical models and the BrAC Estimator software for calibrating and inverting TAC into quantifiable BrAC estimates (eBrAC). The calibration protocol to determine the individualized parameters for a specific individual wearing a specific device requires a drinking session in which BrAC and TAC measurements are obtained simultaneously. This calibration protocol was originally conducted in the laboratory with breath analyzers used to produce the BrAC data. Here we develop and test an alternative calibration protocol using drinking diary data collected in the field with the smartphone app Intellidrink to produce the BrAC calibration data. We compared BrAC Estimator software results for 11 drinking episodes collected by an expert user when using Intellidrink versus breath analyzer measurements as BrAC calibration data. Inversion phase results indicated the Intellidrink calibration protocol produced similar eBrAC curves and captured peak eBrAC to within 0.0003%, time of peak eBrAC to within 18 minutes, and area under the eBrAC curve to within 0.025% alcohol-hours of the breath analyzer calibration protocol. This study provides evidence that drinking diary data can be used in place of breath analyzer data in the BrAC Estimator software calibration procedure, which can reduce participant and researcher burden and expand the potential software user pool beyond researchers studying participants who can drink in the laboratory. Copyright © 2017. Published by Elsevier Ltd.
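
    The three comparison metrics reported above are straightforward to compute from a sampled curve. The sketch below evaluates peak, time of peak, and trapezoidal area under a hypothetical, made-up eBrAC curve (it is not the BrAC Estimator's code):

        import numpy as np

        t = np.linspace(0, 6, 61)                  # hours, hypothetical session
        ebrac = 0.06 * np.exp(-((t - 2.0) ** 2))   # made-up eBrAC curve (% alcohol)

        peak = ebrac.max()                         # peak eBrAC (%)
        t_peak = t[ebrac.argmax()]                 # time of peak (hours)
        auc = np.trapz(ebrac, t)                   # area under curve (%-hours)
        print(f"peak={peak:.4f}%, t_peak={t_peak:.1f} h, AUC={auc:.4f} %-hours")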

  18. Collected software engineering papers, volume 8

    NASA Technical Reports Server (NTRS)

    1990-01-01

    A collection of selected technical papers produced by participants in the Software Engineering Laboratory (SEL) during the period November 1989 through October 1990 is presented. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. Although these papers cover several topics related to software engineering, they do not encompass the entire scope of SEL activities and interests. Additional information about the SEL and its research efforts may be obtained from the sources listed in the bibliography. The seven presented papers are grouped into four major categories: (1) experimental research and evaluation of software measurement; (2) studies on models for software reuse; (3) a software tool evaluation; and (4) Ada technology and studies in the areas of reuse and specification.

  19. Collected software engineering papers, volume 7

    NASA Technical Reports Server (NTRS)

    1989-01-01

    A collection is presented of selected technical papers produced by participants in the Software Engineering Laboratory (SEL) during the period Dec. 1988 to Oct. 1989. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. For the convenience of this presentation, the seven papers contained here are grouped into three major categories: (1) Software Measurement and Technology Studies; (2) Measurement Environment Studies; and (3) Ada Technology Studies. The first category presents experimental research and evaluation of software measurement and technology; the second presents studies on software environments pertaining to measurement. The last category represents Ada technology and includes research, development, and measurement studies.

  20. Collected software engineering papers, volume 6

    NASA Technical Reports Server (NTRS)

    1988-01-01

    A collection is presented of technical papers produced by participants in the Software Engineering Laboratory (SEL) during the period 1 Jun. 1987 to 1 Jan. 1989. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. For the convenience of this presentation, the twelve papers contained here are grouped into three major categories: (1) Software Measurement and Technology Studies; (2) Measurement Environment Studies; and (3) Ada Technology Studies. The first category presents experimental research and evaluation of software measurement and technology; the second presents studies on software environments pertaining to measurement. The last category represents Ada technology and includes research, development, and measurement studies.

  1. Automated assessment of pain in rats using a voluntarily accessed static weight-bearing test.

    PubMed

    Kim, Hung Tae; Uchimoto, Kazuhiro; Duellman, Tyler; Yang, Jay

    2015-11-01

    The weight-bearing test is one method to assess pain in rodent animal models; however, the acceptance of this convenient method is limited by the low throughput data acquisition and necessity of confining the rodents to a small chamber. We developed novel data acquisition hardware and software, data analysis software, and a conditioning protocol for an automated high throughput static weight-bearing assessment of pain. With this device, the rats voluntarily enter the weighing chamber, precluding the necessity to restrain the animals and thereby removing the potential stress-induced confounds as well as operator selection bias during data collection. We name this device the Voluntarily Accessed Static Incapacitance Chamber (VASIC). Control rats subjected to the VASIC device provided hundreds of weight-bearing data points in a single behavioral assay. Chronic constriction injury (CCI) surgery and paw pad injection of complete Freund's adjuvant (CFA) or carrageenan in rats generated hundreds of weight-bearing data during a 30 minute recording session. Rats subjected to CCI, CFA, or carrageenan demonstrated the expected bias in weight distribution favoring the un-operated leg, and the analgesic effect of i.p. morphine was demonstrated. In comparison with existing methods, brief water restriction encouraged the rats to enter the weighing chamber to access water, and an infrared detector confirmed the rat position with feet properly positioned on the footplates, triggering data collection. This allowed hands-off measurement of weight distribution data reducing operator selection bias. The VASIC device should enhance the hands-free parallel collection of unbiased weight-bearing data in a high throughput manner, allowing further testing of this behavioral measure as an effective assessment of pain in rodents. Copyright © 2015. Published by Elsevier Inc.

  2. Automated assessment of pain in rats using a voluntarily accessed static weight-bearing test

    PubMed Central

    Kim, Hung Tae; Uchimoto, Kazuhiro; Duellman, Tyler; Yang, Jay

    2015-01-01

    The weight-bearing test is one method to assess pain in rodent animal models; however, the acceptance of this convenient method is limited by the low throughput data acquisition and necessity of confining the rodents to a small chamber. New methods: We developed novel data acquisition hardware and software, data analysis software, and a conditioning protocol for an automated high throughput static weight-bearing assessment of pain. With this device, the rats voluntarily enter the weighing chamber, precluding the necessity to restrain the animals and thereby removing the potential stress-induced confounds as well as operator selection bias during data collection. We name this device the Voluntarily Accessed Static Incapacitance Chamber (VASIC). Results: Control rats subjected to the VASIC device provided hundreds of weight-bearing data points in a single behavioral assay. Chronic constriction injury (CCI) surgery and paw pad injection of complete Freund's adjuvant (CFA) or carrageenan in rats generated hundreds of weight-bearing data during a 30 minute recording session. Rats subjected to CCI, CFA, or carrageenan demonstrated the expected bias in weight distribution favoring the un-operated leg, and the analgesic effect of i.p. morphine was demonstrated. In comparison with existing methods, brief water restriction encouraged the rats to enter the weighing chamber to access water, and an infrared detector confirmed the rat position with feet properly positioned on the footplates, triggering data collection. This allowed hands-off measurement of weight distribution data reducing operator selection bias. Conclusion: The VASIC device should enhance the hands-free parallel collection of unbiased weight-bearing data in a high throughput manner, allowing further testing of this behavioral measure as an effective assessment of pain in rodents. PMID:26143745

  3. Project Report: Automatic Sequence Processor Software Analysis

    NASA Technical Reports Server (NTRS)

    Benjamin, Brandon

    2011-01-01

    The Mission Planning and Sequencing (MPS) element of Multi-Mission Ground System and Services (MGSS) provides space missions with multi-purpose software to plan spacecraft activities, sequence spacecraft commands, and then integrate these products and execute them on spacecraft. The Jet Propulsion Laboratory (JPL) is currently flying many missions. The processes for building, integrating, and testing the multi-mission uplink software need to be improved to meet the needs of the missions and the operations teams that command the spacecraft. The Multi-Mission Sequencing Team is responsible for collecting and processing the observations, experiments, and engineering activities that are to be performed on a selected spacecraft. The collection of these activities is called a sequence, and ultimately a sequence becomes a sequence of spacecraft commands. The operations teams check the sequence to make sure that no constraints are violated. The workflow process involves sending a program start command, which activates the Automatic Sequence Processor (ASP). The ASP is currently a file-based system composed of scripts written in Perl, C shell, and awk. Once this start process is complete, the system checks for errors and aborts if there are any; otherwise the system converts the commands to binary and then sends the resultant information to be radiated to the spacecraft.

  4. Metric analysis and data validation across FORTRAN projects

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Selby, Richard W., Jr.; Phillips, Tsai-Yun

    1983-01-01

    The desire to predict the effort in developing software, or to explain its quality, has led to the proposal of several metrics. As a step toward validating these metrics, the Software Engineering Laboratory (SEL) has analyzed the software science metrics, cyclomatic complexity, and various standard program measures for their relation to effort (including design through acceptance testing), to development errors (both discrete and weighted according to the amount of time to locate and fix), and to one another. The data investigated were collected from a FORTRAN project environment and examined across several projects at once, within individual projects, and with reporting accuracy checks demonstrating the need to validate a database. When the data come from individual programmers or certain validated projects, the metrics' correlations with actual effort seem to be strongest. For modules developed entirely by individual programmers, the validity ratios induce a statistically significant ordering of several of the metrics' correlations. When comparing the strongest correlations, neither software science's E metric, cyclomatic complexity, nor source lines of code appears to relate convincingly better with effort than the others.

  5. LabVIEW control software for scanning micro-beam X-ray fluorescence spectrometer.

    PubMed

    Wrobel, Pawel; Czyzycki, Mateusz; Furman, Leszek; Kolasinski, Krzysztof; Lankosz, Marek; Mrenca, Alina; Samek, Lucyna; Wegrzynek, Dariusz

    2012-05-15

    A confocal micro-beam X-ray fluorescence microscope was constructed. The system was assembled from commercially available components (a low-power X-ray tube source, polycapillary X-ray optics, and a silicon drift detector) controlled by in-house developed LabVIEW software. A video camera coupled to an optical microscope was utilized to display the area excited by the X-ray beam. The camera image calibration and scan area definition software were also based entirely on LabVIEW code. Presently, the main area of application of the newly constructed spectrometer is 2-dimensional mapping of element distribution in environmental, biological, and geological samples with micrometer spatial resolution. The hardware and the developed software can already handle volumetric 3-D confocal scans. In this work, the front panel graphical user interface as well as the communication protocols between hardware components are described. Two applications of the spectrometer, to homogeneity testing of titanium layers and to imaging of various types of grains in air particulate matter collected on membrane filters, are presented. Copyright © 2012 Elsevier B.V. All rights reserved.

  6. Factors That Affect Software Testability

    NASA Technical Reports Server (NTRS)

    Voas, Jeffrey M.

    1991-01-01

    Software faults that infrequently affect software's output are dangerous. When a software fault causes frequent software failures, testing is likely to reveal the fault before the software is released; when the fault remains undetected during testing, it can cause disaster after the software is installed. A technique for predicting whether a particular piece of software is likely to reveal faults within itself during testing is found in [Voas91b]. A piece of software that is likely to reveal faults within itself during testing is said to have high testability; a piece of software that is not is said to have low testability. It is preferable to design software with higher testability from the outset, i.e., to create software with as high a degree of testability as possible, in order to avoid the problems of undetected faults that are associated with low testability. Information loss is a phenomenon that occurs during program execution that increases the likelihood that a fault will remain undetected. In this paper, I identify two broad classes of information loss, define them, and suggest ways of predicting the potential for information loss to occur, in order to decrease the likelihood that faults will remain undetected during testing.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jamieson, Kevin; Davis, IV, Warren L.

    Active learning methods automatically adapt data collection by selecting the most informative samples in order to accelerate machine learning. Because of this, real-world testing and comparing of active learning algorithms requires collecting new datasets (adaptively), rather than simply applying algorithms to benchmark datasets, as is the norm in (passive) machine learning research. To facilitate the development, testing, and deployment of active learning for real applications, we have built an open-source software system for large-scale active learning research and experimentation. The system, called NEXT, provides a unique platform for real-world, reproducible active learning research. This paper details the challenges of building the system and demonstrates its capabilities with several experiments. The results show how experimentation can help expose strengths and weaknesses of active learning algorithms, in sometimes unexpected and enlightening ways.
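
    NEXT itself is a large client-server system; as a toy illustration of the kind of adaptive sampling it hosts, uncertainty sampling picks the unlabeled example the current model is least sure about. The sketch below uses scikit-learn on synthetic data and is not NEXT's code:

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.linear_model import LogisticRegression

        X, y = make_classification(n_samples=200, random_state=0)
        labeled = list(range(10))                  # start with 10 labeled examples
        pool = [i for i in range(200) if i not in labeled]

        for _ in range(20):                        # 20 rounds of active learning
            model = LogisticRegression(max_iter=1000).fit(X[labeled], y[labeled])
            probs = model.predict_proba(X[pool])[:, 1]
            idx = pool[int(np.argmin(np.abs(probs - 0.5)))]   # most uncertain
            labeled.append(idx)                    # "query the oracle" for a label
            pool.remove(idx)

        print("accuracy on the remaining pool:", model.score(X[pool], y[pool]))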

  8. A standardized test battery for the study of synesthesia

    PubMed Central

    Eagleman, David M.; Kagan, Arielle D.; Nelson, Stephanie S.; Sagaram, Deepak; Sarma, Anand K.

    2014-01-01

    Synesthesia is an unusual condition in which stimulation of one modality evokes sensation or experience in another modality. Although discussed in the literature well over a century ago, synesthesia slipped out of the scientific spotlight for decades because of the difficulty in verifying and quantifying private perceptual experiences. In recent years, the study of synesthesia has enjoyed a renaissance due to the introduction of tests that demonstrate the reality of the condition, its automatic and involuntary nature, and its measurable perceptual consequences. However, while several research groups now study synesthesia, there is no single protocol for comparing, contrasting and pooling synesthetic subjects across these groups. There is no standard battery of tests, no quantifiable scoring system, and no standard phrasing of questions. Additionally, the tests that exist offer no means for data comparison. To remedy this deficit we have devised the Synesthesia Battery. This unified collection of tests is freely accessible online (http://www.synesthete.org). It consists of a questionnaire and several online software programs, and test results are immediately available for use by synesthetes and invited researchers. Performance on the tests is quantified with a standard scoring system. We introduce several novel tests here, and offer the software for running the tests. By presenting standardized procedures for testing and comparing subjects, this endeavor hopes to speed scientific progress in synesthesia research. PMID:16919755

  9. A methodology for testing fault-tolerant software

    NASA Technical Reports Server (NTRS)

    Andrews, D. M.; Mahmood, A.; Mccluskey, E. J.

    1985-01-01

    A methodology for testing fault-tolerant software is presented. There are problems associated with testing fault-tolerant software because many errors are masked or corrected by voters, limiters, or automatic channel synchronization. This methodology illustrates how the same strategies used for testing fault-tolerant hardware can be applied to testing fault-tolerant software. For example, one strategy used in testing fault-tolerant hardware is to disable the redundancy during testing. A similar testing strategy is proposed for software, namely, to move the major emphasis of testing earlier in the development cycle (before the redundancy is in place), thus reducing the possibility that undetected errors will be masked when limiters and voters are added.
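
    A minimal sketch of the hardware-inspired strategy mentioned above (hypothetical code, not the paper's): a majority voter whose redundancy can be disabled during testing, so that single-channel errors are exposed rather than silently out-voted:

        from collections import Counter

        def vote(values, redundancy_enabled=True):
            """Majority-vote redundant channel outputs.
            With redundancy disabled (test mode), any disagreement raises
            instead of being masked by the majority."""
            if not redundancy_enabled and len(set(values)) > 1:
                raise AssertionError(f"channel disagreement exposed: {values}")
            return Counter(values).most_common(1)[0][0]

        print(vote([5, 5, 7]))                       # masked: returns 5
        # vote([5, 5, 7], redundancy_enabled=False)  # test mode: raises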

  10. A high order approach to flight software development and testing

    NASA Technical Reports Server (NTRS)

    Steinbacher, J.

    1981-01-01

    The use of a software development facility is discussed as a means of producing a reliable and maintainable ECS software system, and as a means of providing efficient use of the ECS hardware test facility. Principles applied to software design are given, including modularity, abstraction, hiding, and uniformity. The general objectives of each phase of the software life cycle are also given, including testing, maintenance, code development, and requirement specifications. Software development facility tools are summarized, and tool deficiencies recognized in the code development and testing phases are considered. Due to limited lab resources, the functional simulation capabilities may be indispensable in the testing phase.

  11. A Framework for Testing Scientific Software: A Case Study of Testing Amsterdam Discrete Dipole Approximation Software

    NASA Astrophysics Data System (ADS)

    Shao, Hongbing

    Software testing of scientific software systems often suffers from the test oracle problem, i.e., the lack of test oracles. The Amsterdam discrete dipole approximation code (ADDA) is a scientific software system that can be used to simulate light scattering by scatterers of various types, and its testing suffers from the test oracle problem. In this thesis work, I established a framework for testing scientific software systems and evaluated this framework using ADDA as a case study. To test ADDA, I first used the CMMIE code as a pseudo oracle to test ADDA in simulating light scattering by a homogeneous sphere scatterer. Comparable results were obtained between ADDA and the CMMIE code, validating ADDA for use with homogeneous sphere scatterers. Then I used experimental results obtained for light scattering by a homogeneous sphere to further validate the use of ADDA with sphere scatterers. ADDA produced light scattering simulations comparable to the experimentally measured results. Then I used metamorphic testing to generate test cases covering scatterers of various geometries, orientations, and homogeneity or non-homogeneity. ADDA was tested under each of these test cases and all tests passed. The use of statistical analysis together with metamorphic testing is discussed as a future direction. In short, using ADDA as a case study, I established a testing framework, including the use of pseudo oracles, experimental results, and metamorphic testing techniques, to test scientific software systems that suffer from test oracle problems. Each of these techniques is necessary and contributes to the testing of the software under test.
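
    Metamorphic testing checks relations between the outputs of related runs instead of comparing against a known-correct answer. The sketch below shows the pattern with a stand-in simulate() function and an invented relation (doubling incident intensity doubles scattered intensity); ADDA's real metamorphic relations are domain-specific and not reproduced here:

        import math

        def simulate(intensity, angle):
            # Stand-in for a scattering simulation whose output scales
            # linearly with incident intensity (the property the relation uses).
            return intensity * (1 + math.cos(angle) ** 2)

        def test_metamorphic_linearity():
            """Metamorphic relation: doubling the input intensity should
            double the output, even though neither output is known a priori."""
            for angle in [0.0, 0.5, 1.0, 2.0]:
                base = simulate(1.0, angle)
                follow_up = simulate(2.0, angle)
                assert math.isclose(follow_up, 2.0 * base, rel_tol=1e-9)

        test_metamorphic_linearity()
        print("metamorphic relation holds for all test angles")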

  12. Operational Based Vision Assessment Cone Contrast Test: Description and Operation

    DTIC Science & Technology

    2016-06-01

    designed to detect abnormalities and characterize the contrast sensitivity of the color mechanisms of the human visual system. The OBVA CCT will...than 1, the individual is determined to have an abnormal L-M mechanism. The L-M sensitivity of mildly abnormal individuals (anomalous trichromats...response pads. This hardware is integrated with custom software that generates the stimuli, collects responses, and analyzes the results as outlined in

  13. Measurement of Visual Reaction Times Using Hand-held Mobile Devices

    NASA Technical Reports Server (NTRS)

    Mulligan, Jeffrey B.; Arsintescu, Lucia; Flynn-Evans, Erin

    2015-01-01

    Modern mobile devices provide a convenient platform for collecting research data in the field. But, because the working of these devices is often cloaked behind multiple layers of proprietary system software, it can be difficult to assess the accuracy of the data they produce, particularly in the case of timing. We have been collecting data in a simple visual reaction time experiment, as part of a fatigue testing protocol known as the Psychomotor Vigilance Test (PVT). In this protocol, subjects run a 5-minute block consisting of a sequence of trials in which a visual stimulus appears after an unpredictable variable delay. The subject is required to tap the screen as soon as possible after the appearance of the stimulus. In order to validate the reaction times reported by our program, we had subjects perform the task while a high-speed video camera recorded both the display screen and a side view of the finger (observed in a mirror). Simple image-processing methods were applied to determine the frames in which the stimulus appeared and disappeared, and in which the finger made and broke contact with the screen. The results demonstrate a systematic delay between the initial contact by the finger and the detection of the touch by the software, having a value of 80 ± 20 milliseconds.
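
    If such a systematic delay is stable, it can simply be subtracted from the reported values; a minimal sketch (the 80 ms offset is the figure reported above, the sample data are invented):

    ```python
    # Correct device-reported reaction times for the systematic
    # touch-detection delay measured with the high-speed camera.
    TOUCH_DETECTION_DELAY_MS = 80.0   # reported above as 80 +/- 20 ms

    def corrected_reaction_times(reported_ms):
        return [rt - TOUCH_DETECTION_DELAY_MS for rt in reported_ms]

    reported = [312.0, 295.5, 401.2]           # invented RTs in milliseconds
    print(corrected_reaction_times(reported))  # [232.0, 215.5, 321.2]
    ```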

  14. TriBITS lifecycle model. Version 1.0, a lean/agile software lifecycle model for research-based computational science and engineering and applied mathematical software.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Willenbring, James M.; Bartlett, Roscoe Ainsworth; Heroux, Michael Allen

    2012-01-01

    Software lifecycles are becoming an increasingly important issue for computational science and engineering (CSE) software. The process by which a piece of CSE software begins life as a set of research requirements and then matures into a trusted high-quality capability is both commonplace and extremely challenging. Although an implicit lifecycle is obviously being used in any effort, the challenges of this process - respecting the competing needs of research vs. production - cannot be overstated. Here we describe a proposal for a well-defined software lifecycle process based on modern Lean/Agile software engineering principles. What we propose is appropriate for many CSE software projects that are initially heavily focused on research but also are expected to eventually produce usable high-quality capabilities. The model is related to TriBITS, a build, integration and testing system, which serves as a strong foundation for this lifecycle model, and aspects of this lifecycle model are ingrained in the TriBITS system. Here, we advocate three to four phases or maturity levels that address the appropriate handling of many issues associated with the transition from research to production software. The goals of this lifecycle model are to better communicate maturity levels with customers and to help to identify and promote Software Engineering (SE) practices that will help to improve productivity and produce better software. An important collection of software in this domain is Trilinos, which is used as the motivation and the initial target for this lifecycle model. However, many other related and similar CSE (and non-CSE) software projects can also make good use of this lifecycle model, especially those that use the TriBITS system. Indeed this lifecycle process, if followed, will enable large-scale sustainable integration of many complex CSE software efforts across several institutions.

  15. Software to Go--And It Goes!

    ERIC Educational Resources Information Center

    Abrams, Mary; Kurlychek, Ken

    1989-01-01

    This article describes the Software Evaluation Clearinghouse for Educators of the Hearing Impaired at Gallaudet University (Washington, DC). Software compatible with Apple and IBM hardware is collected, rated by clearinghouse members, and described in a printed catalog. Tips on starting a software lending library are offered. (PB)

  16. APPLICATION OF SOFTWARE QUALITY ASSURANCE CONCEPTS AND PROCEDURES TO ENVIRONMENTAL RESEARCH INVOLVING SOFTWARE DEVELOPMENT

    EPA Science Inventory

    As EPA’s environmental research expands into new areas that involve the development of software, quality assurance concepts and procedures that were originally developed for environmental data collection may not be appropriate. Fortunately, software quality assurance is a ...

  17. The software product assurance metrics study: JPL's software systems quality and productivity

    NASA Technical Reports Server (NTRS)

    Bush, Marilyn W.

    1989-01-01

    The findings are reported of the Jet Propulsion Laboratory (JPL)/Software Product Assurance (SPA) Metrics Study, conducted as part of a larger JPL effort to improve software quality and productivity. Until recently, no comprehensive data had been assembled on how JPL manages and develops software-intensive systems. The first objective was to collect data on software development from as many projects and for as many years as possible. Results from five projects are discussed. These results reflect 15 years of JPL software development, representing over 100 data points (systems and subsystems), over a third of a billion dollars, over four million lines of code and 28,000 person months. Analysis of this data provides a benchmark for gauging the effectiveness of past, present and future software development work. In addition, the study is meant to encourage projects to record existing metrics data and to gather future data. The SPA long term goal is to integrate the collection of historical data and ongoing project data with future project estimations.

  18. Sociotechnical Human Factors Involved in Remote Online Usability Testing of Two eHealth Interventions.

    PubMed

    Wozney, Lori M; Baxter, Pamela; Fast, Hilary; Cleghorn, Laura; Hundert, Amos S; Newton, Amanda S

    2016-02-03

    Research in the fields of human performance technology and human-computer interaction is challenging the traditional macro focus of usability testing, arguing for methods that help test moderators assess "use in context" (ie, cognitive skills, usability understood over time) in authentic "real world" settings. Human factors in these complex test scenarios may impact the quality of the usability results being derived, yet there is a lack of research detailing moderator experiences in these test environments. Most comparative research has focused on the impact of the physical environment on results, and rarely on how the sociotechnical elements of the test environment affect moderator and test user performance. Improving our understanding of moderator roles and experiences with conducting "real world" usability testing can lead to improved techniques and strategies. The objective of this study was to understand moderator experiences of using Web-conferencing software to conduct remote usability testing of 2 eHealth interventions. An exploratory case study approach was used to study 4 moderators' experiences using Blackboard Collaborate for remote testing sessions of 2 different eHealth interventions. Data collection involved audio-recording iterative cycles of test sessions, collecting summary notes taken by moderators, and conducting 2 90-minute focus groups via teleconference. A directed content analysis with an inductive coding approach was used to explore personal accounts, assess the credibility of data interpretation, and generate consensus on the thematic structure of the results. Following the convergence of data from the various sources, 3 major themes were identified: (1) moderators experienced and adapted to unpredictable changes in cognitive load during testing; (2) moderators experienced challenges in creating and sustaining social presence and untangling dialogue; and (3) moderators experienced diverse technical demands, but were able to collaboratively troubleshoot with test users. The results highlight important human-computer interactions and human factor qualities that impact usability testing processes. Moderators need an advanced skill and knowledge set to address the social interaction aspects of Web-based usability testing and the technical aspects of conferencing software during test sessions. Findings from moderator-focused studies can inform the design of remote testing platforms and real-time usability evaluation processes that place less cognitive burden on moderators and test users.

  19. GROVER: An autonomous vehicle for ice sheet research

    NASA Astrophysics Data System (ADS)

    Trisca, G. O.; Robertson, M. E.; Marshall, H.; Koenig, L.; Comberiate, M. A.

    2013-12-01

    The Goddard Remotely Operated Vehicle for Exploration and Research or Greenland Rover (GROVER) is a science enabling autonomous robot specifically designed to carry a low-power, large bandwidth radar for snow accumulation mapping over the Greenland Ice Sheet. This new and evolving technology enables reduced cost and increased safety for polar research. GROVER was field tested at Summit, Greenland in May 2013. The robot traveled over 30 km and was controlled both by line of sight wireless and completely autonomously with commands and telemetry via the Iridium Satellite Network, from Summit as well as remotely from Boise, Idaho. Here we describe GROVER's unique abilities and design. The software stack features a modular design that can be adapted for any application that requires autonomous behavior, reliable communications using different technologies and low level control of peripherals. The modules are built to communicate using the publisher-subscriber design pattern to maximize data-reuse and allow for graceful failures at the software level, along with the ability to be loaded or unloaded on-the-fly, enabling the software to adopt different behaviors based on power constraints or specific processing needs. These modules can also be loaded or unloaded remotely for servicing and telemetry can be configured to contain any kind of information being generated by the sensors or scientific instruments. The hardware design protects the electronic components and the control system can change functional parameters based on sensor input. Power failure modes built into the hardware prevent the vehicle from running out of energy permanently by monitoring voltage levels and triggering software reboots when the levels match pre-established conditions. This guarantees that the control software will be operational as soon as there is enough charge to sustain it, giving the vehicle increased longevity in case of a temporary power loss. GROVER demonstrates that autonomous rovers can be a revolutionary tool for data collection, and that both the technology and the software are available and ready to be implemented to create scientific data collection platforms.
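
    A minimal sketch of the publisher-subscriber pattern the GROVER modules are described as using (topic names and handlers are hypothetical, not the rover's actual software):

    ```python
    from collections import defaultdict

    class MessageBus:
        """Toy publisher-subscriber bus: modules exchange data by topic
        instead of calling each other directly, so a failed or unloaded
        module degrades gracefully rather than breaking its peers."""

        def __init__(self):
            self._subscribers = defaultdict(list)

        def subscribe(self, topic, callback):
            self._subscribers[topic].append(callback)

        def publish(self, topic, message):
            for callback in self._subscribers[topic]:
                try:
                    callback(message)
                except Exception as exc:  # one bad subscriber must not stop the rest
                    print(f"subscriber error on {topic!r}: {exc}")

    bus = MessageBus()
    bus.subscribe("power/voltage", lambda v: print(f"telemetry: {v:.2f} V"))
    bus.subscribe("power/voltage", lambda v: v < 11.0 and print("low voltage: scheduling reboot"))
    bus.publish("power/voltage", 10.5)
    ```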

  20. Annotated bibliography of Software Engineering Laboratory literature

    NASA Technical Reports Server (NTRS)

    Morusiewicz, Linda; Valett, Jon D.

    1991-01-01

    An annotated bibliography of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory is given. More than 100 publications are summarized. These publications cover many areas of software engineering and range from research reports to software documentation. All materials have been grouped into eight general subject areas for easy reference: The Software Engineering Laboratory; The Software Engineering Laboratory: Software Development Documents; Software Tools; Software Models; Software Measurement; Technology Evaluations; Ada Technology; and Data Collection. Subject and author indexes further classify these documents by specific topic and individual author.

  1. Evaluation of software tools for automated identification of neuroanatomical structures in quantitative β-amyloid PET imaging to diagnose Alzheimer's disease.

    PubMed

    Tuszynski, Tobias; Rullmann, Michael; Luthardt, Julia; Butzke, Daniel; Tiepolt, Solveig; Gertz, Hermann-Josef; Hesse, Swen; Seese, Anita; Lobsien, Donald; Sabri, Osama; Barthel, Henryk

    2016-06-01

    For regional quantification of nuclear brain imaging data, defining volumes of interest (VOIs) by hand is still the gold standard. As this procedure is time-consuming and operator-dependent, a variety of software tools for automated identification of neuroanatomical structures were developed. As the quality and performance of these tools in analyzing amyloid PET data have so far been poorly investigated, we compared in this project four algorithms for automated VOI definition (HERMES Brass, two PMOD approaches, and FreeSurfer) against the conventional method. We systematically analyzed florbetaben brain PET and MRI data of ten patients with probable Alzheimer's dementia (AD) and ten age-matched healthy controls (HCs) collected in a previous clinical study. VOIs were manually defined on the data as well as through the four automated workflows. Standardized uptake value ratios (SUVRs) with the cerebellar cortex as a reference region were obtained for each VOI. SUVR comparisons between ADs and HCs were carried out using Mann-Whitney U tests, and effect sizes (Cohen's d) were calculated. SUVRs of automatically generated VOIs were correlated with SUVRs of conventionally derived VOIs (Pearson's correlation tests). The composite neocortex SUVRs obtained by manually defined VOIs were significantly higher for ADs vs. HCs (p=0.010, d=1.53). This was also the case for the four tested automated approaches, which achieved effect sizes of d=1.38 to d=1.62. SUVRs of automatically generated VOIs correlated significantly with those of the hand-drawn VOIs in a number of brain regions, with regional differences in the degree of these correlations. The best overall correlation was observed in the lateral temporal VOI for all tested software tools (r=0.82 to r=0.95, p<0.001). Automated VOI definition by the tested software tools has great potential to replace the current standard procedure of manually defining VOIs in β-amyloid PET data analysis.
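
    For readers unfamiliar with the two quantities, a minimal sketch of how an SUVR and Cohen's d are computed (the numbers are invented, not the study's data):

    ```python
    import statistics

    def suvr(region_mean_uptake, cerebellar_cortex_mean):
        """Standardized uptake value ratio, cerebellar cortex as reference."""
        return region_mean_uptake / cerebellar_cortex_mean

    def cohens_d(group_a, group_b):
        """Effect size: mean difference divided by the pooled standard deviation."""
        na, nb = len(group_a), len(group_b)
        va, vb = statistics.variance(group_a), statistics.variance(group_b)
        pooled_sd = (((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)) ** 0.5
        return (statistics.mean(group_a) - statistics.mean(group_b)) / pooled_sd

    print(suvr(1.45, 0.92))                  # one region's SUVR
    ad_suvrs = [1.62, 1.55, 1.71, 1.48]      # invented composite-neocortex SUVRs
    hc_suvrs = [1.12, 1.20, 1.05, 1.18]
    print(cohens_d(ad_suvrs, hc_suvrs))
    ```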

  2. Using Automation to Improve the Flight Software Testing Process

    NASA Technical Reports Server (NTRS)

    ODonnell, James R., Jr.; Andrews, Stephen F.; Morgenstern, Wendy M.; Bartholomew, Maureen O.; McComas, David C.; Bauer, Frank H. (Technical Monitor)

    2001-01-01

    One of the critical phases in the development of a spacecraft attitude control system (ACS) is the testing of its flight software. The testing (and test verification) of ACS flight software requires a mix of skills involving software, attitude control, data manipulation, and analysis. The process of analyzing and verifying flight software test results often creates a bottleneck which dictates the speed at which flight software verification can be conducted. In the development of the Microwave Anisotropy Probe (MAP) spacecraft ACS subsystem, an integrated design environment was used that included a MAP high fidelity (HiFi) simulation, a central database of spacecraft parameters, a script language for numeric and string processing, and plotting capability. In this integrated environment, it was possible to automate many of the steps involved in flight software testing, making the entire process more efficient and thorough than on previous missions. In this paper, we will compare the testing process used on MAP to that used on previous missions. The software tools that were developed to automate testing and test verification will be discussed, including the ability to import and process test data, synchronize test data and automatically generate HiFi script files used for test verification, and an automated capability for generating comparison plots. A summary of the perceived benefits of applying these test methods on MAP will be given. Finally, the paper will conclude with a discussion of re-use of the tools and techniques presented, and the ongoing effort to apply them to flight software testing of the Triana spacecraft ACS subsystem.
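
    A minimal sketch of the kind of automated pass/fail comparison such tooling performs (the signal, sample values, and tolerance are hypothetical; MAP's actual tools were built around its own simulation, database, and script language):

    ```python
    def compare_runs(flight_data, sim_data, tolerance):
        """Flag samples where flight-software test telemetry and the
        high-fidelity simulation disagree by more than the tolerance."""
        failures = []
        for t, flight_value in sorted(flight_data.items()):
            sim_value = sim_data.get(t)
            if sim_value is None or abs(flight_value - sim_value) > tolerance:
                failures.append((t, flight_value, sim_value))
        return failures

    flight = {0.0: 0.010, 0.1: 0.012, 0.2: 0.030}      # e.g., a body rate in rad/s
    sim    = {0.0: 0.010, 0.1: 0.013, 0.2: 0.015}
    print(compare_runs(flight, sim, tolerance=0.005))  # flags the 0.2 s sample
    ```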

  3. Using Automation to Improve the Flight Software Testing Process

    NASA Technical Reports Server (NTRS)

    ODonnell, James R., Jr.; Morgenstern, Wendy M.; Bartholomew, Maureen O.

    2001-01-01

    One of the critical phases in the development of a spacecraft attitude control system (ACS) is the testing of its flight software. The testing (and test verification) of ACS flight software requires a mix of skills involving software, knowledge of attitude control and attitude control hardware, data manipulation, and analysis. The process of analyzing and verifying flight software test results often creates a bottleneck which dictates the speed at which flight software verification can be conducted. In the development of the Microwave Anisotropy Probe (MAP) spacecraft ACS subsystem, an integrated design environment was used that included a MAP high fidelity (HiFi) simulation, a central database of spacecraft parameters, a script language for numeric and string processing, and plotting capability. In this integrated environment, it was possible to automate many of the steps involved in flight software testing, making the entire process more efficient and thorough than on previous missions. In this paper, we will compare the testing process used on MAP to that used on other missions. The software tools that were developed to automate testing and test verification will be discussed, including the ability to import and process test data, synchronize test data and automatically generate HiFi script files used for test verification, and an automated capability for generating comparison plots. A summary of the benefits of applying these test methods on MAP will be given. Finally, the paper will conclude with a discussion of re-use of the tools and techniques presented, and the ongoing effort to apply them to flight software testing of the Triana spacecraft ACS subsystem.

  4. Software and package applicating for network meta-analysis: A usage-based comparative study.

    PubMed

    Xu, Chang; Niu, Yuming; Wu, Junyi; Gu, Huiyun; Zhang, Chao

    2017-12-21

    To compare and analyze the characteristics and functions of software applications for network meta-analysis (NMA). PubMed, EMbase, The Cochrane Library, the official websites of Bayesian inference Using Gibbs Sampling (BUGS), Stata and R, and Google were searched to collect software and packages for performing NMA; software and packages published up to March 2016 were included. After collecting the software, packages, and their user guides, we used the software and packages to calculate a typical example. All characteristics, functions, and computed results were compared and analyzed. Ten types of software were included, covering both programming and non-programming software, developed mainly on Bayesian or frequentist theory. Most types of software are easy to operate and master, calculate accurately, or produce excellent graphics. However, no single type of software combined accurate calculation with superior graphing; this could only be achieved through the combination of two or more types of software. This study suggests that users should choose the appropriate software according to their programming background, operating habits, and budget; a combination of BUGS and R (or Stata) can then be considered for performing the NMA. © 2017 Chinese Cochrane Center, West China Hospital of Sichuan University and John Wiley & Sons Australia, Ltd.

  5. Results of the second flight test of the Loran-C receiver/data collection system

    NASA Technical Reports Server (NTRS)

    Fischer, J. P.

    1979-01-01

    The components of the Loran-C navigation system developed thus far are a phase-locked-loop receiver and a microcomputer development system. The microcomputer is being used as a means of testing and implementing software to handle sensor control and navigation calculations. Currently, the microcomputer is being used to collect and record data from the receiver in addition to development work. With these components, it was possible to record receiver data over a period of time and then reduce the data to obtain statistical information. It was particularly interesting to load the equipment developed in the laboratory into an aircraft and collect data while in flight. For the initial flight tests, some important considerations were how well the entire system would perform in the field, signal strength levels while on the ground and in the air, the amount of noise present, the change in signal-to-noise ratio for various aircraft configurations and maneuvers, receiver overloading due to other equipment and antennas, and the overall usefulness of Loran-C as a navigation aid.

  6. Space shuttle orbiter avionics software: Post review report for the entry FACI (First Article Configuration Inspection). [including orbital flight tests integrated system

    NASA Technical Reports Server (NTRS)

    Markos, H.

    1978-01-01

    Status of the computer programs dealing with space shuttle orbiter avionics is reported. Specific topics covered include: delivery status; SSW software; SM software; DL software; GNC software; level 3/4 testing; level 5 testing; performance analysis, SDL readiness for entry first article configuration inspection; and verification assessment.

  7. Software To Go: A Catalog of Software Available for Loan.

    ERIC Educational Resources Information Center

    Kurlychek, Ken, Comp.

    This catalog lists the holdings of the Software To Go software lending library and clearinghouse for programs and agencies serving students or clients who are deaf or hard of hearing. An introduction describes the clearinghouse and its collection of software, much of it commercial and copyrighted material, for Apple, Macintosh, and IBM (MS-DOS)…

  8. A Compatible Hardware/Software Reliability Prediction Model.

    DTIC Science & Technology

    1981-07-22

    machines. In particular, he was interested in the following problem: assume that one has a collection of connected elements computing and transmitting...software reliability prediction model is desirable, the findings about the Weibull distribution are intriguing. After collecting failure data from several...capacitor, some of the added charge carriers are collected by the capacitor. If the added charge is sufficiently large, the information stored is changed

  9. Pyrolaser Operating System

    NASA Technical Reports Server (NTRS)

    Roberts, Floyd E., III

    1994-01-01

    Software provides for control and acquisition of data from an optical pyrometer. There are six individual programs in the PYROLASER package. The package provides a quick and easy way to set up, control, and program a standard Pyrolaser. Temperature and emissivity measurements are either collected as if the Pyrolaser were in manual operating mode or displayed on real-time strip charts and stored in standard spreadsheet format for posttest analysis. A shell is supplied to allow test-specific macros to be added to the system easily. Written using LabVIEW software for use on Macintosh-series computers running System 6.0.3 or later, Sun Sparc-series computers running OpenWindows 3.0 or MIT's X Window System (X11R4 or X11R5), and IBM PC or compatible computers running Microsoft Windows 3.1 or later.

  10. An Integrated Unix-based CAD System for the Design and Testing of Custom VLSI Chips

    NASA Technical Reports Server (NTRS)

    Deutsch, L. J.

    1985-01-01

    A computer aided design (CAD) system that is being used at the Jet Propulsion Laboratory for the design of custom and semicustom very large scale integrated (VLSI) chips is described. The system consists of a Digital Equipment Corporation VAX computer with the UNIX operating system and a collection of software tools for the layout, simulation, and verification of microcircuits. Most of these tools were written by the academic community and are, therefore, available to JPL at little or no cost. Some small pieces of software have been written in-house in order to make all the tools interact with each other with a minimal amount of effort on the part of the designer.

  11. Computer-assisted qualitative data analysis software.

    PubMed

    Cope, Diane G

    2014-05-01

    Advances in technology have provided new approaches for data collection methods and analysis for researchers. Data collection is no longer limited to paper-and-pencil format, and numerous methods are now available through Internet and electronic resources. With these techniques, researchers are not burdened with entering data manually and data analysis is facilitated by software programs. Quantitative research is supported by the use of computer software and provides ease in the management of large data sets and rapid analysis of numeric statistical methods. New technologies are emerging to support qualitative research with the availability of computer-assisted qualitative data analysis software (CAQDAS). CAQDAS will be presented with a discussion of advantages, limitations, controversial issues, and recommendations for this type of software use.

  12. Collected software engineering papers, volume 9

    NASA Technical Reports Server (NTRS)

    1991-01-01

    This document is a collection of selected technical papers produced by participants in the Software Engineering Laboratory (SEL) from November 1990 through October 1991. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. This is the ninth such volume of technical papers produced by the SEL. Although these papers cover several topics related to software engineering, they do not encompass the entire scope of SEL activities and interests. For the convenience of this presentation, the eight papers contained here are grouped into three major categories: (1) software models studies; (2) software measurement studies; and (3) Ada technology studies. The first category presents studies on reuse models, including a software reuse model applied to maintenance and a model for an organization to support software reuse. The second category includes experimental research methods and software measurement techniques. The third category presents object-oriented approaches using Ada and object-oriented features proposed for Ada. The SEL is actively working to understand and improve the software development process at GSFC.

  13. Common Data Acquisition Systems (DAS) Software Development for Rocket Propulsion Test (RPT) Test Facilities

    NASA Technical Reports Server (NTRS)

    Hebert, Phillip W., Sr.; Davis, Dawn M.; Turowski, Mark P.; Holladay, Wendy T.; Hughes, Mark S.

    2012-01-01

    The advent of the commercial space launch industry and NASA's more recent resumption of operation of Stennis Space Center's large test facilities after thirty years of contractor control resulted in a need for non-proprietary data acquisition system (DAS) software to support government and commercial testing. The software is designed for modularity and adaptability to minimize the software development effort for current and future data systems. An additional benefit of the software's architecture is its ability to easily migrate to other testing facilities, thus providing future commonality across Stennis. Adapting the software to other Rocket Propulsion Test (RPT) Centers such as MSFC, White Sands, and Plumbrook Station would provide additional commonality and help reduce testing costs for NASA. Ultimately, the software provides the government with unlimited rights and guarantees privacy of data to commercial entities. The project engaged all RPT Centers and NASA's Independent Verification & Validation facility to enhance product quality. The design consists of a translation layer, which provides transparency between the software application layers and the underlying hardware regardless of test facility location, and a flexible and easily accessible database. This presentation addresses the system's technical design, issues encountered, and the status of Stennis development and deployment.
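
    A minimal sketch of what such a translation layer looks like (hypothetical driver classes; not the Stennis implementation):

    ```python
    from abc import ABC, abstractmethod

    class Digitizer(ABC):
        """Translation layer: acquisition code sees one interface no matter
        which vendor's hardware a given test facility installs."""

        @abstractmethod
        def read_channel(self, channel: int) -> float: ...

    class VendorADigitizer(Digitizer):      # hypothetical driver
        def read_channel(self, channel: int) -> float:
            return 0.0                      # would call vendor A's library here

    class VendorBDigitizer(Digitizer):      # hypothetical driver
        def read_channel(self, channel: int) -> float:
            return 0.0                      # would call vendor B's library here

    def record_scan(device: Digitizer, channels):
        # Unchanged when the facility swaps hardware behind the interface.
        return {ch: device.read_channel(ch) for ch in channels}

    print(record_scan(VendorADigitizer(), channels=range(4)))
    ```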

  14. Using the Landlab toolkit to evaluate and compare alternative geomorphic and hydrologic model formulations

    NASA Astrophysics Data System (ADS)

    Tucker, G. E.; Adams, J. M.; Doty, S. G.; Gasparini, N. M.; Hill, M. C.; Hobley, D. E. J.; Hutton, E.; Istanbulluoglu, E.; Nudurupati, S. S.

    2016-12-01

    Developing a better understanding of catchment hydrology and geomorphology ideally involves quantitative hypothesis testing. Often one seeks to identify the simplest mathematical and/or computational model that accounts for the essential dynamics in the system of interest. Development of alternative hypotheses involves testing and comparing alternative formulations, but the process of comparison and evaluation is made challenging by the rigid nature of many computational models, which are often built around a single assumed set of equations. Here we review a software framework for two-dimensional computational modeling that facilitates the creation, testing, and comparison of surface-dynamics models. Landlab is essentially a Python-language software library. Its gridding module allows for easy generation of a structured (raster, hex) or unstructured (Voronoi-Delaunay) mesh, with the capability to attach data arrays to particular types of grid element. Landlab includes functions that implement common numerical operations, such as gradient calculation and summation of fluxes within grid cells. Landlab also includes a collection of process components, which are encapsulated pieces of software that implement a numerical calculation of a particular process. Examples include downslope flow routing over topography, shallow-water hydrodynamics, stream erosion, and sediment transport on hillslopes. Individual components share a common grid and data arrays, and they can be coupled through the use of a simple Python script. We illustrate Landlab's capabilities with a case study of Holocene landscape development in the northeastern US, in which we seek to identify a collection of model components that can account for the formation of a series of incised canyons that have developed since the Laurentide ice sheet last retreated. We compare sets of model ingredients related to (1) catchment hydrologic response, (2) hillslope evolution, and (3) stream channel and gully incision. The case-study example demonstrates the value of exploring multiple working hypotheses, in the form of multiple alternative model components.
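
    A minimal sketch of the coupling style described, assuming Landlab's documented grid and component interfaces (the grid size, parameter values, and component choices are illustrative only):

    ```python
    import numpy as np
    from landlab import RasterModelGrid
    from landlab.components import FlowAccumulator, FastscapeEroder, LinearDiffuser

    # Build a small raster grid and attach an elevation field to its nodes.
    grid = RasterModelGrid((40, 60), xy_spacing=100.0)
    z = grid.add_zeros("topographic__elevation", at="node")
    z += np.random.rand(z.size)            # small initial roughness

    # Components share the grid and its fields; comparing alternative model
    # formulations amounts to swapping one component for another.
    flow = FlowAccumulator(grid)
    incision = FastscapeEroder(grid, K_sp=1e-5)
    hillslopes = LinearDiffuser(grid, linear_diffusivity=0.01)

    dt = 1000.0                            # years
    for _ in range(500):
        z[grid.core_nodes] += 0.001 * dt   # uniform uplift, for illustration
        flow.run_one_step()
        incision.run_one_step(dt)
        hillslopes.run_one_step(dt)
    ```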

  15. SLS Flight Software Testing: Using a Modified Agile Software Testing Approach

    NASA Technical Reports Server (NTRS)

    Bolton, Albanie T.

    2016-01-01

    NASA's Space Launch System (SLS) is an advanced launch vehicle for a new era of exploration beyond Earth orbit (BEO). The world's most powerful rocket, SLS will launch crews of up to four astronauts in the agency's Orion spacecraft on missions to explore multiple deep-space destinations. Boeing is developing the SLS core stage, including the avionics that will control the vehicle during flight. The core stage will be built at NASA's Michoud Assembly Facility (MAF) in New Orleans, LA, using state-of-the-art manufacturing equipment. At the same time, the rocket's avionics computer software is being developed at Marshall Space Flight Center in Huntsville, AL. At Marshall, the Flight and Ground Software division provides comprehensive engineering expertise for development of flight and ground software. Within that division, the Software Systems Engineering Branch's test and verification (T&V) team uses an agile approach to the testing and verification of software. The agile test method opens the door for regular short sprint release cycles. The basic premise of agile software development and testing is that work proceeds iteratively and incrementally: requirements and solutions evolve through collaboration between cross-functional teams. With testing and development done incrementally, each release gains additional features and enhanced value. This value can be seen throughout the T&V team processes, which are documented in various work instructions within the branch. The T&V team produces procedural test results at a higher rate, resolves issues found in the software with designers at an earlier stage rather than at a later release, and gains increased knowledge of the system architecture by interfacing with designers. The SLS Flight Software teams want to continue uncovering better ways of developing software in an efficient, project-beneficial manner. Through agile testing, there has been increased value placed on individuals and interactions over processes and tools, improved customer collaboration, and improved responsiveness to change through controlled planning. The presentation will describe the agile testing methodology as practiced by the SLS FSW Test and Verification team at Marshall Space Flight Center.

  16. Overview of software development at the parabolic dish test site

    NASA Technical Reports Server (NTRS)

    Miyazono, C. K.

    1985-01-01

    The development history of the data acquisition and data analysis software is discussed. The software development occurred between 1978 and 1984 in support of solar energy module testing at the Jet Propulsion Laboratory's Parabolic Dish Test Site, located within Edwards Test Station. The development went through incremental stages, starting with a simple single-user BASIC set of programs and progressing to the relatively complex multi-user FORTRAN system that was used until the termination of the project. Additional software in support of testing is discussed, including software in support of a meteorological subsystem and the Test Bed Concentrator Control Console interface. Conclusions and recommendations for further development are discussed.

  17. Dynamic Weather Routes Architecture Overview

    NASA Technical Reports Server (NTRS)

    Eslami, Hassan; Eshow, Michelle

    2014-01-01

    Dynamic Weather Routes Architecture Overview presents the high-level software architecture of DWR, based on the CTAS software framework and the Direct-To automation tool. The document also covers external and internal data flows, the required datasets, changes to the Direct-To software for DWR, collection of software statistics, and the code structure.

  18. 75 FR 6185 - Information Collection Requirement; Defense Federal Acquisition Regulation Supplement; Rights in...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-08

    ...; Defense Federal Acquisition Regulation Supplement; Rights in Technical Data and Computer Software (OMB... 227.72, Rights in Computer Software and Computer Software Documentation, and related provisions and... rights in technical data and computer software. DoD needs this information to implement 10 U.S.C. 2320...

  19. 76 FR 80402 - Certain Personal Data and Mobile Communications Devices and Related Software; Final Determination...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-23

    ... Communications Devices and Related Software; Final Determination Finding Violation of Section 337; Issuance of a... importation of infringing personal data and mobile communications devices and related software. The Commission... subsidiary NeXT Software, Inc., both of Cupertino, California (collectively, ``Apple''), alleging a violation...

  20. 78 FR 30898 - Information Collection Requirement; Defense Federal Acquisition Regulation Supplement; Rights in...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-23

    ... Data and Computer Software AGENCY: Defense Acquisition Regulations System; Department of Defense (DoD... in Technical Data, and Subpart 227.72, Rights in Computer Software and Computer Software... are associated with rights in technical data and computer software. DoD needs this information to...

  1. Understanding the Perception of Very Small Software Companies towards the Adoption of Process Standards

    NASA Astrophysics Data System (ADS)

    Basri, Shuib; O'Connor, Rory V.

    This paper is concerned with understanding the issues that affect the adoption of software process standards by Very Small Entities (VSEs), their needs from process standards, and their willingness to engage with the new ISO/IEC 29110 standard in particular. In order to achieve this goal, a series of industry data collection studies was undertaken with a collection of VSEs. A twin-track approach of qualitative data collection (interviews and focus groups) and quantitative data collection (questionnaire) was undertaken. Data analysis was completed separately, and the final results were merged using the coding mechanisms of grounded theory. This paper serves as a roadmap both for researchers wishing to understand the issues of process standards adoption by very small companies and for the software process standards community.

  2. The Automated Instrumentation and Monitoring System (AIMS) reference manual

    NASA Technical Reports Server (NTRS)

    Yan, Jerry; Hontalas, Philip; Listgarten, Sherry

    1993-01-01

    Whether a researcher is designing the 'next parallel programming paradigm,' another 'scalable multiprocessor,' or investigating resource allocation algorithms for multiprocessors, a facility that enables parallel program execution to be captured and displayed is invaluable. Careful analysis of execution traces can help computer designers and software architects to uncover system behavior and to take advantage of specific application characteristics and hardware features. A software tool kit that facilitates performance evaluation of parallel applications on multiprocessors is described. The Automated Instrumentation and Monitoring System (AIMS) has four major software components: a source code instrumentor, which automatically inserts active event recorders into the program's source code before compilation; a run-time performance-monitoring library, which collects performance data; a trace file animation and analysis tool kit, which reconstructs program execution from the trace file; and a trace post-processor, which compensates for data collection overhead. Besides being used as a prototype for developing new techniques for instrumenting, monitoring, and visualizing parallel program execution, AIMS is also being incorporated into the run-time environments of various hardware test beds to evaluate their impact on user productivity. Currently, AIMS instrumentors accept FORTRAN and C parallel programs written for Intel's NX operating system on the iPSC family of multicomputers. A run-time performance-monitoring library for the iPSC/860 is included in this release. We plan to release monitors for other platforms (such as PVM and TMC's CM-5) in the near future. Performance data collected can be graphically displayed on workstations (e.g., Sun Sparc and SGI) supporting X Windows (in particular, X11R5 and Motif 1.1.3).
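
    A minimal sketch of the idea behind inserted event recorders and overhead compensation (generic Python, not the FORTRAN/C instrumentation AIMS actually performs):

    ```python
    import functools
    import time

    TRACE = []   # (event, routine, timestamp) records, as a monitoring library might keep

    def instrumented(fn):
        """Stand-in for what a source-code instrumentor inserts automatically:
        record entry and exit events around the original routine."""
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            TRACE.append(("enter", fn.__name__, time.perf_counter()))
            try:
                return fn(*args, **kwargs)
            finally:
                TRACE.append(("exit", fn.__name__, time.perf_counter()))
        return wrapper

    @instrumented
    def work():
        time.sleep(0.01)

    work()
    # A post-processor would subtract the separately measured cost of the
    # TRACE.append calls themselves, compensating for collection overhead.
    print(TRACE)
    ```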

  3. Designing Test Suites for Software Interactions Testing

    DTIC Science & Technology

    2004-01-01

    the annual cost of insufficient software testing methods and tools in the United States is between 22.2 and 59.5 billion US dollars [13, 14]. This study...10 (2004), 1–29. [21] Cheng, C., Dumitrescu, A., and Schroeder, P. Generating small combinatorial test suites to cover input-output relationships... Proceedings of the Conference on the Future of Software Engineering (May 2000), pp. 61–72. [51] Hartman, A. Software and hardware testing using

  4. Radioactive Decay: Audio Data Collection

    ERIC Educational Resources Information Center

    Struthers, Allan

    2009-01-01

    Many phenomena generate interesting audible time series. This data can be collected and processed using audio software. The free software package "Audacity" is used to demonstrate the process by recording, processing, and extracting click times from an inexpensive radiation detector. The high quality of the data is demonstrated with a simple…

  5. Using Online Surveys to Promote and Assess Learning

    ERIC Educational Resources Information Center

    Taylor, Laura; Doehler, Kirsten

    2014-01-01

    This article explores the use of online survey software to collect data from students during class to efficiently use class time. Several example activities for an introductory statistics classroom are considered. We also discuss utilization of online survey software for other purposes such as collecting assessment information and student…

  6. Improving the quality of care of patients with rheumatic disease using patient-centric electronic redesign software.

    PubMed

    Newman, Eric D; Lerch, Virginia; Billet, Jon; Berger, Andrea; Kirchner, H Lester

    2015-04-01

    Electronic health records (EHRs) are not optimized for chronic disease management. To improve the quality of care for patients with rheumatic disease, we developed electronic data capture, aggregation, display, and documentation software. The software integrated and reassembled information from the patient (via a touchscreen questionnaire), nurse, physician, and EHR into a series of actionable views. Core functions included trends over time, rheumatology-related demographics, and documentation for patient and provider. Quality measures collected included patient-reported outcomes, disease activity, and function. The software was tested and implemented in 3 rheumatology departments, and integrated into routine care delivery. Post-implementation evaluation measured adoption, efficiency, productivity, and patient perception. Over 2 years, 6,725 patients completed 19,786 touchscreen questionnaires. The software was adopted for use by 86% of patients and rheumatologists. Chart review and documentation time trended downward, and productivity increased by 26%. Patient satisfaction, activation, and adherence remained unchanged, although pre-implementation values were high. A strong correlation was seen between use of the software and disease control (weighted Pearson's correlation coefficient 0.5927, P = 0.0095), and a relative increase in patients with low disease activity of 3% per quarter was noted. We describe innovative software that aggregates, stores, and displays information vital to improving the quality of care for patients with chronic rheumatic disease. The software was well-adopted by patients and providers. Post-implementation, significant improvements in quality of care, efficiency of care, and productivity were demonstrated. Copyright © 2015 by the American College of Rheumatology.
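
    For reference, a minimal sketch of the weighted Pearson correlation reported above (the formula is standard; the data below are invented, not the study's):

    ```python
    import numpy as np

    def weighted_pearson(x, y, w):
        """Weighted Pearson correlation: weighted covariance divided by the
        product of the weighted standard deviations."""
        x, y, w = map(np.asarray, (x, y, w))
        mx, my = np.average(x, weights=w), np.average(y, weights=w)
        cov = np.average((x - mx) * (y - my), weights=w)
        sx = np.sqrt(np.average((x - mx) ** 2, weights=w))
        sy = np.sqrt(np.average((y - my) ** 2, weights=w))
        return cov / (sx * sy)

    # Invented example: software-use rate vs. share of patients with low
    # disease activity, weighted by clinic size.
    use          = [0.20, 0.50, 0.70, 0.90]
    low_activity = [0.35, 0.48, 0.55, 0.70]
    clinic_size  = [120, 300, 250, 180]
    print(weighted_pearson(use, low_activity, clinic_size))
    ```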

  7. Development and Validation of a Computational Model for Predicting the Behavior of Plumes from Large Solid Rocket Motors

    NASA Technical Reports Server (NTRS)

    Wells, Jason E.; Black, David L.; Taylor, Casey L.

    2013-01-01

    Exhaust plumes from large solid rocket motors fired at ATK's Promontory test site carry particulates to high altitudes and typically produce deposits that fall on regions downwind of the test area. As populations and communities near the test facility grow, ATK has become increasingly concerned about the impact of motor testing on those surrounding communities. To assess the potential impact of motor testing on the community and to identify feasible mitigation strategies, it is essential to have a tool capable of predicting plume behavior downrange of the test stand. A software package, called PlumeTracker, has been developed and validated at ATK for this purpose. The code is a point model that offers a time-dependent, physics-based description of plume transport and precipitation. The code can utilize either measured or forecasted weather data to generate plume predictions. Next-Generation Radar (NEXRAD) data and field observations from twenty-three historical motor test fires at Promontory were collected to test the predictive capability of PlumeTracker. Model predictions for plume trajectories and deposition fields were found to correlate well with the collected dataset.
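
    As a rough illustration of what a "point model" of plume transport means (a toy wind-advection step; not PlumeTracker's physics, and all values invented):

    ```python
    def advect(position, winds, dt):
        """Advect a plume centroid with a time series of wind vectors."""
        x, y = position
        track = [(x, y)]
        for u, v in winds:                 # wind components in m/s per step
            x, y = x + u * dt, y + v * dt
            track.append((x, y))
        return track

    hourly_winds = [(3.0, 1.0), (4.0, 0.5), (2.5, -0.5)]   # invented weather data
    print(advect((0.0, 0.0), hourly_winds, dt=3600.0))     # centroid positions in m
    ```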

  8. Flexible and Low-Cost Measurements for Space Software Development- The Measurements Exploration Framework

    NASA Astrophysics Data System (ADS)

    Marculescu, Bogdan; Feldt, Robert; Torkar, Richard; Green, Lars-Goran; Liljegren, Thomas; Hult, Erika

    2011-08-01

    Verification and validation is an important part of software development and accounts for a significant share of the costs associated with such a project. For developers of life- or mission-critical systems, such as software being developed for space applications, a balance must be reached between ensuring the quality of the system by extensive and rigorous testing and reducing costs so that the company can compete. Ensuring the quality of any system starts with a quality development process. To evaluate both the software development process and the product itself, measurements are needed. A balance must then be struck between ensuring the best possible quality of both process and product on the one hand, and reducing the cost of performing those measurements on the other. A number of measurements have already been defined and are being used. For some of these, data collection can be automated as well, further lowering the costs associated with implementing them. In practice, however, there may be situations where existing measurements are unsuitable for a variety of reasons. This paper describes a framework for creating low-cost, flexible measurements in areas where initial information is scarce. The framework, called the Measurements Exploration Framework, is aimed in particular at the space software development industry and was developed in such an environment.

  9. ETICS: the international software engineering service for the grid

    NASA Astrophysics Data System (ADS)

    Meglio, A. D.; Bégin, M.-E.; Couvares, P.; Ronchieri, E.; Takacs, E.

    2008-07-01

    The ETICS system is a distributed software configuration, build and test system designed to fulfil the needs of improving the quality, reliability and interoperability of distributed software in general and grid software in particular. The ETICS project is a consortium of five partners (CERN, INFN, Engineering Ingegneria Informatica, 4D Soft and the University of Wisconsin-Madison). The ETICS service consists of a build and test job execution system based on the Metronome software and an integrated set of web services and software engineering tools to design, maintain and control build and test scenarios. The ETICS system allows taking into account complex dependencies among applications and middleware components and provides a rich environment to perform static and dynamic analysis of the software and execute deployment, system and interoperability tests. This paper gives an overview of the system architecture and functionality set and then describes how the EC-funded EGEE, DILIGENT and OMII-Europe projects are using the software engineering services to build, validate and distribute their software. Finally a number of significant use and test cases will be described to show how ETICS can be used in particular to perform interoperability tests of grid middleware using the grid itself.

  10. Improving Maintenance Data Collection Via Point-of-Maintenance (POMX) Implementation

    DTIC Science & Technology

    2006-03-01

    accurate documentation, (3) identifying and correcting the root causes for poor data integrity, and (4) educating the unit on the critical need for data ...the validity of the results. The data in this study were analyzed using the SAS JMP 6.0 statistical software package. The results for the tests...traditional keyboard data entry methods at a computer terminal. These terminals are typically located in the aircraft maintenance unit (AMU) facility , away

  11. Improving Maintenance Data Collection Via Point-Of-Maintenance (POMX) Implementation

    DTIC Science & Technology

    2006-03-01

    accurate documentation, (3) identifying and correcting the root causes for poor data integrity, and (4) educating the unit on the critical need for data ...the validity of the results. The data in this study were analyzed using the SAS JMP 6.0 statistical software package. The results for the tests...traditional keyboard data entry methods at a computer terminal. These terminals are typically located in the aircraft maintenance unit (AMU) facility , away

  12. Effectiveness Testing of Embedded User Support for U.S. Army Installation-Level Software

    DTIC Science & Technology

    1991-06-01

    under what conditions Dynamic Help could influence performance and satisfaction. The ACIFS program was modified to provide automatic collection of all...under what conditions Dynamic Help can influence user performance and satisfaction. This chapter reports the design, implementation, and analysis of...ambiguous or is hidden in the body of the message. The ACIFS program has many user interface deficiencies, but it does allow the user to use trial and

  13. "Test" is a Four Letter Word

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pope, G M

    2005-05-03

    For a number of years I had the pleasure of teaching testing seminars all over the world and meeting and learning from others in our field. Over a twelve-year period, I always asked the following questions of the software developers, test engineers, and managers who took my two- or three-day seminar on software testing: 'When was the first time you heard the word test?' 'Where were you when you first heard the word test?' 'Who said the word test?' 'How did the word test make you feel?' Most of the thousands of responses were similar to 'It was my third grade teacher at school, and I felt nervous and afraid'. There were a few exceptions, like 'It was my third grade teacher, and I was happy and excited to show how smart I was'. But by and large, my informal survey found that 'testing' is a word to which most people attach negative meanings, based on its historical context. So why is this important to those of us in the software development business? Because I have found that a preponderance of software developers do not get really excited about hearing that the software they just wrote is going to be 'tested' by the Test Group. Typical reactions I have heard over the years run from 'I'm sure there is nothing wrong with the software, so go ahead and test it; better you find defects than our customers' to these extremes: 'There is no need to test my software because there is nothing wrong with it'. 'You are not qualified to test my software because you don't know as much as I do about it'. 'If any test engineers come into our office again to test our software, we will throw them through the third floor window'. So why is there such a strong negative reaction to testing? It is primitive. It goes back to grade school for many of us. It is a negative word that conjures up negative emotions. In other words, 'test' is a four letter word. How many of us associate 'joy' with 'test'? Not many. It is hard for most of us to reprogram associations learned at an early age. So what can we do about it (short of hypnotic therapy for software developers)? Well, one concept I have used (and still use) is to not call testing 'testing'. Call it something else. Ever wonder why most independent software testing groups are called Software Quality Assurance groups? Now you know. Software Quality Assurance is not such a negatively charged phrase, even though Software Quality Assurance is much more than simply testing. It was a real blessing when the concept of Validation and Verification came about for software. I define validation to mean assuring that the product produced does the right thing (usually what the customer wants it to do), and verification to mean that the product was built the right way (in accordance with good design principles and practices). So I have deliberately called the System Test Group the Verification and Validation Group, or V&V Group, as a way of avoiding the negative image problem. I remember once having a conversation with a developer colleague who said, in the heat of battle, that it was fine to V&V his code, just don't test it! Once again, V&V includes many things besides testing, but it just doesn't sound like an onerous thing to do to software. In my current job, working at a highly regarded national laboratory with world-renowned physicists, I have again encountered the negativity about testing software. Except here they don't take kindly to Software Quality Assurance or Software Verification and Validation either.
    After all, software is just a trivial tool to automate algorithms that implement physics models. Testing, SQA, and V&V take time and get in the way of completing groundbreaking science experiments. So I have again had to change the name of software testing to something less negative in the physics world. I found (the hard way) that if I requested more time to do software experimentation, the physicists' resistance melted. And so the conversation continues: 'We have time to run more software experiments. Just don't waste any time testing the software!' In case the concept of not calling testing 'testing' appeals to you, and there may be an opportunity for you to take the sting out of the name at your place of employment, I have compiled a table of things that testing could be called besides 'testing'. Of course, we can embellish this by adding some good-sounding prefixes and suffixes. To come up with alternate names for testing, pick a word from columns A, B, and C in the table below. For instance, Unified Acceptance Trials (A2,B7,C3) or Tailored Observational Demonstration (A6,B5,C5) or Agile Criteria Scoring (A3,B8,C8) or Rapid Requirement Proof (A1,B9,C7) or Satisfaction Assurance (B10,C1). You can probably think of some additional combinations appropriate for your industry.

  14. Integrating Testing into Software Engineering Courses Supported by a Collaborative Learning Environment

    ERIC Educational Resources Information Center

    Clarke, Peter J.; Davis, Debra; King, Tariq M.; Pava, Jairo; Jones, Edward L.

    2014-01-01

    As software becomes more ubiquitous and complex, the cost of software bugs continues to grow at a staggering rate. To remedy this situation, there needs to be major improvement in the knowledge and application of software validation techniques. Although there are several software validation techniques, software testing continues to be one of the…

  15. Prospectus 2000

    NASA Astrophysics Data System (ADS)

    Holmes, Jon L.; Gettys, Nancy S.

    2000-01-01

    We begin 2000 with a message about our plans for JCE Software and what you will be seeing in this column as the year progresses. Floppy Disk --> CD-ROM Most software today is distributed on CD-ROM or by downloading from the Internet. Several new computers no longer include a floppy disk drive as "standard equipment". Today's software no longer fits on one or two floppies (the installation software alone can require two disks) and the cost of reproducing and distributing several disks is prohibitive. In short, distribution of software on floppy disks is no longer practical. Therefore, JCE Software will distribute all new software publications on CD-ROM rather than on disks. Regular Issues --> Collections Distribution of all our software on CD-ROM allows us to extend our concept of software collections that we started with the General Chemistry Collection. Such collections will contain all the previously published software that is still "in print" (i.e., is compatible with current operating systems and hardware) and any new programs that fall under the topic of the collection. Proposed topics in addition to General Chemistry currently include Advanced Chemistry, Instrument and Laboratory Simulations, and Spectroscopy. Eventually, all regular issues will be replaced by these collections, which will be updated annually or semiannually with new programs and updates to existing programs. Abstracts for all new programs will continue to appear in this column when a collection or its update is ready for publication. We will continue to offer special issues of single larger programs (e.g. Periodic Table Live!, Chemistry Comes Alive! volumes) on CD-ROM and video on videotape. Connect with Your Students outside Class JCE Software has always offered network licenses to allow instructors to make our software available to students in computer labs, but that model no longer fits the way many instructors and students work with computers. Many students (or their families) own a personal computer allowing them much more flexibility than a campus computer lab. Many instructors utilize the World Wide Web, creating HTML pages for students to use. JCE Software has options available to take advantage of both of these developments. Software Adoption To provide students who own computers access to JCE Software programs, consider adopting one or more of our CD-ROMs as you would a textbook. The General Chemistry Collection has been adopted by several general chemistry courses. We can arrange to bundle CDs with laboratory manuals or to be sold separately to students through the campus bookstore. The cost per CD can be quite low (as little as $5) when large numbers are ordered, making this a cost-effective method of allowing students access to the software they need whenever and wherever they desire. Web-Ready Publications Several JCE Software programs use HTML to present the material. Viewed with the ubiquitous Internet Browser, HTML is compatible with both Mac OS and Windows (as well most other current operating systems) and provides a flexible hypermedia interface that is familiar to an increasing number of instructors and students. HTML-based publications are also ready for use on local intranets, with appropriate licensing, and can be readily incorporated into other HTML-based materials. Already published in this format are: Chemistry Comes Alive!, Volumes 1 and 2 (Special Issues 18 and 21), Flying over Atoms (Special Issue 19), and Periodic Table Live! Second Edition (Special Issue 17). 
Solid State Resources Second Edition (Special Issue 12) and Chemistry Comes Alive!, Volume 3 (Special Issue 23) will be available soon. Other submissions being developed in HTML format include ChemPages Laboratory and Multimedia General Chemistry Problems. Contact the JCE Software office to learn about licensing alternatives that take advantage of the World Wide Web. Periodic Table Live! 2nd ed. is one of JCE Software's "Web-ready" publications. Publication Plans for 2000 We have several exciting new issues planned for publication in the coming year. Chemistry Comes Alive! The Chemistry Comes Alive! (CCA!) series continues with additional CD-ROMs for Mac OS and Windows. Each volume in this series contains video and animations of chemical reactions that can be easily incorporated into your own computer-based presentations. Our digital video now uses state-of-the-art compression that yields higher quality video with smaller file sizes and data rates more suited for WWW delivery. Video for Periodic Table Live! 2nd edition, Chemistry Comes Alive! Volume 3, ChemPages Laboratory, and Multimedia General Chemistry Problems uses this new format. We will be releasing updates of CCA! Volumes 1 and 2 to take advantage of this new technology. We are very pleased with the results and think you will be also. The reaction of aluminum with chlorine is included in Chemistry Comes Alive! Volume 3. ChemPages Laboratory ChemPages Laboratory, developed by the New Traditions Curriculum Project at the University of Wisconsin-Madison, is an HTML-based CD-ROM for Mac OS and Windows that contains lessons and tutorials to prepare introductory chemistry students to work in the laboratory. It includes text, photographs, computer graphics, animations, digital video, and voice narration to introduce students to laboratory equipment and procedures. ChemPages Laboratory teaches introductory chemistry students about laboratory instruments, equipment, and procedures. Versatile Video Video demonstrating the "drinking bird" is included in the Chemistry Comes Alive! video collection. Video from this collection can be incorporated into many other projects. As an example, David Whisnant has used the drinking bird in his Multimedia General Chemistry Problems, where students view the video and are asked to explain why the bird bobs up and down. JCE Software anticipates publication of Multimedia General Chemistry Problems on CD-ROM for Mac OS and Windows in 2000. It will be "Web-ready". General Chemistry Collection, 4th Edition The General Chemistry Collection will be revised early in the summer and CDs will be shipped in time for fall adoptions. The 4th edition will include JCE Software publications for general chemistry published in 1999, as well as any programs for general chemistry accepted in 2000. Regular Issues We have had many recent submissions, including submissions of work in progress. In 2000 we will work with the authors and our peer-reviewers to complete and publish these submissions individually or as part of a software collection on CD-ROM. An Invitation In collaboration with JCE Online we plan to make available in 2000 more support files for JCE Software. These will include not only troubleshooting tips and technical support notes, but also supporting information submitted by users such as lessons, specific assignments, and activities using JCE Software publications. All JCE Software users are invited to contribute to this area. 
Get in touch with JCE Software and let us know how you are using our materials so that we can share your ideas with others! Although the word software is in our name, many of our publications are not traditional software. We also publish video on videotape, videodisc, and CD-ROM, as well as electronic documents (Mathcad and Mathematica files, spreadsheet files and macros, HTML documents, and PowerPoint presentations). Most chemistry instructors who use a computer in their teaching have created or considered creating one or more of these for their classes. If you have an original computer presentation, electronic document, animation, video, or any other item that is not printed text, it is probably an appropriate submission for JCE Software. By publishing your work in any branch of the Journal of Chemical Education, you will share your efforts with chemistry instructors and students all over the world and get professional recognition for your achievements. All JCE Software publications are Y2K compliant.

  16. PhysioNet: physiologic signals, time series and related open source software for basic, clinical, and applied research.

    PubMed

    Moody, George B; Mark, Roger G; Goldberger, Ary L

    2011-01-01

    PhysioNet provides free web access to over 50 collections of recorded physiologic signals and time series, and related open-source software, in support of basic, clinical, and applied research in medicine, physiology, public health, biomedical engineering and computing, and medical instrument design and evaluation. Its three components (PhysioBank, the archive of signals; PhysioToolkit, the software library; and PhysioNetWorks, the virtual laboratory for collaborative development of future PhysioBank data collections and PhysioToolkit software components) connect researchers and students who need physiologic signals and relevant software with researchers who have data and software to share. PhysioNet's annual open engineering challenges stimulate rapid progress on unsolved or poorly solved questions of basic or clinical interest, by focusing attention on achievable solutions that can be evaluated and compared objectively using freely available reference data.
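
    As an illustration of how PhysioBank signals can be pulled into an analysis environment, the sketch below uses the third-party wfdb Python package (not part of PhysioToolkit itself); record "100" from the MIT-BIH Arrhythmia Database is simply a familiar PhysioBank example.

        import wfdb

        # Read the first 1000 samples of MIT-BIH record 100 directly from PhysioBank.
        record = wfdb.rdrecord('100', pn_dir='mitdb', sampto=1000)

        print(record.sig_name, record.fs)  # channel names and sampling frequency (Hz)
        print(record.p_signal.shape)       # physical signal array, shape (1000, 2)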

  17. Developing multimedia software and virtual reality worlds and their use in rehabilitation and psychology.

    PubMed

    Sik Lányi, Cecília; Laky, Viktória; Tilinger, Adám; Pataky, Ilona; Simon, Lajos; Kiss, Bernadett; Simon, Viktória; Szabó, Júlianna; Páll, Attila

    2004-01-01

    The multimedia and virtual reality projects performed at our laboratory during the last ten years can be grouped into the following categories: 1) tutorial and entertainment programs for handicapped children, 2) rehabilitation programs for stroke patients and patients with phobias. We have developed multimedia software for handicapped children with various impairments: partial vision, hearing difficulties, locomotive difficulties, mental retardation, dyslexia, etc. In the present paper we show the advantages of using multimedia software to develop mental skills in handicapped people and deal with the special needs of handicapped children. For the rehabilitation of stroke patients we have developed a computer-controlled method which, unlike methods used internationally, enables not only the establishment of a diagnosis but also measurement of therapy effectiveness: 1) it enables us to produce a database of patients, which contains not only their personal data but also test results, their drawings and audio recordings; 2) it is in itself an intensive therapeutic test and contains tutorial programs. We are currently collecting test results. We have also developed some virtual worlds for treating phobias: a virtual balcony and a ten-story building with an external glass elevator, as well as an internal glass elevator in the virtual Atrium Hyatt hotel. We have developed a virtual environment for treating claustrophobia too: a closed lift and a room whose walls can move. For specific phobias (fear of travelling) we have modelled the underground railway system in Budapest. For autistic children, we have developed virtual shopping software as well. In this paper we present the advantages of virtual reality in the investigation, evaluation and treatment of perception, behaviour and neuropsychological disorders.

  18. How can GPs drive software changes to improve healthcare for Aboriginal and Torres Strait Islanders peoples?

    PubMed

    Kehoe, Helen

    2017-01-01

    Changes to the software used in general practice could improve the collection of the Aboriginal and Torres Strait Islander status of all patients, and boost access to healthcare measures specifically for Aboriginal and Torres Strait Islander peoples provided directly or indirectly by general practitioners (GPs). Despite longstanding calls for improvements to general practice software to better support Aboriginal and Torres Strait Islander health, little change has been made. The aim of this article is to promote software improvements by identifying desirable software attributes and encouraging GPs to promote their adoption. Establishing strong links between collecting Aboriginal and Torres Strait Islander status, clinical decision supports, and uptake of GP-mediated health measures specifically for Aboriginal and Torres Strait Islander peoples - and embedding these links in GP software - is a long overdue reform. In the absence of government initiatives in this area, GPs are best placed to advocate for software changes, using the model described here as a starting point for action.

  19. A portable telescope based on the ALIBAVA system for test beam studies

    NASA Astrophysics Data System (ADS)

    Bernabeu, J.; Casse, G.; Garcia, C.; Greenall, A.; Lacasta, C.; Lozano, M.; Marti-Garcia, S.; Pellegrini, G.; Rodriguez, J.; Ullan, M.; Tsurin, I.

    2013-12-01

    A test beam telescope has been built using the ALIBAVA system to drive its data acquisition. The basic telescope planes consist of four XYT stations. Each station is built from a detector board with two strip sensors, mounted one on each side (strips crossing at 90°). The ensemble is coupled to an ALIBAVA daughter board. These stations act as the reference frame and allow a precise track reconstruction. The system is triggered by the coincidence signal of two scintillators located upstream and downstream. The telescope can hold several devices under test. Each ALIBAVA daughter board is linked to its corresponding mother board. The system can hold up to 16 mother boards. A master board synchronizes and controls all the mother boards and collects their data. The off-line analysis software has been developed to study the charge collection, cluster width, tracking efficiency, resolution, etc., of the devices under test. Moreover, the built-in ALIBAVA TDC allows the analysis of the time profile of the device signal. The ALIBAVA telescope has been successfully operated in two test runs at the DESY and CERN-SPS beam lines.
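
    The internals of the off-line analysis are not given in the abstract, but the kind of strip-detector cluster search used to extract charge collection and cluster width can be sketched as follows; the seed and neighbour signal-to-noise cuts are hypothetical values, not ALIBAVA defaults.

        import numpy as np

        def find_clusters(strip_charge, noise=1.0, seed_cut=5.0, neigh_cut=2.0):
            """Group adjacent strips above threshold into clusters."""
            q = np.asarray(strip_charge, dtype=float)
            snr = q / noise
            clusters, i, n = [], 0, len(q)
            while i < n:
                if snr[i] >= seed_cut:                     # seed strip found
                    lo = hi = i
                    while lo > 0 and snr[lo - 1] >= neigh_cut:
                        lo -= 1
                    while hi < n - 1 and snr[hi + 1] >= neigh_cut:
                        hi += 1
                    strips = np.arange(lo, hi + 1)
                    clusters.append({
                        "width": hi - lo + 1,              # cluster width in strips
                        "charge": q[lo:hi + 1].sum(),      # collected charge
                        "centroid": np.average(strips, weights=q[lo:hi + 1]),
                    })
                    i = hi + 1
                else:
                    i += 1
            return clusters

        print(find_clusters([0.5, 1.0, 6.0, 14.0, 3.5, 0.8, 0.2]))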

  20. Using Deep Learning Algorithm to Enhance Image-review Software for Surveillance Cameras

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cui, Yonggang; Thomas, Maikael A.

    We propose the development of proven deep learning algorithms to flag objects and events of interest in Next Generation Surveillance System (NGSS) surveillance to make IAEA image review more efficient. Video surveillance is one of the core monitoring technologies used by the IAEA Department of Safeguards when implementing safeguards at nuclear facilities worldwide. The current image review software GARS has limited automated functions, such as scene-change detection, black image detection and missing scene analysis, but struggles with highly cluttered backgrounds. A cutting-edge algorithm to be developed in this project will enable efficient and effective searches in images and video streams by identifying and tracking safeguards-relevant objects and detecting anomalies in their vicinity. In this project, we will develop the algorithm, test it with the IAEA surveillance cameras and data sets collected at simulated nuclear facilities at BNL and SNL, and implement it in a software program for potential integration into the IAEA’s IRAP (Integrated Review and Analysis Program).
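
    The project's algorithm is not spelled out in the record, but the flagging step it describes can be illustrated with an off-the-shelf detector. This sketch assumes the torchvision library (a COCO-pretrained model stands in for the safeguards-specific one) and an arbitrary 0.8 confidence cut.

        import torch
        import torchvision

        # Generic pretrained object detector; a stand-in for the NGSS-specific model.
        model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
        model.eval()

        frame = torch.rand(3, 480, 640)        # stand-in for one surveillance frame
        with torch.no_grad():
            detections = model([frame])[0]     # dict with boxes, labels, scores

        keep = detections["scores"] > 0.8      # flag confident detections for review
        print(detections["boxes"][keep], detections["labels"][keep])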

  1. Reducing beam hardening effects and metal artefacts in spectral CT using Medipix3RX

    NASA Astrophysics Data System (ADS)

    Rajendran, K.; Walsh, M. F.; de Ruiter, N. J. A.; Chernoglazov, A. I.; Panta, R. K.; Butler, A. P. H.; Butler, P. H.; Bell, S. T.; Anderson, N. G.; Woodfield, T. B. F.; Tredinnick, S. J.; Healy, J. L.; Bateman, C. J.; Aamir, R.; Doesburg, R. M. N.; Renaud, P. F.; Gieseg, S. P.; Smithies, D. J.; Mohr, J. L.; Mandalika, V. B. H.; Opie, A. M. T.; Cook, N. J.; Ronaldson, J. P.; Nik, S. J.; Atharifard, A.; Clyne, M.; Bones, P. J.; Bartneck, C.; Grasset, R.; Schleich, N.; Billinghurst, M.

    2014-03-01

    This paper discusses methods for reducing beam hardening effects and metal artefacts using spectral x-ray information in biomaterial samples. A small-animal spectral scanner was operated in the 15 to 80 keV x-ray energy range for this study. We use the photon-processing features of a CdTe-Medipix3RX ASIC in charge summing mode to reduce beam hardening and associated artefacts. We present spectral data collected for metal alloy samples, their analysis using algebraic 3D reconstruction software, and volume visualisation using custom volume rendering software. The cupping effect and streak artefacts are quantified in the spectral datasets. The results show a reduction in beam hardening effects and metal artefacts in the narrow high-energy range acquired using the spectroscopic detector. A post-reconstruction comparison between CdTe-Medipix3RX and Si-Medipix3.1 is discussed. The raw data and processed data are made available (http://hdl.handle.net/10092/8851) for testing with other software routines.

  2. Applications and testing of the LSCAD system

    NASA Astrophysics Data System (ADS)

    Althouse, Mark L.; Gross, Robert L.; Ditillo, John T.; Lagna, William M.; Kolodzey, Steve J.; Keiser, Christopher C.; Nasers, Gary D.

    1996-06-01

    The lightweight standoff chemical agent detector (LSCAD) is an infrared Michelson interferometer operating in the 8 - 13 micron band and is designed primarily for military contamination avoidance and early warning applications. The system is designed to be operated autonomously from a vehicle while on the move and to provide 360 degree coverage. The first group of prototypes was delivered in 1994 and has undergone integration into several platforms including the HMMWV, the M2 Bradley Fighting Vehicle, the M109 self-propelled Howitzer, and the Pioneer and Hurricane unmanned air vehicles (UAVs). Additional vehicles and platforms are planned. To meet the restrictions of military applications, the prototype interferometer subsystem has a weight of about 10 lbs and is approximately 0.20 cu ft in size. The full system size and weight depend upon the particular platform and its operational requirements. LSCAD employs onboard instrument control, data collection, analysis and target detection decision software, all of which are critical to real-time operation. The hardware, software, and test results are discussed.

  3. Path generation algorithm for UML graphic modeling of aerospace test software

    NASA Astrophysics Data System (ADS)

    Qu, MingCheng; Wu, XiangHu; Tao, YongChao; Chen, Chao

    2018-03-01

    Traditionally, aerospace software testing engineers have relied on their own work experience and on communication with software development personnel to describe the software under test and to write test cases by hand, a process that is time-consuming, inefficient, and prone to gaps. Using the high-reliability MBT (model-based testing) tool developed by our company, a single modeling pass can automatically generate the test case documents, which is efficient and accurate. Describing a process accurately with a UML model depends on generating the paths through the model, but existing path generation algorithms are either too simple, unable to combine branch paths with loops into complete paths, or so elaborate that they generate a mass of meaningless path permutations that are superfluous for aerospace software testing. Drawing on our engineering experience with aerospace projects, we have developed a tailored path generation algorithm for UML graphic models of aerospace test software.
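
    The paper's algorithm is not reproduced in the abstract, but the underlying problem, enumerating paths through an activity graph while keeping loops bounded so that branch-plus-loop combinations are covered without a meaningless explosion of permutations, can be sketched as below; the graph, node names, and loop bound are hypothetical.

        def enumerate_paths(graph, start, end, max_loop=1):
            """Enumerate start->end paths, traversing any edge at most
            max_loop + 1 times so loops are unrolled a bounded number of times."""
            paths, stack = [], [(start, [start], {})]
            while stack:
                node, path, edge_count = stack.pop()
                if node == end:
                    paths.append(path)
                    continue
                for nxt in graph.get(node, []):
                    edge = (node, nxt)
                    if edge_count.get(edge, 0) <= max_loop:
                        counts = dict(edge_count)
                        counts[edge] = counts.get(edge, 0) + 1
                        stack.append((nxt, path + [nxt], counts))
            return paths

        # A branch at B plus a loop back from C to B:
        g = {"A": ["B"], "B": ["C", "D"], "C": ["B", "D"], "D": []}
        for p in enumerate_paths(g, "A", "D"):
            print(" -> ".join(p))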

  4. Wall adjustment strategy software for use with the NASA Langley 0.3-meter transonic cryogenic tunnel adaptive wall test section

    NASA Technical Reports Server (NTRS)

    Wolf, Stephen W. D.

    1988-01-01

    The Wall Adjustment Strategy (WAS) software provides successful on-line control of the 2-D flexible walled test section of the Langley 0.3-m Transonic Cryogenic Tunnel. This software package allows the level of operator intervention to be regulated as necessary for research and production type 2-D testing using an Adaptive Wall Test Section (AWTS). The software is designed to accept modification for future requirements, such as 3-D testing, with a minimum of complexity. The WAS software described is an attempt to provide a user-friendly package which could be used to control any flexible walled AWTS. Control system constraints influence the details of data transfer, not the data type; thus this entire software package could be used in different control systems, if suitable interface software is available. A complete overview of the software highlights the data flow paths, the modular architecture of the software and the various operating and analysis modes available. A detailed description of the software modules includes listings of the code. A user's manual is provided to explain task generation, operating environment, user options and what to expect at execution.

  5. Software Formal Inspections Standard

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This Software Formal Inspections Standard (hereinafter referred to as Standard) is applicable to NASA software. This Standard defines the requirements that shall be fulfilled by the software formal inspections process whenever this process is specified for NASA software. The objective of this Standard is to define the requirements for a process that inspects software products to detect and eliminate defects as early as possible in the software life cycle. The process also provides for the collection and analysis of inspection data to improve the inspection process as well as the quality of the software.

  6. Annotated bibliography of software engineering laboratory literature

    NASA Technical Reports Server (NTRS)

    Buhler, Melanie; Valett, Jon

    1989-01-01

    An annotated bibliography is presented of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory. The bibliography was updated and reorganized substantially since the original version (SEL-82-006, November 1982). All materials were grouped into eight general subject areas for easy reference: (1) The Software Engineering Laboratory; (2) The Software Engineering Laboratory: Software Development Documents; (3) Software Tools; (4) Software Models; (5) Software Measurement; (6) Technology Evaluations; (7) Ada Technology; and (8) Data Collection. Subject and author indexes further classify these documents by specific topic and individual author.

  7. Component Prioritization Schema for Achieving Maximum Time and Cost Benefits from Software Testing

    NASA Astrophysics Data System (ADS)

    Srivastava, Praveen Ranjan; Pareek, Deepak

    Software testing is any activity aimed at evaluating an attribute or capability of a program or system and determining that it meets its required results. Defining the end of software testing is a crucial feature of any software development project. A premature release will involve risks like undetected bugs, the cost of fixing faults later, and discontented customers. Any software organization would want to achieve maximum possible benefits from software testing with minimum resources. Testing time and cost need to be optimized for achieving a competitive edge in the market. In this paper, we propose a schema, called the Component Prioritization Schema (CPS), to achieve an effective and uniform prioritization of the software components. This schema serves as an extension to the Non-Homogeneous Poisson Process based Cumulative Priority Model. We also introduce an approach for handling time-intensive versus cost-intensive projects.

  8. Space station software reliability analysis based on failures observed during testing at the multisystem integration facility

    NASA Technical Reports Server (NTRS)

    Tamayo, Tak Chai

    1987-01-01

    Software quality is vital not only to the successful operation of the space station, it is also an important factor in establishing testing requirements, the time needed for software verification and integration, and launch schedules for the space station. Defense of management decisions can be greatly strengthened by combining engineering judgments with statistical analysis. Unlike hardware, software has the characteristics of no wearout and costly redundancies, making traditional statistical analysis unsuitable for evaluating software reliability. A statistical model was developed to provide a representation of the number as well as the types of failures that occur during software testing and verification. From this model, quantitative measures of software reliability based on failure history during testing are derived. Criteria to terminate testing based on reliability objectives and methods to estimate the expected number of fixes required are also presented.
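
    The abstract does not give the model's functional form; a standard choice for representing cumulative failures during testing is a non-homogeneous Poisson process such as the Goel-Okumoto model, with mean value function m(t) = a(1 - e^(-bt)). The sketch below fits it to hypothetical weekly failure counts; the fitted a estimates the total latent failures, so a minus the failures already observed approximates the expected number of fixes still required.

        import numpy as np
        from scipy.optimize import curve_fit

        def mean_failures(t, a, b):
            """Goel-Okumoto NHPP mean value function m(t) = a(1 - exp(-bt))."""
            return a * (1.0 - np.exp(-b * t))

        # Hypothetical cumulative failure counts at weekly test intervals.
        weeks = np.arange(1, 11, dtype=float)
        cum_failures = np.array([5, 9, 13, 15, 18, 19, 21, 22, 22, 23], dtype=float)

        (a, b), _ = curve_fit(mean_failures, weeks, cum_failures, p0=(25.0, 0.2))
        remaining = a - cum_failures[-1]   # expected failures still latent
        print(f"a = {a:.1f}, b = {b:.3f}, expected remaining failures = {remaining:.1f}")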

  9. Statistics of software vulnerability detection in certification testing

    NASA Astrophysics Data System (ADS)

    Barabanov, A. V.; Markov, A. S.; Tsirlov, V. L.

    2018-05-01

    The paper discusses practical aspects of introduction of the methods to detect software vulnerability in the day-to-day activities of the accredited testing laboratory. It presents the approval results of the vulnerability detection methods as part of the study of the open source software and the software that is a test object of the certification tests under information security requirements, including software for communication networks. Results of the study showing the allocation of identified vulnerabilities by types of attacks, country of origin, programming languages used in the development, methods for detecting vulnerability, etc. are given. The experience of foreign information security certification systems related to the detection of certified software vulnerabilities is analyzed. The main conclusion based on the study is the need to implement practices for developing secure software in the development life cycle processes. The conclusions and recommendations for the testing laboratories on the implementation of the vulnerability analysis methods are laid down.

  10. NEDLite user's manual: forest inventory for Palm OS handheld computers

    Treesearch

    Peter D. Knopp; Mark J. Twery

    2006-01-01

    A user's manual for NEDLite, software that enables collection of forest inventory data on Palm OS handheld computers, with the option of transferring data into NED software for analysis and subsequent prescription development. NEDLite software is included. Download the NEDLite software at: http://www.fs.fed.us/ne/burlington/ned

  11. 75 FR 16817 - Meeting for Software Developers on the Technical Specifications for Common Formats for Patient...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-02

    ... Software Developers on the Technical Specifications for Common Formats for Patient Safety Data Collection... software developers can provide input on these technical specifications for the Common Formats Version 1.1... specifications, which provide direction to software developers that plan to implement the Common Formats...

  12. 77 FR 40083 - Certain Personal Data and Mobile Communications Devices and Related Software; Institution of a...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-06

    ... Communications Devices and Related Software; Institution of a Formal Enforcement Proceeding; Denial of Request... subsidiary NeXT Software, Inc., both of Cupertino, California (collectively, ``Apple''), alleging a violation... importation of certain personal data and mobile communications devices and related software. 75 FR 17434 (Apr...

  13. 76 FR 16785 - Meeting for Software Developers on the Technical Specifications for Common Formats for Patient...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-25

    ... Software Developers on the Technical Specifications for Common Formats for Patient Safety Data Collection... designed as an interactive forum where PSOs and software developers can provide input on these technical... updated event descriptions, forms, and technical specifications for software developers. As an update to...

  14. Complexity, Systems, and Software

    DTIC Science & Technology

    2014-08-14

    2014 Carnegie Mellon University. Complexity, Systems, and Software. Software Engineering Institute, Carnegie Mellon University, Pittsburgh, PA.

  15. Integrated testing and verification system for research flight software

    NASA Technical Reports Server (NTRS)

    Taylor, R. N.

    1979-01-01

    The MUST (Multipurpose User-oriented Software Technology) program is being developed to cut the cost of producing research flight software through a system of software support tools. An integrated verification and testing capability was designed as part of MUST. Documentation, verification and test options are provided with special attention on real-time, multiprocessing issues. The needs of the entire software production cycle were considered, with effective management and reduced lifecycle costs as foremost goals.

  16. Common Data Acquisition Systems (DAS) Software Development for Rocket Propulsion Test (RPT) Test Facilities - A General Overview

    NASA Technical Reports Server (NTRS)

    Hebert, Phillip W., Sr.; Hughes, Mark S.; Davis, Dawn M.; Turowski, Mark P.; Holladay, Wendy T.; Marshall, Peggy L.; Duncan, Michael E.; Morris, Jon A.; Franzl, Richard W.

    2012-01-01

    The advent of the commercial space launch industry and NASA's more recent resumption of operation of Stennis Space Center's large test facilities after thirty years of contractor control resulted in a need for non-proprietary data acquisition system (DAS) software to support government and commercial testing. The software is designed for modularity and adaptability to minimize the software development effort for current and future data systems. An additional benefit of the software's architecture is its ability to easily migrate to other testing facilities, thus providing future commonality across Stennis. Adapting the software to other Rocket Propulsion Test (RPT) Centers such as MSFC, White Sands, and Plum Brook Station would provide additional commonality and help reduce testing costs for NASA. Ultimately, the software provides the government with unlimited rights and guarantees privacy of data to commercial entities. The project engaged all RPT Centers and NASA's Independent Verification & Validation facility to enhance product quality. The design consists of a translation layer, which makes the underlying hardware transparent to the software application layers regardless of test facility location, and a flexible and easily accessible database. This presentation addresses the system's technical design, issues encountered, and the status of Stennis' development and deployment.
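
    A translation layer of the kind described can be thought of as an abstract interface that the application layers program against, with one concrete backend per facility's hardware. The sketch below is a hypothetical illustration of the pattern, not Stennis code; all class and channel names are invented.

        from abc import ABC, abstractmethod

        class DaqBackend(ABC):
            """Translation layer: application code sees only this interface."""
            @abstractmethod
            def read_channels(self, names: list[str]) -> dict[str, float]: ...

        class SimulatedBackend(DaqBackend):
            """Stand-in for a facility-specific hardware backend."""
            def read_channels(self, names):
                return {n: 0.0 for n in names}   # placeholder hardware reads

        def log_scan(daq: DaqBackend, channels):
            # The application layer is unchanged whichever facility's
            # hardware sits behind the DaqBackend interface.
            return daq.read_channels(channels)

        print(log_scan(SimulatedBackend(), ["chamber_pressure", "lox_flow"]))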

  17. Measuring the software process and product: Lessons learned in the SEL

    NASA Technical Reports Server (NTRS)

    Basili, V. R.

    1985-01-01

    The software development process and product can and should be measured. The software measurement process at the Software Engineering Laboratory (SEL) has taught a major lesson: develop a goal-driven paradigm (also characterized as a goal/question/metric paradigm) for data collection. Project analysis under this paradigm leads to a design for evaluating and improving the methodology of software development and maintenance.
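
    As a concrete, though hypothetical, illustration of the goal/question/metric paradigm: a measurement goal is refined into the questions that would answer it, and each question into the metrics whose data collection it drives.

        # One hypothetical GQM instantiation in the spirit of the SEL paradigm.
        gqm = {
            "goal": "Reduce rework cost during maintenance",
            "questions": {
                "What fraction of changes are error corrections?":
                    ["total changes", "changes that are error corrections"],
                "Which error classes dominate?":
                    ["errors by class (interface, logic, data)"],
            },
        }
        for question, metrics in gqm["questions"].items():
            print(question, "->", ", ".join(metrics))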

  18. The revenue generated from clinical chemistry and hematology laboratory services as determined using activity-based costing (ABC) model.

    PubMed

    Adane, Kasaw; Abiy, Zenegnaw; Desta, Kassu

    2015-01-01

    The rapid and continuous growth of health care costs aggravates the frequently low priority and limited attention given to financing laboratory services. The poorest countries have the highest out-of-pocket spending as a percentage of income. Higher charges might provide a greater potential for revenue. If fees raise quality sufficiently, they can enhance usage. Therefore, estimating the revenue generated from laboratory services could help in capacity building and improved quality service provision. A panel study design was used to determine revenue generated from clinical chemistry and hematology services at Tikur Anbessa Specialized Teaching Hospital, Addis Ababa, Ethiopia. The Activity-Based Costing (ABC) model was used to determine the true cost of tests performed from October 2011 to December 2011 in the hospital. The principle of Activity-Based Costing is that activities consume resources and services consume activities; hence each service bears the cost of the resources consumed by its activities. All resources with costs are aggregated through the established causal relationships. The process maps designed were restructured in consultation with the senior staff working in and/or supervising the laboratory, and pretested checklists were used for observation. Moreover, office documents, receipts and service bills were used while collecting data. The amount of revenue collected from services was compared with the cost of each test, and the profitability or return on investment (ROI) of the services was calculated. Data were collected, entered, cleaned, and analyzed using the Microsoft Excel 2007 software program and the Statistical Package for the Social Sciences version 19 (SPSS). A paired sample t test was used to compare the price and cost of each test. P-values less than 0.05 were considered statistically significant. A total of 25,654 specimens were analyzed during 3 months of regular working hours. The total numbers of clinical chemistry and hematology tests performed during the study period were 45,959 (66.1 %) and 23,570 (33.9 %), respectively. Only 274,386 (25.3 %) Ethiopian Birr (ETB) was recovered from the total cost of 1,086,008.09 ETB incurred for clinical chemistry and hematology laboratory tests. The results showed that about 133,821 ETB (12.32 %) of revenue from out-of-pocket payments for the services was not collected as a result of underpricing, and that 18 out of 20 laboratory tests were underpriced. The cost burden related to free Anti-Retroviral Therapy (ART) services was 285,979.82 (26.3 %) ETB. The estimated cost per test was significantly different from the existing price. About 90 % of the tests were underpriced. This information could prompt the hospital to reconsider the prices of the tests with a profitability ratio less than 1. The revenue collected could help to build capacity, upscale quality, and sustain service delivery.
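
    The ABC arithmetic itself is simple to illustrate: the costs of the activities a test consumes are summed into a unit cost, and the profitability ratio is the fee charged divided by that cost. The figures below are hypothetical, not the study's data.

        # Hypothetical per-activity costs (ETB) consumed by one laboratory test.
        activity_cost = {
            "reagents": 18.0,
            "instrument_time": 7.5,
            "staff_time": 10.0,
            "overheads": 4.5,
        }
        unit_cost = sum(activity_cost.values())   # cost the service "takes"
        price = 10.0                              # hypothetical fee charged

        ratio = price / unit_cost                 # profitability (ROI) ratio
        print(f"cost = {unit_cost:.2f} ETB, price = {price:.2f} ETB, ratio = {ratio:.2f}")
        # A ratio below 1 marks an underpriced test, as found for ~90% of tests.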

  19. 15 CFR 995.27 - Format validation software testing.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 15 Commerce and Foreign Trade 3 2010-01-01 2010-01-01 false Format validation software testing... CERTIFICATION REQUIREMENTS FOR NOAA HYDROGRAPHIC PRODUCTS AND SERVICES CERTIFICATION REQUIREMENTS FOR... of NOAA ENC Products § 995.27 Format validation software testing. Tests shall be performed verifying...

  20. The Design of Software for Three-Phase Induction Motor Test System

    NASA Astrophysics Data System (ADS)

    Haixiang, Xu; Fengqi, Wu; Jiai, Xue

    2017-11-01

    The design and development of control system software is important for three-phase induction motor test equipment, and requires thorough familiarity with the test process and the control procedures of the test equipment. In this paper, the software is developed in the VB language according to the national standard (GB/T1032-2005) on three-phase induction motor test methods. The control system, the data analysis software, and the implementation of the motor test system are described individually; the system has the advantages of high automation and high accuracy.

  1. Selecting, Evaluating and Creating Policies for Computer-Based Resources in the Behavioral Sciences and Education.

    ERIC Educational Resources Information Center

    Richardson, Linda B., Comp.; And Others

    This collection includes four handouts: (1) "Selection Critria Considerations for Computer-Based Resources" (Linda B. Richardson); (2) "Software Collection Policies in Academic Libraries" (a 24-item bibliography, Jane W. Johnson); (3) "Circulation and Security of Software" (a 19-item bibliography, Sara Elizabeth Williams); and (4) "Bibliography of…

  2. 75 FR 75464 - Notice of Public Information Collection(s) Being Reviewed by the Federal Communications...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-03

    ... type of switch software) to provide payphone specific coding digits for per-call compensation. The... Information; • RF Exposure Information; • Operational Description; • Cover Letters; • Software Defined Radio/Cognitive Radio Files In general, an applicant's submission is as follows: (a) FCC Form 731...

  3. 78 FR 77718 - Comment Request for Information Collection for Information Collection for the Data Validation...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-24

    ... Program (NFJP), and Senior Community Service Employment Program (SCSEP). The current expiration date is May 31, 2014. Please note that the data submission processes within the new data validation software..., 2014). ETA believes the software will be completed and states will have experience with using it by the...

  4. CATS, continuous automated testing of seismological, hydroacoustic, and infrasound (SHI) processing software.

    NASA Astrophysics Data System (ADS)

    Brouwer, Albert; Brown, David; Tomuta, Elena

    2017-04-01

    To detect nuclear explosions, waveform data from over 240 SHI stations world-wide flows into the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO), located in Vienna, Austria. A complex pipeline of software applications processes this data in numerous ways to form event hypotheses. The software codebase comprises over 2 million lines of code, reflects decades of development, and is subject to frequent enhancement and revision. Since processing must run continuously and reliably, software changes are subjected to thorough testing before being put into production. To overcome the limitations and cost of manual testing, the Continuous Automated Testing System (CATS) has been created. CATS provides an isolated replica of the IDC processing environment, and is able to build and test different versions of the pipeline software directly from code repositories that are placed under strict configuration control. Test jobs are scheduled automatically when code repository commits are made. Regressions are reported. We present the CATS design choices and test methods. Particular attention is paid to how the system accommodates the individual testing of strongly interacting software components that lack test instrumentation.
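
    CATS is not described at code level in the abstract, but its commit-triggered test loop can be sketched as follows; the local git checkout and the make test entry point are placeholders for the IDC's configuration-controlled repositories and pipeline test jobs.

        import subprocess
        import time

        def head_commit(repo):
            out = subprocess.run(["git", "-C", repo, "rev-parse", "HEAD"],
                                 capture_output=True, text=True, check=True)
            return out.stdout.strip()

        def watch_and_test(repo, interval=60):
            """Poll a checkout; build and test whenever a new commit arrives."""
            last = None
            while True:
                subprocess.run(["git", "-C", repo, "pull", "--ff-only"],
                               capture_output=True)
                commit = head_commit(repo)
                if commit != last:
                    result = subprocess.run(["make", "-C", repo, "test"])
                    status = "OK" if result.returncode == 0 else "REGRESSION"
                    print(f"{commit[:8]}: {status}")   # report regressions
                    last = commit
                time.sleep(interval)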

  5. Analysis of key technologies for virtual instruments metrology

    NASA Astrophysics Data System (ADS)

    Liu, Guixiong; Xu, Qingui; Gao, Furong; Guan, Qiuju; Fang, Qiang

    2008-12-01

    Virtual instruments (VIs) require metrological verification when applied as measuring instruments. Owing to their software-centered architecture, metrological evaluation of VIs includes two aspects: measurement functions and software characteristics. The complexity of software imposes difficulties on metrological testing of VIs. Key approaches and technologies for metrology evaluation of virtual instruments are investigated and analyzed in this paper. The principal issue is evaluation of measurement uncertainty. The nature and regularity of measurement uncertainty caused by software and algorithms can be evaluated by modeling, simulation, analysis, testing and statistics with the support of the powerful computing capability of the PC. Another concern is evaluation of software features such as the correctness, reliability, stability, security and real-time behavior of VIs. Technologies from the software engineering, software testing and computer security domains can be used for these purposes. For example, a variety of black-box testing, white-box testing and modeling approaches can be used to evaluate the reliability of modules, components, applications and the whole VI software. The security of a VI can be assessed by methods like vulnerability scanning and penetration analysis. To enable metrology institutions to perform metrological verification of VIs efficiently, an automatic metrological tool for the above validation is essential. Based on technologies of numerical simulation, software testing and system benchmarking, a framework for such an automatic tool is proposed in this paper. Investigation of the implementation of existing automatic tools that perform calculation of measurement uncertainty, software testing and security assessment demonstrates the feasibility of the proposed automatic framework.
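
    The first concern, evaluating the measurement uncertainty contributed by software and algorithms through modeling and simulation, can be illustrated by Monte Carlo propagation through a measurement function; the measurand P = V^2/R and the input uncertainties below are hypothetical.

        import numpy as np

        rng = np.random.default_rng(0)
        N = 100_000

        # Hypothetical VI measurement: the software computes power P = V^2 / R,
        # with inputs perturbed by their assumed standard uncertainties.
        V = rng.normal(10.0, 0.05, N)   # volts
        R = rng.normal(50.0, 0.10, N)   # ohms
        P = V**2 / R

        print(f"P = {P.mean():.4f} W, u(P) = {P.std(ddof=1):.4f} W")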

  6. The Production Data Approach for Full Lifecycle Management

    NASA Astrophysics Data System (ADS)

    Schopf, J.

    2012-04-01

    The amount of data generated by scientists is growing exponentially, and studies have shown [Koe04] that un-archived data sets have a resource half-life that is only a fraction of that of electronically archived resources. Most groups still lack standard approaches and procedures for data management. Arguably, however, scientists know something about building software. A recent article in Nature [Mer10] stated that 45% of research scientists spend more time now developing software than they did 5 years ago, and 38% spend at least 1/5th of their time developing software. Fox argues [Fox10] that a simple release of data is not the correct approach to data curation. In addition, just as software is used in a wide variety of ways never initially envisioned by its developers, we are seeing this to an even greater extent with data sets. In order to address the need for better data preservation and access, we propose that data sets should be managed in a similar fashion to building production quality software. These production data sets are not simply published once, but go through a cyclical process, including phases such as design, development, verification, deployment, support, analysis, and then development again, thereby supporting the full lifecycle of a data set. The process involved in academically produced software changes over time with respect to issues such as how much it is used outside the development group, but factors in aspects such as knowing who is using the code, enabling multiple developers to contribute to code development with common procedures, formal testing and release processes, developing documentation, and licensing. When we work with data, whether as a collection source, as someone tagging data, or as someone re-using it, many of the lessons learned in building production software are applicable. Table 1 compares production software elements to production data elements.
    Table 1: Comparison of production software and production data.
    Production Software | Production Data
    End-user considerations | End-user considerations
    Multiple coders: repository with check-in procedures; coding standards | Multiple producers/collectors: local archive with check-in procedure; metadata standards
    Formal testing | Formal testing
    Bug tracking and fixes | Bug tracking and fixes, QA/QC
    Documentation | Documentation
    Formal release process | Formal release process to external archive
    License | Citation/usage statement
    The full presentation of this abstract will include a detailed discussion of these issues so that researchers can produce usable and accessible data sets as a first step toward reproducible science. By creating production-quality data sets, we extend the potential of our data, both in terms of usability and usefulness to ourselves and other researchers. The more we treat data with formal processes and release cycles, the more relevant and useful it can be to the scientific community.

  7. Free and simple GIS as appropriate for health mapping in a low resource setting: a case study in eastern Indonesia.

    PubMed

    Fisher, Rohan P; Myers, Bronwyn A

    2011-02-25

    Despite the demonstrated utility of GIS for health applications, there are perceived problems in low resource settings: GIS software can be expensive and complex; input data are often of low quality. This study aimed to test the appropriateness of new, inexpensive and simple GIS tools in poorly resourced areas of a developing country. GIS applications were trialled in pilot studies based on mapping of health resources and health indicators at the clinic and district level in the predominantly rural province of Nusa Tenggara Timur in eastern Indonesia. The pilot applications were (i) rapid field collection of health infrastructure data using a GPS enabled PDA, (ii) mapping health indicator data using open source GIS software, and (iii) service availability mapping using a free modelling tool. Through contextualised training, district and clinic staff acquired skills in spatial analysis and visualisation and, six months after the pilot studies, they were using these skills for advocacy in the planning process, to inform the allocation of some health resources, and to evaluate some public health initiatives. We demonstrated that GIS can be a useful and inexpensive tool for the decentralisation of health data analysis to low resource settings through the use of free and simple software, locally relevant training materials and by providing data collection tools to ensure data reliability.

  8. Development of a computer-assisted personal interview software system for collection of tribal fish consumption data.

    PubMed

    Kissinger, Lon; Lorenzana, Roseanne; Mittl, Beth; Lasrado, Merwyn; Iwenofu, Samuel; Olivo, Vanessa; Helba, Cynthia; Capoeman, Pauline; Williams, Ann H

    2010-12-01

    The authors developed a computer-assisted personal interviewing (CAPI) seafood consumption survey tool from existing Pacific NW Native American seafood consumption survey methodology. The software runs on readily available hardware and software, and is easily configured for different cultures and seafood resources. The CAPI is used with a booklet of harvest location maps and species and portion size images. The use of a CAPI facilitates tribal administration of seafood consumption surveys, allowing cost-effective collection of scientifically defensible data and tribal management of data and data interpretation. Use of tribal interviewers reduces potential bias and discomfort that may be associated with nontribal interviewers. The CAPI contains a 24-hour recall and food frequency questionnaire, and assesses seasonal seafood consumption and temporal changes in consumption. EPA's methodology for developing ambient water quality criteria for tribes assigns a high priority to local data. The CAPI will satisfy this guidance objective. Survey results will support development of tribal water quality standards on their lands and assessment of seafood consumption-related contaminant risks and nutritional benefits. CAPI advantages over paper surveys include complex question branching without raising respondent burden, more complete interviews due to answer error and range checking, data transcription error elimination, printing and mailing cost elimination, and improved data storage. The survey instrument was pilot tested among the Quinault Nation in 2006. © 2010 Society for Risk Analysis.

  9. Free and simple GIS as appropriate for health mapping in a low resource setting: a case study in eastern Indonesia

    PubMed Central

    2011-01-01

    Background Despite the demonstrated utility of GIS for health applications, there are perceived problems in low resource settings: GIS software can be expensive and complex; input data are often of low quality. This study aimed to test the appropriateness of new, inexpensive and simple GIS tools in poorly resourced areas of a developing country. GIS applications were trialled in pilot studies based on mapping of health resources and health indicators at the clinic and district level in the predominantly rural province of Nusa Tenggara Timur in eastern Indonesia. The pilot applications were (i) rapid field collection of health infrastructure data using a GPS enabled PDA, (ii) mapping health indicator data using open source GIS software, and (iii) service availability mapping using a free modelling tool. Results Through contextualised training, district and clinic staff acquired skills in spatial analysis and visualisation and, six months after the pilot studies, they were using these skills for advocacy in the planning process, to inform the allocation of some health resources, and to evaluate some public health initiatives. Conclusions We demonstrated that GIS can be a useful and inexpensive tool for the decentralisation of health data analysis to low resource settings through the use of free and simple software, locally relevant training materials and by providing data collection tools to ensure data reliability. PMID:21352553

  10. The effect of three ergonomics interventions on body posture and musculoskeletal disorders among stuff of Isfahan Province Gas Company

    PubMed Central

    Habibi, Ehsanollah; Soury, Shiva

    2015-01-01

    Background: Prevalence of work-related musculoskeletal disorders (WMSDs) is high among computer users. This study investigates the effect of three ergonomic interventions, training, exercise, and installation of software, on the incidence of musculoskeletal disorders among the staff of Isfahan Province Gas Company. Materials and Methods: The study was performed in the summer of 2013 on 75 (52 men, 23 women) Isfahan Province Gas Company employees in three phases (phase 1: evaluation of the present situation; phase 2: performing interventions; and phase 3: re-evaluation). Participants were divided into three groups (training, exercise, and software). The Nordic Musculoskeletal Questionnaire (NMQ) and rapid upper limb assessment (RULA) were used. Data collected were analyzed using SPSS software and the McNemar test, t-test, and Chi-square test. Results: Based on the evaluations, there was a decrease in musculoskeletal symptoms among the trained group participants after they received the training. The McNemar test showed that the lower rate of pain in the low back, neck, knee, and wrist was significant (P < 0.05). The results obtained from the RULA method for evaluation of posture showed an average 25-point decrease on the right side of the body and a 20-point decrease on the left side of the body in the group subjected to training. Based on the t-test, the decrease was significant. Conclusion: The study demonstrated that the majority of the participants accepted the interventions, which indicates that most of the people were unsatisfied with the work settings and were seeking improvement in the workplace. Overall, the findings show that training, chair adjustment, and arrangement of the workplace could decrease musculoskeletal disorders. PMID:26430692

  11. Analysis of Photogrammetry Data from ISIM Mockup

    NASA Technical Reports Server (NTRS)

    Nowak, Maria; Hill, Mike

    2007-01-01

    During ground testing of the Integrated Science Instrument Module (ISIM) for the James Webb Space Telescope (JWST), the ISIM Optics group plans to use a Photogrammetry Measurement System for cryogenic calibration of specific target points on the ISIM composite structure, the Science Instrument optical benches, and other GSE equipment. This testing will occur in the Space Environmental Systems (SES) chamber at Goddard Space Flight Center. Close range photogrammetry is a 3-dimensional metrology system using triangulation to locate custom targets in 3 coordinates via a collection of digital photographs taken from various locations and orientations. These photos are connected using coded targets (special targets that are recognized by the software and thus allow it to correlate the images to provide a 3-dimensional map of the targets) and are scaled via well-calibrated scale bars. Photogrammetry solves for the camera locations and the coordinates of the targets simultaneously through the bundling procedure contained in the V-STARS software, proprietary software owned by Geodetic Systems Inc. The primary objectives of the metrology performed on the ISIM mock-up were (1) to quantify the accuracy of the INCA3 photogrammetry camera on a representative full-scale version of the ISIM structure at ambient temperature by comparing the measurements obtained with this camera to measurements using the Leica laser tracker system, and (2) to empirically determine the smallest increment of target position movement that can be resolved by the PG camera in the test setup, i.e., its precision, or resolution. In addition, the geometrical details of the test setup defined during the mockup testing, such as target locations and camera positions, will contribute to the final design of the photogrammetry system to be used on the ISIM Flight Structure.
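
    V-STARS's bundling procedure is proprietary, but the triangulation principle the abstract describes can be sketched with the standard linear (DLT) two-view method; the projection matrices and target point below are synthetic.

        import numpy as np

        def triangulate(P1, P2, x1, x2):
            """Linear (DLT) triangulation of one target seen in two images.
            P1, P2: 3x4 projection matrices; x1, x2: image coords (u, v)."""
            A = np.vstack([
                x1[0] * P1[2] - P1[0],
                x1[1] * P1[2] - P1[1],
                x2[0] * P2[2] - P2[0],
                x2[1] * P2[2] - P2[1],
            ])
            _, _, Vt = np.linalg.svd(A)
            X = Vt[-1]
            return X[:3] / X[3]                 # inhomogeneous 3-D point

        P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
        P2 = np.hstack([np.eye(3), [[-1.0], [0.0], [0.0]]])   # translated camera
        X_true = np.array([0.2, -0.1, 4.0, 1.0])
        x1 = P1 @ X_true; x1 = x1[:2] / x1[2]
        x2 = P2 @ X_true; x2 = x2[:2] / x2[2]
        print(triangulate(P1, P2, x1, x2))      # approx. [0.2, -0.1, 4.0]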

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gastelum, Zoe N.; Harvey, Julia B.

    In The International Atomic Energy Agency State Evaluation Process: The Role of Information Analysis in Reaching Safeguards Conclusions (Mathews et al. 2008), several examples of nonproliferation models using analytical software were developed that may assist the IAEA with collecting, visualizing, analyzing, and reporting information in support of the State Evaluation Process. This paper focuses on one of those examples: a set of models developed in the Proactive Scenario Production, Evidence Collection, and Testing (ProSPECT) software that evaluates the status and nature of a state’s nuclear activities. The models use three distinct subject areas to perform this assessment: the presence of nuclear activities, the consistency of those nuclear activities with national nuclear energy goals, and the geopolitical context in which those nuclear activities are taking place. As a proof-of-concept for the models, a crude case study was performed. The study, which attempted to evaluate the nuclear activities taking place in Syria prior to September 2007, yielded illustrative, yet inconclusive, results. Due to the inconclusive nature of the case study results, changes that may improve the models' efficiency and accuracy are proposed.

  13. Digital X-ray portable scanner based on monolithic semi-insulating GaAs detectors: General description and first “quantum” images

    NASA Astrophysics Data System (ADS)

    Dubecký, F.; Perd'ochová, A.; Ščepko, P.; Zat'ko, B.; Sekerka, V.; Nečas, V.; Sekáčová, M.; Hudec, M.; Boháček, P.; Huran, J.

    2005-07-01

    The present work describes a portable digital X-ray scanner based on bulk undoped semi-insulating (SI) GaAs monolithic strip line detectors. The scanner operates in "quantum" imaging mode ("single photon counting"), with potential improvement of the dynamic range in contrast of the observed X-ray images. The "heart" of the scanner (detection unit) is based on SI GaAs strip line detectors. The measured detection efficiency of the SI GaAs detector reached a value of over 60 % (compared to the theoretical one of ~75 %) for the detection of 60 keV photons at a reverse bias of 200 V. The read-out electronics consists of 20 modules fabricated using a progressive SMD technology with automatic assembly of electronic devices. Signals from counters included in the digital parts of the modules are collected in a PC via a USB port and evaluated by custom developed software allowing X-ray image reconstruction. The collected data were used for the creation of the first X-ray "quantum" images of various test objects using the imaging software developed.

  14. Simplified Novel Application (SNApp) framework: a guide to developing and implementing second-generation mobile applications for behavioral health research.

    PubMed

    Fillo, Jennifer; Staplefoote-Boynton, B Lynette; Martinez, Angel; Sontag-Padilla, Lisa; Shadel, William G; Martino, Steven C; Setodji, Claude M; Meeker, Daniella; Scharf, Deborah

    2016-12-01

    Advances in mobile technology and mobile applications (apps) have opened up an exciting new frontier for behavioral health researchers, with a "second generation" of apps allowing for the simultaneous collection of multiple streams of data in real time. With this comes a host of technical decisions and ethical considerations unique to this evolving approach to research. Drawing on our experience developing a second-generation app for the simultaneous collection of text message, voice, and self-report data, we provide a framework for researchers interested in developing and using second-generation mobile apps to study health behaviors. Our Simplified Novel Application (SNApp) framework breaks the app development process into four phases: (1) information and resource gathering, (2) software and hardware decisions, (3) software development and testing, and (4) study start-up and implementation. At each phase, we address common challenges and ethical issues and make suggestions for effective and efficient app development. Our goal is to help researchers effectively balance priorities related to the function of the app with the realities of app development, human subjects issues, and project resource constraints.

  15. Annotated bibliography of software engineering laboratory literature

    NASA Technical Reports Server (NTRS)

    Groves, Paula; Valett, Jon

    1990-01-01

    An annotated bibliography of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory is given. More than 100 publications are summarized. These publications cover many areas of software engineering and range from research reports to software documentation. This document has been updated and reorganized substantially since the original version (SEL-82-006, November 1982). All materials have been grouped into eight general subject areas for easy reference: the Software Engineering Laboratory; the Software Engineering Laboratory-software development documents; software tools; software models; software measurement; technology evaluations; Ada technology; and data collection. Subject and author indexes further classify these documents by specific topic and individual author.

  16. Annotated bibliography of Software Engineering Laboratory literature

    NASA Technical Reports Server (NTRS)

    Morusiewicz, Linda; Valett, Jon

    1993-01-01

    This document is an annotated bibliography of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory. Nearly 200 publications are summarized. These publications cover many areas of software engineering and range from research reports to software documentation. This document has been updated and reorganized substantially since the original version (SEL-82-006, November 1982). All materials have been grouped into eight general subject areas for easy reference: the Software Engineering Laboratory; the Software Engineering Laboratory: software development documents; software tools; software models; software measurement; technology evaluations; Ada technology; and data collection. This document contains an index of these publications classified by individual author.

  17. The Impact of Software Culture on the Management of Community Data

    NASA Astrophysics Data System (ADS)

    Collins, J. A.; Pulsifer, P. L.; Sheffield, E.; Lewis, S.; Oldenburg, J.

    2013-12-01

    The Exchange for Local Observations and Knowledge of the Arctic (ELOKA), a program hosted at the National Snow and Ice Data Center (NSIDC), supports the collection, curation, and distribution of Local and Traditional Knowledge (LTK) data, as well as some quantitative data products. Investigations involving LTK data often include community participation, and therefore require flexible and robust user interfaces to support a reliable process of data collection and management. Often, investigators focused on LTK and community-based monitoring choose to use ELOKA's data services based on our ability to provide rapid proofs of concept and economical delivery of a usable product. To satisfy these two overarching criteria, ELOKA is experimenting with modifications to its software development culture, both in how its software applications are developed and in the kind of software applications (or components) being developed. Over the past several years, NSIDC has shifted its software development culture from one of assigning individual scientific programmers to support particular principal investigators or projects to an Agile Software Methodology implementation using Scrum practices. ELOKA has participated in this process by working with other product owners to schedule and prioritize development work, which is then implemented by a team of application developers. Scrum, along with practices such as Test Driven Development (TDD) and paired programming, improves the quality of the software product delivered to the user community. To meet the need for rapid prototyping and to maximize product development and support with limited developer input, our software development efforts are now focused on creating a platform of application modules that can be quickly customized to suit the needs of a variety of LTK projects. This approach is in contrast to the strategy of delivering custom applications for individual projects. To date, we have integrated components of the Nunaliit Atlas framework (a Java/JavaScript client-server web-based application) with an existing Ruby on Rails application. This approach requires transitioning individual applications to expose a service layer, thus allowing interapplication communication via RESTful services. In this presentation we will report on our experiences using Agile Scrum practices, our efforts to move from custom solutions to a platform of customizable modules, and the impact of each on our ability to support researchers and Arctic residents in the domain of community-based observations and knowledge.

  18. MOSAIC: Software for creating mosaics from collections of images

    NASA Technical Reports Server (NTRS)

    Varosi, F.; Gezari, D. Y.

    1992-01-01

    We have developed a powerful, versatile image processing and analysis software package called MOSAIC, designed specifically for the manipulation of digital astronomical image data obtained with (but not limited to) two-dimensional array detectors. The software package is implemented using the Interactive Data Language (IDL), and incorporates new methods for processing, calibration, analysis, and visualization of astronomical image data, stressing effective methods for the creation of mosaic images from collections of individual exposures, while at the same time preserving the photometric integrity of the original data. Since IDL is available on many computers, the MOSAIC software runs on most UNIX and VAX workstations with the X-Windows or Sun View graphics interface.

  19. Digital tape unit test facility software

    NASA Technical Reports Server (NTRS)

    Jackson, J. T.

    1971-01-01

    Two computer programs are described which are used for the collection and analysis of data from the digital tape unit test facility (DTUTF). The data are the recorded results of skew tests made on magnetic digital tapes which are used on computers as input/output media. The results of each tape test are keypunched onto an 80 column computer card. The format of the card is checked and the card image is stored on a master summary tape via the DTUTF card checking and tape updating system. The master summary tape containing the results of all the tape tests is then used for analysis as input to the DTUTF histogram generating system which produces a histogram of skew vs. date for selected data, followed by some statistical analysis of the data.
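
    The record above describes a 1971 batch pipeline (punched cards, a master summary tape, and a histogram generator). A minimal modern sketch of the same histogram step follows, assuming a hypothetical fixed-width record layout (date in columns 1-8, skew in columns 9-14) rather than the report's actual 80-column card format:

    ```python
    # Hypothetical re-implementation sketch of the DTUTF histogram step:
    # group skew measurements by test date and print a text histogram.
    # The record layout (fields, widths) is assumed, not taken from the report.
    from collections import defaultdict

    def load_records(path):
        """Parse fixed-width card images: date in cols 1-8, skew in cols 9-14."""
        records = []
        with open(path) as f:
            for line in f:
                date, skew = line[0:8].strip(), float(line[8:14])
                records.append((date, skew))
        return records

    def skew_histogram(records):
        by_date = defaultdict(list)
        for date, skew in records:
            by_date[date].append(skew)
        for date in sorted(by_date):
            values = by_date[date]
            mean = sum(values) / len(values)
            print(f"{date}  {'*' * len(values):<20} mean skew = {mean:.2f}")

    skew_histogram(load_records("summary_tape.txt"))
    ```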

  20. Ultrasonic interface level analyzer shop test procedure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    STAEHR, T.W.

    1999-05-24

    The Royce Instrument Corporation Model 2511 Interface Level Analyzer (URSILLA) system uses an ultrasonic ranging technique (SONAR) to measure sludge depths in holding tanks. Three URSILLA instrument assemblies provided by the W-151 project are planned to be used during mixer pump testing to provide data for determining sludge mobilization effectiveness of the mixer pumps and sludge settling rates. The purpose of this test is to provide a documented means of verifying that the functional components of the three URSILLA instruments operate properly. Successful completion of this Shop Test Procedure (STP) is a prerequisite for installation in the AZ-101 tank. The objective of the test is to verify the operation of the URSILLA instruments and to verify data collection using a stand-alone software program.

  1. Sustaining Software-Intensive Systems

    DTIC Science & Technology

    2006-05-01

    2.2 Multi-Service Operational Test and Evaluation; 2.3 Stable Software Baseline ... or equivalent document • completed Multi-Service Operational Test and Evaluation (MOT&E) for the potential production software package (or OT&E if ... not multi-service) • stable software production baseline • complete and current software documentation • Authority to Operate (ATO) for an

  2. Artificial intelligence and expert systems in-flight software testing

    NASA Technical Reports Server (NTRS)

    Demasie, M. P.; Muratore, J. F.

    1991-01-01

    The authors discuss the introduction of advanced information systems technologies such as artificial intelligence, expert systems, and advanced human-computer interfaces directly into Space Shuttle software engineering. The reconfiguration automation project (RAP) was initiated to coordinate this move towards 1990s software technology. The idea behind RAP is to automate several phases of the flight software testing procedure and to introduce AI and ES into space shuttle flight software testing. In the first phase of RAP, conventional tools to automate regression testing have already been developed or acquired. There are currently three tools in use.

  3. Simulation analysis of a microcomputer-based, low-cost Omega navigation system

    NASA Technical Reports Server (NTRS)

    Lilley, R. W.; Salter, R. J., Jr.

    1976-01-01

    The current status of research on a proposed micro-computer-based, low-cost Omega Navigation System (ONS) is described. The design approach emphasizes minimum hardware, maximum software, and the use of a low-cost, commercially-available microcomputer. Currently under investigation is the implementation of a low-cost navigation processor and its interface with an omega sensor to complete the hardware-based ONS. Sensor processor functions are simulated to determine how many of the sensor processor functions can be handled by innovative software. An input data base of live Omega ground and flight test data was created. The Omega sensor and microcomputer interface modules used to collect the data are functionally described. Automatic synchronization to the Omega transmission pattern is described as an example of the algorithms developed using this data base.

  4. General Chemistry Collection for Students (CD-ROM), Abstract of Special Issue 16, 4th Edition

    NASA Astrophysics Data System (ADS)

    2000-07-01

    The General Chemistry Collection contains both new and previously published JCE Software programs that are intended for use by introductory-level chemistry students. These peer-reviewed programs for Macintosh and for Windows are available on a single CD-ROM for convenient distribution to and access by students, and the CD may be adopted for students to purchase as they would a textbook. General Chemistry Collection covers a broad range of topics, providing students with interesting information, tutorials, and simulations that will be useful to them as they study chemistry for the first time. There are 22 programs included in the General Chemistry Collection 4th Edition. Their titles and the general chemistry topics they cover are listed in Table 1. Features in This Edition: General Chemistry Collection, 4th edition includes:

    • Lessons for Introductory Chemistry and INQUAL-S, two new programs not previously published by JCE Software (abstracts appear below)
    • Writing Electron Dot Structures (1) and Viscosity Measurement: A Virtual Experiment for Windows (2), two programs published individually by JCE Software
    • Periodic Table Live! LE, a limited edition of Periodic Table Live!, 2nd Edition (3) (this replaces Chemistry Navigator (4) and Illustrated Periodic Table (5))
    • Many of the programs from previous editions (6)1
    Hardware and Software Requirements: System requirements are given in Table 2. Some programs have additional requirements. See the individual program abstracts at JCE Online, or documentation included on the CD-ROM, for more specific information. Licensing and Discounts for Adoptions: The General Chemistry Collection is intended for use by individual students. Institutions and faculty members may adopt General Chemistry Collection 4th Edition as they would a textbook. We can arrange for CDs to be packaged with laboratory manuals or other course materials or to be sold for direct distribution to students through the campus bookstore. The cost per CD can be quite low when large numbers are ordered (as little as $3 each), making this a cost-effective method of allowing students access to the software they need whenever and wherever they desire. Other JCE Software CDs can also be adopted. Network licenses to distribute the software to your students via your local campus network can also be arranged. Contact us for details on purchasing multiple-user licenses. Price and Ordering: An order form is inserted in this issue that provides prices and other ordering information. If this card is not available or if you need additional information, contact: JCE Software, University of Wisconsin-Madison, 1101 University Avenue, Madison, WI 53706-1396; phone: 608/262-5153 or 800/991-5534; fax: 608/265-8094; email: jcesoft@chem.wisc.edu. Table 1 lists the contents of the General Chemistry Collection, 4th Edition.

  5. 77 FR 50720 - Test Documentation for Digital Computer Software Used in Safety Systems of Nuclear Power Plants

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-22

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Test Documentation for Digital Computer Software...) is issuing for public comment draft regulatory guide (DG), DG-1207, ``Test Documentation for Digital... practices for test documentation for software and computer systems as described in the Institute of...

  6. An information model for use in software management estimation and prediction

    NASA Technical Reports Server (NTRS)

    Li, Ningda R.; Zelkowitz, Marvin V.

    1993-01-01

    This paper describes the use of cluster analysis for determining the information model within collected software engineering development data at the NASA/GSFC Software Engineering Laboratory. We describe the Software Management Environment tool that allows managers to predict development attributes during early phases of a software project and the modifications we propose to allow it to develop dynamic models for better predictions of these attributes.
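
    A minimal sketch of the kind of cluster analysis described, using scikit-learn's KMeans on invented project measures (the SEL's actual attributes and data are not reproduced here):

    ```python
    # Minimal sketch of clustering software project data, in the spirit of
    # the paper's information-model analysis. Feature names and values are
    # invented for illustration.
    import numpy as np
    from sklearn.cluster import KMeans

    # Columns: size (KLOC), effort (staff-months), reported errors
    projects = np.array([
        [12.0,  20.0,  35.0],
        [14.5,  24.0,  40.0],
        [90.0, 160.0, 300.0],
        [85.0, 150.0, 280.0],
    ])

    model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(projects)
    print(model.labels_)           # cluster assignment per project
    print(model.cluster_centers_)  # prototype "information model" per cluster
    ```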

  7. Validation and Verification of LADEE Models and Software

    NASA Technical Reports Server (NTRS)

    Gundy-Burlet, Karen

    2013-01-01

    The Lunar Atmosphere Dust Environment Explorer (LADEE) mission will orbit the moon in order to measure the density, composition and time variability of the lunar dust environment. The ground-side and onboard flight software for the mission is being developed using a Model-Based Software methodology. In this technique, models of the spacecraft and flight software are developed in a graphical dynamics modeling package. Flight Software requirements are prototyped and refined using the simulated models. After the model is shown to work as desired in this simulation framework, C-code software is automatically generated from the models. The generated software is then tested in real-time Processor-in-the-Loop and Hardware-in-the-Loop test beds. Travelling Road Show test beds were used for early integration tests with payloads and other subsystems. Traditional techniques for verifying computational sciences models are used to characterize the spacecraft simulation. A lightweight set of formal methods analysis, static analysis, formal inspection and code coverage analyses are utilized to further reduce defects in the onboard flight software artifacts. These techniques are applied early and often in the development process, iteratively increasing the capabilities of the software and the fidelity of the vehicle models and test beds.

  8. CrossTalk: The Journal of Defense Software Engineering. Volume 24, Number 2, March/April 2011

    DTIC Science & Technology

    2011-04-01

    and insider attacks, we plan to conduct experiments and collect concrete and empirical evidence. As we have done in prior research projects [11...subsequent service failure.” Yet, a faulty state can continue to render service; an erroneous state cannot. Consider a system that receives concrete ... that does not satisfy specifications. The faults in the concrete are not detected during (faulty) acceptance testing. A two-deck bridge is built using

  9. Testing of Hand-Held Mine Detection Systems

    DTIC Science & Technology

    2015-01-08

    ITOP 04-2-5208 for guidance on software testing. Testing software is necessary to ensure that safety is designed into the software algorithm, and that ... sensor verification areas or target lanes. F.2. TESTING OBJECTIVES. a. Testing objectives will impact the test design. Some examples of ... overall safety, performance, and reliability of the system. It describes activities necessary to ensure safety is designed into the system under test

  10. Special Report: Part One. New Tools for Professionals.

    ERIC Educational Resources Information Center

    Liskin, Miriam; And Others

    1984-01-01

    This collection of articles includes an examination of word-processing software; project management software; new expert systems that turn microcomputers into logical, well-informed consultants; simulated negotiation software; telephone management systems; and the physical design of an efficient microcomputer work space. (MBR)

  11. SSE software test management STM capability: Using STM in the Ground Systems Development Environment (GSDE)

    NASA Technical Reports Server (NTRS)

    Church, Victor E.; Long, D.; Hartenstein, Ray; Perez-Davila, Alfredo

    1992-01-01

    This report is one of a series discussing configuration management (CM) topics for Space Station ground systems software development. It provides a description of the Software Support Environment (SSE)-developed Software Test Management (STM) capability, and discusses the possible use of this capability for management of developed software during testing performed on target platforms. It is intended to supplement the formal documentation of STM provided by the SSE Project. How STM can be used to integrate contractor CM and formal CM for software before delivery to operations is described. STM provides a level of control that is flexible enough to support integration and debugging, but sufficiently rigorous to ensure the integrity of the testing process.

  12. Getting started on metrics - Jet Propulsion Laboratory productivity and quality

    NASA Technical Reports Server (NTRS)

    Bush, M. W.

    1990-01-01

    A review is presented to describe the effort and difficulties of reconstructing fifteen years of JPL software history. In 1987 the collection and analysis of project data were started with the objective of creating laboratory-wide measures of quality and productivity for software development. As a result of this two-year Software Product Assurance metrics study, a rough measurement foundation for software productivity and software quality, and an order-of-magnitude quantitative baseline for software systems and subsystems are now available.

  13. Software testing

    NASA Astrophysics Data System (ADS)

    Price-Whelan, Adrian M.

    2016-01-01

    Now more than ever, scientific results are dependent on sophisticated software and analysis. Why should we trust code written by others? How do you ensure your own code produces sensible results? How do you make sure it continues to do so as you update, modify, and add functionality? Software testing is an integral part of code validation and writing tests should be a requirement for any software project. I will talk about Python-based tools that make managing and running tests much easier and explore some statistics for projects hosted on GitHub that contain tests.
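
    The abstract refers to Python-based tools that make managing and running tests easier; a minimal pytest-style example of the practice, with an invented function under test:

    ```python
    # Minimal example of the kind of Python-based testing the talk refers to.
    # The function under test is invented for illustration; run with `pytest`.
    import pytest

    def magnitude_to_flux(mag, zero_point=25.0):
        """Convert an astronomical magnitude to flux (arbitrary units)."""
        return 10 ** (-0.4 * (mag - zero_point))

    def test_zero_point_gives_unit_flux():
        assert magnitude_to_flux(25.0) == pytest.approx(1.0)

    def test_five_magnitudes_is_factor_100():
        ratio = magnitude_to_flux(20.0) / magnitude_to_flux(25.0)
        assert ratio == pytest.approx(100.0)
    ```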

  14. TypingSuite: Integrated Software for Presenting Stimuli, and Collecting and Analyzing Typing Data

    ERIC Educational Resources Information Center

    Mazerolle, Erin L.; Marchand, Yannick

    2015-01-01

    Research into typing patterns has broad applications in both psycholinguistics and biometrics (i.e., improving security of computer access via each user's unique typing patterns). We present a new software package, TypingSuite, which can be used for presenting visual and auditory stimuli, collecting typing data, and summarizing and analyzing the…

  15. High Assurance Software

    DTIC Science & Technology

    2013-10-22

    HIGH ASSURANCE SOFTWARE. WILLIAM MAHONEY, UNIVERSITY OF NEBRASKA. 10/22/2013. Final Report. DISTRIBUTION A: Distribution approved for ...

  16. A Microcomputer-Based Software Package for Eye-Monitoring Research. Technical Report No. 434.

    ERIC Educational Resources Information Center

    McConkie, George W.; And Others

    A software package is described that collects and reduces eye behavior data (eye position and pupil size) using an IBM-PC compatible computer. Written in C language for speed and portability, it includes several features: (1) data can be simultaneously collected from other sources (such as electroencephalography and electromyography); (2)…

  17. Academic Testing and Grading with Spreadsheet Software.

    ERIC Educational Resources Information Center

    Ho, James K.

    1987-01-01

    Explains how spreadsheet software can be used in the design and grading of academic tests and in assigning grades. Macro programs and menu-driven software are highlighted and an example using IBM PCs and Lotus 1-2-3 software is given. (Author/LRW)

  18. LC-IM-TOF Instrument Control & Data Visualization Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2011-05-12

    Liquid Chromatography-Ion Mobility-Time of Flight Instrument Control and Data Visualization software is designed to control instrument voltages for the Ion Mobility drift tube. It collects and stores information from the Agilent TOF instrument and analyses/displays the ion intensity information acquired. The software interface can be split into 3 categories: Instrument Settings/Controls, Data Acquisition, and Viewer. The Instrument Settings/Controls prepares the instrument for Data Acquisition. The Viewer contains common objects that are used by Instrument Settings/Controls and Data Acquisition. Intensity information is collected in 1-nanosecond bins and separated by TOF pulses called scans. A collection of scans is stored side by side, making up an accumulation. In order for the computer to keep up with the stream of data, 30-50 accumulations are commonly summed into a single frame. A collection of frames makes up an experiment. The Viewer software then takes the experiment and presents the data in several possible ways: each frame can be viewed in TOF bins or m/z (mass-to-charge ratio). The experiment can be viewed frame by frame, by merging several frames, or by viewing the peak chromatogram. The user can zoom into the data, export data, and/or animate frames. Additional features include calibration of the data and even post-processing of multiplexed data.
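
    The bin/scan/accumulation/frame hierarchy described above can be sketched in a few lines of Python with invented dimensions (100 TOF bins per scan, 8 scans per accumulation, 40 accumulations per frame); this illustrates the data layout only, not the product's code:

    ```python
    # Sketch of the bin/scan/accumulation/frame hierarchy with invented
    # dimensions; synthetic Poisson counts stand in for detector data.
    import numpy as np

    rng = np.random.default_rng(0)
    BINS, SCANS, ACCUMULATIONS_PER_FRAME = 100, 8, 40

    # One accumulation: intensity counts per (scan, TOF bin)
    def acquire_accumulation():
        return rng.poisson(lam=2.0, size=(SCANS, BINS))

    # Sum accumulations element-wise so the computer keeps up with the stream
    frame = sum(acquire_accumulation() for _ in range(ACCUMULATIONS_PER_FRAME))
    print(frame.shape)        # (8, 100): scans x TOF bins for one frame
    print(frame.sum(axis=0))  # collapsed chromatogram-style view per bin
    ```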

  19. pFUnit 3.0 Tutorial Advanced

    NASA Technical Reports Server (NTRS)

    Clune, Tom

    2014-01-01

    This tutorial will introduce Fortran developers to unit testing and test-driven development (TDD) using pFUnit. As with other unit-testing frameworks, pFUnit simplifies the process of writing, collecting, and executing tests while providing clear diagnostic messages for failing tests. pFUnit specifically targets the development of scientific-technical software written in Fortran and includes customized features such as assertions for multi-dimensional arrays, distributed (MPI) and thread-based (OpenMP) parallelism, and flexible parameterized tests. These sessions will include numerous examples and hands-on exercises that gradually build in complexity. Attendees are expected to have working knowledge of F90, but familiarity with object-oriented syntax in F2003 and MPI will be of benefit for the more advanced examples. By the end of the tutorial the audience should feel comfortable in applying pFUnit within their own development environment.
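
    pFUnit itself is Fortran; to keep this document's examples in one language, here is an analogous parameterized test with a multi-dimensional array assertion written in Python/pytest. The function under test is invented and the syntax is not pFUnit's:

    ```python
    # Analog (not pFUnit syntax) of a parameterized test with a
    # multi-dimensional array assertion. The function under test is invented.
    import numpy as np
    import pytest

    def scale_field(field, factor):
        return factor * field

    @pytest.mark.parametrize("factor", [0.5, 1.0, 2.0])
    def test_scale_field_elementwise(factor):
        field = np.arange(12.0).reshape(3, 4)   # 2-D array assertion target
        expected = factor * field
        np.testing.assert_allclose(scale_field(field, factor), expected)
    ```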

  20. Assessment Environment for Complex Systems Software Guide

    NASA Technical Reports Server (NTRS)

    2013-01-01

    This Software Guide (SG) describes the software developed to test the Assessment Environment for Complex Systems (AECS) by the West Virginia High Technology Consortium (WVHTC) Foundation's Mission Systems Group (MSG) for the National Aeronautics and Space Administration (NASA) Aeronautics Research Mission Directorate (ARMD). This software is referred to as the AECS Test Project throughout the remainder of this document. AECS provides a framework for developing, simulating, testing, and analyzing modern avionics systems within an Integrated Modular Avionics (IMA) architecture. The purpose of the AECS Test Project is twofold. First, it provides a means to test the AECS hardware and system developed by MSG. Second, it provides an example project upon which future AECS research may be based. This Software Guide fully describes building, installing, and executing the AECS Test Project as well as its architecture and design. The design of the AECS hardware is described in the AECS Hardware Guide. Instructions on how to configure, build and use the AECS are described in the User's Guide. Sample AECS software, developed by the WVHTC Foundation, is presented in the AECS Software Guide. The AECS Hardware Guide, AECS User's Guide, and AECS Software Guide are authored by MSG. The requirements set forth for AECS are presented in the Statement of Work for the Assessment Environment for Complex Systems authored by NASA Dryden Flight Research Center (DFRC). The intended audience for this document includes software engineers, hardware engineers, project managers, and quality assurance personnel from WVHTC Foundation (the suppliers of the software), NASA (the customer), and future researchers (users of the software). Readers are assumed to have general knowledge in the field of real-time, embedded computer software development.

  1. Software interface verifier

    NASA Technical Reports Server (NTRS)

    Soderstrom, Tomas J.; Krall, Laura A.; Hope, Sharon A.; Zupke, Brian S.

    1994-01-01

    A Telos study of 40 recent subsystem deliveries into the DSN at JPL found software interface testing to be the single most expensive and error-prone activity, and the study team suggested creating an automated software interface test tool. The resulting Software Interface Verifier (SIV), which was funded by NASA/JPL and created by Telos, employed 92 percent software reuse to quickly create an initial version which incorporated early user feedback. SIV is now successfully used by developers for interface prototyping and unit testing, by test engineers for formal testing, and by end users for non-intrusive data flow tests in the operational environment. Metrics, including cost, are included. Lessons learned include the need for early user training. SIV is ported to many platforms and can be successfully used or tailored by other NASA groups.

  2. Proactive Security Testing and Fuzzing

    NASA Astrophysics Data System (ADS)

    Takanen, Ari

    Software is bound to have security critical flaws, and no testing or code auditing can ensure that software is flawless. But software security testing requirements have improved radically during the past years, largely due to criticism from security conscious consumers and enterprise customers. Whereas in the past, security flaws were taken for granted (and patches were quietly and humbly installed), they are now probably one of the most common reasons why people switch vendors or software providers. The maintenance costs from security updates often add up to become one of the biggest cost items for large enterprise users. Fortunately, test automation techniques have also improved. Techniques like model-based testing (MBT) enable efficient generation of security tests that reach good confidence levels in discovering zero-day mistakes in software. This technique is called fuzzing.
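
    A minimal random fuzzer sketch in the spirit of the chapter (real fuzzers and model-based test generators are far more sophisticated; the parser under test and its bug are invented):

    ```python
    # Toy fuzzer: throw random inputs at a target and report any crash that
    # is not an expected, cleanly rejected input. All names are invented.
    import random
    import string

    def parse_version(text):
        """Toy target: parse 'MAJOR.MINOR' into a tuple of ints."""
        parts = text.split(".")
        return int(parts[0]), int(parts[1])   # hides an IndexError bug

    def fuzz(target, trials=10_000, seed=1):
        rng = random.Random(seed)
        for _ in range(trials):
            n = rng.randint(0, 12)
            candidate = "".join(rng.choice(string.printable) for _ in range(n))
            try:
                target(candidate)
            except ValueError:
                pass  # expected: input rejected cleanly
            except Exception as exc:  # unexpected crash: a potential flaw
                print(f"crash on {candidate!r}: {exc!r}")

    fuzz(parse_version)
    ```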

  3. Agile deployment and code coverage testing metrics of the boot software on-board Solar Orbiter's Energetic Particle Detector

    NASA Astrophysics Data System (ADS)

    Parra, Pablo; da Silva, Antonio; Polo, Óscar R.; Sánchez, Sebastián

    2018-02-01

    In this day and age, successful embedded critical software needs agile and continuous development and testing procedures. This paper presents the overall testing and code coverage metrics obtained during the unit testing procedure carried out to verify the correctness of the boot software that will run in the Instrument Control Unit (ICU) of the Energetic Particle Detector (EPD) on-board Solar Orbiter. The ICU boot software is a critical part of the project, so its verification must be addressed at an early development stage; any test case missed in this process may affect the quality of the overall on-board software. According to the European Cooperation for Space Standardization (ECSS) standards, testing this kind of critical software must cover 100% of the source code statements and decision paths. This leads to the complete testing of the fault tolerance and recovery mechanisms that have to resolve every possible memory corruption or communication error brought about by the space environment. The introduced procedure enables fault injection from the beginning of the development process and makes it possible to fulfill the exacting code coverage demands on the boot software.
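
    A minimal sketch of decision coverage combined with fault injection, written in Python for illustration rather than the ICU's actual boot code; all names and the fault model are invented:

    ```python
    # Decision-path testing with an injected fault: the second test forces
    # the recovery branch so both outcomes of the try/except are covered.
    class MemoryFault(Exception):
        pass

    def load_image(read_word, n_words):
        """Copy a boot image, falling back to the redundant bank on a fault."""
        image = []
        for addr in range(n_words):
            try:
                image.append(read_word(bank=0, addr=addr))
            except MemoryFault:
                image.append(read_word(bank=1, addr=addr))  # recovery branch
        return image

    def test_nominal_path():
        assert load_image(lambda bank, addr: addr, 4) == [0, 1, 2, 3]

    def test_injected_fault_takes_recovery_branch():
        def faulty(bank, addr):
            if bank == 0 and addr == 2:
                raise MemoryFault()   # injected corruption
            return 100 * bank + addr
        # Word 2 must come from bank 1; both decision outcomes are now covered.
        assert load_image(faulty, 4) == [0, 1, 102, 3]
    ```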

  4. Test Data Monitor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bosas, Joseph

    The National Security Campus (NSC) collects a large amount of test data used to accept high value and high rigor product. The data has been used historically to support root cause analysis when anomalies are detected in downstream processes. The opportunity to use the data for predictive failure analysis, however, had never been exploited. The primary goal of the Test Data Monitor (TDM) software is to provide automated capabilities to analyze data in near-real-time and report trends that foreshadow actual product failures. To date, the aerospace industry as a whole is challenged at utilizing collected data to the degree that modern technology allows. As a result of the innovation behind TDM, Honeywell is able to monitor millions of data points through a multitude of SPC algorithms continuously and autonomously so that our personnel resources can more efficiently and accurately direct their attention to suspect processes or features. TDM’s capabilities have been recognized by our U.S. Department of Energy National Nuclear Security Administration (NNSA) sponsor for potential use at other sites within the NNSA. This activity supports multiple initiatives including expectations of the NNSA and broader corporate goals that center around data-based quality controls on production.

  5. Advanced E-O test capability for Army Next-Generation Automated Test System (NGATS)

    NASA Astrophysics Data System (ADS)

    Errea, S.; Grigor, J.; King, D. F.; Matis, G.; McHugh, S.; McKechnie, J.; Nehring, B.

    2015-05-01

    The Future E-O (FEO) program was established to develop a flexible, modular, automated test capability as part of the Next Generation Automatic Test System (NGATS) program to support the test and diagnostic needs of currently fielded U.S. Army electro-optical (E-O) devices, as well as being expandable to address the requirements of future Navy, Marine Corps and Air Force E-O systems. Santa Barbara Infrared (SBIR) has designed, fabricated, and delivered three (3) prototype FEO systems for engineering and logistics evaluation prior to anticipated full-scale production beginning in 2016. In addition to presenting a detailed overview of the FEO system hardware design, features and testing capabilities, the integration of SBIR's EO-IR sensor and laser test software package, IRWindows 4™, into FEO to automate the test execution, data collection and analysis, archiving and reporting of results is also described.

  6. Development and Flight Testing of an Adaptable Vehicle Health-Monitoring Architecture

    NASA Technical Reports Server (NTRS)

    Woodard, Stanley E.; Coffey, Neil C.; Gonzalez, Guillermo A.; Woodman, Keith L.; Weathered, Brenton W.; Rollins, Courtney H.; Taylor, B. Douglas; Brett, Rube R.

    2003-01-01

    Development and testing of an adaptable wireless health-monitoring architecture for a vehicle fleet is presented. It has three operational levels: one or more remote data acquisition units located throughout the vehicle; a command and control unit located within the vehicle; and a terminal collection unit to collect analysis results from all vehicles. Each level is capable of performing autonomous analysis with a trained adaptable expert system. The remote data acquisition unit has an eight channel programmable digital interface that allows the user discretion for choosing type of sensors; number of sensors, sensor sampling rate, and sampling duration for each sensor. The architecture provides framework for a tributary analysis. All measurements at the lowest operational level are reduced to provide analysis results necessary to gauge changes from established baselines. These are then collected at the next level to identify any global trends or common features from the prior level. This process is repeated until the results are reduced at the highest operational level. In the framework, only analysis results are forwarded to the next level to reduce telemetry congestion. The system's remote data acquisition hardware and non-analysis software have been flight tested on the NASA Langley B757's main landing gear.
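
    A toy sketch of the tributary-analysis idea described above, where each level reduces raw measurements to summary results and forwards only those upward to limit telemetry; the structure and thresholds are invented:

    ```python
    # Tributary analysis sketch: reduce at the lowest level, forward only
    # analysis results upward. Thresholds and data are invented.
    from statistics import mean

    def remote_unit_reduce(samples, baseline, tolerance=0.1):
        """Lowest level: compare a channel's mean against its baseline."""
        avg = mean(samples)
        return {"mean": avg, "deviation": avg - baseline,
                "flag": abs(avg - baseline) > tolerance * abs(baseline)}

    def vehicle_level_collect(unit_results):
        """Middle level: keep flagged results plus a global trend summary."""
        flagged = [r for r in unit_results if r["flag"]]
        return {"flagged": flagged,
                "fleet_mean_dev": mean(r["deviation"] for r in unit_results)}

    units = [remote_unit_reduce([1.02, 1.01, 0.99], baseline=1.0),
             remote_unit_reduce([1.40, 1.38, 1.41], baseline=1.0)]
    print(vehicle_level_collect(units))  # results, not raw samples, move up
    ```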

  7. Florida alternative NTCIP testing software (ANTS) for actuated signal controllers.

    DOT National Transportation Integrated Search

    2009-01-01

    The scope of this research project included the development of a software tool to test devices for NTCIP compliance. The Florida Alternative NTCIP Testing Software (ANTS) was developed by the research team due to limitations found w...

  8. Analysis of Consequences of Birth Asphyxia in Infants: A Regional Study in Southern Punjab, Pakistan.

    PubMed

    Samad, Noreen; Farooq, Samia; Hafeez, Kinza; Maryam, Mukharma; Rafi, Muhammad Aftab

    2016-12-01

    To evaluate the biochemical consequences and platelet counts of birth asphyxia in neonates. Cohort study. Department of Child Health, Nishtar Medical College and Hospital, Multan, from September to November 2015. The data of 50 (50%) asphyxiated neonates and 50 (50%) non-asphyxiated neonates, all aged less than 1 month, was collected from the Children Ward of Nishtar Hospital, Multan, Pakistan. Data on platelet count in blood, kidney function tests (creatinine, urea), liver function tests (bilirubin, alanine aminotransferase (ALT), aspartate aminotransferase (AST)), and a cardiac enzyme test (lactate dehydrogenase (LDH)) were analysed by paired sample t-test using SPSS software. Sociodemographic data of those neonates' mothers was also collected. In asphyxiated neonates, LDH, ALT, AST, creatinine, bilirubin, and urea levels were higher than in healthy infants, while the platelet count was lower in asphyxiated neonates than in healthy infants. There was a higher rate of alteration in platelet count and in levels of LDH, AST, ALT, urea, creatinine, and bilirubin in asphyxiated infants. These alterations may be correlated with damage to vital organs of asphyxiated neonates.
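
    The analysis step named above (a paired-sample t-test run in SPSS) corresponds to the following computation in Python; the values below are invented placeholders, not data from the study:

    ```python
    # Paired-sample t-test, as in the abstract's SPSS analysis.
    # The LDH values are invented placeholders, not data from the study.
    from scipy import stats

    ldh_asphyxiated = [580, 610, 645, 590, 620]
    ldh_control     = [410, 430, 455, 420, 440]

    t_stat, p_value = stats.ttest_rel(ldh_asphyxiated, ldh_control)
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
    ```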

  9. 48 CFR 352.227-14 - Rights in Data-Exceptional Circumstances.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ....] Computer database or database means a collection of recorded information in a form capable of, and for the... databases or computer software documentation. Computer software documentation means owner's manuals, user's... nature (including computer databases and computer software documentation). This term does not include...

  10. Annotated bibliography of software engineering laboratory literature

    NASA Technical Reports Server (NTRS)

    Kistler, David; Bristow, John; Smith, Don

    1994-01-01

    This document is an annotated bibliography of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory. Nearly 200 publications are summarized. These publications cover many areas of software engineering and range from research reports to software documentation. This document has been updated and reorganized substantially since the original version (SEL-82-006, November 1982). All materials have been grouped into eight general subject areas for easy reference: (1) The Software Engineering Laboratory; (2) The Software Engineering Laboratory: Software Development Documents; (3) Software Tools; (4) Software Models; (5) Software Measurement; (6) Technology Evaluations; (7) Ada Technology; and (8) Data Collection. This document contains an index of these publications classified by individual author.

  11. Reuse Metrics for Object Oriented Software

    NASA Technical Reports Server (NTRS)

    Bieman, James M.

    1998-01-01

    One way to increase the quality of software products and the productivity of software development is to reuse existing software components when building new software systems. In order to monitor improvements in reuse, the level of reuse must be measured. In this NASA supported project we (1) derived a suite of metrics which quantify reuse attributes for object oriented, object based, and procedural software, (2) designed prototype tools to take these measurements in Ada, C++, Java, and C software, (3) evaluated the reuse in available software, (4) analyzed the relationship between coupling, cohesion, inheritance, and reuse, (5) collected object oriented software systems for our empirical analyses, and (6) developed quantitative criteria and methods for restructuring software to improve reusability.
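
    One illustrative reuse measure (a sketch, not the project's actual metric suite) is the fraction of a system's code contributed by reused components; all module data below are invented:

    ```python
    # Sketch of a simple reuse-level measure: the share of a system's lines
    # of code that come from reused modules. Module data are invented.
    def reuse_level(modules):
        """modules: list of (lines_of_code, reused: bool) pairs."""
        total = sum(loc for loc, _ in modules)
        reused = sum(loc for loc, is_reused in modules if is_reused)
        return reused / total if total else 0.0

    system = [(1200, True), (800, False), (400, True), (1600, False)]
    print(f"reuse level: {reuse_level(system):.0%}")  # 40%
    ```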

  12. Center for Efficient Exascale Discretizations Software Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kolev, Tzanio; Dobrev, Veselin; Tomov, Vladimir

    The CEED Software suite is a collection of generally applicable software tools focusing on the following computational motifs: PDE discretizations on unstructured meshes, high-order finite element and spectral element methods, and unstructured adaptive mesh refinement. All of this software is being developed as part of CEED, a co-design Center for Efficient Exascale Discretizations, within DOE's Exascale Computing Project (ECP) program.

  13. Emerging Software Development and Acquisition Approaches: Panacea or Villain

    DTIC Science & Technology

    2011-05-16

    2010 Carnegie Mellon University. Emerging Software Development and Acquisition Approaches: Panacea or Villain. Software Engineering Institute ...

  14. Design of a Thermal Precipitator for the Characterization of Smoke Particles from Common Spacecraft Materials

    NASA Technical Reports Server (NTRS)

    Meyer, Marit Elisabeth

    2015-01-01

    A thermal precipitator (TP) was designed to collect smoke aerosol particles for microscopic analysis in fire characterization research. Information on particle morphology, size and agglomerate structure obtained from these tests supplements additional aerosol data collected. Modeling of the thermal precipitator throughout the design process was performed with the COMSOL Multiphysics finite element software package, including the Eulerian flow field and thermal gradients in the fluid. The COMSOL Particle Tracing Module was subsequently used to determine particle deposition. Modeling provided optimized design parameters such as geometry, flow rate and temperatures. The thermal precipitator was built and testing verified the performance of the first iteration of the device. The thermal precipitator was successfully operated and provided quality particle samples for microscopic analysis, which furthered the body of knowledge on smoke particulates. This information is a key element of smoke characterization and will be useful for future spacecraft fire detection research.

  15. Taking advantage of ground data systems attributes to achieve quality results in testing software

    NASA Technical Reports Server (NTRS)

    Sigman, Clayton B.; Koslosky, John T.; Hageman, Barbara H.

    1994-01-01

    During the software development life cycle process, basic testing starts with the development team. At the end of the development process, an acceptance test is performed for the user to ensure that the deliverable is acceptable. Ideally, the delivery is an operational product with zero defects. However, the goal of zero defects is normally not achieved but is successful to various degrees. With the emphasis on building low cost ground support systems while maintaining a quality product, a key element in the test process is simulator capability. This paper reviews the Transportable Payload Operations Control Center (TPOCC) Advanced Spacecraft Simulator (TASS) test tool that is used in the acceptance test process for unmanned satellite operations control centers. The TASS is designed to support the development, test and operational environments of the Goddard Space Flight Center (GSFC) operations control centers. The TASS uses the same basic architecture as the operations control center. This architecture is characterized by its use of distributed processing, industry standards, commercial off-the-shelf (COTS) hardware and software components, and reusable software. The TASS uses much of the same TPOCC architecture and reusable software that the operations control center developer uses. The TASS also makes use of reusable simulator software in the mission specific versions of the TASS. Very little new software needs to be developed, mainly mission specific telemetry communication and command processing software. By taking advantage of the ground data system attributes, successful software reuse for operational systems provides the opportunity to extend the reuse concept into the test area. Consistency in test approach is a major step in achieving quality results.

  16. Sociotechnical Human Factors Involved in Remote Online Usability Testing of Two eHealth Interventions

    PubMed Central

    2016-01-01

    Background: Research in the fields of human performance technology and human computer interaction is challenging the traditional macro focus of usability testing, arguing for methods that help test moderators assess “use in context” (ie, cognitive skills, usability understood over time) and in authentic “real world” settings. Human factors in these complex test scenarios may impact the quality of usability results being derived, yet there is a lack of research detailing moderator experiences in these test environments. Most comparative research has focused on the impact of the physical environment on results, and rarely on how the sociotechnical elements of the test environment affect moderator and test user performance. Improving our understanding of moderator roles and experiences with conducting “real world” usability testing can lead to improved techniques and strategies. Objective: To understand moderator experiences of using Web-conferencing software to conduct remote usability testing of 2 eHealth interventions. Methods: An exploratory case study approach was used to study 4 moderators’ experiences using Blackboard Collaborate for remote testing sessions of 2 different eHealth interventions. Data collection involved audio-recording iterative cycles of test sessions, collecting summary notes taken by moderators, and conducting 2 90-minute focus groups via teleconference. A direct content analysis with an inductive coding approach was used to explore personal accounts, assess the credibility of data interpretation, and generate consensus on the thematic structure of the results. Results: Following the convergence of data from the various sources, 3 major themes were identified: (1) moderators experienced and adapted to unpredictable changes in cognitive load during testing; (2) moderators experienced challenges in creating and sustaining social presence and untangling dialogue; and (3) moderators experienced diverse technical demands, but were able to collaboratively troubleshoot with test users. Conclusions: Results highlight important human-computer interactions and human factor qualities that impact usability testing processes. Moderators need an advanced skill and knowledge set to address the social interaction aspects of Web-based usability testing and technical aspects of conferencing software during test sessions. Findings from moderator-focused studies can inform the design of remote testing platforms and real-time usability evaluation processes that place less cognitive burden on moderators and test users. PMID:27026291

  17. Early results from the Array of Things

    NASA Astrophysics Data System (ADS)

    Jacob, R. L.; Catlett, C.; Beckman, P. H.; Sankaran, R.

    2017-12-01

    The Array of Things (AoT) is an experimental sensor and edge-computing network being deployed in the City of Chicago. An AoT node contains sensors for temperature, pressure, humidity and several trace gases, as well as a 4-core CPU and a full Linux operating system. Custom software called "Waggle" controls the hardware and provides the data collection and transmission services. Each node is attached to a traffic signal light and has power 24/7. Data is sent over the cellular network in near realtime. With Chicago's Department of Transportation, we have been making test deployments of AoT nodes, evaluating their capabilities and comparing collected data with that from other observing systems in the Chicago area.

  18. Modernized build and test infrastructure for control software at ESO: highly flexible building, testing, and automatic quality practices for telescope control software

    NASA Astrophysics Data System (ADS)

    Pellegrin, F.; Jeram, B.; Haucke, J.; Feyrin, S.

    2016-07-01

    The paper describes the introduction of a new automated build and test infrastructure, based on the open-source software Jenkins, into the ESO Very Large Telescope control software, replacing the preexisting in-house solution. A brief introduction to software quality practices is given, followed by a description of the previous solution, its limitations, and new upcoming requirements. The modifications required to adopt the new system are described, how these were applied to the current software, and the results obtained. An overview of how the new system may be used in future projects is also presented.

  19. PD5: a general purpose library for primer design software.

    PubMed

    Riley, Michael C; Aubrey, Wayne; Young, Michael; Clare, Amanda

    2013-01-01

    Complex PCR applications for large genome-scale projects require fast, reliable and often highly sophisticated primer design software applications. Presently, such applications use pipelining methods to utilise many third party applications and this involves file parsing, interfacing and data conversion, which is slow and prone to error. A fully integrated suite of software tools for primer design would considerably improve the development time, the processing speed, and the reliability of bespoke primer design software applications. The PD5 software library is an open-source collection of classes and utilities, providing a complete collection of software building blocks for primer design and analysis. It is written in object-oriented C++ with an emphasis on classes suitable for efficient and rapid development of bespoke primer design programs. The modular design of the software library simplifies the development of specific applications and also integration with existing third party software where necessary. We demonstrate several applications created using this software library that have already proved to be effective, but we view the project as a dynamic environment for building primer design software and it is open for future development by the bioinformatics community. Therefore, the PD5 software library is published under the terms of the GNU General Public License, which guarantees access to source code and allows redistribution and modification. The PD5 software library is downloadable from Google Code and the accompanying Wiki includes instructions and examples: http://code.google.com/p/primer-design.

  20. Field Test of Route Planning Software for Lunar Polar Missions

    NASA Astrophysics Data System (ADS)

    Horchler, A. D.; Cunningham, C.; Jones, H. L.; Arnett, D.; Fang, E.; Amoroso, E.; Otten, N.; Kitchell, F.; Holst, I.; Rock, G.; Whittaker, W.

    2017-10-01

    A novel field test paradigm has been developed to demonstrate and validate route planning software in the stark low-angled light and sweeping shadows a rover would experience at the poles of the Moon. Software, ConOps, and test results are presented.

  1. Using collective variables to drive molecular dynamics simulations

    NASA Astrophysics Data System (ADS)

    Fiorin, Giacomo; Klein, Michael L.; Hénin, Jérôme

    2013-12-01

    A software framework is introduced that facilitates the application of biasing algorithms to collective variables of the type commonly employed to drive massively parallel molecular dynamics (MD) simulations. The modular framework that is presented enables one to combine existing collective variables into new ones, and combine any chosen collective variable with available biasing methods. The latter include the classic time-dependent biases referred to as steered MD and targeted MD, the temperature-accelerated MD algorithm, as well as the adaptive free-energy biases called metadynamics and adaptive biasing force. The present modular software is extensible, and portable between commonly used MD simulation engines.
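
    A toy sketch of the core idea, assuming a single collective variable (an interatomic distance) combined with a harmonic restraint of the kind used in steered MD; the coordinates and force constant are invented, and a real MD engine would supply positions and integrate the forces:

    ```python
    # Collective variable + harmonic bias sketch (steered-MD flavor).
    # Coordinates and the force constant are invented for illustration.
    import numpy as np

    def distance_cv(r_i, r_j):
        """Collective variable: interatomic distance."""
        return np.linalg.norm(r_i - r_j)

    def harmonic_bias(cv_value, center, k=10.0):
        """Bias energy and its derivative with respect to the CV."""
        delta = cv_value - center
        return 0.5 * k * delta**2, k * delta

    r1, r2 = np.array([0.0, 0.0, 0.0]), np.array([1.5, 0.0, 0.0])
    cv = distance_cv(r1, r2)
    energy, dE_dcv = harmonic_bias(cv, center=1.0)
    print(f"CV = {cv:.2f}, bias energy = {energy:.2f}, dE/dCV = {dE_dcv:.2f}")
    ```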

  2. Software engineering laboratory series: Annotated bibliography of software engineering laboratory literature

    NASA Technical Reports Server (NTRS)

    Morusiewicz, Linda; Valett, Jon

    1992-01-01

    This document is an annotated bibliography of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory. More than 100 publications are summarized. These publications cover many areas of software engineering and range from research reports to software documentation. This document has been updated and reorganized substantially since the original version (SEL-82-006, November 1982). All materials have been grouped into eight general subject areas for easy reference: (1) the Software Engineering Laboratory; (2) the Software Engineering Laboratory: Software Development Documents; (3) Software Tools; (4) Software Models; (5) Software Measurement; (6) Technology Evaluations; (7) Ada Technology; and (8) Data Collection. This document contains an index of these publications classified by individual author.

  3. Proceedings of the Annual Ada Software Engineering Education and Training Symposium (3rd) Held in Denver, Colorado on June 14-16, 1988

    DTIC Science & Technology

    1988-06-01

    Based Software Engineering Project Course ... Software Engineering, Software Engineering Concepts: The Importance of Object-Based ... quality assurance, and independent system testing. The Chief Programmer is responsible for all software development activities, including prototyping ... during the Requirements Analysis phase, the Preliminary Design, the Detailed Design, Coding and Unit Testing, CSC Integration and Testing, and informal

  4. Software OT&E Guidelines. Volume 1. Software Test Manager’s Handbook

    DTIC Science & Technology

    1981-02-01

    The Software OT&E Guidelines is a set of handbooks prepared by the Computer/Support Systems Division of the Test and Evaluation Directorate, Air Force Test and Evaluation ... E. Software Maintainability ... F. Standard Questionnaires ... 1. Operator-Computer Interface Evaluation

  5. Experimental Evaluation of Verification and Validation Tools on Martian Rover Software

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume; Giannakopoulou, Dimitra; Goldberg, Allen; Havelund, Klaus; Lowry, Mike; Pasareanu, Corina; Venet, Arnaud; Visser, Willem; Washington, Rich

    2003-01-01

    We report on a study to determine the maturity of different verification and validation (V&V) technologies on a representative example of NASA flight software. The study consisted of a controlled experiment where three technologies (static analysis, runtime analysis, and model checking) were compared to traditional testing with respect to their ability to find seeded errors in a prototype Mars Rover. What makes this study unique is that it is the first (to the best of our knowledge) to conduct a controlled experiment comparing formal-methods-based tools to testing on a realistic, industrial-size example, with the emphasis on collecting as much data as possible on the performance of the tools and the participants. The paper includes a description of the Rover code that was analyzed and the tools used, as well as a detailed description of the experimental setup and the results. Due to the complexity of setting up the experiment, our results cannot be generalized, but we believe the study can still serve as a valuable point of reference for future studies of this kind. It confirmed our belief that advanced tools can outperform testing when trying to locate concurrency errors. Furthermore, the results of the experiment inspired a novel framework for testing the next generation of the Rover.

  6. Cost-Sensitive Radial Basis Function Neural Network Classifier for Software Defect Prediction

    PubMed Central

    Venkatesan, R.

    2016-01-01

    Effective prediction of software modules that are prone to defects will enable software developers to achieve efficient allocation of resources and to concentrate on quality assurance activities. The software development life cycle basically includes design, analysis, implementation, testing, and release phases. Generally, software testing is a critical task in the software development process, wherein the aim is to save time and budget by detecting defects as early as possible and to deliver a defect-free (bug-free) product to the customers. This testing phase should be operated carefully and effectively to release a defect-free software product to the customers. In order to improve the software testing process, fault prediction methods identify the software parts that are more likely to be defect-prone. This paper proposes a prediction approach based on a conventional radial basis function neural network (RBFNN) and the novel adaptive dimensional biogeography based optimization (ADBBO) model. The developed ADBBO-based RBFNN model is tested with five publicly available datasets from the NASA data program repository. The computed results prove the effectiveness of the proposed ADBBO-RBFNN classifier approach with respect to the considered metrics in comparison with that of the early predictors available in the literature for the same datasets. PMID:27738649

  7. Cost-Sensitive Radial Basis Function Neural Network Classifier for Software Defect Prediction.

    PubMed

    Kumudha, P; Venkatesan, R

    Effective prediction of software modules that are prone to defects will enable software developers to achieve efficient allocation of resources and to concentrate on quality assurance activities. The software development life cycle basically includes design, analysis, implementation, testing, and release phases. Generally, software testing is a critical task in the software development process, wherein the aim is to save time and budget by detecting defects as early as possible and to deliver a defect-free (bug-free) product to the customers. This testing phase should be operated carefully and effectively to release a defect-free software product to the customers. In order to improve the software testing process, fault prediction methods identify the software parts that are more likely to be defect-prone. This paper proposes a prediction approach based on a conventional radial basis function neural network (RBFNN) and the novel adaptive dimensional biogeography based optimization (ADBBO) model. The developed ADBBO-based RBFNN model is tested with five publicly available datasets from the NASA data program repository. The computed results prove the effectiveness of the proposed ADBBO-RBFNN classifier approach with respect to the considered metrics in comparison with that of the early predictors available in the literature for the same datasets.

  8. Coordination and organization of security software process for power information application environment

    NASA Astrophysics Data System (ADS)

    Wang, Qiang

    2017-09-01

    As an important part of software engineering, the software process decides the success or failure of a software product. The design and development features of the security software process are discussed, as are the necessity and present significance of using such a process. In coordination with the functional software, the security software process and its testing are discussed in depth. The process includes requirement analysis, design, coding, debugging and testing, submission, and maintenance. For each phase, the paper proposes subprocesses to support software security. As an example, the paper introduces the above process into the power information platform.

  9. Coexistence of Named Data Networking (NDN) and Software-Defined Networking (SDN)

    DTIC Science & Technology

    2017-09-01

    Networking (NDN) and Software-Defined Networking (SDN) by Vinod Mishra, Computational and Information Sciences Directorate, ARL ...

  10. Co-streaming classes: a follow-up study in improving the user experience to better reach users.

    PubMed

    Hayes, Barrie E; Handler, Lara J; Main, Lindsey R

    2011-01-01

    Co-streaming classes have enabled library staff to extend open classes to distance education students and other users. Student evaluations showed that the model could be improved. Two areas required attention: audio problems experienced by online participants and staff teaching methods. Staff tested equipment and adjusted software configuration to improve user experience. Staff training increased familiarity with specialized teaching techniques and troubleshooting procedures. Technology testing and staff training were completed, and best practices were developed and applied. Class evaluations indicate improvements in classroom experience. Future plans include expanding co-streaming to more classes and on-going data collection, evaluation, and improvement of classes.

  11. The Rapid Integration and Test Environment: A Process for Achieving Software Test Acceptance

    DTIC Science & Technology

    2010-05-01

    Test Environment: A Process for Achieving Software Test Acceptance ... The Rapid Integration and Test Environment: A Process for Achieving Software Test Acceptance. Patrick V... was awarded the Bronze Star. Introduction: The Rapid Integration and Test Environment (RITE) initiative, implemented by the Program Executive Office

  12. IMCS reflight certification requirements and design specifications

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The requirements for reflight certification are established. Software requirements encompass the software programs that are resident in the PCC, DEP, PDSS, EC, or any related GSE. A design approach for the reflight software packages is recommended. These designs will be of sufficient detail to permit the implementation of reflight software. The PDSS/IMC Reflight Certification system provides the tools and mechanisms for the user to perform the reflight certification test procedures, test data capture, test data display, and test data analysis. The system as defined will be structured to permit maximum automation of reflight certification procedures and test data analysis.

  13. Data quality can make or break a research infrastructure

    NASA Astrophysics Data System (ADS)

    Pastorello, G.; Gunter, D.; Chu, H.; Christianson, D. S.; Trotta, C.; Canfora, E.; Faybishenko, B.; Cheah, Y. W.; Beekwilder, N.; Chan, S.; Dengel, S.; Keenan, T. F.; O'Brien, F.; Elbashandy, A.; Poindexter, C.; Humphrey, M.; Papale, D.; Agarwal, D.

    2017-12-01

    Research infrastructures (RIs) commonly support observational data provided by multiple, independent sources. Uniformity in the data distributed by such RIs is important in most applications, e.g., in comparative studies using data from two or more sources. Achieving uniformity in terms of data quality is challenging, especially considering that many data issues are unpredictable and cannot be detected until a first occurrence of the issue. As a result, many data quality control activities within RIs require a manual, human-in-the-loop element, making them expensive. Our motivating example is the FLUXNET2015 dataset - a collection of ecosystem-level carbon, water, and energy fluxes between land and atmosphere from over 200 sites around the world, some sites with over 20 years of data. About 90% of the human effort to create the dataset was spent in data quality related activities. Based on this experience, we have been working on solutions to increase the automation of data quality control procedures. Since it is nearly impossible to fully automate all quality related checks, we have been drawing from the experience with techniques used in software development, which shares a few common constraints. In both managing scientific data and writing software, human time is a precious resource; code bases, like science datasets, can be large, complex, and full of errors; and both scientific and software endeavors can be pursued by individuals, but collaborative teams can accomplish a lot more. The lucrative and fast-paced nature of the software industry fueled the creation of methods and tools to increase automation and productivity within these constraints. Issue tracking systems, methods for translating problems into automated tests, and powerful version control tools are a few examples. Terrestrial and aquatic ecosystems research relies heavily on many types of observational data. As the volume of data collection increases, ensuring data quality is becoming an unwieldy challenge for RIs. Business-as-usual approaches to data quality do not work with larger data volumes. We believe RIs can benefit greatly from adapting and imitating this body of theory and practice from software quality into data quality, enabling systematic and reproducible safeguards against errors and mistakes in datasets as much as in software.
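    The idea of translating recurring data problems into automated tests can be made concrete with a short sketch. The following is a hypothetical illustration (not the FLUXNET2015 tooling); the column name, thresholds, and fabricated rows are invented for the example, written as a pytest-style check in Python.

    ```python
    # Minimal sketch of a data-quality check expressed as an automated test,
    # in the spirit of software unit tests. Column name and thresholds are
    # hypothetical illustrations, not FLUXNET2015's actual rules.
    import pandas as pd

    def check_physical_range(df: pd.DataFrame, column: str, lo: float, hi: float) -> pd.Series:
        """Return a boolean mask flagging values outside a plausible physical range."""
        return (df[column] < lo) | (df[column] > hi)

    def test_air_temperature_within_range():
        # A real suite would load a site's data file; here we fabricate rows.
        df = pd.DataFrame({"TA": [12.3, 14.1, -95.0, 16.8]})  # -95 degC is clearly an error
        bad = check_physical_range(df, "TA", lo=-60.0, hi=60.0)
        # A failing assertion documents the issue and makes the check reproducible.
        assert bad.sum() == 1, f"unexpected out-of-range count: {bad.sum()}"

    if __name__ == "__main__":
        test_air_temperature_within_range()
    ```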

  14. Cassini's Test Methodology for Flight Software Verification and Operations

    NASA Technical Reports Server (NTRS)

    Wang, Eric; Brown, Jay

    2007-01-01

    The Cassini spacecraft was launched on 15 October 1997 on a Titan IV-B launch vehicle. The spacecraft is comprised of various subsystems, including the Attitude and Articulation Control Subsystem (AACS). Development of the AACS Flight Software (FSW) has been an ongoing effort, from design and development through operations. As planned, major modifications to certain FSW functions were designed, tested, verified, and uploaded during the cruise phase of the mission. Each flight software upload involved extensive verification testing. A standardized FSW testing methodology was used to verify the integrity of the flight software. This paper summarizes the flight software testing methodology used for verifying FSW from pre-launch through the prime mission, with an emphasis on flight experience testing during the first 2.5 years of the prime mission (July 2004 through January 2007).

  15. Passive Seismic Monitoring for Rockfall at Yucca Mountain: Concept Tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheng, J; Twilley, K; Murvosh, H

    2003-03-03

    For the purpose of proof-testing a system intended to remotely monitor rockfall inside a potential radioactive waste repository at Yucca Mountain, a system of seismic sub-arrays will be deployed and tested on the surface of the mountain. The goal is to identify and locate rockfall events remotely using automated data collecting and processing techniques. We install seismometers on the ground surface, generate seismic energy to simulate rockfall in underground space beneath the array, and interpret the surface response to discriminate and locate the event. Data will be analyzed using matched-field processing, a generalized beam forming method for localizing discrete signals. Software is being developed to facilitate the processing. To date, a three-component sub-array has been installed and successfully tested.

  16. An Interactive, Mobile-Based Tool for Personal Social Network Data Collection and Visualization Among a Geographically Isolated and Socioeconomically Disadvantaged Population: Early-Stage Feasibility Study With Qualitative User Feedback.

    PubMed

    Eddens, Katherine S; Fagan, Jesse M; Collins, Tom

    2017-06-22

    Personal social networks have a profound impact on our health, yet collecting personal network data for use in health communication, behavior change, or translation and dissemination interventions has proved challenging. Recent advances in social network data collection software have reduced the burden of network studies on researchers and respondents alike, yet little testing has occurred to discover whether these methods are: (1) acceptable to a variety of target populations, including those who may have limited experience with technology or limited literacy; and (2) practical in the field, specifically in areas that are geographically and technologically disconnected, such as rural Appalachian Kentucky. We explored the early-stage feasibility (Acceptability, Demand, Implementation, and Practicality) of using innovative, interactive, tablet-based network data collection and visualization software (OpenEddi) in field collection of personal network data in Appalachian Kentucky. A total of 168 rural Appalachian women who had previously participated in a study on the use of a self-collected vaginal swab (SCVS) for human papillomavirus testing were recruited by community-based nurse interviewers between September 2013 and August 2014. Participants completed egocentric network surveys via OpenEddi, which captured social and communication network influences on participation in, and recruitment to, the SCVS study. After study completion, we conducted a qualitative group interview with four nurse interviewers and two participants in the network study. Using this qualitative data, and quantitative data from the network study, we applied guidelines from Bowen et al to assess feasibility in four areas of early-stage development of OpenEddi: Acceptability, Demand, Implementation, and Practicality. Basic descriptive network statistics (size, edges, density) were analyzed using RStudio. OpenEddi was perceived as fun, novel, and superior to other data collection methods or tools. Respondents enjoyed the social network survey component, and visualizing social networks produced thoughtful responses from participants about leveraging or changing network content and structure for specific health-promoting purposes. Areas for improved literacy and functionality of the tool were identified. However, technical issues led to substantial (50%) data loss, limiting the success of its implementation from a researcher's perspective, and hindering practicality in the field. OpenEddi is a promising data collection tool for use in geographically isolated and socioeconomically disadvantaged populations. Future development will mitigate technical problems, improve usability and literacy, and test new methods of data collection. These changes will support goals for use of this tool in the delivery of network-based health communication and social support interventions to socioeconomically disadvantaged populations. ©Katherine S Eddens, Jesse M Fagan, Tom Collins. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 22.06.2017.
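    The basic descriptive network statistics named above (size, edges, density) were analyzed in RStudio; as a hedged illustration of the same quantities, the sketch below computes them in Python with the networkx library on an invented ego network.

    ```python
    # Illustration only (not OpenEddi code): computing the descriptive
    # statistics named in the study -- network size, edge count, and density --
    # for one respondent's ego network, using the networkx library.
    import networkx as nx

    ego_net = nx.Graph()
    # Hypothetical alters named by one respondent, plus ties among them.
    ego_net.add_edges_from([
        ("ego", "sister"), ("ego", "neighbor"), ("ego", "nurse"),
        ("sister", "neighbor"),
    ])

    size = ego_net.number_of_nodes()
    edges = ego_net.number_of_edges()
    density = nx.density(ego_net)  # observed edges / possible edges
    print(f"size={size} edges={edges} density={density:.2f}")
    ```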

  17. An Interactive, Mobile-Based Tool for Personal Social Network Data Collection and Visualization Among a Geographically Isolated and Socioeconomically Disadvantaged Population: Early-Stage Feasibility Study With Qualitative User Feedback

    PubMed Central

    Fagan, Jesse M; Collins, Tom

    2017-01-01

    Background Personal social networks have a profound impact on our health, yet collecting personal network data for use in health communication, behavior change, or translation and dissemination interventions has proved challenging. Recent advances in social network data collection software have reduced the burden of network studies on researchers and respondents alike, yet little testing has occurred to discover whether these methods are: (1) acceptable to a variety of target populations, including those who may have limited experience with technology or limited literacy; and (2) practical in the field, specifically in areas that are geographically and technologically disconnected, such as rural Appalachian Kentucky. Objective We explored the early-stage feasibility (Acceptability, Demand, Implementation, and Practicality) of using innovative, interactive, tablet-based network data collection and visualization software (OpenEddi) in field collection of personal network data in Appalachian Kentucky. Methods A total of 168 rural Appalachian women who had previously participated in a study on the use of a self-collected vaginal swab (SCVS) for human papillomavirus testing were recruited by community-based nurse interviewers between September 2013 and August 2014. Participants completed egocentric network surveys via OpenEddi, which captured social and communication network influences on participation in, and recruitment to, the SCVS study. After study completion, we conducted a qualitative group interview with four nurse interviewers and two participants in the network study. Using this qualitative data, and quantitative data from the network study, we applied guidelines from Bowen et al to assess feasibility in four areas of early-stage development of OpenEddi: Acceptability, Demand, Implementation, and Practicality. Basic descriptive network statistics (size, edges, density) were analyzed using RStudio. Results OpenEddi was perceived as fun, novel, and superior to other data collection methods or tools. Respondents enjoyed the social network survey component, and visualizing social networks produced thoughtful responses from participants about leveraging or changing network content and structure for specific health-promoting purposes. Areas for improved literacy and functionality of the tool were identified. However, technical issues led to substantial (50%) data loss, limiting the success of its implementation from a researcher’s perspective, and hindering practicality in the field. Conclusions OpenEddi is a promising data collection tool for use in geographically isolated and socioeconomically disadvantaged populations. Future development will mitigate technical problems, improve usability and literacy, and test new methods of data collection. These changes will support goals for use of this tool in the delivery of network-based health communication and social support interventions to socioeconomically disadvantaged populations. PMID:28642217

  18. Submarine pipeline on-bottom stability. Volume 2: Software and manuals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1998-12-01

    The state-of-the-art in pipeline stability design has been changing very rapidly in recent years. The physics governing on-bottom stability are much better understood now than they were eight years ago, largely because of research and large-scale model tests sponsored by PRCI. Analysis tools utilizing this new knowledge have been developed. These tools provide the design engineer with a rational approach for weight coating design, which can be used with confidence because the tools have been developed based on full scale and near full scale model tests. These tools represent the state-of-the-art in stability design and model the complex behavior of pipes subjected to both wave and current loads. These include: hydrodynamic forces which account for the effect of the wake (generated by flow over the pipe) washing back and forth over the pipe in oscillatory flow; and the embedment (digging) which occurs as a pipe resting on the seabed is exposed to oscillatory loadings and small oscillatory deflections. This report has been developed as a reference handbook for use in on-bottom pipeline stability analysis. It consists of two volumes. Volume one is devoted to descriptions of the various aspects of the problem: the pipeline design process; ocean physics, wave mechanics, hydrodynamic forces, and meteorological data determination; geotechnical data collection and soil mechanics; and stability design procedures. Volume two describes, lists, and illustrates the analysis software. Diskettes containing the software and examples of its use are also included in Volume two.

  19. Uav Photogrammetry: Block Triangulation Comparisons

    NASA Astrophysics Data System (ADS)

    Gini, R.; Pagliari, D.; Passoni, D.; Pinto, L.; Sona, G.; Dosso, P.

    2013-08-01

    UAV systems represent a flexible technology able to collect a large amount of high-resolution information, both for metric and interpretation uses. In the frame of experimental tests carried out at Dept. ICA of Politecnico di Milano to validate vector-sensor systems and to assess the metric accuracy of images acquired by UAVs, a block of photos taken by a fixed-wing system was triangulated with several software packages. The test field is a rural area included in an Italian park ("Parco Adda Nord"), useful for studying flight and imagery performance on buildings, roads, and cultivated and uncultivated vegetation. The UAV SenseFly, equipped with a Canon Ixus 220HS camera, flew autonomously over the area at a height of 130 m, yielding a block of 49 images divided into 5 strips. Sixteen pre-signalized Ground Control Points, surveyed in the area through GPS (NRTK survey), allowed the referencing of the block and accuracy analyses. Approximate values for the exterior orientation parameters (positions and attitudes) were recorded by the flight control system. The block was processed with several software packages: Erdas-LPS, EyeDEA (Univ. of Parma), Agisoft Photoscan, and Pix4UAV, in assisted or automatic mode. Comparisons of the results are given in terms of differences among digital surface models, differences in orientation parameters, and accuracies, when available. Moreover, image and ground point coordinates obtained by the various software packages were independently used as initial values in a comparative adjustment made by scientific in-house software, which can apply constraints to evaluate the effectiveness of different methods of point extraction and accuracies on ground check points.

  20. Extreme learning machine: a new alternative for measuring heat collection rate and heat loss coefficient of water-in-glass evacuated tube solar water heaters.

    PubMed

    Liu, Zhijian; Li, Hao; Tang, Xindong; Zhang, Xinyu; Lin, Fan; Cheng, Kewei

    2016-01-01

    Heat collection rate and heat loss coefficient are crucial indicators for the evaluation of in-service water-in-glass evacuated tube solar water heaters. However, their direct determination requires complex detection devices and a series of standard experiments, consuming considerable time and manpower. To address this problem, we previously used artificial neural networks and support vector machines to develop precise knowledge-based models for predicting the heat collection rates and heat loss coefficients of water-in-glass evacuated tube solar water heaters, setting the properties measured by "portable test instruments" as the independent variables. A robust software application for the determination was also developed. However, in previous results, the prediction accuracy for heat loss coefficients could still be improved compared to that for heat collection rates. Also, in practical applications, even a small reduction in root mean square error (RMSE) can sometimes significantly improve the evaluation and business processes. As a further study, in this short report, we show that using a novel and fast machine learning algorithm, the extreme learning machine, can generate better predictions for the heat loss coefficient, reducing the average RMSE to 0.67 in testing.
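    As a rough illustration of the method named above, the sketch below implements a minimal extreme learning machine regressor in Python/numpy: input weights and biases are drawn at random and only the output weights are solved in closed form. The data are synthetic stand-ins, not the paper's measurements.

    ```python
    # Minimal extreme learning machine (ELM) regression sketch in numpy.
    # Features and targets are synthetic, not the study's dataset.
    import numpy as np

    rng = np.random.default_rng(0)

    def elm_fit(X, y, n_hidden=50):
        """Random hidden layer + least-squares output weights."""
        W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights (never trained)
        b = rng.normal(size=n_hidden)                 # random biases
        H = np.tanh(X @ W + b)                        # hidden-layer activations
        beta = np.linalg.pinv(H) @ y                  # closed-form output weights
        return W, b, beta

    def elm_predict(X, W, b, beta):
        return np.tanh(X @ W + b) @ beta

    # Synthetic example: 4 "portable instrument" features -> heat loss coefficient.
    X = rng.normal(size=(200, 4))
    y = X @ np.array([1.0, -0.5, 0.3, 0.8]) + 0.1 * rng.normal(size=200)
    W, b, beta = elm_fit(X[:150], y[:150])
    rmse = np.sqrt(np.mean((elm_predict(X[150:], W, b, beta) - y[150:]) ** 2))
    print(f"test RMSE: {rmse:.3f}")
    ```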

  1. Kalman Filters for UXO Detection: Real-Time Feedback and Small Target Detection

    DTIC Science & Technology

    2012-05-01

    last two decades. Accomplishments reported from both hardware and software points of view have moved the research focus from simple laboratory tests...quality data, which in turn require good positioning of the sensors atop the UXOs. The data collection protocol is currently based on a two-stage process...Note that this result is merely an illustration of the convergence of the Kalman filter. In practice, the linear part can be directly inverted for if
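    Since the excerpt above turns on the convergence of the Kalman filter, a generic scalar example may help; the sketch below illustrates only that convergence behavior and is not the report's UXO inversion code.

    ```python
    # Scalar Kalman filter estimating a constant from noisy measurements.
    # Watch the estimate settle and the variance shrink monotonically.
    import numpy as np

    rng = np.random.default_rng(1)
    truth = 2.0                    # constant state to be estimated
    R = 0.5 ** 2                   # measurement noise variance
    x, P = 0.0, 10.0               # initial estimate and (deliberately large) variance

    for k in range(20):
        z = truth + rng.normal(scale=0.5)   # noisy measurement
        K = P / (P + R)                     # Kalman gain
        x = x + K * (z - x)                 # state update
        P = (1 - K) * P                     # variance update
        print(f"step {k:2d}: estimate={x:.3f} variance={P:.4f}")
    ```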

  2. Army Training and Testing Area Carrying Capacity (ATTACC) LS Factor Calculator User Manual, Version 1.0

    DTIC Science & Technology

    2002-08-01

    of these elevation files depends on the contour interval and map scale of the original contour data. The choice of data source for determining the...the ATTACC LCM Installation CD. Not all versions of the ATTACC LCM Installation CD have the LS Factor Calculator software included. The process for...

  3. Mars Science Laboratory Flight Software Boot Robustness Testing Project Report

    NASA Technical Reports Server (NTRS)

    Roth, Brian

    2011-01-01

    On the surface of Mars, the Mars Science Laboratory will boot up its flight computers every morning, having charged the batteries through the night. This boot process is complicated, critical, and affected by numerous hardware states that can be difficult to test. The hardware test beds do not facilitate testing long runs of back-to-back unattended automated tests, and although the software simulation has provided the necessary functionality and fidelity for this boot testing, it has not supported the full flexibility necessary for this task. Therefore, to perform this testing, a framework has been built around the software simulation that supports running automated tests loading a variety of starting configurations for software and hardware states. This implementation has been tested against the nominal cases to validate the methodology, and support for configuring off-nominal cases is ongoing. The implication of this testing is that the introduction of input configurations that have so far proved difficult to test may reveal boot scenarios worth higher-fidelity investigation, and in other cases increase confidence in the robustness of the flight software boot process.

  4. Glossary of Software Engineering Laboratory terms

    NASA Technical Reports Server (NTRS)

    1983-01-01

    A glossary of terms used in the Software Engineering Laboratory (SEL) is given. The terms are defined within the context of the software development environment for flight dynamics at the Goddard Space Flight Center. A concise reference for clarifying the language employed in SEL documents and data collection forms is given. Basic software engineering concepts are explained and standard definitions for use by SEL personnel are established.

  5. Technology-driven dietary assessment: a software developer’s perspective

    PubMed Central

    Buday, Richard; Tapia, Ramsey; Maze, Gary R.

    2015-01-01

    Dietary researchers need new software to improve nutrition data collection and analysis, but creating information technology is difficult. Software development projects may be unsuccessful due to inadequate understanding of needs, management problems, technology barriers or legal hurdles. Cost overruns and schedule delays are common. Barriers facing scientific researchers developing software include workflow, cost, schedule, and team issues. Different methods of software development and the role that intellectual property rights play are discussed. A dietary researcher must carefully consider multiple issues to maximize the likelihood of success when creating new software. PMID:22591224

  6. Application-level regression testing framework using Jenkins

    DOE PAGES

    Budiardja, Reuben; Bouvet, Timothy; Arnold, Galen

    2017-09-26

    Monitoring and testing for regression of large-scale systems such as the NCSA's Blue Waters supercomputer are challenging tasks. In this paper, we describe the solution we came up with to perform those tasks. The goal was to find an automated solution for running user-level regression tests to evaluate system usability and performance. Jenkins, an automation server, was chosen for its versatility, large user base, and multitude of plugins, including plugins for collecting data and plotting test results over time. We also describe our Jenkins deployment to launch and monitor jobs on a remote HPC system, perform authentication with one-time passwords, and integrate with our LDAP server for authorization. We show some use cases and describe our best practices for successfully using Jenkins as a user-level, system-wide regression testing and monitoring framework for large supercomputer systems.
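    As a hedged sketch of what one such user-level check might look like, the Python script below times a stand-in benchmark command and appends the result to a CSV that a Jenkins plotting plugin could track over time; the command, file path, and threshold are hypothetical, not the paper's configuration.

    ```python
    # Hypothetical regression check a Jenkins job might run on an HPC login
    # node: time a benchmark, record the result, fail the build on regression.
    import csv, subprocess, sys, time

    THRESHOLD_S = 30.0                        # alert if the benchmark slows past this
    t0 = time.monotonic()
    proc = subprocess.run(["/usr/bin/true"])  # stand-in for a real benchmark command
    elapsed = time.monotonic() - t0

    with open("regression_history.csv", "a", newline="") as f:
        csv.writer(f).writerow([time.strftime("%Y-%m-%d %H:%M"), f"{elapsed:.2f}"])

    # A nonzero exit code marks the Jenkins build as failed/unstable.
    sys.exit(0 if proc.returncode == 0 and elapsed < THRESHOLD_S else 1)
    ```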

  7. Application-level regression testing framework using Jenkins

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Budiardja, Reuben; Bouvet, Timothy; Arnold, Galen

    Monitoring and testing for regression of large-scale systems such as the NCSA's Blue Waters supercomputer are challenging tasks. In this paper, we describe the solution we came up with to perform those tasks. The goal was to find an automated solution for running user-level regression tests to evaluate system usability and performance. Jenkins, an automation server, was chosen for its versatility, large user base, and multitude of plugins, including plugins for collecting data and plotting test results over time. We also describe our Jenkins deployment to launch and monitor jobs on a remote HPC system, perform authentication with one-time passwords, and integrate with our LDAP server for authorization. We show some use cases and describe our best practices for successfully using Jenkins as a user-level, system-wide regression testing and monitoring framework for large supercomputer systems.

  8. Experiments in fault tolerant software reliability

    NASA Technical Reports Server (NTRS)

    Mcallister, David F.; Tai, K. C.; Vouk, Mladen A.

    1987-01-01

    The reliability of voting was evaluated in a fault-tolerant software system for small output spaces. The effectiveness of the back-to-back testing process was investigated. Version 3.0 of the RSDIMU-ATS, a semi-automated test bed for certification testing of RSDIMU software, was prepared and distributed. Software reliability estimation methods based on non-random sampling are being studied. The investigation of existing fault-tolerance models was continued and formulation of new models was initiated.
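    For readers unfamiliar with voting in fault-tolerant software, a minimal majority voter over a small output space looks like the following sketch (an illustration, not the RSDIMU test-bed code).

    ```python
    # Illustrative majority voter for an N-version fault-tolerant system.
    from collections import Counter

    def majority_vote(outputs):
        """Return the majority output of independent versions, or None if no majority."""
        value, count = Counter(outputs).most_common(1)[0]
        return value if count > len(outputs) / 2 else None

    # One faulty version is outvoted; three-way disagreement yields no answer.
    print(majority_vote(["LEFT", "LEFT", "RIGHT"]))  # -> LEFT
    print(majority_vote(["LEFT", "RIGHT", "UP"]))    # -> None
    ```

    Note that with small output spaces, distinct faulty versions can coincide on the same wrong answer, which is precisely why the reliability of voting merits study.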

  9. Glossary of software engineering laboratory terms

    NASA Technical Reports Server (NTRS)

    1982-01-01

    A glossary of terms used in the Software Engineering Laboratory (SEL) is presented. The terms are defined within the context of the software development environment for flight dynamics at Goddard Space Flight Center. A concise reference for clarifying and understanding the language employed in SEL documents and data collection forms is provided.

  10. Computer software management, evaluation, and dissemination

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The activities of the Computer Software Management and Information Center involving the collection, processing, and distribution of software developed under the auspices of NASA and certain other federal agencies are reported. Program checkout and evaluation, inventory control, customer services and marketing, dissemination, program maintenance, and special development tasks are discussed.

  11. A Public Domain Software Library for Reading and Language Arts.

    ERIC Educational Resources Information Center

    Balajthy, Ernest

    A three-year project carried out by the Microcomputers and Reading Committee of the New Jersey Reading Association involved the collection, improvement, and distribution of free microcomputer software (public domain programs) designed to deal with reading and writing skills. Acknowledging that this free software is not without limitations (poor…

  12. Evaluating Games-Based Learning

    ERIC Educational Resources Information Center

    Hainey, Thomas; Connolly, Thomas

    2010-01-01

    A highly important part of software engineering education is requirements collection and analysis, one of the initial stages of the Software Development Lifecycle. No other conceptual work is as difficult to rectify at a later stage or as damaging to the overall system if performed incorrectly. As software engineering is a field with a reputation…

  13. 78 FR 45196 - Federal Acquisition Regulation; Information Collection; Rights in Data and Copyrights

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-26

    .... (5) FAR 52.227-19, Commercial Computer Software License. This clause is used in contracts and... standard commercial license. The contractor is responsible for affixing a notice on any commercial software..., Representation of Limited Rights Data and Restricted Computer Software. This clause is included in solicitations...

  14. Time-Domain Terahertz Computed Axial Tomography NDE System

    NASA Technical Reports Server (NTRS)

    Zimdars, David

    2012-01-01

    NASA has identified the need for advanced non-destructive evaluation (NDE) methods to characterize aging and durability in aircraft materials to improve the safety of the nation's airline fleet. 3D THz tomography can play a major role in detection and characterization of flaws and degradation in aircraft materials, including Kevlar-based composites and Kevlar and Zylon fabric covers for soft-shell fan containment where aging and durability issues are critical. A prototype computed tomography (CT) time-domain (TD) THz imaging system has been used to generate 3D images of several test objects including a TUFI tile (a thermal protection system tile used on the Space Shuttle and possibly the Orion or similar capsules). This TUFI tile had simulated impact damage that was located and the depth of damage determined. The CT motion control gantry was designed and constructed, and then integrated with a T-Ray 4000 control unit and motion controller to create a complete CT TD-THz imaging system prototype. A data collection software script was developed that takes multiple z-axis slices in sequence and saves the data for batch processing. The data collection software was integrated with the ability to batch process the slice data with the CT TD-THz image reconstruction software. The time required to take a single CT slice was decreased from six minutes to approximately one minute by replacing the 320 ps, 100-Hz waveform acquisition system with an 80 ps, 1,000-Hz waveform acquisition system. The TD-THz computed tomography system was built from pre-existing commercial off-the-shelf subsystems. A CT motion control gantry was constructed from COTS components that can handle larger samples. The motion control gantry allows inspection of sample sizes of up to approximately one cubic foot (~0.03 cubic meters). The system reduced to practice a CT TD-THz system incorporating a COTS 80-ps/1-kHz waveform scanner. The incorporation of this scanner in the system allows acquisition of 3D slice data with better signal-to-noise using a COTS scanner rather than the "chirped" scanner. The system also reduced to practice a prototype for commercial CT systems for insulating materials where safety concerns cannot accommodate x-ray. A software script was written to automate the COTS software to collect and process TD-THz CT data.

  15. Java Mission Evaluation Workstation System

    NASA Technical Reports Server (NTRS)

    Pettinger, Ross; Watlington, Tim; Ryley, Richard; Harbour, Jeff

    2006-01-01

    The Java Mission Evaluation Workstation System (JMEWS) is a collection of applications designed to retrieve, display, and analyze both real-time and recorded telemetry data. This software is currently being used by both the Space Shuttle Program (SSP) and the International Space Station (ISS) program. JMEWS was written in the Java programming language to satisfy the requirement of platform independence. An object-oriented design was used to satisfy additional requirements and to make the software easily extendable. By virtue of its platform independence, JMEWS can be used on the UNIX workstations in the Mission Control Center (MCC) and on office computers. JMEWS includes an interactive editor that allows users to easily develop displays that meet their specific needs. The displays can be developed and modified while viewing data. By simply selecting a data source, the user can view real-time, recorded, or test data.

  16. Computer image analysis in obtaining characteristics of images: greenhouse tomatoes in the process of generating learning sets of artificial neural networks

    NASA Astrophysics Data System (ADS)

    Zaborowicz, M.; Przybył, J.; Koszela, K.; Boniecki, P.; Mueller, W.; Raba, B.; Lewicki, A.; Przybył, K.

    2014-04-01

    The aim of the project was to develop software that extracts the characteristics of a greenhouse tomato from its image. Data gathered during image analysis and processing were used to build learning sets for artificial neural networks. The program can process pictures in JPEG format, acquire statistical information from the picture, and export it to an external file. The software is intended to batch-analyze the collected research material and save the obtained information as a CSV file. The program analyzes 33 independent parameters to describe the tested image. The application is dedicated to the processing and image analysis of greenhouse tomatoes, but it can also be used to analyze other fruits and vegetables of spherical shape.
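    A hedged sketch of such a batch pipeline is shown below: it computes a few per-image statistics for JPEG files and exports them to a CSV file. The folder name and the three statistics are invented for illustration; the actual program extracts 33 parameters.

    ```python
    # Illustrative batch image analysis: per-image color statistics to CSV.
    import csv, glob
    import numpy as np
    from PIL import Image

    with open("tomato_features.csv", "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["file", "mean_red", "mean_green", "mean_blue"])
        for path in glob.glob("images/*.jpg"):   # hypothetical input folder
            rgb = np.asarray(Image.open(path).convert("RGB"), dtype=float)
            writer.writerow([path, *(f"{rgb[..., c].mean():.1f}" for c in range(3))])
    ```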

  17. Research on Automatic Positioning System of Ultrasonic Testing of Wind Turbine Blade Flaws

    NASA Astrophysics Data System (ADS)

    Liu, Q. X.; Wang, Z. H.; Long, S. G.; Cai, M.; Cai, M.; Wang, X.; Chen, X. Y.; Bu, J. L.

    2017-11-01

    Ultrasonic testing technology is essential in the non-destructive testing of wind turbine blades. In practice, however, ultrasonic flaw detection has been employed inefficiently in recent years, because testing results show small deviations due to human, environmental, and technical factors. There is therefore an urgent demand for engineers to test various flaws efficiently and quickly. An automatic positioning system has been designed in this paper to record the moving coordinates and the target distance in real time; simultaneously, it can launch and acquire the sonic wave automatically. The system uses the ADNS-3080 optoelectronic chip manufactured by Agilent Technologies Inc. Combining the chip with the power conversion module and the USB transmission module, the collected data can be transmitted from the upper monitor to the hardware, which processes and controls the data through software programming. An experiment was designed to prove the reliability of the automatic positioning system. The result was validated by comparing the data collected from LabVIEW with actual plots on a Perspex plane; it is concluded that the system possesses high accuracy and is of significant value in practical engineering.

  18. IHE cross-enterprise document sharing for imaging: interoperability testing software

    PubMed Central

    2010-01-01

    Background With the deployment of Electronic Health Records (EHRs), interoperability testing in healthcare is becoming crucial. An EHR enables access to prior diagnostic information in order to assist in health decisions. It is a virtual system that results from the cooperation of several heterogeneous distributed systems. Interoperability between peers is therefore essential. Achieving interoperability requires various types of testing. Implementations need to be tested using software that simulates communication partners and that provides test data and test plans. Results In this paper we describe software that is used to test systems involved in sharing medical images within the EHR. Our software is used as part of the Integrating the Healthcare Enterprise (IHE) testing process to test the Cross-Enterprise Document Sharing for Imaging (XDS-I) integration profile. We describe its architecture and functionalities; we also expose the challenges encountered and discuss the elected design solutions. Conclusions EHRs are being deployed in several countries. The EHR infrastructure will be continuously evolving to embrace advances in the information technology domain. Our software is built on a web framework to allow for easy evolution with web technology. The testing software is publicly available; it can be used by system implementers to test their implementations. It can also be used by site integrators to verify and test the interoperability of systems, or by developers to understand specification ambiguities or to resolve implementation difficulties. PMID:20858241

  19. IHE cross-enterprise document sharing for imaging: interoperability testing software.

    PubMed

    Noumeir, Rita; Renaud, Bérubé

    2010-09-21

    With the deployment of Electronic Health Records (EHRs), interoperability testing in healthcare is becoming crucial. An EHR enables access to prior diagnostic information in order to assist in health decisions. It is a virtual system that results from the cooperation of several heterogeneous distributed systems. Interoperability between peers is therefore essential. Achieving interoperability requires various types of testing. Implementations need to be tested using software that simulates communication partners and that provides test data and test plans. In this paper we describe software that is used to test systems involved in sharing medical images within the EHR. Our software is used as part of the Integrating the Healthcare Enterprise (IHE) testing process to test the Cross-Enterprise Document Sharing for Imaging (XDS-I) integration profile. We describe its architecture and functionalities; we also expose the challenges encountered and discuss the elected design solutions. EHRs are being deployed in several countries. The EHR infrastructure will be continuously evolving to embrace advances in the information technology domain. Our software is built on a web framework to allow for easy evolution with web technology. The testing software is publicly available; it can be used by system implementers to test their implementations. It can also be used by site integrators to verify and test the interoperability of systems, or by developers to understand specification ambiguities or to resolve implementation difficulties.

  20. Measurement of the area of venous ulcers using two software programs.

    PubMed

    Eberhardt, Thaís Dresch; Lima, Suzinara Beatriz Soares de; Lopes, Luis Felipe Dias; Borges, Eline de Lima; Weiller, Teresinha Heck; Fonseca, Graziele Gorete Portella da

    2016-12-19

    To compare the measurement of the area of venous ulcers using the AutoCAD(r) and Image Tool software programs, an assessment of the reproducibility of tests was conducted in an angiology clinic of a university hospital. Data were collected from 21 patients with venous ulcers, from March to July of 2015, using a collection form and photographs of the wounds. Five nurses (evaluators) from the hospital's skin wound study group participated. The wounds were measured using both software programs. Data were analyzed using the intraclass correlation coefficient, the concordance correlation coefficient, and Bland-Altman analysis. The study met the ethical requirements of current legislation. The sizes of the ulcers varied widely, but without significant difference between the measurements; an excellent intraclass and concordance correlation was found between the two software programs, which appear to be more accurate when measuring a wound area > 10 cm². The use of both software programs is appropriate for the measurement of venous ulcers, and they appear to be more accurate when used to measure wound areas > 10 cm².
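    The Bland-Altman analysis used above reduces to simple arithmetic on paired measurements; the sketch below illustrates it on invented area values, not the study's data.

    ```python
    # Bland-Altman agreement sketch for paired wound-area measurements
    # of the same wounds from two software programs (synthetic values).
    import numpy as np

    area_a = np.array([12.4, 8.1, 15.3, 6.7, 22.0])   # e.g., AutoCAD(r), cm^2
    area_b = np.array([12.9, 7.8, 15.0, 7.1, 21.4])   # e.g., Image Tool, cm^2

    diff = area_a - area_b
    bias = diff.mean()                                # systematic difference
    loa = 1.96 * diff.std(ddof=1)                     # 95% limits of agreement
    print(f"bias={bias:.2f} cm^2, limits of agreement: "
          f"[{bias - loa:.2f}, {bias + loa:.2f}] cm^2")
    ```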

  1. 15 CFR 740.9 - Temporary imports, exports, and reexports (TMP).

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... the end of the beta test period as defined by the software producer or, if the software producer does... States; and exports and reexports of beta test software. (a) Temporary exports and reexports—(1) Scope. You may export and reexport commodities and software for temporary use abroad (including use in...

  2. 15 CFR 740.9 - Temporary imports, exports, and reexports (TMP).

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... the end of the beta test period as defined by the software producer or, if the software producer does... States; and exports and reexports of beta test software. (a) Temporary exports and reexports—(1) Scope. You may export and reexport commodities and software for temporary use abroad (including use in...

  3. Mars Science Laboratory Boot Robustness Testing

    NASA Technical Reports Server (NTRS)

    Banazadeh, Payam; Lam, Danny

    2011-01-01

    Mars Science Laboratory (MSL) is one of the most complex spacecraft ever built. Because of this complexity, a large number of flight software (FSW) requirements have been written for implementation. In practice, these requirements necessitate very complex and very precise flight software with no room for error. One of the flight software's responsibilities is to boot up and check the state of all devices on the spacecraft after the wake-up process. This boot-up and initialization is crucial to mission success, since any misbehavior of devices must be handled through the flight software. I have created a test toolkit that allows the FSW team to exhaustively test the flight software under a variety of unexpected scenarios and validate that the flight software can handle any situation after booting up. The test includes initializing devices on the spacecraft to different configurations and validating, at the end of the flight software boot-up, that the flight software has initialized those devices to what they are supposed to be in that particular scenario.

  4. Supporting the Use of CERT (registered trademark) Secure Coding Standards in DoD Acquisitions

    DTIC Science & Technology

    2012-07-01

    Capability Maturity Model Integration (CMMI®) [Davis 2009]. Team Software Process (TSP) and Capability Maturity Model Integration are service... Supporting the Use of CERT® Secure Coding Standards in DoD Acquisitions, Tim Morrow (Software Engineering Institute), Robert Seacord (Software

  5. Tools for Embedded Computing Systems Software

    NASA Technical Reports Server (NTRS)

    1978-01-01

    A workshop was held to assess the state of tools for embedded systems software and to determine directions for tool development. A synopsis of the talk and the key figures of each workshop presentation, together with chairmen summaries, are presented. The presentations covered four major areas: (1) tools and the software environment (development and testing); (2) tools and software requirements, design, and specification; (3) tools and language processors; and (4) tools and verification and validation (analysis and testing). The utility and contribution of existing tools and research results for the development and testing of embedded computing systems software are described and assessed.

  6. Firing Room Remote Application Software Development

    NASA Technical Reports Server (NTRS)

    Liu, Kan

    2015-01-01

    The Engineering and Technology Directorate (NE) at National Aeronautics and Space Administration (NASA) Kennedy Space Center (KSC) is designing a new command and control system for the checkout and launch of Space Launch System (SLS) and future rockets. The purposes of the semester-long internship as a remote application software developer include the design, development, integration, and verification of the software and hardware in the firing rooms, in particular with the Mobile Launcher (ML) Launch Accessories (LACC) subsystem. In addition, a software test verification procedure document was created to verify and check out LACC software for Launch Equipment Test Facility (LETF) testing.

  7. Real-time software failure characterization

    NASA Technical Reports Server (NTRS)

    Dunham, Janet R.; Finelli, George B.

    1990-01-01

    A series of studies aimed at characterizing the fundamentals of the software failure process has been undertaken as part of a NASA project on the modeling of a real-time aerospace vehicle software reliability. An overview of these studies is provided, and the current study, an investigation of the reliability of aerospace vehicle guidance and control software, is examined. The study approach provides for the collection of life-cycle process data, and for the retention and evaluation of interim software life-cycle products.

  8. Playbook Data Analysis Tool: Collecting Interaction Data from Extremely Remote Users

    NASA Technical Reports Server (NTRS)

    Kanefsky, Bob; Zheng, Jimin; Deliz, Ivonne; Marquez, Jessica J.; Hillenius, Steven

    2017-01-01

    Typically, user tests for software tools are conducted in person. At NASA, the users may be located at the bottom of the ocean in a pressurized habitat, above the atmosphere in the International Space Station, or in an isolated capsule on a simulated asteroid mission. The Playbook Data Analysis Tool (P-DAT) is a human-computer interaction (HCI) evaluation tool that the NASA Ames HCI Group has developed to record user interactions with Playbook, the group's existing planning-and-execution software application. Once the remotely collected user interaction data makes its way back to Earth, researchers can use P-DAT for in-depth analysis. Since a critical component of the Playbook project is to understand how to develop more intuitive software tools for astronauts to plan in space, P-DAT helps guide us in the development of additional easy-to-use features for Playbook, informing the design of future crew autonomy tools. P-DAT has demonstrated the capability of discreetly capturing usability data in a manner that is transparent to Playbook's end-users. In our experience, P-DAT data has already shown its utility, revealing potential usability patterns, helping diagnose software bugs, and identifying metrics and events that are pertinent to Playbook usage as well as spaceflight operations. As we continue to develop this analysis tool, P-DAT may yet provide a method for long-duration, unobtrusive human performance collection and evaluation for mission controllers back on Earth and researchers investigating the effects and mitigations related to future human spaceflight performance.
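    As a purely hypothetical sketch of this kind of client-side interaction logging (P-DAT's actual event names and format are not given in this abstract), one might append timestamped UI events as JSON lines for later downlink and analysis:

    ```python
    # Invented illustration of interaction logging: one JSON line per UI event.
    import json, time

    LOG_PATH = "interaction_log.jsonl"

    def log_event(event: str, **details):
        """Append one timestamped interaction event as a JSON line."""
        record = {"t": time.time(), "event": event, **details}
        with open(LOG_PATH, "a") as f:
            f.write(json.dumps(record) + "\n")

    # Hypothetical events a planning tool might record.
    log_event("activity_dragged", activity_id="EVA-prep", minutes_moved=15)
    log_event("plan_viewed", view="timeline")
    ```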

  9. Development of A General Principle Solution Forisoagrinet Compliant Networking System Components in Animal Husbandry

    NASA Astrophysics Data System (ADS)

    Kuhlmann, Arne; Herd, Daniel; Röβler, Benjamin; Gallmann, Eva; Jungbluth, Thomas

    In pig production, software and electronic systems are widely used for process control and management. Unfortunately, most devices on farms are proprietary solutions that work autonomously. To unify the data communication of devices in animal husbandry, the international standard ISOagriNET (ISO 17532:2007) was developed. It defines data formats and exchange protocols to link up devices such as climate controls, feeding systems, and sensors, but also management software. The aim of the research project "Information and Data Collection in Livestock Systems" is to develop an ISOagriNET-compliant IT system, a so-called Farming Cell. It integrates all electronic components to acquire the available data and information for pig fattening, generating an additional benefit for humans, animals, and the environment with regard to process control and documentation. Developing the Farming Cell is very complex; in particular, integrating hardware and software from various vendors into an ISOagriNET-compliant IT system is difficult and time-consuming. As a test environment, this ISOagriNET prototype shows the potential of the new standard.

  10. Human performance interfaces in air traffic control.

    PubMed

    Chang, Yu-Hern; Yeh, Chung-Hsing

    2010-01-01

    This paper examines how human performance factors in air traffic control (ATC) affect each other through their mutual interactions. The paper extends the conceptual SHEL model of ergonomics to describe the ATC system as a set of human performance interfaces in which the air traffic controllers interact with other human performance factors, including other controllers, software, hardware, environment, and organisation. New research hypotheses about the relationships between the human performance interfaces of the system are developed and tested on data collected from air traffic controllers, using structural equation modelling. The results suggest that organisational influences play a more significant role than individual differences or peer influences in how the controllers interact with the software, hardware, and environment of the ATC system. There are mutual influences between the controller-software, controller-hardware, controller-environment, and controller-organisation interfaces of the ATC system, with the exception of the controller-controller interface. The findings of this study provide practical insights for managing the human performance interfaces of the ATC system in the face of internal or external change, particularly in understanding the possible consequences in relation to the interactions between human performance factors.

  11. A Genuine TEAM Player

    NASA Technical Reports Server (NTRS)

    2001-01-01

    Qualtech Systems, Inc. developed a complete software system with capabilities of multisignal modeling, diagnostic analysis, run-time diagnostic operations, and intelligent interactive reasoners. Commercially available as the TEAMS (Testability Engineering and Maintenance System) tool set, the software can be used to reveal unanticipated system failures. The TEAMS software package is broken down into four companion tools: TEAMS-RT, TEAMATE, TEAMS-KB, and TEAMS-RDS. TEAMS-RT identifies good, bad, and suspect components in the system in real-time. It reports system health results from onboard tests, and detects and isolates failures within the system, allowing for rapid fault isolation. TEAMATE takes over from where TEAMS-RT left off by intelligently guiding the maintenance technician through the troubleshooting procedure, repair actions, and operational checkout. TEAMS-KB serves as a model management and collection tool. TEAMS-RDS (TEAMS-Remote Diagnostic Server) has the ability to continuously assess a system and isolate any failure in that system or its components, in real time. RDS incorporates TEAMS-RT, TEAMATE, and TEAMS-KB in a large-scale server architecture capable of providing advanced diagnostic and maintenance functions over a network, such as the Internet, with a web browser user interface.

  12. Towards an autonomous telescope system: the Test-Bed Telescope project

    NASA Astrophysics Data System (ADS)

    Racero, E.; Ocaña, F.; Ponz, D.; the TBT Consortium

    2015-05-01

    In the context of the Space Situational Awareness (SSA) programme of ESA, it is foreseen to deploy several large robotic telescopes in remote locations to provide surveillance and tracking services for man-made as well as natural near-Earth objects (NEOs). The present project, termed Test-Bed Telescope (TBT), is being developed under ESA's General Studies and Technology Programme, and shall implement a test-bed for the validation of an autonomous optical observing system in a realistic scenario, consisting of two telescopes located in Spain and Australia, to collect representative test data for precursor NEO services. It is foreseen that this test-bed environment will be used to validate future prototype software systems as well as to evaluate remote monitoring and control techniques. The test-bed system will be capable of delivering astrometric and photometric data of the observed objects in near real-time. This contribution describes the current status of the project.

  13. System Testing of Ground Cooling System Components

    NASA Technical Reports Server (NTRS)

    Ensey, Tyler Steven

    2014-01-01

    This internship focused primarily upon software unit testing of Ground Cooling System (GCS) components, one of the three types of tests (unit, integrated, and COTS/regression) utilized in software verification. Unit tests are used to test the software of the necessary components before it is implemented into the hardware. A unit test exercises the control data, usage procedures, and operating procedures of a particular component to determine whether the program is fit for use. Three different files are used to build and complete an efficient unit test: the Model Test file (.mdl), the Simulink SystemTest (.test), and the autotest (.m). The Model Test file includes the component being tested, with the appropriate Discrete Physical Interface (DPI) for testing. The Simulink SystemTest is a program used to test all of the requirements of the component. The autotest verifies that the component passes Model Advisor and System Testing, and puts the results into the proper files. Once unit testing is completed on the GCS components, they can be implemented into the GCS schematic, and the software of the GCS model as a whole can be tested using integrated testing. Unit testing is a critical part of software verification; it allows more basic components to be tested before a model of higher fidelity is tested, making the testing process flow in an orderly manner.

  14. Free and open-source software application for the evaluation of coronary computed tomography angiography images.

    PubMed

    Hadlich, Marcelo Souza; Oliveira, Gláucia Maria Moraes; Feijóo, Raúl A; Azevedo, Clerio F; Tura, Bernardo Rangel; Ziemer, Paulo Gustavo Portela; Blanco, Pablo Javier; Pina, Gustavo; Meira, Márcio; Souza e Silva, Nelson Albuquerque de

    2012-10-01

    Images used in medicine were standardized in 1993 through the DICOM (Digital Imaging and Communications in Medicine) standard. Many tests use this standard, and it is increasingly necessary to design software applications capable of handling this type of image; however, such applications are not usually free and open-source, which hinders their adaptation to diverse interests. Our objective was to develop and validate a free and open-source software application capable of handling DICOM coronary computed tomography angiography images. We developed and tested the ImageLab software in the evaluation of 100 tests randomly selected from a database. We carried out 600 tests divided between two observers using ImageLab and another software application sold with Philips Brilliance computed tomography equipment, in the evaluation of coronary lesions and plaques around the left main coronary artery (LMCA) and the anterior descending artery (ADA). To evaluate intraobserver, interobserver, and intersoftware agreement, we used simple and kappa agreement statistics. The agreement observed between the software applications was generally classified as substantial or almost perfect in most comparisons. The ImageLab software agreed with the Philips software in the evaluation of coronary computed tomography angiography tests, especially in patients without lesions, with lesions < 50% in the LMCA and < 70% in the ADA. The agreement for lesions > 70% in the ADA was lower, but this is also observed when the anatomical reference standard is used.
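    As a hedged aside on the file handling involved, the snippet below shows how a DICOM image can be read in Python with the pydicom library; the file name is hypothetical, and this is not ImageLab's code.

    ```python
    # Reading a DICOM file and its pixel data with pydicom (illustration only).
    import pydicom

    ds = pydicom.dcmread("coronary_cta_slice.dcm")   # hypothetical file name
    print(ds.PatientID, ds.Modality)                 # standard DICOM attributes
    pixels = ds.pixel_array                          # image data as a numpy array
    print(pixels.shape, pixels.dtype)
    ```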

  15. Writing executable assertions to test flight software

    NASA Technical Reports Server (NTRS)

    Mahmood, A.; Andrews, D. M.; Mccluskey, E. J.

    1984-01-01

    An executable assertion is a logical statement about the variables or a block of code. If there is no error during execution, the assertion statement evaluates to true. Executable assertions can be used for dynamic testing of software: they can be employed for validation during the design phase, and for exception handling and error detection during the operation phase. The present investigation is concerned with the problem of writing executable assertions, taking into account the use of assertions for testing flight software. The digital flight control system and the flight control software are discussed. The considered system provides autopilot and flight director modes of operation for automatic and manual control of the aircraft during all phases of flight. Attention is given to techniques for writing and using assertions to test flight software, an experimental setup to test flight software, and language features to support efficient use of assertions.
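    A minimal executable assertion, in the sense defined above, can be written in any language with run-time checks; the Python sketch below uses invented variable names and limits rather than anything from the flight software discussed.

    ```python
    # Executable assertion: a run-time check on program variables that fails
    # loudly instead of letting a bad value propagate. Names/limits invented.
    def update_pitch_command(pitch_cmd_deg: float, rate_deg_s: float, dt_s: float) -> float:
        new_cmd = pitch_cmd_deg + rate_deg_s * dt_s
        # The assertion is the logical statement about the variable's range.
        assert -25.0 <= new_cmd <= 25.0, f"pitch command out of range: {new_cmd}"
        return new_cmd

    print(update_pitch_command(5.0, 1.5, 0.02))
    ```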

  16. cPath: open source software for collecting, storing, and querying biological pathways.

    PubMed

    Cerami, Ethan G; Bader, Gary D; Gross, Benjamin E; Sander, Chris

    2006-11-13

    Biological pathways, including metabolic pathways, protein interaction networks, signal transduction pathways, and gene regulatory networks, are currently represented in over 220 diverse databases. These data are crucial for the study of specific biological processes, including human diseases. Standard exchange formats for pathway information, such as BioPAX, CellML, SBML and PSI-MI, enable convenient collection of this data for biological research, but mechanisms for common storage and communication are required. We have developed cPath, an open source database and web application for collecting, storing, and querying biological pathway data. cPath makes it easy to aggregate custom pathway data sets available in standard exchange formats from multiple databases, present pathway data to biologists via a customizable web interface, and export pathway data via a web service to third-party software, such as Cytoscape, for visualization and analysis. cPath is software only, and does not include new pathway information. Key features include: a built-in identifier mapping service for linking identical interactors and linking to external resources; built-in support for PSI-MI and BioPAX standard pathway exchange formats; a web service interface for searching and retrieving pathway data sets; and thorough documentation. The cPath software is freely available under the LGPL open source license for academic and commercial use. cPath is a robust, scalable, modular, professional-grade software platform for collecting, storing, and querying biological pathways. It can serve as the core data handling component in information systems for pathway visualization, analysis and modeling.

  17. HPC Software Stack Testing Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garvey, Cormac

    The HPC Software stack testing framework (hpcswtest) is used in the INL Scientific Computing Department to test the basic sanity and integrity of the HPC Software stack (Compilers, MPI, Numerical libraries and Applications) and to quickly discover hard failures, and as a by-product it will indirectly check the HPC infrastructure (network, PBS and licensing servers).
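
    A "basic sanity" check of a software stack typically just confirms that each required tool is present and that a trivial program compiles and runs end to end. The sketch below shows that pattern in miniature; the tool names are assumptions, and this is not the actual hpcswtest implementation.

    ```python
    import os
    import shutil
    import subprocess
    import tempfile

    def tool_present(name):
        """Hard failure indicator: is a required tool missing from PATH?"""
        return shutil.which(name) is not None

    def compiler_sane(cc="cc"):
        """Compile and run a trivial program to verify the toolchain end to end."""
        with tempfile.TemporaryDirectory() as d:
            src, exe = os.path.join(d, "t.c"), os.path.join(d, "t")
            with open(src, "w") as f:
                f.write('#include <stdio.h>\nint main(void){puts("ok");return 0;}\n')
            subprocess.run([cc, src, "-o", exe], check=True)
            out = subprocess.run([exe], capture_output=True, text=True, check=True)
            return out.stdout.strip() == "ok"

    for tool in ("cc", "mpicc"):  # assumed tool list, not hpcswtest's
        print(tool, "found" if tool_present(tool) else "MISSING")
    ```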

  18. Methods for Processing and Interpretation of AIS Signals Corrupted by Noise and Packet Collisions

    NASA Astrophysics Data System (ADS)

    Poļevskis, J.; Krastiņš, M.; Korāts, G.; Skorodumovs, A.; Trokšs, J.

    2012-01-01

    The authors deal with the operation of the Automatic Identification System (AIS) used in marine traffic monitoring to broadcast messages containing information about the vessel (id, payload, size, speed, destination, etc.), meant primarily for the avoidance of ship collisions. To extend the radius of AIS operation, it is envisaged to place its receivers on satellites. However, in space, due to the large coverage area, interfering factors are especially pronounced - such as packet collisions, Doppler shift, and noise, which affect AIS message reception, pre-processing, and decoding. To assess the quality of an AIS receiver's operation, a test was carried out in which, by automatically varying frequency, amplitude, noise, and other parameters, data on the receiver's ability to decode AIS signals were collected. In this work, both hardware- and software-based AIS decoders were tested. As a result, quite satisfactory statistics have been gathered - on both the common and the differing features of such decoders when operating in space. To obtain reliable data on software-defined radio AIS receivers, further research is envisaged.
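
    Receiver tests of this kind amount to a parameter sweep: generate a message at a given frequency offset and signal-to-noise ratio, feed it to the decoder, and tally decode successes. The sketch below shows the bookkeeping only; `decode` stands in for a real hardware or software decoder, and its degradation model and the parameter grids are hypothetical.

    ```python
    import itertools
    import random

    def decode(snr_db, freq_offset_hz):
        """Stand-in for a real AIS decoder: success probability degrades with
        noise and with Doppler-like frequency offset (purely illustrative)."""
        p = max(0.0, min(1.0, (snr_db - 2) / 10)) * max(0.0, 1 - abs(freq_offset_hz) / 4000)
        return random.random() < p

    snrs = range(0, 21, 5)                   # dB, hypothetical grid
    offsets = (-3000, -1500, 0, 1500, 3000)  # Hz, spanning an orbital Doppler range
    trials = 200

    for snr, off in itertools.product(snrs, offsets):
        ok = sum(decode(snr, off) for _ in range(trials))
        print(f"SNR={snr:2d} dB, offset={off:+5d} Hz: {ok/trials:.0%} decoded")
    ```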

  19. Field Information Support Tool

    DTIC Science & Technology

    2010-09-01

    [Figure-list residue omitted ("ODK Collect"; "Figure 17. ODK Collect main menu").] ...commercially available software packages and capabilities to function effectively. The first of these tools, ODK Collect and ODK Aggregate, are

  20. Software Process Assessment (SPA)

    NASA Technical Reports Server (NTRS)

    Rosenberg, Linda H.; Sheppard, Sylvia B.; Butler, Scott A.

    1994-01-01

    NASA's environment mirrors the changes taking place in the nation at large, i.e., workers are being asked to do more work with fewer resources. For software developers at NASA's Goddard Space Flight Center (GSFC), the effect of this change is that we must continue to produce quality code that is maintainable and reusable, but we must learn to produce it more efficiently and less expensively. To accomplish this goal, the Data Systems Technology Division (DSTD) at GSFC is trying a variety of both proven and state-of-the-art techniques for software development (e.g., object-oriented design, prototyping, designing for reuse, etc.). In order to evaluate the effectiveness of these techniques, the Software Process Assessment (SPA) program was initiated. SPA was begun under the assumption that the effects of different software development processes, techniques, and tools on the resulting product must be evaluated in an objective manner in order to assess any benefits that may have accrued. SPA involves the collection and analysis of software product and process data. These data include metrics such as effort, code changes, size, complexity, and code readability. This paper describes the SPA data collection and analysis methodology and presents examples of benefits realized thus far by DSTD's software developers and managers.
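
    In its simplest form, a collection program like SPA reduces to recording a structured observation per component and aggregating across components. The sketch below illustrates that idea with a hypothetical record layout; it is not the actual SPA schema.

    ```python
    from dataclasses import dataclass
    from statistics import mean

    @dataclass
    class ComponentMetrics:
        name: str
        effort_hours: float  # development effort charged to the component
        changes: int         # code changes made after baseline
        sloc: int            # size in source lines of code
        complexity: int      # e.g., a decision count

    data = [  # hypothetical observations
        ComponentMetrics("parser", 120.0, 14, 2400, 85),
        ComponentMetrics("scheduler", 90.0, 3, 1100, 30),
    ]

    # A simple cross-component question the collected data can answer:
    print("mean changes per KSLOC:",
          mean(m.changes / (m.sloc / 1000) for m in data))
    ```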

  1. Testing of Safety-Critical Software Embedded in an Artificial Heart

    NASA Astrophysics Data System (ADS)

    Cha, Sungdeok; Jeong, Sehun; Yoo, Junbeom; Kim, Young-Gab

    Software is being used more frequently to control medical devices such as artificial heart or robotic surgery system. While much of software safety issues in such systems are similar to other safety-critical systems (e.g., nuclear power plants), domain-specific properties may warrant development of customized techniques to demonstrate fitness of the system on patients. In this paper, we report results of a preliminary analysis done on software controlling a Hybrid Ventricular Assist Device (H-VAD) developed by Korea Artificial Organ Centre (KAOC). It is a state-of-the-art artificial heart which completed animal testing phase. We performed software testing in in-vitro experiments and animal experiments. An abnormal behaviour, never detected during extensive in-vitro analysis and animal testing, was found.

  2. The Perception of Educational Software Development Self-Efficacy among Undergraduate CEIT Teacher Candidates

    ERIC Educational Resources Information Center

    Uzun, Adem; Ozkilic, Ruchan; Senturk, Aysan

    2013-01-01

    The objective of this study was to analyze self-efficacy perceptions for educational software development of teacher candidates studying at the Department of Computer Education and Instructional Technologies, with respect to a range of variables. The Educational Software Development Self-Efficacy Perception Scale was used as the data collection tool. Sixty…

  3. 75 FR 23779 - Agency Information Collection Activities; Submission for Office of Management and Budget Review...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-04

    ... an initial investment in software, training, and process change that cannot simply be converted from... authoring software which significantly reduces the burden and time needed to generate well- formed SPL..., eliminating the need to retype any information. Enhancements have also been made to the software for...

  4. A program downloader and other utility software for the DATAC bus monitor unit

    NASA Technical Reports Server (NTRS)

    Novacki, Stanley M., III

    1987-01-01

    A set of programs designed to facilitate software testing on the DATAC Bus Monitor is described. By providing a means to simplify program loading, firmware generation, and subsequent testing of programs, the overhead involved in software evaluation is reduced and that time is used more productively in performance analysis and improvement of current software.

  5. An image-processing software package: UU and Fig for optical metrology applications

    NASA Astrophysics Data System (ADS)

    Chen, Lujie

    2013-06-01

    Modern optical metrology applications are largely supported by computational methods, such as phase shifting [1], Fourier transform [2], digital image correlation [3], camera calibration [4], etc., in which image processing is a critical and indispensable component. While it is not too difficult to obtain a wide variety of image-processing programs from the internet, few cater to the relatively special area of optical metrology. This paper introduces an image-processing software package, UU (data processing) and Fig (data rendering), that incorporates many useful functions for processing optical metrological data. The cross-platform programs UU and Fig are developed based on wxWidgets. At the time of writing, they have been tested on Windows, Linux, and Mac OS. The user interface is designed to offer precise control of the underlying processing procedures in a scientific manner. The data input/output mechanism is designed to accommodate diverse file formats and to facilitate interaction with other independent programs. In terms of robustness, although the software was initially developed for personal use, it is comparable in stability and accuracy to most commercial software of a similar nature. In addition to functions for optical metrology, the software package has a rich collection of useful tools in the following areas: real-time image streaming from USB and GigE cameras, computational geometry, computer vision, fitting of data, 3D image processing, vector image processing, precision device control (rotary stage, PZT stage, etc.), point cloud to surface reconstruction, volume rendering, batch processing, etc. The software package is currently used in a number of universities for teaching and research.

  6. Risk-Based Object Oriented Testing

    NASA Technical Reports Server (NTRS)

    Rosenberg, Linda H.; Stapko, Ruth; Gallo, Albert

    2000-01-01

    Software testing is a well-defined phase of the software development life cycle. Functional ("black box") testing and structural ("white box") testing are two methods of test case design commonly used by software developers. A lesser known testing method is risk-based testing, which takes into account the probability of failure of a portion of code as determined by its complexity. For object oriented programs, a methodology is proposed for identification of risk-prone classes. Risk-based testing is a highly effective testing technique that can be used to find and fix the most important problems as quickly as possible.
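
    The core of risk-based testing as described here is a ranking: estimate each unit's failure-proneness from a complexity measure and spend test effort from the top down. The sketch below ranks units by a crude decision-point count, an approximation of cyclomatic complexity; the units under test are hypothetical, and this is not the paper's methodology.

    ```python
    import ast

    def decision_points(source):
        """Approximate cyclomatic complexity: 1 + number of branching nodes."""
        tree = ast.parse(source)
        branches = (ast.If, ast.For, ast.While, ast.BoolOp, ast.ExceptHandler)
        return 1 + sum(isinstance(n, branches) for n in ast.walk(tree))

    modules = {  # hypothetical units under test
        "simple.py": "def f(x):\n    return x + 1\n",
        "branchy.py": ("def g(x):\n    if x > 0:\n        for i in range(x):\n"
                       "            if i % 2:\n                x -= 1\n    return x\n"),
    }

    # Test the riskiest (most complex) units first.
    for name, cc in sorted(((n, decision_points(s)) for n, s in modules.items()),
                           key=lambda t: -t[1]):
        print(f"{name}: complexity {cc}")
    ```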

  7. "Proximal Sensing" capabilities for snow cover monitoring

    NASA Astrophysics Data System (ADS)

    Valt, Mauro; Salvatori, Rosamaria; Plini, Paolo; Salzano, Roberto; Giusti, Marco; Montagnoli, Mauro; Sigismondi, Daniele; Cagnati, Anselmo

    2013-04-01

    The seasonal snow cover represents one of the most important land cover classes for environmental studies in mountain areas, especially considering its variation over time. Snow cover and its extension play a relevant role in studies of atmospheric dynamics and the evolution of climate. It is also important for the analysis and management of water resources and for the management of touristic activities in mountain areas. Recently, webcam images collected at daily or even hourly intervals have been used as tools to observe snow-covered areas; those images, properly processed, can be considered a very important environmental data source. Images captured by digital cameras become a useful tool at the local scale, providing images even when cloud coverage makes observation by satellite sensors impossible. When suitably processed, these images can be used for scientific purposes, having a good resolution (at least 800x600x16 million colours) and a very good sampling frequency (hourly images taken throughout the whole year). Once stored in databases, these images therefore represent an important source of information for the study of recent climatic changes, for evaluating the available water resources, and for analysing the daily surface evolution of the snow cover. The Snow-noSnow software has been specifically designed to automatically detect the extension of snow cover in webcam images with very limited human intervention. The software was tested on images collected in the Alps (ARPAV webcam network) and on the Apennines at a pilot station properly equipped for this project by CNR-IIA. The results obtained with Snow-noSnow are comparable to those achieved by photo-interpretation and can be considered better than those obtained using the image segmentation routines implemented in commercial image-processing software. Additionally, Snow-noSnow operates in a semi-automatic way and has a reduced processing time. The analysis of this kind of image could represent a useful element to support the interpretation of remote sensing images, especially those provided by high spatial resolution sensors. Keywords: snow cover monitoring, digital images, software, Alps, Apennines.
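
    Automatic snow detection in webcam imagery is often approximated by a per-pixel rule: snow is bright and nearly colorless. The sketch below applies such a brightness/saturation threshold; the thresholds are assumptions chosen for illustration, and this is not the actual Snow-noSnow algorithm.

    ```python
    def snow_fraction(pixels, v_min=200, s_max=40):
        """Fraction of RGB pixels classified as snow: high brightness (value)
        and low color saturation, both on illustrative 0-255 scales."""
        snow = 0
        for r, g, b in pixels:
            value = max(r, g, b)
            saturation = 0 if value == 0 else round(255 * (value - min(r, g, b)) / value)
            if value >= v_min and saturation <= s_max:
                snow += 1
        return snow / len(pixels)

    # A bright, nearly gray pixel counts as snow; dark rock does not.
    print(snow_fraction([(245, 248, 252), (60, 55, 40), (230, 230, 235)]))  # 2/3
    ```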

  8. A comprehensive data acquisition and management system for an ecosystem-scale peatland warming and elevated CO2 experiment

    NASA Astrophysics Data System (ADS)

    Krassovski, M. B.; Riggs, J. S.; Hook, L. A.; Nettles, W. R.; Hanson, P. J.; Boden, T. A.

    2015-07-01

    Ecosystem-scale manipulation experiments represent large science investments that require well-designed data acquisition and management systems to provide reliable, accurate information to project participants and third party users. The SPRUCE Project (Spruce and Peatland Responses Under Climatic and Environmental Change, http://mnspruce.ornl.gov) is such an experiment funded by the Department of Energy's (DOE), Office of Science, Terrestrial Ecosystem Science (TES) Program. The SPRUCE experimental mission is to assess ecosystem-level biological responses of vulnerable, high carbon terrestrial ecosystems to a range of climate warming manipulations and an elevated CO2 atmosphere. SPRUCE provides a platform for testing mechanisms controlling the vulnerability of organisms, biogeochemical processes, and ecosystems to climatic change (e.g., thresholds for organism decline or mortality, limitations to regeneration, biogeochemical limitations to productivity, the cycling and release of CO2 and CH4 to the atmosphere). The SPRUCE experiment will generate a wide range of continuous and discrete measurements. To successfully manage SPRUCE data collection, achieve SPRUCE science objectives, and support broader climate change research, the research staff has designed a flexible data system using proven network technologies and software components. The primary SPRUCE data system components are: 1. Data acquisition and control system - set of hardware and software to retrieve biological and engineering data from sensors, collect sensor status information, and distribute feedback to control components. 2. Data collection system - set of hardware and software to deliver data to a central depository for storage and further processing. 3. Data management plan - set of plans, policies, and practices to control consistency, protect data integrity, and deliver data. This publication presents our approach to meeting the challenges of designing and constructing an efficient data system for managing high volume sources of in-situ observations in a remote, harsh environmental location. The approach covers data flow starting from the sensors and ending at the archival/distribution points, discusses types of hardware and software used, examines design considerations that were used to choose them, and describes the data management practices chosen to control and enhance the value of the data.

  9. A comprehensive data acquisition and management system for an ecosystem-scale peatland warming and elevated CO2 experiment

    NASA Astrophysics Data System (ADS)

    Krassovski, M. B.; Riggs, J. S.; Hook, L. A.; Nettles, W. R.; Hanson, P. J.; Boden, T. A.

    2015-11-01

    Ecosystem-scale manipulation experiments represent large science investments that require well-designed data acquisition and management systems to provide reliable, accurate information to project participants and third party users. The SPRUCE project (Spruce and Peatland Responses Under Climatic and Environmental Change, http://mnspruce.ornl.gov) is such an experiment funded by the Department of Energy's (DOE), Office of Science, Terrestrial Ecosystem Science (TES) Program. The SPRUCE experimental mission is to assess ecosystem-level biological responses of vulnerable, high carbon terrestrial ecosystems to a range of climate warming manipulations and an elevated CO2 atmosphere. SPRUCE provides a platform for testing mechanisms controlling the vulnerability of organisms, biogeochemical processes, and ecosystems to climatic change (e.g., thresholds for organism decline or mortality, limitations to regeneration, biogeochemical limitations to productivity, and the cycling and release of CO2 and CH4 to the atmosphere). The SPRUCE experiment will generate a wide range of continuous and discrete measurements. To successfully manage SPRUCE data collection, achieve SPRUCE science objectives, and support broader climate change research, the research staff has designed a flexible data system using proven network technologies and software components. The primary SPRUCE data system components are the following: 1. data acquisition and control system - set of hardware and software to retrieve biological and engineering data from sensors, collect sensor status information, and distribute feedback to control components; 2. data collection system - set of hardware and software to deliver data to a central depository for storage and further processing; 3. data management plan - set of plans, policies, and practices to control consistency, protect data integrity, and deliver data. This publication presents our approach to meeting the challenges of designing and constructing an efficient data system for managing high volume sources of in situ observations in a remote, harsh environmental location. The approach covers data flow starting from the sensors and ending at the archival/distribution points, discusses types of hardware and software used, examines design considerations that were used to choose them, and describes the data management practices chosen to control and enhance the value of the data.

  10. Proceedings of the Joint Logistics Commanders Joint Policy Coordinating Group on Computer Resource Management; Computer Software Management Software Workshop, 2-5 April 1979.

    DTIC Science & Technology

    1979-08-21

    [Table-of-contents residue omitted.] Appendix - Outline and Draft Material for Proposed Triservice Interim Guideline on Application of Software Acceptance Criteria. ... The purpose of this guide ... contract item (CPCI) (code); 5. CPCI test plan; 6. CPCI test procedures; 7. CPCI test report; 8. Handbooks and manuals. Although additional material does

  11. Network, system, and status software enhancements for the autonomously managed electrical power system breadboard. Volume 1: Project summary

    NASA Technical Reports Server (NTRS)

    Mckee, James W.

    1990-01-01

    This volume (1 of 4) gives a summary of the original AMPS software system configuration, points out some of the problem areas in the original software design that this project is to address, and in the appendix collects all the bimonthly status reports. The purpose of AMPS is to provide a self-reliant system to control the generation and distribution of power in the space station. The software in the AMPS breadboard can be divided into three levels: the operating environment software, the protocol software, and the station-specific software. This project deals only with the operating environment software and the protocol software. The present station-specific software will not change except as necessary to conform to new data formats.

  12. A taxonomy and discussion of software attack technologies

    NASA Astrophysics Data System (ADS)

    Banks, Sheila B.; Stytz, Martin R.

    2005-03-01

    Software is a complex thing. It is not an engineering artifact that springs forth from a design by simply following software coding rules; creativity and the human element are at the heart of the process. Software development is part science, part art, and part craft. Design, architecture, and coding are equally important activities, and in each of these activities errors may be introduced that lead to security vulnerabilities. Therefore, inevitably, errors enter into the code. Some of these errors are discovered during testing; however, some are not. The best way to find security errors, whether they are introduced as part of the architecture development effort or the coding effort, is to automate the security testing process to the maximum extent possible and to add this class of tools to the set of tools that aid in the compilation process, testing, test analysis, and software distribution. Recent technological advances, improvements in computer-generated forces (CGFs), and results of research in information assurance and software protection indicate that we can build a semi-intelligent software security testing tool. However, before we can undertake the security testing automation effort, we must understand the scope of the required testing, the security failures that need to be uncovered during testing, and the characteristics of those failures. Therefore, we undertook the research reported in this paper: the development of a taxonomy and a discussion of software attacks, written from the point of view of the security tester, with the goal of using the taxonomy to guide the development of the knowledge base for the automated security testing tool. The representation for attacks and threat cases yielded by this research captures the strategies, tactics, and other considerations that come into play during the planning and execution of attacks upon application software. The paper is organized as follows. Section one contains an introduction to our research and a discussion of the motivation for our work. Section two presents our taxonomy of software attacks and a discussion of the strategies employed and the general weaknesses exploited for each attack. Section three contains a summary and suggestions for further research.

  13. Recent Developments in Grid Generation and Force Integration Technology for Overset Grids

    NASA Technical Reports Server (NTRS)

    Chan, William M.; VanDalsem, William R. (Technical Monitor)

    1994-01-01

    Recent developments in algorithms and software tools for generating overset grids for complex configurations are described. These include the overset surface grid generation code SURGRD and version 2.0 of the hyperbolic volume grid generation code HYPGEN. The SURGRD code is in beta test mode; the new features include the capability to march over a collection of panel networks, a variety of ways to control the side boundaries and the marching step sizes and distance, a more robust projection scheme, and an interpolation option. New features in version 2.0 of HYPGEN include a wider range of boundary condition types. The code also allows the user to specify different marching step sizes and distances for each point on the surface grid. A scheme that takes into account the overlapped zones on the body surface for the purpose of forces and moments computation is also briefly described. The process involves the following two software modules: MIXSUR, a composite grid generation module that produces a collection of quadrilaterals and triangles on which pressure and viscous stresses are to be integrated, and OVERINT, a forces and moments integration module.

  14. FFTF Passive Safety Test Data for Benchmarks for New LMR Designs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wootan, David W.; Casella, Andrew M.

    Liquid Metal Reactors (LMRs) continue to be considered an attractive concept for advanced reactor design. Software packages such as SASSYS are being used to improve new LMR designs and operating characteristics. Significant cost and safety improvements can be realized in advanced liquid metal reactor designs by emphasizing inherent or passive safety through crediting the beneficial reactivity feedbacks associated with core and structural movement. This passive safety approach was adopted for the Fast Flux Test Facility (FFTF), and an experimental program was conducted to characterize the structural reactivity feedback. The FFTF passive safety testing program was developed to examine how specific design elements influenced dynamic reactivity feedback in response to a reactivity input and to demonstrate the scalability of reactivity feedback results to reactors of current interest. The U.S. Department of Energy, Office of Nuclear Energy Advanced Reactor Technology program is in the process of preserving, protecting, securing, and placing in electronic format information and data from the FFTF, including the core configurations and data collected during the passive safety tests. Benchmarks based on empirical data gathered during operation of the FFTF, as well as design documents and post-irradiation examination, will aid in the validation of these software packages and the models and calculations they produce. Evaluation of these actual test data could provide insight to improve analytical methods which may be used to support future licensing applications for LMRs.

  15. Annotated bibliography of Software Engineering Laboratory literature

    NASA Technical Reports Server (NTRS)

    1985-01-01

    An annotated bibliography of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory is presented. More than 100 publications are summarized. These publications cover many areas of software engineering and range from research reports to software documentation. This document has been updated and reorganized substantially since the original version (SEL-82-006, November 1982). All materials are grouped into five general subject areas for easy reference: (1) the Software Engineering Laboratory; (2) software tools; (3) models and measures; (4) technology evaluations; and (5) data collection. An index further classifies these documents by specific topic.

  16. Collected software engineering papers, volume 12

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This document is a collection of selected technical papers produced by participants in the Software Engineering Laboratory (SEL) from November 1993 through October 1994. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. This is the 12th such volume of technical papers produced by the SEL. Although these papers cover several topics related to software engineering, they do not encompass the entire scope of SEL activities and interests. Additional information about the SEL and its research efforts may be obtained from the sources listed in the bibliography at the end of this document.

  17. Collected software engineering papers, volume 11

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This document is a collection of selected technical papers produced by participants in the Software Engineering Laboratory (SEL) from November 1992 through November 1993. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. This is the 11th such volume of technical papers produced by the SEL. Although these papers cover several topics related to software engineering, they do not encompass the entire scope of SEL activities and interests. Additional information about the SEL and its research efforts may be obtained from the sources listed in the bibliography at the end of this document.

  18. Contribution of Electronic Medical Records to the Management of Rare Diseases.

    PubMed

    Bremond-Gignac, Dominique; Lewandowski, Elisabeth; Copin, Henri

    2015-01-01

    Electronic health record systems provide a great opportunity to study most diseases. The objective of this study was to determine whether electronic medical records (EMR) in ophthalmology contribute to the management of rare eye diseases, isolated or in syndromes. The study was designed to identify and collect patients' data with ophthalmology-specific EMR. Ophthalmology-specific EMR software (Softalmo software, Corilus) was used to acquire ophthalmological consultation data from patients with five rare eye diseases. The rare eye diseases and data were selected and collected according to the expertise of the eye center. A total of 135,206 outpatient consultations were performed between 2011 and 2014 in our medical center specialized in rare eye diseases. The search software identified 29 congenital aniridia, 6 Axenfeld/Rieger syndrome, 11 BEPS, 3 nanophthalmos, and 3 Rubinstein-Taybi syndrome. EMR provides advantages for medical care. The use of ophthalmology-specific EMR is reliable and can contribute to a comprehensive ocular visual phenotype useful for clinical research. EMR data routinely acquired with software dedicated to ophthalmology provide sufficient detail for rare diseases. These software-collected data appear useful for creating patient cohorts and recording ocular examinations, avoiding the time-consuming analysis of paper records and investigations, in a University Hospital linked to a National Reference Center for Rare Diseases.

  19. Contribution of Electronic Medical Records to the Management of Rare Diseases

    PubMed Central

    Bremond-Gignac, Dominique; Lewandowski, Elisabeth; Copin, Henri

    2015-01-01

    Purpose. Electronic health record systems provide a great opportunity to study most diseases. The objective of this study was to determine whether electronic medical records (EMR) in ophthalmology contribute to the management of rare eye diseases, isolated or in syndromes. The study was designed to identify and collect patients' data with ophthalmology-specific EMR. Methods. Ophthalmology-specific EMR software (Softalmo software, Corilus) was used to acquire ophthalmological consultation data from patients with five rare eye diseases. The rare eye diseases and data were selected and collected according to the expertise of the eye center. Results. A total of 135,206 outpatient consultations were performed between 2011 and 2014 in our medical center specialized in rare eye diseases. The search software identified 29 congenital aniridia, 6 Axenfeld/Rieger syndrome, 11 BEPS, 3 nanophthalmos, and 3 Rubinstein-Taybi syndrome. Discussion. EMR provides advantages for medical care. The use of ophthalmology-specific EMR is reliable and can contribute to a comprehensive ocular visual phenotype useful for clinical research. Conclusion. EMR data routinely acquired with software dedicated to ophthalmology provide sufficient detail for rare diseases. These software-collected data appear useful for creating patient cohorts and recording ocular examinations, avoiding the time-consuming analysis of paper records and investigations, in a University Hospital linked to a National Reference Center for Rare Diseases. PMID:26539543

  20. PDSS/IMC qualification test software acceptance procedures

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Tests to be performed for qualifying the payload development support system image motion compensator (IMC) are identified. The performance of these tests will verify the IMC interfaces and thereby verify the qualification test software.

  1. DSN system performance test software

    NASA Technical Reports Server (NTRS)

    Martin, M.

    1978-01-01

    The system performance test software is currently being modified to include additional capabilities and enhancements. Additional software programs are currently being developed for the Command Store and Forward System and the Automatic Total Recall System. The test executive is the main program. It controls the input and output of the individual test programs by routing data blocks and operator directives to those programs. It also processes data block dump requests from the operator.

  2. Test Driven Development of Scientific Models

    NASA Technical Reports Server (NTRS)

    Clune, Thomas L.

    2012-01-01

    Test-Driven Development (TDD) is a software development process that promises many advantages for developer productivity and has become widely accepted among professional software engineers. As the name suggests, TDD practitioners alternate between writing short automated tests and producing code that passes those tests. Although this overly simplified description will undoubtedly sound prohibitively burdensome to many uninitiated developers, the advent of powerful unit-testing frameworks greatly reduces the effort required to produce and routinely execute suites of tests. By their own testimony, many developers find TDD to be addictive after only a few days of exposure, and find it unthinkable to return to previous practices. Of course, scientific/technical software differs from other software categories in a number of important respects, but I nonetheless believe that TDD is quite applicable to the development of such software and has the potential to significantly improve programmer productivity and code quality within the scientific community. After a detailed introduction to TDD, I will present the experience within the Software Systems Support Office (SSSO) in applying the technique to various scientific applications. This discussion will emphasize the various direct and indirect benefits as well as some of the difficulties and limitations of the methodology. I will conclude with a brief description of pFUnit, a unit testing framework I co-developed to support test-driven development of parallel Fortran applications.
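
    A minimal red-green cycle, sketched here in Python's built-in unittest rather than pFUnit: in TDD the tests are written first and fail until the function beneath them is implemented. The function under test is a made-up example, not from the talk.

    ```python
    import unittest
    from math import exp

    def saturation_vapor_pressure(t_celsius):
        """Implementation written after the tests below (Magnus-type formula, hPa)."""
        return 6.1094 * exp(17.625 * t_celsius / (t_celsius + 243.04))

    class TestSVP(unittest.TestCase):
        # In TDD these tests exist first and fail ("red") until the
        # implementation above makes them pass ("green").
        def test_freezing_point(self):
            self.assertAlmostEqual(saturation_vapor_pressure(0.0), 6.1094, places=3)

        def test_monotonic_in_temperature(self):
            self.assertLess(saturation_vapor_pressure(10.0),
                            saturation_vapor_pressure(20.0))

    if __name__ == "__main__":
        unittest.main()
    ```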

  3. Applications of Logic Coverage Criteria and Logic Mutation to Software Testing

    ERIC Educational Resources Information Center

    Kaminski, Garrett K.

    2011-01-01

    Logic is an important component of software. Thus, software logic testing has enjoyed significant research over a period of decades, with renewed interest in the last several years. One approach to detecting logic faults is to create and execute tests that satisfy logic coverage criteria. Another approach to detecting faults is to perform mutation…
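
    As a small illustration of logic coverage criteria (my example, not from the thesis): for the predicate below, branch coverage requires only one true and one false outcome for the whole predicate, while condition coverage requires each atomic condition to take both truth values.

    ```python
    def eligible(age, member):
        # Predicate under test: two atomic conditions.
        return age >= 65 or member

    # Branch coverage: the whole predicate evaluates true once, false once.
    branch_tests = [(70, False), (30, False)]

    # Condition coverage: (age >= 65) and (member) each take both truth values.
    condition_tests = [(70, False), (30, True), (30, False)]

    for age, member in condition_tests:
        print(age, member, "->", eligible(age, member))
    ```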

  4. Forth system for coherent-scatter radar data acquisition and processing

    NASA Technical Reports Server (NTRS)

    Rennier, A. D.; Bowhill, S. A.

    1985-01-01

    A real-time collection system was developed for the Urbana coherent-scatter radar system. The new system, designed for use with a microcomputer, has several advantages over the old system implemented on a minicomputer. The software used to collect the data is described, as well as the processing software used to analyze the data. In addition, a magnetic tape format for coherent-scatter data exchange is given.

  5. Evaluation of management measures of software development. Volume 1: Analysis summary

    NASA Technical Reports Server (NTRS)

    Page, J.; Card, D.; Mcgarry, F.

    1982-01-01

    The conceptual model, the data classification scheme, and the analytic procedures are explained. The analytic results are summarized and specific software measures for collection and monitoring are recommended.

  6. Test-driven programming

    NASA Astrophysics Data System (ADS)

    Georgiev, Bozhidar; Georgieva, Adriana

    2013-12-01

    This paper presents some possibilities for implementing test-driven development as a programming method. It offers a different point of view on the creation of advanced programming techniques: tests are built before the program source, together with all necessary software tools and modules. This nontraditional approach of building the tests first eases the programmer's work and is a preferable way of developing software. It allows comparatively simple programming with different object-oriented programming languages (for example Java, XML, Python, etc.). It is a predictable way to develop software tools and helps create better software that is also easier to maintain. Test-driven programming is able to replace more complicated conventional paradigms used by many programmers.

  7. Design and Optimization of a Telemetric system for appliance in earthquake prediction

    NASA Astrophysics Data System (ADS)

    Bogdos, G.; Tassoulas, E.; Vereses, A.; Papapanagiotou, A.; Filippi, K.; Koulouras, G.; Nomicos, C.

    2009-04-01

    This project's aim is to design a telemetric system able to collect data from a digitizer, transform them into an appropriate form, and transfer them to a central system where on-line data elaboration takes place. On-line mathematical elaboration (fractal analysis) of pre-seismic electromagnetic signals and instant display may lead to safe earthquake prediction methodologies. Ad-hoc connections and heterogeneous topologies form the core network, while wired and wireless means cooperate for accurate and on-time transmission. The nature of the data is considered very sensitive, so the transmission needs to be instant. All stations are situated in rural places in order to prevent electromagnetic interference; this imposes continuous monitoring and the provision of backup data links. The central stations collect the data of every station and allocate them properly in a predefined database. Special software is designed to process the incoming data mathematically and export them graphically. The development work included digitizer design, workstation software design, transmission protocol study and simulation in OPNET, database programming, mathematical data elaboration, and software development for graphical representation. The whole package was tested under lab conditions and then under real conditions. The main interest for the scientific community lies in this platform eventually being implemented and installed across the Greek countryside on a large scale. The platform is designed in such a way that data mining techniques and mathematical elaboration are possible and any extension can be adapted. The main specialization of this project is that these mechanisms and mathematical transformations can be applied to live data, which can help in the rapid exploitation of the real meaning of the measured and stored data. The primary intention of this study is to help and simplify the analysis process while encouraging the scientific community to pay attention to seismic activity in Greece by watching it on-line.

  8. Medical Data Architecture Project Capabilities and Design

    NASA Technical Reports Server (NTRS)

    Middour, C.; Krihak, M.; Lindsey, A.; Marker, N.; Wolfe, S.; Winther, S.; Ronzano, K.; Bolles, D.; Toscano, W.; Shaw, T.

    2017-01-01

    Mission constraints will challenge the delivery of medical care on a long-term, deep space exploration mission. This type of mission will be restricted in the availability of medical knowledge, skills, procedures and resources to prevent, diagnose, and treat in-flight medical events. Challenges to providing medical care are anticipated, including resource and resupply constraints, delayed communications and no ability for medical evacuation. The Medical Data Architecture (MDA) project will enable medical care capability in this constrained environment. The first version of the system, called "Test Bed 1," includes capabilities for automated data collection, data storage and data retrieval to provide information to the Crew Medical Officer (CMO). Test Bed 1 seeks to establish a data architecture foundation and develop a scalable data management system through modular design and standardized interfaces. In addition, it will demonstrate to stakeholders the potential for an improved, automated, flow of data to and from the medical system over the current methods employed on the International Space Station (ISS). It integrates a set of external devices, software and processes, and a Subjective, Objective, Assessment, and Plan (SOAP) note commonly used by clinicians. Medical data like electrocardiogram plots, heart rate, skin temperature, respiration rate, medications taken, and more are collected from devices and stored in the Electronic Medical Records (EMR) system, and reported to crew and clinician. Devices integrated include the Astroskin biosensor vest and IMED CARDIAX electrocardiogram (ECG) device with INEED MD ECG Glove, and the NASA-developed Medical Dose Tracker application. The system is designed to be operated as a standalone system, and can be deployed in a variety of environments, from a laptop to a data center. The system is primarily composed of open-source software tools, and is designed to be modular, so new capabilities can be added. The software components and integration methods will be discussed.

  9. Process evaluation of software using the international classification of external causes of injuries for collecting burn injury data at burn centers in the United States.

    PubMed

    Villaveces, Andrés; Peck, Michael; Faraklas, Iris; Hsu-Chang, Naiwei; Joe, Victor; Wibbenmeyer, Lucy

    2014-01-01

    Detailed information on the cause of burns is necessary to construct effective prevention programs. The International Classification of External Causes of Injury (ICECI) is a data collection tool that allows comprehensive categorization of multiple facets of injury events. The objective of this study was to conduct a process evaluation of software designed to improve the ease of use of the ICECI so as to identify key additional variables useful for understanding the occurrence of burn injuries, and compare this software with existing data-collection practices conducted for burn injuries. The authors completed a process evaluation of the implementation and ease of use of the software in six U.S. burn centers. They also collected preliminary burn injury data and compared them with existing variables reported to the American Burn Association's National Burn Repository (NBR). The authors accomplished their goals of 1) creating a data-collection tool for the ICECI, which can be linked to existing operational programs of the NBR, 2) training registrars in the use of this tool, 3) establishing quality-control mechanisms for ensuring accuracy and reliability, 4) incorporating ICECI data entry into the weekly routine of the burn registrar, and 5) demonstrating the quality differences between data collected using this tool and the NBR. Using this or similar tools with the ICECI structure or key selected variables can improve the quantity and quality of data on burn injuries in the United States and elsewhere and thus can be more useful in informing prevention strategies.

  10. ShakeNet: a portable wireless sensor network for instrumenting large civil structures

    USGS Publications Warehouse

    Kohler, Monica D.; Hao, Shuai; Mishra, Nilesh; Govindan, Ramesh; Nigbor, Robert

    2015-08-03

    We report our findings from a U.S. Geological Survey (USGS) National Earthquake Hazards Reduction Program-funded project to develop and test a wireless, portable, strong-motion network of up to 40 triaxial accelerometers for structural health monitoring. The overall goal of the project was to record ambient vibrations for several days from USGS-instrumented structures. Structural health monitoring has important applications in fields like civil engineering and the study of earthquakes. The emergence of wireless sensor networks provides a promising means to such applications. However, while most wireless sensor networks are still in the experimentation stage, very few take into consideration the realistic earthquake engineering application requirements. To collect comprehensive data for structural health monitoring for civil engineers, high-resolution vibration sensors and sufficient sampling rates should be adopted, which makes it challenging for current wireless sensor network technology in the following ways: processing capabilities, storage limit, and communication bandwidth. The wireless sensor network has to meet expectations set by wired sensor devices prevalent in the structural health monitoring community. For this project, we built and tested an application-realistic, commercially based, portable, wireless sensor network called ShakeNet for instrumentation of large civil structures, especially for buildings, bridges, or dams after earthquakes. Two to three people can deploy ShakeNet sensors within hours after an earthquake to measure the structural response of the building or bridge during aftershocks. ShakeNet involved the development of a new sensing platform (ShakeBox) running a software suite for networking, data collection, and monitoring. Deployments reported here on a tall building and a large dam were real-world tests of ShakeNet operation, and helped to refine both hardware and software. 

  11. THRESH—Software for tracking rainfall thresholds for landslide and debris-flow occurrence, user manual

    USGS Publications Warehouse

    Baum, Rex L.; Fischer, Sarah J.; Vigil, Jacob C.

    2018-02-28

    Precipitation thresholds are used in many areas to provide early warning of precipitation-induced landslides and debris flows, and the software distribution THRESH is designed for automated tracking of precipitation, including precipitation forecasts, relative to thresholds for landslide occurrence. This software is also useful for analyzing multiyear precipitation records to compare timing of threshold exceedance with dates and times of historical landslides. This distribution includes the main program THRESH for comparing precipitation to several kinds of thresholds, two utility programs, and a small collection of Python and shell scripts to aid the automated collection and formatting of input data and the graphing and further analysis of output results. The software programs can be deployed on computing platforms that support Fortran 95, Python 2, and certain Unix commands. The software handles rainfall intensity-duration thresholds, cumulative recent-antecedent precipitation thresholds, and peak intensity thresholds as well as various measures of antecedent precipitation. Users should have predefined rainfall thresholds before running THRESH.
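
    Rainfall intensity-duration thresholds are commonly written in the power-law form I = a·D^b, where I is the mean intensity (mm/h) that must be exceeded over a duration D (hours). The sketch below tests a rainfall record against such a curve; the coefficients are illustrative of Caine-type thresholds in the literature, not the values shipped with THRESH.

    ```python
    def exceeds_threshold(intensity_mm_per_h, duration_h, a=14.82, b=-0.39):
        """True when mean rainfall intensity lies on or above the I = a * D**b
        curve. Coefficients are illustrative, not THRESH's operational values."""
        return intensity_mm_per_h >= a * duration_h ** b

    # 6 hours of rain at 8 mm/h; the curve at D=6 is ~14.82 * 6**-0.39 = 7.4 mm/h.
    print(exceeds_threshold(8.0, 6.0))  # True  -> threshold exceeded, flag an alert
    print(exceeds_threshold(3.0, 6.0))  # False -> below the curve
    ```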

  12. Secure software practices among Malaysian software practitioners: An exploratory study

    NASA Astrophysics Data System (ADS)

    Mohamed, Shafinah Farvin Packeer; Baharom, Fauziah; Deraman, Aziz; Yahya, Jamaiah; Mohd, Haslina

    2016-08-01

    Secure software practices are increasingly gaining importance among software practitioners and researchers due to the rise of computer crimes in the software industry. They have become one of the determinant factors for producing high-quality software. Even though their importance is recognized, their current practice in the software industry is still scarce, particularly in Malaysia. Thus, an exploratory study was conducted among software practitioners in Malaysia to study their experiences and practices in real-world projects. This paper discusses the findings from the study, which involved 93 software practitioners. A structured questionnaire was utilized for data collection, whilst statistical methods such as frequency, mean, and cross tabulation were used for data analysis. Outcomes from this study reveal that software practitioners are becoming increasingly aware of the importance of secure software practices; however, they lack appropriate implementation, which could affect the quality of the produced software.

  13. Software reliability through fault-avoidance and fault-tolerance

    NASA Technical Reports Server (NTRS)

    Vouk, Mladen A.; Mcallister, David F.

    1993-01-01

    Strategies and tools for the testing, risk assessment, and risk control of dependable software-based systems were developed. Part of this project consists of studies to enable the transfer of technology to industry, for example the risk management techniques for safety-conscious systems. Theoretical investigations of the Boolean and Relational Operator (BRO) testing strategy were conducted for condition-based testing. The Basic Graph Generation and Analysis tool (BGG) was extended to fully incorporate several variants of the BRO metric. Single- and multi-phase risk, coverage, and time-based models are being developed to provide additional theoretical and empirical basis for estimating the reliability and availability of large, highly dependable software. A model for software process and risk management was developed. The use of cause-effect graphing for software specification and validation was investigated. Lastly, advanced software fault-tolerance models were studied to provide alternatives and improvements in situations where simple software fault-tolerance strategies break down.
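
    One of the simple software fault-tolerance strategies the abstract alludes to is N-version majority voting: run independently developed versions of a computation and accept the value most of them agree on. A minimal sketch (my illustration, not the project's models), which assumes outputs are exactly comparable:

    ```python
    from collections import Counter

    def majority_vote(results):
        """Return the output a strict majority of versions agree on, or raise.
        Exact equality of outputs is assumed (a simplification)."""
        value, count = Counter(results).most_common(1)[0]
        if count * 2 <= len(results):
            raise RuntimeError("no majority: versions disagree")
        return value

    # Three independently developed versions of the same computation:
    versions = [lambda x: x * x, lambda x: x ** 2, lambda x: x * x + 1]  # one faulty
    print(majority_vote([v(7) for v in versions]))  # 49: the faulty version is outvoted
    ```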

  14. An Overview of the Smart Sensor Inter-Agency Reference Testbench (SSIART)

    NASA Technical Reports Server (NTRS)

    Wagner, Raymond S.; Braham, Stephen P.; Dufour, Jean-Francois; Barton, Richard J.

    2012-01-01

    In this paper, we present an overview of a proposed collaboration between the National Aeronautics and Space Administration (NASA) and the European Space Agency (ESA), which is designed to facilitate the introduction of commercial-off-the-shelf (COTS) radios for smart-sensing applications into international spaceflight programs and projects. The proposed work will produce test hardware reference designs, test software reference architectures and example implementations, test plans in reference test environments, and test results, all of which will be shared between the agencies and documented for future use by mission planners. The proposed collaborative structure together with all of the anticipated tools and results produced under the effort is collectively referred to as the Smart Sensor Inter-agency Reference Testbench or SSIART. It is intended to provide guidance in technology selection and in increasing the related readiness levels of projects and missions as well as the space industry.

  15. Generalized Nanosatellite Avionics Testbed Lab

    NASA Technical Reports Server (NTRS)

    Frost, Chad R.; Sorgenfrei, Matthew C.; Nehrenz, Matt

    2015-01-01

    The Generalized Nanosatellite Avionics Testbed (G-NAT) lab at NASA Ames Research Center provides a flexible, easily accessible platform for developing hardware and software for advanced small spacecraft. A collaboration between the Mission Design Division and the Intelligent Systems Division, the objective of the lab is to provide testing data and general test protocols for advanced sensors, actuators, and processors for CubeSat-class spacecraft. By developing test schemes for advanced components outside of the standard mission lifecycle, the lab is able to help reduce the risk carried by advanced nanosatellite or CubeSat missions. Such missions are often allocated very little time for testing, and too often the test facilities must be custom-built for the needs of the mission at hand. The G-NAT lab helps to eliminate these problems by providing an existing suite of testbeds that combines easily accessible, commercial-off-the-shelf (COTS) processors with a collection of existing sensors and actuators.

  16. Cellscope Aquatic: a Lab Quality, Portable Cellphone-Based Microscope for On-Site Collection of Algae Images

    NASA Astrophysics Data System (ADS)

    Steinberg, S. J.; Howard, M. D.

    2016-02-01

    Collecting algae samples from the field presents issues of specimen damage or degradation caused by preservation methods, handling, and transport to laboratory facilities for identification. Traditionally, in-field collection of high-quality microscopic images has not been possible due to the size, weight, and fragility of high-quality instruments and the training required of field staff in species identification. Scientists at the Southern California Coastal Water Research Project (SCCWRP), in collaboration with the Fletcher Lab, University of California Berkeley, Department of Bioengineering, tested and translated Fletcher's original medical CellScope for use in environmental monitoring applications. Field tests conducted by SCCWRP in 2014 led to modifications of the clinical CellScope to one better suited to in-field microscopic imaging of aquatic organisms. SCCWRP subsequently developed a custom cell-phone application to acquire microscopic imagery using the "CellScope Aquatic" in combination with other cell-phone-derived field data (e.g., GPS location, date, time, and other field observations). Data and imagery collected in the field may be transmitted in real time to a web-based data system for tele-taxonomy evaluation and assessment by experts in the office. These hardware and software tools were used in the field in a variety of conditions and settings by multiple algae experts during the spring and summer of 2015 to further test and refine the CellScope Aquatic platform. The CellScope Aquatic provides an easy-to-use, affordable, lightweight, professional-quality data collection platform for environmental monitoring. Our ongoing efforts will focus on the development of real-time expert systems for data analysis and image processing, to provide on-site feedback to field scientists.

  17. Software errors and complexity: An empirical investigation

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Perricone, Berry T.

    1983-01-01

    The distributions and relationships derived from the change data collected during the development of a medium scale satellite software project show that meaningful results can be obtained which allow an insight into software traits and the environment in which it is developed. Modified and new modules were shown to behave similarly. An abstract classification scheme for errors which allows a better understanding of the overall traits of a software project is also shown. Finally, various size and complexity metrics are examined with respect to errors detected within the software yielding some interesting results.

  18. Software errors and complexity: An empirical investigation

    NASA Technical Reports Server (NTRS)

    Basili, V. R.; Perricone, B. T.

    1982-01-01

    The distributions and relationships derived from the change data collected during the development of a medium scale satellite software project show that meaningful results can be obtained which allow an insight into software traits and the environment in which it is developed. Modified and new modules were shown to behave similarly. An abstract classification scheme for errors which allows a better understanding of the overall traits of a software project is also shown. Finally, various size and complexity metrics are examined with respect to errors detected within the software yielding some interesting results.

  19. ICESat (GLAS) Science Processing Software Document Series. Volume 3; GLAS Science Software Requirements Document; Ver 2.1

    NASA Technical Reports Server (NTRS)

    Jester, Peggy L.; Lee, Jeffrey; Zukor, Dorothy J. (Technical Monitor)

    2001-01-01

    This document addresses the software requirements of the Geoscience Laser Altimeter System (GLAS) Standard Data Software (SDS) supporting the GLAS instrument on the EOS ICESat Spacecraft. This Software Requirements Document represents the initial collection of the technical engineering information for the GLAS SDS. This information is detailed within the second of four main volumes of the Standard documentation, the Product Specification volume. This document is a "roll-out" from the governing volume outline containing the Concept and Requirements sections.

  20. The implementation of intelligent home controller

    NASA Astrophysics Data System (ADS)

    Li, Biqing; Li, Zhao

    2018-04-01

    This paper mainly describes the working principle of a smart home terminal controller and the design of its hardware and software. The controller switches the lamp ON and OFF and drives the curtain UP and DOWN, which was verified by simulating the lamp and testing the curtain. Through its sensors it collects ambient information, such as light, temperature, and humidity, and sends it to the network. Besides, it can realize intelligent home control from PCs. The terminal controller, which is based on ZigBee technology, has been integrated into the intelligent home system; it provides people with a convenient, safe, and intelligent household experience.

  1. Dynamic Emulation of NASA Missions for IV&V: A Case Study of JWST and SLS

    NASA Technical Reports Server (NTRS)

    Yokum, Steve

    2015-01-01

    Software-only simulations are an emerging but quickly developing field of study throughout NASA. The NASA Independent Verification and Validation (IV&V) Independent Test Capability (ITC) team has been rapidly building a collection of simulators for a wide range of NASA missions. ITC specializes in full end-to-end simulations that enable developers, V&V personnel, and operators to test-as-you-fly. In four years, the team has delivered a wide variety of spacecraft simulations, ranging from lower-complexity science missions such as the Global Precipitation Measurement (GPM) satellite and the Deep Space Climate Observatory (DSCOVR) to extremely complex missions such as the James Webb Space Telescope (JWST) and the Space Launch System (SLS).

  2. pyam: Python Implementation of YaM

    NASA Technical Reports Server (NTRS)

    Myint, Steven; Jain, Abhinandan

    2012-01-01

    pyam is a software development framework with tools for facilitating the rapid development of software in a concurrent software development environment. pyam provides solutions for development challenges associated with software reuse, managing multiple software configurations, developing software product lines, and multiple-platform development and build management. pyam uses release-early, release-often development cycles to allow developers to integrate their changes incrementally into the system on a continual basis. It facilitates the creation and merging of branches to support the isolated development of immature software to avoid impacting the stability of the development effort. It uses modules and packages to organize and share software across multiple software products, and uses the concepts of link and work modules to reduce sandbox setup times even when the code base is large. One side benefit is the enforcement of a strong module-level encapsulation of a module's functionality and interface. This increases design transparency, system stability, and software reuse. pyam is written in Python and is organized as a set of utilities on top of the open-source SVN software version control package. All development software is organized into a collection of modules. pyam packages are defined as sub-collections of the available modules. Developers can set up private sandboxes for module/package development. All module/package development takes place on private SVN branches. High-level pyam commands support the setup, update, and release of modules and packages. Released and pre-built versions of modules are available to developers. Developers can tailor the source/link module mix for their sandboxes so that new sandboxes (even large ones) can be built up easily and quickly by pointing to pre-existing module releases. All inter-module interfaces are publicly exported via links. A minimal, but uniform, convention is used for building modules.

  3. Development of new data acquisition system for COMPASS experiment

    NASA Astrophysics Data System (ADS)

    Bodlak, M.; Frolov, V.; Jary, V.; Huber, S.; Konorov, I.; Levit, D.; Novy, J.; Salac, R.; Virius, M.

    2016-04-01

    This paper presents the development and recent status of the new data acquisition system of the COMPASS experiment at CERN, which handles up to a 50 kHz trigger rate and a 36 kB average event size during 10-second beam periods followed by approximately 40-second periods without beam. In the original DAQ, event building is performed by software deployed on a switched computer network, and the data readout is based on the deprecated PCI technology; the new system replaces the event-building network with custom FPGA-based hardware. The custom cards are introduced and the advantages of FPGA technology for DAQ-related tasks are discussed. In this paper, we focus on the software part, which is mainly responsible for control and monitoring. Most of the system can run as slow control; only the readout process has real-time requirements. The design of the software is built on state machines implemented using the Qt framework; communication between the remote nodes that form the software architecture is based on the DIM library and IPBus technology. Furthermore, the PHP and JS languages are used to maintain the system configuration, and the MySQL database was selected as storage for both the system configuration and system messages. The system has been designed for a maximum throughput of 1500 MB/s, with a large buffering capability used to spread the load on readout computers over a longer period of time. Great emphasis is put on data latency, data consistency, and timing checks, which are performed at each stage of event assembly. The system collects the results of these checks, which, together with a special data format, allows the software to localize the origin of problems in the data transmission process. A prototype version of the system has already been developed and tested; the new system fulfills all given requirements. It is expected that the full-scale version of the system will be finalized in June 2014 and deployed in September, provided that tests with cosmic runs succeed.
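
    To make the state-machine discipline mentioned above concrete, here is a standalone Python toy (the real system uses Qt state machines with DIM/IPBus messaging; the states and commands below are invented) showing how illegal transitions are rejected:

      # Minimal DAQ-node state machine: a table of allowed transitions.
      TRANSITIONS = {
          ("idle", "configure"): "configured",
          ("configured", "start"): "running",
          ("running", "stop"): "configured",
          ("configured", "reset"): "idle",
      }

      class DaqNode:
          def __init__(self):
              self.state = "idle"

          def handle(self, command):
              key = (self.state, command)
              if key not in TRANSITIONS:
                  raise RuntimeError(f"'{command}' not allowed in state '{self.state}'")
              self.state = TRANSITIONS[key]
              return self.state

      node = DaqNode()
      for cmd in ["configure", "start", "stop", "reset"]:
          print(cmd, "->", node.handle(cmd))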

  4. Software Engineering Laboratory (SEL) Data Base Maintenance System (DBAM) user's guide and system description

    NASA Technical Reports Server (NTRS)

    Lo, P. S.; Card, D.

    1983-01-01

    The Software Engineering Laboratory (SEL) Data Base Maintenance System (DBAM) is explained. The various software facilities of the SEL, DBAM operating procedures, and DBAM system information are described. The relationships among DBAM components (baseline diagrams), component descriptions, overlay descriptions, indirect command file listings, file definitions, and sample data collection forms are provided.

  5. Improving the Effectiveness of Program Managers

    DTIC Science & Technology

    2006-05-03

    Improving the Effectiveness of Program Managers Systems and Software Technology Conference Salt Lake City, Utah May 3, 2006 Presented by GAO’s...Companies’ best practices Motorola Caterpillar Toyota FedEx NCR Teradata Boeing Hughes Space and Communications Disciplined software and management...and total ownership costs Collection of metrics data to improve software reliability Technology readiness levels and design maturity Statistical

  6. NASA Data Acquisition System Software Development for Rocket Propulsion Test Facilities

    NASA Technical Reports Server (NTRS)

    Herbert, Phillip W., Sr.; Elliot, Alex C.; Graves, Andrew R.

    2015-01-01

    Current NASA propulsion test facilities include Stennis Space Center in Mississippi, Marshall Space Flight Center in Alabama, Plum Brook Station in Ohio, and White Sands Test Facility in New Mexico. Within and across these centers, a diverse set of data acquisition systems exists with different hardware and software platforms. The NASA Data Acquisition System (NDAS) is a software suite designed to operate and control many critical aspects of rocket engine testing. The software suite combines real-time data visualization, data recording to a variety of formats, short-term and long-term acquisition system calibration capabilities, test stand configuration control, and a variety of data post-processing capabilities. Additionally, data stream conversion functions exist to translate test facility data streams to and from downstream systems, including engine customer systems. The primary design goals for NDAS are flexibility, extensibility, and modularity. Providing a common user interface for a variety of hardware platforms helps drive consistency and error reduction during testing. In addition, with an understanding that test facilities have different requirements and setups, the software is designed to be modular. One engine program may require real-time displays and data recording; others may require more complex data stream conversion, measurement filtering, or test stand configuration management. The NDAS suite allows test facilities to choose which components to use based on their specific needs. The NDAS code is primarily written in LabVIEW, a graphical, data-flow driven language. Although LabVIEW is a general-purpose programming language, large-scale software development in it is relatively rare compared to more commonly used languages. The NDAS software suite also makes extensive use of an advanced development framework called the Actor Framework, which provides a level of code reuse and extensibility that has previously been difficult to achieve using LabVIEW.
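
    The Actor Framework credited above follows the general actor pattern: each component owns a mailbox and reacts to messages, so displays, recorders, and converters can be added without touching one another. A language-neutral Python sketch of that pattern (not NDAS or LabVIEW code; the names are invented):

      import queue
      import threading

      class Actor:
          """Each actor processes its own mailbox on a private thread."""
          def __init__(self):
              self.mailbox = queue.Queue()
              self._thread = threading.Thread(target=self._run, daemon=True)
              self._thread.start()

          def _run(self):
              while True:
                  msg = self.mailbox.get()
                  if msg is None:          # poison pill shuts the actor down
                      break
                  self.receive(msg)

          def receive(self, msg):
              raise NotImplementedError

      class Recorder(Actor):
          def receive(self, msg):
              print("recording sample:", msg)

      rec = Recorder()
      rec.mailbox.put({"channel": "PT-101", "value": 512.3})
      rec.mailbox.put(None)
      rec._thread.join()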

  7. Rules of thumb to increase the software quality through testing

    NASA Astrophysics Data System (ADS)

    Buttu, M.; Bartolini, M.; Migoni, C.; Orlati, A.; Poppi, S.; Righini, S.

    2016-07-01

    Software maintenance typically accounts for 40-80% of overall project costs, and this considerable variability mostly depends on the software's internal quality: the more the software is designed and implemented to readily accommodate new changes, the lower the maintenance costs will be. Internal quality is typically enforced through testing, which in turn also affects development and maintenance costs. This is the reason why testing methodologies have become a major concern for any company that builds - or is involved in building - software. Although there is no testing approach that suits all contexts, we infer some general guidelines learned during the development of the Italian Single-dish COntrol System (DISCOS), a project aimed at producing the control software for the three INAF radio telescopes (the Medicina and Noto dishes, and the newly-built SRT). These guidelines concern both the development and maintenance phases, and their ultimate goal is to maximize the DISCOS software quality through a Behavior-Driven Development (BDD) workflow alongside a continuous delivery pipeline. We consider different topics and patterns: the proper apportionment of tests (from end-to-end to low-level tests), the choice between hardware simulators and mockers, why and how to apply TDD and dependency injection to increase test coverage, the emerging technologies available for test isolation, bug fixing, how to protect the system from changes in external resources (firmware updates, hardware substitution, etc.) and, eventually, how to accomplish BDD starting from functional tests and going through integration and unit tests. We discuss the pros and cons of each solution and point out the motivations for our choices, either as general rules or narrowed to the context of the DISCOS project.
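
    The combination of dependency injection and mockers recommended above fits in a few lines of Python (the class and method names are invented, not DISCOS code): because the controller receives its device interface, a test can substitute a mock for the real hardware.

      from unittest.mock import Mock

      class PointingController:
          def __init__(self, antenna):        # the dependency is injected here
              self.antenna = antenna

          def slew_to(self, azimuth, elevation):
              if not 0 <= elevation <= 90:
                  raise ValueError("elevation out of range")
              self.antenna.move(azimuth, elevation)

      antenna = Mock()                         # mocker stands in for the dish
      PointingController(antenna).slew_to(180.0, 45.0)
      antenna.move.assert_called_once_with(180.0, 45.0)
      print("mocked slew verified")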

  8. The Alignment of Software Testing Skills of IS Students with Industry Practices--A South African Perspective

    ERIC Educational Resources Information Center

    Scott, Elsje; Zadirov, Alexander; Feinberg, Sean; Jayakody, Ruwanga

    2004-01-01

    Software testing is a crucial component in the development of good quality systems in industry. For this reason it was considered important to investigate the extent to which the Information Systems (IS) syllabus at the University of Cape Town (UCT) was aligned with accepted software testing practices in South Africa. For students to be effective…

  9. Acquisition Handbook - Update. Comprehensive Approach to Reusable Defensive Software (CARDS)

    DTIC Science & Technology

    1994-03-25

    designs, and implementation components (source code, test plans, procedures and results, and system/software documentation). This handbook provides a...activities where software components are acquired, evaluated, tested and sometimes modified. In addition to serving as a facility for the acquisition and...systems from such components [1]. Implementation components are at the lowest level and consist of: specifications; detailed designs; code, test

  10. Improving Building Energy Simulation Programs Through Diagnostic Testing (Fact Sheet)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2012-02-01

    New test procedure evaluates quality and accuracy of energy analysis tools for the residential building retrofit market. Reducing the energy use of existing homes in the United States offers significant energy-saving opportunities, which can be identified through building simulation software tools that calculate optimal packages of efficiency measures. To improve the accuracy of energy analysis for residential buildings, the National Renewable Energy Laboratory's (NREL) Buildings Research team developed the Building Energy Simulation Test for Existing Homes (BESTEST-EX), a method for diagnosing and correcting errors in building energy audit software and calibration procedures. BESTEST-EX consists of building physics and utility bill calibration test cases, which software developers can use to compare their tools' simulation findings to reference results generated with state-of-the-art simulation tools. Overall, the BESTEST-EX methodology: (1) tests software predictions of retrofit energy savings in existing homes; (2) ensures building physics calculations and utility bill calibration procedures perform to a minimum standard; and (3) quantifies impacts of uncertainties in input audit data and occupant behavior. BESTEST-EX is helping software developers identify and correct bugs in their software, as well as develop and test utility bill calibration procedures.
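
    A test-case comparison of this kind reduces, in essence, to checking predicted savings against reference results within a tolerance. The sketch below is hypothetical: the measure names, savings values, and the 15% acceptance band are all invented rather than BESTEST-EX criteria.

      # Toy acceptance check: tool predictions vs. reference results (kWh/yr).
      reference = {"air_sealing": 1200.0, "attic_insulation": 2100.0}
      predicted = {"air_sealing": 1125.0, "attic_insulation": 2310.0}

      for case, ref in reference.items():
          rel_err = abs(predicted[case] - ref) / ref
          status = "PASS" if rel_err <= 0.15 else "FAIL"
          print(f"{case}: predicted {predicted[case]:.0f}, reference {ref:.0f}, "
                f"error {rel_err:.1%} -> {status}")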

  11. Evaluating the interior thermal performance of mosques in the tropical environment

    NASA Astrophysics Data System (ADS)

    Nordin, N. I.; Misni, A.

    2018-02-01

    This study introduces the methodology applied in conducting data collection and data analysis. Data collection is the process of gathering and measuring information on targeted variables in an established, systematic manner. Qualitative and quantitative methods were combined in collecting data from government departments, site experiments, and observation. Indoor thermal performance data from the heritage and new mosques were gathered through thermal monitoring tests and validated against meteorological data. The Origin 8 software was used to analyse all the data. Comparison techniques were applied to analyse several factors that influence the indoor thermal performance of mosques, namely the building envelope, including floor area, openings, and materials used. Building orientation, location, surrounding vegetation, and water elements were also recorded as supporting primary data. Primary data for four mosques, comprising both heritage and new buildings, were compared using these variables.

  12. Preparing for the first meeting with a statistician.

    PubMed

    De Muth, James E

    2008-12-15

    Practical statistical issues that should be considered when performing data collection and analysis are reviewed. The meeting with a statistician should take place early in the research development before any study data are collected. The process of statistical analysis involves establishing the research question, formulating a hypothesis, selecting an appropriate test, sampling correctly, collecting data, performing tests, and making decisions. Once the objectives are established, the researcher can determine the characteristics or demographics of the individuals required for the study, how to recruit volunteers, what type of data are needed to answer the research question(s), and the best methods for collecting the required information. There are two general types of statistics: descriptive and inferential. Presenting data in a more palatable format for the reader is called descriptive statistics. Inferential statistics involve making an inference or decision about a population based on results obtained from a sample of that population. In order for the results of a statistical test to be valid, the sample should be representative of the population from which it is drawn. When collecting information about volunteers, researchers should only collect information that is directly related to the study objectives. Important information that a statistician will require first is an understanding of the type of variables involved in the study and which variables can be controlled by researchers and which are beyond their control. Data can be presented in one of four different measurement scales: nominal, ordinal, interval, or ratio. Hypothesis testing involves two mutually exclusive and exhaustive statements related to the research question. Statisticians should not be replaced by computer software, and they should be consulted before any research data are collected. When preparing to meet with a statistician, the pharmacist researcher should be familiar with the steps of statistical analysis and consider several questions related to the study to be conducted.
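
    The descriptive/inferential distinction drawn in this abstract can be shown in a few lines of Python (the data values are invented):

      from statistics import mean, stdev
      from scipy import stats

      control = [5.1, 4.8, 5.6, 5.0, 4.9, 5.3]
      treated = [5.9, 6.1, 5.7, 6.3, 5.8, 6.0]

      # Descriptive: present the data in a more palatable form.
      print(f"control: mean={mean(control):.2f} sd={stdev(control):.2f}")
      print(f"treated: mean={mean(treated):.2f} sd={stdev(treated):.2f}")

      # Inferential: decide about the populations from the samples.
      t, p = stats.ttest_ind(treated, control)
      print(f"t={t:.2f}, p={p:.4f} -> reject H0 at alpha=0.05: {p < 0.05}")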

  13. Analysis of a mammography teaching program based on an affordance design model.

    PubMed

    Luo, Ping; Eikman, Edward A; Kealy, William; Qian, Wei

    2006-12-01

    The wide use of computer technology in education, particularly in mammogram reading, calls for e-learning evaluation. Existing media-comparative studies, learner attitude evaluations, and performance tests are problematic. Based on an affordance design model, this study examined an existing e-learning program on mammogram reading. The selection criteria included content relatedness, representativeness, e-learning orientation, image quality, program completeness, and accessibility. A case study was conducted to examine the affordance features, functions, and presentations of the selected software. Data collection and analysis methods included interviews, protocol-based document analysis, and usability tests and inspection; some statistics were also calculated. The examination identified that the PBE software provides a set of tools that the learner can use in the process of optimizing displays, scanning images, comparing different projections, marking regions of interest, constructing a descriptive report, assessing one's learning outcomes, and comparing one's decisions with the experts' decisions. Further, PBE provides resources for learners to construct their knowledge and skills, including a categorized image library, a term-searching function, and some teaching links. Users found it easy to navigate and carry out tasks, and reacted positively toward PBE's navigation system, instructional aids, layout, pace and flow of information, graphics, and other presentation design. The software provides learners with cognitive tools, supporting their perceptual problem-solving processes and extending their capabilities. Learners can internalize the mental models in mammogram reading through multiple perceptual triangulations, sensitization of related features, semantic description of mammogram findings, and expert-guided semantic report construction. The design of these cognitive tools and the software interface matches findings and principles in human learning and instructional design. Working with PBE's case-based simulations and categorized gallery, learners can enrich and transfer their experience to their jobs.

  14. Cavendish Balance Automation

    NASA Technical Reports Server (NTRS)

    Thompson, Bryan

    2000-01-01

    This is the final report for a project carried out to modify a manual commercial Cavendish balance for automated use in a cryostat. The scope of this project was to modify an off-the-shelf, manually operated Cavendish balance to allow automated operation for periods of hours or days in a cryostat. The purpose of this modification was to allow the balance to be used in the study of the effects of superconducting materials on the local gravitational field strength, to determine whether the strength of gravitational fields can be reduced. A Cavendish balance was chosen because it is a fairly simple piece of equipment for measuring the gravitational constant, one of the least accurately known and least understood physical constants. The principal activities that occurred under this purchase order were: (1) all the components necessary to hold and automate the Cavendish balance in a cryostat were designed; engineering drawings were made of custom parts to be fabricated, and other off-the-shelf parts were procured; (2) software was written in LabVIEW to control the automation process via a stepper motor controller and stepper motor, and to collect data from the balance during testing; (3) software was written to take the data collected from the Cavendish balance and reduce it to give a value for the gravitational constant; (4) the components of the system were assembled and fitted to a cryostat, and the LabVIEW hardware, including the control computer, stepper motor driver, data collection boards, and necessary cabling, was assembled; and (5) the system was operated for a number of periods, data collected, and reduced to give an average value for the gravitational constant.
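
    The data-reduction step in item (3) can be sketched from the textbook torsion-balance relation G = 2*pi^2*b^2*d*theta / (M*T^2), where d is the separation of the small balls, T the oscillation period, M the large mass, b its distance to the nearest small ball, and theta the equilibrium twist angle. The numbers below are invented for illustration, not the project's measurements:

      import math

      T = 480.0        # oscillation period, s
      d = 0.10         # separation of the two small balls, m
      b = 0.0465       # centre-to-centre distance to the large mass, m
      M = 1.5          # large mass, kg
      theta = 5.4e-3   # equilibrium deflection, rad

      G = 2 * math.pi**2 * b**2 * d * theta / (M * T**2)
      print(f"G = {G:.3e} m^3 kg^-1 s^-2")   # ~6.67e-11 for these inputs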

  15. Robust binarization of degraded document images using heuristics

    NASA Astrophysics Data System (ADS)

    Parker, Jon; Frieder, Ophir; Frieder, Gideon

    2013-12-01

    Historically significant documents are often discovered with defects that make them difficult to read and analyze. This fact is particularly troublesome if the defects prevent software from performing an automated analysis. Image enhancement methods are used to remove or minimize document defects, improve software performance, and generally make images more legible. We describe an automated, image enhancement method that is input page independent and requires no training data. The approach applies to color or greyscale images with hand written script, typewritten text, images, and mixtures thereof. We evaluated the image enhancement method against the test images provided by the 2011 Document Image Binarization Contest (DIBCO). Our method outperforms all 2011 DIBCO entrants in terms of average F1 measure - doing so with a significantly lower variance than top contest entrants. The capability of the proposed method is also illustrated using select images from a collection of historic documents stored at Yad Vashem Holocaust Memorial in Israel.
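
    The F1 measure used for ranking above compares the binarized image with ground truth pixel by pixel; a toy computation with numpy (1 = ink, 0 = background; the arrays are invented):

      import numpy as np

      truth = np.array([[1, 1, 0], [0, 1, 0], [0, 0, 1]])
      binar = np.array([[1, 0, 0], [0, 1, 0], [1, 0, 1]])

      tp = np.sum((binar == 1) & (truth == 1))   # ink kept
      fp = np.sum((binar == 1) & (truth == 0))   # background marked as ink
      fn = np.sum((binar == 0) & (truth == 1))   # ink lost

      precision = tp / (tp + fp)
      recall = tp / (tp + fn)
      f1 = 2 * precision * recall / (precision + recall)
      print(f"precision={precision:.2f} recall={recall:.2f} F1={f1:.2f}")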

  16. Digital Partnerships for Health: Steps to develop a community-specific health portal aimed at promoting health and well-being

    PubMed Central

    Kukafka, Rita; Khan, Sharib A.; Hutchinson, Carly; McFarlane, Delano J.; Li, Jianhua; Ancker, Jessica S.; Cohall, Alwyn

    2007-01-01

    We describe the steps taken by the Harlem Health Promotion Center to develop a community-specific health web portal aimed at promoting health and well-being in Harlem. Methods and results that begin with data collection and move onto elucidating requirements for the web portal are discussed. Sentiments of distrust in medical institutions, and the desire for community specific content and resources were among the needs emanating from our data analysis. These findings guided our decision to customize social software designed to foster connections, collaborations, flexibility, and interactivity; an “architecture of participation”. While we maintain that the leveraging of social software may indeed be the way to build healthy communities and support learning and engagement in underserved communities, our conclusion calls for careful thinking, testing and evaluation research to establish best practice models for leveraging these emerging technologies to support health improvements in the community. PMID:18693872

  17. Results from Automated Cloud and Dust Devil Detection Onboard the MER

    NASA Technical Reports Server (NTRS)

    Chien, Steve; Castano, Rebecca; Bornstein, Benjamin; Fukunaga, Alex; Castano, Andres; Biesiadecki, Jeffrey; Greeley, Ron; Whelley, Patrick; Lemmon, Mark

    2008-01-01

    We describe a new capability to automatically detect dust devils and clouds in imagery onboard rovers, enabling downlink of just the images with the targets or only portions of the images containing the targets. Previously, the MER rovers conducted campaigns to image dust devils and clouds by commanding a set of images be collected at fixed times and downloading the entire image set. By increasing the efficiency of the campaigns, more campaigns can be executed. Software for these new capabilities was developed, tested, integrated, uploaded, and operationally checked out on both rovers as part of the R9.2 software upgrade. In April 2007 on Sol 1147 a dust devil was automatically detected onboard the Spirit rover for the first time. We discuss the operational usage of the capability and present initial dust devil results showing how this preliminary application has demonstrated the feasibility and potential benefits of the approach.
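
    At its core, such onboard detection flags frames that differ from a static background. The flight algorithms are mission-specific, so the Python toy below (with invented thresholds and synthetic data) only conveys the shape of the idea:

      import numpy as np

      rng = np.random.default_rng(0)
      reference = rng.normal(100, 2, (64, 64))    # stand-in sky background

      def has_target(frame, reference, z=5.0, min_pixels=20):
          """Flag a frame whose residual against the background is large."""
          residual = np.abs(frame - reference)
          return bool((residual > z * reference.std()).sum() > min_pixels)

      quiet = reference + rng.normal(0, 2, reference.shape)
      dusty = quiet.copy()
      dusty[20:30, 40:48] += 40                   # bright transient feature
      print("quiet frame flagged:", has_target(quiet, reference))  # False
      print("dusty frame flagged:", has_target(dusty, reference))  # True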

  18. Reliability and availability analysis of a 10 kW@20 K helium refrigerator

    NASA Astrophysics Data System (ADS)

    Li, J.; Xiong, L. Y.; Liu, L. Q.; Wang, H. R.; Wang, B. M.

    2017-02-01

    A 10 kW@20 K helium refrigerator has been established in the Technical Institute of Physics and Chemistry, Chinese Academy of Sciences. To evaluate and improve this refrigerator's reliability and availability, a reliability and availability analysis is performed. According to the mission profile of this refrigerator, a functional analysis is performed. The failure data of the refrigerator components are collected and failure rate distributions are fitted using the software Weibull++ V10.0. A Failure Modes, Effects & Criticality Analysis (FMECA) is performed and the critical components with higher risks are pointed out. The software BlockSim V9.0 is used to calculate the reliability and the availability of this refrigerator. The result indicates that the compressors, turbine, and vacuum pump are the critical components and the key units of this refrigerator. Mitigation actions with respect to design, testing, maintenance, and operation are proposed to decrease the major and medium risks.
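
    The two numerical steps named above (Weibull fitting and availability calculation) can be reproduced in miniature; the failure times and repair time below are invented, and the study itself used Weibull++ and BlockSim rather than Python:

      import numpy as np
      from scipy import stats

      hours_to_failure = np.array([1200., 2300., 3100., 4200., 5000., 6400.])

      # Fit a two-parameter Weibull (location fixed at zero).
      shape, _, scale = stats.weibull_min.fit(hours_to_failure, floc=0)
      print(f"Weibull shape={shape:.2f}, scale={scale:.0f} h")

      # Steady-state availability from mean time between failures and repairs.
      mtbf = hours_to_failure.mean()
      mttr = 48.0                                  # assumed repair time, h
      print(f"availability = {mtbf / (mtbf + mttr):.4f}")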

  19. Gaming and Simulating Ethno-Political Conflicts

    NASA Astrophysics Data System (ADS)

    Silverman, Barry G.; Bharathy, Gnana K.; Nye, Benjamin D.

    This chapter begins by describing a universally recurring socio-cultural “game” of inter-group competition for control of resources. It next describes efforts to author software agents able to play the game as real humans would - which suggests the ability to study alternative ways to influence them, observe PMESII effects, and potentially understand how best to alter the outcomes of potential conflict situations. These agents are unscripted, but use their decision making to react to events as they unfold and to plan out responses. For each agent, software called PMFserv operates its perception and runs its physiology and personality/value system to determine fatigue and hunger, injuries and related stressors, grievances, tension buildup, impact of rumors and speech acts, emotions, and various collective and individual action decisions. The chapter wraps up with a correspondence test from a SE Asian ethnic conflict, the results of which indicate significant correlation between real and agent-based outcomes.

  20. Grasping objects autonomously in simulated KC-135 zero-g

    NASA Technical Reports Server (NTRS)

    Norsworthy, Robert S.

    1994-01-01

    The KC-135 aircraft was chosen for simulated zero-gravity testing of the Extravehicular Activity Helper/Retriever (EVAHR). A software simulation of the EVAHR hardware, KC-135 flight dynamics, collision detection, and grasp impact dynamics has been developed to integrate and test the EVAHR software prior to flight testing on the KC-135. The EVAHR software will perform target pose estimation, tracking, and motion estimation for rigid, freely rotating, polyhedral objects. Manipulator grasp planning and trajectory control software has also been developed to grasp targets while avoiding collisions.

  1. Software Quality Metrics: A Software Management Monitoring Method for Air Force Logistics Command in Its Software Quality Assurance Program for the Quantitative Assessment of the System Development Life Cycle under Configuration Management.

    DTIC Science & Technology

    1982-03-01

    pilot systems. Magnitude of the mutant error is classified as: o Program does not compute. o Program computes but does not run test data. o Program... and funds. While the test phase concludes the normal development cycle, one should realize that with software the development continues in the

  2. SCIL Executive Summaries.

    ERIC Educational Resources Information Center

    Samuels, Alan R.; And Others

    1987-01-01

    These five papers by speakers at the Small Computers in Libraries 1987 conference include: "Acquiring and Using Shareware in Building Small Scale Automated Information systems" (Samuels); "A Software Lending Collection" (Talab); "Providing Subject Access to Microcomputer Software" (Mitchell); "Interfacing Vendor…

  3. cPath: open source software for collecting, storing, and querying biological pathways

    PubMed Central

    Cerami, Ethan G; Bader, Gary D; Gross, Benjamin E; Sander, Chris

    2006-01-01

    Background Biological pathways, including metabolic pathways, protein interaction networks, signal transduction pathways, and gene regulatory networks, are currently represented in over 220 diverse databases. These data are crucial for the study of specific biological processes, including human diseases. Standard exchange formats for pathway information, such as BioPAX, CellML, SBML and PSI-MI, enable convenient collection of this data for biological research, but mechanisms for common storage and communication are required. Results We have developed cPath, an open source database and web application for collecting, storing, and querying biological pathway data. cPath makes it easy to aggregate custom pathway data sets available in standard exchange formats from multiple databases, present pathway data to biologists via a customizable web interface, and export pathway data via a web service to third-party software, such as Cytoscape, for visualization and analysis. cPath is software only, and does not include new pathway information. Key features include: a built-in identifier mapping service for linking identical interactors and linking to external resources; built-in support for PSI-MI and BioPAX standard pathway exchange formats; a web service interface for searching and retrieving pathway data sets; and thorough documentation. The cPath software is freely available under the LGPL open source license for academic and commercial use. Conclusion cPath is a robust, scalable, modular, professional-grade software platform for collecting, storing, and querying biological pathways. It can serve as the core data handling component in information systems for pathway visualization, analysis and modeling. PMID:17101041

  4. Preliminary design review package for the solar heating and cooling central data processing system

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The Central Data Processing System (CDPS) is designed to transform the raw data collected at remote sites into performance evaluation information for assessing the performance of solar heating and cooling systems. Software requirements for the CDPS are described. The programming standards to be used in development, documentation, and maintenance of the software are discussed along with the CDPS operations approach in support of daily data collection and processing.

  5. Adaptive Integration of Nonsmooth Dynamical Systems

    DTIC Science & Technology

    2017-10-11

    controlled time stepping method to interactively design running robots. [1] John Shepherd, Samuel Zapolsky, and Evan M. Drumwright, “Fast multi-body... Started working in simulation after attempting to use software like this to test software running on my robots. The libraries that produce these beautiful results have failed at simulating robotic manipulation. Postulate: It is easier to

  6. Guidance and Navigation Software Architecture Design for the Autonomous Multi-Agent Physically Interacting Spacecraft (AMPHIS) Test Bed

    DTIC Science & Technology

    2006-12-01

    By Blake D. Eikenberry. Approved for public release; distribution is unlimited.

  7. Model-based software process improvement

    NASA Technical Reports Server (NTRS)

    Zettervall, Brenda T.

    1994-01-01

    The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.

  8. Waste retrieval sluicing system data acquisition system acceptance test report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bevins, R.R.

    1998-07-31

    This document describes the test procedure for the Project W-320 Tank C-106 Sluicing Data Acquisition System (W-320 DAS). The Software Test portion will test items identified in the WRSS DAS System Description (SD), HNF-2115. Traceability to HNF-2115 will be via a reference that follows in parentheses after the test section title. The Field Test portion will test sensor operability, analog-to-digital conversion, and alarm setpoints for field instrumentation. The W-320 DAS supplies data to assist thermal modeling of tanks 241-C-106 and 241-AY-102. It is designed to be a central repository for information from sources that would otherwise have to be read, recorded, and integrated manually. Thus, completion of the DAS requires communication with several different data collection devices and output to usable PC data formats. This test procedure will demonstrate that the DAS functions as required by the project requirements stated in Section 3 of the W-320 DAS System Description, HNF-2115.

  9. Software for Automated Testing of Mission-Control Displays

    NASA Technical Reports Server (NTRS)

    OHagan, Brian

    2004-01-01

    MCC Display Cert Tool is a set of software tools for automated testing of computer-terminal displays in spacecraft mission-control centers, including those of the space shuttle and the International Space Station. This software makes it possible to perform tests that are more thorough, take less time, and are less likely to lead to erroneous results, relative to tests performed manually. This software enables comparison of two sets of displays to report command and telemetry differences, generates test scripts for verifying telemetry and commands, and generates a documentary record containing display information, including version and corrective-maintenance data. At the time of reporting the information for this article, work was continuing to add a capability for validation of display parameters against a reconfiguration file.
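
    The comparison step described above (reporting command and telemetry differences between two display sets) is essentially a keyed diff; the Python miniature below is hypothetical, with invented identifiers rather than real mission-control display data:

      def diff_displays(old, new):
          """Report added, removed, and changed telemetry bindings."""
          report = []
          for key in sorted(old.keys() | new.keys()):
              if key not in new:
                  report.append(f"removed: {key}")
              elif key not in old:
                  report.append(f"added:   {key} -> {new[key]}")
              elif old[key] != new[key]:
                  report.append(f"changed: {key}: {old[key]} -> {new[key]}")
          return report

      v1 = {"MSID-1001": "CABIN_PRESS", "MSID-1002": "O2_FLOW"}
      v2 = {"MSID-1001": "CABIN_PRESS_A", "MSID-1003": "N2_FLOW"}
      print("\n".join(diff_displays(v1, v2)))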

  10. The Advanced Communication Technology Satellite and ISDN

    NASA Technical Reports Server (NTRS)

    Lowry, Peter A.

    1996-01-01

    This paper depicts the Advanced Communication Technology Satellite (ACTS) system as a global central office switch. The ground portion of the system is the collection of earth stations or T1-VSATs (T1 very small aperture terminals). The control software for the T1-VSATs resides in a single CPU. The software consists of two modules, the modem manager and the call manager. The modem manager (MM) controls the RF modem portion of the T1-VSAT. It processes the orderwires from the satellite or from signaling generated by the call manager (CM). The CM controls the Recom Laboratories MSPs by receiving signaling messages from the stacked MSP shelves or units and sending appropriate setup commands to them. There are two methods used to set up and process calls in the CM: first, by dialing up a circuit using a standard telephone handset; and second, by using an external processor connected to the CPU's second COM port to send and receive signaling orderwires. It is the use of the external processor that permits the ISDN (Integrated Services Digital Network) Signaling Processor to implement ISDN calls. In August 1993, the initial testing of the ISDN Signaling Processor was carried out at ACTS System Test at Lockheed Marietta, Princeton, NJ, using the spacecraft in its test configuration on the ground.

  11. NASA Data Acquisitions System (NDAS) Software Architecture

    NASA Technical Reports Server (NTRS)

    Davis, Dawn; Duncan, Michael; Franzl, Richard; Holladay, Wendy; Marshall, Peggi; Morris, Jon; Turowski, Mark

    2012-01-01

    The NDAS Software Project is for the development of common low speed data acquisition system software to support NASA's rocket propulsion testing facilities at John C. Stennis Space Center (SSC), White Sands Test Facility (WSTF), Plum Brook Station (PBS), and Marshall Space Flight Center (MSFC).

  12. Planning the Unplanned Experiment: Towards Assessing the Efficacy of Standards for Safety-Critical Software

    NASA Technical Reports Server (NTRS)

    Graydon, Patrick J.; Holloway, C. M.

    2015-01-01

    Safe use of software in safety-critical applications requires well-founded means of determining whether software is fit for such use. While software in industries such as aviation has a good safety record, little is known about whether standards for software in safety-critical applications 'work' (or even what that means). It is often (implicitly) argued that software is fit for safety-critical use because it conforms to an appropriate standard. Without knowing whether a standard works, such reliance is an experiment; without carefully collecting assessment data, that experiment is unplanned. To help plan the experiment, we organized a workshop to develop practical ideas for assessing software safety standards. In this paper, we relate and elaborate on the workshop discussion, which revealed subtle but important study design considerations and practical barriers to collecting appropriate historical data and recruiting appropriate experimental subjects. We discuss assessing standards as written and as applied, several candidate definitions for what it means for a standard to 'work,' and key assessment strategies and study techniques and the pros and cons of each. Finally, we conclude with thoughts about the kinds of research that will be required and how academia, industry, and regulators might collaborate to overcome the noted barriers.

  13. Training, Quality Assurance Factors, and Tools Investigation: a Work Report and Suggestions on Software Quality Assurance

    NASA Technical Reports Server (NTRS)

    Lee, Pen-Nan

    1991-01-01

    Previously, several research tasks have been conducted, some observations were obtained, and several possible suggestions have been contemplated involving software quality assurance engineering at NASA Johnson. These research tasks are briefly described. Also, a brief discussion is given on the role of software quality assurance in software engineering along with some observations and suggestions. A brief discussion on a training program for software quality assurance engineers is provided. A list of assurance factors as well as quality factors are also included. Finally, a process model which can be used for searching and collecting software quality assurance tools is presented.

  14. Estimating BrAC from transdermal alcohol concentration data using the BrAC estimator software program.

    PubMed

    Luczak, Susan E; Rosen, I Gary

    2014-08-01

    Transdermal alcohol sensor (TAS) devices have the potential to allow researchers and clinicians to unobtrusively collect naturalistic drinking data for weeks at a time, but the transdermal alcohol concentration (TAC) data these devices produce do not consistently correspond with breath alcohol concentration (BrAC) data. We present and test the BrAC Estimator software, a program designed to produce individualized estimates of BrAC from TAC data by fitting mathematical models to a specific person wearing a specific TAS device. Two TAS devices were worn simultaneously by 1 participant for 18 days. The trial began with a laboratory alcohol session to calibrate the model and was followed by a field trial with 10 drinking episodes. Model parameter estimates and fit indices were compared across drinking episodes to examine the calibration phase of the software. Software-generated estimates of peak BrAC, time of peak BrAC, and area under the BrAC curve were compared with breath analyzer data to examine the estimation phase of the software. In this single-subject design with breath analyzer peak BrAC scores ranging from 0.013 to 0.057, the software created consistent models for the 2 TAS devices, despite differences in raw TAC data, and was able to compensate for the attenuation of peak BrAC and latency of the time of peak BrAC that are typically observed in TAC data. This software program represents an important initial step for making it possible for non-mathematician researchers and clinicians to obtain estimates of BrAC from TAC data in naturalistic drinking environments. Future research with more participants and greater variation in alcohol consumption levels and patterns, as well as examination of gain scheduling calibration procedures and nonlinear models of diffusion, will help to determine how precise these software models can become. Copyright © 2014 by the Research Society on Alcoholism.
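
    The attenuation and latency of TAC relative to BrAC noted above already appear in a deliberately simplified first-order exchange model, dTAC/dt = k1*BrAC - k2*TAC. This is only a stand-in for illustration, not the paper's distributed-parameter diffusion model; all rates and the drinking episode below are invented:

      import numpy as np

      k1, k2, dt = 0.4, 0.5, 0.1        # per-hour rates; 6-minute time step
      t = np.arange(0, 8, dt)           # hours
      brac = 0.06 * np.exp(-((t - 1.5) ** 2) / 0.8)   # toy drinking episode

      tac = np.zeros_like(t)
      for i in range(1, len(t)):        # forward-Euler integration
          tac[i] = tac[i-1] + dt * (k1 * brac[i-1] - k2 * tac[i-1])

      lag = (tac.argmax() - brac.argmax()) * dt
      print(f"peak BrAC {brac.max():.3f} at {t[brac.argmax()]:.1f} h; "
            f"peak TAC {tac.max():.3f} (attenuated), lagging {lag:.1f} h")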

  15. A data model for clinical legal medicine practice and the development of a dedicated software for both practitioners and researchers.

    PubMed

    Dang, Catherine; Phuong, Thomas; Beddag, Mahmoud; Vega, Anabel; Denis, Céline

    2018-07-01

    To present a data model for clinical legal medicine and the software based on that data model for both practitioners and researchers. The main functionalities of the presented software are computer-assisted production of medical certificates and data capture, storage, and retrieval. The data model and the software were jointly developed by the department of forensic medicine of the Jean Verdier Hospital (Bondy, France) and a bioinformatics laboratory (LIMICS, Paris universities 6-13) between November 2015 and May 2016. The data model was built based on four sources: i) a template used in our department for producing standardised medical certificates; ii) a random sample of medical certificates produced by the forensic department; iii) prior consensus between four healthcare professionals (two forensic practitioners, a psychologist and a forensic psychiatrist) and iv) anatomical dictionaries. The trial version of the open source software was first designed for examination of physical assault survivors. An UML-like data model dedicated to clinical legal practice was built. The data model describes the terminology for examinations of sexual assault survivors, physical assault survivors, individuals kept in police custody and undocumented migrants for age estimation. A trial version of a software relying on the data model was developed and tested by three physicians. The software allows file archiving, standardised data collection, extraction and assistance for certificate generation. It can be used for research purposes, by data exchange and analysis. Despite some current limitations of use, it is a tool which can be shared and used by other departments of forensic medicine and other specialties, improving data management and exploitation. Full integration with external sources, analytics software and use of a semantic interoperability framework are planned for the next months. Copyright © 2016 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  16. micROS: a morphable, intelligent and collective robot operating system.

    PubMed

    Yang, Xuejun; Dai, Huadong; Yi, Xiaodong; Wang, Yanzhen; Yang, Shaowu; Zhang, Bo; Wang, Zhiyuan; Zhou, Yun; Peng, Xuefeng

    2016-01-01

    Robots are developing in much the same way that personal computers did 40 years ago, and the robot operating system is the critical basis. Current robot software is mainly designed for individual robots. We present in this paper the design of micROS, a morphable, intelligent and collective robot operating system for future collective and collaborative robots. We first present the architecture of micROS, including the distributed architecture for the collective robot system as a whole and the layered architecture for every single node. We then present the design of autonomous behavior management based on the observe-orient-decide-act cognitive behavior model and the design of collective intelligence, including collective perception, collective cognition, collective game and collective dynamics. We also give the design of morphable resource management, which first categorizes robot resources into physical, information, cognitive and social domains, and then achieves morphability based on self-adaptive software technology. We finally deploy micROS on NuBot football robots and achieve significant improvement in real-time performance.

  17. Analysis of the Articulated Total Body (ATB) and Mathematical Dynamics Model (MADYMO) Software Suites for Modeling Anthropomorphic Test Devices (ATDs) in Blast Environments

    DTIC Science & Technology

    2013-05-01


  18. MAGNAMWAR: an R package for genome-wide association studies of bacterial orthologs.

    PubMed

    Sexton, Corinne E; Smith, Hayden Z; Newell, Peter D; Douglas, Angela E; Chaston, John M

    2018-06-01

    Here we report on an R package for genome-wide association studies of orthologous genes in bacteria. Before using the software, orthologs from bacterial genomes or metagenomes are defined using local or online implementations of OrthoMCL. These presence-absence patterns are statistically associated with variation in user-collected phenotypes using the Mono-Associated GNotobiotic Animals Metagenome-Wide Association R package (MAGNAMWAR). Genotype-phenotype associations can be performed with several different statistical tests based on the type and distribution of the data. MAGNAMWAR is available on CRAN. john_chaston@byu.edu.
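
    MAGNAMWAR itself is an R package, so the Python sketch below is only a language-neutral illustration of the underlying idea (associating ortholog presence/absence with a host phenotype); the strains, values, and the choice of a t-test are invented:

      from scipy import stats

      # Phenotype of animals mono-associated with each bacterial strain.
      phenotype = {"s1": 4.2, "s2": 3.9, "s3": 7.1,
                   "s4": 6.8, "s5": 4.0, "s6": 7.3}
      # OrthoMCL-style presence/absence of one ortholog group per strain.
      has_og = {"s1": 0, "s2": 0, "s3": 1, "s4": 1, "s5": 0, "s6": 1}

      with_og = [v for s, v in phenotype.items() if has_og[s]]
      without = [v for s, v in phenotype.items() if not has_og[s]]
      t, p = stats.ttest_ind(with_og, without)
      print(f"ortholog group effect: t={t:.2f}, p={p:.4f}")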

  19. A Case Study of IV&V Cost Effectiveness

    NASA Technical Reports Server (NTRS)

    Neal, Ralph D.; McCaugherty, Dan; Joshi, Tulasi; Callahan, John

    1997-01-01

    This paper looks at the Independent Verification and Validation (IV&V) of NASA's Space Shuttle Day of Launch I-Load Update (DoLILU) project. IV&V is defined. The system's development life cycle is explained. Data collection and analysis are described. DoLILU Issue Tracking Reports (DITRs) authored by IV&V personnel are analyzed to determine the effectiveness of IV&V in finding errors before the code, testing, and integration phase of the software development life cycle. The study's findings are reported along with the limitations of the study and planned future research.

  20. Wideband propagation measurement system using spread spectrum signaling and TDRS

    NASA Technical Reports Server (NTRS)

    Jenkins, Jeffrey D.; Fan, Yiping; Osborne, William P.

    1995-01-01

    In this paper, a wideband propagation measurement system, which consisted of a ground-based transmitter, a mobile receiver, and a data acquisition system, was constructed. This system has been employed in a study of the characteristics of different propagation environments, such as urban, suburban and rural areas, by using a pseudonoise spreading sequence transmitted over NASA's Tracking and Data Relay Satellite System. The hardware and software tests showed that it met overall system requirements and it was very robust during a 3-month-long outdoor data collection experiment.
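
    A pseudonoise sequence suits propagation measurement because its circular autocorrelation is nearly a spike, so correlating the received signal against the known code exposes multipath echoes at their delays. A self-contained Python toy of that principle (the echo delay and amplitude are invented):

      import numpy as np

      def lfsr_sequence(taps=(7, 6), nbits=7):
          """Generate a +/-1 maximal-length PN sequence from a simple LFSR."""
          state, out = [1] * nbits, []
          for _ in range(2 ** nbits - 1):
              out.append(1.0 if state[-1] else -1.0)
              fb = state[taps[0] - 1] ^ state[taps[1] - 1]
              state = [fb] + state[:-1]
          return np.array(out)

      pn = lfsr_sequence()
      # Received signal: direct path plus an echo 10 chips later, half strength.
      rx = pn + 0.5 * np.roll(pn, 10)
      corr = np.array([np.dot(rx, np.roll(pn, k)) for k in range(len(pn))])
      print("echo delays found at chips:", np.argsort(corr)[-2:][::-1])  # [0 10]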
