Sample records for instrument control software

  1. Instrumentation: Software-Driven Instrumentation: The New Wave.

    ERIC Educational Resources Information Center

    Salit, M. L.; Parsons, M. L.

    1985-01-01

    Software-driven instrumentation makes measurements that demand a computer as an integral part of either control, data acquisition, or data reduction. The structure of such instrumentation, hardware requirements, and software requirements are discussed. Examples of software-driven instrumentation (such as wavelength-modulated continuum source…

  2. The New Cloud Absorption Radiometer (CAR) Software: One Model for NASA Remote Sensing Virtual Instruments

    NASA Technical Reports Server (NTRS)

    Roth, Don J.; Rapchun, David A.; Jones, Hollis H.

    2001-01-01

    The Cloud Absorption Radiometer (CAR) instrument has been the most frequently used airborne instrument built in-house at NASA Goddard Space Flight Center, having flown scientific research missions on board various aircraft to many locations in the United States, the Azores, Brazil, and Kuwait since 1983. The CAR instrument is capable of measuring light scattered by clouds in fourteen spectral bands in the UV, visible, and near-infrared regions. This document describes the control, data acquisition, display, and file storage software for the new version of CAR. This software completely replaces the prior CAR Data System and Control Panel with a compact and robust virtual instrument computer interface. Additionally, the instrument is now usable for the first time for taking data in an off-aircraft mode. The new instrument is controlled via a software interface developed in LabVIEW v5.1.1 that uses (1) serial port writes to send commands to the controller module of the instrument, and (2) serial port reads to acquire data from the controller module. Step-by-step operational procedures are provided in this document. A suite of other software programs has been developed to complement the actual CAR virtual instrument. These programs include: (1) a simulator mode that allows pretesting of new features that might be added in the future, as well as demonstrations to CAR customers and development at times when the instrument/hardware is off-location, and (2) a post-experiment data viewer that can be used to view all segments of individual data cycles and to locate positions where 'start' and 'stop' byte sequences were incorrectly formulated by the instrument controller. The CAR software described here is expected to be the basis for CAR operation for many missions and many years to come.
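
    The CAR interface described above is LabVIEW-based; as a language-neutral illustration of the same write-command/read-data pattern over a serial port, here is a minimal sketch in Python using the pyserial package. The port name, baud rate, and command string are placeholders, not the actual CAR command set.

    ```python
    import serial  # pyserial

    def query_controller(port="COM1", command=b"STATUS?\r\n", reply_bytes=64):
        """Write one command to the instrument controller and return the raw reply."""
        with serial.Serial(port, baudrate=9600, timeout=2.0) as link:
            link.write(command)             # serial port write: send the command
            return link.read(reply_bytes)   # serial port read: acquire the data

    if __name__ == "__main__":
        print(query_controller())
    ```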

  3. Software Framework for Controlling Unsupervised Scientific Instruments.

    PubMed

    Schmid, Benjamin; Jahr, Wiebke; Weber, Michael; Huisken, Jan

    2016-01-01

    Science outreach and communication are gaining more and more importance for conveying the meaning of today's research to the general public. Public exhibitions of scientific instruments can provide hands-on experience with technical advances and their applications in the life sciences. The software of such devices, however, is oftentimes not appropriate for this purpose. In this study, we describe a software framework and the necessary computer configuration that is well suited for exposing a complex self-built and software-controlled instrument such as a microscope to laymen under limited supervision, e.g. in museums or schools. We identify several aspects that must be met by such software, and we describe a design that can simultaneously be used to control either (i) a fully functional instrument in a robust and fail-safe manner, (ii) an instrument that has low-cost or only partially working hardware attached for illustration purposes or (iii) a completely virtual instrument without hardware attached. We describe how to assess the educational success of such a device, how to monitor its operation and how to facilitate its maintenance. The introduced concepts are illustrated using our software to control eduSPIM, a fluorescent light sheet microscope that we are currently exhibiting in a technical museum.
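
    The three operating modes listed above (fully functional hardware, partial hardware for illustration, and a purely virtual instrument) suggest a back-end abstraction behind a single control layer. Below is a hedged sketch of that idea in Python; the class and method names are illustrative and not taken from the eduSPIM software.

    ```python
    from abc import ABC, abstractmethod

    class StageBackend(ABC):
        """Common interface the control layer talks to, whatever hardware is attached."""
        @abstractmethod
        def move_to(self, position_um: float) -> None: ...

    class RealStage(StageBackend):
        def move_to(self, position_um: float) -> None:
            # would command the physical motor controller here
            print(f"hardware move to {position_um} um")

    class VirtualStage(StageBackend):
        """Purely virtual instrument: state is simulated, no hardware required."""
        def __init__(self) -> None:
            self.position_um = 0.0
        def move_to(self, position_um: float) -> None:
            self.position_um = position_um

    def make_stage(mode: str) -> StageBackend:
        # fail-safe choice: fall back to the simulator unless real hardware is requested
        return RealStage() if mode == "real" else VirtualStage()
    ```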

  4. AMBER instrument control software

    NASA Astrophysics Data System (ADS)

    Le Coarer, Etienne P.; Zins, Gerard; Gluck, Laurence; Duvert, Gilles; Driebe, Thomas; Ohnaka, Keiichi; Heininger, Matthias; Connot, Claus; Behrend, Jan; Dugue, Michel; Clausse, Jean Michel; Millour, Florentin

    2004-09-01

    AMBER (Astronomical Multiple BEam Recombiner) is a three-aperture interferometric recombiner operating between 1 and 2.5 µm for the Very Large Telescope Interferometer (VLTI). The control software of the instrument, based on the VLT Common Software, has been written to comply with specific features of the AMBER hardware, such as the infrared detector readout modes or piezo stage drivers, as well as with the very specific operation modes of an interferometric instrument. In this respect, the AMBER control software was designed to ensure that all operations, from the preparation of the observations to the control/command of the instrument during the observations, would be kept as simple as possible for users and operators, opening the use of an interferometric instrument to the largest community of astronomers. Particular attention was given to internal checks and calibration procedures, both to evaluate data quality in real time and to improve the success of long-term UV-plane coverage observations.

  5. Software structure for Vega/Chara instrument

    NASA Astrophysics Data System (ADS)

    Clausse, J.-M.

    2008-07-01

    VEGA (Visible spEctroGraph and polArimeter) is one of the focal instruments of the CHARA array at Mount Wilson near Los Angeles. Its control system is based on techniques developed for the GI2T interferometer (Grand Interferometre a 2 Telescopes) and for the SIRIUS fibered hypertelescope testbed at OCA (Observatoire de la Cote d'Azur). This article describes the software and electronics architecture of the instrument. It is based on a local network architecture and also uses Virtual Private Network connections. The server part is based on Windows XP (VC++). The control software runs on Linux (C, GTK). For the control of the science detector and the fringe tracking systems, distributed APIs use real-time techniques. The control software gathers all the necessary information about the instrument and allows automatic management of the instrument by means of an original task scheduler. This architecture is intended to allow the instrument to be driven from remote sites, such as our institute in the south of France.

  6. Achieving design reuse: a case study

    NASA Astrophysics Data System (ADS)

    Young, Peter J.; Nielsen, Jon J.; Roberts, William H.; Wilson, Greg M.

    2008-08-01

    The RSAA CICADA data acquisition and control software package uses an object-oriented approach to model astronomical instrumentation and a layered architecture for implementation. Emphasis has been placed on building reusable C++ class libraries and on the use of attribute/value tables for dynamic configuration. This paper details how the approach has been successfully used in the construction of the instrument control software for the Gemini NIFS and GSAOI instruments. The software is again being used for the new RSAA SkyMapper and WiFeS instruments.

  7. LC-IM-TOF Instrument Control & Data Visualization Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2011-05-12

    The Liquid Chromatography-Ion Mobility-Time of Flight (LC-IM-TOF) Instrument Control and Data Visualization software is designed to control instrument voltages for the ion mobility drift tube. It collects and stores information from the Agilent TOF instrument and analyzes/displays the acquired ion intensity information. The software interface can be split into 3 categories -- Instrument Settings/Controls, Data Acquisition, and Viewer. The Instrument Settings/Controls prepares the instrument for Data Acquisition. The Viewer contains common objects that are used by Instrument Settings/Controls and Data Acquisition. Intensity information is collected in 1 nanosecond bins and separated by TOF pulses called scans. A collection of scans is stored side by side, making up an accumulation. In order for the computer to keep up with the stream of data, 30-50 accumulations are commonly summed into a single frame. A collection of frames makes up an experiment. The Viewer software then takes the experiment and presents the data in several possible ways; each frame can be viewed in TOF bins or m/z (mass-to-charge ratio). The experiment can be viewed frame by frame, by merging several frames, or by viewing the peak chromatogram. The user can zoom into the data, export data, and/or animate frames. Additional features include calibration of the data and post-processing of multiplexed data.
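
    The bin/scan/accumulation/frame hierarchy described above can be illustrated with a short sketch. The array sizes and the synthetic scan source below are assumptions for illustration, not the instrument's actual parameters.

    ```python
    import numpy as np

    N_BINS = 1000          # TOF intensity bins (1 ns each) per scan; size is illustrative
    SCANS_PER_ACCUM = 100  # scans stored side by side make one accumulation
    ACCUMS_PER_FRAME = 40  # 30-50 accumulations are commonly summed into one frame

    def build_frame(read_scan):
        """Sum ACCUMS_PER_FRAME accumulations of SCANS_PER_ACCUM scans into one frame."""
        frame = np.zeros((SCANS_PER_ACCUM, N_BINS), dtype=np.int64)
        for _ in range(ACCUMS_PER_FRAME):
            accumulation = np.stack([read_scan() for _ in range(SCANS_PER_ACCUM)])
            frame += accumulation
        return frame

    # usage with a synthetic scan source; an experiment is a collection of such frames
    frame = build_frame(lambda: np.random.poisson(1, N_BINS))
    ```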

  8. The nuMOIRCS project: detector upgrade overview and early commissioning results

    NASA Astrophysics Data System (ADS)

    Walawender, Josh; Wung, Matthew; Fabricius, Maximilian; Tanaka, Ichi; Arimoto, Nobuo; Cook, David; Elms, Brian; Hashiba, Yasuhito; Hu, Yen-Sang; Iwata, Ikuru; Nishimura, Tetsuo; Omata, Koji; Takato, Naruhisa; Wang, Shiang-Yu; Weber, Mark

    2016-08-01

    In 2014 and 2015 the Multi-Object InfraRed Camera and Spectrograph (MOIRCS) instrument at the Subaru Telescope on Maunakea underwent a significant modernization and upgrade project. We upgraded the two Hawaii2 detectors to Hawaii2-RG models, modernized the cryogenic temperature control system, and rewrote much of the instrument control software. The detector upgrade replaced the Hawaii2 detectors, which used the Tohoku University Focal Plane Array Controller (TUFPAC) electronics, with Hawaii2-RG detectors using a SIDECAR ASIC (a fully integrated FPA controller system-on-a-chip) and a SAM interface card. We achieved an improvement in read noise by a factor of about 2 with this detector and electronics upgrade. The cryogenic temperature control upgrade focused on modernizing the components and making the procedures for warm-up and cool-down of the instrument safer. We have moved PID control loops out of the instrument control software and into Lakeshore model 336 cryogenic temperature controllers and have added interlocks on the warming systems to prevent overheating of the instrument. Much of the instrument control software has also been rewritten. This was necessitated by the different interface to the detector electronics (ASIC and SAM vs. TUFPAC) and by the desire to modernize the interface to the telescope control software, which has been updated to Subaru's "Gen2" system since the time of MOIRCS construction and first light. The new software is also designed to increase reliability of operation of the instrument, decrease overheads, and be easier for night-time operators and support astronomers to use.

  9. Instrument control software development process for the multi-star AO system ARGOS

    NASA Astrophysics Data System (ADS)

    Kulas, M.; Barl, L.; Borelli, J. L.; Gässler, W.; Rabien, S.

    2012-09-01

    The ARGOS project (Advanced Rayleigh guided Ground layer adaptive Optics System) will upgrade the Large Binocular Telescope (LBT) with an AO System consisting of six Rayleigh laser guide stars. This adaptive optics system integrates several control loops and many different components like lasers, calibration swing arms and slope computers that are dispersed throughout the telescope. The purpose of the instrument control software (ICS) is running this AO system and providing convenient client interfaces to the instruments and the control loops. The challenges for the ARGOS ICS are the development of a distributed and safety-critical software system with no defects in a short time, the creation of huge and complex software programs with a maintainable code base, the delivery of software components with the desired functionality and the support of geographically distributed project partners. To tackle these difficult tasks, the ARGOS software engineers reuse existing software like the novel middleware from LINC-NIRVANA, an instrument for the LBT, provide many tests at different functional levels like unit tests and regression tests, agree about code and architecture style and deliver software incrementally while closely collaborating with the project partners. Many ARGOS ICS components are already successfully in use in the laboratories for testing ARGOS control loops.

  10. CICADA, CCD and Instrument Control Software

    NASA Astrophysics Data System (ADS)

    Young, Peter J.; Brooks, Mick; Meatheringham, Stephen J.; Roberts, William H.

    Computerised Instrument Control and Data Acquisition (CICADA) is a software system for control of telescope instruments in a distributed computing environment. It is designed using object-oriented techniques and built with standard computing tools such as RPC, SysV IPC, Posix threads, Tcl, and GUI builders. The system is readily extensible to new instruments and currently supports the Astromed 3200 CCD controller and MSSSO's new tip-tilt system. Work is currently underway to provide support for the SDSU CCD controller and MSSSO's Double Beam Spectrograph. A core set of processes handles common communication and control tasks, while specific instruments are "bolted" on using C++ inheritance techniques.
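
    CICADA's "bolt-on" approach uses C++ inheritance; the sketch below shows the same pattern in Python for brevity. The class names are illustrative only, not taken from the CICADA code base.

    ```python
    class ControllerCore:
        """Core process behaviour shared by all instruments: communication and control."""
        def connect(self) -> None:
            print("common start-up, communication and control handling")

        def expose(self, seconds: float) -> None:
            raise NotImplementedError("instrument-specific behaviour goes in a subclass")

    class Astromed3200Controller(ControllerCore):   # hypothetical subclass name
        """An instrument 'bolted on' by overriding only its specific operations."""
        def expose(self, seconds: float) -> None:
            print(f"Astromed 3200 CCD exposure for {seconds} s")
    ```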

  11. Instrument control software requirement specification for Extremely Large Telescopes

    NASA Astrophysics Data System (ADS)

    Young, Peter J.; Kiekebusch, Mario J.; Chiozzi, Gianluca

    2010-07-01

    Engineers in several observatories are now designing the next generation of optical telescopes, the Extremely Large Telescopes (ELT). These are very complex machines that will host sophisticated astronomical instruments to be used for a wide range of scientific studies. In order to carry out scientific observations, a software infrastructure is required to orchestrate the control of the multiple subsystems and functions. This paper will focus on describing the considerations, strategies and main issues related to the definition and analysis of the software requirements for the ELT's Instrument Control System using modern development processes and modelling tools like SysML.

  12. A software control system for the ACTS high-burst-rate link evaluation terminal

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.; Daugherty, Elaine S.

    1991-01-01

    Control and performance monitoring of NASA's High Burst Rate Link Evaluation Terminal (HBR-LET) is accomplished by using several software control modules. Different software modules are responsible for controlling remote radio frequency (RF) instrumentation, supporting communication between a host and a remote computer, controlling the output power of the Link Evaluation Terminal and data display. Remote commanding of microwave RF instrumentation and the LET digital ground terminal allows computer control of various experiments, including bit error rate measurements. Computer communication allows system operators to transmit and receive from the Advanced Communications Technology Satellite (ACTS). Finally, the output power control software dynamically controls the uplink output power of the terminal to compensate for signal loss due to rain fade. Included is a discussion of each software module and its applications.

  13. Using XML and Java for Astronomical Instrumentation Control

    NASA Technical Reports Server (NTRS)

    Ames, Troy; Koons, Lisa; Sall, Ken; Warsaw, Craig

    2000-01-01

    Traditionally, instrument command and control systems have been highly specialized, consisting mostly of custom code that is difficult to develop, maintain, and extend. Such solutions are initially very costly and are inflexible to subsequent engineering change requests, increasing software maintenance costs. Instrument description is too tightly coupled with details of implementation. NASA Goddard Space Flight Center is developing a general and highly extensible framework that applies to any kind of instrument that can be controlled by a computer. The software architecture combines the platform independent processing capabilities of Java with the power of the Extensible Markup Language (XML), a human readable and machine understandable way to describe structured data. A key aspect of the object-oriented architecture is software that is driven by an instrument description, written using the Instrument Markup Language (IML). IML is used to describe graphical user interfaces to control and monitor the instrument, command sets and command formats, data streams, and communication mechanisms. Although the current effort is targeted for the High-resolution Airborne Wideband Camera, a first-light instrument of the Stratospheric Observatory for Infrared Astronomy, the framework is designed to be generic and extensible so that it can be applied to any instrument.
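
    A toy illustration of the description-driven idea: an IML-like XML fragment defines a command and its format, and generic code builds the outgoing command string from it. The element and attribute names below are invented for illustration; they are not the real IML schema.

    ```python
    import xml.etree.ElementTree as ET

    IML_LIKE_SNIPPET = """
    <instrument name="ExampleCamera">
      <command name="SET_FILTER" format="FILTER {position:d}">
        <argument name="position" type="int" min="1" max="6"/>
      </command>
    </instrument>
    """

    root = ET.fromstring(IML_LIKE_SNIPPET)
    command = root.find("command")
    message = command.get("format").format(position=3)   # -> "FILTER 3"
    print(f"{root.get('name')} command string: {message}")
    ```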

  14. Control software and electronics architecture design in the framework of the E-ELT instrumentation

    NASA Astrophysics Data System (ADS)

    Di Marcantonio, P.; Coretti, I.; Cirami, R.; Comari, M.; Santin, P.; Pucillo, M.

    2010-07-01

    In recent years the European Southern Observatory (ESO), in collaboration with other European astronomical institutes, has started several feasibility studies for the E-ELT (European Extremely Large Telescope) instrumentation and post-focal adaptive optics. The goal is to create a flexible suite of instruments to deal with the wide variety of scientific questions astronomers would like to see solved in the coming decades. In this framework the INAF-Astronomical Observatory of Trieste (INAF-AOTs) is currently responsible for carrying out the analysis and the preliminary study of the architecture of the electronics and control software of three instruments: CODEX (control software and electronics) and OPTIMOS-EVE/OPTIMOS-DIORAMAS (control software). To cope with the increased complexity and new requirements for stability, precision, real-time latency and communications among sub-systems imposed by these instruments, new solutions have been investigated by our group. In this paper we present the proposed software and electronics architecture based on a distributed common framework centered on the Component/Container model that uses OPC Unified Architecture as a standard layer to communicate with COTS components from three different vendors. We describe three working prototypes that have been set up in our laboratory and discuss their performance, integration complexity and ease of deployment.

  15. A generic testbed for the design of plasma spectrometer control software with application to the THOR-CSW solar wind instrument

    NASA Astrophysics Data System (ADS)

    De Keyser, Johan; Lavraud, Benoit; Neefs, Eddy; Berkenbosch, Sophie; Beeckman, Bram; Maggiolo, Romain; Gamby, Emmanuel; Fedorov, Andrei; Baruah, Rituparna; Wong, King-Wah; Amoros, Carine; Mathon, Romain; Génot, Vincent; Marcucci, Federica; Brienza, Daniele

    2017-04-01

    Modern plasma spectrometers require intelligent software that is able to exploit their capabilities to the fullest. While the low-level control of the instrument and basic tasks such as performing the basic measurement, temperature control, and production of housekeeping data are to be done by software that is executed on an FPGA and/or processor inside the instrument, higher level tasks such as control of measurement sequences, on-board moment calculation, beam tracking decisions, and data compression, may be performed by the instrument or in the payload data processing unit. Such design decisions, as well as an assessment of the workload on the different processing components, require early prototyping. We have developed a generic simulation testbed for the design of plasma spectrometer control software that allows an early evaluation of the level of resources that is needed at each level. Early prototyping can pinpoint bottlenecks in the design allowing timely remediation. We have applied this tool to the THOR Cold Solar Wind (CSW) plasma spectrometer. Some examples illustrating the usefulness of the tool are given.

  16. Teaching computer interfacing with virtual instruments in an object-oriented language.

    PubMed Central

    Gulotta, M

    1995-01-01

    LabVIEW is a graphic object-oriented computer language developed to facilitate hardware/software communication. LabVIEW is a complete computer language that can be used like Basic, FORTRAN, or C. In LabVIEW one creates virtual instruments that aesthetically look like real instruments but are controlled by sophisticated computer programs. There are several levels of data acquisition VIs that make it easy to control data flow, and many signal processing and analysis algorithms come with the software as premade VIs. In the classroom, the similarity between virtual and real instruments helps students understand how information is passed between the computer and attached instruments. The software may be used in the absence of hardware so that students can work at home as well as in the classroom. This article demonstrates how LabVIEW can be used to control data flow between computers and instruments, points out important features for signal processing and analysis, and shows how virtual instruments may be used in place of physical instrumentation. Applications of LabVIEW to the teaching laboratory are also discussed, and a plausible course outline is given. PMID:8580361

  17. Teaching computer interfacing with virtual instruments in an object-oriented language.

    PubMed

    Gulotta, M

    1995-11-01

    LabVIEW is a graphic object-oriented computer language developed to facilitate hardware/software communication. LabVIEW is a complete computer language that can be used like Basic, FORTRAN, or C. In LabVIEW one creates virtual instruments that aesthetically look like real instruments but are controlled by sophisticated computer programs. There are several levels of data acquisition VIs that make it easy to control data flow, and many signal processing and analysis algorithms come with the software as premade VIs. In the classroom, the similarity between virtual and real instruments helps students understand how information is passed between the computer and attached instruments. The software may be used in the absence of hardware so that students can work at home as well as in the classroom. This article demonstrates how LabVIEW can be used to control data flow between computers and instruments, points out important features for signal processing and analysis, and shows how virtual instruments may be used in place of physical instrumentation. Applications of LabVIEW to the teaching laboratory are also discussed, and a plausible course outline is given.

  18. Software requirements flow-down and preliminary software design for the G-CLEF spectrograph

    NASA Astrophysics Data System (ADS)

    Evans, Ian N.; Budynkiewicz, Jamie A.; DePonte Evans, Janet; Miller, Joseph B.; Onyuksel, Cem; Paxson, Charles; Plummer, David A.

    2016-08-01

    The Giant Magellan Telescope (GMT)-Consortium Large Earth Finder (G-CLEF) is a fiber-fed, precision radial velocity (PRV) optical echelle spectrograph that will be the first light instrument on the GMT. The G-CLEF instrument device control subsystem (IDCS) provides software control of the instrument hardware, including the active feedback loops that are required to meet the G-CLEF PRV stability requirements. The IDCS is also tasked with providing operational support packages that include data reduction pipelines and proposal preparation tools. A formal, but ultimately pragmatic approach is being used to establish a complete and correct set of requirements for both the G-CLEF device control and operational support packages. The device control packages must integrate tightly with the state-machine driven software and controls reference architecture designed by the GMT Organization. A model-based systems engineering methodology is being used to develop a preliminary design that meets these requirements. Through this process we have identified some lessons that have general applicability to the development of software for ground-based instrumentation. For example, tasking an individual with overall responsibility for science/software/hardware integration is a key step to ensuring effective integration between these elements. An operational concept document that includes detailed routine and non-routine operational sequences should be prepared in parallel with the hardware design process to tie together these elements and identify any gaps. Appropriate time-phasing of the hardware and software design phases is important, but revisions to driving requirements that impact software requirements and preliminary design are inevitable. Such revisions must be carefully managed to ensure efficient use of resources.

  19. Advanced communications technology satellite high burst rate link evaluation terminal experiment control and monitor software user's guide, version 1.0

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.

    1992-01-01

    The Experiment Control and Monitor (EC&M) software was developed at NASA Lewis Research Center to support the Advanced Communications Technology Satellite (ACTS) High Burst Rate Link Evaluation Terminal (HBR-LET). The HBR-LET is an experimenter's terminal to communicate with the ACTS for various investigations by government agencies, universities, and industry. The EC&M software is one segment of the Control and Performance Monitoring (C&PM) software system of the HBR-LET. The EC&M software allows users to initialize, control, and monitor the instrumentation within the HBR-LET using a predefined sequence of commands. Besides instrument control, the C&PM software system is also responsible for computer communication between the HBR-LET and the ACTS NASA Ground Station and for uplink power control of the HBR-LET to demonstrate power augmentation during rain fade events. The EC&M Software User's Guide, Version 1.0 (NASA-CR-189160) outlines the commands required to install and operate the EC&M software. Input and output file descriptions, operator commands, and error recovery procedures are discussed in the document.

  20. Note: Tormenta: An open source Python-powered control software for camera based optical microscopy.

    PubMed

    Barabas, Federico M; Masullo, Luciano A; Stefani, Fernando D

    2016-12-01

    Until recently, PC control and synchronization of scientific instruments was only possible through closed-source expensive frameworks like National Instruments' LabVIEW. Nowadays, efficient cost-free alternatives are available in the context of a continuously growing community of open-source software developers. Here, we report on Tormenta, a modular open-source software for the control of camera-based optical microscopes. Tormenta is built on Python, works on multiple operating systems, and includes some key features for fluorescence nanoscopy based on single molecule localization.

  1. Note: Tormenta: An open source Python-powered control software for camera based optical microscopy

    NASA Astrophysics Data System (ADS)

    Barabas, Federico M.; Masullo, Luciano A.; Stefani, Fernando D.

    2016-12-01

    Until recently, PC control and synchronization of scientific instruments was only possible through closed-source expensive frameworks like National Instruments' LabVIEW. Nowadays, efficient cost-free alternatives are available in the context of a continuously growing community of open-source software developers. Here, we report on Tormenta, a modular open-source software for the control of camera-based optical microscopes. Tormenta is built on Python, works on multiple operating systems, and includes some key features for fluorescence nanoscopy based on single molecule localization.

  2. Generic control software connecting astronomical instruments to the reflective memory data recording system of VLTI - bossvlti

    NASA Astrophysics Data System (ADS)

    Pozna, E.; Ramirez, A.; Mérand, A.; Mueller, A.; Abuter, R.; Frahm, R.; Morel, S.; Schmid, C.; Duc, T. Phan; Delplancke-Ströbele, F.

    2014-07-01

    The quality of data obtained by VLTI instruments may be refined by analyzing the continuous data supplied by the Reflective Memory Network (RMN). Based on five years of experience providing VLTI instruments (PACMAN, AMBER, MIDI) with RMN data, the procedure has been generalized to make synchronization with observations trouble-free. The present software interface not only saves months of effort for each instrument but also provides the benefits of software frameworks. Recent applications (GRAVITY, MATISSE) supply feedback for the software to evolve. The paper highlights the way common features have been identified so that reusable code can be offered in due course.

  3. Virtual Instrument Simulator for CERES

    NASA Technical Reports Server (NTRS)

    Chapman, John J.

    1997-01-01

    A benchtop virtual instrument simulator for CERES (Clouds and the Earth's Radiant Energy System) has been built at NASA Langley Research Center in Hampton, VA. The CERES instruments will fly on several Earth-orbiting platforms, notably NASDA's Tropical Rainfall Measuring Mission (TRMM) and NASA's Earth Observing System (EOS) satellites. CERES measures top-of-the-atmosphere radiative fluxes using microprocessor-controlled scanning radiometers. The CERES Virtual Instrument Simulator consists of electronic circuitry identical to the flight unit's twin microprocessors and telemetry interface to the supporting spacecraft electronics, and two personal computers (PC) connected to the I/O ports that control the azimuth and elevation gimbals. Software consists of the unmodified TRW-developed Flight Code and Ground Support Software, which serves as the instrument monitor, and NASA/TRW-developed engineering models of the scanners. The CERES Instrument Simulator will serve as a testbed for testing custom instrument commands intended to solve in-flight anomalies of the instruments which could arise during the CERES mission. One of the supporting computers supports the telemetry display which monitors the simulator microprocessors during the development and testing of custom instrument commands. The CERES engineering development software models have been modified to provide a virtual instrument running on a second supporting computer linked in real time to the instrument flight microprocessor control ports. The CERES Instrument Simulator will be used to verify memory uploads by the CERES Flight Operations Team at NASA. Plots of the virtual scanner models match the actual instrument scan plots. A high-speed logic analyzer has been used to track the performance of the flight microprocessor. The concept of using an identical but non-flight-qualified microprocessor and electronics ensemble linked to a virtual instrument with identical system software affords a relatively inexpensive simulation system capable of high fidelity.

  4. The South African Astronomical Observatory instrumentation software architecture and the SHOC instruments

    NASA Astrophysics Data System (ADS)

    van Gend, Carel; Lombaard, Briehan; Sickafoose, Amanda; Whittal, Hamish

    2016-07-01

    Until recently, software for instruments on the smaller telescopes at the South African Astronomical Observatory (SAAO) has not been designed for remote accessibility and frequently has not been developed using modern software best practice. We describe a software architecture we have implemented for use with new and upgraded instruments at the SAAO. The architecture was designed to allow for multiple components and to be fast, reliable, and remotely operable, to support different user interfaces, to employ as much non-proprietary software as possible, and to take future-proofing into consideration. Individual component drivers exist as standalone processes, communicating over a network. A controller layer coordinates the various components and allows a variety of user interfaces to be used. The Sutherland High-speed Optical Cameras (SHOC) instruments incorporate an Andor electron-multiplying CCD camera, a GPS unit for accurate timing and a pair of filter wheels. We have applied the new architecture to the SHOC instruments, with the camera driver developed using Andor's software development kit. We have used this to develop an innovative web-based user interface to the instrument.
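
    A hedged sketch of the architectural pattern described above: each component driver runs as a standalone process with a small network interface, and a controller layer coordinates them. The one-request-per-line JSON protocol shown is an assumption for illustration, not the actual SAAO protocol.

    ```python
    import json, socket, socketserver, threading

    class DriverHandler(socketserver.StreamRequestHandler):
        """A stand-alone component driver: accepts one request, returns one reply."""
        def handle(self):
            request = json.loads(self.rfile.readline())
            # a real driver would act on its hardware component here
            reply = {"status": "ok", "echo": request}
            self.wfile.write((json.dumps(reply) + "\n").encode())

    def controller_call(port, message):
        """Controller layer: send one request to a driver process and read its reply."""
        with socket.create_connection(("localhost", port)) as conn:
            conn.sendall((json.dumps(message) + "\n").encode())
            return json.loads(conn.makefile().readline())

    if __name__ == "__main__":
        server = socketserver.TCPServer(("localhost", 0), DriverHandler)
        threading.Thread(target=server.serve_forever, daemon=True).start()
        print(controller_call(server.server_address[1], {"cmd": "get_status"}))
        server.shutdown()
    ```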

  5. Okayama optical polarimetry and spectroscopy system (OOPS) II. Network-transparent control software.

    NASA Astrophysics Data System (ADS)

    Sasaki, T.; Kurakami, T.; Shimizu, Y.; Yutani, M.

    The control system of the OOPS (Okayama Optical Polarimetry and Spectroscopy system) is designed to integrate several instruments whose controllers are distributed over a network: the OOPS instrument, a CCD camera and data acquisition unit, the 91 cm telescope, an autoguider, a weather monitor, and the image display tool SAOimage. With the help of message-based communication, the control processes cooperate with related processes to perform an astronomical observation under the supervisory control of a scheduler process. A logger process collects status data from all the instruments and distributes them to related processes upon request. The software structure of each process is described.

  6. Design and development of control unit and software for the ADFOSC instrument of the 3.6 m Devasthal optical telescope

    NASA Astrophysics Data System (ADS)

    Kumar, T. S.

    2016-08-01

    In this paper, we describe the details of the control unit and GUI software for positioning two filter wheels, a slit wheel and a grism wheel in the ADFOSC instrument. This is a first-generation instrument being built for the 3.6 m Devasthal optical telescope. The control hardware consists of five electronic boards based on low-cost 8-bit PIC microcontrollers distributed over an I2C bus. The four wheels are controlled by four identical boards configured in I2C slave mode, while the fifth board acts as an I2C master for sending commands to and receiving status from the slave boards. The master also communicates with the interfacing PC over the TCP/IP protocol using simple ASCII commands. Stepper motors, along with suitable amplifiers, are employed to move the wheels. Homing after power-on is achieved using Hall effect sensors. By implementing distributed control units with identical designs, modularity is achieved, enabling easier maintenance and upgrades. GUI-based software for commanding the instrument was developed in Microsoft Visual C++. For operating the system during observations the user selects the normal mode, while an engineering mode offers additional flexibility and low-level control during maintenance and testing. A detailed time-stamped log of commands, status and errors is continuously generated. Both the control unit and the software have been successfully tested and integrated with the ADFOSC instrument.
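
    A small sketch of the PC side of the interface described above: plain ASCII commands sent to the master control board over TCP/IP. The host, port, and command vocabulary are assumptions for illustration, not the ADFOSC command set.

    ```python
    import socket

    def send_command(command: str, host: str = "192.168.1.50", port: int = 5000) -> str:
        """Send one ASCII command to the master board and return its status line."""
        with socket.create_connection((host, port), timeout=5) as link:
            link.sendall((command + "\r\n").encode("ascii"))
            return link.recv(256).decode("ascii").strip()

    # e.g. position filter wheel 1 at slot 3, then query the overall status
    # send_command("MOVE FW1 3"); send_command("STATUS")
    ```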

  7. The 4MOST facility control software

    NASA Astrophysics Data System (ADS)

    Pramskiy, Alexander; Mandel, Holger; Rothmaier, Florian; Stilz, Ingo; Winkler, Roland; Hahn, Thomas

    2016-07-01

    The 4-metre Multi-Object Spectroscopic Telescope (4MOST) facility comprises one high-resolution (R ~ 18,000) and two low-resolution (R ~ 5,000) spectrographs covering the wavelength range between 390 and 950 nm. The spectrographs will be installed on the ESO VISTA telescope and will be fed by approximately 2400 fibres. The instrument is capable of simultaneously obtaining spectra of about 2400 objects distributed over a hexagonal field of view of four square degrees. This paper aims at giving an overview of the control software design, which is based on the standard ESO VLT software architecture and customised to fit the needs of the 4MOST instrument. In particular, the facility control software is intended to arrange the precise positioning of the fibres, to schedule and observe many surveys in parallel, and to combine the output from the three spectrographs. Moreover, 4MOST's software will include user-friendly graphical user interfaces that enable users to interact with the facility control system and to monitor all data-taking and calibration tasks of the instrument. A secondary guiding system will be implemented to correct for any fibre flexure and thus to improve 4MOST's guiding performance. The large number of fibres requires a custom design of the data exchange to avoid performance issues. The observation sequences are designed to use the spectrographs in parallel, with synchronous points for data exchange between subsystems. In order to control hardware devices, Programmable Logic Controller (PLC) components will be used, the new standard for future instruments at ESO.

  8. PC based PLCs and ethernet based fieldbus: the new standard platform for future VLT instrument control

    NASA Astrophysics Data System (ADS)

    Kiekebusch, Mario J.; Lucuix, Christian; Erm, Toomas M.; Chiozzi, Gianluca; Zamparelli, Michele; Kern, Lothar; Brast, Roland; Pirani, Werther; Reiss, Roland; Popovic, Dan; Knudstrup, Jens; Duchateau, Michel; Sandrock, Stefan; Di Lieto, Nicola

    2014-07-01

    ESO is currently in the final phase of the standardization process for PC-based Programmable Logic Controllers (PLCs) as the new platform for the development of control systems for future VLT/VLTI instruments. The standard solution used until now consists of a Local Control Unit (LCU), a VME-based system having a CPU and commercial and proprietary boards. This system includes several layers of software and many thousands of lines of code developed and maintained in house. LCUs have been used for several years as the interface to control instrument functions but now are being replaced by commercial off-the-shelf (COTS) systems based on BECKHOFF Embedded PCs and the EtherCAT fieldbus. ESO is working on the completion of the software framework that enables a seamless integration into the VLT control system in order to be ready to support upcoming instruments like ESPRESSO and ERIS, which will be the first fully VLT-compliant instruments using the new standard. The technology evaluation and standardization process has been a long and combined effort of various engineering disciplines like electronics, control and software, working together to define a solution that meets the requirements and minimizes the impact on the observatory operations and maintenance. This paper presents the challenges of the standardization process and the steps involved in such a change. It provides a technical overview of how industrial standards like EtherCAT, OPC-UA, PLCOpen MC and TwinCAT can be used to replace LCU features in various areas like software engineering and programming languages, motion control, time synchronization and astronomical tracking.
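
    As an illustration of the OPC-UA layer mentioned above, the sketch below reads one PLC variable from Python using the python-opcua package. The endpoint URL and node identifier are placeholders, not ESO's actual configuration.

    ```python
    from opcua import Client  # python-opcua package

    client = Client("opc.tcp://plc.example.org:4840")   # placeholder PLC endpoint
    client.connect()
    try:
        node = client.get_node("ns=4;s=MAIN.fbMotor.lrActualPosition")  # placeholder node id
        print("motor position:", node.get_value())
    finally:
        client.disconnect()
    ```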

  9. Advanced Communications Technology Satellite high burst rate link evaluation terminal experiment control and monitor software maintenance manual, version 1.0

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.

    1992-01-01

    The Experiment Control and Monitor (EC&M) software was developed at NASA Lewis Research Center to support the Advanced Communications Technology Satellite (ACTS) High Burst Rate Link Evaluation Terminal (HBR-LET). The HBR-LET is an experimenter's terminal to communicate with the ACTS for various investigations by government agencies, universities, and industry. The EC&M software is one segment of the Control and Performance Monitoring (C&PM) software system of the HBR-LET. The EC&M software allows users to initialize, control, and monitor the instrumentation within the HBR-LET using a predefined sequence of commands. Besides instrument control, the C&PM software system is also responsible for computer communication between the HBR-LET and the ACTS NASA Ground Station and for uplink power control of the HBR-LET to demonstrate power augmentation during rain fade events. The EC&M Software User's Guide, Version 1.0 (NASA-CR-189160) outlines the commands required to install and operate the EC&M software. Input and output file descriptions, operator commands, and error recovery procedures are discussed in the document. The EC&M Software Maintenance Manual, Version 1.0 (NASA-CR-189161) is a programmer's guide that describes current implementation of the EC&M software from a technical perspective. An overview of the EC&M software, computer algorithms, format representation, and computer hardware configuration are included in the manual.

  10. Automated control and data acquisition for a tunable diode laser heterodyne spectrometer

    NASA Technical Reports Server (NTRS)

    Shull, T. S.; Rinsland, P. L.

    1983-01-01

    This paper describes the hardware and software design, development, and implementation of the control and data electronics of a laser heterodyne spectrometer instrument being built at NASA Langley Research Center for a technology demonstration. Functional partitioning, applied at all levels of hardware and software, has been found to provide expedient design, development, and testing of the instrument. The instrument is composed of distributed microprocessor-based units. A master/slave protocol is presented which can be simulated by a terminal for unit checkout. All but one of the units are implemented using a set of core boards, plus unique boards where necessary. This design has led to reduced hardware development, reduced parts inventory, and replication of software modules, while providing the flexibility needed for a development instrument. The development tools and documentation guidelines are discussed.

  11. The U.S./IAEA Workshop on Software Sustainability for Safeguards Instrumentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pepper, S. E.; Worrall, L.

    2014-08-08

    The U.S. National Nuclear Security Administration's Next Generation Safeguards Initiative, the U.S. Department of State, and the International Atomic Energy Agency (IAEA) organized a workshop on the subject of "Software Sustainability for Safeguards Instrumentation." The workshop was held at the Vienna International Centre in Vienna, Austria, May 6-8, 2014. The workshop participants included software and hardware experts from national laboratories, industry, government, and IAEA member states who were specially selected by the workshop organizers based on their experience with software that is developed for the control and operation of safeguards instrumentation. The workshop included presentations to orient the participants to the IAEA Department of Safeguards software activities related to instrumentation data collection and processing, and case studies that were designed to inspire discussion of software development, use, maintenance, and upgrades in breakout sessions and to result in recommendations for effective software practices and management. This report summarizes the results of the workshop.

  12. Using XML and Java Technologies for Astronomical Instrument Control

    NASA Technical Reports Server (NTRS)

    Ames, Troy; Case, Lynne; Powers, Edward I. (Technical Monitor)

    2001-01-01

    Traditionally, instrument command and control systems have been highly specialized, consisting mostly of custom code that is difficult to develop, maintain, and extend. Such solutions are initially very costly and are inflexible to subsequent engineering change requests, increasing software maintenance costs. Instrument description is too tightly coupled with details of implementation. NASA Goddard Space Flight Center, under the Instrument Remote Control (IRC) project, is developing a general and highly extensible framework that applies to any kind of instrument that can be controlled by a computer. The software architecture combines the platform independent processing capabilities of Java with the power of the Extensible Markup Language (XML), a human readable and machine understandable way to describe structured data. A key aspect of the object-oriented architecture is that the software is driven by an instrument description, written using the Instrument Markup Language (IML), a dialect of XML. IML is used to describe the command sets and command formats of the instrument, communication mechanisms, format of the data coming from the instrument, and characteristics of the graphical user interface to control and monitor the instrument. The IRC framework allows the users to define a data analysis pipeline which converts data coming out of the instrument. The data can be used in visualizations in order for the user to assess the data in real-time, if necessary. The data analysis pipeline algorithms can be supplied by the user in a variety of forms or programming languages. Although the current integration effort is targeted for the High-resolution Airborne Wideband Camera (HAWC) and the Submillimeter and Far Infrared Experiment (SAFIRE), first-light instruments of the Stratospheric Observatory for Infrared Astronomy (SOFIA), the framework is designed to be generic and extensible so that it can be applied to any instrument. Plans are underway to test the framework with other types of instruments, such as remote sensing earth science instruments.

  13. Flight Software Development for the CHEOPS Instrument with the CORDET Framework

    NASA Astrophysics Data System (ADS)

    Cechticky, V.; Ottensamer, R.; Pasetti, A.

    2015-09-01

    CHEOPS is an ESA S-class mission dedicated to the precise measurement of radii of already known exoplanets using ultra-high precision photometry. The instrument flight software controlling the instrument and handling the science data is developed by the University of Vienna using the CORDET Framework offered by P&P Software GmbH. The CORDET Framework provides a generic software infrastructure for PUS-based applications. This paper describes how the framework is used for the CHEOPS application software to provide a consistent solution for the communication and control services, event handling, and FDIR procedures. This approach is innovative in four respects: (a) it is true third-party reuse; (b) re-use is done at specification, validation and code level; (c) the re-usable assets and their qualification data package are entirely open-source; (d) re-use is based on call-back, with the application developer providing functions which are called by the reusable architecture.
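
    A minimal sketch of the call-back reuse style mentioned in point (d): the reusable framework owns the control flow and invokes functions supplied by the application developer. The names below are illustrative, not taken from the CORDET Framework.

    ```python
    from typing import Callable, Dict

    class FrameworkService:
        """Reusable part: dispatches events to application-supplied handlers."""
        def __init__(self) -> None:
            self._handlers: Dict[str, Callable[[dict], None]] = {}

        def register(self, event: str, handler: Callable[[dict], None]) -> None:
            self._handlers[event] = handler           # application provides the function

        def dispatch(self, event: str, data: dict) -> None:
            self._handlers.get(event, lambda _: None)(data)   # framework calls it back

    service = FrameworkService()
    service.register("HK_REPORT", lambda data: print("housekeeping:", data))
    service.dispatch("HK_REPORT", {"temp_K": 183.2})
    ```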

  14. Launching GUPPI: the Green Bank Ultimate Pulsar Processing Instrument

    NASA Astrophysics Data System (ADS)

    DuPlain, Ron; Ransom, Scott; Demorest, Paul; Brandt, Patrick; Ford, John; Shelton, Amy L.

    2008-08-01

    The National Radio Astronomy Observatory (NRAO) is launching the Green Bank Ultimate Pulsar Processing Instrument (GUPPI), a prototype flexible digital signal processor designed for pulsar observations with the Robert C. Byrd Green Bank Telescope (GBT). GUPPI uses field programmable gate array (FPGA) hardware and design tools developed by the Center for Astronomy Signal Processing and Electronics Research (CASPER) at the University of California, Berkeley. The NRAO has been concurrently developing GUPPI software and hardware using minimal software resources. The software handles instrument monitor and control, data acquisition, and hardware interfacing. GUPPI is currently an expert-only spectrometer, but supports future integration with the full GBT production system. The NRAO was able to take advantage of the unique flexibility of the CASPER FPGA hardware platform, develop hardware and software in parallel, and build a suite of software tools for monitoring, controlling, and acquiring data with a new instrument over a short timeline of just a few months. The NRAO interacts regularly with CASPER and its users, and GUPPI stands as an example of what reconfigurable computing and open-source development can do for radio astronomy. GUPPI is modular for portability, and the NRAO provides the results of development as an open-source resource.

  15. The Keck keyword layer

    NASA Technical Reports Server (NTRS)

    Conrad, A. R.; Lupton, W. F.

    1992-01-01

    Each Keck instrument presents a consistent software view to the user interface programmer. The view consists of a small library of functions, which are identical for all instruments, and a large set of keywords that vary from instrument to instrument. All knowledge of the underlying task structure is hidden from the application programmer by the keyword layer. Image capture software uses the same function library to collect data for the image header. Because the image capture software and the instrument control software are built on top of the same keyword layer, a given observation can be 'replayed' by extracting keyword-value pairs from the image header and passing them back to the control system. The keyword layer features non-blocking as well as blocking I/O. A non-blocking keyword write operation (such as setting a filter position) specifies a callback to be invoked when the operation is complete. A non-blocking keyword read operation specifies a callback to be invoked whenever the keyword changes state. The keyword-callback style meshes well with the widget-callback style commonly used in X window programs. The first keyword library was built for the two Keck optical instruments. More recently, keyword libraries have been developed for the infrared instruments and for telescope control. Although the underlying mechanisms used for inter-process communication by each of these systems vary widely (Lick MUSIC, Sun RPC, and direct socket I/O, respectively), a basic user interface has been written that can be used with any of these systems. Since the keyword libraries are bound to user interface programs dynamically at run time, only a single set of user interface executables is needed. For example, the same program, 'xshow', can be used to display continuously the telescope's position, the time left in an instrument's exposure, or both values simultaneously. Less generic tools that operate on specific keywords, for example an X display that controls optical instrument exposures, have also been written using the keyword layer.
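
    A compact sketch of the keyword-layer idea in Python: blocking reads and writes plus non-blocking operations that invoke a callback on completion or on a state change. The API shown is illustrative, not the actual Keck keyword library.

    ```python
    import threading

    class KeywordLayer:
        def __init__(self):
            self._values, self._watchers = {}, {}

        def write(self, keyword, value):                 # blocking write
            self._values[keyword] = value
            for callback in self._watchers.get(keyword, []):
                callback(value)                          # notify state-change watchers

        def read(self, keyword):                         # blocking read
            return self._values.get(keyword)

        def write_async(self, keyword, value, on_done):  # non-blocking write
            threading.Thread(
                target=lambda: (self.write(keyword, value), on_done(keyword))
            ).start()

        def watch(self, keyword, on_change):             # non-blocking read via callback
            self._watchers.setdefault(keyword, []).append(on_change)

    kw = KeywordLayer()
    kw.watch("FILTER", lambda v: print("FILTER is now", v))
    kw.write_async("FILTER", 3, on_done=lambda k: print(k, "write complete"))
    ```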

  16. XML in an Adaptive Framework for Instrument Control

    NASA Technical Reports Server (NTRS)

    Ames, Troy J.

    2004-01-01

    NASA Goddard Space Flight Center is developing an extensible framework for instrument command and control, known as Instrument Remote Control (IRC), that combines the platform independent processing capabilities of Java with the power of the Extensible Markup Language (XML). A key aspect of the architecture is software that is driven by an instrument description, written using the Instrument Markup Language (IML). IML is an XML dialect used to describe interfaces to control and monitor the instrument, command sets and command formats, data streams, communication mechanisms, and data processing algorithms.

  17. WMAP C&DH Software

    NASA Technical Reports Server (NTRS)

    Cudmore, Alan; Leath, Tim; Ferrer, Art; Miller, Todd; Walters, Mark; Savadkin, Bruce; Wu, Ji-Wei; Slegel, Steve; Stagmer, Emory

    2007-01-01

    The command-and-data-handling (C&DH) software of the Wilkinson Microwave Anisotropy Probe (WMAP) spacecraft functions as the sole interface between (1) the spacecraft and its instrument subsystem and (2) ground operations equipment. This software includes a command-decoding and -distribution system, a telemetry/data-handling system, and a data-storage-and-playback system. This software performs onboard processing of attitude sensor data and generates commands for attitude-control actuators in a closed-loop fashion. It also processes stored commands and monitors health and safety functions for the spacecraft and its instrument subsystems. The basic functionality of this software is the same as that of the older C&DH software of the Rossi X-Ray Timing Explorer (RXTE) spacecraft, the main difference being the addition of the attitude-control functionality. Previously, the C&DH and attitude-control computations were performed by different processors because a single RXTE processor did not have enough processing power. The WMAP spacecraft includes a more powerful processor capable of performing both computations.

  18. Complete LabVIEW-Controlled HPLC Lab: An Advanced Undergraduate Experience

    ERIC Educational Resources Information Center

    Beussman, Douglas J.; Walters, John P.

    2017-01-01

    Virtually all modern chemical instrumentation is controlled by computers. While software packages are continually becoming easier to use, allowing for more researchers to utilize more complex instruments, conveying some level of understanding as to how computers and instruments communicate is still an important part of the undergraduate…

  19. Remote control of astronomical instruments via the Internet

    NASA Astrophysics Data System (ADS)

    Ashley, M. C. B.; Brooks, P. W.; Lloyd, J. P.

    1996-01-01

    A software package called ERIC is described that provides a framework for allowing scientific instruments to be remotely controlled via the Internet. The package has been used to control four diverse astronomical instruments, and is now being made freely available to the community. For a description of ERIC's capabilities, and how to obtain a copy, see the conclusion to this paper.

  20. SIMPATIQCO: a server-based software suite which facilitates monitoring the time course of LC-MS performance metrics on Orbitrap instruments.

    PubMed

    Pichler, Peter; Mazanek, Michael; Dusberger, Frederico; Weilnböck, Lisa; Huber, Christian G; Stingl, Christoph; Luider, Theo M; Straube, Werner L; Köcher, Thomas; Mechtler, Karl

    2012-11-02

    While the performance of liquid chromatography (LC) and mass spectrometry (MS) instrumentation continues to increase, applications such as analyses of complete or near-complete proteomes and quantitative studies require constant and optimal system performance. For this reason, research laboratories and core facilities alike are recommended to implement quality control (QC) measures as part of their routine workflows. Many laboratories perform sporadic quality control checks. However, successive and systematic longitudinal monitoring of system performance would be facilitated by dedicated automatic or semiautomatic software solutions that aid an effortless analysis and display of QC metrics over time. We present the software package SIMPATIQCO (SIMPle AuTomatIc Quality COntrol) designed for evaluation of data from LTQ Orbitrap, Q-Exactive, LTQ FT, and LTQ instruments. A centralized SIMPATIQCO server can process QC data from multiple instruments. The software calculates QC metrics supervising every step of data acquisition from LC and electrospray to MS. For each QC metric the software learns the range indicating adequate system performance from the uploaded data using robust statistics. Results are stored in a database and can be displayed in a comfortable manner from any computer in the laboratory via a web browser. QC data can be monitored for individual LC runs as well as plotted over time. SIMPATIQCO thus assists the longitudinal monitoring of important QC metrics such as peptide elution times, peak widths, intensities, total ion current (TIC) as well as sensitivity, and overall LC-MS system performance; in this way the software also helps identify potential problems. The SIMPATIQCO software package is available free of charge.
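
    A hedged sketch of the robust-statistics idea described above: learning an acceptable range for a QC metric (e.g. peak width) from past runs with the median and MAD instead of the mean and standard deviation, so that a few outlier runs do not skew the range. The cut-off factor is an assumption, not SIMPATIQCO's exact rule.

    ```python
    import numpy as np

    def learn_qc_range(history, k=3.0):
        """Return (low, high) bounds for a QC metric learned from its history."""
        values = np.asarray(history, dtype=float)
        median = np.median(values)
        mad = 1.4826 * np.median(np.abs(values - median))   # MAD scaled to a sigma estimate
        return median - k * mad, median + k * mad

    peak_widths = [12.1, 12.4, 11.9, 12.2, 25.0, 12.3]      # one outlier run
    low, high = learn_qc_range(peak_widths)
    flagged = [w for w in peak_widths if not (low <= w <= high)]
    print(f"range {low:.1f}-{high:.1f}, flagged runs: {flagged}")
    ```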

  1. SIMPATIQCO: A Server-Based Software Suite Which Facilitates Monitoring the Time Course of LC–MS Performance Metrics on Orbitrap Instruments

    PubMed Central

    2012-01-01

    While the performance of liquid chromatography (LC) and mass spectrometry (MS) instrumentation continues to increase, applications such as analyses of complete or near-complete proteomes and quantitative studies require constant and optimal system performance. For this reason, research laboratories and core facilities alike are recommended to implement quality control (QC) measures as part of their routine workflows. Many laboratories perform sporadic quality control checks. However, successive and systematic longitudinal monitoring of system performance would be facilitated by dedicated automatic or semiautomatic software solutions that aid an effortless analysis and display of QC metrics over time. We present the software package SIMPATIQCO (SIMPle AuTomatIc Quality COntrol) designed for evaluation of data from LTQ Orbitrap, Q-Exactive, LTQ FT, and LTQ instruments. A centralized SIMPATIQCO server can process QC data from multiple instruments. The software calculates QC metrics supervising every step of data acquisition from LC and electrospray to MS. For each QC metric the software learns the range indicating adequate system performance from the uploaded data using robust statistics. Results are stored in a database and can be displayed in a comfortable manner from any computer in the laboratory via a web browser. QC data can be monitored for individual LC runs as well as plotted over time. SIMPATIQCO thus assists the longitudinal monitoring of important QC metrics such as peptide elution times, peak widths, intensities, total ion current (TIC) as well as sensitivity, and overall LC–MS system performance; in this way the software also helps identify potential problems. The SIMPATIQCO software package is available free of charge. PMID:23088386

  2. The artificial satellite observation chronograph controlled by single chip microcomputer.

    NASA Astrophysics Data System (ADS)

    Pan, Guangrong; Tan, Jufan; Ding, Yuanjun

    1991-06-01

    The instrument specifications, hardware structure, software design, and other characteristics of the chronograph mounted on a theodolite used for artificial satellite observation are presented. The instrument is a real-time control system based on a single-chip microcomputer.

  3. Detailed design and first tests of the application software for the instrument control unit of Euclid-NISP

    NASA Astrophysics Data System (ADS)

    Ligori, S.; Corcione, L.; Capobianco, V.; Bonino, D.; Sirri, G.; Fornari, F.; Giacomini, F.; Patrizii, L.; Valenziano, L.; Travaglini, R.; Colodro, C.; Bortoletto, F.; Bonoli, C.; Chiarusi, T.; Margiotta, A.; Mauri, N.; Pasqualini, L.; Spurio, M.; Tenti, M.; Dal Corso, F.; Dusini, S.; Laudisio, F.; Sirignano, C.; Stanco, L.; Ventura, S.; Auricchio, N.; Balestra, A.; Franceschi, E.; Morgante, G.; Trifoglio, M.; Medinaceli, E.; Guizzo, G. P.; Debei, S.; Stephen, J. B.

    2016-07-01

    In this paper we describe the detailed design of the application software (ASW) of the instrument control unit (ICU) of NISP, the Near-Infrared Spectro-Photometer of the Euclid mission. This software is based on a real-time operating system (RTEMS) and will interface with all the subunits of NISP, as well as the command and data management unit (CDMU) of the spacecraft for telecommand and housekeeping management. We briefly review the main requirements driving the design and the architecture of the software that is approaching the Critical Design Review level. The interaction with the data processing unit (DPU), which is the intelligent subunit controlling the detector system, is described in detail, as well as the concept for the implementation of the failure detection, isolation and recovery (FDIR) algorithms. The first version of the software is under development on a Breadboard model produced by AIRBUS/CRISA. We describe the results of the tests and the main performances and budgets.

  4. The Influence of Software Complexity on the Maintenance Effort: Case Study on Software Developed within Educational Process

    ERIC Educational Resources Information Center

    Radulescu, Iulian Ionut

    2006-01-01

    Software complexity is the most important software quality attribute and a very useful instrument in the study of software quality. It is one of the factors that affect most of the software quality characteristics, including maintainability. It is very important to quantify this influence and identify the means to keep it under control; by using…

  5. Core Community Specifications for Electron Microprobe Operating Systems: Software, Quality Control, and Data Management Issues

    NASA Technical Reports Server (NTRS)

    Fournelle, John; Carpenter, Paul

    2006-01-01

    Modern electron microprobe systems have become increasingly sophisticated. These systems utilize either UNIX or PC computer systems for measurement, automation, and data reduction. These systems have undergone major improvements in processing, storage, display, and communications, due to increased capabilities of hardware and software. Instrument specifications are typically utilized at the time of purchase and concentrate on hardware performance. The microanalysis community includes analysts, researchers, software developers, and manufacturers, who could benefit from exchange of ideas and the ultimate development of core community specifications (CCS) for hardware and software components of microprobe instrumentation and operating systems.

  6. The instrument control software package for the Habitable-Zone Planet Finder spectrometer

    NASA Astrophysics Data System (ADS)

    Bender, Chad F.; Robertson, Paul; Stefansson, Gudmundur Kari; Monson, Andrew; Anderson, Tyler; Halverson, Samuel; Hearty, Frederick; Levi, Eric; Mahadevan, Suvrath; Nelson, Matthew; Ramsey, Larry; Roy, Arpita; Schwab, Christian; Shetrone, Matthew; Terrien, Ryan

    2016-08-01

    We describe the Instrument Control Software (ICS) package that we have built for The Habitable-Zone Planet Finder (HPF) spectrometer. The ICS controls and monitors instrument subsystems, facilitates communication with the Hobby-Eberly Telescope facility, and provides user interfaces for observers and telescope operators. The backend is built around the asynchronous network software stack provided by the Python Twisted engine, and is linked to a suite of custom hardware communication protocols. This backend is accessed through Python-based command-line and PyQt graphical frontends. In this paper we describe several of the customized subsystem communication protocols that provide access to and help maintain the hardware systems that comprise HPF, and show how asynchronous communication benefits the numerous hardware components. We also discuss our Detector Control Subsystem, built as a set of custom Python wrappers around a C-library that provides native Linux access to the SIDECAR ASIC and Hawaii-2RG detector system used by HPF. HPF will be one of the first astronomical instruments on sky to utilize this native Linux capability through the SIDECAR Acquisition Module (SAM) electronics. The ICS we have created is very flexible, and we are adapting it for NEID, NASA's Extreme Precision Doppler Spectrometer for the WIYN telescope; we will describe this adaptation, and describe the potential for use in other astronomical instruments.
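
    The abstract above describes an asynchronous, network-based control backend. As a rough illustration of that pattern only (this is not the HPF ICS itself; the port, command names, and the fake temperature subsystem below are invented), the following Python sketch uses the Twisted framework to serve a line-based command protocol while a looping call polls a simulated hardware subsystem, so that slow hardware never blocks connected clients.

      # Illustrative sketch only: a Twisted line-based command server plus a
      # periodic hardware poll. All names and values here are hypothetical.
      from twisted.internet import reactor, task
      from twisted.internet.protocol import ServerFactory
      from twisted.protocols.basic import LineReceiver

      class SubsystemState:
          """Holds the latest reading from a (simulated) hardware subsystem."""
          def __init__(self):
              self.temperature_k = None

          def poll(self):
              # A real ICS would call a custom hardware protocol here.
              self.temperature_k = 180.0

      class CommandProtocol(LineReceiver):
          delimiter = b"\n"

          def lineReceived(self, line):
              cmd = line.decode().strip().upper()
              if cmd == "GET_TEMP":
                  self.sendLine(f"TEMP {self.factory.state.temperature_k}".encode())
              else:
                  self.sendLine(b"ERROR unknown command")

      class CommandFactory(ServerFactory):
          protocol = CommandProtocol

          def __init__(self, state):
              self.state = state

      if __name__ == "__main__":
          state = SubsystemState()
          task.LoopingCall(state.poll).start(5.0)   # poll hardware every 5 s
          reactor.listenTCP(9000, CommandFactory(state))
          reactor.run()

    The same single-reactor structure scales to additional subsystems by registering further looping calls and protocols, which is the benefit of asynchronous communication noted in the abstract.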

  7. Design of Instrument Control Software for Solar Vector Magnetograph at Udaipur Solar Observatory

    NASA Astrophysics Data System (ADS)

    Gosain, Sanjay; Venkatakrishnan, P.; Venugopalan, K.

    2004-04-01

    A magnetograph is an instrument that measures the solar magnetic field by measuring Zeeman-induced polarization in solar spectral lines. In a typical filter-based magnetograph there are three main modules, namely a polarimeter, a narrow-band spectrometer (filter), and an imager (CCD camera). For successful operation of the magnetograph it is essential that these modules work in synchronization with each other. Here, we describe the design of the instrument control system implemented for the Solar Vector Magnetograph under development at Udaipur Solar Observatory. The control software is written in Visual Basic and exploits Component Object Model (COM) components for fast and flexible application development. The user can interact with the instrument modules through a Graphical User Interface (GUI) and can program the sequence of magnetograph operations. The integration of Interactive Data Language (IDL) ActiveX components in the interface provides a powerful tool for online visualization, analysis and processing of images.

  8. Using XML and Java for Astronomical Instrument Control

    NASA Astrophysics Data System (ADS)

    Koons, L.; Ames, T.; Evans, R.; Warsaw, C.; Sall, K.

    1999-12-01

    Traditionally, instrument command and control systems have been highly specialized, consisting mostly of custom code that is difficult to develop, maintain, and extend. Such solutions are initially very costly and are inflexible to subsequent engineering change requests. Instrument description is too tightly coupled with details of implementation. NASA/Goddard Space Flight Center and AppNet, Inc. are developing a very general and highly extensible framework that applies to virtually any kind of instrument that can be controlled by a computer (e.g., telescopes, microscopes and printers). A key aspect of the object-oriented architecture, implemented in Java, involves software that is driven by an instrument description. The Astronomical Instrument Markup Language (AIML) is a domain-specific implementation of the more generalized Instrument Markup Language (IML). The software architecture combines the platform-independent processing capabilities of Java with the vendor-independent data description syntax of Extensible Markup Language (XML), a human-readable and machine-understandable way to describe structured data. IML is used to describe command sets (including parameters, datatypes, and constraints) and their associated formats, telemetry, and communication mechanisms. The software uses this description to present graphical user interfaces to control and monitor the instrument. Recent efforts have extended to command procedures (scripting) and representation of data pipeline inputs, outputs, and connections. Near future efforts are likely to include an XML description of data visualizations, as well as the potential use of XSL (Extensible Stylesheet Language) to permit astronomers to customize the user interface on several levels: per user, instrument, subsystem, or observatory-wide. Our initial prototyping effort was targeted for HAWC (High-resolution Airborne Wideband Camera), a first-light instrument of SOFIA (the Stratospheric Observatory for Infrared Astronomy). A production-level application of this technology is for one of the three candidate detectors of SPIRE (Spectral and Photometric Imaging REceiver), a focal plane instrument proposed for the European Space Agency's Far Infrared Space Telescope. The detectors are being developed by the Infrared Astrophysics Branch of NASA/GSFC.
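
    To illustrate the description-driven idea above, the short sketch below parses a toy XML instrument description and validates a command against it before it would be sent to hardware. The element and attribute names are invented for this example and are not the actual AIML schema; only the general approach, software driven by an instrument description, is taken from the text.

      # Toy XML instrument description (hypothetical schema, not real AIML),
      # parsed with the Python standard library into a command table.
      import xml.etree.ElementTree as ET

      DESCRIPTION = """
      <instrument name="demo">
        <command name="SET_FILTER">
          <parameter name="wheel" type="int" min="1" max="6"/>
        </command>
        <command name="EXPOSE">
          <parameter name="seconds" type="float" min="0" max="3600"/>
        </command>
      </instrument>
      """

      def load_commands(xml_text):
          """Build {command: [(param, type, min, max), ...]} from the description."""
          root = ET.fromstring(xml_text)
          table = {}
          for cmd in root.findall("command"):
              params = [(p.get("name"), p.get("type"),
                         float(p.get("min")), float(p.get("max")))
                        for p in cmd.findall("parameter")]
              table[cmd.get("name")] = params
          return table

      def validate(table, name, **kwargs):
          """Check a requested command against the description before sending it."""
          for pname, ptype, lo, hi in table[name]:
              value = kwargs[pname]
              if not lo <= value <= hi:
                  raise ValueError(f"{name}.{pname}={value} outside [{lo}, {hi}]")
          return True

      if __name__ == "__main__":
          cmds = load_commands(DESCRIPTION)
          validate(cmds, "EXPOSE", seconds=120.0)   # ok
          validate(cmds, "SET_FILTER", wheel=9)     # raises ValueError

    A user interface generated from the same description would expose one widget per parameter, with the min/max attributes driving the allowed input range, which is the essence of the architecture described in the abstract.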

  9. FPGA based control system for space instrumentation

    NASA Astrophysics Data System (ADS)

    Di Giorgio, Anna M.; Cerulli Irelli, Pasquale; Nuzzolo, Francesco; Orfei, Renato; Spinoglio, Luigi; Liu, Giovanni S.; Saraceno, Paolo

    2008-07-01

    The prototype for a general-purpose FPGA-based control system for space instrumentation is presented, with particular attention to the instrument control application software. The system HW is based on the LEON3FT processor, which gives the flexibility to configure the chip with only the necessary HW functionalities, from simple logic up to small dedicated processors. The instrument control SW is developed in ANSI C and, for time-critical (<10 μs) commanding sequences, implements an internal instruction sequencer triggered via an interrupt service routine based on a high-priority HW interrupt.

  10. Cybernetic Control of an Electrochemical Repertoire.

    ERIC Educational Resources Information Center

    He, Peixin; And Others

    1982-01-01

    Describes major features of a computer-operated, cybernetic potentiostat and the development, design, and operation of the software in ROM. The instrument contains control circuitry and software making it compatible with the static mercury drop electrode produced by EG&G Princeton Applied Research Corporation. Sample results using the…

  11. 75 FR 30077 - Advisory Committee On Reactor Safeguards (ACRS); Meeting of the ACRS Subcommittee On Digital I&C...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-28

    ... Subcommittee On Digital I&C Systems The ACRS Subcommittee on Digital Instrumentation and Control (DI&C) Systems... the area of Digital Instrumentation and Control (DI&C) Probabilistic Risk Assessment (PRA). Topics... software reliability methods (QSRMs), NUREG/CR--6997, ``Modeling a Digital Feedwater Control System Using...

  12. The M68HC11 gripper controller electronics

    NASA Technical Reports Server (NTRS)

    Kelley, Robert B.; Bethel, Jeffrey

    1991-01-01

    This document describes the instrumentation, operational theory, circuit implementation, calibration procedures, and general notes for the CIRSSE general purpose pneumatic hand. The mechanical design and the control software are discussed. The circuit design, PCB layout, hand instrumentation, and controller construction described in detail in this document are the result of a senior project.

  13. WTEC monograph on instrumentation, control and safety systems of Canadian nuclear facilities

    NASA Technical Reports Server (NTRS)

    Uhrig, Robert E.; Carter, Richard J.

    1993-01-01

    This report updates a 1989-90 survey of advanced instrumentation and controls (I&C) technologies and associated human factors issues in the U.S. and Canadian nuclear industries carried out by a team from Oak Ridge National Laboratory (Carter and Uhrig 1990). The authors found that the most advanced I&C systems are in the Canadian CANDU plants, where the newest plant (Darlington) has digital systems in almost 100 percent of its control systems and in over 70 percent of its plant protection system. Increased emphasis on human factors and cognitive science in modern control rooms has resulted in a reduced workload for the operators and the elimination of many human errors. Automation implemented through digital instrumentation and control is effectively changing the role of the operator to that of a systems manager. The hypothesis that properly introducing digital systems increases safety is supported by the Canadian experience. The performance of these digital systems has been achieved using appropriate quality assurance programs for both hardware and software development. Recent regulatory authority review of the development of safety-critical software has resulted in the creation of isolated software modules with well defined interfaces and more formal structure in the software generation. The ability of digital systems to detect impending failures and initiate a fail-safe action is a significant safety issue that should be of special interest to nuclear utilities and regulatory authorities around the world.

  14. Flight software operation of the Hubble Space Telescope fine guidance sensor

    NASA Technical Reports Server (NTRS)

    Rodden, J. J.; Dougherty, H. J.; Cormier, D. J.

    1988-01-01

    The Hubble Space Telescope (HST) is to carry five major scientific instruments to collect imagery, spectrographic, and photometric astronomical data. The Pointing Control System is designed to achieve pointing accuracies and line of sight jitter levels an order of magnitude less than can be achieved with ground mounted telescopes. This paper describes the operation of the pointing control system flight software in targeting a celestial object in a science instrument aperture and in performing the coordinate transformations necessary for commanding the fine guidance sensor and determining the attitude-error corrections.

  15. SOFIA tracking image simulation

    NASA Astrophysics Data System (ADS)

    Taylor, Charles R.; Gross, Michael A. K.

    2016-09-01

    The Stratospheric Observatory for Infrared Astronomy (SOFIA) tracking camera simulator is a component of the Telescope Assembly Simulator (TASim). TASim is a software simulation of the telescope optics, mounting, and control software. Currently in its fifth major version, TASim is relied upon for telescope operator training, mission planning and rehearsal, and mission control and science instrument software development and testing. TASim has recently been extended for hardware-in-the-loop operation in support of telescope and camera hardware development and control and tracking software improvements. All three SOFIA optical tracking cameras are simulated, including the Focal Plane Imager (FPI), which has recently been upgraded to the status of a science instrument that can be used on its own or in parallel with one of the seven infrared science instruments. The simulation includes tracking camera image simulation of starfields based on the UCAC4 catalog at real-time rates of 4-20 frames per second. For its role in training and planning, it is important for the tracker image simulation to provide images with a realistic appearance and response to changes in operating parameters. For its role in tracker software improvements, it is vital to have realistic signal and noise levels and precise star positions. The design of the software simulation for precise subpixel starfield rendering (including radial distortion), realistic point-spread function as a function of focus, tilt, and collimation, and streaking due to telescope motion will be described. The calibration of the simulation for light sensitivity, dark and bias signal, and noise will also be presented.
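
    The rendering and noise steps mentioned above can be illustrated in a few lines of NumPy. The sketch below is not TASim code and all parameter values are arbitrary: it places Gaussian point-spread functions at fractional pixel positions, conserves each star's total flux, and then adds Poisson shot noise, a bias level, and Gaussian read noise.

      # Minimal subpixel starfield rendering sketch (illustrative values only).
      import numpy as np

      def render_starfield(shape, stars, fwhm_pix, bias=500.0, read_noise=5.0,
                           rng=None):
          """stars: iterable of (x, y, total_counts); x, y are float pixels."""
          rng = np.random.default_rng() if rng is None else rng
          ny, nx = shape
          yy, xx = np.mgrid[0:ny, 0:nx]
          sigma = fwhm_pix / 2.3548            # FWHM -> Gaussian sigma
          image = np.zeros(shape)
          for x0, y0, counts in stars:
              psf = np.exp(-((xx - x0) ** 2 + (yy - y0) ** 2) / (2.0 * sigma ** 2))
              image += counts * psf / psf.sum()        # conserve total flux
          image = rng.poisson(image).astype(float)      # photon shot noise
          image += bias + rng.normal(0.0, read_noise, shape)
          return image

      if __name__ == "__main__":
          frame = render_starfield((256, 256),
                                   stars=[(100.3, 120.7, 5e4), (40.1, 200.9, 8e3)],
                                   fwhm_pix=3.2)
          print(frame.shape, frame.max())

    Streaking due to telescope motion can be approximated in the same framework by summing several such renderings along the motion path within one exposure.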

  16. Real-time control of the robotic lunar observatory telescope

    USGS Publications Warehouse

    Anderson, J.M.; Becker, K.J.; Kieffer, H.H.; Dodd, D.N.

    1999-01-01

    The US Geological Survey operates an automated observatory dedicated to the radiometry of the Moon with the objective of developing a multispectral, spatially resolved photometric model of the Moon to be used in the calibration of Earth-orbiting spacecraft. Interference filters are used with two imaging instruments to observe the Moon in 32 passbands from 350-2500 nm. Three computers control the telescope mount and instruments with a fourth computer acting as a master system to control all observation activities. Real-time control software has been written to operate the instrumentation and to automate the observing process. The observing software algorithms use information including the positions of objects in the sky, the phase of the Moon, and the times of evening and morning twilight to decide how to observe program objects. The observatory has been operating in a routine mode since late 1995 and is expected to continue through at least 2002 without significant modifications.

  17. User interaction with the LUCIFER control software

    NASA Astrophysics Data System (ADS)

    Knierim, Volker; Jütte, Marcus; Polsterer, Kai; Schimmelmann, Jan

    2006-06-01

    We present the concept and design of the interaction between users and the LUCIFER Control Software Package. The necessary functionality that must be provided to a user depends on and differs greatly for the different user types (i.e., engineers and observers). While engineers want total control over every service provided by the software system, observers are typically only interested in a fault tolerant and efficient user interface that helps them to carry out their observations in the best possible way during the night. To provide the functionality engineers need, direct access to a service is necessary. This may harbor a possible threat to the instrument in the case of a faulty operation by the engineer, but is the only way to test every unit during integration and commissioning of the instrument, and for service time later on. The observer on the other hand should only have indirect access to the instrument, controlled by an instrument manager service that ensures the necessary safety checks so that no harm can be done to the instrument. Our design of the user interaction provides such an approach on a level that is transparent to any interaction component regardless of interface type (i.e., textual or graphical). Using the interface and inheritance concepts of the Java Programming Language and its tools to create graphical user interfaces, it is possible to provide the necessary level of flexibility for the different user types on one side, while ensuring maximum reusability of code on the other side.

  18. MPST Software: MoonKommand

    NASA Technical Reports Server (NTRS)

    Kwok, John H.; Call, Jared A.; Khanampornpan, Teerapat

    2012-01-01

    This software automatically processes Sally Ride Science (SRS) delivered MoonKAM camera control files (ccf) into uplink products for the GRAIL-A and GRAIL-B spacecraft as part of an education and public outreach (EPO) extension to the GRAIL mission. Once properly validated and deemed safe for execution onboard the spacecraft, MoonKommand generates the command products via the Automated Sequence Processor (ASP) and generates uplink (.scmf) files for radiation to the GRAIL-A and/or GRAIL-B spacecraft. Any errors detected along the way are reported back to SRS via email. With MoonKommand, SRS can control their EPO instrument as part of a fully automated process. Inputs are received from SRS as either image capture files (.ccficd) for new image requests, or downlink/delete files (.ccfdl) for requesting image downlink from the instrument and on-board memory management. The MoonKommand outputs are command and file-load (.scmf) files that will be uplinked by the Deep Space Network (DSN). Without the MoonKommand software, uplink product generation for the MoonKAM instrument would be a manual process. The software is specific to the MoonKAM instrument on the GRAIL mission. At the time of this writing, the GRAIL mission was making final preparations to begin the science phase, which was scheduled to continue until June 2012.

  19. Robust Control for the Mercury Laser Altimeter

    NASA Technical Reports Server (NTRS)

    Rosenberg, Jacob S.

    2006-01-01

    Mercury Laser Altimeter Science Algorithms is a software system for controlling the laser altimeter aboard the Messenger spacecraft, which is to enter into orbit about Mercury in 2011. The software will control the altimeter by dynamically modifying hardware inputs for gain, threshold, channel-disable flags, range-window start location, and range-window width, by using ranging information provided by the spacecraft and noise counts from instrument hardware. In addition, because of severe bandwidth restrictions, the software also selects returns for downlink.
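
    As a toy illustration of the kind of bookkeeping described above (this is not the flight algorithm; the margins, noise rate, and false-alarm target below are invented), the sketch derives a range-window start and width from the predicted two-way light time and picks a count threshold from a Poisson false-alarm bound on the measured noise rate.

      # Hypothetical range-window and threshold calculation, not MLA flight code.
      import math

      C = 299_792_458.0  # speed of light, m/s

      def range_window(predicted_range_m, range_uncertainty_m):
          """Return (window_start_s, window_width_s) around the expected echo."""
          t_expected = 2.0 * predicted_range_m / C          # two-way light time
          t_margin = 2.0 * range_uncertainty_m / C
          return t_expected - t_margin, 2.0 * t_margin

      def detection_threshold(noise_counts_per_s, window_width_s,
                              max_false_alarm_prob=0.01):
          """Smallest count threshold k with P(Poisson >= k) below the target."""
          lam = noise_counts_per_s * window_width_s
          k, cdf = 0, 0.0
          while True:
              cdf += math.exp(-lam) * lam ** k / math.factorial(k)
              k += 1
              if 1.0 - cdf <= max_false_alarm_prob:
                  return k

      if __name__ == "__main__":
          start, width = range_window(1.2e6, 5_000.0)   # ~1200 km range, 5 km error
          print(f"open at {start*1e3:.3f} ms, width {width*1e6:.1f} us")
          print("threshold (counts):", detection_threshold(2.0e5, width))

    The point of the sketch is only that window placement follows directly from ranging information, while the threshold trades sensitivity against the noise counts reported by the instrument hardware.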

  20. UCam: universal camera controller and data acquisition system

    NASA Astrophysics Data System (ADS)

    McLay, S. A.; Bezawada, N. N.; Atkinson, D. C.; Ives, D. J.

    2010-07-01

    This paper describes the software architecture and design concepts used in the UKATC's generic camera control and data acquisition software system (UCam) which was originally developed for use with the ARC controller hardware. The ARC detector control electronics are developed by Astronomical Research Cameras (ARC), of San Diego, USA. UCam provides an alternative software solution programmed in C/C++ and python that runs on a real-time Linux operating system to achieve critical speed performance for high time resolution instrumentation. UCam is a server based application that can be accessed remotely and easily integrated as part of a larger instrument control system. It comes with a user friendly client application interface that has several features including a FITS header editor and support for interfacing with network devices. Support is also provided for writing automated scripts in python or as text files. UCam has an application centric design where custom applications for different types of detectors and read out modes can be developed, downloaded and executed on the ARC controller. The built-in de-multiplexer can be easily reconfigured to readout any number of channels for almost any type of detector. It also provides support for numerous sampling modes such as CDS, FOWLER, NDR and threshold limited NDR. UCam has been developed over several years for use on many instruments such as the Wide Field Infra Red Camera (WFCAM) at UKIRT in Hawaii, the mid-IR imager/spectrometer UIST and is also used on instruments at SUBARU, Gemini and Palomar.
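
    The sampling modes named above have simple arithmetic definitions, sketched below in NumPy. This is illustrative only, not UCam code: correlated double sampling (CDS) differences the first and last reads, Fowler-N sampling differences the means of the first and last N reads, and one common way to reduce non-destructive reads (NDR) is a per-pixel least-squares slope fit up the ramp.

      # Arithmetic of common IR-array readout modes (illustrative, not UCam).
      import numpy as np

      def cds(ramp):
          """ramp: array of shape (n_reads, ny, nx); signal = last - first read."""
          return ramp[-1] - ramp[0]

      def fowler(ramp, n_pairs):
          """Mean of the last n_pairs reads minus mean of the first n_pairs reads."""
          return ramp[-n_pairs:].mean(axis=0) - ramp[:n_pairs].mean(axis=0)

      def ndr_slope(ramp, dt):
          """Per-pixel least-squares slope (counts per second) up the ramp."""
          n = ramp.shape[0]
          t = np.arange(n) * dt
          t_centered = t - t.mean()
          # slope = sum((t - tbar) * y) / sum((t - tbar)^2), applied per pixel
          num = np.tensordot(t_centered, ramp - ramp.mean(axis=0), axes=(0, 0))
          return num / (t_centered ** 2).sum()

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          true_rate = 50.0                                # counts / s / pixel
          reads = np.cumsum(rng.poisson(true_rate, (16, 8, 8)), axis=0).astype(float)
          print("CDS:", cds(reads).mean())
          print("Fowler-4:", fowler(reads, 4).mean())
          print("NDR slope:", ndr_slope(reads, dt=1.0).mean())

    Threshold-limited NDR, also mentioned above, would simply stop including reads in the fit once a pixel approaches saturation.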

  1. VIMOS Instrument Control Software Design: an Object Oriented Approach

    NASA Astrophysics Data System (ADS)

    Brau-Nogué, Sylvie; Lucuix, Christian

    2002-12-01

    The Franco-Italian VIMOS instrument is a VIsible imaging Multi-Object Spectrograph with outstanding multiplex capabilities, allowing spectra of more than 800 objects to be taken simultaneously, or integral field spectroscopy over a 54x54 arcsec area. VIMOS is being installed at the Nasmyth focus of the third Unit Telescope of the European Southern Observatory Very Large Telescope (VLT) at Mount Paranal in Chile. This paper describes the analysis, design, and implementation of the VIMOS Instrument Control System, using UML notation. Our control group followed an object-oriented software process while keeping in mind the ESO VLT standard control concepts, for which a complete software library is available. Rather than applying a waterfall lifecycle, the ICS project used iterative development, a lifecycle consisting of several iterations. Each iteration consisted of capturing and evaluating the requirements, visual modeling for analysis and design, implementation, test, and deployment. Depending on the project phase, iterations focused more or less on specific activities. The result is an object model (the design model), including use-case realizations, complemented by an implementation view and a deployment view. An extract of the VIMOS ICS UML model is presented and some implementation, integration, and test issues are discussed.

  2. Live interactive computer music performance practice

    NASA Astrophysics Data System (ADS)

    Wessel, David

    2002-05-01

    A live-performance musical instrument can be assembled around current lap-top computer technology. One adds a controller such as a keyboard or other gestural input device, a sound diffusion system, some form of connectivity processor(s) providing for audio I/O and gestural controller input, and reactive real-time native signal processing software. A system consisting of a hand gesture controller; software for gesture analysis and mapping, machine listening, composition, and sound synthesis; and a controllable radiation pattern loudspeaker are described. Interactivity begins in the set up wherein the speaker-room combination is tuned with an LMS procedure. This system was designed for improvisation. It is argued that software suitable for carrying out an improvised musical dialog with another performer poses special challenges. The processes underlying the generation of musical material must be very adaptable, capable of rapid changes in musical direction. Machine listening techniques are used to help the performer adapt to new contexts. Machine learning can play an important role in the development of such systems. In the end, as with any musical instrument, human skill is essential. Practice is required not only for the development of musically appropriate human motor programs but for the adaptation of the computer-based instrument as well.

  3. Flexible software architecture for user-interface and machine control in laboratory automation.

    PubMed

    Arutunian, E B; Meldrum, D R; Friedman, N A; Moody, S E

    1998-10-01

    We describe a modular, layered software architecture for automated laboratory instruments. The design consists of a sophisticated user interface, a machine controller and multiple individual hardware subsystems, each interacting through a client-server architecture built entirely on top of open Internet standards. In our implementation, the user-interface components are built as Java applets that are downloaded from a server integrated into the machine controller. The user-interface client can thereby provide laboratory personnel with a familiar environment for experiment design through a standard World Wide Web browser. Data management and security are seamlessly integrated at the machine-controller layer using QNX, a real-time operating system. This layer also controls hardware subsystems through a second client-server interface. This architecture has proven flexible and relatively easy to implement and allows users to operate laboratory automation instruments remotely through an Internet connection. The software architecture was implemented and demonstrated on the Acapella, an automated fluid-sample-processing system that is under development at the University of Washington.

  4. Service-oriented architecture for the ARGOS instrument control software

    NASA Astrophysics Data System (ADS)

    Borelli, J.; Barl, L.; Gässler, W.; Kulas, M.; Rabien, Sebastian

    2012-09-01

    The Advanced Rayleigh Guided ground layer Adaptive optic System, ARGOS, equips the Large Binocular Telescope (LBT) with a constellation of six Rayleigh laser guide stars. By correcting atmospheric turbulence near the ground, the system is designed to increase the image quality of the multi-object spectrograph LUCIFER approximately by a factor of 3 over a field of 4 arc minute diameter. The control software has the critical task of orchestrating several devices, instruments, and high level services, including the already existing adaptive optics system and the telescope control software. All these components are widely distributed over the telescope, adding more complexity to the system design. The approach used by the ARGOS engineers is to write loosely coupled and distributed services under the control of different ownership systems, providing a uniform mechanism to offer, discover, interact with, and use these distributed capabilities. The control system comprises several finite state machines, vibration and flexure compensation loops, and safety mechanisms such as interlocks and aircraft and satellite avoidance systems.

  5. Imaging Sensor Flight and Test Equipment Software

    NASA Technical Reports Server (NTRS)

    Freestone, Kathleen; Simeone, Louis; Robertson, Byran; Frankford, Maytha; Trice, David; Wallace, Kevin; Wilkerson, DeLisa

    2007-01-01

    The Lightning Imaging Sensor (LIS) is one of the components onboard the Tropical Rainfall Measuring Mission (TRMM) satellite, and was designed to detect and locate lightning over the tropics. The LIS flight code was developed to run on a single onboard digital signal processor, and has operated the LIS instrument since 1997 when the TRMM satellite was launched. The software provides controller functions to the LIS Real-Time Event Processor (RTEP) and onboard heaters, collects the lightning event data from the RTEP, compresses and formats the data for downlink to the satellite, collects housekeeping data and formats the data for downlink to the satellite, provides command processing and interface to the spacecraft communications and data bus, and provides watchdog functions for error detection. The Special Test Equipment (STE) software was designed to operate specific test equipment used to support the LIS hardware through development, calibration, qualification, and integration with the TRMM spacecraft. The STE software provides the capability to control instrument activation, commanding (including both data formatting and user interfacing), data collection, decompression, and display and image simulation. The LIS STE code was developed for the DOS operating system in the C programming language. Because of the many unique data formats implemented by the flight instrument, the STE software was required to comprehend the same formats, and translate them for the test operator. The hardware interfaces to the LIS instrument using both commercial and custom computer boards, requiring that the STE code integrate this variety into a working system. In addition, the requirement to provide RTEP test capability dictated the need to provide simulations of background image data with short-duration lightning transients superimposed. This led to the development of unique code used to control the location, intensity, and variation above background for simulated lightning strikes at user-selected locations.

  6. Designing communication and remote controlling of virtual instrument network system

    NASA Astrophysics Data System (ADS)

    Lei, Lin; Wang, Houjun; Zhou, Xue; Zhou, Wenjian

    2005-01-01

    In this paper, a virtual instrument network over a LAN, and ultimately remote control of virtual instruments, is realized using the virtual instrument concept and the LabWindows/CVI software platform. The virtual instrument network system is made up of three subsystems: a server subsystem, a telnet client subsystem, and a local instrument control subsystem. The paper introduces the virtual instrument network structure in detail based on LabWindows and covers the essential techniques involved: the application design of network communication, the client/server programming model, the realization of communication between remote PCs and the server, the transfer of control authority over a workstation, and the server program itself. The virtual instrument network can also be connected to the wider Internet. The technology has been verified through its application in an electronic-measurement virtual instrument network that has already been built; experiment and application confirm that the design is effective.
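
    The client/server model described above can be sketched in a few lines of Python using only the standard library (the port and command strings are invented, and the original system used LabWindows/CVI rather than Python): a threaded TCP server exposes a simulated instrument, and a client function opens a connection, sends one text command, and reads the reply.

      # Toy instrument server and client over TCP (hypothetical commands/port).
      import socket
      import socketserver
      import threading

      class InstrumentHandler(socketserver.StreamRequestHandler):
          def handle(self):
              for raw in self.rfile:                 # one text command per line
                  cmd = raw.decode().strip().upper()
                  if cmd == "MEAS:VOLT?":
                      reply = "1.234"                # simulated reading
                  elif cmd.startswith("CONF:RANGE"):
                      reply = "OK"
                  else:
                      reply = "ERR"
                  self.wfile.write((reply + "\n").encode())

      def query(host, port, command):
          """Open a connection, send one command, return the single-line reply."""
          with socket.create_connection((host, port)) as sock:
              sock.sendall((command + "\n").encode())
              return sock.makefile().readline().strip()

      if __name__ == "__main__":
          server = socketserver.ThreadingTCPServer(("127.0.0.1", 5025),
                                                   InstrumentHandler)
          threading.Thread(target=server.serve_forever, daemon=True).start()
          print(query("127.0.0.1", 5025, "CONF:RANGE 10"))
          print(query("127.0.0.1", 5025, "MEAS:VOLT?"))
          server.shutdown()

    Handing control authority to a remote workstation, as described above, amounts to the server accepting commands from only one authorized client connection at a time.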

  7. The boot software of the control unit of the near infrared spectrograph of the Euclid space mission: technical specification

    NASA Astrophysics Data System (ADS)

    Gómez-Sáenz-de-Tejada, Jaime; Toledo-Moreo, Rafael; Colodro-Conde, Carlos; Pérez-Lizán, David; Fernández-Conde, Jesús; Sánchez-Prieto, Sebastián.

    2016-07-01

    The Near Infrared Spectrograph and Photometer (NISP) is one of the instruments on board the ESA EUCLID mission. The Boot Software (BSW) is in charge of initialization and communications after a reset occurs at hardware level. The Universidad Politecnica de Cartagena and the Instituto de Astrofisica de Canarias are responsible for the Instrument Control Unit of the NISP (NI-ICU) in the Euclid Consortium. The NI-ICU BSW is developed by the Universidad de Alcalá, and its main functions are communication with the S/C for memory management, self-tests, and the start of a patchable Application Software (ASW). This paper presents the status of the NI-ICU BSW definition and design at the end of the Technical Specification phase.

  8. Telescience Testbed Program: A study of software for SIRTF instrument control

    NASA Technical Reports Server (NTRS)

    Young, Erick T.

    1992-01-01

    As a continued element in the Telescience Testbed Program (TTP), the University of Arizona Steward Observatory and the Electrical and Computer Engineering Department (ECE) jointly developed a testbed to evaluate the Operations and Science Instrument System (OASIS) software package for remote control of an instrument for the Space Infrared Telescope Facility (SIRTF). SIRTF is a cryogenically-cooled telescope with three focal plane instruments that will be the infrared element of NASA's Great Observatory series. The anticipated launch date for SIRTF is currently 2001. Because of the complexity of the SIRTF mission, it was not expected that the OASIS package would be suitable for instrument control in the flight situation; however, its possible use as a common interface during the early development and ground test phases of the project was considered. The OASIS package, developed at the University of Colorado for control of the Solar Mesosphere Explorer (SME) satellite, serves as an interface between the operator and the remote instrument which is connected via a network. OASIS provides a rudimentary windowing system as well as support for standard spacecraft communications protocols. The experiment performed all of the functions required of the MIPS simulation program. Remote control of the instrument was demonstrated but found to be inappropriate for SIRTF at this time for the following reasons: (1) the programming interface is too difficult; (2) significant computer resources were required to run OASIS; (3) the communications interface is too complicated; (4) response time was slow; and (5) quicklook of image data was not possible.

  9. NEWFIRM Software--System Integration Using OPC

    NASA Astrophysics Data System (ADS)

    Daly, P. N.

    2004-07-01

    The NOAO Extremely Wide-Field Infra-Red Mosaic (NEWFIRM) camera is being built to satisfy the survey science requirements on the KPNO Mayall and CTIO Blanco 4m telescopes in an era of 8m+ aperture telescopes. Rather than re-invent the wheel, the software system to control the instrument has taken existing software packages and re-used what is appropriate. The result is an end-to-end observation control system using technology components from DRAMA, ORAC, observing tools, GWC, existing in-house motor controllers and new developments like the MONSOON pixel server.

  10. The CFHT MegaCam control system: new solutions based on PLCs, WorldFIP fieldbus and Java softwares

    NASA Astrophysics Data System (ADS)

    Rousse, Jean Y.; Boulade, Olivier; Charlot, Xavier; Abbon, P.; Aune, Stephan; Borgeaud, Pierre; Carton, Pierre-Henri; Carty, M.; Da Costa, J.; Deschamps, H.; Desforge, D.; Eppele, Dominique; Gallais, Pascal; Gosset, L.; Granelli, Remy; Gros, Michel; de Kat, Jean; Loiseau, Denis; Ritou, J. L.; Starzynski, Pierre; Vignal, Nicolas; Vigroux, Laurent G.

    2003-03-01

    MegaCam is a wide-field imaging camera built for the prime focus of the 3.6m Canada-France-Hawaii Telescope. This large detector has required new approaches from the hardware up to the instrument control system software. Safe control of the three sub-systems of the instrument (cryogenics, filters and shutter), measurement of the exposure time with an accuracy of 0.1%, identification of the filters, and management of the internal calibration source are the major challenges taken up by the control system. Another challenge is to provide all these functionalities with the minimum space available on the telescope structure for the electrical hardware and a minimum number of cables, to keep the highest reliability. All these requirements have been met with a control system whose different elements are linked by a WorldFIP fieldbus on optical fiber. Diagnosis and remote user support are ensured with an Engineering Control System station based on software developed with Internet Java technologies (applets, servlets) and connected to the fieldbus.

  11. GPM Timeline Inhibits For IT Processing

    NASA Technical Reports Server (NTRS)

    Dion, Shirley K.

    2014-01-01

    The Safety Inhibit Timeline Tool was created as one approach to capturing and understanding inhibits and controls from IT through launch. Global Precipitation Measurement (GPM) Mission, which launched from Japan in March 2014, was a joint mission under a partnership between the National Aeronautics and Space Administration (NASA) and the Japan Aerospace Exploration Agency (JAXA). GPM was one of the first NASA Goddard in-house programs that extensively used software controls. Using this tool during the GPM buildup allowed a thorough review of inhibit and safety critical software design for hazardous subsystems such as the high gain antenna boom, solar array, and instrument deployments, transmitter turn-on, propulsion system release, and instrument radar turn-on. The GPM safety team developed a methodology to document software safety as part of the standard hazard report. As a result of this process, a new safety inhibit timeline tool was created for management of inhibits and their controls during spacecraft buildup and testing during IT at GSFC and at the launch range in Japan. The Safety Inhibit Timeline Tool was a pathfinder approach for reviewing software that controls the electrical inhibits. The Safety Inhibit Timeline Tool strengthens the Safety Analyst's understanding of the removal of inhibits during the IT process with safety critical software. With this tool, the Safety Analyst can confirm proper safe configuration of a spacecraft during each IT test, track inhibit and software configuration changes, and assess software criticality. In addition to understanding inhibits and controls during IT, the tool allows the Safety Analyst to better communicate to engineers and management the changes in inhibit states with each phase of hardware and software testing and the impact of safety risks. Lessons learned from participating in the GPM campaign at NASA and JAXA will be discussed during this session.

  12. NASA Tech Briefs, February 2003

    NASA Technical Reports Server (NTRS)

    2003-01-01

    Topics covered include: Integrated Electrode Arrays for Neuro-Prosthetic Implants; Eroding Potentiometers; Common/Dependent-Pressure-Vessel Nickel-Hydrogen Batteries; 120-GHz HEMT Oscillator With Surface-Wave-Assisted Antenna; 80-GHz MMIC HEMT Voltage-Controlled Oscillator; High-Energy-Density Capacitors; Microscale Thermal-Transpiration Gas Pump; Instrument for Measuring Temperature of Water; Improved Measurement of Coherence in Presence of Instrument Noise; Compact Instruments Measure Helium-Leak Rates; Irreversible Entropy Production in Two-Phase Mixing Layers; Subsonic and Supersonic Effects in Bose-Einstein Condensate; Nanolaminate Mirrors With "Piston" Figure-Control Actuators; Mixed Conducting Electrodes for Better AMTEC Cells; Process for Encapsulating Protein Crystals; Lightweight, Self-Deployable Wheels; Grease-Resistant O Rings for Joints in Solid Rocket Motors; LabVIEW Serial Driver Software for an Electronic Load; Software Computes Tape-Casting Parameters; Software for Tracking Costs of Mars Projects; Software for Replicating Data Between X.500 and LDAP Directories; The Technical Work Plan Tracking Tool; Improved Multiple-DOF SAW Piezoelectric Motors; Propulsion Flight-Test Fixture; Mechanical Amplifier for a Piezoelectric Transducer; Swell Sleeves for Testing Explosive Devices; Linear Back-Drive Differentials; Miniature Inchworm Actuators Fabricated by Use of LIGA; Using ERF Devices to Control Deployments of Space Structures; High-Temperature Switched-Reluctance Electric Motor; System for Centering a Turbofan in a Nacelle During Tests; Fabricating Composite-Material Structures Containing SMA Ribbons; Optimal Feedback Control of Thermal Networks; Artifacts for Calibration of Submicron Width Measurements; Navigating a Mobile Robot Across Terrain Using Fuzzy Logic; Designing Facilities for Collaborative Operations; and Quantitating Iron in Serum Ferritin by Use of ICP-MS.

  13. ALR - Laser altimeter for the ASTER deep space mission. Simulated operation above a surface with crater

    NASA Astrophysics Data System (ADS)

    de Brum, A. G. V.; da Cruz, F. C.; Hetem, A., Jr.

    2015-10-01

    To assist in the investigation of the triple asteroid system 2001-SN263, the deep space mission ASTER will carry a laser altimeter onboard. The instrument is named ALR and its development is now in progress. To help in the instrument design, and with a view to creating the software that will control the instrument, a package of computer programs was produced to simulate the operation of a pulsed laser altimeter whose operating principle is based on measuring the time of flight of the travelling pulse. This software simulator, called ALR_Sim, produces results that represent the return signal expected when laser pulses are fired toward a target, reflect off it, and return to be detected by the instrument. The program was successfully tested against some of the most common situations expected, and it now constitutes the main workbench for creating and testing the control software to be embarked in the ALR. In addition, the simulator is an important tool for developing the ground software used to process and analyze the data received from the instrument. This work presents the results obtained in the special case of modeling a surface with a crater, along with simulation of the instrument's operation above this type of terrain. The study shows that comparing the return waveform obtained after reflection of the laser pulse from the crater surface with the return signal expected from a flat, homogeneous surface is a useful method for extracting terrain details.
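
    A toy stand-in for the kind of calculation described above (this is not ALR_Sim; the geometry, crater model, and sampling are deliberately simplified and all values are invented) is sketched below: sample surface heights across the laser footprint, convert each sample to a two-way travel time, and histogram the arrivals to obtain the broadened return waveform. A flat surface yields a narrow return, while a crater spreads it out, which is the comparison exploited for terrain extraction.

      # Simplified return-waveform model for a nadir-looking pulsed altimeter.
      import numpy as np

      C = 299_792_458.0  # m/s

      def crater_height(x, y, radius=50.0, depth=10.0):
          """Parabolic crater of given radius/depth centred at the origin (m)."""
          r2 = x ** 2 + y ** 2
          return np.where(r2 < radius ** 2, -depth * (1.0 - r2 / radius ** 2), 0.0)

      def return_waveform(altitude_m, footprint_radius_m, surface_fn,
                          n_samples=20000, bin_ns=1.0, rng=None):
          rng = np.random.default_rng() if rng is None else rng
          # uniform samples over the circular footprint
          r = footprint_radius_m * np.sqrt(rng.random(n_samples))
          theta = 2.0 * np.pi * rng.random(n_samples)
          x, y = r * np.cos(theta), r * np.sin(theta)
          ranges = altitude_m - surface_fn(x, y)        # nadir-looking geometry
          t_ns = 2.0 * ranges / C * 1e9                  # two-way travel times
          bins = np.arange(t_ns.min(), t_ns.max() + bin_ns, bin_ns)
          counts, edges = np.histogram(t_ns, bins=bins)
          return edges[:-1], counts

      if __name__ == "__main__":
          t, w = return_waveform(100_000.0, 75.0, crater_height)
          print("return spread over %.1f ns" % (t[-1] - t[0]))

    A full simulator would additionally convolve this histogram with the transmitted pulse shape and the detector response, and add background and detector noise.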

  14. Thermo Scientific Sulfur Dioxide Analyzer Instrument Handbook

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Springston, S. R.

    The Sulfur Dioxide Analyzer measures sulfur dioxide based on absorbance of UV light at one wavelength by SO2 molecules, which then decay to a lower energy state by emitting UV light at a longer wavelength. Specifically, SO2 + hν1 → SO2* → SO2 + hν2. The emitted light is proportional to the concentration of SO2 in the optical cell. External communication with the analyzer is available through an Ethernet port configured through the instrument network of the AOS systems. The Model 43i-TLE is part of the i-series of Thermo Scientific instruments. The i-series instruments are designed to interface with external computers through the proprietary Thermo Scientific iPort Software. However, this software is somewhat cumbersome and inflexible. BNL has written an interface program in National Instruments LabView that both controls the Model 43i-TLE Analyzer and queries the unit for all measurement and housekeeping data. The LabView vi (the software program written by BNL) ingests all raw data from the instrument and outputs raw data files in a uniform data format similar to other instruments in the AOS and described more fully in Section 6.0 below.

  15. SAO mission support software and data standards, version 1.0

    NASA Technical Reports Server (NTRS)

    Hsieh, P.

    1993-01-01

    This document defines the software developed by the SAO AXAF Mission Support (MS) Program and defines standards for the software development process and control of data products generated by the software. The SAO MS is tasked to develop and use software to perform a variety of functions in support of the AXAF mission. Software is developed by software engineers and scientists, and commercial off-the-shelf (COTS) software is used either directly or customized through the use of scripts to implement analysis procedures. Software controls real-time laboratory instruments, performs data archiving, displays data, and generates model predictions. Much software is used in the analysis of data to generate data products that are required by the AXAF project, for example, on-orbit mirror performance predictions or detailed characterization of the mirror reflection performance with energy.

  16. Research and realization of signal simulation on virtual instrument

    NASA Astrophysics Data System (ADS)

    Zhao, Qi; He, Wenting; Guan, Xiumei

    2010-02-01

    In the engineering project, an arbitrary waveform generator controlled through a software interface is needed for simulation and test. This article discusses a program that uses the SCPI (Standard Commands for Programmable Instruments) protocol and the VISA (Virtual Instrument Software Architecture) library to control the Agilent signal generator (Agilent N5182A) through instrument communication over the LAN interface. The program can generate several signal types, such as CW (continuous wave), AM (amplitude modulation), FM (frequency modulation), ΦM (phase modulation), and sweep. As a result, the program has good operability and portability.
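
    The same SCPI-over-VISA approach can be sketched in Python with the PyVISA package. The resource address below is a placeholder, and the SCPI strings are typical signal-generator commands that should be checked against the N5182A programming guide rather than taken as the article's exact command set.

      # SCPI control over LAN via PyVISA; address and command set are examples.
      import pyvisa

      rm = pyvisa.ResourceManager()
      # VXI-11/LAN resource string of the form TCPIP0::<ip-address>::inst0::INSTR
      sig_gen = rm.open_resource("TCPIP0::192.168.1.50::inst0::INSTR")
      sig_gen.timeout = 5000  # ms

      print(sig_gen.query("*IDN?"))          # identification string

      sig_gen.write("FREQ 1 GHz")            # carrier frequency
      sig_gen.write("POW -10 dBm")           # output level
      sig_gen.write("AM:STAT ON")            # enable amplitude modulation
      sig_gen.write("OUTP ON")               # RF output on

      print("errors:", sig_gen.query("SYST:ERR?"))
      sig_gen.close()
      rm.close()

    Because VISA abstracts the transport, the same script would work over GPIB or USB by changing only the resource string.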

  17. The development procedures and tools applied for the attitude control software of the Italian satellite SAX

    NASA Astrophysics Data System (ADS)

    Hameetman, G. J.; Dekker, G. J.

    1993-11-01

    The Italian satellite SAX (with a large Dutch contribution) is a scientific satellite whose mission is to study X-ray (roentgen) sources. One main requirement for the Attitude and Orbit Control Subsystem (AOCS) is to achieve and maintain a stable pointing accuracy with a limit cycle of less than 90 arcsec during pointings of up to 28 hours. The main SAX instrument, the Narrow Field Instrument, is highly sensitive to (indirect) radiation coming from the Sun. This sensitivity leads to another main requirement: under no circumstances may the safe attitude domain be left. The paper describes the application software in relation to the overall SAX AOCS subsystem, the CASE tools that were used during development, some advantages and disadvantages of the use of these tools, the measures taken to meet the somewhat conflicting requirements of reliability and flexibility, and the lessons learned during development. The quality of the development approach was demonstrated by the (separately executed) hardware/software integration tests, during which a negligible number of software errors was detected in the application software.

  18. LabVIEW-based control software for para-hydrogen induced polarization instrumentation.

    PubMed

    Agraz, Jose; Grunfeld, Alexander; Li, Debiao; Cunningham, Karl; Willey, Cindy; Pozos, Robert; Wagner, Shawn

    2014-04-01

    The elucidation of cell metabolic mechanisms is the modern underpinning of the diagnosis, treatment, and in some cases the prevention of disease. Para-Hydrogen induced polarization (PHIP) enhances magnetic resonance imaging (MRI) signals over 10,000 fold, allowing for the MRI of cell metabolic mechanisms. This signal enhancement is the result of hyperpolarizing endogenous substances used as contrast agents during imaging. PHIP instrumentation hyperpolarizes Carbon-13 ((13)C) based substances using a process requiring control of a number of factors: chemical reaction timing, gas flow, monitoring of a static magnetic field (Bo), radio frequency (RF) irradiation timing, reaction temperature, and gas pressures. Current PHIP instruments control the hyperpolarization process manually, which precludes precise control of the factors listed above and leads to non-reproducible results. We discuss the design and implementation of a LabVIEW based computer program that automatically and precisely controls the delivery and manipulation of gases and samples, monitoring gas pressures, environmental temperature, and RF sample irradiation. We show that the automated control over the hyperpolarization process results in the hyperpolarization of hydroxyethylpropionate. The implementation of this software enables fast prototyping of PHIP instrumentation for the evaluation of a myriad of (13)C based endogenous contrast agents used in molecular imaging.

  19. LabVIEW-based control software for para-hydrogen induced polarization instrumentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agraz, Jose, E-mail: joseagraz@ucla.edu; Grunfeld, Alexander; Li, Debiao

    2014-04-15

    The elucidation of cell metabolic mechanisms is the modern underpinning of the diagnosis, treatment, and in some cases the prevention of disease. Para-Hydrogen induced polarization (PHIP) enhances magnetic resonance imaging (MRI) signals over 10 000 fold, allowing for the MRI of cell metabolic mechanisms. This signal enhancement is the result of hyperpolarizing endogenous substances used as contrast agents during imaging. PHIP instrumentation hyperpolarizes Carbon-13 ((13)C) based substances using a process requiring control of a number of factors: chemical reaction timing, gas flow, monitoring of a static magnetic field (Bo), radio frequency (RF) irradiation timing, reaction temperature, and gas pressures. Current PHIP instruments control the hyperpolarization process manually, which precludes precise control of the factors listed above and leads to non-reproducible results. We discuss the design and implementation of a LabVIEW based computer program that automatically and precisely controls the delivery and manipulation of gases and samples, monitoring gas pressures, environmental temperature, and RF sample irradiation. We show that the automated control over the hyperpolarization process results in the hyperpolarization of hydroxyethylpropionate. The implementation of this software enables fast prototyping of PHIP instrumentation for the evaluation of a myriad of (13)C based endogenous contrast agents used in molecular imaging.

  20. A new approach for instrument software at Gemini

    NASA Astrophysics Data System (ADS)

    Gillies, Kim; Nunez, Arturo; Dunn, Jennifer

    2008-07-01

    Gemini Observatory is now developing its next generation of astronomical instruments, the Aspen instruments. These new instruments are sophisticated and costly, requiring large, distributed, collaborative teams. Instrument software groups often include experienced team members with existing mature code. Gemini has taken its experience from the previous generation of instruments and current hardware and software technology to create an approach for developing instrument software that takes advantage of the strengths of our instrument builders and our own operations needs. This paper describes this new software approach that couples a lightweight infrastructure and software library with aspects of modern agile software development. The Gemini Planet Imager instrument project, which is currently approaching its critical design review, is used to demonstrate aspects of this approach. New facilities under development will face similar issues in the future, and the approach presented here can be applied to other projects.

  1. The TI-99/4A Software.

    ERIC Educational Resources Information Center

    Wrege, Rachael; And Others

    1982-01-01

    Describes the software modules produced by Texas Instruments for use with the TI-99/4A home computer. Among the modules described are: Personal Real Estate, Programing Aids, Home Financial Decisions, Music Maker, Weight Control and Nutrition, Early Learning Fun, and Tax/Investment Record Keeping. (JL)

  2. Applying Computer-Assisted Musical Instruction to Music Appreciation Course: An Example with Chinese Musical Instruments

    ERIC Educational Resources Information Center

    Lou, Shi-Jer; Guo, Yuan-Chang; Zhu, Yi-Zhen; Shih, Ru-Chu; Dzan, Wei-Yuan

    2011-01-01

    This study aims to explore the effectiveness of computer-assisted musical instruction (CAMI) in the Learning Chinese Musical Instruments (LCMI) course. The CAMI software for Chinese musical instruments was developed and administered to 228 students in a vocational high school. A pretest-posttest non-equivalent control group design with three…

  3. PScan 1.0: flexible software framework for polygon based multiphoton microscopy

    NASA Astrophysics Data System (ADS)

    Li, Yongxiao; Lee, Woei Ming

    2016-12-01

    Multiphoton laser scanning microscopes exhibit highly localized nonlinear optical excitation and are powerful instruments for in-vivo deep tissue imaging. Customized multiphoton microscopy offers significantly superior performance for in-vivo imaging because of precise control over the scanning and detection system. To date, several flexible software platforms have catered to custom-built microscopy systems (e.g., ScanImage, HelioScan, MicroManager) that perform at imaging speeds of 30-100 fps. In this paper, we describe a flexible software framework for high-speed imaging systems capable of operating from 5 fps to 1600 fps. The software is based on the MATLAB image processing toolbox. It can communicate directly with a high-performance imaging card (Matrox Solios eA/XA), thus retaining high-speed acquisition. The program is also designed to communicate with LabVIEW and Fiji for instrument control and image processing. PScan 1.0 can handle high imaging rates and is flexible enough for users to adapt it to their own high-speed imaging systems.

  4. Membrane Transfer Phenomena (MTP)

    NASA Technical Reports Server (NTRS)

    Mason, Larry

    1996-01-01

    Progress has been made in several areas of the definition, design, and development of the Membrane Transport Apparatus (MTA) instrument and associated sensors and systems. Progress is also reported in the development of software modules for instrument control, experimental image and data acquisition, and data analysis.

  5. Controlling CAMAC instrumentation through the USB port

    NASA Astrophysics Data System (ADS)

    Ribas, R. V.

    2012-02-01

    A programmable device to interface CAMAC instrumentation to the USB port of computers, without the need for heavy, noisy and expensive CAMAC crates, is described in this article. Up to four single-width modules can be used. Also, all software necessary for a multi-parametric data acquisition system was developed. A standard crate controller based on the same project is being designed.

  6. The Development of Point Doppler Velocimeter Data Acquisition and Processing Software

    NASA Technical Reports Server (NTRS)

    Cavone, Angelo A.

    2008-01-01

    In order to develop efficient and quiet aircraft and validate Computational Fluid Dynamics predictions, aerodynamic researchers require flow parameter measurements to characterize flow fields about wind tunnel models and jet flows. A one-component Point Doppler Velocimeter (pDv), a non-intrusive, laser-based instrument, was constructed using a design/develop/test/validate/deploy approach. A primary component of the instrument is the software required for system control/management and data collection/reduction. This software, along with evaluation algorithms, advanced pDv from a laboratory curiosity to a production-level instrument. Simultaneous pDv and pitot-probe velocity measurements obtained at the centerline of a flow exiting a two-inch jet matched within 0.4%. Flow turbulence spectra obtained with pDv and a hot-wire detected, with equal dynamic range, the primary and secondary harmonics produced by the fan driving the flow. Novel hardware and software methods were developed, tested, and incorporated into the system to eliminate or minimize error sources and improve system reliability.

  7. HERMES travels by CAN bus

    NASA Astrophysics Data System (ADS)

    Waller, Lewis G.; Shortridge, Keith; Farrell, Tony J.; Vuong, Minh; Muller, Rolf; Sheinis, Andrew I.

    2014-07-01

    The new HERMES spectrograph represents the first foray by AAO into the use of commercial off-the-shelf industrial field bus technology for instrument control, and we regard the final system, with its relatively simple wiring requirements, as a great success. However, both software and hardware teams had to work together to solve a number of problems integrating the chosen CANopen/CAN bus system into our normal observing systems. A Linux system running in an industrial PC chassis ran the HERMES control software, using a PCI CAN bus interface connected to a number of distributed CANopen/CAN bus I/O devices and servo amplifiers. In the main, the servo amplifiers performed impressively, although some experimentation with homing algorithms was required, and we hit a significant hurdle when we discovered that we needed to disable some of the encoders used during observations; we learned a lot about how servo amplifiers respond when their encoders are turned off, and about how encoders react to losing power. The software was based around a commercial CANopen library from Copley Controls. Early worries about how this heavily multithreaded library would work with our standard data acquisition system led to the development of a very low-level CANopen software simulator to verify the design. This also enabled the software group to develop and test almost all the control software well in advance of the construction of the hardware. In the end, the instrument went from initial installation at the telescope to successful commissioning remarkably smoothly.
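
    A low-level CANopen exchange of the kind the HERMES simulator had to reproduce can be sketched with the python-can package, as below. The node ID is invented, and the object index (0x6064, "position actual value") is taken from the generic CiA 402 drive profile; the real system used the Copley Controls CANopen library rather than this approach.

    ```python
    # Minimal CANopen SDO "expedited upload" (read) over SocketCAN with python-can.
    import struct
    import can

    NODE_ID = 0x05                              # hypothetical servo-amplifier node

    bus = can.interface.Bus(channel="can0", interface="socketcan")

    # SDO upload request: command 0x40, index 0x6064 (position actual value), sub-index 0
    index, sub = 0x6064, 0x00
    req = can.Message(arbitration_id=0x600 + NODE_ID,
                      data=[0x40, index & 0xFF, index >> 8, sub, 0, 0, 0, 0],
                      is_extended_id=False)
    bus.send(req)

    resp = bus.recv(timeout=1.0)
    if resp is not None and resp.arbitration_id == 0x580 + NODE_ID:
        position = struct.unpack_from("<i", bytes(resp.data), 4)[0]
        print(f"actual position: {position} counts")
    bus.shutdown()
    ```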

  8. Software for the EVLA

    NASA Astrophysics Data System (ADS)

    Butler, Bryan J.; van Moorsel, Gustaaf; Tody, Doug

    2004-09-01

    The Expanded Very Large Array (EVLA) project is the next-generation instrument for high-resolution long-millimeter to short-meter wavelength radio astronomy. It is currently funded by NSF, with completion scheduled for 2012. The EVLA will upgrade the VLA with new feeds, receivers, data transmission hardware, a correlator, and a new software system to enable the instrument to achieve its full potential. This software includes both that required for controlling and monitoring the instrument and that involved with the scientific dataflow. We concentrate here on a portion of the dataflow software, including: proposal preparation, submission, and handling; observation preparation, scheduling, and remote monitoring; data archiving; and data post-processing, including both automated (pipeline) and manual processing. The primary goals of the software are to maximize the scientific return of the EVLA; to provide ease of use for both novices and experts; and to exploit commonality amongst all NRAO telescopes where possible. This last point is both a bane and a blessing: we are not at liberty to do whatever we want in the software, but on the other hand we may borrow from other projects (notably ALMA and GBT) where appropriate. The software design methodology includes detailed initial use-cases and requirements from the scientists, intimate interaction between the scientists and the programmers during design and implementation, and a thorough testing and acceptance plan.

  9. Spectrophotometers for plutonium monitoring in HB-line

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lascola, R. J.; O'Rourke, P. E.; Kyser, E. A.

    2016-02-12

    This report describes the equipment, control software, calibrations for total plutonium and plutonium oxidation state, and qualification studies for the instrument. It also provides a detailed description of the uncertainty analysis, which includes source terms associated with plutonium calibration standards, instrument drift, and inter-instrument variability. Also included are work instructions for instrument, flow cell, and optical fiber setup, work instructions for routine maintenance, and drawings and schematic diagrams.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Springston, Stephen R.

    The Sulfur Dioxide Analyzer measures sulfur dioxide based on absorbance of UV light at one wavelength by SO2 molecules, which then decay to a lower energy state by emitting UV light at a longer wavelength. Specifically, SO2 + hν1 → SO2* → SO2 + hν2. The emitted light is proportional to the concentration of SO2 in the optical cell. External communication with the analyzer is available through an Ethernet port configured through the instrument network of the AOS systems. The Model 43i-TLE is part of the i-series of Thermo Scientific instruments. The i-series instruments are designed to interface with external computers through the proprietary Thermo Scientific iPort Software. However, this software is somewhat cumbersome and inflexible. Brookhaven National Laboratory (BNL) has written an interface program in National Instruments LabView that both controls the Model 43i-TLE Analyzer and queries the unit for all measurement and housekeeping data. The LabView vi (the software program written by BNL) ingests all raw data from the instrument and outputs raw data files in a uniform data format similar to other instruments in the AOS and described more fully in Section 6.0 below.
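
    The general pattern of querying an Ethernet-connected analyzer can be sketched in Python with the standard socket module, as below. The IP address, port number, and command string are placeholders, not the documented Thermo protocol; the BNL interface described above is implemented in LabVIEW.

    ```python
    # Poll an Ethernet-connected gas analyzer over TCP (all values are placeholders).
    import socket

    ANALYZER_ADDR = ("192.168.1.50", 9880)      # hypothetical IP and port

    def query(command: str) -> str:
        """Send one command line and return the analyzer's reply as text."""
        with socket.create_connection(ANALYZER_ADDR, timeout=2.0) as sock:
            sock.sendall((command + "\r").encode("ascii"))
            return sock.recv(1024).decode("ascii").strip()

    if __name__ == "__main__":
        print("SO2 reading:", query("so2"))     # placeholder command name
    ```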

  11. Remote Control and Data Acquisition: A Case Study

    NASA Technical Reports Server (NTRS)

    DeGennaro, Alfred J.; Wilkinson, R. Allen

    2000-01-01

    This paper details software tools developed to remotely command experimental apparatus and to acquire and visualize the associated data in soft real time. The work was undertaken because commercial products failed to meet the needs. This work has identified six key factors intrinsic to the development of quality research laboratory software. Capabilities include access to all new instrument functions without any programming or dependence on others to write drivers or virtual instruments, a simple full-screen text-based experiment configuration and control user interface, months of continuous experiment run times, on the order of 1% CPU load for the condensed matter physics experiment described here, very little imposition of software tool choices on remote users, and total remote control from anywhere in the world over the Internet or from home on a 56 Kb modem, as if the user were sitting in the laboratory. This work yielded a set of simple robust tools that are highly reliable, resource conserving, extensible, and versatile, with a uniform simple interface.

  12. Advanced communications technology satellite high burst rate link evaluation terminal power control and rain fade software test plan, version 1.0

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.

    1993-01-01

    The Power Control and Rain Fade Software was developed at the NASA Lewis Research Center to support the Advanced Communications Technology Satellite High Burst Rate Link Evaluation Terminal (ACTS HBR-LET). The HBR-LET is an experimenters terminal to communicate with the ACTS for various experiments by government, university, and industry agencies. The Power Control and Rain Fade Software is one segment of the Control and Performance Monitor (C&PM) Software system of the HBR-LET. The Power Control and Rain Fade Software automatically controls the LET uplink power to compensate for signal fades. Besides power augmentation, the C&PM Software system is also responsible for instrument control during HBR-LET experiments, control of the Intermediate Frequency Switch Matrix on board the ACTS to yield a desired path through the spacecraft payload, and data display. The Power Control and Rain Fade Software User's Guide, Version 1.0 outlines the commands and procedures to install and operate the Power Control and Rain Fade Software. The Power Control and Rain Fade Software Maintenance Manual, Version 1.0 is a programmer's guide to the Power Control and Rain Fade Software. This manual details the current implementation of the software from a technical perspective. Included is an overview of the Power Control and Rain Fade Software, computer algorithms, format representations, and computer hardware configuration. The Power Control and Rain Fade Test Plan provides a step-by-step procedure to verify the operation of the software using a predetermined signal fade event. The Test Plan also provides a means to demonstrate the capability of the software.
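
    As a purely illustrative example of the kind of closed-loop logic such software performs, the Python sketch below raises uplink power when the measured downlink level drops below a reference (a rain fade) and lowers it when the fade clears. The gains, deadband, and limits are invented for the example; the actual HBR-LET algorithm is specified in the maintenance manual cited above.

    ```python
    # Toy uplink power control loop compensating a measured signal fade.
    def update_uplink_power(current_dbm, measured_dbm, reference_dbm,
                            step_db=0.5, max_dbm=20.0, min_dbm=0.0):
        fade_db = reference_dbm - measured_dbm          # positive during a fade
        if fade_db > 1.0:                               # 1 dB deadband
            current_dbm = min(current_dbm + step_db, max_dbm)
        elif fade_db < -1.0:
            current_dbm = max(current_dbm - step_db, min_dbm)
        return current_dbm

    power = 10.0
    for beacon in [-50.0, -53.0, -56.0, -52.0, -50.0]:  # simulated beacon levels (dBm)
        power = update_uplink_power(power, beacon, reference_dbm=-50.0)
        print(f"beacon {beacon:6.1f} dBm -> uplink power {power:4.1f} dBm")
    ```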

  13. Integration of the instrument control electronics for the ESPRESSO spectrograph at ESO-VLT

    NASA Astrophysics Data System (ADS)

    Baldini, V.; Calderone, G.; Cirami, R.; Coretti, I.; Cristiani, S.; Di Marcantonio, P.; Mégevand, D.; Riva, M.; Santin, P.

    2016-07-01

    ESPRESSO, the Echelle SPectrograph for Rocky Exoplanets and Stable Spectroscopic Observations for the ESO Very Large Telescope, is now in its integration phase. The large number of functions of this complex instrument is fully controlled by a Beckhoff PLC-based control electronics architecture. Four small cabinets and one large cabinet host the main electronic parts that control all the sensors, motorized stages, and other analogue and digital functions of ESPRESSO. The Instrument Control Electronics (ICE) is built following the latest ESO standards and requirements. Two main PLC CPUs are used and are programmed through the dedicated Beckhoff TwinCAT software. The assembly, integration, and verification phase of ESPRESSO is quite challenging because of its distributed nature and the different geographical locations of the consortium partners. After the preliminary assembly and test of the electronic components at the Astronomical Observatory of Trieste and the test of some electronics and software parts at ESO (Garching), the complete system for the control of the four Front End Unit (FEU) arms of ESPRESSO was fully assembled and tested in Merate (Italy) at the beginning of 2016. After these first tests, the system will be located at the Geneva Observatory (Switzerland) until the Preliminary Acceptance Europe (PAE) and finally shipped to Chile for commissioning. This paper describes the integration strategy of the ICE work package of ESPRESSO and the hardware and software tests that have been performed, with an overall view of the experience gained during these project phases.
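
    For readers unfamiliar with TwinCAT-based control electronics, the Python sketch below shows a generic way of reading and writing a Beckhoff PLC variable over ADS with the pyads package. The AMS Net ID and symbol names are placeholders; ESPRESSO's instrument software actually follows the ESO VLT control standards rather than this ad hoc approach.

    ```python
    # Generic read/write of TwinCAT PLC symbols over ADS (names are placeholders).
    import pyads

    plc = pyads.Connection("192.168.10.20.1.1", pyads.PORT_TC3PLC1)
    plc.open()
    try:
        temp = plc.read_by_name("MAIN.fBenchTemperature", pyads.PLCTYPE_REAL)
        print(f"bench temperature: {temp:.2f} C")
        plc.write_by_name("MAIN.bShutterOpen", True, pyads.PLCTYPE_BOOL)
    finally:
        plc.close()
    ```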

  14. The Future of Data Reduction at UKIRT

    NASA Astrophysics Data System (ADS)

    Economou, F.; Bridger, A.; Wright, G. S.; Rees, N. P.; Jenness, T.

    The Observatory Reduction and Acquisition Control (ORAC) project is a comprehensive re-implementation of all existing instrument user interfaces and data handling software involved at the United Kingdom Infrared Telescope (UKIRT). This paper addresses the design of the data reduction part of the system. Our main aim is to provide data reduction facilities for the new generation of UKIRT instruments of a similar standard to our current software packages, which have enjoyed success because of their science-driven approach. Additionally we wish to use modern software techniques in order to produce a system that is portable, flexible and extensible so as to have modest maintenance requirements, both in the medium and the longer term.

  15. Global Precipitation Measurement (GPM) Safety Inhibit Timeline Tool

    NASA Technical Reports Server (NTRS)

    Dion, Shirley

    2012-01-01

    The Global Precipitation Measurement (GPM) Observatory is a joint mission under a partnership between the National Aeronautics and Space Administration (NASA) and the Japan Aerospace Exploration Agency (JAXA). The NASA Goddard Space Flight Center (GSFC) has the lead management responsibility for NASA on GPM. The GPM program will measure precipitation on a global basis with sufficient quality, Earth coverage, and sampling to improve prediction of the Earth's climate, weather, and specific components of the global water cycle. As part of the development process, NASA built the spacecraft in-house at GSFC and provided one instrument (the GPM Microwave Imager (GMI), developed by Ball Aerospace); JAXA provided the launch vehicle (H2-A, by MHI) and one instrument (the Dual-Frequency Precipitation Radar (DPR), developed by NT Space). Each instrument developer provided a safety assessment, which was incorporated into the NASA GPM Safety Hazard Assessment. Inhibit design was reviewed for hazardous subsystems, which included the High Gain Antenna System (HGAS) deployment, solar array deployment, transmitter turn-on, propulsion system release, GMI deployment, and DPR radar turn-on. The safety inhibits for these listed hazards are controlled by software. GPM developed a "pathfinder" approach for reviewing software that controls the electrical inhibits; this is one of the first GSFC in-house programs to use software controls extensively. The GPM safety team developed a methodology to document software safety as part of the standard hazard report. As part of this process, a new tool, the safety inhibit timeline, was created for managing inhibits and their controls during spacecraft buildup and testing during I&T at GSFC and at the range in Japan. In addition to clarifying inhibits and controls during I&T, the tool allows the safety analyst to better communicate the changes in inhibit states with each phase of hardware and software testing. The tool was very useful for communicating compliance with safety requirements, especially when working with a foreign partner.

  16. International Instrumentation Symposium, 34th, Albuquerque, NM, May 2-6, 1988, Proceedings

    NASA Astrophysics Data System (ADS)

    Various papers on aerospace instrumentation are presented. The general topics addressed include: blast and shock, wind tunnel instrumentations and controls, digital/optical sensors, software design/development, special test facilities, fiber optic techniques, electro/fiber optical measurement systems, measurement uncertainty, real time systems, pressure. Also discussed are: flight test and avionics instrumentation, data acquisition techniques, computer applications, thermal force and displacement, science and government, modeling techniques, reentry vehicle testing, strain and pressure.

  17. Macintosh/LabVIEW based control and data acquisition system for a single photon counting fluorometer

    NASA Astrophysics Data System (ADS)

    Stryjewski, Wieslaw J.

    1991-08-01

    A flexible software system has been developed for controlling fluorescence decay measurements using the virtual instrument approach offered by LabVIEW. The time-correlated single photon counting instrument operates under computer control in both manual and automatic mode. Implementation time was short and the equipment is now easier to use, reducing the training time required for new investigators. It is not difficult to customize the front panel or adapt the program to a different instrument. We found LabVIEW much more convenient to use for this application than traditional, textual computer languages.

  18. Automatic sample changer control software for automation of neutron activation analysis process in Malaysian Nuclear Agency

    NASA Astrophysics Data System (ADS)

    Yussup, N.; Ibrahim, M. M.; Rahman, N. A. A.; Mokhtar, M.; Salim, N. A. A.; Soh@Shaari, S. C.; Azman, A.; Lombigit, L.; Azman, A.; Omar, S. A.

    2018-01-01

    Most of the procedures in the neutron activation analysis (NAA) process established at the Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s have been performed manually. These manual procedures, carried out by the NAA laboratory personnel, are time consuming and inefficient, especially for the sample counting and measurement process: the sample needs to be changed and the measurement software needs to be set up for every one-hour counting period, and both of these steps are performed manually for every sample. Hence, an automatic sample changer system (ASC) consisting of hardware and software was developed to automate the sample counting process for up to 30 samples consecutively. This paper describes the ASC control software for the NAA process, which is designed and developed to control the ASC hardware and to call the GammaVision software for sample measurement. The software is developed using the National Instruments LabVIEW development package.
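
    The sequencing idea (position a sample, start a counting run, wait, advance) can be sketched in Python as below. The serial commands and the spectroscopy command line are placeholders; the system described above is written in LabVIEW and invokes GammaVision for the measurement itself.

    ```python
    # Sequencing sketch for an automatic sample changer (commands are hypothetical).
    import subprocess
    import time
    import serial

    COUNT_TIME_S = 3600                              # one-hour counting time per sample

    def move_to_position(port, position):
        port.write(f"MOVE {position}\r".encode())    # hypothetical changer command
        port.readline()                              # wait for "OK" from the controller

    with serial.Serial("COM3", 9600, timeout=120) as changer:
        for sample in range(1, 31):                  # up to 30 samples consecutively
            move_to_position(changer, sample)
            subprocess.run(["gamma_acquire.exe",     # placeholder measurement command
                            "--live-time", str(COUNT_TIME_S),
                            "--output", f"sample_{sample:02d}.spc"],
                           check=True)
            time.sleep(5)                            # settle before the next move
    ```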

  19. ICESat (GLAS) Science Processing Software Document Series. Volume 1; Science Software Management Plan; 3.0

    NASA Technical Reports Server (NTRS)

    Hancock, David W., III

    1999-01-01

    This document provides the Software Management Plan for the GLAS Standard Data Software (SDS) supporting the GLAS instrument of the EOS ICESat Spacecraft. The SDS encompasses the ICESat Science Investigator-led Processing System (I-SIPS) Software and the Instrument Support Terminal (IST) Software. For the I-SIPS Software, the SDS will produce Level 0, Level 1, and Level 2 data products as well as the associated product quality assessments and descriptive information. For the IST Software, the SDS will accommodate the GLAS instrument support areas of engineering status, command, performance assessment, and instrument health status.

  20. Real-time solar magnetograph operation system software design and user's guide

    NASA Technical Reports Server (NTRS)

    Wang, C.

    1984-01-01

    The Real Time Solar Magnetograph (RTSM) operation system software design on the PDP-11/23+ is presented along with the User's Guide. The RTSM operation software provides real-time instrumentation control, data collection, and data management. The data are used for vector analysis, plotting, or graphics display. The processed data are then easily compared with solar data from other sources, such as the Solar Maximum Mission (SMM).

  1. LV software support for supersonic flow analysis

    NASA Technical Reports Server (NTRS)

    Bell, William A.

    1991-01-01

    During 1991, the software developed allowed an operator to configure and checkout the TSI, Inc. laser velocimeter (LV) system prior to a run. This setup procedure established the operating conditions for the TSI MI-990 multichannel interface and the RMR-1989 rotating machinery resolver. In addition to initializing the instruments, the software package provides a means of specifying LV calibration constants, controlling the sampling process, and identifying the test parameters.

  2. Single-crystal diffraction instrument TriCS at SINQ

    NASA Astrophysics Data System (ADS)

    Schefer, J.; Könnecke, M.; Murasik, A.; Czopnik, A.; Strässle, Th; Keller, P.; Schlumpf, N.

    2000-03-01

    The single-crystal diffractometer TriCS at the Swiss Continuous Spallation Source (SINQ) is presently in the commissioning phase. A two-dimensional wire detector produced by EMBL was delivered in March 1999. The instrument is presently tested with a single detector. First measurements on magnetic structures have been performed. The instrument is remotely controlled using JAVA-based software and a UNIX DEC-α host computer.

  3. Simultaneous control of multiple instruments at the Advanced Technology Solar Telescope

    NASA Astrophysics Data System (ADS)

    Johansson, Erik M.; Goodrich, Bret

    2012-09-01

    The Advanced Technology Solar Telescope (ATST) is a 4-meter solar observatory under construction at Haleakala, Hawaii. The simultaneous use of multiple instruments is one of the unique capabilities that makes the ATST a premier ground based solar observatory. Control of the instrument suite is accomplished by the Instrument Control System (ICS), a layer of software between the Observatory Control System (OCS) and the instruments. The ICS presents a single narrow interface to the OCS and provides a standard interface for the instruments to be controlled. It is built upon the ATST Common Services Framework (CSF), an infrastructure for the implementation of a distributed control system. The ICS responds to OCS commands and events, coordinating and distributing them to the various instruments while monitoring their progress and reporting the status back to the OCS. The ICS requires no specific knowledge about the instruments. All information about the instruments used in an experiment is passed by the OCS to the ICS, which extracts and forwards the parameters to the appropriate instrument controllers. The instruments participating in an experiment define the active instrument set. A subset of those instruments must complete their observing activities in order for the experiment to be considered complete and are referred to as the must-complete instrument set. In addition, instruments may participate in eavesdrop mode, outside of the control of the ICS. All instrument controllers use the same standard narrow interface, which allows new instruments to be added without having to modify the interface or any existing instrument controllers.
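
    The "narrow interface" idea can be illustrated with a small Python sketch: every instrument controller exposes the same minimal API, and the ICS forwards experiment parameters to the active set while waiting only for the must-complete subset. The class, method, and instrument names below are illustrative; the real ICS is built on the ATST Common Services Framework.

    ```python
    # Sketch of a narrow, uniform instrument-controller interface and an ICS dispatcher.
    class InstrumentController:
        def __init__(self, name):
            self.name = name

        def configure(self, params):
            print(f"{self.name}: configured with {params}")

        def observe(self):
            print(f"{self.name}: observation done")
            return True

    def run_experiment(controllers, experiment):
        active = {n: c for n, c in controllers.items() if n in experiment["active"]}
        for name, ctrl in active.items():
            ctrl.configure(experiment["params"].get(name, {}))
        results = {name: ctrl.observe() for name, ctrl in active.items()}
        # The experiment is complete once every must-complete instrument has finished.
        return all(results[name] for name in experiment["must_complete"])

    controllers = {n: InstrumentController(n) for n in ("IMAGER", "SPECTROGRAPH", "POLARIMETER")}
    experiment = {"active": ["IMAGER", "SPECTROGRAPH"], "must_complete": ["IMAGER"],
                  "params": {"IMAGER": {"exposure_s": 0.1}, "SPECTROGRAPH": {"grating": 1}}}
    print("complete:", run_experiment(controllers, experiment))
    ```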

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kertesz, Vilmos

    Selection of locations of interest for liquid microjunction surface sampling coupled to subsequent analysis is done in a user-friendly way. That information is then transferred to the instrument control software. In addition, readout of a laser sensor allows for robust probe-to-surface distance measurement. Furthermore, pictures taken by the software from a camera provide feedback for judging whether microjunction sampling was successful.

  5. Status of the JWST Science Instrument Payload

    NASA Technical Reports Server (NTRS)

    Greenhouse, Matt

    2016-01-01

    The James Webb Space Telescope (JWST) Integrated Science Instrument Module (ISIM) system consists of five sensors (4 science): Mid-Infrared Instrument (MIRI), Near Infrared Imager and Slitless Spectrograph (NIRISS), Fine Guidance Sensor (FGS), Near InfraRed Camera (NIRCam), Near InfraRed Spectrograph (NIRSpec); and nine instrument support systems: Optical metering structure system, Electrical Harness System; Harness Radiator System, ISIM Electronics Compartment, ISIM Remote Services Unit, Cryogenic Thermal Control System, Command and Data Handling System, Flight Software System, Operations Scripts System.

  6. Software for Testing Electroactive Structural Components

    NASA Technical Reports Server (NTRS)

    Moses, Robert W.; Fox, Robert L.; Dimery, Archie D.; Bryant, Robert G.; Shams, Qamar

    2003-01-01

    A computer program generates a graphical user interface that, in combination with its other features, facilitates the acquisition and preprocessing of experimental data on the strain response, hysteresis, and power consumption of a multilayer composite-material structural component containing one or more built-in sensor(s) and/or actuator(s) based on piezoelectric materials. This program runs in conjunction with LabVIEW software in a computer-controlled instrumentation system. For a test, a specimen is instrumented with applied-voltage and current sensors and with strain gauges. Once the computational connection to the test setup has been made via the LabVIEW software, this program causes the test instrumentation to step through specified configurations. If the user is satisfied with the test results as displayed by the software, the user activates an icon on a front-panel display, causing the raw current, voltage, and strain data to be digitized and saved. The data are also put into a spreadsheet and can be plotted on a graph. Graphical displays are saved in an image file for future reference. The program also computes and displays the power and the phase angle between voltage and current.
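
    One way to compute the average power and voltage-current phase angle from digitized waveforms is sketched below in Python with NumPy, assuming a sinusoidal drive signal; the actual LabVIEW-based program may use a different estimator, and the signal values here are simulated.

    ```python
    # Average power and voltage-current phase angle from sampled waveforms.
    import numpy as np

    fs, f_drive = 50_000.0, 100.0                 # sample rate and drive frequency (Hz)
    t = np.arange(0, 0.1, 1.0 / fs)
    voltage = 50.0 * np.sin(2 * np.pi * f_drive * t)              # simulated records
    current = 0.02 * np.sin(2 * np.pi * f_drive * t - np.pi / 6)

    average_power = np.mean(voltage * current)                    # W

    # Phase angle taken from the FFT bin at the drive frequency.
    bin_index = int(round(f_drive * len(t) / fs))
    v_fft, i_fft = np.fft.rfft(voltage), np.fft.rfft(current)
    phase_deg = np.degrees(np.angle(v_fft[bin_index]) - np.angle(i_fft[bin_index]))

    print(f"average power: {average_power * 1e3:.2f} mW, phase angle: {phase_deg:.1f} deg")
    ```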

  7. Eight microprocessor-based instrument data systems in the Galileo Orbiter spacecraft

    NASA Technical Reports Server (NTRS)

    Barry, R. C.

    1980-01-01

    Each instrument data system consists of a microprocessor, 3K bytes of read-only memory, and 3K bytes of random-access memory. It interfaces with the spacecraft data bus through an isolated user interface with a direct memory access bus adaptor, and/or with parallel data from instrument devices such as registers, buffers, analog-to-digital converters, multiplexers, and solid state sensors. These data systems support the spacecraft hardware and software communication protocol, decode and process instrument commands, generate continuous instrument operating modes, control the instrument mechanisms, and acquire, process, format, and output instrument science data.

  8. LMJ Points Plus v2.6

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kertesz, Vilmos

    Short summary of the software's functionality: • built-in scan feature to acquire an optical image of the surface to be analyzed • click-and-point selection of points of interest on the surface • support for standalone autosampler/HPLC/MS operation: creating independent batch files, after points of interest are selected, for LEAPShell (autosampler control software from Leap Technologies) and Analyst® (mass spectrometry (MS) software from AB Sciex) • support for integrated autosampler/HPLC/MS operation: creating one batch file for all instruments controlled by Analyst® after points of interest are selected • creating heatmaps of analytes of interest from collected MS files in a hands-off fashion

  9. Astrobo: Towards a new observatory control system for the Garching Observatory 0.6m

    NASA Astrophysics Data System (ADS)

    Schweyer, T.; Jarmatz, P.; Burwitz, V.

    2016-12-01

    The recently installed Campus Observatory Garching (COG) 0.6m telescope features a wide array of instruments, including a wide-field imager and a variety of spectrographs. To support all these different instruments and improve time usage, it was decided to develop a new control system from scratch that will be able to observe safely both autonomously and manually (for student lab courses). It is built using a hierarchical microservice architecture, which allows well-specified communication between its components regardless of the programming language used. This modular design allows for fast prototyping of components as well as easy implementation of complex instrumentation control software.

  10. The Validation of a Software Evaluation Instrument.

    ERIC Educational Resources Information Center

    Schmitt, Dorren Rafael

    This study, conducted at six southern universities, analyzed the validity and reliability of a researcher developed instrument designed to evaluate educational software in secondary mathematics. The instrument called the Instrument for Software Evaluation for Educators uses measurement scales, presents a summary section of the evaluation, and…

  11. ARNICA and LonGSp: the refurbishment of two near infrared instruments

    NASA Astrophysics Data System (ADS)

    Koshida, Shintaro; Vanzi, Leonardo; Guzman, Dani; Leiva, Rodrigo; Bonati, Marco A.; Avilés, Roberto L.; Baffa, Carlo; Palla, Francesco; Mannucci, Filippo; Shen, Tzu Chiang; Suc, Vincent

    2014-07-01

    ARNICA and LonGSp are two NICMOS-based near-infrared instruments developed in the 1990s by the Astrophysical Observatory of Arcetri. More than 10 years after their decommissioning, we refurbished the two instruments with new read-out electronics and control software. We present the performance of the refurbished systems and compare it with their historic behavior. Both instruments are currently used for testing purposes in the lab and at the telescope, and we present some example applications.

  12. The software architecture to control the Cherenkov Telescope Array

    NASA Astrophysics Data System (ADS)

    Oya, I.; Füßling, M.; Antonino, P. O.; Conforti, V.; Hagge, L.; Melkumyan, D.; Morgenstern, A.; Tosti, G.; Schwanke, U.; Schwarz, J.; Wegner, P.; Colomé, J.; Lyard, E.

    2016-07-01

    The Cherenkov Telescope Array (CTA) project is an initiative to build two large arrays of Cherenkov gamma-ray telescopes. CTA will be deployed as two installations, one in the northern and the other in the southern hemisphere, containing dozens of telescopes of different sizes. CTA is a big step forward in the field of ground-based gamma-ray astronomy, not only because of the expected scientific return, but also due to the order-of-magnitude larger scale of the instrument to be controlled. The performance requirements associated with such a large and distributed astronomical installation require a thoughtful analysis to determine the best software solutions. The array control and data acquisition (ACTL) work package within the CTA initiative will deliver the software to control and acquire the data from the CTA instrumentation. In this contribution we present the current status of the formal ACTL system decomposition into software building blocks and the relationships among them. The system is modelled via the Systems Modelling Language (SysML) formalism. To cope with the complexity of the system, this architecture model is sub-divided into different perspectives. The relationships with the stakeholders and external systems are used to create the first perspective, the context of the ACTL software system. Use cases are employed to describe the interaction of those external elements with the ACTL system and are traced to a hierarchy of functionalities (abstract system functions) describing the internal structure of the ACTL system. These functions are then traced to fully specified logical elements (software components), whose deployment as technical elements is also described. This modelling approach allows us to decompose the ACTL software into elements to be created and to describe the flow of information within the system, providing a clear way to identify sub-system interdependencies. This architectural approach allows us to build the ACTL system model, trace requirements to deliverables (source code, documentation, etc.), and implement a flexible use-case-driven software development approach thanks to the traceability from use cases to the logical software elements. The ALMA Common Software (ACS) container/component framework, used for the control of the Atacama Large Millimeter/submillimeter Array (ALMA), is the basis for the ACTL software and as such is considered an integral part of the software architecture.

  13. Proteomics Quality Control: Quality Control Software for MaxQuant Results.

    PubMed

    Bielow, Chris; Mastrobuoni, Guido; Kempa, Stefan

    2016-03-04

    Mass spectrometry-based proteomics coupled to liquid chromatography has matured into an automated, high-throughput technology, producing data on the scale of multiple gigabytes per instrument per day. Consequently, automated quality control (QC) and quality analysis (QA) capable of detecting measurement bias, verifying consistency, and avoiding propagation of error are paramount for instrument operators and scientists in charge of downstream analysis. We have developed an R-based QC pipeline called Proteomics Quality Control (PTXQC) for bottom-up LC-MS data generated by the MaxQuant software pipeline. PTXQC creates a QC report containing a comprehensive and powerful set of QC metrics, augmented with automated scoring functions. The automated scores are collated to create an overview heatmap at the beginning of the report, giving valuable guidance also to nonspecialists. Our software supports a wide range of experimental designs, including stable isotope labeling by amino acids in cell culture (SILAC), tandem mass tags (TMT), and label-free data. Furthermore, we introduce new metrics to score MaxQuant's Match-between-runs (MBR) functionality, by which peptide identifications can be transferred across Raw files based on accurate retention time and m/z. Last but not least, PTXQC is easy to install and use and represents the first QC software capable of processing MaxQuant result tables. PTXQC is freely available at https://github.com/cbielow/PTXQC.

  14. Development of intelligent instruments with embedded HTTP servers for control and data acquisition in a cryogenic setup--The hardware, firmware, and software implementation.

    PubMed

    Antony, Joby; Mathuria, D S; Datta, T S; Maity, Tanmoy

    2015-12-01

    The power of Ethernet for control and automation technology is being largely understood by the automation industry in recent times. Ethernet with HTTP (Hypertext Transfer Protocol) is one of the most widely accepted communication standards today. Ethernet is best known for being able to control through internet from anywhere in the globe. The Ethernet interface with built-in on-chip embedded servers ensures global connections for crate-less model of control and data acquisition systems which have several advantages over traditional crate-based control architectures for slow applications. This architecture will completely eliminate the use of any extra PLC (Programmable Logic Controller) or similar control hardware in any automation network as the control functions are firmware coded inside intelligent meters itself. Here, we describe the indigenously built project of a cryogenic control system built for linear accelerator at Inter University Accelerator Centre, known as "CADS," which stands for "Complete Automation of Distribution System." CADS deals with complete hardware, firmware, and software implementation of the automated linac cryogenic distribution system using many Ethernet based embedded cryogenic instruments developed in-house. Each instrument works as an intelligent meter called device-server which has the control functions and control loops built inside the firmware itself. Dedicated meters with built-in servers were designed out of ARM (Acorn RISC (Reduced Instruction Set Computer) Machine) and ATMEL processors and COTS (Commercially Off-the-Shelf) SMD (Surface Mount Devices) components, with analog sensor front-end and a digital back-end web server implementing remote procedure call over HTTP for digital control and readout functions. At present, 24 instruments which run 58 embedded servers inside, each specific to a particular type of sensor-actuator combination for closed loop operations, are now deployed and distributed across control LAN (Local Area Network). A group of six categories of such instruments have been identified for all cryogenic applications required for linac operation which were designed to build this medium-scale cryogenic automation setup. These devices have special features like remote rebooters, daughter boards for PIDs (Proportional Integral Derivative), etc., to operate them remotely in radiation areas and also have emergency switches by which each device can be taken to emergency mode temporarily. Finally, all the data are monitored, logged, controlled, and analyzed online at a central control room which has a user-friendly control interface developed using LabVIEW(®). This paper discusses the overall hardware, firmware, software design, and implementation for the cryogenics setup.
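
    The device-server pattern described above can be sketched in Python as below: read a channel from an instrument's embedded HTTP server, then post a new setpoint. The instrument address, URL paths, and JSON fields are hypothetical and are not the actual CADS protocol.

    ```python
    # Talk to an embedded HTTP "device-server": read a sensor, then write a setpoint.
    import json
    import urllib.request

    BASE = "http://10.0.0.41"                     # hypothetical instrument address

    def read_channel(name):
        with urllib.request.urlopen(f"{BASE}/read?channel={name}", timeout=2) as resp:
            return json.load(resp)["value"]

    def write_setpoint(name, value):
        payload = json.dumps({"channel": name, "setpoint": value}).encode()
        req = urllib.request.Request(f"{BASE}/write", data=payload,
                                     headers={"Content-Type": "application/json"})
        with urllib.request.urlopen(req, timeout=2) as resp:
            return resp.status == 200

    level = read_channel("LN2_LEVEL")
    print("liquid nitrogen level:", level)
    write_setpoint("HEATER_1", 35.0)
    ```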

  15. Development of intelligent instruments with embedded HTTP servers for control and data acquisition in a cryogenic setup—The hardware, firmware, and software implementation

    NASA Astrophysics Data System (ADS)

    Antony, Joby; Mathuria, D. S.; Datta, T. S.; Maity, Tanmoy

    2015-12-01

    The power of Ethernet for control and automation technology is being largely understood by the automation industry in recent times. Ethernet with HTTP (Hypertext Transfer Protocol) is one of the most widely accepted communication standards today. Ethernet is best known for being able to control through internet from anywhere in the globe. The Ethernet interface with built-in on-chip embedded servers ensures global connections for crate-less model of control and data acquisition systems which have several advantages over traditional crate-based control architectures for slow applications. This architecture will completely eliminate the use of any extra PLC (Programmable Logic Controller) or similar control hardware in any automation network as the control functions are firmware coded inside intelligent meters itself. Here, we describe the indigenously built project of a cryogenic control system built for linear accelerator at Inter University Accelerator Centre, known as "CADS," which stands for "Complete Automation of Distribution System." CADS deals with complete hardware, firmware, and software implementation of the automated linac cryogenic distribution system using many Ethernet based embedded cryogenic instruments developed in-house. Each instrument works as an intelligent meter called device-server which has the control functions and control loops built inside the firmware itself. Dedicated meters with built-in servers were designed out of ARM (Acorn RISC (Reduced Instruction Set Computer) Machine) and ATMEL processors and COTS (Commercially Off-the-Shelf) SMD (Surface Mount Devices) components, with analog sensor front-end and a digital back-end web server implementing remote procedure call over HTTP for digital control and readout functions. At present, 24 instruments which run 58 embedded servers inside, each specific to a particular type of sensor-actuator combination for closed loop operations, are now deployed and distributed across control LAN (Local Area Network). A group of six categories of such instruments have been identified for all cryogenic applications required for linac operation which were designed to build this medium-scale cryogenic automation setup. These devices have special features like remote rebooters, daughter boards for PIDs (Proportional Integral Derivative), etc., to operate them remotely in radiation areas and also have emergency switches by which each device can be taken to emergency mode temporarily. Finally, all the data are monitored, logged, controlled, and analyzed online at a central control room which has a user-friendly control interface developed using LabVIEW®. This paper discusses the overall hardware, firmware, software design, and implementation for the cryogenics setup.

  16. Development of intelligent instruments with embedded HTTP servers for control and data acquisition in a cryogenic setup—The hardware, firmware, and software implementation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Antony, Joby; Mathuria, D. S.; Datta, T. S.

    The power of Ethernet for control and automation technology is being largely understood by the automation industry in recent times. Ethernet with HTTP (Hypertext Transfer Protocol) is one of the most widely accepted communication standards today. Ethernet is best known for being able to control through internet from anywhere in the globe. The Ethernet interface with built-in on-chip embedded servers ensures global connections for crate-less model of control and data acquisition systems which have several advantages over traditional crate-based control architectures for slow applications. This architecture will completely eliminate the use of any extra PLC (Programmable Logic Controller) or similar control hardware in any automation network as the control functions are firmware coded inside intelligent meters itself. Here, we describe the indigenously built project of a cryogenic control system built for linear accelerator at Inter University Accelerator Centre, known as "CADS," which stands for "Complete Automation of Distribution System." CADS deals with complete hardware, firmware, and software implementation of the automated linac cryogenic distribution system using many Ethernet based embedded cryogenic instruments developed in-house. Each instrument works as an intelligent meter called device-server which has the control functions and control loops built inside the firmware itself. Dedicated meters with built-in servers were designed out of ARM (Acorn RISC (Reduced Instruction Set Computer) Machine) and ATMEL processors and COTS (Commercially Off-the-Shelf) SMD (Surface Mount Devices) components, with analog sensor front-end and a digital back-end web server implementing remote procedure call over HTTP for digital control and readout functions. At present, 24 instruments which run 58 embedded servers inside, each specific to a particular type of sensor-actuator combination for closed loop operations, are now deployed and distributed across control LAN (Local Area Network). A group of six categories of such instruments have been identified for all cryogenic applications required for linac operation which were designed to build this medium-scale cryogenic automation setup. These devices have special features like remote rebooters, daughter boards for PIDs (Proportional Integral Derivative), etc., to operate them remotely in radiation areas and also have emergency switches by which each device can be taken to emergency mode temporarily. Finally, all the data are monitored, logged, controlled, and analyzed online at a central control room which has a user-friendly control interface developed using LabVIEW®. This paper discusses the overall hardware, firmware, software design, and implementation for the cryogenics setup.

  17. Integration of analytical instruments with computer scripting.

    PubMed

    Carvalho, Matheus C

    2013-08-01

    Automation of laboratory routines aided by computer software enables high productivity and is the norm nowadays. However, the integration of different instruments made by different suppliers is still difficult, because to accomplish it, the user must have knowledge of electronics and/or low-level programming. An alternative approach is to control different instruments without an electronic connection between them, relying only on their software interface on a computer. This can be achieved through scripting, which is the emulation of user operations (mouse clicks and keyboard inputs) on the computer. The main advantages of this approach are its simplicity, which enables people with minimal knowledge of computer programming to employ it, and its universality, which enables the integration of instruments made by different suppliers, meaning that the user is totally free to choose the devices to be integrated. Therefore, scripting can be a useful, accessible, and economic solution for laboratory automation.
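
    One common way to realize the scripting approach described above from Python is the pyautogui package, which emulates the operator's mouse clicks and keystrokes on a vendor GUI. The screen coordinates, field positions, and timing below are entirely instrument-specific placeholders.

    ```python
    # GUI-scripting sketch: drive a vendor acquisition program by emulating the user.
    import time
    import pyautogui

    pyautogui.FAILSAFE = True                     # abort by slamming the mouse to a corner

    def start_measurement(sample_name):
        pyautogui.click(420, 160)                 # click the "Sample ID" field (placeholder)
        pyautogui.write(sample_name, interval=0.05)
        pyautogui.press("tab")
        pyautogui.click(420, 220)                 # click the GUI's "Start" button (placeholder)

    for i in range(3):
        start_measurement(f"sample_{i:02d}")
        time.sleep(60)                            # wait for the run to finish
    ```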

  18. Evolution of the SOFIA tracking control system

    NASA Astrophysics Data System (ADS)

    Fiebig, Norbert; Jakob, Holger; Pfüller, Enrico; Röser, Hans-Peter; Wiedemann, Manuel; Wolf, Jürgen

    2014-07-01

    The airborne observatory SOFIA (Stratospheric Observatory for Infrared Astronomy) is undergoing a modernization of its tracking system. This includes new, highly sensitive tracking cameras, control computers, filter wheels, and other equipment, as well as a major redesign of the control software. The experiences along the migration path from an aged 19" VMEbus-based control system to modern industrial PCs, from the VxWorks real-time operating system to embedded Linux, and to a state-of-the-art software architecture are presented. Further, the concept is presented to operate the new camera also as a scientific instrument, in parallel to tracking.

  19. The Application Design of Solar Radio Spectrometer Based on FPGA

    NASA Astrophysics Data System (ADS)

    Du, Q. F.; Chen, R. J.; Zhao, Y. C.; Feng, S. W.; Chen, Y.; Song, Y.

    2017-10-01

    The solar radio spectrometer is a key instrument for observing solar radio emission. Through the computer software we developed, we control an FPGA-based A/D signal acquisition card to collect large volumes of data, which are transferred over a PCI-E port. The program implements timed data collection, retrieval of data for a specified time, and real-time control of the acquisition card. It can also map the solar radio power intensity. Experiments verify the reliability of the solar radio spectrometer, while the software simplifies the operation of solar observations.

  20. Fiber optic interferometry for industrial process monitoring and control applications

    NASA Astrophysics Data System (ADS)

    Marcus, Michael A.

    2002-02-01

    Over the past few years we have been developing applications for a high-resolution (sub-micron accuracy) fiber-optic-coupled dual Michelson interferometer-based instrument. It is being utilized in a variety of applications including monitoring liquid layer thickness uniformity on coating hoppers, film base thickness uniformity measurement, digital camera focus assessment, optical cell path length assessment, and imager and wafer surface profile mapping. The instrument includes both coherent and non-coherent light sources, custom application-dependent optical probes and sample interfaces, a Michelson interferometer, custom electronics, a Pentium-based PC with data acquisition cards, and LabWindows/CVI or LabVIEW based application-specific software. This paper describes the development evolution of this instrument platform and its applications, highlighting robust instrument design and hardware, software, and user interface development. The talk concludes with a discussion of a new high-speed instrument configuration, which can be utilized for high-speed surface profiling and as an on-line web thickness gauge.

  1. Study on virtual instrument developing system based on intelligent virtual control

    NASA Astrophysics Data System (ADS)

    Tang, Baoping; Cheng, Fabin; Qin, Shuren

    2005-01-01

    The paper introduces a non-programming developing system for a virtual instrument (VI), i.e., a virtual measurement instrument developing system (VMIDS) based on an intelligent virtual control (IVC). The background of the IVC-based VMIDS is described briefly, and the hierarchical message bus (HMB)-based software architecture of VMIDS is discussed in detail. The three parts of VMIDS and their functions are introduced, and the process of developing a VI without programming is further described.

  2. Use of the Photo-Electromyogram to Objectively Diagnose and Monitor Treatment of Post-TBI Light Sensitivity

    DTIC Science & Technology

    2012-10-01

    Mark Ginsberg, one of our local jewelry store owners, has acquired 3D extruding printers for medical instrumentation applications. We tested our software, which was written to control the monitor brightness, duration, and color for each visual stimulus.

  3. A microcomputer controlled snow ski binding system--I. Instrumentation and field evaluation.

    PubMed

    MacGregor, D; Hull, M L; Dorius, L K

    1985-01-01

    This paper presents the design and field evaluation of the first microcomputer-controlled ski binding system. This system incorporates an Intel 8086 microcomputer controller and an integral binding/dynamometer. The instrumentation system not only undertakes real-time control but also records dynamometer data via a miniature digital cassette tape recorder. The integral binding/dynamometer offers the same operational and mounting convenience as commercially available mechanical bindings. The binding may be released either manually or electrically via the controller. Comprised of four octagonal half strain rings, the strain gage dynamometer measures the three moment load components at the boot. To enable the user to conveniently operate the computer, extensive operating software was developed. The operating software is discussed in relation to both the acquisition and storage of data from the dynamometer and the control of the electro-mechanical snow ski binding. The binding system has been used successfully both to record boot moment components and to control ski binding release during actual skiing maneuvers. Moment histories typical of three common recreational skiing maneuvers are presented.

  4. Advanced communications technology satellite high burst rate link evaluation terminal communication protocol software user's guide, version 1.0

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.

    1993-01-01

    The Communication Protocol Software was developed at the NASA Lewis Research Center to support the Advanced Communications Technology Satellite High Burst Rate Link Evaluation Terminal (ACTS HBR-LET). The HBR-LET is an experimenters terminal to communicate with the ACTS for various experiments by government, university, and industry agencies. The Communication Protocol Software is one segment of the Control and Performance Monitor (C&PM) Software system of the HBR-LET. The Communication Protocol Software allows users to control and configure the Intermediate Frequency Switch Matrix (IFSM) on board the ACTS to yield a desired path through the spacecraft payload. Besides IFSM control, the C&PM Software System is also responsible for instrument control during HBR-LET experiments, uplink power control of the HBR-LET to demonstrate power augmentation during signal fade events, and data display. The Communication Protocol Software User's Guide, Version 1.0 (NASA CR-189162) outlines the commands and procedures to install and operate the Communication Protocol Software. Configuration files used to control the IFSM, operator commands, and error recovery procedures are discussed. The Communication Protocol Software Maintenance Manual, Version 1.0 (NASA CR-189163, to be published) is a programmer's guide to the Communication Protocol Software. This manual details the current implementation of the software from a technical perspective. Included is an overview of the Communication Protocol Software, computer algorithms, format representations, and computer hardware configuration. The Communication Protocol Software Test Plan (NASA CR-189164, to be published) provides a step-by-step procedure to verify the operation of the software. Included in the Test Plan is command transmission, telemetry reception, error detection, and error recovery procedures.

  5. Development of a PC-based ground support system for a small satellite instrument

    NASA Astrophysics Data System (ADS)

    Deschambault, Robert L.; Gregory, Philip R.; Spenler, Stephen; Whalen, Brian A.

    1993-11-01

    The importance of effective ground support for the remote control and data retrieval of a satellite instrument cannot be overstated. Problems with ground support may include the need to base personnel at a ground tracking station for extended periods, and the delay between the instrument observation and the processing of the data by the science team. Flexible solutions to such problems in the case of small satellite systems are provided by using low-cost, powerful personal computers and off-the-shelf software for data acquisition and processing, and by using the Internet as a communication pathway to enable scientists to view and manipulate satellite data in real time at any ground location. The personal computer based ground support system is illustrated for the case of the cold plasma analyzer flown on the Freja satellite. Commercial software was used as building blocks for writing the ground support equipment software. Several levels of hardware support, including unit tests and development, functional tests, and integration, were provided by portable and desktop personal computers. Satellite stations in Saskatchewan and Sweden were linked to the science team via phone lines and the Internet, which provided remote control through a central point. These successful strategies will be used on future small satellite space programs.

  6. Cyber security best practices for the nuclear industry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Badr, I.

    2012-07-01

    When deploying software-based systems, such as digital instrumentation and controls for the nuclear industry, it is vital to include cyber security assessment as part of the architecture and development process. When integrating and delivering software-intensive systems for the nuclear industry, engineering teams should make use of a secure, requirements-driven software development life cycle, ensuring security compliance and optimum return on investment. Reliability protections, data loss prevention, and privacy enforcement provide a strong case for installing strict cyber security policies. (authors)

  7. MODIS. Volume 1: MODIS level 1A software baseline requirements

    NASA Technical Reports Server (NTRS)

    Masuoka, Edward; Fleig, Albert; Ardanuy, Philip; Goff, Thomas; Carpenter, Lloyd; Solomon, Carl; Storey, James

    1994-01-01

    This document describes the level 1A software requirements for the moderate resolution imaging spectroradiometer (MODIS) instrument. This includes internal and external requirements. Internal requirements include functional, operational, and data processing as well as performance, quality, safety, and security engineering requirements. External requirements include those imposed by data archive and distribution systems (DADS); scheduling, control, monitoring, and accounting (SCMA); product management (PM) system; MODIS log; and product generation system (PGS). Implementation constraints and requirements for adapting the software to the physical environment are also included.

  8. Radiometer Calibration and Characterization (RCC) User's Manual: Windows Version 4.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andreas, Afshin M.; Wilcox, Stephen M.

    2016-02-29

    The Radiometer Calibration and Characterization (RCC) software is a data acquisition and data archival system for performing Broadband Outdoor Radiometer Calibrations (BORCAL). RCC provides a unique method of calibrating broadband atmospheric longwave and solar shortwave radiometers using techniques that reduce measurement uncertainty and better characterize a radiometer's response profile. The RCC software automatically monitors and controls many of the components that contribute to uncertainty in an instrument's responsivity. This is a user's manual and guide to the RCC software.

  9. Design and implementation of the mobility assessment tool: software description.

    PubMed

    Barnard, Ryan T; Marsh, Anthony P; Rejeski, Walter Jack; Pecorella, Anthony; Ip, Edward H

    2013-07-23

    In previous work, we described the development of an 81-item video-animated tool for assessing mobility. In response to criticism levied during a pilot study of this tool, we sought to develop a new version built upon a flexible framework for designing and administering the instrument. Rather than constructing a self-contained software application with a hard-coded instrument, we designed an XML schema capable of describing a variety of psychometric instruments. The new version of our video-animated assessment tool was then defined fully within the context of a compliant XML document. Two software applications--one in Java, the other in Objective-C for the Apple iPad--were then built to present the instrument described in the XML document and collect participants' responses. Separating the instrument's definition from the software application implementing it allowed for rapid iteration and easy, reliable definition of variations. Defining instruments in a software-independent XML document simplifies the process of defining instruments and variations and allows a single instrument to be deployed on as many platforms as there are software applications capable of interpreting the instrument, thereby broadening the potential target audience for the instrument. Continued work will be done to further specify and refine this type of instrument specification, with a focus on spurring adoption by researchers in gerontology and geriatric medicine.
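
    The "instrument as data" idea can be illustrated with a short Python sketch: items live in an XML document, and a small program renders whatever the document describes. The element and attribute names below are invented for illustration and are not the authors' published schema.

    ```python
    # Administer a tiny XML-defined instrument from the command line.
    import xml.etree.ElementTree as ET

    INSTRUMENT_XML = """
    <instrument name="mobility-demo">
      <item id="q1" video="walk_50ft.mp4">
        <prompt>How difficult would this task be for you?</prompt>
        <response value="1">Not difficult</response>
        <response value="2">Somewhat difficult</response>
        <response value="3">Unable to do</response>
      </item>
    </instrument>
    """

    def administer(xml_text):
        root = ET.fromstring(xml_text)
        answers = {}
        for item in root.findall("item"):
            print(item.find("prompt").text, f"(video: {item.get('video')})")
            for resp in item.findall("response"):
                print(f"  [{resp.get('value')}] {resp.text}")
            answers[item.get("id")] = input("choice: ")
        return answers

    if __name__ == "__main__":
        print(administer(INSTRUMENT_XML))
    ```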

  10. Implementation of Distributed Services for a Deep Sea Moored Instrument Network

    NASA Astrophysics Data System (ADS)

    Oreilly, T. C.; Headley, K. L.; Risi, M.; Davis, D.; Edgington, D. R.; Salamy, K. A.; Chaffey, M.

    2004-12-01

    The Monterey Ocean Observing System (MOOS) is a moored observatory network consisting of interconnected instrument nodes on the sea surface, midwater, and deep sea floor. We describe Software Infrastructure and Applications for MOOS ("SIAM"), which implement the management, control, and data acquisition infrastructure for the moored observatory. Links in the MOOS network include fiber-optic and 10-BaseT copper connections between the at-sea nodes. A Globalstar satellite transceiver or 900 MHz Freewave terrestrial line-of-sight RF modem provides the link to shore. All of these links support Internet protocols, providing TCP/IP connectivity throughout a system that extends from shore to sensor nodes at the air-sea interface, through the oceanic water column to a benthic network of sensor nodes extending across the deep sea floor. Exploiting this TCP/IP infrastructure as well as capabilities provided by MBARI's MOOS mooring controller, we use powerful Internet software technologies to implement a distributed management, control and data acquisition system for the moored observatory. The system design meets the demanding functional requirements specified for MOOS. Nodes and their instruments are represented by Java RMI "services" having well defined software interfaces. Clients anywhere on the network can interact with any node or instrument through its corresponding service. A client may be on the same node as the service, may be on another node, or may reside on shore. Clients may be human, e.g. when a scientist on shore accesses a deployed instrument in real-time through a user interface. Clients may also be software components that interact autonomously with instruments and nodes, e.g. for purposes such as system resource management or autonomous detection and response to scientifically interesting events. All electrical power to the moored network is provided by solar and wind energy, and the RF shore-to-mooring links are intermittent and relatively low-bandwidth connections. Thus power and wireless bandwidth are limited resources that constrain our choice of service technologies and wireless access strategy. We describe and evaluate system performance in light of actual deployment of observatory elements in Monterey Bay, and discuss how the system can be developed further. We also consider management and control strategies for the cable-to-shore observatory known as MARS ("Monterey Accelerated Research System"). The MARS cable will provide high power and continuous high-bandwidth connectivity between seafloor instrument nodes and shore, thus removing key limitations of the moored observatory. Moreover MARS functional requirements may differ significantly from MOOS requirements. In light of these differences, we discuss how elements of our MOOS moored observatory architecture might be adapted to MARS.

  11. Interfacing laboratory instruments to multiuser, virtual memory computers

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R.; Stang, David B.; Roth, Don J.

    1989-01-01

    Incentives, problems and solutions associated with interfacing laboratory equipment with multiuser, virtual memory computers are presented. The major difficulty concerns how to utilize these computers effectively in a medium sized research group. This entails optimization of hardware interconnections and software to facilitate multiple instrument control, data acquisition and processing. The architecture of the system that was devised, and associated programming and subroutines are described. An example program involving computer controlled hardware for ultrasonic scan imaging is provided to illustrate the operational features.

  12. Software systems for operation, control, and monitoring of the EBEX instrument

    NASA Astrophysics Data System (ADS)

    Milligan, Michael; Ade, Peter; Aubin, François; Baccigalupi, Carlo; Bao, Chaoyun; Borrill, Julian; Cantalupo, Christopher; Chapman, Daniel; Didier, Joy; Dobbs, Matt; Grainger, Will; Hanany, Shaul; Hillbrand, Seth; Hubmayr, Johannes; Hyland, Peter; Jaffe, Andrew; Johnson, Bradley; Kisner, Theodore; Klein, Jeff; Korotkov, Andrei; Leach, Sam; Lee, Adrian; Levinson, Lorne; Limon, Michele; MacDermid, Kevin; Matsumura, Tomotake; Miller, Amber; Pascale, Enzo; Polsgrove, Daniel; Ponthieu, Nicolas; Raach, Kate; Reichborn-Kjennerud, Britt; Sagiv, Ilan; Tran, Huan; Tucker, Gregory S.; Vinokurov, Yury; Yadav, Amit; Zaldarriaga, Matias; Zilic, Kyle

    2010-07-01

    We present the hardware and software systems implementing autonomous operation, distributed real-time monitoring, and control for the EBEX instrument. EBEX is a NASA-funded balloon-borne microwave polarimeter designed for a 14 day Antarctic flight that circumnavigates the pole. To meet its science goals the EBEX instrument autonomously executes several tasks in parallel: it collects attitude data and maintains pointing control in order to adhere to an observing schedule; tunes and operates up to 1920 TES bolometers and 120 SQUID amplifiers controlled by as many as 30 embedded computers; coordinates and dispatches jobs across an onboard computer network to manage this detector readout system; logs over 3 GiB/hour of science and housekeeping data to an onboard disk storage array; responds to a variety of commands and exogenous events; and downlinks multiple heterogeneous data streams representing a selected subset of the total logged data. Most of the systems implementing these functions have been tested during a recent engineering flight of the payload, and have proven to meet the target requirements. The EBEX ground segment couples uplink and downlink hardware to a client-server software stack, enabling real-time monitoring and command responsibility to be distributed across the public internet or other standard computer networks. Using the emerging dirfile standard as a uniform intermediate data format, a variety of front end programs provide access to different components and views of the downlinked data products. This distributed architecture was demonstrated operating across multiple widely dispersed sites prior to and during the EBEX engineering flight.
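
    The dirfile convention mentioned above stores each field as its own flat binary file alongside a plain-text format file. The following is a hand-rolled sketch of that idea with hypothetical field names and rates; real dirfiles are normally written and read with the GetData library.

```python
# Write a toy dirfile-style dataset: one raw binary file per field plus a
# "format" file describing each field. Field names and sample rates are
# illustrative only.
import os
import struct

def write_dirfile(path, fields):
    """fields: mapping of name -> (samples_per_frame, list of float samples)."""
    os.makedirs(path, exist_ok=True)
    with open(os.path.join(path, "format"), "w") as fmt:
        for name, (spf, samples) in fields.items():
            fmt.write(f"{name} RAW FLOAT64 {spf}\n")
            with open(os.path.join(path, name), "wb") as raw:
                raw.write(struct.pack(f"<{len(samples)}d", *samples))

if __name__ == "__main__":
    write_dirfile("ebex_demo_dirfile", {
        "bolometer_tes_001": (190, [0.1, 0.2, 0.3]),
        "gondola_azimuth":   (100, [181.0, 181.2, 181.4]),
    })
```

    Because each field is an independent append-only file, different front-end programs can read different subsets of the downlinked data without coordinating with one another.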

  13. Control devices and steering strategies in pathway surgery.

    PubMed

    Fan, Chunman; Jelínek, Filip; Dodou, Dimitra; Breedveld, Paul

    2015-02-01

    For pathway surgery, that is, minimally invasive procedures carried out transluminally or through instrument-created pathways, handheld maneuverable instruments are being developed. As the accompanying control interfaces of such instruments have not been optimized for intuitive manipulation, we investigated the effect of control mode (1DoF or 2DoF), and control device (joystick or handgrip) on human performance in a navigation task. The experiments were conducted using the Endo-PaC (Endoscopic-Path Controller), a simulator that emulates the shaft and handle of a maneuverable instrument, combined with custom-developed software animating pathway surgical scenarios. Participants were asked to guide a virtual instrument without collisions toward a target located at the end of a virtual curved tunnel. The performance was assessed in terms of task completion time, path length traveled by the virtual instrument, motion smoothness, collision metrics, subjective workload, and personal preference. The results indicate that 2DoF control leads to faster task completion and fewer collisions with the tunnel wall combined with a strong subjective preference compared with 1DoF control. Handgrip control appeared to be more intuitive to master than joystick control. However, the participants experienced greater physical demand and had longer path lengths with handgrip than joystick control. Copyright © 2015 Elsevier Inc. All rights reserved.
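
    As a rough illustration of how such performance metrics can be derived from logged instrument-tip positions, the sketch below computes a path length and a simple smoothness proxy; the sampling interval and sample data are assumptions, not the study's actual analysis code.

```python
# Toy computation of path length and a simple smoothness proxy from sampled
# 2-D instrument-tip positions. Sampling interval and data are illustrative only.
import math

def path_length(points):
    return sum(math.dist(a, b) for a, b in zip(points, points[1:]))

def mean_speed_change(points, dt):
    """Crude smoothness proxy: average absolute change in speed between samples."""
    speeds = [math.dist(a, b) / dt for a, b in zip(points, points[1:])]
    return sum(abs(s2 - s1) for s1, s2 in zip(speeds, speeds[1:])) / max(len(speeds) - 1, 1)

if __name__ == "__main__":
    samples = [(0.0, 0.0), (1.0, 0.2), (2.1, 0.5), (3.0, 1.1)]
    print("path length:", round(path_length(samples), 3))
    print("smoothness proxy:", round(mean_speed_change(samples, dt=0.02), 3))
```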

  14. Time Analyzer for Time Synchronization and Monitor of the Deep Space Network

    NASA Technical Reports Server (NTRS)

    Cole, Steven; Gonzalez, Jorge, Jr.; Calhoun, Malcolm; Tjoelker, Robert

    2003-01-01

    A software package has been developed to measure, monitor, and archive the performance of timing signals distributed in the NASA Deep Space Network. Timing signals are generated from a central master clock and distributed to over 100 users at distances up to 30 kilometers. The time offset due to internal distribution delays and time jitter with respect to the central master clock are critical for successful spacecraft navigation, radio science, and very long baseline interferometry (VLBI) applications. The instrument controller and operator interface software is written in LabView and runs on the Linux operating system. The software controls a commercial multiplexer to switch 120 separate timing signals to measure offset and jitter with a time-interval counter referenced to the master clock. The offset of each channel is displayed in histogram form, and "out of specification" alarms are sent to a central complex monitor and control system. At any time, the measurement cycle of 120 signals can be interrupted for diagnostic tests on an individual channel. The instrument also routinely monitors and archives the long-term stability of all frequency standards or any other 1-pps source compared against the master clock. All data is stored and made available for
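
    A schematic sketch of the measurement loop described above is shown below; the multiplexer and counter drivers, the channel count, and the specification limit are placeholders, not the actual DSN LabVIEW code.

```python
# Schematic channel-scan loop: switch each timing signal to the counter,
# measure its offset against the master clock, and flag out-of-spec channels.
# The driver functions and the 100 ns limit are placeholders.
SPEC_LIMIT_NS = 100.0

def select_channel(mux, channel):          # placeholder multiplexer driver
    mux["selected"] = channel

def measure_offset_ns(counter, channel):   # placeholder time-interval counter driver
    return counter.get(channel, 0.0)

def scan(mux, counter, n_channels=120):
    alarms = []
    for ch in range(n_channels):
        select_channel(mux, ch)
        offset = measure_offset_ns(counter, ch)
        if abs(offset) > SPEC_LIMIT_NS:
            alarms.append((ch, offset))
    return alarms

if __name__ == "__main__":
    fake_counter = {7: 250.0, 42: -130.0}   # two misbehaving channels
    print("out-of-spec channels:", scan({}, fake_counter))
```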

  15. CLOUDCLOUD : general-purpose instrument monitoring and data managing software

    NASA Astrophysics Data System (ADS)

    Dias, António; Amorim, António; Tomé, António

    2016-04-01

    An effective experiment is dependent on the ability to store and deliver data and information to all participant parties regardless of their degree of involvement in the specific parts that make the experiment a whole. Having fast, efficient and ubiquitous access to data will increase visibility and discussion, such that the outcome will have already been reviewed several times, strengthening the conclusions. The CLOUD project aims at providing users with a general purpose data acquisition, management and instrument monitoring platform that is fast, easy to use, lightweight and accessible to all participants of an experiment. This work is now implemented in the CLOUD experiment at CERN and will be fully integrated with the experiment as of 2016. Despite being used in an experiment of the scale of CLOUD, this software can also be used in any size of experiment or monitoring station, from single computers to large networks of computers to monitor any sort of instrument output without influencing the individual instrument's DAQ. Instrument data and meta data is stored and accessed via a specially designed database architecture and any type of instrument output is accepted using our continuously growing parsing application. Multiple databases can be used to separate different data taking periods or a single database can be used if for instance an experiment is continuous. A simple web-based application gives the user total control over the monitored instruments and their data, allowing data visualization and download, upload of processed data and the ability to edit existing instruments or add new instruments to the experiment. When in a network, new computers are immediately recognized and added to the system and are able to monitor instruments connected to them. Automatic computer integration is achieved by a locally running python-based parsing agent that communicates with a main server application guaranteeing that all instruments assigned to that computer are monitored with parsing intervals as fast as milliseconds. This software (server+agents+interface+database) comes in easy and ready-to-use packages that can be installed in any operating system, including Android and iOS systems. This software is ideal for use in modular experiments or monitoring stations with large variability in instruments and measuring methods or in large collaborations, where data requires homogenization in order to be effectively transmitted to all involved parties. This work presents the software and provides performance comparison with previously used monitoring systems in the CLOUD experiment at CERN.
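
    A minimal sketch of a parsing-agent loop in the spirit of the system described above is given below, using SQLite as a stand-in for the experiment database; the file layout, table schema, and parsing interval are assumptions, not the CLOUD software itself.

```python
# Toy monitoring agent: periodically parse an instrument's text output file
# and append new readings to a database. Paths, schema, and the parsing
# interval are illustrative only.
import sqlite3
import time

def ensure_schema(db):
    db.execute("CREATE TABLE IF NOT EXISTS readings"
               " (instrument TEXT, timestamp TEXT, value REAL)")

def parse_line(line):
    timestamp, value = line.split(",")          # assumed "timestamp,value" format
    return timestamp.strip(), float(value)

def poll_once(db, instrument, path, offset):
    with open(path) as f:
        f.seek(offset)
        for line in f:
            if not line.strip():
                continue
            ts, value = parse_line(line)
            db.execute("INSERT INTO readings VALUES (?, ?, ?)", (instrument, ts, value))
        offset = f.tell()
    db.commit()
    return offset

if __name__ == "__main__":
    db = sqlite3.connect("cloud_demo.db")
    ensure_schema(db)
    offset = 0
    while True:
        offset = poll_once(db, "demo_counter", "demo_counter.log", offset)
        time.sleep(0.1)   # parsing interval; a real agent can poll much faster
```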

  16. Automating spectral measurements

    NASA Astrophysics Data System (ADS)

    Goldstein, Fred T.

    2008-09-01

    This paper discusses the architecture of software utilized in spectroscopic measurements. As optical coatings become more sophisticated, there is mounting need to automate data acquisition (DAQ) from spectrophotometers. Such need is exacerbated when 100% inspection is required, ancillary devices are utilized, cost reduction is crucial, or security is vital. While instrument manufacturers normally provide point-and-click DAQ software, an application programming interface (API) may be missing. In such cases automation is impossible or expensive. An API is typically provided in libraries (*.dll, *.ocx) which may be embedded in user-developed applications. Users can thereby implement DAQ automation in several Windows languages. Another possibility, developed by FTG as an alternative to instrument manufacturers' software, is the ActiveX application (*.exe). ActiveX, a component of many Windows applications, provides means for programming and interoperability. This architecture permits a point-and-click program to act as automation client and server. Excel, for example, can control and be controlled by DAQ applications. Most importantly, ActiveX permits ancillary devices such as barcode readers and XY-stages to be easily and economically integrated into scanning procedures. Since an ActiveX application has its own user-interface, it can be independently tested. The ActiveX application then runs (visibly or invisibly) under DAQ software control. Automation capabilities are accessed via a built-in spectro-BASIC language with industry-standard (VBA-compatible) syntax. Supplementing ActiveX, spectro-BASIC also includes auxiliary serial port commands for interfacing programmable logic controllers (PLC). A typical application is automatic filter handling.

  17. Air-condition Control System of Weaving Workshop Based on LabVIEW

    NASA Astrophysics Data System (ADS)

    Song, Jian

    An air-conditioning measurement and control system based on LabVIEW is proposed to control the environmental targets in a weaving workshop effectively. The system is built on virtual instrument technology, using the LabVIEW development platform from NI, and is composed of an upper PC, central control nodes based on the CC2530, sensor nodes, sensor modules, and executive devices. A fuzzy control algorithm is employed to achieve accurate control of temperature and humidity. A user-friendly human-machine interface is designed with virtual instrument technology at the core of the software. Experiments show that the measurement and control system runs stably and reliably and meets the functional requirements for controlling the weaving workshop.
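
    A compact sketch of the kind of fuzzy temperature control mentioned above is shown below, with triangular membership functions and a tiny rule base; the membership breakpoints, rules, and output levels are illustrative, not the paper's actual design.

```python
# Toy fuzzy controller for temperature error -> heater/cooler command.
# Membership breakpoints, rules, and output levels are illustrative only.
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_command(error_degC):
    # Fuzzify the error (setpoint - measurement) into three sets.
    cold = tri(error_degC, 0.5, 3.0, 6.0)
    ok   = tri(error_degC, -1.0, 0.0, 1.0)
    hot  = tri(error_degC, -6.0, -3.0, -0.5)
    # Rules: heat when cold, do nothing when ok, cool when hot.
    # Defuzzify with a weighted average of rule outputs.
    weights = [(cold, +1.0), (ok, 0.0), (hot, -1.0)]
    total = sum(w for w, _ in weights)
    return sum(w * out for w, out in weights) / total if total else 0.0

if __name__ == "__main__":
    for err in (-4.0, -0.5, 0.0, 2.0, 5.0):
        print(f"error {err:+.1f} degC -> command {fuzzy_command(err):+.2f}")
```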

  18. The EMIR experience in the use of software control simulators to speed up the time to telescope

    NASA Astrophysics Data System (ADS)

    Lopez Ramos, Pablo; López-Ruiz, J. C.; Moreno Arce, Heidy; Rosich, Josefina; Perez Menor, José Maria

    2012-09-01

    One of the main problems facing development teams working on instrument control systems is the need to access mechanisms that are not available until well into the integration phase. Working with real hardware creates additional problems: certain faults cannot be tested because of the possibility of hardware damage, taking the system to its limits may shorten its operational lifespan, and the full system may be unavailable during some periods due to maintenance or testing of individual components. These problems can be addressed with simulators and by applying software/hardware standards. Since information on the construction and performance of electro-mechanical systems is available at relatively early stages of the project, simulators can be developed in advance (before the mechanism exists) or, if conventions and standards have been followed, a previously developed simulator may be reused. This article describes our experience in building software simulators and the main advantages we have identified: the control software can be developed even in the absence of real hardware, critical tests can be prepared on the simulated systems, system behavior can be tested for hardware failure situations that would put the real system at risk, and in-house integration of the entire instrument is sped up. The use of simulators allows us to reduce development, testing, and integration time.
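
    The approach hinges on the simulator and the real driver exposing the same interface so the control software cannot tell them apart; a minimal sketch of that idea follows, with hypothetical class and method names rather than EMIR's actual control code.

```python
# Sketch of swapping a hardware driver for a simulator behind one interface.
# Class and method names are hypothetical, not EMIR's actual control code.
from abc import ABC, abstractmethod

class Mechanism(ABC):
    @abstractmethod
    def move_to(self, position): ...
    @abstractmethod
    def position(self): ...

class RealGratingUnit(Mechanism):
    def move_to(self, position):
        raise RuntimeError("real hardware not available yet")   # integration-phase reality
    def position(self):
        raise RuntimeError("real hardware not available yet")

class SimulatedGratingUnit(Mechanism):
    def __init__(self, fail_after=None):
        self._pos, self._moves, self._fail_after = 0.0, 0, fail_after
    def move_to(self, position):
        self._moves += 1
        if self._fail_after is not None and self._moves > self._fail_after:
            raise IOError("simulated encoder failure")   # exercise fault handling safely
        self._pos = position
    def position(self):
        return self._pos

def homing_sequence(mechanism: Mechanism):
    """Control-software routine written against the interface, not the hardware."""
    mechanism.move_to(0.0)
    return mechanism.position() == 0.0

if __name__ == "__main__":
    print("homing ok:", homing_sequence(SimulatedGratingUnit()))
```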

  19. Distributed framework for dynamic telescope and instrument control

    NASA Astrophysics Data System (ADS)

    Ames, Troy J.; Case, Lynne

    2003-02-01

    Traditionally, instrument command and control systems have been developed specifically for a single instrument. Such solutions are frequently expensive and are inflexible to support the next instrument development effort. NASA Goddard Space Flight Center is developing an extensible framework, known as Instrument Remote Control (IRC) that applies to any kind of instrument that can be controlled by a computer. IRC combines the platform independent processing capabilities of Java with the power of the Extensible Markup Language (XML). A key aspect of the architecture is software that is driven by an instrument description, written using the Instrument Markup Language (IML). IML is an XML dialect used to describe graphical user interfaces to control and monitor the instrument, command sets and command formats, data streams, communication mechanisms, and data processing algorithms. The IRC framework provides the ability to communicate to components anywhere on a network using the JXTA protocol for dynamic discovery of distributed components. JXTA (see http://www.jxta.org) is a generalized protocol that allows any devices connected by a network to communicate in a peer-to-peer manner. IRC uses JXTA to advertise a devices IML and discover devices of interest on the network. Devices can join or leave the network and thus join or leave the instrument control environment of IRC. Currently, several astronomical instruments are working with the IRC development team to develop custom components for IRC to control their instruments. These instruments include: High resolution Airborne Wideband Camera (HAWC), a first light instrument for the Stratospheric Observatory for Infrared Astronomy (SOFIA); Submillimeter And Far Infrared Experiment (SAFIRE), a Principal Investigator instrument for SOFIA; and Fabry-Perot Interferometer Bolometer Research Experiment (FIBRE), a prototype of the SAFIRE instrument, used at the Caltech Submillimeter Observatory (CSO). Most recently, we have been working with the Submillimetre High
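
    As a rough illustration of description-driven commanding (not the actual IML dialect), the sketch below builds a command string from an XML command definition; the XML layout and command name are hypothetical.

```python
# Toy "instrument description" driven commanding: the command's name, argument
# order, and wire format live in XML, not in the control code. The XML layout
# here is illustrative only and is not the real IML schema.
import xml.etree.ElementTree as ET

COMMAND_SET = ET.fromstring("""
<commandSet instrument="demo-camera">
  <command name="setFilter" format="FILT {wheel:d} {slot:d}">
    <argument name="wheel" type="int"/>
    <argument name="slot"  type="int"/>
  </command>
</commandSet>
""")

def build_command(command_set, name, **args):
    """Look up a command definition by name and render its wire format."""
    for cmd in command_set.findall("command"):
        if cmd.get("name") == name:
            return cmd.get("format").format(**args)
    raise KeyError(name)

if __name__ == "__main__":
    print(repr(build_command(COMMAND_SET, "setFilter", wheel=1, slot=3)))
```

    Supporting a new instrument then amounts to supplying a new description document rather than writing new control code, which is the property the framework above exploits.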

  20. An empirical evaluation of software quality assurance practices and challenges in a developing country: a comparison of Nigeria and Turkey.

    PubMed

    Sowunmi, Olaperi Yeside; Misra, Sanjay; Fernandez-Sanz, Luis; Crawford, Broderick; Soto, Ricardo

    2016-01-01

    The importance of quality assurance in the software development process cannot be overemphasized because its adoption results in high reliability and easy maintenance of the software system and other software products. Software quality assurance includes different activities such as quality control, quality management, quality standards, quality planning, process standardization and improvement, among others. The aim of this work is to further investigate the software quality assurance practices of practitioners in Nigeria. While our previous work covered quality planning, adherence to standardized processes, and the inherent challenges, this work has been extended to include quality control, software process improvement, and membership in international quality standards organizations. It also makes a comparison with a similar study carried out in Turkey. The goal is to generate more robust findings that can properly support decision making by the software community. A qualitative research approach, specifically the use of questionnaire research instruments, was applied to acquire data from software practitioners. In addition to the previous results, it was observed that quality assurance practices are quite neglected, which may explain their low patronage. Moreover, software practitioners are aware of neither international standards organizations nor the required process improvement techniques; as such, their claimed standards are not aligned with those of accredited bodies and are limited to their local experience and knowledge, which makes them questionable. The comparison with Turkey yielded similar findings, making the results typical of developing countries. The research instrument used was tested for internal consistency using Cronbach's alpha and proved reliable. For the software industry in developing countries to grow strong and be a viable source of external revenue, software assurance practices have to be taken seriously because their effect is evident in the final product. Moreover, quality frameworks and tools that require minimal time and cost are highly needed in these countries.
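
    For reference, the Cronbach's alpha used above can be computed from the item variances and the variance of the total score; a small sketch follows, with made-up questionnaire responses.

```python
# Cronbach's alpha from a respondents-by-items score matrix.
# The 5-respondent, 4-item data below is made up for illustration.
def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)   # sample variance

def cronbach_alpha(scores):
    k = len(scores[0])                              # number of items
    item_vars = [variance([row[i] for row in scores]) for i in range(k)]
    total_var = variance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

if __name__ == "__main__":
    responses = [
        [4, 4, 5, 4],
        [3, 3, 4, 3],
        [5, 4, 5, 5],
        [2, 3, 2, 2],
        [4, 5, 4, 4],
    ]
    print(f"alpha = {cronbach_alpha(responses):.3f}")
```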

  1. Design and implementation of the mobility assessment tool: software description

    PubMed Central

    2013-01-01

    Background In previous work, we described the development of an 81-item video-animated tool for assessing mobility. In response to criticism levied during a pilot study of this tool, we sought to develop a new version built upon a flexible framework for designing and administering the instrument. Results Rather than constructing a self-contained software application with a hard-coded instrument, we designed an XML schema capable of describing a variety of psychometric instruments. The new version of our video-animated assessment tool was then defined fully within the context of a compliant XML document. Two software applications—one built in Java, the other in Objective-C for the Apple iPad—were then built that could present the instrument described in the XML document and collect participants’ responses. Separating the instrument’s definition from the software application implementing it allowed for rapid iteration and easy, reliable definition of variations. Conclusions Defining instruments in a software-independent XML document simplifies the process of defining instruments and variations and allows a single instrument to be deployed on as many platforms as there are software applications capable of interpreting the instrument, thereby broadening the potential target audience for the instrument. Continued work will be done to further specify and refine this type of instrument specification with a focus on spurring adoption by researchers in gerontology and geriatric medicine. PMID:23879716

  2. The U.S./IAEA Workshop on Software Sustainability for Safeguards Instrumentation: Report to the NNSA DOE Office of International Nuclear Safeguards (NA-241)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pepper, Susan E.; Pickett, Chris A.; Queirolo, Al

    The U.S Department of Energy (DOE) National Nuclear Security Administration (NNSA) Next Generation Safeguards Initiative (NGSI) and the International Atomic Energy Agency (IAEA) convened a workshop on Software Sustainability for Safeguards Instrumentation in Vienna, Austria, May 6-8, 2014. Safeguards instrumentation software must be sustained in a changing environment to ensure existing instruments can continue to perform as designed, with improved security. The approaches to the development and maintenance of instrument software used in the past may not be the best model for the future and, therefore, the organizers’ goal was to investigate these past approaches and to determine an optimal path forward. The purpose of this report is to provide input for the DOE NNSA Office of International Nuclear Safeguards (NA-241) and other stakeholders that can be utilized when making decisions related to the development and maintenance of software used in the implementation of international nuclear safeguards. For example, this guidance can be used when determining whether to fund the development, upgrade, or replacement of a particular software product. The report identifies the challenges related to sustaining software, and makes recommendations for addressing these challenges, supported by summaries and detailed notes from the workshop discussions. In addition the authors provide a set of recommendations for institutionalizing software sustainability practices in the safeguards community. The term “software sustainability” was defined for this workshop as ensuring that safeguards instrument software and algorithm functionality can be maintained efficiently throughout the instrument lifecycle, without interruption and providing the ability to continue to improve that software as needs arise.

  3. Feasibility study of the solar scientific instruments for Spacelab/Orbiter

    NASA Technical Reports Server (NTRS)

    Leritz, J.; Rasser, T.; Stone, E.; Lockhart, B.; Nobles, W.; Parham, J.; Eimers, D.; Peterson, D.; Barnhart, W.; Schrock, S.

    1981-01-01

    The feasibility and economics of mounting and operating a set of solar scientific instruments in the backup Skylab Apollo Telescope Mount (ATM) hardware were evaluated. The instruments used as the study test payload and integrated into the ATM were the Solar EUV Telescope/Spectrometer, the Solar Active Region Observing Telescope, and the Lyman Alpha White Light Coronagraph. The backup ATM hardware consists of a central cruciform structure, called the 'SPAR', a 'Sun End Canister', and a 'Multiple Docking Adapter End Canister'. The ATM hardware and software provide a structural interface for the instruments, a closely controlled thermal environment, and a very accurate attitude and pointing control capability. The hardware is identical to the set that flew on Skylab.

  4. The Software Design for the Wide-Field Infrared Explorer Attitude Control System

    NASA Technical Reports Server (NTRS)

    Anderson, Mark O.; Barnes, Kenneth C.; Melhorn, Charles M.; Phillips, Tom

    1998-01-01

    The Wide-Field Infrared Explorer (WIRE), currently scheduled for launch in September 1998, is the fifth of five spacecraft in the NASA/Goddard Small Explorer (SMEX) series. This paper presents the design of WIRE's Attitude Control System flight software (ACS FSW). WIRE is a momentum-biased, three-axis stabilized stellar pointer which provides high-accuracy pointing and autonomous acquisition for eight to ten stellar targets per orbit. WIRE's short mission life and limited cryogen supply motivate requirements for Sun and Earth avoidance constraints which are designed to prevent catastrophic instrument damage and to minimize the heat load on the cryostat. The FSW implements autonomous fault detection and handling (FDH) to enforce these instrument constraints and to perform several other checks which insure the safety of the spacecraft. The ACS FSW implements modules for sensor data processing, attitude determination, attitude control, guide star acquisition, actuator command generation, command/telemetry processing, and FDH. These software components are integrated with a hierarchical control mode managing module that dictates which software components are currently active. The lowest mode in the hierarchy is the 'safest' one, in the sense that it utilizes a minimal complement of sensors and actuators to keep the spacecraft in a stable configuration (power and pointing constraints are maintained). As higher modes in the hierarchy are achieved, the various software functions are activated by the mode manager, and an increasing level of attitude control accuracy is provided. If FDH detects a constraint violation or other anomaly, it triggers a safing transition to a lower control mode. The WIRE ACS FSW satisfies all target acquisition and pointing accuracy requirements, enforces all pointing constraints, provides the ground with a simple means for reconfiguring the system via table load, and meets all the demands of its real-time embedded environment (16 MHz Intel 80386 processor with 80387 coprocessor running under the VRTX operating system). The mode manager organizes and controls all the software modules used to accomplish these goals, and in particular, the FDH module is tightly coupled with the mode manager.
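
    A schematic sketch of a hierarchical mode manager with fault-detection-driven safing, in the spirit of the description above, is given below; the mode names and the constraint check are placeholders, not the WIRE flight code.

```python
# Toy hierarchical mode manager: higher modes add capability, and a fault
# detection check demotes the system toward the safest mode. Mode names and
# the constraint check are placeholders.
MODES = ["SUN_SAFE", "COARSE_POINT", "FINE_POINT", "SCIENCE"]   # safest first

class ModeManager:
    def __init__(self):
        self.index = 0                      # start in the safest mode

    @property
    def mode(self):
        return MODES[self.index]

    def promote(self):
        self.index = min(self.index + 1, len(MODES) - 1)

    def safe(self):
        """FDH response: drop to the next lower (safer) mode."""
        self.index = max(self.index - 1, 0)

def fdh_check(sun_angle_deg, limit_deg=30.0):
    """Placeholder constraint: instrument boresight must stay away from the Sun."""
    return sun_angle_deg >= limit_deg

if __name__ == "__main__":
    mgr = ModeManager()
    for step, sun_angle in enumerate([45, 60, 80, 25, 70]):
        if fdh_check(sun_angle):
            mgr.promote()
        else:
            mgr.safe()
        print(f"step {step}: sun angle {sun_angle:3d} deg -> mode {mgr.mode}")
```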

  5. Management software for a universal device communication controller: application to monitoring and computerized infusions.

    PubMed

    Coussaert, E J; Cantraine, F R

    1996-11-01

    We designed a virtual device for a local area network observing, operating and connecting devices to a personal computer. To keep the widest field of application, we proceeded by using abstraction and specification rules of software engineering in the design and implementation of the hardware and software for the Infusion Monitor. We specially built a box of hardware to interface multiple medical instruments with different communication protocols to a PC via a single serial port. We called that box the Universal Device Communication Controller (UDCC). The use of the virtual device driver is illustrated by the Infusion Monitor implemented for the anaesthesia and intensive care workstation.

  6. A digital flight control system verification laboratory

    NASA Technical Reports Server (NTRS)

    De Feo, P.; Saib, S.

    1982-01-01

    A NASA/FAA program has been established for the verification and validation of digital flight control systems (DFCS), with the primary objective being the development and analysis of automated verification tools. In order to enhance the capabilities, effectiveness, and ease of using the test environment, software verification tools can be applied. Tool design includes a static analyzer, an assertion generator, a symbolic executor, a dynamic analysis instrument, and an automated documentation generator. Static and dynamic tools are integrated with error detection capabilities, resulting in a facility which analyzes a representative testbed of DFCS software. Future investigations will focus in particular on increasing the number of software test tools and on a cost-effectiveness assessment.

  7. Spatio-temporally resolved spectral measurements of laser-produced plasma and semiautomated spectral measurement-control and analysis software

    NASA Astrophysics Data System (ADS)

    Cao, S. Q.; Su, M. G.; Min, Q.; Sun, D. X.; O'Sullivan, G.; Dong, C. Z.

    2018-02-01

    A spatio-temporally resolved spectral measurement system of highly charged ions from laser-produced plasmas is presented. Corresponding semiautomated computer software for measurement control and spectral analysis has been written to achieve the best synchronicity possible among the instruments. This avoids the tedious comparative processes between experimental and theoretical results. To demonstrate the capabilities of this system, a series of spatio-temporally resolved experiments of laser-produced Al plasmas have been performed and applied to benchmark the software. The system is a useful tool for studying the spectral structures of highly charged ions and for evaluating the spatio-temporal evolution of laser-produced plasmas.

  8. T-LECS: The Control Software System for MOIRCS

    NASA Astrophysics Data System (ADS)

    Yoshikawa, T.; Omata, K.; Konishi, M.; Ichikawa, T.; Suzuki, R.; Tokoku, C.; Katsuno, Y.; Nishimura, T.

    2006-07-01

    MOIRCS (Multi-Object Infrared Camera and Spectrograph) is a new instrument for the Subaru Telescope. We present the system design of the control software system for MOIRCS, named T-LECS (Tohoku University - Layered Electronic Control System). T-LECS is a PC-Linux based network distributed system. Two PCs equipped with the focal plane array system operate two HAWAII2 detectors, respectively, and another PC is used for user interfaces and a database server. Moreover, these PCs control various devices for observations distributed on a TCP/IP network. T-LECS has three interfaces; interfaces to the devices and two user interfaces. One of the user interfaces is to the integrated observation control system (Subaru Observation Software System) for observers, and another one provides the system developers the direct access to the devices of MOIRCS. In order to help the communication between these interfaces, we employ an SQL database system.

  9. Fault Detection and Correction for the Solar Dynamics Observatory Attitude Control System

    NASA Technical Reports Server (NTRS)

    Starin, Scott R.; Vess, Melissa F.; Kenney, Thomas M.; Maldonado, Manuel D.; Morgenstern, Wendy M.

    2007-01-01

    The Solar Dynamics Observatory is an Explorer-class mission that will launch in early 2009. The spacecraft will operate in a geosynchronous orbit, sending data 24 hours a day to a dedicated ground station in White Sands, New Mexico. It will carry a suite of instruments designed to observe the Sun in multiple wavelengths at unprecedented resolution. The Atmospheric Imaging Assembly includes four telescopes with focal plane CCDs that can image the full solar disk in four different visible wavelengths. The Extreme-ultraviolet Variability Experiment will collect time-correlated data on the activity of the Sun's corona. The Helioseismic and Magnetic Imager will enable study of pressure waves moving through the body of the Sun. The attitude control system on Solar Dynamics Observatory is responsible for four main phases of activity. The physical safety of the spacecraft after separation must be guaranteed. Fine attitude determination and control must be sufficient for instrument calibration maneuvers. The mission science mode requires 2-arcsecond control according to error signals provided by guide telescopes on the Atmospheric Imaging Assembly, one of the three instruments to be carried. Lastly, accurate execution of linear and angular momentum changes to the spacecraft must be provided for momentum management and orbit maintenance. In this paper, single-fault tolerant fault detection and correction of the Solar Dynamics Observatory attitude control system is described. The attitude control hardware suite for the mission is catalogued, with special attention to redundancy at the hardware level. Four reaction wheels are used where any three are satisfactory. Four pairs of redundant thrusters are employed for orbit change maneuvers and momentum management. Three two-axis gyroscopes provide full redundancy for rate sensing. A digital Sun sensor and two autonomous star trackers provide two-out-of-three redundancy for fine attitude determination. The use of software to maximize chances of recovery from any hardware or software fault is detailed. A generic fault detection and correction software structure is used, allowing additions, deletions, and adjustments to fault detection and correction rules. This software structure is fed by in-line fault tests that are also able to take appropriate actions to avoid corruption of the data stream.
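
    A small sketch of the two flavours of redundancy management mentioned above follows: selection among three redundant measurements, and a table of fault-detection rules with associated corrective actions. The thresholds, telemetry names, and actions are placeholders, not the SDO flight rules.

```python
# Toy redundancy management: mid-value selection for two-out-of-three sensor
# redundancy, plus a small table of fault detection/correction rules.
# Thresholds, telemetry names, and actions are placeholders.
def mid_value_select(a, b, c):
    """With three redundant measurements, the median tolerates one bad sensor."""
    return sorted([a, b, c])[1]

FDC_RULES = [
    # (rule name, detection test, corrective action)
    ("wheel_speed_high", lambda t: t["wheel_rpm"] > 6000, "switch to spare wheel"),
    ("rate_disagree",    lambda t: t["gyro_spread"] > 0.5, "deselect outlier gyro"),
]

def run_fdc(telemetry):
    """Return the (rule, action) pairs triggered by the current telemetry."""
    return [(name, action) for name, test, action in FDC_RULES if test(telemetry)]

if __name__ == "__main__":
    print("selected rate:", mid_value_select(0.10, 0.11, 9.99))   # 9.99 is a failed gyro
    print("triggered rules:", run_fdc({"wheel_rpm": 6500, "gyro_spread": 0.1}))
```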

  10. SINBAD flight software, the on-board software of NOMAD in ExoMars 2016

    NASA Astrophysics Data System (ADS)

    Pastor-Morales, M. C.; Rodríguez-Gómez, Julio F.; Morales-Muñoz, Rafael; Gómez-López, Juan M.; Aparicio-del-Moral, Beatriz; Candini, Gian Paolo; Jerónimo-Zafra, Jose M.; López-Moreno, Jose J.; Robles-Muñoz, Nicolás. F.; Sanz-Mesa, Rosario; Neefs, Eddy; Vandaele, Ann C.; Drummond, Rachel; Thomas, Ian R.; Berkenbosch, Sophie; Clairquin, Roland; Delanoye, Sofie; Ristic, Bojan; Maes, Jeroen; Bonnewijn, Sabrina; Patel, Manish R.; Leese, Mark; Mason, Jon P.

    2016-07-01

    The Spacecraft INterface and control Board for NomAD (SINBAD) is an electronic interface designed by the Instituto de Astrofísica de Andalucía (IAA-CSIC). It is part of the Nadir and Occultation for MArs Discovery (NOMAD) instrument on board ESA's ExoMars Trace Gas Orbiter mission, which was launched in March 2016. The SINBAD Flight Software (SFS) is the software embedded in SINBAD; it is in charge of managing the interfaces, devices, data, observing sequences, patching, and contingencies of NOMAD. This paper presents the most remarkable aspects of the SFS design, as well as the main problems and lessons learned during the software development process.

  11. Investigating Near Space Interaction Regions: Developing a Remote Observatory

    NASA Astrophysics Data System (ADS)

    Gallant, M.; Mierkiewicz, E. J.; Oliversen, R. J.; Jaehnig, K.; Percival, J.; Harlander, J.; Englert, C. R.; Kallio, R.; Roesler, F. L.; Nossal, S. M.; Gardner, D.; Rosborough, S.

    2016-12-01

    The Investigating Near Space Interaction Regions (INSpIRe) effort will (1) establish an adaptable research station capable of contributing to terrestrial and planetary aeronomy; (2) integrate two state-of-the-art second generation Fabry-Perot (FP) and Spatial Heterodyne Spectrometers (SHS) into a remotely operable configuration; (3) deploy this instrumentation to a clear-air site, establishing a stable, well-calibrated observatory; (4) embark on a series of observations designed to contribute to three major areas of geocoronal research: geocoronal physics, structure/coupling, and variability. This poster describes the development of the INSpIRe remote observatory. Based at Embry-Riddle Aeronautical University (ERAU), the INSpIRe initiative provides a platform to encourage the next generation of researchers to apply knowledge gained in the classroom to real-world science and engineering. Students at ERAU contribute to the INSpIRe effort's hardware and software needs. Mechanical/optical systems are in design to bring light to any of four instruments. Control software is in development to allow remote users to control everything from dome and optical system operations to calibration and data collection. In April 2016, we also installed and tested our first science instrument in the INSpIRe trailer, the Redline DASH Demonstration Instrument (REDDI). REDDI uses Doppler Asymmetric Spatial Heterodyne (DASH) spectroscopy, and its deployment as part of INSpIRe is a collaborative research effort between the Naval Research Lab, St Cloud State University, and ERAU. Similar to a stepped Michelson device, REDDI measures oxygen (630.0 nm) winds from the thermosphere. REDDI is currently mounted in a temporary location under INSpIRe's main siderostat until its entrance optical system can be modified. First light tests produced good signal-to-noise fringes in ten-minute integrations, indicating that we will soon be able to measure thermospheric winds from our Daytona Beach testing site. Future work will involve installation and software integration of FP and SHS systems and the Embry-Riddle Instrument Control System. The INSpIRe project is funded through NSF-CAREER award AGS135231 and the NASA Planetary Solar System Observations Program. The REDDI instrument was supported by the Chief of Naval Research.

  12. Instrument Systems Analysis and Verification Facility (ISAVF) users guide

    NASA Technical Reports Server (NTRS)

    Davis, J. F.; Thomason, J. O.; Wolfgang, J. L.

    1985-01-01

    The ISAVF facility is primarily an interconnected system of computers, special purpose real time hardware, and associated generalized software systems, which will permit the Instrument System Analysts, Design Engineers and Instrument Scientists, to perform trade off studies, specification development, instrument modeling, and verification of the instrument, hardware performance. It is not the intent of the ISAVF to duplicate or replace existing special purpose facilities such as the Code 710 Optical Laboratories or the Code 750 Test and Evaluation facilities. The ISAVF will provide data acquisition and control services for these facilities, as needed, using remote computer stations attached to the main ISAVF computers via dedicated communication lines.

  13. Automatisms in EMIR instrument to improve operation, safety and maintenance

    NASA Astrophysics Data System (ADS)

    Fernández Izquierdo, Patricia; Núñez Cagigal, Miguel; Barreto Rodríguez, Roberto; Martínez Rey, Noelia; Santana Tschudi, Samuel; Barreto Cabrera, Maria; Patrón Recio, Jesús; Garzón López, Francisco

    2014-08-01

    EMIR is the NIR imager and multiobject spectrograph being built as a common-user instrument for the 10-m class GTC. Large cryogenic instruments demand a reliable design and specific hardware and software to increase their safety and productivity. The EMIR vacuum, cooling, and heating systems are monitored and partially controlled by a Programmable Logic Controller (PLC) in industrial format with a touch screen. The PLC aids the instrument operator in maintenance tasks, autonomously recovering vacuum if required or proposing preventive maintenance actions. The PLC and its associated hardware improve EMIR safety by reacting immediately to possible failure modes in the instrument or in external supplies, including hardware failures during the heating procedure or failure of the PLC itself. The EMIR PLC periodically provides detailed information about the status and alarms of vacuum and cooling components and external supplies.

  14. NASA Tech Briefs, February 2005

    NASA Technical Reports Server (NTRS)

    2005-01-01

    Topics discussed include: Instrumentation for Sensitive Gas Measurements; Apparatus for Testing Flat Specimens of Thermal Insulation; Quadrupole Ion Mass Spectrometer for Masses of 2 to 50 Da; Miniature Laser Doppler Velocimeter for Measuring Wall Shear; Coherent Laser Instrument Would Measure Range and Velocity; Printed Microinductors for Flexible Substrates; Digital Receiver for Microwave Radiometry; Printed Antennas Made Reconfigurable by Use of MEMS Switches; Traffic-Light-Preemption Vehicle-Transponder Software Module; Intersection-Controller Software Module; Central-Monitor Software Module; Estimating Effects of Multipath Propagation on GPS Signals; Parallel Adaptive Mesh Refinement Library; Predicting Noise From Aircraft Turbine-Engine Combustors; Generating Animated Displays of Spacecraft Orbits; Diagnosis and Prognosis of Weapon Systems; Training Software in Artificial-Intelligence Computing Techniques; APGEN Version 5.0; Single-Command Approach and Instrument Placement by a Robot on a Target; Three-Dimensional Audio Client Library; Isogrid Membranes for Precise, Singly Curved Reflectors; Nickel-Tin Electrode Materials for Nonaqueous Li-Ion Cells; Photocatalytic Coats in Glass Drinking-Water Bottles; Fast Laser Shutters With Low Vibratory Disturbances; Series-Connected Buck Boost Regulators; Space Physics Data Facility Web Services; Split-Resonator, Integrated-Post Vibratory Microgyroscope; Blended Buffet-Load-Alleviation System for Fighter Airplane; Gifford-McMahon/Joule-Thomson Refrigerator Cools to 2.5 K; High-Temperature, High-Load-Capacity Radial Magnetic Bearing; Fabrication of Spherical Reflectors in Outer Space; Automated Rapid Prototyping of 3D Ceramic Parts; Tissue Engineering Using Transfected Growth-Factor Genes; Automation of Vapor-Diffusion Growth of Protein Crystals; Atom Skimmers and Atom Lasers Utilizing Them; Gears Based on Carbon Nanotubes; Patched Off-Axis Bending/Twisting Actuators for Thin Mirrors; and Improving Control in a Joule-Thomson Refrigerator.

  15. CARMENES. IV: instrument control software

    NASA Astrophysics Data System (ADS)

    Guàrdia, Josep; Colomé, Josep; Ribas, Ignasi; Hagen, Hans-Jürgen; Morales, Rafael; Abril, Miguel; Galadí-Enríquez, David; Seifert, Walter; Sánchez Carrasco, Miguel A.; Quirrenbach, Andreas; Amado, Pedro J.; Caballero, Jose A.; Mandel, Holger

    2012-09-01

    The overall purpose of the CARMENES instrument is to perform high-precision measurements of radial velocities of late-type stars with long-term stability. CARMENES will be installed in 2014 at the 3.5 m telescope of the German-Spanish Astronomical Center at Calar Alto observatory (CAHA, Spain) and will be equipped with two spectrographs in the near-infrared and visible windows. The technology involved in such an instrument represents a challenge at all levels. Instrument coordination and management are handled by the Instrument Control System (ICS), which is responsible for carrying out the operations of the different subsystems and for providing a tool to operate the instrument at levels of user interaction from low to high. The main goal of the ICS and the CARMENES control layer architecture is to maximize instrument efficiency by reducing time overheads and by operating the instrument in an integrated manner. The ICS implements the CARMENES operational design. A description of the ICS architecture and the application programming interfaces for low- and high-level communication is given. The Internet Communications Engine is the technology selected to implement most of the interface protocols.

  16. Development of a portable multispectral thermal infrared camera

    NASA Technical Reports Server (NTRS)

    Osterwisch, Frederick G.

    1991-01-01

    The purpose of this research and development effort was to design and build a prototype instrument designated the 'Thermal Infrared Multispectral Camera' (TIRC). The Phase 2 effort was a continuation of the Phase 1 feasibility study and preliminary design for such an instrument. The completed instrument designated AA465 has application in the field of geologic remote sensing and exploration. The AA465 Thermal Infrared Camera (TIRC) System is a field-portable multispectral thermal infrared camera operating over the 8.0 - 13.0 micron wavelength range. Its primary function is to acquire two-dimensional thermal infrared images of user-selected scenes. Thermal infrared energy emitted by the scene is collected, dispersed into ten 0.5 micron wide channels, and then measured and recorded by the AA465 System. This multispectral information is presented in real time on a color display to be used by the operator to identify spectral and spatial variations in the scenes emissivity and/or irradiance. This fundamental instrument capability has a wide variety of commercial and research applications. While ideally suited for two-man operation in the field, the AA465 System can be transported and operated effectively by a single user. Functionally, the instrument operates as if it were a single exposure camera. System measurement sensitivity requirements dictate relatively long (several minutes) instrument exposure times. As such, the instrument is not suited for recording time-variant information. The AA465 was fabricated, assembled, tested, and documented during this Phase 2 work period. The detailed design and fabrication of the instrument was performed during the period of June 1989 to July 1990. The software development effort and instrument integration/test extended from July 1990 to February 1991. Software development included an operator interface/menu structure, instrument internal control functions, DSP image processing code, and a display algorithm coding program. The instrument was delivered to NASA in March 1991. Potential commercial and research uses for this instrument are in its primary application as a field geologists exploration tool. Other applications have been suggested but not investigated in depth. These are measurements of process control in commercial materials processing and quality control functions which require information on surface heterogeneity.

  17. Control and Information Systems for the National Ignition Facility

    DOE PAGES

    Brunton, Gordon; Casey, Allan; Christensen, Marvin; ...

    2017-03-23

    Orchestration of every National Ignition Facility (NIF) shot cycle is managed by the Integrated Computer Control System (ICCS), which uses a scalable software architecture running code on more than 1950 front-end processors, embedded controllers, and supervisory servers. The ICCS operates laser and industrial control hardware containing 66 000 control and monitor points to ensure that all of NIF’s laser beams arrive at the target within 30 ps of each other and are aligned to a pointing accuracy of less than 50 μm root-mean-square, while ensuring that a host of diagnostic instruments record data in a few billionths of a second. NIF’s automated control subsystems are built from a common object-oriented software framework that distributes the software across the computer network and achieves interoperation between different software languages and target architectures. A large suite of business and scientific software tools supports experimental planning, experimental setup, facility configuration, and post-shot analysis. Standard business services using open-source software, commercial workflow tools, and database and messaging technologies have been developed. An information technology infrastructure consisting of servers, network devices, and storage provides the foundation for these systems. Thus, this work is an overview of the control and information systems used to support a wide variety of experiments during the National Ignition Campaign.

  18. Control and Information Systems for the National Ignition Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brunton, Gordon; Casey, Allan; Christensen, Marvin

    Orchestration of every National Ignition Facility (NIF) shot cycle is managed by the Integrated Computer Control System (ICCS), which uses a scalable software architecture running code on more than 1950 front-end processors, embedded controllers, and supervisory servers. The ICCS operates laser and industrial control hardware containing 66 000 control and monitor points to ensure that all of NIF’s laser beams arrive at the target within 30 ps of each other and are aligned to a pointing accuracy of less than 50 μm root-mean-square, while ensuring that a host of diagnostic instruments record data in a few billionths of a second. NIF’s automated control subsystems are built from a common object-oriented software framework that distributes the software across the computer network and achieves interoperation between different software languages and target architectures. A large suite of business and scientific software tools supports experimental planning, experimental setup, facility configuration, and post-shot analysis. Standard business services using open-source software, commercial workflow tools, and database and messaging technologies have been developed. An information technology infrastructure consisting of servers, network devices, and storage provides the foundation for these systems. Thus, this work is an overview of the control and information systems used to support a wide variety of experiments during the National Ignition Campaign.

  19. Advanced laser stratospheric monitoring systems analyses

    NASA Technical Reports Server (NTRS)

    Larsen, J. C.

    1984-01-01

    This report describes the software support supplied by Systems and Applied Sciences Corporation for the study of Advanced Laser Stratospheric Monitoring Systems Analyses under contract No. NAS1-15806. It discusses improvements to the Langley spectroscopic database and the development of LHS instrument control software and of data analysis and validation software. The effect of diurnal variations on the concentrations of NO, NO2, and ClO retrieved from space-borne and balloon-borne measurement platforms is discussed, along with the selection of optimum IF channels for sensing stratospheric species from space.

  20. Perchlorate Detection at Nanomolar Concentrations by Surface-Enhanced Raman Scattering

    DTIC Science & Technology

    2009-01-01

    [Fragmentary record excerpts] Perchlorate (ClO4-) has emerged as a widespread environmental contaminant and has been detected in various food... Instrument details recoverable from the record: the grating light path was controlled by Renishaw WiRE software and spectra were analyzed with Galactic GRAMS software; particle size was determined by means of dynamic light scattering using a ZetaPlus particle size analyzer (Brookhaven Instruments, Holtsville, NY).

  1. Speckle interferometry. Data acquisition and control for the SPID instrument.

    NASA Astrophysics Data System (ADS)

    Altarac, S.; Tallon, M.; Thiebaut, E.; Foy, R.

    1998-08-01

    SPID (SPeckle Imaging by Deconvolution) is a new speckle camera currently under construction at CRAL-Observatoire de Lyon. Its high spectral resolution and high image restoration capabilities open up new astrophysical programs. SPID is composed of four main optical modules which are fully automated and computer controlled by software written in Tcl/Tk/Tix and C. This software provides intelligent assistance to the user by choosing observational parameters as a function of atmospheric parameters, computed in real time, and the desired restored image quality. Data acquisition is performed by a photon-counting detector (CP40). A VME-based computer running OS9 controls the detector and stores the data. The intelligent system runs under Linux on a PC, and a slave PC under DOS commands the motors. These three computers communicate through an Ethernet network. SPID can be considered a precursor of the very high spatial resolution camera for the VLT (Very Large Telescope, four 8-meter telescopes currently being built in Chile by the European Southern Observatory).

  2. Distributed Software for Observations in the Near Infrared

    NASA Astrophysics Data System (ADS)

    Gavryusev, V.; Baffa, C.; Giani, E.

    We have developed an integrated system that performs astronomical observations in near-infrared bands, operating the two-dimensional instruments ARNICA (http://helios.arcetri.astro.it:/home/idefix/Mosaic/instr/arnica/arnica.html) and LONGSP (http://helios.arcetri.astro.it:/home/idefix/Mosaic/instr/longsp/longsp.html) at the Italian National Infrared Facility. This software consists of several communicating processes, generally executed across a network but also able to run on a single computer. The user interface is organized as a widget-based X11 client. Interprocess communication is provided by sockets over TCP/IP. The processes devoted to hardware control (telescope and other instruments) are currently executed on a PC dedicated to this task under DESQview/X, while all other components (user interface, tools for data analysis, etc.) can also run under UNIX. The hardware-independent part of the software is based on the Athena Widget Set and is compiled with GNU C for maximum portability.

  3. Neutron probes for the Construction and Resource Utilization eXplorer (CRUX)

    NASA Technical Reports Server (NTRS)

    Elphic, R. C.; Hahn, S.; Lawrence, D. J.; Feldman, W. C.; Johnson, J. B.; Haldemann, A. F. C.

    2006-01-01

    The Construction and Resource Utilization eXplorer (CRUX) project is developing a flexible integrated suite of instruments with data fusion software and an executive controller for in situ regolith resource assessment and characterization.

  4. Adding Support to the ALMA Common Software for Real-Time Operations through the Usage of a POSIX-Compliant RTOS

    NASA Astrophysics Data System (ADS)

    Tobar, R. J.; von Brand, H.; Araya, M. A.; Juerges, T.

    2010-12-01

    The ALMA Common Software (ACS) framework lacks the real-time capabilities needed to control the antennas' instrumentation, as has been shown by previous work, which has led to non-portable workarounds to the problem. Indeed, measurements of the time service used in ACS, based on the Container/Component model, confirm this statement. This work addresses the problem of designing and integrating a real-time service for ACS, providing the framework with an implementation such that control operations on the different instruments can be performed within real-time constraints. This implementation is compared with the current time service, showing the difference between the two systems when they are subjected to common scenarios. In addition, the new implementation follows the POSIX specification, ensuring interoperability and portability across different operating systems.

  5. Beam Instrument Development System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DOOLITTLE, LAWRENCE; HUANG, GANG; DU, QIANG

    Beam Instrumentation Development System (BIDS) is a collection of common support libraries and modules developed during a series of Low-Level Radio Frequency (LLRF) control and timing/synchronization projects. BIDS includes a collection of Hardware Description Language (HDL) libraries and software libraries. The BIDS can be used for the development of any FPGA-based system, such as LLRF controllers. HDL code in this library is generic and supports common Digital Signal Processing (DSP) functions, FPGA-specific drivers (high-speed serial link wrappers, clock generation, etc.), ADC/DAC drivers, Ethernet MAC implementation, etc.

  6. EPICS Channel Access Server for LabVIEW

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhukov, Alexander P.

    It can be challenging to interface National Instruments LabVIEW (http://www.ni.com/labview/) with EPICS (http://www.aps.anl.gov/epics/). Such an interface is required when an instrument control program developed in LabVIEW also has to be part of a global control system, a frequent situation in large accelerator facilities. The Channel Access Server is written in LabVIEW, so it works on any hardware/software platform where LabVIEW is available. It provides full server functionality, so any EPICS client can communicate with it.
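
    From the client side, reading and writing a process variable served this way looks like any other channel access interaction; a short sketch follows, assuming the third-party pyepics client library is installed and using hypothetical PV names.

```python
# Minimal EPICS Channel Access client sketch. Assumes the third-party
# "pyepics" package is installed and that the PV names below exist on the network.
import epics

setpoint = epics.PV("DEMO:TEMP:SETPOINT")   # hypothetical PV served from LabVIEW
readback = epics.PV("DEMO:TEMP:READBACK")   # hypothetical readback PV

setpoint.put(25.0)                          # write a new setpoint
print("current temperature:", readback.get())
```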

  7. Earth Observing System (EOS)/Advanced Microwave Sounding Unit-A (AMSU-A) software assurance plan

    NASA Technical Reports Server (NTRS)

    Schwantje, Robert; Smith, Claude

    1994-01-01

    This document defines the responsibilities of Software Quality Assurance (SQA) for the development of the flight software installed in EOS/AMSU-A instruments and of the ground support software used in the test and integration of the EOS/AMSU-A instruments.

  8. Welding process modelling and control

    NASA Technical Reports Server (NTRS)

    Romine, Peter L.; Adenwala, Jinen A.

    1993-01-01

    The research and analysis performed, the software developed, and the hardware/software recommendations made during 1992 in the development of the PC-based data acquisition system supporting Welding Process Modeling and Control are reported. A need was identified by the Metals Processing Branch of NASA Marshall Space Flight Center for a mobile data acquisition and analysis system customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument, used strictly for data acquisition and analysis. Although the WMS supports many of the functions associated with process control, it is not intended to be used for welding process control.

  9. Instrument Control (iC) – An Open-Source Software to Automate Test Equipment

    PubMed Central

    Pernstich, K. P.

    2012-01-01

    It has become common practice to automate data acquisition from programmable instrumentation, and a range of different software solutions fulfill this task. Many routine measurements require sequential processing of certain tasks, for instance to adjust the temperature of a sample stage, take a measurement, and repeat that cycle for other temperatures. This paper introduces an open-source Java program that processes a series of text-based commands that define the measurement sequence. These commands are in an intuitive format which provides great flexibility and allows quick and easy adaptation to various measurement needs. For each of these commands, the iC-framework calls a corresponding Java method that addresses the specified instrument to perform the desired task. The functionality of iC can be extended with minimal programming effort in Java or Python, and new measurement equipment can be addressed by defining new commands in a text file without any programming. PMID:26900522
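
    The command-dispatch idea can be illustrated with a few lines of Python (the framework itself is written in Java); the command names and the example sequence below are made up and are not iC's actual syntax.

```python
# Toy text-command dispatcher in the spirit of the framework described above.
# Command names and the example sequence are made up, not iC's real syntax.
SEQUENCE = """
# set stage temperature, wait, then measure
set_temperature 300
wait 5
measure iv_curve
""".strip().splitlines()

def set_temperature(kelvin):
    print(f"[stage] setpoint -> {float(kelvin)} K")

def wait(seconds):
    print(f"[seq] waiting {float(seconds)} s")

def measure(what):
    print(f"[daq] acquiring '{what}'")

COMMANDS = {"set_temperature": set_temperature, "wait": wait, "measure": measure}

def run(lines):
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue                       # skip blanks and comments
        name, *args = line.split()
        COMMANDS[name](*args)              # dispatch to the handler for this command

if __name__ == "__main__":
    run(SEQUENCE)
```

    Adding a new capability then amounts to registering one more handler, which mirrors the extensibility argument made in the abstract.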

  10. Instrument Control (iC) - An Open-Source Software to Automate Test Equipment.

    PubMed

    Pernstich, K P

    2012-01-01

    It has become common practice to automate data acquisition from programmable instrumentation, and a range of different software solutions fulfill this task. Many routine measurements require sequential processing of certain tasks, for instance to adjust the temperature of a sample stage, take a measurement, and repeat that cycle for other temperatures. This paper introduces an open-source Java program that processes a series of text-based commands that define the measurement sequence. These commands are in an intuitive format which provides great flexibility and allows quick and easy adaptation to various measurement needs. For each of these commands, the iC-framework calls a corresponding Java method that addresses the specified instrument to perform the desired task. The functionality of iC can be extended with minimal programming effort in Java or Python, and new measurement equipment can be addressed by defining new commands in a text file without any programming.
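
    The exact iC command syntax is defined by the framework itself; purely as a hedged sketch of the general pattern (hypothetical command names, a simulated instrument, and plain Python rather than the iC codebase), a text-driven dispatcher can look like this:

      # Illustrative sketch, not iC code: each script line is parsed as
      # "command arg1 arg2 ..." and dispatched to a method of the same name
      # on a simulated instrument object.
      class SimulatedStage:
          def __init__(self):
              self.temperature = 295.0

          def setTemp(self, kelvin):
              self.temperature = float(kelvin)
              print(f"stage set to {self.temperature} K")

          def measure(self):
              print(f"measurement taken at {self.temperature} K")

      def run_script(lines, instrument):
          for line in lines:
              line = line.strip()
              if not line or line.startswith("%"):   # skip blanks and comments
                  continue
              command, *args = line.split()
              getattr(instrument, command)(*args)

      script = """
      % hypothetical measurement sequence
      setTemp 300
      measure
      setTemp 320
      measure
      """
      run_script(script.splitlines(), SimulatedStage())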

  11. Simulation-Based Verification of Autonomous Controllers via Livingstone PathFinder

    NASA Technical Reports Server (NTRS)

    Lindsey, A. E.; Pecheur, Charles

    2004-01-01

    AI software is often used as a means for providing greater autonomy to automated systems, capable of coping with harsh and unpredictable environments. Due in part to the enormous space of possible situations that they aim to address, autonomous systems pose a serious challenge to traditional test-based verification approaches. Efficient verification approaches need to be perfected before these systems can reliably control critical applications. This publication describes Livingstone PathFinder (LPF), a verification tool for autonomous control software. LPF applies state space exploration algorithms to an instrumented testbed, consisting of the controller embedded in a simulated operating environment. Although LPF has focused on NASA's Livingstone model-based diagnosis system applications, the architecture is modular and adaptable to other systems. This article presents different facets of LPF and experimental results from applying the software to a Livingstone model of the main propulsion feed subsystem for a prototype space vehicle.

  12. Detailed design of a Ride Quality Augmentation System for commuter aircraft

    NASA Technical Reports Server (NTRS)

    Suikat, Reiner; Donaldson, Kent E.; Downing, David R.

    1989-01-01

    The design of a Ride Quality Augmentation System (RQAS) for commuter aircraft is documented. The RQAS is designed for a Cessna 402B, an 8-passenger twin-engine propeller aircraft representative of this class. The purpose of the RQAS is the reduction of vertical and lateral accelerations of the aircraft due to atmospheric turbulence by the application of active control. The detailed design of the hardware (the aircraft modifications, the Ride Quality Instrumentation System (RQIS), and the required computer software) is examined. The aircraft modifications, consisting of the dedicated control surfaces and the hydraulic actuation system, were designed at Cessna Aircraft by the Kansas University Flight Research Laboratory. The instrumentation system, which consists of the sensor package, the flight computer, a Data Acquisition System, and the pilot and test engineer control panels, was designed by NASA-Langley. The overall system design and the design of the software, both for flight control algorithms and ground system checkout, are detailed. The system performance is predicted from linear simulation results and from power spectral densities of the aircraft response to a Dryden gust. The results indicate that reductions in both vertical and lateral accelerations are achievable.
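
    The report itself specifies the turbulence model in detail; purely as a hedged pointer, the sketch below evaluates the standard Dryden vertical-gust power spectral density commonly used in such ride-quality analyses, with illustrative (not report-derived) values for airspeed, scale length, and gust intensity.

      # Standard Dryden vertical-gust PSD (one common form), shown only to
      # illustrate the kind of gust input spectrum a ride-quality analysis uses.
      # V, L_w and sigma_w below are illustrative values, not from the report.
      import math

      def dryden_vertical_psd(omega, V=100.0, L_w=300.0, sigma_w=1.5):
          """Phi_w(omega) in (m/s)^2 per (rad/s); omega in rad/s."""
          x = L_w * omega / V
          return sigma_w ** 2 * (L_w / (math.pi * V)) * (1.0 + 3.0 * x ** 2) / (1.0 + x ** 2) ** 2

      for omega in (0.1, 1.0, 10.0):
          print(f"omega = {omega:5.1f} rad/s  Phi_w = {dryden_vertical_psd(omega):.4e}")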

  13. Formal assessment instrument for ensuring the security of NASA's networks, systems and software

    NASA Technical Reports Server (NTRS)

    Gilliam, D. P.; Powell, J. D.; Sherif, J.

    2002-01-01

    To address the problem of security for NASA's networks, systems and software, NASA has funded the Jet Propulsion Lab in conjunction with UC Davis to begin work on developing a software security assessment instrument for use in the software development and maintenance life cycle.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Voorhees, D.R.; Rossmassler, R.L.; Zimmer, G.

    The tritium analytical system at TFTR is used to determine the purity of tritium-bearing gas streams in order to provide inventory and accountability measurements. The system includes a quadrupole mass spectrometer (QMS) and beta scintillator originally configured at Monsanto Mound Research Laboratory. The system was commissioned and tested in 1992 and is used daily for analysis of calibration standards, incoming tritium shipments, gases evolved from uranium storage beds, and effluent gases from the tokamak. The instruments are controlled by a personal computer with customized software written with a graphical programming system designed for data acquisition and control. A discussion of the instrumentation, control systems, system parameters, procedural methods, algorithms, and operational issues will be presented. Measurements of gas holding tanks and tritiated water waste streams using ion chamber instrumentation are discussed elsewhere. 7 refs., 3 figs.

  15. TELICS—A Telescope Instrument Control System for Small/Medium Sized Astronomical Observatories

    NASA Astrophysics Data System (ADS)

    Srivastava, Mudit K.; Ramaprakash, A. N.; Burse, Mahesh P.; Chordia, Pravin A.; Chillal, Kalpesh S.; Mestry, Vilas B.; Das, Hillol K.; Kohok, Abhay A.

    2009-10-01

    For any modern astronomical observatory, it is essential to have an efficient interface between the telescope and its back-end instruments. However, for small and medium-sized observatories, this requirement is often limited by tight financial constraints. Therefore a simple yet versatile and low-cost control system is required for such observatories to minimize cost and effort. Here we report the development of a modern, multipurpose instrument control system TELICS (Telescope Instrument Control System) to integrate the controls of various instruments and devices mounted on the telescope. TELICS consists of an embedded hardware unit known as a common control unit (CCU) in combination with Linux-based data acquisition and user interface. The hardware of the CCU is built around the ATmega 128 microcontroller (Atmel Corp.) and is designed with a backplane, master-slave architecture. A Qt-based graphical user interface (GUI) has been developed and the back-end application software is based on C/C++. TELICS provides feedback mechanisms that give the operator good visibility and a quick-look display of the status and modes of instruments as well as data. TELICS has been used for regular science observations since 2008 March on the 2 m, f/10 IUCAA Telescope located at Girawali in Pune, India.

  16. Software for simulation of a computed tomography imaging spectrometer using optical design software

    NASA Astrophysics Data System (ADS)

    Spuhler, Peter T.; Willer, Mark R.; Volin, Curtis E.; Descour, Michael R.; Dereniak, Eustace L.

    2000-11-01

    Our imaging spectrometer simulation software, known as Eikon, should improve and speed up the design of a Computed Tomography Imaging Spectrometer (CTIS). Eikon uses existing ray-tracing software to simulate a virtual instrument, enabling designers to run through the design, calibration, and data acquisition virtually and saving significant cost and time when designing an instrument. We anticipate that Eikon simulations will improve future CTIS designs by allowing engineers to explore more instrument options.

  17. Designing Spatial Visualisation Tasks for Middle School Students with a 3D Modelling Software: An Instrumental Approach

    ERIC Educational Resources Information Center

    Turgut, Melih; Uygan, Candas

    2015-01-01

    In this work, certain task designs to enhance middle school students' spatial visualisation ability, in the context of an instrumental approach, have been developed. 3D modelling software, SketchUp®, was used. In the design process, software tools were focused on and, thereafter, the aim was to interpret the instrumental genesis and spatial…

  18. Distributed Framework for Dynamic Telescope and Instrument Control

    NASA Astrophysics Data System (ADS)

    Ames, Troy J.; Case, Lynne

    2002-12-01

    Traditionally, instrument command and control systems have been developed specifically for a single instrument. Such solutions are frequently expensive and are inflexible to support the next instrument development effort. NASA Goddard Space Flight Center is developing an extensible framework, known as Instrument Remote Control (IRC) that applies to any kind of instrument that can be controlled by a computer. IRC combines the platform independent processing capabilities of Java with the power of the Extensible Markup Language (XML). A key aspect of the architecture is software that is driven by an instrument description, written using the Instrument Markup Language (IML). IML is an XML dialect used to describe graphical user interfaces to control and monitor the instrument, command sets and command formats, data streams, communication mechanisms, and data processing algorithms. The IRC framework provides the ability to communicate to components anywhere on a network using the JXTA protocol for dynamic discovery of distributed components. JXTA (see http://www.jxta.org) is a generalized protocol that allows any devices connected by a network to communicate in a peer-to-peer manner. IRC uses JXTA to advertise a device's IML and discover devices of interest on the network. Devices can join or leave the network and thus join or leave the instrument control environment of IRC. Currently, several astronomical instruments are working with the IRC development team to develop custom components for IRC to control their instruments. These instruments include: High resolution Airborne Wideband Camera (HAWC), a first light instrument for the Stratospheric Observatory for Infrared Astronomy (SOFIA); Submillimeter And Far Infrared Experiment (SAFIRE), a principal investigator instrument for SOFIA; and Fabry-Perot Interferometer Bolometer Research Experiment (FIBRE), a prototype of the SAFIRE instrument, used at the Caltech Submillimeter Observatory (CSO). Most recently, we have been working with the Submillimetre High Angular Resolution Camera IInd Generation (SHARCII) at the CSO to investigate using IRC capabilities with the SHARC instrument.
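
    IML itself is defined by the IRC project; as a hedged sketch of the general idea of description-driven control only (hypothetical XML tags and command formats, not actual IML), an instrument description can be parsed and used to build outgoing commands:

      # Illustrative sketch of description-driven instrument control, not IRC/IML
      # code: a hypothetical XML description lists commands and their formats,
      # and command strings are built from it at run time.
      import xml.etree.ElementTree as ET

      DESCRIPTION = """
      <instrument name="DemoCamera">
        <command name="setExposure" format="EXP {seconds:.3f}"/>
        <command name="startScan"   format="SCAN START"/>
      </instrument>
      """

      root = ET.fromstring(DESCRIPTION)
      formats = {c.get("name"): c.get("format") for c in root.findall("command")}

      def build_command(name, **params):
          # Format a command string according to the instrument description.
          return formats[name].format(**params)

      print(build_command("setExposure", seconds=0.25))  # -> "EXP 0.250"
      print(build_command("startScan"))                  # -> "SCAN START"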

  19. Distributed Framework for Dynamic Telescope and Instrument Control

    NASA Technical Reports Server (NTRS)

    Ames, Troy J.; Case, Lynne

    2002-01-01

    Traditionally, instrument command and control systems have been developed specifically for a single instrument. Such solutions are frequently expensive and are inflexible to support the next instrument development effort. NASA Goddard Space Flight Center is developing an extensible framework, known as Instrument Remote Control (IRC) that applies to any kind of instrument that can be controlled by a computer. IRC combines the platform independent processing capabilities of Java with the power of the Extensible Markup Language (XML). A key aspect of the architecture is software that is driven by an instrument description, written using the Instrument Markup Language (IML). IML is an XML dialect used to describe graphical user interfaces to control and monitor the instrument, command sets and command formats, data streams, communication mechanisms, and data processing algorithms. The IRC framework provides the ability to communicate to components anywhere on a network using the JXTA protocol for dynamic discovery of distributed components. JXTA (see http://www.jxta.org) is a generalized protocol that allows any devices connected by a network to communicate in a peer-to-peer manner. IRC uses JXTA to advertise a device's IML and discover devices of interest on the network. Devices can join or leave the network and thus join or leave the instrument control environment of IRC. Currently, several astronomical instruments are working with the IRC development team to develop custom components for IRC to control their instruments. These instruments include: High resolution Airborne Wideband Camera (HAWC), a first light instrument for the Stratospheric Observatory for Infrared Astronomy (SOFIA); Submillimeter And Far Infrared Experiment (SAFIRE), a Principal Investigator instrument for SOFIA; and Fabry-Perot Interferometer Bolometer Research Experiment (FIBRE), a prototype of the SAFIRE instrument, used at the Caltech Submillimeter Observatory (CSO). Most recently, we have been working with the Submillimetre High Angular Resolution Camera IInd Generation (SHARCII) at the CSO to investigate using IRC capabilities with the SHARC instrument.

  20. Microcomputer control soft tube measuring-testing instrument

    NASA Astrophysics Data System (ADS)

    Zhou, Yanzhou; Jiang, Xiu-Zhen; Wang, Wen-Yi

    1993-09-01

    Soft tubes are key, easily worn parts used in large numbers by railway vehicles. For a long time, measurement and testing of these tubes was done by hand. In cooperation with the Harbin Railway Bureau, we have recently developed a new automatic measuring and testing instrument. This paper presents the instrument's structure, properties, and measuring principle in detail. The core of the system is an Intel 80C31 single-chip processor, which collects and processes data, displays the results on an LED display, and controls electromagnetic valves and motors. Five soft tubes are measured and tested at the same time, and the whole process is completed automatically. Effective counter-electromagnetic-interference methods are adopted in both hardware and software, so the performance of the instrument is improved significantly. In long-term use the instrument has proven reliable and practical, and it solves a difficult problem in railway transportation.

  1. Realizing software longevity over a system's lifetime

    NASA Astrophysics Data System (ADS)

    Lanclos, Kyle; Deich, William T. S.; Kibrick, Robert I.; Allen, Steven L.; Gates, John

    2010-07-01

    A successful instrument or telescope will measure its productive lifetime in decades; over that period, the technology behind the control hardware and software will evolve, and be replaced on a per-component basis. These new components must successfully integrate with the old, and the difficulty of that integration depends strongly on the design decisions made over the course of the facility's history. The same decisions impact the ultimate success of each upgrade, as measured in terms of observing efficiency and maintenance cost. We offer a case study of these critical design decisions, analyzing the layers of software deployed for instruments under the care of UCO/Lick Observatory, including recent upgrades to the Low Resolution Imaging Spectrometer (LRIS) at Keck Observatory in Hawaii, as well as the Kast spectrograph, Lick Adaptive Optics system, and Hamilton spectrograph, all at Lick Observatory's Shane 3-meter Telescope at Mt. Hamilton. These issues play directly into design considerations for the software intended for use at the next generation of telescopes, such as the Thirty Meter Telescope. We conduct our analysis with the future of observational astronomy infrastructure firmly in mind.

  2. Benefits of an automated GLP final report preparation software solution.

    PubMed

    Elvebak, Larry E

    2011-07-01

    The final product of analytical laboratories performing US FDA-regulated (or GLP) method validation and bioanalysis studies is the final report. Although there are commercial-off-the-shelf (COTS) software/instrument systems available to laboratory managers to automate and manage almost every aspect of the instrumental and sample-handling processes of GLP studies, there are few software systems available to fully manage the GLP final report preparation process. This lack of appropriate COTS tools results in the implementation of rather Byzantine and manual processes to cobble together all the information needed to generate a GLP final report. The manual nature of these processes results in the need for several iterative quality control and quality assurance events to ensure data accuracy and report formatting. The industry is in need of a COTS solution that gives laboratory managers and study directors the ability to manage as many portions as possible of the GLP final report writing process and the ability to generate a GLP final report with the click of a button. This article describes the COTS software features needed to give laboratory managers and study directors such a solution.

  3. Telescience at the University of California, Berkeley

    NASA Technical Reports Server (NTRS)

    Chakrabarti, S.; Marchant, W. T.; Kaplan, G. C.; Dobson, C. A.; Jernigan, J. G.; Lampton, M. L.; Malina, R. F.

    1989-01-01

    The University of California at Berkeley (UCB) is a member of a university consortium involved in telescience testbed activities under the sponsorship of NASA. Our Telescience Testbed Project consists of three experiments using flight hardware being developed for the Extreme Ultraviolet Explorer project at UCB's Space Sciences Laboratory. The first one is a teleoperation experiment investigating remote instrument control using a computer network such as the Internet. The second experiment is an effort to develop a system for operation of a network of remote workstations allowing coordinated software development, evaluation, and use by widely dispersed groups. The final experiment concerns simulation as a method to facilitate the concurrent development of instrument hardware and support software. We describe our progress in these areas.

  4. JASMINE simulator

    NASA Astrophysics Data System (ADS)

    Yamada, Y.; Gouda, N.; Yano, T.; Sako, N.; Hatsutori, Y.; Tanaka, T.; Yamauchi, M.

    We explain the simulation tools in the JASMINE project (the JASMINE simulator). The JASMINE project stands at the stage where its basic design will be determined in a few years. It is therefore very important to simulate the data stream generated by the astrometric fields of JASMINE in order to support investigations of error budgets, sampling strategy, data compression, data analysis, scientific performance, etc. Of course, component simulations are needed, but total simulations which include all components from observation target to satellite system are also very important. We find that new software technologies, such as Object Oriented (OO) methodologies, are ideal tools for the simulation system of JASMINE (the JASMINE simulator). The simulation system should include all objects in JASMINE, such as observation techniques, models of instruments and bus design, orbit, data transfer, and data analysis, in order to resolve all issues which can be expected beforehand and to make it easy to cope with unexpected problems which might occur during the mission. The JASMINE Simulator is therefore designed to handle events such as photons from astronomical objects, control signals for devices, and disturbances to the satellite attitude, passed successively through instrument models such as mirrors and detectors. The simulator is also applied to the technical demonstration "Nano-JASMINE". The accuracy of ordinary sensors is not sufficient for initial-phase attitude control, and the mission instruments may serve as a good sensor for this purpose. The problem of attitude control in the initial phase is a good example for this software because the problem is closely related to both the mission instruments and the satellite bus systems.

  5. JASMINE Simulator

    NASA Astrophysics Data System (ADS)

    Yamada, Y.; Gouda, N.; Yano, T.; Kobayashi, Y.; Suganuma, M.; Tsujimoto, T.; Sako, N.; Hatsutori, Y.; Tanaka, T.

    2006-08-01

    We explain the simulation tools in the JASMINE project (the JASMINE simulator). The JASMINE project stands at the stage where its basic design will be determined in a few years. It is therefore very important to simulate the data stream generated by the astrometric fields of JASMINE in order to support investigations of error budgets, sampling strategy, data compression, data analysis, scientific performance, etc. Of course, component simulations are needed, but total simulations which include all components from observation target to satellite system are also very important. We find that new software technologies, such as Object Oriented (OO) methodologies, are ideal tools for the simulation system of JASMINE (the JASMINE simulator). The simulation system should include all objects in JASMINE, such as observation techniques, models of instruments and bus design, orbit, data transfer, and data analysis, in order to resolve all issues which can be expected beforehand and to make it easy to cope with unexpected problems which might occur during the mission. The JASMINE Simulator is therefore designed to handle events such as photons from astronomical objects, control signals for devices, and disturbances to the satellite attitude, passed successively through instrument models such as mirrors and detectors. The simulator is also applied to the technical demonstration "Nano-JASMINE". The accuracy of ordinary sensors is not sufficient for initial-phase attitude control, and the mission instruments may serve as a good sensor for this purpose. The problem of attitude control in the initial phase is a good example for this software because the problem is closely related to both the mission instruments and the satellite bus systems.
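
    As a hedged sketch of the event-driven, object-oriented structure described above (hypothetical class names, not actual JASMINE simulator code), events such as photons can be handled successively by a chain of instrument-model objects:

      # Illustrative object-oriented event pipeline in the spirit of the
      # description above: each event is passed through the instrument models
      # (mirror, detector) in turn.
      class PhotonEvent:
          def __init__(self, wavelength_nm):
              self.wavelength_nm = wavelength_nm

      class Mirror:
          def handle(self, event):
              print(f"mirror reflects photon at {event.wavelength_nm} nm")
              return event  # pass the event on unchanged

      class Detector:
          def __init__(self):
              self.counts = 0

          def handle(self, event):
              self.counts += 1
              print(f"detector records photon ({self.counts} total)")
              return event

      pipeline = [Mirror(), Detector()]
      for wavelength in (550, 620, 430):
          event = PhotonEvent(wavelength)
          for component in pipeline:
              event = component.handle(event)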

  6. Tools Automate Spacecraft Testing, Operation

    NASA Technical Reports Server (NTRS)

    2010-01-01

    "NASA began the Small Explorer (SMEX) program to develop spacecraft to advance astrophysics and space physics. As one of the entities supporting software development at Goddard Space Flight Center, the Hammers Company Inc. (tHC Inc.), of Greenbelt, Maryland, developed the Integrated Test and Operations System to support SMEX. Later, the company received additional Small Business Innovation Research (SBIR) funding from Goddard for a tool to facilitate the development of flight software called VirtualSat. NASA uses the tools to support 15 satellites, and the aerospace industry is using them to develop science instruments, spacecraft computer systems, and navigation and control software."

  7. A Low-Cost Computer-Controlled Arduino-Based Educational Laboratory System for Teaching the Fundamentals of Photovoltaic Cells

    ERIC Educational Resources Information Center

    Zachariadou, K.; Yiasemides, K.; Trougkakos, N.

    2012-01-01

    We present a low-cost, fully computer-controlled, Arduino-based, educational laboratory (SolarInsight) to be used in undergraduate university courses concerned with electrical engineering and physics. The major goal of the system is to provide students with the necessary instrumentation, software tools and methodology in order to learn fundamental…

  8. An update on the development of IO:I: a NIR imager for the Liverpool Telescope

    NASA Astrophysics Data System (ADS)

    Barnsley, R. M.; Steele, I. A.; Bates, S. D.; Mottram, C. J.

    2014-07-01

    IO:I is a new instrument in development for the Liverpool Telescope, extending current imaging capabilities beyond the optical and into the near infrared. Cost has been minimised by use of a previously decommissioned instrument's dewar as the base for a prototype, and retrofitting it with a 1.7μm cutoff Hawaii-2RG HgCdTe detector, SIDECAR ASIC controller and JADE2 interface card. Development of this prototype is nearing completion and will be operational mid 2014. In this paper, the mechanical, electronic and cryogenic facets of the dewar retrofitting process will be discussed together with a description of the instrument control system software/hardware setup. Finally, a brief overview of some initial testing undertaken on the engineering grade array will be given, along with future commissioning plans for the instrument.

  9. Software for imaging phase-shift interference microscope

    NASA Astrophysics Data System (ADS)

    Malinovski, I.; França, R. S.; Couceiro, I. B.

    2018-03-01

    In recent years an absolute interference microscope was created at the National Metrology Institute of Brazil (INMETRO). By principle of operation, the instrument is an imaging phase-shifting interferometer (PSI) equipped with two stabilized lasers of different colour as traceable reference wavelength sources. We report here some progress in the development of the software for this instrument. The status of the ongoing internal validation and verification of the software is also reported. In contrast with the standard PSI method, a different phase-evaluation methodology is applied; therefore, instrument-specific procedures for software validation and verification are adapted and discussed.
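
    The abstract notes that the instrument's phase evaluation departs from the standard PSI method; purely for orientation, and not as the instrument's own algorithm, the sketch below shows the textbook four-step phase-shifting calculation that "standard PSI" usually refers to.

      # Textbook four-step phase-shifting interferometry (PSI) calculation,
      # shown only as the standard baseline the abstract contrasts itself with.
      # I1..I4 are intensities recorded at phase shifts of 0, 90, 180, 270 degrees.
      import math

      def four_step_phase(I1, I2, I3, I4):
          # Wrapped phase in radians recovered from four phase-shifted frames.
          return math.atan2(I4 - I2, I1 - I3)

      # Synthetic check: a fringe with true phase 1.0 rad and unit modulation.
      true_phase = 1.0
      frames = [1.0 + math.cos(true_phase + k * math.pi / 2) for k in range(4)]
      print(f"recovered phase: {four_step_phase(*frames):.3f} rad")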

  10. Interaction design challenges and solutions for ALMA operations monitoring and control

    NASA Astrophysics Data System (ADS)

    Pietriga, Emmanuel; Cubaud, Pierre; Schwarz, Joseph; Primet, Romain; Schilling, Marcus; Barkats, Denis; Barrios, Emilio; Vila Vilaro, Baltasar

    2012-09-01

    The ALMA radio-telescope, currently under construction in northern Chile, is a very advanced instrument that presents numerous challenges. From a software perspective, one critical issue is the design of graphical user interfaces for operations monitoring and control that scale to the complexity of the system and to the massive amounts of data users are faced with. Early experience operating the telescope with only a few antennas has shown that conventional user interface technologies are not adequate in this context. They consume too much screen real-estate, require many unnecessary interactions to access relevant information, and fail to provide operators and astronomers with a clear mental map of the instrument. They increase extraneous cognitive load, impeding tasks that call for quick diagnosis and action. To address this challenge, the ALMA software division adopted a user-centered design approach. For the last two years, astronomers, operators, software engineers and human-computer interaction researchers have been involved in participatory design workshops, with the aim of designing better user interfaces based on state-of-the-art visualization techniques. This paper describes the process that led to the development of those interface components and to a proposal for the science and operations console setup: brainstorming sessions, rapid prototyping, joint implementation work involving software engineers and human-computer interaction researchers, feedback collection from a broader range of users, further iterations and testing.

  11. A Closed-Loop Proportional-Integral (PI) Control Software for Fully Mechanically Controlled Automated Electron Microscopic Tomography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    REN, GANG; LIU, JINXIN; LI, HONGCHANG

    A closed-loop proportional-integral (PI) control software is provided for fully mechanically controlled automated electron microscopic tomography. The software is developed based on Gatan DigitalMicrograph and is compatible with the Zeiss LIBRA 120 transmission electron microscope; it can be extended to other TEM instruments with modification. The software consists of a graphical user interface, a digital PI controller, an image analyzing unit, and other drive units (an image acquisition unit and a goniometer drive unit). During a tomography data collection process, the image analyzing unit analyzes both the accumulated shift and the defocus value of the latest acquired image, and provides the results to the digital PI controller. The digital PI controller compares the results with the preset values and determines the optimum adjustments of the goniometer. The goniometer drive unit adjusts the spatial position of the specimen according to the instructions given by the digital PI controller for the next tilt angle and image acquisition. The goniometer drive unit achieves high-precision positioning by using a backlash elimination method. The major benefits of the software are: 1) the goniometer drive unit keeps pre-aligned/optimized beam conditions unchanged and achieves position tracking solely through mechanical control; 2) the image analyzing unit relies on only historical data and therefore does not require additional images/exposures; 3) the PI controller enables the system to dynamically track the imaging target with extremely low system error.
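
    As a hedged illustration of the control law named above (made-up gains and a trivial stand-in for the microscope, not the actual DigitalMicrograph script), a discrete proportional-integral correction that drives a measured image shift toward a preset target can be written as:

      # Minimal discrete PI controller sketch: drives a measured quantity (e.g.
      # an accumulated image shift) toward a setpoint. Gains and the "plant"
      # below are illustrative only and unrelated to the microscope software.
      class PIController:
          def __init__(self, kp, ki, setpoint):
              self.kp, self.ki, self.setpoint = kp, ki, setpoint
              self.integral = 0.0

          def update(self, measurement, dt=1.0):
              error = self.setpoint - measurement
              self.integral += error * dt
              return self.kp * error + self.ki * self.integral

      controller = PIController(kp=0.5, ki=0.1, setpoint=0.0)
      shift = 8.0  # pretend the image has drifted by 8 units
      for step in range(10):
          correction = controller.update(shift)
          shift += correction  # stand-in for moving the goniometer
          print(f"step {step}: shift = {shift:+.3f}")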

  12. Multiple-function multi-input/multi-output digital control and on-line analysis

    NASA Technical Reports Server (NTRS)

    Hoadley, Sherwood T.; Wieseman, Carol D.; Mcgraw, Sandra M.

    1992-01-01

    The design and capabilities of two digital controller systems for aeroelastic wind-tunnel models are described. The first allowed control of flutter while performing roll maneuvers with wing load control as well as coordinating the acquisition, storage, and transfer of data for on-line analysis. This system, which employs several digital signal multi-processor (DSP) boards programmed in high-level software languages, is housed in a SUN Workstation environment. A second DCS provides a measure of wind-tunnel safety by functioning as a trip system during testing in the case of high model dynamic response or in case the first DCS fails. The second DCS uses National Instruments LabVIEW Software and Hardware within a Macintosh environment.

  13. Portable gas chromatograph-mass spectrometer

    DOEpatents

    Andresen, Brian D.; Eckels, Joel D.; Kimmons, James F.; Myers, David W.

    1996-01-01

    A gas chromatograph-mass spectrometer (GC-MS) for use as a field portable organic chemical analysis instrument. The GC-MS is designed to be contained in a standard size suitcase, weighs less than 70 pounds, and requires less than 600 watts of electrical power at peak power (all systems on). The GC-MS includes: a conduction heated, forced air cooled small bore capillary gas chromatograph, a small injector assembly, a self-contained ion/sorption pump vacuum system, a hydrogen supply, a dual computer system used to control the hardware and acquire spectrum data, and operational software used to control the pumping system and the gas chromatograph. This instrument incorporates a modified commercial quadrupole mass spectrometer to achieve the instrument sensitivity and mass resolution characteristic of laboratory bench top units.

  14. Security Verification Techniques Applied to PatchLink COTS Software

    NASA Technical Reports Server (NTRS)

    Gilliam, David P.; Powell, John D.; Bishop, Matt; Andrew, Chris; Jog, Sameer

    2006-01-01

    Verification of the security of software artifacts is a challenging task. An integrated approach that combines verification techniques can increase the confidence in the security of software artifacts. Such an approach has been developed by the Jet Propulsion Laboratory (JPL) and the University of California at Davis (UC Davis). Two security verification instruments were developed and then piloted on PatchLink's UNIX Agent, a Commercial-Off-The-Shelf (COTS) software product, to assess the value of the instruments and the approach. The two instruments are the Flexible Modeling Framework (FMF) -- a model-based verification instrument (JPL), and a Property-Based Tester (UC Davis). Security properties were formally specified for the COTS artifact and then verified using these instruments. The results were then reviewed to determine the effectiveness of the approach and the security of the COTS product.

  15. Development of Labview based data acquisition and multichannel analyzer software for radioactive particle tracking system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rahman, Nur Aira Abd, E-mail: nur-aira@nuclearmalaysia.gov.my; Yussup, Nolida; Ibrahim, Maslina Bt. Mohd

    2015-04-29

    A DAQ (data acquisition) software package called RPTv2.0 has been developed for the Radioactive Particle Tracking System at the Malaysian Nuclear Agency. RPTv2.0 features a scanning control GUI, data acquisition from a 12-channel counter via an RS-232 interface, and a multichannel analyzer (MCA). The software is fully developed on the National Instruments LabVIEW 8.6 platform. A Ludlum Model 4612 counter is used to count the signals from the scintillation detectors, while a host computer is used to send control parameters, acquire and display data, and compute results. Each detector channel has independent high-voltage control, threshold (sensitivity) value, and window settings. The counter is configured with a host board and twelve slave boards. The host board collects the counts from each slave board and communicates with the computer via the RS-232 data interface.
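
    As a hedged sketch of the kind of RS-232 exchange described (using the pyserial package; the port name and command/response strings below are hypothetical placeholders, not the actual Ludlum Model 4612 protocol):

      # Hedged RS-232 data-acquisition sketch with pyserial. The port name and
      # command strings are hypothetical, not the real counter protocol.
      import serial

      with serial.Serial("/dev/ttyS0", baudrate=9600, timeout=2) as port:
          # Send hypothetical control parameters (threshold, window) to the counter.
          port.write(b"SET THRESHOLD 100\r\n")
          port.write(b"SET WINDOW 20\r\n")

          # Request one acquisition cycle and read back twelve channel counts.
          port.write(b"READ COUNTS\r\n")
          reply = port.readline().decode("ascii").strip()
          counts = [int(field) for field in reply.split(",")]
          print("channel counts:", counts)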

  16. Earth Observing System/Advanced Microwave Sounding Unit-A (EOS/AMSU-A) software management plan

    NASA Technical Reports Server (NTRS)

    Schwantje, Robert

    1994-01-01

    This document defines the responsibilities for the management of the life-cycle development of the flight software installed in the AMSU-A instruments, and the ground support software used in the test and integration of the AMSU-A instruments.

  17. Developing automated analytical methods for scientific environments using LabVIEW.

    PubMed

    Wagner, Christoph; Armenta, Sergio; Lendl, Bernhard

    2010-01-15

    The development of new analytical techniques often requires the building of specially designed devices, each requiring its own dedicated control software. Especially in the research and development phase, LabVIEW has proven to be a highly useful tool for developing this software. Yet it is still common practice to develop individual solutions for different instruments. In contrast to this, we present here a single LabVIEW-based program that can be directly applied to various analytical tasks without having to change the program code. Driven by a set of simple script commands, it can control a whole range of instruments, from valves and pumps to full-scale spectrometers. Fluid sample (pre-)treatment and separation procedures can thus be flexibly coupled to a wide range of analytical detection methods. Here, the capabilities of the program have been demonstrated by using it for the control of both a sequential injection analysis-capillary electrophoresis (SIA-CE) system with UV detection, and an analytical setup for studying the inhibition of enzymatic reactions using a SIA system with FTIR detection.

  18. WFF TOPEX Software Documentation Altimeter Instrument File (AIF) Processing, October 1998. Volume 3

    NASA Technical Reports Server (NTRS)

    Lee, Jeffrey; Lockwood, Dennis

    2003-01-01

    This document is a compendium of the WFF TOPEX Software Development Team's knowledge regarding Sensor Data Record (SDR) Processing. It includes many elements of a requirements document, a software specification document, a software design document, and a user's manual. In the more technical sections, this document assumes the reader is familiar with TOPEX and instrument files.

  19. Design of virtual three-dimensional instruments for sound control

    NASA Astrophysics Data System (ADS)

    Mulder, Axel Gezienus Elith

    An environment for designing virtual instruments with 3D geometry has been prototyped and applied to real-time sound control and design. It enables a sound artist, musical performer or composer to design an instrument according to preferred or required gestural and musical constraints instead of constraints based only on physical laws as they apply to an instrument with a particular geometry. Sounds can be created, edited or performed in real-time by changing parameters like position, orientation and shape of a virtual 3D input device. The virtual instrument can only be perceived through a visualization and acoustic representation, or sonification, of the control surface. No haptic representation is available. This environment was implemented using CyberGloves, Polhemus sensors, an SGI Onyx and by extending a real- time, visual programming language called Max/FTS, which was originally designed for sound synthesis. The extension involves software objects that interface the sensors and software objects that compute human movement and virtual object features. Two pilot studies have been performed, involving virtual input devices with the behaviours of a rubber balloon and a rubber sheet for the control of sound spatialization and timbre parameters. Both manipulation and sonification methods affect the naturalness of the interaction. Informal evaluation showed that a sonification inspired by the physical world appears natural and effective. More research is required for a natural sonification of virtual input device features such as shape, taking into account possible co- articulation of these features. While both hands can be used for manipulation, left-hand-only interaction with a virtual instrument may be a useful replacement for and extension of the standard keyboard modulation wheel. More research is needed to identify and apply manipulation pragmatics and movement features, and to investigate how they are co-articulated, in the mapping of virtual object parameters. While the virtual instruments can be adapted to exploit many manipulation gestures, further work is required to reduce the need for technical expertise to realize adaptations. Better virtual object simulation techniques and faster sensor data acquisition will improve the performance of virtual instruments. The design environment which has been developed should prove useful as a (musical) instrument prototyping tool and as a tool for researching the optimal adaptation of machines to humans.

  20. Instrumentation & Data Acquisition System (DAS) Engineer

    NASA Technical Reports Server (NTRS)

    Jackson, Markus Deon

    2015-01-01

    The primary job of an Instrumentation and Data Acquisition System (DAS) Engineer is to properly measure physical phenomena of hardware using appropriate instrumentation and DAS equipment designed to record data during a specified test of the hardware. A DAS system includes a CPU or processor, a data storage device such as a hard drive, a data communication bus such as Universal Serial Bus, and software to control the DAS system processes like calibrations, recording of data, and processing of data. It also includes signal conditioning amplifiers and certain sensors for specified measurements. My internship responsibilities have included testing and adjusting Pacific Instruments Model 9355 signal conditioning amplifiers, writing and performing checkout procedures, and writing and performing calibration procedures while learning the basics of instrumentation.

  1. Canal wall planing by engine-driven nickel-titanium instruments, compared with stainless-steel hand instrumentation.

    PubMed

    Tucker, D M; Wenckus, C S; Bentkover, S K

    1997-03-01

    Twenty-two mesial roots of extracted human mandibular molars were divided into two groups based on root curvature and length. The mesiolingual canals were instrumented either with Flexofiles in a step-back anticurvature filing method or with engine-driven 0.02 taper nickel-titanium files. Ground sections were prepared at the 1-, 2.5-, and 5-mm levels from the working length. The mesiobuccal canal was used as an uninstrumented control for predentin character. Digitizing software was used to calculate the instrumented portion as a percentage of the total canal perimeter. The results indicated no significant difference in overall canal wall planing between the two groups and no significant difference at each of the three levels.

  2. Confirmatory Factor Analysis Alternative: Free, Accessible CBID Software.

    PubMed

    Bott, Marjorie; Karanevich, Alex G; Garrard, Lili; Price, Larry R; Mudaranthakam, Dinesh Pal; Gajewski, Byron

    2018-02-01

    New software that performs Classical and Bayesian Instrument Development (CBID) is reported that seamlessly integrates expert (content validity) and participant data (construct validity) to produce entire reliability estimates with smaller sample requirements. The free CBID software can be accessed through a website and used by clinical investigators in new instrument development. Demonstrations are presented of the three approaches using the CBID software: (a) traditional confirmatory factor analysis (CFA), (b) Bayesian CFA using flat uninformative prior, and (c) Bayesian CFA using content expert data (informative prior). Outcomes of usability testing demonstrate the need to make the user-friendly, free CBID software available to interdisciplinary researchers. CBID has the potential to be a new and expeditious method for instrument development, adding to our current measurement toolbox. This allows for the development of new instruments for measuring determinants of health in smaller diverse populations or populations of rare diseases.

  3. Instrumentation Automation for Concrete Structures: Report 2, Automation Hardware and Retrofitting Techniques, and Report 3, Available Data Collection and Reduction Software

    DTIC Science & Technology

    1987-06-01

    This report provides a description of commercially available sensors, instruments, and ADP equipment that may be selected to fully automate the instrumentation of concrete structures, including a typical installation at a plumbline location where an automated monitoring system has been installed. The automated plumbline monitoring system includes up to twelve sensors, repeaters, a system controller, and a printer.

  4. A simulation of the instrument pointing system for the Astro-1 mission

    NASA Technical Reports Server (NTRS)

    Whorton, M.; West, M.; Rakoczy, J.

    1991-01-01

    NASA has recently completed a shuttle-borne stellar ultraviolet astronomy mission known as Astro-1. A three axis instrument pointing system (IPS) was employed to accurately point the science instruments. In order to analyze the pointing control system and verify pointing performance, a simulation of the IPS was developed using the multibody dynamics software TREETOPS. The TREETOPS IPS simulation is capable of accurately modeling the multibody IPS system undergoing large angle, nonlinear motion. The simulation is documented and example cases are presented demonstrating disturbance rejection, fine pointing operations, and multiple target pointing and slewing of the IPS.

  5. A New Polarimeter at the Universite de Montreal

    NASA Astrophysics Data System (ADS)

    Manset, Nadine; Bastien, Pierre

    1995-05-01

    We present Beauty and The Beast, a new polarimeter of the Universite de Montreal, formerly built for the Canada-France-Hawaii telescope (CFHT) but never commissioned there. This computer-controlled Pockels cell polarimeter has been restored to working order and offers a wide range of possibilities: almost all functions are under remote control, linear or circular polarization observations are both possible, a filter slide provides easy access to up to six different bandpasses, and the Pockels cell and Fabry lenses are kept at a constant temperature. In addition to controlling the instrument, the software allows the use of pre-defined sequences of observation, and does data acquisition and reduction. (SECTION: Astronomical Instrumentation)

  6. Development status of the life marker chip instrument for ExoMars

    NASA Astrophysics Data System (ADS)

    Sims, Mark R.; Cullen, David C.; Rix, Catherine S.; Buckley, Alan; Derveni, Mariliza; Evans, Daniel; Miguel García-Con, Luis; Rhodes, Andrew; Rato, Carla C.; Stefinovic, Marijan; Sephton, Mark A.; Court, Richard W.; Bulloch, Christopher; Kitchingman, Ian; Ali, Zeshan; Pullan, Derek; Holt, John; Blake, Oliver; Sykes, Jonathan; Samara-Ratna, Piyal; Canali, Massimiliano; Borst, Guus; Leeuwis, Henk; Prak, Albert; Norfini, Aleandro; Geraci, Ennio; Tavanti, Marco; Brucato, John; Holm, Nils

    2012-11-01

    The Life Marker Chip (LMC) is one of the instruments being developed for possible flight on the 2018 ExoMars mission. The instrument uses solvents to extract organic compounds from samples of martian regolith and to transfer the extracts to dedicated detectors based around the use of antibodies. The scientific aims of the instrument are to detect organics in the form of biomarkers that might be associated with extinct life, extant life or abiotic sources of organics. The instrument relies on a novel surfactant-based solvent system and bespoke, commercial and research-developed antibodies against a number of distinct biomarkers or molecular types. The LMC comprises a number of subsystems designed to accept up to four discrete samples of martian regolith or crushed rock, implement the solvent extraction, perform microfluidic-based multiplexed antibody-assays for biomarkers and other targets, optically detect the fluorescent output of the assays, control the internal instrument pressure and temperature, in addition to the associated instrument control electronics and software. The principle of operation, the design and the instrument development status as of December 2011 are reported here. The instrument principle can be extended to other configurations and missions as needed.

  7. Neutron imaging data processing using the Mantid framework

    NASA Astrophysics Data System (ADS)

    Pouzols, Federico M.; Draper, Nicholas; Nagella, Sri; Yang, Erica; Sajid, Ahmed; Ross, Derek; Ritchie, Brian; Hill, John; Burca, Genoveva; Minniti, Triestino; Moreton-Smith, Christopher; Kockelmann, Winfried

    2016-09-01

    Several imaging instruments are currently being constructed at neutron sources around the world. The Mantid software project provides an extensible framework that supports high-performance computing for data manipulation, analysis and visualisation of scientific data. At ISIS, IMAT (Imaging and Materials Science & Engineering) will offer unique time-of-flight neutron imaging techniques which impose several software requirements to control the data reduction and analysis. Here we outline the extensions currently being added to Mantid to provide specific support for neutron imaging requirements.

  8. Improving Control of Two Motor Controllers

    NASA Technical Reports Server (NTRS)

    Toland, Ronald W.

    2004-01-01

    A computer program controls motors that drive translation stages in a metrology system that consists of a pair of two-axis cathetometers. This program is specific to Compumotor Gemini (or equivalent) motors and the Compumotor 6K-series (or equivalent) motor controller. Relative to the software supplied with the controller, this program affords more capabilities and is easier to use. Written as a Virtual Instrument in the LabVIEW software system, the program presents an imitation control panel that the user can manipulate by use of a keyboard and mouse. There are three modes of operation: command, movement, and joystick. In command mode, single commands are sent to the controller for troubleshooting. In movement mode, distance, speed, and/or acceleration commands are sent to the controller. Position readouts from the motors and from position encoders on the translation stages are displayed in marked fields. At any time, the position readouts can be recorded in a file named by the user. In joystick mode, the program yields control of the motors to a joystick. The program sends commands to, and receives data from, the controller via a serial cable connection, using the serial-communication portion of the software supplied with the controller.

  9. MathWorks Simulink and C++ integration with the new VLT PLC-based standard development platform for instrument control systems

    NASA Astrophysics Data System (ADS)

    Kiekebusch, Mario J.; Di Lieto, Nicola; Sandrock, Stefan; Popovic, Dan; Chiozzi, Gianluca

    2014-07-01

    ESO is in the process of implementing a new development platform, based on PLCs, for upcoming VLT control systems (new instruments and refurbishing of existing systems to manage obsolescence issues). In this context, we have evaluated the integration and reuse of existing C++ libraries and Simulink models into the real-time environment of BECKHOFF Embedded PCs using the capabilities of the latest version of TwinCAT software and MathWorks Embedded Coder. While doing so the aim was to minimize the impact of the new platform by adopting fully tested solutions implemented in C++. This allows us to reuse the in house expertise, as well as extending the normal capabilities of the traditional PLC programming environments. We present the progress of this work and its application in two concrete cases: 1) field rotation compensation for instrument tracking devices like derotators, 2) the ESO standard axis controller (ESTAC), a generic model-based controller implemented in Simulink and used for the control of telescope main axes.
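
    As a hedged sketch of the kind of computation a field-rotation compensator must track (the standard parallactic-angle formula with illustrative site and target values, not ESO's actual tracking code):

      # Standard parallactic-angle formula, shown only to illustrate what a
      # derotator on an alt-azimuth telescope has to follow; the latitude, hour
      # angle and declination below are illustrative values.
      import math

      def parallactic_angle(hour_angle, declination, latitude):
          # All angles in radians; returns the angle between celestial north
          # and the local vertical at the target.
          return math.atan2(
              math.sin(hour_angle),
              math.tan(latitude) * math.cos(declination)
              - math.sin(declination) * math.cos(hour_angle),
          )

      lat = math.radians(-24.6)   # roughly the VLT site latitude, for illustration
      dec = math.radians(-29.0)
      for ha_hours in (-1.0, 0.0, 1.0):
          ha = math.radians(ha_hours * 15.0)
          q = math.degrees(parallactic_angle(ha, dec, lat))
          print(f"HA = {ha_hours:+.1f} h -> derotator angle {q:+7.2f} deg")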

  10. Digitally controlled sonars

    NASA Technical Reports Server (NTRS)

    Hansen, G. R.

    1983-01-01

    Sonars are usually designed and constructed as stand-alone instruments: all elements or subsystems of the sonar are provided, including power conditioning, displays, intercommunications, control, receiver, transmitter, and transducer. The sonars which are a part of the Advanced Ocean Test Development Platform (AOTDP) represent a departure from this manner of implementation and are configured more like an instrumentation system. Only the transducer, transmitter, and receiver, which are unique to a particular sonar function (up, down, or side-scan), exist as separable subsystems. The remaining functions are reserved to the AOTDP and serve all sonars and other instrumentation in a shared manner. The organization and functions of the common AOTDP elements are described and then the interface with the sonars is discussed. The techniques for software control of the sonar parameters are explained, followed by the details of the realization of the sonar functions and some discussion of the performance of the side-scan sonars.

  11. Portable gas chromatograph-mass spectrometer

    DOEpatents

    Andresen, B.D.; Eckels, J.D.; Kimmons, J.F.; Myers, D.W.

    1996-06-11

    A gas chromatograph-mass spectrometer (GC-MS) is described for use as a field portable organic chemical analysis instrument. The GC-MS is designed to be contained in a standard size suitcase, weighs less than 70 pounds, and requires less than 600 watts of electrical power at peak power (all systems on). The GC-MS includes: a conduction heated, forced air cooled small bore capillary gas chromatograph, a small injector assembly, a self-contained ion/sorption pump vacuum system, a hydrogen supply, a dual computer system used to control the hardware and acquire spectrum data, and operational software used to control the pumping system and the gas chromatograph. This instrument incorporates a modified commercial quadrupole mass spectrometer to achieve the instrument sensitivity and mass resolution characteristic of laboratory bench top units. 4 figs.

  12. Surface and borehole neutron probes for the Construction and Resource Utilization eXplorer (CRUX)

    NASA Technical Reports Server (NTRS)

    Elphic, Richard C.; Hahn, Sangkoo; Lawrence, David J.; Feldman, William C.; Johnson, Jerome B.; Haldemann, Albert F. C.

    2006-01-01

    The Construction and Resource Utilization eXplorer (CRUX) project aims to develop an integrated, flexible suite of instruments with data fusion software and an executive controller for the purpose of in situ resource assessment and characterization for future space exploration.

  13. The Computer Bulletin Board.

    ERIC Educational Resources Information Center

    Batt, Russell H., Ed.

    1990-01-01

    Four applications of microcomputers in the chemical laboratory are presented. Included are "Mass Spectrometer Interface with an Apple II Computer,""Interfacing the Spectronic 20 to a Computer,""A pH-Monitoring and Control System for Teaching Laboratories," and "A Computer-Aided Optical Melting Point Device." Software, instrumentation, and uses are…

  14. Wireless Acoustic Measurement System

    NASA Technical Reports Server (NTRS)

    Anderson, Paul D.; Dorland, Wade D.; Jolly, Ronald L.

    2007-01-01

    A prototype wireless acoustic measurement system (WAMS) is one of two main subsystems of the Acoustic Prediction/Measurement Tool, which comprises software, acoustic instrumentation, and electronic hardware combined to afford integrated capabilities for predicting and measuring noise emitted by rocket and jet engines. The other main subsystem is described in the article on page 8. The WAMS includes analog acoustic measurement instrumentation and analog and digital electronic circuitry combined with computer wireless local-area networking to enable (1) measurement of sound-pressure levels at multiple locations in the sound field of an engine under test and (2) recording and processing of the measurement data. At each field location, the measurements are taken by a portable unit, denoted a field station. There are ten field stations, each of which can take two channels of measurements. Each field station is equipped with two instrumentation microphones, a micro-ATX computer, a wireless network adapter, an environmental enclosure, a directional radio antenna, and a battery power supply. The environmental enclosure shields the computer from weather and from extreme acoustically induced vibrations. The power supply is based on a marine-service lead-acid storage battery that has enough capacity to support operation for as long as 10 hours. A desktop computer serves as a control server for the WAMS. The server is connected to a wireless router for communication with the field stations via a wireless local-area network that complies with wireless-network standard 802.11b of the Institute of Electrical and Electronics Engineers. The router and the wireless network adapters are controlled by use of Linux-compatible driver software. The server runs custom Linux software for synchronizing the recording of measurement data in the field stations. The software includes a module that provides an intuitive graphical user interface through which an operator at the control server can control the operations of the field stations for calibration and for recording of measurement data. A test engineer positions and activates the WAMS. The WAMS automatically establishes the wireless network. Next, the engineer performs pretest calibrations. Then the engineer executes the test and measurement procedures. After the test, the raw measurement files are copied and transferred, through the wireless network, to a hard disk in the control server. Subsequently, the data are processed into 1/3-octave spectrograms.

  15. Wireless Acoustic Measurement System

    NASA Technical Reports Server (NTRS)

    Anderson, Paul D.; Dorland, Wade D.

    2005-01-01

    A prototype wireless acoustic measurement system (WAMS) is one of two main subsystems of the Acoustic Prediction/Measurement Tool, which comprises software, acoustic instrumentation, and electronic hardware combined to afford integrated capabilities for predicting and measuring noise emitted by rocket and jet engines. The other main subsystem is described in "Predicting Rocket or Jet Noise in Real Time" (SSC-00215-1), which appears elsewhere in this issue of NASA Tech Briefs. The WAMS includes analog acoustic measurement instrumentation and analog and digital electronic circuitry combined with computer wireless local-area networking to enable (1) measurement of sound-pressure levels at multiple locations in the sound field of an engine under test and (2) recording and processing of the measurement data. At each field location, the measurements are taken by a portable unit, denoted a field station. There are ten field stations, each of which can take two channels of measurements. Each field station is equipped with two instrumentation microphones, a micro-ATX computer, a wireless network adapter, an environmental enclosure, a directional radio antenna, and a battery power supply. The environmental enclosure shields the computer from weather and from extreme acoustically induced vibrations. The power supply is based on a marine-service lead-acid storage battery that has enough capacity to support operation for as long as 10 hours. A desktop computer serves as a control server for the WAMS. The server is connected to a wireless router for communication with the field stations via a wireless local-area network that complies with wireless-network standard 802.11b of the Institute of Electrical and Electronics Engineers. The router and the wireless network adapters are controlled by use of Linux-compatible driver software. The server runs custom Linux software for synchronizing the recording of measurement data in the field stations. The software includes a module that provides an intuitive graphical user interface through which an operator at the control server can control the operations of the field stations for calibration and for recording of measurement data. A test engineer positions and activates the WAMS. The WAMS automatically establishes the wireless network. Next, the engineer performs pretest calibrations. Then the engineer executes the test and measurement procedures. After the test, the raw measurement files are copied and transferred, through the wireless network, to a hard disk in the control server. Subsequently, the data are processed into 1/3-octave spectrograms.
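
    As a hedged illustration of the final processing step mentioned in both records above (the usual base-2 band convention only; the actual WAMS averaging and smoothing are not shown), nominal 1/3-octave band frequencies can be generated as follows:

      # Nominal 1/3-octave band centre and edge frequencies (base-2 convention,
      # centred on 1 kHz). Illustrates the band structure behind the
      # "1/3-octave spectrograms"; this is not the WAMS processing code.
      def third_octave_band(n):
          # Return (lower, centre, upper) frequencies in Hz for band index n.
          centre = 1000.0 * 2.0 ** (n / 3.0)
          factor = 2.0 ** (1.0 / 6.0)
          return centre / factor, centre, centre * factor

      for n in range(-3, 4):
          lo, fc, hi = third_octave_band(n)
          print(f"band {n:+d}: {lo:8.1f} - {hi:8.1f} Hz (centre {fc:7.1f} Hz)")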

  16. March of the Starbugs: Configuring Fiber-bearing Robots on the UK-Schmidt Optical Plane

    NASA Astrophysics Data System (ADS)

    Lorente, N. P. F.; Vuong, M.; Satorre, C.; Hong, S. E.; Shortridge, K.; Goodwin, M.; Kuehn, K.

    2015-09-01

    The TAIPAN instrument, currently being developed for the Australian Astronomical Observatory's UK Schmidt telescope at Siding Spring Observatory, makes use of the AAO's Starbug technology to deploy 150 science fibers to target positions on the optical plane. This paper describes the software system for controlling and deploying the fiber-bearing Starbug robots. The TAIPAN software is responsible for allocating each Starbug to its next target position based on its current position and the distribution of targets, finding a collision-free path for each Starbug, and then simultaneously controlling the Starbug hardware in a closed loop, with a metrology camera used to determine the position of each Starbug in the field during reconfiguration. The software is written in C++ and Java and employs a DRAMA middleware layer (Farrell et al. 1995).

  17. Highly Sophisticated Virtual Laboratory Instruments in Education

    NASA Astrophysics Data System (ADS)

    Gaskins, T.

    2006-12-01

    Many areas of science have advanced or stalled according to the ability to see what cannot normally be seen. Visual understanding has been key to many of the world's greatest breakthroughs, such as the discovery of DNA's double helix. Scientists use sophisticated instruments to see what the human eye cannot. Light microscopes, scanning electron microscopes (SEM), spectrometers and atomic force microscopes are employed to examine and learn the details of the extremely minute. It is rare that students prior to university have access to such instruments, or are granted full ability to probe and magnify as desired. Virtual Lab, by providing highly authentic software instruments and comprehensive imagery of real specimens, provides them this opportunity. Virtual Lab's instruments let explorers operate virtual devices on a personal computer to examine real specimens. Exhaustive sets of images systematically and robotically photographed at thousands of positions and multiple magnifications and focal points allow students to zoom in and focus on the most minute detail of each specimen. Controls on each Virtual Lab device interactively and smoothly move the viewer through these images to display the specimen as the instrument saw it. Users control position, magnification, focal length, filters and other parameters. Energy dispersion spectrometry is combined with SEM imagery to enable exploration of chemical composition at minute scale and arbitrary location. Annotation capabilities allow scientists, teachers and students to indicate important features or areas. Virtual Lab is a joint project of NASA and the Beckman Institute at the University of Illinois at Urbana-Champaign. Four instruments currently compose the Virtual Lab suite: a scanning electron microscope and companion energy dispersion spectrometer, a high-power light microscope, and a scanning probe microscope that captures surface properties to the level of atoms. Descriptions of instrument operating principles and uses are also part of Virtual Lab. The Virtual Lab software and its increasingly rich collection of specimens are free to anyone. This presentation describes Virtual Lab and its uses in formal and informal education.

  18. Telescience capability for the Sondre Stromfjord, Greenland, incoherent-scatter radar facility

    NASA Astrophysics Data System (ADS)

    Zambre, Yadunath B.

    1993-01-01

    SRI International (SRI) operates an upper-atmospheric research facility in Sondre Stromfjord (Sondrestrom), Greenland. In the past, the facility's remote location and limited logistical support imposed constraints on the research that could be carried out at the site. Campaigns involving multiple instruments were often constrained due to limited space, and experiments requiring coordination with other geographically separated facilities, though possible, were difficult. To provide greater access to the facility, an electronic connection between Sondrestrom and the mainland U.S.A. was established, providing access to the National Science Internet. SRI developed telescience software that sends data from the incoherent scatter radar at the Sondrestrom facility to SRI's offices in Menlo Park, California. This software uses the transmission control protocol (TCP/IP) to transmit the data in near real time between the two locations and the X window system to generate displays of the data in Menlo Park. This is in contrast to using the X window system to display data remotely across a wide-area network. Using TCP to transport data over the long-distance network has resulted in significantly improved network throughput and latency. While currently used to transport radar data, the telescience software is designed and intended for simultaneous use with other instruments at Sondrestrom and other facilities. Work incorporating additional instruments is currently in progress.

  19. Studies and research concerning BNFP: computerized nuclear materials control and accounting system development evaluation report, FY 1978

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crawford, J M; Ehinger, M H; Joseph, C

    1978-10-01

    Development work on a computerized system for nuclear materials control and accounting in a nuclear fuel reprocessing plant is described and evaluated. Hardware and software were installed and tested to demonstrate key measurement, measurement control, and accounting requirements at accountability input/output points using natural uranium. The demonstration included a remote data acquisition system which interfaces process and special instrumentation to a central processing unit.

  20. Instrumental Development of 50 Meters Free Style Swimming Speed Measurement Based on Microcontroller Arduino Uno

    NASA Astrophysics Data System (ADS)

    Badruzaman; Rusdiana, A.; Gilang, M. R.; Martini, T.

    2017-03-01

    This study aimed to develop the software and hardware of an instrument for measuring 50 m freestyle swimming speed, based on an Arduino Uno microcontroller. Six advanced sport-education college students (2015 intake) participated. The hardware consists of an Arduino Uno microcontroller board, laser emitters aimed at light-dependent resistors that serve as laser receivers and detect when the beam is cut, and cables connecting the components and transferring the data. The device comprises four sensors that can be installed at 10 m intervals, with the resulting swimming speeds shown on a monitor by an application written in Visual Basic 6.0. The instrument starts automatically when the buzzer is triggered, which also starts the timer in the application. For the test procedure, each participant swam 50 m freestyle; as the swimmer passed each sensor, the interrupted laser beam signalled the application to stop the running timer. The output of the instrument is the swimmer's maximum speed, together with the times and distances over which acceleration and deceleration occur. The validity of the instrument was 0.605 (high), while its reliability was 0.833 (very high).

  1. Analysis of key technologies for virtual instruments metrology

    NASA Astrophysics Data System (ADS)

    Liu, Guixiong; Xu, Qingui; Gao, Furong; Guan, Qiuju; Fang, Qiang

    2008-12-01

    Virtual instruments (VIs) require metrological verification when applied as measuring instruments. Owing to the software-centered architecture, metrological evaluation of VIs includes two aspects: measurement functions and software characteristics. The complexity of the software imposes difficulties on metrological testing of VIs. Key approaches and technologies for metrology evaluation of virtual instruments are investigated and analyzed in this paper. The principal issue is evaluation of measurement uncertainty. The nature and regularity of measurement uncertainty caused by software and algorithms can be evaluated by modeling, simulation, analysis, testing and statistics with the support of the powerful computing capability of the PC. Another concern is evaluation of software features such as the correctness, reliability, stability, security and real-time behavior of VIs. Technologies from the software engineering, software testing and computer security domains can be used for these purposes. For example, a variety of black-box testing, white-box testing and modeling approaches can be used to evaluate the reliability of modules, components, applications and the whole VI software. The security of a VI can be assessed by methods like vulnerability scanning and penetration analysis. In order to enable metrology institutions to perform metrological verification of VIs efficiently, an automatic metrological tool for the above validation is essential. Based on technologies of numerical simulation, software testing and system benchmarking, a framework for the automatic tool is proposed in this paper. Investigation of the implementation of existing automatic tools that perform calculation of measurement uncertainty, software testing and security assessment demonstrates the feasibility of the proposed automatic framework.
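
    As a concrete illustration of the simulation-and-statistics approach outlined above, the following Python sketch estimates the bias and standard uncertainty contributed by a VI's processing algorithm; the algorithm shown (a simple RMS computation), the signal model, the noise level and the ADC resolution are all assumptions made only for this example.

      import numpy as np

      def vi_algorithm(samples):
          # Stand-in for the VI measurement algorithm under test: an RMS estimate.
          return np.sqrt(np.mean(samples ** 2))

      rng = np.random.default_rng(seed=1)
      true_rms = 1.0 / np.sqrt(2.0)                        # exact RMS of a unit sine wave
      t = np.linspace(0.0, 1.0, 1000, endpoint=False)

      results = []
      for _ in range(10000):                               # repeated simulated measurements
          signal = np.sin(2.0 * np.pi * 50.0 * t)          # ideal input signal
          signal = signal + rng.normal(0.0, 0.01, t.size)  # additive noise model
          signal = np.round(signal * 2**12) / 2**12        # 12-bit quantization model
          results.append(vi_algorithm(signal))

      results = np.asarray(results)
      print("bias                : %+.2e" % (results.mean() - true_rms))
      print("standard uncertainty: %.2e" % results.std(ddof=1))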

  2. ICESat (GLAS) Science Processing Software Document Series. Volume 2; Science Data Management Plan; 4.0

    NASA Technical Reports Server (NTRS)

    Jester, Peggy L.; Hancock, David W., III

    1999-01-01

    This document provides the Data Management Plan for the GLAS Standard Data Software (SDS) supporting the GLAS instrument of the EOS ICESat Spacecraft. The SDS encompasses the ICESat Science Investigator-led Processing System (I-SIPS) Software and the Instrument Support Facility (ISF) Software. This Plan addresses the identification, authority, and description of the interface nodes associated with the GLAS Standard Data Products and the GLAS Ancillary Data.

  3. imzML: Imaging Mass Spectrometry Markup Language: A common data format for mass spectrometry imaging.

    PubMed

    Römpp, Andreas; Schramm, Thorsten; Hester, Alfons; Klinkert, Ivo; Both, Jean-Pierre; Heeren, Ron M A; Stöckli, Markus; Spengler, Bernhard

    2011-01-01

    Imaging mass spectrometry is the method of scanning a sample of interest and generating an "image" of the intensity distribution of a specific analyte. The data sets consist of a large number of mass spectra which are usually acquired with identical settings. Existing data formats are not sufficient to describe an MS imaging experiment completely. The data format imzML was developed to allow the flexible and efficient exchange of MS imaging data between different instruments and data analysis software. For this purpose, the MS imaging data is divided into two separate files. The mass spectral data is stored in a binary file to ensure efficient storage. All metadata (e.g., instrumental parameters, sample details) are stored in an XML file which is based on the standard data format mzML developed by HUPO-PSI. The original mzML controlled vocabulary was extended to include specific parameters of imaging mass spectrometry (such as x/y position and spatial resolution). The two files (XML and binary) are connected by offset values in the XML file and are unambiguously linked by a universally unique identifier. The resulting datasets are comparable in size to the raw data and the separate metadata file allows flexible handling of large datasets. Several imaging MS software tools already support imzML. This allows choosing from a (growing) number of processing tools. One is no longer limited to proprietary software, but is able to use the processing software which is best suited for a specific question or application. On the other hand, measurements from different instruments can be compared within one software application using identical settings for data processing. All necessary information for evaluating and implementing imzML can be found at http://www.imzML.org.
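
    The offset-linking scheme described above can be illustrated with a short Python sketch: XML metadata records where each spectrum lives in a separate binary file, and the reader seeks to that offset. The element and attribute names used here ("spectrum", "offset", "length") are placeholders rather than the actual mzML/imzML controlled vocabulary; real imzML files should be read with a dedicated parser such as the pyimzML package.

      import numpy as np
      import xml.etree.ElementTree as ET

      def read_spectra(xml_path, binary_path):
          """Read spectra whose byte offsets are listed in an XML metadata file."""
          tree = ET.parse(xml_path)
          spectra = []
          with open(binary_path, "rb") as binary_file:
              for spectrum in tree.iter("spectrum"):       # placeholder element name
                  offset = int(spectrum.get("offset"))     # byte offset into the binary file
                  length = int(spectrum.get("length"))     # number of float64 values stored
                  binary_file.seek(offset)
                  values = np.frombuffer(binary_file.read(length * 8), dtype="<f8")
                  spectra.append(values)
          return spectra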

  4. The CARIBU EBIS control and synchronization system

    NASA Astrophysics Data System (ADS)

    Dickerson, Clayton; Peters, Christopher

    2015-01-01

    The Californium Rare Isotope Breeder Upgrade (CARIBU) Electron Beam Ion Source (EBIS) charge breeder has been built and tested. The bases of the CARIBU EBIS electrical system are four voltage platforms on which both DC and pulsed high voltage outputs are controlled. The high voltage output pulses are created with either a combination of a function generator and a high voltage amplifier, or two high voltage DC power supplies and a high voltage solid state switch. Proper synchronization of the pulsed voltages, fundamental to optimizing the charge breeding performance, is achieved with triggering from a digital delay pulse generator. The control system is based on National Instruments realtime controllers and LabVIEW software implementing Functional Global Variables (FGV) to store and access instrument parameters. Fiber optic converters enable network communication and triggering across the platforms.
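
    The Functional Global Variable pattern mentioned above keeps all instrument parameters in a single shared store that is only touched through explicit "get" and "set" actions. A rough Python analogue is sketched below; the parameter name is invented, and a lock stands in for the non-reentrant execution that LabVIEW FGVs provide by construction.

      import threading

      _parameters = {}
      _lock = threading.Lock()

      def fgv(action, name=None, value=None):
          """Single access point for instrument parameters (get/set actions only)."""
          with _lock:
              if action == "set":
                  _parameters[name] = value
              elif action == "get":
                  return _parameters.get(name)

      fgv("set", "platform2_pulse_amplitude_V", 2500.0)    # store a setting in one place
      print(fgv("get", "platform2_pulse_amplitude_V"))     # read it back elsewhere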

  5. Building an open-source robotic stereotaxic instrument.

    PubMed

    Coffey, Kevin R; Barker, David J; Ma, Sisi; West, Mark O

    2013-10-29

    This protocol includes the designs and software necessary to upgrade an existing stereotaxic instrument to a robotic (CNC) stereotaxic instrument for around $1,000 (excluding a drill), using industry standard stepper motors and CNC controlling software. Each axis has variable speed control and may be operated simultaneously or independently. The robot's flexibility and open coding system (g-code) make it capable of performing custom tasks that are not supported by commercial systems. Its applications include, but are not limited to, drilling holes, sharp edge craniotomies, skull thinning, and lowering electrodes or cannula. In order to expedite the writing of g-coding for simple surgeries, we have developed custom scripts that allow individuals to design a surgery with no knowledge of programming. However, for users to get the most out of the motorized stereotax, it would be beneficial to be knowledgeable in mathematical programming and G-Coding (simple programming for CNC machining). The recommended drill speed is greater than 40,000 rpm. The stepper motor resolution is 1.8°/Step, geared to 0.346°/Step. A standard stereotax has a resolution of 2.88 μm/step. The maximum recommended cutting speed is 500 μm/sec. The maximum recommended jogging speed is 3,500 μm/sec. The maximum recommended drill bit size is HP 2.
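
    In the same spirit as the authors' surgery scripts, a short helper can translate a stereotaxic target into g-code so that the user never writes g-code by hand. The sketch below is only illustrative: the axis mapping, the millimetre units and the feed rates are assumptions, with the cutting feed kept below the 500 μm/sec limit quoted above.

      def lower_electrode(ap_um, ml_um, dv_um):
          """Return g-code lines that position the tool and lower it slowly to depth."""
          ap, ml, dv = ap_um / 1000.0, ml_um / 1000.0, abs(dv_um) / 1000.0  # convert to mm
          cut_feed_mm_per_min = 0.5 * 60.0          # 500 um/sec cutting limit, in mm/min
          return [
              "G21",                                # millimetre units
              "G90",                                # absolute positioning
              "G0 X%.3f Y%.3f" % (ap, ml),          # rapid move above the target
              "G1 Z%.3f F%.0f" % (-dv, cut_feed_mm_per_min),  # lower slowly to depth
              "M0",                                 # pause and wait for the experimenter
          ]

      print("\n".join(lower_electrode(ap_um=-1800.0, ml_um=2500.0, dv_um=4200.0)))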

  6. Multi-Mission Automated Task Invocation Subsystem

    NASA Technical Reports Server (NTRS)

    Cheng, Cecilia S.; Patel, Rajesh R.; Sayfi, Elias M.; Lee, Hyun H.

    2009-01-01

    Multi-Mission Automated Task Invocation Subsystem (MATIS) is software that establishes a distributed data-processing framework for automated generation of instrument data products from a spacecraft mission. Each mission may set up a set of MATIS servers for processing its data products. MATIS embodies lessons learned in experience with prior instrument- data-product-generation software. MATIS is an event-driven workflow manager that interprets project-specific, user-defined rules for managing processes. It executes programs in response to specific events under specific conditions according to the rules. Because requirements of different missions are too diverse to be satisfied by one program, MATIS accommodates plug-in programs. MATIS is flexible in that users can control such processing parameters as how many pipelines to run and on which computing machines to run them. MATIS has a fail-safe capability. At each step, MATIS captures and retains pertinent information needed to complete the step and start the next step. In the event of a restart, this information is retrieved so that processing can be resumed appropriately. At this writing, it is planned to develop a graphical user interface (GUI) for monitoring and controlling a product generation engine in MATIS. The GUI would enable users to schedule multiple processes and manage the data products produced in the processes. Although MATIS was initially designed for instrument data product generation,
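
    The event/condition/action pattern described above can be sketched in a few lines of Python. The classes and the single rule below are hypothetical and do not reflect the actual MATIS interfaces; they only show how a user-defined rule ties an event, a condition and an external program together (the command is printed rather than executed to keep the example self-contained).

      from dataclasses import dataclass
      from typing import Callable, List

      @dataclass
      class Rule:
          event: str                             # event name the rule listens for
          condition: Callable[[dict], bool]      # predicate evaluated on the event payload
          command: List[str]                     # program to launch when the rule fires

      class WorkflowManager:
          def __init__(self, rules):
              self.rules = rules

          def dispatch(self, event, payload):
              for rule in self.rules:
                  if rule.event == event and rule.condition(payload):
                      print("would run:", " ".join(rule.command + [payload["file"]]))

      rules = [Rule("file_arrived",
                    lambda p: p["file"].endswith(".L0"),   # only raw Level-0 files
                    ["python", "make_level1.py"])]         # hypothetical pipeline step
      WorkflowManager(rules).dispatch("file_arrived", {"file": "orbit_0042.L0"})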

  7. Design of LabVIEW®-based software for the control of sequential injection analysis instrumentation for the determination of morphine

    PubMed Central

    Lenehan, Claire E.; Lewis, Simon W.

    2002-01-01

    LabVIEW®-based software for the automation of a sequential injection analysis instrument for the determination of morphine is presented. Detection was based on its chemiluminescence reaction with acidic potassium permanganate in the presence of sodium polyphosphate. The calibration function approximated linearity (range 5 × 10⁻¹⁰ to 5 × 10⁻⁶ M) with a line of best fit of y = 1.05x + 8.9164 (R² = 0.9959), where y is the log10 signal (mV) and x is the log10 morphine concentration (M). Precision, as measured by relative standard deviation, was 0.7% for five replicate analyses of morphine standard (5 × 10⁻⁸ M). The limit of detection (3σ) was determined as 5 × 10⁻¹¹ M morphine. PMID:18924729
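
    For reference, the reported log-log calibration can be inverted to recover a concentration from a measured signal; the short Python sketch below simply rearranges y = 1.05x + 8.9164 and is meaningful only within the calibrated range of roughly 5 × 10⁻¹⁰ to 5 × 10⁻⁶ M.

      import math

      def morphine_concentration(signal_mV):
          """Invert y = 1.05x + 8.9164, where y = log10 signal (mV), x = log10 conc. (M)."""
          y = math.log10(signal_mV)
          x = (y - 8.9164) / 1.05
          return 10.0 ** x                        # molar concentration

      print("%.1e M" % morphine_concentration(17.8))   # about 5e-8 M for a ~17.8 mV signal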

  8. Design of LabVIEW-based software for the control of sequential injection analysis instrumentation for the determination of morphine.

    PubMed

    Lenehan, Claire E; Barnett, Neil W; Lewis, Simon W

    2002-01-01

    LabVIEW-based software for the automation of a sequential injection analysis instrument for the determination of morphine is presented. Detection was based on its chemiluminescence reaction with acidic potassium permanganate in the presence of sodium polyphosphate. The calibration function approximated linearity (range 5 x 10(-10) to 5 x 10(-6) M) with a line of best fit of y=1.05(x)+8.9164 (R(2) =0.9959), where y is the log10 signal (mV) and x is the log10 morphine concentration (M). Precision, as measured by relative standard deviation, was 0.7% for five replicate analyses of morphine standard (5 x 10(-8) M). The limit of detection (3sigma) was determined as 5 x 10(-11) M morphine.

  9. A subscale facility for liquid rocket propulsion diagnostics at Stennis Space Center

    NASA Technical Reports Server (NTRS)

    Raines, N. G.; Bircher, F. E.; Chenevert, D. J.

    1991-01-01

    The Diagnostics Testbed Facility (DTF) at NASA's John C. Stennis Space Center in Mississippi was designed to provide a testbed for the development of rocket engine exhaust plume diagnostics instrumentation. A 1200-lb thrust liquid oxygen/gaseous hydrogen thruster is used as the plume source for experimentation and instrument development. Theoretical comparative studies have been performed with aerothermodynamic codes to ensure that the DTF thruster (DTFT) has been optimized to produce a plume with pressure and temperature conditions as much like the plume of the Space Shuttle Main Engine as possible. Operation of the DTFT is controlled by an icon-driven software program using a series of soft switches. Data acquisition is performed using the same software program. A number of plume diagnostics experiments have utilized the unique capabilities of the DTF.

  10. Labview Interface Concepts Used in NASA Scientific Investigations and Virtual Instruments

    NASA Technical Reports Server (NTRS)

    Roth, Don J.; Parker, Bradford H.; Rapchun, David A.; Jones, Hollis H.; Cao, Wei

    2001-01-01

    This article provides an overview of several software control applications developed for NASA using LabVIEW. The applications covered here include (1) an Ultrasonic Measurement System for nondestructive evaluation of advanced structural materials, (2) an X-ray Spectral Mapping System for characterizing the quality and uniformity of developing photon detector materials, (3) a Life Testing System for these same materials, and (4) the instrument panel for an aircraft-mounted Cloud Absorption Radiometer that measures the light scattered by clouds in multiple spectral bands. Many of the software interface concepts employed are explained. Panel layout and block diagram (code) strategies for each application are described. In particular, some of the more distinctive features of the applications' interfaces and source code are highlighted. This article assumes that the reader has a beginner-to-intermediate understanding of LabVIEW methods.

  11. ITOS to EDGE "Bridge" Software for Morpheus Lunar/Martian Vehicle

    NASA Technical Reports Server (NTRS)

    Hirsh, Robert; Fuchs, Jordan

    2012-01-01

    My project involved improving upon existing software and writing new software for the Project Morpheus team. Specifically, I created and updated Integrated Test and Operations System (ITOS) user interfaces for on-board interaction with the vehicle during archive playback as well as live streaming data. These interfaces are an integral part of testing and operations for the Morpheus vehicle, providing all information from the vehicle needed to evaluate instruments and to ensure coherence and control of the vehicle during Morpheus missions. I also created a "bridge" program for interfacing "live" telemetry data with the Engineering DOUG Graphics Engine (EDGE) software to give a graphical (standalone or VR dome) view of live Morpheus flights or archive replays, providing a graphical representation of vehicle flight and movement during subsequent tests and in real missions.

  12. Preliminary design of the HARMONI science software

    NASA Astrophysics Data System (ADS)

    Piqueras, Laure; Jarno, Aurelien; Pécontal-Rousset, Arlette; Loupias, Magali; Richard, Johan; Schwartz, Noah; Fusco, Thierry; Sauvage, Jean-François; Neichel, Benoît; Correia, Carlos M.

    2016-08-01

    This paper introduces the science software of HARMONI. The Instrument Numerical Model simulates the instrument from the optical point of view and provides synthetic exposures simulating detector readouts from data-cubes containing astrophysical scenes. The Data Reduction Software converts raw-data frames into a fully calibrated, scientifically usable data cube. We present the functionalities and the preliminary design of this software, describe some of the methods and algorithms used and highlight the challenges that we will have to face.

  13. Scalable Multiprocessor for High-Speed Computing in Space

    NASA Technical Reports Server (NTRS)

    Lux, James; Lang, Minh; Nishimoto, Kouji; Clark, Douglas; Stosic, Dorothy; Bachmann, Alex; Wilkinson, William; Steffke, Richard

    2004-01-01

    A report discusses the continuing development of a scalable multiprocessor computing system for hard real-time applications aboard a spacecraft. "Hard realtime applications" signifies applications, like real-time radar signal processing, in which the data to be processed are generated at "hundreds" of pulses per second, each pulse "requiring" millions of arithmetic operations. In these applications, the digital processors must be tightly integrated with analog instrumentation (e.g., radar equipment), and data input/output must be synchronized with analog instrumentation, controlled to within fractions of a microsecond. The scalable multiprocessor is a cluster of identical commercial-off-the-shelf generic DSP (digital-signal-processing) computers plus generic interface circuits, including analog-to-digital converters, all controlled by software. The processors are computers interconnected by high-speed serial links. Performance can be increased by adding hardware modules and correspondingly modifying the software. Work is distributed among the processors in a parallel or pipeline fashion by means of a flexible master/slave control and timing scheme. Each processor operates under its own local clock; synchronization is achieved by broadcasting master time signals to all the processors, which compute offsets between the master clock and their local clocks.

  14. The Open AUC Project.

    PubMed

    Cölfen, Helmut; Laue, Thomas M; Wohlleben, Wendel; Schilling, Kristian; Karabudak, Engin; Langhorst, Bradley W; Brookes, Emre; Dubbs, Bruce; Zollars, Dan; Rocco, Mattia; Demeler, Borries

    2010-02-01

    Progress in analytical ultracentrifugation (AUC) has been hindered by obstructions to hardware innovation and by software incompatibility. In this paper, we announce and outline the Open AUC Project. The goals of the Open AUC Project are to stimulate AUC innovation by improving instrumentation, detectors, acquisition and analysis software, and collaborative tools. These improvements are needed for the next generation of AUC-based research. The Open AUC Project combines on-going work from several different groups. A new base instrument is described, one that is designed from the ground up to be an analytical ultracentrifuge. This machine offers an open architecture, hardware standards, and application programming interfaces for detector developers. All software will use the GNU Public License to assure that intellectual property is available in open source format. The Open AUC strategy facilitates collaborations, encourages sharing, and eliminates the chronic impediments that have plagued AUC innovation for the last 20 years. This ultracentrifuge will be equipped with multiple and interchangeable optical tracks so that state-of-the-art electronics and improved detectors will be available for a variety of optical systems. The instrument will be complemented by a new rotor, enhanced data acquisition and analysis software, as well as collaboration software. Described here are the instrument, the modular software components, and a standardized database that will encourage and ease integration of data analysis and interpretation software.

  15. Software feedback for monochromator tuning at UNICAT (abstract)

    NASA Astrophysics Data System (ADS)

    Jemian, Pete R.

    2002-03-01

    Automatic tuning of double-crystal monochromators presents an interesting challenge in software. The goal is to either maximize, or hold constant, the throughput of the monochromator. An additional goal of the software feedback is to disable itself when there is no beam and then, at the user's discretion, re-enable itself when the beam returns. These and other routine goals, such as adherence to limits of travel for positioners, are maintained by software controls. Many solutions exist to lock in and maintain a fixed throughput. These include a hardware solution involving a waveform generator and a lock-in amplifier to autocorrelate the movement of a piezoelectric transducer (PZT) providing fine adjustment of the second-crystal Bragg angle. This solution does not work when the positioner is a slow-acting device such as a stepping motor. Proportional-integral-derivative (PID) loops have been used to provide feedback through software, but additional controls must be provided to maximize the monochromator throughput. Presented here is a software variation of the PID loop which meets the above goals. By using two floating point variables as inputs, representing the intensity of x rays measured before and after the monochromator, it attempts to maximize (or hold constant) the ratio of these two inputs by adjusting an output floating point variable. These floating point variables are connected to hardware channels corresponding to detectors and positioners. When the inputs go out of range, the software will stop making adjustments to the control output. Not limited to monochromator feedback, the software could be used, with beam-steering positioners, to maintain a measure of beam position. Advantages of this software feedback include the flexibility of its various components. It has been used with stepping motors and PZTs as positioners. Various devices such as ion chambers, scintillation counters, photodiodes, and photoelectron collectors have been used as detectors. The software provides significant cost savings over hardware feedback methods. Presently implemented in EPICS, the software is sufficiently general to be used with any automated instrument control system.
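
    A minimal Python sketch of the behaviour described above is given below: a software loop adjusts a positioner output so that the after/before intensity ratio holds a setpoint, clamps the output to travel limits, and freezes itself when the upstream intensity indicates that the beam is gone. It illustrates the general idea only and is not the EPICS implementation; all gains and thresholds are invented.

      class RatioFeedback:
          def __init__(self, setpoint, kp=0.1, ki=0.02,
                       beam_min=1e-3, out_lo=-5.0, out_hi=5.0):
              self.setpoint, self.kp, self.ki = setpoint, kp, ki
              self.beam_min = beam_min                    # below this, declare "no beam"
              self.out_lo, self.out_hi = out_lo, out_hi   # positioner travel limits
              self.integral, self.output, self.enabled = 0.0, 0.0, True

          def enable(self):
              self.enabled = True                         # re-enabled at the user's discretion

          def update(self, i_before, i_after):
              if i_before < self.beam_min:                # beam lost: stop adjusting
                  self.enabled = False
              if not self.enabled:
                  return self.output
              error = self.setpoint - i_after / i_before  # hold the ratio at the setpoint
              self.integral += error
              self.output = self.kp * error + self.ki * self.integral
              self.output = min(max(self.output, self.out_lo), self.out_hi)
              return self.output

      feedback = RatioFeedback(setpoint=0.85)
      print(feedback.update(i_before=1.0, i_after=0.80))  # small corrective move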

  16. Computer software configuration description, 241-AY and 241-AZ tank farm MICON automation system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Winkelman, W.D.

    This document describes the configuration process, the choices and conventions used during the configuration activities, and the issues involved in making changes to the configuration. It includes the master listings of the Tag definitions, which should be revised to authorize any changes. Revision 2 incorporates minor changes to ensure that the documented setpoints accurately reflect the limits (including an exhaust stack flow of 800 scfm) established in OSD-T-151-00019. The MICON DCS software controls and monitors the instrumentation and equipment associated with plant systems and processes.

  17. Precision pointing of the international ultraviolet explorer /IUE/ scientific instrument using a gyroscopic and stellar reference

    NASA Technical Reports Server (NTRS)

    Moore, J. V.

    1976-01-01

    The Attitude Control System for the IUE spacecraft is described. The basic mission objectives are stated and a sequential discussion of the mission is presented. Desired accuracy for each mission phase is noted and where applicable the onboard control mechanization is shown. Sensors and actuator systems utilized by the control algorithms are described. Finally, onboard software is discussed to a level necessary to understand the prime mission mode operation.

  18. IEEE 1451.2 based Smart sensor system using ADuc847

    NASA Astrophysics Data System (ADS)

    Sreejithlal, A.; Ajith, Jose

    The IEEE 1451 standard defines a standard interface for connecting transducers to microprocessor-based data acquisition systems, instrumentation systems, and control and field networks. A smart transducer interface module (STIM) acts as a unit which provides signal conditioning, digitization and data packet generation functions for the transducers connected to it. This paper describes the implementation of a microcontroller-based smart transducer interface module conforming to the IEEE 1451.2 standard. The module, implemented using an ADuC847 microcontroller, has two transducer channels and is programmed in the Embedded C language. The sensor system consists of a Network Capable Application Processor (NCAP) module which controls the smart transducer interface module (STIM) over an IEEE 1451.2-RS232 bus. The NCAP module is implemented as a software module in the C# language. The hardware details, control principles involved and the software implementation for the STIM are described in detail.

  19. Knowledge-based engineering of a PLC controlled telescope

    NASA Astrophysics Data System (ADS)

    Pessemier, Wim; Raskin, Gert; Saey, Philippe; Van Winckel, Hans; Deconinck, Geert

    2016-08-01

    As the new control system of the Mercator Telescope is being finalized, we can review some technologies and design methodologies that are advantageous, despite their relative uncommonness in astronomical instrumentation. Particular for the Mercator Telescope is that it is controlled by a single high-end soft-PLC (Programmable Logic Controller). Using off-the-shelf components only, our distributed embedded system controls all subsystems of the telescope such as the pneumatic primary mirror support, the hydrostatic bearing, the telescope axes, the dome, the safety system, and so on. We show how real-time application logic can be written conveniently in typical PLC languages (IEC 61131-3) and in C++ (to implement the pointing kernel) using the commercial TwinCAT 3 programming environment. This software processes the inputs and outputs of the distributed system in real-time via an observatory-wide EtherCAT network, which is synchronized with high precision to an IEEE 1588 (PTP, Precision Time Protocol) time reference clock. Taking full advantage of the ability of soft-PLCs to run both real-time and non real-time software, the same device also hosts the most important user interfaces (HMIs or Human Machine Interfaces) and communication servers (OPC UA for process data, FTP for XML configuration data, and VNC for remote control). To manage the complexity of the system and to streamline the development process, we show how most of the software, electronics and systems engineering aspects of the control system have been modeled as a set of scripts written in a Domain Specific Language (DSL). When executed, these scripts populate a Knowledge Base (KB) which can be queried to retrieve specific information. By feeding the results of those queries to a template system, we were able to generate very detailed "browsable" web-based documentation about the system, but also PLC software code, Python client code, model verification reports, etc. The aim of this paper is to demonstrate the added value that technologies such as soft-PLCs and DSL-scripts and design methodologies such as knowledge-based engineering can bring to astronomical instrumentation.
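
    The knowledge-based workflow described above can be pictured with a toy Python sketch: short DSL-like declarations populate a knowledge base, a query retrieves facts, and a template turns the result into a generated artefact (a line of documentation here, though it could equally be PLC or client code). All names and fields are invented for this example and do not reflect the actual Mercator scripts.

      knowledge_base = []

      def declare_axis(name, motor, counts_per_rev):
          """DSL-like declaration that adds one fact to the knowledge base."""
          knowledge_base.append({"kind": "axis", "name": name,
                                 "motor": motor, "counts_per_rev": counts_per_rev})

      # "Scripts" describing the system under design.
      declare_axis("azimuth", motor="M1", counts_per_rev=2**20)
      declare_axis("elevation", motor="M2", counts_per_rev=2**20)

      # Query the knowledge base, then feed a documentation template.
      for axis in (fact for fact in knowledge_base if fact["kind"] == "axis"):
          print("Axis %(name)s is driven by motor %(motor)s "
                "(%(counts_per_rev)d encoder counts per revolution)." % axis)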

  20. A new development on measurement and control software of SANS BATAN spectrometer (SMARTer) in Serpong, Indonesia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bharoto; Suparno, Nadi; Putra, Edy Giri Rachman

    In 2005, the main computer for the data acquisition and control system of the Small-Angle Neutron Scattering (SANS) BATAN Spectrometer (SMARTer) was replaced after it ceased to operate the spectrometer. Following this replacement, new software for the data acquisition and control system was developed in-house using the Visual Basic programming language. In the last two years, many developments have been made in both the hardware and the software to make experiments more effective and efficient. Recently, the previous motor controller card (an ISA card) was replaced with a programmable motor controller card (a PCI card) for driving one motor of the position sensitive detector (PSD), eight motors of four collimators, and six motors of six pinhole discs. The new control software allows all motors to be moved simultaneously, significantly reducing the time needed to set up the instrument before running an experiment. Along with that development, new data acquisition software running under the MS Windows operating system was also developed to drive a beam stopper in the X-Y directions, to read the equipment status such as the positions of the collimators and the PSD, to acquire neutron counts on the monitor and PSD detectors, and to manage 12 sample positions automatically. The new software uses a timer object, set to one second, to read the equipment status via the computer's serial port (RS232C), and a general purpose interface bus (GPIB) to read the total counts of each pixel of the PSD from histogram memory. Experimental results are displayed in real time in the main window, and the data are saved in a special format for further data reduction and analysis. The new software has been implemented and used for experiments in either preset-count or preset-time mode for the absolute scattering intensity method.

  1. A new development on measurement and control software of SANS BATAN spectrometer (SMARTer) in Serpong, Indonesia

    NASA Astrophysics Data System (ADS)

    Bharoto; Suparno, Nadi; Putra, Edy Giri Rachman

    2015-04-01

    In 2005, the main computer for the data acquisition and control system of the Small-Angle Neutron Scattering (SANS) BATAN Spectrometer (SMARTer) was replaced after it ceased to operate the spectrometer. Following this replacement, new software for the data acquisition and control system was developed in-house using the Visual Basic programming language. In the last two years, many developments have been made in both the hardware and the software to make experiments more effective and efficient. Recently, the previous motor controller card (an ISA card) was replaced with a programmable motor controller card (a PCI card) for driving one motor of the position sensitive detector (PSD), eight motors of four collimators, and six motors of six pinhole discs. The new control software allows all motors to be moved simultaneously, significantly reducing the time needed to set up the instrument before running an experiment. Along with that development, new data acquisition software running under the MS Windows operating system was also developed to drive a beam stopper in the X-Y directions, to read the equipment status such as the positions of the collimators and the PSD, to acquire neutron counts on the monitor and PSD detectors, and to manage 12 sample positions automatically. The new software uses a timer object, set to one second, to read the equipment status via the computer's serial port (RS232C), and a general purpose interface bus (GPIB) to read the total counts of each pixel of the PSD from histogram memory. Experimental results are displayed in real time in the main window, and the data are saved in a special format for further data reduction and analysis. The new software has been implemented and used for experiments in either preset-count or preset-time mode for the absolute scattering intensity method.

  2. Software development kit for a compact cryo-refrigerator

    NASA Astrophysics Data System (ADS)

    Gardiner, J.; Hamilton, J.; Lawton, J.; Knight, K.; Wilson, A.; Spagna, S.

    2017-12-01

    This paper introduces a Software Development Kit (SDK) that enables the creation of custom software applications that automate the control of a cryo-refrigerator (Quantum Design model GA-1) in third party instruments. A remote interface allows real time tracking and logging of critical system diagnostics such as pressures, temperatures, valve states and run modes. The helium compressor scroll capsule speed and Gifford-McMahon (G-M) cold head speed can be manually adjusted over a serial communication line via a CAN interface. This configuration optimizes cooling power, while reducing wear on moving components thus extending service life. Additionally, a proportional speed control mode allows for automated throttling of speeds based on temperature or pressure feedback from a 3rd party device. Warm up and cool down modes allow 1st and 2nd stage temperatures to be adjusted without the use of external heaters.
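
    The proportional speed control mode mentioned above can be pictured as a simple mapping from a temperature (or pressure) error to a commanded capsule speed between fixed limits. The Python sketch below only illustrates that idea; the setpoint, gain and speed limits are invented and the function is not part of the GA-1 SDK.

      def capsule_speed(temperature_K, setpoint_K=4.2, gain=20.0,
                        min_speed=1.0, max_speed=5.0):
          """Return a commanded scroll-capsule speed (rev/s) from a temperature error."""
          error = temperature_K - setpoint_K      # positive when the stage is too warm
          speed = min_speed + gain * error        # proportional throttling only
          return min(max(speed, min_speed), max_speed)

      for temperature in (4.2, 4.3, 4.5, 6.0):
          print("%.1f K -> %.2f rev/s" % (temperature, capsule_speed(temperature)))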

  3. Automated complex for research of electric drives control

    NASA Astrophysics Data System (ADS)

    Avlasko, P. V.; Antonenko, D. A.

    2018-05-01

    This article describes an automated test complex intended for studying various control modes of electric motors, including the doubly fed induction motor. The complex is built on the National Instruments platform; the controller embedded in the platform ships with a real-time operating system for building measurement and control systems. The software, developed in LabVIEW, consists of several interconnected modules residing in different elements of the complex. In addition to the software for automated control of the experimental installation, a program suite was developed for modelling processes in the electric drive. As a result, the simulated transient characteristics of the electric drive can be compared with those obtained experimentally in various operating modes.

  4. ChiMS: Open-source instrument control software platform on LabVIEW for imaging/depth profiling mass spectrometers.

    PubMed

    Cui, Yang; Hanley, Luke

    2015-06-01

    ChiMS is an open-source data acquisition and control software program written within LabVIEW for high speed imaging and depth profiling mass spectrometers. ChiMS can also transfer large datasets from a digitizer to computer memory at high repetition rate, save data to hard disk at high throughput, and perform high speed data processing. The data acquisition mode generally simulates a digital oscilloscope, but with peripheral devices integrated for control as well as advanced data sorting and processing capabilities. Customized user-designed experiments can be easily written based on several included templates. ChiMS is additionally well suited to non-laser-based mass spectrometer imaging and various other experiments in laser physics, physical chemistry, and surface science.

  5. ChiMS: Open-source instrument control software platform on LabVIEW for imaging/depth profiling mass spectrometers

    PubMed Central

    Cui, Yang; Hanley, Luke

    2015-01-01

    ChiMS is an open-source data acquisition and control software program written within LabVIEW for high speed imaging and depth profiling mass spectrometers. ChiMS can also transfer large datasets from a digitizer to computer memory at high repetition rate, save data to hard disk at high throughput, and perform high speed data processing. The data acquisition mode generally simulates a digital oscilloscope, but with peripheral devices integrated for control as well as advanced data sorting and processing capabilities. Customized user-designed experiments can be easily written based on several included templates. ChiMS is additionally well suited to non-laser-based mass spectrometer imaging and various other experiments in laser physics, physical chemistry, and surface science. PMID:26133872

  6. ChiMS: Open-source instrument control software platform on LabVIEW for imaging/depth profiling mass spectrometers

    NASA Astrophysics Data System (ADS)

    Cui, Yang; Hanley, Luke

    2015-06-01

    ChiMS is an open-source data acquisition and control software program written within LabVIEW for high speed imaging and depth profiling mass spectrometers. ChiMS can also transfer large datasets from a digitizer to computer memory at high repetition rate, save data to hard disk at high throughput, and perform high speed data processing. The data acquisition mode generally simulates a digital oscilloscope, but with peripheral devices integrated for control as well as advanced data sorting and processing capabilities. Customized user-designed experiments can be easily written based on several included templates. ChiMS is additionally well suited to non-laser-based mass spectrometer imaging and various other experiments in laser physics, physical chemistry, and surface science.

  7. Software for the Hydra Instrument on the Polar Spacecraft

    NASA Technical Reports Server (NTRS)

    Fillius, Walker

    1996-01-01

    This software was developed by UCSD for the Hydra instrument and conforms with the contractual Statement of Work, with one exception, directed by the NASA Technical Monitor: the programming language was assembly language rather than Forth.

  8. Evaluation of FNS control systems: software development and sensor characterization.

    PubMed

    Riess, J; Abbas, J J

    1997-01-01

    Functional Neuromuscular Stimulation (FNS) systems activate paralyzed limbs by electrically stimulating motor neurons. These systems have been used to restore functions such as standing and stepping in people with thoracic-level spinal cord injury. Research in our laboratory is directed at the design and evaluation of control algorithms for generating posture and movement. This paper describes software developed for implementing FNS control systems and the characterization of a sensor system used to implement and evaluate controllers in the laboratory. In order to assess FNS control algorithms, we have developed a versatile software package using LabVIEW (National Instruments Corp.). This package provides the ability to interface with sensor systems via serial port or A/D board, implement data processing and real-time control algorithms, and interface with neuromuscular stimulation devices. In our laboratory, we use the Flock of Birds (Ascension Technology Corp.) motion tracking sensor system to monitor limb segment position and orientation (6 degrees of freedom). Errors in the sensor system have been characterized and nonlinear polynomial models have been developed to account for these errors. With this compensation, the error in the distance measurement is reduced by 90% so that the maximum error is less than 1 cm.
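
    The nonlinear polynomial compensation mentioned above can be sketched in a few lines of Python/NumPy: fit a polynomial that maps the sensor's readings to reference positions collected during a calibration pass, then apply it to later readings. The calibration data and the polynomial order below are invented for illustration and are not the models used in this work.

      import numpy as np

      # Calibration pass: reference positions (cm) and what the sensor reported (cm).
      true_cm     = np.array([10.0, 20.0, 30.0, 40.0, 50.0, 60.0, 70.0, 80.0])
      measured_cm = np.array([10.2, 20.9, 31.8, 43.0, 54.5, 66.1, 78.0, 90.3])

      # Fit a cubic error model mapping measured position back to true position.
      coefficients = np.polyfit(measured_cm, true_cm, deg=3)

      def compensate(reading_cm):
          """Apply the fitted correction to a raw sensor reading."""
          return float(np.polyval(coefficients, reading_cm))

      raw = 54.5
      print("raw %.1f cm -> corrected %.1f cm" % (raw, compensate(raw)))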

  9. Development of Universal Controller Architecture for SiC Based Power Electronic Building Blocks

    DTIC Science & Technology

    2017-10-30

    time control and control network routing, and the other for non-real-time instrumentation and monitoring. The two subsystems are isolated and share ... directly to the processor without any software intervention. We use a non-real-time 1 Gb/s Ethernet interface for monitoring and control of the module ...

  10. Microorganism penetration in dentinal tubules of instrumented and retreated root canal walls. In vitro SEM study.

    PubMed

    Al-Nazhan, Saad; Al-Sulaiman, Alaa; Al-Rasheed, Fellwa; Alnajjar, Fatimah; Al-Abdulwahab, Bander; Al-Badah, Abdulhakeem

    2014-11-01

    This in vitro study aimed to investigate the ability of Candida albicans (C. albicans) and Enterococcus faecalis (E. faecalis) to penetrate dentinal tubules of instrumented and retreated root canal surface of split human teeth. Sixty intact extracted human single-rooted teeth were divided into 4 groups, negative control, positive control without canal instrumentation, instrumented, and retreated. Root canals in the instrumented group were enlarged with endodontic instruments, while root canals in the retreated group were enlarged, filled, and then removed the canal filling materials. The teeth were split longitudinally after canal preparation in 3 groups except the negative control group. The teeth were inoculated with both microorganisms separately and in combination. Teeth specimens were examined by scanning electron microscopy (SEM), and the depth of penetration into the dentinal tubules was assessed using the SMILE view software (JEOL Ltd). Penetration of C. albicans and E. faecalis into the dentinal tubules was observed in all 3 groups, although penetration was partially restricted by dentin debris of tubules in the instrumented group and remnants of canal filling materials in the retreated group. In all 3 groups, E. faecalis penetrated deeper into the dentinal tubules by way of cell division than C. albicans which built colonies and penetrated by means of hyphae. Microorganisms can easily penetrate dentinal tubules of root canals with different appearance based on the microorganism size and status of dentinal tubules.

  11. Time-resolved laser-induced fluorescence system

    NASA Astrophysics Data System (ADS)

    Bautista, F. J.; De la Rosa, J.; Gallegos, F. J.

    2006-02-01

    Fluorescence methods are being used increasingly in the measurement of species concentrations in gases, liquids and solids. Laser-induced fluorescence is spontaneous emission from atoms or molecules that have been excited by laser radiation. Here we present a time-resolved fluorescence instrument that consists of a 5 μJ nitrogen laser (337.1 nm), a sample holder, a quartz optical fiber, a spectrometer, a PMT and a PC, and that allows the measurement of visible fluorescence spectra (350-750 nm). The time response of the system is approximately 5 ns. The instrument has been used in the measurement of colored bond paper, antifreeze, diesel, cochineal pigment and malignant tissues. Data acquisition was achieved through computer control of a digital oscilloscope (via the General Purpose Interface Bus, GPIB) and of the spectrometer via a serial link (RS232). The instrument software provides a graphical interface for data acquisition tasks such as acquiring fluorescence spectra and determining fluorescence lifetimes. The software was developed using the LabVIEW 6i graphical programming package and can easily be extended with additional functions.
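
    The oscilloscope-over-GPIB and spectrometer-over-RS232 arrangement described above is the kind of setup the PyVISA library handles from Python; a minimal sketch is shown below. The resource addresses and command strings are placeholders, not those of the actual instruments used in this work.

      import pyvisa

      rm = pyvisa.ResourceManager()
      scope = rm.open_resource("GPIB0::7::INSTR")        # oscilloscope on the GPIB bus
      spectrometer = rm.open_resource("ASRL1::INSTR")    # spectrometer on a serial port

      print(scope.query("*IDN?"))                        # identify the oscilloscope
      trace = scope.query_ascii_values("CURVE?")         # placeholder waveform query
      spectrometer.write("GOTO 500.0")                   # placeholder grating-move command

      scope.close()
      spectrometer.close()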

  12. CÆLIS: software for assimilation, management and processing data of an atmospheric measurement network

    NASA Astrophysics Data System (ADS)

    Fuertes, David; Toledano, Carlos; González, Ramiro; Berjón, Alberto; Torres, Benjamín; Cachorro, Victoria E.; de Frutos, Ángel M.

    2018-02-01

    Given the importance of the atmospheric aerosol, the number of instruments and measurement networks which focus on its characterization is growing. Many challenges arise from the standardization of protocols, the monitoring of instrument status to evaluate network data quality, and the manipulation and distribution of large volumes of data (raw and processed). CÆLIS is a software system which aims at simplifying the management of a network, providing tools for monitoring the instruments and processing the data in real time, and offering the scientific community a new tool to work with the data. Since 2008 CÆLIS has been successfully applied to the photometer calibration facility managed by the University of Valladolid, Spain, in the framework of the Aerosol Robotic Network (AERONET). Thanks to the use of advanced tools, this facility has been able to analyze a growing number of stations and data in real time, which greatly benefits the network management and data quality control. The present work describes the system architecture of CÆLIS and some examples of applications and data processing.

  13. Simulating the WFIRST coronagraph integral field spectrograph

    NASA Astrophysics Data System (ADS)

    Rizzo, Maxime J.; Groff, Tyler D.; Zimmermann, Neil T.; Gong, Qian; Mandell, Avi M.; Saxena, Prabal; McElwain, Michael W.; Roberge, Aki; Krist, John; Riggs, A. J. Eldorado; Cady, Eric J.; Mejia Prada, Camilo; Brandt, Timothy; Douglas, Ewan; Cahoy, Kerri

    2017-09-01

    A primary goal of direct imaging techniques is to spectrally characterize the atmospheres of planets around other stars at extremely high contrast levels. To achieve this goal, coronagraphic instruments have favored integral field spectrographs (IFS) as the science cameras to disperse the entire search area at once and obtain spectra at each location, since the planet position is not known a priori. These spectrographs are useful against confusion from speckles and background objects, and can also help in the speckle subtraction and wavefront control stages of the coronagraphic observation. We present a software package, the Coronagraph and Rapid Imaging Spectrograph in Python (crispy) to simulate the IFS of the WFIRST Coronagraph Instrument (CGI). The software propagates input science cubes using spatially and spectrally resolved coronagraphic focal plane cubes, transforms them into IFS detector maps and ultimately reconstructs the spatio-spectral input scene as a 3D datacube. Simulated IFS cubes can be used to test data extraction techniques, refine sensitivity analyses and carry out design trade studies of the flight CGI-IFS instrument. crispy is a publicly available Python package and can be adapted to other IFS designs.

  14. A new on-axis micro-spectrophotometer for combining Raman, fluorescence and UV/Vis absorption spectroscopy with macromolecular crystallography at the Swiss Light Source

    PubMed Central

    Pompidor, Guillaume; Dworkowski, Florian S. N.; Thominet, Vincent; Schulze-Briese, Clemens; Fuchs, Martin R.

    2013-01-01

    The combination of X-ray diffraction experiments with optical methods such as Raman, UV/Vis absorption and fluorescence spectroscopy greatly enhances and complements the specificity of the obtained information. The upgraded version of the in situ on-axis micro-spectrophotometer, MS2, at the macromolecular crystallography beamline X10SA of the Swiss Light Source is presented. The instrument newly supports Raman and resonance Raman spectroscopy, in addition to the previously available UV/Vis absorption and fluorescence modes. With the recent upgrades of the spectral bandwidth, instrument stability, detection efficiency and control software, the application range of the instrument and its ease of operation were greatly improved. Its on-axis geometry with collinear X-ray and optical axes to ensure optimal control of the overlap of sample volumes probed by each technique is still unique amongst comparable facilities worldwide and the instrument has now been in general user operation for over two years. PMID:23955041

  15. A compact holographic optical tweezers instrument

    NASA Astrophysics Data System (ADS)

    Gibson, G. M.; Bowman, R. W.; Linnenberger, A.; Dienerowitz, M.; Phillips, D. B.; Carberry, D. M.; Miles, M. J.; Padgett, M. J.

    2012-11-01

    Holographic optical tweezers have found many applications including the construction of complex micron-scale 3D structures and the control of tools and probes for position, force, and viscosity measurement. We have developed a compact, stable, holographic optical tweezers instrument which can be easily transported and is compatible with a wide range of microscopy techniques, making it a valuable tool for collaborative research. The instrument measures approximately 30×30×35 cm and is designed around a custom inverted microscope, incorporating a fibre laser operating at 1070 nm. We designed the control software to be easily accessible for the non-specialist, and have further improved its ease of use with a multi-touch iPad interface. A high-speed camera allows multiple trapped objects to be tracked simultaneously. We demonstrate that the compact instrument is stable to 0.5 nm for a 10 s measurement time by plotting the Allan variance of the measured position of a trapped 2 μm silica bead. We also present a range of objects that have been successfully manipulated.

  16. A new on-axis micro-spectrophotometer for combining Raman, fluorescence and UV/Vis absorption spectroscopy with macromolecular crystallography at the Swiss Light Source.

    PubMed

    Pompidor, Guillaume; Dworkowski, Florian S N; Thominet, Vincent; Schulze-Briese, Clemens; Fuchs, Martin R

    2013-09-01

    The combination of X-ray diffraction experiments with optical methods such as Raman, UV/Vis absorption and fluorescence spectroscopy greatly enhances and complements the specificity of the obtained information. The upgraded version of the in situ on-axis micro-spectrophotometer, MS2, at the macromolecular crystallography beamline X10SA of the Swiss Light Source is presented. The instrument newly supports Raman and resonance Raman spectroscopy, in addition to the previously available UV/Vis absorption and fluorescence modes. With the recent upgrades of the spectral bandwidth, instrument stability, detection efficiency and control software, the application range of the instrument and its ease of operation were greatly improved. Its on-axis geometry with collinear X-ray and optical axes to ensure optimal control of the overlap of sample volumes probed by each technique is still unique amongst comparable facilities worldwide and the instrument has now been in general user operation for over two years.

  17. CoRoTlog

    NASA Astrophysics Data System (ADS)

    Plasson, Ph.

    2006-11-01

    LESIA, in close cooperation with CNES, DLR and IWF, is responsible for the tests and validation of the CoRoT instrument digital processing unit, which is made up of the BEX and DPU assembly. The main part of the work consisted in validating the DPU software and in testing the BEX/DPU coupling. This work took more than two years due to the central role of the software tested and its technical complexity. The first task in the validation process was to carry out the acceptance tests of the DPU software. These tests consisted in checking each of the 325 requirements identified in the URD (User Requirements Document) and were played in a configuration using the DPU coupled to a BEX simulator. During the acceptance tests, all the transversal functionalities of the DPU software, such as the TC/TM management, the state machine management, the BEX driving, the system monitoring and the maintenance functionalities, were checked in depth. The functionalities associated with the seismology and exoplanetology processing, such as the loading of window and mask descriptors or the configuration of the service execution parameters, were also exhaustively tested. After validating the DPU software against the user requirements using a BEX simulator, the next step consisted in coupling the DPU and the BEX in order to check that the combined unit worked correctly and met the performance requirements. These tests were conducted in two phases: the first devoted to the functional aspects and the interface tests, the second to the performance aspects. The performance tests were based on the use of the DPU software scientific services and on full images representative of a realistic sky as inputs. They also used a reference set of windows and parameters, provided by the scientific team, that was representative, in terms of load and complexity, of the set that could be used during the observation mode of the CoRoT instrument. They were played in a configuration using either a BCC simulator or a real BCC coupled to a video simulator to feed the BEX/DPU unit. The validation of the scientific algorithms was conducted in parallel with the BEX/DPU coupling tests. The objective of this phase was to check that the algorithms implemented in the scientific services of the DPU software conformed to those specified in the URD and that the numerical precision obtained corresponded to that expected. Forty test cases were defined, covering the fine and coarse angular error measurement processing, the rejection of bright pixels, the subtraction of the offset and the sky background, the photometry algorithms, the SAA handling and the reference image management. For each test case, the LESIA scientific team produced by simulation, using the instrument model, the dynamic data files and parameter sets needed to feed the DPU on the one hand and a model of the onboard software on the other. These data files correspond to FITS images (black windows, star windows, offset windows) containing varying levels of disturbance and making it possible to test the DPU software in dynamic mode over durations of up to 48 hours. To perform the test and validation activities of the CoRoT instrument digital processing unit, a set of software testing tools was developed by LESIA (Software Ground Support Equipment, hereafter "SGSE").
Thanks to their versatility and modularity, these software testing tools were actually used during all the activities of integration, tests and validation of the instrument and its subsystems CoRoTCase and CoRoTCam. The CoRoT SGSE were specified, designed and developed by LESIA. The objective was to have a software system allowing the users (validation team of the onboard software, instrument integration team, etc.) to remotely control and monitor the whole instrument or only one of the subsystems of the instrument like the DPU coupled to a simulator BEX or the BEX/DPU unit coupled to a BCC simulator. The idea was to be able to interact in real time with the system under test by driving the various EGSE, but also to play test procedures implemented as scripts organized into libraries, to record the telemetries and housekeeping data in a database, and to be able to carry out post-mortem analyses.

  18. The CARIBU EBIS control and synchronization system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dickerson, Clayton, E-mail: cdickerson@anl.gov; Peters, Christopher, E-mail: cdickerson@anl.gov

    2015-01-09

    The Californium Rare Isotope Breeder Upgrade (CARIBU) Electron Beam Ion Source (EBIS) charge breeder has been built and tested. The bases of the CARIBU EBIS electrical system are four voltage platforms on which both DC and pulsed high voltage outputs are controlled. The high voltage output pulses are created with either a combination of a function generator and a high voltage amplifier, or two high voltage DC power supplies and a high voltage solid state switch. Proper synchronization of the pulsed voltages, fundamental to optimizing the charge breeding performance, is achieved with triggering from a digital delay pulse generator. The control system is based on National Instruments real-time controllers and LabVIEW software implementing Functional Global Variables (FGV) to store and access instrument parameters. Fiber optic converters enable network communication and triggering across the platforms.
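    The Functional Global Variable pattern mentioned above is a LabVIEW idiom: a single re-entrant access point that holds state and is called with an action selector. A minimal Python analogue is sketched below purely for illustration; the class, parameter names and values are assumptions, not taken from the CARIBU control system.

```python
# Minimal sketch of an FGV-style parameter store: one access point, internal state,
# action selector. Thread-safe so multiple control loops can share it.
import threading

class FunctionalGlobal:
    """Stores and returns instrument parameters through a single call point."""
    def __init__(self):
        self._lock = threading.Lock()
        self._params = {}

    def access(self, action, name=None, value=None):
        with self._lock:
            if action == "set":
                self._params[name] = value
            elif action == "get":
                return self._params.get(name)
            elif action == "dump":
                return dict(self._params)

fgv = FunctionalGlobal()
fgv.access("set", "platform1.dc_voltage", 2500.0)   # hypothetical parameter name/value
print(fgv.access("get", "platform1.dc_voltage"))
```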

  19. Controlling Precision Stepper Motors in Flight Using (Almost) No Parts

    NASA Technical Reports Server (NTRS)

    Randall, David

    2010-01-01

    This concept allows control of high-performance stepper motors with a minimal parts count and minimal flight software complexity. Although it uses a small number of common flight-qualified parts and simple control algorithms, it is capable enough to meet demanding system requirements. Its programmable nature makes it trivial to implement changes to control algorithms both during integration & test and in flight. Enhancements such as microstepping, half stepping, back-EMF compensation, and jitter reduction can be tailored to the requirements of a large variety of stepper-motor-based applications including filter wheels, focus mechanisms, antenna tracking subsystems, pointing and mobility. The hardware design (using an H-bridge motor controller IC) was adapted from JPL's MER mission, still operating on Mars. This concept has been fully developed and incorporated into the MCS instrument on MRO, currently operating in Mars orbit. It has been incorporated into the filter wheel mechanism and linear stage (focus) mechanism for the AMT instrument. On MCS/MRO, two of these circuits control the elevation and azimuth of the MCS telescope/radiometer assembly, allowing the instrument to continuously monitor the limb of the Martian atmosphere. Implementation on MCS/MRO resulted in a 4:1 reduction in the volume and mass required for the motor driver electronics (100:25 square inches of PCB space), producing a very compact instrument. In fact, all of the electronics for the MCS instrument are packaged within the movable instrument structure. It also saved approximately 3 Watts of power. Most importantly, the design enabled MCS to meet its very stringent maximum allowable torque disturbance requirements.
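    To make the half-stepping idea concrete, the sketch below walks a two-phase stepper through the standard eight-state half-step sequence. The coil table is generic textbook material; the drive() stub and the filter-wheel usage are invented placeholders and do not reflect the MCS/MRO flight code.

```python
# Illustrative half-step sequence for a two-phase stepper driven through an H-bridge.
# Each tuple is the on/off state of the four drive lines (A+, A-, B+, B-).
HALF_STEP = [
    (1, 0, 0, 0), (1, 0, 1, 0), (0, 0, 1, 0), (0, 1, 1, 0),
    (0, 1, 0, 0), (0, 1, 0, 1), (0, 0, 0, 1), (1, 0, 0, 1),
]

_phase = 0  # current position in the sequence

def drive(coils):
    """Stand-in for writing the four H-bridge control lines."""
    pass

def move(steps):
    """Advance (positive) or retreat (negative) by the given number of half-steps."""
    global _phase
    direction = 1 if steps >= 0 else -1
    for _ in range(abs(steps)):
        _phase = (_phase + direction) % len(HALF_STEP)
        drive(HALF_STEP[_phase])

move(200)   # e.g., advance a hypothetical filter wheel by 200 half-steps
```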

  20. Development of a translation stage for in situ noninvasive analysis and high-resolution imaging

    NASA Astrophysics Data System (ADS)

    Strivay, David; Clar, Mathieu; Rakkaa, Said; Hocquet, Francois-Philippe; Defeyt, Catherine

    2016-11-01

    Noninvasive imaging techniques and analytical instrumentation for cultural heritage object studies have undergone tremendous development over the last few years. Many new miniature and/or handheld systems have been developed and optimized. Nonetheless, these instruments are usually used with a tripod or a manual positioning system. This is very time-consuming when performing point analysis or 2D scanning of a surface. The Centre Européen d'Archéométrie has built a translation system made of pluggable 1 m long rails, with a maximum length and height of 3 m. Three motors embedded in the system allow the platform to be moved along these axes, toward and away from the sample. The rails hold a displacement system, providing a continuous movement. Any position can be reached with a reproducibility of 0.1 mm. The displacements are controlled over an Ethernet connection through a laptop computer running custom-made multiplatform software written in Java. This software allows complete control over the positioning through a simple, unique, and concise interface. Automatic scanning can be performed over a large surface of 3 m by 3 m. The Ethernet wires also provide power for the different motors and, if necessary, the detection head. The platform was originally designed for an XRF detection head (including its power supply) but can now accommodate many different systems such as IR reflectography, digital cameras, hyperspectral cameras, and Raman probes. The positioning system can be modified to combine the acquisition software of the imaging or analytical techniques and the positioning software.
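    The kind of automatic surface scan described above boils down to generating a serpentine grid of positions and sending each one to the motion controller. The sketch below illustrates that idea; the MOVE command syntax, host address and port are hypothetical and are not the protocol of the Java software described in the paper.

```python
# Sketch of a serpentine (raster) scan sent to a motion controller over Ethernet.
import socket

def raster(x_max_mm, y_max_mm, step_mm):
    """Yield (x, y) positions covering the area row by row, alternating direction."""
    n_cols = int(x_max_mm / step_mm) + 1
    n_rows = int(y_max_mm / step_mm) + 1
    for row in range(n_rows):
        cols = range(n_cols) if row % 2 == 0 else reversed(range(n_cols))
        for col in cols:
            yield (col * step_mm, row * step_mm)

def scan(host="192.0.2.10", port=5000):
    # Hypothetical plain-text positioning protocol: "MOVE <x_mm> <y_mm>\n"
    with socket.create_connection((host, port)) as s:
        for x, y in raster(3000.0, 3000.0, 1.0):
            s.sendall(f"MOVE {x:.1f} {y:.1f}\n".encode())

# scan()   # requires a controller listening on the (hypothetical) address above
```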

  1. Exploiting IoT Technologies and Open Source Components for Smart Seismic Network Instrumentation

    NASA Astrophysics Data System (ADS)

    Germenis, N. G.; Koulamas, C. A.; Foundas, P. N.

    2017-12-01

    The data collection infrastructure of any seismic network poses a number of requirements and trade-offs related to accuracy, reliability, power autonomy and installation & operational costs. Given the right hardware design at the edge of this infrastructure, the embedded software running inside the instruments is at the heart of implementing pre-processing and communication services and integrating them with the central storage and processing facilities of the seismic network. This work demonstrates the feasibility and benefits of exploiting software components from heterogeneous sources in order to realize a smart seismic data logger, achieving higher reliability, faster integration and lower development and testing costs for critical functionality that is in turn responsible for the cost- and power-efficient operation of the device. The instrument's software builds on top of widely used open source components around the Linux kernel with real-time extensions, the core Debian Linux distribution, the earthworm and seiscomp tooling frameworks, as well as components from the Internet of Things (IoT) world, such as the CoAP and MQTT protocols for the signaling plane, besides the widely used de-facto standards of the application domain at the data plane, such as the SeedLink protocol. By using an innovative integration of features based on lower level GPL components of the seiscomp suite with higher level processing earthworm components, coupled with IoT protocol extensions to the latter, the instrument can implement smart functionality such as network controlled, event triggered data transmission in parallel with edge archiving and on demand, short term historical data retrieval.
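    As a rough illustration of event-triggered transmission over MQTT, the sketch below publishes a message when a toy STA/LTA detector fires. It assumes the paho-mqtt client library (1.x client construction); the broker address, topic name, trigger threshold and the acquisition stub are invented and are not details of the instrument described above.

```python
# Sketch: publish an event message over MQTT when a toy STA/LTA trigger fires.
import itertools, json, random, time
import paho.mqtt.client as mqtt

def acquire_samples():
    """Placeholder for the digitizer read-out; here just synthetic noise."""
    for _ in itertools.count():
        yield random.gauss(0.0, 1.0)

def sta_lta_triggered(samples, threshold=3.0):
    """Short-term / long-term average ratio on the most recent samples."""
    if len(samples) < 100:
        return False
    sta = sum(abs(s) for s in samples[-10:]) / 10.0
    lta = sum(abs(s) for s in samples[-100:]) / 100.0
    return lta > 0 and sta / lta > threshold

client = mqtt.Client()                      # paho-mqtt 1.x style constructor
client.connect("broker.example.org", 1883)  # hypothetical broker
client.loop_start()

window = []
for sample in acquire_samples():
    window.append(sample)
    if sta_lta_triggered(window):
        payload = json.dumps({"t": time.time(), "trigger": "STA/LTA"})
        client.publish("seismic/station01/event", payload, qos=1)
        window.clear()
```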

  2. ORAC: 21st Century Observing at UKIRT

    NASA Astrophysics Data System (ADS)

    Bridger, A.; Wright, G. S.; Tan, M.; Pickup, D. A.; Economou, F.; Currie, M. J.; Adamson, A. J.; Rees, N. P.; Purves, M. H.

    The Observatory Reduction and Acquisition Control system replaces all of the existing software which interacts with the observers at UKIRT. The aim is to improve observing efficiency with a set of integrated tools that take the user from pre-observing preparation, through the acquisition of observations to the reduction using a data-driven pipeline. ORAC is designed to be flexible and extensible, and is intended for use with all future UKIRT instruments, as well as existing telescope hardware and ``legacy'' instruments. It is also designed to allow integration with phase-1 and queue-scheduled observing tools in anticipation of possible future requirements. A brief overview of the project and its relationship to other systems is given. ORAC also re-uses much code from other systems and we discuss issues relating to the trade-off between reuse and the generation of new software specific to our requirements.

  3. Integrated Laser Characterization, Data Acquisition, and Command and Control Test System

    NASA Technical Reports Server (NTRS)

    Stysley, Paul; Coyle, Barry; Lyness, Eric

    2012-01-01

    Satellite-based laser technology has been developed for topographical measurements of the Earth and of other planets. Lasers for such missions must be highly efficient and stable over long periods under the temperature variations of orbit. In this innovation, LabVIEW is used on an Apple Macintosh to acquire and analyze images of the laser beam as it exits the laser cavity to evaluate the laser's performance over time, and to monitor and control the environmental conditions under which the laser is tested. One computer attached to multiple cameras and instruments running LabVIEW-based software replaces a conglomeration of computers and software packages, saving hours in maintenance and data analysis, and making very long-term tests possible. This all-in-one system was written primarily using LabVIEW for Mac OS X, which allows the combining of data from multiple RS-232, USB, and Ethernet instruments for comprehensive laser analysis and control. The system acquires data from CCDs (charge coupled devices), power meters, thermistors, and oscilloscopes over a controllable period of time. This data is saved to an HTML file that can be accessed later from a variety of data analysis programs. Also, through the LabVIEW interface, engineers can easily control laser input parameters such as current, pulse width, chiller temperature, and repetition rates. All of these parameters can be adapted and cycled over a period of time.

  4. Virtualization of Legacy Instrumentation Control Computers for Improved Reliability, Operational Life, and Management.

    PubMed

    Katz, Jonathan E

    2017-01-01

    Laboratories tend to be amenable environments for long-term reliable operation of scientific measurement equipment. Indeed, it is not uncommon to find equipment 5, 10, or even 20+ years old still being routinely used in labs. Unfortunately, the Achilles heel for many of these devices is the control/data acquisition computer. Often these computers run older operating systems (e.g., Windows XP) and, while they might only use standard network, USB or serial ports, they require proprietary software to be installed. Even if the original installation disks can be found, reinstalling is a burdensome process fraught with "gotchas" that can derail it: lost license keys, incompatible hardware, forgotten configuration settings, etc. If you have legacy instrumentation running, the computer is the ticking time bomb waiting to put a halt to your operation. In this chapter, I describe how to virtualize your currently running control computer. This virtualized computer "image" is easy to maintain, easy to back up and easy to redeploy. I have used this multiple times in my own lab to greatly improve the robustness of my legacy devices. After completing the steps in this chapter, you will have your original control computer as well as a virtual instance of that computer with all the software installed, ready to control your hardware should your original computer ever be decommissioned.

  5. [Research and development of portable hypertension therapeutic apparatus based on biofeedback mechanism].

    PubMed

    Huang, Rong; He, Hongmei; Pi, Xitian; Diao, Ziji; Zhao, Suwen

    2014-06-01

    Non-drug treatment of hypertension has become a research hotspot, which might overcome the heavy economic burden and side effects of drug treatment for the patients. Because of the good treatment effect and convenient operation, a new treatment based on slow breathing training is increasingly becoming a kind of physical therapy for hypertension. This paper explains the principle of hypertension treatment based on slow breathing training method, and introduces the overall structure of the portable blood pressure controlling instrument, including breathing detection circuit, the core control module, audio module, memory module and man-machine interaction module. We give a brief introduction to the instrument and the software in this paper. The prototype testing results showed that the treatment had a significant effect on controlling the blood pressure.

  6. Development of automation software for neutron activation analysis process in Malaysian nuclear agency

    NASA Astrophysics Data System (ADS)

    Yussup, N.; Rahman, N. A. A.; Ibrahim, M. M.; Mokhtar, M.; Salim, N. A. A.; Soh@Shaari, S. C.; Azman, A.

    2017-01-01

    Neutron Activation Analysis (NAA) has been established at the Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s. Most of the established procedures, especially from sample registration to sample analysis, are performed manually. These manual procedures carried out by the NAA laboratory personnel are time-consuming and inefficient. Hence, software to support system automation was developed to provide an effective method that replaces redundant manual data entries and produces faster sample analysis and calculation. This paper describes the design and development of the automation software for the NAA process, which consists of three sub-programs: sample registration; hardware control and data acquisition; and sample analysis. The data flow and the connections between the sub-programs are explained. The software is developed using the National Instruments LabVIEW development package.

  7. Cone-beam micro-CT system based on LabVIEW software.

    PubMed

    Ionita, Ciprian N; Hoffmann, Keneth R; Bednarek, Daniel R; Chityala, Ravishankar; Rudin, Stephen

    2008-09-01

    Construction of a cone-beam computed tomography (CBCT) system for laboratory research usually requires integration of different software and hardware components. As a result, building and operating such a complex system require the expertise of researchers with significantly different backgrounds. Additionally, writing flexible code to control the hardware components of a CBCT system combined with designing a friendly graphical user interface (GUI) can be cumbersome and time consuming. An intuitive and flexible program structure, as well as the program GUI for CBCT acquisition, is presented in this note. The program was developed in National Instrument's Laboratory Virtual Instrumentation Engineering Workbench (LabVIEW) graphical language and is designed to control a custom-built CBCT system but has been also used in a standard angiographic suite. The hardware components are commercially available to researchers and are in general provided with software drivers which are LabVIEW compatible. The program structure was designed as a sequential chain. Each step in the chain takes care of one or two hardware commands at a time; the execution of the sequence can be modified according to the CBCT system design. We have scanned and reconstructed over 200 specimens using this interface and present three examples which cover different areas of interest encountered in laboratory research. The resulting 3D data are rendered using a commercial workstation. The program described in this paper is available for use or improvement by other researchers.

  8. Method, accuracy and limitation of computer interaction in the operating room by a navigated surgical instrument.

    PubMed

    Hurka, Florian; Wenger, Thomas; Heininger, Sebastian; Lueth, Tim C

    2011-01-01

    This article describes a new interaction device for surgical navigation systems--the so-called navigation mouse system. The idea is to use a tracked instrument of a surgical navigation system like a pointer to control the software. The new interaction system extends existing navigation systems with a microcontroller-unit. The microcontroller-unit uses the existing communication line to extract the needed 3D-information of an instrument to calculate positions analogous to the PC mouse cursor and click events. These positions and events are used to manipulate the navigation system. In an experimental setup the reachable accuracy with the new mouse system is shown.
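    The core computation behind such a "navigation mouse" is mapping the tracked instrument's 3-D tip position onto a 2-D cursor position on the screen. The sketch below shows one simple way to do that by projecting the tip onto a virtual interaction plane; the plane definition, screen size and scaling are illustrative assumptions, not the parameters of the published system.

```python
# Sketch: project a tracked pointer tip onto a virtual plane and map it to screen pixels.
import numpy as np

PLANE_ORIGIN = np.array([0.0, 0.0, 0.0])   # virtual screen plane origin in tracker space (mm)
PLANE_X = np.array([1.0, 0.0, 0.0])        # unit vectors spanning the plane
PLANE_Y = np.array([0.0, 1.0, 0.0])
SCREEN = (1024, 768)                       # cursor resolution in pixels
PLANE_EXTENT_MM = (400.0, 300.0)           # physical area mapped onto the screen

def tip_to_cursor(tip_mm):
    rel = np.asarray(tip_mm, dtype=float) - PLANE_ORIGIN
    u = np.dot(rel, PLANE_X) / PLANE_EXTENT_MM[0]   # normalized plane coordinates
    v = np.dot(rel, PLANE_Y) / PLANE_EXTENT_MM[1]
    x = int(np.clip(u, 0.0, 1.0) * (SCREEN[0] - 1))
    y = int(np.clip(v, 0.0, 1.0) * (SCREEN[1] - 1))
    return x, y

print(tip_to_cursor([200.0, 150.0, 30.0]))   # roughly the screen centre
```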

  9. Reactive control and reasoning assistance for scientific laboratory instruments

    NASA Technical Reports Server (NTRS)

    Thompson, David E.; Levinson, Richard; Robinson, Peter

    1993-01-01

    Scientific laboratory instruments that are involved in chemical or physical sample identification frequently require substantial human preparation, attention, and interactive control during their operation. Successful real-time analysis of incoming data that supports such interactive control requires: (1) a clear recognition of variance of the data from expected results; and (2) rapid diagnosis of possible alternative hypotheses which might explain the variance. Such analysis then aids in decisions about modifying the experiment protocol, as well as being a goal in itself. This paper reports on a collaborative project at the NASA Ames Research Center between artificial intelligence researchers and planetary microbial ecologists. Our team is currently engaged in developing software that autonomously controls science laboratory instruments and that provides analysis of the real-time data in support of dynamic refinement of the experiment control. The first two instruments to which this technology has been applied are a differential thermal analyzer (DTA) and a gas chromatograph (GC). Coupled together, they form a new geochemistry and microbial analysis tool that is capable of rapid identification of the organic and mineralogical constituents in soils. The thermal decomposition of the minerals and organics, and the attendant release of evolved gases, provides data about the structural and molecular chemistry of the soil samples.

  10. CARMENES instrument control system and operational scheduler

    NASA Astrophysics Data System (ADS)

    Garcia-Piquer, Alvaro; Guàrdia, Josep; Colomé, Josep; Ribas, Ignasi; Gesa, Lluis; Morales, Juan Carlos; Pérez-Calpena, Ana; Seifert, Walter; Quirrenbach, Andreas; Amado, Pedro J.; Caballero, José A.; Reiners, Ansgar

    2014-07-01

    The main goal of the CARMENES instrument is to perform high-accuracy measurements of stellar radial velocities (1 m/s) with long-term stability. CARMENES will be installed in 2015 at the 3.5 m telescope in the Calar Alto Observatory (Spain) and it will be equipped with two spectrographs covering from the visible to the near-infrared. It will make use of its near-IR capabilities to observe late-type stars, whose peak of the spectral energy distribution falls in the relevant wavelength interval. The technology needed to develop this instrument represents a challenge at all levels. We present two software packages that play a key role in the control layer for an efficient operation of the instrument: the Instrument Control System (ICS) and the Operational Scheduler. The coordination and management of CARMENES is handled by the ICS, which is responsible for carrying out the operations of the different subsystems, providing a tool to operate the instrument in an integrated manner from low to high user interaction level. The ICS interacts with the following subsystems: the near-IR and visible channels, composed of the detectors and exposure meters; the calibration units; the environment sensors; the front-end electronics; the acquisition and guiding module; the interfaces with telescope and dome; and, finally, the software subsystems for operational scheduling of tasks, data processing, and data archiving. We describe the ICS software design, which implements the CARMENES operational design and is planned to be integrated in the instrument by the end of 2014. The CARMENES operational scheduler is the second key element in the control layer described in this contribution. It is the main actor in the translation of the survey strategy into a detailed schedule for the achievement of the optimization goals. The scheduler is based on Artificial Intelligence techniques and computes the survey planning by combining the static constraints that are known a priori (i.e., target visibility, sky background, required time sampling coverage) and the dynamic change of the system conditions (i.e., weather). Off-line and on-line strategies are integrated into a single tool for a suitable transfer of the target prioritization made by the science team to the real-time schedule that will be used by the instrument operators. A suitable solution is expected to increase the efficiency of telescope operations, which will represent an important benefit in terms of scientific return and operational costs. We present the operational scheduling tool designed for CARMENES, which is based on two algorithms combining a global and a local search: Genetic Algorithms and Hill Climbing astronomy-based heuristics, respectively. The algorithm explores a large number of potential solutions from the vast search space and is able to identify the most efficient ones. A planning solution is considered efficient when it optimizes the objectives defined, which, in our case, are related to the reduction of the time that the telescope is not in use and the maximization of the scientific return, measured in terms of the time coverage of each target in the survey. We present the results obtained using different test cases.
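    The hybrid global/local search described above can be sketched in a few lines: a genetic algorithm evolves candidate observation orderings and a hill-climbing pass refines the best one. The toy fitness function, target list and parameters below are placeholders; the real CARMENES scheduler handles far richer constraints (visibility, sampling coverage, weather).

```python
# Toy sketch of a GA + hill-climbing scheduler over a permutation of targets.
import random

TARGETS = list(range(20))   # hypothetical target IDs

def fitness(order):
    # Placeholder objective: prefer orderings whose consecutive targets are "close".
    return -sum(abs(a - b) for a, b in zip(order, order[1:]))

def crossover(p1, p2):
    cut = random.randrange(1, len(p1))
    head = p1[:cut]
    return head + [t for t in p2 if t not in head]

def mutate(order):
    i, j = random.sample(range(len(order)), 2)
    order[i], order[j] = order[j], order[i]

def hill_climb(order, iters=200):
    best, best_f = order[:], fitness(order)
    for _ in range(iters):
        cand = best[:]
        mutate(cand)
        if fitness(cand) > best_f:
            best, best_f = cand, fitness(cand)
    return best

def schedule(pop_size=30, generations=50):
    pop = [random.sample(TARGETS, len(TARGETS)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        children = [crossover(random.choice(parents), random.choice(parents))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return hill_climb(max(pop, key=fitness))   # local refinement of the GA's best plan

print(schedule())
```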

  11. Development of dedicated target tracking capability for the CERES instruments through flight software: enhancing radiometric validation and on-orbit calibration

    NASA Astrophysics Data System (ADS)

    Teague, Kelly K.; Smith, G. Louis; Priestley, Kory; Lukashin, Constantine; Roithmayr, Carlos

    2012-09-01

    Five CERES scanning radiometers have been flown to date. The Proto-Flight Model flew aboard the Tropical Rainfall Measuring Mission spacecraft in November 1997. Two CERES instruments, Flight Models (FM) 1 and 2, are aboard the Terra spacecraft, which was launched in December 1999. Two more CERES instruments, FM-3 and FM-4, are on the Aqua spacecraft, which was placed in orbit in May 2002. These instruments continue to operate after providing over a decade of Earth Radiation Budget data. The CERES FM-5 instrument, onboard the Suomi-NPP spacecraft, launched in October 2011. The CERES FM-6 instrument is manifested on the JPSS-1 spacecraft, to be launched in December 2016. A successor to these instruments is presently in the definition stage. This paper describes the evolving role of flight software in the operation of these instruments to meet the science objectives of the mission and to execute supplemental tasks as they evolve. In order to obtain and maintain high accuracy in the data products from these instruments, a number of operational activities have been developed and implemented since the instruments were originally designed and placed in orbit. These new activities are possible because of the ability to exploit and modify the flight software, which operates the instruments. The CERES Flight Software interface was designed to allow for on-orbit modification, and as such, constantly evolves to meet changing needs. The purpose of this paper is to provide a brief overview of modifications which have been developed to allow dedicated targeting of specific geographic locations as the CERES sensor flies overhead on its host spacecraft. This new observing strategy greatly increases the temporal and angular sampling for specific targets of high scientific interest.

  12. Software for Classroom Music Making.

    ERIC Educational Resources Information Center

    Ely, Mark C.

    1992-01-01

    Describes musical instrument digital interface (MIDI), a communication system that uses digital data to enable MIDI-equipped instruments to communicate with each other. Includes discussion of music editors, sequencers, compositional software, and commonly used computers. Suggests uses for the technology for students and teachers. Urges further…

  13. The CRREL Instrumented Vehicle: Hardware and Software.

    DTIC Science & Technology

    1983-01-01

    ...rear axle torque are measured. The vehicle is equipped for front-wheel, rear-wheel or four-wheel drive. A dual brake system allows front-, rear- or four-wheel braking. A minicomputer-based data acquisition system is installed in the vehicle to control data gathering and to process the data. (The remainder of this record is figure-list residue: dual brake system control valves; schematic of the modified brake system; air-shock-absorber regulator.)

  14. Advanced Automation for Ion Trap Mass Spectrometry-New Opportunities for Real-Time Autonomous Analysis

    NASA Technical Reports Server (NTRS)

    Palmer, Peter T.; Wong, C. M.; Salmonson, J. D.; Yost, R. A.; Griffin, T. P.; Yates, N. A.; Lawless, James G. (Technical Monitor)

    1994-01-01

    The utility of MS/MS for both target compound analysis and the structure elucidation of unknowns has been described in a number of references. A broader acceptance of this technique has not yet been realized as it requires large, complex, and costly instrumentation which has not been competitive with more conventional techniques. Recent advancements in ion trap mass spectrometry promise to change this situation. Although the ion trap's small size, sensitivity, and ability to perform multiple stages of mass spectrometry have made it eminently suitable for on-line, real-time monitoring applications, advanced automation techniques are required to make these capabilities more accessible to non-experts. Towards this end, we have developed custom software for the design and implementation of MS/MS experiments. This software allows the user to take full advantage of the ion trap's versatility with respect to ionization techniques, scan proxies, and ion accumulation/ejection methods. Additionally, expert system software has been developed for autonomous target compound analysis. This software has been linked to the ion trap control software and a commercial data system to bring all of the steps in the analysis cycle under control of the expert system. These software development efforts and their utilization for a number of trace analysis applications will be described.

  15. Cavity-Enhanced Quantum-Cascade Laser-Based Instrument for Trace gas Measurements

    NASA Astrophysics Data System (ADS)

    Provencal, R.; Gupta, M.; Owano, T.; Baer, D.; Ricci, K.; O'Keefe, A.

    2005-12-01

    An autonomous instrument based on Off-Axis Integrated Cavity Output Spectroscopy has been successfully deployed for measurements of CO in the troposphere and tropopause onboard a NASA DC-8 aircraft. The instrument consists of a measurement cell comprised of two high reflectivity mirrors, a continuous-wave quantum-cascade laser, gas sampling system, control and data acquisition electronics, and data analysis software. The instrument reports CO mixing ratio at a 1-Hz rate based on measured absorption, gas temperature and pressure using Beer's Law. During several flights in May-June 2004 and January 2005 that reached altitudes of 41000 ft, the instrument recorded CO values with a precision of 0.2 ppbv (1-s averaging time). Despite moderate turbulence and measurements of particulate-laden airflows, the instrument operated consistently and did not require any maintenance, mirror cleaning, or optical realignment during the flights. We will also present recent development efforts to extend the instrument's capabilities for the measurements of CH4, N2O and CO in real time.
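    A back-of-the-envelope sketch of the Beer's-law retrieval mentioned above: the absorber number density follows from the measured transmission, and dividing by the total number density from the ideal gas law gives the mixing ratio. The cross section, effective path length and sample values below are invented placeholders, not the instrument's calibration.

```python
# Sketch: mixing ratio from Beer's law plus the ideal gas law.
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K

def co_mixing_ratio(i_ratio, sigma_cm2, path_cm, pressure_pa, temp_k):
    """i_ratio = I/I0 (transmitted over incident intensity)."""
    absorber_density = -math.log(i_ratio) / (sigma_cm2 * path_cm)    # molecules / cm^3
    total_density = pressure_pa / (K_B * temp_k) * 1e-6              # molecules / cm^3
    return absorber_density / total_density

# Hypothetical numbers: 1e-19 cm^2 cross section, 3 km cavity-enhanced effective path,
# 250 hPa, 220 K, and 0.3% absorption -> mixing ratio on the order of 1e-8 (tens of ppbv).
print(co_mixing_ratio(0.997, 1e-19, 3e5, 25000.0, 220.0))
```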

  16. A low-cost programmable pulse generator for physiology and behavior

    PubMed Central

    Sanders, Joshua I.; Kepecs, Adam

    2014-01-01

    Precisely timed experimental manipulations of the brain and its sensory environment are often employed to reveal principles of brain function. While complex and reliable pulse trains for temporal stimulus control can be generated with commercial instruments, contemporary options remain expensive and proprietary. We have developed Pulse Pal, an open source device that allows users to create and trigger software-defined trains of voltage pulses with high temporal precision. Here we describe Pulse Pal’s circuitry and firmware, and characterize its precision and reliability. In addition, we supply online documentation with instructions for assembling, testing and installing Pulse Pal. While the device can be operated as a stand-alone instrument, we also provide application programming interfaces in several programming languages. As an inexpensive, flexible and open solution for temporal control, we anticipate that Pulse Pal will be used to address a wide range of instrumentation timing challenges in neuroscience research. PMID:25566051

  17. New customizable phased array UT instrument opens door for furthering research and better industrial implementation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dao, Gavin; Ginzel, Robert

    2014-02-18

    Phased array UT as an inspection technique in itself continues to gain wide acceptance. However, there is much room for improvement in terms of implementation of Phased Array (PA) technology for every unique NDT application across several industries (e.g. oil and petroleum, nuclear and power generation, steel manufacturing, etc.). Having full control of the phased array instrument and customizing a software solution is necessary for more seamless and efficient inspections, from setting the PA parameters, collecting data and reporting, to the final analysis. NDT researchers and academics also need a flexible and open platform to be able to control various aspects of the phased array process. A high performance instrument with advanced PA features, faster data rates, a smaller form factor, and capability to adapt to specific applications, will be discussed.

  18. The Gemini-South MCAO operational model: insights on a new era of telescope operation

    NASA Astrophysics Data System (ADS)

    Trancho, Gelys; Bec, Matthieu; Artigau, Etienne; d'Orgeville, Celine; Gratadour, Damien; Rigaut, Francois J.; Walls, Brian

    2008-07-01

    The Gemini Observatory is implementing a Multi-Conjugate Adaptive Optics (MCAO) system as a facility instrument for the Gemini South telescope (GeMS). The system will include 5 Laser Guide Stars, 3 Natural Guide Stars, and 3 deformable mirrors, optically conjugated at different altitudes, to achieve near-uniform atmospheric compensation over a one arc minute square field of view. This setup implies some level of operational complexity. In this paper we describe how GeMS will be integrated into the flow of Gemini operations, from the observing procedures necessary to execute the programs in the queue (telescope control software, observing tools, sequence executor) to the safety implementation needed such as spotters/ASCAM, space command and laser traffic control software.

  19. An Integrated Simulation Module for Cyber-Physical Automation Systems †

    PubMed Central

    Ferracuti, Francesco; Freddi, Alessandro; Monteriù, Andrea; Prist, Mariorosario

    2016-01-01

    The integration of Wireless Sensors Networks (WSNs) into Cyber Physical Systems (CPSs) is an important research problem to solve in order to increase the performances, safety, reliability and usability of wireless automation systems. Due to the complexity of real CPSs, emulators and simulators are often used to replace the real control devices and physical connections during the development stage. The most widespread simulators are free, open source, expandable, flexible and fully integrated into mathematical modeling tools; however, the connection at a physical level and the direct interaction with the real process via the WSN are only marginally tackled; moreover, the simulated wireless sensor motes are not able to generate the analogue output typically required for control purposes. A new simulation module for the control of a wireless cyber-physical system is proposed in this paper. The module integrates the COntiki OS JAva Simulator (COOJA), a cross-level wireless sensor network simulator, and the LabVIEW system design software from National Instruments. The proposed software module has been called “GILOO” (Graphical Integration of Labview and cOOja). It allows one to develop and to debug control strategies over the WSN both using virtual or real hardware modules, such as the National Instruments Real-Time Module platform, the CompactRio, the Supervisory Control And Data Acquisition (SCADA), etc. To test the proposed solution, we decided to integrate it with one of the most popular simulators, i.e., the Contiki OS, and wireless motes, i.e., the Sky mote. As a further contribution, the Contiki Sky DAC driver and a new “Advanced Sky GUI” have been proposed and tested in the COOJA Simulator in order to provide the possibility to develop control over the WSN. To test the performances of the proposed GILOO software module, several experimental tests have been made, and interesting preliminary results are reported. The GILOO module has been applied to a smart home mock-up where a networked control has been developed for the LED lighting system. PMID:27164109

  20. An Integrated Simulation Module for Cyber-Physical Automation Systems.

    PubMed

    Ferracuti, Francesco; Freddi, Alessandro; Monteriù, Andrea; Prist, Mariorosario

    2016-05-05

    The integration of Wireless Sensors Networks (WSNs) into Cyber Physical Systems (CPSs) is an important research problem to solve in order to increase the performances, safety, reliability and usability of wireless automation systems. Due to the complexity of real CPSs, emulators and simulators are often used to replace the real control devices and physical connections during the development stage. The most widespread simulators are free, open source, expandable, flexible and fully integrated into mathematical modeling tools; however, the connection at a physical level and the direct interaction with the real process via the WSN are only marginally tackled; moreover, the simulated wireless sensor motes are not able to generate the analogue output typically required for control purposes. A new simulation module for the control of a wireless cyber-physical system is proposed in this paper. The module integrates the COntiki OS JAva Simulator (COOJA), a cross-level wireless sensor network simulator, and the LabVIEW system design software from National Instruments. The proposed software module has been called "GILOO" (Graphical Integration of Labview and cOOja). It allows one to develop and to debug control strategies over the WSN both using virtual or real hardware modules, such as the National Instruments Real-Time Module platform, the CompactRio, the Supervisory Control And Data Acquisition (SCADA), etc. To test the proposed solution, we decided to integrate it with one of the most popular simulators, i.e., the Contiki OS, and wireless motes, i.e., the Sky mote. As a further contribution, the Contiki Sky DAC driver and a new "Advanced Sky GUI" have been proposed and tested in the COOJA Simulator in order to provide the possibility to develop control over the WSN. To test the performances of the proposed GILOO software module, several experimental tests have been made, and interesting preliminary results are reported. The GILOO module has been applied to a smart home mock-up where a networked control has been developed for the LED lighting system.

  1. Research study demonstrates computer simulation can predict warpage and assist in its elimination

    NASA Astrophysics Data System (ADS)

    Glozer, G.; Post, S.; Ishii, K.

    1994-10-01

    Programs for predicting warpage in injection molded parts are relatively new. Commercial software for simulating the flow and cooling stages of injection molding has steadily gained acceptance; however, warpage software is not yet as readily accepted. This study focused on gaining an understanding of the predictive capabilities of the warpage software. The following aspects of this study were unique. (1) Quantitative results were found using a statistically designed set of experiments. (2) Comparisons between experimental and simulation results were made with parts produced in a well-instrumented and controlled injection molding machine. (3) The experimental parts were accurately measured on a coordinate measuring machine with a non-contact laser probe. (4) The effect of part geometry on warpage was investigated.

  2. CONRAD Software Architecture

    NASA Astrophysics Data System (ADS)

    Guzman, J. C.; Bennett, T.

    2008-08-01

    The Convergent Radio Astronomy Demonstrator (CONRAD) is a collaboration between the computing teams of two SKA pathfinder instruments, MeerKAT (South Africa) and ASKAP (Australia). Our goal is to produce the required common software to operate, process and store the data from the two instruments. Both instruments are synthesis arrays composed of a large number of antennas (40 - 100) operating at centimeter wavelengths with wide-field capabilities. Key challenges are the processing of high volume of data in real-time as well as the remote mode of operations. Here we present the software architecture for CONRAD. Our design approach is to maximize the use of open solutions and third-party software widely deployed in commercial applications, such as SNMP and LDAP, and to utilize modern web-based technologies for the user interfaces, such as AJAX.

  3. Shaping ability of reciprocating motion of WaveOne and HyFlex in moderate to severe curved canals: A comparative study with cone beam computed tomography

    PubMed Central

    Simpsy, Gurram Samuel; Sajjan, Girija S.; Mudunuri, Padmaja; Chittem, Jyothi; Prasanthi, Nalam N. V. D.; Balaga, Pankaj

    2016-01-01

    Introduction: The M-Wire and reciprocating motion of WaveOne and the controlled memory (CM) wire of HyFlex are recent innovations based on thermal treatment. Therefore, a study was planned to evaluate the shaping ability of the reciprocating motion of WaveOne and of HyFlex using cone beam computed tomography (CBCT). Methodology: Forty-five freshly extracted mandibular teeth were selected and stored in saline until use. All teeth were scanned pre- and post-operatively using CBCT (Kodak 9000). All teeth were accessed and divided into three groups. (1) Group 1 (control, n = 15): Instrumented with ProTaper. (2) Group 2 (n = 15): Instrumented with primary file (8%/25) WaveOne. (3) Group 3 (n = 15): Instrumented with (4%/25) HyFlex CM. Sections at 1, 3, and 5 mm were obtained from the pre- and post-operative scans. Measurement was done using CS3D software and Adobe Photoshop software. Apical transportation and degree of straightening were measured and statistically analyzed. Results: HyFlex showed less apical transportation than the other groups at 1 and 3 mm. WaveOne showed a lesser degree of straightening than the other groups. Conclusion: The present study concluded that all systems can be employed in routine endodontics, whereas HyFlex and WaveOne can also be employed in severely curved canals. PMID:27994323

  4. Laboratory and field based evaluation of chromatography ...

    EPA Pesticide Factsheets

    The Monitor for AeRosols and GAses in ambient air (MARGA) is an on-line ion-chromatography-based instrument designed for speciation of the inorganic gas and aerosol ammonium-nitrate-sulfate system. Previous work to characterize the performance of the MARGA has been primarily based on field comparison to other measurement methods to evaluate accuracy. While such studies are useful, the underlying reasons for disagreement among methods are not always clear. This study examines aspects of MARGA accuracy and precision specifically related to automated chromatography analysis. Using laboratory standards, analytical accuracy, precision, and method detection limits derived from the MARGA chromatography software are compared to an alternative software package (Chromeleon, Thermo Scientific Dionex). Field measurements are used to further evaluate instrument performance, including the MARGA's use of an internal LiBr standard to control accuracy. Using gas/aerosol ratios and aerosol neutralization state as a case study, the impact of chromatography on measurement error is assessed. The new generation of on-line chromatography-based gas and particle measurement systems has many advantages, including simultaneous analysis of multiple pollutants. The Monitor for Aerosols and Gases in Ambient Air (MARGA) is such an instrument that is used in North America, Europe, and Asia for atmospheric process studies as well as routine monitoring. While the instrument has been evaluat

  5. Integrating a flexible modeling framework (FMF) with the network security assessment instrument to reduce software security risk

    NASA Technical Reports Server (NTRS)

    Gilliam, D. P.; Powell, J. D.

    2002-01-01

    This paper presents a portion of an overall research project on the generation of the network security assessment instrument to aid developers in assessing and assuring the security of software in the development and maintenance lifecycles.

  6. Practical experience with test-driven development during commissioning of the multi-star AO system ARGOS

    NASA Astrophysics Data System (ADS)

    Kulas, M.; Borelli, Jose Luis; Gässler, Wolfgang; Peter, Diethard; Rabien, Sebastian; Orban de Xivry, Gilles; Busoni, Lorenzo; Bonaglia, Marco; Mazzoni, Tommaso; Rahmer, Gustavo

    2014-07-01

    Commissioning time for an instrument at an observatory is precious, especially the night time. Whenever astronomers come up with a software feature request or point out a software defect, the software engineers have the task of finding a solution and implementing it as fast as possible. In this project phase, the software engineers work under time pressure and stress to deliver functional instrument control software (ICS). The shortness of development time during commissioning is a constraint for software engineering teams and applies to the ARGOS project as well. The goal of the ARGOS (Advanced Rayleigh guided Ground layer adaptive Optics System) project is the upgrade of the Large Binocular Telescope (LBT) with an adaptive optics (AO) system consisting of six Rayleigh laser guide stars and wavefront sensors. For developing the ICS, we used Test-Driven Development (TDD), whose main rule demands that the programmer write test code before production code. TDD can thereby yield a software system that grows without defects and eases maintenance. Having applied TDD in a calm and relaxed environment such as the office and the laboratory, the ARGOS team had already profited from its benefits. Before the commissioning, we were worried that the time pressure in that tough project phase would force us to drop TDD because we would spend more time writing test code than it would be worth. Despite this initial concern, we were able to keep TDD most of the time in this project phase as well. This report describes the practical application and performance of TDD, including its benefits, limitations and problems during the ARGOS commissioning. Furthermore, it covers our experience with pair programming and continuous integration at the telescope.
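    The test-before-code rule is easiest to see in a tiny example. The sketch below is written in the style of pytest; the centroid function and its expected behaviour are invented for illustration and are not part of the ARGOS ICS.

```python
# Minimal TDD illustration: the test is written first and initially fails,
# then just enough production code is written to make it pass.

# --- test written first ---
def test_centroid_of_symmetric_spot():
    image = [[0, 0, 0],
             [0, 9, 0],
             [0, 0, 0]]
    assert centroid(image) == (1.0, 1.0)

# --- production code written afterwards to satisfy the test ---
def centroid(image):
    """Intensity-weighted centre of a 2-D pixel array, returned as (x, y)."""
    total = x_sum = y_sum = 0.0
    for y, row in enumerate(image):
        for x, value in enumerate(row):
            total += value
            x_sum += x * value
            y_sum += y * value
    return (x_sum / total, y_sum / total)
```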

  7. KLASS: Kennedy Launch Academy Simulation System

    NASA Technical Reports Server (NTRS)

    Garner, Lesley C.

    2007-01-01

    The software provides access to several sophisticated scientific instruments: a Scanning Electron Microscope (SEM), a Light Microscope, a Scanning Probe Microscope (covering Scanning Tunneling, Atomic Force, and Magnetic Force microscopy), and an Energy Dispersive Spectrometer for the SEM. Flash animation videos explain how each of the instruments works, how they are used at NASA, and the required sample preparation. Measuring and labeling tools are provided with each instrument, and users gain hands-on experience controlling the virtual instrument to conduct investigations, much like the real scientists at NASA do. The system has a very open architecture, is open source on SourceForge, and makes extensive use of XML. The target audience is high school and entry-level college students. "Many beginning students never get closer to an electron microscope than the photos in their textbooks. But anyone can get a sense of what the instrument can do by downloading this simulator from NASA's Kennedy Space Center." - Science Magazine, April 8, 2005

  8. Direct Digital Control Study.

    DTIC Science & Technology

    1985-02-01

    (Fragmented record.) The study covers direct digital control (DDC) energy-management routines: hot deck/cold deck reset, reheat coil reset, steam boiler optimization, hot water outside air reset, chiller optimization, and chilled water temperature reset, together with the programming techniques for each type of installed DDC needed to effect changes in operating setpoints and application programs. ...can be changed without recalibration of instrumentation devices. Changes to the application software, operating setpoints and parameters require the

  9. [The development of an intelligent four-channel aggregometer].

    PubMed

    Guan, X; Wang, M

    1998-07-01

    The paper introduces the hardware and software design of the instrument. We use an 89C52 single-chip computer as the microprocessor to control the amplifier and the AD and DA conversion chips, and to realize sampling, data processing, printout and supervision. The final result is printed out in the form of data and an aggregation curve on a PP40 plotter.

  10. Monitoring and Hardware Management for Critical Fusion Plasma Instrumentation

    NASA Astrophysics Data System (ADS)

    Carvalho, Paulo F.; Santos, Bruno; Correia, Miguel; Combo, Álvaro M.; Rodrigues, AntÓnio P.; Pereira, Rita C.; Fernandes, Ana; Cruz, Nuno; Sousa, Jorge; Carvalho, Bernardo B.; Batista, AntÓnio J. N.; Correia, Carlos M. B. A.; Gonçalves, Bruno

    2018-01-01

    Controlled nuclear fusion aims to obtain energy from collisions of particles confined inside a nuclear reactor (tokamak). These ionized particles, heavier isotopes of hydrogen, are the main constituents of the plasma, which is kept at high temperatures (millions of degrees Celsius). Due to the high temperatures and the magnetic confinement, the plasma is exposed to several sources of instability, which require a set of procedures from the control and data acquisition systems throughout fusion experiments. Control and data acquisition systems often used in nuclear fusion experiments are based on the Advanced Telecommunications Computing Architecture (AdvancedTCA®) standard introduced by the PCI Industrial Computer Manufacturers Group (PICMG®) to meet the demands of telecommunications, which require the transport of large amounts of data (TB) at high transfer rates (Gb/s), and to ensure high availability, including features such as reliability, serviceability and redundancy. For efficient plasma control, systems are required to collect large amounts of data, process them, store them for later analysis, make critical decisions in real time and provide status reports either on the experiment itself or on the electronic instrumentation involved. Moreover, systems should also ensure the correct handling of detected anomalies and identified faults, and notify the system operator of events that occurred, decisions taken, and changes implemented. Therefore, for everything to work in compliance with specifications, the instrumentation must include management and monitoring mechanisms for both hardware and software. These mechanisms should check the system status by reading sensors, manage events, update inventory databases with the hardware components in use and under maintenance, store collected information, update firmware and installed software modules, and configure and handle alarms to detect possible system failures and prevent emergency scenarios. The goal is to ensure high availability of the system and to provide safe operation, experiment security and data validation for the fusion experiment. This work aims to contribute to the joint effort of the IPFN control and data acquisition group to develop a hardware management and monitoring application for control and data acquisition instrumentation specially designed for large-scale tokamaks like ITER.
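    The sensor-reading and alarm-handling duties described above amount to a periodic polling loop. The sketch below illustrates that pattern; the sensor names, thresholds and the read_sensor() backend are invented stand-ins, not part of the AdvancedTCA shelf-management interface or the IPFN application.

```python
# Sketch: poll a few hardware sensors, log their values and raise alarms on limit violations.
import logging
import time

THRESHOLDS = {"board_temp_C": 70.0, "fan_rpm": 2000.0}   # hypothetical limits

def read_sensor(name):
    """Stand-in for a shelf-manager / IPMI query; returns fixed example values."""
    return {"board_temp_C": 45.0, "fan_rpm": 5200.0}[name]

def monitor(period_s=5.0, cycles=3):
    logging.basicConfig(level=logging.INFO)
    log = logging.getLogger("hwmon")
    for _ in range(cycles):
        for name, limit in THRESHOLDS.items():
            value = read_sensor(name)
            # Temperatures alarm above the limit, fan speeds alarm below it.
            violated = value > limit if name.endswith("_C") else value < limit
            if violated:
                log.warning("ALARM %s = %s (limit %s)", name, value, limit)
            else:
                log.info("%s = %s ok", name, value)
        time.sleep(period_s)

monitor()
```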

  11. Cross-instrument Analysis Correlation Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McJunkin, Timothy R.

    This program has been designed to assist with the tracking of a sample from one analytical instrument to another, such as SEM, microscopes, micro X-ray diffraction and other instruments where particular positions/locations on the sample are examined, photographed, etc. The software makes it easy to enter the positions of fiducials and locations of interest so that, in a future session on the same or a different instrument, the positions of interest can be re-found by using the known fiducial locations in the current and reference sessions to transform each point into the current session's coordinate system. The software is dialog-box driven, guiding the user through the necessary data entry and program choices. Information is stored in a series of text-based Extensible Markup Language (XML) files.
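    The coordinate transfer described above can be illustrated with a small least-squares fit: match fiducials measured in a reference session against the same fiducials in the current session, solve for a 2-D similarity transform (rotation, scale, translation), and apply it to each point of interest. The fiducial coordinates below are invented for illustration and are not data from the program.

```python
# Sketch: fit a 2-D similarity transform from fiducial pairs and re-locate a point of interest.
import numpy as np

def fit_similarity(ref, cur):
    """ref, cur: (N, 2) arrays of matching fiducial coordinates (N >= 2)."""
    rows, rhs = [], []
    for (x, y), (u, v) in zip(ref, cur):
        rows.append([x, -y, 1, 0]); rhs.append(u)
        rows.append([y,  x, 0, 1]); rhs.append(v)
    a, b, tx, ty = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)[0]
    return a, b, tx, ty

def transform(params, pt):
    a, b, tx, ty = params
    x, y = pt
    return (a * x - b * y + tx, b * x + a * y + ty)

ref_fids = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])          # reference session (mm)
cur_fids = np.array([[5.0, 2.0], [14.9, 3.1], [3.9, 11.9]])          # same fiducials, current session
params = fit_similarity(ref_fids, cur_fids)
print(transform(params, (4.0, 7.0)))   # a stored point of interest, re-found in the current session
```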

  12. NASA's Optical Program on Ascension Island: Bringing MCAT to Life as the Eugene Stansbery-Meter Class Autonomous Telescope (ES-MCAT)

    NASA Astrophysics Data System (ADS)

    Lederer, S. M.; Hickson, P.; Cowardin, H. M.; Buckalew, B.; Frith, J.; Alliss, R.

    In June 2015, the construction of the Meter Class Autonomous Telescope was completed and MCAT saw the light of the stars for the first time. In 2017, MCAT was newly dedicated as the Eugene Stansbery-MCAT telescope by NASA’s Orbital Debris Program Office (ODPO), in honour of his inspiration and dedication to this newest optical member of the NASA ODPO. Since that time, MCAT has viewed the skies with one engineering camera and two scientific cameras, and the ODPO optical team has begun the process of vetting the entire system. The full system vetting includes verification and validation of: (1) the hardware comprising the system (e.g. the telescopes and its instruments, the dome, weather systems, all-sky camera, FLIR cloud infrared camera, etc.), (2) the custom-written Observatory Control System (OCS) master software designed to autonomously control this complex system of instruments, each with its own control software, and (3) the custom written Orbital Debris Processing software for post-processing the data. ES-MCAT is now capable of autonomous observing to include Geosynchronous survey, TLE (Two-line element) tracking of individual catalogued debris at all orbital regimes (Low-Earth Orbit all the way to Geosynchronous (GEO) orbit), tracking at specified non-sidereal rates, as well as sidereal rates for proper calibration with standard stars. Ultimately, the data will be used for validation of NASA’s Orbital Debris Engineering Model, ORDEM, which aids in engineering designs of spacecraft that require knowledge of the orbital debris environment and long-term risks for collisions with Resident Space Objects (RSOs).

  13. A new telescope control software for the Mayall 4-meter telescope

    NASA Astrophysics Data System (ADS)

    Abareshi, Behzad; Marshall, Robert; Gott, Shelby; Sprayberry, David; Cantarutti, Rolando; Joyce, Dick; Williams, Doug; Probst, Ronald; Reetz, Kristin; Paat, Anthony; Butler, Karen; Soto, Christian; Dey, Arjun; Summers, David

    2016-07-01

    The Mayall 4-meter telescope recently went through a major modernization of its telescope control system in preparation for DESI. We describe MPK (Mayall Pointing Kernel), our new software for telescope control. MPK outputs a 20Hz position-based trajectory with a velocity component, which feeds into Mayall's new servo system over a socket. We wrote a simple yet realistic servo simulator that let us develop MPK mostly without access to real hardware, and also lets us provide other teams with a Mayall simulator as test bed for development of new instruments. MPK has a small core comprised of prioritized, soft real-time threads. Access to the core's services is via MPK's main thread, a complete, interactive Tcl/Tk shell, which gives us the power and flexibility of a scripting language to add any other features, from GUIs, to modules for interaction with critical subsystems like dome or guider, to an API for networked clients of a new instrument (e.g., DESI). MPK is designed for long term maintainability: it runs on a stock computer and Linux OS, and uses only standard, open source libraries, except for commercial software that comes with source code in ANSI C/C++. We discuss the technical details of how MPK combines the Reflexxes motion library with the TCSpk/TPK pointing library to generically handle any motion requests, from slews to offsets to sidereal or non-sidereal tracking. We show how MPK calculates when the servos have reached a steady state. We also discuss our TPOINT modeling strategy and report performance results.
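    The core idea of streaming a position-plus-velocity trajectory to the servo system at a fixed rate can be sketched in a few lines. The binary message layout, port and example track below are invented for illustration; MPK's actual socket protocol and its Reflexxes/TCSpk internals are not reproduced here.

```python
# Sketch: send (position, velocity) setpoints to a servo controller over a socket at 20 Hz.
import socket
import struct
import time

def stream_trajectory(host, port, setpoints, rate_hz=20.0):
    """setpoints: iterable of (az_deg, el_deg, az_vel_deg_s, el_vel_deg_s) tuples."""
    period = 1.0 / rate_hz
    with socket.create_connection((host, port)) as s:
        next_t = time.monotonic()
        for az, el, vaz, vel in setpoints:
            s.sendall(struct.pack("!4d", az, el, vaz, vel))   # hypothetical wire format
            next_t += period
            time.sleep(max(0.0, next_t - time.monotonic()))

# Example: a constant-velocity, sidereal-like track (values are arbitrary).
track = [(120.0 + 0.004 * i, 45.0, 0.08, 0.0) for i in range(100)]
# stream_trajectory("servo.example.org", 9000, track)   # requires a listening servo endpoint
```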

  14. NASA's Optical Program on Ascension Island: Bringing MCAT to Life as the Eugene Stansbery-Meter Class Autonomous Telescope (ES-MCAT)

    NASA Technical Reports Server (NTRS)

    Lederer, S. M.; Hickson, P.; Cowardin, H. M.; Buckalew, B.; Frith, J.; Alliss, R.

    2017-01-01

    In June 2015, the construction of the Meter Class Autonomous Telescope was completed and MCAT saw the light of the stars for the first time. In 2017, MCAT was newly dedicated as the Eugene Stansbery-MCAT telescope by NASA's Orbital Debris Program Office (ODPO), in honor of his inspiration and dedication to this newest optical member of the NASA ODPO. Since that time, MCAT has viewed the skies with one engineering camera and two scientific cameras, and the ODPO optical team has begun the process of vetting the entire system. The full system vetting includes verification and validation of: (1) the hardware comprising the system (e.g. the telescopes and its instruments, the dome, weather systems, all-sky camera, FLIR cloud infrared camera, etc.), (2) the custom-written Observatory Control System (OCS) master software designed to autonomously control this complex system of instruments, each with its own control software, and (3) the custom written Orbital Debris Processing software for post-processing the data. ES-MCAT is now capable of autonomous observing to include Geosynchronous survey, TLE (Two-line element) tracking of individual catalogued debris at all orbital regimes (Low-Earth Orbit all the way to Geosynchronous (GEO) orbit), tracking at specified non-sidereal rates, as well as sidereal rates for proper calibration with standard stars. Ultimately, the data will be used for validation of NASA's Orbital Debris Engineering Model, ORDEM, which aids in engineering designs of spacecraft that require knowledge of the orbital debris environment and long-term risks for collisions with Resident Space Objects (RSOs).

  15. Solar Constant (SOLCON) Experiment: Ground Support Equipment (GSE) software development

    NASA Technical Reports Server (NTRS)

    Gibson, M. Alan; Thomas, Susan; Wilson, Robert

    1991-01-01

    The Solar Constant (SOLCON) Experiment, the objective of which is to determine the solar constant value and its variability, is scheduled for launch as part of the Space Shuttle/Atmospheric Laboratory for Application and Science (ATLAS) spacelab mission. The Ground Support Equipment (GSE) software was developed to monitor and analyze the SOLCON telemetry data during flight and to test the instrument on the ground. The design and development of the GSE software are discussed. The SOLCON instrument was tested during Davos International Solar Intercomparison, 1989 and the SOLCON data collected during the tests are analyzed to study the behavior of the instrument.

  16. Final Technical Report on Quantifying Dependability Attributes of Software Based Safety Critical Instrumentation and Control Systems in Nuclear Power Plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smidts, Carol; Huang, Funqun; Li, Boyuan

    With the current transition from analog to digital instrumentation and control systems in nuclear power plants, the number and variety of software-based systems have significantly increased. The sophisticated nature and increasing complexity of software make trust in these systems a significant challenge. The trust placed in a software system is typically termed software dependability. Software dependability analysis faces uncommon challenges since software systems' characteristics differ from those of hardware systems. The lack of systematic science-based methods for quantifying the dependability attributes in software-based instrumentation as well as control systems in safety critical applications has proved to be a significant inhibitor to the expanded use of modern digital technology in the nuclear industry. Dependability refers to the ability of a system to deliver a service that can be trusted. Dependability is commonly considered as a general concept that encompasses different attributes, e.g., reliability, safety, security, availability and maintainability. Dependability research has progressed significantly over the last few decades. For example, various assessment models and/or design approaches have been proposed for software reliability, software availability and software maintainability. Advances have also been made to integrate multiple dependability attributes, e.g., integrating security with other dependability attributes, measuring availability and maintainability, modeling reliability and availability, quantifying reliability and security, exploring the dependencies between security and safety and developing integrated analysis models. However, there is still a lack of understanding of the dependencies between various dependability attributes as a whole and of how such dependencies are formed. To address the need for quantification and give a more objective basis to the review process -- therefore reducing regulatory uncertainty -- measures and methods are needed to assess dependability attributes early on, as well as throughout the life-cycle process of software development. In this research, extensive expert opinion elicitation is used to identify the measures and methods for assessing software dependability. Semi-structured questionnaires were designed to elicit expert knowledge. A new notation system, Causal Mechanism Graphing, was developed to extract and represent such knowledge. The Causal Mechanism Graphs were merged, thus obtaining the consensus knowledge shared by the domain experts. In this report, we focus on how software contributes to dependability. However, software dependability is not discussed separately from the context of systems or socio-technical systems. Specifically, this report focuses on software dependability, reliability, safety, security, availability, and maintainability. Our research was conducted in the sequence of stages found below. Each stage is further examined in its corresponding chapter. Stage 1 (Chapter 2): Elicitation of causal maps describing the dependencies between dependability attributes. These causal maps were constructed using expert opinion elicitation. This chapter describes the expert opinion elicitation process, the questionnaire design, the causal map construction method and the causal maps obtained. Stage 2 (Chapter 3): Elicitation of the causal map describing the occurrence of the event of interest for each dependability attribute.
The causal mechanisms for the “event of interest” were extracted for each of the software dependability attributes. The “event of interest” for a dependability attribute is generally considered to be the “attribute failure”, e.g. security failure. The extraction was based on the analysis of expert elicitation results obtained in Stage 1. Stage 3 (Chapter 4): Identification of relevant measurements. Measures for the “events of interest” and their causal mechanisms were obtained from expert opinion elicitation for each of the software dependability attributes. The measures extracted are presented in this chapter. Stage 4 (Chapter 5): Assessment of the coverage of the causal maps via measures. Coverage was assessed to determine whether the measures obtained were sufficient to quantify software dependability, and what measures are further required. Stage 5 (Chapter 6): Identification of “missing” measures and measurement approaches for concepts not covered. New measures, for concepts that had not been covered sufficiently as determined in Stage 4, were identified using supplementary expert opinion elicitation as well as literature reviews. Stage 6 (Chapter 7): Building of a detailed quantification model based on the causal maps and measurements obtained. The ability to derive such a quantification model shows that the causal models and measurements derived from the previous stages (Stage 1 to Stage 5) can form the technical basis for developing dependability quantification models. Scope restrictions have led us to prioritize this demonstration effort. The demonstration was focused on a critical system, i.e. the reactor protection system. For this system, a ranking of the software dependability attributes by nuclear stakeholders was developed. As expected for this application, the stakeholder ranking identified safety as the most critical attribute to be quantified. A safety quantification model limited to the requirements phase of development was built. Two case studies were conducted for verification. A preliminary control gate for software safety for the requirements stage was proposed and applied to the first case study. The control gate allows a cost-effective selection of the duration of the requirements phase.
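
    A minimal sketch of the idea behind merging elicited causal maps into a consensus graph, using generic directed graphs rather than the report's Causal Mechanism Graphing notation; the node names are invented for illustration.

      # Minimal sketch: each expert's causal map as a directed graph, merged into one.
      import networkx as nx

      expert_a = nx.DiGraph()
      expert_a.add_edge("requirements defect", "software fault")
      expert_a.add_edge("software fault", "safety failure")

      expert_b = nx.DiGraph()
      expert_b.add_edge("inadequate testing", "software fault")
      expert_b.add_edge("software fault", "availability loss")

      consensus = nx.compose(expert_a, expert_b)   # union of nodes and causal edges
      print(sorted(consensus.edges()))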

  17. Penn State University ground software support for X-ray missions.

    NASA Astrophysics Data System (ADS)

    Townsley, L. K.; Nousek, J. A.; Corbet, R. H. D.

    1995-03-01

    The X-ray group at Penn State is charged with two software development efforts in support of X-ray satellite missions. As part of the ACIS instrument team for AXAF, the authors are developing part of the ground software to support the instrument's calibration. They are also designing a translation program for Ginga data, to change it from the non-standard FRF format, which closely parallels the original telemetry format, to FITS.
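
    A hedged sketch of the general shape of such a format translation: read_frf_record() is a hypothetical placeholder, since the FRF telemetry layout is not described here, and the FITS side uses the astropy.io.fits library rather than the group's actual translator.

      # Sketch of a telemetry-format-to-FITS translation; the FRF reader is a placeholder.
      import numpy as np
      from astropy.io import fits

      def read_frf_record(path):
          # Placeholder: a real translator would decode the telemetry-like FRF structure.
          return np.zeros((64, 64), dtype=np.int16), {"TELESCOP": "GINGA", "INSTRUME": "LAC"}

      data, meta = read_frf_record("ginga_obs.frf")
      hdu = fits.PrimaryHDU(data=data)
      for key, value in meta.items():
          hdu.header[key] = value          # copy the recovered metadata into the FITS header
      hdu.writeto("ginga_obs.fits", overwrite=True)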

  18. Design and Testing of a Controller for the Martian Atmosphere Pressure and Humidity Instrument DREAMS-P/H

    NASA Astrophysics Data System (ADS)

    Tapani Nikkanen, Timo; Schmidt, Walter; Genzer, Maria; Harri, Ari-Matti; Haukka, Harri

    2013-04-01

    The European Space Agency (ESA), driven by the goal of performing a soft landing on Mars, is planning to launch the Entry, descent and landing Demonstrator Module (EDM) [1] simultaneously with the Trace Gas Orbiter (TGO) as part of the ExoMars program towards Mars in 2016. As a secondary objective, the EDM will gather meteorological data and observe the electrical environment of the landing site with its Dust characterisation, Risk assessment, and Environmental Analyser on the Martian Surface (DREAMS). The Finnish Meteorological Institute (FMI) is participating in the project by designing, building and testing a pressure and a humidity instrument for Mars, named DREAMS-P and DREAMS-H, respectively. The instruments are based on previous FMI designs, including ones flown on board the Huygens, Phoenix and Mars Science Laboratory missions [2]. Traditionally, the FMI pressure and humidity instruments have been controlled by an FPGA. However, the need to incorporate more autonomy and modifiability into instruments and to cut development time and component costs stimulated interest in studying a Commercial Off-The-Shelf (COTS) Microcontroller Unit (MCU) based instrument design. Thus, in the DREAMS-P/H design, an automotive MCU is used as the instrument controller. The MCU has been qualified for space by tests performed both at and outside FMI. The DREAMS-P/H controller command and data interface utilizes an RS-422 connection to receive telecommands from and to transmit data to the Central Electronics Unit (CEU) of the DREAMS science package. The two pressure transducers of DREAMS-P and the one humidity transducer of DREAMS-H are controlled by a single MCU. The MCU controls the power flow for each transducer and performs pulse counting measurements on sensor and reference channels to retrieve scientific data. Pressure and humidity measurements are scheduled and set up according to a configuration table assigned to each transducer. The configuration tables can be modified during the flight. The software is entirely interrupt-driven, so the MCU enters a power-saving standby mode whenever possible. Any measurement or other operation can be stopped by simply interrupting the controller with a telecommand. Software and functional tests of the DREAMS-P/H controller are needed to verify the performance of the instrument in nominal conditions and the correct operation and error detection in anomalous conditions. The nominal-conditions tests range from simple functional and performance tests to longer simulations of continued operation and measurements. Continued-operation simulations can be implemented by executing accelerated runs of the expected normal measurement cycles. In contrast, anomalous-conditions tests are used to verify that the controller can handle bad telecommands or anomalous operation of the instrument transducers, for example in the case of malfunctioning sensors. Bad telecommand tests are done by feeding illegal parameters or scrambled telecommands to the controller. Malfunctioning sensors can be simulated by modifying the signals coming from the sensors and reference channels. All expected use cases and all imagined unexpected operating circumstances are studied to ensure that the system is robust. This also makes the planned modification of the design for other future missions easier and safer. References: [1] ESA ExoMars EDM mission: http://exploration.esa.int/science-e/www/object/index.cfm?fobjectid=47852 [2] FMI Space Projects website: http://space.fmi.fi/index.php?id=23
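
    As an illustration of configuration-table-driven measurement scheduling of the kind described, the following sketch uses invented table fields and a placeholder measurement routine; it is written in Python for clarity rather than for the automotive MCU, and the transducer names and intervals are only examples.

      # Illustrative scheduler driven by a per-transducer configuration table.
      import time

      # One entry per transducer: measurement interval and pulse-counting integration time.
      CONFIG_TABLE = {
          "DREAMS-P1": {"interval_s": 10, "count_time_s": 2},
          "DREAMS-P2": {"interval_s": 10, "count_time_s": 2},
          "DREAMS-H":  {"interval_s": 60, "count_time_s": 4},
      }

      def measure(name, count_time_s):
          # Placeholder for powering the transducer and counting pulses on the
          # sensor and reference channels.
          return {"transducer": name, "counts": 12345, "t": time.time()}

      next_due = {name: 0.0 for name in CONFIG_TABLE}
      for _ in range(3):                      # a few scheduler passes, for illustration
          now = time.time()
          for name, cfg in CONFIG_TABLE.items():
              if now >= next_due[name]:
                  print(measure(name, cfg["count_time_s"]))
                  next_due[name] = now + cfg["interval_s"]
          time.sleep(1)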

  19. Instrumentation complex for Langley Research Center's National Transonic Facility

    NASA Technical Reports Server (NTRS)

    Russell, C. H.; Bryant, C. S.

    1977-01-01

    The instrumentation discussed in the present paper was developed to ensure reliable operation for a 2.5-meter cryogenic high-Reynolds-number fan-driven transonic wind tunnel. It will incorporate four CPU's and associated analog and digital input/output equipment, necessary for acquiring research data, controlling the tunnel parameters, and monitoring the process conditions. Connected in a multipoint distributed network, the CPU's will support data base management and processing; research measurement data acquisition and display; process monitoring; and communication control. The design will allow essential processes to continue, in the case of major hardware failures, by switching input/output equipment to alternate CPU's and by eliminating nonessential functions. It will also permit software modularization by CPU activity and thereby reduce complexity and development time.

  20. Microorganism penetration in dentinal tubules of instrumented and retreated root canal walls. In vitro SEM study

    PubMed Central

    Al-Sulaiman, Alaa; Al-Rasheed, Fellwa; Alnajjar, Fatimah; Al-Abdulwahab, Bander; Al-Badah, Abdulhakeem

    2014-01-01

    Objectives This in vitro study aimed to investigate the ability of Candida albicans (C. albicans) and Enterococcus faecalis (E. faecalis) to penetrate dentinal tubules of instrumented and retreated root canal surfaces of split human teeth. Materials and Methods Sixty intact extracted human single-rooted teeth were divided into 4 groups: negative control, positive control without canal instrumentation, instrumented, and retreated. Root canals in the instrumented group were enlarged with endodontic instruments, while root canals in the retreated group were enlarged, filled, and then had the canal filling materials removed. The teeth were split longitudinally after canal preparation in all groups except the negative control group. The teeth were inoculated with both microorganisms separately and in combination. Teeth specimens were examined by scanning electron microscopy (SEM), and the depth of penetration into the dentinal tubules was assessed using the SMILE view software (JEOL Ltd). Results Penetration of C. albicans and E. faecalis into the dentinal tubules was observed in all 3 groups, although penetration was partially restricted by dentin debris in the tubules in the instrumented group and by remnants of canal filling materials in the retreated group. In all 3 groups, E. faecalis penetrated deeper into the dentinal tubules by way of cell division than C. albicans, which built colonies and penetrated by means of hyphae. Conclusions Microorganisms can easily penetrate dentinal tubules of root canals, with different appearances depending on microorganism size and the status of the dentinal tubules. PMID:25383343

  1. Unobtrusive Software and System Health Management with R2U2 on a Parallel MIMD Coprocessor

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Moosbrugger, Patrick

    2017-01-01

    Dynamic monitoring of software and system health of a complex cyber-physical system requires observers that continuously monitor variables of the embedded software in order to detect anomalies and reason about root causes. There exists a variety of techniques for code instrumentation, but instrumentation might change runtime behavior and could require costly software re-certification. In this paper, we present R2U2E, a novel realization of our real-time, Realizable, Responsive, and Unobtrusive Unit (R2U2). The R2U2E observers are executed in parallel on a dedicated 16-core EPIPHANY co-processor, thereby avoiding additional computational overhead to the system under observation. A DMA-based shared memory access architecture allows R2U2E to operate without any code instrumentation or program interference.
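
    A simplified sketch in the spirit of a runtime-verification observer, checking a bounded-response property over a sampled variable stream; it is not the R2U2 implementation, and the property, flag streams, and bound are invented for illustration.

      # Toy observer: every trigger must be followed by a response within max_steps samples.
      def bounded_response(trigger_flags, response_flags, max_steps):
          pending = []                               # sample indices awaiting a response
          for i, (trig, resp) in enumerate(zip(trigger_flags, response_flags)):
              if pending and i - pending[0] > max_steps:
                  return False, i                    # oldest trigger went unanswered too long
              if resp:
                  pending.clear()                    # a response satisfies all pending triggers
              if trig:
                  pending.append(i)
          return True, None

      print(bounded_response([0, 1, 0, 0, 0], [0, 0, 0, 1, 0], max_steps=3))       # (True, None)
      print(bounded_response([0, 1, 0, 0, 0, 0], [0, 0, 0, 0, 0, 1], max_steps=3)) # (False, 5)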

  2. Interfacing LabVIEW With Instrumentation for Electronic Failure Analysis and Beyond

    NASA Technical Reports Server (NTRS)

    Buchanan, Randy K.; Bryan, Coleman; Ludwig, Larry

    1996-01-01

    The Laboratory Virtual Instrumentation Engineering Workstation (LabVIEW) software is designed such that equipment and processes related to control systems can be operationally linked and controlled through a computer. Various processes within the failure analysis laboratories of NASA's Kennedy Space Center (KSC) demonstrate the need for modernization and, in some cases, automation, using LabVIEW. An examination of procedures and practices within the Failure Analysis Laboratory resulted in the conclusion that some means was necessary to elevate potential users of LabVIEW to an operational level in minimum time. This paper outlines the process involved in creating a tutorial application to enable personnel to apply LabVIEW to their specific projects. Suggestions for furthering the extent to which LabVIEW is used are provided in the areas of data acquisition and process control.

  3. Software metrics: The key to quality software on the NCC project

    NASA Technical Reports Server (NTRS)

    Burns, Patricia J.

    1993-01-01

    Network Control Center (NCC) Project metrics are captured during the implementation and testing phases of the NCCDS software development lifecycle. The metrics data collection and reporting function has interfaces with all elements of the NCC project. Close collaboration with all project elements has resulted in the development of a defined and repeatable set of metrics processes. The resulting data are used to plan and monitor release activities on a weekly basis. The use of graphical outputs facilitates the interpretation of progress and status. The successful application of metrics throughout the NCC project has been instrumental in the delivery of quality software. The use of metrics on the NCC Project supports the needs of the technical and managerial staff. This paper describes the project, the functions supported by metrics, the data that are collected and reported, how the data are used, and the improvements in the quality of deliverable software since the metrics processes and products have been in use.

  4. SiFAP: a Simple Sub-Millisecond Astronomical Photometer

    NASA Astrophysics Data System (ADS)

    Ambrosino, F.; Meddi, F.; Nesci, R.; Rossi, C.; Sclavi, S.; Bruni, I.

    2013-09-01

    A new fast photometer based on SiPM technology was developed at the University of Rome "La Sapienza" starting from 2009. A first prototype was successfully tested observing the Crab pulsar at the Loiano telescope of the Bologna Observatory. In this paper we illustrate the improvements we applied to our instrument, concerning new cooled commercial sensors, a new version of our custom dedicated electronics, and upgraded control and timing software. Finally, we report the results obtained with this instrument in December 2012 on the Crab pulsar at the Loiano telescope to demonstrate its performance and capabilities.

  5. Development of flying qualities criteria for single pilot instrument flight operations

    NASA Technical Reports Server (NTRS)

    Bar-Gill, A.; Nixon, W. B.; Miller, G. E.

    1982-01-01

    Flying qualities criteria for Single Pilot Instrument Flight Rule (SPIFR) operations were investigated. The ARA aircraft was modified and adapted for SPIFR operations. Aircraft configurations to be flight-tested were chosen and matched on the ARA in-flight simulator, implementing modern control theory algorithms. Mission planning and experimental matrix design were completed. Microprocessor software for the onboard data acquisition system was debugged and flight-tested. Flight-path reconstruction procedure and the associated FORTRAN program were developed. Algorithms associated with the statistical analysis of flight test results and the SPIFR flying qualities criteria deduction are discussed.

  6. Automated data acquisition technology development: Automated modeling and control development

    NASA Technical Reports Server (NTRS)

    Romine, Peter L.

    1995-01-01

    This report documents the completion of, and improvements made to, the software developed for automated data acquisition and automated modeling and control development on the Texas Micro rack-mounted PCs. This research was initiated because a need was identified by the Metal Processing Branch of NASA Marshall Space Flight Center for a mobile data acquisition and data analysis system, customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument strictly for data acquisition and data analysis. In addition to the data acquisition functions described in this thesis, WMS also supports many functions associated with process control. The hardware and software requirements for an automated acquisition system for welding process parameters, welding equipment checkout, and welding process modeling were determined in 1992. From these recommendations, NASA purchased the necessary hardware and software. The new welding acquisition system is designed to collect welding parameter data and perform analysis to determine the voltage versus current arc-length relationship for VPPA welding. Once the results of this analysis are obtained, they can then be used to develop a RAIL function to control welding startup and shutdown without torch crashing.

  7. Automated water analyser computer supported system (AWACSS) Part I: Project objectives, basic technology, immunoassay development, software design and networking.

    PubMed

    Tschmelak, Jens; Proll, Guenther; Riedt, Johannes; Kaiser, Joachim; Kraemmer, Peter; Bárzaga, Luis; Wilkinson, James S; Hua, Ping; Hole, J Patrick; Nudd, Richard; Jackson, Michael; Abuknesha, Ram; Barceló, Damià; Rodriguez-Mozaz, Sara; de Alda, Maria J López; Sacher, Frank; Stien, Jan; Slobodník, Jaroslav; Oswald, Peter; Kozmenko, Helena; Korenková, Eva; Tóthová, Lívia; Krascsenits, Zoltan; Gauglitz, Guenter

    2005-02-15

    A novel analytical system, AWACSS (automated water analyser computer-supported system), based on immunochemical technology has been developed that can measure several organic pollutants at the low nanogram-per-litre level in a single few-minute analysis without any prior sample pre-concentration or pre-treatment steps. Bearing in mind the actual needs of water-sector managers related to the implementation of the Drinking Water Directive (DWD) (98/83/EC, 1998) and the Water Framework Directive (WFD) (2000/60/EC, 2000), drinking, ground, surface, and waste waters were the major media used for the evaluation of the system performance. The instrument was equipped with remote control and surveillance facilities. The system's software allows for internet-based networking between the measurement and control stations, global management, trend analysis, and early-warning applications. The experience of water laboratories has been utilised in the design of the instrument's hardware and software in order to make the system rugged and user-friendly. Several market surveys were conducted during the project to assess the applicability of the final system. A web-based AWACSS database was created for automated evaluation and storage of the obtained data in a format compatible with major databases of environmental organic pollutants in Europe. This first-part article gives the reader an overview of the aims and scope of the AWACSS project as well as details about the basic technology, immunoassays, software, and networking developed and utilised within the research project. The second-part article reports on the system performance, first real-sample measurements, and an international collaborative trial (inter-laboratory tests) to compare the biosensor with conventional analytical methods.

  8. The control system of the multi-strip ionization chamber for the HIMM

    NASA Astrophysics Data System (ADS)

    Li, Min; Yuan, Y. J.; Mao, R. S.; Xu, Z. G.; Li, Peng; Zhao, T. C.; Zhao, Z. L.; Zhang, Nong

    2015-03-01

    Heavy Ion Medical Machine (HIMM) is a carbon ion cancer treatment facility which is being built by the Institute of Modern Physics (IMP) in China. In this facility, transverse profile and intensity of the beam at the treatment terminals will be measured by the multi-strip ionization chamber. In order to fulfill the requirement of the beam position feedback to accomplish the beam automatic commissioning, less than 1 ms reaction time of the Data Acquisition (DAQ) of this detector must be achieved. Therefore, the control system and software framework for DAQ have been redesigned and developed with National Instruments Compact Reconfigurable Input/Output (CompactRIO) instead of PXI 6133. The software is Labview-based and developed following the producer-consumer pattern with message mechanism and queue technology. The newly designed control system has been tested with carbon beam at the Heavy Ion Research Facility at Lanzhou-Cooler Storage Ring (HIRFL-CSR) and it has provided one single beam profile measurement in less than 1 ms with 1 mm beam position resolution. The fast reaction time and high precision data processing during the beam test have verified the usability and maintainability of the software framework. Furthermore, such software architecture is easy-fitting to applications with different detectors such as wire scanner detector.
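
    A generic sketch of the producer-consumer pattern with a queue that the abstract describes (the real system is LabVIEW on CompactRIO); read_strip_charges() is a hypothetical stand-in for the hardware readout, and the centroid calculation only illustrates where a beam-position result would be produced.

      # Producer-consumer DAQ pattern with a FIFO queue between acquisition and processing.
      import queue
      import threading
      import time

      fifo = queue.Queue()

      def read_strip_charges():
          return [0.0] * 64                      # placeholder for one multi-strip readout

      def producer(n_frames):
          for _ in range(n_frames):
              fifo.put((time.time(), read_strip_charges()))
              time.sleep(0.001)                  # ~1 ms acquisition cycle
          fifo.put(None)                         # sentinel: acquisition finished

      def consumer():
          while (item := fifo.get()) is not None:
              timestamp, charges = item
              centroid = sum(i * q for i, q in enumerate(charges)) / (sum(charges) or 1.0)
              print(timestamp, centroid)         # feed the centroid to position feedback here

      threading.Thread(target=producer, args=(100,)).start()
      consumer()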

  9. Use of electronic microprocessor-based instrumentation by the U.S. geological survey for hydrologic data collection

    USGS Publications Warehouse

    Shope, William G.; ,

    1991-01-01

    The U.S. Geological Survey is acquiring a new generation of field computers and communications software to support hydrologic data-collection at field locations. The new computer hardware and software mark the beginning of the Survey's transition from the use of electromechanical devices and paper tapes to electronic microprocessor-based instrumentation. Software is being developed for these microprocessors to facilitate the collection, conversion, and entry of data into the Survey's National Water Information System. The new automated data-collection process features several microprocessor-controlled sensors connected to a serial digital multidrop line operated by an electronic data recorder. Data are acquired from the sensors in response to instructions programmed into the data recorder by the user through small portable lap-top or hand-held computers. The portable computers, called personal field computers, also are used to extract data from the electronic recorders for transport by courier to the office computers. The Survey's alternative to manual or courier retrieval is the use of microprocessor-based remote telemetry stations. Plans have been developed to enhance the Survey's use of the Geostationary Operational Environmental Satellite telemetry by replacing the present network of direct-readout ground stations with less expensive units. Plans also provide for computer software that will support other forms of telemetry such as telephone or land-based radio.

  10. Calibration Issues and Operating System Requirements for Electron-Probe Microanalysis

    NASA Technical Reports Server (NTRS)

    Carpenter, P.

    2006-01-01

    Instrument purchase requirements and dialogue with manufacturers have established hardware parameters for alignment, stability, and reproducibility, which have helped improve the precision and accuracy of electron microprobe analysis (EPMA). The development of correction algorithms and the accurate solution to quantitative analysis problems requires the minimization of systematic errors and relies on internally consistent data sets. Improved hardware and computer systems have resulted in better automation of vacuum systems, stage and wavelength-dispersive spectrometer (WDS) mechanisms, and x-ray detector systems which have improved instrument stability and precision. Improved software now allows extended automated runs involving diverse setups and better integrates digital imaging and quantitative analysis. However, instrumental performance is not regularly maintained, as WDS are aligned and calibrated during installation but few laboratories appear to check and maintain this calibration. In particular, detector deadtime (DT) data is typically assumed rather than measured, due primarily to the difficulty and inconvenience of the measurement process. This is a source of fundamental systematic error in many microprobe laboratories and is unknown to the analyst, as the magnitude of DT correction is not listed in output by microprobe operating systems. The analyst must remain vigilant to deviations in instrumental alignment and calibration, and microprobe system software must conveniently verify the necessary parameters. Microanalysis of mission critical materials requires an ongoing demonstration of instrumental calibration. Possible approaches to improvements in instrument calibration, quality control, and accuracy will be discussed. Development of a set of core requirements based on discussions with users, researchers, and manufacturers can yield documents that improve and unify the methods by which instruments can be calibrated. These results can be used to continue improvements of EPMA.
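
    For context, a worked example of the standard non-paralyzable dead-time correction that such calibration concerns; the dead time and measured count rate below are illustrative values, not instrument data.

      # N_true = N_meas / (1 - N_meas * tau) for a non-paralyzable counting chain.
      def true_count_rate(measured_cps, tau_s):
          return measured_cps / (1.0 - measured_cps * tau_s)

      print(true_count_rate(50_000.0, 2e-6))   # ~55,556 cps: roughly an 11% correction at 50 kcps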

  11. Reagan Test Site Distributed Operations

    DTIC Science & Technology

    2012-01-01

    Text fragments recovered from the record excerpt: "...for missile testing because of its geography and its strategic location in the Pacific [1]. The atoll's distance from launch facilities at Vandenberg... research on ballistic missile defense 50 years ago (Figure 1). The subsequent development of RTS's unique instrumentation sensors, including high... control center including hardware, software, networks, and the facility functioned successfully." Figure 1 caption: "The map shows the isolated location of the..."

  12. A Macintosh based data system for array spectrometers (Poster)

    NASA Astrophysics Data System (ADS)

    Bregman, J.; Moss, N.

    An interactive data acquisition and reduction system has been assembled by combining a Macintosh computer with an instrument controller (an Apple II computer) via an RS-232 interface. The data system provides flexibility for operating different linear array spectrometers. The standard Macintosh interface is used to provide ease of operation and to allow transferring the reduced data to commercial graphics software.

  13. 2012 Summer Research Experiences for Undergraduates at Pisgah Astronomical Research Institute

    NASA Astrophysics Data System (ADS)

    Castelaz, Michael W.; Cline, J. D.; Whitworth, C.; Clavier, D.; Owen, L.

    2013-01-01

    Pisgah Astronomical Research Institute (PARI) offers research experiences for undergraduates (REU). PARI receives support for the internships from the NC Space Grant Consortium, NSF awards, private donations, and industry partner funding. The PARI REU program began in 2001 with 4 students and has averaged 6 students per year over the past 11 years. This year PARI hosted 8 funded REU students. Mentors for the interns include PARI’s Science, Education, and Information Technology staff and visiting faculty who are members of the PARI Research Faculty Affiliate program. Students work with mentors on radio and optical astronomy research, electrical engineering for robotic control of instruments, software development for instrument control and software for citizen science projects, and science education by developing curricula and multimedia and teaching high school students in summer programs at PARI. At the end of the summer interns write a paper about their research which is published in the annually published PARI Summer Student Proceedings. Several of the students have presented their results at AAS Meetings. We will present a summary of specific research conducted by the students with their mentors and the logistics for hosting the PARI undergraduate internship program.

  14. Pivotal role of computers and software in mass spectrometry - SEQUEST and 20 years of tandem MS database searching.

    PubMed

    Yates, John R

    2015-11-01

    Advances in computer technology and software have driven developments in mass spectrometry over the last 50 years. Computers and software have been impactful in three areas: the automation of difficult calculations to aid interpretation, the collection of data and control of instruments, and data interpretation. As the power of computers has grown, so too has the utility and impact on mass spectrometers and their capabilities. This has been particularly evident in the use of tandem mass spectrometry data to search protein and nucleotide sequence databases to identify peptide and protein sequences. This capability has driven the development of many new approaches to study biological systems, including the use of "bottom-up shotgun proteomics" to directly analyze protein mixtures.

  15. CATS, continuous automated testing of seismological, hydroacoustic, and infrasound (SHI) processing software.

    NASA Astrophysics Data System (ADS)

    Brouwer, Albert; Brown, David; Tomuta, Elena

    2017-04-01

    To detect nuclear explosions, waveform data from over 240 SHI stations world-wide flows into the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO), located in Vienna, Austria. A complex pipeline of software applications processes this data in numerous ways to form event hypotheses. The software codebase comprises over 2 million lines of code, reflects decades of development, and is subject to frequent enhancement and revision. Since processing must run continuously and reliably, software changes are subjected to thorough testing before being put into production. To overcome the limitations and cost of manual testing, the Continuous Automated Testing System (CATS) has been created. CATS provides an isolated replica of the IDC processing environment, and is able to build and test different versions of the pipeline software directly from code repositories that are placed under strict configuration control. Test jobs are scheduled automatically when code repository commits are made. Regressions are reported. We present the CATS design choices and test methods. Particular attention is paid to how the system accommodates the individual testing of strongly interacting software components that lack test instrumentation.
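
    A simplified sketch of commit-triggered test scheduling; CATS itself is not described at this level of detail, and the repository path, polling interval, and test command below are hypothetical.

      # Poll a configuration-controlled checkout and run the test suite on new commits.
      import subprocess
      import time

      REPO = "/srv/cats/idc-pipeline"           # hypothetical checkout location
      TEST_CMD = ["make", "regression-tests"]   # hypothetical test entry point

      def head_commit(repo):
          return subprocess.check_output(["git", "-C", repo, "rev-parse", "HEAD"], text=True).strip()

      last = head_commit(REPO)
      while True:
          subprocess.run(["git", "-C", REPO, "pull", "--ff-only"], check=True)
          current = head_commit(REPO)
          if current != last:                   # a new commit: build and run the test suite
              result = subprocess.run(TEST_CMD, cwd=REPO)
              print(current, "PASS" if result.returncode == 0 else "REGRESSION")
              last = current
          time.sleep(300)                       # poll every five minutes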

  16. JPL's Real-Time Weather Processor project (RWP) metrics and observations at system completion

    NASA Technical Reports Server (NTRS)

    Loesh, Robert E.; Conover, Robert A.; Malhotra, Shan

    1990-01-01

    As an integral part of the overall upgraded National Airspace System (NAS), the objective of the Real-Time Weather Processor (RWP) project is to improve the quality of weather information and the timeliness of its dissemination to system users. To accomplish this, an RWP will be installed in each of the Center Weather Service Units (CWSUs), located in 21 of the 23 Air Route Traffic Control Centers (ARTCCs). The RWP System is a prototype system. It is planned that the software will be GFE and that production hardware will be acquired via industry competitive procurement. The ARTCC is a facility established to provide air traffic control service to aircraft operating on Instrument Flight Rules (IFR) flight plans within controlled airspace, principally during the en route phase of the flight. Covered here are requirement metrics, Software Problem Failure Reports (SPFRs), and Ada portability metrics and observations.

  17. Coleman performs VO2 Max PFS Software Calibrations and Instrument Check

    NASA Image and Video Library

    2011-02-24

    ISS026-E-029180 (24 Feb. 2011) --- NASA astronaut Catherine (Cady) Coleman, Expedition 26 flight engineer, performs VO2max portable Pulmonary Function System (PFS) software calibrations and instrument check while using the Cycle Ergometer with Vibration Isolation System (CEVIS) in the Destiny laboratory of the International Space Station.

  18. The Paired Availability Design and Related Instrumental Variable Meta-analyses | Division of Cancer Prevention

    Cancer.gov

    Stuart G. Baker, 2017. This software computes meta-analysis and extrapolation estimates for an instrumental variable meta-analysis of randomized trials or before-and-after studies (the latter also known as the paired availability design). The software also checks the assumptions if sufficient data are available.

  19. 48 CFR 227.7207 - Contractor data repositories.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Computer Software and Computer Software Documentation 227.7207 Contractor data repositories. Follow 227.7108 when it is in the Government's interests to have a data repository include computer software or to have a separate computer software repository. Contractual instruments establishing the repository...

  20. 48 CFR 227.7207 - Contractor data repositories.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Computer Software and Computer Software Documentation 227.7207 Contractor data repositories. Follow 227.7108 when it is in the Government's interests to have a data repository include computer software or to have a separate computer software repository. Contractual instruments establishing the repository...

  1. 48 CFR 227.7207 - Contractor data repositories.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Computer Software and Computer Software Documentation 227.7207 Contractor data repositories. Follow 227.7108 when it is in the Government's interests to have a data repository include computer software or to have a separate computer software repository. Contractual instruments establishing the repository...

  2. 48 CFR 227.7207 - Contractor data repositories.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Computer Software and Computer Software Documentation 227.7207 Contractor data repositories. Follow 227.7108 when it is in the Government's interests to have a data repository include computer software or to have a separate computer software repository. Contractual instruments establishing the repository...

  3. A Statistical Testing Approach for Quantifying Software Reliability; Application to an Example System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chu, Tsong-Lun; Varuttamaseni, Athi; Baek, Joo-Seok

    The U.S. Nuclear Regulatory Commission (NRC) encourages the use of probabilistic risk assessment (PRA) technology in all regulatory matters, to the extent supported by the state-of-the-art in PRA methods and data. Although much has been accomplished in the area of risk-informed regulation, risk assessment for digital systems has not been fully developed. The NRC established a plan for research on digital systems to identify and develop methods, analytical tools, and regulatory guidance for (1) including models of digital systems in the PRAs of nuclear power plants (NPPs), and (2) incorporating digital systems in the NRC's risk-informed licensing and oversight activities. Under NRC's sponsorship, Brookhaven National Laboratory (BNL) explored approaches for addressing the failures of digital instrumentation and control (I and C) systems in the current NPP PRA framework. Specific areas investigated included PRA modeling of digital hardware, development of a philosophical basis for defining software failure, and identification of desirable attributes of quantitative software reliability methods. Based on the earlier research, statistical testing is considered a promising method for quantifying software reliability. This paper describes a statistical software testing approach for quantifying software reliability and applies it to the loop-operating control system (LOCS) of an experimental loop of the Advanced Test Reactor (ATR) at Idaho National Laboratory (INL).
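
    For context, a worked example of the statistics commonly used in failure-free statistical testing: the number of independent, operationally representative tests needed to support a claimed per-demand failure probability p at confidence C. The numbers are illustrative and are not taken from the ATR application.

      # Solve (1 - p)**n <= 1 - C for the smallest integer number of failure-free tests n.
      import math

      def tests_required(p, confidence):
          return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p))

      print(tests_required(1e-4, 0.95))   # ~29,956 failure-free demands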

  4. Virtual Platform for See Robustness Verification of Bootloader Embedded Software on Board Solar Orbiter's Energetic Particle Detector

    NASA Astrophysics Data System (ADS)

    Da Silva, A.; Sánchez Prieto, S.; Polo, O.; Parra Espada, P.

    2013-05-01

    Because of the tough robustness requirements in space software development, it is imperative to carry out verification tasks at a very early development stage to ensure that the implemented exception mechanisms work properly. All this should be done long before the real hardware is available. But even if real hardware is available, the verification of software fault-tolerance mechanisms can be difficult, since real faulty situations must be systematically and artificially brought about, which can be impossible on real hardware. To solve this problem the Alcala Space Research Group (SRG) has developed a LEON2 virtual platform (Leon2ViP) with fault-injection capabilities. This way it is possible to run the exact same target binary software as runs on the physical system in a more controlled and deterministic environment, allowing stricter requirements verification. Leon2ViP enables unmanned and tightly focused fault-injection campaigns, not possible otherwise, in order to expose and diagnose flaws in the software implementation early. Furthermore, the use of a virtual hardware-in-the-loop approach makes it possible to carry out preliminary integration tests with the spacecraft emulator or the sensors. The use of Leon2ViP has meant a significant improvement, in both time and cost, in the development and verification processes of the Instrument Control Unit boot software on board Solar Orbiter's Energetic Particle Detector.

  5. A flexible computerized system for environmental data acquisition and transmission

    NASA Astrophysics Data System (ADS)

    Zappalà, G.

    2009-04-01

    In recent years increasing importance has been attached to knowledge of the marine environment, either to help detect and understand global climate change phenomena, or to protect and preserve those coastal areas where multiple interests converge (linked to tourism, recreational or productive activities…) and which suffer the greatest impact from anthropogenic activities; this has in turn stimulated the start of research programs devoted to the monitoring and surveillance of these particular zones, coupling the needs for knowledge, sustainable development and exploitation of natural resources. There is an increasing need to have data available in real time or near real time in order to intervene in emergency situations. Cabled or wireless data transmission can be used. The first allows the transmission of a higher amount of data but only at coastal sites, while the second gives greater flexibility in terms of application to different environments; moreover, using mobile phone services (either terrestrial or satellite), it is possible to locate the data centre in the most convenient place, without any need for proximity to the sea. Traditional oceanographic techniques, based on ship surveys, hardly fit the needs of operational oceanography, because of their high cost and fragmentary nature, both in the spatial and temporal domains. To obtain good synopticity, it is necessary to complement traditional ship observations with measurements from fixed stations (buoys moored in sites chosen to be representative of wider areas, or to constitute a sentinel against the arrival of pollutants), satellite observations, the use of ships of opportunity and of newly developed instruments, like gliders, or towed sliding devices, like the SAVE. Modern instruments rely on an electronic heart; an integrated hardware-software system developed in Messina is presented here, used in various versions to control data acquisition and transmission on buoys or on ship-based instrumentation. The data acquisition and transmission system is based on IEEE P996.1 standard boards, implementing a PC-like architecture; basically, it consists of a Pentium-family CPU (the first prototypes used a 40 MHz 386 CPU), a variable number of RS-232 ports to connect measuring instruments and communication devices, an analog-to-digital converter (8 inputs, 12 or 16 bit), power outputs with connected circuit status feedback to drive actuators and switch the measuring systems on and off, a satellite and/or cellular phone modem, and GPS; the mass storage is supplied by Disk on Chip (DOC) devices. According to the needs, it can be fully or only partly implemented. The software environment is Datalight ROMDOS v. 6, an MS-DOS compatible operating system. The software has been written in Microsoft Professional Compiled BASIC v. 7.1 and Microsoft Macro Assembler v. 5.0. It provides full control of the system instruments in both local and remote mode using a special set of macro commands (which can be combined into sequences using a simple text editor) that also include conditional execution of branches; this feature can be very useful in case of partial operability of the system due, for instance, to a low battery level or the failure of some instrument.
Available commands include system management commands, instrument management commands, conditional branch commands, and data transmission commands. Collected data are locally stored and can be transmitted as e-mails, increasing their safety against loss and making the global data path fault-tolerant by exploiting the peculiarities of the e-mail system. The first version was used in a network of coastal monitoring buoys funded by the Italian SAM program; a second one was used to equip an automatic multiple launcher for expendable probes to be used on ships of opportunity, designed and built in the framework of an EU funded program, "MFSTEP". Every hour, a "sequence manager" starts a macro-command sequence, which can differ from hour to hour and is remotely reprogrammable; new releases of the software and of the sequences can be uploaded to the station without suspending its normal activity. The macro-commands manage the data acquisition and transmission, the mission programming, the station hardware and the measuring instruments. In the "launcher" version the program also controls the acquisition of real time and position, comparison against set points and times, launch, data acquisition and transmission, and ancillary functions. The whole system can be connected to another computer (local laptop or remote desktop) using terminal software; however, to fully and easily use its capabilities, a remote control program has been written in Microsoft Visual Basic, running in a Windows environment. This program can transfer files to and from the measuring system, set up all its functionalities, and, if needed, take control of all the system operations. Thanks to the PC-like hardware architecture, it is easy to upgrade the system to more powerful processors without the need to modify the software, which, in turn, can be easily programmed using standard development packages.
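
    A toy sketch of a macro-command sequence with a conditional branch, in the spirit of the command set described; the command names and branch syntax are invented, and the station software itself is compiled BASIC rather than Python.

      # Tiny interpreter for an invented macro-command sequence with one conditional branch.
      SEQUENCE = [
          ("CHECK_BATTERY",),
          ("IF_BELOW", "battery_v", 11.5, "SKIP_TO", "TRANSMIT"),
          ("POWER_ON", "ctd"),
          ("MEASURE", "ctd", 60),
          ("POWER_OFF", "ctd"),
          ("LABEL", "TRANSMIT"),
          ("SEND_EMAIL", "station42@datacentre.example"),
      ]

      def run(sequence, state):
          i = 0
          while i < len(sequence):
              cmd, *args = sequence[i]
              if cmd == "IF_BELOW" and state[args[0]] < args[1]:
                  # Conditional branch: jump forward to the named label.
                  i = next(j for j, s in enumerate(sequence) if s[:2] == ("LABEL", args[3]))
                  continue
              print("executing:", cmd, args)        # placeholders for the real actions
              i += 1

      run(SEQUENCE, {"battery_v": 11.2})            # low battery: skips straight to transmission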

  6. The AAO fiber instrument data simulator

    NASA Astrophysics Data System (ADS)

    Goodwin, Michael; Farrell, Tony; Smedley, Scott; Heald, Ron; Heijmans, Jeroen; De Silva, Gayandhi; Carollo, Daniela

    2012-09-01

    The fiber instrument data simulator is an in-house software tool that simulates detector images of fiber-fed spectrographs developed by the Australian Astronomical Observatory (AAO). In addition to helping validate the instrument designs, the resulting simulated images are used to develop the required data reduction software. Example applications that have benefited from the tool usage are the HERMES and SAMI instrumental projects for the Anglo-Australian Telescope (AAT). Given the sophistication of these projects an end-to-end data simulator that accurately models the predicted detector images is required. The data simulator encompasses all aspects of the transmission and optical aberrations of the light path: from the science object, through the atmosphere, telescope, fibers, spectrograph and finally the camera detectors. The simulator runs under a Linux environment that uses pre-calculated information derived from ZEMAX models and processed data from MATLAB. In this paper, we discuss the aspects of the model, software, example simulations and verification.

  7. Software design for the VIS instrument onboard the Euclid mission: a multilayer approach

    NASA Astrophysics Data System (ADS)

    Galli, E.; Di Giorgio, A. M.; Pezzuto, S.; Liu, S. J.; Giusi, G.; Li Causi, G.; Farina, M.; Cropper, M.; Denniston, J.; Niemi, S.

    2014-07-01

    The Euclid mission scientific payload is composed of two instruments: a VISible Imaging Instrument (VIS) and a Near Infrared Spectrometer and Photometer instrument (NISP). Each instrument has its own control unit. The Instrument Command and Data Processing Unit (VI-CDPU) is the control unit of the VIS instrument. The VI-CDPU is connected directly to the spacecraft by means of a MIL-STD-1553B bus and to the satellite Mass Memory Unit via a SpaceWire link. All the internal interfaces are implemented via SpaceWire links and include 12 high-speed lines for the data provided by the 36 focal plane CCDs readout electronics (ROEs) and one link to the Power and Mechanisms Control Unit (VI-PMCU). VI-CDPU is in charge of distributing commands to the instrument sub-systems, collecting their housekeeping parameters and monitoring their health status. Moreover, the unit has the task of acquiring, reordering, compressing and transferring the science data to the satellite Mass Memory. This last feature is probably the most challenging one for the VI-CDPU, since stringent constraints on the minimum lossless compression ratio, the maximum time for the compression execution and the maximum power consumption have to be satisfied. Therefore, an accurate performance analysis at the hardware layer is necessary, which could delay the design and development of the software too much. In order to mitigate this risk, in the multilayered design of the software we decided to introduce a middleware layer that provides a set of APIs with the aim of hiding the implementation of the hardware-connected layer from the application layer. The middleware is built on top of the Operating System layer (which includes the Real-Time OS that will be adopted) and the onboard Computer Hardware. The middleware itself has a multi-layer architecture composed of 4 layers: the Abstract RTOS Adapter Layer (AOSAL), the Specific RTOS Adapter Layer (SOSAL), the Common Patterns Layer (CPL), and the Service Layer, composed of two subgroups, the Common Service Layer (CSL) and the Specific Service Layer (SSL). The middleware design is made using the UML 2.0 standard. The AOSAL includes the abstraction of services provided by a generic RTOS (e.g., Thread/Task, Time Management, Mutex and Semaphores) as well as an abstraction of the SpaceWire and 1553-B bus interfaces. The SOSAL is the implementation of the AOSAL for the adopted RTOS. The CPL provides a set of patterns that are a general solution for common problems related to embedded hard real-time systems. This set includes patterns for memory management, homogeneous redundancy channels, pipes and filters for data exchange, proxies for slow memories, watchdogs and reactive objects. The CPL is designed using a soft-metamodeling approach, so as to be as general as possible. Finally, the Service Layer provides a set of services that are common to space applications. The testing of this middleware can be done both during the design, using appropriate analysis tools, and in the implementation phase by means of unit testing tools.

  8. Development of a compact laser-based single photon ionization time-of-flight mass spectrometer

    NASA Astrophysics Data System (ADS)

    Tonokura, Kenichi; Kanno, Nozomu; Yamamoto, Yukio; Yamada, Hiroyuki

    2010-02-01

    We have developed a compact, laser-based, single photon ionization time-of-flight mass spectrometer (SPI-TOF-MS) for on-line monitoring of trace organic species. To obtain the mass spectrum, we use a nearly fragmentation-free SPI technique with 10.5 eV (118 nm) vacuum ultraviolet laser pulses generated by frequency tripling of the third harmonic of an Nd:YAG laser. The instrument can be operated in a linear TOF-MS mode or a reflectron TOF-MS mode in the coaxial design. We designed ion optics to optimize detection sensitivity and mass resolution. For data acquisition, the instrument is controlled using LabVIEW control software. The total power requirement for the vacuum unit, control electronics unit, ion optics, and detection system is approximately 100 W. We achieve a detection limit of parts per billion by volume (ppbv) for on-line trace analysis of several organic compounds. A mass resolution of 800 at about 100 amu is obtained for reflectron TOF-MS mode in a 0.35 m long instrument. The application of on-line monitoring of diesel engine exhaust was demonstrated.
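
    For context, a worked example of the time-of-flight to mass-to-charge relation, t = a*sqrt(m/z) + t0, that underlies mass calibration in such an instrument; the calibrant flight times below are illustrative, not measured values from this spectrometer.

      # Two-point TOF mass calibration and conversion of a flight time to m/z.
      import math

      def calibrate(t1, mz1, t2, mz2):
          a = (t2 - t1) / (math.sqrt(mz2) - math.sqrt(mz1))
          t0 = t1 - a * math.sqrt(mz1)
          return a, t0

      def mass_to_charge(t, a, t0):
          return ((t - t0) / a) ** 2

      a, t0 = calibrate(10.2e-6, 78.0, 13.0e-6, 128.0)   # e.g. benzene and naphthalene molecular ions
      print(round(mass_to_charge(11.6e-6, a, t0), 1))    # m/z of an unknown peak at 11.6 us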

  9. Using a graphical programming language to write CAMAC/GPIB instrument drivers

    NASA Technical Reports Server (NTRS)

    Zambrana, Horacio; Johanson, William

    1991-01-01

    To reduce the complexities of conventional programming, graphical software was used in the development of instrumentation drivers. The graphical software provides a standard set of tools (graphical subroutines) which are sufficient to program the most sophisticated CAMAC/GPIB drivers. These tools were used and instrumentation drivers were successfully developed for operating CAMAC/GPIB hardware from two different manufacturers: LeCroy and DSP. The use of these tools is presented for programming a LeCroy A/D Waveform Analyzer.

  10. The ISO SWS on-line system

    NASA Technical Reports Server (NTRS)

    Roelfsema, P. R.; Kester, D. J. M.; Wesselius, P. R.; Wieprech, E.; Sym, N.

    1992-01-01

    The software which is currently being developed for the Short Wavelength Spectrometer (SWS) of the Infrared Space Observatory (ISO) is described. The spectrometer has a wide range of capabilities in the 2-45 micron infrared band. SWS contains two independent gratings, one for the long and one for the short wavelength section of the band. With the gratings a spectral resolution of approximately 1000 to approximately 2500 can be obtained. The instrument also contains two Fabry-Pérot interferometers yielding a resolution between approximately 1000 and approximately 20000. Software is currently being developed for the acquisition, calibration, and analysis of SWS data. The software is firstly required to run in a pipeline mode without human interaction, to process data as they are received from the telescope. However, both for testing and calibration of the instrument as well as for evaluation of the planned operating procedures, the software should also be suitable for interactive use. Thirdly, the same software will be used for long-term characterization of the instrument. The software must work properly within the environment designed by the European Space Agency (ESA) for the spacecraft operations. As a result, strict constraints are put on I/O devices, throughput, etc.

  11. EPICS Controlled Collimator for Controlling Beam Sizes in HIPPO

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Napolitano, Arthur Soriano; Vogel, Sven C.

    2017-08-03

    Controlling the beam spot size and shape in a diffraction experiment determines the probed sample volume. The HIPPO (High-Pressure-Preferred Orientation) neutron time-of-flight diffractometer is located at the Lujan Neutron Scattering Center at Los Alamos National Laboratory. HIPPO characterizes microstructural parameters, such as phase composition, strains, grain size, or texture, of bulk (cm-sized) samples. In the current setup, the beam spot has a 10 mm diameter. Using a collimator, consisting of two pairs of neutron-absorbing boron-nitride slabs, the horizontal and vertical dimensions of a rectangular beam spot can be defined. Using the HIPPO robotic sample changer for sample motion, the collimator would enable scanning of e.g. cylindrical samples along the cylinder axis by probing slices of such samples. The project presented here describes the implementation of such a collimator, in particular the motion control software. We utilized the EPICS (Experimental Physics and Industrial Control System) software interface to integrate the collimator control into the HIPPO instrument control system. Using EPICS, commands are sent to commercial stepper motors that move the beam windows.
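
    A hedged sketch of setting a rectangular beam aperture through EPICS process variables with the pyepics client library; the PV names below are hypothetical, not the actual HIPPO channel names, and the slit-size abstraction stands in for whatever motor records the real system exposes.

      # Set a rectangular beam spot via EPICS channel access (pyepics caput/caget).
      import epics

      SLIT_PVS = {
          "horizontal": "HIPPO:COLL:H:SIZE",   # hypothetical PV for the horizontal gap (mm)
          "vertical":   "HIPPO:COLL:V:SIZE",   # hypothetical PV for the vertical gap (mm)
      }

      def set_beam_spot(width_mm, height_mm):
          epics.caput(SLIT_PVS["horizontal"], width_mm, wait=True)   # blocks until the move completes
          epics.caput(SLIT_PVS["vertical"], height_mm, wait=True)
          return epics.caget(SLIT_PVS["horizontal"]), epics.caget(SLIT_PVS["vertical"])

      print(set_beam_spot(5.0, 20.0))   # e.g. a 5 mm x 20 mm slice for scanning a cylindrical sample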

  12. Agile deployment and code coverage testing metrics of the boot software on-board Solar Orbiter's Energetic Particle Detector

    NASA Astrophysics Data System (ADS)

    Parra, Pablo; da Silva, Antonio; Polo, Óscar R.; Sánchez, Sebastián

    2018-02-01

    In this day and age, successful embedded critical software needs agile and continuous development and testing procedures. This paper presents the overall testing and code coverage metrics obtained during the unit testing procedure carried out to verify the correctness of the boot software that will run in the Instrument Control Unit (ICU) of the Energetic Particle Detector (EPD) on-board Solar Orbiter. The ICU boot software is a critical part of the project, so its verification should be addressed at an early development stage; any test case missed in this process may affect the quality of the overall on-board software. According to the European Cooperation for Space Standardization (ECSS) standards adopted by ESA, testing this kind of critical software must cover 100% of the source code statement and decision paths. This leads to the complete testing of the fault tolerance and recovery mechanisms that have to resolve every possible memory corruption or communication error brought about by the space environment. The introduced procedure enables fault injection from the beginning of the development process and makes it possible to fulfill the exacting code coverage demands on the boot software.

  13. LCOGT Imaging Lab

    NASA Astrophysics Data System (ADS)

    Tufts, Joseph R.; Lobdill, Rich; Haldeman, Benjamin J.; Haynes, Rachel; Hawkins, Eric; Burleson, Ben; Jahng, David

    2008-07-01

    The Las Cumbres Observatory Global Telescope Network (LCOGT) is an ambitious project to build and operate, within 5 years, a worldwide robotic network of 50 0.4, 1, and 2 m telescopes sharing identical instrumentation and optimized for precision photometry of time-varying sources. The telescopes, instrumentation, and software are all developed in house with two 2 m telescopes already installed. The LCOGT Imaging Lab is responsible for assembly and characterization of the network's cameras and instrumentation. In addition to a fully equipped CNC machine shop, two electronics labs, and a future optics lab, the Imaging Lab is designed from the ground up to be a superb environment for bare detectors, precision filters, and assembled instruments. At the heart of the lab is an ISO class 5 cleanroom with full ionization. Surrounding this, the class 7 main lab houses equipment for detector characterization including QE and CTE, and equipment for measuring transmission and reflection of optics. Although the first science cameras installed, two TEC cooled e2v 42-40 deep depletion based units and two CryoTiger cooled Fairchild Imaging CCD486-BI based units, are from outside manufacturers, their 18 position filter wheels and the remainder of the network's science cameras, controllers, and instrumentation will be built in house. Currently being designed, the first generation LCOGT cameras for the network's 1 m telescopes use existing CCD486-BI devices and an in-house controller. Additionally, the controller uses digital signal processing to optimize readout noise vs. speed, and all instrumentation uses embedded microprocessors for communication over ethernet.

  14. Distributed observing facility for remote access to multiple telescopes

    NASA Astrophysics Data System (ADS)

    Callegari, Massimo; Panciatici, Antonio; Pasian, Fabio; Pucillo, Mauro; Santin, Paolo; Aro, Simo; Linde, Peter; Duran, Maria A.; Rodriguez, Jose A.; Genova, Francoise; Ochsenbein, Francois; Ponz, J. D.; Talavera, Antonio

    2000-06-01

    The REMOT (Remote Experiment Monitoring and conTrol) project was financed in 1996 by the European Community in order to investigate the possibility of generalizing remote access to scientific instruments. After the feasibility of this idea was demonstrated, the DYNACORE (DYNAmically COnfigurable Remote Experiment monitoring and control) project was initiated as a REMOT follow-up. Its purpose is to develop software technology to support scientists in two different domains, astronomy and plasma physics. The resulting system allows (1) simultaneous multiple-user access to different experimental facilities, (2) dynamic adaptability to different kinds of real instruments, (3) exploitation of the communication infrastructure's features, (4) ease of use through intuitive graphical interfaces, and (5) additional inter-user communication using off-the-shelf products such as video-conference tools, chat programs and shared blackboards.

  15. Material Protection, Accounting, and Control Technologies (MPACT) Advanced Integration Roadmap

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Mike; Cipiti, Ben; Demuth, Scott Francis

    2017-01-30

    The development of sustainable advanced nuclear fuel cycles is a long-term goal of the Office of Nuclear Energy’s (DOE-NE) Fuel Cycle Technologies program. The Material Protection, Accounting, and Control Technologies (MPACT) campaign is supporting research and development (R&D) of advanced instrumentation, analysis tools, and integration methodologies to meet this goal (Miller, 2015). This advanced R&D is intended to facilitate safeguards and security by design of fuel cycle facilities. The lab-scale demonstration of a virtual facility, distributed test bed, that connects the individual tools being developed at National Laboratories and university research establishments, is a key program milestone for 2020. These tools will consist of instrumentation and devices as well as computer software for modeling, simulation and integration.

  16. Material Protection, Accounting, and Control Technologies (MPACT) Advanced Integration Roadmap

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Durkee, Joe W.; Cipiti, Ben; Demuth, Scott Francis

    The development of sustainable advanced nuclear fuel cycles is a long-term goal of the Office of Nuclear Energy’s (DOE-NE) Fuel Cycle Technologies program. The Material Protection, Accounting, and Control Technologies (MPACT) campaign is supporting research and development (R&D) of advanced instrumentation, analysis tools, and integration methodologies to meet this goal (Miller, 2015). This advanced R&D is intended to facilitate safeguards and security by design of fuel cycle facilities. The lab-scale demonstration of a virtual facility, distributed test bed, that connects the individual tools being developed at National Laboratories and university research establishments, is a key program milestone for 2020. These tools will consist of instrumentation and devices as well as computer software for modeling, simulation and integration.

  17. Unified Software Solution for Efficient SPR Data Analysis in Drug Research

    PubMed Central

    Dahl, Göran; Steigele, Stephan; Hillertz, Per; Tigerström, Anna; Egnéus, Anders; Mehrle, Alexander; Ginkel, Martin; Edfeldt, Fredrik; Holdgate, Geoff; O’Connell, Nichole; Kappler, Bernd; Brodte, Annette; Rawlins, Philip B.; Davies, Gareth; Westberg, Eva-Lotta; Folmer, Rutger H. A.; Heyse, Stephan

    2016-01-01

    Surface plasmon resonance (SPR) is a powerful method for obtaining detailed molecular interaction parameters. Modern instrumentation with its increased throughput has enabled routine screening by SPR in hit-to-lead and lead optimization programs, and SPR has become a mainstream drug discovery technology. However, the processing and reporting of SPR data in drug discovery are typically performed manually, which is both time-consuming and tedious. Here, we present the workflow concept, design and experiences with a software module relying on a single, browser-based software platform for the processing, analysis, and reporting of SPR data. The efficiency of this concept lies in the immediate availability of end results: data are processed and analyzed upon loading the raw data file, allowing the user to immediately quality control the results. Once completed, the user can automatically report those results to data repositories for corporate access and quickly generate printed reports or documents. The software module has resulted in a very efficient and effective workflow through saved time and improved quality control. We discuss these benefits and show how this process defines a new benchmark in the drug discovery industry for the handling, interpretation, visualization, and sharing of SPR data. PMID:27789754

  18. The Software Element of the NASA Portable Electronic Device Radiated Emissions Investigation

    NASA Technical Reports Server (NTRS)

    Koppen, Sandra V.; Williams, Reuben A. (Technical Monitor)

    2002-01-01

    NASA Langley Research Center's (LaRC) High Intensity Radiated Fields Laboratory (HIRF Lab) recently conducted a series of electromagnetic radiated emissions tests under a cooperative agreement with Delta Airlines and an interagency agreement with the FAA. The frequency spectrum environment at a commercial airport was measured on location. The environment survey provides a comprehensive picture of the complex nature of the electromagnetic environment present in those areas outside the aircraft. In addition, radiated emissions tests were conducted on portable electronic devices (PEDs) that may be brought onboard aircraft. These tests were performed in both semi-anechoic and reverberation chambers located in the HIRF Lab. The PEDs included cell phones, laptop computers, electronic toys, and family radio systems. The data generated during the tests are intended to support research on the effect of radiated emissions from wireless devices on aircraft systems. Both test systems relied on customized control and data reduction software to provide test and instrument control, data acquisition, a user interface, real-time data reduction, and data analysis. The software executed on PCs running MS Windows 98 and 2000, and used Agilent Pro Visual Engineering Environment (VEE) development software, Component Object Model (COM) technology, and MS Excel.

  19. DESIGN NOTE: Microcontroller-based multi-sensor apparatus for temperature control and thermal conductivity measurement

    NASA Astrophysics Data System (ADS)

    Mukaro, R.; Gasseller, M.; Kufazvinei, C.; Olumekor, L.; Taele, B. M.

    2003-08-01

    A microcontroller-based multi-sensor temperature measurement and control system that uses a steady-state one-dimensional heat-flow technique for absolute determination of thermal conductivity of a rigid poor conductor using the guarded hot-plate method is described. The objective of this project was to utilize the latest powerful, yet inexpensive, technological developments, sensors, data acquisition and control system, computer and application software, for research and teaching by example. The system uses an ST6220 microcontroller and LM335 temperature sensors for temperature measurement and control. The instrument interfaces to a computer via the serial port using a Turbo C++ programme. LM335Z silicon semiconductor temperature sensors located at different axial locations in the heat source were calibrated and used to measure temperature in the range from room temperature (about 293 K) to 373 K. A zero and span circuit was used in conjunction with an eight-to-one-line data multiplexer to scale the LM335 output signals to fit the 0-5.0 V full-scale input of the microcontroller's on-chip ADC and to sequentially measure temperature at the different locations. Temperature control is achieved by using software-generated pulse-width-modulated signals that control power to the heater. This article emphasizes the apparatus's instrumentation, the computerized data acquisition design, operation and demonstration of the system as a purposeful measurement system that could be easily adopted for use in the undergraduate laboratory. Measurements on a 10 mm thick sample of polyurethane foam at different temperature gradients gave a thermal conductivity of 0.026 +/- 0.004 W m-1 K-1.
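    The pulse-width-modulated heater drive described above can be illustrated with a short simulation. The sketch below is not the ST6220 firmware; the first-order thermal model, the proportional gain, and the numeric constants are invented for the example.

      class SimulatedPlate:
          """Very simple first-order thermal model standing in for the heater plate."""
          def __init__(self, ambient_k=293.0):
              self.temp_k = ambient_k
              self.ambient_k = ambient_k

          def step(self, heater_on, dt):
              heat_in = 2.0 if heater_on else 0.0           # arbitrary heating rate, K/s
              loss = 0.02 * (self.temp_k - self.ambient_k)  # arbitrary loss coefficient
              self.temp_k += (heat_in - loss) * dt

      def pwm_control(plate, setpoint_k, cycles=200, period_s=1.0, gain=0.05):
          """Proportional control expressed as a PWM duty cycle between 0 and 1."""
          for _ in range(cycles):
              error = setpoint_k - plate.temp_k
              duty = max(0.0, min(1.0, gain * error))       # clamp to a valid duty cycle
              plate.step(heater_on=True, dt=duty * period_s)
              plate.step(heater_on=False, dt=(1.0 - duty) * period_s)
          return plate.temp_k

      # Approaches the setpoint (with the usual small proportional offset).
      print(pwm_control(SimulatedPlate(), setpoint_k=340.0))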

  20. The Instrumental Genesis Process in Future Primary Teachers Using Dynamic Geometry Software

    ERIC Educational Resources Information Center

    Ruiz-López, Natalia

    2018-01-01

    This paper, which describes a study undertaken with pairs of future primary teachers using GeoGebra software to solve geometry problems, includes a brief literature review, the theoretical framework and methodology used. An analysis of the instrumental genesis process for a pair participating in the case study is also provided. This analysis…

  1. A distributed, graphical user interface based, computer control system for atomic physics experiments

    NASA Astrophysics Data System (ADS)

    Keshet, Aviv; Ketterle, Wolfgang

    2013-01-01

    Atomic physics experiments often require a complex sequence of precisely timed computer controlled events. This paper describes a distributed graphical user interface-based control system designed with such experiments in mind, which makes use of off-the-shelf output hardware from National Instruments. The software makes use of a client-server separation between a user interface for sequence design and a set of output hardware servers. Output hardware servers are designed to use standard National Instruments output cards, but the client-server nature should allow this to be extended to other output hardware. Output sequences running on multiple servers and output cards can be synchronized using a shared clock. By using a field programmable gate array-generated variable frequency clock, redundant buffers can be dramatically shortened, and a time resolution of 100 ns achieved over effectively arbitrary sequence lengths.
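    A toy client/server exchange in Python can make the separation described above concrete: a user-interface client ships a timed output sequence to a hardware server, which would load it onto the output cards. The JSON message format, port number, and event fields below are invented for illustration and are not the protocol of the cited system.

      import json, socket, threading, time

      def hardware_server(host="127.0.0.1", port=50007):
          with socket.socket() as srv:
              srv.bind((host, port))
              srv.listen(1)
              conn, _ = srv.accept()
              with conn:
                  sequence = json.loads(conn.recv(65536).decode())
                  # A real server would load this buffer onto the output hardware
                  # and arm it against the shared (e.g. FPGA-generated) clock.
                  print(f"server received {len(sequence['events'])} timed events")
                  conn.sendall(b"armed")

      def send_sequence(events, host="127.0.0.1", port=50007):
          with socket.socket() as cli:
              cli.connect((host, port))
              cli.sendall(json.dumps({"clock": "shared", "events": events}).encode())
              return cli.recv(16).decode()

      threading.Thread(target=hardware_server, daemon=True).start()
      time.sleep(0.2)  # give the server a moment to start listening
      print(send_sequence([{"t_us": 0,   "channel": "ao0", "value": 1.2},
                           {"t_us": 150, "channel": "ao0", "value": 0.0}]))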

  2. A distributed, graphical user interface based, computer control system for atomic physics experiments.

    PubMed

    Keshet, Aviv; Ketterle, Wolfgang

    2013-01-01

    Atomic physics experiments often require a complex sequence of precisely timed computer controlled events. This paper describes a distributed graphical user interface-based control system designed with such experiments in mind, which makes use of off-the-shelf output hardware from National Instruments. The software makes use of a client-server separation between a user interface for sequence design and a set of output hardware servers. Output hardware servers are designed to use standard National Instruments output cards, but the client-server nature should allow this to be extended to other output hardware. Output sequences running on multiple servers and output cards can be synchronized using a shared clock. By using a field programmable gate array-generated variable frequency clock, redundant buffers can be dramatically shortened, and a time resolution of 100 ns achieved over effectively arbitrary sequence lengths.

  3. A computer-aided approach to nonlinear control synthesis

    NASA Technical Reports Server (NTRS)

    Wie, Bong; Anthony, Tobin

    1988-01-01

    The major objective of this project is to develop a computer-aided approach to nonlinear stability analysis and nonlinear control system design. This goal is to be obtained by refining the describing function method as a synthesis tool for nonlinear control design. The interim report outlines the approach by this study to meet these goals including an introduction to the INteractive Controls Analysis (INCA) program which was instrumental in meeting these study objectives. A single-input describing function (SIDF) design methodology was developed in this study; coupled with the software constructed in this study, the results of this project provide a comprehensive tool for design and integration of nonlinear control systems.

  4. Flight experience with lightweight, low-power miniaturized instrumentation systems

    NASA Technical Reports Server (NTRS)

    Hamory, Philip J.; Murray, James E.

    1992-01-01

    Engineers at the NASA Dryden Flight Research Facility (NASA-Dryden) have conducted two flight research programs with lightweight, low-power miniaturized instrumentation systems built around commercial data loggers. One program quantified the performance of a radio-controlled model airplane. The other program was a laminar boundary-layer transition experiment on a manned sailplane. The purpose of this paper is to report NASA-Dryden personnel's flight experience with the miniaturized instrumentation systems used on these two programs. The paper will describe the data loggers, the sensors, and the hardware and software developed to complete the systems. The paper also describes how the systems were used and covers the challenges encountered to make them work. Examples of raw data and derived results will be shown as well. Finally, future plans for these systems will be discussed.

  5. Multichamber Multipotentiostat System for Cellular Microphysiometry.

    PubMed

    Lima, Eduardo A; Snider, Rachel M; Reiserer, Ronald S; McKenzie, Jennifer R; Kimmel, Danielle W; Eklund, Sven E; Wikswo, John P; Cliffel, David E

    2014-12-01

    Multianalyte microphysiometry is a powerful technique for studying cellular metabolic flux in real time. Monitoring several analytes concurrently in a number of individual chambers, however, requires specific instrumentation that is not available commercially in a single, compact, benchtop form at an affordable cost. We developed a multipotentiostat system capable of performing simultaneous amperometric and potentiometric measurements in up to eight individual chambers. The modular design and custom LabVIEW™ control software provide flexibility and allow for expansion and modification to suit different experimental conditions. Superior accuracy is achieved when operating the instrument in a standalone configuration; however, measurements performed in conjunction with a previously developed multianalyte microphysiometer have shown low levels of crosstalk as well. Calibrations and experiments with primary and immortalized cell cultures demonstrate the performance of the instrument and its capabilities.

  6. Management and Stewardship of Airborne Observational Data for the NSF/NCAR HIAPER (GV) and NSF/NCAR C-130 at the National Center for Atmospheric Research (NCAR) Earth Observing Laboratory (EOL)

    NASA Astrophysics Data System (ADS)

    Aquino, J.

    2014-12-01

    The National Science Foundation (NSF) provides the National Center for Atmospheric Research (NCAR) Earth Observing Laboratory (EOL) funding for the operation, maintenance and upgrade of two research aircraft: the NSF/NCAR High-performance Instrumented Airborne Platform for Environmental Research (HIAPER) Gulfstream V and the NSF/NCAR Hercules C-130. A suite of in-situ and remote sensing airborne instruments housed at the EOL Research Aviation Facility (RAF) provide a basic set of measurements that are typically deployed on most airborne field campaigns. In addition, instruments to address more specific research requirements are provided by collaborating participants from universities, industry, NASA, NOAA or other agencies. The data collected are an important legacy of these field campaigns. A comprehensive metadata database and integrated cyber-infrastructure, along with a robust data workflow that begins during the field phase and extends to long-term archival (current aircraft data holdings go back to 1967), assures that: all data and associated software are safeguarded throughout the data handling process; community standards of practice for data stewardship and software version control are followed; simple and timely community access to collected data and associated software tools are provided; and the quality of the collected data is preserved, with the ultimate goal of supporting research and the reproducibility of published results. The components of this data system to be presented include: robust, searchable web access to data holdings; reliable, redundant data storage; web-based tools and scripts for efficient creation, maintenance and update of data holdings; access to supplemental data and documentation; storage of data in standardized data formats; comprehensive metadata collection; mature version control; human-discernable storage practices; and procedures to inform users of changes. In addition, lessons learned, shortcomings, and desired upgrades will be discussed.

  7. FIEStool: Automated data reduction for FIber-fed Echelle Spectrograph (FIES)

    NASA Astrophysics Data System (ADS)

    Stempels, Eric; Telting, John

    2017-08-01

    FIEStool automatically reduces data obtained with the FIber-fed Echelle Spectrograph (FIES) at the Nordic Optical Telescope, a high-resolution spectrograph available on a stand-by basis, while also allowing the basic properties of the reduction to be controlled in real time by the user. It provides a Graphical User Interface and offers bias subtraction, flat-fielding, scattered-light subtraction, and specialized reduction tasks from the external packages IRAF (ascl:9911.002) and NumArray. The core of FIEStool is instrument-independent; the software, written in Python, could with minor modifications also be used for automatic reduction of data from other instruments.
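    For orientation, the two most basic steps mentioned above (bias subtraction and flat-fielding) can be written in a few lines of NumPy. This is only a schematic stand-in; the actual pipeline delegates much of its reduction to IRAF tasks and handles echelle-specific steps not shown here.

      import numpy as np

      def reduce_frame(raw, bias_frames, flat_frames):
          """Return a bias-subtracted, flat-fielded science frame."""
          master_bias = np.median(np.stack(bias_frames), axis=0)
          master_flat = np.median(np.stack(flat_frames), axis=0) - master_bias
          master_flat /= np.median(master_flat)      # normalise to unit median
          return (raw - master_bias) / master_flat

      # Synthetic 100x100 frames just to exercise the function.
      rng = np.random.default_rng(0)
      bias = [300 + rng.normal(0, 5, (100, 100)) for _ in range(5)]
      flat = [300 + 2000 + rng.normal(0, 5, (100, 100)) for _ in range(5)]
      science = 300 + 500 + rng.normal(0, 5, (100, 100))
      print(reduce_frame(science, bias, flat).mean())   # roughly 500 for this input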

  8. Temperature control system for optical elements in astronomical instrumentation

    NASA Astrophysics Data System (ADS)

    Verducci, Orlando; de Oliveira, Antonio C.; Ribeiro, Flávio F.; Vital de Arruda, Márcio; Gneiding, Clemens D.; Fraga, Luciano

    2014-07-01

    Extremely low temperatures may damage the optical components assembled inside an astronomical instrument due to cracking of the resin or glue used to attach lenses and mirrors. The very cold, dry environment at most astronomical observatories contributes to this problem. This paper describes the solution implemented at SOAR for remotely monitoring and controlling temperatures inside a spectrograph, in order to prevent possible damage to the optical parts. The system automatically switches some heat dissipation elements, located near the optics, on and off as the measured temperature reaches a trigger value. This value is set to a temperature at which the instrument is not operational, since the aim is only to protect the optics, not to prevent malfunction. The software was developed with LabVIEW™ and based on an object-oriented design that offers flexibility and ease of maintenance. As a result, the system is able to keep the internal temperature of the instrument above a chosen limit, except perhaps during the response time, owing to thermal inertia. This inertia can be controlled and even avoided by choosing the correct amount of heat dissipation and the location of the thermal elements. A log file records the temperature values measured by the system for operational analysis.
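    A compact sketch of the protection logic described above: heaters switch on when the measured temperature falls to the trigger value and off again once it recovers, with each reading appended to a log file. The hysteresis band, function names, and file format are assumptions added for the example, not details of the SOAR implementation.

      import csv, time

      def protect_optics(read_temp_c, set_heater, trigger_c=2.0, release_c=4.0,
                         logfile="optics_temp_log.csv", interval_s=5.0, cycles=10):
          heater_on = False
          with open(logfile, "a", newline="") as fh:
              log = csv.writer(fh)
              for _ in range(cycles):
                  t = read_temp_c()
                  if not heater_on and t <= trigger_c:
                      heater_on = True          # temperature reached the trigger value
                  elif heater_on and t >= release_c:
                      heater_on = False         # small hysteresis band avoids chatter
                  set_heater(heater_on)
                  log.writerow([time.time(), t, heater_on])
                  time.sleep(interval_s)

      # Toy stand-ins for the hardware interfaces:
      # protect_optics(read_temp_c=lambda: 1.5, set_heater=print)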

  9. Streamlined sign-out of capillary protein electrophoresis using middleware and an open-source macro application.

    PubMed

    Mathur, Gagan; Haugen, Thomas H; Davis, Scott L; Krasowski, Matthew D

    2014-01-01

    Interfacing of clinical laboratory instruments with the laboratory information system (LIS) via "middleware" software is increasingly common. Our clinical laboratory implemented capillary electrophoresis using a Sebia(®) Capillarys-2™ (Norcross, GA, USA) instrument for serum and urine protein electrophoresis. Using Data Innovations Instrument Manager, an interface was established with the LIS (Cerner) that allowed for bi-directional transmission of numeric data. However, the text of the interpretive pathology report was not properly transferred. To reduce manual effort and possibility for error in text data transfer, we developed scripts in AutoHotkey, a free, open-source macro-creation and automation software utility. Scripts were written to create macros that automated mouse and key strokes. The scripts retrieve the specimen accession number, capture user input text, and insert the text interpretation in the correct patient record in the desired format. The scripts accurately and precisely transfer narrative interpretation into the LIS. Combined with bar-code reading by the electrophoresis instrument, the scripts transfer data efficiently to the correct patient record. In addition, the AutoHotKey script automated repetitive key strokes required for manual entry into the LIS, making protein electrophoresis sign-out easier to learn and faster to use by the pathology residents. Scripts allow for either preliminary verification by residents or final sign-out by the attending pathologist. Using the open-source AutoHotKey software, we successfully improved the transfer of text data between capillary electrophoresis software and the LIS. The use of open-source software tools should not be overlooked as tools to improve interfacing of laboratory instruments.

  10. On-Board Software Reference Architecture for Payloads

    NASA Astrophysics Data System (ADS)

    Bos, Victor; Rugina, Ana; Trcka, Adam

    2016-08-01

    The goal of the On-board Software Reference Architecture for Payloads (OSRA-P) is to identify an architecture for payload software to harmonize the payload domain, to enable more reuse of common/generic payload software across different payloads and missions and to ease the integration of the payloads with the platform. To investigate the payload domain, recent and current payload instruments of European space missions have been analyzed. This led to a Payload Catalogue describing 12 payload instruments as well as a Capability Matrix listing specific characteristics of each payload. In addition, a functional decomposition of payload software was prepared which contains functionalities typically found in payload systems. The definition of OSRA-P was evaluated by case studies and a dedicated OSRA-P workshop to gather feedback from the payload community.

  11. Development of a video-simulation instrument for assessing cognition in older adults.

    PubMed

    Ip, Edward H; Barnard, Ryan; Marshall, Sarah A; Lu, Lingyi; Sink, Kaycee; Wilson, Valerie; Chamberlain, Dana; Rapp, Stephen R

    2017-12-06

    Commonly used methods to assess cognition, such as direct observation, self-report, or neuropsychological testing, have significant limitations. Therefore, a novel tablet computer-based video simulation was created with the goal of being valid, reliable, and easy to administer. The design and implementation of the SIMBAC (Simulation-Based Assessment of Cognition) instrument is described in detail, as well as informatics "lessons learned" during development. The software emulates 5 common instrumental activities of daily living (IADLs) and scores participants' performance. The modules were chosen by a panel of geriatricians based on relevance to daily functioning and ability to be modeled electronically, and included facial recognition, pairing faces with the correct names, filling a pillbox, using an automated teller machine (ATM), and automatic renewal of a prescription using a telephone. Software development included three phases: 1) a period of initial design and testing (alpha version), 2) a pilot study with 10 cognitively normal and 10 cognitively impaired adults over the age of 60 (beta version), and 3) a larger validation study with 162 older adults of mixed cognitive status (release version). Results of the pilot study are discussed in the context of refining the instrument; full results of the validation study are reported in a separate article. In both studies, SIMBAC reliably differentiated controls from persons with cognitive impairment, and performance was highly correlated with Mini Mental Status Examination (MMSE) score. Several informatics challenges emerged during software development, which are broadly relevant to the design and use of electronic assessment tools. Solutions to these issues, such as protection of subject privacy and safeguarding against data loss, are discussed in depth. Collection of fine-grained data (highly detailed information such as time spent reading directions and the number of taps on screen) is also considered. SIMBAC provides clinicians direct insight into whether subjects can successfully perform selected cognitively intensive activities essential for independent living and advances the field of cognitive assessment. Insight gained from the development process could inform other researchers who seek to develop software tools in health care.

  12. Instrument response measurements of ion mobility spectrometers in situ: maintaining optimal system performance of fielded systems

    NASA Astrophysics Data System (ADS)

    Wallis, Eric; Griffin, Todd M.; Popkie, Norm, Jr.; Eagan, Michael A.; McAtee, Robert F.; Vrazel, Danet; McKinly, Jim

    2005-05-01

    Ion mobility spectrometry (IMS) is the most widespread detection technique in use by the military for the detection of chemical warfare agents, explosives, and other threat agents. Moreover, its role in homeland security and force protection has expanded due, in part, to its good sensitivity, low power, light weight, and reasonable cost. With the increased use of IMS systems as continuous monitors, it becomes necessary to develop tools and methodologies to ensure optimal performance over a wide range of conditions and extended periods of time. Namely, instrument calibration is needed to ensure proper sensitivity and to correct for matrix or environmental effects. We have developed methodologies to deal with the semi-quantitative nature of IMS and to generate response curves that provide a gauge of instrument performance and maintenance requirements. This instrumentation communicates with the IMS systems via a software interface that was developed in-house. The software measures system response, logs information to a database, and generates the response curves. This paper will discuss the instrumentation, software, data collected, and initial results from fielded systems.
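    One way to picture the response curves mentioned above is a simple saturating calibration fit, as sketched below. The model form, units, and data points are synthetic and chosen only to illustrate the idea of tracking fitted parameters as a gauge of instrument health.

      import numpy as np
      from scipy.optimize import curve_fit

      def saturating(c, rmax, k):
          """Langmuir-style response: linear at low concentration, saturating at rmax."""
          return rmax * c / (k + c)

      conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])      # challenge level, arbitrary units
      resp = np.array([0.08, 0.22, 0.55, 0.95, 1.30, 1.45])  # synthetic detector response

      (rmax, k), _ = curve_fit(saturating, conc, resp, p0=[1.5, 2.0])
      print(f"fitted response curve: rmax = {rmax:.2f}, k = {k:.2f}")
      # A drift in rmax or k between calibrations could flag a maintenance requirement.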

  13. Designing for Change: Minimizing the Impact of Changing Requirements in the Later Stages of a Spaceflight Software Project

    NASA Technical Reports Server (NTRS)

    Allen, B. Danette

    1998-01-01

    In the traditional 'waterfall' model of the software project life cycle, the Requirements Phase ends and flows into the Design Phase, which ends and flows into the Development Phase. Unfortunately, the process rarely, if ever, works so smoothly in practice. Instead, software developers often receive new requirements, or modifications to the original requirements, well after the earlier project phases have been completed. In particular, projects with shorter than ideal schedules are highly susceptible to frequent requirements changes, as the software requirements analysis phase is often forced to begin before the overall system requirements and top-level design are complete. This results in later modifications to the software requirements, even though the software design and development phases may be complete. Requirements changes received in the later stages of a software project inevitably lead to modification of existing developed software. Presented here is a series of software design techniques that can greatly reduce the impact of last-minute requirements changes. These techniques were successfully used to add built-in flexibility to two complex software systems in which the requirements were expected to (and did) change frequently. These large, real-time systems were developed at NASA Langley Research Center (LaRC) to test and control the Lidar In-Space Technology Experiment (LITE) instrument which flew aboard the space shuttle Discovery as the primary payload on the STS-64 mission.

  14. James Webb Space Telescope Integrated Science Instrument Module Thermal Vacuum Thermal Balance Test Campaign at NASA's Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    Glazer, Stuart; Comber, Brian (Inventor)

    2016-01-01

    The James Webb Space Telescope is a large infrared telescope with a 6.5-meter primary mirror, designed as a successor to the Hubble Space Telescope when launched in 2018. Three of the four science instruments contained within the Integrated Science Instrument Module (ISIM) are passively cooled to their operational temperature range of 36K to 40K with radiators, and the fourth instrument is actively cooled to its operational temperature of approximately 6K. Thermal-vacuum testing of the flight science instruments at the ISIM element level has taken place in three separate highly challenging and extremely complex thermal tests within a gaseous helium-cooled shroud inside Goddard Space Flight Center's Space Environment Simulator. Special data acquisition software was developed for these tests to monitor over 1700 flight and test sensor measurements, track over 50 gradients, component rates, and temperature limits in real time against defined constraints and limitations, and guide the complex transition from ambient to final cryogenic temperatures and back. This extremely flexible system has proven highly successful in safeguarding the nearly $2B science payload during the 3.5-month-long thermal tests. Heat flow measurement instrumentation, or Q-meters, were also specially developed for these tests. These devices provide thermal boundaries to the flight hardware while measuring instrument heat loads up to 600 mW with an estimated uncertainty of 2 mW in test, enabling accurate thermal model correlation, hardware design validation, and workmanship verification. The high accuracy heat load measurements provided first evidence of a potentially serious hardware design issue that was subsequently corrected. This paper provides an overview of the ISIM-level thermal-vacuum tests and thermal objectives; explains the thermal test configuration and thermal balances; describes special measurement instrumentation and monitoring and control software; presents key test thermal results; and lists problems encountered during testing and lessons learned.
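    The real-time limit tracking described above boils down to comparing each reading against configured bounds and rate limits every acquisition cycle. The sketch below shows that check in isolation; the sensor names, limits, and rates are invented, and the flight test software of course handled far more channels and constraint types.

      def check_limits(readings, previous, limits, max_rate_k_per_min, dt_min):
          """Return violation messages for one acquisition cycle of temperature data."""
          alarms = []
          for name, value in readings.items():
              lo, hi = limits[name]
              if not lo <= value <= hi:
                  alarms.append(f"{name}: {value:.2f} K outside [{lo}, {hi}] K")
              if name in previous:
                  rate = abs(value - previous[name]) / dt_min
                  if rate > max_rate_k_per_min:
                      alarms.append(f"{name}: rate {rate:.2f} K/min exceeds limit")
          return alarms

      limits = {"BENCH_T1": (35.0, 45.0)}   # hypothetical sensor and allowed band
      print(check_limits({"BENCH_T1": 46.2}, {"BENCH_T1": 45.1},
                         limits, max_rate_k_per_min=0.5, dt_min=1.0))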

  15. The MEDA Project: Developing Evaluation Competence in the Training Software Domain.

    ERIC Educational Resources Information Center

    Machell, Joan; Saunders, Murray

    1992-01-01

    The MEDA (Methodologie d'Evaluation des Didacticiels pour les Adultes) tool is a generic instrument to evaluate training courseware. It was developed for software designers to improve products, for instructors to select appropriate courseware, and for distributors and consultants to match software to client needs. Describes software evaluation…

  16. Titan 3E/Centaur D-1T Systems Summary

    NASA Technical Reports Server (NTRS)

    1973-01-01

    A systems and operational summary of the Titan 3E/Centaur D-1T program is presented which describes vehicle assembly facilities, launch facilities, and management responsibilities, and also provides detailed information on the following separate systems: (1) mechanical systems, including structural components, insulation, propulsion units, reaction control, thrust vector control, hydraulic systems, and pneumatic equipment; (2) astrionics systems, such as instrumentation and telemetry, navigation and guidance, C-Band tracking system, and range safety command system; (3) digital computer unit software; (4) flight control systems; (5) electrical/electronic systems; and (6) ground support equipment, including checkout equipment.

  17. Using Commercial Off-the-Shelf Software Tools for Space Shuttle Scientific Software

    NASA Technical Reports Server (NTRS)

    Groleau, Nicolas; Friedland, Peter (Technical Monitor)

    1994-01-01

    In October 1993, the Astronaut Science Advisor (ASA) was on board the STS-58 flight of the space shuttle. ASA is an interactive system providing data acquisition and analysis, experiment step re-scheduling, and various other forms of reasoning. As fielded, the system runs on a single Macintosh PowerBook 170, which hosts the six ASA modules. There is one other piece of hardware, an external (GW Instruments, Somerville, Massachusetts) analog-to-digital converter connected to the PowerBook's SCSI port. Three main software tools were used: LabVIEW, CLIPS, and HyperCard: First, a module written in LabVIEW (National Instruments, Austin, Texas) controls the A/D conversion and stores the resulting data in appropriate arrays. This module also analyzes the numerical data to produce a small set of characteristic numbers or symbols describing the results of an experiment trial. Second, a forward-chaining inference system written in CLIPS (NASA) uses the symbolic information provided by the first stage with a static rule base to infer decisions about the experiment. This expert system shell is used by the system for diagnosis. The third component of the system is the user interface, written in HyperCard (Claris Inc. and Apple Inc., both in Cupertino, California).

  18. Wind Evaluation Breadboard electronics and software

    NASA Astrophysics Data System (ADS)

    Núñez, Miguel; Reyes, Marcos; Viera, Teodora; Zuluaga, Pablo

    2008-07-01

    WEB, the Wind Evaluation Breadboard, is an Extremely Large Telescope primary mirror simulator, developed with the aim of quantifying the ability of a segmented primary mirror to cope with wind disturbances. This instrument, supported by the European Community (Framework Programme 6, ELT Design Study), is developed by ESO, IAC, MEDIA-ALTRAN, JUPASA and FOGALE. The WEB is a bench of about 20 tons and 7 m diameter emulating a segmented primary mirror and its cell, with 7 hexagonal segment simulators, including electromechanical support systems. In this paper we present the WEB central control electronics and the software development, which has to interface with: position actuators, auxiliary slave actuators, edge sensors, azimuth ring, elevation actuator, meteorological station and air balloons enclosure. The set of subsystems to control is a reduced version of a real telescope segmented-primary-mirror control system with high real-time performance, but emphasizing development-time efficiency and flexibility, because WEB is a test bench. The paper includes a detailed description of the hardware and software, paying special attention to real-time performance. The hardware is composed of three computers, and the software architecture has been divided into three intercommunicating applications, implemented using LabVIEW over Windows XP and the Pharlap ETS real-time operating system. The edge-sensor and position-actuator closed loop has a sampling and commanding frequency of 1 kHz.

  19. Navigating freely-available software tools for metabolomics analysis.

    PubMed

    Spicer, Rachel; Salek, Reza M; Moreno, Pablo; Cañueto, Daniel; Steinbeck, Christoph

    2017-01-01

    The field of metabolomics has expanded greatly over the past two decades, both as an experimental science with applications in many areas, as well as in regards to data standards and bioinformatics software tools. The diversity of experimental designs and instrumental technologies used for metabolomics has led to the need for distinct data analysis methods and the development of many software tools. To compile a comprehensive list of the most widely used freely available software and tools that are used primarily in metabolomics. The most widely used tools were selected for inclusion in the review by either ≥ 50 citations on Web of Science (as of 08/09/16) or the use of the tool being reported in the recent Metabolomics Society survey. Tools were then categorised by the type of instrumental data (i.e. LC-MS, GC-MS or NMR) and the functionality (i.e. pre- and post-processing, statistical analysis, workflow and other functions) they are designed for. A comprehensive list of the most used tools was compiled. Each tool is discussed within the context of its application domain and in relation to comparable tools of the same domain. An extended list including additional tools is available at https://github.com/RASpicer/MetabolomicsTools which is classified and searchable via a simple controlled vocabulary. This review presents the most widely used tools for metabolomics analysis, categorised based on their main functionality. As future work, we suggest a direct comparison of tools' abilities to perform specific data analysis tasks e.g. peak picking.

  20. Designing Tracking Software for Image-Guided Surgery Applications: IGSTK Experience

    PubMed Central

    Enquobahrie, Andinet; Gobbi, David; Turek, Matt; Cheng, Patrick; Yaniv, Ziv; Lindseth, Frank; Cleary, Kevin

    2009-01-01

    Objective Many image-guided surgery applications require tracking devices as part of their core functionality. The Image-Guided Surgery Toolkit (IGSTK) was designed and developed to interface tracking devices with software applications incorporating medical images. Methods IGSTK was designed as an open source C++ library that provides the basic components needed for fast prototyping and development of image-guided surgery applications. This library follows a component-based architecture with several components designed for specific sets of image-guided surgery functions. At the core of the toolkit is the tracker component that handles communication between a control computer and navigation device to gather pose measurements of surgical instruments present in the surgical scene. The representations of the tracked instruments are superimposed on anatomical images to provide visual feedback to the clinician during surgical procedures. Results The initial version of the IGSTK toolkit has been released in the public domain and several trackers are supported. The toolkit and related information are available at www.igstk.org. Conclusion With the increased popularity of minimally invasive procedures in health care, several tracking devices have been developed for medical applications. Designing and implementing high-quality and safe software to handle these different types of trackers in a common framework is a challenging task. It requires establishing key software design principles that emphasize abstraction, extensibility, reusability, fault-tolerance, and portability. IGSTK is an open source library that satisfies these needs for the image-guided surgery community. PMID:20037671

  1. An Open Hardware seismic data recorder - a solid basis for citizen science

    NASA Astrophysics Data System (ADS)

    Mertl, Stefan

    2015-04-01

    "Ruwai" is a 24-Bit Open Hardware seismic data recorder. It is built up of four stackable printed circuit boards fitting the Arduino Mega 2560 microcontroller prototyping platform. An interface to the BeagleBone Black single-board computer enables extensive data storage, -processing and networking capabilities. The four printed circuit boards provide a uBlox Lea-6T GPS module and real-time clock (GPS Timing shield), an Texas Instruments ADS1274 24-Bit analog to digital converter (ADC main shield), an analog input section with a Texas Instruments PGA281 programmable gain amplifier and an analog anti-aliasing filter (ADC analog interface pga) and the power conditioning based on 9-36V DC input (power supply shield). The Arduino Mega 2560 is used for controlling the hardware components, timestamping sampled data using the GPS timing information and transmitting the data to the BeagleBone Black single-board computer. The BeagleBone Black provides local data storage, wireless mesh networking using the optimized link state routing daemon and differential GNSS positioning using the RTKLIB software. The complete hardware and software is published under free software - or open hardware licenses and only free software (e.g. KiCad) was used for the development to facilitate the reusability of the design and increases the sustainability of the project. "Ruwai" was developed within the framework of the "Community Environmental Observation Network (CEON)" (http://www.mertl-research.at/ceon/) which was supported by the Internet Foundation Austria (IPA) within the NetIdee 2013 call.

  2. A new software for dimensional measurements in 3D endodontic root canal instrumentation.

    PubMed

    Sinibaldi, Raffaele; Pecci, Raffaella; Somma, Francesco; Della Penna, Stefania; Bedini, Rossella

    2012-01-01

    The main issue to be faced in obtaining size estimates of the 3D modification of the dental canal after endodontic treatment is the co-registration of the image stacks obtained through micro computed tomography (micro-CT) scans before and after treatment. Here, quantitative analysis of micro-CT images has been performed by means of new dedicated software targeted to the analysis of the root canal after endodontic instrumentation. This software analytically calculates the best superposition between the pre and post structures using the inertia tensor of the tooth. This strategy avoids minimization procedures, which can be user dependent and time consuming. Once the co-registration has been achieved, dimensional measurements are performed by contemporary evaluation of quantitative parameters over the two superimposed stacks of micro-CT images. The software automatically calculates the changes of volume, surface and symmetry axes in 3D occurring after the instrumentation. The calculation is based on direct comparison of the canal and canal branches selected by the user on the pre-treatment image stack.
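    The inertia-tensor alignment idea can be sketched in a few lines of NumPy: compute the centroid and principal axes of each segmented volume and use them as a common reference frame. This is a bare-bones illustration, not the cited software; in practice the sign ambiguity of the eigenvectors also has to be resolved.

      import numpy as np

      def principal_axes(volume):
          """Centroid and inertia-tensor eigenvectors of a binary 3-D volume."""
          coords = np.argwhere(volume)                 # occupied voxel coordinates (N, 3)
          centroid = coords.mean(axis=0)
          x = coords - centroid
          # Inertia tensor: I = sum(|r|^2 * Id - r r^T) over occupied voxels.
          tensor = (x ** 2).sum() * np.eye(3) - x.T @ x
          _, vecs = np.linalg.eigh(tensor)             # columns are the principal axes
          return centroid, vecs

      def alignment_rotation(volume_pre, volume_post):
          """Rotation estimate taking the post-treatment principal frame onto the pre frame."""
          _, axes_pre = principal_axes(volume_pre)
          _, axes_post = principal_axes(volume_post)
          return axes_pre @ axes_post.T

      # Toy check: an off-axis box aligned with itself gives (close to) the identity.
      vol = np.zeros((40, 40, 40), dtype=bool)
      vol[5:30, 10:20, 15:25] = True
      print(np.round(alignment_rotation(vol, vol), 3))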

  3. Application and design of solar photovoltaic system

    NASA Astrophysics Data System (ADS)

    Tianze, Li; Hengwei, Lu; Chuan, Jiang; Luan, Hou; Xia, Zhang

    2011-02-01

    Solar modules; power electronic equipment, including the charge-discharge controller, the inverter, the test instrumentation and the computer monitoring; and the storage battery or other energy storage and auxiliary generating plant make up the photovoltaic system presented in this paper. PV system design should meet the load supply requirements, keep the system cost low, consider the design of both software and hardware carefully, and carry out the general software design prior to the hardware design. Taking the design of a PV system as an example, the paper analyses the design of the system software and hardware, the economic benefit, and the basic ideas and steps of the installation and the connection of the system. It elaborates on the information acquisition, the software and hardware design of the system, and the evaluation and optimization of the system. Finally, it discusses the application and prospects of photovoltaic technology in outer space, solar lamps, freeways and communications.

  4. Sensor Webs: Autonomous Rapid Response to Monitor Transient Science Events

    NASA Technical Reports Server (NTRS)

    Mandl, Dan; Grosvenor, Sandra; Frye, Stu; Sherwood, Robert; Chien, Steve; Davies, Ashley; Cichy, Ben; Ingram, Mary Ann; Langley, John; Miranda, Felix

    2005-01-01

    To better understand how physical phenomena, such as volcanic eruptions, evolve over time, multiple sensor observations over the duration of the event are required. Using sensor web approaches that integrate original detections by in-situ sensors and global-coverage, lower-resolution, on-orbit assets with automated rapid-response observations from high-resolution sensors, more observations of significant events can be made with increased temporal, spatial, and spectral resolution. This paper describes experiments using Earth Observing 1 (EO-1) along with other space and ground assets to implement progressive mission autonomy to identify, locate and image, with high-resolution instruments, phenomena such as wildfires, volcanoes, floods and ice breakup. The software that plans, schedules and controls the various satellite assets is used to form ad hoc constellations which enable collaborative autonomous image collections triggered by transient phenomena. This software is both flight- and ground-based, works in concert to run all of the required assets cohesively, and includes model-based artificial intelligence software.

  5. Development of a Low-Cost Sub-Scale Aircraft for Flight Research: The FASER Project

    NASA Technical Reports Server (NTRS)

    Owens, Donald B.; Cox, David E.; Morelli, Eugene A.

    2006-01-01

    An inexpensive unmanned sub-scale aircraft was developed to conduct frequent flight test experiments for research and demonstration of advanced dynamic modeling and control design concepts. This paper describes the aircraft, flight systems, flight operations, and data compatibility including details of some practical problems encountered and the solutions found. The aircraft, named Free-flying Aircraft for Sub-scale Experimental Research, or FASER, was outfitted with high-quality instrumentation to measure aircraft inputs and states, as well as vehicle health parameters. Flight data are stored onboard, but can also be telemetered to a ground station in real time for analysis. Commercial-off-the-shelf hardware and software were used as often as possible. The flight computer is based on the PC104 platform, and runs xPC-Target software. Extensive wind tunnel testing was conducted with the same aircraft used for flight testing, and a six degree-of-freedom simulation with nonlinear aerodynamics was developed to support flight tests. Flight tests to date have been conducted to mature the flight operations, validate the instrumentation, and check the flight data for kinematic consistency. Data compatibility analysis showed that the flight data are accurate and consistent after corrections are made for estimated systematic instrumentation errors.

  6. Design and Experiment of Electrooculogram (EOG) System and Its Application to Control Mobile Robot

    NASA Astrophysics Data System (ADS)

    Sanjaya, W. S. M.; Anggraeni, D.; Multajam, R.; Subkhi, M. N.; Muttaqien, I.

    2017-03-01

    In this paper, we design and investigate the detection of the biological signal of eye movements (the electrooculogram). Detection of the electrooculogram (EOG) signal uses a four-stage instrumentation amplifier chain: a differential instrumentation amplifier, a high-pass filter (HPF) with 3 filter stages, a low-pass filter (LPF) with 3 filter stages and a level-shifter circuit. The total gain is 1000, with a frequency range of 0.5-30 Hz. The OP07 operational amplifier IC was used for all amplification stages. The EOG signal is read as an analog input by an Arduino microcontroller and interfaced via serial communication to a PC monitor using the Processing® software. The results of this research show distinct signal values for different eye movements. These EOG signal differences have been applied to navigation control of a mobile robot. In this research, all communication uses the Bluetooth HC-05 module.
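    The 0.5-30 Hz pass band and the overall gain of 1000 can be emulated digitally, which is handy for testing downstream logic without the analog front end. The sketch below uses a SciPy Butterworth band-pass filter; the sampling rate, filter order, and synthetic signal are assumptions for the example.

      import numpy as np
      from scipy.signal import butter, filtfilt

      fs = 250.0                                    # assumed sampling rate, Hz
      nyq = fs / 2.0
      b, a = butter(3, [0.5 / nyq, 30.0 / nyq], btype="bandpass")

      t = np.arange(0, 4, 1 / fs)
      raw_v = 200e-6 * np.sin(2 * np.pi * 2 * t) + 50e-6 * np.random.randn(t.size)
      eog_v = 1000 * filtfilt(b, a, raw_v)          # band-limit, then apply the x1000 gain
      print(f"filtered peak-to-peak: {np.ptp(eog_v) * 1e3:.1f} mV")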

  7. The Summer Undergraduate Research Internship Program at the Pisgah Astronomical Research Institute

    NASA Astrophysics Data System (ADS)

    Cline, J. Donald; Castelaz, M.; Whitworth, C.; Clavier, D.; Owen, L.; Barker, T.

    2012-01-01

    Pisgah Astronomical Research Institute (PARI) offers summer undergraduate research internships. PARI has received support for the internships from the NC Space Grant Consortium, NSF awards for public science education, private donations, private foundations, and through a collaboration with the Pisgah Astronomical Research and Education Center of the University of North Carolina - Asheville. The internship program began in 2001 with 4 students; 7 funded students participated in 2011. Mentors for the interns include PARI's Science, Education, and Information Technology Directors and visiting faculty who are members of the PARI Research Affiliate Faculty program. Students work with mentors on radio and optical astronomy research, electrical engineering for robotic control of instruments, software development for instrument control and software for citizen science projects, and science education by developing curricula and multimedia and teaching high school students in summer programs at PARI. At the end of the summer interns write a paper about their research which is published in the PARI Summer Student Proceedings. Several of the students have presented their results at AAS Meetings. We will present a summary of specific research conducted by the students with their mentors, the logistics for hosting the PARI undergraduate internship program, and plans for growth based on the impact of an NSF-supported renovation to the Research Building on the PARI campus.

  8. Assessing the detectability of antioxidants in two-dimensional high-performance liquid chromatography.

    PubMed

    Bassanese, Danielle N; Conlan, Xavier A; Barnett, Neil W; Stevenson, Paul G

    2015-05-01

    This paper explores the analytical figures of merit of two-dimensional high-performance liquid chromatography for the separation of antioxidant standards. The cumulative two-dimensional high-performance liquid chromatography peak area was calculated for 11 antioxidants by two different methods--the areas reported by the control software and by fitting the data with a Gaussian model; these methods were evaluated for precision and sensitivity. Both methods demonstrated excellent precision with regard to retention time in the second dimension (%RSD below 1.16%) and cumulative second-dimension peak area (%RSD below 3.73% for the instrument software and 5.87% for the Gaussian method). Combining areas reported by the high-performance liquid chromatography control software displayed superior limits of detection, on the order of 1 × 10^-6 M, almost an order of magnitude lower than the Gaussian method for some analytes. The introduction of the countergradient eliminated the strong solvent mismatch between dimensions, leading to much improved peak shape and better detection limits for quantification. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
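    The Gaussian-model route mentioned above amounts to fitting each second-dimension peak and integrating the fit analytically. A minimal version with SciPy is sketched below on synthetic data; the real analysis applied this across the slices of the 2D separation.

      import numpy as np
      from scipy.optimize import curve_fit

      def gaussian(t, amp, mu, sigma, baseline):
          return baseline + amp * np.exp(-0.5 * ((t - mu) / sigma) ** 2)

      t = np.linspace(0, 60, 300)                           # second-dimension time, s
      clean = gaussian(t, amp=1.0, mu=30.0, sigma=2.5, baseline=0.02)
      signal = clean + np.random.default_rng(1).normal(0, 0.01, t.size)

      (amp, mu, sigma, baseline), _ = curve_fit(gaussian, t, signal, p0=[0.8, 28.0, 3.0, 0.0])
      area = amp * abs(sigma) * np.sqrt(2 * np.pi)          # analytic area under the Gaussian
      print(f"fitted retention time {mu:.2f} s, peak area {area:.3f} AU*s")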

  9. IAC level "O" program development

    NASA Technical Reports Server (NTRS)

    Vos, R. G.

    1982-01-01

    The current status of the IAC development activity is summarized. The listed prototype software and documentation were delivered, and plans were detailed for development of the level 1 operational system. The planned end-product IAC is required to support LSST design analysis and performance evaluation, with emphasis on the coupling of required technical disciplines. The long-term IAC effectively provides two distinct features: a specific set of analysis modules (thermal, structural, controls, antenna radiation performance and instrument optical performance) that will function together with the IAC supporting software in an integrated and user-friendly manner; and a general framework whereby new analysis modules can readily be incorporated into IAC or be allowed to communicate with it.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Price, H.L.

    Much of the polymer composites industry is built around the thermochemical conversion of raw material into useful composites. The raw materials (molding compound, prepreg) often are made up of thermosetting resins and small fibers or particles. While this conversion can follow a large number of paths, only a few paths are efficient, economical and lead to desirable composite properties. Processing instrument (P/I) technology enables a computer to sense and interpret changes taking place during the cure of prepreg or molding compound. P/I technology has been used to make estimates of gel time and cure time, thermal diffusivity measurements and transition temperature measurements. Control and sensing software is comparatively straightforward. The interpretation of results with appropriate software is under development.

  11. Students' Expectations from Technology in Mathematical Tasks: Mathematical Relationships between Objects, Instrumental Genesis and Emergent Goals

    ERIC Educational Resources Information Center

    Laina, Vasiliki; Monaghan, John

    2014-01-01

    This paper reports on two students' work on geometry tasks in a dynamic geometry system. It augments prior work on students' instrumental geneses via a consideration of emergent goals that arise in students' work. It offers a way to interpret students' (working with new software) awareness of what software can and cannot do and students'…

  12. AMPS data management requirements study. [user manuals (computer programs)/display devices - computerized simulation/experimentation/ionosphere

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A data simulation is presented for instruments and associated control and display functions required to perform controlled active experiments of the atmosphere. A comprehensive user's guide is given for the data requirements and software developed for the following experiments: (1) electromagnetic wave transmission; (2) passive observation of ambient plasmas; (3) ionospheric measurements with a subsatellite; (4) electron accelerator beam measurements; and (5) measurement of acoustic gravity waves in the sodium layer using lasers. A complete description of each experiment is given.

  13. ANALOG I/O MODULE TEST SYSTEM BASED ON EPICS CA PROTOCOL AND ACTIVEX CA INTERFACE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    YENG,YHOFF,L.

    2003-10-13

    Analog input (ADC) and output (DAC) modules play a substantial role in device-level control of accelerator and large experimental physics control systems. In order to get the best performance, some features of analog modules, including linearity, accuracy, crosstalk, thermal drift and so on, have to be evaluated during the preliminary design phase. Gain and offset error calibration and thermal drift compensation (if needed) may have to be done in the implementation phase as well. A natural technique for performing these tasks is to interface the analog I/O modules and GPIB-interface programmable test instruments with a computer, which can complete measurements or calibration automatically. A difficulty is that drivers of analog modules and test instruments usually work on totally different platforms (VxWorks vs. Windows). Developing new test routines and drivers for testing instruments under VxWorks (or any other RTOS) is not a good solution because such systems have relatively poor user interfaces and developing such software requires substantial effort. The EPICS CA protocol and ActiveX CA interface provide another choice: a PC- and LabVIEW-based test system. Analog I/O modules can be interfaced from LabVIEW test routines via the ActiveX CA interface. Test instruments can be controlled via LabVIEW drivers, most of which are provided by instrument vendors or by National Instruments. LabVIEW also provides extensive data analysis and processing functions. Using these functions, users can generate powerful test routines very easily. Several applications built for the Spallation Neutron Source (SNS) Beam Loss Monitor (BLM) system are described in this paper.
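    A gain/offset evaluation of the kind described can be reduced to a straight-line fit of measured versus applied voltage. The sketch below uses synthetic readings and NumPy's polyfit; the instrument I/O (stepping the programmable source, reading the ADC through Channel Access) is omitted and the numbers are invented.

      import numpy as np

      applied_v = np.linspace(-10, 10, 21)   # voltages programmed on the test source
      rng = np.random.default_rng(2)
      measured_v = 1.002 * applied_v + 0.004 + rng.normal(0, 5e-4, applied_v.size)

      gain, offset = np.polyfit(applied_v, measured_v, 1)
      residuals = measured_v - (gain * applied_v + offset)
      print(f"gain error {(gain - 1) * 100:.3f} %, offset {offset * 1e3:.2f} mV, "
            f"max residual {np.max(np.abs(residuals)) * 1e3:.2f} mV")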

  14. Critical Design Decisions of The Planck LFI Level 1 Software

    NASA Astrophysics Data System (ADS)

    Morisset, N.; Rohlfs, R.; Türler, M.; Meharga, M.; Binko, P.; Beck, M.; Frailis, M.; Zacchei, A.

    2010-12-01

    The PLANCK satellite, with two on-board instruments, a Low Frequency Instrument (LFI) and a High Frequency Instrument (HFI), was launched on May 14th on an Ariane 5. The ISDC Data Centre for Astrophysics in Versoix, Switzerland has developed and maintains the Planck LFI Level 1 software for the Data Processing Centre (DPC) in Trieste, Italy. The main tasks of the Level 1 processing are to retrieve the daily available scientific and housekeeping (HK) data of the LFI instrument, the Sorption Cooler and the 4K Cooler from the Mission Operations Centre (MOC) in Darmstadt; to sort them by time and by type (detector, observing mode, etc...); to extract the spacecraft attitude information from auxiliary files; to flag the data according to several criteria; and to archive the resulting Time Ordered Information (TOI), which will then be used to produce maps of the sky in different spectral bands. The outputs of the Level 1 software are the TOI files in FITS format, later ingested into the Data Management Component (DMC) database. This software has been used during different phases of the LFI instrument development. We started by reusing some ISDC components for the LFI Qualification Model (QM) and we completely reworked the software for the Flight Model (FM). This was motivated by critical design decisions taken jointly with the DPC. The main questions were: a) the choice of the data format: FITS or DMC? b) the design of the pipelines: use of the Planck Process Coordinator (ProC) or a simple Perl script? c) do we adapt the existing QM software or do we restart from scratch? The timeline and available manpower are also important issues to be taken into account. We present here the orientation of our choices and discuss their pertinence based on the experience of the final pre-launch tests and the start of real Planck LFI operations.
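    The archiving step (time-ordering a chunk of samples and writing it out as a FITS table) can be sketched with Astropy. The column names, units, and file name below are placeholders, not the LFI TOI definitions.

      import numpy as np
      from astropy.io import fits

      obt = np.array([102.4, 100.0, 101.2])        # on-board times, arbitrary units
      signal = np.array([0.31, 0.29, 0.30])        # detector samples
      flags = np.array([0, 0, 1], dtype=np.int16)  # e.g. 1 = sample flagged as bad

      order = np.argsort(obt)                      # sort by time before archiving
      cols = [fits.Column(name="OBT", format="D", array=obt[order]),
              fits.Column(name="SIGNAL", format="D", array=signal[order]),
              fits.Column(name="FLAG", format="I", array=flags[order])]
      fits.BinTableHDU.from_columns(cols, name="TOI").writeto("toi_example.fits",
                                                              overwrite=True)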

  15. ICESat Science Investigator led Processing System (I-SIPS)

    NASA Astrophysics Data System (ADS)

    Bhardwaj, S.; Bay, J.; Brenner, A.; Dimarzio, J.; Hancock, D.; Sherman, M.

    2003-12-01

    The ICESat Science Investigator-led Processing System (I-SIPS) generates the GLAS standard data products. It consists of two main parts the Scheduling and Data Management System (SDMS) and the Geoscience Laser Altimeter System (GLAS) Science Algorithm Software. The system has been operational since the successful launch of ICESat. It ingests data from the GLAS instrument, generates GLAS data products, and distributes them to the GLAS Science Computing Facility (SCF), the Instrument Support Facility (ISF) and the National Snow and Ice Data Center (NSIDC) ECS DAAC. The SDMS is the Planning, Scheduling and Data Management System that runs the GLAS Science Algorithm Software (GSAS). GSAS is based on the Algorithm Theoretical Basis Documents provided by the Science Team and is developed independently of SDMS. The SDMS provides the processing environment to plan jobs based on existing data, control job flow, data distribution, and archiving. The SDMS design is based on a mission-independent architecture that imposes few constraints on the science code thereby facilitating I-SIPS integration. I-SIPS currently works in an autonomous manner to ingest GLAS instrument data, distribute this data to the ISF, run the science processing algorithms to produce the GLAS standard products, reprocess data when new versions of science algorithms are released, and distributes the products to the SCF, ISF, and NSIDC. I-SIPS has a proven performance record, delivering the data to the SCF within hours after the initial instrument activation. The I-SIPS design philosophy gives this system a high potential for reuse in other science missions.

  16. New Software Architecture Options for the TCL Data Acquisition System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Valenton, Emmanuel

    2014-09-01

    The Turbulent Combustion Laboratory (TCL) conducts research on combustion in turbulent flow environments. To conduct this research, the TCL utilizes several pulse lasers, a traversable wind tunnel, flow controllers, scientific-grade CCD cameras, and numerous other components. Responsible for managing these different data-acquiring instruments and data processing components is the Data Acquisition (DAQ) software. However, the current system is constrained to running through VXI hardware—an instrument-computer interface—that is several years old, requiring the use of an outdated version of the visual programming language LabVIEW. A new acquisition system is being programmed which will borrow heavily from either a programming model known as the Current Value Table (CVT) System or another model known as the Server-Client System. The CVT System model is, in essence, a giant spreadsheet from which data or commands may be retrieved or to which they may be written, and the Server-Client System is based on network connections between a server and a client, very much like the server-client model of the Internet. Currently, the bare elements of a CVT DAQ software have been implemented, consisting of client programs in addition to a server program that the CVT will run on. This system is being rigorously tested to evaluate the merits of pursuing the CVT System model and to uncover any potential flaws which may arise in further implementation. If the CVT System is chosen, which is likely, then future work will consist of building up the system until enough client programs have been created to run the individual components of the lab. The advantages of such a system will be flexibility, portability, and polymorphism. Additionally, the new DAQ software will allow the lab to replace the VXI with a newer instrument interface—the PXI—and take advantage of the capabilities of current and future versions of LabVIEW.
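
    To make the Current Value Table concept concrete, the following Python sketch shows a minimal, thread-safe name/value table that instrument clients write to and read from. It illustrates the idea only; the TCL system exposes its table over the network and is written in LabVIEW, and the topic names below are invented.

        # Minimal sketch of a Current Value Table (CVT): a thread-safe name/value
        # store that instrument clients write to and poll. Names are illustrative.
        import threading

        class CurrentValueTable:
            def __init__(self):
                self._values = {}
                self._lock = threading.Lock()

            def write(self, name, value):
                with self._lock:
                    self._values[name] = value

            def read(self, name, default=None):
                with self._lock:
                    return self._values.get(name, default)

        cvt = CurrentValueTable()
        cvt.write("wind_tunnel/fan_speed_hz", 12.5)   # a controller posts a command
        cvt.write("camera0/exposure_ms", 30)
        print(cvt.read("wind_tunnel/fan_speed_hz"))   # another client polls the value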

  17. Quality control and assurance for validation of DOS/I measurements

    NASA Astrophysics Data System (ADS)

    Cerussi, Albert; Durkin, Amanda; Kwong, Richard; Quang, Timothy; Hill, Brian; Tromberg, Bruce J.; MacKinnon, Nick; Mantulin, William W.

    2010-02-01

    Ongoing multi-center clinical trials are crucial for Biophotonics to gain acceptance in medical imaging. In these trials, quality control (QC) and assurance (QA) are key to success and provide "data insurance". Quality control and assurance deal with standardization, validation, and compliance of procedures, materials and instrumentation. Specifically, QC/QA involves systematic assessment of testing materials, instrumentation performance, standard operating procedures, data logging, analysis, and reporting. QC and QA are important for FDA accreditation and acceptance by the clinical community. Our Biophotonics research in the Network for Translational Research in Optical Imaging (NTROI) program for breast cancer characterization focuses on QA/QC issues primarily related to the broadband Diffuse Optical Spectroscopy and Imaging (DOS/I) instrumentation, because this is an emerging technology with limited standardized QC/QA in place. In the multi-center trial environment, we implement QA/QC procedures: 1. Standardize and validate calibration standards and procedures. (DOS/I technology requires both frequency domain and spectral calibration procedures using tissue simulating phantoms and reflectance standards, respectively.) 2. Standardize and validate data acquisition, processing and visualization (optimize instrument software-EZDOS; centralize data processing) 3. Monitor, catalog and maintain instrument performance (document performance; modularize maintenance; integrate new technology) 4. Standardize and coordinate trial data entry (from individual sites) into centralized database 5. Monitor, audit and communicate all research procedures (database, teleconferences, training sessions) between participants ensuring "calibration". This manuscript describes our ongoing efforts, successes and challenges implementing these strategies.

  18. Integrated fiducial sample mount and software for correlated microscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Timothy R McJunkin; Jill R. Scott; Tammy L. Trowbridge

    2014-02-01

    A novel sample mount design with integrated fiducials, and software for assisting operators in easily and efficiently locating points of interest established in previous analytical sessions, is described. The sample holder and software were evaluated with experiments to demonstrate the utility and ease of finding the same points of interest in two different microscopy instruments. In addition, a numerical analysis of the expected errors in determining the same position, with errors unbiased by a human operator, was performed. Based on the results, issues related to achieving reproducibility and best practices for using the sample mount and software were identified. Overall, the sample mount methodology allows data to be efficiently and easily collected on different instruments for the same sample location.
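
    The core computation behind this kind of correlated-microscopy workflow is a coordinate transform anchored on the fiducials. The Python sketch below recovers a least-squares affine transform between two instruments' stage coordinate systems from three fiducial positions and then maps a point of interest from one instrument to the other; all coordinates are made-up illustrative values, not data from the described mount.

        # Hedged sketch: recover the affine transform between two instruments' stage
        # coordinate systems from fiducial positions, then map a point of interest
        # found in instrument A into instrument B coordinates.
        import numpy as np

        fid_a = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])    # mm, instrument A
        fid_b = np.array([[1.2, -0.4], [11.1, -0.2], [1.0, 9.7]])   # mm, instrument B

        # Solve [x_a, y_a, 1] @ M = [x_b, y_b] in the least-squares sense
        A = np.hstack([fid_a, np.ones((len(fid_a), 1))])
        M, *_ = np.linalg.lstsq(A, fid_b, rcond=None)

        poi_a = np.array([4.2, 7.5])                                # point of interest in A
        poi_b = np.append(poi_a, 1.0) @ M
        print("Same feature in instrument B coordinates:", poi_b)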

  19. Instrumentino: An Open-Source Software for Scientific Instruments.

    PubMed

    Koenka, Israel Joel; Sáiz, Jorge; Hauser, Peter C

    2015-01-01

    Scientists often need to build dedicated computer-controlled experimental systems. For this purpose, it is becoming common to employ open-source microcontroller platforms, such as the Arduino. These boards and associated integrated software development environments provide affordable yet powerful solutions for the implementation of hardware control of transducers and acquisition of signals from detectors and sensors. It is, however, a challenge to write programs that allow interactive use of such arrangements from a personal computer. This task is particularly complex if some of the included hardware components are connected directly to the computer and not via the microcontroller. A graphical user interface framework, Instrumentino, was therefore developed to allow the creation of control programs for complex systems with minimal programming effort. By writing a single code file, a powerful custom user interface is generated, which enables the automatic running of elaborate operation sequences and observation of acquired experimental data in real time. The framework, which is written in Python, allows extension by users, and is made available as an open source project.

  20. Towards a flexible array control and operation framework for CTA

    NASA Astrophysics Data System (ADS)

    Birsin, E.; Colomé, J.; Hoffmann, D.; Koeppel, H.; Lamanna, G.; Le Flour, T.; Lopatin, A.; Lyard, E.; Melkumyan, D.; Oya, I.; Panazol, J.-L.; Schlenstedt, S.; Schmidt, T.; Schwanke, U.; Stegmann, C.; Walter, R.; Wegner, P.; CTA Consortium

    2012-12-01

    The Cherenkov Telescope Array (CTA) [1] will be the successor to current Imaging Atmospheric Cherenkov Telescopes (IACT) like H.E.S.S., MAGIC and VERITAS. CTA will improve in sensitivity by about an order of magnitude compared to the current generation of IACTs. The energy range will extend from well below 100 GeV to above 100 TeV. To accomplish these goals, CTA will consist of two arrays, one in each hemisphere, consisting of 50-80 telescopes and composed of three different telescope types with different mirror sizes. It will be the first open observatory for very high energy γ-ray astronomy. The Array Control working group of CTA is currently evaluating existing technologies which are best suited for a project like CTA. The considered solutions comprise the ALMA Common Software (ACS), the OPC Unified Architecture (OPC UA) and the Data Distribution Service (DDS) for bulk data transfer. The first applications, like an automatic observation scheduler and the control software for some prototype instrumentation have been developed.

  1. The LBT real-time based control software to mitigate and compensate vibrations

    NASA Astrophysics Data System (ADS)

    Borelli, J.; Trowitzsch, J.; Brix, M.; Kürster, M.; Gässler, W.; Bertram, T.; Briegel, F.

    2010-07-01

    The Large Binocular Telescope (LBT) uses two 8.4-meter active primary mirrors and two adaptive secondary mirrors on the same mounting to take advantage of its interferometric capabilities. Both applications, interferometry and AO, are sensitive to vibrations. Several measurement campaigns have been carried out at the LBT and their results strongly indicate that a vibration monitoring system is required to improve the performance of LINC-NIRVANA, LBTI, and ARGOS, the laser-guided ground-layer adaptive optics system. Control software for mitigation and compensation of the vibrations is currently being designed. A complex set of algorithms collects real-time vibration data, archives it for further analysis, and, in parallel, generates the tip-tilt and optical path difference (OPD) data for the control loops of the instruments. A real-time data acquisition device equipped with embedded real-time Linux is used in our systems. A set of quick-look tools is currently under development in order to verify whether the conditions at the telescope are suitable for interferometric/adaptive observations.
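
    One small ingredient of such a quick-look tool is turning an accelerometer stream into a vibration spectrum so that problematic frequencies can be flagged. The Python sketch below does this for a synthetic signal; the sample rate and the 47 Hz test tone are assumptions for illustration, not measured LBT values.

        # Hedged sketch of a quick-look vibration check: estimate the power spectrum
        # of an accelerometer channel and report the dominant vibration frequency.
        # The signal below is synthetic; a real system would read the RT DAQ stream.
        import numpy as np

        fs = 1000.0                                   # sample rate in Hz (assumed)
        t = np.arange(0, 2.0, 1.0 / fs)
        accel = 0.05 * np.sin(2 * np.pi * 47.0 * t) + 0.01 * np.random.randn(t.size)

        spectrum = np.abs(np.fft.rfft(accel * np.hanning(t.size))) ** 2
        freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)

        peak = freqs[np.argmax(spectrum[1:]) + 1]     # skip the DC bin
        print(f"dominant vibration component near {peak:.1f} Hz")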

  2. Remote monitoring system for the cryogenic system of superconducting magnets in the SuperKEKB interaction region

    NASA Astrophysics Data System (ADS)

    Aoki, K.; Ohuchi, N.; Zong, Z.; Arimoto, Y.; Wang, X.; Yamaoka, H.; Kawai, M.; Kondou, Y.; Makida, Y.; Hirose, M.; Endou, T.; Iwasaki, M.; Nakamura, T.

    2017-12-01

    A remote monitoring system was developed based on the software infrastructure of the Experimental Physics and Industrial Control System (EPICS) for the cryogenic system of the superconducting magnets in the interaction region of the SuperKEKB accelerator. SuperKEKB has been constructed to conduct high-energy physics experiments at KEK. These superconducting magnets consist of three apparatuses: the Belle II detector solenoid and the QCSL and QCSR accelerator magnets. They are contained in three cryostats cooled by dedicated helium cryogenic systems. The monitoring system was developed to read data from the EX-8000, which is an integrated instrumentation system controlling all cryogenic components. The monitoring system uses the I/O control tools of the EPICS software for TCP/IP, archiving techniques using a relational database, and an easy human-computer interface. Using this monitoring system, it is possible to remotely monitor all real-time data of the superconducting magnets and cryogenic systems. It is also convenient for sharing data among multiple groups.
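
    As an illustration of the monitoring-plus-archiving pattern described above, the following Python sketch subscribes to an EPICS process variable with pyepics and appends every update to a relational table, with SQLite standing in for the production database. The PV name is hypothetical and not taken from the SuperKEKB system.

        # Hedged sketch: subscribe to an EPICS PV and archive every update into a
        # relational table. The PV name is hypothetical; SQLite stands in for the
        # production relational database.
        import sqlite3, time
        from epics import PV

        db = sqlite3.connect("cryo_archive.db", check_same_thread=False)
        db.execute("CREATE TABLE IF NOT EXISTS samples (ts REAL, pv TEXT, value REAL)")

        def on_update(pvname=None, value=None, timestamp=None, **kw):
            db.execute("INSERT INTO samples VALUES (?, ?, ?)", (timestamp, pvname, value))
            db.commit()

        pv = PV("CRYO:QCSL:HE_TEMP", callback=on_update)   # hypothetical PV name
        time.sleep(60)                                     # collect one minute of updates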

  3. Sample Analysis at Mars Instrument Simulator

    NASA Technical Reports Server (NTRS)

    Benna, Mehdi; Nolan, Tom

    2013-01-01

    The Sample Analysis at Mars Instrument Simulator (SAMSIM) is a numerical model dedicated to planning and validating operations of the Sample Analysis at Mars (SAM) instrument on the surface of Mars. The SAM instrument suite, currently operating on the Mars Science Laboratory (MSL), is an analytical laboratory designed to investigate the chemical and isotopic composition of the atmosphere and of volatiles extracted from solid samples. SAMSIM was developed using the Matlab and Simulink libraries of MathWorks Inc. to provide MSL mission planners with accurate predictions of the instrument's electrical, thermal, mechanical, and fluid responses to scripted commands. This tool is a first example of multi-purpose, full-scale numerical modeling of a flight instrument with the purpose of supplementing or even entirely eliminating the need for a hardware engineering model during instrument development and operation. SAMSIM simulates the complex interactions that occur between the instrument Command and Data Handling unit (C&DH) and all subsystems during the execution of experiment sequences. A typical SAM experiment takes many hours to complete and involves hundreds of components. During the simulation, the electrical, mechanical, thermal, and gas dynamics states of each hardware component are accurately modeled and propagated within the simulation environment at faster than real time. This allows the simulation, in just a few minutes, of experiment sequences that take many hours to execute on the real instrument. The SAMSIM model is divided into five distinct but interacting modules: software, mechanical, thermal, gas flow, and electrical. The software module simulates the instrument C&DH by executing a customized version of the instrument flight software in a Matlab environment. The inputs and outputs of this synthetic C&DH are mapped to virtual sensors and command lines that mimic, in their structure and connectivity, the layout of the instrument harnesses. This module executes, and thus validates, complex command scripts prior to their uplinking to the SAM instrument. As an output, this module generates synthetic data and message logs at a rate similar to that of the actual instrument.

  4. Design of the on-board application software for the instrument control unit of Euclid-NISP

    NASA Astrophysics Data System (ADS)

    Ligori, Sebastiano; Corcione, Leonardo; Capobianco, Vito; Valenziano, Luca

    2014-08-01

    In this paper we describe the main requirements driving the development of the application software of the ICU of NISP, the Near-Infrared Spectro-Photometer of the Euclid mission. This software will be based on a real-time operating system and will interface with all the subunits of NISP, as well as with the CMDU of the spacecraft for telecommand and housekeeping management. We briefly detail the services (following the PUS standard) that will be made available, and also possible commonalities in the approach with the ASW of the VIS CDPU, which could make the development effort more efficient; this approach could also make maintenance of the SW during the mission easier. The development plan of the ASW and the next milestones foreseen are described, together with the architectural design approach and the development environment we are setting up.

  5. Simulation of Hazards and Poses for a Rocker-Bogie Rover

    NASA Technical Reports Server (NTRS)

    Backes, Paul; Norris, Jeffrey; Powell, Mark; Tharp, Gregory

    2004-01-01

    Provisions for specification of hazards faced by a robotic vehicle (rover) equipped with a rocker-bogie suspension, for prediction of collisions between the vehicle and the hazards, and for simulation of poses of the vehicle at selected positions on the terrain have been incorporated into software that simulates the movements of the vehicle on planned paths across the terrain. The software in question is that of the Web Interface for Telescience (WITS), selected aspects of which have been described in a number of prior NASA Tech Briefs articles. To recapitulate: The WITS is a system of computer software that enables scientists, located at geographically dispersed computer terminals connected to the World Wide Web, to command instrumented robotic vehicles (rovers) during exploration of Mars and perhaps eventually of other planets. The WITS also has potential for adaptation to terrestrial use in telerobotics and other applications that involve computer-based remote monitoring, supervision, control, and planning.

  6. Operational radiological support for the US manned space program

    NASA Technical Reports Server (NTRS)

    Golightly, Michael J.; Hardy, Alva C.; Atwell, William; Weyland, Mark D.; Kern, John; Cash, Bernard L.

    1993-01-01

    Radiological support for the manned space program is provided by the Space Radiation Analysis Group at NASA/JSC. This support ensures crew safety through mission design analysis, real-time space environment monitoring, and crew exposure measurements. Preflight crew exposure calculations using mission design information are used to ensure that crew exposures will remain within established limits. During missions, space environment conditions are continuously monitored from within the Mission Control Center. In the event of a radiation environment enhancement, the impact to crew exposure is assessed and recommendations are provided to flight management. Radiation dosimeters are placed throughout the spacecraft and provided to each crewmember. During a radiation contingency, the crew could be requested to provide dosimeter readings. This information would be used for projecting crew dose enhancements. New instrumentation and computer technology are being developed to improve the support. Improved instruments include tissue equivalent proportional counter (TEPC)-based dosimeters and charged particle telescopes. Data from these instruments will be telemetered and will provide flight controllers with unprecedented information regarding the radiation environment in and around the spacecraft. New software is being acquired and developed to provide 'smart' space environmental data displays for use by flight controllers.

  7. Identification and Quantitative Measurements of Chemical Species by Mass Spectrometry

    NASA Technical Reports Server (NTRS)

    Zondlo, Mark A.; Bomse, David S.

    2005-01-01

    The development of a miniature gas chromatograph/mass spectrometer system for the measurement of chemical species of interest to combustion is described. The completed system is a fully contained, automated instrument consisting of a sampling inlet, a small-scale gas chromatograph, a miniature quadrupole mass spectrometer, vacuum pumps, and software. A pair of computer-driven valves controls the gas sampling and introduction to the chromatographic column. The column has a stainless steel exterior and a silica interior, and contains an adsorbent that is used to separate organic species. The detection system is based on a quadrupole mass spectrometer consisting of a micropole array, an electrometer, and a computer interface. The vacuum system has two miniature pumps to maintain the low pressure needed for the mass spectrometer. A laptop computer uses custom software to control the entire system and collect the data. In a laboratory demonstration, the system separated calibration mixtures containing 1000 ppm of alkanes and alkenes.

  8. Means of storage and automated monitoring of versions of text technical documentation

    NASA Astrophysics Data System (ADS)

    Leonovets, S. A.; Shukalov, A. V.; Zharinov, I. O.

    2018-03-01

    The paper considers automation of the process of preparing, storing and monitoring versions of textual design and program documentation by means of specialized software. Automation of documentation preparation is based on processing of the engineering data contained in the specifications and technical documentation. Data handling assumes the existence of strictly structured electronic documents, prepared in widespread formats according to templates based on industry standards, and the automated generation of the program or design text document from them. The subsequent life cycle of the document, and of the engineering data it contains, is then controlled. At each stage of the life cycle, archival data storage is carried out. Performance studies of different widespread document formats under automated monitoring and storage are given. The newly developed software and the workbenches available to the developer of instrumentation equipment are described.

  9. SDO FlatSat Facility

    NASA Technical Reports Server (NTRS)

    Amason, David L.

    2008-01-01

    The goal of the Solar Dynamics Observatory (SDO) is to understand and, ideally, predict the solar variations that influence life and society. Its instruments will measure the properties of the Sun and will take high-definition images of the Sun every few seconds, all day every day. The FlatSat is a high-fidelity electrical and functional representation of the SDO spacecraft bus. It is a high-fidelity test bed for Integration & Test (I&T), flight software, and flight operations. For I&T purposes, FlatSat will be used to develop and dry-run electrical integration procedures, STOL test procedures, page displays, and the command and telemetry database. FlatSat will also serve as a platform for flight software acceptance and systems testing of the flight software system components, including the spacecraft main processors, power supply electronics, attitude control electronics, gimbal control electronics and the S-band communications card. FlatSat will also benefit the flight operations team through post-launch flight software code and table update development and verification, and verification of new and updated flight operations products. This document highlights the benefits of FlatSat; describes the building of FlatSat; provides FlatSat facility requirements, access roles and responsibilities; and discusses FlatSat mechanical and electrical integration and functional testing.

  10. GROVER: An autonomous vehicle for ice sheet research

    NASA Astrophysics Data System (ADS)

    Trisca, G. O.; Robertson, M. E.; Marshall, H.; Koenig, L.; Comberiate, M. A.

    2013-12-01

    The Goddard Remotely Operated Vehicle for Exploration and Research or Greenland Rover (GROVER) is a science enabling autonomous robot specifically designed to carry a low-power, large bandwidth radar for snow accumulation mapping over the Greenland Ice Sheet. This new and evolving technology enables reduced cost and increased safety for polar research. GROVER was field tested at Summit, Greenland in May 2013. The robot traveled over 30 km and was controlled both by line of sight wireless and completely autonomously with commands and telemetry via the Iridium Satellite Network, from Summit as well as remotely from Boise, Idaho. Here we describe GROVER's unique abilities and design. The software stack features a modular design that can be adapted for any application that requires autonomous behavior, reliable communications using different technologies and low level control of peripherals. The modules are built to communicate using the publisher-subscriber design pattern to maximize data-reuse and allow for graceful failures at the software level, along with the ability to be loaded or unloaded on-the-fly, enabling the software to adopt different behaviors based on power constraints or specific processing needs. These modules can also be loaded or unloaded remotely for servicing and telemetry can be configured to contain any kind of information being generated by the sensors or scientific instruments. The hardware design protects the electronic components and the control system can change functional parameters based on sensor input. Power failure modes built into the hardware prevent the vehicle from running out of energy permanently by monitoring voltage levels and triggering software reboots when the levels match pre-established conditions. This guarantees that the control software will be operational as soon as there is enough charge to sustain it, giving the vehicle increased longevity in case of a temporary power loss. GROVER demonstrates that autonomous rovers can be a revolutionary tool for data collection, and that both the technology and the software are available and ready to be implemented to create scientific data collection platforms.
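
    The modular publisher-subscriber design described above can be illustrated with a very small in-process message bus in Python. This is a generic sketch of the pattern, not GROVER's actual software stack, and the topic name is invented.

        # Minimal sketch of the publisher/subscriber pattern: modules exchange named
        # messages through a small in-process bus, so a failing subscriber does not
        # break the publisher. Topic names are hypothetical.
        from collections import defaultdict

        class MessageBus:
            def __init__(self):
                self._subscribers = defaultdict(list)

            def subscribe(self, topic, handler):
                self._subscribers[topic].append(handler)

            def publish(self, topic, message):
                for handler in self._subscribers[topic]:
                    try:
                        handler(message)
                    except Exception as exc:          # fail gracefully, keep publishing
                        print(f"subscriber error on '{topic}': {exc}")

        bus = MessageBus()
        bus.subscribe("power/battery_voltage", lambda v: print(f"telemetry: {v:.2f} V"))
        bus.publish("power/battery_voltage", 23.9)    # e.g. could trigger low-power behavior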

  11. Remote monitoring and fault recovery for FPGA-based field controllers of telescope and instruments

    NASA Astrophysics Data System (ADS)

    Zhu, Yuhua; Zhu, Dan; Wang, Jianing

    2012-09-01

    With increasing size and more and more functions, modern telescopes widely use a control architecture consisting of a central control unit plus field controllers. FPGA-based field controllers have the advantage of being field programmable, which provides great convenience for modifying the software and hardware of the control system. They also give a good platform for implementing new control schemes. Because of the many controlled nodes and the poor working environment at scattered locations, the reliability and stability of the field controllers must be fully considered. This paper mainly describes how we use FPGA-based field controllers and remote Ethernet access to construct a monitoring system with multiple nodes. When a failure appears, the FPGA chip first performs self-recovery in accordance with pre-defined recovery strategies. If the chip is not restored, remote reconstruction of the field controller can be done through network intervention. This paper also introduces the network-based remote reconstruction solution for the controller, the system structure and transport protocol, as well as the implementation methods. The ideas behind the hardware and software design based on the FPGA are given. After actual operation on large telescopes, the desired results have been achieved. The improvement increases system reliability and reduces the maintenance workload, showing good prospects for application and popularization.

  12. TOPEX Software Document Series. Volume 5; Rev. 1; TOPEX GDR Processing

    NASA Technical Reports Server (NTRS)

    Lee, Jeffrey; Lockwood, Dennis; Hancock, David W., III

    2003-01-01

    This document is a compendium of the WFF TOPEX Software Development Team's knowledge regarding Geophysical Data Record (GDR) Processing. It includes many elements of a requirements document, a software specification document, a software design document, and a user's manual. In the more technical sections, this document assumes the reader is familiar with TOPEX and instrument files.

  13. A multi-center study benchmarks software tools for label-free proteome quantification

    PubMed Central

    Gillet, Ludovic C; Bernhardt, Oliver M.; MacLean, Brendan; Röst, Hannes L.; Tate, Stephen A.; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I.; Aebersold, Ruedi; Tenzer, Stefan

    2016-01-01

    The consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from SWATH-MS (sequential window acquisition of all theoretical fragment ion spectra), a method that uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test datasets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation windows setups. For consistent evaluation we developed LFQbench, an R-package to calculate metrics of precision and accuracy in label-free quantitative MS, and report the identification performance, robustness and specificity of each software tool. Our reference datasets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics. PMID:27701404

  14. A multicenter study benchmarks software tools for label-free proteome quantification.

    PubMed

    Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan

    2016-11-01

    Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.
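
    The benchmark metrics reported by LFQbench (an R package) can be illustrated in a few lines; the Python sketch below, with synthetic intensities, shows the two quantities in question: accuracy as the deviation of observed log2 ratios from the expected mixing ratio, and precision as the coefficient of variation across replicates. The 2:1 spike-in ratio and all numbers are invented for the example.

        # Hedged sketch (Python, not the actual R package) of the two benchmark
        # metrics: accuracy as deviation of observed log2 ratios from the expected
        # mixing ratio, precision as the coefficient of variation across replicates.
        import numpy as np

        expected_log2_ratio = 1.0               # e.g. a 2:1 spike-in between samples A and B
        rng = np.random.default_rng(0)
        a = rng.lognormal(mean=10.0, sigma=0.1, size=(100, 3))   # 100 proteins, 3 replicates
        b = a / 2 ** expected_log2_ratio * rng.lognormal(0.0, 0.05, size=(100, 3))

        observed = np.log2(a.mean(axis=1) / b.mean(axis=1))
        accuracy_bias = np.median(observed - expected_log2_ratio)
        precision_cv = np.median(a.std(axis=1, ddof=1) / a.mean(axis=1))

        print(f"median log2-ratio bias: {accuracy_bias:.3f}")
        print(f"median replicate CV:    {precision_cv:.3%}")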

  15. Approach and Instrument Placement Validation

    NASA Technical Reports Server (NTRS)

    Ator, Danielle

    2005-01-01

    The Mars Exploration Rovers (MER) from the 2003 flight mission represent the state of the art in target approach and instrument placement on Mars. It currently takes 3 sols (Martian days) for a rover to place an instrument on a designated rock target that is about 10 to 20 m away. The objective of this project is to provide an experimentally validated single-sol instrument placement capability to future Mars missions. After completing numerous test runs on the Rocky8 rover under various test conditions, it has been observed that lighting conditions, shadow effects, target features and the initial target distance have an effect on the performance and reliability of the tracking software. Additional software validation testing will be conducted in the months to come.

  16. Software Manages Documentation in a Large Test Facility

    NASA Technical Reports Server (NTRS)

    Gurneck, Joseph M.

    2001-01-01

    The 3MCS computer program assists an instrumentation engineer in performing the three essential functions of design, documentation, and configuration management of measurement and control systems in a large test facility. Services provided by 3MCS are acceptance of input from multiple engineers and technicians working at multiple locations; standardization of drawings; automated cross-referencing; identification of errors; listing of components and resources; downloading of test settings; and provision of information to customers.

  17. Real-Time Tomography Mooring

    DTIC Science & Technology

    1992-06-01

    With this sampling schedule the data logger has enough data storage capacity for a five year deployment. System specifications are shown... the line check is performed as a virtual device, called "line check", which is scheduled in the task table and executed by the system ... PCI as FSK carrier detect input; add function (5) to do timer/counter control; stop counter when defaults set; 6. Instrument scheduler software

  18. Large research infrastructure for Earth-Ocean Science: Challenges of multidisciplinary integration across hardware, software, and people networks

    NASA Astrophysics Data System (ADS)

    Best, M.; Barnes, C. R.; Johnson, F.; Pautet, L.; Pirenne, B.; Founding Scientists Of Neptune Canada

    2010-12-01

    NEPTUNE Canada is operating a regional cabled ocean observatory across the northern Juan de Fuca Plate in the northeastern Pacific. Installation of the first suite of instruments and connectivity equipment was completed in 2009, so this system now provides the continuous power and bandwidth to collect integrated data on physical, chemical, geological, and biological gradients at temporal resolutions relevant to the dynamics of the earth-ocean system. The building of this facility integrates hardware, software, and people networks. Hardware progress to date includes: installation of the 800 km powered fiber-optic backbone in the Fall of 2007; development of Nodes and Junction Boxes; acquisition/development and testing of Instruments; development of mobile instrument platforms such as a) a Vertical Profiler and b) a Crawler (University of Bremen); and integration of over a thousand components into an operating subsea sensor system. Nodes, extension cables, junction boxes, and instruments were installed at 4 out of 5 locations in 2009; the fifth Node is instrumented in September 2010. In parallel, software and hardware systems are acquiring, archiving, and delivering the continuous real-time data through the internet to the world - already many terabytes of data. A web environment (Oceans 2.0) to combine this data access with analysis and visualization, collaborative tools, interoperability, and instrument control is being released. Finally, a network of scientists and technicians is contributing to the process in every phase, and data users already number in the thousands. Initial experiments were planned through a series of workshops and international proposal competitions. At inshore Folger Passage, Barkley Sound, understanding controls on biological productivity helps evaluate the effects that marine processes have on fish and marine mammals. Experiments around Barkley Canyon allow quantification of changes in biological and chemical activity associated with nutrient and cross-shelf sediment transport around the shelf/slope break and through the canyon to the deep sea. There and north along the mid-continental slope, instruments on exposed and shallowly buried gas hydrates allow monitoring of changes in their distribution, structure, and venting, particularly related to earthquakes, slope failures and regional plate motions. Circulation obviation retrofit kits (CORKs) at mid-plate ODP 1026-7 monitor real-time changes in crustal temperature and pressure, particularly as they relate to events such as earthquakes, hydrothermal convection or regional plate strain. At Endeavour Ridge, complex interactions among volcanic, tectonic, hydrothermal and biological processes are quantified at the western edge of the Juan de Fuca plate. Across the network, high-resolution seismic information elucidates tectonic processes such as earthquakes, and a tsunami system allows determination of open-ocean tsunami amplitude, propagation direction, and speed. The infrastructure has further capacity for experiments to expand from this initial suite. Further information and opportunities can be found at http://www.neptunecanada.ca

  19. Development of high-availability ATCA/PCIe data acquisition instrumentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Correia, Miguel; Sousa, Jorge; Batista, Antonio J.N.

    2015-07-01

    Latest fusion energy experiments envision a quasi-continuous operation regime. In consequence, the largest experimental devices, currently in development, specify high-availability (HA) requirements for the whole plant infrastructure. HA features enable the whole facility to perform seamlessly in the case of failure of any of its components, coping with the increasing duration of plasma discharges (steady state) and assuring safety of equipment, people, environment and investment. IPFN developed a control and data acquisition system, aiming at fast control of advanced fusion devices, which is thus required to provide such HA features. The system is based on in-house developed Advanced Telecommunications Computing Architecture (ATCA) instrumentation modules - IO blades and data switch blades - establishing a PCIe network on the ATCA shelf backplane. The data switch communicates with an external host computer through a PCIe data network. At the hardware management level, the system architecture takes advantage of ATCA native redundancy and hot-swap specifications to implement fail-over substitution of IO or data switch blades. A redundant host scheme is also supported by the ATCA/PCIe platform. At the software level, PCIe provides implementation of hot-plug services, which translate the hardware changes to the corresponding software/operating system devices. The paper presents how the ATCA and PCIe based system can be set up to perform with the desired degree of HA, thus being suitable for advanced fusion control and data acquisition systems. (authors)
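
    The fail-over idea behind such a redundant-host scheme can be sketched generically: a standby host promotes itself when the active host misses enough heartbeats. The Python sketch below is only an illustration of that policy; the timings are invented and it does not represent the ATCA hardware-management mechanism itself.

        # Hedged sketch of a redundant-host fail-over policy: the standby host takes
        # over if the active host misses enough heartbeats. Timings are illustrative.
        import time

        HEARTBEAT_PERIOD_S = 0.5
        MISSED_LIMIT = 3

        def receive_heartbeat():
            """Placeholder: would read a heartbeat message from the active host."""
            return False      # simulate a silent active host

        missed = 0
        while True:
            if receive_heartbeat():
                missed = 0
            else:
                missed += 1
            if missed >= MISSED_LIMIT:
                print("active host unresponsive -> standby takes over data acquisition")
                break
            time.sleep(HEARTBEAT_PERIOD_S)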

  20. Ultrasonic wave-based structural health monitoring embedded instrument.

    PubMed

    Aranguren, G; Monje, P M; Cokonaj, Valerijan; Barrera, Eduardo; Ruiz, Mariano

    2013-12-01

    Piezoelectric sensors and actuators are the bridge between electronic and mechanical systems in structures. This type of sensor is a key element in the integrity monitoring of aeronautic structures, bridges, pressure vessels, wind turbine blades, and gas pipelines. In this paper, an all-in-one system for Structural Health Monitoring (SHM) based on ultrasonic waves is presented, called Phased Array Monitoring for Enhanced Life Assessment. This integrated instrument is able to generate excitation signals that are sent through piezoelectric actuators, acquire the received signals in the piezoelectric sensors, and carry out signal processing to check the health of structures. To accomplish this task, the instrument uses a piezoelectric phased-array transducer that performs the actuation and sensing of the signals. The flexibility and strength of the instrument allow the user to develop and implement a substantial part of the SHM technique using Lamb waves. The entire system is controlled using configuration software and has been validated through functional, electrical loading, mechanical loading, and thermal loading resistance tests.

  1. Portable optical spectroscopy for accurate analysis of ethane in exhaled breath

    NASA Astrophysics Data System (ADS)

    Patterson, Claire S.; McMillan, Lesley C.; Longbottom, Christopher; Gibson, Graham M.; Padgett, Miles J.; Skeldon, Kenneth D.

    2007-05-01

    We report on a maintenance-free, ward-portable, tunable diode laser spectroscopy system for the ultra-sensitive detection of ethane gas. Ethane is produced when cellular lipids are oxidized by free radicals. As a breath biomarker, ethane offers a unique measure of such oxidative stress. The ability to measure real-time breath ethane fluctuations will open up new areas in non-invasive healthcare. Instrumentation for such a purpose must be highly sensitive and specific to the target gas. Our technology has a sensitivity of 70 parts per trillion and a 1 s sampling rate. Based on a cryogenically cooled lead-salt laser, the instrument has a thermally managed closed-loop refrigeration system, eliminating the need for liquid coolants. Custom LabVIEW software allows automatic control by a laptop PC. We have field tested the instrument to ensure that target performance is sustained in a range of environments. We outline the novel applications underway with the instrument based on an in vivo clinical assessment of oxidative stress.

  2. Stroboscope Controller for Imaging Helicopter Rotors

    NASA Technical Reports Server (NTRS)

    Jensen, Scott; Marmie, John; Mai, Nghia

    2004-01-01

    A versatile electronic timing-and-control unit, denoted a rotorcraft strobe controller, has been developed for use in controlling stroboscopes, lasers, video cameras, and other instruments for capturing still images of rotating machine parts, especially helicopter rotors. This unit is designed to be compatible with a variety of sources of input shaft-angle or timing signals and to be capable of generating a variety of output signals suitable for triggering instruments characterized by different input-signal specifications. It is also designed to be flexible and reconfigurable in that it can be modified and updated through changes in its control software, without need to change its hardware. Figure 1 is a block diagram of the rotorcraft strobe controller. The control processor is a high-density complementary metal-oxide-semiconductor, single-chip 8-bit microcontroller. It is connected to a 32K x 8 nonvolatile static random-access memory (RAM) module. Also connected to the control processor is a 32K x 8 electrically programmable read-only memory (EPROM) module, which is used to store the control software. Digital logic support circuitry is implemented in a field-programmable gate array (FPGA). A 240 x 128-dot, 40-character, 16-line liquid-crystal display (LCD) module serves as a graphical user interface; the user provides input through a 16-key keypad mounted next to the LCD. A 12-bit digital-to-analog converter (DAC) generates a 0-to-10-V ramp output signal used as part of a rotor-blade monitoring system, while the control processor generates all the appropriate strobing signals. Optocouplers are used to isolate all input and output digital signals, and optoisolators are used to isolate all analog signals. The unit is designed to fit inside a 19-in. (48.3-cm) rack-mount enclosure. Electronic components are mounted on a custom printed-circuit board (see Figure 2). Two power-conversion modules on the printed-circuit board convert AC power to +5 VDC and 15 VDC, respectively.
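
    The core timing arithmetic of such a controller is simple: convert rotor speed and the azimuth at which a blade should be frozen into a trigger delay after the once-per-revolution index pulse. The Python sketch below illustrates the calculation with invented numbers; it is not taken from the actual controller firmware.

        # Hedged sketch of the strobe-timing arithmetic: given rotor speed and the
        # azimuth at which the blade should be frozen, compute the flash delay after
        # the once-per-revolution index pulse. Values are illustrative.
        def strobe_delay_s(rotor_rpm: float, target_azimuth_deg: float) -> float:
            rev_period_s = 60.0 / rotor_rpm
            return (target_azimuth_deg % 360.0) / 360.0 * rev_period_s

        rpm = 300.0                     # 5 revolutions per second
        for az in (0.0, 90.0, 270.0):
            print(f"azimuth {az:5.1f} deg -> fire strobe "
                  f"{strobe_delay_s(rpm, az) * 1e3:6.2f} ms after index pulse")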

  3. Broadband Seismometer at 2500m Depth in the Mediterranean Sea

    NASA Astrophysics Data System (ADS)

    Deschamps, A.; Hello, Y.; Charvis, P.; Dugué, M.; Bertin, V.; Valdy, P.; Le van Suu, A.; Real, D.

    2003-04-01

    In the frame of the ANTARES project, devoted to solar neutrino detection across a large (0.1 km³) water volume located in the deep sea, sea bottom facilities were developed at a depth of 2500 m. Power supply, instrumentation control and data transmission have been implemented offshore Toulon (France) through a 43 km long marine cable. A broadband seismological sensor has been installed among the instrumentation that controls the physical and chemical environment of the neutrino detectors. The instrument was designed by Guralp Systems on the basis of the CMG-3T seismometer (bandpass 120 s-50 Hz) connected to a CMG DM24 digitizer for mechanical control and signal digitisation. The seismometer was inserted in a titanium housing which fulfils the safety requirements of the deployment operation. Control of the CMG DM24 through an asynchronous RS232 serial line was implemented in the ANTARES acquisition software running on the sea bottom. An interface running at the surface allows control and storage of the data. In January, the sensor was launched with the ANTARES instrumental line. In a second step the sensor was slightly moved away from the ANTARES structure (60 m), partly buried in the ground, and roughly levelled and oriented by the IFREMER submarine Nautile. During the same operation the instrumentation line was connected to the power supply and data acquisition control. The masses of the seismometer were unlocked from the surface. Data are now continuously collected from Toulon and transmitted to Geosciences Azur in quasi real time. After a test period of 3 months, the sensor should be recovered for upgrades. For the final deployment (10 years), the ANTARES time control signal should be used to synchronise the sensor's internal clock. This is the first real-time broadband seismometer deployed in Europe and it will increase, in the future, our capability of marine earthquake detection in the area.

  4. Compact Low Power DPU for Plasma Instrument LINA on the Russian Luna-Glob Lander

    NASA Astrophysics Data System (ADS)

    Schmidt, Walter; Riihelä, Pekka; Kallio, Esa

    2013-04-01

    The Swedish Institute for Space Physics in Kiruna is building a Lunar Ions and Neutrals Analyzer (LINA) for the Russian Luna-Glob lander mission and its orbiter, to be launched around 2016 [1]. The Finnish Meteorological Institute is responsible for designing and building the central data processing units (DPU) for both instruments. The design details were optimized to also serve as a demonstrator for a similar instrument on the Jupiter mission JUICE. To accommodate the originally set short development time and to keep the design between orbiter and lander as similar as possible, the DPU is built around two re-programmable flash-based FPGAs from Actel. One FPGA contains a public-domain 32-bit processor core identical for both lander and orbiter. The other FPGA handles all interfaces to the spacecraft system and the detectors, and is somewhat different for the two implementations. Monitoring of analog housekeeping data is implemented as an IP core from Stellamar inside the interface FPGA, saving mass, volume and especially power while simplifying the radiation protection design. Since, especially on the lander, data retention before transfer to the orbiter cannot be guaranteed under all conditions, the DPU includes a Flash-PROM containing several software versions and data storage capability. With the memory management implemented inside the interface FPGA, one of the serial links can also be used as a test port to verify the system, load the initial software into the Flash-PROM, and control the detector hardware directly without support from the processor or a fully developed operating system and software. Implementation and performance details will be presented. Reference: [1] http://www.russianspaceweb.com/luna_glob_lander.html.

  5. Automation of the 1.3-meter Robotically Controlled Telescope (RCT)

    NASA Astrophysics Data System (ADS)

    Gelderman, Richard; Treffers, Richard R.

    2011-03-01

    This poster describes the automation, for the Robotically Controlled Telescope (RCT) Consortium, of the 50-inch telescope at Kitt Peak National Observatory. Building upon the work of the previous contractor, the telescope, dome and instrument were wired for totally autonomous (robotic) observations. The existing motors, encoders, limit switches and cables were connected to an open industrial panel that allows easy interconnection, troubleshooting and modification. A sixteen-axis Delta Tau Turbo PMAC controller is used to control all motors, encoders, flat-field lights and many of the digital functions of the telescope. ADAM industrial I/O bricks are used for additional digital and analog I/O functions. Complex relay logic problems, such as the mirror cover opening sequence and the slit control, are managed using Allen Bradley Pico PLDs. Most of the low-level software is written in C using the GNU compiler. The basic functionality uses an ASCII protocol communicating over Berkeley sockets. Early versions of this software were developed at U.C. Berkeley for what was to become the Katzman Automatic Imaging Telescope (KAIT) at Lick Observatory. ASCII communications are useful for control and testing and are easy to debug by looking at the log files; C-shell scripts are written to form more complex orchestrations.
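
    An ASCII command/response exchange over a socket, of the kind described above, can be sketched in a few lines of Python. The host, port and command vocabulary below are hypothetical placeholders, not the RCT's actual protocol.

        # Hedged sketch of an ASCII command/response exchange over a TCP socket.
        # Host, port and the command vocabulary are hypothetical placeholders.
        import socket

        def send_command(host: str, port: int, command: str, timeout: float = 5.0) -> str:
            with socket.create_connection((host, port), timeout=timeout) as sock:
                sock.sendall((command + "\n").encode("ascii"))
                return sock.makefile().readline().strip()   # one-line ASCII reply

        # Example (requires a listening control server):
        # print(send_command("rct-controller.local", 5000, "DOME OPEN"))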

  6. Software framework for automatic learning of telescope operation

    NASA Astrophysics Data System (ADS)

    Rodríguez, Jose A.; Molgó, Jordi; Guerra, Dailos

    2016-07-01

    The "Gran Telescopio de Canarias" (GTC) is an optical-infrared 10-meter segmented mirror telescope at the ORM observatory in Canary Islands (Spain). The GTC Control System (GCS) is a distributed object and component oriented system based on RT-CORBA and it is responsible for the operation of the telescope, including its instrumentation. The current development state of GCS is mature and fully operational. On the one hand telescope users as PI's implement the sequences of observing modes of future scientific instruments that will be installed in the telescope and operators, in turn, design their own sequences for maintenance. On the other hand engineers develop new components that provide new functionality required by the system. This great work effort is possible to minimize so that costs are reduced, especially if one considers that software maintenance is the most expensive phase of the software life cycle. Could we design a system that allows the progressive assimilation of sequences of operation and maintenance of the telescope, through an automatic self-programming system, so that it can evolve from one Component oriented organization to a Service oriented organization? One possible way to achieve this is to use mechanisms of learning and knowledge consolidation to reduce to the minimum expression the effort to transform the specifications of the different telescope users to the operational deployments. This article proposes a framework for solving this problem based on the combination of the following tools: data mining, self-Adaptive software, code generation, refactoring based on metrics, Hierarchical Agglomerative Clustering and Service Oriented Architectures.

  7. A Concept for the One Degree Imager (ODI) Data Reduction Pipeline and Archiving System

    NASA Astrophysics Data System (ADS)

    Knezek, Patricia; Stobie, B.; Michael, S.; Valdes, F.; Marru, S.; Henschel, R.; Pierce, M.

    2010-05-01

    The One Degree Imager (ODI), currently being built by the WIYN Observatory, will provide tremendous possibilities for conducting diverse scientific programs. ODI will be a complex instrument, using non-conventional Orthogonal Transfer Array (OTA) detectors. Due to its large field of view, small pixel size, use of OTA technology, and expected frequent use, ODI will produce vast amounts of astronomical data. If ODI is to achieve its full potential, a data reduction pipeline must be developed. Long-term archiving must also be incorporated into the pipeline system to ensure the continued value of ODI data. This paper presents a concept for an ODI data reduction pipeline and archiving system. To limit costs and development time, our plan leverages existing software and hardware, including existing pipeline software, Science Gateways, Computational Grid & Cloud Technology, Indiana University's Data Capacitor and Massive Data Storage System, and TeraGrid compute resources. Existing pipeline software will be augmented to add functionality required to meet challenges specific to ODI, enhance end-user control, and enable the execution of the pipeline on grid resources including national grid resources such as the TeraGrid and Open Science Grid. The planned system offers consistent standard reductions and end-user flexibility when working with images beyond the initial instrument signature removal. It also gives end-users access to computational and storage resources far beyond what are typically available at most institutions. Overall, the proposed system provides a wide array of software tools and the necessary hardware resources to use them effectively.

  8. The instrumental genesis process in future primary teachers using Dynamic Geometry Software

    NASA Astrophysics Data System (ADS)

    Ruiz-López, Natalia

    2018-05-01

    This paper, which describes a study undertaken with pairs of future primary teachers using GeoGebra software to solve geometry problems, includes a brief literature review, the theoretical framework and methodology used. An analysis of the instrumental genesis process for a pair participating in the case study is also provided. This analysis addresses the techniques and types of dragging used, the obstacles to learning encountered, a description of the interaction between the pair and their interaction with the teacher, and the type of language used. Based on this analysis, possibilities and limitations of the instrumental genesis process are identified for the development of geometric competencies such as conjecture creation, property checking and problem researching. It is also suggested that the methodology used in the analysis of the problem solving process may be useful for those teachers and researchers who want to integrate Dynamic Geometry Software (DGS) in their classrooms.

  9. Extreme Ultraviolet Imaging Telescope (EIT)

    NASA Technical Reports Server (NTRS)

    Lemen, J. R.; Freeland, S. L.

    1997-01-01

    Efforts concentrated on the development and implementation of the SolarSoft (SSW) data analysis system. From an EIT analysis perspective, this system was designed to facilitate efficient reuse and conversion of software developed for Yohkoh/SXT and to take advantage of a large existing body of software developed by the SDAC, Yohkoh, and SOHO instrument teams. Another strong motivation for this system was to provide an EIT analysis environment which permits coordinated analysis of EIT data in conjunction with data from important supporting instruments, including Yohkoh/SXT and the other SOHO coronal instruments: CDS, SUMER, and LASCO. In addition, the SSW system will support coordinated EIT/TRACE analysis (by design) when TRACE data are available; the TRACE launch is currently planned for March 1998. Working with Jeff Newmark, the Chianti software package (K. P. Dere et al.) and UV/EUV database were fully integrated into the SSW system to facilitate EIT temperature and emission analysis.

  10. Dual Active Bridge based DC Transformer LabVIEW FPGA Control Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    In the area of power electronics control, Field Programmable Gate Arrays (FPGAs) have the capability to outperform their Digital Signal Processor (DSP) counterparts due to the FPGA's ability to implement true parallel processing and therefore facilitate higher switching frequencies, higher control bandwidth, and/or enhanced functionality. National Instruments (NI) has developed two platforms, CompactRIO (cRIO) and Single-Board RIO (sbRIO), which combine a real-time processor with an FPGA. The FPGA can be programmed with a subset of the well-known LabVIEW graphical programming language. The candidate software implements complete control algorithms in LabVIEW FPGA for a DC Transformer (DCX) based on a dual active bridge (DAB). A DCX is an isolated bi-directional DC-DC converter designed to operate at unity conversion ratio, M, defined in terms of Vin, the primary-side DC bus voltage, Vout, the secondary-side DC bus voltage, and n, the turns ratio of the embedded high-frequency transformer (HFX). The DCX based on a DAB incorporates two H-bridges, a resonant inductor, and an HFX to provide this functionality. The candidate software employs phase-shift modulation of the two H-bridges and a feedback loop to regulate the conversion ratio at unity. The software also includes alarm-handling capabilities as well as debugging and tuning tools. The software fits on the Xilinx Virtex V LX110 FPGA embedded in the NI cRIO-9118 FPGA chassis, and with a 40 MHz base clock, supports a modulation update rate of 40 MHz and user-settable switching frequencies and synchronized control-loop update rates of tens of kHz.
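
    The regulation idea can be sketched in plain Python rather than LabVIEW FPGA: a feedback loop trims the inter-bridge phase shift until the conversion ratio sits at unity. The sketch assumes the common definition M = n*Vout/Vin (the record does not spell out the formula), and the static plant model and all gains and voltages are invented for illustration.

        # Hedged sketch (plain Python, not the LabVIEW FPGA code): integral feedback
        # trims the inter-bridge phase shift until M = n*Vout/Vin reaches unity.
        # The static plant model and all numbers are illustrative assumptions.
        N_TURNS, V_IN = 1.0, 400.0        # transformer turns ratio and primary bus voltage
        KI, DT = 40.0, 1e-3               # integral gain and control-loop period (s)

        phase, integ, v_out = 0.0, 0.0, 0.0
        for _ in range(300):
            m_error = 1.0 - N_TURNS * v_out / V_IN     # deviation from unity ratio
            integ += m_error * DT
            phase = min(max(KI * integ, 0.0), 0.5)     # clamp phase shift (per unit)
            v_out = 1000.0 * phase                     # toy static plant: Vout vs. phase

        print(f"Vout = {v_out:.1f} V, M = {N_TURNS * v_out / V_IN:.3f}, phase = {phase:.2f}")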

  11. Hardware fault insertion and instrumentation system: Mechanization and validation

    NASA Technical Reports Server (NTRS)

    Benson, J. W.

    1987-01-01

    Automated test capability for extensive low-level hardware fault insertion testing is developed. The test capability is used to calibrate fault detection coverage and associated latency times as relevant to projecting overall system reliability. Described are modifications made to the NASA Ames Reconfigurable Flight Control System (RDFCS) Facility to fully automate the total test loop involving the Draper Laboratories' Fault Injector Unit. The automated capability provided included the application of sequences of simulated low-level hardware faults, the precise measurement of fault latency times, the identification of fault symptoms, and bulk storage of test case results. A PDP-11/60 served as a test coordinator, and a PDP-11/04 as an instrumentation device. The fault injector was controlled by applications test software in the PDP-11/60, rather than by manual commands from a terminal keyboard. The time base was especially developed for this application to use a variety of signal sources in the system simulator.
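
    The latency measurement described above can be illustrated schematically: timestamp the moment a fault is injected, poll the system's fault-detection status, and log the elapsed time. In the Python sketch below, inject_fault() and fault_detected() are hypothetical stubs standing in for the fault-injector and flight-computer interfaces, and the 50 ms detection delay is invented.

        # Hedged sketch of fault-latency measurement with hypothetical stubs.
        import time

        T_DETECT_S = 0.05        # stub: the simulated system flags the fault after ~50 ms

        def inject_fault(channel: str) -> float:
            """Placeholder for commanding the fault-injector unit."""
            print(f"injecting stuck-at fault on {channel}")
            return time.monotonic()

        def fault_detected(t_inject: float) -> bool:
            """Placeholder for reading the flight computer's fault-detection flag."""
            return time.monotonic() - t_inject >= T_DETECT_S

        t0 = inject_fault("cpu_a/bus_line_7")
        while not fault_detected(t0):
            time.sleep(0.001)                    # poll the detection status
        print(f"fault detection latency: {(time.monotonic() - t0) * 1e3:.1f} ms")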

  12. Control and Data Acquisition for the Spherical Tokamak MEDUSA-CR

    NASA Astrophysics Data System (ADS)

    Soto, Christian; Gonzalez, Jeferson; Carvajal, Johan; Ribeiro, Celso

    2013-10-01

    The former spherical tokamak (ST) MEDUSA (Madison EDUcation Small Aspect Ratio tokamak, R < 0.14 m, a < 0.10 m, BT < 0.5 T, Ip < 40 kA, 3 ms pulse) is being recommissioned at the Costa Rica Institute of Technology. The main objectives of the MEDUSA-CR project are training and clarifying several issues in physics relevant to conventional and, mainly, spherical tokamaks, including beta studies in bean-shaped ST plasmas, transport, heating and current drive via Alfvén waves, and natural divertor STs with an ergodic magnetic limiter. We present here the control and data acquisition systems for the MEDUSA-CR device, which are based on National Instruments (NI) software (LabVIEW) and hardware on loan to our laboratory via NI-Costa Rica. The interfaces with the energy, gas fueling, and security systems are also presented. VIE-ITCR, IAEA-CRP contract 17592, National Instruments of Costa Rica.

  13. Recovering from "amnesia" brought about by radiation. Verification of the "Over the air" (OTA) application software update mechanism On-Board Solar Orbiter's Energetic Particle Detector

    NASA Astrophysics Data System (ADS)

    Da Silva, Antonio; Sánchez Prieto, Sebastián; Rodriguez Polo, Oscar; Parra Espada, Pablo

    Computer memories are not supposed to forget, but they do. Because of the proximity to the Sun, from the Solar Orbiter boot software perspective it is mandatory to watch for permanent memory errors resulting from single-event latch-up (SEL) failures in application binaries stored in EEPROM and in their SDRAM deployment areas. In this situation, the last line of defense established by the FDIR mechanisms is the capability of the boot software to provide an accurate report of the memory damage and to perform an application software update that avoids the harmed locations by flashing the EEPROM with a new binary. This paper describes the verification of the over-the-air (OTA) EEPROM firmware update procedure of the boot software that will run in the Instrument Control Unit (ICU) of the Energetic Particle Detector (EPD) on board Solar Orbiter. Since the maximum number of rewrites of the real EEPROM is limited and permanent memory faults cannot easily be emulated on real hardware, the verification has been accomplished through the use of a LEON2 virtual platform (Leon2ViP) with fault injection capabilities and real SpaceWire interfaces, developed by the Space Research Group (SRG) of the University of Alcalá. This way it is possible to run exactly the same target binary software as would run on the real ICU platform. Furthermore, the use of this virtual hardware-in-the-loop (VHIL) approach makes it possible to communicate with Electrical Ground Support Equipment (EGSE) through real SpaceWire interfaces in an agile, controlled and deterministic environment.

  14. Preservation of root canal anatomy using self-adjusting file instrumentation with glide path prepared by 20/0.02 hand files versus 20/0.04 rotary files

    PubMed Central

    Jain, Niharika; Pawar, Ajinkya M.; Ukey, Piyush D.; Jain, Prashant K.; Thakur, Bhagyashree; Gupta, Abhishek

    2017-01-01

    Objectives: To compare the relative axis modification and canal concentricity after glide path preparation with a 20/0.02 hand K-file (NITIFLEX®) or a 20/0.04 rotary file (HyFlex™ CM), each followed by instrumentation with the 1.5 mm self-adjusting file (SAF). Materials and Methods: One hundred and twenty ISO 15, 0.02 taper Endo Training Blocks (Dentsply Maillefer, Ballaigues, Switzerland) were acquired and randomly divided into the following two groups (n = 60): Group 1, glide path established to a 20/0.02 hand K-file (NITIFLEX®) followed by instrumentation with the 1.5 mm SAF; and Group 2, glide path established to a 20/0.04 rotary file (HyFlex™ CM) followed by instrumentation with the 1.5 mm SAF. Pre- and post-instrumentation digital images were processed with MATLAB R2013 software to identify the central axis, and then superimposed using digital imaging software (Picasa 3.0, Google Inc., California, USA) taking five landmarks as reference points. Student's t-test for pairwise comparisons was applied with the level of significance set at 0.05. Results: Training blocks instrumented with the 20/0.04 rotary file and SAF showed less deviation in the canal axis (at all five marked points), representing better canal concentricity compared with those in which the glide path was established by 20/0.02 hand K-files followed by SAF instrumentation. Conclusion: Canal geometry is better maintained after SAF instrumentation when a prior glide path is established with a 20/0.04 rotary file. PMID:28855752

  15. Computer Center: Software Review: The DynaPulse 200M.

    ERIC Educational Resources Information Center

    Pankiewicz, Philip R., Ed.

    1995-01-01

    Reviews the DynaPulse 200M Education Edition microcomputer-based laboratory, which combines interactive software with curriculum and medical instrumentation to teach students about the cardiovascular system. (MKR)

  16. 21 CFR 862.2570 - Instrumentation for clinical multiplex test systems.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... HUMAN SERVICES (CONTINUED) MEDICAL DEVICES CLINICAL CHEMISTRY AND CLINICAL TOXICOLOGY DEVICES Clinical... hardware components, as well as raw data storage mechanisms, data acquisition software, and software to...

  17. Technical Basis for Evaluating Software-Related Common-Cause Failures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Muhlheim, Michael David; Wood, Richard

    2016-04-01

    The instrumentation and control (I&C) system architecture at a nuclear power plant (NPP) incorporates protections against common-cause failures (CCFs) through the use of diversity and defense-in-depth. Even for well-established analog-based I&C system designs, the potential for CCFs of multiple systems (or redundancies within a system) constitutes a credible threat to defeating the defense-in-depth provisions within the I&C system architectures. The integration of digital technologies into the I&C systems provides many advantages compared to the aging analog systems with respect to reliability, maintenance, operability, and cost effectiveness. However, maintaining the diversity and defense-in-depth for both the hardware and software within the digital system is challenging. In fact, the introduction of digital technologies may actually increase the potential for CCF vulnerabilities because of the introduction of undetected systematic faults. These systematic faults are defined as a “design fault located in a software component” and at a high level, are predominately the result of (1) errors in the requirement specification, (2) inadequate provisions to account for design limits (e.g., environmental stress), or (3) technical faults incorporated in the internal system (or architectural) design or implementation. Other technology-neutral CCF concerns include hardware design errors, equipment qualification deficiencies, installation or maintenance errors, instrument loop scaling and setpoint mistakes.

  18. Evaluation of a low-end architecture for collaborative software development, remote observing, and data analysis from multiple sites

    NASA Astrophysics Data System (ADS)

    Messerotti, Mauro; Otruba, Wolfgang; Hanslmeier, Arnold

    2000-06-01

    The Kanzelhoehe Solar Observatory is an observing facility located in Carinthia (Austria) and operated by the Institute of Geophysics, Astrophysics and Meteorology of the Karl-Franzens University Graz. A set of instruments for solar surveillance at different wavelength bands is continuously operated in automatic mode and is presently being upgraded to supply near-real-time solar activity indexes for space weather applications. In this frame, we tested a low-end software/hardware architecture running on the PC platform in a non-homogeneous, remotely distributed environment that allows efficient or moderately efficient application sharing at the Intranet and Extranet (i.e., Wide Area Network) levels, respectively. Because of the geographical distribution of the participating teams (Trieste, Italy; Kanzelhoehe and Graz, Austria), we have been using these features for collaborative remote software development and testing, data analysis and calibration, and observing run emulation from multiple sites. In this work, we describe the architecture used and its performance, based on a series of application sharing tests we carried out to ascertain its effectiveness in real collaborative remote work, observations and data exchange. The system proved to be reliable at the Intranet level for most distributed tasks, limited to less demanding ones at the Extranet level, but quite effective in remote instrument control when real-time response is not needed.

  19. Analysis of methods of processing of expert information by optimization of administrative decisions

    NASA Astrophysics Data System (ADS)

    Churakov, D. Y.; Tsarkova, E. G.; Marchenko, N. D.; Grechishnikov, E. V.

    2018-03-01

    This paper proposes a methodology for defining measures used in the expert estimation of the quality and reliability of application software products. Methods for aggregating expert estimates are described using the example of a collective choice of instrumental control projects in the development of special-purpose software for the needs of institutions. Results from the operation of a dialogue-based decision-support system are given, together with an algorithm for solving the selection task based on the analytic hierarchy process. The developed algorithm can be applied in expert systems to solve a wide class of problems that involve a multicriteria choice.
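
    The analytic-hierarchy-process step referred to above can be sketched in a few lines of Python/NumPy: a priority vector is derived as the principal eigenvector of a pairwise comparison matrix and checked with a consistency ratio. The comparison values below are hypothetical, not taken from the paper.

```python
# Minimal sketch of the analytic-hierarchy-process step mentioned in the abstract:
# derive a priority vector from a pairwise comparison matrix and check consistency.
# The comparison values below are hypothetical.
import numpy as np

def ahp_priorities(pairwise):
    vals, vecs = np.linalg.eig(pairwise)
    k = np.argmax(vals.real)                      # principal eigenvalue index
    w = np.abs(vecs[:, k].real)
    w /= w.sum()                                  # normalized priority vector
    n = pairwise.shape[0]
    ci = (vals[k].real - n) / (n - 1)             # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.0)  # tabulated random index
    return w, ci / ri                             # priorities, consistency ratio

# Example: three candidate instrumentation/control projects compared pairwise.
A = np.array([[1.0, 3.0, 5.0],
              [1 / 3.0, 1.0, 2.0],
              [1 / 5.0, 1 / 2.0, 1.0]])
w, cr = ahp_priorities(A)
print("priorities:", w.round(3), "consistency ratio:", round(cr, 3))
```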

  20. Cavity-enhanced quantum-cascade laser-based instrument for carbon monoxide measurements.

    PubMed

    Provencal, Robert; Gupta, Manish; Owano, Thomas G; Baer, Douglas S; Ricci, Kenneth N; O'Keefe, Anthony; Podolske, James R

    2005-11-01

    An autonomous instrument based on off-axis integrated cavity output spectroscopy has been developed and successfully deployed for measurements of carbon monoxide in the troposphere and tropopause onboard a NASA DC-8 aircraft. The instrument (Carbon Monoxide Gas Analyzer) consists of a measurement cell formed by two high-reflectivity mirrors, a continuous-wave quantum-cascade laser, a gas sampling system, control and data-acquisition electronics, and data-analysis software. CO measurements were determined from high-resolution CO absorption line shapes obtained by tuning the laser wavelength over the R(7) transition of the fundamental vibration band near 2172.8 cm(-1). The instrument reports CO mixing ratio (mole fraction) at a 1-Hz rate based on measured absorption, gas temperature, and pressure using Beer's Law. During several flights in May-June 2004 and January 2005 that reached altitudes of 41,000 ft (12.5 km), the instrument recorded CO values with a precision of 0.2 ppbv (1-s averaging time) and an accuracy limited by the reference CO gas cylinder (uncertainty < 1.0%). Despite moderate turbulence and measurements of particulate-laden airflows, the instrument operated consistently and did not require any maintenance, mirror cleaning, or optical realignment during the flights.
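
    The Beer's-law retrieval step mentioned above can be illustrated with a simplified Python sketch: an integrated absorbance, together with gas temperature and pressure, yields a CO mole fraction. The line strength, effective path length, and absorbance value below are placeholders, not the instrument's actual parameters.

```python
# Simplified sketch of the Beer's-law step: convert an integrated absorbance into a
# CO mole fraction using gas temperature and pressure. Line strength, effective path
# length, and the measured absorbance below are placeholder values.
K_BOLTZMANN = 1.380649e-23        # J/K

def co_mole_fraction(integrated_absorbance_cm_inv, path_cm,
                     line_strength_cm_per_molec, pressure_pa, temp_k):
    # Beer's law (optically thin): A_int = S * N_CO * L, with N_CO = x * P / (k*T)
    total_density_cm3 = pressure_pa / (K_BOLTZMANN * temp_k) * 1e-6  # molecules/cm^3
    n_co = integrated_absorbance_cm_inv / (line_strength_cm_per_molec * path_cm)
    return n_co / total_density_cm3   # mole fraction (multiply by 1e9 for ppbv)

if __name__ == "__main__":
    x = co_mole_fraction(integrated_absorbance_cm_inv=1.2e-3,
                         path_cm=4.0e3,               # effective optical path
                         line_strength_cm_per_molec=4.4e-19,
                         pressure_pa=2.0e4, temp_k=220.0)
    print(f"CO ~ {x * 1e9:.1f} ppbv")
```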

  1. Comparison of air-driven vs electric torque control motors on canal centering ability by ProTaper NiTi rotary instruments.

    PubMed

    Zarei, Mina; Javidi, Maryam; Erfanian, Mahdi; Lomee, Mahdi; Afkhami, Farzaneh

    2013-01-01

    Cleaning and shaping is one of the most important phases in root canal therapy. Various rotary NiTi systems minimize accidents and facilitate the shaping process. Today's NiTi files are used with both air-driven and electric handpieces. This study compared canal centering after instrumentation with the ProTaper system driven by an Endo IT electric torque-control motor versus an NSK air-driven handpiece. This ex vivo randomized controlled trial involved 26 mesial mandibular root canals with 10° to 35° of curvature. The roots were randomly divided into two groups of 13 canals each. The roots were mounted in an endodontic cube with acrylic resin, sectioned horizontally at 2, 6 and 10 mm from the apex and then reassembled. The canals were instrumented according to the manufacturer's instructions using ProTaper rotary files and either electric torque-control motors (group 1) or air-driven handpieces (group 2). Photographs of the cross-sections were taken before and after instrumentation, and image analysis was performed using Photoshop software. Centering ability and canal transportation were evaluated. Repeated-measures analysis and independent t-tests provided the statistical analysis of canal transportation. The difference in the rate of transportation toward internal or external walls between the two groups was not statistically significant (p = 0.62). Comparison of the rate of transportation of sections within one group was also not significant (p = 0.28). Use of rotary NiTi files with either an electric torque-control motor or an air-driven handpiece had no effect on canal centering. NiTi rotary instruments can be used with air-driven motors without considerable changes in root canal anatomy, although this requires an experienced clinician.

  2. Open source pipeline for ESPaDOnS reduction and analysis

    NASA Astrophysics Data System (ADS)

    Martioli, Eder; Teeple, Doug; Manset, Nadine; Devost, Daniel; Withington, Kanoa; Venne, Andre; Tannock, Megan

    2012-09-01

    OPERA is a Canada-France-Hawaii Telescope (CFHT) open-source collaborative software project currently under development as an ESPaDOnS echelle spectro-polarimetric image reduction pipeline. OPERA is designed to be fully automated, performing calibrations and reduction and producing one-dimensional intensity and polarimetric spectra. The calibrations are performed on two-dimensional images, and spectra are extracted using an optimal extraction algorithm. While primarily designed for CFHT ESPaDOnS data, the pipeline is being written to be extensible to other echelle spectrographs. A primary design goal is to make use of fast, modern object-oriented technologies. Processing is controlled by a harness, which manages a set of processing modules that make use of a collection of native OPERA software libraries and standard external software libraries. The harness and modules are completely parametrized by site configuration and instrument parameters. The software is open-ended, permitting users of OPERA to extend the pipeline capabilities. All these features have been designed to provide a portable infrastructure that facilitates collaborative development, code re-usability and extensibility. OPERA is free software with support for both GNU/Linux and MacOSX platforms. The pipeline is hosted on SourceForge under the name "opera-pipeline".
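
    The harness-plus-modules pattern described above can be sketched in a few lines of Python (this is an illustration of the pattern, not OPERA source code): a harness drives an ordered list of processing modules, each parametrized by a shared site/instrument configuration. The module names are illustrative.

```python
# Minimal sketch of the harness/module pattern described above (not OPERA code):
# a harness runs an ordered list of processing modules, each parametrized by a
# shared site/instrument configuration. Module names are illustrative.
from typing import Callable, Dict, List

Module = Callable[[dict, Dict[str, str]], dict]

def bias_subtract(data: dict, cfg: Dict[str, str]) -> dict:
    data["bias_subtracted"] = True
    return data

def extract_spectrum(data: dict, cfg: Dict[str, str]) -> dict:
    data["spectrum"] = f"extracted with aperture={cfg['aperture']}"
    return data

class Harness:
    def __init__(self, config: Dict[str, str]):
        self.config = config
        self.modules: List[Module] = []

    def register(self, module: Module) -> None:
        self.modules.append(module)

    def run(self, data: dict) -> dict:
        for module in self.modules:       # run modules in registration order
            data = module(data, self.config)
        return data

harness = Harness(config={"aperture": "32"})
harness.register(bias_subtract)
harness.register(extract_spectrum)
print(harness.run({"image": "raw_frame"}))
```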

  3. Laboratory and telescope demonstration of the TP3-WFS for the adaptive optics segment of AOLI

    NASA Astrophysics Data System (ADS)

    Colodro-Conde, C.; Velasco, S.; Fernández-Valdivia, J. J.; López, R.; Oscoz, A.; Rebolo, R.; Femenía, B.; King, D. L.; Labadie, L.; Mackay, C.; Muthusubramanian, B.; Pérez Garrido, A.; Puga, M.; Rodríguez-Coira, G.; Rodríguez-Ramos, L. F.; Rodríguez-Ramos, J. M.; Toledo-Moreo, R.; Villó-Pérez, I.

    2017-05-01

    The Adaptive Optics Lucky Imager (AOLI) is a state-of-the-art instrument that combines adaptive optics (AO) and lucky imaging (LI) with the objective of obtaining diffraction-limited images at visible wavelengths on medium- and large-sized ground-based telescopes. The key innovation of AOLI is the development and use of the new Two Pupil Plane Positions Wavefront Sensor (TP3-WFS). The TP3-WFS, working in the visible band, represents an advance over classical wavefront sensors such as the Shack-Hartmann WFS because it can theoretically use fainter natural reference stars, which would ultimately provide better sky coverage to AO instruments using this newer sensor. This paper describes the software, algorithms and procedures that enabled AOLI to become the first astronomical instrument performing real-time AO corrections on a telescope with this new type of WFS, including the first control-related results at the William Herschel Telescope.

  4. The automatic neutron guide optimizer guide_bot

    NASA Astrophysics Data System (ADS)

    Bertelsen, Mads

    2017-09-01

    The guide optimization software guide_bot is introduced, the main purpose of which is to reduce the time spent programming when performing numerical optimization of neutron guides. A limited amount of information on the overall guide geometry and a figure of merit describing the desired beam are used to generate the code necessary to solve the problem. A generated McStas instrument file performs the Monte Carlo ray-tracing, which is controlled by iFit optimization scripts. The resulting optimal guide is thoroughly characterized, both in terms of brilliance transfer from an idealized source and on a more realistic source such as the ESS Butterfly moderator. Basic MATLAB knowledge is required from the user, but no experience with McStas or iFit is necessary. This paper briefly describes how guide_bot is used and some important aspects of the code. A short validation against earlier work is performed and shows the expected agreement. In addition, a scan over the vertical divergence requirement, where individual guide optimizations are performed for each corresponding figure of merit, provides valuable data on the consequences of this parameter. The guide_bot software package is best suited for the start of an instrument design project, as it excels at comparing a large number of different guide alternatives for a specific set of instrument requirements, but it is still applicable in later stages, as constraints can be used to optimize more specific guides.

  5. Petascale Computing for Ground-Based Solar Physics with the DKIST Data Center

    NASA Astrophysics Data System (ADS)

    Berukoff, Steven J.; Hays, Tony; Reardon, Kevin P.; Spiess, DJ; Watson, Fraser; Wiant, Scott

    2016-05-01

    When construction is complete in 2019, the Daniel K. Inouye Solar Telescope will be the most capable large-aperture, high-resolution, multi-instrument solar physics facility in the world. The telescope is designed as a four-meter off-axis Gregorian, with a rotating Coude laboratory designed to simultaneously house and support five first-light imaging and spectropolarimetric instruments. At current design, the facility and its instruments will generate data volumes of 3 PB per year and produce 10^7-10^9 metadata elements. The DKIST Data Center is being designed to store, curate, and process this flood of information, while providing association of science data and metadata with their acquisition and processing provenance. The Data Center will produce quality-controlled calibrated data sets and make them available freely and openly through modern search interfaces and APIs. Documented software and algorithms will also be made available through community repositories like GitHub for further collaboration and improvement. We discuss the current design and approach of the DKIST Data Center, describing the development cycle, early technology analysis and prototyping, and the roadmap ahead. We discuss our iterative development approach, the underappreciated challenges of calibrating ground-based solar data, the crucial integration of the Data Center within the larger Operations lifecycle, and how software and hardware support, intelligently deployed, will enable high-caliber solar physics research and community growth for the DKIST's 40-year lifespan.

  6. Analysis of spacecraft data

    NASA Technical Reports Server (NTRS)

    1984-01-01

    A software program for the production and analysis of data from the Dynamics Explorer-A (DE-A) satellite was maintained and modified, and new software was initiated. A capability was developed to process DE-A plasma-wave instrument mission analysis files on the Tektronix 4027 color CRT, for which two programs were written. The algorithm for the calibration lookup table for the plasma-wave instrument data was modified and verified, and a production program to generate color FR-80 spectrograms was written.

  7. Virtual and flexible digital signal processing system based on software PnP and component works

    NASA Astrophysics Data System (ADS)

    He, Tao; Wu, Qinghua; Zhong, Fei; Li, Wei

    2005-05-01

    The idea of software PnP (Plug & Play), analogous to hardware PnP, is put forward. Based on this idea, a flexible virtual digital signal processing system (FVDSPS) is implemented. FVDSPS is composed of a main control center, many sub-function modules, and other hardware I/O modules. The main control center sends commands to the sub-function modules and manages the running order, parameters, and results of the sub-functions. The software kernel of FVDSPS is the DSP (Digital Signal Processing) module, which communicates with the main control center through defined protocols, accepting commands and sending requests. Data sharing and exchange between the main control center and the DSP modules are carried out and managed by the file system of the Windows operating system through this communication. FVDSPS is oriented toward objects, engineers, and engineering problems. With FVDSPS, users can freely plug and play and quickly reconfigure a signal processing system for an engineering problem without programming: what you see is what you get. An engineer can thus address engineering problems directly, pay more attention to the problems themselves, and improve the flexibility, reliability and accuracy of the testing system. Because FVDSPS is built on the TCP/IP protocol, testing engineers and technology experts can collaborate over the Internet regardless of location, so engineering problems can be resolved quickly and effectively. FVDSPS can be used in many fields such as instrumentation and measurement, fault diagnosis, device maintenance, and quality control.
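
    The "software plug-and-play" idea can be sketched with a small Python example: sub-function modules register themselves with the main control center at runtime and are then invoked by command name. The module names and command format here are hypothetical illustrations, not the paper's actual protocol.

```python
# Sketch of the "software plug-and-play" idea described above: sub-function modules
# register with the main control center at runtime and are invoked by command name.
# Module names and the command format are hypothetical.
class MainControlCenter:
    def __init__(self):
        self._modules = {}

    def plug(self, name, handler):
        """Register (plug in) a sub-function module under a command name."""
        self._modules[name] = handler

    def unplug(self, name):
        """Remove (unplug) a module without affecting the rest of the system."""
        self._modules.pop(name, None)

    def command(self, name, *args, **kwargs):
        """Dispatch a command to the corresponding module, if plugged in."""
        if name not in self._modules:
            raise KeyError(f"no module plugged in for command '{name}'")
        return self._modules[name](*args, **kwargs)

center = MainControlCenter()
center.plug("fft", lambda samples: f"FFT of {len(samples)} samples")
center.plug("filter", lambda samples, cutoff: f"low-pass at {cutoff} Hz")
print(center.command("fft", [0.0] * 1024))
print(center.command("filter", [0.0] * 1024, cutoff=50))
```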

  8. New hardware and software platform for experiments on a HUBER-5042 X-ray diffractometer with a DISPLEX DE-202 helium cryostat in the temperature range of 20-300 K

    NASA Astrophysics Data System (ADS)

    Dudka, A. P.; Antipin, A. M.; Verin, I. A.

    2017-09-01

    The Huber-5042 diffractometer with a closed-cycle Displex DE-202 helium cryostat is a unique scientific instrument for carrying out X-ray diffraction experiments to study single-crystal structure in the temperature range of 20-300 K. To extend its service life and to develop new experimental techniques, diffractometer control has been transferred to a new hardware and software platform. To this end, a modern computer, a new detector readout unit, and new control interfaces for the stepper motors, temperature controller, and cryostat vacuum pumping system are used. The cooling system for the X-ray tube, the high-voltage generator, and the helium compressor and pump that maintain the desired vacuum in the cryostat have been replaced. The system for controlling the primary beam shutter has been upgraded, and biological shielding has been installed. The new program tools, which use the Linux Ubuntu operating system and the SPEC constructor, include a set of drivers for the control units connected through the aforementioned interfaces. A program for finding reflections from a sample using fast continuous scanning and a priori information about the crystal has been written. Thus, the software package for carrying out a complete cycle of precise diffraction experiments (from determining the crystal unit cell to calculating the integrated reflection intensities) has been upgraded. The high quality of the experimental data obtained with this equipment has been confirmed in a number of studies over the temperature range from 20 to 300 K.

  9. Correcting acoustic Doppler current profiler discharge measurement bias from moving-bed conditions without global positioning during the 2004 Glen Canyon Dam controlled flood on the Colorado River

    USGS Publications Warehouse

    Gartner, J.W.; Ganju, N.K.

    2007-01-01

    Discharge measurements were made by acoustic Doppler current profiler at two locations on the Colorado River during the 2004 controlled flood from Glen Canyon Dam, Arizona. Measurement hardware and software have improved steadily since the 1980s, such that discharge measurements by acoustic profiling instruments are now routinely made over a wide range of hydrologic conditions. However, measurements made with instruments deployed from moving boats require reliable boat velocity data for accurate measurements of discharge. This is normally accomplished by using special acoustic bottom-track pings that sense instrument motion over the bottom. While this method is suitable for most conditions, high flows that produce downstream bed-sediment movement create a condition known as a moving bed, which biases velocities and discharge to lower-than-actual values. When this situation exists, one solution is to determine boat velocity with satellite positioning information. Another solution is to use a lower-frequency instrument. Discharge measurements made during the 2004 Glen Canyon controlled flood were subject to moving-bed conditions and frequent loss of bottom track. Because of site conditions and equipment availability, the measurements were conducted without the benefit of external positioning information or lower-frequency instruments. This paper documents and evaluates several techniques used to correct the resulting underestimated discharge measurements. One technique produces discharge values in good agreement with estimates from a numerical model and measured hydrographs during the flood. © 2007, by the American Society of Limnology and Oceanography, Inc.

  10. Remote environmental sensor array system

    NASA Astrophysics Data System (ADS)

    Hall, Geoffrey G.

    This thesis examines the creation of an environmental monitoring system for inhospitable environments, named the Remote Environmental Sensor Array System, or RESA System for short. The thesis covers the development of RESA from its inception through the design and modeling of the hardware and software required to make it functional. Finally, the actual manufacture and laboratory testing of the finished RESA product are discussed and documented. The RESA System is designed as a cost-effective way to bring sensors and video systems to the underwater environment. It contains a water quality probe with sensors for dissolved oxygen, pH, temperature, specific conductivity, oxidation-reduction potential and chlorophyll a. In addition, an omni-directional hydrophone is included to detect underwater acoustic signals. It has a colour high-definition camera and a low-light black-and-white camera system, which in turn are coupled to a laser scaling system. Both high-intensity discharge and halogen lighting systems are included to illuminate the video images. The video and laser scaling systems are manoeuvred using pan-and-tilt units controlled from an underwater computer box. Finally, a sediment profile imager is included to enable profile images of sediment layers to be acquired. A control and manipulation system to operate the instruments and move the data across networks is integrated into the underwater system, while a power distribution node provides the correct voltages to power the instruments. Laboratory testing was completed to ensure that the different instruments associated with the RESA performed as designed. This included physical testing of the motorized instruments, calibration of the instruments, benchmark performance testing and system failure exercises.

  11. A Flexible and Non-intrusive Approach for Computing Complex Structural Coverage Metrics

    NASA Technical Reports Server (NTRS)

    Whalen, Michael W.; Person, Suzette J.; Rungta, Neha; Staats, Matt; Grijincu, Daniela

    2015-01-01

    Software analysis tools and techniques often leverage structural code coverage information to reason about the dynamic behavior of software. Existing techniques instrument the code with the required structural obligations and then monitor the execution of the compiled code to report coverage. Instrumentation-based approaches often incur considerable runtime overhead for complex structural coverage metrics such as Modified Condition/Decision Coverage (MC/DC). Code instrumentation, in general, has to be approached with great care to ensure it does not modify the behavior of the original code. Furthermore, instrumented code cannot be used in conjunction with other analyses that reason about the structure and semantics of the code under test. In this work, we introduce a non-intrusive preprocessing approach for computing structural coverage information. It uses a static partial evaluation of the decisions in the source code and a source-to-bytecode mapping to generate the information necessary to efficiently track structural coverage metrics during execution. Our technique is flexible; the results of the preprocessing can be used by a variety of coverage-driven software analysis tasks, including automated analyses that are not possible for instrumented code. Experimental results in the context of symbolic execution show the efficiency and flexibility of our non-intrusive approach for computing code coverage information.

  12. Three-phase Four-leg Inverter LabVIEW FPGA Control Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    In the area of power electronics control, Field Programmable Gate Arrays (FPGAs) have the capability to outperform their Digital Signal Processor (DSP) counterparts due to the FPGA's ability to implement true parallel processing and therefore facilitate higher switching frequencies, higher control bandwidth, and/or enhanced functionality. National Instruments (NI) has developed two platforms, Compact RIO (cRIO) and Single Board RIO (sbRIO), which combine a real-time processor with an FPGA. The FPGA can be programmed with a subset of the well-known LabVIEW graphical programming language. The use of cRIO and sbRIO for power electronics control has developed over the last few years to include control of three-phase inverters. Most three-phase inverter topologies include three switching legs. The addition of a fourth leg to natively generate the neutral connection allows the inverter to serve single-phase loads in a microgrid or stand-alone power system and to balance the three-phase voltages in the presence of significant load imbalance. However, the control of a four-leg inverter is much more complex. In particular, instead of standard two-dimensional space vector modulation (SVM), the inverter requires three-dimensional space vector modulation (3D-SVM). The candidate software implements complete control algorithms in LabVIEW FPGA for a three-phase four-leg inverter. The software includes feedback control loops, three-dimensional space vector modulation gate-drive algorithms, advanced alarm handling capabilities, contactor control, power measurements, and debugging and tuning tools. The feedback control loops allow inverter operation in AC voltage control, AC current control, or DC bus voltage control modes based on external mode selection by a user or supervisory controller. The software includes the ability to synchronize its AC output to the grid or other voltage source before connection. The software also includes provisions to allow inverter operation in parallel with other voltage-regulating devices on the AC or DC buses. This flexibility allows the inverter to operate as a stand-alone voltage source, connected to the grid, or in parallel with other controllable voltage sources as part of a microgrid or remote power system. In addition, as the inverter is expected to operate under severe unbalanced conditions, the software includes algorithms to accurately compute real and reactive power for each phase based on definitions provided in IEEE Standard 1459: IEEE Standard Definitions for the Measurement of Electric Power Quantities Under Sinusoidal, Nonsinusoidal, Balanced, or Unbalanced Conditions. Finally, the software includes code to output analog signals for debugging and for tuning of control loops. The software fits on the Xilinx Virtex V LX110 FPGA embedded in the NI cRIO-9118 FPGA chassis, and with a 40 MHz base clock, supports a modulation update rate of 40 MHz, user-settable switching frequencies and synchronized control loop update rates of tens of kHz, and a reference waveform generation, including Phase Lock Loop (PLL), update rate of 100 kHz.
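
    The per-phase power measurement mentioned above can be illustrated with a NumPy sketch (not the LabVIEW FPGA code): per-phase rms, active, apparent, and nonactive power are computed from sampled voltage and current waveforms under an unbalanced load. This uses only the simple rms/active/apparent relations; IEEE Std 1459 defines the fuller decompositions the candidate software implements. The waveform parameters are illustrative.

```python
# Sketch (not the LabVIEW FPGA code) of per-phase power computation from sampled
# voltage/current waveforms under a possibly unbalanced load. Uses the simple
# rms/active/apparent relations; IEEE Std 1459 defines fuller decompositions.
import numpy as np

def per_phase_power(v, i):
    """v, i: arrays of shape (3, N) sampled over an integer number of cycles."""
    vrms = np.sqrt(np.mean(v ** 2, axis=1))
    irms = np.sqrt(np.mean(i ** 2, axis=1))
    p = np.mean(v * i, axis=1)                       # active power per phase (W)
    s = vrms * irms                                  # apparent power per phase (VA)
    q = np.sqrt(np.maximum(s ** 2 - p ** 2, 0.0))    # nonactive power (var)
    return p, q, s

# Illustrative unbalanced three-phase example: 50 Hz, 10 kHz sampling, 10 cycles.
fs, f = 10_000, 50.0
t = np.arange(0, 0.2, 1.0 / fs)
phases = (0.0, -2 * np.pi / 3, 2 * np.pi / 3)
v = np.stack([325 * np.sin(2 * np.pi * f * t + ph) for ph in phases])
i = np.stack([10 * k * np.sin(2 * np.pi * f * t + ph - 0.3)
              for ph, k in zip(phases, (1.0, 0.6, 1.3))])
print(per_phase_power(v, i))
```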

  13. Optical Testing and Verification Methods for the James Webb Space Telescope Integrated Science Instrument Module Element

    NASA Technical Reports Server (NTRS)

    Antonille, Scott R.; Miskey, Cherie L.; Ohl, Raymond G.; Rohrbach, Scott O.; Aronstein, David L.; Bartoszyk, Andrew E.; Bowers, Charles W.; Cofie, Emmanuel; Collins, Nicholas R.; Comber, Brian J.; et al.

    2016-01-01

    NASA's James Webb Space Telescope (JWST) is a 6.5 m diameter, segmented, deployable telescope for cryogenic IR space astronomy (~40 K). The JWST Observatory includes the Optical Telescope Element (OTE) and the Integrated Science Instrument Module (ISIM) that contains four science instruments (SI) and the fine guidance sensor. The SIs are mounted to a composite metering structure. The SI and guider units were integrated to the ISIM structure and optically tested at the NASA Goddard Space Flight Center as a suite using the Optical Telescope Element SIMulator (OSIM). OSIM is a full-field, cryogenic JWST telescope simulator. SI performance, including alignment and wavefront error, was evaluated using OSIM. We describe test and analysis methods for optical performance verification of the ISIM Element, with an emphasis on the processes used to plan and execute the test. The complexity of ISIM and OSIM drove us to develop a software tool for test planning that allows for configuration control of observations, associated scripts, and management of hardware and software limits and constraints, as well as tools for rapid data evaluation and flexible re-planning in response to the unexpected. As examples of our test and analysis approach, we discuss how factors such as the ground-test thermal environment are compensated in alignment. We describe how these innovative methods for test planning, execution and post-test analysis were instrumental in the verification program for the ISIM element, with enough information to allow the reader to consider these innovations and lessons learned in this successful effort in their future testing for other programs.

  14. Optical testing and verification methods for the James Webb Space Telescope Integrated Science Instrument Module element

    NASA Astrophysics Data System (ADS)

    Antonille, Scott R.; Miskey, Cherie L.; Ohl, Raymond G.; Rohrbach, Scott O.; Aronstein, David L.; Bartoszyk, Andrew E.; Bowers, Charles W.; Cofie, Emmanuel; Collins, Nicholas R.; Comber, Brian J.; Eichhorn, William L.; Glasse, Alistair C.; Gracey, Renee; Hartig, George F.; Howard, Joseph M.; Kelly, Douglas M.; Kimble, Randy A.; Kirk, Jeffrey R.; Kubalak, David A.; Landsman, Wayne B.; Lindler, Don J.; Malumuth, Eliot M.; Maszkiewicz, Michael; Rieke, Marcia J.; Rowlands, Neil; Sabatke, Derek S.; Smith, Corbett T.; Smith, J. Scott; Sullivan, Joseph F.; Telfer, Randal C.; Te Plate, Maurice; Vila, M. Begoña.; Warner, Gerry D.; Wright, David; Wright, Raymond H.; Zhou, Julia; Zielinski, Thomas P.

    2016-09-01

    NASA's James Webb Space Telescope (JWST) is a 6.5m diameter, segmented, deployable telescope for cryogenic IR space astronomy. The JWST Observatory includes the Optical Telescope Element (OTE) and the Integrated Science Instrument Module (ISIM), that contains four science instruments (SI) and the Fine Guidance Sensor (FGS). The SIs are mounted to a composite metering structure. The SIs and FGS were integrated to the ISIM structure and optically tested at NASA's Goddard Space Flight Center using the Optical Telescope Element SIMulator (OSIM). OSIM is a full-field, cryogenic JWST telescope simulator. SI performance, including alignment and wavefront error, was evaluated using OSIM. We describe test and analysis methods for optical performance verification of the ISIM Element, with an emphasis on the processes used to plan and execute the test. The complexity of ISIM and OSIM drove us to develop a software tool for test planning that allows for configuration control of observations, implementation of associated scripts, and management of hardware and software limits and constraints, as well as tools for rapid data evaluation, and flexible re-planning in response to the unexpected. As examples of our test and analysis approach, we discuss how factors such as the ground test thermal environment are compensated in alignment. We describe how these innovative methods for test planning and execution and post-test analysis were instrumental in the verification program for the ISIM element, with enough information to allow the reader to consider these innovations and lessons learned in this successful effort in their future testing for other programs.

  15. Addressing software security risk mitigations in the life cycle

    NASA Technical Reports Server (NTRS)

    Gilliam, David; Powell, John; Haugh, Eric; Bishop, Matt

    2003-01-01

    The NASA Office of Safety and Mission Assurance (OSMA) has funded the Jet Propulsion Laboratory (JPL) with a Center Initiative, 'Reducing Software Security Risk through an Integrated Approach' (RSSR), to address this need. The Initiative is a formal approach to addressing software security in the life cycle through the instantiation of a Software Security Assessment Instrument (SSAI) for the development and maintenance life cycles.

  16. An autonomous observation and control system based on EPICS and RTS2 for Antarctic telescopes

    NASA Astrophysics Data System (ADS)

    Zhang, Guang-yu; Wang, Jian; Tang, Peng-yi; Jia, Ming-hao; Chen, Jie; Dong, Shu-cheng; Jiang, Fengxin; Wu, Wen-qing; Liu, Jia-jing; Zhang, Hong-fei

    2016-01-01

    For unattended telescopes in Antarctica, remote operation and autonomous observation and control are essential. An autonomous observation and control system with remote operation, based on EPICS (Experimental Physics and Industrial Control System) and RTS2 (Remote Telescope System, 2nd Version), is introduced in this paper. EPICS is a set of open-source software tools, libraries and applications developed collaboratively and used worldwide to create distributed soft real-time control systems for scientific instruments, while RTS2 is an open-source environment for the control of a fully autonomous observatory. Taking advantage of EPICS and RTS2 respectively, a combined, integrated software framework for autonomous observation and control is established that uses RTS2 for the astronomical observation functions and EPICS for device control of the telescope. A command and status interface between EPICS and RTS2 is designed so that EPICS IOC (Input/Output Controller) components integrate with RTS2 directly. To meet the specifications and requirements of the telescope control system in Antarctica, core components named Executor and Auto-focus for autonomous observation are designed and implemented, with a remote-operation user interface based on a browser-server mode. The whole system, including the telescope, has been tested at Lijiang Observatory in Yunnan Province in practical observations to demonstrate autonomous observation and control, including telescope control, camera control, dome control, weather information acquisition, and local and remote operation.
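
    The command/status bridging idea can be sketched with the pyepics client library, which exposes caget/caput calls for reading and writing EPICS process variables. The PV names, the assumed state encoding, and the notify_rts2 placeholder below are hypothetical illustrations, not the paper's actual interface.

```python
# Minimal sketch of bridging device status/commands between EPICS and an
# observatory executive. Uses the pyepics client API (caget/caput); the PV names
# and the notify_rts2 placeholder are hypothetical, not the paper's actual interface.
from epics import caget, caput

DOME_STATE_PV = "AST:DOME:STATE"       # hypothetical PV names
DOME_CMD_PV = "AST:DOME:CMD"

def notify_rts2(message: str) -> None:
    # Placeholder for the RTS2 side of the interface.
    print("RTS2 <-", message)

def open_dome_if_closed() -> None:
    state = caget(DOME_STATE_PV)       # read device status from the EPICS IOC
    if state == 0:                     # assume 0 means "closed"
        caput(DOME_CMD_PV, 1)          # assume 1 means "open"
        notify_rts2("dome open commanded")
    else:
        notify_rts2(f"dome already in state {state}")

if __name__ == "__main__":
    open_dome_if_closed()
```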

  17. The New Meteor Radar at Penn State: Design and First Observations

    NASA Technical Reports Server (NTRS)

    Urbina, J.; Seal, R.; Dyrud, L.

    2011-01-01

    In an effort to provide new and improved meteor radar sensing capabilities, Penn State has been developing advanced instruments and technologies for future meteor radars, with the primary objective of making such instruments more capable and more cost-effective in order to study the basic properties of the global meteor flux, such as average mass, velocity, and chemical composition. Using low-cost field programmable gate arrays (FPGAs) combined with open-source software tools, we describe a design methodology that enables the development of state-of-the-art radar instrumentation by creating a generalized instrumentation core that can be customized using specialized output-stage hardware. Furthermore, using object-oriented programming (OOP) techniques and open-source tools, we illustrate a technique to provide a cost-effective, generalized software framework that uniquely defines an instrument's functionality through a customizable interface implemented by the designer. The new instrument is intended to provide instantaneous profiles of atmospheric parameters and climatology on a daily basis throughout the year. An overview of the instrument design concepts and some of the emerging technologies developed for this meteor radar are presented.

  18. NASA Tech Briefs, January 2004

    NASA Technical Reports Server (NTRS)

    2004-01-01

    Topics covered include: Multisensor Instrument for Real-Time Biological Monitoring; Sensor for Monitoring Nanodevice-Fabrication Plasmas; Backed Bending Actuator; Compact Optoelectronic Compass; Micro Sun Sensor for Spacecraft; Passive IFF: Autonomous Nonintrusive Rapid Identification of Friendly Assets; Finned-Ladder Slow-Wave Circuit for a TWT; Directional Radio-Frequency Identification Tag Reader; Integrated Solar-Energy-Harvesting and -Storage Device; Event-Driven Random-Access-Windowing CCD Imaging System; Stroboscope Controller for Imaging Helicopter Rotors; Software for Checking State-charts; Program Predicts Broadband Noise from a Turbofan Engine; Protocol for a Delay-Tolerant Data-Communication Network; Software Implements a Space-Mission File-Transfer Protocol; Making Carbon-Nanotube Arrays Using Block Copolymers: Part 2; Modular Rake of Pitot Probes; Preloading To Accelerate Slow-Crack-Growth Testing; Miniature Blimps for Surveillance and Collection of Samples; Hybrid Automotive Engine Using Ethanol-Burning Miller Cycle; Fabricating Blazed Diffraction Gratings by X-Ray Lithography; Freeze-Tolerant Condensers; The StarLight Space Interferometer; Champagne Heat Pump; Controllable Sonar Lenses and Prisms Based on ERFs; Measuring Gravitation Using Polarization Spectroscopy; Serial-Turbo-Trellis-Coded Modulation with Rate-1 Inner Code; Enhanced Software for Scheduling Space-Shuttle Processing; Bayesian-Augmented Identification of Stars in a Narrow View; Spacecraft Orbits for Earth/Mars-Lander Radio Relay; and Self-Inflatable/Self-Rigidizable Reflectarray Antenna.

  19. Effects of sodium hypochlorite associated with EDTA and etidronate on apical root transportation.

    PubMed

    Silva e Souza, P A R; das Dores, R S E; Tartari, T; Pinheiro, T P S; Tuji, F M; Silva e Souza, M H

    2014-01-01

    To evaluate the influence of sodium hypochlorite associated with EDTA and etidronate on apical root transportation. Forty-five roots of human mandibular molars with curvatures of 15-25° were embedded in acrylic resin to allow standardized angulation of the initial and final radiographs. The pre-instrumentation radiographs of the mesiobuccal canal of each root were taken using a digital radiograph sensor with a size 15 K-file in the canal. The canals were prepared with the ProTaper Universal system (Dentsply Maillefer, Ballaigues, Switzerland), using one of the following irrigation regimens during instrumentation (n = 15): G1 - irrigation with 20 mL of saline solution (control); G2 - alternating irrigation with 2.5% sodium hypochlorite solution (NaOCl) (15 mL) and 17% ethylenediaminetetraacetic acid (EDTA) (5 mL), in which the canal was filled with NaOCl during instrumentation and filled with EDTA for 1 min between each exchange of instruments; and G3 - irrigation with 20 mL of 5% NaOCl and 18% etidronate solution (HEBP) mixed in equal parts. The post-instrumentation radiographs were made with an F3 instrument in the canal. The images were magnified and superimposed with Adobe Photoshop software (Adobe Systems, Mountain View, CA, USA). Apical transportation was determined with AutoCAD 2012 software (Autodesk Inc., San Rafael, CA, USA) by measuring the distance in millimetres between the tips of the instruments. The results were subjected to the nonparametric Kruskal-Wallis test (α < 0.05). The median transportation and interquartile range values were 0.00 ± 0.05 for G1, 0.08 ± 0.23 for G2 and 0.13 ± 0.14 for G3. Comparison between groups showed that apical transportation in G3 was significantly greater than in G1 (P < 0.05). The use of NaOCl associated with etidronate increased apical transportation in the canals of extracted teeth. © 2013 International Endodontic Journal. Published by John Wiley & Sons Ltd.

  20. TES Instrument Decommissioning

    Atmospheric Science Data Center

    2018-03-20

    TES Instrument Decommissioning Tuesday, March 20, 2018 ... PST during a scheduled real time satellite contact the TES IOT along with the Aura FOT commanded the TES instrument to its ... generated from an algorithm update to the base Ground Data System software and will be made available to the scientific community in the ...

  1. Integration of instrumentation and processing software of a laser speckle contrast imaging system

    NASA Astrophysics Data System (ADS)

    Carrick, Jacob J.

    Laser speckle contrast imaging (LSCI) has the potential to be a powerful tool in medicine, but more research in the field is required so it can be used properly. To help in the progression of Michigan Tech's research in the field, a graphical user interface (GUI) was designed in Matlab to control the instrumentation of the experiments as well as process the raw speckle images into contrast images while they are being acquired. The design of the system was successful and is currently being used by Michigan Tech's Biomedical Engineering department. This thesis describes the development of the LSCI GUI as well as offering a full introduction into the history, theory and applications of LSCI.
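
    The raw-speckle-to-contrast conversion described above can be illustrated with a short NumPy/SciPy sketch of the standard spatial contrast computation, K = σ/⟨I⟩ over a sliding window. The 7x7 window and the synthetic frame are typical illustrative choices, not necessarily the parameters used in the thesis (which implemented the processing in MATLAB).

```python
# Minimal sketch of the raw-speckle -> contrast-image step: spatial speckle contrast
# K = sigma / mean over a sliding window. The 7x7 window is a typical choice, not
# necessarily the one used in the thesis.
import numpy as np
from scipy.ndimage import uniform_filter

def speckle_contrast(frame: np.ndarray, window: int = 7) -> np.ndarray:
    frame = frame.astype(np.float64)
    mean = uniform_filter(frame, size=window)
    mean_sq = uniform_filter(frame ** 2, size=window)
    var = np.maximum(mean_sq - mean ** 2, 0.0)       # local variance
    return np.sqrt(var) / np.maximum(mean, 1e-12)    # contrast image K

if __name__ == "__main__":
    raw = np.random.gamma(shape=1.0, scale=100.0, size=(256, 256))  # fake speckle
    k = speckle_contrast(raw)
    print("mean contrast:", k.mean().round(3))
```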

  2. The AIROPA software package: milestones for testing general relativity in the strong gravity regime with AO

    NASA Astrophysics Data System (ADS)

    Witzel, Gunther; Lu, Jessica R.; Ghez, Andrea M.; Martinez, Gregory D.; Fitzgerald, Michael P.; Britton, Matthew; Sitarski, Breann N.; Do, Tuan; Campbell, Randall D.; Service, Maxwell; Matthews, Keith; Morris, Mark R.; Becklin, E. E.; Wizinowich, Peter L.; Ragland, Sam; Doppmann, Greg; Neyman, Chris; Lyke, James; Kassis, Marc; Rizzi, Luca; Lilley, Scott; Rampy, Rachel

    2016-07-01

    General relativity can be tested in the strong gravity regime by monitoring stars orbiting the supermassive black hole at the Galactic Center with adaptive optics. However, the limiting source of uncertainty is the spatial PSF variability due to atmospheric anisoplanatism and instrumental aberrations. The Galactic Center Group at UCLA has completed a project developing algorithms to predict PSF variability for Keck AO images. We have created a new software package (AIROPA), based on modified versions of StarFinder and Arroyo, that takes atmospheric turbulence profiles, instrumental aberration maps, and images as inputs and delivers improved photometry and astrometry on crowded fields. This software package will be made publicly available soon.

  3. Educational Software Evaluation Form for Teachers

    ERIC Educational Resources Information Center

    Kara, Yilmaz

    2007-01-01

    The purpose of the study was to develop an educational software evaluation form to provide an evaluation and selection instrument of educational software that met the requirements of some balance between mechanics, content and pedagogy that is user friendly. The subjects for the study comprised a group of 32 biology teachers working in secondary…

  4. TOPEX SDR Processing, October 1998. Volume 4

    NASA Technical Reports Server (NTRS)

    Lee, Jeffrey E.; Lockwood, Dennis W.

    2003-01-01

    This document is a compendium of the WFF TOPEX Software Development Team's knowledge regarding Sensor Data Record (SDR) Processing. It includes many elements of a requirements document, a software specification document, a software design document, and a user's manual. In the more technical sections, this document assumes the reader is familiar with TOPEX and instrument files.

  5. Evaluation of software sensors for on-line estimation of culture conditions in an Escherichia coli cultivation expressing a recombinant protein.

    PubMed

    Warth, Benedikt; Rajkai, György; Mandenius, Carl-Fredrik

    2010-05-03

    Software sensors for monitoring and on-line estimation of critical bioprocess variables have mainly been used with standard bioreactor sensors, such as electrodes and gas analyzers, where algorithms in the software model have generated the desired state variables. In this article we propose that other on-line instruments, such as NIR probes and on-line HPLC, should be used to make more reliable and flexible software sensors. Five software sensor architectures were compared and evaluated: (1) biomass concentration from an on-line NIR probe, (2) biomass concentration from titrant addition, (3) specific growth rate from titrant addition, (4) specific growth rate from the NIR probe, and (5) specific substrate uptake rate and by-product rate from on-line HPLC and NIR probe signals. The software sensors were demonstrated on an Escherichia coli cultivation expressing a recombinant protein, green fluorescent protein (GFP), but the results could be extrapolated to other production organisms and product proteins. We conclude that well-maintained on-line instrumentation (hardware sensors) can increase the potential of software sensors. This would also strongly support the intentions with process analytical technology and quality-by-design concepts. 2010 Elsevier B.V. All rights reserved.
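
    Software-sensor idea (4) above, estimating the specific growth rate from an on-line biomass signal, can be sketched in a few lines: mu is taken as the time derivative of ln X with light smoothing. The sample data and the finite-difference/smoothing choices are illustrative assumptions, not the article's algorithm.

```python
# Small sketch of software-sensor idea (4): estimate the specific growth rate mu
# from an on-line biomass signal as mu = d(ln X)/dt. Sample data and the simple
# finite-difference/smoothing choices are illustrative.
import numpy as np

def specific_growth_rate(t_h, biomass):
    """t_h: time in hours; biomass: concentration (g/L), e.g. from an NIR probe."""
    mu = np.gradient(np.log(biomass), t_h)           # 1/h
    kernel = np.ones(5) / 5.0                        # short moving average
    return np.convolve(mu, kernel, mode="same")      # tame sensor noise

t = np.linspace(0, 10, 101)                          # h
x = 0.2 * np.exp(0.45 * t) * (1 + 0.01 * np.random.randn(t.size))
print(specific_growth_rate(t, x)[40:45].round(3))    # ~0.45 1/h mid-culture
```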

  6. A method to establish seismic noise baselines for automated station assessment

    USGS Publications Warehouse

    McNamara, D.E.; Hutt, C.R.; Gee, L.S.; Benz, H.M.; Buland, R.P.

    2009-01-01

    We present a method for quantifying station noise baselines and characterizing the spectral shape of out-of-nominal noise sources. Our intent is to automate this method in order to ensure that only the highest-quality data are used in rapid earthquake products at NEIC. In addition, the station noise baselines provide a valuable tool to support the quality control of GSN and ANSS backbone data and metadata. The procedures addressed here are currently in development at the NEIC, and work is underway to understand how quickly changes from nominal can be observed and used within the NEIC processing framework. The spectral methods and software used to compute station baselines and described herein (PQLX) can be useful to both permanent and portable seismic station operators. Applications include: general seismic station and data quality control (QC), evaluation of instrument responses, assessment of near-real-time communication system performance, characterization of site cultural noise conditions, and evaluation of sensor vault design, as well as assessment of gross network capabilities (McNamara et al. 2005). Future PQLX development plans include incorporating station baselines for automated QC methods and automating station status report generation and notification based on user-defined QC parameters. The PQLX software is available through the USGS (http://earthquake.usgs.gov/research/software/pqlx.php) and IRIS (http://www.iris.edu/software/pqlx/).
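
    The baseline idea can be sketched in Python: estimate a power spectral density for many data segments and take per-frequency percentiles as the station's noise envelope. The sketch uses SciPy's Welch estimator on synthetic data; PQLX's actual PDF/PSD machinery and the sample rate below differ in detail and are assumptions.

```python
# Sketch of the station-noise-baseline idea: estimate a PSD for many data segments
# and take per-frequency percentiles as the baseline envelope. Uses scipy's Welch
# estimator on synthetic data; PQLX's actual PDF/PSD machinery differs in detail.
import numpy as np
from scipy.signal import welch

def noise_baseline(segments, fs, pcts=(5, 50, 95)):
    psds = []
    for seg in segments:
        f, pxx = welch(seg, fs=fs, nperseg=4096)
        psds.append(pxx)
    psds = np.array(psds)
    return f, {p: np.percentile(psds, p, axis=0) for p in pcts}

fs = 40.0                                        # Hz, an assumed sample rate
rng = np.random.default_rng(0)
segments = [rng.normal(scale=1e-7, size=int(3600 * fs)) for _ in range(24)]  # 24 h
f, baseline = noise_baseline(segments, fs)
idx = np.argmin(np.abs(f - 1.0))                 # bin nearest 1 Hz
print("5th/95th percentile PSD at ~1 Hz:", baseline[5][idx], baseline[95][idx])
```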

  7. Study of application of space telescope science operations software for SIRTF use

    NASA Technical Reports Server (NTRS)

    Dignam, F.; Stetson, E.; Allendoerfer, W.

    1985-01-01

    The design and development of the Space Telescope Science Operations Ground System (ST SOGS) was evaluated to compile a history of lessons learned that would benefit NASA's Space Infrared Telescope Facility (SIRTF). Forty-nine specific recommendations resulted and were categorized as follows: (1) requirements: a discussion of the content, timeliness and proper allocation of the system and segment requirements and the resulting impact on SOGS development; (2) science instruments: a consideration of the impact of the Science Instrument design and data streams on SOGS software; and (3) contract phasing: an analysis of the impact of beginning the various ST program segments at different times. Approximately half of the software design and source code might be useable for SIRTF. Transportability of this software requires, at minimum, a compatible DEC VAX-based architecture and VMS operating system, system support software similar to that developed for SOGS, and continued evolution of the SIRTF operations concept and requirements such that they remain compatible with ST SOGS operation.

  8. Automation is an Effective Way to Improve Quality of Verification (Calibration) of Measuring Instruments

    NASA Astrophysics Data System (ADS)

    Golobokov, M.; Danilevich, S.

    2018-04-01

    In order to assess calibration reliability and to automate such assessment, procedures for data collection and a simulation study of a thermal imager calibration procedure have been elaborated. The existing calibration techniques do not always provide high reliability. A new method for analyzing existing calibration techniques and developing new, efficient ones has been suggested and tested. A type of software has been studied that generates instrument calibration reports automatically, monitors their proper configuration, processes measurement results and assesses instrument validity. The use of such software reduces the man-hours spent on finalizing calibration data by a factor of 2 to 5 and eliminates a whole set of typical operator errors.

  9. Integration of symbolic and algorithmic hardware and software for the automation of space station subsystems

    NASA Technical Reports Server (NTRS)

    Gregg, Hugh; Healey, Kathleen; Hack, Edmund; Wong, Carla

    1987-01-01

    Expert systems that require access to databases, complex simulations and real-time instrumentation have both symbolic and algorithmic computing needs. These needs could be met either by a general computing workstation running both symbolic and algorithmic code, or by separate, specialized computers networked together. The latter approach was chosen to implement TEXSYS, the thermal expert system developed to demonstrate the ability of an expert system to autonomously control the thermal control system of the space station. TEXSYS has been implemented on a Symbolics workstation and will be linked to a microVAX computer that will control a thermal test bed. Integration options are explored and several possible solutions are presented.

  10. A microprocessor-based control system for the Vienna PDS microdensitometer

    NASA Technical Reports Server (NTRS)

    Jenkner, H.; Stoll, M.; Hron, J.

    1984-01-01

    The Motorola Exorset 30 system, based on a Motorola 6809 microprocessor, which serves as the control processor for the microdensitometer, is presented. User communication and instrument control are implemented in this system; data transmission to a host computer is provided via standard interfaces. The Vienna PDS system (VIPS) software was developed in BASIC and M6809 assembler. It provides efficient user interaction via function keys and argument input in a menu-oriented environment. All parameters can be stored on, and retrieved from, minifloppy disks, making it possible to set up large scanning tasks. Extensive user information includes continuously updated status and coordinate displays, as well as a real-time graphic display during scanning.

  11. Recent developments for the Large Binocular Telescope Guiding Control Subsystem

    NASA Astrophysics Data System (ADS)

    Golota, T.; De La Peña, M. D.; Biddick, C.; Lesser, M.; Leibold, T.; Miller, D.; Meeks, R.; Hahn, T.; Storm, J.; Sargent, T.; Summers, D.; Hill, J.; Kraus, J.; Hooper, S.; Fisher, D.

    2014-07-01

    The Large Binocular Telescope (LBT) has eight Acquisition, Guiding, and wavefront Sensing Units (AGw units). They provide guiding and wavefront sensing capability at eight different locations at both the direct and bent Gregorian focal stations. The recent addition of focal stations for the PEPSI and MODS instruments doubled the number of focal stations in use, including the respective motion and camera-controller server computers and the software infrastructure communicating with the Guiding Control Subsystem (GCS). This paper describes the improvements made to the LBT GCS and explains how these changes have led to better maintainability and contributed to increased reliability. This paper also discusses the current GCS status and reviews potential upgrades to further improve its performance.

  12. Accessible laparoscopic instrument tracking ("InsTrac"): construct validity in a take-home box simulator.

    PubMed

    Partridge, Roland W; Hughes, Mark A; Brennan, Paul M; Hennessey, Iain A M

    2014-08-01

    Objective performance feedback has potential to maximize the training benefit of laparoscopic simulators. Instrument movement metrics are, however, currently the preserve of complex and expensive systems. We aimed to develop and validate affordable, user-ready software that provides objective feedback by tracking instrument movement in a "take-home" laparoscopic simulator. Computer-vision processing tracks the movement of colored bands placed around the distal instrument shafts. The position of each instrument is logged from the simulator camera feed and movement metrics calculated in real time. Ten novices (junior doctors) and 13 general surgery trainees (StR) (training years 3-7) performed a standardized task (threading string through hoops) on the eoSim (eoSurgical™ Ltd., Edinburgh, Scotland, United Kingdom) take-home laparoscopic simulator. Statistical analysis was performed using unpaired t tests with Welch's correction. The software was able to track the instrument tips reliably and effectively. Significant differences between the two groups were observed in time to complete task (StR versus novice, 2 minutes 33 seconds versus 9 minutes 53 seconds; P=.01), total distance traveled by instruments (3.29 m versus 11.38 m, respectively; P=.01), average instrument motion smoothness (0.15 mm/s³ versus 0.06 mm/s³, respectively; P<.01), and handedness (mean difference between dominant and nondominant hand) (0.55 m versus 2.43 m, respectively; P=.03). There was no significant difference seen in the distance between instrument tips, acceleration, speed of instruments, or time off-screen. We have developed software that brings objective performance feedback to the portable laparoscopic box simulator. Construct validity has been demonstrated. Removing the need for additional motion-tracking hardware makes it affordable and accessible. It is user-ready and has the potential to enhance the training benefit of portable simulators both in the workplace and at home.
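
    The colour-band tracking workflow described above can be sketched with off-the-shelf computer-vision tools. The following is a minimal Python/OpenCV sketch, not the InsTrac code itself; the HSV colour range, the video file name, and the pixel-based distance metric are assumptions for illustration.

      import cv2
      import numpy as np

      LOWER_HSV = np.array([100, 120, 60])    # assumed range for a blue instrument band
      UPPER_HSV = np.array([130, 255, 255])

      def band_centroid(frame_bgr):
          """Return the (x, y) centroid of the coloured band, or None if it is not visible."""
          hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
          mask = cv2.inRange(hsv, LOWER_HSV, UPPER_HSV)
          m = cv2.moments(mask)
          if m["m00"] == 0:
              return None
          return (m["m10"] / m["m00"], m["m01"] / m["m00"])

      cap = cv2.VideoCapture("task_recording.mp4")   # recorded simulator camera feed (hypothetical file)
      path_length_px, prev = 0.0, None
      while True:
          ok, frame = cap.read()
          if not ok:
              break
          c = band_centroid(frame)
          if c is not None and prev is not None:
              path_length_px += np.hypot(c[0] - prev[0], c[1] - prev[1])
          prev = c
      cap.release()
      print(f"total tracked band travel: {path_length_px:.1f} px")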

  13. Loading, electromyograph, and motion during exercise

    NASA Technical Reports Server (NTRS)

    Figueroa, Fernando

    1993-01-01

    A system is being developed to gather kineto-dynamic data for a study to determine the load vectors applied to bone during exercise on equipment similar to that used in space. This information will quantify bone loading for exercise countermeasures development. Decreased muscle loading and external loading of bone during weightlessness results in cancellous bone loss of 1 percent per month in the lower extremities and 2 percent per month in the calcaneus. It is hypothesized that loading bone appropriately during exercise may prevent the bone loss. The system consists of an ergometer instrumented to provide position of the pedal (foot), pedaling forces on the foot (on the sagittal plane), and force on the seat. Accelerometers attached to the limbs will provide acceleration. These data will be used as input to an analytical model of the limb to determine forces on the bones and on groups of muscles. EMG signals from activity in the muscles will also be used in conjunction with the equations of mechanics of motion to be able to discern forces exerted by specific muscles. The tasks to be carried out include: design of various mechanical components to mount transducers, specification of mechanical components, specification of position transducers, development of a scheme to control the data acquisition instruments (TEAC recorder and optical encoder board), development of a dynamic model of the limbs in motion, and development of an overall scheme for data collection, analysis and presentation. At the present time, all the hardware components of the system are operational, except for a computer board to gather position data from the pedals and crank. This board, however, may be put to use by anyone with background in computer-based instrumentation. The software components are not all done. Software to transfer data recorded from the EMG measurements is operational, and software to drive the optical encoder card is mostly done. The equations to model the kinematics and dynamics of motion of the limbs have been developed, but they have not yet been implemented in software. Aside from the development of the hardware and software components of the system, the methodology to use accelerometers and encoders and the formulation of the appropriate equations are an important contribution to the area of biomechanics, particularly in space applications.

  14. A steep peripheral ring in irregular cornea topography, real or an instrument error?

    PubMed

    Galindo-Ferreiro, Alicia; Galvez-Ruiz, Alberto; Schellini, Silvana A; Galindo-Alonso, Julio

    2016-01-01

    To demonstrate that the steep peripheral ring (red zone) on corneal topography after myopic laser in situ keratomileusis (LASIK) could possibly be due to instrument error and not always to a real increase in corneal curvature. A spherical model of the corneal surface and modified topography software were used to analyze the cause of an error due to instrument design. This study involved modification of the software of a commercially available topographer. A small modification of the topography image results in a red zone on the corneal topography color map. Corneal modeling indicates that the red zone could be an artifact due to an instrument-induced error. The steep curvature change after LASIK, signified by the red zone, could therefore also be an error arising from the plotting algorithms of the corneal topographer, rather than only a real steepening of curvature.

  15. A control system of a mini survey facility for photometric monitoring

    NASA Astrophysics Data System (ADS)

    Tsutsui, Hironori; Yanagisawa, Kenshi; Izumiura, Hideyuki; Shimizu, Yasuhiro; Hanaue, Takumi; Ita, Yoshifusa; Ichikawa, Takashi; Komiyama, Takahiro

    2016-08-01

    We have built a control system for a mini survey facility dedicated to photometric monitoring of nearby bright (K<5) stars in the near-infrared region. The facility comprises a 4-m-diameter rotating dome and a small (30-mm aperture) wide-field (5 × 5 sq. deg. field of view) infrared (1.0-2.5 microns) camera on an equatorial fork mount, as well as power sources and other associated equipment. All the components other than the camera are controlled by microcomputer-based I/O boards that were developed in-house and are used in many of the open-use instruments at our observatory. We present the specifications and configuration of the facility hardware, as well as the structure of its control software.

  16. Mechanism controller system for the optical spectroscopic and infrared remote imaging system instrument on board the Rosetta space mission

    NASA Astrophysics Data System (ADS)

    Castro Marín, J. M.; Brown, V. J. G.; López Jiménez, A. C.; Rodríguez Gómez, J.; Rodrigo, R.

    2001-05-01

    The optical, spectroscopic and infrared remote imaging system (OSIRIS) is an instrument carried on board the European Space Agency spacecraft Rosetta that will be launched in January 2003 to study in situ the comet Wirtanen. The electronic design of the mechanism controller board (MCB) system of the two OSIRIS optical cameras, the narrow angle camera and the wide angle camera, is described here. The system comprises two boards mounted on an aluminum frame as part of an electronics box that contains the power supply and the digital processor unit of the instrument. The mechanisms controlled by the MCB for each camera are the front door assembly and a filter wheel assembly. The front door assembly for each camera is driven by a four-phase, permanent magnet stepper motor. Each filter wheel assembly consists of two eight-filter wheels. Each wheel is driven by a four-phase, variable reluctance stepper motor. Each motor, for all the assemblies, also contains a redundant set of four stator phase windings that can be energized separately or in parallel with the main windings. All stepper motors are driven in both directions using the full step unipolar mode of operation. The MCB also performs general housekeeping data acquisition for the OSIRIS instrument, i.e., mechanism position encoders and temperature measurements. The electronic design is novel in its use of field programmable gate array devices, which avoids the now traditional approach of a system controlled by microcontrollers and software. Electrical tests of the engineering model have been performed successfully and the system is ready for space qualification after environmental testing. This system may be of interest to institutions involved in future space experiments with similar needs for mechanism control.

  17. Observing control and data reduction at the UKIRT

    NASA Astrophysics Data System (ADS)

    Bridger, Alan; Economou, Frossie; Wright, Gillian S.; Currie, Malcolm J.

    1998-07-01

    For the past seven years observing with the major instruments at the United Kingdom IR Telescope (UKIRT) has been semi-automated, using ASCII files to configure the instruments and then sequence a series of exposures and telescope movements to acquire the data. For one instrument automatic data reduction completes the cycle. The emergence of recent software technologies has suggested an evolution of this successful system to provide a friendlier and more powerful interface to observing at UKIRT. The Observatory Reduction and Acquisition Control (ORAC) project is now underway to construct this system. A key aim of ORAC is to allow a more complete description of the observing program, including the target sources and the recipe that will be used to provide on-line data reduction. Remote observation preparation and submission will also be supported. In parallel the observatory control system will be upgraded to use these descriptions for more automatic observing, while retaining the 'classical' interactive observing mode. The final component of the project is an improved automatic data reduction system, allowing on-line reduction of data at the telescope while retaining the flexibility to cope with changing observing techniques and instruments. The user will also automatically be provided with the scripts used for the real-time reduction to help provide post-observing data reduction support. The overall project goal is to improve the scientific productivity of the telescope, but it should also reduce the overall ongoing support requirements, and has the eventual goal of supporting the use of queue-scheduled observing.

  18. Design and installation of a next generation pilot scale fermentation system.

    PubMed

    Junker, B; Brix, T; Lester, M; Kardos, P; Adamca, J; Lynch, J; Schmitt, J; Salmon, P

    2003-01-01

    Four new fermenters were designed and constructed for use in secondary metabolite cultivations, bioconversions, and enzyme production. A new PC/PLC-based control system also was implemented using GE Fanuc PLCs, Genius I/O blocks, and Fix Dynamics SCADA software. These systems were incorporated into an industrial research fermentation pilot plant, designed and constructed in the early 1980s. Details of the design of these new fermenters and the new control system are described and compared with the existing installation for expected effectiveness. In addition, the reasoning behind selection of some of these features has been included. Key to the design was the goal of preserving similarity between the new and previously existing and successfully utilized fermenter hardware and software installations where feasible but implementing improvements where warranted and beneficial. Examples of enhancements include strategic use of Inconel as a material of construction to reduce corrosion, piping layout design for simplified hazardous energy isolation, on-line calculation and control of nutrient feed rates, and the use of field I/O modules located near the vessel to permit low-cost addition of new instrumentation.

  19. The ASTRI SST-2M telescope prototype for the Cherenkov Telescope Array: camera DAQ software architecture

    NASA Astrophysics Data System (ADS)

    Conforti, Vito; Trifoglio, Massimo; Bulgarelli, Andrea; Gianotti, Fulvio; Fioretti, Valentina; Tacchini, Alessandro; Zoli, Andrea; Malaguti, Giuseppe; Capalbi, Milvia; Catalano, Osvaldo

    2014-07-01

    ASTRI (Astrofisica con Specchi a Tecnologia Replicante Italiana) is a Flagship Project financed by the Italian Ministry of Education, University and Research, and led by INAF, the Italian National Institute of Astrophysics. Within this framework, INAF is currently developing an end-to-end prototype of a Small Size dual-mirror Telescope. In a second phase the ASTRI project foresees the installation of the first elements of the array at CTA southern site, a mini-array of 7 telescopes. The ASTRI Camera DAQ Software is aimed at the Camera data acquisition, storage and display during Camera development as well as during commissioning and operations on the ASTRI SST-2M telescope prototype that will operate at the INAF observing station located at Serra La Nave on the Mount Etna (Sicily). The Camera DAQ configuration and operations will be sequenced either through local operator commands or through remote commands received from the Instrument Controller System that commands and controls the Camera. The Camera DAQ software will acquire data packets through a direct one-way socket connection with the Camera Back End Electronics. In near real time, the data will be stored in both raw and FITS format. The DAQ Quick Look component will allow the operator to display in near real time the Camera data packets. We are developing the DAQ software adopting the iterative and incremental model in order to maximize the software reuse and to implement a system which is easily adaptable to changes. This contribution presents the Camera DAQ Software architecture with particular emphasis on its potential reuse for the ASTRI/CTA mini-array.

  20. General Mode Scanning Probe Microscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Somnath, Suhas; Jesse, Stephen

    A critical part of SPM measurements is the information transfer from the probe-sample junction to the measurement system. Current information transfer methods heavily compress the information-rich data stream by averaging the data over a time interval, or via heterodyne detection approaches such as lock-in amplifiers and phase-locked loops. As a consequence, highly valuable information at the sub-microsecond time scales or information from frequencies outside the measurement band is lost. We have developed a fundamentally new approach called General Mode (G-mode), where we can capture the complete information stream from the detectors in the microscope. The availability of the complete information allows the microscope operator to analyze the data via information-theory analysis or comprehensive physical models. Furthermore, the complete data stream enables advanced data-driven filtering algorithms, multi-resolution imaging, ultrafast spectroscopic imaging, spatial mapping of multidimensional variability in material properties, etc. Though we applied this approach to scanning probe microscopy, the general philosophy of G-mode can be applied to many other modes of microscopy. G-mode data is captured by completely custom software written in LabVIEW and Matlab. The software generates the waveforms to electrically, thermally, or mechanically excite the SPM probe. It handles real-time communications with the microscope software for operations such as moving the SPM probe position and also controls other instrumentation hardware. The software also controls multiple variants of high-speed data acquisition cards to excite the SPM probe with the excitation waveform and simultaneously measure multiple channels of information from the microscope detectors at sampling rates of 1-100 MHz. The software also saves the raw data to the computer and allows the microscope operator to visualize processed or filtered data during the experiment. The software performs all these features while offering a user-friendly interface.
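
    The G-mode acquisition itself is implemented in LabVIEW and Matlab as described above; the short Python sketch below only illustrates the underlying idea of driving the probe with a synthesized waveform and retaining the full, unaveraged response stream. The sampling rate, drive frequency, and file name are assumed values.

      import numpy as np

      fs = 4_000_000        # assumed DAQ sampling rate, 4 MHz
      f_drive = 75_000      # assumed cantilever drive frequency, 75 kHz
      duration = 0.01       # one 10 ms segment of the scan

      t = np.arange(int(fs * duration)) / fs
      excitation = 0.5 * np.sin(2 * np.pi * f_drive * t)   # volts sent to the probe

      # In G-mode the raw detector stream is stored as-is; filtering or physical-model
      # fits are applied afterwards, offline, so no information is discarded here.
      response = np.zeros_like(excitation)                 # placeholder for acquired samples
      np.save("gmode_segment_0000.npy", np.vstack([excitation, response]))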

  1. Ground Support Software for Spaceborne Instrumentation

    NASA Technical Reports Server (NTRS)

    Anicich, Vincent; Thorpe, Rob; Fletcher, Greg; Waite, Hunter; Xu, Hykua; Walter, Erin; Frick, Kristie; Farris, Greg; Gell, Dave; Furman, Jufy

    2004-01-01

    ION is a system of ground support software for the ion and neutral mass spectrometer (INMS) instrument aboard the Cassini spacecraft. By incorporating commercial off-the-shelf database, Web server, and Java application components, ION offers considerably more ground-support-service capability than was available previously. A member of the team that operates the INMS or a scientist who uses the data collected by the INMS can gain access to most of the services provided by ION via a standard point-and-click hyperlink interface generated by almost any Web-browser program running in almost any operating system on almost any computer. Data are stored in one central location in a relational database in a non-proprietary format, are accessible in many combinations and formats, and can be combined with data from other instruments and spacecraft. The use of the Java programming language as a system-interface language offers numerous capabilities for object-oriented programming and for making the database accessible to participants using a variety of computer hardware and software.

  2. Sonification Prototype for Space Physics

    NASA Astrophysics Data System (ADS)

    Candey, R. M.; Schertenleib, A. M.; Diaz Merced, W. L.

    2005-12-01

    As an alternative and adjunct to visual displays, auditory exploration of data via sonification (data controlled sound) and audification (audible playback of data samples) is promising for complex or rapidly/temporally changing visualizations, for data exploration of large datasets (particularly multi-dimensional datasets), and for exploring datasets in frequency rather than spatial dimensions (see also International Conferences on Auditory Display). Besides improving data exploration and analysis for most researchers, the use of sound is especially valuable as an assistive technology for visually-impaired people and can make science and math more exciting for high school and college students. Only recently have the hardware and software come together to make a cross-platform open-source sonification tool feasible. We have developed a prototype sonification data analysis tool using the JavaSound API and NASA GSFC's ViSBARD software. Wanda Diaz Merced, a blind astrophysicist from Puerto Rico, is instrumental in advising on and testing the tool.

  3. [Development of multi-channels cardiac electrophysiological polygraph with LabVIEW as software platform and its clinical application].

    PubMed

    Fan, Shounian; Jiang, Yi; Jiang, Chenxi; Yang, Tianhe; Zhang, Chengyun; Liu, Junshi; Wu, Qiang; Zheng, Yaxi; Liu, Xiaoqiao

    2004-10-01

    The polygraph has become a necessary instrument in interventional cardiology and in fundamental medical research. In this study, a LabVIEW development system (DS) (developed by NI in the U.S.) used as the software platform, and a DAQ data acquisition module and a universal computer used as the hardware platform, were creatively coupled with our self-made low-noise multi-channel preamplifier to develop a multi-channel electrocardiograph. The device possessed functions such as real-time display of physiological signals, digital high-pass and low-pass filtering, 50 Hz filtering and gain adjustment, instant storing, random playback and printing, and process-controlled stimulation. Besides, it was small-sized, economically practical and easy to operate. It could advance the spread of interventional cardiac treatment in hospitals.

  4. Ground facility for information reception, processing, dissemination and scientific instruments management setup in the CORONAS-PHOTON space project

    NASA Astrophysics Data System (ADS)

    Buslov, A. S.; Kotov, Yu. D.; Yurov, V. N.; Bessonov, M. V.; Kalmykov, P. A.; Oreshnikov, E. M.; Alimov, A. M.; Tumanov, A. V.; Zhuchkova, E. A.

    2011-06-01

    This paper deals with the organizational structure of ground-based receiving, processing, and dissemination of scientific information created by the Astrophysics Institute of the Scientific Research Nuclear University, Moscow Engineering Physics Institute. The hardware structure and software features are described. The principles are given for forming sets of control commands for scientific equipment (SE) devices, and statistics are presented on the operation of the facility during flight tests of the spacecraft (SC) in the course of one year.

  5. Data acquisition, processing and firing aid software for multichannel EMP simulation

    NASA Astrophysics Data System (ADS)

    Eumurian, Gregoire; Arbaud, Bruno

    1986-08-01

    Electromagnetic compatibility testing yields a large quantity of data for systematic analysis. An automated data acquisition system has been developed. It is based on standard EMP instrumentation which allows a pre-established program to be followed whilst orientating the measurements according to the results obtained. The system is controlled by a computer running interactive programs (multitask windows, scrollable menus, mouse, etc.) which handle the measurement channels, files, displays and process data in addition to providing an aid to firing.

  6. A virtual laboratory for neutron and synchrotron strain scanning

    NASA Astrophysics Data System (ADS)

    James, J. A.; Santisteban, J. R.; Edwards, L.; Daymond, M. R.

    2004-07-01

    The new generation of dedicated Engineering Strain Scanners at neutron and synchrotron facilities offers considerable improvements in both counting time and spatial resolution. In order to make full use of these advances in instrumentation, the routine tasks associated with setting up measurement runs and analysing the data need to be made as efficient as possible. Such tasks include the planning of the experiment, the alignment and positioning of the specimen, the least-squares refinement of diffraction spectra, the definition of strain in the sample coordinate system, and its visualization within a 3D model of the specimen. With this aim in mind, we have written software providing support for most of these operations. The approach is based on a virtual lab consisting of 3D models of the sample and laboratory equipment. The system has been developed for ENGIN-X, the new engineering strain scanner recently commissioned at ISIS, but it is flexible enough to be ported to other neutron or synchrotron strain scanners. The software has been designed with visiting industrial and academic researchers in mind, users who need to be able to control the instrument after only a short period of training.

  7. EuroFlow standardization of flow cytometer instrument settings and immunophenotyping protocols

    PubMed Central

    Kalina, T; Flores-Montero, J; van der Velden, V H J; Martin-Ayuso, M; Böttcher, S; Ritgen, M; Almeida, J; Lhermitte, L; Asnafi, V; Mendonça, A; de Tute, R; Cullen, M; Sedek, L; Vidriales, M B; Pérez, J J; te Marvelde, J G; Mejstrikova, E; Hrusak, O; Szczepański, T; van Dongen, J J M; Orfao, A

    2012-01-01

    The EU-supported EuroFlow Consortium aimed at innovation and standardization of immunophenotyping for diagnosis and classification of hematological malignancies by introducing 8-color flow cytometry with fully standardized laboratory procedures and antibody panels in order to achieve maximally comparable results among different laboratories. This required the selection of optimal combinations of compatible fluorochromes and the design and evaluation of adequate standard operating procedures (SOPs) for instrument setup, fluorescence compensation and sample preparation. Additionally, we developed software tools for the evaluation of individual antibody reagents and antibody panels. Each section describes what has been evaluated experimentally versus adopted based on existing data and experience. Multicentric evaluation demonstrated high levels of reproducibility based on strict implementation of the EuroFlow SOPs and antibody panels. Overall, the 6 years of extensive collaborative experiments and the analysis of hundreds of cell samples of patients and healthy controls in the EuroFlow centers have provided for the first time laboratory protocols and software tools for fully standardized 8-color flow cytometric immunophenotyping of normal and malignant leukocytes in bone marrow and blood; this has yielded highly comparable data sets, which can be integrated in a single database. PMID:22948490

  8. Unit Testing for Command and Control Systems

    NASA Technical Reports Server (NTRS)

    Alexander, Joshua

    2018-01-01

    Unit tests were created to evaluate the functionality of a Data Generation and Publication tool for a command and control system. These unit tests are developed to constantly evaluate the tool and ensure it functions properly as the command and control system grows in size and scope. Unit tests are a crucial part of testing any software project and are especially instrumental in the development of a command and control system. They save the resources, time, and costs associated with testing, and catch issues before they become increasingly difficult and costly to fix. The unit tests produced for the Data Generation and Publication tool to be used in a command and control system assure the users and stakeholders of its functionality and offer assurances that are vital to launching spacecraft safely.
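
    As a hedged illustration of the kind of test described here (the report does not give code), the following Python unittest sketch checks a hypothetical data-generation routine; the function and field names are invented for the example.

      import unittest

      def make_telemetry_packet(channel: str, value: float) -> dict:
          """Toy stand-in for a data-generation routine (hypothetical)."""
          return {"channel": channel, "value": value, "valid": value >= 0}

      class TestDataGeneration(unittest.TestCase):
          def test_packet_fields(self):
              pkt = make_telemetry_packet("tank_pressure", 101.3)
              self.assertEqual(pkt["channel"], "tank_pressure")
              self.assertTrue(pkt["valid"])

          def test_negative_value_is_flagged(self):
              self.assertFalse(make_telemetry_packet("tank_pressure", -1.0)["valid"])

      if __name__ == "__main__":
          unittest.main()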

  9. Mission Control Center (MCC) System Specification for the Shuttle Orbital Flight Test (OFT) Timeframe

    NASA Technical Reports Server (NTRS)

    1976-01-01

    System specifications to be used by the mission control center (MCC) for the shuttle orbital flight test (OFT) time frame were described. The three support systems discussed are the communication interface system (CIS), the data computation complex (DCC), and the display and control system (DCS), all of which may interfere with, and share processing facilities with other applications processing supporting current MCC programs. The MCC shall provide centralized control of the space shuttle OFT from launch through orbital flight, entry, and landing until the Orbiter comes to a stop on the runway. This control shall include the functions of vehicle management in the area of hardware configuration (verification), flight planning, communication and instrumentation configuration management, trajectory, software and consumables, payloads management, flight safety, and verification of test conditions/environment.

  10. Experimenting with string musical instruments

    NASA Astrophysics Data System (ADS)

    LoPresto, Michael C.

    2012-03-01

    What follows are several investigations involving string musical instruments developed for and used in a Science of Sound & Light course. The experiments make use of a guitar, orchestral string instruments and data collection and graphing software. They are designed to provide students with concrete examples of how mathematical formulae, when used in physics, represent reality that can actually be observed, in this case, the operation of string musical instruments.
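
    A short worked example of the kind of formula such experiments make observable is the frequency of a stretched string, f_n = (n/2L)·sqrt(T/μ). The Python snippet below uses illustrative parameter values, roughly those of a guitar high-E string, not values taken from the article.

      from math import sqrt

      L = 0.648      # scale length in metres (typical guitar, assumed)
      T = 72.0       # string tension in newtons (assumed)
      mu = 3.9e-4    # linear mass density in kg/m (assumed)

      for n in (1, 2, 3):
          f_n = n / (2 * L) * sqrt(T / mu)   # frequency of the n-th harmonic
          print(f"harmonic {n}: {f_n:.1f} Hz")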

  11. Porting and redesign of Geotool software system to Qt

    NASA Astrophysics Data System (ADS)

    Miljanovic Tamarit, V.; Carneiro, L.; Henson, I. H.; Tomuta, E.

    2016-12-01

    Geotool is a software system that allows a user to interactively display and process seismoacoustic data from International Monitoring System (IMS) stations. Geotool can be used to perform a number of analysis and review tasks, including data I/O, waveform filtering, quality control, component rotation, amplitude and arrival measurement and review, array beamforming, correlation, Fourier analysis, FK analysis, event review and location, particle motion visualization, polarization analysis, instrument response convolution/deconvolution, real-time display, signal-to-noise measurement, spectrogram, and travel time model display. The Geotool program was originally written in C using the X11/Xt/Motif libraries for graphics. It was later ported to C++. Now the program is being ported to the Qt graphics system to be more compatible with the other software in the International Data Centre (IDC). Along with this port, a redesign of the architecture is underway to achieve a separation between user interface, control, and data model elements, in line with design patterns such as Model-View-Controller. Qt is a cross-platform application framework that will allow Geotool to easily run on Linux, Mac, and Windows. The Qt environment includes modern libraries and user interfaces for standard utilities such as file and database access, printing, and inter-process communications. The Qt Widgets for Technical Applications library (QWT) provides tools for displaying standard data analysis graphics.

  12. Wide-Field Imaging Telescope-0 (WIT0) with automatic observing system

    NASA Astrophysics Data System (ADS)

    Ji, Tae-Geun; Byeon, Seoyeon; Lee, Hye-In; Park, Woojin; Lee, Sang-Yun; Hwang, Sungyong; Choi, Changsu; Gibson, Coyne Andrew; Kuehne, John W.; Prochaska, Travis; Marshall, Jennifer L.; Im, Myungshin; Pak, Soojong

    2018-01-01

    We introduce Wide-Field Imaging Telescope-0 (WIT0), with an automatic observing system. It is developed for monitoring the variability of many sources at a time, e.g. young stellar objects and active galactic nuclei. It can also find the locations of transient sources such as supernovae or gamma-ray bursts. In 2017 February, we installed the wide-field 10-inch telescope (Takahashi CCA-250) as a piggyback system on the 30-inch telescope at the McDonald Observatory in Texas, US. The 10-inch telescope has a 2.35 × 2.35 deg field of view with a 4k × 4k CCD camera (FLI ML16803). To improve the observational efficiency of the system, we developed new automatic observing software, KAOS30 (KHU Automatic Observing Software for the McDonald 30-inch telescope), which was developed in Visual C++ for the Windows operating system. The software consists of four control packages: the Telescope Control Package (TCP), the Data Acquisition Package (DAP), the Auto Focus Package (AFP), and the Script Mode Package (SMP). Since it also supports instruments that use the ASCOM driver, additional hardware installation is greatly simplified. We commissioned KAOS30 in 2017 August and are in the process of testing it. Based on the WIT0 experience, we will extend KAOS30 to control multiple telescopes in future projects.

  13. Telescience - Concepts And Contributions To The Extreme Ultraviolet Explorer Mission

    NASA Astrophysics Data System (ADS)

    Marchant, Will; Dobson, Carl; Chakrabarti, Supriya; Malina, Roger F.

    1987-10-01

    A goal of the telescience concept is to allow scientists to use remotely located instruments as they would in their laboratory. Another goal is to increase the reliability and scientific return of these instruments. In this paper we discuss the role of transparent software tools in development, integration, and postlaunch environments to achieve hands-on access to the instrument. The use of transparent tools helps to reduce the parallel development of capability and to assure that valuable pre-launch experience is not lost in the operations phase. We also discuss the use of simulation as a rapid prototyping technique. Rapid prototyping provides a cost-effective means of using an iterative approach to instrument design. By allowing inexpensive production of testbeds, scientists can quickly tune the instrument to produce the desired scientific data. Using portions of the Extreme Ultraviolet Explorer (EUVE) system, we examine some of the results of preliminary tests in the use of simulation and transparent tools. Additionally, we discuss our efforts to upgrade our software "EUVE electronics" simulator to emulate a full instrument, and give the pros and cons of the simulation facilities we have developed.

  14. Centralized Monitoring of the Microsoft Windows-based computers of the LHC Experiment Control Systems

    NASA Astrophysics Data System (ADS)

    Varela Rodriguez, F.

    2011-12-01

    The control system of each of the four major Experiments at the CERN Large Hadron Collider (LHC) is distributed over up to 160 computers running either Linux or Microsoft Windows. A quick response to abnormal situations of the computer infrastructure is crucial to maximize the physics usage. For this reason, a tool was developed to supervise, identify errors in, and troubleshoot such a large system. Although monitoring of the performance of the Linux computers and their processes has been available since the first versions of the tool, it is only recently that the software package has been extended to provide similar functionality for the nodes running Microsoft Windows, as this platform is the most commonly used in the LHC detector control systems. In this paper, the architecture and the functionality of the Windows Management Instrumentation (WMI) client developed to provide centralized monitoring of the nodes running different flavours of the Microsoft platform, as well as the interface to the SCADA software of the control systems, are presented. The tool is currently being commissioned by the Experiments and it has already proven to be very efficient in optimizing the running systems and in detecting misbehaving processes or nodes.

  15. LabVIEW interface with Tango control system for a multi-technique X-ray spectrometry IAEA beamline end-station at Elettra Sincrotrone Trieste

    NASA Astrophysics Data System (ADS)

    Wrobel, P. M.; Bogovac, M.; Sghaier, H.; Leani, J. J.; Migliori, A.; Padilla-Alvarez, R.; Czyzycki, M.; Osan, J.; Kaiser, R. B.; Karydas, A. G.

    2016-10-01

    A new synchrotron beamline end-station for multipurpose X-ray spectrometry applications has recently been commissioned and is currently accessible to end-users at the XRF beamline of Elettra Sincrotrone Trieste. The end-station consists of an ultra-high-vacuum chamber that includes as its main instrument a seven-axis motorized manipulator for sample and detector positioning, different kinds of X-ray detectors and optical cameras. The beamline end-station allows performing measurements with different X-ray spectrometry techniques such as Microscopic X-Ray Fluorescence analysis (μXRF), Total Reflection X-Ray Fluorescence analysis (TXRF), Grazing Incidence/Exit X-Ray Fluorescence analysis (GI-XRF/GE-XRF), X-Ray Reflectometry (XRR), and X-Ray Absorption Spectroscopy (XAS). A LabVIEW Graphical User Interface (GUI) bound to the Tango control system and consisting of many custom-made software modules is utilized as a user-friendly tool for control of all the end-station hardware components. The present work describes this Tango and LabVIEW software platform, which combines in an optimal, synergistic manner the merits and functionality of these well-established programming and equipment-control tools.
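
    The end-station GUI itself is built in LabVIEW on top of Tango; purely as an illustration of how a Tango client addresses such hardware, the Python sketch below uses the PyTango bindings with placeholder device and attribute names that are not those of the real beamline.

      import time
      import tango   # PyTango bindings

      motor = tango.DeviceProxy("xrf/manipulator/axis_z")   # hypothetical device name
      detector = tango.DeviceProxy("xrf/detector/sdd1")     # hypothetical device name

      motor.write_attribute("Position", 12.5)               # move one manipulator axis
      while motor.state() == tango.DevState.MOVING:
          time.sleep(0.1)                                   # wait until the move completes

      spectrum = detector.read_attribute("Spectrum").value  # hypothetical attribute name
      print(len(spectrum), "channels read")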

  16. Implementation of statistical process control for proteomic experiments via LC MS/MS.

    PubMed

    Bereman, Michael S; Johnson, Richard; Bollinger, James; Boss, Yuval; Shulman, Nick; MacLean, Brendan; Hoofnagle, Andrew N; MacCoss, Michael J

    2014-04-01

    Statistical process control (SPC) is a robust set of tools that aids in the visualization, detection, and identification of assignable causes of variation in any process that creates products, services, or information. A tool has been developed termed Statistical Process Control in Proteomics (SProCoP) which implements aspects of SPC (e.g., control charts and Pareto analysis) into the Skyline proteomics software. It monitors five quality control metrics in a shotgun or targeted proteomic workflow. None of these metrics require peptide identification. The source code, written in the R statistical language, runs directly from the Skyline interface, which supports the use of raw data files from several of the mass spectrometry vendors. It provides real time evaluation of the chromatographic performance (e.g., retention time reproducibility, peak asymmetry, and resolution), and mass spectrometric performance (targeted peptide ion intensity and mass measurement accuracy for high resolving power instruments) via control charts. Thresholds are experiment- and instrument-specific and are determined empirically from user-defined quality control standards that enable the separation of random noise and systematic error. Finally, Pareto analysis provides a summary of performance metrics and guides the user to metrics with high variance. The utility of these charts to evaluate proteomic experiments is illustrated in two case studies.
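
    SProCoP itself is written in R and runs from the Skyline interface; the snippet below is only a language-agnostic Python illustration of the control-chart idea, with limits derived empirically from user-defined QC runs (the retention-time values are invented).

      import numpy as np

      qc_retention_times = np.array([21.52, 21.48, 21.55, 21.50, 21.47, 21.53])  # minutes (invented)
      mean, sd = qc_retention_times.mean(), qc_retention_times.std(ddof=1)
      upper, lower = mean + 3 * sd, mean - 3 * sd        # classic +/- 3 sigma control limits

      new_run = 21.71
      status = "in control" if lower <= new_run <= upper else "out of control"
      print(f"limits: [{lower:.2f}, {upper:.2f}] min -> new run at {new_run:.2f} min is {status}")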

  17. A portable detection instrument based on DSP for beef marbling

    NASA Astrophysics Data System (ADS)

    Zhou, Tong; Peng, Yankun

    2014-05-01

    Beef marbling is one of the most important indices for assessing beef quality. Beef marbling is graded by measuring the fat distribution density in the rib-eye region. However, quality grades of beef in most beef slaughterhouses and businesses depend on trainees using their visual senses or comparing the beef slice to the Chinese standard sample cards. Manual grading not only demands great labor but also lacks objectivity and accuracy. To meet the needs of beef slaughterhouses and businesses, a beef marbling detection instrument was designed. The instrument employs Charge-Coupled Device (CCD) imaging techniques, digital image processing, Digital Signal Processor (DSP) control and processing techniques, and Liquid Crystal Display (LCD) screen display techniques. The TMS320DM642 digital signal processor of Texas Instruments (TI) is the core that combines high-speed data processing capabilities and real-time processing features. All processes such as image acquisition, data transmission, image processing algorithms and display were implemented on this instrument for quick, efficient, and non-invasive detection of beef marbling. The structure of the system, working principle, hardware and software are introduced in detail. The device is compact and easy to transport. The instrument can determine the grade of beef marbling reliably and correctly.
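
    The paper's DSP implementation is not reproduced here; as a rough illustration of the underlying measurement, the Python/OpenCV sketch below estimates a fat-to-tissue area ratio in a rib-eye image, with the file name and grey-level thresholds chosen arbitrarily.

      import cv2
      import numpy as np

      img = cv2.imread("ribeye_slice.png", cv2.IMREAD_GRAYSCALE)      # hypothetical image file
      _, fat = cv2.threshold(img, 180, 255, cv2.THRESH_BINARY)        # bright pixels ~ intramuscular fat
      _, tissue = cv2.threshold(img, 30, 255, cv2.THRESH_BINARY)      # everything brighter than background

      marbling_ratio = np.count_nonzero(fat) / max(np.count_nonzero(tissue), 1)
      print(f"fat fraction of rib-eye area: {marbling_ratio:.2%}")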

  18. Performance of the Magnetospheric Multiscale central instrument data handling

    NASA Astrophysics Data System (ADS)

    Klar, Robert A.; Miller, Scott A.; Brysch, Michael L.; Bertrand, Allison R.

    In order to study the fundamental physical processes of magnetic reconnection, particle acceleration and turbulence, the Magnetospheric Multiscale (MMS) mission employs a constellation of four identically configured observatories, each with a suite of complementary science instruments. Southwest Research Institute® (SwRI®) developed the Central Instrument Data Processor (CIDP) to handle the large data volume associated with these instruments. The CIDP is an integrated access point between the instruments and the spacecraft. It provides synchronization pulses, relays telecommands, and gathers instrument housekeeping telemetry. It collects science data from the instruments and stores it to a mass memory for later playback to a ground station. This paper retrospectively examines the data handling performance realized by the CIDP implementation. It elaborates on some of the constraints on the hardware and software designs and the resulting effects on performance. For the hardware, it discusses the limitations of the front-end electronics input/output (I/O) architecture and associated mass memory buffering. For the software, it discusses the limitations of the Consultative Committee for Space Data Systems (CCSDS) File Delivery Protocol (CFDP) implementation and the data structure choices for file management. It also describes design changes that improve data handling performance in newer designs.

  19. Restoring Redundancy to the MAP Propulsion System

    NASA Technical Reports Server (NTRS)

    O'Donnell, James R., Jr.; Davis, Gary T.; Ward, David K.; Bauer, Frank H. (Technical Monitor)

    2002-01-01

    The Microwave Anisotropy Probe (MAP) is a follow-on to the Differential Microwave Radiometer (DMR) instrument on the Cosmic Background Explorer (COBE). Due to the MAP project's limited mass, power, and financial resources, a traditional reliability concept including fully redundant components was not feasible. The MAP design employs selective hardware redundancy, along with backup software modes and algorithms, to improve the odds of mission success. In particular, MAP's propulsion system, which is used for orbit maneuvers and momentum management, uses eight thrusters positioned and oriented in such a way that its thruster-based attitude control modes can maintain three-axis attitude control in the event of the failure of any one thruster.

  20. Effect of irrigation on surface roughness and fatigue resistance of controlled memory wire nickel-titanium instruments.

    PubMed

    Cai, J-J; Tang, X-N; Ge, J-Y

    2017-07-01

    To investigate the effect of irrigation on the surface roughness and fatigue resistance of HyFlex and M3 controlled memory (CM) wire nickel-titanium instruments. Two new files of each brand were analysed by atomic force microscopy (AFM). Then, the instruments were dynamically immersed in either 5.25% sodium hypochlorite (NaOCl) or 17% ethylene diamine tetraacetic acid (EDTA) solution for 10 min, followed by AFM analysis. The roughness average (Ra) and root mean square (RMS) values were analysed statistically using an independent sample t-test. Then, 36 files of each brand were randomly assigned to three groups (n = 12). Group 1 (the control group) was composed of new instruments. Groups 2 and 3 were dynamically immersed in 5.25% NaOCl and 17% EDTA solutions for 10 min, respectively. The number of rotations to failure for the various groups was analysed using one-way analysis of variance. For M3 files, the Ra and RMS values significantly increased (P < 0.05) after the immersion. For the HyFlex files, the Ra and RMS values significantly increased (P < 0.05) only in EDTA, but not (P > 0.05) in NaOCl. The resistance to cyclic fatigue of both HyFlex and M3 files did not significantly decrease (P > 0.05) after immersion in 5.25% NaOCl and 17% EDTA solutions. Except for the HyFlex files immersed in NaOCl, the surface roughness of the files exposed to irrigants increased. However, the change in the surface topography of CM wire instruments caused by contact with irrigants for 10 min did not trigger a decrease in cyclic fatigue resistance. © 2016 International Endodontic Journal. Published by John Wiley & Sons Ltd.

  1. Using LabVIEW for Telemetry Monitoring and Display

    NASA Technical Reports Server (NTRS)

    Wells, G.; Baroth, E.

    1994-01-01

    Part of the Jet Propulsion Laboratory's (JPL's) Instrumentation Section, the Measurement Technology Center (MTC) evaluates data acquisition hardware and software products for inclusion in the Instrument Loan Pool, which are then made available to JPL experimenters.

  2. Exploiting the potential of free software to evaluate root canal biomechanical preparation outcomes through micro-CT images.

    PubMed

    Neves, A A; Silva, E J; Roter, J M; Belladona, F G; Alves, H D; Lopes, R T; Paciornik, S; De-Deus, G A

    2015-11-01

    To propose an automated image processing routine based on free software to quantify root canal preparation outcomes in pairs of sound and instrumented roots after micro-CT scanning procedures. Seven mesial roots of human mandibular molars with different canal configuration systems were studied: (i) Vertucci's type 1, (ii) Vertucci's type 2, (iii) two individual canals, (iv) Vertucci's type 6, canals (v) with and (vi) without debris, and (vii) canal with visible pulp calcification. All teeth were instrumented with the BioRaCe system and scanned in a Skyscan 1173 micro-CT before and after canal preparation. After reconstruction, the instrumented stack of images (IS) was registered against the preoperative sound stack of images (SS). Image processing included contrast equalization and noise filtering. Sound canal volumes were obtained by a minimum threshold. For the IS, a fixed conservative threshold was chosen as the best compromise between instrumented canal and dentine whilst avoiding debris, resulting in instrumented canal plus empty spaces. Arithmetic and logical operations between sound and instrumented stacks were used to identify debris. Noninstrumented dentine was calculated using a minimum threshold in the IS and subtracting from the SS and total debris. Removed dentine volume was obtained by subtracting SS from IS. Quantitative data on total debris present in the root canal space after instrumentation, noninstrumented areas and removed dentine volume were obtained for each test case, as well as three-dimensional volume renderings. After standardization of acquisition, reconstruction and image processing of the micro-CT images, a quantitative approach for calculation of root canal biomechanical outcomes was achieved using free software. © 2014 International Endodontic Journal. Published by John Wiley & Sons Ltd.
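
    The stack arithmetic described above maps naturally onto boolean volume operations. The following Python/numpy sketch illustrates that logic rather than reproducing the authors' routine; the file names, threshold values, and voxel size are assumptions.

      import numpy as np

      sound = np.load("sound_stack.npy")            # registered pre-operative volume (grey values)
      instrumented = np.load("instr_stack.npy")     # post-preparation volume, same geometry

      canal_sound = sound < 60                      # minimum threshold -> sound canal space
      canal_instr = instrumented < 45               # conservative threshold -> instrumented canal + empty space

      debris = canal_sound & ~canal_instr           # was open canal, now filled -> debris
      removed_dentine = canal_instr & ~canal_sound  # newly open space -> dentine removed

      voxel_mm3 = 0.014 ** 3                        # assumed isotropic voxel size in mm
      print("debris volume:", debris.sum() * voxel_mm3, "mm^3")
      print("removed dentine volume:", removed_dentine.sum() * voxel_mm3, "mm^3")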

  3. Simultaneous operation and control of about 100 telescopes for the Cherenkov Telescope Array

    NASA Astrophysics Data System (ADS)

    Wegner, P.; Colomé, J.; Hoffmann, D.; Houles, J.; Köppel, H.; Lamanna, G.; Le Flour, T.; Lopatin, A.; Lyard, E.; Melkumyan, D.; Oya, I.; Panazol, L.-I.; Punch, M.; Schlenstedt, S.; Schmidt, T.; Stegmann, C.; Schwanke, U.; Walter, R.; Consortium, CTA

    2012-12-01

    The Cherenkov Telescope Array (CTA) project is an initiative to build the next generation ground-based very high energy (VHE) gamma-ray instrument. Compared to current imaging atmospheric Cherenkov telescope experiments CTA will extend the energy range and improve the angular resolution while increasing the sensitivity up to a factor of 10. With about 100 separate telescopes it will be operated as an observatory open to a wide astrophysics and particle physics community, providing a deep insight into the non-thermal high-energy universe. The CTA Array Control system (ACTL) is responsible for several essential control tasks supporting the evaluation and selection of proposals, as well as the preparation, scheduling, and finally the execution of observations with the array. A possible basic distributed software framework for ACTL being considered is the ALMA Common Software (ACS). The ACS framework follows a container component model and contains a high level abstraction layer to integrate different types of device. To achieve a low-level consolidation of connecting control hardware, OPC UA (OPen Connectivity-Unified Architecture) client functionality is integrated directly into ACS, thus allowing interaction with other OPC UA capable hardware. The CTA Data Acquisition System comprises the data readout of all cameras and the transfer of the data to a camera server farm, thereby using standard hardware and software technologies. CTA array control is also covering conceptions for a possible array trigger system and the corresponding clock distribution. The design of the CTA observations scheduler is introducing new algorithmic technologies to achieve the required flexibility.

  4. Autonomous Instrument Placement for Mars Exploration Rovers

    NASA Technical Reports Server (NTRS)

    Leger, P. Chris; Maimone, Mark

    2009-01-01

    Autonomous Instrument Placement (AutoPlace) is onboard software that enables a Mars Exploration Rover to act autonomously in using its manipulator to place scientific instruments on or near designated rock and soil targets. Prior to the development of AutoPlace, it was necessary for human operators on Earth to plan every motion of the manipulator arm in a time-consuming process that included downlinking of images from the rover, analysis of images and creation of commands, and uplinking of commands to the rover. AutoPlace incorporates image analysis and planning algorithms into the onboard rover software, eliminating the need for the downlink/uplink command cycle. Many of these algorithms are derived from the existing groundbased image analysis and planning algorithms, with modifications and augmentations for onboard use.

  5. A packet switched communications system for GRO

    NASA Astrophysics Data System (ADS)

    Husain, Shabu; Yang, Wen-Hsing; Vadlamudi, Rani; Valenti, Joseph

    1993-11-01

    This paper describes the packet switched Instrumenters Communication System (ICS) that was developed for the Command Management Facility at GSFC to support the Gamma Ray Observatory (GRO) spacecraft. The GRO ICS serves as a vital science data acquisition link to the GRO scientists to initiate commands for their spacecraft instruments. The system is ready to send and receive messages at any time, 24 hours a day and seven days a week. The system is based on X.25 and the International Standard Organization's (ISO) 7-layer Open Systems Interconnection (OSI) protocol model and has client and server components. The components of the GRO ICS are discussed along with how the Communications Subsystem for Interconnection (CSFI) and Network Control Program Packet Switching Interface (NPSI) software are used in the system.

  6. A new gated x-ray detector for the Orion laser facility

    NASA Astrophysics Data System (ADS)

    Clark, David D.; Aragonez, Robert; Archuleta, Thomas; Fatherley, Valerie; Hsu, Albert; Jorgenson, Justin; Mares, Danielle; Oertel, John; Oades, Kevin; Kemshall, Paul; Thomas, Phillip; Young, Trevor; Pederson, Neal

    2012-10-01

    Gated X-Ray Detectors (GXD) are considered the work-horse target diagnostic of the laser based inertial confinement fusion (ICF) program. Recently, Los Alamos National Laboratory (LANL) has constructed three new GXDs for the Orion laser facility at the Atomic Weapons Establishment (AWE) in the United Kingdom. What sets these three new instruments apart from what has previously been constructed for the National Ignition Facility (NIF) at Lawrence Livermore National Laboratory (LLNL) is: improvements in detector head microwave transmission lines, solid state embedded hard drive and updated control software, and lighter air box design and other incremental mechanical improvements. In this paper we will present the latest GXD design enhancements and sample calibration data taken on the Trident laser facility at Los Alamos National Laboratory using the newly constructed instruments.

  7. Analysis of XMM-Newton Data from Extended Sources and the Diffuse X-Ray Background

    NASA Technical Reports Server (NTRS)

    Snowden, Steven

    2011-01-01

    Reduction of X-ray data from extended objects and the diffuse background is a complicated process that requires attention to the details of the instrumental response as well as an understanding of the multiple background components. We present methods and software that we have developed to reduce data from XMM-Newton EPIC imaging observations for both the MOS and PN instruments. The software has now been included in the Science Analysis System (SAS) package available through the XMM-Newton Science Operations Center (SOC).

  8. eddy4R 0.2.0: a DevOps model for community-extensible processing and analysis of eddy-covariance data based on R, Git, Docker, and HDF5

    NASA Astrophysics Data System (ADS)

    Metzger, Stefan; Durden, David; Sturtevant, Cove; Luo, Hongyan; Pingintha-Durden, Natchaya; Sachs, Torsten; Serafimovich, Andrei; Hartmann, Jörg; Li, Jiahong; Xu, Ke; Desai, Ankur R.

    2017-08-01

    Large differences in instrumentation, site setup, data format, and operating system stymie the adoption of a universal computational environment for processing and analyzing eddy-covariance (EC) data. This results in limited software applicability and extensibility in addition to often substantial inconsistencies in flux estimates. Addressing these concerns, this paper presents the systematic development of portable, reproducible, and extensible EC software achieved by adopting a development and systems operation (DevOps) approach. This software development model is used for the creation of the eddy4R family of EC code packages in the open-source R language for statistical computing. These packages are community developed, iterated via the Git distributed version control system, and wrapped into a portable and reproducible Docker filesystem that is independent of the underlying host operating system. The HDF5 hierarchical data format then provides a streamlined mechanism for highly compressed and fully self-documented data ingest and output. The usefulness of the DevOps approach was evaluated for three test applications. First, the resultant EC processing software was used to analyze standard flux tower data from the first EC instruments installed at a National Ecological Observatory Network (NEON) field site. Second, through an aircraft test application, we demonstrate the modular extensibility of eddy4R to analyze EC data from other platforms. Third, an intercomparison with commercial-grade software showed excellent agreement (R² = 1.0 for CO2 flux). In conjunction with this study, a Docker image containing the first two eddy4R packages and an executable example workflow, as well as first NEON EC data products are released publicly. We conclude by describing the work remaining to arrive at the automated generation of science-grade EC fluxes and benefits to the science community at large. This software development model is applicable beyond EC and more generally builds the capacity to deploy complex algorithms developed by scientists in an efficient and scalable manner. In addition, modularity permits meeting project milestones while retaining extensibility with time.
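
    eddy4R itself is an R code family distributed in Docker images; the small Python/h5py sketch below only illustrates the self-documented HDF5 idea of shipping data and metadata together in one file, with invented dataset and attribute names.

      import h5py
      import numpy as np

      with h5py.File("ec_site_2017-06-01.h5", "w") as f:         # invented file name
          dset = f.create_dataset("co2_flux", data=np.zeros(48), compression="gzip")
          dset.attrs["units"] = "umol m-2 s-1"                   # metadata rides with the data
          dset.attrs["averaging_period_min"] = 30
          f.attrs["site_id"] = "EXAMPLE-SITE"                    # placeholder identifier
          f.attrs["processing_software"] = "eddy4R 0.2.0"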

  9. Systems development of a stall/spin research facility using remotely controlled/augmented aircraft models. Volume 1: Systems overview

    NASA Technical Reports Server (NTRS)

    Montoya, R. J.; Jai, A. R.; Parker, C. D.

    1979-01-01

    A ground based, general purpose, real time, digital control system simulator (CSS) is specified, developed, and integrated with the existing instrumentation van of the testing facility. This CSS is built around a PDP-11/55, and its operational software was developed to meet the dual goal of providing the immediate capability to represent the F-18 drop model control laws and the flexibility for expansion to represent more complex control laws typical of control configured vehicles. Overviews are given of the two CSSs developed and of the overall system after their integration with the existing facility. The latest version of the F-18 drop model control laws (REV D) is also described, and the changes needed for its incorporation in the digital and analog CSSs are discussed.

  10. ELITE-3 active vibration isolation workstation

    NASA Astrophysics Data System (ADS)

    Anderson, Eric H.; Houghton, Bowie

    2001-06-01

    This paper describes the development and capabilities of ELITE-3, a product that incorporates piezoelectric actuators to provide ultrastable work surfaces for very high resolution wafer production, metrology, microscopy, and other applications. The electromechanical, electronic, and software/firmware parts of the ELITE-3 active workstation are described, with an emphasis on considerations relating to the piezoelectric transducers. Performance of the system and its relation to the smart materials is discussed. As the floor beneath a vibration-sensitive instrument supported by ELITE-3 moves, the piezoelectrics are controlled to minimize the motion of the instrument. A digital signal processor (DSP) determines the appropriate signals to apply to the actuators. A PC-based interface allows reprogramming of control algorithms and resetting of other parameters within the firmware. The modular product allows incorporation of vibration isolator, actuator and sensor modules into original equipment manufacturer (OEM) products. Alternatively, a workstation can be delivered as an integrated standalone system. The paper describes the system architecture, the overall approach to vibration isolation, and the various system components, and summarizes the motivations for key design approaches.

  11. Development of An Autonomous Underwater Glider for Observing Physical Ocean Parameters in Indonesian Seas

    NASA Astrophysics Data System (ADS)

    Ajie Linarka, Utoyo; Riyanto Trilaksono, Bambang; Sagala, M. Faisal; Hidayat, Egi; Sopaheluwakan, Ardhasena; Rizal, Jose; Heriyanto, Eko; Amsal Harapan, Ferdika; Eka Syahputra Makmur, Erwin

    2017-04-01

    Conducting sustained monitoring and surveying of physical ocean parameters for research or operational purposes using moorings and ships requires high cost. Development of an inexpensive instrument capable of performing such tasks could not only reduce cost and risk but also increase cruising range and depth. For that reason, a prototype underwater glider, named "GaneshBlue", was developed. GaneshBlue works on gliding principles, using pitch-angle and buoyancy control for propulsion. In one gliding cycle, GaneshBlue passes through five phases: surface, descent, transition, ascent, and return to surface. The glider is equipped with a basic navigation system and remote control, programmable survey planning, temperature and salinity sampling instruments, lithium batteries for power supply, and information processing software. A field test in shallow water showed that GaneshBlue successfully demonstrated gliding and surfacing movements, with surge speeds reaching 20 cm s^-1 and depths of 20 m. During the field test the glider was also equipped with three instruments: an Inertial Measurement Unit (IMU) to estimate the glider's speed and orientation; a MiniCT to acquire temperature and conductivity data; and an Altisounder to determine its distance to the sea surface and to the seabed. In general, all the instruments performed well, but a filtering algorithm needs to be implemented in the data collection procedure to remove outliers.
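
    The outlier filtering called for at the end of the abstract could, for example, take the form of a Hampel (median/MAD) filter over the raw CTD samples; the window length and rejection threshold below are illustrative assumptions, not values from the study.

      import numpy as np

      def hampel(samples, window=5, n_sigmas=3.0):
          """Replace samples that deviate strongly from the local median (sketch)."""
          x = np.asarray(samples, dtype=float)
          y = x.copy()
          k = 1.4826  # scales the median absolute deviation to a standard deviation
          for i in range(len(x)):
              lo, hi = max(0, i - window), min(len(x), i + window + 1)
              med = np.median(x[lo:hi])
              mad = k * np.median(np.abs(x[lo:hi] - med))
              if mad > 0 and abs(x[i] - med) > n_sigmas * mad:
                  y[i] = med
          return y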

  12. ICESat (GLAS) Science Processing Software Document Series. Volume 3; GLAS Science Software Requirements Document; Ver 2.1

    NASA Technical Reports Server (NTRS)

    Jester, Peggy L.; Lee, Jeffrey; Zukor, Dorothy J. (Technical Monitor)

    2001-01-01

    This document addresses the software requirements of the Geoscience Laser Altimeter System (GLAS) Standard Data Software (SDS) supporting the GLAS instrument on the EOS ICESat Spacecraft. This Software Requirements Document represents the initial collection of the technical engineering information for the GLAS SDS. This information is detailed within the second of four main volumes of the Standard documentation, the Product Specification volume. This document is a "roll-out" from the governing volume outline containing the Concept and Requirements sections.

  13. MAX-DOAS observations from ground, ship, and research aircraft: maximizing signal-to-noise to measure 'weak' absorbers

    NASA Astrophysics Data System (ADS)

    Volkamer, Rainer; Coburn, Sean; Dix, Barbara; Sinreich, Roman

    2009-08-01

    Multi AXis Differential Optical Absorption Spectroscopy (MAX-DOAS) instruments, like solar straylight satellites, require an accurate characterization and elimination of Fraunhofer lines from solar straylight spectra in order to measure the atmospheric column abundance of reactive gases that destroy toxic and heat-trapping ozone and form climate-cooling aerosols, such as glyoxal (CHOCHO), iodine oxide (IO), or bromine oxide (BrO). The noise levels currently achievable with state-of-the-art DOAS instruments are limited to δ'DL ~ 10^-4 (noise-equivalent differential optical density, δ'); further noise reductions are typically not straightforward, and the reason for this barrier is not well understood. Here we demonstrate that the nonlinearity of state-of-the-art CCD detectors poses a limitation to accurately characterizing Fraunhofer lines; the incomplete elimination of Fraunhofer lines is found to cause residual structures of δ' ~ 10^-4 that are only partially accounted for by fitting an "offset" spectrum. We have developed a novel software tool, the CU Data Acquisition Code, that overcomes this barrier by actively controlling the CCD saturation level, and demonstrate that δ'DL values on the order of 10^-5 are possible without apparent limitations from the presence of Fraunhofer lines. The software also implements active control of the elevation angle (the angle with respect to the horizon) by means of a Motion Compensation System for use with mobile MAX-DOAS deployments from ships and aircraft. Finally, a novel approach to convert slant column densities into line-of-sight averaged concentrations is discussed.
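
    The active saturation control mentioned above can be pictured as a simple exposure-time servo that keeps the brightest CCD pixel near a target fraction of full well. The full-well value, target fraction, and time limits below are assumptions for illustration and are not taken from the CU Data Acquisition Code.

      FULL_WELL = 65535   # assumed 16-bit ADC full scale
      TARGET = 0.7        # assumed target saturation fraction

      def next_integration_time(spectrum_counts, t_current, t_min=0.01, t_max=10.0):
          """Scale the integration time so the peak count approaches TARGET*FULL_WELL."""
          peak = max(spectrum_counts)
          if peak <= 0:
              return t_max
          t_new = t_current * (TARGET * FULL_WELL) / peak
          return min(t_max, max(t_min, t_new))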

  14. EOS MLS Level 1B Data Processing Software. Version 3

    NASA Technical Reports Server (NTRS)

    Perun, Vincent S.; Jarnot, Robert F.; Wagner, Paul A.; Cofield, Richard E., IV; Nguyen, Honghanh T.; Vuu, Christina

    2011-01-01

    This software is an improvement on Version 2, which was described in EOS MLS Level 1B Data Processing, Version 2.2, NASA Tech Briefs, Vol. 33, No. 5 (May 2009), p. 34. It accepts the EOS MLS Level 0 science/engineering data and the EOS Aura spacecraft ephemeris/attitude data, and produces calibrated instrument radiances and associated engineering and diagnostic data. This version makes the code more robust, improves calibration, provides more diagnostic outputs, defines the Galactic core more finely, and fixes the equator crossing. The Level 1 processing software manages several different tasks. It qualifies each data quantity using instrument configuration and checksum data, as well as data transmission quality flags. Statistical tests are applied for data quality and reasonableness. The instrument engineering data (e.g., voltages, currents, temperatures, and encoder angles) are calibrated by the software, and the filter channel space reference measurements are interpolated onto the times of each limb measurement, with the interpolated values then differenced from the measurements. Filter channel calibration target measurements are interpolated onto the times of each limb measurement and are used to compute radiometric gain. The total signal power is determined and analyzed for each digital autocorrelator spectrometer (DACS) during each data integration. The software converts each DACS data integration from an autocorrelation measurement in the time domain into a spectral measurement in the frequency domain, and separately estimates the spectrally smoothly varying and spectrally averaged components of the limb port signal arising from antenna emission and scattering effects. Limb radiances are also calibrated.
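
    A hedged sketch of the two-point calibration step described above: space reference and calibration target counts are interpolated onto the limb measurement times, a gain is formed, and limb counts are converted to radiance. The variable names, the single target temperature, and the DACS spectrum helper are illustrative assumptions, not the MLS algorithm.

      import numpy as np

      def calibrate_limb(t_limb, c_limb, t_space, c_space, t_target, c_target, T_target):
          """Two-point radiometric calibration sketch (counts -> brightness temperature)."""
          c_space_i  = np.interp(t_limb, t_space, c_space)     # space reference at limb times
          c_target_i = np.interp(t_limb, t_target, c_target)   # calibration target at limb times
          gain = (c_target_i - c_space_i) / T_target           # counts per kelvin
          return (c_limb - c_space_i) / gain                   # calibrated limb radiance

      def dacs_spectrum(autocorrelation):
          """Convert a DACS autocorrelation (time domain) to a power spectrum (sketch)."""
          return np.fft.rfft(autocorrelation).real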

  15. Experiment Automation with a Robot Arm using the Liquids Reflectometer Instrument at the Spallation Neutron Source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zolnierczuk, Piotr A; Vacaliuc, Bogdan; Sundaram, Madhan

    The Liquids Reflectometer instrument installed at the Spallation Neutron Source (SNS) enables observations of chemical kinetics, solid-state reactions, and phase transitions of thin-film materials at both solid and liquid surfaces. Effective measurement of these behaviors requires each sample to be calibrated dynamically using the neutron beam and the data acquisition system in a feedback loop. Since the SNS is an intense neutron source, the time needed to perform the measurement can be comparable to that of the alignment process, leading to a labor-intensive operation that is exhausting to users. An update to the instrument control system, completed in March 2013, implemented the key features of automated sample alignment and robot-driven sample management, allowing for unattended operation over extended periods lasting as long as 20 hours. We present a case study of the effort, detailing the mechanical, electrical, and software modifications that were made as well as the lessons learned during the integration, verification, and testing process.
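
    The alignment feedback described above amounts to scanning a sample axis through the beam and settling on the position of maximum reflected intensity. In the sketch below, move_to() and count() stand in for the real motor and data acquisition interfaces and are hypothetical, as are the scan limits.

      def align_sample(move_to, count, z_start, z_stop, n_steps=21, count_time=1.0):
          """Scan the sample height, then move to the position with the most counts (sketch)."""
          best_z, best_counts = z_start, -1.0
          step = (z_stop - z_start) / (n_steps - 1)
          for i in range(n_steps):
              z = z_start + i * step
              move_to(z)                 # hypothetical motor interface
              c = count(count_time)      # hypothetical DAQ counting interface
              if c > best_counts:
                  best_z, best_counts = z, c
          move_to(best_z)
          return best_z, best_counts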

  16. Modeling, simulation and control for a cryogenic fluid management facility, preliminary report

    NASA Technical Reports Server (NTRS)

    Turner, Max A.; Vanbuskirk, P. D.

    1986-01-01

    The synthesis of a control system for a cryogenic fluid management facility was studied. The severe demand for reliability, as well as instrumentation and control unique to the Space Station environment, are prime considerations. Recognizing that an effective control system depends heavily on a quantitative description of the facility dynamics, a methodology for process identification and parameter estimation is postulated. A block diagram of the associated control system is also produced. Finally, an on-line adaptive control strategy is developed, utilizing optimization of the velocity-form control parameters (proportional gains, integration and derivative time constants) in appropriate difference equations for direct digital control. Of special concern are the communications, software, and hardware supporting interaction between the ground and orbital systems. It is envisioned that specialists in OSI/ISO, utilizing the Ada programming language, will influence further development, testing, and validation of the simplified models presented here for adaptation to the actual flight environment.
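
    A brief sketch of the velocity-form PID difference equation referred to above, in which the controller emits an increment to the actuator command at each sample rather than an absolute output. The gains and sample time are placeholders, not values from the study.

      def make_velocity_pid(kc, tau_i, tau_d, dt):
          """Return a step(error) function implementing a velocity-form PID (sketch)."""
          e1 = e2 = 0.0  # errors at the previous two samples
          def step(error):
              nonlocal e1, e2
              du = kc * ((error - e1)
                         + (dt / tau_i) * error
                         + (tau_d / dt) * (error - 2.0 * e1 + e2))
              e2, e1 = e1, error
              return du  # increment to add to the current actuator command
          return step

      # Illustrative use: pid = make_velocity_pid(kc=2.0, tau_i=30.0, tau_d=5.0, dt=1.0)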

  17. ESO Reflex: A Graphical Workflow Engine for Data Reduction

    NASA Astrophysics Data System (ADS)

    Hook, R.; Romaniello, M.; Péron, M.; Ballester, P.; Gabasch, A.; Izzo, C.; Ullgrén, M.; Maisala, S.; Oittinen, T.; Solin, O.; Savolainen, V.; Järveläinen, P.; Tyynelä, J.

    2008-08-01

    Sampo {http://www.eso.org/sampo} (Hook et al. 2005) is a project led by ESO and conducted by a software development team from Finland as an in-kind contribution to joining ESO. The goal is to assess the needs of the ESO community in the area of data reduction environments and to create pilot software products that illustrate critical steps along the road to a new system. Those prototypes will not only be used to validate concepts and understand requirements but will also be tools of immediate value for the community. Most of the raw data produced by ESO instruments can be reduced using CPL {http://www.eso.org/cpl} recipes: compiled C programs following an ESO standard and utilizing routines provided by the Common Pipeline Library. Currently, reduction recipes are run in batch mode as part of the data flow system to generate the input to the ESO VLT/VLTI quality control process and are also made public for external users. Sampo has developed a prototype application called ESO Reflex {http://www.eso.org/sampo/reflex/} that integrates a graphical user interface and existing data reduction algorithms. ESO Reflex can invoke CPL-based recipes in a flexible way through a dedicated interface. ESO Reflex is based on the graphical workflow engine Taverna {http://taverna.sourceforge.net}, which was originally developed by the UK eScience community, mostly for work in the life sciences. Workflows have been created so far for three VLT/VLTI instrument modes (VIMOS/IFU {http://www.eso.org/instruments/vimos/}, FORS spectroscopy {http://www.eso.org/instruments/fors/}, and AMBER {http://www.eso.org/instruments/amber/}), and the easy-to-use GUI allows the user to modify these or create workflows of their own. Python scripts and IDL procedures can easily be brought into workflows, and a variety of visualisation and display options, including custom product inspection and validation steps, are available.
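
    Since the abstract notes that Python scripts can be brought into workflows and that CPL recipes are invoked through a dedicated interface, a minimal workflow-step sketch is shown below. Shelling out to the esorex recipe launcher, and the recipe and set-of-frames file names, are assumptions here rather than details from the paper.

      import subprocess

      def run_recipe(recipe_name, sof_path):
          """Run a CPL recipe on a set-of-frames (SOF) file and return its exit code (sketch)."""
          # The esorex launcher and its positional arguments are assumed, not quoted from the paper.
          result = subprocess.run(["esorex", recipe_name, sof_path])
          return result.returncode

      # Hypothetical usage inside a workflow step:
      # run_recipe("fors_bias", "bias_frames.sof")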

  18. A rule-based expert system for generating control displays at the Advanced Photon Source

    NASA Astrophysics Data System (ADS)

    Coulter, Karen J.

    1994-12-01

    The integration of a rule-based expert system for generating screen displays for controlling and monitoring instrumentation under the Experimental Physics and Industrial Control System (EPICS) is presented. The expert system is implemented using CLIPS, an expert system shell from the Software Technology Branch at Lyndon B. Johnson Space Center. The user selects the hardware input and output to be displayed and the expert system constructs a graphical control screen appropriate for the data. Such a system provides a method for implementing a common look and feel for displays created by several different users and reduces the amount of time required to create displays for new hardware configurations. Users are able to modify the displays as needed using the EPICS display editor tool.
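
    The core idea (rules mapping the selected hardware input/output to appropriate display widgets, so screens built by different users share a common look and feel) can be hinted at with a short sketch. The record types and widget names below are placeholders and do not reproduce the CLIPS rule base described in the paper.

      # Toy rule table: hardware record type -> display widget.
      RULES = {
          "ai": "numeric_readback",   # analog input
          "ao": "setpoint_slider",    # analog output
          "bi": "status_led",         # binary input
          "bo": "toggle_button",      # binary output
      }

      def widget_for(record_type):
          """Pick a widget for a record type, with a safe default (sketch)."""
          return RULES.get(record_type, "numeric_readback")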

  19. Application of parallelized software architecture to an autonomous ground vehicle

    NASA Astrophysics Data System (ADS)

    Shakya, Rahul; Wright, Adam; Shin, Young Ho; Momin, Orko; Petkovsek, Steven; Wortman, Paul; Gautam, Prasanna; Norton, Adam

    2011-01-01

    This paper presents improvements made to Q, an autonomous ground vehicle designed to participate in the Intelligent Ground Vehicle Competition (IGVC). For the 2010 IGVC, Q was upgraded with a new parallelized software architecture and a new vision processor. Improvements were made to the power system, reducing the number of batteries required for operation from six to one. In previous years, a single state machine was used to execute the bulk of processing activities, including sensor interfacing, data processing, path planning, navigation algorithms, and motor control. This inefficient approach led to poor software performance and made the system difficult to maintain or modify. For IGVC 2010, the team implemented a modular parallel architecture using the National Instruments (NI) LabVIEW programming language. The new architecture divides all the necessary tasks (motor control, navigation, sensor data collection, etc.) into well-organized components that execute in parallel, providing considerable flexibility and facilitating efficient use of processing power. Computer vision is used to detect white lines on the ground and determine their location relative to the robot. With the new vision processor and some optimization of the image processing algorithm used the previous year, two frames can be acquired and processed in 70 ms. With all these improvements, Q placed 2nd in the autonomous challenge.
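
    The parallel, component-based decomposition described above was built in LabVIEW; the Python sketch below only mirrors its structure, with a sensor task publishing measurements on a queue while a navigation task consumes them in a separate thread. All names and values are illustrative.

      import queue
      import threading
      import time

      measurements = queue.Queue()

      def sensor_task(stop):
          """Producer: publish (fake) sensor readings at a fixed rate."""
          while not stop.is_set():
              measurements.put({"t": time.time(), "range_m": 1.5})
              time.sleep(0.05)

      def navigation_task(stop):
          """Consumer: pull readings; path planning would use them here."""
          while not stop.is_set():
              try:
                  measurements.get(timeout=0.1)
              except queue.Empty:
                  continue

      stop = threading.Event()
      workers = [threading.Thread(target=fn, args=(stop,)) for fn in (sensor_task, navigation_task)]
      for w in workers:
          w.start()
      time.sleep(0.5)
      stop.set()
      for w in workers:
          w.join()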

  20. Virtual Character Animation Based on Affordable Motion Capture and Reconfigurable Tangible Interfaces.

    PubMed

    Lamberti, Fabrizio; Paravati, Gianluca; Gatteschi, Valentina; Cannavo, Alberto; Montuschi, Paolo

    2018-05-01

    Software for computer animation is generally characterized by a steep learning curve, due to the entanglement of both sophisticated techniques and the interaction methods required to control 3D geometries. This paper proposes a tool designed to support computer animation production processes by leveraging the affordances offered by articulated tangible user interfaces and motion capture retargeting solutions. To this aim, orientations of an instrumented prop are recorded together with the animator's motion in 3D space and used to quickly pose characters in the virtual environment. High-level functionalities of the animation software are made accessible via a speech interface, thus letting the user control the animation pipeline via voice commands while focusing on his or her hand and body motion. The proposed solution exploits both off-the-shelf hardware components (like the Lego Mindstorms EV3 bricks and the Microsoft Kinect, used for building the tangible device and tracking the animator's skeleton) and free open-source software (like the Blender animation tool), thus also representing an interesting solution for beginners approaching the world of digital animation for the first time. Experimental results in different usage scenarios show the benefits offered by the designed interaction strategy compared with a mouse-and-keyboard interface, for both expert and non-expert users.
