Benefits of Software GPS Receivers for Enhanced Signal Processing
2000-01-01
Published in GPS Solutions 4(1), Summer 2000, pages 56-66. Alison Brown... Diego, CA 92110-3127. Number of pages: 24; number of figures: 20. Abstract: In this paper the architecture of a software GPS receiver is described... and an analysis is included of the performance of a software GPS receiver when tracking the GPS signals in challenging environments. Results are ...
Frequency Estimator Performance for a Software-Based Beacon Receiver
NASA Technical Reports Server (NTRS)
Zemba, Michael J.; Morse, Jacquelynne Rose; Nessel, James A.; Miranda, Felix
2014-01-01
As propagation terminals have evolved, their design has trended more toward a software-based approach that facilitates convenient adjustment and customization of the receiver algorithms. One potential improvement is the implementation of a frequency estimation algorithm, through which the primary frequency component of the received signal can be estimated with a much greater resolution than with a simple peak search of the FFT spectrum. To select an estimator for usage in a Q/V-band beacon receiver, analysis of six frequency estimators was conducted to characterize their effectiveness as they relate to beacon receiver design.
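To illustrate the kind of resolution gain such estimators offer over a plain FFT peak search, here is a minimal sketch (Python/NumPy, hypothetical names, not code from the paper) that refines the peak bin by parabolic interpolation of the log-magnitude spectrum:

```python
import numpy as np

def estimate_frequency(x, fs):
    """Estimate the dominant tone frequency by refining the FFT peak
    with parabolic interpolation on the log-magnitude spectrum."""
    n = len(x)
    spectrum = np.abs(np.fft.rfft(x * np.hanning(n)))
    k = int(np.argmax(spectrum))                 # coarse peak bin
    if 0 < k < len(spectrum) - 1:
        a, b, c = np.log(spectrum[k - 1:k + 2])  # peak and its neighbours
        delta = 0.5 * (a - c) / (a - 2 * b + c)  # fractional bin offset
    else:
        delta = 0.0
    return (k + delta) * fs / n

# Example: a 1234.56 Hz tone sampled at 8 kHz, recovered to sub-bin resolution
fs, f0 = 8000.0, 1234.56
t = np.arange(4096) / fs
print(estimate_frequency(np.cos(2 * np.pi * f0 * t), fs))
```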
Nieminen, Teemu; Lähteenmäki, Pasi; Tan, Zhenbing; Cox, Daniel; Hakonen, Pertti J
2016-11-01
We present a microwave correlation measurement system based on two low-cost USB-connected software defined radio dongles modified to operate as coherent receivers by using a common local oscillator. Existing software is used to obtain I/Q samples from both dongles simultaneously at a software tunable frequency. To achieve low noise, we introduce an easy low-noise solution for cryogenic amplification at 600-900 MHz based on single discrete HEMT with 21 dB gain and 7 K noise temperature. In addition, we discuss the quantization effects in a digital correlation measurement and determination of optimal integration time by applying Allan deviation analysis.
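As a rough illustration of how Allan deviation can guide the choice of integration time in such a correlation measurement, the following sketch assumes a non-overlapping estimator on an evenly sampled power series (illustrative only, not the authors' code):

```python
import numpy as np

def allan_deviation(y, taus, dt):
    """Non-overlapping Allan deviation of a measurement series y sampled
    every dt seconds, evaluated at the averaging times in taus."""
    out = []
    for tau in taus:
        m = int(round(tau / dt))               # samples per averaging bin
        if m < 1 or 2 * m > len(y):
            out.append(np.nan)
            continue
        n_bins = len(y) // m
        means = y[:n_bins * m].reshape(n_bins, m).mean(axis=1)
        out.append(np.sqrt(0.5 * np.mean(np.diff(means) ** 2)))
    return np.array(out)

# e.g., for a 1 kHz-sampled detector power record:
# adev = allan_deviation(power_samples, np.logspace(-3, 1, 30), dt=1e-3)
```

The integration time would typically be chosen near the minimum of the resulting curve, before slow drifts begin to dominate.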
Radio Astronomy Software Defined Receiver Project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vacaliuc, Bogdan; Leech, Marcus; Oxley, Paul
The paper describes a Radio Astronomy Software Defined Receiver (RASDR) that is currently under development. RASDR is targeted for use by amateurs and small institutions where cost is a primary consideration. The receiver will operate from HF thru 2.8 GHz. Front-end components such as preamps, block down-converters and pre-select bandpass filters are outside the scope of this development and will be provided by the user. The receiver includes RF amplifiers and attenuators, synthesized LOs, quadrature down converters, dual 8 bit ADCs and a Signal Processor that provides firmware processing of the digital bit stream. RASDR will interface to a user's PC via a USB or higher speed Ethernet LAN connection. The PC will run software that provides processing of the bit stream, a graphical user interface, as well as data analysis and storage. Software should support MAC OS, Windows and Linux platforms and will focus on such radio astronomy applications as total power measurements, pulsar detection, and spectral line studies.
NASA Astrophysics Data System (ADS)
Karmazikov, Y. V.; Fainberg, E. M.
2005-06-01
Work with DICOM-compatible equipment integrated into hardware and software systems for medical purposes is considered. The structure of the data reception and transformation process is illustrated using the digital roentgenography and angiography systems included in the DIMOL-IK hardware-software complex. Algorithms for receiving and analysing the data are proposed, and questions of the further processing and storage of the received data are considered.
GPS-based system for satellite tracking and geodesy
NASA Technical Reports Server (NTRS)
Bertiger, Willy I.; Thornton, Catherine L.
1989-01-01
High-performance receivers and data processing systems developed for GPS are reviewed. The GPS Inferred Positioning System (GIPSY) and the Orbiter Analysis and Simulation Software (OASIS) are described. The OASIS software is used to assess GPS system performance using GIPSY for data processing. Consideration is given to parameter estimation for multiday arcs, orbit repeatability, orbit prediction, daily baseline repeatability, agreement with VLBI, and ambiguity resolution. Also, the dual-frequency Rogue receiver, which can track up to eight GPS satellites simultaneously, is discussed.
Retina Image Screening and Analysis Software Version 2.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tobin, Jr., Kenneth W.; Karnowski, Thomas P.; Aykac, Deniz
2009-04-01
The software allows physicians or researchers to ground-truth images of retinas, identifying key physiological features and lesions that are indicative of disease. The software features methods to automatically detect the physiological features and lesions. The software contains code to measure the quality of images received from a telemedicine network; create and populate a database for a telemedicine network; review and report the diagnosis of a set of images; and also contains components to transmit images from a Zeiss camera to the network through SFTP.
Closed-Loop Analysis of Soft Decisions for Serial Links
NASA Technical Reports Server (NTRS)
Lansdowne, Chatwin A.; Steele, Glen F.; Zucha, Joan P.; Schlesinger, Adam M.
2013-01-01
We describe the benefit of using closed-loop measurements for a radio receiver paired with a counterpart transmitter. We show that real-time analysis of the soft decision output of a receiver can provide rich and relevant insight far beyond the traditional hard-decision bit error rate (BER) test statistic. We describe a Soft Decision Analyzer (SDA) implementation for closed-loop measurements on single- or dual- (orthogonal) channel serial data communication links. The analyzer has been used to identify, quantify, and prioritize contributors to implementation loss in live-time during the development of software defined radios. This test technique gains importance as modern receivers are providing soft decision symbol synchronization as radio links are challenged to push more data and more protocol overhead through noisier channels, and software-defined radios (SDRs) use error-correction codes that approach Shannon's theoretical limit of performance.
The Implication of Using NVivo Software in Qualitative Data Analysis: Evidence-Based Reflections.
Zamawe, F C
2015-03-01
For a long time, electronic data analysis has been associated with quantitative methods. However, Computer Assisted Qualitative Data Analysis Software (CAQDAS) is increasingly being developed. Although CAQDAS has been available for decades, very few qualitative health researchers report using it. This may be due to the difficulties that one has to go through to master the software and the misconceptions associated with using CAQDAS. While the issue of mastering CAQDAS has received ample attention, little has been done to address the misconceptions associated with it. In this paper, the author reflects on his experience of interacting with one of the popular CAQDAS packages (NVivo) in order to provide evidence-based implications of using the software. The key message is that, unlike statistical software, the main function of CAQDAS is not to analyse data but rather to aid the analysis process, which the researcher must always remain in control of. In other words, researchers must know that no software can analyse qualitative data. CAQDAS packages are basically data management tools that support the researcher during analysis.
Structural Analysis and Design Software
NASA Technical Reports Server (NTRS)
1997-01-01
Collier Research and Development Corporation received a one-of-a-kind computer code for designing exotic hypersonic aircraft, called ST-SIZE, in the first-ever Langley Research Center software copyright license agreement. Collier transformed the NASA computer code into a commercial software package called HyperSizer, which integrates with private-sector finite element modeling and finite element analysis structural analysis programs. ST-SIZE was chiefly conceived as a means to improve and speed the structural design of a future aerospace plane for the Langley Hypersonic Vehicles Office. Incorporating the NASA computer code into HyperSizer has enabled the company to also apply the software to applications other than aerospace, including improved design and construction for offices, marine structures, cargo containers, commercial and military aircraft, rail cars, and a host of everyday consumer products.
Validation results of the IAG Dancer project for distributed GPS analysis
NASA Astrophysics Data System (ADS)
Boomkamp, H.
2012-12-01
The number of permanent GPS stations in the world has grown far too large to allow processing of all this data at analysis centers. The majority of these GPS sites do not even make their observation data available to the analysis centers, for various valid reasons. The current ITRF solution is still based on centralized analysis by the IGS, and subsequent densification of the reference frame via regional network solutions. Minor inconsistencies in analysis methods, software systems and data quality imply that this centralized approach is unlikely to ever reach the ambitious accuracy objectives of GGOS. The dependence on published data also makes it clear that a centralized approach will never provide a true global ITRF solution for all GNSS receivers in the world. If the data does not come to the analysis, the only alternative is to bring the analysis to the data. The IAG Dancer project has implemented a distributed GNSS analysis system on the internet in which each receiver can have its own analysis center in the form of a freely distributed JAVA peer-to-peer application. Global parameters for satellite orbits, clocks and polar motion are solved via a distributed least squares solution among all participating receivers. A Dancer instance can run on any computer that has simultaneous access to the receiver data and to the public internet. In the future, such a process may be embedded in the receiver firmware directly. GPS network operators can join the Dancer ITRF realization without having to publish their observation data or estimation products. GPS users can run a Dancer process without contributing to the global solution, to have direct access to the ITRF in near real-time. The Dancer software has been tested on-line since late 2011. A global network of processes has gradually evolved to allow stabilization and tuning of the software in order to reach a fully operational system. This presentation reports on the current performance of the Dancer system, and demonstrates the obvious benefits of distributed analysis of geodetic data in general.
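The distributed solution can be pictured as each receiver accumulating its own normal equations and exchanging only those compact arrays with its peers; the sketch below (hypothetical NumPy code, not the Dancer implementation) shows the principle for a shared parameter vector:

```python
import numpy as np

def local_normal_equations(A, y):
    """Each receiver forms its own contribution to the global least-squares
    problem from its design matrix A and observation vector y."""
    return A.T @ A, A.T @ y

def combine_and_solve(contributions):
    """Sum the (N, b) pairs exchanged between peers and solve for the shared
    parameters (orbits, clocks, polar motion in the real system)."""
    N = sum(c[0] for c in contributions)
    b = sum(c[1] for c in contributions)
    return np.linalg.solve(N, b)

# Two receivers observing the same 3-parameter model
rng = np.random.default_rng(1)
x_true = np.array([1.0, -2.0, 0.5])
parts = []
for _ in range(2):
    A = rng.normal(size=(100, 3))
    y = A @ x_true + 0.01 * rng.normal(size=100)
    parts.append(local_normal_equations(A, y))
print(combine_and_solve(parts))
```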
Atmosphere Explorer control system software (version 1.0)
NASA Technical Reports Server (NTRS)
Villasenor, A.
1972-01-01
The basic design of the Atmosphere Explorer Control System (AECS) software, used in the testing, integration, and flight control of the AE spacecraft and experiments, is described. The software performs several vital functions, such as issuing commands to the spacecraft and experiments, receiving and processing telemetry data, and allowing for extensive data processing by experiment analysis programs. The major processing sections are: executive control section, telemetry decommutation section, command generation section, and utility section.
Imai, Shungo; Yamada, Takehiro; Ishiguro, Nobuhisa; Miyamoto, Takenori; Kagami, Keisuke; Tomiyama, Naoki; Niinuma, Yusuke; Nagasaki, Daisuke; Suzuki, Koji; Yamagami, Akira; Kasashi, Kumiko; Kobayashi, Masaki; Iseki, Ken
2017-01-01
Based on the predictive performance in our previous study, we switched the therapeutic drug monitoring (TDM) analysis software for dose setting of vancomycin (VCM) from "Vancomycin MEEK TDM analysis software Ver2.0" (MEEK) to "SHIONOGI-VCM-TDM ver.2009" (VCM-TDM) in January 2015. In the present study, our aim was to validate the effectiveness of changing the VCM TDM analysis software for the initial dose setting of VCM. The enrolled patients were divided into two groups, each having 162 patients in total, who received VCM with the initial dose set using MEEK (MEEK group) or VCM-TDM (VCM-TDM group). We compared the rates of attaining the therapeutic range (trough value: 10-20 μg/mL) of serum VCM concentration between the groups. Multivariate logistic regression analysis was performed to confirm that changing the VCM TDM analysis software was an independent factor related to attaining the therapeutic range. Switching the VCM TDM analysis software from MEEK to VCM-TDM improved the rate of attaining the therapeutic range by 21.6% (MEEK group: 42.6% vs. VCM-TDM group: 64.2%, p<0.01). Patient age ≥65 years, concomitant medication (furosemide) and use of VCM-TDM as the TDM analysis software were found to be independent factors for attaining the therapeutic range. These results demonstrate the effectiveness of switching the VCM TDM analysis software from MEEK to VCM-TDM for the initial dose setting of VCM.
Software requirements: Guidance and control software development specification
NASA Technical Reports Server (NTRS)
Withers, B. Edward; Rich, Don C.; Lowman, Douglas S.; Buckland, R. C.
1990-01-01
The software requirements for an implementation of Guidance and Control Software (GCS) are specified. The purpose of the GCS is to provide guidance and engine control to a planetary landing vehicle during its terminal descent onto a planetary surface and to communicate sensory information about that vehicle and its descent to some receiving device. The specification was developed using the structured analysis for real time system specification methodology by Hatley and Pirbhai and was based on a simulation program used to study the probability of success of the 1976 Viking Lander missions to Mars. Three versions of GCS are being generated for use in software error studies.
NASA Technical Reports Server (NTRS)
Psiaki, Mark L. (Inventor); Kintner, Jr., Paul M. (Inventor); Ledvina, Brent M. (Inventor); Powell, Steven P. (Inventor)
2007-01-01
A real-time software receiver that executes on a general purpose processor. The software receiver includes data acquisition and correlator modules that perform, in place of hardware correlation, baseband mixing and PRN code correlation using bit-wise parallelism.
NASA Technical Reports Server (NTRS)
Psiaki, Mark L. (Inventor); Ledvina, Brent M. (Inventor); Powell, Steven P. (Inventor); Kintner, Jr., Paul M. (Inventor)
2006-01-01
A real-time software receiver that executes on a general purpose processor. The software receiver includes data acquisition and correlator modules that perform, in place of hardware correlation, baseband mixing and PRN code correlation using bit-wise parallelism.
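The bit-wise parallelism referred to in these patents is commonly realized by packing sample signs and PRN chips into machine words so that one XOR plus a population count replaces many multiply-accumulate operations; a small illustrative sketch of that general idea (not the patented implementation):

```python
def pack_signs(samples):
    """Pack the sign of each +/-1 sample (+1 -> 1, -1 -> 0) into one integer."""
    bits = 0
    for s in samples:
        bits = (bits << 1) | (1 if s > 0 else 0)
    return bits

def bitwise_correlate(a_bits, b_bits, nbits):
    """Correlation of two +/-1 sequences stored as packed sign bits:
    agreements minus disagreements = 2*popcount(~(a XOR b)) - nbits."""
    mask = (1 << nbits) - 1
    agree = ~(a_bits ^ b_bits) & mask     # 1 wherever the signs match
    return 2 * bin(agree).count("1") - nbits

prn = [1, -1, 1, 1, -1, -1, 1, -1]
print(bitwise_correlate(pack_signs(prn), pack_signs(prn), len(prn)))  # prints 8
```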
Barbesi, Donato; Vicente Vilas, Víctor; Millet, Sylvain; Sandow, Miguel; Colle, Jean-Yves; Aldave de Las Heras, Laura
2017-01-01
A LabVIEW®-based software for the control of the fully automated multi-sequential flow injection analysis Lab-on-Valve (MSFIA-LOV) platform AutoRAD performing radiochemical analysis is described. The analytical platform interfaces an Arduino®-based device triggering multiple detectors, providing a flexible and fit-for-purpose choice of detection systems. The different analytical devices are interfaced to the PC running the LabVIEW® VI software using USB and RS232 interfaces, both for sending commands and receiving confirmation or error responses. The AutoRAD platform has been successfully applied for the chemical separation and determination of Sr, an important fission product pertinent to nuclear waste.
NASA Technical Reports Server (NTRS)
Allen, B. Danette
1998-01-01
In the traditional 'waterfall' model of the software project life cycle, the Requirements Phase ends and flows into the Design Phase, which ends and flows into the Development Phase. Unfortunately, the process rarely, if ever, works so smoothly in practice. Instead, software developers often receive new requirements, or modifications to the original requirements, well after the earlier project phases have been completed. In particular, projects with shorter than ideal schedules are highly susceptible to frequent requirements changes, as the software requirements analysis phase is often forced to begin before the overall system requirements and top-level design are complete. This results in later modifications to the software requirements, even though the software design and development phases may be complete. Requirements changes received in the later stages of a software project inevitably lead to modification of existing developed software. Presented here is a series of software design techniques that can greatly reduce the impact of last-minute requirements changes. These techniques were successfully used to add built-in flexibility to two complex software systems in which the requirements were expected to (and did) change frequently. These large, real-time systems were developed at NASA Langley Research Center (LaRC) to test and control the Lidar In-Space Technology Experiment (LITE) instrument which flew aboard the space shuttle Discovery as the primary payload on the STS-64 mission.
Sudhakar, S; Porcelvan, S; Francis, T.G. Tilak; Rathnamala, D; Radhakrishnan, R
2017-01-01
Introduction: Postural adaptation is now very common in school-going children, desk-based office workers, computer and frequent mobile phone users, and workers in most major industries. Several studies have documented a high incidence of postural abnormalities in a given population; however, the methods of postural measurement were poorly defined. The use of Posture Pro software to analyse postural imbalance in upper body dysfunction is rare, and the literature suggests that kinematic changes in a particular segment produce pain/discomfort and thereby reduce subjects' productivity. Aim: To evaluate postural changes in subjects with upper body dysfunction after a corrective exercise strategy, using postural analysis software and pectoralis minor muscle length testing. Materials and Methods: After the procedure and benefits were explained, informed consent was obtained from the participating subjects (age 25-55 years). Subjects with upper body dysfunction were randomly allocated into two groups (30 subjects each). Group A received the corrective exercise strategy and Group B received conventional exercise over the eight-week study period (15 repetitions of each exercise, total duration of 40 min, four days/week). Pre- and post-intervention postures were analysed using Posture Pro software, and flexibility of the pectoralis minor was assessed using a ruler-scale method. Results: Both groups showed changes in posture and pectoralis minor muscle length, with highly significant p-values (p<0.01) in each group. Comparing the two groups, the subjects who received the corrective exercise strategy showed a greater percentage improvement in posture (56.25%) and pectoralis minor muscle length (68.69%) than the subjects who received conventional exercise (24.86% and 21.9%, respectively). Conclusion: Evaluation of postural changes and pectoralis minor muscle flexibility before and after the corrective exercise strategy using postural analysis software was shown to be a useful tool in clinical practice, being an easy and reproducible method. PMID:28893030
NASA Technical Reports Server (NTRS)
1981-01-01
The software developed to simulate the ground control point navigation system is described. The Ground Control Point Simulation Program (GCPSIM) is designed as an analysis tool to predict the performance of the navigation system. The system consists of two star trackers, a global positioning system receiver, a gyro package, and a landmark tracker.
Numerical evaluation of an innovative cup layout for open volumetric solar air receivers
NASA Astrophysics Data System (ADS)
Cagnoli, Mattia; Savoldi, Laura; Zanino, Roberto; Zaversky, Fritz
2016-05-01
This paper proposes an innovative volumetric solar absorber design to be used in high-temperature air receivers of solar power tower plants. The innovative absorber, a so-called CPC-stacked-plate configuration, applies the well-known principle of a compound parabolic concentrator (CPC) for the first time in a volumetric solar receiver, heating air to high temperatures. The proposed absorber configuration is analyzed numerically, applying first the open-source ray-tracing software Tonatiuh in order to obtain the solar flux distribution on the absorber's surfaces. Next, a Computational Fluid Dynamic (CFD) analysis of a representative single channel of the innovative receiver is performed, using the commercial CFD software ANSYS Fluent. The solution of the conjugate heat transfer problem shows that the behavior of the new absorber concept is promising, however further optimization of the geometry will be necessary in order to exceed the performance of the classical absorber designs.
Closed-Loop Analysis of Soft Decisions for Serial Links
NASA Technical Reports Server (NTRS)
Lansdowne, Chatwin A.; Steele, Glen F.; Zucha, Joan P.; Schlensinger, Adam M.
2012-01-01
Modern receivers are providing soft decision symbol synchronization as radio links are challenged to push more data and more overhead through noisier channels, and software-defined radios use error-correction techniques that approach Shannon's theoretical limit of performance. The authors describe the benefit of closed-loop measurements for a receiver when paired with a counterpart transmitter and representative channel conditions. The authors also describe a real-time Soft Decision Analyzer (SDA) implementation for closed-loop measurements on single- or dual- (orthogonal) channel serial data communication links. The analyzer has been used to identify, quantify, and prioritize contributors to implementation loss in real-time during the development of software defined radios.
NASA Tech Briefs, December 2004
NASA Technical Reports Server (NTRS)
2004-01-01
Topics include: High-Rate Digital Receiver Board; Signal Design for Improved Ranging Among Multiple Transceivers; Automated Analysis, Classification, and Display of Waveforms; Fast-Acquisition/Weak-Signal-Tracking GPS Receiver for HEO; Format for Interchange and Display of 3D Terrain Data; Program Analyzes Radar Altimeter Data; Indoor Navigation using Direction Sensor and Beacons; Software Assists in Responding to Anomalous Conditions; Software for Autonomous Spacecraft Maneuvers; WinPlot; Software for Automated Testing of Mission-Control Displays; Nanocarpets for Trapping Microscopic Particles; Precious-Metal Salt Coatings for Detecting Hydrazines; Amplifying Electrochemical Indicators; Better End-Cap Processing for Oxidation-Resistant Polyimides; Carbon-Fiber Brush Heat Exchangers; Solar-Powered Airplane with Cameras and WLAN; A Resonator for Low-Threshold Frequency Conversion; Masked Proportional Routing; Algorithm Determines Wind Speed and Direction from Venturi-Sensor Data; Feature-Identification and Data-Compression Software; Alternative Attitude Commanding and Control for Precise Spacecraft Landing; Inspecting Friction Stir Welding using Electromagnetic Probes; and Helicity in Supercritical O2/H2 and C7H16/N2 Mixing Layers.
Preliminary description of the area navigation software for a microcomputer-based Loran-C receiver
NASA Technical Reports Server (NTRS)
Oguri, F.
1983-01-01
The development of new software, and its implementation on a microcomputer (MOS 6502) to provide high-quality navigation information, is described. This software development provides Area/Route Navigation (RNAV) information from Time Differences (TDs) in raw form, using an elliptical Earth model and a spherical model. The software is prepared for the microcomputer-based Loran-C receiver. To compute navigation information, a MOS 6502 microcomputer and a mathematical chip (AM 9511A) were combined with the Loran-C receiver. Final data reveal that this software does indeed provide accurate information with reasonable execution times.
Mining Program Source Code for Improving Software Quality
2013-01-01
conduct static verification on the software application under analysis to detect defects around APIs.
Hirose, Tomohiro; Nitta, Norihisa; Shiraishi, Junji; Nagatani, Yukihiro; Takahashi, Masashi; Murata, Kiyoshi
2008-12-01
The aim of this study was to evaluate the usefulness of computer-aided diagnosis (CAD) software for the detection of lung nodules on multidetector-row computed tomography (MDCT) in terms of improvement in radiologists' diagnostic accuracy in detecting lung nodules, using jackknife free-response receiver-operating characteristic (JAFROC) analysis. Twenty-one patients (6 without and 15 with lung nodules) were selected randomly from 120 consecutive thoracic computed tomographic examinations. The gold standard for the presence or absence of nodules in the observer study was determined by consensus of two radiologists. Six expert radiologists participated in a free-response receiver operating characteristic study for the detection of lung nodules on MDCT, in which cases were interpreted first without and then with the output of CAD software. Radiologists were asked to indicate the locations of lung nodule candidates on the monitor with their confidence ratings for the presence of lung nodules. The performance of the CAD software indicated that the sensitivity in detecting lung nodules was 71.4%, with 0.95 false-positive results per case. When radiologists used the CAD software, the average sensitivity improved from 39.5% to 81.0%, with an increase in the average number of false-positive results from 0.14 to 0.89 per case. The average figure-of-merit values for the six radiologists were 0.390 without and 0.845 with the output of the CAD software, and there was a statistically significant difference (P < .0001) using the JAFROC analysis. The CAD software for the detection of lung nodules on MDCT has the potential to assist radiologists by increasing their accuracy.
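For context, one common formulation of the JAFROC figure of merit is the probability that a lesion's confidence rating exceeds the highest false-positive rating on a normal case (ties counted as one half, unmarked lesions rated minus infinity); a hypothetical sketch, not the software used in the study:

```python
import numpy as np

def jafroc_fom(lesion_ratings, normal_case_max_fp_ratings):
    """JAFROC figure of merit: probability that a lesion's rating exceeds the
    highest false-positive rating on a normal case (ties count 1/2).
    Unmarked lesions should be passed in with a rating of -inf."""
    les = np.asarray(lesion_ratings, float)
    fp = np.asarray(normal_case_max_fp_ratings, float)
    wins = (les[:, None] > fp[None, :]).sum()
    ties = (les[:, None] == fp[None, :]).sum()
    return (wins + 0.5 * ties) / (les.size * fp.size)

# e.g., ratings on a 0-100 confidence scale
print(jafroc_fom([80, 65, -np.inf, 90], [40, 55, 20]))
```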
Digital beacon receiver for ionospheric TEC measurement developed with GNU Radio
NASA Astrophysics Data System (ADS)
Yamamoto, M.
2008-11-01
A simple digital receiver named GNU Radio Beacon Receiver (GRBR) was developed for the satellite-ground beacon experiment to measure the ionospheric total electron content (TEC). The open-source software toolkit for the software defined radio, GNU Radio, is utilized to realize the basic function of the receiver and perform fast signal processing. The software is written in Python for a LINUX PC. The open-source hardware called Universal Software Radio Peripheral (USRP), which best matches the GNU Radio, is used as a front-end to acquire the satellite beacon signals of 150 and 400 MHz. The first experiment was successful as results from GRBR showed very good agreement to those from the co-located analog beacon receiver. Detailed design information and software codes are open at the URL http://www.rish.kyoto-u.ac.jp/digitalbeacon/.
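As background on how the two coherent carriers yield TEC, the standard first-order geometry-free combination can be sketched as below; sign and scaling conventions, and the handling of the phase ambiguity, vary between implementations, so this is illustrative rather than the GRBR code:

```python
C = 299792458.0   # speed of light, m/s
K = 40.308        # first-order ionospheric constant, m^3/s^2

def relative_tec(phi_150_cycles, phi_400_cycles, f1=150e6, f2=400e6):
    """Relative (ambiguous) TEC from the geometry-free combination of the
    coherent 150/400 MHz beacon carrier phases, both given in cycles.
    Scaling the 150 MHz phase by f2/f1 cancels the geometric range term,
    leaving a residual proportional to TEC."""
    dphi = phi_150_cycles * (f2 / f1) - phi_400_cycles
    tec_el_per_m2 = -dphi * C * f1**2 * f2 / (K * (f2**2 - f1**2))
    return tec_el_per_m2 / 1e16      # convert to TEC units (1 TECU = 1e16 el/m^2)
```

Only relative TEC along the pass is recovered this way; an absolute offset must be resolved separately.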
Bodzon-Kulakowska, Anna; Marszalek-Grabska, Marta; Antolak, Anna; Drabik, Anna; Kotlinska, Jolanta H; Suder, Piotr
Data analysis from mass spectrometry imaging (MSI) experiments is a very complex task. Most of the software packages devoted to this purpose are designed by the mass spectrometer manufacturers and, thus, are not freely available. Laboratories developing their own MS-imaging sources usually do not have access to the commercial software, and they must rely on freely available programs. The most recognized ones are BioMap, developed by Novartis under the Interactive Data Language (IDL), and Datacube, developed by the Dutch Foundation for Fundamental Research on Matter (FOM-Amolf). These two systems were used here for the analysis of images obtained from rat brain tissues subjected to morphine influence, and their capabilities were compared in terms of ease of use and the quality of the obtained results.
Noncoherent sampling technique for communications parameter estimations
NASA Technical Reports Server (NTRS)
Su, Y. T.; Choi, H. J.
1985-01-01
This paper presents a method of noncoherent demodulation of the PSK signal for signal distortion analysis at the RF interface. The received RF signal is downconverted and noncoherently sampled for further off-line processing. Any mismatch in phase and frequency is then compensated for by the software using the estimation techniques to extract the baseband waveform, which is needed in measuring various signal parameters. In this way, various kinds of modulated signals can be treated uniformly, independent of modulation format, and additional distortions introduced by the receiver or the hardware measurement instruments can thus be eliminated. Quantization errors incurred by digital sampling and ensuing software manipulations are analyzed and related numerical results are presented also.
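A minimal sketch of this style of off-line compensation, assuming a BPSK waveform: squaring the complex samples removes the binary modulation and exposes a tone at twice the residual carrier offset, which can then be estimated and removed in software (a common textbook approach, not necessarily the authors' estimator):

```python
import numpy as np

def remove_carrier_offset_bpsk(iq, fs):
    """Estimate and remove residual carrier frequency/phase from noncoherently
    sampled BPSK (assumes 2*f_offset lies within the sampled band)."""
    n = len(iq)
    spec = np.fft.fft(iq ** 2)                       # squaring wipes +/-1 data
    freqs = np.fft.fftfreq(n, d=1.0 / fs)
    f_off = freqs[np.argmax(np.abs(spec))] / 2.0     # tone appears at 2*f_off
    t = np.arange(n) / fs
    baseband = iq * np.exp(-2j * np.pi * f_off * t)  # derotate the carrier
    phase = 0.5 * np.angle(np.mean(baseband ** 2))   # residual phase (mod pi)
    return baseband * np.exp(-1j * phase), f_off
```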
Wide-Area Persistent Energy-Efficient Maritime Sensing
2015-09-30
Matt Reynolds, Lefteris Kampianakis, and Andreas Pedrosse-Engel at UW designed and tested a Software Defined Radar testbed as well as an Arduino-based ...hardware based on a software-defined radio platform. 2) Development of a standalone Arduino-based backscatter node. 3) Analysis of the limits of the... Arduino-based node that can modulate radar backscatter with data received from a sensor using a low-power Arduino Nano processor. Figure 5 shows a
Data and Analysis Center for Software
1990-03-01
is available to DACS users. 7.4 Bibliographic Services: Bibliographic inquiries to the DACS are received in many forms: by letter, by telephone call, by... of potential users concerning the DACS and its products and services. 10.0 TASK 9 - SPECIAL STUDIES AND PROJECTS. 10.1 Introduction: There are many ...problems related to software technology that can be solved through the full service capabilities provided by the DACS. Many of these are sizable
GNSS software receiver sampling noise and clock jitter performance and impact analysis
NASA Astrophysics Data System (ADS)
Chen, Jian Yun; Feng, XuZhe; Li, XianBin; Wu, GuangYao
2015-02-01
The design of multi-frequency, multi-constellation GNSS software defined radio receivers is becoming more and more popular due to their simple architecture, flexible configuration and good coherence in multi-frequency signal processing. Such receivers play an important role in navigation signal processing and signal quality monitoring. In particular, driving the sampling clock of the analogue-to-digital converter (ADC) from an FPGA in a GNSS software defined radio receiver means that a more flexible radio transceiver design is possible. According to the concept of software defined radio (SDR), the ideal is to digitize as close to the antenna as possible. However, because the carrier frequency of GNSS signals is on the order of GHz, converting at this frequency is expensive and consumes more power. Band-pass sampling is a cheaper, more effective alternative: with this method it is possible to sample an RF signal at only twice the bandwidth of the signal. Unfortunately, as the other side of the coin, the introduction of the SDR concept and the band sampling method degrades receiver performance. The ADCs suffer larger sampling clock jitter generated by the FPGA, and the low sampling frequency introduces more noise into the receiver, so the influence of sampling noise cannot be neglected. The paper analyzes the sampling noise, presents its influence on the carrier-to-noise ratio, and derives the ranging error by calculating the synchronization error of the delay locked loop. Simulations addressing each factor contributing to sampling-noise-induced ranging error are performed. Simulation and experiment results show that if the target ranging accuracy is at the level of a centimeter, the quantization length should be no less than 8 bits and the sampling clock jitter should not exceed 30 ps.
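Two standard rules of thumb behind these conclusions can be stated directly: the jitter-limited SNR for a full-scale input tone and the ideal quantization SNR of an N-bit converter. The 4 MHz intermediate frequency in the example is only an assumed band-sampled value, not a figure from the paper:

```python
import numpy as np

def jitter_limited_snr_db(f_in_hz, jitter_rms_s):
    """Upper bound on SNR set by sampling clock (aperture) jitter
    for a full-scale sine input."""
    return -20.0 * np.log10(2.0 * np.pi * f_in_hz * jitter_rms_s)

def quantization_snr_db(n_bits):
    """Ideal quantization-limited SNR for a full-scale sine wave."""
    return 6.02 * n_bits + 1.76

# e.g., a band-sampled 4 MHz IF with 30 ps RMS clock jitter and an 8-bit ADC
print(jitter_limited_snr_db(4e6, 30e-12), quantization_snr_db(8))
```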
NASA Technical Reports Server (NTRS)
Zhou, Hanying
2007-01-01
PREDICTS is a computer program that predicts the frequencies, as functions of time, of signals to be received by a radio science receiver, in this case a special-purpose digital receiver dedicated to analysis of signals received by an antenna in NASA's Deep Space Network (DSN). Unlike other software used in the DSN, PREDICTS does not use interpolation early in the calculations; as a consequence, PREDICTS is more precise and more stable. The precision afforded by the other DSN software is sufficient for telemetry; the greater precision afforded by PREDICTS is needed for radio-science experiments. In addition to frequencies as a function of time, PREDICTS yields the rates of change and interpolation coefficients for the frequencies and the beginning and ending times of reception, transmission, and occultation. PREDICTS is applicable to S-, X-, and Ka-band signals and can accommodate the following link configurations: (1) one-way (spacecraft to ground), (2) two-way (from a ground station to a spacecraft to the same ground station), and (3) three-way (from a ground transmitting station to a spacecraft to a different ground receiving station).
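At first order, a one-way frequency prediction is just the transmitted frequency scaled by the line-of-sight range rate; PREDICTS itself solves the full light-time problem for the one-, two-, and three-way cases, so the sketch below (with made-up numbers) only conveys the basic idea:

```python
C = 299792458.0  # speed of light, m/s

def received_frequency(f_transmit_hz, range_rate_m_s):
    """First-order one-way Doppler: a positive range rate (receding spacecraft)
    lowers the received frequency."""
    return f_transmit_hz * (1.0 - range_rate_m_s / C)

# e.g., an X-band downlink near 8.42 GHz with a 10 km/s recession rate
print(received_frequency(8.42e9, 10e3))
```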
NASA Technical Reports Server (NTRS)
Chie, C. M.; Su, Y. T.; Lindsey, W. C.; Koukos, J.
1984-01-01
The autonomous and integrated aspects of the operation of the AIRS (Autonomous Integrated Receive System) are discussed from a system operation point of view. The advantages of AIRS compared to the existing SSA receive chain equipment are highlighted. The three modes of AIRS operation are addressed in detail. The configurations of the AIRS are defined as a function of the operating modes and the user signal characteristics. Each AIRS configuration selection is made up of three components: the hardware, the software algorithms, and the parameters used by these algorithms. A comparison between AIRS and the wide dynamics demodulation (WDD) is provided. The organization of the AIRS analytical/simulation software is described. The modeling and analysis for simulating the performance of the PN subsystem is documented. The frequency acquisition technique using a frequency-locked loop is also documented. The Doppler compensation implementation is described. The technological aspects of employing CCDs for PN acquisition are addressed.
Digital coherent receiver based transmitter penalty characterization.
Geisler, David J; Kaufmann, John E
2016-12-26
For optical communications links where receivers are signal-power-starved, such as free-space links, it is important to design transmitters and receivers that can operate as close as practically possible to theoretical limits. A total system penalty is typically assessed in terms of how far the end-to-end bit-error rate (BER) is from these limits. It is desirable, but usually difficult, to determine the division of this penalty between the transmitter and receiver. This paper describes a new, rigorous, computation-based method that isolates which portion of the penalty can be assessed against the transmitter. There are two basic parts to this approach: (1) use of a coherent optical receiver to perform frequency down-conversion of a transmitter's optical signal waveform to the electrical domain, preserving both optical field amplitude and phase information, and (2) software-based analysis of the digitized electrical waveform. The result is a single numerical metric that quantifies how close a transmitter's signal waveform is to the ideal, based on its BER performance with a perfect software-defined matched-filter receiver demodulator. A detailed description of applying the proposed methodology to the waveform characterization of an optical burst-mode differential phase-shift keying (DPSK) transmitter is experimentally demonstrated.
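A simplified view of the reference demodulator in such an analysis: after software matched filtering and symbol-rate sampling, binary DPSK is detected differentially and the measured BER is compared with the ideal noncoherent-DPSK curve. The sketch below uses assumed names and is not the authors' code:

```python
import numpy as np

def dpsk_demod(symbols):
    """Differential detection of binary DPSK on matched-filtered, symbol-rate
    complex samples: a phase flip between consecutive symbols means bit 1."""
    metric = np.real(symbols[1:] * np.conj(symbols[:-1]))
    return (metric < 0).astype(int)

def dpsk_ber_theory(ebn0_db):
    """Ideal noncoherent binary DPSK bit error rate: 0.5 * exp(-Eb/N0)."""
    ebn0 = 10.0 ** (np.asarray(ebn0_db, float) / 10.0)
    return 0.5 * np.exp(-ebn0)

# The transmitter penalty is the extra Eb/N0 the measured waveform needs
# to reach the same BER as this ideal curve.
print(dpsk_ber_theory([6, 8, 10]))
```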
The March 1985 demonstration of the fiducial network concept for GPS geodesy: A preliminary report
NASA Technical Reports Server (NTRS)
Davidson, J. M.; Thornton, C. L.; Dixon, T. H.; Vegos, C. J.; Young, L. E.; Yunck, T. P.
1986-01-01
The first field tests in preparation for the NASA Global Positioning System (GPS) Caribbean Initiative were conducted in late March and early April of 1985. The GPS receivers were located at the POLARIS Very Long Baseline Interferometry (VLBI) stations at Westford, Massachusetts; Richmond, Florida; and Ft. Davis, Texas; and at the Mojave, Owens Valley, and Hat Creek VLBI stations in California. Other mobile receivers were placed near Mammoth Lakes, California; Pt. Mugu, California; Austin, Texas; and Dahlgren, Virginia. These sites were equipped with a combination of GPS receiver types, including SERIES-X, TI-4100 and AFGL dual-frequency receivers. The principal objectives of these tests were to demonstrate the fiducial network concept for precise GPS geodesy, to assess the performance of the participating GPS receiver types, and to conduct the first in a series of experiments to monitor ground deformation in the Mammoth Lakes-Long Valley caldera region in California. Other objectives included the testing of water vapor radiometers for the calibration of GPS data, the development of efficient procedures for planning and coordinating GPS field exercises, the establishment of institutional interfaces for future cooperative ventures, the testing of the GPS data analysis software (GIPSY, for GPS Inferred Positioning SYstem), and the establishment of a set of calibration baselines in California. Preliminary reports on the success of the field tests, including receiver performance and data quality, and on the status of the data analysis software are given.
The State of Software for Evolutionary Biology.
Darriba, Diego; Flouri, Tomáš; Stamatakis, Alexandros
2018-05-01
With Next Generation Sequencing data being routinely used, evolutionary biology is transforming into a computational science. Thus, researchers have to rely on a growing number of increasingly complex software. All widely used core tools in the field have grown considerably, in terms of the number of features as well as lines of code, and consequently also with respect to software complexity. A topic that has received little attention is the software engineering quality of widely used core analysis tools. Software developers appear to rarely assess the quality of their code, and this can have potential negative consequences for end-users. To this end, we assessed the code quality of 16 highly cited and compute-intensive tools mainly written in C/C++ (e.g., MrBayes, MAFFT, SweepFinder, etc.) and JAVA (BEAST) from the broader area of evolutionary biology that are being routinely used in current data analysis pipelines. Because the software engineering quality of the tools we analyzed is rather unsatisfying, we provide a list of best practices for improving the quality of existing tools and list techniques that can be deployed for developing reliable, high-quality scientific software from scratch. Finally, we also discuss journal as well as science policy and, more importantly, funding issues that need to be addressed for improving software engineering quality as well as ensuring support for developing new and maintaining existing software. Our intention is to raise the awareness of the community regarding software engineering quality issues and to emphasize the substantial lack of funding for scientific software development.
Listening to the student voice to improve educational software.
van Wyk, Mari; van Ryneveld, Linda
2017-01-01
Academics often develop software for teaching and learning purposes with the best of intentions, only to be disappointed by the low acceptance rate of the software by their students once it is implemented. In this study, the focus is on software that was designed to enable veterinary students to record their clinical skills. A pilot of the software clearly showed that the program had not been received as well as had been anticipated, and therefore the researchers used a group interview and a questionnaire with closed-ended and open-ended questions to obtain the students' feedback. The open-ended questions were analysed with conceptual content analysis, and themes were identified. Students made valuable suggestions about what they regarded as important considerations when a new software program is introduced. The most important lesson learnt was that students cannot always predict their needs accurately if they are asked for input prior to the development of software. For that reason student input should be obtained on a continuous and regular basis throughout the design and development phases.
PPM Receiver Implemented in Software
NASA Technical Reports Server (NTRS)
Gray, Andrew; Kang, Edward; Lay, Norman; Vilnrotter, Victor; Srinivasan, Meera; Lee, Clement
2010-01-01
A computer program has been written as a tool for developing optical pulse-position- modulation (PPM) receivers in which photodetector outputs are fed to analog-to-digital converters (ADCs) and all subsequent signal processing is performed digitally. The program can be used, for example, to simulate an all-digital version of the PPM receiver described in Parallel Processing of Broad-Band PPM Signals (NPO-40711), which appears elsewhere in this issue of NASA Tech Briefs. The program can also be translated into a design for digital PPM receiver hardware. The most notable innovation embodied in the software and the underlying PPM-reception concept is a digital processing subsystem that performs synchronization of PPM time slots, even though the digital processing is, itself, asynchronous in the sense that no attempt is made to synchronize it with the incoming optical signal a priori and there is no feedback to analog signal processing subsystems or ADCs. Functions performed by the software receiver include time-slot synchronization, symbol synchronization, coding preprocessing, and diagnostic functions. The program is written in the MATLAB and Simulink software system. The software receiver is highly parameterized and, hence, programmable: for example, slot- and symbol-synchronization filters have programmable bandwidths.
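The core slot decision in a PPM receiver reduces to choosing the slot with the largest integrated count in each symbol frame; a minimal sketch under Poisson statistics, independent of the MATLAB/Simulink implementation described above:

```python
import numpy as np

def ppm_detect(slot_counts):
    """Maximum-likelihood hard decision for M-ary PPM under Poisson statistics:
    pick the slot with the largest photon count in each symbol frame."""
    return np.argmax(np.asarray(slot_counts), axis=1)

# e.g., 16-ary PPM, symbol value 5 transmitted with ~20 signal photons per frame
rng = np.random.default_rng(0)
counts = rng.poisson(1.0, size=(1000, 16))     # background in every slot
counts[:, 5] += rng.poisson(20.0, size=1000)   # signal photons in slot 5
print((ppm_detect(counts) == 5).mean())        # fraction of correct symbols
```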
Mayo, Charles; Conners, Steve; Warren, Christopher; Miller, Robert; Court, Laurence; Popple, Richard
2013-01-01
Purpose: With emergence of clinical outcomes databases as tools utilized routinely within institutions, comes need for software tools to support automated statistical analysis of these large data sets and intrainstitutional exchange from independent federated databases to support data pooling. In this paper, the authors present a design approach and analysis methodology that addresses both issues. Methods: A software application was constructed to automate analysis of patient outcomes data using a wide range of statistical metrics, by combining use of C#.Net and R code. The accuracy and speed of the code was evaluated using benchmark data sets. Results: The approach provides data needed to evaluate combinations of statistical measurements for ability to identify patterns of interest in the data. Through application of the tools to a benchmark data set for dose-response threshold and to SBRT lung data sets, an algorithm was developed that uses receiver operator characteristic curves to identify a threshold value and combines use of contingency tables, Fisher exact tests, Welch t-tests, and Kolmogorov-Smirnov tests to filter the large data set to identify values demonstrating dose-response. Kullback-Leibler divergences were used to provide additional confirmation. Conclusions: The work demonstrates the viability of the design approach and the software tool for analysis of large data sets. PMID:24320426
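To make the filtering strategy concrete, here is a hypothetical SciPy sketch of testing one candidate dose threshold against a binary outcome with the statistics listed in the abstract (contingency table with Fisher's exact test, Welch t-test, and Kolmogorov-Smirnov test); it is illustrative only, not the authors' application:

```python
import numpy as np
from scipy import stats

def dose_response_screen(dose, outcome, threshold):
    """Split cases at a candidate dose threshold and test for association
    with a binary outcome (1 = event, 0 = no event)."""
    dose = np.asarray(dose, float)
    outcome = np.asarray(outcome, int)
    above = dose >= threshold
    table = [[np.sum(above & (outcome == 1)), np.sum(above & (outcome == 0))],
             [np.sum(~above & (outcome == 1)), np.sum(~above & (outcome == 0))]]
    _, p_fisher = stats.fisher_exact(table)
    _, p_welch = stats.ttest_ind(dose[outcome == 1], dose[outcome == 0],
                                 equal_var=False)       # Welch t-test
    _, p_ks = stats.ks_2samp(dose[outcome == 1], dose[outcome == 0])
    return p_fisher, p_welch, p_ks
```

Variables showing small p-values across these complementary tests would be flagged for further inspection, with the ROC-derived threshold supplying the split point.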
Hansson, Jonny; Månsson, Lars Gunnar; Båth, Magnus
2016-06-01
The purpose of the present work was to investigate the validity of using single-reader-adapted receiver operating characteristic (ROC) software for analysis of visual grading characteristics (VGC) data. VGC data from four published VGC studies on optimisation of X-ray examinations, previously analysed using ROCFIT, were reanalysed using recently developed software dedicated to VGC analysis (VGC Analyzer), and the outcomes [the mean and 95% confidence interval (CI) of the area under the VGC curve (AUC_VGC) and the p-value] were compared. The studies included both paired and non-paired data and were reanalysed both for the fixed-reader and the random-reader situations. The results showed good agreement between the software packages for the mean AUC_VGC. For non-paired data, wider CIs were obtained with VGC Analyzer than previously reported, whereas for paired data the previously reported CIs were similar or even broader. Similar observations were made for the p-values. The results indicate that using single-reader-adapted ROC software such as ROCFIT to analyse non-paired VGC data may lead to an increased risk of committing Type I errors, especially in the random-reader situation. On the other hand, using ROC software to analyse paired VGC data may lead to an increased risk of committing Type II errors, especially in the fixed-reader situation.
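The quantity under comparison, AUC_VGC, is a nonparametric area equivalent to the Mann-Whitney statistic computed on the ordinal ratings from the two conditions; the sketch below gives the point estimate only (VGC Analyzer obtains the CI by resampling cases and/or readers), with assumed names:

```python
import numpy as np

def auc_vgc(ratings_a, ratings_b):
    """Nonparametric area under the VGC curve: probability that a rating given
    under condition B exceeds one given under condition A (ties count 1/2)."""
    a = np.asarray(ratings_a, float)
    b = np.asarray(ratings_b, float)
    wins = (b[:, None] > a[None, :]).sum()
    ties = (b[:, None] == a[None, :]).sum()
    return (wins + 0.5 * ties) / (a.size * b.size)

# e.g., 5-level image quality ratings under a reference and a test protocol
print(auc_vgc([2, 3, 3, 4, 2], [3, 4, 4, 5, 3]))
```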
NASA Technical Reports Server (NTRS)
Baxa, E. G., Jr.
1974-01-01
A theoretical formulation of differential and composite OMEGA error is presented to establish hypotheses about the functional relationships between various parameters and OMEGA navigational errors. Computer software developed to provide for extensive statistical analysis of the phase data is described. Results from the regression analysis used to conduct parameter sensitivity studies on differential OMEGA error tend to validate the theoretically based hypothesis concerning the relationship between uncorrected differential OMEGA error and receiver separation range and azimuth. Limited results of measurement of receiver repeatability error and line of position measurement error are also presented.
Modified timing module for Loran-C receiver
NASA Technical Reports Server (NTRS)
Lilley, R. W.
1983-01-01
Full hardware documentation is provided for the circuit card implementing the Loran-C timing loop, and the receiver event-mark and re-track functions. This documentation is to be combined with overall receiver drawings to form the as-built record for this device. Computer software to support this module is integrated with the remainder of the receiver software, in the LORPROM program.
Treatment planning and dose analysis for interstitial photodynamic therapy of prostate cancer
NASA Astrophysics Data System (ADS)
Davidson, Sean R. H.; Weersink, Robert A.; Haider, Masoom A.; Gertner, Mark R.; Bogaards, Arjen; Giewercer, David; Scherz, Avigdor; Sherar, Michael D.; Elhilali, Mostafa; Chin, Joseph L.; Trachtenberg, John; Wilson, Brian C.
2009-04-01
With the development of new photosensitizers that are activated by light at longer wavelengths, interstitial photodynamic therapy (PDT) is emerging as a feasible alternative for the treatment of larger volumes of tissue. Described here is the application of PDT treatment planning software developed by our group to ensure complete coverage of larger, geometrically complex target volumes such as the prostate. In a phase II clinical trial of TOOKAD vascular targeted photodynamic therapy (VTP) for prostate cancer in patients who failed prior radiotherapy, the software was used to generate patient-specific treatment prescriptions for the number of treatment fibres, their lengths, their positions and the energy each delivered. The core of the software is a finite element solution to the light diffusion equation. Validation against in vivo light measurements indicated that the software could predict the location of an iso-fluence contour to within approximately ±2 mm. The same software was used to reconstruct the treatments that were actually delivered, thereby providing an analysis of the threshold light dose required for TOOKAD-VTP of the post-irradiated prostate. The threshold light dose for VTP-induced prostate damage, as measured one week post-treatment using contrast-enhanced MRI, was found to be highly heterogeneous, both within and between patients. The minimum light dose received by 90% of the prostate, D90, was determined from each patient's dose-volume histogram and compared to six-month sextant biopsy results. No patient with a D90 less than 23 J cm-2 had complete biopsy response, while 8/13 (62%) of patients with a D90 greater than 23 J cm-2 had negative biopsies at six months. The doses received by the urethra and the rectal wall were also investigated.
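The D90 metric used here is the light dose exceeded in 90% of the target volume, i.e. the 10th percentile of the per-voxel dose distribution; a one-line illustration with assumed values:

```python
import numpy as np

def d90(voxel_doses_j_per_cm2):
    """Minimum light dose received by the best-covered 90% of the volume:
    the 10th percentile of the per-voxel dose distribution."""
    return np.percentile(np.asarray(voxel_doses_j_per_cm2, float), 10.0)

# e.g., a prostate whose voxels received 10-60 J/cm^2
print(d90(np.linspace(10.0, 60.0, 1000)))  # ~15 J/cm^2
```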
NASA Technical Reports Server (NTRS)
Caplin, R. S.; Royer, E. R.
1977-01-01
Design analysis of a microbial load monitor system flight engineering model is presented. The card taper and media pump system was fabricated and checked out, as were the final two incubating reading heads, the sample receiving and card loading device assembly, related sterility testing, and software. Progress in these areas is summarized.
Status report of the SRT radiotelescope control software: the DISCOS project
NASA Astrophysics Data System (ADS)
Orlati, A.; Bartolini, M.; Buttu, M.; Fara, A.; Migoni, C.; Poppi, S.; Righini, S.
2016-08-01
The Sardinia Radio Telescope (SRT) is a 64-m fully-steerable radio telescope. It is provided with an active surface to correct for gravitational deformations, allowing observations from 300 MHz to 100 GHz. At present, three receivers are available: a coaxial LP-band receiver (305-410 MHz and 1.5-1.8 GHz), a C-band receiver (5.7-7.7 GHz) and a 7-feed K-band receiver (18-26.5 GHz). Several back-ends are also available in order to perform the different data acquisition and analysis procedures requested by scientific projects. The design and development of the SRT control software started in 2004, and now belongs to a wider project called DISCOS (Development of the Italian Single-dish COntrol System), which provides a common infrastructure to the three Italian radio telescopes (Medicina, Noto and SRT dishes). DISCOS is based on the Alma Common Software (ACS) framework, and currently consists of more than 500k lines of code. It is organized in a common core and three specific product lines, one for each telescope. Recent developments, carried out after the conclusion of the technical commissioning of the instrument (October 2013), consisted in the addition of several new features in many parts of the observing pipeline, spanning from the motion control to the digital back-ends for data acquisition and data formatting; we briefly describe such improvements. More importantly, in the last two years we have supported the astronomical validation of the SRT radio telescope, leading to the opening of the first public call for proposals in late 2015. During this period, while assisting both the engineering and the scientific staff, we massively employed the control software and were able to test all of its features: in this process we received our first feedback from the users and we could verify how the system performed in a real-life scenario, drawing the first conclusions about the overall system stability and performance. We examine how the system behaves in terms of network load and system load, how it reacts to failures and errors, and what components and services seem to be the most critical parts of our architecture, showing how the ACS framework impacts on these aspects. Moreover, the exposure to public utilization has highlighted the major flaws in our development and software management process, which had to be tuned and improved in order to achieve faster release cycles in response to user feedback, and safer deploy operations. In this regard we show how the introduction of testing practices, along with continuous integration, helped us to meet higher quality standards. Having identified the most critical aspects of our software, we conclude by showing our intentions for the future development of DISCOS, both in terms of software features and software infrastructures.
Nonstationary Extreme Value Analysis in a Changing Climate: A Software Package
NASA Astrophysics Data System (ADS)
Cheng, L.; AghaKouchak, A.; Gilleland, E.
2013-12-01
Numerous studies show that climatic extremes have increased substantially in the second half of the 20th century. For this reason, analysis of extremes under a nonstationary assumption has received a great deal of attention. This paper presents a software package developed for estimation of return levels, return periods, and risks of climatic extremes in a changing climate. This MATLAB software package offers tools for analysis of climate extremes under both stationary and nonstationary assumptions. The Nonstationary Extreme Value Analysis package (hereafter, NEVA) provides an efficient and generalized framework for analyzing extremes using Bayesian inference. NEVA estimates the extreme value parameters using a Differential Evolution Markov Chain (DE-MC) approach, which combines the genetic algorithm Differential Evolution (DE) for global optimization over the real parameter space with Markov Chain Monte Carlo (MCMC) sampling, and has the advantages of simplicity, speed of calculation and convergence over conventional MCMC. NEVA also offers confidence intervals and uncertainty bounds for the estimated return levels based on the sampled parameters. NEVA integrates extreme value design concepts, data analysis tools, optimization and visualization, explicitly designed to facilitate analysis of extremes in the geosciences. The generalized input and output files of this software package make it attractive for users from different fields. Both the stationary and nonstationary components of the package are validated for a number of case studies using empirical return levels. The results show that NEVA reliably describes extremes and their return levels.
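As a conceptual illustration of the return-level idea underlying NEVA (which itself is a MATLAB package using Bayesian DE-MC sampling), the stationary case can be sketched in Python with SciPy: fit a GEV distribution to block maxima and read the T-year return level as the (1 − 1/T) quantile. The data below are synthetic and the approach is maximum likelihood, not the package's Bayesian method.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical annual maxima (stand-in for real observations).
annual_maxima = stats.genextreme.rvs(c=-0.1, loc=40.0, scale=10.0,
                                     size=60, random_state=rng)

# Fit a stationary GEV to the block maxima by maximum likelihood.
shape, loc, scale = stats.genextreme.fit(annual_maxima)

# The T-year return level is the (1 - 1/T) quantile of the fitted GEV.
for T in (10, 50, 100):
    z_T = stats.genextreme.ppf(1.0 - 1.0 / T, shape, loc=loc, scale=scale)
    print(f"{T:>3}-year return level: {z_T:.1f}")
```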
NASA Technical Reports Server (NTRS)
Wennersten, Miriam; Banes, Vince; Boegner, Greg; Clagnett, Charles; Dougherty, Lamar; Edwards, Bernard; Roman, Joe; Bauer, Frank H. (Technical Monitor)
2001-01-01
NASA Goddard Space Flight Center has built an open architecture, 24 channel spaceflight Global Positioning System (GPS) receiver. The compact PCI PiVoT GPS receiver card is based on the Mitel/GEC Plessey Builder 2 board. PiVoT uses two Plessey 2021 correlators to allow tracking of up to 24 separate GPS SV's on unique channels. Its four front ends can support four independent antennas, making it a useful card for hosting GPS attitude determination algorithms. It has been built using space quality, radiation tolerant parts. The PiVoT card works at a lower signal to noise ratio than the original Builder 2 board. It also hosts an improved clock oscillator. The PiVoT software is based on the original Plessey Builder 2 software ported to the Linux operating system. The software is POSIX compliant and can be easily converted to other POSIX operating systems. The software is open source to anyone with a licensing agreement with Plessey. Additional tasks can be added to the software to support GPS science experiments or attitude determination algorithms. The next generation PiVoT receiver will be a single radiation hardened compact PCI card containing the microprocessor and the GPS receiver, optimized for use above the GPS constellation.
Scientific Data Analysis and Software Support: Geodynamics
NASA Technical Reports Server (NTRS)
Klosko, Steven; Sanchez, B. (Technical Monitor)
2000-01-01
The support on this contract centers on development of data analysis strategies, geodynamic models, and software codes to study four-dimensional geodynamic and oceanographic processes, as well as studies and mission support for near-Earth and interplanetary satellite missions. SRE had a subcontract to maintain the optical laboratory for the LTP, where instruments such as MOLA and GLAS are developed. NVI performed work on a Raytheon laser altimetry task through a subcontract, providing data analysis and final data production for distribution to users. HBG had a subcontract for specialized digital topography analysis and map generation. Over the course of this contract, Raytheon ITSS staff have supported over 60 individual tasks. Some tasks have remained in place during this entire interval whereas others have been completed and were of shorter duration. Over the course of events, task numbers were changed to reflect changes in the character of the work or new funding sources. The description presented below will detail the technical accomplishments that have been achieved according to their science and technology areas. What will be shown is a brief overview of the progress that has been made in each of these investigative and software development areas. Raytheon ITSS staff members have received many awards for their work on this contract, including GSFC Group Achievement Awards for TOPEX Precision Orbit Determination and the Joint Gravity Model One Team. NASA JPL gave the TOPEX/POSEIDON team a medal commemorating the completion of the primary mission and a Certificate of Appreciation. Raytheon ITSS has also received a Certificate of Appreciation from GSFC for its extensive support of the Shuttle Laser Altimeter Experiment.
Optical design and optimization of parabolic dish solar concentrator with a cavity hybrid receiver
NASA Astrophysics Data System (ADS)
Blázquez, R.; Carballo, J.; Silva, M.
2016-05-01
One of the main goals of the BIOSTIRLING-4SKA project, funded by the European Commission, is the development of a hybrid Dish-Stirling system based on a hybrid solar-gas receiver, which has been designed by the Swedish company Cleanergy. A ray tracing study, which is part of the design of this parabolic dish system, is presented in this paper. The study pursues the optimization of the concentrator and receiver cavity geometry according to the requirements of flux distribution on the receiver walls set by the designer of the hybrid receiver. The ray-tracing analysis has been performed with the open source software Tonatiuh, a ray-tracing tool specifically oriented to the modeling of solar concentrators.
Listening to the student voice to improve educational software
van Wyk, Mari; van Ryneveld, Linda
2017-01-01
Academics often develop software for teaching and learning purposes with the best of intentions, only to be disappointed by the low acceptance rate of the software by their students once it is implemented. In this study, the focus is on software that was designed to enable veterinary students to record their clinical skills. A pilot of the software clearly showed that the program had not been received as well as had been anticipated, and therefore the researchers used a group interview and a questionnaire with closed-ended and open-ended questions to obtain the students' feedback. The open-ended questions were analysed with conceptual content analysis, and themes were identified. Students made valuable suggestions about what they regarded as important considerations when a new software program is introduced. The most important lesson learnt was that students cannot always predict their needs accurately if they are asked for input prior to the development of software. For that reason student input should be obtained on a continuous and regular basis throughout the design and development phases.
Atmosphere Explorer control system software (version 2.0)
NASA Technical Reports Server (NTRS)
Mocarsky, W.; Villasenor, A.
1973-01-01
The Atmosphere Explorer Control System (AECS) was developed to provide automatic computer control of the Atmosphere Explorer spacecraft and experiments. The software performs several vital functions, such as issuing commands to the spacecraft and experiments, receiving and processing telemetry data, and allowing for extensive data processing by experiment analysis programs. The AECS was written for a 48K XEROX Data System Sigma 5 computer, and coexists in core with the XDS Real-time Batch Monitor (RBM) executive system. RBM is a flexible operating system designed for a real-time foreground/background environment, and hence is ideally suited for this application. Existing capabilities of RBM have been used as much as possible by AECS to minimize programming redundancy. The most important functions of the AECS are to send commands to the spacecraft and experiments, and to receive, process, and display telemetry data.
SDR implementation of the receiver of adaptive communication system
NASA Astrophysics Data System (ADS)
Skarzynski, Jacek; Darmetko, Marcin; Kozlowski, Sebastian; Kurek, Krzysztof
2016-04-01
The paper presents the software implementation of a receiver forming part of an adaptive communication system. The system is intended for communication with a satellite placed in a low Earth orbit (LEO). The ability to adapt is expected to increase the total amount of data transmitted from the satellite to the ground station. Depending on the signal-to-noise ratio (SNR) of the received signal, adaptive transmission is realized using different transmission modes, i.e., different modulation schemes (BPSK, QPSK, 8-PSK, and 16-APSK) and different convolutional code rates (1/2, 2/3, 3/4, 5/6, and 7/8). The receiver consists of a software-defined radio (SDR) module (National Instruments USRP-2920) and multithreaded reception software running on the Windows operating system. In order to increase the speed of signal processing, the software takes advantage of single-instruction, multiple-data (SIMD) instructions supported by the x86 processor architecture.
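The SNR-driven mode selection described above can be illustrated with a simple lookup table. The thresholds below are hypothetical (the paper does not publish its switching points); only the modulation schemes and code rates are taken from the abstract.

```python
# Illustrative SNR-to-mode lookup for an adaptive downlink.
MODES = [
    # (minimum SNR in dB -- assumed values, modulation, convolutional code rate)
    (18.0, "16-APSK", "7/8"),
    (14.0, "8-PSK",   "5/6"),
    (10.0, "QPSK",    "3/4"),
    (6.0,  "QPSK",    "2/3"),
    (0.0,  "BPSK",    "1/2"),
]

def select_mode(snr_db: float):
    """Pick the highest-throughput mode whose SNR threshold is met."""
    for threshold, modulation, code_rate in MODES:
        if snr_db >= threshold:
            return modulation, code_rate
    return "BPSK", "1/2"  # most robust fallback

print(select_mode(12.3))  # -> ('QPSK', '3/4')
```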
Giancarlo, R; Scaturro, D; Utro, F
2015-02-01
The prediction of the number of clusters in a dataset, in particular microarrays, is a fundamental task in biological data analysis, usually performed via validation measures. Unfortunately, it has received very little attention and in fact there is a growing need for software tools/libraries dedicated to it. Here we present ValWorkBench, a software library consisting of eleven well known validation measures, together with novel heuristic approximations for some of them. The main objective of this paper is to provide the interested researcher with the full software documentation of an open source cluster validation platform having the main features of being easily extendible in a homogeneous way and of offering software components that can be readily re-used. Consequently, the focus of the presentation is on the architecture of the library, since it provides an essential map that can be used to access the full software documentation, which is available at the supplementary material website [1]. The mentioned main features of ValWorkBench are also discussed and exemplified, with emphasis on software abstraction design and re-usability. A comparison with existing cluster validation software libraries, mainly in terms of the mentioned features, is also offered. It suggests that ValWorkBench is a much needed contribution to the microarray software development/algorithm engineering community. For completeness, it is important to mention that previous accurate algorithmic experimental analysis of the relative merits of each of the implemented measures [19,23,25], carried out specifically on microarray data, gives useful insights on the effectiveness of ValWorkBench for cluster validation to researchers in the microarray community interested in its use for the mentioned task. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Lehmann, Eldon D
2003-01-01
AIDA is a diabetes-computing program freely available at www.2aida.org on the Web. The software is intended to serve as an educational support tool and can be used by anyone who has an interest in diabetes, whether they be patients, relatives, health-care professionals, or students. In previous "Diabetes Information Technology & WebWatch" columns various indicators of usage of the AIDA program have been reviewed, and various comments from users of the software have been documented. The purpose of this column is to overview a proof-of-concept semi-automated analysis about why people are downloading the latest version of the AIDA educational diabetes program. AIDA permits the interactive simulation of plasma insulin and blood glucose profiles for teaching, demonstration, self-learning, and research purposes. It has been made freely available, without charge, on the Internet as a noncommercial contribution to continuing diabetes education. Since its launch in 1996 over 300,000 visits have been logged at the main AIDA Website-www.2aida.org-and over 60,000 copies of the AIDA program have been downloaded free-of-charge. This column documents the results of a semi-automated analysis of comments left by Website visitors while they were downloading the AIDA software, before they had a chance to use the program. The Internet-based survey methodology and semi-automated analysis were both found to be robust and reliable. Over a 5-month period (from October 3, 2001 to February 28, 2002) 400 responses were received. During the corresponding period 1,770 actual visits were made to the Website survey page-giving a response rate to this proof-of-concept study of 22.6%. Responses were received from participants in over 54 countries-with nearly half of these (n = 194; 48.5%) originating from the United States, United Kingdom, and Canada; 208 responses (52.0%) were received from patients with diabetes, 50 (12.5%) from doctors, 49 (12.3%) from relatives of patients, with fewer responses from students, diabetes educators, nurses, pharmacists, and other end users. The semi-automated analysis adopted for this study has re-affirmed the feasibility of using the Internet to obtain free-text comments, at no real cost, from a substantial number of medical software downloaders/users. The survey has also offered some insight into why members of the public continue to turn to the Internet for medical information. Furthermore it has provided useful information about why people are actually downloading the AIDA v4.3a interactive educational "virtual diabetes patient" simulator.
Remote software upload techniques in future vehicles and their performance analysis
NASA Astrophysics Data System (ADS)
Hossain, Irina
Updating software in vehicle Electronic Control Units (ECUs) will become a mandatory requirement for a variety of reasons, for example, to update or fix the functionality of an existing system, add new functionality, remove software bugs, or keep up with ITS infrastructure. Software modules of advanced vehicles can be updated using the Remote Software Upload (RSU) technique. RSU employs an infrastructure-based wireless communication technique in which the software supplier sends the software to the targeted vehicle via a roadside Base Station (BS). However, security is critically important in RSU to avoid any disasters due to malfunctions of the vehicle or to protect proprietary algorithms from hackers, competitors or people with malicious intent. In this thesis, a mechanism for secure software upload in advanced vehicles is presented which employs mutual authentication of the software provider and the vehicle using a pre-shared authentication key before sending the software. The software packets are sent encrypted with a secret key along with the Message Digest (MD). In order to increase the security level, it is proposed that the vehicle receive more than one copy of the software, along with the MD in each copy. The vehicle will install the new software only when it receives more than one identical copy of the software. In order to validate the proposition, analytical expressions for the average number of packet transmissions required for a successful software update are determined. Different cases are investigated depending on the vehicle's buffer size and verification methods. The analytical and simulation results show that it is sufficient to send two copies of the software to the vehicle to thwart any security attack while uploading the software. The above-mentioned unicast method for RSU is suitable when software needs to be uploaded to a single vehicle. Since multicasting is the most efficient method of group communication, updating software in an ECU of a large number of vehicles could benefit from it. However, like the unicast RSU, meeting the security requirements of multicast communication, i.e., authenticity, confidentiality and integrity of the transmitted software and access control of the group members, is challenging. In this thesis, an infrastructure-based mobile multicasting scheme for RSU in vehicle ECUs is proposed in which an ECU receives the software from a remote software distribution center using the roadside BSs as gateways. The Vehicular Software Distribution Network (VSDN) is divided into small regions administered by a Regional Group Manager (RGM). Two multicast Group Key Management (GKM) techniques are proposed based on the degree of trust in the BSs, named the Fully-trusted (FT) and Semi-trusted (ST) systems. Analytical models are developed to find the multicast session establishment latency and handover latency for these two protocols. The average latency to perform mutual authentication of the software vendor and a vehicle and to send the multicast session key by the software provider during multicast session initialization, and the handoff latency during a multicast session, are calculated. Analytical and simulation results show that the link establishment latency per vehicle of the proposed schemes is in the range of a few seconds, with the ST system requiring a few milliseconds more than the FT system. The handoff latency is also in the range of a few seconds, and in some cases the ST system requires less handoff time than the FT system.
Thus, it is possible to build an efficient GKM protocol without putting too much trust in the BSs.
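The acceptance rule described in the thesis, installing the update only when more than one identical, digest-verified copy is received, can be sketched in a few lines. This is a minimal illustration assuming SHA-256 as the message digest; the function and variable names are hypothetical, not the thesis implementation.

```python
import hashlib

def digest(payload: bytes) -> str:
    # SHA-256 stands in for the generic Message Digest (MD) in the thesis.
    return hashlib.sha256(payload).hexdigest()

def accept_software(copies):
    """Accept an update only if two or more received copies are identical.

    copies: list of (payload_bytes, reported_digest) tuples as received.
    Returns the payload to install, or None if no two verified copies agree.
    """
    verified = [p for p, md in copies if digest(p) == md]
    for i, p in enumerate(verified):
        if any(p == q for q in verified[i + 1:]):
            return p
    return None

update = b"ecu-firmware-v2"
received = [(update, digest(update)), (update, digest(update))]
assert accept_software(received) == update
```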
Assessing the Effects of Multi-Node Sensor Network Configurations on the Operational Tempo
2014-09-01
The LPISimNet software tool provides the capability to quantify the performance of sensor network configurations. In the receiver model, nP denotes the noise power of the receiver and iL the implementation loss of the receiver due to hardware manufacturing.
Avionics Simulation, Development and Software Engineering
NASA Technical Reports Server (NTRS)
Francis, Ronald C.; Settle, Gray; Tobbe, Patrick A.; Kissel, Ralph; Glaese, John; Blanche, Jim; Wallace, L. D.
2001-01-01
This monthly report summarizes the work performed under contract NAS8-00114 for Marshall Space Flight Center in the following tasks: 1) Purchase Order No. H-32831D, Task Order 001A, GPB Program Software Oversight; 2) Purchase Order No. H-32832D, Task Order 002, ISS EXPRESS Racks Software Support; 3) Purchase Order No. H-32833D, Task Order 003, SSRMS Math Model Integration; 4) Purchase Order No. H-32834D, Task Order 004, GPB Program Hardware Oversight; 5) Purchase Order No. H-32835D, Task Order 005, Electrodynamic Tether Operations and Control Analysis; 6) Purchase Order No. H-32837D, Task Order 007, SRB Command Receiver/Decoder; and 7) Purchase Order No. H-32838D, Task Order 008, AVGS/DART SW and Simulation Support
Development of a Multi-frequency Interferometer Telescope for Radio Astronomy (MITRA)
NASA Astrophysics Data System (ADS)
Ingala, Dominique Guelord Kumamputu
2015-03-01
This dissertation describes the development and construction of the Multi-frequency Interferometer Telescope for Radio Astronomy (MITRA) at the Durban University of Technology. The MITRA station consists of 2 antenna arrays separated by a baseline distance of 8 m. Each array consists of 8 Log-Periodic Dipole Antennas (LPDAs) operating from 200 MHz to 800 MHz. The design and construction of the LPDA antenna and receiver system is described. The receiver topology provides an equivalent noise temperature of 113.1 K and 55.1 dB of gain. The Intermediate Frequency (IF) stage was designed to produce a fixed IF frequency of 800 MHz. The digital back-end and correlator were implemented using a low-cost Software Defined Radio (SDR) platform and GNU Radio software. GNU Octave was used for data analysis to generate the relevant received-signal parameters, including the total power and the real, imaginary, magnitude and phase components. Measured results show that interference fringes were successfully detected within the bandwidth of the receiver using a Radio Frequency (RF) generator as a simulated source. This research was presented at the IEEE Africon 2013 / URSI Session Mauritius, and published in the proceedings.
NASA Technical Reports Server (NTRS)
Wennersten, Miriam Dvorak; Banes, Anthony Vince; Boegner, Gregory J.; Dougherty, Lamar; Edwards, Bernard L.; Roman, Joseph; Bauer, Frank H. (Technical Monitor)
2001-01-01
NASA Goddard Space Flight Center has built an open architecture, 24 channel space flight GPS receiver. The CompactPCI PiVoT GPS receiver card is based on the Mitel/GEC Plessey Builder-2 board. PiVoT uses two Plessey 2021 correlators to allow tracking of up to 24 separate GPS SV's on unique channels. Its four front ends can support four independent antennas, making it a useful card for hosting GPS attitude determination algorithms. It has been built using space quality, radiation tolerant parts. The PiVoT card will track a weaker signal than the original Builder 2 board. It also hosts an improved clock oscillator. The PiVoT software is based on the original Plessey Builder 2 software ported to the Linux operating system. The software is POSIX compliant and can easily be converted to other POSIX operating systems. The software is open source to anyone with a licensing agreement with Plessey. Additional tasks can be added to the software to support GPS science experiments or attitude determination algorithms. The next generation PiVoT receiver will be a single radiation hardened CompactPCI card containing the microprocessor and the GPS receiver, optimized for use above the GPS constellation. PiVoT was flown successfully on a balloon in July 2001, for its first non-simulated flight.
NASA Astrophysics Data System (ADS)
Markov, N. G.; Vasilyeva, E.; Evsyutkin, I. V.
2017-01-01
An intelligent information system for the management of geological and technical arrangements during oil field exploitation has been developed. The service-oriented architecture of its software is a distinctive feature of the system. The results of a cluster analysis of real field data received by means of this system are shown.
77 FR 65550 - Compete, Inc.; Analysis of Proposed Consent Order To Aid Public Comment
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-29
... Public Reference Room, Room 130-H, 600 Pennsylvania Avenue NW., Washington, DC 20580, either in person or... interested persons. Comments received during this period will become part of the public record. After thirty... proposed order. Compete develops software for tracking consumers as they shop, browse and interact with...
ERIC Educational Resources Information Center
Basile, Anthony; D'Aquila, Jill M.
2002-01-01
Accounting students received either traditional instruction (n=46) or used computer-mediated communication and WebCT course management software. There were no significant differences in attitudes about the course. However, computer users were more positive about course delivery and course management tools. (Contains 17 references.) (SK)
Ground station software for receiving and handling Irecin telemetry data
NASA Astrophysics Data System (ADS)
Ferrante, M.; Petrozzi, M.; Di Ciolo, L.; Ortenzi, A.; Troso, G
2004-11-01
The onboard resources needed to perform mission tasks are very limited in nanosatellites. This paper proposes a software system to receive, manage and process in real time the telemetry data coming from the IRECIN nanosatellite and to transmit operator manual commands and operative procedures. During the receiving phase, it shows the IRECIN subsystem physical values, visualizes the IRECIN attitude, and performs other suitable functions. The IRECIN Ground Station program is in charge of exchanging information between IRECIN and the ground segment. During the IRECIN transmission phase it carries out, in real time, IRECIN attitude drawing, sun direction drawing, the power received from the Sun, visualization of the telemetry data, visualization of the Earth magnetic field and several other functions. The received data are stored and interpreted by a parser module and distributed to the appropriate modules. Moreover, it allows sending manual and automatic commands. Manual commands are delivered by an operator, while automatic commands are provided by pre-configured operative procedures. Operative procedure development is carried out in a previous phase called the configuration phase. This program is also in charge of carrying out test sessions by means of the scheduler and commanding modules, allowing execution of specific tasks without operator control. A log module to store received and transmitted data is provided. An offline phase to analyze, filter and visualize the collected data, called post-analysis, is based on data extraction from the log module. At the same time, the Ground Station software can work over a network, allowing data and commands to be managed, received and sent from different sites. The proposed system constitutes the software of the IRECIN Ground Station. IRECIN is a modular nanosatellite weighing less than 2 kg, constituted by sixteen external sides with surface-mounted solar cells and three internal Al plates, kept together by four steel bars. Lithium-ion batteries are used. Attitude is determined by two three-axis magnetometers and the solar panel data. Control is provided by an active magnetic control system. The spacecraft will be spin-stabilized with the spin axis normal to the orbit. All IRECIN electronic components use SMD technology in order to reduce weight and size. The electronic boards were developed, realized and tested at Vitrociset S.p.A. under the control of the Research and Development Group.
NASA Astrophysics Data System (ADS)
Boettcher, M. A.; Butt, B. M.; Klinkner, S.
2016-10-01
A major concern of a university satellite mission is downloading the payload and telemetry data from the satellite. While ground station antennas are in general easy to procure with limited effort, the receiving unit is most certainly not. The flexible and low-cost software-defined radio (SDR) transceiver "BladeRF" is used to receive the QPSK-modulated and CCSDS-compliant coded data of a satellite in the HAM radio S-band. The control software is based on the open source program GNU Radio, which is also used to perform CCSDS post-processing of the binary bit stream. The test results show a good performance of the receiving system.
Sensible heat receiver for solar dynamic space power system
NASA Astrophysics Data System (ADS)
Perez-Davis, Marla E.; Gaier, James R.; Petrefski, Chris
A sensible heat receiver is considered which uses a vapor grown carbon fiber-carbon (VGCF/C) composite as the thermal storage medium and which was designed for a 7-kW Brayton engine. This heat receiver stores the required energy to power the system during eclipse in the VGCF/C composite. The heat receiver thermal analysis was conducted through the Systems Improved Numerical Differencing Analyzer and Fluid Integrator (SINDA) software package. The sensible heat receiver compares well with other latent and advanced sensible heat receivers analyzed in other studies, while avoiding the problems associated with latent heat storage salts and liquid metal heat pipes. The concept also satisfies the design requirements for a 7-kW Brayton engine system. The weight and size of the system can be optimized by changes in geometry and technology advances for this new material.
Sensible heat receiver for solar dynamic space power system
NASA Technical Reports Server (NTRS)
Perez-Davis, Marla E.; Gaier, James R.; Petrefski, Chris
1991-01-01
A sensible heat receiver considered in this study uses a vapor grown carbon fiber-carbon (VGCF/C) composite as the thermal storage medium and was designed for a 7 kW Brayton engine. The proposed heat receiver stores the required energy to power the system during eclipse in the VGCF/C composite. The heat receiver thermal analysis was conducted through the Systems Improved Numerical Differencing Analyzer and Fluid Integrator (SINDA) software package. The sensible heat receiver compares well with other latent and advanced sensible heat receivers analyzed in other studies while avoiding the problems associated with latent heat storage salts and liquid metal heat pipes. The concept also satisfies the design requirements for a 7 kW Brayton engine system. The weight and size of the system can be optimized by changes in geometry and technology advances for this new material.
Sensible heat receiver for solar dynamic space power system
NASA Technical Reports Server (NTRS)
Perez-Davis, Marla E.; Gaier, James R.; Petrefski, Chris
1991-01-01
A sensible heat receiver is considered which uses a vapor grown carbon fiber-carbon (VGCF/C) composite as the thermal storage medium and which was designed for a 7-kW Brayton engine. This heat receiver stores the required energy to power the system during eclipse in the VGCF/C composite. The heat receiver thermal analysis was conducted through the Systems Improved Numerical Differencing Analyzer and Fluid Integrator (SINDA) software package. The sensible heat receiver compares well with other latent and advanced sensible heat receivers analyzed in other studies, while avoiding the problems associated with latent heat storage salts and liquid metal heat pipes. The concept also satisfies the design requirements for a 7-kW Brayton engine system. The weight and size of the system can be optimized by changes in geometry and technology advances for this new material.
Business Intelligence Applied to the ALMA Software Integration Process
NASA Astrophysics Data System (ADS)
Zambrano, M.; Recabarren, C.; González, V.; Hoffstadt, A.; Soto, R.; Shen, T.-C.
2012-09-01
Software quality assurance and planning of an astronomy project is a complex task, especially if it is a distributed collaborative project such as ALMA, where the development centers are spread across the globe. When you execute a software project there is much valuable information about the process itself that you might be able to collect. One of the ways you can receive this input is via an issue tracking system that gathers the problem reports relative to software bugs captured during the testing of the software, during the integration of the different components or, even worse, problems that occurred during production time. Usually, little time is spent on analyzing them, but with some multidimensional processing you can extract valuable information from them, and it might help you with long-term planning and resource allocation. We present an analysis of the information collected at ALMA from a collection of key unbiased indicators. We describe here the extraction, transformation and load process and how the data were processed. The main goal is to assess a software process and get insights from this information.
A reprogrammable receiver architecture for wireless signal interception
NASA Astrophysics Data System (ADS)
Yao, Timothy S.
2003-09-01
In this paper, a re-programmable receiver architecture, based on the software-defined-radio concept, for wireless signal interception is presented. The radio-frequency (RF) signal that the receiver would like to intercept may come from a terrestrial cellular network or communication satellites, whose carrier frequencies are in the range from 800 MHz (civilian mobile) to 15 GHz (Ku band). To intercept signals from such a wide range of frequencies across these varied communication systems, the traditional way is to deploy multiple receivers to scan and detect the desired signal. This traditional approach is unattractive in terms of cost, efficiency, and accuracy. Instead, we propose a universal receiver, which is software-driven and re-configurable, to intercept signals of interest. The software-defined-radio based receiver first intercepts RF energy over a wide spectrum (25 MHz) through the antenna, performs zero-IF down-conversion (homodyne architecture) to baseband, and digitally channelizes the baseband signal. The channelization module is a bank of high performance digital filters. The bandwidth of the filter bank is programmable according to the wireless communication protocol under watch. In the baseband processing, high-performance digital signal processors carry out the detection process and microprocessors handle the communication protocols. The baseband processing is also re-configurable for different wireless standards and protocols. The advantages of the software-defined-radio architecture over a traditional RF receiver make it a favorable technology for communication signal interception and surveillance.
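The programmable channelization step can be illustrated conceptually: shift the channel of interest to 0 Hz and low-pass filter it to the programmed bandwidth. The sketch below is a simplified single-channel stand-in for the paper's filter-bank architecture (a production channelizer would normally use a polyphase filter bank); the sample rate, channel center and bandwidth are assumed values.

```python
import numpy as np
from scipy import signal

fs = 25e6  # assumed complex baseband sample rate after zero-IF down-conversion

def channelize(x, center_hz, bandwidth_hz, num_taps=129):
    """Extract one channel from a complex baseband capture.

    Mix the channel of interest down to 0 Hz, then apply a low-pass FIR
    filter whose cutoff matches the programmed channel bandwidth.
    """
    n = np.arange(len(x))
    shifted = x * np.exp(-2j * np.pi * center_hz * n / fs)
    taps = signal.firwin(num_taps, bandwidth_hz / 2, fs=fs)
    return signal.lfilter(taps, 1.0, shifted)

# Example: pull a hypothetical 200 kHz-wide channel centered at +3 MHz.
t = np.arange(int(1e-3 * fs)) / fs
capture = np.exp(2j * np.pi * 3e6 * t) + 0.1 * np.random.randn(t.size)
chan = channelize(capture, center_hz=3e6, bandwidth_hz=200e3)
```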
Filtering Essays by Means of a Software Tool: Identifying Poor Essays
ERIC Educational Resources Information Center
Seifried, Eva; Lenhard, Wolfgang; Spinath, Birgit
2017-01-01
Writing essays and receiving feedback can be useful for fostering students' learning and motivation. When faced with large class sizes, it is desirable to identify students who might particularly benefit from feedback. In this article, we tested the potential of Latent Semantic Analysis (LSA) for identifying poor essays. A total of 14 teaching…
Idea Paper: The Lifecycle of Software for Scientific Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dubey, Anshu; McInnes, Lois C.
The software lifecycle is a well researched topic that has produced many models to meet the needs of different types of software projects. However, one class of projects, software development for scientific computing, has received relatively little attention from lifecycle researchers. In particular, software for end-to-end computations for obtaining scientific results has received few lifecycle proposals and no formalization of a development model. An examination of development approaches employed by the teams implementing large multicomponent codes reveals a great deal of similarity in their strategies. This idea paper formalizes these related approaches into a lifecycle model for end-to-end scientific application software, featuring loose coupling between submodels for development of infrastructure and scientific capability. We also invite input from stakeholders to converge on a model that captures the complexity of this development process and provides needed lifecycle guidance to the scientific software community.
NASA Astrophysics Data System (ADS)
Davis, J. L.; Elosegui, P.; Nettles, M.
2012-12-01
Single-frequency GNSS data have not generally been used for high-accuracy geodetic applications since the 1990s, but there are significant advantages if single-frequency GNSS receivers can be usefully deployed for studies of fast-moving outlet glaciers. The cost of these receivers is significantly lower (~50%) than for dual-frequency receivers, a significant benefit given the high spatial density at which these systems are deployed on the glacier and the high risk of damage or loss in the glacial environment. In addition, the size of the data files that need to be transferred from extremely remote locations, often at very slow transmission rates, is significantly reduced. Consideration of single-frequency systems for this application is viable because of the relatively small extent (< 50 km) of the entire network to be deployed. Unfortunately, the availability of research-quality software that can perform kinematic solutions on single-frequency data is limited. We have developed the BAKAR software, which employs a stochastic filter to analyze single-frequency GNSS data. The software can implement a range of stochastic models for time-dependent site position. In this presentation, we describe the BAKAR software and discuss its strengths and weaknesses. On one hand, chief among the challenges we have encountered are the determination of accurate prior positions and bursts of polar ionospheric activity that impede cycle-slip detection, even over intersite distances as short as 10 km. On the other hand, use of a single-frequency observable is theoretically less sensitive to multipath and signal scattering. We will quantitatively assess these effects and assess the accuracy of BAKAR in a range of situations and applications.
Formal methods demonstration project for space applications
NASA Technical Reports Server (NTRS)
Divito, Ben L.
1995-01-01
The Space Shuttle program is cooperating in a pilot project to apply formal methods to live requirements analysis activities. As one of the larger ongoing Shuttle Change Requests (CRs), the Global Positioning System (GPS) CR involves a significant upgrade to the Shuttle's navigation capability. Shuttles are to be outfitted with GPS receivers and the primary avionics software will be enhanced to accept GPS-provided positions and integrate them into navigation calculations. Prior to implementing the CR, requirements analysts at Loral Space Information Systems, the Shuttle software contractor, must scrutinize the CR to identify and resolve any requirements issues. We describe an ongoing task of the Formal Methods Demonstration Project for Space Applications whose goal is to find an effective way to use formal methods in the GPS CR requirements analysis phase. This phase is currently under way and a small team from NASA Langley, ViGYAN Inc. and Loral is now engaged in this task. Background on the GPS CR is provided and an overview of the hardware/software architecture is presented. We outline the approach being taken to formalize the requirements, only a subset of which is being attempted. The approach features the use of the PVS specification language to model 'principal functions', which are major units of Shuttle software. Conventional state machine techniques form the basis of our approach. Given this background, we present interim results based on a snapshot of work in progress. Samples of requirements specifications rendered in PVS are offered for illustration. We walk through a specification sketch for the principal function known as GPS Receiver State Processing. Results to date are summarized and feedback from Loral requirements analysts is highlighted. Preliminary data are shown comparing issues detected by the formal methods team versus those detected using existing requirements analysis methods. We conclude by discussing our plan to complete the remaining activities of this task.
Open source software in a practical approach for post processing of radiologic images.
Valeri, Gianluca; Mazza, Francesco Antonino; Maggi, Stefania; Aramini, Daniele; La Riccia, Luigi; Mazzoni, Giovanni; Giovagnoni, Andrea
2015-03-01
The purpose of this paper is to evaluate the use of open source software (OSS) to process DICOM images. We selected 23 programs for Windows and 20 programs for Mac from 150 possible OSS programs, including DICOM viewers and various tools (converters, DICOM header editors, etc.). The programs selected all meet basic requirements such as free availability, stand-alone operation, the presence of a graphical user interface, ease of installation and advanced features beyond simple image display. The data import, data export, metadata, 2D viewer, 3D viewer, supported platform and usability capabilities of each selected program were evaluated on a scale ranging from 1 to 10 points. Twelve programs received a score higher than or equal to eight. Among them, five obtained a score of 9: 3D Slicer, MedINRIA, MITK 3M3, VolView, VR Render; while OsiriX received 10. OsiriX appears to be the only program able to perform all the operations taken into consideration, similar to a workstation equipped with proprietary software, allowing the analysis and interpretation of images in a simple and intuitive way. OsiriX is a DICOM PACS workstation for medical imaging and software for image processing for medical research, functional imaging, 3D imaging, confocal microscopy and molecular imaging. This application is also a good tool for teaching activities because it facilitates the attainment of learning objectives among students and other specialists.
An Exploration of Software-Based GNSS Signal Processing at Multiple Frequencies
NASA Astrophysics Data System (ADS)
Pasqual Paul, Manuel; Elosegui, Pedro; Lind, Frank; Vazquez, Antonio; Pankratius, Victor
2017-01-01
The Global Navigation Satellite System (GNSS; i.e., GPS, GLONASS, Galileo, and other constellations) has recently grown into numerous areas that go far beyond the traditional scope in navigation. In the geosciences, for example, high-precision GPS has become a powerful tool for a myriad of geophysical applications such as in geodynamics, seismology, paleoclimate, cryosphere, and remote sensing of the atmosphere. Positioning with millimeter-level accuracy can be achieved through carrier-phase-based, multi-frequency signal processing, which mitigates various biases and error sources such as those arising from ionospheric effects. Today, however, most receivers with multi-frequency capabilities are highly specialized hardware receiving systems with proprietary and closed designs, limited interfaces, and significant acquisition costs. This work explores alternatives that are entirely software-based, using Software-Defined Radio (SDR) receivers as a way to digitize the entire spectrum of interest. It presents an overview of existing open-source frameworks and outlines the next steps towards converting GPS software receivers from single-frequency to dual-frequency, geodetic-quality systems. In the future, this development will lead to a more flexible multi-constellation GNSS processing architecture that can be easily reused in different contexts, as well as to further miniaturization of receivers.
Multimedia Software Evaluation Form for Teachers
ERIC Educational Resources Information Center
Herring, Donna F.; Notar, Charles E.; Wilson, Janell D.
2005-01-01
Schools are currently receiving increased funds for multimedia software for classrooms. There is a need for good software in the schools, and there is a need to know how to evaluate software and not naively rely on advertisements. Evaluators of multimedia software for education must have the skills to critically evaluate and make decisions not…
Software package for performing experiments about the convolutionally encoded Voyager 1 link
NASA Technical Reports Server (NTRS)
Cheng, U.
1989-01-01
A software package enabling engineers to conduct experiments to determine the actual performance of long constraint-length convolutional codes over the Voyager 1 communication link directly from the Jet Propulsion Laboratory (JPL) has been developed. Using this software, engineers are able to enter test data from the Laboratory in Pasadena, California. The software encodes the data and then sends the encoded data to a personal computer (PC) at the Goldstone Deep Space Complex (GDSC) over telephone lines. The encoded data are sent to the transmitter by the PC at GDSC. The received data, after being echoed back by Voyager 1, are first sent to the PC at GDSC, and then are sent back to the PC at the Laboratory over telephone lines for decoding and further analysis. All of these operations are fully integrated and are completely automatic. Engineers can control the entire software system from the Laboratory. The software encoder and the hardware decoder interface were developed for other applications, and have been modified appropriately for integration into the system so that their existence is transparent to the users. This software provides: (1) data entry facilities, (2) communication protocol for telephone links, (3) data displaying facilities, (4) integration with the software encoder and the hardware decoder, and (5) control functions.
Center for Space Telemetering and Telecommunications Systems, New Mexico State University
NASA Technical Reports Server (NTRS)
Horan, Stephen; DeLeon, Phillip; Borah, Deva; Lyman, Ray
2002-01-01
This viewgraph presentation gives an overview of the Center for Space Telemetering and Telecommunications Systems activities at New Mexico State University. Presentations cover the following topics: (1) small satellite communications, including nanosatellite radio and virtual satellite development; (2) modulation and detection studies, including details on smooth phase interpolated keying (SPIK) spectra and highlights of an adaptive turbo multiuser detector; (3) decoupled approaches to nonlinear ISI compensation; (4) space internet testing; (5) optical communication; (6) Linux-based receiver for lightweight optical communications without a laser in space, including software design, performance analysis, and the receiver algorithm; (7) carrier tracking hardware; and (8) subband transforms for adaptive direct sequence spread spectrum receivers.
Barth, Martin; Weiß, Christel; Brenke, Christopher; Schmieder, Kirsten
2017-04-01
Software-based planning of a spinal implant carries the promise of precision and superior results. The purpose of the study was to analyze the measurement reliability, prognostic value, and scientific use of a surgical planning software in patients receiving anterior cervical discectomy and fusion (ACDF). Lateral neutral, flexion, and extension radiographs of patients receiving tailored cages as suggested by the planning software were available for analysis. Differences of vertebral wedging angles and segmental height of all cervical segments were determined at different timepoints using intraclass correlation coefficients (ICC). Cervical lordosis (C2/C7), segmental heights, global, and segmental range of motion (ROM) were determined at different timepoints. Clinical and radiological variables were correlated 12 months after surgery. 282 radiographs of 35 patients with a mean age of 53.1 ± 12.0 years were analyzed. Measurement of segmental height was highly accurate, with an ICC near 1, but angle measurements showed low ICC values. Likewise, the ICCs of the prognosticated values were low. Postoperatively, there was a significant decrease of segmental height (p < 0.0001) and loss of C2/C7 ROM (p = 0.036). ROM of unfused segments also significantly decreased (p = 0.016). A high NDI was associated with low subsidence rates. The surgical planning software showed high accuracy in the measurement of height differences and lower accuracy in angle measurements. Both the prognosticated height and angle values were arbitrary. Global ROM and the ROM of the fused and intact segments are restricted after ACDF.
A Post-Processing Receiver for the Lunar Laser Communications Demonstration Project
NASA Technical Reports Server (NTRS)
Srinivasan, Meera; Birnbaum, Kevin; Cheng, Michael; Quirk, Kevin
2013-01-01
The Lunar Laser Communications Demonstration Project undertaken by MIT Lincoln Laboratory and NASA's Goddard Space Flight Center will demonstrate high-rate laser communications from lunar orbit to the Earth. NASA's Jet Propulsion Laboratory is developing a backup ground station supporting a data rate of 39 Mbps that is based on a non-real-time software post-processing receiver architecture. This approach entails processing sample-rate-limited data without feedback, in the presence of high uncertainty in downlink clock characteristics, under low signal flux conditions. In this paper we present a receiver concept that addresses these challenges, with descriptions of the photodetector assembly, sample acquisition and recording platform, and signal processing approach. End-to-end coded simulation and laboratory data analysis results are presented that validate the receiver conceptual design.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gharibyan, N.
In order to fully characterize the NIF neutron spectrum, the SAND-II-SNL software was requested and received from the Radiation Safety Information Computational Center. The software is designed to determine the neutron energy spectrum through analysis of experimental activation data. However, given that the source code was developed on a Sparcstation 10, it is not compatible with current versions of FORTRAN. Accounts have been established through the Lawrence Livermore National Laboratory's High Performance Computing facilities in order to access different compilers for FORTRAN (e.g. pgf77, pgf90). Additionally, several of the subroutines included in the SAND-II-SNL package have required debugging efforts to allow for proper compiling of the code.
Jaime-Pérez, José Carlos; Jiménez-Castillo, Raúl Alberto; Vázquez-Hernández, Karina Elizabeth; Salazar-Riojas, Rosario; Méndez-Ramírez, Nereida; Gómez-Almaguer, David
2017-10-01
Advances in automated cell separators have improved the efficiency of plateletpheresis and the possibility of obtaining double products (DP). We assessed the cell processor's accuracy in predicting platelet (PLT) yields with the goal of better predicting DP collections. This retrospective proof-of-concept study included 302 plateletpheresis procedures performed on a Trima Accel v6.0 at the apheresis unit of a hematology department. Donor variables, software-predicted yield and actual PLT yield were statistically evaluated. The software prediction was optimized by linear regression analysis and its optimal cut-off to obtain a DP assessed by receiver operating characteristic (ROC) curve modeling. Three hundred and two plateletpheresis procedures were performed; on 271 (89.7%) occasions the donors were men and on 31 (10.3%) women. Pre-donation PLT count had the best direct correlation with actual PLT yield (r = 0.486, P < .001). Mean software machine-derived values differed significantly from the actual PLT yield, 4.72 × 10¹¹ vs. 6.12 × 10¹¹, respectively (P < .001). The following equation was developed to adjust these values: actual PLT yield = 0.221 + (1.254 × theoretical platelet yield). The ROC curve model showed an optimal apheresis device software prediction cut-off of 4.65 × 10¹¹ to obtain a DP, with a sensitivity of 82.2%, specificity of 93.3%, and an area under the curve (AUC) of 0.909. The Trima Accel v6.0 software consistently underestimated PLT yields. A simple correction derived from linear regression analysis accurately corrected this underestimation, and ROC analysis identified a precise cut-off to reliably predict a DP. © 2016 Wiley Periodicals, Inc.
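The regression correction and ROC-derived cut-off reported in the abstract translate directly into a small helper. The coefficients (0.221, 1.254) and the 4.65 × 10¹¹ threshold come from the study itself; the wrapper functions below are only an illustrative sketch, with yields expressed in units of 10¹¹ platelets.

```python
def corrected_platelet_yield(predicted_yield_e11: float) -> float:
    """Apply the study's linear correction to the device-predicted yield."""
    return 0.221 + 1.254 * predicted_yield_e11

def predicts_double_product(predicted_yield_e11: float,
                            cutoff_e11: float = 4.65) -> bool:
    """ROC-derived threshold on the device's predicted yield."""
    return predicted_yield_e11 >= cutoff_e11

print(corrected_platelet_yield(4.72))  # ~6.14, close to the observed mean 6.12
print(predicts_double_product(4.72))   # True
```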
NASA Technical Reports Server (NTRS)
Roelfsema, P. R.; Kester, D. J. M.; Wesselius, P. R.; Wieprech, E.; Sym, N.
1992-01-01
The software which is currently being developed for the Short Wavelength Spectrometer (SWS) of the Infrared Space Observatory (ISO) is described. The spectrometer has a wide range of capabilities in the 2-45 micron infrared band. SWS contains two independent gratings, one for the long and one for the short wavelength section of the band. With the gratings a spectral resolution of approximately 1000 to approximately 2500 can be obtained. The instrument also contains two Fabry-Perault's yielding a resolution between approximately 1000 and approximately 20000. Software is currently being developed for the acquisition, calibration, and analysis of SWS data. The software is firstly required to run in a pipeline mode without human interaction, to process data as they are received from the telescope. However, both for testing and calibration of the instrument as well as for evaluation of the planned operating procedures the software should also be suitable for interactive use. Thirdly the same software will be used for long term characterization of the instrument. The software must work properly within the environment designed by the European Space Agency (ESA) for the spacecraft operations. As a result strict constraints are put on I/O devices, throughput etc.
Thompson, J; Hogg, P; Thompson, S; Manning, D; Szczepura, K
2012-01-01
ROCView has been developed as an image display and response capture (IDRC) solution to image display and consistent recording of reader responses in relation to the free-response receiver operating characteristic paradigm. A web-based solution to IDRC for observer response studies allows observations to be completed from any location, assuming that display performance and viewing conditions are consistent with the study being completed. The simplistic functionality of the software allows observations to be completed without supervision. ROCView can display images from multiple modalities, in a randomised order if required. Following registration, observers are prompted to begin their image evaluation. All data are recorded via mouse clicks, one to localise (mark) and one to score confidence (rate) using either an ordinal or continuous rating scale. Up to nine “mark-rating” pairs can be made per image. Unmarked images are given a default score of zero. Upon completion of the study, both true-positive and false-positive reports can be downloaded and adapted for analysis. ROCView has the potential to be a useful tool in the assessment of modality performance difference for a range of imaging methods.
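The per-image response model described above (at most nine mark-rating pairs, with unmarked images defaulting to a score of zero) can be captured with a small data structure. The class and field names below are hypothetical, not ROCView's internal representation.

```python
from dataclasses import dataclass, field
from typing import List

MAX_MARKS_PER_IMAGE = 9  # limit described in the abstract

@dataclass
class MarkRating:
    x: int             # click location in pixels
    y: int
    confidence: float  # rating on an ordinal or continuous scale

@dataclass
class ImageResponse:
    image_id: str
    marks: List[MarkRating] = field(default_factory=list)

    def add_mark(self, x: int, y: int, confidence: float) -> None:
        if len(self.marks) >= MAX_MARKS_PER_IMAGE:
            raise ValueError("no more than nine mark-rating pairs per image")
        self.marks.append(MarkRating(x, y, confidence))

    def score(self) -> float:
        # Unmarked images receive the default score of zero.
        return max((m.confidence for m in self.marks), default=0.0)
```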
Malware detection and analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chiang, Ken; Lloyd, Levi; Crussell, Jonathan
Embodiments of the invention describe systems and methods for malicious software detection and analysis. A binary executable comprising obfuscated malware on a host device may be received, and incident data indicating a time when the binary executable was received and identifying processes operating on the host device may be recorded. The binary executable is analyzed via a scalable plurality of execution environments, including one or more non-virtual execution environments and one or more virtual execution environments, to generate runtime data and deobfuscation data attributable to the binary executable. At least some of the runtime data and deobfuscation data attributable to the binary executable is stored in a shared database, while at least some of the incident data is stored in a private, non-shared database.
Detection of faults and software reliability analysis
NASA Technical Reports Server (NTRS)
Knight, John C.
1987-01-01
Multi-version or N-version programming is proposed as a method of providing fault tolerance in software. The approach requires the separate, independent preparation of multiple versions of a piece of software for some application. These versions are executed in parallel in the application environment; each receives identical inputs and each produces its version of the required outputs. The outputs are collected by a voter and, in principle, they should all be the same. In practice there may be some disagreement. If this occurs, the results of the majority are taken to be the correct output, and that is the output used by the system. A total of 27 programs were produced. Each of these programs was then subjected to one million randomly-generated test cases. The experiment yielded a number of programs containing faults that are useful for general studies of software reliability as well as studies of N-version programming. Fault tolerance through data diversity and analytic models of comparison testing are discussed.
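The voter described above, which takes the majority output from the independently developed versions, can be sketched in a few lines. This is a minimal illustration of the voting rule, not the experiment's actual harness, and the fallback behaviour on disagreement is an assumption.

```python
from collections import Counter

def majority_vote(outputs):
    """Return the majority output from N independent program versions.

    outputs: one result per version for the same input (hashable values).
    Returns None when no value is produced by a strict majority; a real
    system would then raise an error or fall back to a safe state.
    """
    value, count = Counter(outputs).most_common(1)[0]
    return value if count > len(outputs) // 2 else None

# Example: three versions, one disagreeing.
print(majority_vote([42, 42, 41]))  # -> 42
```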
Overview of Hazard Assessment and Emergency Planning Software of Use to RN First Responders
DOE Office of Scientific and Technical Information (OSTI.GOV)
Waller, E; Millage, K; Blakely, W F
2008-08-26
There are numerous software tools available for field deployment, reach-back, training and planning use in the event of a radiological or nuclear (RN) terrorist event. Specialized software tools used by CBRNe responders can increase the information available and the speed and accuracy of the response, thereby ensuring that radiation doses to responders, receivers, and the general public are kept as low as reasonably achievable. Software designed to provide health care providers with assistance in selecting appropriate countermeasures or therapeutic interventions in a timely fashion can improve the potential for positive patient outcomes. This paper reviews various software applications of relevance to radiological and nuclear (RN) events that are currently in use by first responders, emergency planners, medical receivers, and criminal investigators.
Advanced sensible heat solar receiver for space power
NASA Technical Reports Server (NTRS)
Bennett, Timothy J.; Lacy, Dovie E.
1988-01-01
NASA Lewis, through in-house efforts, has begun a study to generate a conceptual design of a sensible heat solar receiver and to determine the feasibility of such a system for space power applications. The sensible heat solar receiver generated in this study uses pure lithium as the thermal storage medium and was designed for a 7 kWe Brayton (PCS) operating at 1100 K. The receiver consists of two stages interconnected via temperature sensing variable conductance sodium heat pipes. The lithium is contained within a niobium vessel and the outer shell of the receiver is constructed of third generation rigid, fibrous ceramic insulation material. Reradiation losses are controlled with niobium and aluminum shields. By nature of design, the sensible heat receiver generated in this study is comparable in both size and mass to a latent heat system of similar thermal capacitance. The heat receiver design and thermal analysis was conducted through the combined use of PATRAN, SINDA, TRASYS, and NASTRAN software packages.
Advanced sensible heat solar receiver for space power
NASA Technical Reports Server (NTRS)
Bennett, Timothy J.; Lacy, Dovie E.
1988-01-01
NASA Lewis, through in-house efforts, has begun a study to generate a conceptual design of a sensible heat solar receiver and to determine the feasibility of such a system for space power applications. The sensible heat solar receiver generated in this study uses pure lithium as the thermal storage medium and was designed for a 7 kWe Brayton (PCS) operating at 1100 K. The receiver consists of two stages interconnected via temperature sensing variable conductance sodium heat pipes. The lithium is contained within a niobium vessel and the outer shell of the receiver is constructed of third generation rigid, fibrous ceramic insulation material. Reradiation losses are controlled with niobium and aluminum shields. By nature of design, the sensible heat receiver generated in this study is comparable in both size and mass to a latent heat system of similar thermal capacitance. The heat receiver design and thermal analysis were conducted through the combined use of PATRAN, SINDA, TRASYS, and NASTRAN software packages.
System on chip module configured for event-driven architecture
Robbins, Kevin; Brady, Charles E.; Ashlock, Tad A.
2017-10-17
A system on chip (SoC) module is described herein, wherein the SoC module comprises a processor subsystem and a hardware logic subsystem. The processor subsystem and hardware logic subsystem are in communication with one another and transmit event messages between one another. The processor subsystem executes software actors, while the hardware logic subsystem includes hardware actors; both the software actors and the hardware actors conform to an event-driven architecture, such that each receives and generates event messages.
Collaborative Software Development Approach Used to Deliver the New Shuttle Telemetry Ground Station
NASA Technical Reports Server (NTRS)
Kirby, Randy L.; Mann, David; Prenger, Stephen G.; Craig, Wayne; Greenwood, Andrew; Morsics, Jonathan; Fricker, Charles H.; Quach, Son; Lechese, Paul
2003-01-01
United Space Alliance (USA) developed and used a new software development method to meet technical, schedule, and budget challenges faced during the development and delivery of the new Shuttle Telemetry Ground Station at Kennedy Space Center. This method, called Collaborative Software Development, enabled KSC to effectively leverage industrial software and build additional capabilities to meet shuttle system and operational requirements. Application of this method resulted in reduced time to market, reduced development cost, improved product quality, and improved programmer competence while developing technologies of benefit to a small company in California (AP Labs Inc.). Many modifications were made to the baseline software product (VMEwindow), which improved its quality and functionality. In addition, six new software capabilities were developed, which are the subject of this article and add useful functionality to the VMEwindow environment. These new software programs are written in C or VXWorks and are used in conjunction with other ground station software packages, such as VMEwindow, Matlab, Dataviews, and PVWave. The Space Shuttle Telemetry Ground Station receives frequency-modulation (FM) and pulse-code-modulated (PCM) signals from the shuttle and support equipment. The hardware architecture (see figure) includes Sun workstations connected to multiple PCM- and FM-processing VersaModule Eurocard (VME) chassis. A reflective memory network transports raw data from PCM Processors (PCMPs) to the programmable digital-to-analog (D/A) converters, strip chart recorders, and analysis and controller workstations.
Method for network analyzation and apparatus
Bracht, Roger B.; Pasquale, Regina V.
2001-01-01
A portable network analyzer and method having multiple channel transmit and receive capability for real-time monitoring of processes which maintains phase integrity, requires low power, is adapted to provide full vector analysis, provides output frequencies of up to 62.5 MHz and provides fine sensitivity frequency resolution. The present invention includes a multi-channel means for transmitting and a multi-channel means for receiving, both in electrical communication with a software means for controlling. The means for controlling is programmed to provide a signal to a system under investigation which steps consecutively over a range of predetermined frequencies. The resulting received signal from the system provides complete time domain response information by executing a frequency transform of the magnitude and phase information acquired at each frequency step.
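The measurement principle described in this patent abstract, stepping a stimulus over a range of frequencies, recording magnitude and phase at each step, and transforming to the time domain, can be sketched as follows. This is an illustrative numpy sketch under assumed parameters (step count, simulated echoes), not the patented implementation.

```python
import numpy as np

# Hypothetical sweep: complex response measured at uniformly spaced frequency
# steps up to 62.5 MHz (the abstract's stated maximum; the step count is assumed).
n_steps = 256
df = 62.5e6 / n_steps            # frequency step size, Hz
freqs = np.arange(n_steps) * df

# Simulated response of a system with two echoes (stand-in for a real measurement).
delays = np.array([0.2e-6, 0.5e-6])      # seconds
amps = np.array([1.0, 0.4])
H = (amps * np.exp(-2j * np.pi * freqs[:, None] * delays)).sum(axis=1)

# Inverse FFT of the swept magnitude/phase data yields the time-domain response.
h = np.fft.ifft(H)
t = np.arange(n_steps) / (n_steps * df)   # time axis of the transform
print("strongest echo near %.2e s" % t[np.argmax(np.abs(h))])  # ~2e-07 s
```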
ERIC Educational Resources Information Center
Wood, Clare; Pillinger, Claire; Jackson, Emma
2010-01-01
This paper reports an extended analysis of the study reported in [Wood, C. (2005). "Beginning readers' use of 'talking books' software can affect their reading strategies." "Journal of Research in Reading, 28," 170-182.], in which five and six-year-old children received either six sessions using specially designed talking books or six sessions of…
How do particle physicists learn the programming concepts they need?
NASA Astrophysics Data System (ADS)
Kluth, S.; Pia, M. G.; Schoerner-Sadenius, T.; Steinbach, P.
2015-12-01
The ability to read, use and develop code efficiently and successfully is a key ingredient in modern particle physics. We report the experience of a training program, identified as “Advanced Programming Concepts”, that introduces software concepts, methods and techniques to work effectively on a daily basis in a HEP experiment or other programming intensive fields. This paper illustrates the principles, motivations and methods that shape the “Advanced Computing Concepts” training program, the knowledge base that it conveys, an analysis of the feedback received so far, and the integration of these concepts in the software development process of the experiments as well as its applicability to a wider audience.
50 CFR 660.15 - Equipment requirements.
Code of Federal Regulations, 2010 CFR
2010-10-01
... receivers, computer hardware for electronic fish ticket software and computer hardware for electronic logbook software. (b) Performance and technical requirements for scales used to weigh catch at sea... ticket software provided by Pacific States Marine Fish Commission are required to meet the hardware and...
34 CFR 464.42 - What limit applies to purchasing computer hardware and software?
Code of Federal Regulations, 2013 CFR
2013-07-01
... software? 464.42 Section 464.42 Education Regulations of the Offices of the Department of Education... computer hardware and software? Not more than ten percent of funds received under any grant under this part may be used to purchase computer hardware or software. (Authority: 20 U.S.C. 1208aa(f)) ...
34 CFR 464.42 - What limit applies to purchasing computer hardware and software?
Code of Federal Regulations, 2012 CFR
2012-07-01
... software? 464.42 Section 464.42 Education Regulations of the Offices of the Department of Education... computer hardware and software? Not more than ten percent of funds received under any grant under this part may be used to purchase computer hardware or software. (Authority: 20 U.S.C. 1208aa(f)) ...
34 CFR 464.42 - What limit applies to purchasing computer hardware and software?
Code of Federal Regulations, 2011 CFR
2011-07-01
... software? 464.42 Section 464.42 Education Regulations of the Offices of the Department of Education... computer hardware and software? Not more than ten percent of funds received under any grant under this part may be used to purchase computer hardware or software. (Authority: 20 U.S.C. 1208aa(f)) ...
34 CFR 464.42 - What limit applies to purchasing computer hardware and software?
Code of Federal Regulations, 2010 CFR
2010-07-01
... software? 464.42 Section 464.42 Education Regulations of the Offices of the Department of Education... computer hardware and software? Not more than ten percent of funds received under any grant under this part may be used to purchase computer hardware or software. (Authority: 20 U.S.C. 1208aa(f)) ...
34 CFR 464.42 - What limit applies to purchasing computer hardware and software?
Code of Federal Regulations, 2014 CFR
2014-07-01
... software? 464.42 Section 464.42 Education Regulations of the Offices of the Department of Education... computer hardware and software? Not more than ten percent of funds received under any grant under this part may be used to purchase computer hardware or software. (Authority: 20 U.S.C. 1208aa(f)) ...
Web-based multi-channel analyzer
Gritzo, Russ E.
2003-12-23
The present invention provides an improved multi-channel analyzer designed to conveniently gather, process, and distribute spectrographic pulse data. The multi-channel analyzer may operate on a computer system having memory, a processor, and the capability to connect to a network and to receive digitized spectrographic pulses. The multi-channel analyzer may have a software module integrated with a general-purpose operating system that may receive digitized spectrographic pulses at rates of at least 10,000 pulses per second. The multi-channel analyzer may further have a user-level software module that may receive user-specified controls dictating the operation of the multi-channel analyzer, making the multi-channel analyzer customizable by the end-user. The user-level software may further categorize and conveniently distribute spectrographic pulse data employing non-proprietary, standard communication protocols and formats.
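The core of a multi-channel analyzer of the kind described is binning each digitized pulse by its height into a running spectrum. A minimal sketch of that step is shown below; the channel count, full-scale value and simulated pulse data are illustrative assumptions, not details of the patent.

```python
import numpy as np

N_CHANNELS = 1024                          # assumed spectrum size
spectrum = np.zeros(N_CHANNELS, dtype=np.int64)

def accumulate(pulse_heights, full_scale=10.0):
    """Bin a batch of digitized pulse heights into the running spectrum."""
    channels = np.clip((np.asarray(pulse_heights) / full_scale * N_CHANNELS).astype(int),
                       0, N_CHANNELS - 1)
    np.add.at(spectrum, channels, 1)       # histogram increment, repeated channels allowed

# Simulated batch: a Gaussian "photopeak" on a flat background (illustrative only).
rng = np.random.default_rng(0)
batch = np.concatenate([rng.normal(6.62, 0.05, 5000), rng.uniform(0.0, 10.0, 5000)])
accumulate(batch)
print("peak channel:", int(spectrum.argmax()))   # near channel 678 of 1024
```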
NASA Technical Reports Server (NTRS)
2011-01-01
Topics covered include: Wind and Temperature Spectrometry of the Upper Atmosphere in Low-Earth Orbit; Health Monitor for Multitasking, Safety-Critical, Real-Time Software; Stereo Imaging Miniature Endoscope; Early Oscillation Detection Technique for Hybrid DC/DC Converters; Parallel Wavefront Analysis for a 4D Interferometer; Schottky Heterodyne Receivers With Full Waveguide Bandwidth; Carbon Nanofiber-Based, High-Frequency, High-Q, Miniaturized Mechanical Resonators; Ultracapacitor-Based Uninterrupted Power Supply System; Coaxial Cables for Martian Extreme Temperature Environments; Using Spare Logic Resources To Create Dynamic Test Points; Autonomous Coordination of Science Observations Using Multiple Spacecraft; Autonomous Phase Retrieval Calibration; EOS MLS Level 1B Data Processing Software, Version 3; Cassini Tour Atlas Automated Generation; Software Development Standard Processes (SDSP); Graphite Composite Panel Polishing Fixture; Material Gradients in Oxygen System Components Improve Safety; Ridge Waveguide Structures in Magnesium-Doped Lithium Niobate; Modifying Matrix Materials to Increase Wetting and Adhesion; Lightweight Magnetic Cooler With a Reversible Circulator; The Invasive Species Forecasting System; Method for Cleanly and Precisely Breaking Off a Rock Core Using a Radial Compressive Force; Praying Mantis Bending Core Breakoff and Retention Mechanism; Scoring Dawg Core Breakoff and Retention Mechanism; Rolling-Tooth Core Breakoff and Retention Mechanism; Vibration Isolation and Stabilization System for Spacecraft Exercise Treadmill Devices; Microgravity-Enhanced Stem Cell Selection; Diagnosis and Treatment of Neurological Disorders by Millimeter-Wave Stimulation; Passive Vaporizing Heat Sink; Remote Sensing and Quantization of Analog Sensors; Phase Retrieval for Radio Telescope and Antenna Control; Helium-Cooled Black Shroud for Subscale Cryogenic Testing; Receive Mode Analysis and Design of Microstrip Reflectarrays; and Chance-Constrained Guidance With Non-Convex Constraints.
Software Modules for the Proximity-1 Space Link Interleaved Time Synchronization (PITS) Protocol
NASA Technical Reports Server (NTRS)
Woo, Simon S.; Veregge, John R.; Gao, Jay L.; Clare, Loren P.; Mills, David
2012-01-01
The Proximity-1 Space Link Interleaved Time Synchronization (PITS) protocol provides time distribution and synchronization services for space systems. A software prototype implementation of the PITS algorithm has been developed that also provides the test harness to evaluate the key functionalities of PITS with simulated data source and sink. PITS integrates time synchronization functionality into the link layer of the CCSDS Proximity-1 Space Link Protocol. The software prototype implements the network packet format, data structures, and transmit- and receive-timestamp function for a time server and a client. The software also simulates the transmit and receive-time stamp exchanges via UDP (User Datagram Protocol) socket between a time server and a time client, and produces relative time offsets and delay estimates.
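The transmit/receive timestamp exchange mentioned above supports a standard two-way estimate of clock offset and path delay. The sketch below uses the generic four-timestamp relations (as in NTP-style synchronization) as a stand-in; the actual PITS weighting and filtering may differ, so treat this only as an illustration of the exchange.

```python
def offset_and_delay(t1, t2, t3, t4):
    """Two-way time transfer from one timestamp exchange.

    t1: client transmit time (client clock)
    t2: server receive time  (server clock)
    t3: server transmit time (server clock)
    t4: client receive time  (client clock)
    Returns (estimated offset of the server clock relative to the client,
             round-trip delay), assuming a symmetric path.
    """
    offset = ((t2 - t1) + (t3 - t4)) / 2.0
    delay = (t4 - t1) - (t3 - t2)
    return offset, delay

# Example: server clock is 0.5 s ahead of the client, 80 ms delay each way.
print(offset_and_delay(t1=100.00, t2=100.58, t3=100.60, t4=100.18))
# -> approximately (0.5, 0.16)
```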
Ground Processing of Data From the Mars Exploration Rovers
NASA Technical Reports Server (NTRS)
Wright, Jesse; Sturdevant, Kathryn; Noble, David
2006-01-01
A computer program implements the Earth side of the protocol that governs the transfer of data files generated by the Mars Exploration Rovers. It also provides tools for viewing data in these files and integrating data-product files into automated and manual processes. It reconstitutes files from telemetry data packets. Even if only one packet is received, metadata provide enough information to enable this program to identify and use partial data products. This software can generate commands to acknowledge received files and retransmit missed parts of files, or it can feed a manual process to make decisions about retransmission. The software uses an Extensible Markup Language (XML) data dictionary to provide a generic capability for displaying files of basic types, and uses external "plug-in" application programs to provide more sophisticated displays. This program makes data products available with very low latency, and can trigger automated actions when complete or partial products are received. The software is easy to install and use. The only system requirement for installing the software is a Java J2SE 1.4 platform. Several instances of the software can be executed simultaneously on the same machine.
NASA Astrophysics Data System (ADS)
Obuchowski, Nancy A.; Bullen, Jennifer A.
2018-04-01
Receiver operating characteristic (ROC) analysis is a tool used to describe the discrimination accuracy of a diagnostic test or prediction model. While sensitivity and specificity are the basic metrics of accuracy, they have many limitations when characterizing test accuracy, particularly when comparing the accuracies of competing tests. In this article we review the basic study design features of ROC studies, illustrate sample size calculations, present statistical methods for measuring and comparing accuracy, and highlight commonly used ROC software. We include descriptions of multi-reader ROC study design and analysis, address frequently seen problems of verification and location bias, discuss clustered data, and provide strategies for testing endpoints in ROC studies. The methods are illustrated with a study of transmission ultrasound for diagnosing breast lesions.
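For readers unfamiliar with the mechanics the review describes, the sketch below computes an empirical ROC curve and its area from scores and binary truth labels. The data are made up for illustration; a real analysis would also address confidence intervals, clustering and verification bias as discussed in the article.

```python
import numpy as np

def roc_curve_and_auc(scores, labels):
    """Empirical ROC curve and AUC from continuous scores and 0/1 truth labels."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=float)
    order = np.argsort(-scores)                   # sort by descending score
    labels = labels[order]
    tpr = np.cumsum(labels) / labels.sum()        # sensitivity at each cutoff
    fpr = np.cumsum(1 - labels) / (len(labels) - labels.sum())  # 1 - specificity
    fpr = np.concatenate([[0.0], fpr])
    tpr = np.concatenate([[0.0], tpr])
    auc = np.sum((fpr[1:] - fpr[:-1]) * (tpr[1:] + tpr[:-1]) / 2)  # trapezoidal area
    return fpr, tpr, auc

# Illustrative data: a higher score is meant to indicate disease.
scores = [0.9, 0.8, 0.7, 0.6, 0.55, 0.5, 0.4, 0.3, 0.2, 0.1]
labels = [1,   1,   0,   1,   0,    1,   0,   0,   0,   0]
_, _, auc = roc_curve_and_auc(scores, labels)
print(round(auc, 3))   # 0.875
```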
NASA Technical Reports Server (NTRS)
Crane, Robert K.; Wang, Xuhe; Westenhaver, David
1996-01-01
The preprocessing software manual describes the Actspp program originally developed to observe and diagnose Advanced Communications Technology Satellite (ACTS) propagation terminal/receiver problems. However, it has been quite useful for automating the preprocessing functions needed to convert the terminal output to useful attenuation estimates. Prior to having data acceptable for archival functions, the individual receiver system must be calibrated and the power level shifts caused by ranging tone modulation must be removed. Actspp provides three output files: the daylog, the diurnal coefficient file, and the file that contains calibration information.
Diagnosis and Prognosis of Weapon Systems
NASA Technical Reports Server (NTRS)
Nolan, Mary; Catania, Rebecca; deMare, Gregory
2005-01-01
The Prognostics Framework is a set of software tools with an open architecture that affords a capability to integrate various prognostic software mechanisms and to provide information for operational and battlefield decision-making and logistical planning pertaining to weapon systems. The Prognostics Framework is also a system-level health-management software system that (1) receives data from performance-monitoring and built-in-test sensors and from other prognostic software and (2) processes the received data to derive a diagnosis and a prognosis for a weapon system. This software relates the diagnostic and prognostic information to the overall health of the system, to the ability of the system to perform specific missions, and to needed maintenance actions and maintenance resources. In the development of the Prognostics Framework, effort was focused primarily on extending previously developed model-based diagnostic-reasoning software to add prognostic reasoning capabilities, including capabilities to perform statistical analyses and to utilize information pertaining to deterioration of parts, failure modes, time sensitivity of measured values, mission criticality, historical data, and trends in measurement data. As thus extended, the software offers an overall health-monitoring capability.
Method for automation of tool preproduction
NASA Astrophysics Data System (ADS)
Rychkov, D. A.; Yanyushkin, A. S.; Lobanov, D. V.; Arkhipov, P. V.
2018-03-01
The primary objective of tool production is the creation or selection of a tool design that secures high process efficiency, tool availability and the required quality of the resulting surfaces with minimum means and resources. Selecting the appropriate tool from the set of variants takes the personnel engaged in tool preparation a considerable amount of time. Software has been developed to solve this problem: it helps to create, systematize and carry out a comparative analysis of tool designs in order to identify the rational variant under given production conditions. Systematization and selection of the rational tool design are carried out in accordance with the developed modeling technology and comparative design analysis. Applying the software makes it possible to reduce the design period by 80-85% and obtain a significant annual saving.
Development of wide band digital receiver for atmospheric radars using COTS board based SDR
NASA Astrophysics Data System (ADS)
Yasodha, Polisetti; Jayaraman, Achuthan; Thriveni, A.
2016-07-01
A digital receiver extracts the received echo signal information and is a key subsystem of an atmospheric radar, also referred to as a wind profiling radar (WPR), which provides vertical profiles of the 3-dimensional wind vector in the atmosphere. This paper presents the development of a digital receiver using a COTS-board-based software defined radio technique, which can be used for atmospheric radars. The development work is being carried out at the National Atmospheric Research Laboratory (NARL), Gadanki. The digital receiver consists of a commercially available software defined radio (SDR) board, the Universal Software Radio Peripheral B210 (USRP B210), and a personal computer. The USRP B210 operates over a wide frequency range from 70 MHz to 6 GHz and hence can be used for a variety of radars, such as Doppler weather radars operating in S/C bands, in addition to wind profiling radars operating in VHF, UHF and L bands. Due to the flexibility and re-configurability of SDR, where the component functionalities are implemented in software, it is easy to modify the software to receive the echoes and process them as required for the type of radar intended. Hence, the USRP B210 board along with the computer forms a versatile digital receiver covering 70 MHz to 6 GHz. It has an inbuilt direct-conversion transceiver with two transmit and two receive channels, which can be operated in fully coherent 2x2 MIMO fashion, so it can be used as a two-channel receiver. Multiple USRP B210 boards can be synchronized using the pulse-per-second (PPS) input provided on the board to configure a multi-channel digital receiver system. The RF gain of the transceiver can be varied from 0 to 70 dB. The board can be controlled from the computer via a USB 3.0 interface through the USRP hardware driver (UHD), an open-source cross-platform driver. The USRP B210 board is connected to the personal computer through USB 3.0. A reference (10 MHz) clock signal from the radar master oscillator is used to lock the board, which is essential for deriving Doppler information. Input from the radar analog receiver is fed to one channel of the USRP B210, where it is down-converted to baseband. The 12-bit ADC on the board digitizes the signal and produces I (in-phase) and Q (quadrature-phase) data. The maximum sampling rate possible is about 61 MSPS. The I and Q (time series) data are sent to the PC via USB 3.0, where the signal processing is carried out. The online processing steps include decimation, range gating, decoding, coherent integration and FFT computation (optional). The processed data are then stored on the hard disk. The C++ programming language is used to develop the real-time signal processing. Shared memory along with multi-threading is used to collect and process data simultaneously. Before implementing real-time operation, a stand-alone test of the board was carried out using GNU Radio software, and the baseband output data obtained were found satisfactory. The board was then integrated with the existing Lower Atmospheric Wind Profiling radar at NARL. The radar receiver IF output at 70 MHz is fed to the board and real-time radar data are collected. The data are processed offline and the range-Doppler spectrum is obtained. Online processing software development is in progress.
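The online processing chain listed in this abstract (decimation, range gating, coherent integration, FFT) reduces at its core to the steps sketched below. The pulse counts, gate counts and integration factor are illustrative assumptions rather than NARL's actual parameters.

```python
import numpy as np

def doppler_spectra(iq, n_coh=64):
    """Basic wind-profiler-style processing of I/Q time series.

    iq: complex array of shape (n_pulses, n_gates) -- one row per transmitted
        pulse, one column per range gate (assumed already range-gated/decoded).
    n_coh: number of consecutive pulses summed coherently (assumed value).
    Returns power spectra per range gate, shape (n_pulses // n_coh, n_gates).
    """
    n_pulses, n_gates = iq.shape
    usable = (n_pulses // n_coh) * n_coh
    # Coherent integration: sum groups of n_coh pulses (raises SNR and lowers
    # the effective pulse repetition frequency by the factor n_coh).
    coh = iq[:usable].reshape(-1, n_coh, n_gates).sum(axis=1)
    # FFT along the slow-time axis gives the Doppler spectrum for each gate.
    spectra = np.fft.fftshift(np.fft.fft(coh, axis=0), axes=0)
    return np.abs(spectra) ** 2

# Illustrative data: 4096 pulses, 100 range gates of complex noise plus a slow
# complex exponential (a stand-in "clear-air echo") in gate 40.
rng = np.random.default_rng(1)
n_pulses, n_gates = 4096, 100
iq = rng.normal(size=(n_pulses, n_gates)) + 1j * rng.normal(size=(n_pulses, n_gates))
iq[:, 40] += 5 * np.exp(2j * np.pi * 0.002 * np.arange(n_pulses))
power = doppler_spectra(iq)
print(power.shape)   # (64, 100)
```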
ERIC Educational Resources Information Center
Wood, Eileen; Anderson, Alissa; Piquette-Tomei, Noella; Savage, Robert; Mueller, Julie
2011-01-01
Support requests were documented for 10 teachers (4 kindergarten, 4 grade one, and 2 grade one/two teachers) who received just-in-time instructional support over a 2 1/2 month period while implementing a novel reading software program as part of their literacy instruction. In-class observations were made of each instructional session. Analysis of…
Tanpitukpongse, Teerath P.; Mazurowski, Maciej A.; Ikhena, John; Petrella, Jeffrey R.
2016-01-01
Background and Purpose: To assess the prognostic efficacy of individual versus combined regional volumetrics in two commercially available brain volumetric software packages for predicting conversion of patients with mild cognitive impairment to Alzheimer's disease. Materials and Methods: Data were obtained through the Alzheimer's Disease Neuroimaging Initiative. 192 subjects (mean age 74.8 years, 39% female) diagnosed with mild cognitive impairment at baseline were studied. All had T1WI MRI sequences at baseline and 3-year clinical follow-up. Analysis was performed with NeuroQuant® and Neuroreader™. Receiver operating characteristic curves assessing the prognostic efficacy of each software package were generated using a univariable approach employing individual regional brain volumes, as well as two multivariable approaches (multiple regression and random forest) combining multiple volumes. Results: On univariable analysis of 11 NeuroQuant® and 11 Neuroreader™ regional volumes, hippocampal volume had the highest area under the curve for both software packages (0.69 NeuroQuant®, 0.68 Neuroreader™) and was not significantly different (p > 0.05) between packages. Multivariable analysis did not increase the area under the curve for either package (0.63 logistic regression, 0.60 random forest NeuroQuant®; 0.65 logistic regression, 0.62 random forest Neuroreader™). Conclusion: Of the multiple regional volume measures available in FDA-cleared brain volumetric software packages, hippocampal volume remains the best single predictor of conversion of mild cognitive impairment to Alzheimer's disease at 3-year follow-up. Combining volumetrics did not add additional prognostic efficacy. Therefore, future prognostic studies in MCI, combining such tools with demographic and other biomarker measures, are justified in using hippocampal volume as the only volumetric biomarker. PMID:28057634
Back to the future: An online OSCE Management Information System for nursing OSCEs.
Meskell, Pauline; Burke, Eimear; Kropmans, Thomas J B; Byrne, Evelyn; Setyonugroho, Winny; Kennedy, Kieran M
2015-11-01
The Objective Structured Clinical Examination (OSCE) is an established tool in the repertoire of clinical assessment methods in nurse education. The use of OSCEs facilitates the assessment of psychomotor skills as well as knowledge and attitudes. Identified benefits of OSCE assessment include development of students' confidence in their clinical skills and preparation for clinical practice. However, a number of challenges exist with the traditional paper methodology, including documentation errors and inadequate student feedback. To explore electronic OSCE delivery and evaluate the benefits of using an electronic OSCE management system. To explore assessors' perceptions of and attitudes to the computer-based package. This study was conducted using electronic software in the management of a four-station OSCE assessment with a cohort of first year undergraduate nursing students, delivered over two consecutive years (n=203) in one higher education institution in Ireland. A quantitative descriptive survey methodology was used to obtain the views of the assessors on the process and outcome of using the software. OSCE documentation was converted to electronic format. Assessors were trained in the use of the OSCE management software package and laptops were procured to facilitate electronic management of the OSCE assessment. Following the OSCE assessment, assessors were invited to evaluate the experience. Electronic software facilitated the storage and analysis of overall group and individual results, thereby offering considerable time savings. Submission of electronic forms was allowed only when fully completed, thus removing the potential for missing data. The feedback facility allowed the student to receive timely evaluation of their performance and to benchmark their performance against the class. Assessors' satisfaction with the software was high. Analysis of the assessment results highlighted issues of moderate internal consistency and of variability among examiners. Regression analysis increases the fairness of result calculations. Copyright © 2015. Published by Elsevier Ltd.
A study of land mobile satellite service multipath effects using SATLAB software
NASA Technical Reports Server (NTRS)
Campbell, Richard L.
1991-01-01
A software package is proposed that uses the known properties of signals received in multipath environments along with the mathematical relationships between signal characteristics to explore the effects of antenna pattern, vehicle velocity, shadowing of the direct wave, distributions of scatterers around the moving vehicle and levels of scattered signals on the received complex envelope, fade rates and fade duration, Doppler spectrum, signal arrival angle spectrum, and spatial correlation. The database may consist of either actual measured received signals entered as ASCII flat files or data synthesized using a built-in model. An example illustrates the effect of using different antennas to receive signals in the same environment.
NASA Technical Reports Server (NTRS)
Nickum, J. D.
1978-01-01
The software package developed for the KIM-1 Micro-System and the Mini-L PLL receiver to simplify taking flight test data is described, along with the address and data bus buffers used in the KIM-1 Micro-System. The interface hardware and timing are also presented in order to describe the software programs completely.
ERIC Educational Resources Information Center
Watkins, Beverly T.
1992-01-01
Course Technology Inc. has developed 10 products combining textbooks with commercial software for college accounting, business, computer science, and statistics courses. Five of the products use Lotus 1-2-3 spreadsheet software. The products have been positively received by teachers and students. (DB)
Software Solutions for Better Administration.
ERIC Educational Resources Information Center
Kazanjian, Edward
1997-01-01
The CO/OP (founded in 1973 as the Massachusetts Association of School Business Officials Cooperative Corporation) has created and produced administrative software for schools. Describes two areas in which software can increase revenue and provide protection for personnel: (1) invoice/accounts receivable for the rental of school space; and (2) an…
Tracking Clouds with low cost GNSS chips aided by the Arduino platform
NASA Astrophysics Data System (ADS)
Hameed, Saji; Realini, Eugenio; Ishida, Shinya
2016-04-01
The Global Navigation Satellite System (GNSS) is a constellation of satellites used to provide geo-positioning services. Beyond positioning, GNSS is important for a wide range of scientific and civilian applications. For example, GNSS systems are routinely used in civilian applications such as surveying and in scientific applications such as the study of crustal deformation. Another important scientific application of GNSS is in meteorological research, where it is mainly used to determine the total water vapour content of the troposphere, hereafter Precipitable Water Vapor (PWV). However, both GNSS receivers and software are prohibitively expensive for a variety of reasons. To overcome this somewhat artificial barrier we are exploring the use of low-cost GNSS receivers along with open-source GNSS software for scientific research, in particular for GNSS meteorology research. To achieve this aim, we have developed a custom Arduino-compatible data-logging board that is able to operate together with a specific low-cost single-frequency GNSS receiver chip from NVS Technologies AG. We have also developed an open-source software bundle that includes a new Arduino core for the Atmel324p chip, which is the main processor used in our custom logger, along with software code that enables data collection, logging and parsing of the GNSS data stream. Additionally, we have comprehensively evaluated the low-power characteristics of the GNSS receiver and logger boards. Currently we are exploring the use of several open-source or free-for-research software packages to map GNSS delays to PWV. These include the open-source goGPS (http://www.gogps-project.org/) and gLAB (http://gage.upc.edu/gLAB) and the openly available GAMIT software from the Massachusetts Institute of Technology (MIT). We note that all the firmware and software developed as part of this project is available under an open-source license.
Low cost and compact quantum key distribution
NASA Astrophysics Data System (ADS)
Duligall, J. L.; Godfrey, M. S.; Harrison, K. A.; Munro, W. J.; Rarity, J. G.
2006-10-01
We present the design of a novel free-space quantum cryptography system, complete with purpose-built software, that can operate in daylight conditions. The transmitter and receiver modules are built using inexpensive off-the-shelf components. Both modules are compact allowing the generation of renewed shared secrets on demand over a short range of a few metres. An analysis of the software is shown as well as results of error rates and therefore shared secret yields at varying background light levels. As the system is designed to eventually work in short-range consumer applications, we also present a use scenario where the consumer can regularly 'top up' a store of secrets for use in a variety of one-time-pad (OTP) and authentication protocols.
Updates to FuncLab, a Matlab based GUI for handling receiver functions
NASA Astrophysics Data System (ADS)
Porritt, Robert W.; Miller, Meghan S.
2018-02-01
Receiver functions are a versatile tool commonly used in seismic imaging. Depending on how they are processed, they can be used to image discontinuity structure within the crust or mantle or they can be inverted for seismic velocity either directly or jointly with complementary datasets. However, modern studies generally require large datasets which can be challenging to handle; therefore, FuncLab was originally written as an interactive Matlab GUI to assist in handling these large datasets. This software uses a project database to allow interactive trace editing, data visualization, H-κ stacking for crustal thickness and Vp/Vs ratio, and common conversion point stacking while minimizing computational costs. Since its initial release, significant advances have been made in the implementation of web services and changes in the underlying Matlab platform have necessitated a significant revision to the software. Here, we present revisions to the software, including new features such as data downloading via irisFetch.m, receiver function calculations via processRFmatlab, on-the-fly cross-section tools, interface picking, and more. In the descriptions of the tools, we present its application to a test dataset in Michigan, Wisconsin, and neighboring areas following the passage of USArray Transportable Array. The software is made available online at https://robporritt.wordpress.com/software.
NASA Technical Reports Server (NTRS)
Kizhner, Semion; Day, John H. (Technical Monitor)
2000-01-01
Post-processing of data related to a GPS receiver test in a GPS simulator and test facility is an important step towards qualifying a receiver for space flight. Although the GPS simulator provides all the parameters needed to analyze a simulation, as well as excellent analysis tools on the simulator workstation, post-processing is not a GPS simulator or receiver function alone, and it must be planned as a separate pre-flight test program requirement. A GPS simulator is a critical resource, and it is desirable to move the pertinent test data off the simulator as soon as a test is completed. The receiver and simulator databases are used to extract the test data files for post-processing. These files are then usually moved from the simulator and receiver systems to a personal computer (PC) platform, where post-processing is typically done using PC-based commercial software languages and tools. Because of the generality of commercial software systems, their functions are notoriously slow and more often than not are the bottleneck even for short-duration simulator-based tests. There is a need to do post-processing faster, within an hour after test completion, including all required operations on the simulator and receiver to prepare and move off the post-processing files. This is especially significant in order to use the previous test feedback for the next simulation setup or to run near back-to-back simulation scenarios. Solving the post-processing timing problem is critical for the success of a pre-flight test program. Towards this goal an approach was developed that speeds up post-processing by an order of magnitude. It is based on improving the algorithm of the post-processing bottleneck function using a priori information that is specific to a GPS simulation application and using only the necessary volume of truth data. The presented post-processing scheme was used in support of a few successful space flight missions carrying GPS receivers.
NASA Technical Reports Server (NTRS)
Lee, Alice T.; Gunn, Todd; Pham, Tuan; Ricaldi, Ron
1994-01-01
This handbook documents the three software analysis processes the Space Station Software Analysis team uses to assess space station software, including their backgrounds, theories, tools, and analysis procedures. Potential applications of these analysis results are also presented. The first section describes how software complexity analysis provides quantitative information on code, such as code structure and risk areas, throughout the software life cycle. Software complexity analysis allows an analyst to understand the software structure, identify critical software components, assess risk areas within a software system, identify testing deficiencies, and recommend program improvements. Performing this type of analysis during the early design phases of software development can positively affect the process, and may prevent later, much larger, difficulties. The second section describes how software reliability estimation and prediction analysis, or software reliability, provides a quantitative means to measure the probability of failure-free operation of a computer program, and describes the two tools used by JSC to determine failure rates and design tradeoffs between reliability, costs, performance, and schedule.
Validation and verification of a virtual environment for training naval submarine officers
NASA Astrophysics Data System (ADS)
Zeltzer, David L.; Pioch, Nicholas J.
1996-04-01
A prototype virtual environment (VE) has been developed for training a submarine officer of the deck (OOD) to perform in-harbor navigation on a surfaced submarine. The OOD, stationed on the conning tower of the vessel, is responsible for monitoring the progress of the boat as it negotiates a marked channel, as well as verifying the navigational suggestions of the below-deck piloting team. The VE system allows an OOD trainee to view a particular harbor and associated waterway through a head-mounted display, receive spoken reports from a simulated piloting team, give spoken commands to the helmsman, and receive verbal confirmation of command execution from the helm. The task analysis of in-harbor navigation and the derivation of application requirements are briefly described. This is followed by a discussion of the implementation of the prototype. This implementation underwent a series of validation and verification assessment activities, including operational validation, data validation, and software verification of individual software modules as well as the integrated system. Validation and verification procedures are discussed with respect to the OOD application in particular and with respect to VE applications in general.
Incorporating Computer-Aided Software in the Undergraduate Chemical Engineering Core Courses
ERIC Educational Resources Information Center
Alnaizy, Raafat; Abdel-Jabbar, Nabil; Ibrahim, Taleb H.; Husseini, Ghaleb A.
2014-01-01
Introductions of computer-aided software and simulators are implemented during the sophomore-year of the chemical engineering (ChE) curriculum at the American University of Sharjah (AUS). Our faculty concurs that software integration within the curriculum is beneficial to our students, as evidenced by the positive feedback received from industry…
Control Software for Advanced Video Guidance Sensor
NASA Technical Reports Server (NTRS)
Howard, Richard T.; Book, Michael L.; Bryan, Thomas C.
2006-01-01
Embedded software has been developed specifically for controlling an Advanced Video Guidance Sensor (AVGS). A Video Guidance Sensor is an optoelectronic system that provides guidance for automated docking of two vehicles. Such a system includes pulsed laser diodes and a video camera, the output of which is digitized. From the positions of digitized target images and known geometric relationships, the relative position and orientation of the vehicles are computed. The present software consists of two subprograms running in two processors that are parts of the AVGS. The subprogram in the first processor receives commands from an external source, checks the commands for correctness, performs commanded non-image-data-processing control functions, and sends image data processing parts of commands to the second processor. The subprogram in the second processor processes image data as commanded. Upon power-up, the software performs basic tests of functionality, then effects a transition to a standby mode. When a command is received, the software goes into one of several operational modes (e.g. acquisition or tracking). The software then returns, to the external source, the data appropriate to the command.
Mock Data Challenge for the MPD/NICA Experiment on the HybriLIT Cluster
NASA Astrophysics Data System (ADS)
Gertsenberger, Konstantin; Rogachevsky, Oleg
2018-02-01
Simulating data processing before the first experimental data are received is an important issue in high-energy physics experiments. This article presents the current Event Data Model and the Mock Data Challenge for the MPD experiment at the NICA accelerator complex, which uses ongoing simulation studies to exercise and stress-test the distributed computing infrastructure and experiment software in the full production environment, from simulated data through to the physics analysis.
Becker, Anton S; Mueller, Michael; Stoffel, Elina; Marcon, Magda; Ghafoor, Soleen; Boss, Andreas
2018-02-01
To train a generic deep learning software (DLS) to classify breast cancer on ultrasound images and to compare its performance to human readers with variable breast imaging experience. In this retrospective study, all breast ultrasound examinations from January 1, 2014 to December 31, 2014 at our institution were reviewed. Patients with post-surgical scars, initially indeterminate, or malignant lesions with histological diagnoses or 2-year follow-up were included. The DLS was trained with 70% of the images, and the remaining 30% were used to validate the performance. Three readers with variable expertise (radiologist, resident, medical student) also evaluated the validation set. Diagnostic accuracy was assessed with a receiver operating characteristic analysis. 82 patients with malignant and 550 with benign lesions were included. Time needed for training was 7 min (DLS). Evaluation time for the test data set was 3.7 s (DLS) and 28, 22 and 25 min for the human readers (in order of decreasing experience). Receiver operating characteristic analysis revealed non-significant differences (p-values 0.45-0.47) in the area under the curve of 0.84 (DLS), 0.88 (experienced and intermediate readers) and 0.79 (inexperienced reader). DLS may aid in diagnosing cancer on breast ultrasound images with an accuracy comparable to that of radiologists, and it learns better and faster than a human reader with no prior experience. Further clinical trials with dedicated algorithms are warranted. Advances in knowledge: DLS can be trained to classify cancer on breast ultrasound images with high accuracy even with comparably few training cases. The fast evaluation speed makes real-time image analysis feasible.
PT-SAFE: a software tool for development and annunciation of medical audible alarms.
Bennett, Christopher L; McNeer, Richard R
2012-03-01
Recent reports by The Joint Commission as well as the Anesthesia Patient Safety Foundation have indicated that medical audible alarm effectiveness needs to be improved. Several recent studies have explored various approaches to improving the audible alarms, motivating the authors to develop real-time software capable of comparing such alarms. We sought to devise software that would allow for the development of a variety of audible alarm designs that could also integrate into existing operating room equipment configurations. The software is meant to be used as a tool for alarm researchers to quickly evaluate novel alarm designs. A software tool was developed for the purpose of creating and annunciating audible alarms. The alarms consisted of annunciators that were mapped to vital sign data received from a patient monitor. An object-oriented approach to software design was used to create a tool that is flexible and modular at run-time, can annunciate wave-files from disk, and can be programmed with MATLAB by the user to create custom alarm algorithms. The software was tested in a simulated operating room to measure technical performance and to validate the time-to-annunciation against existing equipment alarms. The software tool showed efficacy in a simulated operating room environment by providing alarm annunciation in response to physiologic and ventilator signals generated by a human patient simulator, on average 6.2 seconds faster than existing equipment alarms. Performance analysis showed that the software was capable of supporting up to 15 audible alarms on a mid-grade laptop computer before audio dropouts occurred. These results suggest that this software tool provides a foundation for rapidly staging multiple audible alarm sets from the laboratory to a simulation environment for the purpose of evaluating novel alarm designs, thus producing valuable findings for medical audible alarm standardization.
Reconfigurable Sensor Monitoring System
NASA Technical Reports Server (NTRS)
Alhorn, Dean C. (Inventor); Dutton, Kenneth R. (Inventor); Howard, David E. (Inventor); Smith, Dennis A. (Inventor)
2017-01-01
A reconfigurable sensor monitoring system includes software tunable filters, each of which is programmable to condition one type of analog signal. A processor coupled to the software tunable filters receives each type of analog signal so-conditioned.
NASA Technical Reports Server (NTRS)
Nappier, Jennifer M.; Tokars, Roger P.; Wroblewski, Adam C.
2016-01-01
The Integrated Radio and Optical Communications (iROC) project at the National Aeronautics and Space Administration's (NASA) Glenn Research Center is investigating the feasibility of a hybrid radio frequency (RF) and optical communication system for future deep space missions. As a part of this investigation, a test bed for a radio frequency (RF) and optical software defined radio (SDR) has been built. Receivers and modems for the NASA deep space optical waveform are not commercially available so a custom ground optical receiver system has been built. This paper documents the ground optical receiver, which is used in order to test the RF and optical SDR in a free space optical communications link.
Estimation of total electron content (TEC) using spaceborne GPS measurements
NASA Astrophysics Data System (ADS)
Choi, Key-Rok; Lightsey, E. Glenn
2008-09-01
TerraSAR-X (TSX), a high-resolution interferometric Synthetic Aperture Radar (SAR) mission from DLR (German Aerospace Center, Deutsches Zentrum für Luft-und Raumfahrt), was successfully launched into orbit on June 15, 2007. It includes a dual-frequency GPS receiver called IGOR (Integrated GPS Occultation Receiver), which is a heritage NASA/JPL BlackJack receiver. The software for the TSX IGOR receiver was specially-modified software developed at UT/CSR. This software was upgraded to provide enhanced occultation capabilities. This paper describes total electron content (TEC) estimation using simulation data and onboard GPS data of TerraSAR-X. The simulated GPS data were collected using the IGOR Engineering Model (EM) in the laboratory and the onboard GPS data were collected from the IGOR Flight Model (FM) on TSX. To estimate vertical total electron content (vTEC) for the simulation data, inter-frequency biases (IFB) were estimated using the "carrier to code leveling process." For the onboard GPS data, IFBs of GPS satellites were retrieved from the navigation message and applied to the measurements.
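Slant TEC from dual-frequency pseudoranges follows from the standard geometry-free combination; the sketch below applies that textbook relation. Inter-frequency biases, cycle slips and the carrier-to-code leveling step described in the abstract are omitted, and the sample pseudorange values are invented for illustration.

```python
F1 = 1575.42e6   # GPS L1 frequency, Hz
F2 = 1227.60e6   # GPS L2 frequency, Hz

def slant_tec(p1, p2):
    """Slant TEC (in TEC units) from dual-frequency pseudoranges in metres.

    Uses the geometry-free combination P2 - P1; inter-frequency biases and
    the carrier-to-code leveling step described in the abstract are ignored.
    """
    coeff = (F1**2 * F2**2) / (40.3 * (F1**2 - F2**2))   # electrons/m^2 per metre of delay difference
    return coeff * (p2 - p1) / 1.0e16                     # 1 TECU = 1e16 electrons/m^2

# Illustrative pseudoranges differing by 3.2 m of ionospheric delay difference.
print(round(slant_tec(21_000_000.0, 21_000_003.2), 1))   # ~30.5 TECU
```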
Mendez Astudillo, Jorge; Lau, Lawrence; Tang, Yu-Ting; Moore, Terry
2018-02-14
As Global Navigation Satellite System (GNSS) signals travel through the troposphere, a tropospheric delay occurs due to a change in the refractive index of the medium. The Precise Point Positioning (PPP) technique can achieve centimeter/millimeter positioning accuracy with only one GNSS receiver. The Zenith Tropospheric Delay (ZTD) is estimated alongside with the position unknowns in PPP. Estimated ZTD can be very useful for meteorological applications, an example is the estimation of water vapor content in the atmosphere from the estimated ZTD. PPP is implemented with different algorithms and models in online services and software packages. In this study, a performance assessment with analysis of ZTD estimates from three PPP online services and three software packages is presented. The main contribution of this paper is to show the accuracy of ZTD estimation achievable in PPP. The analysis also provides the GNSS users and researchers the insight of the processing algorithm dependence and impact on PPP ZTD estimation. Observation data of eight whole days from a total of nine International GNSS Service (IGS) tracking stations spread in the northern hemisphere, the equatorial region and the southern hemisphere is used in this analysis. The PPP ZTD estimates are compared with the ZTD obtained from the IGS tropospheric product of the same days. The estimates of two of the three online PPP services show good agreement (<1 cm) with the IGS ZTD values at the northern and southern hemisphere stations. The results also show that the online PPP services perform better than the selected PPP software packages at all stations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clark, R.E.
1994-11-02
This document provides the software development plan for the Waste Receiving and Processing (WRAP) Module 1 Data Management System (DMS). The DMS is one of the plant computer systems for the new WRAP 1 facility (Project W-026). The DMS will collect, store, and report data required to certify the low level waste (LLW) and transuranic (TRU) waste items processed at WRAP 1 as acceptable for shipment, storage, or disposal.
Bourgarit, A; Mallet, H-P; Keshtmand, H; De Castro, N; Rambeloarisoa, J; Fain, O; Antoun, F; Picard, C; Rocher, G; Che, D; Farge, D
2009-10-01
The impact of the TB-info software was assessed on the care of patients treated with an antituberculosis regimen (ATT). This was a cohort study of patients with tuberculosis who received an ATT in 2004 in two hospitals and five medical centres in Paris. Follow-up was implemented with the TB-info software. Data were compared to those of the 1999-2003 cohort. Two hundred and nine ATT were initiated in 2004, with a mean duration of 7.2 months. Demographic and clinical data reflected this population's precariousness: 79% were foreign-born, 25% lived in institutions and half of them had no or unusual health insurance. Compared to the previous cohort, viral co-infections were tested for in more than 80% of cases and showed association with HIV, HBV or HCV in 11, 10 and 5% of the patients, respectively. Twenty-one patients were lost to follow-up (11%) and 76% of the smear-positive pulmonary tuberculosis therapies were declared successful, but only 34% were declared cured by the WHO criteria. Analysis of the data obtained with the TB-info software showed an improvement in the care of tuberculosis patients, with more co-infections tested for and fewer patients lost to follow-up. These results confirm the usefulness of this software for patient care and for the assessment of physicians' practice in France.
Implementation of building information modeling in Malaysian construction industry
NASA Astrophysics Data System (ADS)
Memon, Aftab Hameed; Rahman, Ismail Abdul; Harman, Nur Melly Edora
2014-10-01
This study has assessed the implementation level of Building Information Modeling (BIM) in the construction industry of Malaysia. It also investigated several computer software packages facilitating BIM and challenges affecting its implementation. Data collection for this study was carried out using a questionnaire survey among construction practitioners. Of 150 questionnaire sets distributed to consultant, contractor and client organizations, 95 completed forms were received and analyzed statistically. The analysis indicated that the level of BIM implementation in the construction industry of Malaysia is very low. The average index method, employed to assess the effectiveness of various BIM software packages, highlighted Bentley Construction, AutoCAD and ArchiCAD as the three most popular and effective packages. Major challenges to BIM implementation are that it requires enhanced collaboration, adds work for the designer, and raises interoperability issues. To improve the level of BIM implementation in the Malaysian industry, it is recommended that a flexible BIM training program for all practitioners be created.
High resolution ultrasonic spectroscopy system for nondestructive evaluation
NASA Technical Reports Server (NTRS)
Chen, C. H.
1991-01-01
With increased demand for high resolution ultrasonic evaluation, computer based systems or work stations become essential. The ultrasonic spectroscopy method of nondestructive evaluation (NDE) was used to develop a high resolution ultrasonic inspection system supported by modern signal processing, pattern recognition, and neural network technologies. The basic system which was completed consists of a 386/20 MHz PC (IBM AT compatible), a pulser/receiver, a digital oscilloscope with serial and parallel communications to the computer, an immersion tank with motor control of X-Y axis movement, and the supporting software package, IUNDE, for interactive ultrasonic evaluation. Although the hardware components are commercially available, the software development is entirely original. By integrating signal processing, pattern recognition, maximum entropy spectral analysis, and artificial neural network functions into the system, many NDE tasks can be performed. The high resolution graphics capability provides visualization of complex NDE problems. The phase 3 efforts involve intensive marketing of the software package and collaborative work with industrial sectors.
Vulnerabilities in GSM technology and feasibility of selected attacks
NASA Astrophysics Data System (ADS)
Voznak, M.; Prokes, M.; Sevcik, L.; Frnda, J.; Toral-Cruz, Homer; Jakovlev, Sergej; Fazio, Peppino; Mehic, M.; Mikulec, M.
2015-05-01
Global System for Mobile communications (GSM) is the most widespread technology for mobile communications in the world, serving over 7 billion users. Potential security problems have been reported since the first publication of the system documentation. Selected types of attacks, chosen on the basis of their technical feasibility and the degree of risk posed by these weaknesses, were implemented and demonstrated in the laboratory of the VSB-Technical University of Ostrava, Czech Republic. These vulnerabilities were analyzed and possible attacks were then described. The attacks were implemented using open-source tools, the software-programmable radio USRP (Universal Software Radio Peripheral) and a DVB-T (Digital Video Broadcasting - Terrestrial) receiver. The GSM security architecture has been scrutinized since the first public releases of its specification, mainly pointing out weaknesses in the authentication and ciphering mechanisms. This contribution also summarizes practically proven scenarios that are performed using open-source software tools and a variety of scripts mostly written in Python. The main goal of this paper is to analyze security issues in the GSM network and to demonstrate selected attacks in practice.
Cultural and Technological Issues and Solutions for Geodynamics Software Citation
NASA Astrophysics Data System (ADS)
Heien, E. M.; Hwang, L.; Fish, A. E.; Smith, M.; Dumit, J.; Kellogg, L. H.
2014-12-01
Computational software and custom-written codes play a key role in scientific research and teaching, providing tools to perform data analysis and forward modeling through numerical computation. However, development of these codes is often hampered by the fact that there is no well-defined way for the authors to receive credit or professional recognition for their work through the standard methods of scientific publication and subsequent citation of the work. This in turn may discourage researchers from publishing their codes or making them easier for other scientists to use. We investigate the issues involved in citing software in a scientific context, and introduce features that should be components of a citation infrastructure, particularly oriented towards the codes and scientific culture in the area of geodynamics research. The codes used in geodynamics are primarily specialized numerical modeling codes for continuum mechanics problems; they may be developed by individual researchers, teams of researchers, geophysicists in collaboration with computational scientists and applied mathematicians, or by coordinated community efforts such as the Computational Infrastructure for Geodynamics. Some but not all geodynamics codes are open-source. These characteristics are common to many areas of geophysical software development and use. We provide background on the problem of software citation and discuss some of the barriers preventing adoption of such citations, including social/cultural barriers, insufficient technological support infrastructure, and an overall lack of agreement about what a software citation should consist of. We suggest solutions in an initial effort to create a system to support citation of software and promotion of scientific software development.
Automation of Military Civil Engineering and Site Design Functions: Software Evaluation
1989-09-01
promising advantage over manual methods, USACERL is to evaluate available software to determine which, if any, is best suited to the type of civil...moved. Therefore, original surface data were assembled by scaling the northing and easting distances of field elevations and entering them manually into...in the software or requesting an update or addition to the software or manuals . Responses to forms submitted during the test were received at
NASA Technical Reports Server (NTRS)
Spector, E.; LeBlanc, A.; Shackelford, L.
1995-01-01
This study reports on the short-term in vivo precision and absolute measurements of three combinations of whole-body scan modes and analysis software using a Hologic QDR 2000 dual-energy X-ray densitometer. A group of 21 normal, healthy volunteers (11 male and 10 female) were scanned six times, receiving one pencil-beam and one array whole-body scan on three occasions approximately 1 week apart. The following combinations of scan modes and analysis software were used: pencil-beam scans analyzed with Hologic's standard whole-body software (PB scans); the same pencil-beam analyzed with Hologic's newer "enhanced" software (EPB scans); and array scans analyzed with the enhanced software (EA scans). Precision values (% coefficient of variation, %CV) were calculated for whole-body and regional bone mineral content (BMC), bone mineral density (BMD), fat mass, lean mass, %fat and total mass. In general, there was no significant difference among the three scan types with respect to short-term precision of BMD and only slight differences in the precision of BMC. Precision of BMC and BMD for all three scan types was excellent: < 1% CV for whole-body values, with most regional values in the 1%-2% range. Pencil-beam scans demonstrated significantly better soft tissue precision than did array scans. Precision errors for whole-body lean mass were: 0.9% (PB), 1.1% (EPB) and 1.9% (EA). Precision errors for whole-body fat mass were: 1.7% (PB), 2.4% (EPB) and 5.6% (EA). EPB precision errors were slightly higher than PB precision errors for lean, fat and %fat measurements of all regions except the head, although these differences were significant only for the fat and % fat of the arms and legs. In addition EPB precision values exhibited greater individual variability than PB precision values. Finally, absolute values of bone and soft tissue were compared among the three combinations of scan and analysis modes. BMC, BMD, fat mass, %fat and lean mass were significantly different between PB scans and either of the EPB or EA scans. Differences were as large as 20%-25% for certain regional fat and BMD measurements. Additional work may be needed to examine the relative accuracy of the scan mode/software combinations and to identify reasons for the differences in soft tissue precision with the array whole-body scan mode.
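As an illustration of the precision metric used above, a short-term %CV can be pooled across subjects as a root-mean-square of the within-subject coefficients of variation; the sketch below uses synthetic BMD values and is not the Hologic analysis software.

```python
# Minimal sketch: short-term precision as a percent coefficient of variation
# (%CV), pooled across subjects as an RMS value. Data are illustrative only.
import numpy as np

def percent_cv(scans: np.ndarray) -> float:
    """Root-mean-square %CV across subjects; each row holds one subject's repeats."""
    means = scans.mean(axis=1)
    sds = scans.std(axis=1, ddof=1)          # within-subject standard deviation
    return float(np.sqrt(np.mean((sds / means) ** 2)) * 100.0)

# Example: three whole-body BMD scans (g/cm^2) for two hypothetical subjects.
bmd = np.array([[1.112, 1.118, 1.109],
                [1.054, 1.049, 1.058]])
print(f"short-term precision: {percent_cv(bmd):.2f} %CV")
```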
NASA Technical Reports Server (NTRS)
Parrish, E. A., Jr.; Aylor, J. H.
1975-01-01
To aid work being conducted on the feasibility of a low-cost Omega navigational receiver, a control panel was designed and constructed according to supplied specifications. Since the proposed Omega receiver is designed around a microprocessor, the software engineering necessary for control panel operation is included in the design. The control panel is to be used as an operational model for use in the design of a prototype receiver. A detailed description of the hardware design is presented along with a description of the software needed to operate the panel. A complete description of the operating procedures for the panel is also included.
NASA Technical Reports Server (NTRS)
Lux, James P.; Taylor, Gregory H.; Lang, Minh; Stern, Ryan A.
2011-01-01
An FPGA module leverages the previous work from Goddard Space Flight Center (GSFC) relating to NASA's Space Telecommunications Radio System (STRS) project. The STRS SpaceWire FPGA Module is written in the Verilog Register Transfer Level (RTL) language, and it encapsulates an unmodified GSFC core (which is written in VHDL). The module has the necessary inputs/outputs (I/Os) and parameters to integrate seamlessly with the SPARC I/O FPGA Interface module (also developed for the STRS operating environment, OE). Software running on the SPARC processor can access the configuration and status registers within the SpaceWire module. This allows software to control and monitor the SpaceWire functions, but it is also used to give software direct access to what is transmitted and received through the link. SpaceWire data characters can be sent/received through the software interface, as well as through the dedicated interface on the GSFC core. Similarly, SpaceWire time codes can be sent/received through the software interface or through a dedicated interface on the core. This innovation is designed for plug-and-play integration in the STRS OE. The SpaceWire module simplifies the interfaces to the GSFC core, and synchronizes all I/O to a single clock. An interrupt output (with optional masking) identifies time-sensitive events within the module. Test modes were added to allow internal loopback of the SpaceWire link and internal loopback of the client-side data interface.
Statistical Approaches to Adjusting Weights for Dependent Arms in Network Meta-analysis.
Su, Yu-Xuan; Tu, Yu-Kang
2018-05-22
Network meta-analysis compares multiple treatments in terms of their efficacy and harm by including evidence from randomized controlled trials. Most clinical trials use a parallel design, where patients are randomly allocated to different treatments and receive only one treatment. However, some trials use within-person designs such as split-body, split-mouth and cross-over designs, where each patient may receive more than one treatment. Data from treatment arms within these trials are no longer independent, so the correlations between dependent arms need to be accounted for within the statistical analyses. Ignoring these correlations may result in incorrect conclusions. The main objective of this study is to develop statistical approaches to adjusting weights for dependent arms within special design trials. In this study, we demonstrate the following three approaches: the data augmentation approach, the adjusting variance approach, and the reducing weight approach. These three methods can be readily applied with current statistical tools such as R and Stata. An example of periodontal regeneration was used to demonstrate how these approaches could be undertaken and implemented within statistical software packages, and to compare results from different approaches. The adjusting variance approach can be implemented within the network package in Stata, while the reducing weight approach requires programming to set up the within-study variance-covariance matrix. This article is protected by copyright. All rights reserved.
A memory-mapped output interface: Omega navigation output data from the JOLT (TM) microcomputer
NASA Technical Reports Server (NTRS)
Lilley, R. W.
1976-01-01
A hardware interface which allows both digital and analog data output from the JOLT microcomputer is described in the context of a software-based Omega Navigation receiver. The interface hardware described is designed for output of six (or eight with simple extensions) bits of binary output in response to a memory store command from the microcomputer. The interface was produced in breadboard form and is operational as an evaluation aid for the software Omega receiver.
Orbiter global positioning system design and Ku-band problem investigations, exhibit B, revision 1
NASA Technical Reports Server (NTRS)
Lindsey, W. C.
1983-01-01
The hardware, the software, and the interface between them were investigated for a low-dynamics, nonhostile-environment, low-cost GPS receiver (GPS Z set). The set is basically a three-dimensional geodetic and waypoint navigator with GPS time, ground speed, and ground track as possible outputs in addition to the usual GPS receiver set outputs. Each functional module comprising the GPS set is described, enumerating its functional inputs and outputs, leading to the interface between the hardware and software of the set.
Technical design and system implementation of region-line primitive association framework
NASA Astrophysics Data System (ADS)
Wang, Min; Xing, Jinjin; Wang, Jie; Lv, Guonian
2017-08-01
Apart from regions, image edge lines are an important information source, and they deserve more attention in object-based image analysis (OBIA) than they currently receive. In the region-line primitive association framework (RLPAF), we promote straight-edge lines as line primitives to achieve more powerful OBIA. Along with regions, straight lines become basic units for subsequent extraction and analysis of OBIA features. This study develops a new software system called remote-sensing knowledge finder (RSFinder) to implement RLPAF for engineering application purposes. This paper introduces the extended technical framework, a comprehensively designed feature set, key technology, and software implementation. To our knowledge, RSFinder is the world's first OBIA system based on two types of primitives, namely, regions and lines. It is fundamentally different from other well-known region-only-based OBIA systems, such as eCognition and the ENVI feature extraction module. This paper provides an important reference for the development of similarly structured OBIA systems and line-involved extraction algorithms for remote sensing information.
Bonneville Power Administration Communication Alarm Processor expert system:
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goeltz, R.; Purucker, S.; Tonn, B.
This report describes the Communications Alarm Processor (CAP), a prototype expert system developed for the Bonneville Power Administration by Oak Ridge National Laboratory. The system is designed to receive and diagnose alarms from Bonneville's Microwave Communications System (MCS). The prototype encompasses one of seven branches of the communications network and a subset of alarm systems and alarm types from each system. The expert system employs a backward chaining approach to diagnosing alarms. Alarms are fed into the expert system directly from the communication system via RS232 ports and sophisticated alarm filtering and mailbox software. Alarm diagnoses are presented to operators for their review and concurrence before the diagnoses are archived. Statistical software is incorporated to allow analysis of archived data for report generation and maintenance studies. The delivered system resides on a Digital Equipment Corporation VAX 3200 workstation and utilizes Nexpert Object and SAS for the expert system and statistical analysis, respectively. 11 refs., 23 figs., 7 tabs.
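The backward-chaining diagnosis strategy mentioned above can be illustrated with a minimal sketch; the rules and alarm names below are hypothetical and are not taken from the CAP knowledge base, which was built in Nexpert Object.

```python
# Illustrative-only backward-chaining sketch over simple alarm rules.
RULES = {
    # conclusion: list of alternative antecedent sets (hypothetical rules)
    "power_supply_failure": [{"dc_volts_low", "site_on_battery"}],
    "microwave_path_fade": [{"rx_level_low", "ber_high"}],
    "site_on_battery": [{"commercial_power_lost"}],
}

def prove(goal, facts, rules=RULES):
    """Return True if `goal` follows from the observed alarms by backward chaining."""
    if goal in facts:
        return True
    for antecedents in rules.get(goal, []):
        if all(prove(a, facts, rules) for a in antecedents):
            return True
    return False

alarms = {"dc_volts_low", "commercial_power_lost"}
print(prove("power_supply_failure", alarms))   # True: both antecedents are provable
```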
Distributed and Collaborative Software Analysis
NASA Astrophysics Data System (ADS)
Ghezzi, Giacomo; Gall, Harald C.
Throughout the years software engineers have come up with a myriad of specialized tools and techniques that focus on a certain type of
11 CFR 9033.12 - Production of computerized information.
Code of Federal Regulations, 2010 CFR
2010-01-01
...) Disbursements made and reimbursements received for the cost of transportation, ground services and facilities...'s software capabilities, such as user guides, technical manuals, formats, layouts and other... software and the computerized information prepared or maintained by the committee. ...
Software Configurable Multichannel Transceiver
NASA Technical Reports Server (NTRS)
Freudinger, Lawrence C.; Cornelius, Harold; Hickling, Ron; Brooks, Walter
2009-01-01
Emerging test instrumentation and test scenarios increasingly require network communication to manage complexity. Adapting wireless communication infrastructure to accommodate challenging testing needs can benefit from reconfigurable radio technology. A fundamental requirement for a software-definable radio system is independence from carrier frequencies, one of the radio components that to date has seen only limited progress toward programmability. This paper overviews an ongoing project to validate the viability of a promising chipset that performs conversion of radio frequency (RF) signals directly into digital data for the wireless receiver and, for the transmitter, converts digital data into RF signals. The Software Configurable Multichannel Transceiver (SCMT) enables four transmitters and four receivers in a single unit the size of a commodity disk drive, programmable for any frequency band between 1 MHz and 6 GHz.
An Upgrade of the Aeroheating Software "MINIVER"
NASA Technical Reports Server (NTRS)
Louderback, Pierce
2013-01-01
Many software packages assist engineers with performing flight vehicle analysis, but some of these packages have gone many years without updates or significant improvements to their workflows. One such software package, known as MINIVER, is a powerful yet lightweight tool used for aeroheating analyses. However, it is an aging program that has not seen major improvements within the past decade. As part of a collaborative effort with the Florida Institute of Technology, MINIVER has received a major user interface overhaul, a change in program language, and will be continually receiving updates to improve its capabilities. The user interface update includes a migration from a command-line interface to that of a graphical user interface supported in the Windows operating system. The organizational structure of the pre-processor has been transformed to clearly defined categories to provide ease of data entry. Helpful tools have been incorporated, including the ability to copy sections of cases as well as a generalized importer which aids in bulk data entry. A visual trajectory editor has been included, as well as a CAD Editor which allows the user to input simplified geometries in order to generate MINIVER cases in bulk. To demonstrate its continued effectiveness, a case involving the JAXA OREX flight vehicle will be included, providing comparisons to captured flight data as well as other computational solutions. The most recent upgrade effort incorporated the use of the CAD Editor, and current efforts are investigating methods to link MINIVER projects with SINDA/Fluint and Thermal Desktop.
An Upgrade of the Aeroheating Software "MINIVER"
NASA Technical Reports Server (NTRS)
Louderback, Pierce M.
2013-01-01
Many software packages assist engineers with performing flight vehicle analysis, but some of these packages have gone many years without updates or significant improvements to their workflows. One such software, known as MINIVER, is a powerful yet lightweight tool that is used for aeroheating analyses. However, it is an aging program that has not seen major improvements within the past decade. As part of a collaborative effort with Florida Institute of Technology, MINIVER has received a major user interface overhaul, a change in program language, and will be continually receiving updates to improve its capabilities. The user interface update includes a migration from a command-line interface to that of a graphical user interface supported in the Windows operating system. The organizational structure of the preprocessor has been transformed to clearly defined categories to provide ease of data entry. Helpful tools have been incorporated, including the ability to copy sections of cases as well as a generalized importer which aids in bulk data entry. A visual trajectory editor has been included, as well as a CAD Editor which allows the user to input simplified geometries in order to generate MINIVER cases in bulk. To demonstrate its continued effectiveness, a case involving the JAXA OREX flight vehicle will be included, providing comparisons to captured flight data as well as other computational solutions. The most recent upgrade effort incorporated the use of the CAD Editor, and current efforts are investigating methods to link MINIVER projects with SINDA/Fluint and Thermal Desktop.
Wide-bandwidth high-resolution search for extraterrestrial intelligence
NASA Technical Reports Server (NTRS)
Horowitz, Paul
1993-01-01
A third antenna was added to the system. It is a terrestrial low-gain feed, to act as a veto for local interference. The 3-chip design for a 4 megapoint complex FFT was reduced to finished working hardware. The 4-Megachannel circuit board contains 36 MByte of DRAM, 5 CPLDs, the three large FFT ASICs, and 74 ICs in all. The Austek FDP-based Spectrometer/Power Accumulator (SPA) has now been implemented as a 4-layer printed circuit. A PC interface board has been designed and together with its associated user interface and control software allows an IBM compatible computer to control the SPA board, and facilitates the transfer of spectra to the PC for display, processing, and storage. The Feature Recognizer Array cards receive the stream of modulus words from the 4M FFT cards, and forward a greatly thinned set of reports to the PC's in whose backplane they reside. In particular, a powerful ROM-based state-machine architecture has been adopted, and DRAM has been added to permit integration modes when tracking or reobserving source candidates. The general purpose (GP) array consists of twenty '486 PC class computers, each of which receives and processes the data from a feature extractor/correlator board set. The array performs a first analysis on the provided 'features' and then passes this information on to the workstation. The core workstation software is now written. That is, the communication channels between the user interface, the backend monitor program and the PC's have working software.
1999-01-01
published in December of 1998. In addition, Mr. Drake is the author of a theme article entitled: "Measuring Software Quality: A Case Study...and services may run on different platforms in differing combinations , • Partial application failure (e.g., a client running, service down) is...result in a combined utility function that is some aggregation of the underlying utility functions. The benefit a client receives from a service
Development of a platform-independent receiver control system for SISIFOS
NASA Astrophysics Data System (ADS)
Lemke, Roland; Olberg, Michael
1998-05-01
Until now, receiver control software has been a time-consuming development, usually written by receiver engineers who had mainly the hardware in mind. We are presenting a low-cost and very flexible system which uses a minimal interface to the real hardware, and which makes it easy to adapt to new receivers. Our system uses Tcl/Tk as a graphical user interface (GUI), SpecTcl as a GUI builder, Pgplot as plotting software, a simple query language (SQL) database for information storage and retrieval, Ethernet socket-to-socket communication, and SCPI as a command control language. The complete system is in principle platform independent, but for cost-saving reasons we are currently running it on a PC486 under Linux 2.0.30, which is a copylefted Unix. The only hardware-dependent parts are the digital input/output boards and the analog-to-digital and digital-to-analog converters. In the case of the Linux PC we are using a device driver development kit to integrate the boards fully into the kernel of the operating system, which indeed makes them look like an ordinary device. The advantage of this system is firstly the low price and secondly the clear separation between the different software components, which are available for many operating systems. If it is not possible, due to CPU performance limitations, to run all the software on a single machine, the SQL database or the graphical user interface could be installed on separate computers.
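The socket-plus-SCPI control style described above can be sketched in a few lines; the host name and port below are hypothetical, and only the *IDN? identification query is a standard SCPI command.

```python
# Minimal sketch of sending an SCPI query to an instrument over a TCP socket.
# Host and port are hypothetical placeholders for a networked instrument.
import socket

def scpi_query(host: str, port: int, command: str, timeout: float = 2.0) -> str:
    """Send one SCPI command over TCP and return the instrument's reply."""
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall((command + "\n").encode("ascii"))
        return sock.recv(4096).decode("ascii").strip()

if __name__ == "__main__":
    # Identify the instrument (e.g., an LO synthesizer) with the standard query.
    print(scpi_query("synthesizer.local", 5025, "*IDN?"))
```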
Inertial Navigation System Standardized Software Development. Volume 1. Introduction and Summary
1976-06-01
the Loran receiver, the Tacan receiver, the Omega receiver, the satellite-based instrumentation, the multimode radar, the star tracker and the visual...accelerometer scale factor, and the barometric altimeter bias. The accuracy (1σ values) of typical navigation-aid measurements (other than satellite derived
Visual exploration and analysis of ionospheric scintillation monitoring data: The ISMR Query Tool
NASA Astrophysics Data System (ADS)
Vani, Bruno César; Shimabukuro, Milton Hirokazu; Galera Monico, João Francisco
2017-07-01
Ionospheric Scintillations are rapid variations on the phase and/or amplitude of a radio signal as it passes through ionospheric plasma irregularities. The ionosphere is a specific layer of the Earth's atmosphere located approximately between 50 km and 1000 km above the Earth's surface. As Global Navigation Satellite Systems (GNSS) - such as GPS, Galileo, BDS and GLONASS - use radio signals, these variations degrade their positioning service quality. Due to its location, Brazil is one of the places most affected by scintillation in the world. For that reason, ionosphere monitoring stations have been deployed over Brazilian territory since 2011 through cooperative projects between several institutions in Europe and Brazil. Such monitoring stations compose a network that generates a large amount of monitoring data everyday. GNSS receivers deployed at these stations - named Ionospheric Scintillation Monitor Receivers (ISMR) - provide scintillation indices and related signal metrics for available satellites dedicated to satellite-based navigation and positioning services. With this monitoring infrastructure, more than ten million observation values are generated and stored every day. Extracting the relevant information from this huge amount of data was a hard process and required the expertise of computer and geoscience scientists. This paper describes the concepts, design and aspects related to the implementation of the software that has been supporting research on ISMR data - the so-called ISMR Query Tool. Usability and other aspects are also presented via examples of application. This web based software has been designed and developed aiming to ensure insights over the huge amount of ISMR data that is fetched every day on an integrated platform. The software applies and adapts time series mining and information visualization techniques to extend the possibilities of exploring and analyzing ISMR data. The software is available to the scientific community through the World Wide Web, therefore constituting an analysis infrastructure that complements the monitoring one, providing support for researching ionospheric scintillation in the GNSS context. Interested researchers can access the functionalities without cost at http://is-cigala-calibra.fct.unesp.br/, under online request to the Space Geodesy Study Group from UNESP - Univ Estadual Paulista at Presidente Prudente.
Tanpitukpongse, T P; Mazurowski, M A; Ikhena, J; Petrella, J R
2017-03-01
Alzheimer disease is a prevalent neurodegenerative disease. Computer assessment of brain atrophy patterns can help predict conversion to Alzheimer disease. Our aim was to assess the prognostic efficacy of individual-versus-combined regional volumetrics in 2 commercially available brain volumetric software packages for predicting conversion of patients with mild cognitive impairment to Alzheimer disease. Data were obtained through the Alzheimer's Disease Neuroimaging Initiative. One hundred ninety-two subjects (mean age, 74.8 years; 39% female) diagnosed with mild cognitive impairment at baseline were studied. All had T1-weighted MR imaging sequences at baseline and 3-year clinical follow-up. Analysis was performed with NeuroQuant and Neuroreader. Receiver operating characteristic curves assessing the prognostic efficacy of each software package were generated by using a univariable approach using individual regional brain volumes and 2 multivariable approaches (multiple regression and random forest), combining multiple volumes. On univariable analysis of 11 NeuroQuant and 11 Neuroreader regional volumes, hippocampal volume had the highest area under the curve for both software packages (0.69, NeuroQuant; 0.68, Neuroreader) and was not significantly different ( P > .05) between packages. Multivariable analysis did not increase the area under the curve for either package (0.63, logistic regression; 0.60, random forest NeuroQuant; 0.65, logistic regression; 0.62, random forest Neuroreader). Of the multiple regional volume measures available in FDA-cleared brain volumetric software packages, hippocampal volume remains the best single predictor of conversion of mild cognitive impairment to Alzheimer disease at 3-year follow-up. Combining volumetrics did not add additional prognostic efficacy. Therefore, future prognostic studies in mild cognitive impairment, combining such tools with demographic and other biomarker measures, are justified in using hippocampal volume as the only volumetric biomarker. © 2017 by American Journal of Neuroradiology.
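The univariable-versus-multivariable comparison described above can be illustrated with a small scikit-learn sketch; the data are synthetic and the column layout is hypothetical, so this is not the NeuroQuant or Neuroreader pipeline.

```python
# Minimal sketch: AUC of hippocampal volume alone vs. a cross-validated
# logistic regression over several regional volumes. Synthetic data only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
n = 192
X = rng.normal(size=(n, 11))                  # 11 regional volumes (standardized)
y = rng.binomial(1, 0.35, size=n)             # 1 = converted to Alzheimer disease

# Univariable: smaller hippocampal volume (assumed column 0) predicts conversion.
auc_hippo = roc_auc_score(y, -X[:, 0])

# Multivariable: cross-validated logistic regression over all regions.
probs = cross_val_predict(LogisticRegression(max_iter=1000), X, y,
                          cv=5, method="predict_proba")[:, 1]
auc_multi = roc_auc_score(y, probs)
print(f"hippocampus-only AUC = {auc_hippo:.2f}, combined AUC = {auc_multi:.2f}")
```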
NASA Technical Reports Server (NTRS)
2003-01-01
Topics covered include: Real-Time, High-Frequency QRS Electrocardiograph; Software for Improved Extraction of Data From Tape Storage; Radio System for Locating Emergency Workers; Software for Displaying High-Frequency Test Data; Capacitor-Chain Successive-Approximation ADC; Simpler Alternative to an Optimum FQPSK-B Viterbi Receiver; Multilayer Patch Antenna Surrounded by a Metallic Wall; Software To Secure Distributed Propulsion Simulations; Explicit Pore Pressure Material Model in Carbon-Cloth Phenolic; Meshed-Pumpkin Super-Pressure Balloon Design; Corrosion Inhibitors as Penetrant Dyes for Radiography; Transparent Metal-Salt-Filled Polymeric Radiation Shields; Lightweight Energy Absorbers for Blast Containers; Brush-Wheel Samplers for Planetary Exploration; Dry Process for Making Polyimide/ Carbon-and-Boron-Fiber Tape; Relatively Inexpensive Rapid Prototyping of Small Parts; Magnetic Field Would Reduce Electron Backstreaming in Ion Thrusters; Alternative Electrochemical Systems for Ozonation of Water; Interferometer for Measuring Displacement to Within 20 pm; UV-Enhanced IR Raman System for Identifying Biohazards; Prognostics Methodology for Complex Systems; Algorithms for Haptic Rendering of 3D Objects; Modeling and Control of Aerothermoelastic Effects; Processing Digital Imagery to Enhance Perceptions of Realism; Analysis of Designs of Space Laboratories; Shields for Enhanced Protection Against High-Speed Debris; Study of Dislocation-Ordered In(x)Ga(1-x)As/GaAs Quantum Dots; and Tilt-Sensitivity Analysis for Space Telescopes.
Chen, Qianting; Dai, Congling; Zhang, Qianjun; Du, Juan; Li, Wen
2016-10-01
To evaluate the prediction performance of five bioinformatics software tools (SIFT, PolyPhen2, MutationTaster, Provean, MutationAssessor). From our own database of genetic mutations collected over the past five years, the Chinese literature database, the Human Gene Mutation Database, and dbSNP, 121 missense mutations confirmed by functional studies and 121 missense mutations suspected to be pathogenic by pedigree analysis were used as the positive gold standard, while 242 missense mutations with a minor allele frequency (MAF) > 5% in dominant hereditary diseases were used as the negative gold standard. The selected mutations were predicted with the five tools. Based on the results, the performance of the five tools was evaluated for sensitivity, specificity, positive predictive value, false positive rate, negative predictive value, false negative rate, false discovery rate, accuracy, and the receiver operating characteristic curve (ROC). In terms of sensitivity, negative predictive value and false negative rate, the rank was MutationTaster, PolyPhen2, Provean, SIFT, and MutationAssessor. For specificity and false positive rate, the rank was MutationTaster, Provean, MutationAssessor, SIFT, and PolyPhen2. For positive predictive value and false discovery rate, the rank was MutationTaster, Provean, MutationAssessor, PolyPhen2, and SIFT. For the area under the ROC curve (AUC) and accuracy, the rank was MutationTaster, Provean, PolyPhen2, MutationAssessor, and SIFT. The prediction performance of each tool may differ when different parameters are used. Among the five tools, MutationTaster showed the best prediction performance.
Zaknun, John J; Rajabi, Hossein; Piepsz, Amy; Roca, Isabel; Dondi, Maurizio
2011-01-01
Under the auspices of the International Atomic Energy Agency, a new-generation, platform-independent, and x86-compatible software package was developed for the analysis of scintigraphic renal dynamic imaging studies. It provides nuclear medicine professionals with cost-free access to the most recent developments in the field. The software package is a step forward towards harmonization and standardization. Embedded functionalities render it a suitable tool for education, research, and receiving distant experts' opinions. Another objective of this effort is to introduce clinically useful parameters of drainage, including normalized residual activity and outflow efficiency. Furthermore, it provides an effective teaching tool for young professionals who are being introduced to dynamic kidney studies through selected teaching case studies. The software facilitates a better understanding through a practical approach to different variables and settings and their effect on the numerical results. An effort was made to introduce instruments of quality assurance at the various levels of the program's execution, including visual inspection and automatic detection and correction of patient motion, automatic placement of regions of interest around the kidneys and cortical regions, and placement of a reproducible background region on both primary dynamic and postmicturition studies. The user can calculate the differential renal function through two independent methods, the integral or the Rutland-Patlak approach. Standardized digital reports, storage and retrieval of regions of interest, and built-in database operations allow the generation and tracing of full image reports and of numerical outputs. The software package is undergoing quality assurance procedures to verify the accuracy and the interuser reproducibility, with the final aim of launching the program for use by professionals and teaching institutions worldwide. Copyright © 2011 Elsevier Inc. All rights reserved.
artdaq: DAQ software development made simple
NASA Astrophysics Data System (ADS)
Biery, Kurt; Flumerfelt, Eric; Freeman, John; Ketchum, Wesley; Lukhanin, Gennadiy; Rechenmacher, Ron
2017-10-01
For a few years now, the artdaq data acquisition software toolkit has provided numerous experiments with ready-to-use components which allow for rapid development and deployment of DAQ systems. Developed within the Fermilab Scientific Computing Division, artdaq provides data transfer, event building, run control, and event analysis functionality. This latter feature includes built-in support for the art event analysis framework, allowing experiments to run art modules for real-time filtering, compression, disk writing and online monitoring. As art, also developed at Fermilab, is used for offline analysis as well, a major advantage of artdaq is that it allows developers to easily switch between developing online and offline software. artdaq continues to be improved. Support for an alternate mode of running whereby data from some subdetector components are only streamed if requested has been added; this option will reduce unnecessary DAQ throughput. Real-time reporting of DAQ metrics has been implemented, along with the flexibility to choose the format through which experiments receive the reports; these formats include the Ganglia, Graphite and syslog software packages, along with flat ASCII files. Additionally, work has been performed investigating more flexible modes of online monitoring, including the capability to run multiple online monitoring processes on different hosts, each running its own set of art modules. Finally, a web-based GUI interface through which users can configure details of their DAQ system has been implemented, increasing the ease of use of the system. Already successfully deployed on the LArIAT, DarkSide-50, DUNE 35ton and Mu2e experiments, artdaq will be employed for SBND and is a strong candidate for use on ICARUS and protoDUNE. With each experiment comes new ideas for how artdaq can be made more flexible and powerful. The above improvements will be described, along with potential ideas for the future.
Making the most of a translator.
Dannenfeldt, D
1994-01-01
Anaheim Memorial Hospital in California is a trailblazer. It's one of the first hospitals in the nation to use translation software for both supply procurement and claims-related transactions. Initially, it acquired the software to streamline the ordering of supplies by shifting to standard electronic formats. Today, the hospital is using the software to receive electronic remittance advice, and it has plans for other labor-saving applications.
17 CFR 37.202 - Access requirements.
Code of Federal Regulations, 2014 CFR
2014-04-01
... software vendor with impartial access to its market(s) and market services, including any indicative quote... electronic confirmation of their status as eligible contract participants, as defined by the Act and... participants and independent software vendors receiving comparable access to, or services from, the swap...
Summary of paper: Area navigation implementation for a microcomputer-based Loran-C receiver
NASA Technical Reports Server (NTRS)
Oguri, Fujiko
1987-01-01
The development of an area navigation program and the implementation of this software on a microcomputer-based Loran-C receiver to provide high-quality, practical area navigation information for general aviation are described. This software provides range and bearing angle to a selected waypoint, cross-track error, course deviation indication (CDI), ground speed, and estimated time of arrival at the waypoint. The range/bearing calculation, using an elliptical Earth model, provides very good accuracy; the error does not exceed 0.012 nm in range or 0.09 degree in bearing for ranges up to 530 nm. Alpha-beta filtering is applied to reduce the random noise in the Loran-C raw data and in the ground speed calculation. Due to the alpha-beta filtering, the ground speed calculation has good stability for constant-speed or low-acceleration flight. The execution time of this software is approximately 0.2 second. Flight testing was done with a prototype Loran-C front-end receiver, with the Loran-C area navigation software demonstrating the ability to provide navigation for the pilot to any point in the Loran-C coverage area in true area navigation fashion, without the line-of-sight and range restrictions typical of VOR area navigation.
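A minimal sketch of an alpha-beta filter of the kind mentioned above is shown below; the gains and the sample track are illustrative, not the values used in the Loran-C receiver.

```python
# Minimal alpha-beta filter sketch: smooth noisy position measurements and
# track an estimated rate. Gains and data are illustrative choices.
def alpha_beta_filter(measurements, dt=1.0, alpha=0.5, beta=0.1):
    """Return smoothed (position, rate) pairs for a stream of noisy measurements."""
    x, v = measurements[0], 0.0          # initial position estimate and rate
    smoothed = []
    for z in measurements[1:]:
        x_pred = x + v * dt              # predict ahead one update interval
        residual = z - x_pred            # innovation: measurement minus prediction
        x = x_pred + alpha * residual    # correct position with gain alpha
        v = v + (beta / dt) * residual   # correct rate with gain beta
        smoothed.append((x, v))
    return smoothed

# Example: a slowly moving track with measurement noise.
track = [0.0, 1.2, 1.9, 3.4, 3.8, 5.1, 6.2]
for pos, rate in alpha_beta_filter(track):
    print(f"position ≈ {pos:5.2f}, rate ≈ {rate:5.2f}")
```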
NASA Technical Reports Server (NTRS)
Siegmann, W. L.; Robertson, J. S.; Jacobson, M. J.
1993-01-01
The final report for progress during the period from 15 Nov. 1988 to 14 Nov. 1991 is presented. Research on methods for analysis of sound propagation through the atmosphere and on results obtained from application of our methods are summarized. Ten written documents of NASA research are listed, and these include publications, manuscripts accepted, submitted, or in preparation for publication, and reports. Twelve presentations of results, either at scientific conferences or at research or technical organizations, since the start of the grant period are indicated. Names of organizations to which software produced under the grant was distributed are provided, and the current arrangement whereby the software is being distributed to the scientific community is also described. Finally, the names of seven graduate students who worked on NASA research and received Rensselaer degrees during the grant period, along with their current employers are given.
NASA Technical Reports Server (NTRS)
2004-01-01
Since its founding in 1992, Global Science & Technology, Inc. (GST), of Greenbelt, Maryland, has been developing technologies and providing services in support of NASA scientific research. GST specialties include scientific analysis, science data and information systems, data visualization, communications, networking and Web technologies, computer science, and software system engineering. As a longtime contractor to Goddard Space Flight Center's Earth Science Directorate, GST scientific, engineering, and information technology staff have extensive qualifications with the synthesis of satellite, in situ, and Earth science data for weather- and climate-related projects. GST's experience in this arena is end-to-end, from building satellite ground receiving systems and science data systems, to product generation and research and analysis.
NASA Astrophysics Data System (ADS)
1981-03-01
Support documentation for a second generation heliostat project is presented. Flowcharts of control software are included. Numerical and graphic test results are provided. Project management information is also provided.
78 FR 23685 - Airworthiness Directives; The Boeing Company
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-22
... installing new operational software for the electrical load management system and configuration database. The..., installing a new electrical power control panel, and installing new operational software for the electrical load management system and configuration database. Since the proposed AD was issued, we have received...
NASA Technical Management Report (533Q)
NASA Technical Reports Server (NTRS)
Klosko, S. M.; Sanchez, B. (Technical Monitor)
2001-01-01
The objective of this task is analytical support of the NASA Satellite Laser Ranging (SLR) program in the areas of SLR data analysis, software development, assessment of SLR station performance, development of improved models for atmospheric propagation and interpretation of station calibration techniques, and science coordination and analysis functions for the NASA led Central Bureau of the International Laser Ranging Service (ILRS). The contractor shall in each year of the five year contract: (1) Provide software development and analysis support to the NASA SLR program and the ILRS. Attend and make analysis reports at the monthly meetings of the Central Bureau of the ILRS covering data received during the previous period. Provide support to the Analysis Working Group of the ILRS including special tiger teams that are established to handle unique analysis problems. Support the updating of the SLR Bibliography contained on the ILRS web site; (2) Perform special assessments of SLR station performance from available data to determine unique biases and technical problems at the station; (3) Develop improvements to models of atmospheric propagation and for handling pre- and post-pass calibration data provided by global network stations; (4) Provide review presentation of overall ILRS network data results at one major scientific meeting per year; (5) Contribute to and support the publication of NASA SLR and ILRS reports highlighting the results of SLR analysis activity.
Management and Analysis of Radiation Portal Monitor Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rowe, Nathan C; Alcala, Scott; Crye, Jason Michael
2014-01-01
Oak Ridge National Laboratory (ORNL) receives, archives, and analyzes data from radiation portal monitors (RPMs). Over time the amount of data submitted for analysis has grown significantly, and in fiscal year 2013, ORNL received 545 gigabytes of data representing more than 230,000 RPM operating days. This data comes from more than 900 RPMs. ORNL extracts this data into a relational database, which is accessed through a custom software solution called the Desktop Analysis and Reporting Tool (DART). DART is used by data analysts to complete a monthly lane-by-lane review of RPM status. Recently ORNL has begun to extend its data analysis based on program-wide data processing in addition to the lane-by-lane review. Program-wide data processing includes the use of classification algorithms designed to identify RPMs with specific known issues and clustering algorithms intended to identify as-yet-unknown issues or new methods and measures for use in future classification algorithms. This paper provides an overview of the architecture used in the management of this data, performance aspects of the system, and additional requirements and methods used in moving toward an increased program-wide analysis paradigm.
SigrafW: An easy-to-use program for fitting enzyme kinetic data.
Leone, Francisco Assis; Baranauskas, José Augusto; Furriel, Rosa Prazeres Melo; Borin, Ivana Aparecida
2005-11-01
SigrafW is Windows-compatible software developed using the Microsoft® Visual Basic Studio program that uses the simplified Hill equation for fitting kinetic data from allosteric and Michaelian enzymes. SigrafW uses a modified Fibonacci search to calculate maximal velocity (V), the Hill coefficient (n), and the enzyme-substrate apparent dissociation constant (K). The estimation of V, K, and the sum of the squares of residuals is performed using a Wilkinson nonlinear regression at any Hill coefficient (n). In contrast to many currently available kinetic analysis programs, SigrafW shows several advantages for the determination of kinetic parameters of both hyperbolic and nonhyperbolic saturation curves. No initial estimates of the kinetic parameters are required, a measure of the goodness-of-the-fit for each calculation performed is provided, the nonlinear regression used for calculations eliminates the statistical bias inherent in linear transformations, and the software can be used for enzyme kinetic simulations either for educational or research purposes. Persons interested in receiving a free copy of the software should contact Dr. F. A. Leone. Copyright © 2005 International Union of Biochemistry and Molecular Biology, Inc.
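The kind of nonlinear fit SigrafW performs can be sketched with standard tools, assuming one common form of the Hill equation, v = V·S^n/(K + S^n); the data and the exact equation form below are illustrative only, not SigrafW's internal algorithm.

```python
# Minimal sketch: nonlinear fit of kinetic data to a Hill-type equation.
# Assumes the form v = V * S**n / (K + S**n); data are illustrative.
import numpy as np
from scipy.optimize import curve_fit

def hill(S, V, K, n):
    return V * S**n / (K + S**n)

S = np.array([0.05, 0.1, 0.2, 0.5, 1.0, 2.0, 5.0])      # substrate concentration (mM)
v = np.array([0.9, 1.9, 3.6, 6.4, 8.1, 9.0, 9.6])        # initial velocity (arbitrary units)

(V, K, n), _ = curve_fit(hill, S, v, p0=[v.max(), 0.5, 1.0])
residuals = v - hill(S, V, K, n)
print(f"V = {V:.2f}, K = {K:.3f}, n = {n:.2f}, SSR = {np.sum(residuals**2):.3f}")
```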
SenseMyHeart: A cloud service and API for wearable heart monitors.
Pinto Silva, P M; Silva Cunha, J P
2015-01-01
In the era of ubiquitous computing, the growing adoption of wearable systems and body sensor networks is paving the way for new research and software for assessing cardiovascular intensity and energy expenditure and for detecting stress and fatigue through cardiovascular monitoring. Several systems have received clinical certification and provide huge amounts of reliable heart-related data on a continuous basis. PhysioNet provides equally reliable open-source software tools for ECG processing and analysis that can be combined with these devices. However, this software remains difficult to use in a mobile environment and for researchers unfamiliar with Linux-based systems. In the present paper we present an approach that aims at tackling these limitations by developing a cloud service that provides an API for a PhysioNet-based pipeline for ECG processing and heart rate variability measurement. We describe the proposed solution, along with its advantages and tradeoffs. We also present some client tools (Windows and Android) and several projects where the developed cloud service has been used successfully as a standard for heart rate and heart rate variability studies in different scenarios.
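One widely used heart rate variability measure, RMSSD, can be computed from RR intervals as in the sketch below; this is a generic illustration, not the PhysioNet pipeline or the SenseMyHeart API itself.

```python
# Minimal sketch: RMSSD (root mean square of successive RR-interval differences),
# a common time-domain HRV measure. RR values below are illustrative.
import numpy as np

def rmssd(rr_ms: np.ndarray) -> float:
    """Root mean square of successive RR-interval differences, in milliseconds."""
    diffs = np.diff(rr_ms)
    return float(np.sqrt(np.mean(diffs**2)))

rr = np.array([812, 790, 805, 798, 820, 808, 795], dtype=float)  # RR intervals (ms)
print(f"mean HR ≈ {60000.0 / rr.mean():.1f} bpm, RMSSD ≈ {rmssd(rr):.1f} ms")
```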
New Digisonde for research and monitoring applications
NASA Astrophysics Data System (ADS)
Reinisch, B. W.; Galkin, I. A.; Khmyrov, G. M.; Kozlov, A. V.; Bibl, K.; Lisysyan, I. A.; Cheney, G. P.; Huang, X.; Kitrosser, D. F.; Paznukhov, V. V.; Luo, Y.; Jones, W.; Stelmash, S.; Hamel, R.; Grochmal, J.
2009-02-01
The new Digisonde-4D, while preserving the basic principles of the Digisonde family, introduces important hardware and software changes that implement the latest capabilities of new digital radio frequency (RF) circuitry and embedded computers. The "D" refers to digital transmitters and receivers in which no analog circuitry is used for conversion between the baseband and the RF. In conjunction with the new hardware design, new software solutions offer significantly enhanced measurement flexibility, enhanced signal selectivity, and new types of data, e.g., the complete set of time domain samples of all four antenna signals suitable for independent scientific analysis. With the new method of mitigating in-band RF interference, the ionogram running time can be made as short as a couple of seconds. The h'(f) precision ranging technique with an accuracy of better than 1 km can be used on a routine basis. The 4D model runs the new ARTIST-5 ionogram autoscaling software which reports in real time the required data for assimilation in ionospheric models. The paper highlights technical advances of the new Digisonde for research and monitoring applications.
Tools Lighten Designs, Maintain Structural Integrity
NASA Technical Reports Server (NTRS)
2009-01-01
Collier Research Corporation of Hampton, Virginia, licensed software developed at Langley Research Center to reduce design weight through the use of composite materials. The first license of NASA-developed software, it has now been used in everything from designing next-generation cargo containers, to airframes, rocket engines, ship hulls, and train bodies. The company now has sales of the NASA-derived software topping $4 million a year and has recently received several Small Business Innovation Research (SBIR) contracts to apply its software to nearly all aspects of the new Orion crew capsule design.
Low cost airborne microwave landing system receiver, task 3
NASA Technical Reports Server (NTRS)
Hager, J. B.; Vancleave, J. R.
1979-01-01
Work performed on the low cost airborne Microwave Landing System (MLS) receiver is summarized. A detailed description of the prototype low cost MLS receiver is presented. This detail includes block diagrams, schematics, board assembly drawings, photographs of subassemblies, mechanical construction, parts lists, and microprocessor software. Test procedures are described and results are presented.
Development of medical data information systems
NASA Technical Reports Server (NTRS)
Anderson, J.
1971-01-01
Computerized storage and retrieval of medical information is discussed. Tasks which were performed in support of the project are: (1) flight crew health stabilization computer system, (2) medical data input system, (3) graphic software development, (4) lunar receiving laboratory support, and (5) Statos V printer/plotter software development.
NASA Technical Reports Server (NTRS)
Becker, D. D.
1980-01-01
The orbiter subsystems and interfacing program elements which interact with the orbiter computer flight software are analyzed. The failure modes identified in the subsystem/element failure mode and effects analysis are examined. Potential interaction with the software is examined through an evaluation of the software requirements. The analysis is restricted to flight software requirements and excludes utility/checkout software. The results of the hardware/software interaction analysis for the forward reaction control system are presented.
Future GOES-R global ground receivers
NASA Astrophysics Data System (ADS)
Dafesh, P. A.; Grayver, E.
2006-08-01
The Aerospace Corporation has developed an end-to-end testbed to demonstrate a wide range of modern modulation and coding alternatives for future broadcast by the GOES-R Global Rebroadcast (GRB) system. In particular, this paper describes the development of a compact, low cost, flexible GRB digital receiver that was designed, implemented, fabricated, and tested as part of the development. This receiver demonstrates a 10-fold increase in data rate compared to the rate achievable by the current GOES generation, without a major impact on either cost or size. The digital receiver is integrated on a single PCI card with an FPGA device, and analog-to-digital converters. It supports a wide range of modulations (including 8-PSK and 16-QAM) and turbo coding. With appropriate FPGA firmware and software changes, it can also be configured to receive the current (legacy) GOES signals. The receiver has been validated by sending large image files over a high-fidelity satellite channel emulator, including a space-qualified power amplifier and a white noise source. The receiver is a key component of a future GOES-R weather receiver system (also called user terminal) that includes the antenna, low-noise amplifier, downconverter, filters, digital receiver, and receiver system software. This work describes this receiver proof of concept and its application to providing a very credible estimate of the impact of using modern modulation and coding techniques in the future GOES-R system.
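As a small illustration of the modulation flexibility described above, the sketch below maps bits to Gray-coded 16-QAM symbols; the bit ordering and normalization are illustrative choices, not the GOES-R GRB specification.

```python
# Minimal sketch: Gray-coded 16-QAM symbol mapping with unit average power.
# Bit ordering and scaling are illustrative, not a standard's definition.
import numpy as np

GRAY_2BIT = {(0, 0): -3, (0, 1): -1, (1, 1): 1, (1, 0): 3}

def map_16qam(bits: np.ndarray) -> np.ndarray:
    """Map a bit stream (length a multiple of 4) to 16-QAM symbols."""
    b = bits.reshape(-1, 4)
    i = np.array([GRAY_2BIT[(r[0], r[1])] for r in b], dtype=float)
    q = np.array([GRAY_2BIT[(r[2], r[3])] for r in b], dtype=float)
    return (i + 1j * q) / np.sqrt(10.0)       # E[|s|^2] = 1 for this constellation

bits = np.random.default_rng(1).integers(0, 2, size=32)
print(map_16qam(bits))
```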
NASA Astrophysics Data System (ADS)
Kardava, Irakli; Tadyszak, Krzysztof; Gulua, Nana; Jurga, Stefan
2017-02-01
More flexible environmental perception by artificial intelligence requires supporting software modules that can automate the creation of language-specific syntax and perform further analysis for relevant decisions based on semantic functions. Under our proposed approach, pairs of formal rules can be created for given sentences (in the case of natural languages) or statements (in the case of special languages) with the help of computer vision, speech recognition, or an editable text conversion system, for further automatic improvement. In other words, we have developed an approach that can significantly improve the automation of the training process of artificial intelligence, which in turn yields a higher level of self-developing skills independent of the users. Based on this approach we have developed a demo version of the software, which includes the algorithm and code implementing all of the components mentioned above (computer vision, speech recognition, and an editable text conversion system). The program can work in multi-stream mode and simultaneously build a syntax from information received from several sources.
Virtual reality for mine safety training.
Filigenzi, M T; Orr, T J; Ruff, T M
2000-06-01
Mining has long remained one of America's most hazardous occupations. Researchers believe that by developing realistic, affordable VR training software, miners will be able to receive accurate training in hazard recognition and avoidance. In addition, the VR software will allow miners to follow mine evacuation routes and safe procedures without exposing themselves to danger. This VR software may ultimately be tailored to provide training in other industries, such as the construction, agricultural, and petroleum industries.
Garsson, B
1988-01-01
Remember that computer software is designed for accrual accounting, whereas your business operates and reports income on a cash basis. The rules of tax law stipulate that professional practices may use the cash method of accounting, but if accrual accounting is ever used to report taxable income the government may not permit a switch back to cash accounting. Therefore, always consider the computer as a bookkeeper, not a substitute for a qualified accountant. (Your accountant will have readily accessible payroll and general ledger data available for analysis and tax reports, thanks to the magic of computer processing.) Accounts Payable reports are interfaced with the general ledger and are of interest for transaction detail, open invoice and cash flow analysis, and for a record of payments by vendor. Payroll reports, including check register and withholding detail are provided and interfaced with the general ledger. The use of accounting software expands the use of in-office computers to areas beyond professional billing and insurance form generation. It simplifies payroll recordkeeping; maintains payables details; integrates payables, receivables, and payroll with general ledger files; provides instantaneous information on all aspects of the business office; and creates a continuous "audit-trail" following the entering of data. The availability of packaged accounting software allows the professional business office an array of choices. The person(s) responsible for bookkeeping and accounting should choose carefully, ensuring that any system is easy to use, has been thoroughly tested, and provides at least as much control over office records as has been outlined in this article.
Chen, Yan; James, Jonathan J; Turnbull, Anne E; Gale, Alastair G
2015-10-01
To establish whether lower resolution, lower cost viewing devices have the potential to deliver mammographic interpretation training. On three occasions over eight months, fourteen consultant radiologists and reporting radiographers read forty challenging digital mammography screening cases on three different displays: a digital mammography workstation, a standard LCD monitor, and a smartphone. Standard image manipulation software was available for use on all three devices. Receiver operating characteristic (ROC) analysis and ANOVA (Analysis of Variance) were used to determine the significance of differences in performance between the viewing devices with/without the application of image manipulation software. The effect of reader's experience was also assessed. Performance was significantly higher (p < .05) on the mammography workstation compared to the other two viewing devices. When image manipulation software was applied to images viewed on the standard LCD monitor, performance improved to mirror levels seen on the mammography workstation with no significant difference between the two. Image interpretation on the smartphone was uniformly poor. Film reader experience had no significant effect on performance across all three viewing devices. Lower resolution standard LCD monitors combined with appropriate image manipulation software are capable of displaying mammographic pathology, and are potentially suitable for delivering mammographic interpretation training. • This study investigates potential devices for training in mammography interpretation. • Lower resolution standard LCD monitors are potentially suitable for mammographic interpretation training. • The effect of image manipulation tools on mammography workstation viewing is insignificant. • Reader experience had no significant effect on performance in all viewing devices. • Smart phones are not suitable for displaying mammograms.
NASA Technical Reports Server (NTRS)
Steele, Glen; Lansdowne, Chatwin; Zucha, Joan; Schlensinger, Adam
2013-01-01
The Soft Decision Analyzer (SDA) is an instrument that combines hardware, firmware, and software to perform real-time closed-loop end-to-end statistical analysis of single- or dual-channel serial digital RF communications systems operating in very low signal-to-noise conditions. As an innovation, the unique SDA capabilities allow it to perform analysis of situations where the receiving communication system slips bits due to low signal-to-noise conditions or experiences constellation rotations resulting in channel polarity inversions or channel assignment swaps. The SDA's closed-loop detection allows it to instrument a live system and correlate observations with frame, codeword, and packet losses, as well as Quality of Service (QoS) and Quality of Experience (QoE) events. The SDA's abilities are not confined to performing analysis in low signal-to-noise conditions. Its analysis provides in-depth insight into a communication system's receiver performance in a variety of operating conditions. The SDA incorporates two techniques for identifying slips. The first is an examination of the content of the received data stream in relation to the transmitted data content, and the second is a direct examination of the receiver's recovered clock signals relative to a reference. Both techniques provide benefits in different ways and allow the communication engineer evaluating test results increased confidence and understanding of receiver performance. Direct examination of data contents is performed by two different techniques, power correlation or a modified Massey correlation, and can be applied to soft decision data widths 1 to 12 bits wide over a correlation depth ranging from 16 to 512 samples. The SDA detects receiver bit slips within a 4-bit window and can handle systems with up to four quadrants (QPSK, SQPSK, and BPSK systems). The SDA continuously monitors correlation results to characterize slips and quadrant changes and is capable of performing analysis even when the receiver under test is subjected to conditions where its performance degrades to high error rates (30 percent or beyond). The design incorporates a number of features, such as watchdog triggers that permit the SDA system to recover from large receiver upsets automatically and continue accumulating performance analysis unaided by operator intervention. This accommodates tests that can last on the order of days in order to gain statistical confidence in results, and is also useful for capturing snapshots of rare events.
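The data-content idea can be sketched as a correlation of received soft decisions against the known transmitted pattern, with the lag of the correlation peak taken as the current bit offset; the window sizes and data below are illustrative, not the SDA's actual power or Massey correlators.

```python
# Minimal sketch: estimate the bit offset between transmitted bits and received
# soft decisions by correlating over a small lag window; a change in the
# estimated lag between analysis windows would indicate a bit slip.
import numpy as np

def bit_offset(tx_bits: np.ndarray, rx_soft: np.ndarray, max_lag: int = 4) -> int:
    """Return the lag (in bits) that best aligns rx_soft with the reference pattern."""
    ref = 2.0 * tx_bits - 1.0                       # map {0,1} -> {-1,+1}
    lags = range(-max_lag, max_lag + 1)
    scores = [np.dot(ref[max_lag:-max_lag],
                     rx_soft[max_lag + lag:len(rx_soft) - max_lag + lag])
              for lag in lags]
    return list(lags)[int(np.argmax(scores))]

rng = np.random.default_rng(2)
tx = rng.integers(0, 2, size=256)
soft = 2.0 * tx - 1.0 + 0.8 * rng.normal(size=256)  # noisy soft decisions, no slip
print(bit_offset(tx, np.roll(soft, 1)))              # a one-bit slip shows up as lag 1
```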
Sun, Tao
2016-01-01
Introduction Using network meta-analysis, we evaluated the adverse effects of the seven most common treatment methods, i.e., bridging external fixation, non-bridging external fixation, K-wire fixation, plaster fixation, dorsal plating, volar plating, and dorsal and volar plating, by their associated risk of developing complex regional pain syndrome (CRPS) in distal radius fracture (DRF) patients. Material and methods Following an exhaustive search of scientific literature databases for high quality studies, randomized controlled trials (RCTs) related to our study topic were screened and selected based on stringent predefined inclusion and exclusion criteria. Data extracted from the selected studies were used for statistical analyses using Stata 12.0 software. Results A total of 17 RCTs, including 1658 DRF patients, were enrolled in this network meta-analysis. Among the 1658 DRF patients, 452 received bridging external fixation, 525 received non-bridging external fixation, 154 received K-wire fixation, 84 received plaster fixation, 132 received dorsal plating, 123 received volar plating, and 188 received dorsal and volar plating. When compared to bridging external fixation patients, there was no marked difference in the CRPS risk in DRF patients receiving different treatments (all p > 0.05). However, the surface under the cumulative ranking curves (SUCRA) for plaster fixation (77.0%) and non-bridging external fixation (71.3%) were significantly higher compared with the other five methods. Conclusions Our findings suggest that compared with bridging external fixation, K-wire fixation, dorsal plating, volar plating, dorsal and volar plating, plaster fixation and non-bridging external fixation might be the better treatment methods to reduce the risk of CRPS in DRF patients. PMID:28144268
NASA Technical Reports Server (NTRS)
Hoffer, R. M. (Principal Investigator)
1980-01-01
The column normalizing technique was used to adjust the data for variations in signal amplitude due to look angle effects with respect to solar zenith angle along the scan lines (i.e., across columns). Evaluation of the data set containing the geometric and radiometric adjustments indicates that the data set should be satisfactory for further processing and analysis. Software was developed for degrading the spatial resolution of the aircraft data to produce a total of four data sets for further analysis. The quality of the LANDSAT 2 CCT data for the test site is good for channels four, five, and six. Channel seven was not present on the tape. The data received were reformatted and analysis of the test site area was initiated.
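The column normalizing step can be illustrated conceptually; the sketch below (Python/NumPy) assumes the data are a 2-D array of scan lines by columns and simply rescales each column toward the scene mean, which is one common reading of the technique rather than the original 1980 software.

```python
import numpy as np

def column_normalize(scan_data):
    """Remove across-track (look-angle) brightness trends by scaling each
    column so its mean matches the overall scene mean. Conceptual
    reconstruction; the original software is not reproduced here.

    scan_data : 2-D array, rows = scan lines, columns = across-track samples
    """
    col_means = scan_data.mean(axis=0)             # per-column average
    scene_mean = scan_data.mean()                  # whole-image average
    gain = scene_mean / np.where(col_means == 0, 1.0, col_means)
    return scan_data * gain                        # broadcast across rows

# Tiny example: a flat scene with a fake across-track brightness ramp
rows, cols = 100, 50
truth = np.full((rows, cols), 80.0)
ramp = np.linspace(0.8, 1.2, cols)                 # look-angle effect
adjusted = column_normalize(truth * ramp)
print(round(adjusted.std(), 6))                    # ~0 after normalization
```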
Diagnostic evaluation of three cardiac software packages using a consecutive group of patients
2011-01-01
Purpose The aim of this study was to compare the diagnostic performance of the three software packages 4DMSPECT (4DM), Emory Cardiac Toolbox (ECTb), and Cedars Quantitative Perfusion SPECT (QPS) for quantification of myocardial perfusion scintigrams (MPS) using a large group of consecutive patients. Methods We studied 1,052 consecutive patients who underwent 2-day stress/rest 99mTc-sestamibi MPS studies. The reference/gold-standard classifications for the MPS studies were obtained from three physicians, each with more than 25 years of experience in nuclear cardiology, who re-evaluated all MPS images. Automatic processing was carried out using the 4DM, ECTb, and QPS software packages. Total stress defect extent (TDE) and summed stress score (SSS) based on a 17-segment model were obtained from the software packages. Receiver-operating characteristic (ROC) analysis was performed. Results A total of 734 patients were classified as normal and the remaining 318 were classified as having infarction and/or ischemia. The performance of the software packages, calculated as the area under the SSS ROC curve, was 0.87 for 4DM, 0.80 for QPS, and 0.76 for ECTb (QPS vs. ECTb p = 0.03; other differences p < 0.0001). The areas under the TDE ROC curve were 0.87 for 4DM, 0.82 for QPS, and 0.76 for ECTb (QPS vs. ECTb p = 0.0005; other differences p < 0.0001). Conclusion There are considerable differences in performance between the three software packages, with 4DM showing the best performance and ECTb the worst. These differences in performance should be taken into consideration when software packages are used in clinical routine or in clinical studies. PMID:22214226
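The ROC comparison at the heart of this evaluation reduces to computing the area under the curve for each package's score against the gold-standard labels. Below is a hedged sketch using scikit-learn, with synthetic stand-in scores (the study's actual SSS/TDE values are not reproduced here).

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)

# Synthetic stand-ins for the study data: 0 = normal, 1 = infarction/ischemia
y_true = np.r_[np.zeros(734), np.ones(318)].astype(int)

# Hypothetical summed stress scores from two software packages; higher = more abnormal
sss_pkg_a = rng.normal(2, 2, size=y_true.size) + 6 * y_true
sss_pkg_b = rng.normal(2, 2, size=y_true.size) + 4 * y_true

for name, scores in [("package A", sss_pkg_a), ("package B", sss_pkg_b)]:
    print(name, "AUC =", round(roc_auc_score(y_true, scores), 3))
```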
Microprogramming for real-time data acquisition
NASA Technical Reports Server (NTRS)
Patella, F. J.
1977-01-01
Transmit microcode trap logic is conditioned by preset clock. Measurement request or issuance of command is controlled by set of software-initialized polling tables. Receive microcode trap logic is conditioned by transmit/receive hardware when response is returned on data bus.
Improving Mathematics Learning of Kindergarten Students through Computer-Assisted Instruction
ERIC Educational Resources Information Center
Foster, Matthew E.; Anthony, Jason L.; Clements, Doug H.; Sarama, Julie; Williams, Jeffrey M.
2016-01-01
This study evaluated the effects of a mathematics software program, the Building Blocks software suite, on young children's mathematics performance. Participants included 247 Kindergartners from 37 classrooms in 9 schools located in low-income communities. Children within classrooms were randomly assigned to receive 21 weeks of computer-assisted…
Four applications of a software data collection and analysis methodology
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Selby, Richard W., Jr.
1985-01-01
The evaluation of software technologies suffers because of the lack of quantitative assessment of their effect on software development and modification. A seven-step data collection and analysis methodology couples software technology evaluation with software measurement. Four in-depth applications of the methodology are presented. The four studies represent each of the general categories of analyses on the software product and development process: blocked subject-project studies, replicated project studies, multi-project variation studies, and single project strategies. The four applications are in the areas of, respectively, software testing, cleanroom software development, characteristic software metric sets, and software error analysis.
A software system for the simulation of chest lesions
NASA Astrophysics Data System (ADS)
Ryan, John T.; McEntee, Mark; Barrett, Saoirse; Evanoff, Michael; Manning, David; Brennan, Patrick
2007-03-01
We report on the development of a novel software tool for the simulation of chest lesions. This software tool was developed for use in our study to attain optimal ambient lighting conditions for chest radiology. This study involved 61 consultant radiologists from the American Board of Radiology. Because of its success, we intend to use the same tool for future studies. The software has two main functions: the simulation of lesions and retrieval of information for ROC (Receiver Operating Characteristic) and JAFROC (Jack-Knife Free Response ROC) analysis. The simulation layer operates by randomly selecting an image from a bank of reportedly normal chest x-rays. A random location is then generated for each lesion, which is checked against a reference lung-map. If the location is within the lung fields, as derived from the lung-map, a lesion is superimposed. Lesions are also randomly selected from a bank of manually created chest lesion images. A blending algorithm determines the intensity levels that allow the lesion to sit naturally within the chest x-ray. The same software was used to run a study for all 61 radiologists. A sequence of images is displayed in random order. Half of these images had simulated lesions, ranging from subtle to obvious, and half of the images were normal. The operator then selects locations where he/she thinks lesions exist and grades each lesion accordingly. We found that this software was very effective in this study and intend to use the same principles for future studies.
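A simplified sketch of the placement-and-blend loop is given below (Python/NumPy). The lung-map format, retry logic, and fixed blend weight are assumptions made for illustration; the paper's actual blending algorithm adapts intensity levels and is not reproduced here.

```python
import numpy as np

def place_lesion(chest, lung_map, lesion, alpha=0.35, rng=None):
    """Superimpose one simulated lesion at a random location that falls
    inside the lung fields (illustrative version of the approach above).

    chest    : 2-D grayscale chest image (modified in place)
    lung_map : boolean mask, True inside the lung fields
    lesion   : small 2-D grayscale lesion template
    alpha    : blend weight (stand-in for the intensity-matching algorithm)
    """
    rng = rng or np.random.default_rng()
    h, w = lesion.shape
    for _ in range(1000):                        # retry until inside lungs
        r = rng.integers(0, chest.shape[0] - h)
        c = rng.integers(0, chest.shape[1] - w)
        if lung_map[r + h // 2, c + w // 2]:     # check centre against map
            patch = chest[r:r + h, c:c + w]
            chest[r:r + h, c:c + w] = (1 - alpha) * patch + alpha * lesion
            return (r, c)
    raise RuntimeError("no valid lung location found")
```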
Optical Coherence Tomography Angiography in Optic Disc Swelling.
Fard, Masoud Aghsaei; Jalili, Jalil; Sahraiyan, Alireza; Khojasteh, Hassan; Hejazi, Marjane; Ritch, Robert; Subramanian, Prem S
2018-05-04
To compare optical coherence tomography angiography (OCT-A) of peripapillary total vasculature and capillaries in patients with optic disc swelling. Cross-sectional study. Twenty-nine eyes with acute nonarteritic anterior ischemic optic neuropathy (NAION), 44 eyes with papilledema, 8 eyes with acute optic neuritis, and 48 eyes of normal subjects were imaged using OCT-A. Peripapillary total vasculature information was recorded using a commercial vessel density map. Customized image analysis with major vessel removal was also used to measure whole-image capillary density and peripapillary capillary density (PCD). Mixed models showed that the peripapillary total vasculature density values were significantly lower in NAION eyes, followed by papilledema eyes and control eyes, using commercial software (P < .0001 for all comparisons). The customized software also showed significantly lower PCD in NAION eyes compared with papilledema eyes (all P < .001), but did not show significant differences between papilledema and control subjects. Our software showed significantly lower whole-image capillary density and PCD in eyes with optic neuritis than in papilledema eyes. There was no significant difference between NAION and optic neuritis using our software. The areas under the receiver operating characteristic curves for discriminating NAION from papilledema eyes and optic neuritis from papilledema eyes were highest for whole-image capillary density (0.94 and 0.80, respectively) with our software, followed by peripapillary total vasculature (0.9 and 0.74, respectively) with commercial software. OCT-A is helpful to distinguish NAION and papillitis from papilledema. Whole-image capillary density had the greatest diagnostic accuracy for differentiating disc swelling. Copyright © 2018 Elsevier Inc. All rights reserved.
Interactive Visualization of Healthcare Data Using Tableau.
Ko, Inseok; Chang, Hyejung
2017-10-01
Big data analysis is receiving increasing attention in many industries, including healthcare. Visualization plays an important role not only in intuitively showing the results of data analysis but also in the whole process of collecting, cleaning, analyzing, and sharing data. This paper presents a procedure for the interactive visualization and analysis of healthcare data using Tableau as a business intelligence tool. Starting with installation of the Tableau Desktop Personal version 10.3, this paper describes the process of understanding and visualizing healthcare data using an example. The example data of colon cancer patients were obtained from health insurance claims in the years 2012 and 2013, provided by the Health Insurance Review and Assessment Service. To explore the visualization of healthcare data using Tableau for beginners, this paper describes the creation of a simple view for the average length of stay of colon cancer patients. Since Tableau provides various visualizations and customizations, the level of analysis can be increased with small multiples, view filtering, mark cards, and Tableau charts. Tableau is a software tool that can help users explore and understand their data by creating interactive visualizations. The software has the advantages that it can be used in conjunction with almost any database and that it is easy to use, with drag-and-drop creation of interactive visualizations in the desired format.
Multiscale analysis of river networks using the R package linbin
Welty, Ethan Z.; Torgersen, Christian E.; Brenkman, Samuel J.; Duda, Jeffrey J.; Armstrong, Jonathan B.
2015-01-01
Analytical tools are needed in riverine science and management to bridge the gap between GIS and statistical packages that were not designed for the directional and dendritic structure of streams. We introduce linbin, an R package developed for the analysis of riverscapes at multiple scales. With this software, riverine data on aquatic habitat and species distribution can be scaled and plotted automatically with respect to their position in the stream network or—in the case of temporal data—their position in time. The linbin package aggregates data into bins of different sizes as specified by the user. We provide case studies illustrating the use of the software for (1) exploring patterns at different scales by aggregating variables at a range of bin sizes, (2) comparing repeat observations by aggregating surveys into bins of common coverage, and (3) tailoring analysis to data with custom bin designs. Furthermore, we demonstrate the utility of linbin for summarizing patterns throughout an entire stream network, and we analyze the diel and seasonal movements of tagged fish past a stationary receiver to illustrate how linbin can be used with temporal data. In short, linbin enables more rapid analysis of complex data sets by fisheries managers and stream ecologists and can reveal underlying spatial and temporal patterns of fish distribution and habitat throughout a riverscape.
NASA Technical Reports Server (NTRS)
1994-01-01
A Small Business Innovation Research (SBIR) contract resulted in a series of commercially available lasers, which have application in fiber optic communications, difference frequency generation, fiber optic sensing and general laboratory use. Developed under a Small Business Innovation Research (SBIR) contract, the Phase Doppler Particles Analyzer is a non-disruptive, highly accurate laser-based method of determining particle size, number density, trajectory, turbulence and other information about particles passing through a measurement probe volume. The system consists of an optical transmitter and receiver, signal processor and computer with data acquisition and analysis software. A variety of systems are offered for applications including spray characterization for paint, and agricultural and other sprays. The Microsizer, a related product, is used in medical equipment manufacturing and analysis of contained flows. High frequency components and subsystems produced by Millitech Corporation are marketed for both research and commercial use. These systems, which operate in the upper portion of the millimeter wave, resulted from a number of Small Business Innovation Research (SBIR) projects. By developing very high performance mixers and multipliers, the company has advanced the state of the art in sensitive receiver technology. Components are used in receivers and transceivers for monitoring chlorine monoxides, ozone, in plasma characterization and in material properties characterization.
Improved Spacecraft Tracking and Navigation Using a Portable Radio Science Receiver
NASA Technical Reports Server (NTRS)
Soriano, Melissa; Jacobs, Christopher; Navarro, Robert; Naudet, Charles; Rogstad, Stephen; White, Leslie; Finley, Susan; Goodhart, Charles; Sigman, Elliott; Trinh, Joseph
2013-01-01
The Portable Radio Science Receiver (PRSR) is a suitcase-sized open-loop digital receiver designed to be small and easy to transport so that it can be deployed quickly and easily anywhere in the world. The PRSR digitizes, downconverts, and filters using custom hardware, firmware, and software. Up to 16 channels can be independently configured and recorded with a total data rate of up to 256 Mbps. The design and implementation of the system's hardware, firmware, and software is described. To minimize costs and time to deployment, our design leveraged elements of the hardware, firmware, and software designs from the existing full-sized operational (non-portable) Radio Science Receivers (RSR) and Wideband VLBI Science Receivers (WVSR), which have successfully supported flagship NASA deep space missions at all Deep Space Network (DSN) sites. We discuss a demonstration of the PRSR using VLBI, with one part per billion angular resolution: 1 nano-radian / 200 μas synthesized beam. This is the highest resolution astronomical instrument ever operated solely from the Southern Hemisphere. Preliminary results from two sites are presented, including the European Space Agency (ESA) sites at Cebreros, Spain and Malargue, Argentina. Malargue's South American location is of special interest because it greatly improves the geometric coverage for spacecraft navigation in the Southern Hemisphere and will for the first time provide coverage to the 1/4 of the range of declination that has been excluded from reference frame work at Ka-band.
Real-Time Mapping alert system; user's manual
Torres, L.A.
1996-01-01
The U.S. Geological Survey has an extensive hydrologic network that records and transmits precipitation, stage, discharge, and other water-related data on a real-time basis to an automated data processing system. Data values are recorded on electronic data collection platforms at field monitoring sites. These values are transmitted by means of orbiting satellites to receiving ground stations, and by way of telecommunication lines to a U.S. Geological Survey office where they are processed on a computer system. Data that exceed predefined thresholds are identified as alert values. These alert values can help keep water-resource specialists informed of current hydrologic conditions. The current alert status at monitoring sites is of critical importance during floods, hurricanes, and other extreme hydrologic events where quick analysis of the situation is needed. This manual provides instructions for using the Real-Time Mapping software, a series of computer programs developed by the U.S. Geological Survey for quick analysis of hydrologic conditions, and guides users through a basic interactive session. The software provides interactive graphics display and query of real-time information in a map-based, menu-driven environment.
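The alert-value logic described above amounts to comparing incoming real-time readings against predefined, site-specific thresholds. A minimal sketch follows, with hypothetical site names, field names, and thresholds.

```python
# Minimal sketch of the alert logic: readings above a site-specific threshold
# are flagged as "alert values". Site names and thresholds are hypothetical.
thresholds = {"site_01": {"stage_ft": 12.0}, "site_02": {"stage_ft": 8.5}}

readings = [
    {"site": "site_01", "stage_ft": 13.2},   # exceeds threshold -> alert
    {"site": "site_02", "stage_ft": 7.9},    # normal
]

def find_alerts(readings, thresholds):
    """Return (site, parameter, value, limit) tuples for exceeded thresholds."""
    alerts = []
    for rec in readings:
        limits = thresholds.get(rec["site"], {})
        for key, limit in limits.items():
            if rec.get(key, float("-inf")) > limit:
                alerts.append((rec["site"], key, rec[key], limit))
    return alerts

print(find_alerts(readings, thresholds))
```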
Risk Factors for premature birth in a hospital 1
Ahumada-Barrios, Margarita E.; Alvarado, German F.
2016-01-01
Abstract Objective: to determine the risk factors for premature birth. Methods: retrospective case-control study of 600 pregnant women assisted in a hospital, with 298 pregnant women in the case group (who gave birth prematurely <37 weeks) and 302 pregnant women who gave birth to a full-term newborn in the control group. Stata software version 12.2 was used. The Chi-square test was used in bivariate analysis and logistic regression was used in multivariate analysis, from which Odds Ratios (OR) and Confidence Intervals (CI) of 95% were derived. Results: risk factors associated with premature birth were current twin pregnancy (adjusted OR= 2.4; p= 0.02), inadequate prenatal care (< 6 controls) (adjusted OR= 3.2; p <0.001), absent prenatal care (adjusted OR= 3.0; p <0.001), history of premature birth (adjusted OR= 3.7; p <0.001) and preeclampsia (adjusted OR= 1.9; p= 0.005). Conclusion: history of premature birth, preeclampsia, not receiving prenatal care and receiving inadequate prenatal care were risk factors for premature birth. PMID:27463110
Nouri, Mahtab; Hamidiaval, Shadi; Akbarzadeh Baghban, Alireza; Basafa, Mohammad; Fahim, Mohammad
2015-01-01
Cephalometric norms of McNamara analysis have been studied in various populations due to their optimal efficiency. Dolphin cephalometric software greatly enhances the conduction of this analysis for orthodontic measurements. However, Dolphin is very expensive and cannot be afforded by many clinicians in developing countries. A suitable alternative software program in Farsi/English will greatly help Farsi speaking clinicians. The present study aimed to develop an affordable Iranian cephalometric analysis software program and compare it with Dolphin, the standard software available on the market for cephalometric analysis. In this diagnostic, descriptive study, 150 lateral cephalograms of normal occlusion individuals were selected in Mashhad and Qazvin, two major cities of Iran mainly populated with Fars ethnicity, the main Iranian ethnic group. After tracing the cephalograms, the McNamara analysis standards were measured both with Dolphin and the new software. The cephalometric software was designed using Microsoft Visual C++ program in Windows XP. Measurements made with the new software were compared with those of Dolphin software on both series of cephalograms. The validity and reliability were tested using intra-class correlation coefficient. Calculations showed a very high correlation between the results of the Iranian cephalometric analysis software and Dolphin. This confirms the validity and optimal efficacy of the newly designed software (ICC 0.570-1.0). According to our results, the newly designed software has acceptable validity and reliability and can be used for orthodontic diagnosis, treatment planning and assessment of treatment outcome.
Bayesian Hierarchical Random Effects Models in Forensic Science.
Aitken, Colin G G
2018-01-01
Statistical modeling of the evaluation of evidence with the use of the likelihood ratio has a long history. It dates from the Dreyfus case at the end of the nineteenth century through the work at Bletchley Park in the Second World War to the present day. The development received a significant boost in 1977 with a seminal work by Dennis Lindley which introduced a Bayesian hierarchical random effects model for the evaluation of evidence with an example of refractive index measurements on fragments of glass. Many models have been developed since then. The methods have now been sufficiently well developed and have become so widespread that it is timely to try to provide a software package to assist in their implementation. With that in mind, a project (SAILR: Software for the Analysis and Implementation of Likelihood Ratios) was funded by the European Network of Forensic Science Institutes through their Monopoly programme to develop a software package for use by forensic scientists world-wide that would assist in the statistical analysis and implementation of the approach based on likelihood ratios. It is the purpose of this document to provide a short review of a small part of this history. The review also provides a background, or landscape, for the development of some of the models within the SAILR package, and references to SAILR are made as appropriate.
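For orientation (not reproduced from the paper), the likelihood ratio underlying such models can be written in its generic two-proposition form, with $y_c$ denoting measurements on control (known-source) material and $y_r$ measurements on recovered material:

```latex
\mathrm{LR} \;=\;
\frac{f\!\left(y_c, y_r \mid H_p\right)}
     {f\!\left(y_c \mid H_d\right)\, f\!\left(y_r \mid H_d\right)}
```

Here $H_p$ is the proposition that the two items share a common source and $H_d$ that they do not; in a Lindley-type hierarchical random-effects model, each density is obtained by integrating a within-source distribution over a between-source distribution, which is the kind of computation a package such as SAILR is intended to support.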
ROC analysis for diagnostic accuracy of fracture by using different monitors.
Liang, Zhigang; Li, Kuncheng; Yang, Xiaolin; Du, Xiangying; Liu, Jiabin; Zhao, Xin; Qi, Xiangdong
2006-09-01
The purpose of this study was to compare diagnostic accuracy by using two types of monitors. Four radiologists, each with 10 years of experience, twice interpreted the films of 77 fracture cases by using the ViewSonic P75f+ and BARCO MGD221 monitors, with a time interval of 3 weeks. Each time the radiologists used one type of monitor to interpret the images. The image browser used was the Unisight software provided by Atlastiger Company (Shanghai, China), and the interpretation results were analyzed via the LABMRMC software. In studies of receiver operating characteristics to score the presence or absence of fracture, the results for images interpreted on the monochromatic monitors showed a statistically significant difference compared to those interpreted on the color monitors. A significant difference was observed between the results obtained using the two kinds of monitors. Color monitors cannot serve as substitutes for monochromatic monitors in the process of interpreting computed radiography (CR) images with fractures.
NASA Technical Reports Server (NTRS)
Densmore, A. C.
1988-01-01
A digital phase-locked loop (PLL) scheme is described which detects the phase and power of a high SNR calibration tone. The digital PLL is implemented in software directly from the given description. It was used to evaluate the stability of the Goldstone Deep Space Station open loop receivers for Radio Science. Included is a derivation of the Allan variance sensitivity of the PLL imposed by additive white Gaussian noise; a lower limit is placed on the carrier frequency.
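A minimal digital PLL for a strong tone can be sketched as follows (Python/NumPy); the loop gains, low-pass smoothing, and arctangent phase detector are generic textbook choices and are not taken from the Goldstone implementation.

```python
import numpy as np

def track_tone(x, fs, f0, loop_bw=30.0, avg=0.99):
    """Second-order digital PLL for a strong tone near f0 (Hz).

    Mixes the real input x (sample rate fs) with an NCO, low-pass filters the
    I/Q products to remove the double-frequency term, and closes the loop on
    the residual phase error. Returns the NCO phase track and an amplitude
    estimate. Gains are generic textbook values, not the DSS receiver's.
    """
    wn = 2 * np.pi * loop_bw / fs
    kp, ki = 2 * 0.707 * wn, wn * wn           # proportional / integral gains
    phase, freq_int = 0.0, 2 * np.pi * f0 / fs  # NCO state
    i_lp = q_lp = 0.0
    track = np.empty(len(x))
    for n, s in enumerate(x):
        i_lp = avg * i_lp + (1 - avg) * (s * np.cos(phase))
        q_lp = avg * q_lp + (1 - avg) * (-s * np.sin(phase))
        err = np.arctan2(q_lp, i_lp)            # residual phase error
        freq_int += ki * err                    # integrator tracks frequency
        phase += freq_int + kp * err            # NCO update
        track[n] = phase
    amplitude = 2.0 * np.hypot(i_lp, q_lp)      # tone amplitude estimate
    return track, amplitude

# Quick check: a 1 kHz tone of amplitude 1.0 sampled at 48 kHz
fs, f0 = 48_000, 1_000.0
t = np.arange(fs) / fs
x = np.cos(2 * np.pi * (f0 + 2.0) * t + 0.7)    # 2 Hz offset, 0.7 rad phase
_, amp = track_tone(x, fs, f0)
print(round(amp, 2))                            # should be close to 1.0
```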
Area navigation implementation for a microcomputer-based LORAN-C receiver
NASA Technical Reports Server (NTRS)
Oguri, F.
1983-01-01
Engineering performed to make LORAN-C a more useful and practical navigation system for general aviation is described. The development of new software, and the implementation of this software on a MOS6502 microcomputer to provide high quality, practical area navigation information directly to the pilot, are considered. Flight tests were performed specifically to examine the efficacy of this new software. Final results were exceptionally good and clearly demonstrate the merits of this new LORAN-C area navigation system.
Pantic, Igor; Dacic, Sanja; Brkic, Predrag; Lavrnja, Irena; Pantic, Senka; Jovanovic, Tomislav; Pekovic, Sanja
2014-10-01
The aim of this study was to assess the discriminatory value of fractal and grey level co-occurrence matrix (GLCM) analysis methods in standard microscopy analysis of two histologically similar brain white mass regions that have different nerve fiber orientation. A total of 160 digital micrographs of thionine-stained rat brain white mass were acquired using a Pro-MicroScan DEM-200 instrument. Eighty micrographs from the anterior corpus callosum and eighty from the anterior cingulum areas of the brain were analyzed. The micrographs were evaluated using the National Institutes of Health ImageJ software and its plugins. For each micrograph, seven parameters were calculated: angular second moment, inverse difference moment, GLCM contrast, GLCM correlation, GLCM variance, fractal dimension, and lacunarity. Using receiver operating characteristic (ROC) analysis, the highest discriminatory value was determined for the inverse difference moment (IDM) (the area under the ROC curve equaled 0.925, and for the criterion IDM≤0.610 the sensitivity and specificity were 82.5% and 87.5%, respectively). Most of the other parameters also showed good sensitivity and specificity. The results indicate that GLCM and fractal analysis methods, when applied together in brain histology analysis, are highly capable of discriminating white mass structures that have different axonal orientation.
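Several of the listed GLCM parameters can be reproduced with off-the-shelf tools. A hedged sketch using scikit-image follows (function names per recent releases; older versions spell them greycomatrix/greycoprops). scikit-image's 'homogeneity' is used here as a stand-in for the inverse difference moment, and GLCM variance, fractal dimension, and lacunarity would need separate code (e.g., box counting).

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops  # 'greyco...' in older scikit-image

def glcm_features(gray_u8, distance=1, angle=0.0):
    """Compute a few GLCM texture descriptors from an 8-bit grayscale patch."""
    glcm = graycomatrix(gray_u8, distances=[distance], angles=[angle],
                        levels=256, symmetric=True, normed=True)
    return {
        "ASM": float(graycoprops(glcm, "ASM")[0, 0]),               # angular second moment
        "IDM~homogeneity": float(graycoprops(glcm, "homogeneity")[0, 0]),
        "contrast": float(graycoprops(glcm, "contrast")[0, 0]),
        "correlation": float(graycoprops(glcm, "correlation")[0, 0]),
    }

# Demo on a random patch; real micrographs would be loaded and converted to uint8
patch = np.random.default_rng(0).integers(0, 256, (128, 128)).astype(np.uint8)
print(glcm_features(patch))
```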
Considerations for Future IGS Receivers
NASA Astrophysics Data System (ADS)
Humphreys, T. E.; Young, L. E.; Pany, T.
2008-12-01
Future International GNSS Service (IGS) receivers are considered against the backdrop of GNSS signal modernization and the IGS's goal of further improving the accuracy of its products. The purpose of this paper is to provide the IGS ---and any other group that uses geodetic-quality GNSS receivers---with a guide to making decisions about GNSS receivers. Modernized GNSS signals are analyzed with a view toward IGS applications. A schedule for minimum IGS receiver requirements is proposed. Features of idealized conceptual receivers are discussed. The prospects for standard commercial receivers and for software-defined GNSS receivers are examined. Recommendations are given for how the IGS should proceed in order to maximally benefit from the transformation in GNSS that will occur over the next decade. There are two reasons why it makes sense for the IGS to study GNSS receivers that will be integrated into its network in the coming years. First, the new GNSS signals that will come on line over the next decade will render current IGS receivers obsolete, so it is prudent to examine receiver options going forward. Second, the push to improve the accuracy of IGS products beyond current limits demands greater accuracy in the models used to describe receiver measurements. As a result, the IGS must demand from vendors more transparency into receiver firmware or adoption of user-specified algorithms. This paper considers future IGS receivers from four different points of view. Section 2 looks at modernized GNSS signals and their benefits for the IGS. Section 3 surveys the range of expected receiver capability. Section 4 considers current and future commercial geodetic-quality receivers. Section 5 considers software GNSS receivers as an alternative to less reconfigurable traditional receivers. Section 6 lays out the authors' recommendations to the IGS.
Earth's external magnetic fields at low orbital altitudes
NASA Technical Reports Server (NTRS)
Klumpar, D. M.
1990-01-01
Under our Jun. 1987 proposal, Magnetic Signatures of Near-Earth Distributed Currents, we proposed to render operational a modeling procedure that had been previously developed to compute the magnetic effects of distributed currents flowing in the magnetosphere-ionosphere system. After adaptation of the software to our computing environment we would apply the model to low altitude satellite orbits and would utilize the MAGSAT data suite to guide the analysis. During the first year, basic computer codes to run model systems of Birkeland and ionospheric currents and several graphical output routines were made operational on a VAX 780 in our research facility. Software performance was evaluated using an input matchstick ionospheric current array, field aligned currents were calculated and magnetic perturbations along hypothetical satellite orbits were calculated. The basic operation of the model was verified. Software routines to analyze and display MAGSAT satellite data in terms of deviations with respect to the earth's internal field were also made operational during the first year effort. The complete set of MAGSAT data to be used for evaluation of the models was received at the end of the first year. A detailed annual report in May 1989 described these first year activities completely. That first annual report is included by reference in this final report. This document summarizes our additional activities during the second year of effort and describes the modeling software, its operation, and includes as an attachment the deliverable computer software specified under the contract.
Debugging and Performance Analysis Software Tools for Peregrine System
NREL High-Performance Computing
Learn about debugging and performance analysis software tools, such as Allinea, available to use with the Peregrine system.
Students’ Spatial Ability through Open-Ended Approach Aided by Cabri 3D
NASA Astrophysics Data System (ADS)
Priatna, N.
2017-09-01
The use of computer software such as Cabri 3D for learning activities is virtually unlimited. Students can adjust their learning speed according to their level of ability. The open-ended approach strongly supports the use of computer software in learning, because the goal of open-ended learning is to develop students' creative activities and mathematical mindset through problem solving. In other words, students' creative activities and mathematical mindset should be developed as much as possible in accordance with each student's spatial ability. Spatial ability is the ability of students to construct and represent geometry models. This study aims to determine the improvement in spatial ability of junior high school students who received learning with an open-ended approach aided by Cabri 3D. It adopted a quasi-experimental method with a non-randomized control group pretest-posttest design and a 2×3 factorial model. The instrument of the study was a spatial ability test. Based on analysis of the data, it was found that the improvement in spatial ability of students who received open-ended learning aided by Cabri 3D was greater than that of students who received expository learning, both as a whole and across the categories of students' initial mathematical ability.
NASA Technical Reports Server (NTRS)
Simons, Rainee N.; Wintucky, Edwin G.; Landon, David G.; Sun, Jun Y.; Winn, James S.; Laraway, Stephen; McIntire, William K.; Metz, John L.; Smith, Francis J.
2011-01-01
The paper presents the first-ever research and experimental results regarding the combination of a software-defined multi-Gbps modem and a broadband high power space amplifier when tested with an extended form of the industry standard DVB-S2 and LDPC rate 9/10 FEC codec. The modem supports waveforms including QPSK, 8-PSK, 16-APSK, 32-APSK, 64-APSK, and 128-QAM. The broadband high power amplifier is a space-qualified traveling-wave tube (TWT), which has a passband greater than 3 GHz at 33 GHz, an output power of 200 W, and an efficiency greater than 60 percent. The modem and the TWTA together enabled an unprecedented data rate of 20 Gbps with a low BER of 10(exp -9). The presented results include a plot of the received waveform constellation, BER vs. E(sub b)/N(sub 0), and implementation loss for each of the modulation types tested. The above results, when included in an RF link budget analysis, show that NASA's payload data rate can be increased by at least an order of magnitude (greater than 10X) over current state-of-practice, limited only by the spacecraft EIRP, ground receiver G/T, range, and available spectrum or bandwidth.
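The closing link-budget argument can be made concrete with simple dB arithmetic: the thermal-noise-limited rate follows from EIRP, G/T, free-space path loss, and the required Eb/N0. The sketch below uses placeholder numbers, not the experiment's actual parameters.

```python
import math

K_DB = -228.6  # Boltzmann's constant in dBW/K/Hz

def max_data_rate_dBHz(eirp_dBW, g_over_t_dBK, range_km, freq_GHz,
                       required_ebn0_dB, margin_dB=3.0, misc_loss_dB=2.0):
    """Achievable data rate in dB-Hz (10*log10 of bits/s) for a simple
    free-space link: Eb/N0 = EIRP + G/T - FSPL - k - losses - 10log10(Rb)."""
    fspl_dB = (20 * math.log10(range_km * 1e3)
               + 20 * math.log10(freq_GHz * 1e9)
               + 20 * math.log10(4 * math.pi / 3e8))
    return (eirp_dBW + g_over_t_dBK - fspl_dB - K_DB
            - misc_loss_dB - margin_dB - required_ebn0_dB)

# Placeholder numbers for illustration only (roughly GEO range, Ka-band)
rate_dBHz = max_data_rate_dBHz(eirp_dBW=68, g_over_t_dBK=32, range_km=40_000,
                               freq_GHz=33.0, required_ebn0_dB=6.0)
print(f"~{10 ** (rate_dBHz / 10) / 1e9:.1f} Gbps")   # ~19 Gbps with these inputs
```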
Scaffolding Executive Function Capabilities via Play-&-Learn Software for Preschoolers
ERIC Educational Resources Information Center
Axelsson, Anton; Andersson, Richard; Gulz, Agneta
2016-01-01
Educational software in the form of games or so called "computer assisted intervention" for young children has become increasingly common receiving a growing interest and support. Currently there are, for instance, more than 1,000 iPad apps tagged for preschool. Thus, it has become increasingly important to empirically investigate…
A Survey on the Use of Microcomputers in Special Libraries.
ERIC Educational Resources Information Center
Krieger, Tillie
1986-01-01
Describes a survey on the use of microcomputers in special libraries. The discussion of the findings includes types of hardware and software in use; applications in public services, technical processes, and administrative tasks; data back-up techniques; training received; evaluation of software; and future plans for microcomputer applications. (1…
ERIC Educational Resources Information Center
Oliver, Astrid; Dahlquist, Janet; Tankersley, Jan; Emrich, Beth
2010-01-01
This article discusses the processes that occurred when the Library, Controller's Office, and Information Technology Department agreed to create an interface between the Library's Innovative Interfaces patron database and campus administrative software, Banner, using file transfer protocol, in an effort to streamline the Library's accounts…
Effectiveness of Software Training Using Simulations: An Exploratory Study
ERIC Educational Resources Information Center
McElroy, Arnold D., Jr.; Pan, Cheng-Chang
2009-01-01
This study was designed to explore the effectiveness in student performance and confidence of limited and full device simulators. The 30 employees from an information technology company who participated in this study were assigned to one of three groups. Each group received practice for learning a complex software procedure using traditional…
10 CFR 600.325 - Intellectual property.
Code of Federal Regulations, 2014 CFR
2014-01-01
... other than a small business concern, as defined in 35 U.S.C. 201(h) and receives an award or a subaward... rights data and/or restricted computer software, may be included. Inclusion of a background patent and/or..., an award will not require the delivery of limited rights data or restricted computer software...
10 CFR 600.325 - Intellectual property.
Code of Federal Regulations, 2013 CFR
2013-01-01
... other than a small business concern, as defined in 35 U.S.C. 201(h) and receives an award or a subaward... rights data and/or restricted computer software, may be included. Inclusion of a background patent and/or..., an award will not require the delivery of limited rights data or restricted computer software...
10 CFR 600.325 - Intellectual property.
Code of Federal Regulations, 2012 CFR
2012-01-01
... other than a small business concern, as defined in 35 U.S.C. 201(h) and receives an award or a subaward... rights data and/or restricted computer software, may be included. Inclusion of a background patent and/or..., an award will not require the delivery of limited rights data or restricted computer software...
A Software Defined Radio Based Airplane Communication Navigation Simulation System
NASA Astrophysics Data System (ADS)
He, L.; Zhong, H. T.; Song, D.
2018-01-01
Radio communication and navigation systems play an important role in ensuring the safety of civil airplanes in flight. Function and performance should be tested before these systems are installed on board. Conventionally, a set of transmitter and receiver is needed for each system, so the equipment occupies a lot of space and is costly. In this paper, software defined radio technology is applied to design a common-hardware communication and navigation ground simulation system, which can host multiple airplane systems with different operating frequencies, such as HF, VHF, VOR, ILS, ADF, etc. We use a broadband analog front-end hardware platform, the universal software radio peripheral (USRP), to transmit/receive signals of different frequency bands. Software is developed in LabVIEW on a computer, which interfaces with the USRP through Ethernet and is responsible for communication and navigation signal processing and system control. An integrated testing system is established to perform functional tests and performance verification of the simulated signals, which demonstrates the feasibility of our design. The system is a low-cost, common hardware platform for multiple airplane systems, which provides a helpful reference for integrated avionics design.
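As an example of the kind of navigation waveform such a simulator must synthesize, the sketch below generates a simplified VOR composite modulation in NumPy (a 30 Hz variable tone plus a 9960 Hz subcarrier frequency-modulated ±480 Hz at 30 Hz). It is a textbook approximation for bench illustration, not the paper's LabVIEW implementation.

```python
import numpy as np

def vor_baseband(bearing_deg, fs=200_000, duration=0.1):
    """Generate a simplified VOR composite modulation signal.

    The variable 30 Hz tone is phase-shifted by the bearing relative to the
    30 Hz reference, which is carried as +/-480 Hz FM on a 9960 Hz subcarrier.
    Amplitudes and sample rate are illustrative choices.
    """
    t = np.arange(int(fs * duration)) / fs
    var_tone = 0.3 * np.cos(2 * np.pi * 30 * t - np.radians(bearing_deg))
    ref_fm_phase = 2 * np.pi * 9960 * t + (480 / 30) * np.sin(2 * np.pi * 30 * t)
    ref_subcarrier = 0.3 * np.cos(ref_fm_phase)
    return 1.0 + var_tone + ref_subcarrier      # AM composite about a unit carrier

sig = vor_baseband(bearing_deg=45.0)
print(sig.shape, sig.min().round(2), sig.max().round(2))
```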
System for Secure Integration of Aviation Data
NASA Technical Reports Server (NTRS)
Kulkarni, Deepak; Wang, Yao; Keller, Rich; Chidester, Tom; Statler, Irving; Lynch, Bob; Patel, Hemil; Windrem, May; Lawrence, Bob
2007-01-01
The Aviation Data Integration System (ADIS) of Ames Research Center has been established to promote analysis of aviation data by airlines and other interested users for purposes of enhancing the quality (especially safety) of flight operations. The ADIS is a system of computer hardware and software for collecting, integrating, and disseminating aviation data pertaining to flights and specified flight events that involve one or more airline(s). The ADIS is secure in the sense that care is taken to ensure the integrity of sources of collected data and to verify the authorizations of requesters to receive data. Most importantly, the ADIS removes a disincentive to collection and exchange of useful data by providing for automatic removal of information that could be used to identify specific flights and crewmembers. Such information, denoted sensitive information, includes flight data (here signifying data collected by sensors aboard an aircraft during flight), weather data for a specified route on a specified date, date and time, and any other information traceable to a specific flight. The removal of information that could be used to perform such tracing is called "deidentification." Airlines are often reluctant to keep flight data in identifiable form because of concerns about loss of anonymity. Hence, one of the things needed to promote retention and analysis of aviation data is an automated means of de-identification of archived flight data to enable integration of flight data with non-flight aviation data while preserving anonymity. Preferably, such an automated means would enable end users of the data to continue to use pre-existing data-analysis software to identify anomalies in flight data without identifying a specific anomalous flight. It would then also be possible to perform statistical analyses of integrated data. These needs are satisfied by the ADIS, which enables an end user to request aviation data associated with de-identified flight data. The ADIS includes client software integrated with other software running on flight-operations quality-assurance (FOQA) computers for purposes of analyzing data to study specified types of events or exceedences (departures of flight parameters from normal ranges). In addition to ADIS client software, ADIS includes server hardware and software that provide services to the ADIS clients via the Internet (see figure). The ADIS server receives and integrates flight and non-flight data pertaining to flights from multiple sources. The server accepts data updates from authorized sources only and responds to requests from authorized users only. In order to satisfy security requirements established by the airlines, (1) an ADIS client must not be accessible from the Internet by an unauthorized user and (2) non-flight data such as airport terminal information system (ATIS) and weather data must be displayed without any identifying flight information. The ADIS hardware and software architecture, as well as the encryption and data display schemes, are designed to meet these requirements. When a user requests one or more selected aviation data characteristics associated with an event (e.g., a collision, near miss, equipment malfunction, or exceedence), the ADIS client augments the request with date and time information from encrypted files and submits the augmented request to the server. Once the user's authorization has been verified, the server returns the requested information in de-identified form.
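The de-identification step can be pictured as stripping identifying fields from each record before integration; the sketch below is a trivial illustration with hypothetical field names, not the ADIS rules.

```python
# Illustrative de-identification of a flight-event record before integration.
# Field names are hypothetical; the real ADIS field set is not published here.
SENSITIVE = {"tail_number", "flight_number", "crew_ids", "departure_time"}

def deidentify(record):
    """Return a copy of the record with identifying fields removed."""
    return {k: v for k, v in record.items() if k not in SENSITIVE}

event = {"tail_number": "N12345", "flight_number": "XY123",
         "departure_time": "2006-07-14T15:42:00Z", "exceedence": "flap overspeed"}
print(deidentify(event))   # only the non-identifying exceedence field remains
```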
Methods for Processing and Interpretation of AIS Signals Corrupted by Noise and Packet Collisions
NASA Astrophysics Data System (ADS)
Poļevskis, J.; Krastiņš, M.; Korāts, G.; Skorodumovs, A.; Trokšs, J.
2012-01-01
The authors deal with the operation of the Automatic Identification System (AIS), used in marine traffic monitoring to broadcast messages containing information about the vessel (id, payload, size, speed, destination, etc.) and meant primarily for avoidance of ship collisions. To extend the radius of AIS operation, it is envisaged to place its receivers on satellites. However, in space, due to the large coverage area, interfering factors are especially pronounced - such as packet collisions, Doppler shift, and noise, which affect AIS message reception, pre-processing, and decoding. To assess the quality of an AIS receiver's operation, a test was carried out in which frequency, amplitude, noise, and other parameters were varied automatically and data on the receiver's ability to decode AIS signals were collected. In the work, both hardware- and software-based AIS decoders were tested. As a result, quite satisfactory statistics were gathered on both the common and the differing features of such decoders when operating in space. To obtain reliable data on software-defined radio AIS receivers, further research is envisaged.
Anticoagulant use for prevention of stroke in a commercial population with atrial fibrillation.
Patel, Aarti A; Lennert, Barb; Macomson, Brian; Nelson, Winnie W; Owens, Gary M; Mody, Samir H; Schein, Jeff
2012-07-01
Atrial fibrillation (AF) is the most common sustained cardiac arrhythmia, and patients with AF are at an increased risk for stroke. Thromboprophylaxis with vitamin K antagonists reduces the annual incidence of stroke by approximately 60%, but appropriate thromboprophylaxis is prescribed for only approximately 50% of eligible patients. Health plans may help to improve quality of care for patients with AF by analyzing claims data for care improvement opportunities. To analyze pharmacy and medical claims data from a large integrated commercial database to determine the risk for stroke and the appropriateness of anticoagulant use based on guideline recommendations for patients with AF. This descriptive, retrospective claims data analysis used the Anticoagulant Quality Improvement Analyzer software, which was designed to analyze health plan data. The data for this study were obtained from a 10% randomly selected sample from the PharMetrics Integrated Database. This 10% sample resulted in almost 26,000 patients with AF who met the inclusion criteria for this study. Patients with a new or existing diagnosis of AF between July 2008 and June 2010 who were aged ≥18 years were included in this analysis. The follow-up period was 1 year. Demographics, stroke risk level (CHADS2 and CHA2DS2-VASc scores), anticoagulant use, and inpatient stroke hospitalizations were analyzed through the analyzer software. Of the 25,710 patients with AF (CHADS2 score 0-6) who were eligible to be included in this study, 9093 (35%) received vitamin K antagonists and 16,617 (65%) did not receive any anticoagulant. Of the patients at high risk for stroke, as predicted by CHADS2, 39% received an anticoagulant medication. The rates of patients receiving anticoagulant medication varied by age-group-16% of patients aged <65 years, 22% of those aged 65 to 74 years, and 61% of elderly ≥75 years. Among patients hospitalized for stroke, only 28% were treated with an anticoagulant agent in the outpatient setting before admission. Our findings support the current literature, indicating that many patients with AF are not receiving appropriate thromboprophylaxis to counter their risk for stroke. Increased use of appropriate anticoagulation, particularly in high-risk patients, has the potential to reduce the incidence of stroke along with associated fatalities and morbidities.
NASA Technical Reports Server (NTRS)
Tikidjian, Raffi; Mackey, Ryan
2008-01-01
The DSN Array Simulator (wherein 'DSN' signifies NASA's Deep Space Network) is an updated version of software previously denoted the DSN Receive Array Technology Assessment Simulation. This software (see figure) is used for computational modeling of a proposed DSN facility comprising user-defined arrays of antennas and transmitting and receiving equipment for microwave communication with spacecraft on interplanetary missions. The simulation includes variations in the spacecraft tracked and changes in communication demand for up to several decades of future operation. Such modeling is performed to estimate facility performance, evaluate requirements that govern facility design, and evaluate proposed improvements in hardware and/or software. The updated version of this software affords enhanced capability for characterizing facility performance against user-defined mission sets. The software includes a Monte Carlo simulation component that enables rapid generation of key mission-set metrics (e.g., numbers of links, data rates, and data volumes), and statistical distributions thereof as functions of time. The updated version also offers expanded capability for mixed-asset network modeling--for example, for running scenarios that involve user-definable mixtures of antennas having different diameters (in contradistinction to a fixed number of antennas having the same fixed diameter). The improved version also affords greater simulation fidelity, sufficient for validation by comparison with actual DSN operations and analytically predictable performance metrics.
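The Monte Carlo component can be pictured as repeatedly drawing a day's tracking demand from assumed distributions and accumulating statistics; the sketch below is a toy version with invented distributions and numbers, not the simulator's mission models.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_day(n_missions=30):
    """One Monte Carlo draw of a day's tracking demand (all values notional)."""
    links = rng.poisson(2, size=n_missions)                # passes requested per mission
    rates_mbps = rng.choice([0.5, 2.0, 8.0, 150.0], size=n_missions,
                            p=[0.4, 0.3, 0.2, 0.1])        # downlink rates
    hours = rng.uniform(0.5, 8.0, size=n_missions)         # track durations
    volume_gb = (links * rates_mbps * hours * 3600) / 8 / 1024
    return links.sum(), volume_gb.sum()

samples = np.array([simulate_day() for _ in range(10_000)])
print("links/day: mean %.1f, 95th pct %.1f" % (samples[:, 0].mean(),
                                               np.percentile(samples[:, 0], 95)))
print("data volume GB/day: mean %.0f" % samples[:, 1].mean())
```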
NASA Astrophysics Data System (ADS)
Santos, Maria J.; de Boer, Hugo; Dekker, Stefan
2017-04-01
Sustainability science has emerged as a key discipline that embraces both disciplinary depth and interdisciplinary breadth. The challenge is to design university courses that convey both properties without sacrificing either of them. Here we present the design of such a course at Utrecht University (the Netherlands) for the MSc program 'Sustainable Development' and discuss the perceived learning and student evaluations. Our course, Sustainability Modelling and Indicators (SMI), follows an introductory course on Sustainability Perspectives. The SMI philosophy is that systems thinking and systems analysis are central to sustainability science. To convey this philosophy, we focus on four themes: the Anthropocene, food security, energy security, and agency and decision making. We developed four hands-on assignments of increasing complexity that make use of different software (Stella, Excel, IMAGE, and Netlogo). The assignments aimed at: (1) teaching students the system components by using a pre-existing model in Stella, (2) challenging students to build their own coupled system in Excel, (3) assessing outputs from the fully coupled, dynamic integrated assessment model IMAGE, and (4) understanding emergent properties using an agent-based model in Netlogo. Based on detailed student evaluations (n = 95), we found that the mathematics presented a manageable challenge to some of the students. The student pool reported a priori having more experience with Excel than with the other software. Netlogo was the highest-ranked software in the student evaluations, and this was linked to its user interface with moving agents. The Excel assignment received both the highest and lowest scores; students found it challenging and time consuming, but also indicated that they learned the most from this assignment. Students graded what we considered 'easy' assignments with the highest grades. These results suggest that a systems-analytical approach to sustainability science can be operationalized in diverse ways that relate to students' backgrounds and make use of off-the-shelf software. The key challenge is to teach students all the concepts of systems analysis and the applied mathematics behind them. If the goal is to demonstrate process, this portfolio approach with off-the-shelf software can be very successful. This course can be complemented with programming that provides skills to modify and customize software to student needs.
Data Format Classification for Autonomous Software Defined Radios
NASA Technical Reports Server (NTRS)
Simon, Marvin; Divsalar, Dariush
2005-01-01
We present maximum-likelihood (ML) coherent and noncoherent classifiers for discriminating between NRZ and Manchester coded (biphase-L) data formats for binary phase-shift-keying (BPSK) modulation. Such classification of the data format is an essential element of so-called autonomous software defined radio (SDR) receivers (similar to so-called cognitive SDR receivers in the military application) where it is desired that the receiver perform each of its functions by extracting the appropriate knowledge from the received signal and, if possible, with as little information of the other signal parameters as possible. Small and large SNR approximations to the ML classifiers are also proposed that lead to simpler implementation with comparable performance in their respective SNR regions. Numerical performance results obtained by a combination of computer simulation and, wherever possible, theoretical analyses, are presented and comparisons are made among the various configurations based on the probability of misclassification as a performance criterion. Extensions to other modulations such as QPSK are readily accomplished using the same methods described in the paper.
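The intuition behind the classifiers can be conveyed with a simplified decision statistic: within each symbol, NRZ keeps the same sign in both halves while Manchester flips it, so half-symbol sums favor NRZ and half-symbol differences favor Manchester. The sketch below assumes known symbol timing and mimics only the spirit of the large-SNR approximations, not their exact form.

```python
import numpy as np

def classify_format(samples, sps):
    """Decide NRZ vs. Manchester for BPSK samples with known symbol timing.

    Splits each symbol into two halves: for NRZ both halves share a sign, so
    |sum| dominates; for Manchester the halves are antipodal, so |difference|
    dominates. Illustrative statistic, not the paper's exact ML classifier.
    """
    n_sym = len(samples) // sps
    sym = samples[:n_sym * sps].reshape(n_sym, sps)
    first, second = sym[:, :sps // 2].sum(axis=1), sym[:, sps // 2:].sum(axis=1)
    nrz_stat = np.abs(first + second).sum()
    man_stat = np.abs(first - second).sum()
    return "NRZ" if nrz_stat > man_stat else "Manchester"

# Demo: Manchester-coded random bits at 8 samples/bit plus noise
rng = np.random.default_rng(3)
bits = rng.choice([-1.0, 1.0], size=200)
manchester = np.concatenate([np.r_[np.full(4, b), np.full(4, -b)] for b in bits])
rx = manchester + 0.7 * rng.standard_normal(manchester.size)
print(classify_format(rx, sps=8))   # expect "Manchester"
```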
NASA Technical Reports Server (NTRS)
Singh, S. P.
1979-01-01
The computer software developed to set up a method for Wiener spectrum analysis of photographic films is presented. This method is used for the quantitative analysis of the autoradiographic enhancement process. The software requirements and design for the autoradiographic enhancement process are given along with the program listings and the user's manual. A software description and the program listing modifications of the data analysis software are included.
NASA Astrophysics Data System (ADS)
Ortega, Jesus Daniel
This work focuses on the development of a solar thermal receiver for a supercritical carbon dioxide (sCO2) Brayton power cycle producing ~1 MWe. Closed-loop sCO2 Brayton cycles are being evaluated in combination with concentrating solar power to provide higher thermal-to-electric conversion efficiencies relative to conventional steam Rankine cycles. High temperatures (923-973 K) and pressures (20-25 MPa) are required in the solar receiver to achieve thermal efficiencies of ~50%, making concentrating solar power (CSP) technologies a competitive alternative to current power generation methods. In this study, the CSP receiver is required to achieve an outlet temperature of 923 K at 25 MPa or 973 K at 20 MPa to meet the operating needs. To obtain compatible receiver tube materials, an extensive materials review was performed based on the ASME Boiler and Pressure Vessel Code and the ASME B31.1 and B31.3 piping codes. Subsequently, a thermal-structural model was developed using commercial computational fluid dynamics (CFD) and structural mechanics software for designing and analyzing a tubular receiver that could provide the heat input for a ~2 MWth plant. These results were used to perform an analytical cumulative-damage creep-fatigue analysis to estimate the working life of the tubes. In sequence, an optical-thermal-fluid model was developed to evaluate the resulting thermal efficiency of the tubular receiver illuminated by the NSTTF heliostat field. The ray-tracing tool SolTrace was used to obtain the heat-flux distribution on the surfaces of the receiver. The K-ω SST turbulence model and the P-1 radiation model used in Fluent were coupled with SolTrace to provide the heat flux distribution on the receiver surface. The creep-fatigue analysis displays the damage accumulated due to cycling and the permanent deformation of the tubes. Nonetheless, the tubes are able to support the required lifetime. The receiver surface temperatures were found to be within the safe operational limit while exhibiting a receiver thermal efficiency of ~85%. Future work includes the completion of a cyclic loading analysis to be performed using the Larson-Miller creep model in nCode Design Life to corroborate the structural integrity of the receiver over the desired lifetime of ~10,000 cycles.
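The creep part of such a cumulative-damage assessment is often anchored on the Larson-Miller parameter and a time-fraction rule; the sketch below shows the arithmetic with placeholder coefficients, not the alloy data or nCode models used in this work.

```python
import math

def larson_miller(temp_K, time_hours, C=20.0):
    """Larson-Miller parameter: LMP = T * (C + log10(t_r))."""
    return temp_K * (C + math.log10(time_hours))

def creep_rupture_hours(temp_K, lmp_at_stress, C=20.0):
    """Invert the LMP relation to get rupture time at temperature temp_K,
    given the LMP value for the operating stress (taken from the material's
    LMP-vs-stress curve, which is not provided here)."""
    return 10 ** (lmp_at_stress / temp_K - C)

# Time-fraction creep damage over a daily cycle (all numbers illustrative)
hold_hours_per_cycle = 6.0
t_rupture = creep_rupture_hours(temp_K=1000.0, lmp_at_stress=24_000.0)  # 10,000 h here
damage_per_cycle = hold_hours_per_cycle / t_rupture
print("cycles until creep damage reaches 1:", round(1.0 / damage_per_cycle))
```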
Zhao, P Y; Yu, X; Yang, K; Feng, S Y; Wang, F X; Wang, B Y
2016-05-01
To understand the efficacy of antiretroviral therapy for people living with HIV/AIDS and its influencing factors, and to provide evidence to improve the treatment of HIV infection and AIDS and the quality of life of the patients. A cross-sectional study was conducted in designated AIDS hospitals in Harbin. A questionnaire was used to collect information on the patients receiving treatment in these hospitals. The statistical analysis was done with SAS 9.2 and Excel 2010. Univariate analysis was performed with the t test and multivariate analysis was performed with an ordinal logistic regression model. The Wilcoxon rank sum test was conducted to compare the CD4(+) T lymphocyte counts. The number of patients receiving antiretroviral therapy has increased in recent years. The HIV infection route was mainly homosexual contact. The CD4(+) T lymphocyte counts of the patients increased to different degrees after ≥6 months of treatment (P<0.01). Household income (P<0.05), adherence to the treatment plan (P<0.05), social relationship (P<0.05), concern about economic cost (P<0.01), medication compliance (P<0.01), and initial CD4(+) T lymphocyte level (P<0.01) were the influencing factors for antiretroviral therapy efficacy. In the designated hospitals in Harbin, the number of patients receiving HIV antiretroviral therapy kept increasing and the efficacy of the treatment was evident.
Infusing Reliability Techniques into Software Safety Analysis
NASA Technical Reports Server (NTRS)
Shi, Ying
2015-01-01
Software safety analysis for a large software intensive system is always a challenge. Software safety practitioners need to ensure that software related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.
Disseminating Metaproteomic Informatics Capabilities and Knowledge Using the Galaxy-P Framework
Easterly, Caleb; Gruening, Bjoern; Johnson, James; Kolmeder, Carolin A.; Kumar, Praveen; May, Damon; Mehta, Subina; Mesuere, Bart; Brown, Zachary; Elias, Joshua E.; Hervey, W. Judson; McGowan, Thomas; Muth, Thilo; Rudney, Joel; Griffin, Timothy J.
2018-01-01
The impact of microbial communities, also known as the microbiome, on human health and the environment is receiving increased attention. Studying translated gene products (proteins) and comparing metaproteomic profiles may elucidate how microbiomes respond to specific environmental stimuli, and interact with host organisms. Characterizing proteins expressed by a complex microbiome and interpreting their functional signature requires sophisticated informatics tools and workflows tailored to metaproteomics. Additionally, there is a need to disseminate these informatics resources to researchers undertaking metaproteomic studies, who could use them to make new and important discoveries in microbiome research. The Galaxy for proteomics platform (Galaxy-P) offers an open source, web-based bioinformatics platform for disseminating metaproteomics software and workflows. Within this platform, we have developed easily-accessible and documented metaproteomic software tools and workflows aimed at training researchers in their operation and disseminating the tools for more widespread use. The modular workflows encompass the core requirements of metaproteomic informatics: (a) database generation; (b) peptide spectral matching; (c) taxonomic analysis and (d) functional analysis. Much of the software available via the Galaxy-P platform was selected, packaged and deployed through an online metaproteomics “Contribution Fest“ undertaken by a unique consortium of expert software developers and users from the metaproteomics research community, who have co-authored this manuscript. These resources are documented on GitHub and freely available through the Galaxy Toolshed, as well as a publicly accessible metaproteomics gateway Galaxy instance. These documented workflows are well suited for the training of novice metaproteomics researchers, through online resources such as the Galaxy Training Network, as well as hands-on training workshops. Here, we describe the metaproteomics tools available within these Galaxy-based resources, as well as the process by which they were selected and implemented in our community-based work. We hope this description will increase access to and utilization of metaproteomics tools, as well as offer a framework for continued community-based development and dissemination of cutting edge metaproteomics software. PMID:29385081
Disseminating Metaproteomic Informatics Capabilities and Knowledge Using the Galaxy-P Framework.
Blank, Clemens; Easterly, Caleb; Gruening, Bjoern; Johnson, James; Kolmeder, Carolin A; Kumar, Praveen; May, Damon; Mehta, Subina; Mesuere, Bart; Brown, Zachary; Elias, Joshua E; Hervey, W Judson; McGowan, Thomas; Muth, Thilo; Nunn, Brook; Rudney, Joel; Tanca, Alessandro; Griffin, Timothy J; Jagtap, Pratik D
2018-01-31
The impact of microbial communities, also known as the microbiome, on human health and the environment is receiving increased attention. Studying translated gene products (proteins) and comparing metaproteomic profiles may elucidate how microbiomes respond to specific environmental stimuli, and interact with host organisms. Characterizing proteins expressed by a complex microbiome and interpreting their functional signature requires sophisticated informatics tools and workflows tailored to metaproteomics. Additionally, there is a need to disseminate these informatics resources to researchers undertaking metaproteomic studies, who could use them to make new and important discoveries in microbiome research. The Galaxy for proteomics platform (Galaxy-P) offers an open source, web-based bioinformatics platform for disseminating metaproteomics software and workflows. Within this platform, we have developed easily-accessible and documented metaproteomic software tools and workflows aimed at training researchers in their operation and disseminating the tools for more widespread use. The modular workflows encompass the core requirements of metaproteomic informatics: (a) database generation; (b) peptide spectral matching; (c) taxonomic analysis and (d) functional analysis. Much of the software available via the Galaxy-P platform was selected, packaged and deployed through an online metaproteomics "Contribution Fest" undertaken by a unique consortium of expert software developers and users from the metaproteomics research community, who have co-authored this manuscript. These resources are documented on GitHub and freely available through the Galaxy Toolshed, as well as a publicly accessible metaproteomics gateway Galaxy instance. These documented workflows are well suited for the training of novice metaproteomics researchers, through online resources such as the Galaxy Training Network, as well as hands-on training workshops. Here, we describe the metaproteomics tools available within these Galaxy-based resources, as well as the process by which they were selected and implemented in our community-based work. We hope this description will increase access to and utilization of metaproteomics tools, as well as offer a framework for continued community-based development and dissemination of cutting edge metaproteomics software.
NASA Technical Reports Server (NTRS)
Pons, R. L.; Grigsby, C. E.
1980-01-01
Activities planned for phase 2 of the Small Community Solar Thermal Power Experiment (PFDR) program are summarized, with emphasis on a dish-Rankine point-focusing distributed receiver solar thermal electric system. Major design efforts include: (1) development of an advanced-concept indirect-heated receiver; (2) development of hardware and software for a totally unmanned power plant control system; (3) implementation of a hybrid digital simulator which will validate plant operation prior to field testing; and (4) the acquisition of an efficient organic Rankine cycle power conversion unit. Preliminary performance analyses indicate that a mass-produced dish-Rankine PFDR system is potentially capable of producing electricity at a levelized busbar energy cost of 60 to 70 mills per kWh and with a capital cost of about $1300 per kW.
Modeling Constellation Virtual Missions Using the Vdot(Trademark) Process Management Tool
NASA Technical Reports Server (NTRS)
Hardy, Roger; ONeil, Daniel; Sturken, Ian; Nix, Michael; Yanez, Damian
2011-01-01
The authors have identified a software tool suite that will support NASA's Virtual Mission (VM) effort. This is accomplished by transforming a spreadsheet database of mission events, task inputs and outputs, timelines, and organizations into process visualization tools and a Vdot process management model that includes embedded analysis software as well as requirements and information related to data manipulation and transfer. This paper describes the progress to date, the application of the Virtual Mission not only to Constellation but to other architectures, and the pertinence to other aerospace applications. Vdot's intuitive visual interface brings VMs to life by turning static, paper-based processes into active, electronic processes that can be deployed, executed, managed, verified, and continuously improved. A VM can be executed using a computer-based, human-in-the-loop, real-time format, under the direction and control of the NASA VM Manager. Engineers in the various disciplines will not have to be Vdot-proficient but rather can fill out on-line, Excel-type databases with the mission information discussed above. The authors' tool suite converts this database into several process visualization tools for review and into Microsoft Project, which can be imported directly into Vdot. Many tools can be embedded directly into Vdot, and when the necessary data/information is received from a preceding task, the analysis can be initiated automatically. Other NASA analysis tools are too complex for this process, but Vdot automatically notifies the tool user that the data has been received and analysis can begin. The VM can be simulated from end to end using the authors' tool suite. The planned approach for the Vdot-based process simulation is to generate the process model from a database; other advantages of this semi-automated approach are that the participants can be geographically remote and that, after refining the process models via the human-in-the-loop simulation, the system can evolve into a process management server for the actual process.
Creating Interactive User Feedback in DGS Using Scripting Interfaces
ERIC Educational Resources Information Center
Fest, Andreas
2010-01-01
Feedback is an important component of interactive learning software. A conclusion from cognitive learning theory is that good software must give the learner more information about what he did. Following the ideas of constructivist learning theory the user should be in control of both the time and the level of feedback he receives. At the same time…
26 CFR 1.167(a)-14 - Treatment of certain intangible property excluded from section 197.
Code of Federal Regulations, 2014 CFR
2014-04-01
... of computer software that is section 179 property, as defined in section 179(d)(1)(A)(ii), must be... include certain computer software and certain other separately acquired rights, such as rights to receive... subject to the allowance for depreciation under section 167(a). (b) Computer software—(1) In general. The...
26 CFR 1.167(a)-14 - Treatment of certain intangible property excluded from section 197.
Code of Federal Regulations, 2012 CFR
2012-04-01
... of computer software that is section 179 property, as defined in section 179(d)(1)(A)(ii), must be... include certain computer software and certain other separately acquired rights, such as rights to receive... subject to the allowance for depreciation under section 167(a). (b) Computer software—(1) In general. The...
26 CFR 1.167(a)-14 - Treatment of certain intangible property excluded from section 197.
Code of Federal Regulations, 2011 CFR
2011-04-01
... of computer software that is section 179 property, as defined in section 179(d)(1)(A)(ii), must be... include certain computer software and certain other separately acquired rights, such as rights to receive... subject to the allowance for depreciation under section 167(a). (b) Computer software—(1) In general. The...
26 CFR 1.167(a)-14 - Treatment of certain intangible property excluded from section 197.
Code of Federal Regulations, 2010 CFR
2010-04-01
... of computer software that is section 179 property, as defined in section 179(d)(1)(A)(ii), must be... include certain computer software and certain other separately acquired rights, such as rights to receive... subject to the allowance for depreciation under section 167(a). (b) Computer software—(1) In general. The...
26 CFR 1.167(a)-14 - Treatment of certain intangible property excluded from section 197.
Code of Federal Regulations, 2013 CFR
2013-04-01
... of computer software that is section 179 property, as defined in section 179(d)(1)(A)(ii), must be... include certain computer software and certain other separately acquired rights, such as rights to receive... subject to the allowance for depreciation under section 167(a). (b) Computer software—(1) In general. The...
A Randomized Trial of an Elementary School Mathematics Software Intervention: Spatial-Temporal Math
ERIC Educational Resources Information Center
Rutherford, Teomara; Farkas, George; Duncan, Greg; Burchinal, Margaret; Kibrick, Melissa; Graham, Jeneen; Richland, Lindsey; Tran, Natalie; Schneider, Stephanie; Duran, Lauren; Martinez, Michael E.
2014-01-01
Fifty-two low performing schools were randomly assigned to receive Spatial-Temporal (ST) Math, a supplemental mathematics software and instructional program, in second/third or fourth/fifth grades or to a business-as-usual control. Analyses reveal a negligible effect of ST Math on mathematics scores, which did not differ significantly across…
ERIC Educational Resources Information Center
Palumbo, Debra L; Palumbo, David B.
1993-01-01
Computer-based problem-solving software exposure was compared to Lego TC LOGO instruction. Thirty fifth graders received either Lego LOGO instruction, which couples Lego building block activities with LOGO computer programming, or instruction with various problem-solving computer programs. Although both groups showed significant progress, the Lego…
AWIPS II in the University Community: Unidata's efforts and capabilities of the software
NASA Astrophysics Data System (ADS)
Ramamurthy, Mohan; James, Michael
2015-04-01
The Advanced Weather Interactive Processing System, version II (AWIPS II) is a weather forecasting, display and analysis tool that is used by the National Oceanic and Atmospheric Administration/National Weather Service (NOAA/NWS) and the National Centers for Environmental Prediction (NCEP) to ingest, analyze and disseminate operational weather data. The AWIPS II software is built on a Service Oriented Architecture, takes advantage of open source software, and its design affords expandability, flexibility, and portability. Since many university meteorology programs are eager to use the same tools used by NWS forecasters, Unidata community interest in AWIPS II is high. The Unidata Program Center (UPC) has worked closely with NCEP staff during AWIPS II development in order to devise a way to make it available to the university community. The Unidata AWIPS II software was released in beta form in 2014, and it incorporates a number of key changes to the baseline U.S. National Weather Service release to process and display additional data formats and run all components in a single-server standalone configuration. In addition to making available open-source instances of the software libraries that can be downloaded and run at any university, Unidata has also deployed the data-server side of AWIPS II, known as EDEX, in the Amazon Web Services and Microsoft Azure cloud environments. In this setup, universities receive all of the data from remote cloud instances, while they only have to run the AWIPS II client, known as CAVE, to analyze and visualize the data. In this presentation, we will describe Unidata's AWIPS II efforts, including the capabilities of the software in visualizing many different types of real-time meteorological data and its myriad uses in university and other settings.
NASA Astrophysics Data System (ADS)
Musaoglu, N.; Saral, A.; Seker, D. Z.
2012-12-01
Flooding is one of the major natural disasters not only in Turkey but all over the world, and it causes serious damage and harm. It is estimated that, of the total economic loss caused by all kinds of disasters, 40% was due to floods. In July 1995, the Ayamama Creek in Istanbul was flooded; the insurance sector received around 1,200 claim notices during that period, and insurance companies had to pay a total of $40 million in claims. In 2009, the same creek flooded again, killing 31 people over two days, and insurance firms paid around €150 million in claims for the damages. To address these kinds of problems, modern tools such as GIS and remote sensing should be utilized. In this study, software was designed for flood risk analysis using the Analytic Hierarchy Process (AHP) and Information Diffusion (InfoDif) methods. In the developed software, five evaluation criteria were taken into account: slope, aspect, elevation, geology and land use, which were extracted from the satellite sensor data. The Digital Elevation Model (DEM) of the Ayamama River Basin was acquired from the SPOT 5 satellite image with 2.5 meter spatial resolution. Slope and aspect values of the study basin were extracted from this DEM. The land use of the Ayamama Creek was obtained by performing object-oriented nearest-neighbor classification by image segmentation on a SPOT 5 image dated 2010. All produced data were used as input for the Multi-Criteria Decision Analysis (MCDA) part of this software. The criteria and their sub-criteria were weighted and flood vulnerability was determined with MCDA-AHP. In addition, daily flood data were collected from the Florya Meteorological Station for the years 1975 to 2009, and the daily flood peak discharge was calculated with the Soil Conservation Service Curve Number (SCS-CN) method and used as input to the InfoDif part of the software. The results obtained were verified using ground truth data, and it was clearly seen that the developed (TRA) software, which uses two different methods for flood risk analysis, can be more effective than conventional techniques for different decision problems and produce more reliable results in a short time.
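For illustration, the following is a minimal Python sketch of how AHP criterion weights can be derived from a pairwise comparison matrix via the principal eigenvector, as in the MCDA-AHP step described above; the comparison values below are hypothetical placeholders, not the weights used in the study.

import numpy as np

# Hypothetical pairwise comparison matrix for five criteria:
# slope, aspect, elevation, geology, land use (Saaty 1-9 scale, reciprocal)
A = np.array([
    [1.0, 3.0, 2.0, 5.0, 1.0],
    [1/3, 1.0, 1/2, 2.0, 1/3],
    [1/2, 2.0, 1.0, 3.0, 1/2],
    [1/5, 1/2, 1/3, 1.0, 1/5],
    [1.0, 3.0, 2.0, 5.0, 1.0],
])

# The principal eigenvector gives the criterion weights
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency ratio (random index for n=5 is about 1.12)
n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)
cr = ci / 1.12
print("weights:", np.round(weights, 3), "CR:", round(cr, 3))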
NASA Astrophysics Data System (ADS)
Thongtan, Thayathip; Tirawanichakul, Pawit; Satirapod, Chalermchon
2017-12-01
Each GNSS constellation operates its own system time, namely GPS system time (GPST), GLONASS system time (GLONASST), BeiDou system time (BDT) and Galileo system time (GST). These can be traced back to the Coordinated Universal Time (UTC) scale and are aligned to GPST. This paper estimates the receiver clock offsets with respect to three timescales: GPST, GLONASST and BDT. The two measurement scenarios use two identical multi-GNSS geodetic receivers connected to the same geodetic antenna through a splitter. One receiver is driven by its internal oscillator and the other receiver is connected to an external frequency oscillator, the caesium frequency standard kept as the Thailand standard timescale at the National Institute of Metrology (Thailand), called UTC(NIMT). Three weeks of data were observed at a 30-second sample rate. The receiver clock offsets with respect to the three system times are estimated and analysed through the geodetic technique of static Precise Point Positioning (PPP) using a data processing software package developed by Wuhan University, the Positioning And Navigation Data Analyst (PANDA) software. The estimated receiver clock offsets are around 32, 33 and 18 nanoseconds from GPST, GLONASST and BDT, respectively. This experiment indicates initially that each timescale can be inter-operated with GPST, and further measurements of the receiver internal delay have to be made for clock comparisons, especially for the high-accuracy clocks at timing laboratories.
The personal receiving document management and the realization of email function in OAS
NASA Astrophysics Data System (ADS)
Li, Biqing; Li, Zhao
2017-05-01
This software is an independent software system suitable for small and medium enterprises. It contains personal office, scientific research project management and system management functions, runs independently in the relevant environment, and addresses practical needs. It is developed using the currently popular B/S (browser/server) structure and ASP.NET technology, with the Windows 7 operating system, Microsoft SQL Server 2005 as the database and Microsoft Visual Studio 2008 as the development platform.
Nuclear and Particle Physics Simulations: The Consortium of Upper-Level Physics Software
NASA Astrophysics Data System (ADS)
Bigelow, Roberta; Moloney, Michael J.; Philpott, John; Rothberg, Joseph
1995-06-01
The Consortium for Upper Level Physics Software (CUPS) has developed a comprehensive series of nine book/software packages that Wiley will publish in FY '95 and '96. CUPS is an international group of 27 physicists, all with extensive backgrounds in the research, teaching, and development of instructional software. The project is being supported by the National Science Foundation (PHY-9014548), and it has received other support from the IBM Corp., Apple Computer Corp., and George Mason University. The simulations being developed are: Astrophysics, Classical Mechanics, Electricity & Magnetism, Modern Physics, Nuclear and Particle Physics, Quantum Mechanics, Solid State, Thermal and Statistical, and Waves and Optics.
A software engineering approach to expert system design and verification
NASA Technical Reports Server (NTRS)
Bochsler, Daniel C.; Goodwin, Mary Ann
1988-01-01
Software engineering design and verification methods for developing expert systems are not yet well defined. Integration of expert system technology into software production environments will require effective software engineering methodologies to support the entire life cycle of expert systems. The software engineering methods used to design and verify an expert system, RENEX, are discussed. RENEX demonstrates autonomous rendezvous and proximity operations, including replanning trajectory events and subsystem fault detection, onboard a space vehicle during flight. The RENEX designers utilized a number of software engineering methodologies to deal with the complex problems inherent in this system. An overview is presented of the methods utilized. Details of the verification process receive special emphasis. The benefits and weaknesses of the methods for supporting the development life cycle of expert systems are evaluated, and recommendations are made based on the overall experiences with the methods.
SureTrak Probability of Impact Display
NASA Technical Reports Server (NTRS)
Elliott, John
2012-01-01
The SureTrak Probability of Impact Display software was developed for use during rocket launch operations. The software displays probability of impact information for each ship near the hazardous area during the time immediately preceding the launch of an unguided vehicle. Wallops range safety officers need to be sure that the risk to humans is below a certain threshold during each use of the Wallops Flight Facility Launch Range. Under the variable conditions that can exist at launch time, the decision to launch must be made in a timely manner to ensure a successful mission while not exceeding those risk criteria. Range safety officers need a tool that can give them the needed probability of impact information quickly, and in a format that is clearly understandable. This application is meant to fill that need. The software is a reuse of part of software developed for an earlier project: Ship Surveillance Software System (S4). The S4 project was written in C++ using Microsoft Visual Studio 6. The data structures and dialog templates from it were copied into a new application that calls the implementation of the algorithms from S4 and displays the results as needed. In the S4 software, the list of ships in the area was received from one local radar interface and from operators who entered the ship information manually. The SureTrak Probability of Impact Display application receives ship data from two local radars as well as the SureTrak system, eliminating the need for manual data entry.
Inertial Upper Stage (IUS) software analysis
NASA Technical Reports Server (NTRS)
Grayson, W. L.; Nickel, C. E.; Rose, P. L.; Singh, R. P.
1979-01-01
The Inertial Upper Stage (IUS) System, an extension of the Space Transportation System (STS) operating regime to include higher orbits, orbital plane changes, geosynchronous orbits, and interplanetary trajectories is presented. The IUS software design, the IUS software interfaces with other systems, and the cost effectiveness in software verification are described. Tasks of the IUS discussed include: (1) design analysis; (2) validation requirements analysis; (3) interface analysis; and (4) requirements analysis.
Multivariate Methods for Meta-Analysis of Genetic Association Studies.
Dimou, Niki L; Pantavou, Katerina G; Braliou, Georgia G; Bagos, Pantelis G
2018-01-01
Multivariate meta-analysis of genetic association studies and genome-wide association studies has received remarkable attention as it improves the precision of the analysis. Here, we review, summarize and present in a unified framework methods for multivariate meta-analysis of genetic association studies and genome-wide association studies. Starting with the statistical methods used for robust analysis and genetic model selection, we present in brief univariate methods for meta-analysis and we then scrutinize multivariate methodologies. Multivariate models of meta-analysis for single gene-disease association studies, including models for haplotype association studies, multiple linked polymorphisms and multiple outcomes, are discussed. The popular Mendelian randomization approach and special cases of meta-analysis addressing issues such as the assumption of the mode of inheritance, deviation from Hardy-Weinberg equilibrium and gene-environment interactions are also presented. All available methods are enriched with practical applications, and methodologies that could be developed in the future are discussed. Links to all available software implementing multivariate meta-analysis methods are also provided.
Non-Contact Technique for Determining the Mechanical Stress in thin Films on Wafers by Profiler
NASA Astrophysics Data System (ADS)
Djuzhev, N. A.; Dedkova, A. A.; E Gusev, E.; Makhiboroda, M. A.; Glagolev, P. Y.
2017-04-01
This paper presents an algorithm, implemented as a MATLAB software package, for analysing the surface relief in order to calculate the mechanical stresses along a selected direction on the wafer. The method allows measurement of the sample in a local area. It provides a visual representation of the data and yields the stress distribution over the wafer surface. The automated analysis process reduces the likelihood of researcher error and saves time during processing of the results. By carrying out several measurements, a map of the wafer can be drawn to predict the yield of crystals. Using this technique, the mechanical stresses of a thermal silicon oxide film on a silicon substrate were measured. Analysis of the results showed the objectivity and reliability of the calculations. The method can be used for selecting the optimal parameters of the material deposition conditions. In the TCAD device-technology simulation software, the process time, temperature and oxidation environment of the sample were defined to obtain the specified dielectric film thickness. The thermal stresses in the silicon-silicon oxide system were calculated. There is good agreement between the numerical simulation and the analytical calculation. It is shown that the origin of the mechanical stress is not limited to the difference in the thermal expansion coefficients of the materials.
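The paper does not spell out its stress formula, but a standard way to obtain film stress from profiler-measured wafer curvature is Stoney's equation; the following Python sketch illustrates that approach under the assumption that it applies here, with hypothetical wafer dimensions and curvature values.

import numpy as np

# Stoney's formula (a standard approach, assumed here for illustration):
# sigma_f = E_s / (1 - nu_s) * t_s**2 / (6 * t_f) * (1/R_post - 1/R_pre)
def film_stress_stoney(t_substrate, t_film, radius_pre, radius_post,
                       E_substrate=130e9, nu_substrate=0.28):
    """Film stress (Pa) from wafer curvature change; Si substrate defaults assumed."""
    biaxial_modulus = E_substrate / (1.0 - nu_substrate)
    curvature_change = 1.0 / radius_post - 1.0 / radius_pre
    return biaxial_modulus * t_substrate**2 * curvature_change / (6.0 * t_film)

# Hypothetical numbers: 525 um Si substrate, 500 nm thermal oxide,
# curvature radii (m) measured by the profiler before/after oxidation
sigma = film_stress_stoney(t_substrate=525e-6, t_film=500e-9,
                           radius_pre=1e9, radius_post=-60.0)
# Negative values indicate compressive film stress
print(f"estimated film stress: {sigma / 1e6:.1f} MPa")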
Becker, A S; Blüthgen, C; Phi van, V D; Sekaggya-Wiltshire, C; Castelnuovo, B; Kambugu, A; Fehr, J; Frauenfelder, T
2018-03-01
To evaluate the feasibility of Deep Learning-based detection and classification of pathological patterns in a set of digital photographs of chest X-ray (CXR) images of tuberculosis (TB) patients. In this prospective, observational study, patients with previously diagnosed TB were enrolled. Photographs of their CXRs were taken using a consumer-grade digital still camera. The images were stratified by pathological patterns into classes: cavity, consolidation, effusion, interstitial changes, miliary pattern or normal examination. Image analysis was performed with commercially available Deep Learning software in two steps. Pathological areas were first localised; detected areas were then classified. Detection was assessed using receiver operating characteristics (ROC) analysis, and classification using a confusion matrix. The study cohort was 138 patients with human immunodeficiency virus (HIV) and TB co-infection (median age 34 years, IQR 28-40); 54 patients were female. Localisation of pathological areas was excellent (area under the ROC curve 0.82). The software could perfectly distinguish pleural effusions from intraparenchymal changes. The most frequent misclassifications were consolidations as cavitations, and miliary patterns as interstitial patterns (and vice versa). Deep Learning analysis of CXR photographs is a promising tool. Further efforts are needed to build larger, high-quality data sets to achieve better diagnostic performance.
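The commercial software used in the study is not detailed here, but the two evaluation steps it mentions (ROC analysis for detection, a confusion matrix for classification) can be sketched with scikit-learn; the labels and scores below are hypothetical, not the study's data.

import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve, confusion_matrix

# Hypothetical detection output: 1 = pathological area present, score = model confidence
y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 0])
y_score = np.array([0.9, 0.2, 0.7, 0.6, 0.4, 0.1, 0.8, 0.3, 0.65, 0.5])
auc = roc_auc_score(y_true, y_score)
fpr, tpr, thresholds = roc_curve(y_true, y_score)
print(f"detection AUC = {auc:.2f}")

# Hypothetical classification output for a few of the pattern classes
labels = ["cavity", "consolidation", "miliary", "interstitial"]
y_true_cls = ["cavity", "consolidation", "miliary", "consolidation", "miliary"]
y_pred_cls = ["consolidation", "cavity", "interstitial", "consolidation", "miliary"]
cm = confusion_matrix(y_true_cls, y_pred_cls, labels=labels)
print(cm)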
Software Safety Progress in NASA
NASA Technical Reports Server (NTRS)
Radley, Charles F.
1995-01-01
NASA has developed guidelines for development and analysis of safety-critical software. These guidelines have been documented in a Guidebook for Safety Critical Software Development and Analysis. The guidelines represent a practical 'how to' approach, to assist software developers and safety analysts in cost effective methods for software safety. They provide guidance in the implementation of the recent NASA Software Safety Standard NSS-1740.13 which was released as 'Interim' version in June 1994, scheduled for formal adoption late 1995. This paper is a survey of the methods in general use, resulting in the NASA guidelines for safety critical software development and analysis.
Optimization of Passive Coherent Receiver System Placement
2013-09-01
spheroid object with a constant radar cross section (RCS). Additionally, the receiver and transmitters are assumed to be notional isotropic antennae.
Testing of the International Space Station and X-38 Crew Return Vehicle GPS Receiver
NASA Technical Reports Server (NTRS)
Simpson, James; Campbell, Chip; Carpenter, Russell; Davis, Ed; Kizhner, Semion; Lightsey, E. Glenn; Davis, George; Jackson, Larry
1999-01-01
This paper discusses the process and results of the performance testing of the GPS receiver planned for use on the International Space Station (ISS) and the X-38 Crew Return Vehicle (CRV). The receiver is a Force-19 unit manufactured by Trimble Navigation and modified in software by the NASA Goddard Space Flight Center (GSFC) to perform navigation and attitude determination in space. The receiver is the primary source of navigation and attitude information for ISS and CRV. Engineers at GSFC have developed and tested the new receiver with a Global Simulation Systems Ltd (GSS) GPS Signal Generator (GPSSG). This paper documents the unique aspects of ground testing a GPS receiver that is designed for use in space. A discussion of the design of tests using the GPSSG, documentation, data capture, data analysis, and lessons learned will precede an overview of the performance of the new receiver. A description of the challenges that were overcome during this testing exercise will be presented. Results from testing show that the receiver will be within or near the specifications for ISS attitude and navigation performance. The process for verifying other requirements such as Time to First Fix, Time to First Attitude, selection/deselection of a specific GPS satellite vehicles (SV), minimum signal strength while still obtaining attitude and navigation, navigation and attitude output coverage, GPS week rollover, and Y2K requirements are also given in this paper.
Testing of the International Space Station and X-38 Crew Return Vehicle GPS Receiver
NASA Technical Reports Server (NTRS)
Simpson, James; Campbell, Chip; Carpenter, Russell; Davis, Ed; Kizhner, Semion; Lightsey, E. Glenn; Davis, George; Jackson, Larry
1999-01-01
This paper discusses the process and results of the performance testing of the GPS receiver planned for use on the International Space Station (ISS) and the X-38 Crew Return Vehicle (CRV). The receiver is a Force-19 unit manufactured by Trimble Navigation and modified in software by the NASA Goddard Space Flight Center (GSFC) to perform navigation and attitude determination in space. The receiver is the primary source of navigation and attitude information for ISS and CRV. Engineers at GSFC have developed and tested the new receiver with a Global Simulation Systems Ltd (GSS) GPS Signal Generator (GPSSG). This paper documents the unique aspects of ground testing a GPS receiver that is designed for use in space. A discussion of the design of tests using the GPSSG, documentation, data capture, data analysis, and lessons learned will precede an overview of the performance of the new receiver. A description of the challenges that were overcome during this testing exercise will be presented. Results from testing show that the receiver will be within or near the specifications for ISS attitude and navigation performance. The process for verifying other requirements such as Time to First Fix, Time to First Attitude, selection/deselection of a specific GPS satellite vehicles (SV), minimum signal strength while still obtaining attitude and navigation, navigation and attitude output coverage, GPS week rollover, and Y2K requirements are also given in this paper.
Testing of the International Space Station and X-38 Crew Return Vehicle GPS Receiver
NASA Technical Reports Server (NTRS)
Simpson, James; Lightsey, Glenn; Campbell, Chip; Carpenter, Russell; Davis, George; Jackson, Larry; Davis, Ed; Kizhner, Semion
1999-01-01
This paper discusses the process and results of the performance testing of the GPS receiver planned for use on the International Space Station (ISS) and the X-38 Crew Return Vehicle (CRV). The receiver is a Force-19 unit manufactured by Trimble Navigation and modified in software by NASA's Goddard Space Flight Center (GSFC) to perform navigation and attitude determination in space. The receiver is the primary source of navigation and attitude information for ISS and CRV. Engineers at GSFC have developed and tested the new receiver with a Global Simulation Systems Ltd (GSS) GPS Signal Generator (GPSSG). This paper documents the unique aspects of ground testing a GPS receiver that is designed for use in space. A discussion of the design of tests using the GPSSG, documentation, data capture, data analysis, and lessons learned will precede an overview of the performance of the new receiver. A description of the challenges that were overcome during this testing exercise will be presented. Results from testing show that the receiver will be within or near the specifications for ISS attitude and navigation performance. The process for verifying other requirements such as Time to First Fix, Time to First Attitude, selection/deselection of a specific GPS satellite vehicles (SV), minimum signal strength while still obtaining attitude and navigation, navigation and attitude output coverage, GPS week rollover, and Y2K requirements are also given in this paper.
Scintillation-Hardened GPS Receiver
NASA Technical Reports Server (NTRS)
Stephens, Donald R.
2015-01-01
CommLargo, Inc., has developed a scintillation-hardened Global Positioning System (GPS) receiver that improves reliability for low-orbit missions and complies with NASA's Space Telecommunications Radio System (STRS) architecture standards. A software-defined radio (SDR) implementation allows a single hardware element to function as either a conventional radio or as a GPS receiver, providing backup and redundancy for platforms such as the International Space Station (ISS) and high-value remote sensing platforms. The innovation's flexible SDR implementation reduces cost, weight, and power requirements. Scintillation hardening improves mission reliability and variability. In Phase I, CommLargo refactored an open-source GPS software package with Kalman filter-based tracking loops to improve performance during scintillation and also demonstrated improved navigation during a geomagnetic storm. In Phase II, the company generated a new field-programmable gate array (FPGA)-based GPS waveform to demonstrate on NASA's Space Communication and Navigation (SCaN) test bed.
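As a rough illustration of the Kalman filter-based tracking loops mentioned above (not CommLargo's actual implementation), the following Python sketch runs a two-state (carrier phase, Doppler frequency) Kalman filter on noisy phase-discriminator measurements; all parameters and noise levels are hypothetical.

import numpy as np

dt = 0.001                      # 1 ms integration interval (hypothetical)
F = np.array([[1.0, dt],        # state transition: carrier phase, Doppler (rad, rad/s)
              [0.0, 1.0]])
H = np.array([[1.0, 0.0]])      # discriminator observes carrier phase only
Q = np.diag([1e-4, 1e-1])       # process noise (oscillator/dynamics), assumed
R = np.array([[0.05]])          # discriminator noise variance, assumed

x = np.zeros((2, 1))            # initial phase and Doppler estimate
P = np.diag([1.0, 100.0])

rng = np.random.default_rng(0)
true_doppler = 35.0             # rad/s, hypothetical residual Doppler
true_phase = 0.0
for k in range(200):
    true_phase += true_doppler * dt
    z = np.array([[true_phase + rng.normal(scale=0.2)]])  # noisy phase measurement

    # Kalman predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Kalman update
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P

print(f"estimated Doppler: {x[1, 0]:.2f} rad/s (true {true_doppler} rad/s)")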
Testing of CMA-2000 Microwave Landing System (MLS) airborne receiver
NASA Astrophysics Data System (ADS)
Labreche, L.; Murfin, A. J.
1989-09-01
Microwave landing system (MLS) is a precision approach and landing guidance system which provides position information and various air to ground data. Position information is provided on a wide coverage sector and is determined by an azimuth angle measurement, an elevation angle measurement, and a range measurement. MLS performance standards and testing of the MLS airborne receiver is mainly governed by Technical Standard Order TSO-C104 issued by the Federal Aviation Administration. This TSO defines detailed test procedures for use in determining the required performance under standard and stressed conditions. It also imposes disciplines on software development and testing procedures. Testing performed on the CMA-2000 MLS receiver and methods used in its validation are described. A computer automated test system has been developed to test for compliance with RTCA/DO-177 Minimum Operation Performance Standards. Extensive software verification and traceability tests designed to ensure compliance with RTCA/DO-178 are outlined.
Sensory Interactive Teleoperator Robotic Grasping
NASA Technical Reports Server (NTRS)
Alark, Keli; Lumia, Ron
1997-01-01
As the technological world strives for efficiency, the need for economical equipment that increases operator proficiency in minimal time is fundamental. This system links a CCD camera, a controller and a robotic arm to a computer vision system to provide an alternative method of image analysis. The machine vision system which was employed possesses software tools for acquiring and analyzing images which are received through a CCD camera. After feature extraction on the object in the image was performed, information about the object's location, orientation and distance from the robotic gripper is sent to the robot controller so that the robot can manipulate the object.
Magnetospheric Multiscale Mission Navigation Performance During Apogee-Raising and Beyond
NASA Technical Reports Server (NTRS)
Farahmand, Mitra; Long, Anne; Hollister, Jacob; Rose, Julie; Godine, Dominic
2017-01-01
The primary objective of the Magnetospheric Multiscale (MMS) Mission is to study the magnetic reconnection phenomena in the Earth's magnetosphere. The MMS mission consists of four identical spinning spacecraft, with the science objectives requiring a tetrahedral formation in highly elliptical orbits. The MMS spacecraft are equipped with onboard orbit and time determination software, provided by a weak-signal Global Positioning System (GPS) Navigator receiver hosting the Goddard Enhanced Onboard Navigation System (GEONS). This paper presents the results of MMS navigation performance analysis during the Phase 2a apogee-raising campaign and the Phase 2b science segment of the mission.
NASA Technical Reports Server (NTRS)
2002-01-01
MarketMiner(R) Products, a line of automated marketing analysis tools manufactured by MarketMiner, Inc., can benefit organizations that perform significant amounts of direct marketing. MarketMiner received a Small Business Innovation Research (SBIR) contract from NASA's Johnson Space Center to develop the software as a data modeling tool for space mission applications. The technology was then built into the company's current products to provide decision support for business and marketing applications. With the tool, users gain valuable information about customers and prospects from existing data in order to increase sales and profitability. MarketMiner(R) is a registered trademark of MarketMiner, Inc.
Journal of Open Source Software (JOSS): design and first-year review
NASA Astrophysics Data System (ADS)
Smith, Arfon M.
2018-01-01
JOSS is a free and open-access journal that publishes articles describing research software across all disciplines. It has the dual goals of improving the quality of the software submitted and providing a mechanism for research software developers to receive credit. While designed to work within the current merit system of science, JOSS addresses the dearth of rewards for key contributions to science made in the form of software. JOSS publishes articles that encapsulate scholarship contained in the software itself, and its rigorous peer review targets the software components: functionality, documentation, tests, continuous integration, and the license. A JOSS article contains an abstract describing the purpose and functionality of the software, references, and a link to the software archive. JOSS published more than 100 articles in its first year, many from the scientific python ecosystem (including a number of articles related to astronomy and astrophysics). JOSS is a sponsored project of the nonprofit organization NumFOCUS and is an affiliate of the Open Source Initiative. In this presentation, I'll describe the motivation, design, and progress of the Journal of Open Source Software (JOSS) and how it compares to other avenues for publishing research software in astronomy.
Meta-analysis for psychological impact of breast reconstruction in patients with breast cancer.
Chen, Wanjing; Lv, Xiaoai; Xu, Xiaohong; Gao, Xiufei; Wang, Bei
2018-07-01
This meta-analysis aimed to evaluate the impact of breast reconstruction on the psychological aspects in patients with breast cancer. A literature search on the PubMed, Embase, ScienceDirect and Google Scholar databases was conducted up to September 2017. The pooled risk ratio (RR) or standard mean difference (SMD) and the corresponding 95% confidence intervals (CIs) were calculated using the RevMan 5.3 software. A total of 5 studies were included in this meta-analysis. There were 551 breast cancer patients receiving mastectomy plus breast reconstruction and 574 breast cancer patients receiving mastectomy alone. The results showed that breast reconstruction can significantly decrease the incidence of anxiety (RR = 0.62, 95% CI 0.47-0.82, P = 0.0006)/depression (RR = 0.54, 95% CI 0.32-0.93, P = 0.02) and the scale scores for evaluating anxiety (SMD = -0.20, 95% CI -0.37 to -0.03, P = 0.02)/depression (SMD = -0.22, 95% CI -0.39 to -0.66, P = 0.007) compared with mastectomy alone. Breast reconstruction after mastectomy was beneficial for reducing the psychological damage in patients with breast cancer.
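For illustration, here is a minimal Python sketch of a fixed-effect, inverse-variance pooling of study risk ratios on the log scale, the kind of pooled RR a tool such as RevMan reports; the 2x2 counts below are hypothetical placeholders, not those of the five included studies.

import numpy as np

# Hypothetical 2x2 counts per study: (events_recon, n_recon, events_mast, n_mast)
studies = [(12, 110, 25, 115), (8, 95, 18, 100), (15, 130, 22, 125)]

log_rr, weights = [], []
for e1, n1, e2, n2 in studies:
    rr = (e1 / n1) / (e2 / n2)
    var = 1/e1 - 1/n1 + 1/e2 - 1/n2        # variance of log(RR)
    log_rr.append(np.log(rr))
    weights.append(1.0 / var)               # inverse-variance weight

log_rr, weights = np.array(log_rr), np.array(weights)
pooled_log = np.sum(weights * log_rr) / np.sum(weights)
se = np.sqrt(1.0 / np.sum(weights))
ci_low, ci_high = np.exp(pooled_log - 1.96 * se), np.exp(pooled_log + 1.96 * se)
print(f"pooled RR = {np.exp(pooled_log):.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")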
Ultrasonics and space instrumentation
NASA Technical Reports Server (NTRS)
1987-01-01
The design topic selected was an outgrowth of the experimental design work done in the Fluid Behavior in Space experiment, which relies on the measurement of minute changes of the pressure and temperature to obtain reasonably accurate volume determinations. An alternative method of volume determination is the use of ultrasonic imaging. An ultrasonic wave system is generated by wall mounted transducer arrays. The interior liquid configuration causes reflection and refraction of the pattern so that analysis of the received wave system provides a description of the configuration and hence volume. Both continuous and chirp probe beams were used in a laboratory experiment simulating a surface wetting propellant. The hardware included a simulated tank with gaseous voids, transmitting and receiving transducers, transmitters, receivers, computer interface, and computer. Analysis software was developed for image generation and interpretation of results. Space instrumentation was pursued in support of a number of experiments under development for GAS flights. The program included thirty undergraduate students pursuing major qualifying project work under the guidance of eight faculty supported by a teaching assistant. Both mechanical and electrical engineering students designed and built several microprocessor systems to measure parameters such as temperature, acceleration, pressure, velocity, and circulation in order to determine combustion products, vortex formation, gas entrainment, EMR emissions from thunderstorms, and milli-g-accelerations due to crew motions.
Using software security analysis to verify the secure socket layer (SSL) protocol
NASA Technical Reports Server (NTRS)
Powell, John D.
2004-01-01
The National Aeronautics and Space Administration (NASA) has tens of thousands of networked computer systems and applications. Software security vulnerabilities present risks such as lost or corrupted data, information theft, and unavailability of critical systems. These risks represent potentially enormous costs to NASA. The NASA Code Q research initiative 'Reducing Software Security Risk (RSSR) Through an Integrated Approach' offers, among its capabilities, formal verification of software security properties through the use of model-based verification (MBV) to address software security risks. [1,2,3,4,5,6] MBV is a formal approach to software assurance that combines analysis of software, via abstract models, with technology, such as model checkers, that provides automation of the mechanical portions of the analysis process. This paper will discuss: the need for formal analysis to assure software systems with respect to software security and why testing alone cannot provide it; the means by which MBV with a Flexible Modeling Framework (FMF) accomplishes the necessary analysis task; and an example of FMF-style MBV in the verification of properties over the Secure Socket Layer (SSL) communication protocol as a demonstration.
Development of Automated Image Analysis Software for Suspended Marine Particle Classification
2003-09-30
Scott Samson, Center for Ocean Technology. The objective is to develop automated image analysis software to reduce the effort and time required for manual identification of plankton images.
A tool to include gamma analysis software into a quality assurance program.
Agnew, Christina E; McGarry, Conor K
2016-03-01
To provide a tool to enable gamma analysis software algorithms to be included in a quality assurance (QA) program. Four image sets were created, comprising two geometric images to independently test the distance to agreement (DTA) and dose difference (DD) elements of the gamma algorithm, a clinical step-and-shoot IMRT field and a clinical VMAT arc. The images were analysed using global and local gamma analysis with 2 in-house and 8 commercially available software packages, encompassing 15 software versions. The effect of image resolution on gamma pass rates was also investigated. All but one software package accurately calculated the gamma passing rate for the geometric images. Variation in global gamma passing rates of 1% at 3%/3 mm and over 2% at 1%/1 mm was measured between software packages and software versions with analysis of appropriately sampled images. This study provides a suite of test images and the gamma pass rates achieved for a selection of commercially available software. This image suite will enable validation of gamma analysis software within a QA program and provide a frame of reference by which to compare results reported in the literature from various manufacturers and software versions. Copyright © 2015. Published by Elsevier Ireland Ltd.
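As an illustration of what such software computes, here is a minimal 1-D global gamma index sketch in Python (dose difference normalised to the reference maximum, distance to agreement on a regular grid); the dose profiles and the 3%/3 mm criteria below are placeholders, not the test images from the study.

import numpy as np

def gamma_1d(ref_dose, eval_dose, spacing_mm, dd_percent=3.0, dta_mm=3.0):
    """Global 1-D gamma index for two dose profiles on the same regular grid."""
    dd_abs = dd_percent / 100.0 * ref_dose.max()   # global dose-difference criterion
    positions = np.arange(ref_dose.size) * spacing_mm
    gamma = np.empty(ref_dose.size)
    for i, (x_ref, d_ref) in enumerate(zip(positions, ref_dose)):
        dose_term = (eval_dose - d_ref) / dd_abs
        dist_term = (positions - x_ref) / dta_mm
        gamma[i] = np.sqrt(dose_term**2 + dist_term**2).min()
    return gamma

# Hypothetical reference and evaluated profiles (arbitrary units), 1 mm spacing
x = np.linspace(-20, 20, 41)
ref = np.exp(-x**2 / 80.0)
ev = np.exp(-(x - 0.8)**2 / 80.0) * 1.02          # small shift and scaling error
g = gamma_1d(ref, ev, spacing_mm=1.0)
print(f"gamma pass rate (gamma <= 1): {100.0 * np.mean(g <= 1.0):.1f}%")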
Analysis of digital communication signals and extraction of parameters
NASA Astrophysics Data System (ADS)
Al-Jowder, Anwar
1994-12-01
The signal classification performance of four types of electronics support measure (ESM) communications detection systems is compared from the standpoint of the unintended receiver (interceptor). Typical digital communication signals considered include binary phase shift keying (BPSK), quadrature phase shift keying (QPSK), frequency shift keying (FSK), and on-off keying (OOK). The analysis emphasizes the use of available signal processing software. Detection methods compared include broadband energy detection, FFT-based narrowband energy detection, and two correlation methods which employ the fast Fourier transform (FFT). The correlation methods utilize modified time-frequency distributions, where one of these is based on the Wigner-Ville distribution (WVD). Gaussian white noise is added to the signal to simulate various signal-to-noise ratios (SNR's).
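A minimal sketch of the FFT-based narrowband energy detection idea mentioned above (not the thesis code): divide the band into FFT bins, integrate the energy per bin, and flag bins exceeding a noise-based threshold. The sample rate, signal and threshold below are hypothetical.

import numpy as np

fs = 1.0e6                       # sample rate (Hz), hypothetical
n = 4096
t = np.arange(n) / fs
rng = np.random.default_rng(1)

# Hypothetical received waveform: a weak carrier at 120 kHz buried in white noise
carrier = 0.2 * np.cos(2 * np.pi * 120e3 * t)
x = carrier + rng.normal(scale=1.0, size=n)        # per-sample SNR well below 0 dB

# FFT-based narrowband energy detection
spectrum = np.abs(np.fft.rfft(x * np.hanning(n)))**2
noise_floor = np.median(spectrum)                  # robust noise estimate
threshold = 12.0 * noise_floor                     # detection threshold, assumed
detected_bins = np.flatnonzero(spectrum > threshold)
freqs = np.fft.rfftfreq(n, d=1 / fs)
print("detected narrowband energy near (Hz):", freqs[detected_bins])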
Precise Interval Timer for Software Defined Radio
NASA Technical Reports Server (NTRS)
Pozhidaev, Aleksey (Inventor)
2014-01-01
A precise digital fractional interval timer for software defined radios which vary their waveform on a packet-by-packet basis. The timer allows for variable length in the preamble of the RF packet and allows the boundaries of the TDMA (Time Division Multiple Access) slots of an SDR receiver to be adjusted based on the reception of the RF packet of interest.
The Role of Data Analysis Software in Graduate Programs in Education and Post-Graduate Research
ERIC Educational Resources Information Center
Harwell, Michael
2018-01-01
The importance of data analysis software in graduate programs in education and post-graduate educational research is self-evident. However the role of this software in facilitating supererogated statistical practice versus "cookbookery" is unclear. The need to rigorously document the role of data analysis software in students' graduate…
Ondersma, Steven J; Martin, Joanne; Fortson, Beverly; Whitaker, Daniel J; Self-Brown, Shannon; Beatty, Jessica; Loree, Amy; Bard, David; Chaffin, Mark
2017-11-01
Early home visitation (EHV) for child maltreatment prevention is widely adopted but has received inconsistent empirical support. Supplementation with interactive software may facilitate attention to major risk factors and use of evidence-based approaches. We developed eight 20-min computer-delivered modules for use by mothers during the course of EHV. These modules were tested in a randomized trial in which 413 mothers were assigned to software-supplemented e-Parenting Program ( ePP), services as usual (SAU), or community referral conditions, with evaluation at 6 and 12 months. Outcomes included satisfaction, working alliance, EHV retention, child maltreatment, and child maltreatment risk factors. The software was well-received overall. At the 6-month follow-up, working alliance ratings were higher in the ePP condition relative to the SAU condition (Cohen's d = .36, p < .01), with no differences at 12 months. There were no between-group differences in maltreatment or major risk factors at either time point. Despite good acceptability and feasibility, these findings provide limited support for use of this software within EHV. These findings contribute to the mixed results seen across different models of EHV for child maltreatment prevention.
Monitoring and Controlling an Underwater Robotic Arm
NASA Technical Reports Server (NTRS)
Haas, John; Todd, Brian Keith; Woodcock, Larry; Robinson, Fred M.
2009-01-01
The SSRMS Module 1 software is part of a system for monitoring and adaptive, closed-loop control of the motions of a robotic arm in NASA's Neutral Buoyancy Laboratory, where buoyancy in a pool of water is used to simulate the weightlessness of outer space. This software is so named because the robot arm is a replica of the Space Shuttle Remote Manipulator System (SSRMS). This software is distributed, running on remote joint processors (RJPs), each of which is mounted in a hydraulic actuator comprising the joint of the robotic arm and communicating with a poolside processor denoted the Direct Control Rack (DCR). Each RJP executes the feedback joint-motion control algorithm for its joint and communicates with the DCR. The DCR receives joint-angular-velocity commands either locally from an operator or remotely from computers that simulate the flight-like SSRMS and perform coordinated motion calculations based on hand-controller inputs. The received commands are checked for validity before they are transmitted to the RJPs. The DCR software generates a display of the statuses of the RJPs for the DCR operator and can shut down the hydraulic pump when excessive joint-angle error or failure of an RJP is detected.
NASA Technical Reports Server (NTRS)
Ramachandran, Ganesh K.; Akopian, David; Heckler, Gregory W.; Winternitz, Luke B.
2011-01-01
Location technologies have many applications in wireless communications, military and space missions, etc. The US Global Positioning System (GPS) and other existing and emerging Global Navigation Satellite Systems (GNSS) are expected to provide accurate location information to enable such applications. While GNSS systems perform very well in strong signal conditions, their operation in many urban, indoor, and space applications is not robust, or even impossible, due to weak signals and strong distortions. The search for less costly, faster and more sensitive receivers is still in progress. As the research community addresses more and more complicated phenomena, there is a demand for flexible multimode reference receivers, associated SDKs, and development platforms which may accelerate and facilitate the research. One such concept is the software GPS/GNSS receiver (GPS SDR), which permits facilitated access to algorithmic libraries and a possibility to integrate more advanced algorithms without hardware and essential software updates. The GNU-SDR and GPS-SDR open source receiver platforms are popular examples. This paper evaluates the performance of recently proposed block-correlator techniques for acquisition and tracking of GPS signals using the open source GPS-SDR platform.
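As a rough illustration of block-correlation acquisition of a spread-spectrum GPS signal (not the specific algorithms evaluated in the paper), the following Python sketch performs FFT-based circular correlation of one code period against a locally generated replica; the PRN generator is omitted and replaced by a hypothetical random spreading code.

import numpy as np

rng = np.random.default_rng(2)
code_len = 1023                                    # C/A code length in chips
ca_code = rng.choice([-1.0, 1.0], size=code_len)   # stand-in for a real PRN code

# Hypothetical received block: the same code circularly shifted, plus noise
true_shift = 417
received = np.roll(ca_code, true_shift) + rng.normal(scale=2.0, size=code_len)

# FFT-based circular correlation evaluates all code phases at once
corr = np.fft.ifft(np.fft.fft(received) * np.conj(np.fft.fft(ca_code)))
power = np.abs(corr)**2
est_shift = int(np.argmax(power))
peak_to_mean = power[est_shift] / np.mean(np.delete(power, est_shift))
print(f"estimated code phase: {est_shift} chips (true {true_shift}), "
      f"peak-to-mean ratio: {peak_to_mean:.1f}")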
A software control system for the ACTS high-burst-rate link evaluation terminal
NASA Technical Reports Server (NTRS)
Reinhart, Richard C.; Daugherty, Elaine S.
1991-01-01
Control and performance monitoring of NASA's High Burst Rate Link Evaluation Terminal (HBR-LET) is accomplished by using several software control modules. Different software modules are responsible for controlling remote radio frequency (RF) instrumentation, supporting communication between a host and a remote computer, controlling the output power of the Link Evaluation Terminal and data display. Remote commanding of microwave RF instrumentation and the LET digital ground terminal allows computer control of various experiments, including bit error rate measurements. Computer communication allows system operators to transmit and receive from the Advanced Communications Technology Satellite (ACTS). Finally, the output power control software dynamically controls the uplink output power of the terminal to compensate for signal loss due to rain fade. Included is a discussion of each software module and its applications.
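A minimal sketch of the kind of uplink power control described above (raising output power to offset rain fade), using a simple proportional, step-limited loop; the gain, limits and fade profile below are hypothetical and are not the HBR-LET control law.

import numpy as np

target_rx_level_db = 0.0      # desired received level relative to clear sky
max_uplink_boost_db = 10.0    # hardware limit on added output power, assumed
step_limit_db = 0.5           # max power change per control interval, assumed

# Hypothetical rain fade profile (dB of extra path loss per control interval)
fade_db = np.concatenate([np.zeros(10), np.linspace(0, 8, 20),
                          np.full(10, 8.0), np.linspace(8, 0, 20)])

boost_db = 0.0
for fade in fade_db:
    rx_level_db = boost_db - fade                   # measured signal level
    error_db = target_rx_level_db - rx_level_db     # shortfall caused by the fade
    step = np.clip(0.8 * error_db, -step_limit_db, step_limit_db)
    boost_db = float(np.clip(boost_db + step, 0.0, max_uplink_boost_db))
print(f"final uplink boost: {boost_db:.1f} dB after the fade returned to clear sky")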
Usability study of clinical exome analysis software: top lessons learned and recommendations.
Shyr, Casper; Kushniruk, Andre; Wasserman, Wyeth W
2014-10-01
New DNA sequencing technologies have revolutionized the search for genetic disruptions. Targeted sequencing of all protein coding regions of the genome, called exome analysis, is actively used in research-oriented genetics clinics, with the transition to exomes as a standard procedure underway. This transition is challenging; identification of potentially causal mutation(s) amongst ∼10(6) variants requires specialized computation in combination with expert assessment. This study analyzes the usability of user interfaces for clinical exome analysis software. There are two study objectives: (1) To ascertain the key features of successful user interfaces for clinical exome analysis software based on the perspective of expert clinical geneticists, (2) To assess user-system interactions in order to reveal strengths and weaknesses of existing software, inform future design, and accelerate the clinical uptake of exome analysis. Surveys, interviews, and cognitive task analysis were performed for the assessment of two next-generation exome sequence analysis software packages. The subjects included ten clinical geneticists who interacted with the software packages using the "think aloud" method. Subjects' interactions with the software were recorded in their clinical office within an urban research and teaching hospital. All major user interface events (from the user interactions with the packages) were time-stamped and annotated with coding categories to identify usability issues in order to characterize desired features and deficiencies in the user experience. We detected 193 usability issues, the majority of which concern interface layout and navigation, and the resolution of reports. Our study highlights gaps in specific software features typical within exome analysis. The clinicians perform best when the flow of the system is structured into well-defined yet customizable layers for incorporation within the clinical workflow. The results highlight opportunities to dramatically accelerate clinician analysis and interpretation of patient genomic data. We present the first application of usability methods to evaluate software interfaces in the context of exome analysis. Our results highlight how the study of user responses can lead to identification of usability issues and challenges and reveal software reengineering opportunities for improving clinical next-generation sequencing analysis. While the evaluation focused on two distinctive software tools, the results are general and should inform active and future software development for genome analysis software. As large-scale genome analysis becomes increasingly common in healthcare, it is critical that efficient and effective software interfaces are provided to accelerate clinical adoption of the technology. Implications for improved design of such applications are discussed. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
Medical Signal-Conditioning and Data-Interface System
NASA Technical Reports Server (NTRS)
Braun, Jeffrey; Jacobus, Charles; Booth, Scott; Suarez, Michael; Smith, Derek; Hartnagle, Jeffrey; LePrell, Glenn
2006-01-01
A general-purpose portable, wearable electronic signal-conditioning and data-interface system is being developed for medical applications. The system can acquire multiple physiological signals (e.g., electrocardiographic, electroencephalographic, and electromyographic signals) from sensors on the wearer's body, digitize those signals that are received in analog form, preprocess the resulting data, and transmit the data to one or more remote location(s) via a radiocommunication link and/or the Internet. The system includes a computer running data-object-oriented software that can be programmed to configure the system to accept almost any analog or digital input signals from medical devices. The computing hardware and software implement a general-purpose data-routing-and-encapsulation architecture that supports tagging of input data and routing the data in a standardized way through the Internet and other modern packet-switching networks to one or more computer(s) for review by physicians. The architecture supports multiple-site buffering of data for redundancy and reliability, and supports both real-time and slower-than-real-time collection, routing, and viewing of signal data. Routing and viewing stations support insertion of automated analysis routines to aid in encoding, analysis, viewing, and diagnosis.
GPCALMA: A Tool For Mammography With A GRID-Connected Distributed Database
NASA Astrophysics Data System (ADS)
Bottigli, U.; Cerello, P.; Cheran, S.; Delogu, P.; Fantacci, M. E.; Fauci, F.; Golosio, B.; Lauria, A.; Lopez Torres, E.; Magro, R.; Masala, G. L.; Oliva, P.; Palmiero, R.; Raso, G.; Retico, A.; Stumbo, S.; Tangaro, S.
2003-09-01
The GPCALMA (Grid Platform for Computer Assisted Library for MAmmography) collaboration involves several departments of physics, INFN (National Institute of Nuclear Physics) sections, and Italian hospitals. The aim of this collaboration is to develop a tool that can help radiologists in the early detection of breast cancer. GPCALMA has built a large distributed database of digitised mammographic images (about 5500 images corresponding to 1650 patients) and developed a CAD (Computer Aided Detection) software which is integrated in a station that can also be used to acquire new images, as an archive and to perform statistical analysis. The images (18×24 cm², digitised by a CCD linear scanner with an 85 μm pitch and 4096 gray levels) are completely described: pathological ones have a consistent characterization with the radiologist's diagnosis and histological data, non-pathological ones correspond to patients with a follow-up of at least three years. The distributed database is realized through the connection of all the hospitals and research centers using GRID technology. In each hospital, the local patients' digital images are stored in the local database. Using the GRID connection, GPCALMA will allow each node to work on distributed database data as well as local database data. Using its database, the GPCALMA tools perform several analyses. A texture analysis, i.e. an automated classification of adipose, dense or glandular texture, can be provided by the system. The GPCALMA software also allows classification of pathological features, in particular the analysis of massive lesions (both opacities and spiculated lesions) and of microcalcification clusters. The detection of pathological features is made using neural network software that provides a selection of areas showing a given "suspicion level" of lesion occurrence. The performance of the GPCALMA system will be presented in terms of ROC (Receiver Operating Characteristic) curves. The results of the GPCALMA system as a "second reader" will also be presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fang, Y; Huang, H; Su, T
Purpose: Texture-based quantification of image heterogeneity has been a popular topic for imaging studies in recent years. As previous studies mainly focus on oncological applications, we report our recent efforts of applying such techniques to cardiac perfusion imaging. A fully automated procedure has been developed to perform texture analysis for measuring the image heterogeneity. Clinical data were used to evaluate the preliminary performance of such methods. Methods: Myocardial perfusion images of Thallium-201 scans were collected from 293 patients with suspected coronary artery disease. Each subject underwent a Tl-201 scan and a percutaneous coronary intervention (PCI) within three months. The PCI result was used as the gold standard for coronary ischemia of more than 70% stenosis. Each Tl-201 scan was spatially normalized to an image template for fully automatic segmentation of the LV. The segmented voxel intensities were then carried into the texture analysis with our open-source software Chang Gung Image Texture Analysis toolbox (CGITA). To evaluate the clinical performance of the image heterogeneity for detecting the coronary stenosis, receiver operating characteristic (ROC) analysis was used to compute the overall accuracy, sensitivity and specificity as well as the area under the curve (AUC). Those indices were compared to those obtained from the commercially available semi-automatic software QPS. Results: With the fully automatic procedure to quantify heterogeneity from Tl-201 scans, we were able to achieve a good discrimination with good accuracy (74%), sensitivity (73%), specificity (77%) and an AUC of 0.82. Such performance is similar to that obtained from the semi-automatic QPS software, which gives a sensitivity of 71% and specificity of 77%. Conclusion: Based on fully automatic procedures of data processing, our preliminary data indicate that the image heterogeneity of myocardial perfusion imaging can provide useful information for automatic determination of myocardial ischemia.
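For readers unfamiliar with the ROC analysis used above, a minimal Python sketch using scikit-learn with entirely synthetic data (the heterogeneity values and labels below are illustrative, not the study's data):

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

# Hypothetical data: one heterogeneity index per patient and a PCI-based label
# (1 = stenosis > 70%, 0 = no significant stenosis).
rng = np.random.default_rng(0)
labels = rng.integers(0, 2, size=293)
heterogeneity = labels * 0.8 + rng.normal(0, 0.5, size=293)

auc = roc_auc_score(labels, heterogeneity)
fpr, tpr, thresholds = roc_curve(labels, heterogeneity)

# Pick the operating point closest to the top-left corner and report
# sensitivity/specificity there.
best = np.argmin((1 - tpr) ** 2 + fpr ** 2)
print(f"AUC = {auc:.2f}, sensitivity = {tpr[best]:.2f}, "
      f"specificity = {1 - fpr[best]:.2f}")
```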
Fault Tree Analysis Application for Safety and Reliability
NASA Technical Reports Server (NTRS)
Wallace, Dolores R.
2003-01-01
Many commercial software tools exist for fault tree analysis (FTA), an accepted method for mitigating risk in systems. The method embedded in the tools identifies a root cause in system components, but when software is identified as a root cause, it does not build trees into the software component. No commercial software tools have been built specifically for development and analysis of software fault trees. Research indicates that the methods of FTA could be applied to software, but the method is not practical without automated tool support. With appropriate automated tool support, software fault tree analysis (SFTA) may be a practical technique for identifying the underlying cause of software faults that may lead to critical system failures. We strive to demonstrate that existing commercial tools for FTA can be adapted for use with SFTA, and that, applied to a safety-critical system, SFTA can be used to identify serious potential problems long before integration and system testing.
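As a minimal sketch of how a fault tree is evaluated quantitatively (assuming independent basic events; the probabilities and gate structure below are hypothetical and not from the paper):

```python
# Minimal fault-tree evaluation assuming independent basic events.
def and_gate(probs):
    """All inputs must fail for the gate output to occur."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(probs):
    """The gate output occurs if any input occurs."""
    p_none = 1.0
    for q in probs:
        p_none *= (1.0 - q)
    return 1.0 - p_none

# Hypothetical tree: top event = (sensor fault AND backup fault) OR software fault.
p_top = or_gate([and_gate([1e-3, 5e-3]), 1e-4])
print(f"P(top event) ~ {p_top:.2e}")
```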
RELAP-7 Software Verification and Validation Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Curtis L.; Choi, Yong-Joon; Zou, Ling
This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process: a set of specially designed software models used to test RELAP-7. The RELAP-7 (Reactor Excursion and Leak Analysis Program) code is a nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on the INL's modern scientific software development framework, MOOSE (Multi-Physics Object-Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5's capability and extends the analysis capability for all reactor system simulation scenarios.
Seismic Travel Time Tomography in Modeling Low Velocity Anomalies between the Boreholes
NASA Astrophysics Data System (ADS)
Octova, A.; Sule, R.
2018-04-01
Travel time cross-hole seismic tomography is applied to describe the structure of the subsurface. The sources are placed in one borehole and receivers are placed in the others. The first-arrival travel time recorded by each receiver is used as the input data for the seismic tomography method. This research is divided into three steps. The first step is reconstructing a synthetic model based on field parameters; the field configurations use 24 receivers and 45 receivers. The second step is applying the inversion process to the field data, which consist of five pairs of boreholes. The last step is testing the quality of the tomogram with a resolution test. Data processing using the FAST software produces an explicit shape that resembles the initial reconstruction of the synthetic model with 45 receivers. The tomographic processing of the field data indicates cavities in several places between the boreholes. Cavities are identified on BH2A-BH1, BH4A-BH2A and BH4A-BH5 with elongated and rounded structures. In resolution tests using a checkerboard, anomalies can still be identified down to a size of 2 m x 2 m. Travel time cross-hole seismic tomography analysis proves this method is very good at describing subsurface structure and boundary layers. The size and position of anomalies can be recognized and interpreted easily.
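A toy sketch of the travel-time inversion idea, assuming straight rays and a hypothetical sensitivity matrix (this is not the FAST software's algorithm; geometry and velocities are illustrative):

```python
import numpy as np

# Toy cross-hole setup: a 1D column of cells between two boreholes; each
# source-receiver pair gives a straight ray with a known path length per cell.
n_cells, n_rays = 10, 45
rng = np.random.default_rng(1)

# Sensitivity matrix G: path length of ray i in cell j (hypothetical geometry).
G = rng.uniform(0.0, 2.0, size=(n_rays, n_cells))

# True slowness model with a low-velocity (high-slowness) anomaly (a "cavity").
slowness_true = np.full(n_cells, 1 / 2000.0)   # background 2000 m/s
slowness_true[4:6] = 1 / 800.0                 # anomaly at 800 m/s

t_obs = G @ slowness_true                      # synthetic first-arrival travel times

# Damped least-squares inversion for slowness.
lam = 1e-6
slowness_est = np.linalg.solve(G.T @ G + lam * np.eye(n_cells), G.T @ t_obs)
print(np.round(1 / slowness_est))              # recovered velocity per cell
```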
A performance improvement plan to increase nurse adherence to use of medication safety software.
Gavriloff, Carrie
2012-08-01
Nurses can protect patients receiving intravenous (IV) medication by using medication safety software to program "smart" pumps to administer IV medications. After a patient safety event identified inconsistent use of medication safety software by nurses, a performance improvement team implemented the Deming Cycle performance improvement methodology. The combined use of improved direct care nurse communication, programming strategies, staff education, medication safety champions, adherence monitoring, and technology acquisition resulted in a statistically significant (p < .001) increase in nurse adherence to using medication safety software from 28% to above 85%, exceeding national benchmark adherence rates (Cohen, Cooke, Husch & Woodley, 2007; Carefusion, 2011). Copyright © 2012 Elsevier Inc. All rights reserved.
Proceedings of the 14th Annual Software Engineering Workshop
NASA Technical Reports Server (NTRS)
1989-01-01
Several software-related topics are presented. Topics covered include studies and experiments at the Software Engineering Laboratory at the Goddard Space Flight Center, predicting project success from the Software Project Management Process, software environments, testing in a reuse environment, domain-directed reuse, and classification tree analysis using the Amadeus measurement and empirical analysis system.
Design and validation of Segment--freely available software for cardiovascular image analysis.
Heiberg, Einar; Sjögren, Jane; Ugander, Martin; Carlsson, Marcus; Engblom, Henrik; Arheden, Håkan
2010-01-11
Commercially available software for cardiovascular image analysis often has limited functionality and frequently lacks the careful validation that is required for clinical studies. We have already implemented a cardiovascular image analysis software package and released it as freeware for the research community. However, it was distributed as a stand-alone application and other researchers could not extend it by writing their own custom image analysis algorithms. We believe that the work required to make a clinically applicable prototype can be reduced by making the software extensible, so that researchers can develop their own modules or improvements. Such an initiative might then serve as a bridge between image analysis research and cardiovascular research. The aim of this article is therefore to present the design and validation of a cardiovascular image analysis software package (Segment) and to announce its release in a source code format. Segment can be used for image analysis in magnetic resonance imaging (MRI), computed tomography (CT), single photon emission computed tomography (SPECT) and positron emission tomography (PET). Some of its main features include loading of DICOM images from all major scanner vendors, simultaneous display of multiple image stacks and plane intersections, automated segmentation of the left ventricle, quantification of MRI flow, tools for manual and general object segmentation, quantitative regional wall motion analysis, myocardial viability analysis and image fusion tools. Here we present an overview of the validation results and validation procedures for the functionality of the software. We describe a technique to ensure continued accuracy and validity of the software by implementing and using a test script that tests the functionality of the software and validates the output. The software has been made freely available for research purposes in a source code format on the project home page http://segment.heiberg.se. Segment is a well-validated comprehensive software package for cardiovascular image analysis. It is freely available for research purposes provided that relevant original research publications related to the software are cited.
Real-Time Spatio-Temporal Twice Whitening for MIMO Energy Detector
DOE Office of Scientific and Technical Information (OSTI.GOV)
Humble, Travis S; Mitra, Pramita; Barhen, Jacob
2010-01-01
While many techniques exist for local spectrum sensing of a primary user, each represents a computationally demanding task to secondary user receivers. In software-defined radio, computational complexity lengthens the time for a cognitive radio to recognize changes in the transmission environment. This complexity is even more significant for spatially multiplexed receivers, e.g., in SIMO and MIMO, where the spatio-temporal data sets grow in size with the number of antennae. Limits on power and space for the processor hardware further constrain SDR performance. In this report, we discuss improvements in spatio-temporal twice whitening (STTW) for real-time local spectrum sensing by demonstrating a form of STTW well suited for MIMO environments. We implement STTW on the Coherent Logix hx3100 processor, a multicore processor intended for low-power, high-throughput software-defined signal processing. These results demonstrate how coupling the novel capabilities of emerging multicore processors with algorithmic advances can enable real-time, software-defined processing of large spatio-temporal data sets.
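As a rough sketch of the whitening step that underlies STTW (spatial whitening only, on a synthetic covariance; this is not the authors' hx3100 implementation):

```python
import numpy as np

def whiten(x):
    """Whiten multichannel samples x of shape (channels, samples) using the
    inverse Cholesky factor of the sample covariance."""
    cov = np.cov(x)                    # spatial covariance estimate
    L = np.linalg.cholesky(cov)        # cov = L L^T
    return np.linalg.solve(L, x)       # whitened data: ~identity covariance

# Hypothetical 4-antenna receiver with spatially correlated noise.
rng = np.random.default_rng(2)
mixing = rng.normal(size=(4, 4))
noise = mixing @ rng.normal(size=(4, 10000))

w = whiten(noise)
print(np.round(np.cov(w), 2))          # approximately the identity matrix
```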
User-driven integrated software lives: ``Paleomag'' paleomagnetics analysis on the Macintosh
NASA Astrophysics Data System (ADS)
Jones, Craig H.
2002-12-01
"PaleoMag," a paleomagnetics analysis package originally developed for the Macintosh operating system in 1988, allows examination of demagnetization of individual samples and analysis of directional data from collections of samples. Prior to recent reinvigorated development of the software for both Macintosh and Windows, it was widely used despite not running properly on machines and operating systems sold after 1995. This somewhat surprising situation demonstrates that there is a continued need for integrated analysis software within the earth sciences, in addition to well-developed scripting and batch-mode software. One distinct advantage of software like PaleoMag is in the ability to combine quality control with analysis within a unique graphical environment. Because such demands are frequent within the earth sciences, means of nurturing the development of similar software should be found.
Analysis of the coupling efficiency of a tapered space receiver with a calculus mathematical model
NASA Astrophysics Data System (ADS)
Hu, Qinggui; Mu, Yining
2018-03-01
We established a calculus-based mathematical model to study the coupling characteristics of tapered optical fibers in a space communications system and obtained the coupling efficiency equation. The solution was then calculated using MATLAB software. After this, a sample was produced by the mature flame-brush technique. The experiment was then performed, and the results were in accordance with the theoretical analysis. This shows that the theoretical analysis was correct and indicates that a tapered structure can improve tolerance to misalignment. Project supported by the National Natural Science Foundation of China (grant no. 61275080); the 2017 Jilin Province Science and Technology Development Plan-Science and Technology Innovation Fund for Small and Medium Enterprises (20170308029HJ); and the 'thirteen five' science and technology research project of the Department of Education of Jilin, 2016 (16JK009).
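A minimal numerical sketch of a coupling-efficiency overlap integral in one transverse dimension, with illustrative Gaussian mode sizes (this is not the paper's calculus model or its MATLAB code):

```python
import numpy as np

def coupling_efficiency(w_in, w_fiber, offset, grid=4096, span=100e-6):
    """Overlap-integral coupling efficiency between an incident Gaussian spot
    (radius w_in) and a Gaussian fiber mode (radius w_fiber) with a lateral
    offset, evaluated numerically in one transverse dimension."""
    x = np.linspace(-span, span, grid)
    e_in = np.exp(-(x - offset) ** 2 / w_in ** 2)
    e_f = np.exp(-x ** 2 / w_fiber ** 2)
    overlap = np.trapz(e_in * e_f, x) ** 2
    norm = np.trapz(e_in ** 2, x) * np.trapz(e_f ** 2, x)
    return overlap / norm

# Tolerance to misalignment: efficiency vs. lateral offset (values illustrative).
for d in (0.0, 2e-6, 5e-6):
    print(d, round(coupling_efficiency(10e-6, 12e-6, d), 3))
```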
Digital interactive image analysis by array processing
NASA Technical Reports Server (NTRS)
Sabels, B. E.; Jennings, J. D.
1973-01-01
An attempt is made to draw a parallel between the existing geophysical data processing service industries and the emerging earth resources data support requirements. The relationship of seismic data analysis to ERTS data analysis is natural because in either case data is digitally recorded in the same format, resulting from remotely sensed energy which has been reflected, attenuated, shifted and degraded on its path from the source to the receiver. In the seismic case the energy is acoustic, ranging in frequencies from 10 to 75 cps, for which the lithosphere appears semi-transparent. In earth survey remote sensing through the atmosphere, visible and infrared frequency bands are being used. Yet the hardware and software required to process the magnetically recorded data from the two realms of inquiry are identical and similar, respectively. The resulting data products are similar.
NASA Technical Reports Server (NTRS)
Burhans, R. W.
1974-01-01
Details are presented of methods for providing OMEGA navigational information, including the receiver problem at the antenna as well as informational display and housekeeping systems based on 4-bit data processing concepts. Topics discussed include the problem of limiters, zero-crossing detectors, signal envelopes, internal timing circuits, phase counters, lane position displays, signal integrators, and software mapping problems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
RIECK, C.A.
1999-02-23
This Software Configuration Management Plan (SCMP) provides the instructions for change control of the W-211 Project, Retrieval Control System (RCS) software after initial approval/release but prior to the transfer of custody to the waste tank operations contractor. This plan applies to the W-211 system software developed by the project, consisting of the computer human-machine interface (HMI) and programmable logic controller (PLC) software source and executable code, for production use by the waste tank operations contractor. The plan encompasses that portion of the W-211 RCS software represented on project-specific AUTOCAD drawings that are released as part of the C-1 definitive design package (these drawings are identified on the drawing list associated with each C-1 package), and the associated software code. Implementation of the plan is required for formal acceptance testing and production release. The software configuration management plan does not apply to reports and data generated by the software except where specifically identified. Control of information produced by the software once it has been transferred for operation is the responsibility of the receiving organization.
Heat receivers for solar dynamic space power systems
NASA Astrophysics Data System (ADS)
Perez-Davis, Marla Esther
A review of state-of-the-art technology is presented and discussed for phase change materials. Some of the advanced solar dynamic designs developed as part of the Advanced Heat Receiver Conceptual Design Study performed for LeRC are discussed. The heat receivers are analyzed and several recommendations are proposed, including two new concepts. The first concept evaluated the effect of tube geometries inside the heat receiver. It was found that a triangular configuration would provide better heat transfer to the working fluid, although not necessarily with a reduction in receiver size. A sensible heat receiver considered in this study uses vapor grown graphite fiber-carbon (VGCF/C) composite as the thermal storage medium and was designed for a 7 kW Brayton engine. The proposed heat receiver stores the energy required to power the system during eclipse in the VGCF/C composite. The heat receiver analysis was conducted through the Systems Improved Numerical Differencing Analyzer and Fluid Integrator (SINDA) software package. The proposed heat receiver compares well with other latent and advanced sensible heat receivers while avoiding the problems associated with latent heat storage salts and liquid metal heat pipes. The weight and size of the system can be optimized by changes in geometry and technology advances for this new material. In addition to the new concepts, the effect of atomic oxygen on several materials is reviewed. A test was conducted for atomic oxygen attack on boron nitride, which experienced a negligible mass loss when exposed to an atomic oxygen fluence of 5 × 10^21 atoms/cm^2. This material could be used as a substitute for the graphite aperture plate of the heat receiver.
NASA Astrophysics Data System (ADS)
Sousa, Maria A. Z.; Bakic, Predrag R.; Schiabel, Homero; Maidment, Andrew D. A.
2017-03-01
Digital breast tomosynthesis (DBT) has been shown to be an effective imaging tool for breast cancer diagnosis as it provides three-dimensional images of the breast with minimal tissue overlap. The quality of the reconstructed image depends on many factors that can be assessed using uniform or realistic phantoms. In this paper, we created four models of phantoms using an anthropomorphic software breast phantom and compared four methods to evaluate the gray-scale response in terms of the contrast, noise and detectability of adipose and glandular tissues binarized according to the phantom ground truth. For each method, circular regions of interest (ROIs) were selected with various sizes, quantities and positions inside a square area in the phantom. We also estimated the percent density of the simulated breast and the capability of distinguishing both tissues by receiver operating characteristic (ROC) analysis. The results show a sensitivity of the methods to the ROI size and placement and to the slices considered.
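As a hedged sketch of ROI-based contrast and noise evaluation on a synthetic slice (the ROI positions, sizes, and tissue values below are hypothetical, not taken from the phantom study):

```python
import numpy as np

def roi_stats(image, center, radius):
    """Mean and standard deviation inside a circular ROI of a 2D slice."""
    yy, xx = np.indices(image.shape)
    mask = (yy - center[0]) ** 2 + (xx - center[1]) ** 2 <= radius ** 2
    return image[mask].mean(), image[mask].std()

def contrast_and_cnr(image, roi_glandular, roi_adipose, radius):
    """Simple contrast and contrast-to-noise ratio between two tissue ROIs."""
    m_g, s_g = roi_stats(image, roi_glandular, radius)
    m_a, s_a = roi_stats(image, roi_adipose, radius)
    contrast = (m_g - m_a) / (m_g + m_a)
    cnr = (m_g - m_a) / np.sqrt(0.5 * (s_g ** 2 + s_a ** 2))
    return contrast, cnr

# Hypothetical reconstructed slice: noisy background plus a brighter patch.
rng = np.random.default_rng(3)
slice_ = rng.normal(100, 5, size=(200, 200))
slice_[40:80, 40:80] += 20                      # "glandular" patch
print(contrast_and_cnr(slice_, (60, 60), (150, 150), 15))
```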
Developing Signal-Pattern-Recognition Programs
NASA Technical Reports Server (NTRS)
Shelton, Robert O.; Hammen, David
2006-01-01
Pattern Interpretation and Recognition Application Toolkit Environment (PIRATE) is a block-oriented software system that aids the development of application programs that analyze signals in real time in order to recognize signal patterns that are indicative of conditions or events of interest. PIRATE was originally intended for use in writing application programs to recognize patterns in space-shuttle telemetry signals received at Johnson Space Center's Mission Control Center: application programs were sought to (1) monitor electric currents on shuttle ac power busses to recognize activations of specific power-consuming devices, (2) monitor various pressures and infer the states of affected systems by applying a Kalman filter to the pressure signals, (3) determine fuel-leak rates from sensor data, (4) detect faults in gyroscopes through analysis of system measurements in the frequency domain, and (5) determine drift rates in inertial measurement units by regressing measurements against time. PIRATE can also be used to develop signal-pattern-recognition software for different purposes -- for example, to monitor and control manufacturing processes.
Semantic Metrics for Analysis of Software
NASA Technical Reports Server (NTRS)
Etzkorn, Letha H.; Cox, Glenn W.; Farrington, Phil; Utley, Dawn R.; Ghalston, Sampson; Stein, Cara
2005-01-01
A recently conceived suite of object-oriented software metrics focuses on semantic aspects of software, in contradistinction to traditional software metrics, which focus on syntactic aspects of software. Semantic metrics represent a more human-oriented view of software than do syntactic metrics. The semantic metrics of a given computer program are calculated by use of the output of a knowledge-based analysis of the program, and are substantially more representative of software quality and more readily comprehensible from a human perspective than are the syntactic metrics.
ERIC Educational Resources Information Center
van Reijswoud, Victor; Mulo, Emmanuel
2006-01-01
Over recent years the issue of free and open source software (FOSS) for development in less developed countries (LDCs) has received increasing attention. In the beginning, the benefits of FOSS for less developed countries were stressed only by small groups of idealists like Richard Stallman. Now, however, it is moving into the hands of large…
Results of a Flight Simulation Software Methods Survey
NASA Technical Reports Server (NTRS)
Jackson, E. Bruce
1995-01-01
A ten-page questionnaire was mailed to members of the AIAA Flight Simulation Technical Committee in the spring of 1994. The survey inquired about various aspects of developing and maintaining flight simulation software, as well as a few questions dealing with characterization of each facility. As of this report, 19 completed surveys (out of 74 sent out) have been received. This paper summarizes those responses.
Rengifo Valbuena, Carlos Augusto; Ávila Rodríguez, Marco Fidel; Céspedes Rubio, Angel
2013-01-01
Introduction: Understanding the pathophysiology of cerebral ischemia is essential for early diagnosis, neurologic recovery, the early onset of drug treatment and the prognosis of ischemic events. Experimental models of cerebral ischemia can be used to evaluate the cellular response phenomena and possible neurological protection by drugs. Objective: To characterize the cellular changes in the neuronal population and the astrocytic response to the effect of Dimethyl Sulfoxide (DMSO) in a model of ischemia caused by cerebral embolism. Methods: Twenty Wistar rats were divided into four groups (n = 5). The infarct was induced with α-bovine thrombin (40 NIH/Unit). The treated group received 90 mg (100 μL) of DMSO in saline (1:1 v/v) intraperitoneally for 5 days; ischemic controls received only NaCl (placebo) and two non-ischemic groups (sham) received NaCl and DMSO respectively. We evaluated neuronal (anti-NeuN) and astrocytic (anti-GFAP) immunoreactivity. The results were analyzed by densitometry (NIH ImageJ-Fiji 1.45 software) and analysis of variance (ANOVA) with the GraphPad software (Prism 5). Results: Cerebral embolism induced reproducible and reliable lesions in the cortex and hippocampus (CA1), similar to those of focal models. DMSO did not reverse the loss of post-ischemia neuronal immunoreactivity, but prevented the morphological damage of neurons, and significantly reduced astrocytic hyperactivity in the somatosensory cortex and CA1 (p < 0.001). Conclusions: The regulatory effect of DMSO on astrocyte hyperreactivity and neuronal-astroglial cytoarchitecture gives it potential neuroprotective properties for the treatment of thromboembolic cerebral ischemia in the acute phase. PMID:24892319
Ostomy patients’ perception of the health care received
Nieves, Candela Bonill-de las; Díaz, Concepción Capilla; Celdrán-Mañas, Miriam; Morales-Asencio, José Miguel; Hernández-Zambrano, Sandra Milena; Hueso-Montoro, César
2017-01-01
ABSTRACT Aim: to describe ostomy patients' perception of the health care received, as well as their needs and suggestions for healthcare system improvement. Method: a qualitative phenomenological study was conducted, involving individual, semi-structured interviews on the life experiences of 21 adults who had a digestive stoma. Participants were selected following a purposive sampling approach. The analysis was based on the constant comparison of the data, the progressive incorporation of subjects and triangulation among researchers and stoma therapy nurses. The software Atlas.ti was used. Results: the perception of health care received is closely related to the information process, as well as to training in caring for the stoma, from peristomal skin to diet. It is worth pointing out the work performed by stoma care nurses in ensuring support during all stages of the process. Conclusion: the findings help address the main patients' needs (better prepared nurses, shorter waiting lists, information about sexual relations, inclusion of family members all along the process) and provide recommendations for improving health care to facilitate their adaptation to the new status of having a digestive stoma. PMID:29236839
Software Reliability Analysis of NASA Space Flight Software: A Practical Experience
Sukhwani, Harish; Alonso, Javier; Trivedi, Kishor S.; Mcginnis, Issac
2017-01-01
In this paper, we present the software reliability analysis of the flight software of a recently launched space mission. For our analysis, we use the defect reports collected during the flight software development. We find that this software was developed in multiple releases, each release spanning across all software life-cycle phases. We also find that the software releases were developed and tested for four different hardware platforms, spanning from off-the-shelf or emulation hardware to actual flight hardware. For releases that exhibit reliability growth or decay, we fit Software Reliability Growth Models (SRGM); otherwise we fit a distribution function. We find that most releases exhibit reliability growth, with Log-Logistic (NHPP) and S-Shaped (NHPP) as the best-fit SRGMs. For the releases that experience reliability decay, we investigate the causes for the same. We find that such releases were the first software releases to be tested on a new hardware platform, and hence they encountered major hardware integration issues. Also such releases seem to have been developed under time pressure in order to start testing on the new hardware platform sooner. Such releases exhibit poor reliability growth, and hence exhibit high predicted failure rate. Other problems include hardware specification changes and delivery delays from vendors. Thus, our analysis provides critical insights and inputs to the management to improve the software development process. As NASA has moved towards a product line engineering for its flight software development, software for future space missions will be developed in a similar manner and hence the analysis results for this mission can be considered as a baseline for future flight software missions. PMID:29278255
Software Reliability Analysis of NASA Space Flight Software: A Practical Experience.
Sukhwani, Harish; Alonso, Javier; Trivedi, Kishor S; Mcginnis, Issac
2016-01-01
In this paper, we present the software reliability analysis of the flight software of a recently launched space mission. For our analysis, we use the defect reports collected during the flight software development. We find that this software was developed in multiple releases, each release spanning across all software life-cycle phases. We also find that the software releases were developed and tested for four different hardware platforms, spanning from off-the-shelf or emulation hardware to actual flight hardware. For releases that exhibit reliability growth or decay, we fit Software Reliability Growth Models (SRGM); otherwise we fit a distribution function. We find that most releases exhibit reliability growth, with Log-Logistic (NHPP) and S-Shaped (NHPP) as the best-fit SRGMs. For the releases that experience reliability decay, we investigate the causes for the same. We find that such releases were the first software releases to be tested on a new hardware platform, and hence they encountered major hardware integration issues. Also such releases seem to have been developed under time pressure in order to start testing on the new hardware platform sooner. Such releases exhibit poor reliability growth, and hence exhibit high predicted failure rate. Other problems include hardware specification changes and delivery delays from vendors. Thus, our analysis provides critical insights and inputs to the management to improve the software development process. As NASA has moved towards a product line engineering for its flight software development, software for future space missions will be developed in a similar manner and hence the analysis results for this mission can be considered as a baseline for future flight software missions.
The Statistical Consulting Center for Astronomy (SCCA)
NASA Technical Reports Server (NTRS)
Akritas, Michael
2001-01-01
The process by which raw astronomical data are transformed into scientifically meaningful results and interpretations typically involves many statistical steps. Traditional astronomy limits itself to a narrow range of old and familiar statistical methods: means and standard deviations; least-squares methods like chi-square minimization; and simple nonparametric procedures such as the Kolmogorov-Smirnov tests. These tools are often inadequate for the complex problems and datasets under investigation, and recent years have witnessed an increased usage of maximum-likelihood, survival analysis, multivariate analysis, wavelet and advanced time-series methods. The Statistical Consulting Center for Astronomy (SCCA) assisted astronomers with the use of sophisticated tools and helped match these tools to specific problems. The SCCA operated with two professors of statistics and a professor of astronomy working together. Questions were received by e-mail and were discussed in detail with the questioner. Summaries of those questions and answers leading to new approaches were posted on the Web (www.state.psu.edu/ mga/SCCA). In addition to serving individual astronomers, the SCCA established a Web site for general use that provides hypertext links to selected on-line public-domain statistical software and services. The StatCodes site (www.astro.psu.edu/statcodes) provides over 200 links in the areas of: Bayesian statistics; censored and truncated data; correlation and regression; density estimation and smoothing; general statistics packages and information; image analysis; interactive Web tools; multivariate analysis; multivariate clustering and classification; nonparametric analysis; software written by astronomers; spatial statistics; statistical distributions; time series analysis; and visualization tools. StatCodes has received a remarkably high and constant hit rate of 250 hits/week (over 10,000/year) since its inception in mid-1997. It is of interest to scientists both within and outside of astronomy. The most popular sections are multivariate techniques, image analysis, and time series analysis. Hundreds of copies of the ASURV, SLOPES and CENS-TAU codes developed by SCCA scientists were also downloaded from the StatCodes site. In addition to formal SCCA duties, SCCA scientists continued a variety of related activities in astrostatistics, including refereeing of statistically oriented papers submitted to the Astrophysical Journal, talks at meetings, including Feigelson's talk to science journalists entitled "The reemergence of astrostatistics" at the American Association for the Advancement of Science meeting, and published papers of astrostatistical content.
Learn by Yourself: The Self-Learning Tools for Qualitative Analysis Software Packages
ERIC Educational Resources Information Center
Freitas, Fábio; Ribeiro, Jaime; Brandão, Catarina; Reis, Luís Paulo; de Souza, Francislê Neri; Costa, António Pedro
2017-01-01
Computer Assisted Qualitative Data Analysis Software (CAQDAS) are tools that help researchers to develop qualitative research projects. These software packages help the users with tasks such as transcription analysis, coding and text interpretation, writing and annotation, content search and analysis, recursive abstraction, grounded theory…
Kubios HRV--heart rate variability analysis software.
Tarvainen, Mika P; Niskanen, Juha-Pekka; Lipponen, Jukka A; Ranta-Aho, Perttu O; Karjalainen, Pasi A
2014-01-01
Kubios HRV is an advanced and easy to use software for heart rate variability (HRV) analysis. The software supports several input data formats for electrocardiogram (ECG) data and beat-to-beat RR interval data. It includes an adaptive QRS detection algorithm and tools for artifact correction, trend removal and analysis sample selection. The software computes all the commonly used time-domain and frequency-domain HRV parameters and several nonlinear parameters. There are several adjustable analysis settings through which the analysis methods can be optimized for different data. The ECG derived respiratory frequency is also computed, which is important for reliable interpretation of the analysis results. The analysis results can be saved as an ASCII text file (easy to import into MS Excel or SPSS), Matlab MAT-file, or as a PDF report. The software is easy to use through its compact graphical user interface. The software is available free of charge for Windows and Linux operating systems at http://kubios.uef.fi. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
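A minimal Python sketch of the common time-domain HRV indices mentioned above, computed from a synthetic RR series (this is not Kubios HRV code, and the RR data are illustrative):

```python
import numpy as np

def hrv_time_domain(rr_ms):
    """Common time-domain HRV indices from beat-to-beat RR intervals (ms)."""
    rr = np.asarray(rr_ms, dtype=float)
    diff = np.diff(rr)
    return {
        "mean_hr_bpm": 60000.0 / rr.mean(),
        "sdnn_ms": rr.std(ddof=1),                        # overall variability
        "rmssd_ms": np.sqrt(np.mean(diff ** 2)),          # short-term variability
        "pnn50_pct": 100.0 * np.mean(np.abs(diff) > 50),  # fraction of large changes
    }

# Hypothetical RR series around 800 ms (about 75 bpm).
rng = np.random.default_rng(4)
rr = 800 + np.cumsum(rng.normal(0, 5, 300)) * 0.1 + rng.normal(0, 20, 300)
print(hrv_time_domain(rr))
```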
NASA Astrophysics Data System (ADS)
Thieman, J.; Higgins, C.; Lauffer, G.; Ulivastro, R.; Flagg, R.; Sky, J.
2003-04-01
The Radio JOVE project (http://radiojove.gsfc.nasa.gov) began over four years ago as an education-centered program to inspire secondary school students' interest in space science through hands-on radio astronomy. Students build a radio receiver and antenna kit capable of receiving Jovian, solar, and galactic emissions at a frequency of 20.1 MHz. More than 500 of these kits have been distributed to students and interested observers (ages 10 through adult) in 24 countries. Many students and teachers do not have the time or feel comfortable building a kit of their own. The Radio JOVE project has made it possible to monitor data and streaming audio from professional radio telescopes in Florida (16 element 10-40 MHz log spiral array - http://jupiter.kochi-ct.jp) and Hawaii (17-30 MHz log periodic antenna - http://jupiter.wcc.hawaii.edu/newradiojove/main.html) using standard web browsers and/or freely downloadable software. Radio-Skypipe software (http://radiosky.com) emulates a chart recorder for ones own radio telescope. It will also display the signals being received by other observers worldwide who send out their data over the Internet using the same software package. A built-in chat feature allows the users to discuss their observations and results in real time. New software is being developed to allow network users to interactively view a multi-frequency spectroscopic display of the Hawaii radio telescope. This software may also be useful for research applications. Observers in the U.S. and Europe have been contributing data to a central archive of Jupiter and Solar observations (http://jovearchive.gsfc.nasa.gov/). We believe these data to be of value to the research community and would like to have students more directly connected to ongoing research projects to enhance their interest in participating. We welcome ideas for expanding the application of these data.
An online database for plant image analysis software tools.
Lobet, Guillaume; Draye, Xavier; Périlleux, Claire
2013-10-09
Recent years have seen an increase in methods for plant phenotyping using image analyses. These methods require new software solutions for data extraction and treatment. These solutions are instrumental in supporting various research pipelines, ranging from the localisation of cellular compounds to the quantification of tree canopies. However, due to the variety of existing tools and the lack of central repository, it is challenging for researchers to identify the software that is best suited for their research. We present an online, manually curated, database referencing more than 90 plant image analysis software solutions. The website, plant-image-analysis.org, presents each software in a uniform and concise manner enabling users to identify the available solutions for their experimental needs. The website also enables user feedback, evaluations and new software submissions. The plant-image-analysis.org database provides an overview of existing plant image analysis software. The aim of such a toolbox is to help users to find solutions, and to provide developers a way to exchange and communicate about their work.
GWAMA: software for genome-wide association meta-analysis.
Mägi, Reedik; Morris, Andrew P
2010-05-28
Despite the recent success of genome-wide association studies in identifying novel loci contributing effects to complex human traits, such as type 2 diabetes and obesity, much of the genetic component of variation in these phenotypes remains unexplained. One way of improving power to detect further novel loci is through meta-analysis of studies from the same population, increasing the sample size over any individual study. Although statistical software analysis packages incorporate routines for meta-analysis, they are ill-equipped to meet the challenges of the scale and complexity of data generated in genome-wide association studies. We have developed flexible, open-source software for the meta-analysis of genome-wide association studies. The software incorporates a variety of error trapping facilities, and provides a range of meta-analysis summary statistics. The software is distributed with scripts that allow simple formatting of files containing the results of each association study and generate graphical summaries of genome-wide meta-analysis results. The GWAMA (Genome-Wide Association Meta-Analysis) software has been developed to perform meta-analysis of summary statistics generated from genome-wide association studies of dichotomous phenotypes or quantitative traits. Software with source files, documentation and example data files are freely available online at http://www.well.ox.ac.uk/GWAMA.
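As a rough sketch of the inverse-variance fixed-effect meta-analysis that such software performs for each variant (the summary statistics below are synthetic; this is not GWAMA's implementation):

```python
import numpy as np

def fixed_effect_meta(betas, ses):
    """Inverse-variance-weighted fixed-effect meta-analysis of per-study
    effect sizes (betas) and standard errors for a single variant."""
    betas, ses = np.asarray(betas, float), np.asarray(ses, float)
    w = 1.0 / ses ** 2
    beta = np.sum(w * betas) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    z = beta / se
    return beta, se, z

# Hypothetical summary statistics for one SNP from three studies.
beta, se, z = fixed_effect_meta([0.12, 0.08, 0.15], [0.05, 0.04, 0.07])
print(f"pooled beta = {beta:.3f}, SE = {se:.3f}, Z = {z:.2f}")
```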
[Software-based visualization of patient flow at a university eye clinic].
Greb, O; Abou Moulig, W; Hufendiek, K; Junker, B; Framme, C
2017-03-01
This article presents a method for visualization and navigation of patient flow in outpatient eye clinics with a high level of complexity. A network-based software solution was developed targeting long-term process optimization through structural analysis and temporal coordination of process navigation. Each examination unit receives a separate waiting list of patients in which the flow for every patient is recorded in a timeline. Time periods and points in time can be selected by mouse clicks and the desired diagnostic procedure can be entered. Recent progress on any of these diagnostic requests, as well as a variety of information on patient progress, is collated and drawn into the corresponding timeline, which can be viewed by any of the personnel involved. The software, called TimeElement, has been successfully tested in practical use for several months. As an example, the patient flow, in terms of time stamps of defined events, for intravitreous injections in 250 patients was recorded and an average attendance time of 169.71 min was found, with the time also automatically recorded for each individual stage. Recording of patient flow data is a fundamental component of patient flow management, waiting time reduction, and patient flow navigation and temporal coordination, in particular regarding timeline-based visualization for each individual patient. Long-term changes in process management can be planned and evaluated by comparing patient flow data. As using the software itself causes structural changes within the organization, a questionnaire is being planned for appraisal by the personnel involved.
Moćko, Paweł; Kawalec, Paweł; Smela-Lipińska, Beata; Pilc, Andrzej
2016-10-01
The aim of this systematic review (SR) and meta-analysis was to assess the efficacy and safety of vedolizumab in the treatment of Crohn's disease (CD). A systematic literature search was conducted in Medline/PubMed, Embase and Cochrane Library until 25 January, 2015. Included studies were critically appraised according to the PRISMA protocol. Assessment in specified subgroups of CD patients and meta-analysis with Revman software were performed. Two randomized controlled trial (RCTs) were included in a meta-analysis for the induction phase of therapy: GEMINI II and GEMINI III. The clinical response was significantly higher for patients who received vedolizumab compared to placebo in the general population (risk benefit (RB) = 1.48; p = 0.0006) and in both analyzed subgroups: patients with previous failure of anti-TNFs treatment (RB = 1.51; p = 0.006) and patients naive to earlier anti-TNFs (RB = 1.41; p = 0.001). The clinical remission in the general population and subpopulation of TNF-antagonist naive patients was significantly higher for patients who received vedolizumab compared to placebo (RB = 1.77; p = 0.003; RB = 2.29; p = 0.0004; respectively). Meta-analysis for adverse events, serious adverse events (SAEs) and serious infections, revealed that vedolizumab was as safe as placebo in the induction phase of therapy. The clinical response was significantly higher for patients who received vedolizumab in the general population and in both analyzed subgroups of patients. The clinical remission in the general population and subpopulation of TNF-antagonist naive patients was significantly higher for vedolizumab, but no significant differences were revealed in the subgroup of patients with previous TNF antagonist failure.
Software Engineering Improvement Activities/Plan
NASA Technical Reports Server (NTRS)
2003-01-01
bd Systems personnel accomplished the technical responsibilities for this reporting period, as planned. A close working relationship was maintained with personnel of the MSFC Avionics Department Software Group (ED14). Work accomplishments included development, evaluation, and enhancement of a software cost model, performing a literature search and evaluation of software tools available for code analysis and requirements analysis, and participating in other relevant software engineering activities. Monthly reports were submitted. This support was provided to the Flight Software Group/ED14 in accomplishing the software engineering improvement activities of the Marshall Space Flight Center (MSFC) Software Engineering Improvement Plan.
NASA Astrophysics Data System (ADS)
Moldovan, Iren-Adelina; Petruta Constantin, Angela; Emilian Toader, Victorin; Toma-Danila, Dragos; Biagi, Pier Francesco; Maggipinto, Tommaso; Dolea, Paul; Septimiu Moldovan, Adrian
2014-05-01
Based on scientific evidence supporting the causality between earthquake preparatory stages, space weather and solar activity, and different types of electromagnetic (EM) disturbances, together with the benefit of having full access to ground- and space-based EM data, INFREP proposes a complex and cross-correlated investigation of phenomena that occur in the coupled Lithosphere-Atmosphere-Ionosphere system in order to identify possible causes responsible for anomalous effects observed in the propagation characteristics of radio waves, especially at low (LF) and very low frequency (VLF). INFREP, a network of VLF (20-60 kHz) and LF (150-300 kHz) radio receivers, was put into operation in Europe in 2009, having as its principal goal the study of disturbances produced by earthquakes on the propagation properties of these signals. The Romanian NIEP VLF/LF monitoring system, consisting of a radio receiver, made by Elettronika S.R.L. (Italy) and provided by Bari University, and the infrastructure necessary to record and transmit the collected data, is a part of the international INFREP initiative. The NIEP VLF/LF receiver installed in Romania was put into operation in February 2009 in Bucharest and relocated to the Black Sea shore (Dobruja Seismologic Observatory) in December 2009. The first development of the Romanian EM monitoring system was needed because, after changing the receiving site from Bucharest to Eforie, we obtained unsatisfactory monitoring data, characterized by large fluctuations of the received signals' intensities. Trying to understand this behavior led to the conclusion that the electric component of the electromagnetic field was possibly influenced by the local conditions. Starting from this observation we ran some tests and replaced the vertical antenna with a loop-type antenna that is more appropriate in highly electric-field polluted environments. Since the amount of recorded data is huge, to streamline the research process we automated the transfer, storage and initial processing of the data using the LabVIEW software platform. The specially designed LabVIEW application, which accesses the VLF/LF receiver through the internet, opens the receiver's web page and automatically retrieves the list of data files to synchronize the user-side data with the receiver's data. Missing zipped files are also automatically downloaded. The application performs primary statistical, correlation and spectral analyses, appends daily files into monthly and annual files, and produces 3D color-coded maps with graphic representations of VLF and LF signal intensities versus the minute-of-the-day and the day-of-the-month, facilitating a near real-time observation of VLF and LF electromagnetic wave propagation. Another feature of the software is the correlation of the daily recorded files for the studied frequencies by overlaying the 24-hour radio activity and taking into account sunrise and sunset. The next step in developing the Romanian EM recording system is to enlarge the INFREP network with new VLF/LF receivers for better coverage and separation of European seismogenic zones. This will be done in the future using national resources. The unitary seismotectonic zoning of Romania and the whole of Europe is a very important step toward this goal.
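A minimal Python/matplotlib sketch of the kind of minute-of-day versus day-of-month intensity map described above, using synthetic data (the NIEP system itself uses LabVIEW; the values and shapes here are illustrative):

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical VLF signal intensity: one value per minute for a 30-day month.
rng = np.random.default_rng(5)
days, minutes = 30, 1440
diurnal = 50 + 10 * np.sin(np.linspace(0, 2 * np.pi, minutes))   # day/night shape
data = diurnal + rng.normal(0, 2, size=(days, minutes))

# Day-of-month vs. minute-of-day map, colour-coded by received intensity.
fig, ax = plt.subplots(figsize=(8, 4))
mesh = ax.pcolormesh(np.arange(minutes), np.arange(1, days + 1), data,
                     shading="auto")
ax.set_xlabel("minute of the day")
ax.set_ylabel("day of the month")
fig.colorbar(mesh, label="signal intensity (arbitrary dB)")
plt.show()
```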
Zhang, Ying-Shi; Li, Qing; He, Bo-Sai; Liu, Ran; Li, Zuo-Jing
2015-01-01
AIM: To compare the therapeutic effects of proton pump inhibitors vs H2 receptor antagonists for upper gastrointestinal bleeding in patients after successful endoscopy. METHODS: We searched the Cochrane library, MEDLINE, EMBASE and PubMed for randomized controlled trials until July 2014 for this study. The risk of bias was evaluated by the Cochrane Collaboration’s tool and all of the studies had acceptable quality. The main outcomes included mortality, re-bleeding, received surgery rate, blood transfusion units and hospital stay time. These outcomes were estimated using odds ratios (OR) and mean difference with 95% confidence interval (CI). RevMan 5.3.3 software and Stata 12.0 software were used for data analyses. RESULTS: Ten randomized controlled trials involving 1283 patients were included in this review; 678 subjects were in the proton pump inhibitors (PPI) group and the remaining 605 subjects were in the H2 receptor antagonists (H2RA) group. The meta-analysis results revealed that after successful endoscopic therapy, compared with H2RA, PPI therapy had statistically significantly decreased the recurrent bleeding rate (OR = 0.36; 95%CI: 0.25-0.51) and receiving surgery rate (OR = 0.29; 95%CI: 0.09-0.96). There were no statistically significant differences in mortality (OR = 0.46; 95%CI: 0.17-1.23). However, significant heterogeneity was present in both the numbers of patients requiring blood transfusion after treatment [weighted mean difference (WMD), -0.70 unit; 95%CI: -1.64 - 0.25] and the time that patients remained hospitalized [WMD, -0.77 d; 95%CI: -1.87 - 0.34]. The Begg’s test (P = 0.283) and Egger’s test (P = 0.339) demonstrated that there was no publication bias in our meta-analysis. CONCLUSION: In patients with upper gastrointestinal bleeding after successful endoscopic therapy, compared with H2RA, PPI may be a more effective therapy. PMID:26034370
Zhang, Ying-Shi; Li, Qing; He, Bo-Sai; Liu, Ran; Li, Zuo-Jing
2015-05-28
To compare the therapeutic effects of proton pump inhibitors vs H₂ receptor antagonists for upper gastrointestinal bleeding in patients after successful endoscopy. We searched the Cochrane library, MEDLINE, EMBASE and PubMed for randomized controlled trials until July 2014 for this study. The risk of bias was evaluated by the Cochrane Collaboration's tool and all of the studies had acceptable quality. The main outcomes included mortality, re-bleeding, received surgery rate, blood transfusion units and hospital stay time. These outcomes were estimated using odds ratios (OR) and mean difference with 95% confidence interval (CI). RevMan 5.3.3 software and Stata 12.0 software were used for data analyses. Ten randomized controlled trials involving 1283 patients were included in this review; 678 subjects were in the proton pump inhibitors (PPI) group and the remaining 605 subjects were in the H₂ receptor antagonists (H₂RA) group. The meta-analysis results revealed that after successful endoscopic therapy, compared with H₂RA, PPI therapy had statistically significantly decreased the recurrent bleeding rate (OR = 0.36; 95%CI: 0.25-0.51) and receiving surgery rate (OR = 0.29; 95%CI: 0.09-0.96). There were no statistically significant differences in mortality (OR = 0.46; 95%CI: 0.17-1.23). However, significant heterogeneity was present in both the numbers of patients requiring blood transfusion after treatment [weighted mean difference (WMD), -0.70 unit; 95%CI: -1.64 - 0.25] and the time that patients remained hospitalized [WMD, -0.77 d; 95%CI: -1.87 - 0.34]. The Begg's test (P = 0.283) and Egger's test (P = 0.339) demonstrated that there was no publication bias in our meta-analysis. In patients with upper gastrointestinal bleeding after successful endoscopic therapy, compared with H₂RA, PPI may be a more effective therapy.
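As a hedged sketch of how a pooled odds ratio with a 95% CI can be computed from per-trial 2x2 counts (the counts below are synthetic; this is not the RevMan/Stata analysis of the review):

```python
import numpy as np

def pooled_or(events_t, n_t, events_c, n_c):
    """Inverse-variance pooled odds ratio with 95% CI across trials,
    from 2x2 counts in the treated and control arms."""
    e_t, e_c = np.asarray(events_t, float), np.asarray(events_c, float)
    n_t, n_c = np.asarray(n_t, float), np.asarray(n_c, float)
    log_or = np.log((e_t * (n_c - e_c)) / (e_c * (n_t - e_t)))
    var = 1 / e_t + 1 / (n_t - e_t) + 1 / e_c + 1 / (n_c - e_c)
    w = 1 / var
    pooled = np.sum(w * log_or) / np.sum(w)
    se = np.sqrt(1 / np.sum(w))
    ci = np.exp([pooled - 1.96 * se, pooled + 1.96 * se])
    return np.exp(pooled), ci

# Hypothetical re-bleeding counts from three trials (treatment vs. control).
or_, ci = pooled_or([10, 8, 12], [100, 90, 120], [22, 20, 30], [95, 92, 118])
print(f"OR = {or_:.2f}, 95% CI = {ci[0]:.2f}-{ci[1]:.2f}")
```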
Multicore Hardware Experiments in Software Producibility
2009-06-01
...processors. Subject terms: multi-core, real-time systems, testing, software modernization. ...real-time systems. The inputs to the dgclocalnav component are the path plan (received from highlevelplanner, discussed next), the drivable grid... real-time systems, robotics, and software. As frequently observed in cyber-physical systems, the system designers may need experience in multiple...
NASA Astrophysics Data System (ADS)
Brandt, Douglas; Hiller, John R.; Moloney, Michael J.
1995-10-01
The Consortium for Upper Level Physics Software (CUPS) has developed a comprehensive series of nine book/software packages that Wiley will publish in FY '95 and '96. CUPS is an international group of 27 physicists, all with extensive backgrounds in the research, teaching, and development of instructional software. The project is being supported by the National Science Foundation (PHY-9014548), and it has received other support from the IBM Corp., Apple Computer Corp., and George Mason University. The simulations being developed are: Astrophysics, Classical Mechanics, Electricity & Magnetism, Modern Physics, Nuclear and Particle Physics, Quantum Mechanics, Solid State, Thermal and Statistical, and Wave and Optics.
Tools Automate Spacecraft Testing, Operation
NASA Technical Reports Server (NTRS)
2010-01-01
"NASA began the Small Explorer (SMEX) program to develop spacecraft to advance astrophysics and space physics. As one of the entities supporting software development at Goddard Space Flight Center, the Hammers Company Inc. (tHC Inc.), of Greenbelt, Maryland, developed the Integrated Test and Operations System to support SMEX. Later, the company received additional Small Business Innovation Research (SBIR) funding from Goddard for a tool to facilitate the development of flight software called VirtualSat. NASA uses the tools to support 15 satellites, and the aerospace industry is using them to develop science instruments, spacecraft computer systems, and navigation and control software."
Tank Monitoring and Document control System (TMACS) As Built Software Design Document
DOE Office of Scientific and Technical Information (OSTI.GOV)
GLASSCOCK, J.A.
This document describes the software design for the Tank Monitor and Control System (TMACS). This document captures the existing as-built design of TMACS as of November 1999. It will be used as a reference document by the system maintainers who will be maintaining and modifying the TMACS functions as necessary. The heart of the TMACS system is the "point-processing" functionality, where a sample value is received from the field sensors and the value is analyzed, logged, or alarmed as required. This Software Design Document focuses on the point-processing functions.
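A minimal sketch of the point-processing idea described above, with hypothetical tag names and alarm limits (this is not the actual TMACS implementation):

```python
# Hypothetical alarm limits for one class of sensor point.
LOW_ALARM, HIGH_ALARM = 10.0, 90.0
log = []
alarms = []

def process_point(tag, value):
    """Analyze one sensor reading, log it, and raise an alarm if it is out of range."""
    log.append((tag, value))
    if value < LOW_ALARM or value > HIGH_ALARM:
        alarms.append((tag, value))

# Example: a tank level reading that exceeds the high limit.
process_point("TANK-101-LEVEL", 95.2)
print(alarms)
```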
High temperature helical tubular receiver for concentrating solar power system
NASA Astrophysics Data System (ADS)
Hossain, Nazmul
In the field of cleaner power generation technology, concentrating solar power systems have introduced remarkable opportunities. In a solar power tower, solar energy concentrated by the heliostats at a single point produces very high temperatures. Falling solid particles or a heat transfer fluid passing through that high-temperature region absorb heat that is used to generate electricity. Increasing the residence time results in more heat gain and higher efficiency. This paper proposes a novel design of solar receiver, for both fluid and solid particles, which can increase residence time and thus achieve a higher temperature gain in one cycle compared to conventional receivers. The helical tubular solar receiver placed at the focused-sunlight region achieves higher outlet temperature and efficiency. A vertical tubular receiver is modeled and analyzed for single-phase flow with molten salt as the heat transfer fluid and Alloy 625 as the heat transfer material. The result is compared to a journal paper with a similar numerical and experimental setup to validate our modeling. New types of helical tubular solar receivers are modeled and analyzed, with heat transfer fluid in turbulent single-phase flow and with granular particles and air in multiphase plug flow, to observe the temperature rise in one cyclic operation. The Discrete Ordinates radiation model is used for the numerical analysis with the simulation software Ansys Fluent 15.0. The Eulerian granular multiphase model is used for multiphase flow. Applying the same modeling parameters and boundary conditions, the results of the vertical and helical receivers are compared. With a helical receiver, a higher temperature gain of the heat transfer fluid is achieved in one cycle for both single-phase and multiphase flow compared to the vertical receiver. Performance is also observed by varying the dimensions of the helical receiver.
Development of a New VLBI Data Analysis Software
NASA Technical Reports Server (NTRS)
Bolotin, Sergei; Gipson, John M.; MacMillan, Daniel S.
2010-01-01
We present an overview of new VLBI analysis software under development at NASA GSFC. The new software will replace CALC/SOLVE and many related utility programs. It will have the capabilities of the current system as well as incorporate new models and data analysis techniques. In this paper we give a conceptual overview of the new software and formulate its main goals. The software should be flexible and modular in order to implement models and estimation techniques that currently exist or will appear in the future. On the other hand, it should be reliable and of production quality for processing standard VLBI sessions. Also, it needs to be capable of processing observations from a fully deployed network of VLBI2010 stations in a reasonable time. We describe the software development process and outline the software architecture.
Development of Automated Image Analysis Software for Suspended Marine Particle Classification
2002-09-30
Scott Samson, Center for Ocean Technology... and global water column. Objectives: The project's objective is to develop automated image analysis software to reduce the effort and time...
NASA Technical Reports Server (NTRS)
2010-01-01
Topics covered include: Active and Passive Hybrid Sensor; Quick-Response Thermal Actuator for Use as a Heat Switch; System for Hydrogen Sensing; Method for Detecting Perlite Compaction in Large Cryogenic Tanks; Using Thin-Film Thermometers as Heaters in Thermal Control Applications; Directional Spherical Cherenkov Detector; AlGaN Ultraviolet Detectors for Dual-Band UV Detection; K-Band Traveling-Wave Tube Amplifier; Simplified Load-Following Control for a Fuel Cell System; Modified Phase-meter for a Heterodyne Laser Interferometer; Loosely Coupled GPS-Aided Inertial Navigation System for Range Safety; Sideband-Separating, Millimeter-Wave Heterodyne Receiver; Coaxial Propellant Injectors With Faceplate Annulus Control; Adaptable Diffraction Gratings With Wavefront Transformation; Optimizing a Laser Process for Making Carbon Nanotubes; Thermogravimetric Analysis of Single-Wall Carbon Nanotubes; Robotic Arm Comprising Two Bending Segments; Magnetostrictive Brake; Low-Friction, Low-Profile, High-Moment Two-Axis Joint; Foil Gas Thrust Bearings for High-Speed Turbomachinery; Miniature Multi-Axis Mechanism for Hand Controllers; Digitally Enhanced Heterodyne Interferometry; Focusing Light Beams To Improve Atomic-Vapor Optical Buffers; Landmark Detection in Orbital Images Using Salience Histograms; Efficient Bit-to-Symbol Likelihood Mappings; Capacity Maximizing Constellations; Natural-Language Parser for PBEM; Policy Process Editor for P(sup 3)BM Software; A Quality System Database; Trajectory Optimization: OTIS 4; and Computer Software Configuration Item-Specific Flight Software Image Transfer Script Generator.
Visualization for Molecular Dynamics Simulation of Gas and Metal Surface Interaction
NASA Astrophysics Data System (ADS)
Puzyrkov, D.; Polyakov, S.; Podryga, V.
2016-02-01
The development of methods, algorithms and applications for visualization of molecular dynamics simulation outputs is discussed. Visual analysis of the results of such calculations is a complex and pressing problem, especially for large-scale simulations. To solve this challenging task it is necessary to decide: 1) which data parameters to render, 2) which type of visualization to choose, and 3) which development tools to use. In the present work an attempt is made to answer these questions. For visualization it is proposed to draw particles at their 3D coordinates together with their velocity vectors, trajectories and volume density in the form of isosurfaces or fog. We tested a post-processing and visualization approach based on the Python language with additional libraries. Parallel software was also developed that can process large volumes of data in the 3D regions of the examined system. This software makes it possible to obtain the desired results in parallel with the calculations and, at the end, to assemble the individual rendered frames into a video file. The software package "Enthought Mayavi2" was used as the visualization tool. This application gave us the opportunity to study the interaction of a gas with a metal surface and to closely observe the adsorption effect.
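As an illustration of the kind of Python post-processing described above, the minimal sketch below renders particle positions and velocity vectors with Mayavi's mlab interface; the file layout and array names are assumptions made for the example, not the authors' actual pipeline.

```python
# Minimal sketch: render one MD frame (positions + velocities) with Mayavi.
# Assumes a plain-text frame file with columns x y z vx vy vz (illustrative format).
import numpy as np
from mayavi import mlab

frame = np.loadtxt("frame_000100.txt")          # hypothetical output file
x, y, z = frame[:, 0], frame[:, 1], frame[:, 2]
vx, vy, vz = frame[:, 3], frame[:, 4], frame[:, 5]

mlab.figure(bgcolor=(0, 0, 0))
mlab.points3d(x, y, z, scale_factor=0.3)        # particles at their 3D coordinates
mlab.quiver3d(x, y, z, vx, vy, vz)              # velocity vectors
mlab.savefig("frame_000100.png")                # frames can later be joined into a video
mlab.show()
```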
Greenville Bridge Reach, Bendway Weirs
2006-09-01
However, these receivers are more expensive and heavier due to the radio and batteries. For this study, two Magellan GPS ProMARK X-CP receivers were...used to collect float data. The Magellan GPS ProMARK X-CP is a small, robust, light receiver that can log 9 hr of both pseudorange and carrier phase...require a high degree of accuracy. Using post-processing software, pseudorange GPS data recorded by the ProMARK X-CP can be post-processed
NASA Technical Reports Server (NTRS)
Smith, Kelly M.; Gay, Robert S.; Stachowiak, Susan J.
2013-01-01
In late 2014, NASA will fly the Orion capsule on a Delta IV-Heavy rocket for the Exploration Flight Test-1 (EFT-1) mission. For EFT-1, the Orion capsule will be flying with a new GPS receiver and new navigation software. Given the experimental nature of the flight, the flight software must be robust to the loss of GPS measurements. Once the high-speed entry is complete, the drogue parachutes must be deployed within the proper conditions to stabilize the vehicle prior to deploying the main parachutes. When GPS is available in nominal operations, the vehicle will deploy the drogue parachutes based on an altitude trigger. However, when GPS is unavailable, the navigated altitude errors become excessively large, driving the need for a backup barometric altimeter to improve altitude knowledge. In order to increase overall robustness, the vehicle also has an alternate method of triggering the parachute deployment sequence based on planet-relative velocity if both the GPS and the barometric altimeter fail. However, this backup trigger results in large altitude errors relative to the targeted altitude. Motivated by this challenge, this paper demonstrates how logistic regression may be employed to semi-automatically generate robust triggers based on statistical analysis. Logistic regression is used as a ground processor pre-flight to develop a statistical classifier. The classifier would then be implemented in flight software and executed in real-time. This technique offers improved performance even in the face of highly inaccurate measurements. Although the logistic regression-based trigger approach will not be implemented within EFT-1 flight software, the methodology can be carried forward for future missions and vehicles.
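To make the pre-flight classifier idea concrete, here is a minimal hedged sketch of how a logistic-regression trigger could be trained on the ground and then evaluated in a flight-software-style loop. The feature names, thresholds and the use of scikit-learn are illustrative assumptions; the paper's actual formulation is not reproduced here.

```python
# Hedged sketch: train a logistic-regression "deploy drogues now?" classifier offline,
# then evaluate it as a simple real-time trigger. Features and data are illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical Monte Carlo training set: [planet-relative velocity (m/s), dynamic pressure (Pa)]
X = np.column_stack([rng.normal(150.0, 40.0, 5000), rng.normal(2500.0, 600.0, 5000)])
# Label = 1 when the (unknown in flight) true altitude lies inside an acceptable deploy window.
true_altitude = 8000.0 - 30.0 * X[:, 0] + rng.normal(0.0, 300.0, 5000)
y = ((true_altitude > 2500.0) & (true_altitude < 3500.0)).astype(int)

clf = LogisticRegression().fit(X, y)            # ground processing, pre-flight

def drogue_trigger(velocity_mps, qbar_pa, threshold=0.5):
    """Real-time evaluation: fire when the deploy-window probability crosses a threshold."""
    p = clf.predict_proba([[velocity_mps, qbar_pa]])[0, 1]
    return p >= threshold

print(drogue_trigger(160.0, 2600.0))
```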
NASA Technical Reports Server (NTRS)
Smith, Kelly M.; Gay, Robert S.; Stachowiak, Susan J.
2013-01-01
In late 2014, NASA will fly the Orion capsule on a Delta IV-Heavy rocket for the Exploration Flight Test-1 (EFT-1) mission. For EFT-1, the Orion capsule will be flying with a new GPS receiver and new navigation software. Given the experimental nature of the flight, the flight software must be robust to the loss of GPS measurements. Once the high-speed entry is complete, the drogue parachutes must be deployed within the proper conditions to stabilize the vehicle prior to deploying the main parachutes. When GPS is available in nominal operations, the vehicle will deploy the drogue parachutes based on an altitude trigger. However, when GPS is unavailable, the navigated altitude errors become excessively large, driving the need for a backup barometric altimeter. In order to increase overall robustness, the vehicle also has an alternate method of triggering the drogue parachute deployment based on planet-relative velocity if both the GPS and the barometric altimeter fail. However, this velocity-based trigger results in large altitude errors relative to the targeted altitude. Motivated by this challenge, this paper demonstrates how logistic regression may be employed to automatically generate robust triggers based on statistical analysis. Logistic regression is used as a ground processor pre-flight to develop a classifier. The classifier would then be implemented in flight software and executed in real-time. This technique offers excellent performance even in the face of highly inaccurate measurements. Although the logistic regression-based trigger approach will not be implemented within EFT-1 flight software, the methodology can be carried forward for future missions and vehicles.
Wall, Stephen P; Mayorga, Oliver; Banfield, Christine E; Wall, Mark E; Aisic, Ilan; Auerbach, Carl; Gennis, Paul
2006-11-01
To develop software that categorizes electronic head computed tomography (CT) reports into groups useful for clinical decision rule research. Data were obtained from the Second National Emergency X-Radiography Utilization Study, a cohort of head injury patients having received head CT. CT reports were reviewed manually for presence or absence of clinically important subdural or epidural hematoma, defined as greater than 1.0 cm in width or causing mass effect. Manual categorization was done by 2 independent researchers blinded to each other's results. A third researcher adjudicated discrepancies. A random sample of 300 reports with radiologic abnormalities was selected for software development. After excluding reports categorized manually or by software as indeterminate (neither positive nor negative), we calculated sensitivity and specificity by using manual categorization as the standard. System efficiency was defined as the percentage of reports categorized as positive or negative, regardless of accuracy. Software was refined until analysis of the training data yielded sensitivity and specificity approximating 95% and efficiency exceeding 75%. To test the system, we calculated sensitivity, specificity, and efficiency, using the remaining 1,911 reports. Of the 1,911 reports, 160 had clinically important subdural or epidural hematoma. The software exhibited good agreement with manual categorization of all reports, including indeterminate ones (weighted kappa 0.62; 95% confidence interval [CI] 0.58 to 0.65). Sensitivity, specificity, and efficiency of the computerized system for identifying manual positives and negatives were 96% (95% CI 91% to 98%), 98% (95% CI 98% to 99%), and 79% (95% CI 77% to 80%), respectively. Categorizing head CT reports by computer for clinical decision rule research is feasible.
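For readers unfamiliar with the efficiency metric used alongside sensitivity and specificity, the short sketch below computes all three from manual and software categorizations; the label coding (1 = positive, 0 = negative, None = indeterminate) is an assumption made for illustration.

```python
# Hedged sketch: sensitivity, specificity and "efficiency" of report categorization.
# Efficiency here = fraction of reports the software calls positive or negative at all.
def evaluate(manual, software):
    decided = [(m, s) for m, s in zip(manual, software) if m is not None and s is not None]
    tp = sum(1 for m, s in decided if m == 1 and s == 1)
    fn = sum(1 for m, s in decided if m == 1 and s == 0)
    tn = sum(1 for m, s in decided if m == 0 and s == 0)
    fp = sum(1 for m, s in decided if m == 0 and s == 1)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    efficiency = sum(1 for s in software if s is not None) / len(software)
    return sensitivity, specificity, efficiency

manual   = [1, 0, 0, 1, 0, None, 1, 0]   # reference standard (illustrative)
software = [1, 0, 0, 0, 0, 1,    1, None]
print(evaluate(manual, software))
```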
Computer-assisted qualitative data analysis software.
Cope, Diane G
2014-05-01
Advances in technology have provided new approaches for data collection methods and analysis for researchers. Data collection is no longer limited to paper-and-pencil format, and numerous methods are now available through Internet and electronic resources. With these techniques, researchers are not burdened with entering data manually, and data analysis is facilitated by software programs. Quantitative research is supported by the use of computer software and provides ease in the management of large data sets and rapid analysis of numeric statistical methods. New technologies are emerging to support qualitative research with the availability of computer-assisted qualitative data analysis software (CAQDAS). CAQDAS will be presented with a discussion of advantages, limitations, controversial issues, and recommendations for this type of software use.
ERIC Educational Resources Information Center
Zhang, Xuesong; Dorn, Bradley
2012-01-01
Agile development has received increasing interest both in industry and academia due to its benefits in developing software quickly, meeting customer needs, and keeping pace with the rapidly changing requirements. However, agile practices and scrum in particular have been mainly tested in mid- to large-size projects. In this paper, we present…
2007-10-28
Software Engineering, FASE, volume 3442 of Lecture Notes in Computer Science, pages 175--189. Springer, 2005. Andreas Bauer, Martin Leucker, and Jonathan ...of Personnel receiving masters degrees NAME Markus Strohmeier Gerrit Hanselmann Jonathan Streit Ernst Sassen 4Total Number: Names of personnel...developed and documented mainly within the master thesis by Jonathan Streit [Str06]: • Jonathan Streit. Development of a programming language like tem
How Does One Manage ’Information? Making Sense of the Information Being Received
2012-12-01
to manage. (Photo by PFC Franklin E. Mercado.) Report Documentation Page Form Approved OMB No. 0704-0188 Public reporting burden for the collection of...in choosing the right application. Application software is written to perform a specific task or function, and it becomes increasingly difficult...common data, virtualizing machines for all software (using one computer/server, but dividing it into logical segments), and standardizing
Mahnke, Peter
2018-01-01
A commercial software defined radio based on a Rafael Micro R820T2 tuner is characterized for use as a high-frequency lock-in amplifier for frequency modulation spectroscopy. The sensitivity limit of the receiver is 1.6 nV/√Hz. Frequency modulation spectroscopy is demonstrated on the 6406.69 cm⁻¹ absorption line of carbon monoxide.
NASA Astrophysics Data System (ADS)
Mahnke, Peter
2018-01-01
A commercial software defined radio based on a Rafael Micro R820T2 tuner is characterized for use as a high-frequency lock-in amplifier for frequency modulation spectroscopy. The sensitivity limit of the receiver is 1.6 nV/√Hz. Frequency modulation spectroscopy is demonstrated on the 6406.69 cm⁻¹ absorption line of carbon monoxide.
Optimizing the Remotely Piloted Aircraft Pilot Career Field
2011-10-01
Katana light aircraft trainers, receiving 30 to 38 hours of introductory, night, cross country and solo ...Power Journal 33, no. 2 (Summer 2009): 5-10. 51. Steve Lohr. "Software Progress Beats Moore's Law." bits.blogs.nytimes.com. March 07, 2011. http...bits.blogs.nytimes.com/2011/03/07/software-progress-beats-moores-law/ 52. US Department of Defense. "United States Air Force Unmanned Aircraft
Orbiter subsystem hardware/software interaction analysis. Volume 8: Forward reaction control system
NASA Technical Reports Server (NTRS)
Becker, D. D.
1980-01-01
The results of the orbiter hardware/software interaction analysis for the AFT reaction control system are presented. The interaction between hardware failure modes and software are examined in order to identify associated issues and risks. All orbiter subsystems and interfacing program elements which interact with the orbiter computer flight software are analyzed. The failure modes identified in the subsystem/element failure mode and effects analysis are discussed.
Sneak Analysis Application Guidelines
1982-06-01
List-of-figures excerpt (dot leaders and page numbers removed): Hardware Program Change Cost Trend, Airborne Environment; 3-11 Relative Software Program Change Costs; 3-50 Derived Software Program Change Cost by Phase, Airborne Environment; 3-51 Derived Software Program Change Cost by Phase, Ground/Water Environment; 3-52 Total Software Program Change Costs; 3-53 Sneak Analysis
Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool
NASA Technical Reports Server (NTRS)
Maul, William A.; Fulton, Christopher E.
2011-01-01
This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool was proven to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation, also provided with the ETA Tool software release package, that were used to generate the reports presented in the manual.
Zhang, Lanlan; Hub, Martina; Mang, Sarah; Thieke, Christian; Nix, Oliver; Karger, Christian P; Floca, Ralf O
2013-06-01
Radiotherapy is a fast-developing discipline that plays a major role in cancer care. Quantitative analysis of radiotherapy data can improve the success of the treatment and support the prediction of outcome. In this paper, we first identify functional, conceptual and general requirements for a software system for quantitative analysis of radiotherapy. We then present an overview of existing radiotherapy analysis software tools and check them against the stated requirements. As none of them could meet all of the demands presented herein, we analyzed possible conceptual problems and present software design solutions and recommendations to meet the stated requirements (e.g. algorithmic decoupling via a dose iterator pattern; analysis database design). As a proof of concept we developed a software library, "RTToolbox", following the presented design principles. The RTToolbox is available as an open source library and has already been tested in a larger-scale software system for different use cases. These examples demonstrate the benefit of the presented design principles. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
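The "dose iterator pattern" mentioned above decouples analysis algorithms from how dose data is stored. A minimal sketch of the idea in Python (RTToolbox itself is a C++ library; the class and method names below are illustrative, not its API):

```python
# Hedged sketch of algorithmic decoupling via a dose-iterator abstraction:
# analysis code consumes (dose, volume) pairs and never touches the storage format.
from typing import Iterator, Tuple

class GridDoseIterator:
    """Iterates over a voxel dose grid; other iterators could wrap DICOM-RT data, etc."""
    def __init__(self, dose_grid, voxel_volume_cm3: float):
        self._grid = dose_grid
        self._vol = voxel_volume_cm3

    def __iter__(self) -> Iterator[Tuple[float, float]]:
        for dose_gy in self._grid:
            yield dose_gy, self._vol

def mean_dose(doses) -> float:
    """Volume-weighted mean dose; works with any iterable that yields (dose, volume)."""
    total_dose, total_vol = 0.0, 0.0
    for dose_gy, vol in doses:
        total_dose += dose_gy * vol
        total_vol += vol
    return total_dose / total_vol

print(mean_dose(GridDoseIterator([1.8, 2.0, 2.2, 0.5], voxel_volume_cm3=0.027)))
```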
NASA Astrophysics Data System (ADS)
Blume, F.; Berglund, H.; Estey, L.
2012-12-01
In December 2005, the L2C signal was introduced to improve the accuracy, tracking and redundancy of the GPS system for civilian users. The L2C signal also provides improved SNR data when compared with the L2P(Y) legacy signal. However, GNSS network operators have been hesitant to use the new signal, as it is not well determined how positions derived from L2 carrier phase measurements are affected. L2C carrier phase is in quadrature with L2P(Y); some manufacturers correct for this when logging L2C phase while others do not. In cases where both L2C and L2P(Y) are logged simultaneously, translation software must be used carefully in order to select which phase is used in positioning. Modifications were made to UNAVCO's teqc pre-processing software to eliminate confusion; however, GNSS networks such as the IGS still suffer occasional data loss due to improperly configured GPS receivers or data flow routines. To date, L2C analyses have been restricted to special applications such as snow depth and soil moisture using SNR data, as some high-precision data analysis packages are not compatible with L2C. We use several different methods to determine the effect that tracking and logging L2C has on carrier phase measurements and positioning for various receiver models and configurations. Twenty-four hour zero-length baseline solutions using L2 show sub-millimeter differences in mean positions for both horizontal and vertical components. Direct comparisons of the L2 phase observable from RINEX files with and without the L2C observable show sub-millicycle differences. The magnitude of the variations increased at low elevations. The behavior of the L2P(Y) phase observations and positions from a given receiver was not affected by enabling L2C tracking. We find that the use of the L2C-derived carrier phase in real-time applications can be disastrous in cases where receiver brands are mixed between those that correct for quadrature and those that do not (Figure 1). Until standards are implemented for universal phase corrections in either receivers or software, the use of L2C should be avoided by real-time network operators. The complexity involved in the adoption of a single new signal on an existing GPS frequency over a period of 7 years has implications for the use of multi-GNSS systems and modernized GPS in geodetic networks.
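The quadrature issue described above amounts to a quarter-cycle offset between L2C and L2P(Y) carrier phase. Below is a hedged sketch of the kind of correction a translator could apply when a receiver is known not to align L2C with L2P(Y); the receiver list and sign convention are illustrative assumptions, not teqc's actual behavior.

```python
# Hedged sketch: align L2C carrier phase with L2P(Y) by applying a quarter-cycle shift
# for receivers assumed (for illustration only) not to correct the quadrature themselves.
QUARTER_CYCLE = 0.25  # cycles; the sign convention here is an assumption for this example

RECEIVERS_NEEDING_SHIFT = {"EXAMPLE_BRAND_A"}   # hypothetical receiver identifiers

def align_l2c_phase(phase_cycles, receiver_type):
    """Return L2C phase consistent with L2P(Y) for mixed-receiver network processing."""
    if receiver_type in RECEIVERS_NEEDING_SHIFT:
        return [p + QUARTER_CYCLE for p in phase_cycles]
    return list(phase_cycles)

print(align_l2c_phase([1234567.10, 1234570.35], "EXAMPLE_BRAND_A"))
```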
A ring transducer system for medical ultrasound research.
Waag, Robert C; Fedewa, Russell J
2006-10-01
An ultrasonic ring transducer system has been developed for experimental studies of scattering and imaging. The transducer consists of 2048 rectangular elements with a 2.5-MHz center frequency, a 67% −6-dB bandwidth, and a 0.23-mm pitch arranged in a 150-mm-diameter ring with a 25-mm elevation. At the center frequency, the element size is 0.30λ × 42λ and the pitch is 0.38λ. The system has 128 parallel transmit channels, 16 parallel receive channels, a 2048:128 transmit multiplexer, a 2048:16 receive multiplexer, independently programmable transmit waveforms with 8-bit resolution, and receive amplifiers with time-variable gain independently programmable over a 40-dB range. Receive signals are sampled at 20 MHz with 12-bit resolution. Arbitrary transmit and receive apertures can be synthesized. Calibration software minimizes system nonidealities caused by noncircularity of the ring and element-to-element response differences. Application software enables the system to be used by specification of high-level parameters in control files from which low-level hardware-dependent parameters are derived by specialized code. Use of the system is illustrated by producing focused and steered beams, synthesizing a spatially limited plane wave, measuring angular scattering, and forming B-scan images.
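As a quick consistency check of the element dimensions quoted in wavelengths, assuming a nominal sound speed of roughly 1500 m/s in water (an assumption; the abstract does not state the medium):

\[
\lambda = \frac{c}{f} \approx \frac{1500~\text{m/s}}{2.5~\text{MHz}} = 0.6~\text{mm},\qquad
\frac{0.23~\text{mm}}{0.6~\text{mm}} \approx 0.38\lambda~\text{(pitch)},\qquad
\frac{25~\text{mm}}{0.6~\text{mm}} \approx 42\lambda~\text{(elevation)}.
\]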
Verri, Fellippo Ramos; Santiago, Joel Ferreira; Almeida, Daniel Augusto; de Souza Batista, Victor Eduardo; Araujo Lemos, Cleidiel Aparecido; Mello, Caroline Cantieri; Pellizzer, Eduardo Piza
The aim of this study was to use three-dimensional finite element analysis to analyze the stress distribution transferred by single implant-supported prostheses placed in the anterior maxilla using different connections (external hexagon, internal hexagon, or Morse taper), inclinations of the load (0, 30, or 60 degrees), and surgical techniques for placement (monocortical/conventional, bicortical, or bicortical with nasal floor elevation). Nine models representing a bone block of this region were simulated by computer-aided design software (InVesalius, Rhinoceros, SolidWorks). Each model received one implant, which supported a cemented metalloceramic crown. Using FEMAP software, finite elements were discretized while simulating a 178-N load at 0, 30, and 60 degrees relative to the long axis of the implant. The problem was solved in NEi Nastran software, and postprocessing was performed in FEMAP. Von Mises stress and maximum principal stress maps were made. The von Mises stress analysis revealed that stress increased with increasing inclination of the load, from 0 to 30 to 60 degrees. Morse taper implants showed less stress concentration around the cervical and apical areas of the implant. The bicortical technique, associated or not with nasal floor elevation, contributed to decreasing the stress concentration in the apical area of the implant. Maximum principal stress analysis showed that the increase in inclination was proportional to the increase in stress on the bone tissue in the cervical area. Lower stress concentrations in the cortical bone were obtained with Morse taper implants and the bicortical technique compared with other connections and surgical techniques, respectively. Increasing the inclination of the applied force relative to the long axis of the implant tended to overload the peri-implant bone tissue and the internal structure of the implants. The Morse taper connection and bicortical techniques seemed to be more favorable than other connections or techniques, respectively, for restoring the anterior maxilla.
NASA Astrophysics Data System (ADS)
Schumacher, Florian; Friederich, Wolfgang; Lamara, Samir; Gutt, Phillip; Paffrath, Marcel
2015-04-01
We present a seismic full waveform inversion concept for applications ranging from seismological to engineering contexts, based on sensitivity kernels for full waveforms. The kernels are derived from Born scattering theory as the Fréchet derivatives of linearized frequency-domain full waveform data functionals, quantifying the influence of elastic earth model parameters and density on the data values. For a specific source-receiver combination, the kernel is computed from the displacement and strain field spectrum originating from the source, evaluated throughout the inversion domain, as well as the Green function spectrum and its strains originating from the receiver. By storing the wavefield spectra of specific sources/receivers, they can be re-used for kernel computation for different source-receiver combinations, optimizing the total number of required forward simulations. In the iterative inversion procedure, the solution of the forward problem, the computation of sensitivity kernels and the derivation of a model update are kept completely separate. In particular, the model description for the forward problem and the description of the inverted model update are kept independent. Hence, the resolution of the inverted model as well as the complexity of solving the forward problem can be iteratively increased (with increasing frequency content of the inverted data subset). This may regularize the overall inverse problem and optimizes the computational effort of both solving the forward problem and computing the model update. The required interconnection of arbitrary unstructured volume and point grids is realized by generalized high-order integration rules and 3D unstructured interpolation methods. The model update is inferred by solving a minimization problem in a least-squares sense, resulting in Gauss-Newton convergence of the overall inversion process. The inversion method is implemented in the modularized software package ASKI (Analysis of Sensitivity and Kernel Inversion), which provides a generalized interface to arbitrary external forward modelling codes. So far, the 3D spectral-element code SPECFEM3D (Tromp, Komatitsch and Liu, 2008) and the 1D semi-analytical code GEMINI (Friederich and Dalkolmo, 1995), in both Cartesian and spherical frameworks, are supported. The creation of interfaces to further forward codes is planned in the near future. ASKI is freely available under the terms of the GPL at www.rub.de/aski. Since the independent modules of ASKI must communicate via file output/input, large storage capacities need to be conveniently accessible. Storing the complete sensitivity matrix to file, however, gives the scientist full manual control over each step in a customized procedure of sensitivity/resolution analysis and full waveform inversion. In the presentation, we will show some aspects of the theory behind the full waveform inversion method and its practical realization by the software package ASKI, as well as synthetic and real-data applications from different scales and geometries.
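In schematic form (a hedged paraphrase of the Born-kernel idea, not the exact notation of the ASKI papers), the linearized relation between a frequency-domain data functional and model perturbations reads

\[
\delta d_i(\omega) \;\approx\; \int_V \sum_{p} K_{i,p}(\mathbf{x},\omega)\,\delta m_p(\mathbf{x})\,\mathrm{d}V(\mathbf{x}),
\qquad
K_{i,p} = \frac{\partial d_i}{\partial m_p},
\]

where the kernel \(K_{i,p}\) for data functional \(d_i\) and model parameter \(m_p\) (elastic moduli, density) is assembled from the source wavefield and the receiver-side Green function, which is why stored wavefield spectra can be re-used for any source-receiver pair.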
Operational Use of GPS Navigation for Space Shuttle Entry
NASA Technical Reports Server (NTRS)
Goodman, John L.; Propst, Carolyn A.
2008-01-01
The STS-118 flight of the Space Shuttle Endeavour was the first shuttle mission flown with three Global Positioning System (GPS) receivers in place of the three legacy Tactical Air Navigation (TACAN) units. This marked the conclusion of a 15 year effort involving procurement, missionization, integration, and flight testing of a GPS receiver and a parallel effort to formulate and implement shuttle computer software changes to support GPS. The use of GPS data from a single receiver in parallel with TACAN during entry was successfully demonstrated by the orbiters Discovery and Atlantis during four shuttle missions in 2006 and 2007. This provided the confidence needed before flying the first all GPS, no TACAN flight with Endeavour. A significant number of lessons were learned concerning the integration of a software intensive navigation unit into a legacy avionics system. These lessons have been taken into consideration during vehicle design by other flight programs, including the vehicle that will replace the Space Shuttle, Orion.
PIC microcontroller-based RF wireless ECG monitoring system.
Oweis, R J; Barhoum, A
2007-01-01
This paper presents a radio-telemetry system that provides the possibility of ECG signal transmission from a patient detection circuit via an RF data link. A PC then receives the signal through a National Instruments data acquisition card (NIDAQ). The PC is equipped with software allowing the received ECG signals to be saved, analysed, and sent by email to another part of the world. The proposed telemetry system consists of a patient unit and a PC unit. The amplified and filtered ECG signal is sampled 360 times per second, and the A/D conversion is performed by a PIC16F877 microcontroller. The major contribution of the final proposed system is that it detects, processes and sends a patient's ECG data over a wireless RF link to a maximum distance of 200 m. Transmitted ECG data with different numbers of samples were received, decoded by means of another PIC microcontroller, and displayed using a MATLAB program. The designed software is presented in a graphical user interface utility.
The changing landscape of astrostatistics and astroinformatics
NASA Astrophysics Data System (ADS)
Feigelson, Eric D.
2017-06-01
The history and current status of the cross-disciplinary fields of astrostatistics and astroinformatics are reviewed. Astronomers need a wide range of statistical methods for both data reduction and science analysis. With the proliferation of high-throughput telescopes, efficient large scale computational methods are also becoming essential. However, astronomers receive only weak training in these fields during their formal education. Interest in the fields is rapidly growing with conferences organized by scholarly societies, textbooks and tutorial workshops, and research studies pushing the frontiers of methodology. R, the premier language of statistical computing, can provide an important software environment for the incorporation of advanced statistical and computational methodology into the astronomical community.
Time Manager Software for a Flight Processor
NASA Technical Reports Server (NTRS)
Zoerne, Roger
2012-01-01
Data analysis is a process of inspecting, cleaning, transforming, and modeling data to highlight useful information and suggest conclusions. Accurate timestamps and a timeline of vehicle events are needed to analyze flight data. By moving the timekeeping to the flight processor, there is no longer a need for a redundant time source. If each flight processor is initially synchronized to GPS, they can freewheel and maintain a fairly accurate time throughout the flight with no additional GPS time messages received. However, additional GPS time messages will ensure an even greater accuracy. When a timestamp is required, a gettime function is called that immediately reads the time-base register.
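A minimal sketch of the freewheeling timekeeping idea described above (the register width, clock frequency and function names are assumptions for illustration, not the flight processor's actual interface):

```python
# Hedged sketch: derive a timestamp from a free-running time-base register after one
# GPS synchronization, optionally refined whenever a new GPS time message arrives.
class FlightClock:
    def __init__(self, tick_hz: float):
        self.tick_hz = tick_hz          # assumed time-base frequency
        self.sync_ticks = 0
        self.sync_gps_seconds = 0.0

    def synchronize(self, register_ticks: int, gps_seconds: float):
        """Record the pairing of a register reading with a GPS time message."""
        self.sync_ticks = register_ticks
        self.sync_gps_seconds = gps_seconds

    def gettime(self, register_ticks: int) -> float:
        """Freewheel: extrapolate GPS time from elapsed time-base ticks."""
        return self.sync_gps_seconds + (register_ticks - self.sync_ticks) / self.tick_hz

clock = FlightClock(tick_hz=50_000_000.0)                  # illustrative 50 MHz time base
clock.synchronize(register_ticks=1_000_000, gps_seconds=316_800.0)
print(clock.gettime(register_ticks=251_000_000))           # about 5 s later
```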
New data processing for multichannel FIR laser interferometer
NASA Astrophysics Data System (ADS)
Jun-Ben, Chen; Xiang, Gao
1989-10-01
Usually, both the probing and reference signals received by the LATGS detectors of a FIR interferometer pass through a hardware phase discriminator, and the output phase difference, and hence the electron line density, is collected for analysis and display with a computerized data acquisition system (DAS). In this paper, a new numerical method for computing the phase difference in software is developed in place of the hardware phase discriminator, improving temporal resolution and stability. An asymmetrical Abel inversion is applied to process the data from a seven-channel FIR HCN laser interferometer, and the space-time distributions of plasma electron density in the HT-6M tokamak are derived.
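The abstract does not give the numerical method; one common software approach is to extract each channel's instantaneous phase with an analytic (Hilbert-transform) signal and difference the probe and reference phases. A hedged sketch under that assumption, with synthetic signals:

```python
# Hedged sketch: phase-difference extraction in software via the analytic signal,
# replacing a hardware phase discriminator. Signals below are synthetic examples.
import numpy as np
from scipy.signal import hilbert

fs = 1.0e6                           # illustrative sample rate (Hz)
t = np.arange(0, 0.01, 1.0 / fs)
f_if = 10.0e3                        # illustrative intermediate frequency (Hz)
plasma_phase = 2.0 * np.sin(2.0 * np.pi * 50.0 * t)       # fake slowly varying phase shift

reference = np.cos(2.0 * np.pi * f_if * t)
probe = np.cos(2.0 * np.pi * f_if * t + plasma_phase)

# Instantaneous phase of each channel, then unwrap the difference.
phase_ref = np.unwrap(np.angle(hilbert(reference)))
phase_probe = np.unwrap(np.angle(hilbert(probe)))
delta_phi = phase_probe - phase_ref  # proportional to line-integrated electron density

print(delta_phi[:5])
```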
NASA Tech Briefs, November 2003
NASA Technical Reports Server (NTRS)
2003-01-01
Topics covered include: Computer Program Recognizes Patterns in Time-Series Data; Program for User-Friendly Management of Input and Output Data Sets; Noncoherent Tracking of a Source of a Data-Modulated Signal; Software for Acquiring Image Data for PIV; Detecting Edges in Images by Use of Fuzzy Reasoning; A Timer for Synchronous Digital Systems; Prototype Parts of a Digital Beam-Forming Wide-Band Receiver; High-Voltage Droplet Dispenser; Network Extender for MIL-STD-1553 Bus; MMIC HEMT Power Amplifier for 140 to 170 GHz; Piezoelectric Diffraction-Based Optical Switches; Numerical Modeling of Nanoelectronic Devices; Organizing Diverse, Distributed Project Information; Eigensolver for a Sparse, Large Hermitian Matrix; Modified Polar-Format Software for Processing SAR Data; e-Stars Template Builder; Software for Acoustic Rendering; Functionally Graded Nanophase Beryllium/Carbon Composites; Thin Thermal-Insulation Blankets for Very High Temperatures; Prolonging Microgravity on Parabolic Airplane Flights; Device for Locking a Control Knob; Cable-Dispensing Cart; Foam Sensor Structures Would be Self-Deployable and Survive Hard Landings; Real-Gas Effects on Binary Mixing Layers; Earth-Space Link Attenuation Estimation via Ground Radar Kdp; Wedge Heat-Flux Indicators for Flash Thermography; Measuring Diffusion of Liquids by Common-Path Interferometry; Zero-Shear, Low-Disturbance Optical Delay Line; Whispering-Gallery Mode-Locked Lasers; Spatial Light Modulators as Optical Crossbar Switches; Update on EMD and Hilbert-Spectra Analysis of Time Series; Quad-Tree Visual-Calculus Analysis of Satellite Coverage; Dyakonov-Perel Effect on Spin Dephasing in n-Type GaAs; Update on Area Production in Mixing of Supercritical Fluids; and Quasi-Sun-Pointing of Spacecraft Using Radiation Pressure.
Theoretical and software considerations for nonlinear dynamic analysis
NASA Technical Reports Server (NTRS)
Schmidt, R. J.; Dodds, R. H., Jr.
1983-01-01
In the finite element method for structural analysis, it is generally necessary to discretize the structural model into a very large number of elements to accurately evaluate displacements, strains, and stresses. As the complexity of the model increases, the number of degrees of freedom can easily exceed the capacity of present-day software systems. Improvements to structural analysis software, including more efficient use of existing hardware and improved structural modeling techniques, are discussed. One modeling technique that is used successfully in static linear and nonlinear analysis is multilevel substructuring. This research extends the use of multilevel substructure modeling to include dynamic analysis and defines the requirements for a general purpose software system capable of efficient nonlinear dynamic analysis. The multilevel substructuring technique is presented, the analytical formulations and computational procedures for dynamic analysis and nonlinear mechanics are reviewed, and an approach to the design and implementation of a general purpose structural software system is presented.
Using recurrence plot analysis for software execution interpretation and fault detection
NASA Astrophysics Data System (ADS)
Mosdorf, M.
2015-09-01
This paper shows a method targeted at software execution interpretation and fault detection using recurrence plot analysis. In the proposed approach, recurrence plot analysis is applied to a software execution trace that contains the executed assembly instructions. The results of this analysis are subjected to further processing with principal component analysis (PCA), which reduces the number of coefficients used for software execution classification. This method was used for the analysis of five algorithms: Bubble Sort, Quick Sort, Median Filter, FIR, and SHA-1. The results show that some of the collected traces could be easily assigned to particular algorithms (logs from the Bubble Sort and FIR algorithms) while others are more difficult to distinguish.
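A minimal hedged sketch of the recurrence-plot step applied to an execution trace (the trace encoding, threshold and summary features here are illustrative choices, not the paper's exact parameters):

```python
# Hedged sketch: build a recurrence matrix from an execution trace of opcode IDs,
# then summarize it with simple coefficients that PCA could further reduce.
import numpy as np

def recurrence_matrix(trace, threshold=0.0):
    """R[i, j] = 1 where trace samples i and j are closer than the threshold."""
    x = np.asarray(trace, dtype=float)
    dist = np.abs(x[:, None] - x[None, :])
    return (dist <= threshold).astype(int)

# Illustrative "trace": numeric IDs of executed assembly instructions.
trace = [3, 7, 7, 1, 3, 7, 7, 1, 3, 7, 7, 1]
R = recurrence_matrix(trace)

# Simple summary features (recurrence rate per diagonal offset); PCA over such
# feature vectors from many traces could then separate the algorithms.
features = np.array([np.mean(np.diag(R, k)) for k in range(1, len(trace))])
print(R)
print(features)
```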
Shenoy, Shailesh M
2016-07-01
A challenge in any imaging laboratory, especially one that uses modern techniques, is to achieve a sustainable and productive balance between using open source and commercial software to perform quantitative image acquisition, analysis and visualization. In addition to considering the expense of software licensing, one must consider factors such as the quality and usefulness of the software's support, training and documentation. Also, one must consider the reproducibility with which multiple people generate results using the same software to perform the same analysis, how one may distribute their methods to the community using the software and the potential for achieving automation to improve productivity.
Analyzing qualitative data with computer software.
Weitzman, E A
1999-01-01
OBJECTIVE: To provide health services researchers with an overview of the qualitative data analysis process and the role of software within it; to provide a principled approach to choosing among software packages to support qualitative data analysis; to alert researchers to the potential benefits and limitations of such software; and to provide an overview of the developments to be expected in the field in the near future. DATA SOURCES, STUDY DESIGN, METHODS: This article does not include reports of empirical research. CONCLUSIONS: Software for qualitative data analysis can benefit the researcher in terms of speed, consistency, rigor, and access to analytic methods not available by hand. Software, however, is not a replacement for methodological training. PMID:10591282
Interference-Detection Module in a Digital Radar Receiver
NASA Technical Reports Server (NTRS)
Fischman, Mark; Berkun, Andrew; Chu, Anhua; Freedman, Adam; Jourdan, Michael; McWatters, Dalia; Paller, Mimi
2009-01-01
A digital receiver in a 1.26-GHz spaceborne radar scatterometer now undergoing development includes a module for detecting radio-frequency interference (RFI) that could contaminate scientific data intended to be acquired by the scatterometer. The role of the RFI-detection module is to identify time intervals during which the received signal is likely to be contaminated by RFI and thereby to enable exclusion, from further scientific data processing, of signal data acquired during those intervals. The underlying concepts of detection of RFI and rejection of RFI-contaminated signal data are also potentially applicable in advanced terrestrial radio receivers, including software-defined radio receivers in general, receivers in cellular telephones and other wireless consumer electronic devices, and receivers in automotive collision-avoidance radar systems.
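The abstract does not specify the detection algorithm; a simple illustrative approach is to flag intervals whose received power exceeds a running noise-floor estimate by some margin. A hedged sketch under that assumption:

```python
# Hedged sketch: flag time intervals likely contaminated by RFI using a running
# noise-floor estimate and a power threshold. Parameters are illustrative only.
import numpy as np

def flag_rfi(samples, interval_len=256, margin_db=6.0):
    """Return one flag per interval: True = likely RFI, exclude from science processing."""
    n = len(samples) // interval_len
    power = np.array([np.mean(np.abs(samples[i*interval_len:(i+1)*interval_len])**2)
                      for i in range(n)])
    noise_floor = np.median(power)               # robust estimate of interference-free power
    return 10.0 * np.log10(power / noise_floor) > margin_db

rng = np.random.default_rng(1)
data = rng.normal(size=4096) + 1j * rng.normal(size=4096)
data[1024:1280] += 8.0 * np.exp(2j * np.pi * 0.1 * np.arange(256))   # injected interferer
print(flag_rfi(data))
```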
Jenson, Susan K.; Domingue, Julia O.
1988-01-01
The first phase of analysis is a conditioning phase that generates three data sets: the original DEM with depressions filled, a data set indicating the flow direction for each cell, and a flow accumulation data set in which each cell receives a value equal to the total number of cells that drain to it. The original DEM and these three derivative data sets can then be processed in a variety of ways to optionally delineate drainage networks, overland paths, watersheds for user-specified locations, sub-watersheds for the major tributaries of a drainage network, or pour point linkages between watersheds. The computer-generated drainage lines and watershed polygons and the pour point linkage information can be transferred to vector-based geographic information systems for further analysis. Comparisons between these computer-generated features and their manually delineated counterparts generally show close agreement, indicating that these software tools will save analyst time spent in manual interpretation and digitizing.
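A minimal hedged sketch of the flow-direction and flow-accumulation idea on a tiny grid (a generic D8-style illustration, not the authors' implementation):

```python
# Hedged sketch: D8-style flow direction and flow accumulation on a small DEM.
# Each cell drains to its steepest-drop neighbor; accumulation counts upstream cells.
import numpy as np

dem = np.array([[9.0, 8.0, 7.0],
                [8.0, 6.0, 5.0],
                [7.0, 5.0, 3.0]])          # toy elevation grid, outlet at lower right

rows, cols = dem.shape
offsets = [(-1,-1), (-1,0), (-1,1), (0,-1), (0,1), (1,-1), (1,0), (1,1)]

def downstream(r, c):
    """Return the neighbor cell this cell drains to, or None if it is a pit/outlet."""
    best, best_drop = None, 0.0
    for dr, dc in offsets:
        rr, cc = r + dr, c + dc
        if 0 <= rr < rows and 0 <= cc < cols and dem[r, c] - dem[rr, cc] > best_drop:
            best, best_drop = (rr, cc), dem[r, c] - dem[rr, cc]
    return best

acc = np.zeros_like(dem, dtype=int)
# Process cells from highest to lowest so upstream counts are final before being passed on.
for r, c in sorted(np.ndindex(rows, cols), key=lambda rc: -dem[rc]):
    d = downstream(r, c)
    if d is not None:
        acc[d] += acc[r, c] + 1

print(acc)   # the outlet cell accumulates every other cell that drains to it
```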
An Analysis of Mission Critical Computer Software in Naval Aviation
1991-03-01
AN ANALYSIS OF MISSION CRITICAL COMPUTER SOFTWARE IN NAVAL AVIATION...software development schedules were sustained without a milestone change being made. Also, software that was released to the fleet had no major...fleet contain any major defects? This research has revealed that only about half of the original software development schedules were sustained without a
Enhancement of computer system for applications software branch
NASA Technical Reports Server (NTRS)
Bykat, Alex
1987-01-01
Presented is a compilation of the history of a two-month project concerned with a survey, evaluation, and specification of a new computer system for the Applications Software Branch of the Software and Data Management Division of the Information and Electronic Systems Laboratory of Marshall Space Flight Center, NASA. Information gathering consisted of discussions and surveys of branch activities, evaluation of computer manufacturer literature, and presentations by vendors. Information gathering was followed by evaluation of the vendors' systems. The criteria for the latter were: the (tentative) architecture selected for the new system, the type of network architecture supported, software tools, and to some extent the price. The information received from the vendors, as well as additional research, led to a detailed design of a suitable system. This design included considerations of hardware and software environments as well as personnel issues such as training. Design of the system culminated in a recommendation for a new computing system for the Branch.
LV software support for supersonic flow analysis
NASA Technical Reports Server (NTRS)
Bell, W. A.; Lepicovsky, J.
1992-01-01
The software for configuring an LV counter processor system has been developed using structured design. The LV system includes up to three counter processors and a rotary encoder. The software for configuring and testing the LV system has been developed, tested, and included in an overall software package for data acquisition, analysis, and reduction. Error handling routines respond to both operator and instrument errors which often arise in the course of measuring complex, high-speed flows. The use of networking capabilities greatly facilitates the software development process by allowing software development and testing from a remote site. In addition, high-speed transfers allow graphics files or commands to provide viewing of the data from a remote site. Further advances in data analysis require corresponding advances in procedures for statistical and time series analysis of nonuniformly sampled data.
LV software support for supersonic flow analysis
NASA Technical Reports Server (NTRS)
Bell, William A.
1992-01-01
The software for configuring a Laser Velocimeter (LV) counter processor system was developed using structured design. The LV system includes up to three counter processors and a rotary encoder. The software for configuring and testing the LV system was developed, tested, and included in an overall software package for data acquisition, analysis, and reduction. Error handling routines respond to both operator and instrument errors which often arise in the course of measuring complex, high-speed flows. The use of networking capabilities greatly facilitates the software development process by allowing software development and testing from a remote site. In addition, high-speed transfers allow graphics files or commands to provide viewing of the data from a remote site. Further advances in data analysis require corresponding advances in procedures for statistical and time series analysis of nonuniformly sampled data.
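The closing point about nonuniformly sampled data is commonly addressed with the Lomb-Scargle periodogram, which does not require evenly spaced samples (LV counter data arrive at random particle-arrival times). A hedged illustration, not part of the software described above:

```python
# Hedged sketch: spectral analysis of nonuniformly sampled (LV-style) velocity data
# using the Lomb-Scargle periodogram from SciPy.
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(2)
t = np.sort(rng.uniform(0.0, 1.0, 500))            # random particle arrival times (s)
u = 10.0 + 0.5 * np.sin(2.0 * np.pi * 40.0 * t) + 0.1 * rng.normal(size=t.size)

freqs_hz = np.linspace(1.0, 100.0, 500)
pgram = lombscargle(t, u - u.mean(), 2.0 * np.pi * freqs_hz)  # expects angular frequencies

print(freqs_hz[np.argmax(pgram)])   # should recover the ~40 Hz fluctuation
```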
NASA Technical Reports Server (NTRS)
1976-01-01
The engineering analyses and evaluation studies conducted for the Software Requirements Analysis are discussed. Included are the development of the study data base, synthesis of implementation approaches for software required by both mandatory onboard computer services and command/control functions, and identification and implementation of software for ground processing activities.
A Real-Time Capable Software-Defined Receiver Using GPU for Adaptive Anti-Jam GPS Sensors
Seo, Jiwon; Chen, Yu-Hsuan; De Lorenzo, David S.; Lo, Sherman; Enge, Per; Akos, Dennis; Lee, Jiyun
2011-01-01
Due to their weak received signal power, Global Positioning System (GPS) signals are vulnerable to radio frequency interference. Adaptive beam and null steering of the gain pattern of a GPS antenna array can significantly increase the resistance of GPS sensors to signal interference and jamming. Since adaptive array processing requires intensive computational power, beamsteering GPS receivers were usually implemented using hardware such as field-programmable gate arrays (FPGAs). However, a software implementation using general-purpose processors is much more desirable because of its flexibility and cost effectiveness. This paper presents a GPS software-defined radio (SDR) with adaptive beamsteering capability for anti-jam applications. The GPS SDR design is based on an optimized desktop parallel processing architecture using a quad-core Central Processing Unit (CPU) coupled with a new generation Graphics Processing Unit (GPU) having massively parallel processors. This GPS SDR demonstrates sufficient computational capability to support a four-element antenna array and future GPS L5 signal processing in real time. After providing the details of our design and optimization schemes for future GPU-based GPS SDR developments, the jamming resistance of our GPS SDR under synthetic wideband jamming is presented. Since the GPS SDR uses commercial-off-the-shelf hardware and processors, it can be easily adopted in civil GPS applications requiring anti-jam capabilities. PMID:22164116
A real-time capable software-defined receiver using GPU for adaptive anti-jam GPS sensors.
Seo, Jiwon; Chen, Yu-Hsuan; De Lorenzo, David S; Lo, Sherman; Enge, Per; Akos, Dennis; Lee, Jiyun
2011-01-01
Due to their weak received signal power, Global Positioning System (GPS) signals are vulnerable to radio frequency interference. Adaptive beam and null steering of the gain pattern of a GPS antenna array can significantly increase the resistance of GPS sensors to signal interference and jamming. Since adaptive array processing requires intensive computational power, beamsteering GPS receivers were usually implemented using hardware such as field-programmable gate arrays (FPGAs). However, a software implementation using general-purpose processors is much more desirable because of its flexibility and cost effectiveness. This paper presents a GPS software-defined radio (SDR) with adaptive beamsteering capability for anti-jam applications. The GPS SDR design is based on an optimized desktop parallel processing architecture using a quad-core Central Processing Unit (CPU) coupled with a new generation Graphics Processing Unit (GPU) having massively parallel processors. This GPS SDR demonstrates sufficient computational capability to support a four-element antenna array and future GPS L5 signal processing in real time. After providing the details of our design and optimization schemes for future GPU-based GPS SDR developments, the jamming resistance of our GPS SDR under synthetic wideband jamming is presented. Since the GPS SDR uses commercial-off-the-shelf hardware and processors, it can be easily adopted in civil GPS applications requiring anti-jam capabilities.
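As a rough illustration of the adaptive null steering such a receiver performs, the sketch below computes power-minimization (MVDR-style) weights for a four-element array; it is a generic beamforming example under an assumed array geometry, not the authors' GPU implementation.

```python
# Hedged sketch: minimum-power distortionless-response (MVDR-style) weights for a
# four-element array, steering gain toward a GPS satellite and a null toward a jammer.
import numpy as np

def steering_vector(angle_rad, n_elements=4, spacing_wavelengths=0.5):
    k = 2.0 * np.pi * spacing_wavelengths * np.arange(n_elements)
    return np.exp(1j * k * np.sin(angle_rad))

rng = np.random.default_rng(3)
n = 4096
a_sat = steering_vector(np.deg2rad(20.0))     # assumed satellite direction
a_jam = steering_vector(np.deg2rad(-50.0))    # assumed jammer direction

# Snapshots: weak GPS signal buried in noise plus a strong jammer.
x = (0.01 * np.outer(a_sat, rng.normal(size=n)) +
     10.0 * np.outer(a_jam, rng.normal(size=n)) +
     (rng.normal(size=(4, n)) + 1j * rng.normal(size=(4, n))) / np.sqrt(2.0))

R = x @ x.conj().T / n                         # sample covariance matrix
w = np.linalg.solve(R, a_sat)
w /= (a_sat.conj() @ w)                        # distortionless constraint toward the satellite

print(abs(w.conj() @ a_sat), abs(w.conj() @ a_jam))   # near 1 vs. strongly suppressed
```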
NASA Astrophysics Data System (ADS)
Chen, R.; Xi, X.; Zhao, X.; He, L.; Yao, H.; Shen, R.
2016-12-01
Dense 3D magnetotelluric (MT) data acquisition has the benefit of suppressing static shift and topography effects and can achieve high-precision, high-resolution inversion of underground structure. This method may play an important role in mineral exploration, geothermal resources exploration, and hydrocarbon exploration. It is necessary to greatly reduce the power consumption of an MT signal receiver for large-scale 3D MT data acquisition, while using a sensor network to monitor the data quality of deployed MT receivers. We adopted a series of technologies to realize the above goal. First, we designed a low-power embedded computer that couples tightly with the other parts of the MT receiver and supports a wireless sensor network. The power consumption of our embedded computer is less than 1 watt. Then we designed a 4-channel data acquisition subsystem that supports 24-bit analog-digital conversion, GPS synchronization, and real-time digital signal processing. Furthermore, we developed the power supply and power management subsystem for the MT receiver. Finally, a series of software packages, which support data acquisition, calibration, the wireless sensor network, and testing, were developed. The software running on a personal computer can monitor and control over 100 MT receivers in the field for data acquisition and quality control. The total power consumption of the receiver is about 2 watts at full operation, and the standby power consumption is less than 0.1 watt. Our testing showed that the MT receiver can acquire good-quality data on the ground with an electric dipole length of 3 m. Over 100 MT receivers were made and used for large-scale geothermal exploration in China with great success.
Single-Receiver GPS Phase Bias Resolution
NASA Technical Reports Server (NTRS)
Bertiger, William I.; Haines, Bruce J.; Weiss, Jan P.; Harvey, Nathaniel E.
2010-01-01
Existing software has been modified to yield the benefits of integer-fixed double-differenced GPS phase ambiguities when processing data from a single GPS receiver with no access to any other GPS receiver data. When the double-differenced combination of phase biases can be fixed reliably, a significant improvement in solution accuracy is obtained. This innovation uses a large global set of GPS receivers (40 to 80 receivers) to solve for the GPS satellite orbits and clocks (along with any other parameters). In this process, integer ambiguities are fixed and information on the ambiguity constraints is saved. For each GPS transmitter/receiver pair, the process saves the arc start and stop times, the wide-lane average value for the arc, the standard deviation of the wide lane, and the dual-frequency phase bias after bias fixing for the arc. The second step of the process uses the orbit and clock information, the bias information from the global solution, and only data from the single receiver to resolve double-differenced phase combinations. It is called "resolved" instead of "fixed" because constraints are introduced into the problem with a finite data weight to better account for possible errors. A receiver in orbit has much shorter continuous passes of data than a receiver fixed to the Earth, and the method has parameters to account for this. In particular, differences in drifting wide-lane values must be handled differently. The first step of the process is automated, using two JPL software sets, Longarc and Gipsy-Oasis. The resulting orbit/clock and bias information files are posted on anonymous ftp for use by any licensed Gipsy-Oasis user. The second step is implemented in the Gipsy-Oasis executable, gd2p.pl, which automates the entire process, including fetching the information from anonymous ftp.
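The wide-lane averages mentioned above are commonly formed with the Melbourne-Wübbena combination of dual-frequency phase and pseudorange, whose mean over an arc estimates the wide-lane ambiguity. A hedged illustration of that combination (a generic GPS L1/L2 formulation, not JPL's specific implementation):

```python
# Hedged sketch: Melbourne-Wuebbena wide-lane combination for one satellite arc.
# Phase inputs in cycles, pseudoranges in meters; the epochs below are synthetic.
import numpy as np

C = 299792458.0
F1, F2 = 1575.42e6, 1227.60e6           # GPS L1 / L2 frequencies (Hz)
LAM_WL = C / (F1 - F2)                   # wide-lane wavelength, about 86 cm

def widelane_ambiguity(L1_cycles, L2_cycles, P1_m, P2_m):
    """Return per-epoch MW wide-lane values (cycles) plus the arc mean and std dev."""
    phase_wl = L1_cycles - L2_cycles                       # wide-lane phase (cycles)
    range_nl = (F1 * P1_m + F2 * P2_m) / (F1 + F2)         # narrow-lane pseudorange (m)
    mw = phase_wl - range_nl / LAM_WL
    return mw, np.mean(mw), np.std(mw)

# Tiny synthetic arc: the true wide-lane ambiguity here is 7 cycles plus noise.
rng = np.random.default_rng(4)
rho = 2.2e7 + 100.0 * np.arange(10)                        # geometric range (m)
P1 = rho + rng.normal(0.0, 0.5, 10)
P2 = rho + rng.normal(0.0, 0.5, 10)
L1 = rho * F1 / C + 3.0
L2 = rho * F2 / C - 4.0                                    # L1 - L2 ambiguity = 7 cycles
print(widelane_ambiguity(L1, L2, P1, P2))
```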
NASA Astrophysics Data System (ADS)
Sprinks, James Christopher; Wardlaw, Jessica; Houghton, Robert; Bamford, Steven; Marsh, Stuart
2016-10-01
Citizen science platforms allow untrained volunteers to take part in scientific research across a range of disciplines, and often involve the analysis of remotely sensed imagery. The data collected by increasingly advanced and automated instruments has made planetary science a prime candidate for, and user of, online citizen science platforms. In order to process this large volume of information, such analysis is increasingly performed in conjunction with data-mining software, with varying configurations of computer and volunteer contribution. Despite citizen science being a relatively new approach, there is a growing field of research considering the practice in its own right beyond the scientific problems it addresses, with studies of interface HCI, platform functionality, and motivation in particular adding to a growing body of citizen science scholarship. Through iterations of the FP7 iMars project's 'Mars in Motion' platform, the work presented studied the effect that guidance information had on volunteers' accuracy and trust. Whilst analysing imagery for change, volunteers were told whether automated change detection software or the consensus of other citizen scientists had found change, with this information varying in accuracy. Results showed that volunteers' ability to identify both change and the type of feature undergoing change was improved when the software result and the crowd-opinion guidance information provided had greater accuracy. However, when guidance information was less accurate, volunteers' level of trust fell at a sharper rate when it came from the crowd than when it came from the algorithm, and participants reported more frustration - a counter-intuitive result compared to existing research. Citizen science practitioners need to consider the information they provide to volunteers and how they present it; the results of software analysis or the consensus of a crowd need to be conclusive and above all accurate in order to improve both the performance and engagement of their volunteer community. The research leading to these results has received funding from the European Union's Seventh Framework Programme (FP7/2007-2013) under iMars grant agreement 607379.
Control system development for a 1 MW/e/ solar thermal power plant
NASA Technical Reports Server (NTRS)
Daubert, E. R.; Bergthold, F. M., Jr.; Fulton, D. G.
1981-01-01
The point-focusing distributed receiver power plant considered consists of a number of power modules delivering power to a central collection point. Each power module contains a parabolic dish concentrator with a closed-cycle receiver/turbine/alternator assembly. Currently, a single-module prototype plant is under construction. The major control system tasks required are related to concentrator pointing control, receiver temperature control, and turbine speed control. Attention is given to operational control details, control hardware and software, and aspects of CRT output display.
Design of a receiver operating characteristic (ROC) study of 10:1 lossy image compression
NASA Astrophysics Data System (ADS)
Collins, Cary A.; Lane, David; Frank, Mark S.; Hardy, Michael E.; Haynor, David R.; Smith, Donald V.; Parker, James E.; Bender, Gregory N.; Kim, Yongmin
1994-04-01
The digital archiving system at Madigan Army Medical Center (MAMC) uses a 10:1 lossy data compression algorithm for most forms of computed radiography. A systematic study on the potential effect of lossy image compression on patient care has been initiated with a series of studies focused on specific diagnostic tasks. The studies are based upon the receiver operating characteristic (ROC) method of analysis for diagnostic systems. The null hypothesis is that observer performance with approximately 10:1 compressed and decompressed images is not different from using original, uncompressed images for detecting subtle pathologic findings seen on computed radiographs of bone, chest, or abdomen, when viewed on a high-resolution monitor. Our design involves collecting cases from eight pathologic categories. Truth is determined by committee using confirmatory studies performed during routine clinical practice whenever possible. Software has been developed to aid in case collection and to allow reading of the cases for the study using stand-alone Siemens Litebox workstations. Data analysis uses two methods, ROC analysis and free-response ROC (FROC) methods. This study will be one of the largest ROC/FROC studies of its kind and could benefit clinical radiology practice using PACS technology. The study design and results from a pilot FROC study are presented.
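For context on the analysis step, here is a minimal hedged sketch of an ROC comparison between readings of compressed and original images; the confidence scores are synthetic, and the study's actual ROC/FROC fitting is more involved than a simple area-under-the-curve comparison.

```python
# Hedged sketch: compare reader performance on original vs. 10:1 compressed images
# by the area under the ROC curve. Confidence ratings below are synthetic examples.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(5)
truth = rng.integers(0, 2, 200)                       # 1 = subtle finding present

# Hypothetical reader confidence ratings (higher = more suspicious) for both conditions.
scores_original = truth * rng.normal(2.0, 1.0, 200) + rng.normal(0.0, 1.0, 200)
scores_compressed = truth * rng.normal(1.9, 1.0, 200) + rng.normal(0.0, 1.0, 200)

auc_orig = roc_auc_score(truth, scores_original)
auc_comp = roc_auc_score(truth, scores_compressed)
print(auc_orig, auc_comp)    # the null hypothesis is that these do not differ
```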
Research of real-time communication software
NASA Astrophysics Data System (ADS)
Li, Maotang; Guo, Jingbo; Liu, Yuzhong; Li, Jiahong
2003-11-01
Real-time communication has been playing an increasingly important role in our work and life as well as in ocean monitoring. With the rapid progress of computer and communication techniques and the miniaturization of communication systems, adaptable and reliable real-time communication software is needed for ocean monitoring systems. This paper describes research on real-time communication software based on a point-to-point satellite intercommunication system. An object-oriented design method is adopted, which can transmit and receive video, audio, and engineering data over a satellite channel. Several software modules were developed that realize point-to-point satellite intercommunication in the ocean monitoring system. The real-time communication software has three advantages. First, it increases the reliability of the point-to-point satellite intercommunication system. Second, several optional parameters are built in, which greatly increases the flexibility of the system. Third, some hardware is replaced by the real-time communication software, which not only decreases the cost of the system and promotes the miniaturization of the communication system, but also increases its agility.
GammaLib and ctools. A software framework for the analysis of astronomical gamma-ray data
NASA Astrophysics Data System (ADS)
Knödlseder, J.; Mayer, M.; Deil, C.; Cayrou, J.-B.; Owen, E.; Kelley-Hoskins, N.; Lu, C.-C.; Buehler, R.; Forest, F.; Louge, T.; Siejkowski, H.; Kosack, K.; Gerard, L.; Schulz, A.; Martin, P.; Sanchez, D.; Ohm, S.; Hassan, T.; Brau-Nogué, S.
2016-08-01
The field of gamma-ray astronomy has seen important progress during the last decade, yet to date no common software framework has been developed for the scientific analysis of gamma-ray telescope data. We propose to fill this gap by means of the GammaLib software, a generic library that we have developed to support the analysis of gamma-ray event data. GammaLib was written in C++ and all functionality is available in Python through an extension module. Based on this framework we have developed the ctools software package, a suite of software tools that enables flexible workflows to be built for the analysis of Imaging Air Cherenkov Telescope event data. The ctools are inspired by science analysis software available for existing high-energy astronomy instruments, and they follow the modular ftools model developed by the High Energy Astrophysics Science Archive Research Center. The ctools were written in Python and C++, and can be either used from the command line via shell scripts or directly from Python. In this paper we present the GammaLib and ctools software versions 1.0 that were released at the end of 2015. GammaLib and ctools are ready for the science analysis of Imaging Air Cherenkov Telescope event data, and also support the analysis of Fermi-LAT data and the exploitation of the COMPTEL legacy data archive. We propose using ctools as the science tools software for the Cherenkov Telescope Array Observatory.
Li, Hui; Li, Ming; Zhang, Jianning; Li, Xiangcui; Tan, Junying; Ji, Bobo
2016-01-01
To evaluate the clinical value of lidocaine in the treatment of tinnitus through three routes of administration (intravenous, intratympanic, and acupoint injection) by analyzing the literature. Articles were collected through Hownet, Wanfang, VIP, PubMed, SciVerse ScienceDirect, Springer, OVID, etc. The articles were strictly evaluated based on their quality. A meta-analysis of the outcomes was performed with RevMan 5.2 software. A total of 16 articles with 1203 patients were enrolled in the analysis. Their tinnitus history ranged from 7 hours to 20 years. Assessment methods included tinnitus loudness levels, severity scales, and subjective feelings. None of the articles reported how long the effect was maintained, stating only "short-term", "short", and so on. A total of 133 cases received intravenous injection and the effective rate was 73.4% (98 cases); 50 cases and 332 cases received intratympanic and acupoint injection, respectively, with effective rates of 74.0% and 87.7%. The effective rate ranged from 42.4% to 58.3% in the control group. The meta-analysis results indicate that all three routes of lidocaine administration are more effective than conventional methods (P < 0.05). Different routes of lidocaine administration have a good but short-lived effect on tinnitus control. Lidocaine can effectively reduce the time to tinnitus habituation as a complementary treatment, but its value still needs further evaluation.
NASA Technical Reports Server (NTRS)
Currit, P. A.
1983-01-01
The Cleanroom software development methodology is designed to take the gamble out of product releases for both suppliers and receivers of the software. The ingredients of this procedure are a life cycle of executable product increments, representative statistical testing, and a standard estimate of the MTTF (Mean Time To Failure) of the product at the time of its release. A statistical approach to software product testing using randomly selected samples of test cases is considered. A statistical model is defined for the certification process which uses the timing data recorded during test. A reasonableness argument for this model is provided that uses previously published data on software product execution. Also included is a derivation of the certification model estimators and a comparison of the proposed least squares technique with the more commonly used maximum likelihood estimators.
Influence of vibration on the coupling efficiency in spatial receiver and its compensation method
NASA Astrophysics Data System (ADS)
Hu, Qinggui; Mu, Yining
2018-04-01
In order to analyze the loss in a free-space optical receiver caused by vibration, we set up coordinate systems on the receiving lens surface and on the receiving fiber surface. Then, with Gaussian optics theory, the coupling efficiency equation is obtained, and the solution is calculated with MATLAB® software. To lower the impact of vibration, a directional tapered communication fiber receiver is proposed. A sample was then produced and two experiments were performed. The first experiment shows that the coupling efficiency of the new receiver is higher than that of the traditional one. The second experiment shows that the bit error rate of the new receiver is lower. Both experiments show that the new receiver improves the receiving system's tolerance to vibration.
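The coupling-efficiency equation itself is not reproduced in the abstract. As a rough, hedged illustration of the kind of Gaussian-optics calculation involved, and of how a vibration-induced lateral offset degrades fiber coupling, the sketch below uses the standard overlap-integral result for two fundamental Gaussian modes; the spot size, fiber mode radius, and offsets are invented values, not the paper's parameters.

    import numpy as np

    def gaussian_coupling_efficiency(w_spot, w_fiber, offset):
        """Overlap of two fundamental Gaussian modes (focused spot vs. fiber mode)
        with a lateral offset; a textbook approximation, not the paper's model."""
        w2 = w_spot**2 + w_fiber**2
        return (2.0 * w_spot * w_fiber / w2)**2 * np.exp(-2.0 * offset**2 / w2)

    # Hypothetical numbers: 5 um focused spot, 5.2 um fiber mode-field radius,
    # vibration sweeping the spot 0..6 um off the fiber core.
    offsets = np.linspace(0.0, 6e-6, 7)
    for d, eta in zip(offsets, gaussian_coupling_efficiency(5e-6, 5.2e-6, offsets)):
        print(f"offset = {d*1e6:3.1f} um  ->  coupling efficiency = {eta:.3f}")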
Monje-Agudo, Patricia; Borrego-Izquierdo, Yolanda; Robustillo-Cortés, Ma de Las Aguas; Jiménez-Galán, Rocio; Almeida-González, Carmen V; Morillo-Verdugo, Ramón A
2015-05-01
To design and validate a questionnaire to assess satisfaction with the pharmaceutical care (PC) received at the hospital pharmacy. Multicentric study in five Andalusian hospitals in January 2013. A bibliographic search was performed in PubMed with the MeSH terms pharmaceutical services, patient satisfaction, and questionnaire. The questionnaire was then produced using Delphi methodology, with ten items and the following variables: demographic, social, pharmacological, and clinical; patients were asked about the consequences of PC for their treatment and illness and about their acceptance of the service received. Patients could answer from one (very insufficient) to five (excellent). Before the questionnaire validation phase, a pilot phase was carried out. Descriptive analysis, Cronbach's alpha coefficient, and the intraclass correlation coefficient (ICC) were computed in both phases. Data analysis was conducted using the SPSS statistical software package, release 20.0. The pilot phase included 21 questionnaires and the validation phase 154 (response rate of 100%). In the latter phase, 62% (N=96) of patients were men. More than 50% of patients answered "excellent" in all items of the questionnaire in both phases. Cronbach's alpha coefficient and the ICC were 0.921 and 0.915 (95% CI: 0.847-0.961) in the pilot phase and 0.916 and 0.910 (95% CI: 0.886-0.931) in the validation phase, respectively. A highly reliable instrument was designed and validated to evaluate patient satisfaction with the PC received at the hospital pharmacy. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.
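For reference, Cronbach's alpha for a k-item questionnaire is a short calculation; the sketch below is generic Python with made-up Likert responses, not the study's SPSS output.

    import numpy as np

    def cronbach_alpha(scores):
        # scores: (n_respondents, k_items) array of item responses (e.g. 1..5)
        scores = np.asarray(scores, dtype=float)
        k = scores.shape[1]
        item_vars = scores.var(axis=0, ddof=1)        # variance of each item
        total_var = scores.sum(axis=1).var(ddof=1)    # variance of the total score
        return (k / (k - 1.0)) * (1.0 - item_vars.sum() / total_var)

    # Hypothetical 1-5 responses from 6 patients to a 10-item questionnaire.
    rng = np.random.default_rng(0)
    demo = rng.integers(1, 6, size=(6, 10))
    print(f"Cronbach's alpha = {cronbach_alpha(demo):.3f}")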
Elements of strategic capability for software outsourcing enterprises based on the resource
NASA Astrophysics Data System (ADS)
Shi, Wengeng
2011-10-01
Software outsourcing enterprises are emerging high-tech enterprises whose speed of growth and numbers have been remarkable. Beyond the preferential policies China offers to software outsourcing, the software outsourcing business must be able to upgrade its capabilities, and in general software companies have not exhibited the related characteristics. Viewed from the resource-based theory of the firm, we analyze whether software outsourcing companies possess capabilities and resources that are rare, valuable, and difficult to imitate, and on this basis we attempt to give an initial framework for theoretical analysis.
Signal processing and general purpose data acquisition system for on-line tomographic measurements
NASA Astrophysics Data System (ADS)
Murari, A.; Martin, P.; Hemming, O.; Manduchi, G.; Marrelli, L.; Taliercio, C.; Hoffmann, A.
1997-01-01
New analog signal conditioning electronics and data acquisition systems have been developed for the soft x-ray and bolometric tomography diagnostics in the reversed field pinch experiment (RFX). For the soft x-ray detectors the analog signal processing includes a fully differential current-to-voltage conversion, with up to a 200 kHz bandwidth. For the bolometers, a 50 kHz carrier frequency amplifier allows a maximum bandwidth of 10 kHz. In both cases the analog signals are digitized with a 1 MHz sampling rate close to the diagnostic and are transmitted via a transparent asynchronous xmitter/receiver interface (TAXI) link to purpose-built Versa Module Europa (VME) modules which perform data acquisition. A software library has been developed for data preprocessing and tomographic reconstruction. It has been written in the C language and is self-contained, i.e., no additional mathematical library is required. The package is therefore platform-free: in particular it can perform online analysis in a real-time application, such as continuous display and feedback, and is portable for long-duration fusion or other physical experiments. Due to the modular organization of the library, new preprocessing and analysis modules can be easily integrated in the environment. This software is implemented in RFX over three different platforms: OpenVMS, Digital Unix, and a VME 68040 CPU.
The Comparison of VLBI Data Analysis Using Software Globl and Globk
NASA Astrophysics Data System (ADS)
Guangli, W.; Xiaoya, W.; Jinling, L.; Wenyao, Z.
The comparison of different geodetic data analysis software packages is a frequently discussed topic. In this paper we try to find out the differences between the software packages GLOBL and GLOBK when they are used to process the same set of VLBI data. GLOBL is a software package developed by the VLBI team of the geodesy branch at GSFC/NASA to process geodetic VLBI data using an arc-parameter-elimination algorithm, while GLOBK, which uses a Kalman filtering algorithm, is mainly used in GPS data analysis but is also applied to VLBI data analysis. Our work focuses on whether there are significant differences when the two packages are used to analyze the same VLBI data set, and investigates the reasons for any differences found.
ACES: Space shuttle flight software analysis expert system
NASA Technical Reports Server (NTRS)
Satterwhite, R. Scott
1990-01-01
The Analysis Criteria Evaluation System (ACES) is a knowledge based expert system that automates the final certification of the Space Shuttle onboard flight software. Guidance, navigation and control of the Space Shuttle through all its flight phases are accomplished by a complex onboard flight software system. This software is reconfigured for each flight to allow thousands of mission-specific parameters to be introduced and must therefore be thoroughly certified prior to each flight. This certification is performed in ground simulations by executing the software in the flight computers. Flight trajectories from liftoff to landing, including abort scenarios, are simulated and the results are stored for analysis. The current methodology of performing this analysis is repetitive and requires many man-hours. The ultimate goals of ACES are to capture the knowledge of the current experts and improve the quality and reduce the manpower required to certify the Space Shuttle onboard flight software.
The Effects of Development Team Skill on Software Product Quality
NASA Technical Reports Server (NTRS)
Beaver, Justin M.; Schiavone, Guy A.
2006-01-01
This paper provides an analysis of the effect of the skill/experience of the software development team on the quality of the final software product. A method for the assessment of software development team skill and experience is proposed, derived from a workforce management tool currently in use by the National Aeronautics and Space Administration. Using data from 26 small-scale software development projects, the team skill measures are correlated to 5 software product quality metrics from the ISO/IEC 9126 Software Engineering Product Quality standard. In the analysis of the results, development team skill is found to be a significant factor in the adequacy of the design and implementation. In addition, the results imply that inexperienced software developers are tasked with responsibilities ill-suited to their skill level, and thus have a significant adverse effect on the quality of the software product. Keywords: software quality, development skill, software metrics
Development of a near-infrared spectroscopy instrument for applications in urology.
Macnab, Andrew J; Stothers, Lynn
2008-10-01
Near infrared spectroscopy (NIRS) is an established technology that uses photons of light in the near infrared spectrum to monitor changes in naturally occurring chromophores in tissue, including oxygenated and deoxygenated hemoglobin. The technology and methodology have been validated for measurement of a range of physiologic parameters, and NIRS has been applied successfully in urology research; however, current instruments are designed principally for brain and muscle studies. To describe the development of a NIRS instrument specifically designed for monitoring changes in chromophore concentration in the bladder detrusor in real time, to facilitate research to establish the role of this non-invasive technology in the evaluation of patients with voiding dysfunction. The portable continuous-wave NIRS instrument has a three-laser-diode light source (785, 808 and 830 nanometers), fiber optic cables for light transmission, a self-adhesive patient interface patch with an emitter and sensor, and software to detect the difference between the light transmitted and received by the instrument. The incorporated software auto-attenuates the optical signals and converts raw optical data into chromophore concentrations, displayed graphically. The prototype was designed, tested, and iteratively developed to achieve optimal suprapubic transcutaneous monitoring of the detrusor in human subjects during bladder filling and emptying. Evaluation with simultaneous invasive urodynamic measurement in men and women indicates good specificity and sensitivity of NIRS chromophore concentration changes by receiver operating characteristic analysis, and correlation between NIRS data and urodynamic pressures. Urological monitoring with this NIRS instrument is feasible and generates data of potential diagnostic value.
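The conversion from raw optical data to chromophore concentrations is not specified in the abstract. A common approach for continuous-wave NIRS is the modified Beer-Lambert law solved as a small least-squares problem across the three wavelengths; the sketch below assumes that approach, and the extinction coefficients, source-detector separation, and differential path-length factor are placeholders, not the instrument's calibration values.

    import numpy as np

    # Rows: wavelengths (785, 808, 830 nm); columns: chromophores (HbO2, HHb).
    # Placeholder extinction coefficients for illustration only.
    EPSILON = np.array([[0.70, 1.10],
                        [0.85, 0.90],
                        [1.00, 0.75]])

    def delta_concentrations(delta_od, source_detector_cm=3.0, dpf=4.0):
        """Modified Beer-Lambert law: dOD(lambda) = eps(lambda, i) * dC_i * d * DPF.
        Returns least-squares estimates of the concentration changes (dHbO2, dHHb)."""
        path = source_detector_cm * dpf
        dc, *_ = np.linalg.lstsq(EPSILON * path, np.asarray(delta_od), rcond=None)
        return dc

    # Hypothetical optical-density changes at the three wavelengths.
    print(delta_concentrations([0.010, 0.008, 0.005]))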
SAO mission support software and data standards, version 1.0
NASA Technical Reports Server (NTRS)
Hsieh, P.
1993-01-01
This document defines the software developed by the SAO AXAF Mission Support (MS) Program and defines standards for the software development process and control of data products generated by the software. The SAO MS is tasked to develop and use software to perform a variety of functions in support of the AXAF mission. Software is developed by software engineers and scientists, and commercial off-the-shelf (COTS) software is used either directly or customized through the use of scripts to implement analysis procedures. Software controls real-time laboratory instruments, performs data archiving, displays data, and generates model predictions. Much software is used in the analysis of data to generate data products that are required by the AXAF project, for example, on-orbit mirror performance predictions or detailed characterization of the mirror reflection performance with energy.
Long-term Preservation of Data Analysis Capabilities
NASA Astrophysics Data System (ADS)
Gabriel, C.; Arviset, C.; Ibarra, A.; Pollock, A.
2015-09-01
While the long-term preservation of scientific data obtained by large astrophysics missions is ensured through science archives, the issue of data analysis software preservation has hardly been addressed. Efforts by large data centres have contributed so far to maintain some instrument or mission-specific data reduction packages on top of high-level general purpose data analysis software. However, it is always difficult to keep software alive without support and maintenance once the active phase of a mission is over. This is especially difficult in the budgetary model followed by space agencies. We discuss the importance of extending the lifetime of dedicated data analysis packages and review diverse strategies under development at ESA using new paradigms such as Virtual Machines, Cloud Computing, and Software as a Service for making possible full availability of data analysis and calibration software for decades at minimal cost.
Teaching meta-analysis using MetaLight.
Thomas, James; Graziosi, Sergio; Higgins, Steve; Coe, Robert; Torgerson, Carole; Newman, Mark
2012-10-18
Meta-analysis is a statistical method for combining the results of primary studies. It is often used in systematic reviews and is increasingly a method and topic that appears in student dissertations. MetaLight is a freely available software application that runs simple meta-analyses and contains specific functionality to facilitate the teaching and learning of meta-analysis. While there are many courses and resources for meta-analysis available and numerous software applications to run meta-analyses, there are few pieces of software which are aimed specifically at helping those teaching and learning meta-analysis. Valuable teaching time can be spent learning the mechanics of a new software application, rather than on the principles and practices of meta-analysis. We discuss ways in which the MetaLight tool can be used to present some of the main issues involved in undertaking and interpreting a meta-analysis. While there are many software tools available for conducting meta-analysis, in the context of a teaching programme such software can require expenditure both in terms of money and in terms of the time it takes to learn how to use it. MetaLight was developed specifically as a tool to facilitate the teaching and learning of meta-analysis and we have presented here some of the ways it might be used in a training situation.
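As a minimal illustration of the arithmetic such teaching software walks students through, the sketch below pools hypothetical study effect sizes with an inverse-variance fixed-effect model; it is not MetaLight code, and the effect sizes are invented.

    import math

    def fixed_effect_pool(effects, std_errors):
        """Inverse-variance fixed-effect meta-analysis.
        Returns the pooled effect, its standard error, and a 95% confidence interval."""
        weights = [1.0 / se**2 for se in std_errors]
        pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
        pooled_se = math.sqrt(1.0 / sum(weights))
        ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
        return pooled, pooled_se, ci

    # Hypothetical standardized mean differences from three primary studies.
    effects = [0.30, 0.45, 0.15]
    std_errors = [0.10, 0.15, 0.12]
    print(fixed_effect_pool(effects, std_errors))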
Spacecraft Trajectory Analysis and Mission Planning Simulation (STAMPS) Software
NASA Technical Reports Server (NTRS)
Puckett, Nancy; Pettinger, Kris; Hallstrom, John; Brownfield, Dana; Blinn, Eric; Williams, Frank; Wiuff, Kelli; McCarty, Steve; Ramirez, Daniel; Lamotte, Nicole;
2014-01-01
STAMPS simulates either three- or six-degree-of-freedom cases for all spacecraft flight phases using translated HAL flight software or generic GN&C models. Single or multiple trajectories can be simulated for use in optimization and dispersion analysis. It includes math models for the vehicle and environment, and currently features a "C" version of shuttle onboard flight software. The STAMPS software is used for mission planning and analysis within ascent/descent, rendezvous, proximity operations, and navigation flight design areas.
NASA Technical Reports Server (NTRS)
Moran, Susanne I.
2004-01-01
The On-Orbit Software Analysis Research Infusion Project was done by Intrinsyx Technologies Corporation (Intrinsyx) at the National Aeronautics and Space Administration (NASA) Ames Research Center (ARC). The Project was a joint collaborative effort between NASA Codes IC and SL, Kestrel Technology (Kestrel), and Intrinsyx. The primary objectives of the Project were: Discovery and verification of software program properties and dependencies, Detection and isolation of software defects across different versions of software, and Compilation of historical data and technical expertise for future applications
An overview of the mathematical and statistical analysis component of RICIS
NASA Technical Reports Server (NTRS)
Hallum, Cecil R.
1987-01-01
Mathematical and statistical analysis components of RICIS (Research Institute for Computing and Information Systems) can be used in the following problem areas: (1) quantification and measurement of software reliability; (2) assessment of changes in software reliability over time (reliability growth); (3) analysis of software-failure data; and (4) decision logic for whether to continue or stop testing software. Other areas of interest to NASA/JSC where mathematical and statistical analysis can be successfully employed include: math modeling of physical systems, simulation, statistical data reduction, evaluation methods, optimization, algorithm development, and mathematical methods in signal processing.
NASA Technical Reports Server (NTRS)
Kizhner, Semion; Day, John H. (Technical Monitor)
2000-01-01
Post-processing of data related to a Global Positioning System (GPS) simulation is an important activity in the qualification of a GPS receiver for space flight. Because a GPS simulator is a critical resource, it is desirable to move the pertinent simulation data off the simulator as soon as a test is completed. The simulator data files are usually moved to a Personal Computer (PC), where post-processing of the receiver-logged measurements and solutions data and the simulated data is performed. Typically, post-processing is accomplished using PC-based commercial software languages and tools. Because of the generality of commercial software systems, their general-purpose functions are notoriously slow and more often than not are the bottleneck, even for short-duration experiments. For example, it may take 8 hours to post-process data from a 6-hour simulation. There is a need to do post-processing faster, especially in order to use the previous test results as feedback for the next simulation setup. This paper demonstrates that a fast software linear interpolation algorithm is applicable to a large class of engineering problems, like GPS simulation data post-processing, where computational time is a critical resource and one of the most important considerations. An approach is developed that speeds up post-processing by an order of magnitude. It is based on improving the bottleneck interpolation algorithm using a priori information that is specific to the GPS simulation application. The presented post-processing scheme was used in support of several successful space flight missions carrying GPS receivers. A future approach to solving the post-processing performance problem using Field Programmable Gate Array (FPGA) technology is described.
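The paper's specific algorithm is not given in the abstract. The sketch below illustrates the general idea of exploiting a priori structure: if the simulator samples are known to lie on a fixed-rate time grid, the per-query search performed by a generic interpolator can be replaced by direct index arithmetic. The sample rates and data are invented.

    import numpy as np

    def interp_uniform(t_query, t0, dt, y):
        """Linear interpolation assuming samples y[i] were logged at t0 + i*dt.
        Avoids the per-point bisection search of a general-purpose interpolator."""
        idx = (np.asarray(t_query) - t0) / dt
        i = np.clip(np.floor(idx).astype(int), 0, len(y) - 2)
        frac = idx - i
        return (1.0 - frac) * y[i] + frac * y[i + 1]

    # Hypothetical 1 Hz simulator truth data resampled at 10 Hz receiver epochs.
    t0, dt = 0.0, 1.0
    y = np.sin(0.01 * np.arange(21600))        # 6 hours of 1 Hz samples
    epochs = np.arange(0.0, 21599.0, 0.1)      # receiver output epochs
    print(interp_uniform(epochs, t0, dt, y)[:5])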
NASA Technical Reports Server (NTRS)
Dunn, William R.; Corliss, Lloyd D.
1991-01-01
Paper examines issue of software safety. Presents four case histories of software-safety analysis. Concludes that, to be safe, software, for all practical purposes, must be free of errors. Backup systems still needed to prevent catastrophic software failures.
Dickerson, Jane A; Schmeling, Michael; Hoofnagle, Andrew N; Hoffman, Noah G
2013-01-16
Mass spectrometry provides a powerful platform for performing quantitative, multiplexed assays in the clinical laboratory, but at the cost of increased complexity of analysis and quality assurance calculations compared to other methodologies. Here we describe the design and implementation of a software application that performs quality control calculations for a complex, multiplexed, mass spectrometric analysis of opioids and opioid metabolites. The development and implementation of this application improved our data analysis and quality assurance processes in several ways. First, use of the software significantly improved the procedural consistency for performing quality control calculations. Second, it reduced the amount of time technologists spent preparing and reviewing the data, saving on average over four hours per run, and in some cases improving turnaround time by a day. Third, it provides a mechanism for coupling procedural and software changes with the results of each analysis. We describe several key details of the implementation including the use of version control software and automated unit tests. These generally useful software engineering principles should be considered for any software development project in the clinical lab. Copyright © 2012 Elsevier B.V. All rights reserved.
Stress analysis in oral obturator prostheses: imaging photoelastic
NASA Astrophysics Data System (ADS)
Pesqueira, Aldiéris Alves; Goiato, Marcelo Coelho; dos Santos, Daniela Micheline; Haddad, Marcela Filié; Andreotti, Agda Marobo; Moreno, Amália
2013-06-01
Maxillary defects resulting from cancer, trauma, and congenital malformation affect the chewing efficiency and retention of dentures in these patients. The use of implant-retained palatal obturator dentures has improved the self-esteem and quality of life of several subjects. We evaluate the stress distribution of implant-retained palatal obturator dentures with different attachment systems by using photoelastic analysis images. Two photoelastic models of the maxilla with oral-sinus-nasal communication were fabricated. One model received three implants on the left side of the alveolar ridge (incisive, canine, and first molar regions) and the other did not receive implants. Afterwards, a conventional palatal obturator denture (control) and two implant-retained palatal obturator dentures with different attachment systems (O-ring; bar-clip) were constructed. Models were placed in a circular polariscope and a 100-N axial load was applied in three different regions (incisive, canine, and first molar regions) by using a universal testing machine. The results were photographed and analyzed qualitatively using software (Adobe Photoshop). The bar-clip system exhibited the highest stress concentration, followed by the O-ring system and the conventional denture (control). Images generated by the photoelastic method help in oral rehabilitation planning.
NASA Technical Reports Server (NTRS)
Uber, James G.
1988-01-01
Software itself is not hazardous, but since software and hardware share common interfaces there is an opportunity for software to create hazards. Further, these software systems are complex, and proven methods for the design, analysis, and measurement of software safety are not yet available. Some past software failures, future NASA software trends, software engineering methods, and tools and techniques for various software safety analyses are reviewed. Recommendations to NASA are made based on this review.
NASA Technical Reports Server (NTRS)
Tamayo, Tak Chai
1987-01-01
Software quality is not only vital to the successful operation of the space station; it is also an important factor in establishing testing requirements, the time needed for software verification and integration, and launch schedules for the space station. Defense of management decisions can be greatly strengthened by combining engineering judgments with statistical analysis. Unlike hardware, software has the characteristics of no wearout and costly redundancy, making traditional statistical analysis unsuitable for evaluating software reliability. A statistical model was developed to represent the number and types of failures that occur during software testing and verification. From this model, quantitative measures of software reliability based on failure history during testing are derived. Criteria for terminating testing based on reliability objectives, and methods to estimate the expected number of fixes required, are also presented.
Analysis of a hardware and software fault tolerant processor for critical applications
NASA Technical Reports Server (NTRS)
Dugan, Joanne B.
1993-01-01
Computer systems for critical applications must be designed to tolerate software faults as well as hardware faults. A unified approach to tolerating hardware and software faults is characterized by classifying faults in terms of duration (transient or permanent) rather than source (hardware or software). Errors arising from transient faults can be handled through masking or voting, but errors arising from permanent faults require system reconfiguration to bypass the failed component. Most errors which are caused by software faults can be considered transient, in that they are input-dependent. Software faults are triggered by a particular set of inputs. Quantitative dependability analysis of systems which exhibit a unified approach to fault tolerance can be performed by a hierarchical combination of fault tree and Markov models. A methodology for analyzing hardware and software fault tolerant systems is applied to the analysis of a hypothetical system, loosely based on the Fault Tolerant Parallel Processor. The models consider both transient and permanent faults, hardware and software faults, independent and related software faults, automatic recovery, and reconfiguration.
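As a toy illustration of the Markov portion of such a hierarchical model (not the FTPP model itself), the sketch below solves a small continuous-time Markov chain in which a redundant processor is reconfigured after a covered permanent fault and fails after an uncovered fault or a second fault; all rates and the coverage value are invented for the example.

    import numpy as np
    from scipy.linalg import expm

    # Hypothetical rates (per hour), for illustration only.
    lam_perm = 1e-4      # permanent fault rate of the full configuration
    coverage = 0.99      # probability a permanent fault is detected and reconfigured
    lam_second = 1e-4    # failure rate of the remaining unit after reconfiguration

    # States: 0 = full configuration, 1 = degraded (reconfigured), 2 = failed (absorbing).
    Q = np.array([
        [-lam_perm,  coverage * lam_perm, (1 - coverage) * lam_perm],
        [0.0,        -lam_second,         lam_second],
        [0.0,         0.0,                0.0],
    ])

    p0 = np.array([1.0, 0.0, 0.0])
    for t in (10.0, 100.0, 1000.0):       # mission times in hours
        p_t = p0 @ expm(Q * t)
        print(f"t = {t:6.0f} h  P(system failure) = {p_t[2]:.3e}")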
Software Defined GPS Receiver for International Space Station
NASA Technical Reports Server (NTRS)
Duncan, Courtney B.; Robison, David E.; Koelewyn, Cynthia Lee
2011-01-01
JPL is providing a software defined radio (SDR) that will fly on the International Space Station (ISS) as part of the CoNNeCT project under NASA's SCaN program. The SDR consists of several modules including a Baseband Processor Module (BPM) and a GPS Module (GPSM). The BPM executes applications (waveforms) consisting of software components for the embedded SPARC processor and logic for two Virtex II Field Programmable Gate Arrays (FPGAs) that operate on data received from the GPSM. GPS waveforms on the SDR are enabled by an L-Band antenna, low noise amplifier (LNA), and the GPSM that performs quadrature downconversion at L1, L2, and L5. The GPS waveform for the JPL SDR will acquire and track L1 C/A, L2C, and L5 GPS signals from a CoNNeCT platform on ISS, providing the best GPS-based positioning of ISS achieved to date, the first use of multiple frequency GPS on ISS, and potentially the first L5 signal tracking from space. The system will also enable various radiometric investigations on ISS such as local multipath or ISS dynamic behavior characterization. In following the software-defined model, this work will create a highly portable GPS software and firmware package that can be adapted to another platform with the necessary processor and FPGA capability. This paper also describes ISS applications for the JPL CoNNeCT SDR GPS waveform, possibilities for future global navigation satellite system (GNSS) tracking development, and the applicability of the waveform components to other space navigation applications.
Kudo, Kohsuke; Uwano, Ikuko; Hirai, Toshinori; Murakami, Ryuji; Nakamura, Hideo; Fujima, Noriyuki; Yamashita, Fumio; Goodwin, Jonathan; Higuchi, Satomi; Sasaki, Makoto
2017-04-10
The purpose of the present study was to compare different software algorithms for processing DSC perfusion images of cerebral tumors with respect to i) the relative CBV (rCBV) calculated, ii) the cutoff value for discriminating low- and high-grade gliomas, and iii) the diagnostic performance for differentiating these tumors. Following institutional review board approval, informed consent was obtained from all patients. Thirty-five patients with primary glioma (grade II, 9; grade III, 8; and grade IV, 18 patients) were included. DSC perfusion imaging was performed with a 3-Tesla MRI scanner. CBV maps were generated by using 11 different algorithms from four commercially available software packages and one academic program. The rCBV of each tumor relative to normal white matter was calculated from ROI measurements. Differences in rCBV value were compared between algorithms for each tumor grade. Receiver operating characteristic (ROC) analysis was conducted to evaluate the diagnostic performance of the different algorithms for differentiating between grades. Several algorithms showed significant differences in rCBV, especially for grade IV tumors. When differentiating between low- (II) and high-grade (III/IV) tumors, the area under the ROC curve (Az) was similar (range 0.85-0.87), and there were no significant differences in Az between any pair of algorithms. In contrast, the optimal cutoff values varied between algorithms (range 4.18-6.53). rCBV values of tumors and cutoff values for discriminating low- and high-grade gliomas differed between software packages, suggesting that optimal software-specific cutoff values should be used for diagnosis of high-grade gliomas.
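The abstract reports algorithm-specific optimal cutoffs without stating the selection rule. One common choice (not necessarily the one used in the study) is the threshold maximizing the Youden index, sensitivity + specificity - 1; the sketch below applies it to invented rCBV values and grade labels.

    import numpy as np

    def youden_cutoff(values, labels):
        """Return the threshold on `values` (e.g. rCBV) maximizing
        sensitivity + specificity - 1 for binary `labels` (1 = high grade)."""
        values = np.asarray(values, dtype=float)
        labels = np.asarray(labels, dtype=int)
        best_cut, best_j = None, -np.inf
        for cut in np.unique(values):
            pred = values >= cut
            sens = np.mean(pred[labels == 1])      # true-positive rate
            spec = np.mean(~pred[labels == 0])     # true-negative rate
            if sens + spec - 1.0 > best_j:
                best_cut, best_j = cut, sens + spec - 1.0
        return best_cut, best_j

    # Invented rCBV values: first 9 are grade II (label 0), the rest grade III/IV (label 1).
    rcbv = [2.1, 3.0, 2.8, 3.9, 4.5, 3.2, 2.5, 4.0, 3.6,
            5.2, 6.8, 4.9, 7.5, 6.1, 5.8, 8.3, 4.7, 6.6]
    labels = [0] * 9 + [1] * 9
    print(youden_cutoff(rcbv, labels))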
Ultrasonic image analysis and image-guided interventions.
Noble, J Alison; Navab, Nassir; Becher, H
2011-08-06
The fields of medical image analysis and computer-aided interventions deal with reducing the large volume of digital images (X-ray, computed tomography, magnetic resonance imaging (MRI), positron emission tomography and ultrasound (US)) to more meaningful clinical information using software algorithms. US is a core imaging modality employed in these areas, both in its own right and used in conjunction with the other imaging modalities. It is receiving increased interest owing to the recent introduction of three-dimensional US, significant improvements in US image quality, and better understanding of how to design algorithms which exploit the unique strengths and properties of this real-time imaging modality. This article reviews the current state of art in US image analysis and its application in image-guided interventions. The article concludes by giving a perspective from clinical cardiology which is one of the most advanced areas of clinical application of US image analysis and describing some probable future trends in this important area of ultrasonic imaging research.
ERIC Educational Resources Information Center
Van Swol, Lyn M.; Braun, Michael T.; Malhotra, Deepak
2012-01-01
The study used Linguistic Inquiry and Word Count and Coh-Metrix software to examine linguistic differences with deception in an ultimatum game. In the game, the Allocator was given an amount of money to divide with the Receiver. The Receiver did not know the precise amount the Allocator had to divide, and the Allocator could use deception.…
TriG: Next Generation Scalable Spaceborne GNSS Receiver
NASA Technical Reports Server (NTRS)
Tien, Jeffrey Y.; Okihiro, Brian Bachman; Esterhuizen, Stephan X.; Franklin, Garth W.; Meehan, Thomas K.; Munson, Timothy N.; Robison, David E.; Turbiner, Dmitry; Young, Lawrence E.
2012-01-01
TriG is the next-generation NASA scalable space GNSS science receiver. It will track all GNSS and additional signals (i.e., GPS, GLONASS, Galileo, Compass, and DORIS). It has a scalable 3U architecture and is fully software- and firmware-reconfigurable, enabling optimization to meet specific mission requirements. The TriG GNSS EM is currently undergoing testing and is expected to complete full performance testing later this year.
Information Systems Security Products and Services Catalogue.
1992-01-01
pricing information on the Motorola Portable DES Receiver Station and Portable DES Base Station, contact Motorola. The PX-300-S ranges in cost from...C2 Paul Smith (612) 482-2776 Tom Latterner (301) 220-3400 Jeffrey S. Bell (215) 986-6864 John Haggard (312) 714-7604 4-2d.2 GENERAL-PURPOSE...primary software security mechanism of the SCOMP system is the security kernel, based on the Center-approved Bell-LaPadula model of the software portion
State-of-the-Art Resources (SOAR) for Software Vulnerability Detection, Test, and Evaluation
2014-07-01
preclude in-depth analysis, and widespread use of a Software-as-a-Service (SaaS) model that limits data availability and application to DoD systems...provide mobile application analysis using a Software-as-a-Service (SaaS) model. In this case, any software to be analyzed must be sent to the...tools are only available through a SaaS model. The widespread use of a Software-as-a-Service (SaaS) model as a sole evaluation model limits data
A Method for Populating the Knowledge Base of AFIT’s Domain-Oriented Application Composition System
1993-12-01
Analysis (FODA). The approach identifies prominent features (similarities) and distinctive features (differences) of software systems within an...analysis approaches we have summarized, the researchers described FODA in sufficient detail to use on large domain analysis projects (ones with...Software Technology Center, July 1991. 18. Kang, Kyo C. and others. Feature-Oriented Domain Analysis (FODA) Feasibility Study. Technical Report, Software
Selection of software for mechanical engineering undergraduates
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheah, C. T.; Yin, C. S.; Halim, T.
A major problem with the undergraduate mechanical engineering course is the limited exposure of students to software packages, coupled with the long learning curve of the existing software packages. This work proposes the use of appropriate software packages across the entire mechanical engineering curriculum to ensure students get sufficient exposure to real-life design problems. A variety of software packages are highlighted as being suitable for undergraduate work in mechanical engineering, e.g. solvers for simultaneous non-linear equations; uncertainty analysis; 3-D modeling software with FEA; and analysis tools for the solution of problems in thermodynamics, fluid mechanics, mechanical system design, and solid mechanics.
A Case Study of Measuring Process Risk for Early Insights into Software Safety
NASA Technical Reports Server (NTRS)
Layman, Lucas; Basili, Victor; Zelkowitz, Marvin V.; Fisher, Karen L.
2011-01-01
In this case study, we examine software safety risk in three flight hardware systems in NASA's Constellation spaceflight program. We applied our Technical and Process Risk Measurement (TPRM) methodology to the Constellation hazard analysis process to quantify the technical and process risks involving software safety in the early design phase of these projects. We analyzed 154 hazard reports and collected metrics to measure the prevalence of software in hazards and the specificity of descriptions of software causes of hazardous conditions. We found that 49-70% of 154 hazardous conditions could be caused by software or software was involved in the prevention of the hazardous condition. We also found that 12-17% of the 2013 hazard causes involved software, and that 23-29% of all causes had a software control. The application of the TPRM methodology identified process risks in the application of the hazard analysis process itself that may lead to software safety risk.
Company profile: Complete Genomics Inc.
Reid, Clifford
2011-02-01
Complete Genomics Inc. is a life sciences company that focuses on complete human genome sequencing. It is taking a completely different approach to DNA sequencing than other companies in the industry. Rather than building a general-purpose platform for sequencing all organisms and all applications, it has focused on a single application - complete human genome sequencing. The company's Complete Genomics Analysis Platform (CGA™ Platform) comprises an integrated package of biochemistry, instrumentation and software that sequences human genomes at the highest quality, lowest cost and largest scale available. Complete Genomics offers a turnkey service that enables customers to outsource their human genome sequencing to the company's genome sequencing center in Mountain View, CA, USA. Customers send in their DNA samples, the company does all the library preparation, DNA sequencing, assembly and variant analysis, and customers receive research-ready data that they can use for biological discovery.
Determination of Earth orientation using the Global Positioning System
NASA Technical Reports Server (NTRS)
Freedman, A. P.
1989-01-01
Modern spacecraft tracking and navigation require highly accurate Earth-orientation parameters. For near-real-time applications, errors in these quantities and their extrapolated values are a significant error source. A globally distributed network of high-precision receivers observing the full Global Positioning System (GPS) configuration of 18 or more satellites may be an efficient and economical method for the rapid determination of short-term variations in Earth orientation. A covariance analysis using the JPL Orbit Analysis and Simulation Software (OASIS) was performed to evaluate the errors associated with GPS measurements of Earth orientation. These GPS measurements appear to be highly competitive with those from other techniques and can potentially yield frequent and reliable centimeter-level Earth-orientation information while simultaneously allowing the oversubscribed Deep Space Network (DSN) antennas to be used more for direct project support.
Knowledge and utilization of computer-software for statistics among Nigerian dentists.
Chukwuneke, F N; Anyanechi, C E; Obiakor, A O; Amobi, O; Onyejiaka, N; Alamba, I
2013-01-01
The use of computer software for statistical analysis has transformed health information and data to their simplest form in the areas of access, storage, retrieval, and analysis in the field of research. This survey was therefore carried out to assess the level of knowledge and utilization of computer software for statistical analysis among dental researchers in eastern Nigeria. Questionnaires on the use of computer software for statistical analysis were randomly distributed to 65 practicing dental surgeons with more than 5 years of experience in the tertiary academic hospitals in eastern Nigeria. The focus was on: years of clinical experience; research work experience; and knowledge and application of computer software for data processing and statistical analysis. Sixty-two (62/65; 95.4%) of these questionnaires were returned anonymously and were used in our data analysis. Twenty-nine (29/62; 46.8%) respondents had 5-10 years of clinical experience, of whom none had completed the specialist training programme. Practitioners with more than 10 years of clinical experience numbered 33 (33/62; 53.2%), of whom 15 (15/33; 45.5%) are specialists, representing 24.2% (15/62) of the total number of respondents. All 15 specialists are actively involved in research activities, and only five (5/15; 33.3%) can use statistical analysis software unaided. This study has identified poor utilization of computer software for statistical analysis among dental researchers in eastern Nigeria, strongly associated with a lack of exposure to the use of such software early enough, especially during undergraduate training. This calls for the introduction of a computer training programme in the dental curriculum to enable practitioners to develop the habit of using computer software for their research.
Appel, R D; Palagi, P M; Walther, D; Vargas, J R; Sanchez, J C; Ravier, F; Pasquali, C; Hochstrasser, D F
1997-12-01
Although two-dimensional electrophoresis (2-DE) computer analysis software packages have existed ever since 2-DE technology was developed, it is only now that the hardware and software technology allows large-scale studies to be performed on low-cost personal computers or workstations, and that setting up a 2-DE computer analysis system in a small laboratory is no longer considered a luxury. After a first attempt in the seventies and early eighties to develop 2-DE analysis software systems on hardware that had poor or even no graphical capabilities, followed in the late eighties by a wave of innovative software developments that were possible thanks to new graphical interface standards such as XWindows, a third generation of 2-DE analysis software packages has now come to maturity. It can be run on a variety of low-cost, general-purpose personal computers, thus making the purchase of a 2-DE analysis system easily attainable for even the smallest laboratory that is involved in proteome research. Melanie II 2-D PAGE, developed at the University Hospital of Geneva, is such a third-generation software system for 2-DE analysis. Based on unique image processing algorithms, this user-friendly object-oriented software package runs on multiple platforms, including Unix, MS-Windows 95 and NT, and Power Macintosh. It provides efficient spot detection and quantitation, state-of-the-art image comparison, statistical data analysis facilities, and is Internet-ready. Linked to proteome databases such as those available on the World Wide Web, it represents a valuable tool for the "Virtual Lab" of the post-genome area.
FunRich proteomics software analysis, let the fun begin!
Benito-Martin, Alberto; Peinado, Héctor
2015-08-01
Protein MS analysis is the preferred method for unbiased protein identification. It is normally applied to a large number of both small-scale and high-throughput studies. However, user-friendly computational tools for protein analysis are still needed. In this issue, Mathivanan and colleagues (Proteomics 2015, 15, 2597-2601) report the development of FunRich software, an open-access software that facilitates the analysis of proteomics data, providing tools for functional enrichment and interaction network analysis of genes and proteins. FunRich is a reinterpretation of proteomic software, a standalone tool combining ease of use with customizable databases, free access, and graphical representations. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
A Reference Model for Software and System Inspections. White Paper
NASA Technical Reports Server (NTRS)
He, Lulu; Shull, Forrest
2009-01-01
Software Quality Assurance (SQA) is an important component of the software development process. SQA processes provide assurance that the software products and processes in the project life cycle conform to their specified requirements by planning, enacting, and performing a set of activities that provide adequate confidence that quality is being built into the software. Typical techniques include: (1) Testing (2) Simulation (3) Model checking (4) Symbolic execution (5) Management reviews (6) Technical reviews (7) Inspections (8) Walk-throughs (9) Audits (10) Analysis (complexity analysis, control flow analysis, algorithmic analysis) (11) Formal methods. Our work over the last few years has resulted in substantial knowledge about SQA techniques, especially in the areas of technical reviews and inspections. But can we apply the same QA techniques to the system development process? If yes, what kind of tailoring do we need before applying them in the system engineering context? If not, what types of QA techniques are actually used at the system level? And is there any room for improvement? After a brief examination of the system engineering literature (especially focused on NASA and DoD guidance) we found that: (1) The system and software development processes interact with each other at different phases of the development life cycle. (2) Reviews are emphasized in both system and software development; for some reviews (e.g. SRR, PDR, CDR), there are both system versions and software versions. (3) Analysis techniques are emphasized (e.g. Fault Tree Analysis, Preliminary Hazard Analysis) and some details are given about how to apply them. (4) Reviews are expected to use the outputs of the analysis techniques; in other words, these particular analyses are usually conducted in preparation for (before) reviews. The goal of our work is to explore the interaction between the Quality Assurance (QA) techniques at the system level and the software level.
Standardizing Activation Analysis: New Software for Photon Activation Analysis
NASA Astrophysics Data System (ADS)
Sun, Z. J.; Wells, D.; Segebade, C.; Green, J.
2011-06-01
Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. For the purpose of saving time and manpower and minimizing error, a computer program was designed, built and implemented using SQL, Access 2007 and asp.net technology to automate this process. Based on the peak information of the spectrum and assisted by its PAA library, the program automatically identifies elements in the samples and calculates their concentrations and respective uncertainties. The software can also be operated in browser/server mode, which makes it possible to use it anywhere the internet is accessible. By switching the nuclide library and the related formulas behind it, the new software can be easily extended to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE). Implementation of this would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimum input from the user, the software has proven to be fast, user-friendly and reliable.
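The abstract does not state the concentration formula. A common relative (comparator) approach in activation analysis normalizes the sample's decay-corrected specific count rate to that of a standard of known concentration irradiated and counted under matched conditions; the sketch below assumes that approach, uses invented numbers, and omits flux-monitor, counting-interval, and uncertainty calculations. It is not the program's actual algorithm.

    import math

    def decay_corrected_rate(peak_area, live_time_s, decay_time_s, half_life_s):
        """Count rate corrected back to the end of irradiation (simple exponential
        correction; counting-interval effects are ignored for brevity)."""
        lam = math.log(2.0) / half_life_s
        return (peak_area / live_time_s) * math.exp(lam * decay_time_s)

    def concentration_relative(sample, standard, std_concentration_ppm):
        """Comparator method: concentration scales with the per-gram corrected rate."""
        r_sample = decay_corrected_rate(*sample[:4]) / sample[4]
        r_std = decay_corrected_rate(*standard[:4]) / standard[4]
        return std_concentration_ppm * r_sample / r_std

    # (peak_area, live_time_s, decay_time_s, half_life_s, mass_g) -- invented numbers.
    sample = (15200, 3600, 7200, 9284, 1.20)
    standard = (48100, 3600, 3600, 9284, 0.50)
    print(f"{concentration_relative(sample, standard, 100.0):.1f} ppm")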
Second Generation Product Line Engineering Takes Hold in the DoD
2014-01-01
“Feature-Oriented Domain Analysis (FODA) Feasibility Study” (CMU/SEI-90-TR-021, ADA235785). Pittsburgh, PA: Software Engineering Institute...software product line engineering and software architecture documentation and analysis. Clements is co-author of three practitioner-oriented books about
Design and Construction of a Microcontroller-Based Ventilator Synchronized with Pulse Oximeter.
Gölcük, Adem; Işık, Hakan; Güler, İnan
2016-07-01
This study aims to introduce a novel device in which a mechanical ventilator and a pulse oximeter work in synchronization. A serial communication technique was used to enable communication between the pulse oximeter and the ventilator. The SpO2 value and the pulse rate read from the pulse oximeter were transmitted to the mechanical ventilator through transmitter (Tx) and receiver (Rx) lines. The fuzzy-logic-based software developed for the mechanical ventilator interprets these values and calculates the percentage of oxygen (FiO2) and the Positive End-Expiratory Pressure (PEEP) to be delivered to the patient. The fuzzy-logic-based software was developed to track the changing medical state of the patient and to produce new outputs (FiO2 and PEEP) for each new state. FiO2 and PEEP values delivered from the ventilator to the patient can be calculated in this way without requiring any arterial blood gas analysis. Our experiments and the feedback from physicians show that this device makes it possible to obtain more successful results compared with current practice.
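The rule base itself is not published in the abstract. Purely to illustrate the shape of a fuzzy-logic mapping from SpO2 to an FiO2 setting, the sketch below uses triangular membership functions and a weighted-average (Sugeno-style) defuzzification; the membership breakpoints and output levels are invented, clinically meaningless, and not the authors' controller.

    def tri(x, a, b, c):
        """Triangular membership function peaking at b over the interval [a, c]."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def fio2_from_spo2(spo2):
        """Weighted average over three invented rules (illustration only)."""
        rules = [
            (tri(spo2, 80.0, 85.0, 90.0), 0.60),    # SpO2 "low"      -> high FiO2
            (tri(spo2, 88.0, 93.0, 97.0), 0.40),    # SpO2 "moderate" -> medium FiO2
            (tri(spo2, 95.0, 99.0, 101.0), 0.21),   # SpO2 "normal"   -> near room air
        ]
        num = sum(w * out for w, out in rules)
        den = sum(w for w, _ in rules)
        return num / den if den > 0 else 0.21

    for s in (86, 92, 98):
        print(f"SpO2 = {s}%  ->  suggested FiO2 = {fio2_from_spo2(s):.2f}")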
Rocker: Open source, easy-to-use tool for AUC and enrichment calculations and ROC visualization.
Lätti, Sakari; Niinivehmas, Sanna; Pentikäinen, Olli T
2016-01-01
Receiver operating characteristic (ROC) curves, together with the calculation of the area under the curve (AUC), are a useful tool to evaluate the performance of biomedical and chemoinformatics data. For example, in virtual drug screening ROC curves are very often used to visualize the efficiency of an application at separating active ligands from inactive molecules. Unfortunately, most of the available tools for ROC analysis are implemented in commercially available software packages, or are plugins in statistical software, which are not always the easiest to use. Here, we present Rocker, a simple ROC curve visualization tool that can be used for the generation of publication-quality images. Rocker also includes automatic calculation of the AUC for the ROC curve and of the Boltzmann-enhanced discrimination of ROC (BEDROC). Furthermore, in virtual screening campaigns it is often important to understand the early enrichment of active ligand identification; for this, Rocker offers an automated calculation routine. To enable further development of Rocker, it is freely available (MIT-GPL license) for use and modification from our web site (http://www.jyu.fi/rocker).
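For readers interested in the arithmetic behind such a tool rather than its plotting features, a bare-bones ROC/AUC computation (trapezoidal area under the stepwise curve) looks roughly like the sketch below; the docking scores and activity labels are invented, and this is not Rocker's source code.

    import numpy as np

    def roc_auc(scores, labels):
        """ROC points and trapezoidal AUC. Higher score = predicted active (label 1)."""
        order = np.argsort(-np.asarray(scores))
        labels = np.asarray(labels)[order]
        tps = np.cumsum(labels)                  # true positives above each threshold
        fps = np.cumsum(1 - labels)              # false positives above each threshold
        tpr = np.concatenate(([0.0], tps / tps[-1]))
        fpr = np.concatenate(([0.0], fps / fps[-1]))
        return fpr, tpr, np.trapz(tpr, fpr)

    # Invented docking scores: 1 = active ligand, 0 = inactive decoy.
    scores = [9.1, 8.7, 8.5, 7.9, 7.5, 7.2, 6.8, 6.3, 5.9, 5.1]
    labels = [1,   1,   0,   1,   0,   0,   1,   0,   0,   0]
    fpr, tpr, auc = roc_auc(scores, labels)
    print(f"AUC = {auc:.3f}")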
LLCEDATA and LLCECALC for Windows version 1.0, Volume 1: User's manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
McFadden, J.G.
LLCEDATA and LLCECALC for Windows are user-friendly computer software programs that work together to determine the proper waste designation, handling, and disposition requirements for Long Length Contaminated Equipment (LLCE). LLCEDATA reads from a variety of data bases to produce an equipment data file (EDF) that represents a snapshot of both the LLCE and the tank it originates from. LLCECALC reads the EDF and a gamma assay (AV2) file that is produced by the Flexible Receiver Gamma Energy Analysis System. LLCECALC performs corrections to the AV2 file as it is being read and characterizes the LLCE. Both programs produce a variety of reports, including a characterization report and a status report. The status report documents each action taken by the user, LLCEDATA, and LLCECALC. Documentation for LLCEDATA and LLCECALC for Windows is available in three volumes. Volume 1 is a user's manual, which is intended as a quick reference for both LLCEDATA and LLCECALC. Volume 2 is a technical manual, and Volume 3 is a software verification and validation document.
WE-G-BRA-02: SafetyNet: Automating Radiotherapy QA with An Event Driven Framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hadley, S; Kessler, M; Litzenberg, D
2015-06-15
Purpose: Quality assurance is an essential task in radiotherapy that often requires many manual tasks. We investigate the use of an event driven framework in conjunction with software agents to automate QA and eliminate wait times. Methods: An in house developed subscription-publication service, EventNet, was added to the Aria OIS to be a message broker for critical events occurring in the OIS and software agents. Software agents operate without user intervention and perform critical QA steps. The results of the QA are documented and the resulting event is generated and passed back to EventNet. Users can subscribe to those events and receive messages based on custom filters designed to send passing or failing results to physicists or dosimetrists. Agents were developed to expedite the following QA tasks: Plan Revision, Plan 2nd Check, SRS Winston-Lutz isocenter, Treatment History Audit, Treatment Machine Configuration. Results: Plan approval in the Aria OIS was used as the event trigger for plan revision QA and Plan 2nd check agents. The agents pulled the plan data, executed the prescribed QA, stored the results and updated EventNet for publication. The Winston Lutz agent reduced QA time from 20 minutes to 4 minutes and provided a more accurate quantitative estimate of radiation isocenter. The Treatment Machine Configuration agent automatically reports any changes to the Treatment machine or HDR unit configuration. The agents are reliable, act immediately, and execute each task identically every time. Conclusion: An event driven framework has inverted the data chase in our radiotherapy QA process. Rather than have dosimetrists and physicists push data to QA software and pull results back into the OIS, the software agents perform these steps immediately upon receiving the sentinel events from EventNet. Mr Keranen is an employee of Varian Medical Systems. Dr. Moran’s institution receives research support for her effort for a linear accelerator QA project from Varian Medical Systems. Other quality projects involving her effort are funded by Blue Cross Blue Shield of Michigan, Breast Cancer Research Foundation, and the NIH.
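EventNet itself is in-house software; as a generic, hedged sketch of the subscription-publication pattern the abstract describes (sentinel event triggers an automated QA agent, which publishes a result event that notifies staff), consider the following. The broker class, event names, and QA check are stand-ins invented for illustration, not the Aria/EventNet API.

    from collections import defaultdict

    class EventBroker:
        """Minimal publish/subscribe broker standing in for the in-house EventNet."""
        def __init__(self):
            self._subscribers = defaultdict(list)

        def subscribe(self, event_type, handler):
            self._subscribers[event_type].append(handler)

        def publish(self, event_type, payload):
            for handler in self._subscribers[event_type]:
                handler(payload)

    broker = EventBroker()

    def plan_second_check_agent(plan):
        # Placeholder QA: a real agent would pull plan data from the OIS and run checks.
        result = "PASS" if plan.get("mu_per_gy", 0) < 300 else "REVIEW"
        broker.publish("PlanSecondCheckCompleted",
                       {"plan_id": plan["plan_id"], "result": result})

    def notify_physicist(event):
        print(f"Plan {event['plan_id']}: 2nd check {event['result']}")

    broker.subscribe("PlanApproved", plan_second_check_agent)
    broker.subscribe("PlanSecondCheckCompleted", notify_physicist)

    # A plan approval in the OIS would trigger the chain automatically.
    broker.publish("PlanApproved", {"plan_id": "PT-001", "mu_per_gy": 245})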
2008-09-01
software facilitates targeting problem understanding and the network analysis tool, Palantir, as an efficient and tailored semi-automated means to...the use of compendium software facilitates targeting problem understanding and the network analysis tool, Palantir, as an efficient and tailored semi...OBJECTIVES USING COMPENDIUM SOFTWARE; E. HOT TARGET PRIORITIZATION AND DEVELOPMENT USING PALANTIR SOFTWARE
Software Defined Network Monitoring Scheme Using Spectral Graph Theory and Phantom Nodes
2014-09-01
Snippet fragments note the emergence of software-defined networking (SDN), describe SDNs as a new and innovative method to simplify network hardware, and indicate that the monitoring scheme is developed in Chapter III; a cited reference is R. Giladi, "Performance analysis of software-defined networking (SDN)," in Proc. of the IEEE 21st International Symposium on Modeling, Analysis...
Development of new vibration energy flow analysis software and its applications to vehicle systems
NASA Astrophysics Data System (ADS)
Kim, D.-J.; Hong, S.-Y.; Park, Y.-H.
2005-09-01
Energy flow analysis (EFA) offers very promising results in predicting the noise and vibration responses of system structures in medium-to-high frequency ranges. We have developed energy flow finite element method (EFFEM) based software, EFADSC++ R4, for vibration analysis. The software can analyze system structures composed of beam, plate, spring-damper, rigid-body, and many other developed elements, and has many useful analysis functions. For convenient use, the main functions of the software are modularized into translator, model-converter, and solver modules. The translator module makes it possible to use a finite element (FE) model for the vibration analysis. The model-converter module changes the FE model into an energy flow finite element (EFFE) model, generates joint elements to cover the vibrational attenuation in complex structures composed of various elements, and can solve the joint element equations very quickly by using the wave transmission approach. The solver module supports various direct and iterative solvers for multi-DOF structures. Vibration predictions for real vehicles using the developed software were performed successfully.
Simmons, Elizabeth Schoen; Paul, Rhea; Shic, Frederick
2016-01-01
This study examined the acceptability of a mobile application, SpeechPrompts, designed to treat prosodic disorders in children with ASD and other communication impairments. Ten speech-language pathologists (SLPs) in public schools and 40 of their students, aged 5-19 years, with prosody deficits participated. Students received treatment with the software over eight weeks. Pre- and post-treatment speech samples and student engagement data were collected. Feedback on the utility of the software was also obtained. SLPs implemented the software with their students in an authentic education setting. Student engagement ratings indicated that students' attention to the software was maintained during treatment. Although more testing is warranted, post-treatment prosody ratings suggest that SpeechPrompts has potential to be a useful tool in the treatment of prosodic disorders.
Description of the GMAO OSSE for Weather Analysis Software Package: Version 3
NASA Technical Reports Server (NTRS)
Koster, Randal D. (Editor); Errico, Ronald M.; Prive, Nikki C.; Carvalho, David; Sienkiewicz, Meta; El Akkraoui, Amal; Guo, Jing; Todling, Ricardo; McCarty, Will; Putman, William M.;
2017-01-01
The Global Modeling and Assimilation Office (GMAO) at the NASA Goddard Space Flight Center has developed software and products for conducting observing system simulation experiments (OSSEs) for weather analysis applications. Such applications include estimations of potential effects of new observing instruments or data assimilation techniques on improving weather analysis and forecasts. The GMAO software creates simulated observations from nature run (NR) data sets and adds simulated errors to those observations. The algorithms employed are much more sophisticated, adding a much greater degree of realism, compared with OSSE systems currently available elsewhere. The algorithms employed, software designs, and validation procedures are described in this document. Instructions for using the software are also provided.
ERIC Educational Resources Information Center
Margerum-Leys, Jon; Kupperman, Jeff; Boyle-Heimann, Kristen
This paper presents perspectives on the use of data analysis software in the process of qualitative research. These perspectives were gained in the conduct of three qualitative research studies that differed in theoretical frames, areas of interests, and scope. Their common use of a particular data analysis software package allows the exploration…
ElectroMagnetoEncephalography Software: Overview and Integration with Other EEG/MEG Toolboxes
Peyk, Peter; De Cesarei, Andrea; Junghöfer, Markus
2011-01-01
EMEGS (electromagnetic encephalography software) is a MATLAB toolbox designed to provide novice as well as expert users in the field of neuroscience with a variety of functions to perform analysis of EEG and MEG data. The software consists of a set of graphical interfaces devoted to preprocessing, analysis, and visualization of electromagnetic data. Moreover, it can be extended using a plug-in interface. Here, an overview of the capabilities of the toolbox is provided, together with a simple tutorial for both a standard ERP analysis and a time-frequency analysis. Latest features and future directions of the software development are presented in the final section. PMID:21577273
ElectroMagnetoEncephalography software: overview and integration with other EEG/MEG toolboxes.
Peyk, Peter; De Cesarei, Andrea; Junghöfer, Markus
2011-01-01
EMEGS (electromagnetic encephalography software) is a MATLAB toolbox designed to provide novice as well as expert users in the field of neuroscience with a variety of functions to perform analysis of EEG and MEG data. The software consists of a set of graphical interfaces devoted to preprocessing, analysis, and visualization of electromagnetic data. Moreover, it can be extended using a plug-in interface. Here, an overview of the capabilities of the toolbox is provided, together with a simple tutorial for both a standard ERP analysis and a time-frequency analysis. Latest features and future directions of the software development are presented in the final section.
General purpose optimization software for engineering design
NASA Technical Reports Server (NTRS)
Vanderplaats, G. N.
1990-01-01
The author has developed several general purpose optimization programs over the past twenty years. The earlier programs were developed as research codes and served that purpose reasonably well. However, in taking the formal step from research to industrial application programs, several important lessons have been learned. Among these are the importance of clear documentation, immediate user support, and consistent maintenance. Most important has been the issue of providing software that gives a good, or at least acceptable, design at minimum computational cost. Here, the basic issues in developing optimization software for industrial applications are outlined, and issues of convergence rate, reliability, and relative minima are discussed. Considerable feedback has been received from users, and new software is being developed to respond to identified needs. The basic capabilities of this software are outlined. A major motivation for the development of commercial grade software is ease of use and flexibility, and these issues are discussed with reference to general multidisciplinary applications. It is concluded that design productivity can be significantly enhanced by the more widespread use of optimization as an everyday design tool.
ERIC Educational Resources Information Center
Borman, Stuart A.
1985-01-01
Discusses various aspects of scientific software, including evaluation and selection of commercial software products; program exchanges, catalogs, and other information sources; major data analysis packages; statistics and chemometrics software; and artificial intelligence. (JN)
ATE accomplishes receiver specification testing with increased speed and throughput
NASA Astrophysics Data System (ADS)
Moser, S. A.
1982-12-01
The use of automatic test equipment (ATE) for receiver specifications testing can result in a 90-95% reduction of test time, with a corresponding reduction of labor costs due both to the reduction of personnel numbers and a simplification of tasks that permits less skilled personnel to be employed. These benefits free high-level technicians for more challenging system management assignments. Accuracy and repeatability also improve with the adoption of ATE, since no possibility of human error can be introduced into the readings that are taken by the system. A massive and expensive software design and development effort is identified as the most difficult aspect of ATE implementation, since programming is both time-consuming and labor intensive. An attempt is therefore made by system manufacturers to conduct an integrated development program for both ATE system hardware and software.
2002-12-01
Snippet fragments: the procedures outlined in this CHETN concentrate on the Magellan GPS ProMARK X-CP receiver as it was used to collect data; the ProMARK X-CP is a small, robust, light receiver that can log 9 hr of both pseudorange and carrier-phase satellite data for post-processing; with post-processing software, pseudorange GPS data recorded by the ProMARK X-CP can be differentially post-processed to achieve 1-3 m (3.3-9.8 ft) horizontal...
Weather Information Processing
NASA Technical Reports Server (NTRS)
1991-01-01
Science Communications International (SCI), formerly General Science Corporation, has developed several commercial products based upon experience acquired as a NASA Contractor. Among them are METPRO, a meteorological data acquisition and processing system, which has been widely used, RISKPRO, an environmental assessment system, and MAPPRO, a geographic information system. METPRO software is used to collect weather data from satellites, ground-based observation systems and radio weather broadcasts to generate weather maps, enabling potential disaster areas to receive advance warning. GSC's initial work for NASA Goddard Space Flight Center resulted in METPAK, a weather satellite data analysis system. METPAK led to the commercial METPRO system. The company also provides data to other government agencies, U.S. embassies and foreign countries.
Thickness Measurement of Surface Attachment on Plate with Lamb Wave
NASA Astrophysics Data System (ADS)
Ma, Xianglong; Zhang, Yinghong; Wen, Lichao; He, Yehu
2017-12-01
To measure the thickness of attachments on a plate surface, a nondestructive testing method based on Lamb waves is presented. The method utilizes the propagation characteristics of Lamb wave signals in a bi-layer medium to measure the thickness of the attachment on the plate. Propagation of Lamb waves in a bi-layer elastic medium is modeled and analyzed, and a two-dimensional simulation model of the electromagnetic ultrasonic plate-scale system is established. Simulations are conducted in COMSOL to analyze the waveform curves under different boiler-scale thicknesses. Through this study, the thickness of the attached material can be judged by analyzing the characteristics of the received signal when the surface of the plate is measured.
NASA Astrophysics Data System (ADS)
Delene, D. J.
2014-12-01
Research aircraft that conduct atmospheric measurements carry an increasing array of instrumentation. While on-board personnel constantly review instrument parameters and time series plots, there are an overwhelming number of items to monitor. Furthermore, directing the aircraft flight takes up much of the flight scientist's time. Typically, a flight engineer is given the responsibility of reviewing the status of on-board instruments. While major issues like not receiving data are quickly identified during a flight, subtle issues like low but believable concentration measurements may go unnoticed. Therefore, it is critical to review data after a flight in near real time. The Airborne Data Processing and Analysis (ADPAA) software package used by the University of North Dakota automates the post-processing of aircraft flight data. Utilizing scripts to process the measurements recorded by data acquisition systems enables the generation of data files within an hour of flight completion. The ADPAA Cplot visualization program enables plots to be quickly generated for timely review of all recorded and processed parameters. Near-real-time review of aircraft flight data enables instrument problems to be identified, investigated, and fixed before conducting another flight. On one flight, near-real-time data review resulted in the identification of unusually low measurements of cloud condensation nuclei, and rapid data visualization enabled the timely investigation of the cause. As a result, a leak was found and fixed before the next flight. Hence, with the high cost of aircraft flights, it is critical to find and fix instrument problems in a timely manner. The use of automated processing scripts and quick visualization software enables scientists to review aircraft flight data in near real time and identify potential problems.
Lovato, Andrea; De Colle, Wladimiro; Giacomelli, Luciano; Piacente, Alessandro; Righetto, Lara; Marioni, Gino; de Filippis, Cosimo
2016-11-01
The aim of this study was to compare the discriminatory power of the Multi-Dimensional Voice Program (MDVP) and Praat in distinguishing the gender of euphonic adults. This is a cross-sectional study. The recordings of 100 euphonic volunteers (50 males and 50 females) producing a sustained vowel /a/ were analyzed with MDVP and Praat software. Both computer programs identified significant differences between male and female volunteers in absolute jitter (MDVP P < 0.00001 and Praat P < 0.00001) and in shimmer in decibel (dB) (MDVP P = 0.006 and Praat P = 0.001). Using the scale proposed by Hosmer and Lemeshow, we found no gender discrimination for shimmer in dB with either the MDVP (area under the receiver operating characteristics curve [AUC] = 0.658) or Praat (AUC = 0.682). In our series, on the other hand, MDVP absolute jitter achieved an acceptable discrimination between males and females (AUC = 0.752), and Praat absolute jitter achieved an outstanding discrimination (AUC = 0.901). The discriminatory power of Praat absolute jitter was significantly higher than that of the MDVP (P = 0.003). Absolute jitter sensitivity and specificity were also higher for Praat (83% and 80%) than for the MDVP (74% and 49%). Differences attributable to a subject's gender and to the software used to measure acoustic parameters should be carefully considered in both research and clinical settings. Further studies are needed to test the discriminatory power of different voice analysis programs when differentiating between normal and dysphonic voices. Copyright © 2016 The Voice Foundation. Published by Elsevier Inc. All rights reserved.
Data acquisition architecture and online processing system for the HAWC gamma-ray observatory
NASA Astrophysics Data System (ADS)
Abeysekara, A. U.; Alfaro, R.; Alvarez, C.; Álvarez, J. D.; Arceo, R.; Arteaga-Velázquez, J. C.; Ayala Solares, H. A.; Barber, A. S.; Baughman, B. M.; Bautista-Elivar, N.; Becerra Gonzalez, J.; Belmont-Moreno, E.; BenZvi, S. Y.; Berley, D.; Bonilla Rosales, M.; Braun, J.; Caballero-Lopez, R. A.; Caballero-Mora, K. S.; Carramiñana, A.; Castillo, M.; Cotti, U.; Cotzomi, J.; de la Fuente, E.; De León, C.; DeYoung, T.; Diaz-Cruz, J.; Diaz Hernandez, R.; Díaz-Vélez, J. C.; Dingus, B. L.; DuVernois, M. A.; Ellsworth, R. W.; Fiorino, D. W.; Fraija, N.; Galindo, A.; Garfias, F.; González, M. M.; Goodman, J. A.; Grabski, V.; Gussert, M.; Hampel-Arias, Z.; Harding, J. P.; Hui, C. M.; Hüntemeyer, P.; Imran, A.; Iriarte, A.; Karn, P.; Kieda, D.; Kunde, G. J.; Lara, A.; Lauer, R. J.; Lee, W. H.; Lennarz, D.; León Vargas, H.; Linares, E. C.; Linnemann, J. T.; Longo Proper, M.; Luna-García, R.; Malone, K.; Marinelli, A.; Marinelli, S. S.; Martinez, O.; Martínez-Castro, J.; Martínez-Huerta, H.; Matthews, J. A. J.; McEnery, J.; Mendoza Torres, E.; Miranda-Romagnoli, P.; Moreno, E.; Mostafá, M.; Nellen, L.; Newbold, M.; Noriega-Papaqui, R.; Oceguera-Becerra, T.; Patricelli, B.; Pelayo, R.; Pérez-Pérez, E. G.; Pretz, J.; Rivière, C.; Rosa-González, D.; Ruiz-Velasco, E.; Ryan, J.; Salazar, H.; Salesa Greus, F.; Sanchez, F. E.; Sandoval, A.; Schneider, M.; Silich, S.; Sinnis, G.; Smith, A. J.; Sparks Woodle, K.; Springer, R. W.; Taboada, I.; Toale, P. A.; Tollefson, K.; Torres, I.; Ukwatta, T. N.; Villaseñor, L.; Weisgarber, T.; Westerhoff, S.; Wisher, I. G.; Wood, J.; Yapici, T.; Yodh, G. B.; Younk, P. W.; Zaborov, D.; Zepeda, A.; Zhou, H.
2018-04-01
The High Altitude Water Cherenkov observatory (HAWC) is an air shower array devised for TeV gamma-ray astronomy. HAWC is located at an altitude of 4100 m a.s.l. in Sierra Negra, Mexico. HAWC consists of 300 Water Cherenkov Detectors, each instrumented with 4 photomultiplier tubes (PMTs). HAWC re-uses the Front-End Boards from the Milagro experiment to receive the PMT signals. These boards are used in combination with Time to Digital Converters (TDCs) to record the time and the amount of light in each PMT hit (light flash). A set of VME TDC modules (128 channels each) is operated in a continuous (dead time free) mode. The TDCs are read out via the VME bus by Single-Board Computers (SBCs), which in turn are connected to a gigabit Ethernet network. The complete system produces ≈500 MB/s of raw data. A high-throughput data processing system has been designed and built to enable real-time data analysis. The system relies on off-the-shelf hardware components, an open-source software technology for data transfers (ZeroMQ) and a custom software framework for data analysis (AERIE). Multiple trigger and reconstruction algorithms can be combined and run on blocks of data in a parallel fashion, producing a set of output data streams which can be analyzed in real time with minimal latency (<5 s). This paper provides an overview of the hardware set-up and an in-depth description of the software design, covering both the TDC data acquisition system and the real-time data processing system. The performance of these systems is also discussed.
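A minimal sketch of a ZeroMQ push/pull pipeline of the kind such an online system could use to fan data blocks out to parallel workers is shown below; the endpoint, socket pattern, and block format are assumptions made for illustration and do not reproduce the HAWC/AERIE design.

```python
# Minimal ZeroMQ push/pull sketch: one producer fans raw data blocks out to
# parallel workers. Run producer() and worker() in separate processes.
# The endpoint and message layout are illustrative assumptions.
import zmq


def producer(endpoint: str = "tcp://127.0.0.1:5557") -> None:
    ctx = zmq.Context.instance()
    sock = ctx.socket(zmq.PUSH)
    sock.bind(endpoint)
    for block_id in range(3):
        # Each message carries one block of raw data (placeholder bytes here).
        sock.send_multipart([str(block_id).encode(), b"\x00" * 1024])


def worker(endpoint: str = "tcp://127.0.0.1:5557") -> None:
    ctx = zmq.Context.instance()
    sock = ctx.socket(zmq.PULL)
    sock.connect(endpoint)
    while True:
        block_id, payload = sock.recv_multipart()
        # Trigger/reconstruction algorithms would process the block here.
        print(f"processing block {block_id.decode()} ({len(payload)} bytes)")
```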
NASA Technical Reports Server (NTRS)
Jeutter, Dean C.
1996-01-01
The closed-loop prototype has operational bi-directional wireless links. The wideband PCM-FSK receiver has been designed and characterized. Now that both links function, communication performance can be addressed. For example, noise problems with the received outlink signal that caused the PC program to lock up were only recently revealed and minimized by software "enhancements" to the Windows-based PC program. A similar problem with inlink communication was uncovered several days before this report: a noise spike or dropout (expected events in the animal Habitat) caused an interrupt to the implant microcontroller which halted outlink transmission, and recovery of outlink transmission did not reliably occur. The problem has been defined, and the implant software is being modified to better distinguish noise from data by changing the timing associated with valid data packet identification and by better utilizing the error flags generated by the microcontroller's SCI circuits. Excellent inlink performance will also require improvements in the implant's receiver. The biggest performance improvement can be provided by antenna design for the Habitat. The quarter-wavelength whip antennas used with the demo prototype inlink leave much to be desired.
Software-type Wave-Particle Interaction Analyzer (SWPIA) by RPWI for JUICE
NASA Astrophysics Data System (ADS)
Katoh, Y.; Kojima, H.; Asamura, K.; Kasaba, Y.; Tsuchiya, F.; Kasahara, Y.; Ishisaka, S.; Kimura, T.; Miyoshi, Y.; Santolik, O.; Bergman, J.; Puccio, W.; Gill, R.; Wieser, M.; Schmidt, W.; Barabash, S.; Wahlund, J.-E.
2017-09-01
Software-type Wave-Particle Interaction Analyzer (SWPIA) will be realized as a software function of Low-Frequency receiver (LF) running on the DPU of RPWI (Radio and Plasma Waves Investigation) for the ESA JUICE mission. SWPIA conducts onboard computations of physical quantities indicating the energy exchange between plasma waves and energetic ions. Onboard inter-instruments communications are necessary to realize SWPIA, which will be implemented by efforts of RPWI, PEP (Particle Environment Package) and J-MAG (JUICE Magnetometer). By providing the direct evidence of ion energization processes by plasma waves around Jovian satellites, SWPIA contributes scientific output of JUICE as much as possible with keeping its impact on the telemetry data size to a minimum.
STRS Radio Service Software for NASA's SCaN Testbed
NASA Technical Reports Server (NTRS)
Mortensen, Dale J.; Bishop, Daniel Wayne; Chelmins, David T.
2012-01-01
NASA's Space Communication and Navigation (SCaN) Testbed was launched to the International Space Station in 2012. The objective is to promote new software defined radio technologies and associated software application reuse, enabled by this first flight of NASA's Space Telecommunications Radio System (STRS) architecture standard. Pre-launch testing with the testbed's software defined radios was performed as part of system integration. Radio services for the JPL SDR were developed during system integration to allow the waveform application to operate properly in the space environment, especially considering thermal effects. These services include receiver gain control, frequency offset, IQ modulator balance, and transmit level control. Development, integration, and environmental testing of the radio services will be described. The added software allows the waveform application to operate properly in the space environment, and can be reused by future experimenters testing different waveform applications. Integrating such services with the platform provided STRS operating environment will attract more users, and these services are candidates for interface standardization via STRS.
STRS Radio Service Software for NASA's SCaN Testbed
NASA Technical Reports Server (NTRS)
Mortensen, Dale J.; Bishop, Daniel Wayne; Chelmins, David T.
2013-01-01
NASA's Space Communication and Navigation(SCaN) Testbed was launched to the International Space Station in 2012. The objective is to promote new software defined radio technologies and associated software application reuse, enabled by this first flight of NASA's Space Telecommunications Radio System (STRS) architecture standard. Pre-launch testing with the testbed's software defined radios was performed as part of system integration. Radio services for the JPL SDR were developed during system integration to allow the waveform application to operate properly in the space environment, especially considering thermal effects. These services include receiver gain control, frequency offset, IQ modulator balance, and transmit level control. Development, integration, and environmental testing of the radio services will be described. The added software allows the waveform application to operate properly in the space environment, and can be reused by future experimenters testing different waveform applications. Integrating such services with the platform provided STRS operating environment will attract more users, and these services are candidates for interface standardization via STRS.
Using cognitive work analysis to explore activity allocation within military domains.
Jenkins, D P; Stanton, N A; Salmon, P M; Walker, G H; Young, M S
2008-06-01
Cognitive work analysis (CWA) is frequently advocated as an approach for the analysis of complex socio-technical systems. Much of the current CWA literature within the military domain pays particular attention to its initial phases; work domain analysis and contextual task analysis. By comparison, the analysis of the social and organisational constraints receives much less attention. Through the study of a helicopter mission planning system software tool, this paper describes an approach for investigating the constraints affecting the distribution of work. The paper uses this model to evaluate the potential benefits of the social and organisational analysis phase within a military context. The analysis shows that, through its focus on constraints, the approach provides a unique description of the factors influencing the social organisation within a complex domain. This approach appears to be compatible with existing approaches and serves as a validation of more established social analysis techniques. As part of the ergonomic design of mission planning systems, the social organisation and cooperation analysis phase of CWA provides a constraint-based description informing allocation of function between key actor groups. This approach is useful because it poses questions related to the transfer of information and optimum working practices.
Software selection based on analysis and forecasting methods, practised in 1C
NASA Astrophysics Data System (ADS)
Vazhdaev, A. N.; Chernysheva, T. Y.; Lisacheva, E. I.
2015-09-01
The research focuses on the built-in mechanisms of the "1C: Enterprise 8" platform for data analysis and forecasting. It is important to evaluate and select proper software in order to develop effective strategies for customer relationship management in terms of sales, as well as for implementation and further maintenance of the software. The research data allow new forecast models to be created for scheduling further software distribution.
Software for Real-Time Analysis of Subsonic Test Shot Accuracy
2014-03-01
Snippet fragments: the work used the C++ programming language, the Open Source Computer Vision (OpenCV®) software library, and the Microsoft Windows® Application Programming... video for comparison through OpenCV image analysis tools; based on the comparison, the software then computed the coordinates of each shot relative to... DWB researchers wanted to use the Open Source Computer Vision (OpenCV) software library for capturing and analyzing frames of video...
Software ion scan functions in analysis of glycomic and lipidomic MS/MS datasets.
Haramija, Marko
2018-03-01
Hardware ion scan functions unique to tandem mass spectrometry (MS/MS) mode of data acquisition, such as precursor ion scan (PIS) and neutral loss scan (NLS), are important for selective extraction of key structural data from complex MS/MS spectra. However, their software counterparts, software ion scan (SIS) functions, are still not regularly available. Software ion scan functions can be easily coded for additional functionalities, such as software multiple precursor ion scan, software no ion scan, and software variable ion scan functions. These are often necessary, since they allow more efficient analysis of complex MS/MS datasets, often encountered in glycomics and lipidomics. Software ion scan functions can be easily coded by using modern script languages and can be independent of instrument manufacturer. Here we demonstrate the utility of SIS functions on a medium-size glycomic MS/MS dataset. Knowledge of sample properties, as well as of diagnostic and conditional diagnostic ions crucial for data analysis, was needed. Based on the tables constructed with the output data from the SIS functions performed, a detailed analysis of a complex MS/MS glycomic dataset could be carried out in a quick, accurate, and efficient manner. Glycomic research is progressing slowly, and with respect to the MS experiments, one of the key obstacles for moving forward is the lack of appropriate bioinformatic tools necessary for fast analysis of glycomic MS/MS datasets. Adding novel SIS functionalities to the glycomic MS/MS toolbox has a potential to significantly speed up the glycomic data analysis process. Similar tools are useful for analysis of lipidomic MS/MS datasets as well, as will be discussed briefly. Copyright © 2017 John Wiley & Sons, Ltd.
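The idea of a software precursor ion scan can be sketched in a few lines, assuming the MS/MS spectra are already parsed into (precursor m/z, fragment list) pairs; the data layout, tolerance, and the example diagnostic ion are illustrative assumptions, not the paper's code.

```python
# Minimal sketch of a software precursor ion scan (PIS) over parsed MS/MS spectra.
from typing import Iterable, List, Tuple

Spectrum = Tuple[float, List[float]]  # (precursor m/z, list of fragment m/z values)


def software_precursor_ion_scan(
    spectra: Iterable[Spectrum], diagnostic_mz: float, tol: float = 0.01
) -> List[float]:
    """Return precursor m/z values whose spectra contain the diagnostic fragment."""
    hits = []
    for precursor_mz, fragments in spectra:
        if any(abs(f - diagnostic_mz) <= tol for f in fragments):
            hits.append(precursor_mz)
    return hits


# Example: find precursors that yield a fragment near m/z 204.087 (HexNAc oxonium ion).
demo = [(933.4, [204.087, 366.14]), (714.3, [184.07])]
print(software_precursor_ion_scan(demo, 204.087))  # -> [933.4]
```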
Kaufmann, Sascha; Russo, Giorgio I; Thaiss, Wolfgang; Notohamiprodjo, Mike; Bamberg, Fabian; Bedke, Jens; Morgia, Giuseppe; Nikolaou, Konstantin; Stenzl, Arnulf; Kruck, Stephan
2018-04-03
Multiparametric magnetic resonance imaging (mpMRI) is gaining acceptance to guide targeted biopsy (TB) in prostate cancer (PC) diagnosis. We aimed to compare the detection rate of software-assisted fusion TB (SA-TB) versus cognitive fusion TB (COG-TB) for PC and to evaluate potential clinical features for detecting PC and clinically significant PC (csPC) at TB. This was a retrospective cohort study of patients with rising and/or persistently elevated prostate-specific antigen (PSA) undergoing mpMRI followed by either transperineal SA-TB or transrectal COG-TB. A matched-pair analysis between SA-TB and COG-TB was performed, with no differences in clinical or radiological characteristics between groups. Differences in the detection of PC/csPC between groups were analyzed. A multivariable logistic regression model predicting PC at TB was fitted. The model was evaluated using the receiver operating characteristic-derived area under the curve, goodness-of-fit test, and decision-curve analyses. One hundred ninety-one and 87 patients underwent SA-TB or COG-TB, respectively. The multivariate logistic analysis showed that SA-TB was associated with overall PC (odds ratio [OR], 5.70; P < .01) and PC at TB (OR, 3.00; P < .01) but not with overall csPC (P = .40) or csPC at TB (P = .40). A nomogram predicting PC at TB was constructed using the Prostate Imaging Reporting and Data System version 2.0, age, PSA density, and biopsy technique, showing improved clinical risk prediction against a threshold probability of 10%, with a c-index of 0.83. In patients with suspected PC, software-assisted biopsy detects most cancers and outperforms the cognitive approach in targeting magnetic resonance imaging-visible lesions. Furthermore, we introduce a prebiopsy nomogram for the probability of PC at TB. Copyright © 2018 Elsevier Inc. All rights reserved.
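In the same spirit as the nomogram above, a logistic model with an ROC-derived AUC can be fitted as in the sketch below; the predictors mirror those named in the abstract, but the synthetic data, fitted coefficients, and resulting AUC are purely illustrative and do not reproduce the study's model.

```python
# Illustrative sketch of a multivariable logistic model evaluated by ROC AUC.
# The feature set mimics the abstract; the data here are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 200
# Hypothetical predictors: PI-RADS v2 score, age, PSA density, biopsy technique.
X = np.column_stack([
    rng.integers(2, 6, n),          # PI-RADS category
    rng.normal(65, 7, n),           # age (years)
    rng.lognormal(-2.0, 0.5, n),    # PSA density (ng/mL/cc)
    rng.integers(0, 2, n),          # 1 = software-assisted fusion, 0 = cognitive
])
y = rng.integers(0, 2, n)           # cancer at targeted biopsy (synthetic labels)

model = LogisticRegression(max_iter=1000).fit(X, y)
print("AUC on training data:", roc_auc_score(y, model.predict_proba(X)[:, 1]))
```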
Thon, Anika; Teichgräber, Ulf; Tennstedt-Schenk, Cornelia; Hadjidemetriou, Stathis; Winzler, Sven; Malich, Ansgar; Papageorgiou, Ismini
2017-01-01
Prostate cancer (PCa) diagnosis by means of multiparametric magnetic resonance imaging (mpMRI) is a current challenge for the development of computer-aided detection (CAD) tools. An innovative CAD software (Watson Elementary™) was proposed to achieve high sensitivity and specificity, as well as to allege a correlate to Gleason grade. To assess the performance of Watson Elementary™ in automated PCa diagnosis in our hospital's database of MRI-guided prostate biopsies. The evaluation was retrospective for 104 lesions (47 PCa, 57 benign) from 79 patients aged 64.61 ± 6.64 years, using 3T T2-weighted imaging, Apparent Diffusion Coefficient (ADC) maps and dynamic contrast enhancement series. Watson Elementary™ utilizes signal intensity, diffusion properties and kinetic profile to compute a proportional Gleason grade predictor, termed Malignancy Attention Index (MAI). The analysis focused on (i) the CAD sensitivity and specificity to classify suspect lesions and (ii) the MAI correlation with the histopathological ground truth. The software revealed a sensitivity of 46.80% for PCa classification. The specificity for PCa was found to be 75.43% with a positive predictive value of 61.11%, a negative predictive value of 63.23% and a false discovery rate of 38.89%. CAD classified PCa and benign lesions with equal probability (P = 0.06, χ2 test). Accordingly, receiver operating characteristic analysis suggests a poor predictive value for MAI with an area under curve of 0.65 (P = 0.02), which is not superior to the performance of board-certified observers. Moreover, MAI revealed no significant correlation with Gleason grade (P = 0.60, Pearson's correlation). The tested CAD software for mpMRI analysis was a weak PCa biomarker in this dataset. Targeted prostate biopsy and histology remains the gold standard for prostate cancer diagnosis.
Thon, Anika; Teichgräber, Ulf; Tennstedt-Schenk, Cornelia; Hadjidemetriou, Stathis; Winzler, Sven; Malich, Ansgar
2017-01-01
Background: Prostate cancer (PCa) diagnosis by means of multiparametric magnetic resonance imaging (mpMRI) is a current challenge for the development of computer-aided detection (CAD) tools. An innovative CAD software (Watson Elementary™) was proposed to achieve high sensitivity and specificity, as well as to allege a correlate to Gleason grade. Aim/Objective: To assess the performance of Watson Elementary™ in automated PCa diagnosis in our hospital's database of MRI-guided prostate biopsies. Methods: The evaluation was retrospective for 104 lesions (47 PCa, 57 benign) from 79 patients aged 64.61 ± 6.64 years, using 3T T2-weighted imaging, Apparent Diffusion Coefficient (ADC) maps and dynamic contrast enhancement series. Watson Elementary™ utilizes signal intensity, diffusion properties and kinetic profile to compute a proportional Gleason grade predictor, termed Malignancy Attention Index (MAI). The analysis focused on (i) the CAD sensitivity and specificity to classify suspect lesions and (ii) the MAI correlation with the histopathological ground truth. Results: The software revealed a sensitivity of 46.80% for PCa classification. The specificity for PCa was found to be 75.43% with a positive predictive value of 61.11%, a negative predictive value of 63.23% and a false discovery rate of 38.89%. CAD classified PCa and benign lesions with equal probability (P = 0.06, χ2 test). Accordingly, receiver operating characteristic analysis suggests a poor predictive value for MAI with an area under curve of 0.65 (P = 0.02), which is not superior to the performance of board-certified observers. Moreover, MAI revealed no significant correlation with Gleason grade (P = 0.60, Pearson's correlation). Conclusion: The tested CAD software for mpMRI analysis was a weak PCa biomarker in this dataset. Targeted prostate biopsy and histology remains the gold standard for prostate cancer diagnosis. PMID:29023572
New software for statistical analysis of Cambridge Structural Database data
Sykes, Richard A.; McCabe, Patrick; Allen, Frank H.; Battle, Gary M.; Bruno, Ian J.; Wood, Peter A.
2011-01-01
A collection of new software tools is presented for the analysis of geometrical, chemical and crystallographic data from the Cambridge Structural Database (CSD). This software supersedes the program Vista. The new functionality is integrated into the program Mercury in order to provide statistical, charting and plotting options alongside three-dimensional structural visualization and analysis. The integration also permits immediate access to other information about specific CSD entries through the Mercury framework, a common requirement in CSD data analyses. In addition, the new software includes a range of more advanced features focused towards structural analysis such as principal components analysis, cone-angle correction in hydrogen-bond analyses and the ability to deal with topological symmetry that may be exhibited in molecular search fragments. PMID:22477784
NASA Astrophysics Data System (ADS)
Virgen, Matthew Miguel
Two significant goals in solar plant operation are lower cost and higher efficiency. To achieve those goals, a combined cycle gas turbine (CCGT) system, which uses the hot gas turbine exhaust to produce superheated steam for a bottoming Rankine cycle by way of a heat recovery steam generator (HRSG), is investigated in this work. Building on a previous gas turbine model created at the Combustion and Solar Energy Laboratory at SDSU, this work adds the HRSG and steam turbine models, which must handle significant changes in the mass flow and temperature of air exiting the gas turbine due to varying solar input. A wide range of cases was run to explore options for maximizing both power and efficiency from the proposed CSP CCGT plant. Variable guide vanes (VGVs) were found in the earlier model to be an effective tool in providing operational flexibility to address the variable nature of solar input. Combined cycle efficiencies in the range of 50% were found to result from this plant configuration. However, a combustor inlet temperature (CIT) limit leads to two distinct modes of operation, with a sharp drop in both plant efficiency and power occurring when the air flow through the receiver exceeds the CIT limit. This drawback can be partially addressed through strategic use of the VGVs. Since the system response is fully established for the relevant range of solar input and variable guide vane angles, the System Advisor Model (SAM) from NREL can be used to find the actual expected solar input over the course of the day and plan accordingly. While the SAM software is not yet equipped to model a Brayton cycle cavity receiver, appropriate approximations were made in order to produce a suitable heliostat field for this system. Since the SPHER uses carbon nano-particles as the solar absorbers, questions of particle longevity and of how the particles might affect the flame behavior in the combustor were addressed with the chemical kinetics software ChemkinPro by modeling the combustion characteristics both with and without the particles. This work is presented in the Appendix.
Development of Cell Analysis Software for Cultivated Corneal Endothelial Cells.
Okumura, Naoki; Ishida, Naoya; Kakutani, Kazuya; Hongo, Akane; Hiwa, Satoru; Hiroyasu, Tomoyuki; Koizumi, Noriko
2017-11-01
To develop analysis software for cultured human corneal endothelial cells (HCECs). Software was designed to recognize cell borders and to provide parameters such as cell density, coefficient of variation, and polygonality of cultured HCECs based on phase contrast images. Cultured HCECs with high or low cell density were incubated with Ca-free and Mg-free phosphate-buffered saline for 10 minutes to reveal the cell borders and were then analyzed with the software (n = 50). Phase contrast images showed that cell borders were not distinctly outlined, but these borders became more distinctly outlined after phosphate-buffered saline treatment and were recognized by the cell analysis software. The cell density value provided by the software was similar to that obtained by manual cell counting by an experienced researcher. Morphometric parameters, such as the coefficient of variation and polygonality, were also produced by the software, and these values were significantly correlated with cell density (Pearson correlation coefficients -0.62 and 0.63, respectively). The software described here provides morphometric information from phase contrast images, and it enables objective and noninvasive quality assessment for tissue engineering therapy of the corneal endothelium.
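A minimal sketch of the morphometric parameters named above (cell density, coefficient of variation, polygonality) is given below, assuming the cell borders have already been segmented so that each cell's area and neighbour count are known; the segmentation step itself, which is the core of the software, is not reproduced here.

```python
# Minimal sketch of endothelial morphometry from already-segmented cells.
# Inputs: per-cell areas in µm² and per-cell neighbour (side) counts.
from typing import List


def endothelial_metrics(areas_um2: List[float], n_sides: List[int]) -> dict:
    mean_area = sum(areas_um2) / len(areas_um2)
    variance = sum((a - mean_area) ** 2 for a in areas_um2) / len(areas_um2)
    return {
        # Cell density in cells/mm² (1 mm² = 1e6 µm²).
        "density_cells_per_mm2": 1e6 / mean_area,
        # Coefficient of variation of cell area, in percent.
        "cv_percent": 100.0 * variance ** 0.5 / mean_area,
        # Polygonality: percentage of hexagonal cells.
        "hexagonality_percent": 100.0 * n_sides.count(6) / len(n_sides),
    }


print(endothelial_metrics([350.0, 410.0, 380.0, 395.0], [6, 6, 5, 7]))
```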
Sugammadex Improves Neuromuscular Function in Patients Receiving Perioperative Steroids.
Ozer, A B; Bolat, E; Erhan, O L; Kilinc, M; Demirel, I; Toprak, G Caglar
2018-02-01
Sugammadex has a steroid-encapsulating effect. This study was undertaken to assess whether the clinical efficacy of sugammadex was altered by the administration of steroids. Sixty patients between 18 and 60 years of age with American Society of Anesthesiologists status I-IV and undergoing elective direct laryngoscopy/biopsy were included in this study. Patients were assigned to two groups based on intraoperative steroid use: those who received a steroid (Group S) and those who did not (Group C). After standard general anesthesia, patients were monitored with train-of-four (TOF) monitoring. The preferred steroid and its dose, the timing of steroid administration, the TOF value before and after sugammadex, and the time to recovery (TOF of 0.9) were recorded. SPSS software version 17.0 was used for statistical analysis. There was no statistically significant difference between groups in terms of age, gender, preoperative medication use, or TOF ratio just before administering sugammadex. The time to reach a TOF of 0.9 after sugammadex administration was significantly shorter in Group S than in Group C (P < 0.05). A within-group comparison in Group S showed no difference in the TOF ratio immediately before sugammadex or in the dose of sugammadex in those who received prednisolone; the time to TOF 0.9 was higher in prednisolone receivers as compared to dexamethasone receivers (P < 0.05). In patients receiving steroids, and particularly dexamethasone, an earlier reversal of neuromuscular block by sugammadex was found, in contrast with what one would expect. Further studies are required to determine the cause of this effect, which is probably due to a potential interaction between sugammadex and steroids.
Ground-Based GPS Sensing of Azimuthal Variations in Precipitable Water Vapor
NASA Technical Reports Server (NTRS)
Kroger, P. M.; Bar-Sever, Y. E.
1997-01-01
Current models for troposphere delay employed by GPS software packages map the total zenith delay to the line-of-sight delay of the individual satellite-receiver link under the assumption of azimuthal homogeneity. This could be a poor approximation for many sites, in particular, those located at an ocean front or next to a mountain range. We have modified the GIPSY-OASIS II software package to include a simple non-symmetric mapping function (MacMillan, 1995) which introduces two new parameters.
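For reference, the azimuthally asymmetric (gradient) mapping attributed to MacMillan (1995) is commonly written as below, with G_N and G_E denoting the two new gradient parameters; this is the standard published form of the model rather than a transcription of the GIPSY-OASIS II implementation.

```latex
% Line-of-sight troposphere delay with an azimuthal gradient term
% (MacMillan, 1995); G_N and G_E are the two estimated gradient parameters.
\[
  \Delta L(\varepsilon,\alpha) =
    m(\varepsilon)\,\Delta L_{z}
    + m(\varepsilon)\cot\varepsilon\,
      \bigl[\,G_{N}\cos\alpha + G_{E}\sin\alpha\,\bigr]
\]
% \varepsilon: elevation angle, \alpha: azimuth, \Delta L_z: zenith delay,
% m(\varepsilon): azimuthally symmetric mapping function.
```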
INRstar: computerised decision support software for anticoagulation management in primary care.
Jones, Robert Treharne; Sullivan, Mark; Barrett, David
2005-01-01
Computerised decision support software (CDSS) for anticoagulation management has become established practice in the UK, offering significant advantages for patients and clinicians over traditional methods of dose calculation. The New GMS Contract has been partly responsible for this shift of management from secondary to primary care, in which INRstar has been the market leader for many years. In September 2004, INRstar received the John Perry Prize, awarded by the PHCSG for excellence and innovation in medical applications of information technology.
Team Software Development for Aerothermodynamic and Aerodynamic Analysis and Design
NASA Technical Reports Server (NTRS)
Alexandrov, N.; Atkins, H. L.; Bibb, K. L.; Biedron, R. T.; Carpenter, M. H.; Gnoffo, P. A.; Hammond, D. P.; Jones, W. T.; Kleb, W. L.; Lee-Rausch, E. M.
2003-01-01
A collaborative approach to software development is described. The approach employs the agile development techniques: project retrospectives, Scrum status meetings, and elements of Extreme Programming to efficiently develop a cohesive and extensible software suite. The software product under development is a fluid dynamics simulator for performing aerodynamic and aerothermodynamic analysis and design. The functionality of the software product is achieved both through the merging, with substantial rewrite, of separate legacy codes and the authorship of new routines. Examples of rapid implementation of new functionality demonstrate the benefits obtained with this agile software development process. The appendix contains a discussion of coding issues encountered while porting legacy Fortran 77 code to Fortran 95, software design principles, and a Fortran 95 coding standard.
Using Combined SFTA and SFMECA Techniques for Space Critical Software
NASA Astrophysics Data System (ADS)
Nicodemos, F. G.; Lahoz, C. H. N.; Abdala, M. A. D.; Saotome, O.
2012-01-01
This work addresses the combined Software Fault Tree Analysis (SFTA) and Software Failure Modes, Effects and Criticality Analysis (SFMECA) techniques applied to space critical software of satellite launch vehicles. The combined approach is under research as part of the Verification and Validation (V&V) efforts to increase software dependability and as future application in other projects under development at Instituto de Aeronáutica e Espaço (IAE). The applicability of such approach was conducted on system software specification and applied to a case study based on the Brazilian Satellite Launcher (VLS). The main goal is to identify possible failure causes and obtain compensating provisions that lead to inclusion of new functional and non-functional system software requirements.
Software development predictors, error analysis, reliability models and software metric analysis
NASA Technical Reports Server (NTRS)
Basili, Victor
1983-01-01
The use of dynamic characteristics as predictors for software development was studied. It was found that there are some significant factors that could be useful as predictors. From a study on software errors and complexity, it was shown that meaningful results can be obtained which allow insight into software traits and the environment in which it is developed. Reliability models were studied. The research included the field of program testing because the validity of some reliability models depends on the answers to some unanswered questions about testing. In studying software metrics, data collected from seven software engineering laboratory (FORTRAN) projects were examined and three effort reporting accuracy checks were applied to demonstrate the need to validate a data base. Results are discussed.
Implementation of a vector-based tracking loop receiver in a pseudolite navigation system.
So, Hyoungmin; Lee, Taikjin; Jeon, Sanghoon; Kim, Chongwon; Kee, Changdon; Kim, Taehee; Lee, Sanguk
2010-01-01
We propose a vector tracking loop (VTL) algorithm for an asynchronous pseudolite navigation system. It was implemented in a software receiver and experiments in an indoor navigation system were conducted. Test results show that the VTL successfully tracks signals against the near-far problem, one of the major limitations in pseudolite navigation systems, and could improve positioning availability by extending pseudolite navigation coverage.
UWB Tracking Software Development
NASA Technical Reports Server (NTRS)
Gross, Julia; Arndt, Dickey; Ngo, Phong; Phan, Chau; Dusl, John; Ni, Jianjun; Rafford, Melinda
2006-01-01
An Ultra-Wideband (UWB) two-cluster Angle of Arrival (AOA) tracking prototype system is currently being developed and tested at NASA Johnson Space Center for space exploration applications. This talk discusses the software development efforts for this UWB two-cluster AOA tracking system. The role the software plays in this system is to take waveform data from two UWB radio receivers as an input, feed this input into an AOA tracking algorithm, and generate the target position as an output. The architecture of the software (Input/Output Interface and Algorithm Core) will be introduced in this talk. The development of this software has three phases. In Phase I, the software is mostly Matlab driven and calls C++ socket functions to provide the communication links to the radios. This is beneficial in the early stage when it is necessary to frequently test changes in the algorithm. Phase II of the development is to have the software mostly C++ driven and call a Matlab function for the AOA tracking algorithm. This is beneficial in order to send the tracking results to other systems and also to improve the tracking update rate of the system. The third phase is part of future work and is to have the software completely C++ driven with a graphics user interface. This software design enables the fine resolution tracking of the UWB two-cluster AOA tracking system.
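As a much-simplified illustration of what an angle-of-arrival computation from two receivers involves, the sketch below uses a plane-wave time-difference model; the baseline, timing values, and the single-angle formula are assumptions made for illustration and do not represent the actual UWB two-cluster AOA algorithm or its waveform processing.

```python
# Minimal plane-wave AOA sketch from the arrival-time difference at two receivers.
import math

C = 299_792_458.0  # speed of light, m/s


def angle_of_arrival(delta_t_s: float, baseline_m: float) -> float:
    """Angle (degrees) of an incoming plane wave relative to the array broadside,
    given the arrival-time difference between two receivers a baseline apart."""
    s = max(-1.0, min(1.0, C * delta_t_s / baseline_m))
    return math.degrees(math.asin(s))


# A 0.5 ns arrival-time difference across a 0.3 m baseline gives ~30 degrees.
print(f"{angle_of_arrival(0.5e-9, 0.3):.1f} degrees")
```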
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ishihara, T
Currently, the problem at hand is distributing identical copies of OEP and filter software to a large number of farm nodes. One of the common methods used to transfer this software is unicast. The unicast protocol faces the problem of repetitiously sending the same data over the network; since the sending rate is limited, this process becomes a bottleneck. Therefore, one possible solution to this problem lies in creating a reliable multicast protocol. A specific type of multicast protocol is the Bulk Multicast Protocol [4]. This system consists of one sender distributing data to many receivers. The sender delivers data at a given rate of data packets. In response, each receiver replies to the sender with a status packet which contains information about packet loss in the form of Negative Acknowledgments. The probability that a receiver sends a status packet back to the sender is 1/N, where N is the number of receivers, so the protocol is designed to produce approximately 1 status packet for each data packet sent. In this project, we were able to show that the complete transfer of a file to multiple receivers was about 12 times faster with multicast than with unicast. The implementation of this experimental protocol shows remarkable improvement in mass data transfer to a large number of farm machines.
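A minimal sketch of the probabilistic status-packet feedback is shown below; the 1/N reply probability is read from the description above (so that roughly one status packet returns per data packet), and the packet structure is an illustrative assumption rather than the experimental protocol's actual wire format.

```python
# Sketch of probabilistic NAK feedback: each receiver replies with probability 1/N,
# so the sender sees on average one status packet per data packet.
import random
from typing import List, Optional


def maybe_send_status(n_receivers: int, lost_packets: List[int]) -> Optional[dict]:
    """Called by a receiver after each data packet; returns a status packet
    (carrying negative acknowledgments) with probability 1/N, else None."""
    if random.random() < 1.0 / n_receivers:
        return {"nak": list(lost_packets)}
    return None


# Rough check of the expected feedback rate for 100 receivers.
N = 100
trials = 5_000
replies = 0
for _ in range(trials):          # one data packet per trial
    for _ in range(N):           # every receiver decides independently
        if maybe_send_status(N, []) is not None:
            replies += 1
print(f"average status packets per data packet: {replies / trials:.2f}")  # ~1.0
```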
Automated daily quality control analysis for mammography in a multi-unit imaging center.
Sundell, Veli-Matti; Mäkelä, Teemu; Meaney, Alexander; Kaasalainen, Touko; Savolainen, Sauli
2018-01-01
Background The high requirements for mammography image quality necessitate a systematic quality assurance process. Digital imaging allows automation of the image quality analysis, which can potentially improve repeatability and objectivity compared to a visual evaluation made by the users. Purpose To develop an automatic image quality analysis software for daily mammography quality control in a multi-unit imaging center. Material and Methods An automated image quality analysis software using the discrete wavelet transform and multiresolution analysis was developed for the American College of Radiology accreditation phantom. The software was validated by analyzing 60 randomly selected phantom images from six mammography systems and 20 phantom images with different dose levels from one mammography system. The results were compared to a visual analysis made by four reviewers. Additionally, long-term image quality trends of a full-field digital mammography system and a computed radiography mammography system were investigated. Results The automated software produced feature detection levels comparable to visual analysis. The agreement was good in the case of fibers, while the software detected somewhat more microcalcifications and characteristic masses. Long-term follow-up via a quality assurance web portal demonstrated the feasibility of using the software for monitoring the performance of mammography systems in a multi-unit imaging center. Conclusion Automated image quality analysis enables monitoring the performance of digital mammography systems in an efficient, centralized manner.
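A minimal sketch of a wavelet multiresolution step of the kind such an analysis could start from is shown below, assuming the phantom image is already available as a NumPy array; the wavelet, decomposition level, and the per-level energy measure are illustrative assumptions, not the published algorithm.

```python
# Minimal sketch of a 2-D multiresolution decomposition of a phantom image.
# The wavelet, level, and the simple per-level energy score are illustrative.
import numpy as np
import pywt


def detail_energy_by_level(image: np.ndarray, wavelet: str = "db2", level: int = 3):
    """Return the detail-coefficient energy at each decomposition level,
    a crude stand-in for scoring features (fibers, specks, masses) by scale."""
    coeffs = pywt.wavedec2(image, wavelet=wavelet, level=level)
    energies = []
    for detail in coeffs[1:]:  # skip the approximation coefficients
        energies.append(sum(float(np.sum(band ** 2)) for band in detail))
    return energies


phantom = np.random.rand(256, 256)  # placeholder for an ACR phantom image
print(detail_energy_by_level(phantom))
```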
Sreenan, J J; Tbakhi, A; Edinger, M G; Tubbs, R R
1997-02-01
Isotypic control reagents are defined as irrelevant antibodies of the same immunoglobulin class as the relevant reagent antibody in a flow cytometry panel. The use of the isotypic control antibody has been advocated as a necessary quality control measure in analysis of flow cytometry. The purpose of this study was to determine the necessity of an isotypic control antibody in the analysis of CD3+ and CD3+, CD4+ lymphocyte subsets. We performed a prospective study of 46 consecutive patient samples received for lymphocyte subset analysis to determine the need for the isotypic control. For each sample, a sham buffer (autocontrol) and isotypic control reagent were stained for three-color immunofluorescence, processed, and identically analyzed with Attractors software. The Attractors software allowed independent, multiparametric, simultaneous gating; was able to identically and reproducibly process each list mode file; and yielded population data in spreadsheet form. Statistical analysis (Fisher's z test) revealed no difference between the CD3+ autocontrol and CD3+ isotypic control (correlation = 1, P < .0001) or between the CD3+, CD4+ autocontrol and the CD3+, CD4+ isotypic control (correlation = 1, P < .0001). The elimination of the isotypic control reagent resulted in a total cost savings of $3.36 per test. Additionally, the subtraction of isotypic background can artifactually depress population enumeration. The use of an isotypic control antibody is not necessary to analyze flow cytometric data that result in discrete cell populations, such as CD3+ and CD3+, CD4+ lymphocyte subsets. The elimination of this unnecessary quality control measure results in substantial cost savings.
NASA Astrophysics Data System (ADS)
Richter, Dale A.; Higdon, N. S.; Ponsardin, Patrick L.; Sanchez, David; Chyba, Thomas H.; Temple, Doyle A.; Gong, Wei; Battle, Russell; Edmondson, Mika; Futrell, Anne; Harper, David; Haughton, Lincoln; Johnson, Demetra; Lewis, Kyle; Payne-Baggott, Renee S.
2002-01-01
ITT's Advanced Engineering and Sciences Division and the Hampton University Center for Lidar and Atmospheric Sciences Students (CLASS) team have worked closely to design, fabricate, and test an eye-safe, scanning aerosol-lidar system that can be safely deployed and used by students from a variety of disciplines. CLASS is a 5-year undergraduate research training program funded by NASA to provide hands-on atmospheric-science and lidar-technology education. The system is based on a 1.5 micron, 125 mJ, 20 Hz eye-safe optical parametric oscillator (OPO) and will be used by the HU researchers and students to evaluate the biological impact of aerosols, clouds, and pollution, and to address a variety of systems issues. The system design tasks we addressed include the development of software to calculate eye-safety levels and to model lidar performance, implementation of eye-safety features in the lidar transmitter, optimization of the receiver using optical ray tracing software, evaluation of detectors and amplifiers in the near IR, testing of OPO and receiver technology, and development of hardware and software for laser and scanner control and video display of the scan region.
21 CFR 876.1300 - Ingestible telemetric gastrointestinal capsule imaging system.
Code of Federal Regulations, 2012 CFR
2012-04-01
... images of the small bowel with a wireless camera contained in a capsule. This device includes an... receiving/recording unit, a data storage device, computer software to process the images, and accessories...
21 CFR 876.1300 - Ingestible telemetric gastrointestinal capsule imaging system.
Code of Federal Regulations, 2013 CFR
2013-04-01
... images of the small bowel with a wireless camera contained in a capsule. This device includes an... receiving/recording unit, a data storage device, computer software to process the images, and accessories...
21 CFR 876.1300 - Ingestible telemetric gastrointestinal capsule imaging system.
Code of Federal Regulations, 2014 CFR
2014-04-01
... images of the small bowel with a wireless camera contained in a capsule. This device includes an... receiving/recording unit, a data storage device, computer software to process the images, and accessories...
21 CFR 876.1300 - Ingestible telemetric gastrointestinal capsule imaging system.
Code of Federal Regulations, 2011 CFR
2011-04-01
... images of the small bowel with a wireless camera contained in a capsule. This device includes an... receiving/recording unit, a data storage device, computer software to process the images, and accessories...
Analysis of Software Systems for Specialized Computers,
Snippet fragments: ...computer) with given computer hardware and software. The object of study is the software system of a computer, designed for solving a fixed complex of... The purpose of the analysis is to find parameters that characterize the system and its elements during operation, i.e., when servicing the given requirement flow. (Author)
Adaptive reconfigurable V-BLAST type equalizer for cognitive MIMO-OFDM radios
NASA Astrophysics Data System (ADS)
Ozden, Mehmet Tahir
2015-12-01
An adaptive channel-shortening equalizer design for multiple input multiple output-orthogonal frequency division multiplexing (MIMO-OFDM) radio receivers is considered in this presentation. The proposed receiver has desirable features for cognitive and software defined radio implementations. It consists of two sections: a MIMO decision feedback equalizer (MIMO-DFE) and adaptive multiple Viterbi detection. In the MIMO-DFE section, a complete modified Gram-Schmidt orthogonalization of the multichannel input data is accomplished using sequential processing multichannel Givens lattice stages, so that a Vertical Bell Laboratories Layered Space Time (V-BLAST) type MIMO-DFE is realized at the front-end section of the channel-shortening equalizer. Matrix operations, a major bottleneck for receiver operations, are accordingly avoided, and only scalar operations are used. A highly modular and regular radio receiver architecture is achieved, with a structure suitable for digital signal processing (DSP) chip and field-programmable gate array (FPGA) implementations, which are important for software defined radio realizations. The MIMO-DFE section of the proposed receiver can also be reconfigured for spectrum sensing and positioning functions, which are important tasks for cognitive radio applications. In connection with the adaptive multiple Viterbi detection section, a systolic array implementation for each channel is performed so that a receiver architecture with high computational concurrency is attained. The total computational complexity is given in terms of the equalizer and desired-response filter lengths, alphabet size, and number of antennas. The performance of the proposed receiver is presented for the two-channel case by means of mean squared error (MSE) and probability of error evaluations, which are conducted for time-invariant and time-variant channel conditions, orthogonal and nonorthogonal transmissions, and two different modulation schemes.
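As a conceptual illustration of modified Gram-Schmidt orthogonalization applied to a block of multichannel input samples, a plain matrix version is sketched below; the receiver described above realizes this with sequential Givens lattice stages and scalar-only operations, which this sketch does not attempt to reproduce.

```python
# Conceptual sketch of modified Gram-Schmidt (MGS) QR factorization of a
# samples-by-channels data block. Illustrative only; not the lattice form.
import numpy as np


def modified_gram_schmidt(X: np.ndarray):
    """QR-factorize X (samples x channels) column by column."""
    X = X.astype(float).copy()
    n_samples, n_channels = X.shape
    Q = np.zeros_like(X)
    R = np.zeros((n_channels, n_channels))
    for k in range(n_channels):
        R[k, k] = np.linalg.norm(X[:, k])
        Q[:, k] = X[:, k] / R[k, k]
        for j in range(k + 1, n_channels):
            # Remove the component of later channels along the new basis vector.
            R[k, j] = Q[:, k] @ X[:, j]
            X[:, j] -= R[k, j] * Q[:, k]
    return Q, R


Q, R = modified_gram_schmidt(np.random.randn(64, 4))
print(np.allclose(Q.T @ Q, np.eye(4)))  # columns are orthonormal
```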
Flexible Software Architecture for Visualization and Seismic Data Analysis
NASA Astrophysics Data System (ADS)
Petunin, S.; Pavlov, I.; Mogilenskikh, D.; Podzyuban, D.; Arkhipov, A.; Baturuin, N.; Lisin, A.; Smith, A.; Rivers, W.; Harben, P.
2007-12-01
Research in the field of seismology requires software and signal processing utilities for seismogram manipulation and analysis. Seismologists and data analysts often encounter a major problem in the use of any particular software application specific to seismic data analysis: the tuning of commands and windows to the specific waveforms and hot key combinations so as to fit their familiar informational environment. The ability to modify the user's interface independently from the developer requires an adaptive code structure. An adaptive code structure also allows for expansion of software capabilities such as new signal processing modules and implementation of more efficient algorithms. Our approach is to use a flexible "open" architecture for development of geophysical software. This report presents an integrated solution for organizing a logical software architecture based on the Unix version of the Geotool software implemented on the Microsoft .NET 2.0 platform. Selection of this platform greatly expands the variety and number of computers that can implement the software, including laptops that can be utilized in field conditions. It also facilitates implementation of communication functions for seismic data requests from remote databases through the Internet. The main principle of the new architecture for Geotool is that scientists should be able to add new routines for digital waveform analysis via software plug-ins that utilize the basic Geotool display for GUI interaction. The use of plug-ins allows the efficient integration of diverse signal-processing software, including software still in preliminary development, into an organized platform without changing the fundamental structure of that platform itself. An analyst's use of Geotool is tracked via a metadata file so that future studies can reconstruct, and alter, the original signal processing operations. The work has been completed in the framework of a joint Russian-American project.
Influence analysis of Github repositories.
Hu, Yan; Zhang, Jun; Bai, Xiaomei; Yu, Shuo; Yang, Zhuo
2016-01-01
With the support of cloud computing techniques, social coding platforms have changed the style of software development. Github is now the most popular social coding platform and project hosting service. Software developers of all levels keep joining Github and use it to host their public and private software projects. The large numbers of software developers and software repositories on Github pose new challenges to the world of software engineering. This paper tackles one of the important problems: analyzing the importance and influence of Github repositories. We propose a HITS-based influence analysis on graphs that represent the star relationship between Github users and repositories. A weighted version of HITS is applied to the overall star graph and generates a different set of top influential repositories than the standard version of the HITS algorithm. We also conduct the influence analysis on per-month star graphs and study the monthly influence ranking of top repositories.
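As a rough illustration of the weighted HITS iteration the paper applies to the user-repository star graph, the following sketch runs hub/authority updates on a small bipartite weight matrix; the toy weights and data are assumptions, not the authors' implementation or the Github data set.

```python
import numpy as np

def weighted_hits(W, iters=50):
    """Weighted HITS on a bipartite star graph.
    W[u, r] is the (non-negative) weight of the star edge user u -> repo r.
    Returns hub scores for users and authority scores for repositories."""
    n_users, n_repos = W.shape
    hubs = np.ones(n_users)
    auth = np.ones(n_repos)
    for _ in range(iters):
        auth = W.T @ hubs          # repos gather weight from starring users
        hubs = W @ auth            # users gather weight from starred repos
        auth /= np.linalg.norm(auth)
        hubs /= np.linalg.norm(hubs)
    return hubs, auth

if __name__ == "__main__":
    # Toy star matrix: 4 users x 3 repositories (weights could encode,
    # e.g., user activity level instead of plain 0/1 stars).
    W = np.array([[1.0, 0.5, 0.0],
                  [1.0, 0.0, 0.0],
                  [0.0, 2.0, 1.0],
                  [0.0, 0.0, 1.0]])
    hubs, auth = weighted_hits(W)
    print("repository influence ranking:", np.argsort(-auth))
```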
Pal, P; Kumar, R; Srivastava, N; Chaudhuri, J
2014-02-01
Visual Basic simulation software (WATTPPA) has been developed to analyse the performance of an advanced wastewater treatment plant. This user-friendly, menu-driven software is based on a dynamic mathematical model for an industrial wastewater treatment scheme that integrates chemical, biological and membrane-based unit operations. The software-predicted results agree very well with the experimental findings, as indicated by an overall correlation coefficient of the order of 0.99. The software permits pre-analysis and manipulation of input data, helps in optimization and displays the performance of an integrated plant visually on a graphical platform. It allows quick performance analysis of the whole system as well as of the individual units. The software, the first of its kind in its domain and built in the well-known Microsoft Excel environment, is likely to be very useful in the successful design, optimization and operation of an advanced hybrid treatment plant for hazardous wastewater.
Retrieval and Validation of Zenith and Slant Path Delays From the Irish GPS Network
NASA Astrophysics Data System (ADS)
Hanafin, Jennifer; Jennings, S. Gerard; O'Dowd, Colin; McGrath, Ray; Whelan, Eoin
2010-05-01
Retrieval of atmospheric integrated water vapour (IWV) from ground-based GPS receivers and provision of this data product for meteorological applications has been the focus of a number of Europe-wide networks and projects, most recently the EUMETNET GPS water vapour programme. The results presented here are from a project to provide such information about the state of the atmosphere around Ireland for climate monitoring and improved numerical weather prediction. Two geodetic reference GPS receivers have been deployed at Valentia Observatory in Co. Kerry and Mace Head Atmospheric Research Station in Co. Galway, Ireland. These two receivers supplement the existing Ordnance Survey Ireland active network of 17 permanent ground-based receivers. A system to retrieve column-integrated atmospheric water vapour from the data provided by this network has been developed, based on the GPS Analysis at MIT (GAMIT) software package. The data quality of the zenith retrievals has been assessed using co-located radiosondes at the Valentia site and observations from a microwave profiling radiometer at the Mace Head site. Validation of the slant path retrievals requires a numerical weather prediction model and HIRLAM (High-Resolution Limited Area Model) version 7.2, the current operational forecast model in use at Met Éireann for the region, has been used for this validation work. Results from the data processing and comparisons with the independent observations and model will be presented.
Experience report: Using formal methods for requirements analysis of critical spacecraft software
NASA Technical Reports Server (NTRS)
Lutz, Robyn R.; Ampo, Yoko
1994-01-01
Formal specification and analysis of requirements continues to gain support as a method for producing more reliable software. However, the introduction of formal methods to a large software project is difficult, due in part to the unfamiliarity of the specification languages and the lack of graphics. This paper reports results of an investigation into the effectiveness of formal methods as an aid to the requirements analysis of critical, system-level fault-protection software on a spacecraft currently under development. Our experience indicates that formal specification and analysis can enhance the accuracy of the requirements and add assurance prior to design development in this domain. The work described here is part of a larger, NASA-funded research project whose purpose is to use formal-methods techniques to improve the quality of software in space applications. The demonstration project described here is part of the effort to evaluate experimentally the effectiveness of supplementing traditional engineering approaches to requirements specification with the more rigorous specification and analysis available with formal methods.
In-shoe plantar pressure measurement and analysis system based on fabric pressure sensing array.
Shu, Lin; Hua, Tao; Wang, Yangyong; Qiao Li, Qiao; Feng, David Dagan; Tao, Xiaoming
2010-05-01
Spatial and temporal plantar pressure distributions are important and useful measures in footwear evaluation, athletic training, clinical gait analysis, and pathological foot diagnosis. However, present plantar pressure measurement and analysis systems are more or less uncomfortable to wear and expensive. This paper presents an in-shoe plantar pressure measurement and analysis system based on a textile fabric sensor array, which is soft, light, and has high pressure sensitivity and a long service life. The sensors are connected with a soft polymeric board through conductive yarns and integrated into an insole. A stable data acquisition system interfaces with the insole and wirelessly transmits the acquired data to a remote receiver over a Bluetooth link. Three configuration modes allow connection to a desktop, laptop, or smartphone, so the system can be configured to work comfortably in research laboratories, clinics, sports grounds, and other outdoor environments. Real-time display and analysis software is presented to calculate parameters such as mean pressure, peak pressure, center of pressure (COP), and shift speed of the COP. Experimental results show that this system has stable performance in both static and dynamic measurements.
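As a hedged sketch of how parameters such as mean pressure, peak pressure, COP and COP shift speed can be computed from sampled sensor data (the sensor coordinates, array size and sampling rate below are illustrative assumptions, not the published insole layout):

```python
import numpy as np

def frame_stats(p, xy):
    """Mean pressure, peak pressure and COP for one frame.
    p  : pressures of the N in-shoe sensors (length-N array)
    xy : N x 2 array of sensor coordinates in metres."""
    total = p.sum()
    cop = (xy * p[:, None]).sum(axis=0) / total if total > 0 else np.full(2, np.nan)
    return p.mean(), p.max(), cop

def cop_speed(cops, fs):
    """Average shift speed of the COP over a sequence of frames sampled at fs Hz."""
    steps = np.linalg.norm(np.diff(cops, axis=0), axis=1)
    return steps.mean() * fs

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    xy = rng.uniform(0, 0.25, size=(48, 2))          # 48 sensors over a 25 cm insole
    frames = rng.uniform(0, 200, size=(100, 48))     # 100 frames of pressure (kPa)
    cops = np.array([frame_stats(f, xy)[2] for f in frames])
    print("mean COP shift speed [m/s]:", cop_speed(cops, fs=50))
```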
Software design for analysis of multichannel intracardial and body surface electrocardiograms.
Potse, Mark; Linnenbank, André C; Grimbergen, Cornelis A
2002-11-01
Analysis of multichannel ECG recordings (body surface maps (BSMs) and intracardial maps) requires special software. We created a software package and a user interface on top of a commercial data analysis package (MATLAB) by a combination of high-level and low-level programming. Our software was created to satisfy the needs of a diverse group of researchers. It can handle a large variety of recording configurations. It allows for interactive usage through a fast and robust user interface, and batch processing for the analysis of large amounts of data. The package is user-extensible, includes routines for both common and experimental data processing tasks, and works on several computer platforms. The source code is made intelligible using software for structured documentation and is available to the users. The package is currently used by more than ten research groups analysing ECG data worldwide.
Learning from examples - Generation and evaluation of decision trees for software resource analysis
NASA Technical Reports Server (NTRS)
Selby, Richard W.; Porter, Adam A.
1988-01-01
A general solution method for the automatic generation of decision (or classification) trees is investigated. The approach is to provide insights through in-depth empirical characterization and evaluation of decision trees for software resource data analysis. The trees identify classes of objects (software modules) that had high development effort. Sixteen software systems ranging from 3,000 to 112,000 source lines were selected for analysis from a NASA production environment. The collection and analysis of 74 attributes (or metrics), for over 4,700 objects, captured information about the development effort, faults, changes, design style, and implementation style. A total of 9,600 decision trees were automatically generated and evaluated. The trees correctly identified 79.3 percent of the software modules that had high development effort or faults, and the trees generated from the best parameter combinations correctly identified 88.4 percent of the modules on the average.
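A minimal sketch of the same idea with present-day tooling: a decision tree trained to flag "high effort" modules from a few synthetic metrics. The feature names, labels and scikit-learn classifier are assumptions for illustration, not the original tree-generation method or the NASA data.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 400

# Synthetic module metrics: source lines, cyclomatic complexity, change count.
X = np.column_stack([
    rng.integers(50, 3000, n),      # lines of code
    rng.integers(1, 60, n),         # cyclomatic complexity
    rng.integers(0, 40, n),         # number of changes
])
# Label a module "high effort" from a noisy combination of the metrics.
effort = 0.01 * X[:, 0] + 0.8 * X[:, 1] + 1.5 * X[:, 2] + rng.normal(0, 10, n)
y = (effort > np.percentile(effort, 75)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", tree.score(X_te, y_te))
```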
78 FR 1162 - Cardiovascular Devices; Reclassification of External Cardiac Compressor
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-08
... safety and electromagnetic compatibility; For devices containing software, software verification... electromagnetic compatibility; For devices containing software, software verification, validation, and hazard... electrical components, appropriate analysis and testing must validate electrical safety and electromagnetic...
FASEA: A FPGA Acquisition System and Software Event Analysis for liquid scintillation counting
NASA Astrophysics Data System (ADS)
Steele, T.; Mo, L.; Bignell, L.; Smith, M.; Alexiev, D.
2009-10-01
The FASEA (FPGA based Acquisition and Software Event Analysis) system has been developed to replace the MAC3 for coincidence pulse processing. The system uses a National Instruments Virtex 5 FPGA card (PXI-7842R) for data acquisition and purpose-developed software for data analysis. Initial comparisons to the MAC3 unit, based on measurements of 89Sr and 3H, confirm that the system is able to accurately emulate the behaviour of the MAC3 unit.
ERIC Educational Resources Information Center
Rudner, Lawrence M.; Glass, Gene V.; Evartt, David L.; Emery, Patrick J.
This manual and the accompanying software are intended to provide a step-by-step guide to conducting a meta-analytic study along with references for further reading and free high-quality software, "Meta-Stat." "Meta-Stat" is a comprehensive package designed to help in the meta-analysis of research studies in the social and behavioral sciences.…
Development of an automated asbestos counting software based on fluorescence microscopy.
Alexandrov, Maxym; Ichida, Etsuko; Nishimura, Tomoki; Aoki, Kousuke; Ishida, Takenori; Hirota, Ryuichi; Ikeda, Takeshi; Kawasaki, Tetsuo; Kuroda, Akio
2015-01-01
An emerging alternative to the commonly used analytical methods for asbestos analysis is fluorescence microscopy (FM), which relies on highly specific asbestos-binding probes to distinguish asbestos from interfering non-asbestos fibers. However, all types of microscopic asbestos analysis require laborious examination of a large number of fields of view and are prone to subjective errors and large variability between asbestos counts by different analysts and laboratories. A possible solution to these problems is automated counting of asbestos fibers by image analysis software, which would lower the cost and increase the reliability of asbestos testing. This study seeks to develop fiber recognition and counting software for FM-based asbestos analysis. We discuss the main features of the developed software and the results of its testing. Software testing showed good correlation between automated and manual counts for samples with medium and high fiber concentrations. At low fiber concentrations, the automated counts were less accurate, leading us to implement a correction mode for automated counts. While the full automation of asbestos analysis would require further improvements in the accuracy of fiber identification, the developed software could already assist professional asbestos analysts and record detailed fiber dimensions for use in epidemiological research.
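One common way to recognize and count fibers in a fluorescence image is to threshold, label connected components, and keep only sufficiently long, elongated objects. The sketch below illustrates that generic approach with scikit-image on a synthetic image; the thresholds are assumptions for illustration, not the authors' recognition criteria.

```python
import numpy as np
from skimage.measure import label, regionprops

def count_fibers(img, intensity_thresh=0.5, min_length=20, min_elongation=4.0):
    """Count bright, elongated objects in a grayscale fluorescence image."""
    mask = img > intensity_thresh
    labeled = label(mask)
    count = 0
    for region in regionprops(labeled):
        major = region.major_axis_length
        minor = max(region.minor_axis_length, 1e-6)  # avoid division by zero
        if major >= min_length and major / minor >= min_elongation:
            count += 1
    return count

if __name__ == "__main__":
    # Synthetic image with two thin bright "fibers" on a dark background.
    img = np.zeros((200, 200))
    img[50, 30:120] = 1.0      # horizontal fiber, 90 px long
    img[100:160, 150] = 1.0    # vertical fiber, 60 px long
    print("fibers counted:", count_fibers(img))
```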
IFDOTMETER: A New Software Application for Automated Immunofluorescence Analysis.
Rodríguez-Arribas, Mario; Pizarro-Estrella, Elisa; Gómez-Sánchez, Rubén; Yakhine-Diop, S M S; Gragera-Hidalgo, Antonio; Cristo, Alejandro; Bravo-San Pedro, Jose M; González-Polo, Rosa A; Fuentes, José M
2016-04-01
Most laboratories interested in autophagy use different imaging software for managing and analyzing heterogeneous parameters in immunofluorescence experiments (e.g., LC3-puncta quantification and determination of the number and size of lysosomes). One solution would be software that works on a user's laptop or workstation that can access all image settings and provide quick and easy-to-use analysis of data. Thus, we have designed and implemented an application called IFDOTMETER, which can run on all major operating systems because it has been programmed using JAVA (Sun Microsystems). Briefly, IFDOTMETER software has been created to quantify a variety of biological hallmarks, including mitochondrial morphology and nuclear condensation. The program interface is intuitive and user-friendly, making it useful for users not familiar with computer handling. By setting previously defined parameters, the software can automatically analyze a large number of images without the supervision of the researcher. Once analysis is complete, the results are stored in a spreadsheet. Using software for high-throughput cell image analysis offers researchers the possibility of performing comprehensive and precise analysis of a high number of images in an automated manner, making this routine task easier. © 2015 Society for Laboratory Automation and Screening.
FluxPyt: a Python-based free and open-source software for 13C-metabolic flux analyses.
Desai, Trunil S; Srivastava, Shireesh
2018-01-01
13C-Metabolic flux analysis (MFA) is a powerful approach to estimate intracellular reaction rates which could be used in strain analysis and design. Processing and analysis of labeling data for calculation of fluxes and associated statistics is an essential part of MFA. However, various software currently available for data analysis employ proprietary platforms and thus limit accessibility. We developed FluxPyt, a Python-based truly open-source software package for conducting stationary 13C-MFA data analysis. The software is based on the efficient elementary metabolite unit framework. The standard deviations in the calculated fluxes are estimated using the Monte-Carlo analysis. FluxPyt also automatically creates flux maps based on a template for visualization of the MFA results. The flux distributions calculated by FluxPyt for two separate models: a small tricarboxylic acid cycle model and a larger Corynebacterium glutamicum model, were found to be in good agreement with those calculated by a previously published software. FluxPyt was tested in Microsoft™ Windows 7 and 10, as well as in Linux Mint 18.2. The availability of a free and open 13C-MFA software that works in various operating systems will enable more researchers to perform 13C-MFA and to further modify and develop the package.
FluxPyt: a Python-based free and open-source software for 13C-metabolic flux analyses
Desai, Trunil S.
2018-01-01
13C-Metabolic flux analysis (MFA) is a powerful approach to estimate intracellular reaction rates which could be used in strain analysis and design. Processing and analysis of labeling data for calculation of fluxes and associated statistics is an essential part of MFA. However, various software currently available for data analysis employ proprietary platforms and thus limit accessibility. We developed FluxPyt, a Python-based truly open-source software package for conducting stationary 13C-MFA data analysis. The software is based on the efficient elementary metabolite unit framework. The standard deviations in the calculated fluxes are estimated using the Monte-Carlo analysis. FluxPyt also automatically creates flux maps based on a template for visualization of the MFA results. The flux distributions calculated by FluxPyt for two separate models: a small tricarboxylic acid cycle model and a larger Corynebacterium glutamicum model, were found to be in good agreement with those calculated by a previously published software. FluxPyt was tested in Microsoft™ Windows 7 and 10, as well as in Linux Mint 18.2. The availability of a free and open 13C-MFA software that works in various operating systems will enable more researchers to perform 13C-MFA and to further modify and develop the package. PMID:29736347
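Both records above note that FluxPyt estimates flux standard deviations by Monte-Carlo analysis. The following is a generic, hedged sketch of that idea — perturb the measurements with their noise, refit, and take the spread of the refitted parameters as the uncertainty — using a placeholder linear model rather than the EMU-based flux model the software actually fits.

```python
import numpy as np

def fit(x, y):
    """Placeholder model fit: least-squares slope and intercept."""
    return np.polyfit(x, y, deg=1)

def monte_carlo_sd(x, y, sigma, n_draws=1000, seed=0):
    """Std. dev. of fitted parameters from resampled measurement noise."""
    rng = np.random.default_rng(seed)
    draws = np.array([fit(x, y + rng.normal(0.0, sigma, size=y.size))
                      for _ in range(n_draws)])
    return draws.std(axis=0)

if __name__ == "__main__":
    x = np.linspace(0, 10, 20)
    y_true = 2.0 * x + 1.0
    y_meas = y_true + np.random.default_rng(1).normal(0, 0.5, x.size)
    params = fit(x, y_meas)
    sd = monte_carlo_sd(x, y_meas, sigma=0.5)
    print("slope, intercept:", params, "+/-", sd)
```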
Padalino, Saverio; Sfondrini, Maria Francesca; Chenuil, Laura; Scudeller, Luigia; Gandini, Paola
2014-12-01
The aim of this study was to assess the feasibility of skeletal maturation analysis using the Cervical Vertebrae Maturation (CVM) method by means of dedicated software, developed in collaboration with Outside Format (Paullo-Milan), as compared with manual analysis. From a sample of patients aged 7-21 years, we gathered 100 lateral cephalograms, 20 for each of the five CVM stages. For each cephalogram, we traced cervical vertebrae C2, C3 and C4 both by hand, using a lead pencil and an acetate sheet, and with the dedicated software. All the tracings were made by an experienced operator (a dentofacial orthopedics resident) and by an inexperienced operator (a student in dental surgery). Each operator recorded the time needed to make each tracing in order to demonstrate differences in the times taken. Concordance between the manual analysis and the analysis performed using the dedicated software was 94% for the resident and 93% for the student. Interobserver concordance was 99%. Hand-tracing was quicker than tracing with the software, which took 28 seconds longer on average. The cervical vertebrae analysis software offers excellent clinical performance, even if the method takes longer than the manual technique. Copyright © 2014 Elsevier Masson SAS. All rights reserved.
Suh, Soon-Rim; Lee, Myung Kyung
2017-07-01
The aim was to evaluate the effects of nurse-led telephone-based supportive interventions (NTSIs) for patients with cancer. Electronic databases, including EMBASE®, MEDLINE, Google Scholar, Cochrane Library CENTRAL, ProQuest Medical Library, and CINAHL®, were searched through February 2016. 239 studies were identified; 16 were suitable for meta-analysis. Cochrane's risk of bias tool and the Comprehensive Meta-Analysis software were used. The authors performed a meta-analysis of 16 trials that met eligibility criteria. Thirteen randomized, controlled trials (RCTs) and three non-RCTs examined a total of 2,912 patients with cancer. Patients who received NTSIs were compared with those who received attentional control or usual care (no intervention). Telephone interventions delivered by a nurse in an oncology care setting reduced cancer symptoms with a moderate effect size (ES) (-0.33) and emotional distress with a small ES (-0.12), and improved self-care with a large ES (0.64) and health-related quality of life (HRQOL) with a small ES (0.3). Subgroup analyses indicated that the significant effects of NTSIs on cancer symptoms, emotional distress, and HRQOL were larger for studies that applied a theoretical framework, had a control group given usual care, and used an RCT design. The findings suggest that an additional tiered evaluation with a theoretical underpinning and high-quality methodology is required to confirm the efficacy of NTSIs for adoption in specific care models.
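The review pools trial effect sizes with meta-analysis software. Purely as an illustration of the inverse-variance pooling step that underlies such tools (a fixed-effect model with invented toy effect sizes and variances; the review's actual model and data may differ):

```python
import numpy as np

def fixed_effect_pool(effects, variances):
    """Inverse-variance weighted pooled effect size and its standard error."""
    effects = np.asarray(effects, dtype=float)
    w = 1.0 / np.asarray(variances, dtype=float)
    pooled = np.sum(w * effects) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return pooled, se

if __name__ == "__main__":
    # Toy standardized mean differences and variances for five trials.
    d = [-0.40, -0.25, -0.35, -0.10, -0.50]
    v = [0.04, 0.02, 0.05, 0.03, 0.06]
    pooled, se = fixed_effect_pool(d, v)
    print(f"pooled effect: {pooled:.3f} (95% CI {pooled-1.96*se:.3f} to {pooled+1.96*se:.3f})")
```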
Intelligent sensor and controller framework for the power grid
Akyol, Bora A.; Haack, Jereme Nathan; Craig, Jr., Philip Allen; Tews, Cody William; Kulkarni, Anand V.; Carpenter, Brandon J.; Maiden, Wendy M.; Ciraci, Selim
2015-07-28
Disclosed below are representative embodiments of methods, apparatus, and systems for monitoring and using data in an electric power grid. For example, one disclosed embodiment comprises a sensor for measuring an electrical characteristic of a power line, electrical generator, or electrical device; a network interface; a processor; and one or more computer-readable storage media storing computer-executable instructions. In this embodiment, the computer-executable instructions include instructions for implementing an authorization and authentication module for validating a software agent received at the network interface; instructions for implementing one or more agent execution environments for executing agent code that is included with the software agent and that causes data from the sensor to be collected; and instructions for implementing an agent packaging and instantiation module for storing the collected data in a data container of the software agent and for transmitting the software agent, along with the stored data, to a next destination.
Ada education in a software life-cycle context
NASA Technical Reports Server (NTRS)
Clough, Anne J.
1986-01-01
Some of the experience gained from a comprehensive educational program undertaken at The Charles Stark Draper Lab. to introduce the Ada language and to transition modern software engineering technology into the development of Ada and non-Ada applications is described. Initially, a core group, which included managers, engineers and programmers, received training in Ada. An Ada Office was established to assume the major responsibility for training, evaluation, acquisition and benchmarking of tools, and consultation on Ada projects. As a first step in this process, an in-house educational program was undertaken to introduce Ada to the Laboratory. Later, a software engineering course was added to the educational program as the need to address issues spanning the entire software life cycle became evident. Educational efforts to date are summarized, with an emphasis on the educational approach adopted. Finally, lessons learned in administering this program are addressed.
Low Power, Low Mass, Modular, Multi-band Software-defined Radios
NASA Technical Reports Server (NTRS)
Haskins, Christopher B. (Inventor); Millard, Wesley P. (Inventor)
2013-01-01
Methods and systems to implement and operate software-defined radios (SDRs). An SDR may be configured to perform a combination of fractional and integer frequency synthesis and direct digital synthesis under control of a digital signal processor, which may provide a set of relatively agile, flexible, low-noise, and low spurious, timing and frequency conversion signals, and which may be used to maintain a transmit path coherent with a receive path. Frequency synthesis may include dithering to provide additional precision. The SDR may include task-specific software-configurable systems to perform tasks in accordance with software-defined parameters or personalities. The SDR may include a hardware interface system to control hardware components, and a host interface system to provide an interface to the SDR with respect to a host system. The SDR may be configured for one or more of communications, navigation, radio science, and sensors.
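The claimed radio combines frequency synthesis with direct digital synthesis (DDS) and uses dithering for additional precision. Below is a rough, generic sketch of a DDS phase accumulator with a dithered tuning word; the register width, clock rate and dither amplitude are illustrative assumptions, not the patent's design.

```python
import numpy as np

def dds(freq_hz, fclk_hz, n_samples, acc_bits=32, dither_lsb=1.0, seed=0):
    """Direct digital synthesis: phase-accumulator sine generator.
    A small random dither on the tuning word spreads quantization spurs."""
    rng = np.random.default_rng(seed)
    acc_max = 2 ** acc_bits
    tuning_word = freq_hz / fclk_hz * acc_max        # ideal (fractional) word
    phase = 0.0
    out = np.empty(n_samples)
    for i in range(n_samples):
        out[i] = np.sin(2 * np.pi * phase / acc_max)
        # Quantize the tuning word each cycle with +/- dither_lsb of dither.
        step = np.round(tuning_word + rng.uniform(-dither_lsb, dither_lsb))
        phase = (phase + step) % acc_max
    return out

if __name__ == "__main__":
    samples = dds(freq_hz=1.0e6, fclk_hz=50.0e6, n_samples=4096)
    print("generated", samples.size, "samples; peak amplitude", samples.max())
```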
Applications of software-defined radio (SDR) technology in hospital environments.
Chávez-Santiago, Raúl; Mateska, Aleksandra; Chomu, Konstantin; Gavrilovska, Liljana; Balasingham, Ilangko
2013-01-01
A software-defined radio (SDR) is a radio communication system where the major part of its functionality is implemented by means of software in a personal computer or embedded system. Such a design paradigm has the major advantage of producing devices that can receive and transmit widely different radio protocols based solely on the software used. This flexibility opens several application opportunities in hospital environments, where a large number of wired and wireless electronic devices must coexist in confined areas like operating rooms and intensive care units. This paper outlines some possible applications in the 2360-2500 MHz frequency band. These applications include the integration of wireless medical devices in a common communication platform for seamless interoperability, and cognitive radio (CR) for body area networks (BANs) and wireless sensor networks (WSNs) for medical environmental surveillance. The description of a proof-of-concept CR prototype is also presented.
Intelligent sensor and controller framework for the power grid
DOE Office of Scientific and Technical Information (OSTI.GOV)
Akyol, Bora A.; Haack, Jereme Nathan; Craig, Jr., Philip Allen
Disclosed below are representative embodiments of methods, apparatus, and systems for monitoring and using data in an electric power grid. For example, one disclosed embodiment comprises a sensor for measuring an electrical characteristic of a power line, electrical generator, or electrical device; a network interface; a processor; and one or more computer-readable storage media storing computer-executable instructions. In this embodiment, the computer-executable instructions include instructions for implementing an authorization and authentication module for validating a software agent received at the network interface; instructions for implementing one or more agent execution environments for executing agent code that is included with the software agent and that causes data from the sensor to be collected; and instructions for implementing an agent packaging and instantiation module for storing the collected data in a data container of the software agent and for transmitting the software agent, along with the stored data, to a next destination.
Three Years of Global Positioning System Experience on International Space Station
NASA Technical Reports Server (NTRS)
Gomez, Susan
2005-01-01
The International Space Station global positioning system (GPS) receiver was activated in April 2002. Since that time, numerous software anomalies surfaced that had to be worked around. Some of the software problems required waivers, such as the time function, while others required extensive operator intervention, such as numerous power cycles. Eventually, enough anomalies surfaced that the three pieces of code included in the GPS unit have been re-written and the GPS units were upgraded. The technical aspects of the problems are discussed, as well as the underlying causes that led to the delivery of a product that has had numerous problems. The technical aspects of the problems included physical phenomena that were not well understood, such as the effect that the ionosphere would have on the GPS measurements. The underlying causes were traced to inappropriate use of legacy software, changing requirements, inadequate software processes, unrealistic schedules, incorrect contract type, and unclear ownership responsibilities.
Three Years of Global Positioning System Experience on International Space Station
NASA Technical Reports Server (NTRS)
Gomez, Susan
2006-01-01
The International Space Station global positioning system (GPS) receiver was activated in April 2002. Since that time, numerous software anomalies surfaced that had to be worked around. Some of the software problems required waivers, such as the time function, while others required extensive operator intervention, such as numerous power cycles. Eventually, enough anomalies surfaced that the three pieces of code included in the GPS unit have been re-written and the GPS units upgraded. The technical aspects of the problems are discussed, as well as the underlying causes that led to the delivery of a product that has had so many problems. The technical aspects of the problems included physical phenomena that were not well understood, such as the effect that the ionosphere would have on the GPS measurements. The underlying causes were traced to inappropriate use of legacy software, changing requirements, inadequate software processes, unrealistic schedules, incorrect contract type, and unclear ownership responsibilities.
Software design of a remote real-time ECG monitoring system
NASA Astrophysics Data System (ADS)
Yu, Chengbo; Tao, Hongyan
2005-12-01
Heart disease is one of the main diseases that threaten the health and lives of human beings. At present, typical remote ECG monitoring systems have the disadvantages of a short testing distance and a limited number of monitoring lines. Because of accidents and paroxysmal disease, ECG monitoring has extended from the hospital to the home. Therefore, remote ECG monitoring through the Internet has practical value and significance. The principle and design method of the software of a remote dynamic ECG monitor are presented and discussed. The monitoring software is programmed in Delphi and based on a client-server interactive mode. The application program of the system, which makes use of multithreading technology, is shown to perform in an excellent manner. The program handles remote user connections and ECG processing, i.e., receiving, real-time display, recording, and replay of ECG data. The system can connect many clients simultaneously and perform real-time monitoring of patients.
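The monitor described above is written in Delphi; purely to illustrate the multithreaded client-server receive pattern it relies on, here is a minimal Python sketch in which a server accepts many monitoring clients and reads a stream of ECG samples from each. The port and the one-float-per-message framing are assumptions for illustration, not the paper's protocol.

```python
import socket
import struct
import threading

HOST, PORT = "0.0.0.0", 9090      # illustrative address, not the paper's setup

def handle_patient(conn, addr):
    """Receive a stream of ECG samples (one float32 each) from one client."""
    with conn:
        while True:
            data = conn.recv(4)
            if len(data) < 4:
                break
            (sample,) = struct.unpack("!f", data)
            # A real monitor would display, record and analyse the sample here.
            print(f"{addr}: {sample:.3f} mV")

def serve():
    """Accept many monitoring clients at once, one thread per connection."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, PORT))
        srv.listen()
        while True:
            conn, addr = srv.accept()
            threading.Thread(target=handle_patient, args=(conn, addr), daemon=True).start()

if __name__ == "__main__":
    serve()
```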
New software for 3D fracture network analysis and visualization
NASA Astrophysics Data System (ADS)
Song, J.; Noh, Y.; Choi, Y.; Um, J.; Hwang, S.
2013-12-01
This study presents new software to perform analysis and visualization of a fracture network system in 3D. The software modules for analysis and visualization, such as BOUNDARY, DISK3D, FNTWK3D, CSECT and BDM, have been developed using Microsoft Visual Basic.NET and the Visualization Toolkit (VTK) open-source library. Two case studies revealed that the modules respectively play roles in construction of the analysis domain, visualization of fracture geometry in 3D, calculation of equivalent pipes, production of cross-section maps and management of borehole data. The developed software for analysis and visualization of 3D fractured rock masses can be used to tackle geomechanical problems related to the strength, deformability and hydraulic behavior of fractured rock masses.
Byrska-Bishop, Marta; Wallace, John; Frase, Alexander T; Ritchie, Marylyn D
2018-01-01
Motivation: BioBin is an automated bioinformatics tool for the multi-level biological binning of sequence variants. Herein, we present a significant update to BioBin which expands the software to facilitate a comprehensive rare variant analysis and incorporates novel features and analysis enhancements. Results: In BioBin 2.3, we extend our software tool by implementing statistical association testing, updating the binning algorithm, and incorporating novel analysis features, providing a robust, highly customizable, and unified rare variant analysis tool. Availability and implementation: The BioBin software package is open source and freely available to users at http://www.ritchielab.com/software/biobin-download Contact: mdritchie@geisinger.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28968757
Development problem analysis of correlation leak detector’s software
NASA Astrophysics Data System (ADS)
Faerman, V. A.; Avramchuk, V. S.; Marukyan, V. M.
2018-05-01
In this article, the practical application and structure of correlation leak detector software are studied, and the task of its design is analyzed. The first part of the paper shows why developing correlation leak detection equipment is worthwhile for improving the operating efficiency of public utilities. The functional structure of correlation leak detectors is analyzed and the tasks of their software are defined. In the second part, several development steps of the software package – requirements definition, program structure specification and software concept creation – are examined in the light of experience with a hardware-software prototype of a correlation leak detector.
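The core computation of a correlation leak detector is to cross-correlate the acoustic signals from two sensors bracketing the leak, read off the time delay, and convert it to a position: with sensor spacing L and propagation speed v, a delay tau of sensor 2 relative to sensor 1 satisfies tau = (L - 2d)/v, so d = (L - v*tau)/2. A hedged sketch of that step on synthetic signals follows (all parameters are illustrative assumptions, not the prototype described in the article).

```python
import numpy as np

def locate_leak(s1, s2, fs, sensor_spacing_m, wave_speed_mps):
    """Estimate leak position from the cross-correlation of two sensor signals.
    A positive delay means the leak noise reaches sensor 1 first."""
    corr = np.correlate(s2, s1, mode="full")
    lags = np.arange(-(len(s1) - 1), len(s2))
    tau = lags[np.argmax(corr)] / fs                 # delay of sensor 2 vs. sensor 1
    return (sensor_spacing_m - wave_speed_mps * tau) / 2.0

if __name__ == "__main__":
    fs, L, v = 8000.0, 100.0, 1200.0                 # Hz, metres, m/s (illustrative)
    rng = np.random.default_rng(3)
    leak_noise = rng.standard_normal(8000)
    d1 = 30.0                                        # true leak distance from sensor 1
    delay = int(round((L - 2 * d1) / v * fs))        # samples sensor 2 lags sensor 1
    s1 = leak_noise + 0.2 * rng.standard_normal(8000)
    s2 = np.roll(leak_noise, delay) + 0.2 * rng.standard_normal(8000)
    print("estimated leak position [m]:", locate_leak(s1, s2, fs, L, v))
```

In practice, frequency-domain weighting of the cross-correlation is often used to sharpen the delay peak, but the plain correlation above is enough to show the principle.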
Endocrown restorations: Influence of dental remnant and restorative material on stress distribution.
Tribst, João Paulo Mendes; Dal Piva, Amanda Maria de Oliveira; Madruga, Camila Ferreira Leite; Valera, Marcia Carneiro; Borges, Alexandre Luiz Souto; Bresciani, Eduardo; de Melo, Renata Marques
2018-06-20
The goal of this study was to evaluate the stress distribution in a tooth/restoration system according to the factors "amount of dental remnant" (3 levels) and "restorative material" (2 levels). Three endodontically treated maxillary molars were modeled with CAD software for conducting non-linear finite element analysis (FEA), each with a determined amount of dental remnant of 1.5, 3, or 4.5mm. Models were duplicated, and half received restorations in lithium disilicate (IPS e.max CAD), while the other half received leucite ceramic restorations (IPS Empress CAD), both from Ivoclar Vivadent (Schaan, Liechtenstein). The solids were imported to analysis software (ANSYS 17.2, ANSYS Inc., Houston, TX, USA) in STEP format. All contacts involving the resin cement were considered no-separation, whereas between teeth and fixation cylinder, the contact was considered perfectly bonded. The mechanical properties of each structure were reported, and the materials were considered isotropic, linearly elastic, and homogeneous. An axial load (300N) was applied at the occlusal surface (triploidism area). Results were determined by colorimetric graphs of maximum principal stress (MPS) on tooth remnant, cement line, and restoration. MPS revealed that both factors influenced the stress distribution for all structures; the higher the material's elastic modulus, the higher the stress concentration on the restoration and the lower the stress concentration on the cement line. Moreover, the greater the dental crown remnant, the higher the stress concentration on the restoration. Thus, the remaining dental tissue should always be preserved. In situations in which few dental remnants are available, the thicker the restoration, the higher the concentration of stresses in its structure, protecting the adhesive interface from potential adhesive failures. Results are more promising when the endocrown is fabricated with lithium disilicate ceramic. Copyright © 2018 The Academy of Dental Materials. Published by Elsevier Inc. All rights reserved.
Bieri, Michael; d'Auvergne, Edward J; Gooley, Paul R
2011-06-01
Investigation of protein dynamics on the ps-ns and μs-ms timeframes provides detailed insight into the mechanisms of enzymes and the binding properties of proteins. Nuclear magnetic resonance (NMR) is an excellent tool for studying protein dynamics at atomic resolution. Analysis of relaxation data using model-free analysis can be a tedious and time consuming process, which requires good knowledge of scripting procedures. The software relaxGUI was developed for fast and simple model-free analysis and is fully integrated into the software package relax. It is written in Python and uses wxPython to build the graphical user interface (GUI) for maximum performance and multi-platform use. This software allows the analysis of NMR relaxation data with ease and the generation of publication quality graphs as well as color coded images of molecular structures. The interface is designed for simple data analysis and management. The software was tested and validated against the command line version of relax.
Grasso, Chiara; Trevisan, Morena; Fiano, Valentina; Tarallo, Valentina; De Marco, Laura; Sacerdote, Carlotta; Richiardi, Lorenzo; Merletti, Franco; Gillio-Tos, Anna
2016-01-01
Pyrosequencing has emerged as an alternative method of nucleic acid sequencing, well suited for many applications which aim to characterize single nucleotide polymorphisms, mutations, microbial types and CpG methylation in the target DNA. Commercially available pyrosequencing systems can harbor two different types of software, which allow analysis in AQ or CpG mode, respectively; both are widely employed for DNA methylation analysis. The aim of the study was to assess the performance of these two pyrosequencing software packages for DNA methylation analysis at CpG sites. Although CpG mode was specifically designed for CpG methylation quantification, many investigations on this topic have been carried out with AQ mode. As proof of equivalent performance of the two packages for this type of analysis is not available, the focus of this paper was to evaluate whether the two modes currently used for CpG methylation assessment by pyrosequencing give overlapping results. We compared the performance of the two packages in quantifying DNA methylation in the promoters of selected genes (GSTP1, MGMT, LINE-1) by testing two case series comprising DNA from paraffin-embedded prostate cancer tissues (PC study, N = 36) and DNA from blood fractions of healthy people (DD study, N = 28), respectively. We found discrepancies between the two packages in the quality assignment of DNA methylation assays. Compared to the software for analysis in AQ mode, less permissive criteria are applied by the Pyro Q-CpG software, which enables analysis in CpG mode. CpG mode warns the operators about potentially unsatisfactory performance of the assay and ensures a more accurate quantitative evaluation of DNA methylation at CpG sites. The implementation of CpG mode is strongly advisable in order to improve the reliability of the methylation analysis results achievable by pyrosequencing.
García-Pérez, M A
2001-11-01
This paper presents an analysis of research published in the decade 1989-1998 by Spanish faculty members in the areas of statistical methods, research methodology, and psychometric theory. Database search and direct correspondence with faculty members in Departments of Methodology across Spain rendered a list of 193 papers published in these broad areas by 82 faculty members. These and other faculty members had actually published 931 papers over the decade of analysis, but 738 of them addressed topics not appropriate for description in this report. Classification and analysis of these 193 papers revealed topics that have attracted the most interest (psychophysics, item response theory, analysis of variance, sequential analysis, and meta-analysis) as well as other topics that have received less attention (scaling, factor analysis, time series, and structural models). A significant number of papers also dealt with various methodological issues (software, algorithms, instrumentation, and techniques). A substantial part of this report is devoted to describing the issues addressed across these 193 papers--most of which are written in the Spanish language and published in Spanish journals--and some representative references are given.
Soares, Mariana Quirino Silveira; Van Dessel, Jeroen; Jacobs, Reinhilde; da Silva Santos, Paulo Sérgio; Cestari, Tania Mary; Garlet, Gustavo Pompermaier; Duarte, Marco Antonio Hungaro; Imada, Thaís Sumie Nozu; Lambrichts, Ivo; Rubira-Bullen, Izabel Regina Fischer
2018-03-15
The aim was to assess the effect of a relevant regimen of zoledronic acid (ZA) treatment for the study of bisphosphonate-related osteonecrosis of the jaw on alveolar bone microstructure and vasculature. A sub-objective was to use 3-dimensional imaging to describe site-specific changes induced by ZA in the alveolar bone. Five Wistar rats received ZA (0.6 mg/kg) and five (controls) received saline solution in the same volume. The compounds were administered intraperitoneally in 5 doses every 28 days. The rats were euthanized 150 days after therapy onset. The mandibles were scanned using high-resolution (14-μm) micro-computed tomography (micro-CT), decalcified, cut into slices for histologic analysis (5 μm), and stained with hematoxylin-eosin. Bone quality parameters were calculated using CT-Analyser software (Bruker, Kontich, Belgium) in 2 different volumes of interest (VOIs): the region between the first molar roots (VOI-1) and the periapical region under the first and second molars' apex (VOI-2). Blood vessel density and bone histomorphometric parameters were calculated only for the region between the roots of the first molar using AxioVision Imaging software (version 4.8; Carl Zeiss, Gottingen, Germany). ZA-treated rats showed a significant increase in percentage of bone volume and density (P < .05), with thicker and more connected trabeculae. Furthermore, the ZA group showed a significant decrease in the size of the marrow spaces and nutritive canals and in blood vessel density (P < .05). In the micro-CT evaluation, VOI-2 showed better outcomes in measuring the effect of ZA on alveolar bone. ZA treatment induced bone corticalization and decreased alveolar bone vascularization. VOI-2 should be preferred for micro-CT evaluation of the effect of bisphosphonates on alveolar bone. This analysis allowed the effect of ZA on alveolar bone and its vascularization to be characterized. The results of this analysis may add further knowledge to the understanding of the physiopathology of osteonecrosis of the jaw. Copyright © 2018 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.
Toward Baseline Software Anomalies in NASA Missions
NASA Technical Reports Server (NTRS)
Layman, Lucas; Zelkowitz, Marvin; Basili, Victor; Nikora, Allen P.
2012-01-01
In this fast abstract, we provide preliminary findings from an analysis of 14,500 spacecraft anomalies from unmanned NASA missions. We provide some baselines for the distributions of software vs. non-software anomalies in spaceflight systems, the risk ratings of software anomalies, and the corrective actions associated with software anomalies.
Hardware in-the-Loop Demonstration of Real-Time Orbit Determination in High Earth Orbits
NASA Technical Reports Server (NTRS)
Moreau, Michael; Naasz, Bo; Leitner, Jesse; Carpenter, J. Russell; Gaylor, Dave
2005-01-01
This paper presents results from a study conducted at Goddard Space Flight Center (GSFC) to assess the real-time orbit determination accuracy of GPS-based navigation in a number of different high Earth orbital regimes. Measurements collected from a GPS receiver (connected to a GPS radio frequency (RF) signal simulator) were processed in a navigation filter in real-time, and resulting errors in the estimated states were assessed. For the most challenging orbit simulated, a 12 hour Molniya orbit with an apogee of approximately 39,000 km, mean total position and velocity errors were approximately 7 meters and 3 mm/s respectively. The study also makes direct comparisons between the results from the above hardware in-the-loop tests and results obtained by processing GPS measurements generated from software simulations. Care was taken to use the same models and assumptions in the generation of both the real-time and software simulated measurements, in order that the real-time data could be used to help validate the assumptions and models used in the software simulations. The study makes use of the unique capabilities of the Formation Flying Test Bed at GSFC, which provides a capability to interface with different GPS receivers and to produce real-time, filtered orbit solutions even when less than four satellites are visible. The result is a powerful tool for assessing onboard navigation performance in a wide range of orbital regimes, and a test-bed for developing software and procedures for use in real spacecraft applications.
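The performance assessment described above amounts to comparing filtered state estimates against the simulator truth, e.g., as mean total position and velocity errors. A trivial, hedged sketch of that bookkeeping step follows (synthetic state histories; not the GSFC filter, its models, or its data).

```python
import numpy as np

def mean_state_errors(est, truth):
    """Mean 3D position and velocity error magnitudes.
    est, truth : arrays of shape (n_epochs, 6) holding [x, y, z, vx, vy, vz]."""
    diff = est - truth
    pos_err = np.linalg.norm(diff[:, :3], axis=1).mean()
    vel_err = np.linalg.norm(diff[:, 3:], axis=1).mean()
    return pos_err, vel_err

if __name__ == "__main__":
    rng = np.random.default_rng(7)
    truth = rng.normal(size=(500, 6)) * 1e4                   # placeholder trajectory
    est = truth + np.hstack([rng.normal(0, 5, (500, 3)),      # ~5 m position noise
                             rng.normal(0, 0.003, (500, 3))]) # ~3 mm/s velocity noise
    p, v = mean_state_errors(est, truth)
    print(f"mean position error {p:.2f} m, mean velocity error {v*1000:.2f} mm/s")
```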