The Trial Software version for DEMETER power spectrum files visualization and mapping
NASA Astrophysics Data System (ADS)
Lozbin, Anatoliy; Inchin, Alexander; Shpadi, Maxim
2010-05-01
In the frame of the creation of Kazakhstan's Scientific Space System for earthquake-precursor research, the hardware and software of the DEMETER satellite were investigated. The DEMETER data-processing software is based on the SWAN package running under the IDL Virtual Machine and provides many features, but it lacks an important tool for spectrogram analysis: space-time visualization of power spectrum files from the electromagnetic instruments ICE and IMSC. To fill this gap we have developed software that we offer for use. DeSS (DEMETER Spectrogram Software) is software for the visualization, analysis and mapping of power spectrum data from the electromagnetic instruments ICE and IMSC. Its primary goal is to give the researcher a friendly tool for the analysis of electromagnetic data from the DEMETER satellite in studies of earthquake precursors and other ionospheric events. The input data for DeSS are power spectrum files: - power spectrum of one component of the electric field in the VLF range (APID 1132); - power spectrum of one component of the electric field in the HF range (APID 1134); - power spectrum of one component of the magnetic field in the VLF range (APID 1137). The main features and operations of the software are: - various time and frequency filtration; - visualization of the time dependence of signal intensity at a fixed frequency; - spectral density visualization for a fixed frequency range; - spectrogram autosizing and smoothing; - readout of time, frequency and intensity at each point of the spectrogram; - spectrum information in a separate window consisting of 4 blocks; - data mapping with a 6-range scale. On the map the following can be browsed: - the satellite orbit; - the conjugate point at the satellite altitude; - the north conjugate point at an altitude of 110 km; - the south conjugate point at an altitude of 110 km. This is a trial software version intended to help researchers, and we are always ready to collaborate with scientists on software improvement. References: 1. D. Lagoutte, J. Y. Brochot, D. de Carvalho, L. Madrias and M. Parrot. DEMETER Microsatellite. Scientific Mission Center. Data product description. DMT-SP-9-CM-6054-LPC. 2. D. Lagoutte, J. Y. Brochot, P. Latremoliere. SWAN - Software for Waveform Analysis. LPCE/NI/003.E - Part 1 (User's guide), Part 2 (Analysis tools), Part 3 (User's project interface).
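For readers unfamiliar with this kind of display, the sketch below shows the essence of such a spectrogram viewer: a decoded (time × frequency) power array is filtered in time and frequency, rendered as a spectrogram, and sampled at a fixed frequency. It assumes the APID records have already been decoded into arrays per the cited data product description; the array names and synthetic data are illustrative only, not the DeSS API.

```python
# Minimal sketch of a DeSS-style spectrogram display, assuming the APID
# 1132/1134/1137 records are already decoded into a (time x frequency) array.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
times = np.linspace(0, 600, 300)          # seconds along the half-orbit
freqs = np.linspace(0, 20e3, 256)         # VLF range, Hz
power = rng.gamma(2.0, 1.0, (times.size, freqs.size))  # stand-in for ICE/IMSC data

# Time/frequency filtration: restrict to a window of interest.
tsel = (times > 100) & (times < 500)
fsel = (freqs > 1e3) & (freqs < 15e3)
view = power[np.ix_(tsel, fsel)]

fig, (ax1, ax2) = plt.subplots(2, 1, figsize=(8, 6))
ax1.pcolormesh(times[tsel], freqs[fsel] / 1e3, 10 * np.log10(view).T, shading="auto")
ax1.set(xlabel="time [s]", ylabel="frequency [kHz]", title="power spectrogram [dB]")

# Time dependence of intensity at a fixed frequency (one of the listed features).
f0 = np.argmin(np.abs(freqs - 10e3))
ax2.plot(times, 10 * np.log10(power[:, f0]))
ax2.set(xlabel="time [s]", ylabel="intensity at 10 kHz [dB]")
plt.tight_layout()
plt.show()
```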
Spectrum analysis on quality requirements consideration in software design documents.
Kaiya, Haruhiko; Umemura, Masahiro; Ogata, Shinpei; Kaijiri, Kenji
2013-12-01
Software quality requirements defined in the requirements analysis stage should be implemented in the final products, such as source codes and system deployment. To guarantee this meta-requirement, quality requirements should be considered in the intermediate stages, such as the design stage or the architectural definition stage. We propose a novel method for checking whether quality requirements are considered in the design stage. In this method, a technique called "spectrum analysis for quality requirements" is applied not only to requirements specifications but also to design documents. The technique enables us to derive the spectrum of a document, and quality requirements considerations in the document are numerically represented in the spectrum. We can thus objectively identify whether the considerations of quality requirements in a requirements document are adapted to its design document. To validate the method, we applied it to commercial software systems with the help of a supporting tool, and we confirmed that the method worked well.
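As an illustration of the idea (not the authors' exact technique), a document's quality "spectrum" can be approximated as normalized counts of terms associated with each quality characteristic; comparing the requirements spectrum with the design spectrum then becomes a numeric check. The term lists and example texts below are invented for the sketch.

```python
# Illustrative "quality spectrum": normalized keyword counts per quality
# characteristic. The keyword lists here are invented for the example.
import re
from collections import Counter

TERM_MAP = {
    "security":    {"encrypt", "authenticate", "password", "access"},
    "performance": {"latency", "throughput", "response", "fast"},
    "usability":   {"user", "screen", "menu", "help"},
}

def quality_spectrum(text: str) -> dict:
    words = Counter(re.findall(r"[a-z]+", text.lower()))
    raw = {q: sum(words[t] for t in terms) for q, terms in TERM_MAP.items()}
    total = sum(raw.values()) or 1
    return {q: n / total for q, n in raw.items()}

req_spec = "Users authenticate with a password; response time must be fast."
design   = "The login screen asks the user for a password and stores it."

s_req, s_des = quality_spectrum(req_spec), quality_spectrum(design)
for q in TERM_MAP:
    print(f"{q:12s} requirements={s_req[q]:.2f} design={s_des[q]:.2f}")
```

A large gap between the two spectra for some characteristic would flag a quality requirement that the design document has not carried forward.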
Usage of "Powergraph" software at laboratory lessons of "general physics" department of MEPhI
NASA Astrophysics Data System (ADS)
Klyachin, N. A.; Matronchik, A. Yu.; Khangulyan, E. V.
2017-01-01
The use of the "PowerGraph" software in the laboratory exercise "Study of the sodium spectrum" of the physics laboratory course is considered. Together with the design of the experimental setup, the sodium spectra digitized with a computer audio chip are discussed. The use of "PowerGraph" in this experiment allows efficient visualization of the sodium spectrum and analysis of its fine structure; in particular, it allows quantitative measurement of the wavelengths and relative line intensities.
Hu, Zhi-yu; Zhang, Lei; Ma, Wei-guang; Yan, Xiao-juan; Li, Zhi-xin; Zhang, Yong-zhi; Wang, Le; Dong, Lei; Yin, Wang-bao; Jia, Suo-tang
2012-03-01
Self-designed software for identifying LIBS spectral lines is introduced. Integrated with LabVIEW, the software can smooth spectral lines and pick peaks, employing second-difference and threshold methods. Characteristic spectra of several elements are matched against the NIST database, realizing automatic spectral-line identification and qualitative analysis of the basic composition of the sample. This software makes spectrum analysis convenient and rapid, and will be a useful tool for LIBS.
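A minimal sketch of second-difference peak picking with a threshold, followed by library matching, in the spirit of the approach described; the tiny line table stands in for the NIST database and its values are approximate.

```python
# Second-difference peak search with threshold, plus element matching.
# The line table below is a toy stand-in for the NIST database.
import numpy as np

def pick_peaks(wl, intensity, threshold):
    """Peaks where the second difference is strongly negative (sharp maxima)."""
    smoothed = np.convolve(intensity, np.ones(5) / 5, mode="same")  # smoothing
    d2 = np.diff(smoothed, n=2)                                     # second difference
    idx = np.where(d2 < -threshold)[0] + 1
    # keep local maxima only
    return [i for i in idx
            if smoothed[i] >= smoothed[i - 1] and smoothed[i] >= smoothed[i + 1]]

LINES_NM = {"Fe": [371.99, 404.58], "Ca": [393.37, 396.85], "Na": [589.00, 589.59]}

def identify(peak_wavelengths, tol=0.1):
    found = set()
    for wl in peak_wavelengths:
        for element, lines in LINES_NM.items():
            if any(abs(wl - l) < tol for l in lines):
                found.add(element)
    return found

wl = np.linspace(380, 600, 2200)                       # nm, 0.1 nm steps
line = 100 * np.exp(-0.5 * ((wl - 393.37) / 0.3) ** 2) # synthetic Ca II line
noisy = line + np.random.default_rng(5).normal(0, 1, wl.size)
peaks = pick_peaks(wl, noisy, threshold=5.0)
print(identify([wl[i] for i in peaks]))                # -> {'Ca'}
```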
NASA Technical Reports Server (NTRS)
Singh, S. P.
1979-01-01
The computer software developed to implement a method for Wiener spectrum analysis of photographic films is presented. This method is used for the quantitative analysis of the autoradiographic enhancement process. The software requirements and design for the autoradiographic enhancement process are given, along with the program listings and the user's manual. A software description and program listings for the modified data analysis software are included.
Cejnar, Pavel; Kuckova, Stepanka; Prochazka, Ales; Karamonova, Ludmila; Svobodova, Barbora
2018-06-15
Explorative statistical analysis of mass spectrometry data is still a time-consuming step. We analyzed critical factors for the application of principal component analysis (PCA) in mass spectrometry and focused on two whole-spectrum-based normalization techniques and their application in the analysis of registered peak data and, for comparison, in full-spectrum data analysis. We used this technique to identify different metabolic patterns in the bacterial culture of Cronobacter sakazakii, an important foodborne pathogen. Two software utilities were implemented: ms-alone, a Python-based utility for mass spectrometry data preprocessing and peak extraction, and the multiMS-toolbox, an R tool for advanced peak registration and detailed explorative statistical analysis. The bacterial culture of Cronobacter sakazakii was cultivated on Enterobacter sakazakii Isolation Agar, Blood Agar Base and Tryptone Soya Agar for 24 h and 48 h and applied by the smear method on an Autoflex Speed MALDI-TOF mass spectrometer. For the three tested cultivation media, only two different metabolic patterns of Cronobacter sakazakii were identified using PCA applied to data normalized by the two different normalization techniques. Results from matched peak data and subsequent detailed full-spectrum analysis likewise identified only two different metabolic patterns: cultivation on Enterobacter sakazakii Isolation Agar showed significant differences from cultivation on the other two tested media. The metabolic patterns for all tested cultivation media also proved dependent on cultivation time. Both whole-spectrum-based normalization techniques, together with full-spectrum PCA, allow identification of important discriminative factors in experiments with several variable condition factors, avoiding problems with improper identification of peaks or overemphasis on below-threshold peak data. The amount of processed data remains manageable. Both implemented software utilities are available free of charge from http://uprt.vscht.cz/ms. Copyright © 2018 John Wiley & Sons, Ltd.
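A minimal sketch of the two core steps named above, whole-spectrum (total ion current) normalization followed by full-spectrum PCA, using synthetic data; the published ms-alone/multiMS-toolbox utilities perform considerably more preprocessing than this.

```python
# Whole-spectrum (TIC) normalization followed by full-spectrum PCA.
# Data here are synthetic stand-ins for MALDI-TOF spectra.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
spectra = rng.poisson(50, size=(12, 2000)).astype(float)  # 12 spectra x 2000 m/z bins

# Whole-spectrum normalization: divide each spectrum by its total ion current.
tic = spectra.sum(axis=1, keepdims=True)
normalized = spectra / tic

pca = PCA(n_components=2)
scores = pca.fit_transform(normalized)
print("explained variance ratio:", pca.explained_variance_ratio_)
print("first two PC scores:\n", scores)
```

Grouping of the score points (e.g., by cultivation medium or time) would reveal the metabolic patterns discussed above.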
An Overview of the XGAM Code and Related Software for Gamma-ray Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Younes, W.
2014-11-13
The XGAM spectrum-fitting code and associated software were developed specifically to analyze the complex gamma-ray spectra that can result from neutron-induced reactions. The XGAM code is designed to fit a spectrum over the entire available gamma-ray energy range as a single entity, in contrast to the more traditional piecewise approaches. This global-fit philosophy enforces background continuity as well as consistency between local and global behavior throughout the spectrum, and in a natural way. This report presents XGAM and the suite of programs built around it with an emphasis on how they fit into an overall analysis methodology for complex gamma-ray data. An application to the analysis of time-dependent delayed gamma-ray yields from ²³⁵U fission is shown in order to showcase the codes and how they interact.
[Design of flat field holographic concave grating for near-infrared spectrophotometer].
Xiang, Xian-Yi; Wen, Zhi-Yu
2008-07-01
Near-infrared spectrum analysis can be used to determine the nature of, or quantitatively test for, chemical compositions by detecting molecular overtone and combination absorption. It has been used in agriculture, biology, petrochemicals, foodstuffs, medicine, textiles and other fields. The near-infrared spectrophotometer is the main apparatus for near-infrared spectrum analysis, and the grating is its most important component. Based on holographic concave grating theory and the optical design software CODE V, a flat-field holographic concave grating for a near-infrared spectrophotometer was designed from a primary structure, relying on the global optimization of the software. The contradiction between a wide spectral range and limited spectrum extension was resolved, aberrations were reduced successfully, the spectral information is fully utilized, and the optical structure of the spectrometer is highly efficient. Using CODE V, complex high-order aberration equations need not be solved, results can be evaluated quickly, flat field and resolving power can be kept in balance, and work efficiency is enhanced. A design example of a flat-field holographic concave grating is given: it works between 900 nm and 1700 nm, the diameter of the concave grating is 25 mm, and the F-number is 1.5. The design result was analyzed and evaluated, showing that with a 50 μm slit source used for reconstruction, the theoretical resolution is better than 6.3 nm.
NASA Astrophysics Data System (ADS)
Hayrapetyan, David B.; Hovhannisyan, Levon; Mantashyan, Paytsar A.
2013-04-01
The analysis of complex spectra is a pressing problem for modern science. This work is devoted to the creation of a software package that analyzes spectra in different formats, possesses a dynamic knowledge database with a self-learning mechanism, and performs automated analysis of spectral composition based on the knowledge database by applying certain algorithms. Hyper-spherical random search algorithms, gradient algorithms and genetic search algorithms are used as the search systems in the package. Raman and IR spectra of diamond-like carbon (DLC) samples were analyzed with the developed program. After processing the data, the program immediately displays all the calculated parameters of the DLC.
Android application and REST server system for quasar spectrum presentation and analysis
NASA Astrophysics Data System (ADS)
Wasiewicz, P.; Pietralik, K.; Hryniewicz, K.
2017-08-01
This paper describes the implementation of a system consisting of a mobile application and a RESTful-architecture server intended for the analysis and presentation of quasar spectra. It also describes the characteristics of quasars and their significance to the scientific community, the sources for acquiring spectral data of astronomical objects, and the software solutions used, and presents Cloud Computing aspects and various possible deployment configurations.
Woynaroski, Tiffany; Oller, D. Kimbrough; Keceli-Kaysili, Bahar; Xu, Dongxin; Richards, Jeffrey A.; Gilkerson, Jill; Gray, Sharmistha; Yoder, Paul
2017-01-01
Theory and research suggest that vocal development predicts “useful speech” in preschoolers with autism spectrum disorder (ASD), but conventional methods for measurement of vocal development are costly and time consuming. This longitudinal correlational study examines the reliability and validity of several automated indices of vocalization development relative to an index derived from human coded, conventional communication samples in a sample of preverbal preschoolers with ASD. Automated indices of vocal development were derived using software that is presently “in development” and/or only available for research purposes and using commercially available Language ENvironment Analysis (LENA) software. Indices of vocal development that could be derived using the software available for research purposes: (a) were highly stable with a single day-long audio recording, (b) predicted future spoken vocabulary to a degree that was nonsignificantly different from the index derived from conventional communication samples, and (c) continued to predict future spoken vocabulary even after controlling for concurrent vocabulary in our sample. The score derived from standard LENA software was similarly stable, but was not significantly correlated with future spoken vocabulary. Findings suggest that automated vocal analysis is a valid and reliable alternative to time intensive and expensive conventional communication samples for measurement of vocal development of preverbal preschoolers with ASD in research and clinical practice. PMID:27459107
Standardizing Activation Analysis: New Software for Photon Activation Analysis
NASA Astrophysics Data System (ADS)
Sun, Z. J.; Wells, D.; Segebade, C.; Green, J.
2011-06-01
Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. For the purpose of saving time, manpower and minimizing error, a computer program was designed, built and implemented using SQL, Access 2007 and asp.net technology to automate this process. Based on the peak information of the spectrum and assisted by its PAA library, the program automatically identifies elements in the samples and calculates their concentrations and respective uncertainties. The software also could be operated in browser/server mode, which gives the possibility to use it anywhere the internet is accessible. By switching the nuclide library and the related formula behind, the new software can be easily expanded to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE). Implementation of this would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimum input from the user, the software has proven to be fast, user-friendly and reliable.
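For orientation, the concentration step in activation analysis is often done by the relative (comparator) method; the hedged sketch below shows that calculation in miniature. The helper function and the numbers are illustrative, not the program's actual routines.

```python
# Relative (comparator) method sketch: concentration from the ratio of
# decay-corrected peak count rates of sample and standard. Illustrative only.
import math

def decay_corrected_rate(counts, live_time, cooling_time, half_life):
    """Count rate corrected back to the end of irradiation."""
    lam = math.log(2) / half_life
    return (counts / live_time) * math.exp(lam * cooling_time)

def concentration(sample, standard, std_concentration):
    """Same nuclide, same irradiation and counting geometry assumed."""
    r_sample = decay_corrected_rate(*sample)
    r_standard = decay_corrected_rate(*standard)
    return std_concentration * r_sample / r_standard

# (counts, live time [s], cooling time [s], half-life [s])
print(concentration((5200, 3600, 7200, 5.27e5),
                    (9800, 3600, 3600, 5.27e5),
                    std_concentration=12.0))  # e.g. mg/kg
```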
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gharibyan, N.
In order to fully characterize the NIF neutron spectrum, the SAND-II-SNL software was requested and received from the Radiation Safety Information Computational Center. The software is designed to determine the neutron energy spectrum through analysis of experimental activation data. However, given that the source code was developed on a SPARCstation 10, it is not compatible with current versions of FORTRAN. Accounts were established through Lawrence Livermore National Laboratory's High Performance Computing in order to access different FORTRAN compilers (e.g., pgf77, pgf90). Additionally, several of the subroutines included in the SAND-II-SNL package required debugging to allow proper compiling of the code.
A Paradigm Shift in Nuclear Spectrum Analysis
NASA Astrophysics Data System (ADS)
Westmeier, Wolfram; Siemon, Klaus
2012-08-01
An overview of the latest developments in quantitative spectrometry software is presented. The new strategies and algorithms introduced are characterized by the buzzwords "Physics, no numerology", "Fuzzy logic" and "Repeated analyses". With the implementation of these new ideas one arrives at software capabilities that were unreachable before and which are now realized in the GAMMA-W, SODIGAM and ALPS packages.
Improving Reliability of Spectrum Analysis for Software Quality Requirements Using TCM
NASA Astrophysics Data System (ADS)
Kaiya, Haruhiko; Tanigawa, Masaaki; Suzuki, Shunichi; Sato, Tomonori; Osada, Akira; Kaijiri, Kenji
Quality requirements are scattered across a requirements specification, so it is hard to measure and trace them in order to validate the specification against stakeholders' needs. We previously proposed a technique called "spectrum analysis for quality requirements" which enables analysts to sort a requirements specification so as to measure and track the quality requirements in it. In the same way as a spectrum in optics, the quality spectrum of a specification shows a quantitative feature of the specification with respect to quality. We can therefore compare the specification of one system to that of another with respect to quality. As a result, we can validate a specification by checking whether it has quality features in common with specifications of existing similar systems, and by identifying its specific features against them. However, our first spectrum analysis for quality requirements required substantial effort and knowledge of the problem domain, and it was hard to reuse that knowledge to reduce the effort. We therefore introduce domain knowledge called a term-characteristic map (TCM) to reuse the knowledge in our quality spectrum analysis. Through several experiments we evaluate our spectrum analysis, and the main findings are as follows. First, we confirmed that specifications of similar systems have similar quality spectra. Second, the results of spectrum analysis using TCM are objective, i.e., different analysts generate almost the same spectra when they analyze the same specification.
The 1943 K emission spectrum of H₂¹⁶O between 6600 and 7050 cm⁻¹
NASA Astrophysics Data System (ADS)
Czinki, Eszter; Furtenbacher, Tibor; Császár, Attila G.; Eckhardt, André K.; Mellau, Georg Ch.
2018-02-01
An emission spectrum of H₂¹⁶O has been recorded, with Doppler-limited resolution, at 1943 K using Hot Gas Molecular Emission (HOTGAME) spectroscopy. The wavenumber range covered is 6600 to 7050 cm⁻¹. This work reports the analysis and subsequent assignment of close to 3700 H₂¹⁶O transitions out of a total of more than 6700 measured peaks. The analysis is based on the Measured Active Rotational-Vibrational Energy Levels (MARVEL) energy levels of H₂¹⁶O determined in 2013 and emission line intensities obtained from accurate variational nuclear-motion computations. The analysis of the spectrum yields about 1300 transitions not measured previously and 23 experimentally previously unidentified rovibrational energy levels. The accuracy of the line positions and intensities used in the analysis was improved with the spectrum deconvolution software SyMath via creating a peak list corresponding to the dense emission spectrum. The extensive list of labeled transitions and the new experimental energy levels obtained are deposited in the Supplementary Material of this article as well as in the ReSpecTh (http://www.respecth.hu) information system.
Acoustic Emission Analysis Applet (AEAA) Software
NASA Technical Reports Server (NTRS)
Nichols, Charles T.; Roth, Don J.
2013-01-01
NASA Glenn Research Center and NASA White Sands Test Facility have developed software supporting an automated pressure vessel structural health monitoring (SHM) system based on acoustic emissions (AE). The software, referred to as the Acoustic Emission Analysis Applet (AEAA), provides analysts with a tool that can interrogate data collected with Digital Wave Corp. and Physical Acoustics Corp. software using a wide spectrum of powerful filters and charts. This software can be made to work with any data once the data format is known. The applet will compute basic AE statistics, and statistics as a function of time and pressure. AEAA provides added value beyond the analysis offered by the respective vendors' software. The software can handle data sets of unlimited size. A wide variety of government and commercial applications could benefit from this technology, notably requalification and usage tests for compressed gas and hydrogen-fueled vehicles. Future enhancements will add features similar to a "check engine" light on a vehicle. Once installed, the system will ultimately be used to alert International Space Station crewmembers to critical structural instabilities, but will otherwise have little impact on missions. Diagnostic information could then be transmitted to experienced technicians on the ground in a timely manner to determine whether pressure vessels have been impacted, are structurally unsound, or can be safely used to complete the mission.
Sadygov, Rovshan G.; Zhao, Yingxin; Haidacher, Sigmund J.; Starkey, Jonathan M.; Tilton, Ronald G.; Denner, Larry
2010-01-01
We describe a method for ratio estimations in ¹⁸O-water labeling experiments acquired from low resolution isotopically resolved data. The method is implemented in a software package specifically designed for use in experiments making use of zoom-scan mode data acquisition. Zoom-scan mode data allows commonly used ion trap mass spectrometers to attain isotopic resolution, which makes them amenable to use in labeling schemes such as ¹⁸O-water labeling, but algorithms and software developed for high resolution instruments may not be appropriate for the lower resolution data acquired in zoom-scan mode. The use of power spectrum analysis is proposed as a general approach which may be uniquely suited to these data types. The software implementation uses the power spectrum to remove high-frequency noise, and band-filter contributions from co-eluting species of differing charge states. From the elemental composition of a peptide sequence we generate theoretical isotope envelopes of heavy-light peptide pairs in five different ratios; these theoretical envelopes are correlated with the filtered experimental zoom scans. To automate peptide quantification in high-throughput experiments, we have implemented our approach in a computer program, MassXplorer. We demonstrate the application of MassXplorer to two model mixtures of known proteins, and to a complex mixture of mouse kidney cortical extract. Comparison with another algorithm for ratio estimations demonstrates the increased precision and automation of MassXplorer. PMID:20568695
Software and database for the analysis of mutations in the human FBN1 gene.
Collod, G; Béroud, C; Soussi, T; Junien, C; Boileau, C
1996-01-01
Fibrillin is the major component of extracellular microfibrils. Mutations in the fibrillin gene on chromosome 15 (FBN1) were described at first in the heritable connective tissue disorder, Marfan syndrome (MFS). More recently, FBN1 has also been shown to harbor mutations related to a spectrum of conditions phenotypically related to MFS and many mutations will have to be accumulated before genotype/phenotype relationships emerge. To facilitate mutational analysis of the FBN1 gene, a software package along with a computerized database (currently listing 63 entries) have been created. PMID:8594563
NASA Astrophysics Data System (ADS)
Wang, Fu; Liu, Bo; Zhang, Lijia; Zhang, Qi; Tian, Qinghua; Tian, Feng; Rao, Lan; Xin, Xiangjun
2017-07-01
Elastic software-defined optical networks greatly improve the flexibility of optical switching networks but bring challenges to routing and spectrum assignment (RSA). A multilayer virtual topology model is proposed to solve the RSA problem. Two RSA algorithms based on the virtual topology are proposed: an ant colony optimization (ACO) algorithm of minimum consecutiveness loss and an ACO algorithm of maximum spectrum consecutiveness. Owing to the computing power of the control layer in the software-defined network, the routing algorithm avoids frequent exchange of link-state information between routers. Based on the effect of spectrum-consecutiveness loss on the pheromone in the ACO, the path and spectrum with minimal impact on the network are selected for each service request. The proposed algorithms have been compared with other algorithms. The results show that the proposed algorithms can reduce the blocking rate by at least 5% and perform better in spectrum efficiency. Moreover, they effectively decrease spectrum fragmentation and enhance available spectrum consecutiveness.
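A minimal sketch of the spectrum-consecutiveness notion that drives both algorithms: an allocation is penalized by how much it shrinks the largest contiguous free block on a link. The metric below is an illustration, not the paper's exact formula.

```python
# Spectrum-consecutiveness loss: how much an allocation shrinks the largest
# contiguous free block on a link. Illustrative metric, not the paper's formula.
def free_blocks(slots):
    """Lengths of maximal runs of free (False) spectrum slots."""
    blocks, run = [], 0
    for used in slots + [True]:          # sentinel flushes the last run
        if used:
            if run:
                blocks.append(run)
            run = 0
        else:
            run += 1
    return blocks

def consecutiveness_loss(slots, start, width):
    before = max(free_blocks(slots), default=0)
    trial = slots.copy()
    trial[start:start + width] = [True] * width
    after = max(free_blocks(trial), default=0)
    return before - after

link = [False] * 10
link[4] = True                            # one slot already occupied
print(consecutiveness_loss(link, 0, 2))   # allocation at the edge: loss 0
print(consecutiveness_loss(link, 6, 2))   # allocation splitting a block: loss 1
```

In an ACO setting, a candidate path/spectrum choice with lower loss would deposit more pheromone, steering later ants toward allocations that preserve contiguous spectrum.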
Craniux: A LabVIEW-Based Modular Software Framework for Brain-Machine Interface Research
2011-01-01
open-source BMI software solutions are currently available, we feel that the Craniux software package fills a specific need in the realm of BMI ... data, such as cortical source imaging using EEG or MEG recordings. It is with these characteristics in mind that we feel the Craniux software package ... S. Adee, "Dean Kamen's 'luke arm' prosthesis readies for clinical trials," IEEE Spectrum, February 2008, http://spectrum.ieee.org/biomedical
2011-08-01
Keywords: EEG; neurofeedback; autism spectrum disorders. Since PIRT or neurofeedback training is to be guided by a quantitative analysis of the EEG, it was ... • software for the neurofeedback training at UCSD and SLDC have been acquired, piloted, and are working • Training of Research Assistants has ...
NASA Technical Reports Server (NTRS)
Miller, E. F.
1982-01-01
The mathematical models used in the software package developed for use at the 1983 Regional Administrative Radio Conference on broadcasting satellites are described. The models are those used in the Spectrum Orbit Utilization Program (SOUP) analysis. The geometric relationships necessary to model broadcasting satellite systems are discussed. Antenna models represent copolarized and cross-polarized performance as functions of the off-axis angle. The protection ratio is modelled as a co-channel value and a template representing systems with frequency offsets.
NASA Astrophysics Data System (ADS)
Russell, John L.; Campbell, John L.; Boyd, Nicholas I.; Dias, Johnny F.
2018-02-01
The newly developed GUMAP software creates element maps from OMDAQ list mode files, displays these maps individually or collectively, and facilitates on-screen definitions of specified regions from which a PIXE spectrum can be built. These include a free-hand region defined by moving the cursor. The regional charge is entered automatically into the spectrum file in a new GUPIXWIN-compatible format, enabling a GUPIXWIN analysis of the spectrum. The code defaults to the OMDAQ dead time treatment but also facilitates two other methods for dead time correction in sample regions with count rates different from the average.
CytoSpectre: a tool for spectral analysis of oriented structures on cellular and subcellular levels.
Kartasalo, Kimmo; Pölönen, Risto-Pekka; Ojala, Marisa; Rasku, Jyrki; Lekkala, Jukka; Aalto-Setälä, Katriina; Kallio, Pasi
2015-10-26
Orientation and the degree of isotropy are important in many biological systems such as the sarcomeres of cardiomyocytes and other fibrillar structures of the cytoskeleton. Image based analysis of such structures is often limited to qualitative evaluation by human experts, hampering the throughput, repeatability and reliability of the analyses. Software tools are not readily available for this purpose and the existing methods typically rely at least partly on manual operation. We developed CytoSpectre, an automated tool based on spectral analysis, allowing the quantification of orientation and also size distributions of structures in microscopy images. CytoSpectre utilizes the Fourier transform to estimate the power spectrum of an image and based on the spectrum, computes parameter values describing, among others, the mean orientation, isotropy and size of target structures. The analysis can be further tuned to focus on targets of particular size at cellular or subcellular scales. The software can be operated via a graphical user interface without any programming expertise. We analyzed the performance of CytoSpectre by extensive simulations using artificial images, by benchmarking against FibrilTool and by comparisons with manual measurements performed for real images by a panel of human experts. The software was found to be tolerant against noise and blurring and superior to FibrilTool when analyzing realistic targets with degraded image quality. The analysis of real images indicated general good agreement between computational and manual results while also revealing notable expert-to-expert variation. Moreover, the experiment showed that CytoSpectre can handle images obtained of different cell types using different microscopy techniques. Finally, we studied the effect of mechanical stretching on cardiomyocytes to demonstrate the software in an actual experiment and observed changes in cellular orientation in response to stretching. CytoSpectre, a versatile, easy-to-use software tool for spectral analysis of microscopy images was developed. The tool is compatible with most 2D images and can be used to analyze targets at different scales. We expect the tool to be useful in diverse applications dealing with structures whose orientation and size distributions are of interest. While designed for the biological field, the software could also be useful in non-biological applications.
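The core spectral idea is compact enough to sketch: the 2-D power spectrum of an oriented texture concentrates perpendicular to the structures, so a power-weighted circular mean of the (doubled) spectral angles yields a mean orientation and an isotropy measure. CytoSpectre's actual estimators are more elaborate; this is a minimal illustration.

```python
# Orientation and isotropy from the 2-D power spectrum of an image.
# A stripe pattern's spectral energy lies perpendicular to the stripes.
import numpy as np

def dominant_orientation(image):
    spec = np.abs(np.fft.fftshift(np.fft.fft2(image))) ** 2
    h, w = spec.shape
    y, x = np.indices(spec.shape)
    y, x = y - h // 2, x - w // 2
    spec[h // 2, w // 2] = 0.0                      # drop the DC term
    angles = np.arctan2(y, x) % np.pi               # orientation is mod 180 deg
    # circular mean over power-weighted doubled angles
    c = (spec * np.cos(2 * angles)).sum()
    s = (spec * np.sin(2 * angles)).sum()
    spectral_angle = 0.5 * np.arctan2(s, c)
    isotropy = 1.0 - np.hypot(c, s) / spec.sum()    # 1 = isotropic, 0 = aligned
    # structures are perpendicular to their spectral energy
    return (spectral_angle + np.pi / 2) % np.pi, isotropy

yy, xx = np.mgrid[0:128, 0:128]
stripes = np.sin(2 * np.pi * (xx * np.cos(0.3) + yy * np.sin(0.3)) / 8.0)
theta, iso = dominant_orientation(stripes)
print(f"orientation = {np.degrees(theta):.1f} deg, isotropy = {iso:.2f}")
```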
AN/UPX-41(C) Test Data Analysis of Impacts to Secondary Surveillance Radars.
DOT National Transportation Integrated Search
2015-02-01
In 2012, the Navy requested spectrum certification for the shipboard AN/UPX-41(C) Digital Interrogator System, Software Version 5.5 with Mode 5. Current operating conditions for the Navy's AN/UPX-41(C) are the same as the restrictions imposed on the AN...
Combining results of multiple search engines in proteomics.
Shteynberg, David; Nesvizhskii, Alexey I; Moritz, Robert L; Deutsch, Eric W
2013-09-01
A crucial component of the analysis of shotgun proteomics datasets is the search engine, an algorithm that attempts to identify the peptide sequence from the parent molecular ion that produced each fragment ion spectrum in the dataset. There are many different search engines, both commercial and open source, each employing a somewhat different technique for spectrum identification. The set of high-scoring peptide-spectrum matches for a defined set of input spectra differs markedly among the various search engine results; individual engines each provide unique correct identifications among a core set of correlative identifications. This has led to the approach of combining the results from multiple search engines to achieve improved analysis of each dataset. Here we review the techniques and available software for combining the results of multiple search engines and briefly compare the relative performance of these techniques.
EML Gamma Spectrometry Data Evaluation Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Decker, Karin M.
2001-01-01
This report presents the results of the analyses for the third EML Gamma Spectrometry Data Evaluation Program (October 1999). This program assists laboratories in providing more accurate gamma spectra analysis results and provides a means for users of gamma data to assess how a laboratory performed on various types of gamma spectrometry analyses. This is accomplished through the use of synthetic gamma spectra. A calibration spectrum, a background spectrum, and three sample spectra are sent to each participant in the spectral file format requested by the laboratory. The calibration spectrum contains nuclides covering the energy range from 59.5 keV to 1836 keV. The participants are told fallout and fission product nuclides could be present. The sample spectra are designed to test the ability of the software and user to properly resolve multiplets and to identify and quantify nuclides in a complicated fission product spectrum. The participants were asked to report values and uncertainties as becquerel per sample with no decay correction. Thirty-one sets of results were reported from a total of 60 laboratories who received the spectra; six foreign laboratories participated. The percentage of results within 1 σ of the expected value was 68, 33, and 46 for samples 1, 2, and 3, respectively. Across all three samples, 18% of the results were more than 3 σ from the expected value. Eighty-three (12%) values out of a total of 682 expected results were not reported for the three samples. Approximately 30% of these false negatives were due to the laboratories not reporting ¹⁴⁴Pr in sample 2, which was present at the minimum detectable activity level. There were 53 false positives reported, with 25% of these responses due to problems with background subtraction. The results show improvement in the ability of the software or user to resolve peaks separated by 1 keV. Improvement is still needed either in the analysis report produced by the software or in the review of these results by the users.
Spectrum image analysis tool - A flexible MATLAB solution to analyze EEL and CL spectrum images.
Schmidt, Franz-Philipp; Hofer, Ferdinand; Krenn, Joachim R
2017-02-01
Spectrum imaging techniques, which simultaneously acquire structural (image) and spectroscopic data, require appropriate and careful processing to extract the information in the dataset. In this article we introduce MATLAB-based software that takes three-dimensional data (an EEL/CL spectrum image in dm3 format, Gatan Inc.'s DigitalMicrograph®) as input. A graphical user interface enables fast and easy mapping of spectrally dependent images and position-dependent spectra. First, data processing such as background subtraction, deconvolution and denoising; second, multiple display options including an EEL/CL moviemaker; and third, applicability to a large number of datasets with a small workload make this program an interesting tool for visualizing otherwise hidden details. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Zemek, Peter G.; Plowman, Steven V.
2010-04-01
Advances in hardware have miniaturized the emissions spectrometer and associated optics, rendering them easily deployed in the field. Such systems are also suitable for vehicle mounting, and can provide high quality data and concentration information in minutes. Advances in software have accompanied this hardware evolution, enabling the development of portable point-and-click OP-FTIR systems that weigh less than 16 lbs. These systems are ideal for first-responders, military, law enforcement, forensics, and screening applications using optical remote sensing (ORS) methodologies. With canned methods and interchangeable detectors, the new generation of OP-FTIR technology is coupled to the latest forward reference-type model software to provide point-and-click technology. These software models have been established for some time. However, refined user-friendly models that use active, passive, and solar occultation methodologies now allow the user to quickly field-screen and quantify plumes, fence-lines, and combustion incident scenarios in high-temporal-resolution. Synthetic background generation is now redundant as the models use highly accurate instrument line shape (ILS) convolutions and several other parameters, in conjunction with radiative transfer model databases to model a single calibration spectrum to collected sample spectra. Data retrievals are performed directly on single beam spectra using non-linear classical least squares (NLCLS). Typically, the Hitran line database is used to generate the initial calibration spectrum contained within the software.
INFOS: spectrum fitting software for NMR analysis.
Smith, Albert A
2017-02-01
Software for fitting of NMR spectra in MATLAB is presented. Spectra are fitted in the frequency domain, using Fourier transformed lineshapes, which are derived using the experimental acquisition and processing parameters. This yields more accurate fits compared to common fitting methods that use Lorentzian or Gaussian functions. Furthermore, a very time-efficient algorithm for calculating and fitting spectra has been developed. The software also performs initial peak picking, followed by subsequent fitting and refinement of the peak list, by iteratively adding and removing peaks to improve the overall fit. Estimation of error on fitting parameters is performed using a Monte-Carlo approach. Many fitting options allow the software to be flexible enough for a wide array of applications, while still being straightforward to set up with minimal user input.
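A minimal sketch of the approach, frequency-domain lineshape fitting with Monte-Carlo error estimation, using a plain Lorentzian for brevity (INFOS derives its lineshapes from the actual acquisition and processing parameters, which is what makes its fits more accurate).

```python
# Frequency-domain lineshape fit with a Monte-Carlo error estimate.
# Plain Lorentzian used for brevity; INFOS uses FT-derived lineshapes.
import numpy as np
from scipy.optimize import curve_fit

def lorentzian(f, amp, f0, fwhm):
    return amp * (fwhm / 2) ** 2 / ((f - f0) ** 2 + (fwhm / 2) ** 2)

rng = np.random.default_rng(2)
f = np.linspace(-50, 50, 1001)                      # Hz
truth = (3.0, 5.0, 8.0)
y = lorentzian(f, *truth) + rng.normal(0, 0.05, f.size)

popt, _ = curve_fit(lorentzian, f, y, p0=(1, 0, 5))
resid_sd = np.std(y - lorentzian(f, *popt))

# Monte-Carlo error estimate: refit synthetic noisy copies of the best fit.
draws = []
for _ in range(200):
    y_mc = lorentzian(f, *popt) + rng.normal(0, resid_sd, f.size)
    p_mc, _ = curve_fit(lorentzian, f, y_mc, p0=popt)
    draws.append(p_mc)
err = np.std(draws, axis=0)
print("fit:", popt, "+/-", err)
```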
Frequency Estimator Performance for a Software-Based Beacon Receiver
NASA Technical Reports Server (NTRS)
Zemba, Michael J.; Morse, Jacquelynne Rose; Nessel, James A.; Miranda, Felix
2014-01-01
As propagation terminals have evolved, their design has trended more toward a software-based approach that facilitates convenient adjustment and customization of the receiver algorithms. One potential improvement is the implementation of a frequency estimation algorithm, through which the primary frequency component of the received signal can be estimated with a much greater resolution than with a simple peak search of the FFT spectrum. To select an estimator for use in a Q/V-band beacon receiver, analysis of six frequency estimators was conducted to characterize their effectiveness as they relate to beacon receiver design.
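To make the improvement concrete, the sketch below contrasts a plain FFT peak search with one common estimator, parabolic interpolation of the log-magnitude around the peak bin. This is one possibility among many, not necessarily one of the six estimators compared in the study.

```python
# FFT peak search vs. parabolic (quadratic) interpolation of the peak bin.
import numpy as np

fs, n = 10_000.0, 4096
t = np.arange(n) / fs
f_true = 1234.56
x = np.cos(2 * np.pi * f_true * t)

spec = np.abs(np.fft.rfft(x * np.hanning(n)))
k = int(np.argmax(spec))
coarse = k * fs / n                                  # plain peak search

# Parabolic interpolation on the log magnitude of the three bins at the peak.
a, b, c = np.log(spec[k - 1: k + 2])
delta = 0.5 * (a - c) / (a - 2 * b + c)              # peak offset in bins, |delta| <= 0.5
fine = (k + delta) * fs / n

print(f"bin spacing      : {fs / n:.3f} Hz")
print(f"peak-search error: {abs(coarse - f_true):.3f} Hz")
print(f"interpolated err : {abs(fine - f_true):.4f} Hz")
```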
Identifying technical aliases in SELDI mass spectra of complex mixtures of proteins
2013-01-01
Background: Biomarker discovery datasets created using mass spectrum protein profiling of complex mixtures of proteins contain many peaks that represent the same protein with different charge states. Correlated variables such as these can confound the statistical analyses of proteomic data. Previously we developed an algorithm that clustered mass spectrum peaks that were biologically or technically correlated. Here we demonstrate an algorithm that clusters correlated technical aliases only. Results: We propose a preprocessing algorithm that can be used for grouping technical aliases in mass spectrometry protein profiling data. The stringency of the variance allowed for clustering is customizable, thereby affecting the number of peaks that are clustered. Subsequent analysis of the clusters, instead of individual peaks, helps reduce difficulties associated with technically correlated data, and can aid more efficient biomarker identification. Conclusions: This software can be used to pre-process and thereby decrease the complexity of protein profiling proteomics data, thus simplifying the subsequent analysis of biomarkers by decreasing the number of tests. The software is also a practical tool for identifying which features to investigate further by purification, identification and confirmation. PMID:24010718
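A minimal sketch of the alias-grouping idea: peaks whose m/z values imply the same neutral mass at different charge states are clustered together. The tolerance, charge range and peak list below are illustrative; the published algorithm uses a customizable variance criterion and also handles other correlations.

```python
# Group charge-state aliases: peaks implying the same neutral mass at
# different charge states land in one cluster. Illustrative parameters.
PROTON = 1.00728  # Da

def neutral_mass(mz, z):
    return z * (mz - PROTON)

def cluster_aliases(peaks, charges=(1, 2, 3), tol=0.5):
    """peaks: observed m/z values. Returns clusters sharing a neutral mass."""
    clusters = []
    for i, mz in enumerate(peaks):
        placed = False
        for cl in clusters:
            ref = cl["mass"]
            if any(abs(neutral_mass(mz, z) - ref) < tol for z in charges):
                cl["members"].append(i)
                placed = True
                break
        if not placed:
            clusters.append({"mass": neutral_mass(mz, 1), "members": [i]})
    return clusters

# An ~11,000 Da protein seen at z = 1 and z = 2, plus an unrelated peak:
print(cluster_aliases([11001.01, 5501.01, 7300.00]))
```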
Finite-Difference Time-Domain Analysis of Tapered Photonic Crystal Fiber
NASA Astrophysics Data System (ADS)
Ali, M. I. Md; Sanusidin, S. N.; Yusof, M. H. M.
2018-03-01
This paper describes the simulation of a tapered single-mode photonic crystal fiber (PCF), type LMA-8, based on correlation of the scattering pattern at a wavelength of 1.55 μm, analysis of the transmission spectrum over the wavelength range 1.0 to 2.5 μm, and correlation of the transmission spectrum with the refractive-index change in the photonic crystal holes for taper ratios of 0.1 to 1.0, using Optiwave simulation software. The main objective is to simulate the tapered LMA-8 PCF using the Finite-Difference Time-Domain (FDTD) technique for sensing applications, improving the capabilities of the PCF without collapsing the crystal holes. The FDTD techniques used are scattering pattern and transverse transmission, and principal component analysis (PCA) is used as a mathematical tool to model the data obtained with MathCad software. The simulation results showed no obvious correlation in the scattering pattern at a wavelength of 1.55 μm, a correlation between taper size and transverse transmission, and a parabolic relationship with the refractive-index changes inside the crystal structure.
Remediation of Deficits in Recognition of Facial Emotions in Children with Autism Spectrum Disorders
ERIC Educational Resources Information Center
Weinger, Paige M.; Depue, Richard A.
2011-01-01
This study evaluated the efficacy of the Mind Reading interactive computer software to remediate emotion recognition deficits in children with autism spectrum disorders (ASD). Six unmedicated children with ASD and 11 unmedicated non-clinical control subjects participated in the study. The clinical sample used the software for five sessions. The…
Hybrid PV/diesel solar power system design using multi-level factor analysis optimization
NASA Astrophysics Data System (ADS)
Drake, Joshua P.
Solar power systems represent a large area of interest across a spectrum of organizations at a global level. It was determined that a clear understanding of current state-of-the-art software and design methods, as well as optimization methods, could be used to improve the design methodology. Solar power design literature was researched for an in-depth understanding of solar power system design methods and algorithms. Multiple software packages for the design and optimization of solar power systems were analyzed for a critical understanding of their design workflow. In addition, several methods of optimization were studied, including brute force, Pareto analysis, Monte Carlo, linear and nonlinear programming, and multi-way factor analysis. Factor analysis was selected as the most efficient optimization method for engineering design as applied to solar power system design. The solar power design algorithms, software workflow analysis, and factor analysis optimization were combined to develop a solar power system design optimization software package called FireDrake. This software was used for the design of multiple solar power systems in conjunction with an energy audit case study performed in seven Tibetan refugee camps located in Mainpat, India. A report of solar system designs for the camps, as well as a proposed schedule for future installations, was generated. It was determined that there were several improvements that could be made to the state of the art in modern solar power system design, though the complexity of current applications is significant.
AESOP 3.0 Highlights: Afloat Electromagnetic Spectrum Operations Program
2011-03-01
Highlights include Joint Restricted Frequency List (JRFL) support per MCEB Pub 8, Version 2.0.1 (1 July 2010); an enhanced mapping capability with 2-D and 3-D maps that includes JRFL frequencies; and the Satellite Availability & Analysis (SA2) module (AESOP 3.0 - SA2 v5.7.2 software).
Interdisciplinary Investigations in Support of Project DI-MOD
NASA Technical Reports Server (NTRS)
Starks, Scott A. (Principal Investigator)
1996-01-01
Various concepts from time series analysis are used as the basis for the development of algorithms to assist in the analysis and interpretation of remotely sensed imagery. An approach to trend detection based upon the fractal analysis of power spectrum estimates is presented; a sketch of the idea follows. Additionally, research was conducted toward the development of a software architecture to support processing tasks associated with databases housing a variety of data. An algorithmic approach which provides for the automation of the state monitoring process is presented.
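A small sketch of the spectral-slope idea behind such trend detection: for a fractal (1/f^β) series the log-log power spectrum is linear, so the fitted slope β characterizes the series. The synthetic series below stand in for image transects.

```python
# Spectral exponent from a log-log fit of the Welch power spectrum estimate.
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(3)
white = rng.normal(size=8192)
brown = np.cumsum(white)                 # integrated noise: beta ~ 2

for name, series in [("white", white), ("brown", brown)]:
    f, pxx = welch(series, nperseg=1024)
    keep = f > 0
    slope, _ = np.polyfit(np.log(f[keep]), np.log(pxx[keep]), 1)
    print(f"{name:5s} spectral exponent beta ~ {-slope:.2f}")
```

A change in the fitted exponent along a sequence of transects would flag a change in surface texture, i.e., a candidate trend.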
The SPORT-NMR Software: A Tool for Determining Relaxation Times in Unresolved NMR Spectra
NASA Astrophysics Data System (ADS)
Geppi, Marco; Forte, Claudia
1999-03-01
A software package which allows the correct determination of individual relaxation times for all the nonequivalent nuclei in poorly resolved NMR spectra is described. The procedure used, based on fitting each spectrum in the series recorded in the relaxation experiment, should improve the analysis of relaxation data in terms of quantitative dynamic information, especially in anisotropic phases. Tests on simulated data and experimental examples concerning ¹H and ¹³C T₁ρ measurements in a solid copolymer and ²H T₁Z and T₁Q measurements in a liquid crystal are shown and discussed.
Analysis and fit of stellar spectra using a mega-database of CMFGEN models
NASA Astrophysics Data System (ADS)
Fierro-Santillán, Celia; Zsargó, Janos; Klapp, Jaime; Díaz-Azuara, Santiago Alfredo; Arrieta, Anabel; Arias, Lorena
2017-11-01
We present a tool for analysis and fitting of stellar spectra using a mega-database of 15,000 atmosphere models for OB stars. We have developed software tools which allow us to find the model that best fits an observed spectrum by comparing equivalent widths and line ratios in the observed spectrum with all models in the database. We use the Hα, Hβ, Hγ, and Hδ lines as criteria for stellar gravity, and the ratios He II λ4541/He I λ4471, He II λ4200/(He I+He II λ4026), He II λ4541/He I λ4387, and He II λ4200/He I λ4144 as criteria for Teff.
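A hedged sketch of the matching step: measure equivalent widths of the diagnostic lines, form ratios like those listed above, and rank the model grid by agreement. The tiny grid and its values are invented for illustration; the actual database holds 15,000 CMFGEN models.

```python
# Equivalent widths and He line-ratio matching against a toy model grid.
import numpy as np

def equivalent_width(wl, flux, lo, hi):
    """EW over [lo, hi], assuming the continuum is normalized to 1."""
    m = (wl >= lo) & (wl <= hi)
    return np.trapz(1.0 - flux[m], wl[m])

# Synthetic He I 4471 absorption line, EW ~ 0.6 * 1.2 * sqrt(2*pi) ~ 1.8 A
wl = np.linspace(4460, 4480, 400)
flux = 1.0 - 0.6 * np.exp(-0.5 * ((wl - 4471.0) / 1.2) ** 2)
print("EW of He I 4471:", equivalent_width(wl, flux, 4466, 4476), "Angstrom")

def ratio_distance(observed_ratios, model_ratios):
    o, m = np.array(observed_ratios), np.array(model_ratios)
    return np.sum(((o - m) / o) ** 2)

# models: Teff -> (HeII 4541/HeI 4471, HeII 4200/HeI 4144) -- invented values
grid = {30000: (0.45, 0.40), 35000: (0.90, 0.85), 40000: (1.60, 1.50)}
observed = (0.95, 0.80)
best = min(grid, key=lambda teff: ratio_distance(observed, grid[teff]))
print("best-fitting Teff:", best)
```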
NASA Technical Reports Server (NTRS)
Funk, Christie J.; Perry, Boyd, III; Silva, Walter A.; Newman, Brett
2014-01-01
A software program and associated methodology to study gust loading on aircraft exists for a class of geometrically simplified flexible configurations. This program consists of a simple aircraft response model with two rigid and three flexible symmetric degrees-of-freedom and allows for the calculation of various airplane responses due to a discrete one-minus-cosine gust as well as continuous turbulence. Simplifications, assumptions, and opportunities for potential improvement pertaining to the existing software program are first identified; then a revised version of the original software tool is developed with improved methodology to include more complex geometries, additional excitation cases, and additional output data, so as to provide a more useful and precise tool for gust load analysis. To improve the usefulness of the original program, a wing control surface and a horizontal tail control surface are added, an extended application of the discrete one-minus-cosine gust input is employed, a supplemental continuous turbulence spectrum is implemented, and a capability to animate the total vehicle deformation response to gust inputs is included. These revisions and enhancements are implemented, and an analysis of the results is used to validate the modifications.
Trade Study for Neutron Transport at Low Earth Orbit: Adding Fidelity to DIORAMA
DOE Office of Scientific and Technical Information (OSTI.GOV)
McClanahan, Tucker Caden; Wakeford, Daniel Tyler
The Distributed Infrastructure Offering Real-Time Access to Modeling and Analysis (DIORAMA) software provides performance modeling capabilities for the United States Nuclear Detonation Detection System (USNDS), with a focus on the characterization of Space-Based Nuclear Detonation Detection (SNDD) instrument performance [1]. A case study was done to extend the neutron propagation capabilities of DIORAMA to low earth orbit (LEO) and to compare the incident energy back-calculated from the time-of-flight (TOF) spectrum with the scored incident energy spectrum. As the scoring altitude lowers, the time increase due to scattering takes up a much larger fraction of the total TOF, whereas at geosynchronous earth orbit (GEO) the time increase due to scattering is a negligible fraction of the total TOF [2]. The scattering smears out the TOF enough to make the back-calculation of the initial energy spectrum from the TOF spectrum very convoluted.
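The back-calculation rests on the time-of-flight relation, sketched below in its relativistic form, which reduces to E = m(d/t)²/2 for slow neutrons. Scattering lengthens t and so biases the inferred energy low, which is the effect that grows as the scoring altitude drops. The path length and timing numbers are illustrative only.

```python
# Incident neutron energy from time of flight over a known path length.
NEUTRON_MASS_MEV = 939.565           # m_n c^2 in MeV
C_KM_S = 299_792.458                 # speed of light, km/s

def energy_from_tof(path_km, tof_s):
    beta = (path_km / tof_s) / C_KM_S
    # relativistic kinetic energy; ~0.5 * m * v^2 for beta << 1
    return NEUTRON_MASS_MEV * (1.0 / (1.0 - beta ** 2) ** 0.5 - 1.0)

# A 1 MeV neutron over a GEO-like 36,000 km path arrives in about 2.6 s:
print(energy_from_tof(36_000.0, 2.6))   # ~ 1 MeV
```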
Nonlinear Simulation of the Tooth Enamel Spectrum for EPR Dosimetry
NASA Astrophysics Data System (ADS)
Kirillov, V. A.; Dubovsky, S. V.
2016-07-01
Software was developed in which initial EPR spectra of tooth enamel are deconvoluted based on nonlinear simulation, line shapes and signal amplitudes in the model initial spectrum are calculated, the regression coefficient is evaluated, and individual spectra are summed. Software validation demonstrated that doses calculated with it agreed excellently with the applied radiation doses and with doses reconstructed by the method of additive doses.
Monte Carlo Simulations for VLBI2010
NASA Astrophysics Data System (ADS)
Wresnik, J.; Böhm, J.; Schuh, H.
2007-07-01
Monte Carlo simulations are carried out at the Institute of Geodesy and Geophysics (IGG), Vienna, and at Goddard Space Flight Center (GSFC), Greenbelt (USA), with the goal of designing a new geodetic Very Long Baseline Interferometry (VLBI) system. The influences of the schedule, the network geometry and the main stochastic processes on the geodetic results are investigated. Schedules are prepared with the software package SKED (Vandenberg 1999), and different strategies are applied to produce temporally very dense schedules, which are compared in terms of baseline length repeatabilities. For the simulation of VLBI observations, a Monte Carlo simulator was set up which creates artificial observations by randomly simulating wet zenith delay and clock values as well as additive white noise representing the antenna errors. For the simulation at IGG, the VLBI analysis software OCCAM (Titov et al. 2004) was adapted. Random walk processes with power spectral densities of 0.7 and 0.1 psec²/sec are used for the simulation of wet zenith delays. The clocks are simulated with Allan standard deviations of 1×10⁻¹⁴ @ 50 min and 2×10⁻¹⁵ @ 15 min, and three levels of white noise (4 psec, 8 psec, and 16 psec) are added to the artificial observations. The variations of the power spectral densities of the clocks and wet zenith delays, and the application of different white noise levels, show clearly that the wet delay is the critical factor for the improvement of the geodetic VLBI system. At GSFC the software CalcSolve is used for the VLBI analysis, so a comparison between the software packages OCCAM and CalcSolve was done with simulated data. For further simulations the wet zenith delay was modeled by a turbulence model; these data were provided by T. Nilsson and added to the simulation work. Different schedules have been run.
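A minimal sketch of the stochastic core of such a simulator: a random-walk wet zenith delay whose increments follow a chosen power spectral density, plus white observation noise. The values mirror those quoted above; the clock model (Allan standard deviation) is omitted for brevity, and the one-minute cadence is an assumption for the example.

```python
# Random-walk wet zenith delay (PSD 0.7 psec^2/sec) plus 8 psec white noise.
import numpy as np

def random_walk_delay(n_obs, dt_sec, psd_ps2_per_s, rng):
    """Random walk whose increments have variance PSD * dt."""
    steps = rng.normal(0.0, np.sqrt(psd_ps2_per_s * dt_sec), n_obs)
    return np.cumsum(steps)

rng = np.random.default_rng(4)
dt = 60.0                                    # one observation per minute (assumed)
wet = random_walk_delay(1440, dt, 0.7, rng)  # one day of delays, in picoseconds
obs = wet + rng.normal(0.0, 8.0, wet.size)   # 8 psec white noise per observation
print(f"simulated wet delay spread over the day: {np.ptp(wet):.0f} ps")
```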
Near-infrared face recognition utilizing open CV software
NASA Astrophysics Data System (ADS)
Sellami, Louiza; Ngo, Hau; Fowler, Chris J.; Kearney, Liam M.
2014-06-01
Commercially available hardware, freely available algorithms, and software developed by the authors are synergized successfully to detect and recognize subjects in an environment without visible light. This project integrates three major components: an illumination device operating in the near-infrared (NIR) spectrum, a NIR-capable camera, and a software algorithm capable of performing image manipulation, facial detection and recognition. Focusing our efforts on the near-infrared spectrum allows the low-budget system to operate covertly while still allowing for accurate face recognition. In doing so, a valuable capability has been developed which presents potential benefits in future civilian and military security and surveillance operations.
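A minimal detection sketch of the kind such a system builds on, using OpenCV's bundled Haar cascade; a NIR frame is single-channel, which suits the grayscale cascade directly. The file names are illustrative, and recognition (matching a detected face to an identity) would follow as a separate step.

```python
# Face detection on a NIR (grayscale) frame with OpenCV's bundled cascade.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

# Illustrative filename; assumes the NIR frame exists on disk.
frame = cv2.imread("nir_frame.png", cv2.IMREAD_GRAYSCALE)
faces = cascade.detectMultiScale(frame, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    cv2.rectangle(frame, (x, y), (x + w, y + h), 255, 2)
cv2.imwrite("detections.png", frame)
print(f"{len(faces)} face(s) detected")
```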
NASA Astrophysics Data System (ADS)
Nahhas, Tariq M.
2011-03-01
This paper presents a comparison of the seismic forces generated from a Modal Response Spectrum Analysis (MRSA) by applying the provisions of two building codes, the 1997 Uniform Building Code (UBC) and the 2000-2009 International Building Code (IBC), to the most common ordinary residential buildings of standard occupancy. Considering IBC as the state of the art benchmark code, the primary concern is the safety of buildings designed using the UBC as compared to those designed using the IBC. A sample of four buildings with different layouts and heights was used for this comparison. Each of these buildings was assumed to be located at four different geographical sample locations arbitrarily selected to represent various earthquake zones on a seismic map of the USA, and was subjected to code-compliant response spectrum analyses for all sample locations and for five different soil types at each location. Response spectrum analysis was performed using the ETABS software package. For all the cases investigated, the UBC was found to be significantly more conservative than the IBC. The UBC design response spectra have higher spectral accelerations, and as a result, the response spectrum analysis provided a much higher base shear and moment in the structural members as compared to the IBC. The conclusion is that ordinary office and residential buildings designed using UBC 1997 are considered to be overdesigned, and therefore they are quite safe even according to the IBC provisions.
Forgács, Attila; Balkay, László; Trón, Lajos; Raics, Péter
2014-12-01
Excel2Genie, a simple and user-friendly Microsoft Excel interface, has been developed for the Genie-2000 Spectroscopic Software of Canberra Industries. This Excel application can directly control a Canberra Multichannel Analyzer (MCA), process the acquired data, and visualize them. Combining Genie-2000 with Excel2Genie results in remarkably increased flexibility and makes it possible to carry out repetitive data acquisitions, even with changing parameters, and more sophisticated analysis. The developed software package comprises three worksheets that display the parameters and results of data acquisition, data analysis, and mathematical operations carried out on the measured gamma spectra, and that also allow control of these processes. Excel2Genie is freely available to assist gamma spectrum measurements and data evaluation by interested Canberra users. With access to the Visual Basic for Applications (VBA) source code of the application, users can modify the developed interface according to their intentions.
SolTrack: an automatic video processing software for in situ interface tracking.
Griesser, S; Pierer, R; Reid, M; Dippenaar, R
2012-10-01
High-resolution in situ observation of solidification experiments has become a powerful technique for improving the fundamental understanding of solidification processes in metals and alloys. In the present study, high-temperature laser-scanning confocal microscopy (HTLSCM) was utilized to observe and capture in situ solidification and phase transformations of alloys for subsequent post-processing and analysis. Until now, this analysis has been very time-consuming, as frame-by-frame manual evaluation of propagating interfaces was used to determine the interface velocities. SolTrack has been developed using the commercial software package MATLAB and is designed to automatically detect, locate, and track propagating interfaces during solidification and phase transformations, as well as to calculate interfacial velocities. Different solidification phenomena have been recorded to demonstrate the wide spectrum of applications of this software. A validation against manual evaluation is included, which shows the accuracy to be very high.
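SolTrack's MATLAB implementation is not reproduced in the abstract; purely as an illustration of the underlying idea, a crude thresholding tracker over a stack of grayscale frames could look like the following sketch (the function name, threshold, and scale factors are hypothetical):

```python
# Crude interface tracker: locate a bright front per frame, then differentiate.
import numpy as np

def interface_positions(frames, threshold):
    """For each 2D grayscale frame, return the mean column index where
    intensity first exceeds `threshold` along each row, a rough proxy
    for the position of a bright solidification front."""
    positions = []
    for frame in frames:
        cols = np.argmax(frame > threshold, axis=1).astype(float)
        positions.append(cols[cols > 0].mean())
    return np.asarray(positions)

# Velocity in pixels/frame; multiply by (um/pixel) * fps to get um/s:
# vel = np.gradient(interface_positions(frames, 0.5))
```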
SpcAudace: Spectroscopic processing and analysis package of Audela software
NASA Astrophysics Data System (ADS)
Mauclaire, Benjamin
2017-11-01
SpcAudace processes long-slit spectra with automated pipelines and performs astrophysical analysis of the resulting data. These powerful pipelines perform all the required steps in one pass: standard preprocessing, masking of bad pixels, geometric corrections, registration, optimized spectrum extraction, wavelength calibration, and instrumental response computation and correction. Both high- and low-resolution long-slit spectra are managed, for stellar and non-stellar targets. Many types of publication-quality figures can be easily produced: pdf and png plots or annotated time series plots. Astrophysical quantities can be derived from individual spectra or large sets of spectra with advanced functions: from line profile characteristics to equivalent width and periodogram. More than 300 documented functions are available and can be used in Tcl scripts for automation. SpcAudace is based on the Audela open source software.
Can Automated Facial Expression Analysis Show Differences Between Autism and Typical Functioning?
Borsos, Zsófia; Gyori, Miklos
2017-01-01
Exploratory analyses of emotional expressions using commercially available facial expression recognition software are reported, in the context of a serious game for screening purposes. Our results are based on a comparative analysis of two matched groups of kindergarten-age children (high-functioning children with autism spectrum condition: n=13; typically developing children: n=13). The results indicate that this technology has the potential to identify autism-specific emotion expression features and may play a role in affective diagnostic and assistive technologies.
Choe, Sanggil; Kim, Suncheun; Choi, Hyeyoung; Choi, Hwakyoung; Chung, Heesun; Hwang, Bangyeon
2010-06-15
Agilent GC-MS MSD Chemstation offers an automated library search report for toxicological screening using the total ion chromatogram (TIC) and mass spectra in normal mode. Numerous peaks appear in the chromatogram of biological specimens such as blood or urine, and large migrating peaks often obscure small target peaks; in addition, target peaks of low abundance regularly give wrong library search results or low matching scores. As a result, the retention time and mass spectrum of all the peaks in the chromatogram have to be checked to see whether they are relevant. These repeated actions are very tedious and time-consuming for toxicologists. The MSD Chemstation software operates using a number of macro files which give commands and instructions on how to work on and extract data from the chromatograms and spectra. These macro files are built with the software's own compiler. All the original macro files can be modified, and new macro files can be added to the original software by users. To get more accurate results with a more convenient method, and to save time in data analysis, we developed new macro files for report generation and inserted new menus in the Enhanced Data Analysis program. Toxicological screening reports generated by these new macro files are in text or graphic mode, and these reports can be generated with three different automated subtraction options. Text reports have a Brief mode and a Full mode, and graphic reports have an option with or without a mass spectrum mode. The matched mass spectrum and matching score for detected compounds are printed in the reports by modified library searching modules. We have also developed an independent application program named DrugMan. This program manages the drug groups, lists, and parameters that are in use in MSD Chemstation. The incorporation of DrugMan with the modified macro modules provides a powerful tool for toxicological screening and saves a great deal of valuable time in toxicological work.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thoreson, Gregory G
PCF files are binary files designed to contain gamma spectra and neutron count rates from radiation sensors. It is the native format for the GAmma Detector Response and Analysis Software (GADRAS) package [1]. A PCF file can contain multiple spectra and information about each spectrum, such as its energy calibration. This document outlines the file format so that one can write a computer program to parse and write such files.
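Since the abstract does not reproduce the format itself, the following is a purely hypothetical parsing skeleton: the field names, offsets, and channel count below are placeholders, and any real reader must follow the field-by-field layout given in the GADRAS documentation.

```python
# Hypothetical binary-spectrum parsing sketch (NOT the real PCF layout).
import struct

def read_spectrum(path, n_channels=1024):
    with open(path, "rb") as f:
        raw = f.read()
    # Assume a header of two little-endian floats (live/real time) followed
    # by n_channels float32 counts; purely illustrative placeholder fields.
    live_time, real_time = struct.unpack_from("<2f", raw, 0)
    counts = struct.unpack_from(f"<{n_channels}f", raw, 8)
    return live_time, real_time, counts
```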
Turner, Clare E; Russell, Bruce R; Gant, Nicholas
2015-11-01
Magnetic resonance spectroscopy (MRS) is an analytical procedure that can be used to non-invasively measure the concentration of a range of neural metabolites. Creatine is an important neurometabolite, with dietary supplementation offering therapeutic potential for neurological disorders with dysfunctional energetic processes. Neural creatine concentrations can be probed using proton MRS and quantified using a range of software packages based on different analytical methods. This experiment examines the differences in quantification performance of two commonly used analysis packages following a creatine supplementation strategy with potential therapeutic application. Human participants followed a seven-day dietary supplementation regime in a placebo-controlled, cross-over design interspersed with a five-week wash-out period. Spectroscopy data were acquired the day immediately following supplementation and analyzed with two commonly used software packages that employ vastly different quantification methods. Results demonstrate that neural creatine concentration was augmented following creatine supplementation when analyzed using the peak-fitting method of quantification (105.9%±10.1). In contrast, no change in neural creatine levels was detected with supplementation when analysis was conducted using the basis-spectrum method of quantification (102.6%±8.6). The results suggest that software packages employing a peak-fitting procedure for spectral quantification are possibly more sensitive to subtle changes in neural creatine concentrations. The relative simplicity of the spectroscopy sequence and the data analysis procedure suggests that peak-fitting procedures may be the most effective means of metabolite quantification when detection of subtle alterations in neural metabolites is necessary. The straightforward technique can be used on a clinical magnetic resonance imaging system.
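To illustrate what peak-fitting quantification means in generic terms (a sketch of the general idea, not the algorithm of any particular MRS package), one can fit a Lorentzian line shape to the creatine resonance and take its fitted area as the concentration estimate:

```python
# Generic peak-fitting sketch: Lorentzian fit to a single resonance.
import numpy as np
from scipy.optimize import curve_fit

def lorentzian(ppm, area, center, fwhm):
    hwhm = fwhm / 2.0
    return area * hwhm / (np.pi * ((ppm - center) ** 2 + hwhm ** 2))

# ppm, signal = ...  # 1D spectrum around the ~3.0 ppm creatine peak (assumed)
# popt, _ = curve_fit(lorentzian, ppm, signal, p0=[1.0, 3.03, 0.05])
# creatine_estimate = popt[0]  # fitted peak area as the quantity of interest
```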
Tokamak-independent software analysis suite for multi-spectral line-polarization MSE diagnostics
Scott, S. D.; Mumgaard, R. T.
2016-07-20
A tokamak-independent analysis suite has been developed to process data from Motional Stark Effect (MSE) diagnostics. The software supports multi-spectral line-polarization MSE diagnostics which simultaneously measure emission at the MSE σ and π lines as well as at two "background" wavelengths that are displaced from the MSE spectrum by a few nanometers. This analysis accurately estimates the amplitude of partially polarized background light at the σ and π wavelengths even in situations where the background light changes rapidly in time and space, a distinct improvement over traditional "time-interpolation" background estimation. The signal amplitude at many frequencies is computed using a numerical-beat algorithm which allows the retardance of the MSE photo-elastic modulators (PEMs) to be monitored during routine operation. It also allows the use of summed intensities at multiple frequencies in the calculation of polarization direction, which increases the effective signal strength and reduces sensitivity to PEM retardance drift. The software allows the polarization angles to be corrected for calibration drift using a system that illuminates the MSE diagnostic with polarized light at four known polarization angles within ten seconds of a plasma discharge. The software suite is modular, parallelized, and portable to other facilities.
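The numerical-beat algorithm itself is not given in the abstract; as a generic illustration of the building block such PEM-based polarimetry relies on, measuring the signal amplitude at a set of analysis frequencies by digital demodulation might look like the following sketch (all names, and the final angle formula in the comment, are schematic assumptions):

```python
# Digital demodulation sketch: amplitude of a signal at chosen frequencies.
import numpy as np

def amplitude_at(signal, fs, freqs):
    t = np.arange(signal.size) / fs
    amps = []
    for f in freqs:
        i = np.mean(signal * np.cos(2 * np.pi * f * t))
        q = np.mean(signal * np.sin(2 * np.pi * f * t))
        amps.append(2.0 * np.hypot(i, q))   # amplitude of the f component
    return np.asarray(amps)

# Schematic polarization angle from second harmonics of two PEMs:
# gamma = 0.5 * np.arctan2(A_pem2_2f, A_pem1_2f)
```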
NASA Technical Reports Server (NTRS)
Funk, Christie J.
2013-01-01
A software program and associated methodology to study gust loading on aircraft exists for a class of geometrically simplified flexible configurations. This program consists of a simple aircraft response model with two rigid and three flexible symmetric degrees of freedom and allows for the calculation of various airplane responses due to a discrete one-minus-cosine gust as well as continuous turbulence. Simplifications, assumptions, and opportunities for potential improvement pertaining to the existing software program are first identified; then a revised version of the original software tool is developed, with improved methodology, to include more complex geometries, additional excitation cases, and output data, so as to provide a more useful and accurate tool for gust load analysis. Revisions are made in the categories of aircraft geometry, computation of aerodynamic forces and moments, and implementation of horizontal tail mode shapes. To enhance the usefulness of the original software program, wing and horizontal tail control surfaces are added, an extended application of the discrete one-minus-cosine gust input is employed, a supplemental continuous turbulence spectrum is implemented, and a capability to animate the total vehicle deformation response to gust inputs is included. These revisions and enhancements are implemented, and an analysis of the results is used to validate the modifications.
Experiences with Integrating Simulation into a Software Engineering Curriculum
ERIC Educational Resources Information Center
Bollin, Andreas; Hochmuller, Elke; Mittermeir, Roland; Samuelis, Ladislav
2012-01-01
Software Engineering education must account for a broad spectrum of knowledge and skills software engineers will be required to apply throughout their professional life. Covering all the topics in depth within a university setting is infeasible due to curricular constraints as well as due to the inherent differences between educational…
Computerized Doppler Tomography and Spectrum Analysis of Carotid Artery Flow
Morton, Paul; Goldman, Dave; Nichols, W. Kirt
1981-01-01
Contrast angiography remains the definitive study in the evaluation of atherosclerotic occlusive vascular disease. However, a safer technique for serial screening of symptomatic patients and for routine follow-up is necessary. Computerized pulsed Doppler ultrasonic arteriography is a noninvasive technique developed by Miles [6] for imaging lateral, antero-posterior, and transverse sections of the carotid artery. We augmented this system with new software and hardware to analyze the three-dimensional blood flow data. The system now provides information about the location of the occlusive process in the artery and a semi-quantitative evaluation of the degree of obstruction. In addition, we interfaced a digital signal analyzer to the system, which permits spectrum analysis of the pulsed Doppler signal. This addition has allowed us to identify lesions which are not yet hemodynamically significant.
Heuristics to Evaluate Interactive Systems for Children with Autism Spectrum Disorder (ASD).
Khowaja, Kamran; Salim, Siti Salwah; Asemi, Adeleh
2015-01-01
In this paper, we adapted and expanded a set of guidelines, also known as heuristics, for evaluating the usability of software so that they are appropriate for software aimed at children with autism spectrum disorder (ASD). We started from the heuristics developed by Nielsen in 1990 and developed a modified set of 15 heuristics. The first 5 heuristics of this set are the same as those of the original Nielsen set, the next 5 heuristics are improved versions of Nielsen's, and the last 5 heuristics are new. We present two evaluation studies of our new heuristics. In the first, two groups compared Nielsen's set with the modified set of heuristics, with each group evaluating two interactive systems. Nielsen's heuristics were assigned to the control group while the experimental group was given the modified set, and a statistical analysis was conducted to determine the effectiveness of the modified set, the contribution of the 5 new heuristics, and the impact of the 5 improved heuristics. The results show that the modified set is significantly more effective than the original, and we found a significant difference between the five improved heuristics and their corresponding heuristics in the original set. The five new heuristics are effective in problem identification using the modified set. The second study was conducted using a system which was developed to ascertain whether the modified set was effective at identifying usability problems that could be fixed before the release of software. The post-study analysis revealed that the majority of the usability problems identified by the experts were fixed in the updated version of the system.
Standardless quantification by parameter optimization in electron probe microanalysis
NASA Astrophysics Data System (ADS)
Limandri, Silvina P.; Bonetto, Rita D.; Josa, Víctor Galván; Carreras, Alejo C.; Trincavelli, Jorge C.
2012-11-01
A method for standardless quantification by parameter optimization in electron probe microanalysis is presented. The method consists of minimizing the quadratic differences between an experimental spectrum and an analytical function proposed to describe it, by optimizing the parameters involved in the analytical prediction. This algorithm, implemented in the software POEMA (Parameter Optimization in Electron Probe Microanalysis), allows the determination of the elemental concentrations along with their uncertainties. The method was tested on a set of 159 elemental constituents corresponding to 36 spectra of standards (mostly minerals) that include trace elements. The results were compared with those obtained with the commercial software GENESIS Spectrum® for standardless quantification. The quantifications performed with the method proposed here are better in 74% of the cases studied. In addition, the performance of the proposed method is compared with the first-principles standardless analysis procedure DTSA for a different data set, which excludes trace elements. The relative deviations with respect to the nominal concentrations are lower than 0.04, 0.08, and 0.35 in 66% of the cases for POEMA, GENESIS, and DTSA, respectively.
Frequency Spectrum Method-Based Stress Analysis for Oil Pipelines in Earthquake Disaster Areas
Wu, Xiaonan; Lu, Hongfang; Huang, Kun; Wu, Shijuan; Qiao, Weibiao
2015-01-01
When a long-distance oil pipeline crosses an earthquake disaster area, inertial force and strong ground motion can cause the pipeline stress to exceed the failure limit, resulting in bending and deformation failure. To date, researchers have performed only limited safety analyses of oil pipelines in earthquake disaster areas that include stress analysis. Therefore, using the spectrum method and the theory of one-dimensional beam units, CAESAR II is used to perform a dynamic earthquake analysis for an oil pipeline in the XX earthquake disaster area. The software is used to determine whether the displacement and stress of the pipeline meet the standards when subjected to a strong earthquake. After performing the numerical analysis, the primary seismic-action axial, longitudinal, and horizontal displacement directions and the critical section of the pipeline can be located. Feasible project enhancement suggestions based on the analysis results are proposed. Designers can use this stress analysis method to perform an ultimate design for oil pipelines in earthquake disaster areas, thereby improving the safe operation of the pipeline.
Kansara, Seema; Blieden, Lauren S.; Chuang, Alice Z.; Baker, Laura A.; Bell, Nicholas P.; Mankiewicz, Kimberly A.; Feldman, Robert M.
2015-01-01
Purpose: To evaluate the change in trabecular-iris circumference volume (TICV) after laser peripheral iridotomy (LPI) in primary angle closure (PAC) spectrum eyes. Patients and Methods: Forty-two chronic PAC spectrum eyes from 24 patients were enrolled. Eyes with anterior chamber abnormalities affecting angle measurement were excluded. Intraocular pressure, slit lamp exam, and gonioscopy were recorded at each visit. Anterior segment optical coherence tomography (ASOCT) with 3D mode angle analysis scans were taken with the CASIA SS-1000 (Tomey Corp., Nagoya, Japan) before and after LPI. Forty-two pre-LPI ASOCT scans and 34 post-LPI ASOCT scans were analyzed using the Anterior Chamber Analysis and Interpretation (ACAI, Houston, TX) software. A mixed-effect model analysis was used to compare the trabecular-iris space area (TISA) changes among the 4 quadrants, as well as to identify potential factors affecting TICV. Results: There was a significant increase in all average angle parameters after LPI (TISA500, TISA750, TICV500, and TICV750). The magnitude of change in TISA500 in the superior angle was significantly less than in the other angles. The changes in TICV500 and TICV750 were not associated with any demographic or ocular characteristics. Conclusion: TICV is a useful parameter for quantitatively measuring the effectiveness of LPI in the treatment of eyes with PAC spectrum disease.
Chang, Yen-Liang; Hung, Chao-Ho; Chen, Po-Yueh; Chen, Wei-Chang; Hung, Shih-Han
2015-10-01
Acoustic analysis is often used in speech evaluation but seldom for the evaluation of oral prostheses designed for the reconstruction of surgical defects. This study introduces the application of acoustic analysis for patients with velopharyngeal insufficiency (VPI) due to oral surgery who were rehabilitated with oral speech-aid prostheses. The acoustic features of sustained vowel sounds from two patients with VPI, before and after prosthetic rehabilitation, were analyzed and compared using the acoustic analysis software Praat. There were significant differences in the octave spectrum of sustained vowel speech sounds between pre- and post-prosthetic rehabilitation. Acoustic measurements of sustained vowels for patients before and after prosthetic treatment showed no significant differences for the parameters of fundamental frequency, jitter, shimmer, noise-to-harmonics ratio, formant frequency, F1 bandwidth, and band energy difference. The decrease in perceived nasality correlated very well with the decrease in dips of the spectra for the male patient with a higher speech bulb height. Acoustic analysis may be a potential technique for evaluating the function of oral speech-aid prostheses, which eliminate dysfunction due to the surgical defect and contribute to a high percentage of intelligible speech. Octave spectrum analysis may also be a valuable tool for detecting changes in the nasality characteristics of the voice during prosthetic treatment of VPI.
Veit, Johannes; Sachsenberg, Timo; Chernev, Aleksandar; Aicheler, Fabian; Urlaub, Henning; Kohlbacher, Oliver
2016-09-02
Modern mass spectrometry setups used in today's proteomics studies generate vast amounts of raw data, calling for highly efficient data processing and analysis tools. Software for analyzing these data is either monolithic (easy to use, but sometimes too rigid) or workflow-driven (easy to customize, but sometimes complex). Thermo Proteome Discoverer (PD) is a powerful software for workflow-driven data analysis in proteomics which, in our eyes, achieves a good trade-off between flexibility and usability. Here, we present two open-source plugins for PD providing additional functionality: LFQProfiler for label-free quantification of peptides and proteins, and RNP(xl) for UV-induced peptide-RNA cross-linking data analysis. LFQProfiler interacts with existing PD nodes for peptide identification and validation and takes care of the entire quantitative part of the workflow. We show that it performs at least on par with other state-of-the-art software solutions for label-free quantification in a recently published benchmark (Ramus, C.; J. Proteomics 2016, 132, 51-62). The second workflow, RNP(xl), represents the first software solution to date for the identification of peptide-RNA cross-links, including automatic localization of the cross-links at amino acid resolution and localization scoring. It comes with a customized integrated cross-link fragment spectrum viewer for convenient manual inspection and validation of the results.
Completely automated open-path FT-IR spectrometry.
Griffiths, Peter R; Shao, Limin; Leytem, April B
2009-01-01
Atmospheric analysis by open-path Fourier-transform infrared (OP/FT-IR) spectrometry has been possible for over two decades but has not been widely used because of the limitations of the software of commercial instruments. In this paper, we describe the current state of the art of the hardware and software that constitute a contemporary OP/FT-IR spectrometer. We then describe advances that have been made in our laboratory that have enabled many of the limitations of this type of instrument to be overcome. These include not having to acquire a single-beam background spectrum that compensates for absorption features in the spectra of atmospheric water vapor and carbon dioxide. Instead, an easily measured "short path-length" background spectrum is used for calculation of each absorbance spectrum that is measured over a long path-length. To accomplish this goal, the algorithm used to calculate the concentrations of trace atmospheric molecules was changed from classical least-squares regression (CLS) to partial least-squares regression (PLS). For calibration, OP/FT-IR spectra are measured in pristine air over a wide variety of path-lengths, temperatures, and humidities, ratioed against a short-path background, and converted to absorbance; the reference spectrum of each analyte is then multiplied by randomly selected coefficients and added to these background spectra. Automatic baseline correction for small molecules with resolved rotational fine structure, such as ammonia and methane, is effected using wavelet transforms. A novel method of correcting for the effect of the nonlinear response of mercury cadmium telluride detectors is also incorporated. Finally, target factor analysis may be used to detect the onset of a given pollutant when its concentration exceeds a certain threshold. In this way, the concentrations of atmospheric species have been obtained from OP/FT-IR spectra measured at intervals of 1 min over a period of many hours with no operator intervention.
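The calibration strategy described, randomly scaled analyte reference spectra added to measured clean-air backgrounds followed by PLS regression, can be sketched as follows (stand-in arrays replace the measured spectra; this is an illustration of the strategy, not the authors' code):

```python
# Synthetic-calibration sketch for PLS-based open-path quantification.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
# backgrounds: (n_samples, n_points) clean-air absorbance spectra (stand-ins)
backgrounds = rng.normal(0, 0.01, (200, 500))
# reference: (n_points,) unit-concentration analyte spectrum (stand-in shape)
reference = np.exp(-0.5 * ((np.arange(500) - 250) / 5.0) ** 2)

conc = rng.uniform(0, 10, 200)              # randomly selected coefficients
X = backgrounds + conc[:, None] * reference # synthetic calibration spectra
pls = PLSRegression(n_components=8).fit(X, conc)
# predicted = pls.predict(field_spectra)    # apply to long-path measurements
```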
The software and algorithms for hyperspectral data processing
NASA Astrophysics Data System (ADS)
Shyrayeva, Anhelina; Martinov, Anton; Ivanov, Victor; Katkovsky, Leonid
2017-04-01
Hyperspectral remote sensing is widely used for collecting and processing information about objects on the Earth's surface. Hyperspectral data are combined to form a three-dimensional (x, y, λ) data cube. The Department of Aerospace Research of the Institute of Applied Physical Problems of the Belarusian State University presents a general model of software for hyperspectral image data analysis and processing. The software runs in the Windows XP/7/8/8.1/10 environment on any personal computer and has been written in C++ using the Qt framework, with OpenGL for graphical data visualization. It has a flexible structure consisting of a set of independent plugins; each plugin is compiled as a Qt plugin in the form of a Windows dynamic-link library (DLL). Plugins can be categorized in terms of data reading types, data visualization (3D, 2D, 1D), and data processing. The software has various built-in functions for statistical and mathematical analysis and signal processing, such as moving-average smoothing, the Savitzky-Golay smoothing technique, RGB correction, histogram transformation, and atmospheric correction. The software provides two engineering techniques developed by the authors for the solution of the atmospheric correction problem: an iterative method for refining the spectral albedo parameters using Libradtran, and an analytical least-squares method. The main advantages of these methods are a high processing rate (several minutes for 1 GB of data) and a low relative error in albedo retrieval (less than 15%). The software also supports work with spectral libraries, region of interest (ROI) selection, and spectral analysis such as cluster-type image classification and automatic comparison of hypercube spectra, by a similarity criterion, with similar spectra from spectral libraries, and vice versa. The software deals with different kinds of spectral information in order to identify and distinguish spectrally unique materials. Further advantages include fast and low-memory hypercube manipulation, a user-friendly interface, modularity, and expandability.
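Two of the operations named above can be illustrated compactly: Savitzky-Golay smoothing of pixel spectra and similarity scoring against a library spectrum. The spectral angle used below is one common similarity criterion; the paper does not specify which criterion its software uses, and the array shapes are assumptions.

```python
# Smoothing and library matching sketch for a (rows, cols, bands) hypercube.
import numpy as np
from scipy.signal import savgol_filter

def spectral_angle(a, b):
    cosang = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.arccos(np.clip(cosang, -1.0, 1.0))  # radians; smaller = closer

# cube: (rows, cols, bands) hypercube; library: (n_refs, bands) spectra
# smooth = savgol_filter(cube, window_length=11, polyorder=3, axis=2)
# best = np.argmin([spectral_angle(smooth[r, c], ref) for ref in library])
```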
The Role of Multiphysics Simulation in Multidisciplinary Analysis
NASA Technical Reports Server (NTRS)
Rifai, Steven M.; Ferencz, Robert M.; Wang, Wen-Ping; Spyropoulos, Evangelos T.; Lawrence, Charles; Melis, Matthew E.
1998-01-01
This article describes the applications of the Spectrum(TM) Solver in Multidisciplinary Analysis (MDA). Spectrum, multiphysics simulation software based on the finite element method, addresses compressible and incompressible fluid flow, structural, and thermal modeling, as well as the interaction between these disciplines. Multiphysics simulation is based on a single computational framework for the modeling of multiple interacting physical phenomena. Interaction constraints are enforced in a fully-coupled manner using the augmented-Lagrangian method. Within the multiphysics framework, the finite element treatment of fluids is based on the Galerkin-Least-Squares (GLS) method with discontinuity capturing operators. The arbitrary-Lagrangian-Eulerian method is utilized to account for deformable fluid domains. The finite element treatment of solids and structures is based on the Hu-Washizu variational principle. The multiphysics architecture lends itself naturally to high-performance parallel computing. Aeroelastic, propulsion, thermal management, and manufacturing applications are presented.
Introduction to multifractal detrended fluctuation analysis in matlab.
Ihlen, Espen A F
2012-01-01
Fractal structures are found in biomedical time series from a wide range of physiological phenomena. The multifractal spectrum identifies the deviations in fractal structure within time periods with large and small fluctuations. The present tutorial is an introduction to multifractal detrended fluctuation analysis (MFDFA), which estimates the multifractal spectrum of biomedical time series. The tutorial presents MFDFA step-by-step in an interactive Matlab session. All Matlab tools needed are available in the Introduction to MFDFA folder at the website www.ntnu.edu/inm/geri/software. MFDFA is introduced in Matlab code boxes where the reader can apply pieces of, or the entire, MFDFA procedure to example time series. After introducing MFDFA, the tutorial discusses best practice for MFDFA in biomedical signal processing. The main aim of the tutorial is to give the reader a simple, self-contained guide to the implementation of MFDFA and the interpretation of the resulting multifractal spectra.
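The tutorial's reference code is in Matlab; purely as a language-neutral illustration of the core computation (a sketch following the standard MFDFA definitions, not the tutorial's own code), the q-order fluctuation functions can be written compactly in Python:

```python
# Compact MFDFA sketch: q-order fluctuation functions of a 1D series.
import numpy as np

def mfdfa(x, scales, q_list, order=1):
    y = np.cumsum(x - np.mean(x))                  # integrated profile
    Fq = np.zeros((len(q_list), len(scales)))
    for j, s in enumerate(scales):
        n_seg = y.size // s
        rms = np.empty(n_seg)
        for v in range(n_seg):                     # detrend each segment
            seg = y[v * s:(v + 1) * s]
            t = np.arange(s)
            fit = np.polyval(np.polyfit(t, seg, order), t)
            rms[v] = np.sqrt(np.mean((seg - fit) ** 2))
        for i, q in enumerate(q_list):
            if q == 0:                             # logarithmic average case
                Fq[i, j] = np.exp(0.5 * np.mean(np.log(rms ** 2)))
            else:
                Fq[i, j] = np.mean(rms ** q) ** (1.0 / q)
    # Slopes of log Fq(s) vs log s give the generalized Hurst exponents h(q).
    return Fq
```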
User manual of the CATSS system (version 1.0) communication analysis tool for space station
NASA Technical Reports Server (NTRS)
Tsang, C. S.; Su, Y. T.; Lindsey, W. C.
1983-01-01
The Communication Analysis Tool for the Space Station (CATSS) is a FORTRAN software package capable of predicting the communications link performance for the Space Station (SS) communication and tracking (C & T) system. An interactive software package was developed to run on DEC/VAX computers. CATSS models and evaluates the various C & T links of the SS, including modulation schemes such as Binary Phase-Shift Keying (BPSK), BPSK with Direct Sequence Spread Spectrum (PN/BPSK), and M-ary Frequency-Shift Keying with Frequency Hopping (FH/MFSK). An optical space communication link is also included. CATSS is a C & T system engineering tool used to predict and analyze system performance for different link environments. Identification of system weaknesses is achieved through evaluation of performance with varying system parameters. System tradeoffs for different values of system parameters are made based on the performance predictions.
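As a small illustration of the kind of link-performance prediction such a tool produces, the textbook BPSK bit-error rate over an AWGN channel can be computed as follows (a generic formula sketch, not CATSS code):

```python
# Theoretical BPSK BER over AWGN: BER = Q(sqrt(2*Eb/N0)) = 0.5*erfc(sqrt(Eb/N0)).
import numpy as np
from scipy.special import erfc

def bpsk_ber(ebn0_db):
    ebn0 = 10.0 ** (np.asarray(ebn0_db) / 10.0)   # dB -> linear ratio
    return 0.5 * erfc(np.sqrt(ebn0))

# e.g. bpsk_ber([0, 4, 8, 12]) -> BER falling with increasing link margin
```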
NASA Astrophysics Data System (ADS)
Papers are presented on local area networks; formal methods for communication protocols; computer simulation of communication systems; spread spectrum and coded communications; tropical radio propagation; VLSI for communications; strategies for increasing software productivity; multiple access communications; advanced communication satellite technologies; and spread spectrum systems. Topics discussed include Space Station communication and tracking development and design; transmission networks; modulation; data communications; computer network protocols and performance; and coding and synchronization. Consideration is given to free space optical communications systems; VSAT communication networks; network topology design; advances in adaptive filtering echo cancellation and adaptive equalization; advanced signal processing for satellite communications; the elements, design, and analysis of fiber-optic networks; and advances in digital microwave systems.
Ringkob, T P; Swartz, D R; Greaser, M L
2004-05-01
Image analysis procedures for immunofluorescence microscopy were developed to measure muscle thin filament lengths of beef, rabbit, and chicken myofibrils. Strips of beef cutaneous trunci, rectus abdominis, psoas, and masseter; chicken pectoralis; and rabbit psoas muscles were excised 5 to 30 min postmortem. Fluorescein phalloidin and rhodamine myosin subfragment-1 (S1) were used to probe the myofibril structure. Digital images were recorded with a cooled charge-coupled device controlled with IPLab Spectrum software (Signal Analytics Corp.) on a Macintosh operating system. The camera was attached to an inverted microscope, using both the phase-contrast and fluorescence illumination modes. Unfixed myofibrils incubated with fluorescein phalloidin showed fluorescence primarily at the Z-line and the tips of the thin filaments in the overlap region. Images were processed using IPLab and the National Institutes of Health's Image software. A region of interest was selected and scaled by a factor of 18.18, which enlarged the image from 11 pixels/microm to approximately 200 pixels/microm. An X-Y plot was exported to Spectrum 1.1 (Academic Software Development Group), where the signal was processed with a second derivative routine, so a cursor function could be used to measure length. Fixation before phalloidin incubation resulted in greatest intensity at the Z lines but a more-uniform staining over the remainder of the thin filament zone. High-resolution image capture and processing showed that thin filament lengths were significantly different (P < 0.01) among beef, rabbit, and chicken, with lengths of 1.28 to 1.32 microm, 1.16 microm, and 1.05 microm, respectively. Measurements using the S1 signal confirmed the phalloidin results. Fluorescent probes may be useful to study sarcomere structure and help explain species and muscle differences in meat texture.
Collod-Béroud, G; Béroud, C; Adès, L; Black, C; Boxer, M; Brock, D J; Godfrey, M; Hayward, C; Karttunen, L; Milewicz, D; Peltonen, L; Richards, R I; Wang, M; Junien, C; Boileau, C
1997-01-01
Fibrillin is the major component of extracellular microfibrils. Mutations in the fibrillin gene on chromosome 15 (FBN1) were first described in the heritable connective tissue disorder Marfan syndrome (MFS). More recently, FBN1 has also been shown to harbor mutations related to a spectrum of conditions phenotypically related to MFS. These mutations are private, essentially missense, generally non-recurrent, and widely distributed throughout the gene. To date, no clear genotype/phenotype relationship has been observed except for the localization of neonatal mutations in a cluster between exons 24 and 32. The second version of the computerized Marfan database contains 89 entries. The software has been modified to accommodate new functions and routines.
Lindoerfer, Doris; Mansmann, Ulrich
2017-07-01
Patient registries are instrumental for medical research. Often their structures are complex and their implementations use composite software systems to meet a wide spectrum of challenges. Commercial and open-source systems are available for registry implementation, but many research groups develop their own systems. Methodological approaches to the selection of software, as well as to the construction of proprietary systems, are needed. We propose an evidence-based checklist summarizing essential items for patient registry software systems (CIPROS) to accelerate the requirements engineering process. Requirements engineering activities for software systems follow traditional software requirements elicitation methods, general software requirements specification (SRS) templates, and standards. We performed a multistep procedure to develop a specific evidence-based CIPROS checklist: (1) a systematic literature review to build a comprehensive collection of technical concepts, (2) a qualitative content analysis to define a catalogue of relevant criteria, and (3) a checklist to construct a minimal appraisal standard. CIPROS is based on 64 publications and covers twelve sections with a total of 72 items. CIPROS also defines software requirements. Comparing CIPROS with traditional software requirements elicitation methods, SRS templates, and standards shows a broad consensus, but also differences regarding registry-specific aspects. Using an evidence-based approach to requirements engineering for registry software adds aspects to the traditional methods and accelerates the software engineering process for registry software. The method we used to construct CIPROS serves as a potential template for creating evidence-based checklists in other fields. The CIPROS list supports developers in assessing requirements for existing systems and formulating requirements for their own systems, while strengthening the reporting of patient registry software system descriptions. It may be a first step toward creating standards for patient registry software system assessments.
A synthetic method of solar spectrum based on LED
NASA Astrophysics Data System (ADS)
Wang, Ji-qiang; Su, Shi; Zhang, Guo-yu; Zhang, Jian
2017-10-01
A method for synthesizing the solar spectrum, based on the spectral characteristics of the solar spectrum and of LEDs and on the principle of arbitrary spectral synthesis, was studied using 14 kinds of LEDs with different central wavelengths. The LED and solar spectrum data were first prepared with the Origin software; the total number of LEDs required for each central band was then calculated from the transformation relation between luminance and illuminance, using least-squares curve fitting in Matlab. Finally, a spectrum curve matching the AM1.5 standard solar spectrum was obtained. The results met the technical targets of a solar spectrum match within ±20% and a solar constant above 0.5.
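The fitting step can be sketched as a nonnegative least-squares problem: choose nonnegative per-channel LED weights so the weighted sum of LED spectra approximates the AM1.5 target. The paper uses least-squares curve fitting in Matlab; scipy's NNLS is an equivalent tool, and the Gaussian LED shapes and stand-in target below are assumptions.

```python
# Nonnegative least-squares fit of 14 LED spectra to a target solar spectrum.
import numpy as np
from scipy.optimize import nnls

wl = np.linspace(350, 1100, 751)                   # wavelength grid in nm
# Stand-in LED spectra: Gaussians at 14 assumed central wavelengths.
led_spectra = np.column_stack([
    np.exp(-0.5 * ((wl - c) / 15.0) ** 2)
    for c in np.linspace(380, 1050, 14)])
# Stand-in AM1.5-like target on the same grid (replace with real data).
target = np.interp(wl, [350, 500, 1100], [0.4, 1.0, 0.3])

weights, residual = nnls(led_spectra, target)      # LEDs needed per channel
```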
Braun, Martin; Kirsten, Robert; Rupp, Niels J; Moch, Holger; Fend, Falko; Wernert, Nicolas; Kristiansen, Glen; Perner, Sven
2013-05-01
Quantification of protein expression based on immunohistochemistry (IHC) is an important step for translational research and clinical routine. Several manual ('eyeballing') scoring systems are used to semi-quantify protein expression based on chromogenic intensities and distribution patterns. However, manual scoring systems are time-consuming and subject to significant intra- and interobserver variability. The aim of our study was to explore whether new image analysis software is sufficient as an alternative tool to quantify protein expression. For the IHC experiments, one nucleus-specific marker (the ERG antibody), one cytoplasm-specific marker (the SLC45A3 antibody), and one marker expressed in both compartments (the TMPRSS2 antibody) were chosen. Staining was applied to TMAs containing tumor material from 630 prostate cancer patients. A pathologist visually quantified all IHC stainings in a blinded manner, applying a four-step scoring system. For digital quantification, image analysis software (Tissue Studio v.2.1, Definiens AG, Munich, Germany) was applied to obtain a continuous spectrum of average staining intensity. For each of the three antibodies we found a strong correlation between the manual protein expression score and the score of the image analysis software. Spearman's rank correlation coefficient was 0.94, 0.92, and 0.90 for ERG, SLC45A3, and TMPRSS2, respectively (p<0.01). Our data suggest that the image analysis software Tissue Studio is a powerful tool for the quantification of protein expression in IHC stainings. Further, since the digital analysis is precise and reproducible, computer-supported protein quantification might help to overcome intra- and interobserver variability and increase the objectivity of IHC-based protein assessment.
NASA Astrophysics Data System (ADS)
Srivastava, Mayuri; Singh, N. P.; Yadav, R. A.
2014-08-01
The vibrational spectrum of pantothenic acid has been investigated using experimental IR and Raman spectroscopies and the density functional theory methods available in the Gaussian 09 software. Vibrational assignments of the observed IR and Raman bands have been proposed in light of the results obtained from the computations. In order to assign the observed IR and Raman frequencies, the potential energy distributions (PEDs) have also been computed using the GAR2PED software. Optimized geometrical parameters suggest that the overall symmetry of the molecule is C1. The molecule is found to possess eight conformers, and a conformational analysis was carried out to obtain the most stable configuration of the molecule. In the present paper the vibrational features of the lowest-energy conformer C-I have been studied. The two methyl groups have symmetries slightly distorted from C3v. The acidic O-H bond is found to be the shortest one. To investigate molecular stability and bond strength we have used natural bond orbital (NBO) analysis. Charge transfer within the molecule is indicated by the calculated highest occupied molecular orbital-lowest unoccupied molecular orbital (HOMO-LUMO) energies. The mapping of the electron density iso-surface with the electrostatic potential (ESP) has been carried out to obtain information about the size, shape, charge density distribution, and sites of chemical reactivity of the molecule.
Structural analysis consultation using artificial intelligence
NASA Technical Reports Server (NTRS)
Melosh, R. J.; Marcal, P. V.; Berke, L.
1978-01-01
The primary goal of consultation is the definition of the best strategy to deal with a structural engineering analysis objective. The knowledge base designed to meet this need identifies the type of numerical analysis, the needed modeling detail, and the specific analysis data required. Decisions are constructed on the basis of the data in the knowledge base - material behavior, relations between geometry and structural behavior, measures of the importance of time and temperature changes - and user-supplied specifics: characteristics of the spectrum of analysis types, the relation between accuracy and model detail, and details of the structure, its mechanical loadings, and its temperature states. Existing software demonstrated the feasibility of the approach, encompassing the 36 analysis classes spanning nonlinear, temperature-affected, incremental analyses which track the behavior of structural systems.
NASA Technical Reports Server (NTRS)
Hayden, W. L.; Robinson, L. H.
1972-01-01
Spectral analysis of angle-modulated communication systems is studied by: (1) performing a literature survey of candidate power spectrum computational techniques, determining the computational requirements, and formulating a mathematical model satisfying these requirements; (2) implementing the model on a UNIVAC 1230 digital computer as the Spectral Analysis Program (SAP); and (3) developing the hardware specifications for a data acquisition system which will acquire an input modulating signal for SAP. The SAP computational technique uses an extended fast Fourier transform and represents a generalized approach for simple and complex modulating signals.
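The core computation can be shown in miniature: an FFT-based power spectrum estimate of an angle-modulated carrier. The parameters below are made up for illustration, and this is a generic sketch rather than the SAP implementation.

```python
# FFT power spectrum of a simple FM (angle-modulated) tone.
import numpy as np

fs, fc, fm, beta = 8000.0, 1000.0, 50.0, 2.0
t = np.arange(4096) / fs
x = np.cos(2 * np.pi * fc * t + beta * np.sin(2 * np.pi * fm * t))

X = np.fft.rfft(x * np.hanning(x.size))            # windowed FFT
psd = (np.abs(X) ** 2) / (fs * x.size)             # one-sided PSD estimate
freqs = np.fft.rfftfreq(x.size, 1.0 / fs)
# Sidebands appear at fc +/- n*fm with Bessel-function weights J_n(beta).
```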
ERIC Educational Resources Information Center
Kendall, Leslie R.
2013-01-01
Individuals who have Asperger's Syndrome/High-Functioning Autism, as a group, are chronically underemployed and underutilized. Many in this group have abilities that are well suited for various roles within the practice of software development. Multiple studies have shown that certain organizational and management changes in the software…
Online Videoconferencing Products: Update
ERIC Educational Resources Information Center
Burton, Douglas; Kitchen, Tim
2011-01-01
Software allowing real-time online video connectivity is rapidly evolving. The ability to connect students, staff, and guest speakers instantaneously carries great benefits for the online distance education classroom. This evaluation report compares four software applications at opposite ends of the cost spectrum: "DimDim", "Elluminate VCS",…
Principal component analysis of Raman spectra for TiO2 nanoparticle characterization
NASA Astrophysics Data System (ADS)
Ilie, Alina Georgiana; Scarisoareanu, Monica; Morjan, Ion; Dutu, Elena; Badiceanu, Maria; Mihailescu, Ion
2017-09-01
The Raman spectra of anatase/rutile mixed-phase Sn-doped TiO2 nanoparticles and undoped TiO2 nanoparticles, synthesised by laser pyrolysis, with nanocrystallite dimensions varying from 8 to 28 nm, were processed with self-written software that applies Principal Component Analysis (PCA) to the measured spectra to verify the possibility of objective auto-characterization of nanoparticles from their vibrational modes. The photo-excited process of Raman scattering is very sensitive to the material characteristics, especially in the case of nanomaterials, where more properties become relevant to the vibrational behaviour. We used PCA, a statistical procedure that performs an eigenvalue decomposition of the descriptive data covariance, to automatically analyse the samples' measured Raman spectra and to infer the correlation between nanoparticle dimensions, tin and carbon concentration, and their principal component values (PCs). This type of application can allow an approximation of the crystallite size, or of the tin concentration, merely by measuring the Raman spectrum of the sample. The study of the loadings of the principal components provides information on the way the vibrational modes are affected by the nanoparticle features and on the spectral regions relevant for the classification.
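A minimal Python analogue of the described analysis (the authors' software is self-written and not public; the spectra below are stand-in data) is:

```python
# PCA scores and loadings for a set of Raman spectra.
import numpy as np
from sklearn.decomposition import PCA

# spectra: (n_samples, n_shifts) baseline-corrected Raman intensities
spectra = np.random.default_rng(2).normal(size=(12, 800))  # stand-in data

pca = PCA(n_components=3)
scores = pca.fit_transform(spectra - spectra.mean(axis=0))
loadings = pca.components_     # which Raman shifts drive each component
# Plot scores[:, 0] against crystallite size / Sn content to look for trends.
```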
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tarek Haddadin; Stephen Andrew Laraway; Arslan Majid
This paper proposes and presents the design and implementation of an underlay communication channel (UCC) for 5G cognitive mesh networks. The UCC builds its waveform based on filter bank multicarrier spread spectrum (FB-MCSS) signaling. The use of this novel spread spectrum signaling allows the device-to-device (D2D) user equipments (UEs) to communicate at a level well below the noise temperature and hence minimizes the taxation on macro-cell/small-cell base stations and their UEs in 5G wireless systems. Moreover, the use of filter banks allows us to avoid those portions of the spectrum that are in use by macro-cell and small-cell users. Hence, both D2D-to-cellular and cellular-to-D2D interference will be very close to none. We propose a specific packet structure for the UCC and develop algorithms for packet detection, timing acquisition and tracking, as well as channel estimation and equalization. We also present the details of an implementation of the proposed transceiver on a software radio platform and compare our experimental results with those from a theoretical analysis of our packet detection algorithm.
Christner, Martin; Dressler, Dirk; Andrian, Mark; Reule, Claudia; Petrini, Orlando
2017-01-01
The fast and reliable characterization of bacterial and fungal pathogens plays an important role in infectious disease control and the tracking of outbreak agents. DNA-based methods are the gold standard for epidemiological investigations, but they are still comparatively expensive and time-consuming. Matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) is a fast, reliable, and cost-effective technique now routinely used to identify clinically relevant human pathogens. It has been used for subspecies differentiation and typing, but its use for epidemiological tasks, e.g., for outbreak investigations, is often hampered by the complexity of data analysis. We have analysed publicly available MALDI-TOF mass spectra from a large outbreak of Shiga-toxigenic Escherichia coli in northern Germany using a general-purpose software tool for the analysis of complex biological data. The software was challenged with depauperate spectra and reduced learning group sizes to mimic poor spectrum quality and the scarcity of reference spectra at the onset of an outbreak. With high-quality formic acid extraction spectra, the software's built-in classifier accurately identified outbreak-related strains using as few as 10 reference spectra (99.8% sensitivity, 98.0% specificity). Selective variation of processing parameters showed impaired marker peak detection and reduced classification accuracy in samples with high background noise or artificially reduced peak counts. However, the software consistently identified mass signals suitable for a highly reliable marker-peak-based classification approach (100% sensitivity, 99.5% specificity), even from low-quality direct deposition spectra. The study demonstrates that general-purpose data analysis tools can effectively be used for the analysis of bacterial mass spectra.
Specdata: Automated Analysis Software for Broadband Spectra
NASA Astrophysics Data System (ADS)
Oliveira, Jasmine N.; Martin-Drumel, Marie-Aline; McCarthy, Michael C.
2017-06-01
With the advancement of chirped-pulse techniques, broadband rotational spectra with a few tens to several hundred GHz of spectral coverage are now routinely recorded. When studying the multi-component mixtures that might result, for example, from the use of an electrical discharge, lines of new chemical species are often obscured by those of known compounds, and analysis can be laborious. To address this issue, we have developed SPECdata, an open-source, interactive tool designed to simplify and greatly accelerate spectral analysis and discovery. Our software tool combines both automated and manual components that free the user from computation, while giving him/her considerable flexibility to assign, manipulate, interpret, and export the analysis. The automated - and key - component of the new software is a database query system that rapidly assigns transitions of known species in an experimental spectrum. For each experiment, the software identifies spectral features and subsequently assigns them to known molecules within an in-house database (Pickett .cat files, lists of frequencies...), or to those catalogued in Splatalogue (using automatic on-line queries). With suggested assignments, control is then handed over to the user, who can choose to accept, decline, or add additional species. Data visualization, statistical information, and interactive widgets assist the user in making decisions about their data. SPECdata has several other useful features intended to improve the user experience. Exporting a full report of the analysis, or a peak file in which assigned lines are removed, is among several options. A user may also save their progress to continue at another time. Additional features of SPECdata help the user maintain and expand their database for future use. A user-friendly interface allows one to search, upload, edit, or update catalog or experiment entries.
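At its core, the database query step reduces to matching observed line frequencies against sorted catalog frequencies within a tolerance. A hedged sketch of that matching (not SPECdata's actual code; the tolerance is an assumed value):

```python
# Assign observed lines to the nearest catalog frequency within a tolerance.
import numpy as np

def assign_lines(observed_mhz, catalog_mhz, tol_mhz=0.1):
    observed = np.asarray(observed_mhz, dtype=float)
    catalog = np.sort(np.asarray(catalog_mhz, dtype=float))
    idx = np.clip(np.searchsorted(catalog, observed), 1, catalog.size - 1)
    # Nearest neighbor among the two bracketing catalog lines.
    left, right = catalog[idx - 1], catalog[idx]
    nearest = np.where(observed - left < right - observed, left, right)
    return [(o, n) for o, n in zip(observed, nearest) if abs(o - n) <= tol_mhz]
```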
INEEL BNCT research program. Annual report, January 1, 1996--December 31, 1996
DOE Office of Scientific and Technical Information (OSTI.GOV)
Venhuizen, J.R.
1997-04-01
This report is a summary of the progress and research produced for the Idaho National Engineering and Environmental Laboratory (INEEL) Boron Neutron Capture Therapy (BNCT) Research Program for calendar year 1996. Contributions from the individual investigators about their projects are included, specifically, physics: treatment planning software, real-time neutron beam measurement dosimetry, measurement of the Finnish research reactor epithermal neutron spectrum, BNCT accelerator technology; and chemistry: analysis of biological samples and preparation of {sup 10}B enriched decaborane.
NASA Astrophysics Data System (ADS)
Hoch, Jeffrey C.
2017-10-01
Non-Fourier methods of spectrum analysis are gaining traction in NMR spectroscopy, driven by their utility for processing nonuniformly sampled data. These methods afford new opportunities for optimizing experiment time, resolution, and sensitivity of multidimensional NMR experiments, but they also pose significant challenges not encountered with the discrete Fourier transform. A brief history of non-Fourier methods in NMR serves to place different approaches in context. Non-Fourier methods reflect broader trends in the growing importance of computation in NMR, and offer insights for future software development.
[Discrimination of Rice Syrup Adulterant in Acacia Honey Using Near-Infrared Spectroscopy].
Zhang, Yan-nan; Chen, Lan-zhen; Xue, Xiao-feng; Wu, Li-ming; Li, Yi; Yang, Juan
2015-09-01
At present, rice syrup, a low-priced sweetener, is often adulterated into acacia honey, and the adulterated honeys are sold in honey markets, while there is no suitable, fast method to identify honey adulterated with rice syrup. In this study, near-infrared (NIR) spectroscopy combined with chemometric methods was used to discriminate the authenticity of honey. Twenty unprocessed acacia honey samples from different honey-producing areas were mixed with different proportions of rice syrup to prepare seven concentration gradients, yielding 121 samples in total. The NIR spectrum instrument and spectrum-processing software were applied to spectrum scanning and data conversion of the adulterated samples, respectively. The data were then analyzed by principal component analysis (PCA) and canonical discriminant analysis in order to discriminate adulterated honey. The results showed that after principal component analysis, the first two principal components accounted for 97.23% of the total variation, but the regionalism of the score plot of the first two PCs was not obvious, so canonical discriminant analysis was used for further discrimination. All samples were discriminated correctly, and the first two discriminant functions accounted for 91.6% among the six canonical discriminant functions; the different concentrations of adulterated samples could then be discriminated correctly. This illustrates that canonical discriminant analysis combined with NIR spectroscopy is not only feasible but also practical for rapid and effective discrimination of rice syrup adulterant in acacia honey.
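The chemometric pipeline described, PCA score extraction followed by canonical discriminant analysis, can be sketched with scikit-learn as below, using Fisher linear discriminant analysis as a close stand-in for canonical discriminant analysis. The spectra and labels are random placeholders, not the study's data.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# X: (samples, wavelengths) NIR matrix; y: 0 = pure honey, 1..6 = syrup level
rng = np.random.default_rng(0)
X = rng.normal(size=(121, 700))   # placeholder spectra
y = rng.integers(0, 7, size=121)  # placeholder labels

scores = PCA(n_components=2).fit_transform(X)  # coordinates of the PC score plot
lda = LinearDiscriminantAnalysis().fit(X, y)   # discriminant functions
print("resubstitution accuracy:", lda.score(X, y))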
Non destructive defect detection by spectral density analysis.
Krejcar, Ondrej; Frischer, Robert
2011-01-01
The potential of nondestructive diagnostics of solid objects is discussed in this article. The whole process is accomplished by consecutive steps involving software analysis of the vibration power spectrum (or, alternatively, acoustic emissions) created during the normal operation of the diagnosed device or under unexpected situations. Another option is to create an artificial pulse, which can help to determine the actual state of the diagnosed device. The main idea of this method is based on the analysis of the current power spectrum density of the received signal and its postprocessing in the Matlab environment, with a subsequent sample comparison in the Statistica software environment. The last step, the comparison of samples, is the most important, because it makes it possible to determine the status of the examined object at a given time. Nowadays samples are compared only visually, but this method cannot produce good results. Furthermore, the presented filter can choose relevant data from the huge volume of data that originates from applying the FFT (Fast Fourier Transform). The filtered data can then be subjected to analysis with the assistance of a neural network. If correct and high-quality starting data are provided to the initial network, it is possible to analyze other samples and state the condition of a given object. The success rate of this approximation, based on our testing of the solution, is currently 85.7%. With further improvement of the filter, it could be even greater. Finally, it is possible to detect defective conditions or upcoming limiting states of examined objects/materials by using only one device containing both hardware and software parts. This kind of detection can provide significant financial savings in certain cases (such as continuous casting of iron, where it could save hundreds of thousands of USD).
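The first step described, estimating the power spectral density of a vibration record, is commonly done with Welch's averaged periodogram. A minimal Python sketch follows; the sampling rate, test signal and reference comparison are placeholders rather than the authors' Matlab/Statistica pipeline.

import numpy as np
from scipy import signal

fs = 10_000  # sampling rate in Hz, assumed
t = np.arange(0, 2.0, 1 / fs)
x = np.sin(2 * np.pi * 1200 * t) + 0.5 * np.random.randn(t.size)  # toy vibration

f, pxx = signal.welch(x, fs=fs, nperseg=2048)  # PSD estimate

# A numeric distance to a stored "healthy" PSD could replace the purely
# visual comparison criticized in the abstract (identical here, so 0.0).
reference_pxx = pxx.copy()
print(np.linalg.norm(10 * np.log10(pxx / reference_pxx)))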
[Demodulation Method for the Absorption Spectrum of Quasi-Continuous Laser Wavelength Modulation].
Shao, Xin; Liu, Fu-Gui; Du, Zhen-Hui; Wang, Wei
2014-05-01
A software phase-locked amplifier demodulation method is proposed in order to properly demodulate the second harmonic (2f) signal of quasi-continuous laser wavelength modulation spectroscopy (WMS), based on an analysis of its signal characteristics. By judging the effectiveness of the measurement data and applying filtering, phase-sensitive detection, digital filtering and other processing, the method achieves sensitive detection of the quasi-continuous signal. The method was verified by carbon dioxide detection experiments. The WMS-2f signals obtained by the software phase-locked amplifier and by a high-performance hardware phase-locked amplifier (SR844) were compared simultaneously. The results show that the Allan variance of the WMS-2f signal demodulated by the software phase-locked amplifier is one order of magnitude smaller than that demodulated by the SR844, corresponding to a detection limit two orders of magnitude lower. The method is also able to solve the loss-of-lock problem caused by the small duty cycle of the quasi-continuous modulation signal, with only small signal waveform distortion.
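The essence of a software phase-locked (lock-in) amplifier for 2f detection is to mix the detector signal with quadrature references at twice the modulation frequency and low-pass filter the products. A minimal sketch, with assumed sampling, modulation and filter parameters:

import numpy as np
from scipy import signal

fs, f_mod = 200_000, 1_000  # sample rate and modulation frequency, assumed
t = np.arange(0, 0.1, 1 / fs)
# Toy detector signal containing a weak 2f component buried in noise
x = 0.01 * np.cos(2 * np.pi * 2 * f_mod * t + 0.3) + 0.001 * np.random.randn(t.size)

# Phase-sensitive detection: mix with quadrature references at 2f ...
i_mix = x * np.cos(2 * np.pi * 2 * f_mod * t)
q_mix = x * np.sin(2 * np.pi * 2 * f_mod * t)

# ... then low-pass filter to keep only the demodulated (near-DC) term
sos = signal.butter(4, 100, btype="low", fs=fs, output="sos")
wms_2f = 2 * np.hypot(signal.sosfiltfilt(sos, i_mix), signal.sosfiltfilt(sos, q_mix))
print("recovered 2f amplitude ~", wms_2f.mean())  # ~0.01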
Interim Update to the AN/UPX-41(C) Spectrum Certification Conditions (SPS-18778/1).
DOT National Transportation Integrated Search
2015-11-01
In 2012, the Navy requested spectrum certification for the shipboard AN/UPX-41(C) Digital Interrogator System, Software Version 5.5 with Mode 5. Current operating conditions for the Navy's AN/UPX-41(C) are the same as restrictions imposed on the AN...
NASA Astrophysics Data System (ADS)
Liu, Fushun; Liu, Chengcheng; Chen, Jiefeng; Wang, Bin
2017-08-01
The key concept of spectrum response estimation with commercial software, such as the SESAM software tool, typically includes two main steps: finding a suitable loading spectrum and computing the response amplitude operators (RAOs) subjected to a frequency-specified wave component. In this paper, we propose a nontraditional spectrum response estimation method that uses a numerical representation of the retardation functions. Based on estimated added mass and damping matrices of the structure, we decompose and replace the convolution terms with a series of poles and corresponding residues in the Laplace domain. Then, we estimate the power density corresponding to each frequency component using the improved periodogram method. The advantage of this approach is that the frequency-dependent motion equations in the time domain can be transformed into the Laplace domain without requiring Laplace-domain expressions for the added mass and damping. To validate the proposed method, we use a numerical semi-submerged pontoon from SESAM. The numerical results show that the responses of the proposed method match well with those obtained from the traditional method. Furthermore, the estimated spectrum also matches well, which indicates the method's potential application to deep-water floating structures.
A real-time spectrum acquisition system design based on quantum dots-quantum well detector
NASA Astrophysics Data System (ADS)
Zhang, S. H.; Guo, F. M.
2016-01-01
In this paper, we studied the structural characteristics of a quantum dots-quantum well photodetector with a response wavelength range from 400 nm to 1000 nm. It has the characteristics of high sensitivity, low dark current and high conductance gain. According to the properties of the quantum dots-quantum well photodetectors, we designed a new type of capacitive transimpedance amplifier (CTIA) readout circuit structure with the advantages of adjustable gain, wide bandwidth and high driving ability. We implemented the chip packaging between the CTIA-CDS readout circuit and the quantum dots detector and tested the readout response characteristics. According to the timing signal requirements of our readout circuit, we designed a real-time spectral data acquisition system based on FPGA and ARM. The parallel processing mode of the programmable devices gives the system high sensitivity and a high transmission rate. In addition, we implemented blind-pixel compensation and a smoothing filter algorithm for the real-time spectrum data using C++. Through fluorescence spectrum measurements of carbon quantum dots, with the signal acquisition system and computer software performing the collection, processing and analysis of the spectrum signal, we verified the excellent characteristics of the detector. The system meets the design requirements of a quantum dot spectrum acquisition system, with short integration time, real-time operation and portability.
Rocking-beam spectrum images and ALCHEMI of Ni₅₀Al₄₀Fe₁₀
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, I.M.; Bentley, J.
1997-04-01
A rocking-beam energy-dispersive X-ray (EDX) spectrum image was acquired near the [035] zone axis of a B2-ordered alloy of composition Ni₅₀Al₄₀Fe₁₀. Images comparable to those acquired by Rossouw et al. were formed a posteriori by integrating the X-ray intensities in windows enclosing the Al-K, Fe-Kα, and Ni-Kα characteristic X-ray peaks for each pixel of the spectrum image. These images are shown along with a bright-field transmission channeling pattern (TCP), which records the signal from the bright-field STEM detector as the incident beam direction is varied with the beam-tilt coils, and an EDX spectrum from one pixel of the image. The range of orientations from which the spectrum image was acquired is indicated by the square superimposed on the TCP. ALCHEMI (atom-location by channeling-enhanced microanalysis) was performed on a subset of the spectrum image using standard methods. Spectra from a series of ≈30 pixels along lines parallel to the (200) band were summed at each of 31 orientations relative to the band in the range 0 ≤ θ/θ₂₀₀ ≤ 2.3. Characteristic X-ray intensities of the K-shell X-rays of Ni, Fe, and Al were extracted from the 31 summed spectra with the simplex fitting procedure of the DTSA spectral analysis software. The fraction of Fe on the 'Ni' site from this analysis, p(Fe,'Ni') = 23.8 ± 2.1%, is in excellent agreement with p(Fe,'Ni') = 23.7 ± 0.9%, which was determined by an analysis of a series of ten spectra acquired at orientations of the crystal carefully chosen so that the contributions of nonsystematic reflections are negligible.
A Moire Fringing Spectrometer for Extra-Solar Planet Searches
NASA Astrophysics Data System (ADS)
van Eyken, J. C.; Ge, J.; Mahadevan, S.; De Witt, C.; Ramsey, L. W.; Berger, D.; Shaklan, S.; Pan, X.
2001-12-01
We have developed a prototype moire fringing spectrometer for high-precision radial velocity measurements for the detection of extra-solar planets. This combination of Michelson interferometer and spectrograph overlays an interferometer comb on a medium-resolution stellar spectrum, producing moire patterns. Small changes in the Doppler shift of the spectrum lead to corresponding large shifts in the moire pattern (moire magnification). The sinusoidal shape of the moire fringes enables much simpler measurement of these shifts than in standard echelle spectrograph techniques, facilitating high-precision measurements with a low-cost instrument. Current data analysis software we have developed has produced short-term repeatability (over a few hours) of 5-10 m/s, and future planned improvements based on previous experiments should reduce this significantly. We plan eventually to carry out large-scale surveys for low-mass companions around other stars. This poster will present new results obtained in the lab and at the HET and Palomar 5 m telescopes, the theory of the instrument, and data analysis techniques.
Nondestructive Analysis of MET-5 Paint Can at TA35 Building 2 A-Wing Vault
DOE Office of Scientific and Technical Information (OSTI.GOV)
Desimone, David J.; Vo, Duc Ta
In the Building 2 A-wing vault, MET-5 has some drums and other packages that they wanted NEN-1 to help identify nondestructively. Measurements were taken of a paint can container labeled DU-2A with a mechanically cooled portable high-purity germanium (HPGe) Ortec Detective to determine if any radioactive material was inside. The HPGe detector measures the gamma rays emitted by radioactive material and displays them as a spectrum. The spectrum is used to identify the radioactive material by applying appropriate analysis software and identifying the gamma-ray peaks. The paint can container, DU-2A, was analyzed with PeakEasy 4.84 and FRAM 5.2. The FRAM report is shown. The enrichment is 0.091% U235 and 99.907% U238. This material is depleted uranium. The measurement was performed in the near field; to extract a mass, a far-field measurement will need to be taken.
PINS Spectrum Identification Guide
DOE Office of Scientific and Technical Information (OSTI.GOV)
A.J. Caffrey
2012-03-01
The Portable Isotopic Neutron Spectroscopy (PINS, for short) system identifies the chemicals inside munitions and containers without opening them, a decided safety advantage if the fill chemical is a hazardous substance like a chemical warfare agent or an explosive. The PINS Spectrum Identification Guide is intended as a reference for technical professionals responsible for the interpretation of PINS gamma-ray spectra. The guide is divided into two parts. The three chapters that constitute Part I cover the science and technology of PINS. Neutron activation analysis is the focus of Chapter 1. Chapter 2 explores PINS hardware, software, and related operational issues. Gamma-ray spectral analysis basics are introduced in Chapter 3. The six chapters of Part II cover the identification of PINS spectra in detail. Like the PINS decision tree logic, these chapters are organized by chemical element: phosphorus-based chemicals, chlorine-based chemicals, etc. These descriptions of hazardous, toxic, and/or explosive chemicals conclude with a chapter on the identification of the inert chemicals, e.g., sand, used to fill practice munitions.
A data base and analysis program for shuttle main engine dynamic pressure measurements
NASA Technical Reports Server (NTRS)
Coffin, T.
1986-01-01
A dynamic pressure data base management system is described for measurements obtained from space shuttle main engine (SSME) hot firing tests. The data were provided in terms of engine power level and rms pressure time histories, and power spectra of the dynamic pressure measurements at selected times during each test. Test measurements and engine locations are defined along with a discussion of data acquisition and reduction procedures. A description of the data base management analysis system is provided and subroutines developed for obtaining selected measurement means, variances, ranges and other statistics of interest are discussed. A summary of pressure spectra obtained at SSME rated power level is provided for reference. Application of the singular value decomposition technique to spectrum interpolation is discussed and isoplots of interpolated spectra are presented to indicate measurement trends with engine power level. Program listings of the data base management and spectrum interpolation software are given. Appendices are included to document all data base measurements.
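The SVD-based interpolation mentioned above can be sketched as follows: stack the spectra measured at known power levels into a matrix, decompose it, and interpolate the leading singular-vector coefficients to an intermediate power level. The levels and spectra below are placeholders, not SSME data, and the truncation rank is an assumption.

import numpy as np

levels = np.array([65.0, 90.0, 100.0, 104.0, 109.0])  # % rated power, assumed
S = np.random.rand(5, 256)  # placeholder (n_levels, n_freqs) spectra matrix

U, s, Vt = np.linalg.svd(S, full_matrices=False)

def spectrum_at(level, k=3):
    """Reconstruct a spectrum at an intermediate power level by linearly
    interpolating the first k singular-vector coefficients."""
    coeffs = [np.interp(level, levels, U[:, i] * s[i]) for i in range(k)]
    return sum(c * Vt[i] for i, c in enumerate(coeffs))

print(spectrum_at(97.0).shape)  # interpolated spectrum at 97% power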
The new 3-(tert-butyl)-1-(2-nitrophenyl)-1H-pyrazol-5-amine: Experimental and computational studies
NASA Astrophysics Data System (ADS)
Cuenú, Fernando; Muñoz-Patiño, Natalia; Torres, John Eduard; Abonia, Rodrigo; Toscano, Rubén A.; Cobo, J.
2017-11-01
The molecular and supramolecular structure of the title compound, 3-(tert-butyl)-1-(2-nitrophenyl)-1H-pyrazol-5-amine (2NPz), from single-crystal X-ray diffraction (SC-XRD) and spectroscopic data analysis is reported. The computational analysis of the structure, geometry optimization, vibrational frequencies, nuclear magnetic resonance and UV-Vis spectra is also described and compared with experimental data. Satisfactory theoretical calculations were performed for the molecule using density functional theory (DFT), with the B3LYP and B3PW91 functionals, and Hartree-Fock (HF), with the 6-311++G(d,p) basis set, using the GAUSSIAN 09 program package without any constraint on the geometry. With the VEDA 4 software, vibrational frequencies were assigned in terms of the potential energy distribution, while with the GaussSum software, the percentage contribution of the frontier orbitals to each transition of the electronic absorption spectrum was established. The obtained results indicated that the optimized geometry reproduces well the molecular structural parameters from SC-XRD. Theoretical data obtained for the vibrational analysis and NMR spectra are consistent with experimental data.
Benazzi, F; Gernaey, K V; Jeppsson, U; Katebi, R
2007-08-01
In this paper, a new approach for on-line monitoring and detection of abnormal readily biodegradable substrate (S_S) and slowly biodegradable substrate (X_S) concentrations, for example due to input of toxic loads from the sewer or due to an influent substrate shock load, is proposed. Considering that measurements of S_S and X_S concentrations are not available in real wastewater treatment plants, the S_S/X_S software sensor can activate an alarm with a response time of about 60 and 90 minutes, respectively, based on the dissolved oxygen measurement. The software sensor implementation is based on an extended Kalman filter observer, and disturbances are modelled using fast Fourier transform and spectrum analyses. Three case studies are described. The first one illustrates the fast and accurate convergence of the extended Kalman filter algorithm, which is achieved in less than 2 hours. Furthermore, the difficulties of estimating X_S when off-line analysis is not available are depicted, and the performance of the S_S/X_S software sensor when no measurements of S_S and X_S are available is illustrated. Estimation problems related to the death-regeneration concept of the Activated Sludge Model No. 1 and possible applications of the software sensor in wastewater monitoring are discussed.
A Virtual Approach to Teaching Safety Skills to Children with Autism Spectrum Disorder
ERIC Educational Resources Information Center
Self, Trisha; Scudder, Rosalind R.; Weheba, Gamal; Crumrine, Daiquirie
2007-01-01
Recent advancements in the development of hardware/software configurations for delivering virtual reality (VR) environments to individuals with disabilities have included approaches for children with autism spectrum disorder (ASD). This article describes a study comparing benefits of using VR to benefits of an integrated/visual treatment model…
Martín-Campos, Trinidad; Mylonas, Roman; Masselot, Alexandre; Waridel, Patrice; Petricevic, Tanja; Xenarios, Ioannis; Quadroni, Manfredo
2017-08-04
Mass spectrometry (MS) has become the tool of choice for the large scale identification and quantitation of proteins and their post-translational modifications (PTMs). This development has been enabled by powerful software packages for the automated analysis of MS data. While data on PTMs of thousands of proteins can nowadays be readily obtained, fully deciphering the complexity and combinatorics of modification patterns even on a single protein often remains challenging. Moreover, functional investigation of PTMs on a protein of interest requires validation of the localization and the accurate quantitation of its changes across several conditions, tasks that often still require human evaluation. Software tools for large scale analyses are highly efficient but are rarely conceived for interactive, in-depth exploration of data on individual proteins. We here describe MsViz, a web-based and interactive software tool that supports manual validation of PTMs and their relative quantitation in small- and medium-size experiments. The tool displays sequence coverage information, peptide-spectrum matches, tandem MS spectra and extracted ion chromatograms through a single, highly intuitive interface. We found that MsViz greatly facilitates manual data inspection to validate PTM location and quantitate modified species across multiple samples.
The Software Element of the NASA Portable Electronic Device Radiated Emissions Investigation
NASA Technical Reports Server (NTRS)
Koppen, Sandra V.; Williams, Reuben A. (Technical Monitor)
2002-01-01
NASA Langley Research Center's (LaRC) High Intensity Radiated Fields Laboratory (HIRF Lab) recently conducted a series of electromagnetic radiated emissions tests under a cooperative agreement with Delta Airlines and an interagency agreement with the FAA. The frequency spectrum environment at a commercial airport was measured on location. The environment survey provides a comprehensive picture of the complex nature of the electromagnetic environment present in those areas outside the aircraft. In addition, radiated emissions tests were conducted on portable electronic devices (PEDs) that may be brought onboard aircraft. These tests were performed in both semi-anechoic and reverberation chambers located in the HIRF Lab. The PEDs included cell phones, laptop computers, electronic toys, and family radio systems. The data generated during the tests are intended to support research on the effect of radiated emissions from wireless devices on aircraft systems. Both test systems relied on customized control and data reduction software to provide test and instrument control, data acquisition, a user interface, real-time data reduction, and data analysis. The software executed on PCs running MS Windows 98 and 2000, and used Agilent Pro Visual Engineering Environment (VEE) development software, Common Object Model (COM) technology, and MS Excel.
Panicker, C Yohannan; Varghese, Hema Tresa; Nayak, Prakash S; Narayana, B; Sarojini, B K; Fun, H K; War, Javeed Ahamad; Srivastava, S K; Van Alsenoy, C
2015-09-05
The FT-IR spectrum of (2E)-3-(3-nitrophenyl)-1-[4-piperidin-1-yl]prop-2-en-1-one was recorded and analyzed. The vibrational wavenumbers were computed using HF and DFT quantum chemical calculations. The data obtained from the wavenumber calculations are used to assign the IR bands. The potential energy distribution was computed using the GAR2PED software. The geometrical parameters of the title compound are in agreement with the XRD results. NBO analysis, HOMO-LUMO, first and second hyperpolarizability and molecular electrostatic potential results are also reported. The possible electrophile attacking sites of the title molecule are identified using an MEP surface plot study. Molecular docking results predicted anti-leishmanic activity for the compound. Copyright © 2015. Published by Elsevier B.V.
Software-Defined Architectures for Spectrally Efficient Cognitive Networking in Extreme Environments
NASA Astrophysics Data System (ADS)
Sklivanitis, Georgios
The objective of this dissertation is the design, development, and experimental evaluation of novel algorithms and reconfigurable radio architectures for spectrally efficient cognitive networking in terrestrial, airborne, and underwater environments. Next-generation wireless communication architectures and networking protocols that maximize spectrum utilization efficiency in congested/contested or low-spectral availability (extreme) communication environments can enable a rich body of applications with unprecedented societal impact. In recent years, underwater wireless networks have attracted significant attention for military and commercial applications including oceanographic data collection, disaster prevention, tactical surveillance, offshore exploration, and pollution monitoring. Unmanned aerial systems that are autonomously networked and fully mobile can assist humans in extreme or difficult-to-reach environments and provide cost-effective wireless connectivity for devices without infrastructure coverage. Cognitive radio (CR) has emerged as a promising technology to maximize spectral efficiency in dynamically changing communication environments by adaptively reconfiguring radio communication parameters. At the same time, the fast-developing technology of software-defined radio (SDR) platforms has enabled hardware realization of cognitive radio algorithms for opportunistic spectrum access. However, existing algorithmic designs and protocols for shared spectrum access do not effectively capture the interdependencies between radio parameters at the physical (PHY), medium-access control (MAC), and network (NET) layers of the network protocol stack. In addition, existing off-the-shelf radio platforms and SDR programmable architectures are far from fulfilling runtime adaptation and reconfiguration across PHY, MAC, and NET layers. Spectrum allocation in cognitive networks with multi-hop communication requirements depends on the location, network traffic load, and interference profile at each network node. As a result, the development and implementation of algorithms and cross-layer reconfigurable radio platforms that can jointly treat space, time, and frequency as a unified resource to be dynamically optimized according to inter- and intra-network interference constraints is of fundamental importance. In the next chapters, we present novel algorithmic and software/hardware implementation developments toward the deployment of spectrally efficient terrestrial, airborne, and underwater wireless networks. In Chapter 1 we review the state of the art in commercially available SDR platforms, describe their software and hardware capabilities, and classify them based on their ability to enable rapid prototyping and advance experimental research in wireless networks. Chapter 2 discusses system design and implementation details toward real-time evaluation of a software-radio platform for all-spectrum cognitive channelization in the presence of narrowband or wideband primary stations. All-spectrum channelization is achieved by designing maximum signal-to-interference-plus-noise ratio (SINR) waveforms that span the whole continuum of the device-accessible spectrum, while satisfying peak power and interference temperature (IT) constraints for the secondary and primary users, respectively.
In Chapter 3, we introduce the concept of all-spectrum channelization based on max-SINR optimized sparse-binary waveforms, we propose optimal and suboptimal waveform design algorithms, and evaluate their SINR and bit-error-rate (BER) performance in an SDR testbed. Chapter 4 considers the problem of channel estimation with minimal pilot signaling in multi-cell multi-user multi-input multi-output (MIMO) systems with very large antenna arrays at the base station, and proposes a least-squares (LS)-type algorithm that iteratively extracts channel and data estimates from a short record of data measurements. Our algorithmic developments toward spectrally efficient cognitive networking through joint optimization of channel access code-waveforms and routes in a multi-hop network are described in Chapter 5. Algorithmic designs are software optimized on heterogeneous multi-core general-purpose processor (GPP)-based SDR architectures by leveraging a novel software-radio framework that offers self-optimization and real-time adaptation capabilities at the PHY, MAC, and NET layers of the network protocol stack. Our system design approach is experimentally validated under realistic conditions in a large-scale hybrid ground-air testbed deployment. Chapter 6 reviews the state of the art in software and hardware platforms for underwater wireless networking and proposes a software-defined acoustic modem prototype that enables (i) cognitive reconfiguration of PHY/MAC parameters, and (ii) cross-technology communication adaptation. The proposed modem design is evaluated in terms of effective communication data rate in both water tank and lake testbed setups. In Chapter 7, we present a novel receiver configuration for code-waveform-based multiple-access underwater communications. The proposed receiver is fully reconfigurable and executes (i) all-spectrum cognitive channelization, and (ii) combined synchronization, channel estimation, and demodulation. Experimental evaluation in terms of SINR and BER shows that all-spectrum channelization is a powerful proposition for underwater communications. At the same time, the proposed receiver design can significantly enhance bandwidth utilization. Finally, in Chapter 8, we focus on challenging practical issues that arise in underwater acoustic sensor network setups where co-located multi-antenna sensor deployment is not feasible due to power, computation, and hardware limitations, and design, implement, and evaluate an underwater receiver structure that accounts for multiple carrier frequency and timing offsets in virtual (distributed) MIMO underwater systems.
Resource utilization during software development
NASA Technical Reports Server (NTRS)
Zelkowitz, Marvin V.
1988-01-01
This paper discusses resource utilization over the life cycle of software development and examines the role that the current 'waterfall' model plays in the actual software life cycle. Software production in the NASA environment was analyzed to measure these differences. The data from 13 different projects were collected by the Software Engineering Laboratory at NASA Goddard Space Flight Center and analyzed for similarities and differences. The results indicate that the waterfall model is not very realistic in practice, and that as technology introduces further perturbations to this model with concepts like executable specifications, rapid prototyping, and wide-spectrum languages, we need to modify our model of this process.
Fast radio burst search: cross spectrum vs. auto spectrum method
NASA Astrophysics Data System (ADS)
Liu, Lei; Zheng, Weimin; Yan, Zhen; Zhang, Juan
2018-06-01
The search for fast radio bursts (FRBs) is a hot topic in current radio astronomy studies. In this work, we carry out a single pulse search with a very long baseline interferometry (VLBI) pulsar observation data set using both auto spectrum and cross spectrum search methods. The cross spectrum method, first proposed in Liu et al., maximizes the signal power by fully utilizing the fringe phase information of the baseline cross spectrum. The auto spectrum search method is based on the popular pulsar software package PRESTO, which extracts single pulses from the auto spectrum of each station. According to our comparison, the cross spectrum method is able to enhance the signal power and therefore extract single pulses from data contaminated by high levels of radio frequency interference (RFI), which makes it possible to carry out a search for FRBs in regular VLBI observations when RFI is present.
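The contrast between the two statistics can be caricatured in a few lines. The magnitude-only cross spectrum below is a simplified stand-in for the published method, which also exploits the baseline fringe phase; note how interference local to one station is suppressed in the cross term.

import numpy as np

def auto_spectrum(x, nfft=1024):
    """Single-station power spectrum (the quantity PRESTO-style searches use)."""
    X = np.fft.rfft(x, nfft)
    return (X * X.conj()).real

def cross_spectrum_power(x1, x2, nfft=1024):
    """Two-station cross spectrum: a signal common to both stations survives,
    while RFI present at only one station is attenuated."""
    return np.abs(np.fft.rfft(x1, nfft) * np.fft.rfft(x2, nfft).conj())

n = 1024
pulse = np.zeros(n); pulse[500:520] = 1.0                  # common broadband pulse
x1 = pulse + 0.5 * np.sin(2 * np.pi * 0.1 * np.arange(n))  # RFI at station 1 only
x2 = pulse + 0.3 * np.random.randn(n)
print(auto_spectrum(x1).max(), cross_spectrum_power(x1, x2).max())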
Hoch, Jeffrey C
2017-10-01
Non-Fourier methods of spectrum analysis are gaining traction in NMR spectroscopy, driven by their utility for processing nonuniformly sampled data. These methods afford new opportunities for optimizing experiment time, resolution, and sensitivity of multidimensional NMR experiments, but they also pose significant challenges not encountered with the discrete Fourier transform. A brief history of non-Fourier methods in NMR serves to place different approaches in context. Non-Fourier methods reflect broader trends in the growing importance of computation in NMR, and offer insights for future software development. Copyright © 2017 Elsevier Inc. All rights reserved.
A study of land mobile satellite service multipath effects using SATLAB software
NASA Technical Reports Server (NTRS)
Campbell, Richard L.
1991-01-01
A software package is proposed that uses the known properties of signals received in multipath environments, along with the mathematical relationships between signal characteristics, to explore the effects of antenna pattern, vehicle velocity, shadowing of the direct wave, distributions of scatterers around the moving vehicle, and levels of scattered signals on the received complex envelope, fade rates and fade durations, Doppler spectrum, signal arrival angle spectrum, and spatial correlation. The data base may be either actual measured received signals entered as ASCII flat files or data synthesized using a built-in model. An example illustrates the effect of using different antennas to receive signals in the same environment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mitchell, Jessica Sarah
2011-01-01
The MINOS Experiment consists of two steel-scintillator calorimeters, sampling the long-baseline NuMI muon neutrino beam. It was designed to make a precise measurement of the 'atmospheric' neutrino mixing parameters, Δm²_atm and sin²(2θ_atm). The Near Detector measures the initial spectrum of the neutrino beam 1 km from the production target, and the Far Detector, at a distance of 735 km, measures the impact of oscillations in the neutrino energy spectrum. Work performed to validate the quality of the data collected by the Near Detector is presented as part of this thesis. This thesis primarily details the results of a ν_μ disappearance analysis, and presents a new sophisticated fitting software framework, which employs a maximum likelihood method to extract the best-fit oscillation parameters. The software is entirely decoupled from the extrapolation procedure between the detectors, and is capable of fitting multiple event samples (defined by the selections applied) in parallel, and any combination of energy-dependent and independent sources of systematic error. Two techniques to improve the sensitivity of the oscillation measurement were also developed. The inclusion of information on the energy resolution of the neutrino events results in a significant improvement in the allowed region for the oscillation parameters. The degree to which sin²(2θ) = 1.0 could be disfavoured with the exposure of the current dataset if the true mixing angle was non-maximal was also investigated, with an improved neutrino energy reconstruction for very low energy events. The best-fit oscillation parameters, obtained by the fitting software and incorporating resolution information, were: |Δm²| = 2.32 +0.12/−0.08 × 10⁻³ eV² and sin²(2θ) > 0.90 (90% C.L.). The analysis provides the current world-best measurement of the atmospheric neutrino mass splitting Δm². The alternative models of neutrino decay and decoherence are disfavoured by 7.8σ and 9.7σ, respectively.
The effects of clutter-rejection filtering on estimating weather spectrum parameters
NASA Technical Reports Server (NTRS)
Davis, W. T.
1989-01-01
The effects of clutter-rejection filtering on estimating the weather parameters from pulse Doppler radar measurement data are investigated. The pulse pair method of estimating the spectrum mean and spectrum width of the weather is emphasized. The loss of sensitivity, a measure of the signal power lost due to filtering, is also considered. A flexible software tool developed to investigate these effects is described. It allows for simulated weather radar data, in which the user specifies an underlying truncated Gaussian spectrum, as well as for externally generated data which may be real or simulated. The filter may be implemented in either the time or the frequency domain. The software tool is validated by comparing unfiltered spectrum mean and width estimates to their true values, and by reproducing previously published results. The effects on the weather parameter estimates using simulated weather-only data are evaluated for five filters: an ideal filter, two infinite impulse response filters, and two finite impulse response filters. Results considering external data, consisting of weather and clutter data, are evaluated on a range cell by range cell basis. Finally, it is shown theoretically and by computer simulation that a linear phase response is not required for a clutter rejection filter preceding pulse-pair parameter estimation.
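For reference, the pulse-pair estimators themselves follow directly from the lag-0 and lag-1 autocorrelations of the complex I/Q samples. The sketch below uses one common Gaussian-spectrum form; sign conventions and width formulas vary between radar systems.

import numpy as np

def pulse_pair(z, prt, wavelength):
    """Pulse-pair spectrum mean (velocity) and width from I/Q samples z.

    One common Gaussian-spectrum form:
      v_mean  = -(wavelength / (4*pi*prt)) * arg(R1)
      sigma_v =  (wavelength / (2*pi*prt*sqrt(2))) * sqrt(ln(R0 / |R1|))
    """
    r0 = np.mean(np.abs(z) ** 2)            # lag-0 power
    r1 = np.mean(z[1:] * np.conj(z[:-1]))   # lag-1 autocorrelation
    v_mean = -wavelength / (4 * np.pi * prt) * np.angle(r1)
    ratio = max(r0 / abs(r1), 1.0)          # guard against noisy |R1| > R0
    sigma_v = wavelength / (2 * np.pi * prt * np.sqrt(2)) * np.sqrt(np.log(ratio))
    return v_mean, sigma_v

z = np.exp(1j * 2 * np.pi * 0.1 * np.arange(64))  # pure tone: nonzero mean, zero width
print(pulse_pair(z, prt=1e-3, wavelength=0.1))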
Correlative and multivariate analysis of increased radon concentration in underground laboratory.
Maletić, Dimitrije M; Udovičić, Vladimir I; Banjanac, Radomir M; Joković, Dejan R; Dragić, Aleksandar L; Veselinović, Nikola B; Filipović, Jelena
2014-11-01
The results of an analysis, using correlative and multivariate methods as developed for data analysis in high-energy physics and implemented in the Toolkit for Multivariate Analysis software package, of the relations between variations of increased radon concentration and climate variables in a shallow underground laboratory are presented. Multivariate regression analysis identified a number of multivariate methods which can give a good evaluation of increased radon concentrations based on climate variables. The use of the multivariate regression methods will enable the investigation of the relation of a specific climate variable with increased radon concentrations by analysis of regression methods, resulting in a 'mapped' underlying functional behaviour of radon concentrations depending on a wide spectrum of climate variables. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Daily quality assurance software for a satellite radiometer system
NASA Technical Reports Server (NTRS)
Keegstra, P. B.; Smoot, G. F.; Bennett, C. L.; Aymon, J.; Backus, C.; Deamici, G.; Hinshaw, G.; Jackson, P. D.; Kogut, A.; Lineweaver, C.
1992-01-01
Six Differential Microwave Radiometers (DMR) on COBE (Cosmic Background Explorer) measure the large-angular-scale isotropy of the cosmic microwave background (CMB) at 31.5, 53, and 90 GHz. Quality assurance software analyzes the daily telemetry from the spacecraft to ensure that the instrument is operating correctly and that the data are not corrupted. Quality assurance for DMR poses challenging requirements. The data are differential, so a single bad point can affect a large region of the sky, yet the CMB isotropy requires lengthy integration times (greater than 1 year) to limit potential CMB anisotropies. Celestial sources (with the exception of the moon) are not, in general, visible in the raw differential data. A 'quicklook' software system was developed that, in addition to basic plotting and limit-checking, implements a collection of data tests as well as long-term trending. Some of the key capabilities include the following: (1) stability analysis showing how well the data RMS averages down with increased data; (2) a Fourier analysis and autocorrelation routine to plot the power spectrum and confirm the presence of the 3 mK 'cosmic' dipole signal; (3) binning of the data against basic spacecraft quantities such as orbit angle; (4) long-term trending; and (5) dipole fits to confirm the spacecraft attitude azimuth angle.
NASA Astrophysics Data System (ADS)
Gong, X.; Wu, Q.
2017-12-01
Network virtual instrumentation (VI) is a new development direction in automated testing. Based on LabVIEW, the software and hardware system of a VI used for the emission spectrum of pulsed high-voltage direct current (DC) discharge is developed and applied to investigate the pulsed high-voltage DC discharge of nitrogen. Various functions are realized, including real-time collection of the emission spectrum of nitrogen, monitoring of the operating state of the instruments, and real-time analysis and processing of data. By using shared variables and DataSocket technology in LabVIEW, a network VI system based on the field VI is established. The system can acquire the emission spectrum of nitrogen at the test site, monitor the operating states of field instruments, support real-time interchange between the two sites, and analyze data remotely from the network terminal. Using the network VI system, the staff at the two sites acquired the same emission spectrum of nitrogen and communicated in real time. Comparison with previous results shows that the experimental data obtained with the system are highly precise. This implies that the system offers reliable network stability and safety and satisfies the requirements for studying the emission spectrum of pulsed high-voltage discharge in high-precision fields or at network terminals. The proposed system architecture is described and should provide useful guidance to many groups, including remote engineering users, specifically in control- and automation-related tasks.
Chen, Jonathan H K; Cheng, Vincent C C; Wong, Chun-Pong; Wong, Sally C Y; Yam, Wing-Cheong; Yuen, Kwok-Yung
2017-09-01
Haemophilus influenzae is associated with severe invasive disease, while Haemophilus haemolyticus is considered part of the commensal flora in the human respiratory tract. Although the addition of a custom mass spectrum library into the matrix-assisted laser desorption ionization-time of flight mass spectrometry (MALDI-TOF MS) system could improve identification of these two species, the establishment of such a custom database is technically complicated and requires a large amount of resources, which most clinical laboratories cannot afford. In this study, we developed a mass spectrum analysis model with 7 mass peak biomarkers for the identification of H. influenzae and H. haemolyticus using the ClinProTools software. We evaluated the diagnostic performance of this model using 408 H. influenzae and H. haemolyticus isolates from clinical respiratory specimens from 363 hospitalized patients and compared the identification results with those obtained with the Bruker IVD MALDI Biotyper. The IVD MALDI Biotyper identified only 86.9% of H. influenzae (311/358) and 98.0% of H. haemolyticus (49/50) clinical isolates to the species level. In comparison, the ClinProTools mass spectrum model could identify 100% of H. influenzae (358/358) and H. haemolyticus (50/50) clinical strains to the species level and significantly improved the species identification rate (McNemar's test, P < 0.0001). In conclusion, the use of ClinProTools demonstrated an alternative way for users lacking special expertise in mass spectrometry to handle closely related bacterial species when the proprietary spectrum library failed. This approach should be useful for the differentiation of other closely related bacterial species. Copyright © 2017 American Society for Microbiology.
Cheng, Vincent C. C.; Wong, Chun-Pong; Wong, Sally C. Y.; Yam, Wing-Cheong
2017-01-01
Haemophilus influenzae is associated with severe invasive disease, while Haemophilus haemolyticus is considered part of the commensal flora in the human respiratory tract. Although the addition of a custom mass spectrum library into the matrix-assisted laser desorption ionization–time of flight mass spectrometry (MALDI-TOF MS) system could improve identification of these two species, the establishment of such a custom database is technically complicated and requires a large amount of resources, which most clinical laboratories cannot afford. In this study, we developed a mass spectrum analysis model with 7 mass peak biomarkers for the identification of H. influenzae and H. haemolyticus using the ClinProTools software. We evaluated the diagnostic performance of this model using 408 H. influenzae and H. haemolyticus isolates from clinical respiratory specimens from 363 hospitalized patients and compared the identification results with those obtained with the Bruker IVD MALDI Biotyper. The IVD MALDI Biotyper identified only 86.9% of H. influenzae (311/358) and 98.0% of H. haemolyticus (49/50) clinical isolates to the species level. In comparison, the ClinProTools mass spectrum model could identify 100% of H. influenzae (358/358) and H. haemolyticus (50/50) clinical strains to the species level and significantly improved the species identification rate (McNemar's test, P < 0.0001). In conclusion, the use of ClinProTools demonstrated an alternative way for users lacking special expertise in mass spectrometry to handle closely related bacterial species when the proprietary spectrum library failed. This approach should be useful for the differentiation of other closely related bacterial species. PMID:28637909
Shteynberg, David; Mendoza, Luis; Hoopmann, Michael R.; Sun, Zhi; Schmidt, Frank; Deutsch, Eric W.; Moritz, Robert L.
2016-01-01
Most shotgun proteomics data analysis workflows are based on the assumption that each fragment ion spectrum is explained by a single species of peptide ion isolated by the mass spectrometer; however, in reality mass spectrometers often isolate more than one peptide ion within the window of isolation that contributes to additional peptide fragment peaks in many spectra. We present a new tool called reSpect, implemented in the Trans-Proteomic Pipeline (TPP), that enables an iterative workflow whereby fragment ion peaks explained by a peptide ion identified in one round of sequence searching or spectral library search are attenuated based on the confidence of the identification, and then the altered spectrum is subjected to further rounds of searching. The reSpect tool is not implemented as a search engine, but rather as a post search engine processing step where only fragment ion intensities are altered. This enables the application of any search engine combination in the following iterations. Thus, reSpect is compatible with all other protein sequence database search engines as well as peptide spectral library search engines that are supported by the TPP. We show that while some datasets are highly amenable to chimeric spectrum identification and lead to additional peptide identification boosts of over 30% with as many as four different peptide ions identified per spectrum, datasets with narrow precursor ion selection only benefit from such processing at the level of a few percent. We demonstrate a technique that facilitates the determination of the degree to which a dataset would benefit from chimeric spectrum analysis. The reSpect tool is free and open source, provided within the TPP and available at the TPP website. PMID:26419769
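The attenuation step at the heart of this iterative workflow can be illustrated with a toy fragment. The tolerance and the linear confidence scaling below are assumptions for illustration, not reSpect's exact scheme.

import numpy as np

def attenuate_assigned_peaks(mz, intensity, assigned_mz, confidence, tol=0.02):
    """Scale down fragment peaks explained by an identified peptide so that a
    further search round can pick up co-isolated species (confidence in [0, 1])."""
    out = intensity.copy()
    for frag in assigned_mz:
        out[np.abs(mz - frag) <= tol] *= (1.0 - confidence)
    return out

# One iteration: attenuate peaks matched by a 0.98-confidence ID, then re-search
residual = attenuate_assigned_peaks(np.array([200.10, 300.20, 400.30]),
                                    np.array([1.0e4, 5.0e3, 8.0e3]),
                                    assigned_mz=[200.10, 400.30],
                                    confidence=0.98)
print(residual)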
Shteynberg, David; Mendoza, Luis; Hoopmann, Michael R; Sun, Zhi; Schmidt, Frank; Deutsch, Eric W; Moritz, Robert L
2015-11-01
Most shotgun proteomics data analysis workflows are based on the assumption that each fragment ion spectrum is explained by a single species of peptide ion isolated by the mass spectrometer; however, in reality mass spectrometers often isolate more than one peptide ion within the window of isolation that contribute to additional peptide fragment peaks in many spectra. We present a new tool called reSpect, implemented in the Trans-Proteomic Pipeline (TPP), which enables an iterative workflow whereby fragment ion peaks explained by a peptide ion identified in one round of sequence searching or spectral library search are attenuated based on the confidence of the identification, and then the altered spectrum is subjected to further rounds of searching. The reSpect tool is not implemented as a search engine, but rather as a post-search engine processing step where only fragment ion intensities are altered. This enables the application of any search engine combination in the iterations that follow. Thus, reSpect is compatible with all other protein sequence database search engines as well as peptide spectral library search engines that are supported by the TPP. We show that while some datasets are highly amenable to chimeric spectrum identification and lead to additional peptide identification boosts of over 30% with as many as four different peptide ions identified per spectrum, datasets with narrow precursor ion selection only benefit from such processing at the level of a few percent. We demonstrate a technique that facilitates the determination of the degree to which a dataset would benefit from chimeric spectrum analysis. The reSpect tool is free and open source, provided within the TPP and available at the TPP website.
NASA Astrophysics Data System (ADS)
Shteynberg, David; Mendoza, Luis; Hoopmann, Michael R.; Sun, Zhi; Schmidt, Frank; Deutsch, Eric W.; Moritz, Robert L.
2015-11-01
Most shotgun proteomics data analysis workflows are based on the assumption that each fragment ion spectrum is explained by a single species of peptide ion isolated by the mass spectrometer; however, in reality mass spectrometers often isolate more than one peptide ion within the window of isolation that contribute to additional peptide fragment peaks in many spectra. We present a new tool called reSpect, implemented in the Trans-Proteomic Pipeline (TPP), which enables an iterative workflow whereby fragment ion peaks explained by a peptide ion identified in one round of sequence searching or spectral library search are attenuated based on the confidence of the identification, and then the altered spectrum is subjected to further rounds of searching. The reSpect tool is not implemented as a search engine, but rather as a post-search engine processing step where only fragment ion intensities are altered. This enables the application of any search engine combination in the iterations that follow. Thus, reSpect is compatible with all other protein sequence database search engines as well as peptide spectral library search engines that are supported by the TPP. We show that while some datasets are highly amenable to chimeric spectrum identification and lead to additional peptide identification boosts of over 30% with as many as four different peptide ions identified per spectrum, datasets with narrow precursor ion selection only benefit from such processing at the level of a few percent. We demonstrate a technique that facilitates the determination of the degree to which a dataset would benefit from chimeric spectrum analysis. The reSpect tool is free and open source, provided within the TPP and available at the TPP website.
Analysis of normal human retinal vascular network architecture using multifractal geometry
Ţălu, Ştefan; Stach, Sebastian; Călugăru, Dan Mihai; Lupaşcu, Carmen Alina; Nicoară, Simona Delia
2017-01-01
AIM: To apply the multifractal analysis method as a quantitative approach to a comprehensive description of the microvascular network architecture of the normal human retina. METHODS: Fifty volunteers were enrolled in this study in the Ophthalmological Clinic of Cluj-Napoca, Romania, between January 2012 and January 2014. A set of 100 segmented and skeletonised human retinal images, corresponding to normal states of the retina, were studied. An automatic unsupervised method for retinal vessel segmentation was applied before multifractal analysis. The multifractal analysis of digital retinal images was made with computer algorithms, applying the standard box-counting method. Statistical analyses were performed using the GraphPad InStat software. RESULTS: The architecture of the normal human retinal microvascular network could be described using multifractal geometry. The averages of the generalized dimensions (Dq) for q = 0, 1, 2, the width of the multifractal spectrum (Δα = α_max − α_min) and the spectrum arms' height difference (|Δf|) of the normal images were expressed as mean ± standard deviation (SD): for segmented versions, D0 = 1.7014 ± 0.0057; D1 = 1.6507 ± 0.0058; D2 = 1.5772 ± 0.0059; Δα = 0.92441 ± 0.0085; |Δf| = 0.1453 ± 0.0051; for skeletonised versions, D0 = 1.6303 ± 0.0051; D1 = 1.6012 ± 0.0059; D2 = 1.5531 ± 0.0058; Δα = 0.65032 ± 0.0162; |Δf| = 0.0238 ± 0.0161. The averages of the generalized dimensions (Dq) for q = 0, 1, 2, the width of the multifractal spectrum (Δα) and the spectrum arms' height difference (|Δf|) of the segmented versions were slightly greater than for the skeletonised versions. CONCLUSION: The multifractal analysis of fundus photographs may be used as a quantitative parameter for the evaluation of the complex three-dimensional structure of the retinal microvasculature as a potential marker for early detection of topological changes associated with retinal diseases. PMID:28393036
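The standard box-counting estimate of the capacity dimension D0 used in such analyses can be sketched as below for a binary vessel image; the generalized dimensions Dq and the f(α) spectrum require the fuller multifractal formalism, which is not shown.

import numpy as np

def box_counting_dimension(img, sizes=(2, 4, 8, 16, 32, 64)):
    """Estimate D0 of a binary image as the slope of log N(s) vs log(1/s)."""
    counts = []
    for s in sizes:
        h, w = (img.shape[0] // s) * s, (img.shape[1] // s) * s
        blocks = img[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(blocks.any(axis=(1, 3)).sum())  # occupied boxes of size s
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

img = np.zeros((256, 256), dtype=bool)
img[64:192, 64:192] = True             # a filled square ...
print(box_counting_dimension(img))     # ... should give a dimension near 2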
VLSI Technology for Cognitive Radio
NASA Astrophysics Data System (ADS)
VIJAYALAKSHMI, B.; SIDDAIAH, P.
2017-08-01
One of the most challenging tasks in cognitive radio is achieving an efficient spectrum sensing scheme to overcome the spectrum scarcity problem. The most popular and widely used spectrum sensing technique is the energy detection scheme, as it is very simple and doesn't require any prior information about the signal. We propose one such approach: an optimised spectrum sensing scheme with a reduced filter structure. The optimisation is done in terms of the area and power performance of the spectrum sensing scheme. Simulations of the VLSI structure of the optimised flexible spectrum sensor are done in Verilog using the XILINX ISE software. Our method achieves a 13% reduction in area and a 66% reduction in power consumption in comparison to the flexible spectrum sensing scheme. All results are tabulated and comparisons are made. Our model opens up a new scheme for optimised and effective spectrum sensing.
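For context, the classic energy detection test that the optimised scheme builds on compares the summed signal energy against a threshold set by a target false-alarm probability. The sketch below assumes real-valued samples and known noise variance, an idealisation of the hardware scheme.

import numpy as np
from scipy import stats

def energy_detector(x, noise_var, pfa=0.01):
    """Declare the band occupied when the energy exceeds the threshold that,
    under noise only (chi-square with x.size dof), is crossed with prob. pfa."""
    threshold = noise_var * stats.chi2.ppf(1 - pfa, df=x.size)
    return np.sum(np.abs(x) ** 2) > threshold

rng = np.random.default_rng(1)
noise = rng.normal(scale=1.0, size=1000)
occupied = noise + 0.8 * np.sin(2 * np.pi * 0.05 * np.arange(1000))
print(energy_detector(noise, 1.0), energy_detector(occupied, 1.0))  # False True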
ELER software - a new tool for urban earthquake loss assessment
NASA Astrophysics Data System (ADS)
Hancilar, U.; Tuzun, C.; Yenidogan, C.; Erdik, M.
2010-12-01
Rapid loss estimation after potentially damaging earthquakes is critical for effective emergency response and public information. A methodology and software package, ELER-Earthquake Loss Estimation Routine, for rapid estimation of earthquake shaking and losses throughout the Euro-Mediterranean region was developed under the Joint Research Activity-3 (JRA3) of the EC FP6 Project entitled "Network of Research Infrastructures for European Seismology-NERIES". Recently, a new version (v2.0) of ELER software has been released. The multi-level methodology developed is capable of incorporating regional variability and uncertainty originating from ground motion predictions, fault finiteness, site modifications, inventory of physical and social elements subjected to earthquake hazard and the associated vulnerability relationships. Although primarily intended for quasi real-time estimation of earthquake shaking and losses, the routine is also equally capable of incorporating scenario-based earthquake loss assessments. This paper introduces the urban earthquake loss assessment module (Level 2) of the ELER software which makes use of the most detailed inventory databases of physical and social elements at risk in combination with the analytical vulnerability relationships and building damage-related casualty vulnerability models for the estimation of building damage and casualty distributions, respectively. Spectral capacity-based loss assessment methodology and its vital components are presented. The analysis methods of the Level 2 module, i.e. Capacity Spectrum Method (ATC-40, 1996), Modified Acceleration-Displacement Response Spectrum Method (FEMA 440, 2005), Reduction Factor Method (Fajfar, 2000) and Coefficient Method (ASCE 41-06, 2006), are applied to the selected building types for validation and verification purposes. The damage estimates are compared to the results obtained from the other studies available in the literature, i.e. SELENA v4.0 (Molina et al., 2008) and ATC-55 (Yang, 2005). An urban loss assessment exercise for a scenario earthquake for the city of Istanbul is conducted and physical and social losses are presented. Damage to the urban environment is compared to the results obtained from similar software, i.e. KOERILoss (KOERI, 2002) and DBELA (Crowley et al., 2004). The European rapid loss estimation tool is expected to help enable effective emergency response, on both local and global level, as well as public information.
Analysis and control of the vibration of doubly fed wind turbine
NASA Astrophysics Data System (ADS)
Yu, Manye; Lin, Ying
2017-01-01
The fault phenomena of the violent vibration of a certain doubly-fed wind turbine were researched comprehensively, and the dynamic characteristics, load and fault conditions of the system were discussed. Firstly, a structural dynamics analysis of the wind turbine is made and a dynamic model is built. Secondly, vibration testing of the wind turbine is done with the German BBM test and analysis system. Thirdly, the signals are analyzed and processed. Based on the experiment, spectrum analysis of the motor dynamic balance can be made using the Signal Processing Toolbox of the MATLAB software, and the analysis conclusions show that the vibration of the wind turbine is caused by dynamic imbalance. The results show that integrating mechanical system dynamics theory with advanced test technology can solve the vibration problem more successfully, which is important in the vibration diagnosis of mechanical equipment.
User Interactive Software for Analysis of Human Physiological Data
NASA Technical Reports Server (NTRS)
Cowings, Patricia S.; Toscano, William; Taylor, Bruce C.; Acharya, Soumydipta
2006-01-01
Ambulatory physiological monitoring has been used to study human health and performance in space and in a variety of Earth-based environments (e.g., military aircraft, armored vehicles, small groups in isolation, and patients). Large, multi-channel data files are typically recorded in these environments, and these files often require the removal of contaminated data prior to processing and analyses. Physiological data processing can now be performed with user-friendly, interactive software developed by the Ames Psychophysiology Research Laboratory. This software, which runs on a Windows platform, contains various signal-processing routines for both time- and frequency-domain data analyses (e.g., peak detection, differentiation and integration, digital filtering, adaptive thresholds, Fast Fourier Transform power spectrum, auto-correlation, etc.). Data acquired with any ambulatory monitoring system that provides text or binary file format are easily imported to the processing software. The application provides a graphical user interface where one can manually select and correct data artifacts utilizing linear and zero interpolation and adding trigger points for missed peaks. Block and moving average routines are also provided for data reduction. Processed data in numeric and graphic format can be exported to Excel. This software, PostProc (for post-processing), requires the Dadisp engineering spreadsheet (DSP Development Corp), or equivalent, for implementation. Specific processing routines were written for electrocardiography, electroencephalography, electromyography, blood pressure, skin conductance level, impedance cardiography (cardiac output, stroke volume, thoracic fluid volume), temperature, and respiration.
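As an illustration of the kind of routines listed above, the following sketch implements artifact correction by linear interpolation and a one-sided FFT power spectrum in NumPy. It is a minimal stand-in, not the PostProc/Dadisp implementation; the sampling rate, the synthetic signal, and the artifact mask are all invented.

```python
import numpy as np

def interpolate_artifacts(signal, bad):
    """Replace flagged samples (boolean mask `bad`) by linear interpolation."""
    x = np.arange(signal.size)
    clean = signal.copy()
    clean[bad] = np.interp(x[bad], x[~bad], signal[~bad])
    return clean

def power_spectrum(signal, fs):
    """One-sided FFT power spectrum of a uniformly sampled, detrended signal."""
    spec = np.abs(np.fft.rfft(signal - signal.mean()))**2 / signal.size
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    return freqs, spec

fs = 250.0                                   # assumed sampling rate, Hz
data = np.random.randn(int(10 * fs))         # stand-in for a recorded channel
mask = np.zeros(data.size, dtype=bool)
mask[1000:1020] = True                       # manually flagged artifact span
freqs, spec = power_spectrum(interpolate_artifacts(data, mask), fs)
```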
Analysis of electric vehicle extended range misalignment based on rigid-flexible dynamics
NASA Astrophysics Data System (ADS)
Xu, Xiaowei; Lv, Mingliang; Chen, Zibo; Ji, Wei; Gao, Ruiceng
2017-04-01
The safety of the extended-range electric vehicle is seriously affected by misalignment faults. This paper therefore analyzes extended-range electric vehicle misalignment on the basis of rigid-flexible dynamics. By comprehensively applying hybrid rigid-flexible modeling together with methods for fault diagnosis of machinery and equipment, an extender hybrid rigid-flexible mechanical model was established by means of the software ADAMS and ANSYS. By setting the relevant parameters to simulate shafting misalignment, the failure phenomena, the spectrum analysis and the evolution rules were analyzed. It is concluded that the 0.5x and 1x harmonics can be taken as the characteristic parameters for misalignment diagnosis in extended-range electric vehicles.
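To make the stated diagnostic concrete, the sketch below reads the spectral amplitudes at 0.5x and 1x the shaft rotation frequency from a synthetic vibration signal, the two harmonics the paper identifies as misalignment signatures. All parameters are assumptions for illustration; this is not the authors' ADAMS/ANSYS workflow.

```python
import numpy as np

fs, f_rot = 2000.0, 25.0                  # assumed sample rate and shaft speed, Hz
t = np.arange(0, 4, 1 / fs)
vib = 0.6 * np.sin(2 * np.pi * 0.5 * f_rot * t) + 1.0 * np.sin(2 * np.pi * f_rot * t)

spec = np.abs(np.fft.rfft(vib)) * 2 / t.size   # single-sided amplitude spectrum
freqs = np.fft.rfftfreq(t.size, 1 / fs)

for mult in (0.5, 1.0):                    # the 0.5x and 1x harmonics
    idx = np.argmin(np.abs(freqs - mult * f_rot))
    print(f"{mult}x harmonic amplitude: {spec[idx]:.2f}")
```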
Assessing Visual-Spatial Creativity in Youth on the Autism Spectrum
ERIC Educational Resources Information Center
Diener, Marissa L.; Wright, Cheryl A.; Smith, Katherine N.; Wright, Scott D.
2014-01-01
The goal of this study was to develop a measure of creativity that builds on the strengths of youth with autism spectrum disorders (ASD). The assessment of creativity focused on the visual-spatial abilities of these youth using 3D modeling software. One of the objectives of the research was to develop a measure of creativity in an authentic…
A survey of fault diagnosis technology
NASA Technical Reports Server (NTRS)
Riedesel, Joel
1989-01-01
Existing techniques and methodologies for fault diagnosis are surveyed. The techniques run the gamut from theoretical artificial intelligence work to conventional software engineering applications. They are shown to define a spectrum of implementation alternatives where tradeoffs determine their position on the spectrum. Various tradeoffs include execution time limitations and memory requirements of the algorithms as well as their effectiveness in addressing the fault diagnosis problem.
Xue, Alexander T; Hickerson, Michael J
2017-11-01
Population genetic data from multiple taxa can address comparative phylogeographic questions about community-scale response to environmental shifts, and a useful strategy to this end is to employ hierarchical co-demographic models that directly test multi-taxa hypotheses within a single, unified analysis. This approach has been applied to classical phylogeographic data sets such as mitochondrial barcodes as well as reduced-genome polymorphism data sets that can yield 10,000s of SNPs, produced by emergent technologies such as RAD-seq and GBS. A strategy for the latter had been accomplished by adapting the site frequency spectrum to a novel summarization of population genomic data across multiple taxa called the aggregate site frequency spectrum (aSFS), which potentially can be deployed under various inferential frameworks including approximate Bayesian computation, random forest and composite likelihood optimization. Here, we introduce the R package multi-dice, a wrapper program that exploits existing simulation software for flexible execution of hierarchical model-based inference using the aSFS, which is derived from reduced genome data, as well as mitochondrial data. We validate several novel software features such as applying alternative inferential frameworks, enforcing a minimal threshold of time surrounding co-demographic pulses and specifying flexible hyperprior distributions. In sum, multi-dice provides comparative analysis within the familiar R environment while allowing a high degree of user customization, and will thus serve as a tool for comparative phylogeography and population genomics. © 2017 The Authors. Molecular Ecology Resources Published by John Wiley & Sons Ltd.
OpenMSI: A High-Performance Web-Based Platform for Mass Spectrometry Imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rubel, Oliver; Greiner, Annette; Cholia, Shreyas
Mass spectrometry imaging (MSI) enables researchers to directly probe endogenous molecules within the architecture of the biological matrix. Unfortunately, efficient access, management, and analysis of the data generated by MSI approaches remain major challenges to this rapidly developing field. Despite the availability of numerous dedicated file formats and software packages, it is a widely held viewpoint that the biggest challenge is simply opening, sharing, and analyzing a file without loss of information. Here we present OpenMSI, a software framework and platform that addresses these challenges via an advanced, high-performance, extensible file format and Web API for remote data access (http://openmsi.nersc.gov). The OpenMSI file format supports storage of raw MSI data, metadata, and derived analyses in a single, self-describing format based on HDF5 and is supported by a large range of analysis software (e.g., Matlab and R) and programming languages (e.g., C++, Fortran, and Python). Careful optimization of the storage layout of MSI data sets using chunking, compression, and data replication accelerates common, selective data access operations while minimizing data storage requirements, and is a critical enabler of rapid data I/O. The OpenMSI file format has been shown to provide a >2000-fold improvement for image access operations, enabling spectrum and image retrieval in less than 0.3 s across the Internet even for 50 GB MSI data sets. To make remote high-performance compute resources accessible for analysis and to facilitate data sharing and collaboration, we describe an easy-to-use yet powerful Web API, enabling fast and convenient access to MSI data, metadata, and derived analysis results stored remotely to facilitate high-performance data analysis and enable implementation of Web-based data sharing, visualization, and analysis.
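The storage idea credited above for the access speedup, a chunked and compressed HDF5 layout, can be sketched with generic h5py calls; this is not the actual OpenMSI file layout, and the array shapes and chunk sizes are made up.

```python
import numpy as np
import h5py

nx, ny, nmz = 200, 150, 4096              # image grid and m/z axis (assumed)
with h5py.File("msi_demo.h5", "w") as f:
    dset = f.create_dataset(
        "msidata", shape=(nx, ny, nmz), dtype="float32",
        chunks=(16, 16, 256),             # small chunks keep both access patterns cheap
        compression="gzip", compression_opts=4)
    dset[0, 0, :] = np.random.rand(nmz)   # write one spectrum

with h5py.File("msi_demo.h5", "r") as f:
    spectrum = f["msidata"][0, 0, :]      # selective read: one pixel's spectrum
    image = f["msidata"][:, :, 1000]      # selective read: one m/z slice (ion image)
```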
Real-Time Spatio-Temporal Twice Whitening for MIMO Energy Detector
DOE Office of Scientific and Technical Information (OSTI.GOV)
Humble, Travis S; Mitra, Pramita; Barhen, Jacob
2010-01-01
While many techniques exist for local spectrum sensing of a primary user, each represents a computationally demanding task to secondary user receivers. In software-defined radio, computational complexity lengthens the time for a cognitive radio to recognize changes in the transmission environment. This complexity is even more significant for spatially multiplexed receivers, e.g., in SIMO and MIMO, where the spatio-temporal data sets grow in size with the number of antennae. Limits on power and space for the processor hardware further constrain SDR performance. In this report, we discuss improvements in spatio-temporal twice whitening (STTW) for real-time local spectrum sensing by demonstrating a form of STTW well suited for MIMO environments. We implement STTW on the Coherent Logix hx3100 processor, a multicore processor intended for low-power, high-throughput software-defined signal processing. These results demonstrate how coupling the novel capabilities of emerging multicore processors with algorithmic advances can enable real-time, software-defined processing of large spatio-temporal data sets.
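The core whitening operation behind STTW can be sketched generically in NumPy: estimate the spatio-temporal covariance of the stacked antenna data and apply the inverse of its Cholesky factor. This is only a conceptual stand-in under assumed dimensions and regularization, not the hx3100 implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
n_ant, n_tap, n_snap = 4, 8, 500          # antennas, time taps, snapshots (assumed)
X = rng.standard_normal((n_ant * n_tap, n_snap))   # stacked space-time snapshots

R = X @ X.T / n_snap                      # sample spatio-temporal covariance
L = np.linalg.cholesky(R + 1e-6 * np.eye(R.shape[0]))  # regularized factor
X_white = np.linalg.solve(L, X)           # whitened data: covariance ~ identity

cov_w = X_white @ X_white.T / n_snap
print(np.allclose(cov_w, np.eye(R.shape[0]), atol=0.2))  # True
```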
ERIC Educational Resources Information Center
Lacava, Paul G.; Rankin, Ana; Mahlios, Emily; Cook, Katie; Simpson, Richard L.
2010-01-01
Many students with Autism Spectrum Disorders (ASD) have delays learning to recognize emotions. Social behavior is also challenging, including initiating interactions, responding to others, developing peer relationships, and so forth. In this single case design study we investigated the relationship between use of computer software ("Mind Reading:…
An Investigation of Techniques for Detecting Data Anomalies in Earned Value Management Data
2011-12-01
Tools surveyed include Management Studio, the Harte-Hanks Trillium Software System, IBM InfoSphere Foundation Tools, Informatica Data Explorer, Informatica Analyst, Informatica Developer, Informatica Administrator, Pitney Bowes Business Insight Spectrum, SAP BusinessObjects Data Quality Management, and DataFlux, covering tasks such as documenting quality monitoring efforts and tracking data quality improvements (Informatica: http://www.informatica.com/products_services/Pages/index.aspx).
Using Animated Language Software with Children Diagnosed with Autism Spectrum Disorders
ERIC Educational Resources Information Center
Mulholland, Rita; Pete, Ann Marie; Popeson, Joanne
2008-01-01
We examined the impact of using an animated software program (Team Up With Timo) on the expressive and receptive language abilities of five children ages 5-9 in a self-contained Learning and Language Disabilities class. We chose to use Team Up With Timo (Animated Speech Corporation) because it allows the teacher to personalize the animation for…
Near-Earth object 2012XJ112 as a source of bright bolides of achondritic nature
NASA Astrophysics Data System (ADS)
Madiedo, José M.; Trigo-Rodríguez, Josep M.; Williams, Iwan P.; Konovalova, Natalia; Ortiz, José L.; Castro-Tirado, Alberto J.; Pastor, Sensi; de los Reyes, José A.; Cabrera-Caño, Jesús
2014-04-01
We analyse the likely link between the recently discovered near-Earth object 2012XJ112 and a bright fireball observed over the south of Spain on 2012 December 27. The bolide, with an absolute magnitude of -9 ± 1, was simultaneously imaged during morning twilight from two meteor stations operated by the SPanish Meteor Network (SPMN). It was also observed by several casual witnesses. The emission spectrum produced during the ablation of the meteoroid in the atmosphere was also recorded, and from its analysis the chemical nature of this particle was inferred. Although our orbital association software identified several potential parent bodies for this meteoroid, the analysis of the evolution of the orbital elements performed with the MERCURY 6 symplectic integrator supports the idea that NEO 2012XJ112 is the source of this meteoroid. The implications of this potential association are discussed here. In particular, the meteoroid bulk chemistry is consistent with a basaltic achondrite, which emphasizes the importance of deducing the reflectance spectrum and taxonomic nature of 2012XJ112 from future Earth approaches.
Statistical analysis of low-voltage EDS spectrum images
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, I.M.
1998-03-01
The benefits of using low (≤5 kV) operating voltages for energy-dispersive X-ray spectrometry (EDS) of bulk specimens have been explored only during the last few years. This paper couples low-voltage EDS with two other emerging areas of characterization: spectrum imaging and multivariate statistical analysis (MSA). The specimen was a computer chip manufactured by a major semiconductor company. Data acquisition was performed with a Philips XL30-FEG SEM operated at 4 kV and equipped with an Oxford super-ATW detector and XP3 pulse processor. The specimen was normal to the electron beam and the take-off angle for acquisition was 35°. The microscope was operated with a 150 µm diameter final aperture at spot size 3, which yielded an X-ray count rate of ≈2,000 s⁻¹. EDS spectrum images were acquired as Adobe Photoshop files with the 4pi plug-in module. (The spectrum images could also be stored as NIH Image files, but the raw data are automatically rescaled as maximum-contrast (0-255) 8-bit TIFF images, even at 16-bit resolution, which poses an inconvenience for quantitative analysis.) The 4pi plug-in module is designed for EDS X-ray mapping and allows simultaneous acquisition of maps from 48 elements plus an SEM image. The spectrum image was acquired by re-defining the energy intervals of the 48 elements to form a series of contiguous 20 eV windows from 1.25 kV to 2.19 kV. A spectrum image of 450 x 344 pixels was acquired from the specimen with a sampling density of 50 nm/pixel and a dwell time of 0.25 live seconds per pixel, for a total acquisition time of ≈14 h. The binary data files were imported into Mathematica for analysis with software developed by the author at Oak Ridge National Laboratory. A 400 x 300 pixel section of the original image was analyzed. MSA required ≈185 Mbytes of memory and ≈18 h of CPU time on a 300 MHz Power Macintosh 9600.
Prowess - A Software Model for the Ooty Wide Field Array
NASA Astrophysics Data System (ADS)
Marthi, Visweshwar Ram
2017-03-01
One of the scientific objectives of the Ooty Wide Field Array (OWFA) is to observe the redshifted H i emission from z ˜ 3.35. Although predictions spell out optimistic outcomes in reasonable integration times, these studies were based purely on analytical assumptions, without accounting for limiting systematics. A complete software model for OWFA has been developed with a view to understanding the instrument-induced systematics. This model has been implemented through a suite of programs, together called Prowess, which has been conceived with the dual role of an emulator as well as observatory data analysis software. The programming philosophy followed in building Prowess enables a general user to define their own set of functions and add new functionality. This paper describes a co-ordinate system suitable for OWFA in which the baselines are defined. The foregrounds are simulated from their angular power spectra. The visibilities are then computed from the foregrounds. These visibilities are then used for further processing, such as calibration and power spectrum estimation. The package allows for rich visualization features in multiple output formats in an interactive fashion, giving the user an intuitive feel for the data. Prowess has been extensively used for numerical predictions of the foregrounds for the OWFA H i experiment.
NASA Astrophysics Data System (ADS)
Sert, Yusuf; Singer, L. M.; Findlater, M.; Doğan, Hatice; Çırak, Ç.
2014-07-01
In this study, the experimental and theoretical vibrational frequencies of a newly synthesized tert-Butyl N-(thiophen-2yl)carbamate have been investigated. The experimental FT-IR (4000-400 cm⁻¹) spectrum of the molecule in the solid phase has been recorded. The theoretical vibrational frequencies and optimized geometric parameters (bond lengths and bond angles) have been calculated by using density functional theory (DFT/B3LYP: Becke, 3-parameter, Lee-Yang-Parr) and DFT/M06-2X (the highly parametrized, empirical exchange correlation functional) quantum chemical methods with the 6-311++G(d,p) basis set in the Gaussian 09W software, for the first time. The vibrational frequencies have been assigned using potential energy distribution (PED) analysis with the VEDA 4 software. The computed optimized geometric parameters and vibrational frequencies have been found to be in good agreement with the corresponding experimental data and with related literature results. In addition, the highest occupied molecular orbital (HOMO) and lowest unoccupied molecular orbital (LUMO) energies and other related molecular energy values have been calculated and are depicted.
Chromá, Magdaléna; Hricová, Kristýna; Kolář, Milan; Sauer, Pavel; Koukalová, Dagmar
2011-11-01
A total of 78 bacterial strains with known β-lactamases were used to optimize a rapid detection system consisting of multiplex polymerase chain reaction and melting curve analysis to amplify and identify blaTEM, blaSHV, and blaCTX-M genes in a single reaction. Additionally, to evaluate the applicability of this method, 32 clinical isolates of Escherichia coli displaying an extended-spectrum β-lactamase phenotype from patients hospitalized at intensive care units were tested. Results were analyzed by the Rotor-Gene operating software and Rotor-Gene ScreenClust HRM Software. The individual melting curves differed by a temperature shift or curve shape, according to the presence of β-lactamase genes. With the use of this method and direct sequencing, blaCTX-M-15-like was identified as the most prevalent β-lactamase gene. In conclusion, this novel detection system seems to be a suitable tool for rapid detection of present β-lactamase genes and their characterization. Copyright © 2011 Elsevier Inc. All rights reserved.
Measurement of the refractive index of solutions based on digital holographic microscopy
NASA Astrophysics Data System (ADS)
Huang, Sujuan; Wang, Weiping; Zeng, Junzhang; Yan, Cheng; Lin, Yunyi; Wang, Tingyun
2018-01-01
A new approach for the refractive index (RI) measurement of solutions is proposed based on digital holographic microscopy. The experimental system consists of a modified Mach-Zehnder interferometer and related lab-developed analysis software. The tested solution is first sealed in a capillary tube, which is then immersed in an index-matching fluid; a high-quality digital hologram of the solution is acquired by the real-time analysis software. An angular spectrum algorithm is adopted to extract the phase distribution from the hologram recorded by a CCD. Based on a capillary multi-layer calculation model, the RI of the tested solution is obtained at high accuracy. The results for a transparent glycerol solution measured by the proposed method are more accurate than those measured by an Abbe refractometer. We also measure the RI of a translucent magnetic fluid, which is not suitable for measurement with an Abbe refractometer. The relationship between the RI and the concentration of the magnetic fluid is experimentally studied, and the results show that the RI is linearly related to the concentration of dilute magnetic fluid.
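The angular spectrum step named above is a standard Fourier-optics method and can be sketched in a few lines of NumPy: transform the recorded field, multiply by the free-space transfer function for a propagation distance z, and transform back to read out the phase. The wavelength, pixel pitch, and distance below are assumptions, and this is not the authors' lab software.

```python
import numpy as np

wavelength, dx, z = 632.8e-9, 3.45e-6, 0.05   # m: assumed laser, CCD pitch, distance
N = 512
field = np.exp(1j * np.random.rand(N, N))     # stand-in for the recorded hologram field

fx = np.fft.fftfreq(N, d=dx)                  # spatial frequencies, cycles/m
FX, FY = np.meshgrid(fx, fx)
arg = 1.0 - (wavelength * FX)**2 - (wavelength * FY)**2
kz = 2 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))

H = np.where(arg > 0, np.exp(1j * kz * z), 0.0)   # drop evanescent components
reconstructed = np.fft.ifft2(np.fft.fft2(field) * H)
phase = np.angle(reconstructed)               # wrapped phase distribution
```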
EML Gamma Spectrometry Data Evaluation Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Decker, Karin M.
1998-02-28
This report presents the results of the analyses for the second EML Gamma Spectrometry Data Evaluation Program (August 1997). A calibration spectrum, a background spectrum and three sample spectra were included for each software format as part of the evaluation. The calibration spectrum contained nuclides covering the range from 59.5 keV to 1836 keV. The participants were told that fallout and fission product nuclides as well as naturally occurring nuclides could be present. The samples were designed to test the detection and quantification of very low levels of nuclides and the ability of the software and user to properly resolve multiplets. The participants were asked to report values and uncertainties as Becquerel per sample with no decay correction. Twenty-nine sets of results were reported from a total of 70 laboratories who received the spectra. The percentage of the results within 1σ of the expected value was 76, 67, and 55 for samples 1, 2, and 3, respectively. Across all three samples, 12% of the results were more than 3σ from the expected value. Sixty-two nuclides out of a total of 580 expected results were not reported for the three samples. Sixty percent of these false negatives were due to nuclides which were present at the minimum detectable activity level. There were 53 false positives reported, with 60% of the responses due to problems with background subtraction. The results indicate that the Program is beneficial to the participating laboratories in that it provides them with analysis problems that are difficult to create with spiked samples due to the unavailability of many nuclides and the short half-lives of others. EML will continue its annual distribution; the third is to be held in March 1999.
ERIC Educational Resources Information Center
Fox, Mary Murphy
2012-01-01
The current study investigated Theory of Mind in young adults with autism. The young adults with autism spectrum disorder (ASD) consisted of four students between the ages of 18 and 19 from an on-campus program for students with autism located at Marywood University in Northeastern Pennsylvania. It was hypothesized that "Mind Reading",…
Digital Simulation of Thunder from Three-Dimensional Lightning
NASA Astrophysics Data System (ADS)
Dunkin, James; Fleisch, Daniel
2010-04-01
The physics of lightning and its resultant thunder have been investigated by many people, but we still don't have a full understanding of the governing processes. In this study, we have constructed a three-dimensional model of lightning using MATLAB software, and used N-waves as postulated by Ribner and Roy to synthesize the resultant thunder signature. In addition, we have taken an FFT of the thunder signature, and compared the time-domain waveform and frequency spectrum to recordings of thunder taken over the summer of 2009. This analysis is done with the goal of further understanding the processes of thunder production.
In the soft-to-hard technical spectrum: Where is software engineering?
NASA Technical Reports Server (NTRS)
Leibfried, Theodore F.; Macdonald, Robert B.
1992-01-01
In the computer journals and tabloids, there has been a plethora of articles written about the software engineering field. But while many authors advocate the need for an engineering approach to software development, it is striking how many have treated the subject of software engineering without adequately addressing the fundamentals of what engineering as a discipline consists of. A discussion is presented of the various related facets of this issue in a logical framework to advance the thesis that the software development process is necessarily an engineering process. The purpose is to examine in more detail the issue of whether or not the design and development of software for digital computer processing systems should be both viewed and treated as a legitimate field of professional engineering. Also, the type of academic and professional-level education programs that would be required to support a software engineering discipline is examined.
Clustering analysis of line indices for LAMOST spectra with AstroStat
NASA Astrophysics Data System (ADS)
Chen, Shu-Xin; Sun, Wei-Min; Yan, Qi
2018-06-01
The application of data mining in astronomical surveys, such as the Large Sky Area Multi-Object Fiber Spectroscopic Telescope (LAMOST) survey, provides an effective approach to automatically analyze a large amount of complex survey data. Unsupervised clustering can help astronomers find associations and outliers in a big data set. In this paper, we employ the k-means method to perform clustering on the line indices of LAMOST spectra with the powerful software AstroStat. Implementing the line index approach for analyzing astronomical spectra is an effective way to extract spectral features from low resolution spectra, and the indices can represent the main spectral characteristics of stars. A total of 144,340 line indices for A-type stars are analyzed by calculating intra-class and inter-class distances between pairs of stars. For the intra-class distance, we use the Mahalanobis distance to explore the degree of clustering of each class, while for outlier detection we define a local outlier factor for each spectrum. AstroStat furnishes a set of visualization tools for illustrating the analysis results. Checking the spectra detected as outliers, we find that most of them are problematic data and only a few correspond to rare astronomical objects. We show two examples of these outliers, a spectrum with an abnormal continuum and a spectrum with emission lines. Our work demonstrates that line index clustering is a good method for examining data quality and identifying rare objects.
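The pipeline described, k-means on line-index vectors, Mahalanobis intra-class distances, and a local outlier factor, can be sketched with scikit-learn and SciPy in place of AstroStat; the data below are random stand-ins, and the cluster count and neighbor settings are assumptions.

```python
import numpy as np
from scipy.spatial.distance import mahalanobis
from sklearn.cluster import KMeans
from sklearn.neighbors import LocalOutlierFactor

X = np.random.default_rng(1).normal(size=(2000, 8))   # stand-in line-index vectors

labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X)

members = X[labels == 0]                  # one cluster's members
VI = np.linalg.inv(np.cov(members, rowvar=False))
center = members.mean(axis=0)
intra = [mahalanobis(x, center, VI) for x in members]  # intra-class distances

lof = LocalOutlierFactor(n_neighbors=20).fit_predict(X)  # -1 marks outliers
print(f"mean intra distance {np.mean(intra):.2f}, outliers {np.sum(lof == -1)}")
```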
Chang, Cheng; Xu, Kaikun; Guo, Chaoping; Wang, Jinxia; Yan, Qi; Zhang, Jian; He, Fuchu; Zhu, Yunping
2018-05-22
Compared with the numerous software tools developed for identification and quantification of -omics data, there remains a lack of suitable tools for both downstream analysis and data visualization. To help researchers better understand the biological meanings in their -omics data, we present an easy-to-use tool, named PANDA-view, for both statistical analysis and visualization of quantitative proteomics data and other -omics data. PANDA-view contains various kinds of analysis methods such as normalization, missing value imputation, statistical tests, clustering and principal component analysis, as well as the most commonly-used data visualization methods including an interactive volcano plot. Additionally, it provides user-friendly interfaces for protein-peptide-spectrum representation of the quantitative proteomics data. PANDA-view is freely available at https://sourceforge.net/projects/panda-view/. 1987ccpacer@163.com and zhuyunping@gmail.com. Supplementary data are available at Bioinformatics online.
Quantum Entanglement Molecular Absorption Spectrum Simulator
NASA Technical Reports Server (NTRS)
Nguyen, Quang-Viet; Kojima, Jun
2006-01-01
Quantum Entanglement Molecular Absorption Spectrum Simulator (QE-MASS) is a computer program for simulating two-photon molecular-absorption spectroscopy using quantum-entangled photons. More specifically, QE-MASS simulates the molecular absorption of two quantum-entangled photons generated by the spontaneous parametric down-conversion (SPDC) of a fixed-frequency photon from a laser. The two-photon absorption process is modeled via a combination of rovibrational and electronic single-photon transitions, using a wave-function formalism. A two-photon absorption cross section as a function of the entanglement delay time between the two photons is computed, then subjected to a fast Fourier transform to produce an energy spectrum. The program then detects peaks in the Fourier spectrum and displays the energy levels of very short-lived intermediate quantum states (or virtual states) of the molecule. Such virtual states were previously accessible only with ultra-fast (femtosecond) laser systems. However, with the use of a single-frequency continuous-wave laser to produce SPDC photons and the QE-MASS program, these short-lived molecular states can now be studied using much simpler laser systems. QE-MASS can also show the dependence of the Fourier spectrum on the tuning range of the entanglement time of any externally introduced optical-path delay. QE-MASS can be extended to any molecule for which an appropriate spectroscopic database is available. It is a means of performing an a priori parametric analysis of entangled-photon spectroscopy for the development and implementation of emerging quantum-spectroscopic sensing techniques. QE-MASS is currently implemented using the Mathcad software package.
A simple computer-based measurement and analysis system of pulmonary auscultation sounds.
Polat, Hüseyin; Güler, Inan
2004-12-01
Listening to various lung sounds has proven to be an important diagnostic tool for detecting and monitoring certain types of lung diseases. In this study, a computer-based system has been designed for easy measurement and analysis of lung sounds using the software package DasyLAB. The designed system presents the following features: it is able to digitally record the lung sounds, which are captured with an electronic stethoscope plugged into a sound card on a portable computer; display the lung sound waveform for auscultation sites; record the lung sound in ASCII format; acoustically reproduce the lung sound; edit and print the sound waveforms; display its time-expanded waveform; compute the Fast Fourier Transform (FFT); and display the power spectrum and spectrogram.
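The two frequency-domain displays named at the end, power spectrum and spectrogram, are straightforward with SciPy; the sketch below uses a synthetic tone plus noise in place of stethoscope data (the original system was built in DasyLAB, not Python, and the sampling rate is assumed).

```python
import numpy as np
from scipy import signal

fs = 8000                                  # assumed sound-card sampling rate, Hz
t = np.arange(0, 5, 1 / fs)
lung = np.sin(2 * np.pi * 150 * t) + 0.3 * np.random.randn(t.size)  # stand-in sound

freqs, psd = signal.welch(lung, fs=fs, nperseg=1024)        # power spectrum
f_spec, t_spec, Sxx = signal.spectrogram(lung, fs=fs, nperseg=512)  # spectrogram
print(f"spectral peak at {freqs[np.argmax(psd)]:.0f} Hz")
```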
A Pipeline Software Architecture for NMR Spectrum Data Translation
Ellis, Heidi J.C.; Weatherby, Gerard; Nowling, Ronald J.; Vyas, Jay; Fenwick, Matthew; Gryk, Michael R.
2012-01-01
The problem of formatting data so that it conforms to the required input for scientific data processing tools pervades scientific computing. The CONNecticut Joint University Research Group (CONNJUR) has developed a data translation tool based on a pipeline architecture that partially solves this problem. The CONNJUR Spectrum Translator supports data format translation for experiments that use Nuclear Magnetic Resonance to determine the structure of large protein molecules. PMID:24634607
Digital processing of RF signals from optical frequency combs
NASA Astrophysics Data System (ADS)
Cizek, Martin; Smid, Radek; Buchta, Zdeněk; Mikel, Břetislav; Lazar, Josef; Cip, Ondřej
2013-01-01
The presented work is focused on digital processing of beat note signals from a femtosecond optical frequency comb. The levels of mixing products of single spectral components of the comb with CW laser sources are usually very low compared to the products of mixing all the comb components together. RF counters are more likely to measure the frequency of the strongest spectral component rather than a weak beat note. The proposed experimental digital signal processing system solves this problem by analyzing the whole spectrum of the output RF signal and using software defined radio (SDR) algorithms. Our efforts concentrate on two main areas: Firstly, using digital servo-loop techniques for locking free-running continuous laser sources to single components of the fs comb spectrum. Secondly, we are experimenting with digital signal processing of the RF beat note spectrum produced by the f-2f technique used for assessing the offset and repetition frequencies of the comb, resulting in digital servo-loop stabilization of the fs comb. Software capable of computing and analyzing the beat-note RF spectra using FFT and peak detection was developed. An SDR algorithm performing phase demodulation on the f-2f signal is used as a regulation error signal source for a digital phase-locked loop stabilizing the offset frequency of the fs comb.
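The "FFT and peak detection" step described above, finding a weak beat note in a wideband RF spectrum, can be illustrated as follows; the beat frequency, amplitudes, and threshold are invented, and this is not the authors' SDR code.

```python
import numpy as np
from scipy.signal import find_peaks

fs = 1e6                                   # assumed RF sampling rate, Hz
t = np.arange(0, 0.1, 1 / fs)
rf = 0.05 * np.sin(2 * np.pi * 210e3 * t) + np.random.randn(t.size)  # weak beat + noise

spec = np.abs(np.fft.rfft(rf * np.hanning(t.size)))   # windowed magnitude spectrum
freqs = np.fft.rfftfreq(t.size, 1 / fs)

peaks, _ = find_peaks(spec, height=5 * np.median(spec))  # threshold above noise floor
print([f"{freqs[p] / 1e3:.1f} kHz" for p in peaks])      # expect ~210.0 kHz
```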
Digital processing of signals from femtosecond combs
NASA Astrophysics Data System (ADS)
Čížek, Martin; Šmíd, Radek; Buchta, Zdeněk; Mikel, Břetislav; Lazar, Josef; Číp, Ondrej
2012-01-01
The presented work is focused on digital processing of beat note signals from a femtosecond optical frequency comb. The levels of mixing products of single spectral components of the comb with CW laser sources are usually very low compared to the products of mixing all the comb components together. RF counters are more likely to measure the frequency of the strongest spectral component rather than a weak beat note. The proposed experimental digital signal processing system solves this problem by analyzing the whole spectrum of the output RF signal and using software defined radio (SDR) algorithms. Our efforts concentrate on two main areas: Firstly, we are experimenting with digital signal processing of the RF beat note spectrum produced by the f-2f technique and with fully digital servo-loop stabilization of the fs comb. Secondly, we are using digital servo-loop techniques for locking free-running continuous laser sources to single components of the fs comb spectrum. Software capable of computing and analyzing the beat-note RF spectra using FFT and peak detection was developed. An SDR algorithm performing phase demodulation on the f-2f signal is used as a regulation error signal source for a digital phase-locked loop stabilizing the offset and repetition frequencies of the fs comb.
Peak Doctor v 1.0.0 Labview Version
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garner, Scott
2014-05-29
PeakDoctor software works interactively with its user to analyze raw gamma-ray spectroscopic data. The goal of the software is to produce a list of energies and areas of all of the peaks in the spectrum, as accurately as possible. It starts by performing an energy calibration, creating a function that describes how energy can be related to channel number. Next, the software determines which channels in the raw histogram are in the Compton continuum and which channels are parts of a peak. Then the software fits the Compton continuum with cubic polynomials. The last step is to fit all of the peaks with Gaussian functions, thus producing the list.
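The final fitting step can be sketched with SciPy: a Gaussian plus a local continuum fitted to synthetic counts by least squares. This illustrates the described operation but is not the LabVIEW code; the peak parameters are invented, and the local continuum here is linear rather than cubic for brevity.

```python
import numpy as np
from scipy.optimize import curve_fit

def peak_model(ch, area, mu, sigma, a0, a1):
    g = area / (sigma * np.sqrt(2 * np.pi)) * np.exp(-0.5 * ((ch - mu) / sigma)**2)
    return g + a0 + a1 * ch                 # Gaussian peak + linear local continuum

ch = np.arange(400.0, 460.0)                # channels around one peak (assumed)
counts = np.random.poisson(peak_model(ch, 5000, 430, 2.5, 50.0, -0.02)).astype(float)

p0 = [counts.sum(), ch[np.argmax(counts)], 2.0, counts.min(), 0.0]  # initial guess
popt, pcov = curve_fit(peak_model, ch, counts, p0=p0)
print(f"area {popt[0]:.0f} +/- {np.sqrt(pcov[0, 0]):.0f} at channel {popt[1]:.1f}")
```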
The Software Maintenance Spectrum: Using More than Just New Toys
2000-04-01
Integral Observations of the Reflection Component of Seyfert Galaxies
NASA Technical Reports Server (NTRS)
Fabian, Andrew
2005-01-01
The data were analyzed by Dr. Fabian's student Adrian Turner and included in his thesis (completed September 2004). We did not detect MCG-6 using the then-current software, and the spectrum of the Circinus galaxy turned out to be even worse than the published BeppoSAX spectrum. We decided not to do any more work on it. We were contacted about the data in March by Thierry Courvoisier (the data were then public), as he had a student, Simona Soidi, working on a compilation of spectra. Dr. Fabian sent them the chapter from Adrian's thesis and we provided some general comments on what they were doing on 6 objects. This has since been accepted for publication with Fabian as a co-author. A paper on the Integral AGN catalogue appeared on astro-ph a few days ago which contains a detection of MCG-6 with a very poor spectrum. We didn't detect it because the software back then required a source to be detected within something like a 30-minute exposure in order to work. Integral is NOT very sensitive.
Equivalent source modeling of the main field using MAGSAT data
NASA Technical Reports Server (NTRS)
1981-01-01
Modeling and software development for the main field using MAGSAT data are discussed. The cause of the apparent bulge in the power spectrum of Dipole model no. 4 was investigated by simulation with the POGO crustal anomaly field model. Results for cases with and without noise, and the spectra of selected results, are given. It is indicated that the beginning of the bump in the spectrum of Dipole no. 4 is due to crustal influence, while the departure of the spectrum from that of MGST (12/80-2) around expansion order 17 is due to the resolution limits of the dipole density.
An automatic detection software for differential reflection spectroscopy
NASA Astrophysics Data System (ADS)
Yuksel, Seniha Esen; Dubroca, Thierry; Hummel, Rolf E.; Gader, Paul D.
2012-06-01
Recent terrorist attacks have spurred the need for a large-scale explosive detector. Our group has developed differential reflection spectroscopy, which can detect explosive residue on surfaces such as parcels, cargo and luggage. In short, broad-band ultraviolet and visible light is shone onto a material (such as a parcel) moving on a conveyor belt. Upon reflection off the surface, the light intensity is recorded with a spectrograph (spectrometer in combination with a CCD camera). This reflected light intensity is then subtracted from and normalized by the next data point collected, resulting in differential reflection spectra in the 200-500 nm range. Explosives show spectral fingerprints at specific wavelengths; for example, the spectrum of 2,4,6-trinitrotoluene (TNT) shows an absorption edge at 420 nm. Additionally, we have developed automated software which detects the characteristic features of explosives. One of the biggest challenges for the algorithm is to reach a practical limit of detection. In this study, we introduce our automatic detection software, which is a combination of principal component analysis and support vector machines. Finally, we present the sensitivity and selectivity response of our algorithm as a function of the amount of explosive detected on a given surface.
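The classifier combination named above, PCA feeding a support vector machine, can be sketched with scikit-learn; the spectra, labels, component count, and the crude stand-in for an absorption edge are all assumptions rather than the authors' trained system.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 600))            # 300 stand-in differential reflection spectra
y = rng.integers(0, 2, 300)                # 1 = explosive residue present (assumed)
X[y == 1, 440:460] -= 0.5                  # crude stand-in for a spectral edge feature

clf = make_pipeline(StandardScaler(), PCA(n_components=20), SVC(kernel="rbf"))
clf.fit(X[:200], y[:200])                  # train on the first 200 spectra
print("held-out accuracy:", clf.score(X[200:], y[200:]))
```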
The igmspec database of public spectra probing the intergalactic medium
NASA Astrophysics Data System (ADS)
Prochaska, J. X.
2017-04-01
We describe v02 of igmspec, a database of publicly available ultraviolet, optical, and near-infrared spectra that probe the intergalactic medium (IGM). This database, a child of the specdb repository in the specdb github organization, comprises 403 277 unique sources and 434 686 spectra obtained with the world's greatest observatories. All of these data are distributed in a single ≈ 25GB HDF5 file maintained at the University of California Observatories and the University of California, Santa Cruz. The specdb software package includes Python scripts and modules for searching the source catalog and spectral datasets, and software links to the linetools package for spectral analysis. The repository also includes software to generate private spectral datasets that are compliant with International Virtual Observatory Alliance (IVOA) protocols and a Python-based interface for IVOA Simple Spectral Access queries. Future versions of igmspec will ingest other sources (e.g. gamma-ray burst afterglows) and other surveys as they become publicly available. The overall goal is to include every spectrum that effectively probes the IGM. Future databases of specdb may include publicly available galaxy spectra (exgalspec) and published supernovae spectra (snspec). The community is encouraged to join the effort on github: https://github.com/specdb.
Mohler, Rachel E; Dombek, Kenneth M; Hoggard, Jamin C; Pierce, Karisa M; Young, Elton T; Synovec, Robert E
2007-08-01
The first extensive study of yeast metabolite GC x GC-TOFMS data from cells grown under fermenting (R) and respiring (DR) conditions is reported. In this study, recently developed chemometric software for use with three-dimensional instrumentation data was implemented, using a statistically based Fisher ratio method. The Fisher ratio method is fully automated and rapidly reduces the data to pinpoint two-dimensional chromatographic peaks differentiating sample types while utilizing all the mass channels. The effect of lowering the Fisher ratio threshold on peak identification was studied. At the lowest threshold (just above the noise level), 73 metabolite peaks were identified, nearly three-fold more than the number of previously reported metabolite peaks (26). In addition to the 73 identified metabolites, 81 unknown metabolites were also located. A Parallel Factor Analysis graphical user interface (PARAFAC GUI) was applied to selected mass channels to obtain a concentration ratio for each metabolite under the two growth conditions. Of the 73 known metabolites identified by the Fisher ratio method, 54 were statistically changing at the 95% confidence limit between the DR and R conditions according to the rigorous Student's t-test. PARAFAC determined the concentration ratio and provided a fully deconvoluted (i.e. mathematically resolved) mass spectrum for each of the metabolites. The combination of the Fisher ratio method with the PARAFAC GUI provides high-throughput software for discovery-based metabolomics research, and is novel for GC x GC-TOFMS data due to the use of the entire data set in the analysis (640 MB x 70 runs, double precision floating point).
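For a two-class comparison like R versus DR, the per-feature Fisher ratio reduces to the squared class-mean separation over the pooled within-class variance; the sketch below plants 50 "changing" features in random data and ranks features by this statistic. The array sizes are assumptions, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(2)
R = rng.normal(size=(35, 5000))            # fermenting runs x aligned features
DR = rng.normal(size=(35, 5000))           # respiring runs
DR[:, :50] += 2.0                          # 50 planted "changing metabolites"

num = (R.mean(axis=0) - DR.mean(axis=0))**2           # between-class separation
den = R.var(axis=0, ddof=1) + DR.var(axis=0, ddof=1)  # within-class variance
fisher = num / den

top = np.argsort(fisher)[::-1][:50]        # highest-ranked features
print(f"{np.sum(top < 50)} of the 50 planted features recovered")
```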
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parsons, Taylor; Guo, Yi; Veers, Paul
Software models that use design-level input variables and physics-based engineering analysis for estimating the mass and geometrical properties of components in large-scale machinery can be very useful for analyzing design trade-offs in complex systems. This study uses DriveSE, an OpenMDAO-based drivetrain model that uses stress and deflection criteria to size drivetrain components within a geared, upwind wind turbine. Because a full lifetime fatigue load spectrum can only be defined using computationally expensive simulations in programs such as FAST, a parameterized fatigue loads spectrum that depends on wind conditions, rotor diameter, and turbine design life has been implemented. The parameterized fatigue spectrum is only used in this paper to demonstrate the proposed fatigue analysis approach. This paper details a three-part investigation of the parameterized approach and a comparison of the DriveSE model with and without fatigue analysis on the main shaft system. It compares loads from three turbines of varying size and determines if and when fatigue governs drivetrain sizing compared to extreme load-driven design. It also investigates the model's sensitivity to shaft material parameters. The intent of this paper is to demonstrate how fatigue considerations, in addition to extreme loads, can be brought into a system engineering optimization.
Automating the deconfliction of jamming and spectrum management
NASA Astrophysics Data System (ADS)
Segner, Samuel M.
1988-12-01
Powerful airborne and ground-based jammers are being fielded by all services and nations as part of their intelligence/electronic warfare (I/EW) combat capability. For their survivability, these I/EW systems operate far from the FLOT; this creates rather large denial areas for friendly forces when they jam. Manual coordination between I/EW managers and spectrum managers is not practical for taking on targets of opportunity or tracking the intended enemy victims when these victims counter by frequency maneuvers. Two possible architectures, one centralized, the other decentralized, are explored, as is the applicability of the electromagnetic compatibility (EMC) software developed for the U.S. Army Automatic Tactical Frequency Engineering System (ATFES) pilot program. The proposed approach is to apply the principles of the Joint Commanders EW Staff (JCEWS). The initial simplified software to demonstrate the computer-aided coordination at VHF is explained.
NASA Astrophysics Data System (ADS)
Esen, Ayse Nur; Haciyakupoglu, Sevilay
2016-02-01
The purpose of this study is to test the applicability of the k0-INAA method at the Istanbul Technical University TRIGA Mark II research reactor. The neutron spectrum parameters, such as the epithermal neutron flux distribution parameter (α), the thermal to epithermal neutron flux ratio (f) and the thermal neutron flux (φth), were determined at the central irradiation channel of the ITU TRIGA Mark II research reactor using the bare triple-monitor method. HPGe detector calibrations and calculations were carried out with the k0-IAEA software. The α, f and φth values were calculated to be -0.009, 15.4 and 7.92·10¹² cm⁻² s⁻¹, respectively. NIST SRM 1633b coal fly ash and intercomparison samples consisting of clay and sandy soil samples were used to evaluate the validity of the method. For selected elements, the statistical evaluation of the analysis results was carried out by the z-score test. A good agreement between certified/reported and experimental values was obtained.
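The z-score test mentioned above compares each result with its certified value, scaled by the combined uncertainty; a one-line version follows, with invented placeholder numbers rather than the paper's data (|z| <= 2 is a common acceptance criterion).

```python
import numpy as np

x_exp, u_exp = 52.3, 1.8        # measured value and its uncertainty (placeholders)
x_cert, u_cert = 50.0, 1.2      # certified value and its uncertainty (placeholders)

z = (x_exp - x_cert) / np.hypot(u_exp, u_cert)   # combined standard uncertainty
print(f"z = {z:.2f} ({'acceptable' if abs(z) <= 2 else 'check result'})")
```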
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hollister, R
2009-08-26
Method - CES SOP-HW-P556 'Field and Bulk Gamma Analysis'. Detector - High-purity germanium, 40% relative efficiency. Calibration - The detector was calibrated on February 8, 2006 using a NIST-traceable sealed source, and the calibration was verified using an independent sealed source. Count Time and Geometry - The sample was counted for 20 minutes at 72 inches from the detector. A lead collimator was used to limit the field-of-view to the region of the sample. The drum was rotated 180 degrees halfway through the count time. Date and Location of Scans - June 1, 2006 in Building 235 Room 1136. Spectral Analysis - Spectra were analyzed with ORTEC GammaVision software. Matrix and geometry corrections were calculated using ORTEC Isotopic software. A background spectrum was measured at the counting location. No man-made radioactivity was observed in the background. Results were determined from the sample spectra without background subtraction. Minimum detectable activities were calculated by the NUREG 4.16 method. Results - Detected Pu-238, Pu-239, Am-241 and Am-243.
A Web-Based Development Environment for Collaborative Data Analysis
NASA Astrophysics Data System (ADS)
Erdmann, M.; Fischer, R.; Glaser, C.; Klingebiel, D.; Komm, M.; Müller, G.; Rieger, M.; Steggemann, J.; Urban, M.; Winchen, T.
2014-06-01
Visual Physics Analysis (VISPA) is a web-based development environment addressing high energy and astroparticle physics. It covers the entire analysis spectrum from the design and validation phase to the execution of analyses and the visualization of results. VISPA provides a graphical steering of the analysis flow, which consists of self-written, re-usable Python and C++ modules for more demanding tasks. All common operating systems are supported since a standard internet browser is the only software requirement for users. Even access via mobile and touch-compatible devices is possible. In this contribution, we present the most recent developments of our web application concerning technical, state-of-the-art approaches as well as practical experiences. One of the key features is the use of workspaces, i.e. user-configurable connections to remote machines supplying resources and local file access. Thereby, workspaces enable the management of data, computing resources (e.g. remote clusters or computing grids), and additional software either centralized or individually. We further report on the results of an application with more than 100 third-year students using VISPA for their regular particle physics exercises during the winter term 2012/13. Besides the ambition to support and simplify the development cycle of physics analyses, new use cases such as fast, location-independent status queries, the validation of results, and the ability to share analyses within worldwide collaborations with a single click become conceivable.
Mantini, Dante; Petrucci, Francesca; Pieragostino, Damiana; Del Boccio, Piero; Sacchetta, Paolo; Candiano, Giovanni; Ghiggeri, Gian Marco; Lugaresi, Alessandra; Federici, Giorgio; Di Ilio, Carmine; Urbani, Andrea
2010-01-03
Mass spectrometry (MS) is becoming the gold standard for biomarker discovery. Several MS-based bioinformatics methods have been proposed for this application, but the divergence of the findings by different research groups on the same MS data suggests that the definition of a reliable method has not been achieved yet. In this work, we propose an integrated software platform, MASCAP, intended for comparative biomarker detection from MALDI-TOF MS data. MASCAP integrates denoising and feature extraction algorithms, which have already shown to provide consistent peaks across mass spectra; furthermore, it relies on statistical analysis and graphical tools to compare the results between groups. The effectiveness in mass spectrum processing is demonstrated using MALDI-TOF data, as well as SELDI-TOF data. The usefulness in detecting potential protein biomarkers is shown comparing MALDI-TOF mass spectra collected from serum and plasma samples belonging to the same clinical population. The analysis approach implemented in MASCAP may simplify biomarker detection, by assisting the recognition of proteomic expression signatures of the disease. A MATLAB implementation of the software and the data used for its validation are available at http://www.unich.it/proteomica/bioinf. (c) 2009 Elsevier B.V. All rights reserved.
Analysis of acoustic emission during abrasive waterjet machining of sheet metals
NASA Astrophysics Data System (ADS)
Mokhtar, Nazrin; Gebremariam, MA; Zohari, H.; Azhari, Azmir
2018-04-01
The present paper reports on the analysis of acoustic emission (AE) produced during the abrasive waterjet (AWJ) machining process, focusing on the relationship between AE and the surface quality of sheet metals. The changes in acoustic emission signals, recorded by means of the power spectral density (PSD) estimated via the covariance method, in relation to the surface quality of the cut are discussed. Tests were made on two materials for comparison, namely aluminium 6061 and stainless steel 304, at five different feed rates. The acoustic emission data were captured with LabVIEW and later processed using MATLAB software. The results show that the AE spectra correlate with the different feed rates and surface qualities. It can be concluded that AE is capable of monitoring changes in feed rate and surface quality.
Analysis of the packet formation process in packet-switched networks
NASA Astrophysics Data System (ADS)
Meditch, J. S.
Two new queueing system models for the packet formation process in packet-switched telecommunication networks are developed, and their applications in process stability, performance analysis, and optimization studies are illustrated. The first, an M/M/1 queueing system characterization of the process, is a highly aggregated model which is useful for preliminary studies. The second, a marked extension of an earlier M/G/1 model, permits one to investigate the stability, performance characteristics, and design of the packet formation process in terms of the details of processor architecture and hardware and software implementations, with processor structure and as many parameters as desired treated as variables. The two new models, together with the earlier M/G/1 characterization, span the spectrum of modeling complexity for the packet formation process from basic to advanced.
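The standard M/M/1 results behind the aggregated model are compact enough to state in code; the arrival and service rates below are illustrative, not taken from the paper.

```python
def mm1(lam, mu):
    """Utilization, mean packets in system, and mean sojourn time for M/M/1."""
    assert lam < mu, "stable only if arrival rate < service rate"
    rho = lam / mu                 # utilization
    L = rho / (1 - rho)            # mean number in system
    W = 1 / (mu - lam)             # mean time in system (queueing + service)
    return rho, L, W

rho, L, W = mm1(lam=800.0, mu=1000.0)      # packets/s (assumed rates)
print(f"utilization {rho:.0%}, {L:.1f} packets in system, {W * 1e3:.1f} ms delay")
```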
Structural analysis of the industrial grade calcite
NASA Astrophysics Data System (ADS)
Shah, Rajiv P.; Raval, Kamlesh G.
2017-05-01
The chemical, optical and structural characterization of industrial grade calcite was carried out by EDAX, FT-IR and XRD. EDAX is a widely used technique for analyzing the chemical components of a material. FT-IR stands for Fourier Transform Infra-Red, the preferred method of infrared spectroscopy; the resultant spectrum represents the molecular absorption and transmission, creating a molecular fingerprint of the sample. The atomic planes of a crystal cause an incident beam of X-rays to interfere with one another as they leave the crystal, a phenomenon called X-ray diffraction (XRD). The EDAX, FT-IR and XRD data were analyzed with the help of various instruments and software to characterize these industrial grade materials, which are mostly used in the ceramics industry.
Terrain-analysis procedures for modeling radar backscatter
Schaber, Gerald G.; Pike, Richard J.; Berlin, Graydon Lennis
1978-01-01
The collection and analysis of detailed information on the surface of natural terrain are important aspects of radar-backscatter modeling. Radar is especially sensitive to surface-relief changes on the millimeter-to-decimeter scale for conventional K-band (~1-cm wavelength) to L-band (~25-cm wavelength) radar systems. Surface roughness statistics that characterize these changes in detail have been generated by a comprehensive set of seven programmed calculations for radar-backscatter modeling from sets of field measurements. The seven programs are: 1) formatting of data into readable form for the subsequent topographic analysis programs; 2) relief analysis; 3) power spectral analysis; 4) power spectrum plots; 5) slope angle between slope reversals; 6) slope angle against slope interval plots; and 7) base-length slope angle and curvature. This complete Fortran IV software package, 'Terrain Analysis', is here presented for the first time. It was originally developed a decade ago for investigations of lunar morphology and surface trafficability for the Apollo Lunar Roving Vehicle.
1992-01-01
perturbations and nonstationary interference effects so as to reduce decoding errors for spread spectrum communications... Potential applications - utilization of spread spectrum techniques by DoD and others is increasing because of robustness to interference and fading... Mirror Devices (DMD) illuminated by a low-power laser diode or LED will be considered as a source. Commercial optical software in conjunction with in...
PINS chemical identification software
Caffrey, Augustine J.; Krebs, Kennth M.
2004-09-14
An apparatus and method for identifying a chemical compound. A neutron source delivers neutrons into the chemical compound. The nuclei of chemical elements constituting the chemical compound emit gamma rays upon interaction with the neutrons. The gamma rays are characteristic of the chemical elements constituting the chemical compound. A spectrum of the gamma rays is generated having a detection count and an energy scale. The energy scale is calibrated by comparing peaks in the spectrum to energies of pre-selected chemical elements in the spectrum. A least-squares fit completes the calibration. The chemical elements constituting the chemical compound can be readily determined, which then allows for identification of the chemical compound.
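The least-squares calibration step reads directly as a linear fit of known line energies against located peak channels. The energies below are well-known gamma-ray lines (Co-60 decay and neutron reactions on H, C and Cl); the channel centroids are invented for illustration, and this is not the PINS code itself.

```python
import numpy as np

channels = np.array([ 590.0, 1115.0, 2222.0, 3058.0])   # located peak centroids (assumed)
energies = np.array([1173.2, 2223.2, 4438.9, 6110.8])   # keV: Co-60, H(n,g), C, Cl lines

coef = np.polyfit(channels, energies, deg=1)    # linear least-squares calibration
calibrate = np.poly1d(coef)
print(f"E(keV) = {coef[0]:.4f}*ch + {coef[1]:.1f}; ch 2000 -> {calibrate(2000):.1f} keV")
```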
Moulder, Robert; Filén, Jan-Jonas; Salmi, Jussi; Katajamaa, Mikko; Nevalainen, Olli S; Oresic, Matej; Aittokallio, Tero; Lahesmaa, Riitta; Nyman, Tuula A
2005-07-01
The options available for processing quantitative data from isotope-coded affinity tag (ICAT) experiments have mostly been confined to software specific to the instrument of acquisition. However, recent developments in data format conversion have subsequently increased such processing opportunities. In the present study, data sets from ICAT experiments, analysed with liquid chromatography/tandem mass spectrometry (MS/MS) using an Applied Biosystems QSTAR Pulsar quadrupole-TOF mass spectrometer, were processed in triplicate using separate mass spectrometry software packages. The programs Pro ICAT, Spectrum Mill and SEQUEST with XPRESS were employed. Attention was paid to the extent of common identification and the agreement of quantitative results, with additional interest in the flexibility and productivity of these programs. The comparisons were made with data from the analysis of a specifically prepared test mixture of nine proteins at a range of relative concentration ratios from 0.1 to 10 (light to heavy labelled forms), as a known control, and data selected from an ICAT study involving the measurement of cytokine-induced protein expression in human lymphoblasts, as an applied example. Dissimilarities were detected in peptide identification that reflected how the associated scoring parameters favoured information from the MS/MS data sets. Accordingly, there were differences in the numbers of peptide and protein identifications, although from these it was apparent that both confirmatory and complementary information was present. In the quantitative results from the three programs, no statistically significant differences were observed.
NASA Technical Reports Server (NTRS)
Stocks, Dana R.
1986-01-01
The Dynamic Gas Temperature Measurement System compensation software accepts digitized data from two different diameter thermocouples and computes a compensated frequency response spectrum for one of the thermocouples. Detailed discussions of the physical system, analytical model, and computer software are presented in this volume and in Volume 1 of this report under Task 3. Computer program software restrictions and test cases are also presented. Compensated and uncompensated data may be presented in either the time or frequency domain. Time domain data are presented as instantaneous temperature vs time. Frequency domain data may be presented in several forms such as power spectral density vs frequency.
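A minimal sketch of the frequency-domain presentation described above, assuming a synthetic digitized temperature record in place of real thermocouple data; the sample rate and signal content are placeholders.

```python
import numpy as np
from scipy.signal import welch

# Synthetic stand-in for digitized thermocouple data: instantaneous
# temperature vs. time (the time-domain presentation in the abstract).
fs = 1000.0                        # sample rate, Hz (assumed)
t = np.arange(0, 10, 1 / fs)
temperature = 800 + 5 * np.sin(2 * np.pi * 50 * t) + np.random.randn(t.size)

# Frequency-domain presentation: power spectral density vs. frequency.
f, psd = welch(temperature, fs=fs, nperseg=2048)
print(f"peak PSD at {f[np.argmax(psd[1:]) + 1]:.1f} Hz")
```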
NASA Technical Reports Server (NTRS)
Lee, Alice T.; Gunn, Todd; Pham, Tuan; Ricaldi, Ron
1994-01-01
This handbook documents the three software analysis processes the Space Station Software Analysis team uses to assess space station software, including their backgrounds, theories, tools, and analysis procedures. Potential applications of these analysis results are also presented. The first section describes how software complexity analysis provides quantitative information on code, such as code structure and risk areas, throughout the software life cycle. Software complexity analysis allows an analyst to understand the software structure, identify critical software components, assess risk areas within a software system, identify testing deficiencies, and recommend program improvements. Performing this type of analysis during the early design phases of software development can positively affect the process, and may prevent later, much larger, difficulties. The second section describes how software reliability estimation and prediction analysis, or software reliability, provides a quantitative means to measure the probability of failure-free operation of a computer program, and describes the two tools used by JSC to determine failure rates and design tradeoffs between reliability, costs, performance, and schedule.
Automated three-dimensional quantification of myocardial perfusion and brain SPECT.
Slomka, P J; Radau, P; Hurwitz, G A; Dey, D
2001-01-01
To allow automated and objective reading of nuclear medicine tomography, we have developed a set of tools for clinical analysis of myocardial perfusion tomography (PERFIT) and Brain SPECT/PET (BRASS). We exploit algorithms for image registration and use three-dimensional (3D) "normal models" for individual patient comparisons to composite datasets on a "voxel-by-voxel basis" in order to automatically determine the statistically significant abnormalities. A multistage, 3D iterative inter-subject registration of patient images to normal templates is applied, including automated masking of the external activity before final fit. In separate projects, the software has been applied to the analysis of myocardial perfusion SPECT, as well as brain SPECT and PET data. Automatic reading was consistent with visual analysis; it can be applied to the whole spectrum of clinical images, and aid physicians in the daily interpretation of tomographic nuclear medicine images.
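A toy sketch of the voxel-by-voxel comparison idea, assuming the patient volume has already been registered to the normal template; the volumes and the z-score threshold below are synthetic placeholders, not PERFIT/BRASS internals.

```python
import numpy as np

# Hypothetical registered patient volume and composite normal model
# (voxel-wise mean and standard deviation over a normal database).
rng = np.random.default_rng(0)
patient = rng.normal(100, 10, size=(64, 64, 32))
normal_mean = np.full((64, 64, 32), 100.0)
normal_std = np.full((64, 64, 32), 10.0)

# Voxel-by-voxel comparison: z-score map against the normal model;
# voxels beyond a threshold are flagged as statistically abnormal.
z = (patient - normal_mean) / normal_std
abnormal = np.abs(z) > 3.0
print(f"{abnormal.sum()} voxels flagged abnormal")
```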
Sheng, Quanhu; Li, Rongxia; Dai, Jie; Li, Qingrun; Su, Zhiduan; Guo, Yan; Li, Chen; Shyr, Yu; Zeng, Rong
2015-01-01
Isobaric labeling techniques coupled with high-resolution mass spectrometry have been widely employed in proteomic workflows requiring relative quantification. For each high-resolution tandem mass spectrum (MS/MS), isobaric labeling techniques can be used not only to quantify the peptide from different samples by reporter ions, but also to identify the peptide it is derived from. Because the ions related to isobaric labeling may act as noise in database searching, the MS/MS spectrum should be preprocessed before peptide or protein identification. In this article, we demonstrate that there are many high-frequency, high-abundance isobaric-related ions in the MS/MS spectrum, and that removing isobaric-related ions, combined with deisotoping and deconvolution in MS/MS preprocessing procedures, significantly improves peptide/protein identification sensitivity. The user-friendly software package TurboRaw2MGF (v2.0) has been implemented for converting raw TIC data files to Mascot generic format files and can be downloaded for free from https://github.com/shengqh/RCPA.Tools/releases as part of the software suite ProteomicsTools. The data have been deposited to the ProteomeXchange with identifier PXD000994. PMID:25435543
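A minimal sketch of reporter-region filtering of the kind described (not TurboRaw2MGF's actual implementation); the reporter m/z values are TMT-6plex-style examples and the tolerance is an assumption.

```python
# Strip reporter-region ions from an MS/MS peak list before database
# searching. Reporter masses below are TMT-6plex-style examples.
REPORTER_MZ = [126.1277, 127.1248, 128.1344, 129.1315, 130.1411, 131.1382]
TOL = 0.01  # m/z tolerance (Da), assumed

def remove_isobaric_ions(peaks):
    """peaks: list of (mz, intensity); drop peaks near reporter masses."""
    return [(mz, inten) for mz, inten in peaks
            if all(abs(mz - r) > TOL for r in REPORTER_MZ)]

spectrum = [(126.128, 5e4), (230.17, 1.2e3), (415.21, 8.0e2)]
print(remove_isobaric_ions(spectrum))  # reporter peak at 126.128 removed
```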
Feasibility and demonstration of a cloud-based RIID analysis system
NASA Astrophysics Data System (ADS)
Wright, Michael C.; Hertz, Kristin L.; Johnson, William C.; Sword, Eric D.; Younkin, James R.; Sadler, Lorraine E.
2015-06-01
A significant limitation in the operational utility of handheld and backpack radioisotope identifiers (RIIDs) is the inability of their onboard algorithms to accurately and reliably identify the isotopic sources of the measured gamma-ray energy spectrum. A possible solution is to move the spectral analysis computations to an external device, the cloud, where significantly greater capabilities are available. The implementation and demonstration of a prototype cloud-based RIID analysis system have shown this type of system to be feasible with currently available communication and computational technology. A system study has shown that the potential user community could derive significant benefits from an appropriately implemented cloud-based analysis system and has identified the design and operational characteristics required by the users and stakeholders for such a system. A general description of the hardware and software necessary to implement reliable cloud-based analysis, the value of the cloud expressed by the user community, and the aspects of the cloud implemented in the demonstrations are discussed.
Thin-film thickness measurement method based on the reflection interference spectrum
NASA Astrophysics Data System (ADS)
Jiang, Li Na; Feng, Gao; Shu, Zhang
2012-09-01
A method is introduced to measure thin-film thickness, refractive index and other optical constants. When a beam of white light shines on the surface of the sample film, the light reflected from the upper and lower surfaces of the thin film interferes, so the reflectivity of the film fluctuates with wavelength. The reflection interference spectrum is analyzed with software against the database, from which the thickness and refractive index of the thin film are determined.
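As a worked example of the underlying relation, assume normal incidence and negligible dispersion: two adjacent reflectance maxima at wavelengths λ1 > λ2 satisfy 2nd = mλ1 = (m+1)λ2, giving d = λ1λ2/(2n(λ1 − λ2)). The values below are hypothetical.

```python
def film_thickness(lam1_nm, lam2_nm, n):
    """Thickness from two adjacent interference maxima (normal incidence):
    2*n*d = m*lam1 = (m+1)*lam2  =>  d = lam1*lam2 / (2*n*(lam1 - lam2))."""
    lam1, lam2 = max(lam1_nm, lam2_nm), min(lam1_nm, lam2_nm)
    return lam1 * lam2 / (2.0 * n * (lam1 - lam2))

# Hypothetical adjacent fringe maxima for a film with an assumed
# refractive index of 1.46 (fused-silica-like):
print(f"d = {film_thickness(650.0, 600.0, 1.46):.0f} nm")
```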
2012-08-01
EEG protocols, including hardware and software for neurofeedback training, were...rationale for using neurofeedback to effect changes in children on the autism spectrum is rooted in...functionally linked to the MNS network. Third, modifying these oscillation dynamics via neurofeedback
Performance of twist-coupled blades on variable speed rotors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lobitz, D.W.; Veers, P.S.; Laino, D.J.
1999-12-07
The load mitigation and energy capture characteristics of twist-coupled HAWT blades that are mounted on a variable speed rotor are investigated in this paper. These blades are designed to twist toward feather as they bend, with pretwist set to achieve a desirable twist distribution at rated power. For this investigation, the ADAMS-WT software has been modified to include blade models with bending-twist coupling. Using twist-coupled and uncoupled models, the ADAMS software is exercised for steady wind environments to generate Cp curves at a number of operating speeds to compare the efficiencies of the two models. The ADAMS software is also used to generate the response of a twist-coupled variable speed rotor to a spectrum of stochastic wind time series. This spectrum contains time series with two mean wind speeds at two turbulence levels. Power control is achieved by imposing a reactive torque on the low speed shaft proportional to the RPM squared, with the coefficient specified so that the rotor operates at peak efficiency in the linear aerodynamic range, and by limiting the maximum RPM to take advantage of the stall controlled nature of the rotor. Fatigue calculations are done for the generated load histories using a range of material exponents that represent materials from welded steel to aluminum to composites, and results are compared with the damage computed for the rotor without twist-coupling. Results indicate that significant reductions in damage are achieved across the spectrum of applied wind loading without any degradation in power production.
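As a rough sketch of such a material-exponent comparison (not the paper's actual fatigue code), assuming cycle ranges have already been extracted from the load histories by rainflow counting:

```python
import numpy as np

# Relative fatigue damage is proportional to sum(range**m) for S-N
# exponent m; all cycle ranges below are illustrative placeholders.
def relative_damage(cycle_ranges, m):
    return np.sum(np.asarray(cycle_ranges, dtype=float) ** m)

uncoupled = [12.0, 9.5, 15.2, 8.1]   # hypothetical load ranges
coupled = [10.1, 8.0, 12.9, 7.0]     # twist-coupled blade, same wind input

for m in (3, 6, 10):  # welded steel ~3, aluminum ~6, composites ~10
    ratio = relative_damage(coupled, m) / relative_damage(uncoupled, m)
    print(f"m={m:2d}: coupled/uncoupled damage = {ratio:.2f}")
```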
Cognitive software defined radar: waveform design for clutter and interference suppression
NASA Astrophysics Data System (ADS)
Kirk, Benjamin H.; Owen, Jonathan W.; Narayanan, Ram M.; Blunt, Shannon D.; Martone, Anthony F.; Sherbondy, Kelly D.
2017-05-01
Clutter and radio frequency interference (RFI) are prevalent issues in the field of radar and are of specific interest for cognitive radar. Here, methods for applying and testing the utility of cognitive radar for clutter and RFI mitigation are explored. Using the adaptable transmit capability, environmental database, and general "awareness" of a cognitive radar system (i.e. spectrum sensing, geographical location, etc.), a matched waveform is synthesized that improves the signal-to-clutter ratio (SCR), assuming at least an estimate of the target response and the environmental clutter response are known a priori. RFI may also be mitigated by sensing the RF spectrum and adapting the transmit center frequency and bandwidth using methods that optimize bandwidth and signal-to-interference plus noise ratio (SINR) (i.e. the spectrum sensing, multi-objective (SS-MO) algorithm). The improvement is shown by a decrease in the noise floor. The effectiveness of the above methods is examined via a test-bed developed around a software defined radio (SDR). Testing and the general use of commercial off the shelf (COTS) devices are desirable for their cost effectiveness, general ease of use, and technical and community support, but these devices present design challenges that must be overcome to be effective. The universal software radio peripheral (USRP) X310 SDR is a relatively cheap and portable device that has all the system components of a basic cognitive radar. Design challenges of the SDR include phase coherency between channels, bandwidth limitations, dynamic range, and speed of computation and data communication/recording.
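One common formulation of the matched-waveform synthesis mentioned above (a sketch under assumed estimates, not necessarily the algorithm used here): maximize the ratio s^H A s / s^H R_c s, where A is built from the target response and R_c is the clutter-plus-noise covariance; the maximizer is the principal generalized eigenvector.

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(1)
N = 32  # waveform length in samples (assumed)

# Hypothetical estimates: target impulse response t and clutter
# covariance R_c (the quantities assumed known a priori in the text).
t = rng.normal(size=N) + 1j * rng.normal(size=N)
A = np.outer(t, t.conj())                      # target energy matrix
G = rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))
R_c = G @ G.conj().T + 1e-3 * np.eye(N)        # clutter + noise

# SCR-optimal transmit waveform: principal generalized eigenvector of
# (A, R_c), i.e., argmax over s of s^H A s / s^H R_c s.
w, V = eigh(A, R_c)
s_opt = V[:, -1]  # eigenvector of the largest generalized eigenvalue
scr = np.real(s_opt.conj() @ A @ s_opt / (s_opt.conj() @ R_c @ s_opt))
print(f"achieved SCR = {scr:.2f}")
```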
2008-04-16
Zhen (Edward) Hu Peng (Peter) Zhang Yu Song Amanpreet Singh Saini Corey Cooke April 16, 2006 Department of Electrical and Computer Engineering Center...and RF frequency agility is the most challenging issue for spectrum sensing. The radio under development is an ultra-wideband software-defined radio...PC USB programming cable and accompanying PC software as well as download test vectors to the waveform memory module, as shown in Figure 3.25
dbMDEGA: a database for meta-analysis of differentially expressed genes in autism spectrum disorder.
Zhang, Shuyun; Deng, Libin; Jia, Qiyue; Huang, Shaoting; Gu, Junwang; Zhou, Fankun; Gao, Meng; Sun, Xinyi; Feng, Chang; Fan, Guangqin
2017-11-16
Autism spectrum disorders (ASD) are hereditary, heterogeneous and biologically complex neurodevelopmental disorders. Individual studies on gene expression in ASD cannot provide clear consensus conclusions. Therefore, a systematic review to synthesize the current findings from brain tissues and a search tool to share the meta-analysis results are urgently needed. Here, we conducted a meta-analysis of brain gene expression profiles in the currently reported human ASD expression datasets (with 84 frozen male cortex samples, 17 female cortex samples, 32 cerebellum samples and 4 formalin-fixed samples) and knock-out mouse ASD model expression datasets (with 80 collective brain samples). Then, we applied the R language software and developed an interactive, shared and updated database (dbMDEGA) displaying the results of meta-analysis of data from ASD studies regarding differentially expressed genes (DEGs) in the brain. This database, dbMDEGA ( https://dbmdega.shinyapps.io/dbMDEGA/ ), is a publicly available web-portal for manual annotation and visualization of DEGs in the brain based on data from ASD studies. This database uniquely presents meta-analysis values and homologous forest plots of DEGs in brain tissues. Gene entries are annotated with meta-values, statistical values and forest plots of DEGs in brain samples. This database aims to provide searchable meta-analysis results based on currently reported brain gene expression datasets of ASD to help detect candidate genes underlying this disorder. This new analytical tool may provide valuable assistance in the discovery of DEGs and the elucidation of the molecular pathogenicity of ASD. This database model may be replicated to study other disorders.
Advances in Mössbauer data analysis
NASA Astrophysics Data System (ADS)
de Souza, Paulo A.
1998-08-01
The whole Mössbauer community has generated a huge amount of data in several fields of human knowledge since the first publication of Rudolf Mössbauer. Interlaboratory measurements of the same substance may result in minor differences in the Mössbauer parameters (MP) of isomer shift, quadrupole splitting and internal magnetic field. Therefore, a conventional data bank of published MP will be of limited help in the identification of substances. A data bank search for exact values cannot recognize that two Mössbauer parameters differing within experimental error (e.g., IS = 0.22 mm/s and IS = 0.23 mm/s) are physically the same. An artificial neural network (ANN), by contrast, is able to identify a substance and its crystalline structure from measured MP, and slight variations in the parameters do not represent an obstacle for the ANN identification. A barrier to the popularization of Mössbauer spectroscopy as an analytical technique is the absence of fully automated equipment, since the analysis of a Mössbauer spectrum is normally time-consuming and requires a specialist. In this work, the fitting process of a Mössbauer spectrum was completely automated through the use of genetic algorithms and fuzzy logic. Both software and hardware systems were implemented, resulting in a fully automated Mössbauer data analysis system. The developed system will be presented.
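A toy illustration (not the ANN itself) of why tolerance-aware matching matters: an exact lookup treats IS = 0.22 and 0.23 mm/s as different substances, whereas a tolerance-based search does not. The database entries and tolerances below are hypothetical.

```python
DATABASE = {  # hypothetical entries: name -> (IS mm/s, QS mm/s, Bhf T)
    "hematite": (0.37, -0.20, 51.7),
    "magnetite A": (0.26, 0.00, 49.0),
}

def match(is_, qs, bhf, tol=(0.02, 0.03, 0.5)):
    """Return substances whose parameters agree within the tolerances."""
    return [name for name, (i, q, b) in DATABASE.items()
            if abs(i - is_) <= tol[0] and abs(q - qs) <= tol[1]
            and abs(b - bhf) <= tol[2]]

print(match(0.27, 0.01, 49.2))  # -> ['magnetite A']
```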
An open-loop system design for deep space signal processing applications
NASA Astrophysics Data System (ADS)
Tang, Jifei; Xia, Lanhua; Mahapatra, Rabi
2018-06-01
A novel open-loop system design with high performance is proposed for space positioning and navigation signal processing. The system comprises four functional modules: a bandwidth-selectable data recorder, a narrowband signal analyzer, a time-delay difference of arrival (TDOA) estimator, and an ANFIS supplement processor. A hardware-software co-design approach is taken to accelerate computing capability and improve system efficiency. Embedded with the proposed signal processing algorithms, the designed system is capable of handling tasks with high accuracy over long periods of continuous measurement. The experimental results show that the Doppler frequency tracking root mean square error during a 3 h observation is 0.0128 Hz, while the TDOA residual in the correlation power spectrum is 0.1166 rad.
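A minimal sketch of time-delay estimation by cross-correlation, the operation underlying a TDOA estimator; the sample rate, signals, and delay are assumed placeholders.

```python
import numpy as np

# Estimate the delay between two received copies of a signal from the
# peak of their cross-correlation.
fs = 1e6                      # sample rate, Hz (assumed)
rng = np.random.default_rng(2)
s = rng.normal(size=4096)     # common source signal
true_delay = 37               # samples
x1 = s + 0.05 * rng.normal(size=s.size)
x2 = np.r_[np.zeros(true_delay), s[:-true_delay]] + 0.05 * rng.normal(size=s.size)

xc = np.correlate(x2, x1, mode="full")
lag = np.argmax(xc) - (len(x1) - 1)
print(f"estimated delay: {lag} samples = {lag / fs * 1e6:.1f} us")
```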
Center for Space Telemetering and Telecommunications Systems, New Mexico State University
NASA Technical Reports Server (NTRS)
Horan, Stephen; DeLeon, Phillip; Borah, Deva; Lyman, Ray
2002-01-01
This viewgraph presentation gives an overview of the Center for Space Telemetering and Telecommunications Systems activities at New Mexico State University. Presentations cover the following topics: (1) small satellite communications, including nanosatellite radio and virtual satellite development; (2) modulation and detection studies, including details on smooth phase interpolated keying (SPIK) spectra and highlights of an adaptive turbo multiuser detector; (3) decoupled approaches to nonlinear ISI compensation; (4) space internet testing; (5) optical communication; (6) a Linux-based receiver for lightweight optical communications without a laser in space, including software design, performance analysis, and the receiver algorithm; (7) carrier tracking hardware; and (8) subband transforms for adaptive direct sequence spread spectrum receivers.
Computation of tightly-focused laser beams in the FDTD method
Çapoğlu, İlker R.; Taflove, Allen; Backman, Vadim
2013-01-01
We demonstrate how a tightly-focused coherent TEMmn laser beam can be computed in the finite-difference time-domain (FDTD) method. The electromagnetic field around the focus is decomposed into a plane-wave spectrum, and approximated by a finite number of plane waves injected into the FDTD grid using the total-field/scattered-field (TF/SF) method. We provide an error analysis, and guidelines for the discrete approximation. We analyze the scattering of the beam from layered spaces and individual scatterers. The described method should be useful for the simulation of confocal microscopy and optical data storage. An implementation of the method can be found in our free and open source FDTD software (“Angora”). PMID:23388899
An unstructured-grid software system for solving complex aerodynamic problems
NASA Technical Reports Server (NTRS)
Frink, Neal T.; Pirzadeh, Shahyar; Parikh, Paresh
1995-01-01
A coordinated effort has been underway over the past four years to elevate unstructured-grid methodology to a mature level. The goal of this endeavor is to provide a validated capability to non-expert users for performing rapid aerodynamic analysis and design of complex configurations. The Euler component of the system is well developed, and is impacting a broad spectrum of engineering needs with capabilities such as rapid grid generation and inviscid flow analysis, inverse design, interactive boundary layers, and propulsion effects. Progress is also being made in the more tenuous Navier-Stokes component of the system. A robust grid generator is under development for constructing quality thin-layer tetrahedral grids, along with a companion Navier-Stokes flow solver. This paper presents an overview of this effort, along with a perspective on the present and future status of the methodology.
WannierTools: An open-source software package for novel topological materials
NASA Astrophysics Data System (ADS)
Wu, QuanSheng; Zhang, ShengNan; Song, Hai-Feng; Troyer, Matthias; Soluyanov, Alexey A.
2018-03-01
We present an open-source software package, WannierTools, a tool for the investigation of novel topological materials. The code works in the tight-binding framework, with input that can be generated by the software package Wannier90 (Mostofi et al., 2008). It can help to classify the topological phase of a given material by calculating the Wilson loop, and can compute the surface-state spectrum, which is probed by angle-resolved photoemission spectroscopy (ARPES) and scanning tunneling microscopy (STM) experiments. It also identifies the positions of Weyl/Dirac points and nodal-line structures, and calculates the Berry phase around a closed momentum loop and the Berry curvature in a part of the Brillouin zone (BZ).
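For illustration, a discretized Berry phase around a closed momentum loop can be computed from the product of overlaps between neighboring occupied states; the toy two-band model below is an assumption for demonstration, not WannierTools code.

```python
import numpy as np

def h(kx, ky):
    """Toy two-band Hamiltonian (Qi-Wu-Zhang-like), for demonstration."""
    m = 1.0
    d = np.array([np.sin(kx), np.sin(ky), m - np.cos(kx) - np.cos(ky)])
    sx = np.array([[0, 1], [1, 0]], complex)
    sy = np.array([[0, -1j], [1j, 0]])
    sz = np.array([[1, 0], [0, -1]], complex)
    return d[0] * sx + d[1] * sy + d[2] * sz

# Occupied (lowest) eigenvector at each point of a closed loop in the BZ.
thetas = np.linspace(0, 2 * np.pi, 200, endpoint=False)
states = []
for th in thetas:
    kx, ky = 0.5 + 0.3 * np.cos(th), 0.3 * np.sin(th)
    _, vecs = np.linalg.eigh(h(kx, ky))
    states.append(vecs[:, 0])

# Gauge-invariant discrete Berry phase: -Im log of the overlap product.
prod = 1.0 + 0j
for a, b in zip(states, states[1:] + states[:1]):
    prod *= np.vdot(a, b)
print(f"Berry phase = {-np.angle(prod):.3f} rad")
```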
Improved CLARAty Functional-Layer/Decision-Layer Interface
NASA Technical Reports Server (NTRS)
Estlin, Tara; Rabideau, Gregg; Gaines, Daniel; Johnston, Mark; Chouinard, Caroline; Nessnas, Issa; Shu, I-Hsiang
2008-01-01
Improved interface software for communication between the CLARAty Decision and Functional layers has been developed. [The Coupled Layer Architecture for Robotics Autonomy (CLARAty) was described in Coupled-Layer Robotics Architecture for Autonomy (NPO-21218), NASA Tech Briefs, Vol. 26, No. 12 (December 2002), page 48. To recapitulate: the CLARAty architecture was developed to improve the modularity of robotic software while tightening coupling between planning/execution and basic control subsystems. Whereas prior robotic software architectures typically contained three layers, CLARAty contains two layers: a decision layer (DL) and a functional layer (FL).] Types of communication supported by the present software include sending commands from DL modules to FL modules and sending data updates from FL modules to DL modules. The present software supplants prior interface software that had little error-checking capability, supported data parameters in string form only, supported commanding at only one level of the FL, and supported only limited updates of the state of the robot. The present software offers strong error checking, supports complex data structures and commanding at multiple levels of the FL, and, relative to the prior software, offers a much wider spectrum of state-update capabilities.
Simmons, Elizabeth Schoen; Paul, Rhea; Shic, Frederick
2016-01-01
This study examined the acceptability of a mobile application, SpeechPrompts, designed to treat prosodic disorders in children with ASD and other communication impairments. Ten speech-language pathologists (SLPs) in public schools and 40 of their students, aged 5-19 years with prosody deficits, participated. Students received treatment with the software over eight weeks. Pre- and post-treatment speech samples and student engagement data were collected. Feedback on the utility of the software was also obtained. SLPs implemented the software with their students in an authentic education setting. Student engagement ratings indicated that students' attention to the software was maintained during treatment. Although more testing is warranted, post-treatment prosody ratings suggest that SpeechPrompts has potential to be a useful tool in the treatment of prosodic disorders.
NASA Astrophysics Data System (ADS)
Eftekhari Zadeh, E.; Feghhi, S. A. H.; Roshani, G. H.; Rezaei, A.
2016-05-01
Because the neutron energy spectrum in the target sample varies during the activation process, and because Compton scattering causes the gamma-ray peaks of activated elements to overlap, the background changes and the measured gamma spectrum becomes complex, which ultimately makes quantitative analysis problematic. Since there is no simple analytical correlation between peak counts and element concentrations, an artificial neural network for analyzing spectra can be a helpful tool. This work describes a study on the application of a neural network to determine the percentages of cement elements (mainly Ca, Si, Al, and Fe) using the neutron capture delayed gamma-ray spectra emitted by the activated nuclei as patterns, simulated via the Monte Carlo N-particle transport code, version 2.7. A Radial Basis Function (RBF) network was developed with four specific peaks related to Ca, Si, Al and Fe extracted as inputs. The proposed RBF model is developed and trained with MATLAB 7.8 software. To obtain the optimal RBF model, several structures were constructed and tested. The comparison between simulated and predicted values using the proposed RBF model shows that there is good agreement between them.
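A minimal NumPy sketch of a Gaussian RBF network of the kind described (the authors used MATLAB 7.8); all data here are synthetic placeholders rather than the simulated MCNP spectra.

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.uniform(0, 1, size=(50, 4))          # normalized peak counts (Ca, Si, Al, Fe)
W_true = rng.uniform(0, 1, size=(4, 4))
Y = X @ W_true                               # synthetic "element percentages"

centers = X[rng.choice(50, 10, replace=False)]  # RBF centers from the data
beta = 2.0                                      # kernel width parameter (assumed)

def design(X):
    """Gaussian RBF design matrix: exp(-beta * ||x - center||^2)."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-beta * d2)

# Train output weights by linear least squares on the RBF features.
weights, *_ = np.linalg.lstsq(design(X), Y, rcond=None)
pred = design(X) @ weights
print(f"training RMSE = {np.sqrt(((pred - Y) ** 2).mean()):.4f}")
```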
Beta Pic-like Circumstellar Gas Disk Around 2 And
NASA Technical Reports Server (NTRS)
Cheng, Patricia
2003-01-01
This grant was awarded to support the data analysis and publication of results from our project entitled 'β Pic-like Circumstellar Gas Disk Around 2 And'. We proposed to obtain FUSE observations of 2 And and study the characteristics and origin of its circumstellar gas. We observed 2 Andromedae with FUSE on 3-4 July 2001 in 11 exposures with a total exposure time of 21,289 seconds through the LWRS aperture. Our data were calibrated with Version 1.8.7 of the CALFUSE pipeline processing software. We corrected the wavelength scale for the heliocentric velocity error in this version of the CALFUSE software. The relative accuracy of the calibrated wavelength scale is +/- 9 km/s. We produced a co-added spectrum in the LiF 1B and LiF 2A channels (covering the 1100 to 1180 A region) by cross-correlating the 11 individual exposures and taking an exposure-time weighted average flux. The final co-added spectra have a signal-to-noise ratio in the stellar continuum near 1150 A of about 20. To obtain an absolute wavelength calibration, we cross-correlated our observed spectra with a model spectrum to obtain the best fit for the photospheric C I lines. Because the photospheric lines are very broad, this yields an absolute accuracy for the wavelength scale of approximately +/- 15 km/s. We then rebinned 5 original pixels to yield the optimal sampling of 0.033 A for each new pixel, because the calibrated spectra oversample the spectral resolution for FUSE+LWRS (R = 20,000 +/- 2,000).
Distributed and Collaborative Software Analysis
NASA Astrophysics Data System (ADS)
Ghezzi, Giacomo; Gall, Harald C.
Throughout the years software engineers have come up with a myriad of specialized tools and techniques that focus on a certain type of
David, Matthieu; Fertin, Guillaume; Rogniaux, Hélène; Tessier, Dominique
2017-08-04
The analysis of discovery proteomics experiments relies on algorithms that identify peptides from their tandem mass spectra. The almost exhaustive interpretation of these spectra remains an unresolved issue. At present, a substantial number of missing interpretations are probably due to peptides displaying post-translational modifications and variants that yield spectra that are particularly difficult to interpret. However, the emergence of a new generation of mass spectrometers that provide high fragment ion accuracy has paved the way for more efficient algorithms. We present a new software, SpecOMS, that can handle the computational complexity of pairwise comparisons of spectra in the context of large volumes. SpecOMS can compare a whole set of experimental spectra generated by a discovery proteomics experiment to a whole set of theoretical spectra deduced from a protein database in a few minutes on a standard workstation. SpecOMS exploits these capabilities to improve the peptide identification process, allowing strong competition between all possible peptides for spectrum interpretation. Remarkably, this software resolves the drawbacks (i.e., efficiency problems and decreased sensitivity) that usually accompany open modification searches. We highlight this promising approach using results obtained from the analysis of a public human data set downloaded from the PRIDE (PRoteomics IDEntification) database.
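A minimal sketch of the core operation such pairwise comparisons rest on: counting fragment masses shared, within a tight tolerance, between an experimental and a theoretical spectrum. This is illustrative only; SpecOMS's actual data structures are considerably more elaborate.

```python
def shared_peaks(spec_a, spec_b, tol=0.01):
    """spec_a, spec_b: sorted lists of fragment m/z; two-pointer count
    of masses that agree within tol (high-accuracy instruments make
    such tight tolerances viable)."""
    i = j = n = 0
    while i < len(spec_a) and j < len(spec_b):
        d = spec_a[i] - spec_b[j]
        if abs(d) <= tol:
            n += 1
            i += 1
            j += 1
        elif d < 0:
            i += 1
        else:
            j += 1
    return n

exp = [147.11, 276.16, 389.24, 502.33]     # hypothetical experimental peaks
theo = [147.113, 276.155, 405.21, 502.331]  # hypothetical theoretical peaks
print(shared_peaks(exp, theo))  # -> 3
```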
Speech Spectrum's Correlation with Speakers' Eysenck Personality Traits
Hu, Chao; Wang, Qiandong; Short, Lindsey A.; Fu, Genyue
2012-01-01
The current study explored the correlation between speakers' Eysenck personality traits and speech spectrum parameters. Forty-six subjects completed the Eysenck Personality Questionnaire. They were instructed to verbally answer the questions shown on a computer screen and their responses were recorded by the computer. Spectrum parameters of /sh/ and /i/ were analyzed with the Praat voice software. Formant frequencies of the consonant /sh/ in lying responses were significantly lower than those in truthful responses, whereas no difference existed in the vowel /i/ speech spectrum. The second formant bandwidth of the consonant /sh/ speech spectrum was significantly correlated with the personality traits of Psychoticism, Extraversion, and Neuroticism, and the correlation differed between truthful and lying responses, whereas the first formant frequency of the vowel /i/ speech spectrum was negatively correlated with Neuroticism in both response types. The results suggest that personality characteristics may be conveyed through the human voice, although the extent to which these effects are due to physiological differences in the organs associated with speech or to a general Pygmalion effect is yet unknown. PMID:22439014
Science Gateways, Scientific Workflows and Open Community Software
NASA Astrophysics Data System (ADS)
Pierce, M. E.; Marru, S.
2014-12-01
Science gateways and scientific workflows occupy different ends of the spectrum of user-focused cyberinfrastructure. Gateways, sometimes called science portals, provide a way to enable large numbers of users to take advantage of advanced computing resources (supercomputers, advanced storage systems, science clouds) by providing Web and desktop interfaces and supporting services. Scientific workflows, at the other end of the spectrum, support advanced usage of cyberinfrastructure that enables "power users" to undertake computational experiments that are not easily done through the usual mechanisms (managing simulations across multiple sites, for example). Despite these different target communities, gateways and workflows share many similarities and can potentially be accommodated by the same software system. For example, pipelines to process InSAR imagery sets or to datamine GPS time series data are workflows. The results and the ability to make downstream products may be made available through a gateway, and power users may want to provide their own custom pipelines. In this abstract, we discuss our efforts to build an open source software system, Apache Airavata, that can accommodate both gateway and workflow use cases. Our approach is general, and we have applied the software to problems in a number of scientific domains. In this talk, we discuss our applications to usage scenarios specific to earth science, focusing on earthquake physics examples drawn from the QuakSim.org and GeoGateway.org efforts. We also examine the role of the Apache Software Foundation's open community model as a way to build up common community codes that do not depend upon a single "owner" to sustain them. Pushing beyond open source software, we also see the need to provide gateways and workflow systems as cloud services. These services centralize operations, provide well-defined programming interfaces, scale elastically, and have global-scale fault tolerance. We discuss our work providing Apache Airavata as a hosted service to provide these features.
Design of apochromatic lens with large field and high definition for machine vision.
Yang, Ao; Gao, Xingyu; Li, Mingfeng
2016-08-01
Precise machine vision detection of a large object at a finite working distance (WD) requires a lens with high resolution over a large field of view (FOV). In this case, the effect of the secondary spectrum on image quality is not negligible. According to the detection requirements, a high-resolution apochromatic objective is designed and analyzed. The initial optical structure (IOS) is assembled from three segments. Next, the secondary spectrum of the IOS is corrected by replacing glasses using the dispersion vector analysis method based on the Buchdahl dispersion equation. Other aberrations are optimized with the commercial optical design software ZEMAX by properly choosing the optimization function operands. The optimized optical structure (OOS) has an f-number (F/#) of 3.08, a FOV of φ60 mm, a WD of 240 mm, and a modulation transfer function (MTF) of more than 0.1 at 320 cycles/mm in all fields. The design requirements for a nonfluorite-material apochromatic objective lens with a large field and high definition for machine vision detection have been achieved.
An overview of the CILBO spectral observation program
NASA Astrophysics Data System (ADS)
Rudawska, R.; Zender, J.; Koschny, D.
2016-01-01
Video equipment can easily be adapted with a spectral grating to obtain spectral information from meteors, and in recent years spectroscopic observations of meteors have become quite popular. The Meteor Research Group (MRG) of the European Space Agency has been working on upgrading its meteor spectra analysis as well, operating an image-intensified camera with an objective grating (ICC8). ICC8 is located at the Tenerife station of the double-station camera setup CILBO (Canary Island Long-Baseline Observatory). The pipeline software processes the data with the standard calibration procedure (dark current, flat field, and lens distortion corrections). Using the position of a meteor recorded by the ICC7 camera (zero order), the position of the 1st order spectrum as a function of wavelength is computed. Moreover, thanks to the double-station meteor observations carried out by ICC7 (Tenerife) and ICC9 (La Palma), the trajectory and orbit of a meteor are determined; merged with the simultaneous measurement of the meteor spectrum from ICC8, this allows us to identify the source of the meteoroid. Here, we report on preliminary results from a sample of meteor spectra collected by the CILBO-ICC8 camera since 2012.
Maldonado, Fabien; Duan, Fenghai; Raghunath, Sushravya M.; Rajagopalan, Srinivasan; Karwoski, Ronald A.; Garg, Kavita; Greco, Erin; Nath, Hrudaya; Robb, Richard A.; Bartholmai, Brian J.
2015-01-01
Rationale: Screening for lung cancer using low-dose computed tomography (CT) reduces lung cancer mortality. However, in addition to a high rate of benign nodules, lung cancer screening detects a large number of indolent cancers that generally belong to the adenocarcinoma spectrum. Individualized management of screen-detected adenocarcinomas would be facilitated by noninvasive risk stratification. Objectives: To validate that Computer-Aided Nodule Assessment and Risk Yield (CANARY), a novel image analysis software, successfully risk stratifies screen-detected lung adenocarcinomas based on clinical disease outcomes. Methods: We retrospectively identified 294 eligible patients diagnosed with lung adenocarcinoma spectrum lesions in the low-dose CT arm of the National Lung Screening Trial. The last low-dose CT scan before the diagnosis of lung adenocarcinoma was analyzed using CANARY blinded to clinical data. Based on their parametric CANARY signatures, all the lung adenocarcinoma nodules were risk stratified into three groups. CANARY risk groups were compared using survival analysis for progression-free survival. Measurements and Main Results: A total of 294 patients were included in the analysis. Kaplan-Meier analysis of all the 294 adenocarcinoma nodules stratified into the Good, Intermediate, and Poor CANARY risk groups yielded distinct progression-free survival curves (P < 0.0001). This observation was confirmed in the unadjusted and adjusted (age, sex, race, and smoking status) progression-free survival analysis of all stage I cases. Conclusions: CANARY allows the noninvasive risk stratification of lung adenocarcinomas into three groups with distinct post-treatment progression-free survival. Our results suggest that CANARY could ultimately facilitate individualized management of incidentally or screen-detected lung adenocarcinomas. PMID:26052977
Tandem Mass Spectrum Sequencing: An Alternative to Database Search Engines in Shotgun Proteomics.
Muth, Thilo; Rapp, Erdmann; Berven, Frode S; Barsnes, Harald; Vaudel, Marc
2016-01-01
Protein identification via database searches has become the gold standard in mass spectrometry-based shotgun proteomics. However, as the quality of tandem mass spectra improves, direct mass spectrum sequencing gains interest as a database-independent alternative. In this chapter, the general principle of this so-called de novo sequencing is introduced along with pitfalls and challenges of the technique. The main tools available are presented, with a focus on user-friendly open-source software that can be directly applied in everyday proteomic workflows.
Network of wireless gamma ray sensors for radiological detection and identification
NASA Astrophysics Data System (ADS)
Barzilov, A.; Womble, P.; Novikov, I.; Paschal, J.; Board, J.; Moss, K.
2007-04-01
The paper describes the design and development of a network of wireless gamma-ray sensors based on cell phone or WiFi technology. The system is intended for gamma-ray detection and automatic identification of radioactive isotopes and nuclear materials. The sensor is a gamma-ray spectrometer that uses wireless technology to distribute the results. A small-size sensor module contains a scintillation detector along with a small data acquisition system, PDA, battery, and WiFi radio or cell phone modem. The PDA, with data acquisition and analysis software, analyzes the accumulated spectrum in real time and reports on screen the isotopic composition and intensity of the detected radiation source. The system has been programmed to mitigate false alarms from medical isotopes and naturally occurring radioactive materials. The decision-making software can be "trained" to indicate specific signatures of radiation sources such as special nuclear materials. The sensor is equipped with a GPS tracker, coupling radiological information with geographical coordinates. The sensor is designed for easy use and rapid deployment in common wireless networks.
Chukhryaeva, M I; Ivanov, I O; Frolova, S A; Koshel, S M; Utevska, O M; Skhalyakho, R A; Agdzhoyan, A T; Bogunov, Yu V; Balanovska, E V; Balanovsky, O P
2016-05-01
STR haplotypes of the Y chromosome are widely used as effective genetic markers in studies of human populations and in forensic DNA analysis. The task of comparing the spectra of haplotypes of individuals or entire populations often arises; performing this task manually is too laborious to be realistic. We propose an algorithm for computing the similarity between STR haplotypes that is suitable for massive analyses of samples. It is implemented in the computer program Haplomatch, which makes it possible to find haplotypes that differ from the target haplotype by 0, 1, 2, 3, or more mutational steps. The program may operate in two modes: comparison of individuals and comparison of populations. The flexibility of the program (the possibility of using any external database), its usability (MS Excel spreadsheets are used), and its applicability to other chromosomes and other species could make this software a useful new tool in population genetics and in forensic and genealogical studies. The Haplomatch software is freely available on our website www.genofond.ru. The program is applied to studying the gene pool of Cossacks. Experimental analysis of Y-chromosomal diversity in a representative set (N = 131) of Upper Don Cossacks is performed. Analysis of the STR haplotypes detects the genetic proximity of Cossacks to East Slavic populations (in particular, to Southern and Central Russians, as well as to Ukrainians), which confirms the hypothesis of the origin of the Cossacks mainly through immigration from Russia and Ukraine. A small genetic influence of Turkic-speaking Nogais is also found, probably caused by their presence in the Don Voisko as part of the Tatar layer. No similarities between the haplotype spectra of Cossacks and Caucasus populations are found. This case study demonstrates the effectiveness of the Haplomatch software in analyzing large sets of STR haplotypes.
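A minimal sketch of the comparison Haplomatch automates: the distance between two Y-STR haplotypes as the total number of single-repeat mutational steps summed over loci; locus names and repeat counts are illustrative.

```python
def mutational_steps(h1, h2):
    """h1, h2: dicts mapping locus -> repeat count (same loci in both);
    returns the total number of single-repeat mutational steps."""
    return sum(abs(h1[locus] - h2[locus]) for locus in h1)

target = {"DYS19": 16, "DYS390": 25, "DYS391": 11, "DYS393": 13}
query = {"DYS19": 16, "DYS390": 24, "DYS391": 11, "DYS393": 14}
print(mutational_steps(target, query))  # -> 2: differs by two steps
```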
NASA Astrophysics Data System (ADS)
Zoller, Christian; Hohmann, Ansgar; Ertl, Thomas; Kienle, Alwin
2017-07-01
The Monte Carlo method is often referred to as the gold standard for calculating light propagation in turbid media [1]. Especially for complex-shaped geometries, where no analytical solutions are available, the Monte Carlo method becomes very important [1, 2]. In this work a Monte Carlo software is presented that simulates light propagation in complex-shaped geometries. To reduce the simulation time, the code is based on OpenCL, so that graphics cards as well as other computing devices can be used. Within the software an illumination concept is presented that easily realizes all kinds of light sources, such as spatial frequency domain (SFD) illumination, optical fibers, or Gaussian beam profiles. Moreover, different objects that are not connected to each other can be considered simultaneously, without any additional preprocessing. This Monte Carlo software can be used for many applications. In this work the transmission spectrum of a tooth and the color reconstruction of a virtual object are shown, using results from the Monte Carlo software.
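A minimal fragment of the photon random walk at the heart of such simulations (NumPy on CPU here, whereas the described software uses OpenCL); the optical coefficients are assumed.

```python
import numpy as np

mu_a, mu_s = 0.1, 10.0           # absorption/scattering coeffs, 1/mm (assumed)
mu_t = mu_a + mu_s
rng = np.random.default_rng(4)

n_photons = 100_000
# Free path length between interactions: s = -ln(xi) / mu_t.
paths = -np.log(rng.random(n_photons)) / mu_t
# At each interaction the photon is absorbed with probability mu_a/mu_t.
absorbed = rng.random(n_photons) < mu_a / mu_t
print(f"mean free path = {paths.mean():.3f} mm, "
      f"absorbed at first event: {absorbed.mean():.1%}")
```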
Dowla, Farid U; Nekoogar, Faranak
2015-03-03
A method for adaptive Radio Frequency (RF) jamming according to one embodiment includes dynamically monitoring a RF spectrum; detecting any undesired signals in real time from the RF spectrum; and sending a directional countermeasure signal to jam the undesired signals. A method for adaptive Radio Frequency (RF) communications according to another embodiment includes transmitting a data pulse in a RF spectrum; and transmitting a reference pulse separated by a predetermined period of time from the data pulse; wherein the data pulse is modulated with data, wherein the reference pulse is unmodulated. A method for adaptive Radio Frequency (RF) communications according to yet another embodiment includes receiving a data pulse in a RF spectrum; and receiving a reference pulse separated in time from the data pulse, wherein the data pulse is modulated with data, wherein the reference pulse is unmodulated; and demodulating the pulses.
Burckley, Elizabeth; Tincani, Matt; Guld Fisher, Amanda
2015-04-01
This study evaluated the iPad 2™ with Book Creator™ software for providing visual cues and video prompting to teach community shopping skills to a young adult with an autism spectrum disorder and intellectual disability. A multiple probe across settings design was used to assess the effects of the intervention on the participant's independence in following a shopping list in a grocery store across three community locations. Visual cues and video prompting substantially increased the participant's shopping skills within two of the three community locations, skill increases were maintained after the intervention was withdrawn, and shopping skills generalized to two untaught shopping items. Social validity surveys suggested that the participant's parent and staff viewed the goals, procedures, and outcomes of the intervention favorably. The iPad 2™ with Book Creator™ software may be an effective way to teach independent shopping skills in the community; additional replications are needed.
A method to test the performance of an energy-dispersive X-ray spectrometer (EDS).
Hodoroaba, Vasile-Dan; Procop, Mathias
2014-10-01
A test material for routine performance evaluation of energy-dispersive X-ray spectrometers (EDS) is presented. It consists of a synthetic, thick coating of C, Al, Mn, Cu, and Zr, in an elemental composition that provides interference-free characteristic X-ray lines of similar intensities at 10 kV scanning electron microscope voltage. The EDS energy resolution at the C-K, Mn-Lα, Cu-Lα, Al-K, Zr-Lα, and Mn-Kα lines, the calibration state of the energy scale, and the Mn-Lα/Mn-Kα intensity ratio as a measure for the low-energy detection efficiency are calculated by a dedicated software package from the 10 kV spectrum. Measurements at various input count rates and processor shaping times enable an estimation of the operation conditions for which the X-ray spectrum is not yet corrupted by pile-up events. Representative examples of EDS systems characterized with the test material and the related software are presented and discussed.
NASA Technical Reports Server (NTRS)
Mckay, C. W.; Bown, R. L.
1985-01-01
The space station data management system involves networks of computing resources that must work cooperatively and reliably over an indefinite life span. This program requires a long schedule of modular growth and an even longer period of maintenance and operation. The development and operation of space station computing resources will involve a spectrum of systems and software life cycle activities distributed across a variety of hosts; an integration, verification, and validation host with test bed; and distributed targets. The requirement for the early establishment and use of an appropriate Computer Systems and Software Engineering Support Environment is identified. This environment will support the research and development productivity challenges presented by the space station computing system.
Autonomy Software: V&V Challenges and Characteristics
NASA Technical Reports Server (NTRS)
Schumann, Johann; Visser, Willem
2006-01-01
The successful operation of unmanned air vehicles requires software with a high degree of autonomy. Only if high-level functions can be carried out without human control and intervention can complex missions in a changing and potentially unknown environment be carried out successfully. Autonomy software is highly mission and safety critical: failures caused by flaws in the software can not only jeopardize the mission, but could also endanger human life (e.g., a crash of a UAV in a densely populated area). Due to its large size, high complexity, and use of specialized algorithms (planner, constraint-solver, etc.), autonomy software poses specific challenges for its verification, validation, and certification. We have carried out a survey among researchers and scientists at NASA to study these issues. In this paper, we present major results of this study, discussing the broad spectrum of notions and characteristics of autonomy software and its challenges for design and development. A main focus of this survey was to evaluate verification and validation (V&V) issues and challenges, compared to the development of "traditional" safety-critical software. We discuss important issues in V&V of autonomous software and advanced V&V tools which can help to mitigate software risks. The results of this survey will help to identify and understand safety concerns in autonomy software and will lead to improved strategies for mitigation of these risks.
NASA Astrophysics Data System (ADS)
Ayuga, Carlos Eugenio Tapia; Zamorano, Jaime
2018-07-01
The night sky spectrum of light-polluted areas is the result of artificial light scattered back from the atmosphere and of light re-emitted after reflection from painted surfaces. This emission comes mainly from street and decorative lamps. We have built an extensive database of lamp spectra covering the UV to the near IR, together with the software needed to analyze them. We describe LICA-AstroCalc, a free, user-friendly GUI tool to extract information from our database spectra or any other user-provided spectrum. The software also includes the complete NCS color database of paints, comprising 1950 types. This helps to evaluate how different colors modify the spectra reflected from different lamps. All spectroscopic measurements have been validated with recommendations from CIELAB and ISO from the NCS database.
Mathcad in the Chemistry Curriculum Symbolic Software in the Chemistry Curriculum
NASA Astrophysics Data System (ADS)
Zielinski, Theresa Julia
2000-05-01
Physical chemistry is such a broad discipline that the topics we expect average students to complete in two semesters usually exceed their ability for meaningful learning. Consequently, the number and kind of topics and the efficiency with which students can learn them are important concerns. What topics are essential and what can we do to provide efficient and effective access to those topics? How do we accommodate the fact that students come to upper-division chemistry courses with a variety of nonuniformly distributed skills, a bit of calculus, and some physics studied one or more years before physical chemistry? The critical balance between depth and breadth of learning in courses and curricula may be achieved through appropriate use of technology and especially through the use of symbolic mathematics software. Software programs such as Mathcad, Mathematica, and Maple, however, have learning curves that diminish their effectiveness for novices. There are several ways to address the learning curve conundrum. First, basic instruction in the software provided during laboratory sessions should be followed by requiring laboratory reports that use the software. Second, one should assign weekly homework that requires the software and builds student skills within the discipline and with the software. Third, a complementary method, supported by this column, is to provide students with Mathcad worksheets or templates that focus on one set of related concepts and incorporate a variety of features of the software that they are to use to learn chemistry. In this column we focus on two significant topics for young chemists. The first is curve-fitting and the statistical analysis of the fitting parameters. The second is the analysis of the rotation/vibration spectrum of a diatomic molecule, HCl. A broad spectrum of Mathcad documents exists for teaching chemistry. One collection of 50 documents can be found at http://www.monmouth.edu/~tzielins/mathcad/Lists/index.htm. Another collection of peer-reviewed documents is developing through this column at the JCE Internet Web site, http://jchemed.chem.wisc.edu/JCEWWW/Features/ McadInChem/index.html. With this column we add three peer-reviewed and tested Mathcad documents to the JCE site. In Linear Least-Squares Regression, Sidney H. Young and Andrzej Wierzbicki demonstrate various implicit and explicit methods for determining the slope and intercept of the regression line for experimental data. The document shows how to determine the standard deviation for the slope, the intercept, and the standard deviation of the overall fit. Students are next given the opportunity to examine the confidence level for the fit through the Student's t-test. Examination of the residuals of the fit leads students to explore the possibility of rejecting points in a set of data. The document concludes with a discussion of and practice with adding a quadratic term to create a polynomial fit to a set of data and how to determine if the quadratic term is statistically significant. There is full documentation of the various steps used throughout the exposition of the statistical concepts. Although the statistical methods presented in this worksheet are generally accessible to average physical chemistry students, an instructor would be needed to explain the finer points of the matrix methods used in some sections of the worksheet. The worksheet is accompanied by a set of data for students to use to practice the techniques presented. 
It would be worthwhile for students to spend one or two laboratory periods learning to use the concepts presented and then to apply them to experimental data they have collected for themselves. Any linear or linearizable data set would be appropriate for use with this Mathcad worksheet. Alternatively, instructors may select sections of the document suited to the skill level of their students and the laboratory tasks at hand. In a second Mathcad document, Non-Linear Least-Squares Regression, Young and Wierzbicki introduce the basic concepts of nonlinear curve-fitting and develop the techniques needed to fit a variety of mathematical functions to experimental data. This approach is especially important when mathematical models for chemical processes cannot be linearized. In Mathcad the Levenberg-Marquardt algorithm is used to determine the best fitting parameters for a particular mathematical model. As in linear least-squares, the goal of the fitting process is to find the values for the fitting parameters that minimize the sum of the squares of the deviations between the data and the mathematical model. Students are asked to determine the fitting parameters, use the Hessian matrix to compute the standard deviation of the fitting parameters, test for the significance of the parameters using Student's t-test, use residual analysis to test for data points to remove, and repeat the calculations for another set of data. The nonlinear least-squares procedure follows closely on the pattern set up for linear least-squares by the same authors (see above). If students master the linear least-squares worksheet content they will be able to master the nonlinear least-squares technique (see also refs 1, 2). In the third document, The Analysis of the Vibrational Spectrum of a Linear Molecule by Richard Schwenz, William Polik, and Sidney Young, the authors build on the concepts presented in the curve fitting worksheets described above. This vibrational analysis document, which supports a classic experiment performed in the physical chemistry laboratory, shows how a Mathcad worksheet can increase the efficiency by which a set of complicated manipulations for data reduction can be made more accessible for students. The increase in efficiency frees up time for students to develop a fuller understanding of the physical chemistry concepts important to the interpretation of spectra and understanding of bond vibrations in general. The analysis of the vibration/rotation spectrum for a linear molecule worksheet builds on the rich literature for this topic (3). Before analyzing their own spectral data, students practice and learn the concepts and methods of the HCl spectral analysis by using the fundamental and first harmonic vibrational frequencies provided by the authors. This approach has a fundamental pedagogical advantage. Most explanations in laboratory texts are very concise and lack mathematical details required by average students. This Mathcad worksheet acts as a tutor; it guides students through the essential concepts for data reduction and lets them focus on learning important spectroscopic concepts. The Mathcad worksheet is amply annotated. Students who have moderate skill with the software and have learned about regression analysis from the curve-fitting worksheets described in this column will be able to complete and understand their analysis of the IR spectrum of HCl. 
The three Mathcad worksheets described here stretch the physical chemistry curriculum by presenting important topics in forms that students can use with only moderate Mathcad skills. The documents facilitate learning by giving students opportunities to interact with the material in meaningful ways in addition to using the documents as sources of techniques for building their own data-reduction worksheets. However, working through these Mathcad worksheets is not a trivial task for the average student. Support needs to be provided by the instructor to ease students through more advanced mathematical and Mathcad processes. These worksheets raise the question of how much we can ask diligent students to do in one course and how much time they need to spend to master the essential concepts of that course. The Mathcad documents and associated PDF versions are available at the JCE Internet WWW site. The Mathcad documents require Mathcad version 6.0 or higher and the PDF files require Adobe Acrobat. Every effort has been made to make the documents fully compatible across the various Mathcad versions. Users may need to refer to Mathcad manuals for functions that vary with the Mathcad version number. Literature Cited 1. Bevington, P. R. Data Reduction and Error Analysis for the Physical Sciences; McGraw-Hill: New York, 1969. 2. Zielinski, T. J.; Allendoerfer, R. D. J. Chem. Educ. 1997, 74, 1001. 3. Schwenz, R. W.; Polik, W. F. J. Chem. Educ. 1999, 76, 1302.
NASA Astrophysics Data System (ADS)
Bringley, Eric; Cao, Tongtong; Ilieva, Yordonka; Nadel-Turonski, Pawel; Park, Kijun; Zorn, Carl
2014-09-01
At the Thomas Jefferson National Accelerator Facility (JLab) a research and development project for a Detector of Internally-Reflected Cherenkov light for the upcoming Electron Ion Collider is underway. One goal is the development of a compact readout camera that can operate in high magnetic fields. Small-size photon sensors, such as Microchannel-Plate Photomultipliers (MCP-PMT), are key components of the readout. Here we present our work to set up and commission a dedicated test facility at JLab where MCP-PMT gain is evaluated in magnetic fields of up to 5 T, and to develop a test procedure and analysis software to determine the gain. We operate the setup in single-photon mode, where a light-emitting diode delivers photons to the sensor's photocathode. The PMT spectrum is measured with a flash Analog-to-Digital converter (fADC). We model the spectrum as a sum of an exponential background and a convolution of Poisson and Gaussian distributions of the pedestal and multiple photoelectron peaks, respectively. We determine the PMT's gain from the position of the single-photoelectron peak obtained by fitting the fADC spectrum to the model. Our gain uncertainty is <10%. The facility is now established and will have long-lasting value for sensor tests and beyond-nuclear-physics applications.
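A minimal sketch of the spectrum model described above, with NumPy/SciPy; parameter names and the background handling are illustrative assumptions, and the real analysis fits this shape to the measured fADC histogram to locate the single-photoelectron peak.

import numpy as np
from scipy.stats import norm, poisson

def pmt_spectrum(x, mu, ped, sig0, gain, sig1, bkg_amp, bkg_tau, nmax=10):
    """Exponential background plus Poisson-weighted photoelectron peaks."""
    s = bkg_amp * np.exp(-(x - ped) / bkg_tau) * (x >= ped)   # background term
    for n in range(nmax + 1):
        w = poisson.pmf(n, mu)                   # P(n photoelectrons | mean mu)
        mean = ped + n * gain                    # n-th peak position; gain sets spacing
        sigma = np.sqrt(sig0**2 + n * sig1**2)   # pedestal and n-PE widths in quadrature
        s += w * norm.pdf(x, mean, sigma)
    return s
# The gain is read off as the fitted distance between the pedestal (n = 0)
# and the single-photoelectron (n = 1) peak.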
NASA Astrophysics Data System (ADS)
Evans, B. J. K.; Pugh, T.; Wyborn, L. A.; Porter, D.; Allen, C.; Smillie, J.; Antony, J.; Trenham, C.; Evans, B. J.; Beckett, D.; Erwin, T.; King, E.; Hodge, J.; Woodcock, R.; Fraser, R.; Lescinsky, D. T.
2014-12-01
The National Computational Infrastructure (NCI) has co-located a priority set of national data assets within an HPC research platform. This powerful in-situ computational platform has been created to help serve and analyse the massive amounts of data across the spectrum of environmental collections - in particular the climate, observational data and geoscientific domains. This paper examines the infrastructure, innovation and opportunity for this significant research platform. NCI currently manages nationally significant data collections (10+ PB) categorised as 1) earth system sciences, climate and weather model data assets and products, 2) earth and marine observations and products, 3) geosciences, 4) terrestrial ecosystem, 5) water management and hydrology, and 6) astronomy, social science and biosciences. The data is largely sourced from the NCI partners (who include the custodians of many of the national scientific records), major research communities, and collaborating overseas organisations. By co-locating these large, valuable data assets and harmonising the data collections, new opportunities have arisen, making a powerful transdisciplinary research platform. The data is accessible within an integrated HPC-HPD environment - a 1.2 PFlop supercomputer (Raijin), an HPC-class 3000-core OpenStack cloud system and several highly connected, large-scale, high-bandwidth Lustre filesystems. New scientific software, cloud-scale techniques, server-side visualisation and data services have been harnessed and integrated into the platform, so that analysis is performed seamlessly across the traditional boundaries of the underlying data domains. Characterisation of the techniques along with performance profiling ensures scalability of each software component, all of which can either be enhanced or replaced through future improvements. A Development-to-Operations (DevOps) framework has also been implemented to manage the sheer scale of the software complexity. This ensures that software is both upgradable and maintainable, can be readily reused within complex integrated systems, and can become part of the growing set of globally trusted community tools for cross-disciplinary research.
Radial Profiles of PKS 0745-191 Galaxy Cluster with XMM-Newton X-Ray Observations
NASA Astrophysics Data System (ADS)
Tumer, A.; Ezer, C.; Ercan, E.
2017-10-01
Since clusters of galaxies are the largest comprehensive samples of the universe, they provide essential information on physical mechanisms ranging from the most basic to the most complex, such as nucleosynthesis and supernova events. Some of this information is provided by X-ray emission data from the intracluster medium (ICM), which contains hot, dilute gas. A recent archived XMM-Newton observation of the X-ray spectrum of the cool-core galaxy cluster PKS 0745-191 is subjected to data analysis using the ESAS package. Following spectral analysis with the XSPEC spectral-fitting software, we present the radial profiles of temperature and abundance from the core out to 0.5 R_500 of the brightest distant cluster (z ˜ 0.102), PKS 0745-191. Using the deprojected spectra, the radial distributions of pressure and entropy in the same region are also presented.
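A hedged sketch of the spectral-fitting step with PyXspec, the Python interface to XSPEC; the file name, energy band, and model choice (an absorbed apec plasma) are illustrative assumptions rather than the authors' exact configuration.

from xspec import Spectrum, Model, Fit

spec = Spectrum("annulus1.pha")       # one extracted annular spectrum (assumed name)
spec.ignore("**-0.5 7.0-**")          # keep a usable energy band in keV
model = Model("phabs*apec")           # absorbed thermal-plasma model (assumed choice)
Fit.perform()                         # minimize the fit statistic
kT = model.apec.kT.values[0]          # best-fit temperature for this annulus (keV)
Z = model.apec.Abundanc.values[0]     # best-fit metal abundance (solar units)

Repeating this over deprojected annuli yields the temperature and abundance values from which the radial pressure and entropy profiles follow.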
Radical Software. Number Two. The Electromagnetic Spectrum.
ERIC Educational Resources Information Center
Korot, Beryl, Ed.; Gershuny, Phyllis, Ed.
1970-01-01
In an effort to foster the innovative uses of television technology, this tabloid format periodical details social, educational, and artistic experiments with television and lists a large number of experimental videotapes available from various television-centered groups and individuals. The principal areas explored in this issue include cable…
5nsec Dead time multichannel scaling system for Mössbauer spectrometer
NASA Astrophysics Data System (ADS)
Verrastro, C.; Trombetta, G.; Pita, A.; Saragovi, C.; Duhalde, S.
1991-11-01
A fast, PC-programmable multichannel scaling module has been designed for use with a commercial Mössbauer spectrometer. The module is based on a single-chip 8-bit microcomputer (MC6805) and a fast ALU, which allows a high-performance, low-cost system. The module can operate in stand-alone mode. Data analysis and real-time display are performed on XT/AT IBM PCs or compatibles. The number of channels ranges from 256 to 4096, the maximum count is 2³²−1 per channel, the dwell time is 3 μs, and the dead time between channels is 5 ns. User-friendly software displays the real-time spectrum and offers menus with different options in each state.
Portable gas chromatograph-mass spectrometer
Andresen, Brian D.; Eckels, Joel D.; Kimmons, James F.; Myers, David W.
1996-01-01
A gas chromatograph-mass spectrometer (GC-MS) for use as a field-portable organic chemical analysis instrument. The GC-MS is designed to be contained in a standard-size suitcase, weighs less than 70 pounds, and requires less than 600 watts of electrical power at peak power (all systems on). The GC-MS includes: a conduction-heated, forced-air-cooled, small-bore capillary gas chromatograph; a small injector assembly; a self-contained ion/sorption pump vacuum system; a hydrogen supply; a dual computer system used to control the hardware and acquire spectrum data; and operational software used to control the pumping system and the gas chromatograph. This instrument incorporates a modified commercial quadrupole mass spectrometer to achieve the instrument sensitivity and mass resolution characteristic of laboratory bench-top units.
RootGraph: a graphic optimization tool for automated image analysis of plant roots
Cai, Jinhai; Zeng, Zhanghui; Connor, Jason N.; Huang, Chun Yuan; Melino, Vanessa; Kumar, Pankaj; Miklavcic, Stanley J.
2015-01-01
This paper outlines a numerical scheme for accurate, detailed, and high-throughput image analysis of plant roots. In contrast to existing root image analysis tools that focus on root system-average traits, a novel, fully automated and robust approach for the detailed characterization of root traits, based on a graph optimization process, is presented. The scheme, firstly, distinguishes primary roots from lateral roots and, secondly, quantifies a broad spectrum of root traits for each identified primary and lateral root. Thirdly, it associates lateral roots and their properties with the specific primary root from which the laterals emerge. The performance of this approach was evaluated through comparisons with other automated and semi-automated software solutions as well as against results based on manual measurements. The comparisons and subsequent application of the algorithm to an array of experimental data demonstrate that this method outperforms existing methods in terms of accuracy, robustness, and the ability to process root images under high-throughput conditions. PMID:26224880
Eghrari, Allen O; Mumtaz, Aisha A; Garrett, Brian; Rezaei, Mahsa; Akhavan, Mina S; Riazuddin, S Amer; Gottsch, John D
2017-01-01
Retroillumination photography analysis is an objective tool for the assessment of the number and distribution of guttae in eyes affected with Fuchs corneal dystrophy (FCD). Current protocols include manual processing of images; here, we assess the validity and interrater reliability of automated analysis across various levels of FCD severity. Retroillumination photographs of 97 FCD-affected corneas were acquired, and total counts of guttae had previously been summed manually. For each cornea, a single image was loaded into ImageJ software. We reduced color variability and subtracted background noise. Reflection of light from each gutta was identified as a local area of maximum intensity and counted automatically. The noise tolerance level was titrated for each cornea by examining a small region of each image with an automated overlay to ensure appropriate coverage of individual guttae. We tested the interrater reliability of automated counts of guttae across a spectrum of clinical and educational experience. A set of 97 retroillumination photographs was analyzed. Clinical severity as measured by a modified Krachmer scale ranged from a severity level of 1 to 5 in the set of analyzed corneas. Automated counts by an ophthalmologist correlated strongly with Krachmer grading (R = 0.79) and manual counts (R = 0.88). Intraclass correlation coefficients demonstrated strong correlation at 0.924 (95% CI, 0.870-0.958) among cases analyzed by 3 students, and 0.869 (95% CI, 0.797-0.918) among cases for which images were analyzed by an ophthalmologist and 2 students. Automated retroillumination photography analysis allows for grading of FCD severity with high resolution across a spectrum of disease severity.
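A minimal sketch of the same pipeline outside ImageJ, assuming scikit-image as a stand-in; the Gaussian background scale and the noise tolerance are illustrative values that would be titrated per cornea as described.

import numpy as np
from skimage import io, filters
from skimage.feature import peak_local_max

img = io.imread("retroillumination.png", as_gray=True)   # assumed file name
background = filters.gaussian(img, sigma=50)              # coarse illumination field
flat = img - background                                   # subtract background noise
noise_tolerance = 0.02                                    # titrated per cornea
peaks = peak_local_max(flat, min_distance=3, threshold_abs=noise_tolerance)
print("automated guttae count:", len(peaks))              # one peak per gutta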
Ranganath, Prajnya; Matta, Divya; Bhavani, Gandham SriLakshmi; Wangnekar, Savita; Jain, Jamal Mohammed Nurul; Verma, Ishwar C; Kabra, Madhulika; Puri, Ratna Dua; Danda, Sumita; Gupta, Neerja; Girisha, Katta M; Sankar, Vaikom H; Patil, Siddaramappa J; Ramadevi, Akella Radha; Bhat, Meenakshi; Gowrishankar, Kalpana; Mandal, Kausik; Aggarwal, Shagun; Tamhankar, Parag Mohan; Tilak, Preetha; Phadke, Shubha R; Dalal, Ashwin
2016-10-01
Acid sphingomyelinase (ASM)-deficient Niemann-Pick disease is an autosomal recessive lysosomal storage disorder caused by biallelic mutations in the SMPD1 gene. To date, around 185 mutations have been reported in patients with ASM-deficient NPD worldwide, but the mutation spectrum of this disease in India has not yet been reported. The aim of this study was to ascertain the mutation profile in Indian patients with ASM-deficient NPD. We sequenced SMPD1 in 60 unrelated families affected with ASM-deficient NPD. A total of 45 distinct pathogenic sequence variants were found, of which 14 were known and 31 were novel. The variants included 30 missense, 4 nonsense, and 9 frameshift (7 single base deletions and 2 single base insertions) mutations, 1 indel, and 1 intronic duplication. The pathogenicity of the novel mutations was inferred with the help of the mutation prediction software MutationTaster, SIFT, Polyphen-2, PROVEAN, and HANSA. The effects of the identified sequence variants on the protein structure were studied using the structure modeled with the help of the SWISS-MODEL workspace program. The p.(Arg542*) (c.1624C>T) mutation was the most commonly identified mutation, found in 22% (26 of 120) of the alleles tested, but haplotype analysis for this mutation did not identify a founder effect for the Indian population. To the best of our knowledge, this is the largest study on mutation analysis of patients with ASM-deficient Niemann-Pick disease reported in the literature and also the first study on the SMPD1 gene mutation spectrum in India. © 2016 Wiley Periodicals, Inc.
Adult Literacy and Technology Newsletter. Vol. 3, Nos. 1-4.
ERIC Educational Resources Information Center
Gueble, Ed, Ed.
1989-01-01
This document consists of four issues of a newsletter focused on the spectrum of technology use in literacy instruction. The first issue contains the following articles: "Five 'Big' Systems and One 'Little' Option" (Weisberg); "Computer Use Patterns at Blackfeet Community College" (Hill); "Software Review: Educational Activities' Science Series"…
Visual-Auditory Integration during Speech Imitation in Autism
ERIC Educational Resources Information Center
Williams, Justin H. G.; Massaro, Dominic W.; Peel, Natalie J.; Bosseler, Alexis; Suddendorf, Thomas
2004-01-01
Children with autistic spectrum disorder (ASD) may have poor audio-visual integration, possibly reflecting dysfunctional "mirror neuron" systems which have been hypothesised to be at the core of the condition. In the present study, a computer program, utilizing speech synthesizer software and a "virtual" head (Baldi), delivered speech stimuli for…
A learning tool for optical and microwave satellite image processing and analysis
NASA Astrophysics Data System (ADS)
Dashondhi, Gaurav K.; Mohanty, Jyotirmoy; Eeti, Laxmi N.; Bhattacharya, Avik; De, Shaunak; Buddhiraju, Krishna M.
2016-04-01
This paper presents a self-learning tool, which contains a number of virtual experiments for processing and analysis of Optical/Infrared and Synthetic Aperture Radar (SAR) images. The tool is named the Virtual Satellite Image Processing and Analysis Lab (v-SIPLAB). The experiments included in the learning tool relate to: Optical/Infrared - image and edge enhancement, smoothing, PCT, vegetation indices, mathematical morphology, accuracy assessment, supervised/unsupervised classification, etc.; Basic SAR - parameter extraction and range spectrum estimation, range compression, Doppler centroid estimation, azimuth reference function generation and compression, multilooking, image enhancement, texture analysis, edge detection, etc.; SAR Interferometry - baseline calculation, extraction of single-look SAR images, registration, resampling, and interferogram generation; SAR Polarimetry - conversion of AirSAR or Radarsat data to S2/C3/T3 matrices, speckle filtering, power/intensity image generation, decomposition of S2/C3/T3, and classification of S2/C3/T3 using the Wishart classifier [3]. Professional-quality polarimetric SAR software can be found at [8]; part of its functionality is available in our system. Besides executable software experiments, the learning tool also contains other modules, such as aim, theory, procedure, interpretation, quizzes, links to additional reading material, and user feedback. Students can gain an understanding of optical and SAR remotely sensed images through discussion of basic principles, supported by structured procedures for running and interpreting the experiments. Quizzes for self-assessment and a provision for online feedback are also provided to make the learning tool self-contained. Results can be downloaded after performing the experiments.
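As one concrete example of the interferometry module's final step, the sketch below (plain NumPy; file names are illustrative assumptions) forms an interferogram from two co-registered single-look complex images.

import numpy as np

slc1 = np.load("master_slc.npy")      # complex-valued SLC, master scene (assumed)
slc2 = np.load("slave_slc.npy")       # co-registered, resampled slave scene (assumed)
interferogram = slc1 * np.conj(slc2)  # per-pixel complex phase difference
phase = np.angle(interferogram)       # wrapped interferometric phase in [-pi, pi]
magnitude = np.abs(interferogram)     # magnitude, the basis for multilooked coherence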
NASA Astrophysics Data System (ADS)
The present conference on the development status of communications systems in the context of electronic warfare gives attention to topics in spread spectrum code acquisition, digital speech technology, fiber-optics communications, free-space optical communications, the networking of HF systems, and applications and evaluation methods for digital speech. Also treated are issues in local area network system design, coding techniques and applications, technology applications for HF systems, receiver technologies, software development status, channel simulation/prediction methods, C3 networking, spread spectrum networks, the improvement of communication efficiency and reliability through technical control methods, mobile radio systems, and adaptive antenna arrays. Finally, communications system cost analyses, spread spectrum performance, voice and image coding, switched networks, and microwave GaAs ICs are considered.
Automation photometer of Hitachi U–2000 spectrophotometer with RS–232C–based computer
Kumar, K. Senthil; Lakshmi, B. S.; Pennathur, Gautam
1998-01-01
The interfacing of a commonly used spectrophotometer, the Hitachi U2000, through its RS-232C port to an IBM-compatible computer is described. The hardware for data acquisition was designed by suitably modifying readily available materials, and the software was written in the C programming language. The various steps involved in these procedures are elucidated in detail. The efficacy of the procedure was tested experimentally by running the visible spectrum of a cyanine dye. The spectrum was plotted through a printer hooked to the computer. The spectrum was also plotted by transforming the abscissa to the wavenumber scale; this was carried out using another module written in C. The efficiency of the whole set-up has been calculated using standard procedures. PMID:18924834
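The abscissa transformation performed by that additional module amounts to ν̃ = 10⁷/λ for λ in nm. A minimal Python rendering (the Gaussian band is an illustrative stand-in for the cyanine dye spectrum, not the paper's data):

import numpy as np

wavelength_nm = np.linspace(400, 800, 401)                    # visible-range scan
absorbance = np.exp(-((wavelength_nm - 605) / 30) ** 2)       # illustrative dye band
wavenumber_cm = 1.0e7 / wavelength_nm                         # nm -> cm^-1
order = np.argsort(wavenumber_cm)                             # axis reverses direction
spectrum_wn = np.column_stack([wavenumber_cm[order], absorbance[order]])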
NASA Technical Reports Server (NTRS)
Becker, D. D.
1980-01-01
The orbiter subsystems and interfacing program elements which interact with the orbiter computer flight software are analyzed. The failure modes identified in the subsystem/element failure mode and effects analysis are examined. Potential interaction with the software is examined through an evaluation of the software requirements. The analysis is restricted to flight software requirements and excludes utility/checkout software. The results of the hardware/software interaction analysis for the forward reaction control system are presented.
Determination of ²⁴¹Am in soil using an automated nuclear radiation measurement laboratory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Engstrom, D.E.; White, M.G.; Dunaway, P.B.
The recent completion of REECo's Automated Laboratory and associated software systems has provided a significant increase in capability while reducing manpower requirements. The system is designed to perform gamma spectrum analyses on the large numbers of samples required by the current Nevada Applied Ecology Group (NAEG) and Plutonium Distribution Inventory Program (PDIP) soil sampling programs while maintaining sufficient sensitivities as defined by earlier investigations of the same type. The hardware and systems are generally described in this paper, with emphasis being placed on spectrum reduction and the calibration procedures used for soil samples. (auth)
GlycReSoft: A Software Package for Automated Recognition of Glycans from LC/MS Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maxwell, Evan; Tan, Yan; Tan, Yuxiang
2012-09-26
Glycosylation modifies the physicochemical properties and protein binding functions of glycoconjugates. These modifications are biosynthesized in the endoplasmic reticulum and Golgi apparatus by a series of enzymatic transformations that are under complex control. As a result, mature glycans on a given site are heterogeneous mixtures of glycoforms. This gives rise to a spectrum of adhesive properties that strongly influences interactions with binding partners and resultant biological effects. In order to understand the roles glycosylation plays in normal and disease processes, efficient structural analysis tools are necessary. In the field of glycomics, liquid chromatography/mass spectrometry (LC/MS) is used to profile the glycans present in a given sample. This technology enables comparison of glycan compositions and abundances among different biological samples, i.e. normal versus disease, normal versus mutant, etc. Manual analysis of the glycan profiling LC/MS data is extremely time-consuming and efficient software tools are needed to eliminate this bottleneck. In this work, we have developed a tool to computationally model LC/MS data to enable efficient profiling of glycans. Using LC/MS data deconvoluted by Decon2LS/DeconTools, we built a list of unique neutral masses corresponding to candidate glycan compositions summarized over their various charge states, adducts and range of elution times. Our work aims to provide confident identification of true compounds in complex data sets that are not amenable to manual interpretation. This capability is an essential part of glycomics work flows. We demonstrate this tool, GlycReSoft, using an LC/MS dataset on tissue derived heparan sulfate oligosaccharides. The software, code and a test data set are publically archived under an open source license.
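A minimal sketch of the grouping step described above, under the simplifying assumption that features arrive as (neutral mass, elution time, abundance) tuples already deconvoluted by Decon2LS/DeconTools; GlycReSoft's actual scoring and aggregation are more elaborate.

import numpy as np

def group_neutral_masses(features, ppm_tol=10.0):
    """features: iterable of (neutral_mass, elution_time, abundance) tuples."""
    groups = []
    for mass, rt, ab in sorted(features):
        if groups and abs(mass - groups[-1]["mass"]) / groups[-1]["mass"] * 1e6 <= ppm_tol:
            g = groups[-1]                       # same candidate composition
            g["abundance"] += ab                 # summarize over charge states/adducts
            g["times"].append(rt)                # track the elution-time range
            g["mass"] = (g["mass"] + mass) / 2.0 # running estimate of the group mass
        else:
            groups.append({"mass": mass, "abundance": ab, "times": [rt]})
    return groups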
Real-time classification of signals from three-component seismic sensors using neural nets
NASA Astrophysics Data System (ADS)
Bowman, B. C.; Dowla, F.
1992-05-01
Adaptive seismic data acquisition systems with capabilities of signal discrimination and event classification are important in treaty monitoring, proliferation, and earthquake early-detection systems. Potential applications include monitoring underground chemical explosions, as well as other military, cultural, and natural activities where the characteristics of signals change rapidly and without warning. In these applications, the ability to detect and interpret events rapidly without falling behind the influx of data is critical. We developed a system for real-time data acquisition, analysis, learning, and classification of recorded events employing some of the latest technology in computer hardware, software, and artificial neural network methods. The system trains dynamically and updates its knowledge based on new data. The software is modular and hardware-independent; i.e., the front-end instrumentation is transparent to the analysis system. The software is designed to take advantage of the multiprocessing environment of the Unix operating system. The Unix System V shared memory and static RAM protocols for data access and the semaphore mechanism for interprocess communications were used. As the three-component sensor detects a seismic signal, it is displayed graphically on a color monitor using X11/Xlib graphics with interactive screening capabilities. For interesting events, the triaxial signal polarization is computed, a fast Fourier transform (FFT) algorithm is applied, and the normalized power spectrum is transmitted to a backpropagation neural network for event classification. The system is currently capable of handling three data channels with a sampling rate of 500 Hz, which covers the bandwidth of most seismic events. The system has been tested in a laboratory setting with artificial events generated in the vicinity of a three-component sensor.
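A minimal sketch of the classification chain described above, with NumPy for the FFT-based features and scikit-learn's MLPClassifier standing in for the original backpropagation network; the window length and class labels are illustrative placeholders for the real-time stream.

import numpy as np
from sklearn.neural_network import MLPClassifier

def power_spectrum_features(window):
    """window: (n_samples, 3) array from the three-component sensor."""
    spec = np.abs(np.fft.rfft(window, axis=0)) ** 2   # per-channel power spectrum
    feats = spec.ravel()
    return feats / feats.sum()                        # normalized power spectrum

# X: stacked feature vectors from screened events; y: analyst-assigned classes
# (random placeholders here, standing in for recorded events and labels).
X = np.vstack([power_spectrum_features(np.random.randn(500, 3)) for _ in range(40)])
y = np.random.randint(0, 3, 40)
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000).fit(X, y)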
NASA Astrophysics Data System (ADS)
Li, Gang; Xu, Jiayun; Bai, Lixin
2017-03-01
Metal films are widely used in inertial confinement fusion (ICF) experiments to obtain the radiation opacity, and the accuracy of the measured results depends mainly on the accuracy of the film thickness and its uniformity. The traditionally used measuring methods all have disadvantages: the optical and stylus methods cannot provide the mass thickness, which reflects the internal density distribution of the films, and the weighing method cannot provide the uniformity of the thickness distribution. This paper describes a new method that combines the α-particle energy loss (AEL) method with successive scanning measurements to obtain both the film thickness and its uniformity. The measuring system was partly installed in a vacuum chamber, and the relationship between chamber pressure and the energy loss caused by residual air in the chamber was studied for source-to-detector distances ranging from 1 to 5 cm. The results show that the chamber pressure should be less than 10 Pa for the present measuring system. During measurement, the energy spectrum of α-particles transmitted through each measuring point was obtained and recorded automatically by self-developed multichannel analysis software. At the same time, the central channel numbers of the spectra (CH) were saved in a text document. To automate data processing and represent the thickness uniformity visually in a 3D plot, a software package was developed to convert the CH values into film thickness and thickness uniformity. The results obtained in this paper make film thickness and uniformity measurements more accurate and efficient in ICF experiments.
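A hedged sketch of the CH-to-thickness conversion, assuming a linear channel-to-energy calibration and an approximately constant stopping power over the relevant energy range; all constants and the file name are illustrative, not the paper's values.

import numpy as np

a, b = 0.004, 0.05            # MeV/channel and offset from an energy calibration (assumed)
E0 = 5.486                    # MeV, e.g. the 241Am alpha line (assumed source)
stopping = 0.9                # MeV per (mg/cm^2), assumed roughly constant

ch = np.loadtxt("scan_points.txt")        # CH value recorded at each scanned point
energy = a * ch + b                       # channel -> transmitted alpha energy
thickness = (E0 - energy) / stopping      # energy loss -> mass thickness (mg/cm^2)
uniformity = thickness.std() / thickness.mean()   # relative thickness nonuniformity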
NASA Astrophysics Data System (ADS)
Lotfy, Hayam Mahmoud; Omran, Yasmin Rostom
2018-07-01
A novel, simple, rapid, accurate, and economical spectrophotometric method, namely absorptivity centering (a-Centering), has been developed and validated for the simultaneous determination of mixtures with partially and completely overlapping spectra in different matrices, using either a normalized or a factorized spectrum and built-in spectrophotometer software, without the need for a specially purchased program. Mixture I (Mix I), composed of Simvastatin (SM) and Ezetimibe (EZ) and formulated as tablets, is the one with partially overlapping spectra, while mixture II (Mix II), formed by Chloramphenicol (CPL) and Prednisolone acetate (PA) and formulated as eye drops, is the one with completely overlapping spectra. These procedures do not require any separation steps. Resolution of the spectrally overlapping binary mixtures has been achieved by recovering the zero-order (D0) spectrum of each drug; absorbance was then recorded at the maxima 238, 233.5, 273 and 242.5 nm for SM, EZ, CPL and PA, respectively. Calibration graphs were established with good correlation coefficients. The method shows significant advantages such as simplicity and minimal data manipulation, besides maximum reproducibility and robustness. Moreover, it was validated according to ICH guidelines. Selectivity was tested using laboratory-prepared mixtures. Accuracy, precision and repeatability were found to be within the acceptable limits. The proposed method is suitable for the assay of these drugs in their combined formulations without any interference from excipients. The obtained results were statistically compared with those of the reported and official methods by applying the t-test and F-test at the 95% confidence level, concluding that there is no significant difference with regard to accuracy and precision. Generally, this method could be used successfully for routine quality control testing.
Normand, A C; Becker, P; Gabriel, F; Cassagne, C; Accoceberry, I; Gari-Toussaint, M; Hasseine, L; De Geyter, D; Pierard, D; Surmont, I; Djenad, F; Donnadieu, J L; Piarroux, M; Ranque, S; Hendrickx, M; Piarroux, R
2017-09-01
Matrix-assisted laser desorption ionization-time of flight (MALDI-TOF) mass spectrometry has emerged as a reliable technique to identify molds involved in human diseases, including dermatophytes, provided that exhaustive reference databases are available. This study assessed an online identification application based on original algorithms and an extensive in-house reference database comprising 11,851 spectra (938 fungal species and 246 fungal genera). Validation criteria were established using an initial panel of 422 molds, including dermatophytes, previously identified via DNA sequencing (126 species). The application was further assessed using a separate panel of 501 cultured clinical isolates (88 mold taxa including dermatophytes) derived from five hospital laboratories. A total of 438 (87.35%) isolates were correctly identified at the species level, while 26 (5.22%) were assigned to the correct genus but the wrong species and 37 (7.43%) were not identified, since the defined threshold of 20 was not reached. The use of the Bruker Daltonics database included in the MALDI Biotyper software resulted in a much higher rate of unidentified isolates (39.76 and 74.30% using the score thresholds 1.7 and 2.0, respectively). Moreover, the identification delay of the online application remained compatible with real-time online queries (0.15 s per spectrum), and the application was faster than identifications using the MALDI Biotyper software. This is the first study to assess an online identification system based on MALDI-TOF spectrum analysis. We have successfully applied this approach to identify molds, including dermatophytes, for which diversity is insufficiently represented in commercial databases. This free-access application is available to medical mycologists to improve fungal identification. Copyright © 2017 American Society for Microbiology.
The Venus Balloon Project telemetry processing
NASA Technical Reports Server (NTRS)
Urech, J. M.; Chamarro, A.; Morales, J. L.; Urech, M. A.
1986-01-01
The peculiarities of the Venus Balloon telemetry system required the development of a new methodology for telemetry processing, since the capabilities of the Deep Space Network (DSN) telemetry system do not include burst processing of short frames with two different bit rates and first-bit acquisition. A software package was produced for the non-real-time detection, demodulation, and decoding of the telemetry streams obtained from an open-loop recording utilizing the DSN spectrum processing subsystem-radio science (DSP-RS). A general description of the resulting software package (DMO-5539-SP) and its adaptability to the real mission's variations is provided.
NASA Astrophysics Data System (ADS)
Dickens, J. K.; Hill, N. W.; Hou, F. S.; McConnell, J. W.; Spencer, R. R.; Tsang, F. Y.
1985-08-01
A system for making diagnostic measurements of the energy spectra of greater than or equal to 0.8-MeV neutrons produced during plasma operations of the Princeton Tokamak Fusion Test Reactor (TFTR) has been fabricated and tested and is presently in operation in the TFTR Test Cell Basement. The system consists of two separate detectors, each made up of cells containing liquid NE-213 scintillator attached permanently to RCA-8850 photomultiplier tubes. Pulses obtained from each photomultiplier system are amplified and electronically analyzed to identify and separate those pulses due to neutron-induced events in the detector from those due to photon-induced events. Signals from each detector are routed to two separate Analog-to-Digital Converters, and the resulting digitized information, representing (1) the raw neutron-spectrum data and (2) the raw photon-spectrum data, is transmitted to the CICADA data-acquisition computer system of the TFTR. Software programs have been installed on the CICADA system to analyze the raw data and provide moderate-resolution recreations of the energy spectrum of the neutron and photon fluences incident on the detector during the operation of the TFTR. A complete description of the hardware and software, as well as their operation, is given in this report.
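The neutron/photon separation is done in analog electronics in this system, but the underlying charge-comparison idea behind NE-213 pulse-shape discrimination can be sketched digitally; gate lengths and the threshold below are illustrative assumptions.

import numpy as np

def psd_ratio(pulse, t_peak, short_gate=20, long_gate=200):
    """pulse: digitized waveform samples; returns tail-to-total charge ratio."""
    total = pulse[t_peak:t_peak + long_gate].sum()
    tail = pulse[t_peak + short_gate:t_peak + long_gate].sum()
    return tail / total

def is_neutron(pulse, t_peak, threshold=0.25):
    # neutron-induced scintillation carries a larger slow (tail) component,
    # so events above the PSD threshold are routed to the neutron-spectrum ADC
    return psd_ratio(pulse, t_peak) > threshold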
Rudnick, Paul A.; Markey, Sanford P.; Roth, Jeri; Mirokhin, Yuri; Yan, Xinjian; Tchekhovskoi, Dmitrii V.; Edwards, Nathan J.; Thangudu, Ratna R.; Ketchum, Karen A.; Kinsinger, Christopher R.; Mesri, Mehdi; Rodriguez, Henry; Stein, Stephen E.
2016-01-01
The Clinical Proteomic Tumor Analysis Consortium (CPTAC) has produced large proteomics datasets from the mass spectrometric interrogation of tumor samples previously analyzed by The Cancer Genome Atlas (TCGA) program. The availability of the genomic and proteomic data is enabling proteogenomic study for both reference (i.e., contained in major sequence databases) and non-reference markers of cancer. The CPTAC labs have focused on colon, breast, and ovarian tissues in the first round of analyses; spectra from these datasets were produced from 2D LC-MS/MS analyses and represent deep coverage. To reduce the variability introduced by disparate data analysis platforms (e.g., software packages, versions, parameters, sequence databases, etc.), the CPTAC Common Data Analysis Platform (CDAP) was created. The CDAP produces both peptide-spectrum-match (PSM) reports and gene-level reports. The pipeline processes raw mass spectrometry data according to the following: (1) Peak-picking and quantitative data extraction, (2) database searching, (3) gene-based protein parsimony, and (4) false discovery rate (FDR)-based filtering. The pipeline also produces localization scores for the phosphopeptide enrichment studies using the PhosphoRS program. Quantitative information for each of the datasets is specific to the sample processing, with PSM and protein reports containing the spectrum-level or gene-level (“rolled-up”) precursor peak areas and spectral counts for label-free or reporter ion log-ratios for 4plex iTRAQ™. The reports are available in simple tab-delimited formats and, for the PSM-reports, in mzIdentML. The goal of the CDAP is to provide standard, uniform reports for all of the CPTAC data, enabling comparisons between different samples and cancer types as well as across the major ‘omics fields. PMID:26860878
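A minimal sketch of the FDR-based filtering in step (4), assuming the standard target-decoy estimate in which decoy matches stand in for false identifications; the CDAP's exact implementation details are not specified here, and the 1% cutoff is an illustrative choice.

import numpy as np

def fdr_filter(scores, is_decoy, fdr=0.01):
    """Keep PSMs above the smallest score threshold with estimated FDR <= fdr."""
    scores = np.asarray(scores)
    is_decoy = np.asarray(is_decoy, bool)
    order = np.argsort(-scores)                      # best scores first
    decoys = np.cumsum(is_decoy[order])              # running decoy count
    targets = np.cumsum(~is_decoy[order])            # running target count
    est_fdr = decoys / np.maximum(targets, 1)        # decoy-based FDR estimate
    ok = np.where(est_fdr <= fdr)[0]
    return order[:ok.max() + 1] if ok.size else order[:0]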
GUIDEseq: a bioconductor package to analyze GUIDE-Seq datasets for CRISPR-Cas nucleases.
Zhu, Lihua Julie; Lawrence, Michael; Gupta, Ankit; Pagès, Hervé; Kucukural, Alper; Garber, Manuel; Wolfe, Scot A
2017-05-15
Genome editing technologies developed around the CRISPR-Cas9 nuclease system have facilitated the investigation of a broad range of biological questions. These nucleases also hold tremendous promise for treating a variety of genetic disorders. In the context of their therapeutic application, it is important to identify the spectrum of genomic sequences that are cleaved by a candidate nuclease when programmed with a particular guide RNA, as well as the cleavage efficiency of these sites. Powerful new experimental approaches, such as GUIDE-seq, facilitate the sensitive, unbiased genome-wide detection of nuclease cleavage sites within the genome. Flexible bioinformatics analysis tools for processing GUIDE-seq data are needed. Here, we describe an open-source, open-development software suite, GUIDEseq, for GUIDE-seq data analysis and annotation, available as a Bioconductor package in R. The GUIDEseq package provides a flexible platform with more than 60 adjustable parameters for the analysis of datasets associated with custom nuclease applications. These parameters allow data analysis to be tailored to different nuclease platforms with different lengths and complexities in their guide and PAM recognition sequences or their DNA cleavage positions. They also enable users to customize sequence aggregation criteria and to vary peak-calling thresholds that can influence the number of potential off-target sites recovered. GUIDEseq also annotates potential off-target sites that overlap with genes based on genome annotation information, as these may be the most important off-target sites for further characterization. In addition, GUIDEseq enables the comparison and visualization of off-target site overlap between different datasets for a rapid comparison of different nuclease configurations or experimental conditions. For each identified off-target, the GUIDEseq package outputs the mapped GUIDE-seq read count as well as a cleavage score from a user-specified off-target cleavage score prediction algorithm, permitting the identification of genomic sequences with unexpected cleavage activity. The GUIDEseq package enables analysis of GUIDE-seq data from various nuclease platforms for any species with a defined genomic sequence. This software package has been used successfully to analyze several GUIDE-seq datasets. The software, source code and documentation are freely available at http://www.bioconductor.org/packages/release/bioc/html/GUIDEseq.html .
Four applications of a software data collection and analysis methodology
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Selby, Richard W., Jr.
1985-01-01
The evaluation of software technologies suffers because of the lack of quantitative assessment of their effect on software development and modification. A seven-step data collection and analysis methodology couples software technology evaluation with software measurement. Four in-depth applications of the methodology are presented. The four studies represent each of the general categories of analyses on the software product and development process: blocked subject-project studies, replicated project studies, multi-project variation studies, and single project strategies. The four applications are in the areas of, respectively, software testing, cleanroom software development, characteristic software metric sets, and software error analysis.
NASA Astrophysics Data System (ADS)
Sathya, K.; Dhamodharan, P.; Dhandapani, M.
2018-05-01
A molecular complex, 1H-benzo[d][1,2,3]triazol-3-ium-3,5-dinitrobenzoate (BTDB), was synthesized, crystallized, and characterized by CHN analysis and 1H and 13C NMR spectral studies. The crystal is transparent in the entire visible region, as evidenced by its UV-Vis-NIR spectrum. TG/DTA analysis shows that BTDB is stable up to 150 °C. Single-crystal XRD analysis was carried out to ascertain the molecular structure; BTDB crystallizes in the monoclinic system with space group P21/n. Computational studies that include optimization of the molecular geometry, natural bond orbital (NBO) analysis, Mulliken population analysis, and HOMO-LUMO analysis were performed using the Gaussian 09 software with the B3LYP method at the 6-311G(d,p) level. Hirshfeld surfaces and 2D fingerprint plots revealed that O⋯H, H⋯H and O⋯C interactions are the most prevalent. The first-order hyperpolarizability (β) of BTDB is 44 times greater than that of urea. The results show that BTDB may be used for various opto-electronic applications.
The Project LITE Spectrum Explorer
NASA Astrophysics Data System (ADS)
Brecher, K.; Carr, P.; Garik, P.; Weeks, E.
2002-12-01
We are developing a powerful new software tool which can help students at all levels understand the spectral properties of light. As a recent AAS survey of astronomy faculty members found (The Physics Teacher, 39, 52, 2001), essentially all introductory astronomy courses spend a significant amount of time dealing with the nature of light. Among the most difficult concepts for students to master are Kirchhoff's laws, blackbody radiation, the Stefan-Boltzmann law, Wien's law, the nature and causes of emission and absorption lines, and the relation of spectra to the underlying astronomical and physical processes producing them. Students often seem baffled by the connection between a spectrum seen visually as a color band and the same spectrum plotted graphically as intensity versus wavelength or frequency. The "Spectrum Explorer", a JAVA applet, is being developed as part of "Project LITE: Light Inquiry Through Experiments" to address these issues. It can be used by instructors in lecture presentations and by students learning at home or working in laboratory settings. We will show some of the current capabilities of the software which include simultaneous display of multiple spectra (normalized and non-normalized as a function of either wavelength or frequency) and the ability to manipulate blackbody spectra. Our future development plans include the addition of a variety of spectral data sets (from physics and chemistry as well as from astronomy); computed inputs from basic quantum mechanics (e.g. Zeeman effect in hydrogen) and from astronomical models (e.g. time varying spectra in binary stars); and the ability to test the effect of filters and physical processes (e.g. Rayleigh scattering) on input spectra. The Spectrum Explorer (along with many other applets about both the physical and perceptual nature of light) can be found on the Project LITE web site http://lite.bu.edu. Project LITE is supported by Grant #DUE-0125992 from the National Science Foundation Division of Undergraduate Education.
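The blackbody manipulations the Spectrum Explorer supports rest on Planck's law; a minimal Python check, in which the numerically located peak is compared against Wien's displacement law (grid resolution limits the peak estimate):

import numpy as np

h, c, k = 6.626e-34, 2.998e8, 1.381e-23          # SI constants

def planck(wl_m, T):
    """Spectral radiance B(lambda, T) in W sr^-1 m^-3."""
    return (2 * h * c**2 / wl_m**5) / np.expm1(h * c / (wl_m * k * T))

wl = np.linspace(100e-9, 3e-6, 2000)
for T in (3000.0, 6000.0):
    peak = wl[np.argmax(planck(wl, T))]
    print(f"T = {T:.0f} K: peak near {peak*1e9:.0f} nm "
          f"(Wien predicts {2.898e-3/T*1e9:.0f} nm)")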
Validation of software for calculating the likelihood ratio for parentage and kinship.
Drábek, J
2009-03-01
Although the likelihood ratio is a well-known statistical technique, commercial off-the-shelf (COTS) software products for its calculation are not sufficiently validated to suit general requirements for the competence of testing and calibration laboratories (EN/ISO/IEC 17025:2005 norm) per se. The software in question can be considered critical, as it directly weighs the forensic evidence, allowing judges to decide on guilt or innocence or to identify persons or kin (e.g., in mass fatalities). For these reasons, accredited laboratories shall validate likelihood ratio software in accordance with the above norm. To validate software for calculating the likelihood ratio in parentage/kinship scenarios, I assessed available vendors, chose two programs (Paternity Index and familias) for testing, and finally validated them using tests derived from elaboration of the available guidelines in the fields of forensics, biomedicine, and software engineering. MS Excel calculations using known likelihood ratio formulas, or peer-reviewed results of difficult paternity cases, were used as references. Using seven testing cases, it was found that both programs satisfied the requirements for basic paternity cases. However, only a combination of the two software programs fulfills the criteria needed for our purpose across the whole spectrum of functions under validation, with the exception of providing algebraic formulas in cases of mutation and/or silent alleles.
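For orientation, the kind of closed-form reference such MS Excel checks would encode: the single-locus paternity index for a trio with obligate paternal allele A of population frequency p, under the standard no-mutation, no-silent-allele, Hardy-Weinberg assumptions (a sketch, not either tested program's algorithm).

def paternity_index(p, af_genotype):
    """af_genotype: alleged father's genotype at the locus, e.g. ('A', 'B')."""
    # X = P(paternal allele is A | AF is the father) = AF's transmission probability
    x = af_genotype.count("A") / 2.0
    # Y = P(paternal allele is A | random man) = population allele frequency p
    return x / p

print(paternity_index(0.1, ("A", "B")))   # heterozygous AF: PI = 0.5/0.1 = 5.0
print(paternity_index(0.1, ("A", "A")))   # homozygous AF:   PI = 1/0.1  = 10.0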
NASA Astrophysics Data System (ADS)
Weeks, E.; Brecher, K.; Carr, P.; Garik, P.
2003-12-01
Spectroscopy is one of the most important tools used by astronomers to disentangle information about the universe. However, it is one of the most challenging subjects in undergraduate astronomy courses. Among the most difficult concepts for students to master are Kirchhoff's laws, blackbody radiation, the Stefan-Boltzmann law, Wien's law, the nature and causes of emission and absorption lines, and the relation of spectra to the underlying astronomical and physical processes producing them. Students often seem baffled by the connection between a spectrum seen visually as a color band and the same spectrum plotted graphically as intensity versus wavelength or frequency. Project LITE (Light Inquiry Through Experiments) is a software, curriculum, and materials development project at Boston University. As part of the project, we are currently developing a suite of spectroscopic tools for astronomy education. We are also assessing their effectiveness in improving conceptual understanding of spectroscopic phenomena by astronomy students at the undergraduate level. The spectroscopy component of Project LITE includes take-home laboratory materials and experiments, which are integrated with web-based software. We have also developed a novel quantitative handheld binocular spectrometer (patent pending). Here we present an overview of the Project LITE homelab kits and curriculum, the Spectrum Explorer, and the Project LITE spectrometer. The homelab experiments and the Spectrum Explorer have been tested with students in a non-science majors introductory astronomy course as well as in a School of Education course for prospective elementary school science teachers. We present preliminary results of pre- and post-instruction surveys of student understanding of various spectral properties of light both from students who used the homelab activities and the Spectrum Explorer and those who did not. The Spectrum Explorer (along with many other applets about both the physical and perceptual nature of light) can be found at the Project LITE web site http://lite.bu.edu. Project LITE is supported by Grant #DUE-0125992 from the National Science Foundation Division of Undergraduate Education. E. W. is supported by a NASA Graduate Student Research Fellowship, NASA Grant number NGT5-50482.
Kastberger, G; Kranner, G
2000-02-01
Viscovery SOMine is a software tool for advanced analysis and monitoring of numerical data sets. It was developed for professional use in business, industry, and science and to support dependency analysis, deviation detection, unsupervised clustering, nonlinear regression, data association, pattern recognition, and animated monitoring. Based on the concept of self-organizing maps (SOMs), it employs a robust variant of unsupervised neural networks--namely, Kohonen's Batch-SOM, which is further enhanced with a new scaling technique for speeding up the learning process. This tool provides a powerful means by which to analyze complex data sets without prior statistical knowledge. The data representation contained in the trained SOM is systematically converted to be used in a spectrum of visualization techniques, such as evaluating dependencies between components, investigating geometric properties of the data distribution, searching for clusters, or monitoring new data. We have used this software tool to analyze and visualize multiple influences of the ocellar system on free-flight behavior in giant honeybees. Occlusion of ocelli will affect orienting reactivities in relation to flight target, level of disturbance, and position of the bee in the flight chamber; it will induce phototaxis and make orienting imprecise and dependent on motivational settings. Ocelli permit the adjustment of orienting strategies to environmental demands by enforcing abilities such as centering or flight kinetics and by providing independent control of posture and flight course.
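A minimal NumPy sketch of the batch-SOM variant named above (without Viscovery's proprietary scaling speed-up); the grid size and neighborhood schedule are illustrative choices.

import numpy as np

def batch_som(X, grid=(8, 8), iters=20, sigma0=2.0):
    """X: (n_samples, n_features). Returns the trained codebook on the grid."""
    rows, cols = grid
    coords = np.array([(i, j) for i in range(rows) for j in range(cols)], float)
    rng = np.random.default_rng(0)
    W = X[rng.choice(len(X), rows * cols)]           # init codebook from the data
    for t in range(iters):
        sigma = sigma0 * (0.1 / sigma0) ** (t / iters)   # shrinking neighborhood
        # best-matching unit for every sample
        bmu = np.argmin(((X[:, None, :] - W[None]) ** 2).sum(-1), axis=1)
        # neighborhood weight of each unit relative to each sample's BMU
        d2 = ((coords[:, None, :] - coords[bmu][None]) ** 2).sum(-1)
        H = np.exp(-d2 / (2 * sigma ** 2))
        # batch update: each codebook vector becomes a weighted mean of the data
        W = (H @ X) / np.maximum(H.sum(1, keepdims=True), 1e-12)
    return W.reshape(rows, cols, -1)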
NASA Astrophysics Data System (ADS)
Safi, A.; Campanella, B.; Grifoni, E.; Legnaioli, S.; Lorenzetti, G.; Pagnotta, S.; Poggialini, F.; Ripoll-Seguer, L.; Hidalgo, M.; Palleschi, V.
2018-06-01
The introduction of the multivariate calibration curve approach in Laser-Induced Breakdown Spectroscopy (LIBS) quantitative analysis has led to a general improvement of LIBS analytical performance, since a multivariate approach makes it possible to exploit the redundancy of elemental information typically present in a LIBS spectrum. Software packages implementing multivariate methods are available in the most widely used commercial and open-source analytical programs; in most cases, the multivariate algorithms are robust against noise and operate in unsupervised mode. The reverse of the coin of the availability and ease of use of such packages is the (perceived) difficulty in assessing the reliability of the results obtained, which often leads to the multivariate algorithms being considered 'black boxes' whose inner mechanism is supposed to remain hidden from the user. In this paper, we discuss the dangers of a 'black box' approach in LIBS multivariate analysis and how to overcome them using the chemical-physical knowledge that is at the base of any LIBS quantitative analysis.
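As a concrete instance of the kind of multivariate calibration under discussion, a partial least-squares sketch with scikit-learn; the spectra and concentrations are random placeholders, and PLS is one common choice rather than this paper's specific algorithm. The point of the paper is precisely that such a model should not be treated as a black box.

import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# X: (samples, channels) LIBS spectra of reference standards; y: concentrations
X = np.random.rand(30, 2048)                  # placeholder spectra
y = np.random.rand(30)                        # placeholder concentrations
pls = PLSRegression(n_components=5)
y_cv = cross_val_predict(pls, X, y, cv=5)     # cross-validated predictions
rmsecv = np.sqrt(np.mean((y - y_cv) ** 2))    # calibration quality metric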
Nouri, Mahtab; Hamidiaval, Shadi; Akbarzadeh Baghban, Alireza; Basafa, Mohammad; Fahim, Mohammad
2015-01-01
Cephalometric norms of McNamara analysis have been studied in various populations due to their optimal efficiency. Dolphin cephalometric software greatly facilitates this analysis for orthodontic measurements. However, Dolphin is very expensive and cannot be afforded by many clinicians in developing countries. A suitable alternative software program in Farsi/English would greatly help Farsi-speaking clinicians. The present study aimed to develop an affordable Iranian cephalometric analysis software program and compare it with Dolphin, the standard software available on the market for cephalometric analysis. In this diagnostic, descriptive study, 150 lateral cephalograms of normal-occlusion individuals were selected in Mashhad and Qazvin, two major cities of Iran mainly populated with Fars ethnicity, the main Iranian ethnic group. After tracing the cephalograms, the McNamara analysis standards were measured both with Dolphin and with the new software. The cephalometric software was designed using the Microsoft Visual C++ environment under Windows XP. Measurements made with the new software were compared with those of Dolphin software on both series of cephalograms. The validity and reliability were tested using the intraclass correlation coefficient. Calculations showed a very high correlation between the results of the Iranian cephalometric analysis software and Dolphin. This confirms the validity and optimal efficacy of the newly designed software (ICC 0.570-1.0). According to our results, the newly designed software has acceptable validity and reliability and can be used for orthodontic diagnosis, treatment planning, and assessment of treatment outcome.
Measurement of the ν_μ energy spectrum with IceCube-79
NASA Astrophysics Data System (ADS)
Aartsen, M. G.; Ackermann, M.; Adams, J.; Aguilar, J. A.; Ahlers, M.; Ahrens, M.; Al Samarai, I.; Altmann, D.; Andeen, K.; Anderson, T.; Ansseau, I.; Anton, G.; Archinger, M.; Argüelles, C.; Auffenberg, J.; Axani, S.; Bagherpour, H.; Bai, X.; Barwick, S. W.; Baum, V.; Bay, R.; Beatty, J. J.; Becker Tjus, J.; Becker, K.-H.; BenZvi, S.; Berley, D.; Bernardini, E.; Besson, D. Z.; Binder, G.; Bindig, D.; Blaufuss, E.; Blot, S.; Bohm, C.; Börner, M.; Bos, F.; Bose, D.; Böser, S.; Botner, O.; Bradascio, F.; Braun, J.; Brayeur, L.; Bretz, H.-P.; Bron, S.; Burgman, A.; Carver, T.; Casier, M.; Cheung, E.; Chirkin, D.; Christov, A.; Clark, K.; Classen, L.; Coenders, S.; Collin, G. H.; Conrad, J. M.; Cowen, D. F.; Cross, R.; Day, M.; de André, J. P. A. M.; De Clercq, C.; del Pino Rosendo, E.; Dembinski, H.; De Ridder, S.; Desiati, P.; de Vries, K. D.; de Wasseige, G.; de With, M.; DeYoung, T.; Díaz-Vélez, J. C.; di Lorenzo, V.; Dujmovic, H.; Dumm, J. P.; Dunkman, M.; Eberhardt, B.; Ehrhardt, T.; Eichmann, B.; Eller, P.; Euler, S.; Evenson, P. A.; Fahey, S.; Fazely, A. R.; Feintzeig, J.; Felde, J.; Filimonov, K.; Finley, C.; Flis, S.; Fösig, C.-C.; Franckowiak, A.; Friedman, E.; Fuchs, T.; Gaisser, T. K.; Gallagher, J.; Gerhardt, L.; Ghorbani, K.; Giang, W.; Gladstone, L.; Glauch, T.; Glüsenkamp, T.; Goldschmidt, A.; Gonzalez, J. G.; Grant, D.; Griffith, Z.; Haack, C.; Hallgren, A.; Halzen, F.; Hansen, E.; Hansmann, T.; Hanson, K.; Hebecker, D.; Heereman, D.; Helbing, K.; Hellauer, R.; Hickford, S.; Hignight, J.; Hill, G. C.; Hoffman, K. D.; Hoffmann, R.; Hoshina, K.; Huang, F.; Huber, M.; Hultqvist, K.; In, S.; Ishihara, A.; Jacobi, E.; Japaridze, G. S.; Jeong, M.; Jero, K.; Jones, B. J. P.; Kang, W.; Kappes, A.; Karg, T.; Karle, A.; Katz, U.; Kauer, M.; Keivani, A.; Kelley, J. L.; Kheirandish, A.; Kim, J.; Kim, M.; Kintscher, T.; Kiryluk, J.; Kittler, T.; Klein, S. R.; Kohnen, G.; Koirala, R.; Kolanoski, H.; Konietz, R.; Köpke, L.; Kopper, C.; Kopper, S.; Koskinen, D. J.; Kowalski, M.; Krings, K.; Kroll, M.; Krückl, G.; Krüger, C.; Kunnen, J.; Kunwar, S.; Kurahashi, N.; Kuwabara, T.; Kyriacou, A.; Labare, M.; Lanfranchi, J. L.; Larson, M. J.; Lauber, F.; Lennarz, D.; Lesiak-Bzdak, M.; Leuermann, M.; Lu, L.; Lünemann, J.; Madsen, J.; Maggi, G.; Mahn, K. B. M.; Mancina, S.; Maruyama, R.; Mase, K.; Maunu, R.; McNally, F.; Meagher, K.; Medici, M.; Meier, M.; Menne, T.; Merino, G.; Meures, T.; Miarecki, S.; Micallef, J.; Momenté, G.; Montaruli, T.; Moulai, M.; Nahnhauer, R.; Naumann, U.; Neer, G.; Niederhausen, H.; Nowicki, S. C.; Nygren, D. R.; Obertacke Pollmann, A.; Olivas, A.; O'Murchadha, A.; Palczewski, T.; Pandya, H.; Pankova, D. V.; Peiffer, P.; Penek, Ö.; Pepper, J. A.; Pérez de los Heros, C.; Pieloth, D.; Pinat, E.; Price, P. B.; Przybylski, G. T.; Quinnan, M.; Raab, C.; Rädel, L.; Rameez, M.; Rawlins, K.; Reimann, R.; Relethford, B.; Relich, M.; Resconi, E.; Rhode, W.; Richman, M.; Riedel, B.; Robertson, S.; Rongen, M.; Rott, C.; Ruhe, T.; Ryckbosch, D.; Rysewyk, D.; Sabbatini, L.; Sanchez Herrera, S. E.; Sandrock, A.; Sandroos, J.; Sarkar, S.; Satalecka, K.; Schlunder, P.; Schmidt, T.; Schoenen, S.; Schöneberg, S.; Schumacher, L.; Seckel, D.; Seunarine, S.; Soldin, D.; Song, M.; Spiczak, G. M.; Spiering, C.; Stachurska, J.; Stanev, T.; Stasik, A.; Stettner, J.; Steuer, A.; Stezelberger, T.; Stokstad, R. G.; Stößl, A.; Ström, R.; Strotjohann, N. L.; Sullivan, G. W.; Sutherland, M.; Taavola, H.; Taboada, I.; Tatar, J.; Tenholt, F.; Ter-Antonyan, S.; Terliuk, A.; Tešić, G.; Tilav, S.; Toale, P. 
A.; Tobin, M. N.; Toscano, S.; Tosi, D.; Tselengidou, M.; Tung, C. F.; Turcati, A.; Unger, E.; Usner, M.; Vandenbroucke, J.; van Eijndhoven, N.; Vanheule, S.; van Rossem, M.; van Santen, J.; Vehring, M.; Voge, M.; Vogel, E.; Vraeghe, M.; Walck, C.; Wallace, A.; Wallraff, M.; Wandkowsky, N.; Waza, A.; Weaver, Ch.; Weiss, M. J.; Wendt, C.; Westerhoff, S.; Whelan, B. J.; Wickmann, S.; Wiebe, K.; Wiebusch, C. H.; Wille, L.; Williams, D. R.; Wills, L.; Wolf, M.; Wood, T. R.; Woolsey, E.; Woschnagg, K.; Xu, D. L.; Xu, X. W.; Xu, Y.; Yanez, J. P.; Yodh, G.; Yoshida, S.; Zoll, M.
2017-10-01
IceCube is a neutrino observatory deployed in the glacial ice at the geographic South Pole. The ν_μ energy unfolding described in this paper is based on data taken with IceCube in its 79-string configuration. A sample of muon neutrino charged-current interactions with a purity of 99.5% was selected by means of a multivariate classification process based on machine learning. The subsequent unfolding was performed using the software Truee. The resulting spectrum covers an E_ν range of more than four orders of magnitude, from 125 GeV to 3.2 PeV. Compared to the Honda atmospheric neutrino flux model, the energy spectrum shows an excess of more than 1.9σ in four adjacent bins for neutrino energies E_ν ≥ 177.8 TeV. The obtained spectrum is fully compatible with previous measurements of the atmospheric neutrino flux and recent IceCube measurements of a flux of high-energy astrophysical neutrinos.
A spectrum fractal feature classification algorithm for agriculture crops with hyper spectrum image
NASA Astrophysics Data System (ADS)
Su, Junying
2011-11-01
A fractal dimension feature analysis method in the spectrum domain is proposed for agricultural crop classification from hyperspectral images. First, a fractal dimension calculation algorithm in the spectrum domain is presented, together with a fast fractal dimension calculation based on the step measurement method. Second, the hyperspectral image classification algorithm and its flowchart, based on fractal dimension feature analysis in the spectrum domain, are presented. Finally, agricultural crop classification experiments on the FCL1 hyperspectral image set compare the proposed method with SAM (spectral angle mapper). The experimental results show that the proposed method obtains better classification results than traditional SAM feature analysis because it makes fuller use of the spectral information in the hyperspectral image for precision agricultural crop classification.
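For readers unfamiliar with the step measurement method mentioned above: it estimates the fractal dimension D of a curve from how the measured curve length shrinks as the step size grows, L(s) ∝ s^(1-D). A minimal sketch applied to a single pixel's spectrum (the function name and synthetic data are illustrative, not the paper's implementation):

```python
import numpy as np

def spectrum_fractal_dimension(spectrum, steps=(1, 2, 4, 8, 16)):
    """Estimate the fractal dimension of a 1-D spectrum curve with the
    step (divider) measurement method: curve length L(s) ~ s**(1 - D)."""
    lengths = []
    for s in steps:
        pts = np.asarray(spectrum[::s], dtype=float)  # resample every s bands
        seg = np.hypot(s, np.diff(pts))               # segment lengths along the curve
        lengths.append(seg.sum())
    slope = np.polyfit(np.log(steps), np.log(lengths), 1)[0]
    return 1.0 - slope                                # D from the log-log slope

rng = np.random.default_rng(0)
pixel = np.cumsum(rng.normal(size=128))               # toy 128-band spectrum
print(spectrum_fractal_dimension(pixel))              # ~1.5 for a random-walk curve
```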
2015-10-26
platforms and are quickly using up available spectrum. The national need in the commercial sector with emerging technologies such as 5G is pushing for...recovered and post processed later. The Front End Server also sends selected data stream across a high speed network link to the centralized
DOT National Transportation Integrated Search
2015-08-04
In 2012, the Navy requested spectrum certification for the shipboard AN/UPX-41(C) Digital Interrogator System, Software Version 5.5 with Mode 5. Current operating conditions for the Navy's AN/UPX-41(C) are the same as restrictions imposed on the ...
DOT National Transportation Integrated Search
2015-10-31
In 2012, the Navy requested spectrum certification for the shipboard AN/UPX-41(C) Digital Interrogator System, Software Version 5.5 with Mode 5. Current operating conditions for the Navy's AN/UPX-41(C) are the same as restrictions imposed on the AN...
Using Tablet Applications for Children with Autism to Increase Their Cognitive and Social Skills
ERIC Educational Resources Information Center
Esposito, Marco; Sloan, Janette; Tancredi, Andrea; Gerardi, Giovanna; Postiglione, Paola; Fotia, Francesca; Napoli, Eleonora; Mazzone, Luigi; Valeri, Giovanni; Vicari, Stefano
2017-01-01
Several researchers along with technicians have been developing software and hardware to support and/or replace the standard method of teaching for children with autism spectrum disorders (ASDs) and/or other developmental disabilities. Moreover, computer-based intervention and electronic tablets have shown benefits for people with special needs…
ERIC Educational Resources Information Center
Cabielles-Hernandez, David; Pérez-Pérez, Juan-Ramón; Paule-Ruiz, MPuerto; Fernández-Fernández, Samuel
2017-01-01
New possibilities offered by mobile devices for special education students have led to the design of skill-acquisition software applications. Advances in mobile technology have made progress possible in helping teachers with the modelling and evaluation of students with autism. The theoretical basis of "Chain of Words" is the autism…
Simple and Efficient Technique for Spatial/Temporal Composite Imagery
2007-08-01
visible spectrum between 412 nm and 869 nm, three bands at 500 m and two bands at 250 m. The MODIS data was processed using the Automated Processing System2...Version 3.6 developed by the Naval Research Laboratory (NRL). The Automated Processing System (APS) is a collection of software programs assembled
Comment on Technology-Based Intervention Research for Individuals on the Autism Spectrum
ERIC Educational Resources Information Center
McCleery, Joseph P.
2015-01-01
The purpose of this letter to the editor is to comment on several review papers recently published in the current "Journal of Autism and Developmental Disorders, Special Issue on Technology: Software, Robotics, and Translational Science." These reviews address a variety of aspects relating to technology-aided intervention and instruction…
Reflection Effects in Multimode Fiber Systems Utilizing Laser Transmitters
NASA Technical Reports Server (NTRS)
Bates, Harry E.
1991-01-01
A number of optical communication lines are now in use at NASA-Kennedy for the transmission of voice, computer data, and video signals. At present, all of these channels use a single carrier wavelength centered near 1300 or 1550 nm. Engineering tests in the past have given indications of the growth of systematic and random noise in the RF spectrum of a fiber network as the number of connector pairs is increased. This noise seems to occur when a laser transmitter is used instead of an LED. It has been suggested that the noise is caused by back reflections created at connector-fiber interfaces. Experiments were performed to explore the effect of reflection on the transmitting laser under conditions of reflective feedback. This effort included computer integration of some of the instrumentation in the fiber optic lab using the LabVIEW software recently acquired by the lab group. The main goal was to interface the Anritsu optical and RF spectrum analyzers to the Macintosh II computer so that laser spectra and network RF spectra could be simultaneously and rapidly acquired in a form convenient for analysis. Both single and multimode fiber is installed at Kennedy. Since most of it is multimode, this effort concentrated on multimode systems.
NASA Astrophysics Data System (ADS)
Hansen, Christopher S.; Kirk, Benjamin B.; Blanksby, Stephen J.; O'Hair, Richard. A. J.; Trevitt, Adam J.
2013-06-01
UV-vis photodissociation action spectroscopy is becoming increasingly prevalent because of advances in, and commercial availability of, ion trapping technologies and tunable laser sources. This study outlines in detail an instrumental arrangement, combining a commercial ion-trap mass spectrometer and tunable nanosecond pulsed laser source, for performing fully automated photodissociation action spectroscopy on gas-phase ions. The components of the instrumentation are outlined, including the optical and electronic interfacing, in addition to the control software for automating the experiment and performing online analysis of the spectra. To demonstrate the utility of this ensemble, the photodissociation action spectra of 4-chloroanilinium, 4-bromoanilinium, and 4-iodoanilinium cations are presented and discussed. Multiple photoproducts are detected in each case and the photoproduct yields are followed as a function of laser wavelength. It is shown that the wavelength-dependent partitioning of the halide loss, H loss, and NH3 loss channels can be broadly rationalized in terms of the relative carbon-halide bond dissociation energies and processes of energy redistribution. The photodissociation action spectrum of (phenyl)Ag2 + is compared with a literature spectrum as a further benchmark.
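As a rough illustration of the online analysis step, the action spectrum is essentially the photofragment yield at each laser wavelength, normalized for pulse energy. A hedged sketch (array names and the normalization choice are assumptions, not the authors' control software):

```python
import numpy as np

def action_spectrum(fragment_counts, precursor_counts, pulse_energy):
    """Photofragment yield per wavelength step, normalized to pulse energy."""
    total = fragment_counts + precursor_counts
    return (fragment_counts / np.maximum(total, 1.0)) / pulse_energy

wavelengths = np.arange(300.0, 401.0, 5.0)                    # nm scan points
frag = 900.0 * np.exp(-((wavelengths - 350.0) / 15.0) ** 2)   # toy fragment ion counts
prec = 1000.0 - frag                                          # surviving precursor ions
spectrum = action_spectrum(frag, prec, pulse_energy=1.0)
print(wavelengths[np.argmax(spectrum)])                       # band maximum, ~350 nm
```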
Observation of proton chorus waves close to the equatorial plane by Cluster
NASA Astrophysics Data System (ADS)
Grison, B.; Pickett, J. S.; Santolik, O.; Robert, P.; Cornilleau-Wehrlin, N.; Engebretson, M. J.; Constantinescu, D. O.
2009-12-01
Whistler mode chorus waves are a widely studied phenomenon. They are present in numerous regions of the magnetosphere and are presumed to originate in the magnetic equatorial region. In a spectrogram they are characterized by narrowband features with rises (or falls) in frequency over short periods of time. Being whistler mode waves at around a few tenths of the electron cyclotron frequency, they interact mainly with electrons. In the present study we report observations by the Cluster spacecraft of what we call proton chorus waves. They have spectral features with rising frequency, similar to electron chorus waves, but they are detected in a frequency range that extends from roughly 0.5 fH+ up to fH+ (the local proton gyrofrequency). The lower part of their spectrum seems to originate from monochromatic Pc 1 waves (1.5 Hz). Proton chorus waves are detected close to the magnetic equatorial plane in both hemispheres during the same event. Our interpretation of these waves as proton chorus is supported by polarization analysis with the Roproc procedures and the Prassadco software using both the magnetic (STAFF-SC) and electric (EFW) parts of the fluctuation spectrum.
Xi, Jia-Fu; Tang, Lei; Zhang, Jian-Hua; Zhang, Hong-Jian; Chen, Xu-Sheng; Mao, Zhong-Gui
2014-11-01
Circular dichroism (CD) is a special absorption spectrum. The secondary structures of proteins, such as α-helix, β-sheet and β-turn, have characteristic CD spectra in the far-ultraviolet region (190-250 nm). In order to understand the activity and structural changes of ascorbate peroxidase from Chinese kale (BaAPX) during denaturation, the specific activity and percentages of secondary structure of BaAPX under different times, temperatures and concentrations were analyzed dynamically by CD. In addition, the percentages of the four secondary structures in BaAPX were calculated with the CD analysis software Dichroweb. The results show that BaAPX is an all-α-type enzyme whose specific activity is positively related to the percentage of α-helix. During denaturation of BaAPX, three kinds of structural changes were proposed: a one-step structural change from the initial state (N state) to the minimum state of α-helix (R state) under low concentration and low temperature; a one-step structural change from the N state to an equilibrium state (T state) under high concentration and low temperature; and a two-step structural change from the N state through the R state to the final T state under heat treatment and low-temperature renaturation.
Reflection effects in multimode fiber systems utilizing laser transmitters
NASA Astrophysics Data System (ADS)
Bates, Harry E.
1991-11-01
A number of optical communication lines are now in use at NASA-Kennedy for the transmission of voice, computer data, and video signals. At present, all of these channels use a single carrier wavelength centered near 1300 or 1550 nm. Engineering tests in the past have given indications of the growth of systematic and random noise in the RF spectrum of a fiber network as the number of connector pairs is increased. This noise seems to occur when a laser transmitter is used instead of an LED. It has been suggested that the noise is caused by back reflections created at connector-fiber interfaces. Experiments were performed to explore the effect of reflection on the transmitting laser under conditions of reflective feedback. This effort included computer integration of some of the instrumentation in the fiber optic lab using the LabVIEW software recently acquired by the lab group. The main goal was to interface the Anritsu optical and RF spectrum analyzers to the Macintosh II computer so that laser spectra and network RF spectra could be simultaneously and rapidly acquired in a form convenient for analysis. Both single and multimode fiber is installed at Kennedy. Since most of it is multimode, this effort concentrated on multimode systems.
Debugging and Performance Analysis Software Tools for Peregrine System
High-Performance Computing | NREL
Learn about debugging and performance analysis software tools available to use with the Peregrine system. Allinea…
Avila, Javier; Sostmann, Kai; Breckwoldt, Jan; Peters, Harm
2016-06-03
Electronic portfolios (ePortfolios) are used to document and support learning activities. E-portfolios with mobile capabilities allow even more flexibility. However, the development or acquisition of ePortfolio software is often costly, and at the same time, commercially available systems may not sufficiently fit the institution's needs. The aim of this study was to design and evaluate an ePortfolio system with mobile capabilities using a free and open-source software solution. We created an online ePortfolio environment with the blogging software WordPress, selected on the basis of reported capability features of such software using a qualitative weight-and-sum method. Technical implementation and usability were evaluated by 25 medical students during their clinical training by quantitative and qualitative means using online questionnaires and focus groups. The WordPress ePortfolio environment allowed students a broad spectrum of activities - often documented via mobile devices - such as collection of multimedia evidence, posting reflections, messaging, web publishing, ePortfolio searches, collaborative learning, knowledge management in a content management system including a wiki and RSS feeds, and the use of aid tools for studying. The students' experience with WordPress revealed a few technical problems, and this report provides workarounds. The WordPress ePortfolio was rated positively by the students as a content management system (67% of the students), for exchange with other students (74%), as a notepad for reflections (53%) and for its potential as an information source for assessment (48%) and exchange with a mentor (68%). On the negative side, 74% of the students in this pilot study did not find it easy to get started with the system, and 63% rated the ePortfolio as not being user-friendly. Qualitative analysis indicated a need for more introductory information and training. It is possible to build an advanced ePortfolio system with mobile capabilities with the free and open-source software WordPress. This allows institutions without proprietary software to build a sophisticated ePortfolio system adapted to their needs with relatively few resources. The implementation of WordPress should be accompanied by introductory courses in the use of the software and its apps in order to facilitate its usability.
Kelstrup, Christian D.; Frese, Christian; Heck, Albert J. R.; Olsen, Jesper V.; Nielsen, Michael L.
2014-01-01
Unambiguous identification of tandem mass spectra is a cornerstone in mass-spectrometry-based proteomics. As the study of post-translational modifications (PTMs) by means of shotgun proteomics progresses in depth and coverage, the ability to correctly identify PTM-bearing peptides is essential, increasing the demand for advanced data interpretation. Several PTMs are known to generate unique fragment ions during tandem mass spectrometry, the so-called diagnostic ions, which unequivocally identify a given mass spectrum as related to a specific PTM. Although such ions offer tremendous analytical advantages, algorithms to decipher MS/MS spectra for the presence of diagnostic ions in an unbiased manner are currently lacking. Here, we present a systematic spectral-pattern-based approach for the discovery of diagnostic ions and new fragmentation mechanisms in shotgun proteomics datasets. The developed software tool is designed to analyze large sets of high-resolution peptide fragmentation spectra independent of the fragmentation method, instrument type, or protease employed. To benchmark the software tool, we analyzed large higher-energy collisional activation dissociation datasets of samples containing phosphorylation, ubiquitylation, SUMOylation, formylation, and lysine acetylation. Using the developed software tool, we were able to identify known diagnostic ions by comparing histograms of modified and unmodified peptide spectra. Because the investigated tandem mass spectra data were acquired with high mass accuracy, unambiguous interpretation and determination of the chemical composition for the majority of detected fragment ions was feasible. Collectively we present a freely available software tool that allows for comprehensive and automatic analysis of analogous product ions in tandem mass spectra and systematic mapping of fragmentation mechanisms related to common amino acids. PMID:24895383
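The histogram comparison at the heart of this approach can be reduced to a few lines: bin all fragment m/z values from modified and unmodified spectra and flag bins strongly enriched in the modified set. The sketch below is an illustrative simplification, not the published tool; the bin width, enrichment ratio, and count threshold are assumptions:

```python
import numpy as np

def candidate_diagnostic_ions(mod_frags, unmod_frags, bin_width=0.01, min_ratio=5.0):
    """Histogram fragment m/z values from modified and unmodified spectra and
    return bin centers strongly enriched in the modified set."""
    lo = min(mod_frags.min(), unmod_frags.min())
    hi = max(mod_frags.max(), unmod_frags.max())
    edges = np.arange(lo, hi + bin_width, bin_width)
    h_mod, _ = np.histogram(mod_frags, bins=edges)
    h_un, _ = np.histogram(unmod_frags, bins=edges)
    # relative frequencies with pseudocounts to avoid division by zero
    ratio = (h_mod / len(mod_frags) + 1e-9) / (h_un / len(unmod_frags) + 1e-9)
    centers = edges[:-1] + bin_width / 2
    return centers[(ratio > min_ratio) & (h_mod > 10)]

rng = np.random.default_rng(1)
unmod = rng.uniform(100, 1000, 50000)                      # background fragments
mod = np.concatenate([rng.uniform(100, 1000, 45000),
                      rng.normal(216.042, 0.002, 5000)])   # enriched ion, e.g. the
print(candidate_diagnostic_ions(mod, unmod))               # phosphotyrosine immonium ion
```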
Lyerla, R; Gouws, E; García-Calleja, J M; Zaniewski, E
2006-06-01
This paper describes improvements and updates to an established approach to making epidemiological estimates of HIV prevalence in countries with low level and concentrated epidemics. The structure of the software used to make estimates is briefly described, with particular attention to changes and improvements. The approach focuses on identifying populations which, through their behaviour, are at high risk of infection with HIV or who are exposed through the risk behaviour of their sexual partners. Estimates of size and HIV prevalence of these populations allow the total number of HIV infected people in a country or region to be estimated. Major changes in the software focus on the move away from short term projections and towards developing an epidemiological curve that more accurately represents the change in prevalence of HIV over time. The software continues to provide an output file for use in the Spectrum software so as to estimate the demographic impact of HIV infection at country level.
A proposed classification scheme for Ada-based software products
NASA Technical Reports Server (NTRS)
Cernosek, Gary J.
1986-01-01
As the requirements for producing software in the Ada language become a reality for projects such as the Space Station, a great amount of Ada-based program code will begin to emerge. Recognizing the potential for varying levels of quality in Ada programs, what is needed is a classification scheme that describes the quality of a software product whose source code exists in Ada form. A 5-level classification scheme is proposed that attempts to decompose the potentially broad spectrum of quality which Ada programs may possess. The number of classes and their corresponding names are not as important as the mere fact that there needs to be some set of criteria with which to evaluate programs existing in Ada. An exact criterion for each class is not presented, nor are any detailed suggestions of how to effectively implement this quality assessment. The idea of Ada-based software classification is introduced and a set of requirements on which to base further research and development is suggested.
2015-01-01
We report the implementation of high-quality signal processing algorithms into ProteoWizard, an efficient, open-source software package designed for analyzing proteomics tandem mass spectrometry data. Specifically, a new wavelet-based peak-picker (CantWaiT) and a precursor charge determination algorithm (Turbocharger) have been implemented. These additions into ProteoWizard provide universal tools that are independent of vendor platform for tandem mass spectrometry analyses and have particular utility for intralaboratory studies requiring the advantages of different platforms convergent on a particular workflow or for interlaboratory investigations spanning multiple platforms. We compared results from these tools to those obtained using vendor and commercial software, finding that in all cases our algorithms resulted in a comparable number of identified peptides for simple and complex samples measured on Waters, Agilent, and AB SCIEX quadrupole time-of-flight and Thermo Q-Exactive mass spectrometers. The mass accuracy of matched precursor ions also compared favorably with vendor and commercial tools. Additionally, typical analysis runtimes (∼1–100 ms per MS/MS spectrum) were short enough to enable the practical use of these high-quality signal processing tools for large clinical and research data sets. PMID:25411686
Sample, Paul J.; Gaston, Kirk W.; Alfonzo, Juan D.; Limbach, Patrick A.
2015-01-01
Ribosomal ribonucleic acid (RNA), transfer RNA and other biological or synthetic RNA polymers can contain nucleotides that have been modified by the addition of chemical groups. Traditional Sanger sequencing methods cannot establish the chemical nature and sequence of these modified-nucleotide-containing oligomers. Mass spectrometry (MS) has become the conventional approach for determining the nucleotide composition, modification status and sequence of modified RNAs. Modified RNAs are analyzed by MS using collision-induced dissociation tandem mass spectrometry (CID MS/MS), which produces a complex dataset of oligomeric fragments that must be interpreted to identify and place modified nucleosides within the RNA sequence. Here we report the development of RoboOligo, an interactive software program for the robust analysis of data generated by CID MS/MS of RNA oligomers. There are three main functions of RoboOligo: (i) automated de novo sequencing via the local search paradigm; (ii) manual sequencing with real-time spectrum labeling and cumulative intensity scoring; and (iii) a hybrid approach, coined 'variable sequencing', which combines the user intuition of manual sequencing with the high-throughput sampling of automated de novo sequencing. PMID:25820423
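The cumulative intensity scoring can be pictured with a toy mass-ladder match: compute prefix masses for a candidate sequence and sum the intensities of spectrum peaks that match within tolerance. The residue masses below are standard RNA monoisotopic values; the scoring itself is a simplified stand-in for RoboOligo's actual fragment chemistry (real CID spectra contain c/y/w and a-B ions with end-group corrections):

```python
import numpy as np

RESIDUE = {"A": 329.05252, "C": 305.04129, "G": 345.04744, "U": 306.02530}

def ladder_score(seq, peaks, intensities, tol=0.02):
    """Sum intensities of peaks matching the cumulative 5' fragment ladder."""
    masses = np.cumsum([RESIDUE[n] for n in seq])   # simplified prefix masses
    score = 0.0
    for m in masses:
        hit = np.abs(peaks - m) < tol               # peaks within mass tolerance
        score += intensities[hit].sum()
    return score

peaks = np.array([329.05, 634.10, 979.14])          # toy MS/MS peak list
inten = np.array([100.0, 80.0, 60.0])
print(ladder_score("ACG", peaks, inten))            # matches all three rungs: 240.0
```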
French, William R; Zimmerman, Lisa J; Schilling, Birgit; Gibson, Bradford W; Miller, Christine A; Townsend, R Reid; Sherrod, Stacy D; Goodwin, Cody R; McLean, John A; Tabb, David L
2015-02-06
We report the implementation of high-quality signal processing algorithms into ProteoWizard, an efficient, open-source software package designed for analyzing proteomics tandem mass spectrometry data. Specifically, a new wavelet-based peak-picker (CantWaiT) and a precursor charge determination algorithm (Turbocharger) have been implemented. These additions into ProteoWizard provide universal tools that are independent of vendor platform for tandem mass spectrometry analyses and have particular utility for intralaboratory studies requiring the advantages of different platforms convergent on a particular workflow or for interlaboratory investigations spanning multiple platforms. We compared results from these tools to those obtained using vendor and commercial software, finding that in all cases our algorithms resulted in a comparable number of identified peptides for simple and complex samples measured on Waters, Agilent, and AB SCIEX quadrupole time-of-flight and Thermo Q-Exactive mass spectrometers. The mass accuracy of matched precursor ions also compared favorably with vendor and commercial tools. Additionally, typical analysis runtimes (∼1-100 ms per MS/MS spectrum) were short enough to enable the practical use of these high-quality signal processing tools for large clinical and research data sets.
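Generic continuous-wavelet-transform peak picking of the kind CantWaiT performs is available in SciPy; a minimal sketch on a synthetic profile spectrum (this uses SciPy's stock CWT picker, not the ProteoWizard code):

```python
import numpy as np
from scipy.signal import find_peaks_cwt

# synthetic profile spectrum: two peaks plus noise
mz = np.linspace(400.0, 410.0, 2000)
rng = np.random.default_rng(1)
signal = (np.exp(-((mz - 402.5) / 0.02) ** 2) +
          0.5 * np.exp(-((mz - 407.1) / 0.02) ** 2) +
          rng.normal(scale=0.02, size=mz.size))

# CWT peak picking: search ridge lines across a range of peak widths (in samples)
idx = find_peaks_cwt(signal, widths=np.arange(2, 20))
print(mz[idx])  # detected peak m/z values, near 402.5 and 407.1
```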
NASA Astrophysics Data System (ADS)
Sabchevski, S.; Idehara, T.; Damyanova, M.; Zhelyazkov, I.; Balabanova, E.; Vasileva, E.
2018-03-01
Gyrotrons are the most powerful sources of CW coherent radiation in the sub-THz and THz frequency bands. In recent years, they have demonstrated a remarkable potential for bridging the so-called THz-gap in the electromagnetic spectrum and opened the road to many novel applications of the terahertz waves. Among them are various advanced spectroscopic techniques (e.g., ESR and DNP-NMR), plasma physics and fusion research, materials processing and characterization, imaging and inspection, new medical technologies and biological studies. In this paper, we review briefly the current status of the research in this broad field and present our problem-oriented software packages developed recently for numerical analysis, computer-aided design (CAD) and optimization of gyrotrons.
Portable gas chromatograph-mass spectrometer
Andresen, B.D.; Eckels, J.D.; Kimmons, J.F.; Myers, D.W.
1996-06-11
A gas chromatograph-mass spectrometer (GC-MS) is described for use as a field portable organic chemical analysis instrument. The GC-MS is designed to be contained in a standard size suitcase, weighs less than 70 pounds, and requires less than 600 watts of electrical power at peak power (all systems on). The GC-MS includes: a conduction heated, forced air cooled small bore capillary gas chromatograph, a small injector assembly, a self-contained ion/sorption pump vacuum system, a hydrogen supply, a dual computer system used to control the hardware and acquire spectrum data, and operational software used to control the pumping system and the gas chromatograph. This instrument incorporates a modified commercial quadrupole mass spectrometer to achieve the instrument sensitivity and mass resolution characteristic of laboratory bench top units. 4 figs.
NASA Astrophysics Data System (ADS)
Evans, Ben; Allen, Chris; Antony, Joseph; Bastrakova, Irina; Gohar, Kashif; Porter, David; Pugh, Tim; Santana, Fabiana; Smillie, Jon; Trenham, Claire; Wang, Jingbo; Wyborn, Lesley
2015-04-01
The National Computational Infrastructure (NCI) has established a powerful and flexible in-situ petascale computational environment to enable both high performance computing and Data-intensive Science across a wide spectrum of national environmental and earth science data collections - in particular climate, observational data and geoscientific assets. This paper examines 1) the computational environments that support the modelling and data processing pipelines, 2) the analysis environments and methods to support data analysis, and 3) the progress so far to harmonise the underlying data collections for future interdisciplinary research across these large-volume data collections. NCI has established 10+ PBytes of major national and international data collections from both the government and research sectors based on six themes: 1) weather, climate, and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology, and 6) astronomy, social and biosciences. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. The data is largely sourced from NCI's partners (which include the custodians of many of the major Australian national-scale scientific collections), leading research communities, and collaborating overseas organisations. New infrastructures created at NCI mean the data collections are now accessible within an integrated High Performance Computing and Data (HPC-HPD) environment - a 1.2 PFlop supercomputer (Raijin), an HPC-class 3000-core OpenStack cloud system and several highly connected large-scale high-bandwidth Lustre filesystems. The hardware was designed at inception to ensure that it would allow the layered software environment to flexibly accommodate the advancement of future data science. New approaches to software technology and data models have also had to be developed to enable access to these large and exponentially increasing data volumes at NCI. Traditional HPC and data environments are still made available in a way that flexibly provides the tools, services and supporting software systems on these new petascale infrastructures. But to enable the research to take place at this scale, the data, metadata and software now need to evolve together - creating a new integrated high performance infrastructure. The new infrastructure at NCI currently supports a catalogue of integrated, reusable software and workflows from earth system and ecosystem modelling, weather research, satellite and other observed data processing and analysis. One of the challenges for NCI has been to support existing techniques and methods, while carefully preparing the underlying infrastructure for the transition needed for the next class of Data-intensive Science. In doing so, a flexible range of techniques and software can be made available for application across the corpus of data collections available, and to provide a new infrastructure for future interdisciplinary research.
Evaluation of the BreastSimulator software platform for breast tomography
NASA Astrophysics Data System (ADS)
Mettivier, G.; Bliznakova, K.; Sechopoulos, I.; Boone, J. M.; Di Lillo, F.; Sarno, A.; Castriconi, R.; Russo, P.
2017-08-01
The aim of this work was the evaluation of BreastSimulator, a breast x-ray imaging simulation software package, as a tool for the creation of 3D uncompressed digital breast models and for the simulation and optimization of computed tomography (CT) scanners dedicated to the breast. Eight 3D digital breast phantoms were created with glandular fractions in the range 10%-35%. The models are characterised by different sizes and realistic modelled anatomical features. X-ray CT projections were simulated for a dedicated cone-beam CT scanner and reconstructed with the FDK algorithm. X-ray projection images were simulated for 5 mono-energetic beams (27, 32, 35, 43 and 51 keV) and 3 poly-energetic x-ray spectra typically employed in current CT scanners dedicated to the breast (49, 60, or 80 kVp). Clinical CT images acquired from two different clinical breast CT scanners were used for comparison purposes. The quantitative evaluation included calculation of the power-law exponent, β, from simulated and real breast tomograms, based on the power spectrum fitted with a function of the spatial frequency, f, of the form S(f) = α/f^β. The breast models were validated by comparison against clinical breast CT and published data. We found that the calculated β coefficients were close to those of clinical CT data from a dedicated breast CT scanner and to reported data in the literature. In evaluating the software package BreastSimulator as a generator of breast models suitable for use in breast CT imaging, we found that the phantoms produced with the software tool can reproduce the anatomical structure of real breasts, as evaluated by calculating the β exponent from the power spectral analysis of simulated images. As such, this research tool might contribute considerably to the further development, testing and optimisation of breast CT imaging techniques.
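The β exponent used for validation comes from fitting S(f) = α/f^β, which is a straight line in log-log space. A compact sketch for a 2-D image (the radial averaging and the fitted frequency band are illustrative choices, not the paper's exact procedure):

```python
import numpy as np

def beta_exponent(image, f_lo=0.05, f_hi=0.45):
    """Fit S(f) = alpha / f**beta to the image power spectrum in log-log space."""
    ps = np.abs(np.fft.fftshift(np.fft.fft2(image))) ** 2
    ny, nx = image.shape
    y, x = np.indices(ps.shape)
    r = np.hypot(x - nx // 2, y - ny // 2) / max(nx, ny)  # radial frequency, cycles/pixel
    band = (r > f_lo) & (r < f_hi)                         # exclude DC and corners
    slope, _ = np.polyfit(np.log(r[band]), np.log(ps[band]), 1)
    return -slope                                          # S ~ f**(-beta)

rng = np.random.default_rng(2)
img = rng.normal(size=(256, 256))      # white noise has a flat spectrum
print(beta_exponent(img))              # so beta should come out near 0
```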
Practices to enable the geophysical research spectrum: from fundamentals to applications
NASA Astrophysics Data System (ADS)
Kang, S.; Cockett, R.; Heagy, L. J.; Oldenburg, D.
2016-12-01
In a geophysical survey, a source injects energy into the earth and a response is measured. These physical systems are governed by partial differential equations and their numerical solutions are obtained by discretizing the earth. Geophysical simulations and inversions are tools for understanding physical responses and constructing models of the subsurface given a finite amount of data. SimPEG (http://simpeg.xyz) is our effort to synthesize geophysical forward and inverse methodologies into a consistent framework. The primary focus of our initial development has been on the electromagnetics (EM) package, with recent extensions to magnetotelluric, direct current (DC), and induced polarization. Across these methods, and applied geophysics in general, we require tools to explore and build an understanding of the physics (behaviour of fields, fluxes), and work with data to produce models through reproducible inversions. If we consider DC or EM experiments, with the aim of understanding responses from subsurface conductors, we require resources that provide multiple "entry points" into the geophysical problem. To understand the physical responses and measured data, we must simulate the physical system and visualize electric fields, currents, and charges. Performing an inversion requires that many moving pieces be brought together: simulation, physics, linear algebra, data processing, optimization, etc. Each component must be trusted, accessible to interrogation and manipulation, and readily combined in order to enable investigation into inversion methodologies. To support such research, we not only require "entry points" into the software, but also extensibility to new situations. In our development of SimPEG, we have sought to use leading practices in software development with the aim of supporting and promoting collaborations across a spectrum of geophysical research: from fundamentals to applications. Designing software to enable this spectrum puts unique constraints on both the architecture of the codebase as well as the development practices that are employed. In this presentation, we will share some lessons learned and, in particular, how our prioritization of testing, documentation, and refactoring has impacted our own research and fostered collaborations.
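To make the "many moving pieces" concrete, the simplest linear instance of the inverse problem such frameworks orchestrate minimizes a data misfit plus a regularization term. A bare-bones numpy version illustrating the structure only (this is not SimPEG's API; all names are invented for the sketch):

```python
import numpy as np

rng = np.random.default_rng(3)
n_data, n_model = 20, 50
G = rng.normal(size=(n_data, n_model))          # forward operator (the "simulation")
m_true = np.zeros(n_model); m_true[20:30] = 1.0 # blocky "conductor" model
d_obs = G @ m_true + 0.05 * rng.normal(size=n_data)

beta = 1.0                                      # misfit/regularization trade-off
W = np.eye(n_model)                             # smallness regularization
# minimize ||G m - d||^2 + beta ||W m||^2 via the normal equations
m_rec = np.linalg.solve(G.T @ G + beta * W.T @ W, G.T @ d_obs)
print(np.round(m_rec[18:32], 2))                # recovered block around indices 20-30
```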
StrAuto: automation and parallelization of STRUCTURE analysis.
Chhatre, Vikram E; Emerson, Kevin J
2017-03-24
Population structure inference using the software STRUCTURE has become an integral part of population genetic studies covering a broad spectrum of taxa, including humans. The ever-expanding size of genetic data sets poses computational challenges for this analysis. Although at least one tool currently implements parallel computing to reduce the computational overload of this analysis, it does not fully automate the use of replicate STRUCTURE runs required for downstream inference of the optimal K. There is a pressing need for a tool that can deploy population structure analysis on high performance computing clusters. We present an updated version of the popular Python program StrAuto to streamline population structure analysis using parallel computing. StrAuto implements a pipeline that combines STRUCTURE analysis with the Evanno ΔK analysis and visualization of results using STRUCTURE HARVESTER. Using benchmarking tests, we demonstrate that StrAuto significantly reduces the computational time needed to perform iterative STRUCTURE analysis by distributing runs over two or more processors. StrAuto is the first tool to integrate STRUCTURE analysis with post-processing using a pipeline approach, in addition to implementing parallel computation - a setup ideal for deployment on computing clusters. StrAuto is distributed under the GNU GPL (General Public License) and is available to download from http://strauto.popgen.org .
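The Evanno statistic computed from replicate runs is ΔK = |L(K+1) - 2L(K) + L(K-1)| / sd(L(K)), using the mean replicate log-likelihood at each K. A small sketch (the input layout is an assumption, not StrAuto's file format):

```python
import numpy as np

def evanno_delta_k(lnP):
    """lnP: dict K -> array of replicate ln P(D) values from STRUCTURE runs.
    Returns dict K -> Delta K (defined for interior K only)."""
    ks = sorted(lnP)
    mean = {k: np.mean(lnP[k]) for k in ks}
    sd = {k: np.std(lnP[k], ddof=1) for k in ks}
    return {k: abs(mean[k + 1] - 2 * mean[k] + mean[k - 1]) / sd[k]
            for k in ks[1:-1]}

runs = {1: np.array([-5200.0, -5210.0, -5195.0]),
        2: np.array([-4300.0, -4310.0, -4295.0]),
        3: np.array([-4250.0, -4290.0, -4260.0]),
        4: np.array([-4240.0, -4280.0, -4270.0])}
print(evanno_delta_k(runs))   # K = 2 dominates for this toy data
```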
An excitation wavelength-scanning spectral imaging system for preclinical imaging
NASA Astrophysics Data System (ADS)
Leavesley, Silas; Jiang, Yanan; Patsekin, Valery; Rajwa, Bartek; Robinson, J. Paul
2008-02-01
Small-animal fluorescence imaging is a rapidly growing field, driven by applications in cancer detection and pharmaceutical therapies. However, the practical use of this imaging technology is limited by image-quality issues related to autofluorescence background from animal tissues, as well as attenuation of the fluorescence signal due to scatter and absorption. To combat these problems, spectral imaging and analysis techniques are being employed to separate the fluorescence signal from background autofluorescence. To date, these technologies have focused on detecting the fluorescence emission spectrum at a fixed excitation wavelength. We present an alternative to this technique, an imaging spectrometer that detects the fluorescence excitation spectrum at a fixed emission wavelength. The advantages of this approach include increased available information for discrimination of fluorescent dyes, decreased optical radiation dose to the animal, and the ability to scan a continuous wavelength range instead of discrete wavelength sampling. This excitation-scanning imager utilizes an acousto-optic tunable filter (AOTF), with supporting optics, to scan the excitation spectrum. Advanced image acquisition and analysis software has also been developed for classification and unmixing of the spectral image sets. Filtering has been implemented in a single-pass configuration with a bandwidth (full width at half maximum) of 16 nm at a 550 nm central diffracted wavelength. We have characterized AOTF filtering over a wide range of incident light angles, much wider than has been previously reported in the literature, and we show how changes in incident light angle can be used to attenuate AOTF side lobes and alter bandwidth. A new parameter, the in-band to out-of-band ratio, was defined to assess the quality of the filtered excitation light. Additional parameters were measured to allow objective characterization of the AOTF and the imager as a whole. This is necessary for comparing the excitation-scanning imager to other spectral and fluorescence imaging technologies. The effectiveness of the hyperspectral imager was tested by imaging and analysis of mice with injected fluorescent dyes. Finally, a discussion of the optimization of spectral fluorescence imagers is given, relating the effects of filter quality on the fluorescence images collected and the analysis outcome.
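The in-band to out-of-band ratio defined above can be computed directly from a measured spectrum of the filtered excitation light. A sketch with assumed variable names and a toy side-lobe model:

```python
import numpy as np

def in_band_ratio(wavelengths, power, center, fwhm):
    """Ratio of filtered excitation power inside the AOTF passband to the
    power outside it (uniform grid assumed, so sums stand in for integrals)."""
    in_band = np.abs(wavelengths - center) <= fwhm / 2.0
    return power[in_band].sum() / power[~in_band].sum()

wl = np.linspace(500.0, 600.0, 1001)
main_peak = np.exp(-((wl - 550.0) / 8.0) ** 2)      # diffracted passband
leakage = 0.02 * np.sin((wl - 550.0) / 2.0) ** 2    # toy side-lobe light
print(in_band_ratio(wl, main_peak + leakage, center=550.0, fwhm=16.0))
```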
Optimization techniques applied to spectrum management for communications satellites
NASA Astrophysics Data System (ADS)
Ottey, H. R.; Sullivan, T. M.; Zusman, F. S.
This paper describes user requirements, algorithms and software design features for the application of optimization techniques to the management of the geostationary orbit/spectrum resource. Relevant problems include parameter sensitivity analyses, frequency and orbit position assignment coordination, and orbit position allotment planning. It is shown how integer and nonlinear programming as well as heuristic search techniques can be used to solve these problems. Formalized mathematical objective functions that define the problems are presented. Constraint functions that impart the necessary solution bounds are described. A versatile program structure is outlined, which would allow problems to be solved in stages while varying the problem space, solution resolution, objective function and constraints.
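In its simplest linear form, the orbit-position assignment problem described here, pairing networks with slots at minimum total cost, is solvable with the Hungarian algorithm. A toy sketch (the displacement-based cost is invented for illustration; the paper's formulations also cover integer and nonlinear cases):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(4)
n_networks, n_slots = 5, 8
slots = np.linspace(-20.0, 20.0, n_slots)        # candidate longitudes, deg E
desired = rng.uniform(-20, 20, n_networks)       # each network's preferred slot

# cost: displacement from the desired position (a stand-in for interference)
cost = np.abs(desired[:, None] - slots[None, :])
rows, cols = linear_sum_assignment(cost)         # optimal one-to-one assignment
for net, slot in zip(rows, cols):
    print(f"network {net}: {slots[slot]:+.1f} deg")
```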
NASA Astrophysics Data System (ADS)
Vorndran, Shelby D.; Wu, Yuechen; Ayala, Silvana; Kostuk, Raymond K.
2015-09-01
Concentrating and spectrum-splitting photovoltaic (PV) modules have a limited acceptance angle and thus suffer from optical loss under off-axis illumination. This loss manifests itself as a substantial reduction in energy yield in locations where a significant portion of insolation is diffuse. In this work, a spectrum-splitting PV system is designed to efficiently collect and convert light over a range of illumination conditions. The system uses a holographic lens to concentrate short-wavelength light onto a smaller, more expensive indium gallium phosphide (InGaP) PV cell. The high-efficiency PV cell near the axis is surrounded by silicon (Si), a less expensive material that collects a broader portion of the solar spectrum. Under direct illumination, the device achieves increased conversion efficiency from spectrum splitting. Under diffuse illumination, the device collects light with efficiency comparable to a flat-panel Si module. Design of the holographic lens is discussed. Optical efficiency and power output of the module under a range of illumination conditions from direct to diffuse are simulated with non-sequential raytracing software. Using direct and diffuse Typical Meteorological Year (TMY3) irradiance measurements, the annual energy yield of the module is calculated for several installation sites. Energy yield of the spectrum-splitting module is compared to that of a full flat-panel Si reference module.
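The annual-yield comparison reduces to weighting each module's efficiency under direct and diffuse light by the site's TMY3 irradiance totals. A hedged sketch of that bookkeeping (all efficiencies and irradiance numbers are placeholders, not the paper's results):

```python
def annual_yield(dni_kwh, dhi_kwh, eta_direct, eta_diffuse):
    """Annual energy per m^2 from annual direct (DNI) and diffuse (DHI) totals."""
    return dni_kwh * eta_direct + dhi_kwh * eta_diffuse

# placeholder site totals in kWh/m^2/yr; placeholder module efficiencies
split = annual_yield(2000, 500, eta_direct=0.32, eta_diffuse=0.18)
flat_si = annual_yield(2000, 500, eta_direct=0.20, eta_diffuse=0.20)
print(split, flat_si)   # spectrum splitting wins where direct light dominates
```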
Eghrari, Allen O.; Mumtaz, Aisha A.; Garrett, Brian; Rezaei, Mahsa; Akhavan, Mina S.; Riazuddin, S. Amer; Gottsch, John D.
2016-01-01
Purpose: Retroillumination photography analysis (RPA) is an objective tool for assessment of the number and distribution of guttae in eyes affected with Fuchs Corneal Dystrophy (FCD). Current protocols include manual processing of images; here we assess the validity and interrater reliability of automated analysis across various levels of FCD severity. Methods: Retroillumination photographs of 97 FCD-affected corneas were acquired, and total counts of guttae had previously been summated manually. For each cornea, a single image was loaded into ImageJ software. We reduced color variability and subtracted background noise. Reflection of light from each gutta was identified as a local area of maximum intensity and counted automatically. The noise tolerance level was titrated for each cornea by examining a small region of each image with an automated overlay to ensure appropriate coverage of individual guttae. We tested the interrater reliability of automated counts of guttae across a spectrum of clinical and educational experience. Results: A set of 97 retroillumination photographs was analyzed. Clinical severity as measured by a modified Krachmer scale ranged from a severity level of 1 to 5 in the set of analyzed corneas. Automated counts by an ophthalmologist correlated strongly with Krachmer grading (R2=0.79) and manual counts (R2=0.88). The intraclass correlation coefficient demonstrated strong correlation, at 0.924 (95% CI, 0.870-0.958) among cases analyzed by three students, and 0.869 (95% CI, 0.797-0.918) among cases for which images were analyzed by an ophthalmologist and two students. Conclusions: Automated RPA allows for grading of FCD severity with high resolution across a spectrum of disease severity. PMID:27811565
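The automated count (each gutta appearing as a local intensity maximum rising above a noise tolerance) can be mimicked generically with a maximum filter, in the spirit of ImageJ's "Find Maxima". This sketch is not the authors' exact ImageJ pipeline; the smoothing scales and tolerance are assumptions:

```python
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter

def count_local_maxima(image, noise_tolerance=10.0, size=5):
    """Count local intensity maxima that rise above the slowly varying
    background by more than the noise tolerance."""
    smooth = gaussian_filter(image.astype(float), sigma=1)   # suppress pixel noise
    background = gaussian_filter(smooth, sigma=20)           # illumination trend
    peaks = (smooth == maximum_filter(smooth, size=size)) & \
            (smooth - background > noise_tolerance)
    return int(peaks.sum())

rng = np.random.default_rng(5)
img = rng.normal(100.0, 3.0, size=(200, 200))                # noisy background
img[49:52, 59:62] += 40.0                                    # two bright "guttae"
img[119:122, 79:82] += 40.0
print(count_local_maxima(img))                               # expect 2
```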
Bulabula, Andre N H; Dramowski, Angela; Mehtar, Shaheen
2017-11-01
To summarize published studies on the prevalence of and risk factors for maternal bacterial colonization and/or infection with extended-spectrum beta-lactamase-producing Enterobacteriaceae (ESBL-E) in pregnant and/or post-partum women in Africa. A systematic review was conducted using the PubMed, Scopus, and Google Scholar databases. Bibliographies of included eligible studies were manually searched to identify additional relevant articles. No language restriction was applied. The timeframe of the search included all records from electronic database inception to July 15, 2017. A random-effects meta-analysis was performed to summarize the prevalence and the 95% confidence intervals (CI) of ESBL-E colonization or infection in pregnant or post-partum women in Africa. The meta-analysis was conducted using STATA IC 13.1 software and the metaprop function/plugin. Ten studies (seven on pregnant women and three on post-partum women) were included, documenting a 17% prevalence of maternal colonization with ESBL-E in Africa (95% CI 10-23%). The prevalence of ESBL-E in community isolates exceeded that in isolates from the hospital setting (22% vs. 14%). The most frequently reported ESBL-encoding gene was CTX-M (cefotaxime hydrolyzing capabilities). Data on risk factors for maternal ESBL-E colonization and infection are very limited. The prevalence of colonization and/or infection with ESBL-E in pregnant and post-partum women in Africa exceeds that reported from high- and middle-income settings, representing a risk for subsequent neonatal colonization and/or infection with ESBL-E. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.
Grinhut, Tzafrir; Lansky, Dedy; Gaspar, Andras; Hertkorn, Norbert; Schmitt-Kopplin, Philippe; Hadar, Yitzhak; Chen, Yona
2010-10-15
Natural organic matter (NOM) occurs as an extremely complex mixture of large, charged molecules that are formed by secondary synthesis reactions. Due to their nature, their full characterization is an important challenge to scientists specializing in NOM as well as in analytical chemistry. Ultra-high-resolution Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR MS) analysis enables the identification of thousands of masses in a single measurement. A major challenge in the data analysis of NOM using the FT-ICR MS technique is the need to sort the entire data set and to present it in an accessible form. Here we present a simple targeted algorithm, called the David Mass Sort (DMS) algorithm, which facilitates the detection and counting of consecutive series of masses correlated to any selected mass spacing. The program searches for specific mass differences by testing all of the masses in a single spectrum against all of the masses in the same spectrum. As a representative case, the current study focuses on the analysis of the well-characterized Suwannee River humic and fulvic acids (SRHA and SRFA, respectively). By applying this algorithm, we were able to find and assess the number of singly and doubly charged molecules. In addition, we present the capabilities of the program to detect any series of consecutive masses correlated to a specific mass spacing, e.g. COO, H2, OCH2 and O2. Under several limitations, these mass spacings may be correlated to both chemical and biochemical changes which occur simultaneously during the formation and/or degradation of large mixtures of compounds. Copyright © 2010 John Wiley & Sons, Ltd.
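The all-against-all search is a pairwise difference test: count peak pairs in one spectrum separated by a chosen spacing Δm within a tolerance. A minimal sketch using standard monoisotopic values for the spacings named above; the peak list is invented:

```python
import numpy as np

SPACINGS = {"COO": 43.98983, "H2": 2.01565, "OCH2": 30.01057, "O2": 31.98983}

def count_mass_spacings(masses, delta, tol=0.0005):
    """Count ordered peak pairs in one spectrum separated by a given spacing."""
    m = np.sort(np.asarray(masses))
    diffs = m[None, :] - m[:, None]          # all-against-all mass differences
    return int(np.sum(np.abs(diffs - delta) < tol))

peaks = [301.1409, 303.1565, 331.1514, 333.1307, 345.1307]  # toy FT-ICR peak list
for name, dm in SPACINGS.items():
    print(name, count_mass_spacings(peaks, dm))             # one hit per spacing here
```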
Computer-Aided Recognition of Facial Attributes for Fetal Alcohol Spectrum Disorders.
Valentine, Matthew; Bihm, Dustin C J; Wolf, Lior; Hoyme, H Eugene; May, Philip A; Buckley, David; Kalberg, Wendy; Abdul-Rahman, Omar A
2017-12-01
To compare the detection of facial attributes by computer-based facial recognition software of 2-D images against standard, manual examination in fetal alcohol spectrum disorders (FASD). Participants were gathered from the Fetal Alcohol Syndrome Epidemiology Research database. Standard frontal and oblique photographs of children were obtained during a manual, in-person dysmorphology assessment. Images were submitted for facial analysis conducted by the facial dysmorphology novel analysis technology (an automated system), which assesses ratios of measurements between various facial landmarks to determine the presence of dysmorphic features. Manual blinded dysmorphology assessments were compared with those obtained via the computer-aided system. Areas under the curve values for individual receiver-operating characteristic curves revealed the computer-aided system (0.88 ± 0.02) to be comparable to the manual method (0.86 ± 0.03) in detecting patients with FASD. Interestingly, cases of alcohol-related neurodevelopmental disorder (ARND) were identified more efficiently by the computer-aided system (0.84 ± 0.07) in comparison to the manual method (0.74 ± 0.04). A facial gestalt analysis of patients with ARND also identified more generalized facial findings compared to the cardinal facial features seen in more severe forms of FASD. We found there was an increased diagnostic accuracy for ARND via our computer-aided method. As this category has been historically difficult to diagnose, we believe our experiment demonstrates that facial dysmorphology novel analysis technology can potentially improve ARND diagnosis by introducing a standardized metric for recognizing FASD-associated facial anomalies. Earlier recognition of these patients will lead to earlier intervention with improved patient outcomes. Copyright © 2017 by the American Academy of Pediatrics.
Infusing Reliability Techniques into Software Safety Analysis
NASA Technical Reports Server (NTRS)
Shi, Ying
2015-01-01
Software safety analysis for a large software intensive system is always a challenge. Software safety practitioners need to ensure that software related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.
Dipy, a library for the analysis of diffusion MRI data.
Garyfallidis, Eleftherios; Brett, Matthew; Amirbekian, Bagrat; Rokem, Ariel; van der Walt, Stefan; Descoteaux, Maxime; Nimmo-Smith, Ian
2014-01-01
Diffusion Imaging in Python (Dipy) is a free and open source software project for the analysis of data from diffusion magnetic resonance imaging (dMRI) experiments. dMRI is an application of MRI that can be used to measure structural features of brain white matter. Many methods have been developed to use dMRI data to model the local configuration of white matter nerve fiber bundles and infer the trajectory of bundles connecting different parts of the brain. Dipy gathers implementations of many different methods in dMRI, including: diffusion signal pre-processing; reconstruction of diffusion distributions in individual voxels; fiber tractography and fiber track post-processing, analysis and visualization. Dipy aims to provide transparent implementations for all the different steps of dMRI analysis with a uniform programming interface. We have implemented classical signal reconstruction techniques, such as the diffusion tensor model and deterministic fiber tractography. In addition, cutting edge novel reconstruction techniques are implemented, such as constrained spherical deconvolution and diffusion spectrum imaging (DSI) with deconvolution, as well as methods for probabilistic tracking and original methods for tractography clustering. Many additional utility functions are provided to calculate various statistics, informative visualizations, as well as file-handling routines to assist in the development and use of novel techniques. In contrast to many other scientific software projects, Dipy is not being developed by a single research group. Rather, it is an open project that encourages contributions from any scientist/developer through GitHub and open discussions on the project mailing list. Consequently, Dipy today has an international team of contributors, spanning seven different academic institutions in five countries and three continents, which is still growing.
Dipy, a library for the analysis of diffusion MRI data
Garyfallidis, Eleftherios; Brett, Matthew; Amirbekian, Bagrat; Rokem, Ariel; van der Walt, Stefan; Descoteaux, Maxime; Nimmo-Smith, Ian
2014-01-01
Diffusion Imaging in Python (Dipy) is a free and open source software project for the analysis of data from diffusion magnetic resonance imaging (dMRI) experiments. dMRI is an application of MRI that can be used to measure structural features of brain white matter. Many methods have been developed to use dMRI data to model the local configuration of white matter nerve fiber bundles and infer the trajectory of bundles connecting different parts of the brain. Dipy gathers implementations of many different methods in dMRI, including: diffusion signal pre-processing; reconstruction of diffusion distributions in individual voxels; fiber tractography and fiber track post-processing, analysis and visualization. Dipy aims to provide transparent implementations for all the different steps of dMRI analysis with a uniform programming interface. We have implemented classical signal reconstruction techniques, such as the diffusion tensor model and deterministic fiber tractography. In addition, cutting edge novel reconstruction techniques are implemented, such as constrained spherical deconvolution and diffusion spectrum imaging (DSI) with deconvolution, as well as methods for probabilistic tracking and original methods for tractography clustering. Many additional utility functions are provided to calculate various statistics, informative visualizations, as well as file-handling routines to assist in the development and use of novel techniques. In contrast to many other scientific software projects, Dipy is not being developed by a single research group. Rather, it is an open project that encourages contributions from any scientist/developer through GitHub and open discussions on the project mailing list. Consequently, Dipy today has an international team of contributors, spanning seven different academic institutions in five countries and three continents, which is still growing. PMID:24600385
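For flavor, fitting the classical diffusion tensor model in Dipy takes only a few calls; the gradient-table and TensorModel interface shown is Dipy's documented API, but the toy acquisition and random data volume are placeholders:

```python
import numpy as np
from dipy.core.gradients import gradient_table
from dipy.reconst.dti import TensorModel

# toy acquisition: one b=0 volume plus six diffusion directions
bvals = np.array([0, 1000, 1000, 1000, 1000, 1000, 1000])
bvecs = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1],
                  [1, 1, 0], [1, 0, 1], [0, 1, 1]], dtype=float)
bvecs[1:] /= np.linalg.norm(bvecs[1:], axis=1)[:, None]   # unit gradient vectors
gtab = gradient_table(bvals, bvecs)

data = np.random.rand(4, 4, 4, 7) + 1.0    # stand-in for a real dMRI volume
fit = TensorModel(gtab).fit(data)
print(fit.fa.shape)                         # fractional anisotropy map, (4, 4, 4)
```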
Laser Induced Breakdown Spectroscopy of Glass and Crystal Samples
NASA Astrophysics Data System (ADS)
Sharma, Prakash; Sandoval, Alejandra; Carter, Michael; Kumar, Akshaya
2015-03-01
Different types of quartz crystals and rare-earth-ion-doped glasses have been identified using the laser-induced breakdown spectroscopy (LIBS) technique. LIBS is a real-time technique that can be used to identify samples in the solid, liquid and gas phases. The advantage of the LIBS technique is that no sample preparation is required and the laser causes extremely minimal damage to the sample surface. The LIBS spectrum of silicate glasses, prepared by the sol-gel method and doped with different concentrations of rare earth ions, has been recorded. The limit of detection of rare earth ions in the glass samples has been calculated. A total of 10 spectra of each sample were recorded and then averaged to obtain a final spectrum. An Ocean Optics LIBS2500plus spectrometer along with a Q-switched Nd:YAG laser (Quantel, Big Sky) was used to record the LIBS spectra. This spectrometer can analyze the sample in the spectral range of 200 nm to 980 nm. The spectra were processed with the OOILIBS-plus (v1.0) software. This study has applications in industry, where different crystals can be easily identified before they go for shaping and polishing. Also, the concentration of rare earth ions in glass can be monitored in real time for quality control.
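The limit of detection mentioned above is conventionally LOD = 3σ/s, with σ the standard deviation of the background and s the slope of the intensity-versus-concentration calibration curve. A sketch under that convention (all numbers are placeholders, not the study's measurements):

```python
import numpy as np

conc = np.array([0.5, 1.0, 2.0, 4.0])               # rare-earth dopant, wt%
intensity = np.array([120.0, 235.0, 480.0, 950.0])  # net emission-line intensity (a.u.)

slope, intercept = np.polyfit(conc, intensity, 1)   # calibration curve
sigma_bg = 8.0                                      # std. dev. of background signal
lod = 3.0 * sigma_bg / slope
print(f"LOD = {lod:.3f} wt%")
```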
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abrecht, David G.; Schwantes, Jon M.; Kukkadapu, Ravi K.
2015-02-01
Spectrum-processing software that incorporates a Gaussian smoothing kernel within the statistics of first-order Kalman filtration has been developed to provide cross-channel spectral noise reduction for increased real-time signal-to-noise ratios for Mossbauer spectroscopy. The filter was optimized for the breadth of the Gaussian using the Mossbauer spectrum of natural iron foil, and comparisons between the peak broadening, signal-to-noise ratios, and shifts in the calculated hyperfine parameters are presented. The results of optimization give a maximum improvement in the signal-to-noise ratio of 51.1% over the unfiltered spectrum at a Gaussian breadth of 27 channels, or 2.5% of the total spectrum width. The full-width half-maximum of the spectrum peaks showed an increase of 19.6% at this optimum point, indicating a relatively weak increase in the peak broadening relative to the signal enhancement, leading to an overall increase in the observable signal. Calculations of the hyperfine parameters showed that no statistically significant deviations were introduced by the application of the filter, confirming the utility of this filter for spectroscopy applications.
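The cross-channel part of the filter, a Gaussian kernel whose breadth trades noise reduction against peak broadening, can be tried in isolation with a one-line convolution. This sketch omits the Kalman stage entirely and treats the 27-channel breadth as a FWHM, which is an assumption:

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

rng = np.random.default_rng(6)
ch = np.arange(1024)
clean = 1.0e4 - 2.0e3 * np.exp(-((ch - 400) / 12.0) ** 2) \
              - 2.0e3 * np.exp(-((ch - 620) / 12.0) ** 2)   # two absorption dips
noisy = clean + rng.normal(scale=200.0, size=ch.size)       # counting-like noise

breadth = 27.0                                 # channels, the optimum quoted above
smoothed = gaussian_filter1d(noisy, sigma=breadth / 2.355)  # FWHM -> sigma
print(noisy[:300].std(), smoothed[:300].std())  # baseline noise before/after
```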
MASPECTRAS: a platform for management and analysis of proteomics LC-MS/MS data
Hartler, Jürgen; Thallinger, Gerhard G; Stocker, Gernot; Sturn, Alexander; Burkard, Thomas R; Körner, Erik; Rader, Robert; Schmidt, Andreas; Mechtler, Karl; Trajanoski, Zlatko
2007-01-01
Background: The advancements of proteomics technologies have led to a rapid increase in the number, size and rate at which datasets are generated. Managing and extracting valuable information from such datasets requires the use of data management platforms and computational approaches. Results: We have developed the MAss SPECTRometry Analysis System (MASPECTRAS), a platform for management and analysis of proteomics LC-MS/MS data. MASPECTRAS is based on the Proteome Experimental Data Repository (PEDRo) relational database schema and follows the guidelines of the Proteomics Standards Initiative (PSI). Analysis modules include: 1) import and parsing of the results from the search engines SEQUEST, Mascot, Spectrum Mill, X! Tandem, and OMSSA; 2) peptide validation; 3) clustering of proteins based on Markov Clustering and multiple alignments; and 4) quantification using the Automated Statistical Analysis of Protein Abundance Ratios algorithm (ASAPRatio). The system provides customizable data retrieval and visualization tools, as well as export to the PRoteomics IDEntifications public repository (PRIDE). MASPECTRAS is freely available at . Conclusion: Given the unique features and the flexibility due to the use of standard software technology, our platform represents a significant advance and could be of great interest to the proteomics community. PMID:17567892
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dickens, J.K.; Hill, N.W.; Hou, F.S.
1985-08-01
A system for making diagnostic measurements of the energy spectra of greater than or equal to 0.8-MeV neutrons produced during plasma operations of the Princeton Tokamak Fusion Test Reactor (TFTR) has been fabricated and tested and is presently in operation in the TFTR Test Cell Basement. The system consists of two separate detectors, each made up of cells containing liquid NE-213 scintillator attached permanently to RCA-8850 photomultiplier tubes. Pulses obtained from each photomultiplier system are amplified and electronically analyzed to identify and separate those pulses due to neutron-induced events in the detector from those due to photon-induced events in the detector. Signals from each detector are routed to two separate Analog-to-Digital Converters, and the resulting digitized information, representing: (1) the raw neutron-spectrum data; and (2) the raw photon-spectrum data, are transmitted to the CICADA data-acquisition computer system of the TFTR. Software programs have been installed on the CICADA system to analyze the raw data to provide moderate-resolution recreations of the energy spectrum of the neutron and photon fluences incident on the detector during the operation of the TFTR. A complete description of, as well as the operation of, the hardware and software is given in this report.
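The neutron/photon separation described above was done in analog electronics; as a hedged modern analogue, the sketch below shows charge-comparison pulse-shape discrimination on digitized pulses, a standard approach for NE-213 scintillators (the gate position is a hypothetical parameter):

```python
import numpy as np

def psd_tail_fraction(pulse, tail_start):
    """Charge-comparison pulse-shape discrimination: neutron events in
    NE-213 deposit a larger fraction of their light in the slow tail, so
    a larger tail/total charge ratio flags a neutron-like event."""
    pulse = np.asarray(pulse, dtype=float)
    return pulse[tail_start:].sum() / pulse.sum()
```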
Building achromatic refractive beam shapers
NASA Astrophysics Data System (ADS)
Laskin, Alexander; Shealy, David
2014-10-01
Achromatic beam shapers provide beam shaping over a certain spectral band and are very important for various laser techniques, such as applications based on ultra-short pulse lasers with pulse width <100 fs, confocal microscopy, multicolour holography, and life-science fluorescence techniques, where several lasers in the 405-650 nm spectrum are used simultaneously. The conditions of energy re-distribution and zero wave aberration are strictly fulfilled in ordinary plano-aspheric lens pair beam shapers for a definite wavelength only. Hence, these beam shapers work efficiently only in a relatively narrow spectrum of a few nm. To provide acceptable beam quality for refractive beam shaping over a wide spectrum, an achromatizing design condition should be added. Consequently, the typical beam shaper design contains more than two lenses; to avoid damage and other undesirable effects, the lenses of the beam shaper should be air-spaced. We suggest a two-step method of designing the beam shaper: 1) achromatizing each plano-aspheric lens using a buried achromatizing surface ("chromatic radius"), so that each beam shaper component becomes a cemented doublet lens; 2) "splitting" the cemented lenses and realizing an air-spaced lens design using optical systems design software. This method allows an achromatic design principle to be used during the first step of the design, with the design then refined using optimization software. We present examples of this design procedure for an achromatic Keplerian beam shaper and for the design of an achromatic Galilean type of beam shaper. Experimental results of operation of refractive beam shapers are presented as well.
Injection Locking Techniques for Spectrum Analysis
NASA Astrophysics Data System (ADS)
Gathma, Timothy D.; Buckwalter, James F.
2011-04-01
Wideband spectrum analysis supports future communication systems that reconfigure and adapt to the capacity of the spectral environment. While test equipment manufacturers offer wideband spectrum analyzers with excellent sensitivity and resolution, these spectrum analyzers typically cannot offer acceptable size, weight, and power (SWAP). CMOS integrated circuits offer the potential to fully integrate spectrum analysis capability with analog front-end circuitry and digital signal processing on a single chip. Unfortunately, CMOS lacks the high-Q passives and wideband resonator tunability that are necessary for heterodyne implementations of spectrum analyzers. As an alternative to heterodyne receiver architectures, two nonlinear methods for performing wideband, low-power spectrum analysis are presented. The first method involves injecting the spectrum of interest into an array of injection-locked oscillators. The second method employs the closed-loop dynamics of both injection locking and phase locking to independently estimate the injected frequency and power.
Computer Decision Support to Improve Autism Screening and Care in Community Pediatric Clinics
ERIC Educational Resources Information Center
Bauer, Nerissa S.; Sturm, Lynne A.; Carroll, Aaron E.; Downs, Stephen M.
2013-01-01
An autism module was added to an existing computer decision support system (CDSS) to facilitate adherence to recommended guidelines for screening for autism spectrum disorders in primary care pediatric clinics. User satisfaction was assessed by survey and informal feedback at monthly meetings between clinical staff and the software team. To assess…
Inertial Upper Stage (IUS) software analysis
NASA Technical Reports Server (NTRS)
Grayson, W. L.; Nickel, C. E.; Rose, P. L.; Singh, R. P.
1979-01-01
The Inertial Upper Stage (IUS) System, an extension of the Space Transportation System (STS) operating regime to include higher orbits, orbital plane changes, geosynchronous orbits, and interplanetary trajectories is presented. The IUS software design, the IUS software interfaces with other systems, and the cost effectiveness in software verification are described. Tasks of the IUS discussed include: (1) design analysis; (2) validation requirements analysis; (3) interface analysis; and (4) requirements analysis.
Investigation of a novel approach to scoring Giemsa-stained malaria-infected thin blood films.
Proudfoot, Owen; Drew, Nathan; Scholzen, Anja; Xiang, Sue; Plebanski, Magdalena
2008-04-21
Daily assessment of the percentage of erythrocytes that are infected ('percent-parasitaemia') across a time-course is a necessary step in many experimental studies of malaria, but represents a time-consuming and unpopular task among researchers. The most common method is extensive microscopic examination of Giemsa-stained thin blood-films. This study explored a method for the assessment of percent-parasitaemia that does not require extended periods of microscopy and results in a descriptive and permanent record of parasitaemia data that is highly amenable to subsequent 'data-mining'. Digital photography was utilized in conjunction with a basic purpose-written computer programme to test the viability of the concept. Partial automation of the determination of percent parasitaemia was then explored, resulting in the successful customization of commercially available broad-spectrum image analysis software towards this aim. Lastly, automated discrimination between infected and uninfected RBCs based on analysis of digital parameters of individual cell images was explored in an effort to completely automate the calculation of an accurate percent-parasitaemia.
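As a minimal sketch of the final automated step, assuming each segmented red blood cell has already been reduced to a scalar stain-intensity score (a hypothetical feature; the paper's actual digital parameters are not specified here), percent parasitaemia is a thresholded count:

```python
import numpy as np

def percent_parasitaemia(cell_scores, threshold):
    """Classify each segmented RBC as infected when its Giemsa-stain score
    exceeds a calibrated threshold, then report percent parasitaemia."""
    scores = np.asarray(cell_scores, dtype=float)
    infected = np.count_nonzero(scores > threshold)
    return 100.0 * infected / scores.size
```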
A fatigue monitoring system based on time-domain and frequency-domain analysis of pulse data
NASA Astrophysics Data System (ADS)
Shen, Jiaai
2018-04-01
Fatigue is a problem that nearly everyone faces and a condition that everyone dislikes. If people's fatigue condition can be measured and they can be alerted to their tiredness, dangers in daily life such as traffic accidents and sudden death can be effectively reduced, fatigued operation of machinery can be avoided, and people can be helped to track their own and others' physical condition in time to alternate work with rest. This article develops a wearable bracelet based on FFT pulse frequency-spectrum analysis and on the standard deviation and range of the inter-beat interval, exploiting the differences in heart rate (BPM) and inter-beat interval (IBI) between tired and alert states. The hardware is based on an Arduino, a pulse rate sensor, and a Bluetooth module, and the software relies on a networked micro-database and an app. By running sample experiments to obtain more accurate threshold values for judging tiredness, we show that people's fatigue condition can be judged from heart rate (BPM) and inter-beat interval (IBI).
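A sketch of the time-domain and frequency-domain features described, assuming evenly sampled pulse data (function and variable names are hypothetical):

```python
import numpy as np

def fatigue_features(ibi_ms, pulse_wave, fs_hz):
    """Features the bracelet concept uses: IBI standard deviation and range
    (time domain) plus the dominant peak of the pulse FFT (frequency domain)."""
    ibi = np.asarray(ibi_ms, dtype=float)
    sdnn = ibi.std(ddof=1)                 # standard deviation of inter-beat intervals
    ibi_range = ibi.max() - ibi.min()
    spec = np.abs(np.fft.rfft(pulse_wave - np.mean(pulse_wave)))
    freqs = np.fft.rfftfreq(len(pulse_wave), d=1.0 / fs_hz)
    dominant_hz = freqs[np.argmax(spec[1:]) + 1]  # skip the DC bin
    bpm = 60.0 * dominant_hz               # heart rate from the spectral peak
    return sdnn, ibi_range, bpm
```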
NASA Technical Reports Server (NTRS)
Fischer, Robert E. (Editor); Rogers, Philip J. (Editor)
1986-01-01
The present conference considers topics in the fields of optical systems design software, the design and analysis of optical systems, illustrative cases of advanced optical system design, the integration of optical designs into greater systems, and optical fabrication and testing techniques. Attention is given to an extended range diffraction-based merit function for lens design optimization, an assessment of technologies for stray light control and evaluation, the automated characterization of IR systems' spatial resolution, a spectrum of design techniques based on aberration theory, a three-field IR telescope, a large aperture zoom lens for 16-mm motion picture cameras, and the use of concave holographic gratings as monochromators. Also discussed are the use of aspherics in optical systems, glass choice procedures for periscope design, the fabrication and testing of unconventional optics, low mass mirrors for large optics, and the diamond grinding of optical surfaces on aspheric lens molds.
Near-infrared hyperspectral imaging for quality analysis of agricultural and food products
NASA Astrophysics Data System (ADS)
Singh, C. B.; Jayas, D. S.; Paliwal, J.; White, N. D. G.
2010-04-01
Agricultural and food processing industries are always looking to implement real-time quality monitoring techniques as a part of good manufacturing practices (GMPs) to ensure high-quality and safety of their products. Near-infrared (NIR) hyperspectral imaging is gaining popularity as a powerful non-destructive tool for quality analysis of several agricultural and food products. This technique has the ability to analyse spectral data in a spatially resolved manner (i.e., each pixel in the image has its own spectrum) by applying both conventional image processing and chemometric tools used in spectral analyses. Hyperspectral imaging technique has demonstrated potential in detecting defects and contaminants in meats, fruits, cereals, and processed food products. This paper discusses the methodology of hyperspectral imaging in terms of hardware, software, calibration, data acquisition and compression, and development of prediction and classification algorithms and it presents a thorough review of the current applications of hyperspectral imaging in the analyses of agricultural and food products.
jsNMR: an embedded platform-independent NMR spectrum viewer.
Vosegaard, Thomas
2015-04-01
jsNMR is a lightweight NMR spectrum viewer written in JavaScript/HyperText Markup Language (HTML), which provides a cross-platform spectrum visualizer that runs on all computer architectures including mobile devices. Experimental (and simulated) datasets are easily opened in jsNMR by (i) drag and drop on a jsNMR browser window, (ii) by preparing a jsNMR file from the jsNMR web site, or (iii) by mailing the raw data to the jsNMR web portal. jsNMR embeds the original data in the HTML file, so a jsNMR file is a self-transforming dataset that may be exported to various formats, e.g. comma-separated values. The main applications of jsNMR are to provide easy access to NMR data without the need for dedicated software installed and to provide the possibility to visualize NMR spectra on web sites. Copyright © 2015 John Wiley & Sons, Ltd.
SeisCode: A seismological software repository for discovery and collaboration
NASA Astrophysics Data System (ADS)
Trabant, C.; Reyes, C. G.; Clark, A.; Karstens, R.
2012-12-01
SeisCode is a community repository for software used in seismological and related fields. The repository is intended to increase discoverability of such software and to provide a long-term home for software projects. Other places exist where seismological software may be found, but none meet the requirements necessary for an always current, easy to search, well documented, and citable resource for projects. Organizations such as IRIS, ORFEUS, and the USGS have websites with lists of available or contributed seismological software. Since the authors themselves often do not maintain these lists, the documentation often consists of a sentence or paragraph, and the available software may be outdated. Repositories such as GoogleCode and SourceForge, which are directly maintained by the authors, provide version control and issue tracking but do not provide a unified way of locating geophysical software scattered in and among countless unrelated projects. Additionally, projects are hosted at language-specific sites such as Mathworks and PyPI, in FTP directories, and in websites strewn across the Web. Search engines are only partially effective discovery tools, as the desired software is often hidden deep within the results. SeisCode provides software authors a place to present their software, codes, scripts, tutorials, and examples to the seismological community. Authors can choose their own level of involvement. At one end of the spectrum, the author might simply create a web page that points to an existing site. At the other extreme, an author may choose to leverage the many tools provided by SeisCode, such as a source code management tool with integrated issue tracking, forums, news feeds, downloads, wikis, and more. For software development projects with multiple authors, SeisCode can also be used as a central site for collaboration. SeisCode provides the community with an easy way to discover software, while providing authors a way to build a community around their software packages. IRIS invites the seismological community to browse and to submit projects to https://seiscode.iris.washington.edu/
NASA Technical Reports Server (NTRS)
Jakeman, Hali L.
2013-01-01
The Ka-Band Object Observation and Monitoring, or KaBOOM, project is designed mainly to track and characterize near Earth objects. However, a smaller goal of the project would be to monitor pulsars and study their radio frequency signals for use as a clock in interstellar travel. The use of pulsars and their timing accuracy has been studied for decades, but never in the Ka-band of the radio frequency spectrum. In order to begin the use of KaBOOM for this research, the control systems need to be analyzed to ensure their capability. Flaws in the control documentation leave it unclear as to whether the control software processes coordinates from the J2000 epoch. This experiment will examine the control software of the Intertronic 12m antennas used for the KaBOOM project and detail its capabilities in its "equatorial mode." The antennas will be pointed at 4 chosen points in the sky on several days while probing the virtual azimuth and elevation (horizon coordinate) registers. The input right ascension and declination coordinates will then be converted separately from the control software to horizontal coordinates and compared, thus determining the ability of the control software to process equatorial coordinates.
NASA Astrophysics Data System (ADS)
Mudraya, I. S.; Revenko, S. V.; Khodyreva, L. A.; Markosyan, T. G.; Dudareva, A. A.; Ibragimov, A. R.; Romich, V. V.; Kirpatovsky, V. I.
2013-04-01
A novel technique based on harmonic analysis of bioimpedance microvariations, using an original hardware and software complex incorporating a high-resolution impedance converter, was used to assess the neural activity and circulation in the human urinary bladder and penis in patients with pelvic pain, erectile dysfunction, and overactive bladder. The therapeutic effects of shock wave therapy and Botulinum toxin detrusor injections were evaluated quantitatively according to the spectral peaks at the low 0.1 Hz frequency (M for Mayer wave) and the respiratory (R) and cardiac (C) rhythms with their harmonics. Enhanced baseline regional neural activity identified according to the M and R peaks was found to be presumably sympathetic in pelvic pain patients, and parasympathetic in patients with overactive bladder. Total pulsatile activity and pulsatile resonances found in the bladder as well as in the penile spectrum characterised regional circulation and vascular tone. The abnormal spectral parameters characteristic of the patients with genitourinary diseases shifted toward the norm in cases of efficient therapy. Bioimpedance harmonic analysis seems to be a potent tool to assess regional peculiarities of circulatory and autonomic nervous activity in the course of patient treatment.
[A study of Boletus bicolor from different areas using Fourier transform infrared spectrometry].
Zhou, Zai-Jin; Liu, Gang; Ren, Xian-Pei
2010-04-01
It is hard to differentiate the same species of wild-growing mushrooms from different areas by macromorphological features. In this paper, Fourier transform infrared (FTIR) spectroscopy combined with principal component analysis was used to identify 58 samples of Boletus bicolor from five different areas. Based on the fingerprint infrared spectrum of the Boletus bicolor samples, principal component analysis was conducted on the 58 spectra in the range of 1350-750 cm(-1) using the statistical software SPSS 13.0. According to the results, the accumulated contributing ratio of the first three principal components accounts for 88.87%, so they include almost all the information in the samples. The two-dimensional projection plot using the first and second principal components shows a satisfactory clustering effect for the classification and discrimination of Boletus bicolor. All Boletus bicolor samples were divided into five groups with a classification accuracy of 98.3%. The study demonstrated that wild-growing Boletus bicolor from different areas can be identified at the species level by FTIR spectra combined with principal component analysis.
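The pipeline (fingerprint-region spectra in, three principal components out, 2-D scores plotted for grouping) can be sketched with scikit-learn standing in for SPSS 13.0; the matrix shape is hypothetical:

```python
import numpy as np
from sklearn.decomposition import PCA

def ftir_pca(X, n_components=3):
    """X: rows = samples (58 here), columns = absorbances over the
    1350-750 cm-1 fingerprint region. Returns PC scores and the cumulative
    explained-variance ratio (~88.87% for three PCs in the paper)."""
    pca = PCA(n_components=n_components)
    scores = pca.fit_transform(np.asarray(X, dtype=float))
    return scores, pca.explained_variance_ratio_.cumsum()

# A scatter of scores[:, 0] vs. scores[:, 1] reproduces the two-dimensional
# projection plot used to separate the five geographic groups.
```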
Time Series Analysis of the Quasar PKS 1749+096
NASA Astrophysics Data System (ADS)
Lam, Michael T.; Balonek, T. J.
2011-01-01
Multiple timescales of variability are observed in quasars at a variety of wavelengths, the nature of which is not fully understood. In 2007 and 2008, the quasar 1749+096 underwent two unprecedented optical outbursts, reaching a brightness never before seen in our twenty years of monitoring. Much lower level activity had been seen prior to these two outbursts. We present an analysis of the timescales of variability over the two regimes using a variety of statistical techniques. An IDL software package developed at Colgate University over the summer of 2010, the Quasar User Interface (QUI), provides effective computation of four time series functions for analyzing underlying trends present in generic, discretely sampled data sets. Using the Autocorrelation Function, Structure Function, and Power Spectrum, we are able to quickly identify possible variability timescales. QUI is also capable of computing the Cross-Correlation Function for comparing variability at different wavelengths. We apply these algorithms to 1749+096 and present our analysis of the timescales for this object. Funding for this project was received from Colgate University, the Justus and Jayne Schlichting Student Research Fund, and the NASA / New York Space Grant.
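QUI itself is an IDL package; as a hedged Python illustration, the three single-series functions it computes can be sketched for an evenly sampled light curve (real photometry is unevenly sampled, which the actual package must handle):

```python
import numpy as np

def autocorrelation(x):
    """Normalized autocorrelation function of an evenly sampled light curve."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    acf = np.correlate(x, x, mode="full")[x.size - 1:]
    return acf / acf[0]

def structure_function(x):
    """First-order structure function SF(tau) = <(x(t+tau) - x(t))^2>."""
    x = np.asarray(x, dtype=float)
    return np.array([np.mean((x[k:] - x[:-k]) ** 2) for k in range(1, x.size)])

def power_spectrum(x, dt):
    """One-sided power spectrum; a peak marks a candidate variability timescale."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(x.size, d=dt)
    return freqs, power
```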
NASA Astrophysics Data System (ADS)
Papanikolaou, T. D.; Papadopoulos, N.
2015-06-01
The present study aims at the validation of global gravity field models through numerical investigation of gravity field functionals based on spherical harmonic synthesis of the geopotential models and the analysis of terrestrial data. We examine gravity models produced according to the latest approaches for gravity field recovery based on the principles of the Gravity field and steady-state Ocean Circulation Explorer (GOCE) and Gravity Recovery And Climate Experiment (GRACE) satellite missions. Furthermore, we evaluate the overall spectrum of the ultra-high degree combined gravity models EGM2008 and EIGEN-6C3stat. The terrestrial data consist of gravity and collocated GPS/levelling data in the overall Hellenic region. The software presented here implements the algorithm of spherical harmonic synthesis in a degree-wise cumulative sense. This approach may quantify the band-limited performance of the individual models by monitoring the degree-wise computed functionals against the terrestrial data. The degree-wise analysis performed yields insight into the short wavelengths of the Earth gravity field as these are expressed by the high degree harmonics.
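The degree-wise cumulative synthesis amounts to a running sum over harmonic degrees at each evaluation point. The sketch below shows only the bookkeeping for one point, with unnormalized Legendre functions and no radial scaling (full geodetic normalization and the GM/R factors are deliberately omitted, so this illustrates the structure rather than the production algorithm):

```python
import numpy as np
from scipy.special import lpmn

def degreewise_synthesis(cnm, snm, lat_deg, lon_deg, nmax):
    """Degree-wise cumulative spherical harmonic synthesis at one point:
    returns the running sums f_N = sum_{n<=N} sum_{m<=n} (...) so the
    band-limited contribution of each added degree can be compared with
    terrestrial data. cnm/snm are coefficient arrays indexed [n, m]."""
    theta = np.radians(90.0 - lat_deg)        # colatitude
    lam = np.radians(lon_deg)
    pnm, _ = lpmn(nmax, nmax, np.cos(theta))  # unnormalized P_n^m; index [m, n]
    cumulative = np.zeros(nmax + 1)
    total = 0.0
    for n in range(nmax + 1):
        for m in range(n + 1):
            total += pnm[m, n] * (cnm[n, m] * np.cos(m * lam)
                                  + snm[n, m] * np.sin(m * lam))
        cumulative[n] = total                 # value truncated at degree n
    return cumulative
```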
Ganz, Jennifer B; Morin, Kristi L; Foster, Margaret J; Vannest, Kimberly J; Genç Tosun, Derya; Gregori, Emily V; Gerow, Stephanie L
2017-12-01
The use of mobile technology is ubiquitous in modern society and is rapidly increasing in novel use. The use of mobile devices and software applications ("apps") as augmentative and alternative communication (AAC) is rapidly expanding in the community, and this is also reflected in the research literature. This article reports the social-communication outcome results of a meta-analysis of single-case experimental research on the use of high-tech AAC, including mobile devices, by individuals with intellectual and developmental disabilities, including autism spectrum disorder. Following inclusion determination, and excluding studies with poor design quality, raw data from 24 publications were extracted and included 89 A-B phase contrasts. Tau-U nonparametric, non-overlap effect size was used to aggregate the results across all studies for an omnibus and moderator analyses. Kendall's S was calculated for confidence intervals, p-values, and standard error. The omnibus analysis indicated overall low to moderate positive effects on social-communication outcomes for high-tech AAC use by individuals with intellectual and developmental disabilities.
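For a single A-B contrast, the non-overlap core of this analysis reduces to Kendall's S over all between-phase pairs; the sketch below omits the baseline-trend correction that full Tau-U adds:

```python
import numpy as np

def tau_ab(phase_a, phase_b):
    """Non-overlap Tau for one A-B contrast: Kendall's S (signed count of
    improving vs. deteriorating A-B pairs) divided by the number of pairs.
    Full Tau-U additionally corrects for baseline trend, omitted here."""
    a = np.asarray(phase_a, dtype=float)
    b = np.asarray(phase_b, dtype=float)
    s = sum(np.sign(bi - ai) for ai in a for bi in b)  # Kendall's S
    return s / (a.size * b.size)
```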
Software Safety Progress in NASA
NASA Technical Reports Server (NTRS)
Radley, Charles F.
1995-01-01
NASA has developed guidelines for the development and analysis of safety-critical software. These guidelines have been documented in a Guidebook for Safety Critical Software Development and Analysis. The guidelines represent a practical 'how to' approach to assist software developers and safety analysts in cost-effective methods for software safety. They provide guidance in the implementation of the recent NASA Software Safety Standard NSS-1740.13, which was released as an 'Interim' version in June 1994 and is scheduled for formal adoption in late 1995. This paper is a survey of the methods in general use, resulting in the NASA guidelines for safety-critical software development and analysis.
NASA Technical Reports Server (NTRS)
2003-01-01
Topics include: Tool for Bending a Metal Tube Precisely in a Confined Space; Multiple-Use Mechanisms for Attachment to Seat Tracks; Force-Measuring Clamps; Cellular Pressure-Actuated Joint; Block QCA Fault-Tolerant Logic Gates; Hybrid VLSI/QCA Architecture for Computing FFTs; Arrays of Carbon Nanotubes as RF Filters in Waveguides; Carbon Nanotubes as Resonators for RF Spectrum Analyzers; Software for Viewing Landsat Mosaic Images; Updated Integrated Mission Program; Software for Sharing and Management of Information; Optical-Quality Thin Polymer Membranes; Rollable Thin Shell Composite-Material Paraboloidal Mirrors; Folded Resonant Horns for Power Ultrasonic Applications; Touchdown Ball-Bearing System for Magnetic Bearings; Flux-Based Deadbeat Control of Induction-Motor Torque; Block Copolymers as Templates for Arrays of Carbon Nanotubes; Throttling Cryogen Boiloff To Control Cryostat Temperature; Collaborative Software Development Approach Used to Deliver the New Shuttle Telemetry Ground Station; Turbulence in Supercritical O2/H2 and C7H16/N2 Mixing Layers; and Time-Resolved Measurements in Optoelectronic Microbioanal.
Validation of a Custom-made Software for DQE Assessment in Mammography Digital Detectors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ayala-Dominguez, L.; Perez-Ponce, H.; Brandan, M. E.
2010-12-07
This work presents the validation of custom-made software, designed and developed in Matlab, intended for routine evaluation of the detective quantum efficiency DQE, according to the algorithms described in the IEC 62220-1-2 standard. DQE, normalized noise power spectrum NNPS and pre-sampling modulation transfer function MTF were calculated from RAW images from a GE Senographe DS (FineView disabled) and a Siemens Novation system. The calculated MTF is in close agreement with results obtained with alternative codes: MTF_tool (Maidment), an ImageJ plug-in (Perez-Ponce) and MIQuaELa (Ayala). Overall agreement better than ≈90% was found in the MTF; the largest differences were observed at frequencies close to the Nyquist limit. For the measurement of NNPS and DQE, agreement is similar to that obtained for the MTF. These results suggest that the developed software can be used with confidence for image quality assessment.
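Assuming the MTF and NNPS have already been estimated on a common frequency axis, the IEC 62220-1-2 combination step is a one-line formula; q, the squared input signal-to-noise ratio per unit area, comes from beam-quality tables:

```python
import numpy as np

def dqe(mtf, nnps, q):
    """IEC 62220-1-2 style combination: DQE(f) = MTF(f)^2 / (q * NNPS(f)),
    with q the incident photon fluence (SNR_in^2, photons/mm^2) and NNPS
    the noise power spectrum normalized by the squared mean signal."""
    mtf = np.asarray(mtf, dtype=float)
    nnps = np.asarray(nnps, dtype=float)
    return mtf ** 2 / (q * nnps)
```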
An expert system executive for automated assembly of large space truss structures
NASA Technical Reports Server (NTRS)
Allen, Cheryl L.
1993-01-01
Langley Research Center developed a unique test bed for investigating the practical problems associated with the assembly of large space truss structures using robotic manipulators. The test bed is the result of an interdisciplinary effort that encompasses the full spectrum of assembly problems - from the design of mechanisms to the development of software. The automated structures assembly test bed and its operation are described, the expert system executive and its development are detailed, and the planned system evolution is discussed. Emphasis is on the expert system implementation of the program executive. The executive program must direct and reliably perform complex assembly tasks with the flexibility to recover from realistic system errors. The employment of an expert system permits information that pertains to the operation of the system to be encapsulated concisely within a knowledge base. This consolidation substantially reduced code, increased flexibility, eased software upgrades, and realized a savings in software maintenance costs.
Using software security analysis to verify the secure socket layer (SSL) protocol
NASA Technical Reports Server (NTRS)
Powell, John D.
2004-01-01
The National Aeronautics and Space Administration (NASA) has tens of thousands of networked computer systems and applications. Software security vulnerabilities present risks such as lost or corrupted data, information theft, and unavailability of critical systems. These risks represent potentially enormous costs to NASA. The NASA Code Q research initiative 'Reducing Software Security Risk (RSSR) Through an Integrated Approach' offers, among its capabilities, formal verification of software security properties, through the use of model-based verification (MBV) to address software security risks. [1,2,3,4,5,6] MBV is a formal approach to software assurance that combines analysis of software, via abstract models, with technology, such as model checkers, that provides automation of the mechanical portions of the analysis process. This paper will discuss: the need for formal analysis to assure software systems with respect to software security, and why testing alone cannot provide it; the means by which MBV with a Flexible Modeling Framework (FMF) accomplishes the necessary analysis task; and an example of FMF-style MBV in the verification of properties of the Secure Socket Layer (SSL) communication protocol as a demonstration.
Development of Automated Image Analysis Software for Suspended Marine Particle Classification
2003-09-30
Scott Samson, Center for Ocean Technology. The objective is to develop automated image analysis software to reduce the effort and time required for manual identification of plankton images.
A tool to include gamma analysis software into a quality assurance program.
Agnew, Christina E; McGarry, Conor K
2016-03-01
To provide a tool to enable gamma analysis software algorithms to be included in a quality assurance (QA) program. Four image sets were created, comprising two geometric images to independently test the distance-to-agreement (DTA) and dose difference (DD) elements of the gamma algorithm, a clinical step-and-shoot IMRT field and a clinical VMAT arc. The images were analysed using global and local gamma analysis with 2 in-house and 8 commercially available software packages encompassing 15 software versions. The effect of image resolution on gamma pass rates was also investigated. All but one software package accurately calculated the gamma passing rate for the geometric images. Variation in global gamma passing rates of 1% at 3%/3mm and over 2% at 1%/1mm was measured between software packages and software versions with analysis of appropriately sampled images. This study provides a suite of test images and the gamma pass rates achieved for a selection of commercially available software. This image suite will enable validation of gamma analysis software within a QA program and provide a frame of reference by which to compare results reported in the literature from various manufacturers and software versions. Copyright © 2015. Published by Elsevier Ireland Ltd.
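A 1-D, globally normalized version of the gamma comparison being validated can be sketched as follows (production tools operate on interpolated 2-D/3-D dose grids; this only illustrates how the DTA and DD criteria combine):

```python
import numpy as np

def gamma_pass_rate(ref_dose, eval_dose, x_mm, dd=0.03, dta=3.0):
    """1-D global gamma (e.g. 3%/3mm): for each reference point, take the
    minimum over all evaluated points of sqrt((dx/DTA)^2 + (dD/DD)^2);
    the point passes if that minimum is <= 1."""
    ref = np.asarray(ref_dose, dtype=float)
    ev = np.asarray(eval_dose, dtype=float)
    x = np.asarray(x_mm, dtype=float)
    dd_abs = dd * ref.max()                       # global dose-difference criterion
    g = [np.sqrt(np.min(((x - xi) / dta) ** 2 + ((ev - ri) / dd_abs) ** 2))
         for xi, ri in zip(x, ref)]
    return 100.0 * np.mean(np.asarray(g) <= 1.0)  # percent of points passing
```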
The Role of Data Analysis Software in Graduate Programs in Education and Post-Graduate Research
ERIC Educational Resources Information Center
Harwell, Michael
2018-01-01
The importance of data analysis software in graduate programs in education and post-graduate educational research is self-evident. However, the role of this software in facilitating sound statistical practice versus "cookbookery" is unclear. The need to rigorously document the role of data analysis software in students' graduate…
Cyber Risk Management for Critical Infrastructure: A Risk Analysis Model and Three Case Studies.
Paté-Cornell, M-Elisabeth; Kuypers, Marshall; Smith, Matthew; Keller, Philip
2018-02-01
Managing cyber security in an organization involves allocating the protection budget across a spectrum of possible options. This requires assessing the benefits and the costs of these options. The risk analyses presented here are statistical when relevant data are available, and system-based for high-consequence events that have not happened yet. This article presents, first, a general probabilistic risk analysis framework for cyber security in an organization to be specified. It then describes three examples of forward-looking analyses motivated by recent cyber attacks. The first one is the statistical analysis of an actual database, extended at the upper end of the loss distribution by a Bayesian analysis of possible, high-consequence attack scenarios that may happen in the future. The second is a systems analysis of cyber risks for a smart, connected electric grid, showing that there is an optimal level of connectivity. The third is an analysis of sequential decisions to upgrade the software of an existing cyber security system or to adopt a new one to stay ahead of adversaries trying to find their way in. The results are distributions of losses to cyber attacks, with and without some considered countermeasures in support of risk management decisions based both on past data and anticipated incidents. © 2017 Society for Risk Analysis.
Ojima-Kato, Teruyo; Yamamoto, Naomi; Takahashi, Hajime; Tamura, Hiroto
2016-01-01
The genetic lineages of Listeria monocytogenes and other species of the genus Listeria are correlated with pathogenesis in humans. Although matrix-assisted laser desorption ionization-time of flight mass spectrometry (MALDI-TOF MS) has become a prevailing tool for rapid and reliable microbial identification, the precise discrimination of Listeria species and lineages remains a crucial issue in clinical settings and for food safety. In this study, we constructed an accurate and reliable MS database to discriminate the lineages of L. monocytogenes and the species of Listeria (L. monocytogenes, L. innocua, L. welshimeri, L. seeligeri, L. ivanovii, L. grayi, and L. rocourtiae) based on the S10-spc-alpha operon gene encoded ribosomal protein mass spectrum (S10-GERMS) proteotyping method, which relies on both genetic information (genomics) and observed MS peaks in MALDI-TOF MS (proteomics). The specific set of eight biomarkers (ribosomal proteins L24, L6, L18, L15, S11, S9, L31 type B, and S16) yielded characteristic MS patterns for the lineages of L. monocytogenes and the different species of Listeria, and led to the construction of a MS database that was successful in discriminating between these organisms in MALDI-TOF MS fingerprinting analysis followed by advanced proteotyping software Strain Solution analysis. We also confirmed the constructed database on the proteotyping software Strain Solution by using 23 Listeria strains collected from natural sources.
Energy analysis of holographic lenses for solar concentration
NASA Astrophysics Data System (ADS)
Marín-Sáez, Julia; Collados, M. Victoria; Chemisana, Daniel; Atencia, Jesús
2017-05-01
The use of volume and phase holographic elements in the design of photovoltaic solar concentrators has become very popular as an alternative solution to refractive systems, due to their high efficiency, low cost and possibilities of building integration. The angular and chromatic selectivity of volume holograms can affect their behavior as solar concentrators. In holographic lenses, angular and chromatic selectivity varies along the lens plane. Besides, considering that the holographic materials are not sensitive to the wavelengths for which the solar cells are most efficient, the reconstruction wavelength is usually different from the recording one. As a consequence, not all points of the lens work at the Bragg condition for a defined incident direction or wavelength. A software tool that calculates the direction and efficiency of solar rays at the output of a volume holographic element has been developed in this study. It allows the analysis of the total energy that reaches the solar cell, taking into account the sun movement, the solar spectrum and the sensitivity of the solar cell. The dependence of the collected energy on the recording wavelength is studied with this software. As the recording angle varies along a holographic lens, some zones of the lens may not act as a volume hologram. The efficiency at the transition zones between volume and thin behavior in lenses recorded in Bayfol HX is experimentally analyzed in order to decide whether the energy of generated higher diffraction orders has to be included in the simulation.
Retinal health information and notification system (RHINO)
NASA Astrophysics Data System (ADS)
Dashtbozorg, Behdad; Zhang, Jiong; Abbasi-Sureshjani, Samaneh; Huang, Fan; ter Haar Romeny, Bart M.
2017-03-01
The retinal vasculature is the only part of the blood circulation system that can be observed non-invasively using fundus cameras. Changes in the dynamic properties of retinal blood vessels are associated with many systemic and vascular diseases, such as hypertension, coronary heart disease and diabetes. The assessment of the characteristics of the retinal vascular network provides important information for an early diagnosis and prognosis of many systemic and vascular diseases. The manual analysis of the retinal vessels and measurement of quantitative biomarkers in large-scale screening programs is a tedious task, time-consuming and costly. This paper describes a reliable, automated, and efficient retinal health information and notification system (acronym RHINO) which can extract a wealth of geometric biomarkers in large volumes of fundus images. The fully automated software presented in this paper includes vessel enhancement and segmentation, artery/vein classification, optic disc, fovea, and vessel junction detection, and bifurcation/crossing discrimination. Pipelining these tools allows the assessment of several quantitative vascular biomarkers: width, curvature, bifurcation geometry features and fractal dimension. The brain-inspired algorithms outperform most of the state-of-the-art techniques. Moreover, several annotation tools are implemented in RHINO for the manual labeling of arteries and veins, marking the optic disc and fovea, and delineating vessel centerlines. The validation phase is ongoing and the software is currently being used for the analysis of retinal images from the Maastricht study (the Netherlands), which includes over 10,000 subjects (healthy and diabetic) with a broad spectrum of clinical measurements.
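As one concrete example of the listed biomarkers, the fractal dimension of a binary vessel map can be estimated by box counting; the box sizes are hypothetical and RHINO's actual implementation is not described at this level:

```python
import numpy as np

def box_counting_dimension(mask, sizes=(2, 4, 8, 16, 32)):
    """Box-counting estimate of the fractal dimension of a binary vessel
    map: count occupied boxes N(s) at each box size s, then take the slope
    of log N(s) versus log(1/s)."""
    counts = []
    for s in sizes:
        h = mask.shape[0] // s * s
        w = mask.shape[1] // s * s
        blocks = mask[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(np.count_nonzero(blocks.any(axis=(1, 3))))
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return slope
```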
Wang, Bing; Fang, Aiqin; Heim, John; Bogdanov, Bogdan; Pugh, Scott; Libardoni, Mark; Zhang, Xiang
2010-01-01
A novel peak alignment algorithm using a distance and spectrum correlation optimization (DISCO) method has been developed for two-dimensional gas chromatography time-of-flight mass spectrometry (GC×GC/TOF-MS) based metabolomics. This algorithm uses the output of the instrument control software, ChromaTOF, as its input data. It detects and merges multiple peak entries of the same metabolite into one peak entry in each input peak list. After a z-score transformation of metabolite retention times, DISCO selects landmark peaks from all samples based on both two-dimensional retention times and mass spectrum similarity of fragment ions measured by Pearson’s correlation coefficient. A local linear fitting method is employed in the original two-dimensional retention time space to correct retention time shifts. A progressive retention time map searching method is used to align metabolite peaks in all samples together based on optimization of the Euclidean distance and mass spectrum similarity. The effectiveness of the DISCO algorithm is demonstrated using data sets acquired under different experiment conditions and a spiked-in experiment. PMID:20476746
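The named building blocks (z-scored retention times, Pearson spectral similarity, Euclidean distance) can be sketched as below; the combined score and its weight are illustrative stand-ins for DISCO's actual joint optimization:

```python
import numpy as np

def zscore(rt):
    """z-score transform of retention times (applied per GC dimension)."""
    rt = np.asarray(rt, dtype=float)
    return (rt - rt.mean()) / rt.std(ddof=1)

def spectrum_similarity(s1, s2):
    """Pearson correlation between two fragment-ion mass spectra."""
    return np.corrcoef(np.asarray(s1, float), np.asarray(s2, float))[0, 1]

def match_score(rt1, rt2, s1, s2, w=0.5):
    """Candidate peak match: small Euclidean distance in z-scored 2-D
    retention time plus high spectral correlation. The weight w is a
    hypothetical choice, not DISCO's published optimization."""
    d = np.linalg.norm(np.asarray(rt1, float) - np.asarray(rt2, float))
    return w * (1.0 / (1.0 + d)) + (1.0 - w) * spectrum_similarity(s1, s2)
```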
NASA Astrophysics Data System (ADS)
Casoli, Pierre; Grégoire, Gilles; Rousseau, Guillaume; Jacquet, Xavier; Authier, Nicolas
2016-02-01
CALIBAN is a metallic critical assembly managed by the Criticality, Neutron Science and Measurement Department located at the French CEA Center of Valduc. The reactor is extensively used for benchmark experiments dedicated to the evaluation of nuclear data, for electronic hardening, and to study the effects of neutrons on various materials. Therefore CALIBAN's irradiation characteristics, especially its central cavity neutron spectrum, have to be very accurately evaluated. In order to strengthen our knowledge of this spectrum, several adjustment methods based on activation foil measurements have been studied in the laboratory over the past few years. First, two codes included in the UMG package, MAXED and GRAVEL, were tested and compared. More recently, the CALIBAN cavity spectrum has been studied using CALMAR, a new adjustment tool currently under development at the CEA Center of Cadarache. The article will discuss and compare the results and the quality of spectrum reconstruction obtained with the UMG codes and with the CALMAR software, from a set of activation measurements carried out in the CALIBAN irradiation cavity.
Usability study of clinical exome analysis software: top lessons learned and recommendations.
Shyr, Casper; Kushniruk, Andre; Wasserman, Wyeth W
2014-10-01
New DNA sequencing technologies have revolutionized the search for genetic disruptions. Targeted sequencing of all protein coding regions of the genome, called exome analysis, is actively used in research-oriented genetics clinics, with the transition to exomes as a standard procedure underway. This transition is challenging; identification of potentially causal mutation(s) amongst ∼10^6 variants requires specialized computation in combination with expert assessment. This study analyzes the usability of user interfaces for clinical exome analysis software. There are two study objectives: (1) To ascertain the key features of successful user interfaces for clinical exome analysis software based on the perspective of expert clinical geneticists, (2) To assess user-system interactions in order to reveal strengths and weaknesses of existing software, inform future design, and accelerate the clinical uptake of exome analysis. Surveys, interviews, and cognitive task analysis were performed for the assessment of two next-generation exome sequence analysis software packages. The subjects included ten clinical geneticists who interacted with the software packages using the "think aloud" method. Subjects' interactions with the software were recorded in their clinical office within an urban research and teaching hospital. All major user interface events (from the user interactions with the packages) were time-stamped and annotated with coding categories to identify usability issues in order to characterize desired features and deficiencies in the user experience. We detected 193 usability issues, the majority of which concern interface layout and navigation, and the resolution of reports. Our study highlights gaps in specific software features typical within exome analysis. The clinicians perform best when the flow of the system is structured into well-defined yet customizable layers for incorporation within the clinical workflow. The results highlight opportunities to dramatically accelerate clinician analysis and interpretation of patient genomic data. We present the first application of usability methods to evaluate software interfaces in the context of exome analysis. Our results highlight how the study of user responses can lead to identification of usability issues and challenges and reveal software reengineering opportunities for improving clinical next-generation sequencing analysis. While the evaluation focused on two distinctive software tools, the results are general and should inform active and future software development for genome analysis software. As large-scale genome analysis becomes increasingly common in healthcare, it is critical that efficient and effective software interfaces are provided to accelerate clinical adoption of the technology. Implications for improved design of such applications are discussed. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
Collision management utilizing CCD and remote sensing technology
NASA Technical Reports Server (NTRS)
Mcdaniel, Harvey E., Jr.
1995-01-01
With the threat of damage to aerospace systems (space station, shuttle, hypersonic aircraft, solar power satellites, etc.) and of loss of life from collision with man-made debris, there exists an opportunity for a novel collision avoidance system to be incorporated into the overall design. By incorporating techniques from CCD and remote sensing technologies, an integrated system operating in the infrared/visible spectrum for detection, tracking, localization, and maneuvering from Doppler shift measurements is achievable. Other analyses, such as impact assessment, station keeping, and chemical and optical tracking/fire-control solutions, are possible through this system. By utilizing modified field-programmable gate arrays (software reconfiguring the hardware), the mission and mission effectiveness can be varied. This paper outlines the theoretical operation of a prototype system as it applies to collision avoidance (to be followed up by research).
NASA Astrophysics Data System (ADS)
Various papers on communications for the information age are presented. Among the general topics considered are: telematic services and terminals, satellite communications, the telecommunications management network, control of integrated broadband networks, advances in digital radio systems, the intelligent network, broadband networks and services deployment, future switch architectures, performance analysis of computer networks, advances in spread spectrum, optical high-speed LANs, and broadband switching and networks. Also addressed are: multiple access protocols, video coding techniques, modulation and coding, photonic switching, SONET terminals and applications, standards for video coding, digital switching, progress in MANs, mobile and portable radio, software design for improved maintainability, multipath propagation and advanced countermeasures, data communication, network control and management, fiber in the loop, network algorithms and protocols, and advances in computer communications.
NASA Technical Reports Server (NTRS)
Clark, T. A.; Brainard, G.; Salazar, G.; Johnston, S.; Schwing, B.; Litaker, H.; Kolomenski, A.; Venus, D.; Tran, K.; Hanifin, J.;
2017-01-01
NASA has demonstrated an interest in improving astronaut health and performance through the installation of a new lighting countermeasure on the International Space Station. The Solid State Lighting Assembly (SSLA) system is designed to positively influence astronaut health by providing a daily change in light spectrum to improve circadian entrainment. Unfortunately, existing NASA standards and requirements define ambient light level requirements for crew sleep and other tasks, yet the number of light-emitting diode (LED) indicators and displays within a habitable volume is currently uncontrolled. Because each of these light sources has its own unique spectral properties, the additive lighting environment ends up becoming something different from what was planned or researched. Restricting the use of displays and indicators is not a solution because these systems provide beneficial feedback to the crew. The research team for this grant used computer-based computational modeling and real-world lighting mockups to document the contribution that light sources other than the ambient lighting system make to the ambient spectral lighting environment. In particular, the team focused on understanding the impacts of long-term tasks located in front of avionics or computer displays. The team also wanted to understand options for mitigating the changes to the ambient light spectrum in the interest of maintaining the performance of a lighting countermeasure. The project utilized a variety of physical and computer-based simulations to determine direct relationships between system implementation and light spectrum. Using real-world data, computer models were built in the commercially available optics analysis software Zemax Optics Studio. The team also built a mockup test facility that had the same volume and configuration as one of the Zemax models. The team collected over 1200 spectral irradiance measurements, each representing a different configuration of the mockup. Analysis of the data showed a measurable impact on the ambient light spectrum. These data showed that design techniques exist that can be used to bound the ambient light spectrum closer to the planned spectral operating environment for the observer's eye point. The following observations should be considered when designing an operational environment that is dominated by computer displays. The more light that is directed into the field of view of the observer, the greater the impact it makes on the various human factors issues that depend on spectral shape and intensity. Because viewing angle plays a large part in the amount of light flux on the crewmember's retina, beam shape, combined with light source location, is an important factor for determining the percent probable incident flux on the observer from any combination of light sources. Computer graphics design and display lumen output are major factors influencing the amount of spectrally intense light projected into the environment and in the viewer's direction. Use of adjustable white-point display software was useful only if the predominant background color was white and if it matched the ambient light system's color. Display graphics that used a predominantly black background had the least influence on unplanned spectral energy projected into the environment.
Percent reflectance makes a difference in the total energy reflected back into an environment, and within certain architectural geometries, reflectance can be used to control the amount of a light spectrum that is allowed to persist in the environment. Data showed that room volume and distance from significant light sources influence the total spectrum in a room. Smaller environments had a homogenizing effect on total light spectrum, whereas light from multiple sources in larger environments was less mixed. The findings indicated above should be considered when making recommendations for practice or standards for architectural systems. The ambient lighting system, surface reflectance, and display and indicator implementation all factor into the users' spectral environment. A variety of low-cost solutions exist to mitigate the impact of light from non-architectural lighting systems, and there is much potential for system automation and integration of display systems with the ambient environment. The team believes that proper planning can be used to avoid integration problems, and that human-in-the-loop evaluations, real-world test and measurement, and computer modeling can be used to determine how changes to a process, display graphics, and architecture will help maintain the planned spectral operating lighting environment.
Fault Tree Analysis Application for Safety and Reliability
NASA Technical Reports Server (NTRS)
Wallace, Dolores R.
2003-01-01
Many commercial software tools exist for fault tree analysis (FTA), an accepted method for mitigating risk in systems. The method embedded in these tools identifies root causes in system components, but when software is identified as a root cause, the tools do not build trees into the software component. No commercial software tools have been built specifically for the development and analysis of software fault trees. Research indicates that the methods of FTA could be applied to software, but the method is not practical without automated tool support. With appropriate automated tool support, software fault tree analysis (SFTA) may be a practical technique for identifying the underlying cause of software faults that may lead to critical system failures. We strive to demonstrate that existing commercial tools for FTA can be adapted for use with SFTA, and that, applied to a safety-critical system, SFTA can be used to identify serious potential problems long before integration and system testing.
RELAP-7 Software Verification and Validation Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Curtis L.; Choi, Yong-Joon; Zou, Ling
This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7. The RELAP-7 (Reactor Excursion and Leak Analysis Program) code is a nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on INL's modern scientific software development framework – MOOSE (Multi-Physics Object-Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5's capability and extends the analysis capability for all reactor system simulation scenarios.
Fast Fourier Transform Spectral Analysis Program
NASA Technical Reports Server (NTRS)
Daniel, J. A., Jr.; Graves, M. L.; Hovey, N. M.
1969-01-01
Fast Fourier Transform Spectral Analysis Program is used in frequency spectrum analysis of postflight, space vehicle telemetered trajectory data. This computer program with a digital algorithm can calculate power spectrum rms amplitudes and cross spectrum of sampled parameters at even time increments.
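The original is a 1969-era program, so the following Python sketch only mirrors the idea: an rms amplitude per frequency bin from a scaled one-sided FFT of evenly sampled telemetry (the DC and Nyquist bins are handled simplistically):

```python
import numpy as np

def rms_amplitude_spectrum(x, dt):
    """RMS amplitude per frequency bin for evenly sampled telemetry: a
    real FFT scaled so each bin reports the rms amplitude of the
    corresponding sinusoid (one-sided spectrum)."""
    x = np.asarray(x, dtype=float)
    n = x.size
    spec = np.abs(np.fft.rfft(x)) / n
    spec[1:] *= 2.0                 # fold negative frequencies into one side
    rms = spec / np.sqrt(2.0)       # peak sinusoid amplitude -> rms
    freqs = np.fft.rfftfreq(n, d=dt)
    return freqs, rms
```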
Kuz'min, A A; Meshkovskiĭ, D V; Filist, S A
2008-01-01
Problems of engineering and algorithm development of magnetic therapy apparatuses with pseudo-random radiation spectrum within the audio range for treatment of prostatitis and gynecopathies are considered. A typical design based on a PIC 16F microcontroller is suggested. It includes a keyboard, LCD indicator, audio amplifier, inducer, and software units. The problem of pseudo-random signal generation within the audio range is considered. A series of rectangular pulses is generated on a random-length interval on the basis of a three-component random vector. This series provides the required spectral characteristics of the therapeutic magnetic field and their adaptation to the therapeutic conditions and individual features of the patient.
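A sketch of the described generator, choosing the three vector components as interval length, pulse frequency, and duty cycle (the abstract does not specify them, so these are assumptions):

```python
import numpy as np

rng = np.random.default_rng()

def pseudo_random_pulse_train(n_intervals, fs=44100):
    """For each interval, draw a three-component random vector (here:
    interval length, pulse frequency in the 20 Hz-20 kHz audio range, and
    duty cycle) and emit a rectangular pulse series with those parameters."""
    out = []
    for _ in range(n_intervals):
        length_s = rng.uniform(0.05, 0.5)
        freq_hz = rng.uniform(20.0, 20000.0)
        duty = rng.uniform(0.2, 0.8)
        t = np.arange(int(length_s * fs)) / fs
        out.append((np.mod(t * freq_hz, 1.0) < duty).astype(float))
    return np.concatenate(out)
```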
NASA Astrophysics Data System (ADS)
Evans, B. J. K.; Foster, C.; Minchin, S. A.; Pugh, T.; Lewis, A.; Wyborn, L. A.; Evans, B. J.; Uhlherr, A.
2014-12-01
The National Computational Infrastructure (NCI) has established a powerful in-situ computational environment to enable both high performance computing and data-intensive science across a wide spectrum of national environmental data collections - in particular climate, observational data and geoscientific assets. This paper examines 1) the computational environments that supports the modelling and data processing pipelines, 2) the analysis environments and methods to support data analysis, and 3) the progress in addressing harmonisation of the underlying data collections for future transdisciplinary research that enable accurate climate projections. NCI makes available 10+ PB major data collections from both the government and research sectors based on six themes: 1) weather, climate, and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology, and 6) astronomy, social and biosciences. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. The data is largely sourced from NCI's partners (which include the custodians of many of the national scientific records), major research communities, and collaborating overseas organisations. The data is accessible within an integrated HPC-HPD environment - a 1.2 PFlop supercomputer (Raijin), a HPC class 3000 core OpenStack cloud system and several highly connected large scale and high-bandwidth Lustre filesystems. This computational environment supports a catalogue of integrated reusable software and workflows from earth system and ecosystem modelling, weather research, satellite and other observed data processing and analysis. To enable transdisciplinary research on this scale, data needs to be harmonised so that researchers can readily apply techniques and software across the corpus of data available and not be constrained to work within artificial disciplinary boundaries. Future challenges will involve the further integration and analysis of this data across the social sciences to facilitate the impacts across the societal domain, including timely analysis to more accurately predict and forecast future climate and environmental state.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rousseau, G.; Chambru, L.; Authier, N.
2015-07-01
In the context of criticality accident alarm system tests, several experiments were carried out in 2013 on the PROSPERO reactor to study the neutron and gamma response of different devices and dosimeters, particularly the SNAC2 dosimeter. This article presents the results of this criticality dosimeter in different configurations and compares the experimental measurements with the results of calculations performed with the TRIPOLI-4 Monte-Carlo neutral-particle transport code. PROSPERO is a metallic critical assembly managed by the Criticality, Neutron Science and Measurement Department located at the French CEA Research Center of Valduc. The core, surrounded by a reflector of depleted uranium, is composed of 2 horizontal cylindrical blocks made of a highly enriched uranium alloy which can be placed in contact, and of 4 depleted uranium control rods which allow the reactor to be driven. This reactor, placed in a cell 10 m x 8 m x 6 m high, with 1.4-meter-thick concrete walls, is used as a fast neutron spectrum source and is operated at a stable power level in the delayed critical state, which can vary from 3 mW to 3 kW. PROSPERO is extensively used for electronics hardening or to study the effect of neutrons on various materials. The SNAC2 criticality dosimeter is a zone dosimeter allowing the off-line measurement of criticality accident neutron doses. This dosimeter consists of a stack of seven activation foils embedded in a 23 mm diameter x 21 mm height cadmium container. The activation measurement of each foil, using a gamma spectroscopy technique, gives information about the neutron reaction rates. The SNAC2 software unfolds the spectrum from these values under the hypothesis of a particular spectrum shape with three components: a Maxwell spectrum component for the thermal range, a 1/E component for the epithermal range, and a Watt spectrum component for the high-energy range. Moreover, from the neutron spectrum, the SNAC software can calculate the neutron fluence integrated by the dosimeter and the neutron dose. During the three-week measurement campaign many radioprotection devices were used. To modify the spectrum seen by these devices, several shields of various thicknesses made of concrete or polyethylene, with or without cadmium covers, were placed in the PROSPERO cell. These devices allow the study of criticality accident spectra in several environments, from metal to pseudo-liquid. The fluxes measured by the SNAC2 devices were compared with TRIPOLI-4 calculations. (authors)
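The three-component spectrum-shape hypothesis used for the unfolding can be written down directly; a sketch, with the component weights as the unknowns an unfolding code would fit to the measured reaction rates (the Watt parameters shown are the common U-235 thermal-fission values, an assumption on our part):

import numpy as np

def three_component_spectrum(E, a_th, a_epi, a_fast, kT=2.53e-8, a_watt=0.988, b_watt=2.249):
    # E in MeV; kT is the thermal Maxwellian temperature (0.0253 eV at 20 C)
    maxwell = (E / kT**2) * np.exp(-E / kT)                     # thermal range
    epithermal = 1.0 / E                                        # epithermal range
    watt = np.exp(-E / a_watt) * np.sinh(np.sqrt(b_watt * E))   # high-energy range
    return a_th * maxwell + a_epi * epithermal + a_fast * watt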
NASA Astrophysics Data System (ADS)
Watson, Clifton L.; Biswas, Subir
2014-06-01
With an increasing demand for spectrum, dynamic spectrum access (DSA) has been proposed as a viable means for providing the flexibility and greater access to spectrum necessary to meet this demand. Within the DSA concept, unlicensed secondary users temporarily "borrow" or access licensed spectrum, while respecting the licensed primary user's rights to that spectrum. As key enablers for DSA, cognitive radios (CRs) are based on software-defined radios which allow them to sense, learn, and adapt to the spectrum environment. These radios can operate independently and rapidly switch channels. Thus, the initial setup and maintenance of cognitive radio networks are dependent upon the ability of CR nodes to find each other, in a process known as rendezvous, and create a link on a common channel for the exchange of data and control information. In this paper, we propose a novel rendezvous protocol, known as QLP, which is based on Q-learning and the p-persistent CSMA protocol. With the QLP protocol, CR nodes learn which channels are best for rendezvous and thus adapt their behavior to visit those channels more frequently. We demonstrate through simulation that the QLP protocol provides a rendezvous capability for DSA environments with different dynamics of primary user (PU) activity, while attempting to achieve the following performance goals: (1) minimize the average time-to-rendezvous, (2) maximize system throughput, (3) minimize primary user interference, and (4) minimize collisions among CR nodes.
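The core of such a scheme is a Q-value per channel, updated after each visit; a minimal stateless sketch (the reward design and epsilon-greedy exploration below are our assumptions, not necessarily QLP's exact formulation):

import random

def update_q(q, channel, reward, alpha=0.1, gamma=0.9):
    # q maps channel -> learned rendezvous value; reward might be +1 for a
    # successful rendezvous and -1 when PU activity is sensed (our assumption)
    best_next = max(q.values())
    q[channel] += alpha * (reward + gamma * best_next - q[channel])

def choose_channel(q, epsilon=0.1, rng=random):
    # Epsilon-greedy visitation: mostly visit the highest-valued channel
    if rng.random() < epsilon:
        return rng.choice(list(q))
    return max(q, key=q.get)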
FPGA-based RF spectrum merging and adaptive hopset selection
NASA Astrophysics Data System (ADS)
McLean, R. K.; Flatley, B. N.; Silvius, M. D.; Hopkinson, K. M.
The radio frequency (RF) spectrum is a limited resource. Spectrum allotment disputes stem from this scarcity, as many radio devices are confined to a fixed frequency or frequency sequence. One alternative is to incorporate cognition within a reconfigurable radio platform, enabling the radio to adapt to dynamic RF spectrum environments. In this way, the radio is able to actively sense the RF spectrum, decide, and act accordingly, thereby sharing the spectrum and operating in a more flexible manner. In this paper, we present a novel solution for merging many distributed RF spectrum maps into one map and for subsequently creating an adaptive hopset. We also provide an example of our system in operation, the result of which is a pseudorandom adaptive hopset. The paper then presents a novel hardware design for the frequency merger and adaptive hopset selector, both of which are written in VHDL and implemented as a custom IP core on an FPGA-based embedded system using the Xilinx Embedded Development Kit (EDK) software tool. The design of the custom IP core is optimized for area, and it can process a high-volume digital input via a low-latency circuit architecture. The complete embedded system includes the Xilinx PowerPC microprocessor, UART serial connection, and compact flash memory card IP cores, and our custom map merging/hopset selection IP core, all of which are targeted to the Virtex IV FPGA. This system is then incorporated into a cognitive radio prototype on a Rice University Wireless Open Access Research Platform (WARP) reconfigurable radio.
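The paper's merger and selector are implemented in VHDL on the FPGA; a behavioral sketch of one plausible merge rule and hopset selection in Python (the max-energy fusion rule and the threshold are assumptions for illustration, not the IP core's actual logic):

import random

def merge_maps(maps):
    # Fuse distributed occupancy maps by taking, per channel, the maximum
    # observed energy (a conservative, OR-style rule)
    return [max(vals) for vals in zip(*maps)]

def adaptive_hopset(merged, threshold, k, seed=0xBEEF):
    # Keep channels whose merged energy is below threshold, then order them
    # pseudorandomly to form the adaptive hopset
    free = [ch for ch, e in enumerate(merged) if e < threshold]
    random.Random(seed).shuffle(free)
    return free[:k]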
ERIC Educational Resources Information Center
Kawada, Taku; Ando, Akinobu; Saito, Hirotaka; Uekida, Jun; Nagai, Nobuyuki; Takeshima, Hisashi; Davis, Darold
2016-01-01
In this paper, we developed two kinds of application software that run on a mobile/wearable device for students with autism spectrum disorder, intellectual disabilities, or physical challenges. One of the applications is an expression detector/evaluator using a smartphone and a small expression sensor for social skill training. This sensor can…
ERIC Educational Resources Information Center
Aji, Chadia Affane; Khan, M. Javed
2015-01-01
Student engagement is an essential element for learning. Active learning has been consistently shown to increase student engagement and hence learning. Hands-on activities are one of many active learning approaches. These activities vary from structured laboratory experiments on one end of the spectrum to virtual gaming environments and…
ERIC Educational Resources Information Center
Thomeer, Marcus L.; Smith, Rachael A.; Lopata, Christopher; Volker, Martin A.; Lipinski, Alanna M.; Rodgers, Jonathan D.; McDonald, Christin A.; Lee, Gloria K.
2015-01-01
This randomized controlled trial evaluated the efficacy of a computer software program (i.e., "Mind Reading") and in vivo rehearsal treatment on the emotion decoding and encoding skills, autism symptoms, and social skills of 43 children, ages 7-12 years, with high-functioning autism spectrum disorder (HFASD). Children in treatment (n = 22)…
Proceedings of the 14th Annual Software Engineering Workshop
NASA Technical Reports Server (NTRS)
1989-01-01
Several software-related topics are presented. Topics covered include studies and experiments at the Software Engineering Laboratory at the Goddard Space Flight Center, predicting project success from the Software Project Management Process, software environments, testing in a reuse environment, domain-directed reuse, and classification tree analysis using the Amadeus measurement and empirical analysis.
Design and validation of Segment--freely available software for cardiovascular image analysis.
Heiberg, Einar; Sjögren, Jane; Ugander, Martin; Carlsson, Marcus; Engblom, Henrik; Arheden, Håkan
2010-01-11
Commercially available software for cardiovascular image analysis often has limited functionality and frequently lacks the careful validation that is required for clinical studies. We have already implemented a cardiovascular image analysis software package and released it as freeware for the research community. However, it was distributed as a stand-alone application and other researchers could not extend it by writing their own custom image analysis algorithms. We believe that the work required to make a clinically applicable prototype can be reduced by making the software extensible, so that researchers can develop their own modules or improvements. Such an initiative might then serve as a bridge between image analysis research and cardiovascular research. The aim of this article is therefore to present the design and validation of a cardiovascular image analysis software package (Segment) and to announce its release in a source code format. Segment can be used for image analysis in magnetic resonance imaging (MRI), computed tomography (CT), single photon emission computed tomography (SPECT) and positron emission tomography (PET). Some of its main features include loading of DICOM images from all major scanner vendors, simultaneous display of multiple image stacks and plane intersections, automated segmentation of the left ventricle, quantification of MRI flow, tools for manual and general object segmentation, quantitative regional wall motion analysis, myocardial viability analysis and image fusion tools. Here we present an overview of the validation results and validation procedures for the functionality of the software. We describe a technique to ensure continued accuracy and validity of the software by implementing and using a test script that tests the functionality of the software and validates the output. The software has been made freely available for research purposes in a source code format on the project home page http://segment.heiberg.se. Segment is a well-validated comprehensive software package for cardiovascular image analysis. It is freely available for research purposes provided that relevant original research publications related to the software are cited.
ASERA: A Spectrum Eye Recognition Assistant
NASA Astrophysics Data System (ADS)
Yuan, Hailong; Zhang, Haotong; Zhang, Yanxia; Lei, Yajuan; Dong, Yiqiao; Zhao, Yongheng
2018-04-01
ASERA, A Spectrum Eye Recognition Assistant, aids in quasar spectral recognition and redshift measurement and can also be used to recognize various types of spectra of stars, galaxies, and AGNs (Active Galactic Nuclei). This interactive software allows users to visualize observed spectra, superimpose template spectra from the Sloan Digital Sky Survey (SDSS), and interactively access related spectral line information. ASERA is an efficient and user-friendly semi-automated toolkit for the accurate classification of spectra observed by LAMOST (the Large Sky Area Multi-object Fiber Spectroscopic Telescope) and is available as a standalone Java application and as a Java applet. The software offers several functions, including wavelength and flux scale settings, zoom in and out, redshift estimation, and spectral line identification.
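The redshift estimate behind such template matching follows directly from the wavelength shift of an identified line; a minimal sketch:

def estimate_redshift(lambda_obs, lambda_rest):
    # z from a matched spectral line: fractional wavelength shift
    return lambda_obs / lambda_rest - 1.0

# Example: Lyman-alpha (rest 1215.67 Angstrom) observed at 4862.7 Angstrom
# gives z of about 3.0.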
User-driven integrated software lives: ``Paleomag'' paleomagnetics analysis on the Macintosh
NASA Astrophysics Data System (ADS)
Jones, Craig H.
2002-12-01
"PaleoMag," a paleomagnetics analysis package originally developed for the Macintosh operating system in 1988, allows examination of demagnetization of individual samples and analysis of directional data from collections of samples. Prior to recent reinvigorated development of the software for both Macintosh and Windows, it was widely used despite not running properly on machines and operating systems sold after 1995. This somewhat surprising situation demonstrates that there is a continued need for integrated analysis software within the earth sciences, in addition to well-developed scripting and batch-mode software. One distinct advantage of software like PaleoMag is in the ability to combine quality control with analysis within a unique graphical environment. Because such demands are frequent within the earth sciences, means of nurturing the development of similar software should be found.
Dynamic load synthesis for shock numerical simulation in space structure design
NASA Astrophysics Data System (ADS)
Monti, Riccardo; Gasbarri, Paolo
2017-08-01
Pyroshock loads are, from a mechanical point of view, the most stressing environments that space equipment experiences during its operating life. In general, the mechanical designer considers the pyroshock analysis a very demanding constraint. Unfortunately, due to the non-linear behaviour of the structure under such loads, only experimental tests can demonstrate whether it is able to withstand these dynamic loads. Taking all the previous considerations into account, some preliminary information about the design correctness can be obtained by performing "ad hoc" numerical simulations, for example via commercial finite element software (e.g., MSC Nastran). Usually these numerical tools approach the shock solution in two ways: 1) a direct mode, using a time-dependent enforcement and evaluating the time-response and space-response as well as the internal forces; 2) a modal-basis approach, considering a frequency-dependent load and evaluating internal forces in the frequency domain. The main aim of this paper is to develop a numerical tool to synthesize the time-dependent enforcement, based on deterministic and/or genetic-algorithm optimisers. In particular, starting from a specified spectrum in terms of the SRS (Shock Response Spectrum), a time-dependent discrete function, typically an acceleration profile, is obtained to force the equipment, simulating the shock event. The synthesis time and the interface with standard numerical codes are two of the main topics dealt with in the paper. In addition, a congruity and consistency methodology is presented to ensure that the identified time-dependent loads fully match the specified spectrum.
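For context, the SRS that any synthesized acceleration profile must match is itself computed by sweeping a single-degree-of-freedom oscillator over natural frequencies; a SciPy sketch (the maximax absolute-acceleration convention and Q = 10 are common choices, assumed here, not necessarily the paper's):

import numpy as np
from scipy import signal

def shock_response_spectrum(accel, dt, natural_freqs, Q=10.0):
    # Maximax absolute-acceleration SRS: peak SDOF response per natural frequency
    zeta = 1.0 / (2.0 * Q)
    t = np.arange(len(accel)) * dt
    srs = []
    for fn in natural_freqs:
        wn = 2.0 * np.pi * fn
        # base-excitation to absolute-acceleration transfer function
        sdof = signal.TransferFunction([2*zeta*wn, wn**2], [1.0, 2*zeta*wn, wn**2])
        _, y, _ = signal.lsim(sdof, accel, t)
        srs.append(np.max(np.abs(y)))
    return np.array(srs)

# A synthesis loop would iteratively adjust the input time history until this
# SRS matches the specified one.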
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-24
... Comment Request: Autism Spectrum Disorder Research Portfolio Analysis SUMMARY: In compliance with the... the proposed project, contact: The Office of Autism Research Coordination, NIMH, NIH, Neuroscience.... Proposed Collection: Autism Spectrum Disorder (ASD) Research Portfolio Analysis, 0925--NEW--National...
NASA Technical Reports Server (NTRS)
Gregory, J. C.; Smith, A. E.
1994-01-01
BUGS-4 (Bristol University Gas Scintillator-4) made its maiden engineering flight from Fort Sumner (NM) on the 29th of September 1993. The instrument was consumed by fire after striking a power line during landing, following 24 hours at float. The analysis of the telemetered data from this sophisticated instrument is a demanding task. Early analysis was compromised by electronic artifacts. Unravelling these problems has been difficult and time-consuming, especially as the flight hardware was burned beyond salvage, but it is an essential preliminary to analysis. During this report period we have concentrated on a small sub-set of the data (the first 30,000 events; 90 minutes at float) and developed software algorithms to correct systematic errors. Using these corrected events we have begun to develop the analysis algorithms. Although the analysis is preliminary, and restricted to the first 30,000 events, the results are encouraging and suggest the design concepts are well matched to this application. Further work will refine the analysis and allow quantitative evaluation of the concepts employed in BUGS-4 for applicability to future instruments. We believe this work will justify fabrication of a new instrument employing techniques deployed on BUGS-4.
Semantic Metrics for Analysis of Software
NASA Technical Reports Server (NTRS)
Etzkorn, Letha H.; Cox, Glenn W.; Farrington, Phil; Utley, Dawn R.; Ghalston, Sampson; Stein, Cara
2005-01-01
A recently conceived suite of object-oriented software metrics focuses on semantic aspects of software, in contradistinction to traditional software metrics, which focus on syntactic aspects. Semantic metrics represent a more human-oriented view of software than do syntactic metrics. The semantic metrics of a given computer program are calculated from the output of a knowledge-based analysis of the program, and are substantially more representative of software quality and more readily comprehensible from a human perspective than are syntactic metrics.
Polley, Nabarun; Saha, Srimoyee; Singh, Soumendra; Adhikari, Aniruddha; Das, Sukhen; Choudhury, Bhaskar Roy; Pal, Samir Kumar
2015-06-01
Jaundice is one of the notable markers of liver malfunction, revealing a significant rise in the concentration of the endogenous yellow pigment bilirubin. We describe a method for measuring the optical spectrum of the conjunctiva and deriving the pigment concentration using diffuse reflection measurements. The method uses no prior model and is expected to work across races (skin colors) and a wide range of age groups. An optical fiber-based setup capable of measuring the conjunctival absorption spectrum from 400 to 800 nm is used to monitor the level of bilirubin and is calibrated against the value measured from blood serum of the same human subject. We have also developed software on the LabVIEW platform for online monitoring of bilirubin levels in human subjects by nonexperts. The results demonstrate that the relative absorption at 460 and 600 nm has a distinct correlation with the bilirubin concentration measured from blood serum. Statistical analysis revealed that our proposed method is in agreement with the conventional biochemical method. This innovative noncontact, low-cost technique is expected to be important for monitoring jaundice in developing/underdeveloped countries, where inexpensive diagnosis of jaundice with minimally trained manpower is essential.
Label-free biodetection using a smartphone.
Gallegos, Dustin; Long, Kenneth D; Yu, Hojeong; Clark, Peter P; Lin, Yixiao; George, Sherine; Nath, Pabitra; Cunningham, Brian T
2013-06-07
Utilizing its integrated camera as a spectrometer, we demonstrate the use of a smartphone as the detection instrument for a label-free photonic crystal biosensor. A custom-designed cradle holds the smartphone in fixed alignment with optical components, allowing for accurate and repeatable measurements of shifts in the resonant wavelength of the sensor. Externally provided broadband light incident upon an entrance pinhole is subsequently collimated and linearly polarized before passing through the biosensor, which resonantly reflects only a narrow band of wavelengths. A diffraction grating spreads the remaining wavelengths over the camera's pixels to display a high resolution transmission spectrum. The photonic crystal biosensor is fabricated on a plastic substrate and attached to a standard glass microscope slide that can easily be removed and replaced within the optical path. A custom software app was developed to convert the camera images into the photonic crystal transmission spectrum in the visible wavelength range, including curve-fitting analysis that computes the photonic crystal resonant wavelength with 0.009 nm accuracy. We demonstrate the functionality of the system through detection of an immobilized protein monolayer, and selective detection of concentration-dependent antibody binding to a functionalized photonic crystal. We envision the capability for an inexpensive, handheld biosensor instrument with web connectivity to enable point-of-care sensing in environments that have not been practical previously.
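A sketch of the kind of curve fit involved, assuming a Lorentzian-shaped resonance feature (the app's actual fit function is not specified here):

import numpy as np
from scipy.optimize import curve_fit

def lorentzian(lam, lam0, gamma, a, c):
    # Lorentzian line shape centred at lam0 with half-width gamma
    return a * gamma**2 / ((lam - lam0)**2 + gamma**2) + c

def resonant_wavelength(wavelengths, intensity):
    # Fit around the strongest feature; use argmin instead for a transmission dip
    i0 = np.argmax(intensity)
    p0 = [wavelengths[i0], 1.0, intensity[i0] - intensity.min(), intensity.min()]
    popt, _ = curve_fit(lorentzian, wavelengths, intensity, p0=p0)
    return popt[0]   # sub-pixel estimate of the resonance centre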
Adaptive cyber-attack modeling system
NASA Astrophysics Data System (ADS)
Gonsalves, Paul G.; Dougherty, Edward T.
2006-05-01
The pervasiveness of software and networked information systems is evident across a broad spectrum of business and government sectors. Such reliance provides ample opportunity not only for the nefarious exploits of lone-wolf computer hackers, but for more systematic software attacks from organized entities. Much effort and focus has been placed on preventing and ameliorating network and OS attacks; a concomitant emphasis is required to address protection of mission-critical software. Typical software protection technique and methodology evaluation, verification, and validation (V&V) involves the use of a team of subject matter experts (SMEs) to mimic potential attackers or hackers. This manpower-intensive, time-consuming, and potentially cost-prohibitive approach is not amenable to performing the multiple non-subjective analyses required to support quantifying software protection levels. To facilitate the evaluation and V&V of software protection solutions, we have designed and developed a prototype adaptive cyber-attack modeling system. Our approach integrates an off-line mechanism for rapid construction of Bayesian belief network (BN) attack models with an on-line model instantiation, adaptation, and knowledge acquisition scheme. Off-line model construction is supported via a knowledge elicitation approach for identifying key domain requirements and a process for translating these requirements into a library of BN-based cyber-attack models. On-line attack modeling and knowledge acquisition is supported via BN evidence propagation and model parameter learning.
Software Reliability Analysis of NASA Space Flight Software: A Practical Experience
Sukhwani, Harish; Alonso, Javier; Trivedi, Kishor S.; Mcginnis, Issac
2017-01-01
In this paper, we present the software reliability analysis of the flight software of a recently launched space mission. For our analysis, we use the defect reports collected during the flight software development. We find that this software was developed in multiple releases, each release spanning across all software life-cycle phases. We also find that the software releases were developed and tested for four different hardware platforms, spanning from off-the-shelf or emulation hardware to actual flight hardware. For releases that exhibit reliability growth or decay, we fit Software Reliability Growth Models (SRGM); otherwise we fit a distribution function. We find that most releases exhibit reliability growth, with Log-Logistic (NHPP) and S-Shaped (NHPP) as the best-fit SRGMs. For the releases that experience reliability decay, we investigate the causes for the same. We find that such releases were the first software releases to be tested on a new hardware platform, and hence they encountered major hardware integration issues. Also such releases seem to have been developed under time pressure in order to start testing on the new hardware platform sooner. Such releases exhibit poor reliability growth, and hence exhibit high predicted failure rate. Other problems include hardware specification changes and delivery delays from vendors. Thus, our analysis provides critical insights and inputs to the management to improve the software development process. As NASA has moved towards a product line engineering for its flight software development, software for future space missions will be developed in a similar manner and hence the analysis results for this mission can be considered as a baseline for future flight software missions. PMID:29278255
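As a concrete illustration of fitting an SRGM, the Goel-Okumoto NHPP model, with mean value function m(t) = a(1 - e^(-bt)), is among the simplest; the paper's best fits were Log-Logistic and S-Shaped NHPP models, and the defect counts below are invented for the example:

import numpy as np
from scipy.optimize import curve_fit

def goel_okumoto(t, a, b):
    # NHPP mean value function: expected cumulative defects found by time t
    return a * (1.0 - np.exp(-b * t))

# Illustrative data: cumulative defects at the end of each test week
weeks = np.arange(1, 11)
cum_defects = np.array([8, 15, 20, 24, 27, 29, 31, 32, 33, 33])
(a, b), _ = curve_fit(goel_okumoto, weeks, cum_defects, p0=[40.0, 0.2])
remaining = a - cum_defects[-1]   # expected defects still latent in the release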
Study on environment detection and appraisement of mining area with RS
NASA Astrophysics Data System (ADS)
Yang, Fengjie; Hou, Peng; Zhou, Guangzhu; Li, Qingting; Wang, Jie; Cheng, Jianguang
2006-12-01
In this paper, the large Yanzhou coal mining area is selected as the typical research area. Given the particular dynamic change characteristics of the environment in a mining area, environmental dynamic changes are monitored in a timely manner with remote sensing detection technology. Specific environmental factors, such as vegetation, water, air, and land cover, are extracted with professional remote sensing image processing software, and the spatial information is then managed and analyzed in geographical information system (GIS) software. As a result, dynamic monitoring and querying of change information is achieved, and dynamic change maps of the specific environmental factors are produced. On the basis of data from remote sensing imagery, GIS, and traditional environmental monitoring, the environmental quality is appraised with the methods of fuzzy matrix analysis, multiple indices, and the analytic hierarchy process. These provide a credible scientific foundation for local environmental appraisal and sustainable development. In addition, this paper applies hyperspectral measurements from a FieldSpec Pro spectroradiometer, together with environmental chemical analysis data, to study the growth of vegetation seeded on land cover consisting of gangue, which is a new method for studying the impact on vegetation growing in such soil.
GlycoWorkbench: a tool for the computer-assisted annotation of mass spectra of glycans.
Ceroni, Alessio; Maass, Kai; Geyer, Hildegard; Geyer, Rudolf; Dell, Anne; Haslam, Stuart M
2008-04-01
Mass spectrometry is the main analytical technique currently used to address the challenges of glycomics as it offers unrivalled levels of sensitivity and the ability to handle complex mixtures of different glycan variations. Determination of glycan structures from analysis of MS data is a major bottleneck in high-throughput glycomics projects, and robust solutions to this problem are of critical importance. However, all the approaches currently available have inherent restrictions to the type of glycans they can identify, and none of them have proved to be a definitive tool for glycomics. GlycoWorkbench is a software tool developed by the EUROCarbDB initiative to assist the manual interpretation of MS data. The main task of GlycoWorkbench is to evaluate a set of structures proposed by the user by matching the corresponding theoretical list of fragment masses against the list of peaks derived from the spectrum. The tool provides an easy to use graphical interface, a comprehensive and increasing set of structural constituents, an exhaustive collection of fragmentation types, and a broad list of annotation options. The aim of GlycoWorkbench is to offer complete support for the routine interpretation of MS data. The software is available for download from: http://www.eurocarbdb.org/applications/ms-tools.
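The core matching step, comparing a theoretical fragment list against observed peaks within a mass tolerance, can be sketched as follows (the data layout and tolerance are illustrative, not GlycoWorkbench's internal representation):

def annotate_peaks(peaks, theoretical, tol_da=0.2):
    # peaks: list of (mz, intensity); theoretical: dict of fragment name -> mass.
    # Every peak within tol_da of a predicted fragment mass gets that annotation.
    annotations = []
    for mz, intensity in peaks:
        for name, mass in theoretical.items():
            if abs(mz - mass) <= tol_da:
                annotations.append((mz, intensity, name, mz - mass))
    return annotations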
Modeling Tool Advances Rotorcraft Design
NASA Technical Reports Server (NTRS)
2007-01-01
Continuum Dynamics Inc. (CDI), founded in 1979, specializes in advanced engineering services, including fluid dynamic modeling and analysis for aeronautics research. The company has completed a number of SBIR research projects with NASA, including early rotorcraft work done through Langley Research Center, but more recently, out of Ames Research Center. NASA Small Business Innovation Research (SBIR) grants on helicopter wake modeling resulted in the Comprehensive Hierarchical Aeromechanics Rotorcraft Model (CHARM), a tool for studying helicopter and tiltrotor unsteady free wake modeling, including distributed and integrated loads, and performance prediction. Application of the software code in a blade redesign program for Carson Helicopters, of Perkasie, Pennsylvania, increased the payload and cruise speeds of its S-61 helicopter. Follow-on development resulted in a $24 million revenue increase for Sikorsky Aircraft Corporation, of Stratford, Connecticut, as part of the company's rotor design efforts. Now under continuous development for more than 25 years, CHARM models the complete aerodynamics and dynamics of rotorcraft in general flight conditions. CHARM has been used to model a broad spectrum of rotorcraft attributes, including performance, blade loading, blade-vortex interaction noise, air flow fields, and hub loads. The highly accurate software is currently in use by all major rotorcraft manufacturers, NASA, the U.S. Army, and the U.S. Navy.
NASA Technical Reports Server (NTRS)
Simons, Rainee N.; Wintucky, Edwin G.; Landon, David G.; Sun, Jun Y.; Winn, James S.; Laraway, Stephen; McIntire, William K.; Metz, John L.; Smith, Francis J.
2011-01-01
The paper presents the first-ever research and experimental results on the combination of a software-defined multi-Gbps modem and a broadband high-power space amplifier, tested with an extended form of the industry-standard DVB-S2 and an LDPC rate 9/10 FEC codec. The modem supports waveforms including QPSK, 8-PSK, 16-APSK, 32-APSK, 64-APSK, and 128-QAM. The broadband high-power amplifier is a space-qualified traveling-wave tube (TWT), which has a passband greater than 3 GHz at 33 GHz, output power of 200 W, and efficiency greater than 60 percent. The modem and the TWTA together enabled an unprecedented data rate of 20 Gbps with a low BER of 10^-9. The presented results include a plot of the received waveform constellation, BER vs. Eb/N0, and the implementation loss for each of the modulation types tested. These results, when included in an RF link budget analysis, show that NASA's payload data rate can be increased by at least an order of magnitude (greater than 10X) over the current state of practice, limited only by the spacecraft EIRP, ground receiver G/T, range, and available spectrum or bandwidth.
Better software, better research: the challenge of preserving your research and your reputation
NASA Astrophysics Data System (ADS)
Chue Hong, N.
2017-12-01
Software is fundamental to research. From short, thrown-together temporary scripts, through an abundance of complex spreadsheets analysing collected data, to the hundreds of software engineers and millions of lines of code behind international efforts such as the Large Hadron Collider and the Square Kilometre Array, software has made an invaluable contribution to advancing our research knowledge. Within the earth and space sciences, data is being generated, collected, processed and analysed in ever greater amounts and detail. However the pace of this improvement leads to challenges around the persistence of research outputs and artefacts. A specific challenge in this field is that often experiments and measurements cannot be repeated, yet the infrastructure used to manage, store and process this data must be continually updated and developed: constant change just to stay still. The UK-based Software Sustainability Institute (SSI) aims to improve research software sustainability, working with researchers, funders, research software engineers, managers, and other stakeholders across the research spectrum. In this talk, I will present lessons learned and good practice based on the work of the Institute and its collaborators. I will summarise some of the work that is being done to improve the integration of infrastructure for managing research outputs, including around software citation and reward, extending data management plans, and improving researcher skills: "better software, better research". Ultimately, being a modern researcher in the geosciences requires you to efficiently balance the pursuit of new knowledge with making your work reusable and reproducible. And as scientists are placed under greater scrutiny about whether others can trust their results, the preservation of your artefacts has a key role in the preservation of your reputation.
Learn by Yourself: The Self-Learning Tools for Qualitative Analysis Software Packages
ERIC Educational Resources Information Center
Freitas, Fábio; Ribeiro, Jaime; Brandão, Catarina; Reis, Luís Paulo; de Souza, Francislê Neri; Costa, António Pedro
2017-01-01
Computer Assisted Qualitative Data Analysis Software (CAQDAS) are tools that help researchers to develop qualitative research projects. These software packages help the users with tasks such as transcription analysis, coding and text interpretation, writing and annotation, content search and analysis, recursive abstraction, grounded theory…
Kubios HRV--heart rate variability analysis software.
Tarvainen, Mika P; Niskanen, Juha-Pekka; Lipponen, Jukka A; Ranta-Aho, Perttu O; Karjalainen, Pasi A
2014-01-01
Kubios HRV is an advanced, easy-to-use software package for heart rate variability (HRV) analysis. The software supports several input data formats for electrocardiogram (ECG) data and beat-to-beat RR interval data. It includes an adaptive QRS detection algorithm and tools for artifact correction, trend removal, and analysis sample selection. The software computes all the commonly used time-domain and frequency-domain HRV parameters and several nonlinear parameters. There are several adjustable analysis settings through which the analysis methods can be optimized for different data. The ECG-derived respiratory frequency is also computed, which is important for reliable interpretation of the analysis results. The analysis results can be saved as an ASCII text file (easy to import into MS Excel or SPSS), a Matlab MAT-file, or a PDF report. The software is easy to use through its compact graphical user interface. The software is available free of charge for Windows and Linux operating systems at http://kubios.uef.fi.
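Two of the standard time-domain parameters such software computes, SDNN and RMSSD, have simple textbook definitions; a minimal sketch:

import numpy as np

def time_domain_hrv(rr_ms):
    # SDNN: overall variability; RMSSD: beat-to-beat (short-term) variability
    rr = np.asarray(rr_ms, dtype=float)
    sdnn = np.std(rr, ddof=1)
    rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))
    return sdnn, rmssd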
An online database for plant image analysis software tools.
Lobet, Guillaume; Draye, Xavier; Périlleux, Claire
2013-10-09
Recent years have seen an increase in methods for plant phenotyping using image analyses. These methods require new software solutions for data extraction and treatment. These solutions are instrumental in supporting various research pipelines, ranging from the localisation of cellular compounds to the quantification of tree canopies. However, due to the variety of existing tools and the lack of a central repository, it is challenging for researchers to identify the software best suited to their research. We present an online, manually curated database referencing more than 90 plant image analysis software solutions. The website, plant-image-analysis.org, presents each software package in a uniform and concise manner, enabling users to identify the available solutions for their experimental needs. The website also enables user feedback, evaluations, and new software submissions. The plant-image-analysis.org database provides an overview of existing plant image analysis software. The aim of such a toolbox is to help users find solutions, and to provide developers a way to exchange and communicate about their work.
GWAMA: software for genome-wide association meta-analysis.
Mägi, Reedik; Morris, Andrew P
2010-05-28
Despite the recent success of genome-wide association studies in identifying novel loci contributing effects to complex human traits, such as type 2 diabetes and obesity, much of the genetic component of variation in these phenotypes remains unexplained. One way to improve power to detect further novel loci is through meta-analysis of studies from the same population, increasing the sample size over any individual study. Although statistical analysis software packages incorporate routines for meta-analysis, they are ill-equipped to meet the challenges of the scale and complexity of data generated in genome-wide association studies. We have developed flexible, open-source software for the meta-analysis of genome-wide association studies. The software incorporates a variety of error trapping facilities and provides a range of meta-analysis summary statistics. The software is distributed with scripts that allow simple formatting of files containing the results of each association study and generate graphical summaries of genome-wide meta-analysis results. The GWAMA (Genome-Wide Association Meta-Analysis) software has been developed to perform meta-analysis of summary statistics generated from genome-wide association studies of dichotomous phenotypes or quantitative traits. Software with source files, documentation and example data files are freely available online at http://www.well.ox.ac.uk/GWAMA.
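The central computation in such tools is fixed-effects inverse-variance weighting of per-study estimates; a minimal sketch (GWAMA itself adds error trapping, file handling, and further summary statistics):

import numpy as np

def inverse_variance_meta(betas, ses):
    # Weight each study's effect estimate by 1/SE^2 and pool
    betas = np.asarray(betas, dtype=float)
    w = 1.0 / np.asarray(ses, dtype=float) ** 2
    beta = np.sum(w * betas) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return beta, se, beta / se   # pooled effect, its SE, and Z score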
Zhao, An-Xin; Tang, Xiao-Jun; Zhang, Zhong-Hua; Liu, Jun-Hua
2014-10-01
Generalized two-dimensional correlation spectroscopy combined with Fourier transform infrared (FTIR) spectroscopy was used to identify hydrocarbon isomers in mixed gases by enhancing the resolution of their absorption spectra. The FTIR spectra of n-butane and iso-butane, and the two-dimensional correlation infrared spectra under concentration perturbation, were analyzed as an example. Over the full band and at the main absorption peak wavelengths, the FTIR spectra of the single-component gases are similar; when the gases are mixed, the absorption peaks overlap and are difficult to identify. The synchronous and asynchronous two-dimensional correlation spectra can clearly distinguish iso-butane from n-butane and resolve their respective characteristic absorption peak intensities. Iso-butane has strong characteristic absorption lines at 2,893 and 2,954 cm-1, and n-butane at 2,895 and 2,965 cm-1. The analysis in this paper preliminarily verifies that two-dimensional infrared correlation spectroscopy can be used for resolution enhancement in quantitative FTIR analysis.
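Generalized 2D correlation spectra are computed from the perturbation-ordered dynamic spectra via Noda's formalism; a sketch, where the rows of the input are spectra at successive concentration steps (the implementation details are ours, not the paper's):

import numpy as np

def two_d_correlation(spectra):
    # spectra: (m perturbation steps) x (n wavenumbers) array
    y = spectra - spectra.mean(axis=0)            # dynamic spectra
    m = y.shape[0]
    sync = y.T @ y / (m - 1)                      # synchronous spectrum
    # Hilbert-Noda transform matrix for the asynchronous spectrum
    j, k = np.indices((m, m))
    with np.errstate(divide="ignore"):
        noda = np.where(j == k, 0.0, 1.0 / (np.pi * (k - j)))
    asyn = y.T @ (noda @ y) / (m - 1)             # asynchronous spectrum
    return sync, asyn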
Rafiei, Atefeh; Sleno, Lekha
2015-01-15
Data analysis is a key step in mass spectrometry based untargeted metabolomics, starting with the generation of generic peak lists from raw liquid chromatography/mass spectrometry (LC/MS) data. Due to the use of various algorithms by different workflows, the results of different peak-picking strategies often differ widely. Raw LC/HRMS data from two types of biological samples (bile and urine), as well as a standard mixture of 84 metabolites, were processed with four peak-picking software packages: Peakview®, Markerview™, MetabolitePilot™, and XCMS Online. The overlaps between the results of each peak-generating method were then investigated. To gauge the relevance of the peak lists, a database search using the METLIN online database was performed to determine which features had accurate masses matching known metabolites, with secondary filtering based on MS/MS spectral matching. In this study, only a small proportion of all peaks (less than 10%) was common to all four software programs. Comparison of database searching results showed that peaks found uniquely by one workflow have less chance of being found in the METLIN metabolomics database and are even less likely to be confirmed by MS/MS. It was shown that the performance of peak-generating workflows has a direct impact on untargeted metabolomics results. As peaks found in more than one peak detection workflow have higher potential to be identified by accurate mass as well as MS/MS spectrum matching, it is suggested to use the overlap of different peak-picking workflows as a preliminary peak list for more robust statistical analysis in global metabolomics investigations.
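The suggested overlap of peak lists reduces to tolerance-based matching on accurate mass and retention time; a sketch with illustrative tolerances (the study's actual matching criteria may differ):

def overlapping_features(list_a, list_b, mz_tol=0.01, rt_tol=0.25):
    # Each feature is an (mz, rt) tuple; a feature from list_a is "shared" if
    # any feature in list_b matches within both tolerances
    shared = [a for a in list_a
              if any(abs(a[0] - b[0]) <= mz_tol and abs(a[1] - b[1]) <= rt_tol
                     for b in list_b)]
    return len(shared)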
A Comprehensive, Open-source Platform for Mass Spectrometry-based Glycoproteomics Data Analysis.
Liu, Gang; Cheng, Kai; Lo, Chi Y; Li, Jun; Qu, Jun; Neelamegham, Sriram
2017-11-01
Glycosylation is among the most abundant and diverse protein post-translational modifications (PTMs) identified to date. The structural analysis of this PTM is challenging because of the diverse monosaccharides, which are not conserved among organisms, the branched nature of glycans, their isomeric structures, and heterogeneity in the glycan distribution at a given site. Glycoproteomics experiments have adopted the traditional high-throughput LC-MSn proteomics workflow to analyze site-specific glycosylation. However, comprehensive computational platforms for data analyses are scarce. To address this limitation, we present a comprehensive, open-source, modular software for glycoproteomics data analysis called GlycoPAT (GlycoProteomics Analysis Toolbox; freely available from www.VirtualGlycome.org/glycopat). The program includes three major advances: (1) "SmallGlyPep," a minimal linear representation of glycopeptides for MSn data analysis. This format allows facile serial fragmentation of both the peptide backbone and PTM at one or more locations. (2) A novel scoring scheme based on calculation of the "Ensemble Score (ES)," a measure that scores and rank-orders MS/MS spectra for N- and O-linked glycopeptides using cross-correlation and probability based analyses. (3) A false discovery rate (FDR) calculation scheme where decoy glycopeptides are created by simultaneously scrambling the amino acid sequence and by introducing artificial monosaccharides by perturbing the original sugar mass. Parallel computing facilities and user-friendly GUIs (Graphical User Interfaces) are also provided. GlycoPAT is used to catalogue site-specific glycosylation on simple glycoproteins, standard protein mixtures, and human plasma cryoprecipitate samples in three common MS/MS fragmentation modes: CID, HCD and ETD. It is also used to identify 960 unique glycopeptides in cell lysates from prostate cancer cells. The results show that the simultaneous consideration of peptide and glycan fragmentation is necessary for high quality MSn spectrum annotation in CID and HCD fragmentation modes. Additionally, they confirm the suitability of GlycoPAT to analyze shotgun glycoproteomics data.
Liu, Fei; Wang, Yuan-zhong; Yang, Chun-yan; Jin, Hang
2015-01-01
The genuineness and producing area of Panax notoginseng were studied based on infrared spectroscopy combined with discriminant analysis. The infrared spectra of 136 taproots of P. notoginseng from 13 planting points in 11 counties were collected, and the second-derivative spectra were calculated with Omnic 8.0 software. The infrared spectra and their second-derivative spectra in the range 1,800-700 cm-1 were used to build models by stepwise discriminant analysis in order to discriminate the genuineness of P. notoginseng. The model built on the second-derivative spectra showed the better recognition effect for genuineness: the correct rate of re-substitution classification reached 100%, and the prediction accuracy was 93.4%. The stability of the model was tested by cross validation, and the method was subjected to extrapolation validation. The second-derivative spectra combined with the same discriminant analysis method were used to distinguish the producing area of P. notoginseng. Comparing models built on different spectral ranges and different numbers of samples showed that when the model was built by taking 8 samples from each planting point as training samples and using the spectrum in the range 1,500-1,200 cm-1, the recognition effect was better, with the correct rate of re-substitution classification reaching 99.0% and a prediction accuracy of 76.5%. The results indicate that infrared spectroscopy combined with discriminant analysis shows a good recognition effect for the genuineness of P. notoginseng and might be a promising new method for identifying genuineness in practice. The method can recognize the producing area of P. notoginseng to some extent and offers a new approach for identifying the producing area.
1988-10-01
A statistical analysis of the output signals of an acousto-optic spectrum analyzer (AOSA) is performed for the case when the input signal is a… Keywords: signal processing, electronic warfare, radar countermeasures, acousto-optic, spectrum analyzer, statistical analysis, detection, estimation, Canada, modelling.
Software Engineering Improvement Activities/Plan
NASA Technical Reports Server (NTRS)
2003-01-01
bd Systems personnel accomplished the technical responsibilities for this reporting period, as planned. A close working relationship was maintained with personnel of the MSFC Avionics Department Software Group (ED14). Work accomplishments included development, evaluation, and enhancement of a software cost model, performing a literature search and evaluation of software tools available for code analysis and requirements analysis, and participating in other relevant software engineering activities. Monthly reports were submitted. This support was provided to the Flight Software Group/ED14 in accomplishing the software engineering improvement activities of the Marshall Space Flight Center (MSFC) Software Engineering Improvement Plan.
Imai, Shungo; Yamada, Takehiro; Ishiguro, Nobuhisa; Miyamoto, Takenori; Kagami, Keisuke; Tomiyama, Naoki; Niinuma, Yusuke; Nagasaki, Daisuke; Suzuki, Koji; Yamagami, Akira; Kasashi, Kumiko; Kobayashi, Masaki; Iseki, Ken
2017-01-01
Based on the predictive performance in our previous study, we switched the therapeutic drug monitoring (TDM) analysis software for dose setting of vancomycin (VCM) from "Vancomycin MEEK TDM analysis software Ver2.0" (MEEK) to "SHIONOGI-VCM-TDM ver.2009" (VCM-TDM) in January 2015. In the present study, our aim was to validate the effectiveness of changing the VCM TDM analysis software for the initial dose setting of VCM. The enrolled patients were divided into two groups, each having 162 patients in total, who received VCM with the initial dose set using MEEK (MEEK group) or VCM-TDM (VCM-TDM group). We compared the rates of attaining the therapeutic range (trough value 10-20 μg/mL) of serum VCM concentration between the groups. Multivariate logistic regression analysis was performed to confirm that changing the VCM TDM analysis software was an independent factor related to attaining the therapeutic range. Switching the VCM TDM analysis software from MEEK to VCM-TDM improved the rate of attaining the therapeutic range by 21.6% (MEEK group: 42.6% vs. VCM-TDM group: 64.2%, p<0.01). Patient age ≥65 years, concomitant furosemide, and use of VCM-TDM as the TDM analysis software were considered to be independent factors for attaining the therapeutic range. These results demonstrate the effectiveness of switching the VCM TDM analysis software from MEEK to VCM-TDM for the initial dose setting of VCM.
Development of a New VLBI Data Analysis Software
NASA Technical Reports Server (NTRS)
Bolotin, Sergei; Gipson, John M.; MacMillan, Daniel S.
2010-01-01
We present an overview of new VLBI data analysis software under development at NASA GSFC. The new software will replace CALC/SOLVE and many related utility programs. It will have the capabilities of the current system and will incorporate new models and data analysis techniques. In this paper we give a conceptual overview of the new software and formulate its main goals. The software should be flexible and modular, to implement models and estimation techniques that currently exist or will appear in the future. On the other hand, it should be reliable and of production quality for processing standard VLBI sessions. It also needs to be capable of processing observations from a fully deployed network of VLBI2010 stations in a reasonable time. We describe the software development process and outline the software architecture.
Cost-Efficient Phase Noise Measurement
NASA Astrophysics Data System (ADS)
Perić, Ana; Bjelica, Milan
2014-05-01
In this paper, an automated system for oscillator phase noise measurement is described. The system is primarily intended for use in academic institutions, such as smaller university or research laboratories, as it deploys a standard spectrum analyzer and free software. A method to compensate for the effect of the instrument's intrinsic noise is proposed. Through a series of experimental tests, the good performance of our system is verified and compliance with theoretical expectations is demonstrated.
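One standard way to compensate an instrument's intrinsic noise is power-domain subtraction of an instrument-only measurement from the DUT measurement; a sketch (whether this matches the paper's exact method is an assumption on our part):

import numpy as np

def subtract_instrument_noise(l_meas_dbc, l_noise_dbc):
    # Convert dBc/Hz to linear power, subtract the instrument-only floor,
    # convert back. Only meaningful where the DUT noise sits sufficiently
    # above the instrument floor.
    p_meas = 10.0 ** (np.asarray(l_meas_dbc) / 10.0)
    p_noise = 10.0 ** (np.asarray(l_noise_dbc) / 10.0)
    p_dut = np.clip(p_meas - p_noise, 1e-30, None)   # avoid log of <= 0
    return 10.0 * np.log10(p_dut)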
Development of Automated Image Analysis Software for Suspended Marine Particle Classification
2002-09-30
Scott Samson, Center for Ocean Technology…and global water column. The project's objective is to develop automated image analysis software to reduce the effort and time…
Narayanan, Ram M; Pooler, Richard K; Martone, Anthony F; Gallagher, Kyle A; Sherbondy, Kelly D
2018-02-22
This paper describes a multichannel super-heterodyne signal analyzer, called the Spectrum Analysis Solution (SAS), which performs multi-purpose spectrum sensing to support spectrally adaptive and cognitive radar applications. The SAS operates from ultrahigh frequency (UHF) to the S-band and features a wideband channel with eight narrowband channels. The wideband channel acts as a monitoring channel that can be used to tune the instantaneous band of the narrowband channels to areas of interest in the spectrum. The data collected from the SAS has been utilized to develop spectrum sensing algorithms for the budding field of spectrum sharing (SS) radar. Bandwidth (BW), average total power, percent occupancy (PO), signal-to-interference-plus-noise ratio (SINR), and power spectral entropy (PSE) have been examined as metrics for the characterization of the spectrum. These metrics are utilized to determine a contiguous optimal sub-band (OSB) for a SS radar transmission in a given spectrum for different modalities. Three OSB algorithms are presented and evaluated: the spectrum sensing multi objective (SS-MO), the spectrum sensing with brute force PSE (SS-BFE), and the spectrum sensing multi-objective with brute force PSE (SS-MO-BFE).
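Of the metrics listed, the power spectral entropy has a particularly compact definition; a sketch (the normalization and log base are conventional choices, assumed here):

import numpy as np

def power_spectral_entropy(psd):
    # Normalize the sensed PSD into a probability distribution and take its
    # Shannon entropy; low entropy suggests structured occupancy, high entropy
    # suggests a noise-like, relatively open band
    p = np.asarray(psd, dtype=float)
    p = p / p.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))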
Computer-assisted qualitative data analysis software.
Cope, Diane G
2014-05-01
Advances in technology have provided new approaches for data collection methods and analysis for researchers. Data collection is no longer limited to paper-and-pencil format, and numerous methods are now available through Internet and electronic resources. With these techniques, researchers are not burdened with entering data manually and data analysis is facilitated by software programs. Quantitative research is supported by the use of computer software and provides ease in the management of large data sets and rapid analysis of numeric statistical methods. New technologies are emerging to support qualitative research with the availability of computer-assisted qualitative data analysis software (CAQDAS). CAQDAS will be presented with a discussion of advantages, limitations, controversial issues, and recommendations for this type of software use.
Orbiter subsystem hardware/software interaction analysis. Volume 8: Forward reaction control system
NASA Technical Reports Server (NTRS)
Becker, D. D.
1980-01-01
The results of the orbiter hardware/software interaction analysis for the forward reaction control system are presented. The interaction between hardware failure modes and software is examined in order to identify associated issues and risks. All orbiter subsystems and interfacing program elements which interact with the orbiter computer flight software are analyzed. The failure modes identified in the subsystem/element failure mode and effects analysis are discussed.
Sneak Analysis Application Guidelines
1982-06-01
(Front-matter residue: list of figures covering hardware and software program change cost trends, relative and derived software program change costs by phase for airborne and ground/water environments, and total software program change costs.)
Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool
NASA Technical Reports Server (NTRS)
Maul, William A.; Fulton, Christopher E.
2011-01-01
This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool proved to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual.
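The sensitivity study described above can be illustrated with a toy directed-graph diagnostic model in the spirit of TEAMS Designer: a fault counts as detectable when its effects propagate to at least one active test point, and coverage is recomputed as individual tests are removed. The graph, fault, and test names are hypothetical, and this reachability check is only a sketch of the idea, not TEAMS' actual algorithm.

    from collections import deque

    # Hypothetical failure-effect propagation graph: fault sites -> test points
    edges = {"valve": ["line_p"], "line_p": ["sensor_p"], "pump": ["line_p", "sensor_t"]}
    tests = {"sensor_p", "sensor_t"}
    faults = {"valve", "pump"}

    def detectable(fault, active_tests):
        # Breadth-first search: does the fault's effect reach any active test?
        seen, queue = {fault}, deque([fault])
        while queue:
            node = queue.popleft()
            if node in active_tests:
                return True
            for nxt in edges.get(node, []):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        return False

    for lost in sorted(tests):  # sensitivity study: drop one test at a time
        remaining = tests - {lost}
        coverage = sum(detectable(f, remaining) for f in faults) / len(faults)
        print(f"without {lost}: fault coverage = {coverage:.0%}")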
NASA Astrophysics Data System (ADS)
Portnoy, David; Fisher, Brian; Phifer, Daniel
2015-06-01
The detection of radiological and nuclear threats is extremely important to national security. The federal government is spending significant resources developing new detection systems and attempting to increase the performance of existing ones. The detection of illicit radionuclides that may pose a radiological or nuclear threat is a challenging problem complicated by benign radiation sources (e.g., cat litter and medical treatments), shielding, and large variations in background radiation. Although there is a growing acceptance within the community that concentrating efforts on algorithm development (independent of the specifics of fully assembled systems) has the potential for significant overall system performance gains, there are two major hindrances to advancements in gamma spectral analysis algorithms under the current paradigm: access to data and common performance metrics along with baseline performance measures. Because many of the signatures collected during performance measurement campaigns are classified, dissemination to algorithm developers is extremely limited. This leaves developers no choice but to collect their own data if they are lucky enough to have access to material and sensors. This is often combined with their own definition of metrics for measuring performance. These two conditions make it all but impossible for developers and external reviewers to make meaningful comparisons between algorithms. Without meaningful comparisons, performance advancements become very hard to achieve and (more importantly) recognize. The objective of this work is to overcome these obstacles by developing and freely distributing real and synthetically generated gamma-spectra data sets as well as software tools for performance evaluation with associated performance baselines to national labs, academic institutions, government agencies, and industry. At present, datasets for two tracks, or application domains, have been developed: one that includes temporal spectral data at 1 s time intervals, which represents data collected by a mobile system operating in a dynamic radiation background environment; and one that represents static measurements with a foreground spectrum (background plus source) and a background spectrum. These data include controlled variations in both Source Related Factors (nuclide, nuclide combinations, activities, distances, collection times, shielding configurations, and background spectra) and Detector Related Factors (currently only gain shifts, but resolution changes and non-linear energy calibration errors will be added soon). The software tools will allow the developer to evaluate the performance impact of each of these factors. Although this first implementation is somewhat limited in scope, considering only NaI-based detection systems and two application domains, it is hoped that (with community feedback) a wider range of detector types and applications will be included in the future. This article describes the methods used for dataset creation, the software validation/performance measurement tools, the performance metrics used, and examples of baseline performance.
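As an illustration of the Detector Related Factors mentioned above, a gain shift can be injected into a binned spectrum by resampling the channel axis. The following sketch uses a made-up NaI-like spectrum and ignores the count-conservation corrections a production tool would apply.

    import numpy as np

    def apply_gain_shift(spectrum, gain=1.02):
        # Resample a binned spectrum as if detector gain changed by `gain`:
        # channel i now reads the counts that nominally belonged at i / gain.
        channels = np.arange(spectrum.size)
        return np.interp(channels / gain, channels, spectrum, left=0.0, right=0.0)

    # Toy spectrum: falling continuum plus a photopeak near channel 662
    ch = np.arange(1024)
    spec = 1e3 * np.exp(-ch / 300.0) + 2e3 * np.exp(-0.5 * ((ch - 662) / 15.0) ** 2)
    shifted = apply_gain_shift(spec, gain=1.02)
    print("photopeak channel:", spec.argmax(), "->", shifted.argmax())  # 662 -> ~675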
Zhang, Lanlan; Hub, Martina; Mang, Sarah; Thieke, Christian; Nix, Oliver; Karger, Christian P; Floca, Ralf O
2013-06-01
Radiotherapy is a fast-developing discipline which plays a major role in cancer care. Quantitative analysis of radiotherapy data can improve the success of the treatment and support the prediction of outcome. In this paper, we first identify functional, conceptional and general requirements on a software system for quantitative analysis of radiotherapy. Further we present an overview of existing radiotherapy analysis software tools and check them against the stated requirements. As none of them could meet all of the demands presented herein, we analyzed possible conceptional problems and present software design solutions and recommendations to meet the stated requirements (e.g. algorithmic decoupling via dose iterator pattern; analysis database design). As a proof of concept we developed a software library "RTToolbox" following the presented design principles. The RTToolbox is available as open source library and has already been tested in a larger-scale software system for different use cases. These examples demonstrate the benefit of the presented design principles. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
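The "dose iterator pattern" named above can be rendered as a short sketch: analysis algorithms consume an iterator interface rather than a concrete dose grid, decoupling them from how the dose is stored. RTToolbox itself is C++; this Python rendering and all names in it are illustrative only.

    from typing import Iterator, Protocol, Tuple

    class DoseIterator(Protocol):
        """Minimal interface: yields (dose_gray, voxel_volume_cm3) pairs."""
        def __iter__(self) -> Iterator[Tuple[float, float]]: ...

    class GridDose:
        # One possible backing store; any storage layout can satisfy the interface
        def __init__(self, grid, voxel_volume):
            self.grid, self.voxel_volume = grid, voxel_volume
        def __iter__(self):
            for dose in self.grid:
                yield dose, self.voxel_volume

    def mean_dose(doses: DoseIterator) -> float:
        # The algorithm never touches the grid directly, only the iterator
        total_d = total_v = 0.0
        for d, v in doses:
            total_d += d * v
            total_v += v
        return total_d / total_v

    print(mean_dose(GridDose([1.8, 2.0, 2.2, 0.4], voxel_volume=0.027)))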
Using lexical analysis to identify emotional distress in psychometric schizotypy.
Abplanalp, Samuel J; Buck, Benjamin; Gonzenbach, Virgilio; Janela, Carlos; Lysaker, Paul H; Minor, Kyle S
2017-09-01
Through the use of lexical analysis software, researchers have demonstrated a greater frequency of negative affect word use in those with schizophrenia and schizotypy compared to the general population. In addition, those with schizotypy endorse greater emotional distress than healthy controls. In this study, our aim was to expand on previous findings in schizotypy to determine whether negative affect word use could be linked to emotional distress. Schizotypy (n=33) and non-schizotypy groups (n=33) completed an open-ended, semi-structured interview and negative affect word use was analyzed using a validated lexical analysis instrument. Emotional distress was assessed using subjective questionnaires of depression and psychological quality of life (QOL). When groups were compared, those with schizotypy used significantly more negative affect words; endorsed greater depression; and reported lower QOL. Within schizotypy, a trend level association between depression and negative affect word use was observed; QOL and negative affect word use showed a significant inverse association. Our findings offer preliminary evidence of the potential effectiveness of lexical analysis as an objective, behavior-based method for identifying emotional distress throughout the schizophrenia-spectrum. Utilizing lexical analysis in schizotypy offers promise for providing researchers with an assessment capable of objectively detecting emotional distress. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.
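The lexical measure in question reduces, in essence, to counting category words per unit of text. The following sketch shows that computation with a tiny made-up word list; the study's validated lexical analysis instrument uses a much larger dictionary and additional processing.

    import re

    # Illustrative word list, not the validated dictionary used by the study
    NEGATIVE_AFFECT = {"sad", "alone", "afraid", "worthless", "angry"}

    def negative_affect_rate(transcript: str) -> float:
        # Percent of words in the transcript that belong to the category
        words = re.findall(r"[a-z']+", transcript.lower())
        hits = sum(w in NEGATIVE_AFFECT for w in words)
        return 100.0 * hits / max(len(words), 1)

    print(negative_affect_rate("I feel sad and alone most days."))  # ~28.6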
Spectrum simulation in DTSA-II.
Ritchie, Nicholas W M
2009-10-01
Spectrum simulation is a useful practical and pedagogical tool. Particularly with complex samples or trace constituents, a simulation can help to understand the limits of the technique and the instrument parameters for the optimal measurement. DTSA-II, software for electron probe microanalysis, provides both easy to use and flexible tools for simulating common and less common sample geometries and materials. Analytical models based on φ(ρz) curves provide quick simulations of simple samples. Monte Carlo models based on electron and X-ray transport provide more sophisticated models of arbitrarily complex samples. DTSA-II provides a broad range of simulation tools in a framework with many different interchangeable physical models. In addition, DTSA-II provides tools for visualizing, comparing, manipulating, and quantifying simulated and measured spectra.
NASA Technical Reports Server (NTRS)
Rapp, Richard H.
1998-01-01
This paper documents the development of a degree 360 expansion of the dynamic ocean topography (DOT) of the POCM_4B ocean circulation model. The principles and software used that led to the final model are described. A key principle was the development of interpolated DOT values into land areas to avoid discontinuities at or near the land/ocean interface. The power spectrum of the POCM_4B is also presented with comparisons made between orthonormal (ON) and spherical harmonic (SH) magnitudes to degree 24. A merged file of ON and SH computed degree variances is proposed for applications where the DOT power spectrum from low to high (360) degrees is needed.
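The degree variances behind such a power spectrum follow the standard definition c_l = sum over m of (C_lm^2 + S_lm^2) for fully normalized coefficients C_lm and S_lm. A minimal sketch with random coefficients:

    import numpy as np

    def degree_variances(C, S):
        # Power spectrum c_l = sum_m (C_lm^2 + S_lm^2) for coefficients stored
        # as lower-triangular (l, m) arrays
        lmax = C.shape[0] - 1
        return np.array([(C[l, : l + 1] ** 2 + S[l, : l + 1] ** 2).sum()
                         for l in range(lmax + 1)])

    rng = np.random.default_rng(1)
    lmax = 24
    C = np.tril(rng.normal(size=(lmax + 1, lmax + 1)))
    S = np.tril(rng.normal(size=(lmax + 1, lmax + 1)))
    S[:, 0] = 0.0  # S_l0 terms vanish by definition
    print(degree_variances(C, S)[:5])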
Development of a Computer Architecture to Support the Optical Plume Anomaly Detection (OPAD) System
NASA Technical Reports Server (NTRS)
Katsinis, Constantine
1996-01-01
The NASA OPAD spectrometer system relies heavily on extensive software which repetitively extracts spectral information from the engine plume and reports the amounts of metals which are present in the plume. The development of this software is at a sufficiently advanced stage where it can be used in actual engine tests to provide valuable data on engine operation and health. This activity will continue and, in addition, the OPAD system is planned to be used in flight aboard space vehicles. The two implementations, test-stand and in-flight, may have some differing requirements. For example, the data stored during a test-stand experiment are much more extensive than in the in-flight case. In both cases though, the majority of the requirements are similar. New data from the spectrograph is generated at a rate of once every 0.5 sec or faster. All processing must be completed within this period of time to maintain real-time performance. Every 0.5 sec, the OPAD system must report the amounts of specific metals within the engine plume, given the spectral data. At present, the software in the OPAD system performs this function by solving the inverse problem. It uses powerful physics-based computational models (the SPECTRA code), which receive amounts of metals as inputs to produce the spectral data that would have been observed, had the same metal amounts been present in the engine plume. During the experiment, for every spectrum that is observed, an initial approximation is performed using neural networks to establish an initial metal composition which approximates as accurately as possible the real one. Then, using optimization techniques, the SPECTRA code is repetitively used to produce a fit to the data, by adjusting the metal input amounts until the produced spectrum matches the observed one to within a given level of tolerance. This iterative solution to the original problem of determining the metal composition in the plume requires a relatively long period of time to execute the software in a modern single-processor workstation, and therefore real-time operation is currently not possible. A different number of iterations may be required to perform spectral data fitting per spectral sample. Yet, the OPAD system must be designed to maintain real-time performance in all cases. Although faster single-processor workstations are available for execution of the fitting and SPECTRA software, this option is unattractive due to the excessive cost associated with very fast workstations and also due to the fact that such hardware is not easily expandable to accommodate future versions of the software which may require more processing power. Initial research has already demonstrated that the OPAD software can take advantage of a parallel computer architecture to achieve the necessary speedup. Current work has improved the software by converting it into a form which is easily parallelizable. Timing experiments have been performed to establish the computational complexity and execution speed of major components of the software. This work provides the foundation of future work which will create a fully parallel version of the software executing in a shared-memory multiprocessor system.
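The fitting strategy described above (a learned initial guess refined by iterative optimization through a forward model) can be sketched as follows. Here forward_model stands in for the physics-based SPECTRA code and initial_guess for the neural-network initializer; the emission basis and all numbers are invented for illustration.

    import numpy as np
    from scipy.optimize import least_squares

    # Hypothetical per-metal emission signatures (rows: metals, cols: spectral bins)
    BASIS = np.array([[0.9, 0.1, 0.0],
                      [0.2, 0.7, 0.1],
                      [0.0, 0.3, 0.8]])

    def forward_model(amounts):
        # Stand-in for SPECTRA: spectrum predicted for these metal amounts
        return amounts @ BASIS

    def initial_guess(observed):
        # Stand-in for the neural-network initializer: a crude linear inverse
        return np.clip(observed @ np.linalg.pinv(BASIS), 0.0, None)

    true_amounts = np.array([1.0, 0.2, 0.5])
    observed = forward_model(true_amounts) + np.random.default_rng(2).normal(0, 0.01, 3)

    # Iteratively adjust metal amounts until the produced spectrum fits the data
    fit = least_squares(lambda a: forward_model(a) - observed,
                        x0=initial_guess(observed), bounds=(0.0, np.inf))
    print("recovered metal amounts:", fit.x.round(3))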
Theoretical and software considerations for nonlinear dynamic analysis
NASA Technical Reports Server (NTRS)
Schmidt, R. J.; Dodds, R. H., Jr.
1983-01-01
In the finite element method for structural analysis, it is generally necessary to discretize the structural model into a very large number of elements to accurately evaluate displacements, strains, and stresses. As the complexity of the model increases, the number of degrees of freedom can easily exceed the capacity of present-day software systems. Improvements of structural analysis software, including more efficient use of existing hardware and improved structural modeling techniques, are discussed. One modeling technique that is used successfully in static linear and nonlinear analysis is multilevel substructuring. This research extends the use of multilevel substructure modeling to include dynamic analysis and defines the requirements for a general purpose software system capable of efficient nonlinear dynamic analysis. The multilevel substructuring technique is presented, the analytical formulations and computational procedures for dynamic analysis and nonlinear mechanics are reviewed, and an approach to the design and implementation of a general purpose structural software system is presented.
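One standard building block of substructuring is static (Guyan) condensation, which eliminates interior degrees of freedom so that only boundary DOFs remain: K_reduced = K_bb - K_bi K_ii^{-1} K_ib. The sketch below demonstrates the reduction on a three-DOF spring chain; it illustrates the technique generically rather than this report's specific formulation.

    import numpy as np

    def condense(K, boundary):
        # Static (Guyan) condensation: K_bb - K_bi @ inv(K_ii) @ K_ib
        n = K.shape[0]
        b = np.asarray(boundary)
        i = np.setdiff1d(np.arange(n), b)
        Kbb, Kbi = K[np.ix_(b, b)], K[np.ix_(b, i)]
        Kib, Kii = K[np.ix_(i, b)], K[np.ix_(i, i)]
        return Kbb - Kbi @ np.linalg.solve(Kii, Kib)

    # 3-DOF spring chain; condense out the middle (interior) DOF
    K = np.array([[ 2.0, -1.0,  0.0],
                  [-1.0,  2.0, -1.0],
                  [ 0.0, -1.0,  2.0]])
    print(condense(K, boundary=[0, 2]))  # exact for loads applied at the boundary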
Using recurrence plot analysis for software execution interpretation and fault detection
NASA Astrophysics Data System (ADS)
Mosdorf, M.
2015-09-01
This paper shows a method targeted at software execution interpretation and fault detection using recurrence plot analysis. In the proposed approach, recurrence plot analysis is applied to a software execution trace that contains the executed assembly instructions. Results of this analysis are subject to further processing with the PCA (Principal Component Analysis) method, which reduces the number of coefficients used for software execution classification. This method was used for the analysis of five algorithms: Bubble Sort, Quick Sort, Median Filter, FIR, SHA-1. Results show that some of the collected traces could be easily assigned to particular algorithms (logs from the Bubble Sort and FIR algorithms) while others are more difficult to distinguish.
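A minimal sketch of the pipeline described above: build a binary recurrence matrix from each execution trace, then compress the flattened matrices with PCA. The traces here are random stand-ins for real instruction logs.

    import numpy as np
    from sklearn.decomposition import PCA

    def recurrence_matrix(trace, eps):
        # Binary recurrence plot: R[i, j] = 1 when states at steps i and j
        # lie within distance eps of each other
        d = np.abs(trace[:, None] - trace[None, :])
        return (d < eps).astype(float)

    # Hypothetical traces: opcode IDs from two programs with different structure
    rng = np.random.default_rng(3)
    traces = [np.tile(rng.integers(0, 16, 25), 4) for _ in range(5)]   # loop-heavy
    traces += [rng.integers(0, 16, 100) for _ in range(5)]             # irregular
    features = np.array([recurrence_matrix(t, eps=0.5).ravel() for t in traces])

    coords = PCA(n_components=2).fit_transform(features)  # compress RP coefficients
    print(coords.round(2))  # the two groups separate in PC space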
Operations analysis (study 2.1). Volume 1: Executive summary
NASA Technical Reports Server (NTRS)
Wolfe, R. R.
1975-01-01
Subjects related to future STS operations concepts were investigated. The majority of effort was directed at assessing the benefits of automated space servicing concepts as related to improvements in payload procurement and shuttle utilization. Another subject was directed at understanding shuttle upper stage software development and recurring costs relative to total program projections. Space servicing of automated payloads is addressed by examining the broad spectrum of payload applications with the belief that shared logistic operations will be a major contributor to reduction of future program costs. However, there are certain requirements for support of payload operations, such as availability of the payload, that may place demands upon the shuttle fleet. Because future projections of the NASA Mission Model are only representative of the payload traffic, it is important to recognize that it is the general character of operations that is significant rather than service to any single payload program.
NASA Astrophysics Data System (ADS)
Behrens, J.; Ranitzsch, P. C.-O.; Beck, M.; Beglarian, A.; Erhard, M.; Groh, S.; Hannen, V.; Kraus, M.; Ortjohann, H.-W.; Rest, O.; Schlösser, K.; Thümmler, T.; Valerius, K.; Wierman, K.; Wilkerson, J. F.; Winzen, D.; Zacher, M.; Weinheimer, C.
2017-06-01
The KATRIN experiment aims to determine the neutrino mass scale with a sensitivity of 200 meV/c² (90% C.L.) by a precision measurement of the shape of the tritium β-spectrum in the endpoint region. The energy analysis of the decay electrons is achieved by a MAC-E filter spectrometer. To determine the transmission properties of the KATRIN main spectrometer, a mono-energetic and angular-selective electron source has been developed. In preparation for the second commissioning phase of the main spectrometer, a measurement phase was carried out at the KATRIN monitor spectrometer where the device was operated in a MAC-E filter setup for testing. The results of these measurements are compared with simulations using the particle-tracking software "Kassiopeia", which was developed in the KATRIN collaboration over recent years.
NASA Astrophysics Data System (ADS)
Diwaker
2014-07-01
The electronic, NMR, vibrational, and structural properties of a new pyrazoline derivative, 2-(5-(4-Chlorophenyl)-3-(pyridine-2-yl)-4,5-dihydropyrazol-1-yl)benzo[d]thiazole, have been studied using the Gaussian 09 software package. Using the VEDA 4 program, we report the potential energy distribution (PED) of the normal modes of vibration of the title compound. We also report the 1H and 13C NMR chemical shifts of the title compound at the B3LYP level of theory with the 6-311++G(2d,2p) basis set. Using the time-dependent DFT (TD-DFT) approach, electronic properties such as the HOMO and LUMO energies and the electronic spectrum of the title compound have been studied and reported. NBO analysis and MEP surface mapping have also been carried out using ab initio methods.
NASA Astrophysics Data System (ADS)
Various papers on global telecommunications are presented. The general topics addressed include: multiservice integration with optical fibers, multicompany owned telecommunication networks, software quality and reliability, advanced on-board processing, impact of new services and systems on operations and maintenance, analytical studies of protocols for data communication networks, topics in packet radio networking, CCITT No. 7 to support new services, document processing and communication, antenna technology and system aspects in satellite communications. Also considered are: communication systems modelling methodology, experimental integrated local area voice/data nets, spread spectrum communications, motion video at the DS-0 rate, optical and data communications, intelligent work stations, switch performance analysis, novel radio communication systems, wireless local networks, ISDN services, LAN communication protocols, user-system interface, radio propagation and performance, mobile satellite system, software for computer networks, VLSI for ISDN terminals, quality management, man-machine interfaces in switching, and local area network performance.
Magnetic biosensors: Modelling and simulation.
Nabaei, Vahid; Chandrawati, Rona; Heidari, Hadi
2018-04-30
In the past few years, magnetoelectronics has emerged as a promising new platform technology in various biosensors for detection, identification, localisation and manipulation of a wide spectrum of biological, physical and chemical agents. The methods are based on detecting the magnetic field of a magnetically labelled biomolecule as it interacts with a complementary biomolecule bound to a magnetic field sensor. This Review presents various schemes of magnetic biosensor techniques from both simulation and modelling as well as analytical and numerical analysis points of view, and the performance variations under magnetic fields at steady and nonstationary states. This is followed by magnetic sensor modelling and simulation using advanced multiphysics modelling software (e.g. the Finite Element Method (FEM)) and in-house tools. Furthermore, outlook and future directions of modelling and simulations of magnetic biosensors in different technologies and materials are critically discussed. Crown Copyright © 2017. Published by Elsevier B.V. All rights reserved.
Techniques and Tools for Estimating Ionospheric Effects in Interferometric and Polarimetric SAR Data
NASA Technical Reports Server (NTRS)
Rosen, P.; Lavalle, M.; Pi, X.; Buckley, S.; Szeliga, W.; Zebker, H.; Gurrola, E.
2011-01-01
The InSAR Scientific Computing Environment (ISCE) is a flexible, extensible software tool designed for the end-to-end processing and analysis of synthetic aperture radar data. ISCE inherits the core of the ROI_PAC interferometric tool, but contains improvements at all levels of the radar processing chain, including a modular and extensible architecture, new focusing approach, better geocoding of the data, handling of multi-polarization data, radiometric calibration, and estimation and correction of ionospheric effects. In this paper we describe the characteristics of ISCE with emphasis on the ionospheric modules. To detect ionospheric anomalies, ISCE implements the Faraday rotation method using quad-polarimetric images, and the split-spectrum technique using interferometric single-, dual- and quad-polarimetric images. The ability to generate co-registered time series of quad-polarimetric images makes ISCE also an ideal tool to be used for polarimetric-interferometric radar applications.
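The split-spectrum technique mentioned above exploits the dispersive nature of the ionospheric phase: modeling the interferometric phase as phi(f) = a f + b/f and forming sub-band interferograms phi_L and phi_H centered at f_L and f_H, the dispersive component at the full-band center f0 follows in closed form. A sketch using that standard relation (the frequencies and phase model here are synthetic, not ISCE's implementation):

    import numpy as np

    def split_spectrum_iono(phi_low, phi_high, f_low, f_high, f0):
        # phi_iono(f0) = fL*fH / (f0*(fH^2 - fL^2)) * (phi_L*fH - phi_H*fL),
        # obtained by solving phi(f) = a*f + b/f at the two sub-band centers
        return (f_low * f_high / (f0 * (f_high**2 - f_low**2))
                * (phi_low * f_high - phi_high * f_low))

    # Synthetic check: build sub-band phases from known dispersive/non-dispersive parts
    f0, fL, fH = 1.27e9, 1.25e9, 1.29e9   # center and sub-band frequencies (Hz)
    a, b = 2.0e-9, 3.0e9                  # phi(f) = a*f + b/f (rad)
    phiL, phiH = a * fL + b / fL, a * fH + b / fH
    print(split_spectrum_iono(phiL, phiH, fL, fH, f0), "vs", b / f0)  # should match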
Narratives of mothers of children with autism spectrum disorders: focus on eating behavior.
Lázaro, Cristiane P; Pondé, Milena P
2017-01-01
To investigate the eating behavior of individuals with autism through their mothers' narratives. The study of narratives was used to report on the narrators' experiences. Data on the eating habits of individuals with autism were collected using semi-structured interviews held individually with the mothers. The interviews were recorded, transcribed and codified using the NVivo software program. Eighteen mothers of boys/young men with autism participated in the study. Analysis yielded three major categories: eating patterns, the family's attitudes to the child's eating habits, and food-related behavior. Results show that autism-related factors may affect the child's food choices. Environmental factors, particularly the parents' behavior, may also play a decisive role, both in reinforcing the child's food choices and in encouraging a healthier and more diversified diet. Professionals should instruct parents regarding their decisive role in reinforcing or discouraging inappropriate mealtime behavior in children with autism.
Gauge backgrounds and zero-mode counting in F-theory
NASA Astrophysics Data System (ADS)
Bies, Martin; Mayrhofer, Christoph; Weigand, Timo
2017-11-01
Computing the exact spectrum of charged massless matter is a crucial step towards understanding the effective field theory describing F-theory vacua in four dimensions. In this work we further develop a coherent framework to determine the charged massless matter in F-theory compactified on elliptic fourfolds, and demonstrate its application in a concrete example. The gauge background is represented, via duality with M-theory, by algebraic cycles modulo rational equivalence. Intersection theory within the Chow ring allows us to extract coherent sheaves on the base of the elliptic fibration whose cohomology groups encode the charged zero-mode spectrum. The dimensions of these cohomology groups are computed with the help of modern techniques from algebraic geometry, which we implement in the software GAP. We exemplify this approach in models with an Abelian and non-Abelian gauge group and observe jumps in the exact massless spectrum as the complex structure moduli are varied. An extended mathematical appendix gives a self-contained introduction to the algebro-geometric concepts underlying our framework.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aartsen, M. G.; Ackermann, M.; Adams, J.
2017-10-20
IceCube is a neutrino observatory deployed in the glacial ice at the geographic South Pole. The νμ energy unfolding described in this paper is based on data taken with IceCube in its 79-string configuration. A sample of muon neutrino charged-current interactions with a purity of 99.5% was selected by means of a multivariate classification process based on machine learning. The subsequent unfolding was performed using the software Truee. The resulting spectrum covers an Eν range of more than four orders of magnitude, from 125 GeV to 3.2 PeV. Compared to the Honda atmospheric neutrino flux model, the energy spectrum shows an excess of more than 1.9σ in four adjacent bins for neutrino energies Eν ≥ 177.8 TeV. The obtained spectrum is fully compatible with previous measurements of the atmospheric neutrino flux and recent IceCube measurements of a flux of high-energy astrophysical neutrinos.
Modeling and investigative studies of Jovian low frequency emissions
NASA Technical Reports Server (NTRS)
Menietti, J. D.; Green, James L.; Six, N. Frank; Gulkis, S.
1986-01-01
Jovian decametric (DAM) and hectometric (HOM) emissions were first observed over the entire spectrum by the Voyager 1 and 2 flybys of the planet. They display unusual arc-like structures on frequency-versus-time spectrograms. Software for modeling the Jovian plasma and magnetic field environment was developed. In addition, an extensive library of programs was developed for the retrieval of Voyager Planetary Radio Astronomy (PRA) data in both the high and low frequency bands from new noise-free, recalibrated data tapes. This software allows the option of retrieving data sorted with respect to particular sub-Io longitudes. This has proven to be invaluable in the analyses of the data. Graphics routines were also developed to display the data on color spectrograms.
A Case Study of Coordination in Distributed Agile Software Development
NASA Astrophysics Data System (ADS)
Hole, Steinar; Moe, Nils Brede
Global Software Development (GSD) has gained significant popularity as an emerging paradigm. Companies also show interest in applying agile approaches in distributed development to combine the advantages of both. However, in their most radical forms, agile and GSD sit at opposite ends of a plan-based/agile spectrum because of how work is coordinated. We describe how three GSD projects applying agile methods coordinate their work. We found that trust is needed to reduce the need for standardization and direct supervision when coordinating work in a GSD project, and that electronic chatting supports mutual adjustment. Further, co-location and modularization mitigate communication problems, enable agility in at least part of a GSD project, and render the implementation of Scrum of Scrums possible.
MESTRN: A Deterministic Meson-Muon Transport Code for Space Radiation
NASA Technical Reports Server (NTRS)
Blattnig, Steve R.; Norbury, John W.; Norman, Ryan B.; Wilson, John W.; Singleterry, Robert C., Jr.; Tripathi, Ram K.
2004-01-01
A safe and efficient exploration of space requires an understanding of space radiations, so that human life and sensitive equipment can be protected. On the way to these sensitive sites, the radiation fields are modified in both quality and quantity. Many of these modifications are thought to be due to the production of pions and muons in the interactions between the radiation and intervening matter. A method used to predict the effects of the presence of these particles on the transport of radiation through materials is developed. This method was then used to develop software, which was used to calculate the fluxes of pions and muons after the transport of a cosmic ray spectrum through aluminum and water. Software descriptions are given in the appendices.
Modeling and investigative studies of Jovian low frequency emissions
NASA Astrophysics Data System (ADS)
Menietti, J. D.; Green, James L.; Six, N. Frank; Gulkis, S.
1986-08-01
Jovian decametric (DAM) and hectometric (HOM) emissions were first observed over the entire spectrum by the Voyager 1 and 2 flybys of the planet. They display unusual arc-like structures on frequency-versus-time spectrograms. Software for modeling the Jovian plasma and magnetic field environment was developed. In addition, an extensive library of programs was developed for the retrieval of Voyager Planetary Radio Astronomy (PRA) data in both the high and low frequency bands from new noise-free, recalibrated data tapes. This software allows the option of retrieving data sorted with respect to particular sub-Io longitudes. This has proven to be invaluable in the analyses of the data. Graphics routines were also developed to display the data on color spectrograms.
Shenoy, Shailesh M
2016-07-01
A challenge in any imaging laboratory, especially one that uses modern techniques, is to achieve a sustainable and productive balance between open source and commercial software for quantitative image acquisition, analysis and visualization. In addition to the expense of software licensing, one must consider the quality and usefulness of the software's support, training and documentation. One must also consider the reproducibility with which multiple people generate results using the same software to perform the same analysis, how one may distribute methods to the community using the software, and the potential for achieving automation to improve productivity.
Analyzing qualitative data with computer software.
Weitzman, E A
1999-01-01
OBJECTIVE: To provide health services researchers with an overview of the qualitative data analysis process and the role of software within it; to provide a principled approach to choosing among software packages to support qualitative data analysis; to alert researchers to the potential benefits and limitations of such software; and to provide an overview of the developments to be expected in the field in the near future. DATA SOURCES, STUDY DESIGN, METHODS: This article does not include reports of empirical research. CONCLUSIONS: Software for qualitative data analysis can benefit the researcher in terms of speed, consistency, rigor, and access to analytic methods not available by hand. Software, however, is not a replacement for methodological training. PMID:10591282
An Analysis of Mission Critical Computer Software in Naval Aviation
1991-03-01
The research questions were whether original software development schedules were sustained without a milestone change being made, and whether software released to the fleet contained any major defects. This research revealed that only about half of the original software development schedules were sustained without a milestone change, and that software released to the fleet had no major defects.
Simulation and Spectrum Extraction in the Spectroscopic Channel of the SNAP Experiment
NASA Astrophysics Data System (ADS)
Tilquin, Andre; Bonissent, A.; Gerdes, D.; Ealet, A.; Prieto, E.; Macaire, C.; Aumenier, M. H.
2007-05-01
A pixel-level simulation software is described. It is composed of two modules. The first module applies Fourier optics at each active element of the system to construct the PSF at a large variety of wavelengths and spatial locations of the point source. The input is provided by the engineer's design program (Zemax). It describes the optical path and the distortions. The PSF properties are compressed and interpolated using shapelets decomposition and neural network techniques. A second module is used for production jobs. It uses the output of the first module to reconstruct the relevant PSF and integrate it on the detector pixels. Extended and polychromatic sources are approximated by a combination of monochromatic point sources. For the spectrum extraction, we use a fast simulator based on a multidimensional linear interpolation of the pixel response tabulated on a grid of values of wavelength, position on sky and slice number. The prediction of the fast simulator is compared to the observed pixel content, and a chi-square minimization where the parameters are the bin contents is used to build the extracted spectrum. The visible and infrared arms are combined in the same chi-square, providing a single spectrum.
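Because the fast simulator's pixel response is (locally) linear in the spectrum bin contents, the chi-square minimization described above amounts to a weighted linear least-squares problem. A minimal sketch with an invented response matrix:

    import numpy as np

    # Hypothetical linear response R mapping spectrum-bin contents to pixel values;
    # minimizing chi^2 = sum((d - R s)^2 / sigma^2) over s is weighted least squares
    rng = np.random.default_rng(4)
    n_pix, n_bins = 200, 20
    R = rng.uniform(0, 1, (n_pix, n_bins))        # tabulated pixel response
    s_true = rng.uniform(1, 5, n_bins)            # true spectrum bin contents
    sigma = 0.3
    d = R @ s_true + rng.normal(0, sigma, n_pix)  # observed pixel data

    W = 1.0 / sigma                               # per-pixel weights (uniform here)
    s_hat, *_ = np.linalg.lstsq(R * W, d * W, rcond=None)
    print("max bin error:", np.abs(s_hat - s_true).max())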
LV software support for supersonic flow analysis
NASA Technical Reports Server (NTRS)
Bell, W. A.; Lepicovsky, J.
1992-01-01
The software for configuring an LV counter processor system has been developed using structured design. The LV system includes up to three counter processors and a rotary encoder. The software for configuring and testing the LV system has been developed, tested, and included in an overall software package for data acquisition, analysis, and reduction. Error handling routines respond to both operator and instrument errors which often arise in the course of measuring complex, high-speed flows. The use of networking capabilities greatly facilitates the software development process by allowing software development and testing from a remote site. In addition, high-speed transfers allow graphics files or commands to provide viewing of the data from a remote site. Further advances in data analysis require corresponding advances in procedures for statistical and time series analysis of nonuniformly sampled data.
LV software support for supersonic flow analysis
NASA Technical Reports Server (NTRS)
Bell, William A.
1992-01-01
The software for configuring a Laser Velocimeter (LV) counter processor system was developed using structured design. The LV system includes up to three counter processors and a rotary encoder. The software for configuring and testing the LV system was developed, tested, and included in an overall software package for data acquisition, analysis, and reduction. Error handling routines respond to both operator and instrument errors which often arise in the course of measuring complex, high-speed flows. The use of networking capabilities greatly facilitates the software development process by allowing software development and testing from a remote site. In addition, high-speed transfers allow graphics files or commands to provide viewing of the data from a remote site. Further advances in data analysis require corresponding advances in procedures for statistical and time series analysis of nonuniformly sampled data.
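For the nonuniformly sampled data mentioned in closing, one standard spectral tool is the Lomb-Scargle periodogram. The sketch below recovers a 3 Hz oscillation from velocity samples taken at random particle-arrival times (synthetic data, not the LV system's own routines).

    import numpy as np
    from scipy.signal import lombscargle

    # LV counters yield velocity samples at random particle-arrival times,
    # so uniform-grid FFT methods do not apply directly
    rng = np.random.default_rng(5)
    t = np.sort(rng.uniform(0, 10, 400))  # random arrival times (s)
    u = np.sin(2 * np.pi * 3.0 * t) + 0.5 * rng.normal(size=t.size)

    freqs = np.linspace(0.1, 10, 500)     # Hz
    pgram = lombscargle(t, u - u.mean(), 2 * np.pi * freqs)
    print("peak at ~%.2f Hz" % freqs[pgram.argmax()])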
NASA Technical Reports Server (NTRS)
1976-01-01
The engineering analyses and evaluation studies conducted for the Software Requirements Analysis are discussed. Included are the development of the study data base, synthesis of implementation approaches for software required by both mandatory onboard computer services and command/control functions, and identification and implementation of software for ground processing activities.
Material of LAPAN's thermal IR camera equipped with two microbolometers in one aperture
NASA Astrophysics Data System (ADS)
Bustanul, A.; Irwan, P.; Andi M., T.
2017-11-01
Besides the wavelength range used, another factor must be considered when designing an optical system: the choice of material appropriate to the selected spectral bands. Because the range of available materials is limited and they are expensive, choosing materials for infrared (IR) wavelengths is more difficult and complex than for the visible spectrum. We faced this problem while designing our thermal IR camera, which is equipped with two microbolometers sharing one aperture. Two spectral bands, 3-4 μm (MWIR) and 8-12 μm (LWIR), were chosen for the camera to address missions such as peat-land fire monitoring, volcano activity, and Sea Surface Temperature (SST). For these bands, we selected the appropriate materials for the optics of LAPAN's IR camera. This paper describes the materials of LAPAN's IR camera equipped with two microbolometers in one aperture. We first studied the properties of optical materials across IR technology and its bandwidths. The analysis then considered transmission, index of refraction, and thermal properties, including the index gradient and the coefficient of thermal expansion (CTE). We also used commercial software, Thermal Desktop/Sinda Fluint, to strengthen the process. Constraints such as the space environment, low cost, and performance (mainly durability and transmission) were taken into account throughout the trade-off studies. The results of these analyses, both graphical and measured, indicate that the lens of LAPAN's IR camera with a shared aperture should be based on germanium/zinc selenide materials.
An overview of platforms for cloud based development.
Fylaktopoulos, G; Goumas, G; Skolarikis, M; Sotiropoulos, A; Maglogiannis, I
2016-01-01
This paper provides an overview of state-of-the-art technologies for software development in cloud environments. The surveyed systems cover the whole spectrum of cloud-based development, including integrated programming environments, code repositories, software modeling, composition and documentation tools, and application management and orchestration. We evaluate the existing cloud development ecosystem on a wide range of characteristics, such as applicability (e.g. programming and database technologies supported), productivity enhancement (e.g. editor capabilities, debugging tools), support for collaboration (e.g. repository functionality, version control) and post-development application hosting, and we compare the surveyed systems. The survey shows that software engineering in the cloud era has made its initial steps, showing potential to provide concrete implementation and execution environments for cloud-based applications. However, a number of important challenges need to be addressed for this approach to be viable. These challenges are discussed in the article, and we conclude that although several steps have been made, a compact and reliable solution does not yet exist.
NASA Technical Reports Server (NTRS)
2003-01-01
Topics covered include: Tool for Bending a Metal Tube Precisely in a Confined Space; Multiple-Use Mechanisms for Attachment to Seat Tracks; Force-Measuring Clamps; Cellular Pressure-Actuated Joint; Block QCA Fault-Tolerant Logic Gates; Hybrid VLSI/QCA Architecture for Computing FFTs; Arrays of Carbon Nanotubes as RF Filters in Waveguides; Carbon Nanotubes as Resonators for RF Spectrum Analyzers; Software for Viewing Landsat Mosaic Images; Updated Integrated Mission Program; Software for Sharing and Management of Information; Update on Integrated Optical Design Analyzer; Optical-Quality Thin Polymer Membranes; Rollable Thin Shell Composite-Material Paraboloidal Mirrors; Folded Resonant Horns for Power Ultrasonic Applications; Touchdown Ball-Bearing System for Magnetic Bearings; Flux-Based Deadbeat Control of Induction-Motor Torque; Block Copolymers as Templates for Arrays of Carbon Nanotubes; Throttling Cryogen Boiloff To Control Cryostat Temperature; Collaborative Software Development Approach Used to Deliver the New Shuttle Telemetry Ground Station; Turbulence in Supercritical O2/H2 and C7H16/N2 Mixing Layers; and Time-Resolved Measurements in Optoelectronic Microbioanal.
A new scoring function for top-down spectral deconvolution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kou, Qiang; Wu, Si; Liu, Xiaowen
2014-12-18
Background: Top-down mass spectrometry plays an important role in intact protein identification and characterization. Top-down mass spectra are more complex than bottom-up mass spectra because they often contain many isotopomer envelopes from highly charged ions, which may overlap with one another. As a result, spectral deconvolution, which converts a complex top-down mass spectrum into a monoisotopic mass list, is a key step in top-down spectral interpretation. Results: In this paper, we propose a new scoring function, L-score, for evaluating isotopomer envelopes. By combining L-score with MS-Deconv, a new software tool, MS-Deconv+, was developed for top-down spectral deconvolution. Experimental results showed that MS-Deconv+ outperformed existing software tools in top-down spectral deconvolution. Conclusions: L-score shows high discriminative ability in identification of isotopomer envelopes. Using L-score, MS-Deconv+ reports many correct monoisotopic masses missed by other software tools, which are valuable for proteoform identification and characterization.
Real-Time Food Authentication Using a Miniature Mass Spectrometer.
Gerbig, Stefanie; Neese, Stephan; Penner, Alexander; Spengler, Bernhard; Schulz, Sabine
2017-10-17
Food adulteration is a threat to public health and the economy. In order to determine food adulteration efficiently, rapid and easy-to-use on-site analytical methods are needed. In this study, a miniaturized mass spectrometer in combination with three ambient ionization methods was used for food authentication. The chemical fingerprints of three milk types, five fish species, and two coffee types were measured using electrospray ionization, desorption electrospray ionization, and low temperature plasma ionization. Minimum sample preparation was needed for the analysis of liquid and solid food samples. Mass spectrometric data was processed using the laboratory-built software MS food classifier, which allows for the definition of specific food profiles from reference data sets using multivariate statistical methods and the subsequent classification of unknown data. Applicability of the obtained mass spectrometric fingerprints for food authentication was evaluated using different data processing methods, leave-10%-out cross-validation, and real-time classification of new data. Classification accuracy of 100% was achieved for the differentiation of milk types and fish species, and a classification accuracy of 96.4% was achieved for coffee types in cross-validation experiments. Measurement of two milk mixtures yielded correct classification of >94%. For real-time classification, the accuracies were comparable. Functionality of the software program and its performance is described. Processing time for a reference data set and a newly acquired spectrum was found to be 12 s and 2 s, respectively. These proof-of-principle experiments show that the combination of a miniaturized mass spectrometer, ambient ionization, and statistical analysis is suitable for on-site real-time food authentication.
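Although the paper's MS food classifier is its own laboratory-built software, the general fingerprint-classification workflow it describes (multivariate statistics plus leave-10%-out cross-validation) can be sketched generically, here with PCA followed by linear discriminant analysis on synthetic spectra:

    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score, StratifiedKFold

    # Synthetic fingerprints: 3 "food types" with 300 m/z bins each
    rng = np.random.default_rng(6)
    centers = rng.normal(0, 1, (3, 300))
    X = np.vstack([c + 0.4 * rng.normal(size=(40, 300)) for c in centers])
    y = np.repeat([0, 1, 2], 40)

    # PCA compresses the binned spectra, LDA separates the classes
    clf = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
    cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)  # ~leave-10%-out
    print("accuracy: %.3f" % cross_val_score(clf, X, y, cv=cv).mean())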
Electron-Excited X-Ray Microanalysis at Low Beam Energy: Almost Always an Adventure!
Newbury, Dale E; Ritchie, Nicholas W M
2016-08-01
Scanning electron microscopy with energy-dispersive spectrometry has been applied to the analysis of various materials at low incident beam energies, E0 ≤ 5 keV, using peak fitting and following the measured standards/matrix corrections protocol embedded in the National Institute of Standards and Technology Desktop Spectrum Analyzer-II analytical software engine. Low beam energy analysis provides improved spatial resolution laterally and in depth. The lower beam energy restricts the atomic shells that can be ionized, reducing the number of X-ray peak families available to the analyst. At E0 = 5 keV, all elements of the periodic table except H and He can be measured. As the beam energy is reduced below 5 keV, elements become inaccessible due to lack of excitation of useful characteristic X-ray peaks. The shallow sampling depth of low beam energy microanalysis makes the technique more sensitive to surface compositional modification due to formation of oxides and other reaction layers. Accurate and precise analysis is possible with the use of appropriate standards and by accumulating high-count spectra of unknowns and standards (>1 million counts integrated from 0.1 keV to E0).
A Matter of Time: Faster Percolator Analysis via Efficient SVM Learning for Large-Scale Proteomics.
Halloran, John T; Rocke, David M
2018-05-04
Percolator is an important tool for greatly improving the results of a database search and subsequent downstream analysis. Using support vector machines (SVMs), Percolator recalibrates peptide-spectrum matches based on the learned decision boundary between targets and decoys. To improve analysis time for large-scale data sets, we update Percolator's SVM learning engine through software and algorithmic optimizations rather than heuristic approaches that necessitate the careful study of their impact on learned parameters across different search settings and data sets. We show that by optimizing Percolator's original learning algorithm, l2-SVM-MFN, large-scale SVM learning requires nearly only a third of the original runtime. Furthermore, we show that by employing the widely used Trust Region Newton (TRON) algorithm instead of l2-SVM-MFN, large-scale Percolator SVM learning is reduced to nearly only a fifth of the original runtime. Importantly, these speedups only affect the speed at which Percolator converges to a global solution and do not alter recalibration performance. The upgraded versions of both l2-SVM-MFN and TRON are optimized within the Percolator codebase for multithreaded and single-thread use and are available under Apache license at bitbucket.org/jthalloran/percolator_upgrade.
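The recalibration idea is independent of the specific solver: learn a linear target-versus-decoy boundary over PSM features, then reuse the decision value as the new score. The sketch below uses a generic linear SVM (sklearn/liblinear) on invented features, not Percolator's l2-SVM-MFN or TRON implementations.

    import numpy as np
    from sklearn.svm import LinearSVC

    # Hypothetical PSM feature vectors for targets (label 1) and decoys (label 0)
    rng = np.random.default_rng(7)
    n, d = 2000, 8
    targets = rng.normal(0.5, 1.0, (n, d))
    decoys = rng.normal(-0.5, 1.0, (n, d))
    X = np.vstack([targets, decoys])
    y = np.array([1] * n + [0] * n)

    svm = LinearSVC(C=1.0).fit(X, y)
    rescored = svm.decision_function(X[:n])  # recalibrated target PSM scores
    print("mean target score: %.2f" % rescored.mean())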
DOE Office of Scientific and Technical Information (OSTI.GOV)
Petrie, G.M.; Perry, E.M.; Kirkham, R.R.
1997-09-01
This report describes the work performed at the Pacific Northwest National Laboratory (PNNL) for the U.S. Department of Energy's Office of Nonproliferation and National Security, Office of Research and Development (NN-20). The work supports the NN-20 Broad Area Search and Analysis, a program initiated by NN-20 to improve the detection and classification of undeclared weapons facilities. Ongoing PNNL research activities are described in three main components: image collection, information processing, and change analysis. The Multispectral Airborne Imaging System, which was developed to collect georeferenced imagery in the visible through infrared regions of the spectrum and flown on a light aircraft platform, will supply current land use conditions. The image information extraction software (dynamic clustering and end-member extraction) uses imagery, like the multispectral data collected by the PNNL multispectral system, to efficiently generate landcover information. The advanced change detection uses a priori (benchmark) information, current landcover conditions, and user-supplied rules to rank suspect areas by probable risk of undeclared facilities or proliferation activities. These components, both separately and combined, provide important tools for improving the detection of undeclared facilities.
GammaLib and ctools. A software framework for the analysis of astronomical gamma-ray data
NASA Astrophysics Data System (ADS)
Knödlseder, J.; Mayer, M.; Deil, C.; Cayrou, J.-B.; Owen, E.; Kelley-Hoskins, N.; Lu, C.-C.; Buehler, R.; Forest, F.; Louge, T.; Siejkowski, H.; Kosack, K.; Gerard, L.; Schulz, A.; Martin, P.; Sanchez, D.; Ohm, S.; Hassan, T.; Brau-Nogué, S.
2016-08-01
The field of gamma-ray astronomy has seen important progress during the last decade, yet to date no common software framework has been developed for the scientific analysis of gamma-ray telescope data. We propose to fill this gap by means of the GammaLib software, a generic library that we have developed to support the analysis of gamma-ray event data. GammaLib was written in C++ and all functionality is available in Python through an extension module. Based on this framework we have developed the ctools software package, a suite of software tools that enables flexible workflows to be built for the analysis of Imaging Air Cherenkov Telescope event data. The ctools are inspired by science analysis software available for existing high-energy astronomy instruments, and they follow the modular ftools model developed by the High Energy Astrophysics Science Archive Research Center. The ctools were written in Python and C++, and can be either used from the command line via shell scripts or directly from Python. In this paper we present the GammaLib and ctools software versions 1.0 that were released at the end of 2015. GammaLib and ctools are ready for the science analysis of Imaging Air Cherenkov Telescope event data, and also support the analysis of Fermi-LAT data and the exploitation of the COMPTEL legacy data archive. We propose using ctools as the science tools software for the Cherenkov Telescope Array Observatory.
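For flavor, a typical Python workflow with the ctools package chains a simulation tool into a likelihood fit, with tool parameters set by name as in the published ctools examples; the model and event file names below are placeholders, and parameter details may vary between ctools versions.

    import ctools

    # Simulate an observation (names follow published ctools examples; file
    # names here are placeholders)
    sim = ctools.ctobssim()
    sim['inmodel'] = 'crab.xml'
    sim['outevents'] = 'events.fits'
    sim['caldb'] = 'prod2'
    sim['irf'] = 'South_0.5h'
    sim['ra'], sim['dec'] = 83.63, 22.01
    sim['rad'] = 5.0
    sim['tmin'], sim['tmax'] = 0.0, 1800.0   # MET seconds
    sim['emin'], sim['emax'] = 0.1, 100.0    # TeV
    sim.execute()

    # Fit the model to the simulated events by maximum likelihood
    like = ctools.ctlike()
    like['inobs'] = 'events.fits'
    like['inmodel'] = 'crab.xml'
    like['outmodel'] = 'crab_fit.xml'
    like.execute()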
An Exploration of Software-Based GNSS Signal Processing at Multiple Frequencies
NASA Astrophysics Data System (ADS)
Pasqual Paul, Manuel; Elosegui, Pedro; Lind, Frank; Vazquez, Antonio; Pankratius, Victor
2017-01-01
The Global Navigation Satellite System (GNSS; i.e., GPS, GLONASS, Galileo, and other constellations) has recently grown into numerous areas that go far beyond the traditional scope in navigation. In the geosciences, for example, high-precision GPS has become a powerful tool for a myriad of geophysical applications such as in geodynamics, seismology, paleoclimate, cryosphere, and remote sensing of the atmosphere. Positioning with millimeter-level accuracy can be achieved through carrier-phase-based, multi-frequency signal processing, which mitigates various biases and error sources such as those arising from ionospheric effects. Today, however, most receivers with multi-frequency capabilities are highly specialized hardware receiving systems with proprietary and closed designs, limited interfaces, and significant acquisition costs. This work explores alternatives that are entirely software-based, using Software-Defined Radio (SDR) receivers as a way to digitize the entire spectrum of interest. It presents an overview of existing open-source frameworks and outlines the next steps towards converting GPS software receivers from single-frequency to dual-frequency, geodetic-quality systems. In the future, this development will lead to a more flexible multi-constellation GNSS processing architecture that can be easily reused in different contexts, as well as to further miniaturization of receivers.
Integration of an expert system into a user interface language demonstration
NASA Technical Reports Server (NTRS)
Stclair, D. C.
1986-01-01
The need for a User Interface Language (UIL) has been recognized by the Space Station Program Office as a necessary tool to aid in minimizing the cost of software generation by multiple users. Previous history in the Space Shuttle Program has shown that many different areas of software generation, such as operations, integration, and testing, have each used a different user command language even though the types of operations being performed were similar in many respects. Since the Space Station represents a much more complex software task, a common user command language--a user interface language--is required to support the large spectrum of space station software developers and users. To assist in the selection of an appropriate set of definitions for a UIL, a series of demonstration programs was generated with which to test UIL concepts against specific Space Station scenarios using operators from the astronaut and scientific communities. Because of the importance of expert systems in the space station, it was decided that an expert system should be embedded in the UIL. This would not only provide insight into the UIL components required but would indicate the effectiveness with which an expert system could function in such an environment.
Developing Dependable Software for a System-of-Systems
2005-03-01
...the combined casualties of Union and Confederate forces totaled 26,134 soldiers on a single day of battle. [48] The war of attrition concept was a... attrition to a transformational concept of full-spectrum dominance: the ability of US forces, operating unilaterally or in combination with... the complex system-of-systems, these possible combinations are practically limitless. System "unravelings" seem to have an intelligence of their own...
powerbox: Arbitrarily structured, arbitrary-dimension boxes and log-normal mocks
NASA Astrophysics Data System (ADS)
Murray, Steven G.
2018-05-01
powerbox creates density grids (or boxes) with an arbitrary two-point distribution (i.e. power spectrum). The software works in any number of dimensions, creates Gaussian or Log-Normal fields, and measures power spectra of output fields to ensure consistency. The primary motivation for creating the code was the simple creation of log-normal mock galaxy distributions, but the methodology can be used for other applications.
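A minimal usage sketch following the package's documented pattern (class and function names as published for powerbox; the spectrum and box parameters are arbitrary):

    import numpy as np
    from powerbox import PowerBox, get_power

    # Draw a 2-D Gaussian field with power spectrum P(k) ∝ k^-2
    pb = PowerBox(N=256, dim=2, pk=lambda k: 0.1 * k**-2.0, boxlength=100.0)
    field = pb.delta_x()                     # real-space density contrast grid

    # Measure the power of the output field for a consistency check
    p_k, k = get_power(field, pb.boxlength)
    print(field.shape, k[:3], p_k[:3])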
Subsystem Testing and Flight Test Instrumentation.
1981-04-01
...systems has made the job of the tester increasingly difficult. These systems are being designed to accomplish the entire spectrum of tasks from pure... destinations, targets, and avoidance areas. The software program also allows the aircrew to designate two weapon delivery programs from the... The basic design objective of the system is to provide an increased capability for weapons delivery against preplanned targets when operating at high...
ERIC Educational Resources Information Center
Kitazoe, Noriko; Fujita, Naofumi; Izumoto, Yuji; Terada, Shin-ichi; Hatakenaka, Yuhei
2017-01-01
The purpose of this study was to investigate whether the individuals in the general population with high scores on the Autism Spectrum Quotient constituted a single homogeneous group or not. A cohort of university students (n = 4901) was investigated by cluster analysis based on the original five subscales of the Autism Spectrum Quotient. Based on…
Report of AAPM Task Group 162: Software for planar image quality metrology.
Samei, Ehsan; Ikejimba, Lynda C; Harrawood, Brian P; Rong, John; Cunningham, Ian A; Flynn, Michael J
2018-02-01
The AAPM Task Group 162 aimed to provide a standardized approach for the assessment of image quality in planar imaging systems. This report offers a description of the approach as well as the details of the resultant software bundle to measure detective quantum efficiency (DQE) as well as its basis components and derivatives. The methodology and the associated software include the characterization of the noise power spectrum (NPS) from planar images acquired under specific acquisition conditions, modulation transfer function (MTF) using an edge test object, the DQE, and effective DQE (eDQE). First, a methodological framework is provided to highlight the theoretical basis of the work. Then, a step-by-step guide is included to assist in proper execution of each component of the code. Lastly, an evaluation of the method is included to validate its accuracy against model-based and experimental data. The code was built using a Macintosh OSX operating system. The software package contains all the source codes to permit an experienced user to build the suite on a Linux or other *nix type system. The package further includes manuals and sample images and scripts to demonstrate use of the software for new users. The results of the code are in close alignment with theoretical expectations and published results of experimental data. The methodology and the software package offered in AAPM TG162 can be used as baseline for characterization of inherent image quality attributes of planar imaging systems. © 2017 American Association of Physicists in Medicine.
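At its core, an NPS estimate averages the squared Fourier magnitude of mean-subtracted flat-field regions of interest and applies a pixel-area scaling. The sketch below shows one common convention on synthetic white noise; TG162's implementation adds detrending and normalization details omitted here.

    import numpy as np

    def nps_2d(flats, pixel_pitch_mm):
        # Average |FFT|^2 of mean-subtracted ROIs, scaled by pixel area over
        # ROI size (one common convention for the 2-D noise power spectrum)
        n = flats.shape[1]
        acc = np.zeros((n, n))
        for roi in flats:
            acc += np.abs(np.fft.fft2(roi - roi.mean())) ** 2
        return acc / len(flats) * (pixel_pitch_mm ** 2) / (n * n)

    rng = np.random.default_rng(8)
    rois = rng.normal(100.0, 2.0, (64, 128, 128))  # 64 synthetic flat-field ROIs
    nps = nps_2d(rois, pixel_pitch_mm=0.2)
    print("white-noise NPS level ~", nps.mean())   # ≈ variance * pixel area = 0.16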
Elements of strategic capability for software outsourcing enterprises based on the resource
NASA Astrophysics Data System (ADS)
Shi, Wengeng
2011-10-01
Software outsourcing enterprises are emerging high-tech enterprises, and both their rate of growth and their numbers have been remarkable. Beyond the preferential policies China grants to software outsourcing, the software outsourcing business has a distinctive capacity to upgrade, one that software companies in general have lacked. Viewed from the resource-based theory of the firm, software outsourcing companies hold capabilities and resources that are rare, valuable, and hard to imitate; on this basis we offer an initial framework for their theoretical analysis.
Roberson, Robin; Cameroni, Irene; Toso, Laura; Abebe, Daniel; Bissel, Stephanie; Spong, Catherine Y
2009-02-01
Fetal alcohol syndrome (FAS) is the leading cause of a spectrum of preventable nongenetic learning and behavioral disorders. In adult (FAS) mice, we measured phosphorylated cyclic adenosine monophosphate response element of binding protein (pCREB) staining in hippocampal subregions to evaluate a possible mechanism underlying FAS learning deficits. Pregnant C57BL6/J mice were treated on gestational day 8 with alcohol or control (saline). After learning assessment, the offspring were perfused for immunohistochemistry and brain sections probed using SER 133 pCREB antibody. Relative staining density was assessed using National Institutes of Health Image software. Statistical analysis included analysis of variance with P < .05 considered significant. In all hippocampal subregions, pCREB staining was greater in the control animals than in the alcohol-treated group (P < or = .0001). In utero alcohol exposure decreased pCREB activity in hippocampal subregions of adult mice. The dentate gyrus had the most robust cumulative decrease in pCREB staining, suggesting FAS adult learning deficits may correlate to enhanced dentate gyrus neurodegeneration.
NASA Astrophysics Data System (ADS)
Suarez, J.; Ochoa, L.; Saavedra, F.
2017-07-01
Remote sensing has always been a primary investigative tool for planetary science. In this research, surface albedo data, electromagnetic spectra, and satellite imagery were used to understand glacier dynamics on several bodies of the solar system and how those dynamics relate to their compositions and associated geological processes; this methodology is common in studies of icy moons. Using analysis software, albedo maps and geomorphological analyses were produced that allow interpretation of the different types of ice in the glaciers and their interaction with other materials. Almost all the images were processed in the visible and infrared ranges of the spectrum, and the spectral data were then used to connect reflectance with the chemical and rheological properties of the compounds studied. We conclude that albedo analysis is an effective tool for differentiating materials on the bodies' surfaces, but spectral data are necessary to identify the exact compounds in the glaciers and to reach a better understanding of the icy bodies.
Design and Analysis of a Neuromemristive Reservoir Computing Architecture for Biosignal Processing
Kudithipudi, Dhireesha; Saleh, Qutaiba; Merkel, Cory; Thesing, James; Wysocki, Bryant
2016-01-01
Reservoir computing (RC) is gaining traction in several signal processing domains, owing to its non-linear stateful computation, spatiotemporal encoding, and reduced training complexity over recurrent neural networks (RNNs). Previous studies have shown the effectiveness of software-based RCs for a wide spectrum of applications. A parallel body of work indicates that realizing RNN architectures using custom integrated circuits and reconfigurable hardware platforms yields significant improvements in power and latency. In this research, we propose a neuromemristive RC architecture, with a doubly twisted toroidal structure, that is validated for biosignal processing applications. We exploit device mismatch to implement the random weight distributions within the reservoir and propose mixed-signal subthreshold circuits for energy efficiency. A comprehensive analysis is performed to compare the efficiency of the neuromemristive RC architecture in both digital (reconfigurable) and subthreshold mixed-signal realizations. Both electroencephalogram (EEG) and electromyogram (EMG) biosignal benchmarks are used for validating the RC designs. The proposed RC architecture demonstrated an accuracy of 90% and 84% for epileptic seizure detection and EMG prosthetic finger control, respectively. PMID:26869876
NASA Astrophysics Data System (ADS)
Anandhi, S.; Shyju, T. S.; Gopalakrishnan, R.
2010-11-01
The present article reports the growth of single crystals of a complex of ortho-nitroaniline with picric acid (2[C6H6N2O2]·C6H2(NO2)3OH) (ONAP) by the solution growth (slow evaporation) method at room temperature. Single-crystal XRD, UV-vis spectral analysis and TGA/DTA studies were carried out. FT-IR and Raman spectra were recorded to explore the functional groups. The high-resolution X-ray diffraction curve reveals internal structural low-angle boundaries. The PL spectrum of the title compound shows green emission. Dielectric behaviour was investigated at 33 and 70 °C. The dipole moment and first-order hyperpolarizability (β) values were evaluated using the Gaussian 98W software package with the B3LYP density functional theory (DFT) method. The possible modes of vibration are theoretically predicted by factor group analysis. The mechanical stability of the grown crystal was tested with a Vickers microhardness tester and the work hardening coefficient of the grown material was estimated.
Proton Induced X-Ray Emission (PIXE): Determining the Concentration of Samples
NASA Astrophysics Data System (ADS)
McCarthy, Mallory; Rodriguez Manso, Alis; Pajouhafsar, Yasmin; J Yennello, Sherry
2017-09-01
We used Proton Induced X-ray Emission (PIXE) as an analysis technique to determine the composition of samples, in particular the elemental constituents and their concentrations. Each sample is bombarded with protons, which displaces a lower-level electron and causes a higher-level electron to fall into its place. This displacement produces characteristic x-rays that are `fingerprints' for each element. The protons supplied for the bombardment are produced and accelerated by the K150 proton beam at the Cyclotron Institute at Texas A&M University. The products are detected by three x-ray detectors: XR-100CR Si-PIN, XR-100SDD, and XR-100T CdTe. The peaks of the spectrum are analyzed using a software analysis tool, GUPIXWIN, to determine the concentration of the known elements of each particular sample. The goals of this work are to test-run the Proton Induced X-Ray Emission experimental setup at Texas A&M University (TAMU) and to determine the concentration of thin films containing KBr provided by the TAMU Chemical Engineering Department.
The Comparison of VLBI Data Analysis Using Software Globl and Globk
NASA Astrophysics Data System (ADS)
Guangli, W.; Xiaoya, W.; Jinling, L.; Wenyao, Z.
The comparison of different geodetic data analysis software packages is one of the most frequently discussed topics. In this paper we try to find out the differences between the software packages GLOBL and GLOBK when they are used to process the same set of VLBI data. GLOBL is a software package developed by the VLBI team, geodesy branch, GSFC/NASA, to process geodetic VLBI data using the arc-parameter-elimination algorithm, while GLOBK, which uses a Kalman filtering algorithm, is mainly used in GPS data analysis but is also applied to VLBI data analysis. Our work focuses on whether there are significant differences when the two packages are used to analyze the same VLBI data set, and investigates the reasons for any differences found.
NASA Astrophysics Data System (ADS)
Newbury, Dale E.; Ritchie, Nicholas W. M.
2013-05-01
The typical strategy for analysis of a microscopic particle by scanning electron microscopy/energy dispersive spectrometry x-ray microanalysis (SEM/EDS) is to use a fixed beam placed at the particle center or to continuously overscan to gather an "averaged" x-ray spectrum. While useful, such strategies inevitably concede any possibility of recognizing microstructure within the particle, and such fine scale structure is often critical for understanding the origins, behavior, and fate of particles. Elemental imaging by x-ray mapping has been a mainstay of SEM/EDS analytical practice for many years, but the time penalty associated with mapping with older EDS technology has discouraged its general use and reserved it more for detailed studies that justified the time investment. The emergence of the high throughput, high peak stability silicon drift detector (SDD-EDS) has enabled a more effective particle mapping strategy: "flash" x-ray spectrum image maps can now be recorded in seconds that capture the spatial distribution of major (concentration, C > 0.1 mass fraction) and minor (0.01 <= C <= 0.1) constituents. New SEM/SDD-EDS instrument configurations feature multiple SDDs that view the specimen from widely spaced azimuthal angles. Multiple, simultaneous measurements from different angles enable x-ray spectrometry and mapping that can minimize the strong geometric effects of particles. The NIST DTSA-II software engine is a powerful aid for quantitatively analyzing EDS spectra measured individually as well as for mapping information (available free for Java platforms at: http://www.cstl.nist.gov/div837/837.02/epq/dtsa2/index.html).
Pavlic, Marion; Libiseller, Kathrin; Oberacher, Herbert
2006-09-01
The potential of the combined use of ESI-QqTOF-MS and ESI-QqTOF-MS/MS with mass-spectral library search for the identification of therapeutic and illicit drugs has been evaluated. Reserpine was used for standardizing experimental conditions and for characterization of the performance of the applied mass spectrometric system. Experiments revealed that because of the mass accuracy, the stability of calibration, and the reproducibility of fragmentation, the QqTOF mass spectrometer is an appropriate platform for establishment of a tandem-mass-spectral library. Three-hundred and nineteen substances were used as reference samples to build the spectral library. For each reference compound, product-ion spectra were acquired at ten different collision-energy values between 5 eV and 50 eV. For identification of unknown compounds, a library search algorithm was developed. The closeness of matching between a measured product-ion spectrum and a spectrum stored in the library was characterized by a value called "match probability", which took into account the number of matched fragment ions, the number of fragment ions observed in the two spectra, and the sum of the intensity differences calculated for matching fragments. A large value for the match probability indicated a close match between the measured and the reference spectrum. A unique feature of the library search algorithm-an implemented spectral purification option-enables characterization of multi-contributor fragment-ion spectra. With the aid of this software feature, substances comprising only 1.0% of the total amount of binary mixtures were unequivocally assigned, in addition to the isobaric main contributors. The spectral library was successfully applied to the characterization of 39 forensic casework samples.
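The "match probability" itself is described only qualitatively above; a toy Python illustration of a score built from the three stated ingredients (the number of matched fragment ions, the number of fragment ions observed in the two spectra, and the summed intensity differences of matching fragments) might look like the following, where the tolerance and normalization are invented for the example rather than taken from the paper:

    # Toy fragment-matching score in the spirit of the abstract; spectra are
    # dicts mapping fragment m/z to relative intensity (0..1).
    def match_score(measured, reference, mz_tol=0.02):
        matched = [
            (m_int, r_int)
            for m_mz, m_int in measured.items()
            for r_mz, r_int in reference.items()
            if abs(m_mz - r_mz) <= mz_tol
        ]
        if not matched:
            return 0.0
        n_matched = len(matched)
        n_observed = len(measured) + len(reference)
        intensity_penalty = sum(abs(m - r) for m, r in matched) / n_matched
        # More matches relative to observed fragments, and smaller intensity
        # differences, yield a larger score (normalized here to 0..1).
        return (2.0 * n_matched / n_observed) * (1.0 - intensity_penalty)

    unknown = {175.12: 1.00, 397.21: 0.35, 609.28: 0.10}
    library_entry = {175.11: 0.95, 397.20: 0.40, 609.28: 0.12}
    print(round(match_score(unknown, library_entry), 3))  # 0.96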
ACES: Space shuttle flight software analysis expert system
NASA Technical Reports Server (NTRS)
Satterwhite, R. Scott
1990-01-01
The Analysis Criteria Evaluation System (ACES) is a knowledge based expert system that automates the final certification of the Space Shuttle onboard flight software. Guidance, navigation and control of the Space Shuttle through all its flight phases are accomplished by a complex onboard flight software system. This software is reconfigured for each flight to allow thousands of mission-specific parameters to be introduced and must therefore be thoroughly certified prior to each flight. This certification is performed in ground simulations by executing the software in the flight computers. Flight trajectories from liftoff to landing, including abort scenarios, are simulated and the results are stored for analysis. The current methodology of performing this analysis is repetitive and requires many man-hours. The ultimate goals of ACES are to capture the knowledge of the current experts and improve the quality and reduce the manpower required to certify the Space Shuttle onboard flight software.
The Effects of Development Team Skill on Software Product Quality
NASA Technical Reports Server (NTRS)
Beaver, Justin M.; Schiavone, Guy A.
2006-01-01
This paper provides an analysis of the effect of the skill/experience of the software development team on the quality of the final software product. A method for the assessment of software development team skill and experience is proposed, which was derived from a workforce management tool currently in use by the National Aeronautics and Space Administration. Using data from 26 small-scale software development projects, the team skill measures are correlated to 5 software product quality metrics from the ISO/IEC 9126 Software Engineering Product Quality standard. In the analysis of the results, development team skill is found to be a significant factor in the adequacy of the design and implementation. In addition, the results imply that inexperienced software developers are tasked with responsibilities ill-suited to their skill level, and thus have a significant adverse effect on the quality of the software product. Keywords: software quality, development skill, software metrics
Meta-Analysis of Parent-Mediated Interventions for Young Children with Autism Spectrum Disorder
ERIC Educational Resources Information Center
Nevill, Rose E.; Lecavalier, Luc; Stratis, Elizabeth A.
2018-01-01
A number of studies of parent-mediated interventions in autism spectrum disorder have been published in the last 15 years. We reviewed 19 randomized clinical trials of parent-mediated interventions for children with autism spectrum disorder between the ages of 1 and 6 years and conducted a meta-analysis on their efficacy. Meta-analysis outcomes…
SAO mission support software and data standards, version 1.0
NASA Technical Reports Server (NTRS)
Hsieh, P.
1993-01-01
This document defines the software developed by the SAO AXAF Mission Support (MS) Program and defines standards for the software development process and control of data products generated by the software. The SAO MS is tasked to develop and use software to perform a variety of functions in support of the AXAF mission. Software is developed by software engineers and scientists, and commercial off-the-shelf (COTS) software is used either directly or customized through the use of scripts to implement analysis procedures. Software controls real-time laboratory instruments, performs data archiving, displays data, and generates model predictions. Much software is used in the analysis of data to generate data products that are required by the AXAF project, for example, on-orbit mirror performance predictions or detailed characterization of the mirror reflection performance with energy.
Long-term Preservation of Data Analysis Capabilities
NASA Astrophysics Data System (ADS)
Gabriel, C.; Arviset, C.; Ibarra, A.; Pollock, A.
2015-09-01
While the long-term preservation of scientific data obtained by large astrophysics missions is ensured through science archives, the issue of data analysis software preservation has hardly been addressed. Efforts by large data centres have contributed so far to maintain some instrument or mission-specific data reduction packages on top of high-level general purpose data analysis software. However, it is always difficult to keep software alive without support and maintenance once the active phase of a mission is over. This is especially difficult in the budgetary model followed by space agencies. We discuss the importance of extending the lifetime of dedicated data analysis packages and review diverse strategies under development at ESA using new paradigms such as Virtual Machines, Cloud Computing, and Software as a Service for making possible full availability of data analysis and calibration software for decades at minimal cost.
Teaching meta-analysis using MetaLight.
Thomas, James; Graziosi, Sergio; Higgins, Steve; Coe, Robert; Torgerson, Carole; Newman, Mark
2012-10-18
Meta-analysis is a statistical method for combining the results of primary studies. It is often used in systematic reviews and is increasingly a method and topic that appears in student dissertations. MetaLight is a freely available software application that runs simple meta-analyses and contains specific functionality to facilitate the teaching and learning of meta-analysis. While there are many courses and resources for meta-analysis available and numerous software applications to run meta-analyses, there are few pieces of software which are aimed specifically at helping those teaching and learning meta-analysis. Valuable teaching time can be spent learning the mechanics of a new software application, rather than on the principles and practices of meta-analysis. We discuss ways in which the MetaLight tool can be used to present some of the main issues involved in undertaking and interpreting a meta-analysis. While there are many software tools available for conducting meta-analysis, in the context of a teaching programme such software can require expenditure both in terms of money and in terms of the time it takes to learn how to use it. MetaLight was developed specifically as a tool to facilitate the teaching and learning of meta-analysis and we have presented here some of the ways it might be used in a training situation.
Spacecraft Trajectory Analysis and Mission Planning Simulation (STAMPS) Software
NASA Technical Reports Server (NTRS)
Puckett, Nancy; Pettinger, Kris; Hallstrom, John; Brownfield, Dana; Blinn, Eric; Williams, Frank; Wiuff, Kelli; McCarty, Steve; Ramirez, Daniel; Lamotte, Nicole;
2014-01-01
STAMPS simulates either three- or six-degree-of-freedom cases for all spacecraft flight phases using translated HAL flight software or generic GN&C models. Single or multiple trajectories can be simulated for use in optimization and dispersion analysis. It includes math models for the vehicle and environment, and currently features a "C" version of shuttle onboard flight software. The STAMPS software is used for mission planning and analysis within ascent/descent, rendezvous, proximity operations, and navigation flight design areas.
NASA Technical Reports Server (NTRS)
Moran, Susanne I.
2004-01-01
The On-Orbit Software Analysis Research Infusion Project was done by Intrinsyx Technologies Corporation (Intrinsyx) at the National Aeronautics and Space Administration (NASA) Ames Research Center (ARC). The Project was a joint collaborative effort between NASA Codes IC and SL, Kestrel Technology (Kestrel), and Intrinsyx. The primary objectives of the Project were: discovery and verification of software program properties and dependencies; detection and isolation of software defects across different versions of software; and compilation of historical data and technical expertise for future applications.
An overview of the mathematical and statistical analysis component of RICIS
NASA Technical Reports Server (NTRS)
Hallum, Cecil R.
1987-01-01
Mathematical and statistical analysis components of RICIS (Research Institute for Computing and Information Systems) can be used in the following problem areas: (1) quantification and measurement of software reliability; (2) assessment of changes in software reliability over time (reliability growth); (3) analysis of software-failure data; and (4) decision logic for whether to continue or stop testing software. Other areas of interest to NASA/JSC where mathematical and statistical analysis can be successfully employed include: math modeling of physical systems, simulation, statistical data reduction, evaluation methods, optimization, algorithm development, and mathematical methods in signal processing.
Applications of the pipeline environment for visual informatics and genomics computations
2011-01-01
Background Contemporary informatics and genomics research require efficient, flexible and robust management of large heterogeneous data, advanced computational tools, powerful visualization, reliable hardware infrastructure, interoperability of computational resources, and detailed data and analysis-protocol provenance. The Pipeline is a client-server distributed computational environment that facilitates the visual graphical construction, execution, monitoring, validation and dissemination of advanced data analysis protocols. Results This paper reports on the applications of the LONI Pipeline environment to address two informatics challenges - graphical management of diverse genomics tools, and the interoperability of informatics software. Specifically, this manuscript presents the concrete details of deploying general informatics suites and individual software tools to new hardware infrastructures, the design, validation and execution of new visual analysis protocols via the Pipeline graphical interface, and integration of diverse informatics tools via the Pipeline eXtensible Markup Language syntax. We demonstrate each of these processes using several established informatics packages (e.g., miBLAST, EMBOSS, mrFAST, GWASS, MAQ, SAMtools, Bowtie) for basic local sequence alignment and search, molecular biology data analysis, and genome-wide association studies. These examples demonstrate the power of the Pipeline graphical workflow environment to enable integration of bioinformatics resources which provide a well-defined syntax for dynamic specification of the input/output parameters and the run-time execution controls. Conclusions The LONI Pipeline environment http://pipeline.loni.ucla.edu provides a flexible graphical infrastructure for efficient biomedical computing and distributed informatics research. The interactive Pipeline resource manager enables the utilization and interoperability of diverse types of informatics resources. The Pipeline client-server model provides computational power to a broad spectrum of informatics investigators - experienced developers and novice users, users with or without access to advanced computational resources (e.g., Grid, data), as well as basic and translational scientists. The open development, validation and dissemination of computational networks (pipeline workflows) facilitates the sharing of knowledge, tools, protocols and best practices, and enables the unbiased validation and replication of scientific findings by the entire community. PMID:21791102
Design and evaluation of a THz time domain imaging system using standard optical design software.
Brückner, Claudia; Pradarutti, Boris; Müller, Ralf; Riehemann, Stefan; Notni, Gunther; Tünnermann, Andreas
2008-09-20
A terahertz (THz) time domain imaging system is analyzed and optimized with standard optical design software (ZEMAX). Special requirements to the illumination optics and imaging optics are presented. In the optimized system, off-axis parabolic mirrors and lenses are combined. The system has a numerical aperture of 0.4 and is diffraction limited for field points up to 4 mm and wavelengths down to 750 µm. ZEONEX is used as the lens material. Higher aspherical coefficients are used for correction of spherical aberration and reduction of lens thickness. The lenses were manufactured by ultraprecision machining. For optimization of the system, ray tracing and wave-optical methods were combined. We show how the ZEMAX Gaussian beam analysis tool can be used to evaluate illumination optics. The resolution of the THz system was tested with a wire and a slit target, line gratings of different period, and a Siemens star. The behavior of the temporal line spread function can be modeled with the polychromatic coherent line spread function feature in ZEMAX. The spectral and temporal resolutions of the line gratings are compared with the respective modulation transfer function of ZEMAX. For maximum resolution, the system has to be diffraction limited down to the smallest wavelength of the spectrum of the THz pulse. Then, the resolution on time domain analysis of the pulse maximum can be estimated with the spectral resolution of the center of gravity wavelength. The system resolution near the optical axis on time domain analysis of the pulse maximum is 1 line pair/mm with an intensity contrast of 0.22. The Siemens star is used for estimation of the resolution of the whole system. An eight-channel electro-optic sampling system was used for detection. The resolution on time domain analysis of the pulse maximum of all eight channels could be determined with the Siemens star to be 0.7 line pairs/mm.
NASA Technical Reports Server (NTRS)
Dunn, William R.; Corliss, Lloyd D.
1991-01-01
Paper examines issue of software safety. Presents four case histories of software-safety analysis. Concludes that, to be safe, software, for all practical purposes, must be free of errors. Backup systems still needed to prevent catastrophic software failures.
Dickerson, Jane A; Schmeling, Michael; Hoofnagle, Andrew N; Hoffman, Noah G
2013-01-16
Mass spectrometry provides a powerful platform for performing quantitative, multiplexed assays in the clinical laboratory, but at the cost of increased complexity of analysis and quality assurance calculations compared to other methodologies. Here we describe the design and implementation of a software application that performs quality control calculations for a complex, multiplexed, mass spectrometric analysis of opioids and opioid metabolites. The development and implementation of this application improved our data analysis and quality assurance processes in several ways. First, use of the software significantly improved the procedural consistency for performing quality control calculations. Second, it reduced the amount of time technologists spent preparing and reviewing the data, saving on average over four hours per run, and in some cases improving turnaround time by a day. Third, it provides a mechanism for coupling procedural and software changes with the results of each analysis. We describe several key details of the implementation including the use of version control software and automated unit tests. These generally useful software engineering principles should be considered for any software development project in the clinical lab. Copyright © 2012 Elsevier B.V. All rights reserved.
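As a flavor of the automated-unit-test principle the authors highlight, a QC rule such as "flag a run when a calibrator recovery falls outside +/-20%" can be pinned down with a small pytest case; the rule, names, and numbers here are hypothetical illustrations, not the paper's actual calculations:

    # test_qc.py -- run with `pytest`; rule and tolerance are illustrative.
    def recovery_ok(measured, expected, tolerance=0.20):
        """Return True when the measured value is within +/-20% of expected."""
        return abs(measured - expected) <= tolerance * expected

    def test_recovery_within_tolerance_passes():
        assert recovery_ok(measured=95.0, expected=100.0)

    def test_recovery_outside_tolerance_is_flagged():
        assert not recovery_ok(measured=70.0, expected=100.0)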
NASA Technical Reports Server (NTRS)
Uber, James G.
1988-01-01
Software itself is not hazardous, but since software and hardware share common interfaces there is an opportunity for software to create hazards. Further, these software systems are complex, and proven methods for the design, analysis, and measurement of software safety are not yet available. Some past software failures, future NASA software trends, software engineering methods, and tools and techniques for various software safety analyses are reviewed. Recommendations to NASA are made based on this review.
NASA Technical Reports Server (NTRS)
Tamayo, Tak Chai
1987-01-01
Quality of software is not only vital to the successful operation of the space station; it is also an important factor in establishing testing requirements, the time needed for software verification and integration, and launch schedules for the space station. Defense of management decisions can be greatly strengthened by combining engineering judgments with statistical analysis. Unlike hardware, software has the characteristics of no wearout and costly redundancy, making traditional statistical analysis unsuitable for evaluating software reliability. A statistical model was developed to provide a representation of the number as well as the types of failures that occur during software testing and verification. From this model, quantitative measures of software reliability based on failure history during testing are derived. Criteria to terminate testing based on reliability objectives and methods to estimate the expected number of fixes required are also presented.
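The abstract does not give the model's functional form; as a hedged sketch of the general idea (estimating reliability from failure history and deciding when to stop testing), the following fits a Goel-Okumoto-style reliability growth curve, which is one standard choice and not necessarily the model developed in the paper:

    import numpy as np
    from scipy.optimize import curve_fit

    # Illustrative failure history: cumulative failures seen at test times t.
    t = np.array([10, 20, 30, 40, 50, 60, 70, 80.0])
    cum_failures = np.array([8, 14, 19, 22, 25, 26, 27, 28.0])

    # Goel-Okumoto mean value function: m(t) = a * (1 - exp(-b*t)).
    def m(t, a, b):
        return a * (1.0 - np.exp(-b * t))

    (a, b), _ = curve_fit(m, t, cum_failures, p0=[30.0, 0.05])

    # Expected residual failures; one possible termination criterion is to
    # keep testing until this falls below a reliability objective.
    residual = a - m(t[-1], a, b)
    print(f"estimated total faults a = {a:.1f}, remaining ~ {residual:.1f}")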
Analysis of a hardware and software fault tolerant processor for critical applications
NASA Technical Reports Server (NTRS)
Dugan, Joanne B.
1993-01-01
Computer systems for critical applications must be designed to tolerate software faults as well as hardware faults. A unified approach to tolerating hardware and software faults is characterized by classifying faults in terms of duration (transient or permanent) rather than source (hardware or software). Errors arising from transient faults can be handled through masking or voting, but errors arising from permanent faults require system reconfiguration to bypass the failed component. Most errors which are caused by software faults can be considered transient, in that they are input-dependent. Software faults are triggered by a particular set of inputs. Quantitative dependability analysis of systems which exhibit a unified approach to fault tolerance can be performed by a hierarchical combination of fault tree and Markov models. A methodology for analyzing hardware and software fault tolerant systems is applied to the analysis of a hypothetical system, loosely based on the Fault Tolerant Parallel Processor. The models consider both transient and permanent faults, hardware and software faults, independent and related software faults, automatic recovery, and reconfiguration.
NMR spectrum analysis for CrAs at ambient pressure
NASA Astrophysics Data System (ADS)
Kotegawa, H.; Nakahara, S.; Matsushima, K.; Tou, H.; Matsuoka, E.; Sugawara, H.; Harima, H.
2018-05-01
We report an NMR spectrum analysis for CrAs, which was recently reported to be superconducting under pressure. The NMR spectrum obtained from powdered single crystals shows a typical powder pattern reproduced by the electric field gradient (EFG) parameters and an isotropic Knight shift, indicating that the anisotropy of the Knight shift is not remarkable in CrAs. For the oriented sample, the spectrum can be understood by considering that the crystals are aligned for H ∥ b. The temperature dependence of the Knight shift was successfully obtained from the NMR spectrum despite the large nuclear quadrupole interaction.
NASA Astrophysics Data System (ADS)
Noda, Isao
2014-07-01
A comprehensive survey review of new and noteworthy developments that advanced the frontiers of 2D correlation spectroscopy during the last four years is compiled. This review covers books, proceedings, and review articles published on 2D correlation spectroscopy, a number of significant conceptual developments in the field, data pretreatment methods and other pertinent topics, as well as patent and publication trends and citation activities. Developments discussed include projection 2D correlation analysis, concatenated 2D correlation, and correlation under multiple perturbation effects, as well as orthogonal sample design, predicting 2D correlation spectra, manipulating and comparing 2D spectra, correlation strategy based on segmented data blocks, such as moving-window analysis, features like determination of sequential order and enhanced spectral resolution, statistical 2D spectroscopy using covariance and other statistical metrics, hetero-correlation analysis, and the sample-sample correlation technique. Data pretreatment operations prior to 2D correlation analysis are discussed, including the correction for physical effects, background and baseline subtraction, selection of the reference spectrum, normalization and scaling of data, derivative spectra and deconvolution techniques, and smoothing and noise reduction. Other pertinent topics include chemometrics and statistical considerations, peak position shift phenomena, variable sampling increments, computation and software, and display schemes, such as color-coded format, slice and power spectra, tabulation, and other schemes.
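For readers new to the field, the synchronous and asynchronous 2D correlation spectra at the core of these methods can be computed from a perturbation-ordered series of spectra in a few lines; this is a minimal sketch of Noda's standard formulas using the Hilbert-Noda transformation matrix, with random demonstration data:

    import numpy as np

    def noda_2d_correlation(spectra):
        """spectra: (m, n) array of m spectra (perturbation order) x n channels.
        Returns the synchronous and asynchronous 2D correlation spectra."""
        y = spectra - spectra.mean(axis=0)            # dynamic spectra
        m = y.shape[0]
        sync = y.T @ y / (m - 1)
        # Hilbert-Noda matrix: N[j, k] = 1 / (pi * (k - j)) for k != j, else 0.
        j, k = np.indices((m, m))
        noda = np.where(j == k, 0.0, 1.0 / (np.pi * (k - j + (j == k))))
        asyn = y.T @ noda @ y / (m - 1)
        return sync, asyn

    rng = np.random.default_rng(0)
    demo = rng.random((5, 100)).cumsum(axis=0)        # increasing perturbation
    sync, asyn = noda_2d_correlation(demo)
    print(sync.shape, asyn.shape)                     # (100, 100) (100, 100)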
Analysis and application of Fourier transform spectroscopy in atmospheric remote sensing
NASA Technical Reports Server (NTRS)
Park, J. H.
1984-01-01
An analysis method for Fourier transform spectroscopy is summarized with applications to various types of distortion in atmospheric absorption spectra. This analysis method includes the fast Fourier transform method for simulating the interferometric spectrum and the nonlinear least-squares method for retrieving the information from a measured spectrum. It is shown that spectral distortions can be simulated quite well and that the correct information can be retrieved from a distorted spectrum by this analysis technique.
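A hedged sketch of the two computational pieces named above, the FFT step that turns an interferogram into a spectrum and a nonlinear least-squares retrieval of line parameters from that spectrum; the Gaussian line model is purely illustrative and much simpler than an atmospheric radiative-transfer model:

    import numpy as np
    from scipy.optimize import least_squares

    # 1) FFT step: recover a spectrum from a (simulated) interferogram.
    n, dx = 4096, 1e-4                       # samples, path-difference step (cm)
    nu = np.fft.rfftfreq(n, d=dx)            # wavenumber axis (cm^-1)
    spectrum_true = np.exp(-((nu - 800.0) / 2.0) ** 2)   # one Gaussian line
    interferogram = np.fft.irfft(spectrum_true, n=n)
    spectrum = np.abs(np.fft.rfft(interferogram))

    # 2) Retrieval step: fit line center and width by nonlinear least squares.
    def residual(p):
        center, width = p
        return np.exp(-((nu - center) / width) ** 2) - spectrum

    fit = least_squares(residual, x0=[790.0, 1.0])
    print(fit.x)                             # close to [800.0, 2.0]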
State-of-the-Art Resources (SOAR) for Software Vulnerability Detection, Test, and Evaluation
2014-07-01
...preclude in-depth analysis, and widespread use of a Software-as-a-Service (SaaS) model that limits data availability and application to DoD systems... provide mobile application analysis using a Software-as-a-Service (SaaS) model. In this case, any software to be analyzed must be sent to the... tools are only available through a SaaS model. The widespread use of a Software-as-a-Service (SaaS) model as a sole evaluation model limits data
A Method for Populating the Knowledge Base of AFIT’s Domain-Oriented Application Composition System
1993-12-01
Analysis (FODA). The approach identifies prominent features (similarities) and distinctive features (differences) of software systems within an... analysis approaches we have summarized, the researchers described FODA in sufficient detail to use on large domain analysis projects (ones with... Software Technology Center, July 1991. 18. Kang, Kyo C. and others. Feature-Oriented Domain Analysis (FODA) Feasibility Study. Technical Report, Software
Selection of software for mechanical engineering undergraduates
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheah, C. T.; Yin, C. S.; Halim, T.
A major problem with the undergraduate mechanical engineering course is the limited exposure of students to software packages, coupled with the long learning curve of the existing software packages. This work proposes the use of appropriate software packages across the entire mechanical engineering curriculum to ensure students get sufficient exposure to real-life design problems. A variety of software packages are highlighted as suitable for undergraduate work in mechanical engineering, e.g., simultaneous non-linear equations; uncertainty analysis; 3-D modeling software with FEA; and analysis tools for the solution of problems in thermodynamics, fluid mechanics, mechanical system design, and solid mechanics.
A Case Study of Measuring Process Risk for Early Insights into Software Safety
NASA Technical Reports Server (NTRS)
Layman, Lucas; Basili, Victor; Zelkowitz, Marvin V.; Fisher, Karen L.
2011-01-01
In this case study, we examine software safety risk in three flight hardware systems in NASA's Constellation spaceflight program. We applied our Technical and Process Risk Measurement (TPRM) methodology to the Constellation hazard analysis process to quantify the technical and process risks involving software safety in the early design phase of these projects. We analyzed 154 hazard reports and collected metrics to measure the prevalence of software in hazards and the specificity of descriptions of software causes of hazardous conditions. We found that 49-70% of 154 hazardous conditions could be caused by software or software was involved in the prevention of the hazardous condition. We also found that 12-17% of the 2013 hazard causes involved software, and that 23-29% of all causes had a software control. The application of the TPRM methodology identified process risks in the application of the hazard analysis process itself that may lead to software safety risk.
Knowledge and utilization of computer-software for statistics among Nigerian dentists.
Chukwuneke, F N; Anyanechi, C E; Obiakor, A O; Amobi, O; Onyejiaka, N; Alamba, I
2013-01-01
The use of computer software for statistical analysis has transformed health information and data to their simplest form in the areas of access, storage, retrieval and analysis in the field of research. This survey was therefore carried out to assess the level of knowledge and utilization of computer software for statistical analysis among dental researchers in eastern Nigeria. Questionnaires on the use of computer software for statistical analysis were randomly distributed to 65 practicing dental surgeons with more than 5 years of experience in the tertiary academic hospitals in eastern Nigeria. The focus was on: years of clinical experience; research work experience; and knowledge and application of computer software for data processing and statistical analysis. Sixty-two (62/65; 95.4%) of these questionnaires were returned anonymously and were used in our data analysis. Twenty-nine (29/62; 46.8%) respondents fell within the group with 5-10 years of clinical experience, none of whom had completed the specialist training programme. Practitioners with more than 10 years of clinical experience numbered 33 (33/62; 53.2%), of whom 15 (15/33; 45.5%) are specialists, representing 24.2% (15/62) of the total number of respondents. All 15 specialists are actively involved in research activities, but only five (5/15; 33.3%) can use statistical software unaided. This study has identified poor utilization of computer software for statistical analysis among dental researchers in eastern Nigeria. This is strongly associated with a lack of early exposure to the use of such software, especially during undergraduate training. This calls for the introduction of a computer training programme in the dental curriculum to enable practitioners to develop the habit of using computer software for their research.
Appel, R D; Palagi, P M; Walther, D; Vargas, J R; Sanchez, J C; Ravier, F; Pasquali, C; Hochstrasser, D F
1997-12-01
Although two-dimensional electrophoresis (2-DE) computer analysis software packages have existed ever since 2-DE technology was developed, it is only now that the hardware and software technology allows large-scale studies to be performed on low-cost personal computers or workstations, and that setting up a 2-DE computer analysis system in a small laboratory is no longer considered a luxury. After a first attempt in the seventies and early eighties to develop 2-DE analysis software systems on hardware that had poor or even no graphical capabilities, followed in the late eighties by a wave of innovative software developments that were possible thanks to new graphical interface standards such as XWindows, a third generation of 2-DE analysis software packages has now come to maturity. It can be run on a variety of low-cost, general-purpose personal computers, thus making the purchase of a 2-DE analysis system easily attainable for even the smallest laboratory that is involved in proteome research. Melanie II 2-D PAGE, developed at the University Hospital of Geneva, is such a third-generation software system for 2-DE analysis. Based on unique image processing algorithms, this user-friendly object-oriented software package runs on multiple platforms, including Unix, MS-Windows 95 and NT, and Power Macintosh. It provides efficient spot detection and quantitation, state-of-the-art image comparison, statistical data analysis facilities, and is Internet-ready. Linked to proteome databases such as those available on the World Wide Web, it represents a valuable tool for the "Virtual Lab" of the post-genome area.
FunRich proteomics software analysis, let the fun begin!
Benito-Martin, Alberto; Peinado, Héctor
2015-08-01
Protein MS analysis is the preferred method for unbiased protein identification. It is normally applied to a large number of both small-scale and high-throughput studies. However, user-friendly computational tools for protein analysis are still needed. In this issue, Mathivanan and colleagues (Proteomics 2015, 15, 2597-2601) report the development of FunRich software, an open-access software that facilitates the analysis of proteomics data, providing tools for functional enrichment and interaction network analysis of genes and proteins. FunRich is a reinterpretation of proteomic software, a standalone tool combining ease of use with customizable databases, free access, and graphical representations. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anthony, Stephen
The Sandia hyperspectral upper-bound spectrum algorithm (hyper-UBS) is a cosmic ray despiking algorithm for hyperspectral data sets. When naturally-occurring, high-energy (gigaelectronvolt) cosmic rays impact the earth's atmosphere, they create an avalanche of secondary particles which will register as a large, positive spike on any spectroscopic detector they hit. Cosmic ray spikes are therefore an unavoidable spectroscopic contaminant which can interfere with subsequent analysis. A variety of cosmic ray despiking algorithms already exist and can potentially be applied to hyperspectral data matrices, most notably the upper-bound spectrum data matrices (UBS-DM) algorithm by Dongmao Zhang and Dor Ben-Amotz which served as the basis for the hyper-UBS algorithm. However, the existing algorithms either cannot be applied to hyperspectral data, require information that is not always available, introduce undesired spectral bias, or have otherwise limited effectiveness for some experimentally relevant conditions. Hyper-UBS is more effective at removing a wider variety of cosmic ray spikes from hyperspectral data without introducing undesired spectral bias. In addition to the core algorithm the Sandia hyper-UBS software package includes additional source code useful in evaluating the effectiveness of the hyper-UBS algorithm. The accompanying source code includes code to generate simulated hyperspectral data contaminated by cosmic ray spikes, several existing despiking algorithms, and code to evaluate the performance of the despiking algorithms on simulated data.
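The hyper-UBS algorithm itself is only summarized above; as a generic illustration of the despiking problem (replicate scans with large positive outliers replaced channel by channel), the following median/MAD filter conveys the idea, and is explicitly not the Sandia algorithm:

    import numpy as np

    def despike_replicates(scans, k=8.0):
        """scans: (r, n) array of r replicate spectra over n channels.
        Replace values far above the channel median (candidate cosmic-ray
        spikes) with that median. Generic illustration, not hyper-UBS."""
        med = np.median(scans, axis=0)
        mad = np.median(np.abs(scans - med), axis=0) + 1e-12
        spikes = (scans - med) > k * 1.4826 * mad    # positive outliers only
        return np.where(spikes, med, scans), spikes

    rng = np.random.default_rng(1)
    scans = rng.normal(100.0, 2.0, size=(6, 512))
    scans[2, 40] += 5000.0                           # inject one spike
    cleaned, spikes = despike_replicates(scans)
    print(int(spikes.sum()))                         # typically 1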
Dickinson, Kathleen; Place, Maurice
2016-06-01
Problems with social functioning are a major area of difficulty for children with autism. Such problems have the potential to exert a negative influence on several aspects of the children's functioning, including their ability to access education. This study looked to examine if a computer-based activity program could improve the social functioning of these children. Using a pooled subject design, 100 children with autistic spectrum disorder were randomly allocated, controlling where possible for age and gender, to either an intervention or a control group. The children in the intervention group were encouraged to use the Nintendo (Kyoto, Japan) Wii™ and the software package "Mario & Sonic at the Olympics" in addition to their routine school physical education classes over a 9-month period. The control group attended only the routine physical education classes. After 1 year, analysis of the changes in the scores of teacher-completed measures of social functioning showed that boys in the intervention group had made statistically significant improvement in their functioning when compared with controls. The number of girls in the study was too small for any change to reach statistical significance. This type of intervention appears to have potential as a mechanism to produce improvement in the social functioning, at least of boys, as part of a physical education program.
A Meta-Analysis of the Social Communication Questionnaire: Screening for Autism Spectrum Disorder
ERIC Educational Resources Information Center
Chesnut, Steven R.; Wei, Tianlan; Barnard-Brak, Lucy; Richman, David M.
2017-01-01
The current meta-analysis examines the previous research on the utility of the Social Communication Questionnaire as a screening instrument for autism spectrum disorder. Previously published reports have highlighted the inconsistencies between Social Communication Questionnaire-screening results and formal autism spectrum disorder diagnoses. The…
Reconfigurable, Cognitive Software-Defined Radio
NASA Technical Reports Server (NTRS)
Bhat, Arvind
2015-01-01
Software-defined radio (SDR) technology allows radios to be reconfigured to perform different communication functions without using multiple radios to accomplish each task. Intelligent Automation, Inc., has developed SDR platforms that switch adaptively between different operation modes. The innovation works by modifying both transmit waveforms and receiver signal processing tasks. In Phase I of the project, the company developed SDR cognitive capabilities, including adaptive modulation and coding (AMC), automatic modulation recognition (AMR), and spectrum sensing. In Phase II, these capabilities were integrated into SDR platforms. The reconfigurable transceiver design employs high-speed field-programmable gate arrays, enabling multimode operation and scalable architecture. Designs are based on commercial off-the-shelf (COTS) components and are modular in nature, making it easier to upgrade individual components rather than redesigning the entire SDR platform as technology advances.
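As a flavor of the spectrum-sensing capability mentioned (the AMC and AMR functions are considerably more involved), here is a minimal energy-detector sketch in Python with an invented threshold rule, not the product's actual design:

    import numpy as np

    def channel_occupied(samples, noise_power, threshold_db=3.0):
        """Simple energy detector: declare the channel occupied when measured
        power exceeds the assumed noise floor by threshold_db decibels."""
        power = np.mean(np.abs(samples) ** 2)
        return 10.0 * np.log10(power / noise_power) > threshold_db

    rng = np.random.default_rng(2)
    n, noise_power = 4096, 1.0
    noise = (rng.normal(size=n) + 1j * rng.normal(size=n)) / np.sqrt(2)
    tone = 1.5 * np.exp(2j * np.pi * 0.1 * np.arange(n))  # a strong carrier
    print(channel_occupied(noise, noise_power))           # False (idle)
    print(channel_occupied(noise + tone, noise_power))    # True (occupied)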
A Reference Model for Software and System Inspections. White Paper
NASA Technical Reports Server (NTRS)
He, Lulu; Shull, Forrest
2009-01-01
Software Quality Assurance (SQA) is an important component of the software development process. SQA processes provide assurance that the software products and processes in the project life cycle conform to their specified requirements by planning, enacting, and performing a set of activities to provide adequate confidence that quality is being built into the software. Typical techniques include: (1) testing; (2) simulation; (3) model checking; (4) symbolic execution; (5) management reviews; (6) technical reviews; (7) inspections; (8) walk-throughs; (9) audits; (10) analysis (complexity analysis, control flow analysis, algorithmic analysis); and (11) formal methods. Our work over the last few years has resulted in substantial knowledge about SQA techniques, especially the areas of technical reviews and inspections. But can we apply the same QA techniques to the system development process? If yes, what kind of tailoring do we need before applying them in the system engineering context? If not, what types of QA techniques are actually used at the system level? And is there any room for improvement? After a brief examination of the system engineering literature (especially focused on NASA and DoD guidance) we found that: (1) the system and software development processes interact with each other at different phases of the development life cycle; (2) reviews are emphasized in both system and software development, and for some reviews (e.g. SRR, PDR, CDR) there are both system versions and software versions; (3) analysis techniques are emphasized (e.g. Fault Tree Analysis, Preliminary Hazard Analysis) and some details are given about how to apply them; and (4) reviews are expected to use the outputs of the analysis techniques; in other words, these particular analyses are usually conducted in preparation for (before) reviews. The goal of our work is to explore the interaction between the Quality Assurance (QA) techniques at the system level and the software level.
Research on response spectrum of dam based on scenario earthquake
NASA Astrophysics Data System (ADS)
Zhang, Xiaoliang; Zhang, Yushan
2017-10-01
Taking a large hydropower station as an example, the response spectrum based on a scenario earthquake is determined. First, the potential seismic source making the greatest contribution to the site is determined on the basis of the results of probabilistic seismic hazard analysis (PSHA). Second, the magnitude and epicentral distance of the scenario earthquake are calculated according to the main faults and historical earthquakes of the potential seismic source zone. Finally, the response spectrum of the scenario earthquake is calculated using the Next Generation Attenuation (NGA) relations. The response spectrum obtained by the scenario earthquake method is lower than the probability-consistent response spectrum obtained by the PSHA method. The analysis shows that the scenario earthquake response spectrum accounts for both the probability level and the structural factors, combining the advantages of the deterministic and probabilistic seismic hazard analysis methods; it is easier for practitioners to accept and provides a basis for the seismic design of hydraulic engineering works.
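For orientation, a response spectrum of this kind is the peak response of a family of damped single-degree-of-freedom oscillators; the sketch below computes a pseudo-acceleration spectrum from a synthetic accelerogram with Newmark average-acceleration integration, as a generic illustration (the paper instead evaluates NGA attenuation relations, which are not reproduced here):

    import numpy as np

    def response_spectrum(accel, dt, periods, damping=0.05):
        """Pseudo-acceleration spectrum of a ground-motion record (unit mass),
        via Newmark average-acceleration time stepping."""
        beta, gamma = 0.25, 0.5
        sa = []
        for T in periods:
            wn = 2.0 * np.pi / T
            c, k = 2.0 * damping * wn, wn ** 2
            kh = k + gamma * c / (beta * dt) + 1.0 / (beta * dt ** 2)
            u, v, a, umax = 0.0, 0.0, -accel[0], 0.0
            for ag in accel[1:]:
                ph = (-ag
                      + u / (beta * dt ** 2) + v / (beta * dt)
                      + (0.5 / beta - 1.0) * a
                      + c * (gamma * u / (beta * dt)
                             + (gamma / beta - 1.0) * v
                             + dt * (0.5 * gamma / beta - 1.0) * a))
                un = ph / kh
                vn = (gamma * (un - u) / (beta * dt)
                      + (1.0 - gamma / beta) * v
                      + dt * (1.0 - 0.5 * gamma / beta) * a)
                a = (un - u) / (beta * dt ** 2) - v / (beta * dt) \
                    - (0.5 / beta - 1.0) * a
                u, v = un, vn
                umax = max(umax, abs(u))
            sa.append(wn ** 2 * umax)     # pseudo-spectral acceleration
        return np.array(sa)

    dt = 0.01
    t = np.arange(0.0, 10.0, dt)
    ag = 0.3 * 9.81 * np.sin(2 * np.pi * 2.0 * t) * np.exp(-0.3 * t)
    print(response_spectrum(ag, dt, periods=np.linspace(0.05, 3.0, 30)).max())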
Second Generation Product Line Engineering Takes Hold in the DoD
2014-01-01
Feature-Oriented Domain Analysis (FODA) Feasibility Study" (CMU/SEI-90-TR-021, ADA235785). Pittsburgh, PA: Software Engineering Institute... software product line engineering and software architecture documentation and analysis. Clements is co-author of three practitioner-oriented books about
2008-09-01
...software facilitates targeting problem understanding, and the network analysis tool Palantir serves as an efficient and tailored semi-automated means to... [Table-of-contents residue: objectives using Compendium software; hot target prioritization and development using Palantir software]
Software Defined Network Monitoring Scheme Using Spectral Graph Theory and Phantom Nodes
2014-09-01
networks is the emergence of software-defined networking (SDN) [1]. SDN has existed for the... Chapter III for network monitoring. A. SOFTWARE DEFINED NETWORKS. SDNs provide a new and innovative method to simplify network hardware by logically... and R. Giladi, "Performance analysis of software-defined networking (SDN)," in Proc. of IEEE 21st International Symposium on Modeling, Analysis
Leak detection in medium density polyethylene (MDPE) pipe using pressure transient method
NASA Astrophysics Data System (ADS)
Amin, M. M.; Ghazali, M. F.; PiRemli, M. A.; Hamat, A. M. A.; Adnan, N. F.
2015-12-01
Water is an essential commodity in daily life, from residential and commercial consumers to industrial users. This study emphasizes the detection of leaks in medium density polyethylene (MDPE) pipe using the pressure transient method. The position of the leakage in the pipeline is analyzed using the Ensemble Empirical Mode Decomposition (EEMD) method with signal masking. A water hammer induces an impulse throughout the pipeline that causes a surge wave in the system; a solenoid valve is therefore used to create a water hammer through the pipelines. The data from the pressure sensor are collected using DASYLab software. The pressure signal is decomposed into a series of wave components using the EEMD signal-masking method in matrix laboratory (MATLAB) software. From the series of decomposed signals, those reflecting intrinsic mode functions (IMFs) are carefully selected. These IMFs are displayed using a mathematical algorithm known as the Hilbert transform (HT) spectrum, and the IMF signals are analysed to capture the differences. The analyzed data are compared with the actual position of the leakage in terms of percentage error. The error recorded is below 1%, which shows that this method is highly reliable and accurate for leak detection.
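A hedged sketch of the signal-processing chain described, assuming the open-source PyEMD package (which provides an EEMD class) for the decomposition and SciPy's hilbert for the analytic signal; the signal here is a synthetic stand-in for a pressure-transient record, and the masking variant used in the paper is not reproduced:

    import numpy as np
    from PyEMD import EEMD              # pip install EMD-signal (assumed)
    from scipy.signal import hilbert

    # Synthetic stand-in for a pressure transient after sudden valve closure.
    fs = 1000.0
    t = np.arange(0.0, 2.0, 1.0 / fs)
    signal = (np.exp(-2.0 * t) * np.sin(2 * np.pi * 35.0 * t)
              + 0.2 * np.sin(2 * np.pi * 5.0 * t)
              + 0.05 * np.random.default_rng(3).normal(size=t.size))

    # Decompose into IMFs, then inspect instantaneous amplitude and frequency
    # of the leading IMF via the Hilbert transform.
    imfs = EEMD(trials=50).eemd(signal, t)
    analytic = hilbert(imfs[0])
    inst_amp = np.abs(analytic)
    inst_freq = np.diff(np.unwrap(np.angle(analytic))) * fs / (2 * np.pi)
    print(imfs.shape, float(inst_freq.mean()))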
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kartsaklis, Christos; Hernandez, Oscar R
Interrogating the structure of a program for patterns of interest is attractive to the broader spectrum of software engineering. The very approach by which a pattern is constructed remains a concern for the source-code mining community. This paper presents a pattern programming model for the C and Fortran programming languages using a compiler-directives approach. We discuss our specification, called HERCULES/PL, through a number of examples and show how different patterns can be constructed, plus some preliminary results.
NASA Astrophysics Data System (ADS)
Gusev, A. A.; Chuluunbaatar, O.; Vinitsky, S. I.; Derbov, V. L.; Hai, L. L.; Kazaryan, E. M.; Sarkisyan, H. A.
2018-04-01
We present new calculation schemes using high-order finite element method implemented on unstructured grids with triangle elements for solving boundary-value problems that describe axially symmetric quantum dots. The efficiency of the algorithms and software is demonstrated by benchmark calculations of the energy spectrum, the envelope eigenfunctions of electron, hole and exciton states, and the direct interband light absorption in conical and spheroidal impenetrable quantum dots.
Chaos in War: Is It Present and What Does It Mean?
1994-06-01
...the Poincare map. The results of this work indicated that chaos is, in fact, present in warfare. The implications of this result include... confirmed the validity of our software and provided us with the Poincare maps and the power spectrum. LTC Pentland of the School for Advanced... nearly identical initial conditions for the logistics equation... [List-of-figures residue: Poincare map; phase-space trajectories of chaotic systems]
Coexistence Analysis of Civil Unmanned Aircraft Systems at Low Altitudes
NASA Astrophysics Data System (ADS)
Zhou, Yuzhe
2016-11-01
The demand for unmanned aircraft systems in civil areas is growing. However, ensuring the flight efficiency and safety of unmanned aircraft places critical requirements on wireless communication spectrum resources. Current research mainly focuses on spectrum availability. In this paper, unmanned aircraft system communication models, including a coverage model and a data rate model, and two coexistence analysis procedures, i.e., the interference-to-noise ratio criterion and the frequency-distance-direction criterion, are proposed to analyze the spectrum requirements and interference behavior of civil unmanned aircraft systems at low altitudes, and explicit explanations are provided. The proposed coexistence analysis criteria are applied to assess unmanned aircraft systems' uplink and downlink interference performance and to support corresponding spectrum planning. Numerical results demonstrate that the proposed assessments and analysis procedures satisfy the requirements of flexible spectrum access and safe coexistence among multiple unmanned aircraft systems.
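A minimal numeric sketch of the interference-to-noise (I/N) criterion mentioned above, using free-space path loss and illustrative link numbers (the paper's actual parameters are not given here); a -6 dB I/N threshold is a commonly used protection criterion:

    import math

    def fspl_db(freq_hz, dist_m):
        """Free-space path loss in dB."""
        return 20 * math.log10(dist_m) + 20 * math.log10(freq_hz) - 147.55

    # Illustrative case: an interfering transmitter vs. a victim UAS receiver.
    eirp_dbm = 30.0                          # interferer EIRP (assumed)
    freq_hz, dist_m = 5.06e9, 20e3           # C-band-like link, 20 km apart
    noise_dbm = -174 + 10 * math.log10(1e6) + 5   # kTB over 1 MHz + 5 dB NF

    i_dbm = eirp_dbm - fspl_db(freq_hz, dist_m)   # received interference
    i_over_n = i_dbm - noise_dbm
    verdict = "exceeds" if i_over_n > -6.0 else "meets"
    print(f"I/N = {i_over_n:.1f} dB, {verdict} a -6 dB criterion")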
NASA Astrophysics Data System (ADS)
Zhang, G. Q.; To, S.
2014-08-01
Cutting force and its power spectrum analysis is considered an effective method for monitoring tool wear in many cutting processes, and a significant body of research has been conducted in this area. However, relatively little comparable research exists for ultra-precision fly cutting. In this paper, a group of experiments was carried out to investigate the cutting forces and their power spectrum characteristics under different tool wear stages. The results reveal that the cutting force increases as tool wear progresses. The cutting force signals under different tool wear stages were analyzed using power spectrum analysis, which indicates that a characteristic frequency does exist in the power spectrum of the cutting force, whose power spectral density increases with increasing tool wear; this characteristic frequency could be adopted to monitor diamond tool wear in ultra-precision fly cutting.
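A minimal sketch of the power-spectrum step in Python, using SciPy's Welch estimator on a synthetic force signal; the 3 kHz characteristic frequency, sampling rate, and wear levels are invented for illustration only:

    import numpy as np
    from scipy.signal import welch

    fs = 20000.0                          # force-sensor sampling rate (assumed)
    t = np.arange(0.0, 1.0, 1.0 / fs)
    rng = np.random.default_rng(4)

    def synthetic_force(wear):
        """Baseline cutting force plus a wear-sensitive tonal component."""
        return (10.0 + 5.0 * wear * np.sin(2 * np.pi * 3000.0 * t)
                + rng.normal(scale=1.0, size=t.size))

    for wear in (0.1, 0.5, 1.0):          # increasing tool-wear levels
        f, psd = welch(synthetic_force(wear), fs=fs, nperseg=2048)
        band = (f > 2900.0) & (f < 3100.0)
        print(f"wear = {wear:.1f}: PSD peak near 3 kHz = {psd[band].max():.3f}")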
Development of new vibration energy flow analysis software and its applications to vehicle systems
NASA Astrophysics Data System (ADS)
Kim, D.-J.; Hong, S.-Y.; Park, Y.-H.
2005-09-01
The energy flow analysis (EFA) offers very promising results in predicting the noise and vibration responses of system structures in medium-to-high frequency ranges. We have developed energy flow finite element method (EFFEM) based software, EFADSC++ R4, for vibration analysis. The software can analyze system structures composed of beam, plate, spring-damper, and rigid-body elements, among many other developed components, and has many useful analysis functions. For convenient use of the software, its main functions are modularized into translator, model-converter, and solver modules. The translator module makes it possible to use a finite element (FE) model for the vibration analysis. The model-converter module changes the FE model into an energy flow finite element (EFFE) model, generates joint elements to cover the vibrational attenuation in complex structures composed of various elements, and can solve the joint element equations very quickly by using the wave transmission approach. The solver module supports various direct and iterative solvers for multi-DOF structures. Predictions of vibration for real vehicles using the developed software were performed successfully.
The Implication of Using NVivo Software in Qualitative Data Analysis: Evidence-Based Reflections.
Zamawe, F C
2015-03-01
For a long time, electronic data analysis has been associated with quantitative methods. However, Computer Assisted Qualitative Data Analysis Software (CAQDAS) packages are increasingly being developed. Although CAQDAS has existed for decades, very few qualitative health researchers report using it. This may be due to the difficulties that one has to go through to master the software and the misconceptions associated with using CAQDAS. While the issue of mastering CAQDAS has received ample attention, little has been done to address the misconceptions associated with it. In this paper, the author reflects on his experience of interacting with one of the popular CAQDAS packages (NVivo) in order to provide evidence-based implications of using the software. The key message is that, unlike statistical software, the main function of CAQDAS is not to analyse data but rather to aid the analysis process, which the researcher must always remain in control of. In other words, researchers must know that no software can analyse qualitative data by itself. CAQDAS packages are basically data management packages which support the researcher during analysis.
Description of the GMAO OSSE for Weather Analysis Software Package: Version 3
NASA Technical Reports Server (NTRS)
Koster, Randal D. (Editor); Errico, Ronald M.; Prive, Nikki C.; Carvalho, David; Sienkiewicz, Meta; El Akkraoui, Amal; Guo, Jing; Todling, Ricardo; McCarty, Will; Putman, William M.;
2017-01-01
The Global Modeling and Assimilation Office (GMAO) at the NASA Goddard Space Flight Center has developed software and products for conducting observing system simulation experiments (OSSEs) for weather analysis applications. Such applications include estimating the potential effects of new observing instruments or data assimilation techniques on improving weather analysis and forecasts. The GMAO software creates simulated observations from nature run (NR) data sets and adds simulated errors to those observations. The algorithms employed are much more sophisticated, and add a much greater degree of realism, than those of OSSE systems currently available elsewhere. The algorithms employed, software designs, and validation procedures are described in this document. Instructions for using the software are also provided.
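A toy sketch of the basic OSSE step described above, in Python: sample a nature-run field at observation locations and add simulated errors. The field, locations, and error model are all hypothetical, and the GMAO system's actual error modelling is far more sophisticated:

    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_observations(nature_run, obs_idx, bias=0.0, sigma=1.0):
        # Sample the nature-run "truth" at observation locations and add
        # a simple bias + Gaussian noise error model (illustrative only).
        truth = nature_run[obs_idx]
        return truth + bias + sigma * rng.standard_normal(truth.shape)

    # Hypothetical 1-D nature-run field observed at random points.
    field = np.sin(np.linspace(0, 2 * np.pi, 1000))
    idx = rng.integers(0, 1000, size=50)
    obs = simulate_observations(field, idx, bias=0.1, sigma=0.3)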
ERIC Educational Resources Information Center
Margerum-Leys, Jon; Kupperman, Jeff; Boyle-Heimann, Kristen
This paper presents perspectives on the use of data analysis software in the process of qualitative research. These perspectives were gained in the conduct of three qualitative research studies that differed in theoretical frames, areas of interests, and scope. Their common use of a particular data analysis software package allows the exploration…
ElectroMagnetoEncephalography Software: Overview and Integration with Other EEG/MEG Toolboxes
Peyk, Peter; De Cesarei, Andrea; Junghöfer, Markus
2011-01-01
EMEGS (electromagnetic encephalography software) is a MATLAB toolbox designed to provide novice as well as expert users in the field of neuroscience with a variety of functions to perform analysis of EEG and MEG data. The software consists of a set of graphical interfaces devoted to preprocessing, analysis, and visualization of electromagnetic data. Moreover, it can be extended using a plug-in interface. Here, an overview of the capabilities of the toolbox is provided, together with a simple tutorial for both a standard ERP analysis and a time-frequency analysis. Latest features and future directions of the software development are presented in the final section. PMID:21577273
ERIC Educational Resources Information Center
Borman, Stuart A.
1985-01-01
Discusses various aspects of scientific software, including evaluation and selection of commercial software products; program exchanges, catalogs, and other information sources; major data analysis packages; statistics and chemometrics software; and artificial intelligence. (JN)
[A basic research to share Fourier transform near-infrared spectrum information resource].
Zhang, Lu-Da; Li, Jun-Hui; Zhao, Long-Lian; Zhao, Li-Li; Qin, Fang-Li; Yan, Yan-Lu
2004-08-01
This paper explores a method to share the information resources in a database of Fourier transform near-infrared (FTNIR) spectra of agricultural products and to utilize the spectrum information fully. Mapping spectrum information from one instrument to another is studied so that spectrum information can be expressed accurately across instruments. The mapped spectrum information is then used to establish a quantitative analysis model without including standard samples. For the protein content of twenty-two wheat samples, the correlation coefficient r between the model estimates and the Kjeldahl values is 0.941 with a relative error of 3.28%, while for the other model, established using standard samples, r is 0.963 with a relative error of 2.4%. This shows that spectrum information can be shared by using mapped spectrum information. It can thus be concluded that the spectrum information in one FTNIR spectrum database can be transformed into another instrument's mapped spectrum information, which makes full use of the information resources in the FTNIR spectrum database and realizes resource sharing between different instruments.
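The mapping step can be illustrated with a generic direct-standardization-style sketch in Python: a linear map is fitted so that spectra from one instrument approximate those from another on a transfer set. The data and ridge parameter are hypothetical, and the paper's actual mapping procedure may differ:

    import numpy as np

    def fit_spectrum_mapping(master, slave, ridge=1e-6):
        # Fit a linear map F so that slave @ F approximates the master
        # spectra (ridge-regularized least squares; illustrative only).
        n_wl = slave.shape[1]
        A = slave.T @ slave + ridge * np.eye(n_wl)
        return np.linalg.solve(A, slave.T @ master)

    # Hypothetical transfer set: same samples measured on two instruments.
    rng = np.random.default_rng(1)
    master = rng.random((30, 200))                         # instrument A
    slave = master * 0.95 + 0.02 * rng.random((30, 200))   # instrument B
    F = fit_spectrum_mapping(master, slave)
    mapped = slave @ F        # instrument-B spectra mapped into A's space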
Software selection based on analysis and forecasting methods, practised in 1C
NASA Astrophysics Data System (ADS)
Vazhdaev, A. N.; Chernysheva, T. Y.; Lisacheva, E. I.
2015-09-01
The research focuses on the built-in mechanisms of the "1C: Enterprise 8" platform for data analysis and forecasting. It is important to evaluate and select proper software in order to develop effective customer relationship management strategies for sales, as well as for the implementation and further maintenance of software. The research data allow the creation of new forecast models to schedule further software distribution.
Software for Real-Time Analysis of Subsonic Test Shot Accuracy
2014-03-01
used the C++ programming language, the Open Source Computer Vision (OpenCV®) software library, and Microsoft Windows® Application Programming... video for comparison through OpenCV image analysis tools. Based on the comparison, the software then computed the coordinates of each shot relative to... DWB researchers wanted to use the Open Source Computer Vision (OpenCV) software library for capturing and analyzing frames of video. OpenCV contains...
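From the fragments above, the software appears to compare video frames against a reference through OpenCV image analysis. A minimal sketch of one plausible way to do this (Python OpenCV bindings; frame differencing and contour centroids, with threshold values that are guesses, not the report's actual algorithm):

    import cv2

    def detect_shot(reference, frame, thresh=40):
        # Difference the frame against a reference image and return the
        # centroid of the largest changed region as the shot location.
        diff = cv2.absdiff(cv2.cvtColor(reference, cv2.COLOR_BGR2GRAY),
                           cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
        _, mask = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None
        m = cv2.moments(max(contours, key=cv2.contourArea))
        if m["m00"] == 0:
            return None
        return (m["m10"] / m["m00"], m["m01"] / m["m00"])  # (x, y) pixels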
Software ion scan functions in analysis of glycomic and lipidomic MS/MS datasets.
Haramija, Marko
2018-03-01
Hardware ion scan functions unique to tandem mass spectrometry (MS/MS) mode of data acquisition, such as precursor ion scan (PIS) and neutral loss scan (NLS), are important for selective extraction of key structural data from complex MS/MS spectra. However, their software counterparts, software ion scan (SIS) functions, are still not regularly available. Software ion scan functions can be easily coded for additional functionalities, such as software multiple precursor ion scan, software no ion scan, and software variable ion scan functions. These are often necessary, since they allow more efficient analysis of complex MS/MS datasets, often encountered in glycomics and lipidomics. Software ion scan functions can be easily coded by using modern script languages and can be independent of instrument manufacturer. Here we demonstrate the utility of SIS functions on a medium-size glycomic MS/MS dataset. Knowledge of sample properties, as well as of diagnostic and conditional diagnostic ions crucial for data analysis, was needed. Based on the tables constructed with the output data from the SIS functions performed, a detailed analysis of a complex MS/MS glycomic dataset could be carried out in a quick, accurate, and efficient manner. Glycomic research is progressing slowly, and with respect to the MS experiments, one of the key obstacles for moving forward is the lack of appropriate bioinformatic tools necessary for fast analysis of glycomic MS/MS datasets. Adding novel SIS functionalities to the glycomic MS/MS toolbox has a potential to significantly speed up the glycomic data analysis process. Similar tools are useful for analysis of lipidomic MS/MS datasets as well, as will be discussed briefly. Copyright © 2017 John Wiley & Sons, Ltd.
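Since the paper notes that SIS functions are easily coded in modern scripting languages, a minimal Python sketch of a software precursor ion scan and a software neutral loss scan over an in-memory dataset might look as follows; the data structure and m/z values are illustrative only:

    def precursor_ion_scan(spectra, fragment_mz, tol=0.01):
        # Software PIS: precursors whose MS/MS spectrum contains a given
        # fragment ion. `spectra` maps precursor m/z -> fragment m/z list.
        return [prec for prec, frags in spectra.items()
                if any(abs(f - fragment_mz) <= tol for f in frags)]

    def neutral_loss_scan(spectra, loss, tol=0.01):
        # Software NLS: precursors showing a fragment at (precursor - loss),
        # i.e. a constant neutral loss.
        return [prec for prec, frags in spectra.items()
                if any(abs((prec - f) - loss) <= tol for f in frags)]

    # Hypothetical glycomic dataset.
    spectra = {657.2: [204.087, 511.14], 512.3: [204.087], 433.1: [292.1]}
    print(precursor_ion_scan(spectra, 204.087))  # HexNAc oxonium ion
    print(neutral_loss_scan(spectra, 146.06))    # e.g. a deoxyhexose loss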
New software for statistical analysis of Cambridge Structural Database data
Sykes, Richard A.; McCabe, Patrick; Allen, Frank H.; Battle, Gary M.; Bruno, Ian J.; Wood, Peter A.
2011-01-01
A collection of new software tools is presented for the analysis of geometrical, chemical and crystallographic data from the Cambridge Structural Database (CSD). This software supersedes the program Vista. The new functionality is integrated into the program Mercury in order to provide statistical, charting and plotting options alongside three-dimensional structural visualization and analysis. The integration also permits immediate access to other information about specific CSD entries through the Mercury framework, a common requirement in CSD data analyses. In addition, the new software includes a range of more advanced features focused towards structural analysis such as principal components analysis, cone-angle correction in hydrogen-bond analyses and the ability to deal with topological symmetry that may be exhibited in molecular search fragments. PMID:22477784
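As an illustration of one named feature, a minimal principal components analysis over a matrix of geometric parameters could be sketched in Python as below; this is generic PCA on hypothetical data, not the Mercury implementation:

    import numpy as np

    def pca_scores(X, n_components=2):
        # Center the data and project onto the leading principal axes.
        Xc = X - X.mean(axis=0)
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
        return Xc @ Vt[:n_components].T

    # Hypothetical matrix: rows = CSD fragments, columns = geometric
    # parameters (e.g. torsion angles).
    rng = np.random.default_rng(2)
    scores = pca_scores(rng.random((100, 6)))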
Development of Cell Analysis Software for Cultivated Corneal Endothelial Cells.
Okumura, Naoki; Ishida, Naoya; Kakutani, Kazuya; Hongo, Akane; Hiwa, Satoru; Hiroyasu, Tomoyuki; Koizumi, Noriko
2017-11-01
To develop analysis software for cultured human corneal endothelial cells (HCECs). Software was designed to recognize cell borders and to provide parameters such as cell density, coefficient of variation, and polygonality of cultured HCECs based on phase contrast images. Cultured HCECs with high or low cell density were incubated with Ca-free and Mg-free phosphate-buffered saline for 10 minutes to reveal the cell borders and were then analyzed with software (n = 50). Phase contrast images showed that cell borders were not distinctly outlined, but these borders became more distinctly outlined after phosphate-buffered saline treatment and were recognized by cell analysis software. The cell density value provided by software was similar to that obtained using manual cell counting by an experienced researcher. Morphometric parameters, such as the coefficient of variation and polygonality, were also produced by software, and these values were significantly correlated with cell density (Pearson correlation coefficients -0.62 and 0.63, respectively). The software described here provides morphometric information from phase contrast images, and it enables subjective and noninvasive quality assessment for tissue engineering therapy of the corneal endothelium.
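Given segmented cell borders, the reported morphometric parameters are straightforward to compute. A Python sketch under stated assumptions: the border-segmentation step is not shown, hexagonality stands in for polygonality, and the published software's exact definitions may differ:

    import numpy as np

    def morphometrics(cell_areas_um2, neighbor_counts, field_area_mm2):
        # Cell density (cells/mm^2), coefficient of variation of cell
        # area, and percentage of six-sided cells from segmented borders.
        areas = np.asarray(cell_areas_um2, dtype=float)
        density = len(areas) / field_area_mm2
        cv = areas.std() / areas.mean()
        hexagonality = 100.0 * np.mean(np.asarray(neighbor_counts) == 6)
        return density, cv, hexagonality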
NASA Astrophysics Data System (ADS)
Chernov, Anton; Kurkin, Andrey; Pelinovsky, Efim; Yalciner, Ahmet; Zaytsev, Andrey
2010-05-01
A short-cut numerical method for evaluating the modes of free oscillations of basins with irregular geometry and bathymetry was presented in (Yalciner A.C., Pelinovsky E., 2007). In the method, a single wave is input to the basin as an initial impulse. The resulting agitation in the basin is computed using a numerical method solving the nonlinear form of the long wave equations. The time histories of water surface fluctuations at different locations, due to the propagation of waves generated by the initial impulse, are stored and analyzed by the fast Fourier transform technique (FFT), and energy spectrum curves are obtained for each location. The frequencies of each mode of free oscillation are determined from the peaks of the spectrum curves. Several main features have been added to this method and are discussed here: 1. Instead of a small number of gauges manually installed in the study area, the numerical simulation output is now recorded on a regular grid of «simulation» gauges placed everywhere on the sea surface deeper than the "coast" level, with a fixed preset spacing between gauges. The spectral analysis of the wave records is performed with the Welch periodogram method instead of a simple FFT, so it is possible to obtain a spectral power estimate for the wave process and to determine confidence intervals for the spectral peaks. 2. After the power spectral estimation procedure, the common peak of the studied seiche can be found; the mean spectral amplitudes for this peak are calculated numerically by Simpson integration for all gauges in the basin, and a map of the spatial distribution of mean spectral amplitudes can be plotted. The spatial distribution helps to study the structure of the seiche and to determine the affected dangerous areas. 3. A nested grid module was developed in NAMI-DANCE, a software package for solving the nonlinear shallow water equations. This is a very important feature for studying complicated phenomena at different scales (ocean - sea - bay - harbor). The newly developed software was tested for the Mediterranean, Sea of Okhotsk, and South China Sea regions. This software can be useful in local tsunami mapping and in studying tsunami propagation in the coastal zone. References: Yalciner A.C., Pelinovsky E. A short cut numerical method for determination of periods of free oscillations for basins with irregular geometry and bathymetry // Ocean Engineering. V. 34. 2007. pp. 747-757.
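Steps 1 and 2 can be illustrated with a short Python sketch: a Welch spectral estimate of the water-level record at one simulation gauge, followed by Simpson integration of the spectrum over the common seiche peak band. Applying it to every gauge on the regular grid would yield the spatial distribution map described above; the segment length and band limits are placeholders:

    import numpy as np
    from scipy.signal import welch
    from scipy.integrate import simpson

    def seiche_band_amplitude(eta, fs, f_lo, f_hi):
        # Welch power spectral estimate of one gauge's water-level record,
        # then Simpson integration over the seiche peak band [f_lo, f_hi].
        freqs, psd = welch(eta, fs=fs, nperseg=1024)
        band = (freqs >= f_lo) & (freqs <= f_hi)
        return simpson(psd[band], x=freqs[band])  # band-integrated energy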
Structural Dynamics and Data Analysis
NASA Technical Reports Server (NTRS)
Luthman, Briana L.
2013-01-01
This project consists of two parts: the first is the post-flight analysis of data from a Delta IV launch vehicle, and the second is a finite element analysis of a CubeSat. Shock and vibration data were collected on WGS-5 (Wideband Global SATCOM-5), which was launched on a Delta IV launch vehicle. Using CAM (CAlculation with Matrices) software, the data are to be plotted into Time History, Shock Response Spectrum, and SPL (Sound Pressure Level) curves. In this format the data are reviewed and compared to flight instrumentation data from previous flights of the same launch vehicle. This is done to ensure the current mission environments, such as shock, random vibration, and acoustics, are not out of family with existing flight experience; "in family" means the peaks on the SRS curve for WGS-5 are similar to the peaks from previous flights and there are no major outliers. The curves will then be compiled into a useful format so that they can be peer reviewed and then presented before an engineering review board if required. The reviewed data will also be uploaded to the Engineering Review Board Information System (ERBIS) for archiving. The second part of this project is a finite element analysis of a CubeSat. In 2010, Merritt Island High School partnered with NASA to design, build, and launch a CubeSat. The team is now called StangSat in honor of their mascot, the mustang. Over the past few years, the StangSat team has built a satellite that has now been manifested for flight on a SpaceX Falcon 9 launch in 2014. To prepare for the final launch, a test flight was conducted in Mojave, California. StangSat was launched on a Prospector 18D, a high-altitude rocket made by Garvey Spacecraft Corporation, along with their sister satellite CP9, built by California Polytechnic University. However, StangSat was damaged during an off-nominal landing, and this project will give beneficial insights into the loads the CubeSat experienced during the crash. During the year, the MIHS students generated a SolidWorks (CAD software) geometry model of StangSat. This model will be imported into FEMAP (Finite Element Analysis (FEA) software), and a finite element model will be created to predict the loads encountered during the crash of this rocket. This analysis requires learning how to import CAD models into the FEM, mesh them, and add constraints and concentrated masses to represent components inside the CubeSat frame, such as circuit boards, batteries, and accelerometers. During the analysis the loads will be varied in an effort to duplicate the damage to the CubeSat. Results will then be peer reviewed and documented.
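For reference, the SPL curves mentioned above rest on the standard sound-pressure-level definition, SPL = 20 log10(p_rms / p_ref) with p_ref = 20 µPa. A minimal Python sketch with a hypothetical tone amplitude:

    import numpy as np

    P_REF = 20e-6  # reference pressure, Pa

    def overall_spl(pressure_pa):
        # Overall sound pressure level of a time history, dB re 20 uPa.
        p_rms = np.sqrt(np.mean(np.square(pressure_pa)))
        return 20.0 * np.log10(p_rms / P_REF)

    # Hypothetical 1 kPa-amplitude tone: ~151 dB, launch-acoustics scale.
    t = np.linspace(0.0, 1.0, 48_000)
    print(f"{overall_spl(1e3 * np.sin(2 * np.pi * 100 * t)):.1f} dB")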
Team Software Development for Aerothermodynamic and Aerodynamic Analysis and Design
NASA Technical Reports Server (NTRS)
Alexandrov, N.; Atkins, H. L.; Bibb, K. L.; Biedron, R. T.; Carpenter, M. H.; Gnoffo, P. A.; Hammond, D. P.; Jones, W. T.; Kleb, W. L.; Lee-Rausch, E. M.
2003-01-01
A collaborative approach to software development is described. The approach employs the agile development techniques: project retrospectives, Scrum status meetings, and elements of Extreme Programming to efficiently develop a cohesive and extensible software suite. The software product under development is a fluid dynamics simulator for performing aerodynamic and aerothermodynamic analysis and design. The functionality of the software product is achieved both through the merging, with substantial rewrite, of separate legacy codes and the authorship of new routines. Examples of rapid implementation of new functionality demonstrate the benefits obtained with this agile software development process. The appendix contains a discussion of coding issues encountered while porting legacy Fortran 77 code to Fortran 95, software design principles, and a Fortran 95 coding standard.
Using Combined SFTA and SFMECA Techniques for Space Critical Software
NASA Astrophysics Data System (ADS)
Nicodemos, F. G.; Lahoz, C. H. N.; Abdala, M. A. D.; Saotome, O.
2012-01-01
This work addresses the combined Software Fault Tree Analysis (SFTA) and Software Failure Modes, Effects and Criticality Analysis (SFMECA) techniques applied to space-critical software of satellite launch vehicles. The combined approach is under research as part of the Verification and Validation (V&V) efforts to increase software dependability, and for future application in other projects under development at Instituto de Aeronáutica e Espaço (IAE). The applicability of such an approach was evaluated on a system software specification and applied to a case study based on the Brazilian Satellite Launcher (VLS). The main goal is to identify possible failure causes and to obtain compensating provisions that lead to the inclusion of new functional and non-functional system software requirements.
Software development predictors, error analysis, reliability models and software metric analysis
NASA Technical Reports Server (NTRS)
Basili, Victor
1983-01-01
The use of dynamic characteristics as predictors for software development was studied. It was found that there are some significant factors that could be useful as predictors. From a study on software errors and complexity, it was shown that meaningful results can be obtained which allow insight into software traits and the environment in which it is developed. Reliability models were studied. The research included the field of program testing because the validity of some reliability models depends on the answers to some unanswered questions about testing. In studying software metrics, data collected from seven software engineering laboratory (FORTRAN) projects were examined and three effort reporting accuracy checks were applied to demonstrate the need to validate a data base. Results are discussed.
Automated daily quality control analysis for mammography in a multi-unit imaging center.
Sundell, Veli-Matti; Mäkelä, Teemu; Meaney, Alexander; Kaasalainen, Touko; Savolainen, Sauli
2018-01-01
Background: The high requirements for mammography image quality necessitate a systematic quality assurance process. Digital imaging allows automation of the image quality analysis, which can potentially improve repeatability and objectivity compared to a visual evaluation made by the users. Purpose: To develop automatic image quality analysis software for daily mammography quality control in a multi-unit imaging center. Material and Methods: Automated image quality analysis software using the discrete wavelet transform and multiresolution analysis was developed for the American College of Radiology accreditation phantom. The software was validated by analyzing 60 randomly selected phantom images from six mammography systems and 20 phantom images with different dose levels from one mammography system. The results were compared to a visual analysis made by four reviewers. Additionally, long-term image quality trends of a full-field digital mammography system and a computed radiography mammography system were investigated. Results: The automated software produced feature detection levels comparable to visual analysis. The agreement was good in the case of fibers, while the software detected somewhat more microcalcifications and characteristic masses. Long-term follow-up via a quality assurance web portal demonstrated the feasibility of using the software for monitoring the performance of mammography systems in a multi-unit imaging center. Conclusion: Automated image quality analysis enables monitoring the performance of digital mammography systems in an efficient, centralized manner.
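One way such a wavelet-based analysis can be sketched in Python (using the PyWavelets package; the published software's actual features, wavelet choice, and thresholds are not reproduced here) is to compute detail-coefficient energies per scale, since phantom details such as fibers, specks, and masses appear at characteristic scales:

    import numpy as np
    import pywt  # PyWavelets

    def wavelet_detail_energies(image, wavelet="db2", levels=3):
        # Multilevel 2-D DWT; coeffs[0] is the approximation and each
        # later entry is a (cH, cV, cD) tuple of detail coefficients.
        coeffs = pywt.wavedec2(image, wavelet=wavelet, level=levels)
        # Energy of the detail coefficients at each scale (illustrative
        # feature; not the published software's exact metric).
        return [sum(float(np.sum(c ** 2)) for c in detail)
                for detail in coeffs[1:]]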
Primordial power spectrum: a complete analysis with the WMAP nine-year data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hazra, Dhiraj Kumar; Shafieloo, Arman; Souradeep, Tarun, E-mail: dhiraj@apctp.org, E-mail: arman@apctp.org, E-mail: tarun@iucaa.ernet.in
2013-07-01
We have further improved the error-sensitive Richardson-Lucy deconvolution algorithm, making it applicable directly to the un-binned measured angular power spectrum of Cosmic Microwave Background observations to reconstruct the form of the primordial power spectrum. This improvement makes the application of the method significantly more straightforward by removing some intermediate stages of analysis, allowing a reconstruction of the primordial spectrum with higher efficiency and precision and with lower computational expense. Applying the modified algorithm, we fit the WMAP 9-year data using the optimized reconstructed form of the primordial spectrum, with an improvement of more than 300 in χ²_eff with respect to the best-fit power law. This is clearly beyond the reach of other alternative approaches and reflects the efficiency of the proposed method in the reconstruction process, allowing us to look for any possible feature in the primordial spectrum projected in the CMB data. Though the proposed method allows us to consider various possibilities for the form of the primordial spectrum, all having a good fit to the data, proper error analysis is needed to test the consistency of theoretical models since, along with possible physical artefacts, most of the features in the reconstructed spectrum might arise from fitting noise in the CMB data. The reconstructed error band for the form of the primordial spectrum, using many realizations of the data, all bootstrapped and based on the WMAP 9-year data, shows proper consistency of the power-law form of the primordial spectrum with the WMAP 9 data at all wave numbers. Including WMAP polarization data in the analysis has not improved our results much, due to its low quality, but we expect that Planck data will allow us to make a full analysis of CMB observations for both temperature and polarization, separately and in combination.
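For orientation, the core Richardson-Lucy update, without the error-sensitive refinements the paper adds, can be sketched in a few lines of Python for a linear model C_obs = G P with a non-negative kernel G (all inputs hypothetical):

    import numpy as np

    def richardson_lucy(C_obs, G, n_iter=100):
        # Multiplicative RL update to recover a primordial spectrum P from
        # an observed angular spectrum C_obs = G @ P; G must be
        # non-negative with non-zero column sums.
        P = np.ones(G.shape[1])          # flat initial guess
        norm = G.sum(axis=0)             # kernel column sums
        for _ in range(n_iter):
            ratio = C_obs / np.maximum(G @ P, 1e-30)
            P *= (G.T @ ratio) / norm
        return P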
Gerber, Jeffrey S; Ross, Rachael K; Bryan, Matthew; Localio, A Russell; Szymczak, Julia E; Wasserman, Richard; Barkman, Darlene; Odeniyi, Folasade; Conaboy, Kathryn; Bell, Louis; Zaoutis, Theoklis E; Fiks, Alexander G
2017-12-19
Acute respiratory tract infections account for the majority of antibiotic exposure in children, and broad-spectrum antibiotic prescribing for acute respiratory tract infections is increasing. It is not clear whether broad-spectrum treatment is associated with improved outcomes compared with narrow-spectrum treatment. To compare the effectiveness of broad-spectrum and narrow-spectrum antibiotic treatment for acute respiratory tract infections in children. A retrospective cohort study assessing clinical outcomes and a prospective cohort study assessing patient-centered outcomes of children between the ages of 6 months and 12 years diagnosed with an acute respiratory tract infection and prescribed an oral antibiotic between January 2015 and April 2016 in a network of 31 pediatric primary care practices in Pennsylvania and New Jersey. Stratified and propensity score-matched analyses to account for confounding by clinician and by patient-level characteristics, respectively, were implemented for both cohorts. Broad-spectrum antibiotics vs narrow-spectrum antibiotics. In the retrospective cohort, the primary outcomes were treatment failure and adverse events 14 days after diagnosis. In the prospective cohort, the primary outcomes were quality of life, other patient-centered outcomes, and patient-reported adverse events. Of 30 159 children in the retrospective cohort (19 179 with acute otitis media; 6746, group A streptococcal pharyngitis; and 4234, acute sinusitis), 4307 (14%) were prescribed broad-spectrum antibiotics including amoxicillin-clavulanate, cephalosporins, and macrolides. Broad-spectrum treatment was not associated with a lower rate of treatment failure (3.4% for broad-spectrum antibiotics vs 3.1% for narrow-spectrum antibiotics; risk difference for full matched analysis, 0.3% [95% CI, -0.4% to 0.9%]). Of 2472 children enrolled in the prospective cohort (1100 with acute otitis media; 705, group A streptococcal pharyngitis; and 667, acute sinusitis), 868 (35%) were prescribed broad-spectrum antibiotics. Broad-spectrum antibiotics were associated with a slightly worse child quality of life (score of 90.2 for broad-spectrum antibiotics vs 91.5 for narrow-spectrum antibiotics; score difference for full matched analysis, -1.4% [95% CI, -2.4% to -0.4%]) but not with other patient-centered outcomes. Broad-spectrum treatment was associated with a higher risk of adverse events documented by the clinician (3.7% for broad-spectrum antibiotics vs 2.7% for narrow-spectrum antibiotics; risk difference for full matched analysis, 1.1% [95% CI, 0.4% to 1.8%]) and reported by the patient (35.6% for broad-spectrum antibiotics vs 25.1% for narrow-spectrum antibiotics; risk difference for full matched analysis, 12.2% [95% CI, 7.3% to 17.2%]). Among children with acute respiratory tract infections, broad-spectrum antibiotics were not associated with better clinical or patient-centered outcomes compared with narrow-spectrum antibiotics, and were associated with higher rates of adverse events. These data support the use of narrow-spectrum antibiotics for most children with acute respiratory tract infections.
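The propensity-score idea used in the study can be illustrated generically: estimate each child's probability of receiving a broad-spectrum antibiotic from covariates, then compare similar children across groups. The Python sketch below does simple 1:1 nearest-neighbor matching, not the full matching actually used in the paper, and all inputs are hypothetical:

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.neighbors import NearestNeighbors

    def ps_match(X, treated):
        # `treated` is a boolean array (True = broad-spectrum).
        # Estimate propensity scores from covariates X.
        ps = (LogisticRegression(max_iter=1000).fit(X, treated)
              .predict_proba(X)[:, 1].reshape(-1, 1))
        # Pair each treated subject with the nearest untreated score.
        ctrl_idx = np.flatnonzero(~treated)
        nn = NearestNeighbors(n_neighbors=1).fit(ps[ctrl_idx])
        _, j = nn.kneighbors(ps[treated])
        return np.flatnonzero(treated), ctrl_idx[j.ravel()]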
The uses of cognitive training technologies in the treatment of autism spectrum disorders.
Wass, Sam V; Porayska-Pomsta, Kaska
2014-11-01
In this review, we focus on research that has used technology to provide cognitive training - i.e. to improve performance on some measurable aspect of behaviour - in individuals with autism spectrum disorders. We review technology-enhanced interventions that target three different cognitive domains: (a) emotion and face recognition, (b) language and literacy, and (c) social skills. The interventions reviewed allow for interaction through different modes, including point-and-click and eye-gaze contingent software, and are delivered through diverse implementations, including virtual reality and robotics. In each case, we examine the evidence for the degree of post-training improvement observed following the intervention, including evidence of transfer to altered behaviour in ecologically valid contexts. We conclude that for a number of technological interventions, the improvements observed within the computerised training paradigm fail to generalise to altered behaviour in more naturalistic settings, which may result from the problems that people with autism spectrum disorders experience in generalising and extrapolating knowledge. However, we also point to several promising findings in this area. We discuss possible directions for future work. © The Author(s) 2013.