Structural Analysis and Design Software
NASA Technical Reports Server (NTRS)
1997-01-01
Collier Research and Development Corporation received a one-of-a-kind computer code for designing exotic hypersonic aircraft called ST-SIZE in the first-ever Langley Research Center software copyright license agreement. Collier transformed the NASA computer code into a commercial software package called HyperSizer, which integrates with other Finite Element Modeling and Finite Element Analysis private-sector structural analysis programs. ST-SIZE was chiefly conceived as a means to improve and speed the structural design of a future aerospace plane for Langley's Hypersonic Vehicles Office. Incorporating the NASA computer code into HyperSizer has enabled the company to also apply the software to applications other than aerospace, including improved design and construction for offices, marine structures, cargo containers, commercial and military aircraft, rail cars, and a host of everyday consumer products.
Patient Safety—Incorporating Drawing Software into Root Cause Analysis Software
Williams, Linda; Grayson, Diana; Gosbee, John
2001-01-01
Drawing software from Lassalle Technologies (France) designed for Visual Basic is the tool we used to standardize the creation, storage, and retrieval of flow diagrams containing information about adverse events and close calls.
Evaluation of copy number variation detection for a SNP array platform
2014-01-01
Background Copy Number Variations (CNVs) are usually inferred from Single Nucleotide Polymorphism (SNP) arrays by software packages based on given algorithms. However, there is no clear understanding of the performance of these software packages; it is therefore difficult to select one or several of them for CNV detection on the SNP array platform. We selected four publicly available software packages designed for CNV calling from an Affymetrix SNP array: Birdsuite, dChip, Genotyping Console (GTC) and PennCNV. A publicly available dataset generated by Array-based Comparative Genomic Hybridization (CGH), with a resolution of 24 million probes per sample, was considered to be the “gold standard”. Against this gold standard, the success rate, average stability rate, sensitivity, consistency and reproducibility of the four software packages were assessed. Specifically, we also compared the efficiency of detecting CNVs simultaneously with two, three or all of the software packages against that of a single software package. Results Judging simply by the quantity of detected CNVs, Birdsuite detected the most while GTC detected the least. Birdsuite and dChip showed an obvious detection bias, and GTC appeared inferior because it detected the fewest CNVs. We then investigated the detection consistency between each software package and the other three. The consistency of dChip was the lowest, while that of GTC was the highest. Compared with the CNV detection results of CGH, GTC called the most matching CNVs in the matching group, with PennCNV-Affy ranking second; in the non-overlapping group, GTC called the fewest CNVs. With regard to the reproducibility of CNV calling, larger CNVs were usually replicated better; PennCNV-Affy showed the best consistency while Birdsuite showed the poorest. Conclusion PennCNV outperformed the other three packages in the sensitivity and specificity of CNV calling. Each calling method has its own limitations and advantages for different data analyses, so optimized calling might be achieved by using multiple algorithms and evaluating the concordance and discordance of the resulting SNP array-based CNV calls. PMID:24555668
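Comparing software CNV calls against a CGH "gold standard" is essentially an interval-overlap problem. The sketch below is an illustration only: it assumes a simple 50% reciprocal-overlap criterion and made-up intervals, and is not the evaluation code used in the study.

```python
# Hypothetical sketch: matching CNV calls from one package against a
# "gold standard" CGH call set by reciprocal overlap, then computing
# sensitivity. Interval fields and the 50% threshold are assumptions.

def overlap(a, b):
    """Length of overlap between two (chrom, start, end) intervals."""
    if a[0] != b[0]:
        return 0
    return max(0, min(a[2], b[2]) - max(a[1], b[1]))

def reciprocal_match(call, truth, frac=0.5):
    """True if the two intervals reciprocally overlap by at least `frac`."""
    ov = overlap(call, truth)
    return ov >= frac * (call[2] - call[1]) and ov >= frac * (truth[2] - truth[1])

def sensitivity(calls, gold, frac=0.5):
    """Fraction of gold-standard CNVs recovered by at least one call."""
    hits = sum(any(reciprocal_match(c, g, frac) for c in calls) for g in gold)
    return hits / len(gold) if gold else float("nan")

# Example: two calls from one package vs. three gold-standard CNVs
calls = [("chr1", 1000, 5000), ("chr2", 200, 900)]
gold = [("chr1", 1200, 4800), ("chr2", 100, 1000), ("chr3", 50, 500)]
print(sensitivity(calls, gold))  # 2 of 3 gold CNVs matched -> 0.666...
```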
A comprehensive, user-friendly geostatistical software system called GEOPACk has been developed. The purpose of this software is to make available the programs necessary to undertake a geostatistical analysis of spatially correlated data. The programs were written so that they ...
Incorporating Manual and Autonomous Code Generation
NASA Technical Reports Server (NTRS)
McComas, David
1998-01-01
Code can be generated manually or by code-generating software tools, but how do you integrate the two? This article looks at a design methodology that combines object-oriented design with autonomous code generation for attitude control flight software. Recent improvements in space flight computers are allowing software engineers to spend more time engineering the applications software. The application developed was the attitude control flight software for an astronomical satellite called the Microwave Anisotropy Probe (MAP). The MAP flight system is being designed, developed, and integrated at NASA's Goddard Space Flight Center. The MAP controls engineers are using Integrated Systems Inc.'s MATRIXx for their controls analysis. In addition to providing a graphical analysis environment, MATRIXx includes an automatic code generation facility called AutoCode. This article examines the forces that shaped the final design and describes three highlights of the design process: (1) defining the manual-to-autonomous code interface; (2) applying object-oriented design to the manual flight code; (3) implementing the object-oriented design in C.
1993-12-01
proposed a domain analysis approach called Feature-Oriented Domain Analysis (FODA). The approach identifies prominent features (similarities) and...characteristics of software systems in the domain. Unlike the other domain analysis approaches we have summarized, the researchers described FODA in...Domain Analysis (FODA) Feasibility Study. Technical Report, Software Engineering Institute, Carnegie Mellon University, November 1990. 19. Lee, Kenneth
Trinkaus, Hans L; Gaisser, Andrea E
2010-09-01
Nearly 30,000 individual inquiries are answered annually by the telephone cancer information service (CIS, KID) of the German Cancer Research Center (DKFZ). The aim was to develop a tool for evaluating these calls, and to support the complete counseling process interactively. A novel software tool is introduced, based on a structure similar to a music score. Treating the interaction as a "duet", guided by the CIS counselor, the essential contents of the dialogue are extracted automatically. For this, "trained speech recognition" is applied to the (known) counselor's part, and "keyword spotting" is used on the (unknown) client's part to pick out specific items from the "word streams". The outcomes fill an abstract score representing the dialogue. Pilot tests performed on a prototype of SACA (Software Assisted Call Analysis) resulted in a basic proof of concept: Demographic data as well as information regarding the situation of the caller could be identified. The study encourages following up on the vision of an integrated SACA tool for supporting calls online and performing statistics on its knowledge database offline. Further research perspectives are to check SACA's potential in comparison with established interaction analysis systems like RIAS. Copyright (c) 2010 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Ames, D. P.; Kadlec, J.; Cao, Y.; Grover, D.; Horsburgh, J. S.; Whiteaker, T.; Goodall, J. L.; Valentine, D. W.
2010-12-01
A growing number of hydrologic information servers are being deployed by government agencies, university networks, and individual researchers using the Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) Hydrologic Information System (HIS). The CUAHSI HIS Project has developed a standard software stack, called HydroServer, for publishing hydrologic observations data. It includes the Observations Data Model (ODM) database and Water Data Service web services, which together enable publication of data on the Internet in a standard format called Water Markup Language (WaterML). Metadata describing available datasets hosted on these servers is compiled within a central metadata catalog called HIS Central at the San Diego Supercomputer Center and is searchable through a set of predefined web services based queries. Together, these servers and central catalog service comprise a federated HIS of a scale and comprehensiveness never previously available. This presentation will briefly review/introduce the CUAHSI HIS system with special focus on a new HIS software tool called "HydroDesktop" and the open source software development web portal, www.HydroDesktop.org, which supports community development and maintenance of the software. HydroDesktop is a client-side, desktop software application that acts as a search and discovery tool for exploring the distributed network of HydroServers, downloading specific data series, visualizing and summarizing data series and exporting these to formats needed for analysis by external software. HydroDesktop is based on the open source DotSpatial GIS developer toolkit which provides it with map-based data interaction and visualization, and a plug-in interface that can be used by third party developers and researchers to easily extend the software using Microsoft .NET programming languages. HydroDesktop plug-ins that are presently available or currently under development within the project and by third party collaborators include functions for data search and discovery, extensive graphing, data editing and export, HydroServer exploration, integration with the OpenMI workflow and modeling system, and an interface for data analysis through the R statistical package.
Klein, Johannes; Leupold, Stefan; Biegler, Ilona; Biedendieck, Rebekka; Münch, Richard; Jahn, Dieter
2012-09-01
Time-lapse imaging in combination with fluorescence microscopy techniques enables the investigation of gene regulatory circuits and has uncovered phenomena such as culture heterogeneity. In this context, computational image processing for the analysis of single cell behaviour plays an increasing role in systems biology and mathematical modelling approaches. Consequently, we developed a software package with a graphical user interface for the analysis of single bacterial cell behaviour. The new software, called TLM-Tracker, allows for flexible and user-friendly segmentation, tracking and lineage analysis of microbial cells in time-lapse movies. The software package, including manual, tutorial video and examples, is available as Matlab code or executable binaries at http://www.tlmtracker.tu-bs.de.
Progress on automated data analysis algorithms for ultrasonic inspection of composites
NASA Astrophysics Data System (ADS)
Aldrin, John C.; Forsyth, David S.; Welter, John T.
2015-03-01
Progress is presented on the development and demonstration of automated data analysis (ADA) software to address the burden in interpreting ultrasonic inspection data for large composite structures. The automated data analysis algorithm is presented in detail, which follows standard procedures for analyzing signals for time-of-flight indications and backwall amplitude dropout. New algorithms have been implemented to reliably identify indications in time-of-flight images near the front and back walls of composite panels. Adaptive call criteria have also been applied to address sensitivity to variation in backwall signal level, panel thickness variation, and internal signal noise. ADA processing results are presented for a variety of test specimens that include inserted materials and discontinuities produced under poor manufacturing conditions. Software tools have been developed to support both ADA algorithm design and certification, producing a statistical evaluation of indication results and false calls using a matching process with predefined truth tables. Parametric studies were performed to evaluate detection and false call results with respect to varying algorithm settings.
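The certification step described above amounts to matching ADA indications against a predefined truth table and tallying detections and false calls. The following sketch is only illustrative of that bookkeeping; the record fields, coordinate tolerance, and scoring rules are assumptions, not the authors' implementation.

```python
# Hypothetical sketch of scoring automated-data-analysis (ADA) indications
# against a "truth table" of known discontinuities, producing detection,
# miss, and false-call counts for one test specimen.

def score_indications(indications, truth_table, tol=5.0):
    """Match ADA indications (x, y) to truth entries within `tol` (same units)."""
    detected = set()
    false_calls = 0
    for ix, iy in indications:
        match = None
        for k, (tx, ty) in enumerate(truth_table):
            if abs(ix - tx) <= tol and abs(iy - ty) <= tol:
                match = k
                break
        if match is None:
            false_calls += 1      # indication with no truth entry nearby
        else:
            detected.add(match)   # truth entry found by at least one indication
    missed = len(truth_table) - len(detected)
    return {"detections": len(detected), "missed": missed, "false_calls": false_calls}

truth = [(10.0, 20.0), (55.0, 42.0)]   # known inserts/discontinuities
calls = [(11.0, 21.5), (80.0, 15.0)]   # ADA output for one panel
print(score_indications(calls, truth))  # {'detections': 1, 'missed': 1, 'false_calls': 1}
```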
1988-09-01
software programs capable of being used on a microcomputer will be considered for analysis. No software intended for use on a miniframe or mainframe...Dial-A-Log consists of a program written in a computer language called L-10 that is run on a DEC-20 miniframe . The combination of the specific...proliferation of software dealing with microcomputers. Instead, they were geared more towards managing the use of miniframe or mainframe computer
Reference Management Software: A Comparative Analysis of Four Products
ERIC Educational Resources Information Center
Gilmour, Ron; Cobus-Kuo, Laura
2011-01-01
Reference management (RM) software is widely used by researchers in the health and natural sciences. Librarians are often called upon to provide support for these products. The present study compares four prominent RMs: CiteULike, RefWorks, Mendeley, and Zotero, in terms of features offered and the accuracy of the bibliographies that they…
FFI: A software tool for ecological monitoring
Duncan C. Lutes; Nathan C. Benson; MaryBeth Keifer; John F. Caratti; S. Austin Streetman
2009-01-01
A new monitoring tool called FFI (FEAT/FIREMON Integrated) has been developed to assist managers with collection, storage and analysis of ecological information. The tool was developed through the complementary integration of two fire effects monitoring systems commonly used in the United States: FIREMON and the Fire Ecology Assessment Tool. FFI provides software...
A Comparison of Authoring Software for Developing Mathematics Self-Learning Software Packages.
ERIC Educational Resources Information Center
Suen, Che-yin; Pok, Yang-ming
Four years ago, the authors started to develop self-paced mathematics learning software called NPMaths by using an authoring package called Tencore. However, NPMaths had some weak points. A development team was hence formed to develop similar software called Mathematics On Line. This time the team used another development language called…
RSEIS and RFOC: Seismic Analysis in R
NASA Astrophysics Data System (ADS)
Lees, J. M.
2015-12-01
Open software is essential for reproducible scientific exchange. R-packages provide a platform for development of seismological investigation software that can be properly documented and traced for data processing. A suite of R packages designed for a wide range of seismic analysis is currently available in the free software platform called R. R is a software platform based on the S-language developed at Bell Labs decades ago. Routines in R can be run as standalone function calls, or developed in object-oriented mode. R comes with a base set of routines, and thousands of user developed packages. The packages developed at UNC include subroutines and interactive codes for processing seismic data, analyzing geographic information (GIS) and inverting data involved in a variety of geophysical applications. On CRAN (Comprehensive R Archive Network, http://www.r-project.org/) currently available packages related to seismic analysis are RSEIS, Rquake, GEOmap, RFOC, zoeppritz, RTOMO, geophys, Rwave, PEIP, hht, and rFDSN. These include signal processing, data management, mapping, earthquake location, deconvolution, focal mechanisms, wavelet transforms, Hilbert-Huang Transforms, tomographic inversion, and Mogi deformation among other useful functionality. All software in R packages is required to have detailed documentation, making the exchange and modification of existing software easy. In this presentation, I will focus on packages RSEIS and RFOC, showing examples from a variety of seismic analyses. The R approach has similarities to the popular (and expensive) MATLAB platform, although R is open source and free to download.
2013-01-02
intensity data from the SNP array were normalized using the Affymetrix GeneChip Targeted Genotyping Analysis Software (GTGS). To assess robustness of SNP...calls, genotypes were called using three algorithms: (i) GTGS, (ii) illuminus (27), and (iii) a heuristic algorithm based on discrete cutoffs of
MOSAIC: Software for creating mosaics from collections of images
NASA Technical Reports Server (NTRS)
Varosi, F.; Gezari, D. Y.
1992-01-01
We have developed a powerful, versatile image processing and analysis software package called MOSAIC, designed specifically for the manipulation of digital astronomical image data obtained with (but not limited to) two-dimensional array detectors. The software package is implemented using the Interactive Data Language (IDL), and incorporates new methods for processing, calibration, analysis, and visualization of astronomical image data, stressing effective methods for the creation of mosaic images from collections of individual exposures, while at the same time preserving the photometric integrity of the original data. Since IDL is available on many computers, the MOSAIC software runs on most UNIX and VAX workstations with the X-Windows or Sun View graphics interface.
Tang, Qi-Yi; Zhang, Chuan-Xi
2013-04-01
A comprehensive but simple-to-use software package called DPS (Data Processing System) has been developed to execute a range of standard numerical analyses and operations used in experimental design, statistics and data mining. This program runs on standard Windows computers. Many of the functions are specific to entomological and other biological research and are not found in standard statistical software. This paper presents applications of DPS to experimental design, statistical analysis and data mining in entomology. © 2012 The Authors Insect Science © 2012 Institute of Zoology, Chinese Academy of Sciences.
A Review of Feature Extraction Software for Microarray Gene Expression Data
Tan, Ching Siang; Ting, Wai Soon; Mohamad, Mohd Saberi; Chan, Weng Howe; Deris, Safaai; Ali Shah, Zuraini
2014-01-01
When gene expression data are too large to be processed, they are transformed into a reduced representation set of genes. Transforming large-scale gene expression data into a set of genes is called feature extraction. If the genes extracted are carefully chosen, this gene set can extract the relevant information from the large-scale gene expression data, allowing further analysis by using this reduced representation instead of the full size data. In this paper, we review numerous software applications that can be used for feature extraction. The software reviewed is mainly for Principal Component Analysis (PCA), Independent Component Analysis (ICA), Partial Least Squares (PLS), and Local Linear Embedding (LLE). A summary and sources of the software are provided in the last section for each feature extraction method. PMID:25250315
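As a minimal sketch of what such feature extraction looks like in practice, the example below applies one of the reviewed methods (PCA) to a synthetic expression matrix. The matrix shape, component count, and use of scikit-learn are assumptions for illustration, not taken from the review.

```python
# Minimal sketch of feature extraction by PCA on a gene expression matrix,
# producing the kind of reduced representation the review discusses.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 5000))      # 60 samples x 5000 genes (synthetic)

pca = PCA(n_components=10)           # keep 10 components as extracted features
X_reduced = pca.fit_transform(X)     # 60 x 10 reduced representation

print(X_reduced.shape)
print(pca.explained_variance_ratio_.sum())  # variance captured by the 10 components
```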
Comparison Campaign of VLBI Data Analysis Software - First Results
NASA Technical Reports Server (NTRS)
Plank, Lucia; Bohm, Johannes; Schuh, Harald
2010-01-01
During the development of the Vienna VLBI Software VieVS at the Institute of Geodesy and Geophysics at Vienna University of Technology, a special comparison setup was developed with the goal of easily finding links between deviations of results achieved with different software packages and certain parameters of the observation. The object of comparison is the computed time delay, a value calculated for each observation including all relevant models and corrections that need to be applied in geodetic VLBI analysis. Besides investigating the effects of the various models on the total delay, results of comparisons between VieVS and Occam 6.1 are shown. Using the same methods, a Comparison Campaign of VLBI data analysis software called DeDeCC is about to be launched within the IVS soon.
Best Manufacturing Practices Survey Conducted at Litton Data Systems Division, Van Nuys, California
1988-10-01
Hardware and Software; DESIGN RELEASE; Engineering Change Order Processing and Analysis...structured using bridges to isolate local traffic. Long term plans call for a wide-band network. ENGINEERING CHANGE ORDER PROCESSING AND ANALYSIS
Advanced gamma ray balloon experiment ground checkout and data analysis
NASA Technical Reports Server (NTRS)
Blackstone, M.
1976-01-01
A software programming package to be used in the ground checkout and handling of data from the advanced gamma ray balloon experiment is described. The Operator's Manual permits someone unfamiliar with the inner workings of the software system (called LEO) to operate on the experimental data as it comes from the Pulse Code Modulation interface, converting it to a form for later analysis, and to monitor the progress of an experiment. A Programmer's Manual is included.
Pedagogy and Related Criteria: The Selection of Software for Computer Assisted Language Learning
ERIC Educational Resources Information Center
Samuels, Jeffrey D.
2013-01-01
Computer-Assisted Language Learning (CALL) is an established field of academic inquiry with distinct applications for second language teaching and learning. Many CALL professionals direct language labs or language resource centers (LRCs) in which CALL software applications and generic software applications support language learning programs and…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andrews, Ross N.; Narayanan, Suresh; Zhang, Fan
X-ray photon correlation spectroscopy (XPCS) and dynamic light scattering (DLS) reveal materials dynamics using coherent scattering, with XPCS permitting the investigation of dynamics in a more diverse array of materials than DLS. Heterogeneous dynamics occur in many material systems. The authors' recent work has shown how classic tools employed in the DLS analysis of heterogeneous dynamics can be extended to XPCS, revealing additional information that conventional Kohlrausch exponential fitting obscures. The present work describes the software implementation of inverse transform analysis of XPCS data. This software, called CONTIN XPCS, is an extension of traditional CONTIN analysis and accommodates the various dynamics encountered in equilibrium XPCS measurements.
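As general background (not taken from the article), the contrast the authors draw can be stated in standard XPCS/DLS notation: a single stretched-exponential (Kohlrausch) fit versus a distribution of relaxation rates recovered by inverse, Laplace-like transform analysis of the intensity correlation function.

```latex
% Standard XPCS/DLS forms (general background, notation not from the article)
\begin{align}
  g_2(q,\tau) - 1 &= \beta \, \exp\!\left[-2\,(\bar{\Gamma}\tau)^{\gamma}\right]
    && \text{(single Kohlrausch fit)} \\
  g_2(q,\tau) - 1 &= \beta \left| \int_0^{\infty} G(\Gamma)\, e^{-\Gamma\tau}\, \mathrm{d}\Gamma \right|^{2}
    && \text{(inverse transform over a rate distribution } G(\Gamma)\text{)}
\end{align}
```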
Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool
NASA Technical Reports Server (NTRS)
Maul, William A.; Fulton, Christopher E.
2011-01-01
This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool was proven to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation, also provided with the ETA Tool software release package, that were used to generate the reports presented in the manual.
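The sensitivity study mentioned above can be pictured with a simple fault-versus-test detection matrix: compute fault-detection coverage, then recompute it with a test (or sensor) removed. The sketch below is a hedged simplification with invented matrix contents; it is not the ETA Tool's algorithm or data model.

```python
# Hedged sketch of a detection-sensitivity study on a boolean fault-vs-test
# matrix (a simplification of a qualitative directed-graph model).

def detection_coverage(d_matrix, active_tests):
    """Fraction of faults detected by at least one active test."""
    detected = sum(1 for fault_row in d_matrix
                   if any(fault_row[t] for t in active_tests))
    return detected / len(d_matrix)

# rows = faults, columns = tests/sensors (illustrative values)
d_matrix = [
    [1, 0, 0],   # fault A seen only by test 0
    [0, 1, 1],   # fault B seen by tests 1 and 2
    [0, 0, 1],   # fault C seen only by test 2
]
print(detection_coverage(d_matrix, [0, 1, 2]))  # 1.0 with all tests available
print(detection_coverage(d_matrix, [0, 1]))     # coverage after losing test 2 -> 0.666...
```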
Software Project Management and Measurement on the World-Wide-Web (WWW)
NASA Technical Reports Server (NTRS)
Callahan, John; Ramakrishnan, Sudhaka
1996-01-01
We briefly describe a system for forms-based, work-flow management that helps members of a software development team overcome geographical barriers to collaboration. Our system, called the Web Integrated Software Environment (WISE), is implemented as a World-Wide-Web service that allows for management and measurement of software development projects based on dynamic analysis of change activity in the workflow. WISE tracks issues in a software development process, provides informal communication between the users with different roles, supports to-do lists, and helps in software process improvement. WISE minimizes the time devoted to metrics collection and analysis by providing implicit delivery of messages between users based on the content of project documents. The use of a database in WISE is hidden from the users who view WISE as maintaining a personal 'to-do list' of tasks related to the many projects on which they may play different roles.
NASA Astrophysics Data System (ADS)
Yussup, N.; Ibrahim, M. M.; Rahman, N. A. A.; Mokhtar, M.; Salim, N. A. A.; Soh@Shaari, S. C.; Azman, A.; Lombigit, L.; Azman, A.; Omar, S. A.
2018-01-01
Most of the procedures in the neutron activation analysis (NAA) process that has been established at the Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s are performed manually. These manual procedures carried out by the NAA laboratory personnel are time-consuming and inefficient, especially for the sample counting and measurement process: the sample needs to be changed and the measurement software needs to be set up for every one-hour counting time, and both of these steps are performed manually for every sample. Hence, an automatic sample changer system (ASC) that consists of hardware and software was developed to automate the sample counting process for up to 30 samples consecutively. This paper describes the ASC control software for the NAA process, which is designed and developed to control the ASC hardware and to call the GammaVision software for sample measurement. The software is developed using the National Instruments LabVIEW development package.
Vehicle management and mission planning systems with shuttle applications
NASA Technical Reports Server (NTRS)
1972-01-01
A preliminary definition of a concept for an automated system is presented that will support the effective management and planning of space shuttle operations. It is called the Vehicle Management and Mission Planning System (VMMPS). In addition to defining the system and its functions, some of the software requirements of the system are identified and a phased and evolutionary method is recommended for software design, development, and implementation. The concept is composed of eight software subsystems supervised by an executive system. These subsystems are mission design and analysis, flight scheduler, launch operations, vehicle operations, payload support operations, crew support, information management, and flight operations support. In addition to presenting the proposed system, a discussion of the evolutionary software development philosophy that the Mission Planning and Analysis Division (MPAD) would propose to use in developing the required supporting software is included. A preliminary software development schedule is also included.
Miller, Brian S; Calderan, Susannah; Gillespie, Douglas; Weatherup, Graham; Leaper, Russell; Collins, Kym; Double, Michael C
2016-03-01
Directional frequency analysis and recording (DIFAR) sonobuoys can allow real-time acoustic localization of baleen whales for underwater tracking and remote sensing, but limited availability of hardware and software has prevented wider usage. These software limitations were addressed by developing a module in the open-source software PAMGuard. A case study is presented demonstrating that this software provides greater efficiency and accessibility than previous methods for detecting, localizing, and tracking Antarctic blue whales in real time. Additionally, this software can easily be extended to track other low and mid frequency sounds including those from other cetaceans, pinnipeds, icebergs, shipping, and seismic airguns.
White, Gary C.; Hines, J.E.
2004-01-01
The reality is that the statistical methods used for analysis of data depend upon the availability of software. Analysis of marked animal data is no different than the rest of the statistical field. The methods used for analysis are those that are available in reliable software packages. Thus, the critical importance of having reliable, up-to-date software available to biologists is obvious. Statisticians have continued to develop more robust models, ever expanding the suite of potential analysis methods available. But without software to implement these newer methods, they will languish in the abstract, and not be applied to the problems deserving them. In the Computers and Software Session, two new software packages are described, a comparison of implementations of methods for the estimation of nest survival is provided, and a more speculative paper about how the next generation of software might be structured is presented. Rotella et al. (2004) compare nest survival estimation with different software packages: SAS logistic regression, SAS non-linear mixed models, and Program MARK. Nests are assumed to be visited at various, possibly infrequent, intervals. All of the approaches described compute nest survival with the same likelihood, and require that the age of the nest is known to account for nests that eventually hatch. However, each approach offers advantages and disadvantages, explored by Rotella et al. (2004). Efford et al. (2004) present a new software package called DENSITY. The package computes population abundance and density from trapping arrays and other detection methods with a new and unique approach. DENSITY represents the first major addition to the analysis of trapping arrays in 20 years. Barker & White (2004) discuss how existing software such as Program MARK requires that each new model's likelihood must be programmed specifically for that model. They wishfully think that future software might allow the user to combine pieces of likelihood functions together to generate estimates. The idea is interesting, and maybe some bright young statistician can work out the specifics to implement the procedure. Choquet et al. (2004) describe MSURGE, a software package that implements the multistate capture-recapture models. The unique feature of MSURGE is that the design matrix is constructed with an interpreted language called GEMACO. Because MSURGE is limited to just multistate models, the special requirements of these likelihoods can be provided for. The software and methods presented in these papers give biologists and wildlife managers an expanding range of possibilities for data analysis. Although ease-of-use is generally getting better, it does not replace the need for understanding of the requirements and structure of the models being computed. The internet provides access to many free software packages as well as user discussion groups to share knowledge and ideas. (A starting point for wildlife-related applications is http://www.phidot.org.)
GPS Software Packages Deliver Positioning Solutions
NASA Technical Reports Server (NTRS)
2010-01-01
"To determine a spacecraft s position, the Jet Propulsion Laboratory (JPL) developed an innovative software program called the GPS (global positioning system)-Inferred Positioning System and Orbit Analysis Simulation Software, abbreviated as GIPSY-OASIS, and also developed Real-Time GIPSY (RTG) for certain time-critical applications. First featured in Spinoff 1999, JPL has released hundreds of licenses for GIPSY and RTG, including to Longmont, Colorado-based DigitalGlobe. Using the technology, DigitalGlobe produces satellite imagery with highly precise latitude and longitude coordinates and then supplies it for uses within defense and intelligence, civil agencies, mapping and analysis, environmental monitoring, oil and gas exploration, infrastructure management, Internet portals, and navigation technology."
State analysis requirements database for engineering complex embedded systems
NASA Technical Reports Server (NTRS)
Bennett, Matthew B.; Rasmussen, Robert D.; Ingham, Michel D.
2004-01-01
It has become clear that spacecraft system complexity is reaching a threshold where customary methods of control are no longer affordable or sufficiently reliable. At the heart of this problem are the conventional approaches to systems and software engineering based on subsystem-level functional decomposition, which fail to scale in the tangled web of interactions typically encountered in complex spacecraft designs. Furthermore, there is a fundamental gap between the requirements on software specified by systems engineers and the implementation of these requirements by software engineers. Software engineers must perform the translation of requirements into software code, hoping to accurately capture the systems engineer's understanding of the system behavior, which is not always explicitly specified. This gap opens up the possibility for misinterpretation of the systems engineer's intent, potentially leading to software errors. This problem is addressed by a systems engineering tool called the State Analysis Database, which provides a means of capturing system and software requirements in the form of explicit models. This paper describes how requirements for complex aerospace systems can be developed using the State Analysis Database.
IFDOTMETER: A New Software Application for Automated Immunofluorescence Analysis.
Rodríguez-Arribas, Mario; Pizarro-Estrella, Elisa; Gómez-Sánchez, Rubén; Yakhine-Diop, S M S; Gragera-Hidalgo, Antonio; Cristo, Alejandro; Bravo-San Pedro, Jose M; González-Polo, Rosa A; Fuentes, José M
2016-04-01
Most laboratories interested in autophagy use different imaging software for managing and analyzing heterogeneous parameters in immunofluorescence experiments (e.g., LC3-puncta quantification and determination of the number and size of lysosomes). One solution would be software that works on a user's laptop or workstation that can access all image settings and provide quick and easy-to-use analysis of data. Thus, we have designed and implemented an application called IFDOTMETER, which can run on all major operating systems because it has been programmed using JAVA (Sun Microsystems). Briefly, IFDOTMETER software has been created to quantify a variety of biological hallmarks, including mitochondrial morphology and nuclear condensation. The program interface is intuitive and user-friendly, making it useful for users not familiar with computer handling. By setting previously defined parameters, the software can automatically analyze a large number of images without the supervision of the researcher. Once analysis is complete, the results are stored in a spreadsheet. Using software for high-throughput cell image analysis offers researchers the possibility of performing comprehensive and precise analysis of a high number of images in an automated manner, making this routine task easier. © 2015 Society for Laboratory Automation and Screening.
BAM/DASS: Data Analysis Software for Sub-Microarcsecond Astrometry Device
NASA Astrophysics Data System (ADS)
Gardiol, D.; Bonino, D.; Lattanzi, M. G.; Riva, A.; Russo, F.
2010-12-01
The INAF - Osservatorio Astronomico di Torino is part of the Data Processing and Analysis Consortium (DPAC) for Gaia, a cornerstone mission of the European Space Agency. Gaia will perform global astrometry by means of two telescopes looking at the sky along two different lines of sight oriented at a fixed angle, also called basic angle. Knowledge of the basic angle fluctuations at the sub-microarcsecond level over periods of the order of the minute is crucial to reach the mission goals. A specific device, the Basic Angle Monitoring, will be dedicated to this purpose. We present here the software system we are developing to analyze the BAM data and recover the basic angle variations. This tool is integrated into the whole DPAC data analysis software.
Engineering Complex Embedded Systems with State Analysis and the Mission Data System
NASA Technical Reports Server (NTRS)
Ingham, Michel D.; Rasmussen, Robert D.; Bennett, Matthew B.; Moncada, Alex C.
2004-01-01
It has become clear that spacecraft system complexity is reaching a threshold where customary methods of control are no longer affordable or sufficiently reliable. At the heart of this problem are the conventional approaches to systems and software engineering based on subsystem-level functional decomposition, which fail to scale in the tangled web of interactions typically encountered in complex spacecraft designs. Furthermore, there is a fundamental gap between the requirements on software specified by systems engineers and the implementation of these requirements by software engineers. Software engineers must perform the translation of requirements into software code, hoping to accurately capture the systems engineer's understanding of the system behavior, which is not always explicitly specified. This gap opens up the possibility for misinterpretation of the systems engineer's intent, potentially leading to software errors. This problem is addressed by a systems engineering methodology called State Analysis, which provides a process for capturing system and software requirements in the form of explicit models. This paper describes how requirements for complex aerospace systems can be developed using State Analysis and how these requirements inform the design of the system software, using representative spacecraft examples.
Loop-Mediated Isothermal Amplification (LAMP) Signature Identification Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Torres, C.
2009-03-17
This is an extendable open-source Loop-mediated isothermal AMPlification (LAMP) signature design program called LAVA (LAMP Assay Versatile Analysis). LAVA was created in response to limitations of existing LAMP signature programs.
Mining collections of compounds with Screening Assistant 2
2012-01-01
Background High-throughput screening assays have become the starting point of many drug discovery programs for large pharmaceutical companies as well as academic organisations. Despite the increasing throughput of screening technologies, the almost infinite chemical space remains out of reach, calling for tools dedicated to the analysis and selection of the compound collections intended to be screened. Results We present Screening Assistant 2 (SA2), an open-source JAVA software dedicated to the storage and analysis of small to very large chemical libraries. SA2 stores unique molecules in a MySQL database, and encapsulates several chemoinformatics methods, among which: providers management, interactive visualisation, scaffold analysis, diverse subset creation, descriptors calculation, sub-structure / SMART search, similarity search and filtering. We illustrate the use of SA2 by analysing the composition of a database of 15 million compounds collected from 73 providers, in terms of scaffolds, frameworks, and undesired properties as defined by recently proposed HTS SMARTS filters. We also show how the software can be used to create diverse libraries based on existing ones. Conclusions Screening Assistant 2 is a user-friendly, open-source software that can be used to manage collections of compounds and perform simple to advanced chemoinformatics analyses. Its modular design and growing documentation facilitate the addition of new functionalities, calling for contributions from the community. The software can be downloaded at http://sa2.sourceforge.net/. PMID:23327565
Mining collections of compounds with Screening Assistant 2.
Guilloux, Vincent Le; Arrault, Alban; Colliandre, Lionel; Bourg, Stéphane; Vayer, Philippe; Morin-Allory, Luc
2012-08-31
High-throughput screening assays have become the starting point of many drug discovery programs for large pharmaceutical companies as well as academic organisations. Despite the increasing throughput of screening technologies, the almost infinite chemical space remains out of reach, calling for tools dedicated to the analysis and selection of the compound collections intended to be screened. We present Screening Assistant 2 (SA2), an open-source JAVA software dedicated to the storage and analysis of small to very large chemical libraries. SA2 stores unique molecules in a MySQL database, and encapsulates several chemoinformatics methods, among which: providers management, interactive visualisation, scaffold analysis, diverse subset creation, descriptors calculation, sub-structure / SMART search, similarity search and filtering. We illustrate the use of SA2 by analysing the composition of a database of 15 million compounds collected from 73 providers, in terms of scaffolds, frameworks, and undesired properties as defined by recently proposed HTS SMARTS filters. We also show how the software can be used to create diverse libraries based on existing ones. Screening Assistant 2 is a user-friendly, open-source software that can be used to manage collections of compounds and perform simple to advanced chemoinformatics analyses. Its modular design and growing documentation facilitate the addition of new functionalities, calling for contributions from the community. The software can be downloaded at http://sa2.sourceforge.net/.
Narayanan, Suresh; Zhang, Fan; Kuzmenko, Ivan; Ilavsky, Jan
2018-01-01
X-ray photon correlation spectroscopy (XPCS) and dynamic light scattering (DLS) both reveal dynamics using coherent scattering, but X-rays permit investigating of dynamics in a much more diverse array of materials. Heterogeneous dynamics occur in many such materials, and we showed how classic tools employed in analysis of heterogeneous DLS dynamics extend to XPCS, revealing additional information that conventional Kohlrausch exponential fitting obscures. This work presents the software implementation of inverse transform analysis of XPCS data called CONTIN XPCS, an extension of traditional CONTIN that accommodates dynamics encountered in equilibrium XPCS measurements. PMID:29875507
Application of a neural network to simulate analysis in an optimization process
NASA Technical Reports Server (NTRS)
Rogers, James L.; Lamarsh, William J., II
1992-01-01
A new experimental software package called NETS/PROSSS, aimed at reducing the computing time required to solve a complex design problem, is described. The software combines a neural network for simulating the analysis program with an optimization program. The neural network is applied to approximate results of a finite element analysis program to quickly obtain a near-optimal solution. Results of the NETS/PROSSS optimization process can also be used as an initial design in a normal optimization process, making it possible to converge to an optimum solution with significantly fewer iterations.
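The surrogate-plus-optimizer idea can be illustrated generically. The sketch below is not the NETS/PROSSS code: the stand-in analysis function, network size, bounds, and use of scikit-learn and SciPy are assumptions chosen only to show the pattern of optimizing against a cheap neural-network approximation of an expensive analysis.

```python
# Illustrative sketch: train a small neural network as a surrogate for an
# expensive analysis, then run an optimizer against the cheap surrogate.
import numpy as np
from sklearn.neural_network import MLPRegressor
from scipy.optimize import minimize

def expensive_analysis(x):
    """Stand-in for a finite element analysis returning one response value."""
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

rng = np.random.default_rng(1)
X_train = rng.uniform(-5, 5, size=(200, 2))              # sampled designs
y_train = np.array([expensive_analysis(x) for x in X_train])

surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0)
surrogate.fit(X_train, y_train)

# Optimize over the surrogate instead of the expensive analysis.
result = minimize(lambda x: surrogate.predict(x.reshape(1, -1))[0],
                  x0=np.zeros(2), bounds=[(-5, 5), (-5, 5)])
print(result.x)   # near-optimal design, usable as a starting point for a full run
```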
Object-oriented design of medical imaging software.
Ligier, Y; Ratib, O; Logean, M; Girard, C; Perrier, R; Scherrer, J R
1994-01-01
A special software package for interactive display and manipulation of medical images was developed at the University Hospital of Geneva, as part of a hospital wide Picture Archiving and Communication System (PACS). This software package, called Osiris, was especially designed to be easily usable and adaptable to the needs of noncomputer-oriented physicians. The Osiris software has been developed to allow the visualization of medical images obtained from any imaging modality. It provides generic manipulation tools, processing tools, and analysis tools more specific to clinical applications. This software, based on an object-oriented paradigm, is portable and extensible. Osiris is available on two different operating systems: the Unix X-11/OSF-Motif based workstations, and the Macintosh family.
A theoretical basis for the analysis of multiversion software subject to coincident errors
NASA Technical Reports Server (NTRS)
Eckhardt, D. E., Jr.; Lee, L. D.
1985-01-01
Fundamental to the development of redundant software techniques (known as fault-tolerant software) is an understanding of the impact of multiple joint occurrences of errors, referred to here as coincident errors. A theoretical basis for the study of redundant software is developed which: (1) provides a probabilistic framework for empirically evaluating the effectiveness of a general multiversion strategy when component versions are subject to coincident errors, and (2) permits an analytical study of the effects of these errors. An intensity function, called the intensity of coincident errors, has a central role in this analysis. This function describes the propensity of programmers to introduce design faults in such a way that software components fail together when executing in the application environment. A condition under which a multiversion system is a better strategy than relying on a single version is given.
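As a hedged illustration of the role of the intensity function (notation ours, under the usual conditional-independence reading of such models: given input x, each independently developed version fails with probability θ(x)), averaging over the usage distribution couples the versions and makes joint failures more likely than independence would suggest.

```latex
% Sketch of the role of the intensity function \theta(x) (notation ours)
\begin{align}
  p &= \mathbb{E}\,[\theta(X)]
    && \text{failure probability of a single version} \\
  \Pr(\text{two versions fail together}) &= \mathbb{E}\,[\theta(X)^2]
    \;\ge\; \big(\mathbb{E}\,[\theta(X)]\big)^2 = p^2,
\end{align}
% with equality only when \theta is constant over inputs; the excess,
% Var[\theta(X)], is what drives coincident errors.
```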
NASA Astrophysics Data System (ADS)
Murphy, Elizabeth Drummond
As advances in technology are applied in complex, semi-automated domains, human controllers are distanced from the controlled process. This physical and psychological distance may both facilitate and degrade human performance. To investigate cognitive issues in spacecraft ground-control operations, the present experimental research was undertaken. The primary issue concerned the ability of operations analysts who do not monitor operations to make timely, accurate decisions when autonomous software calls for human help. Another key issue involved the potential effects of spatial-visualization ability (SVA) in environments that present data in graphical formats. Hypotheses were derived largely from previous findings and predictions in the literature. Undergraduate psychology students were assigned at random to a monitoring condition or an on-call condition in a scaled environment. The experimental task required subjects to decide on the veracity of a problem diagnosis delivered by a software process on-board a simulated spacecraft. To support decision-making, tabular and graphical data displays presented information on system status. A level of software confidence in the problem diagnosis was displayed, and subjects reported their own level of confidence in their decisions. Contrary to expectations, the performance of on-call subjects did not differ significantly from that of continuous monitors. Analysis yielded a significant interaction of sex and condition: Females in the on-call condition had the lowest mean accuracy. Results included a preference for bar charts over line graphs and faster performance with tables than with line graphs. A significant correlation was found between subjective confidence and decision accuracy. SVA was found to be predictive of accuracy but not speed; and SVA was found to be a stronger predictor of performance for males than for females. Low-SVA subjects reported that they relied more on software confidence than did medium- or high-SVA subjects. These and other findings have implications for the design of user interfaces to support human decision-making in on-call situations and to accommodate low-SVA users.
Advanced statistical methods for improved data analysis of NASA astrophysics missions
NASA Technical Reports Server (NTRS)
Feigelson, Eric D.
1992-01-01
The investigators under this grant studied ways to improve the statistical analysis of astronomical data. They looked at existing techniques, the development of new techniques, and the production and distribution of specialized software to the astronomical community. Abstracts of nine papers that were produced are included, as well as brief descriptions of four software packages. The articles that are abstracted discuss analytical and Monte Carlo comparisons of six different linear least squares fits, a (second) paper on linear regression in astronomy, two reviews of public domain software for the astronomer, subsample and half-sample methods for estimating sampling distributions, a nonparametric estimation of survival functions under dependent competing risks, censoring in astronomical data due to nondetections, an astronomy survival analysis computer package called ASURV, and improving the statistical methodology of astronomical data analysis.
Post-Mission Assessment for Tactical Training-Trends Analysis (PMATT-TA): Usability Analysis Report
2014-07-01
information, PMATT-TA also supports data calls to understand fleet readiness and proficiency. Additionally, PMATT-TA addresses a need for a digitally based...
SIENA Customer Problem Statement and Requirements
DOE Office of Scientific and Technical Information (OSTI.GOV)
L. Sauer; R. Clay; C. Adams
2000-08-01
This document describes the problem domain and functional requirements of the SIENA framework. The software requirements and system architecture of SIENA are specified in separate documents (called SIENA Software Requirement Specification and SIENA Software Architecture, respectively). While currently this version of the document describes the problems and captures the requirements within the Analysis domain (concentrating on finite element models), it is our intention to subsequently expand this document to describe problems and capture requirements from the Design and Manufacturing domains. In addition, SIENA is designed to be extendible to support and integrate elements from the other domains (see the SIENA Software Architecture document).
Software analysis in the semantic web
NASA Astrophysics Data System (ADS)
Taylor, Joshua; Hall, Robert T.
2013-05-01
Many approaches in software analysis, particularly dynamic malware analysis, benefit greatly from the use of linked data and other Semantic Web technology. In this paper, we describe AIS, Inc.'s Semantic Extractor (SemEx) component from the Malware Analysis and Attribution through Genetic Information (MAAGI) effort, funded under DARPA's Cyber Genome program. The SemEx generates OWL-based semantic models of high and low level behaviors in malware samples from system call traces generated by AIS's introspective hypervisor, IntroVirt(TM). Within MAAGI, these semantic models were used by modules that cluster malware samples by functionality, and construct "genealogical" malware lineages. Herein, we describe the design, implementation, and use of the SemEx, as well as the C2DB, an OWL ontology used for representing software behavior and cyber-environments.
Dynamic simulation of train derailments
DOT National Transportation Integrated Search
2006-11-05
This paper describes a planar rigid-body model to examine the gross motions of rail cars in a train derailment. The model is implemented using a commercial software package called ADAMS (Automatic Dynamic Analysis of Mechanical Systems). The results ...
CWA 15793 2011 Planning and Implementation Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gross, Alan; Nail, George
This software, built on an open source platform called Electron (runs on Chromium and Node.js), is designed to assist organizations in the implementation of a biorisk management system consistent with the requirements of the international, publicly available guidance document CEN Workshop Agreement 15793:2011 (CWA 15793). The software includes tools for conducting organizational gap analysis against CWA 15793 requirements, planning tools to support the implementation of CWA 15793 requirements, and performance monitoring support. The gap analysis questions are based on the text of CWA 15793, and its associated guidance document, CEN Workshop Agreement 16393:2012. The authors have secured permission from the publisher of CWA 15793, the European Committee for Standardization (CEN), to use language from the document in the software, with the understanding that the software will be made available freely, without charge.
Using Software Design Methods in CALL
ERIC Educational Resources Information Center
Ward, Monica
2006-01-01
The phrase "software design" is not one that arouses the interest of many CALL practitioners, particularly those from a humanities background. However, software design essentials are simply logical ways of going about designing a system. The fundamentals include modularity, anticipation of change, generality and an incremental approach. While CALL…
2013-01-01
Background The production of multiple transcript isoforms from one gene is a major source of transcriptome complexity. RNA-Seq experiments, in which transcripts are converted to cDNA and sequenced, allow the resolution and quantification of alternative transcript isoforms. However, methods to analyze splicing are underdeveloped and errors resulting in incorrect splicing calls occur in every experiment. Results We used RNA-Seq data to develop sequencing and aligner error models. By applying these error models to known input from simulations, we found that errors result from false alignment to minor splice motifs and antisense strands, shifted junction positions, paralog joining, and repeat induced gaps. By using a series of quantitative and qualitative filters, we eliminated diagnosed errors in the simulation, and applied this to RNA-Seq data from Drosophila melanogaster heads. We used high-confidence junction detections to specifically interrogate local splicing differences between transcripts. This method outperformed commonly used RNA-Seq methods to identify known alternative splicing events in the Drosophila sex determination pathway. We describe a flexible software package to perform these tasks called Splicing Analysis Kit (Spanki), available at http://www.cbcb.umd.edu/software/spanki. Conclusions Splice-junction centric analysis of RNA-Seq data provides advantages in specificity for detection of alternative splicing. Our software provides tools to better understand error profiles in RNA-Seq data and improve inference from this new technology. The splice-junction centric approach that this software enables will provide more accurate estimates of differentially regulated splicing than current tools. PMID:24209455
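The quantitative and qualitative junction filters mentioned above can be pictured with a simple sketch. The record fields, thresholds, and motif set below are illustrative assumptions, not Spanki's actual filters or data structures.

```python
# Hypothetical sketch of junction-level filtering: keep splice-junction
# detections that pass simple quantitative (coverage, overhang) and
# qualitative (canonical motif, defined strand) checks.

CANONICAL_MOTIFS = {"GT..AG", "GC..AG", "AT..AC"}

def passes_filters(junction, min_cov=5, min_overhang=8):
    return (junction["coverage"] >= min_cov
            and junction["max_overhang"] >= min_overhang
            and junction["motif"] in CANONICAL_MOTIFS
            and junction["strand"] in {"+", "-"})

junctions = [
    {"id": "chr2L:100-900", "coverage": 12, "max_overhang": 20, "motif": "GT..AG", "strand": "+"},
    {"id": "chr2L:150-870", "coverage": 2,  "max_overhang": 6,  "motif": "CT..AC", "strand": "."},
]
high_confidence = [j for j in junctions if passes_filters(j)]
print([j["id"] for j in high_confidence])   # only the first junction survives
```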
Sturgill, David; Malone, John H; Sun, Xia; Smith, Harold E; Rabinow, Leonard; Samson, Marie-Laure; Oliver, Brian
2013-11-09
The production of multiple transcript isoforms from one gene is a major source of transcriptome complexity. RNA-Seq experiments, in which transcripts are converted to cDNA and sequenced, allow the resolution and quantification of alternative transcript isoforms. However, methods to analyze splicing are underdeveloped and errors resulting in incorrect splicing calls occur in every experiment. We used RNA-Seq data to develop sequencing and aligner error models. By applying these error models to known input from simulations, we found that errors result from false alignment to minor splice motifs and antisense strands, shifted junction positions, paralog joining, and repeat induced gaps. By using a series of quantitative and qualitative filters, we eliminated diagnosed errors in the simulation, and applied this to RNA-Seq data from Drosophila melanogaster heads. We used high-confidence junction detections to specifically interrogate local splicing differences between transcripts. This method outperformed commonly used RNA-Seq methods to identify known alternative splicing events in the Drosophila sex determination pathway. We describe a flexible software package to perform these tasks called Splicing Analysis Kit (Spanki), available at http://www.cbcb.umd.edu/software/spanki. Splice-junction centric analysis of RNA-Seq data provides advantages in specificity for detection of alternative splicing. Our software provides tools to better understand error profiles in RNA-Seq data and improve inference from this new technology. The splice-junction centric approach that this software enables will provide more accurate estimates of differentially regulated splicing than current tools.
Data and Analysis Center for Software
1990-03-01
is available to DACS users. 7.4 Bibliographic Services Bibliographic inquiries to the DACS are received in many forms: by letter, by telephone call, by...of potential users concerning the DACS and its products and services. 9-4 10.0 TASK 9 - SPECIAL STUDIES AND PROJECTS 10.1 Introduction There are many ...problems related to software technology that can be solved through the full service capabilities provided by the DACS. Many of these are sizable
Process membership in asynchronous environments
NASA Technical Reports Server (NTRS)
Ricciardi, Aleta M.; Birman, Kenneth P.
1993-01-01
The development of reliable distributed software is simplified by the ability to assume a fail-stop failure model. The emulation of such a model in an asynchronous distributed environment is discussed. The solution proposed, called Strong-GMP, can be supported through a highly efficient protocol, and was implemented as part of a distributed systems software project at Cornell University. The precise definition of the problem, the protocol, correctness proofs, and an analysis of costs are addressed.
Pathway Tools version 13.0: integrated software for pathway/genome informatics and systems biology
Paley, Suzanne M.; Krummenacker, Markus; Latendresse, Mario; Dale, Joseph M.; Lee, Thomas J.; Kaipa, Pallavi; Gilham, Fred; Spaulding, Aaron; Popescu, Liviu; Altman, Tomer; Paulsen, Ian; Keseler, Ingrid M.; Caspi, Ron
2010-01-01
Pathway Tools is a production-quality software environment for creating a type of model-organism database called a Pathway/Genome Database (PGDB). A PGDB such as EcoCyc integrates the evolving understanding of the genes, proteins, metabolic network and regulatory network of an organism. This article provides an overview of Pathway Tools capabilities. The software performs multiple computational inferences including prediction of metabolic pathways, prediction of metabolic pathway hole fillers and prediction of operons. It enables interactive editing of PGDBs by DB curators. It supports web publishing of PGDBs, and provides a large number of query and visualization tools. The software also supports comparative analyses of PGDBs, and provides several systems biology analyses of PGDBs including reachability analysis of metabolic networks, and interactive tracing of metabolites through a metabolic network. More than 800 PGDBs have been created using Pathway Tools by scientists around the world, many of which are curated DBs for important model organisms. Those PGDBs can be exchanged using a peer-to-peer DB sharing system called the PGDB Registry. PMID:19955237
Hardisty, Frank; Robinson, Anthony C.
2010-01-01
In this paper we present the GeoViz Toolkit, an open-source, internet-delivered program for geographic visualization and analysis that features a diverse set of software components which can be flexibly combined by users who do not have programming expertise. The design and architecture of the GeoViz Toolkit allows us to address three key research challenges in geovisualization: allowing end users to create their own geovisualization and analysis component set on-the-fly, integrating geovisualization methods with spatial analysis methods, and making geovisualization applications sharable between users. Each of these tasks necessitates a robust yet flexible approach to inter-tool coordination. The coordination strategy we developed for the GeoViz Toolkit, called Introspective Observer Coordination, leverages and combines key advances in software engineering from the last decade: automatic introspection of objects, software design patterns, and reflective invocation of methods. PMID:21731423
Developing tools for digital radar image data evaluation
NASA Technical Reports Server (NTRS)
Domik, G.; Leberl, F.; Raggam, J.
1986-01-01
The refinement of radar image analysis methods has led to a need for a systems approach to radar image processing software. Developments stimulated through satellite radar are combined with standard image processing techniques to create a user environment for manipulating and analyzing airborne and satellite radar images. One aim is to create radar products from the original data that make the image content easier for the user to understand. The results are called secondary image products and derive from the original digital images. Another aim is to support interactive SAR image analysis. Software methods permit use of a digital height model to create ortho images, synthetic images, stereo-ortho images, radar maps or color combinations of different component products. Efforts are ongoing to integrate individual tools into a combined hardware/software environment for interactive radar image analysis.
NASA Astrophysics Data System (ADS)
Hart, D. M.; Merchant, B. J.; Abbott, R. E.
2012-12-01
The Component Evaluation project at Sandia National Laboratories supports the Ground-based Nuclear Explosion Monitoring program by performing testing and evaluation of the components that are used in seismic and infrasound monitoring systems. In order to perform this work, Component Evaluation maintains a testing facility called the FACT (Facility for Acceptance, Calibration, and Testing) site, a variety of test bed equipment, and a suite of software tools for analyzing test data. Recently, Component Evaluation has successfully integrated several improvements to its software analysis tools and test bed equipment that have substantially improved our ability to test and evaluate components. The software tool that is used to analyze test data is called TALENT: Test and AnaLysis EvaluatioN Tool. TALENT is designed to be a single, standard interface to all test configuration, metadata, parameters, waveforms, and results that are generated in the course of testing monitoring systems. It provides traceability by capturing everything about a test in a relational database that is required to reproduce the results of that test. TALENT provides a simple, yet powerful, user interface to quickly acquire, process, and analyze waveform test data. The software tool has also been expanded recently to handle sensors whose output is proportional to rotation angle, or rotation rate. As an example of this new processing capability, we show results from testing the new ATA ARS-16 rotational seismometer. The test data were collected at the USGS ASL. Four datasets were processed: 1) 1 Hz with increasing amplitude, 2) 4 Hz with increasing amplitude, 3) 16 Hz with increasing amplitude and 4) twenty-six discrete frequencies between 0.353 Hz and 64 Hz. The results are compared to manufacturer-supplied data sheets.
Knowledge and utilization of computer-software for statistics among Nigerian dentists.
Chukwuneke, F N; Anyanechi, C E; Obiakor, A O; Amobi, O; Onyejiaka, N; Alamba, I
2013-01-01
The use of computer software for the generation of statistical analysis has transformed health information and data into their simplest form in the areas of access, storage, retrieval and analysis in the field of research. This survey therefore was carried out to assess the level of knowledge and utilization of computer software for statistical analysis among dental researchers in eastern Nigeria. Questionnaires on the use of computer software for statistical analysis were randomly distributed to 65 practicing dental surgeons with more than 5 years of experience in the tertiary academic hospitals in eastern Nigeria. The focus was on: years of clinical experience; research work experience; and knowledge and application of computer software for data processing and statistical analysis. Sixty-two (62/65; 95.4%) of these questionnaires were returned anonymously and used in our data analysis. Twenty-nine (29/62; 46.8%) respondents fall within those with 5-10 years of clinical experience, out of which none has completed the specialist training programme. Practitioners with above 10 years of clinical experience were 33 (33/62; 53.2%), out of which 15 (15/33; 45.5%) are specialists, representing 24.2% (15/62) of the total number of respondents. All 15 specialists are actively involved in research activities and only five (5/15; 33.3%) can utilize software for statistical analysis unaided. This study has identified poor utilization of computer software for statistical analysis among dental researchers in eastern Nigeria. This is strongly associated with lack of exposure to these software packages early enough, especially during undergraduate training. This calls for the introduction of a computer training programme in the dental curriculum to enable practitioners to develop the habit of using computer software for their research.
Linear Discriminant Analysis on a Spreadsheet.
ERIC Educational Resources Information Center
Busbey, Arthur Bresnahan III
1989-01-01
Described is a software package, "Trapeze," within which a routine called LinDis can be used. Discussed are teaching methods, the linear discriminant model and equations, the LinDis worksheet, and an example. The set up for this routine is included. (CW)
Probability and Statistics in Sensor Performance Modeling
2010-12-01
language software program is called Environmental Awareness for Sensor and Emitter Employment. Some important numerical issues in the implementation...3 Statistical analysis for measuring sensor performance...complementary cumulative distribution function cdf cumulative distribution function DST decision-support tool EASEE Environmental Awareness of
A seismic analysis for masonry constructions: The different schematization methods of masonry walls
NASA Astrophysics Data System (ADS)
Olivito, Renato. S.; Codispoti, Rosamaria; Scuro, Carmelo
2017-11-01
The seismic behaviour of masonry structures is usually analyzed using structural calculation software based on the equivalent frame method or the macro-elements method. In these approaches, the masonry walls are divided into vertical elements and horizontal elements (the so-called spandrel elements), interconnected by rigid nodes. The aim of this work is to make a critical comparison between different schematization methods of masonry walls, underlining the structural importance of the spandrel elements. In order to implement the methods, two different structural calculation software packages were used and an existing masonry building was examined.
Towards Model-Driven End-User Development in CALL
ERIC Educational Resources Information Center
Farmer, Rod; Gruba, Paul
2006-01-01
The purpose of this article is to introduce end-user development (EUD) processes to the CALL software development community. EUD refers to the active participation of end-users, as non-professional developers, in the software development life cycle. Unlike formal software engineering approaches, the focus in EUD on means/ends development is…
Successful Use of CALL Software: An Investigation from the User's Perspective
ERIC Educational Resources Information Center
Scagnoli, Norma; Yontz, Ruth; Choo, Jinhee
2014-01-01
This study explores the use and implementation of computer-assisted language learning (CALL) software in graduate professional education. The investigation looked into self-reported information on graduate students' use of ESL (English as a Second Language) software to improve language skills and their competencies in professional English…
Nakano, Shogo; Asano, Yasuhisa
2015-02-03
Development of software and methods for design of complete sequences of functional proteins could contribute to studies of protein engineering and protein evolution. To this end, we developed the INTMSAlign software, and used it to design functional proteins and evaluate their usefulness. The software could assign both consensus and correlation residues of target proteins. We generated three protein sequences with S-selective hydroxynitrile lyase (S-HNL) activity, which we call designed S-HNLs; these proteins folded as efficiently as the native S-HNL. Sequence and biochemical analysis of the designed S-HNLs suggested that accumulation of neutral mutations occurs during the process of S-HNLs evolution from a low-activity form to a high-activity (native) form. Taken together, our results demonstrate that our software and the associated methods could be applied not only to design of complete sequences, but also to predictions of protein evolution, especially within families such as esterases and S-HNLs.
NASA Astrophysics Data System (ADS)
Nakano, Shogo; Asano, Yasuhisa
2015-02-01
Development of software and methods for design of complete sequences of functional proteins could contribute to studies of protein engineering and protein evolution. To this end, we developed the INTMSAlign software, and used it to design functional proteins and evaluate their usefulness. The software could assign both consensus and correlation residues of target proteins. We generated three protein sequences with S-selective hydroxynitrile lyase (S-HNL) activity, which we call designed S-HNLs; these proteins folded as efficiently as the native S-HNL. Sequence and biochemical analysis of the designed S-HNLs suggested that accumulation of neutral mutations occurs during the process of S-HNLs evolution from a low-activity form to a high-activity (native) form. Taken together, our results demonstrate that our software and the associated methods could be applied not only to design of complete sequences, but also to predictions of protein evolution, especially within families such as esterases and S-HNLs.
VoroTop: Voronoi cell topology visualization and analysis toolkit
NASA Astrophysics Data System (ADS)
Lazar, Emanuel A.
2018-01-01
This paper introduces a new open-source software program called VoroTop, which uses Voronoi topology to analyze local structure in atomic systems. Strengths of this approach include its abilities to analyze high-temperature systems and to characterize complex structure such as grain boundaries. This approach enables the automated analysis of systems and mechanisms previously not possible.
ERIC Educational Resources Information Center
Tang, Michael; David, Hyerle; Byrne, Roxanne; Tran, John
2012-01-01
This paper is a mathematical (Boolean) analysis of a set of cognitive maps called Thinking Maps[R], based on Albert Upton's semantic principles developed in his seminal works, Design for Thinking (1961) and Creative Analysis (1961). Albert Upton can be seen as a brilliant thinker who was before his time or after his time depending on the future of…
Towards a Methodology for Identifying Program Constraints During Requirements Analysis
NASA Technical Reports Server (NTRS)
Romo, Lilly; Gates, Ann Q.; Della-Piana, Connie Kubo
1997-01-01
Requirements analysis is the activity that involves determining the needs of the customer, identifying the services that the software system should provide, and understanding the constraints on the solution. The result of this activity is a natural language document, typically referred to as the requirements definition document. Some of the problems that exist in defining requirements in large scale software projects include synthesizing knowledge from various domain experts and communicating this information across multiple levels of personnel. One approach that addresses part of this problem is called context monitoring and involves identifying the properties of and relationships between objects that the system will manipulate. This paper examines several software development methodologies, discusses the support that each provides for eliciting such information from experts and specifying the information, and suggests refinements to these methodologies.
[The analysis of threshold effect using Empower Stats software].
Lin, Lin; Chen, Chang-zhong; Yu, Xiao-dan
2013-11-01
In many biomedical studies of factors influencing an outcome variable, a factor has no influence, or a positive effect, only within a certain range; beyond a certain threshold value, the size and/or direction of the effect changes. This is called a threshold effect. Whether there is a threshold effect in the relationship between a factor (x) and the outcome variable (y) can be observed by fitting a smooth curve and checking whether a piecewise linear relationship exists. The threshold effect can then be analyzed using a segmented regression model, a likelihood ratio test (LRT) and bootstrap resampling. The Empower Stats software developed by X & Y Solutions Inc. (USA) has a threshold effect analysis module. The user can either input a threshold value at which the data are segmented, or leave the threshold unspecified and let the software determine the optimal threshold automatically and calculate its confidence interval.
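To illustrate the threshold-effect idea, a piecewise linear relationship with an unknown breakpoint, the sketch below scans candidate thresholds and fits a segmented linear model at each one; it is a generic illustration in Python on simulated data, not the Empower Stats algorithm.

# Illustrative threshold (breakpoint) search for a piecewise linear model;
# not the Empower Stats algorithm, just the general idea.
import numpy as np

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 10, 200))
y = np.where(x < 6, 1.0 + 0.1 * x, 1.6 + 1.5 * (x - 6)) + rng.normal(0, 0.3, x.size)

def segmented_rss(x, y, k):
    """Fit y ~ x + max(x - k, 0) by least squares and return the RSS."""
    X = np.column_stack([np.ones_like(x), x, np.clip(x - k, 0, None)])
    beta, rss, *_ = np.linalg.lstsq(X, y, rcond=None)
    return rss[0] if rss.size else np.sum((y - X @ beta) ** 2)

candidates = np.quantile(x, np.linspace(0.1, 0.9, 81))
best_k = min(candidates, key=lambda k: segmented_rss(x, y, k))
print(f"estimated threshold ~ {best_k:.2f}")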
Spectrum analysis on quality requirements consideration in software design documents.
Kaiya, Haruhiko; Umemura, Masahiro; Ogata, Shinpei; Kaijiri, Kenji
2013-12-01
Software quality requirements defined in the requirements analysis stage should be implemented in the final products, such as source codes and system deployment. To guarantee this meta-requirement, quality requirements should be considered in the intermediate stages, such as the design stage or the architectural definition stage. We propose a novel method for checking whether quality requirements are considered in the design stage. In this method, a technique called "spectrum analysis for quality requirements" is applied not only to requirements specifications but also to design documents. The technique enables us to derive the spectrum of a document, and quality requirements considerations in the document are numerically represented in the spectrum. We can thus objectively identify whether the considerations of quality requirements in a requirements document are adapted to its design document. To validate the method, we applied it to commercial software systems with the help of a supporting tool, and we confirmed that the method worked well.
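As a rough illustration of a document "spectrum" over quality requirements, the Python sketch below builds a term-frequency vector for an assumed keyword list and compares a requirements document with a design document by cosine similarity; the keyword list and similarity measure are assumptions, not the authors' exact technique.

# Simplified illustration of a document "spectrum" over quality-requirement
# terms; keyword list and similarity measure are assumptions.
import math
import re
from collections import Counter

QUALITY_TERMS = ["security", "performance", "usability", "reliability", "availability"]

def spectrum(text):
    words = Counter(re.findall(r"[a-z]+", text.lower()))
    return [words[t] for t in QUALITY_TERMS]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na, nb = math.sqrt(sum(x * x for x in a)), math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

requirements_doc = "The system shall encrypt data (security) and respond within 2 s (performance)."
design_doc = "The design uses TLS for security; caching addresses performance and availability."
print("spectrum similarity:", round(cosine(spectrum(requirements_doc), spectrum(design_doc)), 3))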
dDocent: a RADseq, variant-calling pipeline designed for population genomics of non-model organisms.
Puritz, Jonathan B; Hollenbeck, Christopher M; Gold, John R
2014-01-01
Restriction-site associated DNA sequencing (RADseq) has become a powerful and useful approach for population genomics. Currently, no software exists that utilizes both paired-end reads from RADseq data to efficiently produce population-informative variant calls, especially for non-model organisms with large effective population sizes and high levels of genetic polymorphism. dDocent is an analysis pipeline with a user-friendly, command-line interface designed to process individually barcoded RADseq data (with double cut sites) into informative SNPs/Indels for population-level analyses. The pipeline, written in BASH, uses data reduction techniques and other stand-alone software packages to perform quality trimming and adapter removal, de novo assembly of RAD loci, read mapping, SNP and Indel calling, and baseline data filtering. Double-digest RAD data from population pairings of three different marine fishes were used to compare dDocent with Stacks, the first generally available, widely used pipeline for analysis of RADseq data. dDocent consistently identified more SNPs shared across greater numbers of individuals and with higher levels of coverage. This is because dDocent quality trims instead of filtering and incorporates both forward and reverse reads (including reads with INDEL polymorphisms) in assembly, mapping, and SNP calling. The pipeline and a comprehensive user guide can be found at http://dDocent.wordpress.com.
dDocent: a RADseq, variant-calling pipeline designed for population genomics of non-model organisms
Hollenbeck, Christopher M.; Gold, John R.
2014-01-01
Restriction-site associated DNA sequencing (RADseq) has become a powerful and useful approach for population genomics. Currently, no software exists that utilizes both paired-end reads from RADseq data to efficiently produce population-informative variant calls, especially for non-model organisms with large effective population sizes and high levels of genetic polymorphism. dDocent is an analysis pipeline with a user-friendly, command-line interface designed to process individually barcoded RADseq data (with double cut sites) into informative SNPs/Indels for population-level analyses. The pipeline, written in BASH, uses data reduction techniques and other stand-alone software packages to perform quality trimming and adapter removal, de novo assembly of RAD loci, read mapping, SNP and Indel calling, and baseline data filtering. Double-digest RAD data from population pairings of three different marine fishes were used to compare dDocent with Stacks, the first generally available, widely used pipeline for analysis of RADseq data. dDocent consistently identified more SNPs shared across greater numbers of individuals and with higher levels of coverage. This is because dDocent quality trims instead of filtering and incorporates both forward and reverse reads (including reads with INDEL polymorphisms) in assembly, mapping, and SNP calling. The pipeline and a comprehensive user guide can be found at http://dDocent.wordpress.com. PMID:24949246
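The staging performed by a RADseq pipeline of this kind can be pictured as a driver that chains stand-alone tools; the Python sketch below is purely illustrative, with placeholder command names and flags that are not dDocent's actual commands (dDocent itself is written in BASH).

# Hypothetical RADseq pipeline driver illustrating the staging that dDocent
# performs; the commands and flags below are placeholders, not dDocent's.
import subprocess

def run(cmd):
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

def radseq_pipeline(sample_fastqs, reference_out="reference.fasta", vcf_out="raw.vcf"):
    for fq in sample_fastqs:                      # 1. quality trimming / adapter removal
        run(["trim_tool", "--in", fq, "--out", fq + ".trim"])
    run(["assemble_tool", "--out", reference_out]  # 2. de novo assembly of RAD loci
        + [fq + ".trim" for fq in sample_fastqs])
    for fq in sample_fastqs:                      # 3. read mapping
        run(["map_tool", reference_out, fq + ".trim", "-o", fq + ".bam"])
    run(["call_tool", "--ref", reference_out,     # 4. SNP/indel calling
         "-o", vcf_out] + [fq + ".bam" for fq in sample_fastqs])
    run(["filter_tool", vcf_out, "-o", "filtered.vcf"])  # 5. baseline filtering

if __name__ == "__main__":
    radseq_pipeline(["sampleA.fastq", "sampleB.fastq"])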
A parallel and sensitive software tool for methylation analysis on multicore platforms.
Tárraga, Joaquín; Pérez, Mariano; Orduña, Juan M; Duato, José; Medina, Ignacio; Dopazo, Joaquín
2015-10-01
DNA methylation analysis suffers from very long processing times, as the advent of Next-Generation Sequencers has shifted the bottleneck of genomic studies from the sequencers that obtain the DNA samples to the software that performs the analysis of these samples. The existing software for methylation analysis does not seem to scale efficiently with either the size of the dataset or the length of the reads to be analyzed. As it is expected that sequencers will provide longer and longer reads in the near future, efficient and scalable methylation software should be developed. We present a new software tool, called HPG-Methyl, which efficiently maps bisulphite sequencing reads on DNA, analyzing DNA methylation. The strategy used by this software consists of leveraging the speed of the Burrows-Wheeler Transform to map a large number of DNA fragments (reads) rapidly, as well as the accuracy of the Smith-Waterman algorithm, which is exclusively employed to deal with the most ambiguous and shortest reads. Experimental results on platforms with Intel multicore processors show that HPG-Methyl significantly outperforms state-of-the-art software such as Bismark, BS-Seeker or BSMAP in both execution time and sensitivity, particularly for long bisulphite reads. The software is provided in the form of C libraries and functions, together with instructions to compile and execute it, available by sftp to anonymous@clariano.uv.es (password 'anonymous'). juan.orduna@uv.es or jdopazo@cipf.es. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
VCFR: A package to manipulate and visualize variant call format data in R
USDA-ARS?s Scientific Manuscript database
Software to call single nucleotide polymorphisms or related genetic variants has converged on the variant call format (vcf) as its output format of choice. This has created a need for tools to work with vcf files. While an increasing number of software exists to read vcf data, many of them only ex...
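Because vcfR is an R package, the Python snippet below serves only to illustrate the fixed-column layout of a VCF record that such tools read; the example lines are made up.

# Minimal illustration of the fixed columns of a VCF record; vcfR itself is
# an R package, so this snippet only shows the file layout.
VCF_COLUMNS = ["CHROM", "POS", "ID", "REF", "ALT", "QUAL", "FILTER", "INFO"]

def parse_vcf_lines(lines):
    records = []
    for line in lines:
        if line.startswith("#"):          # header and column-name lines
            continue
        fields = line.rstrip("\n").split("\t")
        rec = dict(zip(VCF_COLUMNS, fields[:8]))
        rec["POS"] = int(rec["POS"])
        records.append(rec)
    return records

example = [
    "##fileformat=VCFv4.2",
    "#CHROM\tPOS\tID\tREF\tALT\tQUAL\tFILTER\tINFO",
    "1\t12345\t.\tA\tG\t60\tPASS\tDP=23",
]
for rec in parse_vcf_lines(example):
    print(rec["CHROM"], rec["POS"], rec["REF"], ">", rec["ALT"])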
Software Engineering Tools for Scientific Models
NASA Technical Reports Server (NTRS)
Abrams, Marc; Saboo, Pallabi; Sonsini, Mike
2013-01-01
Software tools were constructed to address issues the NASA Fortran development community faces, and they were tested on real models currently in use at NASA. These proof-of-concept tools address the High-End Computing Program and the Modeling, Analysis, and Prediction Program. Two examples are the NASA Goddard Earth Observing System Model, Version 5 (GEOS-5) atmospheric model in Cell Fortran on the Cell Broadband Engine, and the Goddard Institute for Space Studies (GISS) coupled atmosphere- ocean model called ModelE, written in fixed format Fortran.
ToxMiner Software Interface for Visualizing and Analyzing ToxCast Data
The ToxCast dataset represents a collection of assays and endpoints that will require both standard statistical approaches as well as customized data analysis workflows. To analyze this unique dataset, we have developed an integrated database with a Java-based interface called ToxMi...
3D Visualization for Phoenix Mars Lander Science Operations
NASA Technical Reports Server (NTRS)
Edwards, Laurence; Keely, Leslie; Lees, David; Stoker, Carol
2012-01-01
Planetary surface exploration missions present considerable operational challenges in the form of substantial communication delays, limited communication windows, and limited communication bandwidth. 3D visualization software was developed and delivered to the 2008 Phoenix Mars Lander (PML) mission. The components of the system include an interactive 3D visualization environment called Mercator, terrain reconstruction software called the Ames Stereo Pipeline, and a server providing distributed access to terrain models. The software was successfully utilized during the mission for science analysis, site understanding, and science operations activity planning. A terrain server was implemented that provided distribution of terrain models from a central repository to clients running the Mercator software. The Ames Stereo Pipeline generates accurate, high-resolution, texture-mapped, 3D terrain models from stereo image pairs. These terrain models can then be visualized within the Mercator environment. The central cross-cutting goal for these tools is to provide an easy-to-use, high-quality, full-featured visualization environment that enhances the mission science team's ability to develop low-risk productive science activity plans. In addition, for the Mercator and Viz visualization environments, extensibility and adaptability to different missions and application areas are key design goals.
Usability analysis of 2D graphics software for designing technical clothing.
Teodoroski, Rita de Cassia Clark; Espíndola, Edilene Zilma; Silva, Enéias; Moro, Antônio Renato Pereira; Pereira, Vera Lucia D V
2012-01-01
With the advent of technology, the computer became a working tool increasingly present in companies. Its purpose is to increase production and reduce the errors inherent in manual production. The aim of this study was to analyze the usability of 2D graphics software used by a professional to create clothing designs during his work. The movements of the mouse, keyboard and graphical tools were monitored in real time by the software Camtasia 7® installed on the user's computer. To register the use of the mouse and keyboard we used auxiliary software called MouseMeter®, which quantifies the number of presses of the right, middle and left mouse buttons and of the keyboard, as well as the distance traveled in meters by the cursor on the screen. Data were collected in periods of 15 minutes, 1 hour and 8 hours, consecutively. The results showed that the job is considered repetitive and demands high physical effort, which can lead to the appearance of repetitive strain injuries. Thus, to minimize operator effort and thereby enhance the usability of the examined tool, it becomes imperative to replace the mouse with a device called a tablet, which offers an electronic pen and a drawing platform for design development.
NASA Technical Reports Server (NTRS)
Burleigh, Scott C.
2011-01-01
Sptrace is a general-purpose space utilization tracing system that is conceptually similar to the commercial Purify product used to detect leaks and other memory usage errors. It is designed to monitor space utilization in any sort of heap, i.e., a region of data storage on some device (nominally memory; possibly shared and possibly persistent) with a flat address space. This software can trace usage of shared and/or non-volatile storage in addition to private RAM (random access memory). Sptrace is implemented as a set of C function calls that are invoked from within the software that is being examined. The function calls fall into two broad classes: (1) functions that are embedded within the heap management software [e.g., JPL's SDR (Simple Data Recorder) and PSM (Personal Space Management) systems] to enable heap usage analysis by populating a virtual time-sequenced log of usage activity, and (2) reporting functions that are embedded within the application program whose behavior is suspect. For ease of use, these functions may be wrapped privately inside public functions offered by the heap management software. Sptrace can be used for VxWorks or RTEMS realtime systems as easily as for Linux or OS/X systems.
Microscopy image segmentation tool: Robust image data analysis
NASA Astrophysics Data System (ADS)
Valmianski, Ilya; Monton, Carlos; Schuller, Ivan K.
2014-03-01
We present a software package called Microscopy Image Segmentation Tool (MIST). MIST is designed for analysis of microscopy images which contain large collections of small regions of interest (ROIs). Originally developed for analysis of porous anodic alumina scanning electron images, MIST capabilities have been expanded to allow use in a large variety of problems including analysis of biological tissue, inorganic and organic film grain structure, as well as nano- and meso-scopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible allowing incorporation of specialized user developed analysis. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy.
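A generic way to segment many small ROIs, in the spirit of what MIST does, is thresholding followed by connected-component labelling; the sketch below uses NumPy/SciPy on an assumed synthetic image and is not MIST's actual algorithm.

# Rough sketch of ROI segmentation by thresholding plus connected-component
# labelling; a generic approach, not MIST's actual algorithm.
import numpy as np
from scipy import ndimage

def segment_rois(image, threshold=None, min_size=5):
    """Return a label image and per-ROI areas for bright regions."""
    if threshold is None:
        threshold = image.mean() + 2 * image.std()   # simple global threshold
    mask = image > threshold
    labels, n = ndimage.label(mask)                  # connected components = ROIs
    areas = ndimage.sum(mask, labels, index=range(1, n + 1))
    keep = {i + 1 for i, a in enumerate(areas) if a >= min_size}
    cleaned = np.where(np.isin(labels, list(keep)), labels, 0)
    return cleaned, [a for a in areas if a >= min_size]

rng = np.random.default_rng(1)
img = rng.normal(0, 1, (64, 64))
img[10:20, 10:20] += 5.0                             # synthetic bright ROI
labels, areas = segment_rois(img)
print("ROIs found:", len(areas))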
Usability study of clinical exome analysis software: top lessons learned and recommendations.
Shyr, Casper; Kushniruk, Andre; Wasserman, Wyeth W
2014-10-01
New DNA sequencing technologies have revolutionized the search for genetic disruptions. Targeted sequencing of all protein coding regions of the genome, called exome analysis, is actively used in research-oriented genetics clinics, with the transition to exomes as a standard procedure underway. This transition is challenging; identification of potentially causal mutation(s) amongst ∼10^6 variants requires specialized computation in combination with expert assessment. This study analyzes the usability of user interfaces for clinical exome analysis software. There are two study objectives: (1) To ascertain the key features of successful user interfaces for clinical exome analysis software based on the perspective of expert clinical geneticists, (2) To assess user-system interactions in order to reveal strengths and weaknesses of existing software, inform future design, and accelerate the clinical uptake of exome analysis. Surveys, interviews, and cognitive task analysis were performed for the assessment of two next-generation exome sequence analysis software packages. The subjects included ten clinical geneticists who interacted with the software packages using the "think aloud" method. Subjects' interactions with the software were recorded in their clinical office within an urban research and teaching hospital. All major user interface events (from the user interactions with the packages) were time-stamped and annotated with coding categories to identify usability issues in order to characterize desired features and deficiencies in the user experience. We detected 193 usability issues, the majority of which concern interface layout and navigation, and the resolution of reports. Our study highlights gaps in specific software features typical within exome analysis. The clinicians perform best when the flow of the system is structured into well-defined yet customizable layers for incorporation within the clinical workflow. The results highlight opportunities to dramatically accelerate clinician analysis and interpretation of patient genomic data. We present the first application of usability methods to evaluate software interfaces in the context of exome analysis. Our results highlight how the study of user responses can lead to identification of usability issues and challenges and reveal software reengineering opportunities for improving clinical next-generation sequencing analysis. While the evaluation focused on two distinctive software tools, the results are general and should inform active and future software development for genome analysis software. As large-scale genome analysis becomes increasingly common in healthcare, it is critical that efficient and effective software interfaces are provided to accelerate clinical adoption of the technology. Implications for improved design of such applications are discussed. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
Upgrade of DRAMA-ESA's Space Debris Mitigation Analysis Tool Suite
NASA Astrophysics Data System (ADS)
Gelhaus, Johannes; Sanchez-Ortiz, Noelia; Braun, Vitali; Kebschull, Christopher; de Oliveira, Joaquim Correia; Dominguez-Gonzalez, Raul; Wiedemann, Carsten; Krag, Holger; Vorsmann, Peter
2013-08-01
One decade ago ESA started the development of the first version of the software tool called DRAMA (Debris Risk Assessment and Mitigation Analysis) to enable ESA space programs to assess their compliance with the recommendations in the European Code of Conduct for Space Debris Mitigation. This tool was maintained, upgraded and extended during the last years and is now a combination of five individual tools, each addressing a different aspect of debris mitigation. This paper gives an overview of the new DRAMA software in general. The main tools ARES, OSCAR, MIDAS, CROC and SARA will be discussed, and the environment used by DRAMA will be explained briefly.
Innovative Techniques Simplify Vibration Analysis
NASA Technical Reports Server (NTRS)
2010-01-01
In the early years of development, Marshall Space Flight Center engineers encountered challenges related to components in the space shuttle main engine. To assess the problems, they evaluated the effects of vibration and oscillation. To enhance the method of vibration signal analysis, Marshall awarded Small Business Innovation Research (SBIR) contracts to AI Signal Research, Inc. (ASRI), in Huntsville, Alabama. ASRI developed a software package called PC-SIGNAL that NASA now employs on a daily basis, and in 2009, the PKP-Module won Marshall's Software of the Year award. The technology is also used in many industries: aircraft and helicopter, rocket engine manufacturing, transportation, and nuclear power.
ERIC Educational Resources Information Center
Luan, Jing; Willett, Terrence
This paper discusses data mining--an end-to-end (ETE) data analysis tool that is used by researchers in higher education. It also relates data mining and other software programs to a brand new concept called "Knowledge Management." The paper culminates in the Tier Knowledge Management Model (TKMM), which seeks to provide a stable…
What Does CALL Have to Offer Computer Science and What Does Computer Science Have to Offer CALL?
ERIC Educational Resources Information Center
Cushion, Steve
2006-01-01
We will argue that CALL can usefully be viewed as a subset of computer software engineering and can profit from adopting some of the recent progress in software development theory. The unified modelling language has become the industry standard modelling technique and the accompanying unified process is rapidly gaining acceptance. The manner in…
Requirements for VICTORIA Class Fire Control System: Contact Management Function
2014-07-01
Canadian Navy (RCN) is currently upgrading the fire control system, which will include moving the software to new modular consoles which have screens...Development RCN Royal Canadian Navy SAC Sensor Analysis Coordinator; also called Command Display Console (CDC) operator SAR Search and Rescue SME
Prins, Pjotr; Goto, Naohisa; Yates, Andrew; Gautier, Laurent; Willis, Scooter; Fields, Christopher; Katayama, Toshiaki
2012-01-01
Open-source software (OSS) encourages computer programmers to reuse software components written by others. In evolutionary bioinformatics, OSS comes in a broad range of programming languages, including C/C++, Perl, Python, Ruby, Java, and R. To avoid writing the same functionality multiple times for different languages, it is possible to share components by bridging computer languages and Bio* projects, such as BioPerl, Biopython, BioRuby, BioJava, and R/Bioconductor. In this chapter, we compare the two principal approaches for sharing software between different programming languages: either by remote procedure call (RPC) or by sharing a local call stack. RPC provides a language-independent protocol over a network interface; examples are RSOAP and Rserve. The local call stack provides a between-language mapping not over the network interface, but directly in computer memory; examples are R bindings, RPy, and languages sharing the Java Virtual Machine stack. This functionality provides strategies for sharing of software between Bio* projects, which can be exploited more often. Here, we present cross-language examples for sequence translation, and measure throughput of the different options. We compare calling into R through native R, RSOAP, Rserve, and RPy interfaces, with the performance of native BioPerl, Biopython, BioJava, and BioRuby implementations, and with call stack bindings to BioJava and the European Molecular Biology Open Software Suite. In general, call stack approaches outperform native Bio* implementations and these, in turn, outperform RPC-based approaches. To test and compare strategies, we provide a downloadable BioNode image with all examples, tools, and libraries included. The BioNode image can be run on VirtualBox-supported operating systems, including Windows, OSX, and Linux.
NASA Technical Reports Server (NTRS)
Katz, Daniel
2004-01-01
PVM Wrapper is a software library that makes it possible for code that utilizes the Parallel Virtual Machine (PVM) software library to run using the message-passing interface (MPI) software library, without needing to rewrite the entire code. PVM and MPI are the two most common software libraries used for applications that involve passing of messages among parallel computers. Since about 1996, MPI has been the de facto standard. Codes written when PVM was popular often feature patterns of {"initsend," "pack," "send"} and {"receive," "unpack"} calls. In many cases, these calls are not contiguous and one set of calls may even exist over multiple subroutines. These characteristics make it difficult to obtain equivalent functionality via a single MPI "send" call. Because PVM Wrapper is written to run with MPI- 1.2, some PVM functions are not permitted and must be replaced - a task that requires some programming expertise. The "pvm_spawn" and "pvm_parent" function calls are not replaced, but a programmer can use "mpirun" and knowledge of the ranks of parent and child tasks with supplied macroinstructions to enable execution of codes that use "pvm_spawn" and "pvm_parent."
NASA Technical Reports Server (NTRS)
Avila, Edwin M. Martinez; Muniz, Ricardo; Szafran, Jamie; Dalton, Adam
2011-01-01
Lines of code (LOC) analysis is one of the methods used to measure programmer productivity and estimate schedules of programming projects. The Launch Control System (LCS) had previously used this method to estimate the amount of work and to plan development efforts. The disadvantage of using LOC as a measure of effort is that only 30% to 35% of the total effort of software projects involves coding [8]. In this application, function points are used instead of LOC for a better estimation of the hours needed to develop each piece of software. Because of these disadvantages, Jamie Szafran of the System Software Branch of Control And Data Systems (NE-C3) at Kennedy Space Center developed a web application called Function Point Analysis (FPA) Depot. The objective of this web application is that the LCS software architecture team can use the data to more accurately estimate the effort required to implement customer requirements. This paper describes the evolution of the domain model used for function point analysis as project managers continually strive to generate more accurate estimates.
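For readers unfamiliar with function points, the sketch below computes an unadjusted function point count from commonly published IFPUG-style average weights; the weights and component counts are illustrative and do not describe the FPA Depot's internal model.

# Minimal unadjusted function point count using commonly published
# IFPUG-style average weights; not the FPA Depot's actual model.
AVERAGE_WEIGHTS = {
    "external_inputs": 4,
    "external_outputs": 5,
    "external_inquiries": 4,
    "internal_files": 10,
    "external_interfaces": 7,
}

def unadjusted_function_points(counts):
    return sum(AVERAGE_WEIGHTS[k] * counts.get(k, 0) for k in AVERAGE_WEIGHTS)

counts = {"external_inputs": 6, "external_outputs": 4, "internal_files": 2}
ufp = unadjusted_function_points(counts)
print("unadjusted function points:", ufp)          # 6*4 + 4*5 + 2*10 = 64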
NASA Astrophysics Data System (ADS)
Soltis, Joseph M.; Savage, Anne; Leong, Kirsten M.
2004-05-01
The most commonly occurring elephant vocalization is the rumble, a frequency-modulated call with infrasonic components. Upwards of ten distinct rumble subtypes have been proposed, but little quantitative work on the acoustic properties of rumbles has been conducted. Rumble vocalizations (N=269) from six females housed at Disney's Animal Kingdom were analyzed. Vocalizations were recorded from microphones in collars around subject necks, and rumbles were digitized and measured using SIGNAL software. Sixteen acoustic variables were measured for each call, extracting both source and filter features. Multidimensional scaling analysis indicates that there are no acoustically distinct rumble subtypes, but that there is quantitative variation across rumbles. Discriminant function analysis showed that the acoustic characteristics of rumbles differ across females. A classification success rate of 65% was achieved when assigning unselected rumbles to one of the six females (test set =64 calls) according to the functions derived from the originally selected calls (training set =205 calls). The rumble is best viewed as a single call type with graded variation, but information regarding individual identity is encoded in female rumbles.
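The classification step described here can be sketched with an off-the-shelf linear discriminant analysis; the Python example below uses synthetic "acoustic features" for six individuals and a train/test split, and is only an analogue of the SIGNAL-based analysis in the abstract.

# Sketch of discriminant-function classification of calls by individual,
# using synthetic features; the study itself measured 16 acoustic variables
# in SIGNAL and classified rumbles from six females.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_females, calls_per_female, n_features = 6, 45, 16
X = np.vstack([rng.normal(loc=i, scale=2.0, size=(calls_per_female, n_features))
               for i in range(n_females)])
y = np.repeat(np.arange(n_females), calls_per_female)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
lda = LinearDiscriminantAnalysis().fit(X_train, y_train)
print(f"classification success rate: {lda.score(X_test, y_test):.0%}")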
Baedecker, P.A.; Grossman, J.N.
1995-01-01
A PC-based system has been developed for the analysis of gamma-ray spectra and for the complete reduction of data from INAA experiments, including software to average the results from multiple lines and multiple countings and to produce a final report of analysis. Graphics algorithms may be called for the analysis of complex spectral features, to compare the data from alternate photopeaks and to evaluate detector performance during a given counting cycle. A database of results for control samples can be used to prepare quality control charts to evaluate long-term precision and to search for systematic variations in data on reference samples as a function of time. The entire software library can be accessed through a user-friendly menu interface with internal help.
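Averaging results from multiple photopeaks and countings is often done by inverse-variance weighting; the sketch below shows that standard formula, which may or may not match the exact scheme used by the described software.

# Inverse-variance weighted mean for combining concentration estimates from
# multiple photopeaks/countings; a standard approach, not necessarily the
# exact formula used by the described software.
import math

def weighted_mean(values, sigmas):
    weights = [1.0 / s**2 for s in sigmas]
    mean = sum(w * v for w, v in zip(weights, values)) / sum(weights)
    sigma = math.sqrt(1.0 / sum(weights))
    return mean, sigma

# e.g. one element measured via two photopeaks in two countings (ppm)
values = [15.2, 14.8, 15.6, 15.1]
sigmas = [0.4, 0.5, 0.6, 0.3]
mean, sigma = weighted_mean(values, sigmas)
print(f"combined result: {mean:.2f} +/- {sigma:.2f} ppm")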
USDA-ARS?s Scientific Manuscript database
Recently, a variant of stochastic dominance called stochastic efficiency with respect to a function (SERF) has been developed and applied. Unlike traditional stochastic dominance approaches, SERF uses the concept of certainty equivalents (CEs) to rank a set of risk-efficient alternatives instead of...
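A minimal sketch of the SERF idea, ranking risky alternatives by certainty equivalents across a range of risk-aversion coefficients, is shown below using negative-exponential (CARA) utility; the payoff distributions and utility choice are assumptions for illustration only.

# Simplified SERF-style ranking: certainty equivalents of risky alternatives
# under negative-exponential (CARA) utility over a range of absolute risk
# aversion coefficients. Payoff distributions below are made up.
import numpy as np

def certainty_equivalent(payoffs, r):
    payoffs = np.asarray(payoffs, dtype=float)
    if abs(r) < 1e-12:                       # risk neutral: CE is the mean
        return payoffs.mean()
    return -np.log(np.mean(np.exp(-r * payoffs))) / r

alternatives = {
    "A": [80, 100, 120, 140],                # moderate mean, low spread
    "B": [40, 90, 150, 200],                 # higher mean, higher spread
}
for r in (0.0, 0.01, 0.05):
    ranking = sorted(alternatives, key=lambda k: certainty_equivalent(alternatives[k], r), reverse=True)
    ces = {k: round(certainty_equivalent(v, r), 1) for k, v in alternatives.items()}
    print(f"r={r}: CEs={ces}, preferred={ranking[0]}")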
Hardware Acceleration for Cyber Security
2010-11-01
perform different approaches. It includes behavioral analysis, by means of NetFlow monitoring, as well as packet content analysis, so-called Deep...Interface (API). The example of such application is NetFlow exporter described in [5]. • We provide modified libpcap library using libsze2 API. This...cards. The software applications using NIFIC include FlowMon NetFlow/IPFIX generator, Wireshark packet analyzer, iptables - Linux kernel firewall, deep
STARS: a software application for the EBEX autonomous daytime star cameras
NASA Astrophysics Data System (ADS)
Chapman, Daniel; Didier, Joy; Hanany, Shaul; Hillbrand, Seth; Limon, Michele; Miller, Amber; Reichborn-Kjennerud, Britt; Tucker, Greg; Vinokurov, Yury
2014-07-01
The E and B Experiment (EBEX) is a balloon-borne telescope designed to probe polarization signals in the CMB resulting from primordial gravitational waves, gravitational lensing, and Galactic dust emission. EBEX completed an 11-day flight over Antarctica in January 2013 and data analysis is underway. EBEX employs two star cameras to achieve its real-time and post-flight pointing requirements. We wrote a software application called STARS to operate, command, and collect data from each of the star cameras, and to interface them with the main flight computer. We paid special attention to making the software robust against potential in-flight failures. We report on the implementation, testing, and successful in-flight performance of STARS.
NASA Astrophysics Data System (ADS)
Kang, Won-Seok; Son, Chang-Sik; Lee, Sangho; Choi, Rock-Hyun; Ha, Yeong-Mi
2017-07-01
In this paper, we introduce a wellness software platform called WellnessHumanCare, a semi-automatic wellness management platform that provides complex wellness data acquisition (mental, physical and environmental) with smart wearable devices, complex wellness condition analysis, privacy-aware online/offline recommendation, real-time monitoring apps (smartphone-based and web-based), and related functions. To show the efficiency of WellnessHumanCare, we demonstrated a wellness management service over 3 months with 79 participants in Korea (experimental group: 39, at H Corp.; control group: 40, at K Corp.).
Hybrid PV/diesel solar power system design using multi-level factor analysis optimization
NASA Astrophysics Data System (ADS)
Drake, Joshua P.
Solar power systems represent a large area of interest across a spectrum of organizations at a global level. It was determined that a clear understanding of current state of the art software and design methods, as well as optimization methods, could be used to improve the design methodology. Solar power design literature was researched for an in depth understanding of solar power system design methods and algorithms. Multiple software packages for the design and optimization of solar power systems were analyzed for a critical understanding of their design workflow. In addition, several methods of optimization were studied, including brute force, Pareto analysis, Monte Carlo, linear and nonlinear programming, and multi-way factor analysis. Factor analysis was selected as the most efficient optimization method for engineering design as it applied to solar power system design. The solar power design algorithms, software work flow analysis, and factor analysis optimization were combined to develop a solar power system design optimization software package called FireDrake. This software was used for the design of multiple solar power systems in conjunction with an energy audit case study performed in seven Tibetan refugee camps located in Mainpat, India. A report of solar system designs for the camps, as well as a proposed schedule for future installations was generated. It was determined that there were several improvements that could be made to the state of the art in modern solar power system design, though the complexity of current applications is significant.
MIDAS: Software for the detection and analysis of lunar impact flashes
NASA Astrophysics Data System (ADS)
Madiedo, José M.; Ortiz, José L.; Morales, Nicolás; Cabrera-Caño, Jesús
2015-06-01
Since 2009 we are running a project to identify flashes produced by the impact of meteoroids on the surface of the Moon. For this purpose we are employing small telescopes and high-sensitivity CCD video cameras. To automatically identify these events a software package called MIDAS was developed and tested. This package can also perform the photometric analysis of these flashes and estimate the value of the luminous efficiency. Besides, we have implemented in MIDAS a new method to establish which is the likely source of the meteoroids (known meteoroid stream or sporadic background). The main features of this computer program are analyzed here, and some examples of lunar impact events are presented.
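The core of automatic flash detection can be illustrated by differencing each video frame against a running background estimate and flagging pixels that brighten far beyond the noise level; the toy Python detector below follows that idea and is not the MIDAS implementation.

# Toy frame-differencing detector for short-lived brightenings in a video
# stream; illustrates the idea only, not the MIDAS implementation.
import numpy as np

def detect_flashes(frames, k_sigma=6.0):
    """Return (frame_index, y, x) for pixels far above the recent background."""
    frames = np.asarray(frames, dtype=float)
    detections = []
    for i in range(1, len(frames)):
        reference = np.median(frames[max(0, i - 5):i], axis=0)   # recent background
        residual = frames[i] - reference
        sigma = residual.std()
        for y, x in zip(*np.where(residual > k_sigma * sigma)):
            detections.append((i, int(y), int(x)))
    return detections

rng = np.random.default_rng(2)
video = rng.normal(100, 2, size=(20, 32, 32))
video[12, 16, 16] += 40            # simulated impact flash lasting one frame
print(detect_flashes(video))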
Are cellular phone blocking applications effective for novice teen drivers?
Creaser, Janet I; Edwards, Christopher J; Morris, Nichole L; Donath, Max
2015-09-01
Distracted driving is a significant concern for novice teen drivers. Although cellular phone bans are applied in many jurisdictions to restrict cellular phone use, teen drivers often report making calls and texts while driving. The Minnesota Teen Driver Study incorporated cellular phone blocking functions via a software application for 182 novice teen drivers in two treatment conditions. The first condition included 92 teens who ran a driver support application on a smartphone that also blocked phone usage. The second condition included 90 teens who ran the same application with phone blocking but which also reported back to parents about monitored risky behaviors (e.g., speeding). A third control group consisting of 92 novice teen drivers had the application and phone-based software installed on the phones to record cellular phone (but not block it) use while driving. The two treatment groups made significantly fewer calls and texts per mile driven compared to the control group. The control group data also demonstrated a higher propensity to text while driving rather than making calls. Software that blocks cellular phone use (except 911) while driving can be effective at mitigating calling and texting for novice teen drivers. However, subjective data indicates that some teens were motivated to find ways around the software, as well as to use another teen's phone while driving when they were unable to use theirs. Cellular phone bans for calling and texting are the first step to changing behaviors associated with texting and driving, particularly among novice teen drivers. Blocking software has the additional potential to reduce impulsive calling and texting while driving among novice teen drivers who might logically know the risks, but for whom it is difficult to ignore calling or texting while driving. Copyright © 2015 Elsevier Ltd and National Safety Council. All rights reserved.
The SCEC Community Modeling Environment(SCEC/CME): A Collaboratory for Seismic Hazard Analysis
NASA Astrophysics Data System (ADS)
Maechling, P. J.; Jordan, T. H.; Minster, J. B.; Moore, R.; Kesselman, C.
2005-12-01
The SCEC Community Modeling Environment (SCEC/CME) Project is an NSF-supported Geosciences/IT partnership that is actively developing an advanced information infrastructure for system-level earthquake science in Southern California. This partnership includes SCEC, USC's Information Sciences Institute (ISI), the San Diego Supercomputer Center (SDSC), the Incorporated Institutions for Research in Seismology (IRIS), and the U.S. Geological Survey. The goal of the SCEC/CME is to develop seismological applications and information technology (IT) infrastructure to support the development of Seismic Hazard Analysis (SHA) programs and other geophysical simulations. The SHA application programs developed on the Project include a Probabilistic Seismic Hazard Analysis system called OpenSHA. OpenSHA computational elements that are currently available include a collection of attenuation relationships, and several Earthquake Rupture Forecasts (ERFs). Geophysicists in the collaboration have also developed Anelastic Wave Models (AWMs) using both finite-difference and finite-element approaches. Earthquake simulations using these codes have been run for a variety of earthquake sources. Rupture Dynamic Model (RDM) codes have also been developed that simulate friction-based fault slip. The SCEC/CME collaboration has also developed IT software and hardware infrastructure to support the development, execution, and analysis of these SHA programs. To support computationally expensive simulations, we have constructed a grid-based scientific workflow system. Using the SCEC grid, project collaborators can submit computations from the SCEC/CME servers to High Performance Computers at USC and TeraGrid High Performance Computing Centers. Data generated and archived by the SCEC/CME is stored in a digital library system, the Storage Resource Broker (SRB). This system provides a robust and secure system for maintaining the association between the data sets and their metadata. To provide an easy-to-use system for constructing SHA computations, a browser-based workflow assembly web portal has been developed. Users can compose complex SHA calculations, specifying SCEC/CME data sets as inputs to calculations, and calling SCEC/CME computational programs to process the data and the output. Knowledge-based software tools have been implemented that utilize ontological descriptions of SHA software and data and can validate workflows created with this pathway assembly tool. Data visualization software developed by the collaboration supports analysis and validation of data sets. Several programs have been developed to visualize SCEC/CME data including GMT-based map making software for PSHA codes, 4D wavefield propagation visualization software based on OpenGL, and 3D Geowall-based visualization of earthquakes, faults, and seismic wave propagation. The SCEC/CME Project also helps to sponsor the SCEC UseIT Intern program. The UseIT Intern Program provides research opportunities in both Geosciences and Information Technology to undergraduate students in a variety of fields. The UseIT group has developed a 3D data visualization tool, called SCEC-VDO, as a part of this undergraduate research program.
Optical analysis of electro-optical systems by MTF calculus
NASA Astrophysics Data System (ADS)
Barbarini, Elisa Signoreto; Dos Santos, Daniel, Jr.; Stefani, Mário Antonio; Yasuoka, Fátima Maria Mitsue; Castro Neto, Jarbas C.; Rodrigues, Evandro Luís Linhari
2011-08-01
One of the widely used methods for performance analysis of an optical system is the determination of the Modulation Transfer Function (MTF). The MTF represents a quantitative and direct measure of image quality and, besides being an objective test, it can be used on concatenated optical systems. This paper presents the application of software called SMTF (software modulation transfer function), built on C++ and OpenCV, for MTF calculation on electro-optical systems. Through this technique, it is possible to develop specific methods to measure the real-time performance of a digital fundus camera, an infrared sensor and an ophthalmological surgery microscope. Each optical instrument mentioned has a particular device to measure the MTF response, which is being developed. The MTF information assists in the analysis of the optical system alignment and also defines its resolution limit from the MTF graph. The result obtained from the implemented software is compared with the theoretical MTF curve of the analyzed systems.
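A common way to obtain the MTF is as the normalised magnitude of the Fourier transform of a measured line spread function; the sketch below applies that relation to a synthetic Gaussian LSF and is not the SMTF implementation.

# MTF as the normalised magnitude of the Fourier transform of a line spread
# function (LSF); synthetic Gaussian LSF, not the SMTF implementation.
import numpy as np

def mtf_from_lsf(lsf, pixel_pitch_mm):
    lsf = np.asarray(lsf, dtype=float)
    lsf = lsf / lsf.sum()                                # normalise area to 1
    mtf = np.abs(np.fft.rfft(lsf))
    mtf /= mtf[0]                                        # MTF(0) = 1
    freqs = np.fft.rfftfreq(lsf.size, d=pixel_pitch_mm)  # cycles per mm
    return freqs, mtf

x = np.arange(-64, 64)
lsf = np.exp(-0.5 * (x / 3.0) ** 2)                      # synthetic blur profile
freqs, mtf = mtf_from_lsf(lsf, pixel_pitch_mm=0.01)
cutoff = freqs[np.argmax(mtf < 0.1)]                     # ~resolution limit at MTF = 0.1
print(f"spatial frequency where MTF drops below 0.1: {cutoff:.1f} cycles/mm")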
Bridging CALL & HCI: Input from Participatory Design
ERIC Educational Resources Information Center
Cardenas-Claros, Monica S.; Gruba, Paul A.
2010-01-01
Participatory design (PD), or the collaboration between software engineers and end users throughout the design process, may help improve CALL design practices. In this case study, four ESL learners, a software designer, and a language teacher created and evaluated a series of paper prototypes concerning help options in computer-based second…
ERIC Educational Resources Information Center
Eastment, David
Despite the evolution of software for computer-assisted language learning (CALL), teacher resistance remains high. Early software for language instruction was almost exclusively designed for drill and practice. That approach was later replaced by a model in which the computer provided a stimulus for students, most often as a partner in games.…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miron, M.S.; Christopher, C.; Hirshfield, S.
1978-05-01
Psycholinguistics provides crisis managers in nuclear threat incidents with a quantitative methodology which can aid in the determination of threat credibility, authorship identification and perpetrator apprehension. The objective of this contract is to improve and enhance present psycholinguistic software systems by means of newly-developed, computer-automated techniques which significantly extend the technology of automated content and stylistic analysis of nuclear threat. In accordance with this overall objective, the first two contract Tasks have been completed and are reported on in this document. The first Task specifies the development of software support for the purpose of syntax regularization of vocabulary to root form. The second calls for the exploration and development of alternative approaches to correlative analysis of vocabulary usage.
Constructing graph models for software system development and analysis
NASA Astrophysics Data System (ADS)
Pogrebnoy, Andrey V.
2017-01-01
We propose an approach to creating instrumentation for capturing the rationale of functional and structural decisions during software system (SS) development. The SS is developed simultaneously on two models: a functional model (FM) and a structural model (SM). The FM is the source code of the SS. An adequate representation of the FM in the form of a graph model (GM) is generated automatically and is called the SM. The problem of creating and visualizing the GM is considered from the point of view of applying it as a uniform platform for adequately representing the SS source code. We propose three levels of GM detail: GM1 for visual analysis of the source code and for SS version control, GM2 for resource optimization and analysis of connections between SS components, and GM3 for analysis of the SS functioning in dynamics. The paper includes examples of constructing all levels of the GM.
Estimation of Geodetic and Geodynamical Parameters with VieVS
NASA Technical Reports Server (NTRS)
Spicakova, Hana; Bohm, Johannes; Bohm, Sigrid; Nilsson, Tobias; Pany, Andrea; Plank, Lucia; Teke, Kamil; Schuh, Harald
2010-01-01
Since 2008 the VLBI group at the Institute of Geodesy and Geophysics at TU Vienna has focused on the development of a new VLBI data analysis software called VieVS (Vienna VLBI Software). One part of the program, currently under development, is a unit for parameter estimation in so-called global solutions, where the connection of the single sessions is done by stacking at the normal equation level. We can determine time independent geodynamical parameters such as Love and Shida numbers of the solid Earth tides. Apart from the estimation of the constant nominal values of Love and Shida numbers for the second degree of the tidal potential, it is possible to determine frequency dependent values in the diurnal band together with the resonance frequency of Free Core Nutation. In this paper we show first results obtained from the 24-hour IVS R1 and R4 sessions.
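A minimal sketch of the stacking idea described above, assuming each session contributes a design matrix, weight matrix and observation vector referring to the same global parameter vector; the actual VieVS implementation differs in its parameterization, datum handling and reduction of session-wise parameters.
```python
import numpy as np

def stack_normal_equations(sessions):
    """Combine per-session normal equations into one global solution.

    sessions : list of (A, P, l) tuples, one per session, where A is the design
               matrix, P the weight matrix and l the observation vector, all
               referring to the same global parameter vector.
    """
    n_par = sessions[0][0].shape[1]
    N = np.zeros((n_par, n_par))
    b = np.zeros(n_par)
    for A, P, l in sessions:
        N += A.T @ P @ A          # accumulate the normal matrix
        b += A.T @ P @ l          # accumulate the right-hand side
    return np.linalg.solve(N, b)  # globally estimated parameters
```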
The development of a program analysis environment for Ada
NASA Technical Reports Server (NTRS)
Brown, David B.; Carlisle, Homer W.; Chang, Kai-Hsiung; Cross, James H.; Deason, William H.; Haga, Kevin D.; Huggins, John R.; Keleher, William R. A.; Starke, Benjamin B.; Weyrich, Orville R.
1989-01-01
A unit-level Ada software module testing system, called Query Utility Environment for Software Testing of Ada (QUEST/Ada), is described. The project calls for the design and development of a prototype system. QUEST/Ada design began with a definition of the overall system structure and a description of component dependencies. The project team was divided into three groups to resolve the preliminary designs of the parser/scanner, the test data generator, and the test coverage analyzer. The Phase 1 report is a working document from which the system documentation will evolve. It provides history, a guide to report sections, a literature review, the definition of the system structure and high level interfaces, descriptions of the prototype scope, the three major components, and the plan for the remainder of the project. The appendices include specifications, statistics, two papers derived from the current research, a preliminary users' manual, and the proposal and work plan for Phase 2.
NASA Technical Reports Server (NTRS)
Goltz, G.; Kaiser, L. M.; Weiner, H.
1977-01-01
A computer program has been developed for designing and analyzing the performance of solar array/battery power systems for the U.S. Coast Guard Navigational Aids. This program is called the Design Synthesis/Performance Analysis (DSPA) Computer Program. The basic function of the Design Synthesis portion of the DSPA program is to evaluate functional and economic criteria to provide specifications for viable solar array/battery power systems. The basic function of the Performance Analysis portion of the DSPA program is to simulate the operation of solar array/battery power systems under specific loads and environmental conditions. This document establishes the software requirements for the DSPA computer program, discusses the processing that occurs within the program, and defines the necessary interfaces for operation.
Browning, Brian L.; Yu, Zhaoxia
2009-01-01
We present a novel method for simultaneous genotype calling and haplotype-phase inference. Our method employs the computationally efficient BEAGLE haplotype-frequency model, which can be applied to large-scale studies with millions of markers and thousands of samples. We compare genotype calls made with our method to genotype calls made with the BIRDSEED, CHIAMO, GenCall, and ILLUMINUS genotype-calling methods, using genotype data from the Illumina 550K and Affymetrix 500K arrays. We show that our method has higher genotype-call accuracy and yields fewer uncalled genotypes than competing methods. We perform single-marker analysis of data from the Wellcome Trust Case Control Consortium bipolar disorder and type 2 diabetes studies. For bipolar disorder, the genotype calls in the original study yield 25 markers with apparent false-positive association with bipolar disorder at a p < 10^-7 significance level, whereas genotype calls made with our method yield no associated markers at this significance threshold. Conversely, for markers with replicated association with type 2 diabetes, there is good concordance between genotype calls used in the original study and calls made by our method. Results from single-marker and haplotypic analysis of our method's genotype calls for the bipolar disorder study indicate that our method is highly effective at eliminating genotyping artifacts that cause false-positive associations in genome-wide association studies. Our new genotype-calling methods are implemented in the BEAGLE and BEAGLECALL software packages. PMID:19931040
Orbiter Flying Qualities (OFQ) Workstation user's guide
NASA Technical Reports Server (NTRS)
Myers, Thomas T.; Parseghian, Zareh; Hogue, Jeffrey R.
1988-01-01
This project was devoted to the development of a software package, called the Orbiter Flying Qualities (OFQ) Workstation, for working with the OFQ Archives which are specially selected sets of space shuttle entry flight data relevant to flight control and flying qualities. The basic approach to creation of the workstation software was to federate and extend commercial software products to create a low cost package that operates on personal computers. Provision was made to link the workstation to large computers, but the OFQ Archive files were also converted to personal computer diskettes and can be stored on workstation hard disk drives. The primary element of the workstation developed in the project is the Interactive Data Handler (IDH) which allows the user to select data subsets from the archives and pass them to specialized analysis programs. The IDH was developed as an application in a relational database management system product. The specialized analysis programs linked to the workstation include a spreadsheet program, FREDA for spectral analysis, MFP for frequency domain system identification, and NIPIP for pilot-vehicle system parameter identification. The workstation also includes capability for ensemble analysis over groups of missions.
ERIC Educational Resources Information Center
Cordier, Deborah
2009-01-01
A renewed focus on foreign language (FL) learning and speech for communication has resulted in computer-assisted language learning (CALL) software developed with Automatic Speech Recognition (ASR). ASR features for FL pronunciation (Lafford, 2004) are functional components of CALL designs used for FL teaching and learning. The ASR features…
Automating Structural Analysis of Spacecraft Vehicles
NASA Technical Reports Server (NTRS)
Hrinda, Glenn A.
2004-01-01
A major effort within NASA's vehicle analysis discipline has been to automate structural analysis and sizing optimization during conceptual design studies of advanced spacecraft. Traditional spacecraft structural sizing has involved detailed finite element analysis (FEA) requiring large degree-of-freedom (DOF) finite element models (FEM). Creation and analysis of these models can be time consuming and limit model size during conceptual designs. The goal is to find an optimal design that meets the mission requirements but produces the lightest structure. A structural sizing tool called HyperSizer has been successfully used in the conceptual design phase of a reusable launch vehicle and planetary exploration spacecraft. The program couples with FEA to enable system level performance assessments and weight predictions including design optimization of material selections and sizing of spacecraft members. The software's analysis capabilities are based on established aerospace structural methods for strength, stability and stiffness that produce adequately sized members and reliable structural weight estimates. The software also helps to identify potential structural deficiencies early in the conceptual design so changes can be made without wasted time. HyperSizer's automated analysis and sizing optimization increases productivity and brings standardization to a systems study. These benefits will be illustrated in examining two different types of conceptual spacecraft designed using the software. A hypersonic air breathing, single stage to orbit (SSTO), reusable launch vehicle (RLV) will be highlighted as well as an aeroshell for a planetary exploration vehicle used for aerocapture at Mars. By showing the two different types of vehicles, the software's flexibility will be demonstrated with an emphasis on reducing aeroshell structural weight. Member sizes, concepts and material selections will be discussed as well as analysis methods used in optimizing the structure. Analysis based on the HyperSizer structural sizing software will be discussed. Design trades required to optimize structural weight will be presented.
Generic Kalman Filter Software
NASA Technical Reports Server (NTRS)
Lisano, Michael E., II; Crues, Edwin Z.
2005-01-01
The Generic Kalman Filter (GKF) software provides a standard basis for the development of application-specific Kalman-filter programs. Historically, Kalman filters have been implemented by customized programs that must be written, coded, and debugged anew for each unique application, then tested and tuned with simulated or actual measurement data. Total development times for typical Kalman-filter application programs have ranged from weeks to months. The GKF software can simplify the development process and reduce the development time by eliminating the need to re-create the fundamental implementation of the Kalman filter for each new application. The GKF software is written in the ANSI C programming language. It contains a generic Kalman-filter-development directory that, in turn, contains code for a generic Kalman-filter function; more specifically, it contains a generically designed and generically coded implementation of linear, linearized, and extended Kalman filtering algorithms, including algorithms for state- and covariance-update and -propagation functions. The mathematical theory that underlies the algorithms is well known and has been reported extensively in the open technical literature. Also contained in the directory are a header file that defines generic Kalman-filter data structures and prototype functions, and template versions of application-specific subfunction and calling navigation/estimation routine code and headers. Once the user has provided a calling routine and the required application-specific subfunctions, the application-specific Kalman-filter software can be compiled and executed immediately. During execution, the generic Kalman-filter function is called from a higher-level navigation or estimation routine that preprocesses measurement data and post-processes output data. The generic Kalman-filter function uses the aforementioned data structures and five implementation-specific subfunctions, which have been developed by the user on the basis of the aforementioned templates. The GKF software can be used to develop many different types of unfactorized Kalman filters. A developer can choose to implement either a linearized or an extended Kalman filter algorithm, without having to modify the GKF software. Control dynamics can be taken into account or neglected in the filter-dynamics model. Filter programs developed by use of the GKF software can be made to propagate equations of motion for linear or nonlinear dynamical systems that are deterministic or stochastic. In addition, filter programs can be made to operate in user-selectable "covariance analysis" and "propagation-only" modes that are useful in design and development stages.
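The GKF library itself is ANSI C; the Python sketch below only illustrates the generic predict/update cycle that such a filter function performs, with the application-specific matrices supplied by the caller as the abstract describes. The function and argument names are illustrative and are not the GKF API.
```python
import numpy as np

def kalman_step(x, P, z, F, Q, H, R):
    """One predict/update cycle of a linear Kalman filter.

    x, P : prior state estimate and covariance
    z    : measurement vector
    F, Q : state-transition and process-noise matrices (application-specific)
    H, R : measurement matrix and measurement-noise covariance (application-specific)
    """
    # Time update (propagation)
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Measurement update
    y = z - H @ x_pred                       # innovation
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```
A calling routine would loop over preprocessed measurements, supply the model matrices (possibly re-linearized at each step for an extended filter), and post-process the returned state and covariance.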
Improving communication among nurses and patients.
Unluturk, Mehmet S; Ozcanhan, Mehmet H; Dalkilic, Gokhan
2015-07-01
Patients use nurse call systems to signal nurses for medical help. Traditional push-button, flashing-lamp call systems are not integrated with other hospital automation systems; as a result, nurse response time becomes a matter of personal discretion. The improvement obtained by integrating a pager system into the nurse call system does not increase care efficiency, because unnecessary visits are still not eliminated. To obtain an immediate response and a purposeful visit by a nurse, regardless of the nurse's location in the hospital, traditional systems have to be improved by intelligent telephone system integration. The results of the developed Nurse Call System Software (NCSS), the Wireless Phone System Software (WPSS), the Location System Software (LSS) and the communication protocol are provided, together with detailed XML message structures. The benefits of the proposed system are also discussed and the direction of future work is presented. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
[Electromyography Analysis of Rapid Eye Movement Sleep Behavior Disorder].
Nakano, Natsuko; Kinoshita, Fumiya; Takada, Hiroki; Nakayama, Meiho
2018-01-01
Polysomnography (PSG), which records physiological phenomena including brain waves, breathing status, and muscle tonus, is the gold standard for the diagnosis of sleep disorders. However, measurement and analysis are complex for several specific sleep disorders, such as rapid eye movement (REM) sleep behavior disorder (RBD). Usually, brain waves during REM sleep indicate an awakening pattern while the skeletal and antigravity muscles are relaxed. In patients with RBD, however, these muscles are activated during REM sleep. These activated muscle movements during REM, so-called REM without atonia (RWA), recorded by PSG, may be related to a neurodegenerative disease such as Parkinson's disease. Thus, careful analysis of RWA is significant not only physiologically but also clinically. Manual visual analysis of RWA is time-consuming, so quantitative studies on RWA are rarely reported. A software program, developed from Microsoft Office Excel®, was used to semiautomatically analyze the RWA ratio extracted from PSG and to compare it with manual visual analysis. In addition, a quantitative muscle tonus study was carried out to evaluate the effect of medication on RBD patients. Using this new software program, we were able to analyze RWA in the same cases in approximately 15 min, compared with 60 min for manual visual analysis. The program can not only quantify RWA easily but also classify RWA waves as either phasic or tonic bursts. We consider that this software program will support physicians and scientists in their future research on RBD, and we are planning to offer it for free to physicians and scientists.
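As a rough illustration of the kind of quantity being computed (not the authors' Excel implementation), the sketch below assumes per-epoch chin-EMG amplitudes and a REM hypnogram, and returns the fraction of REM epochs whose EMG amplitude exceeds a threshold; the threshold and the epoch definition are assumptions.
```python
import numpy as np

def rwa_ratio(emg_rms_per_epoch, rem_mask, threshold):
    """Fraction of REM epochs whose chin-EMG RMS amplitude exceeds a threshold.

    emg_rms_per_epoch : RMS EMG amplitude for each scoring epoch
    rem_mask          : boolean array, True where the epoch was scored as REM sleep
    threshold         : amplitude above which the epoch counts toward REM without atonia
    """
    emg = np.asarray(emg_rms_per_epoch, dtype=float)
    rem = np.asarray(rem_mask, dtype=bool)
    active = emg[rem] > threshold
    return active.sum() / max(rem.sum(), 1)
```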
NASA Astrophysics Data System (ADS)
Wang, P.; Becker, A. A.; Jones, I. A.; Glover, A. T.; Benford, S. D.; Vloeberghs, M.
2009-08-01
A virtual-reality real-time simulation of surgical operations that incorporates a hard tumour is presented. The software is based on the Boundary Element (BE) technique. A review of the BE formulation for real-time analysis of two-domain deformable objects, using the pre-solution technique, is presented. The two-domain BE software is incorporated into a surgical simulation system called VIRS to simulate the initiation of a cut on the surface of the soft tissue and the extension of the cut deeper until the tumour is reached.
Programming Makes Software; Support Makes Users
NASA Astrophysics Data System (ADS)
Batcheller, A. L.
2010-12-01
Skilled software engineers may build fantastic software for climate modeling, yet fail to achieve their project's objectives. Software support and related activities are just as critical as writing software. This study followed three different software projects in the climate sciences, using interviews, observation, and document analysis to examine the value added by support work. Supporting the project and interacting with users was a key task for software developers, who often spent 50% of their time on it. Such support work most often involved replying to questions on an email list, but also included talking to users on teleconference calls and in person. Software support increased adoption by building the software's reputation and showing individuals how the software could meet their needs. In the process of providing support, developers often learned of new requirements as users reported features they desired and bugs they found. As software matures and gains widespread use, support work often increases. In fact, such increases can be one signal that the software has achieved broad acceptance. Maturing projects also find demand for instructional classes, online tutorials and detailed examples of how to use the software. The importance of support highlights the fact that building software systems involves both social and technical aspects. Yes, we need to build the software, but we also need to "build" the users and practices that can take advantage of it.
TBell: A mathematical tool for analyzing decision tables
NASA Technical Reports Server (NTRS)
Hoover, D. N.; Chen, Zewei
1994-01-01
This paper describes the development of mathematical theory and software to analyze specifications that are developed using decision tables. A decision table is a tabular format for specifying a complex set of rules that chooses one of a number of alternative actions. The report also describes a prototype tool, called TBell, that automates certain types of analysis.
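The specific analyses TBell performs are not detailed here; one representative type of decision-table analysis is checking that the rules cover every combination of conditions and never select conflicting actions. The sketch below, restricted to boolean condition stubs, is illustrative only.
```python
from itertools import product

def analyze_decision_table(rules, n_conditions):
    """Check a boolean decision table for completeness and conflicts.

    rules : list of (pattern, action) where pattern is a tuple of True/False/None
            (None means "don't care") over n_conditions condition stubs.
    Returns (uncovered_combinations, conflicting_combinations).
    """
    def matches(pattern, combo):
        return all(p is None or p == c for p, c in zip(pattern, combo))

    uncovered, conflicts = [], []
    for combo in product([False, True], repeat=n_conditions):
        actions = {action for pattern, action in rules if matches(pattern, combo)}
        if not actions:
            uncovered.append(combo)
        elif len(actions) > 1:
            conflicts.append((combo, actions))
    return uncovered, conflicts

# Example: two rules over two conditions, leaving (False, False) uncovered
rules = [((True, None), "A"), ((False, True), "B")]
print(analyze_decision_table(rules, 2))
```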
NASA Technical Reports Server (NTRS)
Karandikar, Harsh M.
1997-01-01
An approach for objective and quantitative technical and cost risk analysis during product development, which is applicable from the earliest stages, is discussed. The approach is supported by a software tool called the Analytical System for Uncertainty and Risk Estimation (ASURE). Details of ASURE, the underlying concepts and its application history, are provided.
NASA Technical Reports Server (NTRS)
Djorgovski, S. George
1994-01-01
We developed a package to process and analyze the data from the digital version of the Second Palomar Sky Survey. This system, called SKICAT, incorporates the latest in machine learning and expert systems software technology in order to classify the detected objects objectively and uniformly, and to facilitate handling of the enormous data sets from digital sky surveys and other sources. The system provides a powerful, integrated environment for the manipulation and scientific investigation of catalogs from virtually any source. It serves three principal functions: image catalog construction, catalog management, and catalog analysis. Through use of the GID3* Decision Tree artificial induction software, SKICAT automates the process of classifying objects within CCD and digitized plate images. To exploit these catalogs, the system also provides tools to merge them into a large, complete database which may be easily queried and modified when new data or better methods of calibrating or classifying become available. The most innovative feature of SKICAT is the facility it provides to experiment with and apply the latest in machine learning technology to the tasks of catalog construction and analysis. SKICAT provides a unique environment for implementing these tools for any number of future scientific purposes. Initial scientific verification and performance tests have been made using galaxy counts and measurements of galaxy clustering from small subsets of the survey data, and a search for very high redshift quasars. All of the tests were successful and produced new and interesting scientific results. Attachments to this report give detailed accounts of the technical aspects of the system and of a package for multivariate statistical analysis of small and moderate-size data sets, called STATPROG. The package was tested extensively on a number of real scientific applications and has produced real, published results.
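The GID3* induction software is not generally available; as a stand-in, the sketch below shows the same pattern of decision-tree induction on image-derived features using scikit-learn. The feature set and labels are synthetic placeholders, not SKICAT catalog attributes.
```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Hypothetical catalog: rows of image-derived features (e.g. magnitude, ellipticity,
# surface brightness); labels 0 = star, 1 = galaxy from a manually classified subset.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 3))
y_train = (X_train[:, 1] + 0.5 * X_train[:, 2] > 0).astype(int)

clf = DecisionTreeClassifier(max_depth=4)   # induced classification tree
clf.fit(X_train, y_train)

X_new = rng.normal(size=(5, 3))             # unlabeled detections from a new plate
print(clf.predict(X_new))                   # automated star/galaxy classification
```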
Reliability Analysis for AFTI-F16 SRFCS Using ASSIST and SURE
NASA Technical Reports Server (NTRS)
Wu, N. Eva
2001-01-01
This paper reports the results of a study on reliability analysis of an AFTI-F16 Self-Repairing Flight Control System (SRFCS) using the software tools SURE (Semi-Markov Unreliability Range Evaluator) and ASSIST (Abstract Semi-Markov Specification Interface to the SURE Tool). The purpose of the study is to investigate the potential utility of the software tools in the ongoing effort of the NASA Aviation Safety Program, where the class of systems must be extended beyond the originally intended class of electronic digital processors. The study concludes that SURE and ASSIST are applicable to reliability analysis of flight control systems. They are especially efficient for sensitivity analysis that quantifies the dependence of system reliability on model parameters. The study also confirms an earlier finding on the dominant role of a parameter called failure coverage. The paper also remarks on issues related to the improvement of coverage and the optimization of redundancy level.
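SURE evaluates semi-Markov models; the simpler closed-form example below is only meant to illustrate why the failure-coverage parameter tends to dominate. For a two-unit active-redundant system with per-unit failure rate lambda and coverage c (probability that a first failure is detected and handled), a textbook reliability expression is R(t) = exp(-2*lambda*t) + 2*c*exp(-lambda*t)*(1 - exp(-lambda*t)); the numbers and the model are illustrative, not the AFTI-F16 SRFCS model.
```python
import numpy as np

def duplex_reliability(lam, c, t):
    """Reliability of a two-unit active-redundant system with failure coverage c.

    A first failure is covered (detected and reconfigured around) with probability c,
    after which the surviving unit carries the mission; an uncovered first failure is
    assumed to fail the system immediately.
    """
    return np.exp(-2 * lam * t) + 2 * c * np.exp(-lam * t) * (1 - np.exp(-lam * t))

lam, t = 1e-4, 10.0   # per-unit failure rate (1/h) and a 10-hour mission
for c in (0.90, 0.99, 0.999):
    print(c, 1 - duplex_reliability(lam, c, t))   # probability of mission failure
```
The unreliability scales almost linearly with (1 - c), which is the sensitivity behavior the tools are used to quantify.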
Quantitative method of medication system interface evaluation.
Pingenot, Alleene Anne; Shanteau, James; Pingenot, James D F
2007-01-01
The objective of this study was to develop a quantitative method of evaluating the user interface for medication system software. A detailed task analysis provided a description of user goals and essential activity. A structural fault analysis was used to develop a detailed description of the system interface. Nurses experienced with use of the system under evaluation provided estimates of failure rates for each point in this simplified fault tree. Means of estimated failure rates provided quantitative data for fault analysis. Authors note that, although failures of steps in the program were frequent, participants reported numerous methods of working around these failures so that overall system failure was rare. However, frequent process failure can affect the time required for processing medications, making a system inefficient. This method of interface analysis, called Software Efficiency Evaluation and Fault Identification Method, provides quantitative information with which prototypes can be compared and problems within an interface identified.
A Practical Guide to the Technology and Adoption of Software Process Automation
1994-03-01
IDE’s integration of Software through Pictures, CodeCenter, and FrameMaker). However, successful use of integrated tools, as reflected in actual...tool for a specific platform. Thus, when a Work Context calls for a word processor, the weaver.tis file can be set up to call FrameMaker for the Sun4
Overview of Current Activities in Combustion Instability
2015-10-02
and avoid liquid rocket engine combustion stability problems. Approach: 1) Develop a SOA combustion stability software package called Stable...phase II will invest in Multifidelity Tools and Methodologies – CSTD will develop a SOA combustion stability software package called Stable Combustion
Decision Engines for Software Analysis Using Satisfiability Modulo Theories Solvers
NASA Technical Reports Server (NTRS)
Bjorner, Nikolaj
2010-01-01
The area of software analysis, testing and verification is now undergoing a revolution thanks to the use of automated and scalable support for logical methods. A well-recognized premise is that at the core of software analysis engines is invariably a component using logical formulas for describing states and transformations between system states. The process of using this information for discovering and checking program properties (including such important properties as safety and security) amounts to automatic theorem proving. In particular, theorem provers that directly support common software constructs offer a compelling basis. Such provers are commonly called satisfiability modulo theories (SMT) solvers. Z3 is a state-of-the-art SMT solver developed at Microsoft Research. It can be used to check the satisfiability of logical formulas over one or more theories such as arithmetic, bit-vectors, lists, records and arrays. The talk describes some of the technology behind modern SMT solvers, including the solver Z3. Z3 is currently mainly targeted at solving problems that arise in software analysis and verification. It has been applied in various contexts, such as systems for dynamic symbolic simulation (Pex, SAGE, Vigilante), program verification and extended static checking (Spec#/Boogie, VCC, HAVOC), software model checking (Yogi, SLAM), model-based design (FORMULA), security protocol code (F7), and program run-time analysis and invariant generation (VS3). We will describe how it integrates support for a variety of theories that arise naturally in the context of the applications. There are several new promising avenues, and the talk will touch on some of these and the challenges related to SMT solvers.
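A small example of the kind of query such a solver answers, using Z3's public Python API (z3py): a program property is encoded as constraints, and a satisfying assignment acts as a counterexample. The buffer-bounds scenario and the constraints are invented for illustration.
```python
from z3 import Ints, Solver, sat

# Can the index expression below ever leave the bounds of a 10-element buffer,
# given the invariant the program asserts?  Encode the question as an SMT query.
i, n = Ints('i n')
s = Solver()
s.add(0 <= i, i < n, n <= 10)   # invariant asserted by the (hypothetical) program
s.add(2 * i + 1 >= 10)          # index expression used to access the buffer
if s.check() == sat:
    print("possible out-of-bounds access:", s.model())
else:
    print("access proven safe under the stated constraints")
```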
Q-nexus: a comprehensive and efficient analysis pipeline designed for ChIP-nexus.
Hansen, Peter; Hecht, Jochen; Ibn-Salem, Jonas; Menkuec, Benjamin S; Roskosch, Sebastian; Truss, Matthias; Robinson, Peter N
2016-11-04
ChIP-nexus, an extension of the ChIP-exo protocol, can be used to map the borders of protein-bound DNA sequences at nucleotide resolution, requires less input DNA and enables selective PCR duplicate removal using random barcodes. However, the use of random barcodes requires additional preprocessing of the mapping data, which complicates the computational analysis. To date, only a very limited number of software packages are available for the analysis of ChIP-exo data, which have not yet been systematically tested and compared on ChIP-nexus data. Here, we present a comprehensive software package for ChIP-nexus data that exploits the random barcodes for selective removal of PCR duplicates and for quality control. Furthermore, we developed bespoke methods to estimate the width of the protected region resulting from protein-DNA binding and to infer binding positions from ChIP-nexus data. Finally, we applied our peak calling method as well as the two other methods MACE and MACS2 to the available ChIP-nexus data. The Q-nexus software is efficient and easy to use. Novel statistics about duplication rates in consideration of random barcodes are calculated. Our method for the estimation of the width of the protected region yields unbiased signatures that are highly reproducible for biological replicates and at the same time very specific for the respective factors analyzed. As judged by the irreproducible discovery rate (IDR), our peak calling algorithm shows a substantially better reproducibility. An implementation of Q-nexus is available at http://charite.github.io/Q/ .
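The essence of barcode-aware duplicate removal described above can be sketched as follows; this is not the Q-nexus implementation, and the read representation is a simplification (real pipelines operate on aligned BAM records and may tolerate barcode mismatches).
```python
def deduplicate_reads(reads):
    """Selective PCR-duplicate removal using random barcodes.

    reads : iterable of dicts with keys 'chrom', 'pos', 'strand', 'barcode', 'name'.
    Reads mapping to the same position AND carrying the same random barcode are
    treated as PCR duplicates; identical positions with different barcodes are
    kept as independent ligation events.
    """
    seen, kept = set(), []
    for r in reads:
        key = (r['chrom'], r['pos'], r['strand'], r['barcode'])
        if key not in seen:
            seen.add(key)
            kept.append(r)
    return kept

reads = [
    {'chrom': 'chr1', 'pos': 100, 'strand': '+', 'barcode': 'ACGTA', 'name': 'r1'},
    {'chrom': 'chr1', 'pos': 100, 'strand': '+', 'barcode': 'ACGTA', 'name': 'r2'},  # duplicate
    {'chrom': 'chr1', 'pos': 100, 'strand': '+', 'barcode': 'TTGCA', 'name': 'r3'},  # kept
]
print([r['name'] for r in deduplicate_reads(reads)])   # ['r1', 'r3']
```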
Model-Driven Useware Engineering
NASA Astrophysics Data System (ADS)
Meixner, Gerrit; Seissler, Marc; Breiner, Kai
User-oriented hardware and software development relies on a systematic development process based on a comprehensive analysis focusing on the users' requirements and preferences. Such a development process calls for the integration of numerous disciplines, from psychology and ergonomics to computer sciences and mechanical engineering. Hence, a correspondingly interdisciplinary team must be equipped with suitable software tools to allow it to handle the complexity of a multimodal and multi-device user interface development approach. An abstract, model-based development approach seems to be adequate for handling this complexity. This approach comprises different levels of abstraction requiring adequate tool support. Thus, in this chapter, we present the current state of our model-based software tool chain. We introduce the use model as the core model of our model-based process, transformation processes, and a model-based architecture, and we present different software tools that provide support for creating and maintaining the models or performing the necessary model transformations.
National Cycle Program (NCP) Common Analysis Tool for Aeropropulsion
NASA Technical Reports Server (NTRS)
Follen, G.; Naiman, C.; Evans, A.
1999-01-01
Through the NASA/Industry Cooperative Effort (NICE) agreement, NASA Lewis and industry partners are developing a new engine simulation, called the National Cycle Program (NCP), which is the initial framework of NPSS. NCP is the first phase toward achieving the goal of NPSS. This new software supports the aerothermodynamic system simulation process for the full life cycle of an engine. NCP was written following the object-oriented paradigm (C++, CORBA), and the software development process used was also based on the object-oriented paradigm. Software reviews, configuration management, test plans, requirements, and design were all a part of the process used in developing NCP. Because of the many contributors to NCP, the stated software process was mandatory for building a common tool intended for use by so many organizations. The U.S. aircraft and airframe companies recognize NCP as the future industry standard for propulsion system modeling.
The component-based architecture of the HELIOS medical software engineering environment.
Degoulet, P; Jean, F C; Engelmann, U; Meinzer, H P; Baud, R; Sandblad, B; Wigertz, O; Le Meur, R; Jagermann, C
1994-12-01
The constitution of highly integrated health information networks and the growth of multimedia technologies raise new challenges for the development of medical applications. We describe in this paper the general architecture of the HELIOS medical software engineering environment, devoted to the development and maintenance of multimedia distributed medical applications. HELIOS is made of a set of software components federated by a communication channel called the HELIOS Unification Bus. The HELIOS kernel includes three main components: the Analysis-Design Environment, the Object Information System and the Interface Manager. HELIOS services consist of a collection of toolkits providing the necessary facilities to medical application developers. They include Image Related services, a Natural Language Processor, a Decision Support System and Connection services. The project gives special attention to both object-oriented approaches and software re-usability, which are considered crucial steps towards the development of more reliable, coherent and integrated applications.
Orthographic Software Modelling: A Novel Approach to View-Based Software Engineering
NASA Astrophysics Data System (ADS)
Atkinson, Colin
The need to support multiple views of complex software architectures, each capturing a different aspect of the system under development, has been recognized for a long time. Even the very first object-oriented analysis/design methods, such as the Booch method and OMT, supported a number of different diagram types (e.g. structural, behavioral, operational), and subsequent methods such as Fusion, Kruchten's 4+1 views and the Rational Unified Process (RUP) have added many more views over time. Today's leading modeling languages, such as UML and SysML, are also oriented towards supporting different views (i.e. diagram types), each able to portray a different facet of a system's architecture. More recently, so-called enterprise architecture frameworks such as the Zachman Framework, TOGAF and RM-ODP have become popular. These add a whole set of new non-functional views to the views typically emphasized in traditional software engineering environments.
Estimating the Earth's geometry, rotation and gravity field using a multi-satellite SLR solution
NASA Astrophysics Data System (ADS)
Stefka, V.; Blossfeld, M.; Mueller, H.; Gerstl, M.; Panafidina, N.
2012-12-01
Satellite Laser Ranging (SLR) is the unique technique to determine station coordinates, Earth Orientation Parameters (EOP) and Stokes coefficients of the Earth's gravity field in one common adjustment. These parameters form the so-called "three pillars" (Plag & Pearlman, 2009) of the Global Geodetic Observing System (GGOS). In its function as an official analysis center of the International Laser Ranging Service (ILRS), DGFI is developing and maintaining software to process SLR observations called "DGFI Orbit and Geodetic parameter estimation Software" (DOGS). The software is used to analyze SLR observations and to compute multi-satellite solutions. To take benefit of different orbit performances (e.g. inclination and altitude), a solution using ten different spherical satellites (ETALON1/2, LAGEOS1/2, STELLA, STARLETTE, AJISAI, LARETS, LARES, BLITS) covering a period of 12 years of observations is computed. The satellites are relatively weighted using a variance component estimation (VCE). The obtained weights are analyzed with respect to the potential of each satellite to monitor changes in the Earth's geometry, rotation and gravity field. The estimated parameters (station coordinates and EOP) are validated against official time series of the IERS. The Stokes coefficients are compared to recent gravity field solutions.
Estimating the Earth's gravity field using a multi-satellite SLR solution
NASA Astrophysics Data System (ADS)
Bloßfeld, Mathis; Stefka, Vojtech; Müller, Horst; Gerstl, Michael
2013-04-01
Satellite Laser Ranging (SLR) is the unique technique to determine station coordinates, Earth Orientation Parameters (EOP) and Stokes coefficients of the Earth's gravity field in one common adjustment. These parameters form the so-called "three pillars" (Plag & Pearlman, 2009) of the Global Geodetic Observing System (GGOS). In its function as an official analysis center of the International Laser Ranging Service (ILRS), DGFI is developing and maintaining software to process SLR observations called "DGFI Orbit and Geodetic parameter estimation Software" (DOGS). The software is used to analyze SLR observations and to compute multi-satellite solutions. To take benefit of different orbit performances (e.g. inclination and altitude), a solution using ten different spherical satellites (ETALON1/2, LAGEOS1/2, STELLA, STARLETTE, AJISAI, LARETS, LARES, BLITS) covering 12 years of observations is computed. The satellites are relatively weighted using a variance component estimation (VCE). The obtained weights are analyzed with respect to the potential of each satellite to monitor changes in the Earth's geometry, rotation and gravity field. The estimated parameters (station coordinates and EOP) are validated against official time series of the IERS. The obtained Stokes coefficients are compared to recent gravity field solutions and discussed in detail.
NASA Technical Reports Server (NTRS)
Greenspan, Sol; Feblowitz, Mark
1992-01-01
ACME is an experimental environment for investigating new approaches to modeling and analysis of system requirements and designs. ACME is built on and extends object-oriented conceptual modeling techniques and knowledge representation and reasoning (KRR) tools. The most immediate intended use for ACME is to help represent, understand, and communicate system designs during the early stages of system planning and requirements engineering. While our research is ostensibly aimed at software systems in general, we are particularly motivated to make an impact in the telecommunications domain, especially in the area referred to as Intelligent Networks (IN's). IN systems contain the software to provide services to users of a telecommunications network (e.g., call processing services, information services, etc.) as well as the software that provides the internal infrastructure for providing the services (e.g., resource management, billing, etc.). The software includes not only systems developed by the network proprietors but also by a growing group of independent service software providers.
Hou, Lin; Sun, Ning; Mane, Shrikant; Sayward, Fred; Rajeevan, Nallakkandi; Cheung, Kei-Hoi; Cho, Kelly; Pyarajan, Saiju; Aslan, Mihaela; Miller, Perry; Harvey, Philip D.; Gaziano, J. Michael; Concato, John; Zhao, Hongyu
2017-01-01
A key step in genomic studies is to assess high throughput measurements across millions of markers for each participant’s DNA, either using microarrays or sequencing techniques. Accurate genotype calling is essential for downstream statistical analysis of genotype-phenotype associations, and next generation sequencing (NGS) has recently become a more common approach in genomic studies. How the accuracy of variant calling in NGS-based studies affects downstream association analysis has not, however, been studied using empirical data in which both microarrays and NGS were available. In this article, we investigate the impact of variant calling errors on the statistical power to identify associations between single nucleotides and disease, and on associations between multiple rare variants and disease. Both differential and nondifferential genotyping errors are considered. Our results show that the power of burden tests for rare variants is strongly influenced by the specificity in variant calling, but is rather robust with regard to sensitivity. By using the variant calling accuracies estimated from a substudy of a Cooperative Studies Program project conducted by the Department of Veterans Affairs, we show that the power of association tests is mostly retained with commonly adopted variant calling pipelines. An R package, GWAS.PC, is provided to accommodate power analysis that takes account of genotyping errors (http://zhaocenter.org/software/). PMID:28019059
Gaburro, Julie; Duchemin, Jean-Bernard; Paradkar, Prasad N; Nahavandi, Saeid; Bhatti, Asim
2016-11-18
Widespread in the tropics, the mosquito Aedes aegypti is an important vector of many viruses, posing a significant threat to human health. Vector monitoring often requires fecundity estimation by counting eggs laid by female mosquitoes. Traditionally, manual data analysis has been used, but this requires a lot of effort and the methods are prone to errors. An easy tool to assess the number of eggs laid would facilitate experimentation and vector control operations. This study introduces a purpose-built software tool called ICount that allows automatic counting of eggs laid by the mosquito vector Aedes aegypti. Egg estimates from ICount are statistically equivalent to manual counts, making the software effective for automatic and semi-automatic data analysis. The technique also allows rapid analysis compared to manual methods. Finally, the software has been used to assess p-cresol oviposition choices under laboratory conditions in order to test the system with different egg densities. ICount is a powerful tool for fast and precise egg count analysis, freeing experimenters from manual data processing. Software access is free and its user-friendly interface allows easy use by non-experts. Its efficiency has been tested in our laboratory with oviposition dual choices of Aedes aegypti females. The next step will be the development of a mobile application, based on the ICount platform, for vector monitoring surveys in the field.
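ICount itself is not described at the algorithmic level here; a plausible minimal approach to automatic egg counting, assuming dark eggs on a light ovistrip background, is thresholding followed by connected-component labelling, with large clumps split by a typical single-egg area. All parameters below are assumptions for illustration only.
```python
import numpy as np
from scipy import ndimage

def count_eggs(image, threshold, min_area, single_egg_area):
    """Count dark mosquito eggs on a light ovistrip image.

    image           : 2-D grayscale array (dark eggs on a light background)
    threshold       : intensity below which a pixel is considered egg
    min_area        : blobs smaller than this are treated as noise
    single_egg_area : typical area of one egg, used to split touching eggs
    """
    mask = image < threshold
    labels, n_blobs = ndimage.label(mask)
    areas = ndimage.sum(mask, labels, index=np.arange(1, n_blobs + 1))
    count = 0
    for area in areas:
        if area < min_area:
            continue                                           # ignore specks
        count += max(1, int(round(area / single_egg_area)))    # clumps count as several eggs
    return count
```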
Minerva: using a software program to improve resident performance during independent call
NASA Astrophysics Data System (ADS)
Itri, Jason N.; Redfern, Regina O.; Cook, Tessa; Scanlon, Mary H.
2010-03-01
We have developed an application called Minerva that allows tracking of resident discrepancy rates and missed cases. Minerva mines the radiology information system (RIS) for preliminary interpretations provided by residents during independent call and copies both the preliminary and final interpretations to a database. Both versions are displayed for direct comparison by Minerva and classified as 'in agreement', 'minor discrepancy' or 'major discrepancy' by the residency program director. Minerva compiles statistics comparing minor, major and total discrepancy rates for individual residents relative to the overall group. Discrepant cases are categorized according to date, modality and body part and reviewed for trends in missed cases. The rates of minor, major and total discrepancies for residents on call at our institution were similar to rates previously published, including a 2.4% major discrepancy rate for second-year radiology residents in the DePICTORS study and a 2.6% major discrepancy rate for residents at a community hospital. Trend analysis of missed cases was used to generate a topic-specific resident missed-case conference on acromioclavicular (AC) joint separation injuries, which resulted in a 75% decrease in the number of missed cases related to AC separation subsequent to the conference. Using a software program to track minor and major discrepancy rates for residents taking independent call, using modified RadPeer scoring guidelines, provides a competency-based metric to determine resident performance. Topic-specific conferences using the cases identified by Minerva can result in a decrease in missed cases.
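The statistics Minerva compiles reduce to simple per-resident rates over reviewed cases; a sketch of that computation is shown below. The field names and classification labels are illustrative and are not Minerva's schema.
```python
from collections import defaultdict

def discrepancy_rates(cases):
    """Per-resident discrepancy rates from reviewed on-call cases.

    cases : iterable of (resident, classification) where classification is one of
            'agreement', 'minor', 'major', as assigned by the program director.
    """
    counts = defaultdict(lambda: {'agreement': 0, 'minor': 0, 'major': 0})
    for resident, classification in cases:
        counts[resident][classification] += 1
    rates = {}
    for resident, c in counts.items():
        total = sum(c.values())
        rates[resident] = {
            'minor': c['minor'] / total,
            'major': c['major'] / total,
            'total': (c['minor'] + c['major']) / total,
            'n_cases': total,
        }
    return rates
```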
Operator function modeling: An approach to cognitive task analysis in supervisory control systems
NASA Technical Reports Server (NTRS)
Mitchell, Christine M.
1987-01-01
In a study of models of operators in complex, automated space systems, an operator function model (OFM) methodology was extended to represent cognitive as well as manual operator activities. Development continued on a software tool called OFMdraw, which facilitates construction of an OFM by permitting construction of a heterarchic network of nodes and arcs. Emphasis was placed on development of OFMspert, an expert system designed both to model human operation and to assist real human operators. The system uses a blackboard method of problem solving to make an on-line representation of operator intentions, called ACTIN (actions interpreter).
Design and Effects of Scenario Educational Software.
ERIC Educational Resources Information Center
Keegan, Mark
1993-01-01
Describes the development of educational computer software called scenario software that was designed to incorporate advances in cognitive, affective, and physiological research. Instructional methods are outlined; the need to change from didactic methods to discovery learning is explained; and scenario software design features are discussed. (24…
ICCE Policy Statement on Network and Multiple Machine Software.
ERIC Educational Resources Information Center
International Council for Computers in Education, Eugene, OR.
Designed to provide educators with guidance for the lawful reproduction of computer software, this document contains suggested guidelines, sample forms, and several short articles concerning software copyright and license agreements. The initial policy statement calls for educators to provide software developers (or their agents) with a…
PSGMiner: A modular software for polysomnographic analysis.
Umut, İlhan
2016-06-01
Sleep disorders affect a great percentage of the population. The diagnosis of these disorders is usually made by polysomnography. This paper details the development of new software to carry out feature extraction in order to perform robust analysis and classification of sleep events using polysomnographic data. The software, called PSGMiner, is a tool which visualizes, processes and classifies bioelectrical data. The purpose of this program is to provide researchers with a platform with which to test new hypotheses by creating tests to check for correlations that are not available in commercially available software. The software is freely available under the GPL3 License. PSGMiner is composed of a number of diverse modules, such as feature extraction, annotation, and machine learning modules, all of which are accessible from the main module. Using the software, it is possible to extract features of polysomnography using digital signal processing and statistical methods and to perform different analyses. The features can be classified through the use of five classification algorithms. PSGMiner offers an architecture designed for integrating new methods. Automatic scoring, which is available in almost all commercial PSG software, is not inherently available in this program, though it can be implemented by two different methodologies (machine learning and algorithms). While similar software focuses on a certain signal or event and is composed of a small number of modules with no possibility of expansion, the software introduced here can handle all polysomnographic signals and events. The software simplifies the processing of polysomnographic signals for researchers and physicians who are not experts in computer programming. It can find correlations between different events, which could help predict an oncoming event such as sleep apnea. The software could also be used for educational purposes. Copyright © 2016 Elsevier Ltd. All rights reserved.
Trajectory Software With Upper Atmosphere Model
NASA Technical Reports Server (NTRS)
Barrett, Charles
2012-01-01
The Trajectory Software Applications 6.0 for the DEC Alpha platform includes an implementation of the Jacchia-Lineberry Upper Atmosphere Density Model used in the Mission Control Center for International Space Station support. Previous trajectory software required an upper atmosphere model to support atmospheric drag calculations in the Mission Control Center. The functional operation will differ depending on the end use of the module. In general, the calling routine uses function-calling arguments to specify input to the processor. The atmosphere model then computes and returns atmospheric density at the time of interest.
A Role-Playing Game for a Software Engineering Lab: Developing a Product Line
ERIC Educational Resources Information Center
Zuppiroli, Sara; Ciancarini, Paolo; Gabbrielli, Maurizio
2012-01-01
Software product line development refers to software engineering practices and techniques for creating families of similar software systems from a basic set of reusable components, called shared assets. Teaching how to deal with software product lines in a university lab course is a challenging task, because there are several practical issues that…
Butensky, Samuel D; Sloan, Andrew P; Meyers, Eric; Carmel, Jason B
2017-07-15
Hand function is critical for independence, and neurological injury often impairs dexterity. To measure hand function in people or forelimb function in animals, sensors are employed to quantify manipulation. These sensors make assessment easier and more quantitative and allow automation of these tasks. While automated tasks improve objectivity and throughput, they also produce large amounts of data that can be burdensome to analyze. We created software called Dexterity that simplifies data analysis of automated reaching tasks. Dexterity is MATLAB software that enables quick analysis of data from forelimb tasks. Through a graphical user interface, files are loaded and data are identified and analyzed. These data can be annotated or graphed directly. Analysis is saved, and the graph and corresponding data can be exported. For additional analysis, Dexterity provides access to custom scripts created by other users. To determine the utility of Dexterity, we performed a study to evaluate the effects of task difficulty on the degree of impairment after injury. Dexterity analyzed two months of data and allowed new users to annotate the experiment, visualize results, and save and export data easily. Previous analysis of tasks was performed with custom data analysis, requiring expertise with analysis software. Dexterity made the tools required to analyze, visualize and annotate data easy to use by investigators without data science experience. Dexterity increases accessibility to automated tasks that measure dexterity by making analysis of large data intuitive, robust, and efficient. Copyright © 2017 Elsevier B.V. All rights reserved.
Designing Educational Software for Tomorrow.
ERIC Educational Resources Information Center
Harvey, Wayne
Designed to address the management and use of computer software in education and training, this paper explores both good and poor software design, calling for improvements in the quality of educational software by attending to design considerations that are based on general principles of learning rather than specific educational objectives. This…
STAMPS: development and verification of swallowing kinematic analysis software.
Lee, Woo Hyung; Chun, Changmook; Seo, Han Gil; Lee, Seung Hak; Oh, Byung-Mo
2017-10-17
Swallowing impairment is a common complication in various geriatric and neurodegenerative diseases. Swallowing kinematic analysis is essential to quantitatively evaluate the swallowing motion of the oropharyngeal structures. This study aims to develop a novel swallowing kinematic analysis software, called spatio-temporal analyzer for motion and physiologic study (STAMPS), and verify its validity and reliability. STAMPS was developed in MATLAB, which is one of the most popular platforms for biomedical analysis. This software was constructed to acquire, process, and analyze the data of swallowing motion. The target of swallowing structures includes bony structures (hyoid bone, mandible, maxilla, and cervical vertebral bodies), cartilages (epiglottis and arytenoid), soft tissues (larynx and upper esophageal sphincter), and food bolus. Numerous functions are available for the spatiotemporal parameters of the swallowing structures. Testing for validity and reliability was performed in 10 dysphagia patients with diverse etiologies and using the instrumental swallowing model which was designed to mimic the motion of the hyoid bone and the epiglottis. The intra- and inter-rater reliability tests showed excellent agreement for displacement and moderate to excellent agreement for velocity. The Pearson correlation coefficients between the measured and instrumental reference values were nearly 1.00 (P < 0.001) for displacement and velocity. The Bland-Altman plots showed good agreement between the measurements and the reference values. STAMPS provides precise and reliable kinematic measurements and multiple practical functionalities for spatiotemporal analysis. The software is expected to be useful for researchers who are interested in the swallowing motion analysis.
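STAMPS is implemented in MATLAB; the Python sketch below only illustrates the basic spatiotemporal quantities involved: converting tracked pixel coordinates of a landmark such as the hyoid bone into displacement and velocity, using an assumed calibration factor and frame rate. It is not the STAMPS code.
```python
import numpy as np

def landmark_kinematics(xy_pixels, fps, mm_per_pixel):
    """Displacement and velocity of a tracked landmark (e.g. the hyoid bone).

    xy_pixels    : (n_frames, 2) array of tracked pixel coordinates
    fps          : videofluoroscopy frame rate
    mm_per_pixel : calibration factor derived from a reference object
    """
    xy = np.asarray(xy_pixels, dtype=float) * mm_per_pixel
    displacement = np.linalg.norm(xy - xy[0], axis=1)       # mm from the starting position
    step = np.linalg.norm(np.diff(xy, axis=0), axis=1)       # mm moved between frames
    velocity = step * fps                                     # mm/s
    return displacement, velocity
```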
Third-Party Software's Trust Quagmire.
Voas, J; Hurlburt, G
2015-12-01
Current software development has trended toward the idea of integrating independent software sub-functions to create more complete software systems. Software sub-functions are often not homegrown - instead they are developed by unknown 3rd-party organizations and reside in software marketplaces owned or controlled by others. Such software sub-functions carry plausible concerns in terms of quality, origins, functionality, security, and interoperability, to name a few. This article surveys key technical difficulties in confidently building systems from acquired software sub-functions by calling out the principal software supply chain actors.
Using Kill-Chain Analysis to Develop Surface Ship CONOPs to Defend Against Anti-Ship Cruise Missiles
2010-06-01
used to analyze this problem. The first was a software product from the Palisade Corporation called @Risk for Excel (version 5.5) with Precision...matching range cells in Table 4. Table 5 is for the case with no soft-kill mechanisms used by the ASCM and the numeric values do not take into
Complexity: an internet resource for analysis of DNA sequence complexity
Orlov, Y. L.; Potapov, V. N.
2004-01-01
The search for DNA regions with low complexity is one of the pivotal tasks of modern structural analysis of complete genomes. Low complexity may result from a strongly biased nucleotide composition, from tandem or dispersed repeats or from palindrome-hairpin structures, as well as from a combination of all these factors. Several numerical measures of textual complexity, including combinatorial and linguistic ones, together with complexity estimation using a modified Lempel–Ziv algorithm, have been implemented in a software tool called 'Complexity' (http://wwwmgs.bionet.nsc.ru/mgs/programs/low_complexity/). The software enables a user to search for low-complexity regions in long sequences, e.g. complete bacterial genomes or eukaryotic chromosomes. In addition, it estimates the complexity of groups of aligned sequences. PMID:15215465
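The exact modification of the Lempel-Ziv algorithm used by the tool is not specified here; the classic LZ76 phrase-counting scheme sketched below conveys the idea that repetitive sequences decompose into few phrases and therefore score low, while random-looking sequences need many phrases.
```python
def lz_complexity(seq):
    """Lempel-Ziv (LZ76) complexity: number of phrases in the left-to-right
    exhaustive parsing of the sequence (Kaspar & Schuster counting scheme)."""
    n = len(seq)
    if n < 2:
        return n
    i, k, l = 0, 1, 1
    k_max, c = 1, 1
    while True:
        if seq[i + k - 1] == seq[l + k - 1]:
            k += 1
            if l + k > n:
                c += 1
                break
        else:
            if k > k_max:
                k_max = k
            i += 1
            if i == l:
                c += 1
                l += k_max
                if l + 1 > n:
                    break
                i, k, k_max = 0, 1, 1
            else:
                k = 1
    return c

print(lz_complexity("ACACACACACACACAC"))   # low complexity: tandem repeat
print(lz_complexity("AGCTTAGGCATCGATCA"))  # higher complexity
```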
NASA Technical Reports Server (NTRS)
Withey, James V.
1986-01-01
The validity of real-time software is determined by its ability to execute on a computer within the time constraints of the physical system it is modeling. In many applications the time constraints are so critical that the details of process scheduling are elevated to the requirements analysis phase of the software development cycle. It is not uncommon to find specifications for a real-time cyclic executive program included or assumed in such requirements. It was found that preliminary designs structured around this implementation obscure the data flow of the real-world system being modeled, and that it is consequently difficult and costly to maintain, update and reuse the resulting software. A cyclic executive is a software component that schedules and implicitly synchronizes the real-time software through periodic and repetitive subroutine calls. Therefore a design method is sought that allows the deferral of process scheduling to the later stages of design. The appropriate scheduling paradigm must be chosen given the performance constraints, the target environment and the software's lifecycle. The concept of process inversion is explored with respect to the cyclic executive.
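For readers unfamiliar with the construct, a cyclic executive in its simplest form is a fixed loop that calls rate-grouped subroutines every minor cycle; the sketch below is a generic illustration (the task set, cycle length and overrun handling are assumptions), not the design discussed in the report.
```python
import time

def cyclic_executive(tasks, minor_cycle_s, major_frame_cycles):
    """Minimal cyclic executive: each task is a (rate, function) pair, where rate
    means "run every N minor cycles".  Scheduling is fixed at design time; the
    executive simply calls subroutines in order within every minor cycle."""
    cycle = 0
    while True:
        start = time.monotonic()
        for rate, task in tasks:
            if cycle % rate == 0:
                task()                           # implicit synchronization by call order
        cycle = (cycle + 1) % major_frame_cycles
        # sleep out the remainder of the minor cycle (overrun handling omitted)
        time.sleep(max(0.0, minor_cycle_s - (time.monotonic() - start)))

# Hypothetical task set: guidance every cycle, navigation every 2nd, telemetry every 8th
# cyclic_executive([(1, guidance), (2, navigation), (8, telemetry)],
#                  minor_cycle_s=0.025, major_frame_cycles=8)
```
The design problem the report raises is visible here: the data flow between the called subroutines is nowhere explicit, which is why deferring this scheduling decision to later design stages is attractive.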
Nema, Shubham; Hasan, Whidul; Bhargava, Anamika; Bhargava, Yogesh
2016-09-15
Behavioural neuroscience relies on software-driven methods for behavioural assessment, but the field lacks cost-effective, robust, open-source software for behavioural analysis. Here we propose a novel method, which we call ZebraTrack. It includes a cost-effective imaging setup for distraction-free behavioural acquisition, automated tracking using the open-source ImageJ software, and a workflow for extraction of behavioural endpoints. Our ImageJ algorithm is capable of giving users control at key steps while maintaining automation in tracking, without the need to install external plugins. We have validated this method by testing novelty-induced anxiety behaviour in adult zebrafish. Our results, in agreement with established findings, showed that during state anxiety zebrafish exhibit reduced distance travelled, increased thigmotaxis and more freezing events. Furthermore, we propose a method to represent both the spatial and temporal distribution of choice-based behaviour, which is currently not possible with simple videograms. The ZebraTrack method is simple and economical, yet robust enough to give results comparable with those obtained from costly proprietary software such as Ethovision XT. We have developed and validated a novel cost-effective method for behavioural analysis of adult zebrafish using open-source ImageJ software. Copyright © 2016 Elsevier B.V. All rights reserved.
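The endpoints reported above (distance travelled, thigmotaxis, freezing events) can be derived from any tracked trajectory; the sketch below shows one way to compute them from (x, y) positions, with the arena geometry, speed threshold and minimum freezing duration as assumed parameters rather than the values used by ZebraTrack.
```python
import numpy as np

def open_field_endpoints(track_xy, fps, arena_center, arena_radius,
                         wall_zone=0.2, freeze_speed=0.5, freeze_min_s=2.0):
    """Basic anxiety-related endpoints from a tracked (x, y) trajectory.

    track_xy     : (n_frames, 2) array of positions in cm
    wall_zone    : outer fraction of the arena radius counted as the wall zone
    freeze_speed : speed (cm/s) below which the animal is considered immobile
    freeze_min_s : minimum immobility duration counted as a freezing event
    """
    xy = np.asarray(track_xy, dtype=float)
    step = np.linalg.norm(np.diff(xy, axis=0), axis=1)
    distance = step.sum()                                   # total distance travelled
    speed = step * fps

    r = np.linalg.norm(xy - np.asarray(arena_center), axis=1)
    thigmotaxis = np.mean(r > (1.0 - wall_zone) * arena_radius)   # fraction of time near the wall

    frozen = speed < freeze_speed                           # immobile frames
    changes = np.diff(frozen.astype(int))
    bout_starts = np.flatnonzero(changes == 1) + 1
    if frozen.size and frozen[0]:
        bout_starts = np.r_[0, bout_starts]
    bout_ends = np.flatnonzero(changes == -1) + 1
    if frozen.size and frozen[-1]:
        bout_ends = np.r_[bout_ends, frozen.size]
    freezing_events = int(np.sum((bout_ends - bout_starts) >= freeze_min_s * fps))
    return distance, thigmotaxis, freezing_events
```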
VOIP for Telerehabilitation: A Risk Analysis for Privacy, Security and HIPAA Compliance: Part II
Watzlaf, Valerie J.M.; Moeini, Sohrab; Matusow, Laura; Firouzan, Patti
2011-01-01
In a previous publication the authors developed a privacy and security checklist to evaluate Voice over Internet Protocol (VoIP) videoconferencing software used between patients and therapists to provide telerehabilitation (TR) therapy. In this paper, the privacy and security checklist that was previously developed is used to perform a risk analysis of the top ten VoIP videoconferencing software to determine if their policies provide answers to the privacy and security checklist. Sixty percent of the companies claimed they do not listen into video-therapy calls unless maintenance is needed. Only 50% of the companies assessed use some form of encryption, and some did not specify what type of encryption was used. Seventy percent of the companies assessed did not specify any form of auditing on their servers. Statistically significant differences across company websites were found for sharing information outside of the country (p=0.010), encryption (p=0.006), and security evaluation (p=0.005). Healthcare providers considering use of VoIP software for TR services may consider using this privacy and security checklist before deciding to incorporate a VoIP software system for TR. Other videoconferencing software that is specific for TR with strong encryption, good access controls, and hardware that meets privacy and security standards should be considered for use with TR. PMID:25945177
Automatic building information model query generation
Jiang, Yufei; Yu, Nan; Ming, Jiang; ...
2015-12-01
Energy efficient building design and construction calls for extensive collaboration between different subfields of the Architecture, Engineering and Construction (AEC) community. Performing building design and construction engineering raises challenges in data integration and software interoperability. Using a Building Information Modeling (BIM) data hub to host and integrate building models is a promising solution to address those challenges and can ease building design information management. However, the partial model query mechanism of the current BIM data hub collaboration model has several limitations, which prevent designers and engineers from taking full advantage of BIM. To address this problem, we propose a general and effective approach to generate query code based on a Model View Definition (MVD). This approach is demonstrated through a software prototype called QueryGenerator. Through a case study using multi-zone air flow analysis, we show how our approach and tool can help domain experts use BIM to drive building design with less labour and lower overhead cost.
NASA Technical Reports Server (NTRS)
Byrnes, D. V.; Carney, P. C.; Underwood, J. W.; Vogt, E. D.
1974-01-01
The six-month effort was responsible for the development, test, conversion, and documentation of computer software for the mission analysis of missions to halo orbits about libration points in the earth-sun system. The software, consisting of two programs called NOMNAL and ERRAN, is part of the Space Trajectories Error Analysis Programs. The program NOMNAL targets a transfer trajectory from earth on a given launch date to a specified halo orbit on a required arrival date. Either impulsive or finite-thrust insertion maneuvers into the halo orbit are permitted by the program. The transfer trajectory is consistent with a realistic launch profile input by the user. The second program, ERRAN, conducts error analyses of the targeted transfer trajectory. Measurements including range, doppler, star-planet angles, and apparent planet diameter are processed in a Kalman-Schmidt filter to determine the trajectory knowledge uncertainty.
Applying Hypertext Structures to Software Documentation.
ERIC Educational Resources Information Center
French, James C.; And Others
1997-01-01
Describes a prototype system for software documentation management called SLEUTH (Software Literacy Enhancing Usefulness to Humans) being developed at the University of Virginia. Highlights include information retrieval techniques, hypertext links that are installed automatically, a WAIS (Wide Area Information Server) search engine, user…
[Utility of Smartphone in Home Care Medicine - First Trial].
Takeshige, Toshiyuki; Hirano, Chiho; Nakagawa, Midori; Yoshioka, Rentaro
2015-12-01
The use of video calls for home care can reduce anxiety and offer patients peace of mind. The most suitable terminals at facilities supporting home care have been the iPad Air and iPhone with FaceTime software. However, usage has been limited to these specific terminals. In order to eliminate the need for special terminals and software, we have developed a program customized to meet the needs of facilities using Web Real Time Communication (WebRTC), in cooperation with the University of Aizu. With this software, video calls can accommodate a large number of home care patients.
ACME Priority Metrics (A-PRIME)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Evans, Katherine J; Zender, Charlie; Van Roekel, Luke
A-PRIME is a collection of scripts designed to provide Accelerated Climate Model for Energy (ACME) model developers and analysts with a variety of analyses needed to determine whether the model is producing the desired results, depending on the goals of the simulation. The top level of the software consists of csh scripts through which the scientist provides the input parameters; these scripts then call code that performs the post-processing of the raw data and creates plots for visual assessment.
Optimized Next-Generation Sequencing Genotype-Haplotype Calling for Genome Variability Analysis
Navarro, Javier; Nevado, Bruno; Hernández, Porfidio; Vera, Gonzalo; Ramos-Onsins, Sebastián E
2017-01-01
The accurate estimation of nucleotide variability using next-generation sequencing data is challenged by the high number of sequencing errors produced by new sequencing technologies, especially for nonmodel species, where reference sequences may not be available and the read depth may be low due to limited budgets. The most popular single-nucleotide polymorphism (SNP) callers are designed to obtain a high SNP recovery and low false discovery rate but are not designed to account appropriately for the frequency of the variants. Instead, algorithms designed to account for the frequency of SNPs give precise results for estimating the levels and the patterns of variability. These algorithms are focused on the unbiased estimation of the variability rather than on the high recovery of SNPs. Here, we implemented a fast and optimized parallel algorithm that includes the method developed by Roesti et al. and Lynch, which estimates the genotype of each individual at each site, considering the possibility of calling both bases of the genotype, a single base, or none. This algorithm does not consider the reference and is therefore independent of biases related to the reference nucleotide specified. The pipeline starts from a BAM file converted to pileup or mpileup format, and the software outputs a FASTA file. The new program not only reduces the running times but also, given its improved use of resources, allows its usage on smaller computers as well as large parallel computers, expanding its benefits to a wider range of researchers. The output file can be analyzed using software for population genetics analysis, such as the R library PopGenome, the software VariScan, and the program mstatspop for analysis considering positions with missing data. PMID:28894353
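As a minimal, non-authoritative sketch of the general idea (this is not the Roesti et al./Lynch estimator implemented by the authors; the error model and thresholds below are assumptions), a per-site diploid genotype could be called from one individual's pileup column as follows:

```python
from collections import Counter
from math import log

def call_genotype(bases, error_rate=0.01, min_depth=6, min_llr=2.0):
    """Toy diploid genotype call from the pileup column of one individual.

    bases : iterable of observed base calls at one site, e.g. "AAAGAAGA"
    Returns two called alleles, or "NN" if the site is undecided or too shallow.
    """
    counts = Counter(b.upper() for b in bases if b.upper() in "ACGT")
    depth = sum(counts.values())
    if depth < min_depth:
        return "NN"

    # two most frequent bases; pad with a dummy if only one base was observed
    (a1, n1), (a2, n2) = (counts.most_common(2) + [("N", 0)])[:2]

    # log-likelihoods of three candidate genotypes under a simple error model
    e = error_rate
    ll_hom1 = n1 * log(1 - e) + (depth - n1) * log(e / 3)
    ll_hom2 = n2 * log(1 - e) + (depth - n2) * log(e / 3) if n2 else float("-inf")
    ll_het = (n1 + n2) * log(0.5 * (1 - e) + 0.5 * e / 3) + \
             (depth - n1 - n2) * log(e / 3)

    ranked = sorted([(ll_hom1, a1 + a1), (ll_het, a1 + a2), (ll_hom2, a2 + a2)],
                    reverse=True)
    best, second = ranked[0], ranked[1]
    # only call a genotype if the best hypothesis clearly beats the runner-up
    return best[1] if best[0] - second[0] >= min_llr else "NN"
```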
Visual gene developer: a fully programmable bioinformatics software for synthetic gene optimization.
Jung, Sang-Kyu; McDonald, Karen
2011-08-16
Direct gene synthesis is becoming more popular owing to decreases in gene synthesis pricing. Compared with using natural genes, gene synthesis provides a good opportunity to optimize gene sequence for specific applications. In order to facilitate gene optimization, we have developed a stand-alone software called Visual Gene Developer. The software not only provides general functions for gene analysis and optimization along with an interactive user-friendly interface, but also includes unique features such as programming capability, dedicated mRNA secondary structure prediction, artificial neural network modeling, network & multi-threaded computing, and user-accessible programming modules. The software allows a user to analyze and optimize a sequence using main menu functions or specialized module windows. Alternatively, gene optimization can be initiated by designing a gene construct and configuring an optimization strategy. A user can choose several predefined or user-defined algorithms to design a complicated strategy. The software provides expandable functionality as platform software supporting module development using popular script languages such as VBScript and JScript in the software programming environment. Visual Gene Developer is useful for both researchers who want to quickly analyze and optimize genes, and those who are interested in developing and testing new algorithms in bioinformatics. The software is available for free download at http://www.visualgenedeveloper.net.
Visual gene developer: a fully programmable bioinformatics software for synthetic gene optimization
2011-01-01
Background Direct gene synthesis is becoming more popular owing to decreases in gene synthesis pricing. Compared with using natural genes, gene synthesis provides a good opportunity to optimize gene sequence for specific applications. In order to facilitate gene optimization, we have developed a stand-alone software called Visual Gene Developer. Results The software not only provides general functions for gene analysis and optimization along with an interactive user-friendly interface, but also includes unique features such as programming capability, dedicated mRNA secondary structure prediction, artificial neural network modeling, network & multi-threaded computing, and user-accessible programming modules. The software allows a user to analyze and optimize a sequence using main menu functions or specialized module windows. Alternatively, gene optimization can be initiated by designing a gene construct and configuring an optimization strategy. A user can choose several predefined or user-defined algorithms to design a complicated strategy. The software provides expandable functionality as platform software supporting module development using popular script languages such as VBScript and JScript in the software programming environment. Conclusion Visual Gene Developer is useful for both researchers who want to quickly analyze and optimize genes, and those who are interested in developing and testing new algorithms in bioinformatics. The software is available for free download at http://www.visualgenedeveloper.net. PMID:21846353
North, Frederick; Varkey, Prathiba; Caraballo, Pedro; Vsetecka, Darlene; Bartel, Greg
2007-10-11
Complex decision support software can require significant effort in maintenance and enhancement. A quality improvement tool, the prioritization matrix, was successfully used to guide software enhancement of algorithms in a symptom assessment call center.
PyMICE: A Python library for analysis of IntelliCage data.
Dzik, Jakub M; Puścian, Alicja; Mijakowska, Zofia; Radwanska, Kasia; Łęski, Szymon
2018-04-01
IntelliCage is an automated system for recording the behavior of a group of mice housed together. It produces rich, detailed behavioral data, calling for new methods and software for their analysis. Here we present PyMICE, a free and open-source library for analysis of IntelliCage data in the Python programming language. We describe the design and demonstrate the use of the library through a series of examples. PyMICE provides easy and intuitive access to IntelliCage data, and thus facilitates the use of numerous other Python scientific libraries to form a complete data analysis workflow.
cyvcf2: fast, flexible variant analysis with Python.
Pedersen, Brent S; Quinlan, Aaron R
2017-06-15
Variant call format (VCF) files document the genetic variation observed after DNA sequencing, alignment and variant calling of a sample cohort. Given the complexity of the VCF format as well as the diverse variant annotations and genotype metadata, there is a need for fast, flexible methods enabling intuitive analysis of the variant data within VCF and BCF files. We introduce cyvcf2, a Python library and software package for fast parsing and querying of VCF and BCF files and illustrate its speed, simplicity and utility. bpederse@gmail.com or aaronquinlan@gmail.com. cyvcf2 is available from https://github.com/brentp/cyvcf2 under the MIT license and from common python package managers. Detailed documentation is available at http://brentp.github.io/cyvcf2/. © The Author 2017. Published by Oxford University Press.
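A brief usage sketch (the file path and region are placeholders; the attribute names follow the cyvcf2 documentation) illustrates the style of access the library provides:

```python
from cyvcf2 import VCF

vcf = VCF("cohort.vcf.gz")          # placeholder path; any VCF/BCF file works
print(vcf.samples)                   # sample names from the header

# Iterate over a region (requires a tabix/CSI index) and summarise each variant.
for v in vcf("chr1:1-1000000"):
    # gt_types codes genotypes per sample: 0=HOM_REF, 1=HET, 2=UNKNOWN, 3=HOM_ALT
    n_het = (v.gt_types == 1).sum()
    depth = v.INFO.get("DP")         # returns None if the INFO field is absent
    print(v.CHROM, v.POS, v.REF, v.ALT, v.QUAL, n_het, depth)
```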
Knickpoint finder: A software tool that improves neotectonic analysis
NASA Astrophysics Data System (ADS)
Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.
2015-03-01
This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software uses a digital elevation model (DEM) to identify relief breakpoints (knickpoints) along drainage profiles. The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. For an assessment of its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA); this is an area of constant intraplate seismicity and non-orogenic active tectonics, and it exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with the geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded a spatial representation of the data after processing with Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the rapidity of identification of deformed areas. Therefore, this software tool may be considered useful in neotectonic analyses of large areas and may be applied to any area where there is DEM coverage.
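For illustration only (this is not the Knickpoint Finder code, which runs on the ArcGIS platform; the threshold rule below is an assumption), a single longitudinal river profile extracted from a DEM could be screened for knickpoint candidates with Hack's stream-gradient (SL) index roughly as follows:

```python
import numpy as np

def sl_index_knickpoints(distance_m, elevation_m, factor=2.0):
    """Flag knickpoint candidates along one longitudinal river profile.

    distance_m  : 1-D array of downstream distance from the source (m), increasing
    elevation_m : 1-D array of channel elevation (m) at those distances
    factor      : SL values above factor * median(SL) are flagged as candidates
    """
    d = np.asarray(distance_m, dtype=float)
    z = np.asarray(elevation_m, dtype=float)

    dh = -np.diff(z)                       # elevation drop per segment
    dl = np.diff(d)                        # segment length
    midpoint = 0.5 * (d[:-1] + d[1:])      # distance from source at segment centre

    sl = (dh / dl) * midpoint              # Hack's stream-gradient (SL) index
    threshold = factor * np.median(sl[sl > 0]) if np.any(sl > 0) else np.inf
    candidates = np.where(sl > threshold)[0]
    return sl, candidates                  # indices refer to profile segments
```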
Establishment of the Data and Analysis Center for Software (DACS).
1982-01-01
they wished to renew their participation, a call for volunteers was published in several professional journals. Approximately 150 individuals responded...but they must also be educated in the benefits to be gained through their use. The promotional mechanisms, brochures, newsletters, advertisements...seminars, and active participation in professional and technical organizations. 2.1.2 Implementation Approach These goals were implemented on a build
2015-07-10
Kramer noted in the Q4 2010 Earnings Call: “Our innovation engine again delivered in 2010. The percentage of new products in our overall lineup is...their use (not to redistribute the code, reverse engineer it, etc.) and their intended use. They also agree to abide by the ITAR procedures which have
The Children of the Computer Generation: An Analysis of the Family Computer Fad in Japan.
ERIC Educational Resources Information Center
Ishigaki, Emiko Hannah
Results of a survey of grade school and junior high school students suggest that Japan is now caught up in a TV game fad called Family Computer (Fami-Com). Fami-Com is a household electric machine for video games that allows players to use more than 100 currently marketed software products. Since its introduction in 1983, the popularity of the…
Automated sequence analysis and editing software for HIV drug resistance testing.
Struck, Daniel; Wallis, Carole L; Denisov, Gennady; Lambert, Christine; Servais, Jean-Yves; Viana, Raquel V; Letsoalo, Esrom; Bronze, Michelle; Aitken, Sue C; Schuurman, Rob; Stevens, Wendy; Schmit, Jean Claude; Rinke de Wit, Tobias; Perez Bercoff, Danielle
2012-05-01
Access to antiretroviral treatment in resource-limited settings is inevitably paralleled by the emergence of HIV drug resistance. Monitoring treatment efficacy and HIV drug resistance testing are therefore of increasing importance in resource-limited settings. Yet low-cost technologies and procedures suited to the particular context and constraints of such settings are still lacking. The ART-A (Affordable Resistance Testing for Africa) consortium brought together public and private partners to address this issue, and set out to develop an automated sequence analysis and editing software package to support high-throughput automated sequencing. The ART-A Software was designed to automatically process and edit ABI chromatograms or FASTA files from HIV-1 isolates. The ART-A Software performs the basecalling, assigns quality values, aligns query sequences against a set reference, infers a consensus sequence, identifies the HIV type and subtype, translates the nucleotide sequence to amino acids, and reports insertions/deletions, premature stop codons, ambiguities and mixed calls. The results can be automatically exported to Excel to identify mutations. Automated analysis was compared to manual analysis using a panel of 1624 PR-RT sequences generated in 3 different laboratories. Discrepancies between manual and automated sequence analysis were 0.69% at the nucleotide level and 0.57% at the amino acid level (668,047 AA analyzed), and discordances at major resistance mutations were recorded in 62 cases (4.83% of differences, 0.04% of all AA) for PR and 171 cases (6.18% of differences, 0.03% of all AA) for RT. The ART-A Software is a time-sparing tool for pre-analyzing HIV and viral quasispecies sequences in high-throughput laboratories and highlighting positions requiring attention. Copyright © 2012 Elsevier B.V. All rights reserved.
2013-01-01
Amplification of the human epidermal growth factor receptor 2 (HER2) is a prognostic marker for poor clinical outcome and a predictive marker for therapeutic response to targeted therapies in breast cancer patients. With the introduction of anti-HER2 therapies, accurate assessment of HER2 status has become essential. Fluorescence in situ hybridization (FISH) is a widely used technique for the determination of HER2 status in breast cancer. However, manual signal enumeration is time-consuming. Therefore, several companies such as MetaSystems have developed automated image analysis software. Some of these signal enumeration programs employ the so-called "tile-sampling classifier", a programming algorithm through which the software quantifies fluorescent signals in images on the basis of square tiles of fixed dimensions. Considering that the size of a tile does not always correspond to the size of a single tumor cell nucleus, some users argue that this analysis method might not completely reflect the biology of cells. For that reason, MetaSystems has developed a new classifier which is able to recognize nuclei within tissue sections in order to determine the HER2 amplification status on a per-nucleus basis. We call this new programming algorithm the "nuclei-sampling classifier". In this study, we evaluated the accuracy of the "nuclei-sampling classifier" in determining HER2 gene amplification by FISH in nuclei of breast cancer cells. To this aim, we randomly selected from our cohort 64 breast cancer specimens (32 nonamplified and 32 amplified) and compared results obtained through manual scoring and through the new classifier. The new classifier automatically recognized individual nuclei. The automated analysis was followed by an optional human correction, during which the user interacted with the software in order to improve the selection of cell nuclei automatically selected. Overall concordance between manual scoring and automated nuclei-sampling analysis was 98.4% (100% for nonamplified cases and 96.9% for amplified cases). After human correction, concordance between the two methods was 100%. We conclude that the nuclei-based classifier is a new available tool for automated quantitative HER2 FISH signal analysis in nuclei in breast cancer specimens and that it can be used for clinical purposes. PMID:23379971
The Validation of a Software Evaluation Instrument.
ERIC Educational Resources Information Center
Schmitt, Dorren Rafael
This study, conducted at six southern universities, analyzed the validity and reliability of a researcher developed instrument designed to evaluate educational software in secondary mathematics. The instrument called the Instrument for Software Evaluation for Educators uses measurement scales, presents a summary section of the evaluation, and…
Software Certification for Temporal Properties With Affordable Tool Qualification
NASA Technical Reports Server (NTRS)
Xia, Songtao; DiVito, Benedetto L.
2005-01-01
It has been recognized that a framework based on proof-carrying code (also called semantic-based software certification in its community) could be used as a candidate software certification process for the avionics industry. To meet this goal, tools in the "trust base" of a proof-carrying code system must be qualified by regulatory authorities. A family of semantic-based software certification approaches is described, each different in expressive power, level of automation and trust base. Of particular interest is the so-called abstraction-carrying code, which can certify temporal properties. When a pure abstraction-carrying code method is used in the context of industrial software certification, the fact that the trust base includes a model checker would incur a high qualification cost. This position paper proposes a hybrid of abstraction-based and proof-based certification methods so that the model checker used by a client can be significantly simplified, thereby leading to lower cost in tool qualification.
Embracing Open Software Development in Solar Physics
NASA Astrophysics Data System (ADS)
Hughitt, V. K.; Ireland, J.; Christe, S.; Mueller, D.
2012-12-01
We discuss two ongoing software projects in solar physics that have adopted best practices of the open source software community. The first, the Helioviewer Project, is a powerful data visualization tool which includes online and Java interfaces inspired by Google Maps (tm). This effort allows users to find solar features and events of interest, and download the corresponding data. Having found data of interest, the user then has to analyze it. The dominant solar data analysis platform is an open-source library called SolarSoft (SSW). Although SSW itself is open-source, the programming language used is IDL, a proprietary language with licensing costs that are prohibitive for many institutions and individuals. SSW is composed of a collection of related scripts written by missions and individuals for solar data processing and analysis, without any consistent data structures or common interfaces. Further, at the time when SSW was initially developed, many of the best software development practices of today (mirrored and distributed version control, unit testing, continuous integration, etc.) were not standard, and have not since been adopted. The challenges inherent in developing SolarSoft led to a second software project known as SunPy. SunPy is an open-source Python-based library which seeks to create a unified solar data analysis environment, including a number of core datatypes such as Maps, Lightcurves, and Spectra which have consistent interfaces and behaviors. By taking advantage of the large and sophisticated body of scientific software already available in Python (e.g. SciPy, NumPy, Matplotlib), and by adopting many of the best practices refined in open-source software development, SunPy has been able to develop at a very rapid pace while still ensuring a high level of reliability. The Helioviewer Project and SunPy represent two pioneering technologies in solar physics: simple yet flexible data visualization and a powerful, new data analysis environment. We discuss the development of both these efforts and how they are beginning to influence the solar physics community.
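As a minimal sketch of the unified datatypes mentioned above (the FITS file name is a placeholder, and the exact plotting calls may vary between SunPy releases), an SDO/AIA image can be loaded into the Map datatype and displayed like this:

```python
import matplotlib.pyplot as plt
import sunpy.map

# Load a level-1 SDO/AIA FITS file into SunPy's Map datatype.
# The file name is a placeholder; sunpy.data.sample also provides test images.
aia = sunpy.map.Map("aia_lev1_171a_2012_03_04.fits")

print(aia.date, aia.wavelength, aia.dimensions)   # observation metadata

fig = plt.figure()
ax = fig.add_subplot(projection=aia)              # WCS-aware axes for the map
aia.plot(axes=ax)                                  # render with the AIA colormap
plt.show()
```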
NASA Astrophysics Data System (ADS)
Braun, N.; Hauth, T.; Pulvermacher, C.; Ritter, M.
2017-10-01
Today's analyses for high-energy physics (HEP) experiments involve processing a large amount of data with highly specialized algorithms. The contemporary workflow from recorded data to final results is based on the execution of small scripts - often written in Python or ROOT macros which call complex compiled algorithms in the background - to perform fitting procedures and generate plots. During recent years interactive programming environments, such as Jupyter, became popular. Jupyter allows the development of Python-based applications, so-called notebooks, which bundle code, documentation and results, e.g. plots. Advantages over classical script-based approaches are the ability to recompute only parts of the analysis code, which allows for fast and iterative development, and a web-based user frontend, which can be hosted centrally and only requires a browser on the user side. In our novel approach, Python and Jupyter are tightly integrated into the Belle II Analysis Software Framework (basf2), currently being developed for the Belle II experiment in Japan. This allows code to be developed in Jupyter notebooks for every aspect of the event simulation, reconstruction and analysis chain. These interactive notebooks can be hosted as a centralized web service via JupyterHub with Docker and used by all scientists of the Belle II Collaboration. Because of its generality and encapsulation, the setup can easily be scaled to large installations.
NASA Astrophysics Data System (ADS)
Aldrin, John C.; Forsyth, David S.; Welter, John T.
2016-02-01
To address the data review burden and improve the reliability of the ultrasonic inspection of large composite structures, automated data analysis (ADA) algorithms have been developed to make calls on indications that satisfy the detection criteria and minimize false calls. The original design followed standard procedures for analyzing signals for time-of-flight indications and backwall amplitude dropout. However, certain complex panels with varying shape, ply drops and the presence of bonds can complicate this interpretation process. In this paper, enhancements to the automated data analysis algorithms are introduced to address these challenges. To estimate the thickness of the part and presence of bonds without prior information, an algorithm tracks potential backwall or bond-line signals, and evaluates a combination of spatial, amplitude, and time-of-flight metrics to identify bonded sections. Once part boundaries, thickness transitions and bonded regions are identified, feature extraction algorithms are applied to multiple sets of through-thickness and backwall C-scan images, for evaluation of both first layer through thickness and layers under bonds. ADA processing results are presented for a variety of complex test specimens with inserted materials and other test discontinuities. Lastly, enhancements to the ADA software interface are presented, which improve the software usability for final data review by the inspectors and support the certification process.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stoiber, Marcus H.; Brown, James B.
This software implements the first base caller for nanopore data that calls bases directly from raw data. The basecRAWller algorithm has two major advantages over current nanopore base calling software: (1) streaming base calling and (2) base calling from information-rich raw signal. The ability to perform truly streaming base calling as signal is received from the sequencer can be very powerful, as this is one of the major advantages of this technology compared to other sequencing technologies. Enabling as much streaming potential as possible will therefore be increasingly important as this technology continues to become more widely applied in the biosciences. All other base callers currently employ the Viterbi algorithm, which requires the whole sequence to complete the base calling procedure and thus precludes a natural streaming base calling procedure. The other major advantage of the basecRAWller algorithm is the prediction of bases from raw signal, which contains much richer information than the segmented chunks that current algorithms employ. This leads to the potential for much more accurate base calls, which would make this technology much more valuable to its growing user base.
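As a rough, non-authoritative illustration of why window-based prediction permits streaming while whole-read Viterbi decoding does not (this is not the basecRAWller model; the stand-in predictor below is purely a placeholder), consider a generator that emits base calls as raw-signal chunks arrive:

```python
import numpy as np

def stream_basecall(signal_chunks, window=50, step=10, predict=None):
    """Emit base calls incrementally as raw-signal chunks arrive.

    signal_chunks : iterable of 1-D numpy arrays (raw current samples)
    predict       : callable mapping a (window,) array to a base; a trained
                    neural network would be used in practice
    """
    if predict is None:                        # trivial stand-in predictor
        predict = lambda w: "ACGT"[int(abs(w.mean())) % 4]

    buffer = np.empty(0)
    for chunk in signal_chunks:                # chunks arrive while sequencing runs
        buffer = np.concatenate([buffer, chunk])
        while buffer.size >= window:
            yield predict(buffer[:window])     # call bases from a local window only
            buffer = buffer[step:]             # slide forward; no whole-read pass
```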
Informed-Proteomics: open-source software package for top-down proteomics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, Jungkap; Piehowski, Paul D.; Wilkins, Christopher
Top-down proteomics involves the analysis of intact proteins. This approach is very attractive as it allows for analyzing proteins in their endogenous form without proteolysis, preserving valuable information about post-translational modifications, isoforms, proteolytic processing, or their combinations, collectively called proteoforms. Moreover, the quality of top-down LC-MS/MS datasets is rapidly increasing due to advances in liquid chromatography and mass spectrometry instrumentation and sample processing protocols. However, top-down mass spectra are substantially more complex compared to the more conventional bottom-up data. To take full advantage of the increasing quality of top-down LC-MS/MS datasets there is an urgent need to develop algorithms and software tools for confident proteoform identification and quantification. In this study we present a new open source software suite for top-down proteomics analysis consisting of an LC-MS feature finding algorithm, a database search algorithm, and an interactive results viewer. The presented tool, along with several other popular tools, was evaluated using human-in-mouse xenograft luminal and basal breast tumor samples that are known to have significant differences in protein abundance based on bottom-up analysis.
Haraksingh, Rajini R; Abyzov, Alexej; Urban, Alexander Eckehart
2017-04-24
High-resolution microarray technology is routinely used in basic research and clinical practice to efficiently detect copy number variants (CNVs) across the entire human genome. A new generation of arrays combining high probe densities with optimized designs will comprise essential tools for genome analysis in the coming years. We systematically compared the genome-wide CNV detection power of all 17 available array designs from the Affymetrix, Agilent, and Illumina platforms by hybridizing the well-characterized genome of 1000 Genomes Project subject NA12878 to all arrays, and performing data analysis using both manufacturer-recommended and platform-independent software. We benchmarked the resulting CNV call sets from each array using a gold standard set of CNVs for this genome derived from 1000 Genomes Project whole genome sequencing data. The arrays tested comprise both SNP and aCGH platforms with varying designs and contain between ~0.5 to ~4.6 million probes. Across the arrays CNV detection varied widely in number of CNV calls (4-489), CNV size range (~40 bp to ~8 Mbp), and percentage of non-validated CNVs (0-86%). We discovered strikingly strong effects of specific array design principles on performance. For example, some SNP array designs with the largest numbers of probes and extensive exonic coverage produced a considerable number of CNV calls that could not be validated, compared to designs with probe numbers that are sometimes an order of magnitude smaller. This effect was only partially ameliorated using different analysis software and optimizing data analysis parameters. High-resolution microarrays will continue to be used as reliable, cost- and time-efficient tools for CNV analysis. However, different applications tolerate different limitations in CNV detection. Our study quantified how these arrays differ in total number and size range of detected CNVs as well as sensitivity, and determined how each array balances these attributes. This analysis will inform appropriate array selection for future CNV studies, and allow better assessment of the CNV-analytical power of both published and ongoing array-based genomics studies. Furthermore, our findings emphasize the importance of concurrent use of multiple analysis algorithms and independent experimental validation in array-based CNV detection studies.
Evaluating a Service-Oriented Architecture
2007-09-01
See the description on page 13. SaaS Software as a service ( SaaS ) is a software delivery model where customers don’t own a copy of the application... serviceability REST Representational State Transfer RIA rich internet application RPC remote procedure call SaaS software as a service SAML Security...Evaluating a Service -Oriented Architecture Phil Bianco, Software Engineering Institute Rick Kotermanski, Summa Technologies Paulo Merson
ICC-CLASS: isotopically-coded cleavable crosslinking analysis software suite
2010-01-01
Background Successful application of crosslinking combined with mass spectrometry for studying proteins and protein complexes requires specifically-designed crosslinking reagents, experimental techniques, and data analysis software. Using isotopically-coded ("heavy and light") versions of the crosslinker and cleavable crosslinking reagents is analytically advantageous for mass spectrometric applications and provides a "handle" that can be used to distinguish crosslinked peptides of different types, and to increase the confidence of the identification of the crosslinks. Results Here, we describe a program suite designed for the analysis of mass spectrometric data obtained with isotopically-coded cleavable crosslinkers. The suite contains three programs called DX, DXDX, and DXMSMS. DX searches the mass spectra for the presence of ion signal doublets resulting from the light and heavy isotopic forms of the isotopically-coded crosslinking reagent used. DXDX searches for possible mass matches between cleaved and uncleaved isotopically-coded crosslinks based on the established chemistry of the cleavage reaction for a given crosslinking reagent. DXMSMS assigns the crosslinks to the known protein sequences, based on the isotopically-coded and un-coded MS/MS fragmentation data of uncleaved and cleaved peptide crosslinks. Conclusion The combination of these three programs, which are tailored to the analytical features of the specific isotopically-coded cleavable crosslinking reagents used, represents a powerful software tool for automated high-accuracy peptide crosslink identification. See: http://www.creativemolecules.com/CM_Software.htm PMID:20109223
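For illustration only (this is not the DX program; the mass difference, charge states, and tolerance below are placeholders), the doublet search idea amounts to looking for peak pairs separated by the light/heavy mass difference divided by the charge:

```python
def find_isotope_doublets(peaks, delta_mass, charge_states=(1, 2, 3), tol_ppm=10.0):
    """Return candidate light/heavy doublets in one centroided mass spectrum.

    peaks      : iterable of (mz, intensity) tuples
    delta_mass : mass difference between heavy and light crosslinker forms (Da)
    """
    mzs = sorted(mz for mz, _ in peaks)
    doublets = []
    for mz in mzs:
        for z in charge_states:
            target = mz + delta_mass / z          # expected heavy-partner m/z
            tol = target * tol_ppm * 1e-6
            # a linear scan is fine for a sketch; bisect would be faster
            for other in mzs:
                if abs(other - target) <= tol:
                    doublets.append((mz, other, z))
    return doublets
```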
Veit, Johannes; Sachsenberg, Timo; Chernev, Aleksandar; Aicheler, Fabian; Urlaub, Henning; Kohlbacher, Oliver
2016-09-02
Modern mass spectrometry setups used in today's proteomics studies generate vast amounts of raw data, calling for highly efficient data processing and analysis tools. Software for analyzing these data is either monolithic (easy to use, but sometimes too rigid) or workflow-driven (easy to customize, but sometimes complex). Thermo Proteome Discoverer (PD) is a powerful software for workflow-driven data analysis in proteomics which, in our eyes, achieves a good trade-off between flexibility and usability. Here, we present two open-source plugins for PD providing additional functionality: LFQProfiler for label-free quantification of peptides and proteins, and RNP(xl) for UV-induced peptide-RNA cross-linking data analysis. LFQProfiler interacts with existing PD nodes for peptide identification and validation and takes care of the entire quantitative part of the workflow. We show that it performs at least on par with other state-of-the-art software solutions for label-free quantification in a recently published benchmark (Ramus, C.; J. Proteomics 2016, 132, 51-62). The second workflow, RNP(xl), represents the first software solution to date for identification of peptide-RNA cross-links, including automatic localization of the cross-links at amino acid resolution and localization scoring. It comes with a customized integrated cross-link fragment spectrum viewer for convenient manual inspection and validation of the results.
NASA Astrophysics Data System (ADS)
Maćkowiak-Pawłowska, Maja; Przybyła, Piotr
2018-05-01
The incomplete particle identification limits the experimentally-available phase space region for identified particle analysis. This problem affects ongoing fluctuation and correlation studies including the search for the critical point of strongly interacting matter performed on SPS and RHIC accelerators. In this paper we provide a procedure to obtain nth order moments of the multiplicity distribution using the identity method, generalising previously published solutions for n=2 and n=3. Moreover, we present an open source software implementation of this computation, called Idhim, that allows one to obtain the true moments of identified particle multiplicity distributions from the measured ones provided the response function of the detector is known.
Discovering objects in a blood recipient information system.
Qiu, D; Junghans, G; Marquardt, K; Kroll, H; Mueller-Eckhardt, C; Dudeck, J
1995-01-01
Application of object-oriented (OO) methodologies has generally been considered a solution to the problem of improving the software development process and managing the so-called software crisis. Among them, object-oriented analysis (OOA) is the most essential and is a vital prerequisite for the successful use of other OO methodologies. Though a good number of OOA methods have already been published, the most important aspect common to all these methods, discovering object classes truly relevant to the given problem domain, remains a subject of intensive research. In this paper, using the successful development of a blood recipient information system as an example, we present our approach, which is based on the conceptual framework of responsibility-driven OOA. In the discussion, we also suggest that it may be inadequate to simply attribute the software crisis to the waterfall model of the software development life-cycle. We are convinced that the real causes of the failure of some software and information systems should be sought in the methodologies used in some crucial phases of the software development process. Furthermore, a software system can also fail if object classes essential to the problem domain are not discovered, implemented and visualized, so that the real-world situation cannot be faithfully traced by it.
Fragman: an R package for fragment analysis.
Covarrubias-Pazaran, Giovanny; Diaz-Garcia, Luis; Schlautman, Brandon; Salazar, Walter; Zalapa, Juan
2016-04-21
Determination of microsatellite lengths or other DNA fragment types is an important initial component of many genetic studies such as mutation detection, linkage and quantitative trait loci (QTL) mapping, genetic diversity, pedigree analysis, and detection of heterozygosity. A handful of commercial and freely available software programs exist for fragment analysis; however, most of them are platform dependent and lack high-throughput applicability. We present the R package Fragman to serve as a freely available and platform-independent resource for automatic scoring of DNA fragment lengths in diversity panels and biparental populations. The program analyzes DNA fragment lengths generated on Applied Biosystems® (ABI) instruments, either manually or automatically, by providing panels or bins. The package contains additional tools for converting the allele calls to the GenAlEx, JoinMap® and OneMap software formats, mainly used for genetic diversity analysis and for generating linkage maps in plant and animal populations. Easy plotting functions and multiplexing-friendly capabilities are some of the strengths of this R package. Fragment analysis of a unique set of cranberry (Vaccinium macrocarpon) genotypes based on microsatellite markers is used to highlight the capabilities of Fragman. Fragman is a valuable new tool for genetic analysis. The package produces equivalent results to other popular software for fragment analysis while possessing unique advantages and the possibility of automation for high-throughput experiments by exploiting the power of R.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gaschen, Brian Keith; Bloch, Jeffrey Joseph; Porter, Reid
Morphological signatures of bulk SNM materials have significant promise, but these potential signatures are not fully utilized. This document describes software tools, collectively called the MAMA (Morphological Analysis for Material Attribution) software, that can help provide robust and accurate quantification of morphological features in bulk material microscopy images (optical, SEM). Although many of the specific tools are not unique to MAMA, the software package has been designed specifically for nuclear material morphological analysis, and is at a point where it can be easily adapted (by Los Alamos or by collaborators) in response to new, different, or changing forensics needs. The current release of the MAMA software includes only the image quantification, description, and annotation functionality. Only limited information on a sample, its pedigree, and its chemistry is recorded inside this part of the software. This decision was based on initial feedback and the fact that there are several analytical chemistry databases being developed within the community. Currently MAMA is a standalone program that can export quantification results in a basic text format that can be imported into other programs such as Excel and Access. There is also a basic report generating feature that produces HTML formatted pages of the same information. We will be working with collaborators to provide better integration of MAMA into their particular systems, databases and workflows.
A Call for Bioimaging Software Usability
Carpenter, Anne E.; Kamentsky, Lee; Eliceiri, Kevin W.
2013-01-01
Bioimaging software developed in a research setting often fails to be widely used by the scientific community. We suggest that, to maximize both the public’s and researchers’ investments, usability should be a more highly valued goal. We describe specific characteristics of usability towards which bioimaging software projects should aim. PMID:22743771
Software Engineering Basics: A Primer for the Project Manager.
1982-06-01
computer software [45, 46]. It is named after Ada Augusta, who is generally credited as having been the first programmer as an assistant to Charles Babbage, and is called, appropriately enough, ADA. The development of one common programming language for tactical software clearly has the potential for
The theory of interface slicing
NASA Technical Reports Server (NTRS)
Beck, Jon
1993-01-01
Interface slicing is a new tool which was developed to facilitate reuse-based software engineering, by addressing the following problems, needs, and issues: (1) size of systems incorporating reused modules; (2) knowledge requirements for program modification; (3) program understanding for reverse engineering; (4) module granularity and domain management; and (5) time and space complexity of conventional slicing. The definition of a form of static program analysis called interface slicing is addressed.
Automating the Transformational Development of Software. Volume 1.
1983-03-01
DRACO system [Neighbors 80] uses meta-rules to derive information about which new transformations will be applicable after a particular transformation has...transformation over another. The new model, as Incorporated in a system called Glitter, explicitly represents transformation goals, methods, and selection...done anew for each new problem (compare this with Neighbor’s Draco system [Neighbors 80] which attempts to reuse domain analysis). o Is the user
NASA Technical Reports Server (NTRS)
Djorgovski, S. G.
1994-01-01
We developed a package to process and analyze the data from the digital version of the Second Palomar Sky Survey. This system, called SKICAT, incorporates the latest in machine learning and expert systems software technology, in order to classify the detected objects objectively and uniformly, and facilitate handling of the enormous data sets from digital sky surveys and other sources. The system provides a powerful, integrated environment for the manipulation and scientific investigation of catalogs from virtually any source. It serves three principal functions: image catalog construction, catalog management, and catalog analysis. Through use of the GID3* Decision Tree artificial induction software, SKICAT automates the process of classifying objects within CCD and digitized plate images. To exploit these catalogs, the system also provides tools to merge them into a large, complex database which may be easily queried and modified when new data or better methods of calibrating or classifying become available. The most innovative feature of SKICAT is the facility it provides to experiment with and apply the latest in machine learning technology to the tasks of catalog construction and analysis. SKICAT provides a unique environment for implementing these tools for any number of future scientific purposes. Initial scientific verification and performance tests have been made using galaxy counts and measurements of galaxy clustering from small subsets of the survey data, and a search for very high redshift quasars. All of the tests were successful and produced new and interesting scientific results. Attachments to this report give detailed accounts of the technical aspects of the SKICAT system, and of some of the scientific results achieved to date. We also developed a user-friendly package for multivariate statistical analysis of small and moderate-size data sets, called STATPROG. The package was tested extensively on a number of real scientific applications and has produced real, published results.
A data analysis expert system for large established distributed databases
NASA Technical Reports Server (NTRS)
Gnacek, Anne-Marie; An, Y. Kim; Ryan, J. Patrick
1987-01-01
A design for a natural language database interface system, called the Deductively Augmented NASA Management Decision support System (DANMDS), is presented. The DANMDS system components have been chosen on the basis of the following considerations: maximal employment of the existing NASA IBM-PC computers and supporting software; local structuring and storing of external data via the entity-relationship model; a natural easy-to-use error-free database query language; user ability to alter query language vocabulary and data analysis heuristic; and significant artificial intelligence data analysis heuristic techniques that allow the system to become progressively and automatically more useful.
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Bednarcyk, Brett A.; Pineda, Evan; Arnold, Steven; Mital, Subodh; Murthy, Pappu; Walton, Owen
2015-01-01
Reported here is a coupling of two NASA-developed codes: CARES (Ceramics Analysis and Reliability Evaluation of Structures) and the MAC/GMC composite material analysis code. The resulting code is called FEAMAC/CARES and is constructed as an Abaqus finite element analysis UMAT (user defined material). Here we describe the FEAMAC/CARES code and show an example problem (taken from the open literature) of a laminated CMC in off-axis loading. FEAMAC/CARES performs stochastic-strength-based damage simulation of the response of a CMC under multiaxial loading, using elastic stiffness reduction of the failed elements.
SMART: A Propositional Logic-Based Trade Analysis and Risk Assessment Tool for a Complex Mission
NASA Technical Reports Server (NTRS)
Ono, Masahiro; Nicholas, Austin; Alibay, Farah; Parrish, Joseph
2015-01-01
This paper introduces a new trade analysis software tool called the Space Mission Architecture and Risk Analysis Tool (SMART). This tool supports a high-level system trade study on a complex mission, such as a potential Mars Sample Return (MSR) mission, in an intuitive and quantitative manner. In a complex mission, a common approach to increase the probability of success is to have redundancy and prepare backups. Quantitatively evaluating the utility of adding redundancy to a system is important but not straightforward, particularly when the failures of parallel subsystems are correlated.
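As a small, non-authoritative illustration of that last point (this is not SMART's model; the numbers are placeholders), two redundant units with correlated Bernoulli failure indicators show how correlation erodes the benefit of redundancy:

```python
def prob_system_failure(p, rho):
    """Probability that both of two redundant units fail.

    p   : failure probability of each unit
    rho : Pearson correlation between the two Bernoulli failure indicators
    """
    # For identically distributed Bernoulli(p) indicators with correlation rho:
    # P(both fail) = p**2 + rho * p * (1 - p)
    return p ** 2 + rho * p * (1 - p)

if __name__ == "__main__":
    p = 0.05
    for rho in (0.0, 0.3, 0.7):
        print(f"rho={rho:.1f}  P(loss of both units)={prob_system_failure(p, rho):.4f}")
    # Independent units give 0.0025; strongly correlated units approach the
    # single-unit failure probability, so the added redundancy buys much less.
```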
Manikandan, Narayanan; Subha, Srinivasan
2016-01-01
The software development life cycle has been characterized by destructive disconnects between activities like planning, analysis, design, and programming. In particular, software developed to produce prediction-based results is always a big challenge for designers. Time series data forecasting, such as currency exchange, stock prices, and weather reports, is an area in which extensive research has been going on for the last three decades. In the initial days, the problems of financial analysis and prediction were solved by statistical models and methods. For the last two decades, a large number of Artificial Neural Network based learning models have been proposed to solve the problems of financial data and to obtain accurate results in predicting future trends and prices. This paper addresses some architectural design related issues for performance improvement through vectorising the strengths of multivariate econometric time series models and Artificial Neural Networks. It provides an adaptive approach for predicting exchange rates, which can be called a hybrid methodology for exchange rate prediction. The framework is tested to assess the accuracy and performance of the parallel algorithms used. PMID:26881271
A Microarray Tool Provides Pathway and GO Term Analysis.
Koch, Martin; Royer, Hans-Dieter; Wiese, Michael
2011-12-01
Analysis of gene expression profiles is no longer exclusively a task for bioinformatics experts. However, gaining statistically significant results is challenging and requires both biological knowledge and computational know-how. Here we present a novel, user-friendly microarray reporting tool called maRt. The software provides access to bioinformatic resources, such as gene ontology terms and biological pathways, by use of the DAVID and BioMart web services. Results are summarized in structured HTML reports, each presenting a different layer of information. In these reports, content from diverse sources is integrated and interlinked. To speed up processing, maRt takes advantage of the multi-core technology of modern desktop computers by using parallel processing. Since the software is built upon an RCP infrastructure, it may serve as a starting point for developers aiming to integrate novel R-based applications. Installer, documentation and various kinds of tutorials are available under the LGPL license at the website of our institute http://www.pharma.uni-bonn.de/www/mart. This software is free for academic use. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Bohler, Anwesha; Eijssen, Lars M T; van Iersel, Martijn P; Leemans, Christ; Willighagen, Egon L; Kutmon, Martina; Jaillard, Magali; Evelo, Chris T
2015-08-23
Biological pathways are descriptive diagrams of biological processes widely used for functional analysis of differentially expressed genes or proteins. Primary data analysis, such as quality control, normalisation, and statistical analysis, is often performed in scripting languages like R, Perl, and Python. Subsequent pathway analysis is usually performed using dedicated external applications. Workflows involving manual use of multiple environments are time consuming and error prone. Therefore, tools are needed that enable pathway analysis directly within the same scripting languages used for primary data analyses. Existing tools have limited capability in terms of available pathway content, pathway editing and visualisation options, and export file formats. Consequently, making the full-fledged pathway analysis tool PathVisio available from various scripting languages will benefit researchers. We developed PathVisioRPC, an XMLRPC interface for the pathway analysis software PathVisio. PathVisioRPC enables creating and editing biological pathways, visualising data on pathways, performing pathway statistics, and exporting results in several image formats in multiple programming environments. We demonstrate PathVisioRPC functionalities using examples in Python. Subsequently, we analyse a publicly available NCBI GEO gene expression dataset studying tumour bearing mice treated with cyclophosphamide in R. The R scripts demonstrate how calls to existing R packages for data processing and calls to PathVisioRPC can directly work together. To further support R users, we have created RPathVisio simplifying the use of PathVisioRPC in this environment. We have also created a pathway module for the microarray data analysis portal ArrayAnalysis.org that calls the PathVisioRPC interface to perform pathway analysis. This module allows users to use PathVisio functionality online without having to download and install the software and exemplifies how the PathVisioRPC interface can be used by data analysis pipelines for functional analysis of processed genomics data. PathVisioRPC enables data visualisation and pathway analysis directly from within various analytical environments used for preliminary analyses. It supports the use of existing pathways from WikiPathways or pathways created using the RPC itself. It also enables automation of tasks performed using PathVisio, making it useful to PathVisio users performing repeated visualisation and analysis tasks. PathVisioRPC is freely available for academic and commercial use at http://projects.bigcat.unimaas.nl/pathvisiorpc.
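Since PathVisioRPC is exposed over standard XML-RPC, a client in any scripting language follows the same calling pattern; the sketch below uses Python's standard library, and the host, port, and the commented-out method name are assumptions rather than the documented API (consult the PathVisioRPC documentation for the actual method signatures):

```python
import xmlrpc.client

# Connect to a running PathVisioRPC server; host and port here are assumptions
# and should be replaced with whatever the local PathVisioRPC instance reports.
server = xmlrpc.client.ServerProxy("http://localhost:7777")

# If the server supports XML-RPC introspection, list the exposed methods first.
try:
    print(server.system.listMethods())
except (xmlrpc.client.Fault, OSError) as err:
    print("Could not introspect the server:", err)

# An actual call then follows the usual XML-RPC pattern, e.g. (hypothetical name):
# server.createPathway("My pathway", "Homo sapiens", "/tmp/output")
```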
Classification of communication signals of the little brown bat
NASA Astrophysics Data System (ADS)
Melendez, Karla V.; Jones, Douglas L.; Feng, Albert S.
2005-09-01
Little brown bats, Myotis lucifugus, are known for their ability to echolocate and for using their echolocation system to navigate, locate, and identify prey. Their echolocation signals have been characterized in detail, but their communication signals are poorly understood despite their widespread use during social interactions. The goal of this study was to characterize the communication signals of little brown bats. Sound recordings were made overnight on five individual bats (housed separately from a large group of captive bats) for 7 nights, using a Pettersson D240x ultrasound detector and a Nagra ARES-BB digital recorder. The spectral and temporal characteristics of recorded sounds were analyzed using BATSOUND software from Pettersson. Sounds were first classified by visual observation of the calls' temporal pattern and spectral composition, and later using an automatic classification scheme based on multivariate statistical parameters in MATLAB. Human- and machine-based analyses revealed five discrete classes of communication signals: downward frequency-modulated calls, constant-frequency calls, broadband noise bursts, broadband chirps, and broadband click trains. Future studies will focus on analysis of the calls' spectrotemporal modulations to discriminate any subclasses that may exist. [Research supported by Grant R01-DC-04998 from the National Institute for Deafness and Communication Disorders.]
NASA Astrophysics Data System (ADS)
Sergio, de los Santos-Villalobos; Claudio, Bravo-Linares; dos Anjos Roberto, Meigikos; Renan, Cardoso; Max, Gibbs; Andrew, Swales; Lionel, Mabit; Gerd, Dercon
Soil erosion is one of the biggest challenges for food production around the world. Many techniques have been used to evaluate and mitigate soil degradation. Nowadays, isotopic techniques are becoming a powerful tool to assess soil apportionment. One of the innovative techniques is Compound Specific Stable Isotope (CSSI) analysis, which has been used to track sediments and identify their sources through the isotopic signature of δ13C in specific fatty acids. The application of this technique to soil apportionment has been developed recently; however, there is a lack of user-friendly software for data processing and interpretation. The aim of this article is to introduce a new open-source tool for working with data sets generated by the CSSI technique to assess soil apportionment, called the CSSIARv1.00 software.
1994-09-01
report for the "Properties of User Interface Software Architectures", draft DISCUS Working Group, Programmers Tutorial, MITRE paper, SEI, Carnegie... execution that we have defined, called asynchronous remote procedure call (ARPC) [15], which allows concurrency in amounts proportional to the amount of... demonstration project to use STARS concepts. IBM is one of the prime contractors... DoD software budget, and the proportion is expected to be increased during the...
Decision Analysis Tools for Volcano Observatories
NASA Astrophysics Data System (ADS)
Hincks, T. H.; Aspinall, W.; Woo, G.
2005-12-01
Staff at volcano observatories are predominantly engaged in scientific activities related to volcano monitoring and instrumentation, data acquisition and analysis. Accordingly, the academic education and professional training of observatory staff tend to focus on these scientific functions. From time to time, however, staff may be called upon to provide decision support to government officials responsible for civil protection. Recognizing that Earth scientists may have limited technical familiarity with formal decision analysis methods, specialist software tools that assist decision support in a crisis should be welcome. A review is given of two software tools that have been under development recently. The first is for probabilistic risk assessment of human and economic loss from volcanic eruptions, and is of practical use in short and medium-term risk-informed planning of exclusion zones, post-disaster response, etc. A multiple branch event-tree architecture for the software, together with a formalism for ascribing probabilities to branches, have been developed within the context of the European Community EXPLORIS project. The second software tool utilizes the principles of the Bayesian Belief Network (BBN) for evidence-based assessment of volcanic state and probabilistic threat evaluation. This is of practical application in short-term volcano hazard forecasting and real-time crisis management, including the difficult challenge of deciding when an eruption is over. An open-source BBN library is the software foundation for this tool, which is capable of combining synoptically different strands of observational data from diverse monitoring sources. A conceptual vision is presented of the practical deployment of these decision analysis tools in a future volcano observatory environment. Summary retrospective analyses are given of previous volcanic crises to illustrate the hazard and risk insights gained from use of these tools.
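As a minimal, self-contained illustration of the evidence-based reasoning a Bayesian Belief Network performs, the fragment below applies Bayes' rule to a single strand of monitoring evidence; the prior and likelihood values are invented for the example and are not taken from the tools reviewed above.

# Toy Bayesian update: probability the volcano is in an eruptive state given one observation.
prior_eruptive = 0.10              # P(eruptive) before the new evidence (assumed)
p_obs_given_eruptive = 0.70        # P(elevated gas flux | eruptive) (assumed)
p_obs_given_quiet = 0.05           # P(elevated gas flux | quiet) (assumed)

evidence = (prior_eruptive * p_obs_given_eruptive
            + (1 - prior_eruptive) * p_obs_given_quiet)
posterior_eruptive = prior_eruptive * p_obs_given_eruptive / evidence
print(f"P(eruptive | elevated gas flux) = {posterior_eruptive:.2f}")   # about 0.61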
A Mechanized Decision Support System for Academic Scheduling.
1986-03-01
an operational system called software. The first step in the development phase is Design. Designers distribute software control by factoring the Data... Subject terms: Scheduling, Decision Support System, Software Design... scheduling system. It will also examine software-design techniques to identify the most appropriate methodology for this problem. Chapter 3 will...
Collaborative Software Development Approach Used to Deliver the New Shuttle Telemetry Ground Station
NASA Technical Reports Server (NTRS)
Kirby, Randy L.; Mann, David; Prenger, Stephen G.; Craig, Wayne; Greenwood, Andrew; Morsics, Jonathan; Fricker, Charles H.; Quach, Son; Lechese, Paul
2003-01-01
United Space Alliance (USA) developed and used a new software development method to meet technical, schedule, and budget challenges faced during the development and delivery of the new Shuttle Telemetry Ground Station at Kennedy Space Center. This method, called Collaborative Software Development, enabled KSC to effectively leverage industrial software and build additional capabilities to meet shuttle system and operational requirements. Application of this method resulted in reduced time to market, reduced development cost, improved product quality, and improved programmer competence, while developing technologies of benefit to a small company in California (AP Labs Inc.). Many modifications were made to the baseline software product (VMEwindow), which improved its quality and functionality. In addition, six new software capabilities were developed, which are the subject of this article and add useful functionality to the VMEwindow environment. These new software programs are written in C or VxWorks and are used in conjunction with other ground station software packages, such as VMEwindow, Matlab, Dataviews, and PVWave. The Space Shuttle Telemetry Ground Station receives frequency-modulated (FM) and pulse-code-modulated (PCM) signals from the shuttle and support equipment. The hardware architecture (see figure) includes Sun workstations connected to multiple PCM- and FM-processing VersaModule Eurocard (VME) chassis. A reflective memory network transports raw data from PCM Processors (PCMPs) to the programmable digital-to-analog (D/A) converters, strip chart recorders, and analysis and controller workstations.
Kiyohara, Kosuke; Wake, Kanako; Watanabe, Soichi; Arima, Takuji; Sato, Yasuto; Kojimahara, Noriko; Taki, Masao; Cardis, Elisabeth; Yamaguchi, Naohito
2018-03-01
This study examined changes in recall accuracy for mobile phone calls over a long period. Japanese students' actual call statuses were monitored for 1 month using software-modified phones (SMPs). Three face-to-face interviews were conducted to obtain information regarding self-reported call status during the monitoring period: first interview: immediately after the monitoring period; second interview: after 10-12 months; third interview: after 48-55 months. Using the SMP records as the "gold standard", phone call recall accuracy was assessed for each interview. Data for 94 participants were analyzed. The number of calls made was underestimated considerably and the duration of calls was overestimated slightly in all interviews. Agreement between self-report and SMP records regarding the number of calls, duration of calls and laterality (i.e., use of the dominant ear while making calls) gradually deteriorated with the increase in the interval following the monitoring period (number of calls: first interview: Pearson's r=0.641, third interview: 0.396; duration of calls: first interview: Pearson's r=0.763, third interview: 0.356; laterality: first interview: weighted-κ=0.677, third interview: 0.448). Thus, recall accuracy for mobile phone calls would be consistently imperfect over a long period, and the results of related epidemiological studies should be interpreted carefully.
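The agreement statistics quoted above can be reproduced on toy data with standard Python libraries; the sketch below uses scipy and scikit-learn, and the example arrays are invented, not the study's measurements.

import numpy as np
from scipy.stats import pearsonr
from sklearn.metrics import cohen_kappa_score

recorded_calls = np.array([12, 30, 7, 55, 21])    # counts logged by the software-modified phones
reported_calls = np.array([8, 20, 5, 40, 15])     # counts recalled at interview
r, _ = pearsonr(recorded_calls, reported_calls)   # Pearson's r for the number of calls

# Laterality coded as ordered categories (0 = left ear, 1 = both, 2 = right ear)
recorded_side = [0, 2, 2, 1, 0]
reported_side = [0, 2, 1, 1, 2]
kappa = cohen_kappa_score(recorded_side, reported_side, weights="linear")  # weighted kappa
print(f"Pearson r = {r:.2f}, weighted kappa = {kappa:.2f}")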
An Introduction to MAMA (Meta-Analysis of MicroArray data) System.
Zhang, Zhe; Fenstermacher, David
2005-01-01
Analyzing microarray data across multiple experiments has been proven advantageous. To support this kind of analysis, we are developing a software system called MAMA (Meta-Analysis of MicroArray data). MAMA utilizes a client-server architecture with a relational database on the server-side for the storage of microarray datasets collected from various resources. The client-side is an application running on the end user's computer that allows the user to manipulate microarray data and analytical results locally. MAMA implementation will integrate several analytical methods, including meta-analysis within an open-source framework offering other developers the flexibility to plug in additional statistical algorithms.
NASA Technical Reports Server (NTRS)
Blackburn, C. L.; Dovi, A. R.; Kurtze, W. L.; Storaasli, O. O.
1981-01-01
A computer software system for the processing and integration of engineering data and programs, called IPAD (Integrated Programs for Aerospace-Vehicle Design), is described. The ability of the system to relieve the engineer of the mundane task of input data preparation is demonstrated by the application of a prototype system to the design, analysis, and/or machining of three simple structures. Future work to further enhance the system's automated data handling and its ability to handle larger and more varied design problems is also presented.
CSM Testbed Development and Large-Scale Structural Applications
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.; Gillian, R. E.; Mccleary, Susan L.; Lotts, C. G.; Poole, E. L.; Overman, A. L.; Macy, S. C.
1989-01-01
A research activity called Computational Structural Mechanics (CSM) conducted at the NASA Langley Research Center is described. This activity is developing advanced structural analysis and computational methods that exploit high-performance computers. Methods are developed in the framework of the CSM Testbed software system and applied to representative complex structural analysis problems from the aerospace industry. An overview of the CSM Testbed methods development environment is presented and some new numerical methods developed on a CRAY-2 are described. Selected application studies performed on the NAS CRAY-2 are also summarized.
Advance Directives and Do Not Resuscitate Orders
... a form. Call a lawyer. Use a computer software package for legal documents. Advance directives and living ... you write by yourself or with a computer software package should follow your state laws. You may ...
A comparison of acoustic monitoring methods for common anurans of the northeastern United States
Brauer, Corinne; Donovan, Therese; Mickey, Ruth M.; Katz, Jonathan; Mitchell, Brian R.
2016-01-01
Many anuran monitoring programs now include autonomous recording units (ARUs). These devices collect audio data for extended periods of time with little maintenance and at sites where traditional call surveys might be difficult. Additionally, computer software programs have grown increasingly accurate at automatically identifying the calls of species. However, increased automation may cause increased error. We collected 435 min of audio data with 2 types of ARUs at 10 wetland sites in Vermont and New York, USA, from 1 May to 1 July 2010. For each minute, we determined presence or absence of 4 anuran species (Hyla versicolor, Pseudacris crucifer, Anaxyrus americanus, and Lithobates clamitans) using 1) traditional human identification versus 2) computer-mediated identification with software package, Song Scope® (Wildlife Acoustics, Concord, MA). Detections were compared with a data set consisting of verified calls in order to quantify false positive, false negative, true positive, and true negative rates. Multinomial logistic regression analysis revealed a strong (P < 0.001) 3-way interaction between the ARU recorder type, identification method, and focal species, as well as a trend in the main effect of rain (P = 0.059). Overall, human surveyors had the lowest total error rate (<2%) compared with 18–31% total errors with automated methods. Total error rates varied by species, ranging from 4% for A. americanus to 26% for L. clamitans. The presence of rain may reduce false negative rates. For survey minutes where anurans were known to be calling, the odds of a false negative were increased when fewer individuals of the same species were calling.
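A minimal sketch of the per-minute scoring described above follows: each minute carries a detector decision and a verified truth value, from which false positive and false negative rates are computed; the data are illustrative only.

detector = [1, 0, 1, 1, 0, 0, 1, 0]   # 1 = software detected the species in that minute
verified = [1, 0, 0, 1, 0, 1, 1, 0]   # 1 = species verified as actually calling

tp = sum(1 for d, v in zip(detector, verified) if d and v)
fp = sum(1 for d, v in zip(detector, verified) if d and not v)
fn = sum(1 for d, v in zip(detector, verified) if not d and v)
tn = sum(1 for d, v in zip(detector, verified) if not d and not v)
total = len(detector)
print(f"false positives: {fp/total:.2f}, false negatives: {fn/total:.2f}, "
      f"total error rate: {(fp + fn)/total:.2f}")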
Analyzing Spacecraft Telecommunication Systems
NASA Technical Reports Server (NTRS)
Kordon, Mark; Hanks, David; Gladden, Roy; Wood, Eric
2004-01-01
Multi-Mission Telecom Analysis Tool (MMTAT) is a C-language computer program for analyzing proposed spacecraft telecommunication systems. MMTAT utilizes parameterized input and computational models that can be run on standard desktop computers to perform fast and accurate analyses of telecommunication links. MMTAT is easy to use and can easily be integrated with other software applications and run as part of almost any computational simulation. It is distributed as either a stand-alone application program with a graphical user interface or a linkable library with a well-defined set of application programming interface (API) calls. As a stand-alone program, MMTAT provides both textual and graphical output. The graphs make it possible to understand, quickly and easily, how telecommunication performance varies with variations in input parameters. A delimited text file that can be read by any spreadsheet program is generated at the end of each run. The API in the linkable-library form of MMTAT enables the user to control simulation software and to change parameters during a simulation run. Results can be retrieved either at the end of a run or by use of a function call at any time step.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stamp, Jason E.; Eddy, John P.; Jensen, Richard P.
Microgrids are a focus of localized energy production that support resiliency, security, local control, and increased access to renewable resources (among other potential benefits). The Smart Power Infrastructure Demonstration for Energy Reliability and Security (SPIDERS) Joint Capability Technology Demonstration (JCTD) program between the Department of Defense (DOD), Department of Energy (DOE), and Department of Homeland Security (DHS) resulted in the preliminary design and deployment of three microgrids at military installations. This paper is focused on the analysis process and supporting software used to determine optimal designs for energy surety microgrids (ESMs) in the SPIDERS project. There are two key pieces of software, an existing software application developed by Sandia National Laboratories (SNL) called Technology Management Optimization (TMO) and a new simulation developed for SPIDERS called the performance reliability model (PRM). TMO is a decision support tool that performs multi-objective optimization over a mixed discrete/continuous search space for which the performance measures are unrestricted in form. The PRM is able to statistically quantify the performance and reliability of a microgrid operating in islanded mode (disconnected from any utility power source). Together, these two software applications were used as part of the ESM process to generate the preliminary designs presented by the SNL-led DOE team to the DOD. Acknowledgements: Sandia National Laboratories and the SPIDERS technical team would like to acknowledge the following for help in the project: Mike Hightower, who has been the key driving force for Energy Surety Microgrids; Juan Torres and Abbas Akhil, who developed the concept of microgrids for military installations; Merrill Smith, U.S. Department of Energy SPIDERS Program Manager; Ross Roley and Rich Trundy from U.S. Pacific Command; Bill Waugaman and Bill Beary from U.S. Northern Command; Tarek Abdallah, Melanie Johnson, and Harold Sanborn of the U.S. Army Corps of Engineers Construction Engineering Research Laboratory; and colleagues from Sandia National Laboratories (SNL) for their reviews, suggestions, and participation in the work.
Numerical Analyses of Subsoil-structure Interaction in Original Non-commercial Software based on FEM
NASA Astrophysics Data System (ADS)
Cajka, R.; Vaskova, J.; Vasek, J.
2018-04-01
For decades attention has been paid to the interaction of foundation structures and subsoil and to the development of interaction models. Given that analytical solutions of subsoil-structure interaction can be deduced only for some simple shapes of load, analytical solutions are increasingly being replaced by numerical solutions (e.g. FEM, the finite element method). Numerical analyses provide greater possibilities for taking into account the real factors involved in subsoil-structure interaction and were also used in this article. This makes it possible to design foundation structures more efficiently while remaining reliable and secure. Currently there are several software packages that can deal with the interaction of foundations and subsoil. It has been demonstrated that non-commercial software called MKPINTER (created by Cajka) provides results appropriately close to actual measured values. In the MKPINTER software, stress-strain analysis of an elastic half-space is performed by means of Gauss numerical integration and the Jacobian of transformation. Input data for the numerical analysis were obtained from an experimental loading test of a concrete slab. The loading was performed using unique experimental equipment constructed at the Faculty of Civil Engineering, VŠB-TU Ostrava. The purpose of this paper is to compare the resulting deformation of the slab with values observed during the experimental loading test.
caGrid 1.0 : an enterprise Grid infrastructure for biomedical research.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oster, S.; Langella, S.; Hastings, S.
To develop software infrastructure that will provide support for discovery, characterization, integrated access, and management of diverse and disparate collections of information sources, analysis methods, and applications in biomedical research. Design: An enterprise Grid software infrastructure, called caGrid version 1.0 (caGrid 1.0), has been developed as the core Grid architecture of the NCI-sponsored cancer Biomedical Informatics Grid (caBIG™) program. It is designed to support a wide range of use cases in basic, translational, and clinical research, including (1) discovery, (2) integrated and large-scale data analysis, and (3) coordinated study. Measurements: The caGrid is built as a Grid software infrastructure and leverages Grid computing technologies and the Web Services Resource Framework standards. It provides a set of core services, toolkits for the development and deployment of new community provided services, and application programming interfaces for building client applications. Results: The caGrid 1.0 was released to the caBIG community in December 2006. It is built on open source components and caGrid source code is publicly and freely available under a liberal open source license. The core software, associated tools, and documentation can be downloaded from the following URL:
Gesture Analysis for Astronomy Presentation Software
NASA Astrophysics Data System (ADS)
Robinson, Marc A.
Astronomy presentation software in a planetarium setting provides a visually stimulating way to introduce varied scientific concepts, including computer science concepts, to a wide audience. However, the underlying computational complexity and opportunities for discussion are often overshadowed by the brilliance of the presentation itself. To bring this discussion back out into the open, a method needs to be developed to make the computer science applications more visible. This thesis introduces the GAAPS system, which endeavors to implement free-hand gesture-based control of astronomy presentation software, with the goal of providing that talking point to begin the discussion of computer science concepts in a planetarium setting. The GAAPS system incorporates gesture capture and analysis in a unique environment presenting unique challenges, and introduces a novel algorithm called a Bounding Box Tree to create and select features for this particular gesture data. This thesis also analyzes several different machine learning techniques to determine a well-suited technique for the classification of this particular data set, with an artificial neural network being chosen as the implemented algorithm. The results of this work will allow for the desired introduction of computer science discussion into the specific setting used, as well as provide for future work pertaining to gesture recognition with astronomy presentation software.
ERIC Educational Resources Information Center
Jolicoeur, Karen; Berger, Dale E.
1986-01-01
Examination of methods used by two software review services in evaluating microcomputer courseware--EPIE (Educational Products Information Exchange) and MicroSIFT (Microcomputer Software and Information for Teachers)--found low correlations between their recommendations for 82 programs. This lack of agreement casts doubts on the usefulness of…
Lessons from 30 Years of Flight Software
NASA Technical Reports Server (NTRS)
McComas, David C.
2015-01-01
This presentation takes a brief historical look at flight software over the past 30 years, extracts lessons learned and shows how many of the lessons learned are embodied in the Flight Software product line called the core Flight System (cFS). It also captures the lessons learned from developing and applying the cFS.
Active Learning through Modeling: Introduction to Software Development in the Business Curriculum
ERIC Educational Resources Information Center
Roussev, Boris; Rousseva, Yvonna
2004-01-01
Modern software practices call for the active involvement of business people in the software process. Therefore, programming has become an indispensable part of the information systems component of the core curriculum at business schools. In this paper, we present a model-based approach to teaching introduction to programming to general business…
Software Application for Computer Aided Vocabulary Learning in a Blended Learning Environment
ERIC Educational Resources Information Center
Essam, Rasha
2010-01-01
This study focuses on the effect of computer-aided vocabulary learning software called "ArabCAVL" on students' vocabulary acquisition. It was hypothesized that students who use the ArabCAVL software in blended learning environment will surpass students who use traditional vocabulary learning strategies in face-to-face learning…
Marathon: An Open Source Software Library for the Analysis of Markov-Chain Monte Carlo Algorithms
Rechner, Steffen; Berger, Annabell
2016-01-01
We present the software library marathon, which is designed to support the analysis of sampling algorithms that are based on the Markov-Chain Monte Carlo principle. The main application of this library is the computation of properties of so-called state graphs, which represent the structure of Markov chains. We demonstrate applications and the usefulness of marathon by investigating the quality of several bounding methods on four well-known Markov chains for sampling perfect matchings and bipartite graphs. In a set of experiments, we compute the total mixing time and several of its bounds for a large number of input instances. We find that the upper bound obtained by the famous canonical path method is often several orders of magnitude larger than the total mixing time and deteriorates with growing input size. In contrast, the spectral bound is found to be a precise approximation of the total mixing time. PMID:26824442
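To make the spectral bound mentioned above concrete, the sketch below computes the standard upper bound t_mix(eps) <= ln(1/(pi_min * eps)) / (1 - lambda*) for a small reversible toy chain; the 3-state transition matrix is invented for illustration and is not one of the chains studied in the paper.

import numpy as np

P = np.array([[0.50, 0.25, 0.25],
              [0.25, 0.50, 0.25],
              [0.25, 0.25, 0.50]])      # symmetric, so the stationary distribution is uniform
pi_min = 1.0 / 3.0
eps = 0.25                              # target total variation distance

eigvals = np.sort(np.abs(np.linalg.eigvals(P)))
lambda_star = eigvals[-2]               # second largest eigenvalue modulus
bound = np.log(1.0 / (pi_min * eps)) / (1.0 - lambda_star)
print(f"lambda* = {lambda_star:.3f}, spectral bound on mixing time = {bound:.1f} steps")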
NASA Technical Reports Server (NTRS)
2002-01-01
Under a Phase II SBIR contract, Kennedy and Lumina Decision Systems, Inc., jointly developed the Schedule and Cost Risk Analysis Modeling (SCRAM) system, based on a version of Lumina's flagship software product, Analytica(R). Acclaimed as "the best single decision-analysis program yet produced" by MacWorld magazine, Analytica is a "visual" tool used in decision-making environments worldwide to build, revise, and present business models, minus the time-consuming difficulty commonly associated with spreadsheets. With Analytica as their platform, Kennedy and Lumina created the SCRAM system in response to NASA's need to identify the importance of major delays in Shuttle ground processing, a critical function in project management and process improvement. As part of the SCRAM development project, Lumina designed a version of Analytica called the Analytica Design Engine (ADE) that can be easily incorporated into larger software systems. ADE was commercialized and utilized in many other developments, including web-based decision support.
NASA Astrophysics Data System (ADS)
Spicakova, H.; Plank, L.; Nilsson, T.; Böhm, J.; Schuh, H.
2011-07-01
The Vienna VLBI Software (VieVS) has been developed at the Institute of Geodesy and Geophysics at TU Vienna since 2008. In this presentation, we introduce the module Vie_glob, the part of VieVS that allows parameter estimation from multiple VLBI sessions in a so-called global solution. We focus on the determination of the terrestrial reference frame (TRF) using all suitable VLBI sessions since 1984. We compare different analysis options, such as the choice of loading corrections or of one of the models for the tropospheric delays. The effect on station heights of neglecting atmosphere loading corrections at the observation level is shown. Time series of station positions (using a previously determined TRF as a priori values) are presented and compared to other estimates of site positions from individual IVS (International VLBI Service for Geodesy and Astrometry) Analysis Centers.
Flight Software Development for the CHEOPS Instrument with the CORDET Framework
NASA Astrophysics Data System (ADS)
Cechticky, V.; Ottensamer, R.; Pasetti, A.
2015-09-01
CHEOPS is an ESA S-class mission dedicated to the precise measurement of radii of already known exoplanets using ultra-high-precision photometry. The instrument flight software controlling the instrument and handling the science data is developed by the University of Vienna using the CORDET Framework offered by P&P Software GmbH. The CORDET Framework provides a generic software infrastructure for PUS-based applications. This paper describes how the framework is used for the CHEOPS application software to provide a consistent solution for the communication and control services, event handling, and FDIR procedures. This approach is innovative in four respects: (a) it is a true third-party reuse; (b) re-use is done at specification, validation and code level; (c) the re-usable assets and their qualification data package are entirely open-source; (d) re-use is based on call-back, with the application developer providing functions which are called by the reusable architecture.
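A minimal sketch of the call-back style of reuse described in point (d) is shown below: the reusable framework owns the control loop and invokes behaviour supplied by the application developer. The names are illustrative placeholders and do not reflect the CORDET API.

def handle_telecommand(packet):
    # Application-provided behaviour, supplied to the reusable framework as a call-back.
    print("executing telecommand", packet)

def framework_event_loop(packets, on_telecommand):
    # The framework, not the application, decides when the call-back runs.
    for packet in packets:
        on_telecommand(packet)

framework_event_loop(["TC(17,1)", "TC(3,5)"], handle_telecommand)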
Software and Algorithms for Biomedical Image Data Processing and Visualization
NASA Technical Reports Server (NTRS)
Talukder, Ashit; Lambert, James; Lam, Raymond
2004-01-01
A new software package equipped with novel image processing algorithms and graphical-user-interface (GUI) tools has been designed for automated analysis and processing of large amounts of biomedical image data. The software, called PlaqTrak, has been specifically used for analysis of plaque on the teeth of patients. New algorithms have been developed and implemented to segment teeth of interest from surrounding gum, and a real-time image-based morphing procedure is used to automatically overlay a grid onto each segmented tooth. Pattern recognition methods are used to classify plaque from surrounding gum and enamel, while ignoring glare effects due to the reflection of camera light and ambient light from enamel regions. The PlaqTrak system integrates these components into a single software suite with an easy-to-use GUI (see Figure 1) that allows users to do an end-to-end run of a patient's record, including tooth segmentation of all teeth, grid morphing of each segmented tooth, and plaque classification of each tooth image. The automated and accurate processing of the captured images to segment each tooth [see Figure 2(a)] and then detect plaque on a tooth-by-tooth basis is a critical component of the PlaqTrak system for conducting clinical trials and analysis with minimal human intervention. These features offer distinct advantages over other competing systems that analyze groups of teeth or synthetic teeth. PlaqTrak divides each segmented tooth into eight regions using an advanced graphics morphing procedure [see results on a chipped tooth in Figure 2(b)], and a pattern recognition classifier is then used to locate plaque [red regions in Figure 2(d)] and enamel regions. The morphing allows analysis within regions of teeth, thereby facilitating detailed statistical analysis such as the amount of plaque present on the biting surfaces of teeth. This software system is applicable to a host of biomedical applications, such as cell analysis and life detection, or robotic applications, such as product inspection or assembly of parts in space and industry.
The definitive analysis of the Bendandi's methodology performed with a specific software
NASA Astrophysics Data System (ADS)
Ballabene, Adriano; Pescerelli Lagorio, Paola; Georgiadis, Teodoro
2015-04-01
The presentation aims to clarify the "Bendandi method", which was supposed in the past to be able to forecast earthquakes and which the geophysicist from Faenza never expressly passed on to posterity. The geoethics implications of Bendandi's forecasts, and of the speculation about possible earthquakes inferred from supposed "Bendandian" methodologies, emerged in previous years through social alarms over predicted earthquakes that never happened but were widely spread by the media, following some "well informed" non-conventional scientists. The analysis was conducted through an extensive search of the "Raffaele Bendandi" archive at the Geophysical Observatory of Faenza, and the forecasts were analyzed using specially developed software, called the "Bendandiano Dashboard", that can reproduce the planetary configurations reported in the graphs made by the Italian geophysicist. This analysis should serve to clarify definitively what the basis of Bendandi's calculations is, as well as to prevent future unwarranted warnings issued on the basis of supposed prophecies and illusory legacy documents.
Student Evaluation of CALL Tools during the Design Process
ERIC Educational Resources Information Center
Nesbitt, Dallas
2013-01-01
This article discusses the comparative effectiveness of student input at different times during the design of CALL tools for learning kanji, the Japanese characters of Chinese origin. The CALL software "package" consisted of tools to facilitate the writing, reading and practising of kanji characters in context. A pre-design questionnaire…
Management Aspects of Software Maintenance.
1984-09-01
educated in the complex nature of software maintenance to be able to properly evaluate and manage the software maintenance effort. In this... maintenance and improvement may be called "software evolution". The software manager must be educated in the complex nature of software maintenance to be... complaint of error or request for modification is also studied in order to determine what action needs to be taken. 2. Define Objective and Approach:
Reliability Engineering for Service Oriented Architectures
2013-02-01
Common Object Request Broker Architecture. Ecosystem: In software, an ecosystem is a set of applications and/or services that gradually build up over time... Enterprise Service Bus. Foreign: In an SOA context, any SOA, service or software which the owners of the calling software do not have control of, either... SOA: Service Oriented Architecture. SRE: Software Reliability Engineering. System Mode: Many systems exhibit different modes of operation, e.g. the cockpit
NASA Astrophysics Data System (ADS)
Moritzer, Elmar; Müller, Ellen; Martin, Yannick; Kleeschulte, Rainer
2015-05-01
Today the global market poses great challenges for industrial product development. Complexity, diversity of variants, flexibility and individuality are just some of the features that products have to offer today. In addition, product series have shorter lifetimes. Because of their high capacity for adaptation, polymers are increasingly able to displace traditional materials such as wood, glass and metals from various fields of application. Polymers can only be used to substitute other materials, however, if they are optimally suited to the applications in question. Hence, product-specific material development is becoming increasingly important. Integrating the compounding step into the injection moulding process permits a more efficient and faster development process for a new polymer formulation, making it possible to create new product-specific materials. This process is called inline compounding on an injection moulding machine. The entire process sequence is supported by software from Bayer Technology called Product Design Workbench (PDWB), which provides assistance in all the individual steps from data management, via analysis and model compilation, right through to the optimization of the formulation and the design of experiments. The software is based on artificial neural networks and can model the formulation-property correlations, thus enabling different formulations to be optimized. In the study presented here, the workflow and the modelling with the software are described.
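The formulation-property modelling mentioned above can be imitated, very roughly, with a small neural-network regressor; the sketch below uses scikit-learn on random placeholder data and is not the PDWB software.

import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.uniform(size=(60, 3))       # e.g. three formulation variables (fractions, assumed)
y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + 0.5 * X[:, 2] + rng.normal(0.0, 0.05, 60)  # surrogate property

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
model.fit(X, y)
candidate = [[0.3, 0.2, 0.1]]       # a hypothetical new formulation
print("predicted property:", model.predict(candidate)[0])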
2010-03-01
submenus and toolbar with icon buttons. 4. The IFOTA shall conform to the Defense Information Infrastructure Common Operating Environment (DII COE) and... him my business card, but it might come in the package we request via AFRL). PSYOP Instructor: IWST is now called IWT (??) SME MD MD Instructor... Engineering and Software Engineering, CTA Cognitive Task Analysis, DII COE Defense Information Infrastructure Common Operating Environment, EJB Enterprise Java
DOE Office of Scientific and Technical Information (OSTI.GOV)
Poliakoff, David; Legendre, Matt
2017-03-29
GOTCHA is a runtime API for intercepting function calls between shared libraries. It is intended to be used by HPC tools (i.e., performance analysis tools like Open/SpeedShop, HPCToolkit, TAU, etc.). These other tools can use Gotcha to intercept interesting functions, such as MPI functions, and collect performance metrics about those functions. We intend for this to be open-source software that gets adopted by other open-source tools that are used at LLNL.
Transient loads analysis for space flight applications
NASA Technical Reports Server (NTRS)
Thampi, S. K.; Vidyasagar, N. S.; Ganesan, N.
1992-01-01
A significant part of the flight readiness verification process involves transient analysis of the coupled Shuttle-payload system to determine the low frequency transient loads. This paper describes a methodology for transient loads analysis and its implementation for the Spacelab Life Sciences Mission. The analysis is carried out using two major software tools - NASTRAN and an external FORTRAN code called EZTRAN. This approach is adopted to overcome some of the limitations of NASTRAN's standard transient analysis capabilities. The method uses Data Recovery Matrices (DRM) to improve computational efficiency. The mode acceleration method is fully implemented in the DRM formulation to recover accurate displacements, stresses, and forces. The advantages of the method are demonstrated through a numerical example.
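For reference, a standard textbook statement of the mode acceleration method mentioned above (not quoted from the paper, and assuming a constrained model without rigid-body modes, with stiffness matrix K, mass matrix M, retained mode shapes \Phi and modal coordinates q) is

u(t) \approx K^{-1} F(t) - K^{-1} M \Phi \ddot{q}(t),

in contrast to the mode displacement approximation u(t) \approx \Phi q(t); the quasi-static term K^{-1} F(t) restores the contribution of the truncated high-frequency modes, which is why displacements, stresses, and forces are recovered more accurately.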
Interdisciplinary analysis procedures in the modeling and control of large space-based structures
NASA Technical Reports Server (NTRS)
Cooper, Paul A.; Stockwell, Alan E.; Kim, Zeen C.
1987-01-01
The paper describes a computer software system called the Integrated Multidisciplinary Analysis Tool, IMAT, that has been developed at NASA Langley Research Center. IMAT provides researchers and analysts with an efficient capability to analyze satellite control systems influenced by structural dynamics. Using a menu-driven interactive executive program, IMAT links a relational database to commercial structural and controls analysis codes. The paper describes the procedures followed to analyze a complex satellite structure and control system. The codes used to accomplish the analysis are described, and an example is provided of an application of IMAT to the analysis of a reference space station subject to a rectangular pulse loading at its docking port.
NASA Software Assurance's Roles in Research and Technology
NASA Technical Reports Server (NTRS)
Wetherholt, Martha
2010-01-01
This slide presentation reviews the interactions between the scientists and engineers doing research and technology work and the software developers and others who are doing software assurance. There is a discussion of the role of Safety and Mission Assurance (SMA) in developing software to be used for research and technology, and of the importance of this role as the technology moves to higher technology readiness levels (TRLs). There is also a call to change the way software is developed.
Proposing an Evidence-Based Strategy for Software Requirements Engineering.
Lindoerfer, Doris; Mansmann, Ulrich
2016-01-01
This paper discusses an evidence-based approach to software requirements engineering. The approach is called evidence-based, since it uses publications on the specific problem as a surrogate for stakeholder interests, to formulate risks and testing experiences. This complements the idea that agile software development models are more relevant, in which requirements and solutions evolve through collaboration between self-organizing cross-functional teams. The strategy is exemplified and applied to the development of a Software Requirements list used to develop software systems for patient registries.
Basic to Advanced InSAR Processing: GMTSAR
NASA Astrophysics Data System (ADS)
Sandwell, D. T.; Xu, X.; Baker, S.; Hogrelius, A.; Mellors, R. J.; Tong, X.; Wei, M.; Wessel, P.
2017-12-01
Monitoring crustal deformation using InSAR is becoming a standard technique for the science and application communities. Optimal use of the new data streams from Sentinel-1 and NISAR will require open software tools as well as education on the strengths and limitations of the InSAR methods. Over the past decade we have developed freely available, open-source software for processing InSAR data. The software relies on the Generic Mapping Tools (GMT) for the back-end data analysis and display and is thus called GMTSAR. With startup funding from NSF, we accelerated the development of GMTSAR to include more satellite data sources and provide better integration and distribution with GMT. In addition, with support from UNAVCO we have offered 6 GMTSAR short courses to educate mostly novice InSAR users. Currently, the software is used by hundreds of scientists and engineers around the world to study deformation at more than 4300 different sites. The most challenging aspect of the recent software development was the transition from image alignment using the cross-correlation method to a completely new alignment algorithm that uses only the precise orbital information to geometrically align images to an accuracy of better than 7 cm. This development was needed to process a new data type that is being acquired by the Sentinel-1A/B satellites. This combination of software and open data is transforming radar interferometry from a research tool into a fully operational time series analysis tool. Over the next 5 years we are planning to continue to broaden the user base through: improved software delivery methods; code hardening; better integration with data archives; support for high level products being developed for NISAR; and continued education and outreach.
UWB Tracking Software Development
NASA Technical Reports Server (NTRS)
Gross, Julia; Arndt, Dickey; Ngo, Phong; Phan, Chau; Dusl, John; Ni, Jianjun; Rafford, Melinda
2006-01-01
An Ultra-Wideband (UWB) two-cluster Angle of Arrival (AOA) tracking prototype system is currently being developed and tested at NASA Johnson Space Center for space exploration applications. This talk discusses the software development efforts for this UWB two-cluster AOA tracking system. The role the software plays in this system is to take waveform data from two UWB radio receivers as an input, feed this input into an AOA tracking algorithm, and generate the target position as an output. The architecture of the software (Input/Output Interface and Algorithm Core) will be introduced in this talk. The development of this software has three phases. In Phase I, the software is mostly Matlab driven and calls C++ socket functions to provide the communication links to the radios. This is beneficial in the early stage when it is necessary to frequently test changes in the algorithm. Phase II of the development is to have the software mostly C++ driven and call a Matlab function for the AOA tracking algorithm. This is beneficial in order to send the tracking results to other systems and also to improve the tracking update rate of the system. The third phase is part of future work and is to have the software completely C++ driven with a graphics user interface. This software design enables the fine resolution tracking of the UWB two-cluster AOA tracking system.
OCEAN-PC and a distributed network for ocean data
NASA Technical Reports Server (NTRS)
Mclain, Douglas R.
1992-01-01
The Intergovernmental Oceanographic Commission (IOC) wishes to develop an integrated software package for oceanographic data entry and access in developing countries. The software, called 'OCEAN-PC', would run on low cost PC microcomputers and would encourage and standardize: (1) entry of local ocean observations; (2) quality control of the local data; (3) merging local data with historical data; (4) improved display and analysis of the merged data; and (5) international data exchange. OCEAN-PC will link existing MS-DOS oceanographic programs and data sets with table-driven format conversions. Since many ocean data sets are now being distributed on optical discs (Compact Discs - Read Only Memory, CD-ROM, Mass et al. 1987), OCEAN-PC will emphasize access to CD-ROMs.
Unluturk, Mehmet S
2012-06-01
A nurse call system is an electrically operated system by which patients can call for assistance from a bedside station or from a duty station. An intermittent tone is heard and a corridor lamp located outside the room starts blinking at a slower or faster rate depending on the call's origin. It is essential to alert nurses on time so that they can offer care and comfort without any delay. There are currently many devices available for a nurse call system to improve communication between nurses and patients, such as pagers, RFID (radio frequency identification) badges, wireless phones and so on. To integrate all these devices into an existing nurse call system and make them communicate with each other, we propose software client applications called bridges in this paper. We also propose a Windows server application called SEE (Supervised Event Executive) that delivers messages among these devices. A single hardware dongle is utilized for authentication and copy protection for SEE. Protecting SEE only with the security provided by the dongle is a weak defense against hackers. In this paper, we develop some defense patterns against hackers, such as calculating checksums at runtime, making calls to the dongle from multiple places in the code, and handling errors properly by logging them into a database.
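One of the defensive patterns listed above, calculating checksums at runtime and logging failures, can be sketched as follows; the file path and expected digest are placeholders, and the code is not part of the described product.

import hashlib
import logging

EXPECTED_SHA256 = "0" * 64    # placeholder for the digest recorded at build time

def verify_integrity(path):
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    if digest != EXPECTED_SHA256:
        logging.error("Integrity check failed for %s", path)   # handle the error by logging it
        return False
    return True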
NASA Astrophysics Data System (ADS)
García, Isaías; Benavides, Carmen; Alaiz, Héctor; Alonso, Angel
2013-08-01
This paper describes research on the use of knowledge models (ontologies) for building computer-aided educational software in the field of control engineering. Ontologies are able to represent in the computer a very rich conceptual model of a given domain. This model can later be used for a number of purposes in different software applications. In this study, a domain ontology for the field of lead-lag compensator design has been built and used for automatic exercise generation, graphical user interface population, and interaction with the user at any level of detail, including explanations about why things occur. An application called Onto-CELE (ontology-based control engineering learning environment) uses the ontology to implement a learning environment that can be used for self-directed and lifelong learning purposes. The experience has shown that the use of knowledge models as the basis for educational software applications is capable of showing students the whole complexity of the analysis and design processes at any level of detail. A practical experience with postgraduate students has shown the mentioned benefits and possibilities of the approach.
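As a small numerical illustration of the lead-lag design domain the ontology covers, the sketch below evaluates the frequency response of a first-order lead compensator C(s) = K (s + z) / (s + p) with z < p; the component values are arbitrary and unrelated to the Onto-CELE exercises.

import numpy as np

K, z, p = 2.0, 1.0, 10.0                 # arbitrary lead-compensator parameters (z < p)
w = np.logspace(-1, 3, 5)                # a few frequencies in rad/s
s = 1j * w
C = K * (s + z) / (s + p)
for wi, ci in zip(w, C):
    print(f"w = {wi:8.2f} rad/s  |C| = {abs(ci):5.2f}  phase = {np.degrees(np.angle(ci)):6.1f} deg")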
Designing software for operational decision support through coloured Petri nets
NASA Astrophysics Data System (ADS)
Maggi, F. M.; Westergaard, M.
2017-05-01
Operational support provides, during the execution of a business process, replies to questions such as 'how do I end the execution of the process in the cheapest way?' and 'is my execution compliant with some expected behaviour?' These questions may be asked several times during a single execution and, to answer them, dedicated software components (the so-called operational support providers) need to be invoked. Therefore, an infrastructure is needed to handle multiple providers, maintain data between queries about the same execution and discard information when it is no longer needed. In this paper, we use coloured Petri nets (CPNs) to model and analyse software implementing such an infrastructure. This analysis is needed to clarify the requirements before implementation and to guarantee that the resulting software is correct. To this aim, we present techniques to represent and analyse state spaces with 250 million states on a normal PC. We show how the specified requirements have been implemented as a plug-in of the process mining tool ProM and how the operational support in ProM can be used in combination with an existing operational support provider.
ERIC Educational Resources Information Center
Handley, Zöe
2014-01-01
This paper argues that the goal of Computer-Assisted Language Learning (CALL) research should be to construct a reliable evidence-base with "engineering power" and generality upon which the design of future CALL software and activities can be based. In order to establish such an evidence base for future CALL design, it suggests that CALL…
How to choose the right statistical software?-a method increasing the post-purchase satisfaction.
Cavaliere, Roberto
2015-12-01
Nowadays, we live in the "data era", where the use of statistical or data analysis software is inevitable in any research field. This means that the choice of the right software tool or platform is a strategic issue for a research department. Nevertheless, in many cases decision makers do not pay the right attention to a comprehensive and appropriate evaluation of what the market offers. Indeed, the choice often still depends on a few factors such as the researcher's personal inclination, e.g., which software was used at the university or is already known. This is not wrong in principle, but in some cases it is not enough and might lead to a "dead end" situation, typically after months or years of investment already made in the wrong software. This article, far from being a full and complete guide to statistical software evaluation, aims to illustrate some key points of the decision process and introduce an extended range of factors which can help in making the right choice. There is not enough literature on this topic, which is usually underestimated, both in the traditional literature and even in the so-called "gray literature", even if some documents or short pages can be found online. In any case, there seems to be no common, well-known standpoint on the process of software evaluation from the end user's perspective. We suggest a multi-factor analysis leading to an evaluation matrix, intended as a flexible and customizable tool aimed at providing a clearer picture of the available software alternatives, not in the abstract but in relation to the researcher's own context and needs. This method is the result of about twenty years of the author's experience in evaluating and using technical-computing software, and partially arises from research on these topics carried out as part of a project funded by the European Commission under the Lifelong Learning Programme 2011.
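A minimal sketch of the evaluation-matrix idea suggested above follows; the factors, weights and 1-5 scores are invented placeholders, and in practice they would be defined by the research department's own context and needs.

factors = {"statistical coverage": 0.30, "ease of use": 0.20, "licensing cost": 0.20,
           "support and training": 0.15, "interoperability": 0.15}   # weights sum to 1
scores = {
    "Package A": {"statistical coverage": 5, "ease of use": 3, "licensing cost": 2,
                  "support and training": 4, "interoperability": 4},
    "Package B": {"statistical coverage": 3, "ease of use": 5, "licensing cost": 5,
                  "support and training": 3, "interoperability": 3},
}
for name, s in scores.items():
    total = sum(factors[f] * s[f] for f in factors)
    print(f"{name}: weighted score = {total:.2f}")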
1988-12-01
software development scene is often characterized by: schedule and cost estimates that are grossly inaccurate... c. SPQR Model - Jones; d. COPMO - Thebaut... T. Capers Jones has developed a software cost estimation model called the Software Productivity, Quality, and Reliability (SPQR) model. The basic approach is similar to that of Boehm's... time (in seconds) is simply derived from E by dividing by the Stroud number, S: T = E/S. The value
Unobtrusive integration of data management with fMRI analysis.
Poliakov, Andrew V; Hertzenberg, Xenia; Moore, Eider B; Corina, David P; Ojemann, George A; Brinkley, James F
2007-01-01
This note describes a software utility, called X-batch, which addresses two pressing issues typically faced by functional magnetic resonance imaging (fMRI) neuroimaging laboratories: (1) analysis automation and (2) data management. The first issue is addressed by providing a simple batch-mode processing tool for the popular SPM software package (http://www.fil.ion.ucl.ac.uk/spm/; Wellcome Department of Imaging Neuroscience, London, UK). The second is addressed by transparently recording metadata describing all aspects of the batch job (e.g., subject demographics, analysis parameters, locations and names of created files, date and time of analysis, and so on). These metadata are recorded as instances of an extended version of the Protégé-based Experiment Lab Book ontology created by the Dartmouth fMRI Data Center. The resulting instantiated ontology provides a detailed record of all fMRI analyses performed, and as such can be part of larger systems for neuroimaging data management, sharing, and visualization. The X-batch system is in use in our own fMRI research, and is available for download at http://X-batch.sourceforge.net/.
NASA Astrophysics Data System (ADS)
Beauchamp, James W.
2002-11-01
Software has been developed which enables users to perform time-varying spectral analysis of individual musical tones or successions of them and to perform further processing of the data. The package, called sndan, is freely available in source code, uses EPS graphics for display, and is written in ANSI C for ease of code modification and extension. Two analyzers, a fixed-filter-bank phase vocoder ("pvan") and a frequency-tracking analyzer ("mqan"), constitute the analysis front end of the package. While pvan's output consists of continuous amplitudes and frequencies of harmonics, mqan produces disjoint "tracks." However, another program extracts a fundamental frequency and separates harmonics from the tracks, resulting in a continuous harmonic output. "monan" is a program used to display harmonic data in a variety of formats, perform various spectral modifications, and perform additive resynthesis of the harmonic partials, including possible pitch-shifting and time-scaling. Sounds can also be synthesized according to a musical score using a companion synthesis language, Music 4C. Several other programs in the sndan suite can be used for specialized tasks, such as signal display and editing. Applications of the software include producing specialized sounds for music compositions or psychoacoustic experiments, or serving as a basis for developing new synthesis algorithms.
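The additive resynthesis step described above can be sketched in a few lines of Python: harmonic amplitude and frequency tracks are integrated to phase and summed as sinusoids. The tracks below are synthetic placeholders, not output of the sndan analyzers.

import numpy as np

sr = 44100
t = np.arange(0, 1.0, 1.0 / sr)
f0 = 220.0 * (1.0 + 0.02 * t)                        # slowly rising fundamental (time-varying)
signal = np.zeros_like(t)
for k, amp in enumerate([1.0, 0.5, 0.25], start=1):  # three harmonics with fixed amplitudes
    phase = 2.0 * np.pi * np.cumsum(k * f0) / sr     # integrate frequency to obtain phase
    signal += amp * np.sin(phase)
signal /= np.max(np.abs(signal))                     # normalise before writing to a sound file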
LD2SNPing: linkage disequilibrium plotter and RFLP enzyme mining for tag SNPs
Chang, Hsueh-Wei; Chuang, Li-Yeh; Chang, Yan-Jhu; Cheng, Yu-Huei; Hung, Yu-Chen; Chen, Hsiang-Chi; Yang, Cheng-Hong
2009-01-01
Background Linkage disequilibrium (LD) mapping is commonly used to evaluate markers for genome-wide association studies. Most types of LD software focus strictly on LD analysis and visualization, but lack supporting services for genotyping. Results We developed a freeware called LD2SNPing, which provides a complete package of mining tools for genotyping and LD analysis environments. The software provides SNP ID- and gene-centric online retrievals for SNP information and tag SNP selection from dbSNP/NCBI and HapMap, respectively. Restriction fragment length polymorphism (RFLP) enzyme information for SNP genotype is available to all SNP IDs and tag SNPs. Single and multiple SNP inputs are possible in order to perform LD analysis by online retrieval from HapMap and NCBI. An LD statistics section provides D, D', r2, δQ, ρ, and the P values of the Hardy-Weinberg Equilibrium for each SNP marker, and Chi-square and likelihood-ratio tests for the pair-wise association of two SNPs in LD calculation. Finally, 2D and 3D plots, as well as plain-text output of the results, can be selected. Conclusion LD2SNPing thus provides a novel visualization environment for multiple SNP input, which facilitates SNP association studies. The software, user manual, and tutorial are freely available at . PMID:19500380
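The pairwise LD statistics listed above (D, D' and r2) follow directly from haplotype and allele frequencies; the sketch below uses toy frequencies for two SNPs.

p_A, p_B = 0.6, 0.3      # allele frequencies at the two SNPs (toy values)
p_AB = 0.25              # frequency of the A-B haplotype (toy value)

D = p_AB - p_A * p_B
if D >= 0:
    D_max = min(p_A * (1 - p_B), (1 - p_A) * p_B)
else:
    D_max = min(p_A * p_B, (1 - p_A) * (1 - p_B))
D_prime = D / D_max
r_squared = D ** 2 / (p_A * (1 - p_A) * p_B * (1 - p_B))
print(f"D = {D:.3f}, D' = {D_prime:.3f}, r^2 = {r_squared:.3f}")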
Development of a prototype commonality analysis tool for use in space programs
NASA Technical Reports Server (NTRS)
Yeager, Dorian P.
1988-01-01
A software tool to aid in performing commonality analyses, called Commonality Analysis Problem Solver (CAPS), was designed, and a prototype version (CAPS 1.0) was implemented and tested. The CAPS 1.0 runs in an MS-DOS or IBM PC-DOS environment. The CAPS is designed around a simple input language which provides a natural syntax for the description of feasibility constraints. It provides its users with the ability to load a database representing a set of design items, describe the feasibility constraints on items in that database, and do a comprehensive cost analysis to find the most economical substitution pattern.
EBEX: A Balloon-Borne Telescope for Measuring Cosmic Microwave Background Polarization
NASA Astrophysics Data System (ADS)
Chapman, Daniel
2015-05-01
EBEX is a long-duration balloon-borne (LDB) telescope designed to probe polarization signals in the cosmic microwave background (CMB). It is designed to measure or place an upper limit on the inflationary B-mode signal, a signal predicted by inflationary theories to be imprinted on the CMB by gravitational waves, to detect the effects of gravitational lensing on the polarization of the CMB, and to characterize polarized Galactic foreground emission. The payload consists of a pointed gondola that houses the optics, polarimetry, detectors and detector readout systems, as well as the pointing sensors, control motors, telemetry systems, and data acquisition and flight control computers. Polarimetry is achieved with a rotating half-wave plate and wire grid polarizer. The detectors are sensitive to frequency bands centered on 150, 250, and 410 GHz. EBEX was flown in 2009 from New Mexico as a full system test, and then flown again in December 2012 / January 2013 over Antarctica in a long-duration flight to collect scientific data. In the instrumentation part of this thesis we discuss the pointing sensors and attitude determination algorithms. We also describe the real-time map-making software, "QuickLook", that was custom-designed for EBEX. We devote special attention to the design and construction of the primary pointing sensors, the star cameras, and their custom-designed flight software package, "STARS" (the Star Tracking Attitude Reconstruction Software). In the analysis part of this thesis we describe the current status of the post-flight analysis procedure. We discuss the data structures used in analysis and the pipeline stages related to attitude determination and map making. We also discuss a custom-designed software framework called "LEAP" (the LDB EBEX Analysis Pipeline) that supports most of the analysis pipeline stages.
A Change Impact Analysis to Characterize Evolving Program Behaviors
NASA Technical Reports Server (NTRS)
Rungta, Neha Shyam; Person, Suzette; Branchaud, Joshua
2012-01-01
Change impact analysis techniques estimate the potential effects of changes made to software. Directed Incremental Symbolic Execution (DiSE) is an intraprocedural technique for characterizing the impact of software changes on program behaviors. DiSE first estimates the impact of the changes on the source code using program slicing techniques, and then uses the impact sets to guide symbolic execution to generate path conditions that characterize impacted program behaviors. DiSE, however, cannot reason about the flow of impact between methods and will fail to generate path conditions for certain impacted program behaviors. In this work, we present iDiSE, an extension to DiSE that performs an interprocedural analysis. iDiSE combines static and dynamic calling context information to efficiently generate impacted program behaviors across calling contexts. Information about impacted program behaviors is useful for testing, verification, and debugging of evolving programs. We present a case study of our implementation of the iDiSE algorithm to demonstrate its efficiency at computing impacted program behaviors. Traditional notions of coverage are insufficient for characterizing the testing efforts used to validate evolving program behaviors because they do not take into account the impact of changes to the code. In this work we present novel definitions of impacted coverage metrics that are useful for evaluating the testing effort required to test evolving programs. We then describe how the notions of impacted coverage can be used to configure techniques such as DiSE and iDiSE in order to support regression testing related tasks. We also discuss how DiSE and iDiSE can be configured for debugging, i.e., finding the root cause of errors introduced by changes made to the code. In our empirical evaluation we demonstrate that the configurations of DiSE and iDiSE can be used to support various software maintenance tasks.
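The abstract does not give the formal definitions of the impacted coverage metrics, so the following Python fragment is only a hedged illustration of the general idea: the share of change-impacted program locations that the test suite actually exercised. The location identifiers are hypothetical.

```python
# Illustrative "impacted coverage" style metric: fraction of change-impacted
# statements that the regression tests exercised. Locations are made up.
impacted = {"Foo.bar:12", "Foo.bar:13", "Foo.baz:40", "Util.fmt:7"}
covered_by_tests = {"Foo.bar:12", "Foo.baz:40", "Main.run:3"}

impacted_covered = impacted & covered_by_tests
impacted_coverage = len(impacted_covered) / len(impacted)
print(f"impacted coverage = {impacted_coverage:.0%}")   # 50%
```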
FISH Finder: a high-throughput tool for analyzing FISH images
Shirley, James W.; Ty, Sereyvathana; Takebayashi, Shin-ichiro; Liu, Xiuwen; Gilbert, David M.
2011-01-01
Motivation: Fluorescence in situ hybridization (FISH) is used to study the organization and the positioning of specific DNA sequences within the cell nucleus. Analyzing the data from FISH images is a tedious process that involves an element of subjectivity. Automated FISH image analysis offers savings in time as well as the benefit of objective data analysis. While several FISH image analysis software tools have been developed, they often use a threshold-based segmentation algorithm for nucleus segmentation. As fluorescence signal intensities can vary significantly from experiment to experiment, from cell to cell, and within a cell, threshold-based segmentation is inflexible and often insufficient for automatic image analysis, leading to additional manual segmentation and potential subjective bias. To overcome these problems, we developed a graphical software tool called FISH Finder to automatically analyze FISH images that vary significantly. By posing the nucleus segmentation as a classification problem, a compound Bayesian classifier is employed so that contextual information is utilized, resulting in reliable classification and boundary extraction. This makes it possible to analyze FISH images efficiently and objectively without adjustment of input parameters. Additionally, FISH Finder was designed to analyze the distances between differentially stained FISH probes. Availability: FISH Finder is a standalone MATLAB application and platform independent software. The program is freely available from: http://code.google.com/p/fishfinder/downloads/list Contact: gilbert@bio.fsu.edu PMID:21310746
NASA Technical Reports Server (NTRS)
Hughes, David; Dazzo, Tony
2007-01-01
This viewgraph presentation reviews the use of particle analysis to assist in preparing for the 4th Hubble Space Telescope (HST) Servicing mission, during which the Space Telescope Imaging Spectrograph (STIS) will be repaired. The particle analysis consisted of: finite element mesh creation; black-body viewfactors generated using I-DEAS TMG Thermal Analysis; grey-body viewfactors calculated using the Markov method; particle distribution modeled using an iterative (and time-consuming) Monte Carlo process with in-house software called MASTRAM; differential analysis performed in Excel; and visualization provided by Tecplot and I-DEAS. Several tests were performed and are reviewed: the Conformal Coat Particle Study, the Card Extraction Study, the Cover Fastener Removal Particle Generation Study, and the E-Graf Vibration Particulate Study. The lessons learned during this analysis are also reviewed.
Advanced Online Survival Analysis Tool for Predictive Modelling in Clinical Data Science
Montes-Torres, Julio; Subirats, José Luis; Ribelles, Nuria; Urda, Daniel; Franco, Leonardo; Alba, Emilio; Jerez, José Manuel
2016-01-01
One of the prevailing applications of machine learning is the use of predictive modelling in clinical survival analysis. In this work, we present our view of the current situation of computer tools for survival analysis, stressing the need of transferring the latest results in the field of machine learning to biomedical researchers. We propose a web based software for survival analysis called OSA (Online Survival Analysis), which has been developed as an open access and user friendly option to obtain discrete time, predictive survival models at individual level using machine learning techniques, and to perform standard survival analysis. OSA employs an Artificial Neural Network (ANN) based method to produce the predictive survival models. Additionally, the software can easily generate survival and hazard curves with multiple options to personalise the plots, obtain contingency tables from the uploaded data to perform different tests, and fit a Cox regression model from a number of predictor variables. In the Materials and Methods section, we depict the general architecture of the application and introduce the mathematical background of each of the implemented methods. The study concludes with examples of use showing the results obtained with public datasets. PMID:27532883
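OSA itself is a web application, so as a generic, hedged illustration of the standard analyses mentioned above (Kaplan-Meier curves and Cox regression), the sketch below uses the Python lifelines package, which is not part of OSA, on made-up follow-up data.

```python
# Generic survival-analysis sketch (not OSA code): Kaplan-Meier estimate and
# a Cox proportional hazards fit on invented follow-up data.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

df = pd.DataFrame({
    "time":  [5, 8, 12, 20, 22, 30, 35, 40],   # months of follow-up
    "event": [1, 1, 0, 1, 0, 1, 0, 1],          # 1 = event observed, 0 = censored
    "age":   [62, 70, 55, 64, 48, 71, 59, 66],  # one illustrative covariate
})

kmf = KaplanMeierFitter()
kmf.fit(df["time"], event_observed=df["event"])
print(kmf.survival_function_.tail())

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()
```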
Matpar: Parallel Extensions for MATLAB
NASA Technical Reports Server (NTRS)
Springer, P. L.
1998-01-01
Matpar is a set of client/server software that allows a MATLAB user to take advantage of a parallel computer for very large problems. The user can replace calls to certain built-in MATLAB functions with calls to Matpar functions.
Graziotin, Daniel; Wang, Xiaofeng; Abrahamsson, Pekka
2014-01-01
For more than thirty years, it has been claimed that a way to improve software developers’ productivity and software quality is to focus on people and to provide incentives to make developers satisfied and happy. This claim has rarely been verified in software engineering research, which faces an additional challenge in comparison to more traditional engineering fields: software development is an intellectual activity and is dominated by often-neglected human factors (called human aspects in software engineering research). Among the many skills required for software development, developers must possess high analytical problem-solving skills and creativity for the software construction process. According to psychology research, affective states—emotions and moods—deeply influence the cognitive processing abilities and performance of workers, including creativity and analytical problem solving. Nonetheless, little research has investigated the correlation between the affective states, creativity, and analytical problem-solving performance of programmers. This article echoes the call to employ psychological measurements in software engineering research. We report a study with 42 participants to investigate the relationship between the affective states, creativity, and analytical problem-solving skills of software developers. The results offer support for the claim that happy developers are indeed better problem solvers in terms of their analytical abilities. The following contributions are made by this study: (1) providing a better understanding of the impact of affective states on the creativity and analytical problem-solving capacities of developers, (2) introducing and validating psychological measurements, theories, and concepts of affective states, creativity, and analytical-problem-solving skills in empirical software engineering, and (3) raising the need for studying the human factors of software engineering by employing a multidisciplinary viewpoint. PMID:24688866
Advanced Concept Architecture Design and Integrated Analysis (ACADIA)
2017-11-03
and the vertical drag due to the induced velocity download on the vehicle structure. The propeller blades are assumed to be rigid and therefore any...flapping of the blades is assumed to be negligible. Thus, the tip path plane angle of attack gives an indication of the multicopter attitude when used...The software required to run this printer is called Catalyst EX. Catalyst EX generates an estimated print time with a given STL file. Fixed wing
Bao, Riyue; Hernandez, Kyle; Huang, Lei; Kang, Wenjun; Bartom, Elizabeth; Onel, Kenan; Volchenboum, Samuel; Andrade, Jorge
2015-01-01
Whole exome sequencing has facilitated the discovery of causal genetic variants associated with human diseases at deep coverage and low cost. In particular, the detection of somatic mutations from tumor/normal pairs has provided insights into the cancer genome. Although there is an abundance of publicly-available software for the detection of germline and somatic variants, concordance is generally limited among variant callers and alignment algorithms. Successful integration of variants detected by multiple methods requires in-depth knowledge of the software, access to high-performance computing resources, and advanced programming techniques. We present ExScalibur, a set of fully automated, highly scalable and modulated pipelines for whole exome data analysis. The suite integrates multiple alignment and variant calling algorithms for the accurate detection of germline and somatic mutations with close to 99% sensitivity and specificity. ExScalibur implements streamlined execution of analytical modules, real-time monitoring of pipeline progress, robust handling of errors and intuitive documentation that allows for increased reproducibility and sharing of results and workflows. It runs on local computers, high-performance computing clusters and cloud environments. In addition, we provide a data analysis report utility to facilitate visualization of the results that offers interactive exploration of quality control files, read alignment and variant calls, assisting downstream customization of potential disease-causing mutations. ExScalibur is open-source and is also available as a public image on Amazon cloud.
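ExScalibur's own integration logic is not reproduced in this abstract; the following Python sketch only illustrates one common way to combine several callers' outputs, namely keeping variants supported by a minimum number of callers. Caller names and variant records are invented.

```python
# Illustrative consensus of somatic variant calls from several callers:
# keep variants reported by at least `min_callers` of them.
calls = {
    "caller_1": {("chr1", 12345, "A", "T"), ("chr2", 555, "G", "C")},
    "caller_2": {("chr1", 12345, "A", "T"), ("chr3", 777, "T", "G")},
    "caller_3": {("chr1", 12345, "A", "T"), ("chr2", 555, "G", "C")},
}

def consensus(calls, min_callers=2):
    support = {}
    for caller, variants in calls.items():
        for v in variants:
            support.setdefault(v, set()).add(caller)
    return {v: s for v, s in support.items() if len(s) >= min_callers}

for variant, supporters in sorted(consensus(calls).items()):
    print(variant, sorted(supporters))
```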
Evaluating the Quantitative Capabilities of Metagenomic Analysis Software.
Kerepesi, Csaba; Grolmusz, Vince
2016-05-01
DNA sequencing technologies are applied widely and frequently today to describe metagenomes, i.e., microbial communities in environmental or clinical samples, without the need for culturing them. These technologies usually return short (100-300 base-pairs long) DNA reads, and these reads are processed by metagenomic analysis software that assigns phylogenetic composition information to the dataset. Here we evaluate three metagenomic analysis software packages (AmphoraNet, a webserver implementation of AMPHORA2; MG-RAST; and MEGAN5) for their capability of assigning quantitative phylogenetic information to the data, describing the frequency of appearance of microorganisms of the same taxa in the sample. The difficulties of the task arise from the fact that longer genomes produce more reads from the same organism than shorter genomes, and some software packages assign higher frequencies to species with longer genomes than to those with shorter ones. This phenomenon is called the "genome length bias." Dozens of complex artificial metagenome benchmarks can be found in the literature. Because of the complexity of those benchmarks, it is usually difficult to judge the resistance of a metagenomic software tool to this "genome length bias." Therefore, we have made a simple benchmark for the evaluation of "taxon-counting" in a metagenomic sample: we took the same number of copies of three full bacterial genomes of different lengths, broke them up randomly into short reads with an average length of 150 bp, and mixed the reads, creating our simple benchmark. Because of its simplicity, the benchmark is not supposed to serve as a mock metagenome, but if a software tool fails on that simple task, it will surely fail on most real metagenomes. We applied the three software packages to the benchmark. The ideal quantitative solution would assign the same proportion to the three bacterial taxa. We found that AMPHORA2/AmphoraNet gave the most accurate results, while the other two packages under-performed: they assigned each short read quite reliably to its respective taxon, but in doing so produced the typical genome length bias. The benchmark dataset is available at http://pitgroup.org/static/3RandomGenome-100kavg150bps.fna.
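A short worked example of the genome length bias and of the benchmark's expected outcome, with round illustrative genome lengths rather than the benchmark's actual genomes:

```python
# Illustrative "genome length bias": equal genome copies, read counts
# roughly proportional to genome length, and a length-normalized correction.
genome_lengths = {"taxon_A": 2_000_000, "taxon_B": 4_000_000, "taxon_C": 6_000_000}
copies = 100            # same number of copies of each genome
read_length = 150

reads = {t: copies * length // read_length for t, length in genome_lengths.items()}
total = sum(reads.values())
print({t: round(n / total, 2) for t, n in reads.items()})
# raw read proportions ~ {A: 0.17, B: 0.33, C: 0.50}  (length-biased)

# Normalizing read counts by genome length recovers the equal 1/3 proportions
norm = {t: reads[t] / genome_lengths[t] for t in reads}
s = sum(norm.values())
print({t: round(v / s, 2) for t, v in norm.items()})
```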
A database for TMT interface control documents
NASA Astrophysics Data System (ADS)
Gillies, Kim; Roberts, Scott; Brighton, Allan; Rogers, John
2016-08-01
The TMT Software System consists of software components that interact with one another through a software infrastructure called TMT Common Software (CSW). CSW consists of software services and library code that is used by developers to create the subsystems and components that participate in the software system. CSW also defines the types of components that can be constructed and their roles. The use of common component types and shared middleware services allows standardized software interfaces for the components. A software system called the TMT Interface Database System was constructed to support the documentation of the interfaces for components based on CSW. The programmer describes a subsystem and each of its components using JSON-style text files. A command interface file describes each command a component can receive and any commands a component sends. The event interface files describe status, alarms, and events a component publishes and status and events subscribed to by a component. A web application was created to provide a user interface for the required features. Files are ingested into the software system's database. The user interface allows browsing subsystem interfaces, publishing versions of subsystem interfaces, and constructing and publishing interface control documents that consist of the intersection of two subsystem interfaces. All published subsystem interfaces and interface control documents are versioned for configuration control and follow the standard TMT change control processes. Subsystem interfaces and interface control documents can be visualized in the browser or exported as PDF files.
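The abstract does not reproduce the CSW schema, so the fragment below is only a hypothetical illustration of what a JSON-style command-interface description for one component might look like; the subsystem, component, command, and field names are all invented.

```python
# Hypothetical JSON-style command-interface description (not the real schema).
import json

command_interface = {
    "subsystem": "TCS",
    "component": "mountAssembly",
    "receives": [
        {
            "name": "setTargetPosition",
            "description": "Slew the mount to the requested az/el position",
            "parameters": [
                {"name": "az", "type": "double", "units": "deg"},
                {"name": "el", "type": "double", "units": "deg"},
            ],
        }
    ],
    "sends": [
        {"name": "trackingEnabled", "to": "pointingKernel"},
    ],
}

print(json.dumps(command_interface, indent=2))
```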
Producing genome structure populations with the dynamic and automated PGS software.
Hua, Nan; Tjong, Harianto; Shin, Hanjun; Gong, Ke; Zhou, Xianghong Jasmine; Alber, Frank
2018-05-01
Chromosome conformation capture technologies such as Hi-C are widely used to investigate the spatial organization of genomes. Because genome structures can vary considerably between individual cells of a population, interpreting ensemble-averaged Hi-C data can be challenging, in particular for long-range and interchromosomal interactions. We pioneered a probabilistic approach for the generation of a population of distinct diploid 3D genome structures consistent with all the chromatin-chromatin interaction probabilities from Hi-C experiments. Each structure in the population is a physical model of the genome in 3D. Analysis of these models yields new insights into the causes and the functional properties of the genome's organization in space and time. We provide a user-friendly software package, called PGS, which runs on local machines (for practice runs) and high-performance computing platforms. PGS takes a genome-wide Hi-C contact frequency matrix, along with information about genome segmentation, and produces an ensemble of 3D genome structures entirely consistent with the input. The software automatically generates an analysis report, and provides tools to extract and analyze the 3D coordinates of specific domains. Basic Linux command-line knowledge is sufficient for using this software. A typical running time of the pipeline is ∼3 d with 300 cores on a computer cluster to generate a population of 1,000 diploid genome structures at topological-associated domain (TAD)-level resolution.
Proteomics Quality Control: Quality Control Software for MaxQuant Results.
Bielow, Chris; Mastrobuoni, Guido; Kempa, Stefan
2016-03-04
Mass spectrometry-based proteomics coupled to liquid chromatography has matured into an automatized, high-throughput technology, producing data on the scale of multiple gigabytes per instrument per day. Consequently, an automated quality control (QC) and quality analysis (QA) capable of detecting measurement bias, verifying consistency, and avoiding propagation of error is paramount for instrument operators and scientists in charge of downstream analysis. We have developed an R-based QC pipeline called Proteomics Quality Control (PTXQC) for bottom-up LC-MS data generated by the MaxQuant software pipeline. PTXQC creates a QC report containing a comprehensive and powerful set of QC metrics, augmented with automated scoring functions. The automated scores are collated to create an overview heatmap at the beginning of the report, giving valuable guidance also to nonspecialists. Our software supports a wide range of experimental designs, including stable isotope labeling by amino acids in cell culture (SILAC), tandem mass tags (TMT), and label-free data. Furthermore, we introduce new metrics to score MaxQuant's Match-between-runs (MBR) functionality by which peptide identifications can be transferred across Raw files based on accurate retention time and m/z. Last but not least, PTXQC is easy to install and use and represents the first QC software capable of processing MaxQuant result tables. PTXQC is freely available at https://github.com/cbielow/PTXQC .
A Practical Software Architecture for Virtual Universities
ERIC Educational Resources Information Center
Xiang, Peifeng; Shi, Yuanchun; Qin, Weijun
2006-01-01
This article introduces a practical software architecture called CUBES, which focuses on system integration and evolvement for online virtual universities. The key of CUBES is a supporting platform that helps to integrate and evolve heterogeneous educational applications developed by different organizations. Both standardized educational…
Object-Oriented Multi-Disciplinary Design, Analysis, and Optimization Tool
NASA Technical Reports Server (NTRS)
Pak, Chan-gi
2011-01-01
An Object-Oriented Optimization (O3) tool was developed that leverages existing tools and practices, and allows the easy integration and adoption of new state-of-the-art software. At the heart of the O3 tool is the Central Executive Module (CEM), which can integrate disparate software packages in a cross platform network environment so as to quickly perform optimization and design tasks in a cohesive, streamlined manner. This object-oriented framework can integrate the analysis codes for multiple disciplines instead of relying on one code to perform the analysis for all disciplines. The CEM was written in FORTRAN and the script commands for each performance index were submitted through the use of the FORTRAN Call System command. In this CEM, the user chooses an optimization methodology, defines objective and constraint functions from performance indices, and provides starting and side constraints for continuous as well as discrete design variables. The structural analysis modules such as computations of the structural weight, stress, deflection, buckling, and flutter and divergence speeds have been developed and incorporated into the O3 tool to build an object-oriented Multidisciplinary Design, Analysis, and Optimization (MDAO) tool.
Software Accelerates Computing Time for Complex Math
NASA Technical Reports Server (NTRS)
2014-01-01
Ames Research Center awarded Newark, Delaware-based EM Photonics Inc. SBIR funding to utilize graphic processing unit (GPU) technology, traditionally used for computer video games, to develop high-computing software called CULA. The software gives users the ability to run complex algorithms on personal computers with greater speed. As a result of the NASA collaboration, the number of employees at the company has increased 10 percent.
Advanced Transport Operating System (ATOPS) utility library software description
NASA Technical Reports Server (NTRS)
Clinedinst, Winston C.; Slominski, Christopher J.; Dickson, Richard W.; Wolverton, David A.
1993-01-01
The individual software processes used in the flight computers on-board the Advanced Transport Operating System (ATOPS) aircraft have many common functional elements. A library of commonly used software modules was created for general uses among the processes. The library includes modules for mathematical computations, data formatting, system database interfacing, and condition handling. The modules available in the library and their associated calling requirements are described.
Henschel, Volkmar; Engel, Jutta; Hölzel, Dieter; Mansmann, Ulrich
2009-02-10
Multivariate analysis of interval censored event data based on classical likelihood methods is notoriously cumbersome. Likelihood inference for models which additionally include random effects is not available at all. Existing algorithms pose problems for practical users, such as matrix inversion, slow convergence, and no assessment of statistical uncertainty. MCMC procedures combined with imputation are used to implement hierarchical models for interval censored data within a Bayesian framework. Two examples from clinical practice demonstrate the handling of clustered interval censored event times as well as multilayer random effects for inter-institutional quality assessment. The software developed is called survBayes and is freely available at CRAN. The proposed software supports the solution of complex analyses in many fields of clinical epidemiology as well as health services research.
ATTIRE (analytical tools for thermal infrared engineering): A sensor simulation and modeling package
NASA Astrophysics Data System (ADS)
Jaggi, S.
1993-02-01
The Advanced Sensor Development Laboratory (ASDL) at the Stennis Space Center develops, maintains and calibrates remote sensing instruments for the National Aeronautics & Space Administration (NASA). To perform system design trade-offs and analyses, and to establish system parameters, ASDL has developed a software package for analytical simulation of sensor systems. This package, called 'Analytical Tools for Thermal InfraRed Engineering' (ATTIRE), simulates the various components of a sensor system. The software allows each subsystem of the sensor to be analyzed independently for its performance. These performance parameters are then integrated to obtain system-level information such as Signal-to-Noise Ratio (SNR), Noise Equivalent Radiance (NER), Noise Equivalent Temperature Difference (NETD), etc. This paper describes the uses of the package and the physics that were used to derive the performance parameters.
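ATTIRE's internal models are not shown here, but the system-level figures it reports are related in standard ways; for example, NETD can be obtained from NER divided by the derivative of band radiance with respect to scene temperature. The Python sketch below illustrates that relation with the Planck law evaluated at the band centre; the NER value, band, and temperature are assumed numbers, not ATTIRE output.

```python
# Illustrative relation NETD ~ NER / (dL/dT), with dL/dT estimated from the
# Planck law at the band centre. All numbers are assumptions for illustration.
import math

H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temp_k):
    """Spectral radiance in W / (m^2 sr m)."""
    a = 2.0 * H * C ** 2 / wavelength_m ** 5
    b = H * C / (wavelength_m * KB * temp_k)
    return a / (math.exp(b) - 1.0)

def netd(ner, wavelength_um, bandwidth_um, temp_k=300.0, dT=0.1):
    """NETD = NER / (dL/dT) for a narrow band around wavelength_um."""
    wl = wavelength_um * 1e-6
    bw = bandwidth_um * 1e-6
    dL_dT = (planck_radiance(wl, temp_k + dT) -
             planck_radiance(wl, temp_k - dT)) / (2 * dT) * bw
    return ner / dL_dT

# Assumed NER of 5e-3 W m^-2 sr^-1 for a 2-um-wide band centred at 11 um:
print(f"NETD = {netd(5e-3, 11.0, 2.0):.3f} K")
```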
System for NIS Forecasting Based on Ensembles Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
2014-01-02
BMA-NIS is a package/library designed to be called by a script (e.g. Perl or Python). The software itself is written in the language of R. The software assists electric power delivery systems in planning resource availability and demand, based on historical data and current data variables. Net Interchange Schedule (NIS) is the algebraic sum of all energy scheduled to flow into or out of a balancing area during any interval. Accurate forecasts for NIS are important so that the Area Control Error (ACE) stays within an acceptable limit. To date, there are many approaches for forecasting NIS, but all of these are based on single models that can be sensitive to time-of-day and day-of-week effects.
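As a hedged sketch of the ensemble idea behind this package, and not its actual R implementation, the fragment below combines several hypothetical NIS forecasts with weights derived from each model's recent error; the weighting rule and all numbers are illustrative.

```python
# Illustrative ensemble combination of NIS forecasts (not BMA-NIS code).
import math

# recent squared errors of three hypothetical candidate models
recent_mse = {"hourly_model": 4.0, "daily_model": 9.0, "weekly_model": 16.0}

# simple likelihood-style weights: w_k proportional to exp(-MSE_k / 2)
raw = {m: math.exp(-mse / 2.0) for m, mse in recent_mse.items()}
total = sum(raw.values())
weights = {m: v / total for m, v in raw.items()}

forecasts = {"hourly_model": 120.0, "daily_model": 135.0, "weekly_model": 150.0}  # MW
combined = sum(weights[m] * forecasts[m] for m in forecasts)
print(weights, round(combined, 1))
```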
Automated synthesis and composition of taskblocks for control of manufacturing systems.
Holloway, L E; Guan, X; Sundaravadivelu, R; Ashley, J R
2000-01-01
Automated control synthesis methods for discrete-event systems promise to reduce the time required to develop, debug, and modify control software. Such methods must be able to translate high-level control goals into detailed sequences of actuation and sensing signals. In this paper, we present such a technique. It relies on analysis of a system model, defined as a set of interacting components, each represented as a form of condition system Petri net. Control logic modules, called taskblocks, are synthesized from these individual models. These then interact hierarchically and sequentially to drive the system through specified control goals. The resulting controller is automatically converted to executable control code. The paper concludes with a discussion of a set of software tools developed to demonstrate the techniques on a small manufacturing system.
NASA Astrophysics Data System (ADS)
Hennell, Michael
This chapter relies on experience with tool development gained over the last thirty years. It shows that there are a large number of techniques that contribute to any successful project, and that formality is always the key: a modern software test tool is based on a firm mathematical foundation. After a brief introduction, Section 2 recalls and extends the terminology of Chapter 1. Section 3 discusses the design of different sorts of static and dynamic analysis tools. Nine important issues to be taken into consideration when evaluating such tools are presented in Section 4. Section 5 investigates the interplay between testing and proof. In Section 6, we call for developers to take their own medicine and verify their tools. Finally, we conclude in Section 7 with a summary of our main messages, emphasising the important role of testing.
Long term stability and individual distinctiveness in captive orca vocalizations
NASA Astrophysics Data System (ADS)
Noonan, Michael; Suchak, Malini
2005-04-01
With focus on the question of signature calling in killer whales, recordings from five captive orcas (of Icelandic origin) held at Marineland of Canada were compared. For the present analysis, samples of three different call syllables were selected from recordings made five years apart and from instances in which the identity of the calling whale was unambiguous due to temporary isolation, concomitant bubbling, and/or head nodding. The Raven software package was used to ascertain the frequency range, frequency (max), duration, and timing of maximum and minimum power within each sample. For two of the three call syllables, statistically significant differences were found among the five whales for call length and for the timing of maximums and minimums (p<0.01-0.001). This similarly proved true for nearly all pairwise comparisons between whales, including mother-offspring dyads. By contrast, for three of four whales for which we had sufficient samples, no significant differences were found on any measure between samples taken from the same whales five years apart. These findings therefore support the notion that the voices of individual orcas are distinct from one another in ways that are stable over the course of multiple years.
Advanced Connectivity Analysis (ACA): a Large Scale Functional Connectivity Data Mining Environment.
Chen, Rong; Nixon, Erika; Herskovits, Edward
2016-04-01
Using resting-state functional magnetic resonance imaging (rs-fMRI) to study functional connectivity is of great importance to understand normal development and function as well as a host of neurological and psychiatric disorders. Seed-based analysis is one of the most widely used rs-fMRI analysis methods. Here we describe a freely available large scale functional connectivity data mining software package called Advanced Connectivity Analysis (ACA). ACA enables large-scale seed-based analysis and brain-behavior analysis. It can seamlessly examine a large number of seed regions with minimal user input. ACA has a brain-behavior analysis component to delineate associations among imaging biomarkers and one or more behavioral variables. We demonstrate applications of ACA to rs-fMRI data sets from a study of autism.
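The core of a seed-based analysis is the correlation of a seed region's mean time series with every other voxel. The following sketch shows that computation on synthetic data; it is a generic illustration, not ACA's implementation, and the seed voxel indices are arbitrary.

```python
# Generic seed-based connectivity sketch: Pearson correlation of the seed's
# mean time series against every voxel. Data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n_timepoints, n_voxels = 200, 5000
data = rng.standard_normal((n_timepoints, n_voxels))   # rs-fMRI time series
seed_voxels = [10, 11, 12, 13]                          # assumed seed region

seed_ts = data[:, seed_voxels].mean(axis=1)

# z-score everything; the correlation is then a dot product divided by n
z_data = (data - data.mean(0)) / data.std(0)
z_seed = (seed_ts - seed_ts.mean()) / seed_ts.std()
connectivity_map = z_data.T @ z_seed / n_timepoints     # one r per voxel

print(connectivity_map.shape, connectivity_map[seed_voxels])
```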
Formal verification of mathematical software
NASA Technical Reports Server (NTRS)
Sutherland, D.
1984-01-01
Methods are investigated for formally specifying and verifying the correctness of mathematical software (software which uses floating point numbers and arithmetic). Previous work in the field was reviewed. A new model of floating point arithmetic called the asymptotic paradigm was developed and formalized. Two different conceptual approaches to program verification, the classical Verification Condition approach and the more recently developed Programming Logic approach, were adapted to use the asymptotic paradigm. These approaches were then used to verify several programs; the programs chosen were simplified versions of actual mathematical software.
Investing in Software Sustainment
2015-04-30
colored arrows simply represent a reinforcing loop called the "Bandwagon Effect". This effect simply means that a series of successful missions will... the Software Engineering Institute (SEI) developed a simulation model for analyzing the effects of changes in demand for software sustainment and the corresponding funding decisions. The model
A low-cost PC-based telemetry data-reduction system
NASA Astrophysics Data System (ADS)
Simms, D. A.; Butterfield, C. P.
1990-04-01
The Solar Energy Research Institute's (SERI) Wind Research Branch is using Pulse Code Modulation (PCM) telemetry data-acquisition systems to study horizontal-axis wind turbines. PCM telemetry systems are used in test installations that require accurate multiple-channel measurements taken from a variety of different locations. SERI has found them ideal for use in tests requiring concurrent acquisition of data from multiple channels, and has developed a low-cost, PC-based data-reduction system to facilitate quick, in-the-field multiple-channel data analysis. Called the PC-PCM System, it consists of two basic components. First, AT-compatible hardware boards are used for decoding and combining PCM data streams. Up to four hardware boards can be installed in a single PC, which provides the capability to combine data from four PCM streams directly to PC disk or memory. Each stream can have up to 62 data channels. Second, a software package written for the DOS operating system was developed to simplify data-acquisition control and management. The software provides a quick, easy-to-use interface between the PC and PCM data streams. Called the Quick-Look Data Management Program, it is a comprehensive menu-driven package used to organize, acquire, process, and display information from incoming PCM data streams. This paper describes both hardware and software aspects of the SERI PC-PCM system, concentrating on features that make it useful in an experiment test environment to quickly examine and verify incoming data. Also discussed are problems and techniques associated with PC-based telemetry data acquisition, processing, and real-time display.
Architecture and Implementation of OpenPET Firmware and Embedded Software
Abu-Nimeh, Faisal T.; Ito, Jennifer; Moses, William W.; Peng, Qiyu; Choong, Woon-Seng
2016-01-01
OpenPET is an open source, modular, extendible, and high-performance platform suitable for multi-channel data acquisition and analysis. Due to the flexibility of the hardware, firmware, and software architectures, the platform is capable of interfacing with a wide variety of detector modules not only in medical imaging but also in homeland security applications. Analog signals from radiation detectors share similar characteristics – a pulse whose area is proportional to the deposited energy and whose leading edge is used to extract a timing signal. As a result, a generic design method of the platform is adopted for the hardware, firmware, and software architectures and implementations. The analog front-end is hosted on a module called a Detector Board, where each board can filter, combine, timestamp, and process multiple channels independently. The processed data is formatted and sent through a backplane bus to a module called Support Board, where 1 Support Board can host up to eight Detector Board modules. The data in the Support Board, coming from 8 Detector Board modules, can be aggregated or correlated (if needed) depending on the algorithm implemented or runtime mode selected. It is then sent out to a computer workstation for further processing. The number of channels (detector modules), to be processed, mandates the overall OpenPET System Configuration, which is designed to handle up to 1,024 channels using 16-channel Detector Boards in the Standard System Configuration and 16,384 channels using 32-channel Detector Boards in the Large System Configuration. PMID:27110034
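A small worked check of the channel counts quoted above, assuming eight Detector Boards per Support Board in both configurations (the resulting Support Board counts are inferred here, not stated in the abstract):

```python
# Worked check of the OpenPET channel counts, assuming 8 Detector Boards
# per Support Board; the Support Board counts per configuration are inferred.
configs = {
    "standard": {"channels_per_db": 16, "total_channels": 1_024},
    "large":    {"channels_per_db": 32, "total_channels": 16_384},
}
DBS_PER_SB = 8

for name, cfg in configs.items():
    per_sb = cfg["channels_per_db"] * DBS_PER_SB
    n_sb = cfg["total_channels"] // per_sb
    print(f"{name}: {per_sb} channels per Support Board -> {n_sb} Support Boards")
# standard: 128 channels per SB -> 8 SBs;  large: 256 channels per SB -> 64 SBs
```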
Cake: a bioinformatics pipeline for the integrated analysis of somatic variants in cancer genomes
Rashid, Mamunur; Robles-Espinoza, Carla Daniela; Rust, Alistair G.; Adams, David J.
2013-01-01
Summary: We have developed Cake, a bioinformatics software pipeline that integrates four publicly available somatic variant-calling algorithms to identify single nucleotide variants with higher sensitivity and accuracy than any one algorithm alone. Cake can be run on a high-performance computer cluster or used as a stand-alone application. Availability: Cake is open-source and is available from http://cakesomatic.sourceforge.net/ Contact: da1@sanger.ac.uk Supplementary Information: Supplementary data are available at Bioinformatics online. PMID:23803469
Automated systems for the analysis of meteor spectra: The SMART Project
NASA Astrophysics Data System (ADS)
Madiedo, José M.
2017-09-01
This work analyzes a meteor spectroscopy survey called SMART (Spectroscopy of Meteoroids in the Atmosphere by means of Robotic Technologies), which has been conducted since 2006. In total, 55 spectrographs have been deployed at 10 different locations in Spain with the aim of obtaining information about the chemical nature of meteoroids ablating in the atmosphere. The main improvements in the hardware and the software developed in the framework of this project are described, and some results obtained by these automatic devices are also discussed.
Methods and analysis of realizing randomized grouping.
Hu, Liang-Ping; Bao, Xiao-Lei; Wang, Qi
2011-07-01
Randomization is one of the four basic principles of research design. The meaning of randomization includes two aspects: one is to randomly select samples from the population, which is known as random sampling; the other is to randomly group all the samples, which is called randomized grouping. Randomized grouping can be subdivided into three categories: completely, stratified and dynamically randomized grouping. This article mainly introduces the steps of complete randomization, the definition of dynamic randomization and the realization of random sampling and grouping by SAS software.
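A minimal sketch of complete randomization, the first of the three categories above, written in Python rather than SAS; subject IDs are invented, and each subject is assigned independently, so group sizes are not forced to be equal.

```python
# Illustrative complete randomization: each subject assigned independently.
import random

random.seed(2011)
subjects = [f"S{i:02d}" for i in range(1, 21)]
allocation = {s: random.choice(["treatment", "control"]) for s in subjects}

for subject, group in allocation.items():
    print(subject, group)
```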
Reducing maintenance costs in agreement with CNC machine tools reliability
NASA Astrophysics Data System (ADS)
Ungureanu, A. L.; Stan, G.; Butunoi, P. A.
2016-08-01
Aligning maintenance strategy with reliability is a challenge due to the need to find an optimal balance between them. Because the various methods described in the relevant literature involve laborious calculations or use of software that can be costly, this paper proposes a method that is easier to implement on CNC machine tools. The new method, called Consequence of Failure Analysis (CFA), is based on technical and economic optimization, aimed at obtaining a level of required performance with minimum investment and maintenance costs.
Analytical Design Package (ADP2): A computer aided engineering tool for aircraft transparency design
NASA Technical Reports Server (NTRS)
Wuerer, J. E.; Gran, M.; Held, T. W.
1994-01-01
The Analytical Design Package (ADP2) is being developed as a part of the Air Force Frameless Transparency Program (FTP). ADP2 is an integrated design tool consisting of existing analysis codes and Computer Aided Engineering (CAE) software. The objective of the ADP2 is to develop and confirm an integrated design methodology for frameless transparencies, related aircraft interfaces, and their corresponding tooling. The application of this methodology will generate high confidence for achieving a qualified part prior to mold fabrication. ADP2 is a customized integration of analysis codes, CAE software, and material databases. The primary CAE integration tool for the ADP2 is P3/PATRAN, a commercial-off-the-shelf (COTS) software tool. The open architecture of P3/PATRAN allows customized installations with different applications modules for specific site requirements. Integration of material databases allows the engineer to select a material, and those material properties are automatically called into the relevant analysis code. The ADP2 materials database will be composed of four independent schemas: CAE Design, Processing, Testing, and Logistics Support. The design of ADP2 places major emphasis on the seamless integration of CAE and analysis modules with a single intuitive graphical interface. This tool is being designed to serve and be used by an entire project team, i.e., analysts, designers, materials experts, and managers. The final version of the software will be delivered to the Air Force in Jan. 1994. The Analytical Design Package (ADP2) will then be ready for transfer to industry. The package will be capable of a wide range of design and manufacturing applications.
Software Review: Welcome to the World of Delta Drawing.
ERIC Educational Resources Information Center
King, Charles
1983-01-01
Provided is a review of an educational software program called "Delta Drawing." Included are comments on how the graphics program works, programing features, comparison with LOGO, educational value, and availability. Indicates that as a powerful learning program, it is innovative and imaginative. (JN)
RAPTR-SV: a hybrid method for the detection of structural variants
USDA-ARS?s Scientific Manuscript database
Motivation: Identification of Structural Variants (SV) in sequence data results in a large number of false positive calls using existing software, which overburdens subsequent validation. Results: Simulations using RAPTR-SV and another software package that uses a similar algorithm for SV detection...
An expert system as applied to bridges : software development phase.
DOT National Transportation Integrated Search
1989-01-01
This report describes the results of the third of a four-part study dealing with the use of a computerized expert system to assist bridge engineers in their structures management program. In this phase of the study, software (called DOBES) was writte...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pope, G M
2005-05-03
For a number of years I had the pleasure of teaching Testing Seminars all over the world and meeting and learning from others in our field. Over a twelve year period, I always asked the following questions to Software Developers, Test Engineers, and Managers who took my two or three day seminar on Software Testing: 'When was the first time you heard the word test'? 'Where were you when you first heard the word test'? 'Who said the word test'? 'How did the word test make you feel'? Most of the thousands of responses were similar to 'It was my third grade teacher at school, and I felt nervous and afraid'. Now there were a few exceptions like 'It was my third grade teacher, and I was happy and excited to show how smart I was'. But by and large, my informal survey found that 'testing' is a word to which most people attach negative meanings, based on its historical context. So why is this important to those of us in the software development business? Because I have found that a preponderance of software developers do not get real excited about hearing that the software they just wrote is going to be 'tested' by the Test Group. Typical reactions I have heard over the years run from: 'I'm sure there is nothing wrong with the software, so go ahead and test it, better you find defects than our customers'. to these extremes: 'There is no need to test my software because there is nothing wrong with it'. 'You are not qualified to test my software because you don't know as much as I do about it'. 'If any Test Engineers come into our office again to test our software we will throw them through the third floor window'. So why is there such a strong negative reaction to testing? It is primitive. It goes back to grade school for many of us. It is a negative word that conjures up negative emotions. In other words, 'test' is a four letter word. How many of us associate 'Joy' with 'Test'? Not many. It is hard for most of us to reprogram associations learned at an early age. So what can we do about it (short of hypnotic therapy for software developers)? Well one concept I have used (and still use) is to not call testing 'testing'. Call it something else. Ever wonder why most of the Independent Software Testing groups are called Software Quality Assurance groups? Now you know. Software Quality Assurance is not such a negatively charged phrase, even though Software Quality Assurance is much more than simply testing. It was a real blessing when the concept of Validation and Verification came about for software. Now I define Validation to mean assuring that the product produced does the right thing (usually what the customer wants it to do), and Verification means that the product was built the right way (in accordance with some good design principles and practices). So I have deliberately called the System Test Group the Verification and Validation Group, or V&V Group, as a way of avoiding the negative image problem. I remember once having a conversation with a developer colleague who said, in the heat of battle, that it was fine to V&V his code, just don't test it! Once again V&V includes many things besides testing, but it just doesn't sound like an onerous thing to do to software. In my current job, working at a highly regarded national laboratory with world renowned physicists, I have again encountered the negativity about testing software. Except here they don't take kindly to Software Quality Assurance or Software Verification and Validation either.
After all, software is just a trivial tool to automate algorithms that implement physics models. Testing, SQA, and V&V take time and get in the way of completing ground breaking science experiments. So I have again had to change the name of software testing to something less negative in the physics world. I found (the hard way) that if I requested more time to do software experimentation, the physicist's resistance melted. And so the conversation continues, 'We have time to run more software experiments. Just don't waste any time testing the software'! In case the concept of not calling testing 'testing' appeals to you, and there may be an opportunity for you to take the sting out of the name at your place of employment, I have compiled a table of things that testing could be called besides 'testing'. Of course we can embellish this by adding some good sounding prefixes and suffixes also. To come up with alternate names for testing, pick a word from columns A, B, and C in the table below. For instance Unified Acceptance Trials (A2,B7,C3) or Tailored Observational Demonstration (A6,B5,C5) or Agile Criteria Scoring (A3,B8,C8) or Rapid Requirement Proof (A1,B9,C7) or Satisfaction Assurance (B10,C1). You can probably think of some additional combinations appropriate for your industry.
Gross, Arnd; Ziepert, Marita; Scholz, Markus
2012-01-01
Background Analysis of clinical studies often necessitates multiple graphical representations of the results. Many professional software packages are available for this purpose. Most packages are either only commercially available or hard to use especially if one aims to generate or customize a huge number of similar graphical outputs. We developed a new, freely available software tool called KMWin (Kaplan-Meier for Windows) facilitating Kaplan-Meier survival time analysis. KMWin is based on the statistical software environment R and provides an easy to use graphical interface. Survival time data can be supplied as SPSS (sav), SAS export (xpt) or text file (dat), which is also a common export format of other applications such as Excel. Figures can directly be exported in any graphical file format supported by R. Results On the basis of a working example, we demonstrate how to use KMWin and present its main functions. We show how to control the interface, customize the graphical output, and analyse survival time data. A number of comparisons are performed between KMWin and SPSS regarding graphical output, statistical output, data management and development. Although the general functionality of SPSS is larger, KMWin comprises a number of features useful for survival time analysis in clinical trials and other applications. These are for example number of cases and number of cases under risk within the figure or provision of a queue system for repetitive analyses of updated data sets. Moreover, major adjustments of graphical settings can be performed easily on a single window. Conclusions We conclude that our tool is well suited and convenient for repetitive analyses of survival time data. It can be used by non-statisticians and provides often used functions as well as functions which are not supplied by standard software packages. The software is routinely applied in several clinical study groups. PMID:22723912
A software platform for statistical evaluation of patient respiratory patterns in radiation therapy.
Dunn, Leon; Kenny, John
2017-10-01
The aim of this work was to design and evaluate a software tool for analysis of a patient's respiration, with the goal of optimizing the effectiveness of motion management techniques during radiotherapy imaging and treatment. A software tool which analyses patient respiratory data files (.vxp files) created by the Varian Real-Time Position Management System (RPM) was developed to analyse patient respiratory data. The software, called RespAnalysis, was created in MATLAB and provides four modules, one each for determining respiration characteristics, providing breathing coaching (biofeedback training), comparing pre and post-training characteristics and performing a fraction-by-fraction assessment. The modules analyse respiratory traces to determine signal characteristics and specifically use a Sample Entropy algorithm as the key means to quantify breathing irregularity. Simulated respiratory signals, as well as 91 patient RPM traces were analysed with RespAnalysis to test the viability of using the Sample Entropy for predicting breathing regularity. Retrospective assessment of patient data demonstrated that the Sample Entropy metric was a predictor of periodic irregularity in respiration data, however, it was found to be insensitive to amplitude variation. Additional waveform statistics assessing the distribution of signal amplitudes over time coupled with Sample Entropy method were found to be useful in assessing breathing regularity. The RespAnalysis software tool presented in this work uses the Sample Entropy method to analyse patient respiratory data recorded for motion management purposes in radiation therapy. This is applicable during treatment simulation and during subsequent treatment fractions, providing a way to quantify breathing irregularity, as well as assess the need for breathing coaching. It was demonstrated that the Sample Entropy metric was correlated to the irregularity of the patient's respiratory motion in terms of periodicity, whilst other metrics, such as percentage deviation of inhale/exhale peak positions provided insight into respiratory amplitude regularity. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
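The exact Sample Entropy parameters used by RespAnalysis are not given in this abstract, so the sketch below is a generic SampEn(m, r) implementation with commonly used defaults (m = 2, r = 0.2 times the signal's standard deviation) applied to a synthetic breathing-like signal; it is not the tool's MATLAB code.

```python
# Generic Sample Entropy sketch applied to synthetic "breathing" signals.
import math
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """SampEn(m, r) with r expressed as a fraction of the signal's SD."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()
    n = len(x)
    n_templates = n - m          # same template count for lengths m and m+1

    def matches(length):
        templ = np.array([x[i:i + length] for i in range(n_templates)])
        count = 0
        for i in range(n_templates):
            dist = np.max(np.abs(templ[i + 1:] - templ[i]), axis=1)
            count += int(np.sum(dist <= r))
        return count

    b, a = matches(m), matches(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

t = np.linspace(0, 60, 1500)                      # 60 s sampled at 25 Hz
regular = np.sin(2 * np.pi * 0.25 * t)            # steady ~15 breaths per minute
irregular = regular + 0.4 * np.random.default_rng(1).standard_normal(t.size)
print(sample_entropy(regular), sample_entropy(irregular))  # irregular scores higher
```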
AXAF user interfaces for heterogeneous analysis environments
NASA Technical Reports Server (NTRS)
Mandel, Eric; Roll, John; Ackerman, Mark S.
1992-01-01
The AXAF Science Center (ASC) will develop software to support all facets of data center activities and user research for the AXAF X-ray Observatory, scheduled for launch in 1999. The goal is to provide astronomers with the ability to utilize heterogeneous data analysis packages, that is, to allow astronomers to pick the best packages for doing their scientific analysis. For example, ASC software will be based on IRAF, but non-IRAF programs will be incorporated into the data system where appropriate. Additionally, it is desired to allow AXAF users to mix ASC software with their own local software. The need to support heterogeneous analysis environments is not special to the AXAF project, and therefore finding mechanisms for coordinating heterogeneous programs is an important problem for astronomical software today. The approach to solving this problem has been to develop two interfaces that allow the scientific user to run heterogeneous programs together. The first is an IRAF-compatible parameter interface that provides non-IRAF programs with IRAF's parameter handling capabilities. Included in the interface is an application programming interface to manipulate parameters from within programs, and also a set of host programs to manipulate parameters at the command line or from within scripts. The parameter interface has been implemented to support parameter storage formats other than IRAF parameter files, allowing one, for example, to access parameters that are stored in data bases. An X Windows graphical user interface called 'agcl' has been developed, layered on top of the IRAF-compatible parameter interface, that provides a standard graphical mechanism for interacting with IRAF and non-IRAF programs. Users can edit parameters and run programs for both non-IRAF programs and IRAF tasks. The agcl interface allows one to communicate with any command line environment in a transparent manner and without any changes to the original environment. For example, the authors routinely layer the GUI on top of IRAF, ksh, SMongo, and IDL. The agcl, based on the facilities of a system called Answer Garden, also has sophisticated support for examining documentation and help files, asking questions of experts, and developing a knowledge base of frequently required information. Thus, the GUI becomes a total environment for running programs, accessing information, examining documents, and finding human assistance. Because the agcl can communicate with any command-line environment, most projects can make use of it easily. New applications are continually being found for these interfaces. It is the authors' intention to evolve the GUI and its underlying parameter interface in response to these needs - from users as well as developers - throughout the astronomy community. This presentation describes the capabilities and technology of the above user interface mechanisms and tools. It also discusses the design philosophies guiding the work, as well as hopes for the future.
Goedhart, Geertje; Vrijheid, Martine; Wiart, Joe; Hours, Martine; Kromhout, Hans; Cardis, Elisabeth; Eastman Langer, Chelsea; de Llobet Viladoms, Patricia; Massardier-Pilonchery, Amelie; Vermeulen, Roel
2015-10-01
A newly developed smartphone application was piloted to characterize and validate mobile phone use in young people. Twenty-six volunteers (mean age 17.3 years) from France, Spain, and the Netherlands used a software-modified smartphone for 4 weeks; the application installed on the phone recorded number and duration of calls, data use, laterality, hands-free device usage, and communication system used for both voice calls and data transfer. Upon returning the phone, participants estimated their mobile phone use during those 4 weeks via an interviewer-administered questionnaire. Results indicated that participants on average underestimated the number of calls they made, while they overestimated total call duration. Participants held the phone for about 90% of total call time near the head, mainly on the side of the head they reported as dominant. Some limitations were encountered when comparing reported and recorded data use and speaker use. When applied in a larger sample, information recorded by the smartphone application will be very useful to improve radiofrequency (RF) exposure modeling from mobile phones to be used in epidemiological research. © 2015 Wiley Periodicals, Inc.
NASA Technical Reports Server (NTRS)
Byrnes, D. V.; Carney, P. C.; Underwood, J. W.; Vogt, E. D.
1974-01-01
Development, test, conversion, and documentation of computer software for the analysis of missions to halo orbits about libration points in the Earth-Sun system is reported. The software, consisting of two programs called NOMNAL and ERRAN, is part of the Space Trajectories Error Analysis Programs (STEAP). The program NOMNAL targets a transfer trajectory from Earth on a given launch date to a specified halo orbit on a required arrival date. Either impulsive or finite thrust insertion maneuvers into halo orbit are permitted by the program. The transfer trajectory is consistent with a realistic launch profile input by the user. The second program, ERRAN, conducts error analyses of the targeted transfer trajectory. Measurements including range, Doppler, star-planet angles, and apparent planet diameter are processed in a Kalman-Schmidt filter to determine the trajectory knowledge uncertainty. Execution errors at injection, midcourse correction, and orbit insertion maneuvers are analyzed along with the navigation uncertainty to determine trajectory control uncertainties and fuel-sizing requirements. The program is also capable of generalized covariance analyses.
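ERRAN processes tracking measurements in a Kalman-Schmidt filter to propagate the trajectory knowledge covariance. The snippet below is a generic Kalman measurement-update step of that kind, not the actual STEAP code; the state dimensions, geometry row H, and noise values are invented for illustration.

```python
import numpy as np

# Hypothetical 6-state (position, velocity) knowledge covariance before the update
P = np.diag([100.0, 100.0, 100.0, 0.01, 0.01, 0.01])   # km^2 and (km/s)^2

# One scalar range measurement: H maps the state to the measured quantity
H = np.array([[0.6, 0.8, 0.0, 0.0, 0.0, 0.0]])          # assumed line-of-sight geometry
R = np.array([[0.05 ** 2]])                              # 50 m (0.05 km) range noise

# Standard Kalman gain and covariance update (Joseph form for numerical stability)
K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
I = np.eye(P.shape[0])
P_updated = (I - K @ H) @ P @ (I - K @ H).T + K @ R @ K.T

print(np.sqrt(np.diag(P_updated)))   # 1-sigma knowledge uncertainty after the update
```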
caGrid 1.0: An Enterprise Grid Infrastructure for Biomedical Research
Oster, Scott; Langella, Stephen; Hastings, Shannon; Ervin, David; Madduri, Ravi; Phillips, Joshua; Kurc, Tahsin; Siebenlist, Frank; Covitz, Peter; Shanbhag, Krishnakant; Foster, Ian; Saltz, Joel
2008-01-01
Objective To develop software infrastructure that will provide support for discovery, characterization, integrated access, and management of diverse and disparate collections of information sources, analysis methods, and applications in biomedical research. Design An enterprise Grid software infrastructure, called caGrid version 1.0 (caGrid 1.0), has been developed as the core Grid architecture of the NCI-sponsored cancer Biomedical Informatics Grid (caBIG™) program. It is designed to support a wide range of use cases in basic, translational, and clinical research, including 1) discovery, 2) integrated and large-scale data analysis, and 3) coordinated study. Measurements The caGrid is built as a Grid software infrastructure and leverages Grid computing technologies and the Web Services Resource Framework standards. It provides a set of core services, toolkits for the development and deployment of new community provided services, and application programming interfaces for building client applications. Results The caGrid 1.0 was released to the caBIG community in December 2006. It is built on open source components and caGrid source code is publicly and freely available under a liberal open source license. The core software, associated tools, and documentation can be downloaded from the following URL: https://cabig.nci.nih.gov/workspaces/Architecture/caGrid. Conclusions While caGrid 1.0 is designed to address use cases in cancer research, the requirements associated with discovery, analysis and integration of large scale data, and coordinated studies are common in other biomedical fields. In this respect, caGrid 1.0 is the realization of a framework that can benefit the entire biomedical community. PMID:18096909
caGrid 1.0: an enterprise Grid infrastructure for biomedical research.
Oster, Scott; Langella, Stephen; Hastings, Shannon; Ervin, David; Madduri, Ravi; Phillips, Joshua; Kurc, Tahsin; Siebenlist, Frank; Covitz, Peter; Shanbhag, Krishnakant; Foster, Ian; Saltz, Joel
2008-01-01
To develop software infrastructure that will provide support for discovery, characterization, integrated access, and management of diverse and disparate collections of information sources, analysis methods, and applications in biomedical research. An enterprise Grid software infrastructure, called caGrid version 1.0 (caGrid 1.0), has been developed as the core Grid architecture of the NCI-sponsored cancer Biomedical Informatics Grid (caBIG) program. It is designed to support a wide range of use cases in basic, translational, and clinical research, including 1) discovery, 2) integrated and large-scale data analysis, and 3) coordinated study. The caGrid is built as a Grid software infrastructure and leverages Grid computing technologies and the Web Services Resource Framework standards. It provides a set of core services, toolkits for the development and deployment of new community provided services, and application programming interfaces for building client applications. The caGrid 1.0 was released to the caBIG community in December 2006. It is built on open source components and caGrid source code is publicly and freely available under a liberal open source license. The core software, associated tools, and documentation can be downloaded from the following URL: https://cabig.nci.nih.gov/workspaces/Architecture/caGrid. While caGrid 1.0 is designed to address use cases in cancer research, the requirements associated with discovery, analysis and integration of large scale data, and coordinated studies are common in other biomedical fields. In this respect, caGrid 1.0 is the realization of a framework that can benefit the entire biomedical community.
Faron, Matthew L; Buchan, Blake W; Vismara, Chiara; Lacchini, Carla; Bielli, Alessandra; Gesu, Giovanni; Liebregts, Theo; van Bree, Anita; Jansz, Arjan; Soucy, Genevieve; Korver, John; Ledeboer, Nathan A
2016-03-01
Recently, systems have been developed to create total laboratory automation for clinical microbiology. These systems allow for the automation of specimen processing, specimen incubation, and imaging of bacterial growth. In this study, we used the WASPLab to validate software that discriminates and segregates positive and negative chromogenic methicillin-resistant Staphylococcus aureus (MRSA) plates by recognition of pigmented colonies. A total of 57,690 swabs submitted for MRSA screening were enrolled in the study. Four sites enrolled specimens following their standard of care. Chromogenic agar used at these sites included MRSASelect (Bio-Rad Laboratories, Redmond, WA), chromID MRSA (bioMérieux, Marcy l'Etoile, France), and CHROMagar MRSA (BD Diagnostics, Sparks, MD). Specimens were plated and incubated using the WASPLab. The digital camera took images at 0 and 16 to 24 h, and the WASPLab software determined the presence of positive colonies based on a hue, saturation, and value (HSV) score. If the HSV score fell within a defined threshold, the plate was called positive. The performance of the digital analysis was compared to manual reading. Overall, the digital software had a sensitivity of 100% and a specificity of 90.7%, with the specificity ranging between 90.0% and 96.0% across all sites. The results were similar using the three different agars, with a sensitivity of 100% and specificity ranging between 90.7% and 92.4%. These data demonstrate that automated digital analysis can be used to accurately sort positive from negative chromogenic agar cultures regardless of the pigmentation produced. Copyright © 2016 Faron et al.
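The decision described above amounts to thresholding plate pixels in hue-saturation-value space and counting how many fall inside a pigment window. The sketch below illustrates that idea with OpenCV; the hue window, saturation/value floors, pixel cut-off, and file name are hypothetical placeholders, not the vendor's calibrated HSV score.

```python
import cv2
import numpy as np

def plate_is_positive(image_path, hue_range=(140, 170),
                      min_sat=80, min_val=60, min_pixels=500):
    """Return True when enough pixels fall inside a pigment hue window.
    The hue window, value floors, pixel cut-off and file name are
    illustrative placeholders, not the WASPLab calibration."""
    bgr = cv2.imread(image_path)
    if bgr is None:
        raise FileNotFoundError(image_path)
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)      # OpenCV hue range is 0-179
    lower = np.array([hue_range[0], min_sat, min_val])
    upper = np.array([hue_range[1], 255, 255])
    mask = cv2.inRange(hsv, lower, upper)
    return int(cv2.countNonZero(mask)) >= min_pixels

print(plate_is_positive("mrsa_plate_24h.png"))      # hypothetical plate image
```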
Sousa, Luiz Cláudio Demes da Mata; Filho, Herton Luiz Alves Sales; Von Glehn, Cristina de Queiroz Carrascosa; da Silva, Adalberto Socorro; Neto, Pedro de Alcântara dos Santos; de Castro, José Adail Fonseca; do Monte, Semíramis Jamil Hadad
2011-12-01
The global challenge for solid organ transplantation programs is to distribute organs to highly sensitized recipients. The purpose of this work is to describe and test the functionality of the EpHLA software, a program that automates the analysis of acceptable and unacceptable HLA epitopes on the basis of the HLAMatchmaker algorithm. HLAMatchmaker considers small configurations of polymorphic residues, referred to as eplets, as essential components of HLA epitopes. Currently, the analyses require the creation of temporary files and the manual cutting and pasting of laboratory test results between electronic spreadsheets, which is time-consuming and prone to administrative errors. The EpHLA software was developed in the Object Pascal programming language and uses the HLAMatchmaker algorithm to generate histocompatibility reports. The automated generation of reports requires the integration of files containing the results of laboratory tests (HLA typing, anti-HLA antibody signature) and public data banks (NMDP, IMGT). Integration and access to these data were accomplished by means of a framework called eDAFramework. The eDAFramework was developed in Object Pascal and PHP and provides data access functionalities for software developed in these languages. The tool functionality was successfully tested in comparison to actual, manually derived reports of patients from a renal transplantation program with related donors. We successfully developed software that enables the automated definition of the epitope specificities of HLA antibodies. This new tool will benefit the management of recipient/donor pair selection for highly sensitized patients. Copyright © 2011 Elsevier B.V. All rights reserved.
Conversion of Component-Based Point Definition to VSP Model and Higher Order Meshing
NASA Technical Reports Server (NTRS)
Ordaz, Irian
2011-01-01
Vehicle Sketch Pad (VSP) has become a powerful conceptual and parametric geometry tool with numerous export capabilities for third-party analysis codes as well as robust surface meshing capabilities for computational fluid dynamics (CFD) analysis. However, a capability gap currently exists for reconstructing a fully parametric VSP model of a geometry generated by third-party software. A computer code called GEO2VSP has been developed to close this gap and to allow the integration of VSP into a closed-loop geometry design process with other third-party design tools. Furthermore, the automated CFD surface meshing capabilities of VSP are demonstrated for component-based point definition geometries in a conceptual analysis and design framework.
Monte Carlo based statistical power analysis for mediation models: methods and software.
Zhang, Zhiyong
2014-12-01
The existing literature on statistical power analysis for mediation models often assumes data normality and is based on a less powerful Sobel test instead of the more powerful bootstrap test. This study proposes to estimate statistical power to detect mediation effects on the basis of the bootstrap method through Monte Carlo simulation. Nonnormal data with excessive skewness and kurtosis are allowed in the proposed method. A free R package called bmem is developed to conduct the power analysis discussed in this study. Four examples, including a simple mediation model, a multiple-mediator model with a latent mediator, a multiple-group mediation model, and a longitudinal mediation model, are provided to illustrate the proposed method.
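The abstract describes estimating power for the indirect effect by combining Monte Carlo replications with a bootstrap test. bmem is an R package, so the snippet below is only a language-neutral illustration in Python of that simulate-then-bootstrap loop for a simple X → M → Y model; the effect sizes, sample size, and replication counts are chosen arbitrarily.

```python
import numpy as np

rng = np.random.default_rng(1)

def mediation_power(a=0.3, b=0.3, n=100, n_rep=100, n_boot=200, alpha=0.05):
    """Monte Carlo power for the indirect effect a*b in a simple X -> M -> Y
    model: in each replication, simulate data, bootstrap the indirect effect,
    and count the replications whose percentile CI excludes zero."""
    hits = 0
    for _ in range(n_rep):
        x = rng.normal(size=n)
        m = a * x + rng.normal(size=n)
        y = b * m + rng.normal(size=n)
        ab_boot = np.empty(n_boot)
        for i in range(n_boot):
            idx = rng.integers(0, n, n)
            xb, mb, yb = x[idx], m[idx], y[idx]
            a_hat = np.polyfit(xb, mb, 1)[0]                 # slope of M on X
            design = np.column_stack([np.ones(n), mb, xb])   # Y on M, controlling X
            b_hat = np.linalg.lstsq(design, yb, rcond=None)[0][1]
            ab_boot[i] = a_hat * b_hat
        lo, hi = np.percentile(ab_boot, [100 * alpha / 2, 100 * (1 - alpha / 2)])
        hits += (lo > 0) or (hi < 0)
    return hits / n_rep

print(mediation_power())   # estimated power under the chosen effect sizes
```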
Science on TeacherTube: A Mixed Methods Analysis of Teacher Produced Video
NASA Astrophysics Data System (ADS)
Chmiel, Margaret (Marjee)
Increased bandwidth, inexpensive video cameras and easy-to-use video editing software have made social media sites featuring user generated video (UGV) an increasingly popular vehicle for online communication. As such, UGV have come to play a role in education, both formal and informal, but there has been little research on this topic in scholarly literature. In this mixed-methods study, a content and discourse analysis are used to describe the most successful UGV in the science channel of an education-focused site called TeacherTube. The analysis finds that state achievement tests, and their focus on vocabulary and recall-level knowledge, drive much of the content found on TeacherTube.
Technical design and system implementation of region-line primitive association framework
NASA Astrophysics Data System (ADS)
Wang, Min; Xing, Jinjin; Wang, Jie; Lv, Guonian
2017-08-01
Apart from regions, image edge lines are an important information source, and they deserve more attention in object-based image analysis (OBIA) than they currently receive. In the region-line primitive association framework (RLPAF), we promote straight-edge lines as line primitives to achieve powerful OBIAs. Along with regions, straight lines become basic units for subsequent extraction and analysis of OBIA features. This study develops a new software system called remote-sensing knowledge finder (RSFinder) to implement RLPAF for engineering application purposes. This paper introduces the extended technical framework, a comprehensively designed feature set, key technology, and software implementation. To our knowledge, RSFinder is the world's first OBIA system based on two types of primitives, namely, regions and lines. It is fundamentally different from other well-known region-only-based OBIA systems, such as eCognition and the ENVI feature extraction module. This paper provides an important reference for the development of similarly structured OBIA systems and of line-involved remote sensing information extraction algorithms.
Toolboxes for a standardised and systematic study of glycans
2014-01-01
Background Recent progress in method development for characterising the branched structures of complex carbohydrates has now enabled higher throughput technology. Automation of structure analysis then calls for software development since adding meaning to large data collections in reasonable time requires corresponding bioinformatics methods and tools. Current glycobioinformatics resources do cover information on the structure and function of glycans, their interaction with proteins or their enzymatic synthesis. However, this information is partial, scattered and often difficult to find for non-glycobiologists. Methods Following our diagnosis of the causes of the slow development of glycobioinformatics, we review the "objective" difficulties encountered in defining adequate formats for representing complex entities and developing efficient analysis software. Results Various solutions already implemented and strategies defined to bridge glycobiology with different fields and integrate the heterogeneous glyco-related information are presented. Conclusions Despite the initial stage of our integrative efforts, this paper highlights the rapid expansion of glycomics, the validity of existing resources and the bright future of glycobioinformatics. PMID:24564482
Modeling and Hazard Analysis Using STPA
NASA Astrophysics Data System (ADS)
Ishimatsu, Takuto; Leveson, Nancy; Thomas, John; Katahira, Masa; Miyamoto, Yuko; Nakao, Haruka
2010-09-01
A joint research project between MIT and JAXA/JAMSS is investigating the application of a new hazard analysis to the system and software in the HTV. Traditional hazard analysis focuses on component failures, but software does not fail in this way. Software most often contributes to accidents by commanding the spacecraft into an unsafe state (e.g., turning off the descent engines prematurely) or by not issuing required commands. That makes standard hazard analysis techniques of limited usefulness for software-intensive systems, a category that includes most spacecraft built today. STPA is a new hazard analysis technique based on systems theory rather than reliability theory. It treats safety as a control problem rather than a failure problem. The goal of STPA, which is to create a set of scenarios that can lead to a hazard, is the same as that of FTA, but STPA includes a broader set of potential scenarios, including those in which no failures occur but problems arise due to unsafe and unintended interactions among the system components. STPA also provides more guidance to the analysts than traditional fault tree analysis does. Functional control diagrams are used to guide the analysis. In addition, JAXA uses a model-based system engineering development environment (created originally by Leveson and called SpecTRM), which also assists in the hazard analysis. One of the advantages of STPA is that it can be applied early in the system engineering and development process in a safety-driven design process where hazard analysis drives the design decisions, rather than waiting until reviews identify problems that are then costly or difficult to fix. It can also be applied in an after-the-fact analysis and hazard assessment, which is what we did in this case study. This paper describes the experimental application of STPA to the JAXA HTV in order to determine the feasibility and usefulness of the new hazard analysis technique. Because the HTV was originally developed using fault tree analysis and following the NASA standards for safety-critical systems, the results of our experimental application of STPA can be compared with these more traditional safety engineering approaches in terms of the problems identified and the resources required to use it.
User's Guide for a Modular Flutter Analysis Software System (Fast Version 1.0)
NASA Technical Reports Server (NTRS)
Desmarais, R. N.; Bennett, R. M.
1978-01-01
The use and operation of a group of computer programs to perform a flutter analysis of a single planar wing are described. This system of programs is called FAST for Flutter Analysis System, and consists of five programs. Each program performs certain portions of a flutter analysis and can be run sequentially as a job step or individually. FAST uses natural vibration modes as input data and performs a conventional V-g type of solution. The unsteady aerodynamics programs in FAST are based on the subsonic kernel function lifting-surface theory although other aerodynamic programs can be used. Application of the programs is illustrated by a sample case of a complete flutter calculation that exercises each program.
ERIC Educational Resources Information Center
Lieberth, Ann K.; Martin, Doug R.
1995-01-01
Because of the diversity of clients served by speech-language pathologists and audiologists, available commercial software may not meet all needs. Authoring programs allow the clinician to design software that can be customized for individual clients. This article describes an authoring program called HyperCard and its use in preparing hypermedia…
Tele-EnREDando.com: A Multimedia WEB-CALL Software for Mobile Phones.
ERIC Educational Resources Information Center
Garcia, Jose Carlos
2002-01-01
Presents one of the world's first prototypes of language learning software for smart-phones. Tele-EnREDando.com is an Internet based multimedia application designed for 3G mobile phones with audio, video, and interactive exercises for learning Spanish for business. (Author/VWL)
Poly-Pattern Compressive Segmentation of ASTER Data for GIS
NASA Technical Reports Server (NTRS)
Myers, Wayne; Warner, Eric; Tutwiler, Richard
2007-01-01
Pattern-based segmentation of multi-band image data, such as ASTER, produces one-byte and two-byte approximate compressions. This is a dual segmentation consisting of nested coarser and finer level pattern mappings called poly-patterns. The coarser A-level version is structured for direct incorporation into geographic information systems in the manner of a raster map. GIS renderings of this A-level approximation are called pattern pictures, which have the appearance of color enhanced images. The two-byte version consisting of thousands of B-level segments provides a capability for approximate restoration of the multi-band data in selected areas or entire scenes. Poly-patterns are especially useful for purposes of change detection and landscape analysis at multiple scales. The primary author has implemented the segmentation methodology in a public domain software suite.
An adaptive, object oriented strategy for base calling in DNA sequence analysis.
Giddings, M C; Brumley, R L; Haker, M; Smith, L M
1993-01-01
An algorithm has been developed for the determination of nucleotide sequence from data produced in fluorescence-based automated DNA sequencing instruments employing the four-color strategy. This algorithm takes advantage of object oriented programming techniques for modularity and extensibility. The algorithm is adaptive in that data sets from a wide variety of instruments and sequencing conditions can be used with good results. Confidence values are provided on the base calls as an estimate of accuracy. The algorithm iteratively employs confidence determinations from several different modules, each of which examines a different feature of the data for accurate peak identification. Modules within this system can be added or removed for increased performance or for application to a different task. In comparisons with commercial software, the algorithm performed well. PMID:8233787
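The base caller combines confidence estimates from several feature-specific modules. The sketch below shows one generic way such per-module scores could be merged by a weighted average; the module names, weights, and toy peak data are illustrative, and the published algorithm's actual combination rule may differ.

```python
from typing import Callable, Dict, List

# Each "module" inspects one feature of a candidate peak (height, spacing,
# shape, ...) and returns per-base confidences in [0, 1]; names are illustrative.
Module = Callable[[dict], Dict[str, float]]

def call_base(peak: dict, modules: List[Module], weights: List[float]):
    """Merge per-module confidences by a weighted average and return the
    winning base with its combined confidence."""
    combined = {base: 0.0 for base in "ACGT"}
    for module, weight in zip(modules, weights):
        scores = module(peak)
        for base in combined:
            combined[base] += weight * scores.get(base, 0.0)
    total = sum(weights)
    best = max(combined, key=combined.get)
    return best, combined[best] / total

# Toy modules: one trusts raw peak heights, one rewards regular peak spacing
height_module = lambda p: p["height_scores"]
spacing_module = lambda p: {base: p["spacing_score"] for base in "ACGT"}

peak = {"height_scores": {"A": 0.7, "C": 0.2, "G": 0.05, "T": 0.05},
        "spacing_score": 0.9}
print(call_base(peak, [height_module, spacing_module], [0.7, 0.3]))
```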
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walkup, Elizabeth
This software is an analyzer for automated sandbox analysis of malware on the OS X operating system. It runs inside an OS X virtual machine to collect data about what happens when a given file is opened or run. As of August 2014, there was no sandbox software for Mac OS X malware, as it requires different methods from those used on the Windows OS (which most sandboxes are written for). This software adds OS X analysis capabilities to an existing open-source sandbox, Cuckoo Sandbox (http://cuckoosandbox.org/), which previously only worked for Windows. The analyzer itself can take many different types of files as input: the traditional Mach-O and FAT executables, .app files, zip files, Python scripts, Java archives, and web pages, as well as PDFs and other documents. While the file is running, the analyzer also simulates rudimentary human interaction with clicks and mouse movements in order to bypass the tests some malware use to see if they are being analyzed. The analyzer outputs several different kinds of data: function call traces, network captures, screenshots, and all created and modified files. This work also includes a static analysis Cuckoo module for Mach-O binary files. It extracts file structures, code library imports and exports, and signatures. This data can be used along with the analyzer results to create signatures for malware.
Fracture network evaluation program (FraNEP): A software for analyzing 2D fracture trace-line maps
NASA Astrophysics Data System (ADS)
Zeeb, Conny; Gomez-Rivas, Enrique; Bons, Paul D.; Virgo, Simon; Blum, Philipp
2013-10-01
Fractures, such as joints, faults and veins, strongly influence the transport of fluids through rocks by either enhancing or inhibiting flow. Techniques used for the automatic detection of lineaments from satellite images and aerial photographs, LIDAR technologies and borehole televiewers significantly enhanced data acquisition. The analysis of such data is often performed manually or with different analysis software. Here we present a novel program for the analysis of 2D fracture networks called FraNEP (Fracture Network Evaluation Program). The program was developed using Visual Basic for Applications in Microsoft Excel™ and combines features from different existing software and characterization techniques. The main novelty of FraNEP is the possibility to analyse trace-line maps of fracture networks applying the (1) scanline sampling, (2) window sampling or (3) circular scanline and window method, without the need of switching programs. Additionally, binning problems are avoided by using cumulative distributions, rather than probability density functions. FraNEP is a time-efficient tool for the characterisation of fracture network parameters, such as density, intensity and mean length. Furthermore, fracture strikes can be visualized using rose diagrams and a fitting routine evaluates the distribution of fracture lengths. As an example of its application, we use FraNEP to analyse a case study of lineament data from a satellite image of the Oman Mountains.
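FraNEP reports network parameters such as density, intensity, and mean trace length for a sampled window. The snippet below computes those three quantities for a toy 2D trace-line map; the traces and window area are made up, and FraNEP's scanline and circular-window estimators involve additional corrections not shown here.

```python
import numpy as np

def window_fracture_stats(traces, window_area):
    """Fracture density (traces per unit area), intensity (total trace length
    per unit area) and mean trace length for a 2D trace-line map clipped to a
    sampling window. Each trace is ((x1, y1), (x2, y2)) in map units."""
    lengths = np.array([np.hypot(x2 - x1, y2 - y1)
                        for (x1, y1), (x2, y2) in traces])
    density = len(lengths) / window_area       # P20-style measure
    intensity = lengths.sum() / window_area    # P21-style measure
    return density, intensity, lengths.mean()

# Three invented traces inside a 5 x 5 map-unit window
traces = [((0, 0), (3, 4)), ((1, 1), (4, 1)), ((2, 0), (2, 5))]
print(window_fracture_stats(traces, window_area=25.0))
```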
Matsumoto, Keiichi; Endo, Keigo
2013-06-01
Two kinds of Japanese guidelines for the data acquisition protocol of oncology fluoro-D-glucose positron emission tomography (FDG-PET)/computed tomography (CT) scans were created by the joint task force of the Japanese Society of Nuclear Medicine Technology (JSNMT) and the Japanese Society of Nuclear Medicine (JSNM), and published in Kakuigaku-Gijutsu 27(5): 425-456, 2007 and 29(2): 195-235, 2009. These guidelines aim to standardize PET image quality among facilities and different PET/CT scanner models. The objective of this study was to develop a personal computer-based performance measurement and image quality processor for the two kinds of Japanese guidelines for oncology (18)F-FDG PET/CT scans. We call this software package the "PET quality control tool" (PETquact). Microsoft Corporation's Windows(™) is used as the operating system for PETquact, which requires 1070×720 image resolution and includes 12 different applications. The accuracy was examined for numerous applications of PETquact. For example, in the sensitivity application, the system sensitivity measurement results were equivalent when comparing two PET sinograms obtained from PETquact and from the report. PETquact is suited to the analyses required by the two kinds of Japanese guidelines and performs well for both performance measurements and image quality analysis. PETquact can be used at any facility if the software package is installed on a laptop computer.
iSDS: a self-configurable software-defined storage system for enterprise
NASA Astrophysics Data System (ADS)
Chen, Wen-Shyen Eric; Huang, Chun-Fang; Huang, Ming-Jen
2018-01-01
Storage is one of the most important aspects of IT infrastructure for various enterprises. But enterprises are interested in more than just data storage; they are interested in such things as more reliable data protection, higher performance and reduced resource consumption. Traditional enterprise-grade storage satisfies these requirements at high cost, because it is usually designed and constructed with customised field-programmable gate arrays to achieve high-end functionality. However, in this ever-changing environment, enterprises request storage with more flexible deployment and at lower cost. Moreover, the rise of new application fields, such as social media, big data, video streaming services etc., makes operational tasks for administrators more complex. In this article, a new storage system called intelligent software-defined storage (iSDS), based on software-defined storage, is described. More specifically, this approach advocates using software to replace features provided by traditional customised chips. To alleviate the management burden, it also advocates applying machine learning to automatically configure storage to meet the dynamic requirements of workloads running on the storage. This article focuses on the analysis feature of the iSDS cluster by detailing its architecture and design.
Rothkirch, André; Gatta, G Diego; Meyer, Mathias; Merkel, Sébastien; Merlini, Marco; Liermann, Hanns Peter
2013-09-01
Fast detectors employed at third-generation synchrotrons have reduced collection times significantly and require the optimization of commercial as well as customized software packages for data reduction and analysis. In this paper a procedure to collect, process and analyze single-crystal data sets collected at high pressure at the Extreme Conditions beamline (P02.2) at PETRA III, DESY, is presented. A new data image format called 'Esperanto' is introduced that is supported by the commercial software package CrysAlisPro (Agilent Technologies UK Ltd). The new format acts as a vehicle for transforming the most common area-detector data formats via translator software. Such a conversion tool has been developed; it converts TIFF data collected on a Perkin Elmer detector, as well as data collected on a MAR345/555, for import into the CrysAlisPro software. In order to demonstrate the validity of the new approach, a complete structure refinement of boron-mullite (Al5BO9) collected at a pressure of 19.4 (2) GPa is presented. Details pertaining to the data collections and refinements of B-mullite are presented.
STOP-IT: Windows executable software for the stop-signal paradigm.
Verbruggen, Frederick; Logan, Gordon D; Stevens, Michaël A
2008-05-01
The stop-signal paradigm is a useful tool for the investigation of response inhibition. In this paradigm, subjects are instructed to respond as fast as possible to a stimulus unless a stop signal is presented after a variable delay. However, programming the stop-signal task is typically considered to be difficult. To overcome this issue, we present software called STOP-IT for running the stop-signal task, as well as an accompanying analysis program called ANALYZE-IT. The main advantage of both programs is that they are precompiled executables, and for basic use no additional programming is needed. STOP-IT and ANALYZE-IT are completely based on free software, are distributed under the GNU General Public License, and are available at the personal Web sites of the first two authors or at expsy.ugent.be/tscope/stop.html.
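ANALYZE-IT summarises stop-signal performance; a common outcome of such an analysis is the stop-signal reaction time (SSRT), and the sketch below shows the standard integration-method estimate (the go-RT quantile at the probability of responding on stop trials, minus the mean stop-signal delay). The toy reaction times and delays are invented, and this is not necessarily the exact estimator ANALYZE-IT implements.

```python
import numpy as np

def ssrt_integration(go_rt, stop_trial_responded, ssd):
    """Integration-method SSRT: the p(respond|signal) quantile of the go RT
    distribution minus the mean stop-signal delay."""
    go_rt = np.sort(np.asarray(go_rt, dtype=float))
    p_respond = float(np.mean(stop_trial_responded))   # proportion of failed stops
    idx = max(int(np.ceil(p_respond * len(go_rt))) - 1, 0)
    return go_rt[idx] - float(np.mean(ssd))

# Toy numbers in milliseconds: 50% failed stops, mean SSD of 250 ms
rng = np.random.default_rng(0)
go_rt = rng.normal(500, 80, 200)
responded = np.array([1, 0] * 50)          # 100 stop trials, half failed
ssd = np.full(100, 250.0)
print(ssrt_integration(go_rt, responded, ssd))
```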
BS-virus-finder: virus integration calling using bisulfite sequencing data.
Gao, Shengjie; Hu, Xuesong; Xu, Fengping; Gao, Changduo; Xiong, Kai; Zhao, Xiao; Chen, Haixiao; Zhao, Shancen; Wang, Mengyao; Fu, Dongke; Zhao, Xiaohui; Bai, Jie; Mao, Likai; Li, Bo; Wu, Song; Wang, Jian; Li, Shengbin; Yang, Huangming; Bolund, Lars; Pedersen, Christian N S
2018-01-01
DNA methylation plays a key role in the regulation of gene expression and carcinogenesis. Bisulfite sequencing studies mainly focus on calling single nucleotide polymorphisms, identifying differentially methylated regions, and finding allele-specific DNA methylation. Until now, only a few software tools have focused on virus integration using bisulfite sequencing data. We have developed a new and easy-to-use software tool, named BS-virus-finder (BSVF, RRID:SCR_015727), to detect viral integration breakpoints in whole human genomes. The tool is hosted at https://github.com/BGI-SZ/BSVF. BS-virus-finder demonstrates high sensitivity and specificity. It is useful in epigenetic studies and in revealing the relationship between viral integration and DNA methylation. BS-virus-finder is the first software tool to detect virus integration loci by using bisulfite sequencing data. © The Authors 2017. Published by Oxford University Press.
User's Manual and Final Report for Hot-SMAC GUI Development
NASA Technical Reports Server (NTRS)
Yarrington, Phil
2001-01-01
A new software package called Higher Order Theory-Structural/Micro Analysis Code (HOT-SMAC) has been developed as an effective alternative to the finite element approach for Functionally Graded Material (FGM) modeling. HOT-SMAC is a self-contained package including pre- and post-processing through an intuitive graphical user interface, along with the well-established Higher Order Theory for Functionally Graded Materials (HOTFGM) thermomechanical analysis engine. This document represents a Getting Started/User's Manual for HOT-SMAC and a final report for its development. First, the features of the software are presented in a simple step-by-step example in which a HOT-SMAC model representing a functionally graded material is created, mechanical and thermal boundary conditions are applied, the model is analyzed and results are reviewed. In a second step-by-step example, a HOT-SMAC model of an actively cooled metallic channel with a ceramic thermal barrier coating is built and analyzed. HOT-SMAC results from this model are compared to recently published results (NASA/TM-2001-210702) for two grid densities. Finally, a prototype integration of HOT-SMAC with the commercially available HyperSizer(R) structural analysis and sizing software is presented. In this integration, local strain results from HyperSizer's structural analysis are fed to a detailed HOT-SMAC model of the flange-to-facesheet bond region of a stiffened panel. HOT-SMAC is then used to determine the peak shear and peel (normal) stresses between the facesheet and bonded flange of the panel and to assess the "free edge" effects.
NASA Astrophysics Data System (ADS)
Fisher, W. I.
2017-12-01
The rise in cloud computing, coupled with the growth of "Big Data", has led to a migration away from local scientific data storage. The increasing size of remote scientific data sets, however, makes it difficult for scientists to subject them to large-scale analysis and visualization. These large datasets can take an inordinate amount of time to download; subsetting is a potential solution, but subsetting services are not yet ubiquitous. Data providers may also pay steep prices, as many cloud providers meter data based on how much data leaves their cloud service. The solution to this problem is a deceptively simple one: move data analysis and visualization tools to the cloud, so that scientists may perform data-proximate analysis and visualization. This results in increased transfer speeds, while egress costs are lowered or completely eliminated. Moving standard desktop analysis and visualization tools to the cloud is enabled via a technique called "Application Streaming". This technology allows a program to run entirely on a remote virtual machine while still allowing for interactivity and dynamic visualizations. When coupled with containerization technology such as Docker, we are able to easily deploy legacy analysis and visualization software to the cloud whilst retaining access via a desktop, a netbook, a smartphone, or the next generation of hardware, whatever it may be. Unidata has created a Docker-based solution for easily adapting legacy software for Application Streaming. This technology stack, dubbed Cloudstream, allows desktop software to run in the cloud with little-to-no effort. The Docker container is configured by editing text files, and the legacy software does not need to be modified in any way. This work will discuss the underlying technologies used by Cloudstream and outline how to use Cloudstream to run and access an existing desktop application in the cloud.
Do, Hongdo; Molania, Ramyar
2017-01-01
The identification of genomic rearrangements with high sensitivity and specificity using massively parallel sequencing remains a major challenge, particularly in precision medicine and cancer research. Here, we describe a new method for detecting rearrangements, GRIDSS (Genome Rearrangement IDentification Software Suite). GRIDSS is a multithreaded structural variant (SV) caller that performs efficient genome-wide break-end assembly prior to variant calling using a novel positional de Bruijn graph-based assembler. By combining assembly, split read, and read pair evidence using a probabilistic scoring, GRIDSS achieves high sensitivity and specificity on simulated, cell line, and patient tumor data, recently winning SV subchallenge #5 of the ICGC-TCGA DREAM8.5 Somatic Mutation Calling Challenge. On human cell line data, GRIDSS halves the false discovery rate compared to other recent methods while matching or exceeding their sensitivity. GRIDSS identifies nontemplate sequence insertions, microhomologies, and large imperfect homologies, estimates a quality score for each breakpoint, stratifies calls into high or low confidence, and supports multisample analysis. PMID:29097403
NASA Technical Reports Server (NTRS)
1973-01-01
A description of each of the software modules of the Image Data Processing System (IDAPS) is presented. The changes in the software modules are the result of additions to the application software of the system and an upgrade of the IBM 7094 Mod(1) computer to a 1301 disk storage configuration. Necessary information about IDAPS software is supplied to the computer programmer who desires to make changes in the software system or who desires to use portions of the software outside of the IDAPS system. Each software module is documented with: module name, purpose, usage, common block(s) description, method (algorithm of subroutine), flow diagram (if needed), subroutines called, and storage requirements.
DOT National Transportation Integrated Search
2014-04-01
This report provides a Road Map for implementing the AASHTOWare Pavement ME Design software for the Idaho Transportation Department (ITD). The Road Map calls for a series of three stages: Stage 1 - Immediate, Stage 2 - Near Term, and Stage 3 - Future...
Quick Prototyping of Educational Software: An Object-Oriented Approach.
ERIC Educational Resources Information Center
Wong, Simon C-H
1994-01-01
Introduces and demonstrates a quick-prototyping model for educational software development that can be used by teachers developing their own courseware using an object-oriented programming system. Development of a courseware package called "The Match-Maker" is explained as an example that uses HyperCard for quick prototyping. (Contains…
Choosing and Using Text-to-Speech Software
ERIC Educational Resources Information Center
Peters, Tom; Bell, Lori
2007-01-01
This article describes a computer-based technology for generating speech called text-to-speech (TTS). This software is ready for widespread use by libraries, other organizations, and individual users. It offers the affordable ability to turn just about any electronic text that is not image-based into an artificially spoken communication. The…
Software GOLUCA: Knowledge Representation in Mental Calculation
ERIC Educational Resources Information Center
Casas-Garcia, Luis M.; Luengo-Gonzalez, Ricardo; Godinho-Lopes, Vitor
2011-01-01
We present a new software, called Goluca (Godinho, Luengo, and Casas, 2007), based on the technique of Pathfinder Associative Networks (Schvaneveldt, 1989), which produces graphical representations of the cognitive structure of individuals in a given field of knowledge. In this case, we studied the strategies used by teachers and its relationship…
User's operating procedures. Volume 2: Scout project financial analysis program
NASA Technical Reports Server (NTRS)
Harris, C. G.; Haris, D. K.
1985-01-01
A review is presented of the user's operating procedures for the Scout Project Automatic Data system, called SPADS. SPADS is the result of the past seven years of software development on a Prime mini-computer located at the Scout Project Office, NASA Langley Research Center, Hampton, Virginia. SPADS was developed as a single entry, multiple cross-reference data management and information retrieval system for the automation of Project office tasks, including engineering, financial, managerial, and clerical support. This volume, two (2) of three (3), provides the instructions to operate the Scout Project Financial Analysis program in data retrieval and file maintenance via the user friendly menu drivers.
Relative Displacement Method for Track-Structure Interaction
Ramos, Óscar Ramón; Pantaleón, Marcos J.
2014-01-01
The track-structure interaction effects are usually analysed with conventional FEM programs, where it is difficult to implement the complex track-structure connection behaviour, which is nonlinear, elastic-plastic and depends on the vertical load. The authors developed an alternative analysis method, which they call the relative displacement method. It is based on the calculation of deformation states in single DOF element models that satisfy the boundary conditions. For its solution, an iterative optimisation algorithm is used. This method can be implemented in any programming language or analysis software. A comparison with ABAQUS calculations shows a very good result correlation and compliance with the standard's specifications. PMID:24634610
Scheuch, Matthias; Höper, Dirk; Beer, Martin
2015-03-03
Fuelled by the advent and subsequent development of next generation sequencing technologies, metagenomics became a powerful tool for the analysis of microbial communities both scientifically and diagnostically. The biggest challenge is the extraction of relevant information from the huge sequence datasets generated for metagenomics studies. Although a plethora of tools are available, data analysis is still a bottleneck. To overcome the bottleneck of data analysis, we developed an automated computational workflow called RIEMS - Reliable Information Extraction from Metagenomic Sequence datasets. RIEMS assigns every individual read sequence within a dataset taxonomically by cascading different sequence analyses with decreasing stringency of the assignments using various software applications. After completion of the analyses, the results are summarised in a clearly structured result protocol organised taxonomically. The high accuracy and performance of RIEMS analyses were proven in comparison with other tools for metagenomics data analysis using simulated sequencing read datasets. RIEMS has the potential to fill the gap that still exists with regard to data analysis for metagenomics studies. The usefulness and power of RIEMS for the analysis of genuine sequencing datasets was demonstrated with an early version of RIEMS in 2011 when it was used to detect the orthobunyavirus sequences leading to the discovery of Schmallenberg virus.
How to choose the right statistical software?—a method increasing the post-purchase satisfaction
2015-01-01
Nowadays, we live in the “data era”, where the use of statistical or data analysis software is inevitable in any research field. This means that the choice of the right software tool or platform is a strategic issue for a research department. Nevertheless, in many cases decision makers do not pay enough attention to a comprehensive and appropriate evaluation of what the market offers. Indeed, the choice still often depends on a few factors such as the researcher’s personal inclination, e.g., which software was used at university or is already known. This is not wrong in principle, but in some cases it is not enough and might lead to a “dead end” situation, typically after months or years of investment in the wrong software. This article, far from being a full and complete guide to statistical software evaluation, aims to illustrate some key points of the decision process and to introduce an extended range of factors which can help in making the right choice, at least in potential. There is not much literature on this topic, which is often underestimated, either in the traditional literature or in the so-called “gray literature”, although some documents or short pages can be found online. In any case, there seems to be no common, well-known standpoint on the process of software evaluation from the end user’s perspective. We suggest a multi-factor analysis leading to an evaluation matrix, intended as a flexible and customizable tool that provides a clearer picture of the available software alternatives, not in the abstract but in relation to the researcher’s own context and needs. This method is the result of about twenty years of the author’s experience in evaluating and using technical-computing software and partially arises from research on these topics carried out as part of a project funded by the European Commission under the Lifelong Learning Programme 2011. PMID:26793368
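The multi-factor evaluation matrix the article proposes amounts to scoring each candidate package on weighted criteria. The toy sketch below shows that arithmetic; the factor names, weights, and scores are placeholders, not recommendations from the article.

```python
# Factors, weights and scores are invented placeholders, not recommendations.
factors = ["licence cost", "learning curve", "analysis coverage",
           "scripting/automation", "support and community"]
weights = [0.15, 0.20, 0.30, 0.20, 0.15]          # should sum to 1

candidates = {                                     # scores on a 1-5 scale
    "Package A": [4, 3, 5, 4, 4],
    "Package B": [5, 4, 3, 2, 3],
    "Package C": [2, 5, 4, 5, 5],
}

for name, scores in candidates.items():
    total = sum(w * s for w, s in zip(weights, scores))
    print(f"{name}: weighted score {total:.2f}")
```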
Huang, Lei; Kang, Wenjun; Bartom, Elizabeth; Onel, Kenan; Volchenboum, Samuel; Andrade, Jorge
2015-01-01
Whole exome sequencing has facilitated the discovery of causal genetic variants associated with human diseases at deep coverage and low cost. In particular, the detection of somatic mutations from tumor/normal pairs has provided insights into the cancer genome. Although there is an abundance of publicly-available software for the detection of germline and somatic variants, concordance is generally limited among variant callers and alignment algorithms. Successful integration of variants detected by multiple methods requires in-depth knowledge of the software, access to high-performance computing resources, and advanced programming techniques. We present ExScalibur, a set of fully automated, highly scalable and modulated pipelines for whole exome data analysis. The suite integrates multiple alignment and variant calling algorithms for the accurate detection of germline and somatic mutations with close to 99% sensitivity and specificity. ExScalibur implements streamlined execution of analytical modules, real-time monitoring of pipeline progress, robust handling of errors and intuitive documentation that allows for increased reproducibility and sharing of results and workflows. It runs on local computers, high-performance computing clusters and cloud environments. In addition, we provide a data analysis report utility to facilitate visualization of the results that offers interactive exploration of quality control files, read alignment and variant calls, assisting downstream customization of potential disease-causing mutations. ExScalibur is open-source and is also available as a public image on Amazon cloud. PMID:26271043
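ExScalibur integrates variants detected by multiple aligners and callers. A minimal illustration of that kind of integration is the majority-vote filter below; the caller names and variant tuples are invented, and the pipeline's real merging logic is more elaborate.

```python
from collections import Counter

def consensus_variants(call_sets, min_callers=2):
    """Keep variants (chrom, pos, ref, alt) reported by at least `min_callers`
    of the individual call sets; a simple majority-style integration sketch."""
    counts = Counter(v for calls in call_sets for v in set(calls))
    return sorted(v for v, n in counts.items() if n >= min_callers)

# Hypothetical call sets from three different somatic callers
caller_a = {("chr1", 12345, "A", "T"), ("chr2", 999, "G", "C")}
caller_b = {("chr1", 12345, "A", "T")}
caller_c = {("chr1", 12345, "A", "T"), ("chr3", 55, "C", "G")}
print(consensus_variants([caller_a, caller_b, caller_c]))
```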
Software engineering and data management for automated payload experiment tool
NASA Technical Reports Server (NTRS)
Maddux, Gary A.; Provancha, Anna; Chattam, David
1994-01-01
The Microgravity Projects Office identified a need to develop a software package that will lead experiment developers through the development planning process, obtain necessary information, establish an electronic data exchange avenue, and allow easier manipulation/reformatting of the collected information. An MS-DOS compatible software package called the Automated Payload Experiment Tool (APET) has been developed and delivered. The objective of this task is to expand on the results of the APET work previously performed by UAH and provide versions of the software in a Macintosh and Windows compatible format.
A software development and evolution model based on decision-making
NASA Technical Reports Server (NTRS)
Wild, J. Christian; Dong, Jinghuan; Maly, Kurt
1991-01-01
Design is a complex activity whose purpose is to construct an artifact which satisfies a set of constraints and requirements. However, the design process is not well understood. The software design and evolution process is the focus of interest here, and a three-dimensional software development space organized around a decision-making paradigm is presented. An initial instantiation of this model, called 3DPM(sub p), which has been partly implemented, is presented. A discussion of the use of this model in software reuse and process management is given.
High-energy physics software parallelization using database techniques
NASA Astrophysics Data System (ADS)
Argante, E.; van der Stok, P. D. V.; Willers, I.
1997-02-01
A programming model for software parallelization, called CoCa, is introduced that copes with problems caused by typical features of high-energy physics software. Because CoCa is based on the database transaction paradigm, the complexity induced by the parallelization is largely transparent to the programmer, resulting in a higher level of abstraction than native message passing software. CoCa is implemented on a Meiko CS-2 and on a SUN SPARCcenter 2000 parallel computer. On the CS-2, the performance is comparable with the performance of native PVM and MPI.
NASA Technical Reports Server (NTRS)
1992-01-01
This standard specifies the software assurance program for the provider of software. It also delineates the assurance activities for the provider and the assurance data that are to be furnished by the provider to the acquirer. In any software development effort, the provider is the entity or individual that actually designs, develops, and implements the software product, while the acquirer is the entity or individual who specifies the requirements and accepts the resulting products. This standard specifies at a high level an overall software assurance program for software developed for and by NASA. Assurance includes the disciplines of quality assurance, quality engineering, verification and validation, nonconformance reporting and corrective action, safety assurance, and security assurance. The application of these disciplines during a software development life cycle is called software assurance. Subsequent lower-level standards will specify the specific processes within these disciplines.
NASA Astrophysics Data System (ADS)
Basista, A.
2013-12-01
There are many tools for managing spatial data. They are called Geographic Information Systems (GIS) and, apart from visualizing data in space, they let users perform various spatial analyses. Thanks to them, it is possible to obtain additional, essential information for real estate market analysis. Much scientific research presents the use of GIS for future mass valuation, because advanced tools are necessary to manage the huge real estate data sets gathered for mass valuation. In practice, appraisers rarely use these tools for single valuations, because few GIS tools are available to support real estate valuation. The paper presents the functionality of a geoinformatic subsystem that is used to support real estate market analysis and real estate valuation. A detailed description is given of the process of entering attributes into the database and of calculating attribute values based on the proposed definition of attribute scales. This work also presents the algorithm for selecting similar properties that was implemented within the described subsystem. The main stage of this algorithm is the calculation of a price-creative indicator for each real estate, using its attribute values. The set of properties chosen in this way is visualized on the map. The geoinformatic subsystem is used for undeveloped land and residential premises. Geographic Information System software was used to develop this project. The basic functionality of gvSIG software (open source software) was extended and some extra functions were added to support real estate market analysis.
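The similar-properties selection step described above can be pictured as ranking candidate parcels by a weighted distance between coded attribute values. The sketch below is a hypothetical version of such a rule; the attribute codes, weights, and the distance measure are assumptions, not the subsystem's published algorithm.

```python
import numpy as np

def select_similar(subject, candidates, weights, k=3):
    """Indices of the k candidate properties whose coded attribute vectors are
    closest (weighted Euclidean distance) to the subject property."""
    subject = np.asarray(subject, dtype=float)
    diffs = np.asarray(candidates, dtype=float) - subject
    distances = np.sqrt((weights * diffs ** 2).sum(axis=1))
    return np.argsort(distances)[:k]

# Hypothetical attribute codes (e.g. location, area class, access, shape)
subject = [3, 2, 4, 1]
candidates = [[3, 2, 5, 1], [1, 1, 2, 3], [3, 3, 4, 1], [2, 2, 4, 2]]
weights = np.array([0.4, 0.3, 0.2, 0.1])
print(select_similar(subject, candidates, weights, k=2))
```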
An evolutionary morphological approach for software development cost estimation.
Araújo, Ricardo de A; Oliveira, Adriano L I; Soares, Sergio; Meira, Silvio
2012-08-01
In this work we present an evolutionary morphological approach to solve the software development cost estimation (SDCE) problem. The proposed approach consists of a hybrid artificial neuron based on the framework of mathematical morphology (MM), with algebraic foundations in complete lattice theory (CLT), referred to as the dilation-erosion perceptron (DEP). We also present an evolutionary learning process, called DEP(MGA), that uses a modified genetic algorithm (MGA) to design the DEP model, because the classical learning process of the DEP is hampered by the gradient estimation of morphological operators, which are not differentiable in the usual way. Furthermore, an experimental analysis is conducted with the proposed model using five complex SDCE problems and three well-known performance metrics, demonstrating the good performance of the DEP model in solving SDCE problems. Copyright © 2012 Elsevier Ltd. All rights reserved.
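The dilation-erosion perceptron combines a morphological dilation and an erosion of the input. The snippet below sketches that operator structure under the common formulation of a convex combination of max-plus and min-plus terms, with untrained, made-up parameters; the actual DEP(MGA) model evolves these parameters with a modified genetic algorithm.

```python
import numpy as np

def dep_output(x, a, b, lam):
    """Dilation-erosion perceptron output: a convex combination of a
    morphological dilation (max-plus) and erosion (min-plus) of the input."""
    dilation = np.max(x + a)        # max-plus operation with structuring element a
    erosion = np.min(x + b)         # min-plus operation with structuring element b
    return lam * dilation + (1.0 - lam) * erosion

# Untrained, made-up parameters; inputs might be coded project attributes
x = np.array([12.0, 15.5, 14.2, 16.8])
a = np.array([-0.2, 0.1, 0.0, 0.3])
b = np.array([0.4, -0.1, 0.2, 0.0])
print(dep_output(x, a, b, lam=0.6))
```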
The Electron Microscopy Outreach Program: A Web-based resource for research and education.
Sosinsky, G E; Baker, T S; Hand, G; Ellisman, M H
1999-01-01
We have developed a centralized World Wide Web (WWW)-based environment that serves as a resource of software tools and expertise for biological electron microscopy. A major focus is molecular electron microscopy, but the site also includes information and links on structural biology at all levels of resolution. This site serves to help integrate or link structural biology techniques in accordance with user needs. The WWW site, called the Electron Microscopy (EM) Outreach Program (URL: http://emoutreach.sdsc.edu), provides scientists with computational and educational tools for their research and edification. In particular, we have set up a centralized resource containing course notes, references, and links to image analysis and three-dimensional reconstruction software for investigators wanting to learn about EM techniques either within or outside of their fields of expertise. Copyright 1999 Academic Press.
Electron tunneling in proteins program.
Hagras, Muhammad A; Stuchebrukhov, Alexei A
2016-06-05
We developed a unique integrated software package (called Electron Tunneling in Proteins Program or ETP) which provides an environment with different capabilities such as tunneling current calculation, semi-empirical quantum mechanical calculation, and molecular modeling simulation for calculation and analysis of electron transfer reactions in proteins. ETP program is developed as a cross-platform client-server program in which all the different calculations are conducted at the server side while only the client terminal displays the resulting calculation outputs in the different supported representations. ETP program is integrated with a set of well-known computational software packages including Gaussian, BALLVIEW, Dowser, pKip, and APBS. In addition, ETP program supports various visualization methods for the tunneling calculation results that assist in a more comprehensive understanding of the tunneling process. © 2016 Wiley Periodicals, Inc.
obitools: a unix-inspired software package for DNA metabarcoding.
Boyer, Frédéric; Mercier, Céline; Bonin, Aurélie; Le Bras, Yvan; Taberlet, Pierre; Coissac, Eric
2016-01-01
DNA metabarcoding offers new perspectives in biodiversity research. This recently developed approach to ecosystem study relies heavily on the use of next-generation sequencing (NGS) and thus calls upon the ability to deal with huge sequence data sets. The obitools package satisfies this requirement thanks to a set of programs specifically designed for analysing NGS data in a DNA metabarcoding context. Their capacity to filter and edit sequences while taking into account taxonomic annotation helps to set up tailor-made analysis pipelines for a broad range of DNA metabarcoding applications, including biodiversity surveys or diet analyses. The obitools package is distributed as an open source software available on the following website: http://metabarcoding.org/obitools. A Galaxy wrapper is available on the GenOuest core facility toolshed: http://toolshed.genouest.org. © 2015 John Wiley & Sons Ltd.
User's Manual for LEWICE Version 3.2
NASA Technical Reports Server (NTRS)
Wright, William
2008-01-01
A research project is underway at NASA Glenn to produce a computer code which can accurately predict ice growth under a wide range of meteorological conditions for any aircraft surface. This report will present a description of the code inputs and outputs from version 3.2 of this software, which is called LEWICE. This version differs from release 2.0 due to the addition of advanced thermal analysis capabilities for de-icing and anti-icing applications using electrothermal heaters or bleed air applications, the addition of automated Navier-Stokes analysis, an empirical model for supercooled large droplets (SLD) and a pneumatic boot option. An extensive effort was also undertaken to compare the results against the database of electrothermal results which have been generated in the NASA Glenn Icing Research Tunnel (IRT) as was performed for the validation effort for version 2.0. This report will primarily describe the features of the software related to the use of the program. Appendix A has been included to list some of the inner workings of the software or the physical models used. This information is also available in the form of several unpublished documents internal to NASA. This report is intended as a replacement for all previous user manuals of LEWICE. In addition to describing the changes and improvements made for this version, information from previous manuals may be duplicated so that the user will not need to consult previous manuals to use this software.
Design for dependability: A simulation-based approach. Ph.D. Thesis, 1993
NASA Technical Reports Server (NTRS)
Goswami, Kumar K.
1994-01-01
This research addresses issues in simulation-based system level dependability analysis of fault-tolerant computer systems. The issues and difficulties of providing a general simulation-based approach for system level analysis are discussed, and a methodology that addresses these issues is presented. The proposed methodology is designed to permit the study of a wide variety of architectures under various fault conditions. It permits detailed functional modeling of architectural features such as sparing policies, repair schemes, and routing algorithms, as well as other fault-tolerant mechanisms, and it allows the execution of actual application software. One key benefit of this approach is that the behavior of a system under faults does not have to be pre-defined, as is normally done. Instead, a system can be simulated in detail and injected with faults to determine its failure modes. The thesis describes how object-oriented design is used to incorporate this methodology into a general purpose design and fault injection package called DEPEND. A software model is presented that uses abstractions of application programs to study the behavior and effect of software on hardware faults in the early design stage when actual code is not available. Finally, an acceleration technique that combines hierarchical simulation, time acceleration algorithms and hybrid simulation to reduce simulation time is introduced.
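The fault-injection idea described above can be illustrated with a small, self-contained sketch. This is not DEPEND itself; the component names, sparing policy, and fault rate are invented for the example. The point is that failure modes are observed from the simulation rather than pre-defined.

```python
import random

class Component:
    """A simulated hardware component with a simple sparing policy."""
    def __init__(self, name, spares=0):
        self.name = name
        self.spares = spares
        self.failed = False

    def inject_fault(self):
        """Inject a fault; a spare absorbs it if one is available."""
        if self.spares > 0:
            self.spares -= 1
        else:
            self.failed = True

def simulate(components, fault_rate=0.05, steps=1000, seed=0):
    """Run a fault-injection campaign and report the observed failure modes."""
    rng = random.Random(seed)
    failures = []
    for t in range(steps):
        for c in components:
            if not c.failed and rng.random() < fault_rate:
                c.inject_fault()
                if c.failed:
                    failures.append((t, c.name))
    return failures

if __name__ == "__main__":
    system = [Component("cpu0", spares=1), Component("mem0"), Component("net0")]
    for step, name in simulate(system):
        print(f"step {step}: {name} failed")
```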
NASA Technical Reports Server (NTRS)
Brown, David B.
1990-01-01
The results of research and development efforts are described for Task one, Phase two of a general project entitled The Development of a Program Analysis Environment for Ada. The scope of this task includes the design and development of a prototype system for testing Ada software modules at the unit level. The system is called Query Utility Environment for Software Testing of Ada (QUEST/Ada). The prototype for condition coverage provides a platform that implements expert system interaction with program testing. The expert system can modify data in the instrumented source code in order to achieve coverage goals. Given this initial prototype, it is possible to evaluate the rule base in order to develop improved rules for test case generation. The goals of Phase two are the following: (1) to continue to develop and improve the current user interface to support the other goals of this research effort (i.e., those related to improved testing efficiency and increased code reliability); (2) to develop and empirically evaluate a succession of alternative rule bases for the test case generator such that the expert system achieves coverage in a more efficient manner; and (3) to extend the concepts of the current test environment to address the issues of Ada concurrency.
Software Model Checking Without Source Code
NASA Technical Reports Server (NTRS)
Chaki, Sagar; Ivers, James
2009-01-01
We present a framework, called AIR, for verifying safety properties of assembly language programs via software model checking. AIR extends the applicability of predicate abstraction and counterexample guided abstraction refinement to the automated verification of low-level software. By working at the assembly level, AIR allows verification of programs for which source code is unavailable (such as legacy and COTS software) and programs that use features (such as pointers, structures, and object orientation) that are problematic for source-level software verification tools. In addition, AIR makes no assumptions about the underlying compiler technology. We have implemented a prototype of AIR and present encouraging results on several non-trivial examples.
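As a much simplified illustration of checking a program at the instruction level rather than from source, the sketch below exhaustively explores the state space of a tiny register-machine program with a nondeterministic input and checks a safety property. This is a generic explicit-state illustration, not the AIR toolchain or its predicate-abstraction engine, and the instruction set is invented for the example.

```python
from collections import deque

# A tiny "assembly" program over registers r0, r1: each instruction is (opcode, args).
PROGRAM = [
    ("havoc", "r0"),            # r0 gets a nondeterministic value in 0..3 (models external input)
    ("set",   "r1", 10),
    ("sub",   "r1", "r0"),      # r1 = r1 - r0
    ("assert", "r1", ">", 0),   # safety property: r1 must remain positive
]

def successors(state):
    """Return all possible next states (pc, registers) of the machine."""
    pc, regs = state
    if pc >= len(PROGRAM):
        return []
    instr = PROGRAM[pc]
    op = instr[0]
    if op == "havoc":
        return [(pc + 1, {**regs, instr[1]: v}) for v in range(4)]
    if op == "set":
        return [(pc + 1, {**regs, instr[1]: instr[2]})]
    if op == "sub":
        return [(pc + 1, {**regs, instr[1]: regs[instr[1]] - regs[instr[2]]})]
    if op == "assert":
        _, reg, rel, bound = instr
        ok = regs[reg] > bound if rel == ">" else regs[reg] == bound
        return [(pc + 1, regs)] if ok else [("VIOLATION", regs)]
    raise ValueError(op)

def model_check():
    """Breadth-first exploration of all reachable machine states."""
    frontier = deque([(0, {"r0": 0, "r1": 0})])
    seen = set()
    while frontier:
        state = frontier.popleft()
        key = (state[0], tuple(sorted(state[1].items())))
        if key in seen:
            continue
        seen.add(key)
        if state[0] == "VIOLATION":
            return f"property violated with registers {state[1]}"
        frontier.extend(successors(state))
    return "property holds on all executions"

print(model_check())
```

Predicate abstraction and refinement, as used by AIR, replace this brute-force enumeration with an abstract model that is refined only when a spurious counterexample is found.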
Design Optimization Toolkit: Users' Manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aguilo Valentin, Miguel Alejandro
The Design Optimization Toolkit (DOTk) is a stand-alone C++ software package intended to solve complex design optimization problems. The DOTk software package provides a range of solution methods that are suited for gradient/nongradient-based optimization, large scale constrained optimization, and topology optimization. DOTk was designed to have a flexible user interface to allow easy access to DOTk solution methods from external engineering software packages. This inherent flexibility makes DOTk minimally intrusive to other engineering software packages. As part of this inherent flexibility, the DOTk software package provides an easy-to-use MATLAB interface that enables users to call DOTk solution methods directly from the MATLAB command window.
The equipment access software for a distributed UNIX-based accelerator control system
NASA Astrophysics Data System (ADS)
Trofimov, Nikolai; Zelepoukine, Serguei; Zharkov, Eugeny; Charrue, Pierre; Gareyte, Claire; Poirier, Hervé
1994-12-01
This paper presents a generic equipment access software package for a distributed control system using computers with UNIX or UNIX-like operating systems. The package consists of three main components, an application Equipment Access Library, Message Handler and Equipment Data Base. An application task, which may run in any computer in the network, sends requests to access equipment through Equipment Library calls. The basic request is in the form Equipment-Action-Data and is routed via a remote procedure call to the computer to which the given equipment is connected. In this computer the request is received by the Message Handler. According to the type of the equipment connection, the Message Handler either passes the request to the specific process software in the same computer or forwards it to a lower level network of equipment controllers using MIL1553B, GPIB, RS232 or BITBUS communication. The answer is then returned to the calling application. Descriptive information required for request routing and processing is stored in the real-time Equipment Data Base. The package has been written to be portable and is currently available on DEC Ultrix, LynxOS, HPUX, XENIX, OS-9 and Apollo domain.
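The Equipment-Action-Data request flow described above can be sketched in a few lines. This is a schematic illustration only, not the CERN package: the equipment names, handler registry, and in-process "transport" are invented, whereas the real system routes the request over a remote procedure call to the front-end computer.

```python
# Minimal sketch of Equipment-Action-Data routing through a message handler.

EQUIPMENT_DB = {
    # descriptive information used for routing: which handler serves which equipment
    "MAG.PS1": {"handler": "power_supply"},
    "BPM.7":   {"handler": "beam_monitor"},
}

def power_supply_handler(action, data):
    """Process software for a directly connected device (stand-in)."""
    if action == "SET_CURRENT":
        return {"status": "ok", "current": data}
    if action == "READ_CURRENT":
        return {"status": "ok", "current": 42.0}
    return {"status": "error", "reason": f"unknown action {action}"}

def beam_monitor_handler(action, data):
    if action == "READ_POSITION":
        return {"status": "ok", "x": 0.12, "y": -0.03}
    return {"status": "error", "reason": f"unknown action {action}"}

HANDLERS = {"power_supply": power_supply_handler,
            "beam_monitor": beam_monitor_handler}

def equipment_access(equipment, action, data=None):
    """Equipment Access Library entry point: route an Equipment-Action-Data request."""
    entry = EQUIPMENT_DB.get(equipment)
    if entry is None:
        return {"status": "error", "reason": f"unknown equipment {equipment}"}
    return HANDLERS[entry["handler"]](action, data)

print(equipment_access("MAG.PS1", "SET_CURRENT", 12.5))
print(equipment_access("BPM.7", "READ_POSITION"))
```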
VIEW-Station software and its graphical user interface
NASA Astrophysics Data System (ADS)
Kawai, Tomoaki; Okazaki, Hiroshi; Tanaka, Koichiro; Tamura, Hideyuki
1992-04-01
VIEW-Station is a workstation-based image processing system which merges the state-of-the-art software environment of Unix with the computing power of a fast image processor. VIEW-Station has a hierarchical software architecture, which facilitates device independence when porting across various hardware configurations, and provides extensibility in the development of application systems. The core image computing language is V-Sugar. V-Sugar provides a set of image-processing datatypes and allows image processing algorithms to be simply expressed, using a functional notation. VIEW-Station provides a hardware independent window system extension called VIEW-Windows. In terms of GUI (Graphical User Interface), VIEW-Station has two notable aspects. One is to provide various types of GUI as visual environments for image processing execution. Three types of interpreters, called μV-Sugar, VS-Shell and VPL, are provided. Users may choose whichever they prefer based on their experience and tasks. The other notable aspect is to provide facilities to create GUIs for new applications on the VIEW-Station system. A set of widgets is available for construction of task-oriented GUIs. A GUI builder called VIEW-Kid is developed for WYSIWYG interactive interface design.
Architecture and Implementation of OpenPET Firmware and Embedded Software
Abu-Nimeh, Faisal T.; Ito, Jennifer; Moses, William W.; ...
2016-01-11
OpenPET is an open source, modular, extendible, and high-performance platform suitable for multi-channel data acquisition and analysis. Due to the versatility of the hardware, firmware, and software architectures, the platform is capable of interfacing with a wide variety of detector modules not only in medical imaging but also in homeland security applications. Analog signals from radiation detectors share similar characteristics: a pulse whose area is proportional to the deposited energy and whose leading edge is used to extract a timing signal. As a result, a generic design method is adopted for the hardware, firmware, and software architectures and implementations. The analog front-end is hosted on a module called a Detector Board, where each board can filter, combine, timestamp, and process multiple channels independently. The processed data is formatted and sent through a backplane bus to a module called a Support Board, where one Support Board can host up to eight Detector Board modules. The data in the Support Board, coming from 8 Detector Board modules, can be aggregated or correlated (if needed) depending on the algorithm implemented or runtime mode selected. It is then sent out to a computer workstation for further processing. The number of channels (detector modules) to be processed mandates the overall OpenPET System Configuration, which is designed to handle up to 1,024 channels using 16-channel Detector Boards in the Standard System Configuration and 16,384 channels using 32-channel Detector Boards in the Large System Configuration.
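To make the Detector Board / Support Board data flow concrete, the sketch below merges timestamped events from several boards into one time-ordered stream, which is roughly what the aggregation step on a Support Board has to do. It is an illustrative sketch only, not OpenPET firmware, and the event format is invented.

```python
import heapq

def merge_board_streams(streams):
    """Merge per-board event lists (each already time-ordered) into one stream by timestamp."""
    return list(heapq.merge(*streams, key=lambda ev: ev["t"]))

# Each Detector Board produces (timestamp, channel, energy) events.
board0 = [{"t": 10, "ch": 3, "e": 511}, {"t": 42, "ch": 1, "e": 480}]
board1 = [{"t": 11, "ch": 7, "e": 505}, {"t": 40, "ch": 5, "e": 520}]

for event in merge_board_streams([board0, board1]):
    print(event)
```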
Scaffolding Executive Function Capabilities via Play-&-Learn Software for Preschoolers
ERIC Educational Resources Information Center
Axelsson, Anton; Andersson, Richard; Gulz, Agneta
2016-01-01
Educational software in the form of games, or so called "computer assisted intervention," for young children has become increasingly common, receiving growing interest and support. Currently there are, for instance, more than 1,000 iPad apps tagged for preschool. Thus, it has become increasingly important to empirically investigate…
The Future of the Web, Intelligent Devices, and Education.
ERIC Educational Resources Information Center
Strauss, Howard
1999-01-01
Examines past trends in hardware, software, networking, and education, in an attempt to determine where they are going and what their broad implications might be. Speculates on what will replace the World Wide Web. Describes new applications and telematons along with a new paradigm for education called SMILE (Software-Managed Instruction,…
Identification of Factors That Affect Software Complexity.
ERIC Educational Resources Information Center
Kaiser, Javaid
A survey of computer scientists was conducted to identify factors that affect software complexity. A total of 160 items were selected from the literature to include in a questionnaire sent to 425 individuals who were employees of computer-related businesses in Lawrence and Kansas City. The items were grouped into nine categories called system…
Helping Students Make Sense of Graphs: An Experimental Trial of SmartGraphs Software
ERIC Educational Resources Information Center
Zucker, Andrew; Kay, Rachel; Staudt, Carolyn
2014-01-01
Graphs are commonly used in science, mathematics, and social sciences to convey important concepts; yet students at all ages demonstrate difficulties interpreting graphs. This paper reports on an experimental study of free, Web-based software called SmartGraphs that is specifically designed to help students overcome their misconceptions regarding…
Painting a picture across the landscape with ModelMap
Brian Cooke; Elizabeth Freeman; Gretchen Moisen; Tracey Frescino
2017-01-01
Scientists and statisticians working for the Rocky Mountain Research Station have created a software package that simplifies and automates many of the processes needed for converting models into maps. This software package, called ModelMap, has helped a variety of specialists and land managers to quickly convert data into easily understood graphical images. The...
A Constrained and Guided Approach for Managing Software Engineering Course Projects
ERIC Educational Resources Information Center
Cheng, Y.-P.; Lin, J. M.-C.
2010-01-01
This paper documents several years of experimentation with a new approach to organizing and managing projects in a software engineering course. The initial failure and subsequent refinements that the new approach has been through since 2004 are described herein. The "constrained and guided" approach, as it is called, has helped to reduce…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-03
... type of switch software) to provide payphone-specific coding digits for per-call compensation. The ... Information; RF Exposure Information; Operational Description; Cover Letters; Software Defined Radio/Cognitive Radio Files. In general, an applicant's submission is as follows: (a) FCC Form 731...
ERIC Educational Resources Information Center
Lee, Young-Jin
2011-01-01
This study investigates whether a visual programming environment called Etoys could enable teachers to create software applications meeting their own instructional needs. Twenty-four teachers who participated in the study successfully developed their own educational computer programs in the educational technology course employing cognitive…
Methods For Self-Organizing Software
Bouchard, Ann M.; Osbourn, Gordon C.
2005-10-18
A method for dynamically self-assembling and executing software is provided, containing machines that self-assemble execution sequences and data structures. In addition to ordered function calls (found commonly in other software methods), mutual selective bonding between bonding sites of machines actuates one or more of the bonding machines. Two or more machines can be virtually isolated by a construct, called an encapsulant, containing a population of machines and potentially other encapsulants that can only bond with each other. A hierarchical software structure can be created using nested encapsulants. Multi-threading is implemented by populations of machines in different encapsulants that are interacting concurrently. Machines and encapsulants can move in and out of other encapsulants, thereby changing the functionality. Bonding between machines' sites can be deterministic or stochastic, with bonding triggering a sequence of actions that can be implemented by each machine. A self-assembled execution sequence occurs as a sequence of stochastic binding between machines followed by their deterministic actuation. It is the sequence of bonding of machines that determines the execution sequence, so that the sequence of instructions need not be contiguous in memory.
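The core idea of stochastic bonding followed by deterministic actuation can be mimicked in a toy sketch: "machines" expose typed bonding sites, a random matching step pairs compatible sites, and each bond triggers the machines' actions. This is an illustrative analogy only; the names, site types, and behaviors are invented, and it does not reproduce the patented method.

```python
import random

class Machine:
    """A toy 'machine' with one bonding-site type and an action run when it bonds."""
    def __init__(self, name, site, action):
        self.name, self.site, self.action = name, site, action

def self_assemble(machines, compatible, steps=10, seed=1):
    """Stochastic bonding: pick random pairs, bond if the site types are compatible,
    then deterministically actuate both machines. The order in which bonds form
    determines the execution sequence."""
    rng = random.Random(seed)
    trace = []
    for _ in range(steps):
        a, b = rng.sample(machines, 2)
        if (a.site, b.site) in compatible:
            trace.append(a.action())
            trace.append(b.action())
    return trace

machines = [
    Machine("reader", "produces_data", lambda: "read input"),
    Machine("parser", "consumes_data", lambda: "parse input"),
    Machine("logger", "consumes_data", lambda: "log record"),
]
compatible = {("produces_data", "consumes_data")}
print(self_assemble(machines, compatible))
```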
DOE Office of Scientific and Technical Information (OSTI.GOV)
HUBER, J.H.
An Enraf Densitometer is installed on tank 241-AY-102. The Densitometer will frequently be tasked to obtain and log density profiles. The activity can be effected in a number of ways. Enraf Incorporated provides a software package called "Logger18" to its customers for the purpose of in-shop testing of their gauges. Logger18 is capable of accepting an input file which can direct the gauge to obtain a density profile for a given tank level and bottom limit. Logger18 is a complex, DOS based program which will require trained technicians and/or tank farm entries to obtain the data. ALARA considerations have prompted the development of a more user-friendly, computer-based interface to the Enraf densitometers. This document records the plan by which this new Enraf data acquisition software will be developed, reviewed, verified, and released. This plan applies to the development and implementation of a one-time-use software program, which will be called "Enraf Control Panel." The software will be primarily used for remote operation of Enraf Densitometers for the purpose of obtaining and logging tank product density profiles.
PyMVPA: A python toolbox for multivariate pattern analysis of fMRI data.
Hanke, Michael; Halchenko, Yaroslav O; Sederberg, Per B; Hanson, Stephen José; Haxby, James V; Pollmann, Stefan
2009-01-01
Decoding patterns of neural activity onto cognitive states is one of the central goals of functional brain imaging. Standard univariate fMRI analysis methods, which correlate cognitive and perceptual function with the blood oxygenation-level dependent (BOLD) signal, have proven successful in identifying anatomical regions based on signal increases during cognitive and perceptual tasks. Recently, researchers have begun to explore new multivariate techniques that have proven to be more flexible, more reliable, and more sensitive than standard univariate analysis. Drawing on the field of statistical learning theory, these new classifier-based analysis techniques possess explanatory power that could provide new insights into the functional properties of the brain. However, unlike the wealth of software packages for univariate analyses, there are few packages that facilitate multivariate pattern classification analyses of fMRI data. Here we introduce a Python-based, cross-platform, and open-source software toolbox, called PyMVPA, for the application of classifier-based analysis techniques to fMRI datasets. PyMVPA makes use of Python's ability to access libraries written in a large variety of programming languages and computing environments to interface with the wealth of existing machine learning packages. We present the framework in this paper and provide illustrative examples on its usage, features, and programmability.
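The classifier-based analysis the toolbox supports can be illustrated generically. The sketch below uses scikit-learn rather than PyMVPA itself, so the API shown is not the PyMVPA interface; the data are synthetic stand-ins for voxel patterns. The idea is the same: treat multivoxel patterns as feature vectors and cross-validate a classifier on cognitive-state labels.

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for fMRI data: 80 trials x 500 voxels, two cognitive states.
rng = np.random.default_rng(0)
labels = np.repeat([0, 1], 40)
patterns = rng.normal(size=(80, 500))
patterns[labels == 1, :20] += 0.8        # weak multivariate signal in 20 voxels

# Multivariate pattern analysis: cross-validated decoding accuracy.
scores = cross_val_score(LinearSVC(), patterns, labels, cv=8)
print(f"mean decoding accuracy: {scores.mean():.2f}")
```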
PyMVPA: A Python toolbox for multivariate pattern analysis of fMRI data
Hanke, Michael; Halchenko, Yaroslav O.; Sederberg, Per B.; Hanson, Stephen José; Haxby, James V.; Pollmann, Stefan
2009-01-01
Decoding patterns of neural activity onto cognitive states is one of the central goals of functional brain imaging. Standard univariate fMRI analysis methods, which correlate cognitive and perceptual function with the blood oxygenation-level dependent (BOLD) signal, have proven successful in identifying anatomical regions based on signal increases during cognitive and perceptual tasks. Recently, researchers have begun to explore new multivariate techniques that have proven to be more flexible, more reliable, and more sensitive than standard univariate analysis. Drawing on the field of statistical learning theory, these new classifier-based analysis techniques possess explanatory power that could provide new insights into the functional properties of the brain. However, unlike the wealth of software packages for univariate analyses, there are few packages that facilitate multivariate pattern classification analyses of fMRI data. Here we introduce a Python-based, cross-platform, and open-source software toolbox, called PyMVPA, for the application of classifier-based analysis techniques to fMRI datasets. PyMVPA makes use of Python's ability to access libraries written in a large variety of programming languages and computing environments to interface with the wealth of existing machine-learning packages. We present the framework in this paper and provide illustrative examples on its usage, features, and programmability. PMID:19184561
SeqMule: automated pipeline for analysis of human exome/genome sequencing data.
Guo, Yunfei; Ding, Xiaolei; Shen, Yufeng; Lyon, Gholson J; Wang, Kai
2015-09-18
Next-generation sequencing (NGS) technology has greatly helped us identify disease-contributory variants for Mendelian diseases. However, users are often faced with issues such as software compatibility, complicated configuration, and no access to high-performance computing facilities. Discrepancies exist among aligners and variant callers. We developed a computational pipeline, SeqMule, to perform automated variant calling from NGS data on human genomes and exomes. SeqMule integrates computational-cluster-free parallelization capability built on top of the variant callers, and facilitates normalization/intersection of variant calls to generate a consensus set with high confidence. SeqMule integrates 5 alignment tools and 5 variant calling algorithms and accepts various combinations all by one-line command, therefore allowing highly flexible yet fully automated variant calling. In a modern machine (2 Intel Xeon X5650 CPUs, 48 GB memory), when fast turn-around is needed, SeqMule generates annotated VCF files in a day from a 30X whole-genome sequencing data set; when more accurate calling is needed, SeqMule generates a consensus call set that improves over single callers, as measured by both Mendelian error rate and consistency. SeqMule supports Sun Grid Engine for parallel processing, offers a turn-key solution for deployment on Amazon Web Services, and allows quality checks, Mendelian error checks, consistency evaluation, and HTML-based reports. SeqMule is available at http://seqmule.openbioinformatics.org.
Thermal Tracker: The Secret Lives of Bats and Birds Revealed
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
Offshore wind developers and stakeholders can accelerate the sustainable, widespread deployment of offshore wind using a new open-source software program called ThermalTracker. Researchers can now collect the data they need to better understand the potential effects of offshore wind turbines on bird and bat populations. This plug-and-play software can be used with any standard desktop computer, thermal camera, and statistical software to identify species and behaviors of animals in offshore locations.
Usability Evaluation of Air Warfare Assessment & Review Toolset in Exercise Black Skies 2012
2013-12-01
is, it allows the user to do what they want to do with it (Pressman, 2005). This concept is sometimes called fitness for purpose (Nielsen, 1993)... Other characteristics of good software defined by Pressman (2005) are: reliability – the proportion of time the software is available for its intended... Diego, CA: Academic Press. Pressman, R. S. (2005). Software Engineering: A Practitioner's Approach. New York: McGraw-Hill. Symons, S., France, M
Advanced Spacesuit Informatics Software Design for Power, Avionics and Software Version 2.0
NASA Technical Reports Server (NTRS)
Wright, Theodore W.
2016-01-01
A description of the software design for the 2016 edition of the Informatics computer assembly of NASA's Advanced Extravehicular Mobility Unit (AEMU), also called the Advanced Spacesuit. The Informatics system is an optional part of the spacesuit assembly. It adds a graphical interface for displaying suit status, timelines, procedures, and warning information. It also provides an interface to the suit-mounted camera for recording still images, video, and audio field notes.
A MultiDiscipline Approach to Digitizing Historic Seismograms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bartlett, Andrew
2016-04-07
Retriever Technology has developed, and made available free of charge, a seismogram digitization software package called SKATE (Seismogram Kit for Automatic Trace Extraction). We have developed an extensive set of algorithms that process seismogram image files, provide editing tools, and output time series data. The software is available online at seismo.redfish.com. To demonstrate the speed and cost effectiveness of the software, we have processed over 30,000 images.
Ontology-Driven Provenance Management in eScience: An Application in Parasite Research
NASA Astrophysics Data System (ADS)
Sahoo, Satya S.; Weatherly, D. Brent; Mutharaju, Raghava; Anantharam, Pramod; Sheth, Amit; Tarleton, Rick L.
Provenance, from the French word "provenir", describes the lineage or history of a data entity. Provenance is critical information in scientific applications to verify experiment process, validate data quality and associate trust values with scientific results. Current industrial scale eScience projects require an end-to-end provenance management infrastructure. This infrastructure needs to be underpinned by formal semantics to enable analysis of large scale provenance information by software applications. Further, effective analysis of provenance information requires well-defined query mechanisms to support complex queries over large datasets. This paper introduces an ontology-driven provenance management infrastructure for biology experiment data, as part of the Semantic Problem Solving Environment (SPSE) for Trypanosoma cruzi (T.cruzi). This provenance infrastructure, called T.cruzi Provenance Management System (PMS), is underpinned by (a) a domain-specific provenance ontology called Parasite Experiment ontology, (b) specialized query operators for provenance analysis, and (c) a provenance query engine. The query engine uses a novel optimization technique based on materialized views called materialized provenance views (MPV) to scale with increasing data size and query complexity. This comprehensive ontology-driven provenance infrastructure not only allows effective tracking and management of ongoing experiments in the Tarleton Research Group at the Center for Tropical and Emerging Global Diseases (CTEGD), but also enables researchers to retrieve the complete provenance information of scientific results for publication in literature.
NASA Technical Reports Server (NTRS)
Watts, Michael E.; Dejpour, Shabob R.
1989-01-01
The changes made to the data analysis and management program DATAMAP (Data from Aeromechanics Test and Analytics - Management and Analysis Package) are detailed. These changes are made to Version 3.07 (released February 1981) and are called Version 4.0. Version 4.0 improvements were performed by Sterling Software under contract to NASA Ames Research Center. The increased capabilities instituted in this version include the breakout of the source code into modules for ease of modification, addition of a more accurate curve fit routine, the ability to handle higher frequency data, additional data analysis features, and improvements in the functionality of existing features. These modifications will allow DATAMAP to be used on more data sets and will make future modifications and additions easier to implement.
Activity-Centric Approach to Distributed Programming
NASA Technical Reports Server (NTRS)
Levy, Renato; Satapathy, Goutam; Lang, Jun
2004-01-01
The first phase of an effort to develop a NASA version of the Cybele software system has been completed. To give meaning to even a highly abbreviated summary of the modifications to be embodied in the NASA version, it is necessary to present the following background information on Cybele: Cybele is a proprietary software infrastructure for use by programmers in developing agent-based application programs [complex application programs that contain autonomous, interacting components (agents)]. Cybele provides support for event handling from multiple sources, multithreading, concurrency control, migration, and load balancing. A Cybele agent follows a programming paradigm, called activity-centric programming, that enables an abstraction over system-level thread mechanisms. Activity-centric programming relieves application programmers of the complex tasks of thread management, concurrency control, and event management. In order to provide such functionality, activity-centric programming demands support from other layers of software. This concludes the background information. In the first phase of the present development, a new architecture for Cybele was defined. In this architecture, Cybele follows a modular service-based approach to coupling of the programming and service layers of the software architecture. In a service-based approach, the functionalities supported by activity-centric programming are apportioned, according to their characteristics, among several groups called services. A well-defined interface among all such services serves as a path that facilitates the maintenance and enhancement of such services without adverse effect on the whole software framework. The activity-centric application-program interface (API) is part of a kernel. The kernel API calls the services by use of their published interface. This approach makes it possible for any application code written exclusively under the API to be portable to any configuration of Cybele.
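A toy sketch of the activity-centric idea: application code registers event handlers ("activities") and a small kernel dispatches events, so the application never touches threads directly. This is a hand-rolled illustration, not the Cybele API, and all names are hypothetical.

```python
import queue
import threading

class ActivityKernel:
    """Minimal event-dispatching kernel: activities register handlers per event type,
    and a single worker thread delivers events, hiding thread management from the
    application code."""
    def __init__(self):
        self.handlers = {}
        self.events = queue.Queue()
        self.worker = threading.Thread(target=self._run, daemon=True)
        self.worker.start()

    def register(self, event_type, handler):
        self.handlers.setdefault(event_type, []).append(handler)

    def post(self, event_type, payload):
        self.events.put((event_type, payload))

    def _run(self):
        while True:
            event_type, payload = self.events.get()
            for handler in self.handlers.get(event_type, []):
                handler(payload)
            self.events.task_done()

kernel = ActivityKernel()
kernel.register("message", lambda msg: print("agent received:", msg))
kernel.post("message", "hello from another agent")
kernel.events.join()   # wait until the event has been handled
```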
Wang, Xiupin; Peng, Qingzhi; Li, Peiwu; Zhang, Qi; Ding, Xiaoxia; Zhang, Wen; Zhang, Liangxiao
2016-10-12
High complexity of identification for non-target triacylglycerols (TAGs) is a major challenge in lipidomics analysis. To identify non-target TAGs, a powerful tool, accurate MS(n) spectrometry generating so-called ion trees, is used. In this paper, we present a technique for efficient structural elucidation of TAGs on MS(n) spectral trees produced by LTQ Orbitrap MS(n), which was implemented as an open-source software package called TIT. The TIT software was used to support automatic annotation of non-target TAGs on MS(n) ion trees from a self-built fragment ion database. This database includes 19,108 simulated TAG molecules from a random combination of fatty acids and the corresponding 500,582 self-built multistage fragment ions (MS ≤ 3). Our software can identify TAGs using a "stage-by-stage elimination" strategy. By utilizing the MS(1) accurate mass and referenced RKMD, the TIT software can discriminate unique elemental composition candidates. The regiospecific isomers of fatty acyl chains are distinguished using MS(2) and MS(3) fragment spectra. We applied the algorithm to a selection of 45 TAG standards and demonstrated that the molecular ions could be 100% correctly assigned. Therefore, the TIT software can be applied to TAG identification in complex biological samples such as mouse plasma extracts. Copyright © 2016 Elsevier B.V. All rights reserved.
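The "stage-by-stage elimination" idea (first restrict candidates by the MS1 accurate mass, then discriminate the survivors with MS2 fragments) can be illustrated generically. The masses, fragments, and tolerance below are made up for the example; this is not the TIT implementation or its database.

```python
# Hypothetical candidate TAGs with exact precursor masses and expected MS2 fragment masses.
CANDIDATES = {
    "TAG(16:0/18:1/18:1)": {"mass": 858.75, "fragments": {577.5, 603.5}},
    "TAG(16:0/16:0/18:2)": {"mass": 828.72, "fragments": {551.5, 575.5}},
    "TAG(18:1/18:1/18:1)": {"mass": 884.77, "fragments": {603.5}},
}

def identify(precursor_mass, observed_fragments, ppm=20):
    """Stage 1: keep candidates whose precursor mass matches within a ppm tolerance.
    Stage 2: keep those whose expected fragments are all present in the MS2 spectrum."""
    tol = precursor_mass * ppm / 1e6
    stage1 = [name for name, c in CANDIDATES.items()
              if abs(c["mass"] - precursor_mass) <= tol]
    stage2 = [name for name in stage1
              if CANDIDATES[name]["fragments"] <= set(observed_fragments)]
    return stage2

print(identify(858.75, [577.5, 603.5, 339.3]))   # -> ['TAG(16:0/18:1/18:1)']
```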
NASA Technical Reports Server (NTRS)
Davis, George; Cary, Everett; Higinbotham, John; Burns, Richard; Hogie, Keith; Hallahan, Francis
2003-01-01
The paper will provide an overview of the web-based distributed simulation software system developed for end-to-end, multi-spacecraft mission design, analysis, and test at the NASA Goddard Space Flight Center (GSFC). This software system was developed for an internal research and development (IR&D) activity at GSFC called the Distributed Space Systems (DSS) Distributed Synthesis Environment (DSE). The long-term goal of the DSS-DSE is to integrate existing GSFC stand-alone test beds, models, and simulation systems to create a "hands on", end-to-end simulation environment for mission design, trade studies and simulations. The short-term goal of the DSE was therefore to develop the system architecture, and then to prototype the core software simulation capability based on a distributed computing approach, with demonstrations of some key capabilities by the end of Fiscal Year 2002 (FY02). To achieve the DSS-DSE IR&D objective, the team adopted a reference model and mission upon which FY02 capabilities were developed. The software was prototyped according to the reference model, and demonstrations were conducted for the reference mission to validate interfaces, concepts, etc. The reference model, illustrated in Fig. 1, included both space and ground elements, with functional capabilities such as spacecraft dynamics and control, science data collection, space-to-space and space-to-ground communications, mission operations, science operations, and data processing, archival and distribution addressed.
A Distributed Simulation Software System for Multi-Spacecraft Missions
NASA Technical Reports Server (NTRS)
Burns, Richard; Davis, George; Cary, Everett
2003-01-01
The paper will provide an overview of the web-based distributed simulation software system developed for end-to-end, multi-spacecraft mission design, analysis, and test at the NASA Goddard Space Flight Center (GSFC). This software system was developed for an internal research and development (IR&D) activity at GSFC called the Distributed Space Systems (DSS) Distributed Synthesis Environment (DSE). The long-term goal of the DSS-DSE is to integrate existing GSFC stand-alone test beds, models, and simulation systems to create a "hands on", end-to-end simulation environment for mission design, trade studies and simulations. The short-term goal of the DSE was therefore to develop the system architecture, and then to prototype the core software simulation capability based on a distributed computing approach, with demonstrations of some key capabilities by the end of Fiscal Year 2002 (FY02). To achieve the DSS-DSE IR&D objective, the team adopted a reference model and mission upon which FY02 capabilities were developed. The software was prototyped according to the reference model, and demonstrations were conducted for the reference mission to validate interfaces, concepts, etc. The reference model, illustrated in Fig. 1, included both space and ground elements, with functional capabilities such as spacecraft dynamics and control, science data collection, space-to-space and space-to-ground communications, mission operations, science operations, and data processing, archival and distribution addressed.
Extended Testability Analysis Tool
NASA Technical Reports Server (NTRS)
Melcher, Kevin; Maul, William A.; Fulton, Christopher
2012-01-01
The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies with sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.
Online Interactive Data Analysis of Multi-Sensor Data Using Giovanni
NASA Astrophysics Data System (ADS)
Berrick, S.; Leptoukh, G.; Liu, Z.; Rui, H.; Shen, S.; Teng, W.; Zhu, T.
2005-12-01
The goal of the GES-DISC Interactive Online Visualization and Analysis System (Giovanni) is to provide earth science users a means of performing data analysis on data in the Goddard Earth Sciences (GES) Distributed Active Archive Center (DAAC) without having to download the data. Through Giovanni, users are able to apply statistical analysis to many individual gridded global data products across multiple instruments and even inter-compare parameters from more than one instrument. Giovanni currently allows users to select a time window and a region of interest to generate many graphical output types including area plots (time-averaged), time series (area-averaged), Hovmoller plots (latitude vs. time, longitude vs. time), and animations for area plots. A number of graphical output types are also available for parameter inter-comparisons. ASCII output is also available for those who want to apply their own analysis software. Using the knowledge gained from Giovanni, a user can minimize the amount of data they need to download while maximizing the amount of relevant content in those data. The design challenges of Giovanni are (1) to successfully balance a simple, intuitive Web interface with the complexity and heterogeneity of our data, (2) to have a simple and flexible configuration so that new data sets and parameters can be added and organized for particular user communities, (3) to be agnostic with respect to the analysis and graphing software, and (4) to be scalable. In a short time, the original Giovanni (Giovanni 1) has grown from two instances to eight (Giovanni 2), each tailored for a specific user community. The demand, however, for Giovanni and its capabilities continues to increase, and in order to meet those demands, a redesign effort of Giovanni, which we call Giovanni 3, is being undertaken.
An Analysis of the Use of Social Software and Its Impact on Organizational Processes
NASA Astrophysics Data System (ADS)
Pascual-Miguel, Félix; Chaparro-Peláez, Julián; Hernández-García, Ángel
This article proposes a study on the implementation rate of the most relevant 2.0 tools and technologies in Spanish enterprises, and their impact on 12 important aspects of business processes. In order to characterize the degree of implementation and the perceived improvements in the processes, two indexes, the Implementation Index and the Impact Rate, have been created and displayed in a matrix called the "2.0 Success Matrix". Data have been analyzed from a survey of directors and executives of large companies and small and medium businesses.
NASA Technical Reports Server (NTRS)
Haering, Edward A.
2017-01-01
The world as a whole, and NASA in particular, owes a large debt of gratitude to Dr. Kenneth Plotkin for his decades of service in the field of sonic boom research and the advancement of quiet supersonic transportation. This presentation will highlight the contributions of Dr. Plotkin to a myriad of NASA projects. One of the largest efforts was the assembly and continual improvement of sonic boom propagation software tools, collectively called PCBoom, which allowed the analysis of real and imagined vehicles from Mach cutoff conditions to the hypersonic regime.
Advances in Human-Computer Interaction: Graphics and Animation Components for Interface Design
NASA Astrophysics Data System (ADS)
Cipolla Ficarra, Francisco V.; Nicol, Emma; Cipolla-Ficarra, Miguel; Richardson, Lucy
We present an analysis of a communicability methodology for graphics and animation components in interface design, called CAN (Communicability, Acceptability and Novelty). This methodology was under development between 2005 and 2010, obtaining excellent results in cultural heritage, education and microcomputing contexts, in studies where there is a bi-directional interrelation between ergonomics, usability, user-centered design, software quality and human-computer interaction. We also present heuristic results about iconography and layout design in blogs and websites of the following countries: Spain, Italy, Portugal and France.
Potential medical applications of TAE
NASA Technical Reports Server (NTRS)
Fahy, J. Ben; Kaucic, Robert; Kim, Yongmin
1986-01-01
In cooperation with scientists in the University of Washington Medical School, a microcomputer-based image processing system for quantitative microscopy, called DMD1 (Digital Microdensitometer 1), was constructed. In order to make DMD1 transportable to different hosts and image processors, we have been investigating the possibility of rewriting the lower level portions of the DMD1 software using Transportable Applications Executive (TAE) libraries and subsystems. If successful, we hope to produce a newer version of DMD1, called DMD2, running on an IBM PC/AT under the SCO XENIX System 5 operating system, using any of seven target image processors available in our laboratory. Following this implementation, copies of the system will be transferred to other laboratories with biomedical imaging applications. By integrating those applications into DMD2, we hope to eventually expand our system into a low-cost general purpose biomedical imaging workstation. This workstation will be useful not only as a self-contained instrument for clinical or research applications, but also as part of a large scale Digital Imaging Network and Picture Archiving and Communication System (DIN/PACS). Widespread application of these TAE-based image processing and analysis systems should facilitate software exchange and scientific cooperation not only within the medical community, but between the medical and remote sensing communities as well.
Tubiana, Luca; Polles, Guido; Orlandini, Enzo; Micheletti, Cristian
2018-06-07
The KymoKnot software package and web server identifies and locates physical knots or proper knots in a series of polymer conformations. It is mainly intended as an analysis tool for trajectories of linear or circular polymers, but it can be used on single instances too, e.g. protein structures in PDB format. A key element of the software package is the so-called minimally interfering chain closure algorithm that is used to detect physical knots in open chains and to locate the knotted region in both open and closed chains. The web server offers a user-friendly graphical interface that identifies the knot type and highlights the knotted region on each frame of the trajectory, which the user can visualize interactively from various viewpoints. The dynamical evolution of the knotted region along the chain contour is presented as a kymograph. All data can be downloaded in text format. The KymoKnot package is licensed under the BSD 3-Clause licence. The server is publicly available at http://kymoknot.sissa.it/kymoknot/interactive.php .
The jmzQuantML programming interface and validator for the mzQuantML data standard.
Qi, Da; Krishna, Ritesh; Jones, Andrew R
2014-03-01
The mzQuantML standard from the HUPO Proteomics Standards Initiative has recently been released, capturing quantitative data about peptides and proteins, following analysis of MS data. We present a Java application programming interface (API) for mzQuantML called jmzQuantML. The API provides robust bridges between Java classes and elements in mzQuantML files and allows random access to any part of the file. The API provides read and write capabilities, and is designed to be embedded in other software packages, enabling mzQuantML support to be added to proteomics software tools (http://code.google.com/p/jmzquantml/). The mzQuantML standard is designed around a multilevel validation system to ensure that files are structurally and semantically correct for different proteomics quantitative techniques. In this article, we also describe a Java software tool (http://code.google.com/p/mzquantml-validator/) for validating mzQuantML files, which is a formal part of the data standard. © 2014 The Authors. Proteomics published by Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Varley, Anna; Warren, Fiona C.; Richards, Suzanne H.; Calitri, Raff; Chaplin, Katherine; Fletcher, Emily; Holt, Tim A.; Lattimer, Valerie; Murdoch, Jamie; Richards, David A.; Campbell, John
2016-01-01
Background Nurse-led telephone triage is increasingly used to manage demand for general practitioner consultations in UK general practice. Previous studies are equivocal about the relationship between clinical experience and the call outcomes of nurse triage. Most research is limited to investigating nurse telephone triage in out-of-hours settings. Objective To investigate whether the professional characteristics of primary care nurses undertaking computer decision supported software telephone triage are related to call disposition. Design Questionnaire survey of nurses delivering the nurse intervention arm of the ESTEEM trial, to capture role type (practice nurse or nurse practitioner), prescriber status, number of years' nursing experience, graduate status, previous experience of triage, and perceived preparedness for triage. Our main outcome was the proportion of triaged patients recommended for follow-up within the practice (call disposition), including all contact types (face-to-face, telephone or home visit), by a general practitioner or nurse. Settings 15 general practices and 7012 patients receiving the nurse triage intervention in four regions of the UK. Participants 45 nurse practitioners and practice nurses trained in the use of clinical decision support software. Methods We investigated the associations between nursing characteristics and triage call disposition for patient 'same-day' appointment requests in general practice using multivariable logistic regression modelling. Results Valid responses from 35 nurses (78%) from 14 practices: 31/35 (89%) had ≥10 years' experience, with 24/35 (69%) having ≥20 years. Most patient contacts (3842/4605; 86%) were recommended for follow-up within the practice. Nurse practitioners were less likely to recommend patients for follow-up (odds ratio 0.19, 95% confidence interval 0.07 to 0.49) than practice nurses. Nurses who reported that their previous experience had prepared them less well for triage were more likely to recommend patients for follow-up (OR 3.17, 95% CI 1.18–5.55). Conclusion Nurse characteristics were associated with the disposition of triage calls to within-practice follow-up. Nurse practitioners, or those who reported feeling 'more prepared' for the role, were more likely to manage the call definitively. Practices considering nurse triage should ensure that nurses transitioning into new roles feel adequately prepared. While standardised training is necessary, it may not be sufficient to ensure successful implementation. PMID:27087294
Development of a Unix/VME data acquisition system
NASA Astrophysics Data System (ADS)
Miller, M. C.; Ahern, S.; Clark, S. M.
1992-01-01
The current status of a Unix-based VME data acquisition development project is described. It is planned to use existing Fortran data collection software to drive the existing CAMAC electronics via a VME CAMAC branch driver card and associated Daresbury Unix driving software. The first usable Unix driver has been written and produces single-action CAMAC cycles from test software. The data acquisition code has been implemented in test mode under Unix with few problems and effort is now being directed toward finalizing calls to the CAMAC-driving software and ultimate evaluation of the complete system.
NASA Astrophysics Data System (ADS)
Gaševic, Dragan; Djuric, Dragan; Devedžic, Vladan
A relevant initiative from the software engineering community called Model Driven Engineering (MDE) is being developed in parallel with the Semantic Web (Mellor et al. 2003a). The MDE approach to software development suggests that one should first develop a model of the system under study, which is then transformed into the real thing (i.e., an executable software entity). The most important research initiative in this area is the Model Driven Architecture (MDA), which is being developed under the umbrella of the Object Management Group (OMG). This chapter describes the basic concepts of this software engineering effort.
Static analysis of class invariants in Java programs
NASA Astrophysics Data System (ADS)
Bonilla-Quintero, Lidia Dionisia
2011-12-01
This paper presents a technique for the automatic inference of class invariants from Java bytecode. Class invariants are very important both for compiler optimization and as an aid to programmers in their efforts to reduce the number of software defects. We present the original DC-invariant analysis from Adam Webber, discuss its shortcomings, and suggest several different ways to improve it. To apply the DC-invariant analysis to identify DC-invariant assertions, all that one needs is a monotonic method analysis function and a suitable assertion domain. The DC-invariant algorithm is very general; however, the method analysis can be highly tuned to the problem at hand. For example, one could choose shape analysis as the method analysis function and use the DC-invariant analysis to simply extend it to an analysis that would yield class-wide invariants describing the shapes of linked data structures. We have a prototype implementation: a system we refer to as "the analyzer" that infers DC-invariant unary and binary relations and provides them to the user in a human-readable format. The analyzer uses those relations to identify unnecessary array bounds checks in Java programs and to perform null-reference analysis. It uses Adam Webber's relational constraint technique for the class-invariant binary relations. Early results with the analyzer were very imprecise in the presence of "dirty-called" methods. A dirty-called method is one that is called, either directly or transitively, from any constructor of the class, or from any method of the class at a point at which a disciplined field has been altered. This result was unexpected and forced an extensive search for improved techniques. An important contribution of this paper is the suggestion of several ways to improve the results by changing the way dirty-called methods are handled. The new techniques expand the set of class invariants that can be inferred over Webber's original results. The technique that produces better results uses in-line analysis. Final results are promising: we can infer sound class invariants for full-scale applications, not just toy ones.
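To make the notion of a class invariant and a "dirty-called" method concrete, here is a small illustrative example (written in Python rather than Java, and unrelated to the analyzer's actual inputs): the intended invariant is that self.items is always a list, but a helper invoked from the constructor runs while the field is still unset, which is exactly the situation that complicates sound invariant inference.

```python
class Buffer:
    """Intended class invariant: self.items is always a list (never None)
    on entry to and exit from every public method."""

    def __init__(self, capacity):
        self.items = None          # the field is not yet established here...
        self._log_state()          # ...so this call is "dirty-called": it runs
                                   # before the invariant has been set up
        self.items = []
        self.capacity = capacity

    def _log_state(self):
        # An analysis that assumes the invariant inside this helper would be unsound,
        # because the constructor calls it while self.items is still None.
        print("items is currently", self.items)

    def add(self, x):
        if len(self.items) < self.capacity:   # relies on the invariant
            self.items.append(x)

b = Buffer(2)
b.add("a")
print(b.items)
```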
Programmable ubiquitous telerobotic devices
NASA Astrophysics Data System (ADS)
Doherty, Michael; Greene, Matthew; Keaton, David; Och, Christian; Seidl, Matthew L.; Waite, William; Zorn, Benjamin G.
1997-12-01
We are investigating a field of research that we call ubiquitous telepresence, which involves the design and implementation of low-cost robotic devices that can be programmed and operated from anywhere on the Internet. These devices, which we call ubots, can be used for academic purposes (e.g., a biologist could remotely conduct a population survey), commercial purposes (e.g., a house could be shown remotely by a real-estate agent), and for recreation and education (e.g., someone could tour a museum remotely). We anticipate that such devices will become increasingly common due to recent changes in hardware and software technology. In particular, current hardware technology enables such devices to be constructed very cheaply (less than $500), and current software and network technology allows highly portable code to be written and downloaded across the Internet. In this paper, we present our prototype system architecture, and the ubot implementation we have constructed based on it. The hardware technology we use is the Handy Board, a 6811-based controller board with digital and analog inputs and outputs. Our software includes a network layer based on TCP/IP and software layers written in Java. Our software enables users across the Internet to program the behavior of the vehicle and to receive image feedback from a camera mounted on it.
Billimek, John; Guzman, Herlinda; Angulo, Marco A
2015-04-10
Low-income, Mexican-American patients with diabetes exhibit high rates of medication nonadherence, poor blood sugar control and serious complications, and often have difficulty communicating their concerns about the medication regimen to physicians. Interventions led by community health workers, non-professional community members who are trained to work with patients to improve engagement and communication during the medical visit, have had mixed success in improving outcomes. The primary objective of this project is to pilot test a prototype software toolkit called "EMPATHy" that a community health worker can administer to help patients identify the most important barriers to adherence that they face and discuss these barriers with their doctor. The EMPATHy toolkit will be piloted in an ongoing intervention (Coached Care) in which community health workers are trained to be "coaches" to meet with patients before the medical visit and help them prepare a list of important questions for the doctor. A total of 190 Mexican-American patients with poorly controlled type 2 diabetes will be recruited from December 2014 through June 2015 and will be randomly assigned to complete either a single Coached Care intervention visit with no software tools or a Coached Care visit incorporating the EMPATHy software toolkit. The primary endpoints are (1) the development of a "contextualized plan of care" (i.e., a plan of care that addresses a barrier to medication adherence in the patient's daily life) with the doctor, determined from an audio recording of the medical visit, and (2) attainment of a concrete behavioral goal set during the intervention session, assessed in a 2-week follow-up phone call to the patient. The statistical analysis will include logistic regression models and is powered to detect a 50% increase in the primary endpoints. The study will provide evidence regarding the effectiveness and feasibility of a software tool to help patients communicate with doctors about problems they face with their medications. ClinicalTrials.gov NCT02324036 Registered 16 December 2014.
The Relationship between Software Design and Children's Engagement
ERIC Educational Resources Information Center
Buckleitner, Warren
2006-01-01
This study was an attempt to measure the effects of praise and reinforcement on children in a computer learning setting. A sorting game was designed to simulate 2 interaction styles. One style, called high computer control, provided frequent praise and coaching. The other, called high child control, had narration and praise toggled off. A…
2017-03-17
NASA engineers and test directors gather in Firing Room 3 in the Launch Control Center at NASA's Kennedy Space Center in Florida, to watch a demonstration of the automated command and control software for the agency's Space Launch System (SLS) and Orion spacecraft. The software is called the Ground Launch Sequencer. It will be responsible for nearly all of the launch commit criteria during the final phases of launch countdowns. The Ground and Flight Application Software Team (GFAST) demonstrated the software. It was developed by the Command, Control and Communications team in the Ground Systems Development and Operations (GSDO) Program. GSDO is helping to prepare the center for the first test flight of Orion atop the SLS on Exploration Mission 1.
Bio-Docklets: virtualization containers for single-step execution of NGS pipelines.
Kim, Baekdoo; Ali, Thahmina; Lijeron, Carlos; Afgan, Enis; Krampis, Konstantinos
2017-08-01
Processing of next-generation sequencing (NGS) data requires significant technical skills, involving installation, configuration, and execution of bioinformatics data pipelines, in addition to specialized postanalysis visualization and data mining software. In order to address some of these challenges, developers have leveraged virtualization containers toward seamless deployment of preconfigured bioinformatics software and pipelines on any computational platform. We present an approach for abstracting the complex data operations of multistep, bioinformatics pipelines for NGS data analysis. As examples, we have deployed 2 pipelines for RNA sequencing and chromatin immunoprecipitation sequencing, preconfigured within Docker virtualization containers we call Bio-Docklets. Each Bio-Docklet exposes a single data input and output endpoint and, from a user perspective, running the pipelines is as simple as running a single bioinformatics tool. This is achieved using a "meta-script" that automatically starts the Bio-Docklets and controls the pipeline execution through the BioBlend software library and the Galaxy Application Programming Interface. The pipeline output is postprocessed by integration with the Visual Omics Explorer framework, providing interactive data visualizations that users can access through a web browser. Our goal is to enable easy access to NGS data analysis pipelines for nonbioinformatics experts on any computing environment, whether a laboratory workstation, university computer cluster, or a cloud service provider. Beyond end users, the Bio-Docklets also enable developers to programmatically deploy and run a large number of pipeline instances for concurrent analysis of multiple datasets. © The Authors 2017. Published by Oxford University Press.
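The "meta-script" pattern described above (driving a Galaxy instance programmatically through BioBlend) looks roughly like the sketch below. This is a generic BioBlend usage sketch rather than the Bio-Docklets meta-script itself; the URL, API key, input file, and the assumption that a single preloaded workflow exists are all placeholders, and a live Galaxy endpoint is required for it to run.

```python
from bioblend.galaxy import GalaxyInstance

# Placeholders: a Galaxy endpoint exposed by the container and a user API key.
gi = GalaxyInstance(url="http://127.0.0.1:8080", key="YOUR_API_KEY")

# Create a history, upload the input data, and invoke the preconfigured workflow.
history = gi.histories.create_history(name="ngs-run")
upload = gi.tools.upload_file("reads.fastq.gz", history["id"])
dataset_id = upload["outputs"][0]["id"]

workflow = gi.workflows.get_workflows()[0]           # assume one preloaded pipeline
invocation = gi.workflows.invoke_workflow(
    workflow["id"],
    inputs={"0": {"id": dataset_id, "src": "hda"}},   # feed the dataset to input step 0
    history_id=history["id"],
)
print("invocation id:", invocation["id"])
```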
Bio-Docklets: virtualization containers for single-step execution of NGS pipelines
Kim, Baekdoo; Ali, Thahmina; Lijeron, Carlos; Afgan, Enis
2017-01-01
Processing of next-generation sequencing (NGS) data requires significant technical skills, involving installation, configuration, and execution of bioinformatics data pipelines, in addition to specialized postanalysis visualization and data mining software. In order to address some of these challenges, developers have leveraged virtualization containers toward seamless deployment of preconfigured bioinformatics software and pipelines on any computational platform. We present an approach for abstracting the complex data operations of multistep, bioinformatics pipelines for NGS data analysis. As examples, we have deployed 2 pipelines for RNA sequencing and chromatin immunoprecipitation sequencing, preconfigured within Docker virtualization containers we call Bio-Docklets. Each Bio-Docklet exposes a single data input and output endpoint and, from a user perspective, running the pipelines is as simple as running a single bioinformatics tool. This is achieved using a “meta-script” that automatically starts the Bio-Docklets and controls the pipeline execution through the BioBlend software library and the Galaxy Application Programming Interface. The pipeline output is postprocessed by integration with the Visual Omics Explorer framework, providing interactive data visualizations that users can access through a web browser. Our goal is to enable easy access to NGS data analysis pipelines for nonbioinformatics experts on any computing environment, whether a laboratory workstation, university computer cluster, or a cloud service provider. Beyond end users, the Bio-Docklets also enable developers to programmatically deploy and run a large number of pipeline instances for concurrent analysis of multiple datasets. PMID:28854616
Versatile synchronized real-time MEG hardware controller for large-scale fast data acquisition.
Sun, Limin; Han, Menglai; Pratt, Kevin; Paulson, Douglas; Dinh, Christoph; Esch, Lorenz; Okada, Yoshio; Hämäläinen, Matti
2017-05-01
Versatile controllers for accurate, fast, and real-time synchronized acquisition of large-scale data are useful in many areas of science, engineering, and technology. Here, we describe the development of controller software based on a technique called a queued state machine for controlling the data acquisition (DAQ) hardware, continuously acquiring a large amount of data synchronized across a large number of channels (>400) at a fast rate (up to 20 kHz/channel) in real time, and interfacing with applications for real-time data analysis and display of electrophysiological data. This DAQ controller was developed specifically for a 384-channel pediatric whole-head magnetoencephalography (MEG) system, but its architecture is useful for a wide range of applications. This controller running in a LabVIEW environment interfaces with microprocessors in the MEG sensor electronics to control their real-time operation. It also interfaces with real-time MEG analysis software via transmission control protocol/internet protocol, to control the synchronous acquisition and transfer of the data in real time from >400 channels to acquisition and analysis workstations. The successful implementation of this controller for an MEG system with a large number of channels demonstrates the feasibility of employing the present architecture in several other applications.
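As an illustration of the queued-state-machine pattern mentioned above (and not of the LabVIEW implementation itself), the sketch below shows how a queue of state tokens can drive an acquisition loop so that external commands are processed between scans. All states and messages are invented for the example.

```python
# Generic queued state machine: a queue of state tokens drives the control loop, so
# external commands (e.g. arriving over a TCP/IP link) can be interleaved with
# acquisition without busy polling. Purely illustrative; not MEG controller code.
import queue

def queued_state_machine(commands, n_scans=5):
    q = queue.Queue()
    for c in commands:          # commands received before the loop starts
        q.put(c)
    q.put("acquire")            # default state once configuration is done
    state, scans = None, 0
    while state != "shutdown":
        state = q.get()
        if state == "configure":
            print("configuring DAQ hardware")
        elif state == "acquire":
            scans += 1
            print(f"acquiring scan {scans} across all channels")
            if scans < n_scans:
                q.put("acquire")    # re-enqueue to keep streaming
            else:
                q.put("shutdown")
        elif state == "shutdown":
            print("stopping acquisition")

queued_state_machine(["configure"])
```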
Computational Support for Technology- Investment Decisions
NASA Technical Reports Server (NTRS)
Adumitroaie, Virgil; Hua, Hook; Lincoln, William; Block, Gary; Mrozinski, Joseph; Shelton, Kacie; Weisbin, Charles; Elfes, Alberto; Smith, Jeffrey
2007-01-01
Strategic Assessment of Risk and Technology (START) is a user-friendly computer program that assists human managers in making decisions regarding research-and-development investment portfolios in the presence of uncertainties and of non-technological constraints that include budgetary and time limits, restrictions related to infrastructure, and programmatic and institutional priorities. START facilitates quantitative analysis of technologies, capabilities, missions, scenarios and programs, and thereby enables the selection and scheduling of value-optimal development efforts. START incorporates features that, variously, perform or support a unique combination of functions, most of which are not systematically performed or supported by prior decision-support software. These functions include the following: Optimal portfolio selection using an expected-utility-based assessment of capabilities and technologies; Temporal investment recommendations; Distinctions between enhancing and enabling capabilities; Analysis of partial funding for enhancing capabilities; and Sensitivity and uncertainty analysis. START can run on almost any computing hardware, within Linux and related operating systems that include Mac OS X versions 10.3 and later, and can run in Windows under the Cygwin environment. START can be distributed in binary code form. START calls, as external libraries, several open-source software packages. Output is in Excel (.xls) file format.
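To make the "optimal portfolio selection" function concrete, here is a hedged, toy sketch of choosing the investment subset with the highest total expected utility under a budget cap; the candidate tasks, utilities, costs, and brute-force search are illustrative stand-ins for START's actual models and optimizer.

```python
# Toy value-optimal portfolio selection under a budget limit. The task list, expected
# utilities, and costs are made up for illustration; START's real optimization and
# utility models are not reproduced here.
from itertools import combinations

tasks = {                      # name: (expected utility, cost in $M)
    "advanced TPS": (8.0, 3.0),
    "autonomy software": (5.0, 2.0),
    "power storage": (6.0, 4.0),
    "comm upgrade": (3.0, 1.5),
}
budget = 6.0

best_value, best_set = 0.0, ()
for r in range(len(tasks) + 1):
    for subset in combinations(tasks, r):
        cost = sum(tasks[t][1] for t in subset)
        value = sum(tasks[t][0] for t in subset)
        if cost <= budget and value > best_value:
            best_value, best_set = value, subset

print(f"selected portfolio: {best_set} (expected utility {best_value})")
```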
Versatile synchronized real-time MEG hardware controller for large-scale fast data acquisition
NASA Astrophysics Data System (ADS)
Sun, Limin; Han, Menglai; Pratt, Kevin; Paulson, Douglas; Dinh, Christoph; Esch, Lorenz; Okada, Yoshio; Hämäläinen, Matti
2017-05-01
Versatile controllers for accurate, fast, and real-time synchronized acquisition of large-scale data are useful in many areas of science, engineering, and technology. Here, we describe the development of controller software based on a technique called a queued state machine for controlling the data acquisition (DAQ) hardware, continuously acquiring a large amount of data synchronized across a large number of channels (>400) at a fast rate (up to 20 kHz/channel) in real time, and interfacing with applications for real-time data analysis and display of electrophysiological data. This DAQ controller was developed specifically for a 384-channel pediatric whole-head magnetoencephalography (MEG) system, but its architecture is useful for a wide range of applications. This controller running in a LabVIEW environment interfaces with microprocessors in the MEG sensor electronics to control their real-time operation. It also interfaces with real-time MEG analysis software via transmission control protocol/internet protocol, to control the synchronous acquisition and transfer of the data in real time from >400 channels to acquisition and analysis workstations. The successful implementation of this controller for an MEG system with a large number of channels demonstrates the feasibility of employing the present architecture in several other applications.
Yu, Kebing; Salomon, Arthur R
2009-12-01
Recently, dramatic progress has been achieved in expanding the sensitivity, resolution, mass accuracy, and scan rate of mass spectrometers able to fragment and identify peptides through MS/MS. Unfortunately, this enhanced ability to acquire proteomic data has not been accompanied by a concomitant increase in the availability of flexible tools allowing users to rapidly assimilate, explore, and analyze this data and adapt to various experimental workflows with minimal user intervention. Here we fill this critical gap by providing a flexible relational database called PeptideDepot for organization of expansive proteomic data sets, collation of proteomic data with available protein information resources, and visual comparison of multiple quantitative proteomic experiments. Our software design, built upon the synergistic combination of a MySQL database for safe warehousing of proteomic data with a FileMaker-driven graphical user interface for flexible adaptation to diverse workflows, enables proteomic end-users to directly tailor the presentation of proteomic data to the unique analysis requirements of the individual proteomics lab. PeptideDepot may be deployed as an independent software tool or integrated directly with our high throughput autonomous proteomic pipeline used in the automated acquisition and post-acquisition analysis of proteomic data.
Mutation detection in the human HSP70B′ gene by denaturing high-performance liquid chromatography
Hecker, Karl H.; Asea, Alexzander; Kobayashi, Kaoru; Green, Stacy; Tang, Dan; Calderwood, Stuart K.
2000-01-01
Variances, particularly single nucleotide polymorphisms (SNP), in the genomic sequence of individuals are the primary key to understanding gene function as it relates to differences in the susceptibility to disease, environmental influences, and therapy. In this report, the HSP70B′ gene is the target sequence for mutation detection in biopsy samples from human prostate cancer patients undergoing combined hyperthermia and radiation therapy at the Dana-Farber Cancer Institute, using temperature-modulated heteroduplex analysis (TMHA). The underlying principles of TMHA for mutation detection using DHPLC technology are discussed. The procedures involved in amplicon design for mutation analysis by DHPLC are detailed. The melting behavior of the complete coding sequence of the target gene is characterized using WAVEMAKERTM software. Four overlapping amplicons, which span the complete coding region of the HSP70B′ gene, amenable to mutation detection by DHPLC were identified based on the software-predicted melting profile of the target sequence. TMHA was performed on PCR products of individual amplicons of the HSP70B′ gene on the WAVE® Nucleic Acid Fragment Analysis System. The criteria for mutation calling by comparing wild-type and mutant chromatographic patterns are discussed. PMID:11189446
Mutation detection in the human HSP7OB' gene by denaturing high-performance liquid chromatography.
Hecker, K H; Asea, A; Kobayashi, K; Green, S; Tang, D; Calderwood, S K
2000-11-01
Variances, particularly single nucleotide polymorphisms (SNP), in the genomic sequence of individuals are the primary key to understanding gene function as it relates to differences in the susceptibility to disease, environmental influences, and therapy. In this report, the HSP70B' gene is the target sequence for mutation detection in biopsy samples from human prostate cancer patients undergoing combined hyperthermia and radiation therapy at the Dana-Farber Cancer Institute, using temperature-modulated heteroduplex analysis (TMHA). The underlying principles of TMHA for mutation detection using DHPLC technology are discussed. The procedures involved in amplicon design for mutation analysis by DHPLC are detailed. The melting behavior of the complete coding sequence of the target gene is characterized using WAVEMAKER software. Four overlapping amplicons, which span the complete coding region of the HSP70B' gene, amenable to mutation detection by DHPLC were identified based on the software-predicted melting profile of the target sequence. TMHA was performed on PCR products of individual amplicons of the HSP70B' gene on the WAVE Nucleic Acid Fragment Analysis System. The criteria for mutation calling by comparing wild-type and mutant chromatographic patterns are discussed.
Project management in the development of scientific software
NASA Astrophysics Data System (ADS)
Platz, Jochen
1986-08-01
This contribution is a rough outline of a comprehensive project management model for the development of software for scientific applications. The model was tested in the unique environment of the Siemens AG Corporate Research and Technology Division. Its focal points are the structuring of project content - the so-called phase organization, the project organization and the planning model used, and its particular applicability to innovative projects. The outline focuses largely on actual project management aspects rather than associated software engineering measures.
Students' Perception on the Usefulness of ICT-Based Language Program
ERIC Educational Resources Information Center
Wiyaka; Mujiyanto, Januarius; Rukmini, Dwi
2018-01-01
This paper presents the result of a survey on the usefulness of an ICT-based software program called DEC (a pseudonym for a particular commercial English learning resource). This program was utilized by the English Department of the University of PGRI Semarang as complementary software in the Integrated Course offered to first-semester students. The…
Study to Minimize Learning Progress Differences in Software Learning Class Using PLITAZ System
ERIC Educational Resources Information Center
Dong, Jian-Jie; Hwang, Wu-Yuin
2012-01-01
This study developed a system using two-phased strategies called "Pause Lecture, Instant Tutor-Tutee Match, and Attention Zone" (PLITAZ). This system was used to help solve learning challenges and to minimize learning progress differences in a software learning class. During a teacher's lecture time, students were encouraged to anonymously express…
ERIC Educational Resources Information Center
Zhang, Qing; Brode, Ly; Cao, Tingting; Thompson, J. E.
2017-01-01
We describe the construction and initial demonstration of a new instructional tool called ROXI (Research Opportunity through eXperimental Instruction). The system interfaces a series of electronic sensors to control software via the Arduino platform. The sensors have been designed to enable low-cost data collection in laboratory courses. Data are…
ERIC Educational Resources Information Center
Pendzick, Richard E.; Downs, Robert L.
2002-01-01
Describes software for electronic visitor management (EVM) called EasyLobbyTM, currently in use in thousands of federal and corporate installations throughout the world and its application for school and campus environments. Explains EasyLobbyTM's use to replace visitor logs, capture and store visitor data electronically, and provide badges that…
GRIDVIEW: Recent Improvements in Research and Education Software for Exploring Mars Topography
NASA Technical Reports Server (NTRS)
Roark, J. H.; Frey, H. V.
2001-01-01
We have developed an Interactive Data Language (IDL) scientific visualization software tool called GRIDVIEW that can be used in research and education to explore and study the most recent Mars Orbiter Laser Altimeter (MOLA) gridded topography of Mars (http://denali.gsfc.nasa.gov/mola_pub/gridview). Additional information is contained in the original extended abstract.
Talking high-tech turkey: USDA uses new software to analyze habitat management scenarios
H. Michael Rauscher; John E. Spearman; C. Preston Fout; Robert H. Giles; Mark J. Twery
2001-01-01
Researchers at the USDA Forest Service, Northeastern and Southern Research Stations, with many collaborators, have been developing a computer software product called the NED Decision Support System. This program is designed to help forestry consultants and their private landowner clients develop goals, assess current and potential conditions, provide ways to study and...
The Influence Of Component Alignment On The Life Of Total Knee Prostheses
NASA Astrophysics Data System (ADS)
Bugariu, Delia; Bereteu, Liviu
2012-12-01
An arthritic knee affects the patient's life by causing pain and limiting movement. If the cartilage and the bone surfaces are severely affected, the natural joint is replaced with an artificial joint. The procedure is called total knee arthroplasty (TKA). Lately, the number of implanted total knee prostheses has grown steadily. An important factor in TKA is the perfect alignment of the total knee prosthesis (TKP) components. Component misalignment can lead to loss of the prosthesis by producing wear particles. The paper proposes a study on the mechanical behavior of a TKP based on numerical analysis, using ANSYS software. The numerical analysis is based on both the normal and the changed angle of component alignment.
AnClim and ProClimDB software for data quality control and homogenization of time series
NASA Astrophysics Data System (ADS)
Stepanek, Petr
2015-04-01
During the last decade, a software package consisting of AnClim, ProClimDB and LoadData for processing (mainly climatological) data has been created. This software offers a complex solution for the processing of climatological time series, starting from loading the data from a central database (e.g. Oracle, using the LoadData software), through data quality control and homogenization, to time series analysis, extreme value evaluation and RCM output verification and correction (ProClimDB and AnClim software). The detection of inhomogeneities is carried out on a monthly scale through the application of AnClim, or newly by R functions called from ProClimDB, while quality control, the preparation of reference series and the correction of detected breaks are carried out by the ProClimDB software. The software combines many statistical tests, types of reference series and time scales (monthly, seasonal and annual, daily and sub-daily ones). These can be used to create an "ensemble" of solutions, which may be more reliable than any single method. AnClim software is suitable for educational purposes, e.g. for students getting acquainted with methods used in climatology. Built-in graphical tools and comparison of various statistical tests help in better understanding of a given method. ProClimDB, by contrast, is a tool aimed at processing large climatological datasets. Recently, functions from R can be used within the software, making it more efficient in data processing and capable of easy inclusion of new methods (when available under R). An example of usage is easy comparison of methods for the correction of inhomogeneities in daily data (HOM of Paul Della-Marta, SPLIDHOM method of Olivier Mestre, DAP - our own method, QM of Xiaolan Wang and others). The software, together with further information, is available at www.climahom.eu. Acknowledgement: this work was partially funded by the project "Building up a multidisciplinary scientific team focused on drought" No. CZ.1.07/2.3.00/20.0248.
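The following sketch, which is not AnClim/ProClimDB code, illustrates the relative-homogeneity idea behind such packages: build a candidate-minus-reference difference series and scan possible break points with a simple two-sample test. The synthetic series, the injected break, and the choice of a t-test are assumptions made for the example.

```python
# Toy relative homogeneity test: a shift in the candidate-minus-reference difference
# series marks a likely inhomogeneity (e.g. station relocation). Real homogenization
# software combines many tests, reference types, and correction methods.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
years = 60
reference = 10 + rng.normal(0, 0.5, years)
candidate = reference + rng.normal(0, 0.3, years)
candidate[35:] += 1.2          # artificial break injected for the example

diff = candidate - reference
best_t, best_year = 0.0, None
for k in range(10, years - 10):            # avoid too-short segments
    t, _ = stats.ttest_ind(diff[:k], diff[k:])
    if abs(t) > abs(best_t):
        best_t, best_year = t, k

print(f"most likely break after year index {best_year} (t = {best_t:.1f})")
```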
Prowess - A Software Model for the Ooty Wide Field Array
NASA Astrophysics Data System (ADS)
Marthi, Visweshwar Ram
2017-03-01
One of the scientific objectives of the Ooty Wide Field Array (OWFA) is to observe the redshifted H i emission from z ∼ 3.35. Although predictions spell out optimistic outcomes in reasonable integration times, these studies were based purely on analytical assumptions, without accounting for limiting systematics. A complete software model for OWFA has been developed with a view to understanding the instrument-induced systematics. This model has been implemented through a suite of programs, together called Prowess, which has been conceived in the dual role of an emulator and observatory data analysis software. The programming philosophy followed in building Prowess enables a general user to define their own set of functions and add new functionality. This paper describes a co-ordinate system suitable for OWFA in which the baselines are defined. The foregrounds are simulated from their angular power spectra. The visibilities are then computed from the foregrounds. These visibilities are then used for further processing, such as calibration and power spectrum estimation. The package allows for rich visualization features in multiple output formats in an interactive fashion, giving the user an intuitive feel for the data. Prowess has been extensively used for numerical predictions of the foregrounds for the OWFA H i experiment.
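As a hedged illustration of the visibility computation step described above, the sketch below evaluates V(u, v) = Σ_k S_k exp(−2πi(u l_k + v m_k)) for a toy set of point-source foregrounds and baselines; none of the numbers correspond to the actual OWFA configuration.

```python
# Compute visibilities for a toy point-source foreground model on a few baselines.
# Fluxes, positions, and baseline lengths are invented for the example.
import numpy as np

rng = np.random.default_rng(1)
n_src = 100
flux = rng.exponential(1.0, n_src)          # Jy
l = rng.uniform(-0.05, 0.05, n_src)         # direction cosines
m = rng.uniform(-0.05, 0.05, n_src)

baselines_uv = np.array([(k * 30.0, 0.0) for k in range(1, 11)])  # in wavelengths

def visibilities(uv, flux, l, m):
    u, v = uv[:, 0:1], uv[:, 1:2]            # shape (n_baselines, 1)
    phase = -2j * np.pi * (u * l + v * m)    # broadcast over sources
    return (flux * np.exp(phase)).sum(axis=1)

print(visibilities(baselines_uv, flux, l, m))
```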
Diagnostic Analyzer for Gearboxes (DAG): User's Guide. Version 3.1 for Microsoft Windows 3.1
NASA Technical Reports Server (NTRS)
Jammu, Vinay B.; Kourosh, Danai
1997-01-01
This documentation describes the Diagnostic Analyzer for Gearboxes (DAG) software for performing fault diagnosis of gearboxes. First, the user would construct a graphical representation of the gearbox using the gear, bearing, shaft, and sensor tools contained in the DAG software. Next, a set of vibration features obtained by processing the vibration signals recorded from the gearbox using a signal analyzer is required. Given this information, the DAG software uses an unsupervised neural network referred to as the Fault Detection Network (FDN) to identify the occurrence of faults, and a pattern classifier called Single Category-Based Classifier (SCBC) for abnormality scaling of individual vibration features. The abnormality-scaled vibration features are then used as inputs to a Structure-Based Connectionist Network (SBCN) for identifying faults in gearbox subsystems and components. The weights of the SBCN represent its diagnostic knowledge and are derived from the structure of the gearbox graphically presented in DAG. The outputs of SBCN are fault possibility values between 0 and 1 for individual subsystems and components in the gearbox with a 1 representing a definite fault and a 0 representing normality. This manual describes the steps involved in creating the diagnostic gearbox model, along with the options and analysis tools of the DAG software.
An Ontology-based Architecture for Integration of Clinical Trials Management Applications
Shankar, Ravi D.; Martins, Susana B.; O’Connor, Martin; Parrish, David B.; Das, Amar K.
2007-01-01
Management of complex clinical trials involves coordinated use of a myriad of software applications by trial personnel. The applications typically use distinct knowledge representations and generate an enormous amount of information during the course of a trial. It becomes vital that the applications exchange trial semantics to enable efficient management of the trials and subsequent analysis of clinical trial data. Existing model-based frameworks do not address the requirements of semantic integration of heterogeneous applications. We have built an ontology-based architecture to support interoperation of clinical trial software applications. Central to our approach is a suite of clinical trial ontologies, which we call Epoch, that define the vocabulary and semantics necessary to represent information on clinical trials. We are continuing to demonstrate and validate our approach with different clinical trials management applications and with a growing number of clinical trials. PMID:18693919
Chemkin-II: A Fortran chemical kinetics package for the analysis of gas-phase chemical kinetics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kee, R.J.; Rupley, F.M.; Miller, J.A.
1989-09-01
This document is the user's manual for the second-generation Chemkin package. Chemkin is a software package whose purpose is to facilitate the formation, solution, and interpretation of problems involving elementary gas-phase chemical kinetics. It provides an especially flexible and powerful tool for incorporating complex chemical kinetics into simulations of fluid dynamics. The package consists of two major software components: an Interpreter and a Gas-Phase Subroutine Library. The Interpreter is a program that reads a symbolic description of an elementary, user-specified chemical reaction mechanism. One output from the Interpreter is a data file that forms a link to the Gas-Phase Subroutine Library. This library is a collection of about 100 highly modular Fortran subroutines that may be called to return information on equation of state, thermodynamic properties, and chemical production rates.
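A minimal sketch of the kind of quantity the Gas-Phase Subroutine Library returns, assuming a single toy reaction with made-up Arrhenius parameters (this is not Chemkin code or a real mechanism file):

```python
# Rate coefficient k = A * T**b * exp(-E/(R*T)) and the resulting molar production
# rates for one illustrative elementary reaction. Parameters and concentrations are
# placeholders chosen only to show the calculation.
import math

R = 8.314  # J/(mol K)

def arrhenius(A, b, E, T):
    return A * T**b * math.exp(-E / (R * T))

# Toy reaction: H + O2 -> OH + O
A, b, E = 3.5e10, -0.4, 7.0e4            # illustrative units/values
T = 1500.0                               # K
conc = {"H": 1e-6, "O2": 5e-6}           # mol/cm^3

k = arrhenius(A, b, E, T)
rate_of_progress = k * conc["H"] * conc["O2"]
production = {"H": -rate_of_progress, "O2": -rate_of_progress,
              "OH": +rate_of_progress, "O": +rate_of_progress}
print(k, production)
```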
NASA Astrophysics Data System (ADS)
Tokareva, Victoria
2018-04-01
New-generation medicine demands better quality of analysis, increasing the amount of data collected during checkups while simultaneously decreasing the invasiveness of procedures. Thus it becomes urgent not only to develop advanced modern hardware, but also to implement special software infrastructure for using it in everyday clinical practice, the so-called Picture Archiving and Communication Systems (PACS). Developing a distributed PACS is a challenging task in present-day medical informatics. The paper discusses the architecture of a distributed PACS server for processing large high-quality medical images, with respect to the technical specifications of modern medical imaging hardware, as well as international standards in medical imaging software. The MapReduce paradigm is proposed for image reconstruction by the server, and the details of utilizing the Hadoop framework for this task are discussed in order to make the design of the distributed PACS as ergonomic and as well adapted to the needs of end users as possible.
Kukafka, Rita; Khan, Sharib A.; Hutchinson, Carly; McFarlane, Delano J.; Li, Jianhua; Ancker, Jessica S.; Cohall, Alwyn
2007-01-01
We describe the steps taken by the Harlem Health Promotion Center to develop a community-specific health web portal aimed at promoting health and well-being in Harlem. Methods and results that begin with data collection and move onto elucidating requirements for the web portal are discussed. Sentiments of distrust in medical institutions, and the desire for community specific content and resources were among the needs emanating from our data analysis. These findings guided our decision to customize social software designed to foster connections, collaborations, flexibility, and interactivity; an “architecture of participation”. While we maintain that the leveraging of social software may indeed be the way to build healthy communities and support learning and engagement in underserved communities, our conclusion calls for careful thinking, testing and evaluation research to establish best practice models for leveraging these emerging technologies to support health improvements in the community. PMID:18693872
Numerical evaluation of an innovative cup layout for open volumetric solar air receivers
NASA Astrophysics Data System (ADS)
Cagnoli, Mattia; Savoldi, Laura; Zanino, Roberto; Zaversky, Fritz
2016-05-01
This paper proposes an innovative volumetric solar absorber design to be used in high-temperature air receivers of solar power tower plants. The innovative absorber, a so-called CPC-stacked-plate configuration, applies the well-known principle of a compound parabolic concentrator (CPC) for the first time in a volumetric solar receiver, heating air to high temperatures. The proposed absorber configuration is analyzed numerically, applying first the open-source ray-tracing software Tonatiuh in order to obtain the solar flux distribution on the absorber's surfaces. Next, a Computational Fluid Dynamics (CFD) analysis of a representative single channel of the innovative receiver is performed, using the commercial CFD software ANSYS Fluent. The solution of the conjugate heat transfer problem shows that the behavior of the new absorber concept is promising; however, further optimization of the geometry will be necessary in order to exceed the performance of the classical absorber designs.
Fast Bayesian Inference of Copy Number Variants using Hidden Markov Models with Wavelet Compression
Wiedenhoeft, John; Brugel, Eric; Schliep, Alexander
2016-01-01
By integrating Haar wavelets with Hidden Markov Models, we achieve drastically reduced running times for Bayesian inference using Forward-Backward Gibbs sampling. We show that this improves detection of genomic copy number variants (CNV) in array CGH experiments compared to the state-of-the-art, including standard Gibbs sampling. The method concentrates computational effort on chromosomal segments which are difficult to call, by dynamically and adaptively recomputing consecutive blocks of observations likely to share a copy number. This makes routine diagnostic use and re-analysis of legacy data collections feasible; to this end, we also propose an effective automatic prior. An open source software implementation of our method is available at http://schlieplab.org/Software/HaMMLET/ (DOI: 10.5281/zenodo.46262). This paper was selected for oral presentation at RECOMB 2016, and an abstract is published in the conference proceedings. PMID:27177143
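To illustrate the Haar-wavelet compression idea only (not the Forward-Backward Gibbs sampler itself), here is a small sketch that transforms a noisy two-level signal and counts the detail coefficients that survive thresholding; the signal, segment boundary, and threshold are invented for the example.

```python
# Haar wavelet transform of a noisy observation sequence followed by thresholding:
# the few surviving detail coefficients mark where blocks of observations likely
# share a copy number. Illustrative only; not the HaMMLET implementation.
import numpy as np

def haar_threshold(signal, keep=0.05):
    coeffs, approx = [], signal.astype(float)
    while len(approx) > 1:
        even, odd = approx[0::2], approx[1::2]
        coeffs.append((even - odd) / np.sqrt(2))   # detail coefficients
        approx = (even + odd) / np.sqrt(2)
    detail = np.concatenate(coeffs)
    cutoff = np.quantile(np.abs(detail), 1 - keep)
    kept = np.abs(detail) >= cutoff                # block boundaries live here
    return kept.sum(), len(detail)

rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(2, 0.2, 256), rng.normal(3, 0.2, 256)])  # one CNV edge
print(haar_threshold(x))  # e.g. (26, 511): only a few coefficients carry the structure
```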
NASA Technical Reports Server (NTRS)
Laubenthal, N. A.; Bertsch, D.; Lal, N.; Etienne, A.; Mcdonald, L.; Mattox, J.; Sreekumar, P.; Nolan, P.; Fierro, J.
1992-01-01
The Energetic Gamma Ray Telescope Experiment (EGRET) on the Compton Gamma Ray Observatory has been in orbit for more than a year and is being used to map the full sky for gamma rays in a wide energy range from 30 to 20,000 MeV. Already these measurements have resulted in a wide range of exciting new information on quasars, pulsars, galactic sources, and diffuse gamma ray emission. The central part of the analysis is done with sky maps that typically cover an 80 x 80 degree section of the sky for an exposure time of several days. Specific software developed for this program generates the counts, exposure, and intensity maps. The analysis is done on a network of UNIX based workstations and takes full advantage of a custom-built user interface called X-dialog. The maps that are generated are stored in the FITS format for a collection of energies. These, along with similar diffuse emission background maps generated from a model calculation, serve as input to a maximum likelihood program that produces maps of likelihood with optional contours that are used to evaluate regions for sources. Likelihood also evaluates the background corrected intensity at each location for each energy interval from which spectra can be generated. Being in a standard FITS format permits all of the maps to be easily accessed by the full complement of tools available in several commercial astronomical analysis systems. In the EGRET case, IDL is used to produce graphics plots in two and three dimensions and to quickly implement any special evaluation that might be desired. Other custom-built software, such as the spectral and pulsar analyses, take advantage of the XView toolkit for display and Postscript output for the color hard copy. This poster paper outlines the data flow and provides examples of the user interfaces and output products. It stresses the advantages that are derived from the integration of the specific instrument-unique software and powerful commercial tools for graphics and statistical evaluation. This approach has several proven advantages including flexibility, a minimum of development effort, ease of use, and portability.
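For readers unfamiliar with the maximum-likelihood step described above, the following toy sketch compares a "background only" against a "background plus point source" Poisson model for a few pixels and forms the test statistic TS = 2(lnL1 − lnL0); the counts, exposure, and point-spread weights are fabricated, and this is not the EGRET likelihood code.

```python
# Poisson maximum-likelihood source test on a handful of toy map pixels.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(3)
psf = np.array([0.5, 0.3, 0.2])          # fraction of source counts per pixel
background = np.array([4.0, 4.0, 4.0])   # expected background counts per pixel
counts = rng.poisson(background + 10.0 * psf)

def neg_loglike(s):
    # Negative Poisson log-likelihood (constant terms dropped) for source strength s.
    mu = background + s * psf
    return -(counts * np.log(mu) - mu).sum()

fit = minimize_scalar(neg_loglike, bounds=(0.0, 100.0), method="bounded")
TS = 2.0 * (neg_loglike(0.0) - fit.fun)
print(f"fitted source counts {fit.x:.1f}, TS = {TS:.1f}")
```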
Ontological Modeling for Integrated Spacecraft Analysis
NASA Technical Reports Server (NTRS)
Wicks, Erica
2011-01-01
Current spacecraft work as a cooperative group of a number of subsystems. Each of these requires modeling software for development, testing, and prediction. It is the goal of my team to create an overarching software architecture called the Integrated Spacecraft Analysis (ISCA) to aid in deploying the discrete subsystems' models. Such a plan has been attempted in the past, and has failed due to the excessive scope of the project. Our goal in this version of ISCA is to use new resources to reduce the scope of the project, including using ontological models to help link the internal interfaces of subsystems' models with the ISCA architecture. I have created an ontology of functions specific to the modeling system of the navigation system of a spacecraft. The resulting ontology not only links, at an architectural level, language-specific instantiations of the modeling system's code, but also is web-viewable and can act as a documentation standard. This ontology is proof of the concept that ontological modeling can aid in the integration necessary for ISCA to work, and can act as the prototype for future ISCA ontologies.
Knowledge assistant: A sensor fusion framework for robotic environmental characterization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Feddema, J.T.; Rivera, J.J.; Tucker, S.D.
1996-12-01
A prototype sensor fusion framework called the "Knowledge Assistant" has been developed and tested on a gantry robot at Sandia National Laboratories. This Knowledge Assistant guides the robot operator during the planning, execution, and post analysis stages of the characterization process. During the planning stage, the Knowledge Assistant suggests robot paths and speeds based on knowledge of sensors available and their physical characteristics. During execution, the Knowledge Assistant coordinates the collection of data through a data acquisition "specialist." During execution and post analysis, the Knowledge Assistant sends raw data to other "specialists," which include statistical pattern recognition software, a neural network, and model-based search software. After the specialists return their results, the Knowledge Assistant consolidates the information and returns a report to the robot control system where the sensed objects and their attributes (e.g. estimated dimensions, weight, material composition, etc.) are displayed in the world model. This paper highlights the major components of this system.
NASA Astrophysics Data System (ADS)
Tavakkol, Sasan; Lynett, Patrick
2017-08-01
In this paper, we introduce an interactive coastal wave simulation and visualization software, called Celeris. Celeris is an open source software which needs minimum preparation to run on a Windows machine. The software solves the extended Boussinesq equations using a hybrid finite volume-finite difference method and supports moving shoreline boundaries. The simulation and visualization are performed on the GPU using Direct3D libraries, which enables the software to run faster than real-time. Celeris provides a first-of-its-kind interactive modeling platform for coastal wave applications and it supports simultaneous visualization with both photorealistic and colormapped rendering capabilities. We validate our software through comparison with three standard benchmarks for non-breaking and breaking waves.
Mining Software Usage with the Automatic Library Tracking Database (ALTD)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hadri, Bilel; Fahey, Mark R
2013-01-01
Tracking software usage is important for HPC centers, computer vendors, code developers and funding agencies to provide more efficient and targeted software support, and to forecast needs and guide HPC software effort towards the Exascale era. However, accurately tracking software usage on HPC systems has been a challenging task. In this paper, we present a tool called Automatic Library Tracking Database (ALTD) that has been developed and put in production on several Cray systems. The ALTD infrastructure prototype automatically and transparently stores information about libraries linked into an application at compilation time and also the executables launched in a batch job. We will illustrate the usage of libraries, compilers and third party software applications on a system managed by the National Institute for Computational Sciences.
Understanding software faults and their role in software reliability modeling
NASA Technical Reports Server (NTRS)
Munson, John C.
1994-01-01
This study is a direct result of an on-going project to model the reliability of a large real-time control avionics system. In previous modeling efforts with this system, hardware reliability models were applied in modeling the reliability behavior of this system. In an attempt to enhance the performance of the adapted reliability models, certain software attributes were introduced in these models to control for differences between programs and also sequential executions of the same program. As the basic nature of the software attributes that affect software reliability become better understood in the modeling process, this information begins to have important implications on the software development process. A significant problem arises when raw attribute measures are to be used in statistical models as predictors, for example, of measures of software quality. This is because many of the metrics are highly correlated. Consider the two attributes: lines of code, LOC, and number of program statements, Stmts. In this case, it is quite obvious that a program with a high value of LOC probably will also have a relatively high value of Stmts. In the case of low level languages, such as assembly language programs, there might be a one-to-one relationship between the statement count and the lines of code. When there is a complete absence of linear relationship among the metrics, they are said to be orthogonal or uncorrelated. Usually the lack of orthogonality is not serious enough to affect a statistical analysis. However, for the purposes of some statistical analysis such as multiple regression, the software metrics are so strongly interrelated that the regression results may be ambiguous and possibly even misleading. Typically, it is difficult to estimate the unique effects of individual software metrics in the regression equation. The estimated values of the coefficients are very sensitive to slight changes in the data and to the addition or deletion of variables in the regression equation. Since most of the existing metrics have common elements and are linear combinations of these common elements, it seems reasonable to investigate the structure of the underlying common factors or components that make up the raw metrics. The technique we have chosen to use to explore this structure is a procedure called principal components analysis. Principal components analysis is a decomposition technique that may be used to detect and analyze collinearity in software metrics. When confronted with a large number of metrics measuring a single construct, it may be desirable to represent the set by some smaller number of variables that convey all, or most, of the information in the original set. Principal components are linear transformations of a set of random variables that summarize the information contained in the variables. The transformations are chosen so that the first component accounts for the maximal amount of variation of the measures of any possible linear transform; the second component accounts for the maximal amount of residual variation; and so on. The principal components are constructed so that they represent transformed scores on dimensions that are orthogonal. Through the use of principal components analysis, it is possible to have a set of highly related software attributes mapped into a small number of uncorrelated attribute domains. This definitively solves the problem of multi-collinearity in subsequent regression analysis. 
There are many software metrics in the literature, but principal component analysis reveals that there are few distinct sources of variation, i.e. dimensions, in this set of metrics. It would appear perfectly reasonable to characterize the measurable attributes of a program with a simple function of a small number of orthogonal metrics, each of which represents a distinct software attribute domain.
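A brief sketch of that argument on synthetic data: two nearly collinear metrics plus one independent metric are standardized and decomposed, and the explained-variance ratios show that the three raw metrics span essentially two orthogonal domains. The simulated metric values are assumptions for illustration only.

```python
# Principal components of correlated software metrics via SVD of the standardized
# metric matrix. LOC and statement counts are simulated to be nearly collinear,
# nesting depth to be roughly independent.
import numpy as np

rng = np.random.default_rng(4)
n_modules = 200
loc = rng.normal(300, 80, n_modules)
stmts = 0.8 * loc + rng.normal(0, 10, n_modules)      # nearly collinear with LOC
depth = rng.normal(4, 1, n_modules)                   # mostly independent

X = np.column_stack([loc, stmts, depth])
Xc = (X - X.mean(axis=0)) / X.std(axis=0)             # standardize the metrics
_, s, _ = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / (s**2).sum()
print("variance explained by each component:", np.round(explained, 3))
```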
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
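The sketch below conveys the probabilistic flavor of such an assessment, under assumed (and purely illustrative) parameter distributions and a simple Basquin-type fatigue-life model; it is not the PFA methodology's engineering models or software.

```python
# Monte Carlo propagation of parameter uncertainty through a toy fatigue-life model
# to estimate the probability that life falls short of a required cycle count.
import numpy as np

rng = np.random.default_rng(5)
n_samples = 100_000
required_cycles = 1e3

# Basquin-type life model N = (sigma_f / stress)**(1/b), with uncertain inputs.
stress = rng.normal(400.0, 30.0, n_samples)       # MPa, applied stress amplitude
sigma_f = rng.normal(900.0, 60.0, n_samples)      # MPa, fatigue strength coefficient
b = rng.normal(0.09, 0.005, n_samples)            # fatigue strength exponent

life = (sigma_f / stress) ** (1.0 / b)
p_fail = np.mean(life < required_cycles)
print(f"estimated failure probability: {p_fail:.4f}")
```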
adwTools Developed: New Bulk Alloy and Surface Analysis Software for the Alloy Design Workbench
NASA Technical Reports Server (NTRS)
Bozzolo, Guillermo; Morse, Jeffrey A.; Noebe, Ronald D.; Abel, Phillip B.
2004-01-01
A suite of atomistic modeling software, called the Alloy Design Workbench, has been developed by the Computational Materials Group at the NASA Glenn Research Center and the Ohio Aerospace Institute (OAI). The main goal of this software is to guide and augment experimental materials research and development efforts by creating powerful, yet intuitive, software that combines a graphical user interface with an operating code suitable for real-time atomistic simulations of multicomponent alloy systems. Targeted for experimentalists, the interface is straightforward and requires minimum knowledge of the underlying theory, allowing researchers to focus on the scientific aspects of the work. The centerpiece of the Alloy Design Workbench suite is the adwTools module, which concentrates on the atomistic analysis of surfaces and bulk alloys containing an arbitrary number of elements. An additional module, adwParams, handles ab initio input for the parameterization used in adwTools. Future modules planned for the suite include adwSeg, which will provide numerical predictions for segregation profiles to alloy surfaces and interfaces, and adwReport, which will serve as a window into the database, providing public access to the parameterization data and a repository where users can submit their own findings from the rest of the suite. The entire suite is designed to run on desktop-scale computers. The adwTools module incorporates a custom OAI/Glenn-developed Fortran code based on the BFS (Bozzolo-Ferrante-Smith) method for alloys (ref. 1). The heart of the suite, this code is used to calculate the energetics of different compositions and configurations of atoms.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Biyikli, Emre; To, Albert C., E-mail: albertto@pitt.edu
Atomistic/continuum coupling methods combine accurate atomistic methods and efficient continuum methods to simulate the behavior of highly ordered crystalline systems. Coupled methods utilize the advantages of both approaches to simulate systems at a lower computational cost, while retaining the accuracy associated with atomistic methods. Many concurrent atomistic/continuum coupling methods have been proposed in the past; however, their true computational efficiency has not been demonstrated. The present work presents an efficient implementation of a concurrent coupling method called the Multiresolution Molecular Mechanics (MMM) for serial, parallel, and adaptive analysis. First, we present the features of the software implemented along with the associated technologies. The scalability of the software implementation is demonstrated, and the competing effects of multiscale modeling and parallelization are discussed. Then, the algorithms contributing to the efficiency of the software are presented. These include algorithms for eliminating latent ghost atoms from calculations and measurement-based dynamic balancing of parallel workload. The efficiency improvements made by these algorithms are demonstrated by benchmark tests. The efficiency of the software is found to be on par with LAMMPS, a state-of-the-art Molecular Dynamics (MD) simulation code, when performing full atomistic simulations. Speed-up of the MMM method is shown to be directly proportional to the reduction of the number of the atoms visited in force computation. Finally, an adaptive MMM analysis on a nanoindentation problem, containing over a million atoms, is performed, yielding an improvement of 6.3–8.5 times in efficiency, over the full atomistic MD method. For the first time, the efficiency of a concurrent atomistic/continuum coupling method is comprehensively investigated and demonstrated.
Multiresolution molecular mechanics: Implementation and efficiency
NASA Astrophysics Data System (ADS)
Biyikli, Emre; To, Albert C.
2017-01-01
Atomistic/continuum coupling methods combine accurate atomistic methods and efficient continuum methods to simulate the behavior of highly ordered crystalline systems. Coupled methods utilize the advantages of both approaches to simulate systems at a lower computational cost, while retaining the accuracy associated with atomistic methods. Many concurrent atomistic/continuum coupling methods have been proposed in the past; however, their true computational efficiency has not been demonstrated. The present work presents an efficient implementation of a concurrent coupling method called the Multiresolution Molecular Mechanics (MMM) for serial, parallel, and adaptive analysis. First, we present the features of the software implemented along with the associated technologies. The scalability of the software implementation is demonstrated, and the competing effects of multiscale modeling and parallelization are discussed. Then, the algorithms contributing to the efficiency of the software are presented. These include algorithms for eliminating latent ghost atoms from calculations and measurement-based dynamic balancing of parallel workload. The efficiency improvements made by these algorithms are demonstrated by benchmark tests. The efficiency of the software is found to be on par with LAMMPS, a state-of-the-art Molecular Dynamics (MD) simulation code, when performing full atomistic simulations. Speed-up of the MMM method is shown to be directly proportional to the reduction of the number of the atoms visited in force computation. Finally, an adaptive MMM analysis on a nanoindentation problem, containing over a million atoms, is performed, yielding an improvement of 6.3-8.5 times in efficiency, over the full atomistic MD method. For the first time, the efficiency of a concurrent atomistic/continuum coupling method is comprehensively investigated and demonstrated.
FTOOLS: A general package of software to manipulate FITS files
NASA Astrophysics Data System (ADS)
Blackburn, J. K.; Shaw, R. A.; Payne, H. E.; Hayes, J. J. E.; Heasarc
1999-12-01
FTOOLS, a highly modular collection of utilities for processing and analyzing data in the FITS (Flexible Image Transport System) format, has been developed in support of the HEASARC (High Energy Astrophysics Science Archive Research Center) at NASA's Goddard Space Flight Center. The FTOOLS package contains many utility programs which perform modular tasks on any FITS image or table, as well as higher-level analysis programs designed specifically for data from current and past high energy astrophysics missions. The utility programs for FITS tables are especially rich and powerful, and provide functions for presentation of file contents, extraction of specific rows or columns, appending or merging tables, binning values in a column or selecting subsets of rows based on a boolean expression. Individual FTOOLS programs can easily be chained together in scripts to achieve more complex operations such as the generation and displaying of spectra or light curves. FTOOLS development began in 1991 and has produced the main set of data analysis software for the current ASCA and RXTE space missions and for other archival sets of X-ray and gamma-ray data. The FTOOLS software package is supported on most UNIX platforms and on Windows machines. The user interface is controlled by standard parameter files that are very similar to those used by IRAF. The package is self-documenting through a stand-alone help task called fhelp. Software is written in ANSI C and FORTRAN to provide portability across most computer systems. The data format dependencies between hardware platforms are isolated through the FITSIO library package.
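FTOOLS itself is a command-line package written in C and Fortran; as a rough Python analogue of its row-selection utilities, the sketch below uses astropy.io.fits to keep only table rows passing a boolean filter. The file name and column names are placeholders, not assumptions about any particular mission's data.

```python
# Select a subset of rows from a FITS binary table and write them to a new file,
# analogous to filtering rows with a boolean expression in FTOOLS.
from astropy.io import fits

with fits.open("events.fits") as hdul:
    table = hdul[1].data                        # first extension: binary table
    selected = table[(table["ENERGY"] > 100.0) & (table["QUALITY"] == 0)]
    fits.BinTableHDU(data=selected, header=hdul[1].header).writeto(
        "events_filtered.fits", overwrite=True)
    print(f"kept {len(selected)} of {len(table)} rows")
```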
Brown, Stephen; Hutton, Brian; Clifford, Tammy; Coyle, Doug; Grima, Daniel; Wells, George; Cameron, Chris
2014-09-29
The use of network meta-analysis has increased dramatically in recent years. WinBUGS, a freely available Bayesian software package, has been the most widely used software package to conduct network meta-analyses. However, the learning curve for WinBUGS can be daunting, especially for new users. Furthermore, critical appraisal of network meta-analyses conducted in WinBUGS can be challenging given its limited data manipulation capabilities and the fact that generation of graphical output from network meta-analyses often relies on different software packages than the analyses themselves. We developed a freely available Microsoft-Excel-based tool called NetMetaXL, programmed in Visual Basic for Applications, which provides an interface for conducting a Bayesian network meta-analysis using WinBUGS from within Microsoft Excel. This tool allows the user to easily prepare and enter data, set model assumptions, and run the network meta-analysis, with results being automatically displayed in an Excel spreadsheet. It also contains macros that use NetMetaXL's interface to generate evidence network diagrams, forest plots, league tables of pairwise comparisons, probability plots (rankograms), and inconsistency plots within Microsoft Excel. All figures generated are publication quality, thereby increasing the efficiency of knowledge transfer and manuscript preparation. We demonstrate the application of NetMetaXL using data from a network meta-analysis published previously which compares combined resynchronization and implantable defibrillator therapy in left ventricular dysfunction. We replicate results from the previous publication while demonstrating result summaries generated by the software. Use of the freely available NetMetaXL successfully demonstrated its ability to make running network meta-analyses more accessible to novice WinBUGS users by allowing analyses to be conducted entirely within Microsoft Excel. NetMetaXL also allows for more efficient and transparent critical appraisal of network meta-analyses, enhanced standardization of reporting, and integration with health economic evaluations which are frequently Excel-based.
2014-01-01
Background The use of network meta-analysis has increased dramatically in recent years. WinBUGS, a freely available Bayesian software package, has been the most widely used software package to conduct network meta-analyses. However, the learning curve for WinBUGS can be daunting, especially for new users. Furthermore, critical appraisal of network meta-analyses conducted in WinBUGS can be challenging given its limited data manipulation capabilities and the fact that generation of graphical output from network meta-analyses often relies on different software packages than the analyses themselves. Methods We developed a freely available Microsoft-Excel-based tool called NetMetaXL, programmed in Visual Basic for Applications, which provides an interface for conducting a Bayesian network meta-analysis using WinBUGS from within Microsoft Excel. This tool allows the user to easily prepare and enter data, set model assumptions, and run the network meta-analysis, with results being automatically displayed in an Excel spreadsheet. It also contains macros that use NetMetaXL’s interface to generate evidence network diagrams, forest plots, league tables of pairwise comparisons, probability plots (rankograms), and inconsistency plots within Microsoft Excel. All figures generated are publication quality, thereby increasing the efficiency of knowledge transfer and manuscript preparation. Results We demonstrate the application of NetMetaXL using data from a network meta-analysis published previously which compares combined resynchronization and implantable defibrillator therapy in left ventricular dysfunction. We replicate results from the previous publication while demonstrating result summaries generated by the software. Conclusions Use of the freely available NetMetaXL successfully demonstrated its ability to make running network meta-analyses more accessible to novice WinBUGS users by allowing analyses to be conducted entirely within Microsoft Excel. NetMetaXL also allows for more efficient and transparent critical appraisal of network meta-analyses, enhanced standardization of reporting, and integration with health economic evaluations which are frequently Excel-based. PMID:25267416
ParseCNV integrative copy number variation association software with quality tracking
Glessner, Joseph T.; Li, Jin; Hakonarson, Hakon
2013-01-01
A number of copy number variation (CNV) calling algorithms exist; however, comprehensive software tools for CNV association studies are lacking. We describe ParseCNV, unique software that takes CNV calls and creates probe-based statistics for CNV occurrence in both case–control design and in family based studies addressing both de novo and inheritance events, which are then summarized based on CNV regions (CNVRs). CNVRs are defined in a dynamic manner to allow for a complex CNV overlap while maintaining precise association region. Using this approach, we avoid failure to converge and non-monotonic curve fitting weaknesses of programs, such as CNVtools and CNVassoc, and although Plink is easy to use, it only provides combined CNV state probe-based statistics, not state-specific CNVRs. Existing CNV association methods do not provide any quality tracking information to filter confident associations, a key issue which is fully addressed by ParseCNV. In addition, uncertainty in CNV calls underlying CNV associations is evaluated to verify significant results, including CNV overlap profiles, genomic context, number of probes supporting the CNV and single-probe intensities. When optimal quality control parameters are followed using ParseCNV, 90% of CNVs validate by polymerase chain reaction, an often problematic stage because of inadequate significant association review. ParseCNV is freely available at http://parsecnv.sourceforge.net. PMID:23293001
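A hedged sketch of the probe-based association idea on synthetic calls (this is not ParseCNV code and omits its quality tracking): test each probe with Fisher's exact test and collapse runs of adjacent significant probes into candidate CNV regions. The carrier frequencies, enriched region, and significance threshold are invented for the example.

```python
# Probe-level case-control CNV association followed by collapsing adjacent
# significant probes into candidate CNVRs.
import numpy as np
from scipy.stats import fisher_exact

rng = np.random.default_rng(6)
n_probes, n_cases, n_controls = 50, 200, 200
case_cnv = rng.binomial(1, 0.02, (n_cases, n_probes))
case_cnv[:, 20:25] = rng.binomial(1, 0.15, (n_cases, 5))   # enriched region in cases
control_cnv = rng.binomial(1, 0.02, (n_controls, n_probes))

pvals = []
for j in range(n_probes):
    a, b = case_cnv[:, j].sum(), n_cases - case_cnv[:, j].sum()
    c, d = control_cnv[:, j].sum(), n_controls - control_cnv[:, j].sum()
    pvals.append(fisher_exact([[a, b], [c, d]])[1])

significant = np.array(pvals) < 1e-3

# Collapse adjacent significant probes into regions.
regions, start = [], None
for j, sig in enumerate(significant):
    if sig and start is None:
        start = j
    elif not sig and start is not None:
        regions.append((start, j - 1))
        start = None
if start is not None:
    regions.append((start, n_probes - 1))
print("candidate CNVRs (probe index ranges):", regions)
```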
ParseCNV integrative copy number variation association software with quality tracking.
Glessner, Joseph T; Li, Jin; Hakonarson, Hakon
2013-03-01
A number of copy number variation (CNV) calling algorithms exist; however, comprehensive software tools for CNV association studies are lacking. We describe ParseCNV, unique software that takes CNV calls and creates probe-based statistics for CNV occurrence in both case-control design and in family based studies addressing both de novo and inheritance events, which are then summarized based on CNV regions (CNVRs). CNVRs are defined in a dynamic manner to allow for a complex CNV overlap while maintaining precise association region. Using this approach, we avoid failure to converge and non-monotonic curve fitting weaknesses of programs, such as CNVtools and CNVassoc, and although Plink is easy to use, it only provides combined CNV state probe-based statistics, not state-specific CNVRs. Existing CNV association methods do not provide any quality tracking information to filter confident associations, a key issue which is fully addressed by ParseCNV. In addition, uncertainty in CNV calls underlying CNV associations is evaluated to verify significant results, including CNV overlap profiles, genomic context, number of probes supporting the CNV and single-probe intensities. When optimal quality control parameters are followed using ParseCNV, 90% of CNVs validate by polymerase chain reaction, an often problematic stage because of inadequate significant association review. ParseCNV is freely available at http://parsecnv.sourceforge.net.
Viceconti, M; Testi, D; Gori, R; Zannoni, C
2000-01-01
The present work describes a technology transfer project called HIPCOM devoted to the re-engineering of the process used by a medical devices manufacturer to design custom-made hip prostheses. Although it started with insufficient support from the end-user management, a very tight schedule and a moderate budget, the project developed into what is considered by all partners a success story. In particular, the development of the design software, called HIPCOM Interactive Design Environment (HIDE), was completed in a time shorter than any optimistic expectation. The software was quite stable since its first beta version, and once introduced at the user site it fully replaced the original procedure in less than two months. One year after the early adoption, more than 80 custom-made prostheses had been designed with HIDE and the user had reported only two bugs, both cosmetic. The scope of the present work was to report the development experience and to investigate the reasons for these positive results, with particular reference to the development procedure and the software architecture. The choice of TCL/TK as the development language and the adoption of a well-defined software architecture were found to be the key success factors. Other important determinants were found to be the adoption of an incremental software engineering strategy, well suited for small to medium projects, and the presence of a technology transfer expert on the development staff.
NASA Technical Reports Server (NTRS)
Lee, Alice T.; Gunn, Todd; Pham, Tuan; Ricaldi, Ron
1994-01-01
This handbook documents the three software analysis processes the Space Station Software Analysis team uses to assess space station software, including their backgrounds, theories, tools, and analysis procedures. Potential applications of these analysis results are also presented. The first section describes how software complexity analysis provides quantitative information on code, such as code structure and risk areas, throughout the software life cycle. Software complexity analysis allows an analyst to understand the software structure, identify critical software components, assess risk areas within a software system, identify testing deficiencies, and recommend program improvements. Performing this type of analysis during the early design phases of software development can positively affect the process, and may prevent later, much larger, difficulties. The second section describes how software reliability estimation and prediction analysis, or software reliability, provides a quantitative means to measure the probability of failure-free operation of a computer program, and describes the two tools used by JSC to determine failure rates and design tradeoffs between reliability, costs, performance, and schedule.
Nakato, Ryuichiro; Itoh, Takehiko; Shirahige, Katsuhiko
2013-07-01
Chromatin immunoprecipitation with high-throughput sequencing (ChIP-seq) can identify genomic regions that bind proteins involved in various chromosomal functions. Although the development of next-generation sequencers offers the technology needed to identify these protein-binding sites, the analysis can be computationally challenging because sequencing data sometimes consist of >100 million reads/sample. Herein, we describe a cost-effective and time-efficient protocol that is generally applicable to ChIP-seq analysis; this protocol uses a novel peak-calling program termed DROMPA to identify peaks and an additional program, parse2wig, to preprocess read-map files. This two-step procedure drastically reduces computational time and memory requirements compared with other programs. DROMPA enables the identification of protein localization sites in repetitive sequences and efficiently identifies both broad and sharp protein localization peaks. Specifically, DROMPA outputs a protein-binding profile map in pdf or png format, which can be easily manipulated by users who have a limited background in bioinformatics. © 2013 The Authors Genes to Cells © 2013 by the Molecular Biology Society of Japan and Wiley Publishing Asia Pty Ltd.
Software engineering and data management for automated payload experiment tool
NASA Technical Reports Server (NTRS)
Maddux, Gary A.; Provancha, Anna; Chattam, David
1994-01-01
The Microgravity Projects Office identified a need to develop a software package that will lead experiment developers through the development planning process, obtain necessary information, establish an electronic data exchange avenue, and allow easier manipulation/reformatting of the collected information. An MS-DOS compatible software package called the Automated Payload Experiment Tool (APET) has been developed and delivered. The objective of this task is to expand on the results of the APET work previously performed by the University of Alabama in Huntsville (UAH) and provide versions of the software in a Macintosh and Windows compatible format. Appendix 1, the Science Requirements Document (SRD) Users Manual, is attached.
GEOquery: a bridge between the Gene Expression Omnibus (GEO) and BioConductor.
Davis, Sean; Meltzer, Paul S
2007-07-15
Microarray technology has become a standard molecular biology tool. Experimental data have been generated on a huge number of organisms, tissue types, treatment conditions and disease states. The Gene Expression Omnibus (Barrett et al., 2005), developed by the National Center for Biotechnology Information (NCBI) at the National Institutes of Health, is a repository of nearly 140,000 gene expression experiments. The BioConductor project (Gentleman et al., 2004) is an open-source and open-development software project built in the R statistical programming environment (R Development Core Team, 2005) for the analysis and comprehension of genomic data. The tools contained in the BioConductor project represent many state-of-the-art methods for the analysis of microarray and genomics data. We have developed a software tool that allows access to the wealth of information within GEO directly from BioConductor, eliminating many of the formatting and parsing problems that have made such analyses labor-intensive in the past. The software, called GEOquery, effectively establishes a bridge between GEO and BioConductor. Easy access to GEO data from BioConductor will likely lead to new analyses of GEO data using novel and rigorous statistical and bioinformatic tools. Facilitating analyses and meta-analyses of microarray data will increase the efficiency with which biologically important conclusions can be drawn from published genomic data. GEOquery is available as part of the BioConductor project.
Documentation Driven Development for Complex Real-Time Systems
2004-12-01
This paper presents a novel approach for development of complex real-time systems, called the documentation-driven development (DDD) approach. This... time systems. DDD will also support automated software generation based on a computational model and some relevant techniques. DDD includes two main... stakeholders to be easily involved in development processes and, therefore, significantly improve the agility of software development for complex real...
Intent Specifications: An Approach to Building Human-Centered Specifications
NASA Technical Reports Server (NTRS)
Leveson, Nancy G.
1999-01-01
This paper examines and proposes an approach to writing software specifications, based on research in systems theory, cognitive psychology, and human-machine interaction. The goal is to provide specifications that support human problem solving and the tasks that humans must perform in software development and evolution. A type of specification, called intent specifications, is constructed upon this underlying foundation.
Evaluation of Computer Based Foreign Language Learning Software by Teachers and Students
ERIC Educational Resources Information Center
Baz, Fatih Çagatay; Tekdal, Mehmet
2014-01-01
The aim of this study is to evaluate Computer Based Foreign Language Learning software called Dynamic Education (DYNED) by teachers and students. The study is conducted with ten randomly chosen primary schools with the participation of 522 7th grade students and 7 English teachers. A three-point Likert scale for teachers and a five-point Likert scale…
Butterfly valve in a virtual environment
NASA Astrophysics Data System (ADS)
Talekar, Aniruddha; Patil, Saurabh; Thakre, Prashant; Rajkumar, E.
2017-11-01
Assembly of components is one of the processes involved in product design and development. The present paper deals with the assembly of simple butterfly valve components in a virtual environment. The assembly has been carried out using virtual reality software by trial and error methods. The parts are modelled using parametric software (SolidWorks), meshed accordingly, and then called into the virtual environment for assembly.
ERIC Educational Resources Information Center
Jain, Ameeta; Thomson, Dianne; Farley, Alan; Mulready, Pamela
2012-01-01
The introduction of a social software blog space called the Trading Room in an undergraduate finance unit generated a great deal of activity to support student learning. A subsequent evaluation of this innovation, viewed through the lens of Activity Theory, demonstrated that students perceived high value in the opportunity it provided for them to…
Discrete Fourier Transform Analysis in a Complex Vector Space
NASA Technical Reports Server (NTRS)
Dean, Bruce H.
2009-01-01
Alternative computational strategies for the Discrete Fourier Transform (DFT) have been developed using analysis of geometric manifolds. This approach provides a general framework for performing DFT calculations, and suggests a more efficient implementation of the DFT for applications using iterative transform methods, particularly phase retrieval. The DFT can thus be implemented using fewer operations than the usual implementation. The software decreases the run time of the DFT in certain applications, such as phase retrieval, that iteratively call the DFT function. The algorithm exploits a special computational approach based on analysis of the DFT as a transformation in a complex vector space. As such, this approach has the potential to realize a DFT computation that approaches N operations versus N log(N) operations for the equivalent Fast Fourier Transform (FFT) calculation.
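For readers who want to experiment with the operation-count trade-off mentioned above, the short sketch below contrasts a direct matrix-form DFT (order N^2 operations) with NumPy's FFT (order N log N); it is a generic illustration and does not reproduce the geometric-manifold algorithm described in the abstract.

    import numpy as np

    def dft_matrix(n):
        """Direct DFT as a dense complex matrix: W[j, k] = exp(-2*pi*i*j*k/n)."""
        j, k = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
        return np.exp(-2j * np.pi * j * k / n)

    n = 256
    x = np.random.randn(n) + 1j * np.random.randn(n)

    direct = dft_matrix(n) @ x        # O(N^2) operations
    fast = np.fft.fft(x)              # O(N log N) operations

    # Both give the same transform up to floating-point error.
    print(np.allclose(direct, fast))  # True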
Implicitly restarted Arnoldi/Lanczos methods for large scale eigenvalue calculations
NASA Technical Reports Server (NTRS)
Sorensen, Danny C.
1996-01-01
Eigenvalues and eigenfunctions of linear operators are important to many areas of applied mathematics. The ability to approximate these quantities numerically is becoming increasingly important in a wide variety of applications. This increasing demand has fueled interest in the development of new methods and software for the numerical solution of large-scale algebraic eigenvalue problems. In turn, the existence of these new methods and software, along with the dramatically increased computational capabilities now available, has enabled the solution of problems that would not even have been posed five or ten years ago. Until very recently, software for large-scale nonsymmetric problems was virtually non-existent. Fortunately, the situation is improving rapidly. The purpose of this article is to provide an overview of the numerical solution of large-scale algebraic eigenvalue problems. The focus will be on a class of methods called Krylov subspace projection methods. The well-known Lanczos method is the premier member of this class. The Arnoldi method generalizes the Lanczos method to the nonsymmetric case. A recently developed variant of the Arnoldi/Lanczos scheme called the Implicitly Restarted Arnoldi Method is presented here in some depth. This method is highlighted because of its suitability as a basis for software development.
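The Implicitly Restarted Arnoldi Method highlighted above was later distributed as the ARPACK library, which is what SciPy's sparse eigensolver wraps; the sketch below shows a typical large-scale use of that method to obtain a few extremal eigenvalues of a sparse nonsymmetric matrix (the matrix itself is a made-up example).

    import numpy as np
    from scipy.sparse import diags
    from scipy.sparse.linalg import eigs

    # A large sparse nonsymmetric matrix (1D convection-diffusion-like stencil).
    n = 5000
    A = diags([-1.2, 2.0, -0.8], offsets=[-1, 0, 1], shape=(n, n), format="csr")

    # Implicitly Restarted Arnoldi (via ARPACK) for the 6 largest-magnitude eigenvalues.
    vals, vecs = eigs(A, k=6, which="LM")
    print(np.sort_complex(vals))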
Graphical Technique to Support the Teaching/Learning Process of Software Process Reference Models
NASA Astrophysics Data System (ADS)
Espinosa-Curiel, Ismael Edrein; Rodríguez-Jacobo, Josefina; Fernández-Zepeda, José Alberto
In this paper, we propose a set of diagrams to visualize software process reference models (PRM). The diagrams, called dimods, are the combination of some visual and process modeling techniques such as rich pictures, mind maps, IDEF and RAD diagrams. We show the use of this technique by designing a set of dimods for the Mexican Software Industry Process Model (MoProSoft). Additionally, we perform an evaluation of the usefulness of dimods. The result of the evaluation shows that dimods may be a support tool that facilitates the understanding, memorization, and learning of software PRMs in both software development organizations and universities. The results also show that dimods may have advantages over the traditional description methods for these types of models.
Application driven interface generation for EASIE. M.S. Thesis
NASA Technical Reports Server (NTRS)
Kao, Ya-Chen
1992-01-01
The Environment for Application Software Integration and Execution (EASIE) provides a user interface and a set of utility programs which support the rapid integration and execution of analysis programs about a central relational database. EASIE provides users with two basic modes of execution. One of them is a menu-driven execution mode, called Application-Driven Execution (ADE), which provides sufficient guidance to review data, select a menu action item, and execute an application program. The other mode of execution, called Complete Control Execution (CCE), provides an extended executive interface which allows in-depth control of the design process. Currently, the EASIE system is based on alphanumeric techniques only. It is the purpose of this project to extend the flexibility of the EASIE system in the ADE mode by implementing it in a window system. Secondly, a set of utilities will be developed to assist the experienced engineer in the generation of an ADE application.
Reservoir property grids improve with geostatistics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vogt, J.
1993-09-01
Visualization software, reservoir simulators and many other E&P software applications need reservoir property grids as input. Using geostatistics, as compared to other gridding methods, to produce these grids leads to the best output from the software programs. For the purposes of this discussion, geostatistics is simply two types of gridding methods. Mathematically, these methods are based on minimizing or duplicating certain statistical properties of the input data. One geostatistical method, called kriging, is used when the highest possible point-by-point accuracy is desired. The other method, called conditional simulation, is used when one wants the statistics and texture of the resulting grid to be the same as for the input data. In the following discussion, each method is explained, compared to other gridding methods, and illustrated through example applications. Proper use of geostatistical data in flow simulations, use of geostatistical data for history matching, and situations where geostatistics has no significant advantage over other methods will also be covered.
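As a concrete, heavily simplified illustration of the kriging idea described above, the sketch below estimates a reservoir property at one grid node from a handful of scattered well values using simple kriging with an assumed exponential covariance model; the well locations, values, sill and range are all invented, and a real workflow would fit the variogram to data and typically use ordinary or universal kriging.

    import numpy as np

    def exp_cov(h, sill=1.0, rng=500.0):
        """Assumed exponential covariance model (sill and range are illustrative)."""
        return sill * np.exp(-3.0 * h / rng)

    # Hypothetical well locations (x, y in metres) and porosity values.
    wells = np.array([[100.0, 200.0], [400.0, 250.0], [300.0, 600.0], [700.0, 500.0]])
    poro = np.array([0.18, 0.21, 0.15, 0.24])
    mean_poro = poro.mean()

    target = np.array([350.0, 400.0])   # grid node to estimate

    # Covariances between wells, and between wells and the target node.
    d_ww = np.linalg.norm(wells[:, None, :] - wells[None, :, :], axis=-1)
    d_wt = np.linalg.norm(wells - target, axis=-1)
    C = exp_cov(d_ww)
    c0 = exp_cov(d_wt)

    # Simple kriging weights, estimate and kriging variance at the node.
    w = np.linalg.solve(C, c0)
    estimate = mean_poro + w @ (poro - mean_poro)
    variance = exp_cov(0.0) - w @ c0
    print(estimate, variance)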
Universal programming interface with concurrent access
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alferov, Oleg
2004-10-07
There exist a number of devices with a positioning nature of operation, such as mechanical linear stages, temperature controllers, or filter wheels with discrete states, and most of them have different programming interfaces. The Universal Positioner software offers a way to handle all of them with a single approach, whereby a particular hardware driver is created from the template by translating the actual commands used by the hardware to and from the universal programming interface. The software contains the universal API module itself, a demo simulation of hardware, and front-end programs to help developers write their own software drivers, along with example drivers for actual hardware controllers. The software allows user application programs to call devices simultaneously without race conditions (multitasking and concurrent access). The template suggested in this package permits developers to integrate various devices easily into their applications using the same API. The drivers can be stacked; i.e., they can call each other via the same interface.
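The idea of a single positioning-style API shared by many device drivers, with safe concurrent access, can be sketched as below; all class and method names are hypothetical and only illustrate the template-and-lock pattern, not the actual Universal Positioner interface.

    import threading

    class PositionerTemplate:
        """Hypothetical universal positioning interface; hardware drivers subclass it."""

        def __init__(self):
            self._lock = threading.Lock()   # serializes concurrent calls from user code
            self._position = 0.0

        def move_to(self, position):
            with self._lock:                # no race conditions between callers
                self._send_hardware_command(position)
                self._position = position

        def get_position(self):
            with self._lock:
                return self._position

        def _send_hardware_command(self, position):
            raise NotImplementedError       # translated to real commands by each driver

    class SimulatedStage(PositionerTemplate):
        """Demo driver standing in for a mechanical linear stage."""
        def _send_hardware_command(self, position):
            pass                            # a real driver would talk to the controller here

    stage = SimulatedStage()
    stage.move_to(12.5)
    print(stage.get_position())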
Porting DubaiSat-2 Flight Software to RTEMS: A Feasibility Study
NASA Astrophysics Data System (ADS)
Khoory, Mohammed; Al Shamsi, Zakareyya; Al Midfa, Ibrahim
2015-09-01
This paper details the process taken by EIAST to study RTEMS as a potential real-time operating system for future space missions. The direction was to attempt to run the DubaiSat-2 flight software under RTEMS 4.10.2 with as little modification to the original source as possible. The implementation used a “translation layer” to translate system calls used by the DS-2 flight software into RTEMS system calls. The RTEMS RTL project was integrated to satisfy the run-time loading requirement, and some differences in the filesystem were encountered and worked around. The implementation was tested for performance and stability, and comparisons were made. The conclusion is that RTEMS provides an adequate base for future space missions, with certain advantages over other RTOSs including cost, a smaller executable size, and control over the source. Drawbacks include the slow speed of loading tasks during runtime and some filesystem integrity issues during unexpected reboots.
Transparent Ada rendezvous in a fault tolerant distributed system
NASA Technical Reports Server (NTRS)
Racine, Roger
1986-01-01
There are many problems associated with distributing an Ada program over a loosely coupled communication network. Some of these problems involve the various aspects of the distributed rendezvous. The problems addressed involve supporting the delay statement in a selective call and supporting the else clause in a selective call. Most of these difficulties are compounded by the need for an efficient communication system. The difficulties are compounded even more by considering the possibility of hardware faults occurring while the program is running. With a hardware fault tolerant computer system, it is possible to design a distribution scheme and communication software which is efficient and allows Ada semantics to be preserved. An Ada design for the communications software of one such system will be presented, including a description of the services provided in the seven layers of an International Standards Organization (ISO) Open System Interconnect (OSI) model communications system. The system capabilities (hardware and software) that allow this communication system will also be described.
The Validation by Measurement Theory of Proposed Object-Oriented Software Metrics
NASA Technical Reports Server (NTRS)
Neal, Ralph D.
1996-01-01
Moving software development into the engineering arena requires controllability, and to control a process, it must be measurable. Measuring the process does no good if the product is not also measured, i.e., being the best at producing an inferior product does not define a quality process. Also, not every number extracted from software development is a valid measurement. A valid measurement only results when we are able to verify that the number is representative of the attribute that we wish to measure. Many proposed software metrics are used by practitioners without these metrics ever having been validated, leading to costly but often useless calculations. Several researchers have bemoaned the lack of scientific precision in much of the published software measurement work and have called for validation of software metrics by measurement theory. This dissertation applies measurement theory to validate fifty proposed object-oriented software metrics.
Component Prioritization Schema for Achieving Maximum Time and Cost Benefits from Software Testing
NASA Astrophysics Data System (ADS)
Srivastava, Praveen Ranjan; Pareek, Deepak
Software testing is any activity aimed at evaluating an attribute or capability of a program or system and determining that it meets its required results. Defining the end of software testing is a crucial feature of any software development project. A premature release will involve risks like undetected bugs, the cost of fixing faults later, and discontented customers. Any software organization would want to achieve maximum possible benefits from software testing with minimum resources. Testing time and cost need to be optimized for achieving a competitive edge in the market. In this paper, we propose a schema, called the Component Prioritization Schema (CPS), to achieve an effective and uniform prioritization of the software components. This schema serves as an extension to the Non-Homogeneous Poisson Process based Cumulative Priority Model. We also introduce an approach for handling time-intensive versus cost-intensive projects.
Automated support for experience-based software management
NASA Technical Reports Server (NTRS)
Valett, Jon D.
1992-01-01
To effectively manage a software development project, the software manager must have access to key information concerning a project's status. This information includes not only data relating to the project of interest, but also, the experience of past development efforts within the environment. This paper describes the concepts and functionality of a software management tool designed to provide this information. This tool, called the Software Management Environment (SME), enables the software manager to compare an ongoing development effort with previous efforts and with models of the 'typical' project within the environment, to predict future project status, to analyze a project's strengths and weaknesses, and to assess the project's quality. In order to provide these functions the tool utilizes a vast corporate memory that includes a data base of software metrics, a set of models and relationships that describe the software development environment, and a set of rules that capture other knowledge and experience of software managers within the environment. Integrating these major concepts into one software management tool, the SME is a model of the type of management tool needed for all software development organizations.
Using Vocal Dialects to Assess the Population Structure of Bigg's Killer Whales in Alaska
NASA Astrophysics Data System (ADS)
Sharpe, D. L.; Wade, P. R.; Castellote, M.; Cornick, L. A.
2016-02-01
Apex predators are important indicators of ecosystem health, but little is known about the population structure of Bigg's killer whales (Orcinus orca; i.e. "transient" ecotype) in western Alaska. Currently, all Bigg's killer whales in western Alaska are ascribed to a single broad stock for management under the US Marine Mammal Protection Act. However, recent nuclear microsatellite and mitochondrial DNA analyses indicate that this stock is likely comprised of genetically distinct sub-populations. In accordance with what is known about group-specific killer whale vocal dialects in other locations, we sought to evaluate and refine Bigg's killer whale population structure by using acoustic recordings to examine the spatial distribution of call types in western Alaska. Digital audio recordings were collected from 34 encounters with Bigg's killer whales throughout the Aleutian and Pribilof Islands in the summers of 2001-2007 and 2009-2010, then visually and aurally reviewed using the software Adobe Audition. High quality calls were identified and classified into discrete call types based on spectrographic characteristics and aural uniqueness. A comparative analysis of call types recorded throughout the study area revealed spatial segregation of call types, corresponding well with proposed genetic delineations. These results suggest that Bigg's killer whales exhibit regional vocal dialects, which can be used to help refine the putative sub-populations that have been genetically identified throughout western Alaska. Our findings support the proposal to restructure current stock designations.
Knowledge-based control of an adaptive interface
NASA Technical Reports Server (NTRS)
Lachman, Roy
1989-01-01
The analysis, development strategy, and preliminary design for an intelligent, adaptive interface is reported. The design philosophy couples knowledge-based system technology with standard human factors approaches to interface development for computer workstations. An expert system has been designed to drive the interface for application software. The intelligent interface will be linked to application packages, one at a time, that are planned for multiple-application workstations aboard Space Station Freedom. Current requirements call for most Space Station activities to be conducted at the workstation consoles. One set of activities will consist of standard data management services (DMS). DMS software includes text processing, spreadsheets, data base management, etc. Text processing was selected for the first intelligent interface prototype because text-processing software can be developed initially as fully functional but limited with a small set of commands. The program's complexity then can be increased incrementally. The operator's behavior and three types of instructions to the underlying application software are included in the rule base of the intelligent interface. A conventional expert-system inference engine searches the data base for antecedents to rules and sends the consequents of fired rules as commands to the underlying software. Plans for putting the expert system on top of a second application, a database management system, will be carried out following behavioral research on the first application. The intelligent interface design is suitable for use with ground-based workstations now common in government, industrial, and educational organizations.
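A minimal sketch of the rule-firing cycle described above is shown below; the facts and rules are hypothetical stand-ins for the operator-behavior antecedents and text-processing commands, and the loop only illustrates generic forward chaining, not the actual Space Station expert system.

    # Minimal forward-chaining sketch of the inference cycle described above.
    facts = {"operator_idle", "document_open"}

    rules = [
        ({"operator_idle", "document_open"}, "suggest_save_command"),
        ({"suggest_save_command"}, "send_save_to_application"),
    ]

    commands_sent = []
    changed = True
    while changed:                       # fire rules until no new consequents appear
        changed = False
        for antecedents, consequent in rules:
            if antecedents <= facts and consequent not in facts:
                facts.add(consequent)
                commands_sent.append(consequent)
                changed = True

    print(commands_sent)   # consequents forwarded as commands to the application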
Improving Reliability of Spectrum Analysis for Software Quality Requirements Using TCM
NASA Astrophysics Data System (ADS)
Kaiya, Haruhiko; Tanigawa, Masaaki; Suzuki, Shunichi; Sato, Tomonori; Osada, Akira; Kaijiri, Kenji
Quality requirements are scattered over a requirements specification; thus it is hard to measure and trace such quality requirements to validate the specification against stakeholders' needs. We proposed a technique called “spectrum analysis for quality requirements” which enabled analysts to sort a requirements specification to measure and track quality requirements in the specification. In the same way as a spectrum in optics, a quality spectrum of a specification shows a quantitative feature of the specification with respect to quality. Therefore, we can compare a specification of a system to another one with respect to quality. As a result, we can validate such a specification because we can check whether the specification has common quality features and know its specific features against specifications of existing similar systems. However, our first spectrum analysis for quality requirements required a lot of effort and knowledge of a problem domain, and it was hard to reuse such knowledge to reduce the effort. We thus introduce domain knowledge called a term-characteristic map (TCM) to reuse the knowledge for our quality spectrum analysis. Through several experiments, we evaluate our spectrum analysis, and the main findings are as follows. First, we confirmed that specifications of similar systems have similar quality spectra. Second, results of spectrum analysis using TCM are objective, i.e., different analysts can generate almost the same spectra when they analyze the same specification.
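A rough sketch of the spectrum idea, counting occurrences of quality-characteristic terms in a specification to produce a comparable quantitative profile, is given below; the term map is invented for illustration and is far simpler than the paper's TCM.

    import re
    from collections import Counter

    # Invented term-characteristic map (a real TCM would be curated domain knowledge).
    tcm = {
        "performance": {"response time", "throughput", "latency"},
        "security":    {"password", "encryption", "access control"},
        "usability":   {"user interface", "help", "usable"},
    }

    def quality_spectrum(spec_text):
        """Count quality-related terms per characteristic to form a 'spectrum'."""
        text = spec_text.lower()
        spectrum = Counter()
        for characteristic, terms in tcm.items():
            for term in terms:
                spectrum[characteristic] += len(re.findall(re.escape(term), text))
        return spectrum

    spec = ("The system shall encrypt data. Response time shall be under 2 s. "
            "The user interface shall provide help.")
    print(quality_spectrum(spec))   # spectra of similar systems should look alike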
Status report of the SRT radiotelescope control software: the DISCOS project
NASA Astrophysics Data System (ADS)
Orlati, A.; Bartolini, M.; Buttu, M.; Fara, A.; Migoni, C.; Poppi, S.; Righini, S.
2016-08-01
The Sardinia Radio Telescope (SRT) is a 64-m fully-steerable radio telescope. It is provided with an active surface to correct for gravitational deformations, allowing observations from 300 MHz to 100 GHz. At present, three receivers are available: a coaxial LP-band receiver (305-410 MHz and 1.5-1.8 GHz), a C-band receiver (5.7-7.7 GHz) and a 7-feed K-band receiver (18-26.5 GHz). Several back-ends are also available in order to perform the different data acquisition and analysis procedures requested by scientific projects. The design and development of the SRT control software started in 2004, and now belongs to a wider project called DISCOS (Development of the Italian Single-dish COntrol System), which provides a common infrastructure to the three Italian radio telescopes (Medicina, Noto and SRT dishes). DISCOS is based on the Alma Common Software (ACS) framework, and currently consists of more than 500k lines of code. It is organized in a common core and three specific product lines, one for each telescope. Recent developments, carried out after the conclusion of the technical commissioning of the instrument (October 2013), consisted in the addition of several new features in many parts of the observing pipeline, spanning from motion control to the digital back-ends for data acquisition and data formatting; we briefly describe such improvements. More importantly, in the last two years we have supported the astronomical validation of the SRT radio telescope, leading to the opening of the first public call for proposals in late 2015. During this period, while assisting both the engineering and the scientific staff, we massively employed the control software and were able to test all of its features: in this process we received our first feedback from the users and we could verify how the system performed in a real-life scenario, drawing the first conclusions about the overall system stability and performance. We examine how the system behaves in terms of network load and system load, how it reacts to failures and errors, and what components and services seem to be the most critical parts of our architecture, showing how the ACS framework impacts on these aspects. Moreover, the exposure to public utilization has highlighted the major flaws in our development and software management process, which had to be tuned and improved in order to achieve faster release cycles in response to user feedback, and safer deployment operations. In this regard we show how the introduction of testing practices, along with continuous integration, helped us to meet higher quality standards. Having identified the most critical aspects of our software, we conclude by showing our intentions for the future development of DISCOS, both in terms of software features and software infrastructures.
NASA Astrophysics Data System (ADS)
Zhang, M.; Zheng, G. Z.; Zheng, W.; Chen, Z.; Yuan, T.; Yang, C.
2016-04-01
Magnetic confinement nuclear fusion experiments require various real-time control applications like plasma control. ITER has designed the Fast Plant System Controller (FPSC) for this purpose and has provided hardware and software standards and guidelines for building an FPSC. In order to develop various real-time FPSC applications efficiently, a flexible real-time software framework called the J-TEXT real-time framework (JRTF) has been developed by the J-TEXT tokamak team. JRTF allows developers to implement different functions as independent and reusable modules called Application Blocks (ABs). AB developers only need to focus on implementing the control tasks or the algorithms; the timing, scheduling, data sharing and eventing are handled by the JRTF pipelines. JRTF provides great flexibility in developing ABs. Unit tests against ABs can be developed easily, and ABs can even be used in non-JRTF applications. JRTF also provides interfaces allowing JRTF applications to be configured and monitored at runtime. JRTF is compatible with ITER standard FPSC hardware and the ITER CODAC (Control, Data Access and Communication) Core software. It can be configured and monitored using EPICS (Experimental Physics and Industrial Control System). Moreover, JRTF can be ported to different platforms and integrated with supervisory control software other than EPICS. The paper presents the design and implementation of JRTF as well as brief test results.
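The application-block idea, where independent modules are scheduled by the framework's pipeline and share data through it, is sketched below with hypothetical block and class names; this is a conceptual illustration only, not the JRTF API.

    # Conceptual sketch of a block pipeline; names are hypothetical, not JRTF interfaces.
    class ApplicationBlock:
        """An independent, reusable unit of work; the framework handles scheduling."""
        def process(self, data):
            raise NotImplementedError

    class AcquireSignal(ApplicationBlock):
        def process(self, data):
            data["signal"] = [0.1, 0.4, 0.2]      # stand-in for a hardware read
            return data

    class PlasmaControlAlgorithm(ApplicationBlock):
        def process(self, data):
            data["actuator"] = sum(data["signal"]) / len(data["signal"])
            return data

    def run_cycle(blocks, data):
        """One control cycle: the pipeline passes shared data through each block."""
        for block in blocks:
            data = block.process(data)
        return data

    print(run_cycle([AcquireSignal(), PlasmaControlAlgorithm()], {}))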
Tools and Approaches for the Construction of Knowledge Models from the Neuroscientific Literature
Burns, Gully A. P. C.; Khan, Arshad M.; Ghandeharizadeh, Shahram; O’Neill, Mark A.; Chen, Yi-Shin
2015-01-01
Within this paper, we describe a neuroinformatics project (called “NeuroScholar,” http://www.neuroscholar.org/) that enables researchers to examine, manage, manipulate, and use the information contained within the published neuroscientific literature. The project is built within a multi-level, multi-component framework constructed with the use of software engineering methods that themselves provide code-building functionality for neuroinformaticians. We describe the different software layers of the system. First, we present a hypothetical usage scenario illustrating how NeuroScholar permits users to address large-scale questions in a way that would otherwise be impossible. We do this by applying NeuroScholar to a “real-world” neuroscience question: How is stress-related information processed in the brain? We then explain how the overall design of NeuroScholar enables the system to work and illustrate different components of the user interface. We then describe the knowledge management strategy we use to store interpretations. Finally, we describe the software engineering framework we have devised (called the “View-Primitive-Data Model framework,” [VPDMf]) to provide an open-source, accelerated software development environment for the project. We believe that NeuroScholar will be useful to experimental neuroscientists by helping them interact with the primary neuroscientific literature in a meaningful way, and to neuroinformaticians by providing them with useful, affordable software engineering tools. PMID:15055395
NASA Astrophysics Data System (ADS)
Szidarovszky, Tamás; Jono, Maho; Yamanouchi, Kaoru
2018-07-01
A user-friendly and cross-platform software package called the Laser-Induced Molecular Alignment and Orientation simulator (LIMAO) has been developed. The program can be used to simulate, within the rigid rotor approximation, the rotational dynamics of gas-phase molecules induced by linearly polarized intense laser fields at a given temperature. The software is implemented in the Java and Mathematica programming languages. The primary aim of LIMAO is to aid experimental scientists in predicting and analyzing experimental data representing laser-induced spatial alignment and orientation of molecules.
Advanced Transport Operating System (ATOPS) control display unit software description
NASA Technical Reports Server (NTRS)
Slominski, Christopher J.; Parks, Mark A.; Debure, Kelly R.; Heaphy, William J.
1992-01-01
The software created for the Control Display Units (CDUs), used for the Advanced Transport Operating Systems (ATOPS) project, on the Transport Systems Research Vehicle (TSRV) is described. Module descriptions are presented in a standardized format which contains module purpose, calling sequence, a detailed description, and global references. The global reference section includes subroutines, functions, and common variables referenced by a particular module. The CDUs, one for the pilot and one for the copilot, are used for flight management purposes. Operations performed with the CDU affect the aircraft's guidance, navigation, and display software.
Internet: An Overview of Key Technology Policy Issues Affecting Its Use and Growth
2004-12-29
...response. Such software is called "adware." Software programs that include spyware can be sold or provided for free, on a disk (or other media) or...
Warning: Projects May Be Closer than They Appear
NASA Technical Reports Server (NTRS)
Africa, Colby
2004-01-01
I had been working for two years as the technical product manager for a large software company, when their partner company gave me a call. They needed good software engineers to customize a new version of software, and they thought I was their guy. They told me what they wanted to do to the software, and they even showed me some prototypes. Their idea was to take the basic software tool that the large company was producing and make it more accessible to the customer. They would do this by building in flexibility based on user skill level and organizational maturity. I thought that was a fascinating approach, and I bought into it in a big way. I decided to leave my job and join up with the smaller company as their director of software engineering.
Component Technology for High-Performance Scientific Simulation Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Epperly, T; Kohn, S; Kumfert, G
2000-11-09
We are developing scientific software component technology to manage the complexity of modern, parallel simulation software and increase the interoperability and re-use of scientific software packages. In this paper, we describe a language interoperability tool named Babel that enables the creation and distribution of language-independent software libraries using interface definition language (IDL) techniques. We have created a scientific IDL that focuses on the unique interface description needs of scientific codes, such as complex numbers, dense multidimensional arrays, complicated data types, and parallelism. Preliminary results indicate that in addition to language interoperability, this approach provides useful tools for thinking about the design of modern object-oriented scientific software libraries. Finally, we also describe a web-based component repository called Alexandria that facilitates the distribution, documentation, and re-use of scientific components and libraries.
Advanced Optimal Extraction for the Spitzer/IRS
NASA Astrophysics Data System (ADS)
Lebouteiller, V.; Bernard-Salas, J.; Sloan, G. C.; Barry, D. J.
2010-02-01
We present new advances in the spectral extraction of pointlike sources adapted to the Infrared Spectrograph (IRS) on board the Spitzer Space Telescope. For the first time, we created a supersampled point-spread function of the low-resolution modules. We describe how to use the point-spread function to perform optimal extraction of a single source and of multiple sources within the slit. We also examine the case of the optimal extraction of one or several sources with a complex background. The new algorithms are gathered in a plug-in called AdOpt, which is part of the SMART data analysis software.
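Optimal extraction of a point source generally follows the standard profile-weighted scheme (e.g., Horne 1986): each spatial pixel is weighted by the PSF profile and its inverse variance. The sketch below shows that weighting for a single wavelength element with a synthetic toy column; it is a generic illustration of the principle, not the AdOpt code.

    import numpy as np

    def optimal_extract_column(data, variance, profile):
        """Profile-weighted (optimal) extraction of one wavelength column.

        data, variance and profile are 1-D arrays over the spatial direction;
        profile is the PSF resampled to the pixels, normalized to sum to 1.
        """
        profile = profile / profile.sum()
        weights = profile / variance
        flux = np.sum(weights * data) / np.sum(weights * profile)
        flux_var = 1.0 / np.sum(profile**2 / variance)
        return flux, flux_var

    # Toy column: a point source with the background already subtracted.
    spatial = np.arange(-5, 6)
    profile = np.exp(-0.5 * (spatial / 1.5) ** 2)
    true_flux = 100.0
    data = true_flux * profile / profile.sum() + np.random.normal(0, 1.0, spatial.size)
    variance = np.full(spatial.size, 1.0)

    print(optimal_extract_column(data, variance, profile))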
Control electronics for a multi-laser/multi-detector scanning system
NASA Technical Reports Server (NTRS)
Kennedy, W.
1980-01-01
The Mars Rover Laser Scanning system uses a precision laser pointing mechanism, a photodetector array, and the concept of triangulation to perform three dimensional scene analysis. The system is used for real time terrain sensing and vision. The Multi-Laser/Multi-Detector laser scanning system is controlled by a digital device called the ML/MD controller. A next generation laser scanning system, based on the Level 2 controller, is microprocessor based. The new controller capabilities far exceed those of the ML/MD device. The first draft circuit details and general software structure are presented.
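Triangulation from a known laser pointing angle and detector direction reduces to simple geometry; the minimal sketch below recovers the range to the illuminated spot from an assumed baseline and two measured angles, and is only a generic illustration of the principle, not the ML/MD controller's actual conventions.

    import math

    def triangulate_range(baseline_m, laser_angle_rad, detector_angle_rad):
        """Range from the laser to the illuminated spot, given the baseline between
        laser and detector and the two angles measured from that baseline."""
        third_angle = math.pi - laser_angle_rad - detector_angle_rad
        # Law of sines: the side from the laser to the spot is opposite the detector angle.
        return baseline_m * math.sin(detector_angle_rad) / math.sin(third_angle)

    print(triangulate_range(0.5, math.radians(80), math.radians(70)))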
Analysis of Bioprocesses. Dynamic Modeling is a Must.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramkrishna, Doraiswami; Song, Hyun-Seob
2016-01-01
The goal of this paper is to report on the performance of a promising dynamic framework based on the cybernetic concepts which have evolved over three decades. We present case studies of successful dynamic simulations of wild-type strains as well as specific KO mutants on bacteria and yeast. An extensive metabolic engineering effort, including genome scale networks, is called for to secure the methodology and realize its full potential. Towards this end, the software AUMIC is under active further development to enable speedy applications. Its wide use will be enabled by a publication that is shortly due.
Predicting Long-Range Traversability from Short-Range Stereo-Derived Geometry
NASA Technical Reports Server (NTRS)
Turmon, Michael; Tang, Benyang; Howard, Andrew; Brjaracharya, Max
2010-01-01
This program uses close-range 3D terrain analysis to produce training data sufficient to estimate, based only on appearance in imagery, the traversability of terrain beyond 3D sensing range. This approach is called learning from stereo (LFS). In effect, the software transfers knowledge from middle distances, where 3D geometry provides training cues, into the far field, where only appearance is available. This is a viable approach because the same obstacle classes, and sometimes the same obstacles, are typically present in the mid field and the far field. Learning thus extends the effective look-ahead distance of the sensors.
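The near-to-far transfer can be sketched as a standard supervised step: appearance features from mid-field image patches are labeled by stereo-derived traversability, and the trained classifier is then applied to far-field patches where no geometry is available. The sketch below uses scikit-learn with synthetic features purely for illustration; it is not the actual LFS software.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    # Mid-field patches: appearance features (e.g., color/texture statistics) with
    # traversability labels derived from stereo geometry (1 = traversable).
    X_mid = rng.normal(size=(500, 4))
    y_mid = (X_mid[:, 0] + 0.5 * X_mid[:, 1] > 0).astype(int)   # synthetic labels

    # Train on stereo-labeled data, then predict beyond stereo range.
    clf = LogisticRegression().fit(X_mid, y_mid)

    X_far = rng.normal(size=(10, 4))          # far-field patches: appearance only
    print(clf.predict(X_far))                 # estimated traversability
    print(clf.predict_proba(X_far)[:, 1])     # confidence, useful for path planning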
Open web system of Virtual labs for nuclear and applied physics
NASA Astrophysics Data System (ADS)
Saldikov, I. S.; Afanasyev, V. V.; Petrov, V. I.; Ternovykh, M. Yu
2017-01-01
An example of virtual lab work on unique experimental equipment is presented. The virtual lab work is software based on a model of real equipment. Virtual labs can be used for the educational process in the nuclear safety and analysis field. As an example, it includes the virtual lab called “Experimental determination of the material parameter depending on the pitch of a uranium-water lattice”. This paper includes a general description of this lab. A description of a database supporting laboratory work on unique experimental equipment, which is included in this work, and its concept development are also presented.
The AST3 controlling and operating software suite for automatic sky survey
NASA Astrophysics Data System (ADS)
Hu, Yi; Shang, Zhaohui; Ma, Bin; Hu, Keliang
2016-07-01
We have developed a specialized software package, called ast3suite, to achieve remote control and automatic sky survey for AST3 (Antarctic Survey Telescope) from scratch. It includes several daemon servers and many basic commands. Each program does only one single task, and they work together to make AST3 a robotic telescope. A survey script calls basic commands to carry out the automatic sky survey. Ast3suite was carefully tested in Mohe, China in 2013 and has been used at Dome A, Antarctica in 2015 and 2016 with the real hardware for practical sky survey. Both test results and practical use showed that ast3suite worked very well without any manual assistance, as we expected.
Software Design Document SAF Simulation Host CSCI (8). Volume 1, Sections 1.0 - 2.7
1991-06-01
[Fragment of a module description and its calls cross-reference table.] Tests edges matching grid-loc-word for intervisibility blocks. Calls (Function / Where Described): check edges, Sec. 2.6.7.1.8; check box, Sec. 2.6.7.1.31; treelines, Sec. 2.6.7.1.16.
Development of an Information Model for Kidney Transplant Wait List.
Bircan, Hüseyin Yüce; Özçelik, Ümit; Uysal, Nida; Demirağ, Alp; Haberal, Mehmet
2015-11-01
Deceased-donor kidney transplant is unique among surgical procedures in that it is an urgent procedure performed in an elective population. It has not been possible to accurately determine when a given patient will be called for transplant. Patients on the active transplant list can be called for a transplant at any time. As a result, every effort must be made to optimize their health according to best practices and published clinical practice guidelines. Once the patient is placed on the transplant wait list after undergoing an initial extensive evaluation, continued surveillance is required. Therefore, we developed a kidney transplant wait list surveillance software program that alerts the organ transplant coordinator in time regarding which patients need a work-up. The newly designed software has a database of our waiting patients with their completed and pending controls. The software also has built-in functions to warn the responsible staff with an E-mail. If one of the controls of a recipient is delayed, the software sends an automated E-mail to the staff regarding the patient's delayed controls. The software is a Web application that works on any platform with a Web browser and Internet connection and allows access by multiple users. The software has been developed on the .NET platform. The database is SQL Server. The software has the following functions: patient communication info, search, alert list, alert E-mail, control entry, and system management. As of January 2014, a total of 21 000 patients were registered on the National Kidney Transplant wait list in Turkey, and the kidney transplant wait list had been expanding by 2000 to 3000 patients each year. Therefore, computerized wait list programs are crucial to help transplant centers keep their patients up-to-date on time.
Surface infrastructure functions, requirements and subsystems for a manned Mars mission
NASA Technical Reports Server (NTRS)
Fairchild, Kyle
1986-01-01
Planning and development for a permanently manned scientific outpost on Mars requires an in-depth understanding and analysis of the functions the outpost is expected to perform. The optimum configuration that accomplishes these functions then arises during the trade studies process. In a project this complex, it becomes necessary to use a formal methodology to document the design and planning process. The method chosen for this study is called top-down functional decomposition. This method is used to determine the functions that are needed to accomplish the overall mission, then determine what requirements and systems are needed to do each of the functions. This method facilitates automation of the trades and options process. In the example, this was done with an off-the-shelf software package called TK!Solver. The basic functions that a permanently manned outpost on Mars must accomplish are: (1) Establish the Life Critical Systems; (2) Support Planetary Sciences and Exploration; and (3) Develop and Maintain Long-term Support Functions, including those systems needed towards self-sufficiency. The top-down functional decomposition methodology, combined with standard spreadsheet software, offers a powerful tool to quickly assess various design trades and analyze options. As the specific subsystems and the relational rule algorithms are further refined, it will be possible to very accurately determine the implications of continually evolving mission requirements.
Ontology-based specification, identification and analysis of perioperative risks.
Uciteli, Alexandr; Neumann, Juliane; Tahar, Kais; Saleh, Kutaiba; Stucke, Stephan; Faulbrück-Röhr, Sebastian; Kaeding, André; Specht, Martin; Schmidt, Tobias; Neumuth, Thomas; Besting, Andreas; Stegemann, Dominik; Portheine, Frank; Herre, Heinrich
2017-09-06
Medical personnel in hospitals often work under great physical and mental strain. In medical decision-making, errors can never be completely ruled out. Several studies have shown that between 50 and 60% of adverse events could have been avoided through better organization, more attention or more effective security procedures. Critical situations especially arise during interdisciplinary collaboration and the use of complex medical technology, for example during surgical interventions and in perioperative settings (the period of time before, during and after surgical intervention). In this paper, we present an ontology and an ontology-based software system, which can identify risks across medical processes and support the avoidance of errors, in particular in the perioperative setting. We developed a practicable definition of the risk notion, which is easily understandable by the medical staff and is usable by the software tools. Based on this definition, we developed a Risk Identification Ontology (RIO) and used it for the specification and the identification of perioperative risks. An agent system was developed, which gathers risk-relevant data during the whole perioperative treatment process from various sources and provides it for risk identification and analysis in a centralized fashion. The results of such an analysis are provided to the medical personnel in the form of context-sensitive hints and alerts. For the identification of the ontologically specified risks, we developed an ontology-based software module, called the Ontology-based Risk Detector (OntoRiDe). About 20 risks relating to cochlear implantation (CI) have already been implemented. Comprehensive testing has indicated the correctness of the data acquisition, risk identification and analysis components, as well as the web-based visualization of results.
ERIC Educational Resources Information Center
Computer Symbolic, Inc., Washington, DC.
A pseudo assembly language, PAL, was developed and specified for use as the lowest level in a general, multilevel programing system for the realization of cost-effective, hardware-independent Naval software. The language was developed as part of the system called FIRMS (Fast Iterative Recursive Macro System) and is sufficiently general to allow…
Study of a unified hardware and software fault-tolerant architecture
NASA Technical Reports Server (NTRS)
Lala, Jaynarayan; Alger, Linda; Friend, Steven; Greeley, Gregory; Sacco, Stephen; Adams, Stuart
1989-01-01
A unified architectural concept, called the Fault Tolerant Processor Attached Processor (FTP-AP), that can tolerate hardware as well as software faults is proposed for applications requiring ultrareliable computation capability. An emulation of the FTP-AP architecture, consisting of a breadboard Motorola 68010-based quadruply redundant Fault Tolerant Processor, four VAX 750s as attached processors, and four versions of a transport aircraft yaw damper control law, is used as a testbed in the AIRLAB to examine a number of critical issues. Solutions of several basic problems associated with N-Version software are proposed and implemented on the testbed. This includes a confidence voter to resolve coincident errors in N-Version software. A reliability model of N-Version software that is based upon the recent understanding of software failure mechanisms is also developed. The basic FTP-AP architectural concept appears suitable for hosting N-Version application software while at the same time tolerating hardware failures. Architectural enhancements for greater efficiency, software reliability modeling, and N-Version issues that merit further research are identified.
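The confidence voter mentioned above resolves coincident errors among outputs of independent software versions; a minimal majority-style sketch is given below, with the tolerance and grouping policy invented for illustration (it is not the FTP-AP confidence voter itself).

    def nversion_vote(outputs, tolerance=1e-6):
        """Majority vote over N-version outputs, grouping nearly equal values.

        Returns (voted value, number of agreeing versions). The tolerance and
        grouping policy are illustrative only.
        """
        groups = []                              # list of (representative, members)
        for value in outputs:
            for group in groups:
                if abs(value - group[0]) <= tolerance:
                    group[1].append(value)
                    break
            else:
                groups.append((value, [value]))
        best = max(groups, key=lambda g: len(g[1]))
        return sum(best[1]) / len(best[1]), len(best[1])

    # Four versions of a yaw-damper control law; one version disagrees.
    print(nversion_vote([0.1231, 0.1231, 0.1232, 0.4500], tolerance=1e-3))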
Brix, Tobias Johannes; Bruland, Philipp; Sarfraz, Saad; Ernsting, Jan; Neuhaus, Philipp; Storck, Michael; Doods, Justin; Ständer, Sonja; Dugas, Martin
2018-01-01
A required step for presenting results of clinical studies is the declaration of participants' demographic and baseline characteristics, as required by FDAAA 801. The common workflow to accomplish this task is to export the clinical data from the electronic data capture system used and import it into statistical software like SAS software or IBM SPSS. This software requires trained users, who have to implement the analysis individually for each item. These expenditures may become an obstacle for small studies. The objective of this work is to design, implement and evaluate an open source application, called ODM Data Analysis, for the semi-automatic analysis of clinical study data. The system requires clinical data in the CDISC Operational Data Model format. After the file is uploaded, its syntax and the data type conformity of the collected data are validated. The completeness of the study data is determined, and basic statistics, including illustrative charts for each item, are generated. Datasets from four clinical studies have been used to evaluate the application's performance and functionality. The system is implemented as an open source web application (available at https://odmanalysis.uni-muenster.de) and also provided as a Docker image, which enables easy distribution and installation on local systems. Study data are only stored in the application as long as the calculations are performed, which complies with data protection requirements. Analysis times are below half an hour, even for larger studies with over 6000 subjects. Medical experts have confirmed the usefulness of this application for gaining an overview of their collected study data for monitoring purposes and for generating descriptive statistics without further user interaction. The semi-automatic analysis has its limitations and cannot replace the complex analysis of statisticians, but it can be used as a starting point for their examination and reporting.
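The kind of item-level summary such a tool produces can be approximated with a few lines of standard XML parsing, as sketched below; the element and attribute names follow the usual ODM convention of ItemData elements carrying ItemOID and Value attributes, the file name is hypothetical, and the snippet is only a rough stand-in for the ODM Data Analysis web application.

    import statistics
    import xml.etree.ElementTree as ET
    from collections import defaultdict

    def summarize_odm(path):
        """Collect ItemData values per ItemOID and print simple descriptive statistics."""
        values = defaultdict(list)
        root = ET.parse(path).getroot()
        # '{*}' matches any namespace (Python 3.8+), so the ODM namespace need not be hard-coded.
        for item in root.findall(".//{*}ItemData"):
            oid, value = item.get("ItemOID"), item.get("Value")
            if oid is not None and value is not None:
                values[oid].append(value)

        for oid, vals in values.items():
            try:
                nums = [float(v) for v in vals]
                print(f"{oid}: n={len(nums)} mean={statistics.mean(nums):.2f}")
            except ValueError:                   # non-numeric item: report completeness only
                print(f"{oid}: n={len(vals)} (categorical)")

    # summarize_odm("study_export.xml")   # hypothetical ODM export file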
Failure analysis on false call probe pins of microprocessor test equipment
NASA Astrophysics Data System (ADS)
Tang, L. W.; Ong, N. R.; Mohamad, I. S. B.; Alcain, J. B.; Retnasamy, V.
2017-09-01
A study has been conducted to investigate failure analysis on the probe pins of test modules for microprocessors. The 'health condition' of a probe pin is determined by its resistance value. A test module powered with 5 V from an Arduino UNO and using the "four-wire ohm measurement" method is implemented in this study to measure the resistance of the probe pins of a microprocessor. The probe pins from a scrapped computer motherboard are used as the test sample in this study. The functionality of the test module was validated with a pre-measurement experiment via the VEE Pro software. Lastly, the experimental work has demonstrated that the implemented test module has the capability to identify a probe pin's 'health condition' based on the measured resistance value.
PERTS: A Prototyping Environment for Real-Time Systems
NASA Technical Reports Server (NTRS)
Liu, Jane W. S.; Lin, Kwei-Jay; Liu, C. L.
1991-01-01
We discuss an ongoing project to build a Prototyping Environment for Real-Time Systems, called PERTS. PERTS is a unique prototyping environment in that it has (1) tools and performance models for the analysis and evaluation of real-time prototype systems, (2) building blocks for flexible real-time programs and the support system software, (3) basic building blocks of distributed and intelligent real time applications, and (4) an execution environment. PERTS will make the recent and future theoretical advances in real-time system design and engineering readily usable to practitioners. In particular, it will provide an environment for the use and evaluation of new design approaches, for experimentation with alternative system building blocks and for the analysis and performance profiling of prototype real-time systems.
Anima: Modular Workflow System for Comprehensive Image Data Analysis
Rantanen, Ville; Valori, Miko; Hautaniemi, Sampsa
2014-01-01
Modern microscopes produce vast amounts of image data, and computational methods are needed to analyze and interpret these data. Furthermore, a single image analysis project may require tens or hundreds of analysis steps, starting from data import and pre-processing to segmentation and statistical analysis, and ending with visualization and reporting. To manage such large-scale image data analysis projects, we present here a modular workflow system called Anima. Anima is designed for comprehensive and efficient image data analysis development, and it contains several features that are crucial in high-throughput image data analysis: programming language independence, batch processing, easily customized data processing, interoperability with other software via application programming interfaces, and advanced multivariate statistical analysis. The utility of Anima is shown with two case studies focusing on testing different algorithms developed in different imaging platforms and an automated prediction of alive/dead C. elegans worms by integrating several analysis environments. Anima is fully open source and available with documentation at www.anduril.org/anima. PMID:25126541
Yan, Koon-Kiu; Fang, Gang; Bhardwaj, Nitin; Alexander, Roger P.; Gerstein, Mark
2010-01-01
The genome has often been called the operating system (OS) for a living organism. A computer OS is described by a regulatory control network termed the call graph, which is analogous to the transcriptional regulatory network in a cell. To apply our firsthand knowledge of the architecture of software systems to understand cellular design principles, we present a comparison between the transcriptional regulatory network of a well-studied bacterium (Escherichia coli) and the call graph of a canonical OS (Linux) in terms of topology and evolution. We show that both networks have a fundamentally hierarchical layout, but there is a key difference: The transcriptional regulatory network possesses a few global regulators at the top and many targets at the bottom; conversely, the call graph has many regulators controlling a small set of generic functions. This top-heavy organization leads to highly overlapping functional modules in the call graph, in contrast to the relatively independent modules in the regulatory network. We further develop a way to measure evolutionary rates comparably between the two networks and explain this difference in terms of network evolution. The process of biological evolution via random mutation and subsequent selection tightly constrains the evolution of regulatory network hubs. The call graph, however, exhibits rapid evolution of its highly connected generic components, made possible by designers’ continual fine-tuning. These findings stem from the design principles of the two systems: robustness for biological systems and cost effectiveness (reuse) for software systems. PMID:20439753
Yan, Koon-Kiu; Fang, Gang; Bhardwaj, Nitin; Alexander, Roger P; Gerstein, Mark
2010-05-18
The genome has often been called the operating system (OS) for a living organism. A computer OS is described by a regulatory control network termed the call graph, which is analogous to the transcriptional regulatory network in a cell. To apply our firsthand knowledge of the architecture of software systems to understand cellular design principles, we present a comparison between the transcriptional regulatory network of a well-studied bacterium (Escherichia coli) and the call graph of a canonical OS (Linux) in terms of topology and evolution. We show that both networks have a fundamentally hierarchical layout, but there is a key difference: The transcriptional regulatory network possesses a few global regulators at the top and many targets at the bottom; conversely, the call graph has many regulators controlling a small set of generic functions. This top-heavy organization leads to highly overlapping functional modules in the call graph, in contrast to the relatively independent modules in the regulatory network. We further develop a way to measure evolutionary rates comparably between the two networks and explain this difference in terms of network evolution. The process of biological evolution via random mutation and subsequent selection tightly constrains the evolution of regulatory network hubs. The call graph, however, exhibits rapid evolution of its highly connected generic components, made possible by designers' continual fine-tuning. These findings stem from the design principles of the two systems: robustness for biological systems and cost effectiveness (reuse) for software systems.
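The top-heavy versus bottom-heavy contrast drawn above can be made concrete by counting, in a directed graph, how many nodes regulate (call) others versus how many are only regulated; the sketch below does this with networkx on two tiny made-up graphs and is purely illustrative of the comparison, not of the paper's analysis.

    import networkx as nx

    def regulator_ratio(edges):
        """Fraction of nodes that regulate (call) at least one other node."""
        g = nx.DiGraph(edges)
        regulators = sum(1 for n in g.nodes if g.out_degree(n) > 0)
        return regulators / g.number_of_nodes()

    # Toy transcriptional network: one global regulator, many targets (bottom-heavy).
    regulatory = [("crp", gene) for gene in ("gene1", "gene2", "gene3", "gene4", "gene5")]

    # Toy call graph: many callers sharing one generic routine (top-heavy).
    call_graph = [(f"func{i}", "memcpy") for i in range(1, 6)]

    print(regulator_ratio(regulatory))   # low: few regulators at the top
    print(regulator_ratio(call_graph))   # high: many regulators over generic functions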
Development of high performance scientific components for interoperability of computing packages
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gulabani, Teena Pratap
2008-01-01
Three major high performance quantum chemistry computational packages, NWChem, GAMESS and MPQC, have been developed by different research efforts following different design patterns. The goal is to achieve interoperability among these packages by overcoming the challenges caused by the different communication patterns and software design of each of these packages. Chemistry algorithms are difficult and time-consuming to develop; integration of large quantum chemistry packages will allow resource sharing and thus avoid reinvention of the wheel. Creating connections between these incompatible packages is the major motivation of the proposed work. This interoperability is achieved by bringing the benefits of Component Based Software Engineering through a plug-and-play component framework called the Common Component Architecture (CCA). In this thesis, I present a strategy and process used for interfacing two widely used and important computational chemistry methodologies: Quantum Mechanics and Molecular Mechanics. To show the feasibility of the proposed approach, the Tuning and Analysis Utility (TAU) has been coupled with the NWChem code and its CCA components. Results show that the overhead is negligible when compared to the ease and potential of organizing and coping with large-scale software applications.
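The plug-and-play idea behind component frameworks can be summarized with a small interface sketch: packages expose functionality behind a common "port" so a driver can swap implementations without changing its own code. The names below are hypothetical and are not the actual CCA or Babel API.

```python
# A schematic sketch (hypothetical names, not the CCA/Babel API) of the
# plug-and-play idea: packages implement a shared "port" interface so a
# driver can use any of them interchangeably.
from abc import ABC, abstractmethod

class EnergyPort(ABC):
    """A 'provides' port: any chemistry package can implement it."""
    @abstractmethod
    def energy(self, geometry: list[tuple[str, float, float, float]]) -> float: ...

class ToyQMComponent(EnergyPort):
    def energy(self, geometry):
        return -1.0 * len(geometry)   # placeholder for a real QM calculation

class ToyMMComponent(EnergyPort):
    def energy(self, geometry):
        return -0.1 * len(geometry)   # placeholder for a real MM calculation

def driver(port: EnergyPort, geometry):
    # The driver only 'uses' the port; the concrete package is pluggable.
    return port.energy(geometry)

water = [("O", 0.0, 0.0, 0.0), ("H", 0.96, 0.0, 0.0), ("H", -0.24, 0.93, 0.0)]
print(driver(ToyQMComponent(), water), driver(ToyMMComponent(), water))
```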
Optimum-AIV: A planning and scheduling system for spacecraft AIV
NASA Technical Reports Server (NTRS)
Arentoft, M. M.; Fuchs, Jens J.; Parrod, Y.; Gasquet, Andre; Stader, J.; Stokes, I.; Vadon, H.
1991-01-01
A project undertaken for the European Space Agency (ESA) is presented. The project is developing a knowledge based software system for planning and scheduling of activities for spacecraft assembly, integration, and verification (AIV). The system extends into the monitoring of plan execution and the plan repair phase. The objectives are to develop an operational kernel of a planning, scheduling, and plan repair tool, called OPTIMUM-AIV, and to provide facilities which will allow individual projects to customize the kernel to suit their specific needs. The kernel shall consist of a set of software functionalities for assistance in initial specification of the AIV plan, in verification and generation of valid plans and schedules for the AIV activities, and in interactive monitoring and execution problem recovery for the detailed AIV plans. Embedded in OPTIMUM-AIV are external interfaces which allow integration with alternative scheduling systems and project databases. The current status of the OPTIMUM-AIV project, as of Jan. 1991, is that a further analysis of the AIV domain has taken place through interviews with satellite AIV experts, a software requirement document (SRD) for the full operational tool was approved, and an architectural design document (ADD) for the kernel excluding external interfaces is ready for review.
Development and Testing of Control Laws for the Active Aeroelastic Wing Program
NASA Technical Reports Server (NTRS)
Dibley, Ryan P.; Allen, Michael J.; Clarke, Robert; Gera, Joseph; Hodgkinson, John
2005-01-01
The Active Aeroelastic Wing research program was a joint program between the U.S. Air Force Research Laboratory and NASA established to investigate the characteristics of an aeroelastic wing and the technique of using wing twist for roll control. The flight test program employed an F/A-18 aircraft modified by reducing the wing torsional stiffness and adding a custom research flight control system. The research flight control system was optimized to maximize roll rate using only wing surfaces to twist the wing while simultaneously maintaining design load limits, stability margins, and handling qualities. NASA Dryden Flight Research Center developed control laws using the software design tool called CONDUIT, which employs multi-objective function optimization to tune selected control system design parameters. Modifications were made to the Active Aeroelastic Wing implementation in this new software design tool to incorporate the NASA Dryden Flight Research Center nonlinear F/A-18 simulation for time history analysis. This paper describes the design process, including how the control law requirements were incorporated into constraints for the optimization of this specific software design tool. Predicted performance is also compared to results from flight.
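The general pattern of tuning design parameters against a performance objective while honoring hard constraints can be sketched with an off-the-shelf constrained optimizer. The sketch below is a generic illustration only, with invented toy surrogate functions; it is not CONDUIT and does not represent the actual F/A-18 control-law models.

```python
# A conceptual sketch (generic constrained gain tuning, not CONDUIT) of
# maximizing a performance metric while keeping a hard constraint satisfied.
import numpy as np
from scipy.optimize import minimize

def roll_rate(gains):
    k1, k2 = gains                       # toy surrogate for achievable roll rate
    return 40 * k1 + 25 * k2 - 6 * k1 * k2

def stability_margin(gains):
    k1, k2 = gains                       # toy surrogate: margin shrinks as gains grow
    return 8.0 - 1.5 * k1 - 1.0 * k2

objective = lambda g: -roll_rate(g)      # maximize roll rate
cons = [{"type": "ineq", "fun": lambda g: stability_margin(g) - 6.0}]  # margin >= 6
bounds = [(0.0, 2.0), (0.0, 2.0)]

res = minimize(objective, x0=np.array([0.5, 0.5]),
               bounds=bounds, constraints=cons, method="SLSQP")
print(res.x, -res.fun, stability_margin(res.x))
```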
Toxmatch-a new software tool to aid in the development and evaluation of chemically similar groups.
Patlewicz, G; Jeliazkova, N; Gallegos Saliner, A; Worth, A P
2008-01-01
Chemical similarity is a widely used concept in toxicology, and is based on the hypothesis that similar compounds should have similar biological activities. This forms the underlying basis for performing read-across, forming chemical groups and developing (Quantitative) Structure-Activity Relationships ((Q)SARs). Chemical similarity is often perceived as structural similarity but in fact there are a number of other approaches that can be used to assess similarity. A systematic similarity analysis usually comprises two main steps. Firstly the chemical structures to be compared need to be characterised in terms of relevant descriptors which encode their physicochemical, topological, geometrical and/or surface properties. A second step involves a quantitative comparison of those descriptors using similarity (or dissimilarity) indices. This work outlines the use of chemical similarity principles in the formation of endpoint specific chemical groupings. Examples are provided to illustrate the development and evaluation of chemical groupings using a new software application called Toxmatch that was recently commissioned by the European Chemicals Bureau (ECB), of the European Commission's Joint Research Centre. Insights from using this software are highlighted with specific focus on the prospective application of chemical groupings under the new chemicals legislation, REACH.
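The two-step workflow described above (encode each chemical as descriptors, then compare with a similarity index) can be illustrated with a small sketch. This is not Toxmatch itself; the feature sets are hypothetical stand-ins for real structural keys, and only the widely used Tanimoto (Jaccard) coefficient is shown.

```python
# A minimal sketch (not Toxmatch) of descriptor-based similarity scoring:
# (1) encode each chemical as a set of structural features (a fingerprint),
# (2) compare fingerprints with a similarity index such as the Tanimoto coefficient.
def tanimoto(fp_a: set[int], fp_b: set[int]) -> float:
    """Tanimoto (Jaccard) similarity of two feature sets."""
    if not fp_a and not fp_b:
        return 1.0
    return len(fp_a & fp_b) / len(fp_a | fp_b)

# Hypothetical feature indices standing in for real structural keys.
aniline      = {1, 4, 7, 9}
cyanoaniline = {1, 4, 7, 9, 12}
benzene      = {1, 4}

print(tanimoto(aniline, cyanoaniline))  # high similarity -> candidate for same group
print(tanimoto(aniline, benzene))       # lower similarity
```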
Derbidge, Renatus; Feiten, Linus; Conradt, Oliver; Heusser, Peter; Baumgartner, Stephan
2013-01-01
Photographs of mistletoe (Viscum album L.) berries taken by a permanently fixed camera during their development in autumn were subjected to an outline shape analysis by fitting path curves using a mathematical algorithm from projective geometry. During growth and maturation processes the shape of mistletoe berries can be described by a set of such path curves, making it possible to extract changes of shape using one parameter called Lambda. Lambda describes the outline shape of a path curve. Here we present methods and software to capture and measure these changes of form over time. The present paper describes the software used to automatize a number of tasks including contour recognition, optimization of fitting the contour via hill-climbing, derivation of the path curves, computation of Lambda and blinding the pictures for the operator. The validity of the program is demonstrated by results from three independent measurements showing circadian rhythm in mistletoe berries. The program is available as open source and will be applied in a project to analyze the chronobiology of shape in mistletoe berries and the buds of their host trees. PMID:23565255
Proactive Security Testing and Fuzzing
NASA Astrophysics Data System (ADS)
Takanen, Ari
Software is bound to have security-critical flaws, and no testing or code auditing can ensure that software is flawless. But software security testing requirements have improved radically during the past years, largely due to criticism from security-conscious consumers and enterprise customers. Whereas in the past security flaws were taken for granted (and patches were quietly and humbly installed), they are now probably one of the most common reasons why people switch vendors or software providers. The maintenance costs of security updates often add up to become one of the biggest cost items for large enterprise users. Fortunately, test automation techniques have also improved. Techniques like model-based testing (MBT) enable efficient generation of security tests that reach good confidence levels in discovering zero-day mistakes in software. This technique is called fuzzing.
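To make the idea concrete, the toy sketch below shows the simplest form of fuzzing: random mutation of a valid input followed by crash monitoring. This is a deliberately naive mutation fuzzer, not the model-based test generation described above, and the target parser is an invented stand-in.

```python
# A toy mutation fuzzer (random bit flips plus crash monitoring), a much
# simpler cousin of model-based test generation; the target is a stand-in.
import random

def mutate(data: bytes, n_flips: int = 4) -> bytes:
    buf = bytearray(data)
    for _ in range(n_flips):
        i = random.randrange(len(buf))
        buf[i] ^= 1 << random.randrange(8)   # flip one random bit
    return bytes(buf)

def parse_under_test(data: bytes) -> int:
    """Stand-in for the real target; rejects certain malformed inputs."""
    if data[:2] != b"HD":
        raise ValueError("bad header")
    return sum(data[2:])

seed = b"HD" + bytes(range(16))
failures = 0
for _ in range(10_000):
    case = mutate(seed)
    try:
        parse_under_test(case)
    except Exception:
        failures += 1        # in real fuzzing, the failing input would be stored
print(f"{failures} inputs rejected or crashed out of 10000")
```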
2017-03-17
NASA engineers and test directors gather in Firing Room 3 in the Launch Control Center at NASA's Kennedy Space Center in Florida, to watch a demonstration of the automated command and control software for the agency's Space Launch System (SLS) and Orion spacecraft. In front, far right, is Charlie Blackwell-Thompson, launch director for Exploration Mission 1 (EM-1). The software is called the Ground Launch Sequencer. It will be responsible for nearly all of the launch commit criteria during the final phases of launch countdowns. The Ground and Flight Application Software Team (GFAST) demonstrated the software. It was developed by the Command, Control and Communications team in the Ground Systems Development and Operations (GSDO) Program. GSDO is helping to prepare the center for the first test flight of Orion atop the SLS on EM-1.
Development of the electronic health records for nursing education (EHRNE) software program.
Kowitlawakul, Yanika; Wang, Ling; Chan, Sally Wai-Chi
2013-12-01
This paper outlines preliminary research of an innovative software program that enables the use of an electronic health record in a nursing education curriculum. The software application program is called EHRNE, which stands for Electronic Health Record for Nursing Education. The aim of EHRNE is to enhance students' learning of health informatics when they are working in the simulation laboratory. Integrating EHRNE into the nursing curriculum exposes students to electronic health records before they go into the workplace. A qualitative study was conducted using focus group interviews of nine nursing students. Nursing students' perceptions of using the EHRNE application were explored. The interviews were audio-taped and transcribed verbatim. The data was analyzed following the Colaizzi (1978) guideline. Four main categories that related to the EHRNE application were identified from the interviews: functionality, data management, timing and complexity, and accessibility. The analysis of the data revealed advantages and limitations of using EHRNE in the classroom setting. Integrating the EHRNE program into the curriculum will promote students' awareness of electronic documentation and enhance students' learning in the simulation laboratory. Preliminary findings suggested that before integrating the EHRNE program into the nursing curriculum, educational sessions for both students and faculty outlining the software's purpose, advantages, and limitations were needed. Following the educational sessions, further investigation of students' perceptions and learning using the EHRNE program is recommended. Copyright © 2012 Elsevier Ltd. All rights reserved.
Hobby, Kirsten; Gallagher, Richard T; Caldwell, Patrick; Wilson, Ian D
2009-01-01
This work describes the identification of 'isotopically enriched' metabolites of 4-cyanoaniline using the unique features of the software package 'Spectral Simplicity'. The software is capable of creating the theoretical mass spectra for partially isotope-enriched compounds, and subsequently performing an elemental composition analysis to give the elemental formula for the 'isotopically enriched' metabolite. A novel mass spectral correlation method, called 'FuzzyFit', was employed. 'FuzzyFit' utilises the expected experimental distribution of errors in both mass accuracy and isotope pattern and enables discrimination between statistically probable and improbable candidate formulae. The software correctly determined the molecular formulae of ten previously described metabolites of 4-cyanoaniline confirming the technique of partial isotope enrichment can produce results analogous to standard methodologies. Six previously unknown species were also identified, based on the presence of the unique 'designer' isotope ratio. Three of the unknowns were tentatively identified as N-acetylglutamine, O-methyl-N acetylglucuronide and a putative fatty acid conjugate. The discovery of a significant number of unknown species of a model drug with a comprehensive history of investigation highlights the potential for enhancement to the analytical process by the use of 'designer' isotope ratio compounds. The 'FuzzyFit' methodology significantly aided the elucidation of candidate formulae, by provision of a vastly simplified candidate formula data set. Copyright (c) 2008 John Wiley & Sons, Ltd.
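The scoring principle described here, combining mass accuracy and isotope-pattern agreement to rank candidate formulae, can be sketched with a simple two-term score. This is only a schematic illustration with invented numbers and tolerances; it is not the actual 'FuzzyFit' algorithm or its error model.

```python
# A schematic candidate-formula score (not the 'FuzzyFit' algorithm) combining
# the two pieces of evidence the text describes: mass accuracy and isotope pattern.
def ppm_error(measured_mz: float, theoretical_mz: float) -> float:
    return abs(measured_mz - theoretical_mz) / theoretical_mz * 1e6

def pattern_rmsd(measured: list[float], theoretical: list[float]) -> float:
    """Root-mean-square difference between normalised isotope abundances."""
    return (sum((m - t) ** 2 for m, t in zip(measured, theoretical))
            / len(theoretical)) ** 0.5

def candidate_score(measured_mz, measured_pattern, theo_mz, theo_pattern,
                    ppm_tol=5.0, pattern_tol=0.05):
    # Smaller is better; terms are scaled by their tolerances before summing.
    mass_term = ppm_error(measured_mz, theo_mz) / ppm_tol
    pattern_term = pattern_rmsd(measured_pattern, theo_pattern) / pattern_tol
    return mass_term + pattern_term

# Hypothetical m/z values and abundances, for illustration only.
print(candidate_score(119.0609, [1.00, 0.078, 0.004],
                      119.0604, [1.00, 0.080, 0.004]))
```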
Using Selection Pressure as an Asset to Develop Reusable, Adaptable Software Systems
NASA Technical Reports Server (NTRS)
Berrick, Stephen; Lynnes, Christopher
2007-01-01
The Goddard Earth Sciences Data and Information Services Center (GES DISC) at NASA has over the years developed and honed several reusable architectural components for supporting large-scale data centers with a large customer base. These include a processing system (S4PM) and an archive system (S4PA) based upon a workflow engine called the Simple Scalable Script-based Science Processor (S4P), and an online data visualization and analysis system (Giovanni). These subsystems are currently reused internally in a variety of combinations to implement customized data management on behalf of instrument science teams and other science investigators. Some of these subsystems (S4P and S4PM) have also been reused by other data centers for operational science processing. Our experience has been that the development and utilization of robust, interoperable, and reusable software systems can actually flourish in environments defined by heterogeneous commodity hardware, an emphasis on value-added customer service, and the continual goal of achieving higher cost efficiencies. The repeated internal reuse that is fostered by such an environment encourages and even forces changes to the software that make it more reusable and adaptable. Allowing and even encouraging such selective pressures on software development has been a key factor in the success of S4P and S4PM, which are now available to the open source community under the NASA Open Source Agreement.
Connecting the person with dementia and family: a feasibility study of a telepresence robot
2014-01-01
Background Maintenance of communication is important for people with dementia living in long-term care. The purpose of this study was to assess the feasibility of using “Giraff”, a telepresence robot to enhance engagement between family and a person with dementia living in long-term care. Methods A mixed-methods approach involving semi-structured interviews, call records and video observational data was used. Five people with dementia and their family member participated in a discussion via the Giraff robot for a minimum of six times over a six-week period. A feasibility framework was used to assess feasibility and included video analysis of emotional response and engagement. Results Twenty-six calls with an average duration of 23 mins took place. Residents showed a general state of positive emotions across the calls with a high level of engagement and a minimal level of negative emotions. Participants enjoyed the experience and families reported that the Giraff robot offered the opportunity to reduce social isolation. A number of software and hardware challenges were encountered. Conclusions Participants perceived this novel approach to engage families and people with dementia as a feasible option. Participants were observed and also reported to enjoy the experience. The technical challenges identified have been improved in a newer version of the robot. Future research should include a feasibility trial of longer duration, with a larger sample and a cost analysis. PMID:24456417
The composition of intern work while on call.
Fletcher, Kathlyn E; Visotcky, Alexis M; Slagle, Jason M; Tarima, Sergey; Weinger, Matthew B; Schapira, Marilyn M
2012-11-01
The work of house staff is increasingly being scrutinized as duty hours continue to be restricted. To describe the distribution of work performed by internal medicine interns while on call. Prospective time-motion study on general internal medicine wards at a VA hospital affiliated with a tertiary care medical center and internal medicine residency program. Internal medicine interns. Trained observers followed interns during a "call" day. The observers continuously recorded the tasks performed by interns, using customized task analysis software. We measured the amount of time spent on each task. We calculated means and standard deviations for the amount of time spent on six categories of tasks: clinical computer work (e.g., writing orders and notes), non-patient communication, direct patient care (work done at the bedside), downtime, transit and teaching/learning. We also calculated means and standard deviations for time spent on specific tasks within each category. We compared the amount of time spent on the top three categories using analysis of variance. The largest proportion of intern time was spent in clinical computer work (40 %). Thirty percent of time was spent on non-patient communication. Only 12 % of intern time was spent at the bedside. Downtime activities, transit and teaching/learning accounted for 11 %, 5 % and 2 % of intern time, respectively. Our results suggest that during on call periods, relatively small amounts of time are spent on direct patient care and teaching/learning activities. As intern duty hours continue to decrease, attention should be directed towards preserving time with patients and increasing time in education.
Building Interactive Visualizations for Geochronological Data
NASA Astrophysics Data System (ADS)
Zeringue, J.; Bowring, J. F.; McLean, N. M.; Pastor, F.
2014-12-01
Since the early 1990s, Ken Ludwig's Isoplot software has been the tool of choice for visualization and analysis of isotopic data used for geochronology. The software is an add-in to Microsoft Excel that allows users to generate visual representations of data. However, recent changes to Excel have made Isoplot more difficult to use and maintain, and the software is no longer supported. In the last several years, the Cyber Infrastructure Research and Development Lab for the Earth Sciences (CIRDLES), at the College of Charleston, has worked collaboratively with geochronologists to develop U-Pb_Redux, a software product that provides some of Isoplot's functionality for U-Pb geochronology. However, the community needs a full and complete Isoplot replacement that is open source, platform independent, and not dependent on proprietary software. This temporary lapse in tooling also presents a tremendous opportunity for scientific computing in the earth sciences. When Isoplot was written for Excel, it gained much of the platform's flexibility and power but also was burdened with its limitations. For example, Isoplot could not be used outside of Excel, could not be cross-platform (so long as Excel wasn't), could not be embedded in other applications, and only static images could be produced. Nonetheless this software was and still is a powerful tool that has served the community for more than two decades and the trade-offs were more than acceptable. In 2014, we seek to gain flexibility not available with Excel. We propose that the next generation of charting software be reusable, platform-agnostic, and interactive. This new software should allow scientists to easily explore—not just passively view—their data. Beginning in the fall of 2013, researchers at CIRDLES began planning for and prototyping a 21st-century replacement for Isoplot, which we call Topsoil, an anagram of Isoplot. This work is being conducted in the public domain at https://github.com/CIRDLES/topsoil. We welcome and encourage community participation and contributions.
SUGAR: graphical user interface-based data refiner for high-throughput DNA sequencing.
Sato, Yukuto; Kojima, Kaname; Nariai, Naoki; Yamaguchi-Kabata, Yumi; Kawai, Yosuke; Takahashi, Mamoru; Mimori, Takahiro; Nagasaki, Masao
2014-08-08
Next-generation sequencers (NGSs) have become one of the main tools in current biology. To obtain useful insights from NGS data, it is essential to control low-quality portions of the data affected by technical errors such as air bubbles in the sequencing fluidics. We developed a software tool, SUGAR (subtile-based GUI-assisted refiner), which can handle ultra-high-throughput data with a user-friendly graphical user interface (GUI) and interactive analysis capability. SUGAR generates high-resolution quality heatmaps of the flowcell, enabling users to find possible signals of technical errors during the sequencing run. The sequencing data generated from the error-affected regions of a flowcell can be selectively removed by automated analysis or GUI-assisted operations implemented in SUGAR. The automated data-cleaning function, based on sequence read quality (Phred) scores, was applied to public whole human genome sequencing data, and we showed that the overall mapping quality was improved. The detailed data evaluation and cleaning enabled by SUGAR should reduce technical problems in sequence read mapping, improving subsequent variant analyses that require high-quality sequence data and mapping results. Therefore, the software will be especially useful for controlling the quality of variant calls for low-population cells, e.g., cancers, in samples affected by technical errors in the sequencing procedures.
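The core idea of flagging error-affected flowcell regions from quality scores can be sketched very simply: summarize Phred quality per subtile and discard reads from regions that fall below a threshold. The sketch below is not SUGAR's implementation; tile identifiers, quality strings, and the threshold are illustrative.

```python
# A minimal sketch (not SUGAR's implementation) of per-subtile quality
# summarization and removal of reads from failing regions.
from statistics import mean

def mean_phred(qual_string: str, offset: int = 33) -> float:
    """Mean Phred score of one read, assuming Sanger/Illumina 1.8+ encoding."""
    return mean(ord(c) - offset for c in qual_string)

def flag_bad_tiles(reads, min_mean_q=25.0):
    """reads: iterable of (tile_id, qual_string). Returns tiles to discard."""
    per_tile = {}
    for tile, qual in reads:
        per_tile.setdefault(tile, []).append(mean_phred(qual))
    return {t for t, scores in per_tile.items() if mean(scores) < min_mean_q}

reads = [("2104", "IIIIHHHH"), ("2104", "IIIIIIII"),   # good tile
         ("2110", "####$$%%"), ("2110", "##%%&&'(")]   # air-bubble-like tile
print(flag_bad_tiles(reads))   # -> {'2110'}
```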
Yu, Kebing; Salomon, Arthur R.
2010-01-01
Recently, dramatic progress has been achieved in expanding the sensitivity, resolution, mass accuracy, and scan rate of mass spectrometers able to fragment and identify peptides through tandem mass spectrometry (MS/MS). Unfortunately, this enhanced ability to acquire proteomic data has not been accompanied by a concomitant increase in the availability of flexible tools allowing users to rapidly assimilate, explore, and analyze this data and adapt to a variety of experimental workflows with minimal user intervention. Here we fill this critical gap by providing a flexible relational database called PeptideDepot for organization of expansive proteomic data sets, collation of proteomic data with available protein information resources, and visual comparison of multiple quantitative proteomic experiments. Our software design, built upon the synergistic combination of a MySQL database for safe warehousing of proteomic data with a FileMaker-driven graphical user interface for flexible adaptation to diverse workflows, enables proteomic end-users to directly tailor the presentation of proteomic data to the unique analysis requirements of the individual proteomics lab. PeptideDepot may be deployed as an independent software tool or integrated directly with our High Throughput Autonomous Proteomic Pipeline (HTAPP) used in the automated acquisition and post-acquisition analysis of proteomic data. PMID:19834895
Mapping CMMI Level 2 to Scrum Practices: An Experience Report
NASA Astrophysics Data System (ADS)
Diaz, Jessica; Garbajosa, Juan; Calvo-Manzano, Jose A.
CMMI has been adopted advantageously in large companies, yielding improvements in software quality, budget fulfillment, and customer satisfaction. However, SPI strategies based on CMMI-DEV require heavyweight software development processes and large investments of cost and time that small and medium-sized companies cannot afford. So-called lightweight software development processes, such as Agile Software Development (ASD), address these challenges. ASD welcomes changing requirements and stresses the importance of adaptive planning, simplicity, and continuous delivery of valuable software in short, time-framed iterations. ASD is becoming increasingly attractive in an ever more global and changing software market. It would therefore be greatly useful to be able to introduce agile methods such as Scrum in compliance with the CMMI process model. This paper aims to increase the understanding of the relationship between ASD and CMMI-DEV by reporting empirical results that confirm theoretical comparisons between ASD practices and CMMI level 2.
Genome-Assisted Prediction of Quantitative Traits Using the R Package sommer.
Covarrubias-Pazaran, Giovanny
2016-01-01
Most traits of agronomic importance are quantitative in nature, and genetic markers have been used for decades to dissect such traits. Recently, genomic selection has earned attention as next generation sequencing technologies became feasible for major and minor crops. Mixed models have become a key tool for fitting genomic selection models, but most current genomic selection software can only include a single variance component other than the error, making hybrid prediction using additive, dominance and epistatic effects unfeasible for species displaying heterotic effects. Moreover, likelihood-based software for fitting mixed models with multiple random effects that allows the user to specify the variance-covariance structure of random effects has not been fully exploited. A new open-source R package called sommer is presented to facilitate the use of mixed models for genomic selection and hybrid prediction purposes using more than one variance component and allowing specification of covariance structures. The use of sommer for genomic prediction is demonstrated through several examples using maize and wheat genotypic and phenotypic data. At its core, the program contains three algorithms for estimating variance components: Average Information (AI), Expectation-Maximization (EM) and Efficient Mixed Model Association (EMMA). Kernels for calculating the additive, dominance and epistatic relationship matrices are included, along with other useful functions for genomic analysis. Results from sommer were comparable to those from other software, but the analysis was faster than Bayesian counterparts by a margin of hours to days. In addition, the ability to deal with missing data, combined with greater flexibility and speed than other REML-based software, was achieved by putting together some of the most efficient algorithms to fit models in a user-friendly environment such as R.
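The multi-component model described here can be written in its standard textbook form, with one relationship matrix and one variance component per genetic effect (additive, dominance, epistatic). The equations below are the conventional formulation of such models and are not quoted from the sommer documentation.

```latex
% Standard multi-kernel mixed model (textbook form, not taken from the sommer
% documentation): each random effect u_i has its own relationship matrix K_i
% and variance component sigma_i^2.
\begin{align*}
  \mathbf{y} &= \mathbf{X}\boldsymbol{\beta}
     + \sum_{i=1}^{r} \mathbf{Z}_i \mathbf{u}_i + \boldsymbol{\varepsilon}, \\
  \mathbf{u}_i &\sim \mathcal{N}\!\left(\mathbf{0},\, \mathbf{K}_i \sigma_i^2\right),
  \qquad
  \boldsymbol{\varepsilon} \sim \mathcal{N}\!\left(\mathbf{0},\, \mathbf{I}\,\sigma_e^2\right)
\end{align*}
```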
MPATHav: A software prototype for multiobjective routing in transportation risk assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ganter, J.H.; Smith, J.D.
Most routing problems depend on several important variables: transport distance, population exposure, accident rate, mandated roads (e.g., HM-164 regulations), and proximity to emergency response resources are typical. These variables may need to be minimized or maximized, and often are weighted. 'Objectives' to be satisfied by the analysis are thus created. The resulting problems can be approached by combining spatial analysis techniques from geographic information systems (GIS) with multiobjective analysis techniques from the field of operations research (OR); we call this hybrid 'multiobjective spatial analysis' (MOSA). MOSA can be used to discover, display, and compare a range of solutions that satisfy a set of objectives to varying degrees. For instance, a suite of solutions may include: one solution that provides short transport distances, but at a cost of high exposure; another solution that provides low exposure, but long distances; and a range of solutions between these two extremes.
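A simple way to see how a range of compromise routes arises is to fold several edge attributes into a single weighted cost and sweep the weights. The sketch below is a toy illustration with invented nodes and values, not MPATHav; it uses the networkx library only to compute weighted shortest paths.

```python
# A toy sketch (not MPATHav) of weighted multiobjective routing: each edge has
# several attributes, and a weight vector turns them into a single cost so a
# family of candidate routes can be generated by sweeping the weights.
import networkx as nx

G = nx.Graph()
# edge: (from, to, distance_km, population_exposed)  -- invented values
edges = [("A", "B", 10, 5000), ("B", "D", 12, 8000),
         ("A", "C", 18, 500),  ("C", "D", 20, 300)]
for u, v, dist, pop in edges:
    G.add_edge(u, v, dist=dist, pop=pop)

def route(w_dist, w_pop):
    cost = lambda u, v, d: w_dist * d["dist"] + w_pop * d["pop"]
    return nx.shortest_path(G, "A", "D", weight=cost)

print(route(1.0, 0.0))     # distance only   -> short but high-exposure route
print(route(0.0, 1.0))     # exposure only   -> longer but low-exposure route
print(route(1.0, 0.001))   # one compromise weighting between the two objectives
```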
Turbine blade forced response prediction using FREPS
NASA Technical Reports Server (NTRS)
Murthy, Durbha V.; Morel, Michael R.
1993-01-01
This paper describes a software system called FREPS (Forced REsponse Prediction System) that integrates structural dynamic, steady aerodynamic, and unsteady aerodynamic analyses to efficiently predict the forced response dynamic stresses in axial flow turbomachinery blades due to aerodynamic and mechanical excitations. A flutter analysis capability is also incorporated into the system. The FREPS system performs aeroelastic analysis by modeling the motion of the blade in terms of its normal modes. The structural dynamic analysis is performed by a finite element code such as MSC/NASTRAN. The steady aerodynamic analysis is based on nonlinear potential theory, and the unsteady aerodynamic analysis is based on linearization about the non-uniform potential mean flow. The program description and a presentation of its capabilities are reported herein. The effectiveness of the FREPS package is demonstrated on the High Pressure Oxygen Turbopump turbine of the Space Shuttle Main Engine. Both flutter and forced response analyses are performed and typical results are illustrated.
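The normal-mode formulation mentioned above reduces the blade forced-response problem to a set of uncoupled modal equations driven by generalized aerodynamic and mechanical forces. The equation below is the standard textbook form of that formulation; it is not quoted from the FREPS documentation.

```latex
% Textbook modal form of the blade forced-response problem (not taken from the
% FREPS documentation): generalized coordinates q_i, natural frequencies
% omega_i, damping ratios zeta_i, modal masses m_i, and generalized
% aerodynamic and mechanical excitation forces.
\begin{equation*}
  \ddot{q}_i + 2\zeta_i\omega_i\dot{q}_i + \omega_i^2 q_i
  = \frac{1}{m_i}\left[ Q_i^{\mathrm{aero}}(q,\dot{q},t) + Q_i^{\mathrm{exc}}(t) \right],
  \qquad i = 1,\dots,N
\end{equation*}
```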
NGSANE: a lightweight production informatics framework for high-throughput data analysis.
Buske, Fabian A; French, Hugh J; Smith, Martin A; Clark, Susan J; Bauer, Denis C
2014-05-15
The initial steps in the analysis of next-generation sequencing data can be automated by way of software 'pipelines'. However, individual components depreciate rapidly because of the evolving technology and analysis methods, often rendering entire versions of production informatics pipelines obsolete. Constructing pipelines from Linux bash commands enables the use of hot-swappable modular components, as opposed to the more rigid program call wrapping by higher-level languages implemented in comparable published pipelining systems. Here we present Next Generation Sequencing ANalysis for Enterprises (NGSANE), a Linux-based, high-performance-computing-enabled framework that minimizes overhead for set up and processing of new projects, yet maintains full flexibility of custom scripting when processing raw sequence data. NGSANE is implemented in bash and publicly available under the BSD (3-Clause) licence via GitHub at https://github.com/BauerLab/ngsane. Contact: Denis.Bauer@csiro.au. Supplementary data are available at Bioinformatics online.
General Mode Scanning Probe Microscopy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Somnath, Suhas; Jesse, Stephen
A critical part of SPM measurements is the information transfer from the probe-sample junction to the measurement system. Current information transfer methods heavily compress the information-rich data stream by averaging the data over a time interval, or via heterodyne detection approaches such as lock-in amplifiers and phase-locked loops. As a consequence, highly valuable information at the sub-microsecond time scales or information from frequencies outside the measurement band is lost. We have developed a fundamentally new approach called General Mode (G-mode), where we can capture the complete information stream from the detectors in the microscope. The availability of the complete information allows the microscope operator to analyze the data via information-theory analysis or comprehensive physical models. Furthermore, the complete data stream enables advanced data-driven filtering algorithms, multi-resolution imaging, ultrafast spectroscopic imaging, spatial mapping of multidimensional variability in material properties, etc. Though we applied this approach to scanning probe microscopy, the general philosophy of G-mode can be applied to many other modes of microscopy. G-mode data is captured by completely custom software written in LabVIEW and Matlab. The software generates the waveforms to electrically, thermally, or mechanically excite the SPM probe. It handles real-time communications with the microscope software for operations such as moving the SPM probe position and also controls other instrumentation hardware. The software also controls multiple variants of high-speed data acquisition cards to excite the SPM probe with the excitation waveform and simultaneously measure multiple channels of information from the microscope detectors at sampling rates of 1-100 MHz. The software also saves the raw data to the computer and allows the microscope operator to visualize processed or filtered data during the experiment. The software provides all these features through a user-friendly interface.
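The contrast between conventional per-pixel averaging and keeping the full detector stream for later analysis can be illustrated numerically. The sketch below is a conceptual toy example in Python with invented sampling rates and a synthetic signal; it is not the G-mode LabVIEW/Matlab acquisition code.

```python
# A conceptual sketch (not the G-mode acquisition software) contrasting the two
# philosophies: averaging the detector stream per pixel versus keeping the full
# stream so it can be re-filtered or re-analyzed after the experiment.
import numpy as np

rate_hz   = 4_000_000          # assumed raw sampling rate
pixel_s   = 0.001              # assumed dwell time per pixel
n_per_pix = int(rate_hz * pixel_s)

rng   = np.random.default_rng(0)
drive = np.sin(2 * np.pi * 70_000 * np.arange(n_per_pix) / rate_hz)
raw   = drive + 0.3 * rng.standard_normal(n_per_pix)   # one pixel's raw stream

averaged = raw.mean()                   # what a heavily compressed channel keeps
spectrum = np.abs(np.fft.rfft(raw))     # what the full stream still allows
freqs    = np.fft.rfftfreq(n_per_pix, 1 / rate_hz)
peak_hz  = freqs[spectrum[1:].argmax() + 1]   # skip the DC bin

print(f"per-pixel average: {averaged:.4f}")
print(f"dominant response recovered from full stream: {peak_hz/1e3:.1f} kHz")
```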
Distributed and Collaborative Software Analysis
NASA Astrophysics Data System (ADS)
Ghezzi, Giacomo; Gall, Harald C.
Throughout the years software engineers have come up with a myriad of specialized tools and techniques that focus on a certain type of
Integrating automated support for a software management cycle into the TAME system
NASA Technical Reports Server (NTRS)
Sunazuka, Toshihiko; Basili, Victor R.
1989-01-01
Software managers are interested in the quantitative management of software quality, cost and progress. An integrated software management methodology, which can be applied throughout the software life cycle for any number of purposes, is required. The TAME (Tailoring A Measurement Environment) methodology is based on the improvement paradigm and the goal/question/metric (GQM) paradigm. This methodology helps generate a software engineering process and measurement environment based on the project characteristics. The SQMAR (software quality measurement and assurance technology) is a software quality metric system and methodology applied to the development processes. It is based on the feed-forward control principle. Quality target setting is carried out before the plan-do-check-action activities are performed. These methodologies are integrated to realize goal-oriented measurement, process control and visual management. A metric setting procedure based on the GQM paradigm, a management system called the software management cycle (SMC), and its application to a case study based on NASA/SEL data are discussed. The expected effects of SMC are quality improvement, managerial cost reduction, accumulation and reuse of experience, and a highly visual management reporting system.