Evaluation and selection of open-source EMR software packages based on integrated AHP and TOPSIS.
Zaidan, A A; Zaidan, B B; Al-Haiqi, Ahmed; Kiah, M L M; Hussain, Muzammil; Abdulnabi, Mohamed
2015-02-01
Evaluating and selecting software packages that meet an organization's requirements are difficult aspects of the software engineering process. Selecting the wrong open-source EMR software package can be costly and may adversely affect business processes and the functioning of the organization. This study aims to evaluate and select open-source EMR software packages based on multi-criteria decision-making. A hands-on study was performed in which a set of open-source EMR software packages were implemented locally on separate virtual machines to examine the systems more closely. Several measures were specified as the evaluation basis, and the systems were ranked on a set of metric outcomes using an integrated Analytic Hierarchy Process (AHP) and TOPSIS approach. The experimental results showed that GNUmed and OpenEMR achieved better ranking scores than the other open-source EMR software packages. Copyright © 2014 Elsevier Inc. All rights reserved.
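The integrated AHP and TOPSIS procedure summarized above can be illustrated with a short sketch: AHP derives criteria weights from a pairwise-comparison matrix (via its principal eigenvector), and TOPSIS ranks the candidate packages by relative closeness to the ideal and anti-ideal solutions. The comparison matrix and package scores below are hypothetical illustrations, not the study's data.

    import numpy as np

    # --- AHP: criteria weights from a (hypothetical) pairwise-comparison matrix ---
    pairwise = np.array([[1.0, 3.0, 5.0],
                         [1/3, 1.0, 2.0],
                         [1/5, 1/2, 1.0]])
    eigvals, eigvecs = np.linalg.eig(pairwise)
    principal = np.abs(eigvecs[:, np.argmax(eigvals.real)].real)
    weights = principal / principal.sum()            # normalized criteria weights

    # --- TOPSIS: rank alternatives (rows) on criteria (columns), all benefit-type ---
    scores = np.array([[7.0, 8.0, 6.0],              # hypothetical EMR package scores
                       [9.0, 6.0, 7.0],
                       [5.0, 9.0, 8.0],
                       [6.0, 7.0, 9.0]])
    norm = scores / np.linalg.norm(scores, axis=0)   # vector-normalize each criterion
    weighted = norm * weights
    ideal, anti = weighted.max(axis=0), weighted.min(axis=0)
    d_pos = np.linalg.norm(weighted - ideal, axis=1)
    d_neg = np.linalg.norm(weighted - anti, axis=1)
    closeness = d_neg / (d_pos + d_neg)              # higher = closer to the ideal
    print(np.argsort(-closeness))                    # ranking of the four alternatives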
NASA Technical Reports Server (NTRS)
1981-01-01
The software package evaluation was designed to analyze commercially available, field-proven production control or manufacturing resource planning management technology and software packages. The analysis was conducted by comparing SRB production control software requirements and the conceptual system design against software package capabilities. The methodology of evaluation and the findings at each stage of evaluation are described. Topics covered include: vendor listing; the request for information (RFI) document; RFI response rate and quality; the RFI evaluation process; and capabilities versus requirements.
Quantitative evaluation of software packages for single-molecule localization microscopy.
Sage, Daniel; Kirshner, Hagai; Pengo, Thomas; Stuurman, Nico; Min, Junhong; Manley, Suliana; Unser, Michael
2015-08-01
The quality of super-resolution images obtained by single-molecule localization microscopy (SMLM) depends largely on the software used to detect and accurately localize point sources. In this work, we focus on the computational aspects of super-resolution microscopy and present a comprehensive evaluation of localization software packages. Our philosophy is to evaluate each package as a whole, thus maintaining the integrity of the software. We prepared synthetic data that represent three-dimensional structures modeled after biological components, taking excitation parameters, noise sources, point-spread functions and pixelation into account. We then asked developers to run their software on our data; most responded favorably, allowing us to present a broad picture of the methods available. We evaluated their results using quantitative and user-interpretable criteria: detection rate, accuracy, quality of image reconstruction, resolution, software usability and computational resources. These metrics reflect the various tradeoffs of SMLM software packages and help users to choose the software that fits their needs.
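As a rough illustration of the user-interpretable metrics named above, the sketch below matches a software package's localizations to ground-truth emitter positions within a tolerance and reports a detection rate and a localization RMSE. The coordinates and the 250 nm tolerance are hypothetical, not the benchmark's actual matching rule.

    import numpy as np
    from scipy.spatial import cKDTree

    def detection_metrics(truth, detected, tol=250.0):
        """Nearest-neighbour matching within `tol` (nm); not strictly one-to-one."""
        dists, _ = cKDTree(detected).query(truth, distance_upper_bound=tol)
        matched = np.isfinite(dists)
        recall = matched.mean()                        # detection rate
        rmse = np.sqrt(np.mean(dists[matched] ** 2))   # localization accuracy
        return recall, rmse

    rng = np.random.default_rng(0)
    truth = rng.uniform(0, 6400, size=(200, 2))                # hypothetical ground truth (nm)
    detected = truth[:150] + rng.normal(0, 30, size=(150, 2))  # hypothetical software output
    print(detection_metrics(truth, detected))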
ERIC Educational Resources Information Center
Northwest Regional Educational Lab., Portland, OR.
This document consists of 80 microcomputer software package evaluations prepared by the MicroSIFT (Microcomputer Software and Information for Teachers) Clearinghouse at the Northwest Regional Education Laboratory. Set 15 consists of 27 packages; set 16 consists of 53 packages. Each software review lists producer, time and place of evaluation,…
A Characteristics Approach to the Evaluation of Economics Software Packages.
ERIC Educational Resources Information Center
Lumsden, Keith; Scott, Alex
1988-01-01
Utilizes Bloom's Taxonomy to identify elements of teacher and student interest. Depicts the way in which these interests are developed into characteristics for use in analytically evaluating software. Illustrates the use of this evaluating technique by appraising the much used software package "Running the British Economy." (KO)
de Hoop, Bartjan; Gietema, Hester; van Ginneken, Bram; Zanen, Pieter; Groenewegen, Gerard; Prokop, Mathias
2009-04-01
We compared interexamination variability of CT lung nodule volumetry with six currently available semi-automated software packages to determine the minimum change needed to detect the growth of solid lung nodules. We had ethics committee approval. To simulate a follow-up examination with zero growth, we performed two low-dose unenhanced CT scans in 20 patients referred for pulmonary metastases. Between examinations, patients got off and on the table. Volumes of all pulmonary nodules were determined on both examinations using six nodule evaluation software packages. Variability (upper limit of the 95% confidence interval of the Bland-Altman plot) was calculated for nodules for which segmentation was visually rated as adequate. We evaluated 214 nodules (mean diameter 10.9 mm, range 3.3 mm-30.0 mm). Software packages provided adequate segmentation in 71% to 86% of nodules (p < 0.001). In case of adequate segmentation, variability in volumetry between scans ranged from 16.4% to 22.3% for the various software packages. Variability with five to six software packages was significantly less for nodules ≥8 mm in diameter (range 12.9%-17.1%) than for nodules <8 mm (range 18.5%-25.6%). Segmented volumes of each package were compared to each of the other packages. Systematic volume differences were detected in 11/15 comparisons. This hampers comparison of nodule volumes between software packages.
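The variability figure used above (the upper limit of agreement from a Bland-Altman analysis of relative volume differences) can be reproduced in outline as follows; the nodule volumes are simulated, not the study's measurements, and the paper's exact confidence-interval construction may differ slightly from this common approximation.

    import numpy as np

    def volumetry_variability(vol_scan1, vol_scan2):
        """Upper 95% limit of agreement for relative volume differences (%)."""
        rel_diff = 200.0 * (vol_scan2 - vol_scan1) / (vol_scan1 + vol_scan2)
        return rel_diff.mean() + 1.96 * rel_diff.std(ddof=1)

    # Hypothetical volumes (mm^3) of the same nodules on two zero-growth scans.
    rng = np.random.default_rng(1)
    v1 = rng.lognormal(mean=6.0, sigma=0.8, size=50)
    v2 = v1 * rng.normal(1.0, 0.08, size=50)
    print(round(volumetry_variability(v1, v2), 1))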
ERIC Educational Resources Information Center
Weaver, Dave, Ed.
This document consists of 30 microcomputer software package evaluations prepared for the MicroSIFT (Microcomputer Software and Information for Teachers) Clearinghouse at the Northwest Regional Educational Laboratory (NWREL). The concise, single-sheet resume describing and evaluating each software package includes source, cost, ability level,…
NDE Software Developed at NASA Glenn Research Center
NASA Technical Reports Server (NTRS)
Roth, Donald J.; Martin, Richard E.; Rauser, Richard W.; Nichols, Charles; Bonacuse, Peter J.
2014-01-01
NASA Glenn Research Center has developed several important Nondestructive Evaluation (NDE) related software packages for different projects in the last 10 years. Three of the software packages have been created with commercial-grade user interfaces and are available to United States entities for download on the NASA Technology Transfer and Partnership Office server (https://sr.grc.nasa.gov/). This article provides brief overviews of the software packages.
Bonekamp, S; Ghosh, P; Crawford, S; Solga, S F; Horska, A; Brancati, F L; Diehl, A M; Smith, S; Clark, J M
2008-01-01
To examine five available software packages for the assessment of abdominal adipose tissue with magnetic resonance imaging, compare their features and assess the reliability of measurement results. Feature evaluation and test-retest reliability of the software packages (NIHImage, SliceOmatic, Analyze, HippoFat and EasyVision) used in manual, semi-automated or automated segmentation of abdominal adipose tissue. A random sample of 15 obese adults with type 2 diabetes. Axial T1-weighted spin echo images centered at vertebral bodies of L2-L3 were acquired at 1.5 T. Five software packages were evaluated (NIHImage, SliceOmatic, Analyze, HippoFat and EasyVision), comparing manual, semi-automated and automated segmentation approaches. Images were segmented into cross-sectional area (CSA), and the areas of visceral (VAT) and subcutaneous adipose tissue (SAT). Ease of learning and use and the design of the graphical user interface (GUI) were rated. Intra-observer accuracy and agreement between the software packages were calculated using intra-class correlation. The intra-class correlation coefficient was used to obtain test-retest reliability. Three of the five evaluated programs offered a semi-automated technique to segment the images based on histogram values or a user-defined threshold. One software package allowed manual delineation only. One fully automated program demonstrated the drawbacks of uncritical automated processing. The semi-automated approaches reduced variability and measurement error, and improved reproducibility. There was no significant difference in the intra-observer agreement in SAT and CSA. The VAT measurements showed significantly lower test-retest reliability. There were some differences between the software packages in qualitative aspects, such as user friendliness. Four out of five packages provided essentially the same results with respect to the inter- and intra-rater reproducibility. Our results using SliceOmatic, Analyze or NIHImage were comparable and could be used interchangeably. Newly developed fully automated approaches should be compared to one of the examined software packages.
Bonekamp, S; Ghosh, P; Crawford, S; Solga, SF; Horska, A; Brancati, FL; Diehl, AM; Smith, S; Clark, JM
2009-01-01
Objective To examine five available software packages for the assessment of abdominal adipose tissue with magnetic resonance imaging, compare their features and assess the reliability of measurement results. Design Feature evaluation and test–retest reliability of the software packages (NIHImage, SliceOmatic, Analyze, HippoFat and EasyVision) used in manual, semi-automated or automated segmentation of abdominal adipose tissue. Subjects A random sample of 15 obese adults with type 2 diabetes. Measurements Axial T1-weighted spin echo images centered at vertebral bodies of L2–L3 were acquired at 1.5 T. Five software packages were evaluated (NIHImage, SliceOmatic, Analyze, HippoFat and EasyVision), comparing manual, semi-automated and automated segmentation approaches. Images were segmented into cross-sectional area (CSA), and the areas of visceral (VAT) and subcutaneous adipose tissue (SAT). Ease of learning and use and the design of the graphical user interface (GUI) were rated. Intra-observer accuracy and agreement between the software packages were calculated using intra-class correlation. The intra-class correlation coefficient was used to obtain test–retest reliability. Results Three of the five evaluated programs offered a semi-automated technique to segment the images based on histogram values or a user-defined threshold. One software package allowed manual delineation only. One fully automated program demonstrated the drawbacks of uncritical automated processing. The semi-automated approaches reduced variability and measurement error, and improved reproducibility. There was no significant difference in the intra-observer agreement in SAT and CSA. The VAT measurements showed significantly lower test–retest reliability. There were some differences between the software packages in qualitative aspects, such as user friendliness. Conclusion Four out of five packages provided essentially the same results with respect to the inter- and intra-rater reproducibility. Our results using SliceOmatic, Analyze or NIHImage were comparable and could be used interchangeably. Newly developed fully automated approaches should be compared to one of the examined software packages. PMID:17700582
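Both versions of this abstract summarize test-retest reliability with the intra-class correlation coefficient. A minimal sketch of a two-way random-effects, single-measure ICC, often written ICC(2,1), computed from a subjects-by-measurements matrix of hypothetical adipose-tissue areas:

    import numpy as np

    def icc_2_1(data):
        """ICC(2,1) for an (n subjects x k repeated measurements) array."""
        n, k = data.shape
        grand = data.mean()
        ss_rows = k * ((data.mean(axis=1) - grand) ** 2).sum()
        ss_cols = n * ((data.mean(axis=0) - grand) ** 2).sum()
        ss_err = ((data - grand) ** 2).sum() - ss_rows - ss_cols
        ms_rows, ms_cols = ss_rows / (n - 1), ss_cols / (k - 1)
        ms_err = ss_err / ((n - 1) * (k - 1))
        return (ms_rows - ms_err) / (
            ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

    # Hypothetical VAT areas (cm^2): 15 subjects, each measured twice.
    rng = np.random.default_rng(2)
    truth = rng.uniform(80, 250, size=15)
    scans = np.column_stack([truth + rng.normal(0, 10, 15),
                             truth + rng.normal(0, 10, 15)])
    print(round(icc_2_1(scans), 3))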
ERIC Educational Resources Information Center
Northwest Regional Educational Lab., Portland, OR.
This document consists of 68 microcomputer software package evaluations prepared by MicroSIFT (Microcomputer Software and Information for Teachers) Clearinghouse at the Northwest Regional Education Laboratory. There are 26 packages in set 13 and 42 in set 14. Each software review lists producer, time and place of evaluation, cost, ability level,…
ERIC Educational Resources Information Center
Northwest Regional Educational Lab., Portland, OR.
This document consists of 170 microcomputer software package evaluations prepared by the MicroSIFT (Microcomputer Software and Information for Teachers) Clearinghouse at the Northwest Regional Education Laboratory. Set 11 consists of 37 packages. Set 12 consists of 34 packages. A special unnumbered set, entitled LIBRA Reviews, treats 99 packages…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Salama, A.; Mikhail, M.
Comprehensive software packages have been developed at the Western Research Centre as tools to help coal preparation engineers analyze, evaluate, and control coal cleaning processes. The COal Preparation Software package (COPS) performs three functions: (1) data handling and manipulation, (2) data analysis, including the generation of washability data, performance evaluation and prediction, density and size modeling, evaluation of density and size partition characteristics and attrition curves, and (3) generation of graphics output. The Separation ChARacteristics Estimation software packages (SCARE) are developed to balance raw density or size separation data. The cases of density and size separation data are considered. The generated balanced data can take the balanced or normalized forms. The scaled form is desirable for direct determination of the partition functions (curves). The raw and generated separation data are displayed in tabular and/or graphical forms. The computer software packages described in this paper are valuable tools for coal preparation plant engineers and operators for evaluating process performance, adjusting plant parameters, and balancing raw density or size separation data. These packages have been applied very successfully in many projects carried out by WRC for the Canadian coal preparation industry. The software packages are designed to run on a personal computer (PC).
Virginia Transit Performance Evaluation Package (VATPEP).
DOT National Transportation Integrated Search
1987-01-01
The Virginia Transit Performance Evaluation Package (VATPEP), a computer software package, is documented. This is the computerized version of the methodology used by the Virginia Department of Transportation to evaluate the performance of public tran...
Validation of thermal effects of LED package by using Elmer finite element simulation method
NASA Astrophysics Data System (ADS)
Leng, Lai Siang; Retnasamy, Vithyacharan; Mohamad Shahimin, Mukhzeer; Sauli, Zaliman; Taniselass, Steven; Bin Ab Aziz, Muhamad Hafiz; Vairavan, Rajendaran; Kirtsaeng, Supap
2017-02-01
The overall performance of a light-emitting diode (LED) package is critically affected by heat. In this study, the open-source software Elmer FEM was used to perform thermal analysis of an LED package. To carry out a complete simulation study, the Salome and ParaView software were introduced as pre- and post-processors. The thermal effect of the LED package was evaluated with this software, and the result was validated against a commercially licensed package based on previous work. The percentage difference between the two simulation results is less than 5%, which is tolerable and comparable.
Laboratory Connections: Review of Two Commercial Interfacing Packages.
ERIC Educational Resources Information Center
Powers, Michael H.
1989-01-01
Evaluates two Apple II interfacing packages designed to measure pH: (1) "Experiments in Chemistry" by HRM Software and (2) "Voltage Plotter III" by Vernier Software. Provides characteristics and screen dumps of each package. Reports both systems are suitable for high school or beginning college laboratories. (MVL)
Developing a Virtual Physics World
ERIC Educational Resources Information Center
Wegener, Margaret; McIntyre, Timothy J.; McGrath, Dominic; Savage, Craig M.; Williamson, Michael
2012-01-01
In this article, the successful implementation of a development cycle for a physics teaching package based on game-like virtual reality software is reported. The cycle involved several iterations of evaluating students' use of the package followed by instructional and software development. The evaluation used a variety of techniques, including…
Software for Managing Personal Files.
ERIC Educational Resources Information Center
Lundeen, Gerald
1989-01-01
Discusses the special characteristics of personal file management software and compares four microcomputer software packages: Notebook II with Bibliography and Convert, Pro-Cite with Biblio-Links, askSam, and Reference Manager. Each package is evaluated in terms of the user interface, file maintenance, retrieval capabilities, output, and…
Reference datasets for bioequivalence trials in a two-group parallel design.
Fuglsang, Anders; Schütz, Helmut; Labes, Detlew
2015-03-01
In order to help companies qualify and validate the software used to evaluate bioequivalence trials with two parallel treatment groups, this work aims to define datasets with known results. This paper puts a total of 11 datasets into the public domain along with a proposed consensus obtained via evaluations from six different software packages (R, SAS, WinNonlin, OpenOffice Calc, Kinetica, EquivTest). Insofar as possible, datasets were evaluated with and without the assumption of equal variances for the construction of a 90% confidence interval. Not all software packages provide functionality for the assumption of unequal variances (EquivTest, Kinetica), and not all packages can handle datasets with more than 1000 subjects per group (WinNonlin). Where results could be obtained across all packages, one showed questionable results when datasets contained unequal group sizes (Kinetica). A proposal is made for the results that should be used as validation targets.
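The core calculation such reference datasets are meant to validate is the 90% confidence interval for the test/reference ratio from log-transformed data in a two-group parallel design, with or without the equal-variance assumption. A minimal sketch with simulated data (not the published datasets):

    import numpy as np
    from scipy import stats

    def ci_90_ratio(log_test, log_ref, equal_var=True):
        """90% CI for the geometric-mean ratio, parallel-group design."""
        n1, n2 = len(log_test), len(log_ref)
        diff = log_test.mean() - log_ref.mean()
        if equal_var:
            sp2 = ((n1 - 1) * log_test.var(ddof=1) +
                   (n2 - 1) * log_ref.var(ddof=1)) / (n1 + n2 - 2)
            se, df = np.sqrt(sp2 * (1 / n1 + 1 / n2)), n1 + n2 - 2
        else:  # Welch-Satterthwaite approximation for unequal variances
            v1, v2 = log_test.var(ddof=1) / n1, log_ref.var(ddof=1) / n2
            se = np.sqrt(v1 + v2)
            df = (v1 + v2) ** 2 / (v1 ** 2 / (n1 - 1) + v2 ** 2 / (n2 - 1))
        t = stats.t.ppf(0.95, df)
        return np.exp(diff - t * se), np.exp(diff + t * se)

    rng = np.random.default_rng(3)
    log_test, log_ref = rng.normal(4.6, 0.3, 24), rng.normal(4.5, 0.4, 24)
    print(ci_90_ratio(log_test, log_ref, equal_var=False))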
Code of Federal Regulations, 2013 CFR
2013-10-01
... fully evaluate evidence, all spreadsheets must be fully accessible and manipulable. Electronic databases... Microsoft Open Database Connectivity (ODBC) standard. ODBC is a Windows technology that allows a database software package to import data from a database created using a different software package. We currently...
Code of Federal Regulations, 2014 CFR
2014-10-01
... fully evaluate evidence, all spreadsheets must be fully accessible and manipulable. Electronic databases... Microsoft Open Database Connectivity (ODBC) standard. ODBC is a Windows technology that allows a database software package to import data from a database created using a different software package. We currently...
Code of Federal Regulations, 2012 CFR
2012-10-01
... fully evaluate evidence, all spreadsheets must be fully accessible and manipulable. Electronic databases... Microsoft Open Database Connectivity (ODBC) standard. ODBC is a Windows technology that allows a database software package to import data from a database created using a different software package. We currently...
Code of Federal Regulations, 2010 CFR
2010-10-01
... fully evaluate evidence, all spreadsheets must be fully accessible and manipulable. Electronic databases... Microsoft Open Database Connectivity (ODBC) standard. ODBC is a Windows technology that allows a database software package to import data from a database created using a different software package. We currently...
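The ODBC mechanism described in these excerpts can be exercised from most analysis environments. A minimal sketch using the third-party pyodbc package; the DSN, credentials, and table name are placeholders, not values taken from the regulation:

    import pyodbc

    # Hypothetical data source name and query.
    conn = pyodbc.connect("DSN=evidence_db;UID=reader;PWD=secret")
    cursor = conn.cursor()
    cursor.execute("SELECT record_id, amount FROM spreadsheet_export")
    for record_id, amount in cursor.fetchall():
        print(record_id, amount)
    conn.close()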
Bretas, Elisa Almeida Sathler; Torres, Ulysses S; Torres, Lucas Rios; Bekhor, Daniel; Saito Filho, Celso Fernando; Racy, Douglas Jorge; Faggioni, Lorenzo; D'Ippolito, Giuseppe
2017-10-01
To evaluate the agreement between the measurements of perfusion CT parameters in normal livers by using two different software packages. This retrospective study was based on 78 liver perfusion CT examinations acquired for detecting suspected liver metastasis. Patients with any morphological or functional hepatic abnormalities were excluded. The final analysis included 37 patients (59.7 ± 14.9 y). Two readers (1 and 2) independently measured perfusion parameters using different software packages from two major manufacturers (A and B). Arterial perfusion (AP) and portal perfusion (PP) were determined using the dual-input vascular one-compartmental model. Inter-reader agreement for each package and intrareader agreement between both packages were assessed with intraclass correlation coefficients (ICC) and Bland-Altman statistics. Inter-reader agreement was substantial for AP using software A (ICC = 0.82) and B (ICC = 0.85-0.86), fair for PP using software A (ICC = 0.44) and fair to moderate for PP using software B (ICC = 0.56-0.77). Intrareader agreement between software A and B ranged from slight to moderate (ICC = 0.32-0.62) for readers 1 and 2 considering the AP parameters, and from fair to moderate (ICC = 0.40-0.69) for readers 1 and 2 considering the PP parameters. At best there was only moderate agreement between both software packages, resulting in some uncertainty and suboptimal reproducibility. Advances in knowledge: Software-dependent factors may contribute to variance in perfusion measurements, demanding further technical improvements. AP measurements seem to be the most reproducible parameter to be adopted when evaluating liver perfusion CT.
Evaluation of copy number variation detection for a SNP array platform
2014-01-01
Background Copy Number Variations (CNVs) are usually inferred from Single Nucleotide Polymorphism (SNP) arrays using software packages based on given algorithms. However, there is no clear understanding of the performance of these software packages, so it is difficult to select one or several of them for CNV detection on a SNP array platform. We selected four publicly available software packages designed for CNV calling from an Affymetrix SNP array: Birdsuite, dChip, Genotyping Console (GTC) and PennCNV. A publicly available dataset generated by array-based Comparative Genomic Hybridization (CGH), with a resolution of 24 million probes per sample, was considered the "gold standard". Against this CGH-based gold standard, the success rate, average stability rate, sensitivity, consistency and reproducibility of the four software packages were assessed. Specifically, we also compared the efficiency of detecting CNVs jointly by two, three or all of the software packages with that of a single software package. Results Simply in terms of the quantity of detected CNVs, Birdsuite detected the most while GTC detected the least. We found that Birdsuite and dChip had an obvious detection bias, and GTC seemed inferior because of the small number of CNVs it detected. We then investigated the detection consistency between each software package and the other three software suites: the consistency of dChip was the lowest while that of GTC was the highest. Compared with the CNV detection results of CGH, GTC called the most matching CNVs in the matching group, with PennCNV-Affy ranked second; in the non-overlapping group, GTC called the fewest CNVs. With regard to the reproducibility of CNV calling, larger CNVs were usually replicated better; PennCNV-Affy showed the best consistency while Birdsuite showed the poorest. Conclusion We found that PennCNV outperformed the other three packages in the sensitivity and specificity of CNV calling. Each calling method has its own limitations and advantages for different data analyses. Therefore, optimized calling might be achieved by using multiple algorithms to evaluate the concordance and discordance of SNP array-based CNV calling. PMID:24555668
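A common way to score the kind of agreement discussed above is reciprocal overlap between a called CNV and a gold-standard CNV. A minimal sketch with hypothetical intervals (the study's own matching rule may differ):

    def reciprocal_overlap(a, b, threshold=0.5):
        """True if intervals a and b overlap by >= threshold of each one's length."""
        ov = min(a[1], b[1]) - max(a[0], b[0])
        return ov > 0 and ov / (a[1] - a[0]) >= threshold and ov / (b[1] - b[0]) >= threshold

    def matching_rate(calls, gold, threshold=0.5):
        """Fraction of software CNV calls supported by a gold-standard CNV."""
        hits = sum(any(reciprocal_overlap(c, g, threshold) for g in gold) for c in calls)
        return hits / len(calls) if calls else 0.0

    # Hypothetical CNV calls (start, end in bp) on one chromosome.
    software_calls = [(10_000, 25_000), (40_000, 42_000), (90_000, 130_000)]
    cgh_gold = [(9_500, 26_000), (95_000, 128_000)]
    print(matching_rate(software_calls, cgh_gold))   # 2 of 3 calls match -> 0.67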
Diagnostic evaluation of three cardiac software packages using a consecutive group of patients
2011-01-01
Purpose The aim of this study was to compare the diagnostic performance of the three software packages 4DMSPECT (4DM), Emory Cardiac Toolbox (ECTb), and Cedars Quantitative Perfusion SPECT (QPS) for quantification of myocardial perfusion scintigrams (MPS) using a large group of consecutive patients. Methods We studied 1,052 consecutive patients who underwent 2-day stress/rest 99mTc-sestamibi MPS studies. The reference/gold-standard classifications for the MPS studies were obtained from three physicians, each with more than 25 years of experience in nuclear cardiology, who re-evaluated all MPS images. Automatic processing was carried out using the 4DM, ECTb, and QPS software packages. Total stress defect extent (TDE) and summed stress score (SSS) based on a 17-segment model were obtained from the software packages. Receiver-operating characteristic (ROC) analysis was performed. Results A total of 734 patients were classified as normal and the remaining 318 were classified as having infarction and/or ischemia. The areas under the SSS ROC curve were 0.87 for 4DM, 0.80 for QPS, and 0.76 for ECTb (QPS vs. ECTb p = 0.03; other differences p < 0.0001). The areas under the TDE ROC curve were 0.87 for 4DM, 0.82 for QPS, and 0.76 for ECTb (QPS vs. ECTb p = 0.0005; other differences p < 0.0001). Conclusion There are considerable differences in performance between the three software packages, with 4DM showing the best performance and ECTb the worst. These differences in performance should be taken into consideration when the software packages are used in clinical routine or in clinical studies. PMID:22214226
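The area under the SSS ROC curve reported above can be computed without any curve fitting by using the Mann-Whitney interpretation of the AUC: the probability that a randomly chosen abnormal study scores higher than a randomly chosen normal one. The scores below are hypothetical, not the study's data:

    import numpy as np

    def roc_auc(scores_abnormal, scores_normal):
        """AUC via the Mann-Whitney U statistic (ties counted as 0.5)."""
        a = np.asarray(scores_abnormal, dtype=float)[:, None]
        n = np.asarray(scores_normal, dtype=float)[None, :]
        wins = (a > n).sum() + 0.5 * (a == n).sum()
        return wins / (a.size * n.size)

    # Hypothetical summed stress scores (SSS) from one software package.
    sss_abnormal = [9, 14, 7, 22, 11, 5, 17]
    sss_normal = [0, 2, 1, 4, 0, 3, 1, 2]
    print(round(roc_auc(sss_abnormal, sss_normal), 3))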
ERIC Educational Resources Information Center
Borman, Stuart A.
1985-01-01
Discusses various aspects of scientific software, including evaluation and selection of commercial software products; program exchanges, catalogs, and other information sources; major data analysis packages; statistics and chemometrics software; and artificial intelligence. (JN)
Evaluation of Agricultural Accounting Software. Improved Decision Making. Third Edition.
ERIC Educational Resources Information Center
Lovell, Ashley C., Comp.
Following a discussion of the evaluation criteria for choosing accounting software, this guide contains reviews of 27 accounting software programs that could be used by farm or ranch business managers. The information in the reviews was provided by the software vendors and covers the following points for each software package: general features,…
Sustaining Software-Intensive Systems
2006-05-01
2.2 Multi-Service Operational Test and Evaluation; 2.3 Stable Software Baseline. ...or equivalent document • completed Multi-Service Operational Test and Evaluation (MOT&E) for the potential production software package (or OT&E if not multi-service) • stable software production baseline • complete and current software documentation • Authority to Operate (ATO) for an
A Software Development Approach for Computer Assisted Language Learning
ERIC Educational Resources Information Center
Cushion, Steve
2005-01-01
Over the last 5 years we have developed, produced, tested, and evaluated an authoring software package to produce web-based, interactive, audio-enhanced language-learning material. That authoring package has been used to produce language-learning material in French, Spanish, German, Arabic, and Tamil. We are currently working on increasing…
Microcomputer Software Programs for Vocational Education.
ERIC Educational Resources Information Center
Rodenstein, Judith, Ed.; Lambert, Roger, Ed.
Over 200 microcomputer software packages applicable to vocational education are listed. Most of the programs are available for the Apple, TRS-80, and Commodore microcomputers. The packages have been reviewed, but have not been formally evaluated. Titles of the programs with names and addresses of the distributors are provided. Telephone numbers…
SuperLab LT: Evaluation and Uses in Teaching Experimental Psychology
ERIC Educational Resources Information Center
Ragozzine, Frank
2002-01-01
I describe and evaluate SuperLab LT (Chase & Abboud, 1990), a software package that enables students to replicate classic experiments in cognitive psychology. I also discuss the package with respect to its uses in teaching an undergraduate course in Experimental Psychology. Although the package has minor flaws, SuperLab LT provides numerous…
ERIC Educational Resources Information Center
Northwest Regional Educational Lab., Portland, OR.
This document consists of 24 microcomputer software package evaluations prepared by the MicroSIFT (Microcomputer Software and Information for Teachers) Clearinghouse at the Northwest Regional Educational Laboratory. Each software review lists source, cost, ability level, subject, topic, medium of transfer, required hardware, required software,…
A Study of Visualization for Mathematics Education
NASA Technical Reports Server (NTRS)
Daugherty, Sarah C.
2008-01-01
Graphical representations such as figures, illustrations, and diagrams play a critical role in mathematics and they are equally important in mathematics education. However, graphical representations in mathematics textbooks are static, i.e., they are used to illustrate only a specific example or a limited set of examples. By using computer software to visualize mathematical principles, there is virtually no limit to the number of specific cases and examples that can be demonstrated. However, we have not seen widespread adoption of visualization software in mathematics education. There are currently a number of software packages that provide visualization of mathematics for research and also software packages specifically developed for mathematics education. We conducted a survey of mathematics visualization software packages, summarized their features and user bases, and analyzed their limitations. In this survey, we focused on evaluating the software packages for their use with mathematical subjects adopted by institutions of secondary education in the United States (middle schools and high schools), including algebra, geometry, trigonometry, and calculus. We found that cost, complexity, and lack of flexibility are the major factors that hinder the widespread use of mathematics visualization software in education.
Vasconcelos, Taruska Ventorini; Neves, Frederico Sampaio; Moraes, Lívia Almeida Bueno; Freitas, Deborah Queiroz
2015-01-01
This article aimed to compare the accuracy of the linear measurement tools of different commercial software packages. Eight fully edentulous dry mandibles were selected for this study. Incisor, canine, premolar, first molar and second molar regions were selected. Cone beam computed tomography (CBCT) images were obtained with i-CAT Next Generation. Linear bone measurements were performed by one observer on the cross-sectional images using three different software packages: XoranCat®, OnDemand3D® and KDIS3D®, all able to assess DICOM images. In addition, 25% of the sample was reevaluated for the purpose of reproducibility. The mandibles were sectioned to obtain the gold standard for each region. Intraclass correlation coefficients (ICC) were calculated to examine the agreement between the two periods of evaluation; one-way analysis of variance with the post-hoc Dunnett test was used to compare each of the software-derived measurements with the gold standard. The ICC values were excellent for all software packages. The smallest differences between the software-derived measurements and the gold standard were obtained with OnDemand3D and KDIS3D (-0.11 and -0.14 mm, respectively), and the largest with XoranCAT (+0.25 mm). However, there was no statistically significant difference between the measurements obtained with the different software packages and the gold standard (p > 0.05). In conclusion, linear bone measurements were not influenced by the software package used to reconstruct the image from CBCT DICOM data.
NASA Astrophysics Data System (ADS)
Daniell, James; Simpson, Alanna; Gunasekara, Rashmin; Baca, Abigail; Schaefer, Andreas; Ishizawa, Oscar; Murnane, Rick; Tijssen, Annegien; Deparday, Vivien; Forni, Marc; Himmelfarb, Anne; Leder, Jan
2015-04-01
Over the past few decades, a plethora of open access software packages for the calculation of earthquake, volcanic, tsunami, storm surge, wind and flood risk have been produced globally. As part of the World Bank GFDRR Review released at the Understanding Risk 2014 Conference, over 80 such open access risk assessment software packages were examined. Commercial software was not considered in the evaluation. A preliminary analysis was used to determine whether the 80 models were currently supported and if they were open access. This process was used to select a subset of 31 models that include 8 earthquake models, 4 cyclone models, 11 flood models, and 8 storm surge/tsunami models for more detailed analysis. By using multi-criteria decision analysis (MCDA) and simple descriptions of the software uses, the review allows users to select a few relevant software packages for their own testing and development. The detailed analysis evaluated the models on the basis of over 100 criteria and provides a synopsis of available open access natural hazard risk modelling tools. In addition, volcano software packages have since been added, bringing the compendium of risk software tools to more than 100. There has been a huge increase in the quality and availability of open access/source software over the past few years. For example, private entities such as Deltares now have an open source policy regarding some flood models (NGHS). In addition, leaders in developing risk models in the public sector, such as Geoscience Australia (EQRM, TCRM, TsuDAT, AnuGA) or CAPRA (ERN-Flood, Hurricane, CRISIS2007 etc.), are launching and/or helping many other initiatives. As we achieve greater interoperability between modelling tools, we will also achieve a future wherein different open source and open access modelling tools will be increasingly connected and adapted towards unified multi-risk model platforms and highly customised solutions. It was seen that many software tools could be improved by enabling user-defined exposure and vulnerability. Without this function, many tools can only be used regionally and not at global or continental scale. It is becoming increasingly easy to use multiple packages for a single region and/or hazard to characterize the uncertainty in the risk, or to use them as checks on the sensitivities in the analysis. There is a potential for valuable synergy between existing software. A number of open source software packages could be combined to generate a multi-risk model with multiple views of a hazard. This extensive review has simply attempted to provide a platform for dialogue between all open source and open access software packages and to hopefully inspire collaboration between developers, given the great work done by all open access and open source developers.
QuantWorm: a comprehensive software package for Caenorhabditis elegans phenotypic assays.
Jung, Sang-Kyu; Aleman-Meza, Boanerges; Riepe, Celeste; Zhong, Weiwei
2014-01-01
Phenotypic assays are crucial in genetics; however, traditional methods that rely on human observation are unsuitable for quantitative, large-scale experiments. Furthermore, there is an increasing need for comprehensive analyses of multiple phenotypes to provide multidimensional information. Here we developed an automated, high-throughput computer imaging system for quantifying multiple Caenorhabditis elegans phenotypes. Our imaging system is composed of a microscope equipped with a digital camera and a motorized stage connected to a computer running the QuantWorm software package. Currently, the software package contains one data acquisition module and four image analysis programs: WormLifespan, WormLocomotion, WormLength, and WormEgg. The data acquisition module collects images and videos. The WormLifespan software counts the number of moving worms by using two time-lapse images; the WormLocomotion software computes the velocity of moving worms; the WormLength software measures worm body size; and the WormEgg software counts the number of eggs. To evaluate the performance of our software, we compared the results of our software with manual measurements. We then demonstrated the application of the QuantWorm software in a drug assay and a genetic assay. Overall, the QuantWorm software provided accurate measurements at a high speed. Software source code, executable programs, and sample images are available at www.quantworm.org. Our software package has several advantages over current imaging systems for C. elegans. It is an all-in-one package for quantifying multiple phenotypes. The QuantWorm software is written in Java and its source code is freely available, so it does not require use of commercial software or libraries. It can be run on multiple platforms and easily customized to cope with new methods and requirements.
MicroSIFT Courseware Evaluations (88-168).
ERIC Educational Resources Information Center
Holznagel, Donald C., Ed.
This document consists of 81 microcomputer software package evaluations prepared for the MicroSIFT (Microcomputer Software and Information for Teachers) Clearinghouse at the Northwest Regional Educational Laboratory (NWREL) and distributed during 1983 as "sets" 6, 7, and 8. The concise, single-sheet resume describing and evaluating each…
Automated data collection in single particle electron microscopy
Tan, Yong Zi; Cheng, Anchi; Potter, Clinton S.; Carragher, Bridget
2016-01-01
Automated data collection is an integral part of modern workflows in single particle electron microscopy (EM) research. This review surveys the software packages available for automated single particle EM data collection. The degree of automation at each stage of data collection is evaluated, and the capabilities of the software packages are described. Finally, future trends in automation are discussed. PMID:26671944
User’s guide for GcClust—An R package for clustering of regional geochemical data
Ellefsen, Karl J.; Smith, David B.
2016-04-08
GcClust is a software package developed by the U.S. Geological Survey for statistical clustering of regional geochemical data, and similar data such as regional mineralogical data. Functions within the software package are written in the R statistical programming language. These functions, their documentation, and a copy of the user’s guide are bundled together in R’s unit of sharable code, which is called a “package.” The user’s guide includes step-by-step instructions showing how the functions are used to cluster data and to evaluate the clustering results. These functions are demonstrated in this report using test data, which are included in the package.
Evaluating Process Sustainability Using Flowsheet Monitoring
Environmental metric software can be used to evaluate the sustainability of a chemical based on data from the chemical process that is used to manufacture it. One problem in developing environmental metric software is that chemical process simulation packages typically do not rea...
Investigation into the development of computer aided design software for space based sensors
NASA Technical Reports Server (NTRS)
Pender, C. W.; Clark, W. L.
1987-01-01
The described effort is phase one of the development of Computer Aided Design (CAD) software to be used to perform radiometric sensor design. The software package will be referred to as SCAD and is directed toward the preliminary phase of the design of space-based sensor systems. The approach being followed is to develop a modern, graphics-intensive, user-friendly software package using existing software as building blocks. The emphasis will be directed toward the development of a shell containing menus, smart defaults, and interfaces, which can accommodate a wide variety of existing application software packages. The shell will offer expected utilities such as graphics, tailored menus, and a variety of drivers for I/O devices. Following the development of the shell, the development of SCAD is planned as chiefly selection and integration of appropriate building blocks. The phase one development activities have included: the selection of hardware which will be used with SCAD; the determination of the scope of SCAD; the preliminary evaluation of a number of software packages for applicability to SCAD; determination of a method for achieving required capabilities where voids exist; and then establishing a strategy for binding the software modules into an easy-to-use tool kit.
Improving the quality of EHR recording in primary care: a data quality feedback tool.
van der Bij, Sjoukje; Khan, Nasra; Ten Veen, Petra; de Bakker, Dinny H; Verheij, Robert A
2017-01-01
Electronic health record (EHR) data are used to exchange information among health care providers. For this purpose, the quality of the data is essential. We developed a data quality feedback tool that evaluates differences in EHR data quality among practices and software packages as part of a larger intervention. The tool was applied in 92 practices in the Netherlands using different software packages. Practices received data quality feedback in 2010 and 2012. We observed large differences in the quality of recording. For example, the percentage of episodes of care that had a meaningful diagnostic code ranged from 30% to 100%. Differences were highly related to the software package. A year after the first measurement, the quality of recording had improved significantly and differences decreased, with 67% of the physicians indicating that they had actively changed their recording habits based on the results of the first measurement. About 80% found the feedback helpful in pinpointing recording problems. One of the software vendors made changes in functionality as a result of the feedback. Our EHR data quality feedback tool is capable of highlighting differences among practices and software packages. As such, it also stimulates improvements. As substantial variability in recording is related to the software package, our study strengthens the evidence that data quality can be improved substantially by standardizing the functionalities of EHR software packages. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Evaluating Process Sustainability Using Flowsheet Monitoring (Abstract)
Environmental metric software can be used to evaluate the sustainability of a chemical based upon data from the chemical process that is used to manufacture it. One problem in developing environmental metric software is that chemical process simulation packages typically do not p...
Component-based integration of chemistry and optimization software.
Kenny, Joseph P; Benson, Steven J; Alexeev, Yuri; Sarich, Jason; Janssen, Curtis L; McInnes, Lois Curfman; Krishnan, Manojkumar; Nieplocha, Jarek; Jurrus, Elizabeth; Fahlstrom, Carl; Windus, Theresa L
2004-11-15
Typical scientific software designs make rigid assumptions regarding programming language and data structures, frustrating software interoperability and scientific collaboration. Component-based software engineering is an emerging approach to managing the increasing complexity of scientific software. Component technology facilitates code interoperability and reuse. Through the adoption of methodology and tools developed by the Common Component Architecture Forum, we have developed a component architecture for molecular structure optimization. Using the NWChem and Massively Parallel Quantum Chemistry packages, we have produced chemistry components that provide capacity for energy and energy derivative evaluation. We have constructed geometry optimization applications by integrating the Toolkit for Advanced Optimization, Portable Extensible Toolkit for Scientific Computation, and Global Arrays packages, which provide optimization and linear algebra capabilities. We present a brief overview of the component development process and a description of abstract interfaces for chemical optimizations. The components conforming to these abstract interfaces allow the construction of applications using different chemistry and mathematics packages interchangeably. Initial numerical results for the component software demonstrate good performance, and highlight potential research enabled by this platform.
PB-AM: An open-source, fully analytical linear poisson-boltzmann solver
DOE Office of Scientific and Technical Information (OSTI.GOV)
Felberg, Lisa E.; Brookes, David H.; Yap, Eng-Hui
2016-11-02
We present the open source distributed software package Poisson-Boltzmann Analytical Method (PB-AM), a fully analytical solution to the linearized Poisson-Boltzmann equation. The PB-AM software package includes the generation of output files appropriate for visualization using VMD, a Brownian dynamics scheme that uses periodic boundary conditions to simulate dynamics, the ability to specify docking criteria, and offers two different kinetics schemes to evaluate biomolecular association rate constants. Given that PB-AM defines mutual polarization completely and accurately, it can be refactored as a many-body expansion to explore 2- and 3-body polarization. Additionally, the software has been integrated into the Adaptive Poisson-Boltzmann Solver (APBS) software package to make it more accessible to a larger group of scientists, educators and students that are more familiar with the APBS framework.
1993-05-01
limitation of the software package would not allow the program to run over 2359 to 0001 UT. [Tabulated date/time and frequency values (18.1, 19.0, 21.4, 24.0 kHz) omitted.] ...Capability (LWPC), a software package developed at NOSC (FERGUSON et al 1989) and adapted by us to the Macintosh personal computer. We find that this... software works very well. Our investigations are to evaluate and devise geophysical models to be used with LWPC in assessing VLF communications and
2015-01-01
Web-based user interfaces to scientific applications are important tools that allow researchers to utilize a broad range of software packages with just an Internet connection and a browser. One such interface, CHARMMing (CHARMM interface and graphics), facilitates access to the powerful and widely used molecular software package CHARMM. CHARMMing incorporates tasks such as molecular structure analysis, dynamics, multiscale modeling, and other techniques commonly used by computational life scientists. We have extended CHARMMing’s capabilities to include a fragment-based docking protocol that allows users to perform molecular docking and virtual screening calculations either directly via the CHARMMing Web server or on computing resources using the self-contained job scripts generated via the Web interface. The docking protocol was evaluated by performing a series of “re-dockings” with direct comparison to top commercial docking software. Results of this evaluation showed that CHARMMing’s docking implementation is comparable to many widely used software packages and validates the use of the new CHARMM generalized force field for docking and virtual screening. PMID:25151852
Pevzner, Yuri; Frugier, Emilie; Schalk, Vinushka; Caflisch, Amedeo; Woodcock, H Lee
2014-09-22
Web-based user interfaces to scientific applications are important tools that allow researchers to utilize a broad range of software packages with just an Internet connection and a browser. One such interface, CHARMMing (CHARMM interface and graphics), facilitates access to the powerful and widely used molecular software package CHARMM. CHARMMing incorporates tasks such as molecular structure analysis, dynamics, multiscale modeling, and other techniques commonly used by computational life scientists. We have extended CHARMMing's capabilities to include a fragment-based docking protocol that allows users to perform molecular docking and virtual screening calculations either directly via the CHARMMing Web server or on computing resources using the self-contained job scripts generated via the Web interface. The docking protocol was evaluated by performing a series of "re-dockings" with direct comparison to top commercial docking software. Results of this evaluation showed that CHARMMing's docking implementation is comparable to many widely used software packages and validates the use of the new CHARMM generalized force field for docking and virtual screening.
PB-AM: An open-source, fully analytical linear poisson-boltzmann solver.
Felberg, Lisa E; Brookes, David H; Yap, Eng-Hui; Jurrus, Elizabeth; Baker, Nathan A; Head-Gordon, Teresa
2017-06-05
We present the open source distributed software package Poisson-Boltzmann Analytical Method (PB-AM), a fully analytical solution to the linearized PB equation, for molecules represented as non-overlapping spherical cavities. The PB-AM software package includes the generation of output files appropriate for visualization using visual molecular dynamics, a Brownian dynamics scheme that uses periodic boundary conditions to simulate dynamics, the ability to specify docking criteria, and offers two different kinetics schemes to evaluate biomolecular association rate constants. Given that PB-AM defines mutual polarization completely and accurately, it can be refactored as a many-body expansion to explore 2- and 3-body polarization. Additionally, the software has been integrated into the Adaptive Poisson-Boltzmann Solver (APBS) software package to make it more accessible to a larger group of scientists, educators, and students that are more familiar with the APBS framework. © 2016 Wiley Periodicals, Inc.
ERIC Educational Resources Information Center
Sieverts, Eric G.; And Others
1993-01-01
Reports on tests evaluating nine microcomputer software packages designed for information storage and retrieval: BRS-Search, dtSearch, InfoBank, Micro-OPC, Q&A, STN-PFS, Strix, TINman, and ZYindex. Tables and narrative evaluations detail results related to security, hardware, user features, search capability, indexing, input, maintenance of files,…
Rastgou, Fereydoon; Shojaeifard, Maryam; Amin, Ahmad; Ghaedian, Tahereh; Firoozabadi, Hasan; Malek, Hadi; Yaghoobi, Nahid; Bitarafan-Rajabi, Ahmad; Haghjoo, Majid; Amouzadeh, Hedieh; Barati, Hossein
2014-12-01
Recently, the phase analysis of gated single-photon emission computed tomography (SPECT) myocardial perfusion imaging (MPI) has become feasible via several software packages for the evaluation of left ventricular mechanical dyssynchrony. We compared two quantitative software packages, quantitative gated SPECT (QGS) and Emory cardiac toolbox (ECTb), with tissue Doppler imaging (TDI) as the conventional method for the evaluation of left ventricular mechanical dyssynchrony. Thirty-one patients with severe heart failure (ejection fraction ≤35%) and regular heart rhythm, who were referred for gated-SPECT MPI, were enrolled. TDI was performed within 3 days after MPI. Dyssynchrony parameters derived from gated-SPECT MPI were analyzed by QGS and ECTb and were compared with the Yu index and septal-lateral wall delay measured by TDI. QGS and ECTb showed a good correlation for assessment of phase histogram bandwidth (PHB) and phase standard deviation (PSD) (r = 0.664 and r = 0.731, P < .001, respectively). However, the mean values of PHB and PSD by ECTb were significantly higher than those of QGS. No significant correlation was found between either the ECTb or the QGS parameters and the Yu index. Nevertheless, PHB, PSD, and entropy derived from QGS revealed a significant (r = 0.424, r = 0.478, r = 0.543, respectively; P < .02) correlation with septal-lateral wall delay. Despite a good correlation between the QGS and ECTb software packages, different normal cut-off values of PSD and PHB should be defined for each software package. There was only a modest correlation between phase analysis of gated-SPECT MPI and TDI data, especially in the population of heart failure patients with both narrow and wide QRS complex.
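The two dyssynchrony indices compared above, phase standard deviation (PSD) and phase histogram bandwidth (PHB), can be sketched directly from the per-sample onset-of-contraction phases. This simplified version ignores the circular nature of phase and uses hypothetical data; the vendors' definitions differ in detail, which is one reason their normal cut-off values differ:

    import numpy as np

    def dyssynchrony_indices(phases_deg, coverage=0.95):
        """PSD and the narrowest phase band containing `coverage` of samples (PHB)."""
        phases = np.sort(np.asarray(phases_deg, dtype=float))
        psd = phases.std(ddof=1)
        k = int(np.ceil(coverage * len(phases)))
        phb = (phases[k - 1:] - phases[:len(phases) - k + 1]).min()
        return psd, phb

    # Hypothetical phases (degrees): a dominant early group plus a delayed group.
    rng = np.random.default_rng(4)
    phases = np.concatenate([rng.normal(140, 15, 500), rng.normal(220, 20, 80)])
    psd, phb = dyssynchrony_indices(phases)
    print(round(psd, 1), round(phb, 1))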
Pre-Mastering and CD-WO Evaluations
NASA Technical Reports Server (NTRS)
Hecox, D.; Hyon, J.; Martin, M.; Marski, K.; Shields, E.; Sorensen, S.; Teramae, S.
1993-01-01
This article reviews the features and functionality of five desktop pre-mastering software packages for the PC. Desktop pre-mastering packages are aimed primarily at end-users interested in bringing CD-ROM publishing tasks in-house, rather than traditional CD-ROM developers.
Nuclear Data Online Services at Peking University
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fan, T.S.; Guo, Z.Y.; Ye, W.G.
2005-05-24
The Institute of Heavy Ion Physics at Peking University has developed a new nuclear data online services software package. Through the web site (http://ndos.nst.pku.edu.cn), it offers online access to main relational nuclear databases: five evaluated neutron libraries (BROND, CENDL, ENDF, JEF, JENDL), the ENSDF library, the EXFOR library, the IAEA photonuclear library and the charged particle data of the FENDL library. This software allows the comparison and graphic representations of the different data sets. The computer programs of this package are based on the Linux implementation of PHP and the MySQL software.
Nuclear Data Online Services at Peking University
NASA Astrophysics Data System (ADS)
Fan, T. S.; Guo, Z. Y.; Ye, W. G.; Liu, W. L.; Liu, T. J.; Liu, C. X.; Chen, J. X.; Tang, G. Y.; Shi, Z. M.; Huang, X. L.; Chen, J. E.
2005-05-01
The Institute of Heavy Ion Physics at Peking University has developed a new nuclear data online services software package. Through the web site (http://ndos.nst.pku.edu.cn), it offers online access to main relational nuclear databases: five evaluated neutron libraries (BROND, CENDL, ENDF, JEF, JENDL), the ENSDF library, the EXFOR library, the IAEA photonuclear library and the charged particle data of the FENDL library. This software allows the comparison and graphic representations of the different data sets. The computer programs of this package are based on the Linux implementation of PHP and the MySQL software.
Software engineering the mixed model for genome-wide association studies on large samples.
Zhang, Zhiwu; Buckler, Edward S; Casstevens, Terry M; Bradbury, Peter J
2009-11-01
Mixed models improve the ability to detect phenotype-genotype associations in the presence of population stratification and multiple levels of relatedness in genome-wide association studies (GWAS), but for large data sets the resource consumption becomes impractical. At the same time, the sample size and number of markers used for GWAS is increasing dramatically, resulting in greater statistical power to detect those associations. The use of mixed models with increasingly large data sets depends on the availability of software for analyzing those models. While multiple software packages implement the mixed model method, no single package provides the best combination of fast computation, ability to handle large samples, flexible modeling and ease of use. Key elements of association analysis with mixed models are reviewed, including modeling phenotype-genotype associations using mixed models, population stratification, kinship and its estimation, variance component estimation, use of best linear unbiased predictors or residuals in place of raw phenotype, improving efficiency and software-user interaction. The available software packages are evaluated, and suggestions made for future software development.
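A drastically simplified sketch of the single-marker mixed-model scan reviewed above: whiten the phenotype and design by V^(-1/2), where V = sigma_g^2 * K + sigma_e^2 * I, then run ordinary least squares marker by marker. Real packages estimate the variance components and use the efficiency tricks discussed in the review; here they are assumed known and the data are simulated:

    import numpy as np
    from scipy import stats

    def mlm_scan(y, geno, kinship, sg2, se2):
        """Per-marker p-values from a mixed model with known variance components."""
        n = len(y)
        evals, evecs = np.linalg.eigh(sg2 * kinship + se2 * np.eye(n))
        W = evecs @ np.diag(evals ** -0.5) @ evecs.T      # V^{-1/2}
        y_w = W @ y
        pvals = []
        for g in geno.T:                                   # one marker at a time
            X = W @ np.column_stack([np.ones(n), g])       # intercept + marker
            beta, *_ = np.linalg.lstsq(X, y_w, rcond=None)
            resid = y_w - X @ beta
            se = np.sqrt(resid @ resid / (n - 2) * np.linalg.inv(X.T @ X)[1, 1])
            pvals.append(2 * stats.t.sf(abs(beta[1] / se), n - 2))
        return np.array(pvals)

    rng = np.random.default_rng(5)
    geno = rng.integers(0, 3, size=(100, 50)).astype(float)   # 100 individuals, 50 markers
    y = 0.5 * geno[:, 0] + rng.normal(0, 1, 100)
    print(mlm_scan(y, geno, np.eye(100), sg2=0.3, se2=0.7)[:3])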
ERIC Educational Resources Information Center
Podany, Zita
This guide lists 19 software packages considered to be worthy of further consideration by other reviewing agencies and schools by a group of 17 computer coordinators from educational software preview centers and evaluation agencies. The following software is listed: (1) ASK-IT, an authoring tool; (2) Balance of the Planet, an environmental…
ERIC Educational Resources Information Center
Podany, Zita
This guide lists 21 software packages considered to be worthy of further consideration by other reviewing agencies and schools by a group of 12 computer coordinators from educational software preview centers and evaluation agencies. These software products have been selected as not being likely to appear in the reviews produced by major software…
ERIC Educational Resources Information Center
Ahl, David H.
1985-01-01
The "College Explorer" is a software package (for the 64K Apple II, IBM PC, TRS-80 model III and 4 microcomputers) which aids in choosing a college. The major features of this package (manufactured by The College Board) are described and evaluated. Sample input/output is included. (JN)
Parallelization of Rocket Engine System Software (Press)
NASA Technical Reports Server (NTRS)
Cezzar, Ruknet
1996-01-01
The main goal is to assess parallelization requirements for the Rocket Engine Numeric Simulator (RENS) project which, aside from gathering information on liquid-propelled rocket engines and setting forth requirements, involves a large FORTRAN-based package at NASA Lewis Research Center and TDK software developed by SUBR/UWF. The ultimate aim is to develop, test, integrate, and suitably deploy a family of software packages on various aspects and facets of rocket engines using liquid propellants. At present, all project efforts by the funding agency, NASA Lewis Research Center, and the HBCU participants are disseminated over the internet using world wide web home pages. Considering the obviously expensive methods of actual field trials, the benefits of software simulators are potentially enormous. When realized, these benefits will be analogous to those provided by numerous CAD/CAM packages and flight-training simulators. According to the overall task assignments, Hampton University's role is to collect all available software, place it in a common format, assess and evaluate it, define interfaces, and provide integration. Most importantly, HU's mission is to see to it that real-time performance is assured. This involves source code translations, porting, and distribution. The porting will be done in two phases: first, place all software on the Cray X-MP platform using FORTRAN. After testing and evaluation on the Cray X-MP, the code will be translated to C++ and ported to the parallel nCUBE platform. At present, we are evaluating another option of distributed processing over local area networks using Sun NFS, Ethernet, and TCP/IP. Considering the heterogeneous nature of the present software (e.g., it first started as an expert system using LISP machines) which now involves FORTRAN code, the effort is expected to be quite challenging.
Software for Middle School Physical Science.
ERIC Educational Resources Information Center
Podany, Zita
This final report in the MicroSIFT series reviews 10 software packages that deal mainly with the areas of electricity, magnetism, and heat energy. Software titles appearing in this report were selected because they were judged to be exemplary according to various criteria in the MicroSIFT Evaluator's Guide, with some additions to address science…
Microcomputer Database Management Systems that Interface with Online Public Access Catalogs.
ERIC Educational Resources Information Center
Rice, James
1988-01-01
Describes a study that assessed the availability and use of microcomputer database management interfaces to online public access catalogs. The software capabilities needed to effect such an interface are identified, and available software packages are evaluated by these criteria. A directory of software vendors is provided. (4 notes with…
Development of an Ada package library
NASA Technical Reports Server (NTRS)
Burton, Bruce; Broido, Michael
1986-01-01
A usable prototype Ada package library was developed and is currently being evaluated for use in large software development efforts. The library system is comprised of an Ada-oriented design language used to facilitate the collection of reuse information, a relational data base to store reuse information, a set of reusable Ada components and tools, and a set of guidelines governing the system's use. The prototyping exercise is discussed and the lessons learned from it have led to the definition of a comprehensive tool set to facilitate software reuse.
High resolution ultrasonic spectroscopy system for nondestructive evaluation
NASA Technical Reports Server (NTRS)
Chen, C. H.
1991-01-01
With increased demand for high resolution ultrasonic evaluation, computer based systems or work stations become essential. The ultrasonic spectroscopy method of nondestructive evaluation (NDE) was used to develop a high resolution ultrasonic inspection system supported by modern signal processing, pattern recognition, and neural network technologies. The basic system which was completed consists of a 386/20 MHz PC (IBM AT compatible), a pulser/receiver, a digital oscilloscope with serial and parallel communications to the computer, an immersion tank with motor control of X-Y axis movement, and the supporting software package, IUNDE, for interactive ultrasonic evaluation. Although the hardware components are commercially available, the software development is entirely original. By integrating signal processing, pattern recognition, maximum entropy spectral analysis, and artificial neural network functions into the system, many NDE tasks can be performed. The high resolution graphics capability provides visualization of complex NDE problems. The phase 3 efforts involve intensive marketing of the software package and collaborative work with industrial sectors.
Original Courseware for Introductory Psychology: Implementation and Evaluation.
ERIC Educational Resources Information Center
Slotnick, Robert S.
1988-01-01
Describes the implementation and field testing of PsychWare, a courseware package for introductory psychology developed and field tested at New York Institute of Technology. Highlights include the courseware package (10 software programs, a faculty manual, and a student workbook), and instructional design features (simulations, real-time…
NASA PC software evaluation project
NASA Technical Reports Server (NTRS)
Dominick, Wayne D. (Editor); Kuan, Julie C.
1986-01-01
The USL NASA PC software evaluation project is intended to provide a structured framework for facilitating the development of quality NASA PC software products. The project will assist NASA PC development staff to understand the characteristics and functions of NASA PC software products. Based on the results of the project teams' evaluations and recommendations, users can judge the reliability, usability, acceptability, maintainability and customizability of all the PC software products. The objective here is to provide initial, high-level specifications and guidelines for NASA PC software evaluation. The primary tasks to be addressed in this project are as follows: to gain a strong understanding of what software evaluation entails and how to organize a structured software evaluation process; to define a structured methodology for conducting the software evaluation process; to develop a set of PC software evaluation criteria and evaluation rating scales; and to conduct PC software evaluations in accordance with the identified methodology. The categories of PC software covered include Communication Packages, Network System Software, Graphics Support Software, Environment Management Software, and General Utilities. This report represents one of the 72 attachment reports to the University of Southwestern Louisiana's Final Report on NASA Grant NGT-19-010-900. Accordingly, appropriate care should be taken in using this report out of context of the full Final Report.
cit: hypothesis testing software for mediation analysis in genomic applications.
Millstein, Joshua; Chen, Gary K; Breton, Carrie V
2016-08-01
The challenges of successfully applying causal inference methods include: (i) satisfying underlying assumptions, (ii) limitations in data/models accommodated by the software and (iii) low power of common multiple testing approaches. The causal inference test (CIT) is based on hypothesis testing rather than estimation, allowing the testable assumptions to be evaluated in the determination of statistical significance. A user-friendly software package provides P-values and optionally permutation-based FDR estimates (q-values) for potential mediators. It can handle single and multiple binary and continuous instrumental variables, binary or continuous outcome variables and adjustment covariates. Also, the permutation-based FDR option provides a non-parametric implementation. Simulation studies demonstrate the validity of the cit package and show a substantial advantage of permutation-based FDR over other common multiple testing strategies. The cit open-source R package is freely available from the CRAN website (https://cran.r-project.org/web/packages/cit/index.html) with embedded C++ code that utilizes the GNU Scientific Library, also freely available (http://www.gnu.org/software/gsl/). joshua.millstein@usc.edu Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
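The permutation-based FDR idea highlighted above can be illustrated independently of the cit package itself. The sketch below is a generic empirical permutation FDR estimator in Python, not the package's actual code, and the array names are illustrative.

```python
import numpy as np

def permutation_fdr(observed_p, permuted_p):
    """Estimate an FDR (q-like value) for each observed p-value.

    observed_p : 1-D array of p-values from the real data.
    permuted_p : 2-D array (n_permutations x n_tests) of p-values
                 obtained from permuted data.

    For each observed p-value t, the estimated FDR is the mean count of
    permutation p-values <= t divided by the count of observed p-values
    <= t (a standard empirical permutation FDR estimator).
    """
    observed_p = np.asarray(observed_p, dtype=float)
    permuted_p = np.asarray(permuted_p, dtype=float)
    fdr = np.empty_like(observed_p)
    for i, t in enumerate(observed_p):
        false_hits = np.mean(np.sum(permuted_p <= t, axis=1))
        true_hits = max(np.sum(observed_p <= t), 1)
        fdr[i] = min(false_hits / true_hits, 1.0)
    return fdr

obs = np.array([0.001, 0.02, 0.3, 0.8])
perm = np.random.default_rng(1).uniform(size=(200, 4))
print(permutation_fdr(obs, perm))
```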
NASA Technical Reports Server (NTRS)
Stefanski, Philip L.
2015-01-01
Commercially available software packages today allow users to quickly perform the routine evaluations of (1) descriptive statistics to numerically and graphically summarize both sample and population data, (2) inferential statistics that draw conclusions about a given population from samples taken of it, (3) probability determinations that can be used to generate estimates of reliability allowables, and finally (4) the setup of designed experiments and analysis of their data to identify significant material and process characteristics for application in both product manufacturing and performance enhancement. This paper presents examples of analysis and experimental design work that has been conducted using Statgraphics® statistical software to obtain useful information with regard to solid rocket motor propellants and internal insulation material. Data were obtained from a number of programs (Shuttle, Constellation, and Space Launch System) and sources that include solid propellant burn rate strands, tensile specimens, sub-scale test motors, full-scale operational motors, rubber insulation specimens, and sub-scale rubber insulation analog samples. Besides facilitating the experimental design process to yield meaningful results, statistical software has demonstrated its ability to quickly perform complex data analyses and yield significant findings that might otherwise have gone unnoticed. One caveat to these successes is that useful results not only derive from the inherent power of the software package, but also from the skill and understanding of the data analyst.
Packaging Software Assets for Reuse
NASA Astrophysics Data System (ADS)
Mattmann, C. A.; Marshall, J. J.; Downs, R. R.
2010-12-01
The reuse of existing software assets such as code, architecture, libraries, and modules in current software and systems development projects can provide many benefits, including reduced costs, in time and effort, and increased reliability. Many reusable assets are currently available in various online catalogs and repositories, usually broken down by disciplines such as programming language (Ibiblio for Maven/Java developers, PyPI for Python developers, CPAN for Perl developers, etc.). The way these assets are packaged for distribution can play a role in their reuse - an asset that is packaged simply and logically is typically easier to understand, install, and use, thereby increasing its reusability. A well-packaged asset has advantages in being more reusable and thus more likely to provide benefits through its reuse. This presentation will discuss various aspects of software asset packaging and how they can affect the reusability of the assets. The characteristics of well-packaged software will be described. A software packaging domain model will be introduced, and some existing packaging approaches examined. An example case study of a Reuse Enablement System (RES), currently being created by near-term Earth science decadal survey missions, will provide information about the use of the domain model. Awareness of these factors will help software developers package their reusable assets so that they can provide the most benefits for software reuse.
Roos, Malgorzata; Stawarczyk, Bogna
2012-07-01
This study evaluated and compared Weibull parameters of resin bond strength values using six different general-purpose statistical software packages for two-parameter Weibull distribution. Two-hundred human teeth were randomly divided into 4 groups (n=50), prepared and bonded on dentin according to the manufacturers' instructions using the following resin cements: (i) Variolink (VAN, conventional resin cement), (ii) Panavia21 (PAN, conventional resin cement), (iii) RelyX Unicem (RXU, self-adhesive resin cement) and (iv) G-Cem (GCM, self-adhesive resin cement). Subsequently, all specimens were stored in water for 24h at 37°C. Shear bond strength was measured and the data were analyzed using Anderson-Darling goodness-of-fit (MINITAB 16) and two-parameter Weibull statistics with the following statistical software packages: Excel 2011, SPSS 19, MINITAB 16, R 2.12.1, SAS 9.1.3. and STATA 11.2 (p≤0.05). Additionally, the three-parameter Weibull was fitted using MINITAB 16. Two-parameter Weibull calculated with MINITAB and STATA can be compared using an omnibus test and using 95% CI. In SAS only 95% CI were directly obtained from the output. R provided no estimates of 95% CI. In both SAS and R the global comparison of the characteristic bond strength among groups is provided by means of the Weibull regression. EXCEL and SPSS provided no default information about 95% CI and no significance test for the comparison of Weibull parameters among the groups. In summary, conventional resin cement VAN showed the highest Weibull modulus and characteristic bond strength. There are discrepancies in the Weibull statistics depending on the software package and the estimation method. The information content in the default output provided by the software packages differs to a very high extent. Copyright © 2012 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
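For readers without access to the commercial packages compared above, the two quantities in question, the Weibull modulus (shape) and the characteristic bond strength (scale), can also be estimated with open scientific Python. A minimal sketch using scipy.stats.weibull_min with the location fixed at zero (the two-parameter case) is shown below on synthetic bond-strength data.

```python
import numpy as np
from scipy import stats

# Synthetic shear bond strength values in MPa (illustrative only).
rng = np.random.default_rng(42)
bond_strength = stats.weibull_min.rvs(c=8.0, scale=20.0, size=50, random_state=rng)

# Two-parameter Weibull: fix the location at zero and fit shape + scale.
shape, loc, scale = stats.weibull_min.fit(bond_strength, floc=0.0)
print(f"Weibull modulus (shape): {shape:.2f}")
print(f"Characteristic bond strength (scale): {scale:.2f} MPa")
```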
Computer Science and Technology: Introduction to Software Packages
1984-04-01
[Front matter excerpt: Table 5, Sources of Software Packages; Table 6, Reference Services Matrix; Table 7, Reference Matrix; List of Figures, Figure 1, Document...] ...consideration should be given to the acquisition of appropriate software packages to replace or upgrade existing services and to provide services not... Consequently, there are many companies that produce only software packages, and are committed to providing training, service, and support. These vendors
XDesign: an open-source software package for designing X-ray imaging phantoms and experiments.
Ching, Daniel J; Gürsoy, Doğa
2017-03-01
The development of new methods or utilization of current X-ray computed tomography methods is impeded by the substantial amount of expertise required to design an X-ray computed tomography experiment from beginning to end. In an attempt to make material models, data acquisition schemes and reconstruction algorithms more accessible to researchers lacking expertise in some of these areas, a software package is described here which can generate complex simulated phantoms and quantitatively evaluate new or existing data acquisition schemes and image reconstruction algorithms for targeted applications.
NASA Technical Reports Server (NTRS)
1989-01-01
Loredan Biomedical, Inc.'s LIDO, a computerized physical therapy system, was purchased by NASA in 1985 for evaluation as a Space Station Freedom exercise program. In 1986, while involved in an ARC muscle conditioning project, Malcom Bond, Loredan's chairman, designed an advanced software package for NASA which became the basis for LIDOSOFT software used in the commercially available system. The system employs a "proprioceptive" software program which perceives internal body conditions, induces perturbations to muscular effort and evaluates the response. Biofeedback on a screen allows a patient to observe his own performance.
Design Optimization Toolkit: Users' Manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aguilo Valentin, Miguel Alejandro
The Design Optimization Toolkit (DOTk) is a stand-alone C++ software package intended to solve complex design optimization problems. The DOTk software package provides a range of solution methods that are suited for gradient/nongradient-based optimization, large-scale constrained optimization, and topology optimization. DOTk was designed to have a flexible user interface to allow easy access to DOTk solution methods from external engineering software packages. This inherent flexibility makes DOTk barely intrusive to other engineering software packages. As part of this inherent flexibility, the DOTk software package provides an easy-to-use MATLAB interface that enables users to call DOTk solution methods directly from the MATLAB command window.
Integrative Lifecourse and Genetic Analysis of Military Working Dogs
2012-10-01
Recognition), ICR (Intelligent Character Recognition) and HWR (Handwriting Recognition). A number of various software packages were evaluated and we have...the third-party software is able to recognize check-boxes and columns and do a reasonable job with handwriting – which it does. This workflow will
Selection of software for mechanical engineering undergraduates
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheah, C. T.; Yin, C. S.; Halim, T.
A major problem with the undergraduate mechanical course is the limited exposure of students to software packages, coupled with the long learning curve of the existing software packages. This work proposes the use of appropriate software packages for the entire mechanical engineering curriculum to ensure students get sufficient exposure to real-life design problems. A variety of software packages are highlighted as being suitable for undergraduate work in mechanical engineering, e.g. simultaneous non-linear equations; uncertainty analysis; 3-D modeling software with FEA; analysis tools for the solution of problems in thermodynamics, fluid mechanics, mechanical system design, and solid mechanics.
SIRU utilization. Volume 2: Software description and program documentation
NASA Technical Reports Server (NTRS)
Oehrle, J.; Whittredge, R.
1973-01-01
A complete description of the additional analysis, development and evaluation provided for the SIRU system as identified in the requirements for the SIRU utilization program is presented. The SIRU configuration is a modular inertial subsystem with hardware and software features that achieve fault tolerant operational capabilities. The SIRU redundant hardware design is formulated about a six gyro and six accelerometer instrument module package. The modules are mounted in this package so that their measurement input axes form a unique symmetrical pattern that corresponds to the array of perpendiculars to the faces of a regular dodecahedron. This six axes array provides redundant independent sensing and the symmetry enables the formulation of an optimal software redundant data processing structure with self-contained fault detection and isolation (FDI) capabilities. Documentation of the additional software and software modifications required to implement the utilization capabilities includes assembly listings and flow charts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shevitz, Daniel Wolf; Key, Brian P.; Garcia, Daniel B.
2017-09-05
The Fragment Impact Toolkit (FIT) is a software package used for probabilistic consequence evaluation of fragmenting sources. The typical use case for FIT is to simulate an exploding shell and evaluate the consequence on nearby objects. FIT is written in the programming language Python and is designed as a collection of interacting software modules. Each module has a function that interacts with the other modules to produce desired results.
Chen, Hui; van Eijnatten, Maureen; Wolff, Jan; de Lange, Jan; van der Stelt, Paul F; Lobbezoo, Frank; Aarab, Ghizlane
2017-08-01
The aim of this study was to assess the reliability and accuracy of three different imaging software packages for three-dimensional analysis of the upper airway using CBCT images. To assess the reliability of the software packages, 15 NewTom 5G ® (QR Systems, Verona, Italy) CBCT data sets were randomly and retrospectively selected. Two observers measured the volume, minimum cross-sectional area and the length of the upper airway using Amira ® (Visage Imaging Inc., Carlsbad, CA), 3Diagnosys ® (3diemme, Cantu, Italy) and OnDemand3D ® (CyberMed, Seoul, Republic of Korea) software packages. The intra- and inter-observer reliability of the upper airway measurements were determined using intraclass correlation coefficients and Bland & Altman agreement tests. To assess the accuracy of the software packages, one NewTom 5G ® CBCT data set was used to print a three-dimensional anthropomorphic phantom with known dimensions to be used as the "gold standard". This phantom was subsequently scanned using a NewTom 5G ® scanner. Based on the CBCT data set of the phantom, one observer measured the volume, minimum cross-sectional area, and length of the upper airway using Amira ® , 3Diagnosys ® , and OnDemand3D ® , and compared these measurements with the gold standard. The intra- and inter-observer reliability of the measurements of the upper airway using the different software packages were excellent (intraclass correlation coefficient ≥0.75). There was excellent agreement between all three software packages in volume, minimum cross-sectional area and length measurements. All software packages underestimated the upper airway volume by -8.8% to -12.3%, the minimum cross-sectional area by -6.2% to -14.6%, and the length by -1.6% to -2.9%. All three software packages offered reliable volume, minimum cross-sectional area and length measurements of the upper airway. The length measurements of the upper airway were the most accurate results in all software packages. All software packages underestimated the upper airway dimensions of the anthropomorphic phantom.
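The accuracy figures above amount to signed percent deviations of each software measurement from the printed phantom's known dimensions. A minimal sketch of that comparison is shown below; the package names and numbers are hypothetical stand-ins, not the values reported in the study.

```python
def percent_deviation(measured, gold_standard):
    """Signed percent deviation of a software measurement from the known
    (gold standard) phantom value; negative values indicate underestimation."""
    return 100.0 * (measured - gold_standard) / gold_standard

# Hypothetical upper-airway volumes (cm^3) measured on the phantom.
gold_volume = 30.0
for package, volume in {"PackageA": 27.4, "PackageB": 26.3, "PackageC": 26.8}.items():
    print(f"{package}: {percent_deviation(volume, gold_volume):+.1f}%")
```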
State-of-the-art software for window energy-efficiency rating and labeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arasteh, D.; Finlayson, E.; Huang, J.
1998-07-01
Measuring the thermal performance of windows in typical residential buildings is an expensive proposition. Not only is laboratory testing expensive, but each window manufacturer typically offers hundreds of individual products, each of which has different thermal performance properties. With over a thousand window manufacturers nationally, a testing-based rating system would be prohibitively expensive to the industry and to consumers. Beginning in the early 1990s, simulation software began to be used as part of a national program for rating window U-values. The rating program has since been expanded to include Solar Heat Gain Coefficients and is now being extended to annual energy performance. This paper describes four software packages available to the public from Lawrence Berkeley National Laboratory (LBNL). These software packages are used to evaluate window thermal performance: RESFEN (for evaluating annual energy costs), WINDOW (for calculating a product's thermal performance properties), THERM (a preprocessor for WINDOW that determines two-dimensional heat-transfer effects), and Optics (a preprocessor for WINDOW's glass database). Software not only offers a less expensive means than testing to evaluate window performance, it can also be used during the design process to help manufacturers produce windows that will meet target specifications. In addition, software can show small improvements in window performance that might not be detected in actual testing because of large uncertainties in test procedures.
An Evaluation of the Learning of Undergraduates Using E-Learning in a Tertiary Institution in China
ERIC Educational Resources Information Center
Liu, James N. K.; Cheng, Xiangqian
2008-01-01
In recent years, tertiary institutions in developed countries have made extensive use of Course Management Systems (CMSs), software packages designed to help educators create online learning communities. To date, however, such packages have been little used in higher education institutions (HEIs) in China. In this article, we describe the…
NASA Astrophysics Data System (ADS)
Kosterina, E. A.; Isagadzhieva, Z. Sh
2018-01-01
Data of the ecological-hydrogeological fieldwork at the Predvolzhye region of the Republic of Tatarstan were analyzed. A geofiltration model of the Buinsk region area near the village of Stary Studenets in the territory of the Republic of Tatarstan was constructed by the PM5 software package. The model can be developed to become the basis for estimation of the groundwater reserves of the territory, modeling the operation of water intake wells, designing the location of water intake wells, and evaluation of their operational capabilities, and constructing sanitary protection zones.
Pediconi, Federica; Catalano, Carlo; Venditti, Fiammetta; Ercolani, Mauro; Carotenuto, Luigi; Padula, Simona; Moriconi, Enrica; Roselli, Antonella; Giacomelli, Laura; Kirchin, Miles A; Passariello, Roberto
2005-07-01
The objective of this study was to evaluate the value of a color-coded automated signal intensity curve software package for contrast-enhanced magnetic resonance mammography (CE-MRM) in patients with suspected breast cancer. Thirty-six women with suspected breast cancer based on mammographic and sonographic examinations were preoperatively evaluated on CE-MRM. CE-MRM was performed on a 1.5-T magnet using a 2D Flash dynamic T1-weighted sequence. A dosage of 0.1 mmol/kg of Gd-BOPTA was administered at a flow rate of 2 mL/s followed by 10 mL of saline. Images were analyzed with the new software package and separately with a standard display method. Statistical comparison was performed of the confidence for lesion detection and characterization with the 2 methods and of the diagnostic accuracy for characterization compared with histopathologic findings. At pathology, 54 malignant lesions and 14 benign lesions were evaluated. All 68 (100%) lesions were detected with both methods and good correlation with histopathologic specimens was obtained. Confidence for both detection and characterization was significantly (P < or = 0.025) better with the color-coded method, although no difference (P > 0.05) between the methods was noted in terms of the sensitivity, specificity, and overall accuracy for lesion characterization. Excellent agreement between the 2 methods was noted for both the determination of lesion size (kappa = 0.77) and determination of SI/T curves (kappa = 0.85). The novel color-coded signal intensity curve software allows lesions to be visualized as false color maps that correspond to conventional signal intensity time curves. Detection and characterization of breast lesions with this method is quick and easily interpretable.
Software and System Warranty Issues and Generic Warranty Clause.
1987-06-01
communications networks and other government-furnished equipment. Special attention must also be paid to software packages, such as operating... * Phase A - Development Test and Evaluation conducted at a test facility. * Phase - Development Test and
Software and package applicating for network meta-analysis: A usage-based comparative study.
Xu, Chang; Niu, Yuming; Wu, Junyi; Gu, Huiyun; Zhang, Chao
2017-12-21
To compare and analyze the characteristics and functions of software applications for network meta-analysis (NMA). PubMed, EMbase, The Cochrane Library, the official websites of Bayesian inference Using Gibbs Sampling (BUGS), Stata and R, and Google were searched to collect the software and packages for performing NMA; software and packages published up to March 2016 were included. After collecting the software, packages, and their user guides, we used the software and packages to calculate a typical example. All characteristics, functions, and computed results were compared and analyzed. Ten types of software were included, including programming and non-programming software. They were developed mainly based on Bayesian or frequentist theory. Most types of software have the characteristics of easy operation, easy mastery, exact calculation, or excellent graphing. However, there was no single software that performed accurate calculations with superior graphing; this could only be achieved through the combination of two or more types of software. This study suggests that the user should choose the appropriate software according to personal programming basis, operational habits, and financial ability. Then, the choice of the combination of BUGS and R (or Stata) software to perform the NMA is considered. © 2017 Chinese Cochrane Center, West China Hospital of Sichuan University and John Wiley & Sons Australia, Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hart, William Eugene
These slides describe different strategies for installing Python software. Although I am a big fan of Python software development, robust strategies for software installation remain a challenge. This talk describes several different installation scenarios. The Good: the user has administrative privileges - Installing on Windows with an installer executable, Installing with a Linux application utility, Installing a Python package from the PyPI repository, and Installing a Python package from source. The Bad: the user does not have administrative privileges - Using a virtual environment to isolate package installations, and Using an installer executable on Windows with a virtual environment. The Ugly: the user needs to install an extension package from source - Installing a Python extension package from source, and PyCoinInstall - Managing builds for Python extension packages. The last item, referring to PyCoinInstall, describes a utility being developed for the COIN-OR software, which is used within the operations research community. COIN-OR includes a variety of Python and C++ software packages, and this script uses a simple plug-in system to support the management of package builds and installation.
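For the non-administrator scenario described in the slides, the standard-library route is to create a virtual environment and install packages into it. A minimal sketch driven from Python itself is shown below; the environment path and the package installed are purely illustrative.

```python
import subprocess
import venv
from pathlib import Path

# Create an isolated environment so no administrative privileges are needed.
env_dir = Path("./demo-env")
venv.EnvBuilder(with_pip=True).create(env_dir)

# Path to the environment's interpreter (POSIX layout; use Scripts/ on Windows).
python_bin = env_dir / "bin" / "python"

# Install an illustrative package from PyPI into the isolated environment.
subprocess.run([str(python_bin), "-m", "pip", "install", "requests"], check=True)
```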
Trend Monitoring System (TMS) graphics software
NASA Technical Reports Server (NTRS)
Brown, J. S.
1979-01-01
A prototype bus communications system, which is being used to support the Trend Monitoring System (TMS) and to evaluate the bus concept, is considered. A set of FORTRAN-callable graphics subroutines for the host MODCOMP computer, and an approach to splitting graphics work between the host and the system's intelligent graphics terminals, are described. The graphics software in the MODCOMP and the operating software package written for the graphics terminals are included.
Evaluator's Guide for Microcomputer-Based Instructional Packages. Revised.
ERIC Educational Resources Information Center
International Council for Computers in Education, Eugene, OR.
Two instruments have been developed to aid teachers and other educators in evaluating educational software and courseware: the "Courseware Description" form and the "Courseware Evaluation" form. Complete instructions for using both forms are provided in this guide, along with the forms themselves. Prior to the instructions is…
Reference Gene Validation for RT-qPCR, a Note on Different Available Software Packages
De Spiegelaere, Ward; Dern-Wieloch, Jutta; Weigel, Roswitha; Schumacher, Valérie; Schorle, Hubert; Nettersheim, Daniel; Bergmann, Martin; Brehm, Ralph; Kliesch, Sabine; Vandekerckhove, Linos; Fink, Cornelia
2015-01-01
Background An appropriate normalization strategy is crucial for data analysis from real time reverse transcription polymerase chain reactions (RT-qPCR). It is widely supported to identify and validate stable reference genes, since no single biological gene is stably expressed between cell types or within cells under different conditions. Different algorithms exist to validate optimal reference genes for normalization. Applying human cells, we here compare the three main methods to the online available RefFinder tool that integrates these algorithms along with R-based software packages which include the NormFinder and GeNorm algorithms. Results 14 candidate reference genes were assessed by RT-qPCR in two sample sets, i.e. a set of samples of human testicular tissue containing carcinoma in situ (CIS), and a set of samples from the human adult Sertoli cell line (FS1) either cultured alone or in co-culture with the seminoma like cell line (TCam-2) or with equine bone marrow derived mesenchymal stem cells (eBM-MSC). Expression stabilities of the reference genes were evaluated using geNorm, NormFinder, and BestKeeper. Similar results were obtained by the three approaches for the most and least stably expressed genes. The R-based packages NormqPCR, SLqPCR and the NormFinder for R script gave identical gene rankings. Interestingly, different outputs were obtained between the original software packages and the RefFinder tool, which is based on raw Cq values for input. When the raw data were reanalysed assuming 100% efficiency for all genes, then the outputs of the original software packages were similar to the RefFinder software, indicating that RefFinder outputs may be biased because PCR efficiencies are not taken into account. Conclusions This report shows that assay efficiency is an important parameter for reference gene validation. New software tools that incorporate these algorithms should be carefully validated prior to use. PMID:25825906
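The efficiency point made above is easy to illustrate: working from raw Cq values implicitly assumes a per-cycle amplification factor of 2 (100% efficiency), whereas an assay-specific efficiency changes the relative quantities. A minimal sketch, with made-up Cq values, is shown below.

```python
import numpy as np

def relative_quantities(cq_values, efficiency=2.0):
    """Convert Cq values to relative quantities.

    efficiency is the per-cycle amplification factor: 2.0 corresponds to the
    100% efficiency implicitly assumed when working from raw Cq values, while
    a measured value such as 1.9 (90%) changes the result noticeably.
    """
    cq = np.asarray(cq_values, dtype=float)
    return efficiency ** (cq.min() - cq)

cq = [22.1, 23.4, 24.0]              # illustrative Cq values for one gene
print(relative_quantities(cq, 2.0))  # assuming 100% efficiency
print(relative_quantities(cq, 1.9))  # assay-specific efficiency
```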
NLM microcomputer-based tutorials (for microcomputers). Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perkins, M.
1990-04-01
The package consists of TOXLEARN--a microcomputer-based training package for TOXLINE (Toxicology Information Online), CHEMLEARN-a microcomputer-based training package for CHEMLINE (Chemical Information Online), MEDTUTOR--a microcomputer-based training package for MEDLINE (Medical Information Online), and ELHILL LEARN--a microcomputer-based training package for the ELHILL search and retrieval software that supports the above-mentioned databases...Software Description: The programs were developed under PILOTplus using the NLM LEARN Programmer. They run on IBM-PC, XT, AT, PS/2, and fully compatible computers. The programs require 512K RAM memory, one disk drive, and DOS 2.0 or higher. The software supports most monochrome, color graphics, enhanced color graphics, or visual graphics displays.
Preliminary Dynamic Feasibility and Analysis of a Spherical, Wind-Driven (Tumbleweed), Martian Rover
NASA Technical Reports Server (NTRS)
Flick, John J.; Toniolo, Matthew D.
2005-01-01
The process and findings are presented from a preliminary feasibility study examining the dynamic characteristics of a spherical wind-driven (or Tumbleweed) rover, which is intended for exploration of the Martian surface. The results of an initial feasibility study involving several worst-case mobility situations that a Tumbleweed rover might encounter on the surface of Mars are discussed. Additional topics include the evaluation of several commercially available analysis software packages that were examined as possible platforms for the development of a Monte Carlo Tumbleweed mission simulation tool. This evaluation led to the development of the Mars Tumbleweed Monte Carlo Simulator (or Tumbleweed Simulator) using the Vortex physics software package from CM-Labs, Inc. Discussions regarding the development and evaluation of the Tumbleweed Simulator, as well as the results of a preliminary analysis using the tool, are also presented. Finally, a brief conclusions section is presented.
NASA Technical Reports Server (NTRS)
Eichenlaub, Carl T.; Harper, C. Douglas; Hird, Geoffrey
1993-01-01
Life-critical applications warrant a higher level of software reliability than has yet been achieved. Since it is not certain that traditional methods alone can provide the required ultra reliability, new methods should be examined as supplements or replacements. This paper describes a mathematical counterpart to the traditional process of empirical testing. ORA's Penelope verification system is demonstrated as a tool for evaluating the correctness of Ada software. Grady Booch's Ada calendar utility package, obtained through NASA, was specified in the Larch/Ada language. Formal verification in the Penelope environment established that many of the package's subprograms met their specifications. In other subprograms, failed attempts at verification revealed several errors that had escaped detection by testing.
Integrated software system for low level waste management
DOE Office of Scientific and Technical Information (OSTI.GOV)
Worku, G.
1995-12-31
In the continually changing and uncertain world of low level waste management, many generators in the US are faced with the prospect of having to store their waste on site for the indefinite future. This consequently increases the set of tasks performed by the generators in the areas of packaging, characterizing, classifying, screening (if a set of acceptance criteria applies), and managing the inventory for the duration of onsite storage. When disposal sites become available, it is expected that the work will require re-evaluating the waste packages, including possible re-processing, re-packaging, or re-classifying in preparation for shipment for disposal under the regulatory requirements of the time. In this day and age, when there is wide use of computers and computer literacy is at high levels, an important waste management tool would be an integrated software system that aids waste management personnel in conducting these tasks quickly and accurately. It has become evident that such an integrated radwaste management software system offers great benefits to radwaste generators both in the US and other countries. This paper discusses one such approach to integrated radwaste management utilizing some globally accepted radiological assessment software applications.
NMRbox: A Resource for Biomolecular NMR Computation.
Maciejewski, Mark W; Schuyler, Adam D; Gryk, Michael R; Moraru, Ion I; Romero, Pedro R; Ulrich, Eldon L; Eghbalnia, Hamid R; Livny, Miron; Delaglio, Frank; Hoch, Jeffrey C
2017-04-25
Advances in computation have been enabling many recent advances in biomolecular applications of NMR. Due to the wide diversity of applications of NMR, the number and variety of software packages for processing and analyzing NMR data is quite large, with labs relying on dozens, if not hundreds of software packages. Discovery, acquisition, installation, and maintenance of all these packages is a burdensome task. Because the majority of software packages originate in academic labs, persistence of the software is compromised when developers graduate, funding ceases, or investigators turn to other projects. To simplify access to and use of biomolecular NMR software, foster persistence, and enhance reproducibility of computational workflows, we have developed NMRbox, a shared resource for NMR software and computation. NMRbox employs virtualization to provide a comprehensive software environment preconfigured with hundreds of software packages, available as a downloadable virtual machine or as a Platform-as-a-Service supported by a dedicated compute cloud. Ongoing development includes a metadata harvester to regularize, annotate, and preserve workflows and facilitate and enhance data depositions to BioMagResBank, and tools for Bayesian inference to enhance the robustness and extensibility of computational analyses. In addition to facilitating use and preservation of the rich and dynamic software environment for biomolecular NMR, NMRbox fosters the development and deployment of a new class of metasoftware packages. NMRbox is freely available to not-for-profit users. Copyright © 2017 Biophysical Society. All rights reserved.
XP-SWMM is a commercial software package used throughout the United States and around the world for simulation of storm, sanitary and combined sewer systems. It was designed based on the EPA Storm Water Management Model (EPA SWMM), but has enhancements and additional algorithms f...
Evaluation of open source data mining software packages
Bonnie Ruefenacht; Greg Liknes; Andrew J. Lister; Haans Fisk; Dan Wendt
2009-01-01
Since 2001, the USDA Forest Service (USFS) has used classification and regression-tree technology to map USFS Forest Inventory and Analysis (FIA) biomass, forest type, forest type groups, and National Forest vegetation. This prior work used Cubist/See5 software for the analyses. The objective of this project, sponsored by the Remote Sensing Steering Committee (RSSC),...
Evaluating Dense 3d Reconstruction Software Packages for Oblique Monitoring of Crop Canopy Surface
NASA Astrophysics Data System (ADS)
Brocks, S.; Bareth, G.
2016-06-01
Crop Surface Models (CSMs) are 2.5D raster surfaces representing absolute plant canopy height. Using multiple CMSs generated from data acquired at multiple time steps, a crop surface monitoring is enabled. This makes it possible to monitor crop growth over time and can be used for monitoring in-field crop growth variability which is useful in the context of high-throughput phenotyping. This study aims to evaluate several software packages for dense 3D reconstruction from multiple overlapping RGB images on field and plot-scale. A summer barley field experiment located at the Campus Klein-Altendorf of University of Bonn was observed by acquiring stereo images from an oblique angle using consumer-grade smart cameras. Two such cameras were mounted at an elevation of 10 m and acquired images for a period of two months during the growing period of 2014. The field experiment consisted of nine barley cultivars that were cultivated in multiple repetitions and nitrogen treatments. Manual plant height measurements were carried out at four dates during the observation period. The software packages Agisoft PhotoScan, VisualSfM with CMVS/PMVS2 and SURE are investigated. The point clouds are georeferenced through a set of ground control points. Where adequate results are reached, a statistical analysis is performed.
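Independently of which reconstruction package is used, the crop-height signal in such a workflow comes from raster differencing: a crop surface model minus a ground model gives per-pixel plant height. A minimal NumPy sketch with tiny synthetic rasters is shown below; the values are illustrative only.

```python
import numpy as np

def plant_height(crop_surface_model, ground_model):
    """Per-pixel plant height: crop surface elevation minus ground elevation.
    Both inputs are 2-D rasters (2.5D surfaces) in the same vertical datum."""
    return np.asarray(crop_surface_model) - np.asarray(ground_model)

# Tiny synthetic rasters (metres), purely illustrative.
ground = np.full((3, 3), 92.10)
csm_day_30 = ground + np.array([[0.31, 0.35, 0.30],
                                [0.33, 0.36, 0.34],
                                [0.29, 0.32, 0.30]])
heights = plant_height(csm_day_30, ground)
print("Mean plot height: %.2f m" % heights.mean())
```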
Models and techniques for evaluating the effectiveness of aircraft computing systems
NASA Technical Reports Server (NTRS)
Meyer, J. F.
1978-01-01
Progress in the development of system models and techniques for the formulation and evaluation of aircraft computer system effectiveness is reported. Topics covered include: analysis of functional dependence; a prototype software package, METAPHOR, developed to aid the evaluation of performability; and a comprehensive performability modeling and evaluation exercise involving the SIFT computer.
Construction and comparative evaluation of different activity detection methods in brain FDG-PET.
Buchholz, Hans-Georg; Wenzel, Fabian; Gartenschläger, Martin; Thiele, Frank; Young, Stewart; Reuss, Stefan; Schreckenberger, Mathias
2015-08-18
We constructed and evaluated reference brain FDG-PET databases for usage by three software programs (Computer-aided diagnosis for dementia (CAD4D), Statistical Parametric Mapping (SPM) and NEUROSTAT), which allow a user-independent detection of dementia-related hypometabolism in patients' brain FDG-PET. Thirty-seven healthy volunteers were scanned in order to construct brain FDG reference databases, which reflect the normal, age-dependent glucose consumption in human brain, using either software. Databases were compared to each other to assess the impact of different stereotactic normalization algorithms used by either software package. In addition, performance of the new reference databases in the detection of altered glucose consumption in the brains of patients was evaluated by calculating statistical maps of regional hypometabolism in FDG-PET of 20 patients with confirmed Alzheimer's dementia (AD) and of 10 non-AD patients. Extent (hypometabolic volume referred to as cluster size) and magnitude (peak z-score) of detected hypometabolism was statistically analyzed. Differences between the reference databases built by CAD4D, SPM or NEUROSTAT were observed. Due to the different normalization methods, altered spatial FDG patterns were found. When analyzing patient data with the reference databases created using CAD4D, SPM or NEUROSTAT, similar characteristic clusters of hypometabolism in the same brain regions were found in the AD group with either software. However, larger z-scores were observed with CAD4D and NEUROSTAT than those reported by SPM. Better concordance with CAD4D and NEUROSTAT was achieved using the spatially normalized images of SPM and an independent z-score calculation. The three software packages identified the peak z-scores in the same brain region in 11 of 20 AD cases, and there was concordance between CAD4D and SPM in 16 AD subjects. The clinical evaluation of brain FDG-PET of 20 AD patients with either CAD4D-, SPM- or NEUROSTAT-generated databases from an identical reference dataset showed similar patterns of hypometabolism in the brain regions known to be involved in AD. The extent of hypometabolism and peak z-score appeared to be influenced by the calculation method used in each software package rather than by different spatial normalization parameters.
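All three packages ultimately produce voxel-wise statistical maps of the same form: a z-score of (patient value minus reference mean) divided by the reference standard deviation at each voxel. A minimal NumPy sketch, assuming already spatially and intensity normalized volumes and using synthetic data, is shown below.

```python
import numpy as np

def hypometabolism_z_map(patient, reference_scans, eps=1e-6):
    """Voxel-wise z-scores of a patient volume against a stack of reference
    volumes (all already spatially and intensity normalized).

    patient         : 3-D array
    reference_scans : 4-D array, shape (n_subjects, x, y, z)
    Negative z-scores indicate glucose consumption below the reference mean.
    """
    ref_mean = reference_scans.mean(axis=0)
    ref_std = reference_scans.std(axis=0, ddof=1)
    return (patient - ref_mean) / (ref_std + eps)

rng = np.random.default_rng(3)
reference = rng.normal(1.0, 0.05, size=(37, 8, 8, 8))   # 37 healthy volunteers
patient = rng.normal(0.9, 0.05, size=(8, 8, 8))         # globally reduced uptake
z = hypometabolism_z_map(patient, reference)
print("Voxels with z < -2:", int((z < -2).sum()))
```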
Closing the loop on improvement: Packaging experience in the Software Engineering Laboratory
NASA Technical Reports Server (NTRS)
Waligora, Sharon R.; Landis, Linda C.; Doland, Jerry T.
1994-01-01
As part of its award-winning software process improvement program, the Software Engineering Laboratory (SEL) has developed an effective method for packaging organizational best practices based on real project experience into useful handbooks and training courses. This paper shares the SEL's experience over the past 12 years creating and updating software process handbooks and training courses. It provides cost models and guidelines for successful experience packaging derived from SEL experience.
pyam: Python Implementation of YaM
NASA Technical Reports Server (NTRS)
Myint, Steven; Jain, Abhinandan
2012-01-01
pyam is a software development framework with tools for facilitating the rapid development of software in a concurrent software development environment. pyam provides solutions for development challenges associated with software reuse, managing multiple software configurations, developing software product lines, and multiple platform development and build management. pyam uses release-early, release-often development cycles to allow developers to integrate their changes incrementally into the system on a continual basis. It facilitates the creation and merging of branches to support the isolated development of immature software to avoid impacting the stability of the development effort. It uses modules and packages to organize and share software across multiple software products, and uses the concepts of link and work modules to reduce sandbox setup times even when the code-base is large. One side benefit is the enforcement of a strong module-level encapsulation of a module's functionality and interface. This increases design transparency, system stability, and software reuse. pyam is written in Python and is organized as a set of utilities on top of the open source SVN software version control package. All development software is organized into a collection of modules. pyam packages are defined as sub-collections of the available modules. Developers can set up private sandboxes for module/package development. All module/package development takes place on private SVN branches. High-level pyam commands support the setup, update, and release of modules and packages. Released and pre-built versions of modules are available to developers. Developers can tailor the source/link module mix for their sandboxes so that new sandboxes (even large ones) can be built up easily and quickly by pointing to pre-existing module releases. All inter-module interfaces are publicly exported via links. A minimal, but uniform, convention is used for building modules.
Craniux: A LabVIEW-Based Modular Software Framework for Brain-Machine Interface Research
2011-01-01
open-source BMI software solutions are currently available, we feel that the Craniux software package fills a specific need in the realm of BMI...data, such as cortical source imaging using EEG or MEG recordings. It is with these characteristics in mind that we feel the Craniux software package...S. Adee, "Dean Kamen's 'luke arm' prosthesis readies for clinical trials," IEEE Spectrum, February 2008, http://spectrum.ieee.org/biomedical
A software package for evaluating the performance of a star sensor operation
NASA Astrophysics Data System (ADS)
Sarpotdar, Mayuresh; Mathew, Joice; Sreejith, A. G.; Nirmal, K.; Ambily, S.; Prakash, Ajin; Safonova, Margarita; Murthy, Jayant
2017-02-01
We have developed a low-cost off-the-shelf component star sensor (StarSense) for use in minisatellites and CubeSats to determine the attitude of a satellite in orbit. StarSense is an imaging camera with a limiting magnitude of 6.5, which extracts information from star patterns it records in the images. The star sensor implements a centroiding algorithm to find centroids of the stars in the image, a Geometric Voting algorithm for star pattern identification, and a QUEST algorithm for attitude quaternion calculation. Here, we describe the software package developed to evaluate the performance of these algorithms operating together as a single star sensor system. We simulate the ideal case where sky background and instrument errors are omitted, and a more realistic case where noise and camera parameters are added to the simulated images. We evaluate such performance parameters of the algorithms as attitude accuracy, calculation time, required memory, star catalog size, sky coverage, etc., and estimate the errors introduced by each algorithm. This software package is written for use in MATLAB. The testing is parametrized for different hardware parameters, such as the focal length of the imaging setup, the field of view (FOV) of the camera, angle measurement accuracy, distortion effects, etc., and therefore can be applied to evaluate the performance of such algorithms in any star sensor. For its hardware implementation on our StarSense, we are currently porting the codes in the form of functions written in C. This is done keeping in view its easy implementation on any star sensor electronics hardware.
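Of the three algorithms named, the centroiding step is the easiest to illustrate: star centroids are typically taken as intensity-weighted means of pixels above a threshold. The sketch below is a generic Python version of that idea on a synthetic frame, not the StarSense or MATLAB code itself.

```python
import numpy as np

def intensity_weighted_centroid(image, threshold):
    """Return the (row, col) intensity-weighted centroid of all pixels above
    threshold, treating the whole thresholded image as one star for brevity.
    A real star sensor would first segment the image into separate stars."""
    img = np.asarray(image, dtype=float)
    weights = img * (img > threshold)
    total = weights.sum()
    rows, cols = np.indices(img.shape)
    return (rows * weights).sum() / total, (cols * weights).sum() / total

# Synthetic 7x7 frame with a single defocused star centred near (2.0, 4.0).
frame = np.zeros((7, 7))
frame[2, 4] = 100.0
frame[1:4, 3:6] += 20.0
print(intensity_weighted_centroid(frame, threshold=5.0))
```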
Real-Time Data Processing Onboard Remote Sensor Platforms: Annual Review #3 Data Package
NASA Technical Reports Server (NTRS)
Cook, Sid; Harsanyi, Joe
2003-01-01
The current program status reviewed by this viewgraph presentation includes: 1) New Evaluation Results; 2) Algorithm Improvement Investigations; 3) Electronic Hardware Design; 4) Software Hardware Interface Design.
Headway Separation Assurance Subsystem (HSAS)
DOT National Transportation Integrated Search
1975-07-01
This report discusses the design, fabrication, test and evaluation of a Headway Separation Assurance Subsystem (HSAS) capable of reliable, failsafe performance in PRT systems. The items designed include both hardware and software packages. These pack...
Astronomical Software Directory Service
NASA Technical Reports Server (NTRS)
Hanisch, R. J.; Payne, H.; Hayes, J.
1998-01-01
This is the final report on the development of the Astronomical Software Directory Service (ASDS), a distributable, searchable, WWW-based database of software packages and their related documentation. ASDS provides integrated access to 56 astronomical software packages, with more than 16,000 URL's indexed for full-text searching.
Browndye: A Software Package for Brownian Dynamics
McCammon, J. Andrew
2010-01-01
A new software package, Browndye, is presented for simulating the diffusional encounter of two large biological molecules. It can be used to estimate second-order rate constants and encounter probabilities, and to explore reaction trajectories. Browndye builds upon previous knowledge and algorithms from software packages such as UHBD, SDA, and Macrodox, while implementing algorithms that scale to larger systems. PMID:21132109
Development of a software package for solid-angle calculations using the Monte Carlo method
NASA Astrophysics Data System (ADS)
Zhang, Jie; Chen, Xiulian; Zhang, Changsheng; Li, Gang; Xu, Jiayun; Sun, Guangai
2014-02-01
Solid-angle calculations play an important role in the absolute calibration of radioactivity measurement systems and in the determination of the activity of radioactive sources, which are often complicated. In the present paper, a software package is developed to provide a convenient tool for solid-angle calculations in nuclear physics. The proposed software calculates solid angles using the Monte Carlo method, in which a new type of variance reduction technique was integrated. The package, developed under the environment of Microsoft Foundation Classes (MFC) in Microsoft Visual C++, has a graphical user interface, in which, the visualization function is integrated in conjunction with OpenGL. One advantage of the proposed software package is that it can calculate the solid angle subtended by a detector with different geometric shapes (e.g., cylinder, square prism, regular triangular prism or regular hexagonal prism) to a point, circular or cylindrical source without any difficulty. The results obtained from the proposed software package were compared with those obtained from previous studies and calculated using Geant4. It shows that the proposed software package can produce accurate solid-angle values with a greater computation speed than Geant4.
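The Monte Carlo idea behind such a calculation is straightforward for a simple geometry: sample emission directions isotropically from the source point, count the fraction that intersects the detector face, and multiply by 4π. The sketch below does this for a coaxial disc detector and checks the estimate against the analytical formula; it is a generic illustration, not the package's algorithm.

```python
import numpy as np

def solid_angle_disc_mc(distance, radius, n_samples=1_000_000, seed=0):
    """Monte Carlo estimate of the solid angle subtended by a coaxial disc
    detector (radius `radius`, at axial distance `distance`) at a point
    source. Directions are sampled isotropically; the hit fraction times
    4*pi is the solid angle."""
    rng = np.random.default_rng(seed)
    cos_theta = rng.uniform(-1.0, 1.0, n_samples)
    phi = rng.uniform(0.0, 2.0 * np.pi, n_samples)
    sin_theta = np.sqrt(1.0 - cos_theta**2)
    dx, dy, dz = sin_theta * np.cos(phi), sin_theta * np.sin(phi), cos_theta
    forward = dz > 0
    t = distance / dz[forward]                      # ray parameter at the disc plane
    r_hit = t * np.hypot(dx[forward], dy[forward])  # radial distance in that plane
    hits = np.count_nonzero(r_hit <= radius)
    return 4.0 * np.pi * hits / n_samples

d, r = 5.0, 2.0
analytic = 2.0 * np.pi * (1.0 - d / np.hypot(d, r))  # on-axis point vs. disc
print(solid_angle_disc_mc(d, r), "vs analytic", analytic)
```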
Can I Trust This Software Package? An Exercise in Validation of Computational Results
ERIC Educational Resources Information Center
Shacham, Mordechai; Brauner, Neima; Ashurst, W. Robert; Cutlip, Michael B.
2008-01-01
Mathematical software packages such as Polymath, MATLAB, and Mathcad are currently widely used for engineering problem solving. Applications of several of these packages to typical chemical engineering problems have been demonstrated by Cutlip, et al. The main characteristic of these packages is that they provide a "problem-solving environment…
Burgos, Manuel A; Sevilla García, Maria Agustina; Sanmiguel Rojas, Enrique; Del Pino, Carlos; Fernández Velez, Carlos; Piqueras, Francisco; Esteban Ortega, Francisco
Computational fluid dynamics (CFD) is a mathematical tool to analyse airflow. We present a novel CFD software package to improve results following nasal surgery for obstruction. A group of engineers in collaboration with otolaryngologists have developed a very intuitive CFD software package called MeComLand®, which uses the patient's cross-sectional (tomographic) images, thus showing in detail results originated by CFD such as airflow distributions, velocity profiles, pressure, or wall shear stress. NOSELAND® helps medical evaluation with dynamic reports by using a 3D endoscopic view. Using this CFD-based software a patient underwent virtual surgery (septoplasty, turbinoplasty, spreader grafts, lateral crural J-flap and combinations) to choose the best improvement in nasal flow. To present a novel software package to improve nasal surgery results. To apply the software on CT slices from a patient affected by septal deviation. To evaluate several surgical procedures (septoplasty, turbinectomy, spreader-grafts, J-flap and combination among them) to find the best alternative with less morbidity. The combination of all the procedures does not provide the best nasal flow improvement. Septoplasty plus turbinoplasty obtained the best results. Turbinoplasty alone rendered almost similar results to septoplasty in our simulation. CFD provides useful complementary information to cover diagnosis, prognosis, and follow-up of nasal pathologies based on quantitative magnitudes linked to fluid flow. MeComLand®, DigBody® and NoseLand® represent a non-invasive, low-cost alternative for the functional study of patients with nasal obstruction. Copyright © 2017 Elsevier España, S.L.U. and Sociedad Española de Otorrinolaringología y Cirugía de Cabeza y Cuello. All rights reserved.
Report of AAPM Task Group 162: Software for planar image quality metrology.
Samei, Ehsan; Ikejimba, Lynda C; Harrawood, Brian P; Rong, John; Cunningham, Ian A; Flynn, Michael J
2018-02-01
The AAPM Task Group 162 aimed to provide a standardized approach for the assessment of image quality in planar imaging systems. This report offers a description of the approach as well as the details of the resultant software bundle to measure detective quantum efficiency (DQE) as well as its basis components and derivatives. The methodology and the associated software include the characterization of the noise power spectrum (NPS) from planar images acquired under specific acquisition conditions, the modulation transfer function (MTF) using an edge test object, the DQE, and the effective DQE (eDQE). First, a methodological framework is provided to highlight the theoretical basis of the work. Then, a step-by-step guide is included to assist in proper execution of each component of the code. Lastly, an evaluation of the method is included to validate its accuracy against model-based and experimental data. The code was built under the Macintosh OS X operating system. The software package contains all the source code to permit an experienced user to build the suite on a Linux or other *nix type system. The package further includes manuals and sample images and scripts to demonstrate use of the software for new users. The results of the code are in close alignment with theoretical expectations and published results of experimental data. The methodology and the software package offered in AAPM TG162 can be used as a baseline for characterization of inherent image quality attributes of planar imaging systems. © 2017 American Association of Physicists in Medicine.
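Of the quantities the report standardizes, the noise power spectrum is the most mechanical to reproduce: detrend uniformly exposed regions of interest, Fourier transform them, and average the squared magnitudes with the usual pixel-area normalization. A brief Python sketch of that textbook procedure (this is not the TG162 code; the array layout and the simple mean detrending are assumptions):

```python
import numpy as np

def nps_2d(rois, pixel_pitch):
    """Ensemble-averaged 2D noise power spectrum from a stack of uniformly
    exposed ROIs with shape (n_roi, N, N); pixel_pitch in mm gives NPS in mm^2."""
    n_roi, ny, nx = rois.shape
    nps = np.zeros((ny, nx))
    for roi in rois:
        roi = roi - roi.mean()                      # crude detrending of the large-area signal
        f = np.fft.fftshift(np.fft.fft2(roi))       # centre the zero-frequency bin
        nps += np.abs(f) ** 2
    # Common normalisation: (dx * dy) / (Nx * Ny), averaged over the ROI ensemble
    return nps * (pixel_pitch ** 2) / (nx * ny * n_roi)
```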
Spear, Timothy T; Nishimura, Michael I; Simms, Patricia E
2017-08-01
Advancement in flow cytometry reagents and instrumentation has allowed for simultaneous analysis of large numbers of lineage/functional immune cell markers. Highly complex datasets generated by polychromatic flow cytometry require proper analytical software to answer investigators' questions. A problem among many investigators and flow cytometry Shared Resource Laboratories (SRLs), including our own, is a lack of access to a flow cytometry-knowledgeable bioinformatics team, making it difficult to learn and choose appropriate analysis tool(s). Here, we comparatively assess various multidimensional flow cytometry software packages for their ability to answer a specific biologic question and provide graphical representation output suitable for publication, as well as their ease of use and cost. We assessed polyfunctional potential of TCR-transduced T cells, serving as a model evaluation, using multidimensional flow cytometry to analyze 6 intracellular cytokines and degranulation on a per-cell basis. Analysis of 7 parameters resulted in 128 possible combinations of positivity/negativity, far too complex for basic flow cytometry software to analyze fully. Various software packages were used, analysis methods used in each described, and representative output displayed. Of the tools investigated, automated classification of cellular expression by nonlinear stochastic embedding (ACCENSE) and coupled analysis in Pestle/simplified presentation of incredibly complex evaluations (SPICE) provided the most user-friendly manipulations and readable output, evaluating effects of altered antigen-specific stimulation on T cell polyfunctionality. This detailed approach may serve as a model for other investigators/SRLs in selecting the most appropriate software to analyze complex flow cytometry datasets. Further development and awareness of available tools will help guide proper data analysis to answer difficult biologic questions arising from incredibly complex datasets. © Society for Leukocyte Biology.
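The combinatorial blow-up the authors describe comes directly from treating each of the 7 per-cell readouts as positive or negative, giving 2^7 = 128 categories. A toy enumeration in Python (the marker names are hypothetical placeholders, not the panel used in the study):

```python
from itertools import product

# Hypothetical 7-parameter panel: 6 cytokines plus a degranulation marker
markers = ["IFNg", "TNFa", "IL2", "MIP1b", "IL17", "IL22", "CD107a"]

# Every possible positive/negative combination a cell could fall into
combos = list(product([False, True], repeat=len(markers)))
print(len(combos))   # 128
```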
International Inventory of Software Packages in the Information Field.
ERIC Educational Resources Information Center
Keren, Carl, Ed.; Sered, Irina, Ed.
Designed to provide guidance in selecting appropriate software for library automation, information storage and retrieval, or management of bibliographic databases, this inventory describes 188 computer software packages. The information was obtained through a questionnaire survey of 600 software suppliers and developers who were asked to describe…
ATLAS software configuration and build tool optimisation
NASA Astrophysics Data System (ADS)
Rybkin, Grigory; Atlas Collaboration
2014-06-01
The ATLAS software code base contains over 6 million lines of code organised in about 2000 packages. It makes use of some 100 external software packages, is developed by more than 400 developers and is used by more than 2500 physicists from over 200 universities and laboratories on 6 continents. To meet the challenge of configuring and building this software, the Configuration Management Tool (CMT) is used. CMT expects each package to describe its build targets, build and environment setup parameters, and dependencies on other packages in a text file called requirements, and each project (group of packages) to describe its policies and dependencies on other projects in a text project file. Based on the effective set of configuration parameters read from the requirements files of dependent packages and from project files, CMT commands build the packages, generate the environment for their use, or query the packages. The main focus was on build-time performance, which was optimised through several approaches: reducing the number of reads of requirements files, which are now read once per package by a CMT build command that generates cached requirements files for subsequent CMT build commands; introducing more fine-grained build parallelism at the package task level, i.e., compiling dependent applications and libraries in parallel; optimising the code of the CMT commands used for the build; and introducing package-level build parallelism, i.e., building independent packages in parallel. By default, CMT launches NUMBER-OF-PROCESSORS build commands in parallel. The other focus was on optimisation of CMT commands in general, which made them approximately 2 times faster. CMT can generate a cached requirements file for the environment setup command, which is especially useful for deployment on distributed file systems such as AFS or CERN VMFS. The use of parallelism, caching and code optimisation reduced software build time and environment setup time by several times, increased the efficiency of multi-core computing resource utilisation, and considerably improved the software developer and user experience.
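Package-level parallelism of the kind described here amounts to topologically sorting the package dependency graph and building each wave of mutually independent packages concurrently. A small Python illustration of that scheduling idea (the package names and dependencies are invented for the example; this is not CMT itself):

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Hypothetical package -> set of packages it depends on
deps = {
    "EventCore": set(),
    "TrackTools": {"EventCore"},
    "CaloTools": {"EventCore"},
    "Reco": {"TrackTools", "CaloTools"},
}

ts = TopologicalSorter(deps)
ts.prepare()
while ts.is_active():
    wave = list(ts.get_ready())          # packages with all dependencies built
    print("build in parallel:", wave)    # a real tool would launch these concurrently
    ts.done(*wave)
```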
Glioblastoma Segmentation: Comparison of Three Different Software Packages.
Fyllingen, Even Hovig; Stensjøen, Anne Line; Berntsen, Erik Magnus; Solheim, Ole; Reinertsen, Ingerid
2016-01-01
To facilitate a more widespread use of volumetric tumor segmentation in clinical studies, there is an urgent need for reliable, user-friendly segmentation software. The aim of this study was therefore to compare three different software packages for semi-automatic brain tumor segmentation of glioblastoma; namely BrainVoyager™ QX, ITK-Snap and 3D Slicer, and to make data available for future reference. Pre-operative, contrast enhanced T1-weighted 1.5 or 3 Tesla Magnetic Resonance Imaging (MRI) scans were obtained in 20 consecutive patients who underwent surgery for glioblastoma. MRI scans were segmented twice in each software package by two investigators. Intra-rater, inter-rater and between-software agreement was compared by using differences of means with 95% limits of agreement (LoA), Dice's similarity coefficients (DSC) and Hausdorff distance (HD). Time expenditure of segmentations was measured using a stopwatch. Eighteen tumors were included in the analyses. Inter-rater agreement was highest for BrainVoyager with difference of means of 0.19 mL and 95% LoA from -2.42 mL to 2.81 mL. Between-software agreement and 95% LoA were very similar for the different software packages. Intra-rater, inter-rater and between-software DSC were ≥ 0.93 in all analyses. Time expenditure was approximately 41 min per segmentation in BrainVoyager, and 18 min per segmentation in both 3D Slicer and ITK-Snap. Our main findings were that there is a high agreement within and between the software packages in terms of small intra-rater, inter-rater and between-software differences of means and high Dice's similarity coefficients. Time expenditure was highest for BrainVoyager, but all software packages were relatively time-consuming, which may limit usability in an everyday clinical setting.
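The agreement metrics used here are straightforward to reproduce; the Dice similarity coefficient, for instance, is twice the overlap divided by the sum of the two mask volumes. A minimal Python sketch for binary masks (illustrative only, not the study's pipeline):

```python
import numpy as np

def dice(a, b):
    """Dice similarity coefficient between two binary segmentation masks
    of identical shape; 1.0 means perfect overlap, 0.0 means none."""
    a = np.asarray(a, dtype=bool)
    b = np.asarray(b, dtype=bool)
    intersection = np.logical_and(a, b).sum()
    return 2.0 * intersection / (a.sum() + b.sum())

# Example with two small toy masks
m1 = np.array([[1, 1, 0], [0, 1, 0]])
m2 = np.array([[1, 0, 0], [0, 1, 1]])
print(dice(m1, m2))   # 0.6
```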
A cross-validation package driving Netica with python
Fienen, Michael N.; Plant, Nathaniel G.
2014-01-01
Bayesian networks (BNs) are powerful tools for probabilistically simulating natural systems and emulating process models. Cross-validation is a technique to avoid overfitting resulting from overly complex BNs. Overfitting reduces predictive skill. Cross-validation for BNs is known but rarely implemented, due partly to a lack of software tools designed to work with available BN packages. CVNetica is open-source, written in Python, and extends the Netica software package to perform cross-validation and to read, rebuild, and learn BNs from data. Insights gained from cross-validation, and implications for prediction versus description, are illustrated with a data-driven oceanographic application and a model-emulation application. These examples show that overfitting occurs when BNs become more complex than the supporting data allow, and that overfitting incurs computational costs as well as a reduction in prediction skill. CVNetica evaluates overfitting using several complexity metrics (we used level of discretization) and its impact on performance metrics (we used skill).
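The cross-validation that CVNetica automates follows the familiar pattern of holding out one fold at a time, rebuilding the model on the remaining cases, and scoring predictions on the held-out cases. A generic Python sketch of that loop (it does not use the Netica or CVNetica APIs; the fit and score callables are placeholders the caller would supply):

```python
import numpy as np

def cross_validate(data, k, fit, score, seed=None):
    """Generic k-fold cross-validation over the rows of a NumPy array.
    fit(train_rows) returns a model; score(model, test_rows) returns a skill value."""
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(data)), k)
    results = []
    for i, test_idx in enumerate(folds):
        train_idx = np.hstack([f for j, f in enumerate(folds) if j != i])
        model = fit(data[train_idx])
        results.append(score(model, data[test_idx]))
    return results
```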
Astronomical Software Directory Service
NASA Astrophysics Data System (ADS)
Hanisch, Robert J.; Payne, Harry; Hayes, Jeffrey
1997-01-01
With the support of NASA's Astrophysics Data Program (NRA 92-OSSA-15), we have developed the Astronomical Software Directory Service (ASDS): a distributed, searchable, WWW-based database of software packages and their related documentation. ASDS provides integrated access to 56 astronomical software packages, with more than 16,000 URLs indexed for full-text searching. Users are performing about 400 searches per month. A new aspect of our service is the inclusion of telescope and instrumentation manuals, which prompted us to change the name to the Astronomical Software and Documentation Service. ASDS was originally conceived to serve two purposes: to provide a useful Internet service in an area of expertise of the investigators (astronomical software), and as a research project to investigate various architectures for searching through a set of documents distributed across the Internet. Two of the co-investigators were then installing and maintaining astronomical software as their primary job responsibility. We felt that a service which incorporated our experience in this area would be more useful than a straightforward listing of software packages. The original concept was for a service based on the client/server model, which would function as a directory/referral service rather than as an archive. For performing the searches, we began our investigation with a decision to evaluate the Isite software from the Center for Networked Information Discovery and Retrieval (CNIDR). This software was intended as a replacement for Wide-Area Information Service (WAIS), a client/server technology for performing full-text searches through a set of documents. Isite had some additional features that we considered attractive, and we enjoyed the cooperation of the Isite developers, who were happy to have ASDS as a demonstration project. We ended up staying with the software throughout the project, making modifications to take advantage of new features as they came along, as well as influencing the software development. The Web interface to the search engine is provided by a gateway program written in C++ by a consultant to the project (A. Warnock).
Sahraneshin Samani, Fazel; Moore, Jodene K; Khosravani, Pardis; Ebrahimi, Marzieh
2014-08-01
Flow cytometers designed to analyze large particles are enabling new applications in biology. Data analysis is a critical component of the flow cytometry (FCM) process. In this article we compare features of four free software packages: WinMDI, Cyflogic, Flowing Software, and Cytobank.
Development of a Nevada Statewide Database for Safety Analyst Software
DOT National Transportation Integrated Search
2017-02-02
Safety Analyst is a software package developed by the Federal Highway Administration (FHWA) and twenty-seven participating state and local agencies including the Nevada Department of Transportation (NDOT). The software package implemented many of the...
Evaluation of FNS control systems: software development and sensor characterization.
Riess, J; Abbas, J J
1997-01-01
Functional Neuromuscular Stimulation (FNS) systems activate paralyzed limbs by electrically stimulating motor neurons. These systems have been used to restore functions such as standing and stepping in people with thoracic-level spinal cord injury. Research in our laboratory is directed at the design and evaluation of control algorithms for generating posture and movement. This paper describes software developed for implementing FNS control systems and the characterization of a sensor system used to implement and evaluate controllers in the laboratory. In order to assess FNS control algorithms, we have developed a versatile software package using LabVIEW (National Instruments Corp.). This package provides the ability to interface with sensor systems via serial port or A/D board, implement data processing and real-time control algorithms, and interface with neuromuscular stimulation devices. In our laboratory, we use the Flock of Birds (Ascension Technology Corp.) motion tracking sensor system to monitor limb segment position and orientation (6 degrees of freedom). Errors in the sensor system have been characterized and nonlinear polynomial models have been developed to account for these errors. With this compensation, the error in the distance measurement is reduced by 90% so that the maximum error is less than 1 cm.
ERIC Educational Resources Information Center
Pollard, Jim
This report reviews eight IBM-compatible software packages that are available to secondary schools to teach computer-aided drafting (CAD). Software packages to be considered were selected following reviews of CAD periodicals, computers in education periodicals, advertisements, and recommendations of teachers. The packages were then rated by…
Differential maneuvering simulator data reduction and analysis software
NASA Technical Reports Server (NTRS)
Beasley, G. P.; Sigman, R. S.
1972-01-01
A multielement data reduction and analysis software package has been developed for use with the Langley differential maneuvering simulator (DMS). This package, which has several independent elements, was developed to support all phases of DMS aircraft simulation studies with a variety of both graphical and tabular information. The overall software package is considered unique because of the number, diversity, and sophistication of the element programs available for use in a single study. The purpose of this paper is to discuss the overall DMS data reduction and analysis package by reviewing the development of the various elements of the software, showing typical results that can be obtained, and discussing how each element can be used.
Bayesian Hierarchical Random Effects Models in Forensic Science.
Aitken, Colin G G
2018-01-01
Statistical modeling of the evaluation of evidence with the use of the likelihood ratio has a long history. It dates from the Dreyfus case at the end of the nineteenth century through the work at Bletchley Park in the Second World War to the present day. The development received a significant boost in 1977 with a seminal work by Dennis Lindley which introduced a Bayesian hierarchical random effects model for the evaluation of evidence with an example of refractive index measurements on fragments of glass. Many models have been developed since then. The methods have now been sufficiently well-developed and have become so widespread that it is timely to try to provide a software package to assist in their implementation. With that in mind, a project (SAILR: Software for the Analysis and Implementation of Likelihood Ratios) was funded by the European Network of Forensic Science Institutes through their Monopoly programme to develop a software package for use by forensic scientists world-wide that would assist in the statistical analysis and implementation of the approach based on likelihood ratios. It is the purpose of this document to provide a short review of a small part of this history. The review also provides a background, or landscape, for the development of some of the models within the SAILR package, and references to SAILR are made as appropriate.
Large Scale Software Building with CMake in ATLAS
NASA Astrophysics Data System (ADS)
Elmsheuser, J.; Krasznahorkay, A.; Obreshkov, E.; Undrus, A.; ATLAS Collaboration
2017-10-01
The offline software of the ATLAS experiment at the Large Hadron Collider (LHC) serves as the platform for detector data reconstruction, simulation and analysis. It is also used in the detector’s trigger system to select LHC collision events during data taking. The ATLAS offline software consists of several million lines of C++ and Python code organized in a modular design of more than 2000 specialized packages. Because of different workflows, many stable numbered releases are in parallel production use. To accommodate specific workflow requests, software patches with modified libraries are distributed on top of existing software releases on a daily basis. The different ATLAS software applications also require a flexible build system that strongly supports unit and integration tests. Within the last year this build system was migrated to CMake. A CMake configuration has been developed that allows one to easily set up and build the above mentioned software packages. This also makes it possible to develop and test new and modified packages on top of existing releases. The system also allows one to detect and execute partial rebuilds of the release based on single package changes. The build system makes use of CPack for building RPM packages out of the software releases, and CTest for running unit and integration tests. We report on the migration and integration of the ATLAS software to CMake and show working examples of this large scale project in production.
Western aeronautical test range real-time graphics software package MAGIC
NASA Technical Reports Server (NTRS)
Malone, Jacqueline C.; Moore, Archie L.
1988-01-01
The master graphics interactive console (MAGIC) software package used on the Western Aeronautical Test Range (WATR) of the NASA Ames Research Center is described. MAGIC is a resident real-time research tool available to flight researchers and scientists in the NASA mission control centers of the WATR at the Dryden Flight Research Facility at Edwards, California. The hardware configuration and capabilities of the real-time software package are also discussed.
CheMentor Software System by H. A. Peoples
NASA Astrophysics Data System (ADS)
Reid, Brian P.
1997-09-01
CheMentor Software System H. A. Peoples. Computerized Learning Enhancements: http://www.ecis.com/~clehap; email: clehap@ecis.com; 1996 - 1997. CheMentor is a series of software packages for introductory-level chemistry, which includes Practice Items (I), Stoichiometry (I), Calculating Chemical Formulae, and the CheMentor Toolkit. The first three packages provide practice problems for students and various types of help to solve them; the Toolkit includes "calculators" for determining chemical quantities as well as the Practice Items (I) set of problems. The set of software packages is designed so that each individual product acts as a module of a common CheMentor program. As the name CheMentor implies, the software is designed as a "mentor" for students learning introductory chemistry concepts and problems. The typical use of the software would be by individual students (or perhaps small groups) as an adjunct to lectures. CheMentor is a HyperCard application and the modules are HyperCard stacks. The requirements to run the packages include a Macintosh computer with at least 1 MB of RAM, a hard drive with several MB of available space depending upon the packages selected (10 MB were required for all the packages reviewed here), and the Mac operating system 6.0.5 or later.
ERIC Educational Resources Information Center
Northwest Regional Educational Lab., Portland, OR.
This guide developed by MicroSIFT, a clearinghouse for microcomputer-based educational software and courseware, provides background information and forms to aid teachers and other educators in evaluating available microcomputer courseware. The evaluation process comprises six stages: (1) sifting, which screens out those programs that are not…
ERIC Educational Resources Information Center
Thompson, Douglas E.
2013-01-01
In today's complex music software packages, many features can remain unexplored and unused. Software plug-ins--available in most every music software package, yet easily overlooked in the software's basic operations--are one such feature. In this article, I introduce readers to plug-ins and offer tips for purchasing plug-ins I have…
Kuharev, Jörg; Navarro, Pedro; Distler, Ute; Jahn, Olaf; Tenzer, Stefan
2015-09-01
Label-free quantification (LFQ) based on data-independent acquisition workflows is currently gaining popularity. Several software tools have been recently published or are commercially available. The present study focuses on the evaluation of three different software packages (Progenesis, synapter, and ISOQuant) supporting ion-mobility-enhanced data-independent acquisition data. In order to benchmark the LFQ performance of the different tools, we generated two hybrid proteome samples of defined quantitative composition containing tryptically digested proteomes of three different species (mouse, yeast, Escherichia coli). This model dataset simulates complex biological samples containing large numbers of both unregulated (background) proteins and up- and downregulated proteins with exactly known ratios between samples. We determined the number and dynamic range of quantifiable proteins and analyzed the influence of the applied algorithms (retention time alignment, clustering, normalization, etc.) on the quantification results. Analysis of technical reproducibility revealed median coefficients of variation of reported protein abundances below 5% for MSE data for Progenesis and ISOQuant. Regarding accuracy of LFQ, evaluation with synapter and ISOQuant yielded superior results compared to Progenesis. In addition, we discuss reporting formats and user friendliness of the software packages. The data generated in this study have been deposited to the ProteomeXchange Consortium with identifier PXD001240 (http://proteomecentral.proteomexchange.org/dataset/PXD001240). © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
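The reproducibility figure quoted here is the median, across proteins, of the coefficient of variation of reported abundances over technical replicates. A short Python sketch of that summary statistic (the array layout is an assumption; this is not part of any of the packages compared):

```python
import numpy as np

def median_protein_cv(abundances):
    """Median coefficient of variation (%) across proteins.
    abundances: array of shape (n_proteins, n_replicates) of reported intensities."""
    abundances = np.asarray(abundances, dtype=float)
    mean = abundances.mean(axis=1)
    sd = abundances.std(axis=1, ddof=1)          # sample standard deviation per protein
    cv_percent = 100.0 * sd / mean
    return np.median(cv_percent)

# Toy example with three proteins and three technical replicates
demo = np.array([[1.0e6, 1.05e6, 0.97e6],
                 [2.3e5, 2.1e5, 2.4e5],
                 [8.0e4, 8.3e4, 7.9e4]])
print(median_protein_cv(demo))
```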
Approximated adjusted fractional Bayes factors: A general method for testing informative hypotheses.
Gu, Xin; Mulder, Joris; Hoijtink, Herbert
2018-05-01
Informative hypotheses are increasingly being used in psychological sciences because they adequately capture researchers' theories and expectations. In the Bayesian framework, the evaluation of informative hypotheses often makes use of default Bayes factors such as the fractional Bayes factor. This paper approximates and adjusts the fractional Bayes factor such that it can be used to evaluate informative hypotheses in general statistical models. In the fractional Bayes factor a fraction parameter must be specified which controls the amount of information in the data used for specifying an implicit prior. The remaining fraction is used for testing the informative hypotheses. We discuss different choices of this parameter and present a scheme for setting it. Furthermore, a software package is described which computes the approximated adjusted fractional Bayes factor. Using this software package, psychological researchers can evaluate informative hypotheses by means of Bayes factors in an easy manner. Two empirical examples are used to illustrate the procedure. © 2017 The British Psychological Society.
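For reference, the fractional Bayes factor on which the approximation builds takes its standard form (as introduced by O'Hagan), in which a fraction b of the likelihood is used to turn a vague prior into an implicit prior and the remainder is used for testing; a sketch of the definition for two hypotheses H_0 and H_1 (the paper's contribution, per the abstract, is to approximate and adjust this quantity for informative hypotheses and to give a scheme for choosing b):

```latex
B^{F}_{01}(b) \;=\; \frac{q_0(b)}{q_1(b)},
\qquad
q_i(b) \;=\;
\frac{\displaystyle\int \pi_i(\theta_i)\, f(x \mid \theta_i)\, d\theta_i}
     {\displaystyle\int \pi_i(\theta_i)\, f(x \mid \theta_i)^{\,b}\, d\theta_i},
\qquad i = 0, 1 .
```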
Introduction to Software Packages. [Final Report.
ERIC Educational Resources Information Center
Frankel, Sheila, Ed.; And Others
This document provides an introduction to applications computer software packages that support functional managers in government and encourages the use of such packages as an alternative to in-house development. A review of current application areas includes budget/project management, financial management/accounting, payroll, personnel,…
PAVECHECK : training material updated user's manual including GPS.
DOT National Transportation Integrated Search
2009-01-01
PAVECHECK is a software package used to integrate nondestructive test data from various testing systems to provide the pavement engineer with a comprehensive evaluation of both surface and subsurface conditions. This User's Manual is intended to demo...
User manual of the CATSS system (version 1.0) communication analysis tool for space station
NASA Technical Reports Server (NTRS)
Tsang, C. S.; Su, Y. T.; Lindsey, W. C.
1983-01-01
The Communication Analysis Tool for the Space Station (CATSS) is a FORTRAN-language software package capable of predicting communications link performance for the Space Station (SS) communication and tracking (C&T) system. The interactive software package was developed to run on DEC/VAX computers. CATSS models and evaluates the various C&T links of the SS, covering modulation schemes such as Binary Phase Shift Keying (BPSK), BPSK with direct-sequence spread spectrum (PN/BPSK), and M-ary Frequency Shift Keying with frequency hopping (FH/MFSK). An optical space communication link is also included. CATSS is a C&T system engineering tool used to predict and analyze system performance in different link environments. System weaknesses are identified by evaluating performance with varying system parameters, and system tradeoffs for different parameter values are made based on the performance predictions.
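Link-performance predictions of this kind ultimately reduce to bit-error-rate curves for the chosen modulation. For coherent BPSK on an additive white Gaussian noise channel, the classical closed form is P_b = Q(sqrt(2*Eb/N0)); a quick Python sketch of that textbook formula (not the CATSS model itself):

```python
import math

def bpsk_ber(ebno_db):
    """Theoretical bit-error rate of coherent BPSK on an AWGN channel:
    P_b = Q(sqrt(2*Eb/N0)) = 0.5 * erfc(sqrt(Eb/N0))."""
    ebno = 10.0 ** (ebno_db / 10.0)      # convert dB to linear ratio
    return 0.5 * math.erfc(math.sqrt(ebno))

for db in (0, 4, 8, 12):
    print(f"Eb/N0 = {db:2d} dB  ->  BER = {bpsk_ber(db):.2e}")
```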
NASA Technical Reports Server (NTRS)
Klumpp, A. R.
1994-01-01
Ten families of subprograms are bundled together for the General-Purpose Ada Packages. The families bring to Ada many features from HAL/S, PL/I, FORTRAN, and other languages. These families are: string subprograms (INDEX, TRIM, LOAD, etc.); scalar subprograms (MAX, MIN, REM, etc.); array subprograms (MAX, MIN, PROD, SUM, GET, and PUT); numerical subprograms (EXP, CUBIC, etc.); service subprograms (DATE_TIME function, etc.); Linear Algebra II; Runge-Kutta integrators; and three text I/O families of packages. In two cases, a family consists of a single non-generic package. In all other cases, a family comprises a generic package and its instances for a selected group of scalar types. All generic packages are designed to be easily instantiated for the types declared in the user facility. The linear algebra package is LINRAG2. This package includes subprograms supplementing those in NPO-17985, An Ada Linear Algebra Package Modeled After HAL/S (LINRAG). Please note that LINRAG2 cannot be compiled without LINRAG. Most packages have widespread applicability, although some are oriented for avionics applications. All are designed to facilitate writing new software in Ada. Several of the packages use conventions introduced by other programming languages. A package of string subprograms is based on HAL/S (a language designed for the avionics software in the Space Shuttle) and PL/I. Packages of scalar and array subprograms are taken from HAL/S or generalized current Ada subprograms. A package of Runge-Kutta integrators is patterned after a built-in MAC (MIT Algebraic Compiler) integrator. Those packages modeled after HAL/S make it easy to translate existing HAL/S software to Ada. The General-Purpose Ada Packages program source code is available on two 360K 5.25" MS-DOS format diskettes. The software was developed using VAX Ada v1.5 under DEC VMS v4.5. It should be portable to any validated Ada compiler and it should execute either interactively or in batch. The largest package requires 205K of main memory on a DEC VAX running VMS. The software was developed in 1989, and is a copyrighted work with all copyright vested in NASA.
Evaluating foodservice software: a suggested approach.
Fowler, K D
1986-09-01
In an era of cost containment, the computer has become a viable management tool. Its use in health care has demonstrated accelerated growth in recent years, and a literature review supports an increased trend in this direction. Foodservice, which is a major cost center, is no exception to this predicted trend. Because software has proliferated, foodservice managers and dietitians are experiencing growing concern about how to evaluate the numerous software packages from which to choose. A suggested approach to evaluating software is offered to dietitians and managers alike to lessen the confusion in software selection and to improve post-purchase satisfaction with the system. Steps of the software evaluation approach include: delineation of goals, assessment of needs, assignment of value weight factors, development of a vendor checklist, survey of vendors by means of the vendor checklist and elimination of inappropriate systems, thorough development of the request for proposal (RFP) for submission to the selected vendors, analysis of the returned RFPs in terms of system features and cost factors, and selection of the system(s) for implementation.
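The "value weight factor" step described here amounts to a simple weighted-sum scoring of each candidate system against the needs assessment. A toy Python illustration of that idea (the criteria, weights, and ratings are all invented for the example and do not come from the article):

```python
# Hypothetical criteria and their importance weights (sum to 1.0)
weights = {"recipe costing": 0.30, "inventory": 0.25, "menu planning": 0.25, "vendor support": 0.20}

# Hypothetical 1-5 ratings of two candidate packages against each criterion
ratings = {
    "Package A": {"recipe costing": 4, "inventory": 3, "menu planning": 5, "vendor support": 2},
    "Package B": {"recipe costing": 3, "inventory": 5, "menu planning": 3, "vendor support": 4},
}

for name, score in ratings.items():
    total = sum(weights[c] * score[c] for c in weights)   # weighted-sum score
    print(name, round(total, 2))
```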
Gruber, Bernd; Unmack, Peter J; Berry, Oliver F; Georges, Arthur
2018-05-01
Although vast technological advances have been made and genetic software packages are growing in number, it is not a trivial task to analyse SNP data. We announce a new r package, dartr, enabling the analysis of single nucleotide polymorphism data for population genomic and phylogenomic applications. dartr provides user-friendly functions for data quality control and marker selection, and permits rigorous evaluations of conformation to Hardy-Weinberg equilibrium, gametic-phase disequilibrium and neutrality. The package reports standard descriptive statistics, permits exploration of patterns in the data through principal components analysis and conducts standard F-statistics, as well as basic phylogenetic analyses, population assignment, isolation by distance, and exports data to a variety of commonly used downstream applications (e.g., newhybrids, faststructure and phylogeny applications) outside of the r environment. The package serves two main purposes: first, a user-friendly approach to lower the hurdle of analysing such data; therefore, the package comes with a detailed tutorial targeted at the r beginner to allow data analysis without requiring deep knowledge of r. Second, we use a single, well-established format, genlight from the adegenet package, as input for all our functions to avoid data reformatting. By strictly using the genlight format, we hope to facilitate this format as the de facto standard of future software developments and hence reduce the format jungle of genetic data sets. The dartr package is available via the r CRAN network and GitHub. © 2017 John Wiley & Sons Ltd.
Clark, Robin A; Shoaib, Mohammed; Hewitt, Katherine N; Stanford, S Clare; Bate, Simon T
2012-08-01
InVivoStat is a free-to-use statistical software package for analysis of data generated from animal experiments. The package is designed specifically for researchers in the behavioural sciences, where exploiting the experimental design is crucial for reliable statistical analyses. This paper compares the analysis of three experiments conducted using InVivoStat with other widely used statistical packages: SPSS (V19), PRISM (V5), UniStat (V5.6) and Statistica (V9). We show that InVivoStat provides results that are similar to those from the other packages and, in some cases, are more advanced. This investigation provides evidence of further validation of InVivoStat and should strengthen users' confidence in this new software package.
Usability study of clinical exome analysis software: top lessons learned and recommendations.
Shyr, Casper; Kushniruk, Andre; Wasserman, Wyeth W
2014-10-01
New DNA sequencing technologies have revolutionized the search for genetic disruptions. Targeted sequencing of all protein coding regions of the genome, called exome analysis, is actively used in research-oriented genetics clinics, with the transition to exomes as a standard procedure underway. This transition is challenging; identification of potentially causal mutation(s) amongst ~10^6 variants requires specialized computation in combination with expert assessment. This study analyzes the usability of user interfaces for clinical exome analysis software. There are two study objectives: (1) To ascertain the key features of successful user interfaces for clinical exome analysis software based on the perspective of expert clinical geneticists, (2) To assess user-system interactions in order to reveal strengths and weaknesses of existing software, inform future design, and accelerate the clinical uptake of exome analysis. Surveys, interviews, and cognitive task analysis were performed for the assessment of two next-generation exome sequence analysis software packages. The subjects included ten clinical geneticists who interacted with the software packages using the "think aloud" method. Subjects' interactions with the software were recorded in their clinical office within an urban research and teaching hospital. All major user interface events (from the user interactions with the packages) were time-stamped and annotated with coding categories to identify usability issues in order to characterize desired features and deficiencies in the user experience. We detected 193 usability issues, the majority of which concern interface layout and navigation, and the resolution of reports. Our study highlights gaps in specific software features typical within exome analysis. The clinicians perform best when the flow of the system is structured into well-defined yet customizable layers for incorporation within the clinical workflow. The results highlight opportunities to dramatically accelerate clinician analysis and interpretation of patient genomic data. We present the first application of usability methods to evaluate software interfaces in the context of exome analysis. Our results highlight how the study of user responses can lead to identification of usability issues and challenges and reveal software reengineering opportunities for improving clinical next-generation sequencing analysis. While the evaluation focused on two distinctive software tools, the results are general and should inform active and future software development for genome analysis software. As large-scale genome analysis becomes increasingly common in healthcare, it is critical that efficient and effective software interfaces are provided to accelerate clinical adoption of the technology. Implications for improved design of such applications are discussed. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
Evaluation of Job Queuing/Scheduling Software: Phase I Report
NASA Technical Reports Server (NTRS)
Jones, James Patton
1996-01-01
The recent proliferation of high-performance workstations and the increased reliability of parallel systems have illustrated the need for robust job management systems to support parallel applications. To address this issue, the Numerical Aerodynamic Simulation (NAS) supercomputer facility compiled a requirements checklist for job queuing/scheduling software. Next, NAS began an evaluation of the leading job management system (JMS) software packages against the checklist. This report describes the three-phase evaluation process and presents the results of Phase 1: Capabilities versus Requirements. We show that JMS support for running parallel applications on clusters of workstations and parallel systems is still insufficient, even in the leading JMSs. However, by ranking each JMS evaluated against the requirements, we provide data that will be useful to other sites in selecting a JMS.
RSEIS and RFOC: Seismic Analysis in R
NASA Astrophysics Data System (ADS)
Lees, J. M.
2015-12-01
Open software is essential for reproducible scientific exchange. R packages provide a platform for developing seismological investigation software that can be properly documented and traced for data processing. A suite of R packages designed for a wide range of seismic analyses is currently available in the free software platform R. R is a software platform based on the S language developed at Bell Labs decades ago. Routines in R can be run as standalone function calls or developed in object-oriented mode. R comes with a base set of routines and thousands of user-developed packages. The packages developed at UNC include subroutines and interactive codes for processing seismic data, analyzing geographic information (GIS) and inverting data involved in a variety of geophysical applications. Packages related to seismic analysis currently available on CRAN (Comprehensive R Archive Network, http://www.r-project.org/) include RSEIS, Rquake, GEOmap, RFOC, zoeppritz, RTOMO, geophys, Rwave, PEIP, hht, and rFDSN. These provide signal processing, data management, mapping, earthquake location, deconvolution, focal mechanisms, wavelet transforms, Hilbert-Huang transforms, tomographic inversion, and Mogi deformation, among other useful functionality. All software in R packages is required to have detailed documentation, making the exchange and modification of existing software easy. In this presentation, I will focus on the packages RSEIS and RFOC, showing examples from a variety of seismic analyses. The R approach has similarities to the popular (and expensive) MATLAB platform, although R is open source and free to download.
SIRU development. Volume 3: Software description and program documentation
NASA Technical Reports Server (NTRS)
Oehrle, J.
1973-01-01
The development and initial evaluation of a strapdown inertial reference unit (SIRU) system are discussed. The SIRU configuration is a modular inertial subsystem with hardware and software features that achieve fault-tolerant operational capabilities. The SIRU redundant hardware design is formulated around a six-gyro, six-accelerometer instrument module package. The six-axis array provides redundant independent sensing, and the symmetry enables the formulation of an optimal software redundant data processing structure with self-contained fault detection and isolation (FDI) capabilities. The basic SIRU software coding system used in the DDP-516 computer is documented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Astaf'ev, S. B., E-mail: bard@ns.crys.ras.ru; Shchedrin, B. M.; Yanusova, L. G.
2012-01-15
The main principles of developing the Basic Analysis of Reflectometry Data (BARD) software package, which is aimed at obtaining a unified (standardized) tool for analyzing the structure of thin multilayer films and nanostructures of different nature based on reflectometry data, are considered. This software package contains both traditionally used procedures for processing reflectometry data and the authors' original developments on the basis of new methods for carrying out and analyzing reflectometry experiments. The structure of the package, its functional possibilities, examples of application, and prospects of development are reviewed.
A Comparison of Authoring Software for Developing Mathematics Self-Learning Software Packages.
ERIC Educational Resources Information Center
Suen, Che-yin; Pok, Yang-ming
Four years ago, the authors started to develop self-paced mathematics learning software called NPMaths by using an authoring package called Tencore. However, NPMaths had some weak points. A development team was hence formed to develop similar software called Mathematics On Line. This time the team used another development language called…
Comparison of particle tracking algorithms in commercial CFD packages: sedimentation and diffusion.
Robinson, Risa J; Snyder, Pam; Oldham, Michael J
2007-05-01
Computational fluid dynamics (CFD) modeling software has enabled microdosimetry patterns of inhaled toxins and toxicants to be predicted and visualized, and is being used in inhalation toxicology and risk assessment. These predicted microdosimetry patterns in airway structures are derived from predicted airflow patterns within the airways and from the particle tracking algorithms used in CFD software packages. Although these commercial CFD codes have been tested for accuracy under various conditions, they have not been well tested for respiratory flows in general, nor has the accuracy of their particle tracking algorithms been well studied. In this study, three software packages, Fluent Discrete Phase Model (DPM), Fluent Fine Particle Model (FPM), and ANSYS CFX, were evaluated. Sedimentation and diffusion were each isolated in a straight-tube geometry and tested for accuracy. A range of flow rates corresponding to adult low activity (minute ventilation = 10 L/min) and to heavy exertion (minute ventilation = 60 L/min) was tested by varying the dimensionless diffusion and sedimentation parameters over the range found for the Weibel symmetric 23-generation lung morphology. Numerical results for fully developed parabolic and uniform (slip) profiles were compared, respectively, to the analytical sedimentation solutions of Pich (1972) and Yu (1977); the sedimentation equations of Schum and Yeh (1980) were also compared. Numerical results for diffusional deposition were compared to the analytical solutions of Ingham (1975) for parabolic and uniform profiles. Significant differences were found among the various CFD software packages and between numerical and analytical solutions. Therefore, it is prudent to validate CFD predictions against analytical solutions in an idealized geometry before tackling the complex geometries of the respiratory tract.
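The two mechanisms isolated in the straight-tube tests are governed by a particle's Stokes settling velocity and its Brownian diffusion coefficient, from which the dimensionless sedimentation and diffusion parameters are built. A small Python sketch of those two standard expressions (textbook formulas with slip correction neglected; the values are illustrative and not taken from any of the CFD packages):

```python
import math

K_B = 1.380649e-23     # Boltzmann constant, J/K

def settling_velocity(d_p, rho_p=1000.0, mu=1.8e-5, g=9.81):
    """Stokes terminal settling velocity of a sphere: v_s = rho_p * d_p**2 * g / (18 * mu)."""
    return rho_p * d_p ** 2 * g / (18.0 * mu)

def diffusion_coefficient(d_p, mu=1.8e-5, T=310.0):
    """Stokes-Einstein Brownian diffusion coefficient: D = k_B * T / (3 * pi * mu * d_p)."""
    return K_B * T / (3.0 * math.pi * mu * d_p)

for d_p in (0.1e-6, 1e-6, 5e-6):   # particle diameters in metres
    print(d_p, settling_velocity(d_p), diffusion_coefficient(d_p))
```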
Evaluation of Emergency Vehicle Signal Preemption on the Route 7 Virginia Corridor
DOT National Transportation Integrated Search
1999-07-01
This study analyzed the impact of emergency vehicle traffic signal preemption across three coordinated intersections on Route 7 (Leesburg Pike near Landsdowne) in Virginia. FHWA's Traffic Software Integrated System (TSIS) package, which includes the ...
Technology Assessment Software Package: Final Report.
ERIC Educational Resources Information Center
Hutinger, Patricia L.
This final report describes the Technology Assessment Software Package (TASP) Project, which produced developmentally appropriate technology assessment software for children from 18 months through 8 years of age who have moderate to severe disabilities that interfere with their interaction with people, objects, tasks, and events in their…
The Hidden Cost of Buying a Computer.
ERIC Educational Resources Information Center
Johnson, Michael
1983-01-01
In order to process data in a computer, application software must be either developed or purchased. Costs for modifications of the software package and maintenance are often hidden. The decision to buy or develop software packages should be based upon factors of time and maintenance. (MLF)
METAPHOR: Programmer's guide, Version 1
NASA Technical Reports Server (NTRS)
Furchtgott, D. G.
1979-01-01
The internal structure of the Michigan Evaluation Aid for Performability (METAPHOR), an interactive software package that facilitates performability modeling and evaluation, is described. Revised and supplemented guides are prepared in order to maintain up-to-date documentation of the system. Programmed tools that facilitate each step of performability model construction and model solution are given.
Software design for analysis of multichannel intracardial and body surface electrocardiograms.
Potse, Mark; Linnenbank, André C; Grimbergen, Cornelis A
2002-11-01
Analysis of multichannel ECG recordings (body surface maps (BSMs) and intracardial maps) requires special software. We created a software package and a user interface on top of a commercial data analysis package (MATLAB) by a combination of high-level and low-level programming. Our software was created to satisfy the needs of a diverse group of researchers. It can handle a large variety of recording configurations. It allows for interactive usage through a fast and robust user interface, and batch processing for the analysis of large amounts of data. The package is user-extensible, includes routines for both common and experimental data processing tasks, and works on several computer platforms. The source code is made intelligible using software for structured documentation and is available to the users. The package is currently used by more than ten research groups analysing ECG data worldwide.
1986 Petroleum Software Directory. [800 mini, micro and mainframe computer software packages
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1985-01-01
Pennwell's 1986 Petroleum Software Directory is a complete listing of software created specifically for the petroleum industry. Details are provided on over 800 mini, micro and mainframe computer software packages from more than 250 different companies. An accountant can locate programs to automate bookkeeping functions in large oil and gas production firms. A pipeline engineer will find programs designed to calculate line flow and wellbore pressure drop.
Dill: an algorithm and a symbolic software package for doing classical supersymmetry calculations
NASA Astrophysics Data System (ADS)
Lučić, Vladan
1995-11-01
An algorithm is presented that formalizes the different steps in a classical Supersymmetric (SUSY) calculation. Based on the algorithm, Dill, a symbolic software package that can perform the calculations, is developed in the Mathematica programming language. While the algorithm is quite general, the package is created for the 4-D, N = 1 model. Nevertheless, with little modification, the package could be used for other SUSY models. The package has been tested and some of the results are presented.
NASA Astrophysics Data System (ADS)
Ramkilowan, A.; Griffith, D. J.
2017-10-01
Surveillance modelling in terms of the standard Detect, Recognise and Identify (DRI) thresholds remains a key requirement for determining the effectiveness of surveillance sensors. With readily available computational resources it has become feasible to perform statistically representative evaluations of the effectiveness of these sensors. A new capability for performing this Monte-Carlo type analysis is demonstrated in the MORTICIA (Monte-Carlo Optical Rendering for Theatre Investigations of Capability under the Influence of the Atmosphere) software package developed at the Council for Scientific and Industrial Research (CSIR). This first-generation, Python-based, open-source integrated software package, currently in the alpha stage of development, aims to provide all the functionality required to perform statistical investigations of the effectiveness of optical surveillance systems in specific or generic deployment theatres. This includes modelling of the mathematical and physical processes that govern, amongst other components of a surveillance system, a sensor's detector and optical components, a target and its background, as well as the intervening atmospheric influences. In this paper we discuss integral aspects of the bespoke framework that are critical to the longevity of all subsequent modelling efforts. Additionally, some preliminary results are presented.
An Ada Linear-Algebra Software Package Modeled After HAL/S
NASA Technical Reports Server (NTRS)
Klumpp, Allan R.; Lawson, Charles L.
1990-01-01
New avionics software written more easily. Software package extends Ada programming language to include linear-algebra capabilities similar to those of HAL/S programming language. Designed for such avionics applications as Space Station flight software. In addition to built-in functions of HAL/S, package incorporates quaternion functions used in Space Shuttle and Galileo projects and routines from LINPAK solving systems of equations involving general square matrices. Contains two generic programs: one for floating-point computations and one for integer computations. Written on IBM/AT personal computer running under PC DOS, v.3.1.
Zhang, Hui; Lu, Naiji; Feng, Changyong; Thurston, Sally W.; Xia, Yinglin; Tu, Xin M.
2011-01-01
The generalized linear mixed-effects model (GLMM) is a popular paradigm to extend models for cross-sectional data to a longitudinal setting. When applied to modeling binary responses, different software packages and even different procedures within a package may give quite different results. In this report, we describe the statistical approaches that underlie these different procedures and discuss their strengths and weaknesses when applied to fit correlated binary responses. We then illustrate these considerations by applying these procedures implemented in some popular software packages to simulated and real study data. Our simulation results indicate a lack of reliability for most of the procedures considered, which carries significant implications for applying such popular software packages in practice.
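The model whose fitting differs across packages is the standard random-effects logistic GLMM; a sketch of its usual form is given below. The discrepancies between procedures arise mainly from how the integral over the random effects b_i is handled (for example, adaptive quadrature or Laplace approximation versus penalized quasi-likelihood linearisation):

```latex
\operatorname{logit}\,\Pr\!\left(y_{ij}=1 \mid \mathbf{b}_i\right)
  = \mathbf{x}_{ij}^{\top}\boldsymbol{\beta} + \mathbf{z}_{ij}^{\top}\mathbf{b}_i,
\qquad
\mathbf{b}_i \sim N(\mathbf{0}, \mathbf{D}).
```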
NDAS Hardware Translation Layer Development
NASA Technical Reports Server (NTRS)
Nazaretian, Ryan N.; Holladay, Wendy T.
2011-01-01
The NASA Data Acquisition System (NDAS) project aims to replace all DAS software at NASA's rocket testing facilities. A software-hardware translation layer is required so the software can properly communicate with the hardware. Since the hardware at each test stand varies, drivers for each stand have to be written. These drivers act more like plugins for the software: if the software is being used at E3, it should point to the E3 driver package; if it is being used at B2, it should point to the B2 driver package. The driver packages should also contain hardware drivers that are universal to the DAS system. For example, since A1, A2, and B2 all use the Preston 8300AU signal conditioners, the driver for those three stands should be the same and updated collectively.
Academic Web Authoring Multimedia Development and Course Management Tools
ERIC Educational Resources Information Center
Halloran, Margaret E.
2005-01-01
Course management software enables faculty members to learn one software package for web-based curriculum, assessment, synchronous and asynchronous discussions, collaborative work, multimedia and interactive resource development. There are as many as 109 different course management software packages on the market and several studies have evaluated…
Erickson, Collin B; Ankenman, Bruce E; Sanchez, Susan M
2018-06-01
This data article provides the summary data from tests comparing various Gaussian process software packages. Each spreadsheet represents a single function or type of function using a particular input sample size. In each spreadsheet, a row gives the results for a particular replication using a single package. Within each spreadsheet there are the results from eight Gaussian process model-fitting packages on five replicates of the surface. There is also one spreadsheet comparing the results from two packages performing stochastic kriging. These data enable comparisons between the packages to determine which package will give users the best results.
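The comparisons summarised here all follow the same pattern: fit a Gaussian process emulator to a designed sample of a test surface and score its predictions on held-out points. A small Python sketch of one such replicate using scikit-learn (the test function, design size, and kernel are illustrative assumptions; none of the benchmarked packages is shown):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(30, 2))                  # small space-filling-ish design
y = np.sin(6 * X[:, 0]) + X[:, 1] ** 2               # hypothetical deterministic test surface

# Anisotropic squared-exponential (RBF) kernel with a signal-variance term
kernel = ConstantKernel() * RBF(length_scale=[0.2, 0.2])
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

X_test = rng.uniform(0, 1, size=(200, 2))
y_true = np.sin(6 * X_test[:, 0]) + X_test[:, 1] ** 2
rmse = np.sqrt(np.mean((gp.predict(X_test) - y_true) ** 2))
print("hold-out RMSE:", rmse)
```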
Information systems analysis approach in hospitals: a national survey.
Wong, B K; Sellaro, C L; Monaco, J A
1995-03-01
A survey of 216 hospitals reveals that some hospitals do not conduct cost-benefit analyses or analyze possible adverse effects in feasibility studies. In determining and analyzing system requirements, external factors that initiate the transaction are not examined, and computer-aided software engineering (CASE) tools are seldom used. Some hospitals do not investigate the advantages and disadvantages of using in-house-developed software versus purchased software packages in the evaluation of alternatives. The survey finds that, overall, most hospitals follow the traditional systems development life cycle (SDLC) approach in analyzing information systems.
A Network Primer: How They're Used...and How They Could Be Used.
ERIC Educational Resources Information Center
Lehrer, Ariella
1988-01-01
Examined are large curriculum software packages that currently dominate school networks. Indicates ways that networks could serve schools. Discusses different Integrated Learning Systems (ILS), evaluates their use and proposes future uses of these networks. (CW)
Adams, Jean V.; Slaght, Karen; Boogaard, Michael A.
2016-01-01
The authors developed a package, LW1949, for use with the statistical software R to automatically carry out the manual steps of Litchfield and Wilcoxon's method of evaluating dose–effect experiments. The LW1949 package consistently finds the best fitting dose–effect relation by minimizing the chi-squared statistic of the observed and expected number of affected individuals and substantially speeds up the line-fitting process and other calculations that Litchfield and Wilcoxon originally carried out by hand. Environ Toxicol Chem 2016;9999:1–4. Published 2016 Wiley Periodicals Inc. on behalf of SETAC. This article is a US Government work and, as such, is in the public domain in the United States of America.
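At its core, the Litchfield and Wilcoxon procedure that LW1949 automates fits a straight line to probit-transformed effects against log dose and judges the fit by the chi-squared statistic for observed versus expected numbers of affected individuals. A rough Python sketch of that minimisation (a generic illustration with invented data, not the LW1949 R code):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

doses = np.array([0.5, 1.0, 2.0, 4.0, 8.0])     # hypothetical concentrations
n = np.array([50, 50, 50, 50, 50])               # exposed individuals per dose
affected = np.array([2, 8, 22, 38, 48])          # observed responders

def chi2(params):
    a, b = params
    p = norm.cdf(a + b * np.log10(doses))         # expected proportion affected (probit line)
    expected = n * p
    variance = np.maximum(n * p * (1 - p), 1e-9)  # binomial variance, guarded against zero
    return np.sum((affected - expected) ** 2 / variance)

fit = minimize(chi2, x0=[0.0, 2.0], method="Nelder-Mead")
a, b = fit.x
ec50 = 10 ** (-a / b)                             # dose at which the expected effect is 50%
print("EC50 estimate:", ec50)
```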
Modeling and MBL: Software Tools for Science.
ERIC Educational Resources Information Center
Tinker, Robert F.
Recent technological advances and new software packages put unprecedented power for experimenting and theory-building in the hands of students at all levels. Microcomputer-based laboratory (MBL) and model-solving tools illustrate the educational potential of the technology. These tools include modeling software and three MBL packages (which are…
Scientific Software: How to Find What You Need and Get What You Pay for.
ERIC Educational Resources Information Center
Gabaldon, Diana J.
1984-01-01
Provides examples of software for the sciences, including: packages for pathology/toxicology laboratories (costing over $15,000), DNA sequencing, and data acquisition/analysis; general-purpose software for scientific uses; and "custom" packages, including a program to maintain a listing of "Escherichia coli" strains and a…
Learn by Yourself: The Self-Learning Tools for Qualitative Analysis Software Packages
ERIC Educational Resources Information Center
Freitas, Fábio; Ribeiro, Jaime; Brandão, Catarina; Reis, Luís Paulo; de Souza, Francislê Neri; Costa, António Pedro
2017-01-01
Computer Assisted Qualitative Data Analysis Software (CAQDAS) are tools that help researchers to develop qualitative research projects. These software packages help the users with tasks such as transcription analysis, coding and text interpretation, writing and annotation, content search and analysis, recursive abstraction, grounded theory…
Interactive Visualization of Assessment Data: The Software Package Mondrian
ERIC Educational Resources Information Center
Unlu, Ali; Sargin, Anatol
2009-01-01
Mondrian is state-of-the-art statistical data visualization software featuring modern interactive visualization techniques for a wide range of data types. This article reviews the capabilities, functionality, and interactive properties of this software package. Key features of Mondrian are illustrated with data from the Programme for International…
An Overview of Software for Conducting Dimensionality Assessment in Multidimensional Models
ERIC Educational Resources Information Center
Svetina, Dubravka; Levy, Roy
2012-01-01
An overview of popular software packages for conducting dimensionality assessment in multidimensional models is presented. Specifically, five popular software packages are described in terms of their capabilities to conduct dimensionality assessment with respect to the nature of analysis (exploratory or confirmatory), types of data (dichotomous,…
Cognitive Foundry v. 3.0 (OSS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Basilico, Justin; Dixon, Kevin; McClain, Jonathan
2009-11-18
The Cognitive Foundry is a unified collection of tools designed for research and applications that use cognitive modeling, machine learning, or pattern recognition. The software library contains design patterns, interface definitions, and default implementations of reusable software components and algorithms designed to support a wide variety of research and development needs. The library contains three main software packages: the Common package that contains basic utilities and linear algebraic methods, the Cognitive Framework package that contains tools to assist in implementing and analyzing theories of cognition, and the Machine Learning package that provides general algorithms and methods for populating Cognitive Framework components from domain-relevant data.
Nipype: a flexible, lightweight and extensible neuroimaging data processing framework in python.
Gorgolewski, Krzysztof; Burns, Christopher D; Madison, Cindee; Clark, Dav; Halchenko, Yaroslav O; Waskom, Michael L; Ghosh, Satrajit S
2011-01-01
Current neuroimaging software offers users an incredible opportunity to analyze their data in different ways, with different underlying assumptions. Several sophisticated software packages (e.g., AFNI, BrainVoyager, FSL, FreeSurfer, Nipy, R, SPM) are used to process and analyze large and often diverse (highly multi-dimensional) data. However, this heterogeneous collection of specialized applications creates several issues that hinder replicable, efficient, and optimal use of neuroimaging analysis approaches: (1) No uniform access to neuroimaging analysis software and usage information; (2) No framework for comparative algorithm development and dissemination; (3) Personnel turnover in laboratories often limits methodological continuity and training new personnel takes time; (4) Neuroimaging software packages do not address computational efficiency; and (5) Methods sections in journal articles are inadequate for reproducing results. To address these issues, we present Nipype (Neuroimaging in Python: Pipelines and Interfaces; http://nipy.org/nipype), an open-source, community-developed software package and scriptable library. Nipype solves these issues by providing Interfaces to existing neuroimaging software with uniform usage semantics and by facilitating interaction between these packages using Workflows. Nipype provides an environment that encourages interactive exploration of algorithms, eases the design of Workflows within and between packages, allows rapid comparative development of algorithms, and reduces the learning curve necessary to use different packages. Nipype supports both local and remote execution on multi-core machines and clusters, without additional scripting. Nipype is Berkeley Software Distribution licensed, allowing anyone unrestricted usage. An open, community-driven development philosophy allows the software to quickly adapt and address the varied needs of the evolving neuroimaging community, especially in the context of increasing demand for reproducible research.
DRUGDOG 3.0: U.S. Navy Random Urinalysis Software Package
1994-03-15
Naval Postgraduate School, Monterey, California (AD-A281 748). DRUGDOG 3.0: U.S. Navy Random Urinalysis Software Package. Master's thesis by Dale E. Wilson, 15 March 1994.
Using an architectural approach to integrate heterogeneous, distributed software components
NASA Technical Reports Server (NTRS)
Callahan, John R.; Purtilo, James M.
1995-01-01
Many computer programs cannot be easily integrated because their components are distributed and heterogeneous, i.e., they are implemented in diverse programming languages, use different data representation formats, or their runtime environments are incompatible. In many cases, programs are integrated by modifying their components or interposing mechanisms that handle communication and conversion tasks. For example, remote procedure call (RPC) helps integrate heterogeneous, distributed programs. When configuring such programs, however, mechanisms like RPC must be used explicitly by software developers in order to integrate collections of diverse components. Each collection may require a unique integration solution. This paper describes improvements to the concepts of software packaging and some of our experiences in constructing complex software systems from a wide variety of components in different execution environments. Software packaging is a process that automatically determines how to integrate a diverse collection of computer programs based on the types of components involved and the capabilities of available translators and adapters in an environment. Software packaging provides a context that relates such mechanisms to software integration processes and reduces the cost of configuring applications whose components are distributed or implemented in different programming languages. Our software packaging tool subsumes traditional integration tools like UNIX make by providing a rule-based approach to software integration that is independent of execution environments.
scoringRules - A software package for probabilistic model evaluation
NASA Astrophysics Data System (ADS)
Lerch, Sebastian; Jordan, Alexander; Krüger, Fabian
2016-04-01
Models in the geosciences are generally surrounded by uncertainty, and being able to quantify this uncertainty is key to good decision making. Accordingly, probabilistic forecasts in the form of predictive distributions have become popular over the last decades. With the proliferation of probabilistic models arises the need for decision-theoretically principled tools to evaluate the appropriateness of models and forecasts in a generalized way. Various scoring rules have been developed over the past decades to address this demand. Proper scoring rules are functions S(F,y) which evaluate the accuracy of a forecast distribution F, given that an outcome y was observed. As such, they allow comparison of alternative models, a crucial ability given the variety of theories, data sources and statistical specifications that are available in many situations. This poster presents the software package scoringRules for the statistical programming language R, which contains functions to compute popular scoring rules such as the continuous ranked probability score for a variety of distributions F that come up in applied work. Two main classes are parametric distributions like normal, t, or gamma distributions, and distributions that are not known analytically, but are indirectly described through a sample of simulation draws. For example, Bayesian forecasts produced via Markov chain Monte Carlo take this form. In this way, the scoringRules package provides a framework for generalized model evaluation that includes both Bayesian and classical parametric models. The scoringRules package aims to be a convenient dictionary-like reference for computing scoring rules. We offer state-of-the-art implementations of several known (but not routinely applied) formulas, and implement closed-form expressions that were previously unavailable. Whenever more than one implementation variant exists, we offer statistically principled default choices.
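As an illustration of a proper scoring rule of the kind scoringRules implements, the continuous ranked probability score for a Gaussian forecast has a closed form. The sketch below is a hypothetical Python version (the package itself is written for R); the forecast parameters and observation are invented.

```python
# Hypothetical illustration of the closed-form CRPS for a normal forecast N(mu, sigma^2),
# the kind of formula implemented (in R) by the scoringRules package.
import numpy as np
from scipy.stats import norm

def crps_normal(mu, sigma, y):
    """Continuous ranked probability score of the forecast N(mu, sigma^2) for observation y."""
    z = (y - mu) / sigma
    return sigma * (z * (2 * norm.cdf(z) - 1) + 2 * norm.pdf(z) - 1 / np.sqrt(np.pi))

# Compare two competing forecasts for the same observation (lower CRPS is better).
print(crps_normal(mu=20.0, sigma=2.0, y=21.5))
print(crps_normal(mu=22.0, sigma=5.0, y=21.5))
```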
Development of a computer-assisted learning software package on dental traumatology.
Tolidis, K; Crawford, P; Stephens, C; Papadogiannis, Y; Plakias, C
1998-10-01
The development of computer-assisted learning software packages is a relatively new field of computer application. The progress made in personal computer technology toward more user-friendly operating systems has stimulated the academic community to develop computer-assisted learning for pre- and postgraduate students. The ability of computers to combine audio and visual data in an interactive form provides a powerful educational tool. The purpose of this study was to develop and evaluate a computer-assisted learning package on dental traumatology. This program contains background information on the diagnosis, classification, and management of dental injuries in both the permanent and the deciduous dentitions. It is structured into chapters according to the nature of the injury and whether injury has occurred in the primary or permanent dentition. At the end of each chapter there is a self-assessment questionnaire as well as references to relevant literature. Extensive use of pictures and video provides a comprehensive overview of the subject.
Liu, Yijin; Meirer, Florian; Williams, Phillip A.; Wang, Junyue; Andrews, Joy C.; Pianetta, Piero
2012-01-01
Transmission X-ray microscopy (TXM) has been well recognized as a powerful tool for non-destructive investigation of the three-dimensional inner structure of a sample with spatial resolution down to a few tens of nanometers, especially when combined with synchrotron radiation sources. Recent developments of this technique have presented a need for new tools for both system control and data analysis. Here a software package developed in MATLAB for script command generation and analysis of TXM data is presented. The first toolkit, the script generator, allows automating complex experimental tasks which involve up to several thousand motor movements. The second package was designed to accomplish computationally intense tasks such as data processing of mosaic and mosaic tomography datasets; dual-energy contrast imaging, where data are recorded above and below a specific X-ray absorption edge; and TXM X-ray absorption near-edge structure imaging datasets. Furthermore, analytical and iterative tomography reconstruction algorithms were implemented. The compiled software package is freely available. PMID:22338691
Information Metacatalog for a Grid
NASA Technical Reports Server (NTRS)
Kolano, Paul
2007-01-01
SWIM is a Software Information Metacatalog that gathers detailed information about the software components and packages installed on a grid resource. Information is currently gathered for Executable and Linking Format (ELF) executables and shared libraries, Java classes, shell scripts, and Perl and Python modules. SWIM is built on top of the POUR framework, which is described in the preceding article. SWIM consists of a set of Perl modules for extracting software information from a system, an XML schema defining the format of data that can be added by users, and a POUR XML configuration file that describes how these elements are used to generate periodic, on-demand, and user-specified information. Periodic software information is derived mainly from the package managers used on each system. SWIM collects information from native package managers in FreeBSD, Solaris, and IRIX as well as the RPM, Perl, and Python package managers on multiple platforms. Because not all software is available or installed in package form, SWIM also crawls the set of relevant paths from the File System Hierarchy Standard that defines the standard file system structure used by all major UNIX distributions. Using these two techniques, the vast majority of software installed on a system can be located. SWIM computes the same information gathered by the periodic routines for specific files on specific hosts, and locates software on a system given only its name and type.
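A toy version of the crawling step can be sketched in a few lines of Python. The snippet below is purely illustrative (it is not SWIM, which is implemented as Perl modules): it scans the directories on PATH for executables and lists the Python modules visible to the interpreter.

```python
# Illustrative sketch (not SWIM itself): gather a crude software inventory by
# scanning PATH for executables and enumerating importable Python modules.
import os
import pkgutil

def executables_on_path():
    found = {}
    for directory in os.environ.get("PATH", "").split(os.pathsep):
        if not os.path.isdir(directory):
            continue
        for name in os.listdir(directory):
            path = os.path.join(directory, name)
            if os.path.isfile(path) and os.access(path, os.X_OK):
                found.setdefault(name, path)  # keep the first hit, like shell lookup
    return found

def python_modules():
    return sorted(m.name for m in pkgutil.iter_modules())

print(len(executables_on_path()), "executables found on PATH")
print(python_modules()[:10])
```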
The Package-Based Development Process in the Flight Dynamics Division
NASA Technical Reports Server (NTRS)
Parra, Amalia; Seaman, Carolyn; Basili, Victor; Kraft, Stephen; Condon, Steven; Burke, Steven; Yakimovich, Daniil
1997-01-01
The Software Engineering Laboratory (SEL) has been operating for more than two decades in the Flight Dynamics Division (FDD) and has adapted to the constant movement of the software development environment. The SEL's Improvement Paradigm shows that process improvement is an iterative process. Understanding, Assessing and Packaging are the three steps that are followed in this cyclical paradigm. As the improvement process cycles back to the first step, after having packaged some experience, the level of understanding will be greater. In the past, products resulting from the packaging step have been large process documents, guidebooks, and training programs. As the technical world moves toward more modularized software, we have made a move toward more modularized software development process documentation, as such the products of the packaging step are becoming smaller and more frequent. In this manner, the QIP takes on a more spiral approach rather than a waterfall. This paper describes the state of the FDD in the area of software development processes, as revealed through the understanding and assessing activities conducted by the COTS study team. The insights presented include: (1) a characterization of a typical FDD Commercial Off the Shelf (COTS) intensive software development life-cycle process, (2) lessons learned through the COTS study interviews, and (3) a description of changes in the SEL due to the changing and accelerating nature of software development in the FDD.
Bravo, Paco E; Chien, David; Javadi, Mehrbod; Merrill, Jennifer; Bengel, Frank M
2010-06-01
Electrocardiographic gating is increasingly used for (82)Rb cardiac PET/CT, but reference ranges for global functional parameters are not well defined. We sought to establish reference values for left ventricular ejection fraction (LVEF), end systolic volume (ESV), and end diastolic volume (EDV) using 4 different commercial software packages. Additionally, we compared 2 different approaches for the definition of a healthy individual. Sixty-two subjects (mean age +/- SD, 49 +/- 9 y; 85% women; mean body mass index +/- SD, 34 +/- 10 kg/m(2)) who underwent (82)Rb-gated myocardial perfusion PET/CT were evaluated. All subjects had normal myocardial perfusion and no history of coronary artery disease (CAD) or cardiomyopathy. Subgroup 1 consisted of 34 individuals with low pretest probability of CAD (<10%), and subgroup 2 comprised 28 subjects who had no atherosclerosis on a coronary CT angiogram obtained concurrently during the PET/CT session. LVEF, ESV, and EDV were calculated at rest and during dipyridamole-induced stress, using CardIQ Physio (a dedicated PET software) and the 3 major SPECT software packages (Emory Cardiac Toolbox, Quantitative Gated SPECT, and 4DM-SPECT). Mean LVEF was significantly different among all 4 software packages. LVEF was most comparable between CardIQ Physio (62% +/- 6% and 54% +/- 7% at stress and rest, respectively) and 4DM-SPECT (64% +/- 7% and 56% +/- 8%, respectively), whereas Emory Cardiac Toolbox yielded higher values (71% +/- 6% and 65% +/- 6%, respectively, P < 0.001) and Quantitated Gated SPECT lower values (56% +/- 8% and 50% +/- 8%, respectively, P < 0.001). Subgroup 1 (low likelihood) demonstrated higher LVEF values than did subgroup 2 (normal CT angiography findings), using all software packages (P < 0.05). However, mean ESV and EDV at stress and rest were comparable between both subgroups (p = NS). Intra- and interobserver agreement were excellent for all methods. The reference range of LVEF and LV volumes from gated (82)Rb PET/CT varies significantly among available software programs and therefore cannot be used interchangeably. LVEF results were higher when healthy subjects were defined by a low pretest probability of CAD than by normal CT angiography results.
Modern software approaches applied to a Hydrological model: the GEOtop Open-Source Software Project
NASA Astrophysics Data System (ADS)
Cozzini, Stefano; Endrizzi, Stefano; Cordano, Emanuele; Bertoldi, Giacomo; Dall'Amico, Matteo
2017-04-01
The GEOtop hydrological scientific package is an integrated hydrological model that simulates the heat and water budgets at and below the soil surface. It describes the three-dimensional water flow in the soil and the energy exchange with the atmosphere, considering the radiative and turbulent fluxes. Furthermore, it reproduces the highly non-linear interactions between the water and energy balance during soil freezing and thawing, and simulates the temporal evolution of snow cover, soil temperature and moisture. The core components of the package were presented in the 2.0 version (Endrizzi et al, 2014), which was released as a Free Software Open-source project. However, despite the high scientific quality of the project, a modern software engineering approach was still missing. Such weakness hindered its scientific potential and its use both as a standalone package and, more importantly, in an integrated way with other hydrological software tools. In this contribution we present our recent software re-engineering efforts to create a robust and stable scientific software package open to the hydrological community, easily usable by researchers and experts, and interoperable with other packages. The activity takes as a starting point the 2.0 version, scientifically tested and published. This version, together with several test cases based on recently published or available GEOtop applications (Cordano and Rigon, 2013, WRR; Kollet et al, 2016, WRR), provides the baseline code and a certain number of referenced results as benchmarks. Comparison and scientific validation can then be performed for each software re-engineering activity performed on the package. To keep track of every single change, the package is published on its own github repository geotopmodel.github.io/geotop/ under the GPL v3.0 license. A Continuous Integration mechanism by means of Travis-CI has been enabled on the github repository on the master and main development branches. The use of the CMake configuration tool and the suite of tests (easily manageable by means of ctest tools) greatly reduces the burden of installation and allows us to enhance portability across different compilers and operating-system platforms. The package is also complemented by several software tools which provide web-based visualization of results based on R plugins, in particular the "shiny" (Chang et al, 2016), "geotopbricks" and "geotopOptim2" (Cordano et al, 2016) packages, which allow rapid and efficient scientific validation of new examples and tests. The software re-engineering activities are still under development. However, our first results are promising enough to eventually reach a robust and stable software project that manages in a flexible way a complex state-of-the-art hydrological model like GEOtop and integrates it into wider workflows.
Software Comparison for Renewable Energy Deployment in a Distribution Network
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gao, David Wenzhong; Muljadi, Eduard; Tian, Tian
The main objective of this report is to evaluate different software options for performing robust distributed generation (DG) power system modeling. The features and capabilities of four simulation tools, OpenDSS, GridLAB-D, CYMDIST, and PowerWorld Simulator, are compared to analyze their effectiveness in analyzing distribution networks with DG. OpenDSS and GridLAB-D, two open-source packages, have the capability to simulate networks with fluctuating data values. These packages allow the running of a simulation at each time instant by iterating only the main script file. CYMDIST, a commercial software package, allows for time-series simulation to study variations on network controls. PowerWorld Simulator, another commercial tool, has a batch mode simulation function through the 'Time Step Simulation' tool, which obtains solutions for a list of specified time points. PowerWorld Simulator is intended for analysis of transmission-level systems, while the other three are designed for distribution systems. CYMDIST and PowerWorld Simulator feature easy-to-use graphical user interfaces (GUIs). OpenDSS and GridLAB-D, on the other hand, are based on command-line programs, which increase the time necessary to become familiar with the software packages.
Zhang, Hui; Lu, Naiji; Feng, Changyong; Thurston, Sally W; Xia, Yinglin; Zhu, Liang; Tu, Xin M
2011-09-10
The generalized linear mixed-effects model (GLMM) is a popular paradigm to extend models for cross-sectional data to a longitudinal setting. When applied to modeling binary responses, different software packages and even different procedures within a package may give quite different results. In this report, we describe the statistical approaches that underlie these different procedures and discuss their strengths and weaknesses when applied to fit correlated binary responses. We then illustrate these considerations by applying these procedures implemented in some popular software packages to simulated and real study data. Our simulation results indicate a lack of reliability for most of the procedures considered, which carries significant implications for applying such popular software packages in practice. Copyright © 2011 John Wiley & Sons, Ltd.
Second Evaluation of Job Queuing/Scheduling Software. Phase 1
NASA Technical Reports Server (NTRS)
Jones, James Patton; Brickell, Cristy; Chancellor, Marisa (Technical Monitor)
1997-01-01
The recent proliferation of high performance workstations and the increased reliability of parallel systems have illustrated the need for robust job management systems to support parallel applications. To address this issue, NAS compiled a requirements checklist for job queuing/scheduling software. Next, NAS evaluated the leading job management system (JMS) software packages against the checklist. A year has now elapsed since the first comparison was published, and NAS has repeated the evaluation. This report describes this second evaluation and presents the results of Phase 1: Capabilities versus Requirements. We show that JMS support for running parallel applications on clusters of workstations and parallel systems is still lacking; however, definite progress has been made by the vendors to correct the deficiencies. This report is supplemented by a WWW interface to the data collected, to aid other sites in extracting the evaluation information on specific requirements of interest.
NASA Technical Reports Server (NTRS)
Gill, Esther Naomi
1986-01-01
A review was conducted of software packages currently on the market which might be integrated with the interface language and aid in reaching the objectives of customization, standardization, transparency, reliability, maintainability, language substitutions, expandability, portability, and flexibility. Recommendations are given for the best choices in hardware and software acquisition for in-house testing of these possible integrations. Recommended software acquisitions included tools to aid expert-system development and/or novice program development, artificial intelligence voice technology, touch screen, joystick, or mouse input, and networking. Other recommendations concerned using the language Ada for the user interface language shell because of its high level of standardization, structure, and ability to accept and execute programs written in other programming languages, as well as its DOD ownership and control, and keeping the user interface language simple so that a multitude of users will find the commercialization of space within their realm of possibility, which is, after all, the purpose of the Space Station.
Preliminary design review package for the solar heating and cooling central data processing system
NASA Technical Reports Server (NTRS)
1976-01-01
The Central Data Processing System (CDPS) is designed to transform the raw data collected at remote sites into performance evaluation information for assessing the performance of solar heating and cooling systems. Software requirements for the CDPS are described. The programming standards to be used in development, documentation, and maintenance of the software are discussed along with the CDPS operations approach in support of daily data collection and processing.
General-Purpose Ada Software Packages
NASA Technical Reports Server (NTRS)
Klumpp, Allan R.
1991-01-01
Collection of subprograms brings to Ada many features from other programming languages. All generic packages designed to be easily instantiated for types declared in user's facility. Most packages have widespread applicability, although some oriented for avionics applications. All designed to facilitate writing new software in Ada. Written on IBM/AT personal computer running under PC DOS, v.3.1.
Advance Directives and Do Not Resuscitate Orders
... a form. Call a lawyer. Use a computer software package for legal documents. Advance directives and living ... you write by yourself or with a computer software package should follow your state laws. You may ...
Nested Cohort - R software package
NestedCohort is an R software package for fitting Kaplan-Meier and Cox Models to estimate standardized survival and attributable risks for studies where covariates of interest are observed on only a sample of the cohort.
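The underlying survival models are the familiar Kaplan-Meier and Cox regressions. A minimal Python sketch using the lifelines library (not the NestedCohort R package, and without its weighting for covariates observed on only a sample of the cohort) might look like the following, with a small invented data set.

```python
# Minimal sketch of Kaplan-Meier and Cox model fits using lifelines.
# This is NOT NestedCohort: it fits an ordinary full-cohort model and omits the
# sampling weights that NestedCohort uses for studies with partially observed covariates.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

df = pd.DataFrame({
    "time":     [5, 8, 12, 20, 22, 30, 31, 40],  # follow-up time (invented)
    "event":    [1, 0, 1, 1, 0, 1, 0, 0],        # 1 = event observed, 0 = censored
    "exposure": [1, 1, 1, 0, 0, 0, 1, 0],        # covariate of interest
})

kmf = KaplanMeierFitter()
kmf.fit(df["time"], event_observed=df["event"])
print(kmf.survival_function_)

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()
```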
Case Study: Audio-Guided Learning, with Computer Graphics.
ERIC Educational Resources Information Center
Koumi, Jack; Daniels, Judith
1994-01-01
Describes teaching packages which involve the use of audiotape recordings with personal computers in Open University (United Kingdom) mathematics courses. Topics addressed include software development; computer graphics; pedagogic principles for distance education; feedback, including course evaluations and student surveys; and future plans.…
Use of multispectral data in design of forest sample surveys
NASA Technical Reports Server (NTRS)
Titus, S. J.; Wensel, L. C.
1977-01-01
The use of multispectral data in design of forest sample surveys using a computer software package, WILLIAM, is described. The system allows evaluation of a number of alternative sampling systems and, with appropriate cost data, estimates the implementation cost for each.
An Integrated Software Package to Enable Predictive Simulation Capabilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Yousu; Fitzhenry, Erin B.; Jin, Shuangshuang
The power grid is increasing in complexity due to the deployment of smart grid technologies. Such technologies vastly increase the size and complexity of power grid systems for simulation and modeling. This increasing complexity necessitates not only the use of high-performance-computing (HPC) techniques, but a smooth, well-integrated interplay between HPC applications. This paper presents a new integrated software package that integrates HPC applications and a web-based visualization tool based on a middleware framework. This framework can support the data communication between different applications. Case studies with a large power system demonstrate the predictive capability brought by the integrated software package, as well as the better situational awareness provided by the web-based visualization tool in a live mode. Test results validate the effectiveness and usability of the integrated software package.
Software packager user's guide
NASA Technical Reports Server (NTRS)
Callahan, John R.
1995-01-01
Software integration is a growing area of concern for many programmers and software managers because the need to build new programs quickly from existing components is greater than ever. This includes building versions of software products for multiple hardware platforms and operating systems, building programs from components written in different languages, and building systems from components that must execute on different machines in a distributed network. The goal of software integration is to make building new programs from existing components more seamless -- programmers should pay minimal attention to the underlying configuration issues involved. Libraries of reusable components and classes are important tools but only partial solutions to software development problems. Even though software components may have compatible interfaces, there may be other reasons, such as differences between execution environments, why they cannot be integrated. Often, components must be adapted or reimplemented to fit into another application because of implementation differences -- they are implemented in different programming languages, dependent on different operating system resources, or must execute on different physical machines. The software packager is a tool that allows programmers to deal with interfaces between software components and ignore complex integration details. The packager takes modular descriptions of the structure of a software system written in the package specification language and produces an integration program in the form of a makefile. If complex integration tools are needed to integrate a set of components, such as remote procedure call stubs, their use is implied by the packager automatically and stub generation tools are invoked in the corresponding makefile. The programmer deals only with the components themselves and not the details of how to build the system on any given platform.
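The idea of generating an integration program from a declarative description can be sketched very simply. The following Python fragment is a hypothetical toy, not the actual packager or its specification language: it reads a dictionary describing components and emits a minimal makefile with compile and link rules.

```python
# Hypothetical toy illustrating the idea behind the software packager:
# turn a declarative description of components into a makefile.
# (This is not the real package specification language.)
components = {
    "solver": {"lang": "c", "source": "solver.c"},
    "ui":     {"lang": "c", "source": "ui.c"},
}
program = "app"

rules = []
objects = []
for name, spec in components.items():
    obj = name + ".o"
    objects.append(obj)
    rules.append(f"{obj}: {spec['source']}\n\tcc -c -o {obj} {spec['source']}")

makefile = (
    f"{program}: {' '.join(objects)}\n"
    f"\tcc -o {program} {' '.join(objects)}\n\n" + "\n\n".join(rules) + "\n"
)

with open("Makefile", "w") as fh:
    fh.write(makefile)
print(makefile)
```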
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burns, Heather; Flach, Greg; Smith, Frank
2014-01-10
The Cementitious Barriers Partnership (CBP) Project is a multi-disciplinary, multi-institutional collaboration supported by the U.S. Department of Energy (US DOE) Office of Tank Waste Management. The CBP program has developed a set of integrated tools (based on state-of-the-art models and leaching test methods) that help improve understanding and predictions of the long-term structural, hydraulic and chemical performance of cementitious barriers used in nuclear applications. The CBP Software Toolbox – “Version 1.0” was released early in FY2013 and was used to support DOE-EM performance assessments in evaluating various degradation mechanisms that included sulfate attack, carbonation and constituent leaching. The sulfate attack analysis predicted the extent of damage that sulfate ingress will have on concrete vaults over extended time (i.e., > 1000 years) and the carbonation analysis provided concrete degradation predictions from rebar corrosion. The new release, “Version 2.0”, includes upgraded carbonation software and a new software module to evaluate degradation due to chloride attack. Also included in the newer version is a dual-regime module allowing evaluation of contaminant release in two regimes – both fractured and un-fractured. The integrated software package has also been upgraded with new plotting capabilities and many other features that increase the “user-friendliness” of the package. Experimental work has been performed to provide data to calibrate the models, improve the credibility of the analysis and reduce the uncertainty. Tools selected for and developed under this program have been used to evaluate and predict the behavior of cementitious barriers used in near-surface engineered waste disposal systems for periods of performance up to or longer than 100 years for operating facilities and longer than 1000 years for waste disposal. The CBP Software Toolbox produces, and will continue to produce, tangible benefits to the working DOE Performance Assessment (PA) community.
ERIC Educational Resources Information Center
Rhein, Deborah; Alibrandi, Mary; Lyons, Mary; Sammons, Janice; Doyle, Luther
This bibliography, developed by Project RIMES (Reading Instructional Methods of Efficacy with Students) lists 80 software packages for teaching early reading and spelling to students at risk for reading and spelling failure. The software packages are presented alphabetically by title. Entries usually include a grade level indicator, a brief…
Vasilyev, K N
2013-01-01
When developing new software products and adapting existing software, project leaders have to decide which functionalities to keep, adapt or develop. They have to consider that the cost of making errors during the specification phase is extremely high. In this paper a formalised approach is proposed that considers the main criteria for selecting new software functions. The application of this approach minimises the chances of making errors in selecting the functions to apply. Based on work on software development and support projects in the area of water resources and flood damage evaluation in economic terms at CH2M HILL (the developers of the flood modelling package ISIS), the author has defined seven criteria for selecting functions to be included in a software product. The approach is based on the evaluation of the relative significance of the functions to be included in the software product. Evaluation is achieved by considering each criterion and the weighting coefficients of each criterion in turn and applying the method of normalisation. This paper includes a description of this new approach and examples of its application in the development of new software products in the area of water resources management.
Sañudo, Borja; Rueda, David; Pozo-Cruz, Borja Del; de Hoyo, Moisés; Carrasco, Luis
2016-10-01
Sañudo, B, Rueda, D, del Pozo-Cruz, B, de Hoyo, M, and Carrasco, L. Validation of a video analysis software package for quantifying movement velocity in resistance exercises. J Strength Cond Res 30(10): 2934-2941, 2016-The aim of this study was to establish the validity of a video analysis software package in measuring mean propulsive velocity (MPV) and the maximal velocity during bench press. Twenty-one healthy males (21 ± 1 year) with weight training experience were recruited, and the MPV and the maximal velocity of the concentric phase (Vmax) were compared with a linear position transducer system during a standard bench press exercise. Participants performed a 1 repetition maximum test using the supine bench press exercise. The testing procedures involved the simultaneous assessment of bench press propulsive velocity using 2 kinematic (linear position transducer and semi-automated tracking software) systems. High Pearson's correlation coefficients for MPV and Vmax between both devices (r = 0.473 to 0.993) were observed. The intraclass correlation coefficients for barbell velocity data and the kinematic data obtained from video analysis were high (>0.79). In addition, the low coefficients of variation indicate that measurements had low variability. Finally, Bland-Altman plots with the limits of agreement of the MPV and Vmax with different loads showed a negative trend, which indicated that the video analysis had higher values than the linear transducer. In conclusion, this study has demonstrated that the software used for the video analysis was an easy to use and cost-effective tool with a very high degree of concurrent validity. This software can be used to evaluate changes in velocity of training load in resistance training, which may be important for the prescription and monitoring of training programmes.
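The agreement statistics reported in this kind of validation study are straightforward to compute. The sketch below is a hypothetical Python example with invented velocity data: it computes the Pearson correlation and the Bland-Altman bias and limits of agreement between two measurement methods.

```python
# Hypothetical example: concurrent-validity statistics for two velocity measures
# (e.g., linear position transducer vs. video analysis); all values are invented.
import numpy as np
from scipy.stats import pearsonr

transducer = np.array([0.82, 0.75, 0.61, 0.55, 0.43, 0.38, 0.29])
video      = np.array([0.85, 0.78, 0.66, 0.57, 0.47, 0.40, 0.33])

r, p = pearsonr(transducer, video)

diff = video - transducer
bias = diff.mean()
loa_low = bias - 1.96 * diff.std(ddof=1)
loa_high = bias + 1.96 * diff.std(ddof=1)

print(f"Pearson r = {r:.3f} (p = {p:.4f})")
print(f"Bland-Altman bias = {bias:.3f} m/s, limits of agreement [{loa_low:.3f}, {loa_high:.3f}]")
```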
Moreno-Montañés, Javier; Antón, Vanesa; Antón, Alfonso; Larrosa, José M; Martinez-de-la-Casa, José María; Rebolleda, Gema; Ussa, Fernando; García-Granero, Marta
2017-04-01
It is important to evaluate intraobserver and interobserver agreement using visual field (VF) testing and optical coherence tomography (OCT) software in order to understand whether the use of this software is sufficient to detect glaucoma progression and to make decisions regarding its treatment. To evaluate agreement in VF and OCT software among 5 glaucoma specialists. The printout pages from VF progression software and OCT progression software from 100 patients were randomized, and the 5 glaucoma specialists subjectively and independently evaluated them for glaucoma. Each image was classified as having no progression, questionable progression, or progression. The principal investigator classified the patients previously as without variability (normal) or with high variability among tests (difficult). Using both software, the specialists also evaluated whether the glaucoma damage had progressed and if treatment change was needed. One month later, the same observers reevaluated the patients in a different order to determine intraobserver reproducibility. Intraobserver and interobserver agreement was estimated using κ statistics and Gwet second-order agreement coefficient. The agreement was compared with other factors. Of the 100 observed patients, half were male and all were white; the mean (SD) age was 69.7 (14.1) years. Intraobserver agreement was substantial to almost perfect for VF software (overall κ [95% CI], 0.59 [0.46-0.72] to 0.87 [0.79-0.96]) and similar for OCT software (overall κ [95% CI], 0.59 [0.46-0.71] to 0.85 [0.76-0.94]). Interobserver agreement among the 5 glaucoma specialists with the VF progression software was moderate (κ, 0.48; 95% CI, 0.41-0.55) and similar to OCT progression software (κ, 0.52; 95% CI, 0.44-0.59). Interobserver agreement was substantial in images classified as having no progression but only fair in those classified as having questionable glaucoma progression or glaucoma progression. Interobserver agreement was fair regarding questions about glaucoma progression (κ, 0.39; 95% CI, 0.32-0.48) and consideration about treatment changes (κ, 0.39; 95% CI, 0.32-0.48). The factors associated with agreement were the glaucoma stage and case difficulty. There was substantial intraobserver agreement but moderate interobserver agreement among glaucoma specialists using 2 glaucoma progression software packages. These data suggest that these glaucoma progression software packages are insufficient to obtain high interobserver agreement in both devices except in patients with no progression. The low agreement regarding progression or treatment changes suggests that both software programs used in isolation are insufficient for decision making.
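Interobserver agreement of the kind reported here is commonly summarized with a kappa statistic (the study also used Gwet's second-order agreement coefficient). A minimal, hypothetical Python sketch of a two-rater Cohen's kappa using scikit-learn, with invented three-category ratings, is shown below.

```python
# Hypothetical sketch: Cohen's kappa for two observers rating the same printouts as
# 0 = no progression, 1 = questionable progression, 2 = progression (ratings invented).
from sklearn.metrics import cohen_kappa_score

observer_a = [0, 0, 1, 2, 0, 1, 2, 2, 0, 1]
observer_b = [0, 1, 1, 2, 0, 0, 2, 1, 0, 1]

kappa = cohen_kappa_score(observer_a, observer_b)
print(f"Cohen's kappa = {kappa:.2f}")
```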
Long, Zaiyang; Tradup, Donald J; Stekel, Scott F; Gorny, Krzysztof R; Hangiandreou, Nicholas J
2018-03-01
We evaluated a commercially available software package that uses B-mode images to semi-automatically measure quantitative metrics of ultrasound image quality, such as contrast response, depth of penetration (DOP), and spatial resolution (lateral, axial, and elevational). Since measurement of elevational resolution is not a part of the software package, we achieved it by acquiring phantom images with transducers tilted at 45 degrees relative to the phantom. Each measurement was assessed in terms of measurement stability, sensitivity, repeatability, and semi-automated measurement success rate. All assessments were performed on a GE Logiq E9 ultrasound system with linear (9L or 11L), curved (C1-5), and sector (S1-5) transducers, using a CIRS model 040GSE phantom. In stability tests, the measurements of contrast, DOP, and spatial resolution remained within a ±10% variation threshold in 90%, 100%, and 69% of cases, respectively. In sensitivity tests, contrast, DOP, and spatial resolution measurements followed the expected behavior in 100%, 100%, and 72% of cases, respectively. In repeatability testing, intra- and inter-individual coefficients of variations were equal to or less than 3.2%, 1.3%, and 4.4% for contrast, DOP, and spatial resolution (lateral and axial), respectively. The coefficients of variation corresponding to the elevational resolution test were all within 9.5%. Overall, in our assessment, the evaluated package performed well for objective and quantitative assessment of the above-mentioned image qualities under well-controlled acquisition conditions. We are finding it to be useful for various clinical ultrasound applications including performance comparison between scanners from different vendors. © 2018 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
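Repeatability in this study is expressed as a coefficient of variation. The short Python sketch below, with invented repeated contrast measurements, shows the computation.

```python
# Hypothetical sketch: coefficient of variation (in percent) for repeated
# measurements of an image-quality metric such as contrast response.
import numpy as np

repeats = np.array([12.1, 12.4, 11.9, 12.3, 12.0])  # invented repeated measurements
cv_percent = 100 * repeats.std(ddof=1) / repeats.mean()
print(f"CV = {cv_percent:.1f}%")
```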
Software Library for Bruker TopSpin NMR Data Files
DOE Office of Scientific and Technical Information (OSTI.GOV)
A software library for parsing and manipulating frequency-domain data files that have been processed using the Bruker TopSpin NMR software package. In the context of NMR, the term "processed" indicates that the end-user of the Bruker TopSpin NMR software package has (a) Fourier transformed the raw, time-domain data (the Free Induction Decay) into the frequency-domain and (b) has extracted the list of NMR peaks.
The Raid distributed database system
NASA Technical Reports Server (NTRS)
Bhargava, Bharat; Riedl, John
1989-01-01
Raid, a robust and adaptable distributed database system for transaction processing (TP), is described. Raid is a message-passing system, with server processes on each site to manage concurrent processing, consistent replicated copies during site failures, and atomic distributed commitment. A high-level layered communications package provides a clean location-independent interface between servers. The latest design of the package delivers messages via shared memory in a configuration with several servers linked into a single process. Raid provides the infrastructure to investigate various methods for supporting reliable distributed TP. Measurements on TP and server CPU time are presented, along with data from experiments on communications software, consistent replicated copy control during site failures, and concurrent distributed checkpointing. A software tool for evaluating the implementation of TP algorithms in an operating-system kernel is proposed.
Software Review. Macintosh Laboratory Automation: Three Software Packages.
ERIC Educational Resources Information Center
Jezl, Barbara Ann
1990-01-01
Reviewed are "LABTECH NOTEBOOK,""LabVIEW," and "Parameter Manager pmPLUS/pmTALK." Each package is described including functions, uses, hardware, and costs. Advantages and disadvantages of this type of laboratory approach are discussed. (CW)
White, Gary C.; Hines, J.E.
2004-01-01
The reality is that the statistical methods used for analysis of data depend upon the availability of software. Analysis of marked animal data is no different than the rest of the statistical field. The methods used for analysis are those that are available in reliable software packages. Thus, the critical importance of having reliable, up-to-date software available to biologists is obvious. Statisticians have continued to develop more robust models, ever expanding the suite of potential analysis methods available. But without software to implement these newer methods, they will languish in the abstract, and not be applied to the problems deserving them. In the Computers and Software Session, two new software packages are described, a comparison of implementation of methods for the estimation of nest survival is provided, and a more speculative paper about how the next generation of software might be structured is presented. Rotella et al. (2004) compare nest survival estimation with different software packages: SAS logistic regression, SAS non-linear mixed models, and Program MARK. Nests are assumed to be visited at various, possibly infrequent, intervals. All of the approaches described compute nest survival with the same likelihood, and require that the age of the nest is known to account for nests that eventually hatch. However, each approach offers advantages and disadvantages, explored by Rotella et al. (2004). Efford et al. (2004) present a new software package called DENSITY. The package computes population abundance and density from trapping arrays and other detection methods with a new and unique approach. DENSITY represents the first major addition to the analysis of trapping arrays in 20 years. Barker & White (2004) discuss how existing software such as Program MARK requires that each new model's likelihood be programmed specifically for that model. They wishfully think that future software might allow the user to combine pieces of likelihood functions together to generate estimates. The idea is interesting, and maybe some bright young statistician can work out the specifics to implement the procedure. Choquet et al. (2004) describe MSURGE, a software package that implements multistate capture-recapture models. The unique feature of MSURGE is that the design matrix is constructed with an interpreted language called GEMACO. Because MSURGE is limited to just multistate models, the special requirements of these likelihoods can be provided. The software and methods presented in these papers give biologists and wildlife managers an expanding range of possibilities for data analysis. Although ease of use is generally getting better, it does not replace the need for understanding of the requirements and structure of the models being computed. The internet provides access to many free software packages as well as user-discussion groups to share knowledge and ideas. A starting point for wildlife-related applications is http://www.phidot.org.
Advances in the REDCAT software package
2013-01-01
Background Residual Dipolar Couplings (RDCs) have emerged in the past two decades as an informative source of experimental restraints for the study of structure and dynamics of biological macromolecules and complexes. The REDCAT software package was previously introduced for the analysis of molecular structures using RDC data. Here we report additional features that have been included in this software package in order to expand the scope of its analyses. We first discuss the features that enhance REDCATs user-friendly nature, such as the integration of a number of analyses into one single operation and enabling convenient examination of a structural ensemble in order to identify the most suitable structure. We then describe the new features which expand the scope of RDC analyses, performing exercises that utilize both synthetic and experimental data to illustrate and evaluate different features with regard to structure refinement and structure validation. Results We establish the seamless interaction that takes place between REDCAT, VMD, and Xplor-NIH in demonstrations that utilize our newly developed REDCAT-VMD and XplorGUI interfaces. These modules enable visualization of RDC analysis results on the molecular structure displayed in VMD and refinement of structures with Xplor-NIH, respectively. We also highlight REDCAT’s Error-Analysis feature in reporting the localized fitness of a structure to RDC data, which provides a more effective means of recognizing local structural anomalies. This allows for structurally sound regions of a molecule to be identified, and for any refinement efforts to be focused solely on locally distorted regions. Conclusions The newly engineered REDCAT software package, which is available for download via the WWW from http://ifestos.cse.sc.edu, has been developed in the Object Oriented C++ environment. Our most recent enhancements to REDCAT serve to provide a more complete RDC analysis suite, while also accommodating a more user-friendly experience, and will be of great interest to the community of researchers and developers since it hides the complications of software development. PMID:24098943
Calibration of HERS-ST for estimating traffic impact on pavement deterioration in Texas.
DOT National Transportation Integrated Search
2012-08-01
The Highway Economic Requirements System-State Version (or the HERS-ST) is a software package which was developed by the Federal Highway Administration as a tool for evaluating the performance of state highway systems. HERS-ST has the capabilities of...
Performance Evaluation of Real-Time Precise Point Positioning Method
NASA Astrophysics Data System (ADS)
Alcay, Salih; Turgut, Muzeyyen
2017-12-01
Post-processed Precise Point Positioning (PPP) is a well-known zero-difference positioning method which provides accurate and precise results. After the experimental tests, the IGS Real Time Service (RTS) officially provided real-time orbit and clock products for the GNSS community, which allow real-time (RT) PPP applications. Different software packages can be used for RT-PPP. In this study, in order to evaluate the performance of RT-PPP, 3 IGS stations are used. Results, obtained by using the BKG Ntrip Client (BNC) Software v2.12, are examined in terms of both accuracy and precision.
Kumar, Yadhu; Westram, Ralf; Kipfer, Peter; Meier, Harald; Ludwig, Wolfgang
2006-01-01
Background Availability of high-resolution RNA crystal structures for the 30S and 50S ribosomal subunits and the subsequent validation of comparative secondary structure models have prompted biologists to use the three-dimensional structure of ribosomal RNA (rRNA) for evaluating sequence alignments of rRNA genes. Furthermore, the secondary and tertiary structural features of rRNA are highly useful and successfully employed in designing rRNA-targeted oligonucleotide probes intended for in situ hybridization experiments. RNA3D, a program to combine sequence alignment information with the three-dimensional structure of rRNA, was developed. Integration into the ARB software package, which is used extensively by the scientific community for phylogenetic analysis and molecular probe designing, has substantially extended the functionality of the ARB software suite with a 3D environment. Results The three-dimensional structure of rRNA is visualized in an OpenGL 3D environment with the ability to change the display and overlay information onto the molecule dynamically. Phylogenetic information derived from the multiple sequence alignments can be overlaid onto the molecule structure in real time. Superimposition of both statistical and non-statistical sequence-associated information onto the rRNA 3D structure can be done using a customizable color scheme, which is also applied to a textual sequence alignment for reference. Oligonucleotide probes designed by ARB probe design tools can be mapped onto the 3D structure along with the probe accessibility models for evaluation with respect to secondary and tertiary structural conformations of rRNA. Conclusion Visualization of the three-dimensional structure of rRNA in an intuitive display provides biologists with greater possibilities to carry out structure-based phylogenetic analysis. Coupled with secondary structure models of rRNA, the RNA3D program aids in validating the sequence alignments of rRNA genes and evaluating probe target sites. Superimposition of the information derived from the multiple sequence alignment onto the molecule dynamically allows researchers to observe any sequence-inherited characteristics (phylogenetic information) in a real-time environment. The extended ARB software package is made freely available to the scientific community. PMID:16672074
Steyn, Rachelle; Boniaszczuk, John; Geldenhuys, Theodore
2014-01-01
To determine how two software packages, supplied by Siemens and Hermes, for processing gated blood pool (GBP) studies should be used in our department and whether the use of different cameras for the acquisition of raw data influences the results. The study had two components. For the first component, 200 studies were acquired on a General Electric (GE) camera and processed three times by three operators using the Siemens and Hermes software packages. For the second part, 200 studies were acquired on two different cameras (GE and Siemens). The matched pairs of raw data were processed by one operator using the Siemens and Hermes software packages. The Siemens method consistently gave estimates that were 4.3% higher than the Hermes method (p < 0.001). The differences were not associated with any particular level of left ventricular ejection fraction (LVEF). There was no difference in the estimates of LVEF obtained by the three operators (p = 0.1794). The reproducibility of estimates was good. In 95% of patients, using the Siemens method, the SD of the three estimates of LVEF by operator 1 was ≤ 1.7, operator 2 was ≤ 2.1 and operator 3 was ≤ 1.3. The corresponding values for the Hermes method were ≤ 2.5, ≤ 2.0 and ≤ 2.1. There was no difference in the results of matched pairs of data acquired on different cameras (p = 0.4933). CONCLUSION: Software packages for processing GBP studies are not interchangeable. The report should include the name and version of the software package used. Wherever possible, the same package should be used for serial studies. If this is not possible, the report should include the limits of agreement of the different packages. Data acquisition on different cameras did not influence the results.
ELAS: A powerful, general purpose image processing package
NASA Technical Reports Server (NTRS)
Walters, David; Rickman, Douglas
1991-01-01
ELAS is a software package which has been utilized as an image processing tool for more than a decade. It has been the source of several commercial packages. Now available on UNIX workstations it is a very powerful, flexible set of software. Applications at Stennis Space Center have included a very wide range of areas including medicine, forestry, geology, ecological modeling, and sonar imagery. It remains one of the most powerful image processing packages available, either commercially or in the public domain.
On the release of cppxfel for processing X-ray free-electron laser images.
Ginn, Helen Mary; Evans, Gwyndaf; Sauter, Nicholas K; Stuart, David Ian
2016-06-01
As serial femtosecond crystallography expands towards a variety of delivery methods, including chip-based methods, and smaller collected data sets, the requirement to optimize the data analysis to produce maximum structure quality is becoming increasingly pressing. Here cppxfel , a software package primarily written in C++, which showcases several data analysis techniques, is released. This software package presently indexes images using DIALS (diffraction integration for advanced light sources) and performs an initial orientation matrix refinement, followed by post-refinement of individual images against a reference data set. Cppxfel is released with the hope that the unique and useful elements of this package can be repurposed for existing software packages. However, as released, it produces high-quality crystal structures and is therefore likely to be also useful to experienced users of X-ray free-electron laser (XFEL) software who wish to maximize the information extracted from a limited number of XFEL images.
Analyzing longitudinal data with the linear mixed models procedure in SPSS.
West, Brady T
2009-09-01
Many applied researchers analyzing longitudinal data share a common misconception: that specialized statistical software is necessary to fit hierarchical linear models (also known as linear mixed models [LMMs], or multilevel models) to longitudinal data sets. Although several specialized statistical software programs of high quality are available that allow researchers to fit these models to longitudinal data sets (e.g., HLM), rapid advances in general purpose statistical software packages have recently enabled analysts to fit these same models when using preferred packages that also enable other more common analyses. One of these general purpose statistical packages is SPSS, which includes a very flexible and powerful procedure for fitting LMMs to longitudinal data sets with continuous outcomes. This article aims to present readers with a practical discussion of how to analyze longitudinal data using the LMMs procedure in the SPSS statistical software package.
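The same class of model can be fitted from general-purpose packages outside SPSS as well. As a hedged illustration, the Python statsmodels library exposes a linear mixed-model routine whose use might look like the sketch below; the column names and simulated data are invented, and this is analogous in spirit to, not a reproduction of, the SPSS procedure.

```python
# Hypothetical sketch: a linear mixed model for longitudinal data with a random
# intercept per subject, fitted with statsmodels.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_subjects, n_waves = 30, 4
data = pd.DataFrame({
    "subject": np.repeat(np.arange(n_subjects), n_waves),
    "time": np.tile(np.arange(n_waves), n_subjects),
})
subject_effect = rng.normal(0, 2, n_subjects)[data["subject"]]
data["y"] = 10 + 1.5 * data["time"] + subject_effect + rng.normal(0, 1, len(data))

model = smf.mixedlm("y ~ time", data, groups=data["subject"])
result = model.fit()
print(result.summary())
```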
On the release of cppxfel for processing X-ray free-electron laser images
Ginn, Helen Mary; Evans, Gwyndaf; Sauter, Nicholas K.; ...
2016-05-11
As serial femtosecond crystallography expands towards a variety of delivery methods, including chip-based methods, and smaller collected data sets, the requirement to optimize the data analysis to produce maximum structure quality is becoming increasingly pressing. Here cppxfel, a software package primarily written in C++, which showcases several data analysis techniques, is released. This software package presently indexes images using DIALS (diffraction integration for advanced light sources) and performs an initial orientation matrix refinement, followed by post-refinement of individual images against a reference data set. Cppxfel is released with the hope that the unique and useful elements of this package can be repurposed for existing software packages. However, as released, it produces high-quality crystal structures and is therefore likely to be also useful to experienced users of X-ray free-electron laser (XFEL) software who wish to maximize the information extracted from a limited number of XFEL images.
A multi-center study benchmarks software tools for label-free proteome quantification
Gillet, Ludovic C; Bernhardt, Oliver M.; MacLean, Brendan; Röst, Hannes L.; Tate, Stephen A.; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I.; Aebersold, Ruedi; Tenzer, Stefan
2016-01-01
The consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from SWATH-MS (sequential window acquisition of all theoretical fragment ion spectra), a method that uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test datasets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package to calculate metrics of precision and accuracy in label-free quantitative MS, and report the identification performance, robustness and specificity of each software tool. Our reference datasets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics. PMID:27701404
A multicenter study benchmarks software tools for label-free proteome quantification.
Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan
2016-11-01
Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.
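The core idea behind LFQbench, scoring precision and accuracy against hybrid proteome samples of known composition, can be illustrated with a short sketch. The expected ratios, organism labels, and column names below are invented for illustration; they are not the interface of LFQbench itself, which is an R package.

```python
# Sketch: precision (CV across replicates) and accuracy (deviation of the
# measured log2 ratio from the expected ratio) for a two-sample hybrid design.
import numpy as np
import pandas as pd

# Hypothetical expected log2(A/B) ratios per organism in the hybrid samples.
expected_log2 = {"human": 0.0, "yeast": 1.0, "ecoli": -2.0}

rng = np.random.default_rng(1)
rows = []
for organism, true_ratio in expected_log2.items():
    for protein in range(200):
        base = rng.lognormal(mean=15, sigma=1)
        a = base * 2 ** true_ratio * rng.normal(1, 0.1, size=3)  # 3 replicates of sample A
        b = base * rng.normal(1, 0.1, size=3)                    # 3 replicates of sample B
        rows.append({"organism": organism,
                     "cv_a": a.std(ddof=1) / a.mean(),
                     "log2_ratio": np.log2(a.mean() / b.mean())})

df = pd.DataFrame(rows)
for organism, group in df.groupby("organism"):
    accuracy_error = (group["log2_ratio"] - expected_log2[organism]).abs().median()
    print(f"{organism}: median CV = {group['cv_a'].median():.3f}, "
          f"median |log2 ratio error| = {accuracy_error:.3f}")
```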
Back to the future: An online OSCE Management Information System for nursing OSCEs.
Meskell, Pauline; Burke, Eimear; Kropmans, Thomas J B; Byrne, Evelyn; Setyonugroho, Winny; Kennedy, Kieran M
2015-11-01
The Objective Structured Clinical Examination (OSCE) is an established tool in the repertoire of clinical assessment methods in nurse education. The use of OSCEs facilitates the assessment of psychomotor skills as well as knowledge and attitudes. Identified benefits of OSCE assessment include development of students' confidence in their clinical skills and preparation for clinical practice. However, a number of challenges exist with the traditional paper methodology, including documentation errors and inadequate student feedback. To explore electronic OSCE delivery and evaluate the benefits of using an electronic OSCE management system. To explore assessors' perceptions of and attitudes to the computer-based package. This study was conducted using electronic software in the management of a four-station OSCE assessment with a cohort of first-year undergraduate nursing students delivered over two consecutive years (n=203) in one higher education institution in Ireland. A quantitative descriptive survey methodology was used to obtain the views of the assessors on the process and outcome of using the software. OSCE documentation was converted to electronic format. Assessors were trained in the use of the OSCE management software package and laptops were procured to facilitate electronic management of the OSCE assessment. Following the OSCE assessment, assessors were invited to evaluate the experience. Electronic software facilitated the storage and analysis of overall group and individual results, thereby offering considerable time savings. Submission of electronic forms was allowed only when fully completed, thus removing the potential for missing data. The feedback facility allowed the student to receive timely evaluation on their performance and to benchmark their performance against the class. Assessors' satisfaction with the software was high. Analysis of assessment results can highlight issues such as moderate internal consistency and variability among examiners. Regression analysis increases the fairness of result calculations. Copyright © 2015. Published by Elsevier Ltd.
A survey of packages for large linear systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Kesheng; Milne, Brent
2000-02-11
This paper evaluates portable software packages for the iterative solution of very large sparse linear systems on parallel architectures. While we cannot hope to tell individual users which package will best suit their needs, we do hope that our systematic evaluation provides essential unbiased information about the packages and that the evaluation process may serve as an example of how to evaluate these packages. The information contained here includes feature comparisons, usability evaluations and performance characterizations. This review is primarily focused on self-contained packages that can be easily integrated into an existing program and are capable of computing solutions to very large sparse linear systems of equations. More specifically, it concentrates on portable parallel linear system solution packages that provide iterative solution schemes and related preconditioning schemes because iterative methods are more frequently used than competing schemes such as direct methods. The eight packages evaluated are: Aztec, BlockSolve, ISIS++, LINSOL, P-SPARSLIB, PARASOL, PETSc, and PINEAPL. Among the eight portable parallel iterative linear system solvers reviewed, we recommend PETSc and Aztec for most application programmers because they have well-designed user interfaces, extensive documentation and very responsive user support. Both PETSc and Aztec are written in the C language and are callable from Fortran. For those users interested in using Fortran 90, PARASOL is a good alternative. ISIS++ is a good alternative for those who prefer the C++ language. Both PARASOL and ISIS++ are relatively new and are continuously evolving. Thus their user interface may change. In general, those packages written in Fortran 77 are more cumbersome to use because the user may need to directly deal with a number of arrays of varying sizes. Languages like C++ and Fortran 90 offer more convenient data encapsulation mechanisms which make it easier to implement a clean and intuitive user interface. In addition to reviewing these portable parallel iterative solver packages, we also provide a more cursory assessment of a range of related packages, from specialized parallel preconditioners to direct methods for sparse linear systems.
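For readers who want a feel for what these solver packages do, the sketch below solves a sparse system with GMRES and an incomplete-LU preconditioner using SciPy. It is a serial, single-process illustration of the same algorithmic ingredients only; it does not use or represent the parallel packages reviewed above, and the test matrix is a toy example.

```python
# Sketch: iterative solution of a large sparse linear system with a
# preconditioned Krylov method (GMRES + ILU), the same ingredients offered
# by packages such as PETSc or Aztec, but serial and SciPy-based.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 10_000
# 1-D Poisson-like tridiagonal test matrix.
A = sp.diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csc")
b = np.ones(n)

# Incomplete LU factorization used as a preconditioner.
ilu = spla.spilu(A, drop_tol=1e-4)
M = spla.LinearOperator((n, n), matvec=ilu.solve)

x, info = spla.gmres(A, b, M=M, maxiter=500)   # info == 0 means converged
print("converged" if info == 0 else f"gmres info = {info}",
      "| residual norm:", np.linalg.norm(b - A @ x))
```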
GPFrontend and GPGraphics: graphical analysis tools for genetic association studies.
Uebe, Steffen; Pasutto, Francesca; Krumbiegel, Mandy; Schanze, Denny; Ekici, Arif B; Reis, André
2010-09-21
Most software packages for whole genome association studies are non-graphical, purely text-based programs originally designed to run with UNIX-like operating systems. Graphical output is often not intended, or is supposed to be performed with separate command-line tools, e.g. gnuplot. Using the Microsoft .NET 2.0 platform and Visual Studio 2005, we have created a graphical software package to analyze data from microarray whole genome association studies, both for a DNA-pooling based approach as well as regular single sample data. Part of this package was made to integrate with GenePool 0.8.2, a previously existing software suite for GNU/Linux systems, which we have modified to run in a Microsoft Windows environment. Further modifications cause it to generate some additional data. This enables GenePool to interact with the .NET parts created by us. The programs we developed are GPFrontend, a graphical user interface and frontend to use GenePool and create metadata files for it, and GPGraphics, a program to further analyze and graphically evaluate output of different WGA analysis programs, among them also GenePool. Our programs enable regular MS Windows users without much experience in bioinformatics to easily visualize whole genome data from a variety of sources.
Fowler, K. R.; Jenkins, E.W.; Parno, M.; Chrispell, J.C.; Colón, A. I.; Hanson, Randall T.
2016-01-01
The development of appropriate water management strategies requires, in part, a methodology for quantifying and evaluating the impact of water policy decisions on regional stakeholders. In this work, we describe the framework we are developing to enhance the body of resources available to policy makers, farmers, and other community members in their e orts to understand, quantify, and assess the often competing objectives water consumers have with respect to usage. The foundation for the framework is the construction of a simulation-based optimization software tool using two existing software packages. In particular, we couple a robust optimization software suite (DAKOTA) with the USGS MF-OWHM water management simulation tool to provide a flexible software environment that will enable the evaluation of one or multiple (possibly competing) user-defined (or stakeholder) objectives. We introduce the individual software components and outline the communication strategy we defined for the coupled development. We present numerical results for case studies related to crop portfolio management with several defined objectives. The objectives are not optimally satisfied for any single user class, demonstrating the capability of the software tool to aid in the evaluation of a variety of competing interests.
Office Computer Software: A Comprehensive Review of Software Programs.
ERIC Educational Resources Information Center
Secretary, 1992
1992-01-01
Describes types of software including system software, application software, spreadsheets, accounting software, graphics packages, desktop publishing software, database, desktop and personal information management software, project and records management software, groupware, and shareware. (JOW)
Tondare, Vipin N; Villarrubia, John S; Vladár, András E
2017-10-01
Three-dimensional (3D) reconstruction of a sample surface from scanning electron microscope (SEM) images taken at two perspectives has been known for decades. Nowadays, there exist several commercially available stereophotogrammetry software packages. For testing these software packages, in this study we used Monte Carlo simulated SEM images of virtual samples. A virtual sample is a model in a computer, and its true dimensions are known exactly, which is impossible for real SEM samples due to measurement uncertainty. The simulated SEM images can be used for algorithm testing, development, and validation. We tested two stereophotogrammetry software packages and compared their reconstructed 3D models with the known geometry of the virtual samples used to create the simulated SEM images. Both packages performed relatively well with simulated SEM images of a sample with a rough surface. However, in a sample containing nearly uniform and therefore low-contrast zones, the height reconstruction error was ≈46%. The present stereophotogrammetry software packages need further improvement before they can be used reliably with SEM images with uniform zones.
Criteria for the Evaluation of Microcomputer Courseware.
ERIC Educational Resources Information Center
Cohen, Vicki Blum
1983-01-01
Discusses attributes which are offered as a set of standards to judge instructional software--those unique to design of microcomputer courseware and those included in design of all instruction. Curriculum role, modes of interaction, computer managed instruction, graphics, feedback, packaging, and manuals are noted. Fourteen references are included.…
CATCHING THE WIND: A LOW COST METHOD FOR WIND POWER SITE ASSESSMENT
Our Phase I successes involve the installation of a wind monitoring station in Humboldt County, the evaluation of four different measure-correlate-predict methods for wind site assessment, and the creation of SWEET, an open source software package implementing the prediction ...
MOPEX: a software package for astronomical image processing and visualization
NASA Astrophysics Data System (ADS)
Makovoz, David; Roby, Trey; Khan, Iffat; Booth, Hartley
2006-06-01
We present MOPEX - a software package for astronomical image processing and display. The package is a combination of command-line driven image processing software written in C/C++ with a Java-based GUI. The main image processing capabilities include creating mosaic images, image registration, background matching, point source extraction, as well as a number of minor image processing tasks. The combination of the image processing and display capabilities allows for a much more intuitive and efficient way of performing image processing. The GUI allows for the control over the image processing and display to be closely intertwined. Parameter setting, validation, and specific processing options are entered by the user through a set of intuitive dialog boxes. Visualization feeds back into further processing by providing prompt feedback of the processing results. The GUI also allows for further analysis by accessing and displaying data from existing image and catalog servers using a virtual observatory approach. Even though originally designed for the Spitzer Space Telescope mission, many of its functionalities are of general usefulness and can be used for working with existing astronomical data and for new missions. The software used in the package has undergone intensive testing and benefited greatly from effective software reuse. The visualization part has been used for observation planning for both the Spitzer and Herschel Space Telescopes as part of the tool Spot. The visualization capabilities of Spot have been enhanced and integrated with the image processing functionality of the command-line driven MOPEX. The image processing software is used in the Spitzer automated pipeline processing, which has been in operation for nearly 3 years. The image processing capabilities have also been tested in off-line processing by numerous astronomers at various institutions around the world. The package is multi-platform and includes automatic update capabilities. The software package has been developed by a small group of software developers and scientists at the Spitzer Science Center. It is available for distribution at the Spitzer Science Center web page.
CONRAD—A software framework for cone-beam imaging in radiology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maier, Andreas; Choi, Jang-Hwan; Riess, Christian
2013-11-15
Purpose: In the community of x-ray imaging, there is a multitude of tools and applications that are used in scientific practice. Many of these tools are proprietary and can only be used within a certain lab. Often the same algorithm is implemented multiple times by different groups in order to enable comparison. In an effort to tackle this problem, the authors created CONRAD, a software framework that provides many of the tools that are required to simulate basic processes in x-ray imaging and perform image reconstruction with consideration of nonlinear physical effects. Methods: CONRAD is a Java-based state-of-the-art software platform with extensive documentation. It is based on platform-independent technologies. Special libraries offer access to hardware acceleration such as OpenCL. There is an easy-to-use interface for parallel processing. The software package includes different simulation tools that are able to generate up to 4D projection and volume data and respective vector motion fields. Well known reconstruction algorithms such as FBP, DBP, and ART are included. All algorithms in the package are referenced to a scientific source. Results: A total of 13 different phantoms and 30 processing steps have already been integrated into the platform at the time of writing. The platform comprises 74,000 nonblank lines of code out of which 19% are used for documentation. The software package is available for download at http://conrad.stanford.edu. To demonstrate the use of the package, the authors reconstructed images from two different scanners, a table top system and a clinical C-arm system. Runtimes were evaluated using the RabbitCT platform and demonstrate state-of-the-art runtimes with 2.5 s for the 256 problem size and 12.4 s for the 512 problem size. Conclusions: As a common software framework, CONRAD enables the medical physics community to share algorithms and develop new ideas. In particular this offers new opportunities for scientific collaboration and quantitative performance comparison between the methods of different groups.
User Documentation for Multiple Software Releases
NASA Technical Reports Server (NTRS)
Humphrey, R.
1982-01-01
In proposed solution to problems of frequent software releases and updates, documentation would be divided into smaller packages, each of which contains data relating to only one of several software components. Changes would not affect entire document. Concept would improve dissemination of information regarding changes and would improve quality of data supporting packages. Would help to insure both timeliness and more thorough scrutiny of changes.
Versatile Software Package For Near Real-Time Analysis of Experimental Data
NASA Technical Reports Server (NTRS)
Wieseman, Carol D.; Hoadley, Sherwood T.
1998-01-01
This paper provides an overview of a versatile software package developed for time- and frequency-domain analyses of experimental wind-tunnel data. This package, originally developed for analyzing data in the NASA Langley Transonic Dynamics Tunnel (TDT), is applicable for analyzing any time-domain data. A Matlab-based software package, TDT-analyzer, provides a compendium of commonly-required dynamic analysis functions in a user-friendly interactive and batch processing environment. TDT-analyzer has been used extensively to provide on-line near real-time and post-test examination and reduction of measured data acquired during wind tunnel tests of aeroelastically-scaled models of aircraft and rotorcraft as well as a flight test of the NASA High Alpha Research Vehicle (HARV) F-18. The package provides near real-time results in an informative and timely manner far exceeding prior methods of data reduction at the TDT.
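A typical building block of the time- and frequency-domain reductions described above is power spectral density estimation of a measured time history; a short SciPy sketch follows. The sample rate and the synthetic signal are placeholders for illustration, not TDT data or TDT-analyzer code.

```python
# Sketch: Welch power spectral density estimate of a measured time history,
# a common frequency-domain reduction step for wind-tunnel dynamic data.
import numpy as np
from scipy.signal import welch

fs = 1000.0                       # assumed sample rate in Hz
t = np.arange(0, 10, 1 / fs)
# Synthetic response: 12 Hz and 37 Hz modes buried in noise.
x = np.sin(2 * np.pi * 12 * t) + 0.5 * np.sin(2 * np.pi * 37 * t)
x += 0.3 * np.random.default_rng(2).normal(size=t.size)

freqs, psd = welch(x, fs=fs, nperseg=2048)
peak = freqs[np.argmax(psd)]
print(f"dominant frequency ≈ {peak:.1f} Hz")
```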
Lin4Neuro: a customized Linux distribution ready for neuroimaging analysis
2011-01-01
Background A variety of neuroimaging software packages have been released from various laboratories worldwide, and many researchers use these packages in combination. Though most of these software packages are freely available, some people find them difficult to install and configure because they are mostly based on UNIX-like operating systems. We developed a live USB-bootable Linux package named "Lin4Neuro." This system includes popular neuroimaging analysis tools. The user interface is customized so that even Windows users can use it intuitively. Results The boot time of this system was only around 40 seconds. We performed a benchmark test of inhomogeneity correction on 10 subjects of three-dimensional T1-weighted MRI scans. The processing speed of USB-booted Lin4Neuro was as fast as that of the package installed on the hard disk drive. We also installed Lin4Neuro on a virtualization software package that emulates the Linux environment on a Windows-based operating system. Although the processing speed was slower than that under other conditions, it remained comparable. Conclusions With Lin4Neuro in one's hand, one can access neuroimaging software packages easily, and immediately focus on analyzing data. Lin4Neuro can be a good primer for beginners of neuroimaging analysis or students who are interested in neuroimaging analysis. It also provides a practical means of sharing analysis environments across sites. PMID:21266047
Lin4Neuro: a customized Linux distribution ready for neuroimaging analysis.
Nemoto, Kiyotaka; Dan, Ippeita; Rorden, Christopher; Ohnishi, Takashi; Tsuzuki, Daisuke; Okamoto, Masako; Yamashita, Fumio; Asada, Takashi
2011-01-25
A variety of neuroimaging software packages have been released from various laboratories worldwide, and many researchers use these packages in combination. Though most of these software packages are freely available, some people find them difficult to install and configure because they are mostly based on UNIX-like operating systems. We developed a live USB-bootable Linux package named "Lin4Neuro." This system includes popular neuroimaging analysis tools. The user interface is customized so that even Windows users can use it intuitively. The boot time of this system was only around 40 seconds. We performed a benchmark test of inhomogeneity correction on 10 subjects of three-dimensional T1-weighted MRI scans. The processing speed of USB-booted Lin4Neuro was as fast as that of the package installed on the hard disk drive. We also installed Lin4Neuro on a virtualization software package that emulates the Linux environment on a Windows-based operating system. Although the processing speed was slower than that under other conditions, it remained comparable. With Lin4Neuro in one's hand, one can access neuroimaging software packages easily, and immediately focus on analyzing data. Lin4Neuro can be a good primer for beginners of neuroimaging analysis or students who are interested in neuroimaging analysis. It also provides a practical means of sharing analysis environments across sites.
Environmental databases and other computerized information tools
NASA Technical Reports Server (NTRS)
Clark-Ingram, Marceia
1995-01-01
Increasing environmental legislation has brought about the development of many new environmental databases and software application packages to aid in the quest for environmental compliance. These databases and software packages are useful tools and applicable to a wide range of environmental areas from atmospheric modeling to materials replacement technology. The great abundance of such products and services can be very overwhelming when trying to identify the tools which best meet specific needs. This paper will discuss the types of environmental databases and software packages available. This discussion will also encompass the affected environmental areas of concern, product capabilities, and hardware requirements for product utilization.
Zhang, Jing; Liang, Lichen; Anderson, Jon R; Gatewood, Lael; Rottenberg, David A; Strother, Stephen C
2008-01-01
As functional magnetic resonance imaging (fMRI) becomes widely used, the demands for evaluation of fMRI processing pipelines and validation of fMRI analysis results are increasing rapidly. The current NPAIRS package, an IDL-based fMRI processing pipeline evaluation framework, lacks system interoperability and the ability to evaluate general linear model (GLM)-based pipelines using prediction metrics. Thus, it cannot fully evaluate fMRI analytical software modules such as FSL.FEAT and NPAIRS.GLM. In order to overcome these limitations, a Java-based fMRI processing pipeline evaluation system was developed. It integrated YALE (a machine learning environment) into Fiswidgets (an fMRI software environment) to obtain system interoperability and applied an algorithm to measure GLM prediction accuracy. The results demonstrated that the system can evaluate fMRI processing pipelines with univariate GLM and multivariate canonical variates analysis (CVA)-based models on real fMRI data based on prediction accuracy (classification accuracy) and statistical parametric image (SPI) reproducibility. In addition, a preliminary study was performed where four fMRI processing pipelines with GLM and CVA modules such as FSL.FEAT and NPAIRS.CVA were evaluated with the system. The results indicated that (1) the system can compare different fMRI processing pipelines with heterogeneous models (NPAIRS.GLM, NPAIRS.CVA and FSL.FEAT) and rank their performance by automatic performance scoring, and (2) the rank of pipeline performance is highly dependent on the preprocessing operations. These results suggest that the system will be of value for the comparison, validation, standardization and optimization of functional neuroimaging software packages and fMRI processing pipelines.
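The two figures of merit used by this NPAIRS-style framework, prediction (classification) accuracy on held-out scans and split-half reproducibility of statistical parametric images, reduce to simple computations once pipeline outputs are available. The sketch below illustrates them on synthetic data; it does not use the framework's actual data structures or algorithms.

```python
# Sketch: NPAIRS-style figures of merit -- classification accuracy on held-out
# scans and split-half reproducibility of statistical parametric images (SPIs).
import numpy as np

rng = np.random.default_rng(9)

# Prediction: fraction of held-out scans whose condition label is predicted correctly.
true_labels = rng.integers(0, 2, size=40)
predicted = np.where(rng.random(40) < 0.85, true_labels, 1 - true_labels)
accuracy = np.mean(predicted == true_labels)

# Reproducibility: correlation between SPIs estimated from two independent data halves.
signal = rng.normal(0, 1, 5000)                  # "true" voxel-wise activation pattern
spi_half1 = signal + rng.normal(0, 0.5, 5000)
spi_half2 = signal + rng.normal(0, 0.5, 5000)
reproducibility = np.corrcoef(spi_half1, spi_half2)[0, 1]

print(f"prediction accuracy = {accuracy:.2f}, SPI reproducibility r = {reproducibility:.2f}")
```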
ERIC Educational Resources Information Center
Radcliffe, George; And Others
1988-01-01
Reviews three software packages: 1) a package containing 68 programs covering general topics in chemistry; 2) a package dealing with acid-base titration curves that allows variables to be changed; 3) a chemistry tutorial and drill package. (MVL)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gaenko, Alexander; Windus, Theresa L.; Sosonkina, Masha
2012-10-19
The design and development of scientific software components to provide an interface to the effective fragment potential (EFP) methods are reported. Multiscale modeling of physical and chemical phenomena demands the merging of software packages developed by research groups in significantly different fields. Componentization offers an efficient way to realize new high performance scientific methods by combining the best models available in different software packages without a need for package readaptation after the initial componentization is complete. The EFP method is an efficient electronic structure theory based model potential that is suitable for predictive modeling of intermolecular interactions in large molecular systems, such as liquids, proteins, atmospheric aerosols, and nanoparticles, with an accuracy that is comparable to that of correlated ab initio methods. The developed components make the EFP functionality accessible for any scientific component-aware software package. The performance of the component is demonstrated on a protein interaction model, and its accuracy is compared with results obtained with coupled cluster methods.
NASA Technical Reports Server (NTRS)
Wolf, Stephen W. D.
1988-01-01
The Wall Adjustment Strategy (WAS) software provides successful on-line control of the 2-D flexible walled test section of the Langley 0.3-m Transonic Cryogenic Tunnel. This software package allows the level of operator intervention to be regulated as necessary for research and production type 2-D testing using an Adaptive Wall Test Section (AWTS). The software is designed to accept modification for future requirements, such as 3-D testing, with a minimum of complexity. The WAS software described is an attempt to provide a user-friendly package which could be used to control any flexible walled AWTS. Control system constraints influence the details of data transfer, not the data type. Thus this entire software package could be used in different control systems, if suitable interface software is available. A complete overview of the software highlights the data flow paths, the modular architecture of the software and the various operating and analysis modes available. A detailed description of the software modules includes listings of the code. A user's manual is provided to explain task generation, operating environment, user options and what to expect at execution.
Halonen, Niina; Kilpijärvi, Joni; Sobocinski, Maciej; Datta-Chaudhuri, Timir; Hassinen, Antti; Prakash, Someshekar B; Möller, Peter; Abshire, Pamela; Kellokumpu, Sakari; Lloyd Spetz, Anita
2016-01-01
Cell viability monitoring is an important part of biosafety evaluation for the detection of toxic effects on cells caused by nanomaterials, preferably by label-free, noninvasive, fast, and cost effective methods. These requirements can be met by monitoring cell viability with a capacitance-sensing integrated circuit (IC) microchip. The capacitance provides a measurement of the surface attachment of adherent cells as an indication of their health status. However, the moist, warm, and corrosive biological environment requires reliable packaging of the sensor chip. In this work, a second generation of low temperature co-fired ceramic (LTCC) technology was combined with flip-chip bonding to provide a durable package compatible with cell culture. The LTCC-packaged sensor chip was integrated with a printed circuit board, data acquisition device, and measurement-controlling software. The packaged sensor chip functioned well in the presence of cell medium and cells, with output voltages depending on the medium above the capacitors. Moreover, the manufacturing of microfluidic channels in the LTCC package was demonstrated.
Considerations for choosing an electronic medical record for an ophthalmology practice.
DeBry, P W
2001-04-01
To give a brief overview of issues pertinent to selecting an ophthalmic electronic medical record (EMR) program and to outline the company demographics and software capabilities of the major vendors in this area. Software companies shipping an EMR package were contacted to obtain information on their software and company demographics. The focus was on companies selectively marketing to ophthalmology practices, and, therefore, most were selected based on their representation at the 1998 and/or 1999 American Academy of Ophthalmology meeting. Software companies that responded to repeated inquiries in a timely fashion were included. Sixteen companies were evaluated. Electronic medical records packages ranged from $3,000 to $80,000 (mean, approximately $30,000). Company demographics revealed a range from 1 to 1600 employees (mean, 204). Most of these companies have been in business for 6 years or less (range, 1-15 years; mean, 6 years). My opinions concerning various aspects of the EMR are presented. There is a wide range of EMR products available for the ophthalmology practice. Computer technology has matured to a point at which the graphical demands of the ophthalmology EMR can be satisfied. Weaknesses do exist in the inherent difficulty of recording an ophthalmology encounter, the relative adolescence of software companies, and the lack of standards in the industry.
Computerised curve deconvolution of TL/OSL curves using a popular spreadsheet program.
Afouxenidis, D; Polymeris, G S; Tsirliganis, N C; Kitis, G
2012-05-01
This paper exploits the possibility of using commercial software for thermoluminescence and optically stimulated luminescence curve deconvolution analysis. The widely used software package Microsoft Excel, with the Solver utility, has been used to perform deconvolution analysis of both experimental and reference glow curves resulting from the GLOw Curve ANalysis INtercomparison project. The simple interface of this programme, combined with the powerful Solver utility, allows the analysis of complex stimulated luminescence curves into their components and the evaluation of the associated luminescence parameters.
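The deconvolution idea is to fit the measured glow curve with a sum of component peaks and read the parameters of each component from the fit. Below is a heavily simplified Python sketch using Gaussian peak shapes purely for illustration; the paper's Excel/Solver workflow uses kinetics-based peak expressions, so the model, temperatures, and parameters here are assumptions.

```python
# Sketch: decompose a glow curve into component peaks by nonlinear least
# squares. Gaussian shapes are used only for illustration; real TL/OSL
# analysis uses kinetics-based peak expressions.
import numpy as np
from scipy.optimize import curve_fit

def two_peaks(T, a1, c1, w1, a2, c2, w2):
    """Sum of two Gaussian-shaped components over temperature T."""
    return (a1 * np.exp(-((T - c1) / w1) ** 2)
            + a2 * np.exp(-((T - c2) / w2) ** 2))

T = np.linspace(300, 550, 500)                       # temperature axis (K)
true = two_peaks(T, 1.0, 380, 18, 0.6, 450, 25)      # synthetic "measured" curve
data = true + np.random.default_rng(3).normal(0, 0.01, T.size)

p0 = [0.8, 370, 20, 0.5, 460, 20]                    # initial guesses, as Solver would need
params, _ = curve_fit(two_peaks, T, data, p0=p0)
print("fitted peak centres:", params[1], params[4])
```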
Evaluation and Selection of a Telecommunication System at the Naval Postgraduate School BOQ
1991-03-01
software, and operating system software as a part of a complete package. An internal modem must be included for remote diagnostic and programming...2 service delivery will follow the CCITT V.35 recommendation for physical, functional and electrical interfaces. Type 1 & 2 will transfer data at 56K ...by a dedicated access line at 4.8 kbs and 9.6 Kbps and 56k /64Kbps. PSS will follow the CCITT X.25 recommendations. E-mail service may be provided on
Evaluation of Magnetospheric Internal Magnetic Field models and Existing Software
1990-01-31
than an order of magnitude. The polar maxima are still visible, but they are not as distinct. The SAA is still apparent, but again it is not as...completely overlap each other either. These plots show polar maxima (the right two panels) and the SAA minimum (the lower right panel). Note that the...[IGRF85 contour-plot labels, Year 1990.0] 4. DISCUSSION OF SOFTWARE 4.1 INTRODUCTION Three basic software packages were used
Evaluating Predictive Models of Software Quality
NASA Astrophysics Data System (ADS)
Ciaschini, V.; Canaparo, M.; Ronchieri, E.; Salomoni, D.
2014-06-01
Applications from the High Energy Physics scientific community are constantly growing and are implemented by a large number of developers. This implies a strong churn on the code and an associated risk of faults, which is unavoidable as long as the software undergoes active evolution. However, the necessities of production systems run counter to this. Stability and predictability are of paramount importance; in addition, a short turn-around time for the defect discovery-correction-deployment cycle is required. A way to reconcile these opposite foci is to use a software quality model to obtain an approximation of the risk before release, so that only software with a risk lower than an agreed threshold is delivered. In this article we evaluated two quality predictive models to identify the operational risk and the quality of some software products. We applied these models to the development history of several EMI packages with the intent to discover the risk factor of each product and compare it with its real history. We attempted to determine whether the models reasonably map reality for the applications under evaluation, and finally we conclude by suggesting directions for further studies.
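Predictive quality models of this kind typically relate code metrics of a module or release to its defect history. The sketch below shows the general shape of such a model with scikit-learn on synthetic metrics; the features, labels, and data are invented for illustration and do not correspond to the EMI packages or to the specific models evaluated in the article.

```python
# Sketch: a simple defect-risk classifier trained on per-module code metrics,
# illustrating the general form of predictive software-quality models.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)
n = 500
# Hypothetical metrics per module: size, cyclomatic complexity, churn.
X = np.column_stack([
    rng.lognormal(5, 1, n),      # lines of code
    rng.poisson(10, n),          # cyclomatic complexity
    rng.poisson(3, n),           # recent changes (churn)
])
risk = 0.002 * X[:, 0] + 0.1 * X[:, 1] + 0.3 * X[:, 2]
y = (risk + rng.normal(0, 1, n) > np.median(risk)).astype(int)  # "faulty" label

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```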
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bevins, N; Vanderhoek, M; Lang, S
2014-06-15
Purpose: Medical display monitor calibration and quality control present challenges to medical physicists. The purpose of this work is to demonstrate and share experiences with an open source package that allows for both initial monitor setup and routine performance evaluation. Methods: A software package, pacsDisplay, has been developed over the last decade to aid in the calibration of all monitors within the radiology group in our health system. The software is used to calibrate monitors to follow the DICOM Grayscale Standard Display Function (GSDF) via lookup tables installed on the workstation. Additional functionality facilitates periodic evaluations of both primary and secondary medical monitors to ensure satisfactory performance. This software is installed on all radiology workstations, and can also be run as a stand-alone tool from a USB disk. Recently, a database has been developed to store and centralize the monitor performance data and to provide long-term trends for compliance with internal standards and various accrediting organizations. Results: Implementation and utilization of pacsDisplay has resulted in improved monitor performance across the health system. Monitor testing is now performed at regular intervals and the software is being used across multiple imaging modalities. Monitor performance characteristics such as maximum and minimum luminance, ambient luminance and illuminance, color tracking, and GSDF conformity are loaded into a centralized database for system performance comparisons. Compliance reports for organizations such as MQSA, ACR, and TJC are generated automatically and stored in the same database. Conclusion: An open source software solution has simplified and improved the standardization of displays within our health system. This work serves as an example method for calibrating and testing monitors within an enterprise health system.
Ellefsen, Karl J.
2017-06-27
MapMark4 is a software package that implements the probability calculations in three-part mineral resource assessments. Functions within the software package are written in the R statistical programming language. These functions, their documentation, and a copy of this user’s guide are bundled together in R’s unit of shareable code, which is called a “package.” This user’s guide includes step-by-step instructions showing how the functions are used to carry out the probability calculations. The calculations are demonstrated using test data, which are included in the package.
FREQ: A computational package for multivariable system loop-shaping procedures
NASA Technical Reports Server (NTRS)
Giesy, Daniel P.; Armstrong, Ernest S.
1989-01-01
Many approaches in the field of linear, multivariable time-invariant systems analysis and controller synthesis employ loop-shaping procedures wherein design parameters are chosen to shape frequency-response singular value plots of selected transfer matrices. A software package, FREQ, is documented for computing within one unified framework many of the most used multivariable transfer matrices for both continuous and discrete systems. The matrices are evaluated at user-selected frequency values, and their singular values are plotted against frequency. Example computations are presented to demonstrate the use of the FREQ code.
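The core computation behind such loop-shaping plots is the singular value decomposition of the transfer matrix G(jω) = C(jωI − A)⁻¹B + D at each frequency of interest. A small NumPy sketch is given below with an arbitrary two-input, two-output state-space model; the model and frequency grid are illustrative assumptions, not FREQ's interface.

```python
# Sketch: singular values of a multivariable frequency response, the quantity
# plotted in loop-shaping procedures such as those supported by FREQ.
import numpy as np

# Arbitrary 2-input, 2-output state-space model (illustrative only).
A = np.array([[-1.0, 2.0], [-2.0, -1.5]])
B = np.eye(2)
C = np.array([[1.0, 0.0], [0.5, 1.0]])
D = np.zeros((2, 2))

omega = np.logspace(-2, 2, 200)          # rad/s
sigma = np.empty((omega.size, 2))
I = np.eye(A.shape[0])
for k, w in enumerate(omega):
    G = C @ np.linalg.solve(1j * w * I - A, B) + D   # G(jw)
    sigma[k] = np.linalg.svd(G, compute_uv=False)    # singular values, descending

print("max singular value over frequency:", sigma[:, 0].max())
```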
Non Contacting Evaluation of Strains and Cracking Using Optical and Infrared Imaging Techniques
1988-08-22
Compatible Zenith Z-386 microcomputer with plotter II. 3-D Motion Measuring System 1. Complete OPTOTRAK three-dimensional digitizing system. System includes...acquisition unit - 16 single-ended analog input channels 3. Data Analysis Package software (KINEPLOT) 4. Extra OPTOTRAK Camera (max 224 per system
USDA-ARS?s Scientific Manuscript database
Geographical information systems (GIS) software packages have been used for nearly three decades as analytical tools in natural resource management for geospatial data assembly, processing, storage, and visualization of input data and model output. However, with increasing availability and use of fu...
Web-Based Notes Is an Inadequate Learning Resource.
ERIC Educational Resources Information Center
Amory, Alan; Naicker, Kevin
Development of online courses requires the use of appropriate educational philosophies that discourage rote learning and passive transfer of information from teacher to learner. This paper reports on the development, use and evaluation of two second year Biology online software packages used by students in constructivist environments. The courses…
Marine beaches are occasionally contaminated by unacceptably high levels of fecal indicator bacteria (FIB) that exceed EPA water quality criteria. Here we describe application of a recent version of the software package Virtual Beach tool (VB 3.0.6) to build and evaluate multiple...
Student Evaluation of CALL Tools during the Design Process
ERIC Educational Resources Information Center
Nesbitt, Dallas
2013-01-01
This article discusses the comparative effectiveness of student input at different times during the design of CALL tools for learning kanji, the Japanese characters of Chinese origin. The CALL software "package" consisted of tools to facilitate the writing, reading and practising of kanji characters in context. A pre-design questionnaire…
Code of Federal Regulations, 2011 CFR
2011-10-01
... Microsoft Open Database Connectivity (ODBC) standard. ODBC is a Windows technology that allows a database software package to import data from a database created using a different software package. We currently...-compatible format. All databases must be supported with adequate documentation on data attributes, SQL...
Scout 2008 Version 1.0 User Guide
The Scout 2008 version 1.0 software package provides a wide variety of classical and robust statistical methods that are not typically available in other commercial software packages. A major part of Scout deals with classical, robust, and resistant univariate and multivariate ou...
INTERFACING SAS TO ORACLE IN THE UNIX ENVIRONMENT
SAS is an EPA standard data and statistical analysis software package while ORACLE is EPA's standard data base management system software package. ORACLE has the advantage over SAS in data retrieval and storage capabilities but has limited data and statistical analysis capability....
Laptop Computer - Based Facial Recognition System Assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
R. A. Cain; G. B. Singleton
2001-03-01
The objective of this project was to assess the performance of the leading commercial-off-the-shelf (COTS) facial recognition software package when used as a laptop application. We performed the assessment to determine the system's usefulness for enrolling facial images in a database from remote locations and conducting real-time searches against a database of previously enrolled images. The assessment involved creating a database of 40 images and conducting 2 series of tests to determine the product's ability to recognize and match subject faces under varying conditions. This report describes the test results and includes a description of the factors affecting the results. After an extensive market survey, we selected Visionics' FaceIt® software package for evaluation and a review of the Facial Recognition Vendor Test 2000 (FRVT 2000). This test was co-sponsored by the US Department of Defense (DOD) Counterdrug Technology Development Program Office, the National Institute of Justice, and the Defense Advanced Research Projects Agency (DARPA). Administered in May-June 2000, the FRVT 2000 assessed the capabilities of facial recognition systems that were currently available for purchase on the US market. Our selection of this Visionics product does not indicate that it is the ''best'' facial recognition software package for all uses. It was the most appropriate package based on the specific applications and requirements for this specific application. In this assessment, the system configuration was evaluated for effectiveness in identifying individuals by searching for facial images captured from video displays against those stored in a facial image database. An additional criterion was that the system be capable of operating discretely. For this application, an operational facial recognition system would consist of one central computer hosting the master image database with multiple standalone systems configured with duplicates of the master operating in remote locations. Remote users could perform real-time searches where network connectivity is not available. As images are enrolled at the remote locations, periodic database synchronization is necessary.
’Pushing a Big Rock Up a Steep Hill’: Acquisition Lessons Learned from DoD Applications Storefront
2014-04-30
software patches, web applications, widgets, and mobile application packages. The envisioned application store will deliver software from a central...automated delivery of software patches, web applications, widgets, and mobile application packages. The envisioned application store will deliver... mobile technologies, hoping to enhance warfighter situational awareness and access to information. Unfortunately, the Defense Acquisition System has not
PIV Data Validation Software Package
NASA Technical Reports Server (NTRS)
Blackshire, James L.
1997-01-01
A PIV data validation and post-processing software package was developed to provide semi-automated data validation and data reduction capabilities for Particle Image Velocimetry data sets. The software provides three primary capabilities including (1) removal of spurious vector data, (2) filtering, smoothing, and interpolating of PIV data, and (3) calculations of out-of-plane vorticity, ensemble statistics, and turbulence statistics information. The software runs on an IBM PC/AT host computer working either under Microsoft Windows 3.1 or Windows 95 operating systems.
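Two of the operations listed, spurious-vector removal and out-of-plane vorticity calculation, can be sketched compactly with NumPy and SciPy. The median-based outlier test, grid spacing, and synthetic field below are generic illustrations of the idea, not the specific algorithms implemented in the NASA package.

```python
# Sketch: simple local-median outlier flagging and out-of-plane vorticity
# (dv/dx - du/dy) for a PIV vector field on a regular grid.
import numpy as np
from scipy.ndimage import median_filter

rng = np.random.default_rng(5)
ny, nx, dx = 64, 64, 1.0e-3                  # grid size and spacing (m)
y, x = np.meshgrid(np.arange(ny) * dx, np.arange(nx) * dx, indexing="ij")
u = np.cos(200 * y)                          # synthetic velocity components (m/s)
v = np.sin(200 * x)
u[10, 10] += 5.0                             # inject one spurious vector

def median_outliers(field, threshold=3.0):
    """Flag vectors far from the median of their 3x3 neighbourhood."""
    med = median_filter(field, size=3)
    resid = np.abs(field - med)
    return resid > threshold * (np.median(resid) + 1e-12)

bad = median_outliers(u) | median_outliers(v)
print("spurious vectors flagged:", int(bad.sum()))

# Out-of-plane vorticity via centred differences.
dvdx = np.gradient(v, dx, axis=1)
dudy = np.gradient(u, dx, axis=0)
vorticity = dvdx - dudy
print("peak |vorticity|:", float(np.abs(vorticity).max()))
```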
ERIC Educational Resources Information Center
Wholeben, Brent Edward
This report describing the use of operations research techniques to determine which courseware packages or what microcomputer systems best address varied instructional objectives focuses on the MICROPIK model, a highly structured evaluation technique for making such complex instructional decisions. MICROPIK is a multiple alternatives model (MAA)…
Hirsch, Robert M.; De Cicco, Laura A.
2015-01-01
Evaluating long-term changes in river conditions (water quality and discharge) is an important use of hydrologic data. To carry out such evaluations, the hydrologist needs tools to facilitate several key steps in the process: acquiring the data records from a variety of sources, structuring it in ways that facilitate the analysis, processing the data with routines that extract information about changes that may be happening, and displaying findings with graphical techniques. A pair of tightly linked R packages, called dataRetrieval and EGRET (Exploration and Graphics for RivEr Trends), have been developed for carrying out each of these steps in an integrated manner. They are designed to easily accept data from three sources: U.S. Geological Survey hydrologic data, U.S. Environmental Protection Agency (EPA) STORET data, and user-supplied flat files. The dataRetrieval package not only serves as a “front end” to the EGRET package, it can also be used to easily download many types of hydrologic data and organize it in ways that facilitate many other hydrologic applications. The EGRET package has components oriented towards the description of long-term changes in streamflow statistics (high flow, average flow, and low flow) as well as changes in water quality. For the water-quality analysis, it uses Weighted Regressions on Time, Discharge and Season (WRTDS) to describe long-term trends in both concentration and flux. EGRET also creates a wide range of graphical presentations of the water-quality data and of the WRTDS results. This report serves as a user guide to these two R packages, providing detailed guidance on installation and use of the software, documentation of the analysis methods used, as well as guidance on some of the kinds of questions and approaches that the software can facilitate.
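The regression structure underlying WRTDS, modeling the logarithm of concentration as a function of time, discharge, and season, can be caricatured in a few lines of Python. The sketch below fits a single global least-squares surface with those explanatory variables as a conceptual stand-in; it omits the local weighting that WRTDS and the EGRET implementation actually use, and all data are synthetic.

```python
# Sketch: the regression surface underlying WRTDS-style trend analysis --
# ln(concentration) modeled on time, ln(discharge), and seasonal terms.
# A global least-squares fit; EGRET uses locally weighted regressions instead.
import numpy as np

rng = np.random.default_rng(6)
n = 1000
dec_time = 2000 + 20 * rng.random(n)               # decimal year of each sample
log_q = rng.normal(3.0, 0.8, n)                    # ln(discharge)
season = 2 * np.pi * (dec_time % 1.0)              # position within the year

# Synthetic ln(concentration) with a slow downward trend and seasonality.
log_c = (1.0 - 0.02 * (dec_time - 2000) - 0.3 * log_q
         + 0.2 * np.sin(season) + rng.normal(0, 0.2, n))

# Design matrix: intercept, time, ln(Q), sin and cos of season.
X = np.column_stack([np.ones(n), dec_time - 2000, log_q,
                     np.sin(season), np.cos(season)])
coef, *_ = np.linalg.lstsq(X, log_c, rcond=None)
print("fitted trend slope (per year, in ln concentration):", coef[1])
```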
QFASAR: Quantitative fatty acid signature analysis with R
Bromaghin, Jeffrey F.
2017-01-01
Knowledge of predator diets provides essential insights into their ecology, yet diet estimation is challenging and remains an active area of research. Quantitative fatty acid signature analysis (QFASA) is a popular method of estimating diet composition that continues to be investigated and extended. However, software to implement QFASA has only recently become publicly available. I summarize a new R package, qfasar, for diet estimation using QFASA methods. The package also provides functionality to evaluate and potentially improve the performance of a library of prey signature data, compute goodness-of-fit diagnostics, and support simulation-based research. Several procedures in the package have not previously been published. qfasar makes traditional and recently published QFASA diet estimation methods accessible to ecologists for the first time. Use of the package is illustrated with signature data from Chukchi Sea polar bears and potential prey species.
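At its core, QFASA estimates the diet proportions whose mixture of prey fatty acid signatures best matches the predator's signature. The constrained least-squares sketch below illustrates that idea with made-up signatures, a simple squared-error distance, and no calibration coefficients; it is not the estimator or interface implemented in qfasar.

```python
# Sketch: estimate diet proportions by finding the convex combination of prey
# fatty acid signatures closest to a predator signature (least squares here;
# QFASA implementations typically use other distance measures and calibration).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(7)
n_fa, n_prey = 20, 4
prey = rng.dirichlet(np.ones(n_fa), size=n_prey)      # prey signatures (rows sum to 1)
true_diet = np.array([0.5, 0.3, 0.2, 0.0])
predator = true_diet @ prey                            # synthetic predator signature

def loss(p):
    return np.sum((p @ prey - predator) ** 2)

constraints = ({"type": "eq", "fun": lambda p: p.sum() - 1.0},)
bounds = [(0.0, 1.0)] * n_prey
result = minimize(loss, x0=np.full(n_prey, 1.0 / n_prey),
                  bounds=bounds, constraints=constraints, method="SLSQP")
print("estimated diet proportions:", np.round(result.x, 3))
```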
AN ADA LINEAR ALGEBRA PACKAGE MODELED AFTER HAL/S
NASA Technical Reports Server (NTRS)
Klumpp, A. R.
1994-01-01
This package extends the Ada programming language to include linear algebra capabilities similar to those of the HAL/S programming language. The package is designed for avionics applications such as Space Station flight software. In addition to the HAL/S built-in functions, the package incorporates the quaternion functions used in the Shuttle and Galileo projects, and routines from LINPAK that solve systems of equations involving general square matrices. Language conventions in this package follow those of HAL/S to the maximum extent practical and minimize the effort required for writing new avionics software and translating existent software into Ada. Valid numeric types in this package include scalar, vector, matrix, and quaternion declarations. (Quaternions are four-component vectors used in representing motion between two coordinate frames). Single precision and double precision floating point arithmetic is available in addition to the standard double precision integer manipulation. Infix operators are used instead of function calls to define dot products, cross products, quaternion products, and mixed scalar-vector, scalar-matrix, and vector-matrix products. The package contains two generic programs: one for floating point, and one for integer. The actual component type is passed as a formal parameter to the generic linear algebra package. The procedures for solving systems of linear equations defined by general matrices include GEFA, GECO, GESL, and GIDI. The HAL/S functions include ABVAL, UNIT, TRACE, DET, INVERSE, TRANSPOSE, GET, PUT, FETCH, PLACE, and IDENTITY. This package is written in Ada (Version 1.2) for batch execution and is machine independent. The linear algebra software depends on nothing outside the Ada language except for a call to a square root function for floating point scalars (such as SQRT in the DEC VAX MATHLIB library). This program was developed in 1989, and is a copyrighted work with all copyright vested in NASA.
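The quaternion products mentioned above (used to represent motion between coordinate frames) are easy to state outside Ada. A small Python sketch of the Hamilton product follows; the scalar-first component ordering is an assumption for illustration, since the package's own convention is not given here.

```python
# Sketch: Hamilton product of two quaternions (scalar-first convention),
# the kind of operation the Ada package exposes via infix operators.
import numpy as np

def quat_mul(q, r):
    """Return q*r for quaternions given as (w, x, y, z)."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
    ])

# 90-degree rotations about x then y, composed by quaternion multiplication.
qx = np.array([np.cos(np.pi / 4), np.sin(np.pi / 4), 0.0, 0.0])
qy = np.array([np.cos(np.pi / 4), 0.0, np.sin(np.pi / 4), 0.0])
print(quat_mul(qy, qx))
```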
SIMA: Python software for analysis of dynamic fluorescence imaging data.
Kaifosh, Patrick; Zaremba, Jeffrey D; Danielson, Nathan B; Losonczy, Attila
2014-01-01
Fluorescence imaging is a powerful method for monitoring dynamic signals in the nervous system. However, analysis of dynamic fluorescence imaging data remains burdensome, in part due to the shortage of available software tools. To address this need, we have developed SIMA, an open source Python package that facilitates common analysis tasks related to fluorescence imaging. Functionality of this package includes correction of motion artifacts occurring during in vivo imaging with laser-scanning microscopy, segmentation of imaged fields into regions of interest (ROIs), and extraction of signals from the segmented ROIs. We have also developed a graphical user interface (GUI) for manual editing of the automatically segmented ROIs and automated registration of ROIs across multiple imaging datasets. This software has been designed with flexibility in mind to allow for future extension with different analysis methods and potential integration with other packages. Software, documentation, and source code for the SIMA package and ROI Buddy GUI are freely available at http://www.losonczylab.org/sima/.
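Motion-artifact correction of the kind described can be approximated, for rigid translations, by phase correlation of each frame against a reference image. The generic NumPy sketch below illustrates that concept only; it is not SIMA's algorithm or API, and the image data are synthetic.

```python
# Sketch: rigid-translation registration of an imaging frame to a reference
# via phase correlation, a generic stand-in for motion correction.
import numpy as np

def phase_correlation_shift(reference, frame):
    """Estimate the integer (dy, dx) shift that aligns frame to reference."""
    F1 = np.fft.fft2(reference)
    F2 = np.fft.fft2(frame)
    cross_power = F1 * np.conj(F2)
    cross_power /= np.abs(cross_power) + 1e-12        # normalized cross-power spectrum
    corr = np.fft.ifft2(cross_power).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap shifts larger than half the image size to negative offsets.
    if dy > reference.shape[0] // 2:
        dy -= reference.shape[0]
    if dx > reference.shape[1] // 2:
        dx -= reference.shape[1]
    return dy, dx

rng = np.random.default_rng(8)
ref = rng.random((128, 128))
moved = np.roll(ref, shift=(3, -5), axis=(0, 1))   # frame displaced by (3, -5)
print(phase_correlation_shift(ref, moved))          # correction shift, here (-3, 5)
```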
New generation of exploration tools: interactive modeling software and microcomputers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krajewski, S.A.
1986-08-01
Software packages offering interactive modeling techniques are now available for use on microcomputer hardware systems. These packages are reasonably priced for both company and independent explorationists; they do not require users to have high levels of computer literacy; they are capable of rapidly completing complex ranges of sophisticated geologic and geophysical modeling tasks; and they can produce presentation-quality output for comparison with real-world data. For example, interactive packages are available for mapping, log analysis, seismic modeling, reservoir studies, and financial projects as well as for applying a variety of statistical and geostatistical techniques to analysis of exploration data. More importantly, these packages enable explorationists to directly apply their geologic expertise when developing and fine-tuning models for identifying new prospects and for extending producing fields. As a result of these features, microcomputers and interactive modeling software are becoming common tools in many exploration offices. Gravity and magnetics software programs illustrate some of the capabilities of such exploration tools.
Regional demand forecasting and simulation model: user's manual. Task 4, final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parhizgari, A M
1978-09-25
The Department of Energy's Regional Demand Forecasting Model (RDFOR) is an econometric and simulation system designed to estimate annual fuel-sector-region specific consumption of energy for the US. Its purposes are to (1) provide the demand side of the Project Independence Evaluation System (PIES), (2) enhance our empirical insights into the structure of US energy demand, and (3) assist policymakers in their decisions on and formulations of various energy policies and/or scenarios. This report provides a self-contained user's manual for interpreting, utilizing, and implementing RDFOR simulation software packages. Chapters I and II present the theoretical structure and the simulation of RDFOR, respectively. Chapter III describes several potential scenarios which are (or have been) utilized in the RDFOR simulations. Chapter IV presents an overview of the complete software package utilized in simulation. Chapter V provides the detailed explanation and documentation of this package. The last chapter describes step-by-step implementation of the simulation package using the two scenarios detailed in Chapter III. The RDFOR model contains 14 fuels: gasoline, electricity, natural gas, distillate and residual fuels, liquid gases, jet fuel, coal, oil, petroleum products, asphalt, petroleum coke, metallurgical coal, and total fuels, spread over residential, commercial, industrial, and transportation sectors.
Diagnosis diagrams for passing signals on an automatic block signaling railway section
NASA Astrophysics Data System (ADS)
Spunei, E.; Piroi, I.; Chioncel, C. P.; Piroi, F.
2018-01-01
This work presents a diagnosis method for railway traffic security installations. More specifically, the authors present a series of diagnosis charts for passing signals on a railway block equipped with an automatic block signaling installation. These charts are based on the exploitation electric schemes, and are subsequently used to develop a diagnosis software package. The software package thus developed contributes substantially to reducing failure detection and remedy times for these types of installation faults. The use of the software package eliminates making wrong decisions in the fault detection process, decisions that may result in longer remedy times and, sometimes, in railway traffic events.
Orbit determination for ISRO satellite missions
NASA Astrophysics Data System (ADS)
Rao, Ch. Sreehari; Sinha, S. K.
Indian Space Research Organisation (ISRO) has been successful in using the in-house developed orbit determination and prediction software for satellite missions of Bhaskara, Rohini and APPLE. Considering the requirements of satellite missions, software packages are developed, tested and their accuracies are assessed. Orbit determination packages developed are SOIP, for low earth orbits of Bhaskara and Rohini missions, ORIGIN and ODPM, for orbits related to all phases of geo-stationary missions and SEGNIP, for drift and geo-stationary orbits. Software is tested and qualified using tracking data of SIGNE-3, D5-B, OTS, SYMPHONIE satellites with the help of software available with CNES, ESA and DFVLR. The results match well with those available from these agencies. These packages have supported orbit determination successfully throughout the mission life for all ISRO satellite missions.
Prototyping with Data Dictionaries for Requirements Analysis.
1985-03-01
statistical packages and software for screen layout. These items work at a higher level than another category of prototyping tool, program generators... Program generators are software packages which, when given specifications, produce source listings, usually in a high order language such as COBOL... with users, and this will not happen if he must stop to develop a detailed program. [Ref. 241] Hardware as well as software should be considered in
Electronic and software subsystems for an autonomous roving vehicle. M.S. Thesis
NASA Technical Reports Server (NTRS)
Doig, G. A.
1980-01-01
The complete electronics packaging which controls the Mars roving vehicle is described in order to provide a broad overview of the systems that are part of that package. Some software debugging tools are also discussed. Particular emphasis is given to those systems that are controlled by the microprocessor. These include the laser mast, the telemetry system, the command link prime interface board, and the prime software.
The Shock and Vibration Digest. Volume 17, Number 4
1985-04-01
...software packages for engineering computations which were specifically written for use on microcomputers. Software packages related to shock and vibration are available for both experimental and analytical applications. ...designed to be easy to use from the outset, and this design philosophy is largely responsible for their increasing popularity; this same design philosophy appears to have been carried over to the design of today's... Typical software
Hierarchical Petascale Simulation Framework For Stress Corrosion Cracking
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grama, Ananth
2013-12-18
A number of major accomplishments resulted from the project. These include: • Data Structures, Algorithms, and Numerical Methods for Reactive Molecular Dynamics. We have developed a range of novel data structures, algorithms, and solvers (amortized ILU, Spike) for use with ReaxFF and charge equilibration. • Parallel Formulations of Reactive MD (Purdue Reactive Molecular Dynamics Package, PuReMD, PuReMD-GPU, and PG-PuReMD) for Messaging, GPU, and GPU Cluster Platforms. We have developed efficient serial, parallel (MPI), GPU (Cuda), and GPU Cluster (MPI/Cuda) implementations. Our implementations have been demonstrated to be significantly better than the state of the art, both in terms of performance and scalability. • Comprehensive Validation in the Context of Diverse Applications. We have demonstrated the use of our software in diverse systems, including silica-water and silicon-germanium nanorods, and, as part of other projects, extended it to applications ranging from explosives (RDX) to lipid bilayers (biomembranes under oxidative stress). • Open Source Software Packages for Reactive Molecular Dynamics. All versions of our software have been released over the public domain. There are over 100 major research groups worldwide using our software. • Implementation into the Department of Energy LAMMPS Software Package. We have also integrated our software into the Department of Energy LAMMPS software package.
Teaching Science and Mathematics Subjects Using the Excel Spreadsheet Package
ERIC Educational Resources Information Center
Ibrahim, Dogan
2009-01-01
The teaching of scientific subjects usually requires laboratories where students can put the theory they have learned into practice. Traditionally, electronic programmable calculators, dedicated software, or expensive software simulation packages, such as MATLAB, have been used to simulate scientific experiments. Recently, spreadsheet programs have…
Description of the IV + V System Software Package.
ERIC Educational Resources Information Center
Microcomputers for Information Management: An International Journal for Library and Information Services, 1984
1984-01-01
Describes the IV + V System, a software package designed by the Institut fur Maschinelle Dokumentation for the United Nations General Information Programme and UNISIST to support automation of local information and documentation services. Principal program features and functions outlined include input/output, databank, text image, output, and…
InterFace: A software package for face image warping, averaging, and principal components analysis.
Kramer, Robin S S; Jenkins, Rob; Burton, A Mike
2017-12-01
We describe InterFace, a software package for research in face recognition. The package supports image warping, reshaping, averaging of multiple face images, and morphing between faces. It also supports principal components analysis (PCA) of face images, along with tools for exploring the "face space" produced by PCA. The package uses a simple graphical user interface, allowing users to perform these sophisticated image manipulations without any need for programming knowledge. The program is available for download in the form of an app, which requires that users also have access to the (freely available) MATLAB Runtime environment.
Evolution of a modular software network
Fortuna, Miguel A.; Bonachela, Juan A.; Levin, Simon A.
2011-01-01
“Evolution behaves like a tinkerer” (François Jacob, Science, 1977). Software systems provide a singular opportunity to understand biological processes using concepts from network theory. The Debian GNU/Linux operating system allows us to explore the evolution of a complex network in a unique way. The modular design detected during its growth is based on the reuse of existing code in order to minimize costs during programming. The increase of modularity experienced by the system over time has not counterbalanced the increase in incompatibilities between software packages within modules. This negative effect is far from being a failure of design. A random process of package installation shows that the higher the modularity, the larger the fraction of packages working properly in a local computer. The decrease in the relative number of conflicts between packages from different modules avoids a failure in the functionality of one package spreading throughout the entire system. Some potential analogies with the evolutionary and ecological processes determining the structure of ecological networks of interacting species are discussed. PMID:22106260
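The random-installation argument above can be illustrated with a toy simulation (not the authors' model; the package counts, conflict counts, and the within-module conflict fraction below are arbitrary illustrative choices):

```python
import random

def simulate(n_packages=200, n_modules=10, n_conflicts=300,
             within_fraction=0.8, seed=0):
    """Fraction of packages that install cleanly under a random install order.

    within_fraction is the share of conflict edges placed inside modules;
    a higher value mimics a more modular conflict structure.
    """
    rng = random.Random(seed)
    modules = [[p for p in range(n_packages) if p % n_modules == m]
               for m in range(n_modules)]
    conflicts = set()
    while len(conflicts) < n_conflicts:
        if rng.random() < within_fraction:          # conflict inside one module
            m = rng.randrange(n_modules)
            a, b = rng.sample(modules[m], 2)
        else:                                        # conflict across two modules
            m1, m2 = rng.sample(range(n_modules), 2)
            a, b = rng.choice(modules[m1]), rng.choice(modules[m2])
        conflicts.add(frozenset((a, b)))
    installed = set()
    order = list(range(n_packages))
    rng.shuffle(order)
    for p in order:                                  # install packages one by one
        if all(frozenset((p, q)) not in conflicts for q in installed):
            installed.add(p)
    return len(installed) / n_packages

# A more modular conflict structure typically lets more packages install cleanly.
print(simulate(within_fraction=0.9), simulate(within_fraction=0.1))
```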
Image analysis software versus direct anthropometry for breast measurements.
Quieregatto, Paulo Rogério; Hochman, Bernardo; Furtado, Fabianne; Machado, Aline Fernanda Perez; Sabino Neto, Miguel; Ferreira, Lydia Masako
2014-10-01
To compare breast measurements performed using the software packages ImageTool®, AutoCAD®, and Adobe Photoshop® with direct anthropometric measurements. Points were marked on the breasts and arms of 40 volunteer women aged between 18 and 60 years. When connecting the points, seven linear segments and one angular measurement on each half of the body, and one medial segment common to both body halves were defined. The volunteers were photographed in a standardized manner. Photogrammetric measurements were performed by three independent observers using the three software packages and compared to direct anthropometric measurements made with calipers and a protractor. Measurements obtained with AutoCAD® were the most reproducible and those made with ImageTool® were the most similar to direct anthropometry, while measurements with Adobe Photoshop® showed the largest differences. Except for angular measurements, significant differences were found between measurements of line segments made using the three software packages and those obtained by direct anthropometry. AutoCAD® provided the highest precision and intermediate accuracy; ImageTool® had the highest accuracy and lowest precision; and Adobe Photoshop® showed intermediate precision and the worst accuracy among the three software packages.
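As a rough sketch of the photogrammetric idea common to all three packages, measuring a marked segment amounts to a pixel distance scaled by a calibration object of known length; the coordinates and the 100 mm ruler below are hypothetical:

```python
import math

def pixel_distance(p1, p2):
    """Euclidean distance between two marked points in pixel coordinates."""
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1])

# Hypothetical calibration: a ruler of known length (100 mm) appears in the photograph.
ruler_px = pixel_distance((120, 410), (120, 910))
mm_per_px = 100.0 / ruler_px

# A hypothetical anatomical segment marked by two points in the same photograph.
segment_mm = pixel_distance((342, 518), (519, 702)) * mm_per_px
print(f"segment length: {segment_mm:.1f} mm")
```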
Simulating Freak Waves in the Ocean with CFD Modeling
NASA Astrophysics Data System (ADS)
Manolidis, M.; Orzech, M.; Simeonov, J.
2017-12-01
Rogue, or freak, waves constitute an active topic of research within the world scientific community, as various maritime authorities around the globe seek to better understand and more accurately assess the risks that the occurrence of such phenomena entail. Several experimental studies have shed some light on the mechanics of rogue wave formation. In our work we numerically simulate the formation of such waves in oceanic conditions by means of Computational Fluid Dynamics (CFD) software. For this purpose we implement the NHWAVE and OpenFOAM software packages. Both are non-hydrostatic, turbulent flow solvers, but NHWAVE implements a shock-capturing scheme at the free surface-interface, while OpenFOAM utilizes the Volume Of Fluid (VOF) method. NHWAVE has been shown to accurately reproduce highly nonlinear surface wave phenomena, such as soliton propagation and wave shoaling. We conducted a range of tests simulating rogue wave formation and horizontally varying currents to evaluate and compare the capabilities of the two software packages. Then we used each model to investigate the effect of ocean currents and current gradients on the formation of rogue waves. We present preliminary results.
PyPedal, an open source software package for pedigree analysis
USDA-ARS?s Scientific Manuscript database
The open source software package PyPedal (http://pypedal.sourceforge.net/) was first released in 2002, and provided users with a set of simple tools for manipulating pedigrees. Its flexibility has been demonstrated by its use in a number of settings for large and small populations. After substantia...
A Simple Interactive Software Package for Plotting, Animating, and Calculating
ERIC Educational Resources Information Center
Engelhardt, Larry
2012-01-01
We introduce a new open source (free) software package that provides a simple, highly interactive interface for carrying out certain mathematical tasks that are commonly encountered in physics. These tasks include plotting and animating functions, solving systems of coupled algebraic equations, and basic calculus (differentiating and integrating…
Analysis of Variance: What Is Your Statistical Software Actually Doing?
ERIC Educational Resources Information Center
Li, Jian; Lomax, Richard G.
2011-01-01
Users assume statistical software packages produce accurate results. In this article, the authors systematically examined Statistical Package for the Social Sciences (SPSS) and Statistical Analysis System (SAS) for 3 analysis of variance (ANOVA) designs, mixed-effects ANOVA, fixed-effects analysis of covariance (ANCOVA), and nested ANOVA. For each…
A Multi-User Microcomputer System for Small Libraries.
ERIC Educational Resources Information Center
Leggate, Peter
1988-01-01
Describes the development of Bookshelf, a multi-user microcomputer system for small libraries that uses an integrated software package. The discussion covers the design parameters of the package, which were based on a survey of seven small libraries, and some characteristics of the software. (three notes with references) (CLB)
Software, Copyright, and Site-License Agreements: Publishers' Perspective of Library Practice.
ERIC Educational Resources Information Center
Happer, Stephanie K.
Thirty-one academic publishers of stand-alone software and book/disk packages were surveyed to determine whether publishers have addressed the copyright issues inherent in circulating these packages within the library environment. Twenty-two questionnaires were returned, providing a 71% return rate. There were 18 usable questionnaires. Publishers…
Datson, D J; Carter, N G
1988-10-01
The use of personal computers in accountancy and business generally has been stimulated by the availability of flexible software packages. We describe the implementation of a commercial software package designed for interfacing with laboratory instruments and highlight the ease with which it can be implemented, without the need for specialist computer programming staff.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Piburn, Jesse
2016-04-22
Access to the World Bank Data API through the R language was previously limited to one existing package, which is limited in its capabilities. This software provides access to all of the features of the World Bank API in one software package for the R language and provides functions for searching and downloading data from the World Bank API.
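For readers outside R, the underlying service is the public World Bank API, which can be queried directly over HTTP. The sketch below uses Python's requests and, to the best of my understanding of the v2 endpoint, the standard format/per_page/date query parameters; the indicator code and country are just examples:

```python
import requests

# Query the public World Bank API directly (the R package wraps calls like this).
# Indicator SP.POP.TOTL is total population; "BR" is Brazil.
url = "https://api.worldbank.org/v2/country/BR/indicator/SP.POP.TOTL"
resp = requests.get(url, params={"format": "json", "per_page": 100, "date": "2000:2015"})
resp.raise_for_status()

metadata, records = resp.json()          # first element is paging metadata
for rec in records[:5]:
    print(rec["date"], rec["value"])
```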
Propensity Score Analysis in R: A Software Review
ERIC Educational Resources Information Center
Keller, Bryan; Tipton, Elizabeth
2016-01-01
In this article, we review four software packages for implementing propensity score analysis in R: "Matching, MatchIt, PSAgraphics," and "twang." After briefly discussing essential elements for propensity score analysis, we apply each package to a data set from the Early Childhood Longitudinal Study in order to estimate the…
A Systematic Approach for Understanding Slater-Gaussian Functions in Computational Chemistry
ERIC Educational Resources Information Center
Stewart, Brianna; Hylton, Derrick J.; Ravi, Natarajan
2013-01-01
A systematic way to understand the intricacies of quantum mechanical computations done by a software package known as "Gaussian" is undertaken via an undergraduate research project. These computations involve the evaluation of key parameters in a fitting procedure to express a Slater-type orbital (STO) function in terms of the linear…
The Structure of the Library Market for Scientific Journals: The Case of Chemistry.
ERIC Educational Resources Information Center
Bensman, Stephen J.
1996-01-01
An analysis of price and scientific value of chemistry journals concluded that scientific value does not play a role in the pricing of scientific journals and that consequently little relationship exists between scientific value and the prices charged libraries for journals. Describes a software package, Serials Evaluator, being developed at…
Software engineering and data management for automated payload experiment tool
NASA Technical Reports Server (NTRS)
Maddux, Gary A.; Provancha, Anna; Chattam, David
1994-01-01
The Microgravity Projects Office identified a need to develop a software package that will lead experiment developers through the development planning process, obtain necessary information, establish an electronic data exchange avenue, and allow easier manipulation/reformatting of the collected information. An MS-DOS compatible software package called the Automated Payload Experiment Tool (APET) has been developed and delivered. The objective of this task is to expand on the results of the APET work previously performed by UAH and provide versions of the software in a Macintosh and Windows compatible format.
Advanced fingerprint verification software
NASA Astrophysics Data System (ADS)
Baradarani, A.; Taylor, J. R. B.; Severin, F.; Maev, R. Gr.
2016-05-01
We have developed a fingerprint software package that can be used in a wide range of applications from law enforcement to public and private security systems, and to personal devices such as laptops, vehicles, and door locks. The software and processing units are a unique implementation of new and sophisticated algorithms that compete with the current best systems in the world. Development of the software package has been in line with the third generation of our ultrasonic fingerprinting machine [1]. Solid and robust performance is achieved in the presence of misplaced and low-quality fingerprints.
Evaluation of RDBMS packages for use in astronomy
NASA Technical Reports Server (NTRS)
Page, C. G.; Davenhall, A. C.
1992-01-01
Tabular data sets arise in many areas of astronomical data analysis, from raw data (such as photon event lists) to final results (such as source catalogs). The Starlink catalog access and reporting package, SCAR, was originally developed to handle IRAS data and it has been the principal relational DBMS in the Starlink software collection for several years. But SCAR has many limitations and is VMS-specific, while Starlink is in transition from VMS to Unix. Rather than attempt a major re-write of SCAR for Unix, it seemed more sensible to see whether any existing database packages are suitable for general astronomical use. The authors first drew up a list of desirable properties for such a system and then used these criteria to evaluate a number of packages, both free ones and those commercially available. It is already clear that most commercial DBMS packages are not very well suited to the requirements; for example, most cannot carry out efficiently even fairly basic operations such as joining two catalogs on an approximate match of celestial positions. This paper reports the results of the evaluation exercise and notes the problems in using a standard DBMS package to process scientific data. In parallel with this the authors have started to develop a simple database engine that can handle tabular data in a range of common formats including simple direct-access files (such as SCAR and Exosat DBMS tables) and FITS tables (both ASCII and binary).
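The operation the authors single out, joining two catalogs on an approximate match of celestial positions, can be sketched as follows. This is a brute-force Python illustration, not the SCAR or DBMS implementation; real tools index the sky to avoid the quadratic loop:

```python
import math

def ang_sep_deg(ra1, dec1, ra2, dec2):
    """Angular separation in degrees (spherical law of cosines; inputs in degrees)."""
    ra1, dec1, ra2, dec2 = map(math.radians, (ra1, dec1, ra2, dec2))
    cos_sep = (math.sin(dec1) * math.sin(dec2) +
               math.cos(dec1) * math.cos(dec2) * math.cos(ra1 - ra2))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_sep))))

def positional_join(cat_a, cat_b, radius_arcsec=5.0):
    """Pair each source in cat_a with sources in cat_b within the match radius."""
    radius_deg = radius_arcsec / 3600.0
    matches = []
    for ia, (ra_a, dec_a) in enumerate(cat_a):
        for ib, (ra_b, dec_b) in enumerate(cat_b):
            if ang_sep_deg(ra_a, dec_a, ra_b, dec_b) <= radius_deg:
                matches.append((ia, ib))
    return matches

cat_a = [(10.684, 41.269), (83.822, -5.391)]     # (RA, Dec) in degrees
cat_b = [(10.6845, 41.2690), (201.365, -43.019)]
print(positional_join(cat_a, cat_b))             # [(0, 0)]
```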
NASA Astrophysics Data System (ADS)
Witzel, Gunther; Lu, Jessica R.; Ghez, Andrea M.; Martinez, Gregory D.; Fitzgerald, Michael P.; Britton, Matthew; Sitarski, Breann N.; Do, Tuan; Campbell, Randall D.; Service, Maxwell; Matthews, Keith; Morris, Mark R.; Becklin, E. E.; Wizinowich, Peter L.; Ragland, Sam; Doppmann, Greg; Neyman, Chris; Lyke, James; Kassis, Marc; Rizzi, Luca; Lilley, Scott; Rampy, Rachel
2016-07-01
General relativity can be tested in the strong gravity regime by monitoring stars orbiting the supermassive black hole at the Galactic Center with adaptive optics. However, the limiting source of uncertainty is the spatial PSF variability due to atmospheric anisoplanatism and instrumental aberrations. The Galactic Center Group at UCLA has completed a project developing algorithms to predict PSF variability for Keck AO images. We have created a new software package (AIROPA), based on modified versions of StarFinder and Arroyo, that takes atmospheric turbulence profiles, instrumental aberration maps, and images as inputs and delivers improved photometry and astrometry on crowded fields. This software package will be made publicly available soon.
WannierTools: An open-source software package for novel topological materials
NASA Astrophysics Data System (ADS)
Wu, QuanSheng; Zhang, ShengNan; Song, Hai-Feng; Troyer, Matthias; Soluyanov, Alexey A.
2018-03-01
We present an open-source software package, WannierTools, a tool for the investigation of novel topological materials. This code works in the tight-binding framework, which can be generated by another software package, Wannier90 (Mostofi et al., 2008). It can help to classify the topological phase of a given material by calculating the Wilson loop, and can obtain the surface state spectrum, which is probed by angle-resolved photoemission spectroscopy (ARPES) and scanning tunneling microscopy (STM) experiments. It also identifies the positions of Weyl/Dirac points and nodal line structures, and calculates the Berry phase around a closed momentum loop and the Berry curvature in a part of the Brillouin zone (BZ).
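The Wilson-loop/Berry-phase calculation mentioned above reduces, on a discretized loop, to a product of overlaps between neighboring eigenstates. The sketch below is a generic two-level illustration in Python, not WannierTools code; the sign of the result depends on loop orientation and phase convention, so only the magnitude is compared against the analytic value:

```python
import numpy as np

def lowest_eigvec(H):
    """Ground-state eigenvector of a Hermitian matrix (eigh sorts eigenvalues ascending)."""
    _, v = np.linalg.eigh(H)
    return v[:, 0]

def berry_phase(h_of_t, n=400):
    """Discrete Berry phase of the lowest band around a closed parameter loop:
    phi = -Im ln prod <u_k | u_(k+1)>, which is gauge invariant for a closed loop."""
    states = [lowest_eigvec(h_of_t(2 * np.pi * k / n)) for k in range(n)]
    states.append(states[0])
    prod = 1.0 + 0j
    for a, b in zip(states[:-1], states[1:]):
        prod *= np.vdot(a, b)
    return -np.angle(prod)

# Spin-1/2 in a field sweeping a cone of polar angle theta:
# Berry phase magnitude = pi * (1 - cos(theta)).
sx = np.array([[0, 1], [1, 0]], complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], complex)
theta = np.pi / 3
h = lambda phi: -(np.sin(theta) * np.cos(phi) * sx +
                  np.sin(theta) * np.sin(phi) * sy + np.cos(theta) * sz)
print(abs(berry_phase(h)), np.pi * (1 - np.cos(theta)))
```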
Use of symbolic computation in robotics education
NASA Technical Reports Server (NTRS)
Vira, Naren; Tunstel, Edward
1992-01-01
An application of symbolic computation in robotics education is described. A software package is presented which combines generality, user interaction, and user-friendliness with the systematic usage of symbolic computation and artificial intelligence techniques. The software utilizes MACSYMA, a LISP-based symbolic algebra language, to automatically generate closed-form expressions representing forward and inverse kinematics solutions, the Jacobian transformation matrices, robot pose error-compensation model equations, and the Lagrange dynamics formulation for N degree-of-freedom, open-chain robotic manipulators. The goal of such a package is to aid faculty and students in the robotics course by removing the burdensome tasks of mathematical manipulations. The software package has been successfully tested for its accuracy using commercially available robots.
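The package described uses MACSYMA; as an analogous, much smaller illustration of symbolic kinematics, the sketch below uses Python's sympy for a hypothetical planar two-link arm (not the package in the abstract, which handles general N-DOF chains):

```python
import sympy as sp

# Symbolic forward kinematics for a planar 2-link arm.
theta1, theta2, l1, l2 = sp.symbols('theta1 theta2 l1 l2', real=True)
x = l1 * sp.cos(theta1) + l2 * sp.cos(theta1 + theta2)
y = l1 * sp.sin(theta1) + l2 * sp.sin(theta1 + theta2)

# Jacobian of the end-effector position with respect to the joint angles.
J = sp.Matrix([x, y]).jacobian([theta1, theta2])
print(sp.simplify(J))

# Closed-form piece of the inverse kinematics: cos(theta2) from the law of cosines.
px, py = sp.symbols('px py', real=True)
cos_t2 = (px**2 + py**2 - l1**2 - l2**2) / (2 * l1 * l2)
print(sp.simplify(cos_t2))
```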
Hybrid Optimization Parallel Search PACKage
DOE Office of Scientific and Technical Information (OSTI.GOV)
2009-11-10
HOPSPACK is open source software for solving optimization problems without derivatives. Application problems may have a fully nonlinear objective function, bound constraints, and linear and nonlinear constraints. Problem variables may be continuous, integer-valued, or a mixture of both. The software provides a framework that supports any derivative-free type of solver algorithm. Through the framework, solvers request parallel function evaluation, which may use MPI (multiple machines) or multithreading (multiple processors/cores on one machine). The framework provides a Cache and Pending Cache of saved evaluations that reduces execution time and facilitates restarts. Solvers can dynamically create other algorithms to solve subproblems, a useful technique for handling multiple start points and integer-valued variables. HOPSPACK ships with the Generating Set Search (GSS) algorithm, developed at Sandia as part of the APPSPACK open source software project.
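The Generating Set Search idea can be sketched in a few lines: poll a set of directions, accept improving points, and contract the step when none improve. The Python sketch below is illustrative only and omits the constraints, caching, and parallel evaluation that HOPSPACK provides:

```python
def gss_minimize(f, x0, step=1.0, tol=1e-6, max_iter=10000):
    """Minimal coordinate-direction generating-set (pattern) search, derivative-free.

    Polls +/- each coordinate direction; accepts any improving point,
    otherwise contracts the step until it falls below tol.
    """
    x = list(x0)
    fx = f(x)
    for _ in range(max_iter):
        if step < tol:
            break
        improved = False
        for i in range(len(x)):
            for sign in (+1.0, -1.0):
                trial = list(x)
                trial[i] += sign * step
                ft = f(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
                    break
            if improved:
                break
        if not improved:
            step *= 0.5          # contract when no poll point improves
    return x, fx

quadratic = lambda v: (v[0] - 3.0) ** 2 + (v[1] + 1.0) ** 2
print(gss_minimize(quadratic, [0.0, 0.0]))   # converges toward (3, -1)
```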
Wang, Anliang; Yan, Xiaolong; Wei, Zhijun
2018-04-27
This note presents the design of a scalable software package named ImagePy for analysing biological images. Our contribution is concentrated on facilitating extensibility and interoperability of the software through decoupling the data model from the user interface. Especially with assistance from the Python ecosystem, this software framework makes modern computer algorithms easier to be applied in bioimage analysis. ImagePy is free and open source software, with documentation and code available at https://github.com/Image-Py/imagepy under the BSD license. It has been tested on the Windows, Mac and Linux operating systems. wzjdlut@dlut.edu.cn or yxdragon@imagepy.org.
Problems in Analyzing Time Series with Gaps and Their Solution with the WinABD Software Package
NASA Astrophysics Data System (ADS)
Desherevskii, A. V.; Zhuravlev, V. I.; Nikolsky, A. N.; Sidorin, A. Ya.
2017-12-01
Technologies for the analysis of time series with gaps are considered. Algorithms for signal extraction (purification) and for evaluating signal characteristics, such as rhythmic components, are discussed for series with gaps. Examples are given for the analysis of data obtained during long-term observations at the Garm geophysical test site and in other regions. The technical solutions used in the WinABD software package to organize the efficient operation of these algorithms in the presence of observational defects are considered.
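As a generic illustration of gap-aware processing (not the WinABD algorithms), the pandas sketch below smooths a gappy series with a rolling mean that requires a minimum number of valid points, then estimates a mean annual cycle; the synthetic series and thresholds are arbitrary:

```python
import numpy as np
import pandas as pd

# Synthetic daily series with an annual rhythm, a trend, noise, and observation gaps.
t = pd.date_range("2000-01-01", periods=2000, freq="D")
signal = 0.002 * np.arange(len(t)) + np.sin(2 * np.pi * np.arange(len(t)) / 365.25)
values = signal + np.random.default_rng(0).normal(0, 0.2, len(t))
series = pd.Series(values, index=t)
series.iloc[300:420] = np.nan                                   # one long gap
series.loc[series.sample(frac=0.1, random_state=1).index] = np.nan  # scattered gaps

# Gap-aware smoothing: the rolling mean needs a minimum number of valid points,
# so long gaps stay empty instead of being bridged silently.
trend = series.rolling(window=181, center=True, min_periods=90).mean()
residual = series - trend
seasonal = residual.groupby(residual.index.dayofyear).mean()    # mean annual cycle
print(seasonal.head())
```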
A User-Friendly Software Package for HIFU Simulation
NASA Astrophysics Data System (ADS)
Soneson, Joshua E.
2009-04-01
A freely-distributed, MATLAB (The Mathworks, Inc., Natick, MA)-based software package for simulating axisymmetric high-intensity focused ultrasound (HIFU) beams and their heating effects is discussed. The package (HIFU_Simulator) consists of a propagation module which solves the Khokhlov-Zabolotskaya-Kuznetsov (KZK) equation and a heating module which solves Pennes' bioheat transfer (BHT) equation. The pressure, intensity, heating rate, temperature, and thermal dose fields are computed, plotted, the output is released to the MATLAB workspace for further user analysis or postprocessing.
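The heating module solves Pennes' bioheat transfer equation; a minimal explicit finite-difference step for the 1-D version of that equation looks like the Python sketch below. This is illustrative only (HIFU_Simulator is a MATLAB package and solves the axisymmetric problem coupled to the KZK field), and all parameter values are generic tissue-like numbers, not the package's defaults:

```python
import numpy as np

# Explicit 1-D finite-difference step for Pennes' bioheat equation:
#   rho*c * dT/dt = k * d2T/dx2 - w_b*c_b*(T - T_a) + Q
rho, c = 1000.0, 4180.0              # tissue density [kg/m^3], heat capacity [J/kg/K]
k = 0.5                              # thermal conductivity [W/m/K]
w_b, c_b, T_a = 5.0, 3600.0, 37.0    # perfusion [kg/m^3/s], blood heat capacity, arterial temp
nx, dx, dt = 200, 5e-4, 0.05         # grid and time step (within the explicit stability limit)

T = np.full(nx, 37.0)
Q = np.zeros(nx)
Q[90:110] = 2e6                      # focal heating rate [W/m^3] in the middle of the domain

for _ in range(1200):                # 60 s of heating
    lap = np.zeros(nx)
    lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
    T += dt / (rho * c) * (k * lap - w_b * c_b * (T - T_a) + Q)
    T[0] = T[-1] = 37.0              # body-temperature boundaries

print(f"peak temperature after 60 s: {T.max():.1f} degC")
```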
Comparison of requirements and capabilities of major multipurpose software packages.
Igo, Robert P; Schnell, Audrey H
2012-01-01
The aim of this chapter is to introduce the reader to commonly used software packages and illustrate their input requirements, analysis options, strengths, and limitations. We focus on packages that perform more than one function and include a program for quality control, linkage, and association analyses. Additional inclusion criteria were (1) programs that are free to academic users and (2) currently supported, maintained, and developed. Using those criteria, we chose to review three programs: Statistical Analysis for Genetic Epidemiology (S.A.G.E.), PLINK, and Merlin. We will describe the required input format and analysis options. We will not go into detail about every possible program in the packages, but we will give an overview of the packages' requirements and capabilities.
Software package for modeling spin-orbit motion in storage rings
NASA Astrophysics Data System (ADS)
Zyuzin, D. V.
2015-12-01
A software package providing a graphical user interface for computer experiments on the motion of charged particle beams in accelerators, as well as analysis of the obtained data, is presented. The software package was tested in the framework of the international project on electric dipole moment measurement JEDI (Jülich Electric Dipole moment Investigations). The specific features of particle spin motion imply the requirement to use a cyclic accelerator (storage ring) consisting of electrostatic elements, which makes it possible to preserve horizontal polarization for a long time. Computer experiments study the dynamics of 10^6-10^9 particles in a beam during 10^9 turns in an accelerator (about 10^12-10^15 integration steps for the equations of motion). For designing an optimal accelerator structure, a large number of computer experiments on polarized beam dynamics are required. The numerical core of the package is COSY Infinity, a program for modeling spin-orbit dynamics.
Painting a picture across the landscape with ModelMap
Brian Cooke; Elizabeth Freeman; Gretchen Moisen; Tracey Frescino
2017-01-01
Scientists and statisticians working for the Rocky Mountain Research Station have created a software package that simplifies and automates many of the processes needed for converting models into maps. This software package, called ModelMap, has helped a variety of specialists and land managers to quickly convert data into easily understood graphical images. The...
"FluSpec": A Simulated Experiment in Fluorescence Spectroscopy
ERIC Educational Resources Information Center
Bigger, Stephen W.; Bigger, Andrew S.; Ghiggino, Kenneth P.
2014-01-01
The "FluSpec" educational software package is a fully contained tutorial on the technique of fluorescence spectroscopy as well as a simulator on which experiments can be performed. The procedure for each of the experiments is also contained within the package along with example analyses of results that are obtained using the software.
Comparative Analyses of MIRT Models and Software (BMIRT and flexMIRT)
ERIC Educational Resources Information Center
Yavuz, Guler; Hambleton, Ronald K.
2017-01-01
Application of MIRT modeling procedures is dependent on the quality of parameter estimates provided by the estimation software and techniques used. This study investigated model parameter recovery of two popular MIRT packages, BMIRT and flexMIRT, under some common measurement conditions. These packages were specifically selected to investigate the…
Macintosh Computer Classroom and Laboratory Security: Preventing Unwanted Changes to the System.
ERIC Educational Resources Information Center
Senn, Gary J.; Smyth, Thomas J. C.
Because of the graphical interface and "openness" of the operating system, Macintosh computers are susceptible to undesirable changes by the user. This presentation discusses the advantages and disadvantages of software packages that offer protection for the Macintosh system. The two basic forms of software security packages include a…
Cooperative Work and Sustainable Scientific Software Practices in R
NASA Astrophysics Data System (ADS)
Weber, N.
2013-12-01
Most scientific software projects are dependent on the work of many diverse people, institutions, and organizations. Incentivizing these actors to cooperatively develop software that is both reliable and sustainable is complicated by the fact that the reward structures of these various actors differ greatly: research scientists want results from a software or model run in order to publish papers, produce new data, or test a hypothesis; software engineers and research centers want compilable, well documented code that is refactorable, reusable, and reproducible in future research scenarios. While much research has been done on incentives and motivations for participating in open source software projects or cyberinfrastructure development, little work has been done on what motivates or incentivizes developers to maintain scientific software projects beyond their original application. This poster will present early results of research into the incentives and motivation for cooperative scientific software development. In particular, this work focuses on motivations for the maintenance and repair of libraries on the software platform R. Our work here uses a sample of R packages that were created by research centers, or are specific to earth, environmental, and climate science applications. We first mined 'check' logs from the Comprehensive R Archive Network (CRAN) to determine the amount of time a package has existed, the number of versions it has gone through over this time, the number of releases, and finally the contact information for each official package 'maintainer'. We then sent a survey to each official maintainer, asking them questions about what role they played in developing the original package, and what their motivations were for sustaining the project over time. We will present early results from this mining and our survey of R maintainers.
NASA Technical Reports Server (NTRS)
Jumper, Judith K.
1994-01-01
The Laser Velocimeter Data Acquisition System (LVDAS) in the Langley 14- by 22-Foot Tunnel is controlled by a comprehensive software package. The software package was designed to control the data acquisition process during wind tunnel tests which employ a laser velocimeter measurement system. This report provides detailed explanations on how to configure and operate the LVDAS system to acquire laser velocimeter and static wind tunnel data.
Space-Shuttle Emulator Software
NASA Technical Reports Server (NTRS)
Arnold, Scott; Askew, Bill; Barry, Matthew R.; Leigh, Agnes; Mermelstein, Scott; Owens, James; Payne, Dan; Pemble, Jim; Sollinger, John; Thompson, Hiram;
2007-01-01
A package of software has been developed to execute a raw binary image of the space shuttle flight software for simulation of the computational effects of operation of space shuttle avionics. This software can be run on inexpensive computer workstations. Heretofore, it was necessary to use real flight computers to perform such tests and simulations. The package includes a program that emulates the space shuttle orbiter general-purpose computer [consisting of a central processing unit (CPU), input/output processor (IOP), master sequence controller, and bus-control elements]; an emulator of the orbiter display electronics unit and models of the associated cathode-ray tubes, keyboards, and switch controls; computational models of the data-bus network; computational models of the multiplexer-demultiplexer components; an emulation of the pulse-code modulation master unit; an emulation of the payload data interleaver; a model of the master timing unit; a model of the mass memory unit; and a software component that ensures compatibility of telemetry and command services between the simulated space shuttle avionics and a mission control center. The software package is portable to several host platforms.
Hubert: Software for efficient analysis of in-situ nuclear forward scattering experiments
NASA Astrophysics Data System (ADS)
Vrba, Vlastimil; Procházka, Vít; Smrčka, David; Miglierini, Marcel
2016-10-01
The combination of short data acquisition time and local investigation of a solid state through hyperfine parameters makes nuclear forward scattering (NFS) a unique experimental technique for the investigation of fast processes. However, the total number of acquired NFS time spectra may be very high; therefore, an efficient way of evaluating the data is needed. In this paper we report the development of the Hubert software package as a response to the rapidly developing field of in-situ NFS experiments. Hubert offers several useful features for data file processing and can significantly shorten the evaluation time by linking neighboring time spectra through their input and output parameter values.
Development of high performance scientific components for interoperability of computing packages
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gulabani, Teena Pratap
2008-01-01
Three major high performance quantum chemistry computational packages, NWChem, GAMESS and MPQC, have been developed by different research efforts following different design patterns. The goal is to achieve interoperability among these packages by overcoming the challenges caused by the different communication patterns and software design of each of these packages. Developing a chemistry algorithm is difficult and time consuming; integration of large quantum chemistry packages will allow resource sharing and thus avoid reinvention of the wheel. Creating connections between these incompatible packages is the major motivation of the proposed work. This interoperability is achieved by bringing the benefits of Component Based Software Engineering through a plug-and-play component framework called the Common Component Architecture (CCA). In this thesis, I present a strategy and process used for interfacing two widely used and important computational chemistry methodologies: Quantum Mechanics and Molecular Mechanics. To show the feasibility of the proposed approach, the Tuning and Analysis Utility (TAU) has been coupled with the NWChem code and its CCA components. Results show that the overhead is negligible when compared to the ease and potential of organizing and coping with large-scale software applications.
AlgoRun: a Docker-based packaging system for platform-agnostic implemented algorithms.
Hosny, Abdelrahman; Vera-Licona, Paola; Laubenbacher, Reinhard; Favre, Thibauld
2016-08-01
There is a growing need in bioinformatics for easy-to-use software implementations of algorithms that are usable across platforms. At the same time, reproducibility of computational results is critical and often a challenge due to source code changes over time and dependencies. The approach introduced in this paper addresses both of these needs with AlgoRun, a dedicated packaging system for implemented algorithms, using Docker technology. Implemented algorithms, packaged with AlgoRun, can be executed through a user-friendly interface directly from a web browser or via a standardized RESTful web API to allow easy integration into more complex workflows. The packaged algorithm includes the entire software execution environment, thereby eliminating the common problem of software dependencies and the irreproducibility of computations over time. AlgoRun-packaged algorithms can be published on http://algorun.org, a centralized searchable directory to find existing AlgoRun-packaged algorithms. AlgoRun is available at http://algorun.org, and the source code is available under the GPL license at https://github.com/algorun. Contact: laubenbacher@uchc.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Tanpitukpongse, Teerath P.; Mazurowski, Maciej A.; Ikhena, John; Petrella, Jeffrey R.
2016-01-01
Background and Purpose To assess prognostic efficacy of individual versus combined regional volumetrics in two commercially-available brain volumetric software packages for predicting conversion of patients with mild cognitive impairment to Alzheimer's disease. Materials and Methods Data was obtained through the Alzheimer's Disease Neuroimaging Initiative. 192 subjects (mean age 74.8 years, 39% female) diagnosed with mild cognitive impairment at baseline were studied. All had T1WI MRI sequences at baseline and 3-year clinical follow-up. Analysis was performed with NeuroQuant® and Neuroreader™. Receiver operating characteristic curves assessing the prognostic efficacy of each software package were generated using a univariable approach employing individual regional brain volumes, as well as two multivariable approaches (multiple regression and random forest), combining multiple volumes. Results On univariable analysis of 11 NeuroQuant® and 11 Neuroreader™ regional volumes, hippocampal volume had the highest area under the curve for both software packages (0.69 NeuroQuant®, 0.68 Neuroreader™), and was not significantly different (p > 0.05) between packages. Multivariable analysis did not increase the area under the curve for either package (0.63 logistic regression, 0.60 random forest NeuroQuant®; 0.65 logistic regression, 0.62 random forest Neuroreader™). Conclusion Of the multiple regional volume measures available in FDA-cleared brain volumetric software packages, hippocampal volume remains the best single predictor of conversion of mild cognitive impairment to Alzheimer's disease at 3-year follow-up. Combining volumetrics did not add additional prognostic efficacy. Therefore, future prognostic studies in MCI, combining such tools with demographic and other biomarker measures, are justified in using hippocampal volume as the only volumetric biomarker. PMID:28057634
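The univariable-versus-multivariable ROC comparison described above can be sketched on synthetic data with scikit-learn; the volumes, group sizes, and effect sizes below are hypothetical, not the study's data:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
n = 192
# Hypothetical z-scored regional volumes; converters tend to have smaller hippocampi.
converted = rng.random(n) < 0.4
hippocampus = rng.normal(0, 1, n) - 0.9 * converted
other_regions = rng.normal(0, 1, (n, 10)) - 0.15 * converted[:, None]
X = np.column_stack([hippocampus, other_regions])

# Univariable: hippocampal volume alone (smaller volume means higher risk, so negate).
auc_single = roc_auc_score(converted, -hippocampus)

# Multivariable: all 11 volumes combined via cross-validated logistic regression.
probs = cross_val_predict(LogisticRegression(max_iter=1000), X, converted,
                          cv=5, method="predict_proba")[:, 1]
auc_combined = roc_auc_score(converted, probs)
print(f"AUC hippocampus only: {auc_single:.2f}, AUC combined: {auc_combined:.2f}")
```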
Comparison of Perfusion CT Software to Predict the Final Infarct Volume After Thrombectomy.
Austein, Friederike; Riedel, Christian; Kerby, Tina; Meyne, Johannes; Binder, Andreas; Lindner, Thomas; Huhndorf, Monika; Wodarg, Fritz; Jansen, Olav
2016-09-01
Computed tomographic perfusion represents an interesting physiological imaging modality to select patients for reperfusion therapy in acute ischemic stroke. The purpose of our study was to determine the accuracy of different commercial perfusion CT software packages (Philips (A), Siemens (B), and RAPID (C)) to predict the final infarct volume (FIV) after mechanical thrombectomy. Single-institutional computed tomographic perfusion data from 147 mechanically recanalized acute ischemic stroke patients were postprocessed. Ischemic core and FIV were compared with respect to the thrombolysis in cerebral infarction (TICI) score and the time interval to reperfusion. FIV was measured at follow-up imaging between days 1 and 8 after stroke. In 118 successfully recanalized patients (TICI 2b/3), a moderately to strongly positive correlation was observed between ischemic core and FIV. The highest accuracy and best correlation are shown in early and fully recanalized patients (Pearson r for A=0.42, B=0.64, and C=0.83; P<0.001). Bland-Altman plots and boxplots demonstrate smaller ranges in package C than in A and B. Significant differences were found between the packages with respect to over- and underestimation of the ischemic core. Package A, compared with B and C, estimated more than twice as many patients with a malignant stroke profile (P<0.001). Package C best predicted hypoperfusion volume in nonsuccessfully recanalized patients. Our study demonstrates the best accuracy and closest approximation between the results of a fully automated software package (RAPID) and FIV, especially in early and fully recanalized patients. Furthermore, this software package overestimated the FIV to a significantly lower degree and estimated a malignant mismatch profile less often than the other software. © 2016 American Heart Association, Inc.
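The Bland-Altman statistics used to compare predicted core and final infarct volume reduce to the mean difference (bias) and its 95% limits of agreement; a minimal Python sketch on hypothetical volumes:

```python
import numpy as np

def bland_altman(a, b):
    """Bland-Altman bias and 95% limits of agreement between two measurement sets."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)
    return bias, (bias - half_width, bias + half_width)

# Hypothetical volumes in mL: software-predicted ischemic core vs final infarct volume.
predicted_core = [12, 35, 8, 60, 22, 15, 44, 5, 70, 18]
final_infarct = [15, 30, 10, 72, 25, 12, 50, 9, 65, 20]
bias, limits = bland_altman(predicted_core, final_infarct)
print(f"bias = {bias:.1f} mL, 95% limits of agreement = ({limits[0]:.1f}, {limits[1]:.1f}) mL")
```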
NASA Technical Reports Server (NTRS)
Thompson David S.; Soni, Bharat K.
2001-01-01
An integrated geometry/grid/simulation software package, ICEG2D, is being developed to automate computational fluid dynamics (CFD) simulations for single- and multi-element airfoils with ice accretions. The current version, ICEG2D (v2.0), was designed to automatically perform four primary functions: (1) generate a grid-ready surface definition based on the geometrical characteristics of the iced airfoil surface, (2) generate high-quality structured and generalized grids starting from a defined surface definition, (3) generate the input and restart files needed to run the structured grid CFD solver NPARC or the generalized grid CFD solver HYBFL2D, and (4) using the flow solutions, generate solution-adaptive grids. ICEG2D (v2.0) can be operated in either a batch mode using a script file or in an interactive mode by entering directives from a command line within a Unix shell. This report summarizes activities completed in the first two years of a three-year research and development program to address automation issues related to CFD simulations for airfoils with ice accretions. As well as describing the technology employed in the software, this document serves as a user's manual providing installation and operating instructions. An evaluation of the software is also presented.
ATACseqQC: a Bioconductor package for post-alignment quality assessment of ATAC-seq data.
Ou, Jianhong; Liu, Haibo; Yu, Jun; Kelliher, Michelle A; Castilla, Lucio H; Lawson, Nathan D; Zhu, Lihua Julie
2018-03-01
ATAC-seq (Assays for Transposase-Accessible Chromatin using sequencing) is a recently developed technique for genome-wide analysis of chromatin accessibility. Compared to earlier methods for assaying chromatin accessibility, ATAC-seq is faster and easier to perform, does not require cross-linking, has a higher signal-to-noise ratio, and can be performed on small cell numbers. However, to ensure a successful ATAC-seq experiment, step-by-step quality assurance processes, including both wet lab quality control and in silico quality assessment, are essential. While several tools have been developed or adopted for assessing read quality, identifying nucleosome occupancy and accessible regions from ATAC-seq data, none of the tools provide a comprehensive set of functionalities for preprocessing and quality assessment of aligned ATAC-seq datasets. We have developed a Bioconductor package, ATACseqQC, for easily generating various diagnostic plots to help researchers quickly assess the quality of their ATAC-seq data. In addition, this package contains functions to preprocess aligned ATAC-seq data for subsequent peak calling. Here we demonstrate the utilities of our package using 25 publicly available ATAC-seq datasets from four studies. We also provide guidelines on what the diagnostic plots should look like for an ideal ATAC-seq dataset. This software package has been used successfully for preprocessing and assessing several in-house and public ATAC-seq datasets. Diagnostic plots generated by this package will facilitate the quality assessment of ATAC-seq data, and help researchers to evaluate their own ATAC-seq experiments as well as select high-quality ATAC-seq datasets from public repositories such as GEO to avoid generating hypotheses or drawing conclusions from low-quality ATAC-seq experiments. The software, source code, and documentation are freely available as a Bioconductor package at https://bioconductor.org/packages/release/bioc/html/ATACseqQC.html.
Proposed software system for atomic-structure calculation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fischer, C.F.
1981-07-01
Atomic structure calculations are understood well enough that, at a routine level, an atomic structure software package can be developed. At the Atomic Physics Conference in Riga in 1978, L.V. Chernysheva and M.Y. Amusia of Leningrad University presented a paper on software for atomic calculations. Their system, called ATOM, is based on the Hartree-Fock approximation, and correlation is included within the framework of RPAE. Energy level calculations, transition probabilities, photo-ionization cross-sections, and electron scattering cross-sections are some of the physical properties that can be evaluated by their system. The MCHF method, together with CI techniques and the Breit-Pauli approximation, also provides a sound theoretical basis for atomic structure calculations.
User's Guide for MapIMG 2: Map Image Re-projection Software Package
Finn, Michael P.; Trent, Jason R.; Buehler, Robert A.
2006-01-01
BACKGROUND Scientists routinely accomplish small-scale geospatial modeling in the raster domain, using high-resolution datasets for large parts of continents and low-resolution to high-resolution datasets for the entire globe. Direct implementation of point-to-point transformation with appropriate functions yields the variety of projections available in commercial software packages, but implementation with data other than points requires specific adaptation of the transformation equations or prior preparation of the data to allow the transformation to succeed. It seems that some of these packages use the U.S. Geological Survey's (USGS) General Cartographic Transformation Package (GCTP) or similar point transformations without adaptation to the specific characteristics of raster data (Usery and others, 2003a). Usery and others (2003b) compiled and tabulated the accuracy of categorical areas in projected raster datasets of global extent. Based on the shortcomings identified in these studies, geographers and applications programmers at the USGS expanded and evolved a USGS software package, MapIMG, for raster map projection transformation (Finn and Trent, 2004). Daniel R. Steinwand of Science Applications International Corporation, National Center for Earth Resources Observation and Science, originally developed MapIMG for the USGS, basing it on GCTP. Through previous and continuing efforts at the USGS' National Geospatial Technical Operations Center, this program has been transformed from an application based on command line input into a software package based on a graphical user interface for Windows, Linux, and other UNIX machines.
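The point-to-point transformation that GCTP-style code performs is readily available today, for example via pyproj; what MapIMG adds is the raster-specific handling (resampling every cell and dealing with categorical data). A small sketch, assuming pyproj is installed and using an arbitrary CONUS point and the Albers equal-area CRS EPSG:5070:

```python
from pyproj import Transformer

# Point-to-point projection transform (the GCTP-style operation).  Reprojecting a
# raster additionally requires resampling every cell, which is what MapIMG automates.
to_albers = Transformer.from_crs("EPSG:4326", "EPSG:5070", always_xy=True)

lon, lat = -103.0, 44.5
x, y = to_albers.transform(lon, lat)
print(f"({lon}, {lat}) -> ({x:.0f} m, {y:.0f} m)")
```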
Zheng, Guangyong; Xu, Yaochen; Zhang, Xiujun; Liu, Zhi-Ping; Wang, Zhuo; Chen, Luonan; Zhu, Xin-Guang
2016-12-23
A gene regulatory network (GRN) represents interactions of genes inside a cell or tissue, in which vertexes and edges stand for genes and their regulatory interactions, respectively. Reconstruction of gene regulatory networks, in particular genome-scale networks, is essential for comparative exploration of different species and mechanistic investigation of biological processes. Currently, most network inference methods are computationally intensive; they are usually effective for small-scale tasks (e.g., networks with a few hundred genes) but have difficulty constructing GRNs at genome scale. Here, we present a software package for gene regulatory network reconstruction at a genomic level, in which gene interaction is measured by the conditional mutual information measurement using a parallel computing framework (so the package is named CMIP). The package is a greatly improved implementation of our previous PCA-CMI algorithm. In CMIP, we provide not only an automatic threshold determination method but also an effective parallel computing framework for network inference. Performance tests on benchmark datasets show that the accuracy of CMIP is comparable to most current network inference methods. Moreover, running tests on synthetic datasets demonstrate that CMIP can handle large datasets, especially genome-wide datasets, within an acceptable time period. In addition, successful application on a real genomic dataset confirms the practical applicability of the package. This new software package provides a powerful tool for genomic network reconstruction to the biological community. The software can be accessed at http://www.picb.ac.cn/CMIP/.
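A minimal sketch of the conditional-mutual-information idea, under a Gaussian assumption and a single conditioning gene (this is the textbook partial-correlation form, not the CMIP implementation, which handles larger conditioning sets and runs in parallel):

```python
import numpy as np

def gaussian_cmi(x, y, z):
    """Conditional mutual information I(X;Y|Z) under a Gaussian assumption,
    computed from the partial correlation of x and y given one conditioner z."""
    r_xy = np.corrcoef(x, y)[0, 1]
    r_xz = np.corrcoef(x, z)[0, 1]
    r_yz = np.corrcoef(y, z)[0, 1]
    partial = (r_xy - r_xz * r_yz) / np.sqrt((1 - r_xz**2) * (1 - r_yz**2))
    return -0.5 * np.log(1 - partial**2)

rng = np.random.default_rng(0)
z = rng.normal(size=2000)                     # a shared regulator
x = z + rng.normal(scale=0.5, size=2000)      # two targets driven only by z
y = z + rng.normal(scale=0.5, size=2000)

# Marginal correlation between x and y is strong, but the conditional MI given z
# is near zero, which is why conditioning helps prune indirect regulatory edges.
print(gaussian_cmi(x, y, z))
```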
A user-friendly software package to ease the use of VIC hydrologic model for practitioners
NASA Astrophysics Data System (ADS)
Wi, S.; Ray, P.; Brown, C.
2016-12-01
The VIC (Variable Infiltration Capacity) hydrologic and river routing model simulates the water and energy fluxes that occur near the land surface and provides users with useful information regarding the quantity and timing of available water at points of interest within the basin. However, despite its popularity (as evidenced by numerous applications in the literature), its wider adoption is hampered by the considerable effort required to prepare model inputs, e.g., input files storing spatial information related to watershed topography, soil properties, and land cover. This study presents a user-friendly software package (named VIC Setup Toolkit) developed within the MATLAB (matrix laboratory) framework and accessible through an intuitive graphical user interface. The VIC Setup Toolkit enables users to navigate the model building process confidently through prompts and automation, with the intention of promoting the use of the model for both practical and academic purposes. The automated processes include watershed delineation, climate and geographical input set-up, model parameter calibration, graph generation, and output evaluation. We demonstrate the package's usefulness in case studies of the American River, Oklahoma River, Feather River, and Zambezi River basins.
NASA Astrophysics Data System (ADS)
Martsynkovskyy, V. A.; Deineka, A.; Kovalenko, V.
2017-08-01
The article examines the forced axial vibrations of a rotor with an automatic unloading device in an oxidizer pump. A design feature is the use, in the automatic unloading system, of slotted throttles with mutually inverse throttling. Their conductivity is determined by a numerical experiment in the ANSYS CFX software package.
User-Adaptable Microcomputer Graphics Software for Life Science Instruction. Final Project Report.
ERIC Educational Resources Information Center
Spain, James D.
The objective of the SUMIT project was to develop, evaluate, and disseminate 20 course modules (microcomputer programs) for instruction in general biology and ecology. To encourage broad utilization, the programs were designed for the Apple II microcomputer and written in Applesoft BASIC with a user-adaptable format. Each package focused on a key…
Initial Ada components evaluation
NASA Technical Reports Server (NTRS)
Moebes, Travis
1989-01-01
SAIC has the responsibility for independent test and validation of the SSE. They have been using a mathematical functions library package implemented in Ada to test the SSE IV and V process. The library package consists of elementary mathematical functions and is both machine and accuracy independent. The SSE Ada components evaluation includes code complexity metrics based on Halstead's software science metrics and McCabe's measure of cyclomatic complexity. Halstead's metrics are based on the number of operators and operands in a logical unit of code and are compiled from the number of distinct operators, distinct operands, and the total number of occurrences of operators and operands. These metrics give an indication of the physical size of a program in terms of operators and operands and are used diagnostically to point to potential problems. McCabe's Cyclomatic Complexity Metrics (CCM) are compiled from flow charts transformed to equivalent directed graphs. The CCM is a measure of the total number of linearly independent paths through the code's control structure. These metrics were computed for the Ada mathematical functions library using Software Automated Verification and Validation (SAVVAS), the SSE IV and V tool. A table with selected results is shown, indicating that most of these routines are of good quality. Thresholds for the Halstead measures indicate poor quality if the length metric exceeds 260 or the difficulty exceeds 190. The McCabe CCM indicated a high quality of software products.
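A small sketch of how the quoted measures are assembled from the counts the abstract describes; the operator/operand counts below are hypothetical (the SAVVAS tool computes them from actual Ada source):

```python
import math

def halstead(n1, n2, N1, N2):
    """Halstead software-science measures.
    n1/n2: distinct operators/operands; N1/N2: total occurrences of each."""
    length = N1 + N2
    vocabulary = n1 + n2
    volume = length * math.log2(vocabulary)
    difficulty = (n1 / 2.0) * (N2 / n2)
    effort = difficulty * volume
    return {"length": length, "volume": volume,
            "difficulty": difficulty, "effort": effort}

def cyclomatic(edges, nodes, components=1):
    """McCabe cyclomatic complexity of a control-flow graph: E - N + 2P."""
    return edges - nodes + 2 * components

m = halstead(n1=18, n2=30, N1=110, N2=95)         # hypothetical counts for one routine
print(m, "flagged:", m["length"] > 260 or m["difficulty"] > 190)
print("CCM:", cyclomatic(edges=14, nodes=12))     # 4 linearly independent paths
```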
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coltrin, M.E.; Kee, R.J.; Rupley, F.M.
1991-07-01
Heterogeneous reaction at the interface between a solid surface and adjacent gas is central to many chemical processes. Our development of the software package SURFACE CHEMKIN was motivated by our need to understand the complex surface chemistry in chemical vapor deposition systems involving silicon, silicon nitride, and gallium arsenide. However, we have developed the approach and implemented the software in a general setting. Thus, we expect it will find use in such diverse applications as chemical vapor deposition, chemical etching, combustion of solids, and catalytic processes, and for a wide range of chemical systems. We believe that it provides a powerful capability to help model, understand, and optimize important industrial and research chemical processes. The SURFACE CHEMKIN software is designed to work in conjunction with the CHEMKIN-2 software, which handles the chemical kinetics in the gas phase. It may also be used in conjunction with the Transport Property Package, which provides information about molecular diffusion. Thus, these three packages provide a foundation on which a user can build applications software to analyze gas-phase and heterogeneous chemistry in flowing systems. These packages should not be considered "programs" in the ordinary sense. That is, they are not designed to accept input, solve a particular problem, and report the answer. Instead, they are software tools intended to help a user work efficiently with large systems of chemical reactions and develop Fortran representations of systems of equations that define a particular problem. It is up to the user to solve the problem and interpret the answer. 11 refs., 15 figs., 5 tabs.
Comparison of four software packages for CT lung volumetry in healthy individuals.
Nemec, Stefan F; Molinari, Francesco; Dufresne, Valerie; Gosset, Natacha; Silva, Mario; Bankier, Alexander A
2015-06-01
To compare CT lung volumetry (CTLV) measurements provided by different software packages, and to provide normative data for lung densitometric measurements in healthy individuals. This retrospective study included 51 chest CTs of 17 volunteers (eight men and nine women; mean age, 30 ± 6 years), who underwent spirometrically monitored CT at total lung capacity (TLC), functional residual capacity (FRC), and mean inspiratory capacity (MIC). Volumetric differences assessed by four commercial software packages were compared with analysis of variance (ANOVA) for repeated measurements and benchmarked against the threshold for acceptable variability between spirometric measurements. Mean lung density (MLD) and parenchymal heterogeneity (MLD-SD) were also compared with ANOVA. Volumetric differences ranged from 12 to 213 ml (0.20 % to 6.45 %). Although 16/18 comparisons (among four software packages at TLC, MIC, and FRC) were statistically significant (P < 0.001 to P = 0.004), only 3/18 comparisons, one at MIC and two at FRC, exceeded the spirometry variability threshold. MLD and MLD-SD significantly increased with decreasing volumes, and were significantly larger in lower compared to upper lobes (P < 0.001). Lung volumetric differences provided by different software packages are small. These differences should not be interpreted based on statistical significance alone, but together with absolute volumetric differences. • Volumetric differences, assessed by different CTLV software, are small but statistically significant. • Volumetric differences are smaller at TLC than at MIC and FRC. • Volumetric differences rarely exceed spirometric repeatability thresholds at MIC and FRC. • Differences between CTLV measurements should be interpreted based on comparison of absolute differences. • MLD increases with decreasing volumes, and is larger in lower compared to upper lobes.
Flight simulation software at NASA Dryden Flight Research Center
NASA Technical Reports Server (NTRS)
Norlin, Ken A.
1995-01-01
The NASA Dryden Flight Research Center has developed a versatile simulation software package that is applicable to a broad range of fixed-wing aircraft. This package has evolved in support of a variety of flight research programs. The structure is designed to be flexible enough for use in batch-mode, real-time pilot-in-the-loop, and flight hardware-in-the-loop simulation. Current simulations operate on UNIX-based platforms and are coded with a FORTRAN shell and C support routines. This paper discusses the features of the simulation software design and some basic model development techniques. The key capabilities that have been included in the simulation are described. The NASA Dryden simulation software is in use at other NASA centers, within industry, and at several universities. The straightforward but flexible design of this well-validated package makes it especially useful in an engineering environment.
The Ettention software package.
Dahmen, Tim; Marsalek, Lukas; Marniok, Nico; Turoňová, Beata; Bogachev, Sviatoslav; Trampert, Patrick; Nickels, Stefan; Slusallek, Philipp
2016-02-01
We present a novel software package for the problem of "reconstruction from projections" in electron microscopy. The Ettention framework consists of a set of modular building blocks for tomographic reconstruction algorithms. The well-known block-iterative reconstruction method based on the Kaczmarz algorithm is implemented using these building blocks, including adaptations specific to electron tomography. Ettention simultaneously features (1) a modular, object-oriented software design, (2) optimized access to high-performance computing (HPC) platforms such as graphic processing units (GPU) or many-core architectures like Xeon Phi, and (3) accessibility to microscopy end-users via integration in the IMOD package and the eTomo user interface. We also provide developers with a clean and well-structured application programming interface (API) that allows for extending the software easily and thus makes it an ideal platform for algorithmic research while hiding most of the technical details of high-performance computing. Copyright © 2015 Elsevier B.V. All rights reserved.
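A minimal sketch of the Kaczmarz iteration at the core of the block-iterative method (generic dense-matrix Python, not the Ettention building blocks, which operate on projection data and run on GPUs):

```python
import numpy as np

def kaczmarz(A, b, sweeps=50, relax=1.0):
    """Basic Kaczmarz iteration for A x = b: project the current estimate onto the
    hyperplane defined by one row at a time.  Block-iterative variants process
    groups of rays per update and add regularization."""
    A = np.asarray(A, float)
    b = np.asarray(b, float)
    x = np.zeros(A.shape[1])
    row_norms = (A ** 2).sum(axis=1)
    for _ in range(sweeps):
        for i in range(A.shape[0]):
            if row_norms[i] == 0:
                continue
            residual = b[i] - A[i] @ x
            x += relax * residual / row_norms[i] * A[i]
    return x

A = np.array([[1.0, 2.0], [3.0, 1.0], [1.0, -1.0]])
x_true = np.array([2.0, -1.0])
print(kaczmarz(A, A @ x_true))   # converges toward [2, -1]
```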
NASA Astrophysics Data System (ADS)
Pandey, Palak; Kunte, Pravin D.
2016-10-01
This study presents an easy, modular, user-friendly, and flexible software package for processing of Landsat 7 ETM and Landsat 8 OLI-TIRS data for estimating suspended particulate matter concentrations in coastal waters. This package includes 1) an algorithm developed using the freely downloadable SCILAB package, 2) ERDAS models for iterative processing of Landsat images, and 3) an ArcMAP tool for plotting and map making. Utilizing the SCILAB package, a module is written for geometric corrections, radiometric corrections, and obtaining normalized water-leaving reflectance by incorporating Landsat 8 OLI-TIRS and Landsat 7 ETM+ data. Using ERDAS models, a sequence of modules is developed for iterative processing of Landsat images and estimating suspended particulate matter concentrations. Processed images are used for preparing suspended sediment concentration maps. The applicability of this software package is demonstrated by estimating and plotting seasonal suspended sediment concentration maps off the Bengal delta. The software is flexible enough to accommodate other remotely sensed data, such as Ocean Color Monitor (OCM), Indian Remote Sensing (IRS), and MODIS data, by replacing a few parameters in the algorithm, for estimating suspended sediment concentration in coastal waters.
Detection And Mapping (DAM) package. Volume 4A: Software System Manual, part 1
NASA Technical Reports Server (NTRS)
Schlosser, E. H.
1980-01-01
The package is an integrated set of manual procedures, computer programs, and graphic devices designed for efficient production of precisely registered and formatted maps from digital LANDSAT multispectral scanner (MSS) data. The software can be readily implemented on any Univac 1100 series computer with standard peripheral equipment. This version of the software includes predefined spectral limits for use in classifying and mapping surface water for LANDSAT-1, LANDSAT-2, and LANDSAT-3. Tape formats supported include X, AM, and PM.
Using R to implement spatial analysis in open source environment
NASA Astrophysics Data System (ADS)
Shao, Yixi; Chen, Dong; Zhao, Bo
2007-06-01
R is an open source (GPL) language and environment for spatial analysis, statistical computing, and graphics that provides a wide variety of statistical and graphical techniques and is highly extensible, and it plays an important role in spatial analysis within the Open Source environment. Implementing spatial analysis in the Open Source environment, which we call Open Source geocomputation, means using the R data analysis language integrated with GRASS GIS and MySQL or PostgreSQL. This paper explains the architecture of the Open Source GIS environment and emphasizes the role R plays in spatial analysis. Furthermore, an illustration of the functions of R is given through the project of constructing CZPGIS (Cheng Zhou Population GIS), supported by the Changzhou Government, China. In this project we use R to implement geostatistics in the Open Source GIS environment, evaluating the spatial correlation of land price and estimating it by kriging interpolation. We also use R integrated with MapServer and PHP to show how R and other Open Source software cooperate in a WebGIS environment, which demonstrates the advantages of using R for spatial analysis in an Open Source GIS environment. Finally, we point out that the packages for spatial analysis in R are still scattered and that limited memory remains a bottleneck when a large number of clients connect at the same time. Further work is therefore to organize the extensive packages or to design normative packages, and to make R cooperate better with commercial software such as ArcIMS. We also look forward to developing packages for land price evaluation.
A Microcomputer-Based Software Package for Eye-Monitoring Research. Technical Report No. 434.
ERIC Educational Resources Information Center
McConkie, George W.; And Others
A software package is described that collects and reduces eye behavior data (eye position and pupil size) using an IBM-PC compatible computer. Written in C language for speed and portability, it includes several features: (1) data can be simultaneously collected from other sources (such as electroencephalography and electromyography); (2)…
Overview of Current Activities in Combustion Instability
2015-10-02
…and avoid liquid rocket engine combustion stability problems. Approach: 1) Develop a SOA combustion stability software package called Stable… Phase II will invest in Multifidelity Tools and Methodologies – CSTD will develop a SOA combustion stability software package called Stable Combustion…
Sigma 2 Graphic Display Software Program Description
NASA Technical Reports Server (NTRS)
Johnson, B. T.
1973-01-01
A general purpose, user oriented graphic support package was implemented. A comprehensive description of the two software components comprising this package is given: Display Librarian and Display Controller. These programs have been implemented in FORTRAN on the XDS Sigma 2 Computer Facility. This facility consists of an XDS Sigma 2 general purpose computer coupled to a Computek Display Terminal.
How Conoco uses GIS technology to map geology, geography through time
DOE Office of Scientific and Technical Information (OSTI.GOV)
Foley, D.C.; Ghazi, T.Y.
1995-05-08
Conoco Inc.'s Advanced Exploration Organization (AEO) is in the business of studying foreign sedimentary basins from a regional perspective to evaluate their potential for petroleum exploration. Recently the company decided to focus some of the AEO's resources on developing a global ranking system for those areas of the world where hydrocarbons might occur. AEO obtained software from the University of Texas, Arlington that rotates continents or portions of continents through time. Using the software, company geoscientists have created a series of maps, known as a PaleoAtlas, that depicts the geography and selected geological features for different periods in Phanerozoic time. In addition, the AEO has developed a software package based on ARC/INFO (ESRI Inc., Redlands, Calif.), a commercial GIS platform, to manage, integrate, and analyze those time-slice maps. Entitled PaleoAtlas Geographic Evaluation system (Pages), this software also sequences portions of the maps in a montage effect that geoscientists can use to study the geological evolution of petroleum source rocks. The paper describes the AEO project and its software.
Textbook Software versus Professional Software: Which Is Better for Instructional Purposes?
ERIC Educational Resources Information Center
Snell, Meggan; Yatsenko, Olga
2002-01-01
Compares textbook software with professional packages such as Peachtree for teaching accounting, in terms of cost, availability, ease of teaching and learning, and applicability. Makes suggestions for choosing accounting software. (SK)
User-friendly technology to help family carers cope.
Chambers, Mary; Connor, Samantha L
2002-12-01
Increases in the older adult population are occurring simultaneously with a growth in new technology. Modern technology presents an opportunity to enhance the quality of life and independence of older people and their family carers through communication and access to health care information. To evaluate the usability of a multimedia software application designed to provide family carers of the elderly or disabled with information, advice and psychological support to increase their coping capacity. The interactive application consisted of an information-based package that provided carers with advice on the promotion of psychological health, including relaxation and other coping strategies. The software application also included a carer self-assessment instrument, designed to provide both family and professional carers with information to assess how family carers were coping with their care-giving role. Usability evaluation was carried out in two stages. In the first stage (verification), user trials and an evaluation questionnaire were used to refine and develop the content and usability of the multimedia software application. In the second stage (demonstration), evaluation questionnaires were used to appraise the usability of the modified software application. The findings showed that the majority of users found the software to be usable and informative. Some areas were highlighted for improvement in the navigation of the software. The authors conclude that with further refinement, the software application has the potential to offer information and support to those who are caring for the elderly and disabled at home.
Object-oriented design of medical imaging software.
Ligier, Y; Ratib, O; Logean, M; Girard, C; Perrier, R; Scherrer, J R
1994-01-01
A special software package for interactive display and manipulation of medical images was developed at the University Hospital of Geneva, as part of a hospital wide Picture Archiving and Communication System (PACS). This software package, called Osiris, was especially designed to be easily usable and adaptable to the needs of noncomputer-oriented physicians. The Osiris software has been developed to allow the visualization of medical images obtained from any imaging modality. It provides generic manipulation tools, processing tools, and analysis tools more specific to clinical applications. This software, based on an object-oriented paradigm, is portable and extensible. Osiris is available on two different operating systems: the Unix X-11/OSF-Motif based workstations, and the Macintosh family.
StreamThermal: A software package for calculating thermal metrics from stream temperature data
Tsang, Yin-Phan; Infante, Dana M.; Stewart, Jana S.; Wang, Lizhu; Tingly, Ralph; Thornbrugh, Darren; Cooper, Arthur; Wesley, Daniel
2016-01-01
Improved quality and better availability of continuous stream temperature data allow natural resource managers, particularly in fisheries, to understand associations between different characteristics of stream thermal regimes and stream fishes. However, there is no convenient tool to efficiently characterize multiple metrics reflecting stream thermal regimes from the increasing amount of data. This article describes a software program packaged as a library in R to facilitate this process. With this freely available package, users will be able to quickly summarize metrics that describe five categories of stream thermal regimes: magnitude, variability, frequency, timing, and rate of change. The installation and usage instructions for this package, the definitions of the calculated thermal metrics, and the output format of the package are described, along with an application showing the utility of multiple metrics. We believe this package can be widely utilized by interested stakeholders and greatly assist further studies in fisheries.
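StreamThermal itself is an R library; the kind of magnitude and variability metrics it reports can nevertheless be illustrated with a short Python/pandas sketch over hypothetical hourly temperature records.

import numpy as np
import pandas as pd

# Hypothetical hourly stream temperatures for one summer.
idx = pd.date_range("2015-06-01", "2015-08-31 23:00", freq="H")
temp = pd.Series(18 + 5 * np.sin(np.linspace(0, 20 * np.pi, len(idx))), index=idx)

daily = temp.resample("D").agg(["mean", "max", "min"])   # magnitude metrics
daily["range"] = daily["max"] - daily["min"]             # daily variability
monthly_mean = temp.resample("M").mean()
warmest_7day = daily["mean"].rolling(7).mean().max()     # warmest 7-day mean
print(monthly_mean.round(2))
print(round(warmest_7day, 2))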
Appel, R D; Palagi, P M; Walther, D; Vargas, J R; Sanchez, J C; Ravier, F; Pasquali, C; Hochstrasser, D F
1997-12-01
Although two-dimensional electrophoresis (2-DE) computer analysis software packages have existed ever since 2-DE technology was developed, it is only now that the hardware and software technology allows large-scale studies to be performed on low-cost personal computers or workstations, and that setting up a 2-DE computer analysis system in a small laboratory is no longer considered a luxury. After a first attempt in the seventies and early eighties to develop 2-DE analysis software systems on hardware that had poor or even no graphical capabilities, followed in the late eighties by a wave of innovative software developments that were possible thanks to new graphical interface standards such as XWindows, a third generation of 2-DE analysis software packages has now come to maturity. It can be run on a variety of low-cost, general-purpose personal computers, thus making the purchase of a 2-DE analysis system easily attainable for even the smallest laboratory that is involved in proteome research. Melanie II 2-D PAGE, developed at the University Hospital of Geneva, is such a third-generation software system for 2-DE analysis. Based on unique image processing algorithms, this user-friendly object-oriented software package runs on multiple platforms, including Unix, MS-Windows 95 and NT, and Power Macintosh. It provides efficient spot detection and quantitation, state-of-the-art image comparison, statistical data analysis facilities, and is Internet-ready. Linked to proteome databases such as those available on the World Wide Web, it represents a valuable tool for the "Virtual Lab" of the post-genome area.
''Do-it-yourself'' software program calculates boiler efficiency
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1984-03-01
An easy-to-use software package is described which runs on the IBM Personal Computer. The package calculates boiler efficiency, an important parameter of operating costs and equipment wellbeing. The program stores inputs and calculated results for 20 sets of boiler operating data, called cases. Cases can be displayed and modified on the CRT screen through multiple display pages or copied to a printer. All intermediate calculations are performed by this package. They include: steam enthalpy; water enthalpy; air humidity; gas, oil, coal, and wood heat capacity; and radiation losses.
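The kind of calculation such a package performs can be outlined with the simple input-output method; the figures below are hypothetical, and the real program additionally computes enthalpies, humidity, fuel heat capacities, and radiation losses.

# Input-output boiler efficiency: useful heat in steam divided by heat in fuel.
steam_flow = 100_000                   # lb/h of steam (hypothetical)
h_steam, h_feedwater = 1190.0, 180.0   # enthalpies, Btu/lb (hypothetical)
fuel_flow = 6_500                      # lb/h of fuel (hypothetical)
fuel_hhv = 18_500                      # higher heating value, Btu/lb (hypothetical)

heat_out = steam_flow * (h_steam - h_feedwater)
heat_in = fuel_flow * fuel_hhv
print(f"boiler efficiency ~ {100 * heat_out / heat_in:.1f} %")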
Rothenbacher, Thorsten; Schwack, Wolfgang
2009-01-01
Plastic packaging materials may release compounds into packed foodstuffs. To identify potential migrants of toxicological concern, resins and multilayer foils (mainly polyethylene) intended for the production of food contact materials were extracted and analyzed by GC/mass spectrometry. To identify even compounds present at low concentrations, AMDIS software was used, and data evaluation was safeguarded by the Kovats retention index (RI) system. In this way, 46 compounds were identified as possible migrants. The expert structure-activity relationship software DEREK for Windows was utilized to evaluate all identified substances in terms of carcinogenicity, genotoxicity, thyroid toxicity, and miscellaneous endpoints for humans. Additionally, a literature search for these compounds was performed with SciFinder, but relevant data were missing for 28 substances. Seven compounds with adverse toxicological effects were identified. In addition, the RIs of 24 commercial additive standards, measured with a GC capillary column of intermediate polarity, are given.
Network Meta-Analysis Using R: A Review of Currently Available Automated Packages
Neupane, Binod; Richer, Danielle; Bonner, Ashley Joel; Kibret, Taddele; Beyene, Joseph
2014-01-01
Network meta-analysis (NMA) – a statistical technique that allows comparison of multiple treatments in the same meta-analysis simultaneously – has become increasingly popular in the medical literature in recent years. The statistical methodology underpinning this technique and software tools for implementing the methods are evolving. Both commercial and freely available statistical software packages have been developed to facilitate the statistical computations using NMA with varying degrees of functionality and ease of use. This paper aims to introduce the reader to three R packages, namely, gemtc, pcnetmeta, and netmeta, which are freely available software tools implemented in R. Each automates the process of performing NMA so that users can perform the analysis with minimal computational effort. We present, compare and contrast the availability and functionality of different important features of NMA in these three packages so that clinical investigators and researchers can determine which R packages to implement depending on their analysis needs. Four summary tables detailing (i) data input and network plotting, (ii) modeling options, (iii) assumption checking and diagnostic testing, and (iv) inference and reporting tools, are provided, along with an analysis of a previously published dataset to illustrate the outputs available from each package. We demonstrate that each of the three packages provides a useful set of tools, and combined provide users with nearly all functionality that might be desired when conducting a NMA. PMID:25541687
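The three reviewed packages (gemtc, pcnetmeta, netmeta) are R libraries; the core idea of borrowing indirect evidence through a common comparator can be sketched independently of them, here as Bucher's adjusted indirect comparison in Python with hypothetical effect estimates.

import math

# Direct estimates (log odds ratios) and standard errors, hypothetical:
d_AB, se_AB = -0.40, 0.15   # treatment A vs common comparator B
d_CB, se_CB = -0.10, 0.20   # treatment C vs common comparator B

d_AC = d_AB - d_CB                      # indirect A vs C estimate
se_AC = math.sqrt(se_AB**2 + se_CB**2)  # variances of independent estimates add
lo, hi = d_AC - 1.96 * se_AC, d_AC + 1.96 * se_AC
print(f"indirect log OR (A vs C): {d_AC:.2f}, 95% CI {lo:.2f} to {hi:.2f}")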
NASA Astrophysics Data System (ADS)
Müller, Peter; Krause, Marita; Beck, Rainer; Schmidt, Philip
2017-10-01
Context. The venerable NOD2 data reduction software package for single-dish radio continuum observations, which was developed for use at the 100-m Effelsberg radio telescope, has been successfully applied over many decades. Modern computing facilities, however, call for a new design. Aims: We aim to develop an interactive software tool with a graphical user interface for the reduction of single-dish radio continuum maps. We make a special effort to reduce the distortions along the scanning direction (scanning effects) by combining maps scanned in orthogonal directions or dual- or multiple-horn observations that need to be processed in a restoration procedure. The package should also process polarisation data and offer the possibility to include special tasks written by the individual user. Methods: Based on the ideas of the NOD2 package we developed NOD3, which includes all necessary tasks from the raw maps to the final maps in total intensity and linear polarisation. Furthermore, plot routines and several methods for map analysis are available. The NOD3 package is written in Python, which allows the extension of the package via additional tasks. The required data format for the input maps is FITS. Results: The NOD3 package is a sophisticated tool to process and analyse maps from single-dish observations that are affected by scanning effects from clouds, receiver instabilities, or radio-frequency interference. The "basket-weaving" tool combines orthogonally scanned maps into a final map that is almost free of scanning effects. The new restoration tool for dual-beam observations reduces the noise by a factor of about two compared to the NOD2 version. Combining single-dish with interferometer data in the map plane ensures the full recovery of the total flux density. Conclusions: This software package is available under the open source license GPL for free use at other single-dish radio telescopes of the astronomical community. The NOD3 package is designed to be extendable to multi-channel data represented by data cubes in Stokes I, Q, and U.
Mladinich, C.
2010-01-01
Human disturbance is a leading ecosystem stressor. Human-induced modifications include transportation networks, areal disturbances due to resource extraction, and recreation activities. High-resolution imagery and object-oriented classification rather than pixel-based techniques have successfully identified roads, buildings, and other anthropogenic features. Three commercial, automated feature-extraction software packages (Visual Learning Systems' Feature Analyst, ENVI Feature Extraction, and Definiens Developer) were evaluated by comparing their ability to effectively detect the disturbed surface patterns from motorized vehicle traffic. Each package achieved overall accuracies in the 70% range, demonstrating the potential to map the surface patterns. The Definiens classification was more consistent and statistically valid. Copyright © 2010 by Bellwether Publishing, Ltd. All rights reserved.
visCOS: An R-package to evaluate model performance of hydrological models
NASA Astrophysics Data System (ADS)
Klotz, Daniel; Herrnegger, Mathew; Wesemann, Johannes; Schulz, Karsten
2016-04-01
The evaluation of model performance is a central part of (hydrological) modelling. Much attention has been given to the development of evaluation criteria and diagnostic frameworks. (Klemeš, 1986; Gupta et al., 2008; among many others). Nevertheless, many applications exist for which objective functions do not yet provide satisfying summaries. Thus, the necessity to visualize results arises in order to explore a wider range of model capacities, be it strengths or deficiencies. Visualizations are usually devised for specific projects and these efforts are often not distributed to a broader community (e.g. via open source software packages). Hence, the opportunity to explicitly discuss a state-of-the-art presentation technique is often missed. We therefore present a comprehensive R-package for evaluating model performance by visualizing and exploring different aspects of hydrological time-series. The presented package comprises a set of useful plots and visualization methods, which complement existing packages, such as hydroGOF (Zambrano-Bigiarini et al., 2012). It is derived from practical applications of the hydrological models COSERO and COSEROreg (Kling et al., 2014). visCOS, providing an interface in R, represents an easy-to-use software package for visualizing and assessing model performance and can be implemented in the process of model calibration or model development. The package provides functions to load hydrological data into R, clean the data, process, visualize, explore and finally save the results in a consistent way. Together with an interactive zoom function of the time series, an online calculation of the objective functions for variable time-windows is included. Common hydrological objective functions, such as the Nash-Sutcliffe Efficiency and the Kling-Gupta Efficiency, can also be evaluated and visualized in different ways for defined sub-periods like hydrological years or seasonal sections. Many hydrologists use long-term water-balances as a pivotal tool in model evaluation. They allow inferences about different systematic model-shortcomings and are an efficient way for communicating these in practice (Schulz et al., 2015). The evaluation and construction of such water balances is implemented with the presented package. During the (manual) calibration of a model or in the scope of model development, many model runs and iterations are necessary. Thus, users are often interested in comparing different model results in a visual way in order to learn about the model and to analyse parameter-changes on the output. A method to illuminate these differences and the evolution of changes is also included. References: • Gupta, H.V.; Wagener, T.; Liu, Y. (2008): Reconciling theory with observations: elements of a diagnostic approach to model evaluation, Hydrol. Process. 22, doi: 10.1002/hyp.6989. • Klemeš, V. (1986): Operational testing of hydrological simulation models, Hydrolog. Sci. J., doi: 10.1080/02626668609491024. • Kling, H.; Stanzel, P.; Fuchs, M.; and Nachtnebel, H. P. (2014): Performance of the COSERO precipitation-runoff model under non-stationary conditions in basins with different climates, Hydrolog. Sci. J., doi: 10.1080/02626667.2014.959956. • Schulz, K., Herrnegger, M., Wesemann, J., Klotz, D. Senoner, T. (2015): Kalibrierung COSERO - Mur für Pro Vis, Verbund Trading GmbH (Abteilung STG), final report, Institute of Water Management, Hydrology and Hydraulic Engineering, University of Natural Resources and Applied Life Sciences, Vienna, Austria, 217pp. 
• Zambrano-Bigiarini, M; Bellin, A. (2010): Comparing Goodness-of-fit Measures for Calibration of Models Focused on Extreme Events. European Geosciences Union (EGU), Geophysical Research Abstracts 14, EGU2012-11549-1.
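The objective functions named in the visCOS abstract above are easy to state; a minimal Python sketch (not visCOS code, which is an R package) of the Nash-Sutcliffe and Kling-Gupta efficiencies follows.

import numpy as np

def nse(sim, obs):
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def kge(sim, obs):
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    r = np.corrcoef(sim, obs)[0, 1]    # linear correlation
    alpha = sim.std() / obs.std()      # variability ratio
    beta = sim.mean() / obs.mean()     # bias ratio
    return 1.0 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

obs = np.array([1.0, 2.0, 4.0, 3.0, 2.5])
sim = np.array([1.2, 1.8, 3.6, 3.3, 2.4])
print(round(nse(sim, obs), 3), round(kge(sim, obs), 3))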
Nonstationary Extreme Value Analysis in a Changing Climate: A Software Package
NASA Astrophysics Data System (ADS)
Cheng, L.; AghaKouchak, A.; Gilleland, E.
2013-12-01
Numerous studies show that climatic extremes have increased substantially in the second half of the 20th century. For this reason, analysis of extremes under a nonstationary assumption has received a great deal of attention. This paper presents a software package developed for estimation of return levels, return periods, and risks of climatic extremes in a changing climate. This MATLAB software package offers tools for analysis of climate extremes under both stationary and non-stationary assumptions. The Nonstationary Extreme Value Analysis (hereafter, NEVA) provides an efficient and generalized framework for analyzing extremes using Bayesian inference. NEVA estimates the extreme value parameters using a Differential Evolution Markov Chain (DE-MC) which utilizes the genetic algorithm Differential Evolution (DE) for global optimization over the real parameter space with the Markov Chain Monte Carlo (MCMC) approach and has the advantage of simplicity, speed of calculation and convergence over conventional MCMC. NEVA also offers the confidence interval and uncertainty bounds of estimated return levels based on the sampled parameters. NEVA integrates extreme value design concepts, data analysis tools, optimization and visualization, explicitly designed to facilitate analysis of extremes in geosciences. The generalized input and output files of this software package make it attractive for users from across different fields. Both stationary and nonstationary components of the package are validated for a number of case studies using empirical return levels. The results show that NEVA reliably describes extremes and their return levels.
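NEVA itself is a MATLAB package built around Bayesian DE-MC sampling; a stationary-only illustration of fitting a GEV distribution and reading off return levels can be written in a few lines of Python with SciPy (synthetic annual maxima, hypothetical parameters).

import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)
annual_maxima = genextreme.rvs(c=-0.1, loc=50.0, scale=10.0, size=60, random_state=rng)

c, loc, scale = genextreme.fit(annual_maxima)    # maximum-likelihood GEV fit
for T in (10, 50, 100):
    level = genextreme.ppf(1.0 - 1.0 / T, c, loc=loc, scale=scale)
    print(f"{T}-year return level: {level:.1f}")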
Bi-Force: large-scale bicluster editing and its application to gene expression data biclustering
Sun, Peng; Speicher, Nora K.; Röttger, Richard; Guo, Jiong; Baumbach, Jan
2014-01-01
The explosion of biological data has dramatically transformed today's biological research. The need to integrate and analyze high-dimensional biological data on a large scale is driving the development of novel bioinformatics approaches. Biclustering, also known as ‘simultaneous clustering’ or ‘co-clustering’, has been successfully utilized to discover local patterns in gene expression data and similar biomedical data types. Here, we contribute a new heuristic, ‘Bi-Force’, based on the weighted bicluster editing model, which performs biclustering on arbitrary sets of biological entities, given any kind of pairwise similarities. We first evaluated the power of Bi-Force to solve dedicated bicluster editing problems by comparing Bi-Force with two existing algorithms in the BiCluE software package. We then followed a biclustering evaluation protocol in a recent review paper from Eren et al. (2013) (A comparative analysis of biclustering algorithms for gene expression data. Brief. Bioinform., 14:279–292.) and compared Bi-Force against eight existing tools: FABIA, QUBIC, Cheng and Church, Plaid, BiMax, Spectral, xMOTIFs and ISA. To this end, a suite of synthetic datasets as well as nine large gene expression datasets from Gene Expression Omnibus were analyzed. All resulting biclusters were subsequently investigated by Gene Ontology enrichment analysis to evaluate their biological relevance. The distinct theoretical foundation of Bi-Force (bicluster editing) is more powerful than strict biclustering. We thus outperformed existing tools with Bi-Force at least when following the evaluation protocols from Eren et al. Bi-Force is implemented in Java and integrated into the open source software package of BiCluE. The software as well as all used datasets are publicly available at http://biclue.mpi-inf.mpg.de. PMID:24682815
Defense AT and L. Volume 42, Number 1
2013-02-01
Agnish: The U.S. Army late last year began equipping brigade combat teams with its first package of radios, satellite systems, software applications… the Army's first package of radios, satellite systems, software applications, smartphone-like devices, and other network components that provide integrated… satellite communications, intelligence, mission command applications, and the integration of C4ISR equipment onto various vehicle platforms. This…
ERIC Educational Resources Information Center
Gagne, Phill; Furlow, Carolyn; Ross, Terris
2009-01-01
In item response theory (IRT) simulation research, it is often necessary to use one software package for data generation and a second software package to conduct the IRT analysis. Because this can substantially slow down the simulation process, it is sometimes offered as a justification for using very few replications. This article provides…
Software for Teaching about AIDS & Sex: A Critical Review of Products. A MicroSIFT Report.
ERIC Educational Resources Information Center
Weaver, Dave
This document contains critical reviews of 10 microcomputer software packages and two interactive videodisc products designed for use in teaching about Acquired Immune Deficiency Syndrome (AIDS) and sex at the secondary school level and above. Each package was reviewed by one or two secondary school health teachers and by a staff member from the…
User's manual for the VAX-Gerber link software package. Revision 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Isobe, G.W.
1985-10-01
This manual provides a user the information necessary to run the VAX-Gerber link software package. It is expected that the user already knows how to login to the VAX, and is familiar with the Gerber Photo Plotter. It is also highly desirable that the user be familiar with the full screen editor on the VAX, EDT.
Sampling and sensitivity analyses tools (SaSAT) for computational modelling
Hoare, Alexander; Regan, David G; Wilson, David P
2008-01-01
SaSAT (Sampling and Sensitivity Analysis Tools) is a user-friendly software package for applying uncertainty and sensitivity analyses to mathematical and computational models of arbitrary complexity and context. The toolbox is built in Matlab®, a numerical mathematical software package, and utilises algorithms contained in the Matlab® Statistics Toolbox. However, Matlab® is not required to use SaSAT as the software package is provided as an executable file with all the necessary supplementary files. The SaSAT package is also designed to work seamlessly with Microsoft Excel but no functionality is forfeited if that software is not available. A comprehensive suite of tools is provided to enable the following tasks to be easily performed: efficient and equitable sampling of parameter space by various methodologies; calculation of correlation coefficients; regression analysis; factor prioritisation; and graphical output of results, including response surfaces, tornado plots, and scatterplots. Use of SaSAT is exemplified by application to a simple epidemic model. To our knowledge, a number of the methods available in SaSAT for performing sensitivity analyses have not previously been used in epidemiological modelling and their usefulness in this context is demonstrated. PMID:18304361
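SaSAT is distributed as a MATLAB-based executable; the sampling-then-rank-correlation workflow it automates can be illustrated in Python with SciPy, using a toy three-parameter model (parameter names and bounds are hypothetical).

import numpy as np
from scipy.stats import qmc, spearmanr

sampler = qmc.LatinHypercube(d=3, seed=1)
unit = sampler.random(n=200)                                  # Latin hypercube samples in [0, 1)^3
params = qmc.scale(unit, [0.1, 0.5, 1.0], [0.5, 2.0, 5.0])    # hypothetical beta, gamma, n0 bounds

output = params[:, 0] * params[:, 2] / params[:, 1]           # toy model: beta * n0 / gamma
for name, column in zip(["beta", "gamma", "n0"], params.T):
    rho, _ = spearmanr(column, output)                        # rank-correlation sensitivity measure
    print(f"rank correlation with output ({name}): {rho:+.2f}")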
Li, Zhucui; Lu, Yan; Guo, Yufeng; Cao, Haijie; Wang, Qinhong; Shui, Wenqing
2018-10-31
Data analysis represents a key challenge for untargeted metabolomics studies, as it commonly requires extensive processing of many thousands of metabolite peaks included in raw high-resolution MS data. Although a number of software packages have been developed to facilitate untargeted data processing, they have not been comprehensively scrutinized for their capabilities in feature detection, quantification, and marker selection using a well-defined benchmark sample set. In this study, we acquired a benchmark dataset from standard mixtures consisting of 1100 compounds with specified concentration ratios, including 130 compounds with significant variation of concentrations. The five software packages evaluated here (MS-Dial, MZmine 2, XCMS, MarkerView, and Compound Discoverer) showed similar performance in detection of true features derived from compounds in the mixtures. However, significant differences between the packages were observed in relative quantification of true features in the benchmark dataset. MZmine 2 outperformed the other software in terms of quantification accuracy, and it reported the most true discriminating markers together with the fewest false markers. Furthermore, we assessed the selection of discriminating markers by the different packages using both the benchmark dataset and a real-case metabolomics dataset, and propose combined usage of two packages to increase confidence in biomarker identification. Our findings from this comprehensive evaluation of untargeted metabolomics software should help guide future improvements of these widely used bioinformatics tools and enable users to properly interpret their metabolomics results. Copyright © 2018 Elsevier B.V. All rights reserved.
Inferring Phylogenetic Networks Using PhyloNet.
Wen, Dingqiao; Yu, Yun; Zhu, Jiafan; Nakhleh, Luay
2018-07-01
PhyloNet was released in 2008 as a software package for representing and analyzing phylogenetic networks. At the time of its release, the main functionalities in PhyloNet consisted of measures for comparing network topologies and a single heuristic for reconciling gene trees with a species tree. Since then, PhyloNet has grown significantly. The software package now includes a wide array of methods for inferring phylogenetic networks from data sets of unlinked loci while accounting for both reticulation (e.g., hybridization) and incomplete lineage sorting. In particular, PhyloNet now allows for maximum parsimony, maximum likelihood, and Bayesian inference of phylogenetic networks from gene tree estimates. Furthermore, Bayesian inference directly from sequence data (sequence alignments or biallelic markers) is implemented. Maximum parsimony is based on an extension of the "minimizing deep coalescences" criterion to phylogenetic networks, whereas maximum likelihood and Bayesian inference are based on the multispecies network coalescent. All methods allow for multiple individuals per species. As computing the likelihood of a phylogenetic network is computationally hard, PhyloNet allows for evaluation and inference of networks using a pseudolikelihood measure. PhyloNet summarizes the results of the various analyses and generates phylogenetic networks in the extended Newick format that is readily viewable by existing visualization software.
Molecular Isotopic Distribution Analysis (MIDAs) with Adjustable Mass Accuracy
NASA Astrophysics Data System (ADS)
Alves, Gelio; Ogurtsov, Aleksey Y.; Yu, Yi-Kuo
2014-01-01
In this paper, we present Molecular Isotopic Distribution Analysis (MIDAs), a new software tool designed to compute molecular isotopic distributions with adjustable accuracies. MIDAs offers two algorithms, one polynomial-based and one Fourier-transform-based, both of which compute molecular isotopic distributions accurately and efficiently. The polynomial-based algorithm contains few novel aspects, whereas the Fourier-transform-based algorithm consists mainly of improvements to other existing Fourier-transform-based algorithms. We have benchmarked the performance of the two algorithms implemented in MIDAs with that of eight software packages (BRAIN, Emass, Mercury, Mercury5, NeutronCluster, Qmass, JFC, IC) using a consensus set of benchmark molecules. Under the proposed evaluation criteria, MIDAs's algorithms, JFC, and Emass compute with comparable accuracy the coarse-grained (low-resolution) isotopic distributions and are more accurate than the other software packages. For fine-grained isotopic distributions, we compared IC, MIDAs's polynomial algorithm, and MIDAs's Fourier transform algorithm. Among the three, IC and MIDAs's polynomial algorithm compute isotopic distributions that better resemble their corresponding exact fine-grained (high-resolution) isotopic distributions. MIDAs can be accessed freely through a user-friendly web-interface at http://www.ncbi.nlm.nih.gov/CBBresearch/Yu/midas/index.html.
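The polynomial approach mentioned above can be illustrated at nominal-mass (coarse-grained) resolution: a molecule's isotope pattern is the repeated convolution of per-atom abundance polynomials. The Python sketch below is not MIDAs code, and the abundances are approximate textbook values.

import numpy as np

# Isotope abundances at offsets of 0, 1, 2 nominal mass units above the lightest isotope.
ISOTOPES = {
    "C": [0.9893, 0.0107],
    "H": [0.999885, 0.000115],
    "O": [0.99757, 0.00038, 0.00205],
}

def isotope_pattern(formula):
    # formula: mapping of element symbol to atom count, e.g. {"C": 2, "H": 6, "O": 1}
    pattern = np.array([1.0])
    for element, count in formula.items():
        for _ in range(count):
            pattern = np.convolve(pattern, ISOTOPES[element])
    return pattern / pattern.sum()

# Ethanol (C2H6O): relative abundances of M, M+1, M+2, ...
print(np.round(isotope_pattern({"C": 2, "H": 6, "O": 1})[:4], 5))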
Software engineering and data management for automated payload experiment tool
NASA Technical Reports Server (NTRS)
Maddux, Gary A.; Provancha, Anna; Chattam, David
1994-01-01
The Microgravity Projects Office identified a need to develop a software package that will lead experiment developers through the development planning process, obtain necessary information, establish an electronic data exchange avenue, and allow easier manipulation/reformatting of the collected information. An MS-DOS compatible software package called the Automated Payload Experiment Tool (APET) has been developed and delivered. The objective of this task is to expand on the results of the APET work previously performed by University of Alabama in Huntsville (UAH) and provide versions of the software in a Macintosh and Windows compatible format. Appendix 1 science requirements document (SRD) Users Manual is attached.
PsyToolkit: a software package for programming psychological experiments using Linux.
Stoet, Gijsbert
2010-11-01
PsyToolkit is a set of software tools for programming psychological experiments on Linux computers. Given that PsyToolkit is freely available under the Gnu Public License, open source, and designed such that it can easily be modified and extended for individual needs, it is suitable not only for technically oriented Linux users, but also for students, researchers on small budgets, and universities in developing countries. The software includes a high-level scripting language, a library for the programming language C, and a questionnaire presenter. The software easily integrates with other open source tools, such as the statistical software package R. PsyToolkit is designed to work with external hardware (including IoLab and Cedrus response keyboards and two common digital input/output boards) and to support millisecond timing precision. Four in-depth examples explain the basic functionality of PsyToolkit. Example 1 demonstrates a stimulus-response compatibility experiment. Example 2 demonstrates a novel mouse-controlled visual search experiment. Example 3 shows how to control light emitting diodes using PsyToolkit, and Example 4 shows how to build a light-detection sensor. The last two examples explain the electronic hardware setup such that they can even be used with other software packages.
NASA Astrophysics Data System (ADS)
Wi, S.; Ray, P. A.; Brown, C.
2015-12-01
A software package developed to facilitate building distributed hydrologic models in a modular modeling system is presented. The software package provides a user-friendly graphical user interface that eases its practical use in water resources-related research and practice. The modular modeling system organizes the options available to users when assembling models according to the stages of hydrological cycle, such as potential evapotranspiration, soil moisture accounting, and snow/glacier melting processes. The software is intended to be a comprehensive tool that simplifies the task of developing, calibrating, validating, and using hydrologic models through the inclusion of intelligent automation to minimize user effort, and reduce opportunities for error. Processes so far automated include the definition of system boundaries (i.e., watershed delineation), climate and geographical input generation, and parameter calibration. Built-in post-processing toolkits greatly improve the functionality of the software as a decision support tool for water resources system management and planning. Example post-processing toolkits enable streamflow simulation at ungauged sites with predefined model parameters, and perform climate change risk assessment by means of the decision scaling approach. The software is validated through application to watersheds representing a variety of hydrologic regimes.
SIGKit: a New Data-based Software for Learning Introductory Geophysics
NASA Astrophysics Data System (ADS)
Zhang, Y.; Kruse, S.; George, O.; Esmaeili, S.; Papadimitrios, K. S.; Bank, C. G.; Cadmus, A.; Kenneally, N.; Patton, K.; Brusher, J.
2016-12-01
Students of diverse academic backgrounds take introductory geophysics courses to learn the theory of a variety of measurement and analysis methods with the expectation to be able to apply their basic knowledge to real data. Ideally, such data is collected in field courses and also used in lecture-based courses because they provide a critical context for better learning and understanding of geophysical methods. Each method requires a separate software package for the data processing steps, and the complexity and variety of professional software makes the path through data processing to data interpretation a strenuous learning process for students and a challenging teaching task for instructors. SIGKit (Student Investigation of Geophysics Toolkit) being developed as a collaboration between the University of South Florida, the University of Toronto, and MathWorks intends to address these shortcomings by showing the most essential processing steps and allowing students to visualize the underlying physics of the various methods. It is based on MATLAB software and offered as an easy-to-use graphical user interface and packaged so it can run as an executable in the classroom and the field even on computers without MATLAB licenses. An evaluation of the software based on student feedback from focus-group interviews and think-aloud observations helps drive its development and refinement. The toolkit provides a logical gateway into the more sophisticated and costly software students will encounter later in their training and careers by combining essential visualization, modeling, processing, and analysis steps for seismic, GPR, magnetics, gravity, resistivity, and electromagnetic data.
Software package for modeling spin–orbit motion in storage rings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zyuzin, D. V., E-mail: d.zyuzin@fz-juelich.de
2015-12-15
A software package providing a graphical user interface for computer experiments on the motion of charged particle beams in accelerators, as well as analysis of obtained data, is presented. The software package was tested in the framework of the international project on electric dipole moment measurement JEDI (Jülich Electric Dipole moment Investigations). The specific features of particle spin motion imply the requirement to use a cyclic accelerator (storage ring) consisting of electrostatic elements, which makes it possible to preserve horizontal polarization for a long time. Computer experiments study the dynamics of 10^6–10^9 particles in a beam during 10^9 turns in an accelerator (about 10^12–10^15 integration steps for the equations of motion). For designing an optimal accelerator structure, a large number of computer experiments on polarized beam dynamics are required. The numerical core of the package is COSY Infinity, a program for modeling spin–orbit dynamics.
Application of remote sensing to state and regional problems. [mississippi
NASA Technical Reports Server (NTRS)
Miller, W. F.; Powers, J. S.; Clark, J. R.; Solomon, J. L.; Williams, S. G. (Principal Investigator)
1981-01-01
The methods and procedures used, accomplishments, current status, and future plans are discussed for each of the following applications of LANDSAT in Mississippi: (1) land use planning in Lowndes County; (2) strip mine inventory and reclamation; (3) white-tailed deer habitat evaluation; (4) remote sensing data analysis support systems; (5) discrimination of unique forest habitats in potential lignite areas; (6) changes in gravel operations; and (7) determining freshwater wetlands for inventory and monitoring. The documentation of all existing software and the integration of the image analysis and data base software into a single package are now considered very high priority items.
[Propensity score matching in SPSS].
Huang, Fuqiang; DU, Chunlin; Sun, Menghui; Ning, Bing; Luo, Ying; An, Shengli
2015-11-01
To realize propensity score matching in the PS Matching module of SPSS and interpret the analysis results. The R software, the plug-in linking it with the corresponding version of SPSS, and the propensity score matching package were installed. A PS Matching module was added to the SPSS interface, and its use was demonstrated with test data. Score estimation and nearest-neighbor matching were achieved with the PS Matching module, and the results of qualitative and quantitative statistical description and evaluation were presented in graphical form for the matching. Propensity score matching can be accomplished conveniently using SPSS software.
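The abstract above describes the SPSS/R workflow; the underlying idea (estimate a propensity score, then pair each treated unit with the nearest control) can also be sketched directly in Python with scikit-learn on hypothetical data. This is an illustration, not the SPSS module or the R package.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                        # hypothetical covariates
treated = (X[:, 0] + rng.normal(size=200)) > 0       # hypothetical treatment assignment

ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]   # propensity scores
t_idx, c_idx = np.where(treated)[0], np.where(~treated)[0]

nn = NearestNeighbors(n_neighbors=1).fit(ps[c_idx].reshape(-1, 1))
_, match = nn.kneighbors(ps[t_idx].reshape(-1, 1))   # 1:1 nearest-neighbour matching (with replacement)
pairs = list(zip(t_idx, c_idx[match.ravel()]))       # (treated index, matched control index)
print(pairs[:5])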
ERIC Educational Resources Information Center
Ayres, Kevin; Cihak, David
2010-01-01
The purpose of this study was to evaluate the effects of a computer-based video instruction (CBVI) program to teach life skills. Three middle school-aged students with intellectual disabilities were taught how to make a sandwich, use a microwave, and set the table with a CBVI software package. A multiple probe across behaviors design was used to…
ERIC Educational Resources Information Center
Levy, Roy
2010-01-01
SEMModComp, a software package for conducting likelihood ratio tests for mean and covariance structure modeling is described. The package is written in R and freely available for download or on request.
The design, deployment, and testing of kriging models in GEOframe with SIK-0.9.8
NASA Astrophysics Data System (ADS)
Bancheri, Marialaura; Serafin, Francesco; Bottazzi, Michele; Abera, Wuletawu; Formetta, Giuseppe; Rigon, Riccardo
2018-06-01
This work presents a software package for the interpolation of climatological variables, such as temperature and precipitation, using kriging techniques. The purposes of the paper are (1) to present a geostatistical software that is easy to use and easy to plug in to a hydrological model; (2) to provide a practical example of an accurately designed software from the perspective of reproducible research; and (3) to demonstrate the goodness of the results of the software and so have a reliable alternative to other, more traditional tools. A total of 11 types of theoretical semivariograms and four types of kriging were implemented and gathered into Object Modeling System-compliant components. The package provides real-time optimization for semivariogram and kriging parameters. The software was tested using a year's worth of hourly temperature readings and a rain storm event (11 h) recorded in 2008 and retrieved from 97 meteorological stations in the Isarco River basin, Italy. For both the variables, good interpolation results were obtained and then compared to the results from the R package gstat.
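The SIK components are Java/OMS-based; the interpolation idea itself can be illustrated in Python, since ordinary kriging is closely related to Gaussian-process regression. The station coordinates and temperatures below are synthetic.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(2)
stations = rng.uniform(0.0, 50.0, size=(30, 2))                 # station x, y in km (synthetic)
temps = 15.0 - 0.1 * stations[:, 0] + rng.normal(0.0, 0.3, 30)  # synthetic observations

gp = GaussianProcessRegressor(kernel=RBF(length_scale=10.0) + WhiteKernel(0.1),
                              normalize_y=True).fit(stations, temps)
grid = np.array([[10.0, 25.0], [40.0, 5.0]])                    # prediction locations
mean, std = gp.predict(grid, return_std=True)                   # interpolated value and uncertainty
print(np.round(mean, 2), np.round(std, 2))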
Advanced Software Development Workstation Project
NASA Technical Reports Server (NTRS)
Lee, Daniel
1989-01-01
The Advanced Software Development Workstation Project, funded by Johnson Space Center, is investigating knowledge-based techniques for software reuse in NASA software development projects. Two prototypes have been demonstrated and a third is now in development. The approach is to build a foundation that provides passive reuse support, add a layer that uses domain-independent programming knowledge, add a layer that supports the acquisition of domain-specific programming knowledge to provide active support, and enhance maintainability and modifiability through an object-oriented approach. The development of new application software would use specification-by-reformulation, based on a cognitive theory of retrieval from very long-term memory in humans, and using an Ada code library and an object base. Current tasks include enhancements to the knowledge representation of Ada packages and abstract data types, extensions to support Ada package instantiation knowledge acquisition, integration with Ada compilers and relational databases, enhancements to the graphical user interface, and demonstration of the system with a NASA contractor-developed trajectory simulation package. Future work will focus on investigating issues involving scale-up and integration.
A Digital Control Algorithm for Magnetic Suspension Systems
NASA Technical Reports Server (NTRS)
Britton, Thomas C.
1996-01-01
An ongoing program exists to investigate and develop magnetic suspension technologies and modelling techniques at NASA Langley Research Center. Presently, there is a laboratory-scale large air-gap suspension system capable of five degree-of-freedom (DOF) control that is operational and a six DOF system that is under development. Those systems levitate a cylindrical element containing a permanent magnet core above a planar array of electromagnets, which are used for levitation and control purposes. In order to evaluate various control approaches with those systems, the Generic Real-Time State-Space Controller (GRTSSC) software package was developed. That control software package allows the user to implement multiple control methods and allows for varied input/output commands. The development of the control algorithm is presented. The desired functionality of the software is discussed, including the ability to inject noise on sensor inputs and/or actuator outputs. Various limitations, common issues, and trade-offs are discussed including data format precision; the drawbacks of using either Direct Memory Access (DMA), interrupts, or program control techniques for data acquisition; and platform dependent concerns related to the portability of the software, such as memory addressing formats. Efforts to minimize overall controller loop-rate and a comparison of achievable controller sample rates are discussed. The implementation of a modular code structure is presented. The format for the controller input data file and the noise information file is presented. Controller input vector information is available for post-processing by mathematical analysis software such as MATLAB1.
Pichler, Peter; Mazanek, Michael; Dusberger, Frederico; Weilnböck, Lisa; Huber, Christian G; Stingl, Christoph; Luider, Theo M; Straube, Werner L; Köcher, Thomas; Mechtler, Karl
2012-11-02
While the performance of liquid chromatography (LC) and mass spectrometry (MS) instrumentation continues to increase, applications such as analyses of complete or near-complete proteomes and quantitative studies require constant and optimal system performance. For this reason, research laboratories and core facilities alike are recommended to implement quality control (QC) measures as part of their routine workflows. Many laboratories perform sporadic quality control checks. However, successive and systematic longitudinal monitoring of system performance would be facilitated by dedicated automatic or semiautomatic software solutions that aid an effortless analysis and display of QC metrics over time. We present the software package SIMPATIQCO (SIMPle AuTomatIc Quality COntrol) designed for evaluation of data from LTQ Orbitrap, Q-Exactive, LTQ FT, and LTQ instruments. A centralized SIMPATIQCO server can process QC data from multiple instruments. The software calculates QC metrics supervising every step of data acquisition from LC and electrospray to MS. For each QC metric the software learns the range indicating adequate system performance from the uploaded data using robust statistics. Results are stored in a database and can be displayed in a comfortable manner from any computer in the laboratory via a web browser. QC data can be monitored for individual LC runs as well as plotted over time. SIMPATIQCO thus assists the longitudinal monitoring of important QC metrics such as peptide elution times, peak widths, intensities, total ion current (TIC) as well as sensitivity, and overall LC-MS system performance; in this way the software also helps identify potential problems. The SIMPATIQCO software package is available free of charge.
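The "robust statistics" step described above (learning the range of adequate performance from historical runs) can be sketched generically in Python with a median/MAD control band; this is an illustration, not SIMPATIQCO code, and the values are invented.

import numpy as np

# Historical values of one QC metric, e.g. a peptide's retention time in minutes (invented).
history = np.array([31.8, 32.1, 31.5, 32.4, 31.9, 32.0, 45.0, 31.7, 32.2])

median = np.median(history)
mad = np.median(np.abs(history - median))
sigma = 1.4826 * mad                      # MAD scaled to be comparable to a standard deviation
low, high = median - 3 * sigma, median + 3 * sigma

new_run = 33.9
status = "OK" if low <= new_run <= high else "FLAGGED"
print(f"acceptable range {low:.2f}-{high:.2f} min; new run {new_run} min -> {status}")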
The STARLINK software collection
NASA Astrophysics Data System (ADS)
Penny, A. J.; Wallace, P. T.; Sherman, J. C.; Terret, D. L.
1993-12-01
A demonstration will be given of some recent Starlink software. STARLINK is: a network of computers used by UK astronomers; a collection of programs for the calibration and analysis of astronomical data; a team of people giving hardware, software and administrative support. The Starlink Project has been in operation since 1980 to provide UK astronomers with interactive image processing and data reduction facilities. There are now Starlink computer systems at 25 UK locations, serving about 1500 registered users. The Starlink software collection now has about 25 major packages covering a wide range of astronomical data reduction and analysis techniques, as well as many smaller programs and utilities. At the core of most of the packages is a common `software environment', which provides many of the functions which applications need and offers standardized methods of structuring and accessing data. The software environment simplifies programming and support, and makes it easy to use different packages for different stages of the data reduction. Users see a consistent style, and can mix applications without hitting problems of differing data formats. The Project group coordinates the writing and distribution of this software collection, which is Unix based. Outside the UK, Starlink is used at a large number of places, which range from installations at major UK telescopes, which are Starlink-compatible and managed like Starlink sites, to individuals who run only small parts of the Starlink software collection.
Efficient Calculation of Exact Exchange Within the Quantum Espresso Software Package
NASA Astrophysics Data System (ADS)
Barnes, Taylor; Kurth, Thorsten; Carrier, Pierre; Wichmann, Nathan; Prendergast, David; Kent, Paul; Deslippe, Jack
Accurate simulation of condensed matter at the nanoscale requires careful treatment of the exchange interaction between electrons. In the context of plane-wave DFT, these interactions are typically represented through the use of approximate functionals. Greater accuracy can often be obtained through the use of functionals that incorporate some fraction of exact exchange; however, evaluation of the exact exchange potential is often prohibitively expensive. We present an improved algorithm for the parallel computation of exact exchange in Quantum Espresso, an open-source software package for plane-wave DFT simulation. Through the use of aggressive load balancing and on-the-fly transformation of internal data structures, our code exhibits speedups of approximately an order of magnitude for practical calculations. Additional optimizations are presented targeting the many-core Intel Xeon-Phi ``Knights Landing'' architecture, which largely powers NERSC's new Cori system. We demonstrate the successful application of the code to difficult problems, including simulation of water at a platinum interface and computation of the X-ray absorption spectra of transition metal oxides.
Evaluation of tools for highly variable gene discovery from single-cell RNA-seq data.
Yip, Shun H; Sham, Pak Chung; Wang, Junwen
2018-02-21
Traditional RNA sequencing (RNA-seq) allows the detection of gene expression variations between two or more cell populations through differentially expressed gene (DEG) analysis. However, genes that contribute to cell-to-cell differences are not discoverable with RNA-seq because RNA-seq samples are obtained from a mixture of cells. Single-cell RNA-seq (scRNA-seq) allows the detection of gene expression in each cell. With scRNA-seq, highly variable gene (HVG) discovery allows the detection of genes that contribute strongly to cell-to-cell variation within a homogeneous cell population, such as a population of embryonic stem cells. This analysis is implemented in many software packages. In this study, we compare seven HVG methods from six software packages, including BASiCS, Brennecke, scLVM, scran, scVEGs and Seurat. Our results demonstrate that reproducibility in HVG analysis requires a larger sample size than DEG analysis. Discrepancies between methods and potential issues in these tools are discussed and recommendations are made.
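As a rough illustration of what an HVG ranking does (a generic mean-variance approach, not the algorithm of any of the six packages compared above; the array shapes and cutoff are assumptions), genes whose variance across cells exceeds what a Poisson-like technical-noise model predicts are promoted:

import numpy as np

def rank_hvgs(counts, top_n=100):
    """Rank genes by excess variability relative to a Poisson expectation.

    counts: genes x cells matrix of normalized counts.
    Returns indices of the top_n most variable genes."""
    mean = counts.mean(axis=1)
    var = counts.var(axis=1)
    # For pure technical (Poisson-like) noise, variance ~ mean; genes whose
    # variance sits well above that trend are candidate HVGs.
    excess = var / np.maximum(mean, 1e-12)  # Fano-factor-style score
    return np.argsort(excess)[::-1][:top_n]

rng = np.random.default_rng(0)
counts = rng.poisson(lam=5.0, size=(2000, 300)).astype(float)  # synthetic data
hvg_idx = rank_hvgs(counts)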
The GRIDView Visualization Package
NASA Astrophysics Data System (ADS)
Kent, B. R.
2011-07-01
Large three-dimensional data cubes, catalogs, and spectral line archives are increasingly important elements of the data discovery process in astronomy. Visualization of large data volumes is of vital importance for the success of large spectral line surveys. Examples of data reduction utilizing the GRIDView software package are shown. The package allows users to manipulate data cubes, extract spectral profiles, and measure line properties. The package and included graphical user interfaces (GUIs) are designed with pipeline infrastructure in mind. The software has been used with great success analyzing spectral line and continuum data sets obtained from large radio survey collaborations. The tools are also important for multi-wavelength cross-correlation studies and incorporate Virtual Observatory client applications for overlaying database information in real time as cubes are examined by users.
NASA Astrophysics Data System (ADS)
Sardina, V.
2012-12-01
The US Tsunami Warning Centers (TWCs) have traditionally generated their tsunami message products primarily as blocks of text, tagged with headers that identify them on each particular communications (comms) circuit. Each warning center has a primary area of responsibility (AOR) within which it has an authoritative role regarding parameters such as earthquake location and magnitude. This means that when a major tsunamigenic event occurs, the other warning centers need to quickly access the earthquake parameters issued by the authoritative warning center before issuing their message products intended for customers in their own AOR. Thus, within the operational context of the TWCs, the scientists on duty need to access the information contained in the message products issued by other warning centers as quickly as possible. As a solution to this operational problem we designed and implemented a C++ software package that allows scanning for and parsing the entire suite of tsunami message products issued by the Pacific Tsunami Warning Center (PTWC), the West Coast and Alaska Tsunami Warning Center (WCATWC), and the Japan Meteorological Agency (JMA). The scanning and parsing classes composing the resulting C++ software package allow parsing both the non-official message products (observatory messages) routinely issued by the TWCs and all official tsunami message products such as tsunami advisories, watches, and warnings. This software package currently allows scientists on duty at the PTWC to automatically retrieve the parameters contained in tsunami messages issued by WCATWC, JMA, or PTWC itself. Extension of the capabilities of the classes composing the software package would make it possible to generate XML- and CAP-compliant versions of the TWCs' message products until new messaging software natively adds these capabilities. Customers who receive the TWCs' tsunami message products could also use the package to automatically retrieve information from messages sent via any text-based communications circuit currently used by the TWCs to disseminate their tsunami message products.
21 CFR 801.50 - Labeling requirements for stand-alone software.
Code of Federal Regulations, 2014 CFR
2014-04-01
Title 21 (Food and Drugs), Volume 8, revised as of 2014-04-01. § 801.50 Labeling requirements for stand-alone software. (a) Stand-alone software that is not distributed ... in packaged form, stand-alone software regulated as a medical device must provide its unique device ...
O'Callaghan, Sean; De Souza, David P; Isaac, Andrew; Wang, Qiao; Hodkinson, Luke; Olshansky, Moshe; Erwin, Tim; Appelbe, Bill; Tull, Dedreia L; Roessner, Ute; Bacic, Antony; McConville, Malcolm J; Likić, Vladimir A
2012-05-30
Gas chromatography-mass spectrometry (GC-MS) is a technique frequently used in targeted and non-targeted measurements of metabolites. Most existing software tools for processing of raw instrument GC-MS data tightly integrate data processing methods with a graphical user interface, facilitating interactive data processing. While interactive processing remains critically important in GC-MS applications, high-throughput studies increasingly dictate the need for command line tools suitable for scripting of high-throughput, customized processing pipelines. PyMS comprises a library of functions for processing of instrument GC-MS data developed in Python. PyMS currently provides a complete set of GC-MS processing functions, including reading of standard data formats (ANDI-MS/NetCDF and JCAMP-DX), noise smoothing, baseline correction, peak detection, peak deconvolution, peak integration, and peak alignment by dynamic programming. A novel common ion single quantitation algorithm allows automated, accurate quantitation of GC-MS electron impact (EI) fragmentation spectra when a large number of experiments are being analyzed. PyMS implements parallel processing for by-row and by-column data processing tasks based on the Message Passing Interface (MPI), allowing processing to scale across multiple CPUs in distributed computing environments. A set of specifically designed experiments was performed in-house and used to comparatively evaluate the performance of PyMS and three widely used software packages for GC-MS data processing (AMDIS, AnalyzerPro, and XCMS). PyMS is a novel software package for the processing of raw GC-MS data, particularly suitable for scripting of customized processing pipelines and for data processing in batch mode. PyMS provides limited graphical capabilities and can be used both for routine data processing and for interactive/exploratory data analysis. In real-life GC-MS data processing scenarios PyMS performs as well as or better than leading software packages. We demonstrate data processing scenarios that are simple to implement in PyMS yet difficult to achieve with many conventional GC-MS data processing software packages. Automated sample processing and quantitation with PyMS can provide substantial time savings compared to more traditional interactive software systems that tightly integrate data processing with the graphical user interface.
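The kind of scripted, batch-mode processing stage the authors emphasize can be sketched generically with standard scientific-Python tools (this is not the PyMS API; the function name, window size and synthetic chromatogram are assumptions):

import numpy as np
from scipy.signal import savgol_filter, find_peaks

def process_chromatogram(tic, min_height=1e4):
    """Smooth a total-ion-current trace, subtract a crude baseline,
    and return detected peak indices -- a stand-in for one pipeline stage."""
    smoothed = savgol_filter(tic, window_length=11, polyorder=3)
    baseline = np.percentile(smoothed, 10)          # crude baseline estimate
    peaks, _ = find_peaks(smoothed - baseline, height=min_height)
    return peaks

# Batch mode: the same settings applied to every run (hypothetical run list)
runs = {"sample_01": np.random.default_rng(1).gamma(2.0, 5e3, 3000)}
results = {name: process_chromatogram(tic) for name, tic in runs.items()}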
NASA Astrophysics Data System (ADS)
Metzger, Stefan; Durden, David; Sturtevant, Cove; Luo, Hongyan; Pingintha-Durden, Natchaya; Sachs, Torsten; Serafimovich, Andrei; Hartmann, Jörg; Li, Jiahong; Xu, Ke; Desai, Ankur R.
2017-08-01
Large differences in instrumentation, site setup, data format, and operating system stymie the adoption of a universal computational environment for processing and analyzing eddy-covariance (EC) data. This results in limited software applicability and extensibility in addition to often substantial inconsistencies in flux estimates. Addressing these concerns, this paper presents the systematic development of portable, reproducible, and extensible EC software achieved by adopting a development and systems operation (DevOps) approach. This software development model is used for the creation of the eddy4R family of EC code packages in the open-source R language for statistical computing. These packages are community developed, iterated via the Git distributed version control system, and wrapped into a portable and reproducible Docker filesystem that is independent of the underlying host operating system. The HDF5 hierarchical data format then provides a streamlined mechanism for highly compressed and fully self-documented data ingest and output. The usefulness of the DevOps approach was evaluated for three test applications. First, the resultant EC processing software was used to analyze standard flux tower data from the first EC instruments installed at a National Ecological Observatory (NEON) field site. Second, through an aircraft test application, we demonstrate the modular extensibility of eddy4R to analyze EC data from other platforms. Third, an intercomparison with commercial-grade software showed excellent agreement (R2 = 1.0 for CO2 flux). In conjunction with this study, a Docker image containing the first two eddy4R packages and an executable example workflow, as well as first NEON EC data products are released publicly. We conclude by describing the work remaining to arrive at the automated generation of science-grade EC fluxes and benefits to the science community at large. This software development model is applicable beyond EC and more generally builds the capacity to deploy complex algorithms developed by scientists in an efficient and scalable manner. In addition, modularity permits meeting project milestones while retaining extensibility with time.
MDTraj: A Modern Open Library for the Analysis of Molecular Dynamics Trajectories.
McGibbon, Robert T; Beauchamp, Kyle A; Harrigan, Matthew P; Klein, Christoph; Swails, Jason M; Hernández, Carlos X; Schwantes, Christian R; Wang, Lee-Ping; Lane, Thomas J; Pande, Vijay S
2015-10-20
As molecular dynamics (MD) simulations continue to evolve into powerful computational tools for studying complex biomolecular systems, the necessity of flexible and easy-to-use software tools for the analysis of these simulations is growing. We have developed MDTraj, a modern, lightweight, and fast software package for analyzing MD simulations. MDTraj reads and writes trajectory data in a wide variety of commonly used formats. It provides a large number of trajectory analysis capabilities including minimal root-mean-square-deviation calculations, secondary structure assignment, and the extraction of common order parameters. The package has a strong focus on interoperability with the wider scientific Python ecosystem, bridging the gap between MD data and the rapidly growing collection of industry-standard statistical analysis and visualization tools in Python. MDTraj is a powerful and user-friendly software package that simplifies the analysis of MD data and connects these datasets with the modern interactive data science software ecosystem in Python. Copyright © 2015 Biophysical Society. Published by Elsevier Inc. All rights reserved.
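A minimal usage sketch of the package follows; the trajectory and topology file names are hypothetical, while the calls shown (load, rmsd, compute_dssp) are part of the documented MDTraj API:

import mdtraj as md

# Load a trajectory together with its topology (hypothetical file names)
traj = md.load("trajectory.xtc", top="system.pdb")

# Minimal RMSD of every frame to the first frame, after optimal superposition
rmsd = md.rmsd(traj, traj, frame=0)

# Secondary-structure assignment (DSSP) for each residue in each frame
dssp = md.compute_dssp(traj)

print(traj.n_frames, rmsd.max(), dssp.shape)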
Community-driven computational biology with Debian Linux.
Möller, Steffen; Krabbenhöft, Hajo Nils; Tille, Andreas; Paleino, David; Williams, Alan; Wolstencroft, Katy; Goble, Carole; Holland, Richard; Belhachemi, Dominique; Plessy, Charles
2010-12-21
The Open Source movement and its technologies are popular in the bioinformatics community because they provide freely available tools and resources for research. In order to feed the steady demand for updates on software and associated data, a service infrastructure is required for sharing and providing these tools to heterogeneous computing environments. The Debian Med initiative provides ready and coherent software packages for medical informatics and bioinformatics. These packages can be used together in Taverna workflows via the UseCase plugin to manage execution on local or remote machines. If such packages are available in cloud computing environments, the underlying hardware and the analysis pipelines can be shared along with the software. Debian Med closes the gap between developers and users. It provides a simple method for offering new releases of software and data resources, thus provisioning a local infrastructure for computational biology. For geographically distributed teams it can ensure they are working on the same versions of tools, in the same conditions. This contributes to the world-wide networking of researchers.
MDTraj: A Modern Open Library for the Analysis of Molecular Dynamics Trajectories
McGibbon, Robert T.; Beauchamp, Kyle A.; Harrigan, Matthew P.; Klein, Christoph; Swails, Jason M.; Hernández, Carlos X.; Schwantes, Christian R.; Wang, Lee-Ping; Lane, Thomas J.; Pande, Vijay S.
2015-01-01
As molecular dynamics (MD) simulations continue to evolve into powerful computational tools for studying complex biomolecular systems, the necessity of flexible and easy-to-use software tools for the analysis of these simulations is growing. We have developed MDTraj, a modern, lightweight, and fast software package for analyzing MD simulations. MDTraj reads and writes trajectory data in a wide variety of commonly used formats. It provides a large number of trajectory analysis capabilities including minimal root-mean-square-deviation calculations, secondary structure assignment, and the extraction of common order parameters. The package has a strong focus on interoperability with the wider scientific Python ecosystem, bridging the gap between MD data and the rapidly growing collection of industry-standard statistical analysis and visualization tools in Python. MDTraj is a powerful and user-friendly software package that simplifies the analysis of MD data and connects these datasets with the modern interactive data science software ecosystem in Python. PMID:26488642
Tanpitukpongse, T P; Mazurowski, M A; Ikhena, J; Petrella, J R
2017-03-01
Alzheimer disease is a prevalent neurodegenerative disease. Computer assessment of brain atrophy patterns can help predict conversion to Alzheimer disease. Our aim was to assess the prognostic efficacy of individual-versus-combined regional volumetrics in 2 commercially available brain volumetric software packages for predicting conversion of patients with mild cognitive impairment to Alzheimer disease. Data were obtained through the Alzheimer's Disease Neuroimaging Initiative. One hundred ninety-two subjects (mean age, 74.8 years; 39% female) diagnosed with mild cognitive impairment at baseline were studied. All had T1-weighted MR imaging sequences at baseline and 3-year clinical follow-up. Analysis was performed with NeuroQuant and Neuroreader. Receiver operating characteristic curves assessing the prognostic efficacy of each software package were generated by using a univariable approach using individual regional brain volumes and 2 multivariable approaches (multiple regression and random forest), combining multiple volumes. On univariable analysis of 11 NeuroQuant and 11 Neuroreader regional volumes, hippocampal volume had the highest area under the curve for both software packages (0.69, NeuroQuant; 0.68, Neuroreader) and was not significantly different ( P > .05) between packages. Multivariable analysis did not increase the area under the curve for either package (0.63, logistic regression; 0.60, random forest NeuroQuant; 0.65, logistic regression; 0.62, random forest Neuroreader). Of the multiple regional volume measures available in FDA-cleared brain volumetric software packages, hippocampal volume remains the best single predictor of conversion of mild cognitive impairment to Alzheimer disease at 3-year follow-up. Combining volumetrics did not add additional prognostic efficacy. Therefore, future prognostic studies in mild cognitive impairment, combining such tools with demographic and other biomarker measures, are justified in using hippocampal volume as the only volumetric biomarker. © 2017 by American Journal of Neuroradiology.
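The univariable analysis described above amounts to treating a single regional volume as the score in a receiver-operating-characteristic analysis. A hedged sketch with scikit-learn on synthetic data (variable names and values are assumptions, not data from the study) is shown below; because smaller hippocampal volumes indicate higher conversion risk, the sign of the volume is flipped before computing the AUC.

import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n = 192
converted = rng.integers(0, 2, size=n)          # 1 = converted to AD at 3 years
# Synthetic normalized hippocampal volumes: converters slightly smaller on average
hippo_vol = rng.normal(loc=np.where(converted, 0.55, 0.62), scale=0.08)

# Univariable prognostic efficacy: AUC with lower volume meaning higher risk
auc = roc_auc_score(converted, -hippo_vol)
print(f"AUC for hippocampal volume alone: {auc:.2f}")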
Management of an affiliated Physics Residency Program using a commercial software tool.
Zacarias, Albert S; Mills, Michael D
2010-06-01
A review of commercially available allied health educational management software tools was performed to evaluate their capacity to manage program data associated with a CAMPEP-accredited Therapy Physics Residency Program. Features of these software tools include: a) didactic course reporting and organization, b) competency reporting by topic, category and didactic course, c) student time management and accounting, and d) student patient case reporting by topic, category and course. The software package includes features for recording school administrative information; setting up lists of courses, faculty, clinical sites, categories, competencies, and time logs; and the inclusion of standardized external documents. There are provisions for developing evaluation and survey instruments. The mentors and program may be evaluated by residents, and residents may be evaluated by faculty members using this feature. Competency documentation includes the time spent on the problem or with the patient, time spent with the mentor, date of the competency, and approval by the mentor and program director. Course documentation includes course and lecture title, lecturer, topic information, date of lecture and approval by the Program Director. These software tools have the facility to include multiple clinical sites, with local subadministrators having the ability to approve competencies and attendance at clinical conferences. In total, these software tools have the capability of managing all components of a CAMPEP-accredited residency program. The application database lends the software to the support of multiple affiliated clinical sites within a single residency program. Such tools are a critical and necessary component if the medical physics profession is to meet the projected needs for qualified medical physicists in future years.
DEVELOPMENT OF A PORTABLE SOFTWARE LANGUAGE FOR PHYSIOLOGICALLY-BASED PHARMACOKINETIC (PBPK) MODELS
The PBPK modeling community has had a long-standing problem with modeling software compatibility. The numerous software packages used for PBPK models are, at best, minimally compatible. This creates problems ranging from model obsolescence due to software support discontinuation...
1988-03-01
PACKAGE BODY TLCSC P661 (CATALOG #P106-0). This package contains the CAMP parts required to do the waypoint steering portion of navigation. ... 3.3.4.1.6 PROCESSING: The following describes the processing performed by this part: package body WaypointSteering is ... package body Steering_Vector_Operations is separate; package body Steering_Vector_Operations_with_Arcsin is separate; procedure Compute_Turn_Angle_and_Direction (UnitNormal ...
IPO: a tool for automated optimization of XCMS parameters.
Libiseller, Gunnar; Dvorzak, Michaela; Kleb, Ulrike; Gander, Edgar; Eisenberg, Tobias; Madeo, Frank; Neumann, Steffen; Trausinger, Gert; Sinner, Frank; Pieber, Thomas; Magnes, Christoph
2015-04-16
Untargeted metabolomics generates a huge amount of data. Software packages for automated data processing are crucial to successfully process these data. A variety of such software packages exist, but the outcome of data processing strongly depends on algorithm parameter settings. If they are not carefully chosen, suboptimal parameter settings can easily lead to biased results. Therefore, parameter settings also require optimization. Several parameter optimization approaches have already been proposed, but a software package for parameter optimization which is free of intricate experimental labeling steps, fast and widely applicable is still missing. We implemented the software package IPO ('Isotopologue Parameter Optimization') which is fast and free of labeling steps, and applicable to data from different kinds of samples and data from different methods of liquid chromatography - high resolution mass spectrometry and data from different instruments. IPO optimizes XCMS peak picking parameters by using natural, stable (13)C isotopic peaks to calculate a peak picking score. Retention time correction is optimized by minimizing relative retention time differences within peak groups. Grouping parameters are optimized by maximizing the number of peak groups that show one peak from each injection of a pooled sample. The different parameter settings are achieved by design of experiments, and the resulting scores are evaluated using response surface models. IPO was tested on three different data sets, each consisting of a training set and test set. IPO resulted in an increase of reliable groups (146% - 361%), a decrease of non-reliable groups (3% - 8%) and a decrease of the retention time deviation to one third. IPO was successfully applied to data derived from liquid chromatography coupled to high resolution mass spectrometry from three studies with different sample types and different chromatographic methods and devices. We were also able to show the potential of IPO to increase the reliability of metabolomics data. The source code is implemented in R, tested on Linux and Windows and it is freely available for download at https://github.com/glibiseller/IPO . The training sets and test sets can be downloaded from https://health.joanneum.at/IPO .
An image-processing software package: UU and Fig for optical metrology applications
NASA Astrophysics Data System (ADS)
Chen, Lujie
2013-06-01
Modern optical metrology applications are largely supported by computational methods, such as phase shifting [1], Fourier transform [2], digital image correlation [3], camera calibration [4], etc., in which image processing is a critical and indispensable component. While it is not too difficult to obtain a wide variety of image-processing programs from the internet, few cater for the relatively specialized area of optical metrology. This paper introduces an image-processing software package, UU (data processing) and Fig (data rendering), that incorporates many useful functions for processing optical metrological data. The cross-platform programs UU and Fig are developed on wxWidgets; at the time of writing, they have been tested on Windows, Linux and Mac OS. The user interface is designed to offer precise control of the underlying processing procedures in a scientific manner. The data input/output mechanism is designed to accommodate diverse file formats and to facilitate interaction with other independent programs. In terms of robustness, although the software was initially developed for personal use, it is comparable in stability and accuracy to most commercial software of a similar nature. In addition to functions for optical metrology, the software package has a rich collection of useful tools in the following areas: real-time image streaming from USB and GigE cameras, computational geometry, computer vision, data fitting, 3D image processing, vector image processing, precision device control (rotary stages, PZT stages, etc.), point-cloud-to-surface reconstruction, volume rendering, batch processing, etc. The software package is currently used in a number of universities for teaching and research.
Lobb, Eric C
2016-07-08
Version 6.3 of the RITG148+ software package offers eight automated analysis routines for quality assurance of the TomoTherapy platform. A performance evaluation of each routine was performed in order to compare RITG148+ results with traditionally accepted analysis techniques and verify that simulated changes in machine parameters are correctly identified by the software. Reference films were exposed according to AAPM TG-148 methodology for each routine and the RITG148+ results were compared with either alternative software analysis techniques or manual analysis techniques in order to assess baseline agreement. Changes in machine performance were simulated through translational and rotational adjustments to subsequently irradiated films, and these films were analyzed to verify that the applied changes were accurately detected by each of the RITG148+ routines. For the Hounsfield unit routine, an assessment of the "Frame Averaging" functionality and the effects of phantom roll on the routine results are presented. All RITG148+ routines reported acceptable baseline results consistent with alternative analysis techniques, with 9 of the 11 baseline test results showing agreement of 0.1 mm/0.1° or better. Simulated changes were correctly identified by the RITG148+ routines within approximately 0.2 mm/0.2°, with the exception of the Field Center vs. Jaw Setting routine, which was found to have limited accuracy in cases where field centers were not aligned for all jaw settings due to inaccurate autorotation of the film during analysis. The performance of the RITG148+ software package was found to be acceptable for introduction into our clinical environment as an automated alternative to traditional analysis techniques for routine TomoTherapy quality assurance testing.
MOSAIC: Software for creating mosaics from collections of images
NASA Technical Reports Server (NTRS)
Varosi, F.; Gezari, D. Y.
1992-01-01
We have developed a powerful, versatile image processing and analysis software package called MOSAIC, designed specifically for the manipulation of digital astronomical image data obtained with (but not limited to) two-dimensional array detectors. The software package is implemented using the Interactive Data Language (IDL), and incorporates new methods for processing, calibration, analysis, and visualization of astronomical image data, stressing effective methods for the creation of mosaic images from collections of individual exposures, while at the same time preserving the photometric integrity of the original data. Since IDL is available on many computers, the MOSAIC software runs on most UNIX and VAX workstations with the X-Windows or Sun View graphics interface.
R-Based Software for the Integration of Pathway Data into Bioinformatic Algorithms
Kramer, Frank; Bayerlová, Michaela; Beißbarth, Tim
2014-01-01
Putting new findings into the context of available literature knowledge is one approach to deal with the surge of high-throughput data results. Furthermore, prior knowledge can increase the performance and stability of bioinformatic algorithms, for example, methods for network reconstruction. In this review, we examine software packages for the statistical computing framework R, which enable the integration of pathway data for further bioinformatic analyses. Different approaches to integrate and visualize pathway data are identified and packages are stratified concerning their features according to a number of different aspects: data import strategies, the extent of available data, dependencies on external tools, integration with further analysis steps and visualization options are considered. A total of 12 packages integrating pathway data are reviewed in this manuscript. These are supplemented by five R-specific packages for visualization and six connector packages, which provide access to external tools. PMID:24833336
FTOOLS: A FITS Data Processing and Analysis Software Package
NASA Astrophysics Data System (ADS)
Blackburn, J. K.
FTOOLS, a highly modular collection of over 110 utilities for processing and analyzing data in the FITS (Flexible Image Transport System) format, has been developed in support of the HEASARC (High Energy Astrophysics Science Archive Research Center) at NASA's Goddard Space Flight Center. Each utility performs a single simple task such as presentation of file contents, extraction of specific rows or columns, appending or merging tables, binning values in a column or selecting subsets of rows based on a boolean expression. Individual utilities can easily be chained together in scripts to achieve more complex operations such as the generation and displaying of spectra or light curves. The collection of utilities provides both generic processing and analysis utilities and utilities specific to high energy astrophysics data sets used for the ASCA, ROSAT, GRO, and XTE missions. A core set of FTOOLS providing support for generic FITS data processing, FITS image analysis and timing analysis can easily be split out of the full software package for users not needing the high energy astrophysics mission utilities. The FTOOLS software package is designed to be both compatible with IRAF and completely stand alone in a UNIX or VMS environment. The user interface is controlled by standard IRAF parameter files. The package is self-documenting through the IRAF help facility and a stand-alone help task. Software is written in ANSI C and Fortran to provide portability across most computer systems. The data format dependencies between hardware platforms are isolated through the FITSIO library package.
ERIC Educational Resources Information Center
Risley, John, Ed.
1988-01-01
Compares the features of the sonic rangers available from HRM Software, MICROMEASUREMENTS, NAGAWTIS Software Research, and PASCO Scientific for demonstrations and experiments in mechanics. Presents the advantages of the sonic rangers and the typical graphics displayed by each software package. (YP)
Mesoscale and severe storms (Mass) data management and analysis system
NASA Technical Reports Server (NTRS)
Hickey, J. S.; Karitani, S.; Dickerson, M.
1984-01-01
Progress on the Mesoscale and Severe Storms (MASS) data management and analysis system is described. An interactive atmospheric database management software package to convert four types of data (Sounding, Single Level, Grid, Image) into standard random-access formats is implemented and integrated with the MASS AVE80 Series general-purpose plotting and graphics display data analysis software package. An interactive analysis and graphics display software package (AVE80) for analyzing large volumes of conventional and satellite-derived meteorological data is enhanced to provide imaging/color graphics display utilizing color video hardware integrated into the MASS computer system. Local and remote smart-terminal capability is provided by installing APPLE III computer systems within individual scientists' offices and integrating them with the MASS system, thus providing color video, graphics, and character display of the four data types.
Comparison of methods for quantitative evaluation of endoscopic distortion
NASA Astrophysics Data System (ADS)
Wang, Quanzeng; Castro, Kurt; Desai, Viraj N.; Cheng, Wei-Chung; Pfefer, Joshua
2015-03-01
Endoscopy is a well-established paradigm in medical imaging, and emerging endoscopic technologies such as high resolution, capsule and disposable endoscopes promise significant improvements in effectiveness, as well as patient safety and acceptance of endoscopy. However, the field lacks practical standardized test methods to evaluate key optical performance characteristics (OPCs), in particular the geometric distortion caused by fisheye lens effects in clinical endoscopic systems. As a result, it has been difficult to evaluate an endoscope's image quality or assess its changes over time. The goal of this work was to identify optimal techniques for objective, quantitative characterization of distortion that are effective and not burdensome. Specifically, distortion measurements from a commercially available distortion evaluation/correction software package were compared with a custom algorithm based on a local magnification (ML) approach. Measurements were performed using a clinical gastroscope to image square grid targets. Recorded images were analyzed with the ML approach and the commercial software where the results were used to obtain corrected images. Corrected images based on the ML approach and the software were compared. The study showed that the ML method could assess distortion patterns more accurately than the commercial software. Overall, the development of standardized test methods for characterizing distortion and other OPCs will facilitate development, clinical translation, manufacturing quality and assurance of performance during clinical use of endoscopic technologies.
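The local magnification (ML) idea, comparing the spacing of imaged grid points at a given location with the spacing at the distortion-free image center, can be sketched as follows; this is an illustrative reconstruction on a synthetic barrel-distorted grid, not the authors' algorithm or the commercial software, and all numbers are assumptions.

import numpy as np

def local_magnification(grid_pts, center_spacing):
    """grid_pts: (rows, cols, 2) array of detected grid-corner coordinates.

    Returns an (rows, cols-1) map of local magnification along the row
    direction, i.e. local point spacing divided by the central spacing."""
    spacing = np.linalg.norm(np.diff(grid_pts, axis=1), axis=2)
    return spacing / center_spacing

# Synthetic example: a barrel-distorted 9x9 grid (hypothetical numbers)
u, v = np.meshgrid(np.linspace(-4, 4, 9), np.linspace(-4, 4, 9))
r2 = u**2 + v**2
k = -0.004                                   # barrel-distortion coefficient
pts = np.stack([u * (1 + k * r2), v * (1 + k * r2)], axis=2) * 50 + 400
ml = local_magnification(pts, center_spacing=50.0)
print(ml.min(), ml.max())                    # magnification falls off toward the edge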
Attributes and Behaviors of Performance-Centered Systems.
ERIC Educational Resources Information Center
Gery, Gloria
1995-01-01
Examines attributes, characteristics, and behaviors of performance-centered software packages that are emerging in the consumer software marketplace and compares them with large-scale systems software being designed by internal information systems staffs and vendors of large-scale software designed for financial, manufacturing, processing, and…
WGCNA: an R package for weighted correlation network analysis.
Langfelder, Peter; Horvath, Steve
2008-12-29
Correlation networks are increasingly being used in bioinformatics applications. For example, weighted gene co-expression network analysis is a systems biology method for describing the correlation patterns among genes across microarray samples. Weighted correlation network analysis (WGCNA) can be used for finding clusters (modules) of highly correlated genes, for summarizing such clusters using the module eigengene or an intramodular hub gene, for relating modules to one another and to external sample traits (using eigengene network methodology), and for calculating module membership measures. Correlation networks facilitate network based gene screening methods that can be used to identify candidate biomarkers or therapeutic targets. These methods have been successfully applied in various biological contexts, e.g. cancer, mouse genetics, yeast genetics, and analysis of brain imaging data. While parts of the correlation network methodology have been described in separate publications, there is a need to provide a user-friendly, comprehensive, and consistent software implementation and an accompanying tutorial. The WGCNA R software package is a comprehensive collection of R functions for performing various aspects of weighted correlation network analysis. The package includes functions for network construction, module detection, gene selection, calculations of topological properties, data simulation, visualization, and interfacing with external software. Along with the R package we also present R software tutorials. While the methods development was motivated by gene expression data, the underlying data mining approach can be applied to a variety of different settings. The WGCNA package provides R functions for weighted correlation network analysis, e.g. co-expression network analysis of gene expression data. The R package along with its source code and additional material are freely available at http://www.genetics.ucla.edu/labs/horvath/CoexpressionNetwork/Rpackages/WGCNA.
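The core construction can be illustrated numerically outside R (a generic sketch, not the WGCNA API; the soft-thresholding power and matrix sizes are assumptions): an unsigned correlation matrix raised element-wise to a power gives the weighted adjacency, from which per-gene connectivity follows.

import numpy as np

def weighted_adjacency(expr, beta=6):
    """expr: genes x samples expression matrix.
    Unsigned weighted co-expression adjacency a_ij = |cor(i, j)|**beta."""
    corr = np.corrcoef(expr)
    adj = np.abs(corr) ** beta
    np.fill_diagonal(adj, 0.0)
    return adj

rng = np.random.default_rng(7)
expr = rng.normal(size=(500, 40))            # 500 genes, 40 samples (synthetic)
adj = weighted_adjacency(expr)
connectivity = adj.sum(axis=1)               # whole-network connectivity per gene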
WGCNA: an R package for weighted correlation network analysis
Langfelder, Peter; Horvath, Steve
2008-01-01
Background Correlation networks are increasingly being used in bioinformatics applications. For example, weighted gene co-expression network analysis is a systems biology method for describing the correlation patterns among genes across microarray samples. Weighted correlation network analysis (WGCNA) can be used for finding clusters (modules) of highly correlated genes, for summarizing such clusters using the module eigengene or an intramodular hub gene, for relating modules to one another and to external sample traits (using eigengene network methodology), and for calculating module membership measures. Correlation networks facilitate network based gene screening methods that can be used to identify candidate biomarkers or therapeutic targets. These methods have been successfully applied in various biological contexts, e.g. cancer, mouse genetics, yeast genetics, and analysis of brain imaging data. While parts of the correlation network methodology have been described in separate publications, there is a need to provide a user-friendly, comprehensive, and consistent software implementation and an accompanying tutorial. Results The WGCNA R software package is a comprehensive collection of R functions for performing various aspects of weighted correlation network analysis. The package includes functions for network construction, module detection, gene selection, calculations of topological properties, data simulation, visualization, and interfacing with external software. Along with the R package we also present R software tutorials. While the methods development was motivated by gene expression data, the underlying data mining approach can be applied to a variety of different settings. Conclusion The WGCNA package provides R functions for weighted correlation network analysis, e.g. co-expression network analysis of gene expression data. The R package along with its source code and additional material are freely available at . PMID:19114008
Yavorska, Olena O; Burgess, Stephen
2017-12-01
MendelianRandomization is a software package for the R open-source software environment that performs Mendelian randomization analyses using summarized data. The core functionality is to implement the inverse-variance weighted, MR-Egger and weighted median methods for multiple genetic variants. Several options are available to the user, such as the use of robust regression, fixed- or random-effects models and the penalization of weights for genetic variants with heterogeneous causal estimates. Extensions to these methods, such as allowing for variants to be correlated, can be chosen if appropriate. Graphical commands allow summarized data to be displayed in an interactive graph, or the plotting of causal estimates from multiple methods, for comparison. Although the main method of data entry is directly by the user, there is also an option for allowing summarized data to be incorporated from the PhenoScanner database of genotype-phenotype associations. We hope to develop this feature in future versions of the package. The R software environment is available for download from [https://www.r-project.org/]. The MendelianRandomization package can be downloaded from the Comprehensive R Archive Network (CRAN) within R, or directly from [https://cran.r-project.org/web/packages/MendelianRandomization/]. Both R and the MendelianRandomization package are released under GNU General Public Licenses (GPL-2|GPL-3). © The Author 2017. Published by Oxford University Press on behalf of the International Epidemiological Association.
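For uncorrelated variants, the fixed-effect inverse-variance weighted (IVW) estimator that the package implements has a closed form; the sketch below evaluates it on synthetic summarized data. The numbers are assumptions, and the package itself should be used for real analyses, since it also provides MR-Egger, weighted-median and robust options.

import numpy as np

def ivw_estimate(beta_x, beta_y, se_y):
    """Fixed-effect inverse-variance weighted causal estimate from per-variant
    gene-exposure (beta_x) and gene-outcome (beta_y, se_y) summary statistics."""
    w = 1.0 / se_y**2
    est = np.sum(w * beta_x * beta_y) / np.sum(w * beta_x**2)
    se = np.sqrt(1.0 / np.sum(w * beta_x**2))
    return est, se

# Synthetic summarized data for a handful of uncorrelated variants
beta_x = np.array([0.10, 0.08, 0.12, 0.09])
beta_y = np.array([0.05, 0.03, 0.07, 0.04])
se_y   = np.array([0.01, 0.01, 0.02, 0.01])
print(ivw_estimate(beta_x, beta_y, se_y))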
Investigation into Text Classification With Kernel Based Schemes
2010-03-01
... are represented as a term-document matrix, common evaluation metrics, and the software package Text to Matrix Generator (TMG). The classifier ... This chapter introduces the indexing capabilities of the Text to Matrix Generator (TMG) Toolbox. Specific attention is placed on the ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
MANN, F.M.
Data package supporting the 2001 Immobilized Low-Activity Waste Performance Analysis. Geology, hydrology, geochemistry, facility, waste form, and dosimetry data based on recent investigation are provided. Verification and benchmarking packages for selected software codes are provided.
An Integrated Research Program for the Modeling, Analysis and Control of Aerospace Systems
1992-03-03
Fabiano, Jr. - Brown University; Mitchell Feigenbaum - Rockefeller University; Elena Fernandez - Instituto de Desarrollo Tecnológico para la Industria Química; Wilfred M. Greenlee - ... The system runs under DEC Ultrix; we have installed the GKS graphics system and language compilers (FORTRAN and C). The DELIGHT.MIMO software, which links a sophisticated non-smooth optimization package to some linear system software, is on the system. The package was kindly furnished by Professor E. Polak, Electrical and ...
Advanced Simulation in Undergraduate Pilot Training: Automatic Instructional System
1975-10-01
an addressable reel-to-reel audio tape recorder, a random-access audio memory drum, and an interactive software package which permits the user to develop preprogrammed exercises. Figure 2 illustrates overall ... The Data Recording System consists of two elements: an overlay program which performs the real-time sampling of specified variables and stores data to disc ...
ATTIRE (analytical tools for thermal infrared engineering): A sensor simulation and modeling package
NASA Astrophysics Data System (ADS)
Jaggi, S.
1993-02-01
The Advanced Sensor Development Laboratory (ASDL) at the Stennis Space Center develops, maintains and calibrates remote sensing instruments for the National Aeronautics & Space Administration (NASA). To perform system design trade-offs and analyses, and to establish system parameters, ASDL has developed a software package for the analytical simulation of sensor systems. This package, called 'Analytical Tools for Thermal InfraRed Engineering' (ATTIRE), simulates the various components of a sensor system. The software allows each subsystem of the sensor to be analyzed independently for its performance. These performance parameters are then integrated to obtain system-level information such as Signal-to-Noise Ratio (SNR), Noise Equivalent Radiance (NER), Noise Equivalent Temperature Difference (NETD), etc. This paper describes the uses of the package and the physics used to derive the performance parameters.
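One of the system-level quantities mentioned, NETD, can be related to NER through the temperature derivative of the in-band radiance, NETD = NER / (dL/dT). The sketch below evaluates this numerically from Planck's law under simplifying assumptions (blackbody target, ideal 8-12 um band, no optics losses); the band limits and the NER value are illustrative, and this is not the ATTIRE code.

import numpy as np

H, C, KB = 6.626e-34, 2.998e8, 1.381e-23     # Planck, speed of light, Boltzmann

def band_radiance(T, lam_lo=8e-6, lam_hi=12e-6, n=2000):
    """In-band blackbody radiance [W m^-2 sr^-1] between lam_lo and lam_hi."""
    lam = np.linspace(lam_lo, lam_hi, n)
    spectral = 2 * H * C**2 / lam**5 / (np.exp(H * C / (lam * KB * T)) - 1.0)
    return np.trapz(spectral, lam)

def netd(ner, T=300.0, dT=0.1):
    """NETD = NER / (dL/dT), with the derivative taken numerically."""
    dLdT = (band_radiance(T + dT) - band_radiance(T - dT)) / (2 * dT)
    return ner / dLdT

print(netd(ner=1e-4))                        # NER in W m^-2 sr^-1 (illustrative)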
Carter, Faustin Wirkus; Khaire, Trupti S.; Novosad, Valentyn; ...
2016-11-07
We present "scraps" (SuperConducting Analysis and Plotting Software), a Python package designed to aid in the analysis and visualization of large amounts of superconducting resonator data, specifically complex transmission as a function of frequency, acquired at many different temperatures and driving powers. The package includes a least-squares fitting engine as well as a Monte-Carlo Markov Chain sampler for sampling the posterior distribution given priors, marginalizing over nuisance parameters, and estimating covariances. A set of plotting tools for generating publication-quality figures is also provided in the package. Lastly, we discuss the functionality of the software and provide some examples of itsmore » utility on data collected from a niobium-nitride coplanar waveguide resonator fabricated at Argonne National Laboratory.« less
tsiR: An R package for time-series Susceptible-Infected-Recovered models of epidemics.
Becker, Alexander D; Grenfell, Bryan T
2017-01-01
tsiR is an open source software package implemented in the R programming language designed to analyze infectious disease time-series data. The software extends a well-studied and widely-applied algorithm, the time-series Susceptible-Infected-Recovered (TSIR) model, to infer parameters from incidence data, such as contact seasonality, and to forward simulate the underlying mechanistic model. The tsiR package aggregates a number of different fitting features previously described in the literature in a user-friendly way, providing support for their broader adoption in infectious disease research. Also included in tsiR are a number of diagnostic tools to assess the fit of the TSIR model. This package should be useful for researchers analyzing incidence data for fully-immunizing infectious diseases.
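The mechanistic model that tsiR fits and forward-simulates can be sketched directly (a generic TSIR simulation, not the package's R interface; the seasonal forcing, mixing exponent and population numbers are assumptions): seasonal transmission rates, a mixing exponent alpha, and a susceptible pool updated by births and new infections.

import numpy as np

def simulate_tsir(beta, alpha, S0, I0, births, n_steps, N, seed=0):
    """Forward-simulate a time-series SIR (TSIR) model at biweekly steps.

    beta: array of seasonal transmission rates, recycled over the year."""
    rng = np.random.default_rng(seed)
    S, I = [S0], [I0]
    for t in range(n_steps - 1):
        # Expected new infections; max(I, 1) crudely allows re-ignition after fade-out
        lam = beta[t % len(beta)] * S[-1] * max(I[-1], 1) ** alpha / N
        new_I = min(rng.poisson(lam), S[-1])   # cannot infect more than S
        S.append(S[-1] + births - new_I)
        I.append(new_I)
    return np.array(S), np.array(I)

beta = 12 * (1 + 0.3 * np.cos(2 * np.pi * np.arange(26) / 26))  # 26 biweeks/year
S, I = simulate_tsir(beta, alpha=0.97, S0=30000, I0=10, births=120,
                     n_steps=520, N=1_000_000)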
Aspects on Transfer of Aided - Design Files
NASA Astrophysics Data System (ADS)
Goanta, A. M.; Anghelache, D. G.
2016-08-01
At the current stage of hardware and software development, each company that makes design software packages has created its own file type, customized over time to distinguish that company from its competitors. Widely known examples are the DWG files of AutoCAD, IPT/IAM of Inventor, PAR/ASM of Solid Edge, PRT of NX, and so on. Behind every file type there is a mathematical model, which may be common to several file types. A specific aspect of computer-aided design is that all packages work with both individual parts and assemblies, but their approaches differ: some use the same file type for each part and for the assembly (PRT), while others use different file types (IPT/IAM, PAR/ASM, etc.). Another aspect of computer-aided design is the transfer of files between companies that use different software packages, or even the same package in different versions. Each of these situations generates distinct issues. Thus, to allow a design to be at least partially read by a package other than the native one, transfer files of the STEP and IGES type are used.
NASA Astrophysics Data System (ADS)
Yang, Kun; Xu, Quan-li; Peng, Shuang-yun; Cao, Yan-bo
2008-10-01
Based on an analysis of the need for GIS applications in earthquake disaster prevention, this paper discusses in depth the spatial integration of urban earthquake disaster loss evaluation models and visualization technologies, using network development methods such as COM/DCOM, ActiveX and ASP, as well as spatial database development methods such as OO4O and ArcSDE built on the ArcGIS software packages. In addition, following software engineering principles, a solution for urban earthquake emergency response decision support systems based on GIS technologies is also proposed, covering the system's logical structure, the technical route, the system realization methods, the function structure, etc. Finally, the user interfaces of the test system are also presented in the paper.
A software tool for automatic classification and segmentation of 2D/3D medical images
NASA Astrophysics Data System (ADS)
Strzelecki, Michal; Szczypinski, Piotr; Materka, Andrzej; Klepaczko, Artur
2013-02-01
Modern medical diagnosis utilizes techniques of visualization of human internal organs (CT, MRI) or of its metabolism (PET). However, evaluation of acquired images made by human experts is usually subjective and qualitative only. Quantitative analysis of MR data, including tissue classification and segmentation, is necessary to perform e.g. attenuation compensation, motion detection, and correction of partial volume effect in PET images, acquired with PET/MR scanners. This article presents briefly a MaZda software package, which supports 2D and 3D medical image analysis aiming at quantification of image texture. MaZda implements procedures for evaluation, selection and extraction of highly discriminative texture attributes combined with various classification, visualization and segmentation tools. Examples of MaZda application in medical studies are also provided.
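Texture quantification of the kind MaZda performs typically starts from features such as grey-level co-occurrence statistics; the sketch below computes a small co-occurrence matrix and its contrast feature in plain NumPy (an illustration of this feature class, not MaZda's implementation; the quantization level and pixel offset are assumptions).

import numpy as np

def glcm_contrast(img, levels=8, dx=1, dy=0):
    """Grey-level co-occurrence matrix contrast for one pixel offset.
    Assumes a non-negative image; intensities are quantized to `levels` bins."""
    q = np.floor(img / img.max() * (levels - 1e-9)).astype(int)
    glcm = np.zeros((levels, levels))
    rows, cols = q.shape
    for r in range(rows - dy):
        for c in range(cols - dx):
            glcm[q[r, c], q[r + dy, c + dx]] += 1
    glcm /= glcm.sum()
    i, j = np.indices(glcm.shape)
    return np.sum(glcm * (i - j) ** 2)

rng = np.random.default_rng(3)
roi = rng.random((64, 64)) + np.linspace(0, 1, 64)   # synthetic textured ROI
print(glcm_contrast(roi))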
Coupling Sensing Hardware with Data Interrogation Software for Structural Health Monitoring
Farrar, Charles R.; Allen, David W.; Park, Gyuhae; ...
2006-01-01
The process of implementing a damage detection strategy for aerospace, civil and mechanical engineering infrastructure is referred to as structural health monitoring (SHM). The authors' approach is to address the SHM problem in the context of a statistical pattern recognition paradigm. In this paradigm, the process can be broken down into four parts: (1) Operational Evaluation, (2) Data Acquisition and Cleansing, (3) Feature Extraction and Data Compression, and (4) Statistical Model Development for Feature Discrimination. These processes must be implemented through hardware or software and, in general, some combination of these two approaches will be used. This paper will discuss each portion of the SHM process with particular emphasis on the coupling of a general purpose data interrogation software package for structural health monitoring with a modular wireless sensing and processing platform. More specifically, this paper will address the need to take an integrated hardware/software approach to developing SHM solutions.
Shi, Xu; Barnes, Robert O; Chen, Li; Shajahan-Haq, Ayesha N; Hilakivi-Clarke, Leena; Clarke, Robert; Wang, Yue; Xuan, Jianhua
2015-07-15
Identification of protein interaction subnetworks is an important step to help us understand complex molecular mechanisms in cancer. In this paper, we develop a BMRF-Net package, implemented in Java and C++, to identify protein interaction subnetworks based on a bagging Markov random field (BMRF) framework. By integrating gene expression data and protein-protein interaction data, this software tool can be used to identify biologically meaningful subnetworks. A user friendly graphic user interface is developed as a Cytoscape plugin for the BMRF-Net software to deal with the input/output interface. The detailed structure of the identified networks can be visualized in Cytoscape conveniently. The BMRF-Net package has been applied to breast cancer data to identify significant subnetworks related to breast cancer recurrence. The BMRF-Net package is available at http://sourceforge.net/projects/bmrfcjava/. The package is tested under Ubuntu 12.04 (64-bit), Java 7, glibc 2.15 and Cytoscape 3.1.0. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Tubiana, Luca; Polles, Guido; Orlandini, Enzo; Micheletti, Cristian
2018-06-07
The KymoKnot software package and web server identifies and locates physical knots or proper knots in a series of polymer conformations. It is mainly intended as an analysis tool for trajectories of linear or circular polymers, but it can be used on single instances too, e.g. protein structures in PDB format. A key element of the software package is the so-called minimally interfering chain closure algorithm that is used to detect physical knots in open chains and to locate the knotted region in both open and closed chains. The web server offers a user-friendly graphical interface that identifies the knot type and highlights the knotted region on each frame of the trajectory, which the user can visualize interactively from various viewpoints. The dynamical evolution of the knotted region along the chain contour is presented as a kymograph. All data can be downloaded in text format. The KymoKnot package is licensed under the BSD 3-Clause licence. The server is publicly available at http://kymoknot.sissa.it/kymoknot/interactive.php .
[Application of animal models in gingival retraction experimental curriculum].
Cai, He; Yang, Shu-ying; Zeng, Yong-xiang; Qin, Han; Hu, Shan-shan; Wang, Jian
2016-02-01
To introduce a teaching method for gingival retraction, and evaluate its efficacy for implementation into experimental curricula. First, two kinds of animal models using pigs and cows (below 6 months of age) were established. Twenty-two experienced prosthodontists were then asked to apply gingival retraction on each animal model and evaluate the biofidelity of the 2 models' dento-gingival environment. The data were analyzed with the SPSS 19.0 software package using a paired t-test. Then, eighty pre-internship students were randomly divided into 2 groups. Besides the traditional teaching (lecture-based teaching), the experimental group (group A) also had access to skill training (using animal models to practice gingival retraction), while the control group (group B) only used the traditional teaching modality. All students' performance in gingival retraction and impression taking was evaluated in their internship. These data were analyzed with the SPSS 19.0 software package using a chi-square test. Both the pig and cow dento-gingival environments were similar to that of the human, and there was no significant difference between the two models' biofidelities (P>0.05). In addition, both the effect of gingival retraction and the quality of impression in group A were significantly better than those in group B (P<0.05). Compared with the traditional strategy, practising gingival retraction on animal models can offer greater opportunities for skill development and can be implemented for a wider range of applications.
Development and Use of an Open-Source, User-Friendly Package to Simulate Voltammetry Experiments
ERIC Educational Resources Information Center
Wang, Shuo; Wang, Jing; Gao, Yanjing
2017-01-01
An open-source electrochemistry simulation package has been developed that simulates the electrode processes of four reaction mechanisms and two typical electroanalysis techniques: cyclic voltammetry and chronoamperometry. Unlike other open-source simulation software, this package balances the features with ease of learning and implementation and…
Increasing Accessibility by Pooling Digital Resources
ERIC Educational Resources Information Center
Cushion, Steve
2004-01-01
There are now many CALL authoring packages that can create interactive websites and a large number of language teachers are writing materials for the whole range of such packages. Currently, each product stores its data in different formats thus hindering interoperability, pooling of digital resources and moving between software packages based in…
Multiple-Group Analysis Using the sem Package in the R System
ERIC Educational Resources Information Center
Evermann, Joerg
2010-01-01
Multiple-group analysis in covariance-based structural equation modeling (SEM) is an important technique to ensure the invariance of latent construct measurements and the validity of theoretical models across different subpopulations. However, not all SEM software packages provide multiple-group analysis capabilities. The sem package for the R…
LANZ: Software solving the large sparse symmetric generalized eigenproblem
NASA Technical Reports Server (NTRS)
Jones, Mark T.; Patrick, Merrell L.
1990-01-01
A package, LANZ, for solving the large symmetric generalized eigenproblem is described. The package was tested on four different architectures: Convex 200, CRAY Y-MP, Sun-3, and Sun-4. The package uses Lanczos' method and is based on recent research into solving the generalized eigenproblem.
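For readers without access to LANZ, the same class of problem, a large sparse symmetric generalized eigenproblem K x = lambda M x, can be attacked with the Lanczos-based routine in SciPy; the sketch below builds a small stiffness/mass-like pair and extracts the lowest eigenpairs via shift-invert. The matrix sizes and values are illustrative, and this is unrelated to the LANZ code itself.

import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

n = 2000
# Sparse symmetric "stiffness" K (1-D Laplacian) and SPD "mass" M (lumped)
main = 2.0 * np.ones(n)
off = -1.0 * np.ones(n - 1)
K = sp.diags([off, main, off], offsets=[-1, 0, 1], format="csc")
M = sp.diags(np.full(n, 1.0 / n), format="csc")

# Lanczos with shift-invert for the 6 smallest eigenvalues of K x = lambda M x
vals, vecs = eigsh(K, k=6, M=M, sigma=0.0, which="LM")
print(np.sort(vals))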
Rule-based optimization and multicriteria decision support for packaging a truck chassis
NASA Astrophysics Data System (ADS)
Berger, Martin; Lindroth, Peter; Welke, Richard
2017-06-01
Trucks are highly individualized products in which exchangeable parts are flexibly combined to suit different customer requirements, leading to great complexity in product development. Therefore, an optimization approach based on constraint programming is proposed for automatically packaging parts of a truck chassis by following packaging rules expressed as constraints. A multicriteria decision support system is developed in which a database of truck layouts is computed and through which interactive navigation can then be performed. The work was performed in cooperation with Volvo Group Trucks Technology (GTT), whose specific rules have been used. Several scenarios are described where the methods developed can be successfully applied and lead to less time-consuming manual work, fewer mistakes, and greater flexibility in configuring trucks. A numerical evaluation is also presented showing the efficiency and practical relevance of the methods, which are implemented in a software tool.
Hoffman, John; Young, Stefano; Noo, Frédéric; McNitt-Gray, Michael
2016-03-01
With growing interest in quantitative imaging, radiomics, and CAD using CT imaging, the need to explore the impacts of acquisition and reconstruction parameters has grown. This usually requires extensive access to the scanner on which the data were acquired, whose workflow is not designed for large-scale reconstruction projects. Therefore, the authors have developed a freely available, open-source software package implementing a common reconstruction method, weighted filtered backprojection (wFBP), for helical fan-beam CT applications. FreeCT_wFBP is a low-dependency, GPU-based reconstruction program utilizing C for the host code and Nvidia CUDA C for GPU code. The software is capable of reconstructing helical scans acquired with arbitrary pitch-values, and sampling techniques such as flying focal spots and a quarter-detector offset. In this work, the software has been described and evaluated for reconstruction speed, image quality, and accuracy. Speed was evaluated based on acquisitions of the ACR CT accreditation phantom under four different flying focal spot configurations. Image quality was assessed using the same phantom by evaluating CT number accuracy, uniformity, and contrast to noise ratio (CNR). Finally, reconstructed mass-attenuation coefficient accuracy was evaluated using a simulated scan of a FORBILD thorax phantom and comparing reconstructed values to the known phantom values. The average reconstruction time evaluated under all flying focal spot configurations was found to be 17.4 ± 1.0 s for a 512 row × 512 column × 32 slice volume. Reconstructions of the ACR phantom were found to meet all CT Accreditation Program criteria including CT number, CNR, and uniformity tests. Finally, reconstructed mass-attenuation coefficient values of water within the FORBILD thorax phantom agreed with original phantom values to within 0.0001 mm²/g (0.01%). FreeCT_wFBP is a fast, highly configurable reconstruction package for third-generation CT available under the GNU GPL. It shows good performance with both clinical and simulated data.
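As a rough illustration of the filtered-backprojection idea behind wFBP, the sketch below reconstructs a simple parallel-beam sinogram with a ramp filter and nearest-neighbour backprojection; it is not FreeCT_wFBP's helical, weighted, GPU implementation, and the array names and sizes are invented for the example.

```python
import numpy as np

def ramp_filter(sinogram):
    """Filter each projection with an ideal ramp filter in the Fourier domain."""
    n = sinogram.shape[1]
    filt = np.abs(np.fft.fftfreq(n))                      # |f| ramp
    spectrum = np.fft.fft(sinogram, axis=1) * filt
    return np.real(np.fft.ifft(spectrum, axis=1))

def backproject(sinogram, angles):
    """Smear each filtered projection back across the image plane."""
    n = sinogram.shape[1]
    coords = np.arange(n) - n / 2.0
    X, Y = np.meshgrid(coords, coords)
    image = np.zeros((n, n))
    for projection, theta in zip(sinogram, angles):
        # detector coordinate of every pixel for this view (nearest neighbour)
        t = X * np.cos(theta) + Y * np.sin(theta) + n / 2.0
        idx = np.clip(np.round(t).astype(int), 0, n - 1)
        image += projection[idx]
    return image * np.pi / len(angles)

# usage: sinogram of shape (n_views, n_detectors), one row per projection angle
angles = np.linspace(0.0, np.pi, 180, endpoint=False)
sinogram = np.random.rand(180, 256)                       # placeholder data
reconstruction = backproject(ramp_filter(sinogram), angles)
```

Real helical fan-beam data additionally require rebinning and voxel-dependent weighting, which is exactly the part the package described above automates on the GPU.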
NASA Technical Reports Server (NTRS)
Bodechtel, J. (Principal Investigator)
1975-01-01
The author has identified the following significant results. The geological interpretation on data exhibiting the Italian peninsula led to the recognition of tectonic features which are explained by a clockwise rotation of various blocks along left-handed transform faults. These faults can be interpreted as resulting from shear due to main stress directed north-eastwards. A land use map of the mountainous regions of Italy was produced on a scale of 1:250,000. For the digital treatment of MSS-CCTs an image processing software was written in FORTRAN 4. The software package includes descriptive statistics and also classification algorithms.
Binary-mask generation for diffractive optical elements using microcomputers.
O'Shea, D C; Beletic, J W; Poutous, M
1993-05-10
A new technique for generation of binary masks for the fabrication of diffractive optical elements is investigated. This technique, which uses commercially available desktop-publishing hardware and software in conjunction with a standard photoreduction camera, is much faster and less expensive than the conventional methods. The short turnaround time and low cost should give researchers a much greater degree of flexibility in the field of binary optics and enable wider application of diffractive-optics technology. Techniques for generating optical elements by using standard software packages that produce PostScript output are described. An evaluation of the dimensional fidelity of the mask reproduction from design to its realization in photoresist is presented.
Innovative Techniques Simplify Vibration Analysis
NASA Technical Reports Server (NTRS)
2010-01-01
In the early years of development, Marshall Space Flight Center engineers encountered challenges related to components in the space shuttle main engine. To assess the problems, they evaluated the effects of vibration and oscillation. To enhance the method of vibration signal analysis, Marshall awarded Small Business Innovation Research (SBIR) contracts to AI Signal Research, Inc. (ASRI), in Huntsville, Alabama. ASRI developed a software package called PC-SIGNAL that NASA now employs on a daily basis, and in 2009, the PKP-Module won Marshall's Software of the Year award. The technology is also used in many industries: aircraft and helicopter, rocket engine manufacturing, transportation, and nuclear power.
Modal Analysis for Grid Operation
DOE Office of Scientific and Technical Information (OSTI.GOV)
MANGO software is to provide a solution for improving small signal stability of power systems through adjusting operator-controllable variables using PMU measurement. System oscillation problems are one of the major threats to the grid stability and reliability in California and the Western Interconnection. These problems result in power fluctuations, lower grid operation efficiency, and may even lead to large-scale grid breakup and outages. This MANGO software aims to solve this problem by automatically generating recommended operation procedures termed Modal Analysis for Grid Operation (MANGO) to improve damping of inter-area oscillation modes. The MANGO procedure includes three steps: recognizing small signal stability problems, implementing operating point adjustment using modal sensitivity, and evaluating the effectiveness of the adjustment. The MANGO software package is designed to help implement the MANGO procedure.
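For orientation only, the snippet below shows the standard way a modal damping ratio is obtained from the eigenvalues of a linearized small-signal model; it is a generic illustration, not the MANGO procedure or its sensitivity-based operating-point adjustment, and the example system matrix is made up.

```python
import numpy as np

def modal_damping(A):
    """Eigen-decompose a linearized system matrix and return (frequency, damping ratio) pairs."""
    modes = []
    for lam in np.linalg.eigvals(A):
        sigma, omega = lam.real, lam.imag
        if omega <= 0:                      # keep one of each conjugate pair, skip real modes
            continue
        freq_hz = omega / (2.0 * np.pi)
        zeta = -sigma / np.hypot(sigma, omega)
        modes.append((freq_hz, zeta))
    return modes

# hypothetical 2-state system with a ~0.5 Hz, lightly damped oscillatory mode
A = np.array([[0.0, 1.0],
              [-(2.0 * np.pi * 0.5) ** 2, -0.3]])
for f, z in modal_damping(A):
    flag = "LOW DAMPING" if z < 0.05 else "ok"
    print(f"mode {f:.2f} Hz, damping ratio {z:.3f} ({flag})")
```

A procedure like MANGO then asks the harder question of which operator-controllable adjustments move such poorly damped modes to the left.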
Educational Software for Illustration of Drainage, Evapotranspiration, and Crop Yield.
ERIC Educational Resources Information Center
Khan, A. H.; And Others
1996-01-01
Describes a study that developed a software package for illustrating drainage, evapotranspiration, and crop yield as influenced by water conditions. The software is a tool for depicting water's influence on crop production in western Kansas. (DDR)
Modelling robotic systems with DADS
NASA Technical Reports Server (NTRS)
Churchill, L. W.; Sharf, I.
1993-01-01
With the appearance of general off-the-shelf software packages for the simulation of mechanical systems, modelling and simulation of mechanisms has become an easier task. The authors have recently used one such package, DADS, to model the dynamics of rigid and flexible-link robotic manipulators. In this paper, we present an overview of our learning experiences with DADS, in the hope that it will shorten the learning process for others interested in this software.
NASA Technical Reports Server (NTRS)
1997-01-01
DARcorporation developed a General Aviation CAD package through a Small Business Innovation Research contract from Langley Research Center. This affordable, user-friendly preliminary design system for General Aviation aircraft runs on the popular 486 IBM-compatible personal computers. Individuals taking the home-built approach, small manufacturers of General Aviation airplanes, as well as students and others interested in the analysis and design of aircraft are possible users of the package. The software can cut design and development time in half.
A streamlined Python framework for AT-TPC data analysis
NASA Astrophysics Data System (ADS)
Taylor, J. Z.; Bradt, J.; Bazin, D.; Kuchera, M. P.
2017-09-01
User-friendly data analysis software has been developed for the Active-Target Time Projection Chamber (AT-TPC) experiment at the National Superconducting Cyclotron Laboratory at Michigan State University. The AT-TPC, commissioned in 2014, is a gas-filled detector that acts as both the detector and target for high-efficiency detection of low-intensity, exotic nuclear reactions. The pytpc framework is a Python package for analyzing AT-TPC data. The package was developed for the analysis of ⁴⁶Ar(p, p) data. The existing software was used to analyze data produced by the ⁴⁰Ar(p, p) experiment that ran in August 2015. Usage of the package was documented in an analysis manual both to improve analysis steps and aid in the work of future AT-TPC users. Software features and analysis methods in the pytpc framework will be presented along with the ⁴⁰Ar results.
NASA Astrophysics Data System (ADS)
Niu, Yingli; Li, Wenqiang; Peng, Qian; Geng, Hua; Yi, Yuanping; Wang, Linjun; Nan, Guangjun; Wang, Dong; Shuai, Zhigang
2018-04-01
MOlecular MAterials Property Prediction Package (MOMAP) is a software toolkit for molecular materials property prediction. It focuses on luminescent properties and charge mobility properties. This article contains a brief descriptive introduction of the key features, theoretical models and algorithms of the software, together with examples that illustrate its performance. First, we present the theoretical models and algorithms for molecular luminescent property calculations, which include the excited-state radiative/non-radiative decay rate constants and the optical spectra. Then, a multi-scale simulation approach and its algorithm for the molecular charge mobility are described. This approach is based on a hopping model combined with kinetic Monte Carlo and molecular dynamics simulations, and it is especially applicable to a large category of organic semiconductors whose inter-molecular electronic coupling is much smaller than the intra-molecular charge reorganisation energy.
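A common ingredient of such hopping models is the semiclassical Marcus charge-transfer rate. The hedged sketch below evaluates that standard formula for illustrative coupling and reorganisation-energy values; it is not code from MOMAP, and the numbers are only representative of typical organic semiconductors.

```python
import numpy as np

HBAR = 6.582119569e-16   # reduced Planck constant, eV*s
KB   = 8.617333262e-5    # Boltzmann constant, eV/K

def marcus_rate(V, lam, dG=0.0, T=300.0):
    """Semiclassical Marcus hopping rate (s^-1): V, lam, dG in eV, T in K."""
    prefactor = (2.0 * np.pi / HBAR) * V**2 / np.sqrt(4.0 * np.pi * lam * KB * T)
    return prefactor * np.exp(-(dG + lam) ** 2 / (4.0 * lam * KB * T))

# illustrative values: electronic coupling 5 meV, reorganisation energy 0.2 eV
k_hop = marcus_rate(V=0.005, lam=0.2)
print(f"hopping rate ~ {k_hop:.3e} s^-1")
```

In a multi-scale workflow of the kind described above, rates like this feed a kinetic Monte Carlo walk over the molecular packing obtained from molecular dynamics.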
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Renke; Jin, Shuangshuang; Chen, Yousu
This paper presents a faster-than-real-time dynamic simulation software package that is designed for large-size power system dynamic simulation. It was developed on the GridPACKTM high-performance computing (HPC) framework. The key features of the developed software package include (1) faster-than-real-time dynamic simulation for a WECC system (17,000 buses) with different types of detailed generator, controller, and relay dynamic models, (2) a decoupled parallel dynamic simulation algorithm with optimized computation architecture to better leverage HPC resources and technologies, (3) options for HPC-based linear and iterative solvers, (4) hidden HPC details, such as data communication and distribution, to enable development centered on mathematical models and algorithms rather than on computational details for power system researchers, and (5) easy integration of new dynamic models and related algorithms into the software package.
KiT: a MATLAB package for kinetochore tracking.
Armond, Jonathan W; Vladimirou, Elina; McAinsh, Andrew D; Burroughs, Nigel J
2016-06-15
During mitosis, chromosomes are attached to the mitotic spindle via large protein complexes called kinetochores. The motion of kinetochores throughout mitosis is intricate and automated quantitative tracking of their motion has already revealed many surprising facets of their behaviour. Here, we present 'KiT' (Kinetochore Tracking)-an easy-to-use, open-source software package for tracking kinetochores from live-cell fluorescent movies. KiT supports 2D, 3D and multi-colour movies, quantification of fluorescence, integrated deconvolution, parallel execution and multiple algorithms for particle localization. KiT is free, open-source software implemented in MATLAB and runs on all MATLAB supported platforms. KiT can be downloaded as a package from http://www.mechanochemistry.org/mcainsh/software.php The source repository is available at https://bitbucket.org/jarmond/kit and under continuing development. Supplementary data are available at Bioinformatics online. jonathan.armond@warwick.ac.uk. © The Author 2016. Published by Oxford University Press.
MPTinR: analysis of multinomial processing tree models in R.
Singmann, Henrik; Kellen, David
2013-06-01
We introduce MPTinR, a software package developed for the analysis of multinomial processing tree (MPT) models. MPT models represent a prominent class of cognitive measurement models for categorical data with applications in a wide variety of fields. MPTinR is the first software for the analysis of MPT models in the statistical programming language R, providing a modeling framework that is more flexible than standalone software packages. MPTinR also introduces important features such as (1) the ability to calculate the Fisher information approximation measure of model complexity for MPT models, (2) the ability to fit models for categorical data outside the MPT model class, such as signal detection models, (3) a function for model selection across a set of nested and nonnested candidate models (using several model selection indices), and (4) multicore fitting. MPTinR is available from the Comprehensive R Archive Network at http://cran.r-project.org/web/packages/MPTinR/ .
ERIC Educational Resources Information Center
Science Teacher, 1988
1988-01-01
Reviews four software packages available for IBM PC or Apple II. Includes "Graphical Analysis III"; "Space Max: Space Station Construction Simulation"; "Guesstimation"; and "Genetic Engineering Toolbox." Focuses on each packages' strengths in a high school context. (CW)
Volumetric neuroimage analysis extensions for the MIPAV software package.
Bazin, Pierre-Louis; Cuzzocreo, Jennifer L; Yassa, Michael A; Gandler, William; McAuliffe, Matthew J; Bassett, Susan S; Pham, Dzung L
2007-09-15
We describe a new collection of publicly available software tools for performing quantitative neuroimage analysis. The tools perform semi-automatic brain extraction, tissue classification, Talairach alignment, and atlas-based measurements within a user-friendly graphical environment. They are implemented as plug-ins for MIPAV, a freely available medical image processing software package from the National Institutes of Health. Because the plug-ins and MIPAV are implemented in Java, both can be utilized on nearly any operating system platform. In addition to the software plug-ins, we have also released a digital version of the Talairach atlas that can be used to perform regional volumetric analyses. Several studies are conducted applying the new tools to simulated and real neuroimaging data sets.
[Microcomputer control of a LED stimulus display device].
Ohmoto, S; Kikuchi, T; Kumada, T
1987-02-01
A visual stimulus display system controlled by a microcomputer was constructed at low cost. The system consists of a LED stimulus display device, a microcomputer, two interface boards, a pointing device (a "mouse") and two kinds of software. The first software package is written in BASIC. Its functions are: to construct stimulus patterns using the mouse, to construct letter patterns (alphabet, digit, symbols and Japanese letters--kanji, hiragana, katakana), to modify the patterns, to store the patterns on a floppy disc, to translate the patterns into integer data which are used to display the patterns in the second software. The second software package, written in BASIC and machine language, controls display of a sequence of stimulus patterns in predetermined time schedules in visual experiments.
Klein, Johannes; Leupold, Stefan; Biegler, Ilona; Biedendieck, Rebekka; Münch, Richard; Jahn, Dieter
2012-09-01
Time-lapse imaging in combination with fluorescence microscopy techniques enables the investigation of gene regulatory circuits and has uncovered phenomena like culture heterogeneity. In this context, computational image processing for the analysis of single cell behaviour plays an increasing role in systems biology and mathematical modelling approaches. Consequently, we developed a software package with a graphical user interface for the analysis of single bacterial cell behaviour. A new software tool called TLM-Tracker allows flexible and user-friendly segmentation, tracking, and lineage analysis of microbial cells in time-lapse movies. The software package, including manual, tutorial video and examples, is available as Matlab code or executable binaries at http://www.tlmtracker.tu-bs.de.
LHCb Build and Deployment Infrastructure for run 2
NASA Astrophysics Data System (ADS)
Clemencic, M.; Couturier, B.
2015-12-01
After the successful run 1 of the LHC, the LHCb Core software team has taken advantage of the long shutdown to consolidate and improve its build and deployment infrastructure. Several of the related projects have already been presented, such as the build system using Jenkins and the LHCb Performance and Regression testing infrastructure. Some components are completely new, like the Software Configuration Database (using the graph database Neo4j) or the new packaging installation using RPM packages. Furthermore, all these parts are integrated to allow easier and quicker releases of the LHCb software stack, thereby reducing the risk of operational errors. Integration and regression tests are also now easier to implement, making it possible to further improve the checks on the software.
MathBrowser: Web-Enabled Mathematical Software with Application to the Chemistry Curriculum, v 1.0
NASA Astrophysics Data System (ADS)
Goldsmith, Jack G.
1997-10-01
MathSoft: Cambridge, MA, 1996; free via ftp from www.mathsoft.com. The movement to provide computer-based applications in chemistry has come to focus on three main areas: software aimed at specific applications (drawing, simulation, data analysis, etc.), multimedia applications designed to assist in the presentation of conceptual information, and packages to be used in conjunction with a particular textbook at a specific point in the chemistry curriculum. The result is a situation where no single software package devoted to problem solving can be used across a large segment of the curriculum. Adoption of World Wide Web (WWW) technology by a manufacturer of mathematical software, however, has produced software that provides an attractive means of providing a problem-solving resource to students in courses from freshman through senior level.
MINDS: A microcomputer interactive data system for 8086-based controllers
NASA Technical Reports Server (NTRS)
Soeder, J. F.
1985-01-01
A microcomputer interactive data system (MINDS) software package for the 8086 family of microcomputers is described. To enhance program understandability and ease of code maintenance, the software is written in PL/M-86, Intel Corporation's high-level system implementation language. The MINDS software is intended to run in residence with real-time digital control software to provide displays of steady-state and transient data. In addition, the MINDS package provides classic monitor capabilities along with extended provisions for debugging an executing control system. The software uses the CP/M-86 operating system developed by Digital Research, Inc., to provide program load capabilities along with a uniform file structure for data and table storage. Finally, a library of input and output subroutines to be used with consoles equipped with PL/M-86 and assembly language is described.
Long-term Preservation of Data Analysis Capabilities
NASA Astrophysics Data System (ADS)
Gabriel, C.; Arviset, C.; Ibarra, A.; Pollock, A.
2015-09-01
While the long-term preservation of scientific data obtained by large astrophysics missions is ensured through science archives, the issue of data analysis software preservation has hardly been addressed. Efforts by large data centres have contributed so far to maintain some instrument or mission-specific data reduction packages on top of high-level general purpose data analysis software. However, it is always difficult to keep software alive without support and maintenance once the active phase of a mission is over. This is especially difficult in the budgetary model followed by space agencies. We discuss the importance of extending the lifetime of dedicated data analysis packages and review diverse strategies under development at ESA using new paradigms such as Virtual Machines, Cloud Computing, and Software as a Service for making possible full availability of data analysis and calibration software for decades at minimal cost.
Software Tools for Development on the Peregrine System
NREL High-Performance Computing
Tools to build and manage software at the source code level on the Peregrine system include Cross-Platform Make and SCons: the "Cross-Platform Make" (CMake) package is from Kitware, and SCons is a modern software build tool based on Python.
Mirion--a software package for automatic processing of mass spectrometric images.
Paschke, C; Leisner, A; Hester, A; Maass, K; Guenther, S; Bouschen, W; Spengler, B
2013-08-01
Mass spectrometric imaging (MSI) techniques are of growing interest for the Life Sciences. In recent years, the development of new instruments employing ion sources that are tailored for spatial scanning allowed the acquisition of large data sets. A subsequent data processing, however, is still a bottleneck in the analytical process, as a manual data interpretation is impossible within a reasonable time frame. The transformation of mass spectrometric data into spatial distribution images of detected compounds turned out to be the most appropriate method to visualize the results of such scans, as humans are able to interpret images faster and easier than plain numbers. Image generation, thus, is a time-consuming and complex yet very efficient task. The free software package "Mirion," presented in this paper, allows the handling and analysis of data sets acquired by mass spectrometry imaging. Mirion can be used for image processing of MSI data obtained from many different sources, as it uses the HUPO-PSI-based standard data format imzML, which is implemented in the proprietary software of most of the mass spectrometer companies. Different graphical representations of the recorded data are available. Furthermore, automatic calculation and overlay of mass spectrometric images promotes direct comparison of different analytes for data evaluation. The program also includes tools for image processing and image analysis.
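The core operation such MSI software automates is turning pixel-wise spectra into an ion-distribution image. The toy sketch below does this for a single m/z window using plain NumPy arrays; all data, names, and sizes are invented, and it does not use Mirion or an imzML parser.

```python
import numpy as np

def ion_image(coords, spectra_mz, spectra_intensity, mz_target, tol, shape):
    """Sum intensities within an m/z window at each pixel to form an ion image."""
    image = np.zeros(shape)
    for (x, y), mz, inten in zip(coords, spectra_mz, spectra_intensity):
        window = np.abs(mz - mz_target) <= tol
        image[y, x] = inten[window].sum()
    return image

# three made-up pixels of a 1x3 raster, each with a tiny spectrum
coords = [(0, 0), (1, 0), (2, 0)]
mzs    = [np.array([100.0, 255.1, 400.2])] * 3
ints   = [np.array([5.0, 20.0, 1.0]),
          np.array([4.0, 35.0, 2.0]),
          np.array([6.0,  2.0, 1.5])]
img = ion_image(coords, mzs, ints, mz_target=255.1, tol=0.2, shape=(1, 3))
print(img)   # -> [[20. 35.  2.]]
```

Packages like the one described above add the file parsing, calibration, overlays, and batch automation around this basic mapping.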
The SeaDAS Processing and Analysis System: SeaWiFS, MODIS, and Beyond
NASA Astrophysics Data System (ADS)
MacDonald, M. D.; Ruebens, M.; Wang, L.; Franz, B. A.
2005-12-01
The SeaWiFS Data Analysis System (SeaDAS) is a comprehensive software package for the processing, display, and analysis of ocean data from a variety of satellite sensors. Continuous development and user support by programmers and scientists for more than a decade has helped to make SeaDAS the most widely used software package in the world for ocean color applications, with a growing base of users from the land and sea surface temperature community. Full processing support for past (CZCS, OCTS, MOS) and present (SeaWiFS, MODIS) sensors, and anticipated support for future missions such as NPP/VIIRS, enables end users to reproduce the standard ocean archive product suite distributed by NASA's Ocean Biology Processing Group (OBPG), as well as a variety of evaluation and intermediate ocean, land, and atmospheric products. Availability of the processing algorithm source codes and a software build environment also provide users with the tools to implement custom algorithms. Recent SeaDAS enhancements include synchronization of MODIS processing with the latest code and calibration updates from the MODIS Calibration Support Team (MCST), support for all levels of MODIS processing including Direct Broadcast, a port to the Macintosh OS X operating system, release of the display/analysis-only SeaDAS-Lite, and an extremely active web-based user support forum.
SolTrack: an automatic video processing software for in situ interface tracking.
Griesser, S; Pierer, R; Reid, M; Dippenaar, R
2012-10-01
High-Resolution in situ observation of solidification experiments has become a powerful technique to improve the fundamental understanding of solidification processes of metals and alloys. In the present study, high-temperature laser-scanning confocal microscopy (HTLSCM) was utilized to observe and capture in situ solidification and phase transformations of alloys for subsequent post processing and analysis. Until now, this analysis has been very time consuming as frame-by-frame manual evaluation of propagating interfaces was used to determine the interface velocities. SolTrack has been developed using the commercial software package MATLAB and is designed to automatically detect, locate and track propagating interfaces during solidification and phase transformations as well as to calculate interfacial velocities. Different solidification phenomena have been recorded to demonstrate a wider spectrum of applications of this software. A validation, through comparison with manual evaluation, is included where the accuracy is shown to be very high. © 2012 The Authors Journal of Microscopy © 2012 Royal Microscopical Society.
Progress in the Development of a Prototype Reuse Enablement System
NASA Astrophysics Data System (ADS)
Marshall, J. J.; Downs, R. R.; Gilliam, L. J.; Wolfe, R. E.
2008-12-01
An important part of promoting software reuse is to ensure that reusable software assets are readily available to the software developers who want to use them. Through dialogs with the community, the NASA Earth Science Data Systems Software Reuse Working Group has learned that the lack of a centralized, domain-specific software repository or catalog system addressing the needs of the Earth science community is a major barrier to software reuse within the community. The Working Group has proposed the creation of such a reuse enablement system, which would provide capabilities for contributing and obtaining reusable software, to remove this barrier. The Working Group has recommended the development of a Reuse Enablement System to NASA and has performed a trade study to review systems with similar capabilities and to identify potential platforms for the proposed system. This was followed by an architecture study to determine an expeditious and cost-effective solution for this system. A number of software packages and systems were examined through both creating prototypes and examining existing systems that use the same software packages and systems. Based on the results of the architecture study, the Working Group developed a prototype of the proposed system using the recommended software package, through an iterative process of identifying needed capabilities and improving the system to provide those capabilities. Policies for the operation and maintenance of the system are being established, and the identification of these policies has also contributed to the development process. Additionally, a test plan is being developed for formal testing of the prototype, to ensure that it meets all of the requirements previously developed by the Working Group. This poster summarizes the results of our work to date, focusing on the most recent activities.
FTOOLS: A general package of software to manipulate FITS files
NASA Astrophysics Data System (ADS)
Blackburn, J. K.; Shaw, R. A.; Payne, H. E.; Hayes, J. J. E.; Heasarc
1999-12-01
FTOOLS, a highly modular collection of utilities for processing and analyzing data in the FITS (Flexible Image Transport System) format, has been developed in support of the HEASARC (High Energy Astrophysics Research Archive Center) at NASA's Goddard Space Flight Center. The FTOOLS package contains many utility programs which perform modular tasks on any FITS image or table, as well as higher-level analysis programs designed specifically for data from current and past high energy astrophysics missions. The utility programs for FITS tables are especially rich and powerful, and provide functions for presentation of file contents, extraction of specific rows or columns, appending or merging tables, binning values in a column or selecting subsets of rows based on a boolean expression. Individual FTOOLS programs can easily be chained together in scripts to achieve more complex operations such as the generation and displaying of spectra or light curves. FTOOLS development began in 1991 and has produced the main set of data analysis software for the current ASCA and RXTE space missions and for other archival sets of X-ray and gamma-ray data. The FTOOLS software package is supported on most UNIX platforms and on Windows machines. The user interface is controlled by standard parameter files that are very similar to those used by IRAF. The package is self-documenting through a stand-alone help task called fhelp. Software is written in ANSI C and FORTRAN to provide portability across most computer systems. The data format dependencies between hardware platforms are isolated through the FITSIO library package.
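As an analogue of the row-filtering tasks described above (not FTOOLS itself, which is a C/FORTRAN command-line suite), the snippet below performs the same kind of boolean row selection on a FITS binary table using astropy; the file and column names are hypothetical.

```python
from astropy.io import fits

# open an event file and select rows with a boolean expression, analogous to
# filtering a FITS table with an FTOOLS row-selection task
with fits.open("events.fits") as hdul:          # hypothetical file name
    table = hdul[1].data                        # first binary table extension
    selected = table[(table["ENERGY"] > 2.0) & (table["ENERGY"] < 10.0)]

    filtered = fits.BinTableHDU(data=selected, header=hdul[1].header)
    filtered.writeto("events_filtered.fits", overwrite=True)
```

The FTOOLS equivalent would typically be driven by a parameter file or command-line expression rather than Python code.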
A Freeware Path to Neutron Computed Tomography
NASA Astrophysics Data System (ADS)
Schillinger, Burkhard; Craft, Aaron E.
Neutron computed tomography has become a routine method at many neutron sources due to the availability of digital detection systems, powerful computers and advanced software. The commercial packages Octopus by Inside Matters and VGStudio by Volume Graphics have been established as a quasi-standard for high-end computed tomography. However, these packages require a stiff investment and are available to the users only on-site at the imaging facility to do their data processing. There is a demand from users to have image processing software at home to do further data processing; in addition, neutron computed tomography is now being introduced even at smaller and older reactors. Operators need to show a first working tomography setup before they can obtain a budget to build an advanced tomography system. Several packages are available on the web for free; however, these have been developed for X-rays or synchrotron radiation and are not immediately useable for neutron computed tomography. Three reconstruction packages and three 3D-viewers have been identified and used even for Gigabyte datasets. This paper is not a scientific publication in the classic sense, but is intended as a review to provide searchable help to make the described packages usable for the tomography community. It presents the necessary additional preprocessing in ImageJ, some workarounds for bugs in the software, and undocumented or badly documented parameters that need to be adapted for neutron computed tomography. The result is a slightly complicated, but surprisingly high-quality path to neutron computed tomography images in 3D, but not a replacement for the even more powerful commercial software mentioned above.
Parallel computation and the Basis system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, G.R.
1992-12-16
A software package has been written that can facilitate efforts to develop powerful, flexible, and easy-to-use programs that can run in single-processor, massively parallel, and distributed computing environments. Particular attention has been given to the difficulties posed by a program consisting of many science packages that represent subsystems of a complicated, coupled system. Methods have been found to maintain independence of the packages by hiding data structures without increasing the communication costs in a parallel computing environment. Concepts developed in this work are demonstrated by a prototype program that uses library routines from two existing software systems, Basis and Parallel Virtual Machine (PVM). Most of the details of these libraries have been encapsulated in routines and macros that could be rewritten for alternative libraries that possess certain minimum capabilities. The prototype software uses a flexible master-and-slaves paradigm for parallel computation and supports domain decomposition with message passing for partitioning work among slaves. Facilities are provided for accessing variables that are distributed among the memories of slaves assigned to subdomains. The software is named PROTOPAR.
Parallel computation and the basis system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, G.R.
1993-05-01
A software package has been written that can facilitate efforts to develop powerful, flexible, and easy-to-use programs that can run in single-processor, massively parallel, and distributed computing environments. Particular attention has been given to the difficulties posed by a program consisting of many science packages that represent subsystems of a complicated, coupled system. Methods have been found to maintain independence of the packages by hiding data structures without increasing the communications costs in a parallel computing environment. Concepts developed in this work are demonstrated by a prototype program that uses library routines from two existing software systems, Basis and Parallel Virtual Machine (PVM). Most of the details of these libraries have been encapsulated in routines and macros that could be rewritten for alternative libraries that possess certain minimum capabilities. The prototype software uses a flexible master-and-slaves paradigm for parallel computation and supports domain decomposition with message passing for partitioning work among slaves. Facilities are provided for accessing variables that are distributed among the memories of slaves assigned to subdomains. The software is named PROTOPAR.
Mittal, Varun; Hung, Ling-Hong; Keswani, Jayant; Kristiyanto, Daniel; Lee, Sung Bong
2017-01-01
Abstract Background: Software container technology such as Docker can be used to package and distribute bioinformatics workflows consisting of multiple software implementations and dependencies. However, Docker is a command line–based tool, and many bioinformatics pipelines consist of components that require a graphical user interface. Results: We present a container tool called GUIdock-VNC that uses a graphical desktop sharing system to provide a browser-based interface for containerized software. GUIdock-VNC uses the Virtual Network Computing protocol to render the graphics within most commonly used browsers. We also present a minimal image builder that can add our proposed graphical desktop sharing system to any Docker packages, with the end result that any Docker packages can be run using a graphical desktop within a browser. In addition, GUIdock-VNC uses the Oauth2 authentication protocols when deployed on the cloud. Conclusions: As a proof-of-concept, we demonstrated the utility of GUIdock-noVNC in gene network inference. We benchmarked our container implementation on various operating systems and showed that our solution creates minimal overhead. PMID:28327936
Community-driven computational biology with Debian Linux
2010-01-01
Background: The Open Source movement and its technologies are popular in the bioinformatics community because they provide freely available tools and resources for research. In order to feed the steady demand for updates on software and associated data, a service infrastructure is required for sharing and providing these tools to heterogeneous computing environments. Results: The Debian Med initiative provides ready and coherent software packages for medical informatics and bioinformatics. These packages can be used together in Taverna workflows via the UseCase plugin to manage execution on local or remote machines. If such packages are available in cloud computing environments, the underlying hardware and the analysis pipelines can be shared along with the software. Conclusions: Debian Med closes the gap between developers and users. It provides a simple method for offering new releases of software and data resources, thus provisioning a local infrastructure for computational biology. For geographically distributed teams it can ensure they are working on the same versions of tools, in the same conditions. This contributes to the world-wide networking of researchers. PMID:21210984
WinTRAX: A raytracing software package for the design of multipole focusing systems
NASA Astrophysics Data System (ADS)
Grime, G. W.
2013-07-01
The software package TRAX was a simulation tool for modelling the path of charged particles through linear cylindrical multipole fields described by analytical expressions and was a development of the earlier OXRAY program (Grime and Watt, 1983; Grime et al., 1982) [1,2]. In a 2005 comparison of raytracing software packages (Incerti et al., 2005) [3], TRAX/OXRAY was compared with Geant4 and Zgoubi and was found to give close agreement with the more modern codes. TRAX was a text-based program which was only available for operation in a now rare VMS workstation environment, so a new program, WinTRAX, has been developed for the Windows operating system. This implements the same basic computing strategy as TRAX, and key sections of the code are direct translations from FORTRAN to C++, but the Windows environment is exploited to make an intuitive graphical user interface which simplifies and enhances many operations including system definition and storage, optimisation, beam simulation (including with misaligned elements) and aberration coefficient determination. This paper describes the program and presents comparisons with other software and real installations.
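For a flavour of what such a program computes, the sketch below traces a paraxial ray through a drift, an ideal quadrupole, and another drift using first-order transfer matrices. This is only a simplified analogue of full raytracing through analytic multipole fields of the kind WinTRAX performs, and every numerical value is invented.

```python
import numpy as np

def quad_focusing(k, L):
    """Paraxial transfer matrix of a quadrupole, focusing plane (k in m^-2, L in m)."""
    phi = np.sqrt(k) * L
    return np.array([[np.cos(phi),               np.sin(phi) / np.sqrt(k)],
                     [-np.sqrt(k) * np.sin(phi), np.cos(phi)]])

def drift(L):
    """Transfer matrix of a field-free drift of length L (m)."""
    return np.array([[1.0, L],
                     [0.0, 1.0]])

# trace one ray (1 mm offset, zero divergence) entering a 0.2 m drift,
# then a quadrupole (k = 20 m^-2, L = 0.1 m), then a 0.5 m drift
ray = np.array([1e-3, 0.0])               # [position (m), angle (rad)]
system = drift(0.5) @ quad_focusing(k=20.0, L=0.1) @ drift(0.2)
print(system @ ray)                       # position and angle at the exit plane
```

Full raytracing codes integrate the equations of motion through the actual field distributions instead, which is what makes misalignment and aberration studies possible.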
NASA Technical Reports Server (NTRS)
Thompson, David S.; Soni, Bharat K.
2000-01-01
An integrated software package, ICEG2D, was developed to automate computational fluid dynamics (CFD) simulations for single-element airfoils with ice accretion. ICEG2D is designed to automatically perform three primary functions: (1) generating a grid-ready, surface definition based on the geometrical characteristics of the iced airfoil surface, (2) generating a high-quality grid using the generated surface point distribution, and (3) generating the input and restart files needed to run the general purpose CFD solver NPARC. ICEG2D can be executed in batch mode using a script file or in an interactive mode by entering directives from a command line. This report summarizes activities completed in the first year of a three-year research and development program to address issues related to CFD simulations for aircraft components with ice accretion. Specifically, this document describes the technology employed in the software, the installation procedure, and a description of the operation of the software package. Validation of the geometry and grid generation modules of ICEG2D is also discussed.
PINT, A Modern Software Package for Pulsar Timing
NASA Astrophysics Data System (ADS)
Luo, Jing; Ransom, Scott M.; Demorest, Paul; Ray, Paul S.; Stovall, Kevin; Jenet, Fredrick; Ellis, Justin; van Haasteren, Rutger; Bachetti, Matteo; NANOGrav PINT developer team
2018-01-01
Pulsar timing, first developed decades ago, has provided an extremely wide range of knowledge about our universe. It has been responsible for many important discoveries, such as the discovery of the first exoplanet and the orbital period decay of double neutron star systems. Currently pulsar timing is the leading technique for detecting low frequency (about 10^-9 Hertz) gravitational waves (GW) using an array of pulsars as the detectors. To achieve this goal, high precision pulsar timing data, at about nanoseconds level, is required. Most high precision pulsar timing data are analyzed using the widely adopted software TEMPO/TEMPO2. But for a robust and believable GW detection, it is important to have independent software that can cross-check the result. In this poster we present the new generation pulsar timing software PINT. This package will provide a robust system to cross check high-precision timing results, completely independent of TEMPO and TEMPO2. In addition, PINT is designed to be a package that is easy to extend and modify, through use of flexible code architecture and a modern programming language, Python, with modern technology and libraries.
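To make the underlying idea concrete, the self-contained sketch below compares synthetic arrival times against a simple spin-down phase model and reports the residuals. It deliberately avoids the PINT API (and everything from barycentring to binary models that a real timing package handles), and all parameter values are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

def timing_residuals(toas, f0, f1, t0=0.0):
    """Phase residuals (seconds) of pulse arrival times against a spin-down model."""
    dt = toas - t0
    phase = f0 * dt + 0.5 * f1 * dt**2          # rotational phase in turns
    frac = phase - np.round(phase)              # distance to nearest integer turn
    return frac / f0                            # convert turns to seconds

# synthetic TOAs for a hypothetical 300 Hz pulsar: integer pulse numbers plus 1 us noise
pulse_numbers = np.arange(0, 3_000_000, 60_000)
toas = pulse_numbers / 300.0 + rng.normal(0.0, 1e-6, pulse_numbers.size)

res = timing_residuals(toas, f0=300.0, f1=0.0)
print(f"RMS residual = {res.std() * 1e6:.2f} microseconds")
```

Packages like the one described above refine model parameters by minimising exactly this kind of residual, but at nanosecond precision and with far richer physical models.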
Mittal, Varun; Hung, Ling-Hong; Keswani, Jayant; Kristiyanto, Daniel; Lee, Sung Bong; Yeung, Ka Yee
2017-04-01
Software container technology such as Docker can be used to package and distribute bioinformatics workflows consisting of multiple software implementations and dependencies. However, Docker is a command line-based tool, and many bioinformatics pipelines consist of components that require a graphical user interface. We present a container tool called GUIdock-VNC that uses a graphical desktop sharing system to provide a browser-based interface for containerized software. GUIdock-VNC uses the Virtual Network Computing protocol to render the graphics within most commonly used browsers. We also present a minimal image builder that can add our proposed graphical desktop sharing system to any Docker packages, with the end result that any Docker packages can be run using a graphical desktop within a browser. In addition, GUIdock-VNC uses the Oauth2 authentication protocols when deployed on the cloud. As a proof-of-concept, we demonstrated the utility of GUIdock-noVNC in gene network inference. We benchmarked our container implementation on various operating systems and showed that our solution creates minimal overhead. © The Authors 2017. Published by Oxford University Press.
Microcomputer Software Packages--Choose with Caution.
ERIC Educational Resources Information Center
Naumer, Janet Noll
1983-01-01
Briefly discusses types of software available for library and media center operations and library instruction, suggests three sources of software reviews, and describes almost 50 specific application programs available for bibliographic management, cataloging, circulation, inventory and purchasing, readability, and teaching library skills in…
ConcreteWorks v3 training/user manual (P1) : ConcreteWorks software (P2).
DOT National Transportation Integrated Search
2017-04-01
ConcreteWorks is designed to be a user-friendly software package that can help concrete : professionals optimize concrete mixture proportioning, perform a concrete thermal analysis, and : increase the chloride diffusion service life. The software pac...
Using WNTR to Model Water Distribution System Resilience ...
The Water Network Tool for Resilience (WNTR) is a new open source Python package developed by the U.S. Environmental Protection Agency and Sandia National Laboratories to model and evaluate resilience of water distribution systems. WNTR can be used to simulate a wide range of disruptive events, including earthquakes, contamination incidents, floods, climate change, and fires. The software includes the EPANET solver as well as a WNTR solver with the ability to model pressure-driven demand hydraulics, pipe breaks, component degradation and failure, changes to supply and demand, and cascading failure. Damage to individual components in the network (i.e. pipes, tanks) can be selected probabilistically using fragility curves. WNTR can also simulate different types of resilience-enhancing actions, including scheduled pipe repair or replacement, water conservation efforts, addition of back-up power, and use of contamination warning systems. The software can be used to estimate potential damage in a network, evaluate preparedness, prioritize repair strategies, and identify worst-case scenarios. As a Python package, WNTR takes advantage of many existing Python capabilities, including parallel processing of scenarios and graphics capabilities. This presentation will outline the modeling components in WNTR, demonstrate their use, give the audience information on how to get started using the code, and invite others to participate in this open source project. This pres
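A minimal usage sketch is shown below. The calls follow WNTR's published examples but should be checked against the installed version, and the input file and element names ("network.inp", "J-42") are hypothetical.

```python
import wntr

# load a network model from an EPANET input file (file name is hypothetical)
wn = wntr.network.WaterNetworkModel("network.inp")

# add a leak at a junction partway through the scenario, then run the WNTR solver
node = wn.get_node("J-42")                  # hypothetical junction name
node.add_leak(wn, area=0.05, start_time=2 * 3600, end_time=12 * 3600)

sim = wntr.sim.WNTRSimulator(wn)
results = sim.run_sim()

pressure = results.node["pressure"]         # pandas DataFrame, one column per node
print(pressure.min().sort_values().head())  # nodes with the lowest simulated pressure
```

Resilience studies then repeat this kind of scenario many times, varying the damaged components and repair strategies, which is where the package's parallel scenario handling becomes useful.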
MASTOS: Mammography Simulation Tool for design Optimization Studies.
Spyrou, G; Panayiotakis, G; Tzanakos, G
2000-01-01
Mammography is a high quality imaging technique for the detection of breast lesions, which requires dedicated equipment and optimum operation. The design parameters of a mammography unit have to be decided and evaluated before the construction of such a high-cost apparatus. The optimum operational parameters must also be defined well before the real breast examination. MASTOS is a software package, based on Monte Carlo methods, that is designed to be used as a simulation tool in mammography. The input consists of the parameters that have to be specified when using a mammography unit, and also the parameters specifying the shape and composition of the breast phantom. In addition, the input may specify parameters needed in the design of a new mammographic apparatus. The main output of the simulation is a mammographic image and calculations of various factors that describe the image quality. The Monte Carlo simulation code is PC-based and is driven by an outer shell of a graphical user interface. The entire software package is a simulation tool for mammography and can be applied in basic research and/or in training in the fields of medical physics and biomedical engineering, as well as in the performance evaluation of new designs of mammography units and in the determination of optimum standards for the operational parameters of a mammography unit.
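To illustrate the Monte Carlo idea at the heart of such simulators (and nothing more), the toy sketch below samples photon free paths through a uniform slab and checks the transmitted primary fraction against the analytic Beer-Lambert value. The attenuation coefficients are rough illustrative numbers for ~20 keV photons, not MASTOS data, and no scatter or detector model is included.

```python
import numpy as np

rng = np.random.default_rng(0)

def transmitted_fraction(mu, thickness_cm, n_photons=100_000):
    """Monte Carlo estimate of primary photons crossing a slab without interacting."""
    # sample the distance to first interaction from an exponential distribution
    free_paths = rng.exponential(scale=1.0 / mu, size=n_photons)
    return np.mean(free_paths > thickness_cm)

# illustrative linear attenuation coefficients (cm^-1), approximate values only
tissues = {"adipose": 0.46, "glandular": 0.80}
for name, mu in tissues.items():
    est = transmitted_fraction(mu, thickness_cm=4.0)
    print(f"{name}: MC {est:.3f} vs Beer-Lambert {np.exp(-mu * 4.0):.3f}")
```

A full simulator additionally tracks scattered photons, spectra, geometry, and the detector response to produce the synthetic mammogram and image-quality figures described above.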
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abramczyk, Glenn; Hensel, Stephen J; Gupta, Narendra K.
Title 10 of the United States Code of Federal Regulations Part 71 for the Nuclear Regulatory Commission (10 CFR Part 71.73) requires that Type B radioactive material (RAM) packages satisfy certain Hypothetical Accident Conditions (HAC) thermal design requirements to ensure package safety during accidental fire conditions. Compliance with thermal design requirements can be met by prototype tests, analyses only, or a combination of tests and analyses. Normally, it is impractical to meet all the HAC using tests only, and the analytical methods are too complex due to the multi-physics non-linear nature of the fire event. Therefore, a combination of tests and thermal analyses methods using commercial heat transfer software are used to meet the necessary design requirements. The authors, along with their colleagues at Savannah River National Laboratory in Aiken, SC, USA, have successfully used this 'tests and analyses' approach in the design and certification of several United States' DOE/NNSA certified packages, e.g. 9975, 9977, 9978, 9979, H1700, and the Bulk Tritium Shipping Package (BTSP). This paper will describe these methods and it is hoped that RAM Type B package designers and analysts can use them for their applications.
Quality Assurance Information for R Packages "aqfig" and "M3"
R packages “aqfig" and “M3" are optional modules for use with R statistical software (http://www.r-project.org). Package “aqfig" contains functions to aid users in the preparation of publication-quality figures for the display of air quality and other environmental data (e.g., le...
Diagnostic Testing Package DX v 2.0 Technical Specification. Methodology Project.
ERIC Educational Resources Information Center
McArthur, David
This paper contains the technical specifications, schematic diagrams, and program printout for a computer software package for the development and administration of diagnostic tests. The second version of the Diagnostic Testing Package DX consists of a PASCAL-based set of modules located in two main programs: (1) EDITTEST creates, modifies, and…
Transit safety retrofit package development : applications requirements document.
DOT National Transportation Integrated Search
2014-05-01
This Application Requirements Document for the Transit Safety Retrofit Package (TRP) Development captures the system, hardware and software requirements towards fulfilling the technical objectives stated within the contract. To achieve the objective ...
NASA Astrophysics Data System (ADS)
Memon, Aamir Mahmood; Liu, Qun; Memon, Khadim Hussain; Baloch, Wazir Ali; Memon, Asfandyar; Baset, Abdul
2015-07-01
Catch and effort data were analyzed to estimate the maximum sustainable yield (MSY) of King Soldier Bream, Argyrops spinifer (Forsskål, 1775, Family: Sparidae), and to evaluate the present status of the fish stocks exploited in Pakistani waters. The catch and effort data for the 25-year period 1985-2009 were analyzed using two computer software packages, CEDA (catch and effort data analysis) and ASPIC (a surplus production model incorporating covariates). The maximum catch of 3 458 t was observed in 1988 and the minimum catch of 1 324 t in 2005, while the average annual catch of A. spinifer over the 25 years was 2 500 t. The surplus production models of Fox, Schaefer, and Pella-Tomlinson under the three error assumptions of normal, log-normal and gamma are implemented in the CEDA package, and the two surplus production models of Fox and logistic are implemented in the ASPIC package. In CEDA, the MSY was estimated by applying an initial proportion (IP) of 0.8, because the starting catch was approximately 80% of the maximum catch. Because the gamma error assumption showed maximization failures, it was excluded; the MSY estimates from CEDA with the Fox surplus production model under the two remaining error assumptions were 1 692.08 t (R² = 0.572) and 1 694.09 t (R² = 0.606), respectively, and from the Schaefer and Pella-Tomlinson models under the two error assumptions were 2 390.95 t (R² = 0.563) and 2 380.06 t (R² = 0.605), respectively. The MSY estimated by the Fox model was conservative compared with those from the Schaefer and Pella-Tomlinson models. The MSY values from the Schaefer and Pella-Tomlinson models were the same. The MSY values computed using the ASPIC software with the two surplus production models of Fox and logistic were 1 498 t (R² = 0.917) and 2 488 t (R² = 0.897), respectively. The MSY estimates from CEDA were about 1 700-2 400 t and those from ASPIC were 1 500-2 500 t. The estimates output by the CEDA and ASPIC packages indicate that the stock is overfished and needs effective management to reduce the fishing effort on the species in Pakistani waters.
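For readers unfamiliar with these models, the sketch below writes down the Schaefer surplus-production dynamics, its analytic MSY (rK/4), and a simple projection under constant effort. The parameter values are made up for illustration and are not the CEDA/ASPIC estimates reported above.

```python
import numpy as np

def schaefer_projection(r, K, q, effort, B0, years):
    """Project biomass and catch under the Schaefer surplus-production model."""
    B = np.empty(years + 1)
    catch = np.empty(years)
    B[0] = B0
    for t in range(years):
        catch[t] = q * effort[t] * B[t]                         # catch = q * E * B
        B[t + 1] = max(B[t] + r * B[t] * (1 - B[t] / K) - catch[t], 1e-6)
    return B, catch

# illustrative parameters only (intrinsic growth r, carrying capacity K, catchability q)
r, K, q = 0.4, 25_000.0, 0.0002
msy, e_msy = r * K / 4.0, r / (2.0 * q)                         # Schaefer analytic results
print(f"Schaefer MSY = {msy:.0f} t at effort E_MSY = {e_msy:.0f}")

effort = np.full(25, 1.5 * e_msy)                               # sustained overfishing scenario
B, catch = schaefer_projection(r, K, q, effort, B0=K, years=25)
print(f"final biomass = {B[-1]:.0f} t, final catch = {catch[-1]:.0f} t")
```

Fitting packages such as CEDA and ASPIC estimate r, K, and q from observed catch-effort series (under different error assumptions) rather than assuming them, which is where the reported R² values come from.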
Mass decomposition of galaxies using DECA software package
NASA Astrophysics Data System (ADS)
Mosenkov, A. V.
2014-01-01
The new DECA software package, which is designed to perform photometric analysis of images of disk and elliptical galaxies having a regular structure, is presented. DECA is written in the interpreted language Python and combines the capabilities of several widely used packages for astronomical data processing, such as IRAF, SExtractor, and the GALFIT code used to perform two-dimensional decomposition of galaxy images into several photometric components (bulge+disk). DECA has the advantage that it can be applied to large samples of galaxies with different orientations with respect to the line of sight (including edge-on galaxies) and requires minimum human intervention. Examples of using the package to study a sample of simulated galaxy images and a sample of real objects are shown to demonstrate that DECA can be a reliable tool for the study of the structure of galaxies.
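The bulge+disk parameterisation used by such decompositions is easy to write down. The sketch below evaluates a one-dimensional Sérsic bulge plus exponential disk profile with invented parameters, purely as an illustration of the model components rather than anything taken from DECA or GALFIT.

```python
import numpy as np

def sersic(r, I_e, r_e, n):
    """Sersic surface-brightness profile (intensity units)."""
    b_n = 2.0 * n - 1.0 / 3.0              # common approximation for the Sersic b_n constant
    return I_e * np.exp(-b_n * ((r / r_e) ** (1.0 / n) - 1.0))

def exp_disk(r, I_0, h):
    """Exponential disk surface-brightness profile."""
    return I_0 * np.exp(-r / h)

# illustrative bulge+disk model evaluated along the major axis (arbitrary units)
r = np.linspace(0.0, 30.0, 301)
total = sersic(r, I_e=50.0, r_e=2.0, n=4.0) + exp_disk(r, I_0=100.0, h=5.0)
mag = -2.5 * np.log10(total)               # relative magnitude scale
print(mag[:5])
```

A 2D decomposition code fits the same components (plus ellipticity, position angle, and the point-spread function) to every pixel of the image, which is the step a pipeline like DECA automates over large samples.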
ERIC Educational Resources Information Center
Kimball, Jeffrey P.; And Others
1987-01-01
Describes a variety of computer software. The packages reviewed include a variety of simulations, a spreadsheet, a printer driver, and an alternative operating system for IBM PCs and compatibles. (BSR)
Brown, Stephen; Hutton, Brian; Clifford, Tammy; Coyle, Doug; Grima, Daniel; Wells, George; Cameron, Chris
2014-09-29
The use of network meta-analysis has increased dramatically in recent years. WinBUGS, a freely available Bayesian software package, has been the most widely used software package to conduct network meta-analyses. However, the learning curve for WinBUGS can be daunting, especially for new users. Furthermore, critical appraisal of network meta-analyses conducted in WinBUGS can be challenging given its limited data manipulation capabilities and the fact that generation of graphical output from network meta-analyses often relies on different software packages than the analyses themselves. We developed a freely available Microsoft-Excel-based tool called NetMetaXL, programmed in Visual Basic for Applications, which provides an interface for conducting a Bayesian network meta-analysis using WinBUGS from within Microsoft Excel. This tool allows the user to easily prepare and enter data, set model assumptions, and run the network meta-analysis, with results being automatically displayed in an Excel spreadsheet. It also contains macros that use NetMetaXL's interface to generate evidence network diagrams, forest plots, league tables of pairwise comparisons, probability plots (rankograms), and inconsistency plots within Microsoft Excel. All figures generated are publication quality, thereby increasing the efficiency of knowledge transfer and manuscript preparation. We demonstrate the application of NetMetaXL using data from a network meta-analysis published previously which compares combined resynchronization and implantable defibrillator therapy in left ventricular dysfunction. We replicate results from the previous publication while demonstrating result summaries generated by the software. Use of the freely available NetMetaXL successfully demonstrated its ability to make running network meta-analyses more accessible to novice WinBUGS users by allowing analyses to be conducted entirely within Microsoft Excel. NetMetaXL also allows for more efficient and transparent critical appraisal of network meta-analyses, enhanced standardization of reporting, and integration with health economic evaluations which are frequently Excel-based.
2014-01-01
Background: The use of network meta-analysis has increased dramatically in recent years. WinBUGS, a freely available Bayesian software package, has been the most widely used software package to conduct network meta-analyses. However, the learning curve for WinBUGS can be daunting, especially for new users. Furthermore, critical appraisal of network meta-analyses conducted in WinBUGS can be challenging given its limited data manipulation capabilities and the fact that generation of graphical output from network meta-analyses often relies on different software packages than the analyses themselves. Methods: We developed a freely available Microsoft-Excel-based tool called NetMetaXL, programmed in Visual Basic for Applications, which provides an interface for conducting a Bayesian network meta-analysis using WinBUGS from within Microsoft Excel. This tool allows the user to easily prepare and enter data, set model assumptions, and run the network meta-analysis, with results being automatically displayed in an Excel spreadsheet. It also contains macros that use NetMetaXL's interface to generate evidence network diagrams, forest plots, league tables of pairwise comparisons, probability plots (rankograms), and inconsistency plots within Microsoft Excel. All figures generated are publication quality, thereby increasing the efficiency of knowledge transfer and manuscript preparation. Results: We demonstrate the application of NetMetaXL using data from a network meta-analysis published previously which compares combined resynchronization and implantable defibrillator therapy in left ventricular dysfunction. We replicate results from the previous publication while demonstrating result summaries generated by the software. Conclusions: Use of the freely available NetMetaXL successfully demonstrated its ability to make running network meta-analyses more accessible to novice WinBUGS users by allowing analyses to be conducted entirely within Microsoft Excel. NetMetaXL also allows for more efficient and transparent critical appraisal of network meta-analyses, enhanced standardization of reporting, and integration with health economic evaluations which are frequently Excel-based. PMID:25267416
Thomas, Philipp; Matuschek, Hannes; Grima, Ramon
2012-01-01
The accepted stochastic descriptions of biochemical dynamics under well-mixed conditions are given by the Chemical Master Equation and the Stochastic Simulation Algorithm, which are equivalent. The latter is a Monte-Carlo method, which, despite enjoying broad availability in a large number of existing software packages, is computationally expensive due to the huge amounts of ensemble averaging required for obtaining accurate statistical information. The former is a set of coupled differential-difference equations for the probability of the system being in any one of the possible mesoscopic states; these equations are typically computationally intractable because of the inherently large state space. Here we introduce the software package intrinsic Noise Analyzer (iNA), which allows for systematic analysis of stochastic biochemical kinetics by means of van Kampen's system size expansion of the Chemical Master Equation. iNA is platform independent and supports the popular SBML format natively. The present implementation is the first to adopt a complementary approach that combines state-of-the-art analysis tools using the computer algebra system Ginac with traditional methods of stochastic simulation. iNA integrates two approximation methods based on the system size expansion, the Linear Noise Approximation and effective mesoscopic rate equations, which to date have not been available to non-expert users, into an easy-to-use graphical user interface. In particular, the present methods allow for quick approximate analysis of time-dependent mean concentrations, variances, covariances and correlation coefficients, which typically outperforms stochastic simulations. These analytical tools are complemented by automated multi-core stochastic simulations with direct statistical evaluation and visualization. We showcase iNA's performance by using it to explore the stochastic properties of cooperative and non-cooperative enzyme kinetics and a gene network associated with circadian rhythms. The software iNA is freely available as executable binaries for Linux, MacOSX and Microsoft Windows, as well as the full source code under an open source license.
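For contrast with the analytic route iNA provides, the sketch below runs a bare-bones Gillespie (SSA) simulation of a birth-death process and compares its long-run mean with the known stationary value. It is a generic textbook example, not code from iNA, and all rate constants are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

def ssa_birth_death(k_on, k_off, n0, t_end):
    """Gillespie simulation of a birth-death process: 0 -> X at rate k_on, X -> 0 at rate k_off*n."""
    t, n = 0.0, n0
    times, counts = [t], [n]
    while t < t_end:
        rates = np.array([k_on, k_off * n])
        total = rates.sum()
        t += rng.exponential(1.0 / total)          # time to next reaction
        if rng.random() < rates[0] / total:        # choose which reaction fires
            n += 1
        else:
            n -= 1
        times.append(t)
        counts.append(n)
    return np.array(times), np.array(counts)

# stationary mean of this process is k_on/k_off = 50; compare the SSA estimate to it
times, counts = ssa_birth_death(k_on=10.0, k_off=0.2, n0=0, t_end=500.0)
print(f"SSA mean ~ {counts[len(counts) // 2:].mean():.1f} (analytic 50.0)")
```

Methods based on the system size expansion, like those in iNA, aim to deliver the same moments (means, variances, covariances) analytically, avoiding this ensemble averaging for systems where the expansion is valid.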
Holzner, Bernhard; Giesinger, Johannes M; Pinggera, Jakob; Zugal, Stefan; Schöpf, Felix; Oberguggenberger, Anne S; Gamper, Eva M; Zabernigg, August; Weber, Barbara; Rumpold, Gerhard
2012-11-09
Patient-reported Outcomes (PROs), capturing e.g. quality of life, fatigue, depression, medication side-effects or disease symptoms, have become important outcome parameters in medical research and daily clinical practice. Electronic PRO data capture (ePRO) with software packages to administer questionnaires, store data, and present results has facilitated PRO assessment in hospital settings. Compared to conventional paper-pencil versions of PRO instruments, ePRO is more economical with regard to staff resources and time, and allows immediate presentation of results to the medical staff. The objective of our project was to develop software (CHES - Computer-based Health Evaluation System) for ePRO in hospital settings and at home with a special focus on the presentation of individual patients' results. Following the Extreme Programming development approach, the architecture was not fixed up-front but was developed in close, continuous collaboration with software end users (medical staff, researchers and patients) to meet their specific demands. Developed features include sophisticated, longitudinal charts linking patients' PRO data to clinical characteristics and to PRO scores from reference populations, a web-interface for questionnaire administration, and a tool for convenient creation and editing of questionnaires. By 2012 CHES has been implemented at various institutions in Austria, Germany, Switzerland, and the UK, and about 5000 patients participated in ePRO (with around 15000 assessments in total). Data entry is done by the patients themselves via tablet PCs, with a study nurse or an intern approaching patients and supervising questionnaire completion. During the last decade several software packages for ePRO have emerged for different purposes. Whereas commercial products are available primarily for ePRO in clinical trials, academic projects have focused on data collection and presentation in daily clinical practice and on extending cancer registries with PRO data. CHES includes several features facilitating the use of PRO data for individualized medical decision making. With its web-interface it allows ePRO also when patients are at home. Thus, it provides complete monitoring of patients' physical and psychosocial symptom burden.
2012-01-01
Background Patient-reported Outcomes (PROs), capturing e.g. quality of life, fatigue, depression, medication side-effects or disease symptoms, have become important outcome parameters in medical research and daily clinical practice. Electronic PRO data capture (ePRO) with software packages to administer questionnaires, store data, and present results has facilitated PRO assessment in hospital settings. Compared to conventional paper-pencil versions of PRO instruments, ePRO is more economical with regard to staff resources and time, and allows immediate presentation of results to the medical staff. The objective of our project was to develop software (CHES – Computer-based Health Evaluation System) for ePRO in hospital settings and at home with a special focus on the presentation of individual patients' results. Methods Following the Extreme Programming development approach, the architecture was not fixed up-front but was developed in close, continuous collaboration with software end users (medical staff, researchers and patients) to meet their specific demands. Developed features include sophisticated, longitudinal charts linking patients' PRO data to clinical characteristics and to PRO scores from reference populations, a web-interface for questionnaire administration, and a tool for convenient creation and editing of questionnaires. Results By 2012 CHES has been implemented at various institutions in Austria, Germany, Switzerland, and the UK, and about 5000 patients participated in ePRO (with around 15000 assessments in total). Data entry is done by the patients themselves via tablet PCs, with a study nurse or an intern approaching patients and supervising questionnaire completion. Discussion During the last decade several software packages for ePRO have emerged for different purposes. Whereas commercial products are available primarily for ePRO in clinical trials, academic projects have focused on data collection and presentation in daily clinical practice and on extending cancer registries with PRO data. CHES includes several features facilitating the use of PRO data for individualized medical decision making. With its web-interface it allows ePRO also when patients are at home. Thus, it provides complete monitoring of patients' physical and psychosocial symptom burden. PMID:23140270
Development of the FITS tools package for multiple software environments
NASA Technical Reports Server (NTRS)
Pence, W. D.; Blackburn, J. K.
1992-01-01
The HEASARC is developing a package of general purpose software for analyzing data files in FITS format. This paper describes the design philosophy which makes the software both machine-independent (it runs on VAXs, Suns, and DEC-stations) and software environment-independent. Currently the software can be compiled and linked to produce IRAF tasks, or alternatively, the same source code can be used to generate stand-alone tasks using one of two implementations of a user-parameter interface library. The machine independence of the software is achieved by writing the source code in ANSI standard Fortran or C, using the machine-independent FITSIO subroutine interface for all data file I/O, and using a standard user-parameter subroutine interface for all user I/O. The latter interface is based on the Fortran IRAF Parameter File interface developed at STScI. The IRAF tasks are built by linking to the IRAF implementation of this parameter interface library. Two other implementations of this parameter interface library, which have no IRAF dependencies, are now available which can be used to generate stand-alone executable tasks. These stand-alone tasks can simply be executed from the machine operating system prompt either by supplying all the task parameters on the command line or by entering the task name after which the user will be prompted for any required parameters. A first release of this FTOOLS package is now publicly available. The currently available tasks are described, along with instructions on how to obtain a copy of the software.
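The FITSIO interface described above is a Fortran/C library; purely as a modern point of comparison, a minimal Python sketch for inspecting a FITS file is given below. It assumes the astropy package and a hypothetical file name "example.fits", and is not part of the FTOOLS package itself.

from astropy.io import fits  # assumes astropy is installed; not part of FTOOLS

# Open a (hypothetical) FITS file, list its HDUs, and read the primary header and data.
with fits.open("example.fits") as hdul:
    hdul.info()                              # summary of the HDUs in the file
    header = hdul[0].header                  # primary header keywords
    data = hdul[0].data                      # primary data array (may be None)
    print(header.get("TELESCOP", "unknown"), getattr(data, "shape", None))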
DOE Office of Scientific and Technical Information (OSTI.GOV)
Valassi, A.; Clemencic, M.; Dykstra, D.
The Persistency Framework consists of three software packages (CORAL, COOL and POOL) addressing the data access requirements of the LHC experiments in different areas. It is the result of the collaboration between the CERN IT Department and the three experiments (ATLAS, CMS and LHCb) that use this software to access their data. POOL is a hybrid technology store for C++ objects, metadata catalogs and collections. CORAL is a relational database abstraction layer with an SQL-free API. COOL provides specific software tools and components for the handling of conditions data. This paper reports on the status and outlook of the project and reviews in detail the usage of each package in the three experiments.
Echelle Data Reduction Cookbook
NASA Astrophysics Data System (ADS)
Clayton, Martin
This document is the first version of the Starlink Echelle Data Reduction Cookbook. It contains scripts and procedures developed by regular or heavy users of the existing software packages. These scripts are generally of two types: templates, which readers may be able to modify to suit their particular needs, and utilities, which carry out a particular common task and can probably be used 'off-the-shelf'. In the nature of this subject the recipes given are quite strongly tied to the software packages, rather than being science-data led. The major part of this document is divided into two sections dealing with scripts to be used with IRAF and with Starlink software (SUN/1).
Exploring Convergent Evolution to Provide a Foundation for Protein Engineering
2009-02-26
... the DivergentSet and MotifCluster Algorithms. Using support from this grant, we developed two software packages that provide key infrastructure for ... The software package we developed, MotifCluster, provides a novel way of detecting distantly related homologs, one of the key aims of the proposal. Unlike ...
YAMM - Yet Another Menu Manager
NASA Technical Reports Server (NTRS)
Mazer, Alan S.; Weidner, Richard J.
1991-01-01
Yet Another Menu Manager (YAMM) is an application-independent menuing software package designed to remove much of the difficulty and save much of the time inherent in implementing the front ends of large software packages. Provides complete menuing front end for wide variety of applications, with provisions for independence from specific types of terminals, configurations that meet specific needs of users, and dynamic creation of menu trees. Consists of two parts: description of menu configuration and body of application code. Written in C.
Computing Radiative Transfer in a 3D Medium
NASA Technical Reports Server (NTRS)
Von Allmen, Paul; Lee, Seungwon
2012-01-01
A package of software computes the time-dependent propagation of a narrow laser beam in an arbitrary three-dimensional (3D) medium with absorption and scattering, using the transient-discrete-ordinates method and a direct integration method. Unlike prior software that utilizes a Monte Carlo method, this software enables simulation at very small signal-to-noise ratios. The ability to simulate propagation of a narrow laser beam in a 3D medium is an improvement over other discrete-ordinate software. Unlike other direct-integration software, this software is not limited to simulation of propagation of thermal radiation with broad angular spread in three dimensions or of a laser pulse with narrow angular spread in two dimensions. Uses for this software include (1) computing scattering of a pulsed laser beam on a material having given elastic scattering and absorption profiles, and (2) evaluating concepts for laser-based instruments for sensing oceanic turbulence and related measurements of oceanic mixed-layer depths. With suitable augmentation, this software could be used to compute radiative transfer in ultrasound imaging in biological tissues, radiative transfer in the upper Earth crust for oil exploration, and propagation of laser pulses in telecommunication applications.
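As background for the transient-discrete-ordinates method named above, the standard time-dependent radiative transfer equation for a medium with absorption and scattering is reproduced below; the symbols are generic and this is not necessarily the exact formulation discretized by the software.

% I(r,Omega,t): radiance; c: speed of light in the medium; mu_a, mu_s: absorption and
% scattering coefficients; p: scattering phase function; S: internal or boundary source.
\frac{1}{c}\frac{\partial I(\mathbf{r},\hat{\Omega},t)}{\partial t}
  + \hat{\Omega}\cdot\nabla I(\mathbf{r},\hat{\Omega},t)
  = -(\mu_a+\mu_s)\,I(\mathbf{r},\hat{\Omega},t)
  + \mu_s\int_{4\pi} p(\hat{\Omega}'\cdot\hat{\Omega})\,I(\mathbf{r},\hat{\Omega}',t)\,d\Omega'
  + S(\mathbf{r},\hat{\Omega},t)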
Computer Software: Copyright and Licensing Considerations for Schools and Libraries. ERIC Digest.
ERIC Educational Resources Information Center
Reed, Mary Hutchings
This digest notes that the terms and conditions of computer software package license agreements control the use of software in schools and libraries, and examines the implications of computer software license agreements for classroom use and for library lending policies. Guidelines are provided for interpreting the Copyright Act, and insuring the…
Software Literacy and Student Learning in the Tertiary Environment: Powerpoint and Beyond
ERIC Educational Resources Information Center
Khoo, Elaine; Hight, Craig; Cowie, Bronwen; Torrens, Rob; Ferrarelli, Lisabeth
2014-01-01
In this paper, we explore the relationship between student success in acquiring software literacy and students' broader engagement and understanding of knowledge across different disciplines. We report on the first phase of a project that examines software literacies associated with Microsoft PowerPoint as a common software package encountered and…
ERIC Educational Resources Information Center
Davies, Denise M.
1985-01-01
Discusses design, development, and use of a database to provide organization and access to a computer software collection at the University of Hawaii School of Library Studies. Field specifications, samples of report forms, and a description of the physical organization of the software collection are included. (MBR)
Software requirements flow-down and preliminary software design for the G-CLEF spectrograph
NASA Astrophysics Data System (ADS)
Evans, Ian N.; Budynkiewicz, Jamie A.; DePonte Evans, Janet; Miller, Joseph B.; Onyuksel, Cem; Paxson, Charles; Plummer, David A.
2016-08-01
The Giant Magellan Telescope (GMT)-Consortium Large Earth Finder (G-CLEF) is a fiber-fed, precision radial velocity (PRV) optical echelle spectrograph that will be the first light instrument on the GMT. The G-CLEF instrument device control subsystem (IDCS) provides software control of the instrument hardware, including the active feedback loops that are required to meet the G-CLEF PRV stability requirements. The IDCS is also tasked with providing operational support packages that include data reduction pipelines and proposal preparation tools. A formal, but ultimately pragmatic approach is being used to establish a complete and correct set of requirements for both the G-CLEF device control and operational support packages. The device control packages must integrate tightly with the state-machine driven software and controls reference architecture designed by the GMT Organization. A model-based systems engineering methodology is being used to develop a preliminary design that meets these requirements. Through this process we have identified some lessons that have general applicability to the development of software for ground-based instrumentation. For example, tasking an individual with overall responsibility for science/software/hardware integration is a key step to ensuring effective integration between these elements. An operational concept document that includes detailed routine and non-routine operational sequences should be prepared in parallel with the hardware design process to tie together these elements and identify any gaps. Appropriate time-phasing of the hardware and software design phases is important, but revisions to driving requirements that impact software requirements and preliminary design are inevitable. Such revisions must be carefully managed to ensure efficient use of resources.
DOE Office of Scientific and Technical Information (OSTI.GOV)
RIECK, C.A.
1999-02-23
This Software Configuration Management Plan (SCMP) provides the instructions for change control of the W-211 Project, Retrieval Control System (RCS) software after initial approval/release but prior to the transfer of custody to the waste tank operations contractor. This plan applies to the W-211 system software developed by the project, consisting of the computer human-machine interface (HMI) and programmable logic controller (PLC) software source and executable code, for production use by the waste tank operations contractor. The plan encompasses that portion of the W-211 RCS software represented on project-specific AUTOCAD drawings that are released as part of the C1 definitive design package (these drawings are identified on the drawing list associated with each C-1 package), and the associated software code. Implementation of the plan is required for formal acceptance testing and production release. The software configuration management plan does not apply to reports and data generated by the software except where specifically identified. Control of information produced by the software once it has been transferred for operation is the responsibility of the receiving organization.
ERIC Educational Resources Information Center
Journal of Chemical Education, 1989
1989-01-01
Presented are reviews of two computer software packages for Apple II computers: "Organic Spectroscopy" and "Videodisc Display Program" for use with "The Periodic Table Videodisc." A sample spectrograph from "Organic Spectroscopy" is included. (CW)
ERIC Educational Resources Information Center
Wulfson, Stephen, Ed.
1990-01-01
Reviewed are six computer software packages including "Lunar Greenhouse," "Dyno-Quest," "How Weather Works," "Animal Trackers," "Personal Science Laboratory," and "The Skeletal and Muscular Systems." Availability, functional, and hardware requirements are discussed. (CW)
ERIC Educational Resources Information Center
Classroom Computer Learning, 1988
1988-01-01
Provides reviews of three software packages including "MusicShapes," "For Comment," and "Colortrope," which were developed for music, writing, and science, respectively. Includes information on grade levels, publishers, hardware needed, and cost. (TW)
"Software Tools" to Improve Student Writing.
ERIC Educational Resources Information Center
Oates, Rita Haugh
1987-01-01
Reviews several software packages that analyze text readability, check for spelling and style problems, offer desktop publishing capabilities, teach interviewing skills, and teach grammar using a computer game. (SRT)
ERIC Educational Resources Information Center
Wulfson, Stephen, Ed.
1988-01-01
Reviews seven instructional software packages covering a variety of topics. Includes: "Science Square-Off"; "The Desert"; "Science Courseware: Physical Science"; "Odell Lake"; "Safety First"; "An Experience in Artificial Intelligence"; and "Master Mapper." (TW)
Selecting Really Excellent Software for Young Adults.
ERIC Educational Resources Information Center
Polly, Jean Armour
1985-01-01
This article discusses criteria of a good computer software package to aid the public librarian in the building, weeding, and maintenance of a software collection for young adults. Highlights include manuals or documentation; bells, whistles, and color; and the true test of time. (EJS)
Front-End/Gateway Software: Availability and Usefulness.
ERIC Educational Resources Information Center
Kesselman, Martin
1985-01-01
Reviews features of front-end software packages (interface between user and online system)--database selection, search strategy development, saving and downloading, hardware and software requirements, training and documentation, online systems and database accession, and costs--and discusses gateway services (user searches through intermediary…
A Formative Evaluation of CU-SeeMe.
1995-02-01
CU-SeeMe is a video conferencing software package that was designed and programmed at Cornell University. The program works with the TCP/IP network...protocol and allows two or more parties to conduct a real-time video conference with full audio support. In this paper we evaluate CU-SeeMe through...caused the problem and why. This helps in the process of formulating solutions for observed usability problems. All the testing results are combined in the Appendix in an illustrated partial redesign of the CU-SeeMe interface.
Evaluation of Honeywell Recoverable Computer System (RCS) in Presence of Electromagnetic Effects
NASA Technical Reports Server (NTRS)
Malekpour, Mahyar
1997-01-01
The design and development of a Closed-Loop System to study and evaluate the performance of the Honeywell Recoverable Computer System (RCS) in electromagnetic environments (EME) is presented. The development of a Windows-based software package to handle the time-critical communication of data and commands between the RCS and flight simulation code in real time, while meeting the stringent hard deadlines, is also presented. The performance results of the RCS while exercising flight control laws under ideal conditions as well as in the presence of electromagnetic fields are also discussed.
Design and implementation of a software package to control a network of robotic observatories
NASA Astrophysics Data System (ADS)
Tuparev, G.; Nicolova, I.; Zlatanov, B.; Mihova, D.; Popova, I.; Hessman, F. V.
2006-09-01
We present a description of a reusable software package able to control a large, heterogeneous network of fully and semi-robotic observatories, initially developed to run the MONET network of two 1.2 m telescopes. Special attention is given to the design of a robust, long-term observation scheduler which also allows the trading of observation time and facilities within various networks. The handling of the "Phase I&II" project-development process, the time-accounting between complex organizational structures, and usability issues for making the package accessible not only to professional astronomers, but also to amateurs and high-school students are discussed. A simple RTML-based solution to link multiple networks is demonstrated.
A software package for interactive motor unit potential classification using fuzzy k-NN classifier.
Rasheed, Sarbast; Stashuk, Daniel; Kamel, Mohamed
2008-01-01
We present an interactive software package for implementing the supervised classification task during electromyographic (EMG) signal decomposition process using a fuzzy k-NN classifier and utilizing the MATLAB high-level programming language and its interactive environment. The method employs an assertion-based classification that takes into account a combination of motor unit potential (MUP) shapes and two modes of use of motor unit firing pattern information: the passive and the active modes. The developed package consists of several graphical user interfaces used to detect individual MUP waveforms from a raw EMG signal, extract relevant features, and classify the MUPs into motor unit potential trains (MUPTs) using assertion-based classifiers.
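The package above is MATLAB-based and its source is not reproduced here; as a generic illustration of the fuzzy k-NN idea it builds on, a Keller-style membership calculation is sketched below in Python. All names (train_X, train_u, the feature dimensions) are hypothetical, and this is not the package's own code.

import numpy as np

def fuzzy_knn_membership(train_X, train_u, x, k=5, m=2.0, eps=1e-9):
    # Generic fuzzy k-NN: soft class memberships of a candidate MUP feature vector x,
    # given labelled training MUPs (train_X) with soft memberships train_u (n_samples, n_classes).
    d = np.linalg.norm(train_X - x, axis=1)        # distances to all training MUPs
    idx = np.argsort(d)[:k]                        # indices of the k nearest neighbours
    w = 1.0 / (d[idx] ** (2.0 / (m - 1.0)) + eps)  # inverse-distance weights (fuzzifier m)
    u = (train_u[idx] * w[:, None]).sum(axis=0) / w.sum()
    return u                                       # argmax of u assigns the MUP to a MUPT

# Hypothetical usage with random 8-dimensional MUP features and 4 trains:
# train_X = np.random.rand(200, 8); train_u = np.eye(4)[np.random.randint(0, 4, 200)]
# print(fuzzy_knn_membership(train_X, train_u, np.random.rand(8)))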
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carter, Faustin Wirkus; Khaire, Trupti S.; Novosad, Valentyn
We present "scraps" (SuperConducting Analysis and Plotting Software), a Python package designed to aid in the analysis and visualization of large amounts of superconducting resonator data, specifically complex transmission as a function of frequency, acquired at many different temperatures and driving powers. The package includes a least-squares fitting engine as well as a Monte-Carlo Markov Chain sampler for sampling the posterior distribution given priors, marginalizing over nuisance parameters, and estimating covariances. A set of plotting tools for generating publication-quality figures is also provided in the package. Lastly, we discuss the functionality of the software and provide some examples of itsmore » utility on data collected from a niobium-nitride coplanar waveguide resonator fabricated at Argonne National Laboratory.« less
ERIC Educational Resources Information Center
Sneider, Cary; DeVore, Edna
1986-01-01
Reviews software packages under these headings: (1) simulations of experiments; (2) space flight simulators; (3) planetariums; (4) space adventure games; and (5) drill and practice packages (designed for testing purposes or for helping students learn basic astronomy vocabulary). (JN)
ERIC Educational Resources Information Center
Mathematics and Computer Education, 1988
1988-01-01
Presents reviews of six software packages. Includes (1) "Plain Vanilla Statistics"; (2) "MathCAD 2.0"; (3) "GrFx"; (4) "Trigonometry"; (5) "Algebra II"; (6) "Algebra Drill and Practice I, II, and III." (PK)
ERIC Educational Resources Information Center
Dwyer, Donna; And Others
1989-01-01
Reviewed are seven software packages for Apple and IBM computers. Included are: "Toxicology"; "Science Corner: Space Probe"; "Alcohol and Pregnancy"; "Science Tool Kit Plus"; "Computer Investigations: Plant Growth"; "Climatrolls"; and "Animal Watch: Whales." (CW)
de Oliveira, Marcus Vinicius Linhares; Santos, António Carvalho; Paulo, Graciano; Campos, Paulo Sergio Flores; Santos, Joana
2017-06-01
The purpose of this study was to apply a newly developed free software program, at low cost and with minimal time, to evaluate the quality of dental and maxillofacial cone-beam computed tomography (CBCT) images. A polymethyl methacrylate (PMMA) phantom, CQP-IFBA, was scanned in 3 CBCT units with 7 protocols. A macro program was developed, using the free software ImageJ, to automatically evaluate the image quality parameters. The image quality evaluation was based on 8 parameters: uniformity, the signal-to-noise ratio (SNR), noise, the contrast-to-noise ratio (CNR), spatial resolution, the artifact index, geometric accuracy, and low-contrast resolution. The image uniformity and noise depended on the protocol that was applied. Regarding the CNR, high-density structures were more sensitive to the effect of scanning parameters. There were no significant differences between SNR and CNR in centered and peripheral objects. The geometric accuracy assessment showed that all the distance measurements were lower than the real values. Low-contrast resolution was influenced by the scanning parameters, and the 1-mm rod present in the phantom was not depicted in any of the 3 CBCT units. Smaller voxel sizes presented higher spatial resolution. There were no significant differences among the protocols regarding artifact presence. This software package provided a fast, low-cost, and feasible method for the evaluation of image quality parameters in CBCT.
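For readers who want to reproduce the simpler ROI-based metrics, a short Python sketch of SNR and CNR from object and background regions is given below; the definitions are the common textbook ones and may differ in detail from those coded in the ImageJ macro described above. The array and mask names are hypothetical.

import numpy as np

def snr_cnr(img, obj_mask, bg_mask):
    # SNR and CNR from ROI statistics of a single CBCT slice (illustrative definitions only).
    mu_obj = img[obj_mask].mean()
    mu_bg, sd_bg = img[bg_mask].mean(), img[bg_mask].std()
    snr = mu_obj / sd_bg                      # signal-to-noise ratio
    cnr = abs(mu_obj - mu_bg) / sd_bg         # contrast-to-noise ratio
    return snr, cnr

# Hypothetical usage with boolean ROI masks drawn on an axial slice:
# snr, cnr = snr_cnr(slice_array, rod_mask, background_mask)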
Seismology software: state of the practice
NASA Astrophysics Data System (ADS)
Smith, W. Spencer; Zeng, Zheng; Carette, Jacques
2018-05-01
We analyzed the state of practice for software development in the seismology domain by comparing 30 software packages on four aspects: product, implementation, design, and process. We found room for improvement in most seismology software packages. The principal areas of concern include a lack of adequate requirements and design specification documents, a lack of test data to assess reliability, a lack of examples to get new users started, and a lack of technological tools to assist with managing the development process. To assist going forward, we provide recommendations for a document-driven development process that includes a problem statement, development plan, requirement specification, verification and validation (V&V) plan, design specification, code, V&V report, and a user manual. We also provide advice on tool use, including issue tracking, version control, code documentation, and testing tools.
Buying in to bioinformatics: an introduction to commercial sequence analysis software
2015-01-01
Advancements in high-throughput nucleotide sequencing techniques have brought with them state-of-the-art bioinformatics programs and software packages. Given the importance of molecular sequence data in contemporary life science research, these software suites are becoming an essential component of many labs and classrooms, and as such are frequently designed for non-computer specialists and marketed as one-stop bioinformatics toolkits. Although beautifully designed and powerful, user-friendly bioinformatics packages can be expensive and, as more arrive on the market each year, it can be difficult for researchers, teachers and students to choose the right software for their needs, especially if they do not have a bioinformatics background. This review highlights some of the currently available and most popular commercial bioinformatics packages, discussing their prices, usability, features and suitability for teaching. Although several commercial bioinformatics programs are arguably overpriced and overhyped, many are well designed, sophisticated and, in my opinion, worth the investment. Whether you are just beginning your foray into molecular sequence analysis or are an experienced genomicist, I encourage you to explore proprietary software bundles. They have the potential to streamline your research, increase your productivity, energize your classroom and, if anything, add a bit of zest to the often dry, detached world of bioinformatics. PMID:25183247
Buying in to bioinformatics: an introduction to commercial sequence analysis software.
Smith, David Roy
2015-07-01
Advancements in high-throughput nucleotide sequencing techniques have brought with them state-of-the-art bioinformatics programs and software packages. Given the importance of molecular sequence data in contemporary life science research, these software suites are becoming an essential component of many labs and classrooms, and as such are frequently designed for non-computer specialists and marketed as one-stop bioinformatics toolkits. Although beautifully designed and powerful, user-friendly bioinformatics packages can be expensive and, as more arrive on the market each year, it can be difficult for researchers, teachers and students to choose the right software for their needs, especially if they do not have a bioinformatics background. This review highlights some of the currently available and most popular commercial bioinformatics packages, discussing their prices, usability, features and suitability for teaching. Although several commercial bioinformatics programs are arguably overpriced and overhyped, many are well designed, sophisticated and, in my opinion, worth the investment. Whether you are just beginning your foray into molecular sequence analysis or are an experienced genomicist, I encourage you to explore proprietary software bundles. They have the potential to streamline your research, increase your productivity, energize your classroom and, if anything, add a bit of zest to the often dry, detached world of bioinformatics. © The Author 2014. Published by Oxford University Press.
Introducing Python tools for magnetotellurics: MTpy
NASA Astrophysics Data System (ADS)
Krieger, L.; Peacock, J.; Inverarity, K.; Thiel, S.; Robertson, K.
2013-12-01
Within the framework of geophysical exploration techniques, the magnetotelluric method (MT) is relatively immature: It is still not as widely spread as other geophysical methods like seismology, and its processing schemes and data formats are not thoroughly standardized. As a result, the file handling and processing software within the academic community is mainly based on a loose collection of codes, which are sometimes highly adapted to the respective local specifications. Although tools for the estimation of the frequency-dependent MT transfer function, as well as inversion and modelling codes, are available, the standards and software for handling MT data are generally not unified throughout the community. To overcome problems that arise from missing standards, and to simplify the general handling of MT data, we have developed the software package "MTpy", which allows the handling, processing, and imaging of magnetotelluric data sets. It is written in Python and the code is open-source. The setup of this package follows the modular approach of successful software packages like GMT or Obspy. It contains sub-packages and modules for various tasks within the standard MT data processing and handling scheme. Besides pure Python classes and functions, MTpy provides wrappers and convenience scripts to call external software, e.g. modelling and inversion codes. Even though still under development, MTpy already contains ca. 250 functions that work on raw and preprocessed data. However, as our aim is not to produce a static collection of software, we rather introduce MTpy as a flexible framework, which will be dynamically extended in the future. It then has the potential to help standardise processing procedures and at the same time be a versatile supplement for existing algorithms. We introduce the concept and structure of MTpy, and we illustrate the workflow of MT data processing utilising MTpy on an example data set collected over a geothermal exploration site in South Australia. (Figure: workflow of MT data processing, showing the MTpy sub-packages for time-series processing, EDI file and impedance tensor handling, interfaces to modelling and inversion codes, impedance tensor interpretation such as phase tensor calculations, and visualization such as pseudosections and resistivity models.)
NASA Technical Reports Server (NTRS)
1997-01-01
Products made from advanced ceramics show great promise for revolutionizing aerospace and terrestrial propulsion and power generation. However, ceramic components are difficult to design because brittle materials in general have widely varying strength values. The CARES/Life software developed at the NASA Lewis Research Center eases this task by providing a tool that uses probabilistic reliability analysis techniques to optimize the design and manufacture of brittle material components. CARES/Life is an integrated package that predicts the probability of a monolithic ceramic component's failure as a function of its time in service. It couples commercial finite element programs, which resolve a component's temperature and stress distribution, with reliability evaluation and fracture mechanics routines for modeling strength-limiting defects. These routines are based on calculations of the probabilistic nature of the brittle material's strength.
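The probabilistic strength treatment mentioned above conventionally rests on Weibull weakest-link statistics; the textbook two-parameter volume-flaw form is given below for orientation. The symbols are generic and this is not necessarily the exact expression implemented in CARES/Life, which adds multiaxial and time-dependent extensions.

% Failure probability of a component of volume V under a stress field sigma(x).
% m: Weibull modulus; sigma_0: scale parameter; V_0: reference volume.
P_f = 1 - \exp\!\left[-\frac{1}{V_0}\int_V \left(\frac{\sigma(\mathbf{x})}{\sigma_0}\right)^{m} dV\right]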
NASA Astrophysics Data System (ADS)
Patanè, Domenico; Ferrari, Ferruccio; Giampiccolo, Elisabetta; Gresta, Stefano
A few automated data acquisition and processing systems operate on mainframes; some run on UNIX-based workstations and others on personal computers, equipped with either DOS/WINDOWS or UNIX-derived operating systems. Several large and complex software packages for automatic and interactive analysis of seismic data have been developed in recent years (mainly for UNIX-based systems). Some of these programs use a variety of artificial intelligence techniques. The first operational version of a new software package, named PC-Seism, for analyzing seismic data from a local network is presented in Patanè et al. (1999). This package, composed of three separate modules, provides an example of a new generation of visual object-oriented programs for interactive and automatic seismic data-processing running on a personal computer. In this work, we mainly discuss the automatic procedures implemented in the ASDP (Automatic Seismic Data-Processing) module and their real-time application to data acquired by a seismic network running in eastern Sicily. This software uses a multi-algorithm approach and a new procedure MSA (multi-station-analysis) for signal detection, phase grouping and event identification and location. It is designed for efficient and accurate processing of local earthquake records provided by single-site and array stations. Results from ASDP processing of two different data sets recorded at Mt. Etna volcano by a regional network are analyzed to evaluate its performance. By comparing the ASDP pickings with those revised manually, the detection and subsequently the location capabilities of this software are assessed. The first data set is composed of 330 local earthquakes recorded in the Mt. Etna area during 1997 by the telemetry analog seismic network. The second data set comprises about 970 automatic locations of more than 2600 local events recorded at Mt. Etna during the last eruption (July 2001) by the present network. For the former data set, a comparison of the automatic results with the manual picks indicates that the ASDP module can accurately pick 80% of the P-waves and 65% of S-waves. The on-line application to the latter data set shows that automatic locations are affected by larger errors, due to the preliminary setting of the configuration parameters in the program. However, both automatic ASDP and manual hypocenter locations are comparable within the estimated error bounds. New improvements of the PC-Seism software for on-line analysis are also discussed.
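The MSA procedure itself is not reproduced in the abstract above; as a generic illustration of the kind of single-trace detection stage such systems build on, a classic STA/LTA trigger ratio is sketched below in Python. This is a standard algorithm shown for orientation only, not the MSA procedure, and the variable names are hypothetical.

import numpy as np

def sta_lta_ratio(trace, fs, sta_win=1.0, lta_win=30.0):
    # Short-term-average / long-term-average ratio of signal energy; peaks in the
    # ratio indicate candidate seismic arrivals.
    sta_n, lta_n = int(sta_win * fs), int(lta_win * fs)
    energy = np.asarray(trace, dtype=float) ** 2
    csum = np.cumsum(np.insert(energy, 0, 0.0))
    sta = (csum[sta_n:] - csum[:-sta_n]) / sta_n   # short-term average
    lta = (csum[lta_n:] - csum[:-lta_n]) / lta_n   # long-term average
    n = min(len(sta), len(lta))                    # align both series at the trace end
    return sta[-n:] / (lta[-n:] + 1e-12)           # trigger where this exceeds a threshold (e.g. 3)

# Hypothetical usage on a 100 Hz waveform:
# ratio = sta_lta_ratio(waveform, fs=100.0); candidate_picks = np.where(ratio > 3.0)[0]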
Network design and quality checks in automatic orientation of close-range photogrammetric blocks.
Dall'Asta, Elisa; Thoeni, Klaus; Santise, Marina; Forlani, Gianfranco; Giacomini, Anna; Roncella, Riccardo
2015-04-03
Due to recent improvements in automatic measurement procedures in photogrammetry, multi-view 3D reconstruction technologies are becoming a favourite survey tool. The rapidly widening range of structure-from-motion (SfM) software packages offers significantly easier image processing workflows than traditional photogrammetry packages. However, while most orientation and surface reconstruction strategies will almost always succeed in any given task, estimating the quality of the result is, to some extent, still an open issue. An assessment of the precision and reliability of block orientation is necessary and should be included in every processing pipeline. Such a need was clearly felt from the results of close-range photogrammetric surveys of in situ full-scale and laboratory-scale experiments. In order to study the impact of the block control and the camera network design on the block orientation accuracy, a series of Monte Carlo simulations was performed. Two image block configurations were investigated: a single pseudo-normal strip and a circular, highly convergent block. The influence of surveying and data processing choices, such as the number and accuracy of the ground control points, autofocus and camera calibration, was investigated. The research highlights the most significant aspects and processes to be taken into account for adequate in situ and laboratory surveys when modern SfM software packages are used, and evaluates their effect on the quality of the results of the surface reconstruction.
diffHic: a Bioconductor package to detect differential genomic interactions in Hi-C data.
Lun, Aaron T L; Smyth, Gordon K
2015-08-19
Chromatin conformation capture with high-throughput sequencing (Hi-C) is a technique that measures the in vivo intensity of interactions between all pairs of loci in the genome. Most conventional analyses of Hi-C data focus on the detection of statistically significant interactions. However, an alternative strategy involves identifying significant changes in the interaction intensity (i.e., differential interactions) between two or more biological conditions. This is more statistically rigorous and may provide more biologically relevant results. Here, we present the diffHic software package for the detection of differential interactions from Hi-C data. diffHic provides methods for read pair alignment and processing, counting into bin pairs, filtering out low-abundance events and normalization of trended or CNV-driven biases. It uses the statistical framework of the edgeR package to model biological variability and to test for significant differences between conditions. Several options for the visualization of results are also included. The use of diffHic is demonstrated with real Hi-C data sets. Performance against existing methods is also evaluated with simulated data. On real data, diffHic is able to successfully detect interactions with significant differences in intensity between biological conditions. It also compares favourably to existing software tools on simulated data sets. These results suggest that diffHic is a viable approach for differential analyses of Hi-C data.