NetpathXL - An Excel Interface to the Program NETPATH
Parkhurst, David L.; Charlton, Scott R.
2008-01-01
NetpathXL is a revised version of NETPATH that runs under Windows® operating systems. NETPATH is a computer program that uses inverse geochemical modeling techniques to calculate net geochemical reactions that can account for changes in water composition between initial and final evolutionary waters in hydrologic systems. The inverse models also can account for the isotopic composition of waters and can be used to estimate radiocarbon ages of dissolved carbon in ground water. NETPATH relies on an auxiliary database program, DB, to enter the chemical analyses and to perform speciation calculations that define total concentrations of elements, charge balance, and redox state of aqueous solutions that are then used in inverse modeling. Instead of DB, NetpathXL relies on Microsoft Excel® to enter the chemical analyses. The speciation calculation formerly included in DB is implemented within the program NetpathXL. A program, DBXL, can be used to translate files from the old DB format (.lon files) to NetpathXL spreadsheets, or to create new NetpathXL spreadsheets. Once users have a NetpathXL spreadsheet with the proper format, new spreadsheets can be generated by copying or saving NetpathXL spreadsheets. In addition, DBXL can convert NetpathXL spreadsheets to PHREEQC input files. New capabilities in PHREEQC (version 2.15) allow solution compositions to be written to a .lon file, and inverse models developed in PHREEQC to be written as NetpathXL .pat and model files. NetpathXL can open NetpathXL spreadsheets, NETPATH-format path files (.pat files), and NetpathXL-format path files (.pat files). Once the speciation calculations have been performed on a spreadsheet file or a .pat file has been opened, the NetpathXL calculation engine is identical to the original NETPATH. Development of models and viewing of results in NetpathXL rely on keyboard entry, as in NETPATH.
ExcelAutomat: a tool for systematic processing of files as applied to quantum chemical calculations
NASA Astrophysics Data System (ADS)
Laloo, Jalal Z. A.; Laloo, Nassirah; Rhyman, Lydia; Ramasami, Ponnadurai
2017-07-01
The processing of the input and output files of quantum chemical calculations often necessitates a spreadsheet as a key component of the workflow. Spreadsheet packages with a built-in programming language editor can automate the steps involved and thus provide a direct link between processing files and the spreadsheet. This helps to reduce user-interventions as well as the need to switch between different programs to carry out each step. The ExcelAutomat tool is the implementation of this method in Microsoft Excel (MS Excel) using the default Visual Basic for Application (VBA) programming language. The code in ExcelAutomat was adapted to work with the platform-independent open-source LibreOffice Calc, which also supports VBA. ExcelAutomat provides an interface through the spreadsheet to automate repetitive tasks such as merging input files, splitting, parsing and compiling data from output files, and generation of unique filenames. Selected extracted parameters can be retrieved as variables which can be included in custom codes for a tailored approach. ExcelAutomat works with Gaussian files and is adapted for use with other computational packages including the non-commercial GAMESS. ExcelAutomat is available as a downloadable MS Excel workbook or as a LibreOffice workbook.
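As a rough illustration of the kind of batch processing ExcelAutomat automates (the tool itself is written in VBA inside an Excel or LibreOffice workbook), the following Python sketch scans a folder of quantum-chemistry output files, pulls one value per file with a regular expression, and compiles the results into a spreadsheet-readable CSV. The folder layout, file pattern, and the "SCF Done" energy line are assumptions made for illustration, not part of ExcelAutomat itself.

import csv
import glob
import re

# Hypothetical example: extract the last "SCF Done" energy from each
# Gaussian-style .log file in a folder and compile the values into a CSV
# that can be opened directly in Excel or LibreOffice Calc.
energy_re = re.compile(r"SCF Done:\s+E\(\S+\)\s+=\s+(-?\d+\.\d+)")

rows = []
for path in sorted(glob.glob("outputs/*.log")):   # assumed folder layout
    last_energy = None
    with open(path, "r", errors="ignore") as fh:
        for line in fh:
            match = energy_re.search(line)
            if match:
                last_energy = float(match.group(1))  # keep the final occurrence
    rows.append([path, last_energy])

with open("compiled_energies.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["file", "scf_energy_hartree"])
    writer.writerows(rows)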
Definition and maintenance of a telemetry database dictionary
NASA Technical Reports Server (NTRS)
Knopf, William P. (Inventor)
2007-01-01
A telemetry dictionary database includes a component for receiving spreadsheet workbooks of telemetry data over a web-based interface from other computer devices. Another component routes the spreadsheet workbooks to a specified directory on the host processing device. A process then checks the received spreadsheet workbooks for errors, and if no errors are detected the spreadsheet workbooks are routed to another directory to await initiation of a remote database loading process. The loading process first converts the spreadsheet workbooks to comma separated value (CSV) files. Next, a network connection with the computer system that hosts the telemetry dictionary database is established and the CSV files are ported to the computer system that hosts the telemetry dictionary database. This is followed by a remote initiation of a database loading program. Upon completion of loading a flatfile generation program is manually initiated to generate a flatfile to be used in a mission operations environment by the core ground system.
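The conversion step described above (spreadsheet workbooks converted to CSV before being ported to the database host) can be sketched in Python. This is a generic illustration using the openpyxl library rather than the system's actual loader, and the directory and file names are hypothetical.

import csv
from pathlib import Path

from openpyxl import load_workbook  # third-party: pip install openpyxl

def workbook_to_csv(xlsx_path: Path, out_dir: Path) -> list:
    """Write each worksheet of a workbook to its own CSV file."""
    out_dir.mkdir(parents=True, exist_ok=True)
    written = []
    wb = load_workbook(xlsx_path, read_only=True, data_only=True)
    for ws in wb.worksheets:
        csv_path = out_dir / f"{xlsx_path.stem}_{ws.title}.csv"
        with open(csv_path, "w", newline="") as fh:
            writer = csv.writer(fh)
            for row in ws.iter_rows(values_only=True):
                writer.writerow(row)
        written.append(csv_path)
    return written

# Example (hypothetical paths): convert every workbook awaiting load.
for xlsx in Path("incoming").glob("*.xlsx"):
    workbook_to_csv(xlsx, Path("csv_staging"))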
Keemei: cloud-based validation of tabular bioinformatics file formats in Google Sheets.
Rideout, Jai Ram; Chase, John H; Bolyen, Evan; Ackermann, Gail; González, Antonio; Knight, Rob; Caporaso, J Gregory
2016-06-13
Bioinformatics software often requires human-generated tabular text files as input and has specific requirements for how those data are formatted. Users frequently manage these data in spreadsheet programs, which is convenient for researchers who are compiling the requisite information because the spreadsheet programs can easily be used on different platforms including laptops and tablets, and because they provide a familiar interface. It is increasingly common for many different researchers to be involved in compiling these data, including study coordinators, clinicians, lab technicians and bioinformaticians. As a result, many research groups are shifting toward using cloud-based spreadsheet programs, such as Google Sheets, which support the concurrent editing of a single spreadsheet by different users working on different platforms. Most of the researchers who enter data are not familiar with the formatting requirements of the bioinformatics programs that will be used, so validating and correcting file formats is often a bottleneck prior to beginning bioinformatics analysis. We present Keemei, a Google Sheets Add-on, for validating tabular files used in bioinformatics analyses. Keemei is available free of charge from Google's Chrome Web Store. Keemei can be installed and run on any web browser supported by Google Sheets. Keemei currently supports the validation of two widely used tabular bioinformatics formats, the Quantitative Insights into Microbial Ecology (QIIME) sample metadata mapping file format and the Spatially Referenced Genetic Data (SRGD) format, but is designed to easily support the addition of others. Keemei will save researchers time and frustration by providing a convenient interface for tabular bioinformatics file format validation. By allowing everyone involved with data entry for a project to easily validate their data, it will reduce the validation and formatting bottlenecks that are commonly encountered when human-generated data files are first used with a bioinformatics system. Simplifying the validation of essential tabular data files, such as sample metadata, will reduce common errors and thereby improve the quality and reliability of research outcomes.
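Keemei itself runs as a Google Sheets Add-on, but the kind of checks it performs on sample metadata (required columns present, unique sample identifiers, no blank key fields) can be sketched for a local tab-separated file. The column names below are assumptions chosen for illustration, not Keemei's exact rule set.

import csv

REQUIRED = ["#SampleID", "BarcodeSequence", "Description"]  # assumed columns

def validate_mapping_file(path: str) -> list:
    """Return a list of human-readable validation problems (empty if clean)."""
    problems = []
    with open(path, newline="") as fh:
        reader = csv.DictReader(fh, delimiter="\t")
        header = reader.fieldnames or []
        for col in REQUIRED:
            if col not in header:
                problems.append(f"missing required column: {col}")
        seen_ids = set()
        for line_no, row in enumerate(reader, start=2):
            sample_id = (row.get("#SampleID") or "").strip()
            if not sample_id:
                problems.append(f"line {line_no}: blank sample ID")
            elif sample_id in seen_ids:
                problems.append(f"line {line_no}: duplicate sample ID {sample_id}")
            seen_ids.add(sample_id)
    return problems

print(validate_mapping_file("mapping.tsv"))  # hypothetical file name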
Diary of a Conversion--Lotus 1-2-3 to Symphony 1.1.
ERIC Educational Resources Information Center
Dunnewin, Larry
1986-01-01
Describes the uses of Lotus 1-2-3 (a spreadsheet-graphics-database program created by Lotus Development Corporation) and Symphony 1.1 (a refinement and expansion of Symphony 1.01 providing memory efficiency, speed, ease of use, and greater file compatibility). Spreadsheet and graphics capabilities, the use of windows, the database environment, and…
NASA Astrophysics Data System (ADS)
Al-Mishwat, Ali T.
2016-05-01
PHASS99 is a FORTRAN program designed to retrieve and decode radiometric and other physical age information of igneous rocks contained in the international database IGBADAT (Igneous Base Data File). In the database, ages are stored in a proprietary format using mnemonic representations. The program can handle up to 99 ages per igneous rock specimen and caters to forty radiometric age systems. The radiometric age alphanumeric strings assigned to each specimen description in the database consist of four components: the numeric age and its exponential modifier, a four-character mnemonic method identification, a two-character mnemonic name of the analysed material, and the reference number in the rock group bibliography vector. For each specimen, the program searches for radiometric age strings, extracts them, parses them, decodes the different age components, and converts them to high-level English equivalents. IGBADAT and similarly structured files are used for input. The output includes three files: a flat raw ASCII text file containing the retrieved radiometric age information, a generic spreadsheet-compatible file for data import to spreadsheets, and an error file. PHASS99 builds on the old TSTPHA (Test Physical Age) decoder program and greatly expands its capabilities. PHASS99 is simple, user friendly, fast, and efficient, and does not require users to have knowledge of programming.
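Because the exact IGBADAT string layout is proprietary and not reproduced here, the following Python fragment only illustrates the general decode-and-export pattern the abstract describes: parse an age string into its four components, translate the mnemonics through lookup tables, and write a spreadsheet-compatible row. The string layout, field widths, and mnemonics are invented for the example and do not reflect the real database.

import csv

# Hypothetical mnemonic dictionaries; the real program caters to 40 methods.
METHODS = {"KAAR": "K-Ar", "RBSR": "Rb-Sr", "UPBZ": "U-Pb (zircon)"}
MATERIALS = {"WR": "whole rock", "BI": "biotite", "ZR": "zircon"}

def decode_age(record: str) -> dict:
    """Decode an invented fixed-width age string such as '1.23E2KAARWR07'."""
    value = float(record[0:6])            # numeric age with exponential modifier
    method = METHODS.get(record[6:10], record[6:10])
    material = MATERIALS.get(record[10:12], record[10:12])
    reference = int(record[12:14])        # index into the bibliography vector
    return {"age_ma": value, "method": method,
            "material": material, "reference": reference}

with open("ages.csv", "w", newline="") as out:
    writer = csv.DictWriter(out, fieldnames=["age_ma", "method", "material", "reference"])
    writer.writeheader()
    writer.writerow(decode_age("1.23E2KAARWR07"))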
ListingAnalyst: A program for analyzing the main output file from MODFLOW
Winston, Richard B.; Paulinski, Scott
2014-01-01
ListingAnalyst is a Windows® program for viewing the main output file from MODFLOW-2005, MODFLOW-NWT, or MODFLOW-LGR. It organizes and displays large files quickly without using excessive memory. The sections and subsections of the file are displayed in a tree-view control, which allows the user to navigate quickly to desired locations in the files. ListingAnalyst gathers error and warning messages scattered throughout the main output file and displays them all together in an error and a warning tab. A grid view displays tables in a readable format and allows the user to copy the table into a spreadsheet. The user can also search the file for terms of interest.
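ListingAnalyst's collection of error and warning messages can be approximated by a simple scan of the MODFLOW listing file; the sketch below gathers every line containing "ERROR" or "WARNING" together with its line number. This is a generic illustration under that assumption, not the program's actual parsing logic, and the file name is hypothetical.

def collect_messages(listing_path: str):
    """Return (errors, warnings) found in a MODFLOW main output (listing) file."""
    errors, warnings = [], []
    with open(listing_path, "r", errors="ignore") as fh:
        for number, line in enumerate(fh, start=1):
            text = line.rstrip()
            upper = text.upper()
            if "ERROR" in upper:
                errors.append((number, text))
            elif "WARNING" in upper:
                warnings.append((number, text))
    return errors, warnings

errors, warnings = collect_messages("model.list")  # hypothetical file name
print(f"{len(errors)} errors, {len(warnings)} warnings")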
Crovelli, Robert A.; revised by Charpentier, Ronald R.
2012-01-01
The U.S. Geological Survey (USGS) periodically assesses petroleum resources of areas within the United States and the world. The purpose of this report is to explain the development of an analytic probabilistic method and spreadsheet software system called Analytic Cell-Based Continuous Energy Spreadsheet System (ACCESS). The ACCESS method is based upon mathematical equations derived from probability theory. The ACCESS spreadsheet can be used to calculate estimates of the undeveloped oil, gas, and NGL (natural gas liquids) resources in a continuous-type assessment unit. An assessment unit is a mappable volume of rock in a total petroleum system. In this report, the geologic assessment model is defined first, the analytic probabilistic method is described second, and the spreadsheet ACCESS is described third. In this revised version of Open-File Report 00-044, the text has been updated to reflect modifications that were made to the ACCESS program. Two versions of the program are added as appendixes.
Station Program Note Pull Automation
NASA Technical Reports Server (NTRS)
Delgado, Ivan
2016-01-01
Upon commencement of my internship, I was in charge of maintaining the CoFR (Certificate of Flight Readiness) Tool. The tool acquires data from existing Excel workbooks on NASA's and Boeing's databases to create a new spreadsheet listing all the potential safety concerns for upcoming flights and software transitions. Since the application was written in Visual Basic, I had to learn a new programming language and prepare to handle any malfunctions within the program. Shortly afterwards, I was given the assignment to automate the Station Program Note (SPN) Pull process. I developed an application, in Python, that generated a GUI (Graphical User Interface) that will be used by the International Space Station Safety & Mission Assurance team here at Johnson Space Center. The application will allow its users to download online files with the click of a button, import SPNs based on three different pulls, instantly manipulate and filter spreadsheets, and compare the three sources to determine which active SPNs (Station Program Notes) must be reviewed for any upcoming flights, missions, and/or software transitions. Initially, to perform the NASA SPN pull (one of three), I had created the program to allow the user to log in to a secure webpage that stores data, input specific parameters, and retrieve the desired SPNs based on their inputs. However, to avoid any conflicts with sustainment, I altered it so that the user may log in and download the NASA file independently. After the user has downloaded the file with the click of a button, I defined the program to check for any outdated or pre-existing files and for successful downloads, to acquire the spreadsheet, and to convert it from a text file to a comma-separated file and finally into an Excel spreadsheet to be filtered and later scrutinized for specific SPN numbers. Once this file has been automatically manipulated to provide only the SPN numbers that are desired, they are stored in a global variable, shown on the GUI, and transferred over to a new Excel worksheet for comparison. I managed to get my application to acquire the CSWG (Computer Safety Working Group) and the SPNWG (Space Station Working Group) SPNs with just two mouse clicks for each pull, as opposed to several from the original process. When all three pulls are performed, an Excel sheet containing all three different results will be generated for the user to compare and determine which SPNs will be presented or reviewed the following month. The experience from this internship has been spectacular. As a high school senior who will begin attending college in the fall, this internship has been both educationally and occupationally beneficial. The internship has allowed me the opportunity to learn new programming languages, to effectively network with NASA personnel from a variety of departments at JSC, and to learn new professional skills and etiquette. My internship at NASA's Johnson Space Center has further motivated me to pursue a Master's degree in Software Engineering and strive for a prosperous career with NASA as a civil servant.
Spreadsheet-based program for alignment of overlapping DNA sequences.
Anbazhagan, R; Gabrielson, E
1999-06-01
Molecular biology laboratories frequently face the challenge of aligning small overlapping DNA sequences derived from a long DNA segment. Here, we present a short program that can be used to adapt Excel spreadsheets as a tool for aligning DNA sequences, regardless of their orientation. The program runs on any Windows or Macintosh operating system computer with Excel 97 or Excel 98. The program is available for use as an Excel file, which can be downloaded from the BioTechniques Web site. Upon execution, the program opens a specially designed customized workbook and is capable of identifying overlapping regions between two sequence fragments and displaying the sequence alignment. It also performs a number of specialized functions such as recognition of restriction enzyme cutting sites and CpG island mapping without costly specialized software.
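The overlap detection the program performs inside Excel can be sketched in a language-independent way; the short Python routine below finds the longest suffix of one fragment that matches a prefix of the other, checking both the given orientation and the reverse complement. It is a minimal illustration of the idea under those assumptions, not a port of the published VBA code.

def reverse_complement(seq: str) -> str:
    return seq.translate(str.maketrans("ACGTacgt", "TGCAtgca"))[::-1]

def longest_overlap(left: str, right: str) -> int:
    """Length of the longest suffix of `left` equal to a prefix of `right`."""
    best = 0
    max_len = min(len(left), len(right))
    for length in range(1, max_len + 1):
        if left[-length:] == right[:length]:
            best = length
    return best

def align_fragments(a: str, b: str):
    """Try both orientations of `b` and report the better overlap."""
    forward = longest_overlap(a, b)
    reverse = longest_overlap(a, reverse_complement(b))
    if forward >= reverse:
        return "forward", forward
    return "reverse-complement", reverse

print(align_fragments("ACGTTAGGCC", "GGCCTTAA"))  # ('forward', 4)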
A Computer-Based Laboratory Project for the Study of Stimulus Generalization and Peak Shift
ERIC Educational Resources Information Center
Derenne, Adam; Loshek, Eevett
2009-01-01
This paper describes materials designed for classroom projects on stimulus generalization and peak shift. A computer program (originally written in QuickBASIC) is used for data collection and a Microsoft Excel file with macros organizes the raw data on a spreadsheet and creates generalization gradients. The program is designed for use with human…
Dynamic Modeling for Development and Education: From Concepts to Numbers
ERIC Educational Resources Information Center
Van Geert, Paul
2014-01-01
The general aim of the article is to teach the reader how to transform conceptual models of change, development, and learning into mathematical expressions and how to use these equations to build dynamic models by means of the widely used spreadsheet program Excel. The explanation is supported by a number of Excel files, which the reader can…
National geochronological and natural radioelement data bases
Zartman, Robert E.; Bush, Charles A.; Abston, C.C.
1995-01-01
This CD-ROM contains both the National Geochronological Data Base [NGDB] and the Natural Radioelement Data Base [NRDB]. Supporting location, geologic, and reference information is provided for both data bases. The NGDB is a compilation of more than 30,000 individual published Pb-alpha, fission-track, K-Ar, Rb-Sr, U-Th-Pb, and Sm-Nd rock and mineral ages reported on approximately 18,000 dated samples from the United States. A program is provided to search the data files by latitude and longitude, state, analytical method, and age range. The NGDB is provided as quote-comma delimited files that can be entered into most commercial spreadsheet programs. The NRDB gives gamma-ray spectrometric analyses of the natural radioelements (U, Th, and K) for more than 8500 whole-rock samples obtained under the USGS Natural Radioelement Distribution Project. A program is provided to search the data files by state, keyword, U content, Th content, and K content.
Development of a spreadsheet for SNPs typing using Microsoft EXCEL.
Hashiyada, Masaki; Itakura, Yukio; Takahashi, Shirushi; Sakai, Jun; Funayama, Masato
2009-04-01
Single-nucleotide polymorphisms (SNPs) have some characteristics that make them very appropriate for forensic studies and applications. In our institute, SNP typing was performed with the TaqMan SNP Genotyping Assays using the ABI PRISM 7500 FAST Real-Time PCR System (Applied Biosystems) and Sequence Detection Software ver. 1.4 (Applied Biosystems). The TaqMan method required two positive controls (Allele 1 and Allele 2) and one negative control to analyze each SNP locus. Therefore, up to 24 loci of a person can be analyzed on a 96-well plate at the same time. If SNP analysis is to be applied to biometric authentication, 48 or more loci are required to identify a person. In this study, we designed a spreadsheet package using Microsoft EXCEL, and population data were used from our 120-SNP population studies. On the spreadsheet, we defined SNP types using 'template files' instead of positive and negative controls. The 'template files' consisted of the results of 94 unknown samples and two negative controls for each of the 120 SNP loci we had previously studied. By the use of these files, the spreadsheet could analyze 96 SNPs on a 96-well plate simultaneously.
Web-based X-ray quality control documentation.
David, George; Burnett, Lou Ann; Schenkel, Robert
2003-01-01
The department of radiology at the Medical College of Georgia Hospital and Clinics has developed an equipment quality control web site. Our goal is to provide immediate access to virtually all medical physics survey data. The web site is designed to assist equipment engineers, department management and technologists. By improving communications and access to equipment documentation, we believe productivity is enhanced. The creation of the quality control web site was accomplished in three distinct steps. First, survey data had to be placed in a computer format. The second step was to convert these various computer files to a format supported by commercial web browsers. Third, a comprehensive home page had to be designed to provide convenient access to the multitude of surveys done in the various x-ray rooms. Because we had spent years previously fine-tuning the computerization of the medical physics quality control program, most survey documentation was already in spreadsheet or database format. A major technical decision was the method of conversion of survey spreadsheet and database files into documentation appropriate for the web. After an unsatisfactory experience with a HyperText Markup Language (HTML) converter (packaged with spreadsheet and database software), we tried creating Portable Document Format (PDF) files using Adobe Acrobat software. This process preserves the original formatting of the document and takes no longer than conventional printing; therefore, it has been very successful. Although the PDF file generated by Adobe Acrobat is a proprietary format, it can be displayed through a conventional web browser using the freely distributed Adobe Acrobat Reader program that is available for virtually all platforms. Once a user installs the software, it is automatically invoked by the web browser whenever the user follows a link to a file with a PDF extension. Although no confidential patient information is available on the web site, our legal department recommended that we secure the site in order to keep out those wishing to make mischief. Our interim solution has been not to password-protect the page, which we feared would hinder access for occasional legitimate users, but simply not to provide links to it from other hospital and department pages. Utility and productivity were improved, and time and money were saved, by making radiological equipment quality control documentation instantly available on-line.
Engine Icing Data - An Analytics Approach
NASA Technical Reports Server (NTRS)
Fitzgerald, Brooke A.; Flegel, Ashlie B.
2017-01-01
Engine icing researchers at the NASA Glenn Research Center use the Escort data acquisition system in the Propulsion Systems Laboratory (PSL) to generate and collect a tremendous amount of data every day. Currently these researchers spend countless hours processing and formatting their data, selecting important variables, and plotting relationships between variables, all by hand, generally analyzing data in a spreadsheet-style program (such as Microsoft Excel). Though spreadsheet-style analysis is familiar and intuitive to many, processing data in spreadsheets is often unreproducible and small mistakes are easily overlooked. Spreadsheet-style analysis is also time inefficient. The same formatting, processing, and plotting procedure has to be repeated for every dataset, which leads to researchers performing the same tedious data munging process over and over instead of making discoveries within their data. This paper documents a data analysis tool written in Python hosted in a Jupyter notebook that vastly simplifies the analysis process. From the file path of any folder containing time series datasets, this tool batch loads every dataset in the folder, processes the datasets in parallel, and ingests them into a widget where users can search for and interactively plot subsets of columns in a number of ways with a click of a button, easily and intuitively comparing their data and discovering interesting dynamics. Furthermore, comparing variables across data sets and integrating video data (while extremely difficult with spreadsheet-style programs) is quite simplified in this tool. This tool has also gathered interest outside the engine icing branch, and will be used by researchers across NASA Glenn Research Center. This project exemplifies the enormous benefit of automating data processing, analysis, and visualization, and will help researchers move from raw data to insight in a much smaller time frame.
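A stripped-down version of the batch-load-and-plot workflow described above can be written in a few lines of pandas. The sketch below loads every CSV in a folder, keeps the frames in a dictionary keyed by file name, and plots one channel from each run; the folder name, column names, and time index are assumptions for illustration, and the actual PSL tool adds parallel loading and an interactive widget interface on top of this pattern.

from pathlib import Path

import matplotlib.pyplot as plt
import pandas as pd

data_dir = Path("escort_runs")          # assumed folder of time-series CSV files
datasets = {p.stem: pd.read_csv(p) for p in sorted(data_dir.glob("*.csv"))}

fig, ax = plt.subplots()
for name, frame in datasets.items():
    # Column names are hypothetical; real Escort channel names differ.
    ax.plot(frame["time_s"], frame["fan_speed_rpm"], label=name)

ax.set_xlabel("time (s)")
ax.set_ylabel("fan speed (rpm)")
ax.legend()
plt.show()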
DOE Office of Scientific and Technical Information (OSTI.GOV)
Originally developed in 1999, an updated version 8.8.0 with bug fixes was released on September 30th, 2017. EnergyPlus™ is a whole building energy simulation program that engineers, architects, and researchers use to model both energy consumption (for heating, cooling, ventilation, lighting, and plug and process loads) and water use in buildings. EnergyPlus is a console-based program that reads input and writes output to text files. It ships with a number of utilities including IDF-Editor for creating input files using a simple spreadsheet-like interface, EP-Launch for managing input and output files and performing batch simulations, and EP-Compare for graphically comparing the results of two or more simulations. Several comprehensive graphical interfaces for EnergyPlus are also available. DOE does most of its work with EnergyPlus using the OpenStudio® software development kit and suite of applications. DOE releases major updates to EnergyPlus twice annually.
Spreadsheets Answer "What If...?
ERIC Educational Resources Information Center
Pogge, Alfred F.; Lunetta, Vincent N.
1987-01-01
Demonstrates how a spreadsheet program can do calculations, freeing students to question, analyze data and learn science. Notes several popular spreadsheet programs. Gives an example using Lotus 1-2-3 spreadsheets for a sampling experiment in Biology. Shows other examples of spreadsheet use in laboratory activities. (CW)
Hydroshear Simulation Lab Test 2
Bauer, Steve
2014-08-01
This data file is for test 2. In this test, a sample of granite with a pre-cut (man-made) fracture is confined, heated, and subjected to differential stress. The maximum temperature in this system development test is 95 °C. Test details are on the spreadsheets; note that there are two spreadsheets.
XLWrap - Querying and Integrating Arbitrary Spreadsheets with SPARQL
NASA Astrophysics Data System (ADS)
Langegger, Andreas; Wöß, Wolfram
In this paper a novel approach is presented for generating RDF graphs of arbitrary complexity from various spreadsheet layouts. Currently, none of the available spreadsheet-to-RDF wrappers supports cross tables and tables where data is not aligned in rows. Similar to RDF123, XLWrap is based on template graphs where fragments of triples can be mapped to specific cells of a spreadsheet. Additionally, it features a full expression algebra based on the syntax of OpenOffice Calc and various shift operations, which can be used to repeat similar mappings in order to wrap cross tables including multiple sheets and spreadsheet files. The set of available expression functions includes most of the native functions of OpenOffice Calc and can be easily extended by users of XLWrap.
Fallon, Nevada FORGE Thermal-Hydrological-Mechanical Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blankenship, Doug; Sonnenthal, Eric
Archive contains thermal-mechanical simulation input/output files. Included are files which fall into the following categories: (1) Spreadsheets with various input parameter calculations; (2) Final Simulation Inputs; (3) Native-State Thermal-Hydrological Model Input File Folders; (4) Native-State Thermal-Hydrological-Mechanical Model Input Files; (5) THM Model Stimulation Cases. See the 'File Descriptions.xlsx' resource below for additional information on individual files.
Forensic Analysis of Compromised Computers
NASA Technical Reports Server (NTRS)
Wolfe, Thomas
2004-01-01
Directory Tree Analysis File Generator is a Practical Extraction and Reporting Language (PERL) script that simplifies and automates the collection of information for forensic analysis of compromised computer systems. During such an analysis, it is sometimes necessary to collect and analyze information about files on a specific directory tree. Directory Tree Analysis File Generator collects information of this type (except information about directories) and writes it to a text file. In particular, the script asks the user for the root of the directory tree to be processed, the name of the output file, and the number of subtree levels to process. The script then processes the directory tree and puts out the aforementioned text file. The format of the text file is designed to enable the submission of the file as input to a spreadsheet program, wherein the forensic analysis is performed. The analysis usually consists of sorting files and examination of such characteristics of files as ownership, time of creation, and time of most recent access, all of which characteristics are among the data included in the text file.
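The script's output (one record per file, ready for import into a spreadsheet) can be emulated with Python's standard library. The sketch below walks a directory tree to a given depth and writes size and timestamp columns to a CSV file; file ownership, which the PERL script also records, is omitted here for portability, and the field list is an assumption rather than the original script's exact format.

import csv
import os
from datetime import datetime, timezone

def dump_tree(root: str, out_csv: str, max_depth: int = 3) -> None:
    """Write one CSV row per file under `root`, down to `max_depth` levels."""
    root = os.path.abspath(root)
    with open(out_csv, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["path", "size_bytes", "modified_utc", "accessed_utc"])
        for dirpath, dirnames, filenames in os.walk(root):
            depth = os.path.relpath(dirpath, root).count(os.sep)
            if depth >= max_depth:
                dirnames[:] = []          # stop descending past the requested depth
            for name in filenames:
                full = os.path.join(dirpath, name)
                try:
                    st = os.stat(full)
                except OSError:
                    continue              # skip files that vanish or are unreadable
                writer.writerow([
                    full,
                    st.st_size,
                    datetime.fromtimestamp(st.st_mtime, tz=timezone.utc).isoformat(),
                    datetime.fromtimestamp(st.st_atime, tz=timezone.utc).isoformat(),
                ])

dump_tree(".", "tree_report.csv")  # hypothetical invocation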
Tools for Requirements Management: A Comparison of Telelogic DOORS and the HiVe
2006-07-01
... types DOORS deals with are text files, spreadsheets, FrameMaker, rich text, Microsoft Word, and Microsoft Project. 2.5.1 Predefined file formats: DOORS ... during the export. DOORS exports FrameMaker files in an incomplete format, meaning DOORS-exported files will have to be opened in FrameMaker and saved ...
Improving Information Management at Mare Island Naval Shipyard.
1987-03-01
... copy reports [Ref. 8: pp. 1-4]. C. PRIME TOKEN RING: The prime ring is a token-type computer network linking five PRIME computers electronically. Each ... the PRIME net are for news (a bulletin board), electronic mail, word processing, and data filing. ... (communications application) This is a group of general-purpose programs that includes word processing, electronic mail, and spreadsheet applications. Access is ...
SEDIMENT DATA - ST. PAUL WATERWAY - TACOMA, WA - 1996 MONITORING DATA
Benthic Infauna Monitoring Data Files are Excel-format spreadsheet files which contain data presented in the St. Paul Waterway Area Remedial Action and Habitat Restoration Project, 1996 Monitoring Report. The files can be viewed directly or readily downloaded and read into most ...
Computer-based Astronomy Labs for Non-science Majors
NASA Astrophysics Data System (ADS)
Smith, A. B. E.; Murray, S. D.; Ward, R. A.
1998-12-01
We describe and demonstrate two laboratory exercises, Kepler's Third Law and Stellar Structure, which are being developed for use in an astronomy laboratory class aimed at non-science majors. The labs run with Microsoft's Excel 98 (Macintosh) or Excel 97 (Windows). They can be run in a classroom setting or in an independent learning environment. The intent of the labs is twofold; first and foremost, students learn the subject matter through a series of informational frames. Next, students enhance their understanding by applying their knowledge in lab procedures, while also gaining familiarity with the use and power of a widely-used software package and scientific tool. No mathematical knowledge beyond basic algebra is required to complete the labs or to understand the computations in the spreadsheets, although the students are exposed to the concepts of numerical integration. The labs are contained in Excel workbook files. In the files are multiple spreadsheets, which contain either a frame with information on how to run the lab, material on the subject, or one or more procedures. Excel's VBA macro language is used to automate the labs. The macros are accessed through button interfaces positioned on the spreadsheets. This is done intentionally so that students can focus on learning the subject matter and the basic spreadsheet features without having to learn advanced Excel features all at once. Students open the file and progress through the informational frames to the procedures. After each procedure, student comments and data are automatically recorded in a preformatted Lab Report spreadsheet. Once all procedures have been completed, the student is prompted for a filename in which to save their Lab Report. The lab reports can then be printed or emailed to the instructor. The files will have full worksheet and workbook protection, and will have a "redo" feature at the end of the lab for students who want to repeat a procedure.
Parkhurst, David L.; Appelo, C.A.J.
1999-01-01
PHREEQC version 2 is a computer program written in the C programming language that is designed to perform a wide variety of low-temperature aqueous geochemical calculations. PHREEQC is based on an ion-association aqueous model and has capabilities for (1) speciation and saturation-index calculations; (2) batch-reaction and one-dimensional (1D) transport calculations involving reversible reactions, which include aqueous, mineral, gas, solid-solution, surface-complexation, and ion-exchange equilibria, and irreversible reactions, which include specified mole transfers of reactants, kinetically controlled reactions, mixing of solutions, and temperature changes; and (3) inverse modeling, which finds sets of mineral and gas mole transfers that account for differences in composition between waters, within specified compositional uncertainty limits. New features in PHREEQC version 2 relative to version 1 include capabilities to simulate dispersion (or diffusion) and stagnant zones in 1D-transport calculations, to model kinetic reactions with user-defined rate expressions, to model the formation or dissolution of ideal, multicomponent or nonideal, binary solid solutions, to model fixed-volume gas phases in addition to fixed-pressure gas phases, to allow the number of surface or exchange sites to vary with the dissolution or precipitation of minerals or kinetic reactants, to include isotope mole balances in inverse modeling calculations, to automatically use multiple sets of convergence parameters, to print user-defined quantities to the primary output file and (or) to a file suitable for importation into a spreadsheet, and to define solution compositions in a format more compatible with spreadsheet programs. This report presents the equations that are the basis for chemical equilibrium, kinetic, transport, and inverse-modeling calculations in PHREEQC; describes the input for the program; and presents examples that demonstrate most of the program's capabilities.
Spreadsheet Toolkit for Ulysses Hi-Scale Measurements of Interplanetary Ions and Electrons
NASA Astrophysics Data System (ADS)
Reza, J. Z.; Lanzerotti, L. J.; Denker, C.; Patterson, D.; Amstrong, T. P.
2004-05-01
Throughout the entire Ulysses out-of-the-ecliptic solar polar mission, the Heliosphere Instrument for Spectra, Composition, and Anisotropy at Low Energies (HI-SCALE) has collected measurements of interplanetary ions and electrons. Time-series of electron and ion fluxes obtained since 1990 have been carefully calibrated and will be stored in a data management system, which will be publicly accessible via the WWW. The goal of the Virtual Solar Observatory (VSO) is to provide data uniformly and efficiently to a diverse user community. However, data dissemination can only be a first step, which has to be followed by a suite of data analysis tools that are tailored towards a diverse user community in science, technology, and education. The widespread use and familiarity of spreadsheets, which are available at low cost or open source for many operating systems, make them an interesting tool to investigate for the analysis of HI-SCALE data. The data are written in comma separated variable (CSV) format, which is commonly used in spreadsheet programs. CSV files can simply be linked as external data to spreadsheet templates, which in turn can be used to generate tables and figures of basic statistical properties and frequency distributions, temporal evolution of electron and ion spectra, comparisons of various energy channels, automatic detection of solar events, solar cycle variations, and space weather. Exploring spreadsheet-assisted data analysis in the context of information technology research, data base information search and retrieval, and data visualization potentially impacts other VSO components, where diverse user communities are targeted. Finally, this presentation is the result of an undergraduate research project, which will allow us to evaluate the performance of user-based spreadsheet analysis "benchmarked" at the undergraduate skill level.
The Evolution of Spreadsheets.
ERIC Educational Resources Information Center
Schuyler, Michael
1985-01-01
Discusses basic features and functions of spreadsheet programs and describes additional capabilities (editing, windowing, graphics, and word processing) of two second-generation spreadsheet programs: Lotus 1-2-3 and Symphony. (MBR)
Visual Basic programs for spreadsheet analysis.
Hunt, Bruce
2005-01-01
A collection of Visual Basic programs, entitled Function.xls, has been written for ground water spreadsheet calculations. This collection includes programs for calculating mathematical functions and for evaluating analytical solutions in ground water hydraulics and contaminant transport. Several spreadsheet examples are given to illustrate their use.
Great Basin NV Play Fairway Analysis - Carson Sink
Jim Faulds
2015-10-28
All datasets and products specific to the Carson Sink Basin. Includes a packed ArcMap (.mpk), individually zipped shapefiles, and a file geodatabase for the Carson Sink area; a GeoSoft Oasis montaj project containing GM-SYS 2D gravity profiles along the trace of our seismic reflection lines; a 3D model in EarthVision; spreadsheet of links to published maps; and spreadsheets of well data.
Jim Faulds
2015-10-29
All datasets and products specific to the Steptoe Valley model area. Includes a packed ArcMap project (.mpk), individually zipped shapefiles, and a file geodatabase for the northern Steptoe Valley area; a GeoSoft Oasis montaj project containing GM-SYS 2D gravity profiles along the trace of our seismic reflection lines; a 3D model in EarthVision; spreadsheet of links to published maps; and spreadsheets of well data.
PCDAQ, A Windows Based DAQ System
NASA Astrophysics Data System (ADS)
Hogan, Gary
1998-10-01
PCDAQ is a Windows NT based general DAQ/Analysis/Monte Carlo shell developed as part of the Proton Radiography project at LANL (Los Alamos National Laboratory). It has been adopted by experiments outside of the Proton Radiography project at Brookhaven National Laboratory (BNL) and at LANL. The program provides DAQ, Monte Carlo, and replay (disk file input) modes. Data can be read from hardware (CAMAC) or other programs (ActiveX servers). Future versions will read VME. User supplied data analysis routines can be written in Fortran, C++, or Visual Basic. Histogramming, testing, and plotting packages are provided. Histogram data can be exported to spreadsheets or analyzed in user supplied programs. Plots can be copied and pasted as bitmap objects into other Windows programs or printed. A text database keyed by the run number is provided. Extensive software control flags are provided so that the user can control the flow of data through the program. Control flags can be set either in script command files or interactively. The program can be remotely controlled and data accessed over the Internet through its ActiveX DCOM interface.
ISA-TAB-Nano: a specification for sharing nanomaterial research data in spreadsheet-based format.
Thomas, Dennis G; Gaheen, Sharon; Harper, Stacey L; Fritts, Martin; Klaessig, Fred; Hahn-Dantona, Elizabeth; Paik, David; Pan, Sue; Stafford, Grace A; Freund, Elaine T; Klemm, Juli D; Baker, Nathan A
2013-01-14
The high-throughput genomics communities have been successfully using standardized spreadsheet-based formats to capture and share data within labs and among public repositories. The nanomedicine community has yet to adopt similar standards to share the diverse and multi-dimensional types of data (including metadata) pertaining to the description and characterization of nanomaterials. Owing to the lack of standardization in representing and sharing nanomaterial data, most of the data currently shared via publications and data resources are incomplete, poorly-integrated, and not suitable for meaningful interpretation and re-use of the data. Specifically, in its current state, data cannot be effectively utilized for the development of predictive models that will inform the rational design of nanomaterials. We have developed a specification called ISA-TAB-Nano, which comprises four spreadsheet-based file formats for representing and integrating various types of nanomaterial data. Three file formats (Investigation, Study, and Assay files) have been adapted from the established ISA-TAB specification; while the Material file format was developed de novo to more readily describe the complexity of nanomaterials and associated small molecules. In this paper, we have discussed the main features of each file format and how to use them for sharing nanomaterial descriptions and assay metadata. The ISA-TAB-Nano file formats provide a general and flexible framework to record and integrate nanomaterial descriptions, assay data (metadata and endpoint measurements) and protocol information. Like ISA-TAB, ISA-TAB-Nano supports the use of ontology terms to promote standardized descriptions and to facilitate search and integration of the data. The ISA-TAB-Nano specification has been submitted as an ASTM work item to obtain community feedback and to provide a nanotechnology data-sharing standard for public development and adoption.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, T.
2000-07-01
The Write One, Run Many (WORM) site (worm.csirc.net) is the on-line home of the WORM language and is hosted by the Criticality Safety Information Resource Center (CSIRC) (www.csirc.net). The purpose of this web site is to create an on-line community for WORM users to gather, share, and archive WORM-related information. WORM is an embedded, functional programming language designed to facilitate the creation of input decks for computer codes that take standard ASCII text files as input. A functional programming language is one that emphasizes the evaluation of expressions, rather than execution of commands. The simplest and perhaps most common example of a functional language is a spreadsheet such as Microsoft Excel. The spreadsheet user specifies expressions to be evaluated, while the spreadsheet itself determines the commands to execute, as well as the order of execution/evaluation. WORM functions in a similar fashion and, as a result, is very simple to use and easy to learn. WORM improves the efficiency of today's criticality safety analyst by allowing: (1) input decks for parameter studies to be created quickly and easily; (2) calculations and variables to be embedded into any input deck, thus allowing for meaningful parameter specifications; (3) problems to be specified using any combination of units; and (4) complex mathematically defined models to be created. WORM is completely written in Perl. Running on all variants of UNIX, Windows, MS-DOS, MacOS, and many other operating systems, Perl is one of the most portable programming languages available. As such, WORM works on practically any computer platform.
Legacy literature-a need for virtual libraries
USDA-ARS?s Scientific Manuscript database
After years of conducting, writing up, and reviewing research, many entomologists have examined, organized, and annotated some 2-3 gigabytes of pdfs and 4-5 file cabinets of hard-copy articles, in addition to thousands of spreadsheets, docs, jpgs, and wav files of data. This is a useful legacy th...
Using Spreadsheets in the Management, Analysis, and Reporting of Evaluation Data.
ERIC Educational Resources Information Center
Glowacki, Margaret L.; Rice, Richard L., Jr.
Currently available spreadsheet programs for microcomputers provide many features that can be very useful for evaluators and researchers. Some of the basic concepts involved in spreadsheet use are introduced, and information is provided on the use of spreadsheets in maintaining and analyzing evaluation data. The spreadsheet used in the discussion…
How to Create Automatically Graded Spreadsheets for Statistics Courses
ERIC Educational Resources Information Center
LoSchiavo, Frank M.
2016-01-01
Instructors often use spreadsheet software (e.g., Microsoft Excel) in their statistics courses so that students can gain experience conducting computerized analyses. Unfortunately, students tend to make several predictable errors when programming spreadsheets. Without immediate feedback, programming errors are likely to go undetected, and as a…
Spreadsheet-Based Program for Simulating Atomic Emission Spectra
ERIC Educational Resources Information Center
Flannigan, David J.
2014-01-01
A simple Excel spreadsheet-based program for simulating atomic emission spectra from the properties of neutral atoms (e.g., energies and statistical weights of the electronic states, electronic partition functions, transition probabilities, etc.) is described. The contents of the spreadsheet (i.e., input parameters, formulas for calculating…
Problem Solving with Spreadsheets.
ERIC Educational Resources Information Center
Catterall, P.; Lewis, R.
1985-01-01
Documents the educational use of spreadsheets through a description of exploratory work which utilizes spreadsheets to achieve the objectives of Conway's Game of Life, a scientific method game for the development of problem-solving techniques. The implementation and classroom use of the spreadsheet programs are discussed. (MBR)
Production and Injection data for NV Binary facilities
Mines, Greg
2013-12-24
Excel files are provided with well production and injection data for binary facilities in Nevada. The files contain the data that are reported monthly to the Nevada Bureau of Mines and Geology (NBMG) by the facility operators. This data has been compiled into Excel spreadsheets for each of the facilities given on the NBMG web site.
Solving Optimization Problems with Spreadsheets
ERIC Educational Resources Information Center
Beigie, Darin
2017-01-01
Spreadsheets provide a rich setting for first-year algebra students to solve problems. Individual spreadsheet cells play the role of variables, and creating algebraic expressions for a spreadsheet to perform a task allows students to achieve a glimpse of how mathematics is used to program a computer and solve problems. Classic optimization…
ERIC Educational Resources Information Center
Gierdien, M. Faaiz
2014-01-01
This paper reports on the initial stages of a small-scale project involving the use of "spreadsheet algebra programs" in the professional development of eight teachers from three township high schools. In terms of the education context, the paper draws on social practice theory. It then details what is meant by spreadsheet algebra. An…
ARS-Media for excel instruction manual
USDA-ARS?s Scientific Manuscript database
ARS-Media for Excel Instruction Manual explains how to use the ARS-Media for Excel spreadsheet application. The manual is provided as a pdf file.
... its references list. To use SMARTS, users construct text files of 20-30 lines of simple text, and output consists of spreadsheet-compatible American Standard Code for Information Interchange (ASCII) text ...
Numerical Simulation of Multicomponent Chromatography Using Spreadsheets.
ERIC Educational Resources Information Center
Frey, Douglas D.
1990-01-01
Illustrated is the use of spreadsheet programs for implementing finite difference numerical simulations of chromatography as an instructional tool in a separations course. Discussed are differential equations, discretization and integration, spreadsheet development, computer requirements, and typical simulation results. (CW)
How Spreadsheets Boost Productivity.
ERIC Educational Resources Information Center
Ross, James
1988-01-01
Explains the use of computerized bookkeeping systems called spreadsheets to perform mathematical and accounting functions such as totaling expenditures, averaging test grades, and transferring funds. Advises about adapting spreadsheet programs and discusses several essential features, including linkage, macro functions, and sharing capabilities.…
Economic Comparison of Processes Using Spreadsheet Programs
NASA Technical Reports Server (NTRS)
Ferrall, J. F.; Pappano, A. W.; Jennings, C. N.
1986-01-01
Inexpensive approach aids plant-design decisions. Commercially available electronic spreadsheet programs aid economic comparison of different processes for producing particular end products. Facilitates plant-design decisions without requiring large expenditures for powerful mainframe computers.
Modeling the Milky Way: Spreadsheet Science.
ERIC Educational Resources Information Center
Whitmer, John C.
1990-01-01
Described is the generation of a scale model of the solar system and the milky way galaxy using a computer spreadsheet program. A sample spreadsheet including cell formulas is provided. Suggestions for using this activity as a teaching technique are included. (CW)
DOE Office of Scientific and Technical Information (OSTI.GOV)
St. Onge, Melinda
The Geothermal Resource Portfolio Optimization and Reporting Tool (GeoRePORT) was developed as a way to distill large amounts of geothermal project data into an objective, reportable data set that can be used to communicate with experts and non-experts. GeoRePORT summarizes (1) resource grade and certainty and (2) project readiness. This Excel file allows users to easily navigate through the resource grade attributes, using drop-down menus to pick grades and project readiness, and then easily print and share the summary with others. This spreadsheet is the first draft, for which we are soliciting expert feedback. The spreadsheet will be updated based on this feedback to increase usability of the tool. If you have any comments, please feel free to contact us.
Petrogenetic Modeling with a Spreadsheet Program.
ERIC Educational Resources Information Center
Holm, Paul Eric
1988-01-01
Describes how interactive programs for scientific modeling may be created by using spreadsheet software such as LOTUS 1-2-3. Lists the advantages of using this method. Discusses fractional distillation, batch partial melting, and combination models as examples. (CW)
Exploring Difference Equations with Spreadsheets.
ERIC Educational Resources Information Center
Walsh, Thomas P.
1996-01-01
When using spreadsheets to explore real-world problems involving periodic change, students can observe what happens at each period, generate a graph, and learn how changing the starting quantity or constants affects results. Spreadsheet lessons for high school students are presented that explore mathematical modeling, linear programming, and…
XAFS Data Interchange: A single spectrum XAFS data file format
NASA Astrophysics Data System (ADS)
Ravel, B.; Newville, M.
2016-05-01
We propose a standard data format for the interchange of XAFS data. The XAFS Data Interchange (XDI) standard is meant to encapsulate a single spectrum of XAFS along with relevant metadata. XDI is a text-based format with a simple syntax which clearly delineates metadata from the data table in a way that is easily interpreted both by a computer and by a human. The metadata header is inspired by the format of an electronic mail header, representing metadata names and values as an associative array. The data table is represented as columns of numbers. This format can be imported as is into most existing XAFS data analysis, spreadsheet, or data visualization programs. Along with a specification and a dictionary of metadata types, we provide an application-programming interface written in C and bindings for programming dynamic languages.
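Since the format is plain text with an email-style header followed by numeric columns, a minimal reader is easy to write without the official API. The Python sketch below splits a file into a metadata dictionary and a list of column arrays; it assumes '#'-prefixed 'name: value' header lines and whitespace-separated data, which is only an approximation of the XDI layout, not the normative specification, and the file and metadata names at the end are illustrative.

def read_xdi_like(path: str):
    """Split a header-plus-columns text file into metadata and numeric columns.

    Assumes header lines start with '#' and contain 'name: value' pairs; the
    remaining lines are whitespace-separated numbers. This approximates the
    XDI layout and is not a validating parser for the specification.
    """
    metadata = {}
    columns = []
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line:
                continue
            if line.startswith("#"):
                body = line.lstrip("#").strip()
                if ":" in body:
                    name, _, value = body.partition(":")
                    metadata[name.strip()] = value.strip()
                continue
            values = [float(token) for token in line.split()]
            if not columns:
                columns = [[] for _ in values]
            for col, value in zip(columns, values):
                col.append(value)
    return metadata, columns

meta, cols = read_xdi_like("spectrum.xdi")  # hypothetical file name
print(meta.get("Element.symbol"), len(cols), "columns")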
CEASAW: A User-Friendly Computer Environment Analysis for the Sawmill Owner
Guillermo Mendoza; William Sprouse; Philip A. Araman; William G. Luppold
1991-01-01
Improved spreadsheet software capabilities have brought optimization to users with little or no background in mathematical programming. Better interface capabilities of spreadsheet models now make it possible to combine optimization models with a spreadsheet system. Sawmill production and inventory systems possess many features that make them suitable application...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-11
... massive emails, word processing documents, PDF files, spreadsheets, presentations, database entries, and ... PURPOSES: OGC-EDMS provides OGC with a method to initiate, track, and manage the collection ...
The Use of Lotus 1-2-3 Macros in Engineering Calculations.
ERIC Educational Resources Information Center
Rosen, Edward M.
1990-01-01
Described are the use of spreadsheet programs in chemical engineering calculations using Lotus 1-2-3 macros. Discusses the macro commands, subroutine operations, and solution of partial differential equation. Provides examples of the subroutine programs and spreadsheet solution. (YP)
Charpentier, Ronald R.; Klett, T.R.; Obuch, R.C.; Brewton, J.D.
1996-01-01
This CD-ROM contains files in support of the 1995 USGS National assessment of United States oil and gas resources (DDS-30), which was published separately and summarizes the results of a 3-year study of the oil and gas resources of the onshore and state waters of the United States. The study describes about 560 oil and gas plays in the United States; confirmed and hypothetical, conventional and unconventional. A parallel study of the Federal offshore is being conducted by the U.S. Minerals Management Service. This CD-ROM contains files in multiple formats, so that almost any computer user can import them into word processors and spreadsheets. The tabular data include some tables not released in DDS-30. No proprietary data are released on this CD-ROM, but some tables of summary statistics from the proprietary files are provided. The complete text of DDS-30 is also available, as well as many figures. Also included are some of the programs used in the assessment, in source code and with supporting documentation. A companion CD-ROM (DDS-35) includes the map data and the same text data, but none of the tabular data or assessment programs.
ROE Carbon Storage - Forest Biomass
This polygon dataset depicts the density of forest biomass in counties across the United States, in terms of metric tons of carbon per square mile of land area. These data were provided in spreadsheet form by the U.S. Department of Agriculture (USDA) Forest Service. To produce the Web mapping application, EPA joined the spreadsheet with a shapefile of U.S. county (and county equivalent) boundaries downloaded from the U.S. Census Bureau. EPA calculated biomass density based on the area of each county polygon. These data sets were converted into a single polygon feature class inside a file geodatabase.
Gravity Data for West-Central Colorado
Richard Zehner
2012-04-06
Modeled Bouguer-corrected gravity data were extracted from the Pan American Center for Earth and Environmental Studies Gravity Database of the U.S. at http://irpsrvgis08.utep.edu/viewers/Flex/GravityMagnetic/GravityMagnetic_CyberShare/ on 2/29/2012. The downloaded text file was opened in an Excel spreadsheet. The spreadsheet data were then converted into an ESRI point shapefile in UTM Zone 13 NAD27 projection, showing location and gravity (in milligals). The point data were then converted to a grid and contoured using ESRI Spatial Analyst. Data are from the University of Texas Pan American Center for Earth and Environmental Studies.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-23
... a separate document, our preferred file format is Microsoft Word. If you attach multiple comments (such as form letters), our preferred format is a Microsoft Excel spreadsheet. (2) By Hard Copy: Submit...
Contains frequently asked questions: Is there an Email Support Group for WASP, Do I Need Admin Rights to Install, How to Run WASP after Installation, Can I use my WASP7 File, Attaching to an Excel Spreadsheet or Access Database, Converting QUAL2K to WASP
Introduction to Classroom Sprego
ERIC Educational Resources Information Center
Csernoch, Mária; Biró, Piroska
2016-01-01
Sprego is programming with spreadsheet functions. The present paper provides introductory Sprego examples which have so far only been available in Hungarian. Spreadsheet environments offer both a programming tool which best serves beginner and end-user programmers' interest, and an approach which lightens the burden of coding and language details.…
Ellis, Alisha M.; Marot, Marci E.; Wheaton, Cathryn J.; Bernier, Julie C.; Smith, Christopher G.
2016-02-03
This report is an archive for sedimentological data derived from the surface sediment of Chincoteague Bay. Data are available for the spring (March/April 2014) and fall (October 2014) samples collected. Downloadable data are provided as Excel spreadsheets and as JPEG files. Additional files include ArcGIS shapefiles of the sampling sites, detailed results of sediment grain-size analyses, and formal Federal Geographic Data Committee metadata (data downloads).
NASA Astrophysics Data System (ADS)
Gaik Tay, Kim; Cheong, Tau Han; Foong Lee, Ming; Kek, Sie Long; Abdul-Kahar, Rosmila
2017-08-01
In previous work on an Euler's-method spreadsheet calculator for solving an ordinary differential equation, Visual Basic for Applications (VBA) programming was used, but no graphical user interface was developed to capture user input. This can confuse users, because input and output are displayed in the same worksheet. In addition, the existing calculator is not interactive: no prompt message appears when a parameter is entered incorrectly, and there are no instructions to guide users in entering the derivative function. In this paper, we address these limitations by developing a user-friendly and interactive graphical user interface that captures user input with instructions and interactive error messages, implemented in VBA. The calculator does not act as a black box: users can click on any cell in the worksheet to see the formula used to implement the numerical scheme. In this way, it can enhance self-learning and life-long learning in implementing the numerical scheme in a spreadsheet and, later, in any programming language.
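The calculator described above implements Euler's method in VBA inside a worksheet; the following is a minimal Python sketch of the scheme itself, with an illustrative test equation and step size that are not taken from the paper.

    def euler(f, x0, y0, h, n):
        """Advance y' = f(x, y) from (x0, y0) using n Euler steps of size h."""
        x, y = x0, y0
        points = [(x, y)]
        for _ in range(n):
            y = y + h * f(x, y)   # Euler update: y_{k+1} = y_k + h * f(x_k, y_k)
            x = x + h
            points.append((x, y))
        return points

    # Example: y' = x + y with y(0) = 1, step 0.1, 10 steps
    approx = euler(lambda x, y: x + y, 0.0, 1.0, 0.1, 10)
    print(approx[-1])

In a spreadsheet the same update is typically written once in a row of cells and filled down, which is what makes the worksheet version transparent to students.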
Data acquisition and real-time control using spreadsheets: interfacing Excel with external hardware.
Aliane, Nourdine
2010-07-01
Spreadsheets have become a popular computational tool and a powerful platform for performing engineering calculations. Moreover, spreadsheets include a macro language, which permits the inclusion of standard computer code in worksheets, thereby enabling developers to greatly extend spreadsheets' capabilities by designing specific add-ins. This paper describes how to use Excel spreadsheets in conjunction with the Visual Basic for Applications programming language to perform data acquisition and real-time control. The paper then presents two Excel applications with interactive user interfaces developed for laboratory demonstrations and experiments in an introductory course in control. 2010 ISA. Published by Elsevier Ltd. All rights reserved.
A document-centric approach for developing the tolAPC ontology.
Blfgeh, Aisha; Warrender, Jennifer; Hilkens, Catharien M U; Lord, Phillip
2017-11-28
There are many challenges associated with ontology building, as the process often touches on many different subject areas; it needs knowledge of the problem domain, an understanding of the ontology formalism, the software in use and, sometimes, an understanding of the philosophical background. In practice, it is very rare that an ontology can be completed by a single person, as they are unlikely to combine all of these skills. So people with these skills must collaborate. One solution to this is to use face-to-face meetings, but these can be expensive and time-consuming for teams that are not co-located. Remote collaboration is possible, of course, but one difficulty here is that domain specialists use a wide variety of different "formalisms" to represent and share their data - by far the most common, however, is the "office file", either in the form of a word-processor document or a spreadsheet. Here we describe the development of an ontology of immunological cell types; this was initially developed by domain specialists using an Excel spreadsheet for collaboration. We have transformed this spreadsheet into an ontology using highly-programmatic and pattern-driven ontology development. Critically, the spreadsheet remains part of the source for the ontology; the domain specialists are free to update it, and changes will percolate to the end ontology. We have developed a new ontology describing immunological cell lines, built by instantiating ontology design patterns written programmatically, using values from a spreadsheet catalogue. This method employs a spreadsheet that was developed by domain experts. The spreadsheet is unconstrained in its usage and can be freely updated, resulting in a new ontology. This provides a general methodology for ontology development using data generated by domain specialists.
NASA Astrophysics Data System (ADS)
Le Roux, Jacobus P.; Demirbilek, Zeki; Brodalka, Marysia; Flemming, Burghard W.
2010-10-01
The generation and growth of waves in deep water is controlled by winds blowing over the sea surface. In fully developed sea states, where winds and waves are in equilibrium, wave parameters may be calculated directly from the wind velocity. We provide an Excel spreadsheet to compute the wave period, length, height and celerity, as well as horizontal and vertical particle velocities for any water depth, bottom slope, and distance below the reference water level. The wave profile and propagation can also be visualized for any water depth, modeling the sea surface change from sinusoidal to trochoidal and finally cnoidal profiles into shallow water. Bedload entrainment is estimated under both the wave crest and the trough, using the horizontal water particle velocity at the top of the boundary layer. The calculations are programmed in an Excel file called WAVECALC, which is available online to authorized users. Although many of the recently published formulas are based on theoretical arguments, the values agree well with several existing theories and limited field and laboratory observations. WAVECALC is a user-friendly program intended for sedimentologists, coastal engineers and oceanographers, as well as marine ecologists and biologists. It provides a rapid means to calculate many wave characteristics required in coastal and shallow marine studies, and can also serve as an educational tool.
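One of the standard calculations such a wave spreadsheet typically performs is solving the linear-theory dispersion relation L = (g T^2 / 2π) tanh(2π d / L) for wavelength L given period T and depth d; the following is a minimal Python sketch of that iteration, not a reproduction of the WAVECALC formulas.

    import math

    def wavelength(T, d, g=9.81, tol=1e-6):
        """Iteratively solve L = (g*T^2 / 2*pi) * tanh(2*pi*d / L) for wavelength L."""
        L = g * T**2 / (2 * math.pi)          # deep-water wavelength as first guess
        for _ in range(200):
            L_new = g * T**2 / (2 * math.pi) * math.tanh(2 * math.pi * d / L)
            if abs(L_new - L) < tol:
                break
            L = L_new
        return L

    # Example: 8 s period wave in 10 m of water
    print(wavelength(8.0, 10.0))   # on the order of 70 m

In a spreadsheet the same fixed-point iteration is usually carried out in a short column of cells or with a circular-reference/goal-seek calculation.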
Ground Magnetic Data for West-Central Colorado
Richard Zehner
2012-03-08
Modeled ground magnetic data was extracted from the Pan American Center for Earth and Environmental Studies database at http://irpsrvgis08.utep.edu/viewers/Flex/GravityMagnetic/GravityMagnetic_CyberShare/ on 2/29/2012. The downloaded text file was then imported into an Excel spreadsheet. This spreadsheet data was converted into an ESRI point shapefile in UTM Zone 13 NAD27 projection, showing location and magnetic field strength in nano-Teslas. This point shapefile was then interpolated to an ESRI grid using an inverse-distance weighting method, using ESRI Spatial Analyst. The grid was used to create a contour map of magnetic field strength.
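The gridding step described above (inverse-distance weighting of point magnetic values) can be sketched generically in a few lines of Python; this is an illustration of the interpolation idea, not the ESRI Spatial Analyst implementation used for the dataset.

    def idw(x, y, points, power=2):
        """Inverse-distance-weighted estimate at (x, y) from [(xi, yi, value), ...]."""
        num, den = 0.0, 0.0
        for xi, yi, vi in points:
            d2 = (x - xi) ** 2 + (y - yi) ** 2
            if d2 == 0:
                return vi                      # exactly on a data point
            w = 1.0 / d2 ** (power / 2.0)      # weight = 1 / distance^power
            num += w * vi
            den += w
        return num / den

    # Example: total magnetic field (nT) at three stations, estimated at (1.0, 1.0)
    stations = [(0.0, 0.0, 52000.0), (2.0, 0.0, 52110.0), (0.0, 2.0, 51950.0)]
    print(idw(1.0, 1.0, stations))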
Source Lines Counter (SLiC) Version 4.0
NASA Technical Reports Server (NTRS)
Monson, Erik W.; Smith, Kevin A.; Newport, Brian J.; Gostelow, Roli D.; Hihn, Jairus M.; Kandt, Ronald K.
2011-01-01
Source Lines Counter (SLiC) is a software utility designed to measure software source code size using logical source statements and other common measures for 22 of the programming languages commonly used at NASA and in the aerospace industry. Such metrics can be used in a wide variety of applications, from parametric cost estimation to software defect analysis. SLiC has a variety of unique features such as automatic code search, automatic file detection, hierarchical directory totals, and spreadsheet-compatible output. SLiC was written for extensibility; new programming language support can be added with minimal effort in a short amount of time. SLiC runs on a variety of platforms including UNIX, Windows, and Mac OS X. Its straightforward command-line interface allows for customization and incorporation into the software build process for tracking development metrics.
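As a hedged sketch of the kind of counting such a tool performs (SLiC's actual rules for logical source statements in 22 languages are considerably more elaborate), the Python fragment below counts non-blank lines that are not single-line comments in a C-style source file.

    def count_source_lines(path, comment_prefix="//"):
        """Count non-blank lines that are not single-line comments (a crude SLOC proxy)."""
        total = 0
        with open(path, encoding="utf-8", errors="replace") as f:
            for line in f:
                stripped = line.strip()
                if stripped and not stripped.startswith(comment_prefix):
                    total += 1
        return total

    # Example usage (hypothetical file name):
    # print(count_source_lines("main.c"))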
Mathematical Modeling with MyMaps and Spreadsheets
ERIC Educational Resources Information Center
Weber, Victoria; Fortune, Nicholas; Williams, Derek; Whitehead, Ashley
2016-01-01
Software programs such as Tinkerplots ® or Geometer's Sketchpad ® can help students solve problems in mathematics classes, but may not be available to them after high school. In contrast, many students who become familiar with Internet tools and programs in office packages (word processing, spreadsheets, etc.) may use them daily to enhance their…
Lapin Data Interchange Among Database, Analysis and Display Programs Using XML-Based Text Files
NASA Technical Reports Server (NTRS)
2005-01-01
The purpose of grant NCC3-966 was to investigate and evaluate the interchange of application-specific data among multiple programs, each carrying out part of the analysis and design task. This has been carried out previously by creating a custom program to read data produced by one application and then write that data to a file whose format is specific to the second application that needs all or part of that data. In this investigation, data of interest are described using the XML markup language, which allows the data to be stored in a text string. Software to transform output data of a task into an XML string and software to read an XML string and extract all or a portion of the data needed for another application are used to link two independent applications together as part of an overall design effort. This approach was initially used with a standard analysis program, Lapin, along with standard applications (a spreadsheet program, a relational database program, and a conventional dialog and display program) to demonstrate the successful sharing of data among independent programs. Most of the effort beyond that demonstration has been concentrated on the inclusion of more complex display programs. Specifically, a custom-written windowing program organized around dialogs to control the interactions has been combined with an independent CAD program (Open Cascade) that supports sophisticated display of CAD elements such as lines, spline curves, and surfaces, and with turbine-blade data produced by an independent blade design program (UD0300).
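A minimal Python sketch of the XML-string interchange idea follows, with illustrative element names (the schemas actually used with Lapin are not given in the abstract): one program serializes its results to an XML string, and a second program extracts only the fields it needs.

    import xml.etree.ElementTree as ET

    # Producer: wrap analysis output in an XML string
    root = ET.Element("blade")
    ET.SubElement(root, "chord").text = "0.085"
    ET.SubElement(root, "span").text = "0.320"
    xml_string = ET.tostring(root, encoding="unicode")

    # Consumer: parse the XML string and pull out only the chord value
    parsed = ET.fromstring(xml_string)
    chord = float(parsed.find("chord").text)
    print(chord)

Because the intermediate representation is plain text, the same string can be passed between a spreadsheet, a database front end, and a display program without custom binary converters, which is the point of the approach.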
Family Day Homes: Get Organized with Information Systems.
ERIC Educational Resources Information Center
Dague, Mindy
1999-01-01
Notes that record keeping and management are critical aspects of home day care centers. Highlights options for tools, including calendars, loose-leaf notebooks, ledgers, computer spreadsheet software, and file boxes. Provides guidelines for organizing information as well as particular information necessary regarding provider, parent, and child.…
D0 Superconducting Solenoid Quench Data and Slow Dump Data Acquisition
DOE Office of Scientific and Technical Information (OSTI.GOV)
Markley, D.; /Fermilab
1998-06-09
This Dzero Engineering note describes the method by which the 2 Tesla superconducting solenoid fast-dump and slow-dump data are accumulated, tracked, and stored. The 2 Tesla solenoid has eleven data points that need to be tracked and stored when a fast dump or a slow dump occurs. The TI555 (Texas Instruments) PLC (programmable logic controller), which controls the DC power circuit that powers the solenoid, also has access to all the voltage taps and other equipment in the circuit. The TI555 constantly logs these eleven points in a rotating memory buffer. When either a fast dump (dump switch opens) or a slow dump (power supply turns off) occurs, the TI555 organizes the respective data and downloads the data to a file on DO-CCRS2. The data in this file are moved over Ethernet and stored in a CSV (comma-separated format) file, which can easily be examined by Microsoft Excel or any other spreadsheet. The 2 Tesla solenoid control system also locks in first-fault information. The TI555 decodes the first fault and passes it along to the program collecting the data and storing it on DO-CCRS2. This first-fault information is then part of the file.
A Simple Spreadsheet Program for the Calculation of Lattice-Site Distributions
ERIC Educational Resources Information Center
McCaffrey, John G.
2009-01-01
A simple spreadsheet program is presented that can be used by undergraduate students to calculate the lattice-site distributions in solids. A major strength of the method is the natural way in which the correct number of ions or atoms are present, or absent, at specific lattice distances. The expanding-cube method utilized is straightforward to…
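A hedged sketch of the expanding-cube idea mentioned above: enumerate all lattice points inside a cube of integer coordinates, compute their distances from the origin, and tally how many sites fall at each distance. The fragment below does this for a simple cubic lattice; the article's spreadsheet handles the bookkeeping for other lattices and unit-cell scaling.

    from collections import Counter
    from math import sqrt

    def lattice_shells(n):
        """Count simple-cubic lattice sites at each squared distance within an n-cube."""
        counts = Counter()
        for i in range(-n, n + 1):
            for j in range(-n, n + 1):
                for k in range(-n, n + 1):
                    if (i, j, k) != (0, 0, 0):
                        counts[i * i + j * j + k * k] += 1
        return counts

    shells = lattice_shells(2)
    for d2 in sorted(shells)[:4]:
        print(f"distance {sqrt(d2):.3f}: {shells[d2]} sites")
    # First shells of a simple cubic lattice: 6, 12, 8, 6 neighbours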
Validation Results for LEWICE 2.0. [Supplement
NASA Technical Reports Server (NTRS)
Wright, William B.; Rutkowski, Adam
1999-01-01
Two CD-ROMs contain experimental ice shapes and code predictions used for validation of LEWICE 2.0 (see NASA/CR-1999-208690, CASI ID 19990021235). The data include ice shapes both from experiment and from LEWICE, all of the input and output files for the LEWICE cases, JPG files of all plots generated, an electronic copy of the text of the validation report, and a Microsoft Excel(R) spreadsheet containing all of the quantitative measurements taken. The LEWICE source code and executable are not contained on the discs.
TagDigger: user-friendly extraction of read counts from GBS and RAD-seq data.
Clark, Lindsay V; Sacks, Erik J
2016-01-01
In genotyping-by-sequencing (GBS) and restriction site-associated DNA sequencing (RAD-seq), read depth is important for assessing the quality of genotype calls and estimating allele dosage in polyploids. However, existing pipelines for GBS and RAD-seq do not provide read counts in formats that are both accurate and easy to access. Additionally, although existing pipelines allow previously-mined SNPs to be genotyped on new samples, they do not allow the user to manually specify a subset of loci to examine. Pipelines that do not use a reference genome assign arbitrary names to SNPs, making meta-analysis across projects difficult. We created the software TagDigger, which includes three programs for analyzing GBS and RAD-seq data. The first script, tagdigger_interactive.py, rapidly extracts read counts and genotypes from FASTQ files using user-supplied sets of barcodes and tags. Input and output is in CSV format so that it can be opened by spreadsheet software. Tag sequences can also be imported from the Stacks, TASSEL-GBSv2, TASSEL-UNEAK, or pyRAD pipelines, and a separate file can be imported listing the names of markers to retain. A second script, tag_manager.py, consolidates marker names and sequences across multiple projects. A third script, barcode_splitter.py, assists with preparing FASTQ data for deposit in a public archive by splitting FASTQ files by barcode and generating MD5 checksums for the resulting files. TagDigger is open-source and freely available software written in Python 3. It uses a scalable, rapid search algorithm that can process over 100 million FASTQ reads per hour. TagDigger will run on a laptop with any operating system, does not consume hard drive space with intermediate files, and does not require programming skill to use.
Emissions & Generation Resource Integrated Database (eGRID), eGRID2002 (with years 1996 - 2000 data)
The Emissions & Generation Resource Integrated Database (eGRID) is a comprehensive source of data on the environmental characteristics of almost all electric power generated in the United States. These environmental characteristics include air emissions for nitrogen oxides, sulfur dioxide, carbon dioxide, methane, nitrous oxide, and mercury; emissions rates; net generation; resource mix; and many other attributes. eGRID2002 (years 1996 through 2000 data) contains 16 Excel spreadsheets and the Technical Support Document, as well as the eGRID Data Browser, User's Manual, and Readme file. Archived eGRID data can be viewed as spreadsheets or by using the eGRID Data Browser. The eGRID spreadsheets can be manipulated by data users and enable users to view all the data underlying eGRID. The eGRID Data Browser enables users to view key data using powerful search features. Note that the eGRID Data Browser will not run on a Mac-based machine without Windows emulation.
ERIC Educational Resources Information Center
Abriata, Luciano A.
2011-01-01
A simple algorithm was implemented in a spreadsheet program to simulate the circular dichroism spectra of proteins from their secondary structure content and to fit [alpha]-helix, [beta]-sheet, and random coil contents from experimental far-UV circular dichroism spectra. The physical basis of the method is briefly reviewed within the context of…
Teaching Graphical Simulations of Fourier Series Expansion of Some Periodic Waves Using Spreadsheets
ERIC Educational Resources Information Center
Singh, Iqbal; Kaur, Bikramjeet
2018-01-01
The present article demonstrates a way of programming using an Excel spreadsheet to teach Fourier series expansion in school/colleges without the knowledge of any typical programming language. By using this, a student learns to approximate partial sum of the n terms of Fourier series for some periodic signals such as square wave, saw tooth wave,…
Pressure Ratio to Thermal Environments
NASA Technical Reports Server (NTRS)
Lopez, Pedro; Wang, Winston
2012-01-01
The pressure ratio to thermal environments (PRatTlE.pl) program is a Perl-language code that estimates heating at requested body-point locations by scaling the heating at a reference location by a pressure-ratio factor. The pressure-ratio factor is the ratio of the local pressures at the reference point and the requested point, taken from CFD (computational fluid dynamics) solutions. This innovation provides pressure-ratio-based thermal environments in an automated and traceable manner. Previously, the pressure-ratio methodology was implemented via a Microsoft Excel spreadsheet and macro scripts. PRatTlE is able to calculate heating environments for 150 body points in less than two minutes. PRatTlE is coded in the Perl programming language, is command-line-driven, and has been successfully executed on both HP and Linux platforms. It supports multiple concurrent runs. PRatTlE contains error trapping and input-file format verification, which allows clear visibility into the input data structure and intermediate calculations.
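The scaling described above reduces to a one-line calculation per body point. A minimal sketch with illustrative numbers follows, assuming the ratio is taken as requested-point pressure over reference-point pressure (the abstract does not state the ordering explicitly).

    def scaled_heating(q_ref, p_ref, p_point):
        """Estimate heating at a body point by pressure-ratio scaling from a reference point."""
        return q_ref * (p_point / p_ref)

    # Example: reference heating 12.0 W/cm^2 at 5.0 kPa; requested point sees 3.2 kPa
    print(scaled_heating(12.0, 5.0, 3.2))   # 7.68 W/cm^2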
Edwardson, S R; Pejsa, J
1993-01-01
A computer-based tutorial for teaching nursing financial management concepts was developed using the macro function of a commercially available spreadsheet program. The goals of the tutorial were to provide students with an experience with spreadsheets as a computer tool and to teach selected financial management concepts. Preliminary results show the tutorial was well received by students. Suggestions are made for overcoming the general lack of computer sophistication among students.
PYROLASER - PYROLASER OPTICAL PYROMETER OPERATING SYSTEM
NASA Technical Reports Server (NTRS)
Roberts, F. E.
1994-01-01
The PYROLASER package is an operating system for the Pyrometer Instrument Company's Pyrolaser. There are 6 individual programs in the PYROLASER package: two main programs, two lower level subprograms, and two programs which, although independent, function predominantly as macros. The package provides a quick and easy way to setup, control, and program a standard Pyrolaser. Temperature and emissivity measurements may be either collected as if the Pyrolaser were in the manual operations mode, or displayed on real time strip charts and stored in standard spreadsheet format for post-test analysis. A shell is supplied to allow macros, which are test-specific, to be easily added to the system. The Pyrolaser Simple Operation program provides full on-screen remote operation capabilities, thus allowing the user to operate the Pyrolaser from the computer just as it would be operated manually. The Pyrolaser Simple Operation program also allows the use of "quick starts". Quick starts provide an easy way to permit routines to be used as setup macros for specific applications or tests. The specific procedures required for a test may be ordered in a sequence structure and then the sequence structure can be started with a simple button in the cluster structure provided. One quick start macro is provided for continuous Pyrolaser operation. A subprogram, Display Continuous Pyr Data, is used to display and store the resulting data output. Using this macro, the system is set up for continuous operation and the subprogram is called to display the data in real time on strip charts. The data is simultaneously stored in a spreadsheet format. The resulting spreadsheet file can be opened in any one of a number of commercially available spreadsheet programs. The Read Continuous Pyrometer program is provided as a continuously run subprogram for incorporation of the Pyrolaser software into a process control or feedback control scheme in a multi-component system. The program requires the Pyrolaser to be set up using the Pyrometer String Transfer macro. It requires no inputs and provides temperature and emissivity as outputs. The Read Continuous Pyrometer program can be run continuously and the data can be sampled as often or as seldom as updates of temperature and emissivity are required. PYROLASER is written using the Labview software for use on Macintosh series computers running System 6.0.3 or later, Sun Sparc series computers running OpenWindows 3.0 or MIT's X Window System (X11R4 or X11R5), and IBM PC or compatibles running Microsoft Windows 3.1 or later. Labview requires a minimum of 5Mb of RAM on a Macintosh, 24Mb of RAM on a Sun, and 8Mb of RAM on an IBM PC or compatible. The Labview software is a product of National Instruments (Austin,TX; 800-433-3488), and is not included with this program. The standard distribution medium for PYROLASER is a 3.5 inch 800K Macintosh format diskette. It is also available on a 3.5 inch 720K MS-DOS format diskette, a 3.5 inch diskette in UNIX tar format, and a .25 inch streaming magnetic tape cartridge in UNIX tar format. An electronic copy of the documentation in Macintosh WordPerfect version 2.0.4 format is included on the distribution medium. Printed documentation is included in the price of the program. PYROLASER was developed in 1992.
The Computer Bulletin Board. Modified Gran Plots of Very Weak Acids on a Spreadsheet.
ERIC Educational Resources Information Center
Chau, F. T.; And Others
1990-01-01
Presented are two applications of computer technology to chemistry instruction: the use of a spreadsheet program to analyze acid-base titration curves and the use of database software to catalog stockroom inventories. (CW)
This page provides information and access to Standard Evaluation Procedures (SEPs) and Data Entry Spreadsheet Templates (DESTs) developed by EPA's Office of Chemical Safety and Pollution Prevention (OCSPP).
Computer Corner: Spreadsheets, Power Series, Generating Functions, and Integers.
ERIC Educational Resources Information Center
Snow, Donald R.
1989-01-01
Implements a table algorithm on a spreadsheet program and obtains functions for several number sequences such as the Fibonacci and Catalan numbers. Considers other applications of the table algorithm to integers represented in various number bases. (YP)
LICSS - a chemical spreadsheet in microsoft excel
2012-01-01
Background: Representations of chemical datasets in spreadsheet format are important for ready data assimilation and manipulation. In addition to the normal spreadsheet facilities, chemical spreadsheets need to have visualisable chemical structures and data searchable by chemical as well as textual queries. Many such chemical spreadsheet tools are available, some operating in the familiar Microsoft Excel environment. However, within this group, the performance of Excel is often compromised, particularly in terms of the number of compounds which can usefully be stored on a sheet. Summary: LICSS is a lightweight chemical spreadsheet within Microsoft Excel for Windows. LICSS stores structures solely as Smiles strings. Chemical operations are carried out by calling Java code modules which use the CDK, JChemPaint and OPSIN libraries to provide cheminformatics functionality. Compounds in sheets or charts may be visualised (individually or en masse), and sheets may be searched by substructure or similarity. All the molecular descriptors available in CDK may be calculated for compounds (in batch or on-the-fly), and various cheminformatic operations such as fingerprint calculation, Sammon mapping, clustering and R group table creation may be carried out. We detail here the features of LICSS and how they are implemented. We also explain the design criteria, particularly in terms of potential corporate use, which led to this particular implementation. Conclusions: LICSS is an Excel-based chemical spreadsheet with a difference:
• It can usefully be used on sheets containing hundreds of thousands of compounds; it doesn't compromise the normal performance of Microsoft Excel
• It is designed to be installed and run in environments in which users do not have admin privileges; installation involves merely file copying, and sharing of LICSS sheets invokes automatic installation
• It is free and extensible
LICSS is open source software and we hope sufficient detail is provided here to enable developers to add their own features and share with the community. PMID:22301088
LICSS - a chemical spreadsheet in microsoft excel.
Lawson, Kevin R; Lawson, Jonty
2012-02-02
Representations of chemical datasets in spreadsheet format are important for ready data assimilation and manipulation. In addition to the normal spreadsheet facilities, chemical spreadsheets need to have visualisable chemical structures and data searchable by chemical as well as textual queries. Many such chemical spreadsheet tools are available, some operating in the familiar Microsoft Excel environment. However, within this group, the performance of Excel is often compromised, particularly in terms of the number of compounds which can usefully be stored on a sheet. LICSS is a lightweight chemical spreadsheet within Microsoft Excel for Windows. LICSS stores structures solely as Smiles strings. Chemical operations are carried out by calling Java code modules which use the CDK, JChemPaint and OPSIN libraries to provide cheminformatics functionality. Compounds in sheets or charts may be visualised (individually or en masse), and sheets may be searched by substructure or similarity. All the molecular descriptors available in CDK may be calculated for compounds (in batch or on-the-fly), and various cheminformatic operations such as fingerprint calculation, Sammon mapping, clustering and R group table creation may be carried out. We detail here the features of LICSS and how they are implemented. We also explain the design criteria, particularly in terms of potential corporate use, which led to this particular implementation. LICSS is an Excel-based chemical spreadsheet with a difference:
• It can usefully be used on sheets containing hundreds of thousands of compounds; it doesn't compromise the normal performance of Microsoft Excel
• It is designed to be installed and run in environments in which users do not have admin privileges; installation involves merely file copying, and sharing of LICSS sheets invokes automatic installation
• It is free and extensible
LICSS is open source software and we hope sufficient detail is provided here to enable developers to add their own features and share with the community.
PAPARA(ZZ)I: An open-source software interface for annotating photographs of the deep-sea
NASA Astrophysics Data System (ADS)
Marcon, Yann; Purser, Autun
PAPARA(ZZ)I is a lightweight and intuitive image annotation program developed for the study of benthic megafauna. It offers functionalities such as free, grid and random point annotation. Annotations may be made following existing classification schemes for marine biota and substrata or with the use of user defined, customised lists of keywords, which broadens the range of potential application of the software to other types of studies (e.g. marine litter distribution assessment). If Internet access is available, PAPARA(ZZ)I can also query and use standardised taxa names directly from the World Register of Marine Species (WoRMS). Program outputs include abundances, densities and size calculations per keyword (e.g. per taxon). These results are written into text files that can be imported into spreadsheet programs for further analyses. PAPARA(ZZ)I is open-source and is available at http://papara-zz-i.github.io. Compiled versions exist for most 64-bit operating systems: Windows, Mac OS X and Linux.
Analyzing Spacecraft Telecommunication Systems
NASA Technical Reports Server (NTRS)
Kordon, Mark; Hanks, David; Gladden, Roy; Wood, Eric
2004-01-01
Multi-Mission Telecom Analysis Tool (MMTAT) is a C-language computer program for analyzing proposed spacecraft telecommunication systems. MMTAT utilizes parameterized input and computational models that can be run on standard desktop computers to perform fast and accurate analyses of telecommunication links. MMTAT is easy to use and can easily be integrated with other software applications and run as part of almost any computational simulation. It is distributed as either a stand-alone application program with a graphical user interface or a linkable library with a well-defined set of application programming interface (API) calls. As a stand-alone program, MMTAT provides both textual and graphical output. The graphs make it possible to understand, quickly and easily, how telecommunication performance varies with variations in input parameters. A delimited text file that can be read by any spreadsheet program is generated at the end of each run. The API in the linkable-library form of MMTAT enables the user to control simulation software and to change parameters during a simulation run. Results can be retrieved either at the end of a run or by use of a function call at any time step.
AlleleCoder: a PERL script for coding codominant polymorphism data for PCA analysis
USDA-ARS?s Scientific Manuscript database
A useful biological interpretation of diploid heterozygotes is in terms of the dose of the common allele (0, 1 or 2 copies). We have developed a PERL script that converts FASTA files into coded spreadsheets suitable for Principal Component Analysis (PCA). In combination with R and R Commander, two- ...
Lapin Data Interchange Among Database, Analysis and Display Programs Using XML-Based Text Files
NASA Technical Reports Server (NTRS)
2004-01-01
The purpose was to investigate and evaluate the interchange of application-specific data among multiple programs, each carrying out part of the analysis and design task. This has been carried out previously by creating a custom program to read data produced by one application and then write that data to a file whose format is specific to the second application that needs all or part of that data. In this investigation, data of interest are described using the XML markup language, which allows the data to be stored in a text string. Software to transform output data of a task into an XML string and software to read an XML string and extract all or a portion of the data needed for another application are used to link two independent applications together as part of an overall design effort. This approach was initially used with a standard analysis program, Lapin, along with standard applications (a spreadsheet program, a relational database program, and a conventional dialog and display program) to demonstrate the successful sharing of data among independent programs. See Engineering Analysis Using a Web-Based Protocol by J.D. Schoeffler and R.W. Claus, NASA TM-2002-211981, October 2002. Most of the effort beyond that demonstration has been concentrated on the inclusion of more complex display programs. Specifically, a custom-written windowing program organized around dialogs to control the interactions has been combined with an independent CAD program (Open Cascade) that supports sophisticated display of CAD elements such as lines, spline curves, and surfaces, and with turbine-blade data produced by an independent blade design program (UD0300).
Kuppusamy, Vijayalakshmi; Nagarajan, Vivekanandan; Jeevanandam, Prakash; Murugan, Lavanya
2016-02-01
The study aimed to compare two different monitor unit (MU) or dose verification software tools in volumetric modulated arc therapy (VMAT) using a modified Clarkson's integration technique for 6 MV photon beams. An in-house Excel spreadsheet-based monitor unit verification calculation (MUVC) program and PTW's DIAMOND secondary check software (SCS), version 6, were used as a secondary check to verify the monitor units (MU) or dose calculated by the treatment planning system (TPS). In this study 180 patients were grouped into 61 head and neck, 39 thorax, and 80 pelvic sites. Verification plans were created using the PTW OCTAVIUS-4D phantom and measured using the 729 ion-chamber detector array, with the isocentre as the point of measurement for each field. In the analysis of 154 clinically approved VMAT plans with the isocentre in a region above -350 HU, using heterogeneity corrections, the in-house spreadsheet-based MUVC program and DIAMOND SCS showed good agreement with the TPS. The overall average percentage deviations for all sites were (-0.93% ± 1.59%) and (1.37% ± 2.72%) for the in-house Excel spreadsheet-based MUVC program and DIAMOND SCS, respectively. For 26 clinically approved VMAT plans with the isocentre in a region below -350 HU, both the in-house spreadsheet-based MUVC program and DIAMOND SCS showed higher variations. It can be concluded that, for patient-specific quality assurance (QA), the in-house Excel spreadsheet-based MUVC program and DIAMOND SCS can be used as a simple and fast accompaniment to measurement-based verification for plans with the isocentre in a region above -350 HU. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Malinconico, L. L., Jr.; Sunderlin, D.; Liew, C. W.
2015-12-01
Over the course of the last three years we have designed, developed and refined two Apps for the iPad. GeoFieldBook and StratLogger allow for the real-time display of spatial (structural) and temporal (stratigraphic) field data as well as very easy in-field navigation. Field techniques and methods for data acquisition and mapping have dramatically advanced, simplifying how we collect and analyze data while in the field. The Apps are not geologic mapping programs, but rather a way of bypassing the analog field book step to acquire digital data directly that can then be used in various analysis programs (GIS, Google Earth, Stereonet, spreadsheet and drawing programs). We now complete all of our fieldwork digitally. GeoFieldBook can be used to collect structural and other field observations. Each record includes location/date/time information, orientation measurements, formation names, text observations and photos taken with the tablet camera. Records are customizable, so users can add fields of their own choosing. Data are displayed on an image base in real time with oriented structural symbols. The image base is also used for in-field navigation. In StratLogger, the user records bed thickness, lithofacies, biofacies, and contact data in preset and modifiable fields. Each bed/unit record may also be photographed and geo-referenced. As each record is collected, a column diagram of the stratigraphic sequence is built in real time, complete with lithology color, lithology texture, and fossil symbols. The recorded data from any measured stratigraphic sequence can be exported as both the live-drawn column image and as a .csv formatted file for use in spreadsheet or other applications. Common to both Apps is the ability to export the data (via .csv files), photographs, and maps or stratigraphic columns (images). Since the data are digital, they are easily imported into various processing programs (for example, for stereoplot analysis). Requiring that all maps, stratigraphic columns and cross-sections be produced digitally continues our integration of digital technologies throughout the curriculum. Initial evaluation suggests that students using the Apps progress more quickly towards synthesis and interpretation of the data as well as a deeper understanding of complex 4D field relationships.
Academic Testing and Grading with Spreadsheet Software.
ERIC Educational Resources Information Center
Ho, James K.
1987-01-01
Explains how spreadsheet software can be used in the design and grading of academic tests and in assigning grades. Macro programs and menu-driven software are highlighted and an example using IBM PCs and Lotus 1-2-3 software is given. (Author/LRW)
Smith, Christopher G.; Marot, Marci E.; Ellis, Alisha M.; Wheaton, Cathryn J.; Bernier, Julie C.; Adams, C. Scott
2015-09-15
This report serves as an archive for sedimentological and radiochemical data derived from the surface sediments and marsh cores collected March 26–April 4, 2014. Select surficial data are available for the additional sampling periods October 21–30, 2014. Downloadable data are available as Excel spreadsheets and as JPEG files. Additional files include: Field documentation, x-radiographs, photographs, detailed results of sediment grain size analyses, and formal Federal Geographic Data Committee metadata (data downloads).
Thornber, Carl R.; Sherrod, David R.; Siems, David F.; Heliker, Christina C.; Meeker, Gregory P.; Oscarson, Robert L.; Kauahikaua, James P.
2002-01-01
This report presents major-element geochemical data for glasses and whole-rock aliquots among 523 lava samples collected near the vent on Kilauea's east rift zone between September 1994 and October 2001. Information on sample collection, analysis techniques, and analytical standard reproducibility is presented as a PDF file, which also includes a detailed explanation of the categories of sample information presented in the database spreadsheet. The sample database is downloadable as a separate Microsoft Excel file.
Marot, Marci E.; Smith, Christopher G.; Ellis, Alisha M.; Wheaton, Cathryn J.
2016-06-23
This report serves as an archive for sedimentological and radiochemical data derived from the surface sediments and box cores. Downloadable data are available as Excel spreadsheets, PDF files, and JPEG files, and include sediment core data plots and x-radiographs, as well as physical-properties, grain-size, alpha-spectroscopy, and gamma-spectroscopy data. Federal Geographic Data Committee metadata are available for analytical datasets in the data downloads page of this report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
North, Michael J.
SchemaOnRead provides tools for implementing schema-on-read including a single function call (e.g., schemaOnRead("filename")) that reads text (TXT), comma separated value (CSV), raster image (BMP, PNG, GIF, TIFF, and JPG), R data (RDS), HDF5, NetCDF, spreadsheet (XLS, XLSX, ODS, and DIF), Weka Attribute-Relation File Format (ARFF), Epi Info (REC), Pajek network (PAJ), R network (NET), Hypertext Markup Language (HTML), SPSS (SAV), Systat (SYS), and Stata (DTA) files. It also recursively reads folders (e.g., schemaOnRead("folder")), returning a nested list of the contained elements.
Teaching graphical simulations of Fourier series expansion of some periodic waves using spreadsheets
NASA Astrophysics Data System (ADS)
Singh, Iqbal; Kaur, Bikramjeet
2018-05-01
The present article demonstrates a way of programming using an Excel spreadsheet to teach Fourier series expansion in school/colleges without the knowledge of any typical programming language. By using this, a student learns to approximate partial sum of the n terms of Fourier series for some periodic signals such as square wave, saw tooth wave, half wave rectifier and full wave rectifier signals.
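A minimal Python sketch of the partial-sum calculation the article implements in Excel is given below for the square-wave case; the spreadsheet versions for saw-tooth and rectifier signals follow the same pattern with different coefficients.

    import math

    def square_wave_partial_sum(t, n_terms, period=2 * math.pi):
        """Partial Fourier sum of a unit square wave: (4/pi) * sum sin((2k-1)wt)/(2k-1)."""
        w = 2 * math.pi / period
        return (4 / math.pi) * sum(
            math.sin((2 * k - 1) * w * t) / (2 * k - 1) for k in range(1, n_terms + 1)
        )

    # The approximation at t = pi/2 approaches 1 as more terms are added
    for n in (1, 3, 10, 50):
        print(n, round(square_wave_partial_sum(math.pi / 2, n), 4))

In the spreadsheet version each term occupies a column (or row) and the running sum is plotted against time, which is what lets students watch the waveform sharpen as terms are added.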
ERIC Educational Resources Information Center
Sims, Paul A.
2010-01-01
An approach is presented that utilizes a spreadsheet to allow students to explore different means of calculating and visualizing how the charge on peptides and proteins varies as a function of pH. In particular, the concept of isoelectric point is developed to allow students to compare the results of their spreadsheet calculations with those of…
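A hedged sketch of the calculation such a spreadsheet activity is built on: the net charge of a peptide at a given pH from Henderson-Hasselbalch fractions, using a generic set of pKa values (textbook values vary, and the article's spreadsheet may use a different set). The isoelectric point is the pH at which this net charge crosses zero.

    def net_charge(pH, n_asp=0, n_glu=0, n_cys=0, n_tyr=0, n_his=0, n_lys=0, n_arg=0):
        """Approximate net charge of a peptide at a given pH (generic pKa values)."""
        pos = {"Nterm": (9.0, 1), "His": (6.0, n_his), "Lys": (10.5, n_lys), "Arg": (12.5, n_arg)}
        neg = {"Cterm": (3.1, 1), "Asp": (3.9, n_asp), "Glu": (4.1, n_glu),
               "Cys": (8.3, n_cys), "Tyr": (10.1, n_tyr)}
        charge = sum(n / (1 + 10 ** (pH - pKa)) for pKa, n in pos.values())   # protonated (+) fraction
        charge -= sum(n / (1 + 10 ** (pKa - pH)) for pKa, n in neg.values())  # deprotonated (-) fraction
        return charge

    # Scan pH to bracket the isoelectric point of a peptide with 2 Asp and 1 Lys
    for pH in (3, 5, 7, 9, 11):
        print(pH, round(net_charge(pH, n_asp=2, n_lys=1), 2))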
Software for Testing Electroactive Structural Components
NASA Technical Reports Server (NTRS)
Moses, Robert W.; Fox, Robert L.; Dimery, Archie D.; Bryant, Robert G.; Shams, Qamar
2003-01-01
A computer program generates a graphical user interface that, in combination with its other features, facilitates the acquisition and preprocessing of experimental data on the strain response, hysteresis, and power consumption of a multilayer composite-material structural component containing one or more built-in sensor(s) and/or actuator(s) based on piezoelectric materials. This program runs in conjunction with LabVIEW software in a computer-controlled instrumentation system. For a test, a specimen is instrumented with applied-voltage and current sensors and with strain gauges. Once the computational connection to the test setup has been made via the LabVIEW software, this program causes the test instrumentation to step through specified configurations. If the user is satisfied with the test results as displayed by the software, the user activates an icon on a front-panel display, causing the raw current, voltage, and strain data to be digitized and saved. The data are also put into a spreadsheet and can be plotted on a graph. Graphical displays are saved in an image file for future reference. The program also computes and displays the power and the phase angle between voltage and current.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-11
... execution, order or cancel information, which can be exported into a spreadsheet for review. TradeInfo.... This is the same fee assessed to NOM Participants and PHLX members for this service. Use of TradeInfo... allocate trades to the appropriate accounts and sub- accounts for clearing. The Options Clearing...
Code of Federal Regulations, 2010 CFR
2010-07-01
... Use. (e) Format and delivery. (1) Electronic format only. Reports of use must be maintained and delivered in electronic format only, as prescribed in paragraphs (e)(2) through (8) of this section. A hard... spreadsheet templates. All report of use data files must be delivered in ASCII format. However, to facilitate...
Teaching Science and Mathematics Subjects Using the Excel Spreadsheet Package
ERIC Educational Resources Information Center
Ibrahim, Dogan
2009-01-01
The teaching of scientific subjects usually require laboratories where students can put the theory they have learned into practice. Traditionally, electronic programmable calculators, dedicated software, or expensive software simulation packages, such as MATLAB have been used to simulate scientific experiments. Recently, spreadsheet programs have…
Metsalu, Tauno; Vilo, Jaak
2015-01-01
The Principal Component Analysis (PCA) is a widely used method of reducing the dimensionality of high-dimensional data, often followed by visualizing two of the components on a scatterplot. Although widely used, the method lacks an easy-to-use web interface that scientists with little programming experience could use to make plots of their own data. The same applies to creating heatmaps: it is possible to add conditional formatting for Excel cells to show colored heatmaps, but for more advanced features such as clustering and experimental annotations, more sophisticated analysis tools have to be used. We present a web tool called ClustVis that aims to have an intuitive user interface. Users can upload data from a simple delimited text file that can be created in a spreadsheet program. It is possible to modify data processing methods and the final appearance of the PCA and heatmap plots by using drop-down menus, text boxes, sliders, etc. Appropriate defaults are given to reduce the time needed by the user to specify input parameters. As an output, users can download the PCA plot and heatmap in one of the preferred file formats. This web server is freely available at http://biit.cs.ut.ee/clustvis/. PMID:25969447
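A minimal Python sketch of the projection step that ClustVis wraps in a web interface follows; the data here are random placeholders standing in for a delimited-text matrix exported from a spreadsheet, so the numbers are illustrative only.

    import numpy as np
    from sklearn.decomposition import PCA

    # Hypothetical data: 20 samples x 50 features (in ClustVis this would come from
    # a delimited text file created in a spreadsheet program)
    rng = np.random.default_rng(0)
    data = rng.normal(size=(20, 50))

    pca = PCA(n_components=2)            # PCA centres the columns internally
    scores = pca.fit_transform(data)     # coordinates for a 2-D scatterplot

    print("explained variance ratios:", pca.explained_variance_ratio_)
    print("first sample:", scores[0])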
Bradley, D. Nathan
2012-01-01
The slope-area method is a technique for estimating the peak discharge of a flood after the water has receded (Dalrymple and Benson, 1967). This type of discharge estimate is called an “indirect measurement” because it relies on evidence left behind by the flood, such as high-water marks (HWMs) on trees or buildings. These indicators of flood stage are combined with measurements of the cross-sectional geometry of the stream, estimates of channel roughness, and a mathematical model that balances the total energy of the flow between cross sections. This is in contrast to a “direct” measurement of discharge during the flood, where the cross-sectional area is measured and a current meter or acoustic equipment is used to measure the water velocity. When a direct discharge measurement cannot be made at a gage during high flows because of logistics or safety reasons, an indirect measurement of peak discharge is useful for defining the high-flow section of the stage-discharge relation (rating curve) at the stream gage, resulting in more accurate computation of high flows. The Slope-Area Computation program (SAC; Fulford, 1994) is an implementation of the slope-area method that computes a peak-discharge estimate from inputs of water-surface slope (from surveyed HWMs), channel geometry, and estimated channel roughness. SAC is a command-line program written in Fortran that reads input data from a formatted text file and prints results to another formatted text file. Preparing the input file can be time-consuming and prone to errors. This document describes the SAC graphical user interface (GUI), a cross-platform “wrapper” application that prepares the SAC input file, executes the program, and helps the user interpret the output. The SAC GUI is an update and enhancement of the slope-area method (SAM; Hortness, 2004; Berenbrock, 1996), an earlier spreadsheet tool used to aid field personnel in the completion of a slope-area measurement. The SAC GUI reads survey data; develops a plan-view plot, a water-surface profile, and cross-section plots; and creates the SAC input file. The SAC GUI also creates HEC-2 files that can be imported into HEC-RAS.
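For context on the computation SAC automates, the following is a hedged single-section sketch using Manning's equation; the actual slope-area method balances energy between multiple cross sections and accounts for expansion and contraction losses, which this fragment omits.

    def manning_discharge(area, wetted_perimeter, slope, n):
        """Single-section Manning estimate (SI units): Q = (1/n) * A * R^(2/3) * S^(1/2)."""
        hydraulic_radius = area / wetted_perimeter
        return (1.0 / n) * area * hydraulic_radius ** (2.0 / 3.0) * slope ** 0.5

    # Example: 42 m^2 cross section, 30 m wetted perimeter,
    # water-surface slope 0.002 from surveyed high-water marks, roughness n = 0.035
    print(manning_discharge(42.0, 30.0, 0.002, 0.035))   # discharge in m^3/s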
Contemporary issues in HIM. The application layer--III.
Wear, L L; Pinkert, J R
1993-07-01
We have seen document preparation systems evolve from basic line editors through powerful, sophisticated desktop publishing programs. This component of the application layer is probably one of the most used, and most readily identifiable. Ask grade school children nowadays, and many will tell you that they have written a paper on a computer. Next month will be a "fun" tour through a number of other application programs we find useful. They will range from a simple notebook reminder to a sophisticated photograph processor. Application layer: Software targeted for the end user, focusing on a specific application area, and typically residing in the computer system as distinct components on top of the OS. Desktop publishing: A document preparation program that begins with the text features of a word processor, then adds the ability for a user to incorporate outputs from a variety of graphic programs, spreadsheets, and other applications. Line editor: A document preparation program that manipulates text in a file on the basis of numbered lines. Word processor: A document preparation program that can, among other things, reformat sections of documents, move and replace blocks of text, use multiple character fonts, automatically create a table of contents and index, create complex tables, and combine text and graphics.
Excel Yourself with Personalised Email Messages
ERIC Educational Resources Information Center
McClean, Stephen
2008-01-01
Combining the Excel spreadsheet with an email program provides a very powerful tool for sending students personalised emails. Most email clients now support a Mail Merge facility whereby a generic template is created and information unique to each student record in the spreadsheet is filled into that template, generating tens if not hundreds of…
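A hedged Python sketch of the same mail-merge idea outside Excel is shown below: read student records from a CSV export of the spreadsheet, fill a template per row, and send each message. The column names, SMTP server, and addresses are illustrative assumptions, and authentication is omitted for brevity.

    import csv
    import smtplib
    from email.message import EmailMessage

    TEMPLATE = "Dear {name},\n\nYour mark for practical {practical} was {mark}/100.\n"

    with open("students.csv", newline="") as f, smtplib.SMTP("smtp.example.ac.uk") as server:
        for row in csv.DictReader(f):            # expects columns: name, email, practical, mark
            msg = EmailMessage()
            msg["Subject"] = "Practical feedback"
            msg["From"] = "lecturer@example.ac.uk"
            msg["To"] = row["email"]
            msg.set_content(TEMPLATE.format(**row))
            server.send_message(msg)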
A Spreadsheet for the Mixing of a Row of Jets with a Confined Crossflow
NASA Technical Reports Server (NTRS)
Holderman, J. D.; Smith, T. D.; Clisset, J. R.; Lear, W. E.
2005-01-01
An interactive computer code, written with a readily available software program, Microsoft Excel (Microsoft Corporation, Redmond, WA), is presented which displays 3-D oblique plots of a conserved scalar distribution downstream of jets mixing with a confined crossflow, for a single row, double rows, or opposed rows of jets, with or without flow-area convergence and/or a non-uniform crossflow scalar distribution. This project used a previously developed empirical model of jets mixing in a confined crossflow to create a Microsoft Excel spreadsheet that can output the profiles of a conserved scalar for jets injected into a confined crossflow, given several input variables. The program uses multiple spreadsheets in a single Microsoft Excel notebook to carry out the modeling. The first sheet contains the main program, controls for the type of problem to be solved, and convergence criteria. The first sheet also provides for input of the specific geometry and flow conditions. The second sheet presents the results calculated with this routine to show the effects of varying flow and geometric parameters on the mixing. Comparisons are also made between results from the version of the empirical correlations implemented in the spreadsheet and the versions originally written in Applesoft BASIC (Apple Computer, Cupertino, CA) in the 1980s.
A Spreadsheet for the Mixing of a Row of Jets with a Confined Crossflow. Supplement
NASA Technical Reports Server (NTRS)
Holderman, J. D.; Smith, T. D.; Clisset, J. R.; Lear, W. E.
2005-01-01
An interactive computer code, written with a readily available software program, Microsoft Excel (Microsoft Corporation, Redmond, WA), is presented which displays 3-D oblique plots of a conserved scalar distribution downstream of jets mixing with a confined crossflow, for a single row, double rows, or opposed rows of jets, with or without flow-area convergence and/or a non-uniform crossflow scalar distribution. This project used a previously developed empirical model of jets mixing in a confined crossflow to create a Microsoft Excel spreadsheet that can output the profiles of a conserved scalar for jets injected into a confined crossflow, given several input variables. The program uses multiple spreadsheets in a single Microsoft Excel notebook to carry out the modeling. The first sheet contains the main program, controls for the type of problem to be solved, and convergence criteria. The first sheet also provides for input of the specific geometry and flow conditions. The second sheet presents the results calculated with this routine to show the effects of varying flow and geometric parameters on the mixing. Comparisons are also made between results from the version of the empirical correlations implemented in the spreadsheet and the versions originally written in Applesoft BASIC (Apple Computer, Cupertino, CA) in the 1980s.
Space-Plane Spreadsheet Program
NASA Technical Reports Server (NTRS)
Mackall, Dale
1993-01-01
The Basic Hypersonic Data and Equations (HYPERDATA) spreadsheet computer program provides data gained from three analyses of space-plane performance. The equations used to perform the analyses are derived from Newton's second law; the derivation is included. The first analysis is a parametric study of some basic factors affecting the ability of the space plane to reach orbit. The second includes calculation of the thickness of a spherical fuel tank. The third produces the ratio between fuel volume and total mass for each of various aircraft. HYPERDATA is intended for use on Macintosh(R) series computers running Microsoft Excel 3.0.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1991-02-01
This appendix is a compilation of work done to predict overall cycle performance from gasifier to generator terminals. A spreadsheet has been generated for each case to show flows within a cycle. The spreadsheet shows the gaseous or solid composition, temperature, quantity, and heat content of each flow. Prediction of steam and gas turbine performance was obtained with the computer program GTPro. Outputs of all runs for each combined cycle reviewed have been added to this appendix. A process schematic displaying all flows predicted through GTPro and the spreadsheet is also included in this appendix. The numbered bubbles on the schematic correspond to columns in the top headings of the spreadsheet.
Orbiter Flying Qualities (OFQ) Workstation user's guide
NASA Technical Reports Server (NTRS)
Myers, Thomas T.; Parseghian, Zareh; Hogue, Jeffrey R.
1988-01-01
This project was devoted to the development of a software package, called the Orbiter Flying Qualities (OFQ) Workstation, for working with the OFQ Archives which are specially selected sets of space shuttle entry flight data relevant to flight control and flying qualities. The basic approach to creation of the workstation software was to federate and extend commercial software products to create a low cost package that operates on personal computers. Provision was made to link the workstation to large computers, but the OFQ Archive files were also converted to personal computer diskettes and can be stored on workstation hard disk drives. The primary element of the workstation developed in the project is the Interactive Data Handler (IDH) which allows the user to select data subsets from the archives and pass them to specialized analysis programs. The IDH was developed as an application in a relational database management system product. The specialized analysis programs linked to the workstation include a spreadsheet program, FREDA for spectral analysis, MFP for frequency domain system identification, and NIPIP for pilot-vehicle system parameter identification. The workstation also includes capability for ensemble analysis over groups of missions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bowman, A.W.
1990-04-01
This paper describes an approach to solving air quality problems that frequently occur during iterations of the baseline change process. From a schedule standpoint, it is desirable to perform this evaluation in as short a time as possible, while budgetary pressures limit the size of the staff available to do the work. Without a method in place to deal with baseline change proposal requests, the environmental analysts may not be able to produce the analysis results in the time frame expected. Using a concept called the Rapid Response Air Quality Analysis System (RAAS), the problems of timing and cost become tractable. The system could be adapted to assess other atmospheric pathway impacts, e.g., acoustics or visibility. The air quality analysis system used to perform the environmental assessment (EA) for the Salt Repository Project (part of the Civilian Radioactive Waste Management Program), and later to evaluate the consequences of proposed baseline changes, consists of three components: emission source data files; emission rates contained in spreadsheets; and impact assessment model codes. The spreadsheets contain user-written codes (macros) that calculate emission rates from (1) emission source data (e.g., numbers and locations of sources, detailed operating schedules, and source specifications including horsepower, load factor, and duty cycle); (2) emission factors, such as those published by the U.S. Environmental Protection Agency; and (3) control efficiencies.
Electronic spreadsheet vs. manual payroll.
Kiley, M M
1991-01-01
Medical groups with direct employees must employ someone or contract with a company to compute payroll, writes Michael Kiley, Ph.D., M.P.H. However, many medical groups, including small ones, own a personal or minicomputer to handle accounts receivable. Kiley explains, in detail, how this same computer and a spreadsheet program also can be used to perform payroll functions.
Handling Math Expressions in Economics: Recoding Spreadsheet Teaching Tool of Growth Models
ERIC Educational Resources Information Center
Moro-Egido, Ana I.; Pedauga, Luis E.
2017-01-01
In the present paper, we develop a teaching methodology for economic theory. The main contribution of this paper relies on combining the interactive characteristics of spreadsheet programs such as Excel and Unicode plain-text linear format for mathematical expressions. The advantage of Unicode standard rests on its ease for writing and reading…
Well 9-1 Logs and Data: Roosevelt Hot Spring Area, Utah (FORGE)
Joe Moore
2016-03-03
This is a compilation of logs and data from Well 9-1 in the Roosevelt Hot Springs area in Utah. This well is also in the Utah FORGE study area. The file is in a compressed .zip format and there is a data inventory table (Excel spreadsheet) in the root folder that is a guide to the data that is accessible in subfolders.
Tapir: A web interface for transit/eclipse observability
NASA Astrophysics Data System (ADS)
Jensen, Eric
2013-06-01
Tapir is a set of tools, written in Perl, that provides a web interface for showing the observability of periodic astronomical events, such as exoplanet transits or eclipsing binaries. The package provides tools for creating finding charts for each target and airmass plots for each event. The code can access target lists that are stored on-line in a Google spreadsheet or in a local text file.
Roosevelt Hot Springs, Utah FORGE X-Ray Diffraction Data
Nash, Greg; Jones, Clay
2018-02-07
This dataset contains X-ray diffraction (XRD) data taken from wells and outcrops as part of the DOE GTO supported Utah FORGE project located near Roosevelt Hot Springs. It contains an Excel spreadsheet with the XRD data, a text file with sample site names, types, and locations in UTM, Zone 12, NAD83 coordinates, and a GIS shapefile of the sample locations with attributes.
Sousa, Luiz Cláudio Demes da Mata; Filho, Herton Luiz Alves Sales; Von Glehn, Cristina de Queiroz Carrascosa; da Silva, Adalberto Socorro; Neto, Pedro de Alcântara dos Santos; de Castro, José Adail Fonseca; do Monte, Semíramis Jamil Hadad
2011-12-01
The global challenge for solid organ transplantation programs is to distribute organs to the highly sensitized recipients. The purpose of this work is to describe and test the functionality of the EpHLA software, a program that automates the analysis of acceptable and unacceptable HLA epitopes on the basis of the HLAMatchmaker algorithm. HLAMatchmaker considers small configurations of polymorphic residues referred to as eplets as essential components of HLA epitopes. Currently, the analyses require the creation of temporary files and the manual cutting and pasting of laboratory test results between electronic spreadsheets, which is time-consuming and prone to administrative errors. The EpHLA software was developed in the Object Pascal programming language and uses the HLAMatchmaker algorithm to generate histocompatibility reports. The automated generation of reports requires the integration of files containing the results of laboratory tests (HLA typing, anti-HLA antibody signature) and public data banks (NMDP, IMGT). The integration and the access to these data were accomplished by means of the framework called eDAFramework. The eDAFramework was developed in Object Pascal and PHP and provides data access functionalities for software developed in these languages. The tool functionality was successfully tested in comparison to actual, manually derived reports of patients from a renal transplantation program with related donors. We successfully developed software that enables the automated definition of the epitope specificities of HLA antibodies. This new tool will benefit the management of recipient/donor pair selection for highly sensitized patients. Copyright © 2011 Elsevier B.V. All rights reserved.
The Maryland power plant research program internet resource for precipitation chemistry data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Corio, L.A.; Jones, W.B.; Sherwell, J.
1999-07-01
The Maryland Department of Natural Resources Power Plant Research Program (PPRP) initiated a project in 1998 to make available on the World Wide Web (WWW) precipitation chemistry data from monitoring sites located in the Chesapeake Bay watershed. To that end, PPRP obtained, from various organizations, background information on atmospheric deposition monitoring programs (some of which are still ongoing), as well as special studies. For those programs and studies with available precipitation chemistry data of known quality (data were not available for all programs and studies), PPRP obtained, processed, and uploaded the data to its WWW site (www.versar.com/pprp/features/aciddep/aciddep.htm). These data can either be viewed on the web site or downloaded as a zipped file in either comma-delimited or Excel spreadsheet format. PPRP also provides descriptions of the monitoring programs/studies, including information on measurement methods and quality assurance procedures, where available. For the few monitoring programs (e.g., NADP) with existing web sites that allow on-line access to data, PPRP provides links to these sites. PPRP currently is working with the National Oceanic and Atmospheric Administration (NOAA) Air Resources Laboratory (ARL) in a cooperative effort to make more precipitation chemistry data easily available to the scientific community.
Popoola, Segun I; Atayero, Aderemi A; Badejo, Joke A; Odukoya, Jonathan A; Omole, David O; Ajayi, Priscilla
2018-06-01
In this data article, we present and analyze the demographic data of undergraduates admitted into engineering programs at Covenant University, Nigeria. The population distribution of 2649 candidates admitted into Chemical Engineering, Civil Engineering, Computer Engineering, Electrical and Electronics Engineering, Information and Communication Engineering, Mechanical Engineering, and Petroleum Engineering programs between 2002 and 2009 is analyzed by gender, age, and state of origin. The data provided in this data article were retrieved from the student bio-data submitted to the Department of Admissions and Student Records (DASR) and the Center for Systems and Information Services (CSIS) by the candidates during the application process into the various engineering undergraduate programs. This vital information is made publicly available, after proper data anonymization, to facilitate empirical research in the emerging field of demographics analytics in higher education. A Microsoft Excel spreadsheet file is attached to this data article, and the data are thoroughly described for easy reuse. Descriptive statistics and frequency distributions of the demographic data are presented in tables, plots, graphs, and charts. Unrestricted access to these demographic data will facilitate reliable and evidence-based research findings for sustainable education in developing countries.
A Database of Woody Vegetation Responses to Elevated Atmospheric CO2 (NDP-072)
Curtis, Peter S [The Ohio State Univ., Columbus, OH (United States); Cushman, Robert M [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Brenkert, Antoinette L [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)
1999-01-01
To perform a statistically rigorous meta-analysis of research results on the response by woody vegetation to increased atmospheric CO2 levels, a multiparameter database of responses was compiled. Eighty-four independent CO2-enrichment studies, covering 65 species and 35 response parameters, met the necessary criteria for inclusion in the database: reporting mean response, sample size, and variance of the response (either as standard deviation or standard error). Data were retrieved from the published literature and unpublished reports. This numeric data package contains a 29-field data set of CO2-exposure experiment responses by woody plants (as both a flat ASCII file and a spreadsheet file), files listing the references to the CO2-exposure experiments and specific comments relevant to the data in the data set, and this documentation file (which includes SAS and Fortran codes to read the ASCII data file; SAS is a registered trademark of the SAS Institute, Inc., Cary, North Carolina 27511).
ERIC Educational Resources Information Center
Ray, Darrell L.
2013-01-01
Students often enter biology programs deficient in the math and computational skills that would enhance their attainment of a deeper understanding of the discipline. To address some of these concerns, I developed a series of spreadsheet simulation exercises that focus on some of the mathematical foundations of scientific inquiry and the benefits…
RENEW v3.2 user's manual, maintenance estimation simulation for Space Station Freedom Program
NASA Technical Reports Server (NTRS)
Bream, Bruce L.
1993-01-01
RENEW is a maintenance event estimation simulation program developed in support of the Space Station Freedom Program (SSFP). This simulation uses reliability and maintainability (R&M) and logistics data to estimate both average and time-dependent maintenance demands. The simulation uses Monte Carlo techniques to generate failure and repair times as a function of the R&M and logistics parameters. The estimates are generated for a single type of orbital replacement unit (ORU). The simulation has been in use by the SSFP Work Package 4 prime contractor, Rocketdyne, since January 1991. The RENEW simulation gives closer estimates of performance since it uses a time-dependent approach and depicts more factors affecting ORU failure and repair than steady-state average calculations. RENEW gives both average and time-dependent demand values. Graphs of failures over the mission period and yearly failure occurrences are generated. The average demand rate for the ORU over the mission period is also calculated. While RENEW displays the results in graphs, the results are also available in a data file for further use by spreadsheets or other programs. The process of using RENEW starts with keyboard entry of the R&M and operational data. Once entered, the data may be saved in a data file for later retrieval. The parameters may be viewed and changed after entry using RENEW. The simulation program runs the number of Monte Carlo simulations requested by the operator. Plots and tables of the results can be viewed on the screen or sent to a printer. The results of the simulation are saved along with the input data. Help screens are provided with each menu and data entry screen.
COMPASS: a suite of pre- and post-search proteomics software tools for OMSSA
Wenger, Craig D.; Phanstiel, Douglas H.; Lee, M. Violet; Bailey, Derek J.; Coon, Joshua J.
2011-01-01
Here we present the Coon OMSSA Proteomic Analysis Software Suite (COMPASS): a free and open-source software pipeline for high-throughput analysis of proteomics data, designed around the Open Mass Spectrometry Search Algorithm. We detail a synergistic set of tools for protein database generation, spectral reduction, peptide false discovery rate analysis, peptide quantitation via isobaric labeling, protein parsimony and protein false discovery rate analysis, and protein quantitation. We strive for maximum ease of use, utilizing graphical user interfaces and working with data files in the original instrument vendor format. Results are stored in plain text comma-separated values files, which are easy to view and manipulate with a text editor or spreadsheet program. We illustrate the operation and efficacy of COMPASS through the use of two LC–MS/MS datasets. The first is a dataset of a highly annotated mixture of standard proteins and manually validated contaminants that exhibits the identification workflow. The second is a dataset of yeast peptides, labeled with isobaric stable isotope tags and mixed in known ratios, to demonstrate the quantitative workflow. For these two datasets, COMPASS performs equivalently or better than the current de facto standard, the Trans-Proteomic Pipeline. PMID:21298793
Well 14-2 Logs and Data: Roosevelt Hot Spring Area, Utah (Utah FORGE)
Joe Moore
2016-03-03
This is a compilation of logs and data from Well 14-2 in the Roosevelt Hot Springs area in Utah. This well is also in the Utah FORGE study area. The file is in a compressed .zip format and there is a data inventory table (Excel spreadsheet) in the root folder that is a guide to the data that is accessible in subfolders.
Well 52-21 Logs and Data: Roosevelt Hot Spring Area, Utah (Utah FORGE)
Joe Moore
2016-03-03
This is a compilation of logs and data from Well 52-21 in the Roosevelt Hot Springs area in Utah. This well is also in the Utah FORGE study area. The file is in a compressed .zip format and there is a data inventory table (Excel spreadsheet) in the root folder that is a guide to the data that is accessible in subfolders.
Well 82-33 Logs and Data: Roosevelt Hot Spring Area, Utah (Utah FORGE)
Joe Moore
2016-03-03
This is a compilation of logs and data from Well 82-33 in the Roosevelt Hot Springs area in Utah. This well is also in the Utah FORGE study area. The file is in a compressed .zip format and there is a data inventory table (Excel spreadsheet) in the root folder that is a guide to the data that is accessible in subfolders.
Well Acord 1-26 Logs and Data: Roosevelt Hot Spring Area, Utah (Utah FORGE)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joe Moore
This is a compilation of logs and data from Well Acord 1-26 in the Roosevelt Hot Springs area in Utah. This well is also in the Utah FORGE study area. The file is in a compressed .zip format and there is a data inventory table (Excel spreadsheet) in the root folder that is a guide to the data that is accessible in subfolders.
Least-Squares Neutron Spectral Adjustment with STAYSL PNNL
NASA Astrophysics Data System (ADS)
Greenwood, L. R.; Johnson, C. D.
2016-02-01
The STAYSL PNNL computer code, a descendant of the STAY'SL code [1], performs neutron spectral adjustment of a starting neutron spectrum, applying a least squares method to determine adjustments based on saturated activation rates, neutron cross sections from evaluated nuclear data libraries, and all associated covariances. STAYSL PNNL is provided as part of a comprehensive suite of programs [2], where additional tools in the suite are used for assembling a set of nuclear data libraries and determining all required corrections to the measured data to determine saturated activation rates. Neutron cross section and covariance data are taken from the International Reactor Dosimetry File (IRDF-2002) [3], which was sponsored by the International Atomic Energy Agency (IAEA), though work is planned to update to data from the IAEA's International Reactor Dosimetry and Fusion File (IRDFF) [4]. The nuclear data and associated covariances are extracted from IRDF-2002 using the third-party NJOY99 computer code [5]. The NJpp translation code converts the extracted data into a library data array format suitable for use as input to STAYSL PNNL. The software suite also includes three utilities to calculate corrections to measured activation rates. Neutron self-shielding corrections are calculated as a function of neutron energy with the SHIELD code and are applied to the group cross sections prior to spectral adjustment, thus making the corrections independent of the neutron spectrum. The SigPhi Calculator is a Microsoft Excel spreadsheet used for calculating saturated activation rates from raw gamma activities by applying corrections for gamma self-absorption, neutron burn-up, and the irradiation history. Gamma self-absorption and neutron burn-up corrections are calculated (iteratively in the case of the burn-up) within the SigPhi Calculator spreadsheet. The irradiation history corrections are calculated using the BCF computer code and are inserted into the SigPhi Calculator workbook for use in correcting the measured activities. Output from the SigPhi Calculator is automatically produced, and consists of a portion of the STAYSL PNNL input file data that is required to run the spectral adjustment calculations. Within STAYSL PNNL, the least-squares process is performed in one step, without iteration, and provides rapid results on PC platforms. STAYSL PNNL creates multiple output files with tabulated results, data suitable for plotting, and data formatted for use in subsequent radiation damage calculations using the SPECTER computer code (which is not included in the STAYSL PNNL suite). All components of the software suite have undergone extensive testing and validation prior to release and test cases are provided with the package.
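As a hedged illustration of the kind of correction the SigPhi Calculator applies (not its actual worksheet formulas), the sketch below converts a measured activity into a saturated activation rate using standard decay and saturation factors for a single continuous irradiation:

```python
# A minimal sketch, not the SigPhi Calculator's worksheet logic: convert a measured
# activity into a saturated activation rate with standard decay and build-up
# corrections for one continuous irradiation followed by a cooling period.
import math

def saturated_rate(activity_bq: float, half_life_s: float,
                   t_irradiation_s: float, t_cooling_s: float,
                   n_target_atoms: float) -> float:
    """Reactions per target atom per second at saturation."""
    lam = math.log(2.0) / half_life_s
    a_end_of_irradiation = activity_bq * math.exp(lam * t_cooling_s)   # decay correction
    saturation_factor = 1.0 - math.exp(-lam * t_irradiation_s)         # build-up correction
    a_saturated = a_end_of_irradiation / saturation_factor
    return a_saturated / n_target_atoms
```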
Heredia-López, Francisco J; Álvarez-Cervera, Fernando J; Collí-Alfaro, José G; Bata-García, José L; Arankowsky-Sandoval, Gloria; Góngora-Alfaro, José L
2016-12-01
Continuous spontaneous alternation behavior (SAB) in a Y-maze is used for evaluating working memory in rodents. Here, the design of an automated Y-maze equipped with three infrared optocouplers per arm, and commanded by a reduced instruction set computer (RISC) microcontroller is described. The software was devised for recording only true entries and exits to the arms. Experimental settings are programmed via a keyboard with three buttons and a display. The sequence of arm entries and the time spent in each arm and the neutral zone (NZ) are saved as a text file in a non-volatile memory for later transfer to a USB flash memory. Data files are analyzed with a program developed under LabVIEW® environment, and the results are exported to an Excel® spreadsheet file. Variables measured are: latency to exit the starting arm, sequence and number of arm entries, number of alternations, alternation percentage, and cumulative times spent in each arm and NZ. The automated Y-maze accurately detected the SAB decrease produced in rats by the muscarinic antagonist trihexyphenidyl, and its reversal by caffeine, having 100 % concordance with the alternation percentages calculated by two trained observers who independently watched videos of the same experiments. Although the values of time spent in the arms and NZ measured by the automated system had small discrepancies with those calculated by the observers, Bland-Altman analysis showed 95 % concordance in three pairs of comparisons, while in one it was 90 %, indicating that this system is a reliable and inexpensive alternative for the study of continuous SAB in rodents.
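For orientation, the following sketch computes an alternation percentage under one common definition (any three consecutive entries into three different arms); the published software may apply additional rules, such as the true entry/exit criteria described above:

```python
# Illustrative only: one common definition of continuous spontaneous alternation,
# where an alternation is any three consecutive entries into three different arms.

def alternation_percentage(entries: list[str]) -> float:
    """entries: sequence of arm labels in order of entry, e.g. ['A', 'B', 'C', 'A', ...]."""
    if len(entries) < 3:
        return 0.0
    triplets = zip(entries, entries[1:], entries[2:])
    alternations = sum(1 for t in triplets if len(set(t)) == 3)
    return 100.0 * alternations / (len(entries) - 2)

print(alternation_percentage(list("ABCACBABC")))  # 5 alternations over 7 triplets ~ 71.4
```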
GENPLOT: A formula-based Pascal program for data manipulation and plotting
NASA Astrophysics Data System (ADS)
Kramer, Matthew J.
Geochemical processes involving alteration, differentiation, fractionation, or migration of elements may be elucidated by a number of discrimination or variation diagrams (e.g., AFM, Harker, Pearce, and many others). The construction of these diagrams involves arithmetic combination of selective elements (involving major, minor, or trace elements). GENPLOT utilizes a formula-based algorithm (an expression parser) which enables the program to manipulate multiparameter databases and plot XY, ternary, tetrahedron, and REE type plots without needing to change the source code or rearrange databases. Formulae may be any quadratic expression whose variables are the column headings of the data matrix. A full-screen editor with limited equations and arithmetic functions (spreadsheet) has been incorporated into the program to aid data entry and editing. Data are stored as ASCII files to facilitate interchange of data between other programs and computers. GENPLOT was developed in Turbo Pascal for the IBM and compatible computers but also is available in Apple Pascal for the Apple IIe and III. Because the source code is too extensive to list here (about 5200 lines of Pascal code), the expression parsing routine, which is central to GENPLOT's flexibility, is incorporated into a smaller demonstration program named SOLVE. The following paper includes a discussion on how the expression parser works and a detailed description of GENPLOT's capabilities.
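GENPLOT's Pascal parser is not reproduced here, but the idea of evaluating a user-typed formula whose variables are column headings can be sketched in a few lines of Python using the standard ast module; the column names and supported operators below are illustrative:

```python
# A sketch of a formula evaluator over column headings: the user-supplied string
# is parsed with Python's ast module and only +, -, *, /, **, unary minus,
# parentheses, numbers, and column names are evaluated.
import ast
import operator as op

_OPS = {ast.Add: op.add, ast.Sub: op.sub, ast.Mult: op.mul,
        ast.Div: op.truediv, ast.Pow: op.pow, ast.USub: op.neg}

def eval_formula(formula: str, row: dict[str, float]) -> float:
    """Evaluate e.g. 'Na2O + K2O' or 'FeO / (FeO + MgO)' against one data row."""
    def walk(node):
        if isinstance(node, ast.Expression):
            return walk(node.body)
        if isinstance(node, ast.BinOp):
            return _OPS[type(node.op)](walk(node.left), walk(node.right))
        if isinstance(node, ast.UnaryOp):
            return _OPS[type(node.op)](walk(node.operand))
        if isinstance(node, ast.Name):
            return row[node.id]            # column heading lookup
        if isinstance(node, ast.Constant):
            return float(node.value)
        raise ValueError("unsupported syntax in formula")
    return walk(ast.parse(formula, mode="eval"))

print(eval_formula("FeO / (FeO + MgO)", {"FeO": 8.2, "MgO": 7.1}))
```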
ISPATOM: A Generic Real-Time Data Processing Tool Without Programming
NASA Technical Reports Server (NTRS)
Dershowitz, Adam
2007-01-01
Information Sharing Protocol Advanced Tool of Math (ISPATOM) is an application program allowing for the streamlined generation of comps, which subscribe to streams of incoming telemetry data, perform any necessary computations on the data, then send the data to other programs for display and/or further processing in NASA mission control centers. Heretofore, the development of comps was difficult, expensive, and time-consuming: each comp was custom-written manually, in a low-level computing language, by a programmer attempting to follow requirements of flight controllers. ISPATOM enables a flight controller who is not a programmer to write a comp by simply typing in one or more equation(s) at a command line or retrieving the equation(s) from a text file. ISPATOM then subscribes to the necessary input data, performs all of the necessary computations, and sends out the results. It sends out new results whenever the input data change. The use of equations in ISPATOM is no more difficult than entering equations in a spreadsheet. The time involved in developing a comp is thus limited to the time taken to decide on the necessary equations. Thus, ISPATOM is a real-time dynamic calculator.
A Database of Herbaceous Vegetation Responses to Elevated Atmospheric CO2 (NDP-073)
Jones, Michael H [The Ohio State Univ., Columbus, OH (United States); Curtis, Peter S [The Ohio State Univ., Columbus, OH (United States); Cushman, Robert M [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Brenkert, Antoinette L [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)
1999-01-01
To perform a statistically rigorous meta-analysis of research results on the response by herbaceous vegetation to increased atmospheric CO2 levels, a multiparameter database of responses was compiled from the published literature. Seventy-eight independent CO2-enrichment studies, covering 53 species and 26 response parameters, reported mean response, sample size, and variance of the response (either as standard deviation or standard error). An additional 43 studies, covering 25 species and 6 response parameters, did not report variances. This numeric data package accompanies the Carbon Dioxide Information Analysis Center's (CDIAC's) NDP- 072, which provides similar information for woody vegetation. This numeric data package contains a 30-field data set of CO2- exposure experiment responses by herbaceous plants (as both a flat ASCII file and a spreadsheet file), files listing the references to the CO2-exposure experiments and specific comments relevant to the data in the data sets, and this documentation file (which includes SAS and Fortran codes to read the ASCII data file; SAS is a registered trademark of the SAS Institute, Inc., Cary, North Carolina 27511).
Maxine: A spreadsheet for estimating dose from chronic atmospheric radioactive releases
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jannik, Tim; Bell, Evaleigh; Dixon, Kenneth
MAXINE is an EXCEL© spreadsheet, which is used to estimate dose to individuals for routine and accidental atmospheric releases of radioactive materials. MAXINE does not contain an atmospheric dispersion model, but rather doses are estimated using air and ground concentrations as input. Minimal input is required to run the program and site specific parameters are used when possible. Complete code description, verification of models, and user’s manual have been included.
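MAXINE's pathway models are not reproduced here; as a hedged illustration of the kind of arithmetic such a dose spreadsheet performs, the sketch below computes an inhalation dose from a time-averaged air concentration, with assumed parameter values:

```python
# Not MAXINE's actual model: a minimal illustration of pathway arithmetic for an
# inhalation dose from a time-averaged air concentration. All parameter names and
# the example coefficient are assumptions for illustration only.

def inhalation_dose_sv(air_conc_bq_m3: float, breathing_rate_m3_h: float,
                       exposure_h: float, dose_coeff_sv_per_bq: float) -> float:
    intake_bq = air_conc_bq_m3 * breathing_rate_m3_h * exposure_h
    return intake_bq * dose_coeff_sv_per_bq

# Example: 1 Bq/m3 for a year of occupancy, adult breathing rate, assumed
# inhalation dose coefficient of 4.6e-9 Sv/Bq.
print(inhalation_dose_sv(1.0, 1.2, 8760, 4.6e-9))
```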
User Instructions for the Policy Analysis Modeling System (PAMS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
McNeil, Michael A.; Letschert, Virginie E.; Van Buskirk, Robert D.
PAMS uses country-specific and product-specific data to calculate estimates of impacts of a Minimum Efficiency Performance Standard (MEPS) program. The analysis tool is self-contained in a Microsoft Excel spreadsheet, and requires no links to external data, or special code additions to run. The analysis can be customized to a particular program without additional user input, through the use of the pull-down menus located on the Summary page. In addition, the spreadsheet contains many areas into which user-generated input data can be entered for increased accuracy of projection. The following is a step-by-step guide for using and customizing the tool.
Analysis of the Requirements Generation Process for the Logistics Analysis and Wargame Support Tool
2017-06-01
For instance, the requirements for a pen seem straightforward; however, they may vary depending on the context in which the pen will be used...the interactions between the operational elements, specify which tasks are dependent on others and the order of executing tasks, and estimate how...configuration file to call that spreadsheet. This requirement can be met depending on the situation. If the nodes and arcs are pre-defined and readily
Calibration of work zone impact analysis software for Missouri.
DOT National Transportation Integrated Search
2013-12-01
This project calibrated two software programs used for estimating the traffic impacts of work zones. The WZ Spreadsheet and VISSIM programs were recommended in a previous study by the authors. The two programs were calibrated using field data fro...
Cuffney, Thomas F.; Brightbill, Robin A.
2011-01-01
The Invertebrate Data Analysis System (IDAS) software was developed to provide an accurate, consistent, and efficient mechanism for analyzing invertebrate data collected as part of the U.S. Geological Survey National Water-Quality Assessment (NAWQA) Program. The IDAS software is a stand-alone program for personal computers that run Microsoft Windows(Registered). It allows users to read data downloaded from the NAWQA Program Biological Transactional Database (Bio-TDB) or to import data from other sources either as Microsoft Excel(Registered) or Microsoft Access(Registered) files. The program consists of five modules: Edit Data, Data Preparation, Calculate Community Metrics, Calculate Diversities and Similarities, and Data Export. The Edit Data module allows the user to subset data on the basis of taxonomy or sample type, extract a random subsample of data, combine or delete data, summarize distributions, resolve ambiguous taxa (see glossary) and conditional/provisional taxa, import non-NAWQA data, and maintain and create files of invertebrate attributes that are used in the calculation of invertebrate metrics. The Data Preparation module allows the user to select the type(s) of sample(s) to process, calculate densities, delete taxa on the basis of laboratory processing notes, delete pupae or terrestrial adults, combine lifestages or keep them separate, select a lowest taxonomic level for analysis, delete rare taxa on the basis of the number of sites where a taxon occurs and (or) the abundance of a taxon in a sample, and resolve taxonomic ambiguities by one of four methods. The Calculate Community Metrics module allows the user to calculate 184 community metrics, including metrics based on organism tolerances, functional feeding groups, and behavior. The Calculate Diversities and Similarities module allows the user to calculate nine diversity and eight similarity indices. The Data Export module allows the user to export data to other software packages (CANOCO, Primer, PC-ORD, MVSP) and produce tables of community data that can be imported into spreadsheet, database, graphics, statistics, and word-processing programs. The IDAS program facilitates the documentation of analyses by keeping a log of the data that are processed, the files that are generated, and the program settings used to process the data. Though the IDAS program was developed to process NAWQA Program invertebrate data downloaded from Bio-TDB, the Edit Data module includes tools that can be used to convert non-NAWQA data into Bio-TDB format. Consequently, the data manipulation, analysis, and export procedures provided by the IDAS program can be used to process data generated outside of the NAWQA Program.
A day in the life of a monitor!
Shah, Kunal
2012-01-01
When at a site, the monitor will meet with the Study Coordinator, review the hospital medical records, use the internet database or paper to 'monitor' their data versus their medical records, issue queries, check master files, count tablets or vials, provide the update to the doctor, and so on. When not traveling, the monitor will work in the office, printing letters, filing documents collected from sites, writing reports and follow-up letters, responding to e-mails, calling sites to follow up on pending action items, in addition to calling sites not visited recently, attending study teleconferences, attending study and company training programs, reading standard operating procedures, completing Excel spreadsheets or company-specific software systems, and so on. The monitor is loaded with all these different types of work requirements, and most importantly each and every task is important and time bound. Different skill sets are required for different tasks, and the monitor plays different roles while doing different tasks. This article lists the tasks that the monitor is required to do, describes the different roles played by the monitor while doing these tasks, analyzes which is the most important day for a monitor and what tasks are performed during this day, and identifies what knowledge and skills are required for performing these tasks.
The Particle Physics Playground website: tutorials and activities using real experimental data
NASA Astrophysics Data System (ADS)
Bellis, Matthew; CMS Collaboration
2016-03-01
The CERN Open Data Portal provides access to data from the LHC experiments to anyone with the time and inclination to learn the analysis procedures. The CMS experiment has made a significant amount of data available in basically the same format the collaboration itself uses, along with software tools and a virtual environment in which to run those tools. These same data have also been mined for educational exercises that range from very simple .csv files that can be analyzed in a spreadsheet to more sophisticated formats that use ROOT, a dominant software package in experimental particle physics but not used as much in the general computing community. This talk will present the Particle Physics Playground website (http://particle-physics-playground.github.io/), a project that uses data from the CMS experiment, as well as the older CLEO experiment, in tutorials and exercises aimed at high school and undergraduate students and other science enthusiasts. The data are stored as text files, and the users are provided with starter Python/Jupyter notebook programs and accessor functions which can be modified to perform fairly high-level analyses. The status of the project, success stories, and future plans for the website will be presented. This work was supported in part by NSF Grant PHY-1307562.
The meaning of diagnostic test results: a spreadsheet for swift data analysis.
Maceneaney, P M; Malone, D E
2000-03-01
To design a spreadsheet program to: (a) rapidly analyse diagnostic test result data produced in local research or reported in the literature; (b) correct reported predictive values for disease prevalence in any population; (c) estimate the post-test probability of disease in individual patients. Microsoft Excel(TM) was used. Section A: a contingency (2 x 2) table was incorporated into the spreadsheet. Formulae for standard calculations [sample size, disease prevalence, sensitivity and specificity with 95% confidence intervals, predictive values and likelihood ratios (LRs)] were linked to this table. The results change automatically when the data in the true or false negative and positive cells are changed. Section B: this estimates predictive values in any population, compensating for altered disease prevalence. Sections C-F: Bayes' theorem was incorporated to generate individual post-test probabilities. The spreadsheet generates 95% confidence intervals, LRs, and a table and graph of conditional probabilities once the sensitivity and specificity of the test are entered. The latter shows the expected post-test probability of disease for any pre-test probability when a test of known sensitivity and specificity is positive or negative. This spreadsheet can be used on desktop and palmtop computers. The MS Excel(TM) version can be downloaded via the Internet from the URL ftp://radiography.com/pub/Rad-data99.xls. A spreadsheet is useful for contingency table data analysis and assessment of the clinical meaning of diagnostic test results. Copyright 2000 The Royal College of Radiologists.
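The Bayesian step described above can be sketched as follows (a minimal illustration, not the authors' worksheet): the pre-test probability is converted to odds, multiplied by the likelihood ratio implied by the test's sensitivity and specificity, and converted back to a post-test probability:

```python
# A sketch of the Bayesian post-test probability calculation via likelihood ratios.
# Not the published spreadsheet's formulas; standard definitions are assumed:
# LR+ = sens / (1 - spec), LR- = (1 - sens) / spec.

def post_test_probability(pre_test_p: float, sensitivity: float,
                          specificity: float, test_positive: bool) -> float:
    lr = (sensitivity / (1.0 - specificity) if test_positive
          else (1.0 - sensitivity) / specificity)
    pre_odds = pre_test_p / (1.0 - pre_test_p)
    post_odds = pre_odds * lr
    return post_odds / (1.0 + post_odds)

# Example: sensitivity 0.90, specificity 0.80, pre-test probability 0.30.
print(post_test_probability(0.30, 0.90, 0.80, test_positive=True))   # ~0.66
print(post_test_probability(0.30, 0.90, 0.80, test_positive=False))  # ~0.05
```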
Valentine, Page C.; Gallea, Leslie B.; Blackwood, Dann S.; Twomey, Erin R.
2010-01-01
The U.S. Geological Survey, in collaboration with the National Oceanic and Atmospheric Administration's National Marine Sanctuary Program, conducted seabed mapping and related research in the Stellwagen Bank National Marine Sanctuary region from 1993 to 2004. The mapped area is approximately 3,700 km² (1,100 nmi²) and was subdivided into 18 quadrangles. An extensive series of sea-floor maps of the region based on multibeam sonar surveys has been published as paper maps and online in digital format (PDF, EPS, PS). In addition, 2,628 seabed-sediment samples were collected and analyzed and are in the usSEABED: Atlantic Coast Offshore Surficial Sediment Data Release. This report presents for viewing and downloading the more than 10,600 still seabed photographs that were acquired during the project. The digital images are provided in thumbnail, medium (1536 x 1024 pixels), and high (3071 x 2048 pixels) resolution. The images can be viewed by quadrangle on the U.S. Geological Survey Woods Hole Coastal and Marine Science Center's photograph database. Photograph metadata are embedded in each image in Exchangeable Image File Format and also provided in spreadsheet format. Published digital topographic maps and descriptive text for seabed features are included here for downloading and serve as context for the photographs. An interactive topographic map for each quadrangle shows locations of photograph stations, and each location is linked to the photograph database. This map also shows stations where seabed sediment was collected for texture analysis; the results of grain-size analysis and associated metadata are presented in spreadsheet format.
Parkhurst, David L.; Kipp, Kenneth L.; Engesgaard, Peter; Charlton, Scott R.
2004-01-01
The computer program PHAST simulates multi-component, reactive solute transport in three-dimensional saturated ground-water flow systems. PHAST is a versatile ground-water flow and solute-transport simulator with capabilities to model a wide range of equilibrium and kinetic geochemical reactions. The flow and transport calculations are based on a modified version of HST3D that is restricted to constant fluid density and constant temperature. The geochemical reactions are simulated with the geochemical model PHREEQC, which is embedded in PHAST. PHAST is applicable to the study of natural and contaminated ground-water systems at a variety of scales ranging from laboratory experiments to local and regional field scales. PHAST can be used in studies of migration of nutrients, inorganic and organic contaminants, and radionuclides; in projects such as aquifer storage and recovery or engineered remediation; and in investigations of the natural rock-water interactions in aquifers. PHAST is not appropriate for unsaturated-zone flow, multiphase flow, density-dependent flow, or waters with high ionic strengths. A variety of boundary conditions are available in PHAST to simulate flow and transport, including specified-head, flux, and leaky conditions, as well as the special cases of rivers and wells. Chemical reactions in PHAST include (1) homogeneous equilibria using an ion-association thermodynamic model; (2) heterogeneous equilibria between the aqueous solution and minerals, gases, surface complexation sites, ion exchange sites, and solid solutions; and (3) kinetic reactions with rates that are a function of solution composition. The aqueous model (elements, chemical reactions, and equilibrium constants), minerals, gases, exchangers, surfaces, and rate expressions may be defined or modified by the user. A number of options are available to save results of simulations to output files. The data may be saved in three formats: a format suitable for viewing with a text editor; a format suitable for exporting to spreadsheets and post-processing programs; or in Hierarchical Data Format (HDF), which is a compressed binary format. Data in the HDF file can be visualized on Windows computers with the program Model Viewer and extracted with the utility program PHASTHDF; both programs are distributed with PHAST. Operator splitting of the flow, transport, and geochemical equations is used to separate the three processes into three sequential calculations. No iterations between transport and reaction calculations are implemented. A three-dimensional Cartesian coordinate system and finite-difference techniques are used for the spatial and temporal discretization of the flow and transport equations. The non-linear chemical equilibrium equations are solved by a Newton-Raphson method, and the kinetic reaction equations are solved by a Runge-Kutta or an implicit method for integrating ordinary differential equations. The PHAST simulator may require large amounts of memory and long Central Processing Unit (CPU) times. To reduce the long CPU times, a parallel version of PHAST has been developed that runs on a multiprocessor computer or on a collection of computers that are networked. The parallel version requires Message Passing Interface, which is currently (2004) freely available. The parallel version is effective in reducing simulation times. This report documents the use of the PHAST simulator, including running the simulator, preparing the input files, selecting the output files, and visualizing the results. 
It also presents four examples that verify the numerical method and demonstrate the capabilities of the simulator. PHAST requires three input files. Only the flow and transport file is described in detail in this report. The other two files, the chemistry data file and the database file, are identical to PHREEQC files and the detailed description of these files is found in the PHREEQC documentation.
User's Manual for Space Debris Surfaces (SD_SURF)
NASA Technical Reports Server (NTRS)
Elfer, N. C.
1996-01-01
A unique collection of computer codes, Space Debris Surfaces (SD_SURF), has been developed to assist in the design and analysis of space debris protection systems. SD_SURF calculates and summarizes a vehicle's vulnerability to space debris as a function of impact velocity and obliquity. An SD_SURF analysis will show which velocities and obliquities are the most probable to cause a penetration. This determination can help the analyst select a shield design which is best suited to the predominant penetration mechanism. The analysis also indicates the most suitable parameters for development or verification testing. The SD_SURF programs are offered as either FORTRAN programs or Microsoft EXCEL spreadsheets and macros. The FORTRAN programs work with BUMPERII version 1.2a or 1.3 (Cosmic released). The EXCEL spreadsheets and macros can be used independently or with selected output from the SD_SURF FORTRAN programs.
NASA Astrophysics Data System (ADS)
Locock, Andrew J.; Mitchell, Roger H.
2018-04-01
Perovskite mineral oxides commonly exhibit extensive solid-solution, and are therefore classified on the basis of the proportions of their ideal end-members. A uniform sequence of calculation of the end-members is required if comparisons are to be made between different sets of analytical data. A Microsoft Excel spreadsheet has been programmed to assist with the classification and depiction of the minerals of the perovskite- and vapnikite-subgroups following the 2017 nomenclature of the perovskite supergroup recommended by the International Mineralogical Association (IMA). Compositional data for up to 36 elements are input into the spreadsheet as oxides in weight percent. For each analysis, the output includes the formula, the normalized proportions of 15 end-members, and the percentage of cations which cannot be assigned to those end-members. The data are automatically plotted onto the ternary and quaternary diagrams recommended by the IMA for depiction of perovskite compositions. Up to 200 analyses can be entered into the spreadsheet, which is accompanied by data calculated for 140 perovskite compositions compiled from the literature.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Etmektzoglou, A; Mishra, P; Svatos, M
Purpose: To automate creation and delivery of robotic linac trajectories with TrueBeam Developer Mode, an open source spreadsheet-based trajectory generation tool has been developed, tested and made freely available. The computing power inherent in a spreadsheet environment plus additional functions programmed into the tool insulate users from the underlying schema tedium and allow easy calculation, parameterization, graphical visualization, validation and finally automatic generation of Developer Mode XML scripts which are directly loadable on a TrueBeam linac. Methods: The robotic control system platform that allows total coordination of potentially all linac moving axes with beam (continuous, step-and-shoot, or combination thereof) becomes available in TrueBeam Developer Mode. Many complex trajectories are either geometric or can be described in analytical form, making the computational power, graphing and programmability available in a spreadsheet environment an easy and ideal vehicle for automatic trajectory generation. The spreadsheet environment also allows for parameterization of trajectories, thus enabling the creation of entire families of trajectories using only a few variables. Standard spreadsheet functionality has been extended for powerful movie-like dynamic graphic visualization of the gantry, table, MLC, room, lasers, 3D observer placement and beam centerline, all as a function of MU or time, for analysis of the motions before requiring actual linac time. Results: We used the tool to generate and deliver extended SAD “virtual isocenter” trajectories of various shapes such as parameterized circles and ellipses. We also demonstrated use of the tool in generating linac couch motions that simulate respiratory motion using analytical parameterized functions. Conclusion: The SAGE tool is a valuable resource for experimenting with families of complex geometric trajectories for a TrueBeam linac. It makes Developer Mode more accessible as a vehicle to quickly translate research ideas into machine-readable scripts without programming knowledge. As an open source initiative, it also enables researcher collaboration on future developments. I am a full-time employee at Varian Medical Systems, Palo Alto, California.
Maceneaney, P M; Malone, D E
2000-12-01
To design a spreadsheet program to rapidly analyse interventional radiology (IR) data produced in local research or reported in the literature using 'evidence-based medicine' (EBM) parameters of treatment benefit and harm. Microsoft Excel(TM) was used. The spreadsheet consists of three worksheets. The first shows the 'Levels of Evidence and Grades of Recommendations' that can be assigned to therapeutic studies as defined by the Oxford Centre for EBM. The second and third worksheets facilitate the EBM assessment of therapeutic benefit and harm. Validity criteria are described. These include the assessment of the adequacy of sample size in the detection of possible procedural complications. A contingency (2 x 2) table for raw data on comparative outcomes in treated patients and controls has been incorporated. Formulae for EBM calculations are related to these numerators and denominators in the spreadsheet. The parameters calculated for benefit are relative risk reduction, absolute risk reduction, and number needed to treat (NNT); for harm, relative risk, relative odds, and number needed to harm (NNH). Ninety-five per cent confidence intervals are calculated for all these indices. The results change automatically when the data in the therapeutic outcome cells are changed. A final section allows the user to correct the NNT or NNH in their application to individual patients. This spreadsheet can be used on desktop and palmtop computers. The MS Excel(TM) version can be downloaded via the Internet from the URL ftp://radiography.com/pub/TxHarm00.xls. A spreadsheet is useful for the rapid analysis of the clinical benefit and harm from IR procedures.
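A minimal sketch of the benefit indices listed above, computed from event rates in control and treated groups (not the authors' spreadsheet; confidence intervals are omitted):

```python
# Illustrative EBM arithmetic consistent with the quantities the abstract lists:
# benefit indices from a 2 x 2 table of events in control and treated groups.
# NNH is the analogous reciprocal of the absolute risk increase for adverse events.

def ebm_benefit(control_events: int, control_n: int,
                treated_events: int, treated_n: int) -> dict[str, float]:
    cer = control_events / control_n          # control event rate
    eer = treated_events / treated_n          # experimental event rate
    arr = cer - eer                           # absolute risk reduction
    return {
        "RRR": arr / cer,                     # relative risk reduction
        "ARR": arr,
        "NNT": 1.0 / arr,                     # number needed to treat
        "RR": eer / cer,                      # relative risk
    }

# Example: 20/100 events in controls vs 12/100 in treated patients.
print(ebm_benefit(20, 100, 12, 100))          # ARR = 0.08, so NNT = 12.5
```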
R.D. Fight; J.M. Cahill; T.D. Fahey
1992-01-01
The DFPRUNE spreadsheet program is designed to estimate the expected financial return from pruning coast Douglas-fir (Pseudotsuga menziesii (Mirb.) Franco var. menziesii). It is a significant revision of the PRUNE-SIM program. The PRUNE-SIM program was based on the average product recovery for unpruned logs from a single stand...
NASA Technical Reports Server (NTRS)
Vu, Duc; Sandor, Michael; Agarwal, Shri
2005-01-01
CSAM Metrology Software Tool (CMeST) is a computer program for analysis of false-color CSAM images of plastic-encapsulated microcircuits. (CSAM signifies C-mode scanning acoustic microscopy.) The colors in the images indicate areas of delamination within the plastic packages. Heretofore, the images have been interpreted by human examiners. Hence, interpretations have not been entirely consistent and objective. CMeST processes the color information in image-data files to detect areas of delamination without incurring inconsistencies of subjective judgement. CMeST can be used to create a database of baseline images of packages acquired at given times for comparison with images of the same packages acquired at later times. Any area within an image can be selected for analysis, which can include examination of different delamination types by location. CMeST can also be used to perform statistical analyses of image data. Results of analyses are available in a spreadsheet format for further processing. The results can be exported to any data-base-processing software.
A new, low-cost sun photometer for student use
NASA Astrophysics Data System (ADS)
Espinoza, A.; Pérez-Álvarez, H.; Parra-Vilchis, J. I.; Fauchey-López, E.; Fernando-González, L.; Faus-Landeros, G. E.; Celarier, E. A.; Robinson, D. Q.; Zepeda-Galbez, R.
2011-12-01
We have designed a sun photometer for the measurement of aerosol optical thickness (AOT) at 505 nm and 620 nm, using custom-made glass filters (9.5 nm bandpass, FWHM) and photodiodes. The recommended price-point (US$150 - US$200) allowed us to incorporate technologies such as microcontrollers, a sun target, a USB port for data uploading, nonvolatile memory to contain tables of up to 127 geolocation profiles, extensive calibration data, and a log of up to 2,000 measurements. The instrument is designed to be easy to use, and to provide instant display of AOT estimates. A diffuser in the fore-optics limits the sensitivity to pointing error. We have developed postprocessing software to refine the AOT estimates, format a spreadsheet file, and upload the data to the GLOBE website. We are currently finalizing hardware and firmware, and conducting extensive calibration/validation experiments. These instruments will soon be in production and available to the K-12 education community, including and especially the GLOBE program.
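A simplified post-processing sketch, not the instrument's firmware or the GLOBE protocol itself: total optical thickness from a Beer-Lambert inversion of the photometer signal, with AOT obtained after subtracting an assumed Rayleigh term; V0, the dark voltage, and the example numbers are illustrative:

```python
# Beer-Lambert inversion of a sun photometer signal (illustrative parameter names).
# V0 is the instrument's extraterrestrial calibration constant; the flat-atmosphere
# air-mass approximation m = 1/sin(elevation) is assumed.
import math

def aerosol_optical_thickness(v_signal: float, v_dark: float, v0: float,
                              solar_elevation_deg: float,
                              rayleigh_tau: float,
                              station_pressure_hpa: float = 1013.25) -> float:
    air_mass = 1.0 / math.sin(math.radians(solar_elevation_deg))
    total_tau = math.log(v0 / (v_signal - v_dark)) / air_mass
    return total_tau - rayleigh_tau * station_pressure_hpa / 1013.25

# Example at 505 nm with an assumed Rayleigh optical thickness of ~0.14.
print(aerosol_optical_thickness(1.10, 0.02, 2.30, 40.0, 0.14))
```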
The centrality of meta-programming in the ES-DOC eco-system
NASA Astrophysics Data System (ADS)
Greenslade, Mark
2017-04-01
The Earth System Documentation (ES-DOC) project is an international effort aiming to deliver a robust earth system model inter-comparison project documentation infrastructure. Such infrastructure both simplifies & standardizes the process of documenting (in detail) projects, experiments, models, forcings & simulations. In support of CMIP6, ES-DOC has upgraded its eco-system of tools, web-services & web-sites. The upgrade consolidates the existing infrastructure (built for CMIP5) and extends it with the introduction of new capabilities. The strategic focus of the upgrade is improvements in the documentation experience and broadening the range of scientific use-cases that the archived documentation may help deliver. Whether it is highlighting dataset errors, exploring experimental protocols, comparing forcings across ensemble runs, understanding MIP objectives, reviewing citations, exploring component properties of configured models, visualising inter-model relationships, scientists involved in CMIP6 will find the ES-DOC infrastructure helpful. This presentation underlines the centrality of meta-programming within the ES-DOC eco-system. We will demonstrate how agility is greatly enhanced by taking a meta-programming approach to representing data models and controlled vocabularies. Such an approach nicely decouples representations from encodings. Meta-models will be presented along with the associated tooling chain that forward engineers artefacts as diverse as: class hierarchies, IPython notebooks, mindmaps, configuration files, OWL & SKOS documents, spreadsheets …etc.
NASA Astrophysics Data System (ADS)
Zou, Yan-Rong; Wang, Lianyuan; Shuai, Yanhua; Peng, Ping'an
2005-08-01
A new kinetic model and an Excel© spreadsheet program for modeling the stable carbon isotope composition of natural gases are provided in this paper. The model and spreadsheet can be used to describe and predict the variations in the stable carbon isotope composition of natural gases under both experimental and geological conditions as a function of heating temperature or geological time. It is a user-friendly, convenient tool for the modeling of isotope variation with time under experimental and geological conditions. The spreadsheet, based on experimental data, requires the input of the kinetic parameters of gaseous hydrocarbon generation. Some assumptions are made in this model: the conventional (non-isotope species) kinetic parameters represent the light isotope species; the initial isotopic value is the same for all parallel chemical reactions of gaseous hydrocarbon generation for simplicity; the pre-exponential factor ratio, 13A/12A, is a constant; and both heavy and light isotope species have similar activation energy distributions. These assumptions are common in the modeling of isotope ratios. The spreadsheet is used to search for the kinetic parameters of the heavy isotope species that minimize the error with respect to the experimental data, and then to extrapolate isotopic changes to the thermal history of sedimentary basins. A short calculation example on the variation in δ13C values of methane is provided in this paper to show application to geological conditions.
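An isothermal toy version of the parallel-reaction scheme under the stated assumptions (one activation-energy distribution, a constant 13A/12A ratio, a single initial delta value); the published spreadsheet additionally integrates over a basin's thermal history, which this sketch does not attempt:

```python
# A toy isothermal sketch: cumulative delta13C of gas generated by parallel
# first-order reactions, where isotope fractionation enters only through the
# assumed pre-exponential factor ratio a_ratio = 13A/12A. All numbers are
# illustrative, not the authors' calibrated parameters.
import math

R_GAS = 8.314e-3      # kJ/(mol K)
R_PDB = 0.0112372     # VPDB 13C/12C reference ratio

def delta13c_cumulative(fractions, energies_kj, a12: float, a_ratio: float,
                        delta0_permil: float, temp_k: float, time_s: float) -> float:
    """delta13C (per mil) of cumulative gas after time_s at temp_k."""
    a13 = a_ratio * a12
    x12 = sum(f * (1 - math.exp(-a12 * math.exp(-e / (R_GAS * temp_k)) * time_s))
              for f, e in zip(fractions, energies_kj))
    x13 = sum(f * (1 - math.exp(-a13 * math.exp(-e / (R_GAS * temp_k)) * time_s))
              for f, e in zip(fractions, energies_kj))
    r0 = R_PDB * (1.0 + delta0_permil / 1000.0)   # initial isotope ratio of the precursor
    return ((r0 * x13 / x12) / R_PDB - 1.0) * 1000.0

# Example: a crude three-bin activation-energy distribution, 72 h at 300 degrees C.
print(delta13c_cumulative([0.2, 0.6, 0.2], [200.0, 210.0, 220.0],
                          a12=1e14, a_ratio=0.998, delta0_permil=-35.0,
                          temp_k=573.15, time_s=72 * 3600))
```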
Simplifying CEA through Excel, VBA, and Subeq
NASA Technical Reports Server (NTRS)
Foster, Ryan
2004-01-01
Many people use compound equilibrium programs for very different reasons, varying from refrigerators to light bulbs to rockets. A commonly used equilibrium program is CEA. CEA can take various inputs such as pressure, temperature, and volume along with numerous reactants and run them through equilibrium equations to obtain valuable output information, including products formed and their relative amounts. A little over a year ago, Bonnie McBride created the program subeq with the goal of simplifying the calling of CEA. Subeq was also designed to be called by other programs, including Excel, through the use of Visual Basic for Applications (VBA). The largest advantage of using Excel is that it allows the user to input the information in a colorful and user-friendly environment while allowing VBA to run subeq, which is in the form of a FORTRAN DLL (Dynamic Link Library). Calling subeq in this form makes it much faster than if it were converted to VBA. Since subeq requires such large lists of reactant and product names, all of which can't be passed in as an array, subeq had to be changed to accept very long strings of reactants and products. To pass this string and adjust the transfer of input and output parameters, the subeq DLL had to be changed. One program that does this is Compaq Visual FORTRAN, which allows DLLs to be edited, debugged, and compiled. Compaq Visual FORTRAN uses FORTRAN 90/95, which has additional features to those of FORTRAN 77. My goals this summer include finishing up the Excel spreadsheet of subeq, which I started last summer, and putting it on the Internet so that others can use it without having to download my spreadsheet. To finish up the spreadsheet I will need to work on debugging current options and problems. I will also work on making it as robust as possible, so that all errors that may arise will be clearly communicated to the user. New features will be added and old ones will be changed as I receive comments from people using the spreadsheet. To implement this onto the Internet, I will need to develop an XML input/output format and learn how to write HTML.
Software Reviews: Programs Worth a Second Look.
ERIC Educational Resources Information Center
Classroom Computer Learning, 1989
1989-01-01
Reviews three software programs: (1) "Microsoft Works 2.0": word processing, data processing, and telecommunications, grades 7 and up; (2) "AppleWorks GS": word processor, database, spreadsheet, graphics, and telecommunications, grades 3-12, Apple IIGS; (3) "Choices, Choices: On the Playground, Taking Responsibility":…
Mathemagical Computing: Order of Operations and New Software.
ERIC Educational Resources Information Center
Ecker, Michael W.
1989-01-01
Describes mathematical problems which occur when using the computer as a calculator. Considers errors in BASIC calculation and the order of mathematical operations. Identifies errors in spreadsheet and calculator programs. Comments on sorting programs and provides a source for Mathemagical Black Holes. (MVL)
An automated graphics tool for comparative genomics: the Coulson plot generator
2013-01-01
Background Comparative analysis is an essential component of biology. When applied to genomics for example, analysis may require comparisons between the predicted presence and absence of genes in a group of genomes under consideration. Frequently, genes can be grouped into small categories based on functional criteria, for example membership of a multimeric complex, participation in a metabolic or signaling pathway or shared sequence features and/or paralogy. These patterns of retention and loss are highly informative for the prediction of function, and hence possible biological context, and can provide great insights into the evolutionary history of cellular functions. However, representation of such information in a standard spreadsheet is a poor visual means from which to extract patterns within a dataset. Results We devised the Coulson Plot, a new graphical representation that exploits a matrix of pie charts to display comparative genomics data. Each pie is used to describe a complex or process from a separate taxon, and is divided into sectors corresponding to the number of proteins (subunits) in a complex/process. The predicted presence or absence of proteins in each complex is delineated by occupancy of a given sector; this format is visually highly accessible and makes pattern recognition rapid and reliable. A key to the identity of each subunit, plus hierarchical naming of taxa and coloring are included. A Java-based application, the Coulson plot generator (CPG), automates graphic production, with a tab- or comma-delimited text file as input, generating an editable portable document format or svg file. Conclusions CPG software may be used to rapidly convert spreadsheet data to a graphical matrix pie chart format. The representation essentially retains all of the information from the spreadsheet but presents a graphically rich format making comparisons and identification of patterns significantly clearer. While the Coulson plot format is highly useful in comparative genomics, its original purpose, the software can be used to visualize any dataset where entity occupancy is compared between different classes. Availability CPG software is available at sourceforge http://sourceforge.net/projects/coulson and http://dl.dropbox.com/u/6701906/Web/Sites/Labsite/CPG.html PMID:23621955
Excel spreadsheet in teaching numerical methods
NASA Astrophysics Data System (ADS)
Djamila, Harimi
2017-09-01
One of the important objectives in teaching numerical methods to undergraduate students is to bring them to an understanding of numerical-method algorithms. Although manual calculation is important in understanding the procedure, it is time consuming and prone to error. This is specifically the case for the iteration procedures used in many numerical methods. Currently, many commercial programs are useful in teaching numerical methods, such as Matlab, Maple, and Mathematica, but these are usually not user-friendly for the uninitiated. An Excel spreadsheet offers an initial level of programming, which can be used either on or off campus, and students are not distracted by writing code. It must be emphasized that general commercial software should still be introduced later for more elaborate problems. This article reports on a strategy for teaching numerical methods in undergraduate engineering programs. It is directed to students, lecturers, and researchers in the engineering field.
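For readers unfamiliar with the kind of row-by-row iteration the article has students build, a minimal sketch follows; the function, starting guess, and iteration count are arbitrary illustrations, with each printed row standing in for one spreadsheet row.

```python
# Minimal sketch of an iterative scheme reproduced row by row in a spreadsheet:
# Newton's method for f(x) = x**2 - 2. The Newton update is the formula a
# student would type once and copy down a column.
def f(x):
    return x**2 - 2.0

def df(x):
    return 2.0 * x

x = 1.0                       # initial guess (one spreadsheet cell)
for i in range(6):            # each pass mimics filling the next row down
    x = x - f(x) / df(x)      # Newton update copied down the column
    print(i + 1, x)           # converges quickly to sqrt(2) ~ 1.41421356
```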
NASA Astrophysics Data System (ADS)
Fauzi, Ahmad
2017-11-01
Numerical computation has many pedagogical advantages: it develops analytical and problem-solving skills, helps students learn through visualization, and enhances physics education. Unfortunately, numerical computation is not taught to undergraduate physics education students in Indonesia. Incorporating numerical computation into the undergraduate physics education curriculum presents many challenges, chiefly a dense curriculum that makes it difficult to add a new numerical computation course and the fact that most students have no programming experience. In this research, we used a case study to review how to integrate numerical computation into the undergraduate physics education curriculum. The participants of this research were 54 fourth-semester students of the physics education department. We concluded that numerical computation can be integrated into the undergraduate physics education curriculum using Excel spreadsheets combined with another course. The results of this research complement studies on how to integrate numerical computation into physics learning using Excel spreadsheets.
NASA Astrophysics Data System (ADS)
Sokolova, Tatiana S.; Dorogokupets, Peter I.; Dymshits, Anna M.; Danilov, Boris S.; Litasov, Konstantin D.
2016-09-01
We present Microsoft Excel spreadsheets for calculation of thermodynamic functions and P-V-T properties of MgO, diamond, and 9 metals (Al, Cu, Ag, Au, Pt, Nb, Ta, Mo, and W) as functions of temperature and volume or temperature and pressure. The spreadsheets include the most common pressure markers used in in situ experiments with diamond anvil cell and multianvil techniques. The calculations are based on the equation of state formalism via the Helmholtz free energy. The program was developed using Visual Basic for Applications in Microsoft Excel and is a time-efficient tool to evaluate volume, pressure, and other thermodynamic functions using only T-P or T-V data as input parameters. This application is aimed at solving practical issues of high-pressure experiments in geosciences and mineral physics.
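The abstract only names the formalism (thermodynamics via the Helmholtz free energy), so the following Python sketch uses a deliberately toy free-energy function and invented parameters to show the core step, P = -(dF/dV) at constant T, evaluated numerically; it is not the published calibration for any of the listed materials.

```python
import numpy as np

# Minimal sketch of the Helmholtz-free-energy route to pressure. The toy
# free-energy function (a simple compression term plus an Einstein-like
# thermal term) and all parameter values are illustrative assumptions.
V0, K0 = 11.25, 160.0                 # cm^3/mol and GPa (toy reference values)
gamma, theta = 1.5, 750.0             # toy Grueneisen parameter and characteristic temperature (K)
n, Rgas = 1, 8.314e-3                 # atoms per formula unit, kJ/(mol*K)

def F(V, T):
    """Toy Helmholtz free energy in kJ/mol (note 1 GPa*cm^3/mol = 1 kJ/mol)."""
    x = (V0 / V) ** (2.0 / 3.0)
    cold = 4.5 * K0 * V0 * (x - 1.0) ** 2                      # compression term
    thermal = 3.0 * n * Rgas * T * np.log(
        1.0 - np.exp(-theta * (V0 / V) ** gamma / T))          # thermal term
    return cold + thermal

def pressure(V, T, h=1e-4):
    """P(V, T) in GPa from a central finite difference of F with respect to V."""
    return -(F(V + h, T) - F(V - h, T)) / (2.0 * h)

print(pressure(V=10.5, T=1500.0))     # GPa for the toy material at compression
```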
Spreadsheet macros for coloring sequence alignments.
Haygood, M G
1993-12-01
This article describes a set of Microsoft Excel macros designed to color amino acid and nucleotide sequence alignments for review and preparation of visual aids. The colored alignments can then be modified to emphasize features of interest. Procedures for importing and coloring sequences are described. The macro file adds a new menu to the menu bar containing sequence-related commands to enable users unfamiliar with Excel to use the macros more readily. The macros were designed for use with Macintosh computers but will also run with the DOS version of Excel.
The Microcomputer in the Administrative Office.
ERIC Educational Resources Information Center
Huntington, Fred
1983-01-01
Discusses microcomputer uses for administrative computing in education at site level and central office and recommends that administrators start with a word processing program for time management, an electronic spreadsheet for financial accounting, a database management system for inventories, and self-written programs to alleviate paper…
R.D. Fight; J.M. Cahill; T.A. Snellgrove; T.D. Fahey
1987-01-01
PRUNE-SIM is a spreadsheet template (program) that allows users to simulate a financial analysis of pruning coast Douglas-fir (Pseudotsuga menziesii (Mirb.) Franco var. menziesii). The program estimates the increase in product value resulting from pruning the butt 17-foot log. Product recovery information is based on actual...
Refinery spreadsheet highlights microcomputer process applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tucker, M.A.
1984-01-23
Microcomputer applications in the process areas at Chevron U.S.A. refineries and at the Chevron Research Co. illustrate how the microcomputer has changed the way we do our jobs. This article will describe major uses of the microcomputer as a personal work tool in Chevron process areas. It will also describe how and why many of Chevron's microcomputer applications were developed and their characteristics. One of our earliest microcomputer applications, developed in late 1981, was an electronic spreadsheet program using a small desktop microcomputer. It was designed to help a refinery planner prepare monthly plans for a small portion of one of our major refineries. This particular microcomputer had a tiny 4-in. screen, and the reports were several strips of print-out from the microcomputer's 3-in.-wide internal printer taped together. In spite of these archaic computing conditions, it was a successful application. It automated what had been very tedious and time-consuming calculations with a pencil, a calculator, and a great deal of erasing. It eliminated filling out large "horseblanket" reports. The electronic spreadsheet was also flexible; the planner could easily change the worksheet to match new operating constraints, new process conditions, and new feeds and products. Fortunately, within just a few months, this application graduated to a similar electronic spreadsheet program on a new, more powerful microcomputer. It had a bigger display screen and a letter-size printer. The same application is still in use today, although it has been greatly enhanced and altered to match extensive plant modifications. And there are plans to expand it again onto yet another, more powerful microcomputer.
NASA Technical Reports Server (NTRS)
Elfer, N.; Meibaum, R.; Olsen, G.
1995-01-01
A unique collection of computer codes, Space Debris Surfaces (SD_SURF), has been developed to assist in the design and analysis of space debris protection systems. SD_SURF calculates and summarizes a vehicle's vulnerability to space debris as a function of impact velocity and obliquity. An SD_SURF analysis will show which velocities and obliquities are the most probable to cause a penetration. This determination can help the analyst select a shield design that is best suited to the predominant penetration mechanism. The analysis also suggests the most suitable parameters for development or verification testing. The SD_SURF programs offer the option of either FORTRAN programs or Microsoft EXCEL spreadsheets and macros. The FORTRAN programs work with BUMPERII. The EXCEL spreadsheets and macros can be used independently or with selected output from the SD_SURF FORTRAN programs. Examples will be presented of the interaction between space vehicle geometry, the space debris environment, and the penetration and critical damage ballistic limit surfaces of the shield under consideration.
Numerous features have been included to facilitate the modeling process, from model setup and data input, presentation and analysis of results, to easy export of results to spreadsheet programs for additional analysis.
Lowers, Heather; Bern, Amy M.
2009-01-01
This report presents data on particle characterization analyzed by scanning electron microscopy on Libby amphibole collected by the U.S. Geological Survey in 2000 (LA2000) and amosite material collected by RTI International (RTI amosite). The particle characterization data were generated to support a portion of the Libby Action Plan. Prior to analysis, the raw LA2000 and RTI amosite materials were subjected to a preparation step. Each sample was water-elutriated by the U.S. Environmental Protection Agency (USEPA) Office of Research and Development, Research Triangle Park, using the methods generally described in another published report and then delivered to the U.S. Geological Survey, Denver Microbeam Laboratory for analysis. Data presented here represent analyses performed by the U.S. Geological Survey, Denver Microbeam Laboratory and USEPA National Enforcement Investigations Center. This report consists of two Excel spreadsheet files, developed by the USEPA Region 8 Superfund Technical Assistance Unit, that describe the particle size characterization of the LA2000 and RTI amosite materials, respectively. Multiple tabs and data entry cells exist in each spreadsheet and are defined herein.
ALOG: A spreadsheet-based program for generating artificial logs
Matthew F. Winn; Randolph H. Wynne; Philip A. Araman
2004-01-01
Log sawing simulation computer programs can be valuable tools for training sawyers as well as for testing different sawing patterns. Most available simulation programs rely on databases from which to draw logs and can be very costly and time-consuming to develop. ALOG (Artificial LOg Generator) is a Microsoft Excel®-based computer program that was developed to...
A Microsoft Excel® 2010 Based Tool for Calculating Interobserver Agreement
Azulay, Richard L
2011-01-01
This technical report provides detailed information on the rationale for using a common computer spreadsheet program (Microsoft Excel®) to calculate various forms of interobserver agreement for both continuous and discontinuous data sets. In addition, we provide a brief tutorial on how to use an Excel spreadsheet to automatically compute traditional total count, partial agreement-within-intervals, exact agreement, trial-by-trial, interval-by-interval, scored-interval, unscored-interval, total duration, and mean duration-per-interval interobserver agreement algorithms. We conclude with a discussion of how practitioners may integrate this tool into their clinical work. PMID:22649578
A microsoft excel(®) 2010 based tool for calculating interobserver agreement.
Reed, Derek D; Azulay, Richard L
2011-01-01
This technical report provides detailed information on the rationale for using a common computer spreadsheet program (Microsoft Excel(®)) to calculate various forms of interobserver agreement for both continuous and discontinuous data sets. In addition, we provide a brief tutorial on how to use an Excel spreadsheet to automatically compute traditional total count, partial agreement-within-intervals, exact agreement, trial-by-trial, interval-by-interval, scored-interval, unscored-interval, total duration, and mean duration-per-interval interobserver agreement algorithms. We conclude with a discussion of how practitioners may integrate this tool into their clinical work.
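As a rough illustration of two of the algorithms named above (total count and exact agreement IOA), here is a minimal Python sketch with invented observation data; the report itself implements these as Excel formulas rather than code.

```python
# Minimal sketch of two interobserver-agreement algorithms using made-up
# per-interval counts from two hypothetical observers.
def total_count_ioa(obs1, obs2):
    """Total count IOA: smaller session total divided by larger total, as a percentage."""
    t1, t2 = sum(obs1), sum(obs2)
    return 100.0 * min(t1, t2) / max(t1, t2)

def exact_agreement_ioa(obs1, obs2):
    """Exact agreement IOA: percentage of intervals in which both observers recorded the same count."""
    agreements = sum(1 for a, b in zip(obs1, obs2) if a == b)
    return 100.0 * agreements / len(obs1)

observer_1 = [2, 0, 1, 3, 0, 1]   # hypothetical counts per interval
observer_2 = [2, 1, 1, 2, 0, 0]
print(total_count_ioa(observer_1, observer_2))     # 85.71... (6 of 7 responses)
print(exact_agreement_ioa(observer_1, observer_2)) # 50.0 (3 of 6 intervals match)
```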
Understanding Solubility through Excel Spreadsheets
NASA Astrophysics Data System (ADS)
Brown, Pamela
2001-02-01
This article describes assignments related to the solubility of inorganic salts that can be given in an introductory general chemistry course. Le Châtelier's principle, solubility, unit conversion, and thermodynamics are tied together to calculate heats of solution by two methods: heats of formation and an application of the van't Hoff equation. These assignments address the need for math, graphing, and computer skills in the chemical technology program by developing skill in the use of Microsoft Excel to prepare spreadsheets and graphs and to perform linear and nonlinear curve-fitting. Background information on the value of understanding and predicting solubility is provided.
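A minimal sketch of the van't Hoff treatment mentioned above follows; the temperatures and solubility products are made-up illustrative values, not data from the assignments.

```python
import numpy as np

# Minimal sketch of the van't Hoff method: fit ln(Ksp) against 1/T; the slope
# gives -dH_soln/R and the intercept gives dS_soln/R. The Ksp values below are
# hypothetical placeholders, not measurements from the article.
R = 8.314                                          # J/(mol*K)
T = np.array([283.0, 293.0, 303.0, 313.0])         # K
Ksp = np.array([1.1e-5, 1.8e-5, 2.9e-5, 4.5e-5])   # hypothetical solubility products

slope, intercept = np.polyfit(1.0 / T, np.log(Ksp), 1)
dH_solution = -slope * R        # J/mol, heat of solution from the slope
dS_solution = intercept * R     # J/(mol*K), from the intercept
print(dH_solution / 1000.0, dS_solution)   # roughly +35 kJ/mol here, i.e. endothermic dissolution
```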
Designing a data portal for synthesis modeling
NASA Astrophysics Data System (ADS)
Holmes, M. A.
2006-12-01
Processing of field and model data in multi-disciplinary integrated science studies is a vital part of synthesis modeling. Collection and storage techniques for field data vary greatly between the participating scientific disciplines due to the nature of the data being collected, whether it be in situ, remotely sensed, or recorded by automated data logging equipment. Spreadsheets, personal databases, text files and binary files are used in the initial storage and processing of the raw data. In order to be useful to scientists, engineers and modelers the data need to be stored in a format that is easily identifiable, accessible and transparent to a variety of computing environments. The Model Operations and Synthesis (MOAS) database and associated web portal were created to provide such capabilities. The industry standard relational database is comprised of spatial and temporal data tables, shape files and supporting metadata accessible over the network, through a menu driven web-based portal or spatially accessible through ArcSDE connections from the user's local GIS desktop software. A separate server provides public access to spatial data and model output in the form of attributed shape files through an ArcIMS web-based graphical user interface.
ERIC Educational Resources Information Center
Batt, Russell H., Ed.
1990-01-01
Described is how spreadsheet and problem solver microcomputer programs may assist students in performing mathematical calculations. Discussed is the application of the equation solver "MathCAD" to various areas in the undergraduate curriculum. (KR)
Adams, Bruce D; Whitlock, Warren L
2004-04-01
In 1997, the American Heart Association, in association with representatives of the International Liaison Committee on Resuscitation (ILCOR), published recommended guidelines for reviewing, reporting and conducting in-hospital cardiopulmonary resuscitation (CPR) outcomes using the "Utstein style". Using these guidelines, we developed two Microsoft Office-based database management programs that may be useful to the resuscitation community. We developed a user-friendly spreadsheet based on MS Office Excel. The user enters patient variables such as name, age, and diagnosis. Then, event resuscitation variables such as time of collapse and CPR team arrival are entered from a "code flow sheet". Finally, outcome variables such as patient condition at different time points are recorded. The program then makes automatic calculations of average response times, survival rates and other important outcome measurements. Also using the Utstein style, we developed a database program based on MS Office Access. To promote free public access to these programs, we established a website. These programs will help hospitals track, analyze, and present their CPR outcomes data. Clinical CPR researchers might also find the programs useful because they are easily modified and have statistical functions.
GPFA-AB_Phase1RiskAnalysisTask5DataUpload
Teresa E. Jordan
2015-09-30
This submission contains information used to compute the risk factors for the GPFA-AB project (DE-EE0006726). The risk factors are natural reservoir quality, thermal resource quality, potential for induced seismicity, and utilization. The methods used to combine the risk factors included taking the product, sum, and minimum of the four risk factors. The files are divided into images, rasters, shapefiles, and supporting information. The image files show what the raster and shapefiles should look like. The raster files contain the input risk factors, calculation of the scaled risk factors, and calculation of the combined risk factors. The shapefiles include definition of the fairways, definition of the US Census Places, the center of the raster cells, and locations of industries. Supporting information contains details of the calculations or processing used in generating the files. An image of the raster will have the same name except *.png as the file ending instead of *.tif. Images with “fairways” or “industries” added to the name are composed of a raster with the relevant shapefile added. The file About_GPFA-AB_Phase1RiskAnalysisTask5DataUpload.pdf contains information the citation, special use considerations, authorship, etc. More details on each file are given in the spreadsheet “list_of_contents.csv” in the folder “SupportingInfo”. Code used to calculate values is available at https://github.com/calvinwhealton/geothermal_pfa under the folder “combining_metrics”.
Learning about Tasks Computers Can Perform. ERIC Digest.
ERIC Educational Resources Information Center
Brosnan, Patricia A.
Knowing what different kinds of computer equipment can do is the first step in choosing the computer that is right for you. This digest describes a developmental progression of computer capabilities. First the basic three software programs (word processing, spreadsheets, and database programs) are discussed using examples. Next, an explanation of…
ERIC Educational Resources Information Center
Marchetti, Honey
A work-study student assistant was employed at the Carnegie Mellon University Engineering and Science Library to help prepare documentation for a new library program. The student, a junior professional writing major, used the Apple Macintosh microcomputer to design a brochure, billing worksheet, and spreadsheet for the new program. On completion…
48 CFR 1552.215-72 - Instructions for the Preparation of Proposals.
Code of Federal Regulations, 2013 CFR
2013-10-01
... of the information, to expedite review of the proposal, submit an IBM-compatible software or storage... offeror used another spreadsheet program, indicate the software program used to create this information... submission of a compatible software or device will expedite review, failure to submit a disk will not affect...
48 CFR 1552.215-72 - Instructions for the Preparation of Proposals.
Code of Federal Regulations, 2014 CFR
2014-10-01
... of the information, to expedite review of the proposal, submit an IBM-compatible software or storage... offeror used another spreadsheet program, indicate the software program used to create this information... submission of a compatible software or device will expedite review, failure to submit a disk will not affect...
Evaluating Technology Integration in the Elementary School: A Site-Based Approach.
ERIC Educational Resources Information Center
Mowe, Richard
This book enables educators at the elementary level to conduct formative evaluations of their technology programs in minimum time. Most of the technology is computer related, including word processing, graphics, desktop publishing, spreadsheets, databases, instructional software, programming, and telecommunications. The design of the book is aimed…
The Office of the Materials Division
NASA Technical Reports Server (NTRS)
Ramsey, Amanda J.
2004-01-01
I was assigned to the Materials Division, which consists of the following branches: the Advanced Metallics Branch/5120-RMM, Ceramics Branch/5130-RMC, Polymers Branch/5150-RMP, and the Durability and Protective Coatings Branch/5160-RMD. Mrs. Pamela Spinosi is my assigned mentor. She was assisted by Ms. Raysa Rodriguez/5100-RM and Mrs. Denise Prestien/5100-RM, who are both employed by InDyne, Inc. My primary assignment this past summer was working directly with Ms. Rodriguez, assisting her with setting up the Integrated Financial Management Program (IFMP) 5130-RMC/Branch procedures and logs. These duties consisted of creating various spreadsheets for each individual branch member, which were updated daily. It was not hard to familiarize myself with these duties since this is my second summer working with Ms. Rodriguez at NASA Glenn Research Center. I also assisted RMC in ordering laboratory supplies and equipment for the Basic Materials Laboratory (Building 106) using the IFMP/Purchase Card (P-card), a NASA-wide software program. I entered new Travel Authorizations for the 5130-RMC Civil Servant Branch Members into the IFMP Travel and Requisitions System, and I entered and completed Travel Vouchers for the 5130-RMC Ceramics Branch. I assisted the Division Office in creating a new Emergency Contact list for the Materials Division. I worked with Dr. Hugh Gray, the Division Chief, and Dr. Ajay Misra, the 5130-RMC Branch Chief, on priority action items with a close deadline for a large NASA proposal. Another project was working closely with Ms. Rodriguez in organizing and preparing for Dr. Ajay K. Misra's SESCDP (two-year detail). This consisted of organizing files, file folders, and personal information, recording all data onto CDs, and printing all presentations for display in binders. I attended numerous Branch meetings and observed many changes in the Branch Management organization.
Wu, Sheng-Nan
2004-03-31
The purpose of this study was to develop a method to simulate the cardiac action potential using a Microsoft Excel spreadsheet. The mathematical model contained voltage-gated ionic currents that were modeled using either Beeler-Reuter (B-R) or Luo-Rudy (L-R) phase 1 kinetics. The simulation protocol involves the use of in-cell formulas directly typed into a spreadsheet. The capability of spreadsheet iteration was used in these simulations. It does not require any prior knowledge of computer programming, although the use of the macro language can speed up the calculation. The normal configuration of the cardiac ventricular action potential can be well simulated in the B-R model that is defined by four individual ionic currents, each representing the diffusion of ions through channels in the membrane. The contribution of Na+ inward current to the rate of depolarization is reproduced in this model. After removal of Na+ current from the model, a constant current stimulus elicits an oscillatory change in membrane potential. In the L-R phase 1 model where six types of ionic currents were defined, the effect of extracellular K+ concentration on changes both in the time course of repolarization and in the time-independent K+ current can be demonstrated, when the solutions are implemented in Excel. Using the simulation protocols described here, the users can readily study and graphically display the underlying properties of ionic currents to see how changes in these properties determine the behavior of the heart cell. The method employed in these simulation protocols may also be extended or modified to other biological simulation programs.
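The full Beeler-Reuter and Luo-Rudy formulations are too long to reproduce here, so the sketch below substitutes the much simpler FitzHugh-Nagumo excitable-membrane model to show the row-by-row (forward Euler) iteration the spreadsheet performs; the parameters are standard textbook values, and the sustained oscillation under constant current loosely parallels the oscillatory behavior noted in the abstract.

```python
# Minimal sketch of spreadsheet-style iteration for an excitable membrane,
# using the FitzHugh-Nagumo model rather than the Beeler-Reuter or Luo-Rudy
# formulations; each loop pass corresponds to one spreadsheet row computed
# from the previous row via forward Euler.
a, b, eps = 0.7, 0.8, 0.08     # standard FitzHugh-Nagumo parameters
I_stim = 0.5                   # constant applied current (drives repeated firing)
dt = 0.1
v, w = -1.2, -0.6              # initial (resting-like) state

for step in range(2000):
    dv = v - v**3 / 3.0 - w + I_stim
    dw = eps * (v + a - b * w)
    v, w = v + dt * dv, w + dt * dw      # forward Euler update, one "row" per step
    if step % 200 == 0:
        print(round(step * dt, 1), round(v, 3), round(w, 3))
```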
Sprecher, D J; Ley, W B; Whittier, W D; Bowen, J M; Thatcher, C D; Pelzer, K D; Moore, J M
1989-07-15
A computer spreadsheet was developed to predict the economic impact of a management decision to use B-mode ultrasonographic ovine pregnancy diagnosis. The spreadsheet design and spreadsheet cell formulas are provided. The program used the partial farm budget technique to calculate net return (NR) or cash flow changes that resulted from the decision to use ultrasonography. Using the program, either simple pregnancy diagnosis or pregnancy diagnosis with the ability to determine singleton or multiple pregnancies may be compared with no flock ultrasonographic pregnancy diagnosis. A wide range of user-selected regional variables are used to calculate the cash flow changes associated with the ultrasonography decisions. A variable may be altered through a range of values to conduct a sensitivity analysis of predicted NR. Example sensitivity analyses are included for flock conception rate, veterinary ultrasound fee, and the price of corn. Variables that influence the number of cull animals and the cost of ultrasonography have the greatest impact on predicted NR. Because the determination of singleton or multiple pregnancies is more time consuming, its economic practicality in comparison with simple pregnancy diagnosis is questionable. The value of feed saved by identifying and separately feeding ewes with singleton pregnancies is not offset by the increased ultrasonography cost.
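A heavily simplified sketch of a partial-budget net-return calculation of this kind is given below; all input values are hypothetical placeholders rather than figures from the study, and reduced-cost and reduced-return terms beyond culling income and feed savings are omitted.

```python
# Minimal sketch of a partial-budget net-return calculation for flock
# pregnancy scanning; every number is an invented placeholder.
ewes_scanned = 500
scan_fee_per_ewe = 3.00          # veterinary ultrasound fee
open_ewe_fraction = 0.06         # fraction of ewes found not pregnant
cull_value_per_ewe = 60.00       # income from selling open ewes
feed_saved_per_open_ewe = 18.00  # winter feed not fed to open ewes

added_returns = ewes_scanned * open_ewe_fraction * (cull_value_per_ewe + feed_saved_per_open_ewe)
added_costs = ewes_scanned * scan_fee_per_ewe
net_return = added_returns - added_costs
print(net_return)                # positive -> scanning pays under these assumptions
```

Varying one input (for example the scan fee or the conception rate) over a range of values while recomputing the net return reproduces the sensitivity analysis the abstract describes.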
Microcomputer Scheduling of Reference Desk Staff.
ERIC Educational Resources Information Center
Cornick, Donna; Owen, Willy
1988-01-01
Presents a model that can accommodate staff preferences when determining a reference desk schedule using a microcomputer, the Lotus 1-2-3 spreadsheet software, and the linear programming software LP83. (eight references) (MES)
Historic Land Use and Carbon Estimates for South and Southeast Asia: 1880-1980 (NDP-046)
Richards, John F. [Duke Univ., Durham, NC (United States); Flint, Elizabeth P. [Duke Univ., Durham, NC (United States); Daniels, Richard C. [Carbon Dioxide Information Analysis Center (CDIAC)
1994-01-01
This data base contains estimates of land use change and the carbon content of vegetation for South and Southeast Asia for the years 1880, 1920, 1950, 1970, and 1980. These data were originally collected for climate modelers so they could reduce the uncertainty associated with the magnitude and time course of historical land use change and of carbon release. For this data base, South and Southeast Asia is defined as encompassing nearly 8 × 10⁶ km² of the earth's land surface and includes the countries of India, Sri Lanka, Bangladesh, Myanmar (Burma), Thailand, Laos, Kampuchea (Cambodia), Vietnam, Malaysia, Brunei, Singapore, Indonesia, and the Philippines. The most important change in land use over this 100-year period was the conversion of 107 × 10⁶ ha of forest/woodland to categories with lower biomass. Land thus transformed accounted for 13.5% of the total area of the study region. The estimated total carbon content of live vegetation in South and Southeast Asia has dropped progressively, from 59 × 10⁹ Mg in 1880 to 27 × 10⁹ Mg in 1980. Throughout the study period, the carbon stock in forests was greater than the carbon content in all other categories combined, although its share of the total declined progressively from 81% in 1880 to 73% in 1980. The data base was developed in Lotus 1-2-3™ by using a sequential bookkeeping model. The source data were obtained at the local and regional level for each country from official agricultural and economic statistics (e.g., the United Nations Food and Agriculture Organization); historical geographic and demographic texts, reports, and articles; and any other available source. Because of boundary changes through time and disparities between the validity, availability, and scale of the data for each country, the data were aggregated into 94 ecological zones. The resulting data base contains land use and carbon information for 94 ecological zones and national totals for 13 countries. The directory to which the above link leads provides 90 Lotus 1-2-3™ files, three ARC/INFO™ export files, and five ASCII data files. We advise users to use the file transfer protocol (FTP) to download the binary spreadsheet *.wk1 files; please consult the ndp046.txt documentation file or Accessing CDIAC via FTP for instructions. In addition to these, a descriptive file that explains the contents and format of each data file and four FORTRAN and SAS™ retrieval programs for use with the ASCII data files are included.
Spreadsheet Applications using VisiCalc and Lotus 1-2-3 Programs.
ERIC Educational Resources Information Center
Cortland-Madison Board of Cooperative Educational Services, Cortland, NY.
The VisiCalc program is visual calculation on a computer making use of an electronic worksheet that is beneficial to the business user in dealing with numerous accounting and clerical procedures. The Lotus 1-2-3 program begins with VisiCalc and improves upon it by adding graphics and a database as well as more efficient ways to manipulate and…
Applying EXCEL Solver to a watershed management goal-programming problem
J. E. de Steiguer
2000-01-01
This article demonstrates the application of EXCEL® spreadsheet linear programming (LP) solver to a watershed management multiple use goal programming (GP) problem. The data used to demonstrate the application are from a published study for a watershed in northern Colorado. GP has been used by natural resource managers for many years. However, the GP solution by means...
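The article's watershed formulation is not reproduced in the abstract, so the sketch below shows the general weighted goal-programming pattern (minimizing goal underachievement subject to a land constraint) solved as an ordinary LP in Python; the areas, yields, and targets are invented for illustration.

```python
from scipy.optimize import linprog

# Minimal sketch of a goal program posed as a linear program; the land areas,
# per-hectare yields, and goal targets are hypothetical.
# Decision variables: x1, x2 = hectares in two management options,
#                     d1, d2 = underachievement of the timber and water goals.
c = [0, 0, 1, 1]                       # minimize total goal underachievement
A_ub = [[-40, -10, -1,  0],            # 40*x1 + 10*x2 + d1 >= 3000  (timber goal)
        [ -2,  -5,  0, -1],            #  2*x1 +  5*x2 + d2 >= 350   (water-yield goal)
        [  1,   1,  0,  0]]            #  x1 + x2 <= 100 ha of available land
b_ub = [-3000, -350, 100]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 4)
print(res.x, res.fun)                  # allocation and the unavoidable total deviation
```

In an Excel Solver setup the same formulation would put the deviation variables in additional adjustable cells and minimize their (weighted) sum.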
Dusel-Bacon, Cynthia; Slack, John F.; Koenig, Alan E.; Foley, Nora K.; Oscarson, Robert L.; Gans, Kathleen D.
2011-01-01
This Open-File Report presents geochemical data for outcrop and drill-core samples from volcanogenic massive sulfide deposits and associated metaigneous and metasedimentary rocks in the Wood River area of the Bonnifield mining district, northern Alaska Range, east-central Alaska. The data consist of major- and trace-element whole-rock geochemical analyses, and major- and trace-element analyses of sulfide minerals determined by electron microprobe and laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS) techniques. The PDF consists of text, an appendix explaining the analytical methods used for the analyses presented in the data tables, a sample location map, and seven data tables. The seven tables are also available as spreadsheets in several file formats. Descriptions and discussions of the Bonnifield deposits are given in Dusel-Bacon and others (2004, 2005, 2006, 2007, 2010).
Proteomics wants cRacker: automated standardized data analysis of LC-MS derived proteomic data.
Zauber, Henrik; Schulze, Waltraud X
2012-11-02
The large-scale analysis of thousands of proteins under various experimental conditions or in mutant lines has gained more and more importance in hypothesis-driven scientific research and systems biology in the past years. Quantitative analysis by large scale proteomics using modern mass spectrometry usually results in long lists of peptide ion intensities. The main interest for most researchers, however, is to draw conclusions on the protein level. Postprocessing and combining peptide intensities of a proteomic data set requires expert knowledge, and the often repetitive and standardized manual calculations can be time-consuming. The analysis of complex samples can result in very large data sets (lists with several 1000s to 100,000 entries of different peptides) that cannot easily be analyzed using standard spreadsheet programs. To improve speed and consistency of the data analysis of LC-MS derived proteomic data, we developed cRacker. cRacker is an R-based program for automated downstream proteomic data analysis including data normalization strategies for metabolic labeling and label free quantitation. In addition, cRacker includes basic statistical analysis, such as clustering of data, or ANOVA and t tests for comparison between treatments. Results are presented in editable graphic formats and in list files.
Mojo Hand, a TALEN design tool for genome editing applications.
Neff, Kevin L; Argue, David P; Ma, Alvin C; Lee, Han B; Clark, Karl J; Ekker, Stephen C
2013-01-16
Recent studies of transcription activator-like (TAL) effector domains fused to nucleases (TALENs) demonstrate enormous potential for genome editing. Effective design of TALENs requires a combination of selecting appropriate genetic features, finding pairs of binding sites based on a consensus sequence, and, in some cases, identifying endogenous restriction sites for downstream molecular genetic applications. We present the web-based program Mojo Hand for designing TAL and TALEN constructs for genome editing applications (http://www.talendesign.org). We describe the algorithm and its implementation. The features of Mojo Hand include (1) automatic download of genomic data from the National Center for Biotechnology Information, (2) analysis of any DNA sequence to reveal pairs of binding sites based on a user-defined template, (3) selection of restriction-enzyme recognition sites in the spacer between the TAL monomer binding sites including options for the selection of restriction enzyme suppliers, and (4) output files designed for subsequent TALEN construction using the Golden Gate assembly method. Mojo Hand enables the rapid identification of TAL binding sites for use in TALEN design. The assembly of TALEN constructs is also simplified by using the TAL-site prediction program in conjunction with a spreadsheet-based management aid for reagent concentrations and TALEN formulation. Mojo Hand enables scientists to more rapidly deploy TALENs for genome editing applications.
Wang, Li Yan; O'Brien, Mary Jane; Maughan, Erin D
2016-11-01
This paper describes a user-friendly, Excel spreadsheet model and two data collection instruments constructed by the authors to help states and districts perform cost-benefit analyses of school nursing services delivered by full-time school nurses. Prior to applying the model, states or districts need to collect data using two forms: "Daily Nurse Data Collection Form" and the "Teacher Survey." The former is used to record daily nursing activities, including number of student health encounters, number of medications administered, number of student early dismissals, and number of medical procedures performed. The latter is used to obtain estimates for the time teachers spend addressing student health issues. Once inputs are entered in the model, outputs are automatically calculated, including program costs, total benefits, net benefits, and benefit-cost ratio. The spreadsheet model, data collection tools, and instructions are available at the NASN website ( http://www.nasn.org/The/CostBenefitAnalysis ).
NASA Astrophysics Data System (ADS)
Eso, R.; Safiuddin, L. O.; Agusu, L.; Arfa, L. M. R. F.
2018-04-01
We propose a teaching instrument demonstrating circular membrane waves using interactive Excel spreadsheets with Visual Basic for Applications (VBA) programming. It is based on the analytic solution for circular membrane waves involving Bessel functions. The vibration modes and frequencies are determined using the Bessel approximation and the initial conditions. The 3D perspective based on the spreadsheet's functions and facilities has been explored to show 3D objects moving in translation or rotation. This instrument is very useful both in teaching and in learning wave physics. The visualization clearly shows the m and n vibration modes of the wave in the circular membrane at a given frequency, and it has been compared and matched to experimental results obtained with the resonance method. The peak deflection varies in time when the initial condition is applied and shows the same pattern as a MATLAB simulation with zero initial velocity.
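A minimal Python sketch of the analytic solution the spreadsheet animates is shown below; the radius, wave speed, and chosen (m, n) mode are arbitrary assumptions.

```python
import numpy as np
from scipy.special import jn, jn_zeros

# Minimal sketch of one circular-membrane normal mode: mode (m, n) has shape
# J_m(k_mn * r) * cos(m * theta) and angular frequency omega = c * k_mn, where
# k_mn * a is the n-th zero of J_m. Radius a, wave speed c, and the mode
# numbers are illustrative choices.
a, c = 1.0, 1.0          # membrane radius and wave speed
m, n = 1, 2              # angular and radial mode numbers

k_mn = jn_zeros(m, n)[-1] / a          # radial wavenumber of mode (m, n)
omega = c * k_mn                       # angular frequency of that mode

def deflection(r, theta, t):
    """Deflection of mode (m, n) at radius r, angle theta, time t."""
    return jn(m, k_mn * r) * np.cos(m * theta) * np.cos(omega * t)

print(round(omega, 4), round(deflection(0.5, 0.0, 0.0), 4))
```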
State Energy Efficiency Program Evaluation Inventory
2013-01-01
The focus of this inventory, some of which has been placed into a searchable spreadsheet, is to support the National Energy Modeling System (NEMS) and to research cost information in state-mandated energy efficiency program evaluations – e.g., for use in updating analytic and modeling assumptions used by the U.S. Energy Information Administration (EIA).
Overcoming Microsoft Excel's Weaknesses for Crop Model Building and Simulations
ERIC Educational Resources Information Center
Sung, Christopher Teh Boon
2011-01-01
Using spreadsheets such as Microsoft Excel for building crop models and running simulations can be beneficial. Excel is easy to use, powerful, and versatile, and it requires the least proficiency in computer programming compared to other programming platforms. Excel, however, has several weaknesses: it does not directly support loops for iterative…
Costing nursing education programs. It's as easy as 1-2-3.
Fisher, M L; Hume, R; Emerick, R
1998-01-01
Staff development departments are pressured to reveal the costs of their educational programs and to compete with outside vendors for programming. The process of implementing a spreadsheet template for costing out staff development programs is described. The template is easy to use and supports "what if" analysis. This model allows educators to evaluate cost implications of curricular decisions and to better negotiate with internal and external customers.
Office Computer Software: A Comprehensive Review of Software Programs.
ERIC Educational Resources Information Center
Secretary, 1992
1992-01-01
Describes types of software including system software, application software, spreadsheets, accounting software, graphics packages, desktop publishing software, database, desktop and personal information management software, project and records management software, groupware, and shareware. (JOW)
Beyond the Mechanics of Spreadsheets: Using Design Instruction to Address Spreadsheet Errors
ERIC Educational Resources Information Center
Schneider, Kent N.; Becker, Lana L.; Berg, Gary G.
2017-01-01
Given that the usage and complexity of spreadsheets in the accounting profession are expected to increase, it is more important than ever to ensure that accounting graduates are aware of the dangers of spreadsheet errors and are equipped with design skills to minimize those errors. Although spreadsheet mechanics are prevalent in accounting…
Rand, Hugh; Shumway, Martin; Trees, Eija K.; Simmons, Mustafa; Agarwala, Richa; Davis, Steven; Tillman, Glenn E.; Defibaugh-Chavez, Stephanie; Carleton, Heather A.; Klimke, William A.; Katz, Lee S.
2017-01-01
Background As next generation sequence technology has advanced, there have been parallel advances in genome-scale analysis programs for determining evolutionary relationships as proxies for epidemiological relationship in public health. Most new programs skip traditional steps of ortholog determination and multi-gene alignment, instead identifying variants across a set of genomes, then summarizing results in a matrix of single-nucleotide polymorphisms or alleles for standard phylogenetic analysis. However, public health authorities need to document the performance of these methods with appropriate and comprehensive datasets so they can be validated for specific purposes, e.g., outbreak surveillance. Here we propose a set of benchmark datasets to be used for comparison and validation of phylogenomic pipelines. Methods We identified four well-documented foodborne pathogen events in which the epidemiology was concordant with routine phylogenomic analyses (reference-based SNP and wgMLST approaches). These are ideal benchmark datasets, as the trees, WGS data, and epidemiological data for each are all in agreement. We have placed these sequence data, sample metadata, and “known” phylogenetic trees in publicly-accessible databases and developed a standard descriptive spreadsheet format describing each dataset. To facilitate easy downloading of these benchmarks, we developed an automated script that uses the standard descriptive spreadsheet format. Results Our “outbreak” benchmark datasets represent the four major foodborne bacterial pathogens (Listeria monocytogenes, Salmonella enterica, Escherichia coli, and Campylobacter jejuni) and one simulated dataset where the “known tree” can be accurately called the “true tree”. The downloading script and associated table files are available on GitHub: https://github.com/WGS-standards-and-analysis/datasets. Discussion These five benchmark datasets will help standardize comparison of current and future phylogenomic pipelines, and facilitate important cross-institutional collaborations. Our work is part of a global effort to provide collaborative infrastructure for sequence data and analytic tools—we welcome additional benchmark datasets in our recommended format, and, if relevant, we will add these on our GitHub site. Together, these datasets, dataset format, and the underlying GitHub infrastructure present a recommended path for worldwide standardization of phylogenomic pipelines. PMID:29372115
Timme, Ruth E; Rand, Hugh; Shumway, Martin; Trees, Eija K; Simmons, Mustafa; Agarwala, Richa; Davis, Steven; Tillman, Glenn E; Defibaugh-Chavez, Stephanie; Carleton, Heather A; Klimke, William A; Katz, Lee S
2017-01-01
As next generation sequence technology has advanced, there have been parallel advances in genome-scale analysis programs for determining evolutionary relationships as proxies for epidemiological relationship in public health. Most new programs skip traditional steps of ortholog determination and multi-gene alignment, instead identifying variants across a set of genomes, then summarizing results in a matrix of single-nucleotide polymorphisms or alleles for standard phylogenetic analysis. However, public health authorities need to document the performance of these methods with appropriate and comprehensive datasets so they can be validated for specific purposes, e.g., outbreak surveillance. Here we propose a set of benchmark datasets to be used for comparison and validation of phylogenomic pipelines. We identified four well-documented foodborne pathogen events in which the epidemiology was concordant with routine phylogenomic analyses (reference-based SNP and wgMLST approaches). These are ideal benchmark datasets, as the trees, WGS data, and epidemiological data for each are all in agreement. We have placed these sequence data, sample metadata, and "known" phylogenetic trees in publicly-accessible databases and developed a standard descriptive spreadsheet format describing each dataset. To facilitate easy downloading of these benchmarks, we developed an automated script that uses the standard descriptive spreadsheet format. Our "outbreak" benchmark datasets represent the four major foodborne bacterial pathogens ( Listeria monocytogenes , Salmonella enterica , Escherichia coli , and Campylobacter jejuni ) and one simulated dataset where the "known tree" can be accurately called the "true tree". The downloading script and associated table files are available on GitHub: https://github.com/WGS-standards-and-analysis/datasets. These five benchmark datasets will help standardize comparison of current and future phylogenomic pipelines, and facilitate important cross-institutional collaborations. Our work is part of a global effort to provide collaborative infrastructure for sequence data and analytic tools-we welcome additional benchmark datasets in our recommended format, and, if relevant, we will add these on our GitHub site. Together, these datasets, dataset format, and the underlying GitHub infrastructure present a recommended path for worldwide standardization of phylogenomic pipelines.
Code of Federal Regulations, 2010 CFR
2010-07-01
..., Demand Side Variability, and Network Variability studies, including input data, processing programs, and... should include the product or product groups carried under each listed contract; (k) Spreadsheets and...
Data-driven traffic impact assessment tool for work zones.
DOT National Transportation Integrated Search
2017-03-01
Traditionally, traffic impacts of work zones have been assessed using planning software such as Quick Zone, custom spreadsheets, and others. These software programs generate delay, queuing, and other mobility measures but are difficult to validate du...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hillson, Nathan
j5 automates and optimizes the design of the molecular biological process of cloning/constructing DNA. j5 enables users to benefit from (combinatorial) multi-part scar-less SLIC, Gibson, CPEC, Golden Gate assembly, or variants thereof, for which automation software does not currently exist, without the intense labor currently associated with the process. j5 inputs a list of the DNA sequences to be assembled, along with a Genbank, FASTA, jbei-seq, or SBOL v1.1 format sequence file for each DNA source. Given the list of DNA sequences to be assembled, j5 first determines the cost-minimizing assembly strategy for each part (direct synthesis, PCR/SOE, or oligo-embedding), designs DNA oligos with Primer3, adds flanking homology sequences (SLIC, Gibson, and CPEC; optimized with Primer3 for CPEC) or optimized overhang sequences (Golden Gate) to the oligos and direct synthesis pieces, and utilizes BLAST to check against oligo mis-priming and assembly piece incompatibility events. After identifying DNA oligos that are already contained within a local collection for reuse, the program estimates the total cost of direct synthesis and new oligos to be ordered. In the instance that j5 identifies putative assembly piece incompatibilities (multiple pieces with high flanking sequence homology), the program suggests hierarchical subassemblies where possible. The program outputs a comma-separated value (CSV) file, viewable via Excel or other spreadsheet software, that contains assembly design information (such as the PCR/SOE reactions to perform, their anticipated sizes and sequences, etc.) as well as a properly annotated genbank file containing the sequence resulting from the assembly, and appends the local oligo library with the oligos to be ordered. j5 condenses multiple independent assembly projects into 96-well format for high-throughput liquid-handling robotics platforms, and generates configuration files for the PR-PR biology-friendly robot programming language. j5 thus provides a new way to design DNA assembly procedures much more productively and efficiently, not only in terms of time, but also in terms of cost. To a large extent, however, j5 does not allow people to do something that could not be done before by hand given enough time and effort. An exception to this is that, since the very act of using j5 to design the DNA assembly process standardizes the experimental details and workflow, j5 enables a single person to concurrently perform the independent DNA construction tasks of an entire group of researchers. Currently, this is not readily possible, since separate researchers employ disparate design strategies and workflows, and furthermore, their designs and workflows are very infrequently fully captured in an electronic format which is conducive to automation.
Cuffney, Thomas F.
2003-01-01
The Invertebrate Data Analysis System (IDAS) software provides an accurate, consistent, and efficient mechanism for analyzing invertebrate data collected as part of the National Water-Quality Assessment Program and stored in the Biological Transactional Database (Bio-TDB). The IDAS software is a stand-alone program for personal computers that run Microsoft (MS) Windows®. It allows users to read data downloaded from Bio-TDB and stored either as MS Excel® or MS Access® files. The program consists of five modules. The Edit Data module allows the user to subset, combine, delete, and summarize community data. The Data Preparation module allows the user to select the type(s) of sample(s) to process, calculate densities, delete taxa based on laboratory processing notes, combine lifestages or keep them separate, select a lowest taxonomic level for analysis, delete rare taxa, and resolve taxonomic ambiguities. The Calculate Community Metrics module allows the user to calculate over 130 community metrics, including metrics based on organism tolerances and functional feeding groups. The Calculate Diversities and Similarities module allows the user to calculate nine diversity and eight similarity indices. The Data export module allows the user to export data to other software packages and produce tables of community data that can be imported into spreadsheet and word-processing programs. Though the IDAS program was developed to process invertebrate data downloaded from USGS databases, it will work with other data sets that are converted to the USGS (Bio-TDB) format. Consequently, the data manipulation, analysis, and export procedures provided by the IDAS program can be used by anyone involved in using benthic macroinvertebrates in applied or basic research.
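IDAS's own index definitions are not given in the abstract; the sketch below shows two representative measures of the kinds it reports (a diversity index and a similarity index) with invented taxon data.

```python
import math

# Minimal sketch of one diversity index and one similarity index of the kinds
# IDAS offers; the taxon counts and taxon names are invented placeholders.
def shannon_diversity(counts):
    """Shannon diversity H' = -sum(p_i * ln(p_i)) over taxa with nonzero counts."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

def jaccard_similarity(sample_a, sample_b):
    """Jaccard similarity: shared taxa divided by total taxa across two samples."""
    a, b = set(sample_a), set(sample_b)
    return len(a & b) / len(a | b)

print(shannon_diversity([12, 5, 3, 1]))
print(jaccard_similarity({"Baetis", "Hydropsyche", "Chironomus"},
                         {"Baetis", "Chironomus", "Simulium", "Optioservus"}))
```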
Supplemental knowledge acquisition through external product interface for CLIPS
NASA Technical Reports Server (NTRS)
Saito, Tim; Ebaud, Stephen; Loftin, Bowen R.
1990-01-01
Traditionally, the acquisition of knowledge for expert systems consisted of the interview process with the domain or subject matter expert (SME), observation of the domain environment, and information gathering and research, which together constituted a direct form of knowledge acquisition (KA). The knowledge engineer would be responsible for accumulating pertinent information and/or knowledge from the SME(s) for input into the appropriate expert system development tool. The direct KA process may (or may not) have included forms of data or documentation to incorporate from the SME's surroundings. The difference between direct KA and supplemental (indirect) KA lies in the use of data. In acquiring supplemental knowledge, the knowledge engineer would access other types of evidence (manuals, documents, data files, spreadsheets, etc.) that would support the reasoning or premises of the SME. When an expert makes a decision in a particular task, one tool that may have been used to justify a recommendation would have been a spreadsheet total or column figure. Locating specific decision points from that data within the SME's framework would constitute supplemental KA. Data used for a specific purpose in one system or environment would be used as supplemental knowledge for another, specifically a CLIPS project.
ALOG user's manual: A Guide to using the spreadsheet-based artificial log generator
Matthew F. Winn; Philip A. Araman; Randolph H. Wynne
2012-01-01
Computer programs that simulate log sawing can be valuable training tools for sawyers, as well as a means of testing different sawing patterns. Most available simulation programs rely on diagrammed-log databases, which can be very costly and time consuming to develop. Artificial Log Generator (ALOG) is a user-friendly Microsoft® Excel®...
Autoplot: a Browser for Science Data on the Web
NASA Astrophysics Data System (ADS)
Faden, J.; Weigel, R. S.; West, E. E.; Merka, J.
2008-12-01
Autoplot (www.autoplot.org) is software for plotting data from many different sources and in many different file formats. Data from CDF, CEF, Fits, NetCDF, and OpenDAP can be plotted, along with many other sources such as ASCII tables and Excel spreadsheets. This is done by adapting these various data formats and APIs into a common data model that borrows from the netCDF and CDF data models. Autoplot uses a web browser metaphor to simplify use. The user specifies a parameter URL, for example a CDF file accessible via http with a parameter name appended, and the file resource is downloaded and the parameter is rendered in a scientifically meaningful way. When data span multiple files, the user can use a file name template in the URL to aggregate (combine) a set of remote files. So the problem of aggregating data across file boundaries is handled on the client side, allowing simple web servers to be used. The das2 graphics library provides rich controls for exploring the data. Scripting is supported through Python, providing not just programmatic control but also a way to calculate new parameters in a language that will look familiar to IDL and Matlab users. Autoplot is Java-based software, and will run on most computers without a burdensome installation process. It can also be used as an applet or as a servlet that serves static images. Autoplot was developed as part of the Virtual Radiation Belt Observatory (ViRBO) project, and is also being used for the Virtual Magnetospheric Observatory (VMO). It is expected that this flexible, general-purpose plotting tool will be useful for allowing a data provider to add instant visualization capabilities to a directory of files or for general use in the Virtual Observatory environment.
Making the business case for telemedicine: an interactive spreadsheet.
McCue, Michael J; Palsbo, Susan E
2006-04-01
The objective of this study was to demonstrate the business case for telemedicine in nonrural areas. We developed an interactive spreadsheet to conduct multiple financial analyses under different capital investment, revenue, and expense scenarios. We applied the spreadsheet to the specific case of poststroke rehabilitation in urban settings. The setting involved outpatient clinics associated with a freestanding rehabilitation hospital in Oklahoma. Our baseline scenario used historical financial data from face-to-face encounters as the baseline for payer and volume mix. We assumed a cost of capital of 10% to finance the project. The outcome measures were financial breakeven points and internal rate of return. A total of 340 telemedicine visits will generate a positive net cash flow each year. The project is expected to recoup the initial investment by the fourth year, produce a positive present value dollar return of more than $2,000, and earn a rate of return of 20%, which exceeds the hospital's cost of capital. The business case is demonstrated for this scenario. Urban telemedicine programs can be financially self-sustaining without accounting for reductions in travel time by providers or patients. Urban telemedicine programs can be a sound business investment and not depend on grants or subsidies for start-up funding. There are several key decision points that affect breakeven points and return on investment. The best business strategy is to approach the decision as whether or not to build a new clinic.
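The study's actual cash flows are summarized only as outcomes above, so the sketch below illustrates the underlying breakeven/IRR arithmetic with hypothetical figures.

```python
# Minimal sketch of the cash-flow analysis behind such a spreadsheet: net
# present value and internal rate of return. The initial outlay and yearly
# net cash flows are hypothetical, not the study's figures.
def npv(rate, cash_flows):
    """NPV of cash_flows[0] now, cash_flows[1] after year 1, and so on."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

def irr(cash_flows, lo=-0.99, hi=1.0, tol=1e-6):
    """Internal rate of return found by bisection on the NPV sign change."""
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if npv(mid, cash_flows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

flows = [-15000, 4000, 5000, 6000, 7000]     # year 0 investment, then net inflows
print(npv(0.10, flows))                      # positive at a 10% cost of capital
print(irr(flows))                            # ~0.156, roughly a 15-16% IRR
```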
Computer Literacy for Teachers.
ERIC Educational Resources Information Center
Sarapin, Marvin I.; Post, Paul E.
Basic concepts of computer literacy are discussed as they relate to industrial arts/technology education. Computer hardware development is briefly examined, and major software categories are defined, including database management, computer graphics, spreadsheet programs, telecommunications and networking, word processing, and computer assisted and…
(abstract) Simple Spreadsheet Thermal Models for Cryogenic Applications
NASA Technical Reports Server (NTRS)
Nash, A. E.
1994-01-01
Self-consistent circuit-analog thermal models that can be run in commercial spreadsheet programs on personal computers have been created to calculate the cooldown and steady-state performance of cryogen-cooled Dewars. The models include temperature-dependent conduction and radiation effects. The outputs of the models provide temperature distribution and Dewar performance information. These models have been used to analyze the Cryogenic Telescope Test Facility (CTTF). The facility will be on line in early 1995 for its first user, the Infrared Telescope Technology Testbed (ITTT), for the Space Infrared Telescope Facility (SIRTF) at JPL. The model algorithm, as well as a comparison of the model predictions and actual performance of this facility, will be presented.
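A minimal sketch of the circuit-analog idea, with conduction and radiation links between lumped nodes integrated by explicit time-stepping, is shown below; the geometry, heat capacities, and temperature-dependent conductance law are hypothetical placeholders, not the CTTF model itself.

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def conduction_W(T1, T2, g0=0.05):
    """Conductive link with a simple temperature-dependent conductance
    (hypothetical linear-in-mean-temperature law), in watts."""
    g = g0 * (1.0 + 0.002 * (T1 + T2) / 2.0)   # W/K
    return g * (T1 - T2)

def radiation_W(T1, T2, area=0.5, emissivity=0.05):
    """Radiative link between two gray surfaces (simplified), in watts."""
    return emissivity * SIGMA * area * (T1**4 - T2**4)

# Two lumped nodes: a cold stage sunk to a cryogen bath and a shield facing 300 K.
T_bath, T_room = 77.0, 300.0           # K
T = {"stage": 300.0, "shield": 300.0}  # initial (warm) temperatures
C = {"stage": 2.0e3, "shield": 8.0e3}  # heat capacities, J/K (hypothetical)

dt, t = 10.0, 0.0                      # s
while t < 48 * 3600:                   # simulate a two-day cooldown
    q_stage = -conduction_W(T["stage"], T_bath) + conduction_W(T["shield"], T["stage"])
    q_shield = -conduction_W(T["shield"], T["stage"]) + radiation_W(T_room, T["shield"])
    T["stage"] += q_stage / C["stage"] * dt
    T["shield"] += q_shield / C["shield"] * dt
    t += dt

print(f"After 48 h: stage ~ {T['stage']:.0f} K, shield ~ {T['shield']:.0f} K")
```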
Simple Spreadsheet Thermal Models for Cryogenic Applications
NASA Technical Reports Server (NTRS)
Nash, Alfred
1995-01-01
Self-consistent circuit-analog thermal models that can be run in commercial spreadsheet programs on personal computers have been created to calculate the cooldown and steady-state performance of cryogen-cooled Dewars. The models include temperature-dependent conduction and radiation effects. The outputs of the models provide temperature distribution and Dewar performance information. These models have been used to analyze the SIRTF Telescope Test Facility (STTF). The facility has been brought on line for its first user, the Infrared Telescope Technology Testbed (ITTT), for the Space Infrared Telescope Facility (SIRTF) at JPL. The model algorithm, as well as a comparison between the models' predictions and actual performance of this facility, will be presented.
Determination of Needed Spreadsheet Competencies for Business Personnel in the Mid-South States.
ERIC Educational Resources Information Center
Rogers, Betty S.; Arn, Joseph V.
1993-01-01
A survey of 209 Mid-South businesses determined spreadsheet usage, what competencies are needed for entry-level and continued employment, and sources of spreadsheet training. Recommended that, because of their widespread use, spreadsheets should be taught to all business students. (Author/JOW)
Clynne, Michael A.; Muffler, L.J.P.; Siems, D.F.; Taggart, J.E.; Bruggman, Peggy
2008-01-01
This open-file report presents WDXRF major-element chemical data for late Pliocene to Holocene volcanic rocks collected from Lassen Volcanic National Park and vicinity, California. Data for Rb, Sr, Ba, Y, Zr, Nb, Ni, Cr, Zn and Cu obtained by EDXRF are included for many samples. Data are presented in an EXCEL spreadsheet and are keyed to rock units as displayed on the Geologic Map of Lassen Volcanic National Park and vicinity (Clynne and Muffler, in press). Location of the samples is given in latitude and longitude in degrees and decimal minutes and in decimal degrees.
HEC-RAS 2.2 for backwater and scour analysis - phase one
DOT National Transportation Integrated Search
2000-09-01
The Kansas Department of Transportation (KDOT) and most bridge consultants in Kansas have been using the DOS-WSPRO program and the KDOT scour spreadsheets to perform bridge hydraulics and scour analysis for the past several years. Unfortunately, DOS-...
SPREADSHEET BASED SCALING CALCULATIONS AND MEMBRANE PERFORMANCE
Many membrane element manufacturers provide a computer program to aid buyers in the use of their elements. However, to date there are few examples of fully integrated public domain software available for calculating reverse osmosis and nanofiltration system performance. The Total...
Computerized Budget Monitoring.
ERIC Educational Resources Information Center
Stein, Julian U.; Rowe, Joe N.
1989-01-01
This article discusses the importance of budget monitoring in fiscal management; describes ways in which computerized budget monitoring increases accuracy, efficiency, and flexibility; outlines steps in the budget process; and presents sample reports, generated using the Lotus 1-2-3 spreadsheet and graphics program. (IAH)
Evaluation of work zone enhancement software programs.
DOT National Transportation Integrated Search
2009-09-01
The Missouri Department of Transportation (MoDOT) is looking for software tools that can assist in : developing effective plans to manage and communicate work zone activities. QuickZone, CA4PRS, : VISSIM, and Spreadsheet models are the tools that MoD...
DOT National Transportation Integrated Search
2017-03-01
The performance-planning tool developed as part of this project is intended for use with the guidebook for establishing and using rural performance based transportation system assessment, monitoring, planning, and programming to support the rural pla...
COMPUTING SI AND CCPP USING SPREADSHEET PROGRAMS
Lotus 1-2-3 worksheets for calculating the calcite saturation index (SI) and calcium carbonate precipitation potential of a water sample are described. A simplified worksheet illustrates the principles of the method, and a more complex worksheet suitable for modeling most potabl...
Popoola, Segun I; Atayero, Aderemi A; Badejo, Joke A; John, Temitope M; Odukoya, Jonathan A; Omole, David O
2018-04-01
Empirical measurement, monitoring, analysis, and reporting of learning outcomes in higher institutions of developing countries may lead to sustainable education in the region. In this data article, data about the academic performances of undergraduates who studied engineering programs at Covenant University, Nigeria, are presented and analyzed. A total population sample of 1,841 undergraduates who studied Chemical Engineering (CHE), Civil Engineering (CVE), Computer Engineering (CEN), Electrical and Electronics Engineering (EEE), Information and Communication Engineering (ICE), Mechanical Engineering (MEE), and Petroleum Engineering (PET) within the year range 2002-2014 was randomly selected. For the five-year study period of the engineering programs, the Grade Point Average (GPA) of each student in the sample, and its cumulative value, were obtained from the Department of Student Records and Academic Affairs. In order to encourage evidence-based research in learning analytics, detailed datasets are made publicly available in a Microsoft Excel spreadsheet file attached to this article. Descriptive statistics and frequency distributions of the academic performance data are presented in tables and graphs for easy data interpretation. In addition, one-way Analysis of Variance (ANOVA) and multiple-comparison post-hoc tests are performed to determine whether the variations in academic performance are significant across the seven engineering programs. The data provided in this article will assist the global educational research community and regional policy makers to understand and optimize the learning environment towards the realization of smart campuses and sustainable education.
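The one-way ANOVA and post-hoc comparison described can be reproduced outside the spreadsheet with standard tools; the sketch below uses SciPy on made-up GPA arrays for three of the programs (the real dataset is the article's Excel file), with a simple Bonferroni correction standing in for the post-hoc step.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical CGPA samples for three of the seven programs (placeholder data).
gpa = {
    "CHE": rng.normal(3.4, 0.5, 120).clip(0, 5),
    "EEE": rng.normal(3.6, 0.5, 150).clip(0, 5),
    "MEE": rng.normal(3.5, 0.5, 130).clip(0, 5),
}

# One-way ANOVA across programs.
f_stat, p_value = stats.f_oneway(*gpa.values())
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")

# Bonferroni-corrected pairwise t-tests as a stand-in for the post-hoc comparisons.
programs = list(gpa)
n_pairs = len(programs) * (len(programs) - 1) // 2
for i in range(len(programs)):
    for j in range(i + 1, len(programs)):
        t, p = stats.ttest_ind(gpa[programs[i]], gpa[programs[j]])
        print(f"{programs[i]} vs {programs[j]}: p_adj = {min(1.0, p * n_pairs):.4f}")
```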
Wood fueled boiler financial feasibility user's manual
Robert Govett; Scott Bowe; Terry Mace; Steve Hubbard; John (Rusty) Dramm; Richard Bergman
2005-01-01
"Wood Fueled Boiler Financial Feasibility" is a spreadsheet program designed for easy use on a personal computer. This program provides a starting point for interested parties to perform financial feasibility analysis of a steam boiler system for space heating or process heat. By allowing users to input the conditions applicable to their current or proposed fuel...
A Switching-Mode Power Supply Design Tool to Improve Learning in a Power Electronics Course
ERIC Educational Resources Information Center
Miaja, P. F.; Lamar, D. G.; de Azpeitia, M.; Rodriguez, A.; Rodriguez, M.; Hernando, M. M.
2011-01-01
The static design of ac/dc and dc/dc switching-mode power supplies (SMPS) relies on a simple but repetitive process. Although specific spreadsheets, available in various computer-aided design (CAD) programs, are widely used, they are difficult to use in educational applications. In this paper, a graphic tool programmed in MATLAB is presented,…
John Pitlick; Yantao Cui; Peter Wilcock
2009-01-01
This manual provides background information and instructions on the use of a spreadsheet-based program for Bedload Assessment in Gravel-bed Streams (BAGS). The program implements six bed load transport equations developed specifically for gravel-bed rivers. Transport capacities are calculated on the basis of field measurements of channel geometry, reach-average slope,...
Using Spreadsheets to Teach Statistics in Geography.
ERIC Educational Resources Information Center
Lee, M. P.; Soper, J. B.
1987-01-01
Maintains that teaching methods of statistical calculation in geography may be enhanced by using a computer spreadsheet. The spreadsheet format of rows and columns allows the data to be inspected and altered to demonstrate various statistical properties. The inclusion of graphics and database facilities further adds to the value of a spreadsheet.…
Simulation Software's Effect on College Students Spreadsheet Project Scores
ERIC Educational Resources Information Center
Atkinson, J. Kirk; Thrasher, Evelyn H.; Coleman, Phillip D.
2011-01-01
The purpose of this study is to explore the potential impact of support materials on student spreadsheet skill acquisition. Specifically, this study examines the use of an online spreadsheet simulation tool versus a printed book across two independent student groups. This study hypothesizes that the online spreadsheet simulation tool will have a…
BIOCHLOR: NATURAL ATTENUATION DECISION SUPPORT SYSTEM, USER'S MANUAL, VERSION 1.0
BIOCHLOR is an easy-to-use screening model that simulates remediation by natural attenuation (RNA) of dissolved solvents at chlorinated solvent release sites. The software, programmed in the Microsoft Excel spreadsheet environment and based on the Domenico analytical solute tran...
BIOSCREEN: NATURAL ATTENUATION DECISION SUPPORT SYSTEM - USER'S MANUAL, VERSION 1.3
BIOSCREEN is an easy-to-use screening model which simulates remediation through natural attenuation (RNA) of dissolved hydrocarbons at petroleum fuel release sites. The software, programmed in the Microsoft Excel spreadsheet environment and based on the Domenico analytical solu...
Science: Database Programs and the Study of Seashells.
ERIC Educational Resources Information Center
McCurry, Niki; McCurry, Alan
1992-01-01
Discusses the dynamics and outcomes of an unplanned classroom activity that developed from the integration of the use of spreadsheets with the study of the characteristics of previously collected seashells, specifically their color, size, shape, texture, and any other obvious differences. (JJK)
Implementation of straight and curved steel girder erection design tools construction : summary.
DOT National Transportation Integrated Search
2010-11-05
Project 0-5574 Curved Plate Girder Design for Safe and Economical Construction, resulted in the : development of two design tools, UT Lift and UT Bridge. UT Lift is a spreadsheet-based program for analyzing : steel girders during lifting while ...
Measurements of striae in CR+ doped YAG laser crystals
NASA Astrophysics Data System (ADS)
Cady, Fredrick M.
1994-12-01
Striations in Czochralski (CZ) grown crystals have been observed in materials such as GaAs, silicon, photorefractive crystals used for data storage, potassium titanyl phosphate crystals, and LiNbO3. Several techniques have been used to investigate these defects, including electron microscopy, laser scanning tomography, selective photoetching, X-ray diffuse scattering, interference orthoscopy, laser interferometry, and micro-Fourier transform infrared spectroscopy mapping. A 2-mm-thick sample of the material to be investigated is illuminated with light that is absorbed and not absorbed by the ion concentration to be observed. The back surface of the sample is focused onto a solid-state image detector, and images of the input beam and absorbed (and diffracted) beams are captured at two wavelengths. The variation of the absorption coefficient as a function of distance on the sample can be derived from these measurements. A Big Sky Software Beamcode system is used to capture and display images. Software has been written to convert the Beamcode data files to a format that can be imported into a spreadsheet program such as Quattro Pro. The spreadsheet is then used to manipulate and display the data. A model of the intensity map of the striae collected by the imaging system has been proposed and a data analysis procedure derived. From this, the variability of the attenuation coefficient alpha can be generated. Preliminary results show that alpha may vary by a factor of four or five over distances of 100 µm. Potential errors and problems have been discovered, and additional experiments and improvements to the experimental setup are in progress; it remains to be shown that the measurement techniques and data analysis procedures provide 'real' information. Striae are clearly visible at all wavelengths, including white light. Their basic spatial frequency does not change radically, at least when changing from blue to green to white light. Further experimental and theoretical work can be done to improve the data collection techniques and to verify the data analysis procedures.
Automation of PCXMC and ImPACT for NASA Astronaut Medical Imaging Dose and Risk Tracking
NASA Technical Reports Server (NTRS)
Bahadori, Amir; Picco, Charles; Flores-McLaughlin, John; Shavers, Mark; Semones, Edward
2011-01-01
To automate astronaut organ and effective dose calculations from occupational X-ray and computed tomography (CT) examinations incorporating PCXMC and ImPACT tools and to estimate the associated lifetime cancer risk per the National Council on Radiation Protection & Measurements (NCRP) using MATLAB(R). Methods: NASA follows guidance from the NCRP on its operational radiation safety program for astronauts. NCRP Report 142 recommends that astronauts be informed of the cancer risks from reported exposures to ionizing radiation from medical imaging. MATLAB(R) code was written to retrieve exam parameters for medical imaging procedures from a NASA database, calculate associated dose and risk, and return results to the database, using the Microsoft .NET Framework. This code interfaces with the PCXMC executable and emulates the ImPACT Excel spreadsheet to calculate organ doses from X-rays and CTs, respectively, eliminating the need to utilize the PCXMC graphical user interface (except for a few special cases) and the ImPACT spreadsheet. Results: Using MATLAB(R) code to interface with PCXMC and replicate ImPACT dose calculation allowed for rapid evaluation of multiple medical imaging exams. The user inputs the exam parameter data into the database and runs the code. Based on the imaging modality and input parameters, the organ doses are calculated. Output files are created for record, and organ doses, effective dose, and cancer risks associated with each exam are written to the database. Annual and post-flight exposure reports, which are used by the flight surgeon to brief the astronaut, are generated from the database. Conclusions: Automating PCXMC and ImPACT for evaluation of NASA astronaut medical imaging radiation procedures allowed for a traceable and rapid method for tracking projected cancer risks associated with over 12,000 exposures. This code will be used to evaluate future medical radiation exposures, and can easily be modified to accommodate changes to the risk calculation procedure.
Finding P-Values for F Tests of Hypothesis on a Spreadsheet.
ERIC Educational Resources Information Center
Rochowicz, John A., Jr.
The calculation of the F statistic for a one-factor analysis of variance (ANOVA) and the construction of an ANOVA table are easily implemented on a spreadsheet. This paper describes how to compute the p-value (observed significance level) for a particular F statistic on a spreadsheet. Decision making on a spreadsheet and applications to the…
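The p-value for a given F statistic is the upper-tail probability of the F distribution; a spreadsheet uses FDIST or F.DIST.RT, and the equivalent Python one-liner, shown here for an assumed F statistic and degrees of freedom, is:

```python
from scipy import stats

f_stat = 4.26                    # example F statistic (hypothetical)
df_between, df_within = 3, 36    # numerator and denominator degrees of freedom

p_value = stats.f.sf(f_stat, df_between, df_within)  # upper-tail area, like FDIST
print(f"p = {p_value:.4f}")
```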
Cognitive Skills, Domain Knowledge, and Self-Efficacy: Effects on Spreadsheet Quality
ERIC Educational Resources Information Center
Adkins, Joni K.
2011-01-01
Numerous studies have shown that spreadsheets used in companies often have errors which may affect the quality of the decisions made with these tools. Many businesses are unaware or choose to ignore the risks associated with spreadsheet use. The intent of this study was to learn more about the characteristics of spreadsheet end user developers,…
Spreadsheets and Bulgarian goats
NASA Astrophysics Data System (ADS)
Sugden, Steve
2012-10-01
We consider a problem appearing in an Australian Mathematics Challenge in 2003. This article considers whether a spreadsheet might be used to model this problem, thus allowing students to explore its structure within the spreadsheet environment. It then goes on to reflect on some general principles of problem decomposition when the final goal is a successful and lucid spreadsheet implementation.
DataSpread: Unifying Databases and Spreadsheets.
Bendre, Mangesh; Sun, Bofan; Zhang, Ding; Zhou, Xinyan; Chang, Kevin ChenChuan; Parameswaran, Aditya
2015-08-01
Spreadsheet software is often the tool of choice for ad-hoc tabular data management, processing, and visualization, especially on tiny data sets. On the other hand, relational database systems offer significant power, expressivity, and efficiency over spreadsheet software for data management, while lacking in the ease of use and ad-hoc analysis capabilities. We demonstrate DataSpread, a data exploration tool that holistically unifies databases and spreadsheets. It continues to offer a Microsoft Excel-based spreadsheet front-end, while in parallel managing all the data in a back-end database, specifically, PostgreSQL. DataSpread retains all the advantages of spreadsheets, including ease of use, ad-hoc analysis and visualization capabilities, and a schema-free nature, while also adding the advantages of traditional relational databases, such as scalability and the ability to use arbitrary SQL to import, filter, or join external or internal tables and have the results appear in the spreadsheet. DataSpread needs to reason about and reconcile differences in the notions of schema, addressing of cells and tuples, and the current "pane" (which exists in spreadsheets but not in traditional databases), and support data modifications at both the front-end and the back-end. Our demonstration will center on our first and early prototype of the DataSpread, and will give the attendees a sense for the enormous data exploration capabilities offered by unifying spreadsheets and databases.
Methodology for National Water Savings Model and Spreadsheet Tool—Outdoor Water Use
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Alison, A; Chen, Yuting; Dunham, Camilla
This report describes the method Lawrence Berkeley National Laboratory (LBNL) developed to estimate national impacts of the U.S. Environmental Protection Agency’s (EPA’s) WaterSense labeling program for weather-based irrigation controllers (WBIC). Estimated impacts include the national water savings attributable to the program and the net present value of the lifetime water savings for consumers of irrigation controllers.
Lotus 123 as a Gradebook: A Means of Increasing Teacher Productivity.
ERIC Educational Resources Information Center
Smith, Karen L.
1988-01-01
Examines the application of the spreadsheet program Lotus 1-2-3 to design a computerized gradebook that saves time in assessing students' strengths and weaknesses in a proficiency-oriented foreign language classroom. Sample entries are shown in text and Appendices. (Author/LMO)
Strategies of Successful Technology Integrators. Part I: Streamlining Classroom Management.
ERIC Educational Resources Information Center
McNally, Lynn; Etchison, Cindy
2000-01-01
Discussion of how to develop curriculum that successfully integrates technology into elementary and secondary school classrooms focuses on solutions for school and classroom management tasks. Highlights include Web-based solutions; student activities; word processing; desktop publishing; draw and paint programs; spreadsheets; and database…
Nonlinear least-squares data fitting in Excel spreadsheets.
Kemmer, Gerdi; Keller, Sandro
2010-02-01
We describe an intuitive and rapid procedure for analyzing experimental data by nonlinear least-squares fitting (NLSF) in the most widely used spreadsheet program. Experimental data in x/y form and data calculated from a regression equation are inputted and plotted in a Microsoft Excel worksheet, and the sum of squared residuals is computed and minimized using the Solver add-in to obtain the set of parameter values that best describes the experimental data. The confidence of best-fit values is then visualized and assessed in a generally applicable and easily comprehensible way. Every user familiar with the most basic functions of Excel will be able to implement this protocol, without previous experience in data fitting or programming and without additional costs for specialist software. The application of this tool is exemplified using the well-known Michaelis-Menten equation characterizing simple enzyme kinetics. Only slight modifications are required to adapt the protocol to virtually any other kind of dataset or regression equation. The entire protocol takes approximately 1 h.
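The same minimize-the-sum-of-squared-residuals workflow can be sketched outside Excel; here the Michaelis-Menten model is fitted with SciPy's general-purpose minimizer playing the role of the Solver add-in (the substrate and velocity values are invented for illustration).

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical enzyme-kinetics data: substrate concentration S and initial rate v.
S = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])
v = np.array([0.9, 1.5, 2.3, 3.0, 3.5, 3.8])

def ssr(params):
    """Sum of squared residuals for the Michaelis-Menten model v = Vmax*S/(Km+S)."""
    vmax, km = params
    return np.sum((v - vmax * S / (km + S)) ** 2)

# Minimize SSR starting from rough guesses, exactly as Solver would.
result = minimize(ssr, x0=[4.0, 2.0], method="Nelder-Mead")
vmax_fit, km_fit = result.x
print(f"Vmax = {vmax_fit:.2f}, Km = {km_fit:.2f}, SSR = {result.fun:.4f}")
```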
Teaching physics using Microsoft Excel
NASA Astrophysics Data System (ADS)
Uddin, Zaheer; Ahsanuddin, Muhammad; Khan, Danish Ahmed
2017-09-01
Excel is both ubiquitous and easily understandable. Most people from every walk of life know how to use MS Office and Excel spreadsheets. Students are also familiar with spreadsheets. Most students know how to use spreadsheets for data analysis. Besides the basic use of Excel, some important aspects of spreadsheets are highlighted in this article. MS Excel can be used to visualize the effects of various parameters in a physical system. It can be used as a simulating tool; simulation of wind data has been done through spreadsheets in this study. Examples of Lissajous figures and a damped harmonic oscillator are presented in this article.
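The spreadsheet exercises mentioned amount to filling a time column and evaluating a formula row by row; the same damped-oscillator and Lissajous calculations look like this in a short script (the amplitude, damping, and frequencies are arbitrary teaching values, not the article's).

```python
import numpy as np

t = np.linspace(0.0, 10.0, 501)          # "time column", 0-10 s

# Damped harmonic oscillator: x(t) = A * exp(-gamma*t) * cos(omega*t)
A, gamma, omega = 1.0, 0.3, 2.0 * np.pi  # arbitrary teaching values
x = A * np.exp(-gamma * t) * np.cos(omega * t)

# Lissajous figure: two perpendicular oscillations with a 3:2 frequency ratio
lx = np.sin(3 * 2 * np.pi * 0.2 * t)
ly = np.sin(2 * 2 * np.pi * 0.2 * t + np.pi / 4)

# Print a few rows as they would appear in the spreadsheet.
for i in range(0, 501, 100):
    print(f"t={t[i]:5.2f}  x={x[i]:+.3f}  (lx, ly)=({lx[i]:+.3f}, {ly[i]:+.3f})")
```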
WQEP - a computer spreadsheet program to evaluate water quality data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liddle, R.G.
1996-12-31
A flexible spreadsheet Water Quality Evaluation Program (WQEP) has been developed for mining companies, consultants, and regulators to interpret the results of water quality sampling. To evaluate hydrologic data properly, unit conversions and chemical calculations are done, quality control checks are needed, and a complete and up-to-date listing of water quality standards is necessary. This process is time consuming and tends not to be done for every sample. This program speeds the process by allowing the input of up to 115 chemical parameters from one sample. WQEP compares concentrations with EPA primary and secondary drinking water MCLs or MCLGs, EPA warm-water and cold-water acute and chronic aquatic life criteria, irrigation criteria, livestock criteria, EPA human health criteria, and several other categories of criteria. The spreadsheet allows the input of State or local water standards of interest. Water quality checks include: anions/cations, TDSm/TDSc (where m = measured and c = calculated), ECm/ECc, ECm/ion sums, TDSc/EC ratio, TDSm/EC, EC vs. alkalinity, two hardness values, and EC vs. the sum of cations. WQEP computes the dissolved transport index of 23 parameters, computes ratios of 26 species for trend analysis, calculates non-carbonate alkalinity to adjust the bicarbonate concentration, and calculates 35 interpretive formulas (pE, SAR, S.I., un-ionized ammonia, ionized sulfide HS-, pKx values, etc.). Fingerprinting is conducted by automatic generation of Stiff diagrams and ion histograms. Mass loading calculations, mass balance calculations, conversions of concentrations, ionic strength, and the activity coefficient and chemical activity of 33 parameters are calculated. This program allows a speedy and thorough evaluation of water quality data from metal mines, coal mining, and natural surface water systems and has been tested against hand calculations.
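One of the listed quality-control checks, the anion/cation balance, is simple enough to show in full; the sketch below converts mg/L to meq/L with standard equivalent weights and reports the charge-balance error (the sample values and the 5% acceptance threshold are illustrative assumptions, not WQEP's code).

```python
# Equivalent weights (g per equivalent) for a few major ions.
EQ_WT = {"Ca": 20.04, "Mg": 12.15, "Na": 22.99, "K": 39.10,
         "HCO3": 61.02, "SO4": 48.03, "Cl": 35.45}

CATIONS = {"Ca", "Mg", "Na", "K"}

def charge_balance_error(sample_mg_per_L):
    """Charge-balance error in percent: 100*(cations - anions)/(cations + anions), in meq/L."""
    cations = sum(c / EQ_WT[ion] for ion, c in sample_mg_per_L.items() if ion in CATIONS)
    anions = sum(c / EQ_WT[ion] for ion, c in sample_mg_per_L.items() if ion not in CATIONS)
    return 100.0 * (cations - anions) / (cations + anions)

# Hypothetical sample, concentrations in mg/L.
sample = {"Ca": 62.0, "Mg": 18.0, "Na": 23.0, "K": 3.1,
          "HCO3": 210.0, "SO4": 48.0, "Cl": 30.0}
cbe = charge_balance_error(sample)
print(f"Charge-balance error: {cbe:+.1f}% ({'acceptable' if abs(cbe) <= 5 else 'check analysis'})")
```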
Computer-Aided Evaluation of Forage Management: Forage Manager.
ERIC Educational Resources Information Center
Panciera, M. T.; And Others
1993-01-01
Presents the Forage Manager spreadsheet, developed as a forage management teaching tool to integrate agronomic, livestock, and cost data to demonstrate the impact of forage management on livestock production costs. Teaching applications, examples involving agronomic data and conventional agronomic evaluation, and limitations of the program are…
ERIC Educational Resources Information Center
Curtis, Rick
This paper summarizes information about using computer hardware and software to aid in making purchase decisions that are based on user needs. The two major options in hardware are IBM-compatible machines and the Apple Macintosh line. The three basic software applications include word processing, database management, and spreadsheet applications.…
Lean Mean Times--Budgeting for School Media Technology.
ERIC Educational Resources Information Center
Johnson, Doug
1995-01-01
Discusses budgeting strategies for school media technology programs. Highlights include sources for school funding, school district budget information, control of the budget, how to write an effective budget, working with other community and school groups, local politics, and sidebars that discuss spreadsheets and maintenance budgets. (LRW)
NASA Technical Reports Server (NTRS)
Chambers, L. H.; Chaudhury, S.; Page, M. T.; Lankey, A. J.; Doughty, J.; Kern, Steven; Rogerson, Tina M.
2008-01-01
During the summer of 2007, as part of the second year of a NASA-funded project in partnership with Christopher Newport University called SPHERE (Students as Professionals Helping Educators Research the Earth), a group of undergraduate students spent 8 weeks in a research internship at or near NASA Langley Research Center. Three students from this group formed the Clouds group along with a NASA mentor (Chambers), and the brief addition of a local high school student fulfilling a mentorship requirement. The Clouds group was given the task of exploring and analyzing ground-based cloud observations obtained by K-12 students as part of the Students' Cloud Observations On-Line (S'COOL) Project, and the corresponding satellite data. This project began in 1997. The primary analysis tools developed for it were in FORTRAN, a computer language none of the students were familiar with. While they persevered through computer challenges and picky syntax, it eventually became obvious that this was not the most fruitful approach for a project aimed at motivating K-12 students to do their own data analysis. Thus, about halfway through the summer the group shifted its focus to more modern data analysis and visualization tools, namely spreadsheets and Google(tm) Earth. The result of their efforts, so far, is two different Excel spreadsheets and a Google(tm) Earth file. The spreadsheets are set up to allow participating classrooms to paste in a particular dataset of interest, using the standard S'COOL format, and easily perform a variety of analyses and comparisons of the ground cloud observation reports and their correspondence with the satellite data. This includes summarizing cloud occurrence and cloud cover statistics, and comparing cloud cover measurements from the two points of view. A visual classification tool is also provided to compare the cloud levels reported from the two viewpoints. This provides a statistical counterpart to the existing S'COOL data visualization tool, which is used for individual ground-to-satellite correspondences. The Google(tm) Earth file contains a set of placemarks and ground overlays to show participating students the area around their school that the satellite is measuring. This approach will be automated and made interactive by the S'COOL database expert and will also be used to help refine the latitude/longitude location of the participating schools. Once complete, these new data analysis tools will be posted on the S'COOL website for use by the project participants in schools around the US and the world.
Spreadsheet-Like Image Analysis
1992-08-01
1 " DTIC AD-A254 395 S LECTE D, ° AD-E402 350 Technical Report ARPAD-TR-92002 SPREADSHEET-LIKE IMAGE ANALYSIS Paul Willson August 1992 U.S. ARMY...August 1992 4. TITLE AND SUBTITLE 5. FUNDING NUMBERS SPREADSHEET-LIKE IMAGE ANALYSIS 6. AUTHOR(S) Paul Willson 7. PERFORMING ORGANIZATION NAME(S) AND...14. SUBJECT TERMS 15. NUMBER OF PAGES Image analysis , nondestructive inspection, spreadsheet, Macintosh software, 14 neural network, signal processing
CalPro: a spreadsheet program for the management of California mixed-conifer stands.
Jingjing Liang; Joseph Buongiorno; Robert A. Monserud
2004-01-01
CalPro is an add-in program developed to work with Microsoft Excel to simulate the growth and management of uneven-aged mixed-conifer stands in California. Its built-in growth model was calibrated from 177 uneven-aged plots on industry and other private lands. Stands are described by the number of trees per acre in each of nineteen 2-inch diameter classes in...
Rosetta: Ensuring the Preservation and Usability of ASCII-based Data into the Future
NASA Astrophysics Data System (ADS)
Ramamurthy, M. K.; Arms, S. C.
2015-12-01
Field data obtained from dataloggers often take the form of comma separated value (CSV) ASCII text files. While ASCII based data formats have positive aspects, such as the ease of accessing the data from disk and the wide variety of tools available for data analysis, there are some drawbacks, especially when viewing the situation through the lens of data interoperability and stewardship. The Unidata data translation tool, Rosetta, is a web-based service that provides an easy, wizard-based interface for data collectors to transform their datalogger generated ASCII output into Climate and Forecast (CF) compliant netCDF files following the CF-1.6 discrete sampling geometries. These files are complete with metadata describing what data are contained in the file, the instruments used to collect the data, and other critical information that otherwise may be lost in one of many README files. The choice of the machine readable netCDF data format and data model, coupled with the CF conventions, ensures long-term preservation and interoperability, and that future users will have enough information to responsibly use the data. However, with the understanding that the observational community appreciates the ease of use of ASCII files, methods for transforming the netCDF back into a CSV or spreadsheet format are also built-in. One benefit of translating ASCII data into a machine readable format that follows open community-driven standards is that they are instantly able to take advantage of data services provided by the many open-source data server tools, such as the THREDDS Data Server (TDS). While Rosetta is currently a stand-alone service, this talk will also highlight efforts to couple Rosetta with the TDS, thus allowing self-publishing of thoroughly documented datasets by the data producers themselves.
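The transformation Rosetta performs can be illustrated generically by converting a small datalogger table into a CF-style netCDF file with xarray; the variable names, attributes, and values below are placeholders, and this is not Rosetta's own code.

```python
import pandas as pd
import xarray as xr

# Hypothetical datalogger records (in practice these would be parsed from a CSV file).
df = pd.DataFrame({
    "timestamp": pd.date_range("2015-07-01", periods=4, freq="H"),
    "temp_c": [14.2, 15.1, 16.8, 18.0],
    "rh_pct": [82.0, 79.5, 73.0, 68.2],
})

ds = xr.Dataset(
    {
        "air_temperature": ("time", df["temp_c"].values,
                            {"units": "degC", "standard_name": "air_temperature"}),
        "relative_humidity": ("time", df["rh_pct"].values,
                              {"units": "percent", "standard_name": "relative_humidity"}),
    },
    coords={"time": df["timestamp"].values},
    attrs={
        "Conventions": "CF-1.6",
        "featureType": "timeSeries",   # CF discrete sampling geometry
        "title": "Example station record converted from a datalogger table",
    },
)
ds.to_netcdf("station_log.nc")

# Round-trip back to a spreadsheet-friendly table if needed.
xr.open_dataset("station_log.nc").to_dataframe().to_csv("station_log_roundtrip.csv")
```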
Ayling, Pete; Hill, Robert; Jassam, Nuthar; Kallner, Anders; Khatami, Zahra
2017-11-01
Background: A logical consequence of the introduction of robotics and high-capacity analysers has been consolidation into larger units. This requires new structures and quality systems to ensure that laboratories deliver consistent and comparable results. Methods: A spreadsheet program was designed to accommodate results from up to 12 different instruments/laboratories and to present IQC data, i.e. Levey-Jennings and Youden plots and comprehensive numerical tables of the performance of each item. Input of data was made possible by a 'data loader' by which IQC data from the individual instruments could be transferred to the spreadsheet program online. Results: A set of real data from laboratories is used to populate the data loader and the networking software program. Examples are presented from the analysis of variance components and from the Levey-Jennings and Youden plots. Conclusions: This report presents a software package that allows the simultaneous management and detailed monitoring of the performance of up to 12 different instruments/laboratories in a fully interactive mode. The system allows a quality manager of networked laboratories to have a continuously updated overview of performance. This software package has been made available on the ACB website.
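The core of a Levey-Jennings display is the mean and SD control limits for each instrument; the short sketch below flags IQC results outside ±2 SD and ±3 SD (the data values are invented, and this is not the published workbook).

```python
import statistics

def levey_jennings_flags(values, target_mean=None, target_sd=None):
    """Flag IQC results against mean +/- 2 SD (warning) and +/- 3 SD (reject) limits."""
    mean = target_mean if target_mean is not None else statistics.mean(values)
    sd = target_sd if target_sd is not None else statistics.stdev(values)
    flags = []
    for x in values:
        z = (x - mean) / sd
        flags.append("reject" if abs(z) > 3 else "warning" if abs(z) > 2 else "ok")
    return mean, sd, flags

# Hypothetical daily QC results for one analyte on one instrument (mmol/L).
qc = [5.02, 4.98, 5.10, 4.95, 5.25, 5.01, 4.70, 5.05]
mean, sd, flags = levey_jennings_flags(qc)
for day, (x, f) in enumerate(zip(qc, flags), start=1):
    print(f"day {day}: {x:.2f}  ({f})")
print(f"mean = {mean:.2f}, SD = {sd:.2f}")
```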
Beyond [lambda][subscript max] Part 2: Predicting Molecular Color
ERIC Educational Resources Information Center
Williams, Darren L.; Flaherty, Thomas J.; Alnasleh, Bassam K.
2009-01-01
A concise roadmap for using computational chemistry programs (i.e., Gaussian 03W) to predict the color of a molecular species is presented. A color-predicting spreadsheet is available with the online material that uses transition wavelengths and peak-shape parameters to predict the visible absorbance spectrum, transmittance spectrum, chromaticity…
ERIC Educational Resources Information Center
Williams, Jeffery R.; Smith, Craig M.; Roe, Josh D.; Leatherman, John C.; Wilson, Robert M.
2012-01-01
"Watershed Manager" is a spreadsheet-based model that is used in extension education programs for learning about and selecting cost-effective watershed management practices to reduce soil, nitrogen, and phosphorus losses from cropland. It can facilitate Watershed Restoration and Protection Strategy (WRAPS) stakeholder groups' development…
1990-05-01
Table-of-contents fragments from this report list obtaining thermistor operating characteristics, an Ag+/Cl- thermometric titration, an experiment program for thermometric titrations, and the appearance of the spreadsheet in the analysis mode; applications described include rate experiments, carbon dioxide exhalation monitoring, stream turbidity measurement, photosynthesis monitoring, pendulum timing, and thermometric titrations.
Simple Numerical Analysis of Longboard Speedometer Data
ERIC Educational Resources Information Center
Hare, Jonathan
2013-01-01
Simple numerical data analysis is described, using a standard spreadsheet program, to determine distance, velocity (speed) and acceleration from voltage data generated by a skateboard/longboard speedometer (Hare 2012 "Phys. Educ." 47 409-17). This simple analysis is an introduction to data processing including scaling data as well as…
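The spreadsheet processing described, converting a column of speed samples into distance and acceleration, reduces to cumulative sums and finite differences; the sketch below assumes already-scaled speed values at a fixed sampling interval (the numbers are illustrative, not the article's data).

```python
# Speed samples (m/s) at a fixed interval, already scaled from the speedometer voltage.
dt = 0.5  # seconds between samples (assumed)
speed = [0.0, 0.8, 1.9, 3.1, 4.0, 4.6, 4.9, 5.0, 4.8, 4.5]

# Distance by the trapezoidal rule (running integral of speed).
distance = [0.0]
for i in range(1, len(speed)):
    distance.append(distance[-1] + 0.5 * (speed[i] + speed[i - 1]) * dt)

# Acceleration by central differences (one-sided at the ends).
accel = []
for i in range(len(speed)):
    if i == 0:
        accel.append((speed[1] - speed[0]) / dt)
    elif i == len(speed) - 1:
        accel.append((speed[-1] - speed[-2]) / dt)
    else:
        accel.append((speed[i + 1] - speed[i - 1]) / (2 * dt))

for i, (v, s, a) in enumerate(zip(speed, distance, accel)):
    print(f"t={i * dt:4.1f} s  v={v:4.1f} m/s  s={s:5.2f} m  a={a:+5.2f} m/s^2")
```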
ERIC Educational Resources Information Center
Heys, Chris
2008-01-01
Excel, Microsoft's spreadsheet program, offers several tools which have proven useful in solving some optimization problems that arise in operations research. We will look at two such tools, the Excel modules called Solver and Goal Seek--this after deriving an equation, called the "cash accumulation equation", to be used in conjunction with them.
Use of Computer-Based Case Studies in a Problem-Solving Curriculum.
ERIC Educational Resources Information Center
Haworth, Ian S.; And Others
1997-01-01
Describes the use of three case studies, on computer, to enhance problem solving and critical thinking among doctoral pharmacy students in a physical chemistry course. Students are expected to use specific computer programs, spreadsheets, electronic mail, molecular graphics, word processing, online literature searching, and other computer-based…
Handheld Computers in Education. Research Brief
ERIC Educational Resources Information Center
Education Partnerships, Inc., 2003
2003-01-01
For over the last 20 years, educators have been trying to find the best practice in using technology for student learning. Some of the most widely used applications with computers have been student learning of programming, word processing, Web research, spreadsheets, games, and Web design. The difficulty with integrating many of these activities…
Solving Rational Expectations Models Using Excel
ERIC Educational Resources Information Center
Strulik, Holger
2004-01-01
Simple problems of discrete-time optimal control can be solved using a standard spreadsheet software. The employed-solution method of backward iteration is intuitively understandable, does not require any programming skills, and is easy to implement so that it is suitable for classroom exercises with rational-expectations models. The author…
New Campus Crime Prevention Resources Available
ERIC Educational Resources Information Center
Campus Law Enforcement Journal, 2012
2012-01-01
The Campus Crime Prevention Committee has compiled a list of university and college crime prevention agencies and resources, which includes contact information, links to agency crime prevention web pages, and a list of resources they offer (i.e., brochures, guides, PowerPoint programs, videos, etc.) as well as a spreadsheet showing organizations…
Use of microcomputers for planning and managing silviculture habitat relationships.
B.G. Marcot; R.S. McNay; R.E. Page
1988-01-01
Microcomputers aid in monitoring, modeling, and decision support for integrating objectives of silviculture and wildlife habitat management. Spreadsheets, data bases, statistics, and graphics programs are described for use in monitoring. Stand growth models, modeling languages, area and geobased information systems, and optimization models are discussed for use in...
Spreadsheet-based engine data analysis tool - user's guide.
DOT National Transportation Integrated Search
2016-07-01
This record refers to both the spreadsheet tool - Fleet Equipment Performance Measurement Preventive Maintenance Model: Spreadsheet-Based Engine Data Analysis Tool, http://ntl.bts.gov/lib/60000/60000/60007/0-6626-P1_Final.xlsm - and its accompanying ...
Modeling Steady-State Groundwater Flow Using Microcomputer Spreadsheets.
ERIC Educational Resources Information Center
Ousey, John Russell, Jr.
1986-01-01
Describes how microcomputer spreadsheets are easily adapted for use in groundwater modeling. Presents spreadsheet set-ups and the results of five groundwater models. Suggests that this approach can provide a basis for demonstrations, laboratory exercises, and student projects. (ML)
Using Spreadsheets to Produce Acid-Base Titration Curves.
ERIC Educational Resources Information Center
Cawley, Martin James; Parkinson, John
1995-01-01
Describes two spreadsheets for producing acid-base titration curves, one uses relatively simple cell formulae that can be written into the spreadsheet by inexperienced students and the second uses more complex formulae that are best written by the teacher. (JRH)
In this spreadsheet, user(s) provide their company’s manufacturer code, user contact information for EV-CIS, and user roles. This spreadsheet is used for the Company Authorizing Official (CAO), CROMERR Signer, and EV-CIS Submitters.
NASA Astrophysics Data System (ADS)
Soderstrom, Ken; Alalawi, Ali
KLFromRecordingDays allows measurement of Kullback-Leibler (KL) distances between 2D probability distributions of vocal acoustic features. Greater KL distance measures reflect increased phonological divergence across the vocalizations compared. The software has been used to compare *.wav file recordings made by Sound Analysis Recorder 2011 of songbird vocalizations pre- and post-drug and surgical manipulations. Recordings from individual animals in *.wav format are first organized into subdirectories by recording day and then segmented into individual syllables uttered and acoustic features of these syllables using Sound Analysis Pro 2011 (SAP). KLFromRecordingDays uses syllable acoustic feature data output by SAP to a MySQL table to generate and compare "template" (typically pre-treatment) and "target" (typically post-treatment) probability distributions. These distributions are a series of virtual 2D plots of the duration of each syllable (as x-axis) to each of 13 other acoustic features measured by SAP for that syllable (as y-axes). Differences between "template" and "target" probability distributions for each acoustic feature are determined by calculating KL distance, a measure of divergence of the target 2D distribution pattern from that of the template. KL distances and the mean KL distance across all acoustic features are calculated for each recording day and output to an Excel spreadsheet. Resulting data for individual subjects may then be pooled across treatment groups and graphically summarized and used for statistical comparisons. Because SAP-generated MySQL files are accessed directly, data limits associated with spreadsheet output are avoided, and the totality of vocal output over weeks may be objectively analyzed all at once. The software has been useful for measuring drug effects on songbird vocalizations and assessing recovery from damage to regions of vocal motor cortex. It may be useful in studies employing other species, and as part of speech therapies tracking progress in producing distinct speech sounds in isolation.
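The central computation, a KL distance between template and target 2D distributions of (syllable duration, acoustic feature), can be sketched with NumPy histograms and SciPy's entropy function; the binning, smoothing constant, and feature values here are placeholders rather than the software's exact settings.

```python
import numpy as np
from scipy.stats import entropy

def kl_2d(template_xy, target_xy, bins=30, eps=1e-9):
    """KL distance between 2D feature distributions (duration vs. one acoustic feature).

    Both inputs are (n, 2) arrays; bin edges are taken from the template so the
    two histograms are directly comparable."""
    h_template, xedges, yedges = np.histogram2d(template_xy[:, 0], template_xy[:, 1], bins=bins)
    h_target, _, _ = np.histogram2d(target_xy[:, 0], target_xy[:, 1], bins=[xedges, yedges])
    p = (h_target + eps).ravel()
    q = (h_template + eps).ravel()
    return entropy(p / p.sum(), q / q.sum())  # divergence of target from template

rng = np.random.default_rng(1)
# Hypothetical syllables: column 0 = duration (ms), column 1 = mean pitch (Hz).
pre = np.column_stack([rng.normal(90, 15, 2000), rng.normal(620, 40, 2000)])
post = np.column_stack([rng.normal(110, 25, 2000), rng.normal(600, 60, 2000)])

print(f"KL distance (post vs. pre): {kl_2d(pre, post):.3f}")
```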
Marcot, Bruce G.; Jorgenson, M. Torre; DeGange, Anthony R.
2014-01-01
5. A Canon® Rebel 3Ti with a Sigma zoom lens (18–200 mm focal length). The Drift® HD-170 and GoPro® Hero3 cameras were secured to the struts and underwing for nadir (direct downward) imaging. The Panasonic® and Canon® cameras were each hand-held for oblique-angle landscape images, shooting through the airplanes’ windows, targeting both general landscape conditions as well as landscape features of special interest, such as tundra fire scars and landslips. The Drift® and GoPro® cameras each were set for time-lapse photography at 5-second intervals for overlapping coverage. Photographs from all cameras (100 percent .jpg format) were date- and time-synchronized to geographic positioning system waypoints taken during the flights, also at 5-second intervals, providing precise geotagging (latitude-longitude) of all files. All photographs were adjusted for color saturation and gamma, and nadir photographs were corrected for lens distortion for the Drift® and GoPro® cameras’ 170° wide-angle distortion. EXIF (exchangeable image file format) data on camera settings and geotagging were extracted into spreadsheet databases. An additional 1 hour, 20 minutes, and 43 seconds of high-resolution videos were recorded at 60 frames per second with the GoPro® camera along selected transect segments, and also were image-adjusted and corrected for lens distortion. Geotagged locations of 12,395 nadir photographs from the Drift® and GoPro® cameras were overlayed in a geographic information system (ArcMap 10.0) onto a map of 44 ecotypes (land- and water-cover types) of the Arctic Network study area. Presence and area of each ecotype occurring within a geographic information system window centered on the location of each photograph were recorded and included in the spreadsheet databases. All original and adjusted photographs, videos, geographic positioning system flight tracks, and photograph databases are available by contacting ascweb@usgs.gov.
Jingjing Liang; Joseph Buongiorno; Robert A. Monserud
2006-01-01
WestProPlus is an add-in program developed to work with Microsoft Excel to simulate the growth and management of all-aged Douglas-firâwestern hemlock (Pseudotsuga menziesii (Mirb.) FrancoâTsuga heterophylla (Raf.) Sarg.) stands in Oregon and Washington. Its built-in growth model was calibrated from 2,706 permanent plots in the...
Preparation of School District Budgets with Microcomputer Electronic Spreadsheets.
ERIC Educational Resources Information Center
Hinitz, Herman J.
1996-01-01
Preparing a microcomputer electronic spreadsheet containing all relevant school district budgetary information is possible with currently available hardware and software (such as Lotus 1-2-3), despite random-access-memory limitations. Spreadsheets can provide financial summaries, inventory-control listings, scheduling alternatives,…
A Spreadsheet in the Mathematics Classroom.
ERIC Educational Resources Information Center
Watkins, Will; Taylor, Monty
1989-01-01
Demonstrates how spreadsheets can be used to implement linear system solving algorithms in college mathematics classes. Lotus 1-2-3 is described, a linear system of equations is illustrated using spreadsheets, and the interplay between applications, computations, and theory is discussed. (four references) (LRW)
The Iodine-Clock Reaction--A Spreadsheet Simulation To Test.
ERIC Educational Resources Information Center
Swain, P. A.
1997-01-01
Describes a spreadsheet activity for the iodine-clock reaction which follows the concentrations of all reactions and products for 200 seconds and gives the induction period. Explains that, although there are limitations to the spreadsheet, it is nevertheless illuminating. (Author/ASK)
A Java-based tool for creating KML files from GPS waypoints
NASA Astrophysics Data System (ADS)
Kinnicutt, P. G.; Rivard, C.; Rimer, S.
2008-12-01
Google Earth provides a free tool with powerful capabilities for visualizing geoscience images and data. Commercial software tools exist for doing sophisticated digitizing and spatial modeling , but for the purposes of presentation, visualization and overlaying aerial images with data Google Earth provides much of the functionality. Likewise, with current technologies in GPS (Global Positioning System) systems and with Google Earth Plus, it is possible to upload GPS waypoints, tracks and routes directly into Google Earth for visualization. However, older technology GPS units and even low-cost GPS units found today may lack the necessary communications interface to a computer (e.g. no Bluetooth, no WiFi, no USB, no Serial, etc.) or may have an incompatible interface, such as a Serial port but no USB adapter available. In such cases, any waypoints, tracks and routes saved in the GPS unit or recorded in a field notebook must be manually transferred to a computer for use in a GIS system or other program. This presentation describes a Java-based tool developed by the author which enables users to enter GPS coordinates in a user-friendly manner, then save these coordinates in a Keyhole MarkUp Language (KML) file format, for visualization in Google Earth. This tool either accepts user-interactive input or accepts input from a CSV (Comma Separated Value) file, which can be generated from any spreadsheet program. This tool accepts input in the form of lat/long or UTM (Universal Transverse Mercator) coordinates. This presentation describes this system's applicability through several small case studies. This free and lightweight tool simplifies the task of manually inputting GPS data into Google Earth for people working in the field without an automated mechanism for uploading the data; for instance, the user may not have internet connectivity or may not have the proper hardware or software. Since it is a Java application and not a web- based tool, it can be installed on one's field laptop and the GPS data can be manually entered without the need for internet connectivity. This tool provides a table view of the GPS data, but lacks a KML viewer to view the data overlain on top of an aerial view, as this viewer functionality is provided in Google Earth. The tool's primary contribution lies in its more convenient method for entering the GPS data manually when automated technologies are not available.
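The conversion itself, turning a table of waypoint names and latitude/longitude pairs into KML placemarks, is compact; the tool described is Java-based, so the Python sketch below is only an illustration of the output format, with made-up waypoints (note that KML orders coordinates as longitude,latitude,altitude).

```python
import csv
import io

KML_HEADER = ('<?xml version="1.0" encoding="UTF-8"?>\n'
              '<kml xmlns="http://www.opengis.net/kml/2.2">\n<Document>\n')
KML_FOOTER = "</Document>\n</kml>\n"

def waypoints_to_kml(rows):
    """Build a KML string from rows of (name, latitude, longitude) in decimal degrees."""
    placemarks = []
    for name, lat, lon in rows:
        placemarks.append(
            f"  <Placemark>\n    <name>{name}</name>\n"
            f"    <Point><coordinates>{lon},{lat},0</coordinates></Point>\n"
            "  </Placemark>\n"
        )
    return KML_HEADER + "".join(placemarks) + KML_FOOTER

# Hypothetical CSV export from a field notebook or GPS unit.
csv_text = "name,lat,lon\nOutcrop A,44.2531,-72.5754\nStream gauge,44.2602,-72.5811\n"
reader = csv.DictReader(io.StringIO(csv_text))
rows = [(r["name"], float(r["lat"]), float(r["lon"])) for r in reader]

with open("waypoints.kml", "w", encoding="utf-8") as f:
    f.write(waypoints_to_kml(rows))
print("wrote waypoints.kml")
```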
Documentation of a spreadsheet for time-series analysis and drawdown estimation
Halford, Keith J.
2006-01-01
Drawdowns during aquifer tests can be obscured by barometric pressure changes, earth tides, regional pumping, and recharge events in the water-level record. These stresses can create water-level fluctuations that should be removed from observed water levels prior to estimating drawdowns. Simple models have been developed for estimating unpumped water levels during aquifer tests that are referred to as synthetic water levels. These models sum multiple time series such as barometric pressure, tidal potential, and background water levels to simulate non-pumping water levels. The amplitude and phase of each time series are adjusted so that synthetic water levels match measured water levels during periods unaffected by an aquifer test. Differences between synthetic and measured water levels are minimized with a sum-of-squares objective function. Root-mean-square errors during fitting and prediction periods were compared multiple times at four geographically diverse sites. Prediction error equaled fitting error when fitting periods were greater than or equal to four times prediction periods. The proposed drawdown estimation approach has been implemented in a spreadsheet application. Measured time series are independent so that collection frequencies can differ and sampling times can be asynchronous. Time series can be viewed selectively and magnified easily. Fitting and prediction periods can be defined graphically or entered directly. Synthetic water levels for each observation well are created with earth tides, measured time series, moving averages of time series, and differences between measured and moving averages of time series. Selected series and fitting parameters for synthetic water levels are stored and drawdowns are estimated for prediction periods. Drawdowns can be viewed independently and adjusted visually if an anomaly skews initial drawdowns away from 0. The number of observations in a drawdown time series can be reduced by averaging across user-defined periods. Raw or reduced drawdown estimates can be copied from the spreadsheet application or written to tab-delimited ASCII files.
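The fitting step, adjusting the contribution of each explanatory series so their sum matches measured water levels during unpumped periods, is a least-squares problem; a stripped-down sketch using only amplitude coefficients and synthetic input series is shown below (it is not the spreadsheet's actual solver setup, which also adjusts phase).

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
t = np.arange(n, dtype=float)                      # hourly time steps

# Explanatory series: barometric pressure, an earth-tide proxy, and a regional trend.
barometric = 1.0 * np.sin(2 * np.pi * t / 24.0) + 0.3 * rng.normal(size=n)
earth_tide = 0.5 * np.sin(2 * np.pi * t / 12.42)
trend = 0.001 * t

# "Measured" water level = linear combination of the series plus noise, with a step
# of pumping drawdown added after hour 300 (the prediction period).
true_coeffs = np.array([-0.30, 0.20, 1.0])
water_level = true_coeffs @ np.vstack([barometric, earth_tide, trend]) + 0.02 * rng.normal(size=n)
water_level[300:] -= 0.15                          # drawdown from an aquifer test

# Fit coefficients on the pre-pumping (fitting) period only, by ordinary least squares.
X = np.column_stack([barometric, earth_tide, trend])
coeffs, *_ = np.linalg.lstsq(X[:300], water_level[:300], rcond=None)

synthetic = X @ coeffs                             # synthetic (unpumped) water level
drawdown = synthetic - water_level                 # estimated drawdown
print(f"fitted coefficients: {np.round(coeffs, 3)}")
print(f"mean estimated drawdown after pumping starts: {drawdown[300:].mean():.3f} m")
```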
Integrating Critical Spreadsheet Competencies into the Accounting Curriculum
ERIC Educational Resources Information Center
Walters, L. Melissa; Pergola, Teresa M.
2012-01-01
The American Institute of Certified Public Accountants (AICPA) and the International Accounting Education Standards Board (IAESB) identify spreadsheet technology as a key information technology (IT) competency for accounting professionals. However requisite spreadsheet competencies are not specifically defined by the AICPA or IAESB nor are they…
Decision Analysis Using Spreadsheets.
ERIC Educational Resources Information Center
Sounderpandian, Jayavel
1989-01-01
Discussion of decision analysis and its importance in a business curriculum focuses on the use of spreadsheets instead of commercial software packages for computer assisted instruction. A hypothetical example is given of a company drilling for oil, and suggestions are provided for classroom exercises using spreadsheets. (seven references) (LRW)
ERIC Educational Resources Information Center
Batt, Russell H., Ed.
1988-01-01
Notes two uses of computer spreadsheets in the chemistry classroom. Discusses the general use of the spreadsheet to easily provide changing parameters of equations and then replotting the results on the screen. Presents a molecular orbital spreadsheet calculation of the LCAO-MO approach. Supplies representative printouts and graphs. (MVL)
Introduction to Financial Projection Models. Business Management Instructional Software.
ERIC Educational Resources Information Center
Pomeroy, Robert W., III
This guidebook and teacher's guide accompany a personal computer software program and introduce the key elements of financial projection modeling to project the financial statements of an industrial enterprise. The student will then build a model on an electronic spreadsheet. The guidebook teaches the purpose of a financial model and the steps…
ERIC Educational Resources Information Center
Computing Teacher, 1985
1985-01-01
Defines computer literacy and describes a computer literacy course which stresses ethics, hardware, and disk operating systems throughout. Core units on keyboarding, word processing, graphics, database management, problem solving, algorithmic thinking, and programing are outlined, together with additional units on spreadsheets, simulations,…
Using Spreadsheet Modeling to Teach Exchange Curves (Optimal Policy Curves) in Inventory Management
ERIC Educational Resources Information Center
Strakos, Joshua K.
2016-01-01
Inventory management is widely researched and the topic is taught in business programs across the spectrum of operations and supply chain management. However, the concepts are notoriously difficult for students to practice once they finish school and become managers responsible for inventory control. This article explains the structure and details…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-12
... use available information technology (for example: Spreadsheet programs, i.e., Microsoft Excel, web...) 231- 3221, or by email at [email protected] . SUPPLEMENTARY INFORMATION: I. Background A. Pre... part 1218, subpart I, titled ``Federal Coal Advance Royalty.'' B. The EPAct On August 8, 2005, the...
Serving Grades Over the Internet.
ERIC Educational Resources Information Center
Harris, James K.
This paper demonstrates a grade server that allows college students to access their grades over the Internet from the instructor's home page. Using a CGI (common gateway interface) program written in Visual Basic, the grades are read directly from an Excel spreadsheet and presented to the requester after he/she enters a password. The grade for…
How to Teach Programming Indirectly--Using Spreadsheet Application
ERIC Educational Resources Information Center
Tahy, Zsuzsanna Szalayné
2016-01-01
It is a question in many countries whether ICT and application usage should be taught. There are some problems with IT literacy: users do not understand the concepts of a software, they cannot solve problems, and moreover, using applications gives them more problems. Consequently, using ICT seems to slow work down. Experts suggest learning…
40 CFR 80.1164 - What are the attest engagement requirements under the RFS program?
Code of Federal Regulations, 2010 CFR
2010-07-01
... volumes, contained in the inventory reconciliation analysis under § 80.133, and verify that the volumes reported to EPA agree with the volumes in the inventory reconciliation analysis. (iv) Compute and report as... reported to EPA. (v) Obtain the database, spreadsheet, or other documentation for all RINs used for...
40 CFR 80.1164 - What are the attest engagement requirements under the RFS program?
Code of Federal Regulations, 2011 CFR
2011-07-01
... volumes, contained in the inventory reconciliation analysis under § 80.133, and verify that the volumes reported to EPA agree with the volumes in the inventory reconciliation analysis. (iv) Compute and report as... reported to EPA. (v) Obtain the database, spreadsheet, or other documentation for all RINs used for...
ARS-Media: A spreadsheet tool for calculating media recipes based on ion-specific constraints
USDA-ARS?s Scientific Manuscript database
ARS-Media is an ion solution calculator that uses Microsoft Excel to generate recipes of salts for complex ion mixtures specified by the user. Generating salt combinations (recipes) that result in pre-specified target ion values is a linear programming problem. Thus, the recipes are generated using ...
Fourment, Mathieu; Gibbs, Mark J
2008-02-05
Viruses of the Bunyaviridae have segmented negative-stranded RNA genomes and several of them cause significant disease. Many partial sequences have been obtained from the segments so that GenBank searches give complex results. Sequence databases usually use HTML pages to mediate remote sorting, but this approach can be limiting and may discourage a user from exploring a database. The VirusBanker database contains Bunyaviridae sequences and alignments and is presented as two spreadsheets generated by a Java program that interacts with a MySQL database on a server. Sequences are displayed in rows and may be sorted using information that is displayed in columns and includes data relating to the segment, gene, protein, species, strain, sequence length, terminal sequence and date and country of isolation. Bunyaviridae sequences and alignments may be downloaded from the second spreadsheet with titles defined by the user from the columns, or viewed when passed directly to the sequence editor, Jalview. VirusBanker allows large datasets of aligned nucleotide and protein sequences from the Bunyaviridae to be compiled and winnowed rapidly using criteria that are formulated heuristically.
Abdominal surgery process modeling framework for simulation using spreadsheets.
Boshkoska, Biljana Mileva; Damij, Talib; Jelenc, Franc; Damij, Nadja
2015-08-01
We provide a continuation of the existing Activity Table Modeling methodology with a modular spreadsheet simulation. The simulation model developed comprises 28 modeling elements for the abdominal surgery cycle process. The simulation of a two-week patient flow in an abdominal clinic with 75 beds demonstrates the applicability of the methodology. The simulation does not include macros, so programming experience is not essential for replicating or upgrading the model. Unlike the existing methods, the proposed solution employs a modular approach for modeling the activities, which ensures better readability, the possibility of easily upgrading the model with other activities, and easy extension and connection with other similar models. We propose a first-in-first-served approach for simulating the servicing of multiple patients. The uncertain time durations of the activities are modeled using the function "rand()". The patients' movements from one activity to the next are tracked with nested "if()" functions, allowing easy re-creation of the process without the need for complex programming. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
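A minimal first-in-first-served sketch of the patient-flow idea in Python rather than spreadsheet formulas; the activity names and duration ranges below are invented for illustration, with random.uniform standing in for the spreadsheet's "rand()" and the queueing logic replacing the nested "if()" functions.

    import random

    # (activity name, minimum minutes, maximum minutes) -- hypothetical values
    ACTIVITIES = [("admission", 10, 20), ("surgery", 60, 180), ("recovery", 30, 90)]

    def simulate(num_patients, arrival_gap=15):
        """First-in-first-served flow: each activity serves one patient at a time."""
        free_at = {name: 0.0 for name, _, _ in ACTIVITIES}   # when each activity next becomes free
        finish_times = []
        for p in range(num_patients):
            t = p * arrival_gap                               # arrival time of patient p
            for name, lo, hi in ACTIVITIES:
                start = max(t, free_at[name])                 # wait if the activity is still busy
                t = start + random.uniform(lo, hi)            # uncertain duration, like rand()
                free_at[name] = t
            finish_times.append(t)
        return finish_times

    print(simulate(5))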
Mass-balance measurements in Alaska and suggestions for simplified observation programs
Trabant, D.C.; March, R.S.
1999-01-01
US Geological Survey glacier fieldwork in Alaska includes repetitious measurements, corrections for leaning or bending stakes, an ability to reliably measure seasonal snow as deep as 10 m, absolute identification of summer surfaces in the accumulation area, and annual evaluation of internal accumulation, internal ablation, and glacier-thickness changes. Prescribed field measurement and note-taking techniques help eliminate field errors and expedite the interpretative process. In the office, field notes are transferred to computerized spreadsheets for analysis, release on the World Wide Web, and archival storage. The spreadsheets have error traps to help eliminate note-taking and transcription errors. Rigorous error analysis ends when mass-balance measurements are extrapolated and integrated with area to determine glacier and basin mass balances. Unassessable errors in the glacier and basin mass-balance data reduce the value of the data set for correlations with climate change indices. The minimum glacier mass-balance program has at least three measurement sites on a glacier and the measurements must include the seasonal components of mass balance as well as the annual balance.
Teaching with Spreadsheets: An Example from Heat Transfer.
ERIC Educational Resources Information Center
Drago, Peter
1993-01-01
Provides an activity which measures the heat transfer through an insulated cylindrical tank, allowing the student to gain a better knowledge of both the physics involved and the working of spreadsheets. Provides both a spreadsheet solution and a maximum-minimum method of solution for the problem. (MVL)
Facilitating Analysis of Multiple Partial Data Streams
NASA Technical Reports Server (NTRS)
Maimone, Mark W.; Liebersbach, Robert R.
2008-01-01
Robotic Operations Automation: Mechanisms, Imaging, Navigation report Generation (ROAMING) is a set of computer programs that facilitates and accelerates both tactical and strategic analysis of time-sampled data, especially the disparate and often incomplete streams of Mars Exploration Rover (MER) telemetry data described in the immediately preceding article. As used here, tactical refers to the activities over a relatively short time (one Martian day in the original MER application) and strategic refers to a longer time (the entire multi-year MER missions in the original application). Prior to installation, ROAMING must be configured with the types of data of interest, and parsers must be modified to understand the format of the input data (many example parsers are provided, including for general CSV files). Thereafter, new data from multiple disparate sources are automatically resampled into a single common annotated spreadsheet stored in a readable space-separated format, and these data can be processed or plotted at any time scale. Such processing or plotting makes it possible to study not only the details of a particular activity spanning only a few seconds, but also longer-term trends. ROAMING makes it possible to generate mission-wide plots of multiple engineering quantities [e.g., vehicle tilt as in Figure 1(a), motor current, numbers of images] that heretofore could be found only in thousands of separate files. ROAMING also supports automatic annotation of both images and graphs. In the MER application, labels given to terrain features by rover scientists and engineers are automatically plotted in all received images based on their associated camera models (see Figure 2), times measured in seconds are mapped to Mars local time, and command names or arbitrary time-labeled events can be used to label engineering plots, as in Figure 1(b).
Parkhurst, David L.; Kipp, Kenneth L.; Charlton, Scott R.
2010-01-01
The computer program PHAST (PHREEQC And HST3D) simulates multicomponent, reactive solute transport in three-dimensional saturated groundwater flow systems. PHAST is a versatile groundwater flow and solute-transport simulator with capabilities to model a wide range of equilibrium and kinetic geochemical reactions. The flow and transport calculations are based on a modified version of HST3D that is restricted to constant fluid density and constant temperature. The geochemical reactions are simulated with the geochemical model PHREEQC, which is embedded in PHAST. Major enhancements in PHAST Version 2 allow spatial data to be defined in a combination of map and grid coordinate systems, independent of a specific model grid (without node-by-node input). At run time, aquifer properties are interpolated from the spatial data to the model grid; regridding requires only redefinition of the grid without modification of the spatial data. PHAST is applicable to the study of natural and contaminated groundwater systems at a variety of scales ranging from laboratory experiments to local and regional field scales. PHAST can be used in studies of migration of nutrients, inorganic and organic contaminants, and radionuclides; in projects such as aquifer storage and recovery or engineered remediation; and in investigations of the natural rock/water interactions in aquifers. PHAST is not appropriate for unsaturated-zone flow, multiphase flow, or density-dependent flow. A variety of boundary conditions are available in PHAST to simulate flow and transport, including specified-head, flux (specified-flux), and leaky (head-dependent) conditions, as well as the special cases of rivers, drains, and wells. Chemical reactions in PHAST include (1) homogeneous equilibria using an ion-association or Pitzer specific interaction thermodynamic model; (2) heterogeneous equilibria between the aqueous solution and minerals, ion exchange sites, surface complexation sites, solid solutions, and gases; and (3) kinetic reactions with rates that are a function of solution composition. The aqueous model (elements, chemical reactions, and equilibrium constants), minerals, exchangers, surfaces, gases, kinetic reactants, and rate expressions may be defined or modified by the user. A number of options are available to save results of simulations to output files. The data may be saved in three formats: a format suitable for viewing with a text editor; a format suitable for exporting to spreadsheets and postprocessing programs; and in Hierarchical Data Format (HDF), which is a compressed binary format. Data in the HDF file can be visualized on Windows computers with the program Model Viewer and extracted with the utility program PHASTHDF; both programs are distributed with PHAST.
Marot, Marci E.; Adams, C. Scott; Richwine, Kathryn A.; Smith, Christopher G.; Osterman, Lisa E.; Bernier, Julie C.
2014-01-01
Scientists from the U.S. Geological Survey, St. Petersburg Coastal and Marine Science Center conducted a time-series collection of shallow sediment cores from the back-barrier environments along the Chandeleur Islands, Louisiana from March 2012 through July 2013. The sampling efforts were part of a larger USGS study to evaluate effects on the geomorphology of the Chandeleur Islands following the construction of an artificial sand berm to reduce oil transport onto federally managed lands. The objective of this study was to evaluate the response of the back-barrier tidal and wetland environments to the berm. This report serves as an archive for sedimentological, radiochemical, and microbiological data derived from the sediment cores. Data are available for a time-series of four sampling periods: March 2012; July 2012; September 2012; and July 2013. Downloadable data are available as Excel spreadsheets and as JPEG files. Additional files include: ArcGIS shapefiles of the sampling sites, detailed results of sediment grain size analyses, and formal Federal Geographic Data Committee metadata.
The sedimentological characteristics and geochronology of the marshes of Dauphin Island, Alabama
Ellis, Alisha M.; Smith, Christopher G.; Marot, Marci E.
2018-03-22
In August 2015, scientists from the U.S. Geological Survey, St. Petersburg Coastal and Marine Science Center collected 11 push cores from the marshes of Dauphin Island and Little Dauphin Island, Alabama. Sample site environments included high marshes, low salt marshes, and salt flats, and varied in distance from the shoreline. The sampling efforts were part of a larger study to assess the feasibility and sustainability of proposed restoration efforts for Dauphin Island, Alabama, and to identify trends in shoreline erosion and accretion. The data presented in this publication can provide a basis for assessing organic and inorganic sediment accumulation rates and temporal changes in accumulation rates over multiple decades at multiple locations across the island. This study was funded by the National Fish and Wildlife Foundation, via the Gulf Environmental Benefit Fund. This report serves as an archive for the sedimentological and geochemical data derived from the marsh cores. Downloadable data are available and include Microsoft Excel spreadsheets (.xlsx), comma-separated values (.csv) text files, JPEG files, and formal Federal Geographic Data Committee metadata in a U.S. Geological Survey data release.
Teaching Raster GIS Operations with Spreadsheets.
ERIC Educational Resources Information Center
Raubal, Martin; Gaupmann, Bernhard; Kuhn, Werner
1997-01-01
Defines raster technology in its relationship to geographic information systems and notes that it is typically used with the application of remote sensing techniques and scanning devices. Discusses the role of spreadsheets in a raster model, and describes a general approach based on spreadsheets. Includes six computer-generated illustrations. (MJP)
Spreadsheet Design: An Optimal Checklist for Accountants
ERIC Educational Resources Information Center
Barnes, Jeffrey N.; Tufte, David; Christensen, David
2009-01-01
Just as good grammar, punctuation, style, and content organization are important to well-written documents, basic fundamentals of spreadsheet design are essential to clear communication. In fact, the very principles of good writing should be integrated into spreadsheet workpaper design and organization. The unique contributions of this paper are…
Computer Applications: Using Electronic Spreadsheets.
ERIC Educational Resources Information Center
Riley, Connee; And Others
This instructional unit is intended to assist teachers in helping students learn to use electronic spreadsheets. The 11 learning activities included, all of which are designed for use in conjunction with Multiplan Spreadsheet Software, are arranged in order of increasing difficulty. An effort has been made to include problems applicable to each of…
Manipulative and Numerical Spreadsheet Templates for the Study of Discrete Structures.
ERIC Educational Resources Information Center
Abramovich, Sergei
1998-01-01
Argues that basic components of discrete mathematics can be introduced to students through gradual elaboration of experiences with iconic spreadsheet-based simulations of concrete materials. Suggests that the study of homogeneous and heterogeneous patterns of manipulative spreadsheet templates allows for appreciation of the development of…
Excel Spreadsheets for Algebra: Improving Mental Modeling for Problem Solving
ERIC Educational Resources Information Center
Engerman, Jason; Rusek, Matthew; Clariana, Roy
2014-01-01
This experiment investigates the effectiveness of Excel spreadsheets in a high school algebra class. Students in the experimental group convincingly outperformed the control group on a post-lesson assessment. The student responses and teacher observations involving the Excel spreadsheet revealed that it operated as a mindtool, which formed the users'…
ERIC Educational Resources Information Center
Barreto, Humberto
2015-01-01
This article is not the usual Excel pedagogy fare in that it does not provide an application or example taught via a spreadsheet. Instead, it briefly reviews the history of spreadsheets in the economics classroom and explores the current environment, with an emphasis on modern learning theory. The conclusion is not surprising: spreadsheets improve…
The Growing Problems with Spreadsheet Budgeting
ERIC Educational Resources Information Center
Solomon, Jeff; Johnson, Stella; Wilcox, Leon; Olson, Tom
2010-01-01
The ubiquitous spreadsheet in some version has been the sole and unrivaled instrument of financial management for decades. And it has served well. The spreadsheet provides the flexibility to design a unique business process. It allows users to create formulas that execute complex calculations, and it is available in the globally standardized Excel…
Levels of Student Responses in a Spreadsheet-Based Environment
ERIC Educational Resources Information Center
Tabach, Michal; Friedlander, Alex
2004-01-01
The purpose of this report is to investigate the range of student responses in three domains--hypothesizing, organizing data, and algebraic generalization of patterns during their work on a spreadsheet-based activity. In a wider context, we attempted to investigate students' utilization schemes of spreadsheets in their learning of introductory…
User's guide: RPGrow$: a red pine growth and analysis spreadsheet for the Lake States.
Carol A. Hyldahl; Gerald H. Grossman
1993-01-01
Describes RPGrow$, a stand-level, interactive spreadsheet for projecting growth and yield and estimating financial returns of red pine plantations in the Lake States. This spreadsheet is based on published growth models for red pine. Financial analyses are based on discounted cash flow methods.
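The financial side rests on standard discounted-cash-flow arithmetic; a small, generic illustration (not the RPGrow$ spreadsheet itself, and with purely illustrative numbers) of discounting a stream of yearly cash flows:

    def net_present_value(rate, cash_flows):
        """Discount yearly cash flows (year 0 first) back to the present."""
        return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

    # Planting cost now, thinning revenue in year 25, final harvest in year 60 (made-up values).
    flows = [-500] + [0] * 24 + [300] + [0] * 34 + [4000]
    print(round(net_present_value(0.04, flows), 2))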
Spreadsheets and Bulgarian Goats
ERIC Educational Resources Information Center
Sugden, Steve
2012-01-01
We consider a problem appearing in an Australian Mathematics Challenge in 2003. This article considers whether a spreadsheet might be used to model this problem, thus allowing students to explore its structure within the spreadsheet environment. It then goes on to reflect on some general principles of problem decomposition when the final goal is a…
Lens ray diagrams with a spreadsheet
NASA Astrophysics Data System (ADS)
González, Manuel I.
2018-05-01
Physicists customarily create spreadsheets to carry out numerical calculations and to display their results in a meaningful, nice-looking way. Spreadsheets can also be used to display a vivid geometrical model of a physical system. This statement is illustrated with an example taken from geometrical optics: images formed by a thin lens. A careful mixture of standard Excel functions allows a realistic, automated ray diagram to be displayed. The suggested spreadsheet is intended as an auxiliary didactic tool for instructors who wish to teach their students to create their own ray diagrams.
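The geometry behind such a diagram reduces to the thin-lens equation 1/f = 1/d_o + 1/d_i; a short sketch, independent of the article's Excel implementation, that computes the image distance and lateral magnification:

    def thin_lens_image(f, d_o):
        """Return (image distance, magnification) for a thin lens with focal length f
        and object distance d_o (positive distances mean real object/image)."""
        if d_o == f:
            return float("inf"), float("inf")     # object at the focal point: image at infinity
        d_i = 1.0 / (1.0 / f - 1.0 / d_o)         # rearranged from 1/f = 1/d_o + 1/d_i
        m = -d_i / d_o                            # negative magnification = inverted image
        return d_i, m

    print(thin_lens_image(f=10.0, d_o=30.0))      # object beyond 2f: reduced, inverted real image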
A computerized faculty time-management system in an academic family medicine department.
Daugird, Allen J; Arndt, Jane E; Olson, P Richard
2003-02-01
The authors describe the development, implementation, and evaluation of a computerized faculty time-management system (FTMS) in the Department of Family Medicine at the University of North Carolina-Chapel Hill. The FTMS is presented as an integrated set of computerized spreadsheets used annually to allocate faculty time across all mission activities of the department. It was first implemented in 1996 and has been continuously developed since then. An iterative approach has been used to gain consensus among faculty about time resources needed for various tasks of all missions of the department. These time-resource assumptions are used in the computerized system. Faculty time is allocated annually by the department vice chair in negotiation with individual faculty, making sure that the activities planned do not exceed the work time each faculty member has available for the year. During this process, faculty preferences are balanced against department aggregate needs to meet mission commitments and obligations. The authors describe how the computerized FTMS is used for faculty time management and career development, department planning, budget planning, clinical scheduling, and mission cost accounting. They also describe barriers and potential abuses and the challenge of building an organizational culture willing to discuss faculty time openly and committed to developing a system perceived as fair and accurate. The spreadsheet file is available free from the authors for use in other departments.
Zalkind, D; Malec, B
1988-01-01
A national survey of alumni of AUPHA programs from the classes of 1983, 1984, and 1985 was undertaken to assess their experiences in management information systems education, both formally and on the job. The survey covered 38 AUPHA graduate member programs and resulted in 1,181 responses. Over 40 percent of the alumni indicated that they had had an introductory management information systems (MIS) course in a health administration program. Since graduation, almost 90 percent have had some significant on-the-job involvement with computers, computer-generated information, or MIS. More than one-third of the respondents felt that their MIS course work did not adequately prepare them for what was expected on the job. Alumni stressed that microcomputer software applications, such as spreadsheets and data bases, are important areas for student hands-on experiences. When asked the importance of certain areas to be included in a required introductory MIS course, the alumni also recommended spreadsheet analysis and design, report writing and data presentation, and other management areas. Additional comments suggested more access to personal computers (PCs), more relevance in the curriculum to the "real world," and the importance of MIS to the career paths of alumni. Faculty suggestions from a 1984-85 survey are compared with alumni responses in order to identify curricular changes needed. Recommendations are outlined for consideration.
Lens Ray Diagrams with a Spreadsheet
ERIC Educational Resources Information Center
González, Manuel I.
2018-01-01
Physicists create spreadsheets customarily to carry out numerical calculations and to display their results in a meaningful, nice-looking way. Spreadsheets can also be used to display a vivid geometrical model of a physical system. This statement is illustrated with an example taken from geometrical optics: images formed by a thin lens. A careful…
Spreadsheet Modeling of Electron Distributions in Solids
ERIC Educational Resources Information Center
Glassy, Wingfield V.
2006-01-01
A series of spreadsheet modeling exercises constructed as part of a new upper-level elective course on solid state materials and surface chemistry is described. The spreadsheet exercises are developed to provide students with the opportunity to interact with the conceptual framework where the role of the density of states and the Fermi-Dirac…
Designing Spreadsheet-Based Tasks for Purposeful Algebra
ERIC Educational Resources Information Center
Ainley, Janet; Bills, Liz; Wilson, Kirsty
2005-01-01
We describe the design of a sequence of spreadsheet-based pedagogic tasks for the introduction of algebra in the early years of secondary schooling within the Purposeful Algebraic Activity project. This design combines two relatively novel features to bring a different perspective to research in the use of spreadsheets for the learning and…
Andrew C. Oishi; David Hawthorne; Ram Oren
2016-01-01
Estimating transpiration from woody plants using thermal dissipation sap flux sensors requires careful data processing. Currently, researchers accomplish this using spreadsheets, or by personally writing scripts for statistical software programs (e.g., R, SAS). We developed the Baseliner software to help establish a standardized protocol for processing sap...
Automating Partial Period Bond Valuation with Excel's Day Counting Functions
ERIC Educational Resources Information Center
Vicknair, David; Spruell, James
2009-01-01
An Excel model for calculating the actual price of bonds under a 30 day/month, 360 day/year day counting assumption by nesting the DAYS360 function within the PV function is developed. When programmed into an Excel spreadsheet, the model can accommodate annual and semiannual payment bonds sold on or between interest dates using six fundamental…
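A rough restatement of the two ingredients the model nests in Excel, a simplified US (NASD) 30/360 day count like DAYS360 and the present value of a level-coupon bond; the adjustment rules and numbers here are illustrative only and omit the February end-of-month handling.

    def days360_us(y1, m1, d1, y2, m2, d2):
        """Simplified US (NASD) 30/360 day count between two dates."""
        if d1 == 31:
            d1 = 30
        if d2 == 31 and d1 == 30:
            d2 = 30
        return (y2 - y1) * 360 + (m2 - m1) * 30 + (d2 - d1)

    def bond_price(face, annual_coupon_rate, yield_per_period, periods, payments_per_year=2):
        """Present value of a level-coupon bond on a coupon date."""
        pmt = face * annual_coupon_rate / payments_per_year
        discount = (1 + yield_per_period) ** -periods
        return pmt * (1 - discount) / yield_per_period + face * discount

    print(days360_us(2023, 1, 31, 2023, 7, 31))        # 180 days under 30/360
    print(round(bond_price(1000, 0.06, 0.025, 20), 2)) # 6% semiannual coupon, 2.5% yield per period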
ERIC Educational Resources Information Center
Knee, David; And Others
This booklet is the eighth in a series of nine from the Teacher Training Institute at Hofstra University (New York) and contains descriptive information about two courses included in the institute's program. The first course, by David Knee, William McKeough, and Robert Silverstone, is "Discrete Mathematical Models," which deals with…
LamLum : a tool for evaluating the financial feasibility of laminated lumber plants
E.M. (Ted) Bilek; John F. Hunt
2006-01-01
A spreadsheet-based computer program called LamLum was created to analyze the economics of value-added laminated lumber manufacturing facilities. Such facilities manufacture laminations, typically from lower grades of structural lumber, then glue these laminations together to make various types of higher value laminated lumber products. This report provides the...
A Method for Measuring Collection Expansion Rates and Shelf Space Capacities.
ERIC Educational Resources Information Center
Sapp, Gregg; Suttle, George
1994-01-01
Describes an effort to quantify annual collection expansion and shelf space capacities with a computer spreadsheet program. Methods used to quantify the space taken at the beginning of the project; to estimate annual rate of collection growth; and to plot stack space and usage, volume equivalents and usage, and growth capacity are covered.…
ERIC Educational Resources Information Center
Clarke, Matthew A.; Giraldo, Carlos
2009-01-01
Chemical process simulation is one of the most fundamental skills that is expected from chemical engineers, yet relatively few graduates have the opportunity to learn, in depth, how a process simulator works, from programming the unit operations to the sequencing. The University of Calgary offers a "hands-on" postgraduate course in…
MicroComputer Software for Predicting Growth of Southern Timber Stands
Robert M. Farrar
1992-01-01
Sixteen BASIC programs and 21 electronic spreadsheet templates for microcomputers are presented with documentation and examples of use. This software permits simulation of the growth and yield of natural stands of even-aged southern pines, uneven-aged loblolly-shortleaf and shortleaf pines, even-aged yellow-poplar, and of certain planted pine stands for a variety of site...
NASA Technical Reports Server (NTRS)
Fanourakis, Sofia
2015-01-01
My main project was to determine and implement updates to be made to MODEAR (Mission Operations Data Enterprise Architecture Repository) process definitions to be used for CST-100 (Crew Space Transportation-100) related missions. Emphasis was placed on the scheduling aspect of the processes. In addition, I was to complete other tasks as given. Some of the additional tasks were: to create pass-through command look-up tables for the flight controllers, finish one of the MDT (Mission Operations Directorate Display Tool) displays, gather data on what is included in the CST-100 public data, develop a VBA (Visual Basic for Applications) script to create a csv (Comma-Separated Values) file with specific information from spreadsheets containing command data, create a command script for the November MCC-ASIL (Mission Control Center-Avionics System Integration Laboratory) testing, and take notes for one of the TCVB (Terminal Configured Vehicle B-737) meetings. In order to make progress in my main project, I scheduled meetings with the appropriate subject matter experts, prepared material for the meetings, and assisted in the discussions in order to understand the process or processes at hand. After such discussions, I made updates to various MODEAR processes and process graphics. These meetings have resulted in significant updates to the processes that were discussed. In addition, the discussions have helped the departments responsible for these processes better understand the work ahead and provided material to help document how their products are created. I completed my other tasks utilizing resources available to me and, when necessary, consulting with the subject matter experts. Outputs resulting from my other tasks were: two completed and one partially completed pass-through command look-up tables for the flight controllers, significant updates to one of the MDT displays, a spreadsheet containing data on what is included in the CST-100 public data, a tool to create a csv file with specific information from spreadsheets containing command data, a command script for the November MCC-ASIL testing which resulted in a successful test day identifying several potential issues, and notes from one of the TCVB meetings that were used to keep the teams up to date on what was discussed and decided. I have learned a great deal working at NASA these last four months. I was able to meet and work with amazing individuals, further develop my technical knowledge, expand my knowledge base regarding human spaceflight, and contribute to the CST-100 missions. My work at NASA has strengthened my desire to continue my education in order to make further contributions to the field, and has given me the opportunity to see the advantages of a career at NASA.
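The CSV-extraction task mentioned above was done with a VBA script; a hypothetical Python equivalent (file name, sheet layout, and column indices all invented for illustration, using the third-party openpyxl package) might look like this:

    import csv
    from openpyxl import load_workbook   # third-party library, assumed available

    def extract_commands(xlsx_path, csv_path, columns=(0, 2, 5)):
        """Copy selected columns (0-based indices, hypothetical) from a command spreadsheet to CSV."""
        ws = load_workbook(xlsx_path, read_only=True).active
        with open(csv_path, "w", newline="") as out:
            writer = csv.writer(out)
            for row in ws.iter_rows(values_only=True):
                writer.writerow([row[i] if i < len(row) else "" for i in columns])

    extract_commands("command_data.xlsx", "commands.csv")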
A Spreadsheet for a 2 x 3 x 2 Log-Linear Analysis. AIR 1991 Annual Forum Paper.
ERIC Educational Resources Information Center
Saupe, Joe L.
This paper describes a personal computer spreadsheet set up to carry out hierarchical log-linear analyses, a type of analysis useful for institutional research into multidimensional frequency tables formed from categorical variables such as faculty rank, student class level, gender, or retention status. The spreadsheet provides a concrete vehicle…
Using Spreadsheets to Help Students Think Recursively
ERIC Educational Resources Information Center
Webber, Robert P.
2012-01-01
Spreadsheets lend themselves naturally to recursive computations, since a formula can be defined as a function of one or more preceding cells. A hypothesized closed form for the "n"th term of a recursive sequence can be tested easily by using a spreadsheet to compute a large number of the terms. Similarly, a conjecture about the limit of a series…
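The check described above, computing many terms of a recurrence and comparing them with a conjectured closed form, is just as easy outside a spreadsheet; the recurrence a_n = 2*a_(n-1) + 1 with a_0 = 1 and closed form 2^(n+1) - 1 below is an illustrative example, not one taken from the article.

    def recurrence_terms(n_terms):
        """a_0 = 1, a_n = 2*a_(n-1) + 1 -- each term depends on the preceding 'cell'."""
        terms = [1]
        for _ in range(1, n_terms):
            terms.append(2 * terms[-1] + 1)
        return terms

    closed_form = lambda n: 2 ** (n + 1) - 1
    terms = recurrence_terms(50)
    print(all(t == closed_form(n) for n, t in enumerate(terms)))   # True if the conjecture holds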
A Spreadsheet Tool for Learning the Multiple Regression F-Test, T-Tests, and Multicollinearity
ERIC Educational Resources Information Center
Martin, David
2008-01-01
This note presents a spreadsheet tool that allows teachers the opportunity to guide students towards answering on their own questions related to the multiple regression F-test, the t-tests, and multicollinearity. The note demonstrates approaches for using the spreadsheet that might be appropriate for three different levels of statistics classes,…
ERIC Educational Resources Information Center
Abramovich, Sergei
2016-01-01
The paper presents the use of spreadsheets integrated with digital tools capable of symbolic computations and graphic constructions in a master's level capstone course for secondary mathematics teachers. Such use of spreadsheets is congruent with the Type II technology applications framework aimed at the development of conceptual knowledge in the…
1988-09-01
scheduler's knowledge of available employees' experience levels. If the scheduler lacks first-hand knowledge of employee experience levels, then assistance ...to start a new system of rotating primary employees and asked that this capability be included in the program. Yet he had only a vague idea about...competitive with the DBASE prototype. The LOTUS 123 program was based around a spreadsheet that contained all the job, employee and schedule form data in a
NASA Astrophysics Data System (ADS)
Ariana, I. M.; Bagiada, I. M.
2018-01-01
Development of spreadsheet-based integrated transaction processing systems and financial reporting systems is intended to optimize the capabilities of spreadsheets in accounting data processing. The purposes of this study are: 1) to describe spreadsheet-based integrated transaction processing systems and financial reporting systems; and 2) to test their technical and operational feasibility. This is a research and development study. The main steps are: 1) needs analysis (needs assessment); 2) developing the spreadsheet-based integrated transaction processing systems and financial reporting systems; and 3) testing their feasibility. Technical feasibility covers the ability of the hardware and operating systems to run the accounting application, as well as the application's simplicity and ease of use. Operational feasibility covers the ability of users to operate the accounting application, the ability of the application to produce information, and the controls built into the application. The instrument used to assess technical and operational feasibility is an expert-perception questionnaire with a 4-point Likert scale, from 1 (strongly disagree) to 4 (strongly agree). Data were analyzed using percentage analysis, comparing the number of answers for each item with the ideal number of answers for that item. The spreadsheet-based integrated transaction processing systems and financial reporting systems integrate sales, purchases, and cash transaction processing to produce financial reports (statement of profit or loss and other comprehensive income, statement of changes in equity, statement of financial position, and statement of cash flows) and other reports. The systems are feasible from the technical aspect (87.50%) and the operational aspect (84.17%).
Integrating and analyzing medical and environmental data using ETL and Business Intelligence tools.
Villar, Alejandro; Zarrabeitia, María T; Fdez-Arroyabe, Pablo; Santurtún, Ana
2018-06-01
Processing data that originates from different sources (such as environmental and medical data) can prove to be a difficult task, due to the heterogeneity of variables, storage systems, and file formats that can be used. Moreover, once the amount of data reaches a certain threshold, conventional mining methods (based on spreadsheets or statistical software) become cumbersome or even impossible to apply. Data Extract, Transform, and Load (ETL) solutions provide a framework to normalize and integrate heterogeneous data into a local data store. Additionally, the application of Online Analytical Processing (OLAP), a set of Business Intelligence (BI) methodologies and practices for multidimensional data analysis, can be an invaluable tool for its examination and mining. In this article, we describe a solution based on an ETL + OLAP tandem used for the on-the-fly analysis of tens of millions of individual medical, meteorological, and air quality observations from 16 provinces in Spain provided by 20 different national and regional entities in a diverse array of file types and formats, with the intention of evaluating the effect of several environmental variables on human health in future studies. Our work shows how a sizable amount of data, spread across a wide range of file formats and structures, and originating from a number of different sources belonging to various business domains, can be integrated in a single system that researchers can use for global data analysis and mining.
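A toy sketch of the extract-transform-load step for heterogeneous files, using pandas with invented file names, column names, and delimiters; the published system uses a full ETL plus OLAP stack, so this only illustrates the normalization-and-integration idea.

    import pandas as pd   # third-party library, assumed available

    # Extract: each (hypothetical) source uses its own column names and delimiter.
    air = pd.read_csv("air_quality.csv", sep=";").rename(columns={"fecha": "date", "NO2": "no2"})
    hosp = pd.read_csv("admissions.csv").rename(columns={"admission_date": "date", "respiratory": "cases"})

    # Transform: a common date type; both frames are assumed to carry a 'province' column.
    for df in (air, hosp):
        df["date"] = pd.to_datetime(df["date"])

    merged = air.merge(hosp, on=["date", "province"], how="inner")

    # Load: one local store that later analysis (or an OLAP cube) can query.
    merged.to_csv("integrated_observations.csv", index=False)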
Integrating and analyzing medical and environmental data using ETL and Business Intelligence tools
NASA Astrophysics Data System (ADS)
Villar, Alejandro; Zarrabeitia, María T.; Fdez-Arroyabe, Pablo; Santurtún, Ana
2018-03-01
Processing data that originates from different sources (such as environmental and medical data) can prove to be a difficult task, due to the heterogeneity of variables, storage systems, and file formats that can be used. Moreover, once the amount of data reaches a certain threshold, conventional mining methods (based on spreadsheets or statistical software) become cumbersome or even impossible to apply. Data Extract, Transform, and Load (ETL) solutions provide a framework to normalize and integrate heterogeneous data into a local data store. Additionally, the application of Online Analytical Processing (OLAP), a set of Business Intelligence (BI) methodologies and practices for multidimensional data analysis, can be an invaluable tool for its examination and mining. In this article, we describe a solution based on an ETL + OLAP tandem used for the on-the-fly analysis of tens of millions of individual medical, meteorological, and air quality observations from 16 provinces in Spain provided by 20 different national and regional entities in a diverse array of file types and formats, with the intention of evaluating the effect of several environmental variables on human health in future studies. Our work shows how a sizable amount of data, spread across a wide range of file formats and structures, and originating from a number of different sources belonging to various business domains, can be integrated in a single system that researchers can use for global data analysis and mining.
User Interactive Software for Analysis of Human Physiological Data
NASA Technical Reports Server (NTRS)
Cowings, Patricia S.; Toscano, William; Taylor, Bruce C.; Acharya, Soumydipta
2006-01-01
Ambulatory physiological monitoring has been used to study human health and performance in space and in a variety of Earth-based environments (e.g., military aircraft, armored vehicles, small groups in isolation, and patients). Large, multi-channel data files are typically recorded in these environments, and these files often require the removal of contaminated data prior to processing and analyses. Physiological data processing can now be performed with user-friendly, interactive software developed by the Ames Psychophysiology Research Laboratory. This software, which runs on a Windows platform, contains various signal-processing routines for both time- and frequency-domain data analyses (e.g., peak detection, differentiation and integration, digital filtering, adaptive thresholds, Fast Fourier Transform power spectrum, auto-correlation, etc.). Data acquired with any ambulatory monitoring system that provides text or binary file format are easily imported to the processing software. The application provides a graphical user interface where one can manually select and correct data artifacts utilizing linear and zero interpolation and adding trigger points for missed peaks. Block and moving average routines are also provided for data reduction. Processed data in numeric and graphic format can be exported to Excel. This software, PostProc (for post-processing), requires the Dadisp engineering spreadsheet (DSP Development Corp), or equivalent, for implementation. Specific processing routines were written for electrocardiography, electroencephalography, electromyography, blood pressure, skin conductance level, impedance cardiography (cardiac output, stroke volume, thoracic fluid volume), temperature, and respiration.
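The block and moving-average data reduction mentioned above is generic enough to sketch; this trailing moving average (not the PostProc or Dadisp code) simply smooths a sampled signal.

    def moving_average(signal, window):
        """Trailing moving average; early samples are averaged over whatever is available."""
        out = []
        for i in range(len(signal)):
            start = max(0, i - window + 1)
            out.append(sum(signal[start:i + 1]) / (i + 1 - start))
        return out

    print(moving_average([72, 74, 71, 90, 88, 73], window=3))   # e.g., a short heart-rate series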
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simpkins, A.A.
1996-09-01
AXAOTHER XL is an Excel spreadsheet used to determine dose to the maximally exposed offsite individual during high-velocity straight winds or tornado conditions. Both individual and population doses may be considered. Potential exposure pathways are inhalation and plume shine. For high-velocity straight winds the spreadsheet can determine the downwind relative air concentration; for tornado conditions, however, the user must enter the relative air concentration. Theoretical models are discussed and hand calculations are performed to ensure proper application of methodologies. A section has also been included that contains user instructions for the spreadsheet.
Erickson, Collin B; Ankenman, Bruce E; Sanchez, Susan M
2018-06-01
This data article provides the summary data from tests comparing various Gaussian process software packages. Each spreadsheet represents a single function or type of function using a particular input sample size. In each spreadsheet, a row gives the results for a particular replication using a single package. Within each spreadsheet there are the results from eight Gaussian process model-fitting packages on five replicates of the surface. There is also one spreadsheet comparing the results from two packages performing stochastic kriging. These data enable comparisons between the packages to determine which package will give users the best results.
Odukoya, Jonathan A; Popoola, Segun I; Atayero, Aderemi A; Omole, David O; Badejo, Joke A; John, Temitope M; Olowo, Olalekan O
2018-04-01
In Nigerian universities, enrolment into any engineering undergraduate program requires that the minimum entry criteria established by the National Universities Commission (NUC) be satisfied. Candidates seeking admission to study an engineering discipline must have reached a predetermined entry age and met the cut-off marks set for the Senior School Certificate Examination (SSCE), the Unified Tertiary Matriculation Examination (UTME), and the post-UTME screening. However, limited effort has been made to show that these entry requirements eventually guarantee successful academic performance in engineering programs because the data required for such validation are not readily available. In this data article, a comprehensive dataset for empirical evaluation of entry requirements into engineering undergraduate programs in a Nigerian university is presented and carefully analyzed. A total sample of 1445 undergraduates who were admitted between 2005 and 2009 to study Chemical Engineering (CHE), Civil Engineering (CVE), Computer Engineering (CEN), Electrical and Electronics Engineering (EEE), Information and Communication Engineering (ICE), Mechanical Engineering (MEE), and Petroleum Engineering (PET) at Covenant University, Nigeria were randomly selected. Entry age, SSCE aggregate, UTME score, Covenant University Scholastic Aptitude Screening (CUSAS) score, and the Cumulative Grade Point Average (CGPA) of the undergraduates were obtained from the Student Records and Academic Affairs unit. In order to facilitate evidence-based evaluation, the robust dataset is made publicly available in a Microsoft Excel spreadsheet file. On a yearly basis, first-order descriptive statistics of the dataset are presented in tables. Box plot representations, frequency distribution plots, and scatter plots of the dataset are provided to enrich its value. Furthermore, correlation and linear regression analyses are performed to understand the relationship between the entry requirements and the corresponding academic performance in engineering programs. The data provided in this article will help Nigerian universities, the NUC, engineering regulatory bodies, and relevant stakeholders to objectively evaluate and subsequently improve the quality of engineering education in the country.
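A sketch of the kind of correlation and simple linear regression reported for the dataset, with an invented pairing of UTME entry scores and final CGPA; numpy's corrcoef and polyfit are enough for a first pass over the real spreadsheet columns.

    import numpy as np

    # Hypothetical paired observations: UTME entry score and final CGPA.
    utme = np.array([210, 245, 260, 199, 275, 230, 251])
    cgpa = np.array([2.8, 3.4, 3.9, 2.5, 4.3, 3.1, 3.6])

    r = np.corrcoef(utme, cgpa)[0, 1]               # Pearson correlation coefficient
    slope, intercept = np.polyfit(utme, cgpa, 1)    # least-squares line: CGPA = slope*UTME + intercept

    print(f"r = {r:.2f}, CGPA = {slope:.3f}*UTME + {intercept:.2f}")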
NASA Technical Reports Server (NTRS)
Gakin, R.; Lewis, K.; Simmons, J.; Gchachu, K.; Karner, J. M.; Newsom, H. E.; Jones, R. H.
2003-01-01
Determining the origin and chemical composition of suspect extraterrestrial specimens has led to meteorite identification research programs. Such programs, like the University of New Mexico-Southwestern Indian Polytechnic Institute partnership, are being inundated with many non-meteorites (meteor wrongs) sent in by interested individuals from all over the world. This meteorite identification program developed a spreadsheet that aids in identifying the types of minerals in a sample from physical properties, possible meteorite characteristics, mineral and rock properties, and possible man-made characteristics. Samples that show meteorite distinctiveness are further analyzed via the Scanning Electron Microprobe (SEM).
MEMS product engineering: methodology and tools
NASA Astrophysics Data System (ADS)
Ortloff, Dirk; Popp, Jens; Schmidt, Thilo; Hahn, Kai; Mielke, Matthias; Brück, Rainer
2011-03-01
The development of MEMS comprises the structural design as well as the definition of an appropriate manufacturing process. Technology constraints have a considerable impact on the device design and vice versa. Product design and technology development are therefore concurrent tasks. Based on a comprehensive methodology, the authors introduce a software environment that links commercial design tools from both areas into a common design flow. In this paper emphasis is put on automatic, low-threshold data acquisition. The intention is to collect and categorize development data for further developments with minimum overhead and minimum disturbance of established business processes. As a first step, software tools that automatically extract data from spreadsheets or file systems and put them in context with existing information are presented. The developments are currently carried out in a European research project.
Gravity data from the San Pedro River Basin, Cochise County, Arizona
Kennedy, Jeffrey R.; Winester, Daniel
2011-01-01
The U.S. Geological Survey, Arizona Water Science Center in cooperation with the National Oceanic and Atmospheric Administration, National Geodetic Survey has collected relative and absolute gravity data at 321 stations in the San Pedro River Basin of southeastern Arizona since 2000. Data are of three types: observed gravity values and associated free-air, simple Bouguer, and complete Bouguer anomaly values, useful for subsurface-density modeling; high-precision relative-gravity surveys repeated over time, useful for aquifer-storage-change monitoring; and absolute-gravity values, useful as base stations for relative-gravity surveys and for monitoring gravity change over time. The data are compiled, without interpretation, in three spreadsheet files. Gravity values, GPS locations, and driving directions for absolute-gravity base stations are presented as National Geodetic Survey site descriptions.
Christopher P. Hansen; Mark A. Rumble; Joshua J. Millspaugh
2010-01-01
Monitoring ruffed grouse (Bonasa umbellus) in the Black Hills National Forest is a priority for forest managers due to the bird's status as the management indicator species for quaking aspen (Populus tremuloides) and its value to hunters and other recreational groups. We conducted drumming surveys, estimated occupancy, and assessed the influence of sampling and...
Fourment, Mathieu; Gibbs, Mark J
2008-01-01
Background Viruses of the Bunyaviridae have segmented negative-stranded RNA genomes and several of them cause significant disease. Many partial sequences have been obtained from the segments so that GenBank searches give complex results. Sequence databases usually use HTML pages to mediate remote sorting, but this approach can be limiting and may discourage a user from exploring a database. Results The VirusBanker database contains Bunyaviridae sequences and alignments and is presented as two spreadsheets generated by a Java program that interacts with a MySQL database on a server. Sequences are displayed in rows and may be sorted using information that is displayed in columns and includes data relating to the segment, gene, protein, species, strain, sequence length, terminal sequence and date and country of isolation. Bunyaviridae sequences and alignments may be downloaded from the second spreadsheet with titles defined by the user from the columns, or viewed when passed directly to the sequence editor, Jalview. Conclusion VirusBanker allows large datasets of aligned nucleotide and protein sequences from the Bunyaviridae to be compiled and winnowed rapidly using criteria that are formulated heuristically. PMID:18251994
DARPA Initiative in Concurrent Engineering (DICE). Phase 2
1990-07-31
• XS spreadsheet tool • Q-Calc spreadsheet tool • TAE+ outer wrapper for XS • Framemaker-based formal EDN (Electronic Design Notebook) • Data...shared global object space and object persistence. Technical Results: Module Development, XS Integration Environment. A prototype of the wrapper concepts...for a spreadsheet integration environment, using an X-Windows based extensible Lotus 1-2-3 emulation called XS, and was (initially) targeted for
Integrated developmental model of life-support capabilities in wheat
NASA Technical Reports Server (NTRS)
Darnell, R. L.; Obrien, C. O.
1994-01-01
The objective of this project was to develop a model for CO2, O2, H2O, and nitrogen use during the life cycle of wheat. Spreadsheets and accompanying graphs were developed to illustrate plant population reactions to environmental parameters established in the Controlled Ecological Life Support System (CELSS) program at Kennedy Space Center, Fl. The spreadsheets and graphs were produced using validated biomass production chamber (BPC) data from BWT931. Conditions of the BPC during the 83 day plant growth period were as follows: The BPC area is 27.8 m(exp 2), volume is 113 m(exp 3). Temperatures during the 83 day plant growth period ranged from 16.3 to 24.8 C during the light cycle (except for day 69, when the minimum and maximum temperatures were 7.7 C and 7.9 C, respectively) and 14.5 C and 23.6 C during the dark cycle (except for day 49, when the minimum and maximum temperatures were 11.1 C and 11.3 C, respectively). Relative humidity was 85 percent for the first seven days of plant growth, and 70 percent thereafter. The plant leaf canopy area was 10 m(exp 2). Presented is a list and explanation of each spreadsheet and accompanying graph(s), conditions under which the data were collected, and formulas used to obtain each result.
PC-SEAPAK - ANALYSIS OF COASTAL ZONE COLOR SCANNER AND ADVANCED VERY HIGH RESOLUTION RADIOMETER DATA
NASA Technical Reports Server (NTRS)
Mcclain, C. R.
1994-01-01
PC-SEAPAK is a user-interactive satellite data analysis software package specifically developed for oceanographic research. The program is used to process and interpret data obtained from the Nimbus-7/Coastal Zone Color Scanner (CZCS), and the NOAA Advanced Very High Resolution Radiometer (AVHRR). PC-SEAPAK is a set of independent microcomputer-based image analysis programs that provide the user with a flexible, user-friendly, standardized interface, and facilitates relatively low-cost analysis of oceanographic satellite data. Version 4.0 includes 114 programs. PC-SEAPAK programs are organized into categories which include CZCS and AVHRR level-1 ingest, level-2 analyses, statistical analyses, data extraction, remapping to standard projections, graphics manipulation, image board memory manipulation, hardcopy output support and general utilities. Most programs allow user interaction through menu and command modes and also by the use of a mouse. Most programs also provide for ASCII file generation for further analysis in spreadsheets, graphics packages, etc. The CZCS scanning radiometer aboard the NIMBUS-7 satellite was designed to measure the concentration of photosynthetic pigments and their degradation products in the ocean. AVHRR data is used to compute sea surface temperatures and is supported for the NOAA 6, 7, 8, 9, 10, 11, and 12 satellites. The CZCS operated from November 1978 to June 1986. CZCS data may be obtained free of charge from the CZCS archive at NASA/Goddard Space Flight Center. AVHRR data may be purchased through NOAA's Satellite Data Service Division. Ordering information is included in the PC-SEAPAK documentation. Although PC-SEAPAK was developed on a COMPAQ Deskpro 386/20, it can be run on most 386-compatible computers with an AT bus, EGA controller, Intel 80387 coprocessor, and MS-DOS 3.3 or higher. A Matrox MVP-AT image board with appropriate monitor and cables is also required. Note that the authors have received some reports of incompatibilities between the MVP-AT image board and ZENITH computers. Also, the MVP-AT image board is not necessarily compatible with 486-based systems; users of 486-based systems should consult with Matrox about compatibility concerns. Other PC-SEAPAK requirements include a Microsoft mouse (serial version), 2Mb RAM, and 100Mb hard disk space. For data ingest and backup, 9-track tape, 8mm tape and optical disks are supported and recommended. PC-SEAPAK has been under development since 1988. Version 4.0 was updated in 1992, and is distributed without source code. It is available only as a set of 36 1.2Mb 5.25 inch IBM MS-DOS format diskettes. PC-SEAPAK is a copyrighted product with all copyright vested in the National Aeronautics and Space Administration. Phar Lap's DOS_Extender run-time version is integrated into several of the programs; therefore, the PC-SEAPAK programs may not be duplicated. Three of the distribution diskettes contain DOS_Extender files. One of the distribution diskettes contains Media Cybernetics' HALO88 font files, also licensed by NASA for dissemination but not duplication. IBM is a registered trademark of International Business Machines. MS-DOS is a registered trademark of Microsoft Corporation. HALO88 is a registered trademark of Media Cybernetics, but the product was discontinued in 1991.
Improvements to the Magnetics Information Consortium (MagIC) Paleo and Rock Magnetic Database
NASA Astrophysics Data System (ADS)
Jarboe, N.; Minnett, R.; Tauxe, L.; Koppers, A. A. P.; Constable, C.; Jonestrask, L.
2015-12-01
The Magnetics Information Consortium (MagIC) database (http://earthref.org/MagIC/) continues to improve the ease of data uploading and editing, the creation of complex searches, data visualization, and data downloads for the paleomagnetic, geomagnetic, and rock magnetic communities. Online data editing is now available and the need for proprietary spreadsheet software is therefore entirely negated. The data owner can change values in the database or delete entries through an HTML 5 web interface that resembles typical spreadsheets in behavior and use. Additive uploading now allows additions to data sets to be uploaded with a simple drag-and-drop interface. Searching the database has improved with the addition of more sophisticated search parameters and with the facility to use them in complex combinations. A comprehensive summary view of a search result has been added to speed data comprehension, while a raw data view is available if one desires to see all data columns as stored in the database. Data visualization plots (Arai, equal area, demagnetization, Zijderveld, etc.) are presented with the data when appropriate to aid the user in understanding the dataset. MagIC data associated with individual contributions or from online searches may be downloaded in the tab-delimited MagIC text file format for subsequent offline use and analysis. With input from the paleomagnetic, geomagnetic, and rock magnetic communities, the MagIC database will continue to improve as a data warehouse and resource.
A web-based relational database for monitoring and analyzing mosquito population dynamics.
Sucaet, Yves; Van Hemert, John; Tucker, Brad; Bartholomay, Lyric
2008-07-01
Mosquito population dynamics have been monitored on an annual basis in the state of Iowa since 1969. The primary goal of this project was to integrate light trap data from these efforts into a centralized back-end database and interactive website that is available through the internet at http://iowa-mosquito.ent.iastate.edu. For comparative purposes, all data were categorized according to the week of the year and normalized according to the number of traps running. Users can readily view current, weekly mosquito abundance compared with data from previous years. Additional interactive capabilities facilitate analyses of the data based on mosquito species, distribution, or a time frame of interest. All data can be viewed in graphical and tabular format and can be downloaded to a comma separated value (CSV) file for import into a spreadsheet or more specialized statistical software package. Having this long-term dataset in a centralized database/website is useful for informing mosquito and mosquito-borne disease control and for exploring the ecology of the species represented therein. In addition to mosquito population dynamics, this database is available as a standardized platform that could be modified and applied to a multitude of projects that involve repeated collection of observational data. The development and implementation of this tool provides capacity for the user to mine data from standard spreadsheets into a relational database and then view and query the data in an interactive website.
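The normalization described, grouping counts by week of year and dividing by the number of traps running, can be restated compactly; the record fields below are hypothetical.

    from collections import defaultdict

    def weekly_abundance(records):
        """records: iterable of (week_of_year, trap_id, count); returns mean count per trap by week."""
        totals = defaultdict(int)
        traps = defaultdict(set)
        for week, trap_id, count in records:
            totals[week] += count
            traps[week].add(trap_id)
        return {week: totals[week] / len(traps[week]) for week in totals}

    data = [(27, "A", 120), (27, "B", 80), (28, "A", 200)]
    print(weekly_abundance(data))   # {27: 100.0, 28: 200.0}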
Permanent-File-Validation Utility Computer Program
NASA Technical Reports Server (NTRS)
Derry, Stephen D.
1988-01-01
Errors in files detected and corrected during operation. Permanent File Validation (PFVAL) utility computer program provides CDC CYBER NOS sites with mechanism to verify integrity of permanent file base. Locates and identifies permanent file errors in Mass Storage Table (MST) and Track Reservation Table (TRT), in permanent file catalog entries (PFC's) in permit sectors, and in disk sector linkage. All detected errors written to listing file and system and job day files. Program operates by reading system tables, catalog track, permit sectors, and disk linkage bytes to validate expected and actual file linkages. Used extensively to identify and locate errors in permanent files and enable online correction, reducing computer-system downtime.
Bailey, Stephanie L.; Bono, Rose S.; Nash, Denis; Kimmel, April D.
2018-01-01
Background Spreadsheet software is increasingly used to implement systems science models informing health policy decisions, both in academia and in practice where technical capacity may be limited. However, spreadsheet models are prone to unintentional errors that may not always be identified using standard error-checking techniques. Our objective was to illustrate, through a methodologic case study analysis, the impact of unintentional errors on model projections by implementing parallel model versions. Methods We leveraged a real-world need to revise an existing spreadsheet model designed to inform HIV policy. We developed three parallel versions of a previously validated spreadsheet-based model; versions differed by the spreadsheet cell-referencing approach (named single cells; column/row references; named matrices). For each version, we implemented three model revisions (re-entry into care; guideline-concordant treatment initiation; immediate treatment initiation). After standard error-checking, we identified unintentional errors by comparing model output across the three versions. Concordant model output across all versions was considered error-free. We calculated the impact of unintentional errors as the percentage difference in model projections between model versions with and without unintentional errors, using +/-5% difference to define a material error. Results We identified 58 original and 4,331 propagated unintentional errors across all model versions and revisions. Over 40% (24/58) of original unintentional errors occurred in the column/row reference model version; most (23/24) were due to incorrect cell references. Overall, >20% of model spreadsheet cells had material unintentional errors. When examining error impact along the HIV care continuum, the percentage difference between versions with and without unintentional errors ranged from +3% to +16% (named single cells), +26% to +76% (column/row reference), and 0% (named matrices). Conclusions Standard error-checking techniques may not identify all errors in spreadsheet-based models. Comparing parallel model versions can aid in identifying unintentional errors and promoting reliable model projections, particularly when resources are limited. PMID:29570737
Bailey, Stephanie L; Bono, Rose S; Nash, Denis; Kimmel, April D
2018-01-01
Spreadsheet software is increasingly used to implement systems science models informing health policy decisions, both in academia and in practice where technical capacity may be limited. However, spreadsheet models are prone to unintentional errors that may not always be identified using standard error-checking techniques. Our objective was to illustrate, through a methodologic case study analysis, the impact of unintentional errors on model projections by implementing parallel model versions. We leveraged a real-world need to revise an existing spreadsheet model designed to inform HIV policy. We developed three parallel versions of a previously validated spreadsheet-based model; versions differed by the spreadsheet cell-referencing approach (named single cells; column/row references; named matrices). For each version, we implemented three model revisions (re-entry into care; guideline-concordant treatment initiation; immediate treatment initiation). After standard error-checking, we identified unintentional errors by comparing model output across the three versions. Concordant model output across all versions was considered error-free. We calculated the impact of unintentional errors as the percentage difference in model projections between model versions with and without unintentional errors, using +/-5% difference to define a material error. We identified 58 original and 4,331 propagated unintentional errors across all model versions and revisions. Over 40% (24/58) of original unintentional errors occurred in the column/row reference model version; most (23/24) were due to incorrect cell references. Overall, >20% of model spreadsheet cells had material unintentional errors. When examining error impact along the HIV care continuum, the percentage difference between versions with and without unintentional errors ranged from +3% to +16% (named single cells), +26% to +76% (column/row reference), and 0% (named matrices). Standard error-checking techniques may not identify all errors in spreadsheet-based models. Comparing parallel model versions can aid in identifying unintentional errors and promoting reliable model projections, particularly when resources are limited.
Reed, Shelby D; Li, Yanhong; Kamble, Shital; Polsky, Daniel; Graham, Felicia L; Bowers, Margaret T; Samsa, Gregory P; Paul, Sara; Schulman, Kevin A; Whellan, David J; Riegel, Barbara J
2012-01-01
Patient-centered health care interventions, such as heart failure disease management programs, are under increasing pressure to demonstrate good value. Variability in costing methods and assumptions in economic evaluations of such interventions limit the comparability of cost estimates across studies. Valid cost estimation is critical to conducting economic evaluations and for program budgeting and reimbursement negotiations. Using sound economic principles, we developed the Tools for Economic Analysis of Patient Management Interventions in Heart Failure (TEAM-HF) Costing Tool, a spreadsheet program that can be used by researchers and health care managers to systematically generate cost estimates for economic evaluations and to inform budgetary decisions. The tool guides users on data collection and cost assignment for associated personnel, facilities, equipment, supplies, patient incentives, miscellaneous items, and start-up activities. The tool generates estimates of total program costs, cost per patient, and cost per week and presents results using both standardized and customized unit costs for side-by-side comparisons. Results from pilot testing indicated that the tool was well-formatted, easy to use, and followed a logical order. Cost estimates of a 12-week exercise training program in patients with heart failure were generated with the costing tool and were found to be consistent with estimates published in a recent study. The TEAM-HF Costing Tool could prove to be a valuable resource for researchers and health care managers to generate comprehensive cost estimates of patient-centered interventions in heart failure or other conditions for conducting high-quality economic evaluations and making well-informed health care management decisions.
A primer for biomedical scientists on how to execute model II linear regression analysis.
Ludbrook, John
2012-04-01
1. There are two very different ways of executing linear regression analysis. One is Model I, when the x-values are fixed by the experimenter. The other is Model II, in which the x-values are free to vary and are subject to error. 2. I have received numerous complaints from biomedical scientists that they have great difficulty in executing Model II linear regression analysis. This may explain the results of a Google Scholar search, which showed that the authors of articles in journals of physiology, pharmacology and biochemistry rarely use Model II regression analysis. 3. I repeat my previous arguments in favour of using least products linear regression analysis for Model II regressions. I review three methods for executing ordinary least products (OLP) and weighted least products (WLP) regression analysis: (i) scientific calculator and/or computer spreadsheet; (ii) specific purpose computer programs; and (iii) general purpose computer programs. 4. Using a scientific calculator and/or computer spreadsheet, it is easy to obtain correct values for OLP slope and intercept, but the corresponding 95% confidence intervals (CI) are inaccurate. 5. Using specific purpose computer programs, the freeware computer program smatr gives the correct OLP regression coefficients and obtains 95% CI by bootstrapping. In addition, smatr can be used to compare the slopes of OLP lines. 6. When using general purpose computer programs, I recommend the commercial programs systat and Statistica for those who regularly undertake linear regression analysis and I give step-by-step instructions in the Supplementary Information as to how to use loss functions. © 2011 The Author. Clinical and Experimental Pharmacology and Physiology. © 2011 Blackwell Publishing Asia Pty Ltd.
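For readers following the calculator/spreadsheet route in point 4, the point estimates reduce to two formulas: the ordinary least products (geometric mean) slope is sign(r) multiplied by the ratio of the sample standard deviations, and the intercept follows from the means. Below is a minimal Python sketch of that calculation on illustrative data; the helper name is hypothetical, and the sketch does not reproduce the bootstrapped confidence intervals that smatr provides.

```python
import numpy as np

def olp_regression(x, y):
    """Ordinary least products (geometric mean) regression point estimates:
    slope = sign(r) * sd(y)/sd(x), intercept = mean(y) - slope*mean(x)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    r = np.corrcoef(x, y)[0, 1]                           # Pearson correlation
    slope = np.sign(r) * y.std(ddof=1) / x.std(ddof=1)    # sign(r) * sy/sx
    intercept = y.mean() - slope * x.mean()
    return slope, intercept

# Example: paired measurements from two instruments (both subject to error)
x = [1.0, 2.1, 2.9, 4.2, 5.1]
y = [1.2, 2.0, 3.3, 4.0, 5.4]
print(olp_regression(x, y))
```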
Reed, Shelby D.; Li, Yanhong; Kamble, Shital; Polsky, Daniel; Graham, Felicia L.; Bowers, Margaret T.; Samsa, Gregory P.; Paul, Sara; Schulman, Kevin A.; Whellan, David J.; Riegel, Barbara J.
2011-01-01
Background Patient-centered health care interventions, such as heart failure disease management programs, are under increasing pressure to demonstrate good value. Variability in costing methods and assumptions in economic evaluations of such interventions limit the comparability of cost estimates across studies. Valid cost estimation is critical to conducting economic evaluations and for program budgeting and reimbursement negotiations. Methods and Results Using sound economic principles, we developed the Tools for Economic Analysis of Patient Management Interventions in Heart Failure (TEAM-HF) Costing Tool, a spreadsheet program that can be used by researchers or health care managers to systematically generate cost estimates for economic evaluations and to inform budgetary decisions. The tool guides users on data collection and cost assignment for associated personnel, facilities, equipment, supplies, patient incentives, miscellaneous items, and start-up activities. The tool generates estimates of total program costs, cost per patient, and cost per week and presents results using both standardized and customized unit costs for side-by-side comparisons. Results from pilot testing indicated that the tool was well-formatted, easy to use, and followed a logical order. Cost estimates of a 12-week exercise training program in patients with heart failure were generated with the costing tool and were found to be consistent with estimates published in a recent study. Conclusions The TEAM-HF Costing Tool could prove to be a valuable resource for researchers and health care managers to generate comprehensive cost estimates of patient-centered interventions in heart failure or other conditions for conducting high-quality economic evaluations and making well-informed health care management decisions. PMID:22147884
Caine, Jonathan S.; Manning, Andrew H.; Verplanck, Philip L.; Bove, Dana J.; Kahn, Katherine Gurley; Ge, Shemin
2006-01-01
Integrated, multidisciplinary studies of the Handcart Gulch alpine watershed provide a unique opportunity to study and characterize the geology and hydrology of an alpine watershed along the Continental Divide. The study area arose out of the donation of four abandoned, deep mineral exploration boreholes to the U.S. Geological Survey for research purposes by Mineral Systems Inc. These holes were supplemented with nine additional shallow holes drilled by the U.S. Geological Survey along the Handcart Gulch trunk stream. All of the holes were converted into observation wells, and a variety of data and samples were measured and collected from each. This open-file report contains: (1) An overview of the research conducted to date in Handcart Gulch; (2) well location, construction, lithologic log, and water level data from the research boreholes; and (3) a brief synopsis of preliminary results. The primary purpose of this report is to provide a research overview as well as raw data from the boreholes. Interpretation of the data will be reported in future publications. The drill hole data were tabulated into a spreadsheet included with this digital open-file report.
Confirmatory factor analysis using Microsoft Excel.
Miles, Jeremy N V
2005-11-01
This article presents a method for using Microsoft (MS) Excel for confirmatory factor analysis (CFA). CFA is often seen as an impenetrable technique, and thus, when it is taught, there is frequently little explanation of the mechanisms or underlying calculations. The aim of this article is to demonstrate that this is not the case; it is relatively straightforward to produce a spreadsheet in MS Excel that can carry out simple CFA. It is possible, with few or no programming skills, to effectively program a CFA analysis and, thus, to gain insight into the workings of the procedure.
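As a rough sketch of what such a spreadsheet does under the hood, a one-factor CFA can be framed as minimizing the maximum-likelihood discrepancy between a sample covariance matrix and the model-implied matrix lambda*lambda' + diag(theta). The example below uses Python with an illustrative covariance matrix, and a numerical optimizer stands in for a spreadsheet solver; it is a sketch of the general technique, not the article's worksheet.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative sample covariance matrix for four observed indicators
S = np.array([[1.00, 0.42, 0.38, 0.35],
              [0.42, 1.00, 0.40, 0.33],
              [0.38, 0.40, 1.00, 0.36],
              [0.35, 0.33, 0.36, 1.00]])
p = S.shape[0]

def f_ml(params):
    """ML discrepancy F = ln|Sigma| + tr(S Sigma^-1) - ln|S| - p for a
    one-factor model with implied covariance Sigma = lam lam' + diag(theta)."""
    lam, theta = params[:p], params[p:]
    sigma = np.outer(lam, lam) + np.diag(theta)
    return (np.linalg.slogdet(sigma)[1] + np.trace(S @ np.linalg.inv(sigma))
            - np.linalg.slogdet(S)[1] - p)

start = np.concatenate([np.full(p, 0.6), np.full(p, 0.5)])   # loadings, uniquenesses
bounds = [(None, None)] * p + [(1e-4, None)] * p             # keep uniquenesses positive
res = minimize(f_ml, start, bounds=bounds)
print("loadings:", res.x[:p].round(3), "uniquenesses:", res.x[p:].round(3))
```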
ERIC Educational Resources Information Center
Pye, Cory C.; Mercer, Colin J.
2012-01-01
The symbolic algebra program Maple and the spreadsheet Microsoft Excel were used in an attempt to reproduce the Gaussian fits to a Slater-type orbital, required to construct the popular STO-NG basis sets. The successes and pitfalls encountered in such an approach are chronicled. (Contains 1 table and 3 figures.)
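The fitting task described here, approximating a Slater-type orbital by a sum of Gaussians, can be sketched with a generic least-squares fit. The snippet below is only an illustration under stated assumptions: it fits three primitives to exp(-r) on a radial grid by unweighted least squares, which is not the overlap-maximization procedure behind the published STO-3G parameters, and the starting guesses are arbitrary.

```python
import numpy as np
from scipy.optimize import curve_fit

r = np.linspace(0.01, 6.0, 400)
slater = np.exp(-r)                    # unnormalized 1s Slater function, zeta = 1

def three_gaussians(r, a1, a2, a3, c1, c2, c3):
    """Sum of three primitive Gaussians approximating the Slater function."""
    return (c1 * np.exp(-a1 * r**2) +
            c2 * np.exp(-a2 * r**2) +
            c3 * np.exp(-a3 * r**2))

p0 = [2.0, 0.6, 0.15, 0.4, 0.4, 0.2]   # arbitrary starting exponents and coefficients
popt, _ = curve_fit(three_gaussians, r, slater, p0=p0, maxfev=20000)
print("exponents:", popt[:3], "coefficients:", popt[3:])
```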
Closed Loop Analysis Meta-Language Program (CLAMP)
2012-05-01
formats of Spreadsheets, XML, MCPML, or something else should be the ( anthropometry or other) experts’ productivity in: 1) crafting data 2) applying...FORCE MATERIEL COMMAND UNITED STATES AIR FORCE NOTICE AND SIGNATURE PAGE Using Government drawings, specifications, or other data included in...formulated or supplied the drawings, specifications, or other data does not license the holder or any other person or corporation; or convey any rights or
Individualized Human CAD Models: Anthropmetric Morphing and Body Tissue Layering
2014-07-31
Part Flow Chart of the Interaction among VBA Macros, Excel® Spreadsheet, and SolidWorks Front View of the Male and Female Soldier CAD Model...yellow highlighting. The spreadsheet is linked to the CAD model by macros created with the Visual Basic for Application ( VBA ) editor in Microsoft Excel...basically three working parts to the anthropometric morphing that are all interconnected ( VBA macros, Excel spreadsheet, and SolidWorks). The flow
ERIC Educational Resources Information Center
Ge, Yingbin; Rittenhouse, Robert C.; Buchanan, Jacob C.; Livingston, Benjamin
2014-01-01
We have designed an exercise suitable for a lab or project in an undergraduate physical chemistry course that creates a Microsoft Excel spreadsheet to calculate the energy of the S₀ ground electronic state and the S₁ and T₁ excited states of H₂. The spreadsheet calculations circumvent the…
Automatic computer subprogram selection from application program libraries
NASA Technical Reports Server (NTRS)
Drozdowski, J. M.
1972-01-01
The program ALTLIB (ALTernate LIBrary) which allows a user access to an alternate subprogram library with a minimum effort is discussed. The ALTLIB program selects subprograms from an alternate library file and merges them with the user's program load file. Only subprograms that are called for (directly or indirectly) by the user's programs and that are available on the alternate library file will be selected. ALTLIB eliminates the need for elaborate control-card manipulations to add subprograms from a subprogram file. ALTLIB returns to the user his binary file and the selected subprograms in correct order for a call to the loader. The user supplies the alternate library file. Subprogram requests which are not satisfied from the alternate library file will be satisfied at load time from the system library.
NEMAR plotting computer program
NASA Technical Reports Server (NTRS)
Myler, T. R.
1981-01-01
A FORTRAN coded computer program which generates CalComp plots of trajectory parameters is examined. The trajectory parameters are calculated and placed on a data file by the Near Earth Mission Analysis Routine computer program. The plot program accesses the data file and generates the plots as defined by inputs to the plot program. Program theory, user instructions, output definitions, subroutine descriptions and detailed FORTRAN coding information are included. Although this plot program utilizes a random access data file, a data file of the same type and formatted in 102 numbers per record could be generated by any computer program and used by this plot program.
A generic model for evaluating payor net cost savings from a disease management program.
McKay, Niccie L
2006-01-01
Private and public payors increasingly are turning to disease management programs as a means of improving the quality of care provided and controlling expenditures for individuals with specific medical conditions. This article presents a generic model that can be adapted to evaluate payor net cost savings from a variety of types of disease management programs, with net cost savings taking into account both changes in expenditures resulting from the program and the costs of setting up and operating the program. The model specifies the required data, describes the data collection process, and shows how to calculate the net cost savings in a spreadsheet format. An accompanying hypothetical example illustrates how to use the model.
ELIPGRID-PC: A PC program for calculating hot spot probabilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davidson, J.R.
1994-10-01
ELIPGRID-PC, a new personal computer program, has been developed to provide easy access to Singer's 1972 ELIPGRID algorithm for hot-spot detection probabilities. Three features of the program are the ability to determine: (1) the grid size required for specified conditions, (2) the smallest hot spot that can be sampled with a given probability, and (3) the approximate grid size resulting from specified conditions and sampling cost. ELIPGRID-PC also provides probability of hit versus cost data for graphing with spreadsheets or graphics software. The program has been successfully tested using Singer's published ELIPGRID results. An apparent error in the original ELIPGRID code has been uncovered and an appropriate modification incorporated into the new program.
ERIC Educational Resources Information Center
Smith, Michael
1990-01-01
Presents several examples of the iteration method using computer spreadsheets. Examples included are simple iterative sequences and the solution of equations using the Newton-Raphson formula, linear interpolation, and interval bisection. (YP)
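The iterative schemes named here translate directly into a few lines of code, with each loop pass playing the role of one spreadsheet row. A minimal sketch with generic helper names and the illustrative equation x² = 2:

```python
def newton_raphson(f, dfdx, x0, tol=1e-10, max_iter=50):
    """Iterate x_{n+1} = x_n - f(x_n)/f'(x_n), the update a spreadsheet
    would fill down one row per iteration."""
    x = x0
    for _ in range(max_iter):
        x_new = x - f(x) / dfdx(x)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

def bisection(f, a, b, tol=1e-10):
    """Interval bisection: repeatedly halve [a, b] while keeping a sign change inside."""
    while b - a > tol:
        m = 0.5 * (a + b)
        if f(a) * f(m) <= 0:
            b = m
        else:
            a = m
    return 0.5 * (a + b)

# Example: solve x**2 = 2 by both methods
f = lambda x: x**2 - 2
print(newton_raphson(f, lambda x: 2 * x, 1.0), bisection(f, 1.0, 2.0))
```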
DOE Office of Scientific and Technical Information (OSTI.GOV)
McKone, T.E.; Enoch, K.G.
2002-08-01
CalTOX has been developed as a set of spreadsheet models and spreadsheet data sets to assist in assessing human exposures from continuous releases to multiple environmental media, i.e. air, soil, and water. It has also been used for waste classification and for setting soil clean-up levels at uncontrolled hazardous waste sites. The modeling components of CalTOX include a multimedia transport and transformation model, multi-pathway exposure scenario models, and add-ins to quantify and evaluate uncertainty and variability. All parameter values used as inputs to CalTOX are distributions, described in terms of mean values and a coefficient of variation, rather than as point estimates or plausible upper values such as most other models employ. This probabilistic approach allows both sensitivity and uncertainty analyses to be directly incorporated into the model operation. This manual provides CalTOX users with a brief overview of the CalTOX spreadsheet model and provides instructions for using the spreadsheet to make deterministic and probabilistic calculations of source-dose-risk relationships.
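The mean/coefficient-of-variation convention described here can be illustrated with a short Monte Carlo sketch. Assuming a lognormal input distribution (a common choice for environmental parameters, though not stated above), the mean and CV determine the underlying normal parameters. The parameter name and numbers below are hypothetical and are not taken from CalTOX.

```python
import numpy as np

rng = np.random.default_rng(1)

def lognormal_from_mean_cv(mean, cv, size):
    """Convert an arithmetic mean and coefficient of variation into the
    parameters of the underlying normal distribution and draw samples."""
    sigma2 = np.log(1.0 + cv**2)          # variance of ln(X)
    mu = np.log(mean) - 0.5 * sigma2      # mean of ln(X)
    return rng.lognormal(mean=mu, sigma=np.sqrt(sigma2), size=size)

# Hypothetical input: a transfer factor with mean 2e-3 and CV 0.8
samples = lognormal_from_mean_cv(2e-3, 0.8, 10_000)
print(samples.mean(), samples.std() / samples.mean())   # recovers roughly the mean and CV
```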
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-05
...] Guidance for Industry on Documenting Statistical Analysis Programs and Data Files; Availability AGENCY... Programs and Data Files.'' This guidance is provided to inform study statisticians of recommendations for documenting statistical analyses and data files submitted to the Center for Veterinary Medicine (CVM) for the...
Medical imaging informatics based solutions for human performance analytics
NASA Astrophysics Data System (ADS)
Verma, Sneha; McNitt-Gray, Jill; Liu, Brent J.
2018-03-01
For human performance analysis, extensive experimental trials are often conducted to identify the underlying causes or long-term consequences of certain pathologies and to improve motor function by examining the movement patterns of affected individuals. Data collected for human performance analysis include high-speed video, surveys, spreadsheets, force recordings from instrumented surfaces, etc. These datasets are recorded by various standalone sources and are therefore captured in different folder structures and in varying formats depending on the hardware configuration. Data integration and synchronization therefore present a substantial challenge when handling these multimedia datasets, especially for large datasets. Another challenge faced by researchers is querying large quantities of unstructured data and designing feedback and reporting tools for users who need to work with the datasets at various levels. In the past, database server storage solutions have been introduced to securely store these datasets; however, automating the upload of raw files requires various file-manipulation steps. In the current workflow, this file manipulation and structuring is done manually, which is not feasible for large amounts of data. Attaching metadata files and data dictionaries to these raw datasets can provide the information and structure needed for automated server upload. We introduce one such system for metadata creation for unstructured multimedia data, based on the DICOM data model design. We discuss the design and implementation of this system and evaluate it with a dataset collected for a movement analysis study. The broader aim of this paper is to present a solution space, based on medical imaging informatics design and methods, for improving the workflow of human performance analysis in a biomechanics research laboratory.
Fortran graphics routines for the Macintosh
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shore, B.W.
1992-06-01
The Language Systems MPW Fortran is a popular Fortran compiler for the Macintosh. Unfortunately, it does not have any built-in calls to graphics routines (such as are available with Graflib on the NLTSS), so there is no simple way to make x-y plots from calls within Fortran. Instead, a file of data must be created and a commercial plotting routine (such as IGOR or KALEIDAGRAPH) or a spreadsheet with graphics (such as WINGZ) must be applied to post-process the data. The Macintosh does have available many built-in calls (to the Macintosh Toolbox) that allow drawing shapes and lines with QuickDraw, but these are not designed for plotting functions and are difficult to learn to use. This work outlines some Fortran routines that can be called from LS Fortran to make the necessary calls to the Macintosh Toolbox to create simple two-dimensional plots or contour plots. The source code DEMOGRAF.F shows how these routines may be used. DEMOGRAF.F simply demonstrates some Fortran subroutines that can be called with Language Systems MPW Fortran on the Macintosh to plot arrays of numbers. The subroutines essentially mimic the functionality that has been available under LTSS, NLTSS, and UNICOS at LLNL. The graphics primitives are kept in four separate files, each containing several subroutines. The subroutines are compiled and stored in a library file, LIBgraf.o. A Makefile is used to link this library to the source code. A discussion is included on requirements for interactive plotting of functions.
CALCULATIONAL TOOL FOR SKIN CONTAMINATION DOSE ESTIMATE
DOE Office of Scientific and Technical Information (OSTI.GOV)
HILL, R.L.
2005-03-31
A spreadsheet calculational tool was developed to automate the calculations performed for estimating dose from skin contamination. This document reports on the design and testing of the spreadsheet calculational tool.
NASA Technical Reports Server (NTRS)
Lee, C. H.
1978-01-01
The CELFE computer program and user's manual, together with the execution of the CELFE/NASTRAN system, are described. The execution procedure and the transfer of data between the CELFE and NASTRAN programs are controlled through the use of DATA files in the Univac 1100 system. Five data files are used to control the runstream and data transfer, and three files are used to hold the programs. These files are contained on a single tape. Changes in NASTRAN routines required by the present analysis are also discussed in this report. All the program listings, except the last two files (where the absolute and relocatable elements are stored), are included in the appendixes.
Successful Municipal Separate Storm Sewer System Programs Implemented in the Navy - NESDI #494
2014-06-01
account. Lastly, upon speaking with numerous stormwater personnel who use a spreadsheet software for data tracking, they recommended that staying well...existing data sources, gathering and maintaining the data needed, and completing and reviewing this collection of information. Send comments regarding...an organized manner. In the long-term, a comprehensive electronic methodology is recommended to keep data organized, be more efficient and to keep
ERIC Educational Resources Information Center
Sims, Paul A.
2012-01-01
A brief history of the development of the empirical equation that is used by prominent, Internet-based programs to estimate (or calculate) the extinction coefficients of proteins is presented. In addition, an overview of a series of related assignments designed to help students understand the origin of the empirical equation is provided. The…
Research and Development for Robotic Transportable Waste to Energy System (TWES)
2012-01-01
Engineers, April 2003. NFESC UG-2039-ENV, Qualified Recycling Program (QRP) Guide; July 2000 (NOTAL) Paisley, M.A., Anson, D., “ Biomass Gasification ...Full Load Biomass Simulation .............................19 Figure 9. Spreadsheet-Based Heat and Mass Balance—Diesel Operation at 5:00 p.m...diesel fuel. Based on simulation of full-load biomass operation, the diesel-fueled test was expected to demonstrate a 75% net fuel-to-steam efficiency
LACIE performance predictor final operational capability program description, volume 2
NASA Technical Reports Server (NTRS)
1976-01-01
Given the swath table files, the segment set for one country and cloud cover data, the SAGE program determines how many times and under what conditions each segment is accessed by satellites. The program writes a record for each segment on a data file which contains the pertinent acquisition data. The weather data file can also be generated from a NASA supplied tape. The Segment Acquisition Selector Program (SACS) selects data from the segment reference file based upon data input manually and from a crop window file. It writes the extracted data to a data acquisition file and prints two summary reports. The POUT program reads from associated LACIE files and produces printed reports. The major types of reports that can be produced are: (1) Substrate Reference Data Reports, (2) Population Mean, Standard Deviation and Histogram Reports, (3) Histograms of Monte Carlo Statistics Reports, and (4) Frequency of Sample Segment Acquisitions Reports.
Fitting Planetary Orbits with a Spreadsheet.
ERIC Educational Resources Information Center
Bridges, Richard
1995-01-01
Describes how to fit binocular observations of the planets to a theoretical model of circular orbits using a modern computer spreadsheet, from which fundamental data about the solar system may be deduced. (AIM)
Strontium-90 Error Discovered in Subcontract Laboratory Spreadsheet
DOE Office of Scientific and Technical Information (OSTI.GOV)
D. D. Brown A. S. Nagel
1999-07-31
West Valley Demonstration Project health physicists and environmental scientists discovered a series of errors in a subcontractor's spreadsheet being used to reduce data as part of their strontium-90 analytical process.
medplot: a web application for dynamic summary and analysis of longitudinal medical data based on R.
Ahlin, Črt; Stupica, Daša; Strle, Franc; Lusa, Lara
2015-01-01
In biomedical studies the patients are often evaluated numerous times and a large number of variables are recorded at each time-point. Data entry and manipulation of longitudinal data can be performed using spreadsheet programs, which usually include some data plotting and analysis capabilities and are straightforward to use, but are not designed for the analyses of complex longitudinal data. Specialized statistical software offers more flexibility and capabilities, but first time users with biomedical background often find its use difficult. We developed medplot, an interactive web application that simplifies the exploration and analysis of longitudinal data. The application can be used to summarize, visualize and analyze data by researchers that are not familiar with statistical programs and whose knowledge of statistics is limited. The summary tools produce publication-ready tables and graphs. The analysis tools include features that are seldom available in spreadsheet software, such as correction for multiple testing, repeated measurement analyses and flexible non-linear modeling of the association of the numerical variables with the outcome. medplot is freely available and open source, it has an intuitive graphical user interface (GUI), it is accessible via the Internet and can be used within a web browser, without the need for installing and maintaining programs locally on the user's computer. This paper describes the application and gives detailed examples describing how to use the application on real data from a clinical study including patients with early Lyme borreliosis.
Documentation of spreadsheets for the analysis of aquifer-test and slug-test data
Halford, Keith J.; Kuniansky, Eve L.
2002-01-01
Several spreadsheets have been developed for the analysis of aquifer-test and slug-test data. Each spreadsheet incorporates analytical solution(s) of the partial differential equation for ground-water flow to a well for a specific type of condition or aquifer. The derivations of the analytical solutions were previously published. Thus, this report abbreviates the theoretical discussion, but includes practical information about each method and the important assumptions for the applications of each method. These spreadsheets were written in Microsoft Excel 9.0 (use of trade names does not constitute endorsement by the USGS). Storage properties should not be estimated with many of the spreadsheets because most are for analyzing single-well tests. Estimation of storage properties from single-well tests is generally discouraged because single-well tests are affected by wellbore storage and by well construction. These non-ideal effects frequently cause estimates of storage to be erroneous by orders of magnitude. Additionally, single-well tests are not sensitive to aquifer-storage properties. Single-well tests include all slug tests (the Bouwer and Rice method; the Cooper, Bredehoeft, and Papadopulos method; and the van der Kamp method), the Cooper-Jacob straight-line method, Theis recovery-data analysis, the Jacob-Lohman method for flowing wells in a confined aquifer, and the step-drawdown test. Multi-well test spreadsheets included in this report are the Hantush-Jacob leaky aquifer method and distance-drawdown methods. The distance-drawdown method is an equilibrium or steady-state method; thus, storage cannot be estimated.
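As an illustration of one of the single-well analyses named above, the Cooper-Jacob straight-line method estimates transmissivity from the late-time drawdown slope per log cycle of time, T = 2.3Q/(4πΔs). The Python sketch below uses made-up data and is not the USGS spreadsheet itself; consistent with the caution above, no storage estimate is attempted from the single-well data.

```python
import numpy as np

def cooper_jacob_transmissivity(times, drawdowns, Q):
    """Cooper-Jacob straight-line method: fit drawdown versus log10(time) and
    convert the slope (drawdown per log cycle) to transmissivity,
    T = 2.303*Q/(4*pi*ds). Units must be consistent (e.g. Q in m^3/d, s in m)."""
    slope, _ = np.polyfit(np.log10(times), drawdowns, 1)   # drawdown per log cycle
    return 2.303 * Q / (4.0 * np.pi * slope)

# Illustrative late-time data, pumping at 500 m^3/d
t = np.array([10, 20, 40, 80, 160, 320], dtype=float)      # minutes since pumping began
s = np.array([0.95, 1.18, 1.41, 1.64, 1.87, 2.10])         # drawdown, metres
print(cooper_jacob_transmissivity(t, s, Q=500.0))           # transmissivity, m^2/d
```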
43 CFR 4.1362 - Where to file; when to file.
Code of Federal Regulations, 2010 CFR
2010-10-01
... APPEALS PROCEDURES Special Rules Applicable to Surface Coal Mining Hearings and Appeals Request for Review... Transfer, Assignment Or Sale of Rights Granted Under Permit (federal Program; Federal Lands Program... file; when to file. (a) The request for review shall be filed with the Hearings Division, Office of...
Geoscience data visualization and analysis using GeoMapApp
NASA Astrophysics Data System (ADS)
Ferrini, Vicki; Carbotte, Suzanne; Ryan, William; Chan, Samantha
2013-04-01
Increased availability of geoscience data resources has resulted in new opportunities for developing visualization and analysis tools that not only promote data integration and synthesis, but also facilitate quantitative cross-disciplinary access to data. Interdisciplinary investigations, in particular, frequently require visualizations and quantitative access to specialized data resources across disciplines, which has historically required specialist knowledge of data formats and software tools. GeoMapApp (www.geomapapp.org) is a free online data visualization and analysis tool that provides direct quantitative access to a wide variety of geoscience data for a broad international interdisciplinary user community. While GeoMapApp provides access to online data resources, it can also be packaged to work offline through the deployment of a small portable hard drive. This mode of operation can be particularly useful during field programs to provide functionality and direct access to data when a network connection is not possible. Hundreds of data sets from a variety of repositories are directly accessible in GeoMapApp, without the need for the user to understand the specifics of file formats or data reduction procedures. Available data include global and regional gridded data and images, as well as tabular and vector datasets. In addition to basic visualization and data discovery functionality, users are provided with simple tools for creating customized maps and visualizations and for quantitatively interrogating data. Specialized data portals with advanced functionality are also provided for power users to further analyze data resources and access underlying component datasets. Users may import and analyze their own geospatial datasets by loading local versions of geospatial data and can access content made available through Web Feature Services (WFS) and Web Map Services (WMS). Once data are loaded in GeoMapApp, a variety of options are provided to export data and/or 2D/3D visualizations into common formats including grids, images, text files, spreadsheets, etc. Examples of interdisciplinary investigations that make use of GeoMapApp visualization and analysis functionality will be provided.
FD_BH: a program for simulating electromagnetic waves from a borehole antenna
Ellefsen, Karl J.
2002-01-01
Program FD_BH is used to simulate the electromagnetic waves generated by an antenna in a borehole. The model representing the antenna may include metallic parts, a coaxial cable as a feed to the driving point, and resistive loading. The program is written in the C programming language, and the program has been tested on both the Windows and the UNIX operating systems. This Open-File Report describes • The contents and organization of the Zip file (section 2). • The program files, the installation of the program, the input files, and the execution of the program (section 3). • Address to which suggestions for improving the program may be sent (section 4).
Tell Me about Your Lemonade Stand
NASA Technical Reports Server (NTRS)
Tibbitts, Scott
2004-01-01
At STARSYS, we execute many Firm Fixed Price (FFP) programs for the development of mechanical systems for spacecraft. Contracting this way presupposes that we have the ability to establish and hold scope for a system that has yet to be defined. To do FFP, it is critical that we have program managers who are masters at cost control. Fortunately, we have some "masters" in our company. They just seem to have a knack for driving to a financial target. Doesn't matter if the program has contingency or not. Doesn't seem to matter if they use MS Project, Excel spreadsheets, or the back of an envelope. Doesn't even seem to matter whether the program is set up as a financial challenge or a winner.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Enghauser, Michael
2016-02-01
The goal of the Domestic Nuclear Detection Office (DNDO) Algorithm Improvement Program (AIP) is to facilitate gamma-radiation detector nuclide identification algorithm development, improvement, and validation. Accordingly, scoring criteria have been developed to objectively assess the performance of nuclide identification algorithms. In addition, a Microsoft Excel spreadsheet application for automated nuclide identification scoring has been developed. This report provides an overview of the equations, nuclide weighting factors, nuclide equivalencies, and configuration weighting factors used by the application for scoring nuclide identification algorithm performance. Furthermore, this report presents a general overview of the nuclide identification algorithm scoring application including illustrative examples.
Software For Design And Analysis Of Tanks And Cylindrical Shells
NASA Technical Reports Server (NTRS)
Luz, Paul L.; Graham, Jerry B.
1995-01-01
Skin-stringer Tank Analysis Spreadsheet System (STASS) computer program developed for use as preliminary design software tool that enables quick-turnaround design and analysis of structural domes and cylindrical barrel sections in propellant tanks or other cylindrical shells. Determines minimum required skin thicknesses for domes and cylindrical shells to withstand material failure due to applied pressures (ullage and/or hydrostatic) and runs buckling analyses on cylindrical shells and skin-stringers. Implemented as workbook program, using Microsoft Excel v4.0 on Macintosh II. Also implemented using Microsoft Excel v4.0 for Microsoft Windows v3.1 IBM PC.
A programmable rules engine to provide clinical decision support using HTML forms.
Heusinkveld, J; Geissbuhler, A; Sheshelidze, D; Miller, R
1999-01-01
The authors have developed a simple method for specifying rules to be applied to information on HTML forms. This approach allows clinical experts, who lack the programming expertise needed to write CGI scripts, to construct and maintain domain-specific knowledge and ordering capabilities within WizOrder, the order-entry and decision support system used at Vanderbilt Hospital. The clinical knowledge base maintainers use HTML editors to create forms and spreadsheet programs for rule entry. A test environment has been developed which uses Netscape to display forms; the production environment displays forms using an embedded browser.
Spreadsheets in Science Teaching.
ERIC Educational Resources Information Center
Elliot, Chris
1988-01-01
Described is the use of a spreadsheet to model dynamic phenomena using numerical iterative methods. Uses the discharge of a capacitor, simple and damped harmonic motion, and the flow of heat along a bar as examples. (Author/CW)
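The iterative modeling approach described here can be illustrated with the capacitor-discharge example: each spreadsheet row applies the same explicit update V_(n+1) = V_n - (V_n/RC)·dt. A short Python sketch of that iteration, with illustrative component values:

```python
# Explicit (Euler) iteration for capacitor discharge, dV/dt = -V/(R*C):
# the same update a spreadsheet would apply row by row.
R, C = 10e3, 100e-6          # 10 kilo-ohm, 100 microfarad -> RC = 1 s
dt, V = 0.01, 5.0            # time step (s) and initial voltage (V)

for step in range(1, 501):
    V = V - (V / (R * C)) * dt          # V_{n+1} = V_n - (V_n / RC) * dt
    if step % 100 == 0:
        print(f"t = {step * dt:4.1f} s   V = {V:.3f} V")   # compare with 5*exp(-t/RC)
```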
Spreadsheet Works: Graphing Functions on a Spreadsheet.
ERIC Educational Resources Information Center
Ramamurthi, V. S.
1989-01-01
Explains graphing functions when using LOTUS 1-2-3. Provides examples and explains keystroke entries needed to make the graphs. Notes up to six functions can be displayed on the same set of axes. (MVL)
Fitting Orbits to Jupiter's Moons with a Spreadsheet.
ERIC Educational Resources Information Center
Bridges, Richard
1995-01-01
Describes how a spreadsheet is used to fit a circular orbit model to observations of Jupiter's moons made with a small telescope. Kepler's Third Law and the inverse square law of gravity are observed. (AIM)
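The underlying calculation is a direct application of Kepler's third law with the inverse-square law: M = 4π²a³/(GT²), so each moon's period and orbital radius should return roughly the same central mass. A brief sketch using approximate textbook values for the Galilean moons (not the article's small-telescope observations):

```python
import numpy as np

G = 6.674e-11                      # gravitational constant, m^3 kg^-1 s^-2

# Approximate semi-major axes (m) and periods (days) of Io, Europa, Ganymede, Callisto
a = np.array([4.22e8, 6.71e8, 1.070e9, 1.883e9])
T = np.array([1.77, 3.55, 7.15, 16.69]) * 86400.0   # convert days to seconds

# Kepler's third law with the inverse-square law: M = 4*pi^2*a^3 / (G*T^2)
M = 4 * np.pi**2 * a**3 / (G * T**2)
print(M)          # each moon should give roughly Jupiter's mass, ~1.9e27 kg
```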
Manual for Getdata Version 3.1: a FORTRAN Utility Program for Time History Data
NASA Technical Reports Server (NTRS)
Maine, Richard E.
1987-01-01
This report documents version 3.1 of the GetData computer program. GetData is a utility program for manipulating files of time history data, i.e., data giving the values of parameters as functions of time. The most fundamental capability of GetData is extracting selected signals and time segments from an input file and writing the selected data to an output file. Other capabilities include converting file formats, merging data from several input files, time skewing, interpolating to common output times, and generating calculated output signals as functions of the input signals. This report also documents the interface standards for the subroutines used by GetData to read and write the time history files. All interface to the data files is through these subroutines, keeping the main body of GetData independent of the precise details of the file formats. Different file formats can be supported by changes restricted to these subroutines. Other computer programs conforming to the interface standards can call the same subroutines to read and write files in compatible formats.
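One of the capabilities listed, interpolating several signals to common output times, can be sketched in a few lines. The example below uses simple linear interpolation in Python with made-up sample times; it only illustrates the idea and is not GetData's implementation or file format.

```python
import numpy as np

# Two signals recorded at different, irregular sample times
t_a = np.array([0.0, 0.11, 0.19, 0.32, 0.40])
v_a = np.array([1.0, 1.2, 1.1, 1.4, 1.5])
t_b = np.array([0.05, 0.15, 0.25, 0.35])
v_b = np.array([10.0, 10.5, 11.0, 11.2])

# Resample both onto a common, evenly spaced time base by linear interpolation
t_out = np.arange(0.05, 0.36, 0.05)
a_out = np.interp(t_out, t_a, v_a)
b_out = np.interp(t_out, t_b, v_b)
print(np.column_stack([t_out, a_out, b_out]))   # merged, time-aligned table
```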
Software validation applied to spreadsheets used in laboratories working under ISO/IEC 17025
NASA Astrophysics Data System (ADS)
Banegas, J. M.; Orué, M. W.
2016-07-01
Several documents deal with software validation. Nevertheless, most are too complex to be applied to validate spreadsheets - surely the most used software in laboratories working under ISO/IEC 17025. The method proposed in this work is intended to be applied directly to validate spreadsheets. It includes a systematic way to document requirements, operational aspects regarding validation, and a simple method to keep records of validation results and modification history. This method is currently being used in an accredited calibration laboratory, where it has proved practical and efficient.
NASA Technical Reports Server (NTRS)
Rustay, R. C.; Gajjar, J. T.; Rankin, R. W.; Wentz, R. C.; Wooding, R.
1982-01-01
Listings of source programs and some illustrative examples of various ASCII data base files are presented. The listings are grouped into the following categories: main programs, subroutine programs, illustrative ASCII data base files. Within each category files are listed alphabetically.
Smartfiles: An OO approach to data file interoperability
NASA Technical Reports Server (NTRS)
Haines, Matthew; Mehrotra, Piyush; Vanrosendale, John
1995-01-01
Data files for scientific and engineering codes typically consist of a series of raw data values whose descriptions are buried in the programs that interact with these files. In this situation, making even minor changes in the file structure or sharing files between programs (interoperability) can only be done after careful examination of the data file and the I/O statement of the programs interacting with this file. In short, scientific data files lack self-description, and other self-describing data techniques are not always appropriate or useful for scientific data files. By applying an object-oriented methodology to data files, we can add the intelligence required to improve data interoperability and provide an elegant mechanism for supporting complex, evolving, or multidisciplinary applications, while still supporting legacy codes. As a result, scientists and engineers should be able to share datasets with far greater ease, simplifying multidisciplinary applications and greatly facilitating remote collaboration between scientists.
Snyder, Daniel T.; Haluska, Tana L.; Respini-Irwin, Darius
2013-01-01
The Shoreline Management Tool is a geographic information system (GIS) based program developed to assist water- and land-resource managers in assessing the benefits and effects of changes in surface-water stage on water depth, inundated area, and water volume. Additionally, the Shoreline Management Tool can be used to identify aquatic or terrestrial habitat areas where conditions may be suitable for specific plants or animals as defined by user-specified criteria including water depth, land-surface slope, and land-surface aspect. The tool can also be used to delineate areas for use in determining a variety of hydrologic budget components such as surface-water storage, precipitation, runoff, or evapotranspiration. The Shoreline Management Tool consists of two parts, a graphical user interface for use with Esri™ ArcMap™ GIS software to interact with the user to define scenarios and map results, and a spreadsheet in Microsoft® Excel® developed to display tables and graphs of the results. The graphical user interface allows the user to define a scenario consisting of an inundation level (stage), land areas (parcels), and habitats (areas meeting user-specified conditions) based on water depth, slope, and aspect criteria. The tool uses data consisting of land-surface elevation, tables of stage/volume and stage/area, and delineated parcel boundaries to produce maps (data layers) of inundated areas and areas that meet the habitat criteria. The tool can be run in a Single-Time Scenario mode or in a Time-Series Scenario mode, which uses an input file of dates and associated stages. The spreadsheet part of the tool uses a macro to process the results from the graphical user interface to create tables and graphs of inundated water volume, inundated area, dry area, and mean water depth for each land parcel based on the user-specified stage. The macro also creates tables and graphs of the area, perimeter, and number of polygons comprising the user-specified habitat areas within each parcel. The Shoreline Management Tool is highly transferable, using easily generated or readily available data. The capabilities of the tool are demonstrated using data from the lower Wood River Valley adjacent to Upper Klamath and Agency Lakes in southern Oregon.
Review of the Fiscal Year 2014 (FY14) Defense Environmental International Cooperation (DEIC) Program
2015-05-01
above, this spreadsheet lists all proposed projects by project number and title, the DEIC funds requested for each, the funding for approved projects...originating request, including DoD action officer, email address, and commercial and DSN phone number ; 3. Description – Explain why this...3 Numbers do not add to 100 percent (or $1,691,000) because they do not include funding to support overall DEIC
A Visual Programming Methodology for Tactical Aircrew Scheduling and Other Applications
1991-12-01
programming methodology and environment of a user-specific application remains with and is delivered as part of the application, then there is another factor...animation is useful, not only for scheduling applications, but as a general programming methodology. Of course, there are a number of improvements...possible using Excel because there is nothing to prevent access to cells. However, it is easy to imagine a spreadsheet which can support the
ERIC Educational Resources Information Center
Clark, Joy L.; Hegji, Charles E.
1997-01-01
Notes that using spreadsheets to teach microeconomics principles enables learning by doing in the exploration of basic concepts. Introduction of increasingly complex topics leads to exploration of theory and managerial decision making. (SK)
Building Your Own Regression Model
ERIC Educational Resources Information Center
Horton, Robert, M.; Phillips, Vicki; Kenelly, John
2004-01-01
Spreadsheets to explore regression with an algebra 2 class in a medium-sized rural high school are presented. The use of spreadsheets can help students develop sophisticated understanding of mathematical models and use them to describe real-world phenomena.
[Development of an Excel spreadsheet for meta-analysis of indirect and mixed treatment comparisons].
Tobías, Aurelio; Catalá-López, Ferrán; Roqué, Marta
2014-01-01
Meta-analyses in clinical research usually aim to evaluate treatment efficacy and safety in direct comparison with a single comparator. Indirect comparisons, using Bucher's method, can summarize primary data when information from direct comparisons is limited or nonexistent. Mixed comparisons allow estimates from direct and indirect comparisons to be combined, increasing statistical power. There is a need for simple applications for meta-analysis of indirect and mixed comparisons; these can easily be conducted using a Microsoft Office Excel spreadsheet. We developed a user-friendly spreadsheet for indirect and mixed treatment comparisons aimed at clinical researchers who are interested in systematic reviews but not familiar with more advanced statistical packages. The proposed Excel spreadsheet for indirect and mixed comparisons can be of great use in clinical epidemiology, extending the knowledge provided by traditional meta-analysis when evidence from direct comparisons is limited or nonexistent.
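The calculations such a spreadsheet performs are compact: Bucher's adjusted indirect comparison differences two direct estimates that share a common comparator and adds their variances, and a mixed estimate combines direct and indirect results by inverse-variance weighting. A Python sketch with illustrative log odds ratios (not the authors' spreadsheet):

```python
import numpy as np

def bucher_indirect(d_ac, se_ac, d_bc, se_bc):
    """Bucher adjusted indirect comparison of A vs B through common comparator C,
    on an additive scale such as log odds ratios or log hazard ratios."""
    d_ab = d_ac - d_bc
    se_ab = np.sqrt(se_ac**2 + se_bc**2)
    return d_ab, se_ab

def mixed_estimate(d_direct, se_direct, d_indirect, se_indirect):
    """Inverse-variance weighted combination of direct and indirect estimates."""
    w_d, w_i = 1 / se_direct**2, 1 / se_indirect**2
    d = (w_d * d_direct + w_i * d_indirect) / (w_d + w_i)
    return d, np.sqrt(1 / (w_d + w_i))

# Illustrative log odds ratios: A vs C and B vs C from separate trials
d_ab, se_ab = bucher_indirect(d_ac=-0.40, se_ac=0.15, d_bc=-0.10, se_bc=0.20)
print(d_ab, se_ab)                                  # indirect A vs B estimate
print(mixed_estimate(-0.35, 0.25, d_ab, se_ab))     # combined with a direct A vs B trial
```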
NASA Technical Reports Server (NTRS)
Fanselow, J. L.; Vavrus, J. L.
1984-01-01
ARCH, file archival system for DEC VAX, provides for easy offline storage and retrieval of arbitrary files on DEC VAX system. System designed to eliminate situations that tie up disk space and lead to confusion when different programmers develop different versions of same programs and associated files.
Easy Leaf Area: Automated digital image analysis for rapid and accurate measurement of leaf area.
Easlon, Hsien Ming; Bloom, Arnold J
2014-07-01
Measurement of leaf areas from digital photographs has traditionally required significant user input unless backgrounds are carefully masked. Easy Leaf Area was developed to batch process hundreds of Arabidopsis rosette images in minutes, removing background artifacts and saving results to a spreadsheet-ready CSV file. • Easy Leaf Area uses the color ratios of each pixel to distinguish leaves and calibration areas from their background and compares leaf pixel counts to a red calibration area to eliminate the need for camera distance calculations or manual ruler scale measurement that other software methods typically require. Leaf areas estimated by this software from images taken with a camera phone were more accurate than ImageJ estimates from flatbed scanner images. • Easy Leaf Area provides an easy-to-use method for rapid measurement of leaf area and nondestructive estimation of canopy area from digital images.
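The pixel-ratio idea can be sketched compactly: classify green-dominant pixels as leaf, classify red-dominant pixels as the calibration square of known area, and scale the leaf pixel count accordingly. The thresholds, Pillow-based file handling, and file name below are illustrative assumptions, not the Easy Leaf Area implementation.

```python
import numpy as np
from PIL import Image

def leaf_area_cm2(image_path, calib_area_cm2=4.0):
    """Estimate leaf area from one image containing a red calibration square of
    known area, by comparing green-dominant and red-dominant pixel counts."""
    rgb = np.asarray(Image.open(image_path).convert("RGB"), dtype=float)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    leaf = (g > 1.2 * r) & (g > 1.2 * b)        # green-dominant pixels (leaf)
    calib = (r > 1.5 * g) & (r > 1.5 * b)       # red-dominant pixels (scale square)
    return leaf.sum() / calib.sum() * calib_area_cm2

# print(leaf_area_cm2("rosette_001.jpg"))       # hypothetical file name
```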
Indoor air pollutants from household-product sources: Project report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sack, T.M.; Steele, D.H.
1991-09-01
A Gas Chromatography/Mass Spectrometry (GC/MS) data base obtained during the analysis of 1,159 household products for six common chlorocarbon solvents has been reanalyzed for the presence and concentration of 25 additional chemicals. Using computerized GC/MS software, 1,043 of the original GC/MS data files were recovered and analyzed for the presence of the additional chemicals. Of the 25 additional chemicals, those found most frequently in the household products include acetone (315 products), 2-butanone (200 products), methylcyclohexane (150 products), toluene (488 products), ethylbenzene (157 products), m-xylene (101 products), and o,p-xylene (93 products). A total of 63.6% of the products analyzed in the study contained one or more of the 25 additional analytes at concentrations greater than or equal to 0.1% by weight. The quantitative information presented in the report is also available on diskette in a spreadsheet format.
Elliott, Peggy E.; Moreo, Michael T.
2011-01-01
From 1951 to 2008, groundwater withdrawals totaled more than 25,000 million gallons from wells on and directly adjacent to the Nevada National Security Site. Total annual groundwater withdrawals ranged from about 30 million gallons in 1951 to as much as 1,100 million gallons in 1989. Annual withdrawals from individual wells ranged from 0 million gallons to more than 325 million gallons. Monthly withdrawal data for the wells were compiled in a Microsoft® Excel 2003 spreadsheet. Groundwater withdrawal data are a compilation of measured and estimated withdrawals obtained from published and unpublished reports, U.S. Geological Survey files, and/or data reported by other agencies. The withdrawal data were collected from 42 wells completed in 33 boreholes. A history of each well is presented in terms of its well construction, borehole lithology, withdrawals, and water levels.
Do Vampires Exist? Using Spreadsheets To Investigate a Common Folktale.
ERIC Educational Resources Information Center
Drier, Hollylynne Stohl
1999-01-01
Describes the use of spreadsheets in a third grade class to teach basic mathematical concepts by investigating the existence of vampires. Incorporates addition and multiplication skills, patterning, variables, formulas, exponential growth, and proof by contradiction. (LRW)
Life and dynamic capacity modeling for aircraft transmissions
NASA Technical Reports Server (NTRS)
Savage, Michael
1991-01-01
A computer program to simulate the dynamic capacity and life of parallel shaft aircraft transmissions is presented. Five basic configurations can be analyzed: single mesh, compound, parallel, reverted, and single plane reductions. In execution, the program prompts the user for the data file prefix name, takes input from an ASCII file, and writes its output to a second ASCII file with the same prefix name. The input data file includes the transmission configuration, the input shaft torque and speed, and descriptions of the transmission geometry and the component gears and bearings. The program output file describes the transmission, its components, their capabilities, locations, and loads. It also lists the dynamic capability, ninety percent reliability, and mean life of each component and the transmission as a system. Here, the program, its input and output files, and the theory behind the operation of the program are described.
Shuttle Data Center File-Processing Tool in Java
NASA Technical Reports Server (NTRS)
Barry, Matthew R.; Miller, Walter H.
2006-01-01
A Java-language computer program has been written to facilitate mining of data in files in the Shuttle Data Center (SDC) archives. This program can be executed on a variety of workstations or via Web-browser programs. This program is partly similar to prior C-language programs used for the same purpose, while differing from those programs in that it exploits the platform-neutrality of Java in implementing several features that are important for analysis of large sets of time-series data. The program supports regular expression queries of SDC archive files, reads the files, interleaves the time-stamped samples according to a chosen output, then transforms the results into that format. A user can choose among a variety of output file formats that are useful for diverse purposes, including plotting, Markov modeling, multivariate density estimation, and wavelet multiresolution analysis, as well as for playback of data in support of simulation and testing.
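The interleaving step described here, merging time-stamped samples from several files into one time-ordered stream, can be sketched with a standard k-way merge. A small Python illustration with made-up records; this is not the SDC tool's Java code:

```python
import heapq

# Each source file yields (timestamp, parameter_name, value) tuples already
# sorted by time; merging them produces one time-ordered stream for output.
file_a = [(0.0, "P1", 3.2), (1.0, "P1", 3.4), (2.0, "P1", 3.1)]
file_b = [(0.5, "P2", 10.0), (1.5, "P2", 10.7)]

merged = heapq.merge(file_a, file_b, key=lambda rec: rec[0])
for t, name, value in merged:
    print(f"{t:6.2f}  {name}  {value}")
```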
75 FR 34625 - Administrative Remedy Program: Exception to Initial Filing Procedures
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-18
... Remedy Program: Exception to Initial Filing Procedures AGENCY: Bureau of Prisons, Justice. ACTION... Administrative Remedy Program to add an exception to initial filing of Administrative Remedy appeals at the institution level. The exception will state that formal administrative remedy requests regarding initial...
Lee, Young Han; Park, Eun Hae; Suh, Jin-Suck
2015-01-01
The objectives are: 1) to introduce a simple and efficient method for extracting region of interest (ROI) values from a Picture Archiving and Communication System (PACS) viewer using optical character recognition (OCR) software and a macro program, and 2) to evaluate the accuracy of this method with a PACS workstation. This module was designed to extract the ROI values on the images of the PACS, and created as a development tool by using open-source OCR software and an open-source macro program. The principal processes are as follows: (1) capture a region of the ROI values as a graphic file for OCR, (2) recognize the text from the captured image by OCR software, (3) perform error-correction, (4) extract the values including area, average, standard deviation, max, and min values from the text, (5) reformat the values into temporary strings with tabs, and (6) paste the temporary strings into the spreadsheet. This principal process was repeated for the number of ROIs. The accuracy of this module was evaluated on 1040 recognitions from 280 randomly selected ROIs of the magnetic resonance images. The input times of ROIs were compared between conventional manual method and this extraction module-assisted input method. The module for extracting ROI values operated successfully using the OCR and macro programs. The values of the area, average, standard deviation, maximum, and minimum could be recognized and error-corrected with AutoHotkey-coded module. The average input times using the conventional method and the proposed module-assisted method were 34.97 seconds and 7.87 seconds, respectively. A simple and efficient method for ROI value extraction was developed with open-source OCR and a macro program. Accurate inputs of various numbers from ROIs can be extracted with this module. The proposed module could be applied to the next generation of PACS or existing PACS that have not yet been upgraded. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.
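The parsing and error-correction steps (3)-(5) can be sketched generically: recognize labeled numbers in the OCR text, repair common character confusions, and emit a tab-separated row ready to paste into a spreadsheet. The Python sketch below assumes a hypothetical overlay layout and field labels; it is not the authors' AutoHotkey-coded module.

```python
import re

# Hypothetical OCR output captured from a PACS ROI overlay; the labels and
# layout are assumptions, not the exact viewer format.
ocr_text = """Area: 124.6 mm2  Mean: 356.2
SD: 41.8  Max: 478  Min: 201"""

def parse_roi(text):
    """Pull the five ROI statistics out of the recognized text, with simple
    error-correction for common OCR confusions (letter O read for zero)."""
    cleaned = text.replace("O", "0")
    values = {}
    for key in ("Area", "Mean", "SD", "Max", "Min"):
        m = re.search(rf"{key}\s*[:=]\s*([-+]?\d+(?:\.\d+)?)", cleaned)
        values[key] = float(m.group(1)) if m else None
    return values

roi = parse_roi(ocr_text)
# Tab-separated row, ready to paste into one spreadsheet line
print("\t".join(str(roi[k]) for k in ("Area", "Mean", "SD", "Max", "Min")))
```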
Software Implements a Space-Mission File-Transfer Protocol
NASA Technical Reports Server (NTRS)
Rundstrom, Kathleen; Ho, Son Q.; Levesque, Michael; Sanders, Felicia; Burleigh, Scott; Veregge, John
2004-01-01
CFDP is a computer program that implements the CCSDS (Consultative Committee for Space Data Systems) File Delivery Protocol, which is an international standard for automatic, reliable transfers of files of data between locations on Earth and in outer space. CFDP administers concurrent file transfers in both directions, delivery of data out of transmission order, reliable and unreliable transmission modes, and automatic retransmission of lost or corrupted data by use of one or more of several lost-segment-detection modes. The program also implements several data-integrity measures, including file checksums and optional cyclic redundancy checks for each protocol data unit. The metadata accompanying each file can include messages to users' application programs and commands for operating on remote file systems.
ERIC Educational Resources Information Center
Carson, S. R.
1998-01-01
Presents a method for using spreadsheets to model special relativistic phenomena based on the connection between electric and magnetic fields in special relativity. Uses the time dilation equation to carry out transformations between reference frames that show the connection between the fields quantitatively. (DDR)
DOT National Transportation Integrated Search
2014-03-01
This study resulted in the development of the GASCAP model (the Greenhouse Gas Assessment Spreadsheet for Transportation Capital Projects). This spreadsheet model provides a user-friendly interface for determining the greenhouse gas (GHG) emissions...
ERIC Educational Resources Information Center
Ivancevich, Daniel M.; And Others
1996-01-01
Points out that political and economic pressures have sometimes caused the Financial Accounting Standards Board to alter standards. Presents a spreadsheet tool that demonstrates the economic consequences of adopting accounting standards. (SK)
76 FR 34124 - Civil Supersonic Aircraft Panel Discussion
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-10
... and continuing to the second line in the second column, the Web site address should read as follows: https://spreadsheets.google.com/spreadsheet/viewform?formkey=dEFEdlRnYzBiaHZtTUozTHVtbkF4d0E6MQ . [FR...
Assessment of ODOT culvert load rating spreadsheets for use in Michigan.
DOT National Transportation Integrated Search
2013-01-01
The project Assessment of ODOT Culvert Load Rating Spreadsheets for use in Michigan was a short time-frame project funded by the Michigan Department of Transportation (MDOT) through the Center for Structural Durability (CSD) at Michigan Tec...
A TOOL FOR PLANNING AERIAL PHOTOGRAPHY
The U.S. EPA's Pacific Coastal Ecology Branch has developed a tool in the form of an Excel spreadsheet that facilitates planning aerial photography missions. The spreadsheet accepts various input parameters such as desired photo-scale and boundary coordinates of the stud...
Hand, Maureen; Augustine, Chad; Feldman, David; Kurup, Parthiv; Beiter, Philipp; O'Connor, Patrick
2017-08-21
Each year since 2015, NREL has presented the Annual Technology Baseline (ATB) in a spreadsheet that contains detailed cost and performance data (both current and projected) for renewable and conventional technologies. The spreadsheet includes a workbook for each technology. This spreadsheet provides data for the 2017 ATB. In this edition of the ATB, offshore wind power has been updated to include 15 technical resource groups, and two options are now provided for representing market conditions for project financing, including current market conditions and long-term historical conditions. For more information, see https://atb.nrel.gov/.
NASA Astrophysics Data System (ADS)
Conrad, Jon M.
1999-10-01
Resource Economics is a text for students with a background in calculus, intermediate microeconomics, and a familiarity with the spreadsheet software Excel. The book covers basic concepts, shows how to set up spreadsheets to solve dynamic allocation problems, and presents economic models for fisheries, forestry, nonrenewable resources, stock pollutants, option value, and sustainable development. Within the text, numerical examples are posed and solved using Excel's Solver. Through these examples and additional exercises at the end of each chapter, students can make dynamic models operational, develop their economic intuition, and learn how to set up spreadsheets for the simulation of optimization of resource and environmental systems.
Software for Automated Reading of STEP Files by I-DEAS(trademark)
NASA Technical Reports Server (NTRS)
Pinedo, John
2003-01-01
A program called "readstep" enables the I-DEAS(tm) computer-aided-design (CAD) software to automatically read Standard for the Exchange of Product Model Data (STEP) files. (The STEP format is one of several used to transfer data between dissimilar CAD programs.) Prior to the development of "readstep," it was necessary to read STEP files into I-DEAS(tm) one at a time in a slow process that required repeated intervention by the user. In operation, "readstep" prompts the user for the location of the desired STEP files and the names of the I-DEAS(tm) project and model file, then generates an I-DEAS(tm) program file called "readstep.prg" and two Unix shell programs called "runner" and "controller." The program "runner" runs I-DEAS(tm) sessions that execute readstep.prg, while "controller" controls the execution of "runner" and edits readstep.prg if necessary. The user sets "runner" and "controller" into execution simultaneously, and then no further intervention by the user is required. When "runner" has finished, the user should see only parts from successfully read STEP files present in the model file. STEP files that could not be read successfully (e.g., because of format errors) should be regenerated before attempting to read them again.
Converting from XML to HDF-EOS
NASA Technical Reports Server (NTRS)
Ullman, Richard; Bane, Bob; Yang, Jingli
2008-01-01
A computer program recreates an HDF-EOS file from an Extensible Markup Language (XML) representation of the contents of that file. This program is one of two programs written to enable testing of the schemas described in the immediately preceding article to determine whether the schemas capture all details of HDF-EOS files.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whitehead, Camilla Dunham; McNeil, Michael; Dunham_Whitehead, Camilla
2008-02-28
The U.S. Environmental Protection Agency (EPA) influences the market for plumbing fixtures and fittings by encouraging consumers to purchase products that carry the WaterSense label, which certifies those products as performing at low flow rates compared to unlabeled fixtures and fittings. As consumers decide to purchase water-efficient products, water consumption will decline nationwide. Decreased water consumption should prolong the operating life of water and wastewater treatment facilities. This report describes the method used to calculate national water savings attributable to EPA's WaterSense program. A Microsoft Excel spreadsheet model, the National Water Savings (NWS) analysis model, accompanies this methodology report. Version 1.0 of the NWS model evaluates indoor residential water consumption. Two additional documents, a Users' Guide to the spreadsheet model and an Impacts Report, accompany the NWS model and this methodology document. Altogether, these four documents represent Phase One of this project. The Users' Guide leads policy makers through the spreadsheet options available for projecting the water savings that result from various policy scenarios. The Impacts Report shows national water savings that will result from differing degrees of market saturation of high-efficiency water-using products. This detailed methodology report describes the NWS analysis model, which examines the effects of WaterSense by tracking the shipments of products that WaterSense has designated as water-efficient. The model estimates market penetration of products that carry the WaterSense label. Market penetration is calculated for both existing and new construction. The NWS model estimates savings based on an accounting analysis of water-using products and of building stock. Estimates of future national water savings will help policy makers further direct the focus of WaterSense and calculate stakeholder impacts from the program. Calculating the total gallons of water the WaterSense program saves nationwide involves integrating two components, or modules, of the NWS model. Module 1 calculates the baseline national water consumption of typical fixtures, fittings, and appliances prior to the program (as described in Section 2.0 of this report). Module 2 develops trends in efficiency for water-using products both in the business-as-usual case and as a result of the program (Section 3.0). The NWS model combines the two modules to calculate total gallons saved by the WaterSense program (Section 4.0). Figure 1 illustrates the modules and the process involved in modeling for the NWS model analysis. The output of the NWS model provides the base case for each end use, as well as a prediction of total residential indoor water consumption during the next two decades. Based on the calculations described in Section 4.0, we can project a timeline of water savings attributable to the WaterSense program. The savings increase each year as the program results in the installation of greater numbers of efficient products, which come to compose more and more of the product stock in households throughout the United States.
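As a rough illustration of how the two modules combine into savings (baseline consumption minus with-program consumption), the Python sketch below uses entirely hypothetical product stocks, usage rates, and flow rates; none of the figures or function names come from the NWS model itself.

# Hypothetical two-module structure: baseline consumption and with-program
# consumption combined into annual savings. All numbers are illustrative.
def baseline_gallons(stock, uses_per_day, gallons_per_use):
    """Module 1: baseline national consumption for one product class (gal/yr)."""
    return stock * uses_per_day * gallons_per_use * 365

def program_gallons(stock, uses_per_day, gallons_per_use, efficient_share, efficient_gpu):
    """Module 2: consumption when a share of the stock is labeled as efficient."""
    conventional = stock * (1 - efficient_share) * uses_per_day * gallons_per_use * 365
    efficient = stock * efficient_share * uses_per_day * efficient_gpu * 365
    return conventional + efficient

# Savings = baseline minus with-program consumption, summed over end uses.
toilets_base = baseline_gallons(stock=300e6, uses_per_day=5, gallons_per_use=1.6)
toilets_prog = program_gallons(300e6, 5, 1.6, efficient_share=0.10, efficient_gpu=1.28)
print(f"Illustrative annual savings: {toilets_base - toilets_prog:,.0f} gallons")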
(abstract) Generic Modeling of a Life Support System for Process Technology Comparisons
NASA Technical Reports Server (NTRS)
Ferrall, J. F.; Seshan, P. K.; Rohatgi, N. K.; Ganapathi, G. B.
1993-01-01
This paper describes a simulation model called the Life Support Systems Analysis Simulation Tool (LiSSA-ST), the spreadsheet program called the Life Support Systems Analysis Trade Tool (LiSSA-TT), and the Generic Modular Flow Schematic (GMFS) modeling technique. Results of using the LiSSA-ST and the LiSSA-TT will be presented for comparing life support systems and process technology options for a Lunar Base and a Mars Exploration Mission.
Long Endurance Underwater Power System
1989-09-01
Figures 7.3 through 7.8. Data analysis was done using the Excel spreadsheet program on an IBM-compatible computer and transported to a Macintosh for...shall be compared with data obtained from non-leaking gill cartridges for a more complete analysis. 4) The pump attached to the test setup was not...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hunsberger, Randolph; Tomberlin, Gregg; Gaul, Chris
As part of the Army Net-Zero Energy Installation program, the Fort Carson Army Base requested that NREL evaluate the feasibility of adding a biomass boiler to the district heating system served by Building 1860. We have also developed an Excel-spreadsheet-based decision support tool--specific to the historic loads served by Building 1860--with which users can perform what-if analysis on gas costs, biomass costs, and other parameters. For economic reasons, we do not recommend adding a biomass system at this time.
Using a Spreadsheet To Explore Melting, Dissolving and Phase Diagrams.
ERIC Educational Resources Information Center
Goodwin, Alan
2002-01-01
Compares phase diagrams relating to the solubilities and melting points of various substances in textbooks with those generated by a spreadsheet using data from the literature. Argues that differences between the diagrams give rise to new chemical insights. (Author/MM)
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-19
... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-62079; File No. 4-598] Program for Allocation... Order gives effect to the Plan filed with the Commission in File No. 4-598. The Parties shall notify all..., pursuant to Section 17(d) of the Act, that the Plan in File No. 4-598, between FINRA and EDGX, filed...
NASA Astrophysics Data System (ADS)
Hronusov, V. V.
2006-12-01
We suggest a method of using external public servers for rearranging, restructuring, and rapidly sharing environmental data for quick presentation in numerous GE clients. The method adds a new approach to the presentation (publication) of mostly static data stored in the public domain (e.g., Blue Marble, Visible Earth). The approach is based on publishing freely accessible spreadsheets that contain enough information and links to the data. Because most large repositories of environmental monitoring data have rather simple net addressing and a simple hierarchy, mostly organized by the date and type of the data, it is possible to construct an http-based link to the file that contains the data. Publication of new data on the server is recorded by simply entering a new address into a cell in the spreadsheet. At the moment we use the EditGrid (www.editgrid.com) system as the spreadsheet platform. The generation of KML code is achieved from XML data using XSLT procedures. Since the EditGrid environment supports "fetch" and similar commands, it is possible to create "smart-adaptive" KML generation on the fly based on data streams from RSS and XML sources. Previous GIS-based methods could combine high-definition data from various sources, but large-scale comparisons of dynamic processes have usually been out of reach of the technology. The suggested method allows an unlimited number of GE clients to view, review, and compare dynamic and static processes from previously un-combinable sources, and on unprecedented scales. The ease of automated or computer-assisted georeferencing has already led to the translation of about 3000 raster public-domain images and point and linear data sources into GE language. In addition, the suggested method allows a user to create rapid animations to demonstrate dynamic processes, products in high demand in education, meteorology, volcanology, and potentially in a number of industries. In general, the new approach, which we have tested on numerous projects, saves time and energy in creating large amounts of georeferenced data of various kinds, and thus provides an excellent tool for education and science.
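A minimal sketch of the final KML-generation step is shown below in Python, assuming spreadsheet rows of (name, latitude, longitude, data URL); the row layout and function names are assumptions, and the authors' pipeline actually uses XSLT over XML exported from EditGrid.

# Illustrative only: build KML placemarks from spreadsheet-style rows.
from xml.sax.saxutils import escape

def rows_to_kml(rows):
    placemarks = []
    for name, lat, lon, url in rows:
        placemarks.append(
            "  <Placemark>\n"
            f"    <name>{escape(name)}</name>\n"
            f"    <description>{escape(url)}</description>\n"
            f"    <Point><coordinates>{lon},{lat},0</coordinates></Point>\n"
            "  </Placemark>"
        )
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<kml xmlns="http://www.opengis.net/kml/2.2">\n<Document>\n'
            + "\n".join(placemarks) + "\n</Document>\n</kml>\n")

rows = [("Station A", 61.2, -149.9, "http://example.org/data/2006-12-01/A.png")]
print(rows_to_kml(rows))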
A simple model of hysteresis behavior using spreadsheet analysis
NASA Astrophysics Data System (ADS)
Ehrmann, A.; Blachowicz, T.
2015-01-01
Hysteresis loops occur in many scientific and technical problems, especially as the field-dependent magnetization of ferromagnetic materials, but also as stress-strain curves of materials measured by tensile tests including thermal effects, in liquid-solid phase transitions, and in cell biology or economics. While several mathematical models exist which aim to calculate hysteresis energies and other parameters, here we offer a simple model for a general hysteretic system, showing different hysteresis loops depending on the defined parameters. The calculation, which is based on basic spreadsheet analysis plus a simple macro code, can be used by students to understand how these systems work and how the parameters influence the response of the system to an external field. Importantly, in the step-by-step mode, each change of the system state, compared to the last step, becomes visible. The simple program can be developed further by several changes and additions, enabling the building of a tool which is capable of answering real physical questions in the broad field of magnetism as well as in other scientific areas in which similar hysteresis loops occur.
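A minimal sketch of such a hysteretic system follows, assuming a simple two-branch switching model with coercive field Hc and saturation Ms; the parameters and branch shape are illustrative and are not the spreadsheet or macro described by the authors.

# Illustrative two-branch hysteresis model: the state follows whichever
# saturating branch the field history has selected.
import math

def sweep_field(h_max=2.0, steps=200, hc=0.5, ms=1.0, width=0.1):
    """Sweep the external field H up and back down; M depends on history."""
    fields = [h_max * math.sin(2 * math.pi * i / steps) for i in range(steps + 1)]
    branch = -1  # -1: lower (ascending) branch, +1: upper (descending) branch
    loop = []
    for h in fields:
        if branch == -1 and h > hc:
            branch = +1   # switch up once the field exceeds +Hc
        elif branch == +1 and h < -hc:
            branch = -1   # switch back once the field falls below -Hc
        shift = hc if branch == -1 else -hc
        m = ms * math.tanh((h - shift) / width)  # saturating branch curve
        loop.append((h, m))
    return loop

for h, m in sweep_field()[::25]:
    print(f"H = {h:+.2f}  M = {m:+.2f}")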
Design and Evaluation of a Personal Diffusion Battery
Vosburgh, Donna J. H.; Klein, Timothy; Sheehan, Maura; Anthony, T. Renee; Peters, Thomas M.
2016-01-01
A four-stage personal diffusion battery (pDB) was designed and constructed to measure submicron particle size distributions. The pDB consisted of a screen-type diffusion battery, solenoid valve system, and electronic controller. A data inversion spreadsheet was created to solve for the number median diameter (NMD), geometric standard deviation (GSD), and particle number concentration of unimodal aerosols using stage number concentrations from the pDB combined with a handheld condensation particle counter (pDB+CPC). The inversion spreadsheet included particle entry losses, theoretical penetrations across screens, the detection efficiency of the CPC, and constraints so the spreadsheet solved to values within the pDB range. Size distribution parameters (NMD, GSD, and number concentration) measured with the pDB+CPC with inversion spreadsheet were within 25% of those measured with a scanning mobility particle sizer (SMPS) for 5 of 12 polydisperse combustion aerosols. For three tests conducted with propylene torch exhaust, the pDB+CPC with inversion spreadsheet successfully identified that the NMD was smaller than the constraint value of 16 nm. The ratio of the nanoparticle portion of the aerosol compared to the reference (R nano) was calculated to determine the ability of pDB+CPC with inversion spreadsheet to measure the nanoparticle portion of the aerosols. The R nano ranged from 0.87 to 1.01 when the inversion solved and from 0.06 to 2.01 when the inversion solved to a constraint. The pDB combined with CPC has limited use as a personal monitor but combining the pDB with a different detector would allow for the pDB to be used as a personal monitor. PMID:26900207
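As a rough sketch of the inversion idea, the Python code below fits NMD, GSD, and total number concentration to stage readings by bounded least squares (for example, constraining NMD to at least 16 nm); the penetration curve is a placeholder, not the screen-penetration theory or the spreadsheet solver setup used by the authors.

# Illustrative inversion of stage readings to lognormal size-distribution
# parameters; the penetration function below is a placeholder only.
import numpy as np
from scipy.optimize import least_squares
from scipy.stats import lognorm

def stage_counts(nmd, gsd, n_total, d50s):
    """Predicted downstream count for each stage, using an illustrative
    sigmoidal penetration with cut diameter d50 per screen stage."""
    d = np.logspace(0, 3, 400)  # particle diameter grid, nm
    dist = lognorm(s=np.log(gsd), scale=nmd)
    dN = n_total * np.diff(dist.cdf(d), prepend=0.0)
    return np.array([np.sum(dN / (1.0 + (d50 / d) ** 2)) for d50 in d50s])

def invert(measured, d50s):
    resid = lambda p: stage_counts(*p, d50s) - measured
    fit = least_squares(resid, x0=[50.0, 1.8, measured.max()],
                        bounds=([16.0, 1.05, 0.0], [1000.0, 4.0, np.inf]))
    return fit.x  # NMD (nm), GSD, number concentration

truth = stage_counts(60.0, 1.6, 1e4, d50s=np.array([20.0, 50.0, 120.0, 300.0]))
print(invert(truth, np.array([20.0, 50.0, 120.0, 300.0])))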
Simplified risk assessment of noise induced hearing loss by means of 2 spreadsheet models.
Lie, Arve; Engdahl, Bo; Tambs, Kristian
2016-11-18
The objective of this study has been to test 2 spreadsheet models to compare the observed with the expected hearing loss for a Norwegian reference population. The prevalence rates of the Norwegian and the National Institute for Occupational Safety and Health (NIOSH) definitions of hearing outcomes were calculated in terms of sex and age, 20-64 years old, for a screened (with no occupational noise exposure) (N = 18 858) and unscreened (N = 38 333) Norwegian reference population from the Nord-Trøndelag Hearing Loss Study (NTHLS). Based on the prevalence rates, 2 different spreadsheet models were constructed in order to compare the prevalence rates of various groups of workers with the expected rates. The spreadsheets were then tested on 10 different occupational groups with varying degrees of hearing loss as compared to a reference population. Hearing of office workers, train drivers, conductors and teachers differed little from the screened reference values based on the Norwegian and the NIOSH criterion. The construction workers, miners, farmers and military had an impaired hearing and railway maintenance workers and bus drivers had a mildly impaired hearing. The spreadsheet models give a valid assessment of the hearing loss. The use of spreadsheet models to compare hearing in occupational groups with that of a reference population is a simple and quick method. The results are in line with comparable hearing thresholds, and allow for significance testing. The method is believed to be useful for occupational health services in the assessment of risk of noise induced hearing loss (NIHL) and the preventive potential in groups of noise-exposed workers. Int J Occup Med Environ Health 2016;29(6):991-999. This work is available in Open Access model and licensed under a CC BY-NC 3.0 PL license.
Ellis, Alisha M.; Smith, Christopher G.
2017-11-28
After Hurricane Sandy, scientists from the U.S. Geological Survey, St. Petersburg Coastal and Marine Science Center conducted a seasonal collection of estuarine, marsh, and sandy overwash surface sediments from Chincoteague Bay, Tom’s Cove, and the surrounding Assateague Island and Delmarva Peninsula in March–April and October 2014. Surplus surface sediment was analyzed for metals, percent carbon and nitrogen, δ13C, and δ15N as part of a complementary U.S. Geological Survey Coastal and Marine Geology Program Sea-level and Storm Impacts on Estuarine Environments and Shorelines project study. The geochemical subsample analyzed for metals and stable isotopes at each site may be used for comparison with past data sets, to create a modern baseline of the natural distribution of the area, to understand seasonal variability as it relates to the health of the local environment, and to assess marsh-to-bay interactions. The use of metals, stable carbon, and stable nitrogen isotopes allows for a more cohesive snapshot of factors influencing the environment and could aid in tracking environmental change. This report serves as an archive for chemical data derived from the surface sediment. Data are available for a seasonal comparison between the March–April 2014 and October 2014 sampling trips. Downloadable data are available as Microsoft Excel spreadsheets. These additional files include formal Federal Geographic Data Committee metadata (data downloads).
NASA Technical Reports Server (NTRS)
Klumpp, Allan R.
1991-01-01
Ada Namelist Package, developed for Ada programming language, enables calling program to read and write FORTRAN-style namelist files. Features are: handling of any combination of types defined by user; ability to read vectors, matrices, and slices of vectors and matrices; handling of mismatches between variables in namelist file and those in programmed list of namelist variables; and ability to avoid searching entire input file for each variable. Principal benefits derived by user: ability to read and write namelist-readable files, ability to detect most file errors in initialization phase, and organization keeping number of instantiated units to few packages rather than to many subprograms.
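For illustration only, here is a minimal Python sketch that reads one FORTRAN-style namelist group into a dictionary; the sample group name and variables are invented, and the Ada package described above handles much more (vectors, matrices, slices, and mismatch detection).

# Illustrative namelist-group reader; not the Ada package's interface.
import re

def read_namelist(text, group):
    """Return {variable: [values]} for one '&group ... /' block."""
    m = re.search(rf"&{group}\b(.*?)/", text, flags=re.IGNORECASE | re.DOTALL)
    if m is None:
        raise KeyError(f"namelist group '{group}' not found")
    result = {}
    for name, values in re.findall(r"(\w+)\s*=\s*([^=/]+?)(?=,?\s*\w+\s*=|\s*$)",
                                   m.group(1).strip(), flags=re.DOTALL):
        result[name.lower()] = [float(v) for v in re.split(r"[,\s]+", values.strip()) if v]
    return result

sample = """
&guidance
  dt = 0.1,
  gains = 1.0, 2.5, 4.0
/
"""
print(read_namelist(sample, "guidance"))  # {'dt': [0.1], 'gains': [1.0, 2.5, 4.0]}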
A programmable rules engine to provide clinical decision support using HTML forms.
Heusinkveld, J.; Geissbuhler, A.; Sheshelidze, D.; Miller, R.
1999-01-01
The authors have developed a simple method for specifying rules to be applied to information on HTML forms. This approach allows clinical experts, who lack the programming expertise needed to write CGI scripts, to construct and maintain domain-specific knowledge and ordering capabilities within WizOrder, the order-entry and decision support system used at Vanderbilt Hospital. The clinical knowledge base maintainers use HTML editors to create forms and spreadsheet programs for rule entry. A test environment has been developed which uses Netscape to display forms; the production environment displays forms using an embedded browser. PMID:10566470
MODY - calculation of ordered structures by symmetry-adapted functions
NASA Astrophysics Data System (ADS)
Białas, Franciszek; Pytlik, Lucjan; Sikora, Wiesława
2016-01-01
In this paper we focus on the new version of the computer program MODY for calculations of symmetry-adapted functions based on the theory of groups and representations. The choice of such a functional frame of coordinates for description of ordered structures leads to a minimal number of parameters which must be used for presentation of such structures and investigations of their properties. The aim of this work is to find those parameters, which are coefficients of a linear combination of calculated functions, leading to construction of different types of structure ordering with a given symmetry. A spreadsheet script for simplification of this work has been created and attached to the program.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Enghauser, Michael
2015-02-01
The goal of the Domestic Nuclear Detection Office (DNDO) Algorithm Improvement Program (AIP) is to facilitate gamma-radiation detector nuclide identification algorithm development, improvement, and validation. Accordingly, scoring criteria have been developed to objectively assess the performance of nuclide identification algorithms. In addition, a Microsoft Excel spreadsheet application for automated nuclide identification scoring has been developed. This report provides an overview of the equations, nuclide weighting factors, nuclide equivalencies, and configuration weighting factors used by the application for scoring nuclide identification algorithm performance. Furthermore, this report presents a general overview of the nuclide identification algorithm scoring application including illustrative examples.
Clock Agreement Among Parallel Supercomputer Nodes
Jones, Terry R.; Koenig, Gregory A.
2014-04-30
This dataset presents measurements that quantify the clock synchronization time-agreement characteristics among several high performance computers including the current world's most powerful machine for open science, the U.S. Department of Energy's Titan machine sited at Oak Ridge National Laboratory. These ultra-fast machines derive much of their computational capability from extreme node counts (over 18000 nodes in the case of the Titan machine). Time-agreement is commonly utilized by parallel programming applications and tools, distributed programming application and tools, and system software. Our time-agreement measurements detail the degree of time variance between nodes and how that variance changes over time. The dataset includes empirical measurements and the accompanying spreadsheets.
47 CFR 76.953 - Limitation on filing a complaint.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 47 Telecommunication 4 2010-10-01 2010-10-01 false Limitation on filing a complaint. 76.953... programming service or associated equipment may be filed against a cable operator only in the event of a rate... change for cable programming service or associated equipment may be filed against a cable operator only...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-15
... approved permanently, this pilot program will expire on December 10, 2010. The instant rule filing proposes... program should be extended. Accordingly, pursuant to the instant rule filing, the expiration date of the... filed with the Commission, and all written communications relating to the proposed rule change between...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-31
... will expire on January 31, 2012. The instant rule filing proposes an extension to the pilot program... to the instant rule filing, the expiration date of the pilot program referenced in [[Page 4843... filed with the Commission, and all written communications relating to the proposed rule change between...
Advanced Technology Multiple Criteria Decision Model.
1981-11-01
ratings of the system parameters; and (3) HEADER, which contains information on the structure of the problem and titles. Two supporting programs develop...in these files are given in Section V.2. 2. DATA STRUCTURE TABLES. This section describes the data files used in the system selection model program...the supporting program PPP and an input file to UPPP and SSMP. Figure 13 shows the structure of this file. b. User's preference package (UPP) UPP is
This SOP describes the method used to automatically parse analytical data generated from gas chromatography/mass spectrometry (GC/MS) analyses into CTEPP summary spreadsheets and electronically import the summary spreadsheets into the CTEPP study database.
A Spreadsheet-based GIS tool for planning aerial photography
The U.S. EPA's Pacific Coastal Ecology Branch has developed a tool which facilitates planning aerial photography missions. This tool is an Excel spreadsheet which accepts various input parameters such as desired photo-scale and boundary coordinates of the study area and compiles ...
NASA Astrophysics Data System (ADS)
Conrad, Jon M.
2000-01-01
Resource Economics is a text for students with a background in calculus, intermediate microeconomics, and a familiarity with the spreadsheet software Excel. The book covers basic concepts, shows how to set up spreadsheets to solve dynamic allocation problems, and presents economic models for fisheries, forestry, nonrenewable resources, stock pollutants, option value, and sustainable development. Within the text, numerical examples are posed and solved using Excel's Solver. These problems help make concepts operational, develop economic intuition, and serve as a bridge to the study of real-world problems of resource management. Through these examples and additional exercises at the end of Chapters 1 to 8, students can make dynamic models operational, develop their economic intuition, and learn how to set up spreadsheets for the simulation of optimization of resource and environmental systems. The book is unique in its use of spreadsheet software (Excel) to solve dynamic allocation problems. Conrad is co-author of a previous book for the Press on the subject for graduate students. The approach is extremely student-friendly and gives students the tools to apply research results to actual environmental issues.
NASA Technical Reports Server (NTRS)
1981-01-01
The set of computer programs described allows for data definition, data input, and data transfer between the LSI-11 microcomputers and the VAX-11/780 minicomputer. Program VAXCOM allows for a simple method of textual file transfer from the LSI to the VAX. Program LSICOM allows for easy file transfer from the VAX to the LSI. Program TTY changes the LSI-11 operator's console to the LSI's printing device. Program DICTIN provides a means for defining a data set for input to either computer. Program DATAIN is a simple-to-operate data-entry program which is capable of building data files on either machine. Program LEDITV is an extremely powerful, easy-to-use, line-oriented text editor. Program COPYSBF is designed to print out textual files on the line printer without character loss from FORTRAN carriage control or wide record transfer.
ModelArchiver—A program for facilitating the creation of groundwater model archives
Winston, Richard B.
2018-03-01
ModelArchiver is a program designed to facilitate the creation of groundwater model archives that meet the requirements of the U.S. Geological Survey (USGS) policy (Office of Groundwater Technical Memorandum 2016.02, https://water.usgs.gov/admin/memo/GW/gw2016.02.pdf, https://water.usgs.gov/ogw/policy/gw-model/). ModelArchiver version 1.0 leads the user step-by-step through the process of creating a USGS groundwater model archive. The user specifies the contents of each of the subdirectories within the archive and provides descriptions of the archive contents. Descriptions of some files can be specified automatically using file extensions. Descriptions also can be specified individually. Those descriptions are added to a readme.txt file provided by the user. ModelArchiver moves the content of the archive to the archive folder and compresses some folders into .zip files. As part of the archive, the modeler must create a metadata file describing the archive. The program has a built-in metadata editor and provides links to websites that can aid in creation of the metadata. The built-in metadata editor is also available as a stand-alone program named FgdcMetaEditor version 1.0, which also is described in this report. ModelArchiver updates the metadata file provided by the user with descriptions of the files in the archive. An optional archive list file generated automatically by ModelMuse can streamline the creation of archives by identifying input files, output files, model programs, and ancillary files for inclusion in the archive.
PATSTAGS - PATRAN-STAGSC-1 TRANSLATOR
NASA Technical Reports Server (NTRS)
Otte, N. E.
1994-01-01
PATSTAGS translates PATRAN finite-element model data into STAGS (Structural Analysis of General Shells) input records to be used for engineering analysis. The program reads data from a PATRAN neutral file and writes STAGS input records into a STAGS input file and a UPRESS data file. It is able to support translations of nodal constraints, nodal, element, force and pressure data. PATSTAGS uses three files: the PATRAN neutral file to be translated, a STAGS input file and a STAGS pressure data file. The user provides the names for the neutral file and the desired names of the STAGS files to be created. The pressure data file contains the element live pressure data used in the STAGS subroutine UPRESS. PATSTAGS is written in FORTRAN 77 for DEC VAX series computers running VMS. The main memory requirement for execution is approximately 790K of virtual memory. Output blocks can be modified to output the data in any format desired, allowing the program to be used to translate model data to analysis codes other than STAGSC-1 (HQN-10967). This program is available in DEC VAX BACKUP format on a 9-track magnetic tape or TK50 tape cartridge. Documentation is included in the price of the program. PATSTAGS was developed in 1990. DEC, VAX, TK50 and VMS are trademarks of Digital Equipment Corporation.
SEGY to ASCII: Conversion and Plotting Program
Goldman, Mark R.
1999-01-01
This report documents a computer program to convert standard 4-byte IBM floating-point SEGY files to ASCII xyz format. The program then optionally plots the seismic data using the GMT plotting package. The material for this publication is contained in a standard tar file (of99-126.tar) that is uncompressed and 726 K in size. It can be downloaded by any Unix machine. Move the tar file to the directory you wish to use it in, then type 'tar xvf of99-126.tar'. The archive files (and diskette) contain a NOTE file, a README file, a version-history file, source code, a makefile for easy compilation, and an ASCII version of the documentation. The archive files (and diskette) also contain example test files, including a typical SEGY file along with the resulting ASCII xyz and postscript files. The only requirement for compiling the source code into an executable is a C++ compiler. The program has been successfully compiled using Gnu's g++ version 2.8.1, and use of other compilers may require modifications to the existing source code. The g++ compiler is a free, high quality C++ compiler and may be downloaded from the ftp site: ftp://ftp.gnu.org/gnu The requirement for plotting the seismic data is the existence of the GMT plotting package. The GMT plotting package may be downloaded from the web site: http://www.soest.hawaii.edu/gmt/
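The core conversion step, decoding 4-byte IBM System/360 floating-point samples into ordinary floats, can be sketched as follows; the sketch ignores SEGY headers and trace geometry, and its names are illustrative rather than taken from the USGS source code.

# Illustrative IBM-float decoding for classic SEGY trace samples.
import struct

def ibm32_to_float(word):
    """Convert one big-endian 32-bit IBM float to a Python float."""
    (u,) = struct.unpack(">I", word)
    sign = -1.0 if (u >> 31) & 1 else 1.0
    exponent = (u >> 24) & 0x7F          # excess-64, base-16 exponent
    fraction = (u & 0x00FFFFFF) / float(1 << 24)
    return sign * fraction * 16.0 ** (exponent - 64)

def trace_to_ascii(raw_samples, x, dz=1.0):
    """Emit 'x y amplitude' lines for one trace of packed IBM-float samples."""
    lines = []
    for i in range(0, len(raw_samples), 4):
        value = ibm32_to_float(raw_samples[i:i + 4])
        lines.append(f"{x:.2f} {i // 4 * dz:.2f} {value:.6g}")
    return "\n".join(lines)

# 0x41100000 encodes 1.0 in IBM format: sign 0, exponent 65, fraction 1/16
print(ibm32_to_float(bytes.fromhex("41100000")))  # 1.0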
As-built design specification for PARCLS
NASA Technical Reports Server (NTRS)
Tompkins, M. A. (Principal Investigator)
1981-01-01
The PARCLS program, part of the CLASFYG package, reads a parameter file created by the CLASFYG program and a pure pixel ground truth file in order to create a classification file of three separate crop categories in universal format.
Documenting AUTOGEN and APGEN Model Files
NASA Technical Reports Server (NTRS)
Gladden, Roy E.; Khanampompan, Teerapat; Fisher, Forest W.; DelGuericio, Chris c.
2008-01-01
A computer program called "autogen hypertext map generator" satisfies a need for documenting and assisting in visualization of, and navigation through, model files used in the AUTOGEN and APGEN software mentioned in the two immediately preceding articles. This program parses autogen script files, autogen model files, PERL scripts, and apgen activity-definition files and produces a hypertext map of the files to aid in the navigation of the model. This program also provides a facility for adding notes and descriptions, beyond what is in the source model represented by the hypertext map. Further, this program provides access to a summary of the model through variable, function, subroutine, activity and resource declarations as well as providing full access to the source model and source code. The use of the tool enables easy access to the declarations and the ability to traverse routines and calls while analyzing the model.
Buffer$--An Economic Analysis Tool
Gary Bentrup
2007-01-01
Buffer$ is an economic spreadsheet tool that lets resource professionals analyze the costs and benefits of conservation buffers. Conservation buffers are linear strips of vegetation managed for multiple landowner and societal objectives. The Microsoft Excel-based spreadsheet can calculate potential income derived from a buffer, including income from cost-share/incentive...
A spreadsheet that calculates meteor orbits
NASA Astrophysics Data System (ADS)
Langbroek, M.
2004-08-01
The author has written an MS Excel spreadsheet application called Metorb08.xls which calculates a meteor's orbital elements from its apparent radiant position and initial speed. It can be downloaded from URL http://home.wanadoo.nl/marco.langbroek along with a suite of other meteor-related Excel applications.
Automated Formative Feedback and Summative Assessment Using Individualised Spreadsheet Assignments
ERIC Educational Resources Information Center
Blayney, Paul; Freeman, Mark
2004-01-01
This paper reports on the effects of automating formative feedback at the student's discretion and automating summative assessment with individualised spreadsheet assignments. Quality learning outcomes are achieved when students adopt deep approaches to learning (Ramsden, 2003). Learning environments designed to align assessment to learning…
The Devil and Daniel's Spreadsheet
ERIC Educational Resources Information Center
Burke, Maurice J.
2012-01-01
"When making mathematical models, technology is valuable for varying assumptions, exploring consequences, and comparing predictions with data," notes the Common Core State Standards Initiative (2010, p. 72). This exploration of the recursive process in the Devil and Daniel Webster problem reveals that the symbolic spreadsheet fits this bill.…
Hydrogen Financial Analysis Scenario Tool (H2FAST) Documentation
User documentation is provided for the web and spreadsheet versions of H2FAST: the H2FAST Web Tool User's Manual and the H2FAST Spreadsheet Tool User's Manual (draft). For technical support, send questions or feedback about H2FAST to H2FAST@nrel.gov.
When Spreadsheets Become Software - Quality Control Challenges and Approaches - 13360
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fountain, Stefanie A.; Chen, Emmie G.; Beech, John F.
2013-07-01
As part of a preliminary waste acceptance criteria (PWAC) development, several commercial models were employed, including the Hydrologic Evaluation of Landfill Performance model (HELP) [1], the Disposal Unit Source Term - Multiple Species model (DUSTMS) [2], and the Analytical Transient One, Two, and Three-Dimensional model (AT123D) [3]. The results of these models were post-processed in MS Excel spreadsheets to convert the model results to alternate units, compare the groundwater concentrations to the groundwater concentration thresholds, and then to adjust the waste contaminant masses (based on average concentration over the waste volume) as needed in an attempt to achieve groundwater concentrations at the limiting point of assessment that would meet the compliance concentrations while maximizing the potential use of the landfill (i.e., maximizing the volume of projected waste being generated that could be placed in the landfill). During the course of the PWAC calculation development, one of the Microsoft (MS) Excel spreadsheets used to post-process the results of the commercial model packages grew to include more than 575,000 formulas across 18 worksheets. This spreadsheet was used to assess six base scenarios as well as nine uncertainty/sensitivity scenarios. The complexity of the spreadsheet resulted in the need for a rigorous quality control (QC) procedure to verify data entry and confirm the accuracy of formulas. (authors)
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-16
... program will expire on December 10, 2010. The instant rule filing proposes to extend the pilot program... that the duration of this pilot program should be extended. Accordingly, pursuant to the instant rule... respect to the proposed rule change that are filed with the Commission, and all written communications...
Computerised curve deconvolution of TL/OSL curves using a popular spreadsheet program.
Afouxenidis, D; Polymeris, G S; Tsirliganis, N C; Kitis, G
2012-05-01
This paper exploits the possibility of using commercial software for thermoluminescence and optically stimulated luminescence curve deconvolution analysis. The widely used software package Microsoft Excel, with the Solver utility, has been used to perform deconvolution analysis on both experimental and reference glow curves resulting from the GLOw Curve ANalysis INtercomparison project. The simple interface of this programme, combined with the powerful Solver utility, allows the analysis of complex stimulated luminescence curves into their components and the evaluation of the associated luminescence parameters.
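A minimal sketch of least-squares curve deconvolution in the same spirit is shown below, with simple Gaussian components standing in for the first-order kinetics peak shapes used in the paper; peak positions, widths, and the synthetic data are illustrative only.

# Illustrative deconvolution of a synthetic glow curve into two components.
import numpy as np
from scipy.optimize import least_squares

def gaussian(t, height, center, width):
    return height * np.exp(-0.5 * ((t - center) / width) ** 2)

def model(params, t, n_peaks):
    p = np.reshape(params, (n_peaks, 3))
    return sum(gaussian(t, *row) for row in p)

temperature = np.linspace(300, 550, 400)            # K
observed = (gaussian(temperature, 1.0, 380, 15) +
            gaussian(temperature, 0.6, 450, 20) +
            np.random.default_rng(0).normal(0, 0.01, temperature.size))

n_peaks = 2
start = [0.8, 370, 10, 0.5, 460, 25]                # initial guesses per peak
fit = least_squares(lambda p: model(p, temperature, n_peaks) - observed,
                    x0=start, bounds=(0, np.inf))
print(np.reshape(fit.x, (n_peaks, 3)))              # height, center, width per peak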
2011-04-30
internal constructs useful for management, through lexical link analysis (LLA)? LLA Methodology Can Help! Warfighters RDTE...categories of interest in various spreadsheets). This year, we started to develop LLA from a demonstration to an operational capability and facilitate a
2006-03-01
utilized; for normality, the Shapiro-Wilk test; and finally, for constant variance the Breusch-Pagan test was used. The Durbin-Watson test results...against this violation, it is of concern with respect to the validity of this model. In order to execute the Breusch-Pagan test, it is necessary to obtain...an SSE of 23,297.73. The p-value for the Breusch-Pagan test was obtained via use of a spreadsheet program (Microsoft's Excel) and the expression
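For reference, a minimal sketch of the Breusch-Pagan procedure (regress the squared residuals on the regressors and compare n times R-squared to a chi-square distribution) is given below with toy data; it is not the spreadsheet expression used in the report.

# Illustrative Breusch-Pagan test for heteroskedasticity on toy data.
import numpy as np
from scipy.stats import chi2

def breusch_pagan(X, y):
    Xd = np.column_stack([np.ones(len(y)), X])           # add intercept
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    resid = y - Xd @ beta
    u2 = resid ** 2
    gamma, *_ = np.linalg.lstsq(Xd, u2, rcond=None)       # auxiliary regression
    fitted = Xd @ gamma
    r2 = 1.0 - np.sum((u2 - fitted) ** 2) / np.sum((u2 - u2.mean()) ** 2)
    lm = len(y) * r2                                       # Lagrange multiplier statistic
    df = Xd.shape[1] - 1
    return lm, chi2.sf(lm, df)                             # statistic, p-value

rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 200)
y = 2.0 + 0.5 * x + rng.normal(0, 0.2 + 0.1 * x)           # variance grows with x
print(breusch_pagan(x.reshape(-1, 1), y))                  # small p-value expected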
Introducing Artificial Neural Networks through a Spreadsheet Model
ERIC Educational Resources Information Center
Rienzo, Thomas F.; Athappilly, Kuriakose K.
2012-01-01
Business students taking data mining classes are often introduced to artificial neural networks (ANN) through point and click navigation exercises in application software. Even if correct outcomes are obtained, students frequently do not obtain a thorough understanding of ANN processes. This spreadsheet model was created to illuminate the roles of…
Introducing Simulation via the Theory of Records
ERIC Educational Resources Information Center
Johnson, Arvid C.
2011-01-01
While spreadsheet simulation can be a useful method by which to help students to understand some of the more advanced concepts in an introductory statistics course, introducing the simulation methodology at the same time as these concepts can result in student cognitive overload. This article describes a spreadsheet model that has been…
The Spreadsheet in an Educational Setting. Microcomputing Working Paper Series F 84-4.
ERIC Educational Resources Information Center
Wozny, Lucy
This overview of a specific spreadsheet, Microsoft's Multiplan for the Apple Macintosh microcomputer, emphasizes specific features that are important to the academic community, including the mathematical functions of algebra, trigonometry, and statistical analysis. Additional features are summarized, including data formats for both numerical and…
Forming Conjectures within a Spreadsheet Environment
ERIC Educational Resources Information Center
Calder, Nigel; Brown, Tony; Hanley, Una; Darby, Susan
2006-01-01
This paper is concerned with the use of spreadsheets within mathematical investigational tasks. Considering the learning of both children and pre-service teaching students, it examines how mathematical phenomena can be seen as a function of the pedagogical media through which they are encountered. In particular, it shows how pedagogical apparatus…
Domestic Disasters and Geospatial Technology for the Defense Logistics Agency
2014-12-01
total distance traveled and satisfy all fuel demands. This report used the Vehicle Routing Problem (VRP) Spreadsheet Solver, developed by Erdogan...Security Affairs, 2(2), 5–10. Erdogan, G. (2013). VRP spreadsheet solver. Retrieved from VeRoLog: EURO Working Group on Vehicle Routing and Logistics
Interactive Spreadsheets in JCE Webware
ERIC Educational Resources Information Center
Coleman, William F.; Fedosky, Edward W.
2005-01-01
A description of the Microsoft Excel spreadsheet simulation Anharmonicity.xls, which can be used to smoothly and continuously switch between a plotted function and its quadratic approximation, is presented. It can be used in a classroom demonstration or incorporated into a student-centered computer-laboratory exercise to examine the qualitative behavior of…
Spreadsheet Applications: Prototyping an Innovative Blended Course
ERIC Educational Resources Information Center
Baker, J. Howard
2004-01-01
After teaching the advanced spreadsheet course at a major university in Louisiana as a traditional classroom course for a number of years, it was decided to create a prototype-blended course, with a considerable portion offered via distance education. This research, which uses a prototyping methodology, is exploratory in nature. Prototyping can…
LOTUS 1-2-3 and Decision Support: Allocating the Monograph Budget.
ERIC Educational Resources Information Center
Perry-Holmes, Claudia
1985-01-01
Describes the use of electronic spreadsheet software for library decision support systems using personal computers. Discussion covers templates, formulas for allocating the materials budget, LOTUS 1-2-3 and budget allocations, choosing a formula, the spreadsheet itself, graphing capabilities, and advantages and disadvantages of templates. Six…
Triangular Plots and Spreadsheet Software.
ERIC Educational Resources Information Center
Holm, Paul Eric
1988-01-01
Describes how the limitations of the built-in graphics capabilities of spreadsheet software can be overcome by making full use of the flexibility of the graphics options. Uses triangular plots with labeled field boundaries produced using Lotus 1-2-3 to demonstrate these techniques and their use in teaching geology. (CW)
Calculating the Variables of Finance on a Spreadsheet.
ERIC Educational Resources Information Center
Rochowicz, John A., Jr.
The different approaches for solving problems and learning mathematics with technology are invaluable. This paper describes how to determine the variables of the ordinary annuity equation with a spreadsheet. Examples of future value of annuity, sinking fund annuity, the number of periods necessary for periodic payments plus interest to accumulate…
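For reference, the standard formulas behind such spreadsheet calculations (not taken from the paper) are, in LaTeX notation,

FV = PMT \cdot \frac{(1+i)^{n}-1}{i}, \qquad PMT = FV \cdot \frac{i}{(1+i)^{n}-1},

where FV is the future value of an ordinary annuity, PMT the periodic payment (the sinking-fund payment in the second form), i the periodic interest rate, and n the number of periods.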
Buyers Guide: Communications Software--Overview; Ratings Digest; Reviews; Benchmarks.
ERIC Educational Resources Information Center
Lockwood, Russ; And Others
1988-01-01
Contains articles which review communications software. Includes "Crosstalk Mark 4,""ProComm,""Freeway Advanced,""Windows InTalk,""Relay Silver," and "Smartcom III." Compares in terms of text proprietary, MCI upload, Test ASCII, Spreadsheet Proprietary, Text XMODEM, Spreadsheet XMODEM, MCI Download, Documentation, Support and Service, ease of use,…
Spreadsheet Analysis of Harvesting Systems
R.B. Rummer; B.L. Lanford
1987-01-01
Harvesting systems can be modeled and analyzed on microcomputers using commercially available "spreadsheet" software. The effect of system or external variables on the production rate or system cost can be evaluated and alternative systems can be easily examined. The tedious calculations associated with such analyses are performed by the computer. For users...
Constructing Meanings and Utilities within Algebraic Tasks
ERIC Educational Resources Information Center
Ainley, Janet; Bills, Liz; Wilson, Kirsty
2004-01-01
The Purposeful Algebraic Activity project aims to explore the potential of spreadsheets in the introduction to algebra and algebraic thinking. We discuss two sub-themes within the project: tracing the development of pupils' construction of meaning for variable from arithmetic-based activity, through use of spreadsheets, and into formal algebra,…
75 FR 27984 - Broadband Technology Opportunities Program
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-19
....: 0907141137-0222-10] RIN 0660-ZA28 Broadband Technology Opportunities Program AGENCY: National...; Reopening of Application Filing Window for Broadband Technology Opportunities Program Comprehensive... filing window for the Broadband Technology Opportunities Program (BTOP) that the agency established...
Evaluating the Effectiveness of the 2000-2001 NASA "Why?" Files Program
NASA Technical Reports Server (NTRS)
Pinelli, Thomas E.; Frank, Kari Lou; Ashcroft, Scott B.; Williams, Amy C.
2002-01-01
NASA 'Why?' Files, a research and standards-based, Emmy-award winning series of 60-minute instructional programs for grades 3-5, introduces students to NASA; integrates mathematics, science, and technology by using Problem-Based Learning (PBL), scientific inquiry, and the scientific method; and motivates students to become critical thinkers and active problem solvers. All four 2000-2001 NASA 'Why?' Files programs include an instructional broadcast, a lesson guide, an interactive web site, plus numerous instructional resources. In March 2001, 1,000 randomly selected program registrants participated in a survey. Of these surveys, 185 (154 usable) met the established cut-off date. Respondents reported that (1) they used the four programs in the 2000-2001 NASA 'Why?' Files series; (2) series goals and objectives were met; (3) programs met national mathematics, science, and technology standards; (4) program content was developmentally appropriate for grade level; and (5) programs enhanced/enriched the teaching of mathematics, science, and technology.
Simulation of axonal excitability using a Spreadsheet template created in Microsoft Excel.
Brown, A M
2000-08-01
The objective of this present study was to implement an established simulation protocol (A.M. Brown, A methodology for simulating biological systems using Microsoft Excel, Comp. Methods Prog. Biomed. 58 (1999) 181-90) to model axonal excitability. The simulation protocol involves the use of in-cell formulas directly typed into a spreadsheet and does not require any programming skills or use of the macro language. Once the initial spreadsheet template has been set up the simulations described in this paper can be executed with a few simple keystrokes. The model axon contained voltage-gated ion channels that were modeled using Hodgkin Huxley style kinetics. The basic properties of axonal excitability modeled were: (1) threshold of action potential firing, demonstrating that not only are the stimulus amplitude and duration critical in the generation of an action potential, but also the resting membrane potential; (2) refractoriness, the phenomenon of reduced excitability immediately following an action potential. The difference between the absolute refractory period, when no amount of stimulus will elicit an action potential, and relative refractory period, when an action potential may be generated by applying increased stimulus, was demonstrated with regard to the underlying state of the Na(+) and K(+) channels; (3) temporal summation, a process by which two sub-threshold stimuli can unite to elicit an action potential was shown to be due to conductance changes outlasting the first stimulus and summing with the second stimulus-induced conductance changes to drive the membrane potential past threshold; (4) anode break excitation, where membrane hyperpolarization was shown to produce an action potential by removing Na(+) channel inactivation that is present at resting membrane potential. The simulations described in this paper provide insights into mechanisms of axonal excitation that can be carried out by following an easily understood protocol.
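A minimal Python sketch of the same idea, a Hodgkin-Huxley-style membrane advanced by simple forward-Euler steps analogous to row-by-row in-cell formulas, is given below; the classic squid-axon parameters are standard, but the stimulus values are illustrative and the code does not reproduce the author's spreadsheet template.

# Illustrative Hodgkin-Huxley-style membrane with forward-Euler integration.
import numpy as np

def alpha_beta(v):
    an = 0.01 * (v + 55.0) / (1.0 - np.exp(-(v + 55.0) / 10.0))
    bn = 0.125 * np.exp(-(v + 65.0) / 80.0)
    am = 0.1 * (v + 40.0) / (1.0 - np.exp(-(v + 40.0) / 10.0))
    bm = 4.0 * np.exp(-(v + 65.0) / 18.0)
    ah = 0.07 * np.exp(-(v + 65.0) / 20.0)
    bh = 1.0 / (1.0 + np.exp(-(v + 35.0) / 10.0))
    return an, bn, am, bm, ah, bh

def simulate(i_stim=20.0, t_on=5.0, t_off=7.0, dt=0.01, t_end=30.0):
    gna, gk, gl = 120.0, 36.0, 0.3           # mS/cm^2
    ena, ek, el, cm = 50.0, -77.0, -54.4, 1.0
    v, n, m, h = -65.0, 0.32, 0.05, 0.6       # approximate resting state
    trace = []
    for step in range(int(t_end / dt)):
        t = step * dt
        an, bn, am, bm, ah, bh = alpha_beta(v)
        n += dt * (an * (1 - n) - bn * n)
        m += dt * (am * (1 - m) - bm * m)
        h += dt * (ah * (1 - h) - bh * h)
        i_ion = gna * m**3 * h * (v - ena) + gk * n**4 * (v - ek) + gl * (v - el)
        i_ext = i_stim if t_on <= t < t_off else 0.0
        v += dt * (i_ext - i_ion) / cm        # membrane equation, one row per step
        trace.append((t, v))
    return trace

peak = max(v for _, v in simulate())
print(f"Peak membrane potential: {peak:.1f} mV")  # well above 0 mV if an action potential fires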
UNIX-BASED DATA MANAGEMENT SYSTEM FOR PROPAGATION EXPERIMENTS
NASA Technical Reports Server (NTRS)
Kantak, A. V.
1994-01-01
This collection of programs comprises The UNIX Based Data Management System for the Pilot Field Experiment (PiFEx) which is an attempt to mimic the Mobile Satellite (MSAT) scenario. The major purposes of PiFEx are to define the mobile communications channels and test the workability of new concepts used to design various components of the receiver system. The results of the PiFex experiment are large amounts of raw data which must be accessed according to a researcher's needs. This package provides a system to manage the PiFEx data in an interactive way. The system not only provides the file handling necessary to retrieve the desired data, but also several FORTRAN programs to generate some standard results pertaining to propagation data. This package assumes that the data file initially generated from the experiment has been already converted from binary to ASCII format. The Data Management system described here consists of programs divided into two categories: those programs that handle the PiFEx generated files and those that are used for number-crunching of these files. Five FORTRAN programs and one UNIX shell script file are used for file manipulation purposes. These activities include: calibration of the acquired data; and parsing of the large data file into datasets concerned with different aspects of the experiment such as the specific calibrated propagation data, dynamic and static loop error data, statistical data, and temperature and spatial data on the hardware used in the experiment. The five remaining FORTRAN programs are used to generate usable information about the data. Signal level probability, probability density of the signal fitting the Rician density function, frequency of the data's fade duration, and the Fourier transform of the data can all be generated from these data manipulation programs. In addition, a program is provided which generates a downloadable file from the signal levels and signal phases files for use with the plotting routine AKPLOT (NPO-16931). All programs in this package are written in either FORTRAN-77 or UNIX shell-scripts. The package does not include test data. The programs were developed in 1987 for use with a UNIX operating system on a DEC MicroVAX computer.
Xiang, Zuoshuang; Zheng, Jie; Lin, Yu; He, Yongqun
2015-01-01
It is time-consuming to build an ontology with many terms and axioms. Thus it is desired to automate the process of ontology development. Ontology Design Patterns (ODPs) provide a reusable solution to solve a recurrent modeling problem in the context of ontology engineering. Because ontology terms often follow specific ODPs, the Ontology for Biomedical Investigations (OBI) developers proposed a Quick Term Templates (QTTs) process targeted at generating new ontology classes following the same pattern, using term templates in a spreadsheet format. Inspired by the ODPs and QTTs, the Ontorat web application is developed to automatically generate new ontology terms, annotations of terms, and logical axioms based on a specific ODP(s). The inputs of an Ontorat execution include axiom expression settings, an input data file, ID generation settings, and a target ontology (optional). The axiom expression settings can be saved as a predesigned Ontorat setting format text file for reuse. The input data file is generated based on a template file created by a specific ODP (text or Excel format). Ontorat is an efficient tool for ontology expansion. Different use cases are described. For example, Ontorat was applied to automatically generate over 1,000 Japan RIKEN cell line cell terms with both logical axioms and rich annotation axioms in the Cell Line Ontology (CLO). Approximately 800 licensed animal vaccines were represented and annotated in the Vaccine Ontology (VO) by Ontorat. The OBI team used Ontorat to add assay and device terms required by the ENCODE project. Ontorat was also used to add missing annotations to all existing Biobank-specific terms in the Biobank Ontology. A collection of ODPs and templates with examples are provided on the Ontorat website and can be reused to facilitate ontology development. With ever-increasing ontology development and applications, Ontorat provides a timely platform for generating and annotating a large number of ontology terms by following design patterns. http://ontorat.hegroup.org/.
Multidate Landsat lake quality monitoring program
NASA Technical Reports Server (NTRS)
Fisher, L. T.; Scarpace, F. L.; Thomsen, R. G.
1979-01-01
A unified package of files and programs has been developed to automate the multidate Landsat-derived analyses of water quality for about 3000 inland lakes throughout Wisconsin. A master lakes file which stores geographic information on the lakes, a file giving the latitudes and longitudes of control points for scene navigation, and a program to estimate control point locations and produce microfiche character maps for scene navigation are among the files and programs of the system. The use of ground coordinate systems to isolate irregularly shaped areas that can be accessed at will appears to provide an economical means of restricting the size of the data set.
OntoMaton: a bioportal powered ontology widget for Google Spreadsheets.
Maguire, Eamonn; González-Beltrán, Alejandra; Whetzel, Patricia L; Sansone, Susanna-Assunta; Rocca-Serra, Philippe
2013-02-15
Data collection in spreadsheets is ubiquitous, but current solutions lack support for collaborative semantic annotation that would promote shared and interdisciplinary annotation practices, supporting geographically distributed players. OntoMaton is an open-source solution that brings ontology lookup and tagging capabilities into a cloud-based collaborative editing environment, harnessing Google Spreadsheets and the NCBO Web services. It is a general-purpose, format-agnostic tool that may serve as a component of the ISA software suite. OntoMaton can also be used to assist the ontology development process. OntoMaton is freely available from Google widgets under the CPAL open source license; documentation and examples are available at: https://github.com/ISA-tools/OntoMaton.
Forming conjectures within a spreadsheet environment
NASA Astrophysics Data System (ADS)
Calder, Nigel; Brown, Tony; Hanley, Una; Darby, Susan
2006-12-01
This paper is concerned with the use of spreadsheets within mathematical investigational tasks. Considering the learning of both children and pre-service teaching students, it examines how mathematical phenomena can be seen as a function of the pedagogical media through which they are encountered. In particular, it shows how pedagogical apparatus influence patterns of social interaction, and how this interaction shapes the mathematical ideas that are engaged with. Notions of conjecture, along with the particular faculty of the spreadsheet setting, are considered with regard to the facilitation of mathematical thinking. Employing an interpretive perspective, a key focus is on how alternative pedagogical media and associated discursive networks influence the way that students form and test informal conjectures.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-06
... resulting from the Department's implementation of an electronic filing and documents management program... regulations that is entitled "IA ACCESS Handbook On Electronic Filing Procedures" ("IA ACCESS Handbook... management program named Import Administration Antidumping and Countervailing Duty Centralized Electronic...
ERIC Educational Resources Information Center
Lai, Chiu-Lin; Hwang, Gwo-Jen
2015-01-01
In this study, a spreadsheet-based visualized Mindtool was developed for improving students' learning performance when finding relationships between numerical variables by engaging them in reasoning and decision-making activities. To evaluate the effectiveness of the proposed approach, an experiment was conducted on the "phenomena of climate…
A Computer Spreadsheet for Locating Assistive Devices.
ERIC Educational Resources Information Center
Palmer, Catherine V.; Garstecki, Dean C.
1988-01-01
The article presents a directory of assistive devices for persons with hearing impairments in a grid format by distributor and type of device (alerting devices, telephone, TV/radio/stereo, personal communication, group communication, and other). The product locator is also available in spreadsheet form for either the Macintosh or IBM-PC computers.…
Working Together: Google Apps Goes to School
ERIC Educational Resources Information Center
Oishi, Lindsay
2007-01-01
Online collaboration and project-management tools allow people to work together without being in the same place at the same time. That is not all, however: Google Docs & Spreadsheets, for example, allows the creation of documents and spreadsheets just as in Microsoft Word and Excel, but with more collaborative capacity. Google Calendar lets…
Simulating Satellite and Space Probe Motion at High School with Spreadsheets
ERIC Educational Resources Information Center
Benacka, Jan
2017-01-01
This paper gives an account of an experiment in which thirty-three high school students of ages 17-19 developed spreadsheet numerical models of satellite and space probe motion. The models are free to download. A survey was carried out to find out the students' opinion of the lessons.
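The students' spreadsheet models are not reproduced here, but the row-by-row update such models rely on can be sketched in Python: a semi-implicit Euler step for two-dimensional orbital motion, with illustrative constants for a low Earth orbit (all values here are assumptions, not taken from the paper).

    import math

    GM = 3.986004418e14        # Earth's gravitational parameter, m^3/s^2

    def integrate_orbit(x, y, vx, vy, dt=1.0, steps=6000):
        """Semi-implicit Euler steps, mirroring successive spreadsheet rows."""
        rows = [(0.0, x, y)]
        for k in range(1, steps + 1):
            r = math.hypot(x, y)
            ax, ay = -GM * x / r**3, -GM * y / r**3
            vx, vy = vx + ax * dt, vy + ay * dt   # update velocity first...
            x, y = x + vx * dt, y + vy * dt       # ...then position (more stable)
            rows.append((k * dt, x, y))
        return rows

    # Example: roughly circular orbit at 400 km altitude.
    r0 = 6.371e6 + 4.0e5
    trajectory = integrate_orbit(r0, 0.0, 0.0, math.sqrt(GM / r0))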
Using Spreadsheets to Teach Aspects of Biology Involving Mathematical Models
ERIC Educational Resources Information Center
Carlton, Kevin; Nicholls, Mike; Ponsonby, David
2004-01-01
Some aspects of biology, for example the Hardy-Weinberg simulation of population genetics or modelling heat flow in lizards, have an undeniable mathematical basis. Students can find the level of mathematical skill required to deal with such concepts to be an insurmountable hurdle to understanding. If not used effectively, spreadsheet models…
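For the Hardy-Weinberg example mentioned in the abstract, a spreadsheet model typically tracks allele frequency generation by generation; the hypothetical Python sketch below does the same, drawing a finite number of gametes each generation so that sampling drift appears alongside the expected p^2, 2pq, q^2 genotype proportions (parameter values are illustrative).

    import random

    def hardy_weinberg(p, generations=10, population=500):
        """Track allele frequency p across generations with random mating.

        Each generation draws 2N gametes, each carrying allele A with
        probability p; the finite draw adds genetic drift to the otherwise
        constant Hardy-Weinberg expectation.
        """
        history = [p]
        for _ in range(generations):
            a_count = sum(random.random() < p for _ in range(2 * population))
            p = a_count / (2 * population)
            history.append(p)
        return history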
Using Spreadsheet Modeling Techniques for Capital Project Review. AIR 1985 Annual Forum Paper.
ERIC Educational Resources Information Center
Kaynor, Robert K.
The value of microcomputer modeling tools and spreadsheets to help college institutional researchers analyze proposed capital projects is discussed, along with strengths and weaknesses of different software packages. Capital budgeting is the analysis that supports decisions about the allocation and commitment of funds to long-term capital…
Negative Effects of Learning Spreadsheet Management on Learning Database Management
ERIC Educational Resources Information Center
Vágner, Anikó; Zsakó, László
2015-01-01
Many students learn spreadsheet management before database management. The similarities between the two can cause a number of negative effects when students go on to learn database management. In this article, we consider these similarities and explain what can cause problems. First, we analyse the basic concepts such as table, database, row, cell, reference, etc. Then, we…
Spreadsheets as a Transparent Resource for Learning the Mathematics of Annuities
ERIC Educational Resources Information Center
Pournara, Craig
2009-01-01
The ability of mathematics teachers to decompress mathematics and to move between representations are two key features of mathematical knowledge that is usable for teaching. This article reports on four pre-service secondary mathematics teachers learning the mathematics of annuities. In working with spreadsheets students began to make sense of…
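The annuity mathematics the article refers to can be made concrete in two equivalent ways, mirroring the transparent spreadsheet approach: the closed-form future value FV = P((1+i)^n - 1)/i and a period-by-period accumulation. A brief Python sketch with illustrative figures:

    def annuity_future_value(payment, rate, periods):
        """Closed-form future value of an ordinary annuity: P * ((1+i)^n - 1) / i."""
        return payment * ((1 + rate) ** periods - 1) / rate

    def annuity_by_rows(payment, rate, periods):
        """Period-by-period accumulation, as successive spreadsheet rows show it."""
        balance = 0.0
        for _ in range(periods):
            balance = balance * (1 + rate) + payment
        return balance

    # Both agree, e.g. 1,200 deposited yearly at 6% for 10 years.
    assert abs(annuity_future_value(1200, 0.06, 10) - annuity_by_rows(1200, 0.06, 10)) < 1e-6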
A Simple Spreadsheet Strikes a Nerve among Adjuncts
ERIC Educational Resources Information Center
Stratford, Michael
2012-01-01
Energized by his fellow adjunct professors who had gathered for a national meeting last month in Washington, District of Columbia, Joshua A. Boldt flew home to Athens, Georgia, opened his laptop, and created a Google document. On his personal blog, the writing instructor implored colleagues to contribute to the publicly editable spreadsheet,…
Studying Faculty Flows Using an Interactive Spreadsheet Model. AIR 1997 Annual Forum Paper.
ERIC Educational Resources Information Center
Kelly, Wayne
This paper describes a spreadsheet-based faculty flow model developed and implemented at the University of Calgary (Canada) to analyze faculty retirement, turnover, and salary issues. The study examined whether, given expected faculty turnover, the current salary increment system was sustainable in a stable or declining funding environment, and…
Transition Matrices: A Tool to Assess Student Learning and Improve Instruction
ERIC Educational Resources Information Center
Morris, Gary A.; Walter, Paul; Skees, Spencer; Schwartz, Samantha
2017-01-01
This paper introduces a new spreadsheet tool for adoption by high school or college-level physics teachers who use common assessments in a pre-instruction/post-instruction mode to diagnose student learning and teaching effectiveness. The spreadsheet creates a simple matrix that identifies the percentage of students who select each possible…
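The transition matrix itself is straightforward to compute outside a spreadsheet as well; the following hypothetical Python sketch tallies, for one assessment item, the percentage of students moving from each pre-instruction answer choice to each post-instruction choice (answer labels and data are made up, not taken from the paper).

    from collections import Counter

    def transition_matrix(pre, post, choices=("A", "B", "C", "D", "E")):
        """Percentage of students selecting pre-answer i and post-answer j.

        pre, post : equal-length sequences of answer choices, one per student,
        for a single common-assessment item given before and after instruction.
        """
        pairs = Counter(zip(pre, post))
        n = len(pre)
        return [[100.0 * pairs[(i, j)] / n for j in choices] for i in choices]

    # Example for one item with five answer options and five students:
    matrix = transition_matrix(list("ABBCA"), list("AABCA"))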
ERIC Educational Resources Information Center
Agyei, Douglas D.; Voogt, Joke M.
2016-01-01
In this study, 12 pre-service mathematics teachers worked in teams to develop their knowledge and skills in using teacher-led spreadsheet demonstrations to help students explore mathematics concepts, stimulate discussions and perform authentic tasks through activity-based lessons. Pre-service teachers' lesson plans, their instruction of the…
Examining Errors in Simple Spreadsheet Modeling from Different Research Perspectives
ERIC Educational Resources Information Center
Kadijevich, Djordje M.
2012-01-01
By using a sample of 1st-year undergraduate business students, this study dealt with the development of simple (deterministic and non-optimization) spreadsheet models of income statements within an introductory course on business informatics. The study examined students' errors in doing this for business situations of their choice and found three…
Using Spreadsheets to Discover Meaning for Parameters in Nonlinear Models
ERIC Educational Resources Information Center
Green, Kris H.
2008-01-01
This paper explores the use of spreadsheets to develop an exploratory environment where mathematics students can develop their own understanding of the parameters of commonly encountered families of functions: linear, logarithmic, exponential and power. The key to this understanding involves opening up the definition of rate of change from the…
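One common spreadsheet route to the rate-of-change idea the paper describes is to tabulate a function at equally spaced inputs and compare successive differences and ratios; the brief Python sketch below (illustrative data, not from the paper) shows how constant differences point to a linear family and constant ratios to an exponential one.

    def successive_changes(values):
        """Differences and ratios between consecutive tabulated function values.

        For equally spaced x: constant differences suggest a linear model,
        constant ratios an exponential one; power and logarithmic families
        emerge after log-transforming x as well.
        """
        diffs = [b - a for a, b in zip(values, values[1:])]
        ratios = [b / a for a, b in zip(values, values[1:]) if a != 0]
        return diffs, ratios

    # y = 3 * 2^x sampled at x = 0..4: differences grow, ratios are constant (2).
    diffs, ratios = successive_changes([3, 6, 12, 24, 48])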
Using a Spreadsheet Scroll Bar to Solve Equilibrium Concentrations
ERIC Educational Resources Information Center
Raviolo, Andres
2012-01-01
A simple, conceptual method is described for using the spreadsheet scroll bar to find the composition of a system at chemical equilibrium. Simulation of any kind of chemical equilibrium can be carried out using this method, and the effects of different disturbances can be predicted. This simulation, which can be used in general chemistry…
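The scroll-bar idea can be imitated programmatically: sweep the extent of reaction and report where the reaction quotient matches the equilibrium constant. The Python sketch below uses a hypothetical A <=> 2B system with made-up initial concentrations and K, standing in for whatever equilibrium the spreadsheet simulates.

    def equilibrium_extent(k_eq, a0=1.0, b0=0.0, steps=10001):
        """Scan the extent of reaction for A <=> 2B, as a scroll bar would.

        Concentrations: [A] = a0 - x, [B] = b0 + 2x, Q = [B]^2 / [A].
        Returns (Q, x, [A], [B]) where Q is closest to K.
        """
        best = None
        for i in range(steps):
            x = a0 * i / (steps - 1)          # slider position mapped onto 0..a0
            a, b = a0 - x, b0 + 2 * x
            if a <= 0:
                break
            q = b * b / a
            if best is None or abs(q - k_eq) < abs(best[0] - k_eq):
                best = (q, x, a, b)
        return best

    # Example: K = 0.5 with 1.0 M of A initially.
    q, x, conc_a, conc_b = equilibrium_extent(0.5)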
Evolving Polygons and Spreadsheets: Connecting Mathematics across Grade Levels in Teacher Education
ERIC Educational Resources Information Center
Abramovich, Sergei; Brouwer, Peter
2009-01-01
This paper was prepared in response to the Conference Board of Mathematical Sciences recommendations for the preparation of secondary teachers. It shows how using trigonometry as a conceptual tool in spreadsheet-based applications enables one to develop mathematical understanding in the context of constructing geometric representations of unit…
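A minimal sketch of the kind of trigonometric construction the paper builds in spreadsheets: vertices of a regular n-gon inscribed in the unit circle, whose perimeter and area approach those of the circle as n grows (Python, illustrative only).

    import math

    def unit_polygon(n):
        """Vertices, perimeter, and area of a regular n-gon in the unit circle.

        Each vertex sits at (cos(2*pi*k/n), sin(2*pi*k/n)); the perimeter
        approaches 2*pi and the area approaches pi as n grows.
        """
        vertices = [(math.cos(2 * math.pi * k / n), math.sin(2 * math.pi * k / n))
                    for k in range(n)]
        perimeter = n * 2 * math.sin(math.pi / n)
        area = 0.5 * n * math.sin(2 * math.pi / n)
        return vertices, perimeter, area

    for n in (6, 12, 96):
        _, p, a = unit_polygon(n)   # p/2 and a both approach pi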
ERIC Educational Resources Information Center
McPhee, C.; Bielick, S.; Masterton, M.; Flores, L.; Parmer, R.; Amchin, S.; Stern, S.; McGowan, H.
2015-01-01
The 2012 National Household Education Surveys Program (NHES:2012) Data File User's Manual provides documentation and guidance for users of the NHES:2012 data files. The manual provides information about the purpose of the study, the sample design, data collection procedures, data processing procedures, response rates, imputation, weighting and…
Music 4C, a multi-voiced synthesis program with instruments defined in C
NASA Astrophysics Data System (ADS)
Beauchamp, James W.
2003-04-01
Music 4C is a program which runs under Unix (including Linux) and provides a means for the synthesis of arbitrary signals defined by C code. The program is a loose translation of an earlier program, Music 4BF [H. S. Howe, Jr., Electronic Music Synthesis (Norton, 1975)]. A set of instrument definitions is driven by a numerical score which consists of a series of "events." Each event gives an instrument name, a start time and duration, and a number of parameters (e.g., pitch) which describe the event. Each instrument definition consists of event parameters, performance variables, initializations, and the code for a synthesis algorithm. Thus, the synthesized signal, no matter how complex, is precisely defined. Moreover, the resulting sounds can be overlaid in any arbitrary pattern. The program serves as a mixer of algorithmically produced sounds or recorded sounds taken from sample files or synthesized from spectrum files. A score file can be entered by hand, generated from a program, translated from a MIDI file, or generated from an alphanumeric score using an auxiliary program, Notepro. Output sample files are in wav, snd, or aiff format. The program is provided as C source code for download.
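Music 4C's instrument and score formats are not reproduced here, but the core idea of mixing score-driven events rendered by an instrument algorithm into one output file can be sketched in Python with a single sine-tone "instrument" and a made-up event layout; the real program works from C instrument definitions and writes wav, snd, or aiff output.

    import math
    import struct
    import wave

    RATE = 44100  # samples per second

    def render_score(events, seconds, filename="out.wav"):
        """Mix sine-tone events into one buffer and write a 16-bit mono WAV.

        Each event is (start_time_s, duration_s, frequency_hz, amplitude 0..1),
        a much-reduced stand-in for a Music 4C score line plus instrument.
        """
        buf = [0.0] * int(seconds * RATE)
        for start, dur, freq, amp in events:
            first = int(start * RATE)
            for i in range(int(dur * RATE)):
                j = first + i
                if j >= len(buf):
                    break
                buf[j] += amp * math.sin(2 * math.pi * freq * i / RATE)
        peak = max(1.0, max(abs(s) for s in buf))          # avoid clipping
        frames = b"".join(struct.pack("<h", int(32767 * s / peak)) for s in buf)
        with wave.open(filename, "wb") as wav:
            wav.setnchannels(1)
            wav.setsampwidth(2)
            wav.setframerate(RATE)
            wav.writeframes(frames)

    # Two overlapping tones, mixed as the score dictates.
    render_score([(0.0, 1.0, 440.0, 0.5), (0.5, 1.0, 660.0, 0.4)], seconds=2.0)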
Program to convert SUDS2ASC files to a single binary SEGY file
Goldman, Mark
2000-01-01
This program, SUDS2SEGY, converts and combines ASCII files created using SUDS2ASC Version 2.60 into a single SEGY file. SUDS2ASC has previously been used to create an ASCII file of three-component seismic data for an individual recording station. However, many seismic processing packages have difficulty reading ASCII data. In addition, it may be cumbersome to process a separate file for each recording station, particularly if traces from different recording stations contain a different number of data samples and/or a different start time. This new program, SUDS2SEGY, combines these recording-station files into a single SEGY file. In addition, SUDS2SEGY normalizes the trace times so that each trace starts at a given time and consists of a fixed number of samples. This normalization allows seismic data from many different stations to be read in as a single "data gather". SUDS2SEGY also produces a report summarizing the offset and maximum absolute amplitude for each component in a station file. These data are output separately to an ASCII file and can subsequently be input to a plotting package.
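The trace-time normalization step is the heart of the combination; a hedged Python sketch (hypothetical argument names, SEGY header writing omitted) shows one way to pad and align each station's samples onto a common gather start time and length.

    def normalize_trace(samples, trace_start_s, gather_start_s, dt_s, n_out):
        """Align one station's trace to a common start time and length.

        Pads with zeros before the trace's first sample and truncates or
        zero-fills at the end so every trace in the gather has n_out samples.
        (The real SUDS2SEGY additionally writes SEGY headers, not shown here.)
        """
        offset = int(round((trace_start_s - gather_start_s) / dt_s))
        out = [0.0] * n_out
        for i, value in enumerate(samples):
            j = offset + i
            if 0 <= j < n_out:
                out[j] = value
        return out

    # Three short traces padded onto a 2000-sample gather starting at t = 100.0 s.
    gather = [normalize_trace(tr, t0, 100.0, 0.01, 2000)
              for tr, t0 in [([0.1, 0.2], 100.05), ([0.3], 100.00), ([0.2], 99.98)]]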
NASA Astrophysics Data System (ADS)
Grose, C. J.
2008-05-01
Numerical geodynamic models of heat transfer are typically thought of as specialized research topics requiring knowledge of dedicated modelling software, Linux platforms, and state-of-the-art finite-element codes. I have implemented analytical and numerical finite-difference techniques in Microsoft Excel 2007 spreadsheets to solve complex solid-earth heat transfer problems for use by students, teachers, and practicing scientists without specialty in geodynamic modelling techniques and applications. While implementing equations in Excel spreadsheets is occasionally cumbersome, once the boundary structure and node equations for a case are developed, spreadsheet manipulation becomes routine. Model experimentation by modifying parameter values, geometry, and grid resolution makes Excel a useful tool, whether in the classroom at the undergraduate or graduate level or for more engaging student projects. Furthermore, the ability to incorporate complex geometries and heat-transfer characteristics makes it well suited to first-order, and occasionally higher-order, geodynamic simulations that help understand and constrain the results of professional field research without the constraints of state-of-the-art modelling codes. The straightforward expression and manipulation of model equations in Excel can also serve as a medium for better understanding the often confusing notation of advanced mathematical problems. To illustrate the power and robustness of computation and visualization in spreadsheet models, I focus primarily on one-dimensional analytical and two-dimensional numerical solutions to two case problems: (i) the cooling of oceanic lithosphere and (ii) temperatures within subducting slabs. Excel source documents will be made available.
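The two ingredients the abstract mentions can be written compactly: the analytical half-space cooling solution T(z, t) = Ts + (Tm - Ts) * erf(z / (2 * sqrt(kappa * t))) and the explicit finite-difference node equation an Excel cell would hold. A Python sketch with illustrative parameter values (the paper itself uses Excel, not Python):

    import math

    KAPPA = 1.0e-6   # thermal diffusivity, m^2/s (illustrative value)

    def halfspace_temperature(z_m, age_s, t_surface=0.0, t_mantle=1350.0):
        """Analytical half-space cooling: T = Ts + (Tm - Ts)*erf(z / (2*sqrt(kappa*t)))."""
        return t_surface + (t_mantle - t_surface) * math.erf(
            z_m / (2.0 * math.sqrt(KAPPA * age_s)))

    def ftcs_step(temps, dz_m, dt_s):
        """One explicit finite-difference update, the node equation a cell holds:
        T_i_new = T_i + kappa*dt/dz^2 * (T_{i-1} - 2*T_i + T_{i+1});
        the first and last nodes (boundary temperatures) are held fixed."""
        r = KAPPA * dt_s / dz_m**2        # must stay <= 0.5 for stability
        new = temps[:]
        for i in range(1, len(temps) - 1):
            new[i] = temps[i] + r * (temps[i - 1] - 2 * temps[i] + temps[i + 1])
        return new

    # Temperature of 50-Myr-old lithosphere at 20 km depth (1 Myr ~ 3.15e13 s).
    t_20km = halfspace_temperature(20e3, 50 * 3.15e13)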
ProteinTracker: an application for managing protein production and purification
2012-01-01
Background: Laboratories that produce protein reagents for research and development face the challenge of deciding whether to track batch-related data using simple file-based storage mechanisms (e.g. spreadsheets and notebooks), or to commit the time and effort to install, configure, and maintain a more complex laboratory information management system (LIMS). Managing reagent data stored in files is challenging because files are often copied, moved, and reformatted. Furthermore, there is no simple way to query the data if/when questions arise. Commercial LIMS often include additional modules that may be paid for but not actually used, and often require software expertise to truly customize them for a given environment. Findings: This web application allows small to medium-sized protein production groups to track data related to plasmid DNA, conditioned media samples (supes), cell lines used for expression, and purified protein information, including method of purification and quality control results. In addition, a request system was added that includes a means of prioritizing requests to help manage the high demand on protein production resources at most organizations. ProteinTracker makes extensive use of existing open-source libraries and is designed to track essential data related to the production and purification of proteins. Conclusions: ProteinTracker is an open-source web-based application that provides organizations with the ability to track key data involved in the production and purification of proteins and may be modified to meet the specific needs of an organization. The source code and database setup script can be downloaded from http://sourceforge.net/projects/proteintracker. This site also contains installation instructions and a user guide. A demonstration version of the application can be viewed at http://www.proteintracker.org. PMID:22574679
20 CFR 30.101 - In general, how is a survivor's claim filed?
Code of Federal Regulations, 2013 CFR
2013-04-01
... LABOR ENERGY EMPLOYEES OCCUPATIONAL ILLNESS COMPENSATION PROGRAM ACT OF 2000 CLAIMS FOR COMPENSATION UNDER THE ENERGY EMPLOYEES OCCUPATIONAL ILLNESS COMPENSATION PROGRAM ACT OF 2000, AS AMENDED Filing... who sustained an occupational illness or a covered illness must file a claim for compensation in...